Code synchronisation was initiated using GitHub as the source control provider; we are currently working on the development plan for the entire framework and hope to upload the first version of meme soon. Meanwhile, check out the repository: https://github.com/memeAdmin
One of the project milestones of the framework is support for memeMusic.
Based on the study and science of musical instruments, quintephones, or “non-physical instruments”, are not limited by spatial constraints; to achieve this, memeMusic can associate brain waves with musical properties such as pitch, notes, or frequency, using the recorded information from any dataset.
Another interesting sneak peek at the memeMusic vision, inspired by physical instruments: for example, we could map the five brain waves Delta, Theta, Alpha, Beta, Gamma (D, T, A, B, G) to the standard tuning of guitar strings 1 to 5 (E B G D A); the sixth string is the same note as the first (E) but two octaves lower, so it is not used.
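The mapping above can be sketched in a few lines of Python. This is only an illustration of the idea, not memeMusic code: the band frequency ranges are the conventional EEG values, and the `note_for` helper and the band-power input format are hypothetical.

```python
# Hypothetical sketch: map the five EEG bands to the open notes of
# guitar strings 1-5 in standard tuning (string 6, low E, is unused).

EEG_BANDS = {            # conventional frequency ranges in Hz
    "Delta": (0.5, 4),
    "Theta": (4, 8),
    "Alpha": (8, 13),
    "Beta": (13, 30),
    "Gamma": (30, 100),
}

# Open notes of strings 1-5 in standard tuning.
GUITAR_STRINGS = ["E", "B", "G", "D", "A"]

# Delta->E, Theta->B, Alpha->G, Beta->D, Gamma->A
BAND_TO_STRING = dict(zip(EEG_BANDS, GUITAR_STRINGS))

def dominant_band(band_powers):
    """Return the EEG band with the highest measured power."""
    return max(band_powers, key=band_powers.get)

def note_for(band_powers):
    """Pick the guitar note associated with the dominant band."""
    return BAND_TO_STRING[dominant_band(band_powers)]

powers = {"Delta": 0.1, "Theta": 0.3, "Alpha": 0.9, "Beta": 0.2, "Gamma": 0.05}
print(note_for(powers))  # Alpha dominates -> "G"
```

In a real pipeline the band powers would come from a spectral analysis of the recorded dataset rather than being hard-coded.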
The first public presentation of the meme framework (University of Minho) was uploaded, with more basic information and details:
To support the sustainability of the people working on this open source project, peoplesingularity.com was launched today, offering services related to the framework, according to the meme solutions delivery roadmap:
This video from Nature explains the objective of the connectomics project and the importance of gamification in accelerating any research area (powered by people):
The fundamental objective of the meme framework is simply to record, analyse, and decode brain signals from millions of synapses according to specifically designed experiences. The framework sets and fixes the spatial scale of the core anatomy components in centimetres (cm), giving importance to the whole voltage source and allowing EEG signals to be measured and mapped directly to specific brain regions, depending on the size of the selected sensors and the headset.
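A minimal sketch of that sensor-to-region mapping, assuming electrodes labelled with the standard 10-20 system and the usual lobe associations; the `Sensor` class and its fields are a hypothetical design for illustration, not the framework's actual API.

```python
# Illustrative sketch: fix the spatial scale at centimetres and map
# each EEG sensor directly to a brain region via its 10-20 label.
from dataclasses import dataclass

# 10-20 electrode prefix -> cortical region (conventional association)
PREFIX_TO_REGION = {
    "Fp": "prefrontal",
    "F": "frontal",
    "C": "central",
    "T": "temporal",
    "P": "parietal",
    "O": "occipital",
}

@dataclass
class Sensor:
    name: str           # 10-20 label, e.g. "O1"
    diameter_cm: float  # sensor size fixes the spatial resolution

    @property
    def region(self) -> str:
        # Match the longest prefix first so "Fp1" is not read as "F".
        for prefix in sorted(PREFIX_TO_REGION, key=len, reverse=True):
            if self.name.startswith(prefix):
                return PREFIX_TO_REGION[prefix]
        raise ValueError(f"unknown electrode label: {self.name}")

headset = [Sensor("Fp1", 1.0), Sensor("C3", 1.0), Sensor("O2", 1.0)]
print({s.name: s.region for s in headset})
# {'Fp1': 'prefrontal', 'C3': 'central', 'O2': 'occipital'}
```

The centimetre-scale sensor diameter is what makes this direct label-to-region mapping reasonable: each electrode integrates the voltage of a whole patch of cortex rather than an individual source.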
In the coming years, the meme framework will probably start computing and optimising datasets in the cloud, with an online repository of experiences and some templates and machine-learning models ready to use. Gamification will certainly be the driver of this vision: let's play and help.
A new feature was added to the roadmap and vision scope of the project: