After two years of extensive experience, focused on the part of the framework that sits at the intersection of art and science, MEME [Music], we have officially launched the development of a new version 2.0 with a redesign of the entire framework.
Thank you very much for the patience of all the people who sent messages via social platforms and email; you will receive answers to your questions shortly. Right now we are compiling all the artifacts required to properly define the path of this amazing open source project. To revisit the past, you can see the envisioning of the entire academic project that remixes artificial intelligence, quality assurance, and general ambient assisted living concepts; it was the initial inspiration for this brain-computer interface framework, which lets you experiment with machine learning models according to your own set of mind-experience experiments.
Thanks also to everybody who inspires this project with their magic; we will be publishing more details very soon. See you!
Today another presentation was given in the Intelligent Systems class at the University of Minho (http://mei.di.uminho.pt/?q=pt-pt/1314/si). Next week, on 14/01/2014, we will start the recording sessions of the [MEME] Emotion experience using GAPED with some people from this class at ISLab (http://islab.di.uminho.pt/islab/).
Synchronisation of the code has been initiated, using GitHub as the source control provider. Right now we are working on the development plan for the entire framework and hope to upload the first version of meme shortly; meanwhile, check out the repository address: https://github.com/memeAdmin
One of the project milestones of the framework is support for memeMusic.
Based on the study and science of musical instruments, the quintephones, or “non-physical instruments”, are not limited by space constraints; to achieve this, memeMusic can associate brain waves with music properties such as pitch, notes, or frequency, using the recorded information of any dataset.
Another interesting sneak peek at the vision of memeMusic, inspired by solid instruments: for example, we could map the five brain waves Delta, Theta, Alpha, Beta, Gamma (D, T, A, B, G) to the standard tuning of the first five guitar strings, 1 to 5 (E B G D A); the sixth string of the guitar is the same note as the first E, but two octaves below, and is not used.
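A minimal sketch of how such a mapping could look. The EEG band boundaries and standard-tuning string frequencies below are conventional values, but the function names and the idea of classifying a single dominant frequency are illustrative assumptions, not part of the memeMusic implementation:

```python
# Hypothetical sketch: map an EEG frequency to a guitar string note.
# Band ranges (Hz) use commonly cited boundaries; string frequencies are
# standard tuning. The selection logic is an illustrative assumption.

# EEG frequency bands in Hz
BANDS = {
    "Delta": (0.5, 4.0),
    "Theta": (4.0, 8.0),
    "Alpha": (8.0, 13.0),
    "Beta": (13.0, 30.0),
    "Gamma": (30.0, 100.0),
}

# Standard-tuning frequencies of guitar strings 1-5 (E4 B3 G3 D3 A2)
STRING_FOR_BAND = {
    "Delta": ("E4", 329.63),
    "Theta": ("B3", 246.94),
    "Alpha": ("G3", 196.00),
    "Beta": ("D3", 146.83),
    "Gamma": ("A2", 110.00),
}

def classify_band(eeg_frequency_hz: float) -> str:
    """Return the EEG band that contains the given frequency."""
    for band, (low, high) in BANDS.items():
        if low <= eeg_frequency_hz < high:
            return band
    raise ValueError(f"frequency {eeg_frequency_hz} Hz outside known bands")

def note_for_frequency(eeg_frequency_hz: float) -> tuple:
    """Map an EEG frequency to the corresponding guitar string note."""
    return STRING_FOR_BAND[classify_band(eeg_frequency_hz)]
```

For instance, a 10 Hz signal falls in the Alpha band, so it would sound the third string (G3).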
The first public presentation of the meme framework (University of Minho) was uploaded, with more basic information and details:
To support the sustainability of the people working on this open source project, peoplesingularity.com was launched today, offering services related to the framework, according to the meme solutions delivery roadmap:
This video from Nature explains the objective of the connectomics project and the importance of gamification in accelerating any research area (powered by people):
The fundamental objective of the meme framework is simply to record, analyse, and decode brain signals from millions of synapses according to specific, previously designed experiences. The meme framework fixes the spatial scale of the core anatomy components at centimetres (cm), giving importance to the whole voltage source and allowing measured EEG signals to be mapped directly to a specific brain region, according to the size of the selected sensors and the headset.
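To illustrate the idea of mapping an EEG sensor to a brain region at this scale, here is a small sketch assuming the electrodes follow the standard 10-20 naming convention. The prefix-to-region lookup is a deliberate simplification, and the function name is hypothetical rather than part of the meme framework:

```python
# Illustrative sketch: resolve a 10-20 electrode label (e.g. "Fp1", "O2")
# to a broad cortical region. The prefixes are standard 10-20 convention;
# the lookup table and function are illustrative assumptions.

REGION_BY_PREFIX = {
    "Fp": "prefrontal",
    "AF": "anterior frontal",
    "F": "frontal",
    "C": "central",
    "T": "temporal",
    "P": "parietal",
    "O": "occipital",
}

def region_for_electrode(label: str) -> str:
    """Return the broad cortical region for a 10-20 electrode label."""
    # Try two-letter prefixes first (e.g. "Fp" in "Fp1"), then single letters.
    for prefix in sorted(REGION_BY_PREFIX, key=len, reverse=True):
        if label.startswith(prefix):
            return REGION_BY_PREFIX[prefix]
    raise ValueError(f"unknown electrode label: {label}")
```

With larger sensors covering several centimetres, each recorded signal can then be attributed to one of these coarse regions rather than to a single point source.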
In the coming years, the meme framework will probably start computing and optimising datasets in the cloud, with an online repository of experiences and some templates and machine learning models ready to use. Certainly, gamification will be the driver of this vision; let's play and help.
A new feature was added to the roadmap and vision scope of the project: