BCI MIDI Sequencer

MEME [Framework] 2.0

After a great deal of experience over the last two years, and now focusing on the part of the framework at the intersection of art and science, MEME [Music], we have officially launched development of version 2.0, a redesign of the entire framework.

Thank you very much for your patience, to everyone who sent messages via social platforms and email; you will receive answers to your questions shortly. Right now, we are compiling all the artifacts required to properly define the path of this amazing open-source project. To revisit the past, you can see the vision of the entire academic project that aims to remix artificial intelligence, quality assurance, and general ambient assisted living concepts, and that was the initial inspiration for this brain-computer interface framework, which allows you to experiment with machine learning models based on your own set of mind-experience experiments.

Thanks also to everyone who inspires this project with their magic; very soon we will publish more details. See you!

[MEME]music (beta test)


News: Recording sessions of [MEME] emotion @UMinho

Today another presentation was given in the Intelligent Systems class at the University of Minho. Next week, on 14/01/2014, we will start the recording sessions of the [MEME] Emotion experience using GAPED with some people from this class at ISLab.


News: Meraki Lab

open source, GitHub, for developers

We have started synchronising the code using GitHub as the source-control provider. Right now we are working on the development plan for the entire framework, and we hope to upload the first version of meme shortly; meanwhile, check out the repository address:

[Screenshot of the GitHub repository, 2013-08-30]

memeMusic, electrophone, organology

One of the project milestones of the framework is to support memeMusic.

Based on the study and science of musical instruments, quintephones, or "non-physical instruments", are not limited by spatial constraints; to achieve this, memeMusic can associate brain waves with musical properties such as pitch, note, or frequency, using the recorded information of any dataset.

Another interesting sneak peek at the vision of memeMusic, inspired by solid instruments: for example, we could map the five brain waves Delta, Theta, Alpha, Beta, Gamma (D, T, A, B, G) to the standard tuning of guitar strings 1 to 5 (E B G D A); the sixth string of the guitar is the same E as the first but two octaves lower, and is not used.
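The string mapping above can be sketched in a few lines of code. This is a minimal, illustrative sketch only: the band names, the MIDI note numbers for the open strings (E4, B3, G3, D3, A2), and the dominant-band selection rule are assumptions of this example, not part of the actual memeMusic implementation.

```python
# Hypothetical sketch: map the five brain-wave bands to the MIDI notes
# of the open guitar strings 1-5 in standard tuning (E4 B3 G3 D3 A2).
BAND_TO_MIDI = {
    "delta": 64,  # string 1, open E4
    "theta": 59,  # string 2, open B3
    "alpha": 55,  # string 3, open G3
    "beta":  50,  # string 4, open D3
    "gamma": 45,  # string 5, open A2
}

def dominant_band_note(band_powers):
    """Pick the band with the highest measured power and return its note.

    band_powers: dict mapping band name -> power (arbitrary units),
    e.g. as estimated from an EEG recording.
    """
    band = max(band_powers, key=band_powers.get)
    return band, BAND_TO_MIDI[band]

if __name__ == "__main__":
    # Example frame where alpha activity dominates:
    sample = {"delta": 0.1, "theta": 0.3, "alpha": 0.9, "beta": 0.4, "gamma": 0.2}
    band, note = dominant_band_note(sample)
    print(band, note)  # the dominant band and its open-string MIDI note
```

A real sequencer would of course compute band powers from EEG data and stream the notes out as MIDI events; this sketch only shows the band-to-string lookup itself.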


News: more details about the meme framework on SlideShare

The first public presentation of the meme framework (University of Minho) has been uploaded, with more basic information and details: