The mind experiences and models experimenter [meme] framework uses non-invasive EEG technology to record and analyse the user's brain activity.
It allows the configuration of specific test cases (experiments) based on visual, auditory and other external stimuli, delivered through sequences of images, sounds and language. It is then possible to search for singular events in the recorded datasets and to apply machine learning models that look for patterns. The main objectives of this framework are to identify the user's emotions, to visualize and hear representations of our own thoughts, to collaborate in the understanding of the brain and, simply, to share knowledge.
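As a rough illustration of the pattern-search step described above, the sketch below turns labelled EEG epochs (recorded during different stimuli) into frequency band-power features and trains a classifier to predict the associated state. Everything here is an assumption for illustration: the synthetic data, the band choices and the classifier are not part of the meme framework's actual API.

```python
# Minimal sketch: EEG epochs -> band-power features -> state classifier.
# The data is synthetic; a real pipeline would load recorded epochs
# labelled by the stimulus shown during each experiment.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 128                                  # sampling rate (Hz), assumed
n_epochs, n_channels, n_samples = 60, 8, fs * 2   # 2-second epochs

# Synthetic stand-in for recordings: class 1 gets extra 10 Hz (alpha) power.
labels = rng.integers(0, 2, n_epochs)
t = np.arange(n_samples) / fs
epochs = rng.normal(size=(n_epochs, n_channels, n_samples))
epochs[labels == 1] += 0.8 * np.sin(2 * np.pi * 10 * t)

def band_power(data, low, high):
    """Mean spectral power in [low, high) Hz, per channel, via the FFT."""
    freqs = np.fft.rfftfreq(data.shape[-1], d=1 / fs)
    psd = np.abs(np.fft.rfft(data, axis=-1)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return psd[..., mask].mean(axis=-1)

# One feature vector per epoch: alpha (8-13 Hz) and beta (13-30 Hz) power.
features = np.hstack([band_power(epochs, 8, 13),
                      band_power(epochs, 13, 30)])

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, features, labels, cv=5)
print(round(scores.mean(), 2))
```

The same shape of pipeline applies whatever the stimulus: record epochs, label them by the experiment settings, extract features, and validate the model with cross-validation rather than a single train/test split.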
The envisioning of this technology applied to education, health, music, the arts and other industries is expanding at an exponential rate, and it inspires the meme solutions delivery roadmap. We believe that simplifying our relationship with machines (hardware/software) through BCI could improve life, accelerating our way to abundance and happiness as an intelligent species.
This blog covers the development of the project: news, system features, component architecture, and the methodology used to train and validate artificial intelligence models that predict the states of mind associated with the settings of the experiences.
In the future, the meme framework could be the foundation of a general-purpose singularity search engine for people. For now, it is an open source project: join us!