BrainAble will conceive, research, design, implement and validate an ICT-based human-computer interface (HCI) composed of BNCI sensors combined with affective computing and virtual environments. This combination will substantially improve the quality of life of people with disabilities by overcoming the two main shortcomings they suffer, exclusion from home activities and from social activities: it will provide inner functional independence and autonomy for daily-life activities (HCI connected to accessible and interoperable home and urban automation) and outer social inclusion (HCI connected to advanced and adapted social network services).
In terms of HCI, BrainAble will improve both direct and indirect interaction with computers. Direct control will be upgraded by creating tools that allow people to control those inner and outer environments using a hybrid Brain-Computer Interface (BCI) system combining BCI with electrooculography (EOG), electromyography (EMG) and heart rate. Furthermore, BNCI information will be used for indirect interaction, such as changing interface or overall system parameters based on measures of boredom, confusion, frustration or information overload. These self-adaptive tools will increase effective bandwidth, both because users will be able to use a plurality of signals to effect control and because adaptation will reduce errors and help provide the user with the desired control. BrainAble's HCI will be complemented by an intelligent Virtual Reality-based user interface with avatars and scenarios that will help disabled people to move around in their wheelchairs, interact with all sorts of devices, create self-expression assets using music, pictures and text, communicate online and offline with other people, play games to counteract cognitive decline, and get trained in new functionalities and tasks.
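The two interaction modes above can be sketched in code: direct control fuses the hybrid input channels into a single command, while indirect control adapts an interface parameter when affective indicators suggest overload. This is a minimal illustrative sketch, not BrainAble's actual implementation; every name, signal combination rule and threshold here is an assumption chosen for clarity.

```python
# Illustrative sketch of hybrid direct control plus self-adaptation.
# All names, fusion rules, and thresholds are hypothetical assumptions,
# not the project's actual design.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SignalFrame:
    """One time-slice of the hybrid input channels."""
    bci_command: Optional[str]  # decoded brain command, e.g. "left", or None
    eog_command: Optional[str]  # gaze-based command, if any
    emg_active: bool            # residual muscle activation used as a switch
    heart_rate: float           # beats per minute


def fuse_commands(frame: SignalFrame) -> Optional[str]:
    """Direct control: prefer a brain command confirmed by the EMG switch,
    fall back to the gaze channel, else report no confident command."""
    if frame.bci_command and frame.emg_active:
        return frame.bci_command  # confirmed brain command
    if frame.eog_command:
        return frame.eog_command  # fall back to gaze
    return None


def adapt_interface(error_rate: float, heart_rate: float,
                    n_options: int) -> int:
    """Indirect control: shrink the menu when frustration indicators
    (high error rate, elevated heart rate) suggest overload, and slowly
    grow it again when the user is performing well."""
    overloaded = error_rate > 0.3 or heart_rate > 100
    if overloaded:
        return max(2, n_options - 2)  # fewer choices, easier selection
    return min(8, n_options + 1)      # restore options gradually
```

For example, a frame carrying a BCI command confirmed by EMG activation yields that command (`fuse_commands(SignalFrame("left", None, True, 88.0))` returns `"left"`), and a 40% error rate drops a six-item menu to four items (`adapt_interface(0.4, 95.0, 6)` returns `4`). The point of the sketch is the architecture: several unreliable channels are combined for control, while physiological and behavioural measures tune the interface itself.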