Voice recognition and voice-controlled gadgets have been entering our everyday lives for a few years now.
Recent technological advances promise even more.
Controlling our devices and apps will become even easier in the future: with a single thought, through Brain-Computer Interface (BCI) technology.
Before long, we will be able to interact with our smartphones and tablets using nothing but our brains, via head-mounted electroencephalogram (EEG) sensors.
Thought-controlled computing is the eventual goal: controlling all of our devices. But it may also help us understand and react to what is happening inside our heads by displaying brain activity. This could benefit biofeedback approaches, as it was recently shown that humans control brain activity and stress more easily when they have a picture of what is going on in their heads.
Prototypes have already been developed, and even at this early stage, BCI technology is gaining significant traction in consumer electronics.
Users are already dreaming of improving their own methods of mind and stress control.
Presently, BCI technology is mainly used to support patients who suffer from limited mobility due to muscle disorders or damage to peripheral nerves. It significantly improves their ability to interact with their environment and thus their quality of life.
Future BCI technology aims to go beyond clinical use and enter our daily lives.
So far, BCI technology is not a “mind reader”: it cannot process or interpret thoughts, but simply measures electrical activity in the brain, capturing a response to whatever the user engages with.
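In practice, "measuring electrical activity" usually means spectral analysis of the EEG signal, for example estimating the power in the alpha band (8–12 Hz), which rises when a user relaxes or closes their eyes. A minimal sketch on synthetic data, using a plain FFT rather than any particular headset vendor's API (the signal, sampling rate and band limits here are illustrative assumptions):

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` within [low, high] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Synthetic "EEG": a 10 Hz alpha rhythm buried in noise,
# sampled at 256 Hz for 4 seconds.
fs = 256
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

alpha = band_power(eeg, fs, 8, 12)    # alpha band (8-12 Hz)
beta = band_power(eeg, fs, 13, 30)    # beta band (13-30 Hz)
print(alpha > beta)  # the injected alpha rhythm dominates
```

A consumer headset would feed real multi-channel samples into the same kind of computation; the system then reacts to band-power changes, not to "thoughts" in any semantic sense.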
Even though BCI researchers are working hard to overcome the current limitations of “mind reading”, it remains uncertain whether a mobile experience without any touch interaction will ever be possible.
Another big challenge in developing BCI technology is the placement and management of the EEG sensors: making the technology user-friendly means developing comfortable, properly placed headsets that people are actually willing to wear.
One important question is: how many years will we have to wait before BCI technology is usable in our daily lives?
Ariel Garten, CEO of InteraXon and one of the leading technologists in the BCI field, predicts: “Just given the advances that need to happen in algorithm detection and consumer behavior, 20 to 25 years is really the right time frame for it to be pertinent in the way that touch screens are today.”
Biological organisms and entire biological systems exhibit adaptation, reactivity and dispersion. Principles realized to that degree of perfection are rarely found in human-engineered technologies.
The multi-disciplinary field of biological computing (also called bio-inspired computing or biocomputing) draws on theoretical natural science and systems biology, mathematics, cognitive science, logic and complexity, computer science, informatics, robotics and cybernetics.
Biocomputing aims to solve complex problems and system architectural questions by developing computational models following Continue reading “Biological Inspired Computing – Logic by Nature” »
Affective Computing is a multidisciplinary research field that applies central concepts of engineering, robotics, computer science and artificial intelligence, as well as psychology, neuroscience, social science, anthropology and philosophy, to develop and enhance human-machine interactions.
Robots are increasingly becoming “social” entities that are integrated into our daily lives. Continue reading “Affective Computing – Emotional Machines?” »
Affective Computing studies how humans and machines interact on an emotional level. Beyond measuring physiological parameters, the most progress in affect detection has probably been made in the area of speech recognition and detection. Continue reading “Speech Affect Detection – Computers Hear the Emotion in Your Voice” »
A few months ago, we saw quadriplegic patients using brain implants to control robotic limbs. More recently, researchers in Miguel Nicolelis’s group went a step further, showing that rats could communicate through brain chips and collaborate on performing a task.
Who doesn’t immediately think of mind reading and mind control? Continue reading “Brain-to-brain interfaces: A first step into an organic computer?” »
Today’s smartphones are capable of doing much more than just tracking our schedules. They have more sensors than most people would ever imagine, which permit collecting data about their users. Most Android phones and iPhones alike are equipped with audio sensors (microphones), image sensors (cameras), touch interfaces on the screen, an acceleration sensor (tri-axial accelerometer), a Global Positioning System (GPS) receiver, light sensors and multiple other features. Continue reading “Smartphones – Technology to Track your Life” »
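As an illustration of what such sensor data enables, here is a minimal, hypothetical sketch of step counting from tri-axial accelerometer readings; the threshold value and the synthetic samples are assumptions for demonstration, not any phone platform's actual API:

```python
import math

def count_steps(samples, threshold=11.0):
    """Count steps as upward crossings of an acceleration-magnitude
    threshold (gravity is ~9.81 m/s^2, so peaks above ~11 m/s^2
    suggest the impact of a footfall)."""
    steps = 0
    above = False
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and not above:
            steps += 1
            above = True
        elif mag <= threshold:
            above = False
    return steps

# Synthetic tri-axial readings: resting (~1 g on the z-axis)
# interleaved with brief spikes mimicking individual footfalls.
rest = [(0.0, 0.0, 9.81)] * 5
spike = [(0.0, 0.0, 13.0)]
walk = (rest + spike) * 4 + rest   # four simulated steps
print(count_steps(walk))  # → 4
```

Real pedometer and life-tracking apps use the same raw stream from the accelerometer, just with more robust filtering and peak detection than this threshold heuristic.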
- How machines can correctly read emotions and respond appropriately
- How machines can express emotion-like patterns that influence human affect
Besides smart analytic algorithms Continue reading “Psychological-neurobiological Methods of Affective Computing” »
According to a recent IDC report, the expected value of the U.S. cloud market will Continue reading “Cloud Services – Golden Times Still Ahead?” »
Affective Computing is an interdisciplinary field of psychology, computer science, neuroscience/cognitive science, sociology, education and physiology that investigates and develops applications Continue reading “Affective Computing – When Computers Feel with You” »