Showing posts with label Mind Control.

Wednesday, March 23, 2011

Brain–Computer Interface Allows Paralyzed Patients to Play Music with Brainpower Alone

From: http://www.nature.com/
The brain-computer interface allows paralysed patients to play music just by thinking about it. Credit: ICCMR Research Team, University of Plymouth

A pianist plays a series of notes, and the woman echoes them on a computerized music system. The woman then goes on to play a simple improvised melody over a looped backing track. It doesn't sound like much of a musical challenge — except that the woman is paralysed after a stroke, and can make only eye, facial and slight head movements. She is making the music purely by thinking.

This is a trial of a computer-music system that interacts directly with the user's brain, by picking up the tiny electrical impulses of neurons. The device, developed by composer and computer-music specialist Eduardo Miranda of the University of Plymouth, UK, working with computer scientists at the University of Essex, should eventually help people with severe physical disabilities, caused by brain or spinal-cord injuries, for example, to make music for recreational or therapeutic purposes. The findings are published online in the journal Music and Medicine [1].

"This is an interesting avenue, and might be very useful for patients," says Rainer Goebel, a neuroscientist at Maastricht University in the Netherlands who works on brain-computer interfacing.

Therapeutic use

Evidence suggests that musical participation can be beneficial for people with neurodegenerative diseases such as dementia and Parkinson's disease. But people who have almost no muscle movement have generally been excluded from such benefits, and can enjoy music only through passive listening.

The development of brain–computer interfaces (BCIs) that enable users to control computer functions by mind alone offers new possibilities for such people. In general, these interfaces rely on the user's ability to learn how to self-induce particular mental states that can be detected by brain-scanning technologies.

Miranda and his colleagues have used one of the oldest of these systems: electroencephalography (EEG), in which electrodes on the skull pick up faint neural signals. The EEG signal can be processed quickly, allowing fast response times, and the instrument is cheaper and more portable than brain-scanning techniques such as magnetic resonance imaging and positron-emission tomography.

Previous efforts using BCIs have focused on moving computer screen icons such as cursors, but Miranda's team sought to achieve the much more complex task of enabling users to play and compose music. Miranda says that he first became aware of the then-emerging field of BCIs more than a decade ago while researching how to make music using brainwaves. "When I realized the potential of a musical BCI for the wellbeing of severely disabled people," he says, "I couldn't leave the idea alone. Now I can't separate this work from my activities as a composer."

The trick is to teach the user how to associate particular brain signals with specific tasks by presenting a repeating stimulus — auditory, visual or tactile — and getting the user to focus on it. This elicits a distinctive, detectable pattern in the EEG signal. Miranda and his colleagues show several flashing 'buttons' on a computer screen, which each trigger a musical event. The users push a button just by directing their attention to it.

For example, a button could be used to generate a melody from a preselected set of notes. The user can alter the intensity of the control signal – how 'hard' the button is pressed – by varying the intensity of attention, and the result is fed back to them visually as a change in the button's size. In this way, any one of several notes can be selected by mentally altering the intensity of pressing.
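The article does not spell out the signal processing, but the behaviour it describes (attention on a flashing button elicits a detectable EEG pattern whose strength selects among notes) can be sketched with an SSVEP-style rule: each button flickers at a known frequency, and the EEG power at that frequency acts as the "pressure". A minimal sketch under those assumptions; the function names, thresholds and note set are illustrative, not from the Plymouth system:

```python
import numpy as np

def band_power(eeg, fs, freq, width=1.0):
    """Power of the EEG signal in a narrow band around a button's flicker frequency."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    mask = (freqs >= freq - width) & (freqs <= freq + width)
    return spectrum[mask].sum()

def select_note(eeg, fs, flicker_freq, thresholds, notes):
    """Map the intensity of attention (band power) to one of several notes,
    like pressing the same button 'harder' or 'softer'."""
    p = band_power(eeg, fs, flicker_freq)
    for level, t in enumerate(thresholds):
        if p < t:
            return notes[level]
    return notes[-1]
```

In this sketch the visual feedback loop (the button growing as attention rises) would simply redraw the button in proportion to `band_power` on each analysis window.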

With a little practice, this allows users to create a melody as if they were selecting keys on a piano. And, as with learning an instrument, say the researchers, "the more one practices the better one becomes".

Back in control

The researchers trialled their system on a female patient who has locked-in syndrome, a form of almost total paralysis caused by brain lesions, at the Royal Hospital for Neuro-disability in London. During a two-hour session, she got the hang of the system and was eventually playing along with a backing track. She reported that "it was great to be in control again".


Goebel points out that the patients still need to be able to control their eye movements, which people with total locked-in syndrome cannot. In such partial cases, he says, "one can usually use gaze directly for controlling devices, instead of an EEG system". But Miranda points out that eye-gazing alone does not permit variations in the intensity of the signal. "Eye gazing is comparable to a mouse or joystick," he says. "Our system adds another dimension, which is the intensity of the choice. That's crucial for our musical system."

Miranda says that although increasing the complexity of the musical tasks is not a priority, music therapists have suggested it would be better if the system were more like a musical instrument — for instance, with an interface that looks like a piano keyboard. He admits that it is not easy to raise the number of buttons or keys beyond four, but is confident that "we will get there eventually".

"The flashing thing does not need to be on a computer screen," he says. It could, for example, be a physical electronic keyboard with light-emitting diodes on the keys. "You could play it by staring at the keys," he says.

  • References

    1. Miranda, E. R., Magee, W. L., Wilson, J. J., Eaton, J. & Palaniappan, R. Music and Medicine advance online publication doi:10.1177/1943862111399290 (2011).

Friday, January 14, 2011

XWave for iPhone lets you read your own mind

by Lin Edwards

More information: XWave - http://www.plxwave.com/





The XWave can sense and detect human brainwaves, interpret them and connect them to everyday technology.



(PhysOrg.com) -- A new application for the iPhone, the XWave, lets you read your own mind via a headset clamped to your head and connected to the phone’s audio jack.









The plastic headband, which costs around $100, has a sensor that presses against the user’s forehead and communicates with a free XWave iPhone application that then shows your brain waves graphically on the iPhone screen. As you focus your mind on a task the graphics are changed — a ball may move higher for instance, or your state of relaxation may be indicated by changes in a pulsating color, which moves towards blue as you become more relaxed.






Brainwave detection is powered by a NeuroSky eSense dry sensor, which provides a brain-computer interface (BCI) to sense even faint electrical impulses in the brain and convert them to digital signals that are sent to the iPhone. Previous applications of the NeuroSky technology include computer games and toys. In XWave, an algorithm is applied to the brain rhythms to convert them into graphical representations of attention and meditation values.
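As a rough illustration of the kind of mapping the app performs, here is a sketch that turns eSense-style 0-100 attention and meditation values into a ball height and a colour that shifts towards blue as the user relaxes. The function names and screen dimensions are invented for the example; NeuroSky's actual algorithm is proprietary:

```python
def ball_height(attention, screen_height=480):
    """Map a 0-100 attention value to a vertical ball position (pixels from bottom)."""
    attention = max(0, min(100, attention))  # clamp to the eSense 0-100 range
    return int(screen_height * attention / 100)

def relaxation_color(meditation):
    """Interpolate from red (tense) toward blue (relaxed) as meditation rises 0-100."""
    m = max(0, min(100, meditation)) / 100.0
    red = int(255 * (1 - m))
    blue = int(255 * m)
    return (red, 0, blue)
```

A game loop would poll the headset roughly once per second (the rate at which eSense values update) and redraw with these two mappings.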



XWave enables you to manipulate a number of other iPhone graphical applications and objects in games using only your brain waves, provided your attention or meditation rating is high enough. At present you cannot text or browse the web using XWave, but you can use the device to train your mind to relax and focus on command. The list of applications for the device is likely to grow rapidly.






XWave, developed by PLX Devices, is meant to be used purely for entertainment, but the implications for the future are enormous, particularly for people with disabilities, who may be able to gain much more control over their lives by using their minds alone to control their phones and potentially other applications. According to PLX, the headset device is also open for use with applications from other companies.


XWave iPhone app screen.

XWave is compatible with the iPhone, iPod Touch and iPad. Wireless versions are also available for WiFi and Bluetooth devices. The free XWave application is available for download via iTunes.



© 2010 PhysOrg.com

Wednesday, November 3, 2010

Eternal sunshine? Scientists create technique to delete traumatic memories

By Daily Mail Reporter


From:
http://www.dailymail.co.uk/

Researchers have found a way of permanently deleting painful memories, which they say could lead to drugs for post-traumatic stress disorder.

A team at Johns Hopkins University in the U.S. removed a protein from the region of the brain responsible for recalling fear in tests on mice.

The mice were then unable to recall fear associated with a loud sound.


Science fiction could soon be reality after researchers found a way to delete painful memories. The concept was explored in the film Eternal Sunshine Of The Spotless Mind, where Jim Carrey (pictured) and Kate Winslet decide to erase each other from their memories after a difficult break-up

The method is similar to that imagined in the film Eternal Sunshine Of The Spotless Mind, where Jim Carrey and Kate Winslet decide to erase each other from their memories after a difficult break-up.

The scientists, whose report appears in Science Express, said it had important implications for patients whose lives were blighted by fear.

Lead researcher, Dr Richard L Huganir, said: 'When a traumatic event occurs, it creates a fearful memory that can last a lifetime and have a debilitating effect on a person’s life.

'Our finding describing these molecular and cellular mechanisms involved in that process raises the possibility of manipulating those mechanisms with drugs to enhance behavioural therapy for such conditions as post-traumatic stress disorder.'

Behavioural therapy has been shown to ease the emotional response to traumatic memories, but not to remove the memory itself, so relapse is common.

Dr Huganir and post-doctoral fellow Roger Clem focused on the nerve circuits in the amygdala, the part of the brain known to underlie so-called fear conditioning in people and animals.

Using sound to cue fear in mice, they observed that certain cells in the amygdala conducted more current after the mouse was exposed to a loud, sudden tone.

They found temporary increases in the amount of particular proteins - the calcium-permeable AMPARs - within a few hours of fear conditioning that peaked at 24 hours and disappeared 48 hours later.

These particular proteins are uniquely unstable and can be removed from nerve cells.

Dr Huganir said: 'The idea was to remove these proteins and weaken the connections in the brain created by the trauma, thereby erasing the memory itself.'

In further experiments, they found that removal of these proteins depended on the chemical modification of the GluA1 protein.

Mice lacking this chemical modification of GluA1 recovered fear memories induced by loud tones, whereas their littermates did not recover the same fear memories.

Dr Huganir suggests that drugs designed to control and enhance the removal of calcium-permeable AMPARs may be used to improve memory erasure.

Dr Huganir said: 'This may sound like science fiction, the ability to selectively erase memories.

'But this may one day be applicable for the treatment of debilitating fearful memories in people, such as post-traumatic stress syndrome associated with war, rape or other traumatic events.'

This study was funded by the National Institutes of Health and the Howard Hughes Medical Institute.

Thursday, April 29, 2010

Hilarious Hacked Device Electrocutes You for Thinking (Video)

by Aaron Saenz

What do you do after hacking a brain computer interface to electrocute people when they concentrate? You invite over some comedians and film them flipping out. Awesome.

In further proof that idle engineers are the most evil demographic in the world, I present to you the “Most Painful Toy Hack Ever”. Created by Aaron Rasmussen, co-founder of Harcos Laboratories, this hacked device monitors your brain activity and gives you a scream-out-loud electric shock as soon as you start concentrating as a way of making your friends laugh. That’s the sort of mixture of comedy and malevolence you can expect from Harcos. To promote their energy drinks (which look like mana potions, and bags of human blood) they’ve pulled a lot of crazy stunts using technology. They’re sort of the geeky version of Jackass. Watch the video below to see Rasmussen shock the crap out of himself, his co-founder Elijah Szasz, and the cast of SMBC-Theater. I never knew such hilarious antics could arise from combining BCI with electroshock therapy.


For a hack of a brain computer interface (BCI), the MPTHE is pretty cheap to build. Rasmussen says the entire project cost him around $105. That includes the BCI from a toy called MindFlex (~$80+), an electric shock card from Qkit (~$5), and various electronic parts. Harcos Labs has placed all the information you need to build your own MPTHE on their website. I’m sure hackers everywhere have already started to improve upon the design. We just saw the release of the first patient-ready BCI on the market. Maybe with that EEG they’d be able to do something more productive than shock you. As brain computer interfaces get more common, and accessible, I’m sure we’ll see some really incredible hacks, hopefully not all for evil.
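The build details are on Harcos's site; as a rough sketch, the control logic amounts to firing the shock circuit whenever the toy's measured attention value crosses a threshold. The edge-triggered behaviour below (one shock per crossing rather than continuous shocking while concentrating) and the function name are assumptions for illustration, not taken from the published hack:

```python
def shock_events(attention_stream, threshold=60):
    """Return the indices where attention rises across the threshold.

    Each rising crossing triggers exactly one shock; staying above the
    threshold does not re-trigger until attention has dropped below it again.
    """
    events, above = [], False
    for i, a in enumerate(attention_stream):
        if a >= threshold and not above:
            events.append(i)  # rising edge: fire the shock circuit once
            above = True
        elif a < threshold:
            above = False     # re-arm once concentration lapses
    return events
```

In the real device the "fire" step would drive the Qkit shock card from the MindFlex's attention output instead of recording an index.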


[image and video credit: Harcos Laboratories]
[source: Harcos Laboratories]

Tuesday, March 2, 2010

Body acoustics can turn your arm into a touchscreen

Finding the keypad on your cellphone or music player a bit cramped? Maybe your forearm could be more accommodating. It could become part of a skin-based interface that effectively turns your body into a touchscreen.

Called Skinput, the system is a marriage of two technologies: the ability to detect the ultralow-frequency sound produced by tapping the skin with a finger, and the microchip-sized "pico" projectors now found in some cellphones.

The system beams a keyboard or menu onto the user's forearm and hand from a projector housed in an armband. An acoustic detector, also in the armband, then calculates which part of the display you want to activate.

But how does the system know which icon, button or finger you tapped? Chris Harrison at Carnegie Mellon University in Pittsburgh, Pennsylvania, working with Dan Morris and Desney Tan at Microsoft's research lab in Redmond, Washington, exploit the way our skin, musculature and skeleton combine to make distinctive sounds when we tap on different parts of the arm, palm, fingers and thumb (see video).

Bone machine

They have identified various locations on the forearm and hand that produce characteristic acoustic patterns when tapped. The acoustic detector in the armband contains five piezoelectric cantilevers, each weighted to respond to certain bands of sound frequencies. Different combinations of the sensors are activated to differing degrees depending on where the arm is tapped.
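A simplified sketch of this classification idea: compute the energy in five frequency bands (standing in for the five weighted cantilevers), normalise the profile so tap loudness does not matter, and match it to the nearest known location. The band edges and nearest-centroid matching are illustrative assumptions; the published Skinput system trains a classifier over richer acoustic features:

```python
import numpy as np

def band_energies(signal, fs, bands):
    """Energy in each sensor's frequency band, mimicking the weighted cantilevers."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

def classify_tap(signal, fs, bands, centroids):
    """Nearest-centroid match of the five-band energy profile to known tap locations.

    `centroids` maps location name -> normalised energy profile learned from
    calibration taps.
    """
    feat = band_energies(signal, fs, bands)
    feat = feat / (feat.sum() + 1e-12)  # normalise so loudness doesn't matter
    return min(centroids, key=lambda loc: np.linalg.norm(centroids[loc] - feat))
```

Calibration would consist of tapping each location a few times and averaging the normalised profiles to form the centroids.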

Twenty volunteers tested the system and most found it easy to navigate through icons on the forearm and tap fingers to actuate commands.

"Skinput works very well for a series of gestures, even when the body is in motion," the researchers say, with subjects able to deftly scroll through menus whether they moved up and down or flicked across their arm.

The system could use wireless technology like Bluetooth to transmit commands to many types of device – including phones, iPods and even PCs. The researchers will present their work in April at the ACM Computer-Human Interaction meeting in Atlanta, Georgia.

Body control

Pranav Mistry of the Media Lab at the Massachusetts Institute of Technology warns that users will have to position the armband very precisely so the projection always appears in the right place.

Nevertheless, Skinput looks a promising idea, says Michael Liebschner, director of the Bio-Innovations Lab at Baylor College of Medicine in Houston, Texas, who has worked on bone acoustic conduction technology for gadget-to-gadget transmission.

"This sounds a very feasible approach to using the body itself as an input device," he says. "When you are immersed in a virtual game using a head-mounted 3D display, you cannot just take it off to fiddle around with control buttons. This will make things much easier."


Tuesday, June 30, 2009

Real-time Control Of Wheelchair With Brain Waves

ScienceDaily (June 29, 2009) — Japan's BSI-TOYOTA Collaboration Center has successfully developed a system that controls a wheelchair using brain waves in as little as 125 milliseconds.



BTCC was established in 2007 by RIKEN, an independent Japanese research institution, as a collaborative project with Toyota Motor Corporation, Toyota Central R&D Labs, Inc., and Genesis Research Institute, Inc. Also collaborating in the research were Andrzej Cichocki, Unit Leader, and Kyuwan Choi, Research Scientist, of BTCC's Noninvasive BMI Unit.

Recently technological developments in the area of brain machine interface (BMI) have received much attention. Such systems allow elderly or handicapped people to interact with the world through signals from their brains, without having to give voice commands.

BTCC's new system fuses RIKEN's blind signal separation (1) and space-time-frequency filtering (2) technology to allow brain-wave analysis in as little as 125 ms, compared with the several seconds required by conventional methods. Brain-wave analysis results are displayed on a panel so quickly that drivers do not sense any delay. The system adjusts itself to the characteristics of each individual driver, improving the efficiency with which it senses the driver's commands, so the driver can teach the system his or her commands (forward/right/left) quickly and efficiently. Drivers were able to give commands to their wheelchairs correctly with an accuracy rate of 95%, one of the highest in the world.
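RIKEN's actual pipeline (blind signal separation plus space-time-frequency filtering) is far more sophisticated, but three-command decoding of imagined hand and foot movement can be caricatured with a standard motor-imagery rule: imagining a movement suppresses the mu rhythm (8-12 Hz) over the corresponding motor-cortex area, contralaterally for the hands. Everything below, channel names and band edges included, is an illustrative assumption, not BTCC's method:

```python
import numpy as np

def mu_power(channel, fs, band=(8.0, 12.0)):
    """Mu-band (8-12 Hz) power of a single EEG channel."""
    spectrum = np.abs(np.fft.rfft(channel)) ** 2
    freqs = np.fft.rfftfreq(len(channel), d=1.0 / fs)
    lo, hi = band
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

def decode_command(left_ch, right_ch, foot_ch, fs):
    """Pick the command whose imagined movement most suppresses mu power
    (event-related desynchronisation). Imagined right-hand movement suppresses
    the left hemisphere, and vice versa; imagined foot movement suppresses the
    midline foot area."""
    powers = {
        "right": mu_power(left_ch, fs),    # left-hemisphere suppression
        "left": mu_power(right_ch, fs),    # right-hemisphere suppression
        "forward": mu_power(foot_ch, fs),  # midline (foot area) suppression
    }
    return min(powers, key=powers.get)
```

With a 125 ms analysis window at a typical EEG sampling rate, this sort of rule could in principle be evaluated on every window, which is why latency can be kept low.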

Plans are underway to utilize this technology in a wide range of applications centered on medicine and nursing care management. R&D under consideration includes increasing the number of commands given and developing more efficient dry electrodes. So far the research has centered on brain waves related to imaginary hand and foot control. However, through further measurement and analysis it is anticipated that this system may be applied to other types of brain waves generated by various mental states and emotions.

Notes:

(1) Blind signal separation (BSS) is a technology that separates brain signals into noise components and the useful signal components that can be used to control the wheelchair, using only EEG signals recorded online.

(2) Space-time-frequency filtering is a technology which extracts space and time patterns and frequency oscillation data from EEG electrodes to discriminate significant features and components which are able to reliably control the wheelchair.



Wednesday, January 7, 2009

Toy trains 'Star Wars' fans to use The Force

Destiny with fun: An aspiring Jedi concentrates on making the ball in the tube rise. Credit: Uncle Milton Industries
Could The Force be with you? A toy due in stores this fall will let you test and hone your Jedi-like abilities.

The Force Trainer (expected to be priced at $90 to $100) comes with a headset that uses brain waves to allow players to manipulate a sphere within a clear 10-inch-tall training tower, analogous to Yoda and Luke Skywalker's abilities in the Star Wars films.

No, you're not tapping into some "all-powerful force controlling everything," as Han Solo said in the movies. But you are reaching out with mind power via one of the first mass-market brain-to-computer products. "It's been a fantasy everyone has had, using The Force," says Howard Roffman, president of Lucas Licensing.

Mind-control games may be the coming thing: Mattel plans to demonstrate a Mind Flex game (also due this fall), which uses brain-wave activity to move a ball through a tabletop obstacle course, at the Consumer Electronics Show in Las Vegas on Thursday.

In the Force Trainer, a wireless headset reads your brain activity, in a simplified version of EEG medical tests, and the circuitry translates it to physical action. If you focus well enough, the training sphere, which looks like a ping-pong ball, will rise in the tower.

A state of deep concentration is needed to achieve a Force-full effect. "When you concentrate, it activates the training remote," says Frank Adler of toymaker Uncle Milton Industries, which is creating the Trainer. "There is a flow of air that will move the (ball). You can actually feel like you are in a zone."
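The level progression the toy cues with audio might be sketched as: advance a rank each time concentration is held above a threshold for several consecutive readings. The threshold, hold count and the level names between "Padawan" and "Jedi" are invented for the example; the toy's real logic is not published:

```python
def training_level(attention_samples, threshold=70, hold=3,
                   levels=("Padawan", "Jedi Knight", "Jedi Master")):
    """Advance one training level each time attention stays above `threshold`
    for `hold` consecutive samples; any lapse resets the streak."""
    level, streak = 0, 0
    for a in attention_samples:
        if a >= threshold:
            streak += 1
            if streak == hold and level < len(levels) - 1:
                level += 1   # promotion: the base unit would play a sound cue here
                streak = 0
        else:
            streak = 0
    return levels[level]
```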

Star Wars sound effects and audio clips emitted from the base unit "cue you in to progress to the next level (from Padawan to Jedi) or when to move the sphere up or down to keep challenging yourself," Adler says.

"Until today, EEG technology has been designed for rigorous medical and clinical applications with little regard to price (and) ease of use," says Greg Hyver of NeuroSky, which developed the brain-wave technology for both games. "We are putting this exciting technology into everyone's living room."

Monday, December 29, 2008

When computers read our minds: Is the singularity at hand?

fora.tv — Watch a live demo of a device that uses a person's subconscious thoughts to input computer commands. It's 20 minutes of pure awesome. Start at 14:00 on the video if you want to skip to the actual demonstration.



Wednesday, September 10, 2008

The future of gaming is all in the mind

LONDON, England (CNN) -- Be excited, but be scared. A world of mind-blowing possibilities is suddenly being thrust upon the world of video gaming.

Detecting your thoughts: the EPOC headset is a breakthrough in brain-computer interfaces.


The era of thought-controlled games has arrived, and soon you could be required only to 'think' to operate a video game. Maybe you'll even have the chance to be completely immersed in a video game 'world'.

The Emotiv EPOC headset, the first brain-computer interface (BCI) device for the gaming market, is the technology behind the revolution, and the company claims to have already mastered thought control.

The EPOC detects and processes real time brain activity patterns (small voltage changes in the brain caused by the firing of neurons) using a device that measures electric activity in the brain.

In total, it picks up over 30 different expressions, emotions and actions.

The leap in technology has been met with excitement amongst many gamers. Singapore enthusiast Samuel Lau has even made a video showing his hopes for the future of gaming.

But, for the creators, what possibilities does this open up for future video games?

According to experts, the sci-fi scenarios depicted in The Matrix and Star Trek's 'Holodeck' are now comprehensible future realities.

President and co-founder of Emotiv Systems, Tan Le, said the brain-to-computer interface was undoubtedly the future for video games.

"Being able to control a computer with your mind is the ultimate quest of human-machine interaction. When integrated into games, virtual worlds and other simulated environments, this technology will have a profound impact on the user's experience."

Le envisaged the lines between games and reality continuing to blur.

"In the long run, the user's interactions with machines will more closely mimic our interactions with other humans. Our technology will ultimately bring communities of people closer together to richly share their experiences," she said.

Rick Hall, production director at the Florida Interactive Entertainment Academy, is also open-minded about possibilities in future gaming.

Hall, who has worked across machines such as the N64, Sony PSP, PS2, and Nintendo DS, told CNN that some of the concepts in The Matrix were now "eerily reaching towards theoretical possibility".

"If we can interpret basic control thoughts now, it isn't far off where we'll be able to interpret more complex thoughts, even potentially things you're not consciously thinking of. If we can now do it in a non-invasive fashion, it probably won't be long before we can read these things from across the room.

And if we can "read" complex thoughts, then shouldn't we also be able to "write" thoughts into a person's brain?

"So add that up: a wireless, remote, brain reading/writing device that can scan, interpret, and communicate with someone across the room, without them even knowing it. Connect that to the Internet... and talk about brainwashing possibilities. What if some hacker could figure out how to write viruses to people's brains? It's actually a little scary."

But, it's not all optimism and imagination for the technology.

American gaming analyst Todd Greenwald believes it may be some time yet before brain to computer interfaces reach a marketable standard, saying it is "a bit too far out and speculative to say with any confidence".

University of Ulster video gaming lecturer Darryl Charles told CNN he was also uncertain whether Emotiv's technology would take off.

"It's a little bit harder to see. It's quite a complex thing to force your thought on a television screen."

However, Emotiv's Le strongly defended the headset, saying it "works on a vast majority of people and can adapt to a wide variety of thought patterns. Emotiv has carried out tests with hundreds of people and so far we have had success on every single person."

While the speed of the revolution pushing the gaming world is hotly debated, one thing all experts agree on is the underlying themes of future games.

Gamers can be certain that social interaction and strong storylines will strengthen to form the core of games.

Tan Le told CNN, "The one thing that we believe will be core to the future of gaming is the social experience. Nothing a game developer can program can match the random nature of actually participating in a scenario with other live people."

Le said the social aspect was the key to growth of the industry, as it was opening the door to fresh markets. She acknowledged the new level of immersion offered by the Wii's interactive control had helped send the industry in the right direction.

Charles believed a move closer towards the movie and television entertainment realm was also imminent.

"The big blockbuster game is going to compete more with Hollywood movies. They will be a lot more competitive in storylines... there is a lot of production values already coming from cinema."

Greenwald said downloading games straight from the producer could soon become a reality. A market where simple games are downloaded for free and add-ons that significantly improve the game are sold at a premium could be more financially rewarding for the makers, he said.


Designed like headgear, the EPOC contains multiple sensors to measure electrical activity in the brain.

Thursday, July 10, 2008

Gaming Headset Lets Your Thoughts Control On-Screen Action


It's mind over machine: a US high-tech company has created a headset allowing computer game lovers to use their thoughts to move mountains and make objects disappear on screen. Emotiv, a San Francisco-based startup that marries neuroscience and computer engineering, says its gaming headset offers only a glimpse of what the technology has to offer.


Wednesday, February 20, 2008

Brain control headset for gamers



Tan Le


Gamers will soon be able to interact with the virtual world using their thoughts and emotions alone.

A neuro-headset which interprets the interaction of neurons in the brain will go on sale later this year.

"It picks up electrical activity from the brain and sends wireless signals to a computer," said Tan Le, president of US/Australian firm Emotiv.

"It allows the user to manipulate a game or virtual environment naturally and intuitively," she added.

The brain is made up of about 100 billion nerve cells, or neurons, which emit an electrical impulse when interacting. The headset implements a technology known as non-invasive electroencephalography (EEG) to read the neural activity.

Ms Le said: "Emotiv is a neuro-engineering company and we've created a brain computer interface that reads electrical impulses in the brain and translates them into commands that a video game can accept and control the game dynamically."

Headsets which read neural activity are not new, but Ms Le said the Epoc was the first consumer device that can be used for gaming.

"This is the first headset that doesn't require a large net of electrodes, or a technician to calibrate or operate it, and doesn't require gel on the scalp," she said. "It also doesn't cost tens of thousands of dollars."

The use of electroencephalography in medical practice dates back almost 100 years, but it is only since the 1970s that the procedure has been used to explore brain-computer interfaces.


The Epoc technology can be used to give authentic facial expressions to avatars of gamers in virtual worlds. For example, if the player smiles, winks or grimaces, the headset can detect the expression and translate it to the avatar in the game.

It can also read emotions of players and translate those to the virtual world. "The headset could be used to improve the realism of emotional responses of AI characters in games," said Ms Le.

"If you laughed or felt happy after killing a character in a game then your virtual buddy could admonish you for being callous," she explained.

The $299 headset has a gyroscope to detect movement and has wireless capabilities to communicate with a USB dongle plugged into a computer.

Emotiv said the headset can detect more than 30 different expressions, emotions and actions.

They include excitement, meditation, tension and frustration; facial expressions such as smiling, laughing, winking, shock (eyebrows raised) and anger (eyebrows furrowed); and cognitive actions such as push, pull, lift, drop and rotate (on six different axes).

Gamers are able to move objects in the world just by thinking of the action.
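A headset exposing dozens of named detections maps naturally onto an event-dispatch API on the game side: the game registers handlers for the detections it cares about, and the headset driver emits events with an intensity. The sketch below is a generic illustration, not Emotiv's SDK; the detection names and intensity parameter are assumptions:

```python
class HeadsetEvents:
    """Minimal publish/subscribe hub for named headset detections."""

    def __init__(self):
        self.handlers = {}

    def on(self, detection, handler):
        """Register a game callback for a detection such as 'push' or 'wink'."""
        self.handlers.setdefault(detection, []).append(handler)

    def emit(self, detection, intensity=1.0):
        """Called by the headset driver; returns each handler's result."""
        return [h(intensity) for h in self.handlers.get(detection, [])]
```

A game would wire cognitive actions to physics (`events.on("lift", raise_object)`) and expressions to avatar animation, leaving the driver free to emit whatever the headset detects.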

Emotiv is working with IBM to develop the technology for uses in "strategic enterprise business markets and virtual worlds".

Paul Ledak, vice president of IBM Digital Convergence, said brain-computer interfaces like the Epoc headset were an important component of the future 3D internet and the future of virtual communication.

THOUGHT-CONTROLLED GAMING HEADSET
Emotiv Epoc headset
Sensors respond to the electrical impulses behind different thoughts, enabling a user's brain to influence gameplay directly
Conscious thoughts, facial expressions, and non-conscious emotions can all be detected
Gyroscope enables a cursor or camera to be controlled by head movements
The headset uses wi-fi to connect to a computer
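The gyroscope-to-cursor mapping in the last points can be sketched as integrating head angular velocity over time, clamped to the screen. The sample interval, sensitivity and axis conventions below are all assumed values for illustration:

```python
def move_cursor(pos, gyro_rates, dt=0.02, sensitivity=50.0, bounds=(1920, 1080)):
    """Integrate head angular-velocity samples (yaw, pitch) into a cursor position.

    Each sample moves the cursor by rate * sensitivity * dt pixels; the result
    is clamped to the screen bounds.
    """
    x, y = pos
    for yaw, pitch in gyro_rates:
        x += yaw * sensitivity * dt
        y -= pitch * sensitivity * dt  # pitching the head up moves the cursor up
    x = max(0.0, min(float(bounds[0]), x))
    y = max(0.0, min(float(bounds[1]), y))
    return (x, y)
```

A relative (velocity-based) mapping like this avoids needing an absolute head orientation, which is why a gyroscope alone suffices for cursor or camera control.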