Findings Could Help Seamlessly Integrate Prosthetics
A University of Houston engineer reports in eNeuro that a brain-computer interface, a form of artificial intelligence, can sense when its user expects a reward by examining the interactions between single-neuron activity and the information flowing to those neurons, called the local field potential.
Professor of biomedical engineering Joe Francis reports that his team's findings allow for the development of an autonomously updating brain-computer interface (BCI), one that improves on its own by learning about its subject without having to be reprogrammed.
The findings potentially have applications for robotic prosthetics, which would sense what a user wants to do (pick up a glass, for example) and do it. The work represents a significant step forward for prosthetics that perform more naturally.
“This will help prosthetics work the way the user wants them to,” said Francis. “The BCI quickly interprets what you’re going to do and what you expect as far as whether the outcome will be good or bad.” Francis said that information raises scientists’ accuracy in predicting reward outcome to 97 percent, up from the mid-70s.
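The article does not describe the team's actual model, but the idea of predicting reward outcome from neural features can be sketched with a simple classifier. The following is a minimal illustration on synthetic data, assuming hypothetical spike-rate and LFP-power features; the feature layout, trial counts, and shift size are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic trials: each row holds hypothetical spike-rate and LFP-power
# features; the real study's features and dimensions differ.
n_trials, n_features = 200, 8
X = rng.normal(size=(n_trials, n_features))
y = (rng.random(n_trials) < 0.5).astype(float)   # 1 = rewarding trial
X[y == 1] += 0.8                                 # reward expectation shifts activity

# Logistic regression trained by plain gradient descent (no ML library).
w = np.zeros(n_features)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # predicted P(reward)
    w -= 0.1 * X.T @ (p - y) / n_trials          # gradient step on weights
    b -= 0.1 * np.mean(p - y)                    # gradient step on bias

accuracy = np.mean((p > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

With a clear separation between rewarding and non-rewarding trials, even this toy classifier reaches high accuracy; the reported jump from the mid-70s to 97 percent came from combining single-neuron activity with the local field potential.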
To understand how reward affects activity in the brain’s primary motor cortex, Francis used implanted electrodes to record brainwaves and spiking activity while tasks were performed, examining how the interactions between them are modulated by conditioned reward expectations.
“We assume intention is in there, and we decode that information by an algorithm and have it control either a computer cursor, for example, or a robotic arm,” said Francis. Interestingly, even when the task called for no movement, just passively observing an activity, the BCI was able to determine intention because the pattern of neural activity resembled that during movement.
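Decoding intended movement from neural activity, as Francis describes, is often done with a linear decoder that maps firing rates to cursor velocity. The sketch below fits such a decoder by least squares on synthetic data; the neuron count, tuning model, and noise level are assumptions for illustration, not the study's actual method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic recording: 50 neurons whose firing rates depend linearly on
# a 2-D intended cursor velocity (a hypothetical tuning model).
n_samples, n_neurons = 1000, 50
true_map = rng.normal(size=(n_neurons, 2))          # each neuron's tuning
velocity = rng.normal(size=(n_samples, 2))          # intended velocity
rates = velocity @ true_map.T + 0.5 * rng.normal(size=(n_samples, n_neurons))

# Fit a linear decoder by least squares: velocity ≈ rates @ W
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

decoded = rates @ W
corr = np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1]
print(f"decoded-vs-true correlation (x axis): {corr:.2f}")
```

An autonomously updating BCI of the kind the findings point toward would refit a decoder like this continuously, using the brain's own reward signal as feedback instead of explicit recalibration sessions.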
“This is important because we are going to have to extract this information and brain activity out of people who cannot actually move, so this is our way of showing we can still get the information even if there is no movement,” said Francis. This process makes use of mirror neurons, which fire both when an action is taken and when it is observed.
“This examination of reward motivation in the primary motor cortex could be useful in developing an autonomously updating brain machine interface,” said Francis.
Learn more: Research Moves Closer to Brain-Machine Interface Autonomy