Findings Could Help Seamlessly Integrate Prosthetics
A University of Houston engineer reports in eNeuro that a brain-computer interface driven by artificial intelligence can sense when its user is expecting a reward by examining the interactions between the activity of single neurons and the aggregate electrical signal surrounding them, called the local field potential.
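One simple way to picture a spike–LFP interaction is a spike-triggered average of the field potential. This is only an illustrative sketch on synthetic data; the study's actual analysis pipeline is not described here, and the spike-triggered average is an assumption, not the paper's method.

```python
import numpy as np

# Illustrative only: a spike-triggered average (STA) is one basic measure
# of how single-neuron spiking relates to the local field potential (LFP).
rng = np.random.default_rng(0)

fs = 1000                      # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)   # 10 s of simulated data

# Simulated 10 Hz LFP oscillation plus noise
lfp = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Simulated neuron that preferentially fires near the LFP peak
rate = 20 * np.clip(np.sin(2 * np.pi * 10 * t), 0, None)  # spikes/s
spikes = rng.random(t.size) < rate / fs                   # Bernoulli thinning

# Spike-triggered average: mean LFP in a +/-50 ms window around each spike
win = int(0.05 * fs)
idx = np.flatnonzero(spikes)
idx = idx[(idx > win) & (idx < t.size - win)]
sta = np.mean([lfp[i - win:i + win + 1] for i in idx], axis=0)

# A neuron phase-locked to the LFP yields an STA peak near lag zero
print(f"STA at lag zero: {sta[win]:.2f}")
```

Because the simulated neuron fires near the oscillation's peak, the averaged LFP around its spikes is strongly positive at zero lag, which is the signature of spike–field coupling.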
Professor of biomedical engineering Joe Francis reports that his team’s findings enable the development of an autonomously updating brain-computer interface (BCI), one that improves on its own by learning about its subject without having to be reprogrammed.
The findings potentially have applications for robotic prosthetics, which could sense what the user wants to do (pick up a glass, for example) and do it. The work represents a significant step toward prosthetics that perform more naturally.
“This will help prosthetics work the way the user wants them to,” said Francis. “The BCI quickly interprets what you’re going to do and what you expect as far as whether the outcome will be good or bad.” Francis said that information raises the accuracy with which scientists can predict reward outcomes to 97 percent, up from the mid-70s.
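The kind of prediction described, labeling each trial as rewarding or not from neural activity, can be sketched as a binary classification problem. Everything below is an assumption for illustration: the synthetic “firing rate” features, the effect size, and the plain logistic-regression classifier are stand-ins, not the study’s actual features or model.

```python
import numpy as np

# Toy sketch: classify reward vs. no-reward trials from synthetic
# per-unit firing rates using logistic regression (hypothetical setup).
rng = np.random.default_rng(1)

n_trials, n_units = 400, 8
labels = rng.integers(0, 2, n_trials)            # 1 = reward expected
base = rng.normal(10.0, 2.0, (n_trials, n_units))  # baseline firing rates
X = base + labels[:, None] * 1.5                 # reward shifts rates up
X = (X - X.mean(axis=0)) / X.std(axis=0)         # z-score each unit

# Logistic regression trained by batch gradient descent
w, b = np.zeros(n_units), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - labels)) / n_trials
    b -= 0.5 * np.mean(p - labels)

p = 1 / (1 + np.exp(-(X @ w + b)))               # final predictions
accuracy = np.mean((p > 0.5) == labels)
print(f"training accuracy: {accuracy:.2f}")
```

The point of the sketch is the problem's shape, a label predicted from a population activity vector; the reported jump from mid-70s to 97 percent accuracy came from adding LFP-derived information to such a predictor, per the article.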
To understand the effects of reward on activity in the brain’s primary motor cortex, Francis used implanted electrodes to record brainwaves and single-neuron spikes while tasks were performed, examining how their interactions are modulated by conditioned reward expectations.
“We assume intention is in there, and we decode that information by an algorithm and have it control either a computer cursor, for example, or a robotic arm,” said Francis. Interestingly, even when the task called for no movement, just passively observing an activity, the BCI was able to determine intention because the pattern of neural activity resembled the pattern during movement.
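The article does not name the decoding algorithm, so the following is only a sketch of the general idea using the simplest possible choice: a linear decoder, fit by least squares, that maps a population firing-rate vector to a 2-D cursor velocity. All data are synthetic and the linear model is an assumption.

```python
import numpy as np

# Hypothetical linear decoder: firing rates -> 2-D cursor velocity.
rng = np.random.default_rng(2)

n_samples, n_units = 2000, 12
true_W = rng.normal(size=(n_units, 2))          # hidden tuning weights

# Simulated spike counts per time bin, and the velocity they encode
rates = rng.poisson(8, (n_samples, n_units)).astype(float)
velocity = rates @ true_W + rng.normal(0, 1, (n_samples, 2))

# Fit the decoder on the first half, evaluate on the held-out second half
half = n_samples // 2
W_hat, *_ = np.linalg.lstsq(rates[:half], velocity[:half], rcond=None)
pred = rates[half:] @ W_hat

# Correlation between decoded and actual x-velocity on held-out data
r = np.corrcoef(pred[:, 0], velocity[half:, 0])[0, 1]
print(f"decoded vs. actual x-velocity correlation: {r:.2f}")
```

An "autonomously updating" BCI of the kind the article describes would refit weights like `W_hat` on the fly, using internally sensed signals such as reward expectation as its feedback instead of manual recalibration.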
“This is important because we are going to have to extract this information and brain activity out of people who cannot actually move, so this is our way of showing we can still get the information even if there is no movement,” said Francis. This process relies on mirror neurons, which fire both when an action is performed and when it is observed.
“This examination of reward motivation in the primary motor cortex could be useful in developing an autonomously updating brain machine interface,” said Francis.
Learn more: Research Moves Closer to Brain-Machine Interface Autonomy