If you think with the release of every new i-device the world is getting closer to thought-controlled smart tech and robotic personal assistants, you might be right.
And thanks in part to work led by the University of Cincinnati’s Anca Ralescu, we may be even closer than you realize.
Professor Ralescu of the Department of Electrical Engineering and Computing Systems will discuss her team’s research aims and current progress on brain-computer interface at the International Human-Centered Robotics Symposium (HuCeRo). The University of Cincinnati’s College of Engineering and Applied Science (CEAS) will host the symposium on Nov. 14-17 at UC’s Kingsgate Marriott Conference Center. The symposium aims to bring together leading researchers and engineers in the fields of robotics, computer science, material science and brain-computer interaction. Ralescu’s presentation will be Nov. 17.
Brain-computer interface uses electroencephalography (a measure of the brain’s electrical activity) to help distinguish which brain signal corresponds with the body’s performance of a particular intended action. In these experiments, Shikha Chaganti, a graduate student in computer science advised by Ralescu, specifically targeted brain impulses generated when a person thought about going from a sitting position to standing and vice versa. Computers process this data – which can be reinforced by combining it with measures of electrical activity in muscle – in order to detect these brain signals and interpret their intent. The idea is to allow a person to use thought alone to communicate with a computer about the intent to move.
“The problem is quite difficult,” Ralescu says. “We are experimenting with processing the signal and selecting useful features from it, and designing a classifier capable of distinguishing between these two transitions – sitting to standing and standing to sitting.”
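The pipeline Ralescu describes – process the EEG signal, extract useful features, then train a two-class classifier – can be sketched roughly as follows. This is purely illustrative: the sampling rate, frequency bands, feature choices and classifier here are common defaults in motor-intent EEG work, not details published by Ralescu's team, and the data is synthetic.

```python
# Hypothetical sketch of an EEG motor-intent pipeline: band-power features
# from signal windows, fed to a minimal two-class classifier
# (sit-to-stand vs. stand-to-sit). All parameters are assumptions.
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz

def band_power(window, lo, hi, fs=FS):
    """Mean spectral power of the window in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def features(window):
    """Feature vector of mu-band (8-12 Hz) and beta-band (13-30 Hz) power,
    two bands commonly examined in movement-related EEG studies."""
    return np.array([band_power(window, 8, 12), band_power(window, 13, 30)])

class NearestCentroid:
    """Minimal classifier: assign each sample to the closer class mean."""
    def fit(self, X, y):
        self.centroids = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
        return self

    def predict(self, X):
        labels = list(self.centroids)
        dists = np.stack([np.linalg.norm(X - self.centroids[c], axis=1)
                          for c in labels])
        return np.array(labels)[dists.argmin(axis=0)]

# Synthetic stand-in data: one "transition" class dominated by 10 Hz
# activity, the other by 20 Hz, plus noise (purely illustrative).
rng = np.random.default_rng(0)
t = np.arange(FS) / FS

def make_window(freq):
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(FS)

X = np.array([features(make_window(10)) for _ in range(20)] +
             [features(make_window(20)) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)

clf = NearestCentroid().fit(X, y)
acc = (clf.predict(X) == y).mean()
```

Real EEG is far noisier and less stationary than this toy data, which is part of why Ralescu calls the problem difficult; in practice the feature set and classifier would be tuned per subject, and signals such as muscle activity (EMG) could be added as extra feature channels.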
Ralescu’s work eventually could be used in conjunction with another project being presented at HuCeRo by UC’s Gaurav Mukherjee, a master’s student in mechanical engineering in CEAS, and Grant Schaffner, an assistant professor in UC’s Department of Aerospace Engineering and Engineering Mechanics. Mukherjee and Schaffner designed and built a spring-assisted leg exoskeleton that can help people with impaired mobility. By integrating Ralescu’s brain-computer interface into the exoskeleton, someone using the device could think, “I’m going to stand,” and receive a robotic boost while rising to their feet.
Ralescu also says HuCeRo will feature discussion of UC’s development of an interdisciplinary curriculum for human-centered robotics.
“The idea of human-centered systems in general, and human-centered robotics in particular, is not new. But to some extent things were just not in place. Some technology or other was either missing or too expensive,” Ralescu says. “To my knowledge, there is no formal curriculum in this area in any university. If UC moves forward with creating such a curriculum, it could be among the first of its kind. So UC would be a pioneer in establishing such a curriculum.”