Blue looks a little bit like a child's drawing of a robot: it's made from bulky, 3D-printed parts, and it has a pair of humanoid robot arms with pincers for hands. It can be controlled using VR handsets, which let operators wave their arms about while Blue mirrors the movement. It can also be trained to manipulate objects using artificial intelligence, a control method that's still surprisingly rare in robots.
Pieter Abbeel, the roboticist leading the project, wants to change this, and he says Blue has been built from the ground up to take advantage of recent improvements in AI. "The fact that AI is becoming more capable gave us an opportunity to rethink how to design a robot," Abbeel tells The Verge.
Abbeel explains that most robots in use today are built to be strong and accurate. Their movements are predefined, and they simply repeat the same action over and over again, whether that's lifting pallets of goods, welding cars, or fastening screws into a smartphone.
The robots of the future, by comparison, will be reactive and dynamic. They’ll be able to work safely alongside humans without crushing them, and instead of having their actions planned in advance, they’ll navigate the world in real time using cameras and sensors.
"If you look at traditional robots, they're designed around the principle of very high precision and repeated motions," says Abbeel. "But you don't necessarily need sub-millimeter repeatability." (That's the ability to perform the same task over and over with differences in movement of less than a millimeter.) Humans don't have sub-millimeter repeatability. Instead, we use our eyes and sense of touch to get things done through feedback.
Abbeel and his team, research fellow Stephen McKinley and grad student David Gealy, hope Blue will operate in the same way. It has a central vision module with a depth-sensing camera, and its arms are controlled by motors with rubber bands that give it flexibility. If you push against an industrial robot arm, it's like pushing against a brick wall. But Blue is more like a human in a crowded subway car: jostle it, and it'll move aside.
This makes Blue safer to work around but also suitable for research using reinforcement learning, a type of AI training method that's becoming popular in robotics. Reinforcement learning works by asking an agent to complete a task and rewarding it when it does. It's basically trial and error, with the agent starting out with no knowledge of how to complete its goal and then slowly teaching itself over time.
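The trial-and-error loop can be made concrete with a minimal tabular Q-learning sketch. This is a generic illustration of reinforcement learning, not Blue's actual training code; the corridor task, reward, and hyperparameters are all invented for the example:

```python
import random

N_STATES = 5                 # a 5-cell corridor; the goal is the last cell
ACTIONS = [-1, +1]           # move left or move right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma = 0.5, 0.9      # learning rate and discount factor

def step(state, action):
    """Toy environment: clamp to the corridor, reward 1.0 at the goal."""
    nxt = max(0, min(N_STATES - 1, state + action))
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0), nxt == N_STATES - 1

random.seed(0)
for episode in range(500):
    state, done = 0, False
    while not done:
        action = random.choice(ACTIONS)      # explore by trial and error
        nxt, reward, done = step(state, action)
        # Nudge the estimate toward reward + discounted future value.
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next
                                       - q[(state, action)])
        state = nxt

# After training, the greedy policy moves right toward the goal.
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)  # [1, 1, 1, 1]
```

The agent starts knowing nothing, collects reward only when it stumbles into the goal, and the value updates gradually propagate that reward backward until the learned policy heads straight for it.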
This is another area where Blue might make a difference. PR2, a popular research robot built by Willow Garage that also has a pair of arms and pincers, set researchers back around $400,000. The bill of materials for Blue, by comparison, is just $3,000. Abbeel says the team hasn’t decided on a final price, but they’re hoping to target the $5,000 range.
"That becomes possible when you're willing to forgo sub-millimeter precision, because you realize you don't need it with AI-based control," says Abbeel.
Plenty of other research labs and startups are also targeting this new paradigm, hoping to teach robots how to work using artificial intelligence. Abbeel is the president of one of them, a startup named Embodied Intelligence. Kindred AI, a firm that builds robots that pick items in warehouses, is another. The Elon Musk-founded research lab OpenAI has done similar work using robot hands, and Google is also exploring AI training for robots.
Still, some experts are skeptical about Blue's appeal. They note that it's not that different from Baxter, another bot with arms and pincers that was meant to work alongside humans. The company that made Baxter, Rethink Robotics, shut down last year.
Ankur Handa, a robotics researcher at Nvidia, says Blue's pincers limit the range of tasks it can perform, and its lack of precision would be a problem, even with AI controls. "Overall, I don't think they are offering anything new," Handa tells The Verge.
But Abbeel is bullish about Blue's future. The robot is being built in small batches right now, but Abbeel hopes to scale up, eventually moving to outsourced manufacturing to produce larger numbers. The first target customers will be research labs and universities where robots are currently shared among teams, much like computers were in the 1960s. Offering a cheaper robot will make them more widely available, boosting the output of robot research.
More importantly, Abbeel hopes that Blue will provide a blueprint for what the home robot of the future could look like: something that is low cost, flexible, and plays well with humans. "The home is absolutely what we have in mind with this kind of design," he says. "There's still a lot of challenges ahead, and it's not like we think this specific robot is going in a home. [But] this is a design paradigm that takes us in a new direction."
Berkeley Open Arms | Intelligent Machines

This may be the Apple II of AI-driven robot arms

A new low-cost robot arm that can be controlled using a virtual-reality headset will make it easier to experiment with AI and robotics.

by Will Knight, April 9, 2019

Robots in factories today are powerful and precise, but dumb as toast.
A new robot arm, developed by a team of researchers from UC Berkeley, is meant to change that by providing a cheap-yet-powerful platform for AI experimentation. The team likens their creation to the Apple II, the personal computer that attracted hobbyists and hackers in the 1970s and 80s, ushering in a technological revolution.
Robots and AI have evolved in parallel as areas of research for decades. In recent years, however, AI has advanced rapidly when applied to abstract problems like labeling images or playing video games. But while industrial robots can do things very precisely, they require painstaking programming and cannot adapt to even the slightest changes. Cheaper, safer robots have emerged in recent years, but most are not designed specifically to be controlled using AI software.
"Robots are increasingly able to learn new tasks, whether through trial and error or via expert demonstration," says Stephen McKinley, a postdoc at UC Berkeley who was involved with developing the robot. "Without a low-cost platform—an Apple II-type device—experimentation, trial and error, and productive research will continue to move slowly. There is potential for research to be greatly accelerated by making more robots more accessible."
The new arm, known as Blue, costs around $5,000, and it can be controlled via a virtual-reality headset—a technique that is proving useful for training robot-controlling AI algorithms.
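The idea of training from teleoperated demonstrations can be sketched with a toy behavior-cloning example. This is a generic illustration of learning from demonstration, not the Berkeley pipeline; the 1-D task, the scripted stand-in for a VR operator, and the nearest-neighbor policy are all invented for the example:

```python
# Behavior cloning in miniature: record (state, action) pairs from a
# demonstrator, then have the learned policy imitate them. "States" are
# 1-D gripper positions and "actions" are position increments.
GOAL = 1.0

def demonstrator(state):
    """Stand-in for a human teleoperator: step toward the goal."""
    return 0.1 if state < GOAL else 0.0

# Record one demonstration trajectory.
dataset = []
state = 0.0
for _ in range(20):
    action = demonstrator(state)
    dataset.append((state, action))
    state += action

# Cloned policy: copy the action of the most similar recorded state.
def cloned_policy(state):
    nearest = min(dataset, key=lambda sa: abs(sa[0] - state))
    return nearest[1]

# Roll out the clone from a start it never saw; it still reaches the goal.
state = 0.05
for _ in range(20):
    state += cloned_policy(state)
print(round(state, 2))  # ends near GOAL
```

Real systems fit a neural network rather than looking up the nearest recorded state, but the structure is the same: demonstrations in, imitating policy out.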
Blue is capable of carrying relatively heavy loads but is also extremely backdrivable, meaning it will comply when pushed or pulled. This makes it safe for people to work alongside, and allows it to be physically shown how to do something. The system provides low-level software for controlling the robot and for the VR system, and it is designed to be compatible with any computer running AI software.
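The compliant behavior can be pictured as a soft spring-damper controller: drive each joint toward a target with low stiffness and it yields when pushed, then settles back. The following is a generic impedance-control sketch, not Blue's controller; the unit-inertia joint, gains, and simulated push are invented for illustration:

```python
def joint_torque(target, pos, vel, stiffness=5.0, damping=4.0):
    """Spring-damper law: low stiffness makes the joint backdrivable."""
    return stiffness * (target - pos) - damping * vel

# Simulate a unit-inertia joint held at angle 0.0 while a person pushes it.
pos, vel, dt = 0.0, 0.0, 0.01
peak = 0.0
for t in range(400):
    external = 2.0 if t < 200 else 0.0      # human pushing for two seconds
    acc = joint_torque(0.0, pos, vel) + external  # unit inertia: acc = torque
    vel += acc * dt
    pos += vel * dt
    peak = max(peak, pos)

# The joint gives way while pushed, then settles back toward zero.
print(round(peak, 2), round(abs(pos), 3))
```

A stiff industrial arm corresponds to a very large `stiffness` value: the same push would barely move it. Dialing the gains down is what makes an arm safe to jostle and easy to guide by hand.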
The project comes from the lab of Pieter Abbeel, a professor at UC Berkeley who has pioneered the application of AI to robotics (see Innovators Under 35: Pieter Abbeel). The IP for the project has been licensed from UC Berkeley by a new company called Berkeley Open Arms, which will develop and sell the hardware.
It remains extremely difficult to translate machine learning from a virtual environment to the real world. Despite this, in recent years academic researchers have made progress in applying machine learning to robot hardware, leading to some spectacular demonstrations and a few commercial ventures.
Some canny companies have taken notice of the trend. Nvidia, a chipmaker that has ridden the AI boom by making microprocessors and software for deep learning, recently launched a lab dedicated to exploring applications of AI to robots (see This Ikea kitchen might teach industrial robots to be less dumb and more helpful).
Nvidia CEO Jensen Huang notes that while an industrial robot may cost around $50,000 to buy, it can cost many times that to reprogram one for a series of different tasks. "We have it the wrong way around," he says. He expects big advances in robotics in years to come thanks to advances in machine learning and virtual-reality simulation: "Robots and AI are now the same thing."