AI doesn’t mean that the robots are coming

Pepper, the humanoid robot, was born in 2014. It enjoyed a brief wave of hype, including a visit to the Financial Times to meet the editor. "This is a robot that behaves autonomously and is powered by love," explained Masayoshi Son, head of its main financier, SoftBank. Alibaba and Foxconn also invested hundreds of millions to make robots a ubiquitous part of daily life. But it wasn't to be. In a public library in Japan you can still occasionally find a Pepper, unplugged, head bowed, like a four-foot-tall Pinocchio who dreamed of becoming a real boy but never made it. Production stopped in 2021, with only 27,000 units made.

But the vision of humanoid robots, machines so like us that they can do all the work we don't want to do, is too tempting to give up for long. Recent, dramatic advances in artificial intelligence have sparked a new wave of enthusiasm for robotics. "The next wave of AI is physical AI. AI that understands the laws of physics, AI that can work among us," Jensen Huang, CEO of chip designer Nvidia, said earlier this year. Nvidia has capitalized on the boom in training AI models to become the world's second-largest company by market capitalization.

Billions of dollars in venture capital are flowing into robotics start-ups. They aim to apply the same model-training techniques that allow computers to predict how a protein will fold or to produce strikingly realistic text. The goal is, first, to make robots understand what they see in the physical world and, second, to have them interact with it naturally, solving the enormous programming task embodied in an action as simple as picking up and manipulating an object.

That's the dream. However, the latest round of investors and entrepreneurs is likely to be just as disappointed as those who backed Pepper. That is not because AI is not useful. Rather, it is because the obstacles to building a commercially viable robot that can cook dinner and clean toilets are a matter of hardware, not just software, and AI does not inherently address those obstacles, let alone solve them.

These physical challenges are varied and difficult. For example, a human arm or leg is moved by muscles, while a robot's limbs must be driven by motors. Each additional axis of motion requires more motors. All of this is doable, as the robotic arms in factories demonstrate, but the high-performance motors, gears and transmissions involved add bulk, cost, power demands and numerous components that can and will fail.

Once the desired movement has been achieved, there is the challenge of perception and feedback. When you pick up a piece of fruit, for example, the nerves in your hand tell you how soft it feels and how hard you can afford to squeeze it. You can taste whether the food is cooked and smell whether it is burning. None of these senses is easy to provide for a robot, and to the extent they are possible, they add cost. Computer vision and AI can compensate by observing whether the fruit has been squashed or the food in the pan has turned the right color, but they are an imperfect substitute.

Then there is the question of power. Every autonomous machine needs its own energy source. The robot arms in factories are plugged into the power grid; they cannot move around. A humanoid robot would most likely use a battery, but that brings trade-offs in size, power, strength, flexibility, running time, working life and cost. These are only some of the problems. Plenty of smart people are working on solving them, and they are making progress. But the point is that these are physical challenges that have been around for a long time and are hard; even a revolution in AI will not make them disappear.

What, then, will AI make possible in the physical world? Instead of imagining how the technology will enable new machines, it is more useful to imagine how existing machines will change when AI is applied to them.

The most obvious example is self-driving cars. Here the machine does not need to change at all: a car's movement through the physical world and its energy source work as they always have, while the perception involved in driving is almost entirely visual. With the new fashion for AI, the hype around autonomous vehicles has died down. If anything, the opposite should be true: self-driving is a huge market and the physical-world challenge that AI can most readily address, a point that anyone tempted to invest in other applications of robotics should consider.

It is also worth thinking about how existing robots, from industrial robot arms to robot vacuum cleaners, will evolve. AI-powered machine vision will subtly expand the range of tasks a robot arm can perform and make collaboration with humans safer. Lightweight, single-purpose devices such as robot vacuum cleaners are gradually becoming more useful. In Chinese hotels, for instance, it is already common for a robot to bring deliveries up to the room. This kind of limited, controlled autonomy is the easiest to achieve.

In this way, AI will slowly bring us closer to androids. As for a robot like Pepper that can clean the toilet: unfortunately, it is much easier to build one that writes bad poetry, and that is unlikely to change any time soon.
