What Is Embodied Intelligence and What Can It Do

Researchers discuss the link between artificial intelligence and robotics at San Diego Robotics Forum

Artificial intelligence might be the focus of many conversations today, but it cannot touch or manipulate the physical world without a body. A group of robotics experts got together at the University of California San Diego to talk about the many ways intelligence could be embodied to interact with the real world.

The discussion was at the core of the 9th San Diego Robotics Forum, held Sept. 19 on the UC San Diego campus.

“Why embodied intelligence? Because we are starting to see how we can build AI systems that interact with the physical world,” said Henrik Christensen, director of UC San Diego’s Contextual Robotics Institute and a professor in the UC San Diego Department of Computer Science and Engineering. “We are seeing progress but we also see wild predictions. We need to exercise some caution and not oversell what AI can actually do.”

In classical thought, going back at least to the French philosopher René Descartes, mind and body have been considered two separate entities, said Mike Tolley, a faculty member in the Department of Mechanical and Aerospace Engineering, who spoke on embodiment. But that’s a false dichotomy: roboticists know well that bodies influence the way we think, plan and move.

Faculty members Nikolay Atanasov, from the Department of Electrical and Computer Engineering, and Hao Su, from the Department of Computer Science and Engineering, outlined various approaches to embodied intelligence, including using sensors, large datasets and learning-based models to help robots perceive, plan and act. Atanasov spoke on environment estimation and Su on foundation models in robotics.

In addition to self-driving cars, embodied intelligence also has applications in robotic surgery. Michael Yip, also a faculty member in the Department of Electrical and Computer Engineering, who spoke on planning with AI, is developing devices and algorithms to improve a surgical procedure used to diagnose lung cancer; to evacuate the wounded in situations where rescuers cannot reach them; to assist surgeons during procedures that require three hands; and more.

The forum also featured talks by Jonathan Hurst, chief roboticist at Agility Robotics, and Paolo Pirjanian, CEO and founder of Embodied, Inc.

The event also featured an open house in Franklin Antonio Hall, with demonstrations from the three robotics collaboratories.

Marie Christensen, who oversees marketing and outreach for the Institute of the Global Entrepreneur at the Jacobs School, welcomes the forum’s audience at the institute’s open house in Franklin Antonio Hall.
PhD student with a training robot

Researchers in Yip’s lab developed a robot that will train other robots to safely move humans who are unconscious or have mobility issues in areas where human rescuers would be in harm’s way. The robot provides a platform on which other robots can learn to safely assist people.
A student teleoperating a robot, surrounded by people

Researchers in Yip’s lab are also testing a surgical robot that can be guided via a virtual reality headset and joysticks to perform operations remotely. The robot has already been teleoperated across the country in both directions, between San Diego on the West Coast and Maryland on the East Coast.

The robot weighs 25 pounds and could be air-dropped into places where surgeries are needed but surgeons would be in harm’s way, such as natural disaster zones. It could also be used to provide medical services in remote, low-access communities, and would allow patients to be treated by world experts in certain procedures without delays or the need to travel.

Dog robot surrounded by people

Researchers in Christensen’s lab demonstrate software that enables a robot to scan its environment and learn to avoid people and objects. The software was developed in collaboration with Naval Information Warfare Systems Command in San Diego.

Self-driving scooter surrounded by people

Researchers in Christensen’s lab are developing an autonomous scooter that can both come to a rider when called and return to base when the ride is over. Their prototype can currently drive autonomously on a regular road. Researchers are now teaching it to handle intersections.

An insect robot

This flying robot emulates insect flight to help researchers understand the properties of flapping wings. It was developed in the lab of Nicholas Gravish, a faculty member in the Department of Mechanical and Aerospace Engineering.

PhD student with a robot shaped like a fish

A researcher in Tolley’s lab shows off a robot that can swim like either a tuna or an eel, depending on the wavelength at which its body oscillates. Researchers are working to make the robot swim untethered.
