
Rice computer scientists lead robot workshop, address privacy concerns at IROS 2023

Kavraki lab researchers lead robot task and motion planning workshop, address data and image privacy breaches in robotics


At the 2023 International Conference on Intelligent Robots and Systems (IROS) in Detroit, Michigan, Rice University computer scientist Lydia Kavraki joined several of her Ph.D. students, postdocs, and alumni to lead a task and motion planning workshop and present a paper on privacy concerns.

Privacy concerns addressed in new robotics paper

Their paper, “AI Robots as Double Agents: Privacy in Motion Planning,” was authored by Rahul Shome, Zak Kingston, and Kavraki. Shome joined the faculty in the School of Computing at the Australian National University following his stint as a Rice Academy Fellow with Kavraki.

“I work on the computational processes and algorithms that robots use to ‘figure out’ and plan to solve problems intelligently,” said Shome. “The technologies our work is enabling will increasingly allow sophisticated robots with sensors to decide how to move and capture information in environments occupied by humans.” 

Through discussions with others in the lab, Shome and Kingston were motivated to “start asking some hard questions about the ethical and privacy implications of scientific advances in robot motion planning when these robots are controlled by manufacturers and operate among us. This has opened up several fruitful avenues of intellectual promise in the important interdisciplinary cross-cutting domains that can inform responsible robotic intelligence of the future,” said Shome.

Kingston completed his Ph.D. in 2021 and is continuing his research as a postdoctoral fellow. He said, “The reason why Rahul and I wanted to investigate this topic is because we are now actually seeing robotic systems deployed in everyday life: autonomous cars are driving around town, food delivery robots are roaming campuses, and some systems are entering the home. It's no longer an intellectual exercise to think about what consequences having a robot around will bring; it's happening right now, and robots really are just very fancy data collection platforms that have a friendly face.”

The increasing acceptance and pervasive use of robotics in homes and workplaces is possible only because today’s robots have grown so adept at deliberately moving, sensing, and interacting with their environments. Yet the same capabilities that promise societal and economic payoffs also leave robots open to abuse. The Rice researchers dubbed robots “AI double agents” for their potential to violate the privacy of coworkers, privileged spaces, and other stakeholders. For example, the MIT Technology Review published a story in 2022 describing how images, including one of a woman sitting on a toilet, were captured by a robot vacuum and subsequently posted on social media.

Kingston said, “When we took a look at the state of the academic literature on privacy and robotics, we found it to be anemic. This wasn't a problem that a lot of people were thinking about, but we need to think about privacy from the get-go. And we need to approach it not only from a cybersecurity angle of protecting collected data, but also from a robotics angle in which the system chooses what data it even collects in the first place. For example, images of your face can't be leaked if the robot doesn't look at your head.” 
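The paper does not prescribe a specific implementation, but the idea Kingston describes, deciding up front what the sensors are allowed to capture, can be expressed as an extra validity check inside a motion planner. The Python sketch below is only a minimal illustration of that idea under assumed names and numbers (the privacy-zone box, the cone-shaped camera model, and the functions camera_sees_zone and is_state_valid are all hypothetical, not taken from the paper): candidate robot states whose camera would image a designated privacy zone are simply rejected, the geometric analogue of a robot that never looks at your head.

```python
import itertools

import numpy as np

# Hypothetical privacy zone: an axis-aligned box around a region the robot's
# camera should never image (illustrative coordinates, in meters).
PRIVACY_ZONE_MIN = np.array([1.0, 1.0, 1.4])
PRIVACY_ZONE_MAX = np.array([1.5, 1.5, 1.9])


def camera_sees_zone(camera_pos, camera_dir, fov_deg=60.0, max_range=4.0):
    """Crude visibility test: does a cone model of the camera's field of
    view contain any corner of the privacy-zone box?"""
    half_fov = np.radians(fov_deg) / 2.0
    camera_dir = camera_dir / np.linalg.norm(camera_dir)
    for corner in itertools.product(*zip(PRIVACY_ZONE_MIN, PRIVACY_ZONE_MAX)):
        offset = np.asarray(corner) - camera_pos
        dist = np.linalg.norm(offset)
        if dist == 0.0 or dist > max_range:
            continue
        angle = np.arccos(np.clip(np.dot(offset / dist, camera_dir), -1.0, 1.0))
        if angle <= half_fov:
            return True
    return False


def is_state_valid(camera_pos, camera_dir):
    """Extra validity check a sampling-based planner could apply alongside
    collision checking: reject states whose camera would view the zone."""
    return not camera_sees_zone(camera_pos, camera_dir)


# Example: a camera at the origin pointed toward the zone is rejected.
print(is_state_valid(np.array([0.0, 0.0, 1.5]), np.array([1.0, 1.0, 0.1])))  # False
```

In a real planner such a check would sit alongside collision checking, so the only paths returned are those that keep the camera averted from the protected region.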

The team believes that researchers and practitioners should be actively designing systems that respect humans in shared spaces. “At least that is my hope, because I want to live in a world where I don't have to worry about robots looking over my shoulder. A good starting point is for developers to actively think about what data actually needs to be collected for a robot to operate safely and efficiently,” said Kingston.

Shome was encouraged by the audience’s positive response to their suggestions and challenges to ‘do better.’ He said, “What was surprising to me was the number of young robotics researchers who sought us out after our presentation. It was rewarding to see them going through their ‘Aha!’ moment while engaging with us. I think starting these conversations is a very big positive first step.”

Kavraki, the Noah Harding Professor of Computer Science, professor of Bioengineering, Electrical and Computer Engineering, and Mechanical Engineering, and the Director of the Ken Kennedy Institute at Rice, was not surprised by the audience response. She said, “I think there is increased awareness of the unintended consequences of our research work. This is very encouraging. It will lead to improved algorithm design and responsible computing.”

Workshop explores transition of TAMP theory into practice

The IROS workshop, “Task and Motion Planning: from Theory to Practice,” was led by Neil Dantam and Khen Elimelech in collaboration with Kavraki and Federico Pecora, a senior manager in applied science at Amazon Robotics.

A former member of Kavraki’s lab and now an associate professor of computer science at the Colorado School of Mines, Dantam focuses his research on developing techniques for robot planning and control that are mathematically verifiable, physically applicable, and easy to use. His interests align with the workshop’s target audience: researchers from academia, industry, and government working in AI planning, motion planning, and controls who are interested in improving the autonomy of robots for complex, real-world tasks such as mobile manipulation.

Elimelech is an instructor and a postdoctoral researcher working with both Kavraki and Rice University Professor Moshe Vardi. Leading the IROS workshop aligns with his research interest in developing autonomous robots that can robustly perform cognitive processes like planning and decision making in order to solve large-scale, real-world tasks. This is the latest of three professional workshops he has led at major robotics conferences since joining Rice.

Elimelech explained how the workshop came to be: “About a year ago, Professor Dantam and I met at another robotics conference. We talked about our shared research interest in robotic planning, and chatted about the Task and Motion Planning (TAMP) workshop series he used to organize. The series was very successful and drew big crowds, but had to take a break during the pandemic. We decided to join forces and revive it.”

Regarding the theme selection, Elimelech said, “Every year the workshop focuses on a specific theme. The theme for this year, ‘TAMP: from theory to practice,’ arose quite naturally. We were wondering why the academic research we were doing did not seem to be adopted broadly by industry for real-world applications. We thought this would be a point worth discussing.”

Dantam added, “TAMP research has been proceeding for about 20 years in academia, but we've so far seen limited reduction to practice from all this research effort. By bringing together academic researchers and practitioners from both industry and government labs, we hoped to bridge these gaps to raise the impact of TAMP research going forward.”

Elimelech continued, “We thought it was important to include diverse voices in this discussion. We invited world-leading TAMP researchers from academia, as well as industry leaders to share their needs and experiences—all of whom agreed on the importance of the topic. The interaction between the speakers and the audience was fantastic. The closing panel discussion helped identify some of the standing gaps and hopefully gave a few directions towards bridging them.”

Kavraki, who has co-organized all five previous annual TAMP workshops, was pleased with the 2023 participation and feedback. She said, “The field of TAMP has grown from an esoteric to a mainstream topic in the robotics community. This year’s workshop was particularly interesting as it gave us compelling examples of real-world problems where TAMP methods have been successfully applied. Research in this topic will bring us closer to the goal of having robots seamlessly collaborate with humans.”


Shown in photo (from left to right): Rahul Shome, Khen Elimelech, Constantinos Chamzas, Wil Thomason, Lydia Kavraki, Zak Kingston, Kostas Bekris, and Neil Dantam.


Carlyn Chatfield, contributing writer