Roomba testers found sensitive images uploaded to social media

Robot vacuums have always seemed like helpful, friendly little guys to me, but perhaps that's because I haven't used them much. I love the idea of a little dude who goes around bumping into things and making my house cleaner. The idea seems less charming now that I've found out they can take pictures of you on the toilet and then upload them to social media without you knowing. That's a level of weird bullying I don't want from my mechanical servant.

Eileen Guo over at MIT Technology Review has been digging into how such photographs from iRobot's Roomba vacuums, with clearly visible faces of users, made their way to social media. The pictures include very candid shots of people doing personal things in their home, including images of women and children on the toilet. Guo also made an excellent Twitter thread with further explanations and helpful links.

The sensitive images were clearly taken from the position of the robot vacuum, which captured them in the course of collecting data. One affected user told MIT Technology Review that he was a product tester for the iRobot Roomba J series, which meant letting the robot roam the house collecting information in the hopes of improving the product.

Expecting your information to be sent back to a secure company that's looking to train its cleaning AI is one thing; finding out those images are also being uploaded to social media is another. MIT Technology Review found that when iRobot collects all that data, it sends it off to data annotation companies. One such company was Scale AI, which hires remote contractors to help assess the uncensored data.

This led to workers sharing images between themselves on social media, which of course made their way to the wider world. Many users felt this was a breach of their trust, if not their contract as testers. At least 15 images have made their way out, but it's likely far more are being shared. Appropriately, iRobot has stopped working with Scale AI.

Unfortunately, iRobot isn't doing much more than that to help those affected or reinstill confidence. CEO Colin Angle responded to MIT Technology Review's report in a LinkedIn post that didn't acknowledge any problem or danger in providing these uncensored images to gig workers. With so little accountability or recourse for those affected, the whole arrangement seems like a dangerous idea.

Angle spends the first part of the LinkedIn post talking about how great the company's Roombas are, attributing this to data collected from testers like these, then goes on to throw them under the bus, saying they're not consumers and have consented to having their data collected.

It’s good to know this isn’t happening with regular consumer iRobot products, but the lack of accountability to testers doesn’t make me want to go out and grab a Roomba any time soon.

The LinkedIn post also chastises MIT Technology Review for sharing censored versions of the images in its article, which doesn’t make a tonne of sense given iRobot already willingly shared uncensored ones with strangers that ended up online.
