iRobot's Roomba j Series
"Herbert did what now?"
Hi there EInsighters, and welcome to this month's issue on iRobot's Roomba j Series.

I'm your curator, Idil, and I'll be taking you on a journey through one of the most talked-about tech scandals in the world, while also drawing on the EInsight of our selected EI Experts.
So let's dive right in, shall we?
Autonomous cars, military drones, and… vacuum cleaners? Turns out your favorite household appliance might be more dangerous than you thought. I mean, it isn't out binging on "How to Get Away with Murder" and plotting your demise, but it could just be the thing that invades your privacy and ruins your reputation (or makes you the talk of the town at the next potluck).

And that's exactly what happened when a group of tech enthusiasts unboxed their latest gadget: beta-testers of iRobot's Roomba j Series have recently come forward claiming iRobot shared users' data in a global data supply chain. The funny thing about data is that it comes in all shapes and sizes. In this case? It came in the form of unflattering images that were shared with contractors based outside the United States, who in turn shared the images online. Now, I don't know about you, but I'm pretty sure I would have signed up for "Big Brother" if I wanted people watching me 24/7. Who knew these little devices were capable of being more than just over-the-top dramatic?

So, could it have been avoided?
Well, I asked our EI Experts just that, and this is what they had to say...
It's no surprise that consent plays a vital role in this case, but as Anna argues, "consent as we know it is not enough; users do not understand what they accept, and the GDPR did not help. Companies need to go the extra mile (yes, even beyond compliance) to explain to their users what data is being collected and what uses it will have."

After all, you can only accept that which you're aware of.
When it comes to consent, information needs to be laid out in simple terms. You can't expect your consumers to consent to something they're not aware of; that's Contract Formation 101. And last I checked, the GDPR is 88 pages long, so unless you're a data enthusiast or a self-proclaimed legal geek, it's safe to assume most people haven't read it in full.
Some companies also fall into the trap of appointing a Chief Privacy Officer and patting themselves on the back, thinking their work here is done. Whereas in reality:
That's a great first step, but consent and privacy are ongoing concerns, not one-offs.
I asked Anna what would have happened in an ideal world where privacy was fully embedded in the company's values and culture, and this is what she came up with:
The product team would have thought twice about placing a camera in a device that is supposed to be used just for cleaning,
The legal team would have warned about the privacy issues of the camera,
The hardware and software teams would have flagged the device,
A possible privacy issue would have been explained to the beta-testers to help them reach a well-informed decision,
Employees of the external agency would have been aware of the privacy issues and might not have shared the contents of the photos online, and
There would have been a clear policy (in the shape of a checklist) of what the external agency would need to do in terms of privacy.
Sounds simple and feasible, and even these seemingly minor changes would've protected the beta-testers. As Anna emphasised, "this shows a clear need for companies to embed their values in their culture, processes, and technology. If values are not clear, we cannot expect the end product to reflect them."
Time to roll up our sleeves and get a bit technical. You ready?
This time, with Flavio, we're approaching things through the lens of multiuser technology affordances. Whoa, wait, hear us out. It's not as complicated as it sounds!
"The affordance of an artefact characterises what can (positive affordances) and what cannot (negative affordances) be done using the artefact. Let's say I have a pencil and a piece of paper. A pencil affords me the possibility of writing, which would be a positive affordance of a pencil. I could also potentially use that same pencil to stir my cup of coffee if I don't have a proper spoon within reach. To prevent this action from being possible, a designer must innovate in materials and shapes to ensure that stirring coffee is turned into a negative affordance of a pencil."
Simply put, using a pencil to write is a "yes, please", and using a pencil to stir coffee is a big "no-no", so let's design a pencil that's less likely to be used as a spoon alternative. See? Easy peasy.
"An artefact can be built to serve different types of users. Let's go back to our example: a pencil can be used by a writer as well as by an illustrator. Indeed, the same pencil can be used by both, hence (1) users of each type should be aware that an unattended pencil could afford different actions by users of different types, and (2) users of all types should be happy with everything that a pencil can afford to any user. Therefore, a key component of a good relationship between users and artefacts is a clear communication of what can and what cannot be afforded by an artefact considering every possible user." Essentially, what this means is that a single item can be designed to serve different purposes for different people. These uses should be communicated clearly to the users.
But how does all this relate back to our buddy Herbert, who's stuck on the cliff? Well...
"The new prototype released by iRobot seems to underperform in this communication. Clearly, it affords different services simultaneously to different types of users: it can afford high quality automated floor cleaning to homeowners and, at the same time, valuable information about consumer behaviour to retailers, for example. Engineers and product designers must still find a way to ensure that what these appliances can and cannot do is clearly communicated to all users through proper interface design, and that these affordances can be considered satisfactory by everybody."

You know how sometimes we read about these scandals in the news and think, "but like, that's it, right? Things can't possibly get any worse." And somehow they just do?
According to Abigail, there's more to the story because, get this, Amazon agreed to acquire iRobot on August 5, 2022. The acquisition has been deemed one of the 'scariest' in tech in terms of both antitrust and privacy concerns. "Bear in mind that Amazon is trying to dominate the smart home industry. To gain this power, what they are essentially doing is taking out their largest competitor, iRobot. What we have here is a potential antitrust violation. This acquisition would weaken competition and enhance Amazon's monopoly power in the industry." And what about the data side of things?
"We can all somewhat agree that the most valuable asset in the world right now is data. The more data one has, the more control one can exert. Amazon, as we know, has a track record of recording everything consumers do. Being able to surveil your home through this acquisition is not only highly intrusive and a breach of one's right to privacy, but it also presents Amazon with a great opportunity to create customer-tailored ads to increase its sales. Additionally, if the data collected by Amazon is not stored in anonymised datasets, this could violate the GDPR.
Another major issue is the possibility of hacking or data leakage. In such an event, Amazon could potentially face legal consequences. However, the main problem here is that they do not seem to be afraid of these consequences. Amazon has a history of saying one thing and doing another, even when they promise to respect user data privacy rights and keep data safe. They also have a history of putting their own interests before those of their customers." Guess it all goes back to that saying: with great power comes great responsibility.

Well, we're still waiting on that responsibility element. Just running a bit late… Must've taken the District line.
Anything you want to ask the EInsighters, Abigail? "These days, the cost of convenience is privacy. The key question now is: to what extent are you, as the customer, willing to trade your privacy for this convenience? What is it going to be, your Roomba vacuum cleaner or your privacy?"
Tomas hinted that this scandal fits into a broader picture of privacy and security issues that relate to the ever-increasing number of Internet of Things (IoT) applications. "With many of the IoT applications on the market today, there are some question marks about how data is handled. With this scandal, we want to focus on the two values that play a prominent role here: privacy and security."
Before we get lost in the nitty-gritty of these concepts: Tomas, you mentioned a study that gave a nice segue into privacy?
"Yeah, it is interesting to note that recent Belgian research has shown that a large group of people find it acceptable to keep tabs on third parties, such as the cleaning lady or the handyman, for example. When it comes to children or themselves, people seem much less tolerant." Makes sense if you think about it: we are usually intrigued by what others are doing, but tend to get a bit closed-off when we're the ones under the magnifying glass. Anyway, you were talking about privacy...
"Ethically, privacy is an instrumental value. It can, for instance, protect people's freedom or ensure that people are treated with sufficient respect. Which value is protected often depends on the context. Privacy is thus not unimportant to a lot of people. In fact, we see that a number of people even refrain from buying such devices because they fear for their privacy. Others do purchase the devices, but have no idea what happens to their data. People have agreed to the use of their data, but they don't know what exactly happens with it. This is where the so-called privacy paradox comes into play.
After all, you can only use the software or the device itself if you agree to the terms of use, which, among other things, ask for permission to use your data. Now, that data is used mainly to improve products. There is no doubt that data can provide a lot of information that can benefit product innovation. But at the same time, a second revenue model is attached to it: data is sold to act as training data for other AI models. That means there is not always much control over where the data can go. Zooming in on this story, the question can be asked whether privacy is still being implemented in such a way that it continues to protect other values like freedom and respect." And what do you think?
"Well, it seems that mainly the formal requirements of the law are being met. And the second value we are discussing, security, reinforces this. IoT devices often opt for cheap components that are not brilliantly secured. Moreover, the software of such devices is often not updated in case of security breaches. This lack of security means that devices can be easily hacked, which again leads to a threat to privacy and its underlying values."

So how can we prevent such risks and threats to privacy?
"It is important to include values from the very beginning of the design process. The literature speaks of 'value by design'. However, the latter requires a choice within the development process."
So one thing is clear: there's always a choice. And whether or not you make the right one depends on your organisation's culture and its values.
So what should your business do to avoid a similar tech scandal?
Clarify company values at the start to ensure they're embedded in the end product
Start implementing values from the very beginning of the design process
Ensure the capabilities of these appliances are clearly communicated to the consumers
Sounds like a lot of work? Let us help!
Check out our latest offering: the EI ETHICS BOARD.
PS Want to make "The Business Case for Ethics" at your organization?
Read our latest EQUATION Issue on why inaction is no longer an option.
A massive thank you to our incredible EI Experts for their contribution in the making of this month's issue: Anna Danés, Dr. Flavio S. Correa da Silva, Abigail Ichoku, and Dr. Tomas Folens.

That's all for now, EInsighters! See you next month...
Liked this month's issue? Share it with a friend!
Have a tech scandal you want us to cover? Let me know!