AirTags and the weirdness of new technology – and what we can do about it, by Luiza Jarowski | October 2022

[Image: people wrapped in orange tape, protesting. Photo by Micah Baumeister on Unsplash.]

Some technologies can be scary and harm people.

In my ongoing Ph.D. research, I discuss unfair data practices across the data cycle, meaning unfair practices that occur during data collection, data processing, and data use. When inappropriate practices occur in the data use phase, they are usually associated with a lack of adequate monitoring, guidelines, and enforcement, as well as with the absence of tools to protect vulnerable populations. As a result, users are left exposed to harm. Let me give some examples:

A first example is the use of AirTags by abusers to stalk their current or former partners. An AirTag can be defined as a “shiny, half-dollar-sized coin containing a speaker, Bluetooth antenna, and battery, which helps users keep track of their missing items.” Their main purpose is to help their owners find luggage, wallets, keys, or any other personal items that may get lost. They became increasingly popular when airports reopened after the coronavirus lockdowns, as overcrowding caused a huge increase in lost luggage.

[Image: an AirTag. Photo by Jonas Elia on Unsplash.]

Although this was not the original plan for AirTags, they began to be used by abusive partners, ex-partners, and anyone wanting to covertly track another person. After gaining access to the records of eight police departments, Vice reported that:

“Out of a total of 150 police reports mentioning AirTags, in 50 cases women called the police after they started receiving notifications that their whereabouts were being tracked by an AirTag they did not own. Of those, 25 could identify a person in their life – an ex-partner, husband, or boss – whom they strongly suspected of planting the AirTags on their cars in order to follow and harass them. These women reported that current and former intimate partners – the people most likely to harm women overall – are using AirTags to stalk and harass them.”

[Image: AirTag tracking. Photo by Duc Trinh on Unsplash.]

In Apple’s case specifically, there is an additional problem of scale, as AirTags can take advantage of a global network of nearly a billion iPhones and Macs to detect and report them. A massive monitoring system has been created, in which every Apple user acts as a live tracker unless they opt out of the “Find My” network.
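To illustrate the mechanic behind that scale, here is a deliberately simplified, hypothetical sketch in Python of a crowdsourced item-finding network. It is not Apple’s real protocol (which, among other things, encrypts location reports and rotates the tag’s identifiers); it only shows why every participating phone becomes a location relay for any tag it walks past.

```python
# Toy simulation of a crowdsourced item-finding network. This is NOT Apple's
# actual protocol (which encrypts location reports end to end and rotates tag
# identifiers); it only illustrates why every participating phone effectively
# acts as a location relay for any tag it passes.
from dataclasses import dataclass

@dataclass
class Report:
    tag_id: str        # identifier broadcast by the tag over Bluetooth
    location: tuple    # the bystander phone's own GPS fix (lat, lon)
    timestamp: float   # when the phone heard the tag

class FindingService:
    """Central server collecting sighting reports from bystander phones."""

    def __init__(self):
        self.reports = []

    def submit(self, report):
        # Called automatically by any phone that hears a tag nearby.
        self.reports.append(report)

    def locate(self, tag_id):
        # Whoever controls the tag can retrieve every sighting of it.
        return [r for r in self.reports if r.tag_id == tag_id]

service = FindingService()
# Two strangers' phones hear the same tag and report their own positions.
service.submit(Report("tag-123", (48.8566, 2.3522), 1660000000.0))
service.submit(Report("tag-123", (48.8584, 2.2945), 1660000900.0))
print(service.locate("tag-123"))  # a movement trail assembled by bystanders
```

The more phones participate, the denser the movement trail that whoever controls the tag can reconstruct, which is exactly what makes the same mechanism so effective for stalking.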

deepfakes

On the topic of abusive partners, former partners, and sexual predators, another technology that has been misused for harassment is deepfake software. Noelle Martin explains that, when she was 18, she found her face edited into pornographic videos and images, as if she were one of the performers. The videos and images had been created by a group of unidentified sexual predators, and she discovered the deepfakes through a reverse Google image search.

Even though deepfake techniques may have legitimate uses, such as educational tools, photo editing, image repair, and 3D transformation, their main application nowadays seems to be cyber exploitation. According to a report by DeepTrace, 96 percent of all deepfake videos available online are non-consensual pornography.

gender identity

Another example of inappropriate data use can be found in the fields of machine learning and facial recognition. Automated gender recognition (AGR) is a type of facial recognition technology that uses machine learning to automatically classify the people in a picture or video as male or female.

However, gender is not a binary trait but a spectrum, and it is sometimes the object of a lifelong search. How could an algorithm possibly classify it, when sometimes it is not yet clear even to the individual? As human-computer interaction researcher Os Keyes put it:

“This technology reduces gender to a simple binary and, as a result, is often harmful to individuals such as trans and non-binary people who may not fit into these narrow categories. When it is used for things like verifying one’s identity at a door or for an online service, it leads to discrimination.”

It is an algorithm built to fail: no matter how accurate its developers claim it to be, attributing gender should not be the role of automated machines.
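To make the structural problem concrete, here is a minimal, hypothetical sketch (written in PyTorch, and not taken from any real AGR product) of how such a classifier is typically wired. The binary is not a tuning choice that could be fixed later; it is hard-coded into the output layer itself.

```python
# Hypothetical sketch of an automated gender recognition (AGR) classifier.
# It is not taken from any real product; it only shows how the binary
# assumption is baked into the architecture itself.
import torch
import torch.nn as nn

class BinaryAGRClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # A tiny convolutional feature extractor over face crops.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(8),
            nn.Flatten(),
        )
        # The crucial design choice: a single logit, i.e. exactly two classes.
        self.head = nn.Linear(16 * 8 * 8, 1)

    def forward(self, face):
        # The output is a probability of one label versus the other; the model
        # has no way to answer "neither", "both", or "unknown".
        return torch.sigmoid(self.head(self.features(face)))

model = BinaryAGRClassifier()
fake_face = torch.randn(1, 3, 64, 64)  # stand-in for a cropped face image
print(model(fake_face))                # always a single number in [0, 1]
```

Whatever threshold a developer picks, the output space contains exactly two labels, which is the structural limitation Os Keyes describes.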

In the examples above, the use of data was invasive and limited the autonomy of the affected individuals, due to a lack of technical safeguards or regulatory constraints. In some cases, the technology facilitated psychological or physical harm.

What I argue in my research, and summarize here, is that before a product is made available to the public, its developers must ensure that it will not result in harm to psychological well-being, physical safety, or any other kind of damage to its users.

Any product that involves the collection and processing of personal data requires a thorough evaluation of its potential for abusive use, in addition to a data protection impact assessment. Developers must be trained to identify the broad set of impacts that a technology can have on individuals and society, with particular attention to children, minorities, protected groups, and vulnerable populations.

Technology is extremely powerful, and it can bring about so many positive changes. However, humans should always be the focus. No matter how advanced and innovative a certain technology is, there should always be enough barriers and mechanisms to support humans and prevent harm.

Of course, this is not solely the responsibility of developers. Regulation on inappropriate data uses should be tougher and more specific. But that is a topic for another edition of this newsletter.


What are your views on this? Do you have examples of other creepy technologies? What do you think makes some technologies scary? I would love to read your opinion in the comments below.

Privacy needs critical thinkers like you. Share this article and start a conversation on the topic.

Sign up for Privacy Whisperer and follow us on Twitter, LinkedIn, and YouTube.

See you next week. All the best, Luiza Jarowski
