What my brassiere photo folder says about me and maybe the world

The realisation that our iPhones can now tag and neatly organise our boob shots into an album may not be all that sinister. But if tech is smart enough for that, we need to be smarter about tackling its misuse.

03 November 2017, 10:25pm

Premenstrual pumped-up cleavage, after-gym full-frontals, and that time my friend kept chucking her T-shirt over her head like a footballer celebrating a hat trick. These pictures, and, weirdly, one of a full, nude bum, comprise my iPhone photo album's "Brassiere" category. Of 12,715 photos (I know), 27 are automatically filed under "Brassiere". This capability garnered attention this week after bemused women on Twitter -- including Chrissy Teigen -- expressed concern that Apple is categorising near-nude photos. It turns out the function was rolled out with June 2016's introduction of iOS 10, and there are 4,432 objects or scenes that your phone's camera has been detecting since, including Guacamole (I score 5), Jubilations (208!) and Pontoons (91???). But, of course, there's no underpants category.

When I look at my "Brassiere" photos, I don't only see boobs (and a bum); I see the evolution of my relationship, now striding into its fourth year. We were long distance, so why not share some fleeting close-ups of boobs? But it's a bit strange now, looking back, how open we were with one another so early on. Passions can mellow, sure, but the more you learn about a partner's foibles and fears (tangled power cords and spiders, respectively), the more you trust them. Could it be that we sent each other nudes not for the visuals, but for the thrill of handing trust over to someone who hadn't yet earned it?

Or maybe, once it's taken, a nude, cropped and filtered, feels less like us and more like something that's just on our phone. We trust our phones more than we trust each other. And they harvest what we give them with glee and, most of the time, our permission. But while it's long been obvious that tech is capable of horrors, that's only ever down to someone, somewhere, making it so. And people can be awful.

Recently, someone registered an email address in my name, then used it to request nudes from my girlfriend. It was a ploy so convincing that later that evening she asked, sincerely, why I'd been so perverse as to request nudes on her first day in a new job. But I knew nothing of this email, and it was only when we tapped on the display name -- my name -- and saw the actual email address -- my Twitter handle @mail.com -- that we realised it was an unsophisticated, but targeted, phishing scam. Maybe they wanted to blackmail us with the nudes; maybe they just wanted the nudes. Either way, it gave us the creeps.

I got in touch with Action Fraud, an online fraud and cybercrime reporting centre. But while there's recourse for identity theft when the overall aim is to steal, say, your credit card details, money can always be earned back. There's no earning your body back once it's been seen by someone you didn't want seeing it.

Using tech for good, I asked Twitter and Facebook if anyone else had experienced this scam. Two women got in touch. Both had been emailed by an account under the name of a boyfriend or an ex, using the same Twitter handle @mail.com format. What was striking, though, was that neither has much in common with my girlfriend beyond a slightly high profile on social media. If this phisher had picked out the three of us, who else had they done this to?

I went to the police. There's power in numbers, after all. But I was told this isn't a crime (even though sextortion is actually a crime), and that it could only constitute harassment if the emails continued after each of us had blocked the fake accounts. When I said that might not stop the phisher doing this to others, the officer replied: "Maybe delete your Twitter?"

A report is now with Action Fraud, and though I did consider speaking to a revenge porn hotline, their work comes after the fact, taking down non-consensually distributed images from porn sites. Meanwhile, whoever spent the time creating email addresses in my name and others' to get nudes of strangers without their informed consent is still out there, emailing who knows who else.

This is just one example, not of tech's toxicity, but of authorities lacking the resources, or the will, to deal with those who turn tech toxic.

The "Brassiere" revelation isn't shocking because cameras now have the power to detect what's in the image -- sometimes really badly, like, none of my pictures are of an actual pontoon. The revelation is shocking because people -- certain types of people, let's leave it at that -- have decided to create such a capability with no idea of how someone, somewhere, might one day pervert it. Because technology doing bad things is just people doing bad things, this time with with new, "disruptive" tools. If tech is smart enough to identify our "Brassieres", modern law and authorities need to be smart enough to identify the full spectrum of ways tech can be manipulated for worse, and shore up against them.

As this 12,000-times-retweeted thread from actor and writer Kumail Nanjiani explains, there's no proper ethical arbiter of tech products -- but then, there's no proper ethical arbiter of tech behaviours either. And maybe there should be? After all, as I write, "fake news" has been anointed word of the year by Collins Dictionary. Toxic tech's impacts are huge, and we need an army to fight it. Maybe it could use one of my pontoons?