Press "Enter" to skip to content


Come again?

Last week, sex-tech geek Stu Nugent posted a thread critiquing a product pitch from a startup. Go and read it, it’s both hilarious and depressing.

(TL;DR, a woman’s orgasm is undetectable and probably fake, unless she has been fitted with monitoring gear and an algorithm for verification. Buy our cum-checksum app. Quite an innovative self-own there, lads).

The upstarts then compounded their error by trying to mansplain women’s orgasms back to them, only with SCIENCE 1 this time, rather than misogynist-aimed marketing copy. According to the company, critiquing their marketing materials in public is ‘unethical’… but developing an algorithm that polices women’s sexual experience is apparently fine.

Since I’m on Covid-19 furlough and therefore not allowed to do client work, I thought I’d entertain myself by testing out the theory behind the app with a bit of data protection analysis, while indulging in the many amusing opportunities for bawdry it presents.

CAVEAT: all I know about this app is what I’ve linked here. I haven’t reviewed any designs or spoken to the company behind it. This blog post is not legal advice or a professional consultation.

So, who’s the Data Controller?

The company developing this ‘solution’ is Relida Ltd, so right now, they’re wearing the hat. However, personal data acquired or generated through phone apps usually has a number of Controllers, from developers to white-labellers, to the advertising networks and device manufacturers, each processing the data-take for different purposes and by different means. As this isn’t a live app yet, all I can say is that Relida had better have a very long and hard think about this before they achieve release.

Fair, lawful and transparent.

I’ll sound the FAIRNESS ALARM here and there during this analysis, because that’s much too widely-spread a topic to condense into one paragraph.

Let’s look at lawful basis.

This is an app intended explicitly for sexual use. That makes all of the associated personal data special category personal data – whether the datapoint is heart rate, a username, an advertising analytics ID, whatever. The only applicable Article 9 lawful basis 2 is consent, since even installing the app is a strong indicator that one has something of a sexual life, even if it is a solitary one. One could argue to exhaustion about whether heart-rate data is SCD in the abstract, but in this specific case, it very definitely is. Consent required. Lots of it. Informed, unambiguous, specific, freely-given, evidenced, easily-withdrawable consent.
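To make ‘evidenced’ and ‘easily-withdrawable’ concrete, here’s a minimal sketch (in Python, and entirely my own invention – nothing Relida has published) of what a consent record meeting those criteria might look like. Note that even a perfect record only proves a box was ticked, not who ticked it – which is exactly the problem the next paragraph gets into.

```python
# Illustrative sketch only – field names and structure are my assumptions,
# not anything from the actual product.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    data_subject_id: str   # the person being monitored – not necessarily the installer
    purpose: str           # one specific purpose per record: no bundling
    notice_version: str    # which privacy notice the subject was actually shown
    given_at: datetime     # evidence of when consent was given
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # Withdrawal must be as easy as giving consent (GDPR Art. 7(3)),
        # so a single call – not a support ticket – ends the processing.
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def is_valid(self) -> bool:
        return self.withdrawn_at is None
```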

The app is marketed at people who want to know if the woman they are having sex with is having orgasms, so the subject of the heart-rate monitoring data is not necessarily the person who installs the app – it’s the woman being monitored. This means that consent can’t be reliably evidenced, no matter how many tickboxes or popups are added (FAIRNESS ALARM). Monkeying about with consent is dodgy enough when it’s about data; when sex is involved, there should be no wiggle room whatsoever. That’s a huge ethical/UX design challenge, and judging by the state of this company’s PowerPoint skills, not one they are remotely competent to attempt.

As there’s no actual commercial product here yet, transparency isn’t something that can be assessed – however, at the very least, any use of the data for marketing, profiling, customer research, or other secondary purposes must be explained in detail and based on explicit, unbundled consents. I’ve yet to see any app or web developer achieve this, ever, so if Relida manage it I will be astounded.

Speaking of which, if this app works the same way almost every other app in existence does, users will be inviting a gaggle of spectators to their happy fun time, and allowing them to gawk at someone’s inner workings. Now, if you’re into digital dogging as a form of exhibitionism, then you and your partner(s) might be okay with that – fine, it’s your choice as consenting adults – but if you’d prefer not to share your earth movements with a bunch of randos, then the app needs to come clean about who is getting an eyeful.

Data models and management.

Is there really a specific heartbeat pattern that is unique and exclusive to orgasm but universal to women? Well, I guess it’s possible, but I want to see repeatable peer-reviewed studies done in lab conditions with a vastly diverse subject set 3, and until that happens I’m going to be very, very sceptical. In the absence of any evidence supporting this theory (citations, references to studies, interrogable statistics), the entire premise of the app rests upon the unverified claims of the vendors. Now, I don’t know about you, but in my experience, vendors are unlikely to raise their skirts to allow exposure of their proprietary algorithms, so their claims are probably unverifiable anyway.

Assuming that there is a special heartbeat rapping out the body code equivalent of “YEEEEAAAAHHHHHHH”, then the only processing of personal data that is required is a comparison of input from the sensor to stored values for O, then output of a result. No names required. No geolocation. No contacts list or content of phone calls and messages. Just tracking data (presumably from a Watch or some such) and a comparison algorithm, which could very likely be run locally without requiring the device to squirt any data elsewhere.
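For the avoidance of doubt about how little data this would actually need, here’s a rough sketch of that local comparison. The ‘reference pattern’ and threshold are made up – remember, no verified pattern has ever been published – but the point is that nothing has to leave the device:

```python
# Hypothetical on-device check: compare a window of heart-rate samples
# against a stored reference pattern. No names, no location, no network.
REFERENCE_PATTERN = [72.0, 95.0, 130.0, 145.0, 120.0, 88.0]  # invented bpm curve
THRESHOLD = 0.9

def similarity(sample: list[float], reference: list[float]) -> float:
    """Crude normalised similarity between two equal-length bpm series."""
    diffs = [abs(s - r) / r for s, r in zip(sample, reference)]
    return 1.0 - sum(diffs) / len(diffs)

def looks_like_o(sample: list[float]) -> bool:
    # The entire 'product', minus the surveillance.
    return similarity(sample, REFERENCE_PATTERN) >= THRESHOLD
```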

Accuracy is a challenge here. If you thought it was difficult to ask a simple question of a human being, consider how much more complex it would be to make a judgement on a single individual’s unique physical state at a particular moment based on some total stranger’s unverified quantification of what they think an orgasm should look like. I mean, come on. There are a lot of problems with the data quality of the ‘fitness monitors’ on the market at the moment, so much so that they cannot be relied on for medical purposes, let alone the subtleties of an extremely complex and variable state. Add to that the implications of false negatives (especially in the type of abusive relationship where the imposition of this technology is likely to be common and involuntary), and – FAIRNESS ALARM – you’ve just ruined your shot at the first principle again.

Especially since there’s only one way to determine whether your sexual partner has arrived at the fireworks demonstration, and that’s to take their word for it that they did. If you don’t believe they will tell you the truth, then WTF are you doing having sex with someone you don’t trust? And ask yourself why they might feel that they can’t be honest with you? This is the true minimum-dataset approach to achieving the purpose of improving or enhancing your mutual encounters. It’s also the easiest, the most accessible and the most ethical approach. No technology required! USE YOUR WORDS, PEOPLE.

It’s proving difficult to get there… and by ‘there’, I mean any circumstance in which this product, as marketed, could be useful, let alone lawful.

As there is a better, less-intrusive, more accurate way to achieve the purpose that doesn’t involve processing any personal data at all, the app itself is redundant. Like all those ‘consent apps’, it seeks to address a perceived challenge with a technological solution but fails to identify the actual root cause of the problem: people doing sex wrong 4. In this example, the ‘problem’ is framed as ‘women lie or aren’t competent to recognise the Big Bang when it happens’ when actually, the problem is far more likely to be ‘you’re crap at sex’. 5

From a data protection point of view, if the primary purpose of the app were in fact to invalidate women’s lived experiences, foster mistrust of women’s sexual self-assertion and propagate the destructive notion that sex is mostly about gratifying men’s egos – well, you could build a data model that serves the app’s purpose, but – FAIRNESS ALARM: self-explanatory.

In case it isn’t clear; one does not enhance the quality of one’s intimate encounters by requiring KPIs and mandatory reporting from one’s partner.

If I were doing the DPIA, I wouldn’t even get as far as the security assessment before making a very firm DO NOT DO THIS THING recommendation. However, as such recommendations are rarely – if ever – heeded, I would certainly want to see data maximally-minimised for every processing operation, encrypted in transit, encrypted at rest (with robust key management), de-linked and airgapped from identifiers, the app password-protected, the data stored only on the device and as much processing as possible done locally, robust assurance for all third-party arrangements, a programme of monitoring for internal misuse and a delete-on-demand option for the user that includes all data inputs and outputs, no matter where they are located. Most of this is the bare minimum that the GDPR requires anyway, but I will buy a winning 6 Lottery ticket for anyone who can show me an app architecture that actually implements the Principles effectively.
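For illustration, here’s roughly what the ‘stored only on the device, encrypted at rest, delete-on-demand’ baseline looks like in practice – a sketch using Python’s cryptography package, with the obvious caveat that real key management belongs in the platform keystore (Keychain / Android Keystore), not next to the data:

```python
# Minimal sketch: local-only storage, encrypted at rest, deletable on demand.
# Requires: pip install cryptography
from pathlib import Path
from cryptography.fernet import Fernet

DATA_FILE = Path("readings.enc")  # hypothetical on-device store

def save_locally(key: bytes, readings: bytes) -> None:
    DATA_FILE.write_bytes(Fernet(key).encrypt(readings))

def load_locally(key: bytes) -> bytes:
    return Fernet(key).decrypt(DATA_FILE.read_bytes())

def delete_on_demand() -> None:
    # Must cover every input and output, wherever it is located – trivial
    # here precisely because nothing ever left the device.
    DATA_FILE.unlink(missing_ok=True)

key = Fernet.generate_key()
save_locally(key, b"hr-series: 72,95,130,145,120,88")
assert load_locally(key).startswith(b"hr-series")
delete_on_demand()
```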

Accountability can’t be analysed when a product doesn’t actually exist, but based on the company’s marketing materials, and their missing-the-point-completely response to the public reaction telling them that their idea is a) pointless, b) creepy, c) borderline-abusive, I’m not sanguine. The Cypriot Data Protection Authority does seem to have some teeth 7, although they appear to be reserved for nibbling at the outer lobes of big Controllers.

Speaking of DPIAs, this is a set of processing operations that definitely needs one 8. Data about sexual activity in (they hope) large amounts from devices linked to unique individuals. Potential for discrimination. Systematic monitoring. Yep, this one’s in the ‘high risk of harm to the rights and freedoms of individuals’ category.

And then there are the macro effects, the consequences affecting us in groups on a large scale. Harmful ideas being quantified and turned into targets. Like ‘sexual enjoyment has no merit outside this very specific set of physiological processes’ (bzzt – incorrect), or ‘treating women like autonomous human beings is an optional upgrade to the heterosexual experience’ (bzzzt again – it’s the minimum threshold for participation).

Now, if someone were to pitch a truly science-based, accessible, gender-agnostic, data-protection-by-design-and-default, privacy-friendly (this means no ad trackers, no social media SDK use, no 3rd party dataflows, great security, complete control over the data and total transparency) app for one to use to recognise one’s own stages of arousal, for therapeutic and entertainment purposes, then I’d be squealing with joy.

Data protection verdict: FAIL – frustrating, disappointing, unnecessary, possibly dangerous. You’re better off with inclusive & supportive sex education and frank conversations – none of which necessitate an app.

Background reading on the female orgasm

  1. although they have yet to produce any actual science
  2. the GDPR applies in Cyprus, where the company is based
  3. I’d volunteer – sounds like a fun way to contribute to medical science
  4. by ‘wrong’, I mean treating it as anything other than a continually negotiated journey of mutual enjoyment between consenting adults, for which the parameters are agreed upon and respected by all parties. Examples of ‘wrong’ include: treating sex as a conquest, a competition, imposing physical conformity, coercive actions and behaviour, assuming entitlement to anything outside one’s own skin
  5. There are apps for helping you work on that though, you’ll be pleased to hear – although none of them in any combination are an adequate substitute for honest communication and mutual respect between participants.
  6. subject to the mathematical probabilities of the game, this is not a guarantee, terms and conditions apply
  7. I want to make a vagina dentata joke, I really do
  8. Article 35 says so

Meme Frenzy

At some point, I’m going to try and make a privacy notice delivered through the medium of internet memes. While playing about with the possibilities of this, I got totally sidetracked and ended up data-protection-ifying a load of popular memes for my own nerdy amusement.

Here are the fruits of my misdirected labour. I think I might need to get out more.

StalkerChimps

This morning, I was spending my leisure time researching options for email newsletters. Just to be clear, this isn’t something I would necessarily choose to do for fun, but is linked to my role as Digital Officer for a certain professional association for information rights professionals.

All of the reviews I read seem to hold MailChimp up as cost-effective, easy to use and feature-rich. “Great”, I thought, and then the privacy nerd in me started muttering… I wasn’t surprised to see that MailChimp are a US company, as their inability to spell common words such as “realise” and “harbour” had already clued me up to this, but that doesn’t necessarily present an insurmountable data protection problem for a UK organisation looking to use their services (setting aside the current kerfuffle about Safe Harbour/Privacy Seal/NSA etc etc). I thought that, as a prospective customer of their services, I’d check out the privacy policy (nothing more embarrassing than accidentally using personal data unfairly or unlawfully when you’re acting as a professional organisation for privacy enthusiasts…).

And I found this (a screenshot from their privacy policy – for the record, the annotations are mine):

Which basically translates to:

“We are going to follow you all over the web, conducting surveillance on you without telling you and then use what we have discovered to try and predict the best ways to manipulate you in order to make money for our customers, clients and suppliers.”

Oh yeah, and there’s also this: “As you use our Services, you may import into our system personal information you’ve collected from your Subscribers. We have no direct relationship with your Subscribers, and you’re responsible for making sure you have the appropriate permission for us to collect and process information about those individuals. We may transfer personal information to companies that help us provide our Services (“Service Providers.”) All Service Providers enter into a contract with us that protects personal data and restricts their use of any personal data in line with this policy. As part of our Services, we may use and incorporate into features information you’ve provided or we’ve collected about Subscribers as Aggregate Information. We may share this Aggregate Information, including Subscriber email addresses, with third parties in line with the approved uses in Section 6.[screenshot]”

Now, I have most definitely had emails from businesses that I’ve used in the past, which – upon unsubscribing – I have discovered are using MailChimp. No-one has ever told me that when I gave my email address to them, they would pass it on to a US company who would then use it for stalking and profiling me. Well, hur-hur, it’s the Internet, what did I expect?

Wait. Being “on the internet” does not mean “no laws apply”. And in the UK, for UK-registered organisations, the UK Data Protection Act does most certainly apply. You cannot contract out of your organisation’s responsibilities under DPA. Now, for those of you reading this who aren’t DP geeks (Hi, nice to see you, the party’s just getting started!), here’s a breakdown of why I think using MailChimp might be a problem for UK organisations….

The UK Data Protection Act has 8 Principles, the first of which is that “personal data shall be processed fairly and lawfully”. Part of “fair and lawful” is that you must be transparent about your use of personal data, and you mustn’t breach any of the Principles, commit any of the offences or use the data for activity which is otherwise inherently unlawful (like scams and fraud, for example). One key requirement of being “fair and lawful” is using a Fair Processing Statement (a.k.a. “Privacy Notice”) to tell people what you are doing with their data. This needs to include any activity which they wouldn’t reasonably expect – and I would think that having all of your online activity hoovered up and used to work out how best to manipulate you would fit squarely into that category. Or am I just old-fashioned?

Anyway, using MailChimp for email marketing if you don’t tell people what that implies for their privacy? Fail No.1.

Then there’s the small matter of MailChimp’s role in this relationship. Under DPA, we have Data Controllers and Data Processors. For the sake of user-friendliness, let’s call them respectively “Boss” and “Bitch”. The organisation that is the Boss gets to make the decisions about why and how personal data is used. The organisation that is the Bitch can only do what the Boss tells them. The terms of how the Boss-Bitch relationship works need to be set out in a contract. If the Bitch screws up and breaches privacy law, the Boss takes the flak, so the Boss should put strict limitations on what the Bitch is allowed to do on their behalf.

Now, I haven’t seen the Ts and Cs that MailChimp are using, or whether there is any mention of Data Controller/Data Processor relationships, but I doubt very much if they could be considered a proper Bitch, because they use a lot of subscriber data for their own ends, not just those of the organisation on whose behalf they are sending out emails. So if MailChimp aren’t a Bitch, then they are their own Boss – and so giving personal data to them isn’t the equivalent of using an agency for an in-house operation, it’s actually disclosure of the information to a third party to use for their own purposes (which may not be compatible with the purposes you originally gathered the data for). Now, one of the things you’re supposed to tell people in a privacy notice is whether you are going to disclose their data, what for, and to whom. You’re also not supposed to re-purpose it without permission. Oops again (Fail No. 2).

I’m gonna skirt past the 8th Principle (don’t send data overseas without proper protection), because there’s just so much going on at the moment about the implications of sending data to the US that we’ll be here for hours if I get into that. Suffice to say, if the Data Controller (Boss) is a US firm, you have no rights to visibility of your data, control over its accuracy, use, security or anything else (Principles 2-7). None. Kthxbye. That might be fine with you, but unless you are informed upfront, the choice of whether or not to engage with the organisation that’s throwing your data over the pond to be mercilessly exploited is taken away from you. Not fair. Not lawful. Fail No.3.

Aaaaand finally (for this post, anyway) there’s the PECR problem. Simplified: PECR is the law that regulates email marketing, one of the requirements of which is that marketing by email, SMS and to TPS-registered recipients requires prior consent – i.e., you can’t assume they want to receive it, you must ask permission. It does however contain a kind of loophole where if you have bought goods or services from an organisation, they are allowed to use email marketing to tell you about similar goods and services that you might be interested in (until you tell them to stop, then they can’t any more). This means that where the soft opt-in applies, you can send people email marketing without their prior consent (it’s a bit more complicated than that, but this isn’t a PECR masterclass – more info here if you’re interested).
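To make the rule concrete, here’s my simplified (and emphatically not-legal-advice) rendering of that PECR decision in code form:

```python
# Simplified sketch of the PECR email-marketing rule described above:
# prior consent, or the soft opt-in (existing customer + similar
# goods/services), and in either case no opt-out on record.
def may_send_email_marketing(
    has_prior_consent: bool,
    is_existing_customer: bool,
    product_is_similar: bool,
    has_opted_out: bool,
) -> bool:
    if has_opted_out:
        return False  # an opt-out trumps everything
    soft_opt_in = is_existing_customer and product_is_similar
    return has_prior_consent or soft_opt_in
```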

However, PECR doesn’t cancel out DPA or contradict it, or over-ride it. You must comply with both. And this means that any company relying on the soft opt-in to send email marketing via MailChimp is almost certainly in breach of the Data Protection Act unless, at the time they collect your email address, they have very clearly a) stated that they will use it for email marketing purposes and b) obtained your permission to pass it to MailChimp to use for a whole bunch of other stuff. Ever seen anything like that? Nope, me neither. Fail No. 4.

So how come this is so widespread and no-one has sounded the alarm? Well, based on my observations, here are some reasons:

  1. No-one reads terms and conditions unless they are corporate lawyers. Even if the Ts and Cs were read and alarm bells were rung, chances are that the Marketing department or CEO will have a different idea of risk appetite and insist on going ahead with the shiny (but potentially unlawful) option anyway.
  2. By and large, very few organisations in the UK actually ‘get’ the Data Protection Act and their responsibilities under it. They also don’t really want to pay for DP expertise, since it will undoubtedly open a can of worms that will cost money to fix and cause extra work for everyone. Much easier to take the ostrich approach and rely on the fact that…
  3. …the vast majority of UK citizens don’t understand or care about data protection either. Sometimes there is a gleam of interest when the word “compensation” pops up, but mostly they see it as a hurdle to be sneaked around rather than a leash on a snarling mongoose. Every now and again there is a spurt of outrage as another major breach is uncovered, but these are so common that “breach fatigue” has set in.
  4. Data-trading makes money, and ripping off people’s data/spying on them without giving them a choice/share of the cut/chance to behave differently makes more money than acting fairly and ethically.
  5. Fundamental cultural differences between the US and the EU’s approach to privacy. If you read this blog post by MailChimp’s General Counsel/Chief Privacy Officer, the focus is mostly on data security and disclosure to law enforcement. There’s little about the impact on personal autonomy, freedom of action or the principles of fairness that EU privacy law is based on. Perhaps that’s because most of that stuff is in the US Constitution and doesn’t need restating in privacy law. Maybe it’s because the EU has had a different experience of what happens when privacy is eroded. Maybe he ran out of time/steam/coffee before getting into all that.

Anyway, if you got this far, thanks for reading – I hope there’s food for thought there. I’m not advocating that anyone boycott MailChimp or anything like that – but if you’re gonna use them, you should consult a data protection expert to find out how to protect a) your organisation b) your customers and c) the rest of us.

Right, back to web design research it is……


How To Not Be An Arse

(a.k.a the futility of compliance-for-the-sake-of-it programmes)

Imagine there was a law* that says “don’t be an arse to other people” which contains a list of 8 general requirements for avoiding arse-ness, including (among others) “be fair”, “be honest”, “don’t be reckless or negligent” and “don’t deny people their rights”.

Then hundreds of thousands of hours, billions of beer tokens and litres of sweat from the brows of assorted lawyers and auditors later, a number of standards and frameworks, guidance documents and checklists were produced to help everyone ensure that whatever they’re doing, they’re avoiding being an arse.

At which point, everyone’s efforts get directed towards finding some technical way to acquire a clean, shiny glowing halo: ticking all of the boxes on the checklists, generating reams of ‘compliance’ paperwork, churning out Arse Avoidance Policies… but actually ending up as almost *twice* as much of an arse, because despite all of the shouting and scribbling and hymn-singing, what they are actually doing on a day-to-day basis looks remarkably arse-like (despite being called a “Posterior-Located Seating and Excretion Solution”; not the same thing at all) – since, as it turns out, arsing around is lucrative and being well-behaved is not so much.

And then the question is no longer “how do we avoid being arses?” or even “what do we need to do to make sure we aren’t accidentally being arses?” but becomes “what is the bare** minimum we have to do in order not to appear to be arses?”

And that becomes the standard that (nearly) everyone decides to work to, writing long, jargon-filled statements explaining “why we are definitely not arses at all”, insisting that you must all complete a mandatory, dry-as-dust, uninformative half-hour “Anti Arse” e-learning module once a year (and calling it a “training programme” – hah!), hiring armies of lawyers to define the boundaries of “arse” and generally forgetting what it was that the law was trying to achieve in the first place. All of that costs quite a lot of money and – surprise, surprise – doesn’t actually fulfil the intent of the law in the first place.

If you have to hide, obfuscate or misdirect from what you are really doing, then it’s quite likely that you are not achieving compliance with the law, no matter how much paperwork you generate or how shiny your halo looks.

It’s quite simple……just don’t be an arse.


(*in case you didn’t get it; that would be the Data Protection Act…..)

(**yes I had to get a ‘bare’ reference in there somewhere)
