Last week, sex-tech geek Stu Nugent posted a thread critiquing a product pitch from a startup. Go and read it; it’s both hilarious and depressing.
(TL;DR, a woman’s orgasm is undetectable and probably fake, unless she has been fitted with monitoring gear and an algorithm for verification. Buy our cum-checksum app. Quite an innovative self-own there, lads).
The upstarts then compounded their error by trying to mansplain women’s orgasms back to them, only with SCIENCE (although they have yet to produce any actual science) this time, rather than misogynist-aimed marketing copy. According to the company, critiquing their marketing materials in public is ‘unethical’… but developing an algorithm that polices women’s sexual experience is apparently fine.
Since I’m on Covid-19 furlough and therefore not allowed to do client work, I thought I’d entertain myself by testing out the theory behind the app, doing a bit of data protection analysis while indulging in the many amusing opportunities for bawdry it presents.
CAVEAT: all I know about this app is what I’ve linked here. I haven’t reviewed any designs or spoken to the company behind it. This blog post is not legal advice or a professional consultation.
So, who’s the Data Controller?
The company developing this ‘solution’ is Relida Ltd, so right now, they’re wearing the hat. However, personal data acquired or generated through phone apps usually has a number of Controllers, from developers to white-labellers, to the advertising networks and device manufacturers, each processing the data-take for different purposes and by different means. As this isn’t a live app yet, all I can say is that Relida had better have a very long and hard think about this before they achieve release.
Fair, lawful and transparent.
I’ll sound the FAIRNESS ALARM here and there during this analysis, because fairness is far too broad a topic to condense into one paragraph.
Let’s look at lawful basis.
This is an app intended explicitly for sexual use. That makes all of the associated personal data special category personal data – whether the datapoint is heart rate, a username, an advertising analytics ID, whatever. The only applicable Article 9 lawful basis1 is consent, as even installing the app is a strong indicator that one has something of a sexual life, even if it is a solitary one. One could argue to exhaustion about whether heart-rate data is SCD in the abstract, but in this specific case, it very definitely is. Consent required. Lots of it. Informed, unambiguous, specific, freely-given, evidenced, easily-withdrawable consent.
The app is marketed at people who want to know if the woman they are having sex with is having orgasms; therefore, the subject of the heart-rate monitoring data is not necessarily the person who installs the app – it’s the woman being monitored. This means that consent can’t be reliably evidenced, no matter how many tickboxes or popups are added (FAIRNESS ALARM). Monkeying about with consent is dodgy enough when it’s about data; when sex is involved, there should be no wiggle room whatsoever. That’s a huge ethical/UX design challenge, and judging by the state of this company’s PowerPoint skills, not one they are remotely competent to attempt.
As there’s no actual commercial product here yet, transparency isn’t something that can be assessed – however, at the very least, any use of the data for marketing, profiling, customer research, or other secondary purposes must be explained in detail and based on explicit, unbundled consents. I’ve yet to see any app or web developer achieve this, ever, so if Relida manage it I will be astounded.
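For the avoidance of doubt, here’s roughly what ‘unbundled’ could look like as a data model. This is a hypothetical sketch – every purpose and field name is my invention, not Relida’s: one consent record per purpose, nothing granted by default, everything independently withdrawable.

```python
# Hypothetical sketch of unbundled consent records. Nothing here is
# from Relida; purposes and field names are invented for illustration.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ConsentRecord:
    purpose: str                          # one purpose per record (no bundling)
    granted_at: datetime | None = None    # opt-in: nothing is granted by default
    withdrawn_at: datetime | None = None

    @property
    def active(self) -> bool:
        return self.granted_at is not None and self.withdrawn_at is None


# Each secondary purpose gets its own record, all defaulting to 'no'.
consents = {p: ConsentRecord(p) for p in
            ("core_function", "marketing", "profiling", "customer_research")}

consents["marketing"].granted_at = datetime.now()    # the user opts in...
consents["marketing"].withdrawn_at = datetime.now()  # ...and changes their mind: one step
```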
Speaking of which, if this app works the same way almost every other app in existence does, users will be inviting a gaggle of spectators to their happy fun time, and allowing them to gawk at someone’s inner workings. Now, if you’re into digital dogging as a form of exhibitionism, then you and your partner(s) might be okay with that – fine, it’s your choice as consenting adults. But if you’d prefer not to share your earth movements with a bunch of randos, then the app needs to come clean about who is getting an eyeful.
Data models and management.
Is there really a specific heartbeat pattern that is unique and exclusive to orgasm but universal to women? Well, I guess it’s possible, but I want to see repeatable peer-reviewed studies done in lab conditions with a vastly diverse subject set2, and until that happens I’m going to be very, very sceptical. In the absence of any evidence supporting this theory (citations, references to studies, interrogable statistics), the entire premise of the app rests upon the unverified claims of the vendors. Now, I don’t know about you, but in my experience, vendors are unlikely to raise their skirts to allow exposure of their proprietary algorithms, so their claims are probably unverifiable anyway.
Assuming that there is a special heartbeat rapping out the body-code equivalent of “YEEEEAAAAHHHHHHH”, then the only processing of personal data that is required is a comparison of input from the sensor to stored values for O, then output of a result. No names required. No geolocation. No contacts list or content of phone calls and messages. Just tracking data (presumably from a Watch or some such) and a comparison algorithm, which could very likely be run locally without requiring the device to squirt any data elsewhere.
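To make the point concrete, here’s a minimal sketch of that local-only architecture. Everything in it is hypothetical – the reference pattern, the correlation-based matching, the threshold – since the vendor’s actual algorithm is unpublished; the point is simply that samples go in, one boolean comes out, and no data needs to leave the device.

```python
# Hypothetical on-device check: sensor samples in, one boolean out.
# The 'orgasm signature' and matching method are invented for this
# sketch; Relida's actual algorithm is unpublished.
from statistics import correlation  # Python 3.10+

# Invented reference pattern: beats-per-minute over a fixed window.
REFERENCE_PATTERN = [72.0, 80.0, 95.0, 118.0, 126.0, 110.0, 88.0, 75.0]
MATCH_THRESHOLD = 0.9  # arbitrary cut-off for illustration


def matches_reference(samples: list[float]) -> bool:
    """Compare one window of heart-rate readings to the stored pattern.

    Runs entirely locally: no names, no geolocation, no network calls.
    """
    if len(samples) != len(REFERENCE_PATTERN):
        return False
    return correlation(samples, REFERENCE_PATTERN) >= MATCH_THRESHOLD


window = [70.0, 81.0, 96.0, 120.0, 125.0, 108.0, 90.0, 74.0]  # made-up readings
print(matches_reference(window))  # True or False, and that's the entire output
```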
Accuracy is a challenge here. If you thought it was difficult to ask a simple question of a human being, consider how much more complex it would be to make a judgement on a single individual’s unique physical state at a particular moment based on some total stranger’s unverified quantification of what they think an orgasm should look like. I mean, come on. There are so many problems with the data quality of the ‘fitness monitors’ currently on the market that they cannot be relied on for medical purposes, let alone for the subtleties of an extremely complex and variable state. Add to that the implications of false negatives (especially in the type of abusive relationship where the imposition of this technology is likely to be common and involuntary), and FAIRNESS ALARM: you’ve just ruined your shot at the first principle again.
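To put some numbers on the false-negative problem – entirely invented numbers, because the vendor has published none – even a detector that’s right nine times out of ten leaves a pile of real orgasms branded as fake:

```python
# Back-of-the-envelope arithmetic with invented figures; Relida has
# published no sensitivity or specificity numbers at all.
sensitivity = 0.90   # hypothetical: the detector flags 90% of real orgasms
sessions = 1_000     # hypothetical: sessions in which an orgasm actually occurred

false_negatives = sessions * (1 - sensitivity)
print(f"{false_negatives:.0f} real orgasms dismissed as 'fake' per {sessions} sessions")
# => 100: each one an algorithm contradicting a woman's own account.
```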
Especially since there’s only one way to determine whether your sexual partner has arrived at the fireworks demonstration, and that’s to take their word for it that they did. If you don’t believe they will tell you the truth, then WTF are you doing having sex with someone you don’t trust? And ask yourself why they might feel that they can’t be honest with you? This is the true minimum-dataset approach to achieving the purpose of improving or enhancing your mutual encounters. It’s also the easiest, the most accessible and the most ethical approach. No technology required! USE YOUR WORDS, PEOPLE.
It’s proving difficult to get there… and by ‘there’, I mean any circumstance in which this product, as marketed, could be useful, let alone lawful.
As there is a better, less-intrusive, more accurate way to achieve the purpose that doesn’t involve processing any personal data at all, the app itself is redundant. Like all those ‘consent apps’, it seeks to address a perceived challenge with a technological solution but fails to identify the actual root cause of the problem: people doing sex wrong3. In this example, the ‘problem’ is framed as ‘women lie or aren’t competent to recognise the Big Bang when it happens’ when actually, the problem is far more likely to be ‘you’re crap at sex’.4
From a data protection point of view, if the primary purpose of the app were in fact to invalidate women’s lived experiences, foster mistrust of women’s sexual self-assertion and propagate the destructive notion that sex is mostly about gratifying men’s egos – well, you could build a data model that serves the app’s purpose, but – FAIRNESS ALARM: self-explanatory.
In case it isn’t clear: one does not enhance the quality of one’s intimate encounters by requiring KPIs and mandatory reporting from one’s partner.
If I were doing the DPIA, I wouldn’t even get as far as security assessment before making a very firm DO NOT DO THIS THING recommendation. However, as such recommendations are rarely – if ever – heeded, I would certainly want to see data maximally-minimised for every processing operation; encrypted in transit and at rest (with robust key management); de-linked and airgapped from identifiers; the app password-protected; the data stored only on the device, with as much processing as possible done locally; robust assurance for all third-party arrangements; a programme of monitoring for internal misuse; and a delete-on-demand option for the user that covers all data inputs and outputs, no matter where they are located. Most of this is the bare minimum that the GDPR requires anyway, but I will buy a winning5 Lottery ticket for anyone who can show me an app architecture that actually implements the Principles effectively.
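For illustration, here’s a rough sketch of just two of those controls – encryption at rest, and a delete-on-demand that takes the key with it. The library choice (Python’s cryptography package) and file layout are my assumptions, not anything Relida has specified; a real app would use the platform keystore rather than a key file.

```python
# Hypothetical sketch: local encrypted storage plus delete-on-demand.
# File names and layout are invented; a production app would keep the
# key in the platform keystore, not on disk next to the data.
import os
from cryptography.fernet import Fernet

DATA_FILE = "readings.enc"  # local store only; never synced anywhere
KEY_FILE = "device.key"


def save_reading(samples: list[float]) -> None:
    """Encrypt one window of sensor readings and append it to local storage."""
    if not os.path.exists(KEY_FILE):
        with open(KEY_FILE, "wb") as f:
            f.write(Fernet.generate_key())
    with open(KEY_FILE, "rb") as f:
        key = f.read()
    token = Fernet(key).encrypt(",".join(map(str, samples)).encode())
    with open(DATA_FILE, "ab") as f:
        f.write(token + b"\n")


def delete_everything() -> None:
    """Delete-on-demand: remove inputs, outputs and the key itself."""
    for path in (DATA_FILE, KEY_FILE):
        if os.path.exists(path):
            os.remove(path)
```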
Accountability can’t be analysed when a product doesn’t actually exist, but based on the company’s marketing materials, and their missing-the-point-completely response to the public reaction telling them that their idea is a) pointless, b) creepy, c) borderline-abusive, I’m not sanguine. The Cyprian Data Protection Authority does seem to have some teeth6, although they appear to be reserved for nibbling at the outer lobes of big Controllers.
Speaking of DPIAs, this is a set of processing operations that definitely needs one7. Data about sexual activity in (they hope) large amounts from devices linked to unique individuals. Potential for discrimination. Systematic monitoring. Yep, this one’s in the ‘high risk of harm to the rights and freedoms of individuals’ category.
And then there are the macro effects, the consequences affecting us in groups on a large scale. Harmful ideas being quantified and turned into targets. Like: ‘sexual enjoyment has no merit outside this very specific set of physiological processes’ (bzzt – incorrect), or ‘treating women like autonomous human beings is an optional upgrade to the heterosexual experience’ (bzzzt again – it’s the minimum threshold for participation).
Now, if someone were to pitch a truly science-based, accessible, gender-agnostic, data-protection-by-design-and-default, privacy-friendly (this means no ad trackers, no social media SDK use, no 3rd-party dataflows, great security, complete control over the data and total transparency) app for recognising one’s own stages of arousal, for therapeutic and entertainment purposes, then I’d be squealing with joy.
Data protection verdict: FAIL – frustrating, disappointing, unnecessary, possibly dangerous. You’re better off with inclusive & supportive sex education and frank conversations – none of which necessitate an app.
Background reading on the female orgasm
- the GDPR applies in Cyprus, where the company is based [↩]
- I’d volunteer – sounds like a fun way to contribute to medical science [↩]
- by ‘wrong’, I mean treating it as anything other than a continually negotiated journey of mutual enjoyment between consenting adults, for which the parameters are agreed upon and respected by all parties. Examples of ‘wrong’ include: treating sex as a conquest, a competition, imposing physical conformity, coercive actions and behaviour, assuming entitlement to anything outside one’s own skin [↩]
- There are apps for helping you work on that though, you’ll be pleased to hear – although none of them in any combination are an adequate substitute for honest communication and mutual respect between participants. [↩]
- subject to the mathematical probabilities of the game, this is not a guarantee, terms and conditions apply [↩]
- I want to make a vagina dentata joke, I really do [↩]
- Article 35 says so [↩]