
Miss Info Geek Posts


What the GDPR does – and doesn’t – say about consent

Meme courtesy of Jenny Lynn (@JennyL_RM)

You may have noticed that the General Data Protection Regulation is rather in the news lately, and quite right too, considering there is only a year left to prepare for the most stringent and wide-reaching privacy law the EU has yet seen. Unfortunately, however, in the rush to jump onto the latest marketing bandwagon, a flood of misleading and inaccurate information posing as “advice” in order to promote products and services is drowning out more measured and expert commentary. Having seen a worrying number of articles, advertisements, blog posts and comments all giving the same wrong message about the GDPR’s “consent” requirements, I felt compelled to provide a layperson’s explanation of what the GDPR really says on the subject.


Come again?

Last week, sex-tech geek Stu Nugent posted a thread critiquing a product pitch from a startup. Go and read it, it’s both hilarious and depressing.

(TL;DR, a woman’s orgasm is undetectable and probably fake, unless she has been fitted with monitoring gear and an algorithm for verification. Buy our cum-checksum app. Quite an innovative self-own there, lads).

The upstarts then compounded their error by trying to mansplain women’s orgasms back to them, only with SCIENCE this time, rather than misogynist-aimed marketing copy. According to the company, critiquing their marketing materials in public is ‘unethical’… but developing an algorithm that polices women’s sexual experience is apparently fine.

Since I’m on Covid-19 furlough and therefore not allowed to do client work, I thought I’d entertain myself by testing out the theory behind the app and doing a bit of data protection analysis, while indulging in the many amusing opportunities for bawdry it presents.

CAVEAT: all I know about this app is what I’ve linked here. I haven’t reviewed any designs or spoken to the company behind it. This blog post is not legal advice or a professional consultation.

So, who’s the Data Controller?

The company developing this ‘solution’ is Relida Ltd, so right now, they’re wearing the hat. However, personal data acquired or generated through phone apps usually has a number of Controllers, from developers to white-labellers, to the advertising networks and device manufacturers, each processing the data-take for different purposes and by different means. As this isn’t a live app yet, all I can say is that Relida had better have a very long and hard think about this before they achieve release.

Fair, lawful and transparent.

I’ll sound the FAIRNESS ALARM here and there during this analysis, because that’s much too widely-spread a topic to condense into one paragraph.

Let’s look at lawful basis.

This is an app intended explicitly for sexual use. That makes all of the associated personal data special category personal data – whether the datapoint is heart rate, a username, an advertising analytics ID, whatever. The only applicable Article 9 lawful basis is consent, as even installing the app is a strong indicator that one has something of a sexual life, even if it is a solitary one. One could argue to exhaustion about whether heart-rate data is SCD in the abstract, but in this specific case, it very definitely is. Consent required. Lots of it. Informed, unambiguous, specific, freely-given, evidenced, easily-withdrawable consent.

The app is marketed at people who want to know if the woman they are having sex with is having orgasms, therefore the subject of the heart-rate monitoring data is not necessarily the person who installs the app – it’s the woman being monitored. This means that consent can’t be reliably evidenced, no matter how many tickboxes or popups are added (FAIRNESS ALARM). Monkeying about with consent is dodgy enough when it’s about data; when sex is involved, there should be no wiggle room whatsoever. That’s a huge ethical/UX design challenge, and judging by the state of this company’s PowerPoint skills, not one they are remotely competent to attempt.

As there’s no actual commercial product here yet, transparency isn’t something that can be assessed – however, at the very least, any use of the data for marketing, profiling, customer research, or other secondary purposes must be explained in detail and based on explicit, unbundled consents. I’ve yet to see any app or web developer achieve this, ever, so if Relida manage it I will be astounded.

Speaking of which, if this app works the same way almost every other app in existence does, users will be inviting a gaggle of spectators to their happy fun time, and allowing them to gawk at someone’s inner workings. Now, if you’re into digital dogging as a form of exhibitionism, then you and your partner(s) might be okay with that – fine, it’s your choice as consenting adults – but if you’d prefer not to share your earth movements with a bunch of randos, then the app needs to come clean about who is getting an eyeful.

Data models and management.

Is there really a specific heartbeat pattern that is unique and exclusive to orgasm but universal to women? Well, I guess it’s possible, but I want to see repeatable peer-reviewed studies done in lab conditions with a vastly diverse subject set, and until that happens I’m going to be very, very sceptical. In the absence of any evidence supporting this theory (citations, references to studies, interrogable statistics), the entire premise of the app rests upon the unverified claims of the vendors. Now, I don’t know about you, but in my experience, vendors are unlikely to raise their skirts to allow exposure of their proprietary algorithms, so their claims are probably unverifiable anyway.

Assuming that there is a special heartbeat rapping out the body code equivalent of “YEEEEAAAAHHHHHHH”, then the only processing of personal data that is required is a comparison of input from the sensor to stored values for O, then output of a result. No names required. No geolocation. No contacts list or content of phone calls and messages. Just tracking data (presumably from a Watch or some such) and a comparison algorithm, which could very likely be run locally without requiring the device to squirt any data elsewhere.
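If the premise held, the minimal data flow described above – compare a window of sensor readings against a stored reference pattern, entirely on-device, with no identifiers and no network calls – could be sketched roughly like this. All the names, and the correlation-based matching with a 0.9 threshold, are my own illustrative assumptions, not anything Relida has published:

```python
# Hypothetical sketch of purely local processing: compare a window of
# heart-rate samples against a stored reference pattern. No names, no
# geolocation, no contacts, no data leaving the device.

def matches_reference(samples: list[float], reference: list[float],
                      threshold: float = 0.9) -> bool:
    """Return True if the sample window correlates strongly with the
    stored reference pattern (normalised correlation >= threshold)."""
    if len(samples) != len(reference):
        return False
    mean_s = sum(samples) / len(samples)
    mean_r = sum(reference) / len(reference)
    num = sum((s - mean_s) * (r - mean_r)
              for s, r in zip(samples, reference))
    den_s = sum((s - mean_s) ** 2 for s in samples) ** 0.5
    den_r = sum((r - mean_r) ** 2 for r in reference) ** 0.5
    if den_s == 0 or den_r == 0:
        return False  # flat signal: no meaningful pattern to match
    return (num / (den_s * den_r)) >= threshold
```

The point of the sketch is that nothing in it needs an account, an identifier or a server – which makes any wider data-take the eventual app performs a choice, not a necessity.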

Accuracy is a challenge here. If you thought it was difficult to ask a simple question of a human being, consider how much more complex it would be to make a judgement on a single individual’s unique physical state at a particular moment based on some total stranger’s unverified quantification of what they think an orgasm should look like. I mean, come on. There are a lot of problems with the data quality of the ‘fitness monitors’ on the market at the moment, so much so that they cannot be relied on for medical purposes, let alone the subtleties of an extremely complex and variable state. Add to that the implications of false negatives (especially in the type of abusive relationship where the imposition of this technology is likely to be common and involuntary), and FAIRNESS ALARM: you’ve just ruined your shot at the first principle again.

Especially since there’s only one way to determine whether your sexual partner has arrived at the fireworks demonstration, and that’s to take their word for it that they did. If you don’t believe they will tell you the truth, then WTF are you doing having sex with someone you don’t trust? And ask yourself why they might feel that they can’t be honest with you? This is the true minimum-dataset approach to achieving the purpose of improving or enhancing your mutual encounters. It’s also the easiest, the most accessible and the most ethical approach. No technology required! USE YOUR WORDS, PEOPLE.

It’s proving difficult to get there… and by ‘there’, I mean any circumstance in which this product, as marketed, could be useful, let alone lawful.

As there is a better, less-intrusive, more accurate way to achieve the purpose that doesn’t involve processing any personal data at all, the app itself is redundant. Like all those ‘consent apps’, it seeks to address a perceived challenge with a technological solution but fails to identify the actual root cause of the problem: people doing sex wrong. In this example, the ‘problem’ is framed as ‘women lie or aren’t competent to recognise the Big Bang when it happens’ when actually, the problem is far more likely to be ‘you’re crap at sex’.

From a data protection point of view, if the primary purpose of the app were in fact to invalidate women’s lived experiences, foster mistrust of women’s sexual self-assertion and propagate the destructive notion that sex is mostly about gratifying men’s egos – well, you could build a data model that serves the app’s purpose, but – FAIRNESS ALARM: self-explanatory.

In case it isn’t clear: one does not enhance the quality of one’s intimate encounters by requiring KPIs and mandatory reporting from one’s partner.

If I were doing the DPIA, I wouldn’t even get as far as security assessment before making a very firm DO NOT DO THIS THING recommendation. However, as such recommendations are rarely – if ever – heeded, I would certainly want to see data maximally minimised for every processing operation, encrypted in transit, encrypted at rest (with robust key management), de-linked and airgapped from identifiers, the app password-protected, the data stored only on the device and as much processing as possible done locally, robust assurance for all third-party arrangements, a programme of monitoring for internal misuse, and a delete-on-demand option for the user that includes all data inputs and outputs, no matter where they are located. Most of this is the bare minimum that the GDPR requires anyway, but I will buy a winning Lottery ticket for anyone who can show me an app architecture that actually implements the Principles effectively.
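Two of those measures – de-linking records from identifiers, and delete-on-demand covering every input and output – can be sketched with nothing but the standard library. This is an illustration of the principle, not a production design (which would also need encryption at rest and real key management); the class and method names are my own invention:

```python
# Illustrative only: keyed hashing stands in for pseudonymisation, and
# deletion removes everything stored under the pseudonym.
import hashlib
import hmac
import secrets


class LocalStore:
    """On-device store keyed by a pseudonym, never a raw identifier."""

    def __init__(self) -> None:
        self._key = secrets.token_bytes(32)  # device-local secret
        self._records: dict[str, list[float]] = {}

    def _pseudonym(self, user_id: str) -> str:
        # Keyed hash: records can't be linked back to the identifier
        # without the device-local key.
        return hmac.new(self._key, user_id.encode(),
                        hashlib.sha256).hexdigest()

    def add(self, user_id: str, reading: float) -> None:
        self._records.setdefault(self._pseudonym(user_id), []).append(reading)

    def delete_all(self, user_id: str) -> None:
        # Delete-on-demand: every input and derived output goes.
        self._records.pop(self._pseudonym(user_id), None)
```

If even a toy like this is more minimisation than most shipping apps manage, that tells you something about the gap between the Principles and practice.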

Accountability can’t be analysed when a product doesn’t actually exist, but based on the company’s marketing materials, and their missing-the-point-completely response to the public reaction telling them that their idea is a) pointless, b) creepy, c) borderline-abusive, I’m not sanguine. The Cyprian Data Protection Authority does seem to have some teeth, although they appear to be reserved for nibbling at the outer lobes of big Controllers.

Speaking of DPIAs, this is a set of processing operations that definitely needs one. Data about sexual activity in (they hope) large amounts from devices linked to unique individuals. Potential for discrimination. Systematic monitoring. Yep, this one’s in the ‘high risk of harm to the rights and freedoms of individuals’ category.

And then there are the macro effects, the consequences affecting us in groups on a large scale. Harmful ideas being quantified and turned into targets. Like: ‘sexual enjoyment has no merit outside this very specific set of physiological processes’ (bzzt – incorrect), or ‘treating women like autonomous human beings is an optional upgrade to the heterosexual experience’ (bzzzt again – it’s the minimum threshold for participation).

Now, if someone were to pitch a truly science-based, accessible, gender-agnostic, data-protection-by-design-and-default, privacy-friendly (this means no ad trackers, no social media SDK use, no 3rd party dataflows, great security, complete control over the data and total transparency) app for one to use to recognise one’s own stages of arousal, for therapeutic and entertainment purposes, then I’d be squealing with joy.

Data protection verdict: FAIL – frustrating, disappointing, unnecessary, possibly dangerous. You’re better off with inclusive & supportive sex education and frank conversations – none of which necessitate an app.

Background reading on the female orgasm

Brief Encounters: an open letter to lawyers

“When all you have is a legal education, everything looks like a contract”


Dear lawyers

The rule of law and principles of justice are the foundation of civilised society. Thank you for doing your part to prevent us from sinking into Hobbesian savagery. Getting a legal education and licence to practice is clearly a long, arduous and expensive process. Well done you, for coming out the other end with a job.



“I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.”

Abraham Maslow

An open letter to the information security profession

Dear infosec people,

You do a tough job in a complex, high-stress and fast-paced environment. I admire the cleverness of your technical capabilities and respect the challenges you face.


10 Legitimate Interests Lessons for Marketers

1. Just because you’re interested, doesn’t make it legitimate.

2. You can’t use LI to avoid getting consent when you suspect the answer will be “No”

3. Whether LI can be applied depends on your own assessment of what you’re doing, why and how – which you will be expected to justify and defend.

4. LI is not ‘unclear’ or ‘ambiguous’; it requires thinking to be done and a decision to be made.

5. Publish your Legitimate Interests Assessments (LIA) if you anticipate/plan to reject objections to processing.

6. If a law says you have to get consent for a processing activity, then forget about LI. You can’t use it. Move on.

7. LI is only a valid lawful basis for processing personal data if you’re adhering to all of the principles. It’s not a loophole around compliance.

8. If your LIA is post-hoc rationalisation of something you won’t consider ceasing to do even though you suspect it’s a bit dodgy; then you wasted your time. Just make sure you have funds set aside to deal with complaints, regulatory action and reputation damage when you get found out.

9. The ICO is not responsible for your continuing professional development

10. No-one else can do your thinking for you

“We take your privacy very seriously”

…says the intrusive ‘cookie consent’ popup which requires me to navigate through various pages, puzzle out the jargon and fiddle with settings before I can access the content I actually want to read on the site.

Here’s the thing. If your website is infested with trackers, if you are passing my data on to third parties for profiling and analytics, if your privacy info gets a full Bad Privacy Notice Bingo scorecard, then you DON’T take my privacy seriously at all. You have deliberately chosen to place your commercial interests and convenience over my fundamental rights and freedoms, then place the cognitive and temporal burden on me to protect myself. That’s the opposite of taking privacy very seriously, and the fact that you’re willing to lie about that/don’t understand that is a Big Red Flag for someone like me.

10 Anger Management Tips for DP Pros

Grrrrr! Gah! Aaarrrggghhhh!

Sometimes it feels like an uphill struggle, bringing data protection good practice to the masses. Sometimes it feels like a vertical climb up a razor-wire-covered fortress turret while hostile archers fire flame-tipped arrows down at you from overhead. I confess that sometimes I am a little short on patience and tolerance (although I try hard not to let it show!) and I do spend quite a lot of my time with gritted teeth and clenched fists. I’m probably not the only one – which is why I wrote this blog post. Despite my naturally sarcastic tone, the sentiment is genuine – and hopefully contains at least one nugget of actual good advice.

Take care of yourselves, don’t be ashamed to reach out for help when things get on top of you, and remember that come the Zombie Apocalypse, your survival will not be based on how successfully you got an organisation to implement data protection!

Meme Frenzy

At some point, I’m going to try and make a privacy notice delivered through the medium of internet memes. While playing about with the possibilities of this, I got totally sidetracked and ended up data-protection-ifying a load of popular memes for my own nerdy amusement.

Here are the fruits of my misdirected labour. I think I might need to get out more.

Privacy vs Security: A pointless false dichotomy?

This is the text of a presentation I gave recently during Infosec18 week. By popular demand (i.e. more than three people asked), I’m re-posting it here for a wider audience. I also intend to record it as a downloadable audio file at some point when I have some free time (hahaha, what’s that???). I took out the specific case studies for the sake of brevity, but I will post those separately as Part 2.

Bad Privacy Notice Bingo!

Snark attack!

Having spent many, many hours reviewing privacy notices lately – both for the day job and for my own personal edification – I’m discouraged to report that most of them have a long way to go before they meet the requirements of Articles 13 and 14 of the GDPR, let alone provide an engaging and informative privacy experience for the data subject.

Because I am a nerd who cares passionately about making data protection effective and accessible, but also a sarcastic know-it-all smartarse, I created this bingo scorecard to illustrate the problems with many privacy notices (or “policies” as some degenerates call them) and splattered it across social media. Hours of fun.