Privacy Disasters: A Depressingly Regular Series
A friend shared the Calmara app with me on Bluesky, and now I am raging internally.
Heads up: In less than 10 months of occasional posting, I am now at over 100 subscribers on priva.cat and over 1,400 subscribers on LinkedIn. If you like my snark, analysis, and insights, and aren’t already a subscriber, please consider subscribing, or sharing this with a friend.
So there’s lots of terrible tech out there, and loads of it relies on AI. I simply don’t have the spoons to process, much less respond to, all of it. Still, sometimes a shitty idea is so very extra bad that I feel morally obligated, as a public servant of privacy, to share why the inventors are not only staggeringly wrong, but why the product they have released is toxic to humanity. And today, ladies, and particularly gentlemen, I’m going to talk about the sheer unmitigated disaster that is Calmara.
CW: There will be terrible dick-related jokes throughout, and a cat.
According to its website, Calmara is an app that uses AI to detect whether your soon-to-be-lover is laden with a sexually transmitted infection or not. It’s basically hot dog/not hot dog, but for penises and STIs. It goes like this (with a rough code sketch after the list):
1. You (or the person you’re planning to have sex with) download the app.
2. You take a picture of your penis (or their penis, if you get their explicit but totally unverified consent) and upload it in the app.
3. Some pre-processing occurs (to control for lighting and other confounding issues).1
4. That image gets sent to … AWS somewhere, and an AI model assesses whether the image is full of STIs or not.
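For the technically curious, here’s roughly what that flow amounts to in code. This is a minimal sketch and everything specific in it is my assumption: the endpoint URL, field names, and preprocessing steps are hypothetical stand-ins, because Calmara documents none of this (which is rather the point).

```python
# Illustrative sketch only -- not Calmara's actual API or code.
# The endpoint, field names, and preprocessing are hypothetical.
import requests
from PIL import Image, ImageOps

def preprocess(path: str) -> str:
    """Step 3: naive 'control for lighting' via autocontrast, plus a resize."""
    img = Image.open(path).convert("RGB")
    img = ImageOps.autocontrast(img)
    img = img.resize((224, 224))
    out = "normalized.jpg"
    img.save(out)
    return out

def scan(path: str) -> dict:
    """Step 4: ship your most intimate data to a server you know nothing about."""
    with open(preprocess(path), "rb") as f:
        resp = requests.post(
            "https://api.example.com/v1/scan",  # hypothetical endpoint
            files={"image": f},
        )
    resp.raise_for_status()
    return resp.json()  # e.g. {"verdict": "clear"} -- with zero transparency
```

Notice what a flow like this has no room for: consent verification, age checks, retention limits, or any guarantee about what happens once the image leaves your phone.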
The site claims a 90% accuracy rate, with no cited sources, and is full of cutesy language that tries to make it seem cool and secure. However, there’s no actual data on what’s going on under the hood. Oh, and it’s available in the EU (and everywhere else on earth, apparently) because absolutely nobody sat these lads down and told them that maybe this might run into legal/privacy/ethical/cultural/pornography/CSAM issues. After all, who needs a buzz-kill lawyer when you’ve got AI, amirite?!?!
The Covfefe Calmara Privacy Policy (ugh) and Terms and Conditions of Use indicate that they clearly never bothered talking to anyone familiar with HIPAA, or with data protection laws generally. By the looks of it, they avoided interacting with a lawyer entirely. Their terms of use are littered with the kind of language that laypeople think will cover their ass but usually doesn’t. For example: the always-fun ‘The Service is intended for informational purposes and is not a substitute for professional medical advice’, an obligation on users to obtain explicit consent, and a full release of liability if the user fails to do so: ‘HeHealth Inc. will not be liable for any claims, damages, or legal actions resulting from the submission of images without proper consent, including but not limited to privacy infringements or violations of any law.’
To test this, I downloaded the app, and … yeah, this is not how any of this works.
It’s the legal equivalent of magic pixie dust, and if I were the California Attorney General, I’d be hopping on this D(isaster) like a porn star.2
The Calmerror Calmara website also claims that there is an age verification process; despite my efforts to trigger it with cat pictures, I never saw one.
Also, their TOS and FAQs include a general disclaimer that absolves them of liability if an underage person uploads an image. Guys, in many jurisdictions, there is no get-out-of-jail-free card for child porn, and an underage person’s peen is probably gonna get you into trouble, no matter what the TOS says.
Their privacy policy is equally bad. It doesn’t include any details on how images are used (beyond to deliver services and to ‘enhance and innovate our service offerings’), their legal basis for processing data beyond the image itself, how long images are kept, how these very intimate images are secured, or how data is shared and with whom. The sum total of their sharing statement is:
To provide you with seamless service, we share your information with service providers and partners who assist in service operation, including data hosting, analytics, marketing, payment processing, and security. These collaborations are vital for a seamless and secure service experience.
Given that the only personal data/information they admit to collecting is images, cookies, and ‘user behaviors’, and that their statement is extremely vague about what they do with this data, it seems entirely reasonable to assume that they are mass-blasting dick pics and everything else to all parties involved. Congrats, guys, your dicks might end up on marketing collateral!
Also, their FAQ claims that because they use AWS to store the images and do the AI magicks, that makes their app HIPAA compliant. That’s not how HIPAA works: AWS merely offers HIPAA-eligible services, and compliance still depends on what the app itself does, including, at minimum, a signed business associate agreement and actual safeguards on their end.
A Momentary Interlude While I Drink and Check Out Their IP
The company behind Calimari Calmara trades under the name HeHealth Inc.3 HeHealth, which launched in 2022, offers a flashier version of the same technology, called HeHealth.ai, which is basically a paid version of the app with a bonus doctor’s review. No idea which of these dudes is actually reviewing the images, or whether any of them are actually licensed/certified in proctology, but hey, somehow that’s slightly better than the free Calzone Calmara version, which is doing medical diagnosis entirely by AI. According to the HeHealth app, 31,129 people have allegedly scanned their weens and uploaded them to an AWS server somewhere.
HeHealth.ai is based on patented technology, which can allegedly detect 10 different STIs according to the website (though the patent and pre-print validation study only seem to show detection for genital warts, herpes eruptions, cancer, candidiasis, and syphilis). It uses a trained AI model augmented with a small number of images of actual healthy/unhealthy penises (<240), and potentially synthetic images, if I’m reading the patent details correctly.
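To make the ‘<240 images’ point concrete: that setup sounds like garden-variety transfer learning, i.e., take a model pretrained on millions of generic images, fine-tune it on a tiny labeled set, and stretch the data with augmentation (and possibly synthetic images). Here’s a minimal sketch of that general technique in PyTorch, under my assumptions about what the patent describes, and emphatically not HeHealth’s actual code or data:

```python
# Illustrative transfer-learning sketch -- not HeHealth's code or data.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Heavy augmentation is the usual crutch when you have <240 real images.
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.3, contrast=0.3),
    transforms.ToTensor(),
])

# Hypothetical folder layout: data/train/{healthy,unhealthy}/*.jpg
train_ds = datasets.ImageFolder("data/train", transform=train_tf)
loader = torch.utils.data.DataLoader(train_ds, batch_size=16, shuffle=True)

# Start from an ImageNet-pretrained model; swap the final layer for a
# binary healthy/unhealthy head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

The catch is that with so little real data, a 90% accuracy figure measured on a held-out slice of the same tiny set tells you almost nothing about performance in the wild (different skin tones, lighting, and phone cameras), which is exactly why unsourced accuracy claims deserve side-eye.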
I’ll admit, I’m happy that there’s science behind it. But that doesn’t stop this whole affair from being a massive privacy disaster.
Lads: Just Go See a Doctor
Guys. Fellas. Gents. I get that sometimes the urge to get down may overwhelm better judgment. That privacy and data protection might not be top of mind when you’re getting ready to get dirty. Still, don’t let your little head get ahead of your bigger one on this. If you’re worried, or if things don’t look right down there, just go see a doctor.
Right now, based on everything provided on the Calmara and HeHealth websites, I’m unconvinced that they’re thinking of your interests. From what’s presented, this app looks rushed out the door in hopes of cashing in on the AI hype. While it may be well-meaning, it seems to lack any consideration for the kinds of things I would expect to see from a legitimately compliant mobile health app, which is what it should be, because this app is offering you medical advice. Specifics and details are light. Evidence of testing seems limited, and it took me real effort to find any. Everything is far too cutesy, and really, do you want cutesy when you’re uploading images of your dick to god-knows-where?
Hell, I’m not even convinced that the AI can do much more than detect if your peen is not exactly right (whatever that means). For (literal) fuck’s sake: just go see a doctor.
1. This is at least the takeaway I got by reviewing the associated patent: US 11,721,023 B1.
2. I am not the California AG though, and in the interests of avoiding a defamation lawsuit, will note that I’m not a regulator of any sort. I’m at best a snarky privacy blogger.
3. According to Pitchbook, HeHealth is a VC-backed startup that has secured at least $1.5M in funding from two VC firms I’ve never heard of before, and from the incubator arm of Singapore Management University’s Institute of Innovation & Entrepreneurship.
PS: For the curious, I enjoyed the Otterbank Brewing & Blending Gimp Mask 2022. It felt appropriate given the context.