I have news on the Calmara saga -- it turns out, the FAFO approach ended in bad times for this lot. I share my thoughts, the FTC's Closing Letter, and why VCs and funders should really build data protection and legal considerations into their due diligence. I posted this as a cross-post a few weeks ago, but it never seems to have gone out. Apologies if you are seeing this as a duplicate.
Part I: The Fuck Around Part
Some readers may recall that in March, the folks over at HeHealth launched a new free app that would scan a person's dick for visible signs of certain sexually transmitted infections (STIs). The Calmara app was marketed to potential paramours as a safety gauge pre-hookup. The website had lots of cutesy text, emojis, and promises, but it was deeply problematic on the privacy front.
It received a lot of deservedly bad press, and was frequently referred to as the 'Is your dick sick, send us a pic!' app (H/T Eva Galperin). When I discovered it, I wrote a few rather scathing posts on why it was a privacy disaster. I also swore a lot.
At first, the founders responded primarily by deflecting criticism and painting critics as innovation-hating meanies. This led me to write a follow-up piece (Founders, Please Remember You're Not Experts at Everything), where I swore a bit less and walked through the discussion we could have if they'd come to me pre-launch for data protection advice. Namely:
what personal data actually is, and why it's much broader than protected health information (PHI) (definitions matter!)
what 'meaningful consent' actually means under the law (and how to achieve it)
how to limit access by children (and what to do when it happens)
obligations around data subject rights (and how to make them happen)
the distinction between anonymous and pseudonymous data (yes, they are different, and no, you can't collect anonymous 'identifiers')
the pitfalls of overzealous data collection & unrestricted storage (it will bite you)
how to actually be 'transparent' (and why it's more than just cutesy words and emojis)
the benefits of hardened security controls and being honest about what you are doing (beyond just using AWS and saying that security is ‘at the top of our quest log’)
internationalization & accessibility (you can't be transparent if people don't understand what you're saying)
third-party risk (who else you share things with matters)
the value of adopting privacy and security by design and the utility of doing a data protection impact or risk assessment and an AI audit.
Part II: They Start to See the Error of Their Ways?
In May, I noticed that a half-dozen HeHealth folks began stalking my LinkedIn profile after the COO implied there that I was trying to extort money from them because <checks notes> I didn’t want to provide substantive data protection advice for free. That made me, according to him, an ‘ugly human’.
No good deed goes unpunished, I guess.
At that point, realizing I hadn’t checked on the Calmara website in some time, I decided to take a gander and was pleased to note that … they actually did fix a few things I called out. For example:
They stopped storing images entirely. Once the AI determined whether dick = sick/not sick, the uploaded image was deleted.
They switched to a paid model. Users would need to create an account (and pay a pretty hefty fee), which meant the threat of kids uploading genitalia went down.
They encouraged users to use an email privacy service (like SimpleLogin/Protonmail, Addy.io, etc.) to generate the account email. I really like this idea and would love to see more companies adopt this practice.
The privacy notice was updated to acknowledge reality -- that images are personal data, even if they're not PHI.
As I noted, there was still room to improve, but this at least was a step in the right direction.
Part III: The Finding Out Bit
Unfortunately, they didn’t improve enough. One aspect I’d overlooked was the fact that while the website and app claimed not to be providing healthcare or health services, they were still making ‘health-related claims’: namely that their app could detect 10 different conditions, including syphilis, herpes, and HPV with 94.4% accuracy, and that the app provided ‘clear, science-backed answers’ about sexual health.1 The science-backed evidence Calmara/HeHealth cited was a paper by Lao-Tzu Allan-Blitz, et al., The Development and Performance of a Machine-Learning Based Mobile Platform for Visually Determining the Etiology of Penile Pathology (2024).
All of this caught the attention of the Federal Trade Commission, which on June 10, 2024 issued a civil investigative demand to HeHealth, seeking information on the ‘company’s substantiation for Calmara’s accuracy claims’ and:
information about the company’s privacy practices, given the sensitivity of the images and information provided by consumers and Calmara’s advertising claims about maintaining anonymity with respect to the subjects of the images. In addition, on June 11, 2024, staff directed a Notice of Penalty Offenses (“NPO”) Concerning Substantiation of Product Claims to HeHealth, putting the company on notice that it is an unlawful act or practice to, among other things, “make claims relating to the health benefits or safety features of a product without possessing and relying upon competent and reliable scientific evidence that has been conducted and evaluated in an objective manner by qualified persons and that is generally accepted in the profession to yield accurate and reliable results, to substantiate that the claim is true.”
The FTC challenged the efficacy of the study, noting that (1) 4 of the 5 researchers were employed by or paid as consultants of HeHealth2, (2) the study only assessed a small number of images of individuals who were never diagnostically assessed for STIs, and (3) the model itself was only trained to detect 4 STIs, not the claimed 10.
Whoopsie!
Things moved rather quickly after that. As noted above, on June 11 the FTC issued a Notice of Penalty Offenses, putting the company on notice that it is an unlawful act or practice to, among other things, rely on a questionable study to make ‘health-related claims’. No mention of the major privacy & data protection issues, but I’m sure those were also considered. That led to an agreement with the FTC by the CEO and COO (Mei-Ling Lu and Yudara Kularathne MD, FAMS(EM)) in early July to terminate the Calmara app, delete all stored personal data, including payment information, and refund customers by July 15.
I have so many thoughts on this.3 Personally, I’m pleased with the outcome. It’s good to see a regulator actually step up and take charge early on, rather than waiting months or years to try to rein in a bad actor after a data breach, egregious processing, or harm to individuals has already occurred. I personally wish the FTC had done the same against PimEyes and Clearview AI early on. We’d be in a better place.
I say this not because I’m a science-hating ‘ugly human’ (as the COO characterized me), but because the FAFO mindset is toxic and it needs to stop. This isn’t 2012. We are no longer in a ‘move fast, break shit’ world, and founders need to come to grips with that fact. Actual, substantive privacy & data protection laws exist and, as we see here, are starting to be enforced. VCs and angel investors should also be evaluating this as part of their due diligence processes. Privacy (and other legal) considerations should be as important as ‘will this thing make money?’ and ‘what’s the market fit?’
Technology can be amazing, but part of how we get there is by treating privacy and data protection seriously, thinking about risks & outcomes, and dare I say it — trying to improve when we get things wrong. The founders at HeHealth disregarded these concerns. They assumed that their righteous crusade was bigger and more important than following the law. They were positively allergic to reasonable criticism. That’s not a good look, and hopefully other founders learn from their mistakes.
Factual statements in this section, including references to the study, the authors, total # of STIs & provision of health-related claims are made based on the FTC’s HeHealth Closing Letter, issued July 11, 2024: https://www.ftc.gov/system/files/ftc_gov/pdf/hehealth-closing-letter.pdf.
The paper published in the Mayo Clinic Proceedings, Digital Health journal lists only four authors: Lao-Tzu Allan-Blitz, Sithira Ambepitiya, Raghavendra Tirupathi, and Jeffrey D. Klausner. However, an earlier pre-print on arXiv also includes the COO, Yudara Kularathne, as an author. https://arxiv.org/abs/2403.08417. Only Drs. Allan-Blitz, Klausner & Ambepitiya disclose their affiliation with HeHealth.
Just in case the founders get grumpy and want to sue for libel/slander, I’m going to preface that this section is entirely my opinion, based on my years of experience in this space. The cautionary tale I describe is not unique to HeHealth or its founders, but the behavior is worth calling out as a reminder to others. We can all do better.