Privacy Disasters: AI Spy-Wearables, and the Scourge of Competing Friendants
Wherein I break down why 'always-on' wearable tech is a blight on humanity, and share a mock DPIA covering two competing 'Friend' Pendants.
A Three-Way Torment Nexus
My body has a bad habit as of late. Despite taking melatonin, I consistently wake up at about 4:50am every morning. I’m not up-up, but I’m up enough that I can mindlessly scroll on my phone, where I catch up on all of the latest privacy disasters, discourse and outrage that occurred during the four hours of sleep I was able to manage. Yesterday, I was greeted with this on my Twitter timeline:
The video features four vignettes of Gen Zers living their best lives, wearing what looks like a slightly thicker and more glowy Apple AirTag around their necks.1 The first scene features a 20-something woman exerting herself on a solitary hike in the woods, the next a frustrated guy playing a video game with trash-talking friends, the third a woman messily eating a falafel in her nondescript apartment, and the fourth is what I can only describe as an extremely awkward date on someone’s rooftop.
And regardless of whether the characters are alone or with other actual humans, all are sporting this goofy-looking pendant, known as the ‘Friend’ (or possibly ‘Emily’?), which records their lives in real time and feeds all of it to a large language model (LLM).
The noble goal of this always-on friend, according to the device’s founder, Avi Schiffmann, in a recent Wired article, is to stave off loneliness by offering AI-based companionship. I liken it to what would happen if the AI chatbot Replika had a three-way with Microsoft’s Recall and the failed Humane Pin.
According to the Wired piece:
The Friend purely offers companionship. It’s meant to develop a personality that complements the user and is always there to gas you up, chat about a movie after watching it, or help analyze how a bad date went awry. Not only does Schiffmann want the Friend to be your friend, he wants it to be your best friend—one that is with you wherever you go, listening to everything you do, and being there for you to offer encouragement and support. He gives an example, where he says he recently was hanging out, playing some board games with friends he hadn’t seen in a while, and was glad when his AI Friend chimed in with a quip.
There’s very little information on the friend.com website about the pendant (which I will be referring to as the Friendant), beyond the fact that it connects to the user’s mobile device via Bluetooth, that it’s always listening and processing what it hears, that it only works on iOS devices for now, and that it communicates by sending chat messages, presumably in a dedicated Friend.com app. Also that it’s $99 and won’t be available until Q1 2025.
Also, in a move that feels very reminiscent of early 2000s pre-crash internet exuberance, Schiffmann told the Wired folks that he purchased the friend.com domain for a cool $1.8 million, after raising $2.5 million in VC-backed funding over two rounds.
Now, I’m not a venture capitalist or anything, but I’ve been around the block a bit. Hell, I even lived in the Valley for a few years, and if I learned anything, it’s that if you’re going to build a successful product, you really should be laser-focused on the product itself. That means prioritizing product development (programming & UX), identifying a good market fit, and offering a catchy tagline on why people should actually buy the thing you’re hawking. If you’re a seasoned founder, you might also be thinking sensibly about longer-term risks, like potential legal & regulatory issues, especially if you’re creating some sort of ‘disruptive tech’.
What you should not do is spend most of that budget on a fucking domain name.
A Privacy Notice That Isn’t
With the product details out of the way, let’s get to the good stuff — me lambasting the Friendant’s privacy notice.
To start, the privacy notice is, unsurprisingly, scant on details. The controller (MyTabAI) includes the barest information on the processing and use of personal data. What little is there only touches on information collected through the website, or through pre-orders via the Stripe link. The notice says nothing about what the Friend device itself will be collecting, how it’s shared with Claude.ai, how long data is stored on the device (or on Anthropic’s servers), what the context window is, how it somehow remembers user context, etc.
And guess what, guys. If you were wondering whether the notice mentions anything at all about sensitive data, well, rest assured, it does.
We do not process sensitive information. All personal information that you provide to us must be true, complete, and accurate, and you must notify us of any changes to such personal information.
Uh huh.
It’s clear that Schiffmann and his team were laser-focused on having a beautiful, albeit potentially legally problematic, design, registering the cleverest domain name they could, and launching it on World Friendship Day (30/07/2024). They were far less concerned with the annoying legal and data protection technicalities, or those pesky questions around autonomy, choice, consent, etc.
Funnily enough, the notice also gives away the fact that before Schiffmann started calling it the Friend(ant) in mid-May, he was referring to the device as the Tab. There’s even a YouTube video with that name, released nine months ago.2 That YouTube video, btw, is both illuminating and deeply, deeply cringe.
In it, Schiffmann describes the Friend (née Tab) as a ‘context-aware’ device that ingests the contents of the wearer’s life so that they don’t have to provide context every time. It’s unclear how exactly that happens, beyond the fact that the Friend is constantly listening (for up to 15 hours) and shunting queries to Anthropic’s Claude 3.5 large language model. Between jokey scenes where Schiffmann is reading a book on ChatGPT prompts, he volunteers that he’s spent some months developing custom prompts, presumably so that Claude can provide vaguely relevant responses to everything the device is listening to.
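To make the architecture concrete, here’s a purely speculative sketch of that loop in Python. The Anthropic Messages API call is the real, documented one; the transcription helper, the system prompt, and the running-memory list are my own inventions, since nobody outside the company actually knows how this thing is wired together.

```python
# Speculative sketch of the Friendant loop described above: always-on audio in,
# a quip out. Only the Anthropic client calls are real; everything else is a
# stand-in for details the company hasn't disclosed.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Stand-in for Schiffmann's months of custom prompt development.
SYSTEM_PROMPT = (
    "You are the wearer's best friend. You hear everything they hear. "
    "Chime in briefly, warmly, and only when you have something worth saying."
)

conversation = []  # the device's running memory of the wearer's day

def transcribe_chunk(audio_bytes: bytes) -> str:
    """Hypothetical speech-to-text step; the vendor hasn't said how this works."""
    raise NotImplementedError

def on_audio(audio_bytes: bytes) -> str:
    """Feed everything the pendant overhears to Claude and return its reply."""
    overheard = transcribe_chunk(audio_bytes)
    conversation.append({"role": "user", "content": overheard})
    reply = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=200,
        system=SYSTEM_PROMPT,
        messages=conversation,
    )
    text = reply.content[0].text
    conversation.append({"role": "assistant", "content": text})
    return text  # presumably surfaced to the wearer as a chat message
```

Note that `conversation` grows without bound here, which is exactly the ‘what’s the context window and how does it remember things’ question the privacy notice never answers.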
But Wait… We Have Competing Disasters!
Before I activate Angry DPIA mode, I need to stop and take a beat to inform everyone reading that this is actually the second AI-based wearable pendant called Friend on the market right now, because this is the cursed world we now live in.
This other Friendant is an open-source jobber created by Nik Shevchenko, a ‘Thiel Foundation Fellow’ (vomit) and CEO of BasedHardware. This Friendant is more about traditional productivity, like the Humane & Rabbit R1 wearables, not staving off loneliness like Schiffmann’s creation. Despite offering more whizz-bang AI features, including syncing with all the apps and storing and processing locally on-device, this Friend is available for pre-order for the low, low price of $70, with a Q3 2024 delivery date.3
The BasedHardware Friendant has a slightly better privacy notice, but it’s still pretty bad. Recorded audio is processed using Deepgram for transcription, and transcriptions are vectorized using Pinecone. Only the vector representations are stored, and all storage is on-device. Allegedly, ChatGPT runs on-device as well, and the thing still manages to boast a 24-hour battery life somehow.
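For what it’s worth, here’s a sketch of that claimed pipeline wired up the only way these services are documented to work: as cloud APIs. The Deepgram endpoint and the Pinecone client calls are the vendors’ published ones; the index name and the embed() helper are my own stand-ins, because the notice never explains how transcripts actually become vectors.

```python
# The Frenemy pipeline as the privacy notice describes it: audio -> Deepgram
# transcription -> vectors -> Pinecone. Every step below is a network call to a
# third-party server. embed() and the index name are hypothetical stand-ins.
import requests
from pinecone import Pinecone

DEEPGRAM_URL = "https://api.deepgram.com/v1/listen"

def transcribe(audio_bytes: bytes, deepgram_key: str) -> str:
    """Send raw audio to Deepgram's hosted speech-to-text API."""
    resp = requests.post(
        DEEPGRAM_URL,
        headers={"Authorization": f"Token {deepgram_key}",
                 "Content-Type": "audio/wav"},
        data=audio_bytes,
    )
    resp.raise_for_status()
    return resp.json()["results"]["channels"][0]["alternatives"][0]["transcript"]

def embed(text: str) -> list[float]:
    """Hypothetical embedding step; 'vectorized using Pinecone' isn't something
    Pinecone's core product does, so something has to produce the vectors."""
    raise NotImplementedError

pc = Pinecone(api_key="...")          # Pinecone's core product is a hosted DB
index = pc.Index("frenemy-memories")  # hypothetical index name

def remember(chunk_id: str, audio_bytes: bytes, deepgram_key: str) -> None:
    """Store only the vector, per the 'only vector representations' claim."""
    transcript = transcribe(audio_bytes, deepgram_key)  # network call no. 1
    vector = embed(transcript)                          # network call no. 2, probably
    index.upsert(vectors=[(chunk_id, vector)])          # network call no. 3
```

Every step in that sketch is a round trip to somebody else’s server, which is worth keeping in mind as you read on.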
Truthfully, this sounds like privacy-washing bullshit. Especially if you understand how these services work and how computationally expensive all of this is. Oh, and there’s also this contradiction in terms:
None of the data is transmitted to our servers or any third-party servers, except for the services mentioned (Deepgram, Chat GPT, Pinecone), which also operate locally.
I mean, look at this thing. It’s the size of an anal suppository, and you mean to tell me that it’s got all of that complex, GPU- and storage-hungry stuff running all the time, that the battery life is better than my phone’s, and that somehow it doesn’t overheat to the point of catching fire?
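And we can sanity-check the battery claim with some quick arithmetic. Every number below is a generous assumption on my part, not a measured spec; the orders of magnitude are the point.

```python
# Back-of-envelope check on the 24-hour battery claim. All figures are my own
# rough assumptions, not measured specs.
battery_mah = 250        # generous capacity for a pendant-sized cell
battery_volts = 3.7      # typical Li-ion nominal voltage
battery_wh = battery_mah / 1000 * battery_volts  # ~0.9 Wh

inference_watts = 1.5    # plausible draw for continuous on-device ASR + LLM
ble_mic_watts = 0.01     # a BLE microphone doing no local AI at all

print(f"Runtime with local inference: {battery_wh / inference_watts:.1f} h")
print(f"Runtime as a Bluetooth mic:   {battery_wh / ble_mic_watts:.0f} h")
# => roughly 0.6 h versus ~90 h. A 24-hour battery life is consistent with a
#    microphone shipping audio elsewhere, not with on-device inference.
```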
Anyway, I haven’t looked at the code yet, because I’m tired. Maybe it does do all of that, IDK — here’s the GitHub link if you’re curious. To his credit, at least Shevchenko open-sourced his work so it can be assessed more easily.
Finally, I would be remiss if I didn’t mention that Shevchenko, the founder and CEO of a real-life company hoping to secure actual VC money, was mad enough about Schiffmann poaching the Friend name that he posted a diss track on Twitter and challenged Schiffmann to a fight. In light of this, I would suggest that Shevchenko rename his product Frenemy. But this is not legal advice.
If Only They Hired Me
Now then, on with the fun that you’re all waiting for. Namely, what these lads would have considered if they’d been smart enough to hire me as a consultant before releasing these things into the world. Hell, here are the kinds of questions any spy-wearable maker should probably look into, at the very least. Again, not legal advice — it’s just basic common sense.
For the sake of my sanity, I’m going to refer to Schiffmann’s wearable as the Friendant, and Shevchenko’s wearable as the Frenemy, because I need some way to tell these two cursed things apart.
First, a quick reminder. Under Article 35 of the General Data Protection Regulation (GDPR), a controller must perform a data protection impact assessment when certain “high risk” processing activities occur. The Irish DPC helpfully provides a list of high-risk processing activities. For the purposes of these wearables, a DPIA is a good idea because it seems likely that both trigger at least one of these conditions:
use of innovative new technological or organizational solutions;
systematic monitoring of publicly accessible areas on a large scale (since users will be wearing them everywhere);
processing of sensitive data of a highly personal nature (if you’re gonna treat it as your bestie and wear it around constantly, it’s going to pick up all sorts of juicy details about you and anyone you interact with);
collection of personal data about vulnerable individuals (this one depends heavily on context: it’s less of a problem if the founders stick to targeting Bay Area Gen Zers, much more of a concern if the device targets grandparents with Alzheimer’s or kids).
Now that I’ve got why a DPIA is important out of the way, let’s look at some of the questions and risks that I’d want answers to.
Transparency: There is absolutely no transparency here in the privacy notices. While the Frenemy at least goes into some detail, it’s still all very vague, and I doubt either of the privacy/transparency statements meets the letter and spirit of most US state privacy notice laws, much less the rigor of the GDPR. How are users being informed of the processing that’s happening here? Where is the data being stored? How is it being shared with Anthropic / OpenAI? What are you guys doing with all that data? Is it going overseas? Is it being secured? What’s the lawful basis?
Consent of users/third parties: Currently, only the wearer consents. Because these wearables are always on, the only way I, as an innocent bystander, can object or opt out of being recorded by someone wearing these pendants is by asking them to turn it off and hoping they do, walking away, or ripping the damn thing off their neck and chucking it into oncoming traffic. I can see some regulators getting mighty miffed about that. And that presumes I know that these pendants are always-recording spy-wear, not just weird jewelry.
And how does withdrawal of consent even work here? Say I’m fine with this at first, but I decide later I don’t want your Friendant to know about all the deep thoughts I shared with you. How does the owner of the Friendant/Frenemy delete that information? Can users purge information stored on Anthropic/OpenAI servers?

Proportionality: Both wearables monitor and record every interaction, every intimate conversation, and all those deep thoughts shared out loud. As I’ve mentioned before, most of us are still not cool with having our lives constantly recorded. Notwithstanding the ubiquity of CCTV and facial recognition in public spaces, I think many people would still freak out if they knew that their casual conversations with someone were being recorded and shared with Anthropic or OpenAI, these guys, and god-knows-who-else, all for a little user convenience. Just look at how we collectively reacted to Google Glass. Or how quickly Recall was withdrawn by Microsoft.
Remember, it’s not just recording the wearer’s voice and personal information, but any voice and details shared by those interacting with the pendant-wearer.
Just imagine you’re a 20- or 30-something person, going out to some douchey bar in the Bay Area looking for love. You spy a cute potential paramour in the distance, who seems to be wearing a shiny piece of jewelry that glows weirdly for no reason. Now you have to wonder if what they’re wearing is some sort of repurposed AirTag, or if it’s a Friendant talking shit about you or sharing snarky observations at your expense. Who wants that?

Children: What about kids? Most data protection laws impose heightened standards for consent in relation to children. With laws specifically designed to protect kids’ privacy gaining traction in the US and elsewhere (including the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act, which recently passed with overwhelming support in the US Senate), an always-recording wearable seems like a recipe for regulatory scrutiny. And everybody wants to protect the children, so that’s something that crosses party lines.
Exfiltration & Data Breaches: While I haven’t looked at the Frenemy’s code base (yet), I suspect that security was … not exactly a priority for either of these companies. Given that both interact with mobile devices via Bluetooth, and Bluetooth is vulnerable to eavesdropping and other man-in-the-middle attacks, I could see someone finding an exploit in a matter of weeks here. Just like people did with Recall. It’s a goldmine for criminals.
Points to the Frenemy for allegedly keeping some of this data on-device, which is better. Minus points for writing all of this in very vague and contradictory language, though.

Controllership: Holy shit is this one a dumpster fire. So, ordinarily data protection laws treat data that normal people use for personal reasons differently than, say, data collected by Meta or Google. That makes sense: nobody wants to police Grandma’s collection of photographs of her grandkids, you recording your friend doing something stupid, or all those dick pics sent between people on WhatsApp. Generally speaking, if the processing isn’t for a professional or commercial use, all the baggage of data protection laws will not apply.4
But these personal use exceptions are usually construed narrowly. It’s not enough that processing is done for a personal reason — it has to be only for a personal or household use. When you start adding others to the mix and sharing that information, things get messy. In the EU, for example, if you, say, set up a CCTV camera to film all your neighbors for your personal jollies, that might be a use for personal reasons, but it’s not a personal or household activity under the law. In fact, a 2014 decision by the Court of Justice of the European Union said as much:
To the extent that video surveillance … covers, even partially, a public space and is accordingly directed outwards from the private setting of the person processing the data in that manner, it cannot be regarded as an activity which is a purely ‘personal or household’ activity …5
I see no reason to believe that this would be different just because it’s a life-logging device that only records audio. That makes life fun both for the creators of these tools AND anyone foolish enough to wear them. If anyone needs help with handling data subject requests, HMU.

Two-Party/All-Party Consent Laws & Eavesdropping: Many countries have some form of eavesdropping law on the books. While many target telephone communications and most only require a single party to consent, some apply to electronic communications generally. According to the Reporters Committee for Freedom of the Press, 11 states have what are known as two-party, or ‘all-party’, consent requirements for recording, including California, Delaware, Florida, Illinois, Maryland, Massachusetts, Michigan (at least for recordings made by a third party who is not involved in the conversation), Montana, New Hampshire, Pennsylvania, and Washington. Other states (including Oregon and Missouri) require all-party consent for in-person recordings of conversations.
While laws vary on how consent is obtained and whether implicit consent suffices, the assumption is that recording must be overt and obvious enough to the other people being recorded, and can’t look like an innocuous piece of jewelry around someone’s neck.
This is more likely to impact the wearer than the Friendant/Frenemy creators, but imagine the lawsuits that would occur if Bay Area techbros start getting arrested for recording their dates or whatever without consent. Or cops. Lol.

The Data Act: There’s a separate and very interesting question about whether these sorts of devices (or other AI spy-wearables) might qualify as connected products under Article 1 of the Data Act. Obviously, more questions would need to be asked — namely, what kind of product data and related service data is collected. Interestingly, I don’t know if this triggers much additional review under the AI Act.
Various Biometric Laws: Both Texas and Illinois have fairly robust biometric privacy laws that protect individuals from having their biometric data (including voiceprints) collected for commercial purposes. If I were doing a more robust DPIA, I’d want to interrogate whether and how voice recordings of participants are used. Clearly some processing is occurring. The question is how much, and for what specific purposes.
Yeah, so this isn’t an exhaustive assessment, and it’s theoretically possible that some or all of these concerns have been addressed and the founders simply haven’t informed anyone yet. It’s possible, but in the way that time travel is possible, or me winning a Miss America pageant is possible, which is to say, not freaking likely.
So yeah. Maybe one or both of the Friendant/Frenemy creators will reach out and we can dig into how these things aren’t the stuff of dystopian nightmares. Hope springs eternal.
But for now, I won’t be holding my breath. As always, if you’ve got a thought/observation or think I missed something, leave a comment or send me an email.
I’m pretty sure that was by design, and I’m also pretty sure Apple will complain. Whether they’re successful depends on the scope of their design patents and trademarks, of course. https://www.macrumors.com/2020/10/22/apples-airtags-revealed-in-newly-published-patents/.
This video is fucking insane, BTW. To his credit, Schiffmann has not developed the hardened edge of a typical tech founder, and shares an unfiltered, unguarded view of his entire thought process. He even likens the tech startup scene to the Roman Empire. Instead of raising armies to conquer lands, he asserts, now you raise capital to conquer markets. Or something.
Absolutely none of these guys are going to make any money if these spy-wearables do half of what’s promised. Assuming, of course, this isn’t all vaporware.
See: Article 2 GDPR. In the US, most state privacy laws, like the California Consumer Privacy Act, set threshold requirements (e.g., processing the personal information of 100k or more consumers within the state), though others, like Texas’ law, are broader or specifically exclude household/personal uses, like the laws in Delaware, Iowa, Illinois, etc. The IAPP has a very handy guide in its state laws report: https://iapp.org/media/pdf/resource_center/us_state_privacy_laws_report_2024.pdf.
See: Ryneš v. Úřad pro ochranu osobních údajů, C-212/13, Court of Justice of the EU at: https://curia.europa.eu/juris/document/document.jsf?text=&docid=160561&pageIndex=0&doclang=en&mode=lst&dir=&occ=first&part=1&cid=12270020. In Ryneš, the court assessed the household activity exception in the context of the older Data Protection Directive, but nothing material changed with passage of the GDPR, so I think this is still good law.