Mind if I search your car?
Reuters reporting on Meta's AI chatbot and the dataset the company used to train it:
Meta also did not use *private chats* on its messaging services as training data for the model and took steps to filter private details from public datasets used for training, said Meta President of Global Affairs Nick Clegg, speaking on the sidelines of the company's annual Connect conference this week.
Emphasis mine.
This is a neat little trick. A reasonable reader, or at least someone not completely cynical, may think the term “private chat” just means chats that are private by nature.
But Facebook / Meta doesn't believe chats are inherently private. Privacy is an opt-in feature on Messenger. You must explicitly switch on end-to-end encryption. Only then will Meta agree to stay out of your data.
From Facebook's help center:
A secret conversation in Messenger is encrypted end to end, which means the messages are intended just for you and the other person – not anyone else, including us.
So, when Nick Clegg, Meta's President of Global Affairs, goes on record to say the company's AI doesn't train on “private chats,” it reads like a benign statement. But it's impossible to tell how Clegg is using the term: as an ordinary adjective, or as part of a noun phrase with a precise technical meaning.
It's possible the term belongs to Reuters, since “private chats” isn't in direct quotes. But that seems like an odd liberty for a journalist to take in a published interview.
Okay, I know it sounds like I'm splitting hairs, but in five years, when an exposé breaks and Zuck is invited to another congressional hearing over privacy concerns, that phrasing gives him an out.
Zuck can be like, “We didn't mean private, we meant Private™.” Then some congressperson with a hundred grand in Meta stock can throw up their hands and be like, “Who's to say? Case closed.”
I know of at least one other situation where this type of wordplay occurs.
Ever been pulled over by a cop who asks, “Mind if I search your car?” They phrase it that way because you're likely to answer with a plain “yes” or “no,” and because of how the question is worded, either answer can be taken as consent to search.
“Mind if I search your car?”
“No.”
You may've meant “no, you can't search my car,” but a cop can argue to a judge that they took it as “no, I don't mind.” The reverse is true if you answer “yes”: “yes, I mind” and “yes, I give you consent” are both plausible interpretations.
It's a neat little trick.
Type: #Note
Re: #Meta #Privacy #Technology