Comments (309)
- gorbachev
Meta cancels the contract with the outsourcing company it hired to classify smart glasses content, after employees at that company blew the whistle about serious privacy issues with the content they were paid to classify.
- redbell> "We see everything - from living rooms to naked bodies," one worker reportedly said.> Meta said this was for the purpose of improving the customer experience, and was a common practice among other companies.Am I reading this correctly?! This is probably the weirdest statement I've read on the internet in twenty years.
- HarHarVeryFunny
Not sure which is worse here - that Meta are recording video from customers' smart glasses, or that they are firing people who talk about it.
- jmull
I believe the tricky privacy and security issues around smart glasses (and other "personal" tech) can be navigated successfully enough by a thoughtful, diligent, responsive company.

Which is why I'd never touch a personal tech device from Meta. Their entire DNA is written to exploit their users for profit. In my judgement, they literally cannot and will never consider those issues as anything other than something to obscure, to keep people unaware of the depth of the exploitation.
- reliablereason
I wonder under what circumstances footage from the glasses is uploaded for classification. Probably this happens when people ask the glasses something about what they see, and the glasses upload video for classification to generate an answer. People think it is "just AI", so they are not very concerned about privacy.
- dhosek
This headline reminds me that "row" is one of those words I've been mispronouncing almost my whole life (I just learned the correct pronunciation this year). In this context, row rhymes with cow,¹ not dough.

1. The first rhyme that came to mind was bow, but I realized there was a problem with that example.
- KaiserPro
Ex-Meta employee here (yes, you are right to boo):

The thing that really gets me is that internally there are 4 levels of data: 1 being public domain shit (the sky is blue), up to 4, which is private user data, or anything that is sensitive if leaked or shared. I was told that by default all user data is level 4, as in: if you do anything with it without decent approval, you're insta-fired. There are many stories about at least one person a month during boot camp accessing user data and getting escorted out of the building within hours.

Where I worked, in visual research, we had to jump through a year's worth of legal hoops to get permission to record videos in public. We had to build an anonymisation pipeline and a bulletproof audit trail, delete as much data as possible, with auto-delete if something went wrong. We had rigid rules about where that data could be stored and _who_ could access it. We were not allowed to share "wild" footage (i.e. data that might have a hint of anyone who hadn't signed a contract) for annotation, because it would be given to a third party. The public datasets we released all had traceable people and locations, with legal waivers signed.

Then I hear they just started fucking hosing private data to annotators to _train_ on? Without any fucking basic controls at all? Just shows that whenever Zuck or monetisation want something, the rules don't apply. I look forward to that entire industry collapsing in on itself.
- swiftcoder
One of the bigger commercial niches for smart glasses is filming POV porn, so it is hardly surprising that sort of content ended up in the moderation queue. The project should have planned to account for that use case.
- touwer
Big Tech and the race to the bottom of the ethical pit. We can still go lowerrrr!
- sheepcow
If you want to read more about how the unsavory aspects of AI training are off-loaded onto poor workers in third-world countries, I would recommend Karen Hao's "Empire of AI". These workers are paid pennies an hour for unstable jobs that expose them to some horrific material.
- mxfh
> Meta ended its contract with Sama

At this scale, this sounds like some insider-joke contract, made up only to make some hustle on the side, capitalizing with stock options on the possibility of ad-hoc news-trading bots glitching out on the keyword - here, the "x.com/sama" signals.
- m-p-3
Absolutely no way I'd buy anything from Meta that has a camera built in.
- bluedino
What does "row" mean here? For us non-native English speakers.
- yaur
It seems the issue is not the glasses users, but the people that the glasses users were having sex with. Did Meta get their consent before redistributing this content?
- jimmyjazz14
It still blows my mind that anyone would volunteer to don these smart glasses; it's almost like some alien mindset to me.
- rufasterisco
It would be refreshing, for once, to see the top comment on such articles be: "Yes, we all know it, and we keep those apps installed regardless."
- malshe
A question for the HN folks who work for Meta: is the pay so good that it makes it worth working for such a morally bankrupt organization?
- prepend
I think Meta, like all companies, doesn't want its subcontractors creating bad press for it. So it doesn't surprise me that Meta didn't renew (or cancelled) a contract that is a net negative for them. Arguing over the reason seems fruitless, as no reason is needed per the terms of the contract (I assume, since breach of contract wasn't brought up by the sub).
- f311a
Why do they even need workers to classify naked content? They could filter some content before passing it to workers. They already have models to moderate explicit content.
- sidcool
Why would anyone trust Meta with their personal data? After a while it's just natural selection.
- cwillu
> Meta's glasses have a light in the corner of the frames that is turned on when the built-in camera is recording.

Because nobody knows how to put a dot of nail polish on an LED they don't want seen, right?
- letmetweakit
Unfortunately, this news will have no impact: not on customer behavior, not on policy, not on Meta's behavior.
- I_am_tiberius
Not a fan of regulation in general, but I would love to see a ban on cameras on glasses used in public spaces.
- swyx
this may be the greatest title i've seen on hacker news in a decade
- talkingtab
Meta said the contracting "did not meet (Meta's) standards". I am sure that is true. Meta's "standard" is not to reveal the illegal, immoral, unethical things Meta does, no matter what the harm.

Maybe a company with those standards should not get our business. Oops, no wait - maybe they mean the Friedman Doctrine standards? In that case they are entitled to do anything and everything to make a profit. No matter what the harm.

[edit: added last two sentences]
- theowsmnsn
Meta is so evil
- dickeeT
I don't think smart glasses themselves are a good idea
- fortran77
People have sex with their glasses on?
- shevy-java
Facebook may have to rename itself to NaughtyBook or SpyBook or Pr0nBook. They really want people to help them spy on other people here - including their sex lives. Expect new sexy videos in 3... 2...
- mmanfrin
I got a paywall - first time I've seen that on the BBC.
- hirvi74
Good. Anyone who works for such a company is immoral in my opinion.
- game_the0ry
Can we boycott Meta yet? I am sick of this company.
- JKCalhoun
Oops! Oh, too late. And another nail in the heart of smart glasses…
- aklemm
I bet the victims had their socks on too
- tamimio
> and was a common practice among other companies.

Meta isn't lying - you should assume other companies are doing it too. Tesla did it with their cameras, and assume that any company has access to your camera; I would even assume CCTV cameras too. That's why, for anything sensitive, you should try to use open source stacks. You might lose some of the features, but it's a needed compromise.
- jmyeet
So I've never had a smart speaker in my house (Alexa, Apple, Google). I've just never been comfortable with the idea of having an always-on, cloud-connected microphone in my house. Not because I thought these companies would deliberately start listening and recording in my house, but because they will likely be careless with that data, and it will open the door for law enforcement to request it. Consider the Google Wi-Fi scraping case from Street View. Or they might start scanning for "problematic" behavior, a bit like the Apple CSAM fingerprinting initiative.

So not one part of me would ever buy Meta glasses (or the Snap glasses before that). You simply don't have sufficient control over the recordings, and big tech companies can't be trusted, as we've witnessed from outsourced workers sharing explicit images. And I bet that's just the tip of the iceberg. I honestly don't understand why anyone would get these and trust Meta to manage the risks.
- rickdg
This is what happens when you buy a camera from the "they trust me, dumb fucks" guy and put it on your face.
- ai-network-lab[flagged]
- 3748595995[flagged]
- ghm2180
About the "they asked us to view it and then fired us for it": having worked in their RL division (I don't work at Meta anymore), this story is quite weird, for two reasons:

1. Meta, AFAIR, paid/compensated people - contractors or people recruited via ads - to have them submit their data. There are strict privacy protocols and reviews in place to distinguish data use in these cases vs. the general public. This is not to say the process is perfect, but if these users are the general public, I would be very shocked.

2. Hiring contractors to submit data is a more controlled environment than recruiting the general public via ads to submit data, and the former has better-understood privacy disclosures than the latter. In practice, asking contractors to wear glasses and "move around their surroundings naturally and do things" fits the basic privacy practice of "the data you are submitting, we can view and use all of it for purpose X and nothing but X". But that framing is much, much harder with ad-recruited people, who are general users willingly submitting data. My suspicion is they are running ad-based recruiting among the general public, and while those users may have signed a privacy statement, it is very surprising that they did not tighten the privacy practices around the use of the data and who has access.