Error Message Exposes Vending Machine’s Use Of Facial Recognition Tech

from the leveraging-munchies dept

Like most tech, facial recognition AI continues to become cheaper and easier to implement. Is it getting better? Well, that hardly seems to be a primary concern for those deploying it.

Adoption of this tech tends to focus on the law enforcement side of things. This is where it seems to perform worst. The tech is much more unreliable when asked to identify minorities. That's problematic when deployed by the government, which has the power to deprive people of rights and personal freedom when given the go-ahead by tech that performs worst when identifying the very people our government already tends to oppress/over-police most frequently.

The private sector’s use of this tech is often no better. While it has some utility for internal use — i.e., verifying the identities of employees seeking to access certain areas or information — the most common deployments are tied to law enforcement: the (hopeful) identification of suspected criminals. So, even most private sector use invokes the excesses of government power, while still relying on faulty tech that generates the most false positives when dealing with people of color.

Is there such a thing as an innocuous deployment of this tech? Sure. There’s a chance that might happen. But it would involve telling people this information is being collected while making it clear what this information is being gathered for.

Facial recognition tech in a vending machine is unlikely to aid and abet a string of rights violations. But it’s far from innocent. In fact, it tends to disturb people who might otherwise be supportive of government use of this tech.

College students in search of snacks are never going to assume their purchases are triggering facial recognition tech. Wes Davis’ brief summary of a much deeper story for The Verge makes it immediately clear how regular people feel about unexpected facial recognition deployments.

“Why do the stupid M&M machines have facial recognition?”

A student at the University of Waterloo in Canada asked that in a post showing a vending machine error message that revealed a facial recognition app had failed.

Student publication mathNEWS found that the machine’s maker, Invenda, advertises that it gathers “estimated ages and genders of every client.” But don’t worry, Invenda told Ars Technica the machines are “fully GDPR compliant.”

The journalists at the University of Waterloo's mathNEWS dug a lot deeper into this story. The end result may be the welcome removal of surprisingly intrusive snack machines, but the details show that vending machine manufacturers are willing to deploy this tech without performing much due diligence, yet are far more reluctant to own up to it.

The first mystery that needed to be solved was identifying which company was specifically responsible for adding facial recognition tech to machines that have generated healthy profits for years without attempting to surreptitiously gather demographic data on their customers.

The error message that inadvertently informed students of the presence of this tech included the name of the vendor:

Invenda.Vending.FacialRecognitionApp.exe
Application Error

Invenda is not the first link in this chain. The machines were placed on the campus by third-party vendor Plant Ops. That company claimed to have zero involvement beyond the delivery and placement of the vending machines, which were owned and operated by an entirely different company.

That company was Adaria Vending Services. But this third party also does not manufacture or control the machines’ operation or internal tech. The tech exposed by this error links back to the company named in the error message: Invenda. Not that Adaria’s hands are completely clean, as the student newspaper points out:

Adaria does not make the machines; [journalist] firstie determined the machines’ original manufacturer to be Invenda Group, an organization boasting intelligent vending machines with data collection capabilities. Some data collected is benign, including sales and UI performance metrics. But Adaria can also use these machines to collect further data, sending it to relevant parties including Mars, the manufacturer of M&M’s. In particular, Invenda’s sales brochures state the machines are capable of sending estimated ages and genders of every client.

Two beneficiaries of additional data, then, although sales and UI performance metrics never necessitate the deployment of facial recognition tech. It's only the latter — the stuff Invenda and its clients want — that can't be gathered by anything other than cameras and tech that phones home with conjecture about age and gender as determined by yet another company's facial recognition tech.

According to the statement provided by Adaria, the machines (and the hidden tech) do not “take or store” photos of customers. Supposedly the tech acts like a motion sensor, doing nothing more than informing the machine that someone intends to make a purchase.

But a motion sensor is way different than a camera with facial recognition tech attached. While it might be useful to add something that can differentiate between someone standing in front of the machine and someone just near it or passing by, there have been enough advancements in motion detection to accomplish this without the addition of facial recognition tech.

So, this excuse isn’t all that credible, even if it may truthfully portray Adaria’s relationship to its machines and its apparent data obligations to the manufacturer of the goods located in its vending machines.

Invenda’s statement makes it clear Adaria either doesn’t completely understand what’s going on, or has been forbidden to discuss further details as part of its agreement with Invenda.

As the producer of the Invenda IoT solution, the Invenda smart vending machine, and its associated software, we formally warrant that the demographic detection software integrated into the smart vending machine operates entirely locally. It does not engage in storage, communication, or transmission of any imagery or personally identifiable information. The software conducts local processing of digital image maps derived from the USB optical sensor in real-time, without storing such data on permanent memory mediums or transmitting it over the Internet to the Cloud.

They go on to say:

It is imperative to note that the Invenda Software does not possess the capability to recognize any individual’s identity or any other form of personal information.

If we take this at face value, the facial recognition tech generates a demographic guess, stores it locally, and discards the images used to make this determination. All well and good. But storing it locally doesn't make much difference overall, since it appears Invenda still harvests this data, even if doing so requires sending techs out to the machines to collect it. It sounds like a GDPR workaround that allows Invenda to claim it's not collecting this data remotely or storing it somewhere other than the location where it's collected.
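To make the claimed pipeline concrete, here is a minimal sketch of what such "local-only" demographic tallying could look like. This is an illustration of the described behavior, not Invenda's actual code: the estimate_age_gender() helper is hypothetical, and OpenCV's stock face detector stands in for whatever model the machines really run.

# Hypothetical sketch of "local-only" demographic tallying as described above.
# Frames live only in memory; the only thing that persists is an aggregate
# count a technician could later read off the machine.
import cv2
from collections import Counter

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
tallies = Counter()  # e.g. {("male", "18-25"): 3} -- the only data kept

def estimate_age_gender(face_pixels):
    """Stand-in for a proprietary demographic classifier (hypothetical)."""
    return "unknown", "unknown"

cap = cv2.VideoCapture(0)  # the machine's "USB optical sensor"
for _ in range(1000):      # bounded loop to keep the sketch finite
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        gender, age_band = estimate_age_gender(frame[y:y + h, x:x + w])
        tallies[(gender, age_band)] += 1
    # `frame` is overwritten on the next pass: nothing hits disk, nothing is sent.
cap.release()
print(dict(tallies))

Even in this generous reading, the aggregate counts still exist and still leave the machine eventually, which is exactly the point about manual collection.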

That still doesn’t explain why Invenda now believes it’s essential that its vending machines attempt to determine the demographics of customers. It also doesn’t explain why everyone involved in this — from Invenda to Adaria to the contractor hired to place machines on campuses — has failed to clearly inform vending machine customers that this tech has been added to devices most people logically assume do nothing more than exchange goods for money.

As the student paper sums up succinctly:

No one needs M&M’s cameras.

These companies got along without this tech for the entirety of their existence. M&M/Mars has managed to turn a steady profit for more than a century without needing to harvest (supposedly anonymized) demographic data via surreptitious deployments of tech many people rightfully do not trust.

The fact that these companies are doing it now might only suggest an ever-increasing thirst for data — something that’s understandable as profit margins narrow and more competitors enter the market. That they couldn’t be bothered to be upfront about it suggests the entities involved are well aware these deployments would not have been welcomed by their customers. So, they chose to sneak it in, hoping no one would find out until this particular Overton window passed the inflection point.

But, they got caught — sold out by their own defective software and its far too transparent error message. And, now, they’re losing customers. As the paper reports, the machines infected with this AI are being removed from campus. I guess everyone on the other side of this food chain had better hope the (supposedly) locally collected data was worth it. And now everyone, everywhere will be deploying more side-eye than money to vending machines.

Companies: adaria vending services, invenda, plant ops, university of waterloo


Comments on “Error Message Exposes Vending Machine’s Use Of Facial Recognition Tech”


This comment has been flagged by the community.

Benjamin Jay Barber says:

Karl Bode can’t code

“But a motion sensor is way different than a camera with facial recognition tech attached.”

It’s called a “classifier,” for example the YOLO series of neural networks, and clearly if someone does “need” cameras on the vending machines, for example to correlate products in the machine to the ones preferred by the demographics in the area, or to surveil people who are trying to vandalize the vending machines, the first amendment gives them the right to collect data that is exposed to the public.

If that is the data that is being collected, what legal basis does anyone have to object? Is it merely that their “feelings” were hurt, or that the vending machines are “thinking” about them, or that they don’t want vending machines snitching on them for the property crimes they commit in plain view?
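For context on the term the commenter uses: a classifier labels what it sees without any notion of who it sees. A minimal sketch using the ultralytics YOLO package (the package and pretrained model are real; the image filename is invented for illustration):

# Object detection/classification without identification: YOLO labels regions
# ("person", "backpack", ...) but has no concept of any individual's identity.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # small pretrained model, auto-downloaded on first use
results = model("people_near_machine.jpg")  # hypothetical image
for box in results[0].boxes:
    label = results[0].names[int(box.cls)]
    print(label, f"{float(box.conf):.2f}")

Whether that distinction answers the legal and consent objections raised below is, of course, the whole dispute.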

Anonymous Coward says:

Re:

Because I haven’t seen this highlighted in coverage so far…

Vending machines aren’t in public places. They’re in private establishments owned and operated by a third party.

So in any situation, you’ve got:
The vending machine owner
The vending machine operator
The lessee (business operator)
The vending machine customer
The business customer

These are all different entities. In this case, it’s the vending machine owner who implemented the facial recognition. The operator claims that they don’t do anything with the data. The business operator (a university in this case) was unaware that the cameras existed. The vending machine customers were also unaware, as were the “business customers” AKA students.

And as I pointed out elsewhere, the students are mostly between the ages of 16 and 21 — which means a LOT of them are minors.

So you’ve got facial recognition cameras running undisclosed in a private setting where there’s an expectation of no monitoring.

Or do you also not have an issue with a fourth party monitoring minors in secret?

Anonymous Coward says:

Re: Re:

Vending machines aren’t in public places. They’re in private establishments owned and operated by a third party.

Vending machines are very often in public places—it’s good for business—even if the land on which they sit is privately owned (which, by the way, does not seem to be the case for the University of Waterloo).

Anonymous Coward says:

Re:

What property crimes? There was no claim a machine could detect vandalism, and if it can’t take a picture, then lol? Further, recording an image of vandalism in progress requires no facial recognition.

So I guess none of these companies would ever hire you as an apologist.

We don’t care about corporate feelings, either, such as feeling the need for data revenue.

We also don’t need a legal basis for criticism or avoidance, and none has been claimed.

You’re really bad at this.

This comment has been deemed insightful by the community.
MrWilson (profile) says:

Re:

Tell me you didn’t read the article without telling me you didn’t read the article.

the first amendment gives them the right to collect that data which is exposed to the public.

This is the University of Waterloo in Canada, which, last I checked, wasn’t a party to the US Constitution.

If that is the data that is being collected, on what legal basis does anyone have to object, is it merely that their “feelings” were hurt, or that the vending machines are “thinking” about them, or because they don’t want vending machines snitching on them for the property crimes they commit in plain view?

So you assumed property crimes were being committed against the vending machines and used that random, unsupported thought to justify the legality of something in a jurisdiction you clearly know nothing about.

I’ll do the legwork for the research you’re not going to do in lieu of just making up random arguments:

The Office of the Privacy Commissioner of Canada, the Commission d’accès à l’information du Québec, the Information and Privacy Commissioner for British Columbia, and the Information and Privacy Commissioner of Alberta conducted a joint investigation of Clearview AI and determined that Canadian law requires opt-in consent from individuals for the collection of facial biometric data, even if their data is obtained in public places.

https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2021/pipeda-2021-001/

This comment has been flagged by the community.

MrWilson (profile) says:

Re: Re: Re:

Tell me again that you didn’t read the article without telling me you didn’t read the article.

Invenda’s sales brochures state the machines are capable of sending estimated ages and genders of every client.

Estimated age and gender are biometric data. The momentary, local collection of the data is illegal, even if never transmitted.

It’s a good thing that a face classifier isn’t biometric data, and is incapable of identifying any person.

Notice your pivot here. I addressed scanning biometric data without consent being illegal and you shifted to the act of identifying people. It’s like me saying, “stabbing people is illegal” and you saying, “the stabber didn’t murder anyone, so it’s okay!”

So you don’t understand the technology, the law, or how to argue logically.

Anonymous Coward says:

Re: Re: Re:3

I confess my ignorance, but my impression of Canadian privacy law is that it’s only slightly less laissez-faire than the US.

In theory, Canada has a strong federal law (PIPEDA). It’s been in force since the year 2000, making it one of the earliest such laws. In practice, like the USA, there’s basically no enforcement, so companies can pretty much ignore it. Unlike GDPR, you won’t find any major cases in which actual meaningful fines or settlements were issued; at best, maybe the Privacy Commissioner told some companies to stop doing the most egregious shit they were doing.

Anonymous Coward says:

Re: Re: Re:

So it’s just a matter of the software/hardware still needing improvement.

What is face classification or detection?
Face detection is technology that can locate human faces within a digital image or video. It is the first step in facial recognition systems, but it only involves identifying the presence of a face, not who that person is.
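As a concrete illustration of that distinction, a face detector’s entire output is a set of bounding boxes; no identity is involved anywhere. A minimal sketch with OpenCV’s stock Haar-cascade detector (the image filename is invented):

# Face *detection* only: the output is coordinates of faces, never identities.
import cv2

img = cv2.imread("vending_customer.jpg")  # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(boxes)  # e.g. [[112  80 160 160]] -- where a face is, not whose it is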

BernardoVerda (profile) says:

Re: Re: Re:2

Besides, even if the software is “just classifying” scanned faces into demographic categories, Step 2 is inevitably going to consist of filtering for unique individuals, identifying repeat customers, tabulating how often they repeat… Etc.

Step 3 will be figuring out how to de-anonymize the data, or at least correlate it with other “anonymous” data, which probably isn’t all that difficult (see, for example, the work done on electronic fare cards).

This comment has been deemed insightful by the community.
NerdyCanuck (profile) says:

This is my sister's university

She saw the news and was shocked; she has walked by and used these machines many times.

And for those asking/wondering, Canada DOES have privacy laws, and they require informed consent before people can be surveilled. So the fact that these machines don’t have a notice on them informing customers about the tech installed on them makes it a clear violation of privacy law.

Though the AC above is a bit off in thier comment about minors – the vast majority of the university students in Canada aren’t minors, because the age of majority isnt 21 like in the USA, ita18 or 19 (depending on the province), so in this case UWaterloo is in Ontario, where the age is 18, not 21. So a portion of the freshman classes each year will still be 17 years old, and thus minors,

This comment has been deemed insightful by the community.
NerdyCanuck (profile) says:

Re:

oops hit post before I was done. Final paragraph should read:

… because the age of majority isn’t 21 like in the USA, it’s 18 or 19 (depending on the province). So in this case UWaterloo is in Ontario, where the age of majority is 18, so a portion of the freshman classes each year will still be 17 years old, and thus minors, but not that many. Whereas if it was a university in BC, then it would be all the freshman class and a portion of the second year students that would be under 19, and thus still minors.

But to be clear, that doesn’t make it any better than if the students were all minors… just marginally less bad. Regardless, this is clearly illegal and the University seems to have fucked up bad by not doing their due diligence before allowing these machines to be placed on campus.

Methinks some school administrators may have just clicked “accept” on the Terms of Service and Privacy Policy without really reading them, like the rest of us peons often do! Whoops!

Anonymous Coward says:

Re: Re: Re:2

occasional talk of expanding [a 21-year-old requirement to] other things like voting

Really? It seems like many other parts of the world are talking about expanding youth rights. Notably, following Scotland’s lead and allowing 16-year-olds to vote seems to be “in the zeitgeist” in Europe.

Anonymous Coward says:

Re: Re: Re:3

Young people here tend to vote for a particular political party, so the other party (the one that tried to overthrow the government) has floated raising the voting age to 25. They’ve also been pushing “parents’ rights” as a way to attack trans people and think 12-year-olds are mature enough to give birth but not hear about LGBT people.

NerdyCanuck (profile) says:

Re: Re: Re: touché

Oh right, I guess he was also mixing up the difference between the “legal age” and the “age of majority”, in both Canada and the US then, not just Canada… the former being the age at which persons can do stuff like buy liquor and cigarettes, whereas the latter is more like the ability to sign contracts or vote.

I just halfway made the same mixup in my rebuttal, though my point still stands that the vast majority of college students are NOT minors, but that also doesn’t make this surveillance more okay morally, and certainly doesn’t make it any less illegal in Canada.

Anonymous Coward says:

Re:

Also, I don’t know about Waterloo, but at some Canadian universities, most people living in dorms get meal plans that have a cash component. That cash, if not spent, will disappear at the end of the school year—at the end of April. So it may not be an entirely free choice for students to be using these vending machines (in cases where it’s not practical to spend the remainder at on-campus restaurants).

NerdyCanuck (profile) says:

Re: Re:

interesting, I wonder if that is a thing at Waterloo… I will ask my sister about it tomorrow (it’s almost 2am there, whereas I live in Pacific standard time, so it’s 3 hours earlier).

I do know it’s a very large university, often rated in the top 5 in all of Canada on multiple fronts, and is known for its high-quality undergrad programs generally, and engineering and technology programs especially… in fact my sister said they have had other scandals in the past about rushing the implementation of new technologies, though I’m not sure what she meant by that. She has been a student there on and off for years, as a grad student and PhD student.

Anonymous Coward says:

Re:

These comments about the surveillance of minors made me think of some other scenarios in which surveillance by these machines is problematic: e.g., what if one of the (adult) students brings their (small) child with them when buying something? Or what if they send the kid over while standing across the room? What if they’re holding the kid up (so that they can reach the buttons) thus effectively making a 3-year-old 5 feet tall?

General comment: The headlong rush toward deploying surveillance and facial recognition and AI/LLMs because they can be, whether or not there’s a valid rationale for them, is going to end in tragedy.

Anonymous Coward says:

Re: Re:

These comments about the surveillance of minors made me think of some other scenarios in which surveillance by these machines is problematic

I suspect most of the areas that contain these machines are watched by other cameras. There are probably signs in these areas warning of the surveillance—just not of the facial recognition.

This comment has been flagged by the community.

Benjamin Jay Barber says:

Re: Re:

If you had a classifier to do so, you could detect the kid and refuse to sell him alcohol, at least if it’s Japan, where they have those machines; in America they just get robbed.

But there is nothing about being a child that exempts children from being in photographs of the public sphere, and moreover to do so would violate the first amendment.

This comment has been flagged by the community.

Drew Wilson (user link) says:

Re:

And for those asking/wondering, Canada DOES have privacy laws, and they require informed consent before people can be surveilled. So the fact that these machines don’t have a notice on them informing customers about the tech installed on them makes it a clear violation of privacy law.

This is correct, but there is a huge catch to all of this. Canada’s privacy laws aren’t really enforceable. There’s no provision that can get a company fined for breaking these laws.

The only thing that can happen is a strongly worded letter from a privacy commissioner or two. This can come from a provincial or federal level. If the company responds by tossing that letter in the shredder, the commissioner’s job is done, as they don’t have any more legal tools they can use in that role.

This leaves private citizens. They can legally attempt to file a lawsuit against the people behind the vending machine, but they have to prove damages in court. Hiring a lawyer and going through the process is a huge financial deterrent in the end for people who have been surveilled. In all likelihood, the companies might get a written scolding before they go back to whatever it is they want to do after.

Canada has needed real privacy reform for nearly a decade now and we aren’t getting it any time soon, unfortunately. Too much profit to be had maintaining the system of surveillance capitalism.

NerdyCanuck (profile) says:

Re: Re:

Yes, this is very true; just because the laws exist doesn’t mean they’re actually effective/enforced.

like so many other aspects of Canadian law, they basically rely on sternly admonishing the bad actor [in a snotty British-ish overly formal way] in the hopes that they will change their ways due to shame.

Which DID work for the case of that mall in Calgary that had implemented facial recognition technology a few years ago, to be faaaaair… but of course usually doesn’t. Companies pretty much never get charged with this kind of stuff in Canada, even less than in the US. sigh

Anonymous Coward says:

Re:

You could jam its data so that the machine cannot send your picture along.

Way to overcomplicate things. The students already solved the problem with electrical tape. Also, do we know the connections are wireless?

At my Canadian university, the machines were all connected via ethernet. Well, that was 20 years ago, so Wi-fi or cellular wouldn’t surprise me, but probably most of these machines are near old ethernet jacks. (In those days, laundry machines used the same campus card system as vending machines; but unlike the vending machines, they’d work for free when they couldn’t contact the server. I wasn’t the only one to notice that outages were common during certain times of night…)

NerdyCanuck (profile) says:

Re: Re:

Doesn’t it say in the article that the machines weren’t wireless, that the data had to be collected manually, and that was how they claimed the machines were “GDPR compliant”?

But I agree it’s very fishy that they would call it IoT. I didn’t catch that acronym in the article, but it certainly suggests the machines are at least capable of transmitting the data, just that they had supposedly turned that feature off to try and make them compliant in Canada***…

I wonder if someone could detect whether they were transmitting the data? Like not necessarily jamming it, just detecting whether it was giving off a signal?

*** and like no wonder they aren’t, lol, they got the wrong law – GDPR is the EU regulation, not Canada’s!
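On the commenter’s question above: passively watching for a signal is doable in principle, at least on a shared network. A rough sketch using scapy, with the caveats that it needs root, the machine’s address below is invented, and traffic over a cellular modem would never be visible on the local network at all:

# Watch for any outbound packets from a suspected vending machine address.
# Requires root and a network position where the traffic is actually visible.
from scapy.all import IP, sniff

VENDING_IP = "192.168.1.50"  # hypothetical address of the machine

def log_outbound(pkt):
    # Print a line for every packet the machine sends anywhere.
    if IP in pkt and pkt[IP].src == VENDING_IP:
        print(pkt[IP].src, "->", pkt[IP].dst, len(pkt), "bytes")

sniff(filter=f"host {VENDING_IP}", prn=log_outbound, store=False)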

HotHead (profile) says:

Supposedly the tech acts like a motion sensor, doing nothing more than informing the machine that someone intends to make a purchase.

But a motion sensor is way different than a camera with facial recognition tech attached.

Someone who might be the same Benjamin J. Barber who distributed revenge porn took issue with the above excerpt.

In response to Barber’s coincidentally-revenge-porn-enabling rant, another commenter mentioned a great example of the problem:

So I can put a camera in your bathroom is what you are telling me?

Using a security camera as a motion sensor in a bathroom absolutely should be treated differently from using a mere motion sensor in a bathroom. The mere presence of a capability more privacy-invasive than necessary is the problem. Just as a bathroom doesn’t need a photo-capturing-capable camera inside for the building’s security, a vending machine doesn’t need a face-detection-capable camera for the machine’s security, never mind making sales transactions.

Anyway, I highly doubt that Invenda’s vending machines with face detection cameras are GDPR-compliant. Gender, age, and race easily fall under the GDPR’s definition of personal data:

‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

Here’s the GDPR’s definition of processing data (including recording, adaptation, alteration, and use of data):

‘processing’ means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction;

The GDPR applies to processing (including local production of aggregate statistics, which is the same action as, or a less invasive version of, what every analytics service does):

Furthermore, the GDPR only applies to personal data processed in one of two ways:

Personal data processed wholly or partly by automated means (or, information in electronic form); and

Personal data processed in a non-automated manner which forms part of, or is intended to form part of, a ‘filing system’ (or, written records in a manual filing system).

Even if the machines delete direct recordings, the machines still made recordings in the first place, and also extracted data from those recordings. Invenda’s vending machines don’t ask for consent from each user to record the respective user. Sounds like a GDPR violation to me, but IANAL.

This comment has been flagged by the community.

Benjamin Jay Barber says:

Re:

Sounds like a GDPR violation to me, but IANAL.

The machines are not in Europe

Also, the EU wonders why there is no innovation or “silicon valley”, yet at the same time it stifles all innovation.

But given that there are no vending machines in the bathroom with a camera, what is the concrete articulable “injury” caused by the vending machine camera other than hurt feelings?

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Re:

The machines are not in Europe

The point was that the manufacturer claims the machines to be GDPR-compliant. If the Canadian university bought them based on that statement (perhaps under the assumption that it would also be compliant with the less strict local laws), and it was untrue, it’s still fraud.

HotHead (profile) says:

Re: Re:

But given that there are no vending machines in the bathroom with a camera, what is the concrete articulable “injury” caused by the vending machine camera other than hurt feelings?

You keep going on about feelings. At least acknowledge this part of my previous comment:

The mere presence of a capability more privacy-invasive than necessary is the problem. Just as a bathroom doesn’t need a photo-capturing-capable camera inside for the building’s security, a vending machine doesn’t need a face-detection-capable camera for the machine’s security, never mind making sales transactions.

In a strong privacy legal framework, the burden of justification should be on the advocate of observation to demonstrate that an additional invasive capability is necessary (photo cameras vs. motion detectors in bathrooms, face detection cameras vs. regular cameras on vending machines), not on the subject of observation to demonstrate that the additional invasive capability is harmful.

Privacy is primarily about being able to consent, withdraw consent and withhold consent to sharing personal data with other people. Hiding personal data from people who would abuse it is secondary but also important. Would you bar people from suing over privacy violations until data brokers have already distributed personal data collected without consent to someone who will actually try to blackmail, impersonate, dox, threaten, rob, etc. the data subjects? That’s a bad model. The burden of proof should be on the data collectors / data users to demonstrate that they acted with unambiguous consent or needed to use the data in a specific way to fulfill contractual obligations with the respective data subjects (or just legal obligations), not on the data subjects to demonstrate that concrete harm will definitely happen to themselves.

But here’s a generalized concrete harm: people behave differently when they notice or believe that they are being observed. It’s the Hawthorne effect, a contributor to chilling effects. If you use personal data that you have no consent to collect or use in such a way and have no strictly necessary contractual/legal obligation to do so, then you have an unjust power, a power to change the way other people behave, even if the degree to which you can control their behavior is limited.

LostInLoDOS (profile) says:

So much anger

The intent of this tech is blatantly obvious and far from nefarious.

It’s generic marketing info. A survey. The machine scans and guesses the age and gender. Stores it in a spreadsheet.

It’s useful for marketing. Say, who buys more of this product vs. that.
If women buy more of A than B, it makes sense to stock product A in greater quantity in a location frequented by women.
In a dining hall where family visits are frequent, children may require a smaller, sweeter product addition.

It’s not storing images. So what’s the big deal?
Sounds to me like a method that would ultimately make consumers happy. More of what you want where you want it!
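To picture the kind of “survey” the commenter describes: per-sale rows of estimated demographics and product, rolled up into restocking stats. A toy illustration; every value below is invented:

# Hypothetical aggregation of per-sale demographic guesses into restocking data.
from collections import Counter

sales = [  # (estimated gender, estimated age band, product) -- invented rows
    ("F", "18-25", "peanut M&M's"),
    ("M", "26-40", "plain M&M's"),
    ("F", "18-25", "peanut M&M's"),
]
by_segment = Counter(sales)
for (gender, age, product), n in by_segment.most_common():
    print(f"{product}: {n} sales to {gender} {age}")

Whether that trade-off is benign or creepy is exactly what the replies below dispute.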

MrWilson (profile) says:

Re:

You’re speaking from a position of privilege again. Not everyone is comfortable with what you consider benign. The technology isn’t opt-in or even opt-out for the user, who wasn’t aware anything was happening. And the specs of the machine specifically say that it can store and transmit data, so just because it is claimed that this one wasn’t doing so doesn’t mean these features wouldn’t have been turned on in the future or aren’t turned on in a different machine somewhere else.

Did you miss the bolded, quoted text in the article?

But Adaria can also use these machines to collect further data, sending it to relevant parties including Mars, the manufacturer of M&M’s. In particular, Invenda’s sales brochures state the machines are capable of sending estimated ages and genders of every client.

LostInLoDOS (profile) says:

Re: Re:

You’re speaking from a position of privilege again

Quite the opposite. I’m speaking from the world view of reality, where cameras are everywhere and “papers, please” is normal. Americans, and many Europeans, are spoiled by the privacy in their lives.

store and transmit data

That’s the point. Data yes, photos and video no.

state the machines are capable of sending estimated ages and genders of every client

That was my point. It’s market research. Sounds quite benign to me.

Admittedly, I truly just don’t understand the privacy issues so many westerners create for themselves. Scanning your personhood to guess age and gender for them to do market research is hardly invasive. Hardly a privacy concern. And just more FUD for the ignorant masses.
Step outside of Eastern Europe or North America, get a glimpse of reality. Understand that this is more nonsense.

MrWilson (profile) says:

Re: Re: Re:

Quite the opposite. I’m speaking from the world view of reality.

“There’s ubiquitous injustice and privacy violations elsewhere in the world, so you should both stop being concerned and give up your rights,” is a weird argument. This is “there are starving children in Africa, so eat all your dinner,” levels of patronizing and gaslighting.

That’s the point. Data yes, photos and video no.

Non-consensual data collection is still non-consensual.

That was my point. It’s market research. Sounds quite benign to me.

Your judgment cannot be substituted for that of the people involved. Just because you don’t see or have a problem with it doesn’t mean they don’t also. Go dance in front of a camera. That’s your choice. You don’t get to choose for others.

Admittedly, I truly just don’t understand the privacy issues so many westerners create for themselves.

Yes, it’s the people being surveilled who create the problems. Holy fuck, dude. Victim-blame much?

How about companies just stop collecting data that people don’t want them to collect? Oh, that would cut into profits. Let’s get creepy and stalk our customers.

Give up your rights all you like. Don’t demand that anyone else do the same.

Step outside of Eastern Europe or North America, get a glimpse of reality. Understand that this is more nonsense.

This isn’t an argument against being concerned. This is an argument to be grateful that we have some, if under-enforced, privacy rights in the US and Western Europe. We should be advocating for the same for other people elsewhere. Then maybe Chinese people in the US won’t get threatening calls from Chinese police with their family members back at home telling them not to say bad things on social media about the CCP.

You’re once again advocating in favor of wealthy corporations and now adding advocacy for authoritarian governments. But sure, nothing to worry about.

LostInLoDOS (profile) says:

Re: Re: Re:2

Starting off with the easy:

We should be advocating for the same for other people elsewhere

You do that. I have no problem with surveillance and data collection. I save tens of thousands every year via coupons and discounted offers from sharing my habits. I’ve seen firsthand cameras catch criminals, and how a neighbourhood of concerned citizens could use security cameras and car dash cams to bring down a criminal theft ring.
I assure you I won’t be advocating for less surveillance or data gathering.

Non-consensual data collection

Happens every day of the week. It’s non-invasive and non-discriminatory. Don’t use a toll way, they monitor your speed and use. Don’t drive at all. Speed cameras, red light cameras, that catch and penalise criminals, can track you too. Don’t use a cell phone, everyone knows where you are. And never connect to the internet. A VPN won’t protect you. I can unravel that in under 30 minutes. Not even onion routing can keep you anonymous. Takes a few hours though.
Maybe stick to garlic or peach for the internet if you’re daring.

doesn’t mean they don’t

That was again my point: this is FUD nonsense. Anyone worried about surveillance is a criminal or is indirectly supporting criminals.

Victim-blame much?

No? Because I see no victim to blame. Nobody has yet shown how data collection on its own hurts any law-abiding citizen.

We, the bulk of the readers here, live in some of the least surveilled countries in the world. And still many complain about nearly nothing, comparatively. The two most camera-covered countries in the world, both functioning democracies, rarely have such nonsense complaints: Japan, and the southern half of Korea. Most stores in the Philippines take your photo as you enter and as you leave.

They have extremely low crime rates.

And your China nonsense is telling of your grasp on reality. Living in another country does not remove you from your country’s laws. The US is no stranger to this, laying down home law on acts in foreign lands.
It’s also telling, consciously or subconsciously, of what your real motivation likely is: protecting criminals from laws you disagree with.

This is, again, generic non-identified market research, allowing a company to serve the consumer better.

LostInLoDOS (profile) says:

For those that actually care

I did some personal research into this.
Looks to me to be confusion by the lay public as to what a camera is. Not every optical image sensor takes photos or video.
The disconnect, and as I said, the cause of the FUD, is saying there is a camera in these machines.

Though I can’t be sure without actually opening one and looking, I strongly doubt we’re talking about what you imagine, some Nikon- or Sony-type lenses taking photos.
More likely we’re discussing a topography scanner: similar to, but more advanced than, the line-and-dot faces in 1980s sci-fi films (Lawnmower Man, binary, etc.).

The topographical information is run against a database of ranges and the result is saved as a code: man/woman, 10, 50, 90, etc.

There is a real discussion to be had on surveillance. I generally won’t agree with many of you on it, but I see a need to educate the public on the facts. This is not such a topic. Nobody is storing a photo here or sending it to anyone. They’re watching who buys what for inventory management.
I’d argue this is FAR less invasive than the shopping card memberships I use, which monitor what I buy and when, to supply me with discounts and coupons.
