Intel Wants To Add Unproven ‘Emotion Detection’ AI To Distance Learning Tech

from the and-what-does-disabling-the-camera-denote? dept

Last week, Zoom announced its plans to add emotion-detecting tech to its virtual meeting platform, something it apparently felt would facilitate the art of the deal. Here’s Kate Kaye, breaking the news for Protocol.

Virtual sales meetings have made it tougher than ever for salespeople to read the room. So, some well funded tech providers are stepping in with a bold sales pitch of their own: that AI can not only help sellers communicate better, but detect the “emotional state” of a deal — and the people they’re selling to.

In fact, while AI researchers have attempted to instill human emotion into otherwise cold and calculating robotic machines for decades, sales and customer service software companies including Uniphore and Sybill are building products that use AI in an attempt to help humans understand and respond to human emotion. Virtual meeting powerhouse Zoom also plans to provide similar features in the future.

Advocates for this AI, which include companies like Zoom, claim this unproven tech could make it easier to “build rapport” during virtual meetings or, at the very least, give those performing pitches a heads up when they’re losing their audiences.

That’s all well and good when we’re talking about a bunch of consenting adults playing sales pitch poker while attempting to Voight-Kampff their way into a competitive edge. Any advantage should be exploited, even if it means subjecting potential customers to AI with no proven track record. It’s unclear how consent to be emotionally analyzed is obtained (or if it’s even sought), but, again, we’re dealing with adults in a sales situation where this sort of manipulation is considered normal behavior.

The problem is that Intel and Classroom Technologies think this same tech should be inflicted on non-consenting minors. Again, it’s Kate Kaye with the news for Protocol.

Rather than simply allow instructors to draw inferences from student facial expressions and behavior, a couple of companies think they can make teachers better by throwing more tech (and surveillance) at their students.

Intel and Classroom Technologies, which sells virtual school software called Class, think there might be a better way. The companies have partnered to integrate an AI-based technology developed by Intel with Class, which runs on top of Zoom. Intel claims its system can detect whether students are bored, distracted or confused by assessing their facial expressions and how they’re interacting with educational content.

“We can give the teacher additional insights to allow them to better communicate,” said Michael Chasen, co-founder and CEO of Classroom Technologies, who said teachers have had trouble engaging with students in virtual classroom environments throughout the pandemic.

This means the cameras always need to be on, even though some instructors are capable of teaching classes without expecting students to open up a window to their home lives via laptop cameras. IM services, microphones, and texting seem to fill the face-to-face void quite capably. The ability to strip things back to text-only communication also allows students without access to speedy internet connections to stay connected without exceeding the bandwidth they’re allotted or burning up their data if they’re operating under a cap.

The business version requires always-on cameras to record footage that can then be processed by the emotion detection AI, providing customers with insights into detected mood swings in their sales pitch recipients. Presumably, the school version will operate the same way until Intel and Classroom Technologies feel the AI has learned enough to go live.

The end goal is always-on surveillance of students, with the stated goal being better instruction and more student engagement.

“We are trying to enable one-on-one tutoring at scale,” said [Intel research scientist Sinem] Aslan, adding that the system is intended to help teachers recognize when students need help and to inform how they might alter educational materials based on how students interact with the educational content.

At this point, the product is still in the testing phase. To become a full-fledged product, it will need significant buy-in from educational institutions. That’s the sort of thing that often happens without consulting the stakeholders most affected by the addition of new in-home surveillance tech: the students who will be the testing ground for the product.

Even if the reps for both companies are to be believed, and the product really is intended to help teachers better reach their students, the potential for misuse (or deliberate abuse) is omnipresent. On top of that, most humans are incapable of accurately reading the emotions of others, even with a lifetime of experience and innate learning systems far better than any algorithm’s. Add to that the fact that not all cultures use the same expressions or body language to signal mood shifts, and you’ve got a product with the potential to generate a ton of useless or counterproductive data.

Companies: classroom technologies, intel, zoom


Comments on “Intel Wants To Add Unproven ‘Emotion Detection’ AI To Distance Learning Tech”

16 Comments
PaulT (profile) says:

Re:

…and like that tech, this stuff has the potential to be profitable while only “intended” to be for guidance rather than full enforcement.

Like that tech, this is likely going to be used in practice to enable harassment of vulnerable, innocent people, while the people doing so buy themselves a get-out clause for when that results in bad consequences.

Bobvious says:

Tone-deaf, vapid, gormless

Wow!

The opportunities for abuse are limitless with this system. We already have valid concerns from students about invasion of privacy when HUMAN observers are involved. The likelihood that HUMAN DESIGNED and WRITTEN algorithms will fare better is very low. Reading visual cues comes with experience and empathetic interaction with people, developed over time throughout a teaching period. Usually the only time these technologies get implemented is at exam time when people are stressed, fatigued and outside of their comfort zone.

One of the biggest problems with remote interactions is the isolation that students feel. They cannot spontaneously interact like they do in person.

one-on-one tutoring at scale

I see the potential desire to improve a tutor’s productivity and class learning outcomes, but I can also see the thin end of the wedge, where the bean counters get involved and increase class sizes to maximise profits (especially in private institutions), adding more students to the workload of fewer tutors, and fiddling the KPIs to match the shareholders’ desires, rather than the educational outcomes.

Anonymous Coward says:

Re:

The smart way would be a shifting baseline that recognizes when its own estimate is inconsistent or implausible. But building such a profile without explicit feedback from the subject has major epistemology issues. To put it bluntly: “well, he always looks like he’s on the verge of panic, how was I to recognize high anxiety?” One thing that most if not all people absolutely suck at is mixing their own assumptions with the black-box output of any given tool, where “black box” means anything whose precise mechanism they don’t know.
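For what it’s worth, the “shifting baseline” idea is easy to sketch. Below is a minimal, hypothetical illustration in Python, not anything Intel or Classroom Technologies has published: it assumes some upstream model emits a raw per-frame score, keeps a running per-student baseline, and only flags readings that deviate sharply from that student’s own norm. All names here (`BaselineTracker`, `raw_score`) are invented for illustration.

```python
from collections import deque

class BaselineTracker:
    """Per-subject running baseline for a raw 'emotion' score.

    Hypothetical sketch: a reading is flagged only when it deviates
    strongly from THIS subject's own history, rather than from a
    population-wide norm (the 'he always looks on the verge of
    panic' problem described above).
    """

    def __init__(self, window: int = 300, threshold: float = 3.0, min_history: int = 30):
        self.history = deque(maxlen=window)  # recent raw scores for this subject
        self.threshold = threshold           # std-devs from baseline treated as anomalous
        self.min_history = min_history       # don't judge until there's some history

    def update(self, raw_score: float) -> bool:
        """Record a score; return True if it is anomalous for this subject."""
        anomalous = False
        if len(self.history) >= self.min_history:
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            std = max(var ** 0.5, 1e-6)  # guard against a perfectly flat history
            anomalous = abs(raw_score - mean) / std > self.threshold
        self.history.append(raw_score)
        return anomalous
```

Even this toy version runs headlong into the epistemology problem above: without the student ever confirming what they actually felt, the baseline only encodes what the camera saw, not what was true.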

That Anonymous Coward (profile) says:

In a face to face meeting if you can’t read the room, perhaps sales isn’t for you.

It is one thing to be text blind, but when you have the full audio/visual input and still can’t read the room, something is wrong with you.
Oh the camera makes it different!
How? Other than your brain trying to second-guess itself, doing the thing you did every day for years, just now through a camera, didn’t change the game that much.

But hey we have all this data from the suicide prevention app, so we can tell when people are upset now.

PaulT (profile) says:

“The business version requires always-on cameras to record footage that can then be processed by the emotion detection AI to provide customers with insights on detected mood swings by their sales pitch recipients”

Cool. Now, does it tell you whether that mood swing is because they find the work difficult, because they’re annoyed at another student’s interaction, because the dog they can hear yapping 4 houses away won’t shut up, or because the abusive parent who promised to beat them if they let on about last night’s sexual adventure just got back from work?

I can think of a lot of things that could go horribly wrong when this sort of stuff is inevitably used to replace common sense. It could lead to some serious problems being missed and chalked down to inattention, and it could lead to already abused children being penalised for being abused. This already happens enough in schools where overworked and inattentive teachers give up on students they don’t like; it doesn’t need to be automated.

OGquaker says:

emotional characteristics of users

Bezos includes camera…
The I/O devices may include any of a variety of components such as a display or display screen having a touch surface or touchscreen; an audio output device for producing sound, such as a speaker; an audio capture device, such as a microphone; an image and/or video capture device, such as a camera; a haptic unit; and so forth.

https://patents.google.com/patent/US10096319B1/en
