Microsoft Seeks Patent On Monitoring Employees' Brains

from the brains! dept

theodp writes "A just-published Microsoft patent application for Monitoring Group Activities describes how a company or the government can determine if employees are not meeting their project deadlines through the use of detection components comprised of ‘one or more physiological or environmental sensors to detect at least one of heart rate, galvanic skin response, EMG, brain signals, respiration rate, body temperature, movement, facial movements, facial expressions, and blood pressure.’ Yikes."

Companies: microsoft


Comments on “Microsoft Seeks Patent On Monitoring Employees' Brains”

eskayp says:

Re: monitoring employees

Good question, Overcast; and on topic!
Just as the NSA and the FBI use Carnivore and other information-screening/monitoring software to scan thousands of US citizens’ phone calls, MS could monitor all the personnel in a workplace by applying stress-detection (aka ‘lie detection’) sensing and monitoring.
Only the workers that the software deemed ‘edgy’, ‘uptight’, or ‘stressed’ would be brought to management’s attention.
Keep in mind, however, that all that remotely sensed data from your cubicle, factory floor, or driver’s seat will be logged and ‘trended’ to reveal your ‘stress level’ over time and assign a personalized ‘stress quotient’ to each employee.
An increasing number of businesses are requiring applicants to agree to interviews that include so-called ‘lie detector’ sessions.
By adopting this technology, management is no longer required to feign friendly handshakes in order to check for the sweaty palms that indicate employee deceptiveness.
The corporate ‘stress server’ is already sensing, recording, and serving ‘stress alert’ files to managers.
The term ‘Stress Analysis’ is no longer restricted to the Engineering profession.
The next logical step is to have the stress monitoring system selectively and automatically activate our shock collars or shock shorts.
The new workplace will only require two pay levels: management and spastic.

random cellist says:

So who decides if the set quotas are actually fair? Also, Microsoft shouldn’t need such methods if employees had enthusiasm and confidence in the products they are assigned to. Sounds like an easy way to allow the upper CEOs to sit on their duffs while the script monkeys do all the work.

and as for angry dude, although everyone here may not agree with him, you have to admit that the world would be a bit more boring without the occasional heckler. Doesn’t every good band have at least one sarcastic loudmouth that always seems to be at the front row of every show?

Manta (user link) says:

Re: Good or Evil

True Good or Evil,

Or it could be used as another nail in freedom’s coffin.
My spin on the mind of Bill Gates (as viewed by an ex-hacker).
Hackerz, I mean true Hackers, are not motivated by money; they are motivated by power. Money like that of Bill Gates is a side effect of pursuing power. My opinion has always been that Gates is not trying to be the richest man in the world; the money was the side effect of his striving to have total power over the world. In other words, World Conquest!

Whether his intentions of how he would use this power are good or evil is a topic for some other discussion.

All I have to say is guard your freedoms and rights as long as you can.
And watch out for Vista and other programs that steal away your individuality and self-governing control.

PS: No, I am not a Microsoft hater; we owe much of the way the world functions today to Microsoft programmers and OS. All I am saying is BEWARE.

Spaghettihead says:

I can’t see how this will really help anything. I worked at MS for years and everyone is stressed. I don’t think measuring the amount of stress of a group will yield valuable results as far as moving projects forward, but sociologically, they may just want to figure out how long an employee will endure stress before they ‘burn out’. This is much more interesting information to a company like MS, which depends on human resources so much.

xtrasico (profile) says:

Gattaca anyone?

This reminds me of the movie Gattaca. Everything was monitored and scanned. In the end, the main character got what he wanted, with some work and a little help from friends.

Like we say in Spanish: “El que hizo la ley, hizo la trampa y le dirá a sus amigos como hacerlo” (Whoever made the law made the loophole, and will tell his friends how to use it).

WTF NOW says:

When is enough enough?

This is beginning to get ridiculous. When are people going to start protecting their rights? I want someone to protect the privacy of my brain waves. If all jobs start to require mind reading waivers and your every thought is watched by your boss then where will this end?
We need to start protecting our rights as a public.

PRMan (user link) says:

And if you already have sweaty hands and a differe

Not every person is the same. Between my sweaty hands and my different electrical conductance (I almost failed a physics lab in college until I could prove to the professor that my conductance really was different), I would NEVER agree to a lie detector test.

And any business that would attempt to make it a condition of hiring wouldn’t get me as an employee.

And yes, it would be their loss… 🙂

But you guys are right. This stuff is getting ridiculous.

Jessica says:

Re: Wire our brains

It said they could also use environmental sensors, which you would not be wearing, and who knows if you could even see the device or know it was there? This is just a patent on the idea, but I hope it NEVER makes it into practice. #1, it doesn’t work the way they want, because right now we don’t know what patterns mean what and it mainly just measures stress and excitement, and #2, it’s a HUGE invasion of privacy.

TheDock22 says:

Don't really care.

Microsoft could implement this system for their employees and there is nothing the employees can do but quit.

Companies can force people to quit smoking, lose weight, and not drink alcohol (all of which are legal) when not at work. Why wouldn’t Microsoft be able to monitor brain waves and stress reactions?

If you don’t like it, quit the job. In the end though, I think this technology will not provide them with what they are looking for, and that is to catch liars. Most people do not lie, and the ones that do are good enough at it to get around these measures.

Phil (user link) says:

I’m a little hesitant to post this given the (understandably) negative response to this patent app by Microsoft, but…
…health and wellbeing software doesn’t necessarily have to be some big brother system if you’re genuinely interested in helping people look after their health at work.

Obviously I’m biased since I’ve developed an award-winning software tool to help computer users look after their health called PostureMinder. However, I thought I’d put over my thoughts and the decisions we made during the development of our product (as well as seeing if I can get in a sneaky little plug at the end).

PostureMinder uses a webcam to continually check how you’re sitting and provides reminders whenever you sit in a poor posture for a while.

It’s based on the principle that we all know how we *should* sit, but very few of us do when we’re engrossed in our work or play on the computer. In effect, PostureMinder acts as your posture conscience.

We took the decision very early on not to provide any sort of centralised monitoring or reporting information – PostureMinder’s purely standalone and any posture statistics it gathers are there for you to review yourself, not for your boss to look at.

The reason we took this stance is simple – the moment you move away from the principle that the system is purely, 100% a tool to help you look after yourself, and start adding monitoring of employees, you lose all trust and completely undermine the benefits that the software is trying to achieve.

Perhaps something that Microsoft should bear in mind if they actually develop the sort of product their patent suggests?

If any of you would like to try some health and wellbeing software that’s genuinely been built to try to help people look after themselves in an unobtrusive, non-lecturing, non-spying way, please take a look at our website. I’d really love to hear your feedback on it.


dboots says:

Microsoft's Monitoring

This technology is Big Brother. Here are some other links about this technology being used as some type of lie detector by the Dept. of Homeland Security, in a project called Project Hostile Intent (PHI).

IMAGINE the scene. You arrive at New York’s JFK airport, tired after a long flight, and trudge into line at passport control. As you wait, a battery of lasers, cameras, eye trackers and microphones begin secretly compiling a dossier of information about your body.

The computer that is processing the data from these hidden sensors is not searching for explosives, knives, guns or contraband. Instead, it is working on a much tougher problem: whether you are thinking about committing a terrorist act, either imminently, or at sometime during your stay in the US. If the computer decides that might be your intention, you will be led off for interview with security officers.

The equipment could also screen passengers as they wait to have their bags checked before boarding, in an attempt to predict when someone is planning to bomb or hijack a plane.

It sounds far-fetched, but this is the aim of Project Hostile Intent (PHI), the latest anti-terrorism idea from the US Department of Homeland Security. According to DHS spokesman Larry Orluskie, the DHS wants to develop systems that can analyse behaviour remotely to predict which of the 400 million people who enter the US every year have “current or future hostile intentions”.

PHI aims to identify facial expressions, gait, blood pressure, pulse and perspiration rates that are characteristic of hostility or the desire to deceive. Then the idea is to develop “real-time, culturally independent, non-invasive sensors” and software that can detect those behaviours, says Orluskie. The DHS’s Advanced Research Projects Agency (HSARPA) suggests that these sensors could include heart rate and breathing sensors, infrared light, laser, video, audio and eye tracking.

PHI got quietly under way on 9 July, when HSARPA issued a “request for information” in which it asked security companies and US government labs to suggest technologies that could be used to achieve the project’s aims. It hopes to test them at a handful of airports, borders and ports as early as 2010 and to deploy the system at all points of entry to the US by 2012.

But experts in detecting when someone is deliberately hiding something and training machines to recognise human emotions, say that the DHS faces huge challenges, and is unlikely to achieve this goal by 2010, if ever. “I can’t imagine they will have any reasonable rates of success with such a system,” says Kerstin Dautenhahn of the University of Hertfordshire, UK, who specialises in teaching robots to understand human intentions. “I have serious doubts that it will be successful,” adds psychologist Paul Ekman of the University of California, San Francisco, an expert in detecting hidden emotions and intentions from human facial expressions.

We already know that people betray their true intentions via involuntary behaviour. In the 1960s Ekman found that even when people are trying to hide it they often reveal what they are about to do, by showing fleeting, involuntary facial expressions known as “micro-expressions”. For example, if for a fraction of a second you bare your teeth, lower your eyebrows and wrinkle your nose, while pretending to smile, you’ve just made the micro-expression for disgust.

Since 2003, the US Transportation Security Administration (TSA) has been using a program called Screening Passengers through Observation Techniques, which relies on micro-expressions. Under SPOT, dedicated “behaviour detection officers”, who are trained to observe and decipher micro-expressions, observe people milling around at airports and discreetly pull aside anyone whose micro-expressions seem suspicious. After starting a casual conversation, they might then pass them on for further questioning, depending on their responses. “We have caught a number of individuals, from drug dealers to money launderers, and a double murderer in one case,” says TSA spokesman Chris White.

A big problem, however, is that SPOT is an expensive, labour-intensive process and is not something a customs official or baggage screener can do in addition to their normal work. “Right now, screeners have typically less than one minute to examine a traveller’s documents and assess whether they are a threat,” says Orluskie. Similarly, the infamous polygraph or lie detector test, used routinely by intelligence agencies across the world when grilling suspects – despite its questionable reliability – is time-consuming and requires an officer’s undivided attention, as suspects must be hooked up to electrodes that measure blood pressure, sweat and pulse.

Enter PHI. With this latest idea, the DHS is hoping to automate the SPOT program, so that computers, not humans, search for micro-expressions, and at the same time beef up the range of bodily signs that can be investigated. Machines will not just look for micro-expressions, they will also attempt to sense whether someone is hiding something. For this they might use a remote-controlled, non-contact version of the polygraph, bouncing lasers or microwaves off a person’s skin, as suggested by the US Department of Defense in 2006. The DHS wants to use remote sensors so they don’t impede the flow of travellers.


The link I posted is to the manufacturer of the Cogito Detector.

Cogito can read your mind
Rooting out terrorists using a host of non-contact sensors might be a long way off, but a hostility detector developed in Israel is much closer to being deployed.

A traveller being grilled by the Cogito detector, the handiwork of Suspect Detection Systems in Tel Aviv, sits inside a kiosk and places their right hand on a sensor that measures blood pressure, pulse and sweat, just like the polygraph test, while answering questions that appear on a screen. The questions are pretty benign, says SDS chief executive Shabtai Shoval, such as “Do you intend to live and work here?” But unlike the polygraph, the point is not to work out whether their answer to that specific question is a lie, but instead to compare their bodily responses to other people’s. “We check their specific reaction to certain questions and see how their polygraph response differs from that of innocent people who have answered before them,” says Shoval.

He says that a would-be terrorist will react differently to an innocent person. “Terrorists know that they are planning an attack. Coming into the country a year before, all they might want to do is learn to fly airplanes. But just having that plan in their mind changes their reaction,” he says. Because the test does not hinge on a display of hostility, which could be produced by someone innocent who is stressed from being in an airport and at the same time might not be displayed by a trained suicide bomber, Shoval says that Cogito is more reliable than PHI. “Take the September 11 hijackers. They did not look nervous on the CCTV footage as they stood in the queue before they ended their lives.”
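The comparison Shoval describes, scoring a traveller’s responses against those of prior innocent respondents rather than judging any single answer as a “lie”, is essentially an outlier test. A minimal sketch of that idea (purely illustrative; the function names, readings, and threshold are hypothetical, not SDS’s actual algorithm):

```python
# Illustrative sketch of the baseline-comparison idea: flag a traveller
# whose per-question physiological readings deviate strongly from the
# distribution of readings collected from prior (presumed innocent)
# respondents. All names and numbers here are made up for illustration.
from statistics import mean, stdev

def flag_outliers(baseline, subject, z_threshold=3.0):
    """Return indices of questions where the subject's reading lies
    more than z_threshold standard deviations from the baseline mean.

    baseline: list of lists -- per-question readings from prior respondents
    subject:  list of floats -- this traveller's per-question readings
    """
    flagged = []
    for q, (prior, reading) in enumerate(zip(baseline, subject)):
        mu, sigma = mean(prior), stdev(prior)
        if sigma > 0 and abs(reading - mu) / sigma > z_threshold:
            flagged.append(q)
    return flagged

# Example: the reading for question 1 sits far outside the baseline spread.
baseline = [[70, 72, 71, 69, 73], [65, 66, 64, 67, 65]]
subject = [71, 90]
print(flag_outliers(baseline, subject))  # [1]
```

Note this is exactly why Shoval claims robustness to generic airport stress: a nervous but innocent traveller should still track the baseline question-by-question, whereas someone reacting to specific questions stands out.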

The US Transportation Security Administration tried out Cogito at an airport in Tennessee last year and has ordered more $200,000 machines for further trials this year. But one problem is that either every passenger is interviewed, which slows passenger flow, or passengers will have to be profiled and a few selected, raising privacy concerns.
