from the kids-and-their-dang-vidya-games dept
Governments sure seem to hate online advertisers and the platforms that profit from targeted advertising and tailored content algorithms. But they don’t — at least in this case — have anything against engaging in exactly this sort of behavior if it helps them achieve their ends.
In 2015, the UK’s National Crime Agency started a program called Cyber Choices, meant to steer young people away from becoming malicious hackers. Starting with the assumption that any form of hacking would ultimately result in malicious hacking, the NCA hoped to engage in interventions that would redirect this apparently unguided energy into something more productive and less harmful.
Operating under the insistent belief that children are our (grimdark) future if left unattended, the NCA started making stupid assertions, like claiming that modding video games was the gateway drug for black hat hackers. To steer curious youngsters away from malicious hacking, the NCA got into the targeted advertising business.
Through the Cyber Choices program, the NCA identifies “at-risk” young people, based on online activity which indicates a potential interest in cybercrime forums or the purchase of cybercrime tools. Using a set of risk characteristics, the NCA then targets these young people before they engage in serious illegal activity.
Once identified, NCA officers visit these young people to discuss their behavior with them and with their parents.
Data gleaned from this program is also utilized in a complementary “influencer operations” project, in which young people, some as young as 14, who have googled cybercrime services receive targeted Google advertisements informing them that these services are illegal and that they face NCA action if they purchase them.
This raises multiple concerns, not the least of which is “pre-criming” minors because they’ve researched things they’re curious about. On top of that is advertising targeted at minors, something normally considered out-of-bounds behavior when performed by a private company. Apparently, the UK government feels it’s OK if it does it.
And a recently released report [PDF] on “targeted advertising by the UK state” by the Scottish Centre for Crime and Justice Research (SCCJR) says this combination of targeted advertising and pre-crime snooping on minors is at least achieving its aims.
These adverts, targeted at UK adolescents between the age of 14 and 20 with an interest in gaming, are calibrated to appear when users search for particular cybercrime services on Google, informing them that these services are illegal and that they face NCA action if they purchase them. Beginning as simple text-based adverts, the NCA developed them across a six month campaign in consultation with behavioural psychologists and using the data they were collecting from their operational work.
There is evidence that the adverts themselves have been extremely effective in dissuading particular kinds of online crime, with a six-month NCA campaign appearing to be linked to a total cessation in growth in the purchase of Denial of Service attacks in the UK, at a time during which these attacks were rising sharply in comparable nations (Collier et al., 2021).
It doesn’t go so far as to say the ends justify the means, but it does assume the ends are being accomplished with the assistance of the means, no matter how questionable those means are. This is the sort of thing that tends to get tech companies hauled in front of government inquiries. Here’s the SCCJR’s assessment of the UK government’s ad campaign.
It is striking how closely this process of gathering information and tailoring intervention resonates with (refracted through the logics of law enforcement practices) the practices of data gathering and iterative messaging development of the private sector consultancies who provide marketing services commercially. The iterative cycle of identifying and surveilling ‘at-risk’ children, targeting for in-person intervention, collecting information directly, moving on to focus groups, and then feeding these data back into an overall framework which guides the targeting and design of operations is reflective of similar practices of identifying customer groups, quantifying likelihood of purchase, conducting primary consumer research, and then feeding this information back into an overall campaign.
As it drills down into the details of this program, the report also notes how the program zeroes in on minors suspected of Googling the wrong stuff, using Orwellian phrases like “tightening the network of surveillance and messaging developing around young people.”
I assume SCCJR won’t be asked to write any press releases for the NCA, which uses far cheerier phrasing as it pitches offshoots of this program to educators of kindergarten and elementary school students.
It’s well known that many children under the age of 12 are becoming tech savvy and have access to various technologies. A great source of information to teach them legal ways to use tech, is the Barefoot Programme. It enables primary school teachers (KS1 & 2) [explanation of these terms here] to deliver the computer curriculum effectively and in an entertaining way. Barefoot offer free face-to-face workshops, helpful online guides and engaging lessons.
And while we’re talking about things supposedly free governments shouldn’t even be thinking out loud, the phrasing moves from Orwellian to slightly friendlier fascism when discussing how the ads “nudge” minors towards more productive uses of their tech skills.
[Cyber Choices, et al] present not only a diversion from a negative behaviour or rationale but also a positive assertion drawing on the aesthetics of the target culture but often repurposing them in the context of a productive capitalist subject, concerned with consumption, accumulation of wealth, community ties, mainstream success, and job opportunities in the legitimate economy.
Work sets you free, say the targeted ads. And carried with it is an unsubtle reminder that the only acceptable path through life involves making lots of money to buy lots of stuff. Which I guess adds “They Live” into the mix. Alarming stuff, especially since we’re still talking about the targeting of minors.
Like law enforcement everywhere in the world, UK agencies are increasingly reliant on data and analytics to make decisions about enforcement efforts. Some of those enforcement efforts start before any crime has been committed. And the efforts are skewing younger and younger. Here’s The Crime Report noting that UK police are getting kids started on state surveillance as soon as possible.
A PowerPoint presentation delivered by West Midlands Police at the 2017 Excellence in Policing Conference indicates that the National Data Analytics Solution uses data from young people “up to 11” and “11 to 16” age groups, noting that “The younger you are when you commit a co-offending offence the more likely for you to go ahead to commit other crimes.”
A briefing note prepared by West Midlands Police Ethics Committee argues for the “social benefit of being able to identify the circumstances and reasons (particularly for young people) that lead individuals to commit their first violent offence …
Fortunately, the SCCJR report doesn’t conclude without noting the considerable downsides of programs like the NCA’s Cyber Choices.
A central critique of these measures, and one which is no stranger to ‘nudge’ and behavioural science (Ewert, 2020) is their contested relationship with democracy; that as practiced they are essentially top-down, providing public bodies with a unidirectional capacity to shape the online environment, behaviours, and cultures of their citizens (and those groups who fall under their control but are denied citizenship). The notionally holistic approach to policy which this constitutes does indeed draw on a very wide set of levers – culture, economics, individual psychology, structural considerations such as poverty or racism – but instantiates them in a single site, the risky individual and their decisions. This is far from a liberatory (or even liberal) conception of state power, resting on the contention that individual behaviours and community cultures are the root of policy problems. Although some feedback from citizens does form a part of these processes, it often grants them little in the way of agency to shape policy themselves, contributing instead data to market researchers about their thoughts, opinions, and cultural sensibilities which can be drawn on by policymakers in making decisions, often in ways to which target communities might reasonably object.
It also notes the kids are savvier than the cops think, something that’s going to make targeted ad programs counterproductive.
The phenomenon of ‘blowback’, the violently negative reactions which occur when groups realise that they are being subject to these measures, reflect the fact that people’s relationship with media is multifaceted – they know that the targeted influence infrastructure exists, they can often tell when it is being used and speculate as to how they are being targeted, and can react not only to messages to which they are exposed, but to the broader political dimensions of the messaging practices themselves. There is also the potential for these influence approaches to in fact serve to expose vulnerable groups to the very messages and narratives which policymakers are trying to counter, spreading them far wider.
If the UK government really wants to cultivate a generation of activists who forcibly subject governments to greater transparency and accountability by liberating their internal documents, this is a great way to do it. If it would prefer to control the narrative, it’s going to have to dial back the targeted ads and targeted surveillance. The SCCJR suggests a better way to accomplish the government’s aims: a more community-based approach, one that eliminates individual targeting in favor of something more participatory and broad, encouraging the government and the people it serves to work together on solutions to these problems.
But those solutions require treating the public as an equitable partner, something most governments are unwilling or unable to do. And with tons of tech companies offering analytics, data harvesting, and other aftermarket add-ons for the surveillance state, governments are more likely to pretend what they’re doing is smart, rather than just evil.