
Posted on Techdirt - 26 June 2020 @ 12:00pm

Privacy Questions Raised By Distance Learning

A Case Study in an edTech App

Today I discovered that my twelve-year-old daughter doesn’t read the books in school that she’d most like to read. She chooses the ones that will get her the most points on the school reading app.

Each book in the English school library is listed on the American app, weighted with a reading level. Children earn a book’s points depending on how well they do on an online quiz to prove they read it.

Harry Potter and the Prisoner of Azkaban is rated at level 6 and gets you 18 points. Susan Cooper’s novel The Dark is Rising, also a level 6, wins the reader only 13 points by comparison. Heller’s Catch-22 is a level 7.1 and gets you a whopping maximum possible 30 points. By contrast, Orwell’s Animal Farm, while rated at a higher level, 7.3, only gets you 5 points closer to the bronze, silver, gold, and platinum goals.

“In many cases, a book’s interest level coordinates with its book level. Hank the Cowdog, for example, the content of which is suitable for fourth-graders, has a book level of 4.5. Many books, however, have a low book level but are appropriate for upper grades and vice versa. For example, Ernest Hemingway’s ‘The Sun Also Rises’ has a book level of 4.4 because its sentences are short and its vocabulary is simple.”

That’s how the company explains its own ranking and scoring system. In its own words, "No formula could possibly identify all the variables involved in matching the right books with the right child."

Clearly there is scope for this algorithm to create a misplaced perception of a book’s ‘value’ to a child. Is the app limiting what she will read out of curiosity, keeping her to her assigned ‘level’ or age band? It influences her choices in ways that are not fully transparent. Is it biased? In what ways, and are its unintended consequences mitigated?
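To make the incentive concrete, here is a minimal Python sketch of how a points-and-quiz scheme like this might behave. The company’s actual formula is not public, so the quiz-based scoring rule below is an assumption for illustration; only the levels and maximum points are the ones listed above.

```python
# Hypothetical sketch of a reading-app points scheme. The real formula
# is proprietary; the quiz-based scoring rule here is illustrative only.

BOOKS = {
    # title: (book level, maximum points)
    "Harry Potter and the Prisoner of Azkaban": (6.0, 18),
    "The Dark is Rising": (6.0, 13),
    "Catch-22": (7.1, 30),
    "Animal Farm": (7.3, 5),
}

def points_earned(title: str, quiz_score: float) -> float:
    """Award a share of a book's maximum points based on quiz performance.

    quiz_score is the fraction of quiz questions answered correctly (0.0 to 1.0).
    """
    _level, max_points = BOOKS[title]
    return round(max_points * quiz_score, 1)

# A child optimising for points picks Catch-22 over Animal Farm, even
# though Animal Farm is rated at the higher reading level.
print(points_earned("Catch-22", 0.8))     # 24.0
print(points_earned("Animal Farm", 1.0))  # 5.0
```

Whatever the real formula, the effect is the same: a perfect quiz on Animal Farm earns a fraction of what a decent quiz on Catch-22 does, and a points-chasing reader responds accordingly.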

This is not a new app but it’s new to me, since my daughters now log in at home under our UK COVID-19 lockdown. It’s given me more insight into the tools they normally use only at school.

What are we learning about edTech and what do they learn about us?

After three years of part-time research, I have mapped the main types and flows of data collected from school children across their education in England, from ages 2 to 19. Our report will be out soon.

Schools are the gatekeepers for the State in the UK, by which I mean the national Department for Education and other branches of government. What could go wrong, has. The government has built a National Pupil Database of every child since 1996, and has misused it since 2015 for immigration enforcement. In 2012 it expanded to start storing sexual orientation and religion on the named records of millions of students. And it has given identifiable school children’s records away, covering millions of pupils at a time, more than 1,000 times since 2012, to companies, charities, researchers, think tanks and the press.

Schools are also the gatekeepers for thousands of third parties to gain access to millions of children’s lives. School-home communications; surveillance through in-school body cams and wall-mounted cameras; attendance, absence and behaviour records; special educational needs records; progress benchmarking against other schools; biometrics for cashless payment systems; learning platforms; apps to quiz maths, science and foreign languages; reading tracking; mental health tracking; radicalisation risk tracking; and all before the statutory testing and the termly school census. All these data can go to commercial companies. What starts small is often bought out.

It’s a privacy fire sale and our kids in Britain are going cheap.

Both evasive and defensive, many companies will push back on requests to find out what they hold about your child, how they process it and who they give personal data to, claiming they have no accountability because they’re only processors, not data controllers. Or they wave away their obligations with ‘we’ve got consent’. They’re usually wrong. Just saying ‘we process data on behalf of a school’ doesn’t make you a processor in the eyes of data protection law. And a tick-box exercise to manufacture consent isn’t freely given, or valid, given the power imbalance in a school setting.

In this example, the nature of the processing is so clearly beyond the school’s control, in determining how the algorithms work or who the company can pass data on to, that the company almost seems to be deliberately difficult. Perhaps, and I am just guessing, they don’t want me to know who they can pass my child’s personal data to when the terms and conditions talk vaguely of ‘Affiliated Companies’, because the company is Cayman Islands private equity owned, after Google private equity investment, and who knows where my children’s personal data are going.

Emerging education technology and emerging risks in the public sector

Black-box algorithms are embedded in a child’s daily school life, in the classroom and outside it, as a result of school-led procurement. Artificial intelligence can be used in educational settings for everything from low-level decision making, such as assigning class seating plans based on children’s behavioural scoring, through to shaping progress profiles or a personalised curriculum, or assigning serious risk classifications to their Internet activity.

Safeguarding software installed on a school device, or applied whenever anyone connects to the school network (even on personal devices, depending on the company of choice), may use AI to match what is typed against thousands of keywords, suggest that the child is a risk to themselves or to others, or at risk of radicalisation, and label a child’s online activity with ‘terrorism’. Much of this monitoring software continues to work out of school, offsite and out of hours. Most families may be oblivious that their child’s Internet and offline activity continues to be monitored remotely through school software during lockdown, and will continue to be on future weekends and summer holidays.
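To illustrate how blunt that kind of matching can be, here is a minimal Python sketch of keyword-based flagging. It is not any vendor’s actual product; the risk labels and keyword lists are invented, and real tools match against thousands of terms and may layer AI scoring on top, but the failure mode is the same:

```python
# Hypothetical sketch of keyword-based "safeguarding" flagging.
# The categories and keywords below are invented for illustration;
# real products match against thousands of terms.

FLAG_KEYWORDS = {
    "self-harm": {"hurt myself", "want to disappear"},
    "terrorism": {"pressure cooker", "martyr"},
}

def flag_text(typed_text: str) -> list[str]:
    """Return every risk label whose keywords appear in what a child typed."""
    text = typed_text.lower()
    return [
        label
        for label, keywords in FLAG_KEYWORDS.items()
        if any(keyword in text for keyword in keywords)
    ]

# Context-free matching misfires easily: a homework search about a
# kitchen appliance gets labelled "terrorism".
print(flag_text("how does a pressure cooker work"))  # ['terrorism']
```

A human reviewer might discard that match in a second; logged and stored at scale, it becomes a record about a child.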

And as the complexity of algorithms grows, it is increasingly difficult for a school to really understand how many of these tools work without the necessary expertise. Researchers at the University of Oxford’s Department of Computer Science revealed in 2018 the extent of hidden trackers in an assessment of nearly one million apps, and education apps were some of the worst offenders. If developers, by borrowing the building blocks of code that provide some of their apps’ functionality, might not even understand the full extent of what their apps do in the ecosystem, how can teachers be expected to understand how they work and explain it to families? Or really check their efficacy beyond what the company marketing tells them? All of this increases the need for expert due diligence in educational technology.

Is there due diligence in remote learning under lockdown?

Under COVID-19 and the rush to expand the tools used in remote learning, due diligence has all too often been abandoned. In the UK, the government has made funding available to schools exclusively to spend on Microsoft or Google. The Welsh government believes it has swept aside the lawful basis of consent required to process children’s data in Google G Suite for Education, Microsoft 365 and the Just2easy tool suite.

As an estimated 90% of the world’s student population is affected by school closures in the COVID-19 pandemic, technology is playing a range of vital roles in delivering essential information and connecting school communities beyond the classroom. Some tools provide national platforms for sharing materials; others provide alternative means and modes of assistive technology and augmented communications, supporting the rights of those with disabilities. The mixed in-and-out-of-classroom model of remote learning is likely to stay for some months to come, and these tools will remain as homework tools in the long term.

Risk analysis, though necessary for the introduction of many types of technology at scale, and especially with children, has not been done. While some see the risks, others see all this as a great opportunity.

As Neil Selwyn wrote in Techlash last month, “in terms of digital education, there needs to be sustained scrutiny of the emergency actions and logics that are being put into place”, and he quoted Bianca Wylie, who wrote on Toronto’s Sidewalk Labs in the Boston Review: “technology procurement is thus one of the largest democratic vulnerabilities that exists today.”

In the same Techlash paper, Ben Williamson wrote of ‘the world’s biggest educational technology (EdTech) experiment in history,’ in which the OECD’s education director Andreas Schleicher claimed: “It’s a great moment: All the red tape that keeps things away is gone and people are looking for solutions that in the past they did not want to see. … Real change takes place in deep crisis. You will not stop the momentum that will build.”

Enabling a remote digital connection to school for the most deprived has meant introducing welcome hardware to families in need, but with little in the way of user support, training, or explanations of how the software works. The long-term implications for a child’s personal data are way down the list of priorities. And when a school has introduced a new remote learning platform, a new Google or Microsoft core system, Zoom-and-friends live calls, plus new apps for home quizzes and assessment, or offers no education at all, families aren’t in any position to say no.

No wonder there’s a land grab going on under the pandemic by companies determined to gain territory in the crisis while the children are so disempowered.

Together with 35 organisations around the world, we urged education authorities in March to procure and recommend only those technologies which openly demonstrate that they uphold children’s rights. Providers’ responsibilities haven’t changed under remote learning, but the rushed adoption of technology risks undermining learners’ and children’s rights at an unprecedented speed and scale.

What were once considered school records in the UK now continue to be used outside the classroom and outside school hours. School surveillance of all kinds is made non-stop thanks to the cloud. Out of context, many of these data have no meaning, such as test scores designed as school progress measures being used as individual indicators of a child’s development, but that doesn’t stop their reuse.

The pre-crisis emerging market of social, emotional and mental health manipulation and surveillance is only going to feed on COVID-19 fears and duty-of-care obligations. An almost digital equivalent of teenage FOMO, fear of missing out, seems to be growing in local governments around the reuse of children’s digital records. There is growing pressure to join up educational records with health, welfare and even policing data, and to rely on machine-made predictive risk scores about ‘vulnerable’ children and families that form a single view of ‘truth’ on which social care and local authorities can intervene pre-emptively in children’s lives.
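In practice, such a ‘single view’ amounts to record linkage plus a scoring rule. The following Python sketch is entirely hypothetical: every field name, weight and threshold is invented (the real systems are opaque, which is part of the problem), but it shows how quickly separate records become one intervention-triggering number:

```python
# Hypothetical sketch of a "single view" built by joining records from
# separate services. Every field, weight and threshold is invented.

education = {"child_042": {"absences": 14}}
welfare = {"child_042": {"household_benefit_claims": 2}}
policing = {"child_042": {"sibling_police_contacts": 1}}

def single_view(child_id: str) -> dict:
    """Merge records from separate services into one profile per child."""
    profile: dict = {}
    for source in (education, welfare, policing):
        profile.update(source.get(child_id, {}))
    return profile

def risk_score(profile: dict) -> float:
    """A naive weighted sum standing in for a 'machine-made' predictive score."""
    weights = {
        "absences": 0.5,
        "household_benefit_claims": 2.0,
        "sibling_police_contacts": 3.0,
    }
    return sum(weights.get(field, 0.0) * value for field, value in profile.items())

profile = single_view("child_042")
score = risk_score(profile)  # 0.5*14 + 2.0*2 + 3.0*1 = 14.0
print(score >= 10.0)         # True: flagged for pre-emptive intervention
```

Note that none of those fields measures the child herself; two of them describe the household. Yet the sum is treated as a fact about the child.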

Children’s digital rights are quick to be forgotten, but the effects on their digital footprint and lived experience might last a lifetime.

The lasting effects of the COVID-19 crisis on children’s education and the future of our communities will be as diverse as the variation in their experiences across different schools, staff and families. Hand-wringing over the attainment gap caused by lost classroom hours is a somewhat artificial comparison with what would have been without the coronavirus crisis, and often ignores that the damaging effects of the digital divide, deprivation and discrimination affected children before it too. Solutions for these systemic social problems should not be short-term ‘COVID-19’ reactions, but long-term responses and the political will to solve child poverty.

What should privacy look like in the online world for children and digital learning?

Privacy certainly isn’t an abstract concept when it comes to remote learning and digital tools in education.

Privacy should protect children from covert manipulation of their brains, behaviour and growing personalities. Let kids read with rewards by all means, but without quantifying it for profit.

Privacy should enable a public educational space that’s free from excessive influence of advertising and marketing. Let kids use a learning app without being tracked around the web with personalised ads afterwards, or having their parents sent mobile ads for the upgrade to its premium service.

Privacy should ensure educational activity doesn’t mean children’s exploitation for others’ commercial product development. Let kids use technology in learning where it is proven to have pedagogical benefit, but without making them guinea pigs in research trials of unproven experimental technology, or a source of training datasets.

Respect for privacy should protect children as they grow into adulthood from the exposure of their sexuality, physical and mental health conditions, and past behaviours and misdemeanours, in ways that could harm them or their community through loss of confidentiality, discrimination, stigma and worse. But given the wide range of actors that can access, create or infer such data about children simply because they go to state school in England, and Internet surveillance by over a dozen companies, I would not feel able to give my children that guarantee today.

Children should be able to go to school to be educated, and not be exploited.

Protecting privacy isn’t all about data either. Children need their privacy protected to enable their other rights in law and the UN Convention on the Rights of the Child, such as online participation, freedom of expression and free speech, the right to information and education, and the right to be heard.

Why isn’t data protection enough to protect privacy in education?

The quid pro quo for access to any country’s children should be, at the very least, proactive transparency. We should know how many teachers Google and others have trained for free and sent back to schools by the thousand as product ambassadors. What’s the business model, and what are the future economic expectations?

Large companies that capture large amounts of personal data about large numbers of children, millions of them in multiple countries, are growing a power base that other companies simply do not have. It is power that goes beyond data processing and starts to reach into how and what teachers teach. By shaping staff training, a company captures elements of curriculum content and the structure of how it is delivered, first at local, then at country level, with worldwide implications. Research is needed to ask whether this shapes a change not only in the delivery of state education but in its purpose. Why focus on teacher knowledge, after all, if your company has turned searching for knowledge online into an everyday verb bearing its name?

The Data Protection Authority in Sweden issued the first decision under the GDPR to recognise and act on the nature of the power imbalance in schools, pointing out why consent was invalid as a lawful basis for processing children’s personal data in the case of facial recognition used to register attendance. France followed suit.

In Norway, a home-school communications app was found to have insufficient technical and organisational measures to ensure information security.

However, if enforcement happens only case by case, it won’t bring about the systemic change needed to respect children’s rights at scale. Data protection is not enough while enforcement is almost non-existent in the face of non-stop and often invisible incursions.

As a small rights NGO, we have prepared a dozen detailed strategic complaints for assessment by Supervisory Authorities in the last year alone. We’re taking on our Department for Education in a legal challenge over its handling of national pupil data. But unless we reduce the number of actors involved in processing children’s data in schools, we’ll always be playing whack-a-mole. We need a better model for its overall handling and oversight.

We may need to borrow from the US when it comes to privacy legislation in education.

An alternative model of data rights management in education, as an addition to rather than a replacement for individual empowerment, is that of the US, governed by FERPA with regional controls and oversight. It is imperfect in its privacy protections, but it does offer a model of law and expertise for schools to rely on at scale, based on trusted contractual agreements. Schools are genuine data controllers. Processors cannot do all they’d like to under contract, as they might under a manufactured ‘consent’ basis, nor can they change terms and conditions midway through the year without agreed notifications and reasonable terms of change. Families get a list each year (or at each school move) explaining the products their child will be using, and, crucially, legal guardians retain a right to object. Schools are obliged to offer an equal level of provision via an alternative method, so that objecting is not to the detriment of the child.

"Education happens to be today, the world’s most data-mineable industry by far," said the then CEO of Knewton, José Ferreira, in 2012 at the Datapalooza. Technology investment in this industry is laden with values and the politics of what education means for companies and the State, how and where it is delivered, and who controls it.

Education technology isn’t only shaping individuals’ lives; it affects the experience of education a child gets, and its value. And the access to children’s data that those companies gain by offering tools, often as freeware, quantifies the value each child returns to those private players.

Privacy isn’t only a tool to protect kids and their future selves. It is the gateway to the control of the world’s education system itself. And that isn’t a deal we should accept for our children, on the corporate Terms and Conditions currently on offer.

Jen Persson is the founder of defend digital me, a UK advocacy organization for children’s privacy and digital rights in the education sector.
