The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Privacy Questions Raised By Distance Learning

from the data-protection-v-child-protection dept

A Case Study in an edTech app

Today I discovered that my twelve year old daughter doesn’t read the books in school that she’d most like to read. She chooses the ones that will get her the most points on the school reading app.

Each book in the English school library is listed on the American app, weighted with a reading level. Children earn a book's points depending on how well they score on the online quiz that proves they read it.

Harry Potter and the Prisoner of Azkaban is rated at level 6 and gets you 18 points. Susan Cooper's novel, The Dark is Rising, also a level 6, wins the reader only 13 points by comparison. Heller's Catch-22 is a level 7.1 and gets you a whopping maximum possible 30 points. By contrast, Orwell's Animal Farm, while rated at a higher level, 7.3, gets you only 5 points closer to the bronze, silver, gold, and platinum goals.

“In many cases, a book’s interest level coordinates with its book level. Hank the Cowdog, for example, the content of which is suitable for fourth-graders, has a book level of 4.5. Many books, however, have a low book level but are appropriate for upper grades and vice versa. For example, Ernest Hemingway’s ‘The Sun Also Rises’ has a book level of 4.4 because its sentences are short and its vocabulary is simple.”

That's how the company itself explains its ranking and scoring system. In its own words, "No formula could possibly identify all the variables involved in matching the right books with the right child."
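The perverse incentive is easy to make concrete. As a purely hypothetical sketch (the vendor's real formula is proprietary and unpublished; only the book levels and point values quoted above are real, and the "choose the most points" logic is my assumption about how a points-maximising child behaves, not the app's code):

```python
# Hypothetical sketch of the incentive a points-based reading app creates.
# The levels and points are the examples quoted above; the selection logic
# is an assumption, not any vendor's actual (proprietary) algorithm.
books = {
    "Harry Potter and the Prisoner of Azkaban": {"level": 6.0, "points": 18},
    "The Dark is Rising": {"level": 6.0, "points": 13},
    "Catch-22": {"level": 7.1, "points": 30},
    "Animal Farm": {"level": 7.3, "points": 5},
}

def best_for_points(books):
    """What a points-maximising child picks: most points, curiosity ignored."""
    return max(books, key=lambda title: books[title]["points"])

print(best_for_points(books))  # Catch-22
# Note the inversion: Animal Farm is rated *harder* (7.3 vs 7.1) yet is
# worth a sixth of the points, so the scoring steers readers away from it.
```

Whatever the real weighting, the observable effect is the same: the child optimises for the score the app surfaces, not for the book she most wants to read.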

Clearly this algorithm can create a misplaced perception of a book's 'value' to a child. Is the app limiting what she will read out of curiosity, keeping her to her assigned 'level' or age band? It influences her choices in ways that are not fully transparent. Is it biased? In what ways, and are its unintended consequences mitigated?

This is not a new app but it’s new to me, since my daughters now log in at home under our UK COVID-19 lockdown. It’s given me more insight into the tools they normally use only at school.

What are we learning about edTech and what do they learn about us?

After three years of part-time research, I have mapped the main types and flows of data from school children across their education from age 2-19 in England. Our report will be out soon.

Schools are the gatekeepers for the State in the UK, by which I mean the national Department for Education and other branches of government. What could go wrong, has. The government has built a National Pupil Database of every child since 1996, and has misused it since 2015 for immigration enforcement. In 2012 it was expanded to store sexual orientation and religion on the named records of millions of students. And it has given identifiable schoolchildren's records away, millions of people at a time, over 1,000 times since 2012, to companies, charities, researchers, think tanks and the press.

Schools are also the gatekeepers for thousands of third-parties to gain access to millions of children’s lives. School-home communications, surveillance through in-school body cams and wall-mounted cameras, attendance, absence, behaviours, recording special educational needs, benchmarking progress with other schools, biometrics for cashless payment systems, learning platforms, different apps to quiz in maths, science and foreign languages, reading tracking, mental health tracking, risk of radicalisation tracking, and all before the statutory testing and termly school census. All these data can go to commercial companies. What starts small is often bought out.

It’s a privacy fire sale and our kids in Britain are going cheap.

Both evasive and defensive, many companies will push back on requests to find out what they hold about your child, how they process it, and who they give personal data to, claiming they have no accountability because they're only processors, not data controllers. Or they wave away their obligations with 'we've got consent'. They're usually wrong. Merely saying 'we process data on behalf of a school' doesn't make you a processor through the lens of data protection law. And a tick-box exercise to manufacture consent isn't freely given, or valid, given the power imbalance in a school setting.

In this example, the nature of the processing is so clearly beyond the school's control in determining how the algorithms work, or who the company can pass data on to, that it almost seems as if the company is being deliberately difficult. Perhaps, and I am just guessing, it doesn't want me to know who it can pass my child's personal data to, when its terms and conditions talk vaguely of 'Affiliated Companies', because it is owned by Cayman Islands private equity, after Google private equity investment, and who knows where my children's personal data are going.

Emerging education technology and emerging risks in the public sector

Black-box algorithms are embedded in a child's daily school life, in the classroom and outside it, as a result of school-led procurement. Artificial intelligence can be used in educational settings for everything from low-level decision making, such as assigning class seating plans based on children's behavioural scoring, through shaping progress profiles or a personalised curriculum, to assigning serious risk classifications about their Internet activity. Safeguarding software installed on a school device, or applied whenever anyone connects to the school network (even on personal devices, depending on the company of choice), may use AI to match what is typed against thousands of keywords, suggesting the child is a risk to themselves or to others, or at risk of radicalisation, and label a child's online activity with 'terrorism'. Much of this monitoring software continues to work out of school, offsite and out of hours. Most families may be oblivious that their child's Internet and offline activity continues to be monitored remotely through school software during lockdown, and will be on future weekends and summer holidays.
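At its crudest, the keyword-matching layer such safeguarding products describe can be sketched in a few lines. This is a deliberately naive illustration with made-up keywords, not any vendor's wordlist or scoring model; real products claim contextual AI on top, but the core failure mode, an innocent sentence tripping a 'terrorism' label, is visible even here:

```python
# Naive sketch of keyword-based monitoring: match typed text against a
# label -> keywords mapping and emit risk labels. The keywords below are
# hypothetical; no real product's wordlist is reproduced here.
WATCHLIST = {
    "self-harm": {"hopeless", "hurt myself"},
    "terrorism": {"bomb", "attack"},
}

def flag(text):
    """Return the set of risk labels whose keywords appear in the text."""
    lowered = text.lower()
    return {label for label, words in WATCHLIST.items()
            if any(word in lowered for word in words)}

# A history essay mentioning "the bomb attack on Pearl Harbor" is labelled
# "terrorism": the classic false positive that context-free matching produces.
print(flag("My essay on the bomb attack on Pearl Harbor"))
```

Context-free matching of this kind is why the false-positive question, and who reviews the flags a child accumulates, matters so much.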

And as the complexity of algorithms grows, it is increasingly difficult for a school without the necessary expertise to really understand how these tools work. Researchers at the Oxford University Department of Computer Science revealed in 2018 the extent of hidden trackers in an assessment of nearly one million apps, and education apps were among the worst offenders. If developers, who borrow the building blocks of code that provide some of their apps' functionality, may not themselves understand the full extent of what their app does in the ecosystem, how can teachers be expected to understand how these tools work and explain them to families? Or really check their efficacy beyond what the company's marketing tells them? All these circumstances increase the need for expert due diligence in educational technology.

Is there due diligence in remote learning under lockdown?

Under COVID-19 and the rush to expand the tools used in remote learning, due diligence has all too often been abandoned. In the UK, the government has made funding available to schools exclusively for spending on Microsoft or Google. The Welsh government believes it has swept aside the lawful basis of consent required to process children's data in Google G Suite for Education, Microsoft 365 and the Just2easy tool suite.

With an estimated 90% of the world's student population affected by school closures in the COVID-19 pandemic, technology is playing a range of vital roles in delivering essential information and connecting school communities beyond the classroom. Some tools provide national platforms for sharing materials; others provide alternative means and modes of Assistive Technology and augmented communications, supporting the rights of those with disabilities. The mixed in- and out-of-classroom remote learning model for everyday learning is likely to stay for some months to come, and these tools will remain as homework tools in the long term.

Risk analysis, though necessary for introducing many types of technology at scale, and especially with children, has not been done. While some see the risks, others see this all as a great opportunity.

As Neil Selwyn wrote in Techlash last month, "in terms of digital education, there needs to be sustained scrutiny of the emergency actions and logics that are being put into place," and he quoted Bianca Wylie, who wrote on Toronto's 'Sidewalk Labs' in the Boston Review that "technology procurement is thus one of the largest democratic vulnerabilities that exists today."

In the same Techlash paper, Ben Williamson wrote of 'the world's biggest educational technology (EdTech) experiment in history,' in which the OECD's education director Andreas Schleicher claimed: "It's a great moment. All the red tape that keeps things away is gone and people are looking for solutions that in the past they did not want to see. … Real change takes place in deep crisis. You will not stop the momentum that will build."

Enabling a remote digital connection to school for the most deprived has meant introducing welcome hardware to families in need, but with little in the way of user support, training or explanations of how the software works. The long-term implications for the child's personal data are way down the list of priorities. And if a school has introduced a new remote learning platform, a new Google or Microsoft core system, live calls over Zoom and the like, plus new apps for home quizzes and assessment, or offers no education at all, families aren't in any position to say no.

No wonder there’s a land grab going on under the pandemic by companies determined to gain territory in the crisis while the children are so disempowered.

Together with 35 organisations around the world, in March we urged education authorities to procure and recommend only those technologies which openly demonstrate that they uphold children's rights. Providers' responsibilities haven't changed under remote learning, but the rushed adoption of technology around the world risks undermining learners' and children's rights at unprecedented speed and scale.

What were once considered school records in the UK now continue to be used outside the classroom and outside school hours. School surveillance of all kinds is made non-stop thanks to the cloud. Out of context, many of these data will have no meaning, such as test scores designed as school progress measures being used as individual indicators of a child's development, but that doesn't stop their reuse.

The pre-crisis emerging market of social and emotional and mental health manipulation and surveillance is only going to feed on the COVID-19 fears and obligations to a duty of care. An almost digital equivalent of teenage FOMO, fear of missing out, seems to be growing in local governments around the re-use of children’s digital records. There is growing pressure to join up educational records, with health, with welfare, and even with policing data to rely on machine-made predictive risk scores about ‘vulnerable’ children and families that form a single view of ‘truth’ for social care and local authorities to intervene pre-emptively in children’s lives.

Children’s digital rights are quick to be forgotten, but the effects on their digital footprint and lived experience might last a lifetime.

The lasting effects of the COVID-19 crisis on children's education and the future of our communities will be as diverse as the variation of their experiences across different schools, staff, and families. Hand-wringing over the attainment gap caused by lost classroom hours is a somewhat artificial comparison with what would have been without the coronavirus crisis, and it often ignores that the damaging effects of the digital divide, of deprivation and of discrimination affected children before, too. Solutions for these systemic social problems should not be short-term 'COVID-19' reactions, but long-term responses and the political will to solve child poverty.

What should privacy look like in the online world for children and digital learning?

Privacy certainly isn’t an abstract concept when it comes to remote learning and digital tools in education.

Privacy should protect children from covert manipulation of their brains, behaviour and growing personalities. Let kids read with rewards by all means, but without quantifying it for profit.

Privacy should enable a public educational space that’s free from excessive influence of advertising and marketing. Let kids use a learning app without being tracked around the web with personalised ads afterwards or have their parents sent mobile ads for the upgrade to its premium service.

Privacy should ensure that educational activity doesn't mean children's exploitation for others' commercial product development. Let kids use technology in learning where it is proven to have pedagogical benefit, but without them being research-trial guinea pigs for your unproven experimental technology, or fodder for a training dataset.

Respect for privacy should protect children as they grow into adulthood from the exposure of their sexuality, physical and mental health conditions, past behaviours and misdemeanours in ways that could harm them or their community through loss of confidentiality, discrimination, stigma and worse —but with the wide range of actors that can access, create or infer such data about children as a result of the data collected about them just by going to state school in England, and Internet surveillance by over a dozen companies, I would not feel able to give my children that guarantee today.

Children should be able to go to school to be educated, and not be exploited.

Protecting privacy isn’t all about data either. Children need their privacy protected to enable their other rights in law and the UN Convention on the Rights of the Child, such as online participation, freedom of expression and free speech, the right to information and education, and the right to be heard.

Why isn’t data protection enough to protect privacy in education?

The quid pro quo for access to any country’s children, should be at very least pro-active transparency. We should know how many teachers Google and others have trained for free, and sent back to schools by the thousand as product ambassadors. What’s the business model and future economic expectations?

Large companies that capture large amounts of personal data about large numbers of children —millions in multiple countries— are growing a power base that other companies simply do not have. It is power that goes beyond data processing and starts to reach into how and what teachers teach. By shaping staff training, you capture elements of the curriculum content and the structure of how it is delivered, first at local, then at country level, with worldwide implications. Research is needed to ask whether this shapes a change not only in the delivery of state education but in its purpose. Why focus on teacher knowledge, after all, if your company has turned searching for knowledge online into an everyday verb?

Sweden's Data Protection Authority issued the first decision under GDPR to recognise and act on the nature of the power imbalance in schools, pointing out why consent was invalid as a lawful basis for processing children's personal data in a case of facial recognition used to register attendance. France followed suit.

In Norway, a home-school communications app was found to have insufficient technical and organisational measures to ensure information security.

However if enforcement is only on a case-by-case basis, it won’t bring about the systemic change needed to respect children’s rights at scale. Data protection is not enough while enforcement is almost non-existent in the face of non-stop and often invisible incursions.

As a small rights NGO, we have prepared a dozen detailed strategic complaints for assessment by Supervisory Authorities in the last year alone. We’re taking on our Department for Education in a legal challenge on their handling of national pupil data. But unless we reduce the volume of actors involved in the data processing of children’s data in schools, we’ll always be playing whack-a-mole. We need a better model for its overall handling and oversight.

We may need to borrow from the US when it comes to privacy legislation in education.

An alternative model of data rights management in education, as an addition to rather than a replacement for individual empowerment, is that of the US, governed by FERPA with state-level controls and oversight. It is imperfect in its privacy protections, but it does offer a model of law and expertise for schools to rely on at scale, based on trusted contractual agreements. Schools are genuine data controllers. Processors cannot do all they'd like to under contract, as they might were they instead to rely on a manufactured 'consent' basis, nor can they change terms and conditions midway through the year without agreed notifications and reasonable terms of change. Families get a list each year (or at each school move) explaining the products their child will be using, and crucially, legal guardians retain a right to object. Schools are obliged to offer an equal level of provision via an alternative method, so that objection is not to the detriment of the child.

"Education happens to be today the world's most data-mineable industry by far," said the then-CEO of Knewton, José Ferreira, in 2012 at the Datapalooza. Technology investment in this industry is laden with values and with the politics of what education means for companies and the State, how and where it is delivered, and who controls it.

Education technology isn't only shaping individuals' lives; it affects the experience of education a child gets, and its value. And the access to children's data that companies gain by offering often-free tools quantifies the value each child returns to those private players.

Privacy isn’t only a tool to protect kids and their future selves. It is the gateway to the control of the world’s education system itself. And that isn’t a deal we should accept for our children, on the corporate Terms and Conditions currently on offer.

Jen Persson is the founder of defend digital me, a UK advocacy organization for children’s privacy and digital rights in the education sector.



Comments on “Privacy Questions Raised By Distance Learning”

Upstream (profile) says:

What could go wrong, has.

And virtually all of the things that have gone wrong were completely predictable. One of your countrymen, Eric Arthur Blair, aka George Orwell, gave us all a preview over 70 years ago.

Many of my impressions of the advanced English Surveillance Nanny State come from Charles Oliver’s Brickbats feature in Reason Magazine. It never ceases to amaze and terrify me. While the UK appears to have a commanding lead in the race to dystopian hell, at least among non-authoritarian Western nations, the US is trying desperately to catch up. Separation of State and school would be a good place to start, for all of us. Non-proprietary, open-source software requirements might also help. At least then the various apps and algorithms could be examined, and their functioning (or lack thereof) and biases could be exposed, and maybe even corrected. And, as always, strong data and privacy protection laws, along with pervasive default encryption, need to be the order of the day. This is an area where "For the children!" is a legitimate rallying cry.

Anonymous Coward says:

Re: Re:

This is an area where "For the children!" is a legitimate rallying cry.

That is never a legitimate rallying cry, by definition. Most people will instantly associate that with political fluff, and not the actual help for children which you are attempting to do. Unless you want your ideas to be discarded out of hand on the spot, you might want to come up with a different call to arms.

Separation of State and school would be a good place to start, for all of us.

Guaranteed never to happen. The whole underlying theme of the article is control and who has it. There is no discussion of control without politics coming into play. The fact that the schools have become so politicized is because of their position to indoctrinate and the belief (justified or not) of them being used for that purpose by parents and politicians. For proof of that, just look at the constant rallying call to "hold schools accountable." In short, the institution has lost the trust of the public, and you cannot demand that everyone give up control to an untrustworthy actor. Which is what you are demanding when you say "Separation of school and state." E.g. "Let the schools decide." Dispel the distrust first, and the separation will come naturally.

Non-proprietary, open-source software requirements might also help.

As someone who has worked in a school district with the authority to make this decision, albeit in the US and not the UK, let me make a few remarks about my experience with this:

  1. User workflows are problematic. The biggest hurdle to any software adoption, not just OSS, is user impact and the need to retrain. This is much worse in a school district, however: not only do you have to worry about adults not wanting to switch, but also about children who haven’t properly learned the previous system yet and may not even know basic concepts common to all software of a given type. Depending on the district / school you may not have the benefit of a good computer science curriculum, mandated or not, that can help transition efforts. (My district hasn’t even taught computer science for over a decade, despite my calls to change that.) You also have the issue of needing to get the time to retrain, which may or may not be easy for students (again, that computer science course) but for teachers and staff is much harder to come by. After all, you are competing with mandatory trainings for other things like the never-ending accountability changes and curriculum updates. If you can’t navigate this it will never happen, and of course changing software for the sake of "privacy" will always be seen by everyone else as optional fluff.
  2. Privacy is "optional." As already stated, many people in the modern era take the whataboutism approach to privacy. "Everyone else is doing it." "$HATED_TECH_COMPANY already knows everything anyway." "$FANBOI_TECH_COMPANY wouldn’t do that, there’s no need to worry." "Teachers / Schools already have more information about the kids than their parents, why should we worry about $TECH_COMPANY?" These and more are just some of the excuses I’ve heard over the years. They are always an attempt to avoid work or change, and convincing everyone is no easy task. Often you need to make political maneuvers and play the long game to have a chance at getting where you want, and there is always the chance that something else even more privacy invasive will come along before then to cause you to start over. Simply put, privacy in general isn’t worth much to most people and, given the nature of the current IT industry, any gains will be viewed by others as a very labor intensive attempt at postponing the inevitable. Often getting around this roadblock requires significant financial gains, or a current system that enough of the right people hate.
  3. Cost isn’t limited to "free as in beer." There’s also the requirement of maintenance and continuing operations. As an example, I switched our systems over to using oVirt for our virtualization needs three years ago. However, I quickly found out that I was the only person who even knew about the software’s existence, and that if anyone else ever had to manage that system, oVirt would be on the chopping block. As it turns out, people were unwilling to learn the new interface (yes, the web interface, not the configuration files) and there was no money in the budget to pay for support from Red Hat. Compounding matters were Spectre and Meltdown: due to the age of the hardware I had set it up on, oVirt deprecated support and hardcoded their hypervisor to reject starting VMs on those CPUs. (The manufacturer refused to provide updated microcode for those CPU models.) As such we cannot update the servers anymore for risk of disabling all of the VMs, and we couldn’t deploy a new installation in case of failure, due to the lack of installation media, without replacing the server hardware, which there was no budget for. The only thing that has prevented me from having to tear it down is the cost of running the VMs’ services on bare metal machines (power consumption). However, with new management coming in and the department becoming redundant (yes, I lost my job), that system will probably be coming down soon anyway, as the new management brings a lot more resources (cash and their own purchased commercial solution) to the table. Getting around this one requires ongoing convincing of merit, both on the side of management and of the software developers.
    Note: I do understand oVirt’s reasoning for blacklisting the CPUs that are not updated, but not providing a "I understand, do it anyway owner override" is a good way to mandate their removal in favor of something that doesn’t break the security of everything else on the system in the process.
  4. Unproven software is a hindrance. School districts are all about consensus. Often you will get one district that decides on a solution and everyone else will "steal" their solution for use as their own. Unless you just so happen to be the district that chooses, anything you do differently will be viewed poorly by your peers. Which means you’ll be dealing with complaints from your administration whenever problems happen in the form of "other districts don’t have these issues." It’s OK when the "approved" software malfunctions, it’s written off as "nothing we can do but bear it", but no such reprieve is granted for the non-conformant. Even if it’s the exact same issue encountered by conformant software. Do this too much and, of course, you’ll be viewed as incompetent in your job. As the belief of non-conformance being the root of all evil takes root even in compliant software. ("Well we have that program, it must be interfering somehow.") This problem can’t really be overcome for any software. It’s the hive mind mentality of school districts. The best you can do is testing and monitoring, and hope that nothing else comes along to mess it up. Of course that also means you have to limit the amount of non-conformance you do as well. The result is the places where OSS, or anything else for that matter, can be adopted is very limited.
  5. Laziness is king. I remember hearing someone at a conference claiming as a presenter that they were in the process of systematically identifying all of the "techie" kids in their schools, encouraging them to take on the role of the goto tech support (both for the "reboot your computer" and "take the chromebook apart and replace the keyboard" kind) for their entire grade level, and to teach other "techie" kids what they knew. All while hoping the teachers and staff would play along, so they could play Minecraft all day.
    Let that sink in. Because when I heard it I was wondering how this guy thought that this was acceptable or how he would still have a job if he succeeded. That kind of mentality is present in many areas of school district IT departments, but none more so than procurement. Again, hive mind mentality comes into play here, but even when it doesn’t, and the freedom to choose is granted, the thing that works with as little effort from the district IT staff as possible is typically the thing that is chosen. Regardless of cost. They are absolute suckers for SaaS solutions, and many of the district administrations are all too willing to go with the hive mind’s decision without question.
    As an example, the chromebook deployment in my district cost more than the district could reasonably afford within their support lifetimes. They got less out of them than what they could get with their existing equipment, and even lost capabilities they took for granted, as they would need to pay for new subscription services that weren’t in the existing budget to retain even a less capable version of them. None of that mattered, nor stopped the expenditure and deployment. Nor did the fact that none of the IT staff knew anything about managing these new systems, or had a plan to integrate them into the teachers’ curriculum. The only thing that mattered to administration was: "Everyone else around us is doing it," and the decision wasn’t open for discussion. Anyone have ideas on getting around this one?
    (BTW: Yes, I realize this is horribly formatted. Blame markdown for constantly renumbering my list if I break it up better.)
  6. Many districts simply don’t want to be responsible for hosting services. One reason is that many fear COPPA lawsuits, as the risk of liability for data breaches is higher with locally hosted services, whereas the legal threat is less when liability is quietly hidden away in a service contract and local administrative staff have limited access. Another reason is cost. Not every district wants to run a data center, nor have the requirement of hiring staff with the skill needed to manage one. Again the hive mentality comes back into play here as well; after all, why bother exploring new worlds when others are nearby and risk is low? You can’t really expect to gain privacy if all of the data is held by someone else, and holding the data yourself is an expensive and legally risky endeavor, let alone processing it using OSS, where interoperability is counter to the business model of SaaS. Getting around this requires solving the individual problems, and given that one big issue is federal law, good luck.

Simply stated OSS can get used in a school district but, at least where I live, there is a massive disincentive against it on multiple fronts. Most of it coming from the very issue of the article: Control. The public wants accountability, from their untrustworthy foe, and that desire for ever increasing accountability is creating an ever increasing desire for the public to defund the school districts and for the school districts to do the least required of them due to the risks incurred. I.e. Most of the problems are caused by an unwillingness to trust each other to do what’s best for the kids.

This comment has been flagged by the community.

Anonymous Coward says:

tl;dr –

The point was made in the first six paragraphs, the next 5 or 6 were just reiteration. After that, I couldn’t justify the time expenditure on 30+ paragraphs of the same thing. Please take some time to study how Mike and the other TD writers get their point across, almost universally in succinct fashion.

Thank you.


Darkness Of Course (profile) says:

Been there. Stopped doing it.

My son’s teacher called, wanting to get my support for him to join the summer book club (or some such).

My son: No. They don’t count books over 100p.

Me to Teacher: Not going to happen. He’s read several hundred pages this month, yet would only get two counts in your system. He was in third grade; reading was his passion. Still is.

A decade ago my wife saw him going to the gym they both frequented. Adult, head down, reading a book, walking down the sidewalk to the gym. He never noticed her.

teka says:

As someone who had to support two kids through the end of the school year on ‘distance learning’, there is no sanity.

They were using 7 different systems/websites daily, with assignments showing up in 9/10 places and running the gamut from modifying shared Google documents (with no guidelines) to printing things out from Chromebooks with poor printer support, taking a picture of the completed sheet, emailing that to a school account, and then finally submitting it as an image to a 3rd (or 4th or 5th) party host and hoping a teacher saw it eventually.

All of these places had different login schemes, held different amounts of student info (names, school, year, age) with no oversight over signups, and had no real teacher training on the back end, so engagement was wildly different from class to class. Two fair/good students who liked school were two failing/barely-passing students with high anxiety by the end of the year.

