NICK EICHER, HOST: Today is Tuesday, September 18th. Thank you for turning to WORLD Radio to help start your day.
Good morning. I’m Nick Eicher.
MARY REICHARD, HOST: And I’m Mary Reichard.
Students headed back to school this year are more likely than ever to be assigned a device of some kind by the school district. It might be a Chromebook laptop or a tablet. And as young people become more tech-connected at school, the need to keep them safe online increases, too.
EICHER: Federal laws aim to help, such as the Children’s Internet Protection Act. That requires federally funded K-to-12 schools and libraries to filter out inappropriate websites.
But some school administrators want to do quite a bit more. They’re seeing troubling trends like the increase in teen suicide and incidents of online bullying. Some of them are turning to software that scans students’ online activity. It identifies threats and references to drugs or to self-harm.
REICHARD: It’s no surprise that these surveillance tactics are raising privacy concerns. WORLD Radio technology reporter Michael Cochrane is here with an update on these so-called Safety Management Platforms.
Michael, how do these software platforms work, and what, exactly, do they do?
MICHAEL COCHRANE, REPORTER: They use what’s called “natural-language processing” to scan through all the words students type on their devices. This includes emails, text messages – even documents saved and stored on the cloud via school-provided networks. If a pre-programmed word or phrase surfaces that might indicate self-harm or bullying, the platform sends an alert message to selected school administrators.
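[Editor’s note: The phrase-flagging approach Michael describes can be sketched in a few lines of code. This is an illustrative simplification, not Gaggle’s or any vendor’s actual system; the watchlist categories and phrases below are hypothetical examples.]

```python
# Minimal sketch of keyword/phrase-based scanning with alerts,
# as described above. Real platforms use more sophisticated
# natural-language processing; this only shows the basic idea.

# Hypothetical watchlist: category -> trigger phrases
WATCHLIST = {
    "self-harm": ["hurt myself", "end my life"],
    "weapons": ["bring a gun", "knife to school"],
}

def scan_message(text):
    """Return (category, phrase) pairs for any watchlist phrase found."""
    lowered = text.lower()
    alerts = []
    for category, phrases in WATCHLIST.items():
        for phrase in phrases:
            if phrase in lowered:
                alerts.append((category, phrase))
    return alerts

# A flagged message would trigger a notification to administrators;
# an ordinary message produces no alerts.
flagged = scan_message("I want to end my life")
clean = scan_message("See you at lunch")
```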
REICHARD: So, to be clear, these scans are limited to school-related devices and internet networks, not their private accounts, right?
COCHRANE: That’s correct. It’s pretty well understood that there’s no expectation of complete privacy on school-issued devices. One school district in Omaha, Nebraska, spells it out in its student handbook. Quoting now—“[Email] and other computer use or storage is not guaranteed to be private or confidential. Computers, files and communications may be accessed and reviewed by district personnel.” End quote.
REICHARD: Do you have some examples where these safety management platforms raised an alert that prompted officials to intervene?
COCHRANE: Sure. Staying in Nebraska, the Omaha World-Herald reported on a number of school districts in the greater Omaha metro area that are using a product called Gaggle. One district reported 747 alerts last year. Most of those were for relatively minor things such as curse words. But some tipped administrators off to serious situations. Gaggle’s CEO said that nationwide his company sent 542 alerts last year just on potential suicides. It sent another 240 notifications involving a student talking about bringing a weapon to school.
One particularly dramatic example came from the Wausau, Wisconsin, school district. A student sent an email to a friend at 7:42 a.m. mentioning intentions of suicide. This kicked off an alert to district officials, who contacted the school at around 7:48 a.m.—just six minutes later. They found the student in the bathroom, moments away from a suicide attempt.
REICHARD: Wow. How much are school districts paying for these services?
COCHRANE: It’s typically on a per-student basis. Gaggle charges $5 per student, per year. So, for a district with 10,000 students that would be $50,000 annually.
REICHARD: It sounds like this is run by the school administrators. How are parents involved?
COCHRANE: As more schools transition to this one-to-one model, many parents are just catching up to the realization that school administrators can obtain communications and documents on password-protected student accounts. That may be one reason why another company, Securly, directly targets parents with an additional service they call Parent Portal. It gives parents access to a dashboard or the option to receive email notifications about their child’s search histories and websites they’ve visited.
Based on my reporting, it seems most parents are supportive of creating a safer digital environment for their children. But some are concerned that students could end up getting punished for writing something, even if they’ve done nothing wrong. One example would be writing the word “gun” a number of times in an essay about gun control. And for Christian parents, it’s not hard to imagine how this could be used to clamp down on certain beliefs secular society no longer favors—like biblical views on marriage and sexuality.
REICHARD: And those concerns are apart from the privacy issues?
COCHRANE: Right, that’s another serious concern. Is safety coming at the expense of privacy? Some people just don’t like the snooping. But others suggest that, since so much of their formative years are spent online, kids need safe “digital spaces” to explore their own identities without fearing they’ll get in trouble.
REICHARD: It seems to me that if students know their computers and accounts provided by the school are being monitored, that might change their behavior somewhat?
COCHRANE: That’s actually starting to happen. Several of those Omaha school districts believe digital monitoring has actually deterred both students and staff from misusing technology. One school principal reported that his team used to get two to three alerts per month. Now that’s dropped to about one.
REICHARD: Michael Cochrane is WORLD’s science and technology correspondent. Thanks so much, Michael!
COCHRANE: You’re very welcome, Mary.