Ken Yeh thought his school was buying software to keep kids off certain websites.
What he didn't know was that it could help identify a student who might be considering suicide.
Yeh is the technology director at a private K-12 school near Los Angeles. Three years ago, the school began buying Chromebook laptops for students to use in class and at home. That, Yeh says, raised concerns from parents about what they'd be used for, especially outside of school.
He turned to a startup called GoGuardian, which helped the school create a list of off-limits sites: porn, hacking-related sites and "timewasters" like online games, TV and movie streaming. The software also has another feature: It tracks students' browsing and their searches.
And that's how Yeh was alerted that a student appeared to be in severe emotional distress.
He recalls getting an alert at work that a student had been searching for suicide and several related terms. "I then went in to view the student's browsing history around this time period."
The more he saw, the more Yeh was convinced that this wasn't an idle or isolated query.
There were other warning signs in the web browsing of this student as well: searches for specific methods of self-harm and "terms that strongly suggested that the student was struggling with certain issues," he says.
Yeh alerted the principals, and the student was brought in to speak with a guidance counselor. That conversation, Yeh says, led to positive interventions. "It was a little unexpected. We weren't thinking about this as a usage for GoGuardian."
Yet in the nearly four years that GoGuardian has been in use at this school, this type of incident has happened four separate times, he says. And GoGuardian says that across the 2,000 districts where its software is in use, it has heard similar anecdotes dozens of times.
Rodney Griffin, the Chromebook coordinator for the Neosho School District in southwest Missouri, says it happens there an average of once a semester.
"Any time, day and night, I alert a school counselor or administrators," he says. "I've had it happen when they contacted home at like 10 p.m. and said, 'I think you need to check on your child.' "
So, these software programs are identifying students in distress, but they do so by effectively thrusting school IT directors such as Yeh into the role of eavesdroppers.
That can be problematic, says Elana Zeide, a research fellow at NYU's Information Law Institute and an expert on student privacy and data.
"This is a growing trend where schools are monitoring students more and more for safety reasons," she says. "I think student safety and saving lives is obviously important, and I don't want to discount that. But I also think there's a real possibility that this well-meaning attempt to protect students from themselves will result in overreach."
A student who types "suicide" into a search box could be researching Sylvia Plath, Socrates or terrorist movements, Zeide points out. And there could be legitimate personal or educational reasons for students to search other flagged terms, from sexual anatomy to sexually transmitted diseases or drugs, without "sending immediate alerts to the powers that be."
She points out that low-income students may be disproportionately subject to surveillance, as school-owned devices are more likely to provide their only access to the Internet. And she worries about the broader message: "Are we conditioning children to accept constant monitoring as the normal state of affairs in everyday life?"
This type of dilemma is almost certainly going to become more common as school-owned laptops and tablets proliferate. In 2015 alone, according to a report released this month, U.S. K-12 districts bought 10.5 million such devices, a 17.5 percent increase over the year before.
Carolyn Stone, ethics chair of the American School Counselor Association, says that she was "taken aback" to hear that student Web searches done at home were triggering interventions by school staff. "It's so intrusive," she says.
On the one hand, she says, the issue of students thinking about suicide needs to be taken very seriously and treated differently from other types of disclosures. When guidance counselors hear anything about potential self-harm, even secondhand, she says, "We're on it. We're calling home. Privacy and confidentiality go out the window."
On the other hand, she says, she worries about school staffers without mental health training having access to what are, essentially, students' private thoughts.
"On the surface, it sounds like a very good idea to err on the side of caution when it comes to student suicide," Stone says. "But this is something that sounds like it could spin out of control. ... It's a slippery slope."
Cody Rice, the technical product manager at GoGuardian, says schools are given control over what search terms are flagged and what to do about them, and no client to date has raised privacy concerns.
"Schools and parents are the primary protectors of the students, and GoGuardian provides another tool to help them in their endeavors, but does not make decisions on which types of online activity may lead to alerts to the administration for the benefit of the student."
Yeh says parents at his school have never complained about privacy violations. He adds that they've raised complaints only when the filtering has malfunctioned, allowing students temporary access to off-limits sites.
As for being asked, with no mental health training, to serve as a de facto early warning system for the school community's mental health, he seems to accept it as a new part of his job.
"It is a way for us to proactively intervene when they are looking for help. And so we feel a good sense of responsibility in trying to look out for the welfare of our students."
A version of this story was published on NPR Ed in March 2016.