Discussion about this post

Haley Fritz

I was a social work intern at a school that used Gaggle to flag instances of suicide/SH language in school emails and G-chats. When a convo was flagged, a real person reviewed it and referred the student to us for a risk assessment. It actually added to our workload rather than reducing it, because we had to follow up with every student, even when it was clear they were joking or using the words in another context. That meant other kids who needed services might not get to meet with us right away, because we had to meet with every flagged student to avoid liability in case they actually did harm themselves.

Ian Cobb

I really feel uncomfortable with this as a student. I love the potential of being able to help students without them having to speak up first (considering many students don't know how to talk about their mental health needs), but honestly, how can a computer program know what we need any better than we do? And what if it starts assigning medication to kids who don't realize they don't need it? I just don't like it.

