Facebook artificial intelligence spots suicidal users


Leo Kelion at BBC News: “Facebook has begun using artificial intelligence to identify members that may be at risk of killing themselves.

The social network has developed algorithms that spot warning signs in users’ posts and the comments their friends leave in response.

After confirmation by Facebook’s human review team, the company contacts those thought to be at risk of self-harm to suggest ways they can seek help.

A suicide helpline chief said the move was “not just helpful but critical”.

The tool is being tested only in the US at present.

It marks the first use of AI technology to review messages on the network since founder Mark Zuckerberg announced last month that he also hoped to use algorithms to identify posts by terrorists, among other concerning content.

Facebook also announced new ways to tackle suicidal behaviour on its Facebook Live broadcast tool and has partnered with several US mental health organisations to let vulnerable users contact them via its Messenger platform.

Pattern recognition

Facebook has offered advice to users thought to be at risk of suicide for years, but until now it had relied on other users to bring the matter to its attention by clicking on a post’s report button.

It has now developed pattern-recognition algorithms to recognise if someone is struggling, by training them with examples of the posts that have previously been flagged.

Talk of sadness and pain, for example, would be one signal.

Responses from friends with phrases such as “Are you OK?” or “I’m worried about you,” would be another.
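Facebook has not published details of its classifier, but the general approach described above — training a text model on examples of previously flagged posts and treating concerned replies from friends as an additional signal — can be illustrated with a minimal sketch. Everything in the snippet below (the sample phrases, the scikit-learn pipeline, the scoring threshold) is an assumption for illustration only, not a description of Facebook's actual system.

```python
# Minimal illustrative sketch (NOT Facebook's actual system): train a text
# classifier on posts that were previously flagged as concerning, then score
# new posts together with a simple signal drawn from friends' replies.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: texts of past posts and whether human reviewers
# flagged them (1) or not (0). A real system would use far larger corpora.
posts = [
    "I can't take this pain anymore, nothing matters",
    "Feeling so alone and sad, I just want it to end",
    "Great day at the beach with the family!",
    "Anyone watching the game tonight?",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# Assumed phrases that concerned friends often use in replies.
CONCERNED_REPLY_PHRASES = ["are you ok", "i'm worried about you", "please talk to me"]

def risk_score(post_text: str, friend_replies: list[str]) -> float:
    """Combine the post classifier's probability with a reply-based signal."""
    post_prob = model.predict_proba([post_text])[0][1]
    reply_hits = sum(
        phrase in reply.lower()
        for reply in friend_replies
        for phrase in CONCERNED_REPLY_PHRASES
    )
    # Boost the score slightly for each concerned reply, capped at 1.0.
    return min(1.0, post_prob + 0.1 * reply_hits)

# Posts scoring above an (assumed) threshold would be queued for human review,
# as described in the next paragraphs.
score = risk_score("Everything hurts and I don't see the point",
                   ["Are you OK? I'm worried about you"])
print(f"risk score: {score:.2f}" + ("  -> send to human review" if score > 0.5 else ""))
```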

Once a post has been identified, it is sent for rapid review to the network’s community operations team.

“We know that speed is critical when things are urgent,” Facebook product manager Vanessa Callison-Burch told the BBC.

The director of the US National Suicide Prevention Lifeline praised the effort, but said he hoped Facebook would eventually do more than give advice, by also contacting those that could help….

The latest effort to help Facebook Live users follows the death of a 14-year-old girl in Miami, who livestreamed her suicide on the platform in January.

However, the company said it had already begun work on its new tools before the tragedy.

The goal is to help at-risk users while they are broadcasting, rather than wait until their completed video has been reviewed some time later….(More)”.