Facebook reveals new plans to provide community help in real-time

Facebook is one of the most widely used social media platforms. People connect with their near and dear ones through Facebook and share the pictures, videos, and moments they cherish on the giant social media platform. According to Guy Rosen, VP of Product Management at Facebook, because so many families and friends are connected through the service, Facebook is working to help people in distress connect with those who can support them, with the goal of ensuring a safe community both on and off Facebook. Initially, it is focusing on people who express thoughts of suicide, or hint at them, on the platform.

Rosen said that Facebook is using pattern recognition to detect posts or live videos in which someone might be expressing thoughts of suicide, and to help respond to those reports faster. To improve how it identifies appropriate first responders, the company is also dedicating more reviewers from its Community Operations team to reports of suicide or self-harm.

Speaking about Facebook’s progress in suicide prevention, Rosen said, “Over the last month, we’ve worked with first responders on over 100 wellness checks based on reports we received via our proactive detection efforts. This is in addition to reports we received from people in the Facebook community.” He added that Facebook is also using pattern recognition to help accelerate the most concerning reports, and that these accelerated reports, which require immediate attention, are escalated to local authorities twice as quickly as other reports. The company is committed to investing more in pattern recognition technology so that it can serve the community better.

Rosen said that Facebook wants to expand its use of proactive detection and, to that end, has started rolling out artificial intelligence outside the US to help identify when someone might be expressing thoughts of suicide, including on Facebook Live. “This approach uses pattern recognition technology to help identify posts and live streams as likely to be expressing thoughts of suicide. We continue to work on this technology to increase accuracy and avoid false positives before our team reviews,” Rosen said.
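Facebook has not published the details of its models, but the idea of scoring a post for likely risk can be pictured with a deliberately crude sketch: a keyword-based scorer standing in for a trained classifier. The phrase list, function names, and threshold below are all invented for illustration and have nothing to do with Facebook’s actual system.

```python
# Toy illustration of scoring text against known risk phrases.
# NOT Facebook's system: real proactive detection uses trained
# machine-learning classifiers, not a hand-written phrase list.

RISK_PHRASES = ["want to die", "end it all", "no reason to live"]  # invented examples

def risk_score(post: str) -> float:
    """Return a 0..1 score: the fraction of known risk phrases found in the post."""
    text = post.lower()
    hits = sum(1 for phrase in RISK_PHRASES if phrase in text)
    return hits / len(RISK_PHRASES)

def is_flagged(post: str, threshold: float = 0.3) -> bool:
    """Flag a post for human review when its score crosses the threshold."""
    return risk_score(post) >= threshold
```

In a real system the score would come from a model trained on labeled examples, and, as Rosen notes, flagged items go to trained human reviewers rather than triggering any automatic action.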

According to Rosen, Facebook has a Community Operations team of thousands of people around the world who review reports about content on Facebook. Many of them are specially trained to identify and handle possible suicide and self-harm cases. The company also uses AI to prioritize the order in which the team reviews reported posts, videos, and live streams according to urgency, so that first responders can be alerted more quickly.
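The prioritization step described above can be pictured as a priority queue: each report carries an urgency score, and reviewers always take the most urgent pending item next. This is a generic sketch of the concept, not Facebook’s implementation; the report labels and urgency values are made up.

```python
import heapq

class ReviewQueue:
    """Generic urgency-first review queue (illustrative only)."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker so equal urgencies stay first-in, first-out

    def add(self, report: str, urgency: float) -> None:
        # heapq is a min-heap, so negate urgency to pop the highest first
        heapq.heappush(self._heap, (-urgency, self._counter, report))
        self._counter += 1

    def next_report(self) -> str:
        """Return the most urgent pending report."""
        return heapq.heappop(self._heap)[2]

queue = ReviewQueue()
queue.add("possible spam", urgency=0.2)
queue.add("possible self-harm live stream", urgency=0.9)
queue.add("reported photo", urgency=0.5)
```

With this ordering, the self-harm report is reviewed before the other two regardless of when it arrived, which is the effect the article attributes to Facebook’s urgency ranking.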

Rosen said that anyone who is concerned that a person may be having thoughts of suicide can contact the Facebook team directly or report the post to them. The team works around the world 24/7, continuously reviewing incoming reports and prioritizing the most serious ones. Rosen noted that Facebook has been working on suicide prevention tools for more than 10 years and that its approach was developed in collaboration with mental health organizations such as Save.org, the National Suicide Prevention Lifeline, and Forefront Suicide Prevention, and with input from people who have personal experience with thinking about or attempting suicide.

Source: FB Newsroom

Senior writer & Rumors Analyst, James is a postgraduate in biotechnology and has an immense interest in following technology developments. Quiet by nature, he is an avid Lacrosse player. He is responsible for managing the office staff writers and providing them with the latest updates on happenings in the world of technology. You can contact him at james@pc-tablet.com.
