Facebook founder Mark Zuckerberg said his company currently doesn’t have the capacity to review all content uploaded to the site, even when the content is a live-streamed suicide. But artificial intelligence could help change that.

In a roughly 6,000-word letter on the future of the company, which he posted to Facebook, Zuckerberg wrote that Facebook needs more resources to police all of the content uploaded by more than one billion users.

“There are billions of posts, comments and messages across our services each day, and since it’s impossible to review all of them, we review content once it is reported to us. There have been terribly tragic events — like suicides, some live streamed — that perhaps could have been prevented if someone had realized what was happening and reported them sooner,” Zuckerberg wrote. “There are cases of bullying and harassment every day, that our team must be alerted to before we can help out. These stories show we must find a way to do more.”

But Zuckerberg said his site could actually help prevent suicide in the future.

“To prevent harm, we can build social infrastructure to help our community identify problems before they happen,” Zuckerberg wrote. “When someone is thinking of committing suicide or hurting themselves, we’ve built infrastructure to give their friends and community tools that could save their life.”

He wants the site to use artificial intelligence to detect when someone could be in danger.

“Some of these cases around saving people’s lives are probably going to be the first ones that you want to use it,” Zuckerberg told BuzzFeed of such technology.

“We will not have this tech ready for a while, but once you have the ability to understand what content is about, you have the ability to unlock that,” Zuckerberg told Recode.
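Zuckerberg did not describe how such detection would work. As a rough illustration of the general approach, a system could score each post for risk signals and route high-scoring posts to human reviewers. The sketch below is hypothetical: the phrases, weights, and threshold are invented for demonstration, and a real system would use a trained machine-learning model rather than a keyword list.

```python
# Hypothetical sketch of AI-assisted content flagging, NOT Facebook's actual
# system. A production system would use a trained language model; this uses
# a simple keyword heuristic to show the overall shape: score each post,
# then route high-risk posts to human reviewers.

from dataclasses import dataclass

# Assumed example phrases and weights; a real model would learn these from data.
RISK_PHRASES = {
    "want to die": 3,
    "kill myself": 3,
    "end it all": 2,
    "no reason to live": 2,
    "goodbye everyone": 1,
}

REVIEW_THRESHOLD = 3  # arbitrary cutoff chosen for this sketch


@dataclass
class Post:
    post_id: str
    text: str


def risk_score(post: Post) -> int:
    """Sum the weights of any risk phrases found in the post text."""
    text = post.text.lower()
    return sum(weight for phrase, weight in RISK_PHRASES.items() if phrase in text)


def flag_for_review(posts: list[Post]) -> list[Post]:
    """Return posts whose score meets the threshold, for human reviewers."""
    return [p for p in posts if risk_score(p) >= REVIEW_THRESHOLD]


if __name__ == "__main__":
    sample = [
        Post("1", "Great game last night!"),
        Post("2", "Goodbye everyone. I want to die, there's no reason to live."),
    ]
    for post in flag_for_review(sample):
        print(f"Post {post.post_id} flagged (score={risk_score(post)})")
```

In practice, the flagging step would feed into the kind of human response Zuckerberg describes: alerting the company's team and giving the person's friends and community tools to intervene.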

Suicide is most common among people aged 45 to 64, with 19.6 deaths per 100,000 people reported in 2015.

One of the earliest live-streamed suicides took place in 2008, and there have been increasing reports of people using Facebook to make their last moments public. On Dec. 30, a 12-year-old Georgia girl killed herself. She had said she was sexually abused, and she had previously uploaded videos about her struggle with depression. Facebook initially declined to remove her suicide video, which lasted 40 minutes, saying it didn't violate any of the site's policies. Two weeks later, amid a police request and public outrage, Facebook took it down.

A Florida 14-year-old live-streamed for nearly two hours as she prepared to hang herself. Thousands watched as she died.

Facebook has previously recommended that people who encounter a potential suicide live-stream on the site contact law enforcement or emergency services so someone can respond in real life.
