Posting a smartphone video online has never been easier, even if the video shows a murder. After two recent cases that shocked the world, this has become a tricky but urgent problem for Facebook to tackle.
On Monday, a 20-year-old Thai man broadcast live video on the world's most popular social media platform, showing him killing his baby daughter before his own suicide.
The previous week, a US man dubbed the "Facebook Killer" fatally shot himself after a frantic three-day nationwide manhunt.
Steve Stephens, 37, had posted a video on Easter Sunday, saying he intended to kill. He followed up two minutes later with video showing the fatal shooting of an elderly grandfather whom he had seemingly chosen at random in Cleveland, Ohio.
In a third video 11 minutes later, streamed live from Stephens' car, he said he intended to kill others.
Facebook denounced the "horrific crime," saying it had "no place" on its platform. But the company did not take down the video of the shooting until more than two hours after it was posted. The Thai murder video remained on Facebook for close to 24 hours.
Critics say the social media giant has been too slow to react, and question whether Facebook Live, a strategic area of development for the company, should be disabled.
After the Cleveland killing, Facebook chief Mark Zuckerberg pledged to "keep doing all we can to prevent tragedies like this from happening."
But he conceded: "There is a lot of work to do here."
Facebook's video functions have also been used to broadcast rapes. This week, three men in Sweden were sentenced to prison for gang-raping a woman and live-streaming the attack.
Other shocking content includes the suicides of young people who broadcast their deaths on Facebook Live or competing applications like Twitter's Periscope and Live.me.
Any online platform that lets users freely publish content will face these problems, but Facebook is particularly vulnerable given its huge user base, said Lou Kerner, a partner at Flight VC and a social media specialist. Facebook had 1.86 billion users as of the end of December.
There are no "easy answers," he said. "They're going to struggle to stop it from occurring. The question is how fast they can take it down."
Most social networks ban violent and shocking content, but given the volume of postings, they mostly rely on users to identify and report it.
Facebook said it has "thousands" of people combing through the "millions" of items posted weekly in more than 40 languages, adding that it was trying to speed up the process.
However, it's unlikely the network will impose a delay of a few seconds to verify content before broadcast, the way some television channels do for live events, said Roger Kay, an analyst with Endpoint Technologies Associates.
"When you have more than a billion people connected to each other," you are "well beyond the scope of what a human can do," he said.
Artificial intelligence technology is improving but still not good enough and would create too many "false positives."
"I don't know if there's a real human or technical solution. You can punish the breach, but some people don't care," Kay said, giving as examples the Islamic State jihadist group which has been posting violent and inflammatory content online for years, or those who just want their 15 minutes of fame.
Kerner said even if Facebook is not legally responsible for the actions of users, it has a "social responsibility to address it in an appropriate manner."
But Kay pointed out that any restrictions on content require "moral judgment" on what should be permitted.
"I'm pretty sure Facebook doesn't want that role," he said.
The social media network has a difficult balancing act, risking criticism if seen as too lax toward content, but also facing heat for being too restrictive.
It has been accused of censorship in the past, such as last year when it blocked the iconic 1972 image of a Vietnamese girl fleeing a napalm attack, because she was naked. Facebook later reinstated the photo, citing its historical importance.
Facebook also found itself embroiled in controversy last summer when it blocked a video showing the death of a black American man who had been shot by police, which was broadcast live by his girlfriend.
Zuckerberg later reversed the decision, saying the video, while shocking, shines "a light on the fear that millions of members of our community live with every day."