
When online stunts turn deadly, should video sites be held responsible?

A Chinese court ruled a streaming platform is partially responsible for a daredevil’s death as the US debates whether internet companies should be liable for user-generated content

This article originally appeared on ABACUS

Before November 8, 2017, Wu Yongning was one of the most-watched streamers in China. The daredevil racked up millions of views online for his hair-raising selfie videos showing him scaling skyscrapers with no safety gear.

Last month, a Beijing court awarded his family US$4,300 in compensation, ruling that the owner of Huajiao, the platform that live-streamed Wu’s final stunt, was partially responsible for his fatal fall from the top of a 62-story building two years ago. He was 26.
Wu Yongning fell to his death from a skyscraper in central China in 2017. (Picture: Guancha.cn)
Dangerous stunts like rooftopping, or climbing structures like buildings or cranes, have proven popular on streaming platforms even though they are often illegal. In Wu’s case, the court ruled that Huajiao bore some responsibility for the rooftopper’s death, partly because it neither stopped him from uploading the video nor gave any safety warnings.

But elsewhere in the world, platforms could be protected against legal liability.

In the US, platforms like YouTube or TikTok generally aren’t held liable for user-generated content under Section 230 of the Communications Decency Act. Instead, most major platforms have set up self-enforcing rules forbidding users from posting dangerous videos.


TikTok’s community guidelines ban content that “depicts dangerous acts” or “encourages other people to engage in such activities.” The app also adds warnings to certain videos when the activity portrayed is deemed dangerous to imitate. YouTube’s policy explicitly bans “extremely dangerous challenges” and “dangerous or threatening pranks.” The platform also supports age restrictions on some videos.

Still, companies say it is difficult to catch every offending video, given the sheer amount of content created every day.

And Wu’s death is far from the only social media stunt to have gone terribly wrong. Around the world, people continue to engage in reckless pranks in exchange for online fame.

Last year, an American YouTuber died after a bullet penetrated an encyclopedia he was holding in front of his chest. This year, an Indian teen died after he hit a moving train when trying to run parallel to it while shooting a TikTok video.


Dangerous stunts can attract copycats, sometimes with deadly consequences.

In August, two girls in China, who were said to be imitating the popular social media influencer Ms Yeah, suffered serious burns when two heated tin cans filled with alcohol exploded. One of the girls, 14-year-old Zhezhe, later died from her injuries.
Ms Yeah, who has some seven million YouTube subscribers, apologized and promised to delete all potentially dangerous videos. She denied the girls were copying her but reportedly offered compensation to Zhezhe’s family.

Online accounts of Ms Yeah, a celebrity chef famous for whipping up meals using office tools, still contain videos of her cooking with alcohol and open flames in what looks like an indoor office. These videos don’t appear to violate YouTube’s rules. Our inquiry to an email address listed on Ms Yeah’s YouTube account went unanswered.

Two girls were badly burnt after mimicking a cooking trick they reportedly saw online earlier this year. (Picture: Weibo)
Lawmakers in Washington are now debating changes to Section 230 that would limit immunity for internet companies. Some critics of the law argue that companies have been using it to shirk responsibility for questionable content. But supporters say the law allows platforms to moderate content without risking liability or engaging in excessive censorship.
“When you think about what the moderators are dealing with at this rapid-fire pace, and then to expect all of their decisions to satisfy everyone, that’s not going to happen,” US Naval Academy professor Jeff Kosseff told The Verge.

Some also wonder whether laws alone can prevent future tragedies. In a commentary this week, China’s state broadcaster questioned whether the recent court ruling on Wu was enough to deter “bad online platforms.”

“Although the court established the responsibility of online platforms, a compensation of 30,000 yuan translates to a very small legal liability,” wrote a CCTV commentator. “Online platforms should recognize that in addition to legal responsibilities, they also shoulder moral and social responsibilities.”

