“The problem is that literally anyone can watch these videos — kids, adults, whatever,” she says. Matt first saw a fractal wood burning video shared by a friend on Facebook and was so intrigued that he “started watching YouTube videos — and they are endless.”
Matt was electrocuted when part of the sheathing on the cables he was using tore off and his palm touched the bare metal. “I sincerely believe that if my husband were fully aware [of the dangers], he wouldn’t have done it,” says Schmidt. Her message is simple: “When you’re dealing with something that could kill someone, there should always be a warning… YouTube needs to do better, and I know they can, because they censor all types of people.”
After Matt’s death, medical officials from the University of Wisconsin wrote a paper titled “Shocked, although the heart and YouTube are to blame.” Referring to Matt’s death and four fractal wood burning injuries they had personally treated, they called for the “insertion of a warning label before users can access video content” depicting the crafting technique. “While it is impossible or even undesirable to label every video depicting a potentially dangerous activity,” they wrote, “it seems practical to apply a warning label to videos that, if imitated, could lead to instant death.”
Matt and Caitlin Schmidt had been best friends since they were 12. He leaves behind three children. Schmidt says her family has endured “pain, loss and devastation” and will grieve for life. “Now we are a cautionary tale,” she says, “and I wish with everything in my life that we were not.”
YouTube told MIT Technology Review that its community guidelines forbid content intended to encourage dangerous actions or that carries an inherent risk of physical harm. Warnings and age restrictions apply to graphic videos, and a combination of technology and human staff ensures that company guidelines are followed. Dangerous videos banned by YouTube include challenges that pose an imminent risk of injury, pranks that cause emotional distress, drug use, glorification of violent tragedies, and instructions on how to kill or harm. However, videos may depict dangerous activities if they contain sufficient educational, documentary, scientific, or artistic context.
YouTube removed “several” fractal wood burning videos and imposed age restrictions on others after being approached by MIT Technology Review. But the company didn’t say why it moderates pranks and challenges but not hacks.
This would certainly be tricky to do — each 5-Minute Crafts video contains a plethora of crafts, one after the other, many of which are merely quirky rather than harmful. And the ambiguity of hack videos — an ambiguity not found in challenge videos — can be difficult for human moderators to assess, let alone AI. In September 2020, YouTube reinstated human moderators who had been taken “offline” during the pandemic, after determining that its AI had been overzealous, doubling the number of incorrect deletions between April and June.