Many of the sites tried to delete copies of the video as they were uploaded but were overwhelmed. Facebook said it deleted 1.5 million copies of the video within 24 hours of the incident, although many evaded detection. The Reddit post of the video was viewed more than a million times before it was removed. Google said the video spread faster than after any tragedy it had seen before, according to the New Zealand government.
Over the next few days, some people began discussing ways to evade the platforms' automated systems and keep the Christchurch video online. On Telegram on March 16, 2019, members of a white supremacist group sought ways to manipulate the video so it would not be deleted, according to discussions viewed by The Times.
“Just change it to evade detection,” one user wrote. “Speed it up 2 times and [expletive] I can’t find it.”
Within days, clips of the video were posted on 4chan, the longtime online bulletin board. In July 2019, a 24-second clip of the killings also appeared on Rumble, according to a review by The Times.
In the following months, the New Zealand government identified more than 800 versions of the original video. Officials asked Facebook, Twitter, Reddit and other sites to devote more resources to removing them, according to a government report.
New copies of or links to the video were uploaded whenever the Christchurch shooting appeared in the news or on an anniversary of the event. In March 2020, about a year after the shooting, nearly a dozen tweets with links to versions of the video appeared on Twitter. More surfaced when Mr. Tarrant was sentenced to life in prison in August 2020.
Other groups stepped in to pressure technology companies to erase the video. Tech Against Terrorism, a United Nations-supported initiative that develops technology to detect extremist content, sent 59 alerts about Christchurch content to technology companies and file-hosting services from December 2020 to November 2021, said Adam Hadley, the group’s founder and chief executive. That amounted to about 51 percent of the far-right terrorist content the group tried to get removed from the internet, he said.