Russia's main media conglomerate recently launched a domestic rival to the hugely popular video-sharing app TikTok. Russian media described it as part of a campaign by Russia to reduce the influence of foreign websites and technology platforms. Developed with support from the Innopraktika Foundation, an organization led by Katerina Tikhonova, one of President Vladimir Putin's alleged daughters, the Yappy app began beta testing in September, with early access granted to 300 up-and-coming bloggers. The service offers a number of features similar to TikTok's and is built around sharing short vertical video clips of up to 60 seconds in length.
What prompted Russia to have a homemade TikTok alternative?
Russia grew angry with the Chinese-owned TikTok app over messages targeting children that it deemed objectionable. The messages are said to have encouraged children to participate in unauthorized street protests in support of jailed Kremlin critic Alexei Navalny.
TikTok has faced criticism alongside US tech giants for refusing to remove content at the Russian government's request.
A Russian court even fined TikTok 2.5 million rubles ($34,000) for failing to remove illegal content inciting minors to participate in unauthorized demonstrations in Moscow.
TikTok – Russia’s Most Popular Video-Sharing App
Without a doubt, the Chinese video-sharing app TikTok has been the most downloaded phone app in Russia and elsewhere in the world. Now Russia's Gazprom-Media has jumped into the business to cash in on TikTok's popularity.
At the end of 2020, Russia made public its decision to create a national alternative to TikTok.
Russia has since named TikTok as one of 13 international social media platforms that must open an office on Russian soil by the end of 2021 – the latest in a series of laws that critics say are designed to curb the dominance of foreign technology companies and social media platforms in Russia. Whether in Russia or Australia, TikTok has come under attack for having detrimental effects on health, and on children in particular.
Controversy around the TikTok algorithm
TikTok’s powerful algorithm is unlike anything the world has seen before. TikTok has been accused of sharing data with the Chinese government. A joint investigation by Triple j's Hack and the Australian Broadcasting Corporation's Four Corners found that the TikTok algorithm exposes Australians to dangerous content while controlling which people and political movements grab users' attention. The investigation made some startling revelations. For example, TikTok says its mission is "to inspire creativity and bring joy," but the app risks distorting how much of a generation sees the world, and not always for the best. Upon registration, TikTok begins collecting data on the user's location, gender, and age and, more controversially, their facial data.
The more a user likes videos, follows accounts, or watches TikTok videos to the end, the more the algorithm learns about the user's interests.
It is very difficult to break that cycle, and it is by design that the user never really reaches the end of the content. The longer a user keeps scrolling, the more ads that user can be shown. That is what catapulted TikTok's Chinese parent company, ByteDance, to a valuation of more than $250 billion.
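The engagement loop described above can be illustrated with a toy sketch. This is not TikTok's actual system – its algorithm is proprietary – but a minimal, hypothetical engagement-weighted recommender: signals such as likes, follows, and completed views (with assumed weights) build an interest profile that is then used to rank candidate videos.

```python
# Toy sketch (NOT TikTok's real algorithm): an engagement-weighted
# recommender. All weights, topics, and video names are hypothetical.
from collections import Counter

# Assumed signal weights: finishing a video suggests stronger interest
# than a like, which suggests stronger interest than following alone.
WEIGHTS = {"liked": 1.0, "followed_creator": 2.0, "watched_to_end": 3.0}

def update_profile(profile: Counter, video_topics: list[str], signal: str) -> None:
    """Accumulate interest in each topic of a video the user engaged with."""
    for topic in video_topics:
        profile[topic] += WEIGHTS.get(signal, 0.0)

def rank(profile: Counter, candidates: dict[str, list[str]]) -> list[str]:
    """Order candidate videos by how well their topics match the profile."""
    score = lambda topics: sum(profile[t] for t in topics)
    return sorted(candidates, key=lambda v: score(candidates[v]), reverse=True)

profile = Counter()
update_profile(profile, ["dance", "music"], "watched_to_end")
update_profile(profile, ["cooking"], "liked")
feed = rank(profile, {"v1": ["dance"], "v2": ["cooking"], "v3": ["news"]})
# "v1" (dance, score 3.0) outranks "v2" (cooking, 1.0) and "v3" (no match, 0.0)
```

Even this crude version shows the self-reinforcing dynamic the investigation describes: whatever the user engages with most is what the feed serves next, so the cycle feeds itself.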
Researchers have alleged that TikTok promotes eating disorders. They say many factors contribute to eating disorders, but that TikTok's algorithm finds vulnerable people and then plays on that vulnerability. Dr Suku Sukunesan of Swinburne University, who has advised TikTok on how to make the app safer, embedded himself in the app's eating-disorder communities.
"I was immediately given all of this eating disorder content. After a few hours, TikTok suggested 30 different accounts to follow, and they were all people living with eating disorder issues," he said. According to Dr Sukunesan, these TikToks effectively teach people how to have an eating disorder, and the algorithm can lead to more serious videos, such as those that encourage self-harm. "It's almost like an endless chasm, and you find that these kids would end up hurting themselves more," he said.
TikTok's company policies maintain that it prohibits "content that describes, promotes, normalizes, or glorifies activities that could lead to suicide, self-harm or eating disorders."
One user tried to report videos promoting eating disorders, only to be told that they did not violate any of TikTok's guidelines. TikTok's response to the problem has been to ban pro-eating-disorder hashtags so that users cannot search for those videos. If they try, a number for The Butterfly Foundation's eating disorder support service appears instead. "Our teams consult with NGOs and other partners to constantly update the list of keywords we work on," a TikTok spokesperson said.
Another TikTok user told Hack and Four Corners that when she reported a viral video of a man taking his own life, it too was found not to violate the app's community guidelines. According to several researchers, it takes less than 30 seconds to find harmful content on TikTok, and only a few hours for the algorithm to dominate someone's feed with offensive videos. The tech advocacy organization Reset Australia ran experiments and found that it takes about four hours for the algorithm to learn that a 13-year-old is interested in racist content, and about seven hours for sexist videos to overwhelm someone's feed. The longer these users watch this type of content, the more frequently it appears.
While TikTok has come under pressure to root out harmful videos, it has also been accused of using the algorithm to censor and remove posts for the wrong reasons. In July, several Black influencers went on an indefinite strike, refusing to choreograph the viral dances TikTok relies on and accusing the app of capitalizing on their creativity without promoting them in the algorithm. In March 2020, leaked TikTok policy documents showed that moderators had been instructed to remove posts from creators considered "ugly, poor, or disabled."
Last year, TikTok apologized for removing posts with the hashtags "Black Lives Matter" and "George Floyd" after thousands of creators took to the platform to protest the removal of their videos or the banning of their accounts. The Australian Strategic Policy Institute (ASPI) conducted the first academic investigation into censorship on TikTok and found that the company was actively using its algorithm to hide political speech it deemed controversial.
The study – which was funded by the US State Department – found that hashtags about the mass detention of Uyghurs, the Hong Kong protests, LGBTQI issues, and criticism of the Russian government were among those suppressed. In a statement, TikTok denied that the company engages in censorship.