TikTok adds Content Levels – new settings that should protect minors from harmful videos

TikTok recently announced a setting that lets users hosting livestream sessions restrict their audience to viewers aged 18 and over. Recognizing that it cannot rely solely on the cooperation of content creators, the platform's administrators are preparing Content Levels, a more comprehensive way to differentiate videos based on their intended audience.

Tacitly acknowledging that pornographic posts are just the “tip of the iceberg” in an ocean of dangers, TikTok will also filter videos for particular audiences based on other signals, such as the hashtags used. If a video carries a hashtag that TikTok’s administrators have assigned to a topic unsuitable for minors, the video will automatically be hidden from users outside the appropriate age group.
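
To make the mechanism concrete, here is a minimal sketch of how hashtag-based filtering of this kind could work in principle. The topic list, the Video class and the visible_to function are hypothetical illustrations, not TikTok’s actual code or API.

```python
# Illustrative sketch only: a hypothetical hashtag-to-topic filter,
# not TikTok's actual implementation.
from dataclasses import dataclass, field

# Hypothetical mapping of hashtags to topics deemed unsuitable for minors.
RESTRICTED_TOPICS = {"#gambling": "gambling", "#horror": "graphic content"}

@dataclass
class Video:
    title: str
    hashtags: set[str] = field(default_factory=set)

def visible_to(video: Video, viewer_age: int) -> bool:
    """Hide a video from under-18 viewers if any of its hashtags maps to a restricted topic."""
    if viewer_age >= 18:
        return True
    return not any(tag.lower() in RESTRICTED_TOPICS for tag in video.hashtags)

# Usage: a clip tagged #gambling is hidden from a 16-year-old but shown to an adult.
clip = Video(title="Casino night", hashtags={"#gambling"})
print(visible_to(clip, 16))  # False
print(visible_to(clip, 30))  # True
```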

TikTok also recently began asking users to state their real age when signing in to a newly installed app on a smartphone or tablet; the answer determines which videos on the platform they can access. While the measure is welcome, it is worth noting that TikTok does not actually verify the accuracy of the data entered. In other words, underage users who declare an age other than their real one are not checked any further. We can only assume that TikTok will eventually introduce an additional system for estimating real age, such as AI analysis of the videos shared on an account, to determine whether the person appearing in them is an adult. For now, the whole system seems to rely on the honesty of users, who have every incentive to “get around” the newly raised hurdles.
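
To illustrate how little stands behind such a declaration, the sketch below shows an age gate that simply trusts whatever birth date the user enters. The function names and the 18-year threshold are assumptions for the example, not TikTok’s actual implementation.

```python
# Illustrative sketch only: an age gate based purely on a self-declared birth date,
# with no verification step; not TikTok's actual implementation.
from datetime import date

def declared_age(birth_date: date) -> int:
    """Compute age from whatever birth date the user typed in; nothing checks that it is true."""
    today = date.today()
    return today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )

def can_view_adult_content(birth_date: date) -> bool:
    # The gate simply trusts the birth date entered at sign-up.
    return declared_age(birth_date) >= 18

# A minor who enters an earlier birth year passes the gate unchallenged.
print(can_view_adult_content(date(2010, 6, 1)))  # False: honest entry
print(can_view_adult_content(date(2000, 6, 1)))  # True: false entry, accepted unchecked
```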
