Facebook Starts Issuing Alerts For Searches With The Term ‘Loli’, But Not ‘Shota’

According to the social media giant, the alert is a part of two new tools that they are testing.

Facebook has recently started alerting users who search the term ‘loli’ or any phrase that includes the word ‘loli’ on the platform.

The alert warns people that the term/phrase that they are searching for is sometimes associated with child abuse or sexual exploitation of a minor.

However, during our tests, we found that no such alerts were issued when a user searched for phrases that include the term ‘shota’.

For the uninitiated: the term loli may be used to refer to a sexualized young girl, while the term shota may be used to refer to a sexualized young boy.

First, we searched for the terms ‘lolis’, ‘loli’ and ‘loli girls’ on Facebook, and each time the platform issued an alert with the following options.

If users are concerned about having sexual thoughts about children, they can choose to seek help; if they are instead concerned about the content that appears in such searches, they can choose to report it.

[Image: searching for ‘loli’ on Facebook]

If a user chooses to get help, Facebook takes them to a page listing organizations and resources, both global and region-specific, for countries including Canada, Spain, India, Belgium and the Netherlands.

“If you’re concerned by your own sexual thoughts about children and young people, or by someone else’s behaviour (such as a friend or family member), there are organisations that can provide support for people who need help coping with these feelings and urges,” the message on the page read.

[Image: the ‘get help’ page shown after a ‘loli’ search on Facebook]

On the other hand, if a user ignores the warning and continues with their search, Facebook issues a second alert stating that engaging with such content may result in removal from Facebook.

“Viewing sexual images or videos of children can lead to imprisonment and other severe personal consequences. People who engage in this type of content may be removed from Facebook and reported under the law,” the alert read.

Next, we searched for the terms ‘shota’, ‘shotas’ and ‘shota boys’. This time, not only did the search return some suggestive photos, but the content warning was also missing.

[Image: searching for ‘shota’ on Facebook, with no warning shown]

According to the social media giant, it has recently started testing two new tools: one aimed at potentially malicious searches for content related to child sexual abuse, and another aimed at the sharing of such content for reasons other than to harm a child.

“The first intervention is a pop-up that is shown to people who initiate searches on Facebook using terms associated with child exploitation. The pop-up offers offender diversion resources from child protection organisations and shares information about the consequences of viewing illegal content,” Facebook says on their online child protection page.

“The second is a safety alert designed for people who have shared viral memes of child exploitative content, informing them of the harm it causes the victim and warning them that it is against our policies and there are legal consequences for sharing this material.”

Facebook maintains that the platform takes a comprehensive approach to child safety, including zero-tolerance policies prohibiting child exploitation.

Source: Facebook