Meta is blocking ‘potentially sensitive’ topics from Threads search

It turns out that Threads’ recently introduced keyword-searching abilities won’t work for all topics. The app is currently blocking searches for a number of “potentially sensitive” words, including “vaccines,” “covid,” and other variations of words that have previously been linked to misinformation on Meta’s platform.

The limits, which were first reported by The Washington Post, are an apparent attempt to prevent controversial content from spreading on Meta’s newest app. The company has blocked a number of covid and vaccine-related terms, including “covid,” “coronavirus,” “covid-19,” “vaccines” and “covid vaccines,” as well as other terms associated with potentially unsavory content like “gore,” “nude,” and “sex.”

The company confirmed it was blocking searches in a statement to The Post, calling it a temporary measure. “The search functionality temporarily doesn’t provide results for keywords that may show potentially sensitive content,” a spokesperson said. Adam Mosseri, the head of Instagram who also oversees Threads, tweeted that the company was “trying to learn from last [sic] mistakes and believe it’s better to bias towards being careful as we roll out search.”

Meta’s history shows the company has good reasons to be cautious about search on Threads. Instagram search has been widely criticized as a vector for misinformation and for its ability to lead users down conspiratorial rabbit holes. The app’s search was particularly weaponized during the early days of the pandemic, when it promoted conspiracy-touting anti-vax accounts in its top results for simple queries like “vaccine” and “5g.”

At the same time, it’s telling that Meta is now opting to block all searches containing “potentially sensitive” keywords, even those that would surface posts that don’t contain rule-breaking content. It’s also a notably more aggressive approach than the social media company has taken in the past.

While Meta has previously limited search functionality on both Facebook and Instagram, the company has typically intervened when search terms were explicitly linked to rule-breaking content, like specific hashtags related to QAnon. In other cases, the company has worked to clean up search results for topics like vaccines, and pushed in-app PSAs directing users to official resources.

As The Washington Post points out, the result of the total block on covid-related search terms is that users are also prohibited from looking for information, resources and conversations that don’t break the platform’s rules, which could be a barrier to those seeking advice or credible information from experts.

Meta’s caution also underscores just how quickly the company rushed the development of Threads. The app was released just five months after a small group of Instagram engineers started working on the project. The quick turnaround meant that Threads launched with several basic features missing from the service. And while Meta has said Threads has the same safety policies as Instagram, it hasn’t disclosed many details about its plans to moderate content on the Twitter-like app, where posts look and feel very different.

This article originally appeared on Engadget.