All recommended content on Instagram will soon be filtered by its "sensitive content" rules.

Source: Getty Images

Last year, Instagram introduced a feature that allows users to exclude some types of "sensitive" content from the Explore tab. Instagram is now allowing users to turn off such material in all recommendations throughout the app.

Instagram isn't exactly forthcoming about what it counts as sensitive content. When it first introduced the sensitive content control last year, the company defined the category as "posts that don't necessarily contravene our standards but might possibly be distressing to certain people — such as posts that are sexually suggestive or violent."

The new restrictions apply to Search, Reels, hashtag pages, "accounts you might follow," and recommended posts in the feed. The company says the change will roll out to all Instagram users in the coming weeks.

Rather than letting users mute specific content themes, Instagram's controls offer only three options: one that shows less of this bucket of content, the standard setting, and one that shows more sensitive content. Users under the age of 18 cannot choose the last option.

On a Help Center page delving deeper into the content filters, the category is described as anything that "hinders our capacity to maintain a secure community." According to Instagram, this includes:

“Content that may depict violence, such as people fighting. (We remove graphically violent content.)

Content that may be sexually explicit or suggestive, such as pictures of people in see-through clothing. (We remove content that contains adult nudity or sexual activity.)

Content that promotes the use of certain regulated products, such as tobacco or vaping products, adult products and services, or pharmaceutical drugs. (We remove content that attempts to sell or trade most regulated goods.)

Content that may promote or depict cosmetic procedures.

Content that may be attempting to sell products or services based on health-related claims, such as promoting a supplement to help a person lose weight.”

"Some individuals don't want to see information regarding themes like drugs or weapons," Instagram says in the image accompanying its blog post. Instagram's lack of clarity about how it defines sensitive content, and its choice not to give users more granular filters, is concerning, particularly its decision to lump sex and violence together as "sensitive," as we observed when the option was introduced.
Instagram is infamous for its hostility toward sex workers, sex educators, and even sexually suggestive emoji. The update is likely bad news for accounts affected by Instagram's strict rules on sexual content, though those communities are already accustomed to going to great lengths to stay in the platform's good graces.
From our vantage point, it seems counterintuitive that a user who dislikes posts about weight-loss schemes and diet culture would also object to photos of people in see-through clothing, but Instagram is clearly painting in broad strokes here. The result is a toggle that nudges users to switch off an opaque blob of "adult" content rather than a meaningful way for people to avoid seeing things they don't want to see while browsing Instagram's algorithmic recommendations.
