Instagram ‘choco skin’ filter is form of brownface, says teacher and activist

An Instagram filter titled “choco skin” has provoked outrage, and been identified as brownface – a deeply offensive practice which originated when white performers wore theatrical make-up to represent a caricature of a non-white person.

Vaani Kaur, a 29-year-old teacher and activist from London, discovered the filter whilst using the popular social media app, and shared a screenshot on her profile of herself before and after using it to show its intended skin-darkening effect.

Writing in her Instagram caption, she said: “Having a filter to darken yourself as a ‘trend’ is not acceptable when racial discrimination is still a very real problem.

“Our skin isn’t an exotic look for your selfies and POC don’t have the privilege to be white passing when it might help us.

“You don’t get to wear us like an accessory and ignore all negative connotations to dark skin.

“We are in our skin for life, not for likes.”

A second person commented on the post: “This is sickening and indeed, racist. Let’s get this filter removed.”

Another added: “The name is what bothers me most about this. It’s trying to change skin colour, not the level of a tan from being in the sun.”

Ms Kaur reported the effect to Instagram and contacted the filter’s creator, who removed it but defended its creation, saying that darkness is a virtue.

She also raised concerns about other discriminatory filters on Instagram, including ones called “skin black”, “brown sugar” and “Indian girl”.

Instagram filters are created via Spark AR software, and can range in complexity from basic randomiser cards to more elaborate effects.

The social media company has banned filters that modify facial features – such as hyper-plumping lips or making eyes rounder – but anyone can create their own assets, by uploading them to Spark AR software alongside metadata, a video showcasing how the filter works and several pictures.


It then typically takes just a day for a new filter to be approved by Instagram.

Instagram has been owned by Facebook since 2012. Speaking to The Standard, a company spokesperson said: “We don’t allow effects that focus on harmful stereotypes, and we’ve removed those that broke our rules.

“When creators submit new AR effects, we use a combination of human review and automated systems to review them before they appear in the gallery.

“These systems aren’t perfect, which is why we’re constantly working to improve, and why we encourage our community to report any effects they think break our rules.”

In response to Ms Kaur’s complaint, Instagram has now removed the “skin black” filter, but explained it does not ban effects simply for adding cultural or religious makeup or clothing, although it does remove effects that focus on harmful stereotypes, including blackface.

Speaking to The Standard, Ms Kaur said: “I wanted to raise this issue as a wider conversation; it hasn’t been acknowledged, and there hasn’t been any kind of progress.

“For us, race is not something you can switch on and off.


“You can’t cherry-pick certain characteristics without acknowledging the negative connotations.”

From the mid-19th century actors performed in minstrel shows by darkening their skin to look stereotypically “black” in order to entertain white audiences, fostering negative representations of black and brown people.

This custom perpetuated the notion of whiteness as superior by ridiculing and dehumanising those of different races.

However, the practice became so widespread that even black actors ended up participating, as white audiences did not want to see them on stage without it.

White people utilised blackface to further justify the enslavement of, and state violence against, African Americans.

Alongside other forms of racism, the practice has endured, with Matt Lucas and David Walliams apologising for their use of blackface in Little Britain comedy sketches last month.
