Instagram’s crucial steps were long overdue

Madeline Palacz, The Independent

On Sunday, Instagram announced that it would be extending its pledge to remove all graphic images of self-harm from its platform to cover “fictional depictions of self-harm”. That includes drawings, cartoons and memes about suicide, as well as any other content “promoting” self-harm. It is a welcome move from a key player in the debate over online harms, but it represents a small drop in what is fast becoming an even murkier ocean.

The regulation of content that is harmful yet legal remains a concept devoid of clarity. Even the government seems perplexed by the issue: the plans set out in the Online Harms White Paper, published earlier this year, were widely criticised as lacking in definition. The question, it seems, is proving more complicated than the answer. In the meantime, and in the absence of any form of regulation, tech giants have taken action in response to public pressure, which has been growing since the tragic death of British teenager Molly Russell. By exposing the scale of the issue, her death has rightly lent the debate a new sense of urgency.

In February this year, Instagram pledged to remove graphic self-harm related content and not to show non-graphic self-harm related content in its search, hashtags and the explore tab. Instagram says that between April and June it removed 834,000 pieces of content from its site. Yet the lack of transparency over the specific nature of this content, who (or what) made the decision to remove it, and on what basis, should give us all cause for concern as Instagram moves into this new phase.

Memes, for example, which will now be the subject of Instagram’s scrutiny, are a relatively new phenomenon. They are widely considered to be an expression of a cultural idea or practice. They can be interpreted as being humorous, critical, divisive – even dangerous. The multi-faceted meanings attached to one small box on the internet will require careful consideration by Instagram if it wishes to be consistent in its decision-making.

By extending its pledge to remove harmful content to more creative forms of self-expression, Instagram is moving into grey territory. It will need to take care to avoid any unnecessary infringement of a person’s right to speak about their own personal experiences. How might Instagram resolve a case where a meme’s image shows non-graphic, self-harm-related content, but the text which accompanies it encourages survival? When non-graphic images of self-harm such as healed scars were blurred by Instagram, the hashtag #youcantcensormyskin ignited debate about why such content is considered harmful.

A balance must be struck between the rights of those individuals personally affected by self-harm or thoughts of suicide who wish to share their experiences, and the potential harm which sharing those stories might cause to others. Since February, algorithms have helped to remove a large quantity of graphic self-harm content from the site. Yet Instagram has so far offered no public explanation of why its algorithms deemed certain content to require censorship. This sets a dangerous precedent. It remains to be seen how an algorithm will deal with the nuances involved in more creative forms of self-expression, such as memes and drawings. Transparency and accountability around such decision-making will be vital to establishing and maintaining public trust in the platform.

Instagram’s next step is already overdue, and it will need to be a crucial one. While the removal of graphic self-harm-related content is certainly welcome, it does not tackle a central problem which has, as yet, gone largely unaddressed: the influence of Instagram’s algorithms.

Instagram’s popularity comes from its ability to give a user’s most “relevant posts” the most visibility. What shows up first in your feed is determined by what posts and accounts you engage with the most, as well as other contributing factors such as the timeliness of posts, how often you use Instagram, and how many people you follow. This is undoubtedly a valuable tool, which allows a user to cultivate their own personal space on the internet. However, according to her father, it was Instagram’s algorithm which allowed “similar content” to be “pushed” on Molly as she viewed images of self-harm.
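To make that mechanism concrete, here is a minimal, hypothetical sketch of how a feed-ranking score might combine signals like these; the weights, field names and decay formula below are illustrative assumptions, not Instagram’s actual algorithm.

```python
import math
from dataclasses import dataclass

@dataclass
class Post:
    caption: str
    author_engagement: float  # hypothetical signal: how often the user interacts with this author (0 to 1)
    topic_affinity: float     # hypothetical signal: similarity to content the user engaged with before (0 to 1)
    hours_old: float          # time since the post was published

def rank_score(post: Post) -> float:
    """Hypothetical relevance score: engagement history and topic affinity
    boost a post, while an exponential decay penalises older posts."""
    recency = math.exp(-post.hours_old / 24.0)  # falls to ~37% after one day
    return 0.5 * post.author_engagement + 0.3 * post.topic_affinity + 0.2 * recency

feed = [
    Post("close friend, posted this morning", 0.9, 0.4, 3.0),
    Post("stranger, but similar topics", 0.1, 0.9, 1.0),
    Post("close friend, posted last month", 0.9, 0.4, 720.0),
]
for post in sorted(feed, key=rank_score, reverse=True):
    print(f"{rank_score(post):.3f}  {post.caption}")
```

Even this toy model shows the feedback loop at the heart of the concern: because the topic-affinity term rewards similarity to whatever a user has already engaged with, the more someone views a category of content, the more of it the ranking surfaces.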

As users of the platform, we have little control over the alien concept of the algorithm. There is little available information on whether it is even possible for a user to substantially reset the algorithm once it becomes established on a page, should they wish to do so. It remains to be seen how Instagram will address this thorny issue. It will need to admit that its algorithm, its core appeal, might be just as harmful as the content which it is pledging to remove.

