Malaysian lawmakers on Wednesday called on authorities to investigate reports of a teenager who allegedly jumped to her death after asking her social media followers to vote on whether she should take her own life.
Police in Sarawak, in East Malaysia, said a 16-year-old girl, who was not named, had run a poll on the photo-sharing app Instagram with the question "Really Important, Help Me Choose D/L" hours before jumping off the roof of a building on May 13, media reported on Tuesday.
"According to a close friend of the victim, the 'D/L' meant 'Death/Life'," district police chief Aidil Bolhassan was quoted as saying by The Borneo Post newspaper.
The poll showed 69% of the girl's followers chose 'D', Aidil said.
"We are conducting a post-mortem to determine whether there were other factors in her death," he said, adding that the girl had a history of depression.
Instagram reviewed the teenager's account and found that the online poll, which ran over a 24-hour period, ended with 88% of votes for 'L', said Wong Ching Yee, Instagram's head of communications in the Asia-Pacific.
Aidil, however, said that the poll's numbers may have changed after news of the girl's death spread.
The case sparked concern among Malaysian lawmakers, who called for a wider probe.
Ramkarpal Singh, a lawyer and member of parliament, said that those who voted for the teenager to die could be guilty of abetting suicide.
"Would the girl still be alive today if the majority of netizens on her Instagram account discouraged her from taking her own life?" he said in a statement.
"Would she have heeded the advice of netizens to seek professional help had they done so?"
Youth and Sports Minister Syed Saddiq Syed Abdul Rahman also called for a probe, saying that rising suicide rates and mental health issues among young people needed to be taken seriously.
Under Malaysian law, anyone convicted of abetting the suicide of a minor could face the death penalty or up to 20 years' jail and a fine.
Instagram extended its sympathies to the teenager's family, and said the company had a responsibility to make its users feel safe and supported.
"As part of our own efforts, we urge everyone to use our reporting tools and to contact emergency services if they see any behaviour that puts people’s safety at risk," Wong said.
In February, Instagram banned graphic images and content related to self-harm from its platform, citing a need to keep vulnerable users safe.
The changes came following pressure from the parents of a British teenager, who believed that viewing Instagram accounts related to self-harm and depression contributed to their daughter's suicide in 2017.
The death of 14-year-old Molly Russell sparked a debate in Britain about regulating children's social media use.
Her parents did not directly blame Instagram for the loss of their daughter but they cited the easy access to disturbing content as a contributing factor, and urged the network to respond.
Instagram has never allowed posts that promote or encourage suicide or self-harm.
But as part of the clampdown, it removed references to non-graphic content related to people hurting themselves from its searches and recommendation features.
It also banned hashtags relating to self-harm.
The measures are meant to make such images harder to find for depressed teens who might have suicidal tendencies.
Agencies