Inquiry points to influence of social networks in girl’s suicide

London – A judicial inquiry that ended on Friday (30) in London directly blamed social networks for the suicide of 14-year-old Molly Russell, after a fight of more than five years by the girl’s father, Ian Russell, to prove that the content she saw influenced her decision.

Molly took her own life in 2017, becoming a symbol of the risks posed by social media to children and young people and a reference for those who advocate stricter regulation to punish platforms that allow harmful content to circulate.

Even Prince William, who follows the protocol of the monarchy and does not usually speak out on controversial topics, yesterday called for more online safety for children. On Twitter, he said:

“No parent should have to put up with what Ian Russell and his family went through. They were incredibly brave. Online safety for our children and youth needs to be a prerequisite, not an afterthought.”

The message was signed with ‘W’, used to flag a post written personally by William and not by aides. He has three young children and has supported Ian Russell’s cause in the past.

Social media contributed to suicide, says coroner

Molly was found dead in her bedroom in November 2017. Although apparently healthy and balanced, the girl followed topics such as suicide and self-harm on social media.

The father embarked on a courageous journey to challenge the power of digital platforms, creating a foundation named after his daughter to raise awareness of the risk of suicide among young people and the danger posed by content freely disseminated on social media to people of any age.

During the two weeks of the judicial inquiry, in which evidence is presented and witnesses testify, the family’s lawyers showed that of the 16,300 posts the girl saved, shared or liked on Instagram in the six months before her death, 2,100 were about depression, self-harm or suicide.

Coroner Andrew Walker, who led the inquiry, chose to record the cause of death as self-harm.

In the final report he wrote that the girl was “exposed to material that may have influenced her in a negative way and, in addition, what started out as a depression became a more serious depressive illness”.

Walker also said that the “particularly graphic” content she saw “romanticized acts of self-harm”, “normalized her condition” and focused on “limited and irrational vision without any counterbalance of normality”.

What do digital platforms say?

Representatives of Meta, which owns Instagram, and of Pinterest were among the witnesses at the inquiry, and they took different positions on the case of Molly Russell.

Pinterest COO Judson Hoffman said he “deeply regrets” the posts seen by Molly, stating that he would not show them to his children. And he apologized by admitting that the platform “wasn’t safe” when the young woman used it.

Meta went the other way. The company’s head of Health and Welfare, Elizabeth Lagone, flew from the US to participate in the inquiry, although Meta’s head of corporate affairs is former British politician Nick Clegg, who lives in the UK.

After spending an hour on Monday (25) viewing posts accessed by Molly about suicide, drugs, alcohol, depression and self-harm, the executive of Instagram’s holding company said she thought it was safe for people to express themselves, thereby justifying the posts the girl saw.

She deemed the posts “generally permissible” but acknowledged that two of the posts violated the platform’s policies. However, she defended the thesis that it is “important to give voice to people” with suicidal thoughts.

During the testimony, the head of the inquiry questioned Meta’s right to decide what material is safe for children to see. The executive said such decisions are made in collaboration with experts.

She also presented Instagram’s guidelines at the time of Molly’s death, which allowed content about suicide and self-harm in order to “facilitate coming together to support” other users, but not if it “encourages or promotes” the act.

Well prepared, Lagone was evasive when asked whether the content could have driven Molly to the extreme. And she avoided distinguishing between young people and adults, on the grounds that all Instagram users are over 13 years old.

The testimony illustrates the differing views on the subject, even across platforms. Meta treats adults and still-developing adolescents alike, without taking into account the natural insecurities of adolescence that can make it harder to discern between “coming together to support” and “encouraging” suicide.

The understanding of the inquiry was clear. Coroner Andrew Walker said online content seen by student Molly Russell was “not safe” and “should not be available for a child to see”.

Father of the girl who took her own life does not forgive social media

As he has been doing since he began his crusade against social media after his daughter’s suicide, Ian Russell harshly criticized Meta’s position:

“We heard a senior executive at the company describe this deadly stream of content that the platform’s algorithms sent to Molly as ‘safe’ and not in violation of platform policies.

“If this demented trail of life-sucking content were safe, my daughter Molly would probably still be alive and, instead of being a grieving family of four, there would be five of us looking forward to a life full of purpose and promise for the future of our lovely Molly.

“It’s time to change the toxic corporate culture at the heart of the world’s largest social media platform.”

The inquiry is not a conviction, but it has an effect on future court decisions and has had a major impact in the UK. Local media followed the case daily, with extensive coverage of each deposition.

In an interview with Sky News, Rachel de Souza, England’s children’s commissioner, said that social media giants should remove such harmful content from their platforms following the inquiry’s findings, calling it “despicable” that companies put “profits ahead of children’s safety”.

Although the legislation pending in Parliament to regulate social networks provides for fines and even imprisonment for executives, de Souza questioned the need to wait for punishments:

“Why can’t these companies take these things down sooner?”

She said that in twice-yearly meetings with platform leaders she often asks: “How many children do you have online? Are you taking this stuff down?”, but she regretted that they dodge the questions, and she believes they “don’t do enough”.

“They need to have a moral compass and work it out now, they can do it.”

Basically, it’s about online safety.

On the day the inquiry’s findings were announced, the Molly Rose Foundation released a statement highlighting the issue of online safety and the need for regulation:

“The inquiry has demonstrated very clearly the significant dangers that social media platforms like Instagram, Twitter and Pinterest present in the absence of any effective regulation.

[…] If the government and the technology platforms act on the issues raised in the inquiry, it will have a positive effect on the mental well-being of young people, which is the Molly Rose Foundation’s main goal.

Platforms must stop profiling children in order to serve them harmful content – even if they have the right to host that content for adult users, who can also be vulnerable.

Things need to change. Social media platforms have shown they are unwilling to act without legislative action, so it is vital that the Online Safety Bill be passed without delay and that an era of accountability begin. For social media, the Wild West era is over.”

Social media law stalled in Parliament

Social networks draw criticism for a variety of reasons. In the UK, the central issue has been the risk to children, aggravated by the case of Molly Russell.

The press is closely watching signals from Prime Minister Liz Truss’s new government about changes to the Online Harm Bill.

The bill was introduced in 2020 as a bold initiative to make the country the safest in the world for children on the internet, and it reached Parliament in 2021. But it has stalled.

The possible change to the original text is a loosening that has displeased groups calling for more rigor on child abuse and children’s access to harmful content: platforms would no longer be required to remove content that is risky but legal.

Those who want the change argue that giving platforms the responsibility for removing content grants them excessive power.

For the NGO Open Rights, the British bill in its current form “effectively outsources the policing of the internet, shifting it from the police, courts and Parliament to Silicon Valley”.

Those in favor say it is up to the platforms to remove content in the name of society’s greater interest, without depending on the constituted powers, as argued by Rachel de Souza and the Molly Rose Foundation, which said in its statement on the inquiry’s findings:

“It has been almost five years since Molly’s death, and we are still waiting for the promised government legislation. We cannot wait any longer; we should not hold out for perfection, as too many lives are at risk.

The regulatory framework can be refined in the coming months and years.”

The Labour Party, in opposition to the governing Conservatives, is also pressing for swift passage of the law. After the inquiry ended, the party’s shadow secretary for culture and digital media, Lucy Powell, argued that the law must be passed as quickly as possible.

Not everyone has the patience to wait. Two weeks ago Ofcom, the media regulator, notified TikTok of a data-protection violation for failing to guarantee the privacy of children using the platform, signaling that a £27 million fine is on the way.
