New Delhi, Dec 7, 2021 - A global coalition of researchers has called on Meta to be more transparent and serious about the mental health of child and adolescent users on Facebook, Instagram and WhatsApp, as debate rises over the harmful impact these platforms have on the minds of children.
In an open letter to Meta CEO Mark Zuckerberg, the group asked Meta executives to commit to gold-standard transparency on child and adolescent mental health research, to contribute to independent research on child and adolescent mental health around the globe, and to establish an independent oversight trust for child and adolescent mental health on Meta platforms.
“Recently, we have been following news reports about research within your companies on the mental health of child and adolescent users of Facebook, Instagram, and WhatsApp. Unfortunately, that research is happening behind closed doors and without independent oversight,” the coalition wrote in the letter.
“Therefore, we have only a fragmented picture of the studies your companies are conducting. We do not believe that the methodologies seen so far meet the high scientific standards required to responsibly investigate the mental health of children and adolescents,” the group lamented.
Facebook has been grilled recently after a leak exposed how Instagram’s own research had found the platform could harm children’s well-being.
Facebook’s global head of safety, Antigone Davis, testified in September about child protection in the US Senate.
The researchers said in the letter that although nothing in the leaks suggests that social media causes suicide, self-harm, or mental illness, “these are serious research topics”.
“This work, and the tools you are using, should not be developed without independent oversight. Sound science must come before firm conclusions are drawn or new tools are launched. You and your organisations have an ethical and moral obligation to align your internal research on children and adolescents with established standards for evidence in mental health science,” the group argued.
Meanwhile, Instagram has rolled out new tools to safeguard teenagers from harmful content, after whistleblower Frances Haugen testified before the US Congress that Instagram can have a negative effect on the mental health of teenagers.
According to the coalition, with three billion people using Meta platforms for socialising, leisure and business, it is highly plausible that these virtual environments have far-reaching effects on the mental health of younger users - in both positive and negative ways.
“The fact that you are conducting the research revealed in recent press reports makes it clear you agree that such effects are a real possibility. While we applaud these attempts to understand how your platforms may be impacting young people’s mental health, we believe that the methodologically questionable and secretive ways your teams are conducting this important work is misguided and, in its present state, doomed to fail,” the researchers elaborated.
“The time is right for a new global trust dedicated to promoting credible, independent, and rigorous oversight on the mental health implications of Meta. Expanding upon the Facebook Oversight Board model, in place of quasi-judicial rulings the trust would conduct independent scientific oversight,” the letter read. (Agency)