
Social media should have “duty of care” towards kids, UK MPs urge

Social media platforms are being urged to be far more transparent about how their services operate and to make “anonymised high-level data” available to researchers so the technology’s effects on users — and especially on children and teens — can be better understood.

The calls have been made in a report by the UK parliament’s Science and Technology Committee, which has been looking into the impacts of social media and screen use among children — to consider whether such tech is “healthy or harmful”.

“Social media companies must also be far more open and transparent regarding how they operate and particularly how they moderate, review and prioritise content,” it writes.

Concerns have been growing about children’s use of social media and mobile technology for some years now, with plenty of anecdotal evidence and also some studies linking tech use to developmental problems, as well as distressing stories connecting depression and even suicide to social media use.

The committee writes that its dive into the topic was hindered by “the limited quantity and quality of academic evidence available”. But it also asserts: “The absence of good academic evidence is not, in itself, evidence that social media and screens have no effect on young people.”

“We found that the majority of published research did not provide a clear indication of causation, but instead indicated a possible correlation between social media/screens and a particular health effect,” it continues. “There was even less focus in published research on exactly who was at risk and if some groups were potentially more vulnerable than others when using screens and social media.”

The UK government expressed its intention to legislate in this area, announcing a plan last May to “make social media safer” — promising new online safety laws to tackle concerns.

The committee writes that it’s therefore surprised the government has not commissioned “any new, substantive research to help inform its proposals”, and suggests it get on and do so “as a matter of urgency” — with a focus on identifying people at risk of experiencing harm online and on social media; the reasons for the risk factors; and the longer-term consequences of children’s exposure to the technology.

It further suggests the government should consider what legislation is required to improve researchers’ access to this type of data, given platforms have failed to provide enough access for researchers of their own accord.

The committee says it heard evidence of a variety of instances where social media could be “a force for good” but also received testimonies about some of the potential negative impacts of social media on the health and emotional wellbeing of children.

“These ranged from detrimental effects on sleep patterns and body image through to cyberbullying, grooming and ‘sexting’,” it notes. “Generally, social media was not the root cause of the risk but helped to facilitate it, while also providing the opportunity for a large degree of amplification. This was particularly apparent in the case of the abuse of children online, via social media.

“It is imperative that the government leads the way in ensuring that an effective partnership is in place, across civil society, technology companies, law enforcement agencies, the government and non-governmental organisations, aimed at ending child sexual exploitation (CSE) and abuse online.”

The committee suggests the government commission specific research to establish the scale and prevalence of online CSE — pushing it to set an “ambitious target” to halve reported online CSE in two years and “all but eliminate it in four”.

A duty of care

A further recommendation will likely send a shiver down tech giants’ spines, with the committee urging that a “duty of care” principle be enshrined in law to protect social media users under 18 years of age from harm on such platforms.

Such a duty would up the legal risk stakes considerably for user-generated content platforms which don’t bar children from accessing their services.

The committee suggests the government could achieve that by introducing a statutory code of practice for social media firms, via new primary legislation, to provide “consistency on content reporting practices and moderation mechanisms”.

It also recommends a requirement in law for social media companies to publish detailed Transparency Reports every six months.

It also calls for a 24-hour takedown law for illegal content: platforms would have to review reports of potentially illegal content, take a decision on whether to remove, block or flag the material, and relay that decision to the individual or organisation who reported it, all within 24 hours.
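To make the proposed obligation concrete, the review-and-relay process described above can be sketched as a simple compliance check. This is purely illustrative — the function and field names are hypothetical, not drawn from the committee’s report — but it captures the two requirements: a valid decision (remove, block or flag) and a response within 24 hours of the report.

```python
from datetime import datetime, timedelta

# Hypothetical model of the committee's proposed 24-hour review window:
# a platform must review a report of potentially illegal content, decide
# to remove, block or flag it, and relay that decision to the reporter.
TAKEDOWN_WINDOW = timedelta(hours=24)
VALID_DECISIONS = {"remove", "block", "flag"}


def within_takedown_window(reported_at: datetime, decided_at: datetime) -> bool:
    """True if the decision was taken within 24 hours of the report."""
    return timedelta(0) <= decided_at - reported_at <= TAKEDOWN_WINDOW


def review_report(decision: str, reported_at: datetime, decided_at: datetime) -> dict:
    """Record a moderation decision and whether it met the 24-hour deadline."""
    if decision not in VALID_DECISIONS:
        raise ValueError(f"unknown decision: {decision!r}")
    return {
        "decision": decision,
        "on_time": within_takedown_window(reported_at, decided_at),
    }
```

In practice a regulator would audit this kind of record at scale; the sketch only shows the shape of the rule, not how moderation itself would be done.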

Germany already legislated for such a law, back in 2017 — though in that case the focus is on speeding up hate speech takedowns.

In Germany social media platforms can be fined up to €50 million if they fail to comply with the law, known by its truncated German name, NetzDG. (The EU executive has also been pushing platforms to remove terrorist-related material within an hour of a report, suggesting it too could legislate on this front if they fail to moderate content fast enough.)

The committee suggests the UK’s media and telecoms regulator, Ofcom, would be well-placed to oversee how illegal content is handled under any new law.

It also recommends that social media companies use AI to identify and flag to users (or remove as appropriate) content that “may be fake” — pointing to the risk posed by new technologies such as “deep fake videos”.

More robust systems for age verification are also needed, in the committee’s view. It writes that these must go beyond “a simple ‘tick box’ or entering a date of birth”.

Looking beyond platforms, the committee presses the government to take steps to improve children’s digital literacy and resilience, suggesting PSHE (personal, social and health) education should be made mandatory for primary and secondary school pupils — delivering “an age-appropriate understanding of, and resilience towards, the harms and benefits of the digital world”.

Teachers and parents should also not be overlooked, with the committee suggesting training and resources for teachers and awareness and engagement campaigns for parents.
