
Instagram expands ban on suicide content to cover cartoons and memes

Instagram has expanded its ban on graphic self-harm imagery to cover a broader range of content depicting suicide, including fictional illustrations of self-harm and suicide methods such as drawings, cartoons and memes.

“This past month, we further expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery,” writes Instagram boss, Adam Mosseri, explaining the latest policy shift. “We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.”

Earlier this year Mosseri met with the UK’s health secretary to discuss the platform’s policy towards self-harm content. The company has faced high-level pressure in the country following a public outcry after the family of Molly Russell, a 14-year-old UK schoolgirl who killed herself after viewing suicide content on Instagram, went public with the tragedy by talking to the BBC.

In February the Facebook-owned social media platform announced that it would prohibit graphic images of self-harm, such as cutting, and restrict access to non-graphic self-harm content, such as images of healed scars — by not recommending it in searches.

At the time it also suggested it was toying with the idea of using sensitivity screens to blur non-graphic suicide content, saying it was consulting with experts. In the event it has decided to go further, saying it will also remove fictional content related to self-harm, as well as anything that depicts methods of suicide or self-harm.

Instagram says it’s doubled the amount of self-harm content it has acted on following the earlier policy change — with Mosseri writing that in the three months following the ban on graphic images of cutting it “removed, reduced the visibility of, or added sensitivity screens to more than 834,000 pieces of content”.

More than 77% of this content was identified by the platform before it was reported, he adds.

A spokesperson for Instagram confirmed to us that the latest policy shift is in effect.

It’s not clear, however, how long effective enforcement will take. Mosseri told BBC News: “It will take time to fully implement,” adding: “It’s not going to be the last step we take.”

In his blog post about the policy change, the Instagram boss writes that the new policy is “based on expert advice from academics and mental health organisations like the Samaritans in the UK and National Suicide Prevention Line in the US”, saying: “We aim to strike the difficult balance between allowing people to share their mental health experiences while also protecting others from being exposed to potentially harmful content.”

“Accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like Explore. And we’ll send more people more resources with localized helplines like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the United States,” he adds.

He goes on to argue that the issues involved are complex and “no single company or set of policies and practices alone can solve”, while defending continuing to allow some suicide and self-harm content on Instagram by saying “experts tell us that giving people a chance to share their most difficult moments and their stories of recovery can be a vital means of support” and that “preventing people from sharing this type of content could not only stigmatize these types of mental health issues, but might hinder loved ones from identifying and responding to a cry for help”.

“But getting our approach right requires more than a single change to our policies or a one-time update to our technology. Our work here is never done. Our policies and technology have to evolve as new trends emerge and behaviors change,” he adds.
