
Big tech’s ‘blackbox’ algorithms face regulatory oversight under EU plan

Major Internet platforms will be required to open up their algorithms to regulatory oversight under proposals European lawmakers are set to introduce next month.

In a speech today Commission EVP Margrethe Vestager suggested algorithmic accountability will be a key plank of the forthcoming legislative digital package — with draft rules incoming that will require platforms to explain how their recommendation systems work as well as offering users more control over them.

“The rules we’re preparing would give all digital services a duty to cooperate with regulators. And the biggest platforms would have to provide more information on the way their algorithms work, when regulators ask for it,” she said, adding that platforms will also “have to give regulators and researchers access to the data they hold — including ad archives”.

While social media platforms like Facebook have set up ad archives ahead of any regulatory requirement to do so, there are ongoing complaints from third-party researchers about how the information is structured and how (in)accessible it is to independent study.

More information for users around ad targeting is another planned requirement, along with greater reporting requirements for platforms to explain content moderation decisions, per Vestager — who also gave a preview of what’s coming down the pipe in the Digital Services Act and Digital Markets Act in another speech earlier this week.

Regional lawmakers are responding to concerns that ‘blackbox’ algorithms can have damaging effects on individuals and societies — flowing from how they process data and order and rank information, with risks such as discrimination, amplification of bias and abusive targeting of vulnerable individuals and groups.

The Commission has said it's working on binding transparency rules with the aim of forcing tech giants to take more responsibility for the content their platforms amplify and monetize. The devil will be in both the detail of the requirements and how effectively they are enforced, but a draft of the plan is due in a month or so.

“One of the main goals of the Digital Services Act that we’ll put forward in December will be to protect our democracy, by making sure that platforms are transparent about the way these algorithms work – and make those platforms more accountable for the decisions they make,” said Vestager in a speech today at an event organized by not-for-profit research advocacy group AlgorithmWatch.

“The proposals that we’re working on would mean platforms have to tell users how their recommender systems decide which content to show – so it’s easier for us to judge whether to trust the picture of the world that they give us or not.”

Under the planned rules the most powerful Internet platforms — so-called ‘gatekeepers’ in EU parlance — will have to provide regular reports on “the content moderation tools they use, and the accuracy and results of those tools”, as Vestager put it.

There will also be specific disclosure requirements for ad targeting that go beyond the current fuzzy disclosures that platforms like Facebook may already offer (in its case via the ‘why am I seeing this ad?’ menu).

“Better information” will have to be provided, she said, such as platforms telling users “who placed a certain ad, and why it’s been targeted at us”. The overarching aim will be to ensure users of such platforms have “a better idea of who’s trying to influence us — and a better chance of spotting when algorithms are discriminating against us,” she added. 

Today a coalition of 46 civic society organizations led by AlgorithmWatch urged the Commission to make sure transparency requirements in the forthcoming legislation are “meaningful” — calling for it to introduce “comprehensive data access frameworks” that provide watchdogs with the tools they need to hold platforms accountable, as well as to enable journalists, academics, and civil society to “challenge and scrutinize power”.

The group's recommendations call for binding disclosure obligations based on the technical functionalities of dominant platforms; a single EU institution "with a clear legal mandate to enable access to data and to enforce transparency obligations"; and provisions to ensure data collection complies with EU data protection rules.

Another way to rebalance the power asymmetry between data-mining platform giants and the individuals who they track, profile and target could involve requirements to let users switch off algorithmic feeds entirely if they wish — opting out of the possibility of data-driven discrimination or manipulation. But it remains to be seen whether EU lawmakers will go that far in the forthcoming legislative proposals.

The only hint Vestager offered on this front was to say that the planned rules "will also give more power to users — so algorithms don't have the last word about what we get to see, and what we don't get to see".

Platforms will also have to give users "the ability to influence the choices that recommender systems make on our behalf", she said.

In further remarks she confirmed there will be more detailed reporting requirements for digital services around content moderation and takedowns, saying platforms will have to tell users when they take content down and give them "effective rights to challenge that removal".

While there is widespread public support across the bloc for rebooting the rules of play for digital giants, there are also strongly held views that regulation should not impinge on online freedom of expression, such as by encouraging platforms to shrink their regulatory risk by applying upload filters or removing controversial content without a valid reason.

The proposals will need the support of EU Member States, via the European Council, and elected representatives in the European parliament.

The latter has already voted in support of tighter rules on ad targeting. MEPs also urged the Commission to reject the use of upload filters or any form of ex-ante content control for harmful or illegal content, saying the final decision on whether content is legal or not should be taken by an independent judiciary.

Simultaneously, the Commission is working on shaping rules specifically for applications that use artificial intelligence, but that legislative package is not due until next year.

Vestager confirmed that will be introduced early in 2021 with the aim of creating “an AI ecosystem of trust”.
