
Facebook accused of blocking wider efforts to study its ad platform

Facebook has been accused of blocking the ability of independent researchers to effectively study how political disinformation flows across its ad platform.

Adverts that the social network’s business is designed to monetize have, at the very least, the potential to influence people and push voters’ buttons, as the Cambridge Analytica Facebook data misuse scandal highlighted last year.

Since that story exploded into a major global scandal for Facebook, the company has faced a chorus of calls for increased transparency and accountability from policymakers on both sides of the Atlantic.

It has responded with lashings of obfuscation, misdirection and worse.

Among Facebook’s less controversial efforts to counter the threat that disinformation poses to its business are what it bills “election security” initiatives, such as identity checks for political advertisers, even as these efforts have looked hopelessly flat-footed, patchy and piecemeal in the face of concerted attempts to use its tools to amplify disinformation in markets around the world.

Perhaps more significantly — under amped up political pressure — Facebook has launched a searchable ad archive. And access to Facebook ad data certainly has the potential to let external researchers hold the company’s claims to account.

But only if access is not equally flat-footed, patchy and piecemeal, with the risk that selective access to ad data ends up being just as controlled and manipulated as everything else on Facebook’s platform.

So far Facebook’s efforts on this front continue to attract criticism for falling way short.

“the opposite of what they claim to be doing…”

The company opened access to an ad archive API last month, via which it provides rate-limited access to a keyword search tool that lets researchers query historical ad data. (Researchers first need to pass an identity check process and agree to the Facebook developer platform terms of service before they can access the API.)
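To make the shape of that access concrete, here is a minimal Python sketch of a keyword query against the archive. It assumes the Graph API’s ads_archive edge and the parameter and field names Facebook documented at the time; treat the version number, fields and token handling as illustrative rather than authoritative. The key point is structural: there is no way to ask for all ads, because every call has to be seeded with a search term.

```python
# Hedged sketch of a keyword query against Facebook's ad archive API.
# Endpoint version, parameter names and fields are illustrative assumptions
# based on the public documentation of the time, not a verified spec.
import requests

ACCESS_TOKEN = "..."  # issued only after passing Facebook's identity check

resp = requests.get(
    "https://graph.facebook.com/v3.2/ads_archive",
    params={
        "access_token": ACCESS_TOKEN,
        "search_terms": "immigration",            # a keyword is mandatory; no 'give me everything'
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": "['GB']",
        "fields": "page_name,ad_creative_body,ad_delivery_start_time",
        "limit": 100,                             # results are paginated and rate-limited
    },
)
resp.raise_for_status()

for ad in resp.json().get("data", []):
    print(ad.get("page_name"), ad.get("ad_delivery_start_time"))
```

Note what a response under this design does not include: stable unique ad identifiers, bulk export, or targeting and engagement data, which is precisely where the researchers’ criticism lands.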

However, a review of the tool by the not-for-profit Mozilla rates the API as little more than weak-sauce ‘transparency-washing’, rather than a good faith attempt to support public interest research that could genuinely help quantify the societal costs of Facebook’s ad business.

“The fact is, the API doesn’t provide necessary data. And it is designed in ways that hinders the important work of researchers, who inform the public and policymakers about the nature and consequences of misinformation,” it writes in a blog post where it argues that Facebook’s ad API meets just two out of five minimum standards it previously set out — backed by a group of sixty academics, hailing from research institutions including Oxford University, the University of Amsterdam, Vrije Universiteit Brussel, Stiftung Neue Verantwortung, and many more.

Instead of providing comprehensive political advertising content, as the experts argue a good open API must, Mozilla writes that “it’s impossible to determine if Facebook’s API is comprehensive, because it requires you to use keywords to search the database”.

“It does not provide you with all ad data and allow you to filter it down using specific criteria or filters, the way nearly all other online databases do. And since you cannot download data in bulk and ads in the API are not given a unique identifier, Facebook makes it impossible to get a complete picture of all of the ads running on their platform (which is exactly the opposite of what they claim to be doing),” it adds.

Facebook’s tool is also criticized for failing to provide targeting criteria and engagement information for ads — thereby making it impossible for researchers to understand what advertisers on its platform are paying the company to reach; as well as how effective (or otherwise) these Facebook ads might be.

This exact issue was raised with a number of Facebook executives by British parliamentarians last year, during the course of a multi-month investigation into online disinformation. At one point Facebook’s CTO was asked point blank whether the company would be providing ad targeting data as part of planned political ad transparency measures — only to provide a fuzzy answer.

Of course there are plenty of reasons why Facebook might be reluctant to enable truly independent outsiders to quantify the efficacy of political ads on its platform and therefore, by extension, its ad business.

Including, of course, the specific scandalous example of the Cambridge Analytica data heist itself, which was carried out by an academic, Dr Aleksandr Kogan, then attached to Cambridge University, who used his access to Facebook’s developer platform to deploy a quiz app designed to harvest user data without (most) people’s knowledge or consent in order to sell the info to the disgraced digital campaign company (which worked on various U.S. campaigns, including the presidential campaigns of Ted Cruz and Donald Trump).

But that just highlights the scale of the problem of so much market power being concentrated in the hands of a single adtech giant which has zero incentives to voluntarily report wholly transparent metrics about its true reach and power to influence the world’s 2BN+ Facebook users.

Add to that the fact that, in a typical crisis PR response to multiple bad headlines last year, Facebook repeatedly sought to paint Kogan as a rogue actor, suggesting he was not at all a representative sample of the advertiser activity on its platform.

So, by the same token, any effort by Facebook to tar genuine research as similarly risky rightly deserves a robust rebuttal. The historical actions of one individual, albeit yes an academic, shouldn’t be used as an excuse to shut the door to a respected research community.

“The current API design puts huge constraints on researchers, rather than allowing them to discover what is really happening on the platform,” Mozilla argues, suggesting the various limitations imposed by Facebook — including search rate limits — means it could take researchers “months” to evaluate ads in a particular region or on a certain topic.

Again, from Facebook’s point of view, there’s plenty to be gained by delaying the release of any more platform usage skeletons from its bulging historical data closet. (The ‘historical app audit’ it announced with much fanfare last year continues to trickle along at a disclosure pace of its own choosing.)

The two areas where Facebook’s API gets a tentative thumbs up from Mozilla are providing access to up-to-date and historical data (the seven-year availability of the data is badged “pretty good”); and the API being accessible to and shareable with the general public (at least once researchers have gone through Facebook’s identity confirmation process).

Though in both cases Mozilla also cautions it’s still possible that further blocking tactics might emerge — depending on how Facebook supports/constrains access going forward.

It does not look entirely coincidental that the criticism of Facebook’s API for being “inadequate” has landed on the same day that Facebook pushed out publicity about opening up access to a database of URLs its users have linked to since 2017, which is being made available to a select group of academics.

In that case the group comprises 60 researchers, drawn from 30 institutions, chosen by the U.S.’ Social Science Research Council.

Notably the Facebook-selected research dataset entirely skips past the 2016 U.S. presidential election, when Russian election propaganda infamously targeted hundreds of millions of U.S. Facebook users.

The UK’s 2016 Brexit vote is also not covered by the January 2017 onwards scope of the dataset.

Though Facebook does say it is “committed to advancing this important initiative”, suggesting it could expand the scope of the dataset and/or who can access it at some unspecified future time.

It also claims ‘privacy and security’ considerations are holding up efforts to release research data quicker.

“We understand many stakeholders are eager for data to be made available as quickly as possible,” it writes. “While we remain committed to advancing this important initiative, Facebook is also committed to taking the time necessary to incorporate the highest privacy protections and build a data infrastructure that provides data in a secure manner.”

In Europe, Facebook committed itself to supporting good faith, public interest research when it signed up to the European Commission’s Code of Practice on disinformation last year.

The EU-wide Code includes a specific commitment that platform signatories “empower the research community to monitor online disinformation through privacy-compliant access to the platforms’ data”, in addition to other actions such as tackling fake accounts and making political ads and issue based ads more transparent.

However here, too, Facebook appears to be using ‘privacy-compliance’ as an excuse to water down the level of transparency that it’s offering to external researchers.

TechCrunch understands that, in private, Facebook has responded to concerns raised about its ad API’s limits by saying it cannot provide researchers with more comprehensive data about ads, including targeting criteria, because doing so would violate its commitments under the EU’s General Data Protection Regulation (GDPR) framework.

That argument is of course pure ‘cakeism’. Aka, Facebook is trying to have its cake and eat it where privacy and data protection are concerned.

In plainer English, Facebook is trying to use European privacy regulation to shield its business from deeper and more meaningful scrutiny. Yet this is the very same company — and here comes the richly fudgy cakeism — that elsewhere contends personal data its platform pervasively harvests on users’ interests is not personal data. (In that case Facebook has also been found allowing sensitive inferred data to be used for targeting ads — which experts suggest violates the GDPR.)

So, tl;dr, Facebook can be found seizing upon privacy regulation when it suits its business interests to do so — i.e. to try to avoid the level of transparency necessary for external researchers to evaluate the impact its ad platform and business has on wider society and democracy.

Yet it argues against the GDPR when the privacy regulation stands in the way of monetizing users’ eyeballs by stuffing them with intrusive ads targeted by pervasive surveillance of everyone’s interests.

Such contradictions have not at all escaped privacy experts.

“The GDPR in practice — not just Facebook’s usual weak interpretation of it — does not stop organisations from publishing aggregate information, such as which demographics or geographic areas saw or were targeted for certain adverts, where such data is not fine-grained enough to pick an individual out,” says Michael Veale, a research fellow at the Alan Turing Institute — and one of ten researchers who co-wrote the Mozilla-backed guidelines for what makes an effective ad API.

“Facebook would require a lawful basis to do the aggregation for the purpose of publishing, which would not be difficult, as providing data to enable public scrutiny of the legality and ethics of data processing is a legitimate interest if I have ever seen one,” he also tells us. “Facebook constantly reuse data for different and unclearly related purposes, and so claiming they could legally not reuse data to put their own activities in the spotlight is, frankly, pathetic.

“Statistical agencies have long been familiar with techniques such as differential privacy which stop aggregated information leaking information about specific individuals. Many differential privacy researchers already work at Facebook, so the expertise is clearly there.”

“It seems more likely that Facebook doesn’t want to release information on targeting as it would likely embarrass [it] and their customers,” Veale adds. “It is also possible that Facebook has confidentiality agreements with specific advertisers who may be caught red-handed for practices that go beyond public expectations. Data protection law isn’t blocking the disinfecting light of transparency, Facebook is.”
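To illustrate the kind of technique Veale is referring to, here is a minimal Python sketch of the Laplace mechanism applied to a hypothetical targeting aggregate, say the number of users in one region and age bracket shown a given ad. The function, figures and parameters are illustrative; nothing here reflects Facebook’s actual methodology.

```python
# Minimal sketch of the Laplace mechanism for publishing a noisy aggregate
# count so that no single individual's presence can be inferred from it.
# Hypothetical example; not Facebook's methodology.
import numpy as np

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Return an epsilon-differentially-private version of a count.

    Adding or removing one person changes the count by at most `sensitivity`,
    so Laplace noise with scale sensitivity/epsilon satisfies
    epsilon-differential privacy for this query.
    """
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# e.g. a (hypothetical) count of 18-24 year-olds in one region shown a political ad
print(dp_count(true_count=1842, epsilon=0.5))
```

The point of the sketch is Veale’s: publishing aggregate transparency data and protecting individual privacy are not in tension when standard statistical disclosure techniques are applied.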

Asked about the URL database that Facebook has released to selected researchers today, Veale says it’s a welcome step — while pointing to further limitations.

“It’s a good thing that Facebook is starting to work more openly on research questions, particularly those which might point to problematic use of this platform. The initial cohort appears to be geographically diverse, which is refreshing — although appears to lack any academics from Indian universities, far and away Facebook’s largest userbase,” he says. “Time will tell whether this limited dataset will later expand to other issues, and how much researchers are expected to moderate their findings if they hope for continued amicable engagement.”

“It’s very possible for Facebook to effectively cherry-pick datasets to try to avoid issues they know exist, but you also cannot start building a collaborative process on all fronts and issues. Time will tell how open the multinational wishes to be,” Veale adds.

We’ve reached out to Facebook for comment on the criticism of its ad archive API.
