Tuesday, January 26, 2021

Facebook’s secret settlement on Cambridge Analytica gags UK data watchdog

Remember the app audit Facebook founder Mark Zuckerberg promised to carry out a little under three years ago, at the height of the Cambridge Analytica scandal? Actually, the tech giant is very keen that you don’t.

The UK’s information commissioner just told a parliamentary subcommittee on online harms and disinformation that a secret arrangement between her office and Facebook prevents her from publicly answering whether or not Facebook contacted the ICO about completing a much-trumpeted ‘app audit’.

“I think I could answer that question with you and the committee in private,” information commissioner Elizabeth Denham told her questioner, Kevin Brennan MP.

Pressed to respond, then and there, to the question of whether Facebook ever notified the regulator about completing the app audit — with Brennan pointing out that “after all it was a commitment Mark Zuckerberg gave in the public domain before a US Senate committee” — Denham referred directly to a private arrangement with Facebook which she suggested prevented her from discussing such details in public.

“It’s part of an agreement that we struck with Facebook,” she told the committee. “In terms of our litigation against Facebook. So there is an agreement that’s not in the public domain and that’s why I would prefer to discuss this in private.”

In October 2019 Facebook settled with the UK’s data protection watchdog, agreeing to pay in full the £500,000 penalty the ICO had announced in 2018 in relation to the Cambridge Analytica breach, a penalty which Facebook had been appealing.

When it settled with the ICO, Facebook did not admit liability. The company had earlier secured a win from a first-tier legal tribunal, which had held in June that “procedural fairness and allegations of bias” against the regulator should be considered as part of Facebook’s appeal. The ICO’s litigation had therefore got off to a bad start, likely providing the impetus for the regulator to settle with Facebook’s private army of in-house lawyers.

In a statement at the time, covering the bare bones of the settlement, the ICO said Denham considered the agreement “best serves the interests of all UK data subjects who are Facebook users”.

There was no mention of any ‘gagging clauses’ in that disclosure. But the regulator did note that the terms of the agreement gave Facebook permission to “retain documents disclosed by the ICO during the appeal for other purposes, including furthering its own investigation into issues around Cambridge Analytica”.

So — at a stroke — Facebook gained control of a whole lot of strategically important information.

The settlement looks to have been extremely convenient for Facebook. Not only was it fantastically cheap (Facebook had agreed to pay $5BN to settle with the FTC over the Cambridge Analytica scandal just a few months earlier); not only did it provide Facebook with a trove of ICO-obtained data to do its own digging into Cambridge Analytica safely out of the public eye; but it also ensured the UK regulator would be restricted in what it could say publicly.

To the point where the information commissioner has refused to say anything about Facebook’s post-Cambridge Analytica app audit in public.

The ICO seized a massive trove of data from the disgraced (and since defunct) company, which had become such a thorn in Facebook’s side, after raiding Cambridge Analytica’s UK offices in early 2018. How much of that data ended up with Facebook via the ICO settlement is unclear.

Interestingly, the ICO also never produced a final report on its Cambridge Analytica investigation.

Instead it sent a letter to the DCMS committee last year, in which it set out a number of conclusions: confirming its view that the umbrella of companies of which Cambridge Analytica was a part had been aggregating datasets from commercial sources to try to “make predictions on personal data for political alliance purposes”, as it put it; and confirming that the improperly obtained Facebook data had been incorporated into a pre-existing database containing “voter file, demographic and consumer data for US individuals”.

The ICO also said then that its investigation did not find evidence that the Facebook data sold to Cambridge Analytica had been used for political campaigning associated with the UK’s Brexit referendum. But there was no overarching report detailing the underlying workings by which the regulator reached its conclusions.

So, again from Facebook’s perspective, a pretty convenient outcome.

Asked today by the DCMS committee why the regulator had not produced the expected final report on Cambridge Analytica, Denham pointed to a number of other reports her office put out over the course of the multi-year probe, such as audits of UK political parties and an investigation into credit reporting agencies.

“The letter was extensive,” she also argued. “My office produced three reports on the investigation into the misuse of data in political campaigning. So we had a policy report and we had two enforcement reports. So we had looked at the entire ecosystem of data sharing and campaigning… and the strands of that investigation are reported out sufficiently, in my view, in all of our work.”

“Taken together the letter, which was our final line on the report, with the policy and the enforcement actions, prosecutions, fines, stop processing orders, we had done a lot of work in this space — and what’s important here is that we have really pulled back the curtain on the use of data in democracy which has been taken up by… many organizations and parliamentarians around the world,” she added.

Denham also confirmed to the committee that the ICO has retained data related to the Cambridge Analytica investigation — data which could be of potential use to other investigations still ongoing around the world. But she denied that her office had been asked by the US Senate Intelligence Committee to provide it with information obtained from Cambridge Analytica — seemingly contradicting an earlier report by the US committee which suggested it had been unable to obtain the information it sought. (We’ve contacted the committee to ask about this.)

Denham did say, though, that evidence obtained from Cambridge Analytica was shared with the FTC, the SEC and state attorneys general.

We’ve also reached out to Facebook about its private arrangement with the ICO, and to ask again about the status of its post-Cambridge Analytica ‘app audit’. (And will update this report with any response.)

The company has produced periodic updates about the audit’s progress, saying in May 2018, for example, that around 200 apps had been suspended as a result of the internal probe.

Then in August 2019 Facebook also claimed to the DCMS committee that the app audit was “ongoing”.

In its original audit pledge, made in March 2018, Zuckerberg promised a root-and-branch investigation into any other ‘sketchy’ apps operating on Facebook’s platform, responding in a crisis-length Facebook post to revelations that a third party had illicitly obtained data on millions of users with the aim of building psychographic profiles for voter targeting. It later emerged that an app developer, operating freely on Facebook’s platform under its existing developer policies, had sold user data to Cambridge Analytica.

“We will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity,” Zuckerberg wrote at the time. “We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data [Aleksandr] Kogan misused here as well.”

It’s notable that the Facebook founder did not promise to transparently and publicly report the audit’s findings. This, of course, is what ‘self-regulation’ looks like: invisible final ‘audit’ reports.

An ‘audit’ that’s entirely controlled by an entity deeply implicated in core elements of what’s being scrutinized obviously isn’t worth the paper it’s (not) written on. But, in Facebook’s case, this opened-but-never-closed ‘app audit’ appears to have served its crisis PR purpose.



from Social – TechCrunch https://ift.tt/3sXpnVL
