by John Harris
columnist at The Guardian
Big corporate scandals tend not to come completely out of the blue. As
with politicians, accident-prone companies rarely become that way by
accident, and a spectacular crisis can often arrive at the end of a long
spell of bad decisions and confidence curdling into hubris. So it is
with the tale of Facebook and Cambridge Analytica, a saga that vividly highlights the awful mess that the biggest player in billions of online lives has turned into.
Four days after a story pursued for over a year
by my brilliant Observer colleague Carole Cadwalladr burst open, its
plot now feels very familiar: in early 2014, 270,000 people did an
online “personality test” that appears to have resulted in information
about 50 million of their Facebook friends being passed to the nasty and
amoral men featured in Channel 4’s secret filming,
which would be in contravention of Facebook’s rules about data being
used by third parties for commercial purposes. In the second act,
Facebook failed to alert users and took only limited steps to recover
and secure the data in question. On Tuesday, as Facebook’s value
continued to slide, the plot thickened, with the re-appearance of a
whistleblower named Sandy Parakilas, who claimed that hundreds of millions
more people were likely to have had similar information harvested by
outside companies, and that while he was working for the company between
2011 and 2012, Facebook’s systems for monitoring how such information
was used often seemed to barely exist.
Even if Facebook has since changed its rules to stop third-party apps
gaining access to data from people’s friends, all this still goes back
to something that remains absolutely fundamental to the company. A lot
of its users know this, and yet constantly choose to forget it: beneath all the
bromides about “bringing the world closer together”
gushed out by its founder and CEO Mark Zuckerberg, and the joy of
posting your holiday pictures, Facebook’s employees tirelessly work to
amass as much data as they can about users and their online friends and
make vast amounts of money by facilitating micro-targeting by
advertisers. (This has had nasty aspects beyond political messaging: it
was only last year, for example, that Facebook decisively stopped housing advertisers from excluding certain ethnic groups and disabled people).
If you use its services as their creators intend and cough
up the small details of your life on a daily – or even hourly – basis, Facebook will know
all about your family, friends, education, politics, travel habits,
taste in clothes, connected devices, and scores of things besides. Its
eyes can extend just about everywhere online: to quote from its privacy policy,
“We receive data whenever you visit a game, application, or website
that uses Facebook Platform or visit a site with a Facebook feature …
sometimes through cookies.” And though third-party apps can be
restricted from scooping up personal information, we all know what tends
to deliver their makers what they want: the fact that most people have
no idea how to restrict access to their data, and are subtly enticed to
ignore such things.
All this stuff defines Facebook’s raison d’être. Indeed, hinting at
its drive for omniscience, Zuckerberg once habitually talked about what Facebook insiders called “radical transparency”,
an idea that partly amounted to an insistence that old ideas about
privacy were becoming outmoded. Facebook was leading the way, and this
was nothing but a good thing.
“To get people to the point where there’s more openness – that’s a
big challenge,” Zuckerberg said. “But I think we’ll do it. I just think
it will take time. The concept that the world will be better if you
share more is something that’s pretty foreign to a lot of people, and it
runs into all these privacy concerns.” (You could write a doctoral
thesis about those words: the professed belief in improving the lot of
humanity sounding distinctly like window-dressing for the company’s
pursuit of endlessly increasing revenues; the seeming impatience summed
up in the words “all these privacy concerns”.) In retrospect, talking
like that, and encouraging your people to treat worries
about personal confidentiality as increasingly the stuff of the past,
was always going to invite disaster.
Facebook’s latest bout of anxiety and what some people call
“reputational damage” now dates back at least 18 months. By the end of
the US presidential election campaign, its algorithms had ensured that the top fake stories in people’s news feeds were generating more engagement than the most popular real ones. Zuckerberg initially described the claim that Facebook had been instrumental in the victory of Donald Trump as a “pretty crazy idea”, only to recant. Having been spooked by Twitter into enthusiastically pushing the idea that Facebook could be a news platform, he then ran in the opposite direction,
insisting that its job was to allow people to share “personal moments”.
At times, he looks like someone who cannot keep up even with himself.
Facebook sometimes behaves like a government – sending in “auditors”
to examine material at the London offices of Cambridge Analytica while
the UK information commissioner’s investigators waited for legal
permission to do the same thing, and reportedly demanding access to the
whistleblower Christopher Wylie’s phone and computer. But at the same
time, its bosses defy the most basic expectations of corporate
governance. Like Facebook’s COO Sheryl Sandberg, Zuckerberg is still nowhere to be seen: a statement issued on Tuesday
said he and Sandberg were “working around the clock to get all the
facts and take the appropriate action moving forward”, and that “the
entire company is outraged we were deceived”, which is most of the way
to being laughable. Were it not for his $70bn fortune, he would arguably
inspire pity, rather than anger: it looks like he is in way over his
head.
Even if the majority of Facebook users still seem content to give it
the data it constantly devours, over the past two or three years, a
rising chorus of voices has demanded that governments and legislators
bring the company to heel. The EU’s General Data Protection Regulation
represents a step in the right direction, as does the fact that the
Cambridge Analytica scandal is being looked into by the US federal trade
commission. The work being done by the Tory MP Damian Collins as the
chair of the digital, culture, media and sport select committee is great
to see. But even at their most potent, these efforts do not get near
questions centred on Facebook’s sheer size, and the possibility of
anti-monopoly action that would have to originate on the company’s home
turf.
In the US, anti-trust actions only succeed
if a supposedly monopolistic company can be found to have adversely affected
consumers’ wellbeing in terms of the quality of products and services
they can access, the levels of innovation in a given economic sector,
and in particular, the prices people have to pay. The fact that Facebook
would probably slip free of such criteria surely suggests that the
rules are unfit for the online age, and that a different set of
considerations ought to be introduced, perhaps built around the power a
company wields, relative to its collective competence. In those terms,
Zuckerberg and his colleagues are guilty of an epic fail, and everything
that now happens to them should follow from it.