The Expectation of Privacy

By Michalis Kamprianis


Everybody has something to say about the Facebook / Cambridge Analytica case. And I am annoyed by people who argue that because you give your data to Facebook, you forego some part of your privacy (true), so you should not be surprised (false). In simple terms, this was an actual data breach. Individuals who had not consented had their data exposed. This was not supposed to happen. There are two aspects I will focus on regarding this issue:

The Facebook Perspective

From the Facebook perspective, this is a plain data breach due to bad design. The design of the API allowed this to happen. Technically it was not a hack: there was no bug in a piece of software that was exploited. It was a design flaw. Whether intentional or unintentional is a different discussion. Looking at it from the perspective of the upcoming GDPR, if the API had been released after May 2018, a Data Protection Impact Assessment should have taken place, and such a design flaw would be punishable. Based on the $40 billion revenue in 2017, the penalty could go up to $1.6 billion. At the moment, though, there is no GDPR relevance here.

The Cambridge Analytica perspective

Cambridge Analytica is a data analytics company. It lives by the data it collects. The company is irrelevant without data, and the quality of the services it offers is directly related to the amount of data it holds. Cambridge Analytica engineers found that they could collect additional data, so they did. They tried to improve their company's offering. What would you do?

And what about ethics?

Here’s the thing though: let’s say I work for an automotive company and I find, leaked somewhere (e.g. on a poorly protected S3 bucket, as has often been the case lately), the designs of an imaginary, successful car company called Tigers. Do I use them? I didn’t hack anyone, and I don’t care how they ended up there. A design flaw on the Tigers’ side exposed them. By using them I can add value to my company; I avoid the R&D costs and take advantage of someone else’s efforts, without doing anything illegal myself.

Having said that, I believe there is a consensus that such designs should not be used. But if I cannot use intellectual property that accidentally came into my hands, why can I use private data that I happened to stumble upon? Shouldn’t I report it?

As it turned out, Cambridge Analytica did not formally report it, but they certainly bragged about it.

The way I see it, Cambridge Analytica’s people failed to respect the intended use of the data. They bragged about having it and using it. I may give them the benefit of the doubt; they may not have realized that their actions violated the subjects’ privacy. At the same time, Facebook’s people failed to close the loophole when they found out. Although Cambridge Analytica had been vocal about having the data since 2014, Facebook only changed the design in 2015 – some months afterwards.

Nobody Spoke Up

But the most obvious failure is that nobody spoke up. People at Cambridge Analytica and Facebook knew. It was all over the news; it was not even a secret. Yet nobody came forward to say ‘hey, this is wrong’, or ‘hey, we need to inform people’. And this is where we should all be worried. Our expectation of privacy rests on companies respecting our privacy, and companies are sets of individuals.

Among the (at least) 100 people who knew about it, I am very interested to find out who was actually willing to jeopardize their job in order to do the right thing and inform upper management, the public, and the news media about the violation.

We know that speaking up is difficult, but unless people are willing to speak up, we will never be in a better place.

About Michalis Kamprianis
Michalis has worked for well-known organizations, in a wide spectrum of industries, for over 20 years. With deep technical know-how combined with business acumen, he has extensive experience in building and leading IT and security teams, developing information security management systems for GRC, defining policies and technical standards, and monitoring controls and their efficiency through a proper risk management framework.
Michalis holds several certifications in Information Security, IT Management and Governance, and Project Management, while his academic qualifications include an MSc in Computer Science with Distinction and an MBA with Merit. He enjoys and frequently presents at security conferences.
Follow his thoughts at his website: