The 800-Terabyte Gorilla: Is Big Data Synonymous with Market Power?

The EU Competition Commissioner has highlighted “competition in a big data world” as a new frontier in market regulation.  The Commission hasn’t, Margrethe Vestager tells us, found a competition problem yet. But: “This certainly doesn’t mean we never will.”  What would it take to create one?

The idea that data could be a source of market power has been floating increasingly close to the surface in Brussels. In a speech in Munich in January 2016, Margrethe Vestager, the EU Competition Commissioner, cast a fly over it. “If just a few companies control the data you need,” she warned, “that could give them the power to drive their rivals out of the market.”

Big data tends to be held by big firms with billions of customers. It can leave some of their competitors feeling that such firms are…well, a bit big. Yet their services are typically free to use, and their Silicon Valley investors are willing to suspend disbelief about the likelihood of a reliable stream of returns. So they do not display the features of market power that regulators traditionally get their hooks into: big prices and big profits. If we paid through the nose to use Facebook or to run a Google search, we (and the competition regulators) would have the usual, financial, causes for concern. But in this kind of market, the focus needs to be elsewhere – on what’s on the servers.

SO WHO CARES?

Of course, money must come into it somewhere.  In the end, those Silicon Valley investors do expect to get their bucks back. And that expectation is founded on the belief that big data means big returns. So the first question is: how?

Most obviously, personal data can be monetised through advertising, particularly if it can lead to more targeted, more successful marketing. This is not an economic “evil”: more effective advertising means that customers see adverts for things they actually might want to buy, rather than having their time wasted (by, let’s say, ads for nose hair removers), and this makes the wheels of commerce turn more efficiently.


But the old warning about the unavailability of a free lunch applies here. Nothing really comes “free”. A transaction takes place when I give you something I would rather keep in exchange for something I want even more. Usually, what I give is money; in this case, I’m paying with a loss of privacy for otherwise “free” services. This does have a cost, just of a different type. (Are you happy with the commercial world knowing that yes, you do need a nose hair remover?)

If one looks at the transaction in this way – “free” service paid for by loss of privacy – then it becomes easier to see how the precepts of competition law could be brought to bear.  There is a “price”, or as the EU Commissioner put it, “customers have a new currency that we can use to pay…our data.” And, most importantly, a dominant firm might be capable of setting that price unacceptably high. Customers might find themselves handing over much more of their valuable data than they would have to do in a competitive market.

BIG DATA POWER

Big data operates rather like a processing plant. Put raw personal information in one end, polish it up with some statistical analysis, and receive usable marketing insights at the other. The key point is that the bigger the data such polishing machines have, the better they work.

To understand this, begin with the amount of personal information such firms possess that might be relevant to your purchasing preferences: for example, your marital status, your dog, your obsession with Star Trek: Deep Space Nine. Let’s say the data-crunchers know ten such personal facts about you. Now suppose that you share each of these characteristics with a quarter of the population. Then, if these characteristics are unrelated, your combination of attributes can be expected to occur in only about 0.0001% of the population. You are, quite literally, one in a million.
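The arithmetic behind that “one in a million” claim can be sketched in a few lines. This is only the independence calculation described above – ten attributes, each shared by 25% of the population, assumed unrelated:

```python
# Back-of-envelope calculation from the text: ten independent attributes,
# each shared by a quarter of the population.
n_attributes = 10
share = 0.25

combined = share ** n_attributes  # probability of matching all ten facts
print(f"Fraction of population: {combined:.7%}")       # 0.0000954%
print(f"Roughly one person in {round(1 / combined):,}")  # one in 1,048,576
```

The exact figure, (1/4)^10 = 1/1,048,576, rounds to the article’s 0.0001%.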

Now suppose that I want to monetise this information by selling it to advertisers. They want to know what types of advert people like you click, and how many of those clicks generate transactions. To answer those questions, I need a sample of people like you. A good sample – if one looks at typical market research – might be something like 1,000 people. If you are one in a million, then my big data machine would need information on one billion users to generate that size of sample and operate efficiently.
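The scaling logic above is simple: if a profile occurs at rate p in the population and the advertiser needs a sample of k such people, the platform needs roughly k / p users overall. Plugging in the article’s numbers:

```python
# If a target profile occurs at rate p, a sample of k matching people
# requires roughly k / p users in total (numbers from the example above).
p = 1 / 1_000_000  # one-in-a-million profile
k = 1_000          # sample size typical of market research

users_needed = k / p
print(f"Users required: {users_needed:,.0f}")  # 1,000,000,000 – one billion
```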

TESTING TIME

But wait. There are reasons to remain skeptical that such data machines, however big, are really sufficient to achieve market dominance. In real life, of course, those ten personal facts aren’t likely to be totally unrelated (married people are more likely than singles to have dogs). So you’re unlikely to be as unusual as one in a million – on the information the data firm has, anyway.  Whilst proper testing clearly requires big sets of data, and there are advantages to be had from accumulating lots, the scale of any commercial and competitive advantages remains unproven.
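The effect of correlation on the calculation can be illustrated with hypothetical numbers (the 40% conditional share below is an assumption for illustration, not a figure from the article). If knowing one fact makes the others more likely – married people are more likely to have dogs – the combined profile is far less rare than independence suggests:

```python
# Hypothetical illustration of how correlation deflates the rarity estimate.
# Independent case: each of ten attributes shared by 25% of the population.
independent = 0.25 ** 10

# Correlated case (assumed numbers): first attribute at 25%, but each later
# attribute at 40% conditional on the earlier ones.
correlated = 0.25 * 0.40 ** 9

print(round(1 / independent))  # 1048576 – about one in a million
print(round(1 / correlated))   # 15259  – only about one in fifteen thousand
```

Even mild correlation, on this sketch, shrinks the apparent uniqueness of a profile by nearly two orders of magnitude – which is why the scale of the advantage remains unproven.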

Not all data is equally useful. As the Commissioner has said: “We shouldn’t take action just because a company holds lots of data.  After all, data doesn’t automatically equal power…we need to look at the type of data.”  What effect will such scrutiny have?

The Commission’s decision on the Facebook/WhatsApp merger showed the usual sort of merger analysis: it looked at market shares, the number and type of competitors, and the closeness of competition between the merging parties’ services. The existence of effective downstream competitors, offering services that end-users find attractive, is likely to remain the most important factor in such merger cases.

So how will big data come to affect such cases? As we’ve seen, most competition investigations feature an analysis of prices and profit margins. When no cash price is paid by consumers, we need a more sophisticated economic approach to the analysis of dominance.

In future, we expect regulators will start to analyse the financial value of the data consumers provide. Where larger firms systematically earn a much higher income per subscriber from exploiting exactly the same sort of data, this might be an indicator that they are in possession of a big data machine with market power.  But proving this won’t be easy, and is likely to be a highly contested element of the economic analysis in such cases. Is your ability to earn better returns from data really a sign of market power, or simply a function of your clever algorithm for generating valuable insights?

SEE YOUR DATA, ALLIGATOR

The Commission has signalled a concern that there might be times when a dominant big data advantage could be used to stifle innovative new services. It’s not immediately obvious how, because even if a firm has built itself a big data machine of unparalleled sophistication, this does not have any direct impact on the viability of its current or future competitors.

Any complaint that such a machine generates more revenues to invest in superior services – giving rise to an “efficiency offence” – is unlikely to get much traction. After all, these are markets where inventors of unique or better services have very quickly gained big user bases to fuel their own machines.

In fact, such concerns are most likely to arise in markets where history matters.  If a first mover can build up a long history of useful data – a Facebook timeline, perhaps, or a Google search history – a new competitor might be unable to match its sophistication, even if such an entrant can achieve the same number of users. In such circumstances, one can imagine that authorities might start to take seriously the need for historic data to follow the customer.

Such a solution would have very significant implications. It would raise major regulatory questions with respect to inter-operability and data-sharing between rival services. Policy makers have already started to engage with these issues. A government-sponsored voluntary initiative in the UK already allows customers to download and share personal current account transaction history across participating banks. The European Commission believes that a new right to data portability under the forthcoming Data Protection Regulation will mean “smaller companies will be able to access data markets dominated by digital giants”. We can expect competition authorities to continue to build on that idea in their approach to remedies in data-driven markets.

CONCLUSION

In principle, the development of big data should not be game-changing for the regulators: it merely requires a more sophisticated analysis of the nature of transactions with customers, in order to identify the “prices” they pay in terms of lost privacy rather than hard cash. But in practice, that can be quite a tall order. Such a requirement demands a new and challenging type of economic analysis. And as the competition regulators advance further into big data territory, they are beginning to put forward other questions that in turn will raise the prospect of significant changes in regulatory approach.
