Big data, digital platforms and market competition
How does big data generate economic value for firms and individuals? How should we respond to potential antitrust concerns?
Hal Varian described the context in which digital platforms operate. Several structural factors make competition in this market intense. Human, software and financial resources are readily available to both incumbents and entrants. Switching costs are low for both producers (given the high flexibility of web technology) and consumers (due to multi-homing). Moreover, even though core competencies differ (e.g. retailing for Amazon, social networks for Facebook, search for Google, devices for Apple), competition is really across the board. The results of this intense competition are low prices and high quality (which benefit consumers), as well as rapid innovation, fostered by robust entry of new tech companies (born as “micro”-multinationals) and a continuous reshuffling of leadership positions in the market.
To explain why some of these digital platforms have become so large, direct and indirect network effects are often cited as a relevant driver of market concentration. However, according to Hal Varian this explanation does not necessarily hold for all large web firms, such as those providing search services. He argued that using data to learn about customers and offer better services at low cost is simply a supply-side phenomenon, more precisely a learning-by-doing mechanism, not the exploitation of feedback effects through data with potentially anti-competitive consequences, as many allege. Moreover, learning from data is not an automatic side effect of scale: it requires investment and is at constant risk of devaluation due to technological change. Interestingly, data analytics can even be outsourced to companies offering this expertise.
Damien Neven and Tommaso Valletti challenged this Schumpeterian view, according to which disruptive innovation is ongoing, competition is fierce and, thus, no anti-competitive concerns arise. On the contrary, they warned that the source of the lasting strong positions of digital platforms such as Google or Facebook is somewhat puzzling and should be analysed more carefully. Is it due entirely to their capacity to develop innovative and successful products, which attract customers and create large markets in which they collect large volumes of data? Or has this process somehow been (or could it be) harmful to competitors, given that such markets can tip and first-mover advantages make a huge difference? While size is not an issue per se, it is certainly a precondition for potentially foreclosing competitors.
In terms of policy instruments, both argued that the current European merger regulation already provides the necessary economic and legal tools to address anticompetitive cases involving digital platforms, but further understanding is certainly needed to assess whether, for example, there are minimum thresholds at which economies of scale in using big data kick in. Moreover, the role that the stock of all past information collected may play in competition is still far from being assessed by competition authorities. This is essential, for example, to square huge merger valuations with the acquisition of zero-profit firms (e.g. the past Facebook/WhatsApp or the forthcoming Microsoft/LinkedIn case).
In terms of policy-making, Reinhilde Veugelers added that even if the market is very competitive and contestable, this does not necessarily imply an absence of market failures. Unfortunately, nor does it imply that policy intervention can really redress them, given the complexity of competitive dynamics and the unpredictability of future technological innovation in the ICT ecosystem. As an example of a potential market failure, she pointed to the tendency for incompatible platforms to emerge as a result of strong competition for the market. Even in that context, however, she argued that public intervention in the form of standardisation regulation is fraught with danger. In conclusion, she advocated greater use of big data and artificial intelligence by public authorities themselves, in order to inform and improve the design and evaluation of their interventions.
Event notes by Filippo Biondi, Research Assistant.
Audio & video recording
Speakers:
Georgios Petropoulos, Research Fellow
Hal Varian, Chief Economist, Google
Damien Neven, Professor of International Economics at the Graduate Institute of Geneva
Tommaso Valletti, Professor of Economics, Imperial College Business School