Antitrust concerns in zero-price markets
"Free" products and services are the latest trend, but many of them are not actually free: consumers are exchanging their data for them. Only recently have competition policy practitioners started to account adequately for the unique attributes of zero-price markets.
See below for the event recording and materials.
Daniel Rubinfeld presented his latest paper, which focuses on the analytical challenges that the widespread provision of “free” products has brought to the forefront of the competition policy debate. As he explained at the beginning of his intervention, this work represents a first holistic attempt to identify the many unanswered questions about the welfare effects of firms’ zero-pricing strategies and, in particular, about the appropriate regulatory and enforcement tools with which to approach them. In this regard, he openly called for further economic research to provide new theoretical guidance to antitrust practitioners. Outlining the economic rationales for the supply of free products, he emphasised the need to analyse products or services that are “companions” to those offered at zero price. Another important motivation is the existence of a specific “zero-price effect”, recently acknowledged by several behavioural economics studies.
When faced with a zero price, consumers become affective rather than rational decision-makers, probably due to cognitive biases. Such behavioural “discontinuities”, according to Rubinfeld, should be included in any economic model used to predict competitive dynamics in these markets. In terms of welfare implications, his preliminary assessment was that the supply of zero-price products generally provides real benefits to consumers, well beyond the costs saved. For example, free goods may create pro-competitive effects by encouraging firms to compete more on non-price dimensions such as quality. Nonetheless, he identified many cases in which zero-price products have the potential to harm both competition and welfare.
To identify these circumstances correctly, Rubinfeld argued, it is crucial to acknowledge that in a zero-price context most of the basic market-related assumptions embedded in the economic models at the core of the current competition policy framework no longer hold. In this sense, he shared with the audience his view on the limitations of existing antitrust tools, from market definition to the analysis of market power and the detection of predatory strategies. At the same time, he argued that there are ways in which antitrust can be extended to deal with such issues.
Paul Gilbert expressed a more optimistic view of the competitive effects of the provision of free goods. He argued that free services are generally good not only for consumers but also for competition. Indeed, looking at many recent cases (e.g. the successive waves of growth of social media platforms, or innovation in search), the supply of free goods has usually encouraged competition, given multi-homing and portability. On measurement issues, he expressed scepticism about the possibility of assessing anticompetitive effects in the advertising market, given that advertising opportunities are so vast. Moreover, he pointed out that it might be very difficult to measure a degradation in the quality of future innovation. He concluded by arguing that there is such a thing as a free lunch and that antitrust concerns are therefore likely to be rare. Where such concerns do arise, however, he argued that we have the necessary tools to address them.
Miguel de la Mano pointed out that firms often create the “illusion” that a product is free of charge. The reality is that in two-sided markets someone else is actually paying, and in many other cases consumers indirectly pay a monetary (or non-monetary) price. However, every time there is a transaction, value is created for both consumers and firms. Hence, whether this occurs at a zero or a positive price, he agreed with Gilbert that we should not worry too much about free goods, as long as such transactions are welfare-enhancing. However, he pointed out that the reactions of consumers facing free goods may contradict standard economic predictions: once the usual price-setting mechanisms are removed, other considerations come into play in determining consumers’ behaviour. While free goods are certainly not a novelty for competition authorities (e.g. the bundling of free razors has been permitted for its pro-competitive dynamic effects), the Internet has driven the marginal costs of many digital products down dramatically, often to zero. This, in turn, has created even stronger incentives for fiercer competition. At this point, de la Mano identified some situations in which there might be welfare concerns, although not all of them necessarily fall within the responsibility and competence of competition authorities.
For example, when consumers pay for zero-price goods with personal data, checks for misuse should be strict but would fall under the responsibility of data protection authorities. When certain segments of the population are more exposed to the negative consequences of allegedly free products, fairness considerations should be raised and, where possible, addressed. Another situation in which public intervention (not necessarily by competition authorities) would be advisable concerns the deterioration in the quality of information that results from fiercer competition in the media industry.
Alexandre de Streel warned against oversimplifying rhetoric about free goods. In many cases, supposedly free goods are not actually free: someone else is paying, or consumers are paying in other ways. In this sense, he credited Rubinfeld’s paper with properly identifying the hidden costs of zero-price goods. De Streel argued that in these markets the “usual” anticompetitive concerns (related to bundling, barriers to entry, etc.) are still present, but that newer types of harm, mainly related to the use of big data, are potentially more relevant. In terms of the antitrust analytical toolkit, he agreed with Rubinfeld that the first trap to avoid is the “no price, no market” argument: there are certainly other related markets with positive prices to scrutinise.
The conceptual tools used to define the relevant market must therefore be adapted to allow a broader analysis of actual competitive constraints. Even more radically, he proposed skipping (or at least downgrading) the relevant-market definition phase and focusing instead on the analysis of entry barriers, mainly those related to users’ data, which have become an essential input for innovation.
Event notes by Filippo Biondi, Research Assistant.
Audio & video recording
Report by Michal S. Gal, Daniel L. Rubinfeld
Presentation by Daniel L. Rubinfeld
Presentation by Paul Gilbert
Daniel Rubinfeld, Professor of Law at New York University and Professor of Law and Economics Emeritus at the University of California, Berkeley
Chair: Georgios Petropoulos, Research Fellow
Paul Gilbert, Counsel, Cleary Gottlieb Steen & Hamilton LLP
Alexandre de Streel, Professor of Law at the University of Namur and joint academic director at the Centre on Regulation in Europe (CERRE)
Miguel de la Mano, Executive Vice President at Compass Lexecon