AI and price fixing: Collusion by machines

Contributed by: Troy Pilkington, Liz Blythe, Zoe Sims, Bradley Aburn and Chris Brunt

Published on: March 18, 2019

The New Zealand Commerce Act 1986 (Commerce Act) prohibits anti-competitive cartel agreements between competitors, such as price fixing agreements.1 But what happens when businesses adopt AI-based pricing algorithms that have not been trained to recognise, and avoid, anti-competitive cartel behaviour?2 There is a risk that, as businesses increasingly move towards using AI-based algorithms to set their prices, those algorithms could make it easier for competitors to achieve and sustain collusion without any formal agreement or human interaction.

One form of problematic conduct involving AI-based pricing algorithms is where two or more humans agree to fix prices but, rather than agreeing an explicit price, agree to implement a joint pricing algorithm that coordinates prices on their behalf (i.e. human-to-human collusion on the choice of algorithm). Companies have already been prosecuted for such conduct in Europe, including for agreeing to reconfigure automated pricing software so as not to undercut each other,3 and for agreeing to implement an algorithm to allocate customers between each other.4 This does not differ substantially from traditional price fixing – it is still a cartel agreement between two people and is prohibited conduct under the Commerce Act. The only difference is how the agreement is implemented (i.e. using a common AI algorithm).
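To make the mechanism concrete, the following is a minimal, hypothetical sketch of a "do not undercut" repricing rule of the kind described above. The function name, margin and prices are illustrative assumptions only, and are not drawn from any actual case or product.

```python
# Hypothetical "do not undercut" repricing rule (illustrative only).


def reprice(own_cost: float, competitor_price: float, min_margin: float = 0.10) -> float:
    """Return a new price that tracks, but never undercuts, the competitor.

    If two competitors each agree to deploy a rule like this, neither will
    ever price below the other, so prices stabilise at the higher level -
    which is why agreeing to configure pricing software this way is treated
    as ordinary price fixing.
    """
    own_floor = own_cost * (1 + min_margin)
    # Match the competitor rather than beating them, but never sell below cost plus margin.
    return max(competitor_price, own_floor)


if __name__ == "__main__":
    # Two sellers of the same item, each running the shared rule.
    print(reprice(own_cost=20.00, competitor_price=34.99))  # 34.99 - matched, not undercut
```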

Another example of problematic conduct could arise where a business outsources its pricing function to a third party. If multiple competitors engage the same third party agent to set their prices using an identical algorithm (with the knowledge that their competitors are also engaging the same price-setting agent), there is a risk that this would amount to 'hub and spoke' collusion. For example, travel agents selling through an online platform in Lithuania were prosecuted when the platform's administrator unilaterally imposed technical restrictions preventing the independent travel agents from offering packaged tours at a discount exceeding 3%. The travel agents who knew about the restriction, and did not take any steps to oppose it, were fined for engaging in cartel conduct.5 Notwithstanding the lack of direct coordination between the competitors, the New Zealand Commerce Commission (NZCC) or the courts could similarly form the view that such an arrangement has the purpose or effect of controlling prices between the parties, using a third-party conduit (i.e. AI) – which is prohibited conduct under the Commerce Act.

The NZCC's response to the examples above would likely be straightforward: agreements to use the same AI algorithm, or to appoint a common third party pricing agent, would be treated as cartel conduct under its existing enforcement framework.

A greyer area arises where, with no human involvement or instruction, a price-setting AI algorithm teaches itself to coordinate with competitors – otherwise referred to as tacit algorithmic collusion. While it is unclear to what extent current AI technology allows for this kind of tacit coordination, a 2018 University of Bologna study found that where two competitors each give a pricing algorithm full autonomy to set prices, the two algorithms reach collusive, price-fixing outcomes more often than not.6
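A stripped-down version of that kind of experiment can be sketched as follows: two independent Q-learning agents repeatedly set prices in a simple duopoly, each observing only last period's prices and its own profits, with no communication between them. The demand model, price grid and learning parameters below are illustrative assumptions, not those used in the cited study; the point of interest is whether the learned prices settle above the competitive (one-shot Nash) level of roughly 2.0 in this toy market.

```python
# Illustrative sketch of tacit algorithmic collusion: two independent
# Q-learning price setters in a repeated duopoly. Parameters and demand
# are toy assumptions, not taken from the study cited above.

import random

PRICES = [1.5, 2.0, 2.5, 3.0]           # discrete price grid
COST = 1.0                              # common marginal cost
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.05  # learning rate, discount, exploration


def profits(p1: float, p2: float) -> tuple[float, float]:
    """Linear differentiated-products demand: q_i = 4 - 2*p_i + p_j."""
    q1 = max(0.0, 4 - 2 * p1 + p2)
    q2 = max(0.0, 4 - 2 * p2 + p1)
    return (p1 - COST) * q1, (p2 - COST) * q2


def choose(q_table: dict, state: tuple) -> float:
    """Epsilon-greedy choice over the price grid."""
    if random.random() < EPSILON:
        return random.choice(PRICES)
    return max(PRICES, key=lambda a: q_table.get((state, a), 0.0))


def run(periods: int = 200_000) -> tuple[float, float]:
    """Let both agents learn independently; return the prices they end on."""
    q1, q2 = {}, {}
    state = (random.choice(PRICES), random.choice(PRICES))  # last period's prices
    for _ in range(periods):
        a1, a2 = choose(q1, state), choose(q2, state)
        r1, r2 = profits(a1, a2)
        next_state = (a1, a2)
        for q, a, r in ((q1, a1, r1), (q2, a2, r2)):
            best_next = max(q.get((next_state, b), 0.0) for b in PRICES)
            old = q.get((state, a), 0.0)
            q[(state, a)] = old + ALPHA * (r + GAMMA * best_next - old)
        state = next_state
    return state


if __name__ == "__main__":
    # In this toy market the one-shot Nash price is about 2.0 and the joint
    # profit-maximising price is about 2.5; the question is where the agents land.
    print("Final prices:", run())
```

Neither agent is told anything about the other; any supra-competitive prices that emerge come purely from each algorithm learning that undercutting triggers lower prices (and profits) in later periods.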

Given that tacit algorithmic collusion does not involve any element of human agency, and is often carried out by systems whose decision-making processes cannot readily be explained, it is not clear how the NZCC would regulate and enforce against this type of cartel-like behaviour. This raises questions about whether regulators and policy-makers should revisit the concepts of "agreement" and "collusion" for competition law purposes, and whether there is a need to specifically regulate pricing algorithms.

European Commissioner Margrethe Vestager has indicated that the European Commission (EC) will likely take a strict liability approach to enforcement against cartel conduct by AI. Under this approach, a business that uses price-setting AI will be liable for any software-initiated price-fixing behaviour, even where humans did not initiate (or even understand) that behaviour. Vestager refers to this as "compliance by design":7

[Pricing] algorithms need to be built in a way that doesn’t allow them to collude. What businesses need to know is that when they decide to use an automated system, they will be held responsible for what it does. So they had better know how that system works.

While some overseas competition law authorities have expressed doubts about the effectiveness of this approach,8 and the NZCC has yet to bring any enforcement action in this space, it is possible that, when it does, it will also adopt a strict liability enforcement approach in New Zealand.

As there have not yet been any cases internationally alleging pure "collusion by machines" (that is, collusion without any underlying human agreement), the way that regulators, the courts, and policy-makers will approach these issues remains to be seen. However, it seems to be only a matter of time before the first cases are brought, and the conduct of machines faces the same scrutiny as the rest of us.

If you would like any advice regarding the issues discussed in this article, or assistance in getting the right legal protections in place for your business before implementing AI technology in your organisation, please do not hesitate to contact us.

To view the other articles in our "Implementing AI in your business" series, please visit our landing page here.

This article was first published by CIO New Zealand.  

FOOTNOTES
  1. Commerce Act 1986, s 30.
  2. OECD (2017), Algorithms and Collusion: Competition Policy in the Digital Age.
  3. CMA Case 50233 Online sales of posters and frames (12 August 2016).
  4. Case C-172/14 ING Pensii v Consiliul Concurentei (16 July 2015).
  5. Case C-74/14 Eturas v Lietuvos Respublikos konkurencijos taryba (21 January 2016).
  6. Emilio Calvano, Giacomo Calzolari, Vincenzo Denicolò and Sergio Pastorello, "Algorithmic Pricing: What Implications for Competition Policy?" (July 2018).
  7. Politico, "When Margrethe Vestager takes antitrust battle to robots" (May 2018).
  8. In 2017, the Chairman of the UK Competition and Markets Authority (CMA) expressed concern regarding this "compliance by design", strict liability approach to enforcement, asking "how far can the concept of human agency be stretched to cover these sorts of issues?"

 


This publication is intended only to provide a summary of the subject covered. It does not purport to be comprehensive or to provide legal advice. No person should act in reliance on any statement contained in this publication without first obtaining specific professional advice. If you require any advice or further information on the subject matter of this newsletter, please contact the partner/solicitor in the firm who normally advises you, or alternatively contact one of the partners listed below.
