The impact of algorithms on market dynamics has come under increasing scrutiny from competition authorities around the world in recent years. On 19 January 2021, the CMA published a report following its investigation into algorithms and the potential harm they can cause to competition and consumers. The paper examines how algorithms are used in the commercial world, the ways in which harm can arise, and the options open to the CMA to monitor their use. The CMA reviewed algorithms in a broad context, under the term ‘algorithmic system’, meaning the intersection of algorithms, data, models, processes and objectives, and how people interact with and use these systems.
The key takeaway is that although technology and algorithms can benefit consumers, enhancing their ability to use the web efficiently, there are risks inherent in machine learning of which both regulators and businesses need to be aware. The CMA noted that as algorithmic systems become more sophisticated, they often become less transparent, making it more challenging to identify when they cause harm.
Personalisation
A key area of concern for the CMA relates to the personalisation of websites, particularly in relation to consumers who are vulnerable or who have protected characteristics under the Equality Act 2010. Personalisation can be harmful because it is difficult for consumers or others to detect and can have unfair distributive effects. These harms often occur through the manipulation of consumer choices without the consumer being aware of it.
Given the mass availability of customer data and advanced algorithmic systems that can analyse and extrapolate from that data, businesses can learn more than ever about who is buying from them. This data can ultimately influence what a consumer sees, potentially to their detriment. For example, algorithms can determine which product comes up first when a consumer searches a website, yet the average consumer cannot tell how or why search results are presented in any given order. Companies are therefore able to steer consumers into decisions that are more profitable for the firm but which the consumer would not have made under more objective or neutral conditions. For example, the CMA's investigation into Booking.com concluded that the comparison site listed hotels that had paid more commission higher up in its results. For more information on the Booking.com case, see this Bird & Bird article.
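To make the mechanism concrete, the following minimal sketch shows how a blended ranking score can quietly promote higher-commission listings. It is a hypothetical illustration, not Booking.com's actual system; the names Hotel, rank_hotels and commission_weight, and all the figures, are our own assumptions.

```python
from dataclasses import dataclass

@dataclass
class Hotel:
    name: str
    relevance: float   # how well the hotel matches the search (0-1)
    commission: float  # commission rate paid to the platform (0-1)

def rank_hotels(hotels: list[Hotel], commission_weight: float = 0.5) -> list[Hotel]:
    # Blend relevance with commission paid; the consumer only ever sees
    # the final ordering, never the weighting that produced it.
    def score(h: Hotel) -> float:
        return (1 - commission_weight) * h.relevance + commission_weight * h.commission
    return sorted(hotels, key=score, reverse=True)

hotels = [
    Hotel("Best match, low commission", relevance=0.9, commission=0.05),
    Hotel("Weaker match, high commission", relevance=0.6, commission=0.40),
]
print([h.name for h in rank_hotels(hotels)])
# ['Weaker match, high commission', 'Best match, low commission']
# (scores: 0.5*0.6 + 0.5*0.40 = 0.50 vs 0.5*0.9 + 0.5*0.05 = 0.475)
```

Setting commission_weight to zero would restore a purely relevance-based ordering; the concern identified by the CMA is that the weighting is invisible to the consumer.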
Self-preferencing
Consumers can also be affected where platforms self-preference. This can occur where a dominant platform favours its own products and services when they compete with rival products offered on the same platform. The Google Shopping case demonstrated this type of harm: the European Commission found that Google had infringed competition law by positioning and displaying its own Google Shopping service more favourably in its general search results pages than competing comparison shopping services.
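Mechanically, self-preferencing can be as simple as adding a fixed boost to the platform's own listings before sorting. The sketch below is purely illustrative (Google's actual systems are far more complex and were assessed on their specific facts); rank_results, own_boost and the scores are invented for this example.

```python
def rank_results(results: list[dict], own_boost: float = 0.2) -> list[dict]:
    # Sort listings by base quality score, plus a flat bonus for the
    # platform's own offerings -- the self-preferencing step.
    return sorted(
        results,
        key=lambda r: r["score"] + (own_boost if r["own_brand"] else 0.0),
        reverse=True,
    )

listings = [
    {"name": "Rival comparison service", "score": 0.85, "own_brand": False},
    {"name": "Platform's own service", "score": 0.70, "own_brand": True},
]
print([r["name"] for r in rank_results(listings)])
# ["Platform's own service", 'Rival comparison service'] -- the platform's
# weaker service outranks a better rival purely because of the boost.
```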
Cartels
Algorithms could also enhance the ability of cartels to operate ‘under the radar’ unless the technology is closely monitored. The CMA identified three ways this could occur:
- algorithms can facilitate explicit coordination, for example by monitoring rivals’ prices and automatically matching or punishing any deviation from the agreed level (illustrated in the sketch below);
- ‘hub-and-spoke’ arrangements, where competitors use the same third-party pricing algorithm or data pool, which can align prices without direct contact between the competitors; and
- autonomous tacit collusion, where self-learning pricing algorithms independently arrive at coordinated, supra-competitive prices without any human intention to collude.
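The first of these mechanisms can be illustrated with a deliberately simple monitor-and-match pricing loop. This is an invented sketch with made-up numbers, not any real firm's software; deploying such logic pursuant to an agreement between competitors would itself be unlawful.

```python
FLOOR = 10.0  # hypothetical 'agreed' minimum price between two firms

def react(rival_price: float) -> float:
    # Match the rival's latest price, but never price below the floor.
    return max(FLOOR, rival_price)

price_a, price_b = 12.0, 10.5   # firm B attempts to undercut firm A
for period in range(3):
    price_a = react(price_b)    # A's algorithm detects and matches B
    price_b = react(price_a)    # B gains no lasting share from the cut
    print(f"period {period}: A={price_a:.2f}, B={price_b:.2f}")
# Both prices settle at 10.50 and never erode towards cost: any cut is
# matched within one period, so undercutting stops paying off and the
# floor holds without any further human contact.
```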
The CMA suggests that businesses should be aware of these risks when developing and using algorithmic technologies. Other regulators, for example in the energy and financial services sectors, have already issued guidance on the risks of market manipulation arising from algorithmic trading and on ensuring compliance (see, for example, the Financial Conduct Authority's report).
The case for intervention
The CMA concludes that one of the issues it faces is that it will often not be able to access these algorithms in order to review them. Furthermore, where algorithms are particularly advanced, companies will need to provide the CMA with guides or explainers as to their use and effect. The opacity of algorithmic systems and the lack of operational transparency make it hard for consumers and customers to challenge them. Ultimately, algorithms may themselves be used by competition authorities to detect anti-competitive behaviour, something the Korea Fair Trade Commission has considered. Other tools to explore could include requiring compliance safeguards to be pre-programmed into algorithms.
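As a purely hypothetical illustration of that detection point (not the KFTC's or the CMA's actual methodology), a first-pass screen might flag pairs of competitors whose observed prices move in near lockstep, prompting further investigation rather than proving collusion:

```python
from itertools import combinations

def pearson(x: list[float], y: list[float]) -> float:
    # Plain Pearson correlation between two equal-length price series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented daily prices from three hypothetical retailers
prices = {
    "firm_a": [9.9, 10.1, 10.4, 10.4, 10.8, 11.0],
    "firm_b": [9.8, 10.0, 10.5, 10.4, 10.7, 11.1],
    "firm_c": [10.2, 9.7, 10.9, 9.9, 10.3, 10.1],
}

# Flag pairs moving almost in lockstep -- a screen, not proof of collusion.
for (fa, xa), (fb, xb) in combinations(prices.items(), 2):
    r = pearson(xa, xb)
    if r > 0.95:
        print(f"possible parallel pricing: {fa} vs {fb} (r={r:.2f})")
# prints: possible parallel pricing: firm_a vs firm_b (r=0.98)
```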
As the technology develops and becomes more subtle, such harms will become even harder to detect. With the creation of the new Digital Markets Unit, the CMA has demonstrated that it will be closely monitoring technology and its effects on competition.
For more information please contact Peter Willis, Ariane Le Strat or Chloe Birkett.