Heightened antitrust scrutiny of algorithms and AI
Competition authorities are increasingly focused on the competitive risks posed by algorithms and artificial intelligence. In recent public remarks, the European Commission (Commission) has highlighted pricing algorithms as an area of concern, noting their potential to facilitate coordination between competitors. Similar themes have emerged in the UK, where the Competition and Markets Authority (CMA) has emphasised that while algorithms and AI can generate substantial efficiencies and consumer benefits, they may also enable novel forms of collusion. In a recent blog post, the CMA explained that it is actively screening for algorithm‑driven coordination and set out expectations for the practical steps companies should take to manage these risks.
For further background, please see our earlier publications on artificial intelligence and competition: AI: redefining the boundaries of competition (law), AI-enhanced competition enforcement in the EU and the second edition of the book Digital Competition Law in Europe.
CMA blog post
In its blog post, the CMA has been careful to stress that algorithmic pricing itself is not a recent innovation. Automated pricing tools have been used for many years across a variety of sectors, including aviation, hospitality and retail. This continuity is reflected in the CMA’s earlier analytical work, most notably its 2018 study into pricing algorithms and its 2021 market study examining the potential negative effects of algorithms on competition and consumers.
What has changed is the scale, sophistication and pervasiveness of these tools. Contemporary pricing systems can ingest highly granular data, operate continuously in real time, and increasingly rely on advanced machine‑learning techniques and large language models. At the same time, the cost and accessibility of predictive technologies have fallen sharply. As a result, businesses can now deploy powerful tools that shape (or fully automate) key commercial decisions, including pricing, with limited human intervention.
A separate CMA analysis has also examined so‑called “agentic AI”: autonomous software agents that optimise pricing or other strategic parameters on behalf of firms. The CMA’s research and analysis identifies a distinct competition risk where agents used by competing firms interact in ways that dampen competitive intensity, a phenomenon often described as “agentic collusion”.
Examples of enforcement in practice
Enforcement practice across jurisdictions demonstrates that the use of algorithms does not insulate firms from liability. On the contrary, authorities have repeatedly made clear that companies remain responsible for the conduct of their automated systems, even where decision‑making is partially delegated to software or where senior management claims limited insight into how an algorithm functions.
The Court of Justice of the EU (CJEU) ruled on 21 January 2016 (Case C-74/14, Eturas UAB and Others v. Lithuanian Competition Council) that a group of Lithuanian travel agencies violated competition law by using a shared online booking platform to cap customer discounts. The CJEU held that the travel agencies could be liable for a concerted practice (collusive behaviour) if they were aware of, and accepted, an automatic restriction of discounts, even without a formal agreement among them. In other words, the mere dissemination of a platform administrator’s message capping online discounts (and the technical implementation of that cap) sufficed to presume that the agencies engaged in a price-fixing concerted practice, unless they publicly distanced themselves or reported the matter to the authorities. Eturas thus confirms that “hub-and-spoke” collusion via a shared IT system can violate Article 101 TFEU without an explicit agreement.
On 24 July 2018, the Commission announced it had fined four consumer electronics manufacturers – Asus, Denon & Marantz, Philips, and Pioneer – a total of EUR 111 million for illegal resale price maintenance (RPM) in online sales. According to the Commission’s decision, the companies restricted their online retailers’ freedom to set prices and used software tools to monitor retail pricing. Notably, many large online retailers were using pricing algorithms to automatically adjust prices to match competitors, so the manufacturers’ fixed or minimum resale price policies had a broader impact across online marketplaces. The Commission found that the firms also employed “sophisticated monitoring tools” to detect price cuts and swiftly enforce the uniform pricing by threatening or sanctioning non-compliant retailers. This case underlines that using algorithms or price-monitoring software to sustain minimum resale prices is a breach of EU antitrust rules.
In July 2025, the Deputy Director-General at DG Competition publicly confirmed at a conference that several EU-level antitrust cartel investigations involving algorithmic pricing were underway, and that the Commission was actively examining multiple cases of “algorithm-driven” price coordination across different sectors.
In March 2026, at the American Bar Association Spring Meeting, the Commission’s Acting Director for Competition, McCallum, stated that the Commission is actively pursuing cases related to algorithmic pricing. While acknowledging that algorithmic pricing represents a relatively new area of enforcement, she emphasised that it is not exempt from antitrust scrutiny and that it may constitute an antitrust infringement. In addition, she referred to the recent appointment of a Chief Technology Officer, a newly created role within the Directorate‑General for Competition, and confirmed that the Commission is conducting a large‑scale review of pricing algorithms to identify potential red flags and anomalous pricing conduct.
In 2020, the Netherlands Authority for Consumers and Markets (ACM) published a position paper outlining how algorithmic applications fall within its competition, consumer protection, and sector‑specific oversight. The paper clarifies that algorithms used for price‑setting, personalisation, market coordination, or influencing consumer choice can be scrutinised under existing legal frameworks, regardless of whether they are rule‑based or self‑learning systems.
The ACM also highlights enforcement challenges such as limited transparency of complex models, reliance on third‑party developers, and cross‑border data dependencies. Overall, the paper establishes that algorithm‑driven conduct does not escape regulatory scrutiny simply because decisions are automated, and it provides the conceptual foundation for later ACM initiatives on algorithmic pricing and digital markets.
In July 2025, the ACM launched a market study into computer-driven consumer pricing in the aviation sector. The inquiry focuses on how airlines use dynamic and potentially personalised pricing algorithms to set ticket prices for consumers resident in the Netherlands. Rather than being triggered by suspected infringements, the study aims to better understand how data- and algorithm-based pricing systems operate in practice and what effects they have on consumers and competition.
According to the ACM’s published research approach and consultation paper, the authority is examining whether algorithmic pricing contributes to reduced price transparency, difficulties in price comparability, or risks of tacit coordination between pricing algorithms. Particular attention is paid to the use of large datasets, machine‑learning techniques, and demand-based pricing adjustments that can cause rapid price fluctuations for identical tickets. While the ACM has not made any findings of unlawful conduct at this stage, it has explicitly noted that algorithmic pricing may, in certain circumstances, undermine effective competition or consumer trust. In sum, the ACM’s initiative signals heightened supervisory scrutiny of algorithmic pricing in aviation, with a strong emphasis on transparency, consumer understanding, and the competitive effects of automated pricing systems.
The CMA investigated a cartel between two online retailers of posters and picture frames on Amazon’s UK Marketplace. On 12 August 2016, the CMA issued an infringement decision finding that Trod Ltd. and GB eye Ltd. (trading as “GB Posters”) colluded not to undercut each other’s prices on Amazon. The cartel was implemented through automated price-monitoring software – both companies programmed a repricing tool to coordinate and maintain higher prices, rather than compete. GB eye received immunity for alerting the CMA under the leniency programme, while Trod (which admitted the conduct) was fined £163,371. The case (settled in July 2016) is notable as one of the first UK cases in which an algorithm was used to implement a price-fixing agreement, and the CMA warned that the use of automated pricing tools is no defence where they are configured to breach competition law.
Across a series of RPM enforcement decisions in the musical instruments and lighting sectors, the CMA found that suppliers including Casio, Fender Europe, Korg, Roland, Dar Lighting Limited, and, in one case, retailer GAK (with Yamaha), used price‑monitoring software to detect and enforce minimum online resale prices. The total fines imposed amount to approximately £16.4 million. In these cases, real‑time monitoring tools enabled firms to quickly identify discounting retailers and intervene (through pressure, threats, or supply restrictions) when prices fell below specified thresholds. The CMA emphasised that while price‑monitoring software can be used legitimately, its deployment in these cases facilitated systematic RPM, contributing to sustained restrictions on online price competition and resulting in significant fines and infringement findings.
On 2 March 2026, the CMA announced a formal investigation into several major hotel chains (Hilton, IHG, Marriott) and a hotel data analytics firm (STR, owned by CoStar) for suspected anti-competitive information sharing. The CMA’s probe focuses on whether these hotel groups used a third-party “hotel data services provider” to exchange competitively sensitive business information (such as pricing and occupancy data) through an analytics platform. The concern is that by sharing real-time commercially sensitive information via a common data tool, competing hotels might reduce uncertainty about each other’s pricing and potentially coordinate their prices or business strategies, harming competition. The investigation was opened in late February 2026 (publicly confirmed by the 2 March 2026 press release) and was prompted by concerns that new data-driven tools and algorithms could be facilitating collusion or tacit price coordination in the UK hotel sector.
In July 2025, Italy’s competition authority (AGCM) concluded a market investigation into airline ticket pricing on routes to and from the islands of Sicily and Sardinia. The inquiry scrutinised the role of pricing algorithms and dynamic pricing systems used by airlines. AGCM’s findings (published in a Preliminary Report) did not establish evidence of unlawful collusion or price-fixing by airlines, but did raise concerns about a lack of price transparency and comparability for consumers. In particular, the authority noted that automated pricing and ancillary fee algorithms made it hard for travellers to compare the true final costs of flights, potentially dampening competition. As a result, AGCM opened talks with the Commission in July 2025, seeking EU-level measures to improve airfare transparency and comparability (especially regarding ancillary services such as baggage fees) in order to bolster competition. In sum, the Italian regulator highlighted that “sky-high airfares” and algorithmic pricing techniques require greater oversight and transparency, although it stopped short of enforcement action as no formal infringement was found.
In September 2025, the President of Poland’s Office of Competition and Consumer Protection (UOKiK), Tomasz Chróstny, publicly revealed that UOKiK is investigating potential price-fixing facilitated by algorithms in two sectors: banking and pharmaceuticals. According to a report by Global Competition Review (8 September 2025), Chróstny confirmed his agency’s concern that:
- Major banks in Poland may have used pricing algorithms fed with data from the nation’s largest credit risk database combined with internal data to coordinate interest rates and fees on consumer loans and mortgages, rather than competing independently.
- In the pharmaceutical wholesale sector, UOKiK is examining whether three leading drug distributors (controlling about 80% of the market) used shared IT systems to exchange sensitive information, such as medicine prices, profit margins, and sales volumes at pharmacies, to align their strategies.
These investigations remain ongoing. They illustrate UOKiK’s focus on algorithmic tools as possible “collusion facilitators”. Notably, UOKiK had previously taken enforcement action in 2022 against a group of pharmaceutical wholesalers for allegedly sharing sales data, showing a pattern of vigilance in this area.
Algorithmic pricing and competition law: managing exposure
The introduction of pricing software that relies on algorithms or artificial intelligence does not reduce a company’s responsibility under competition law. Liability cannot be avoided by pointing to third‑party vendors, automated decision‑making, or a lack of insight into the technical workings of a system. Businesses are expected to take ownership of how pricing outcomes are generated, including understanding the role of data inputs, the logic that connects those inputs to outputs, and whether similar tools are used by competitors. These considerations are particularly important where existing pricing systems are gradually supplemented with AI‑based features, potentially increasing risk without a clear moment of reassessment.
Close attention should be paid to the information processed by pricing tools and to the conditions under which that information is shared or accessed. The use of non‑public, commercially sensitive data requires special care, especially where the same software provider services multiple market participants. In such cases, even indirect mechanisms of information signalling or alignment may raise competition concerns.
Independent decision‑making remains a core requirement. Pricing software must support, rather than replace, autonomous commercial judgment. To that end, companies should implement effective internal controls, backed by well‑defined policies, supervision of day‑to‑day use, and training programmes focused on the specific risks posed by algorithmic and AI‑assisted pricing. Even when software appears technically compliant, employees must not coordinate with competitors on how tools are configured or deployed. This principle applies not only to automated price‑setting systems, but also to tools that track or benchmark market behaviour, which, despite being less sensitive, may still enable practices such as resale price maintenance.
Risk‑mitigation measures in practice
A structured compliance approach is recommended, which may include the following elements.
Tool selection and onboarding
Before deploying algorithmic pricing or AI‑enabled solutions, businesses should undertake thorough pre‑implementation assessments as part of their procurement processes. This includes examining how the model operates, where training and operational data originate, and whether outputs could reflect or reinforce competitors’ pricing strategies. Beyond the tool’s stated purpose, companies should consider how its capabilities could expand over time, including the potential for more autonomous decision‑making. Where tools are commonly used by actual or potential competitors, seeking legal guidance prior to adoption may be prudent.
Operational controls and governance
Clear rules should regulate how pricing tools are configured and used internally. Companies should retain documentation on system settings, data inputs, and any human intervention in pricing outcomes. Significant changes to algorithms or functionality should be subject to internal review. Particular restraint is warranted when considering whether to input non‑public or sensitive information, supported by internal processes to assess data sensitivity and, where necessary, apply safeguards such as aggregation or delayed use.
Review and testing mechanisms
Ongoing oversight is essential. Regular reviews should examine not only outcomes, but also the integrity of underlying data, access rights, override mechanisms, and employee interaction with the system. Where AI tools incorporate language‑based models, behavioural or linguistic stress‑testing may help identify unintended risks, and technical guardrails should be implemented where possible.
Awareness and training
Employees involved in pricing decisions should receive focused training on the competition law implications of algorithmic tools and on the risks of sharing or indirectly exchanging sensitive information, including through platforms, consultants, or software providers. Maintaining awareness is critical to ensuring that technological efficiency does not come at the expense of legal compliance.
Conclusion
Loyens & Loeff closely monitors the enforcement of competition rules in the digital sector. The actions of competition authorities across jurisdictions reflect an increased focus on addressing competition concerns relating to the use of algorithms and AI, and on ensuring fair competition in rapidly evolving technology markets.
Contact
If you have any questions or would like to explore the implications of these developments for your business, please feel free to get in touch with one of the advisers mentioned below.