Sustained advances in computing power and efficiency are lowering the cost of artificial intelligence (AI). Rising adoption of AI will, in turn, fuel demand for the companies enabling its expanding scale and improving cost competitiveness. Chip designers are at the heart of this trend.

Chart: AI has become 10 times cheaper every year
Cheapest LLM with a minimum MMLU score of 42, cost per million tokens (US$), log scale

This line chart shows the cost of selected large language models (LLMs) launched from 2022 to 2024, each the most advanced LLM at the time of its launch. The y-axis shows cost per million tokens of use on a log scale, reflecting the exponential decline in cost over the period. Overall, the chart demonstrates that the cost of LLMs has tumbled rapidly, falling tenfold in each of the past three years.

Source: Andreessen Horowitz, January 2025.

Note: Measuring Massive Multitask Language Understanding (MMLU) is a benchmark for evaluating the capabilities of LLMs. The MMLU score is the percentage of multiple-choice questions answered correctly by the model; a minimum MMLU score of 42 means that at least 42% of questions were answered correctly.
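The headline decline compounds quickly: a tenfold fall each year for three years is a thousandfold fall overall. A minimal sketch of the arithmetic, using an illustrative starting cost rather than figures from the chart:

```python
def cost_after_decline(start_cost: float, annual_factor: float, years: int) -> float:
    """Cost remaining after `years` of decline by `annual_factor` per year."""
    return start_cost / (annual_factor ** years)

# Hypothetical starting cost of US$10 per million tokens, falling
# tenfold annually for three years: 10 -> 1 -> 0.1 -> 0.01.
print(cost_after_decline(10.0, 10, 3))  # 0.01, i.e. a 1,000x decline overall
```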

Breathtaking advances in AI have hinged on the smart algorithms behind models, the colossal volumes of data that train them, and the computational power needed for them to learn on their own.1

The achievements of DeepSeek illustrate how innovative algorithms can deliver leaps forward in efficiency. In early 2025, the Chinese start-up launched a highly capable open-source large language model (LLM) that was significantly cheaper to develop and to run than established models from the likes of OpenAI and Meta.2

Such breakthroughs are relatively rare, however. Sustained year-on-year progress has instead largely been driven by another leg of the ‘AI triad’: compute efficiency. Progressively more efficient chip designs continue to enhance computational power and cost competitiveness, enabling the scaling of AI models.

Chip innovation is key to sustaining gains

Graphics processing units (GPUs) are particularly well suited to the repetitive parallel calculations that underpin AI models, and each contains billions of transistors. The most advanced transistors, the switches that control the flow of electrical signals inside chips, are now just 3 nanometres tall, many times smaller than a bacterium or a virus.3

Technical advances in chip manufacturing are slowing, though. One area of innovation is advanced multi-chip packaging – the process of stacking chips to improve performance and make better use of space. Vertical configurations of chips are driving power and efficiency gains in data centre servers that require high capacity and speed.4

The extraordinary complexity of cutting-edge chips is also sustained by sophisticated designs developed using electronic design automation software from the likes of US company Synopsys. Using its software, chip producers can now apply generative AI to accelerate and optimise the exploration of chip performance parameters and the verification of designs.

Customised designs can deliver further efficiencies

Advanced software is being used to develop more customised chips designed for bespoke rather than general purposes. Such application-specific integrated circuits (ASICs) can go further in delivering superior performance and workload efficiency in specific tasks, reducing energy usage and cost.

One of the leading designers of ASICs is Marvell Technology, whose specialisms include custom solutions for cloud service providers that help decrease the cost of AI model training and inference. We see the US company’s extensive intellectual property as a significant competitive advantage in the market for tailored cutting-edge chips.

Also focused on ASICs is Taiwan-listed MediaTek, which is leveraging its expertise in designing highly efficient chips for devices like smartphones to develop custom chips for use in data centres. MediaTek’s next-generation ASIC design platform integrates co-packaged optics solutions, which combine optics and silicon on one surface, to address the high-performance computing needs of modern data centres.

Beneficiaries of unbounded demand

Time after time, technological advances have demonstrated the endurance of the Jevons paradox: gains in efficiency spur increases in overall demand. As the cost per unit of output falls, advanced technology becomes less prohibitively expensive and appetite for AI-based services should soar across industries.

Irrespective of which AI models end up dominant, ever-expanding applications for AI-powered services will sustain demand for ever more sophisticated chip designs. Appetite for more efficient and powerful computing is logically infinite.

It is therefore our conviction that innovative chip designers whose products and services enable more efficient computing, and meet the specific needs of cloud service providers, will be key contributors to – and beneficiaries from – the rapid expansion of the AI ecosystem.


1 Sutton, R., 2019: The Bitter Lesson
2 Heikkilä, M., 29 January 2025: DeepSeek’s ‘aha moment’ creates new way to build powerful AI with less money. Financial Times
3 Rodgers, L. et al, 2024: Inside the miracle of modern chip manufacturing. Financial Times
4 McKinsey, 2023: Advanced chip packaging: How manufacturers can play to win


References to specific securities are for illustrative purposes only and should not be considered as a recommendation to buy or sell. Nothing presented herein is intended to constitute investment advice and no investment decision should be made solely based on this information. Nothing presented should be construed as a recommendation to purchase or sell a particular type of security or follow any investment technique or strategy. Information presented herein reflects Impax Asset Management’s views at a particular time. Such views are subject to change at any point and Impax Asset Management shall not be obligated to provide any notice. Any forward-looking statements or forecasts are based on assumptions and actual results are expected to vary. While Impax Asset Management has used reasonable efforts to obtain information from reliable sources, we make no representations or warranties as to the accuracy, reliability or completeness of third-party information presented herein. No guarantee of investment performance is being provided and no inference to the contrary should be made.
