02 Apr 2025
Oxford spin-out is latest to target AI with claims of optical processing 'breakthrough' to address compute and power challenges.
Lumai, a spin-out from the UK’s University of Oxford working on a radical optical computing approach to artificial intelligence (AI) that goes beyond integrated photonics, says it has secured venture investment of more than $10 million.
Announced during the Optical Fiber Communications (OFC) conference taking place in San Francisco this week, the funding round was led by Cayman Islands-registered investor Constructor Capital and will be used to aid product development, double the firm’s headcount, and create a presence in the US.
Other backers include the Dutch deep-tech investment platform PhotonVentures and existing UK investor IP Group, alongside participation from Israel-based Journey Ventures, Italy’s LIFTT, quantum-focused Qubits Ventures, US-based State Farm Ventures, and Japan’s TIS.
“Lumai’s revolutionary optical computing technology will help AI data centers dramatically reduce costs and boost performance - while simultaneously minimizing energy consumption,” announced the firm, whose technology is based around optics research by two of the company’s co-founders, Xianxin Guo and James Spall.
Guo and Spall are both named on a pending US patent application entitled "Optical multiplication system and optical multiplication method", as well as a 2022 paper in the journal Optica that details a training method for optical neural networks.
AI limitations
Citing a December 2024 analysis by Lawrence Berkeley National Laboratory suggesting that rapidly escalating demand from data centers could account for as much as 12 per cent of US electricity consumption by 2028, Lumai believes that its optical processing approach could make a huge difference.
“Lumai is tackling the limitations of AI compute by using optical processing to accelerate large language models (LLMs) and other transformer-based AI,” explains the company.
“Lumai has succeeded where others have failed, overcoming the scalability challenges of optical computing. Its technology processes AI’s core arithmetic operations within optical beams traveling through three-dimensional space, bypassing the limits of silicon GPUs and integrated photonics.”
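The “core arithmetic operations” in question are, for transformer-based models, dominated by matrix-vector products between a layer’s weight matrix and an activation vector. The minimal NumPy sketch below shows that operation class and its arithmetic cost; the sizes are illustrative assumptions only, since Lumai has not published details of its implementation.

```python
# Minimal sketch (not Lumai's implementation, which is not public): the core
# arithmetic operation behind transformer inference is the matrix-vector
# product between a layer's weight matrix and an activation vector.
# An optical accelerator would aim to perform exactly this step in free space.
import numpy as np

d_model = 4096                            # illustrative hidden size, not a Lumai figure
W = np.random.randn(d_model, d_model)     # layer weight matrix
x = np.random.randn(d_model)              # activation vector for one token

y = W @ x                                 # ~2 * d_model**2 multiply-adds per token, per layer

ops = 2 * d_model ** 2
print(f"{ops:.2e} arithmetic operations for a single {d_model}x{d_model} layer")
```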
Said to utilize low-cost optical components, the approach is claimed to enable cost-effective, high-performance AI “inference” - the ability of an AI model to draw accurate conclusions from data it has not previously seen.
Lumai suggests that adopting its optical computing approach would reduce AI inference costs by an order of magnitude compared with today’s state-of-the-art LLMs that rely on large GPU chips made by the likes of Nvidia.
‘Optical matrix multiplier’
“Lumai’s unique design will deliver 50 times the performance of silicon-only accelerators while using just 10 per cent of the power required for AI in data centers, lowering both capital costs and total cost of ownership,” claims the firm.
Tim Weil, CEO and another of the co-founders, said in a release announcing the funding round: “The future of AI demands radical breakthroughs in computing. The cost of current LLMs is unsustainable, and next-generation AI won’t happen without a major shift.
“Lumai’s innovative optical computing design overcomes the scalability challenges that have held others back and dramatically reduces power consumption, which will drive down the cost of AI.”
On Lumai’s website, the team describes the heart of the AI processor as an “optical matrix multiplier”, in which beams of light perform the calculations.
“By using very wide vectors and high optical clock speeds, data is processed at least 50 times faster and 90 per cent more efficiently than today’s solutions,” the team writes.
“We are developing an optical matrix-vector multiplier (MVM) using a technology that has a near-term speed limit of up to 10¹⁷ operations per second - 100 times faster than the human brain and 1000 times faster than traditional electronics.”
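For context, the throughput of any matrix-vector engine scales roughly as 2 × N² operations per pass, multiplied by the pass rate. The back-of-envelope sketch below uses assumed figures for the vector width and optical clock rate, purely to show how wide vectors combined with gigahertz-class modulation could reach the 10¹⁷ operations-per-second regime quoted above; Lumai has not disclosed its actual parameters.

```python
# Back-of-envelope check of the "wide vectors x high clock speed" claim.
# All numbers here are assumptions for illustration, not Lumai specifications.
vector_width = 10_000      # assumed optical vector width (elements per beam)
clock_rate_hz = 1e9        # assumed optical modulation rate (1 GHz)

# One full N x N matrix-vector product costs ~2 * N^2 multiply-adds.
ops_per_pass = 2 * vector_width ** 2
throughput = ops_per_pass * clock_rate_hz

print(f"~{throughput:.1e} operations per second")   # ~2e17 with these assumptions
```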
Investor views
Serg Bell, the founder and chairman of Constructor Capital, noted: “Life and intelligence are a large carbon- and electron-based neural model trained over 2 billion years.
“Fossil fuels are a by-product of this evolution, and they may not generate enough energy to create a better model if we continue using electron-based computation.
“We need more efficient, faster energy sources for the next generation of humanity's neocortex: Artificial General Intelligence. Photons are the only known choice. Lumai’s technology is a significant step forward in improving matrix multiplication, similar to the advancements quantum computers offer for other computational scenarios.”
Lee Thornton, a partner at IP Group, added: “Having solved the challenges of optical compute to provide a low-cost, scalable solution, Lumai’s technology has the potential to transform the future of AI.”
Ewit Roos from PhotonVentures commented that Lumai was “fundamentally reshaping the future of AI compute”, and hailed the startup as “one of the most compelling opportunities in next-generation data center technology”.
© 2025 SPIE Europe