
Revolutionary Photonic AI Hardware Lights Up with a $154 Million Funding Boost

The Emergence of Photonic Computing and Its Role in AI

The world of Artificial Intelligence (AI) has grown by leaps and bounds in recent years, with companies employing complex algorithms and state-of-the-art hardware to train models that accurately predict future outcomes. This advancement, however, has come at a cost. The traditional chips used in this process, namely GPUs and TPUs, are approaching their limits in density and computational speed, resulting in equipment that consumes massive amounts of power and produces significant waste heat.

Thankfully, photonic computing, the use of photons, or light, to transmit and process information, has emerged as a promising alternative for the AI industry. Lightmatter, a company specializing in photonic computing, claims its technology can revolutionize the AI industry, allowing it to scale up while reducing power consumption. Its technology uses microscopic optical waveguide arrays to perform logical operations with light, expediting training while remaining easy to scale.

The concept of optical computing is not entirely new, but until now it has been mostly confined to academic research, primarily because of the cost and technological limitations of producing and scaling the technology. Lightmatter's photonic computing, however, promises to be practical and feasible: a credible answer to the industry's looming power wall.

The Advantages of Photonic Computing

The limitations of traditional computing have forced companies to find alternative technologies. The rise of photonic computing directly addresses these limitations. Below, this article outlines some of the advantages of photonic computing:

1. Faster Computing Speeds

Current GPUs and TPUs are limited by the speed at which they can move electrical signals through a chip, which introduces latency and slows down the training process. Photonic computing, by using light instead of electricity, promises very high bandwidth and low latency, allowing data centers to process vast amounts of data at a much faster rate.

2. Cost-Effective and Energy Efficient

The issue of energy consumption is a critical and timely concern for the AI industry. Photonic computing operates using microscopic optical waveguide arrays that are passive, so light performs the work as it passes through, meaning the hardware consumes significantly less power than traditional computing devices. The supercomputers that currently perform AI work are massive and require vast amounts of energy, posing environmental challenges. Photonic computing offers a more eco-friendly answer to the energy demands of the AI industry in particular, and the environment in general.

3. Scalability

Photonic computing technology is both readily scalable and flexible, meaning it can adapt to future changes and advancements. Its optical interconnects offer an easy-to-scale solution, making it well suited to large operations.

The Lightmatter Photonic Computing Solution

Lightmatter, a startup at the forefront of photonic computing research, has emerged as a leading developer of this technology. Its photonic chips use Optical Processing Units (OPUs), in which waveguide arrays allow light to pass through and perform logical operations. These chips offer flexibility and can process data at astonishing speeds, while the fact that data moves through the chip as light instead of electricity means they use minimal amounts of power.

Lightmatter has pointed out that its technology can work in tandem with other chips, including GPUs and TPUs, in more general-purpose computing situations. However, its claim of being particularly useful in processing the dense mathematical operations found in AI work cannot be ignored.

With the Photonic Turn, the Future of AI Looks Bright

Photonic computing technology offers a compelling promise for the industry, and with researchers and companies like Lightmatter at the forefront, it is likely to gain traction. As improved training methods and techniques continue to emerge, conventional computing hardware will hit its limits. Photonic computing represents the photonic turn the industry needs to continue to scale. Lightmatter's technology offers the industry an eco-friendly, scalable, and efficient solution that can handle vast amounts of data with minimal power usage. The emergence of photonic computing points to a brighter future for the AI industry, and it is worth watching closely to see how it progresses.

Summary:

Lightmatter's photonic computing technology offers the AI industry a practical and cost-effective alternative to current computing methods. Its microscopic optical waveguide arrays allow light to pass through and perform logical operations, speeding up the process while minimizing energy consumption. Photonic computing is faster, more energy-efficient, and scalable, and looks set to revolutionize the AI industry. Lightmatter's ability to send light of different wavelengths through the same waveguides creates "virtual chips" that multiply the computation performed. With its full stack of products, the company says it can integrate seamlessly with PyTorch and TensorFlow, importing neural networks built in these industry-standard frameworks. The industry is ready for photonic computing, and the future of AI looks bright.

—————————————————-


Photonic computing startup Lightmatter is taking its big break in the fast-growing AI computing market with a combination of hardware and software that it says will help the industry level up and save a lot of electricity to boot.

Lightmatter's chips basically use optical flow to solve computational processes such as matrix-vector products. This math is at the heart of a lot of AI work and is currently done by GPUs and TPUs that specialize in it but use traditional transistors and silicon gates.
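To make the connection concrete, here is a minimal NumPy sketch of the operation in question: a dense neural-network layer is, at its core, a matrix-vector product followed by a nonlinearity. This is ordinary illustrative code, not anything specific to Lightmatter's hardware.

```python
import numpy as np

# A dense layer computes y = W @ x, then applies a nonlinearity.
# This multiply-accumulate pattern is the workload GPUs, TPUs, and
# photonic arrays all target.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # weight matrix: 4 outputs, 3 inputs
x = rng.standard_normal(3)        # input activation vector

y = W @ x                         # the matrix-vector product itself
y = np.maximum(y, 0.0)            # ReLU nonlinearity

print(y.shape)  # (4,)
```

Training and inference for large models repeat this pattern billions of times, which is why accelerating it dominates AI hardware design.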

The problem with them is that we are approaching the limits of density and therefore speed for a given wattage or size. Advances are still being made, but at great cost and pushing the limits of classical physics. The supercomputers that make training models like GPT-4 possible are huge, consume enormous amounts of power, and produce a great deal of waste heat.

“The world’s largest companies are facing an energy power wall and experiencing massive challenges with AI scalability. Traditional chips are pushing the limits of what is possible to cool, and data centers are producing ever larger power footprints. AI advancements will slow significantly unless we implement a new solution in data centers,” said Lightmatter CEO and founder Nick Harris.

Some have projected that training a single large language model can consume more energy than 100 American homes consume in a year. Furthermore, there are estimates that 10-20% of the world's total power will be spent on AI inference by the end of the decade unless new computing paradigms are created.
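As a rough sanity check on the "100 homes" comparison, the arithmetic can be sketched with assumed figures: roughly 10,600 kWh per year for an average US household and roughly 1,300 MWh for training one GPT-3-scale model. Both numbers are commonly cited estimates, not figures from this article.

```python
# Back-of-the-envelope check; both inputs are assumed estimates.
home_kwh_per_year = 10_600      # assumed average US household usage
training_mwh = 1_300            # assumed GPT-3-scale training energy

homes_equivalent = training_mwh * 1_000 / home_kwh_per_year
print(round(homes_equivalent))  # on the order of 100+ homes for a year
```

Under these assumptions the estimate lands comfortably above 100 home-years, consistent with the projection quoted above.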

Lightmatter, of course, claims to be one of those new paradigms. Their approach is, potentially at least, faster and more efficient, using microscopic optical waveguide arrays to allow light to perform logical operations simply by passing through them: a sort of analog-digital hybrid. Since waveguides are passive, the main power consumption is creating the light itself, then reading and driving the output.

One really interesting aspect of this form of optical computing is that you can increase the power of the chip simply by using more than one color at a time. Blue does one operation while red does another, although in practice it’s more like a wavelength of 800 nanometers does one, and 820 does another. It’s not trivial to do, of course, but these “virtual chips” can vastly increase the amount of computation done on the array. Twice the colors, twice the power.
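The multiplexing idea above can be sketched in a toy model: one fixed weight array serves two independent data streams at once because each stream rides on its own wavelength. The wavelength labels here are purely illustrative; this is a simulation of the throughput argument, not of the optics.

```python
import numpy as np

# Toy model of wavelength multiplexing: one physical waveguide array
# (modeled as a fixed matrix W) processes two inputs in the same pass,
# one per "color". The 800 nm / 820 nm labels are just dictionary keys.
rng = np.random.default_rng(1)
W = rng.standard_normal((4, 4))          # one physical array

signals = {
    "800nm": rng.standard_normal(4),     # first data stream
    "820nm": rng.standard_normal(4),     # second, independent stream
}

# Both streams traverse the same array simultaneously:
results = {wl: W @ x for wl, x in signals.items()}

# Twice the colors, twice the throughput from the same hardware.
print({wl: y.shape for wl, y in results.items()})
```

The point of the sketch is that the weights (the array) are shared while the data (the light) is multiplied, which is why adding wavelengths scales computation without adding silicon.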

Harris founded the company based on the optical computing work he and his team did at MIT (which is licensing the relevant patents to them) and raised an initial round of $11 million in 2018. An investor said at the time that "this is not a science project," but Harris admitted in 2021 that while they knew "in principle" that the technology should work, there was a lot to do to make it operational. He was telling me that in the context of investors putting another $80 million into the company.

Now Lightmatter has raised a $154 million C round and is gearing up for its real debut. It has multiple pilots running with its full stack of Envise (computing hardware), Passage (interconnect, crucial for large computing operations), and Idiom, a software platform that Harris says should let machine learning developers adapt quickly.

A Lightmatter Envise unit in captivity. Image Credits: Lightmatter

“We have created a software stack that integrates seamlessly with PyTorch and TensorFlow. The workflow for machine learning developers is the same from there: we take the neural networks built into these industry-standard applications and import our libraries, so all the code runs in Envise,” he explained.
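To illustrate what "the workflow stays the same" would mean in practice, here is an ordinary PyTorch model of the kind such a stack would consume. Nothing below is Lightmatter-specific, and no Idiom API is shown, since its interface isn't documented in the article; the claim quoted above is that standard code like this runs unchanged, with the underlying matrix math redirected to the photonic hardware.

```python
import torch
import torch.nn as nn

# A completely standard PyTorch network. Each nn.Linear layer is a
# matrix-vector (or matrix-matrix) product -- exactly the operation a
# photonic array targets -- so a compiler stack can intercept these
# layers without the developer changing the model code.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

x = torch.randn(1, 784)   # one flattened 28x28 input, e.g. MNIST-sized
logits = model(x)
print(logits.shape)       # torch.Size([1, 10])
```

The design point is that acceleration happens below the framework boundary: the developer keeps writing PyTorch or TensorFlow, and the backend decides where the matrix math runs.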

The company declined to make any specific claims about speedups or efficiency improvements, and because it’s a different computing architecture and method, it’s hard to make apples-to-apples comparisons. But we are definitely talking about an order of magnitude, not a measly 10% or 15%. Interconnect is similarly upgraded, as it’s useless to have that level of processing isolated on one board.

Of course, this isn't the kind of general-purpose chip you might use in your laptop; it is highly specific to this task. But it's task-specific hardware at this scale that seems to be what AI development needs, although "holding back" is the wrong phrase, as the field is moving at great speed. That development is simply enormously expensive and unwieldy.

Pilots are in beta and mass production is planned for 2024, by which time presumably they should have enough feedback and maturity to implement in data centers.

Funding for this round came from SIP Global, Fidelity Management & Research Company, Viking Global Investors, GV, HPE Pathfinder and existing investors.

Lightmatter’s photonic AI hardware is ready to shine with $154M in new funding

