Photonics is proving to be a tough nut to crack
The growing computing power needed to train sophisticated AI models like OpenAI's ChatGPT could eventually hit a wall with mainstream chip technologies.
In a 2019 analysis, OpenAI found that from 1959 to 2012 the amount of computing power used to train AI models doubled every two years, and that after 2012 it began increasing roughly seven times faster.
It’s already causing tension. Microsoft is reportedly facing an internal shortage of the server hardware needed to run its AI, and the shortage is driving up prices. CNBC, speaking with analysts and technologists, estimates the current cost of training a ChatGPT-like model from scratch at over $4 million.
One proposed solution to the AI training dilemma is photonic chips, which use light to send signals instead of the electricity used by conventional processors. In theory, photonic chips could deliver higher training performance because light produces less heat than electricity, travels faster, and is far less susceptible to temperature changes and electromagnetic fields.
Lightmatter, LightOn, Luminous Computing, Intel and NTT are among the companies developing photonic technologies. But while the technology generated a great deal of excitement and investment a few years ago, the sector has cooled noticeably since then.
There are a number of reasons why, but the general message from investors and analysts studying photonics is that photonic chips for AI, while promising, are not the panacea they were once thought to be.