

Britain’s £900m Supercomputer Funding “Too Modest,” Says Former Chancellor

Former chancellor Sajid Javid has criticized the government’s £900m plan to invest in a supercomputer to boost artificial intelligence research. Speaking at the London Tech Week conference, Javid warned that the proposal was too modest and risked leaving the UK lagging behind global rivals. He said that “billions” of pounds were needed to deliver a machine with the capacity to compete on a global scale.

Hunt’s Supercomputer Plans

Current chancellor Jeremy Hunt announced the funding in March, claiming that “artificial intelligence needs computer power.” Hunt pledged to implement the recommendations in an independent Future of Compute Review, with the aim of delivering an “exascale” machine by 2026. Exascale computers are capable of processing vast amounts of data quickly and can support scientific breakthroughs in areas such as climate change and security.

However, Javid argued that Hunt’s plans would produce a facility with the same processing power as OpenAI’s ChatGPT had in 2022. This would leave the UK behind global rivals like the US, which have invested billions in similar supercomputing projects.

Javid called for faster progress, citing the need for large-scale investment and a sense of urgency to keep pace with the rapidly evolving AI ecosystem. His concerns were echoed by former prime minister Sir Tony Blair and former Conservative leader Lord Hague, who warned of the “risk of never recovering” if the UK does not act quickly.

Silicon Valley’s Supercomputing Resources

The sums pledged by the UK government are comparable to those behind similar supercomputer projects in the US and the EU. However, they lag far behind the multi-billion dollar resources available to Silicon Valley’s tech giants. OpenAI raised $1bn from Microsoft in 2019 to build the large language model (LLM) technology that powers ChatGPT, its popular chatbot. Earlier this year, Microsoft made a further multibillion-dollar investment in OpenAI, enabling the company to release GPT-4, an updated version of its AI model.

The US has allocated up to $1.8bn for two exascale supercomputers. Last year, the Frontier supercomputer at the US Department of Energy’s Oak Ridge National Laboratory became the first to break the exascale barrier, recording more than 1 quintillion calculations per second. Meanwhile, Europe’s first exascale computer, Jupiter, is set to launch later this year. The computer will be housed in the Jülich Supercomputing Center in Germany and jointly funded by Brussels and the German government to the tune of up to €500m.
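For context on the units involved: the “exa” prefix denotes 10^18, so an exascale machine performs at least a quintillion floating-point calculations per second. The short Python sketch below works through that arithmetic; the world-population figure used for scale is an illustrative assumption rather than a number from the article.

```python
# Minimal sketch of the unit arithmetic behind "exascale": the "exa" prefix
# means 10^18, so one exaflop is one quintillion calculations per second.
FLOPS_PER_EXAFLOP = 10**18

frontier_exaflops = 1.1                      # Frontier's reported benchmark score
frontier_flops = frontier_exaflops * FLOPS_PER_EXAFLOP

print(f"Frontier: {frontier_flops:.3e} calculations per second")
# -> Frontier: 1.100e+18 calculations per second, i.e. more than a quintillion

# For scale, spread those calculations across the world's population
# (a rough 8bn figure, used purely as an illustrative assumption).
world_population = 8_000_000_000
print(f"~{frontier_flops / world_population:,.0f} calculations per person, per second")
```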

The UK Government’s Response

At the time of writing, the UK government had not responded publicly to Javid’s comments. However, the Department for Science, Innovation and Technology has been contacted for comment.

Overcoming the Challenges of AI Innovation

Building a supercomputer capable of competing on a global scale presents a significant challenge, but it is just one of many challenges facing AI innovation today. As Javid and others have noted, AI is evolving rapidly, and innovators must keep pace to remain competitive. However, this is easier said than done.

One significant challenge is the shortage of skilled AI professionals, which is not unique to the UK. Across the globe, there is a growing demand for AI specialists, but a limited supply. This gap presents an opportunity for upskilling and reskilling the existing workforce, as well as investing in training programs for the next generation of AI innovators.

Another major challenge is ethical and governance concerns. As AI systems become increasingly sophisticated, the potential risks associated with their misuse or unintended consequences become more significant. This can include everything from biases in AI algorithms to the automation of jobs and the impact on human rights. Addressing these concerns requires a multi-stakeholder approach, involving governments, civil society, and the private sector.

Finally, funding remains a significant barrier to AI innovation, as Javid and others have noted. While the UK’s £900m investment may seem substantial, it pales in comparison to the resources available to Silicon Valley’s wealthiest companies. Governments around the world must recognize the importance of funding AI research and development if they hope to compete on a global scale.

Conclusion

Javid’s criticism highlights the challenges facing the UK’s AI industry, above all the need for large-scale investment and a sense of urgency. As AI continues to evolve, innovators must keep pace with a rapidly changing ecosystem. Meeting that challenge requires a multi-stakeholder approach involving governments, civil society, and the private sector. With the right investments, governance frameworks, and upskilling programs, the UK can remain competitive in AI innovation.

—————————————————-


Britain’s £900m plan to build a supercomputer to boost research into artificial intelligence is too modest and will leave the country lagging behind rivals, former chancellor Sajid Javid warned on Tuesday.

Javid told the London Tech Week conference that “billions” would need to be invested in the project to deliver 10 times the capacity of the “exascale” machine promised by current chancellor Jeremy Hunt in his March budget.

Hunt announced the £900m in funding for the supercomputer to implement recommendations from an independent Future of Compute Review, saying that “artificial intelligence needs computer power”.

But Javid said the government’s plans would produce, by 2026, a facility with only the same power as OpenAI’s ChatGPT had in 2022, leaving Britain behind rivals such as the United States.

“It’s nowhere near where we need to be,” he said. “The investment cannot be in the hundreds of millions; it will have to be in the billions. I don’t think we have a choice.”

Javid, a former adviser to US artificial intelligence company C3 AI, said speed was of the essence and the project needed to get going much faster.

He added that while Rishi Sunak understood the issues (the prime minister is convening the first global AI summit in Britain in the autumn), the Whitehall system was “woefully unprepared” for the AI revolution.

His comments echoed a warning from Sir Tony Blair, the former Labour prime minister, and Lord Hague, the former Tory leader, who wrote in a report this week that if the UK does not act quickly it faces the “risk of never recovering”.

Exascale computers require highly specialized equipment, including tens of thousands of processors and intensive cooling systems. They will be able to radically reduce the time needed to process vast volumes of information or to tackle more complex types of data, promising scientific breakthroughs in areas such as biology, climate change and security.

The sums pledged by the UK government are comparable to supercomputer projects in the US and the EU, but pale in comparison to the multi-billion dollar resources available to Silicon Valley’s wealthiest companies.

OpenAI raised $1 billion from Microsoft in 2019 to build the “large language model” (LLM) technology behind ChatGPT, its revolutionary chatbot that it launched to great acclaim in November last year.

LLMs require massive computing resources to train on the huge volumes of data that power chatbots and other ‘generative AI’ systems capable of creating human-like text and images.
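To give a sense of what “massive” means here, a commonly cited rule of thumb estimates training compute at roughly six floating-point operations per model parameter per training token. The sketch below applies that heuristic to a hypothetical model; the parameter count, token count and hardware utilization are assumptions chosen for illustration, not figures for ChatGPT or for any machine mentioned in this article.

```python
# Rough, assumption-laden sketch of LLM training cost using the common
# "6 * parameters * tokens" approximation for total training FLOPs.
params = 70e9        # assumed model size: 70 billion parameters
tokens = 1.4e12      # assumed training corpus: 1.4 trillion tokens
train_flops = 6 * params * tokens

exaflop = 1e18       # 1 exaflop = 10^18 floating-point operations per second
utilization = 0.4    # assume 40% of the machine's peak throughput is achieved

seconds = train_flops / (exaflop * utilization)
print(f"Estimated training compute: {train_flops:.2e} FLOPs")
print(f"On a 1-exaflop machine at 40% utilization: ~{seconds / 86_400:.0f} days")
```

Even under these assumptions, a single training run would tie up an exascale-class machine for more than two weeks.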

Earlier this year, Microsoft made a further multibillion-dollar investment in OpenAI, which in March released an updated version of its AI model, GPT-4, leading to a marked improvement in the quality of ChatGPT’s responses.

In 2018, the United States allocated up to $1.8 billion for two exascale supercomputers. Last year, the Frontier supercomputer at the US Department of Energy’s Oak Ridge National Laboratory in Tennessee delivered on that investment, becoming the first to break the exascale barrier.

In doing so, it recorded performance levels of 1.1 exaflops, or more than 1 quintillion calculations per second.

The European Commission announced a year ago that the Jülich Supercomputing Center in Germany will house Jupiter, Europe’s first exascale computer, due for launch later this year.

The commission has allocated a total budget of up to €500 million for Jupiter, jointly funded by Brussels and the German government.

The Department for Science, Innovation and Technology was contacted for comment.


https://www.ft.com/content/6e3938b2-1952-4ae9-b52e-c2f2ddc1812b
—————————————————-