The Economist reports that the UK’s planned exascale supercomputer will probably not get built.
As they report: The government insists it remains committed to supercomputing. Ministers are expected to announce alternative proposals in a budget due on October 30th. Under the previous government’s plans, the Edinburgh project would have eaten up much of the available investment; some experts thought the scheme was too focused on traditional scientific simulations, rather than on AI.
Right. So instead of science, let’s focus on burning money on a silly technology that generates text full of misleading statements and confabulations, and images of six-fingered orange men.
Edit: Someone apparently thought the illustration was an “advertisement” (!?).
This complete lack of the human characteristic of understanding a joke makes me think that perhaps the post was flagged by an AI?
Image hence removed. We don’t want to piss off the first sentient AI. Anyway, if you want to see the six-fingered praying orange man online, go for it. Hilarious.
The UK is, well, after Finland (5th, 0.38 exaflops), Italy, Spain (8th), China (13th, 0.093 exaflops), and France (17th, 0.057 exaflops).
I suppose all those others are EU-funded European machines, and the UK is no longer in the EU.
Your answer even got marked off-topic; it wasn’t yet while I prepared mine. Since yours is delisted, I couldn’t send you a private message (on the post at least), so I’m not actually sure you (or anyone) will see this…
That article sounds more positive than the earlier news reports, which suggested the Edinburgh machine was completely cut from the budget. Fortunately, there is still Isambard-AI being built at the University of Bristol (the initial phase is up and running), which might get the UK back into the top ten - for a few months at least…
(And, despite the name, I suspect it’ll be used for a good amount of traditional science.)
To be fair, as a taxpayer the benefits of exascale computing are not clear to me either. Please don’t get me wrong:
as a nerd I find the project super exciting,
I am not implying that the benefits do not exist, just that it is really hard to find a cost/benefit analysis.
Also, note that “AI” is tagged onto every grant proposal these days where it is even remotely applicable (and usually means ML, not AI). Take, for example, this recent highlight from the Exascale Computing Project; it reads like someone inserted “AI driven” at random spots. So it may just be a relabeling.
British taxpayers do care about the weather and will have no reason to complain about its forecast when exascale computing makes it hyper-local and absolutely accurate.
We already have AI/neural-network weather forecasting. It’s supposedly more accurate, at least for local weather conditions, AND another upside is lower compute (at inference time at least), so exascale is not needed.
In a paper published in Science, we introduce GraphCast, a state-of-the-art AI model able to make medium-range weather forecasts with unprecedented accuracy.
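Just to illustrate the compute asymmetry (this is NOT GraphCast’s code; the state size, the “solver”, and the emulator weights below are all made up for illustration): a numerical model reaches t+6h by taking thousands of small sequential PDE steps, while a trained emulator jumps there in one learned mapping.

```python
# Toy illustration only -- not GraphCast; everything here is made up.
import numpy as np

rng = np.random.default_rng(0)
n = 2000                                 # made-up state-vector size
state = rng.random(n)

def numerical_forecast(x, steps=3600):
    # Stand-in for a PDE solver: thousands of small sequential steps,
    # e.g. 6 hours of model time at 6 seconds per step.
    for _ in range(steps):
        x = x + 0.01 * (np.roll(x, 1) - x)   # toy advection-like update
    return x

W = rng.random((n, n)) / n               # placeholder for trained weights
def emulator_forecast(x):
    return W @ x                         # one forward pass and done

print(numerical_forecast(state)[:3])
print(emulator_forecast(state)[:3])
```

The training is still expensive, of course; the cheap part is only the inference.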
I’m not even sure whether any weather modelling is (or historically was) done on a petascale computer, or what the largest such machine is. The UK Met Office, like those of other countries, is way down the list, at 131st (from 3rd in 1997):
Way behind the Korea Meteorological Administration (the highest-ranked met office, unless I missed something) at 58th (it was ranked 24th in 2021), with 18 PFLOPS, i.e. 0.018 exaflops:
These large computers always seem to be DoE-related; at least for the US ones, I often saw nuclear weapons testing cited as a claimed application in the past.
You can learn more about supercomputing and nuclear weapons from these videos from the National Nuclear Security Administration (NNSA):
NNSA Supercomputers Part 1: About ASC and Our Supercomputers
NNSA Supercomputers Part 2: Hardware, Software, Codes and Data
NNSA Supercomputers Part 3: Non-nuclear Success
Ensemble weather/climate forecasts, for instance. Each forecast scales well across many CPUs, we need many of them in the ensemble to get the statistics right, and all of them need to run on the same cluster. We are currently not even near the resolution that resolves the essential physical processes in the atmosphere/ocean.
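To make the cost concrete, here’s a minimal sketch, with the Lorenz-63 system standing in for a real atmosphere model: every ensemble member is a full run from slightly perturbed initial conditions, so 50 members cost 50 times one forecast, and the spread statistics only settle with many members.

```python
# Minimal ensemble sketch: Lorenz-63 stands in for an atmosphere model.
import numpy as np

def lorenz63_step(s, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return s + dt * np.array([sigma * (y - x),
                              x * (rho - z) - y,
                              x * y - beta * z])

def run_member(s0, steps=2000):
    s = s0
    for _ in range(steps):               # one "forecast" = one full run
        s = lorenz63_step(s)
    return s

rng = np.random.default_rng(0)
base = np.array([1.0, 1.0, 1.0])
members = 50                             # 50 members = 50x the compute
finals = np.array([run_member(base + 1e-3 * rng.standard_normal(3))
                   for _ in range(members)])

print("ensemble mean:  ", finals.mean(axis=0))
print("ensemble spread:", finals.std(axis=0))  # needs many members to settle
```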
Doesn’t that mean you could, hypothetically, run each member on a separate computer (they share data, so you might not want to)? Also, you ignored how low on the list the met offices are, the recent advances over traditional methods that I mentioned, and the likely limit on the data you can put in. We in Iceland care very much about the weather, up to some limit on what we can afford to spend on it. We have the largest air-traffic-control area in the world, so we do not just predict the weather for ourselves.
That statement is wrong. GraphCast is a nice try, but it is not a panacea. We still need to run a full GCM to produce the data to train something like GraphCast, and in reality it is not as great as some say. An NN is good at approximating subgrid-scale physics, but the climate model still needs a dynamical core.
P.S. At NASA, climate modelling and data assimilation is the biggest user of HPC cluster resources, and it is still not enough.
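For what it’s worth, here is a hedged sketch of that division of labour (everything is a toy: 1-D diffusion as the resolved dynamics, and an untrained random-weight net standing in for a parameterization that would, in practice, be trained on GCM output - which is exactly why the GCM is still needed):

```python
# Toy sketch of "dynamical core + learned subgrid physics".
import numpy as np

rng = np.random.default_rng(42)
n = 128                                  # made-up 1-D grid
W1 = 0.01 * rng.standard_normal((16, n))
W2 = 0.01 * rng.standard_normal((n, 16))

def dynamical_core(u, dt=0.1, nu=0.05):
    # Resolved physics: finite-difference Laplacian (toy diffusion).
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)
    return u + dt * nu * lap

def nn_subgrid_tendency(u):
    # Placeholder two-layer net: the part an NN can plausibly approximate.
    return W2 @ np.tanh(W1 @ u)

u = rng.standard_normal(n)
for _ in range(100):
    u = dynamical_core(u)                      # cannot be replaced
    u = u + 0.1 * nn_subgrid_tendency(u)       # learned correction

print(u[:5])
```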
You do not support the claim that exascale is needed for weather (far from it, at least; see the Korea Meteorological Administration computer I quoted, and the much smaller UK Met Office one. I’m not sure, could even the former be dual-use for something non-weather-related?). Maybe for climate, which is a different kind of simulation; I didn’t look into that.
Certainly a joke, but actually forecasts are enormously important for agriculture.
I’m not sure if these computers are actually used for that, though, or if dedicated machines are used for forecasting. I think that here (in Brazil) forecasting uses dedicated machines, not the clusters that are available for general use.
I also wonder which applications actually use that much computing power (rather than it being split across fragmented small projects).
People use the resources that are available to them. The fewer resources available, the coarser the models they run, and the less forecast accuracy they get. Current climate models do not resolve a lot of essential physics, and ensemble sizes are not statistically very large. Yeah, you can live with this; it all depends on how much you are willing to pay to gain an extra few % of forecast accuracy, but the demand for computing power exists.
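The resolution-to-cost mapping is easy to make concrete with the usual back-of-the-envelope scaling (assuming a 3-D grid and a CFL-limited time step, so refining the grid by a factor r multiplies the points by r³ and the number of time steps by another r):

```python
# Back-of-the-envelope only: cost of refining a 3-D CFL-limited model.
def relative_cost(r, dims=3):
    return r ** dims * r                 # grid points x time steps

for r in (1, 2, 4):
    print(f"{r}x finer grid -> ~{relative_cost(r)}x the compute")
# 1 -> 1x, 2 -> 16x, 4 -> 256x; and an N-member ensemble
# multiplies the whole thing by N again.
```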