# SHAREing HPC Testbeds
UK researchers have access to a wide range of testbeds and machines on which they can quickly trial new codes or ideas. Some of these machines have been procured through government grants such as the ExCALIBUR H&ES programme, some universities open their systems to UKRI-eligible colleagues, and some commercial providers grant access to their systems in the cloud. Unfortunately, it can be difficult to keep track of all these opportunities. This page collects the systems together with some metadata so that it is easier for researchers to find the right system for the right purpose.
Click a system's name to find out more about it. The memory-bandwidth figures in the table come from BabelStream measurements; a short sketch of the kind of kernel such a benchmark times follows the table.
| System name | Status | Category | Focus | Focus area | Grouping | Funders | Nodes | Accelerators | Accelerator count per node | Memory bandwidth | Memory bandwidth benchmark | Floating point performance | Floating point performance precision | Floating point benchmark | I/O bandwidth | I/O bandwidth benchmark | Manufacturer | Scheduler | Interconnects | Reference |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| AccelerateAI | In service | Production system | Regional system | Wales | Supercomputing Wales | ERDF Welsh Government | 6 | NVIDIA A100 40GB | 8 | — | — | — | — | — | — | — | Atos | Slurm | InfiniBand HDR, NVLink | Link |
| Bede (Tesla V100 32G) | In service | Production system | Regional system | N8 group universities, and EPSRC-funded projects | N8CIR | N8 EPSRC | 32 | Tesla V100 32G | 4 | — | — | — | — | — | — | — | IBM | Slurm | InfiniBand EDR, NVLink 2.0 | Link |
| Bede (Tesla T4 16G) | In service | Production system | Regional system | N8 group universities, and EPSRC-funded projects | N8CIR | N8 EPSRC | 4 | Tesla T4 16G | 4 | — | — | — | — | — | — | — | IBM | Slurm | InfiniBand EDR, NVLink 2.0 | Link |
| Bede (H100 96GB) | In service | Production system | Regional system | N8 group universities, and EPSRC-funded projects | N8CIR | N8 EPSRC | 7 | H100 96GB | 1 | — | — | — | — | — | — | — | gh001-gh006 Vespertec, gh007 SuperMicro | Slurm | InfiniBand EDR, NVLink 2.0 | Link |
| Bede (H100 144GB) | In service | Production system | Regional system | N8 group universities, and EPSRC-funded projects | N8CIR | N8 EPSRC | 1 | H100 144GB | 2 | — | — | — | — | — | — | — | gh008 SuperMicro | Slurm | InfiniBand EDR, NVLink 2.0 | Link |
| COSMA A100 (cosma8-shm) | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 2 | NVIDIA A100 40GB | 1 | 1352 GB/s | BabelStream | — | — | — | — | — | Dell | Slurm | InfiniBand HDR200, Liqid composable fabric | Link |
| COSMA A100 (mad06) | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 1 | NVIDIA A100 40GB | 1 | — | — | — | — | — | — | — | Dell | Direct SSH | InfiniBand HDR200, Liqid composable fabric | Link |
| COSMA A30 | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 8 | NVIDIA A30 | 1 | 822 GB/s | BabelStream | — | — | — | — | — | NVIDIA | Slurm | CerIO composable fabric | Link |
| COSMA GH200 (gn002) | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 1 | NVIDIA GH200 | 1 | — | — | — | — | — | — | — | NVIDIA | Direct SSH | NVLink-C2C | Link |
| COSMA GH200 (gn003) | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 1 | NVIDIA GH200 | 1 | 3500 GB/s | BabelStream | — | — | — | — | — | NVIDIA | Slurm | NVLink-C2C | Link |
| COSMA H100 | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 1 | NVIDIA H100 NVL | 1 | 3387 GB/s | BabelStream | — | — | — | — | — | NVIDIA | Direct SSH | — | Link |
| COSMA MI100 | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 1 | AMD MI100 | 1 | 947 GB/s | BabelStream | — | — | — | — | — | AMD | Slurm | — | Link |
| COSMA MI210 | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 2 | AMD MI210 | 2 | 1250 GB/s | BabelStream | — | — | — | — | — | AMD | Slurm | — | Link |
| COSMA MI300A | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 1 | AMD MI300A 128GB | 4 | 3648 GB/s | BabelStream | — | — | — | — | — | AMD | Direct SSH | — | Link |
| COSMA MI300X | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 1 | AMD MI300X 192GB | 8 | 4036 GB/s | BabelStream | — | — | — | — | — | AMD | Slurm | — | Link |
| COSMA V100 | In service | Test bed | Discipline-specific system | Astronomy and Cosmology | COSMA | STFC DiRAC ExCALIBUR | 1 | NVIDIA V100 32GB | 6 | 823 GB/s | BabelStream | — | — | — | — | — | NVIDIA | Direct SSH | — | Link |
| Tursa (NVIDIA A100 40GB) | In service | Production system | Discipline-specific system | High-Energy Physics | DiRAC | STFC UKRI DSIT | 114 | NVIDIA A100 40GB | 4 | — | — | — | — | — | — | — | Atos | Slurm | InfiniBand HDR, NVLink | Link |
| Tursa (NVIDIA A100 80GB) | In service | Production system | Discipline-specific system | High-Energy Physics | DiRAC | STFC UKRI DSIT | 65 | NVIDIA A100 80GB | 4 | — | — | — | — | — | — | — | Atos | Slurm | InfiniBand HDR, NVLink | Link |
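The memory-bandwidth figures above are reported from BabelStream runs on the respective accelerators. For orientation, the sketch below shows the kind of GPU "triad" kernel such a stream benchmark times; it is a minimal illustration rather than the BabelStream source, and the array size, repetition count, and launch configuration are assumptions chosen for readability.

```cuda
// Minimal sketch (not the BabelStream source) of a GPU "triad" kernel,
// the access pattern that stream-style benchmarks time to produce
// memory-bandwidth figures like those quoted in the table above.
// Array size, repetition count, and launch configuration are assumptions.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void triad(double *a, const double *b, const double *c, size_t n) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) a[i] = b[i] + 0.4 * c[i];   // a = b + scalar * c
}

int main() {
    const size_t n = 1 << 27;                // 2^27 doubles per array (1 GiB each)
    const size_t bytes = n * sizeof(double);

    double *a, *b, *c;                       // device arrays, left uninitialised:
    cudaMalloc(&a, bytes);                   // only the data movement is timed
    cudaMalloc(&b, bytes);
    cudaMalloc(&c, bytes);

    const int reps  = 100;
    const int block = 256;
    const int grid  = (int)((n + block - 1) / block);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    for (int r = 0; r < reps; ++r)
        triad<<<grid, block>>>(a, b, c, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);

    // Each triad iteration moves three arrays: read b, read c, write a.
    const double gbps = (3.0 * bytes * reps) / (ms * 1.0e6);
    printf("Triad bandwidth: %.1f GB/s over %d repetitions\n", gbps, reps);

    cudaFree(a);
    cudaFree(b);
    cudaFree(c);
    return 0;
}
```

The sketch can be compiled with `nvcc -O3 triad.cu -o triad` and run on a GPU node, either through the system's Slurm scheduler or directly on the test beds offering Direct SSH access. A HIP or SYCL variant of the same kernel would be needed on the AMD systems listed above.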
## Disclaimer
This page is not yet complete. We plan to work towards precise guidelines on which additional details should be documented for each cluster on an overview page such as this.
Is your system not included on this list? Please add it at the GitHub repository for this site!