High Performance Computing (HPC) and Cluster Computing

SCC gives KIT users access to High Performance Computing (HPC) and Data Intensive Computing (DIC) infrastructures. SCC itself operates several computer systems (bwUniCluster 2.0, HoreKa) for different user groups. In addition, KIT scientists can use the bwForClusters within the framework of the bwHPC federation. The SCC also advises on applications for computing time on supra-regional high-performance computers (Tier-2), such as HoreKa, and on the national supercomputers (Tier-1) of the Gauss Centre for Supercomputing (JSC, HLRS, LRZ).

HoreKa Supercomputer

The HoreKa supercomputer is a Tier-2 HPC system positioned directly below the national Tier-1 systems. It has nearly 60,000 CPU cores, 668 GPUs, and more than 220 terabytes of main memory, and can deliver up to 17 PetaFLOPS.


bwUniCluster 3.0

The bwUniCluster 3.0 is a parallel computing system that provides universities and colleges in Baden-Württemberg with broad basic access to high-performance computing resources. As part of the state's "bwHPC" initiative for high-performance computing in research and education, this Tier-3 cluster is operated by the Scientific Computing Center (SCC).


Four bwForClusters

In addition to the bwUniCluster basic supply system, there are four high-performance computing clusters for research purposes (bwForCluster for short) at HPC performance level 3 (Tier-3) in Baden-Württemberg, which provide computing time to different scientific communities.


Future Technologies Partition

In addition to the supercomputer HoreKa, NHR@KIT has established a second operating environment, the so-called "Future Technologies Partition".


HAICORE

The Helmholtz AI COmpute REsources (HAICORE) infrastructure project was launched in early 2020 as part of the Helmholtz Incubator "Information & Data Science" to provide high-performance computing resources for artificial intelligence (AI) researchers in the Helmholtz Association.


Scientific Support

An essential task for the SCC is to support users with their technological and scientific applications, which need not be limited to the HPC area. In addition, support and consulting are provided for components of software development such as compilers, debuggers, analysis tools, and MPI, as well as for open-source codes and numerical libraries. Research-related and research-accompanying support is provided by the SimLabs (Simulation Laboratories), which currently cover four research areas: Earth (Climate) and Environment, NanoMicro, Energy, and Astroparticle Physics. For more information on the SimLabs, please visit the Scientific Computing and Simulation Department.

Scientific Computing & Simulation (SCS) 
Department Head: Robert Barthel (Deputy: René Caspart)

The department comprises three teams:

Operations (Head: Samuel Braun)
Software Sustainability & Performance Engineering (Head: René Caspart)
Project & Software Management (Head: Robert Barthel)

The operations team is responsible for the reliable and efficient operation of the high-performance computers (HPC) at the SCS. It looks after the underlying infrastructure; ensures maintenance, monitoring and further development of the systems; and supports users with technical issues. The team also ensures efficient use of resources through user advice, documentation and close cooperation with scientific projects.

The Software Sustainability and Performance Engineering (SSPE) team supports scientific software developers in selecting the right tools to make scientific codes fit for the future and to make optimal use of the available computing resources. The SSPE team advises on porting, testing and benchmarking on new architectures, such as the systems of the Future Technologies Partition.

The project management team supports the planning, management and successful implementation of projects in the IT and scientific context. It accompanies project managers through all phases, from initiation to implementation and completion.


The SCS software team provides comprehensive support in the use of scientific and administrative software and acts as an interface to manufacturer support. It offers advice on operation and access to the IT infrastructure, provides software via various channels (e.g. software store, remote desktop, HPC), maintains central license servers and handles license and contract management.


In addition to supporting research and development with hardware, software and many years of expertise in these areas, teaching in the HPC field and its environment is of equal importance. The page "Teaching, Training and Further Education" provides information about offers in the field of teaching.

The cost of this service is calculated according to the budgeting rules that apply to your organization.

Storage for scientific computing

The SCC operates storage systems for different purposes:

Storage systems of High-Performance Computers
So-called parallel file systems are connected directly to the HPC systems via fast networks. They are characterized by very high throughput and very good scalability. Both Lustre and IBM Storage Scale file systems are used on the HPC systems at KIT. There are currently a total of seven parallel file systems with a combined storage capacity of 25 petabytes, provided by 50 servers to 1,300 clients.
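On Lustre file systems, the throughput achievable for large files depends in part on how files are striped across storage targets. A brief sketch using the standard Lustre `lfs` tool (the directory and file paths are placeholders, not actual KIT paths; consult the user guide of the respective system for site-specific recommendations):

```shell
# Stripe new files in this directory across 8 object storage targets (OSTs),
# which typically increases sequential throughput for large files:
lfs setstripe -c 8 /lustre/project/large_output

# Inspect the striping layout of an existing file:
lfs getstripe /lustre/project/large_output/checkpoint.h5
```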

In addition to the parallel file systems visible on all compute nodes, fast local storage devices can be used within each compute node. Furthermore, so-called on-demand file systems are offered, which are created exclusively for a single HPC job and are available only on the assigned compute nodes for the duration of the job.
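A job-private on-demand file system is typically requested through the batch system. The following is a hypothetical Slurm job script sketch; the constraint name and mount point are assumptions based on the BeeOND setup described in the KIT user guides and should be checked against the current documentation:

```shell
#!/bin/bash
# Request an on-demand file system for this job (assumed constraint name):
#SBATCH --nodes=4
#SBATCH --time=02:00:00
#SBATCH --constraint=BEEOND

# Assumed mount point: the on-demand file system exists only on the
# allocated nodes and only for the lifetime of this job.
ODFS=/mnt/odfs/$SLURM_JOB_ID

cp -r "$HOME/input" "$ODFS/"            # stage input onto the fast scratch
srun ./my_simulation --workdir "$ODFS"  # hypothetical application
cp -r "$ODFS/results" "$HOME/"          # copy results back before the job ends
```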

Further information: User Guides of the various HPC systems.

Mass storage for scientific data (LSDF Online Storage)
The "LSDF Online Storage" service provides KIT users with access to data storage designed especially for scientific measurement data and simulation results from data-intensive scientific disciplines. The LSDF Online Storage is operated by the Scientific Computing Center. Access is provided via standard protocols.

The backup and protection of the data is carried out according to the current state of the art. The service is not suitable for storing personal data.
More information: Mass storage for scientific data (LSDF Online Storage).
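For illustration, access via such standard protocols could look as follows. Host names, share names and paths below are placeholders, not confirmed LSDF endpoints; the LSDF documentation lists the actual access points:

```shell
# Copy a result file via SSH-based transfer (scp/sftp):
scp results.h5 kituser@lsdf-login.example.kit.edu:/lsdf/projects/myproject/

# Mount a project share via SMB/CIFS from a Linux workstation:
sudo mount -t cifs //lsdf.example.kit.edu/myproject /mnt/lsdf \
     -o username=kituser,domain=KIT
```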

Archive storage (bwDataArchive)
The state service bwDataArchive offers a technical infrastructure for long-term archiving of scientific data.

bwDataArchive is available especially for members of universities and public research institutions in Baden-Württemberg. Data archiving is carried out at KIT and includes reliable storage of even large data sets for a period of ten years or more. The service enables a qualified implementation of the recommendations of the German Research Foundation (DFG) on good scientific practice (recommendation 7 on the securing and storage of research data).

Online storage for sharing data (bwSync&Share)
The state service bwSync&Share is an online storage service for employees and students of universities and colleges in Baden-Württemberg. It has been operated at KIT since January 1, 2014 and enables users to synchronize and exchange their data between different computers, mobile devices, and users. The bwSync&Share portal is available at bwsyncandshare.kit.edu.