The digital heart of FAU

Server room at Regional Computer Centre Erlangen (photo: FAU/Georg Pöhlein)

Interesting places at FAU: the server rooms of Regional Computer Centre Erlangen

Our journey to the digital heart of FAU begins with a fascinating relic of computing history. Visitors to Regional Computer Centre Erlangen (RRZE) in Martenstraße can glimpse the legendary Z23, built by computing pioneer Konrad Zuse, through a glass door on the second floor. It was not the first electronic computer system at FAU, but it is one of the few machines made by the pioneer that still function today. As Marcel Ritter unlocks the door, he says: ‘This computer was out of operation for 30 years. We have Edwin Aures and Volkmar Sieh from the Department of Computer Science to thank for restoring it to working order.’ Today the system is part of the Computer Technology Collection Erlangen (ISER). When the Z23 starts up, it rattles and roars. ‘It sounds like a jet engine,’ remarks Ritter. ‘There is something magical about it.’

Marcel Ritter, head of Central Systems at RRZE, has now arrived at the digital heart of FAU – the RRZE server rooms. There isn’t much magic in these rooms, but plenty of computing power. ‘The servers in these rooms power the University’s infrastructure,’ says Ritter. This includes IT services for students and employees, as well as a growing number of departments that do not want to manage their own IT.

Marcel Ritter, head of Central Systems at RRZE, is an expert on the digital heart of FAU (photo: FAU/Georg Pöhlein)

When the computer centre was founded in 1968, it was originally equipped with an American-built computer: the Control Data Corporation (CDC) 3300. Up until the 1970s, the system was operated by staff in white lab coats. Today, the RRZE server facility, known informally as the ‘Bunker’, is one of the most important components of the university research network. Data traffic from more than 50,000 students and employees, 200 sites and five cities passes through RRZE. This also includes the University’s wireless network. ‘There are usually more than 10,000 users connected to the wireless network at the same time,’ says Ritter. Students can access the wireless network via 1,200 access points. RRZE is also connected to the German Research Network (DFN) via a high-availability Internet connection with speeds of up to 10,000 Mbit/s.

The rows of cabinets in the server rooms are long, and the servers perform many different tasks in data processing and storage. ‘Usually, nobody enters these rooms,’ says Ritter, ‘only when a hard disk or power supply needs to be replaced. 99% of the time, we don’t actually have to enter the server room to solve a problem.’

Inside, you won’t hear a roaring rattle like the Zuse Z23’s, but there is a constant hum from the ventilation and cooling systems. ‘It’s a perfect location for drying off from the rain,’ says Ritter with a wink. It’s not really warm unless you open the enormous water-cooled door of the high performance computer Emmy, which is around 10 cm thick. ‘The air coming out is around 35 to 40 degrees. In the newer high performance computers, the cooling water is pumped around the racks, otherwise the heat could not be transferred out of the room effectively.’ What began in 2003 with a Linux cluster that could handle 776,000,000,000 operations per second has gone far beyond its original capacity. ‘The high performance computers Emmy and LiMa have a total computing power of 300 teraflops,’ says Ritter proudly.
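To put those figures in perspective, a quick back-of-the-envelope calculation using only the numbers quoted in the article (776 billion operations per second in 2003 versus 300 teraflops today) shows roughly how far the capacity has grown:

```python
# Rough comparison of RRZE computing power, based solely on the article's figures.
ops_2003 = 776_000_000_000      # 2003 Linux cluster: 776 billion operations/s
tflops_2003 = ops_2003 / 1e12   # = 0.776 teraflops
tflops_today = 300              # Emmy + LiMa combined, in teraflops

growth = tflops_today / tflops_2003
print(f"2003 cluster: {tflops_2003:.3f} teraflops")
print(f"Growth since 2003: roughly {growth:.0f}x")
```

In other words, the combined Emmy and LiMa systems deliver close to 400 times the computing power of the original 2003 cluster.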

More business than pleasure

All five faculties at FAU use the high performance computing (HPC) facility, although researchers in the humanities and in business and economics use less computing time than biologists, physicists or materials scientists. It is no surprise that the Cluster of Excellence ‘Engineering of Advanced Materials’ (EAM), which is concerned with the development of new high performance materials and processes, takes up more than half of the computing time. Visiting school groups are sometimes disappointed by the supercomputers when Ritter explains that HPC clusters cannot be used for playing games, as they have neither graphics cards nor video outputs.

In other locations, server racks are cooled by cold-aisle containment. ‘Cool air is drawn up from the raised floor and used to cool the server systems. The warm air is transferred to the outside.’

Marcel Ritter has everything under control in the server room. But what happens if the power fails? ‘We can run the servers for around 10 minutes (with the exception of the HPC clusters) on the uninterruptible power supply,’ says Ritter. If a power cut lasts longer than that, an emergency backup generator takes over the energy supply. Marcel Ritter and his team are clearly prepared for the worst-case scenario, with one small exception: ‘The only system which could fail completely is the coffee machine.’