CERN’s unique know-how derives from the storage and worldwide distribution of vast amounts of data.
- Design and operation of robust, cost-effective and highly reliable very large data storage services.
- Data storage services with variable, on-demand quality of service and reliability through flexible software.
- Bulk data movement: fast, optimized, reliable, multi-protocol, high-throughput data transfers.
Facts & Figures:
- 256 PB of total storage under management at CERN
- 3.5 billion data files
- 230,000 processor cores
- 20 Gb/sec: highest data transfer rate
Storing & managing very large data sets:
Data storage and management of very large data sets in a highly distributed, global environment, rapidly responding and adapting to the LHC experiments’ frequently changing requirements for fast, reliable and affordable storage.
A few specifications:
- Fine-tuned, adaptable data storage policies to accommodate the conflicting requirements of speed, reliability and affordability.
- Expertise in very high rate, reliable, sustained data transfers
- Running a highly complex service that integrates different types of storage devices (tapes, SSDs, memory, etc.), combined to provide a massively scalable and highly flexible service.
- Made-to-measure services (storage pools) with on-demand quality of service
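The idea behind made-to-measure storage pools can be sketched as a policy lookup: given a client's latency requirement and budget, pick the cheapest media tier that satisfies both. This is a minimal illustrative sketch; the pool names, parameters and selection logic are hypothetical, not CERN's actual configuration.

```python
from dataclasses import dataclass

@dataclass
class StoragePool:
    name: str
    media: str            # "ssd", "disk" or "tape" (hypothetical tiers)
    replicas: int         # number of copies kept for reliability
    cost_per_tb: float    # relative cost per usable TB
    latency_ms: float     # typical access latency

POOLS = [
    StoragePool("hot",     "ssd",  2, 8.0, 0.5),
    StoragePool("default", "disk", 2, 2.0, 10.0),
    StoragePool("archive", "tape", 1, 0.5, 60_000.0),
]

def choose_pool(max_latency_ms: float, budget_per_tb: float) -> StoragePool:
    """Pick the cheapest pool that meets the requested quality of service."""
    candidates = [p for p in POOLS
                  if p.latency_ms <= max_latency_ms
                  and p.cost_per_tb <= budget_per_tb]
    if not candidates:
        raise ValueError("no pool satisfies the requested quality of service")
    return min(candidates, key=lambda p: p.cost_per_tb)
```

With these example figures, a request for sub-second access selects the SSD pool, while a relaxed latency requirement with a tight budget falls through to tape.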
Highly scalable cloud storage engine:
CERN needs to store vast amounts of data. It therefore developed a reliable cloud storage engine, EOS, where all the LHC data, and more, is stored. This multi-storage cloud software platform is characterized by very low latency, high flexibility and high scalability.
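One way low latency is achieved in tiered storage systems is by serving repeat reads from the fastest tier. The toy read path below (memory, then SSD, then tape, with cache promotion on a hit) illustrates that general idea only; it is not EOS's actual implementation or API.

```python
class TieredStore:
    """Toy multi-tier store: reads probe the fastest tier first and
    promote hits into memory so repeat reads are served quickly."""

    def __init__(self):
        self.tiers = {"memory": {}, "ssd": {}, "tape": {}}
        self.order = ["memory", "ssd", "tape"]  # fastest first

    def put(self, tier: str, path: str, data: bytes) -> None:
        self.tiers[tier][path] = data

    def read(self, path: str):
        """Return (data, tier_served_from) or raise if the file is absent."""
        for tier in self.order:
            if path in self.tiers[tier]:
                data = self.tiers[tier][path]
                self.tiers["memory"][path] = data  # cache promotion
                return data, tier
        raise FileNotFoundError(path)
```

A file first written only to the tape tier is served from tape once, then from memory on every subsequent read.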
File Transfer Service (FTS):
CERN needs to transfer large amounts of data across the world to make it accessible to its scientific partners. Therefore, CERN developed software for fast, global and reliable bulk data transfers, and optimized existing network and storage resources for the best transfer outcomes.
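Two ingredients of reliable bulk transfers are end-to-end checksum verification and automatic retries. The sketch below shows those two ideas under stated assumptions: the function names and the pluggable `send` callback are hypothetical and do not reflect the real FTS API.

```python
import hashlib

def checksum(data: bytes) -> str:
    """Content fingerprint used to verify integrity at the destination."""
    return hashlib.md5(data).hexdigest()

def transfer(job: dict, send, max_retries: int = 3):
    """Attempt each named transfer in `job`, verify the destination copy
    against the source checksum, and retry failures up to max_retries."""
    done, failed = {}, []
    for name, data in job.items():
        expected = checksum(data)
        for _attempt in range(max_retries):
            received = send(name, data)  # may return None or corrupt bytes
            if received is not None and checksum(received) == expected:
                done[name] = received
                break
        else:
            failed.append(name)
    return done, failed
```

For example, a flaky link that drops the first attempt still completes the job on the retry, while the checksum guards against silent corruption in transit.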