Google today signed an agreement with the European Organization for Nuclear Research to collaborate on research and development. More commonly known as CERN, the premier physics lab will be working with Google on quantum computing, cloud computing, and machine learning.
One area of focus between the two organizations is the upcoming improvements to the Large Hadron Collider. The particle accelerator has a number of planned upgrades in the coming years to “increase researchers’ visibility into the fundamental nature of matter,” increasing “luminosity” and generating more particle collisions.
What is known as the “High-Luminosity LHC,” or HL-LHC, will come online around 2026. With current software, hardware, and analysis techniques, the computing capacity it requires is estimated to be around 50-100 times higher than today’s, while data storage needs are expected to reach the order of exabytes, an order of magnitude higher than today.
Google finds these “data management, analysis, and processing” challenges “exciting” and has already been working with Fermilab and Brookhaven National Laboratory (BNL) in the U.S. to store and analyze LHC data using Google Compute Engine.
The data challenges of the LHC’s 2021-2023 and 2026-2029 runs will require more than simply scaling computing out to the cloud with off-the-shelf services.
CERN created CERN openlab in 2001 as a framework for collaboration between CERN and leading companies in information and computing technologies. There are currently over 20 active CERN openlab projects, spread across four R&D topics: data center technology and infrastructure, computing performance and software, machine learning and data analytics, and interdisciplinary applications.