Researchers from Harvard University have used Google LLC’s public cloud infrastructure to replicate the capabilities of a supercomputer originally employed for a heart disease study. The replication aims to help researchers who face challenges in accessing high-powered supercomputers for their work.
The project’s success could offer an alternative for researchers and organizations that need vast computing power. Google Cloud’s adaptability, combined with its high configurability, makes it a potential contender in high-performance computing.
Purpose and Challenges of the Study
Harvard professor Petros Koumoutsakos, in a statement to Reuters, said the study’s objective was to simulate a novel therapy designed to dissolve blood clots and tumor cells within the human circulatory system. Such a study demands vast computing capabilities, typically exclusive to supercomputers. Koumoutsakos noted that because of limited supercomputer availability, the team could run only one full simulation, leaving no opportunity to refine or optimize the test further.
Cloud Computing: A Potential Solution
Given the scarcity of supercomputers and the long waiting lists for access, Koumoutsakos and his team collaborated with Citadel Enterprise Americas. Their goal was to explore the possibility of replicating a supercomputer within the public cloud, eliminating wait times. While public cloud platforms like Google Cloud aren’t inherently designed for tasks typically performed on supercomputers, they offer reliability, resilience, and immediate access.
In partnership with researchers from ETH Zurich in Switzerland, the team demonstrated their ability to use thousands of virtual machines on Google Cloud to emulate a supercomputing platform. By optimizing their code, they managed to achieve approximately 80% of the efficiency that dedicated supercomputer facilities offer.
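The article does not specify how the roughly 80% figure was measured; a common metric in high-performance computing is parallel efficiency, which compares the speedup actually achieved on many machines against ideal linear scaling. The sketch below illustrates that calculation with purely hypothetical timings, not numbers from the study:

```python
# Parallel efficiency: E(p) = T(1) / (p * T(p)), where T(1) is the
# single-worker run time and T(p) the run time on p workers.
# All timings below are hypothetical values for illustration only.

def parallel_efficiency(t_single: float, t_parallel: float, workers: int) -> float:
    """Return the parallel efficiency of a run spread across `workers` workers."""
    return t_single / (workers * t_parallel)

# Hypothetical example: a job that would take 1000 hours on one node
# finishes in 1.25 hours when spread across 1000 cloud VMs.
t1 = 1000.0   # single-node wall-clock time (hours)
tp = 1.25     # wall-clock time on 1000 VMs (hours)
p = 1000      # number of VMs

eff = parallel_efficiency(t1, tp, p)
print(f"parallel efficiency: {eff:.0%}")  # 1000 / (1000 * 1.25) = 80%
```

An efficiency below 100% reflects communication and synchronization overhead between machines, which is typically larger on general-purpose cloud networks than on a supercomputer’s dedicated interconnect.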
According to the press release by Citadel Securities and Google Cloud, Prof. Koumoutsakos’s research centers on the use of magnetically controlled artificial bacterial flagella (ABF) to target and dissolve blood clots and circulating tumor cells in the human vasculature. The intricate simulations required for this research demand significant computing power and technical expertise, both of which are being provided by market-making firm Citadel Securities and Google Cloud.
Prof. Koumoutsakos said, “With the support of Citadel Securities and Google Cloud, we aim to demonstrate that public cloud resources can be harnessed to handle large-scale, high-fidelity simulations for medical applications.” He emphasized the potential benefits of easily accessible cloud computing resources, including reduced research costs and faster solutions to pressing global issues.
Bill Magro, Chief Technologist, HPC at Google Cloud, stated, “Google Cloud’s high performance computing technologies and solutions are purpose-built to both simplify and scale the largest, most complex workloads, enabling researchers to dramatically accelerate time to discovery and impact.”
Quantum Theory Driving Supercomputer Development
Tech giants like Microsoft are investing heavily in quantum computing R&D and see it as a major part of our technological future. As quantum computing moves toward scalable reality through platforms such as Azure Quantum, the potential of supercomputers is growing even further.
Quantum computers are not just faster than classical computers; they are fundamentally different. They can tackle problems that are practically intractable for classical computers, such as factoring very large numbers or simulating complex molecules. This has the potential to revolutionize many industries, from cryptography to drug discovery.
For example, quantum computers could be used to break encryption schemes currently considered secure, which could have a major impact on internet security. They could also be used to design new drugs and materials, or to optimize transportation networks.
Last Updated on November 8, 2024 12:01 pm CET