Scientific Computing
Software and Computing for High-energy Physics
Advanced software and globally distributed computing play a crucial role in the success of large scientific projects, of which the LHC is the largest. My involvement in computing for particle physics began as a postdoc at MIT, where colleagues and I developed the CDF Central Analysis Farm [IEEE Trans. Nucl. Sci. 51, 892 (2004)]. As Deputy Software and Computing Manager for US ATLAS, I was responsible for managing a budget of $20M/year and 45 FTEs across U.S. computing facilities and core software activities. I was also a co-PI of the NSF-funded DASPOS Project, which explored viable data, software, and analysis preservation strategies for high-energy physics and other science domains.

In partnership with NCSA, I started the Tier-2 Computing Center at Illinois for LHC data processing in 2010. The Illinois Tier-2 Center now has ~5400 CPU cores and is an integral part of the Midwest Tier-2, the largest LHC Tier-2 facility in the world. Through allocations on the Blue Waters supercomputer, my graduate student Dewen Zhong and I have worked to apply machine learning techniques to the problem of boosted Higgs boson identification; a simplified sketch of this kind of jet tagger appears below.
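To give a concrete flavor of ML-based jet tagging, the sketch below trains a gradient-boosted decision tree to separate Higgs-like jets from the QCD background using a handful of standard jet-substructure observables. It is a minimal illustration under stated assumptions: the feature set (jet mass, the N-subjettiness ratio τ21, and a subjet b-tag count), the toy event generator, and the model choice are all illustrative and do not describe the actual Blue Waters study.

```python
# Toy sketch of a boosted H -> bb tagger. The features and the
# distributions used to fake signal/background jets are illustrative
# assumptions, not the real analysis inputs.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
N = 10_000  # toy jets per class


def toy_jets(n, signal):
    """Return (mass, tau21, n_btags) features for n toy jets."""
    if signal:  # Higgs-like: mass peak near 125 GeV, two-prong, b-rich
        mass = rng.normal(125.0, 10.0, n)
        tau21 = rng.beta(2.0, 5.0, n)       # small tau21 -> two-prong
        n_btags = rng.binomial(2, 0.7, n)
    else:       # QCD-like: falling mass spectrum, one-prong, b-poor
        mass = rng.exponential(60.0, n) + 30.0
        tau21 = rng.beta(5.0, 2.0, n)
        n_btags = rng.binomial(2, 0.1, n)
    return np.column_stack([mass, tau21, n_btags])


X = np.vstack([toy_jets(N, True), toy_jets(N, False)])
y = np.concatenate([np.ones(N), np.zeros(N)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"toy tagger ROC AUC = {auc:.3f}")
```

In a realistic study, one would replace the toy generator with fully simulated events and typically move from fixed high-level features to deep networks trained on lower-level jet constituents, which is exactly the kind of computation that benefits from large allocations such as Blue Waters.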
Software Institute for High-energy Physics
Conceptualization. In 2016, I received an NSF award (ACI-1558233), together with co-PIs at Princeton and Cincinnati, to lead a conceptualization effort for an NSF Scientific Software Innovation Institute for HEP (S2I2-HEP). We initiated a community-wide roadmap process to broadly identify the key elements of computing infrastructure and software R&D required to realize the full scientific potential of high-luminosity LHC running, which starts in 2024. The kick-off workshop for the S2I2-HEP Conceptualization was held jointly at the University of Illinois and NCSA in December 2016.
The S2I2-HEP Conceptualization is now complete: our Strategic Plan has been submitted to the NSF, and the community has produced a Community White Paper, for which I served on the Editorial Board, as convener of the Data Analysis and Interpretation (DAI) Working Group, and as editor of the DAI WG report.
Further Reading
- T. Kim et al., “The CDF Central Analysis Farm”, IEEE Trans. Nucl. Sci. 51, 892-896 (2004)
- A. Alves Jr. et al. [HSF Collaboration], “A Roadmap for HEP Software and Computing R&D for the 2020s”, arXiv:1712.06982 (2017)
- P. Elmer, M. S. Neubauer, and M. D. Sokoloff, “Strategic Plan for a Scientific Software Innovation Institute for High Energy Physics”, arXiv:1712.06592 (2017)
- L. Bauerdick et al. [HSF Collaboration], “HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation”, arXiv:1804.03983 (2018)
- D. Berzano et al. [HSF Collaboration], “HEP Software Foundation Community White Paper Working Group - Training, Staffing and Careers”, arXiv:1807.02875 (2018)
- K. Albertsson et al. [HSF Collaboration], “HEP Software Foundation Community White Paper Working Group - Machine Learning”, arXiv:1807.02876 (2018)