Robert W. Gardner

Senior Fellow

Biography

I spent the first part of my academic career doing traditional experimental high-energy physics research at universities in the Midwest, and I have worked continuously since 1998 as a physicist on the 3000-member ATLAS experiment at the Large Hadron Collider (LHC) at the CERN Laboratory in Geneva, Switzerland.  My experimental work led me to specialize in developing and improving the distributed computing technologies necessary for discoveries at the frontier of particle physics.

No longer a classic physicist, I now integrate data-analytic techniques, the technological development and implementation of distributed computing and data systems, large-scale coordination of international teams, and research on computing itself.  As part of GriPhyN (the Grid Physics Network, National Science Foundation), I developed virtual data technologies for dataset provenance tracking and automated workflow generation, using high-energy physics as an exemplar use case.  I worked collaboratively with Intalio Inc. to develop frameworks based on process calculi for distributed, service-based computation in physics.  I was instrumental in developing early research computing grids in the U.S.: the International Virtual Data Grid Laboratory (iVDGL), for which I was Co-PI and Coordinator (NSF-ITR), and the first deployment of the Open Science Grid (OSG) (NSF, Department of Energy), which I led.  I have also built systems for metrics collection in distributed systems (Grid Telemetry, PI, NSF-ITR).

A central theme of my activity has been facilitating analyses of the multi-petabyte-scale data from the LHC.  These analyses have been conducted by international physics teams during the important startup phase of the accelerator, leading up to last year’s major discovery of the Higgs-like boson.  My current responsibilities include directing the ATLAS Midwest Tier2 Center, which comprises integrated computing facilities at the University of Chicago, Indiana University, and the University of Illinois.  The Midwest Tier2 Center has consistently exhibited top performance worldwide: among the roughly 150 computing centers in the LHC computing grid, including CERN, it has at times been second in production only to the Tier1 Center at Brookhaven National Laboratory, New York.

I am also in charge of the distributed U.S. ATLAS Computing Facility Integration Program and coordinate the activities of the U.S. ATLAS Tier2 Centers.  The Integration Program involves five federated centers comprising ten U.S. institutions, including Harvard University, Boston University, the University of Michigan, Michigan State University, the University of Texas at Arlington, the University of Oklahoma, and the Stanford Linear Accelerator Center, in addition to the institutions of the Midwest Tier2 Center.  This coordination activity has been highly rewarding and has fostered wide recognition of the superior performance and efficiency of the U.S. ATLAS Computing Facility among regional computing fabrics in Europe and Asia.

Data analysis by large groups of physicists distributed around the globe prompted the organization of a data federation project, which I also lead, with the goal of providing seamless access to the global data store for the international ATLAS Collaboration.  To date, 25 institutions from Germany, Italy, Russia, Spain, the UK, the US, and CERN participate in the federation, which promises to yield new modes of physics access to both data and computation resources.
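Conceptually, a federation of this kind exposes a single global namespace and redirects each read request to whichever participating site currently holds a replica of the requested dataset.  The short Python sketch below is purely illustrative, not ATLAS software; the site names, catalogs, and replica-selection policy are hypothetical stand-ins for the real redirection machinery.

    # Purely illustrative sketch of the federation idea: a single logical
    # namespace, with each read redirected to a site holding a replica.
    # Site names and catalog contents here are hypothetical.

    SITE_CATALOGS = {
        "site-chicago": {"/atlas/data/run01.root", "/atlas/data/run02.root"},
        "site-cern":    {"/atlas/data/run02.root"},
        "site-munich":  {"/atlas/data/run03.root"},
    }

    def locate(logical_name):
        """Return all sites able to serve the requested logical file name."""
        return [site for site, catalog in SITE_CATALOGS.items()
                if logical_name in catalog]

    def open_federated(logical_name):
        """Resolve a logical name through the 'redirector' and pick a replica."""
        replicas = locate(logical_name)
        if not replicas:
            raise FileNotFoundError(logical_name)
        # A production federation chooses replicas by locality, load, and cost;
        # this sketch simply takes the first site found.
        return replicas[0]

    if __name__ == "__main__":
        print(open_federated("/atlas/data/run02.root"))  # e.g. "site-chicago"

In production this role is played by a hierarchy of redirectors rather than a static catalog, but the access pattern seen by the physicist is the same: one logical name, many possible sources.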

Another exciting leadership role has been in the context of multi-science campus computing initiatives at the University of Chicago and within the broader communities of the Open Science Grid (an international cyberinfrastructure of over 115 universities and national laboratories in North, South, and Central America, as well as Korea), with a particular focus on bridging knowledge, expertise, best practices, and resources across university campuses.  At the University of Chicago, we have taken the first steps toward a distributed high-throughput computing infrastructure that opens project-dedicated resources for shared use and also serves as a gateway for scheduling jobs onto opportunistic computing cycles available on the Open Science Grid.  In this program we are partnering with the University’s new Research Computing Center team to provide a coherent roadmap with clear support channels and investment options, one that flexibly makes all available resources usable by faculty, postdocs, and students.  The “UC3” project (UChicago Computing Cooperative) has begun with engagements with researchers from the South Pole Telescope Collaboration (particle astrophysics), chemistry, biomedical sciences, and computational economics, all disciplines that benefit from the high-throughput computing infrastructure my team built.  Thus far, two publications have resulted.
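The gateway function described above amounts to a simple overflow policy: fill dedicated campus slots first, then spill remaining work onto opportunistic cycles.  In production this is handled by a workload manager such as HTCondor; the minimal Python sketch below, with hypothetical pool names and slot counts, only illustrates the routing logic itself.

    # Minimal sketch (not UC3 code) of the overflow idea behind a campus
    # high-throughput gateway: dedicated slots first, opportunistic slots next.
    # Pool names and slot counts are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class Pool:
        name: str
        free_slots: int
        assigned: list = field(default_factory=list)

        def try_accept(self, job):
            """Accept a job if a slot is free; return True on success."""
            if self.free_slots > 0:
                self.free_slots -= 1
                self.assigned.append(job)
                return True
            return False

    def route(jobs, dedicated, opportunistic):
        """Send each job to the dedicated pool first, overflow to the
        opportunistic pool, and queue whatever cannot run immediately."""
        queued = []
        for job in jobs:
            if not (dedicated.try_accept(job) or opportunistic.try_accept(job)):
                queued.append(job)
        return queued

    if __name__ == "__main__":
        campus = Pool("uc3-dedicated", free_slots=2)       # hypothetical
        osg = Pool("osg-opportunistic", free_slots=3)      # hypothetical
        backlog = route([f"job-{i}" for i in range(7)], campus, osg)
        print(campus.assigned, osg.assigned, backlog)

The design point is that campus researchers submit to one place; whether a given job lands on dedicated hardware or on opportunistic grid cycles is a policy decision made by the infrastructure, not by the user.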

Finally, I am in the beginning phases of a new program of research into data and software preservation and into environments that ensure sustainable usability, having recently formed a local group with members from the Computation Institute, the Institute for Genomics and Systems Biology, and the digital library staff of the University of Chicago (DASPOS, NSF).  From this standpoint, my work connects with the computer and information sciences in addition to being applicable to physics and other disciplines concerned with what is currently labeled Big Data.