CI Year in Review 2016

December 22, 2016

As the year comes to a close, it's time to look back at an exciting 2016 of discoveries, events, announcements, and exploration at the Computation Institute. Enjoy this recap of our work in climate change, particle physics, education, cyberinfrastructure, genomics, spatial analytics, and much, much more.

January

Array of Things had a very busy 2016, building up to the installation of the first nodes on Chicago intersections in late August. At the start of the year, the urban sensing project solidified its manufacturing partnerships, working with local companies PDT and Surya Electronics to create the distinctive node enclosures that protect the sensors and the internal technology that powers them.

Another project between the Urban Center for Computation and Data and the City of Chicago made its debut in January, as the open data portal OpenGrid went live. Built on top of UrbanCCD's Plenario platform, the website allows users to easily discover and work with city data on topics ranging from business licenses to food inspections to 311 service requests. OpenGrid will be one of many places to find Array of Things data when it starts flowing in early 2017.

Two startups led by Computation Institute researchers received investment from the Chicago Innovation Fund, which helps University research cross over to entrepreneurial opportunities. Navipoint Genomics, led by Paul Davé, Ravi Madduri, Dina Sulakhe, and Alex Rodriguez, expands the cloud-based genomic analysis capabilities of Globus Genomics to clinical testing. Praedictus Climate Solutions, led by Joshua Elliott, David Kelly, and Ian Foster, applies climate and agriculture modeling techniques to financial and insurance markets.

February

The experimental cloud computing infrastructure Chameleon, led by CI Senior Fellow Kate Keahey, was used this year for a broad variety of scientific projects that touch upon the growing potential and usage of cloud resources. One multi-institutional research group applied the technology to improve defenses against cyberattacks on cloud computing, developing better intrusion detection and prevention systems.

While most of the particle-colliding action of CERN occurs at the Large Hadron Collider in Switzerland, the data analysis on those experiments happens worldwide as thousands of scientists collaborate on the results. To help coordinate this massive effort, Computation Institute scientists on the ATLAS experiment built a powerful new dashboard to visualize and explore the petabytes of data produced by every LHC run. 

As extreme weather events such as droughts and heat waves become more frequent, food shocks disrupting global agriculture are likely to follow. A task force including RDCEP's Joshua Elliott brought their scientific findings on these crises, and how to best prepare for them, to Washington DC, where they made separate presentations to Congressional staffers and to the scientific community assembled for the American Association for the Advancement of Science (AAAS) Annual Meeting. You can watch an AAAS video on Elliott's climate and agricultural modeling work below.

March

New kinds of science require new kinds of scientific publications, and the computation community often discusses how best to credit those who write the software that enables today's most exciting discoveries. A new journal called SoftwareX, launched by a team including the CI's Kate Keahey, received a special Award for Innovation in Publishing for developing a new format (Original Software Publication) to help preserve software and encourage collaboration and citation.

A team of CI researchers at Argonne pushed climate modeling to a new level by running the highest resolution forecast of North America ever conducted. CI senior fellow V. Rao Kotamarthi and Argonne postdoctoral researcher Jiali Wang simulated 100 years of climate with grids of only seven miles per side, improving performance on modeling rare weather patterns and creating powerful new data for studying crops and flood prevention.

Argonne's first-ever neuroscientist, Narayanan “Bobby” Kasthuri, visited the CI to talk about how computation helps him and other researchers build more and more highly-detailed maps of the brain -- and how the intricate networks of the human brain could return the favor by inspiring new kinds of algorithms and artificial intelligence approaches.

April

The superconductor was one of the greatest discoveries of the 20th century, but continuing to improve upon its design in the 21st requires new computational materials science techniques. A paper published in Advanced Materials by CI staff researcher Ivan Sadovskyy used supercomputer simulations to study how small defects in superconducting material can improve its performance.

Most of the news about climate change and agriculture is very bad, but a Nature Climate Change study from RDCEP researchers led by Delphine Deryng found that elevated atmospheric carbon may actually help certain types of crops. 

As more cities discover and implement innovative ways of using data to improve their operations, it becomes imperative that the successful ideas are shared and propagated between municipalities. A new Civic Analytics Network, created by the CI's Center for Data Science and Public Policy and Harvard's Ash Center for Democratic Governance and Innovation, seeks to lower the cultural and technical barriers that city chief data officers face in adopting, within their own governments, software tools developed elsewhere.

The 2016 edition of GlobusWorld was organized around the transition of the Globus data management service from software to platform, with CI director Ian Foster offering a new vision of international scientific collaborations using Globus to coordinate their complex cyberinfrastructure needs. From big science projects such as the National Spherical Torus Experiment to small labs, Foster suggested that data pipelines, user authentication, and other Globus features can make science and discovery easier for all.


May

While its primary purpose is to enable new data collection for the improvement of cities, the Array of Things platform is also a valuable tool for education, giving students of all ages an opportunity to learn and practice digital fabrication, data science, and other valuable skills on a subject that is familiar and relevant. Lane Tech High School in Chicago was the site for the first pilot AoT educational workshop, dubbed "Lane of Things" -- read about the sensor projects developed and built by Lane Tech students to learn more about their school environment.

The first Convening on Urban Data Science, organized by UrbanCCD and held at the Polsky Center for Entrepreneurship and Innovation, brought together scientists from around the world and from across disciplines to discuss this growing scientific field, and its promise, challenges, and ethics. Panels covered applications of urban data science for improving everything from public transit to health care, so long as this new community of sociologists, computer scientists, urban planners, city officials, and the public find common ground on the right research questions and standards of consent.

“Cyberinfrastructure is the substrate of all scientific computation,” said CI Senior Fellow Rob Gardner at his May talk for the Enrico Fermi Institute about the future of cyberinfrastructure. The talk explained how the Open Science Grid and other distributed resources are changing how research institutions handle, compute, and share data, within and across campus borders.  

To kick off the conservation-focused Campus as a Lab Initiative, several organizations including RDCEP hosted a late-night hackathon where UChicago students, staff, and faculty worked with real campus energy data. After learning about how to work with these data sources and hearing from UChicago Facilities Services about the challenges they face, teams explored the data and generated new ideas for improving campus sustainability. 

Summer

Another effort to create a new form of scientific publication, called Whole Tale, launched with participation from the Computation Institute and Globus. Combining software tools such as Jupyter, Globus, and D3, the platform was designed to make scientific results more replicable, accessible, and collaborative. 

The annual Data Science for Social Good summer fellowship enjoyed a successful fourth year, working on projects ranging from improving sanitation in Kenya and social services in Mexico to supporting education, criminal justice, and police reforms in Charlotte, Milwaukee, Nashville, and Tulsa. After receiving attention from the Chicago Tribune, The Economist, and Government Technology, the teams made their final presentations at the summer-ending Data Fest, while also helping organizations develop similar efforts at the inaugural Data Science for Social Good conference.

Thousands of miles away from the Large Hadron Collider in Geneva, Switzerland, UChicago scientists play critical roles in the unprecedented high energy physics experiments conducted at CERN. A UChicago news feature looked at the local contributions to this international collaboration, including the work of several CI researchers. 

For all their technological sophistication, supercomputers and data centers are constrained by a very simple restriction: heat. To help create more energy-efficient high performance computing, a team of researchers at Northwestern University used the CI's experimental computing cloud Chameleon to test new machine learning algorithms that determine the best way to distribute tasks and reduce temperatures in busy clusters. 
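The Northwestern team's actual machine learning models are far more sophisticated, but the core idea of temperature-aware task placement can be sketched as a toy greedy scheduler. In this illustrative sketch (the node names, loads, and thermal constant are all invented for the example, not taken from the study), each task goes to the currently coolest node, whose predicted temperature then rises with its load:

```python
def place_tasks(tasks, nodes, heat_per_unit=0.5):
    """Greedily assign each task to the coolest node, then add the heat
    its load contributes. A toy stand-in for a learned placement policy:
    here the 'prediction' is simply current temperature plus new load."""
    assignment = {}
    for task_id, load in tasks:
        coolest = min(nodes, key=nodes.get)      # node with lowest temperature
        assignment[task_id] = coolest
        nodes[coolest] += load * heat_per_unit   # crude thermal model
    return assignment

# Four nodes with starting temperatures (degrees C) and five equal tasks.
nodes = {"n0": 40.0, "n1": 55.0, "n2": 45.0, "n3": 50.0}
tasks = [("t%d" % i, 10.0) for i in range(5)]
placement = place_tasks(tasks, nodes)
print(placement)
print(nodes)
```

Note that the already-hot node `n1` receives no work at all until the others catch up to its temperature, which is exactly the load-spreading behavior a real thermal scheduler aims for.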

September

Array of Things reached an important landmark with the installation of the first nodes on public Chicago intersections, starting with two nodes in the Pilsen neighborhood on the southwest side. The occasion was marked by coverage from USA Today, CNN, and Marketplace.

The new NSF-funded research cloud Jetstream, which uses Globus services to help create an easier "on-ramp" for researchers interested in using high-performance computing, officially launched on September 1st. 

As part of the Department of Energy's Exascale Computing Project, three efforts including CI and Argonne researchers received grants to develop important new applications that will rapidly capitalize upon this new era of computational power. Teams led by CI Senior Fellows Salman Habib, Rick Stevens, and Charlie Catlett each received funding to investigate topics such as dark energy, precision medicine for cancer, and city-scale urban data.

A serendipitous connection between researchers at the CI's Knowledge Lab and RDCEP led to an innovative new approach to testing economic theories: using satellite imagery of nighttime lighting as a proxy for economic activity. CI scientists Eamon Duede and Victor Zhorin found that these images, collected by NOAA, could provide highly sensitive indicators of economic growth and humanitarian crises.
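The underlying intuition — brighter pixels, more economic activity — reduces to aggregating radiance over a region's pixels. A minimal sketch of that aggregation follows; the toy radiance grid and region definitions are invented for illustration and are not NOAA data:

```python
def radiance_index(image, region):
    """Sum nighttime radiance over a set of (row, col) pixels — a crude
    proxy for local economic activity, as in nightlights studies."""
    return sum(image[r][c] for r, c in region)

# Toy 4x4 radiance grid: a bright 'urban' block in the upper-left corner.
scene = [[8.0, 9.0, 0.5, 0.2],
         [7.5, 8.5, 0.4, 0.1],
         [0.3, 0.2, 0.1, 0.0],
         [0.2, 0.1, 0.0, 0.0]]
urban = [(r, c) for r in range(2) for c in range(2)]
rural = [(r, c) for r in range(4) for c in range(4) if (r, c) not in urban]

print(radiance_index(scene, urban))  # 33.0
print(radiance_index(scene, rural))
```

Tracking how such an index changes over time for a fixed region is what lets researchers flag growth spurts or sudden crises without on-the-ground statistics.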

October

The CI's newest research group, the Center for Spatial Data Science, officially joined the fold, bringing expertise and popular research software on using spatial relationships to improve studies of economics, public health, social services, and many other topics. 

It's unlikely, but just in case the zombie apocalypse reaches Chicago, the researchers in Argonne's Complex Adaptive Systems group have us covered. Using ChiSIM, an agent-based model of the city and its citizens more commonly used for studying outbreaks of diseases such as MRSA and Ebola, the researchers simulated how long Chicago could resist the spread of a zombie-creating virus (spoiler: not long).
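ChiSIM itself is a detailed, data-driven model of Chicago, but the basic mechanics of an agent-based outbreak simulation fit in a few lines. In this toy sketch (every parameter — population size, bite and kill probabilities — is made up for illustration, not taken from the Argonne study), each zombie contacts one random agent per step, possibly converting a susceptible, and may itself be destroyed:

```python
import random

def simulate(pop=1000, zombies=5, bite_p=0.3, kill_p=0.1, steps=50, seed=42):
    """Minimal agent-based susceptible/zombie/removed model. Each step,
    every current zombie meets one random agent and bites a susceptible
    with probability bite_p; each zombie is destroyed with probability
    kill_p. Returns the count of susceptibles after each step."""
    rng = random.Random(seed)
    state = ["S"] * (pop - zombies) + ["Z"] * zombies
    history = []
    for _ in range(steps):
        zombies_now = [i for i, s in enumerate(state) if s == "Z"]
        for i in zombies_now:
            j = rng.randrange(pop)
            if state[j] == "S" and rng.random() < bite_p:
                state[j] = "Z"          # susceptible is bitten
            if rng.random() < kill_p:
                state[i] = "R"          # zombie is destroyed
        history.append(state.count("S"))
    return history

curve = simulate()
print(curve[0], curve[-1])
```

Watching how quickly the susceptible curve collapses under different bite and kill rates is the toy version of the question the Argonne team asked of their full city model.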


November

The second Campus as a Lab hackathon focused on campus building energy data, with student teams discovering anomalous energy usage that facilities personnel can further investigate for potential conservation and savings. University leadership also announced the new Odyssey Metcalf Energy Fellowship, an opportunity for students to find summer work helping make campus more Earth-friendly and efficient.

Another round of funding from the Exascale Computing Project again selected CI work for support, this time boosting "co-design" centers led by Ian Foster and other researchers. CODAR, the Co-Design Center for Online Data Analysis and Reduction at the Exascale, will develop new methods that bridge the gap between compute speed and data storage, making sure that faster supercomputers don't outpace scientists' ability to tease out discoveries from rapidly growing pools of data.

December

A puzzling mystery of climate models, which commonly predict more intense storms but only a mild increase in future precipitation, was potentially resolved by RDCEP researchers, who developed new statistical methods to identify and track storm features. Published in the Journal of Climate, the study found that climate change will indeed drive more intense storms, but that these storms will be smaller in size in many regions of the United States.

While millions of Americans struggle to find work, millions of job listings fail to attract qualified applicants. To help close this "skills gap," the CI announced a new research center this month: the National Center for Opportunity Engineering & Analysis (NCOEA), which will draw upon data and expertise from economics, education, social science, and computer science.

The Dust Bowl droughts of the 1930s devastated US agriculture and led to the displacement of millions of American families. RDCEP research published in Nature Plants found that, despite decades of improvements in farming technology, a similar drought would hit today's agriculture just as hard -- and higher temperatures might exacerbate the crisis even further.
 
