26
Mar
2013

Among scientific disciplines, botany might be considered one of the least tech-minded branches, concerned as it is with the natural world of plant life. But like the rest of biology, botany is quickly moving into the kinds of large-scale experiments that require more sophisticated techniques. In many botany labs, high-throughput sequencers now generate genomic data at unprecedented rates for many different plant species.

21
Mar
2013

The general public tends to think of supercomputers as the big brothers of their home computers: larger, faster and more powerful versions of familiar everyday devices. But in his talk last week for Argonne National Laboratory's OutLoud series, CI senior fellow Peter Beckman urged the crowd to think of supercomputers more imaginatively as the real-life version of a common sci-fi device: the time machine.

A modern laptop is faster than the state-of-the-art supercomputer Beckman used at Los Alamos National Laboratory in 1995, he said. That same year, a machine with the computing speed of today's iPad would have ranked on the Top 500 list of the world's fastest computers. Beyond raw speed, the programming strategies and hardware architectures developed on the room-sized supercomputers of the last 60 years have gradually trickled down to consumers, as with the multi-core processors and parallel operations found in new laptops.
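None of this was presented as code in the talk, but the trickle-down is easy to demonstrate. The short Python sketch below, an illustration of commodity multi-core parallelism rather than anything from Beckman's presentation, splits a CPU-bound job (counting primes) across the cores of an ordinary laptop using only the standard library.

```python
from concurrent.futures import ProcessPoolExecutor
import math

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, math.isqrt(n) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Split one big range into chunks and farm them out to worker processes,
    # one per CPU core by default.
    chunks = [(i, i + 250_000) for i in range(0, 2_000_000, 250_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(f"primes below 2,000,000: {total}")
```

On a multi-core laptop the process pool finishes this job several times faster than a single sequential loop over the same range, the kind of parallel speedup that once required a dedicated machine room.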

19
Mar
2013

"Civic hacking" has become a popular way for people skilled in programming and data crunching to give back to their community. Through organized Hack-a-thons or groups such as Open City and Code for America, volunteers imaginatively transform enormous tables of numbers into user-friendly web and mobile tools that bring localized and interactive information about a city to its citizens.

15
Mar
2013

A few weeks ago, we urged readers to vote in the Bloomberg Mayors Challenge for the City of Chicago's entry, a collaboration with the Urban Center for Computation and Data called the SmartData Platform. This week brought good news: the project was chosen for a $1 million grant from Bloomberg Philanthropies to launch the platform, one of five proposals to receive funding from the original pool of 305 applications. The SmartData Platform will put city datasets -- like those that can be found on the city's data portal -- to work in making the city run more effectively and efficiently, and the UrbanCCD will help provide the computational expertise and tools to extract the maximum potential from the data. The new open-source platform is considered the next iteration of the WindyGrid system currently used internally by the city, which Chicago's Chief Data Officer, Brett Goldstein, discussed at the recent Urban Sciences Research Coordination Network workshop.

Chicago and the other Bloomberg winners were covered by the New York Times, the Chicago Sun-Times, Crain's Chicago Business, NBC Chicago, ABC Chicago and The Atlantic Cities.

13
Mar
2013

Big science projects can afford big cyberinfrastructure. For example, the Large Hadron Collider at CERN in Geneva generates 15 petabytes of data a year, but also boasts a sophisticated data management infrastructure for the movement, sharing and analysis of that gargantuan data flow. But big data is no longer an exclusive problem for these massive collaborations in particle physics, astronomy and climate modeling. Individual researchers, faced with new laboratory equipment and methods that can generate their own torrents of data, increasingly need their own data management tools, but lack the hefty budget large projects can dedicate to such tasks. What can the 99% of researchers doing big science in small labs do with their data?

That was how Computation Institute director Ian Foster framed the mission at hand for the Research Data Management Implementations Workshop, happening today and tomorrow in Arlington, VA. The workshop was designed to help researchers, collaborations and campuses deal with the growing need for high-performance data transfer, storage, curation and analysis -- while avoiding wasteful redundancy.

"The lack of a broader solution or methodology has led basically to a culture of one-off implementation solutions, where each institution is trying to solve their problem their way, where we don't even talk to each other, where we are basically reinventing the wheel every day," said H. Birali Runesha, director of the University of Chicago Research Computing Center, in his opening remarks.

07
Mar
2013

"We know more about the movement of celestial bodies than about the soil underfoot."

Leonardo da Vinci never gave a TED talk, but if he had, that quote from around the beginning of the 16th century might have been a good tweetable soundbite. Five centuries later, da Vinci's statement still holds true, and it was there for CI Senior Fellow Rick Stevens to pluck as the epigraph for his talk in November 2012 at the TEDxNaperville conference. Stevens used his 18 minutes on the TED stage to talk about the Earth Microbiome Project, an international effort "to systematically study the smallest life forms on earth to build a comprehensive database to capture everything we can learn about these organisms."


06
Mar
2013

They say a picture is worth a thousand words. But if your camera is good enough, the photos it takes could also be worth billions of data points. As digital cameras have grown increasingly popular over the last two decades, they have also become exponentially more powerful in image resolution. The highest-end cameras today claim 50-gigapixel resolution, meaning they are capable of taking images made up of 50 billion pixels. Many of these incredible cameras are so advanced that they have outpaced the resolution of the displays used to view their images -- and the ability of humans to find meaningful information within their borders.

Closing this gap was the focus of Amitabh Varshney's talk for the Research Computing Center's Show and Tell: Visualizing the Life of the Mind series in late February. Varshney, a professor of computer science at the University of Maryland, College Park, discussed the visual component of today's big data challenges and the solutions scientists are developing to extract maximum value from the new wave of ultra-detailed images -- a kind of next-level Where's Waldo? search. The methods he discussed combine classic psychology about how vision and attention work in humans with advanced computational techniques.
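Varshney's own techniques were not presented as code, but the underlying idea of computationally guiding attention can be illustrated with a much cruder stand-in: a simple center-surround saliency heuristic that flags the tiles of a huge image where fine-scale detail stands out most. The blur scales, tile size and synthetic test image below are assumptions for illustration only, not his methods.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simple_saliency(image):
    """Crude center-surround saliency: fine-scale blur minus coarse-scale blur."""
    img = image.astype(float)
    fine = gaussian_filter(img, sigma=2)
    coarse = gaussian_filter(img, sigma=16)
    return np.abs(fine - coarse)

def top_tiles(image, tile=256, k=5):
    """Return the (row, col) corners of the k most 'interesting' tiles."""
    sal = simple_saliency(image)
    h, w = sal.shape
    scores = []
    for r in range(0, h - tile + 1, tile):
        for c in range(0, w - tile + 1, tile):
            scores.append((sal[r:r + tile, c:c + tile].mean(), (r, c)))
    return [corner for _, corner in sorted(scores, reverse=True)[:k]]

if __name__ == "__main__":
    # Synthetic grayscale stand-in for a far larger gigapixel mosaic.
    rng = np.random.default_rng(0)
    img = rng.normal(size=(2048, 2048))
    img[512:640, 1024:1152] += 5.0  # an embedded "Waldo" patch to find
    print(top_tiles(img))  # the tile containing the patch should rank first
```

In a real gigapixel pipeline the same ranking step would run over every tile of the full mosaic, so a viewer (or a human analyst) can be steered toward the handful of regions worth inspecting at full resolution.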

01
Mar
2013

Climate change is not an equal-opportunity threat. Certain areas of the world are more susceptible than others to the shifts in temperature, precipitation and other effects that climate scientists predict will occur over the next century. Even when these vulnerable regions are remote and rarely visited by humans, how climate change affects their ecosystems could have dramatic consequences for the global population.

One such area of vulnerability is the high-latitude peatlands: the bogs, fens and boreal forests found in the northern parts of Canada, Alaska and Russia. Much of this frigid land carries a layer of permafrost year-round, but the vegetation and soil there account for the majority of Earth's biomass and carbon storage -- roughly double that of the much better-known tropical forests. Climate change is also expected to be particularly dramatic in these areas, with some models predicting temperatures will increase by as much as 7.5 degrees Celsius over the next hundred years.

Because of this vulnerability, Argonne scientist Zhaosheng Fan is focusing his attention specifically on how climate change will affect this unique ecosystem. In his talk at the Computation Institute on February 14, Fan, an Assistant Biogeochemical Modeler in the Biosciences Division at ANL, described how scientists created a model of these high-latitude peatlands, how they refined it with field experiments, and the very serious warnings it produced about how these remote areas might someday affect the rest of the world.
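Fan's peatland model is far more detailed than anything that fits in a blog post, but the basic logic of a soil-carbon response to warming can be sketched as a toy two-pool decomposition model driven by a Q10 temperature sensitivity. All pool sizes, rate constants and the Q10 value below are illustrative assumptions, not parameters from Fan's work.

```python
# Toy two-pool soil carbon model with Q10 temperature sensitivity.
# All numbers are illustrative assumptions, not values from Fan's model.
Q10 = 2.0        # decomposition rate doubles for every 10 C of warming
K_FAST = 0.05    # per-year base decay rate of the "fast" litter pool
K_SLOW = 0.002   # per-year base decay rate of the "slow" peat pool
INPUT = 0.3      # kg C / m^2 / yr of fresh plant litter entering the fast pool
TRANSFER = 0.2   # fraction of decomposed fast carbon that moves to the slow pool

def simulate(years, warming_per_century, c_fast=6.0, c_slow=30.0, t_ref=0.0):
    """Euler-step the two pools under a linear warming trend.

    The default pool sizes start the system at its unwarmed steady state,
    so any drift in total carbon comes from the imposed warming.
    Returns total soil carbon (kg C / m^2) for each simulated year.
    """
    totals = []
    for yr in range(years):
        temp = t_ref + warming_per_century * yr / 100.0
        scale = Q10 ** ((temp - t_ref) / 10.0)   # warming speeds decomposition
        fast_loss = K_FAST * scale * c_fast
        slow_loss = K_SLOW * scale * c_slow
        c_fast += INPUT - fast_loss
        c_slow += TRANSFER * fast_loss - slow_loss
        totals.append(c_fast + c_slow)
    return totals

if __name__ == "__main__":
    stable = simulate(100, warming_per_century=0.0)
    warmed = simulate(100, warming_per_century=7.5)  # high-end scenario from the talk
    print(f"carbon change, no warming: {stable[-1] - stable[0]:+.2f} kg C/m^2")
    print(f"carbon change, +7.5 C:     {warmed[-1] - warmed[0]:+.2f} kg C/m^2")
```

Even this toy version reproduces the qualitative warning: warm the system and decomposition outpaces fresh litter inputs, so the soil begins losing carbon it has stored for millennia.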