The traditional science research article doesn’t perform many tricks. When journals made the leap from paper to web, they largely settled on the static format of the PDF, a format that offers easy printing and little else. In the meantime, research across the spectrum of science has grown more data-intensive and computational, and programmers have developed exciting new ways to document, publish, share, and collaborate on projects. The ingredients are all there for a new kind of living, dynamic scientific publication.


Cyberinfrastructure is the connective tissue for computational science, tying together the research projects, resources, software, data, networks, and people needed to make important discoveries. In an era when nearly all research is becoming computational to some degree, the importance of building strong cyberinfrastructure to support it grows -- as do the challenges. But what will the cyberinfrastructures of the future look like?


Chameleon: Why Computer Scientists Need a Cloud of Their Own


It's been almost a year since Chameleon, the experimental cloud computing testbed co-run by the Computation Institute and the Texas Advanced Computing Center, went into full production for research use. Already, 600 users across 150 projects have used the system to explore new technologies and applications for cloud computing, from finding unknown exoplanets to preventing cyberattacks. Last week, HPCwire spoke to CI Senior Fellow Kate Keahey and other members of the Chameleon team, surveying its early successes and previewing the innovations still to come.



The Discovery Cloud is CI Director Ian Foster's vision to deliver powerful computational tools and methods to every professional and amateur scientist around the world, fundamentally transforming the ecosystem of science. Globus is the first step towards realizing this vision.


The Extreme Science and Engineering Discovery Environment (XSEDE) is the most advanced, powerful, and robust collection of integrated advanced digital resources and services in the world. It is a single virtual system that scientists can use to interactively share computing resources, data, and expertise.

The OpenAD/F project seeks to develop a modular, open-source tool for the automatic generation of adjoint code from Fortran 95 source code. Discrete adjoint computations are used for sensitivity analysis and to provide the gradients used in geophysical state estimation. Because derivatives are needed with respect to millions or billions of independent variables, finite difference approximations are impractical: a gradient computation that takes minutes or hours with an adjoint would take months or years with finite differences.
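The cost gap is easy to see in a toy example. The sketch below (plain Python, not OpenAD/F or Fortran; all function names are illustrative) compares the number of function evaluations needed to compute a gradient of a scalar objective by forward finite differences versus a hand-written adjoint: finite differences need one evaluation per independent variable, while the adjoint needs one forward evaluation plus one reverse sweep, regardless of how many variables there are.

```python
def f(x, counter):
    """Toy scalar objective f(x) = sum(x_i^2); counter[0] tracks evaluations."""
    counter[0] += 1
    return sum(xi * xi for xi in x)

def grad_finite_diff(x, h=1e-5):
    """Forward-difference gradient: n + 1 evaluations of f for n variables."""
    calls = [0]
    f0 = f(x, calls)
    g = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        g.append((f(xp, calls) - f0) / h)
    return g, calls[0]

def grad_adjoint(x):
    """Hand-written adjoint of f: one forward evaluation, then a reverse
    sweep whose cost does not grow with the number of variables."""
    calls = [0]
    f(x, calls)                       # single forward evaluation
    g = [2.0 * xi for xi in x]        # adjoint sweep: d/dx_i sum(x_j^2) = 2 x_i
    return g, calls[0]

x = [float(i) for i in range(100)]
g_fd, n_fd = grad_finite_diff(x)      # 101 evaluations of f
g_adj, n_adj = grad_adjoint(x)        # 1 evaluation of f
```

With a million variables, the finite-difference route would need a million-plus evaluations of the full model; the adjoint still needs roughly one, which is the scaling argument behind generating adjoint code automatically.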

Researcher Spotlight