
Compiler 2/22/13: Vote Chicago and Dr. Watson

By Rob Mitchum // February 22, 2013

A CITY PROJECT BATTLE ROYALE

As the keynote speaker at the Urban Sciences Research Coordination Network kickoff last Friday, the City of Chicago’s Brett Goldstein presented a blizzard of exciting city projects in various stages of development. One slightly-under-wraps project Goldstein touched upon was the SmartData platform, an ambitious plan to craft a new tool for decision-making and city services out of the abundant raw material of city data. In collaboration with the Computation Institute and the Urban Center for Computation and Data, the city’s Innovation and Technology team hopes to create a tool that will analyze the city’s many large datasets in real time to help the city respond to challenges more quickly and efficiently, while providing frequently updated, useful information to its citizens.

On Wednesday, that exciting new effort was announced as a finalist in the Bloomberg Philanthropies Mayors Challenge, a competition among ideas proposed by cities across the United States. As part of the judging, the public is invited to vote for their favorite project among the 20 finalists at the Huffington Post. We’re biased, of course, but to help make the case for Chicago’s project, you can read more about the SmartData platform here, or watch a video about the concept featuring Mayor Rahm Emanuel below.


COMPUTER IN A WHITE COAT

Last year at the Research Computing Center opening reception, attendees were given the chance to play Jeopardy against a legendary contestant: Watson. While most famous for once taking down Jeopardy grandmaster Ken Jennings on television, Watson was actually developed by IBM to be a complex problem-solver for any field faced with a tidal wave of data. Medicine is one such field, and in the cover story of The Atlantic this month, Dr. Watson gets the spotlight. Much like Watson drew upon a vast store of word associations to rapidly locate the right question on Jeopardy, IBM believes that the computer system will be able to quickly scan through an enormous corpus of medical literature and clinical information to find the right diagnosis for a patient, free from the bias or limitations of a regular human physician.

Beyond the details about the cover star, the article is a good overview of how data technology and computation are expected to transform the practice of medicine in the near future. Two main advantages of Watson pointed out by the article — the ability to handle “unstructured data” and the ability to convey uncertainty about a conclusion — reflect broader trends in computational research for medicine and biology, as well as other projects on climate change and the humanities. The article also covers health IT trends such as personalized medicine, the use of smartphones to monitor and transmit medical data, and tele-health interventions.

In the future as the innovators imagine it—“Health 2.0,” as some people have started calling it—you would be in constant contact with the health-care system, although you’d hardly be aware of it. The goal would be to keep you healthy—and any time you were in danger of becoming unhealthy, to ensure you received attention right away. You might wear a bracelet that monitors your blood pressure, or a pedometer that logs movement and exercise. You could opt for a monitoring system that makes sure you take your prescribed medication, at the prescribed intervals. All of these devices would transmit information back to your provider of basic medical care, dumping data directly into an electronic medical record.
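To make the “convey uncertainty” point above concrete, here is a toy sketch of a system that returns a ranked list of hypotheses with confidence scores rather than a single definitive answer. This is not IBM’s actual approach, and the conditions and findings below are invented purely for illustration:

```python
def rank_diagnoses(observed, knowledge_base):
    """Score each candidate by the fraction of its expected findings that were
    observed, normalize so the scores sum to 1, and return them ranked."""
    scores = {diagnosis: len(observed & findings) / len(findings)
              for diagnosis, findings in knowledge_base.items()}
    total = sum(scores.values()) or 1.0
    return sorted(((d, s / total) for d, s in scores.items()),
                  key=lambda pair: pair[1], reverse=True)

# Hypothetical, hand-made knowledge base -- purely for illustration.
knowledge_base = {
    "influenza":    {"fever", "cough", "myalgia"},
    "strep throat": {"fever", "sore throat", "swollen lymph nodes"},
    "common cold":  {"cough", "sore throat", "runny nose"},
}

print(rank_diagnoses({"fever", "cough"}, knowledge_base))
# [('influenza', 0.5), ('strep throat', 0.25), ('common cold', 0.25)]
```

The point of the toy is the output format: every hypothesis comes back with an explicit confidence attached, so a physician can see how strongly the evidence favors one conclusion over another instead of receiving a single unqualified answer.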

 

OTHER NEWS IN COMPUTATIONAL SCIENCE

Many experts believe that quantum computers will someday become necessary to keep improvements in processor speed along the exponential track of Moore’s Law. But in the meantime, two new papers published in Science this week propose an intermediate step towards quantum computation, demonstrating how a relatively easily scaled method called photonic boson sampling can be used to solve certain matrix calculations faster than a “classical” computer.
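The “certain matrix calculations” behind boson sampling are matrix permanents: the output probabilities of a photonic boson-sampling experiment are governed by permanents of submatrices of the device’s transfer matrix, and no efficient classical algorithm for computing permanents is known. As a rough sense of what the classical side of that calculation looks like, here is a minimal Python sketch of Ryser’s formula, whose cost still grows exponentially with the size of the matrix:

```python
def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula (exponential in n)."""
    n = len(A)
    total = 0.0
    for mask in range(1, 1 << n):                 # every non-empty subset of columns
        prod_of_row_sums = 1.0
        for row in A:
            prod_of_row_sums *= sum(row[j] for j in range(n) if mask & (1 << j))
        sign = (-1) ** bin(mask).count("1")       # (-1)^{|S|}
        total += sign * prod_of_row_sums
    return (-1) ** n * total

# Quick check against the 2x2 case: perm([[a, b], [c, d]]) = a*d + b*c
print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```

A boson-sampling device does not compute permanents outright; it samples from a distribution whose probabilities depend on them, which is something a tabletop photonic experiment can do but which, as far as anyone knows, a classical computer cannot do efficiently.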

Can big data build a better lie detector? An interesting article at Datanami talks about how fraud prevention services are using data algorithms akin to a souped-up version of Facebook’s new graph search to put together the history of a given person…and sniff out if they are up to no good.

Is the human brain computable? That’s going to be one of science’s biggest debates over the next decade as both Europe and the United States plan to launch massive and expensive efforts to simulate and/or map the human brain. But some scientists, including brain-machine interface researcher Miguel Nicolelis, are skeptical that engineering will ever be capable of reproducing the complex abilities and functions of the human brain.