
Big Data

Author: Guest Writer/Thursday, March 29, 2012/Categories: Uncategorized

When I was a grad student conducting ecological field research in the 1970s, I spent much of my time in the mountains. My research was an early examination of the environmental factors influencing the restoration of vegetation destroyed by fires in the Colorado Rockies. The project required collecting copious scientific measurements from multiple study sites, spanning diverse plant species, soil chemistry, light intensity, air temperatures, and fire timelines. All this data was necessary to develop a comprehensive predictive model of the factors limiting high-elevation forest and tundra recovery.

"Back in the day," I collected this data using a hand-held device developed by IBM called the "Port-a-punch," a portable punch-card system. IBM had never envisioned an ecological application for their prototype device, so I was on the "cutting edge." Compared with taking handwritten field notes, the little portable gadget was really quite handy. It worked by using a stylus to manually punch out chads in the cards, each hole representing a specific field measurement. The punched cards were later fed into a reader attached to one of those massive computers housed in a sterile, air-conditioned room back at the university.

I now use a laptop with more computing power than that entire room of machines, and data collection and storage have exploded over the past 40 years. There is little chance of this changing anytime soon. However, informed data interpretation is still critical to understanding a given situation, issue, or topic.

[Image: Old School Computer Room (credit: Computer History Museum)]

[Image: Port-a-punch Portable Data Recorder (credit: Wiki-Commons)]

I was reminded of all this while reading an article about efforts to get a handle on the flood of new digital information, or "big data."

The White House Office of Science and Technology Policy, DARPA, the National Science Foundation, the National Institutes of Health, and the Defense Department have joined forces to develop a new computational research initiative. The goal of this effort is to combine diverse data sources from the internet, industrial sensors, social networks, and satellites to improve our understanding of important issues: better weather prediction, climate change impacts, visual animations, crime prediction, business and insurance risk assessments, and any number of discoveries not yet imagined.

Hurricane and environmental impact analyses have already benefited from using "big data" for predictive modeling. However, the scientific and artistic applications of these new technologies are limited only by the imagination.

[Image: Hurricane Models Using "Big Data" (credit: NSF)]

[Image: Visualization of Data Sets (credit: NSF)]

The Texas Advanced Computing Center at the University of Texas was an early adopter of "big data" sets for creating advanced hurricane models. The Scientific Computing and Imaging Institute at the University of Utah has been a leader in data visualization, applied to everything from interactive surgery to animated films.

Such new computational and imaging tools will affect all of us in ways we can't yet imagine. We have certainly come a long way from my days of Port-a-punch data collection and mainframe computer analysis.
