Danforth event will examine growing world of bioinformatics

This article first appeared in the St. Louis Beacon, Aug. 28, 2012 - The dynamic of man vs. machine has fascinated the world for decades, but the new science of bioinformatics and the explosion of information it will bring may present significant hurdles for both.

“Genome sequencing efforts generate billions -- and in some cases trillions -- of data points,” said Dr. Todd Mockler. “The amount of data is routinely so big that no human can mine it without the aid of computational tools. That’s what it’s about, trying to make sense of large biological data sets so that biologists can infer actual results or make discoveries from overwhelming amounts of information.”

Mockler, the Geraldine and Robert Virgil Distinguished Investigator at the Danforth Plant Science Center, will be among those on hand Thursday evening at the center to discuss the challenges and consequences of the rise of bioinformatics, a part of the “big data” scene that could yield both great promise and thorny technical conundrums in the coming years. Mockler, a former Oregon State University professor whose work deals with genomics and computational approaches to understanding plant function, will be joined for a panel discussion by Mich Hein, founding and managing partner of Nidus Partners, and Ralph S. Quatrano, dean of the School of Engineering & Applied Science at Washington University.

Quatrano has nearly a quarter century of research under his belt from his work at the Center for Gene Research and Biotechnology at Oregon State University, DuPont Company, the University of North Carolina and Monsanto.

Dr. Jim Davis of Washington University will moderate the discussion, titled “Let’s Talk … Bioinformatics at the Danforth Center and Beyond.” The event is part of the “Conversations” series and is sponsored by Boeing.

Mockler said the very meaning of bioinformatics can be daunting.

“It’s one of these terms that has a bunch of definitions depending on who you talk to,” he said, “but from my perspective what it refers to is using computational approaches such as programming, algorithms and database infrastructures to query and mine large biological data sets. The last 10-15 years most of the biological sciences have really been taken over by so-called ‘big data’ approaches.”

That’s created a real need for academics and researchers who have experience in both the biological and computer science fields.

“There’s a huge IT aspect. Most biologists are not equipped or trained to manage a computational infrastructure from the hardware perspective,” he said, “so there’s the IT challenge of putting together the hardware and the team of people to keep it running.”

But keeping the machines going isn’t the only problem. The programs needed to take on the huge data sets emerging from research often haven’t even been written yet.

“A lot of the cool, cutting-edge stuff that people want to do with these data sets are things that there isn’t prewritten software to do,” Mockler said. “You can’t just buy some software package that will analyze the data for you.”

It’s a challenge increasingly faced by fields across the spectrum of science, government and commerce. With the volume of new information growing exponentially, how can relevant data be stored, sifted and organized so that those who need it can make sense of the cacophony?

“I think in the near term it’s more of the same in that the data being generated isn’t getting smaller, it’s getting bigger,” Mockler said. “Bioinformatics is playing catch up all the time, trying to catch up with the state of the art from the data generation side. Engineers generate genome sequencing machines with new versions coming out every six months and each new version generates maybe a hundred times more data than the last version.”

Often the answer is a multidisciplinary approach that uses expertise where needed.

“Even though there is a huge IT and programming aspect, there is still a really meaningful place for the biologists,” he said. “A lot of times the computer scientists and the statisticians don’t understand the biology at a sufficient level or don’t have an intuitive feel for the biology that’s required to make inferences from it. Usually, you have to have a team of people who bring different disciplines and skills to the table.”

The right answers could lead to advances in everything from better biofuels to the creation of useful biomaterials and new synthetically engineered biological systems that could benefit a number of fields.

Mockler said that bioinformatics has become a serious focus at Danforth since the appointment of Dr. James Carrington as president. Carrington previously served as director of Oregon State University’s Center for Genome Research and Biocomputing.

Mockler noted that the field could have wider implications for the community, with ever-expanding research from institutions like Washington University and Monsanto.

St. Louisans are increasingly looking at these sorts of issues. StampedeCon 2012, an event examining the challenges posed by big data, took place earlier this month.

“It’s an area where St. Louis could lead, so I think that will be a topic of conversation,” Mockler said.

Either way, the future remains as unwritten as the programs needed to manage it.

The event is free of charge, but reservations are required. Call 314-587-1070 for more information.