Testing from Pre-Clinical to Product Launch (Part Four)

author: Steven R. Tannenbaum, Department of Biological Engineering, Massachusetts Institute of Technology (MIT)
author: Noubar Afeyan, Flagship Ventures
author: Joseph V. Bonventre, Harvard-MIT Division of Health Sciences and Technology (HST), MIT
author: Linda G. Griffith, Center for Future Civic Media, Massachusetts Institute of Technology (MIT)
author: James Green, Biogen Idec
published: June 4, 2013,   recorded: August 2005,   views: 59
Description

“To me, systems biology is the religion you switch to when target-based drug discovery doesn’t work,” Noubar Afeyan states boldly. He claims that after losing billions of dollars, the pharmaceutical industry and academia are beginning to see the value in testing drugs by measuring outcomes in biological networks. He calls this systems pharmacology, where you “measure in living systems multiple analytes in the same organism, perturbing the state and taking thousands of measurements per sample.” Researchers use computer images to visualize the differences and similarities in drug response across many networks, and then try to correlate these responses statistically.

The inability to predict toxicity early in drug development cost the pharmaceutical industry an astonishing $8 billion in 2003, says Joseph Bonventre, roughly one-third the cost of all drug failures. “We generally can’t pick up toxicity until it’s too late,” he says. The key preclinical challenges are developing better studies with useful biomarkers, improved animal models, and high-throughput techniques; on the clinical side, they include creating a “safe harbor approach to amass kidney and other toxicity data,” forming consortia to validate biomarkers, resolving IP issues, and building “an improved bedside to bench flow of information.”

Linda Griffith’s vision is “building a human body on a chip.” She’s not talking about an individual’s genome or health history, but “a living, 3D interconnected set of tissues on a chip. If you perturb it, you make it develop a disease.” Such a device would enable researchers to predict negative drug interactions and even to build models of disease. Griffith’s version of liver tissue, built on a silicon scaffold, may prove especially useful for drug toxicity tests.

At Biogen, “the holy grail for any justification of a new approach or technology is that we’re going to chop a significant amount off the time it takes to move a new product from bench to bedside,” says James Green. He believes that “drugs and paradigms are orders of magnitude more complicated than 24 years ago.” He hopes that new techniques “that take us into the genome, interpreting data as patterns” offer some promise.
