by Dr. Amy Herr
Limited success in translating protein disease biomarkers to the diagnostic arena has emerged as a perplexing development of the last decade. In spite of significant advances in proteomic technology, few new protein biomarkers have emerged from the proteomic discovery pool, progressed through the scrutiny of validation studies, and become incorporated into diagnostic tools.[2, 3] In a compelling analysis of the biomarker “pipeline” problem, Zolg[3, 4] boldly posits that the biomedical community tends to overrate the biomarker discovery phase. In doing so, he asserts, researchers under-appreciate the true challenge facing personalized medicine in the 21st century: the arduous task of developing and undertaking rigorous, candid assessments of biomarker candidates within carefully planned validation schemes. Without a concerted, cohesive effort to develop the instrumental infrastructure required for high-throughput, reproducible validation studies, the critical gap between biomarker discovery and translation of those biomarkers into clinical and point-of-care diagnostic tools will remain.
Validation studies are essential for determining the statistically verifiable diagnostic potential of suspected disease biomarkers, both as stand-alone markers and as multi-analyte diagnostic panels. Researchers have recently speculated that biomarker validation may pose a greater challenge, and require more substantial innovation, than biomarker discovery. Several factors support this assertion: (i) Conventional validation schemes can be more costly and labor-intensive than discovery endeavors; yet the value of a promising biomarker rests on validation of the marker in the context of its intended use. (ii) Inherent to any validation undertaking is attrition of candidates; communicating negative results can be difficult and may attract limited attention from the community. Consequently, validation undertakings may hold limited interest for researchers. (iii) Tremendous innovation is sorely needed to meet validation scheme design specifications. Key specifications include reducing the labor required to complete a validation study, increasing throughput (both the number of samples from unique patients and the number of analytes measured per sample), and providing reproducible protein quantitation.
These challenges present a remarkable opportunity for decisive technological advances. Of primary importance to protein biomarker validation studies is the unrivaled opportunity for ready integration of multiple sophisticated preparatory and analytical steps using microfluidic technology. Since the early 1990s, interest has grown in bioanalytical methods built on microscale tools; key among the advantages of microfluidic approaches is the capability to integrate numerous functions. Integration, and the automation it enables, of sample preparation and analysis is advantageous for performing reproducible, quantitative measurements. As readers of the AES Newsletter know, electrophoretic handling and analysis are scale-dependent transport mechanisms, making sample handling and analysis remarkably efficient in microfluidic formats.[8-11] Although disease biomarker validation has not yet benefited directly, microfluidically enabled bioanalytical methods are emerging as rapid, reproducible core technologies for measuring proteins in clinical samples. Recent reports describe lab-on-a-chip instruments that perform multiple operations in parallel in extra-laboratory settings (e.g., field deployment, near-patient environments, resource-poor settings).[12-15]
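The favorable scaling mentioned above can be sketched with a back-of-the-envelope calculation (all numerical values below are illustrative assumptions, not drawn from this article): under a fixed applied voltage V, the electric field in a channel of length L is E = V/L, so an analyte with electrophoretic mobility μ traverses the channel in time t = L/(μE) = L²/(μV), which falls quadratically as the channel shrinks.

```python
# Illustrative sketch of electrophoretic scale dependence.
# Assumes constant applied voltage, so field E = V / L; the analyte
# velocity is u = mu * E, giving transit time t = L / u = L**2 / (mu * V).
# Mobility and voltage values are assumed, order-of-magnitude numbers.

def separation_time(L, mu=1e-8, V=1e4):
    """Transit time (s) through a channel of length L (m) for an analyte
    of electrophoretic mobility mu (m^2/V/s) at applied voltage V (V)."""
    return L**2 / (mu * V)

bench = separation_time(0.5)    # 50 cm bench-scale capillary
chip = separation_time(0.02)    # 2 cm microfluidic channel
print(f"bench-scale: {bench:.0f} s, chip-scale: {chip:.1f} s")
```

Shrinking the channel 25-fold shortens the transit time 625-fold at the same voltage, which is the essence of the efficiency argument for microfluidic formats (Joule heating and dispersion effects, neglected here, further favor small channels through their high surface-to-volume ratio).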
Further, manipulation of small fluid volumes is readily performed with lab-on-a-chip tools, making possible the preparation, handling, and analysis of previously inaccessible fluids and tissues with potentially rich protein content (e.g., prostatic fluid). Analysis of volume-limited diagnostic fluids, especially for multiple candidate biomarkers, would truly benefit from limited sample consumption. Volume-sparing analysis also enables study of tissues and fluids archived in patient sample registries, an absolute necessity for biomarker validation. While microfluidic methods are promising, significant innovation in streamlined sample preparation is still required. Though perhaps “unglamorous,” sample preparation poses an enormous challenge to producing reproducible biomarker validation data sets. Integration of preparatory functions enables dependable results as well as the multiplexed assays (whether by fluid type, patient identity, or biomarker) required in validation study design and implementation.
Critical technological hurdles associated with biomarker validation must be surmounted in new, more effective ways if we aim to seal the leaky pipeline between biomarker discovery and the use of protein biomarkers to improve the human condition. Lab-on-a-chip methods, technologies that seamlessly integrate sample preparation and analysis, offer compelling advantages for realizing a shift from curative medicine to an envisioned ideal of predictive, personalized, preemptive medicine.
Dr. Amy Herr
Department of Bioengineering