European Biopharmaceutical Review, Autumn 2012

A Store Point

Biosamples are vital to the growing field of biomarker discovery and personalised medicine, and good storage practices are essential to ensure the integrity of the sample is maintained for current and future research.

Recent developments in the fields of molecular biology, genetics and pathology present unprecedented research opportunities for scientists to apply biomarkers to better understand the origin and diagnosis of diseases and their subsequent prevention and treatments. Much of this progress has been propelled by access to high-quality biospecimen samples and advancements in sample preparation and bioprocessing techniques that support pharmacogenomics and biomarker research. Given the intrinsic value of these materials, high standards for sample management – from the point of collection, and throughout the complete sample processing and/or storage life cycle, to destruction – have become critical from research efforts through to prospective and retrospective biomarker analyses.

Sample Registration and Accessioning

The first step in assuring high-quality samples is at the point of collection. Special care should be taken to ensure that samples are collected in a manner consistent with their anticipated downstream use, and that the necessary associated data are collected and maintained with the sample throughout its life cycle. Data management begins with sample collection and proceeds through processing and accessioning. Irrespective of the sample source and the processing steps and derivatives generated, the unique information associated with each and every sample must be defined at the time of receipt in the repository and maintained throughout the lifetime of the sample. When critical information about a sample is missing, incomplete or inaccurate, the result can be costly and time-consuming study-specific errors. To this end, researchers should employ an informatics system that integrates sample pre-registration data with the cataloguing of qualitative and quantitative information at the time of accessioning, together with defined standard operating procedures (SOPs) for handling sample discrepancies, to ensure all required data elements are captured accurately and with fidelity. A key component of sample pre-registration is capturing the information that allows samples and quality controls to be reconciled, and that preserves the sample's value and future utility. This information can be captured electronically at the time of sample submission or uploaded at the time of accessioning, but it requires foresight and diligence from the moment the sample is generated: information lost at this step can leave critical samples unavailable for downstream application and use.
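The data elements described above can be modelled as a simple accessioning record. The sketch below is purely illustrative: the field names and the discrepancy-flagging method are hypothetical, not a prescribed schema from any repository informatics system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SampleRecord:
    """Minimal illustration of data captured at accessioning (fields are hypothetical)."""
    sample_id: str          # unique identifier assigned at receipt
    sample_type: str        # e.g. "whole blood", "serum"
    collected_at: datetime  # collection timestamp reported by the site
    received_at: datetime   # accessioning timestamp at the repository
    volume_ml: float
    discrepancies: list = field(default_factory=list)

    def flag_discrepancy(self, note: str) -> None:
        """Record a discrepancy per SOP rather than silently correcting the data."""
        self.discrepancies.append((datetime.now(timezone.utc).isoformat(), note))

# Example: register a sample and flag an incomplete annotation for SOP follow-up
rec = SampleRecord("S-0001", "whole blood",
                   datetime(2012, 3, 1, 9, 30, tzinfo=timezone.utc),
                   datetime(2012, 3, 2, 14, 0, tzinfo=timezone.utc),
                   4.0)
rec.flag_discrepancy("collection time estimated by site")
print(len(rec.discrepancies))  # 1
```

Flagging, rather than overwriting, keeps the original submission data intact for the life cycle audit trail discussed later in the article.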

Sample accessioning is one of the most important steps in the sample life cycle. At this stage, a variety of information related to the sample is collected and annotated for life cycle maintenance, which can help reconcile potential collection and processing errors downstream. For example, digitally documenting the characteristics and necessary data elements of the primary sample (for example, a 4ml blood tube) in a database allows a researcher to reconcile the sample against its assignment during the pre-registration process. The relevant data elements depend upon the anticipated future use of the sample: the sample type and source, how the sample was collected, its availability for downstream analysis, and any notation on the state of the sample at receipt or during processing that may affect its downstream utility. Collecting and storing this information with each sample helps resolve any discrepancies that arise later. As such, a sample management technology that connects data elements on a sample from collection through to the downstream bioprocessing stage of the life cycle allows collected samples to be managed efficiently, and provides an avenue to transform samples into a renewable resource that remains viable for future research.
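The reconciliation step described above can be sketched as a comparison between the pre-registration manifest and what is actually accessioned. The data layout here is an assumption for illustration only; real systems use their own manifest formats and SOP-defined discrepancy codes.

```python
def reconcile(manifest: dict, received: dict) -> dict:
    """Compare pre-registration manifest entries against accessioned samples.

    Both arguments map sample_id -> attribute dict (illustrative structure).
    Returns a dict of discrepancies for SOP-driven follow-up.
    """
    issues = {}
    for sid, expected in manifest.items():
        if sid not in received:
            issues[sid] = ["not received"]
            continue
        diffs = [k for k, v in expected.items() if received[sid].get(k) != v]
        if diffs:
            issues[sid] = [f"mismatch: {k}" for k in diffs]
    for sid in received:          # samples that arrived without pre-registration
        if sid not in manifest:
            issues[sid] = ["not pre-registered"]
    return issues

manifest = {"S-0001": {"type": "serum", "volume_ml": 4.0},
            "S-0002": {"type": "plasma", "volume_ml": 2.0}}
received = {"S-0001": {"type": "serum", "volume_ml": 3.5}}
print(reconcile(manifest, received))
# {'S-0001': ['mismatch: volume_ml'], 'S-0002': ['not received']}
```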

Reducing Pre-Analytical Variables Through Sample Preparation

Once a sample is collected it may begin to take on new characteristics based on changes to the sample's environment; changes in exposure to certain nutritional, chemical, and/or other environmental factors that may occur during a surgical or collection procedure, or changes that may occur over time during storage (1). Pre-analytical variables introduced during clinical sample collection and processing can significantly affect the molecular integrity of specimens and bias the results of assays or biomarker studies (2). Therefore, an essential component of any sample management strategy requires standardised protocols for sample collection and preparation techniques. Pre-analytic variables may be divided into three general areas:
  • The physiology of the human research participant prior to sample collection
  • Sample collection practices
  • Sample handling practices subsequent to collection and prior to their inclusion in downstream testing

Sample Preparation Techniques and Best Practices

Sample preparation enables researchers to protect and ensure the quality of the sample prior to storage, and can include aliquoting (dividing into subsamples), nucleic acid extraction, purification and downstream processing. Aliquoting parent samples enables researchers to provide the exact required sample volumes to the research bench, which reduces the cost of shipping or storing extraneous sample volumes at testing labs and, most importantly, preserves the parent sample by reducing freeze-thaw processing and providing replicates that safeguard its fidelity. From an operational standpoint, an example of insufficient planning is the duplication of storage partners: it is not uncommon for a research company to pay multiple storage fees for the same sample stored at different testing locations. Centralising and consolidating sample storage with a single biorepository, with volumetric aliquoting of parent samples to meet specific testing volume requirements and provide duplicate sample sets as required, offers organisations significant cost savings and is an example of cost-effective best practice for sample storage.

When possible, researchers should aliquot once, as early as visibility and planning allow, into as many small, controlled-volume vessels as is reasonable. Annotating a single freeze-thaw cycle is especially important for submission documentation when testing is conducted well after collection, or as an adjunct test supporting other findings. Providing aliquots from the parent sample also creates back-up samples for confirmation testing, or to verify the integrity of analytical data if required.
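The aliquot-once approach can be expressed as a small planning calculation. The reserve volume and vessel sizes below are illustrative assumptions; in practice these values come from study-specific SOPs.

```python
def plan_aliquots(parent_volume_ml: float, aliquot_ml: float, reserve_ml: float = 0.5):
    """Split a parent sample into equal aliquots in a single handling step.

    reserve_ml keeps a remainder with the parent for confirmation testing.
    Returns (number of aliquots, volume remaining with the parent).
    """
    if aliquot_ml <= 0 or parent_volume_ml <= reserve_ml:
        raise ValueError("insufficient volume to aliquot")
    usable = parent_volume_ml - reserve_ml
    count = int(usable // aliquot_ml)
    remainder = round(parent_volume_ml - count * aliquot_ml, 3)
    # Each aliquot produced in this single step carries freeze_thaw_count = 1,
    # the annotation the text recommends documenting for late or adjunct testing.
    return count, remainder

print(plan_aliquots(4.0, 0.5))  # (7, 0.5)
```

A 4ml parent tube, keeping a 0.5ml reserve, yields seven 0.5ml aliquots for distribution, so each testing lab receives only the volume it needs.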

The method used for the mechanisation of sample processing is dictated largely by two factors: the complexity of the task and the volume of work to be performed. Manual techniques are suitable for low-volume, low-complexity tasks. Variable mechanisation, where the procedure steps are programmed at the beginning of each batch, is cost-effective for medium workloads with low to high complexity of sample preparation (3). Fixed-programme analysers and dedicated instruments are effective when handling high sample volumes across a range of processes with varying complexities. Outsourcing this aspect of sample processing to a storage partner reduces the cost and need for complex and expensive automation technologies, and ensures the data elements associated with each processed sample are maintained, alongside quality-controlled sample processing. Furthermore, using the sample storage partner for sample processing provides a cohesive, one-stop-shop approach and eliminates the need for sample shipments and duplicate vendor negotiations and partnerships.
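The workload-versus-complexity rule of thumb above can be sketched as a simple decision function. The numeric thresholds are invented for illustration; the article gives only the qualitative low/medium/high guidance.

```python
def processing_mode(samples_per_day: int, complexity: str) -> str:
    """Map workload and task complexity to a processing approach,
    following the qualitative rule in the text. Thresholds are
    illustrative assumptions, not published cut-offs."""
    if samples_per_day < 50 and complexity == "low":
        return "manual"
    if samples_per_day < 500:
        return "variable mechanisation (batch-programmed)"
    return "fixed-programme analyser / dedicated instrument"

print(processing_mode(20, "low"))     # manual
print(processing_mode(200, "high"))   # variable mechanisation (batch-programmed)
print(processing_mode(2000, "high"))  # fixed-programme analyser / dedicated instrument
```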

Considerations for Long-Term Sample Storage

Sample preservation is essential for both prospective and retrospective analyses. A sample that has maintained the appropriate storage temperature will yield better results than a sample that has undergone fluctuations in temperature due to poor handling or storage practices. The ability to access sample storage history that tracks sample temperature maintenance is important to provide confidence for pivotal clinical data (4). Organisations can both protect their financial investment and streamline their research with a strategic sample management plan that takes into account best practices for temperature-controlled storage and logistics, regulatory guidelines and audit trails tracking chain of custody of the sample.

The US Food and Drug Administration (FDA), the US Centers for Disease Control and Prevention (CDC), and professional organisations such as the International Society for Biological and Environmental Repositories provide guidelines for biorepositories. Examples of good storage practices (GSP) include:

  • Secure facilities and robust quality assurance measures to ensure specimens are stored in compliant conditions at all times
  • Qualified staff that have been trained, for example, in global sample transportation procedures, including regulatory and customs issues
  • Temperature monitoring of samples around the clock with a comprehensive audit trail and automatic notification system
  • Business continuity plans, back-up power, and redundant systems to protect sample integrity during emergencies
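The round-the-clock temperature monitoring in the list above amounts to flagging excursions from a defined band and feeding them to an audit trail and notification system. The -20°C ± 5°C band and the reading format below are illustrative assumptions.

```python
def check_readings(readings, low_c=-25.0, high_c=-15.0):
    """Flag temperature excursions from a series of (timestamp, °C) readings.

    In a repository, every excursion would trigger an automatic
    notification and be written to the sample's audit trail; here we
    simply return the out-of-band readings.
    """
    return [(ts, t) for ts, t in readings if not (low_c <= t <= high_c)]

readings = [("2012-09-01T00:00Z", -20.1),
            ("2012-09-01T01:00Z", -19.8),
            ("2012-09-01T02:00Z", -12.4)]  # e.g. freezer door left open
print(check_readings(readings))  # [('2012-09-01T02:00Z', -12.4)]
```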

Sample Bioprocessing

Sample bioprocessing requires careful coordination and planning to reduce pre-analytical variability that can degrade sample integrity. Accordingly, once collection and storage parameters have been determined, downstream bioprocessing protocols should be explicit about their processing requirements, including time to processing, temperature maintenance at all times (storage, aliquoting, processing), centrifugation time, and the manner in which the specimen should be processed and distributed. Many sites meet this challenge by developing SOPs, specific to each trial or intended use of the sample, that guide laboratory technicians through the processing requirements. Certain samples must be processed immediately or within a finite window, whereas others (such as DNA) tolerate a longer storage time before processing is required. Bioprocessing techniques are also specific to the sample type and volume, and care should be taken with complex sample types and potential extractions to ensure the appropriate technology is applied for maximal extraction and maintenance of the target analyte. As appropriate for the study, the time elapsed between collection and processing should be recorded and tracked, and any samples processed further should retain the associated data elements collected and logged during accessioning, so that all required information about the newly created samples is maintained.
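The time-to-processing requirement above can be sketched as a check of elapsed time against a per-sample-type limit. The sample types and limits in the table are hypothetical examples; real limits come from the study-specific SOP.

```python
from datetime import datetime, timedelta, timezone

# Illustrative time-to-processing limits by sample type; actual limits
# are defined by each trial's SOP, not by this table.
MAX_DELAY = {"PBMC": timedelta(hours=8),
             "serum": timedelta(hours=24),
             "DNA": timedelta(days=30)}

def within_processing_window(sample_type: str,
                             collected: datetime,
                             processed: datetime) -> bool:
    """Check whether the elapsed collection-to-processing time meets the limit."""
    return processed - collected <= MAX_DELAY[sample_type]

collected = datetime(2012, 9, 1, 9, 0, tzinfo=timezone.utc)
print(within_processing_window("PBMC", collected, collected + timedelta(hours=6)))    # True
print(within_processing_window("serum", collected, collected + timedelta(hours=30)))  # False
```

Recording the elapsed time alongside the pass/fail result, rather than the result alone, preserves the data element for later audit.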

Information Management

Information management is a critical aspect of maximising a sample’s utility for current and future research initiatives. For analysis to be meaningful and accurate, a repository’s informatics system should record and report processes through all stages of a sample’s shipping, processing and storage life cycle. The documentation must include an auditable record of who performed what work, at what time and on what date. The system should also identify subsamples of the stock and the exact locations of inventory. Because sample inventories can grow exponentially over time, systems should be scalable and robust enough to quickly locate a physical sample and retrieve its data for an approved user located anywhere in the world. Finally, the system should store temperature records automatically, without human intervention.

To prove that a sample has been maintained in adequate conditions, three sets of data are required: storage temperature(s), a life cycle audit trail of the sample from entry into the system through retrieval or disposal, and a complete chain of custody of actions taken on the sample during storage. An audit trail that complies with US FDA 21 CFR Part 11 requires documentation of the action taken on the sample along with the date, time and handler’s signature (5).
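A single chain-of-custody entry with the elements the text lists (action, date, time and handler) can be sketched as below. The dict layout is an illustrative assumption; full 21 CFR Part 11 compliance additionally requires secure, computer-generated, time-stamped records and controlled electronic signatures, which this sketch does not implement.

```python
from datetime import datetime, timezone

def audit_entry(sample_id: str, action: str, handler: str) -> dict:
    """Build one chain-of-custody record: the action taken on the sample,
    a UTC timestamp, and the handler's identification."""
    return {"sample_id": sample_id,
            "action": action,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "handler": handler}

# A sample's life cycle audit trail from receipt through retrieval
trail = [audit_entry("S-0001", "received", "jdoe"),
         audit_entry("S-0001", "moved to -80C freezer, rack A3", "jdoe"),
         audit_entry("S-0001", "retrieved for aliquoting", "asmith")]
print(len(trail))  # 3
```

An append-only list like `trail`, combined with the stored temperature records, supplies the three data sets the paragraph requires: temperatures, a life cycle audit trail, and a chain of custody.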

Regulatory Concerns

The FDA and international regulatory agencies that enforce good laboratory practices, good manufacturing practices, and good clinical practices have laws, regulations and guidance that affect various aspects of sample collection, processing, storage, transportation and electronic documentation. These laws and regulations include the aforementioned US FDA 21 CFR Part 11 and the Health Insurance Portability and Accountability Act (HIPAA). New regulations are certain to emerge in the future; they may restrict or prohibit the use of samples that have not been maintained under the most stringent conditions.


Samples are becoming an increasingly valuable asset for research and drug development, tied to the growing importance of biomarkers for disease identification and life cycle management, and within the drug development process for driving decisions during clinical development and downstream patient care. Sample annotation and maintenance of the necessary data elements are crucial to ensuring samples remain available for these important healthcare needs. The availability of high-quality samples is vital to the development of new drug therapies, and can help speed molecules to market while supporting reductions in overall clinical development costs. Analytical and pre-analytical variables introduced during sample collection and processing can contribute significantly to errors in testing if not mitigated upfront. As such, sample processing and preparation require careful coordination and planning to reduce the variability that can degrade sample integrity. Regardless of the sample type, samples should be handled and stored in accordance with good storage practices to ensure the integrity of the sample is maintained for current and future research.

  1. National Cancer Institute, Office of Biorepositories and Biospecimen Research, Technical and Operational Best Practices. Biospecimen Collection, Processing, Storage, Retrieval, and Dissemination, August 2012, visit http://
  2. Michael C, Biorepositories for Long-Term Preservation and Future Analysis, Journal of Clinical Research Best Practices 7(1): January 2011
  3. Geary TD, Sample Preparation and Processing, Journal of Automatic Chemistry, 13(2): pp57-58
  4. National Cancer Institute, 2011 Best Practices for Biospecimen Resources, visit practices/2011bp.asp
  5. US Food and Drug Administration, Electronic Records; Electronic Signatures – Scope and Application, visit RegulatoryInformation/Guidances/ucm125067.htm

Kristina Robson, PhD, is Senior Director of Comprehensive Solutions at BioStorage Technologies, a global provider of comprehensive sample management solutions for the bioscience industry, including storage, cold chain logistics and virtual sample intelligence. Kristina manages global policies and procedures for sample management operations and oversees strategic sample management service partnerships.