International Clinical Trials – Summer 2012

Strategic Sample Management

Graham Hughes reports on the Global Sample Management Benchmarking Symposium in Brussels – an event organised by BioStorage Technologies and supported by International Clinical Trials

In recent years the storage of biological samples and specimens obtained during clinical trials has become increasingly important. The Global Sample Management Benchmarking Symposium brought together a panel of experts to explore these themes in Brussels this May. I had the pleasure of moderating the event, which was organised by industry leader BioStorage Technologies. The speaker panel consisted of: Alexander Roussanov, Associate, Hogan Lovells International LLP, Brussels; Agnes Rivaille, Scientific Affairs Director – Europe, PRA International; Hyung Park, Director Translational Development Operations, Celgene; and Steve Sweeney, Head of Clinical Technologies, Infinity Pharmaceuticals, Inc.

Provisions of the Current European Clinical Trials Directive

Alexander Roussanov explained that the intention of the Clinical Trials Directive is to protect patients, ensure that clinical trials are conducted in accordance with sound ethical principles and, of course, to maintain the reliability and robustness of the data generated, which can eventually lead to the marketing authorisation of medicinal products. A further objective is to simplify and harmonise the national administrative provisions which govern clinical trials. Unfortunately, as the commissioner has recently pointed out, there is a lack of harmonisation within Europe. Currently there is no centralised assessment of clinical trials, and every member state in the EU has set up its own administrative procedures to govern their conduct. At present, each trial requires a positive opinion from a competent ethics committee as well as an authorisation from its competent authority. Perhaps surprisingly, the EMA itself is not involved in the assessment of clinical trials, apart from offering scientific advice to applicant companies.

In the EU, regulations are implemented in various ways by member states, which potentially leads to inconsistent and diverging interpretations and occasionally opposing decisions. A further complication is data protection legislation, which again is interpreted in various ways by different countries.

The European Commission has prepared a strategic paper and proposed several options for addressing the issue. According to the commission, the new legislation will take the form of a regulation rather than a directive, making it directly binding on member states. The proposal is expected to attempt to increase the number of clinical trials conducted in Europe by harmonising the EU rules on the conduct of clinical trials and reducing the administrative burden on companies. The favoured option at present appears to be a Coordinated Assessment Procedure by the competent authorities of the member states. It seems unlikely that a single ethics committee approval will be possible or required. Further complications may lie in the areas of patient insurance, liability rules and data protection requirements. The shape of the new legislation is, unfortunately, not yet clear, and we will have to wait until July for the commission to publish its proposed rules, which will then be adopted over the following years.

Prospective and Retrospective Sample Management

Key to the conduct of clinical trials are the trial sites themselves. Currently sites are expected to provide robust and accurate data and to comply with current Good Clinical Practice and other legislation which governs the conduct of trials. Agnes Rivaille concentrated on so-called minimal risk studies and, in particular, looked at the role of biomarkers in such studies. It is important to decide at the outset whether the study’s objective is to discover a biomarker, or to validate a biomarker that has been predefined. This approach defines which specimens will be taken and how they are managed. Thus it is important to define at the start what analysis will be performed, whether the samples are going to be stored and what kind of storage will be required.

Rivaille went on to explain that it is important to evaluate the experience of the site, in particular whether it has conducted similar studies before. Companies should consider not only how sites can comply with legislation, but also how they manage their patients. A clear and concise explanation of what the patients will be expected to do, and how their specimens will be managed, is critical. One issue of particular relevance is that sites must be aware of the sample management plan, and this has implications in terms of site training. The plan will also consider what the key milestones of the study are and whether interim analysis will affect the conduct of the study. Good communication from clinical trial management to sites can reduce the frequency of on-site visits. Optimising the quality of the samples is more important than maximising their quantity.

Rivaille concluded that for biomarker studies, flexibility is needed in terms of implementation and execution. The management of samples is a continuous evolution from the beginning to the end of the study. However, even if you try to anticipate problems and mitigate risk, some steps in the process will go wrong and will need to be revisited in order to reorganise the conduct of the study.

Advancement of Personalised Medicine

Personalised medicine is a form of medicine that uses information about a person’s genes, proteins and cellular environment to prevent, diagnose and treat diseases. All of these factors require biomarkers, which provide a key to delivering the right treatment to the right patient at the right time. As a result, biomarkers can be used to determine the optimal dosage regimen, identify subjects most likely to respond to a drug, detect alterations in disease processes and, most often in cancer, identify patients likely to experience adverse events.

Hyung Park explained that one of the challenges we face in delivering personalised medicine is obtaining a sufficient quantity of samples of sufficient quality so that we can measure what we need to and then interpret the results. In order to do this, a standardised method of collection, processing and storage throughout the study is needed so that samples can be tested in a robust and consistent manner. One of the complications in this area is the transportation of samples over national borders, in particular from Europe to the US. The translation of a laboratory-developed biomarker, and the validation of diagnostics as a true surrogate marker of a disease or response to treatment, has become increasingly challenging and is extremely expensive.

Park stated that a comprehensive sample management plan is imperative. One of the difficulties here is preanalytical variability, which can arise from specimen collection, processing and storage, and which must be minimised. The stability of samples needs to be determined; once a sample has been shown to be unstable, little confidence can be placed in any result derived from it. Thus, the quality of the sample begins at the investigative site. Training is one of the most important tools at a company's disposal for ensuring that sites collect samples in an appropriate manner and process them correctly. Likewise, the sample repository must be meticulously organised for the reception and redistribution of specimens.

Park concluded that having a quality sample begins at the investigative site and the preparation that goes into ensuring that the sites deliver quality samples. From then onwards, with an adequate sample repository of your own or an outsourced repository in place, it is possible to store biospecimens with the quality necessary to deliver meaningful personalised medicine that ultimately benefits the patient.

Using Technology to Improve Sample Asset Utilisation and Research Results

Steve Sweeney recounted the process of setting up an on-site sample biorepository at Infinity Pharmaceuticals: why they did it, how they did it, and how technology has been used at this facility to support improved research and development of new drug candidates. As an emerging small-molecule company, Infinity decided to keep its science in-house, building a translational medicine group and a molecular pathology group to help discover the next successful oncology therapy, the 'needle in the haystack'. To manage the samples generated by these research groups, Infinity decided to build its own biorepository.

This decision, Sweeney continued, was made because Infinity had regulatory obligations, chain of custody concerns and sample quality issues. It is important to note that samples beget samples: extracting an aliquot or cutting paraffin blocks creates new specimens to track. Thus, maintaining the highest quality and integrity of samples was a key objective. Since the company was about to launch a Phase 3 study, a specimen repository had to be set up very quickly, and since the company wanted to focus on its core research competency, it sought a partner whose competency was in sample management. The company selected BioStorage Technologies and set about in-sourcing that company's technology and standard operating procedures (SOPs) to help set up and manage its internal repository.

Sweeney explained that Infinity's in-house sample management system is run through web services. The company now has a system to identify where all its samples are and to manage requests: finding the samples, fulfilling the requests, and tracking the experiments that have been performed and the data they generate. Infinity has had to put in place a whole series of quality control checks, part of which involves generating sample labels for secondary samples derived from patient specimens. Use of the web services to manage the sample repository has thus proved crucial: it has improved quality and has replaced large volumes of paper.
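Sweeney did not describe the underlying data model, but the core ideas he outlined (unique labels, location tracking, and parent links so that secondary samples can be traced back to the original patient specimen) can be sketched as a minimal, hypothetical registry. All class names, labels and the labelling scheme below are illustrative assumptions, not a description of Infinity's or BioStorage's actual system:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sample:
    """A stored specimen; derived (secondary) samples keep a link to their parent."""
    label: str
    location: str                  # e.g. freezer/rack/box position (hypothetical format)
    parent: Optional[str] = None   # label of the source specimen, if any

class SampleRegistry:
    """Minimal in-memory registry: register samples, derive aliquots, look up location."""

    def __init__(self):
        self._samples: dict[str, Sample] = {}
        self._derived_count: dict[str, int] = {}

    def register(self, label: str, location: str,
                 parent: Optional[str] = None) -> Sample:
        s = Sample(label, location, parent)
        self._samples[label] = s
        return s

    def derive(self, parent_label: str, location: str) -> Sample:
        # Generate a label for a secondary sample (aliquot, block section, etc.)
        n = self._derived_count.get(parent_label, 0) + 1
        self._derived_count[parent_label] = n
        return self.register(f"{parent_label}-A{n}", location, parent=parent_label)

    def locate(self, label: str) -> str:
        return self._samples[label].location

    def lineage(self, label: str) -> list[str]:
        # Walk back to the original patient specimen (chain of custody)
        chain = [label]
        while self._samples[chain[-1]].parent:
            chain.append(self._samples[chain[-1]].parent)
        return chain
```

For example, registering a baseline specimen and then deriving an aliquot yields a child label such as `PT001-BL-A1`, and `lineage("PT001-BL-A1")` returns the chain back to `PT001-BL`, which is what a chain-of-custody audit needs to reconstruct.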

From regulations to samples, sites and technology, the EU symposium focused on the importance of sample integrity and management to supporting R&D.








©2000-2011 Samedan Ltd.