International Clinical Trials, Winter 2014

Level Best

Abraham Gutman, AG Mednet

Let us state the obvious: software automates tasks. In clinical trials, a fair amount of the available software automates manual processes. For example, electronic data capture (EDC) systems have eliminated the manual tasks of printing and sending case report forms to sites, mailing or faxing the completed forms back, and the double data entry needed to transcribe their content into a computer for further review. Furthermore, clinical trial management systems (CTMS) have enabled contract research organisations to automate and improve their internal processes, enhancing accuracy and accountability.

However, process automation can only go so far because these processes and methods reflect an organisation’s structure – and it is this structure that needs to change in order to reach a higher level of optimisation.

Process Re-Engineering

If software automates tasks, good software can often go further by enabling people to perform tasks they could not complete manually. In manufacturing, for example, a person can be trained to control a robotic assembly line designed for intricate soldering and fabrication, even if that person could not perform any of those tasks by hand. This type of automation provides leverage by creating higher-order activities and producing more consistent, higher-quality results.

The creation of these higher-order tasks and activities has required a complete rethinking of how organisations are structured and, within those structures, how worker activities are divided. Barriers have had to be shattered. People’s impressions of their work and the jobs they have to perform often conflict with their perception of position and title. When processes are re-engineered, individual tasks may shift from one position to another as new roles are defined and others are eliminated.

Lab Evolution

Today, for instance, imaging trial software enables trial coordinators to perform image quality checks and other self-service activities using automation tools, even with limited specialist imaging knowledge. By performing these self-service quality checks before imaging data is submitted, administrative and image acquisition errors can be detected and corrected, eliminating delays and reducing the number of queries.
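To make this concrete, here is a minimal sketch of the kind of automated pre-submission check such a tool might perform, written in Python against the open-source pydicom library. The protocol limits, field values and directory layout are hypothetical, illustrating the idea rather than any particular product:

```python
# Illustrative pre-submission quality check for one imaging series.
# Uses the open-source pydicom library; the protocol limits are hypothetical.
from pathlib import Path

import pydicom

PROTOCOL = {
    "Modality": "CT",          # required modality for this hypothetical trial
    "MaxSliceThickness": 5.0,  # mm; illustrative acquisition limit
}

def check_series(series_dir: Path, expected_subject_id: str) -> list[str]:
    """Return human-readable findings; an empty list means the series passes."""
    findings = []
    for dcm_path in sorted(series_dir.glob("*.dcm")):
        ds = pydicom.dcmread(dcm_path)
        if ds.get("Modality") != PROTOCOL["Modality"]:
            findings.append(f"{dcm_path.name}: modality {ds.get('Modality')} "
                            f"differs from protocol ({PROTOCOL['Modality']})")
        thickness = float(ds.get("SliceThickness", 0) or 0)
        if thickness > PROTOCOL["MaxSliceThickness"]:
            findings.append(f"{dcm_path.name}: slice thickness {thickness} mm "
                            f"exceeds the protocol limit")
        if ds.get("PatientID") != expected_subject_id:
            findings.append(f"{dcm_path.name}: PatientID does not match the "
                            f"enrolled subject (possible administrative error)")
    return findings
```

Run at the site before submission, a check like this surfaces a mismatched subject ID or an out-of-protocol acquisition while the patient visit is still fresh, rather than weeks later in the form of a query.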

In breaking these barriers, the first line of evolution is the core laboratory. This level of automation is upending processes and methods that have been in use for at least a couple of decades. Core lab infrastructure has been the beneficiary of process automation software, optimising most manual activities without really changing them. The integration of data receipt, CTMS and image storage has made these lower-order activities more efficient.

Nonetheless, although some simple quality control activities have benefited from software used by trained personnel, most of these jobs are still done by imaging specialists – part of a multi-layered operations organisation with various levels of project managers and project coordinators receiving data, detecting submission problems, and creating, forwarding and following up on data queries to sites worldwide.

Now that this software exists, these downstream organisations need to be redefined to take full advantage of it and provide our industry with the efficiencies we need. Once this is done, the industry can benefit from the new technologies that will make possible the type of analysis we have all been looking for.

Small Data to Big Data

When we detect and minimise acquisition errors early in the data generation process – when it is still ‘small data’ – the aggregate of tens or hundreds of thousands of data points is likely to yield interesting information. To extract information from ‘big data’ we need advanced analytical software enabling experts to manipulate, transform, combine and summarise patterns across vast and dissimilar datasets.
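As a minimal illustration of what ‘combine and summarise’ can mean in practice, the sketch below merges two hypothetical per-subject datasets with pandas and summarises one pattern across them; the file names and columns are invented for the example:

```python
# Minimal sketch: combining dissimilar per-subject datasets with pandas.
# File names and columns are hypothetical, for illustration only.
import pandas as pd

imaging = pd.read_csv("tumour_volumes.csv")  # subject_id, visit, volume_mm3
labs = pd.read_csv("lab_results.csv")        # subject_id, visit, biomarker

# Combine: align the two datasets on subject and visit.
merged = imaging.merge(labs, on=["subject_id", "visit"], how="inner")

# Summarise: per-visit correlation between tumour volume and the biomarker.
summary = merged.groupby("visit").apply(
    lambda g: g["volume_mm3"].corr(g["biomarker"]))
print(summary)
```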

This discovery process is, under the best of circumstances, opaque. When the ‘atomic’ elements can be checked for accuracy close to acquisition, the process gains transparency. When they are checked later, the process becomes obscure and the underlying procedural deficiencies become almost impossible to address. This fact seems obvious, but sometimes the obvious eludes us. The real issue is the objective of the task – and that objective changes when the role, responsibility and accountability of those performing the task change.

Software Development

In the re-engineered processing of imaging data of the near future, task allocation, roles, responsibilities and accountability are likely to focus on the optimisation of data quality and the enablement of advanced diagnostic biomedical informatics – allowing sponsors to reap much more information from the data collected. When precious funds can be directed toward interpretation and discovery, we will see new software developed that makes use of vast amounts of already existing subject data.

The development of this software will occur in stages. Since most data across individual trials was not acquired using consistent standards, one of the first steps will be to intelligently normalise what we have. Saying that we will use a standard going forward is necessary, but not sufficient, since billions of dollars have already been spent collecting data that still contains valuable information.
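At its simplest, such normalisation is a mapping from each legacy convention to one canonical schema. A minimal sketch in Python, assuming legacy records that stored the same measurement under different field names and units (all names are hypothetical):

```python
# Minimal sketch: normalising legacy records acquired under different
# conventions into one schema. Field names and units are hypothetical.

# Map from legacy field name to the canonical name, with a unit converter.
FIELD_MAP = {
    "lesion_size_cm": ("lesion_size_mm", lambda v: float(v) * 10.0),
    "lesion_size_mm": ("lesion_size_mm", float),
    "LesionDiameter": ("lesion_size_mm", float),  # legacy system stored mm
}

def normalise(record: dict) -> dict:
    """Rewrite one legacy record into the canonical schema."""
    out = {}
    for key, value in record.items():
        if key in FIELD_MAP:
            canonical, convert = FIELD_MAP[key]
            out[canonical] = convert(value)
        else:
            out[key] = value  # pass through fields we have no rule for
    return out

# Two records from different legacy trials, now directly comparable:
print(normalise({"subject_id": "A-001", "lesion_size_cm": "2.3"}))
print(normalise({"subject_id": "B-017", "LesionDiameter": "23"}))
```

The hard part in practice is building and validating the mapping table itself; once the rules are known, the transformation is mechanical.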

Future Tools

Perhaps the next step is collaboration software designed specifically for clinical research. The key will be to build a tool enabling dissimilar disciplinary interpretations of related data to inform one another. For example, genomics, proteomics and imaging experts looking at related datasets are living in a scientific Tower of Babel; their individual perspectives and views about common-origin data – the same patient or clinical trial – do not efficiently inform each other. However, clinical trial collaboration software will enable observations to be readily shared, and actively alert these experts to possible data relationships invisible to any of them individually.
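A minimal sketch of the alerting idea, assuming observations are captured as simple records tagged by subject and discipline (the data structures are hypothetical):

```python
# Minimal sketch of cross-discipline alerting: when experts from different
# fields record observations on the same subject, flag the overlap so their
# findings can inform one another. All records here are hypothetical.
from collections import defaultdict

observations = [
    {"subject_id": "S-042", "discipline": "imaging",    "note": "lesion growth at week 12"},
    {"subject_id": "S-042", "discipline": "genomics",   "note": "KRAS variant detected"},
    {"subject_id": "S-042", "discipline": "proteomics", "note": "elevated marker X"},
    {"subject_id": "S-077", "discipline": "imaging",    "note": "stable disease"},
]

by_subject = defaultdict(list)
for obs in observations:
    by_subject[obs["subject_id"]].append(obs)

for subject, obs_list in by_subject.items():
    disciplines = {o["discipline"] for o in obs_list}
    if len(disciplines) > 1:  # the same patient seen by more than one specialty
        print(f"Alert: {subject} has findings from {sorted(disciplines)} -- "
              f"consider a joint review")
```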

Another step must be software-assisted interpretation. Highly specialised versions of this have already proven quite useful in medicine. For example, computer-aided diagnosis is helping doctors and hospitals to detect breast cancer. In oncology research, as part of Response Evaluation Criteria In Solid Tumours (RECIST)-based protocols, some labs and companies are developing software capable of delineating, and thus sizing, the volumes of large tumours.
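The sizing step of such analysis reduces, at its simplest, to counting segmented voxels and multiplying by the volume of one voxel. A minimal sketch with NumPy, assuming the delineation step has already produced a binary mask and that the voxel spacing is known (the spacing values are illustrative):

```python
# Minimal sketch: tumour volume from a binary segmentation mask.
# Assumes delineation has already produced the mask; spacing is illustrative.
import numpy as np

def lesion_volume_mm3(mask: np.ndarray,
                      spacing_mm: tuple[float, float, float]) -> float:
    """Volume = number of segmented voxels x volume of one voxel."""
    voxel_volume = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return float(mask.sum()) * voxel_volume

# A toy 3D mask: a 10 x 10 x 10 block of 'tumour' voxels.
mask = np.zeros((64, 64, 64), dtype=bool)
mask[20:30, 20:30, 20:30] = True

print(lesion_volume_mm3(mask, spacing_mm=(0.8, 0.8, 2.5)))  # 1600.0 mm^3
```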

This is just the start. When volumetric analysis is combined with studies of mass and porosity, and then viewed through the prism of certain genetic configurations, we will be able to discover how specific therapies can help small sets of individuals, even if the larger clinical study appears to fail for a large portion of the population.
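The subgroup question this raises can be stated very simply: a therapy that appears to fail overall may still work for patients with a particular genetic configuration. A toy sketch with pandas, using invented data:

```python
# Minimal sketch: a therapy that 'fails' overall may still help a subgroup.
# All columns and values are invented, for illustration only.
import pandas as pd

df = pd.DataFrame({
    "subject_id": range(8),
    "genotype":   ["mut", "mut", "mut", "mut", "wt", "wt", "wt", "wt"],
    "responded":  [True, True, True, False, False, False, True, False],
})

print("Overall response rate:", df["responded"].mean())  # 0.5 -- unimpressive
print(df.groupby("genotype")["responded"].mean())        # mut 0.75, wt 0.25
```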

Quality at the Source

We live in an exciting time, and our industry is primed for change. Software advances have finally enabled data quality to be checked at the source – the investigator site – prior to submission. This includes not just textual data entered into EDC systems, but also the more complex medical images that are increasingly becoming an integral part of trials. With the ability to check the data automatically, we are freeing hospital personnel by reducing queries, thus enabling them to spend more time providing care to patients.

High-quality data enables a redistribution of resources to transform scientific and clinical R&D data into actionable, applicable insights, and offers the promise of re-using data previously collected. We will be able to develop collaboration tools that optimise communication among experts spanning multiple disciplines, thereby providing them with insights that are elusive to each one individually.

As these exchanges mature and software tools are refined, we will be able to develop better detection algorithms and programs which will, in turn, aid the discovery process. Advanced quality assurance at the source will lead to better detection at the destination, and more effective therapies for patients.


Abraham Gutman leads AG Mednet in its mission to improve, automate and expedite outcomes in clinical trials by ensuring quality and compliance within critical medical imaging processes. He founded the company in 2005. More than 25,000 registered users across thousands of investigator sites in 60 countries use AG Mednet to participate in projects sponsored by each of the world’s top 20 pharma, biotech and device companies. Abraham holds a BA in Computer Science from Cornell University and an MSc in Computer Science from Yale University.