Biometrics Built Better

The Evolution of the Data Manager Role in Clinical Trials: Part 2 – Inter-Trial Relationships, Growing Complexity and Anticipating the Future


In the second of this two-part series, we explore the relationship between data managers and statisticians in clinical trials, the role data managers play in implementing novel trial designs, and how they maintain quality through good clinical data management practices as technology and trial designs become more complex. Jorge Torres Borrero, Senior Clinical Data Manager, shares his insights.

 

What is the relationship between a data manager and statisticians during a clinical trial?

Data managers and statisticians work side by side during a clinical trial, collaborating to ensure that the data generated by the trial is complete and accurate. Their specific roles can be broken down as follows:

The data manager is responsible for overseeing the collection, storage and quality of the data collected during the trial. This means ensuring that the data is recorded accurately, organized in an appropriate way and securely stored in line with any regulatory requirements. Data managers draw on their experience to work closely with study coordinators to determine the best methods for optimizing the data collection process.

Meanwhile, statisticians are responsible for analyzing the collected data to draw valuable insights that can be used to support decision making. They work closely with data managers to ensure the quality of the data that they use.

It’s important to note that there can be some crossover. As Jorge states, “I love that I’m a resource on data analytics for my studies. Programming these reports and graphs and helping the study team to understand where our data trends are going is a highlight of my job.”

 

What role do data managers play in implementing novel trial designs?

As our understanding of human biology and disease development continues to grow, so too does the complexity of trial designs. Novel trial designs are approaches that deviate from traditional randomized controlled trials to improve the feasibility, efficiency and cost-effectiveness of clinical research.

Data managers play an important role in implementing novel trial designs by ensuring highly accurate and efficient data management throughout the trial lifecycle. They do this through:

  • Designing and implementing data collection tools
  • Developing data management plans
  • Maintaining quality standards to accommodate the unique needs of these designs
  • Adapting Case Report Forms (CRFs)
  • Selecting and implementing Electronic Data Capture (EDC) systems that can handle complex data structures and real-time data capture

 

How do you maintain quality through good clinical data management practices as technology and trial designs become increasingly complex?

Jorge tells us, “In the past, I’ve been fortunate to work in the Clinical Data Interchange Standards Consortium (CDISC) standards groups for Clinical Data Acquisition Standards Harmonization (CDASH), so I’m very familiar with the standards requirements and their development. Not to mention the fact that these are always front of mind at Veramed when we’re conducting work, as we want to ensure that we’re attuned to the regulatory organizations and the standards they require.”

Whenever significant changes are made to the standards, both the biostatistics and data management teams are informed and given any necessary training, ensuring that they are aware of the changes and of how they will impact submission requirements.

 

How do you maintain data standards, particularly in highly complex and evolving trials?

As a data manager, staying up to date with any changes to the Good Clinical Data Management Practices (GCDMP) is extremely important. Today, the Society for Clinical Data Management (SCDM) updates the GCDMP one chapter at a time, making it much more manageable for organizations to ensure that they remain compliant. Jorge advises any data manager to “be involved in at least one working group within either SCDM, CDISC, CDASH or any other organization that is collaborating with other industry representatives”, as this provides a much broader view of how others approach data collection, along with invaluable opportunities for learning from one another.

Jorge also stresses the importance of managing data proactively, not only by reviewing patient profiles one by one, which is customary, but by developing a much more holistic approach to tracking and identifying risks in both the data and its quality. He says, “I think programming the data cleaning reports is the ideal practice to be able to quickly review for outliers, for example, using trending graphics on vital signs. Another key practice for me is to collaborate with the study team to have regular data reviews. This can be difficult as we are all much more global in our practices and are managing different priorities, but if we are continually reviewing the data as a study team to include clinical operations, medical monitors, safety monitors, biostatistics etc., we ensure that data management is keyed into the potential problems that different stakeholders might see. We learn from this what might be important for each group, and we can adapt our strategies and trend reports to ensure we capture those potential risks. In this way, we can ensure the entire team’s opinion is included in any risk identifiers we monitor to ensure data quality.”
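As an illustration of the kind of programmed data-cleaning report Jorge describes, the sketch below flags vital-sign outliers in a small, made-up dataset. The column names (`SUBJID`, `VISIT`, `SYSBP`) and the clinical limits are hypothetical, not taken from any specific study or standard.

```python
import pandas as pd

def flag_vital_sign_outliers(df, value_col="SYSBP", lo=90, hi=180, z_thresh=3.0):
    """Flag values outside clinical limits, or more than z_thresh SDs from the mean."""
    mean, sd = df[value_col].mean(), df[value_col].std()
    out = df.copy()
    out["out_of_range"] = ~df[value_col].between(lo, hi)
    out["statistical_outlier"] = ((df[value_col] - mean).abs() / sd) > z_thresh
    out["flagged"] = out["out_of_range"] | out["statistical_outlier"]
    return out

# Hypothetical EDC export: one systolic blood pressure reading per subject per visit
vitals = pd.DataFrame({
    "SUBJID": ["001", "001", "002", "002", "003"],
    "VISIT":  ["Baseline", "Week 4", "Baseline", "Week 4", "Baseline"],
    "SYSBP":  [118, 122, 210, 125, 119],  # 210 mmHg should raise a query
})
report = flag_vital_sign_outliers(vitals)
print(report[report["flagged"]][["SUBJID", "VISIT", "SYSBP"]])
```

In practice a report like this would run against the EDC export on a schedule, and the trending graphics Jorge mentions would be built on top of the same flags.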

 

What does the future look like for data managers?

Anyone attending SCDM or other data management conferences may have heard about clinical data science being the next logical step in the evolution of the clinical data manager role. This would involve performing data management tasks as well as contributing as a programmer and study designer. However, it requires more knowledge of programming standards and languages, as well as of the principles of biostatistics and trial design. Jorge believes this presents a challenge. “It relates to other traditional study roles and how our industry relates to Clinical Research Organizations (CROs). Often CROs are brought on board once a study protocol has already been developed, meaning it may be difficult to contribute to study design and protocol development. Nevertheless, I currently implement programming knowledge and skills in my role – which is something unique to Veramed.”

Jorge advocates this approach, stating that it increases efficiency by reducing duplicative CRF and edit-check specifications, which are sometimes produced in triplicate under the traditional sponsor-CRO-tech vendor infrastructure. Instead, Veramed creates them once in a development environment for client approval, shortening the EDC development timeline.

Jorge also believes that we can expect to see the integration of machine learning into the daily practices of a data manager over the next few years. “I think that data managers will need to learn how to leverage this new technology as we have with EDC adoption, while also remembering that machine learning is influenced by the learning models and the data that is used.”

Machine learning will certainly contribute to how data management performs its tasks, most likely in identifying patterns and outliers, work that is today mostly carried out manually. Jorge continues, “I think the tipping point will depend on current data management directors investing in their staff to ensure they understand programming, and in providing the tools for their staff to proactively implement emerging technologies so they can be effectively leveraged to elevate outcomes.”
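To make the idea concrete, here is one way such automated pattern-and-outlier detection might look. The data is simulated, and the model choice (scikit-learn's `IsolationForest`) is purely illustrative, not a method named in the interview.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Simulated vital signs: heart rate and systolic BP for 200 visits,
# plus three physiologically implausible entries (e.g. transcription errors)
normal = np.column_stack([rng.normal(72, 8, 200), rng.normal(120, 12, 200)])
errors = np.array([[310, 120], [70, 400], [5, 115]])  # hypothetical data-entry errors
visits = np.vstack([normal, errors])

# Train an isolation forest and mark the most anomalous records for manual review
model = IsolationForest(contamination=0.05, random_state=0)
labels = model.fit_predict(visits)  # -1 marks likely outliers
flagged = visits[labels == -1]
```

A model like this does not replace review; it prioritizes which records a data manager looks at first, which echoes Jorge's point that the technology must be understood and steered by the people using it.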

At Veramed, we provide comprehensive, end-to-end data management support for your clinical trials, managing your data flow to optimize the speed and success of your study.

 

Find out more about our expert clinical data management services.

Read Part 1 of this blog

Contact us to see how we can help you. 

 
