Handling CRF design changes in a live environment
by Nieke de Ruiter & André Snelting (both OCS)
Building an EDC database is based on the study protocol, which should contain all the information required for the eCRF. Precision, consistency, and completeness are therefore essential, yet in many situations these criteria are not met. In other cases the protocol demands a high level of complexity from the EDC design: study procedures may be country- or site-specific, for example, or a combination of paper and electronic diaries may be used. These aspects and challenges should be taken into account when reviewing the protocol. Even when the protocol meets these criteria and is not overly complex, change requests in the production phase cannot be avoided, and they are often time-consuming and error-prone. This paper provides a number of examples of change requests to the eCRF and the database and describes how these were implemented. In addition, lessons learned about smart design changes are shared and discussed.
The use of SDTM standards for non-submission purposes
by Lieke Gijsbers (OCS) & Paul Vervuren (Nutricia Research B.V.)
A proprietary, CDASH/SDTM-hybrid data model to expedite clinical data review
by Lieke Gijsbers (OCS) & Tom Van Der Spiegel (Janssen Pharmaceutica N.V.)
In 2016 Janssen identified the need to expedite clinical data review. A proof of concept demonstrated the value of pursuing a new, proprietary data model for data review, serving as a single source of truth. The resulting Data Review Model (DRM) is strongly based on CDISC CDASH and SDTM. DRM provides full traceability and describes both clinical and operational (system) data consistently across studies. In the longer term, the pharmaceutical company plans to implement a metadata-driven environment, including data conversion from source data into DRM. In 2017, OCS Life Sciences and Janssen piloted DRM by implementing a mapping framework that supports both the documentation and execution of source-to-target data mappings. This paper describes how multiple trials were mapped to support the pilot phase of DRM, in order to learn, refine, and document the value of DRM before moving to a production implementation.
Work Less - Do More
by Kai Wanke & Sofia Vale (OCS)
As Define-XML contains many repetitive items, manual validation is not only a tedious and time-consuming task, but also prone to errors. This repetitive structure, however, allows much of its content to be validated automatically. We recommend automating the validation in addition to the generation, for a number of reasons, including:
1) Define-XML should ideally be created before the SDTM or ADaM data is available; 2) manual enrichments may have been made which (inadvertently or not) affect the metadata; and 3) the data may be updated while a new Define-XML cannot simply be regenerated, as regeneration would undo the manual enrichments.
This whitepaper will describe how a significant part of the validation of Define-XML can be automated using SAS scripts, independent of the software or method used to generate the Define-XML, to reduce the workload of the programmer while guaranteeing a high degree of accuracy.
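To illustrate the kind of check that lends itself to automation, the sketch below cross-validates the variable names declared in a Define-XML file against the columns actually present in the datasets. This is a minimal, hypothetical example in Python rather than the SAS scripts the paper describes; the simplified Define-XML fragment, the `IT.<dataset>.<variable>` OID convention, and the function names are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: one automatable Define-XML check, namely comparing the
# variables declared per dataset against the columns found in the data.
# Real Define-XML files carry far more metadata (labels, types, codelists).
import xml.etree.ElementTree as ET

# Tiny, simplified Define-XML fragment (illustrative only).
DEFINE_SNIPPET = """\
<ODM xmlns="http://www.cdisc.org/ns/odm/v1.3">
  <ItemGroupDef OID="IG.DM" Name="DM">
    <ItemRef ItemOID="IT.DM.STUDYID"/>
    <ItemRef ItemOID="IT.DM.USUBJID"/>
    <ItemRef ItemOID="IT.DM.AGE"/>
  </ItemGroupDef>
</ODM>
"""

NS = {"odm": "http://www.cdisc.org/ns/odm/v1.3"}

def declared_variables(define_xml):
    """Return {dataset name: declared variable names} from Define-XML."""
    root = ET.fromstring(define_xml)
    result = {}
    for igd in root.findall("odm:ItemGroupDef", NS):
        # Assumes the IT.<dataset>.<variable> OID convention used above.
        result[igd.get("Name")] = {
            ref.get("ItemOID").split(".")[-1]
            for ref in igd.findall("odm:ItemRef", NS)
        }
    return result

def check_against_data(define_xml, data_columns):
    """List variables declared but absent from the data, and vice versa."""
    findings = []
    for ds, declared in declared_variables(define_xml).items():
        actual = data_columns.get(ds, set())
        for v in sorted(declared - actual):
            findings.append(f"{ds}.{v}: in Define-XML but not in data")
        for v in sorted(actual - declared):
            findings.append(f"{ds}.{v}: in data but not in Define-XML")
    return findings

# Example run: AGE is declared but missing; SEX is present but undeclared.
columns = {"DM": {"STUDYID", "USUBJID", "SEX"}}
for finding in check_against_data(DEFINE_SNIPPET, columns):
    print(finding)
```

Because the check reads only the Define-XML file and the dataset metadata, it stays independent of whichever tool generated the Define-XML, which is the property the paper emphasises.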