CLINICAL DATA MANAGEMENT
Clinical data management (CDM) is a set of practices that handles information generated during medical research. It aims to ensure data quality, integrity, and compliance with internal protocols and applicable regulations.
The CDM process also helps keep key clinical trial stakeholders on the same page:
1. Sponsors — pharmaceutical companies, institutions, and other organizations that initiate, monitor, and finance the trial.
2. CROs (contract research organizations) — organizations contracted by a sponsor to take the lead in managing that sponsor's trials and complex medical testing responsibilities.
Typical CDM activities and the roles responsible for them:
CRF design - Data manager
Database design - Database designer
Data capture - Data entry associate
Data validation - Medical coder
Database lock - Quality control associate
Protocol review: the review is done by the company team, including the data manager along with other team members.
Protocol amendments: the procedure for making changes to the protocol document.
Data management plan:
A DMP should be developed for each study, early during study setup.
A data management plan or DMP is a document detailing all procedures, tasks, milestones, and deliverables throughout the CDM lifecycle. It gives a roadmap on how to work with information and handle possible risks. Another important function is to clearly communicate what happens in the course of the trial to each stakeholder.
Selected sections from a typical DMP table of contents:
1.1 Purpose of the Data Management Plan (DMP)
4. Core/primary project team members
6. Clinical database used for the project
10. Clinical Database Design and Testing
10.1 Clinical Database Specifications
10.2 Clinical Database Development
10.3 User Acceptance Testing (UAT)
19.1 Data Cleaning Requirements
19.2 Post-Analysis Unfreezing/Unlocking of Data
20. Quality Control and Database Lock
21. Project Disposition and Archival
The DMP typically describes the following aspects:
data to be gathered from trial participants,
existing data that can be integrated (how to combine or link parts of the data),
data formats,
metadata and its standards,
storage and backup methods,
security measures to protect confidential information,
data quality procedures,
responsibility assignments across team members,
access and sharing mechanisms and limitations,
long-term archiving and preservation procedures,
the cost of data preparation and archiving, and
compliance with relevant regulations and requirements.
In brief, the core DMP components are:
description of the data to be collected,
data storage and backup,
data sharing and dissemination,
data security and confidentiality, and
data preservation and archiving.
The DMP must be ready at the trial design stage, before the first participant is enrolled. This ensures that data will be collected in the correct format and properly organized. However, the plan is not immutable: it has to be updated throughout the trial, capturing any changes that influence data management.
STUDY START UP/STUDY SET UP:
eCRF (electronic case report form):
The case report form is a printed or electronic questionnaire for collecting data from study participants and reporting it to trial sponsors. The document is created specifically for each research project in accordance with
the trial protocol, and
recommendations of the Clinical Data Acquisition Standards Harmonization (CDASH) standard, developed by the Clinical Data Interchange Standards Consortium (CDISC) to streamline industry-wide data exchange. For example, CDASH dictates the dd/mm/yy format for collecting dates.
Well-designed case report forms collect only the data necessary for the particular study, avoiding any redundancy. The fields to be filled in may include
demographics (age, gender),
basic measurements (height, weight),
vital signs (blood pressure, temperature, etc.) captured at various time points,
lab exams,
medical history,
adverse events, and
more, based on the research requirements.
Data managers create data entry screens and eCRF layouts in collaboration with a database programmer. The design usually goes through several review cycles before finalization.
Typical eCRF design workflow:
CRF PDF
Compare the CRF PDF with the protocol
Prepare the eCRF specification
Explain the eCRF spec to the programmer
Draft eCRF: review, updates, and changes
Final eCRF
eCRF built in the EDC (electronic data capture) system
Compare the eCRF spec with the EDC build (an illustrative specification snippet follows below)
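To make the eCRF specification step concrete, here is a minimal sketch in Python. It is purely illustrative: the form names, variable names, and codelist are hypothetical, not taken from any real study, but they show the kind of metadata each field carries before it goes to the database programmer.

```python
# Illustrative eCRF specification rows; forms, variables, and the codelist
# below are hypothetical examples, not a real study's specification.
ecrf_spec = [
    {"form": "DM", "variable": "BRTHDAT", "label": "Date of Birth",
     "type": "date", "format": "dd/mm/yy", "codelist": None, "required": True},
    {"form": "VS", "variable": "SYSBP", "label": "Systolic Blood Pressure",
     "type": "integer", "units": "mmHg", "codelist": None, "required": True},
    {"form": "AE", "variable": "AESEV", "label": "Severity",
     "type": "text", "codelist": ["MILD", "MODERATE", "SEVERE"], "required": True},
]

for field in ecrf_spec:
    print(f'{field["form"]}.{field["variable"]}: {field["label"]} ({field["type"]})')
```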
EDIT CHECK DOCUMENT:
The edit check document (edit check specification) defines the validation rules applied to entered data. It covers:
Manual queries and automated queries
Dynamics (forms or fields that appear depending on entered data)
How a query is populated in the EDC
Provide the edit check specification to the programmer in parallel with the eCRF specification.
Compare the edit check specification with the EDC build (an example of how a rule translates into an automated check is sketched below).
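As a hedged illustration of how one row of an edit check specification becomes an automated check, here is a minimal Python sketch for a hypothetical range rule (systolic blood pressure expected between 80 and 200 mmHg). The field, limits, and query text are assumptions for illustration only.

```python
# A hypothetical range edit check: returns query text when the value is
# out of range (or missing), and None when no query should be populated.
def check_sysbp(value, low=80, high=200):
    if value is None:
        return "Systolic BP is missing. Please enter or confirm."
    if not (low <= value <= high):
        return (f"Systolic BP {value} is outside the expected range "
                f"{low}-{high} mmHg. Please verify.")
    return None  # value in range: no query

print(check_sysbp(250))  # out of range -> query text is returned
print(check_sysbp(120))  # in range -> None (no query)
```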
UAT (user acceptance testing):
Role testing: testing each team member's role and access rights
Screen testing
Structural functionality
Edit checks
Any data that falls outside the expected range should populate a query, and anything that falls within the range should not populate a query (see the UAT sketch after this list).
Lab values: entered in UAT
LNR (lower normal range) template flow: Sponsor DM → CRO DM → LNR template → Lead CRA (site) fills the template → CRO CM → EDC → manual and automated checks
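During UAT these rules are exercised with deliberately good and bad values. Below is a minimal sketch, reusing the hypothetical check_sysbp function from the edit check example above, of the kind of test cases a tester might run; the expected outcomes are assumptions for illustration.

```python
# UAT-style test cases: out-of-range values must populate a query,
# in-range values must not (uses check_sysbp from the sketch above).
uat_cases = [
    {"entered": 250, "expect_query": True},   # above upper limit
    {"entered": 79,  "expect_query": True},   # below lower limit
    {"entered": 120, "expect_query": False},  # within normal range
]

for case in uat_cases:
    fired = check_sysbp(case["entered"]) is not None
    result = "PASS" if fired == case["expect_query"] else "FAIL"
    print(f'value={case["entered"]:>3}  query fired={fired}  {result}')
```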
eCRF completion guidelines
STUDY CONDUCT:
The patient visits the site, assessments are performed, and the data is entered into the EDC.
The company project manager or clinical operations manager sends a shipping manifest to the lab (vendor); the lab enters data in the acquisition form.
The lab sends test tubes (kits) to the site; PK samples go back to the lab, are analyzed, and the results are sent to the data manager.
Labs are of two types: central and local.
Central lab: reference range values; the lab maintains its own lab range values. Setup: the acquisition form maps site data into the vendor database.
Local lab: lab specific, site specific, sponsor specific. Setup: they do not maintain an acquisition form.
Data transfer specifications typically include:
Version
Table of contents
Contact details
Purpose
Data file structure
Data file transfer
Frequency (e.g., weekly or monthly)
Whether data transfers will be cumulative or incremental
Date of the file
File naming conventions (blinded vs. unblinded)
Data delivery method
e.g., email, SFTP (secure file transfer protocol, typically password-protected), or a client web portal
Test transfer
Data quality
Data receipt
Data file contents
Mapping tables (visit dates, codes); a file-check sketch follows this list.
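As a hedged sketch of how a receiving data management team might check incoming vendor files against such a specification, the Python below validates file names against an assumed convention of STUDY_VENDOR_DATATYPE_BLINDING_YYYYMMDD.csv. The convention itself is hypothetical; the real one would come from the study's data transfer specification.

```python
import re
from datetime import datetime

# Hypothetical convention: <STUDY>_<VENDOR>_<DATATYPE>_<BLINDED|UNBLINDED>_<YYYYMMDD>.csv
PATTERN = re.compile(
    r"^(?P<study>[A-Z0-9]+)_(?P<vendor>[A-Z]+)_(?P<datatype>[A-Z]+)_"
    r"(?P<blinding>BLINDED|UNBLINDED)_(?P<date>\d{8})\.csv$"
)

def validate_filename(name):
    """Return the parsed parts if the name follows the convention, else None."""
    match = PATTERN.match(name)
    if not match:
        return None
    parts = match.groupdict()
    try:  # confirm the date component is a real calendar date
        parts["date"] = datetime.strptime(parts["date"], "%Y%m%d").date()
    except ValueError:
        return None
    return parts

print(validate_filename("ABC123_CENTRALLAB_PK_BLINDED_20240115.csv"))
print(validate_filename("labdata_final.csv"))  # does not follow the convention
```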
Data transfers:
Mock (test) data transfers from vendors, IRT, and SAE systems
Data transfer agreement:
Purpose
Type of data
File naming conventions
Data transfer mode
Data transfer type
Transfer frequency/schedule
Vendor agreements
Company agreements
Reconciliation: external data
Vendor reconciliation: reconcile lab data, e.g., PK, ECG, etc.
IRT reconciliation: reconcile header data, e.g., subject ID, site ID, etc.
IRT = interactive response technology (randomization technology)
IRT helps clinical trial sponsors and sites manage patient and drug supply logistics, because of its ability to offer control and flexibility while increasing efficiency.
IRT also helps maintain blinding by making sure that specific roles in a study do not know the treatment that each patient is receiving.
CDM reconciles the following IRT fields (a reconciliation sketch follows this list):
Screening date
Re-screening date
Randomization number
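Below is a minimal sketch of what IRT reconciliation of header data can look like, using pandas and hypothetical column names (SUBJID, SITEID, RANDNO); the real extracts and reconciliation keys are defined in the study's transfer specifications.

```python
import pandas as pd

# Toy EDC and IRT extracts with hypothetical column names.
edc = pd.DataFrame({"SUBJID": ["001", "002", "003"],
                    "SITEID": ["S01", "S01", "S02"],
                    "RANDNO": ["R100", "R101", "R103"]})
irt = pd.DataFrame({"SUBJID": ["001", "002", "004"],
                    "SITEID": ["S01", "S01", "S02"],
                    "RANDNO": ["R100", "R102", "R104"]})

merged = edc.merge(irt, on=["SUBJID", "SITEID"], how="outer",
                   suffixes=("_EDC", "_IRT"), indicator=True)

# Subjects present in only one system.
print(merged[merged["_merge"] != "both"])
# Subjects present in both systems whose randomization numbers disagree.
print(merged[(merged["_merge"] == "both") &
             (merged["RANDNO_EDC"] != merged["RANDNO_IRT"])])
```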
SAE reconciliation: reconcile SAE data.
Adverse event: any unfavorable effect or disease occurring during the trial.
An adverse event may be mild, moderate, or severe.
Serious adverse event (SAE):
SAEs are collected from both clinical trials and marketed products.
A report containing an SAE becomes a case and goes to the safety department.
During a clinical trial, SAE information is also received through the CRF/EDC and stored on the adverse event form in the data management system.
The same event is therefore stored in both the clinical database and the safety database.
It is important to match the SAE (safety) system with the data management system; this is SAE reconciliation.
During reconciliation, the following are checked:
Cases found in the SAE (safety) system but not in the CDM system.
Cases found in the CDM system but not in the SAE system.
Deaths or other key cases found in only one system.
The need for SAE reconciliation: compare SAE data with CDM data.
How companies perform SAE reconciliation, typically through reports and manual comparison, depends on whether the reporting system has access to the underlying databases of both the SAE system and the CDM system; an initial match is made on information such as study ID, subject ID, etc.
SAE reconciliation method: a reconciliation plan is required. Data entered in the SAE and CDM systems is used to create two lists for direct comparison. Then:
1) Events in both lists are cross-checked.
2) According to the plan, key data are compared in each database.
3) All discrepancies arising during reconciliation are reviewed and resolved.
A sketch of this comparison follows below.
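Here is a minimal sketch of that comparison using pandas, with hypothetical column names (SUBJID, EVENT, ONSET); the real reconciliation keys and fields come from the study's SAE reconciliation plan.

```python
import pandas as pd

# Toy listings from the safety (SAE) system and the clinical (CDM) database.
safety = pd.DataFrame({"SUBJID": ["001", "002", "004"],
                       "EVENT":  ["MYOCARDIAL INFARCTION", "SEPSIS", "STROKE"],
                       "ONSET":  ["2024-01-10", "2024-02-02", "2024-03-15"]})
cdm = pd.DataFrame({"SUBJID": ["001", "002", "003"],
                    "EVENT":  ["MYOCARDIAL INFARCTION", "SEPSIS", "PNEUMONIA"],
                    "ONSET":  ["2024-01-10", "2024-02-03", "2024-02-20"]})

merged = safety.merge(cdm, on=["SUBJID", "EVENT"], how="outer",
                      suffixes=("_SAFETY", "_CDM"), indicator=True)

print(merged[merged["_merge"] == "left_only"])   # in the SAE system, not in CDM
print(merged[merged["_merge"] == "right_only"])  # in CDM, not in the SAE system
print(merged[(merged["_merge"] == "both") &      # key data that disagrees
             (merged["ONSET_SAFETY"] != merged["ONSET_CDM"])])
```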
Common reconciliation issues include, for example:
Unscheduled visits
MRI assessments
Accession IDs
Query management:
In clinical trials, a query is a request for clarification from trial sponsors to researchers. Such requests are made during data review, before database lock, and are aimed at resolving errors and inconsistencies. The query management feature facilitates communication among data managers, sponsors, and other stakeholders, and helps resolve questions faster.
Open queries: the site coordinator has opened (seen) the query but not yet answered it.
Closed queries: the site coordinator has opened and answered the query, and the query has been closed.
Answered queries: the site coordinator has opened and answered the query.
Canceled queries: queries entered in the EDC by the DM, biostatistician, medical monitor, or clinical coder can be canceled by those same roles.
Manual queries are prepared in an Excel sheet listing the discrepancies present in the EDC data.
Query posting
The DM of the CRO will provide weekly metrics, and the sponsor DM will review them (a metrics sketch follows below).
Metrics: for example, open queries by role: DM, CRA, medical reviewer.
Examples of queries: incomplete subject data, unconfirmed missed dates, etc.
Automated query: a query populated in the EDC by edit checks when a discrepant value is entered by the site coordinator.
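A minimal sketch of the kind of weekly query metrics a CRO DM might compile is shown below; the query records and field names are hypothetical, but the statuses mirror the list above.

```python
from collections import Counter

# Hypothetical query records exported from the EDC.
queries = [
    {"raised_by": "DM",               "status": "Open"},
    {"raised_by": "DM",               "status": "Answered"},
    {"raised_by": "CRA",              "status": "Closed"},
    {"raised_by": "Medical reviewer", "status": "Open"},
    {"raised_by": "DM",               "status": "Cancelled"},
]

by_status = Counter(q["status"] for q in queries)
open_by_role = Counter(q["raised_by"] for q in queries if q["status"] == "Open")

print("Queries by status:", dict(by_status))
print("Open queries by role:", dict(open_by_role))
```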
Manual data listings:
SAS program listings: the CRO will provide data listings and the sponsor will review the programmed listings. If that is not possible, the sponsor will prepare the programmed listings from the EDC.
EDC listings: EDC → installed modules → Reporter → data listing for the study → prod environment → submit a production clinical review run (forms: AE, CM, medical) → run → data → download file.
Listings are reviewed by patient identifier, using tools such as J-Review, Spotfire, and illuminate.
Manual data review
Manual checks
Example: compare adverse events with concomitant medications.
Apply CONCATENATE and VLOOKUP in Excel.
Create a unique ID: subject ID.
CONCATENATE:
Formula: unique ID + the variable we want to compare (record number).
Create the concatenated key in both listings.
VLOOKUP:
Formula: =VLOOKUP(concatenated key from the first listing, $range in the second listing covering its concatenated keys and the column of the variable to compare$, the column number of the variable to compare, 0).
A pandas equivalent of these steps is sketched below.
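The same comparison can be done programmatically. Here is a minimal pandas sketch of the CONCATENATE + VLOOKUP idea, with hypothetical column names (SUBJID, AETERM, CMTRT, CMINDC): it builds the concatenated key in both listings and flags adverse events with no matching concomitant medication indication.

```python
import pandas as pd

# Toy adverse event (AE) and concomitant medication (CM) listings.
ae = pd.DataFrame({"SUBJID": ["001", "001", "002"],
                   "AETERM": ["HEADACHE", "NAUSEA", "RASH"]})
cm = pd.DataFrame({"SUBJID": ["001", "002"],
                   "CMTRT":  ["PARACETAMOL", "CETIRIZINE"],
                   "CMINDC": ["HEADACHE", "RASH"]})  # indication for the medication

# Build the "concatenated" key in both listings, then look it up across them.
ae["KEY"] = ae["SUBJID"] + "|" + ae["AETERM"]
cm["KEY"] = cm["SUBJID"] + "|" + cm["CMINDC"]

review = ae.merge(cm[["KEY", "CMTRT"]], on="KEY", how="left")
# AEs with no matching concomitant medication may warrant a manual query.
print(review[review["CMTRT"].isna()])
```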
Review coding dictionaries: MedDRA and WHODrug.
STUDY CLOSEOUT:
On study completion, the database is locked so that no changes can be made to the information. After that, clean data is submitted to stakeholders for statistical analysis, reporting, and, finally, publication of the results. However, all these steps are beyond the clinical data management workflow.
LPLV
Database lock checklist:
Ensure data is complete (both eCRF and non-eCRF)
Perform final data listing review
Data entry
Data completion
Last Data transfer
Data review
Last reconciliation
Closing all discrepancies/issue trackers
All open queries closed out
PI signatures
SDV status: 100% SDV
SDV is completed for all CRFs by the CRA
After the last patient's last visit: data entry in the eCRF → data completion → last data transfer → closing all discrepancies/issue trackers in the EDC → close out all open queries → SDV (source data verification) done by the CRA at the site, to 100% SDV.
Electronic archival:
Preparing the archive (a collection of historical documents or records providing information).
It is done for the secure retention, maintenance, and retrieval of data, which goes to the trial master file.
TERMS
Go-live:
After study start-up, the database is ready in the EDC, the site enrolls subjects, and the site PI (doctor) enters patient details during the period from FPFV to LPLV.
Split deployment:
During start-up, the edit check document step is skipped for the initial deployment, which proceeds with the remaining eCRF specification and UAT.
Blinding:
Not aware of which drug/treatment is being given.
Open/unblinded:
Aware of everything about the drug being given.
Double blinding:
Neither the subject, the doctor, nor the company knows which drug each subject receives.
SDV:
Source data verification: the CRA goes to the site and verifies the site's source data against the EDC.
Randomization: performed through the IRT, which enters the randomization data into the EDC.
Randomly assigning treatments to subjects.
Example: an oncology study, ages 18-50, 100 subjects. The subjects' information goes into a computer, which gives each participant a code; code numbers are randomly assigned. For instance, subjects 1 2 3 4 5 might be assigned codes 3 5 1 2 4, and then treatment starts (see the sketch below).
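A minimal Python sketch of that example: each subject receives a randomly assigned code before treatment starts. The subject numbers are just the five from the illustration above.

```python
import random

subjects = [1, 2, 3, 4, 5]
codes = subjects.copy()
random.shuffle(codes)  # e.g., [3, 5, 1, 2, 4]

# Map each subject to the randomly assigned code, then treatment starts.
assignments = dict(zip(subjects, codes))
print(assignments)
```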
Placebo:
It looks like the drug but contains no therapeutically active ingredient; it is used to minimize patient bias.
Screening:
Screening is based mainly on the inclusion and exclusion criteria.
Example: 10 subjects pass screening and 40 fail; the number enrolled differs from the number screened because subjects who do not meet the above criteria cannot be enrolled (a small screening sketch follows below).
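Below is a minimal screening sketch in Python, reusing the 18-50 age window from the randomization example and adding a hypothetical exclusion flag (prior therapy) purely for illustration.

```python
# Hypothetical screening candidates.
candidates = [
    {"id": "001", "age": 25, "prior_therapy": False},
    {"id": "002", "age": 61, "prior_therapy": False},  # fails inclusion (age)
    {"id": "003", "age": 34, "prior_therapy": True},   # meets an exclusion criterion
]

def passes_screening(candidate):
    meets_inclusion = 18 <= candidate["age"] <= 50
    hits_exclusion = candidate["prior_therapy"]
    return meets_inclusion and not hits_exclusion

passed = [c["id"] for c in candidates if passes_screening(c)]
failed = [c["id"] for c in candidates if not passes_screening(c)]
print("Pass:", passed, "Fail:", failed)
```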
FPFV:
Once approval is obtained from the IRB, the site starts the clinical trial. At the beginning of the trial, the first patient comes in (FPFV, first patient first visit) and data entry in the database (EDC) starts.
LPLV:
The end of the clinical trial, marked by the last patient's last study visit.
Disposition date:
Tracked from when an adverse event starts and treatment starts through to the end of treatment; the disposition date is the end-of-treatment date (the last day).
Outsource:
Outsourcing means that you hire outside resources to help you complete tasks or projects. These might include freelancers or agencies that specialize in performing a particular type of task or project. For example, hiring a digital marketing agency is a way to outsource your social media management.
In-house:
In-house resources, on the other hand, are your existing employees — including yourself. When you handle a task or project in-house, you assign one or more of your team members to work on it.
Inclusion Criteria
Participants are eligible to be included in the study only if all of the following criteria apply
Exclusion Criteria
Participants are excluded from the study if any of the following criteria apply