Data management

Data quality

Detailed reports containing statistical analyses are our primary output. The quality of these reports depends on the quality of the data.

Through data management, we aim to ensure that data quality is maintained, and we strive to improve it.

How do we ensure data quality?

Detailed dataset rules and rigorous data validation are key parts of our audits and studies. Data are collected according to set rules, and are only analysed once they have successfully passed the validation process.

How are dataset rules implemented?

Dataset rules prevent basic invalid data from being recorded at data entry. For all new studies and the National Cardiac Arrest Audit (NCAA), these rules are built into the secure data collection web portals. For the Case Mix Programme (CMP), they are set in specification documents issued to software developers.
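As an illustration only, an entry-level dataset rule might work along the lines of the sketch below. The field names, ranges and messages are hypothetical examples for this page; the actual rules are those built into the web portals and the CMP specification documents.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class FieldRule:
    """A single dataset rule applied at data entry."""
    field: str
    check: Callable[[object], bool]
    message: str


# Hypothetical rules for illustration; real rules live in the portals
# and specification documents, not in this sketch.
ENTRY_RULES = [
    FieldRule("age", lambda v: v is not None and 0 <= v <= 130,
              "Age must be between 0 and 130"),
    FieldRule("admission_date", lambda v: v is not None,
              "Admission date is required"),
]


def check_entry(record: dict) -> list[str]:
    """Return a message for each rule the submitted record breaks.

    An empty list means the record passes the basic entry checks;
    otherwise the entry form would reject it before it is saved.
    """
    return [rule.message for rule in ENTRY_RULES
            if not rule.check(record.get(rule.field))]


print(check_entry({"age": 250}))
# ['Age must be between 0 and 130', 'Admission date is required']
```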

How are data validated?

Data are either validated live on the secure web portal, or via Data Validation Reports (DVRs). For each individual event we look for missing, unusual and invalid data, whilst at the site level we check for unusual patterns, duplication and consistency.
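The split between event-level and site-level checks can be sketched as follows. The thresholds, field names and checks here are hypothetical and stand in for the kinds of tests a Data Validation Report might apply; they are not the actual DVR logic.

```python
from collections import Counter


def validate_record(record: dict) -> dict[str, list[str]]:
    """Flag missing, unusual and invalid values for one event (illustrative)."""
    flags = {"missing": [], "unusual": [], "invalid": []}
    if record.get("outcome") is None:
        flags["missing"].append("outcome")
    hr = record.get("heart_rate")
    if hr is not None:
        if hr < 0 or hr > 400:
            flags["invalid"].append("heart_rate")   # impossible value
        elif hr < 30 or hr > 200:
            flags["unusual"].append("heart_rate")   # possible, but queried
    return flags


def validate_site(records: list[dict]) -> list[str]:
    """Site-level checks: duplication and simple consistency (illustrative)."""
    issues = []
    counts = Counter(r.get("event_id") for r in records)
    issues += [f"duplicate event_id: {i}" for i, n in counts.items() if n > 1]
    # Example consistency check: discharge should not precede admission.
    issues += [f"discharge before admission: {r.get('event_id')}"
               for r in records
               if r.get("admission_date") and r.get("discharge_date")
               and r["discharge_date"] < r["admission_date"]]
    return issues
```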

Meaningful analyses

These processes ensure that sites are all collecting and validating data to the same rules. We therefore have logical and consistent data, and can produce meaningful analyses.

How are new data quality issues identified?

Throughout all our processes, from reviewing validation to preparing statistical analyses, we monitor for any additional data issues.

When new data quality issues are identified, they are reviewed and, if necessary, new checks or procedures are introduced.

Trials and National Audit Programme (NAP) data management

Trials data management has more information on data management for trials and studies.

NAP data management has more information on data management for audits.