A Capability Maturity Model for Research Data Management

2.4 Process Assessment

Process Assessment includes Measurement and Analysis and Verifying Implementation. Measurement and Analysis describes the need to measure the process and analyze the measurements, and typically includes examples of the measurements that could be taken to determine the status and effectiveness of the Activities Performed. Verifying Implementation describes the steps to ensure that the activities are performed in compliance with the process that has been established, and typically encompasses reviews and audits by management and quality assurance.

2.4.1 Measurement and Analysis

Measurement and analysis of data acquisition and processing provides specific practices and procedures for this process area. Keep in mind that the goal of measurement and analysis is to provide "general guidance about measuring, analyzing, and recording information that can be used in establishing measures for monitoring actual performance of the process" (CMMI Product Team, 2006). Projects should develop and implement metrics for the data acquisition, processing, and quality assurance processes. Example metrics include the quantity of data being collected or the observed error rate at different points in the process. A small sample of data might be intensively quality-checked to provide an estimate of the level of undetected errors in the data collected.
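
As an illustration, such metrics can be computed with a short script. The following is a minimal sketch in Python; the check functions, the 5% audit sample, and the metric names are assumptions for illustration, not prescriptions of the model.

```python
import random

def acquisition_metrics(records, auto_check, intensive_check,
                        sample_rate=0.05, seed=1):
    """Compute simple status and effectiveness metrics for data collection.

    auto_check      -- cheap automated check applied to every record
    intensive_check -- thorough check applied to a small sample only
    Both are hypothetical callables that return True when a record passes.
    """
    if not records:
        return {"records_collected": 0}
    total = len(records)
    observed_errors = sum(1 for r in records if not auto_check(r))

    # Intensively quality-check a small random sample to estimate the
    # level of errors that the routine checks leave undetected.
    random.seed(seed)
    sample = random.sample(records, max(1, int(total * sample_rate)))
    undetected = sum(1 for r in sample
                     if auto_check(r) and not intensive_check(r))

    return {
        "records_collected": total,
        "observed_error_rate": observed_errors / total,
        "estimated_undetected_error_rate": undetected / len(sample),
    }
```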

2.4.2 Assure data quality

Data quality should be assessed as data are collected, and the data quality process should be documented. Checking data quality as the data are collected ensures that only valid data are recorded and that erroneous values are either recollected or at least eliminated from further analysis.

At a minimum, data items must be consistent with the data type of the column.
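
For example, a tabular data file can be screened column by column against its expected types. The sketch below assumes a CSV file with hypothetical column names and types:

```python
import csv

# Hypothetical expected type for each column in the data file.
EXPECTED_TYPES = {
    "site_id": str,
    "sample_date": str,
    "temperature_c": float,
    "replicate": int,
}

def type_violations(path):
    """Yield (row_number, column, value) for every data item that cannot
    be parsed as the declared type of its column."""
    with open(path, newline="") as f:
        # Row 1 is the header, so data rows start at 2.
        for row_num, row in enumerate(csv.DictReader(f), start=2):
            for column, expected in EXPECTED_TYPES.items():
                value = row.get(column, "")
                try:
                    expected(value)
                except ValueError:
                    yield (row_num, column, value)
```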

Data should be inspected after collection to check for validity (e.g., by plotting values for visual examination). Times and dates should be checked to be sure they are valid (DataONE, 2011b). Location coordinates can be mapped and checked to ensure that they are valid (DataONE, 2011b). Values recorded by instruments should be inspected to check that they are within a sensible range for the property being measured and for the instrument (e.g., within the detection limits of the equipment) (DataONE, 2011b).
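
Checks like these can be automated. The sketch below illustrates the three kinds of validity checks just described; the date format, coordinate bounds, and detection limits are placeholder assumptions:

```python
from datetime import datetime

# Assumed instrument detection limits (placeholder values),
# e.g., a temperature sensor reading in degrees Celsius.
DETECTION_LIMITS = (-40.0, 60.0)

def valid_date(value, fmt="%Y-%m-%d"):
    """Check that a date string parses as a real calendar date."""
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

def valid_coordinates(lat, lon):
    """Check that coordinates fall on the globe; mapping them is still
    advisable to catch plausible-but-wrong locations."""
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0

def within_detection_limits(reading, limits=DETECTION_LIMITS):
    """Check that an instrument reading is inside a sensible range."""
    low, high = limits
    return low <= reading <= high
```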

Data can be transcribed by two or more people and the values compared to ensure accuracy (DataONE, 2011a). Newly collected data can be compared to similar existing data sets. Comparison to historic ranges can help identify anomalous values that require further examination. However, outliers should not be removed without careful consideration of whether they represent true measurements.
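
Both practices are straightforward to script. The sketch below compares two independent transcriptions field by field, and flags, rather than removes, values outside a historic range; the record structure is a hypothetical assumption:

```python
def transcription_mismatches(entry_a, entry_b):
    """Compare two independent transcriptions of the same records and
    return (record_index, field, value_a, value_b) for disagreements."""
    mismatches = []
    for i, (a, b) in enumerate(zip(entry_a, entry_b)):
        for field in a:
            if a[field] != b.get(field):
                mismatches.append((i, field, a[field], b.get(field)))
    return mismatches

def flag_historic_outliers(values, historic_min, historic_max):
    """Return (index, value) pairs outside the historic range. Outliers
    are flagged for further examination, not removed, because they may
    be true measurements."""
    return [(i, v) for i, v in enumerate(values)
            if not historic_min <= v <= historic_max]
```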

Supervisors should review and sign off on data to signify completeness and accuracy (Columbia Center for New Media Teaching and Learning, n.d.). 

Codes should be recorded in the data file to represent the quality of data at the time quality is assessed (DataONE, 2011b). Problematic data should be flagged to indicate known issues (DataONE, 2011c). Any ancillary data used to assess data quality should be described and stored (DataONE, 2011b). 
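
In a tabular data file, this often amounts to appending a flag column alongside the measured values. The flag vocabulary below is a hypothetical example; projects should adopt whatever flagging convention their community uses:

```python
import csv

def qc_flag(value, low, high):
    """Assign a quality flag to one measured value (hypothetical
    vocabulary: OK, MISSING, INVALID, RANGE)."""
    if value is None or value == "":
        return "MISSING"
    try:
        numeric = float(value)
    except ValueError:
        return "INVALID"
    return "OK" if low <= numeric <= high else "RANGE"

def add_flag_column(in_path, out_path, field, low, high):
    """Copy a CSV file, appending a 'qc_flag' column that records the
    quality of one measured field at the time of assessment."""
    with open(in_path, newline="") as src, \
         open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames + ["qc_flag"])
        writer.writeheader()
        for row in reader:
            writer.writerow({**row, "qc_flag": qc_flag(row[field], low, high)})
```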

2.4.3 Check data integration from other sources

If data from other sources are used, the quality of those other sources should be reviewed (Hale et al., 2003). In addition, the license or permissions for those data should be reviewed to ensure that the use is allowed. Finally, the source of the data should be recorded to ensure that the data can be cited as appropriate. 
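
One lightweight way to address all three points is to keep a small provenance record alongside each external data set. The fields below are an illustrative minimum, not a formal metadata standard:

```python
import json
from datetime import date

def record_source(dataset_path, source_url, license_terms, citation):
    """Write a small provenance sidecar file documenting where an
    external data set came from, its terms of use, and how to cite it."""
    provenance = {
        "dataset": dataset_path,
        "source_url": source_url,    # where the data were obtained
        "license": license_terms,    # confirm the intended use is allowed
        "citation": citation,        # so the data can be cited as appropriate
        "retrieved_on": date.today().isoformat(),
    }
    with open(dataset_path + ".provenance.json", "w") as f:
        json.dump(provenance, f, indent=2)
```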

Rubric

Rubric for 2.4 - Process Assessment

Level 0
This process or practice is not being observed.
No steps have been taken to establish procedures for measurement, analysis, or verification of data collection and documentation.

Level 1: Initial
Data are managed intuitively at the project level without clear goals and practices.
Measurement, analysis, and verification of data collection and documentation have been considered minimally by individual team members, but not codified.

Level 2: Managed
The DM process is characterized for projects and is often reactive.
Measurement, analysis, and verification of data collection and documentation have been recorded for this project, but have not taken wider community needs or standards into account.

Level 3: Defined
DM is characterized for the organization/community and is proactive.
The project follows approaches to measurement, analysis, and verification of data collection and documentation that have been defined for the entire community or institution.

Level 4: Quantitatively Managed
DM is measured and controlled.
Quantitative quality goals have been established regarding measurement, analysis, and verification of data collection and documentation, and both data and practices are systematically measured for quality.

Level 5: Optimizing
Focus is on process improvement.
Processes regarding measurement, analysis, and verification of data collection and documentation are evaluated on a regular basis, and necessary improvements are implemented.

References

CMMI Product Team. (2006). CMMI for Development, Version 1.2 (No. CMU/SEI-2006-TR-008). Pittsburgh, PA, USA: Carnegie Mellon Software Engineering Institute. Retrieved from http://repository.cmu.edu/sei/387

Columbia Center for New Media Teaching and Learning. (n.d.). Responsible conduct of research: Data acquisition and management: Foundation text. Retrieved from http://ccnmtl.columbia.edu/projects/rcr/rcr_data/foundation/index.html#3_B

DataONE. (2011a). Double-check the data you enter. Retrieved from https://www.dataone.org/best-practices/double-check-data-you-enter

DataONE. (2011b). Ensure basic quality control. Retrieved from https://www.dataone.org/best-practices/ensure-basic-quality-control

DataONE. (2011c). Mark data with quality control flags. Retrieved from https://www.dataone.org/best-practices/mark-data-quality-control-flags

Hale, S. S., Miglarese, A. H., Bradley, M. P., Belton, T. J., Cooper, L. D., Frame, M. T., et al. (2003). Managing Troubled Data: Coastal Data Partnerships Smooth Data Integration. Environmental Monitoring and Assessment, 81(1-3), 133–148. doi:10.1023/A:1021372923589
