A Capability Maturity Model for Research Data Management

3.4 Process Assessment

Process Assessment includes two components: Measurement and Analysis, and Verifying Implementation. Measurement and Analysis describes the need to measure the process and analyze the measurements; it typically includes examples of the measurements that could be taken to determine the status and effectiveness of the Activities Performed. Verifying Implementation describes the steps taken to ensure that the activities are performed in compliance with the process that has been established; it typically encompasses reviews and audits by management and quality assurance.

As discussed in Chapter 1 - Data Management in General, process assessment involves identifying the needed measurements and analyses and using those measurements for verification. For the data description and representation process area, the measurement of performance is tied to the quality of metadata and the ability of metadata schemas to interoperate with other standards and systems.

3.4.1 Measuring and verifying implementation

Measurement in the data description and representation process covers two aspects: the performance of metadata generation/creation, and the quality of metadata as the product of this process. Quantitative measures for assessing performance typically include the time taken to describe a dataset or to document the study context and data, the number of workflow steps from start to finish in metadata description, the time spent finding relevant sources in order to enter accurate metadata into the record, and the number of unnecessary repetitions in data entry. Data for these measures should be collected as the work is performed: such specific values are easily forgotten after the fact, which undermines the reliability and accuracy of the measurements.
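As an illustration, the sketch below shows one way such timing data might be captured in action. It is a minimal Python sketch, not part of any established tool; the function name, task label, and log file (log_metadata_task, metadata_timing.csv) are all hypothetical.

    import csv
    import time
    from contextlib import contextmanager
    from datetime import datetime, timezone

    LOG_FILE = "metadata_timing.csv"  # hypothetical log location

    @contextmanager
    def log_metadata_task(dataset_id, task):
        # Time one step of metadata work and append the result to a CSV log
        # that can later be analyzed for durations, workflow steps, and repetitions.
        start = time.perf_counter()
        try:
            yield
        finally:
            elapsed = time.perf_counter() - start
            with open(LOG_FILE, "a", newline="") as f:
                csv.writer(f).writerow([
                    datetime.now(timezone.utc).isoformat(),
                    dataset_id,
                    task,
                    round(elapsed, 1),
                ])

    # Wrap each workflow step so measurements are captured in action,
    # rather than reconstructed from memory afterwards.
    with log_metadata_task("dataset-042", "describe_study_context"):
        pass  # ... the actual description or documentation work goes here ...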

The quality of metadata can be measured by the criteria below (Zeng & Qin, 2014):

  • Completeness: the proportion of elements in a description record that actually contain values (non-empty elements).
  • Correctness in content, format, input, browser interpretation, and mapping.
  • Consistency in data recording, source links, identification and identifiers, description of sources, metadata representation, and data syntax.
  • Duplication rate in integrated collections.
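The completeness criterion lends itself directly to computation. The following is a minimal sketch, assuming metadata records are available as simple dictionaries keyed by element name; the element list and sample record are invented for illustration.

    def completeness(record, elements):
        # Proportion of the expected elements that are non-empty in the record.
        filled = sum(1 for e in elements if str(record.get(e, "")).strip())
        return filled / len(elements)

    # Illustrative subset of Dublin Core elements and a sample record.
    ELEMENTS = ["title", "creator", "subject", "description", "date"]
    record = {"title": "Lake sediment cores, 2012", "creator": "Smith, A.",
              "subject": "", "date": "2013"}

    print(f"Completeness: {completeness(record, ELEMENTS):.0%}")  # -> 60%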

Performance assessment in this process area is closely tied to the quality of metadata. A problematic workflow in metadata creation can hide emerging issues, and the opportunity to correct the process early, before problems worsen, is then missed. Data on the quality of metadata descriptions should be collected regularly, and procedures should be established to ensure the capture of data that will later be used to assess both process performance and metadata quality.

Data collected against the measurements for performance and quality are used to verify the implementation of the policies, schemas, and operations. The verifying process can be as formal as that described in the original CMM document (Paulk et al., 1993). The Australian National Data Service (ANDS, 2011) and the DMVitals project at the University of Virginia Library (Sallans & Lake, 2014) are two initiatives in the data management community exploring strategies for supporting verification of implementation.

Verification also includes making sure that the metadata schema(s) developed conform to standards, as well as internal verification: building documentation verification steps into daily practice and into the project workflow at key milestones (Long, 2009). One strategy Long uses for ensuring internal compliance with agreed-upon documentation standards is designating a project team member to be responsible for verification checks.
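One way to build such verification steps into a workflow is to script the conformance check itself. The sketch below assumes the metadata records are stored as XML files and that an XSD expressing the agreed-upon schema exists; the paths (records/, metadata_schema.xsd) are hypothetical, and the third-party lxml library must be installed.

    from pathlib import Path
    from lxml import etree  # third-party library: pip install lxml

    # Load the agreed-upon schema definition (hypothetical file name).
    schema = etree.XMLSchema(etree.parse("metadata_schema.xsd"))

    # Validate every record and report violations for follow-up, e.g. by the
    # team member designated to be responsible for verification checks.
    for path in sorted(Path("records").glob("*.xml")):
        doc = etree.parse(str(path))
        if schema.validate(doc):
            print(f"{path.name}: conforms to schema")
        else:
            for error in schema.error_log:
                print(f"{path.name}: line {error.line}: {error.message}")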

Rubric

Rubric for 3.4 - Process Assessment

Level 0
 This process or practice is not being observed.
 No steps have been taken to establish procedures for measurement, analysis, or verification to ensure quality and compliance with metadata standards.

Level 1: Initial
 Data are managed intuitively at the project level without clear goals and practices.
 Measurement, analysis, or verification to ensure quality and compliance with metadata standards has been considered minimally by individual team members, but not codified.

Level 2: Managed
 The DM process is characterized for projects and is often reactive.
 Measurement, analysis, or verification to ensure quality and compliance with metadata standards has been recorded for this project, but has not taken wider community needs or standards into account.

Level 3: Defined
 DM is characterized for the organization/community and is proactive.
 The project follows approaches to measurement, analysis, or verification to ensure quality and compliance with metadata standards as defined for the entire community or institution.

Level 4: Quantitatively Managed
 DM is measured and controlled.
 Quantitative quality goals have been established, including measurement, analysis, and verification to ensure quality and compliance with metadata standards, and both metadata and practices are systematically measured for quality.

Level 5: Optimizing
 The focus is on process improvement.
 Processes regarding measurement, analysis, or verification to ensure quality and compliance with metadata standards are evaluated on a regular basis, and necessary improvements are implemented.

References


Australian National Data Service. (2011). Research data management framework: Capability maturity guide. Retrieved from http://ands.org.au/guides/dmframework/dmf-capability-maturity-guide.pdf

Long, J. S. (2009). The workflow of data analysis using Stata. College Station, Texas: Stata Press.

Paulk, M. C., Curtis, B., Chrissis, M. B., & Weber, C. V. (1993). Capability Maturity Model for Software, Version 1.1 (No. CMU/SEI-93-TR-024). Software Engineering Institute. Retrieved from http://resources.sei.cmu.edu/library/asset-view.cfm?assetID=11955

Sallans, A. & Lake, S. (2014). Data management assessment and planning tools. In J. Ray (Ed.), Research Data Management: Practical Strategies for Information Professionals. West Lafayette, Indiana: Purdue University Press.

Zeng, M. L. & Qin, J. (2014). Metadata. Chicago, IL: ALA Neal Schuman. 
