A framework for evaluating electronic health record vendor user-centered design and usability testing processes.

MedStar author(s):
Citation: Journal of the American Medical Informatics Association. 24(e1):e35-e39, 2017 Apr 01
PMID: 27375292
Institution: MedStar Institute for Innovation | MedStar Washington Hospital Center
Department: Emergency Medicine | National Center for Human Factors in Healthcare
Form of publication: Journal Article
Medline article type(s): Journal Article
Subject headings: *Commerce | *Ergonomics | *Medical Records Systems, Computerized/st [Standards] | Certification | Electronic Health Records/st [Standards] | Humans | Medical Informatics | User-Computer Interface
Year: 2017
ISSN:
  • 1067-5027
Name of journal: Journal of the American Medical Informatics Association : JAMIA
Abstract:
Objective: Currently, there are few resources for electronic health record (EHR) purchasers and end users to understand the usability processes employed by EHR vendors during product design and development. We developed a framework, based on human factors literature and industry standards, to systematically evaluate the user-centered design processes and usability testing methods used by EHR vendors.
Materials and Methods: We reviewed current usability certification requirements and the human factors literature to develop a 15-point framework for evaluating EHR products. The framework is based on 3 dimensions: user-centered design process, summative testing methodology, and summative testing results. Two vendor usability reports were retrieved from the Office of the National Coordinator's Certified Health IT Product List and were evaluated using the framework.
Results: One vendor scored low on the framework (5 points) while the other vendor scored high on the framework (15 points). The 2 scored vendor reports demonstrate the framework's ability to discriminate between the variability in vendor processes and to determine which vendors are meeting best practices.
Discussion: The framework provides a method to more easily comprehend EHR vendors' usability processes and serves to highlight where EHR vendors may be falling short in terms of best practices. The framework provides a greater level of transparency for both purchasers and end users of EHRs.
Conclusion: The framework highlights the need for clearer certification requirements and suggests that the authorized certification bodies that examine vendor usability reports may need to be provided with clearer guidance.
All authors: Fairbanks RJ, Hodgkins ML, Kosydar A, Ratwani RM, Zachary Hettinger A
Fiscal year: FY2017
Digital Object Identifier:
Date added to catalog: 2017-05-06
Holdings
Item type: Journal Article
Current library: MedStar Authors Catalog
Collection: Article
Call number: 27295692
Status: Available
Barcode: 27375292


Language: English
