Scroll down for a listing of my recent publications.

Developing and Testing Accessible eLearning Courses.

The 2020 ICT Accessibility Testing Symposium: Time for Testing in Testing Times (Remote Work, Commerce, Education, Support…), October 2020. View the article online. Download PDF.

Basis: This article codifies my team's experiences testing eLearning applications for various clients. In it, my teammates describe best practices for developing eLearning modules, and I outline a framework for integrating accessibility at each step of the development lifecycle. The paper was selected as “Best Paper” at the 2020 conference because it provided concrete, practical advice for anyone doing accessibility testing, regardless of their level of experience.

Abstract: The term eLearning refers to using electronic technologies to access educational curriculum outside a traditional classroom. Organizations create and publish eLearning content using sophisticated authoring software, Learning Management Systems, and a network. This type of remote learning is becoming even more popular in the age of COVID-19; however, it can be challenging for instructional designers to know how to make eLearning content accessible. Proprietary players, multimedia, animation, and interactive quizzes all provide engaging content but present a variety of accessibility difficulties. This paper will provide an overview of key accessibility considerations and technical solutions for both eLearning designers and accessibility testers. In addition, the paper will offer guidance on incorporating the accessibility testing team into the designing process and capturing remediation guidance for future eLearning projects.

Authors: Mayo, L., Kedarshetty, A., Dobre, J.

Quick Lit Reviews Reduce UX Research Time and Supercharge Your Design.

UX Booth, March 2020. View the article.

Basis: This article describes my adaptations to the literature review process to provide fast, inexpensive knowledge collection and management for UX strategy and research consultations. The method grew out of my human factors experience.

Abstract: A quick and dirty literature review is a way to capture and synthesize information about a topic (a design problem, a new technology, an unfamiliar business area, etc.). It’s a simple structure that will allow you to document relevant information in an organized and intentional format. Creating the Lit Review can take a relatively short time compared with formal UX research; but leaves you with a lasting resource that can organize your thoughts, inform your strategy, educate others, and positively influence team behavior and design.

Authors:  Dobre, J.

The Potential Impact of Data Source and Interoperability Messaging on Health Information Technology (HIT) Users: A Study Series from the United States Department of Veterans Affairs.
The Journal of Innovation in Health Informatics, April 2019. View the article.

Basis: This academic paper was based on my usability testing and design recommendations for the Joint Legacy Viewer (JLV); view the project.

Abstract: The implementation of Health Information Exchange (HIE) systems is becoming commonplace across the United States. The promise of HIE systems lies in their potential to provide clinicians and administrative staff rapid access to relevant patient data to support judgment and decision making. However, despite the increasing adoption rates of HIE systems across the health care industry, actual use of HIE systems remains minimal. This paper describes the studies and design methods employed to improve the usability and technical performance of an HIE and to support user workflow. Improvement recommendations and future investigations that improve usability are noted.

Authors: Baggetta, D., Herout, J., Dietz, A., Robbins, J., Maddox, K., Cournoyer, A., & Dobre, J.

From Study to Software: Implementing Human Factors Recommendations in the Joint Legacy Viewer (JLV).
The Human Factors Quarterly, Issue 23: Fall 2018, Nov 1, 2018.

Basis: This academic paper was based on my usability testing and design recommendations for the Joint Legacy Viewer (JLV); view the project.

Synopsis: The Fall 2018 issue of the Human Factors Quarterly discusses various methods human factors professionals use to learn about, evaluate, and direct work in health care settings. In the first article, Jolie Dobre describes how heuristic evaluation and usability testing were employed to understand user challenges and evaluate potential solutions for a web application that clinicians use to view electronic health records. [Download the Fall 2018 issue of the Human Factors Quarterly]

Author: Dobre, J.

User-centered Design Innovations in Government.
Awaiting publication, October 2018.

Basis: This paper describes practice innovations developed during a web-presence assessment and target-audience research for the U.S. Department of Interior’s Bureau of Safety and Environmental Enforcement (BSEE) as a part of an effort to upgrade their web platform and improve their web presence; view the project.

Abstract: The case study offers strategies to accelerate user-centered design activities through the use of a content inventory database and social media and Internet news monitoring. Combining a content inventory with social media and Internet news monitoring provided deep insights into the agency, the industry it monitors, and consumers of its content. The database accelerated knowledge capture and supported the efficient management, collection, aggregation, and analysis of data.

Authors: Dobre, J., Reibeling, R., Cramer, B.

Gathering Information in Healthcare Settings: A Tool to Facilitate On-Site Work.
Proceedings of the Human Factors and Ergonomics Society 62nd Annual Meeting, 600-604.

Basis: This academic paper was based on my experiences, and those of my colleagues, performing site visits for the VA to aid problem analysis for their modernization effort; view the project.

Abstract: Executing human factors methods, such as observations, interviews, or onsite usability tests, in clinical settings requires a high level of coordination to achieve successful data collection outcomes. Members of the Veterans Health Administration (VHA) Human Factors Engineering (HFE) team are dispersed throughout the country and sometimes need to visit medical centers or outpatient clinics to complete our work. We have, therefore, developed a practice innovation to facilitate the logistical coordination necessary when gathering data onsite: a site visit checklist. This practice-oriented paper includes the full checklist as well as a discussion of its use, so that other groups can benefit from the lessons we have learned in conducting onsite work in healthcare settings.

Authors: Herout, J., Dobre, J., Plew, W., Saleem, J.J.

Minimizing the Impact of Interoperability Errors on Clinicians.
Proceedings of the Human Factors and Ergonomics in Health Care 2018 Conference, March 2018

Basis: This academic paper was based on my usability testing and design recommendations for the Joint Legacy Viewer;  view the project.

Abstract: There is little guidance in the literature on how health information technology (HIT) interfaces should be designed to inform clinicians of data availability. As the industry focuses on interoperability between systems and devices, and as more HIT products aggregate data from external sources, it becomes increasingly critical to identify methods to alert clinicians to the availability of data without negatively impacting clinician workflow or contributing to alert fatigue. This paper reports on a usability study conducted on the U.S. Department of Veterans Affairs (VA) Joint Legacy Viewer (JLV) to provide guidance to developers on communicating connection errors and interface status. The issue, the process used to explore it, and the findings are discussed. As publicly developed software, the efforts behind VA's JLV design choices and images of the design solutions can be shared to further the field's understanding.

Authors: Dobre, J., Carter, T., Herout, J., & Cournoyer, A.

Problem Analysis of Team Care in the VA Health Care System.
The Human Factors Quarterly. Issue 20: Fall 2017, Dec 5, 2017. View the Newsletter.

Basis: This newsletter article was based on the UX research I performed for the VA to aid problem analysis for their modernization effort; view the project.

Summary: As one piece of ongoing work in analyzing clinical domains in support of Electronic Health Record Modernization (EHRM), Human Factors Engineering (HFE) conducted an in-depth study of team care. The objectives were to (1) understand how teams are defined and constructed and (2) understand how teams function today. The final analysis included: how clinicians communicate, manage tasks, and coordinate with each other; issues with performing follow-up on ordered consults, orders, appointments, and labs; how clinicians compensate when they can't accomplish required work within the Veterans Health Administration (VHA); and findings that affect team cohesion, workflows, and patient coordination.

Authors: Noonan, A., Herout, J., Dobre, J., Moon, B., & Baggetta, D.

Rapid Heuristic Evaluation: Ensuring Fast and Reliable Usability Support.
Proceedings of the Human Factors and Ergonomics 2017 Annual Conference, October 2017

Basis: This academic paper was based on my work to insert usability guidance and clinical-user expertise into an established Agile development rhythm for the VA’s enterprise Health Management Platform; view the project.

Abstract: The U.S. Department of Veterans Affairs (VA) Human Factors Engineering (HFE) office developed a usability testing method called “Rapid Heuristic Evaluation” (Rapid HE) that offers benefits to users of the Agile development process. Rapid HE addresses the need to combine fast, reliable usability support with feedback from clinical subject matter experts (SMEs) during the design and development of an electronic health record (EHR). The Rapid HE process leverages established EHR heuristics to accelerate wireframe review and approval, and merges a traditional heuristic evaluation (HE) with an expert review by two SMEs. Our application of Rapid HEs has maximized use of resources and minimized the amount of time needed to provide feedback during Agile development cycles. This paper describes the Rapid HE process, deviations from traditional HEs, and reports on data from 16 HEs that our group conducted on an EHR platform currently being developed by VA.

Authors: Herout, J., Dobre, J., Harrington, C., Weir, C., Baggetta, D., Carter, T., & Cook, A.

Rapid Heuristic Evaluations to Inform Design of the Enterprise Health Management Platform (eHMP).
The Human Factors Quarterly. Issue 15: Summer 2016, August 15, 2016.

Basis: This newsletter article was based on my work to insert usability guidance and clinical-user expertise into an established Agile development rhythm for the VA's enterprise Health Management Platform; view the project.

Abstract: The Department of Veterans Affairs (VA) has experienced several very well-publicized events resulting in outcries to improve the delivery of health care to Veteran patients. A major initiative toward this improvement is the design and development of the next generation of VA's Electronic Health Record (EHR), known as the Enterprise Health Management Platform (eHMP). Although based on Agile development principles, the design reviews were marked by inefficient practices that resulted in numerous design iterations as VA stakeholders (clinician informaticists, Subject Matter Experts [SMEs], and eHMP leadership) struggled to reach agreement on design. We developed a process solution to support the decision makers: a rapid heuristic evaluation (HE) of early wireframes and system design documentation. This process leveraged established EHR heuristics, or design guidelines (Armijo, McDonnell, & Werner, 2009), to accelerate the wireframes' transition from design to development. The rapid HE merged a traditional HE with an expert review by two subject matter experts.

Authors: Herout, J., Dobre, J., Harrington, C., Weir, C., Baggetta, D., Carter, T., & Cook, A.

Increasing the Cyber Security Posture of the Nation through Workforce Development
SRA Cyber Connections Newsletter, September 2013

Basis: This newsletter article was developed to market the NICCS website to cybersecurity professionals within our organization; view the project. It describes the origins of the NICCS Program and the creation of the site.

Authors: Dobre, J., Yohannes, M.
