Applying HCI Principles in Designing Usable Systems for Dentistry



Fig. 9.1
Recording of pocket depth (numbers in mm in red), bleeding points (P), missing teeth (M) and existing restorations (colored areas on teeth) in the EHR



Dentistry has a unique clinical workflow (Button et al. 1999), yet only a few studies have examined workflow and the role of technology in the dental clinic (Button et al. 1999; Wotman et al. 2001). Nevertheless, previous studies have demonstrated that limited consideration of HCI-related issues often interferes with the dental clinic workflow. For example, Irwin et al. showed that over 60 % of the 27 “breakdowns” observed during initial examination and treatment planning using EHRs in general dentist practices were associated with technology (Irwin et al. 2009). Usability issues and unfamiliarity with chair-side use of clinically relevant electronic data were major barriers to EHR adoption for dental practitioners (Schleyer et al. 2006, 2007; John et al. 2003; Thyvalikakath et al. 2007, 2008), not unlike the barriers associated with medical provider encounters (Miller and Sim 2004; Fitzpatrick and Koh 2005; Simon et al. 2007). In the U.S., the axiUm EHR has achieved near ubiquity in dental academic settings. This is not to say that axiUm has surmounted the usability challenges of private practice dental EHRs; indeed, a survey and interview study conducted during the implementation of axiUm at the University of Texas Health Science Center at Houston Dental Branch identified usability as a major concern (Walji et al. 2009).



9.1.3 Development of a Standardized Dental Diagnostic Terminology


A complete list of patient problems and diagnoses is a cornerstone of the medico-legal document that is the patient record. It serves as a valuable tool for providers assessing a patient’s clinical status, succinctly communicates this information between providers and to front desk and administrative personnel, and acts as a fulcrum around which research and quality improvement levers pivot.

Early efforts to standardize dental diagnostic terms fell short with respect to comprehensiveness and availability (World Health Organization 1973; Ettelbrick et al. 2000). The ICD-DA (the application of the International Classification of Diseases to Dentistry and Stomatology) was added to ICD-8 in 1965 (World Health Organization 1973), but the oral health coverage of the ICD terminology continues to call for improvement (Ettelbrick et al. 2000). Over the years, several groups independently generated dental diagnostic terminologies (Orlowsky and Glusman 1969; Gregg and Boyd 1996; Bader et al. 1999). Of these, only the Toronto Codes (Leake et al. 1999) have been systematically evaluated (Leake 2002); we do not know to what extent the other terminologies have met dental teams’ diagnostic documentation needs (Sabbah 1999). In the early 1990s, the American Dental Association (ADA) started the development of SNODENT, a Systematized Nomenclature for Dentistry. In 1998, the ADA entered into an agreement to incorporate SNODENT Version I into SNOMED (SNODENT Update 2004). SNODENT is composed of diagnoses, signs, symptoms and complaints, and currently includes over 7,700 terms (Goldberg et al. 2005; Torres-Urquidy and Schleyer 2006). In 2012, SNODENT Version II was incorporated into SNOMED CT. Until this recent inclusion, SNODENT was available only by license and was maintained by the ADA; as a result, it is currently not widely implemented.

In 2007, our research team developed the EZCodes (Kalenderian et al. 2011), since renamed the Dental Diagnostic System (DDS), to enhance the proper and consistent recording of diagnostic findings. The DDS has been mapped to SNOMED, ICD-9, ICD-10, ICD-9-CM and ICD-10-CM (CM, for Clinical Modification, denotes the U.S. versions of ICD-9 and ICD-10). With 1,518 terms, the DDS was developed as an interface terminology, i.e., a set of terms designed to be compatible with the natural language of the user, mediating between a user’s colloquial conceptualizations and an underlying reference terminology (Clinical Information Modeling Initiative (CIMI) Category: Interface terminology 2012), for use in the dental clinic with SNOMED CT as its back-end reference terminology, i.e., a terminology in which each term has a codable, computer-usable definition to support retrieval and data aggregation (Reference Terminology). The few DDS terms that lacked adequate coverage in SNOMED were submitted for integration, and the majority have been accepted; as such, SNOMED truly functions as the reference terminology for the DDS. Similarly, we have submitted terms to ICD in an effort to enhance the ICD oral health classification and improve the mapping between DDS and ICD oral health terms. The DDS terminology is also in the final phase of becoming a norm in the Netherlands, meaning that it will be the standardized diagnostic terminology that all Dutch dentists are expected to use (Nederlands Tandartsenblad, Nederlandse Norm voor diagnostische termen).
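To make the distinction between an interface terminology and a reference terminology concrete, the following minimal Python sketch shows how a clinician-facing term with colloquial synonyms might resolve to a back-end reference concept. All term names, synonyms and codes here are invented placeholders, not actual DDS or SNOMED CT content.

```python
# Minimal sketch of an interface terminology backed by a reference
# terminology. All term names, synonyms and codes below are invented
# placeholders -- they are NOT actual DDS or SNOMED CT content.

from dataclasses import dataclass, field


@dataclass
class InterfaceTerm:
    """A clinician-facing term that mediates to a reference concept."""
    preferred_name: str                            # what the dentist sees
    synonyms: list = field(default_factory=list)   # colloquial variants
    reference_code: str = ""                       # back-end reference code


# Hypothetical interface terminology: several colloquial names per concept.
TERMS = [
    InterfaceTerm("Cavitated caries lesion",
                  synonyms=["cavity", "tooth decay"],
                  reference_code="REF-0001"),
    InterfaceTerm("Chronic periodontitis",
                  synonyms=["gum disease"],
                  reference_code="REF-0002"),
]


def resolve(user_input):
    """Match user input against preferred names and synonyms."""
    needle = user_input.strip().lower()
    for term in TERMS:
        names = [term.preferred_name] + term.synonyms
        if needle in (n.lower() for n in names):
            return term
    return None


if __name__ == "__main__":
    hit = resolve("cavity")
    if hit:
        # Display the clinician-friendly name, store the reference code
        # so the data can later be retrieved and aggregated.
        print(f"Display: {hit.preferred_name}; store: {hit.reference_code}")
```

The key design point is that the clinician always sees and searches natural language, while the record stores the reference code, which is what makes later retrieval and data aggregation possible.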

However, prior analyses of the EZCodes (DDS) terminology in use in an EHR demonstrated both low utilization and frequent errors (Kalenderian et al. 2011). Between July 2010 and June 2011, the EZCodes were utilized 12 % of the time in three dental schools. More than 1,000 of the available 1,321 terms were never chosen. Caries and periodontics were the most frequently used categories, and only 60.5 % of the EZCodes entries were found to be valid (Blumenthal and Glaser 2007). The low utilization rate reiterated findings from an earlier study (White et al. 2011), but also suggested the need for more training, an improved EHR interface, and the addition of descriptions and synonyms to the terms.
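As a minimal sketch of how such a utilization analysis can be computed from diagnosis entry logs, consider the following Python fragment; the entries, terms and counts are invented examples, not the actual study data.

```python
# Minimal sketch of the utilization analysis described above. The
# entries, terms and counts are invented examples, not study data.

from collections import Counter

# Hypothetical log: one (term, category) pair per diagnosis entry.
entries = [
    ("Dental caries", "Caries"),
    ("Dental caries", "Caries"),
    ("Chronic periodontitis", "Periodontics"),
    ("Pulpitis", "Endodontics"),
]

# Placeholder for the full list of available terms in the terminology.
available_terms = {"Dental caries", "Chronic periodontitis",
                   "Pulpitis", "Gingival recession"}

term_counts = Counter(term for term, _ in entries)
category_counts = Counter(category for _, category in entries)

never_used = available_terms - set(term_counts)
print(f"Never chosen: {len(never_used)} of {len(available_terms)} terms")
print("Most frequent categories:", category_counts.most_common(2))
```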

In Sect. 9.2, we describe our approach to using HCI principles to systematically identify usability problems and to drive the redesign of an existing EHR to enhance the effective and efficient entry of dental diagnostic terms.

To put this work in context, we first review some of the recent and relevant literature on usability, dental EHRs and interface terminologies. A number of researchers have established that dental EHRs have some distance to go to be usable. Reynolds and colleagues provide a brief overview of dental informatics, reiterating that usability challenges represent a primary hurdle to the adoption of dental EHRs (Reynolds et al. 2008). In 2008, Hill, Stewart, and Ash explored the impact of EHRs on faculty and students in the dental academic setting. Newly developed clinical processes were considered more time consuming than the previous paper processes, and the end users’ needs appeared to be intense, immediate and significant. Here too, the authors reported significant usability problems standing in the way of smooth implementation, and changes in workflow were significant and often cumbersome (Hill et al. 2010a). Juvé-Udina reported on the evaluation of the usability of the diagnosis axis in a nursing interface terminology: overall utilization of the diagnostic terms was high at 92.3 %, although individual concepts ranged from rarely used to used as often as 51.4 % of the time (Juvé-Udina 2013). Thyvalikakath et al., using a combination of heuristic evaluation and user testing, similarly showed that four major commercial dental EHRs had significant usability problems (Thyvalikakath et al. 2009). Despite the fact that dental EHR usability is an established problem, little has been published on the use of cognitive engineering approaches, such as think-aloud protocols, workflow observations and semi-structured interviews, to remedy these issues (Thyvalikakath et al. 2014).


9.1.4 Challenges of Dental EHR Use and Usability


Although healthcare providers, including dental providers, increasingly adopt EHRs, driven in part by significant governmental incentives (Marcotte et al. 2012) and the hope for increased efficiency and quality (Blumenthal and Glaser 2007; Chaudhry et al. 2006), usability issues remain a major barrier to adoption (Patel et al. 2008; Zhang 2005a, b). As with medical EHRs, a user-centered design of a dental EHR facilitates good usability, ensuring that users can complete their work tasks effectively, efficiently and satisfactorily (Walji et al. 2014). Conversely, a poorly designed EHR with poor usability can lead to patient safety issues (Horsky et al. 2005a, b; Ash et al. 2004).

Dental EHR use presents a plethora of usability challenges, spanning both visual and functional interface design problems (Thyvalikakath et al. 2009). Specific examples detected in dental EHRs include illogical button placement, unanticipated button functionality, difficulty switching between the odontogram and periodontal chart, the inability to easily delete a mistaken entry on the odontogram, the need for better visual representation of dental findings, and icons that resemble each other in shape and color (Reynolds et al. 2008; Thyvalikakath et al. 2009; Walji et al. 2013; Song et al. 2010).

The low chair-side adoption rate of dental EHRs is also thought to be due, in part, to the unsuitability of the conventional EHR set-up in the dental operatory (Reynolds et al. 2008). Keyboards and mice are potential sources of infection and need protective covers (D’Antonio et al. 2013). Electronic clinical data entry is often believed to take longer than entering the same information in the paper chart, or to be impractical because the dental assistant is needed for other duties (Reynolds et al. 2008). Additionally, the inability to effectively use clinical decision support within the dental EHR to positively influence patient care outcomes (Schleyer and Thyvalikakath 2012) and the lack of integration of evidence-based guidelines into the EHR (Song et al. 2010) have limited adoption by dental practitioners.



9.2 Applying Theory to Practice: Redesigning a Treatment Planning Module in a Dental EHR



9.2.1 Design Challenge


Because the axiUm EHR is widely used in dental school clinics to document patient care, it was possible to work in close collaboration with the vendor to redevelop one of its existing modules, using a participatory, work-centered design approach aimed at better supporting the diagnosis-centric treatment planning process for dental students. Specifically, the existing Treatment Planning module within the EHR was deemed too complicated and difficult to use. Several dental institutions had also recently adopted the DDS dental diagnostic terminology (formerly called the EZCodes), which drove the diagnostic entry functionality of the Treatment Planning module.

Treatment planning in dentistry is the process of using information obtained from the patient history, clinical examination and diagnostic tests to formulate a sequence of treatment steps designed to eliminate disease and restore efficient, comfortable, aesthetic and masticatory function to the patient. When developing a treatment plan, the provider should follow a general phasing and sequencing format designed to solve the patient’s dental problems in a way that first manages the patient’s emergent concerns (e.g., pain and infection). The next step is removal of disease (e.g., caries) and tooth restoration, followed by tooth replacement and reconstruction. Once these priorities have been met, aesthetic and cosmetic concerns are addressed; lastly, preventive and maintenance measures are put in place. Any given phase may contain several individual procedures, some of which may need to be sequenced in a specific order (Stefanac and Nesbit 2007).

The Treatment Planning module in the axiUm EHR was originally developed with input from dental educators and thought leaders, and follows the treatment planning philosophy of Stefanac (Stefanac and Nesbit 2007). In order to develop a treatment plan within axiUm, a user (i) enters the patient’s problems/complaints; (ii) selects the appropriate diagnoses from a comprehensive list; (iii) enters the treatment objectives, which represent the intent or rationale for the final treatment plan, usually expressed as short statements and clear goals from both the student’s and patient’s perspectives; and (iv) enters a detailed plan for treating each of the selected diagnoses (Fig. 9.2). Following treatment planning, the student obtains instructor approval and patient consent before beginning treatment.



Fig. 9.2
Original treatment planning process in axiUm (Reprinted from Tokede et al.)
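As a rough illustration of the four-step structure and the phasing described above, the following Python sketch models a treatment plan in software. The class and field names are our own illustrative choices; they do not reflect axiUm's internal design.

```python
# Illustrative data model for a phased treatment plan. Class and field
# names are hypothetical; they do not reflect axiUm's internal design.

from dataclasses import dataclass, field
from enum import IntEnum


class Phase(IntEnum):
    """Phases ordered per the sequencing philosophy described above."""
    EMERGENT = 1          # pain and infection first
    DISEASE_CONTROL = 2   # e.g., caries removal and restoration
    RECONSTRUCTION = 3    # tooth replacement and reconstruction
    AESTHETIC = 4         # cosmetic concerns
    MAINTENANCE = 5       # preventive and maintenance measures


@dataclass
class PlannedProcedure:
    description: str
    phase: Phase
    sequence: int = 0     # order within the phase, when it matters


@dataclass
class TreatmentPlan:
    problems: list = field(default_factory=list)     # step (i)
    diagnoses: list = field(default_factory=list)    # step (ii)
    objectives: list = field(default_factory=list)   # step (iii)
    procedures: list = field(default_factory=list)   # step (iv)

    def ordered_procedures(self):
        """Return procedures sorted by phase, then by in-phase sequence."""
        return sorted(self.procedures, key=lambda p: (p.phase, p.sequence))
```

Ordering procedures by a (phase, sequence) key is one simple way to express the constraint that emergent care always precedes disease control, reconstruction, aesthetics and maintenance.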


9.2.2 Design Approach


We used a participatory design process to systematically identify challenges in the use of the existing Treatment Planning module and to inform an improved user interface that effectively supports the underlying needs of the end users. As summarized in Fig. 9.3, usability challenges were first identified and prioritized. New mockups were then developed, tested, refined and implemented in the EHR by the vendor. After further usability assessments, the new module was released to customers. Post-implementation usability assessments were conducted to determine the impact of the redesign in comparison to the original version.



Fig. 9.3
Overall process for assessing, improving and implementing the Treatment Planning (TP) Module in axiUm


9.2.2.1 Usability Assessment to Identify Challenges in Existing Treatment Planning Module


In general, a terminology is evaluated in terms of its ability to represent relevant concepts, while a user interface is evaluated in terms of its usability. As Patel and Cimino noted, a combined approach to evaluating both the terminology and the user interface offers a more holistic perspective on how the task is carried out and where it can be improved (Cimino et al. 2001). Consider a user who would like to enter a diagnosis into the Treatment Planning module but faces significant hurdles or fails outright. The reason for the failure might be any one, or a combination, of the following: inadequate completeness of the terminology (e.g., the terminology does not represent the diagnosis), poor usability (e.g., the interface does not provide adequate access to the diagnostic terminology), or insufficient representation within the terminology (e.g., poor organization of the terminology). The same problems could underpin the selection of a term that does not capture the intended meaning. By attending to both the terminology and the user interface, we can begin to characterize the breadth of the causes of failure. We therefore analyzed three aspects of this human-computer interaction challenge: (1) use of the DDS terminology itself, (2) use of the existing Treatment Planning interface, and (3) use of the DDS terminology as part of the clinic workflow.

We conducted usability assessments of EHRs at two dental schools: Harvard School of Dental Medicine (HSDM) and the University of California, San Francisco (UCSF). Both institutions have university-owned clinics that train dental students as well as residents (postgraduate students). Both dental schools also have a private faculty practice, use the axiUm EHR system, and were early adopters of the DDS dental diagnostic terminology. Study participants included a sample of third- and fourth-year dental students (who were actively involved in delivering patient care), residents, and faculty. These groups represent the primary users of the DDS dental diagnostic terminology. As mentioned previously, dental students are responsible for updating the dental patient record under the supervision of attending faculty. Because one does not get many opportunities to overhaul a major module within an EHR, we conducted three complementary usability assessments in order to maximize our ability to capture challenges. We summarize that work here; full details have been published in the International Journal of Medical Informatics (Walji et al. 2013, 2014).


Think-Aloud User Testing

We created two pre-defined scenarios to assess users’ interactions with DDS in the axiUm Treatment Planning module: a simple task of entering one diagnosis and a more complex treatment planning task. Participants were asked to think aloud (Ericsson and Simon 1993), verbalizing their thoughts as they worked through each scenario. As part of user testing, quantitative data were captured to assess whether tasks were completed successfully (a measure of effectiveness) and how much time was spent accomplishing each task (a measure of efficiency). To judge whether a user successfully completed a task, we first had to define the correct path through it, which we did using Hierarchical Task Analysis (HTA) (Diaper and Stanton 2004) after gathering input from expert dentists at each site. After determining the appropriate path, we calculated the expert performance time, i.e., the time it would take an expert who makes no errors to complete the task. We did this using CogTool (John et al. 2004), an open source tool that predicts performance time on the basis of application screenshots and the specification of a path for completing a specific task. After completing the exercises, participants were asked to provide additional feedback on the use of the module and to complete a user satisfaction survey using the validated and widely used System Usability Scale (Brooke 1996).
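Two of the quantitative measures used in this testing can be made concrete in code. SUS scoring follows a fixed rule: odd items contribute the response minus 1, even items contribute 5 minus the response, and the sum is multiplied by 2.5 to yield a 0-100 score (Brooke 1996). CogTool's time predictions are grounded in an ACT-R-based keystroke-level model; the sketch below instead uses the classic Keystroke-Level Model operator times of Card, Moran and Newell as a simplified stand-in, so the predicted times are illustrative only, and the click path shown is hypothetical.

```python
# Sketch of two measures from the user testing: SUS scoring (standard
# Brooke 1996 rule) and a simplified Keystroke-Level Model estimate of
# expert performance time. The KLM operator times are the classic
# Card, Moran & Newell values; CogTool's ACT-R-based model is more
# sophisticated, so treat this as illustrative only.

def sus_score(responses):
    """Score a 10-item SUS questionnaire (each response is 1..5)."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale to 0..100


# Classic KLM operator times in seconds.
KLM_TIMES = {
    "K": 0.20,   # keystroke or button press
    "P": 1.10,   # point with mouse to a target
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}


def klm_estimate(operators):
    """Expert (error-free) time for a sequence of KLM operators."""
    return sum(KLM_TIMES[op] for op in operators)


if __name__ == "__main__":
    # Hypothetical path for "enter one diagnosis": think, point and
    # click a tab, think, point and click the term, point and confirm.
    path = ["M", "P", "K", "M", "P", "K", "P", "K"]
    print(f"Predicted expert time: {klm_estimate(path):.2f} s")
    print(f"SUS example score: {sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 1])}")
```

In the study itself we relied on CogTool for the expert time predictions; the sketch merely conveys the underlying idea of summing per-operator times along an error-free path through the interface.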


Observations Using Ethnography

Observational data were collected over a 3-day period by a trained researcher in order to provide insight into the clinical workflow, information gathering and diagnostic decision-making in the clinical environment where the dentists and dental students worked. To minimize any impact on patient care, a non-participatory observational technique was used: the researcher engaged with dental team members only if clarification was needed or during downtime, such as when a patient did not show up for an appointment. Observational data were captured using paper-based field notes. Each set of observations lasted approximately 4 h, split into two shifts (morning and afternoon). The primary purposes of the observations were to capture the overall clinical workflow, to identify how diagnoses were made and recorded in the EHR using the DDS dental diagnostic terminology, and to identify any associated challenges. Actual clinical work was not part of the observation.


Semi-structured Interviews

The third approach we took to evaluating the terminology and interface was to conduct semi-structured interviews with open-ended questions. The semi-structured format ensured uniformity of the questions asked, while the open-ended format allowed interviewees to express themselves, and new questions were allowed to arise from the discussion. The prepared questions focused on two broad themes: (1) the perception and internal representation of the clinic, patient care and the role of dentists/students within the clinic; and (2) the nature of the workflow and environment of care within the dental clinic when using the EHR. The questions were informed by the knowledge gained from the observations. Interviews lasted approximately 30 min each. The interview data were collected to assess the role, situational awareness and general work philosophy of the subjects in the dental clinic. The sample was representative of those usually present in the clinical environment and thus included third- and fourth-year dental students, residents and faculty.


Findings

User testing revealed that only 22 % of users were able to successfully complete all of the steps in the simple task of entering one diagnosis, while no user was able to complete the more complex treatment planning task. Table 9.1 provides an overview of the 24 high-level usability problems found through the three methods. Together, the methods identified a total of 187 usability violations: 54 % via user testing, 28 % via the semi-structured interviews and 18 % via the survey method, with modest overlap (Walji et al. 2014). Interface-related problems included unexpected approaches to displaying diagnoses, lack of visibility, and inconsistent use of user interface widgets (the elements of the interface with which a user interacts). Terminology-related issues included missing and mis-categorized concepts. Work domain issues involved both absent and superfluous functions. In collaboration with the vendor, each usability problem was prioritized and a timeline was set to resolve it.


Table 9.1
Summary of usability problems, priorities and timeframe to address and implement solutions

Interface

1. Illogical ordering of terms (HIGH)
Timeframe: Immediate: reorder alphabetically; ≤1 year: allow users to customize ordering
Example: Terms are ordered by numeric code rather than alphabetically

2. Term names not fully visible (HIGH)
Timeframe: ≤6 months
Example: Users select an incorrect diagnosis because they cannot read the full term name

3. Time consuming to enter a diagnosis (HIGH)
Timeframe: ≤1 year
Example: The user must navigate several screens and scroll through a long list to find and select a diagnosis

4. Inconsistent naming and placement of user interface widgets (HIGH)
Timeframe: ≤6 months
Example: To add a new diagnosis, a user must click a button labeled “Update”

5. Ineffective feedback to confirm diagnosis entry (HIGH)
Timeframe: ≤1 year
Example: The user sees only the numeric code for the diagnosis, not the name of the term

6. Search results do not match users’ expectations (MEDIUM)
Timeframe: ≤1 year
Example: A search for “pericoronitis” retrieves three concepts with the same name but different numeric codes

7. Users unaware of important functions to help find a diagnosis (MEDIUM)
Timeframe: ≤1 year
Example: The system defaults to the “quick list”, so some users never navigate the “full list” or discover the search feature

8. Limited flexibility in the user interface (MEDIUM)
Timeframe: ≤1 year
Example: Users cannot modify an entered diagnosis on the “details” page and must go back to previous screens to edit it

9. Distinction between category name and concept unclear (MEDIUM)
Timeframe: Immediate
Example: Users attempt to select a category name

Terminology

10. Inappropriate granularity/specificity of concepts (MEDIUM)
Timeframe: ≤1 year
Example: Some sub-categories contain a large number of concepts, making it very difficult for users to find an appropriate term

11. Some concepts appear missing/not included (HIGH)
Timeframe: ≤6 months
Example: Concepts reported missing by users include missing tooth, arrested caries and attritional teeth

12. Some concepts not classified in appropriate categories/sub-categories (HIGH)
Timeframe: ≤6 months
Example: Aesthetic concerns

13. Abbreviations not recognized by users (HIGH)
Timeframe: ≤6 months
Example: F/U, NOS, VDO

14. Visibility of the numeric code for a diagnostic term (HIGH)
Timeframe: Immediate: use the Quicklist to hide the code; ≤1 year: remove the numeric code from the UI
Example: Although the numeric code is a meaningless identifier, users expected the identifier to carry some meaning

15. Users not clear about the meaning of some concepts (MEDIUM)
Timeframe: ≤1 year
Example: Novice users (students) had difficulty distinguishing between similar terms, and definitions and synonyms were not provided

Work domain

16. Free-text option can be used to circumvent structured data entry (HIGH)
Timeframe: Immediate: disable the option; ≤1 year: remove the option altogether
Example: Instead of selecting a structured term, some users enter the name of the diagnosis as free text

17. Synonyms not displayed (HIGH)
Timeframe: ≤1 year
Example: Users must search by the preferred term name

18. Limited knowledge of diagnostic term concepts and how to enter them in the EHR (HIGH)
Timeframe: ≤1 year
Example: Users appear to have had little concerted education and training, from either the institution or the vendor

19. Only one diagnosis can be entered for each treatment (HIGH)
