Right information
|
DSS content or algorithms are incorrectly defined or do not reflect evidence-based practice
Borum, 2018; Van Dort, 2022
|
- Provide transparent explanation of algorithms and clinical content.
- Base decision support on validated, current evidence and quality assured sources of information.
- For decision support algorithms that process data within electronic care record systems, assess the availability and quality of the data needed to drive the DSS.
Khairat, 2018; Wright, 2015; Kouri, 2022; Wu, 2012; Van Dort, 2022
|
Inaccurate patient information / lack of patient information / mistrust in information
Borum, 2018; Van Dort, 2019
|
- Links to patient information must be readily available, consistent and relevant.
- Provide links to quality-assured external patient information resources.
- Include patient-friendly explanations about the decision support recommendation.
- Provide patient-specific recommendations, relevant for the clinical situation at hand, rather than generic recommendations which may not be relevant to individual patient / service user needs.
Van Dort, 2019; Keyworth, 2018; Kilsdonk, 2017
|
DSS content does not meet user needs.
Borum, 2018
|
- Carry out iterative testing and piloting, with iterative modification to meet user needs.
Keyworth, 2018
|
Right format
|
Alert fatigue / over alerting and nuisance alerts.
Borum, 2018; Lee, 2020; Van Dort, 2019; Liu, 2021; Miller, 2018; Olakotan, 2020
|
- Reminders and alerts should be presented in such a way that the user does not find them threatening or obtrusive.
- Evaluate user needs and expectations of DSS alerts, and thresholds for alerting, early and throughout the development lifecycle.
- Involve users in the implementation of alerts.
- Alerts should be non-interruptive to workflow.
- Alerts can be useful in cases of uncertainty.
- Alerts should signal a critically important issue and should be of practical use in informing and guiding the practitioner.
Khairat, 2018; Van Dort, 2019; Miller, 2018; Olakotan, 2020
|
Presentation and formatting of information is not clear or helpful to the user.
Borum, 2018; Kennedy, 2019
|
- Present information based on users’ needs and expectations.
- Information should be actionable or directive, rather than purely numerical (e.g. a risk score) or a narrative explanation.
- Provide clear direction on how to reduce modifiable risk factors.
- Information should be clearly visible on the screen in a visually oriented design.
- Apply principles of risk communication, signal words, and formatting of information design (see Annex 3).
Kennedy, 2019; Khairat, 2018; Kilsdonk, 2017
|
Over-dependence on manual input of data.
|
- If possible, DSS should automatically process coded data e.g. from an electronic care record system.
- Alternatively, DSS can use predefined ranges to minimise the requirement for manual input, or automated checks to confirm that entered data falls within a realistic range.
Borum, 2018; Westerbeek, 2021; Kouri, 2022
|
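The automated range check described above can be sketched as follows; the field names and plausible ranges are illustrative assumptions, not values from any specific DSS:

```python
# Sketch of an automated plausibility check on manually entered values:
# the DSS flags input outside a predefined realistic range before the
# algorithm runs. Field names and ranges below are illustrative only.

PLAUSIBLE_RANGES = {
    "systolic_bp_mmhg": (50, 300),
    "weight_kg": (0.5, 400),
    "age_years": (0, 130),
}

def check_input(field, value):
    """Return True if the value lies within the predefined realistic range."""
    low, high = PLAUSIBLE_RANGES[field]
    return low <= value <= high

# A weight of 4000 kg is rejected as a likely data-entry error.
assert check_input("weight_kg", 72.0)
assert not check_input("weight_kg", 4000.0)
```

In practice such a check would prompt the user to confirm or correct the value, rather than silently discarding it.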
Usability
Borum, 2018; Olakotan, 2020
|
- Provide access to important information relevant to the clinical encounter.
- Technology and user interface must be easy to use.
- DSS recommendations should take account of complexities of individual patients / service users.
- DSS should help to facilitate discussions and shared decision-making with patients /service users.
- User interactions and time required to use the system should be kept to a minimum.
Keyworth, 2018; Kilsdonk, 2017
|
Right time
|
Inappropriate timing of DSS
Borum, 2018; Kilsdonk, 2017; Olakotan, 2020
|
- Integrate DSS intervention and triggers with practitioners’ workflow and preferences – e.g. before, during or after consultation (see section 2.2.3 above on aligning DSS with workflow).
Kennedy, 2019; Kilsdonk, 2017; Olakotan, 2020
|
DSS is inefficient – e.g. slow to run, too many clicks.
Borum, 2018
|
- Optimise performance and the number of clicks based on user testing.
- Make asynchronous calls – e.g. present results based on algorithms processing patient data overnight.
Wright, 2015; Olakotan, 2020
|
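The overnight-processing pattern mentioned above can be sketched as a precomputed cache; the scoring function and patient identifiers are hypothetical placeholders:

```python
# Sketch: run a slow DSS algorithm in an overnight batch and cache the
# results, so the point-of-care lookup is a fast synchronous read.
# The scoring logic and identifiers below are illustrative only.

def risk_score(record):
    # Placeholder for a computationally expensive algorithm.
    return record["age"] / 10

def overnight_batch(records):
    """Precompute scores for every record (run as a scheduled batch job)."""
    return {patient_id: risk_score(rec) for patient_id, rec in records.items()}

def lookup(cache, patient_id):
    """Fast read at the point of care; None if no precomputed result exists."""
    return cache.get(patient_id)

cache = overnight_batch({"p1": {"age": 60}, "p2": {"age": 30}})
assert lookup(cache, "p1") == 6.0
assert lookup(cache, "missing") is None
```

The design choice is to move the expensive computation off the clinical pathway entirely, so the practitioner never waits for the algorithm during a consultation.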
Right channel
|
Hardware and software issues
Borum, 2018; Keyworth, 2018; Wright, 2015; Olakotan, 2020
|
- Pilot testing before wide-scale usage.
- Iterative testing and modification, customising tools to the needs of the staff.
Keyworth, 2018; Olakotan, 2020
|
Lack of interoperability
Borum, 2018
|
- The Right Decision Service platform provides interfaces for interaction with electronic care record systems. However, note constraints above on quality of data and interfaces provided by electronic care record systems.
- If integrating DSS with electronic care systems, it is important to engage with practitioners and analyse available data to identify the range of coded data in use within the care records.
Kouri, 2022; Wright, 2015; Thomas-Craig, 2021; Kilsdonk, 2017
|
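Analysing the range of coded data in use, as recommended above, can be sketched as a simple coverage count; the sample records and codes are illustrative examples, not a recommended code set:

```python
# Sketch: profile which clinical codes actually occur in care records
# before wiring DSS triggers to them. Sample records and codes are
# illustrative; a real analysis would query the record system itself.
from collections import Counter

sample_records = [
    {"codes": ["73211009", "38341003"]},  # record with two coded conditions
    {"codes": ["38341003"]},
    {"codes": []},                        # record with no coded data
]

def code_coverage(records):
    """Count how often each code appears across the records."""
    return Counter(code for rec in records for code in rec["codes"])

coverage = code_coverage(sample_records)
assert coverage["38341003"] == 2
assert coverage["73211009"] == 1
```

Codes that rarely appear in the records indicate data that cannot reliably drive a DSS trigger, and so flag where the integration or data-quality work needs to happen first.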