Background: There is an increasing number of interactive web-based advance care planning (ACP) support tools, that is, web-based aids in any format that encourage reflection, communication, and processing of publicly available information; most of these tools cannot be found in the peer-reviewed literature.

Objective: This study aims to conduct a systematic review of web-based ACP support tools to describe their characteristics, readability, and quality of content, and to investigate whether and how they are evaluated.

Methods: We systematically searched the web-based gray literature databases OpenGrey, ClinicalTrials.gov, ProQuest, British Library, Grey Literature in the Netherlands, and Health Services Research Projects in Progress, as well as Google and app stores, and consulted experts, using the following eligibility criteria: web-based, designed for the general population, accessible to everyone, interactive (encouraging reflection, communication, and processing of information), and in English or Dutch. The quality of content was evaluated using the Quality Evaluation Scoring Tool (score 0-28; a higher score indicates better quality). To synthesize the characteristics of the ACP tools, their readability and quality of content, and whether and how they were evaluated, we used 4 data extraction tables.

Results: A total of 30 tools met the eligibility criteria, including 15 (50%) websites, 10 (33%) web-based portals, 3 (10%) apps, and 2 (7%) with a combination of formats. Of the 30 tools, 24 (80%) mentioned a clear aim, including 7 (23%) that supported reflection or communication, 8 (27%) that supported people in making decisions, 7 (23%) that provided support to document decisions, and 2 (7%) that aimed to achieve all of these aims. Of the 30 tools, 7 (23%) provided information on their development; all of these were developed in collaboration with health care professionals, and 3 (10%) with end users. Quality scores ranged between 11 and 28, with most of the lower-scoring tools not referring to information sources.

Conclusions: A variety of ACP support tools are available on the web, varying in quality of content. In the future, users should be involved in the development process of ACP support tools, and the content should be substantiated by scientific evidence.

Trial Registration: PROSPERO CRD42020184112; https://tinyurl.com/mruf8b43