In an age where smart watches track our heart rate, apps detect our sleep patterns, and digital platforms monitor the calories we burn, the potential of patient-generated health data (PGHD) to support health care providers in improving health outcomes is greater than ever. However, many people are reluctant to share their personal health information, both in and outside of clinical settings. To understand their concerns, Nathan Botts, PhD, a Westat Principal Research Associate for Clinical Research, recently led a study examining trust, consent models, and incentives for PGHD sharing. Here, he discusses the research, its challenges and findings, and the practical implications for Americans.
Q. What concerns motivated this research?
A. With the expanded adoption of health technology and the increased integration of PGHD into our health care systems, concerns about safeguarding individuals’ personal health data are prevalent. Without adequate consent models for sharing sensitive health information, the data shared by individuals cannot be adequately protected, limiting the health benefits that could be derived from this data.
Q. What role do consent models play in ensuring the ethical use of PGHD?
A. Consent models are crucial to the ethical use of PGHD because they allow individuals to better control how their data are shared. With consent models, they can determine what health information will be released, under what circumstances, and by whom.
Q. What were the study’s main objectives?
A. The study investigated users’ comprehension of, and willingness to share, specific types of health data. It examined their concerns about data sharing, explored incentives for sharing, and sought to understand the implications of these findings for organizations adopting personal health information systems.
Q. What challenges did you face in designing the survey?
A. The primary challenges included addressing the complexity of how users perceive data-sharing policies and ensuring they understood what it means to “share” information, especially in the age of big data and generative AI, and particularly where sensitive information is concerned. We therefore had to design the survey to capture nuanced attitudes about privacy across different types of health data, ensure representation across diverse demographic groups, measure complex constructs like trust and consent, and address user resignation toward privacy policies that they may not fully understand but agree to anyway.
Q. What methodology did you use for this study?
A. We used a non-experimental, cross-sectional survey design targeting U.S. adults across diverse demographic backgrounds. The web-based survey consisted of 22 questions asked of 336 adults. While we acknowledge the limitations inherent in web-based surveys, the sample size provided an appropriate balance between statistical requirements and practical research constraints, allowing us to draw meaningful conclusions about health information-sharing preferences. We used Cronbach’s alpha, a statistical measure of internal consistency, to confirm that the questions in our survey were consistent and reliable. A score above 0.70 indicates that a set of questions works well together to measure the same underlying idea, and that threshold was met in this study.
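For readers curious about the reliability check mentioned above, Cronbach’s alpha compares the variance of individual survey items with the variance of their summed score. A minimal sketch, using hypothetical 5-point Likert responses (the function name and data are illustrative, not from the study):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 items
responses = np.array([
    [5, 4, 5, 4],
    [4, 4, 4, 5],
    [2, 2, 3, 2],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [1, 2, 1, 2],
])
alpha = cronbach_alpha(responses)
print(f"alpha = {alpha:.2f}")  # well above the 0.70 acceptability threshold here
```

Items that move together (respondents who rate one item high rate the others high) drive alpha toward 1; unrelated items drive it toward 0, which is why 0.70 serves as a common floor for acceptable consistency.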
Q. What were the findings and respondents’ primary concerns?
A. The study revealed that while 82% of respondents understood privacy policies, 55% were hesitant to trust them. Respondents were very concerned about sharing financial data and health data but less concerned about sharing political and social data. They were most willing to share fitness and lifestyle data but more hesitant to share clinical and mental health data, except with professionals and family.
Q. How did respondents feel about receiving incentives for sharing PGHD?
A. Respondents were generally positive about incentives, ranking overall better health first, more personalized health services second, and monetary rewards third.
Q. How can this research help create more effective data-sharing frameworks?
A. This research can help organizations develop data-sharing frameworks that respect users’ varying comfort levels with different types of health data, implement stronger privacy protections for sensitive data categories, offer appropriate incentives aligned with user preferences, provide transparency about who receives shared data, and create accessible consent management tools that help users maintain control over their information.
Q. How can trust be improved for future health data-sharing platforms?
A. Emerging technologies like blockchains could improve trust in the sharing of personal health data by providing a secure, transparent, and tamper-proof record of all data transactions. Blockchains work like a digital public ledger that records every action—such as who accessed the data and when—in a way that cannot be changed or deleted. This transparency helps ensure that personal health information transactions, including PGHD, are used appropriately and not manipulated or falsified. By making the full history of data use visible and verifiable, blockchains can reduce the risk of fraud, enhance security, and give individuals more confidence that their health data are being handled responsibly.
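The tamper-evident ledger idea described above can be illustrated with a simple hash chain, where each entry embeds the hash of the previous one so that altering any record invalidates everything after it. This is a minimal sketch of the concept, not a production blockchain; the record fields and function names are hypothetical:

```python
import hashlib
import json

def add_record(chain: list, record: dict) -> None:
    """Append a record linked to the hash of the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain: list) -> bool:
    """Recompute every hash; any altered or reordered entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        body = {"record": entry["record"], "prev_hash": entry["prev_hash"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev_hash"] != prev_hash or entry["hash"] != digest:
            return False
        prev_hash = entry["hash"]
    return True

ledger = []
add_record(ledger, {"actor": "clinic_a", "action": "read", "data": "step_counts"})
add_record(ledger, {"actor": "app_x", "action": "write", "data": "sleep_log"})
assert verify(ledger)          # intact chain verifies

ledger[0]["record"]["actor"] = "someone_else"
assert not verify(ledger)      # tampering is immediately detectable
```

Real blockchain platforms add distributed consensus and replication on top of this chaining, which is what makes the record effectively impossible to rewrite rather than merely easy to audit.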
Q. How can these findings shape real-world models for PGHD?
A. These findings can help produce patient-centered tools that allow individuals greater control over sharing preferences, create transparency measures showing data use history, and fashion incentive structures aligned with user preferences.
Q. How will the findings influence engagement with health apps and wearables?
A. These findings will likely influence how people engage with health apps and wearables by encouraging users to demand clearer consent models that can segment different types of health data and prompt greater selectivity about what information they share. The findings could also encourage the use of features that build trust and support the creation of reward-based systems that promote data sharing in ways that lead to better health outcomes.
Q. What are the practical implications for the average American?
A. The average American may encounter more transparent data-sharing policies, gain improved control over the health information they share, potentially receive incentives for sharing their health data, and see the development of tools that make consent management easier and more understandable.
Expert Interview: Why Are People Reluctant to Share Health App Data? (May 2025)