Reflections on the FTC’s Disclosure Evaluation Workshop


By:


Lorrie Cranor

On September 15, 2016, the FTC convened a public workshop, Putting Disclosures to the Test, that examined ways of testing and evaluating how effectively disclosures communicate the wide range of information consumers need to make informed decisions in the marketplace. Presenters discussed disclosure evaluation studies focused on privacy notices, medical study consent forms, native advertising disclosures, drug fact labels, front-of-package nutrition labels, qualifying claims in advertisements, and more. They emphasized the importance of conducting empirical studies throughout the disclosure design process to improve disclosure effectiveness. Judging from attendance (approximately 225 people in person and 735 remotely via webcast), disclosure evaluation is a topic of broad interest. Videos, transcripts, photos, and other materials from the workshop are all available on the workshop website. But if you’re short on time, check out the staff summary we released today.

In this blog post, I would also like to share some of my observations about the workshop and about issues related to disclosure evaluation more generally, including low-cost test methods, metrics for determining effectiveness, approaches to improving disclosure design, and strategies to reduce the burden on consumers.

For over a decade, I have conducted research with my computer science and engineering students to evaluate privacy disclosures. I found it very informative to hear about disclosure evaluation from researchers coming from other academic disciplines and studying other types of disclosures. Disclosure evaluation seems to be an area that could benefit from more interdisciplinary collaboration.

It appears that disclosure evaluation is becoming easier and less costly. Academics and industry researchers told us how they use crowdsourcing platforms, as well as new approaches to eye tracking and mouse tracking, to conduct user studies far more quickly and inexpensively than was previously possible. I hope these developments will lead to more disclosure testing, and ultimately better disclosures. Even small studies with non-representative samples can help designers improve their disclosures.

Workshop panelists discussed a number of different metrics and thresholds for determining whether a disclosure is effective. In some safety-critical areas, standards bodies or regulators have established very specific metrics. It is not clear that these are necessarily appropriate for other disclosure areas, but they may be worth considering. At the same time, when consumers are bombarded with many disclosures, some are likely more important than others; given limited time and attention, it may be better to focus consumers’ attention on the most critical disclosures.

Throughout the day, panelists offered advice on how to improve disclosure design. One idea mentioned repeatedly was the need to identify the most important information in long disclosures, or the information consumers may not already be aware of, and to highlight it for them. While this may seem obvious, the advice is often not followed in practice.

I found the discussion on the last panel about the future of disclosures particularly interesting, as it offered strategies for reducing the burden of an increasing number of disclosures. Panelists presented data-driven approaches to determining what information to highlight for consumers in disclosures. They also discussed automated tools that can learn consumers’ preferences and reduce the need for large numbers of consent prompts. Machine-readable disclosures would allow a user’s smartphone or other device to read disclosures on the user’s behalf and display the information that matters most to that user. However, coordination and a common language may be needed for such approaches to be adopted.
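To make the machine-readable idea concrete, here is a minimal sketch of how a device might filter a disclosure down to the items a particular user cares about. The JSON structure, field names, and topic labels are my own illustrative assumptions, not part of any existing standard or any specific proposal discussed at the workshop.

```python
import json

# Hypothetical machine-readable disclosure. The structure, field names, and
# topic labels below are invented for illustration; they are not a real standard.
disclosure_json = """
{
  "company": "Example Retailer",
  "practices": [
    {"topic": "location", "detail": "Precise location is shared with advertising partners."},
    {"topic": "health",   "detail": "No health information is collected."},
    {"topic": "contact",  "detail": "Email address is used only for order updates."}
  ]
}
"""

# Topics this particular user has indicated matter most to them.
user_priorities = {"location", "health"}


def highlight(disclosure, priorities):
    """Return only the practice statements that match the user's priority topics."""
    return [p["detail"] for p in disclosure["practices"] if p["topic"] in priorities]


if __name__ == "__main__":
    disclosure = json.loads(disclosure_json)
    for detail in highlight(disclosure, user_priorities):
        print("-", detail)
```

The hard part, of course, is the common language: a device can only filter disclosures reliably if companies publish them in an agreed-upon vocabulary, which is exactly the coordination challenge the panelists noted.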

Our staff summary goes into more detail on these topics and includes links to research papers and other references.