This blog post is in reaction to the USENIX/Google research titled “Alice in Warningland: A Large-Scale Field Study of Browser Security Warning Effectiveness.”
The overarching questions I have are:
1) How and when should these notifications be displayed to users?
2) How should these notifications be written?
For additional commentary on how Google Chrome is responding to the findings of this research, see the WeLiveSecurity post “Google Chrome security warnings – now in plain English.”
It would be interesting to see clickthrough rates for antivirus dialogs combined with browser dialogs. Users don’t purchase or use a web browser for its warning dialogs, but an argument can be made that users who purchase antivirus have opted in to an extra layer of security and additional warnings/notifications – studying that group could reveal whether the added warnings affect whether those users visit more malicious sites. As the study acknowledged among its limitations, we need to “consider user behaviors that are indicative of attention to warnings” (258).
“Users click through more-frequent errors more quickly” (268). Companies trying to help users may be inadvertently causing long-term warning fatigue through the proliferation of warnings users encounter across all the products they use (i.e., not just web browsers). A failure to work in concert from early on may erode the long-term efficacy of malware warnings and other important notifications that users need today. The study recommended “that security practitioners should limit the number of warnings that users encounter” (270). The question is, who decides which warnings to include, and more importantly, who should be responsible for authoring them? How can we ensure that warning notifications are user-friendly and usable, given the many products people use today and the proliferation of notifications they receive on a daily basis?
More Information to Help Users:
“The ‘Help me understand’ button was clicked during 1.6% of Google Chrome SSL warning impressions. For Mozilla Firefox warnings, 0 users clicked on ‘Technical Details,’ and 3% of viewers of the ‘Add Exception’ dialog clicked on ‘View Certificate.’ This additional content therefore has no meaningful impact on the overall clickthrough rates” (267).
Perhaps product owners don’t want to include this information in the primary notification window, but if users don’t read it or use it to make a decision, what purpose does it serve? This study concluded that “Google Chrome places error details in the main text of its SSL warning, and the error has a large effect on user behavior. It is possible that moving this information into Mozilla Firefox’s primary warning could reduce their clickthrough rates even further for some errors” (270).
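As an aside, the per-element clickthrough rates quoted from the study could be tallied roughly as follows. This is a minimal sketch of my own, not the study’s methodology: the event names, log shape, and counts are all hypothetical.

```python
# Sketch: computing per-element clickthrough rates from a warning telemetry
# log. Every identifier and data point here is hypothetical, for
# illustration only -- it is not how the study collected its data.
from collections import Counter

# Hypothetical event log: (impression_id, ui_element_clicked or None).
events = [
    ("imp1", None),                  # warning shown, no extra element clicked
    ("imp2", "help_me_understand"),
    ("imp3", None),
    ("imp4", "view_certificate"),
    ("imp5", None),
]

impressions = len(events)
clicks = Counter(elem for _, elem in events if elem is not None)

def clickthrough_rate(element: str) -> float:
    """Fraction of warning impressions in which `element` was clicked."""
    return clicks[element] / impressions

print(f"help_me_understand: {clickthrough_rate('help_me_understand'):.1%}")
print(f"view_certificate:   {clickthrough_rate('view_certificate'):.1%}")
```

With five impressions and one click each, both rates come out to 20% here; the study’s far lower real-world figures (1.6%, 0%, 3%) underline how rarely users seek out the extra detail.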