Statistical Analysis of End-User Comfort & Trust With Using SSL/TLS Client Implementations

Mark Stanislav, Leslynn Terry, & Jeff Tang

In the 1960s, the Advanced Research Projects Agency developed the capability for scientists and researchers to communicate and collaborate over a private digital network known as ARPANET. That network formed the foundation of what we now know as the Internet: a series of highly interconnected network nodes transferring information across the globe, enabling friends and family to connect with one another and businesses and clients to engage in transactions. However, a number of issues surround the security and privacy of digital communications. With the introduction of online banking and social networks, people are becoming increasingly vulnerable to involuntary exposure of personal information and subsequent identity theft. Embarrassing photos, medical records, secret recipes, proprietary algorithms, money: there is no limit to what the Internet can store. Unfortunately, there is no magical solution that protects our personal information and combats cybercrime; the only way to mitigate the problem is through information security awareness and training. In this study, we examine the relationship between an individual's comfort level with using the Internet and their Internet security practices. Furthermore, this study explores the extent to which gender moderates this relationship. It is increasingly important for employers and software developers alike to understand their end-users' information security awareness and practices so they can tailor training and development to supplement the security of their users. The implementation of the Secure Sockets Layer (SSL) and, more recently, Transport Layer Security (TLS) protocols within end-user applications has been approached in a myriad of ways by the different perspectives of information technology. "In general, a connection's properties are indicated by several features of a browser's
user interface, e.g., the padlock icon in the status bar or the certificate dialog (Adelsbach, Gajek, & Schwenk, 2005)". While many users have become familiar over the years with this de-facto security notation, it is far less obvious what the underlying components of an SSL/TLS connection are actually doing to protect a user's information in transit. Beyond understanding how the web browser protects them, not all users appreciate the difference between perceived security and actual security. For instance, visual spoofing attacks "…imitate certain parts of the browser's user interface, pretending that users communicate securely with the desired service, while actually communicating with the attacker (Adelsbach, et al., 2005)". The usage of SSL (and subsequently TLS) has evolved over a long period, with many modern uses of browsers (such as day-to-day e-commerce) not yet fully considered. In many ways, our relatively esoteric methods of making users aware of potential danger or safety haven't evolved past their original incarnation. Created in August 1994 by Netscape (Thomas, 2000), SSL has changed remarkably little in nearly 20 years, considering how much more e-commerce and other critical Internet usage rely on it now than at the time of its creation, when it was more of an 'extra'. It wasn't until 1996 that the Internet Engineering Task Force (IETF) took over SSL in the sense of creating a standard that could be regulated and adhered to (Thomas, 2000). In research by Qi, Tang, Wang, and Wu, potential methods to trick a user into believing they were talking securely over SSL/TLS to a banking web site were discussed. "To cheat the careful users, lock.html provides the security lock icon, as well as its actions. When the user clicks on the security lock icon, the response action will display the certificate information of the bank server as the genuine browser does (Qi,
Tang, Wang, & Wu, 2009)". This sort of involved attack, targeting both the human element of "if there is a lock icon, I can trust this web site" and technologically forcing onto a web page an icon that provides the hook for user acceptance, is becoming more common and more effective because of years of pre-conditioning of end-user awareness. Further, in situations where users are told they are 'securely' interacting with a web site, they may not actually see any icon at all, because the data is transmitted from an insecure (non-SSL/TLS) page to a secure one without their direct knowledge. This creates confusion when trying to easily determine the 'safety' of the data they are submitting somewhere. While the lock icon and the SSL/TLS protocols themselves are quite effective when used properly, the World Wide Web Consortium (W3C) points out that while the data in transit is safe from prying eyes, that is not a statement about the total security of the process. "In particular, validated and augmented assurance certificates make guarantees about some level of owner identity verification having been performed (see definitions) but they do not represent any guarantees that a site is operated in a safe manner, or is not otherwise subject to attack (W3C, 2010)." The issues of security with SSL/TLS aren't just about data security, but about being able to trust who you are communicating with as well. The ability to trick users into submitting data can go even farther than slight browser tricks. A noted information security researcher named Moxie Marlinspike presented in 2009 at the BlackHat D.C. security conference, showing a new tool he had created called 'SSL Strip' (Marlinspike, 2009). The purpose of the tool is to alter 'insecure' web pages before they arrive at the end-user (a man-in-the-middle attack) and change all
instances of 'https' to 'http'. Thus, when a user submits a form that should have been secure, they instead transmit the data in clear text (unencrypted), allowing the attacker to see the login credentials or other sensitive information cross the network. These sorts of security vulnerabilities go beyond crafty attacks and straight to the heart of the implementation of the protocols themselves. As noted in an ISSA Journal article from 2011, "For example, an unpatched server allowing weak algorithms to be negotiated within the handshake process will result in a vulnerable scenario although mostly unnoticed from client's perspective (Carvalho, 2011)." This is a good example of how end-users trust their browsers to protect and warn them when something seems dangerous; in reality, a browser can only work within the bounds of what the web server's own security can provide to an end-user for protection. An important consideration to keep at the forefront of information security implementation is the end-user's ability to leverage technology in order to decide to be more secure. In the conclusion of a publication regarding user interaction design for secure systems, the author states, "I have argued that consideration of human factors is essential for security, and that security and usability do not have to be in conflict (Yee, 2002)." It's important that we don't fight the user in our efforts to provide them with security. If we obscure the underlying secure mechanism to make it more palatable, we can find ourselves designing a system that provides no assurances and allows for easier exploitation (as in the Marlinspike example mentioned previously). It's the give-and-take between technology implementation and human convenience that provides true information security. "But the browser recognizes a site based on system properties, e.g., whether the site has an SSL certificate, when and where this site
registered, etc. As a result, neither the computer system nor the human user alone can effectively prevent phishing attacks (Wu, Miller, & Little, 2006)." Without harmony between the implementation of SSL/TLS, a user's ability to understand the mechanisms that provide them security, and an ability to spot when situations are dangerous, little is achieved except superficial benefit. To further highlight the disparity in usable security design for the general population, a paper regarding an implementation of cryptography (in a different context than a web browser, but still relevant) utilizing PGP (Pretty Good Privacy) provides insight into end-user experience. "Computer security management often involves security policies, which are systems of abstract rules for deciding whether to grant access to resources. The creation and management of such rules is an activity that programmers take for granted, but that may be alien and unintuitive to many members of the wider user population (Whitten & Tygar, 2005)." Much like a web browser, the PGP interface leverages a graphical representation of security. "The metaphor of keys is built into cryptologic terminology, and PGP's user interface relies heavily on graphical depictions of keys and locks (Whitten & Tygar, 2005)." The thread here is that end-user design is consistent across many types of security-related applications, but an important difference exists between contexts. In the case of PGP, data is stored securely and receiving parties must be given it explicitly. With a web session utilizing SSL/TLS, browsers and web servers interact constantly without an end-user necessarily authorizing each data transmission. Conflating the two types of data security and their associated transmission properties can muddy the waters as to the reliability of any given technology that uses these familiar icons.


Perhaps more concerning than whether or not an end-user trusts a lock icon is the plethora of browser warning messages, which range from the frantic "This web site is not trusted." to the ambiguous "This web certificate is invalid." In all cases of SSL/TLS browser errors, a user is left to decide what they will accept as 'safe' and 'not safe'. In leaving non-technical users to decide how far to take their own security, a primary concern is that they will choose convenience over proactive caution. "The pervasive nature of SSL errors raises questions about the efficacy of SSL warnings. A survey of 297,574 SSL-enabled websites queried in January 2007 found 62% of the websites had certificates that would trigger browser warnings (Sunshine, Egelman, Almuhimedi, Atri, & Cranor, 2009)". Considering the breadth of web sites that responded with some form of SSL/TLS warning, it's no surprise that we've become somewhat desensitized to these messages altogether. Only technical users could plausibly discern whether one SSL/TLS warning message was more or less 'concerning' than another, since that requires assessing the context in which the warning is received. "We found that while the experts were more likely to identify the warnings than non-experts, even in the best case, the experts were only able to correctly define the expired certificate warnings an average of 52% of the time, the unknown CA warnings 55% of the time, and the domain mismatch warnings 56% of the time (Sunshine, et al., 2009)." This is a clear example that understanding one side of the equation does not mean an end-user will be able to interpret the other. A study conducted at CMU with regard to mechanisms that draw attention to warnings in browsers noted that, "Our findings support the notion that users do not necessarily believe or even read warnings and messages they come across while
performing a task; rather, they use their past experience and knowledge (in this case, their knowledge of the warning's layout and how to overcome it) to complete their tasks (Sotirakopoulos, Hawkey, & Beznosov, 2011)." User-interface designers thus face the perplexing task of undoing years of 'mis-training' in which a browser warning is simply a hoop to jump through on the way to what you want. Instead, warnings should inform in such a way that a user expects not to visit the web site unless they have additional knowledge of why it is safe; users should assume the worst, not the best. Perhaps providing awareness through disclosure of security incidents is more likely to adjust human behavior than training built on fear. "Only 2 participants stopped at the bank site but proceeded at Hotmail; an equal number of them proceeded at the bank site because they felt that it was trustworthy, whereas they did not continue with the Hotmail task, later explaining that they had heard of security incidents at Hotmail (Sotirakopoulos, et al., 2011)." While spreading awareness through popular media can be dangerous because of its propensity to sensationalize, if it's an effective way to 'stir the pot' it may be worth provoking. Current trends toward greater user awareness have brought increased visibility to 'proper' SSL/TLS implementation, emphasizing the good rather than the bad (warning messages, scary red boxes). Extended Validation (EV) SSL certificates provide an additional visual cue in some browsers: the address bar turns a friendly and reassuring shade of green (Mangiafico, 2010). Each owner of an EV certificate must be vetted through a process before being issued a certificate, so not only do you have assurance of security (in that the communication is encrypted), you also receive the additional assurance that the person you are
speaking with is reputable as well. Even so, adoption has been slow: only 20% of the top 100 web retailers use an EV certificate (Mangiafico, 2010). Despite this slow curve toward a new form of assurance, the sensibility seems headed in the right direction: tell users when they are doing well, which requires only one message ("you're okay"), rather than making them parse globs of scary and ambiguous warnings. We expect a user's comfort with utilizing the Internet to have a positive impact on their Internet security practices. We hypothesize that if a user is comfortable using Internet technology, they are capable of employing good security practices. Additionally, we believe gender does not have any significant effect on good security practices. To perform this study, we constructed a survey and submitted it to our peers to evaluate for content validity. We asked for volunteers to take the survey online via SurveyMonkey.com, an online service which specializes in hosting surveys. Participants were told that the survey was completely voluntary and that the data would be kept anonymous and confidential. Afterwards, we coded the data for input into IBM's SPSS software package to perform the tests for statistical significance. Q2 through Q5 are constructed to measure a user's positive comfort level when utilizing the Internet. Q6 is a measurement of anxiety or discomfort when utilizing the Internet for personal information. These questions were measured on a 5-anchor scale (Strongly disagree, Disagree, Neither agree nor disagree, Agree, Strongly agree). We calculate a user's total comfort level (Internet_comfort) as: Internet_comfort = Comfort_privacy + Comfort_security + Comfort_banking + Comfort_warnings - Afraid_theft. Q7 and Q10 measure a user's behavior when it comes to good security practices. Q8 tests a user's understanding of HTTPS: even though a hyperlink may
contain 'https' in it does not make it trustworthy or safe. Q9 is a measurement of a user's reckless behavior when utilizing the Internet. The behavior items were measured on a 4-anchor scale (Never, Sometimes, Often, Always). We define a user's good security practice (Good_practice) as: Good_practice = Behavior_cert + Behavior_password - Behavior_HTTPS - Behavior_clickthru. Once the data was gathered, we measured Cronbach's alpha for internal consistency among the responses. We found moderate consistency, with alpha levels of 0.618, 0.562, and 0.306; the weakest value, 0.306, was for bad security practices.

Cronbach’s Alpha (Left to Right): Internet Comfort, Good Security Practice, Bad Security Practice
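The internal-consistency check can be reproduced with a short script. The sketch below is illustrative only: the Likert responses are invented, not our survey data. It implements the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    `items` is a list of columns, one list of numeric responses per item:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
    """
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]      # per-respondent total score
    item_var = sum(pvariance(col) for col in items)   # sum of per-item variances
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Hypothetical 5-point Likert responses (1 = Strongly disagree ... 5 = Strongly agree)
comfort_items = [
    [4, 5, 3, 4, 2, 5],   # Comfort_privacy
    [4, 4, 3, 5, 2, 4],   # Comfort_security
    [5, 4, 2, 4, 1, 5],   # Comfort_banking
]
print(round(cronbach_alpha(comfort_items), 3))  # → 0.917
```

An alpha near 0.9 indicates items measuring one construct consistently; our observed values (0.618, 0.562, 0.306) fall well below that, which is why we call for an improved questionnaire.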

The Pearson correlation was calculated for Internet_comfort and Good_practice; we expected a positive correlation between the two variables. We found only a weak correlation, r = 0.172: Internet comfort explains only 2.9% of the variance in good security practices. Moreover, this correlation did not meet the p < 0.05 criterion for significance.


Figure: Pearson Correlation for Internet Comfort and Good Security Practices
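The correlation statistic itself is straightforward to compute. The pairs below are made up for illustration, not our dataset; with r in hand, the percentage of variance explained is simply r² × 100 (so the observed r = 0.172 gives about 2.9%).

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical (Internet_comfort, Good_practice) pairs, not our survey data
comfort  = [12, 15, 9, 18, 14, 11, 16]
practice = [4, 6, 3, 7, 5, 5, 6]
r = pearson_r(comfort, practice)
print(round(r, 3), round(r * r * 100, 1))  # r and % variance explained → 0.936 87.6
```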

We divided the dataset by gender and repeated the calculation to determine whether there were any moderator effects due to gender, and found that gender did have an impact: the Pearson correlation was significant for females (r = 0.363; p = 0.03 < 0.05) but not for males.

Figure: Pearson Correlation for Males

Figure: Pearson Correlation for Females

This is also reinforced by a t-test on Internet_comfort and Good_practice grouped by gender. H_0: There is no difference between males and females when it comes to good Internet security practices. H_1: There is a difference between males and females when it comes to good Internet security practices. We obtained significance levels of p = 0.045 and p = 0.059, both close to the p < 0.05
threshold, warranting further study.

Figure: t-test for Good Practices and Internet Comfort by Gender
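The group comparison can be sketched with a pooled-variance two-sample t statistic (the equal-variances form SPSS reports alongside Levene's test). The scores below are hypothetical, not our survey data, and converting t to a p-value requires a t-distribution table or a statistics library, which this stdlib-only sketch omits.

```python
from math import sqrt
from statistics import mean, variance  # variance() is the sample (n-1) variance

def two_sample_t(a, b):
    """Student's two-sample t statistic with pooled variance (equal-variance form)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic and degrees of freedom

# Hypothetical Good_practice scores grouped by gender (not our survey data)
female = [5, 6, 4, 7, 6, 5]
male   = [4, 3, 5, 4, 3, 4]
t, df = two_sample_t(female, male)
print(round(t, 3), df)  # → 3.162 10
```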

Our data show that, as a whole, there is no correlation between a user's comfort level utilizing the Internet and his/her security practices. However, gender plays a significant role in moderating this relationship: females show a significant correlation between the two variables while males do not. This suggests that familiarizing users with the Internet to increase their comfort levels does have an effect on their security practices, and may be a worthwhile opportunity for companies to explore in order to safeguard their networks and the intellectual property or trade secrets held within. More studies are needed, along with research into constructing an improved questionnaire with a stronger measure of internal consistency.


Questionnaire and Data Key

Q1 - What gender are you? (Gender)

Q2 - I feel comfortable about my privacy when sending personal information via e-mail to another person or company. (Comfort_privacy)
Strongly Disagree / Disagree / Neither agree nor disagree / Agree / Strongly Agree

Q3 - I am comfortable in the security of my transactions on Internet web sites. (Comfort_security)
Strongly Disagree / Disagree / Neither agree nor disagree / Agree / Strongly Agree

Q4 - I feel safe in using the Internet for online banking activities. (Comfort_banking)
Strongly Disagree / Disagree / Neither agree nor disagree / Agree / Strongly Agree

Q5 - I feel comfortable with understanding warning messages my browser gives me with regard to SSL/TLS functionality. (Comfort_warnings)
Strongly Disagree / Disagree / Neither agree nor disagree / Agree / Strongly Agree

Q6 - I am afraid of getting my identity stolen through online financial transactions. (Afraid_theft)
Strongly Disagree / Disagree / Neither agree nor disagree / Agree / Strongly Agree

Q7 - I only use web sites that have a valid SSL certificate and show no warnings when visiting. (Behavior_cert)
Never / Sometimes / Often / Always

Q8 - I click on links as long as they have an 'https' prefix attached to them. (Behavior_HTTPS)
Never / Sometimes / Often / Always

Q9 - I click through all warnings and popups as long as I get to the content/data I want. (Behavior_clickthru)
Never / Sometimes / Often / Always

Q10 - I check to see that the website is protected by SSL/TLS before entering my password. (Behavior_password)
Never / Sometimes / Often / Always
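Q8's premise, that the string 'https' appearing somewhere in a link does not make the connection secure, can be illustrated with standard URL parsing: only the URL's scheme determines whether SSL/TLS is used. The URLs below are hypothetical examples.

```python
from urllib.parse import urlparse

# Only the scheme makes a request HTTPS; 'https' elsewhere in the URL is meaningless.
links = [
    "https://bank.example.com/login",                      # genuinely HTTPS
    "http://evil.example.com/https/bank.example.com",      # 'https' only in the path
    "http://example.com/?goto=https://bank.example.com",   # 'https' only in a parameter
]
for url in links:
    scheme = urlparse(url).scheme
    print(("HTTPS " if scheme == "https" else "PLAIN ") + url)
# Only the first link is reported as HTTPS.
```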


References

Adelsbach, A., Gajek, S., & Schwenk, J. (2005). Visual spoofing of SSL protected web sites and effective countermeasures. Information Security Practice and … Retrieved from http://www.springerlink.com/index/VYW5963153FGGV9H.pdf

Carvalho, M. (2011). SSL/TLS revisited. ISSA Journal. Retrieved from http://www.issa.org/Library/Journals/2011/August/Carvalho-SSL-TLS%20Revisited.pdf

Mangiafico, R. (2010). Extended Validation EV SSL certificates – should your website have one? LexiConn. Retrieved September 2011, from http://www.lexiconn.com/blog/2010/01/extended-validation-ev-ssl-certificatesshould-your-website-have-one/

Marlinspike, M. (2009). New tricks for defeating SSL in practice. BlackHat DC. Retrieved from http://blackhat.com/presentations/bh-dc-09/Marlinspike/BlackHat-DC-09-Marlinspike-Defeating-SSL.pdf

Qi, F., Tang, Z., & Wang, G. (2009). SSL-enabled trusted communication: Spoofing and protecting the non-cautious users. Security and Communication …, 4(4), 372–383. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/sec.159/pdf

Sotirakopoulos, A., & Hawkey, K. (2011). On the challenges in usable security lab studies: Lessons learned from replicating a study on SSL warnings. Symposium on Usable …. Retrieved from http://cups.cs.cmu.edu/soups/2011/proceedings/a3_Sotirakopoulos.pdf

Sunshine, J., Egelman, S., Almuhimedi, H., Atri, N., & Cranor, L. F. (2009). Crying wolf: An empirical study of SSL warning effectiveness. In SSYM'09: Proceedings of the 18th conference on USENIX security symposium. Montreal, Canada: USENIX Association. Retrieved from http://lorrie.cranor.org/pubs/sslwarnings.pdf

Thomas, S. A. (2000). SSL & TLS essentials: Securing the Web (p. 197). John Wiley & Sons Inc. Retrieved from http://library.books24x7.com.ezproxy.emich.edu/toc.aspx?site=QQLIX&bookid=2298

W3C. (2010). Web security context: User interface guidelines. Retrieved September 2011, from http://www.w3.org/TR/wsc-ui/

Whitten, A., & Tygar, J. D. (1999). Why Johnny can't encrypt. USENIX Security. Retrieved from http://www.doug-tygar.com/papers/Why_Johnny_Cant_Encrypt/OReilly.pdf

Wu, M., Miller, R. C., & Little, G. (2006). Web Wallet: Preventing phishing attacks by revealing user intentions. In SOUPS '06: Proceedings of the second symposium on Usable privacy and security. ACM. doi:10.1145/1143120.1143133. Retrieved from http://cups.cs.cmu.edu/soups/2006/proceedings/p102_wu.pdf

Yee, K.-P. (2002). User interaction design for secure systems. In R. Deng, F. Bao, J. Zhou, & S. Qing (Eds.), Information and Communications Security, Lecture Notes in Computer Science (Vol. 2513, pp. 278–290). doi:10.1007/3-540-36159-6_24. Retrieved from http://www.springerlink.com/index/10.1007/3-540-36159-6_24