The SSL Performance Trade-off and Web 2.0 Security

July 10th, 2007

Everyone knows about the sharp trade-off involved in using SSL: you get the security of an encrypted connection, but you pay for it with a significant performance hit. Servers work much harder, and pages load much slower; SSL processing consumes about 70% of HTTPS transaction time (Zhao, Iyer, Makineni and Bhuyan, “Anatomy and Performance of SSL Processing” [PDF]). It’s no accident that SSL offloading and acceleration are big business.

But the practical effect of this trade-off is less often noted, even though anyone who uses e-commerce-enabled Web sites runs across it almost every day. Put it this way: how many sites do you know that require SSL for all requests? Compare that to the number of sites that use SSL for just a portion of the site, usually a section requiring authentication, from within which financial transactions can be made. You almost certainly come across many more sites of the latter type than of the former. It’s also quite likely that a site with an SSL-protected section allows you to navigate back into the unprotected area.

All this heterogeneity is obviously a spontaneous adaptation to the SSL performance trade-off. In general, site owners don’t want to make users wait (and load servers up with extra work) unless the security risk is so obvious and immediate to the end user that they won’t use the app (or buy the goods) without the security reassurances that the use of SSL provides. But of course the unprotected areas of a site are an invitation to mischief, especially given the rise of such Web 2.0 exploits as Cross-Site Request Forgery and its variant, JavaScript Hijacking [PDF].

Indeed, purely from the point of view of discouraging the kind of malicious reconnaissance that can be the prelude to a serious attack, you might think that a better adaptation to the SSL performance trade-off would be to require all application code and data to be sent over HTTPS, while allowing images and other static content to travel the cheaper road of HTTP. But of course that won’t fly, since the browser would immediately bark at the user to warn of the mixing of secure and unsecured content on the page. And this is not exactly a warning that would be wise to ignore!
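
At bottom, the browser’s mixed-content check amounts to scanning an HTTPS page for subresources fetched over plain HTTP. Here is a rough sketch of that logic in Python; it is simplified (real browser rules distinguish many more resource types), and the function names are mine, not any browser’s:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class MixedContentScanner(HTMLParser):
    """Collect subresource URLs that a browser would flag as insecure
    when the enclosing page was served over HTTPS. Simplified: only
    src attributes and <link href> count as subresources here."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        url = d.get("src") or (d.get("href") if tag == "link" else None)
        if url and urlparse(url).scheme == "http":
            self.insecure.append(url)

def find_mixed_content(page_url, html):
    # Only a page that is itself served over HTTPS can "mix" content
    if urlparse(page_url).scheme != "https":
        return []
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure
```

Note that a plain `<a href="http://...">` link does not trigger the warning; only embedded resources (scripts, images, stylesheets) do, which is exactly why the HTTPS-for-code, HTTP-for-images split runs into trouble.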

Maybe what is needed here are more granular solutions — ones that manage the trade-offs better by zeroing in on the new types of vulnerabilities that risk being exposed when transactional sites and apps are not fully encrypted. An example of this type of approach is Francesco Sullo’s Ajax Secure Service Layer (aSSL) library. While Sullo readily admits that aSSL cannot provide the authentication that HTTPS can (there is no PKI component in the mix), he rightly points out that for non-commercial Web 2.0 apps such as chats and blogs, it does provide confidentiality for data exchanged with the app (e.g., user passwords). The interesting thing from the point of view of the SSL performance trade-off is that this is accomplished with a very selective use of SSL encryption, avoiding the vast overhead of a full HTTPS transaction.
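
To make the idea of selective encryption concrete: the point is to encrypt only the sensitive fields of a request, leaving the rest in the clear. The toy sketch below is emphatically not aSSL’s actual protocol (aSSL negotiates a key and runs its own cipher in JavaScript); it uses a throwaway SHA-256-based keystream purely for illustration, and should not be mistaken for production cryptography:

```python
import hashlib
from itertools import count

def _keystream(key: bytes, nonce: bytes):
    """Toy keystream: SHA-256 in counter mode. Illustration only --
    a real deployment would use a vetted cipher, not this."""
    for i in count():
        yield from hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data against the keystream; applying it twice decrypts."""
    ks = _keystream(key, nonce)
    return bytes(b ^ next(ks) for b in data)

def protect_payload(key: bytes, nonce: bytes, payload: dict,
                    secret_fields: set) -> dict:
    """Encrypt only the named fields, leaving everything else readable:
    the selective-confidentiality idea, without a full HTTPS session."""
    out = {}
    for name, value in payload.items():
        if name in secret_fields:
            out[name] = xor_cipher(key, nonce, value.encode()).hex()
        else:
            out[name] = value
    return out
```

Only the password travels obscured; the rest of the payload, and all the site’s static content, pays no encryption cost at all.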

There is obviously much more work to do in this field, and SSL may not always be part of the solution at all (techniques such as session-ized URLs and double-submitted cookies are increasingly recommended as means of providing granular Web 2.0 security in the face of CSRF and related exploits). But when it is, it’s worth doing some hard thinking about how at least some of its benefits can be had without forcing the full SSL performance trade-off.
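
Of those defenses, the double-submitted cookie is the easiest to sketch: the server issues a random token as both a cookie and a hidden form field. A cross-site attacker can make the victim’s browser send the cookie, but cannot read it, and so cannot supply the matching field. A minimal sketch (the function names are mine, not from any particular framework):

```python
import hmac
import secrets

def issue_csrf_token() -> str:
    """One random token per session: the server sets it as a cookie
    AND embeds it as a hidden field in every form it renders."""
    return secrets.token_hex(16)

def is_valid_request(cookie_token: str, form_token: str) -> bool:
    """A forged cross-site request carries the cookie (browsers attach
    it automatically) but cannot know the form field, so the two
    values match only for requests from the site's own pages."""
    if not cookie_token or not form_token:
        return False
    # constant-time comparison avoids leaking token bytes via timing
    return hmac.compare_digest(cookie_token, form_token)
```

Note that none of this requires encryption at all, which is precisely why it sidesteps the performance trade-off (though without SSL the token is still exposed to anyone who can sniff the wire).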

[photo under CC from flickr user Marshall Astor]
