Websites could use a security feature of your iPhone/iPad to track your browsing even if you clear the browser history.
It’s well-established advice to make sure you’re using a secure connection whenever you visit a website where you might share sensitive or personal information. When you’re visiting a website over a secure connection your web browser displays a padlock icon. The icon indicates that your connection to the site is encrypted and can’t be intercepted or tampered with.
Demonstration
…
This is a unique value that was generated by JavaScript on this page. The page attempts to store this value in your web browser and read it again when you visit the page in the future.
Different web browsers don’t behave in exactly the same way. To see how your browser behaves, try these tests and check whether the value stays the same:
- Refresh the page.
- Open the same web address in a “private”/”incognito” window.
- Clear your browser cookies and refresh the page.
- Visit the page on a different iOS device, synced with the same iCloud account.
If the value stays the same across any of these steps, this technique could be used to track your browsing habits.
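As a rough illustration only (this is not the demo page’s actual code), the value above could be produced by a few lines of JavaScript; how it survives the steps listed above depends on the HSTS behavior described in the next section:

```javascript
// A hypothetical stand-in for the demonstration value: generate a random
// identifier and show it as hex so it is easy to compare across refreshes,
// private windows and other devices. The 32-bit width is an assumption.
function generateDemoValue(bits = 32) {
  const id = Math.floor(Math.random() * 2 ** bits);
  return id.toString(16).padStart(bits / 4, "0");
}

console.log(generateDemoValue()); // e.g. "9f3a07c2" (your value will differ)
```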
A security feature of modern web browsers called “HTTP Strict Transport Security” (HSTS) allows a website to indicate that it should always be accessed using a secure connection. If you visit a site that has HSTS enabled, your web browser will remember this flag and ensure the connection is secure any time you visit the website in the future. Subsequent visits to the site without using a secure connection are automatically redirected by the web browser to the secure variant of the web address, beginning with https://.
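For reference, a site enables HSTS by sending a response header over a secure connection; the exact max-age below (one year) is an illustrative value, not one taken from this article:

```
Strict-Transport-Security: max-age=31536000
```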
This automatic redirecting protects your access to the site from being intercepted, but it could also be abused by a malicious site to store a unique number used to track your web browser. A number can be encoded as a series of bits (true and false values) and stored by accessing a set of web addresses. Each web address responds with HSTS either enabled or disabled, depending on the address. Once the number is stored it could be read by other sites in the future. Reading the number just requires testing whether requests for the same web addresses are redirected or not.
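The following sketch shows how such a scheme might look in JavaScript. It is an illustration of the general approach, not the demonstration code used on this page: the subdomains, the pixel.png endpoint, the 32-bit width and the image-based detection are all assumptions, and a real tracker would need a server that sends the HSTS header for each “1” bit over HTTPS while failing requests made over plain HTTP.

```javascript
// A minimal sketch of an HSTS "super cookie", assuming one controlled
// subdomain per bit (bit0.example.com, bit1.example.com, ...). All names
// and endpoints here are hypothetical.
const BITS = 32;
const DOMAIN = "example.com"; // hypothetical tracking domain

// Write one bit: load any resource from the bit's subdomain over https://
// so the server's Strict-Transport-Security response header gets recorded.
function setBit(i) {
  new Image().src = `https://bit${i}.${DOMAIN}/pixel.png`;
}

// Read one bit: request the same subdomain over plain http://. If an HSTS
// flag was stored, the browser silently upgrades the request to https:// and
// the image loads; otherwise the plain-http request is made to fail.
function probeBit(i) {
  return new Promise((resolve) => {
    const img = new Image();
    img.onload = () => resolve(1);  // upgraded to HTTPS: flag present
    img.onerror = () => resolve(0); // stayed on HTTP: flag absent
    img.src = `http://bit${i}.${DOMAIN}/pixel.png`;
  });
}

// Store an identifier as HSTS flags: only the subdomains for "1" bits are visited.
function storeId(id) {
  for (let i = 0; i < BITS; i++) {
    if ((id >>> i) & 1) setBit(i);
  }
}

// Read the identifier back by probing every bit, from any page embedding this script.
async function readId() {
  const bits = await Promise.all(
    Array.from({ length: BITS }, (_, i) => probeBit(i))
  );
  return bits.reduce((id, bit, i) => id | (bit << i), 0) >>> 0;
}
```

Detecting the redirect via an image that only loads over HTTPS is just one possible read mechanism; the essential point is that each bit can be recovered without setting any cookie.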
Using HSTS to track your browsing habits evades the features of web browsers designed to control more conventional “cookie”-based tracking mechanisms. Using “incognito” or “private” modes means that existing cookies won’t be shared with sites you visit. Browsers also let you entirely delete cookies that could be used to track you.
Because HSTS is a security feature and isn’t intended to be used for tracking, web browsers treat it differently from cookies. It is only by intentional misapplication that HSTS can be exploited to track users.
Some browsers, such as Google Chrome, Firefox and Opera, do mitigate the issue. Erasing cookies in these browsers also erases HSTS flags, so any stored value will be cleared. However, unlike cookies, existing HSTS flags are still shared with sites when using “incognito” or “private” windows. The impact is that it’s possible for a site to track you even if you choose to use “incognito” or “private” browsing features in an effort to avoid such tracking.
Considerably more worrying is the behavior displayed by Safari, the default browser for iPad and iPhone. When using Safari on an Apple device, there appears to be no way for the user to clear HSTS flags. HSTS flags are even synced with the iCloud service, so they will be restored if the device is wiped. In this case the device can effectively be “branded” with an indelible tracking value that you have no way of removing.
A notable exception is Internet Explorer, which has no support for HSTS (although support is in development at the time of writing), so it’s not vulnerable to this technique.
I initially thought this wasn’t an avenue that had previously been explored; however, Mikhail Davidov wrote about the potential misuse of HSTS in April 2012, although I’m not aware of any direct response to his observations from browser vendors.
I’m not aware of the technique being used to track users in the wild, although that is not to say it isn’t. I’d like to reach out to the rest of the technical community to consider how this might be mitigated while still deriving as much value from HSTS as possible. If you would like to get in touch regarding this information please email me at [email protected].
Update
Members of the Google Chrome security team have been kind enough to highlight the discussions that have led to a number of patches and reverts to the Chromium code base in attempts to mitigate the effects of the problem this article demonstrates. You can read more here and here.
Ultimately they conclude that there is a necessary trade-off between security and privacy. The Chromium security FAQ goes on to say that “defeating such fingerprinting is likely not practical without fundamental changes to how the Web works”.
The “Technical analysis of client identification mechanisms” document discusses the possibility of this technique, along with many others, saying: “In an attempt to balance security and privacy, any HSTS pins set during normal browsing are carried over to the incognito mode in Chrome; there is no propagation in the opposite direction, however.”
This article has been published in its entirety with the permission of Sam Greenhalgh. You can also read the original article on Radical Research.