Let there be ambient light sensing, without data theft • The Register


Six years after web security and privacy concerns surfaced over ambient light sensors in mobile phones and laptops, browser boffins have finally put some defenses in place.

The W3C, everyone’s favorite web standards body, began formulating an Ambient Light Events API specification in 2012 to define how web browsers should handle ambient light sensor (ALS) data and events. Section 4 of the draft spec, “Security and Privacy Considerations”, was blank. It was a more carefree time.

In 2015, the specification evolved to acknowledge that ALS data could enable data correlation and device fingerprinting, to the detriment of individual privacy, and it suggested that browser makers consider event rate throttling as a potential mitigation.

In 2016, it became clear that allowing web code to interact with device light sensors posed privacy and security risks beyond fingerprints. Independent privacy researcher and consultant Dr. Lukasz Olejnik explored the possibilities in a 2016 blog post.

Olejnik cited a number of ways ambient light sensor readings could be abused, including data leakage, profiling, behavioral analysis and various forms of communication between devices.

He described a few proof-of-concept attacks, designed with the help of security researcher Artur Janc, in a 2017 article, and delved into more detail in a 2020 paper [PDF].

“The attack we designed was a conceptually very simple side-channel leak, taking advantage of the reflective optical properties of human skin,” Olejnik explained in his paper.

“Skin reflection is only 4-7% of the light emitted, but modern display screens emit light with significant luminance. We exploited these natural facts to design an attack that reasoned about the content of the website via information encoded in the light level, transmitted through the user’s skin, and returned to the browsing context via the light sensor readings.”
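The channel Olejnik describes can be sketched in a few lines of JavaScript. This is an illustrative reconstruction, not his actual exploit code: an attacker page flips its own background between dark and bright to encode one bit per time slot, and the decoder thresholds the lux values the sensor reports as that light reflects off the user's skin. The function name and thresholds are assumptions for the sake of the example.

```javascript
// Turn a series of lux readings (one reading per time slot) into bits by
// thresholding against the midpoint of the observed range. Bright screen
// slots reflect a few extra lux off the skin into the sensor.
function decodeBits(luxReadings) {
  const min = Math.min(...luxReadings);
  const max = Math.max(...luxReadings);
  const threshold = (min + max) / 2;
  return luxReadings.map(lux => (lux > threshold ? 1 : 0));
}

// In a browser exposing the (flagged) sensor API, the readings would be
// collected roughly like this:
//   const sensor = new AmbientLightSensor();
//   sensor.addEventListener('reading', () => samples.push(sensor.illuminance));
//   sensor.start();

// Simulated samples: dark, bright, bright, dark, bright.
console.log(decodeBits([12, 48, 47, 13, 50]));  // → [0, 1, 1, 0, 1]
```

The bandwidth is tiny, but as Olejnik's proof-of-concepts showed, a few bits per second is enough to leak a browsing history entry or a cross-origin pixel at a time.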

It was this technique that enabled proof-of-concept attacks such as stealing web history, through inferences made from CSS changes, and stealing cross-origin assets, such as images or the contents of iframes.

Snail speed

Browser vendors have responded in a variety of ways. In May 2018, with the release of Firefox 60, Mozilla moved access to the W3C proximity and ambient light APIs behind flags, and enforced other limitations in later versions of Firefox.

Apple simply refused to implement the API in WebKit, along with a number of other features, and Apple and Mozilla currently oppose a generic sensor API proposal.

Google took what Olejnik described in his post as a “more nuanced” approach, limiting the accuracy of sensor data.

But those working on the W3C specification and the browsers implementing the specification have recognized that such privacy protections should be formalized, to increase the likelihood that the API will be widely adopted and used.

So they voted to make the reduced accuracy of ALS data normative (standard across browsers) and to require a permission, as with camera access, as part of the ALS specification.
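The accuracy reduction amounts to rounding raw lux values to a coarse step, so fine-grained fluctuations (like a screen flipping colors) no longer register. A minimal sketch, with an illustrative 50-lux step rather than any value mandated by the spec:

```javascript
// Round a raw illuminance reading to a coarse step before exposing it to
// web content. The 50-lux step is an assumption for illustration.
function fuzzIlluminance(lux, step = 50) {
  return Math.round(lux / step) * step;
}

console.log(fuzzIlluminance(12));   // → 0
console.log(fuzzIlluminance(48));   // → 50
console.log(fuzzIlluminance(337));  // → 350
```

At that granularity, the few-lux skin-reflection differences the side-channel attacks relied on are indistinguishable from noise.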

These changes finally landed in the ALS spec this week. As a result, Google and perhaps other browser makers may choose to make the ALS API available by default rather than hiding it behind a flag or ignoring it altogether. ®
