
Tracking By Any Name Is Still A Privacy Concern


The fact that the rate of new infections seems to be slowing in some areas suggests that social distancing and shutting down non-essential businesses may be having an effect. To contain, or at least slow, the spread of the novel coronavirus, governments are also keenly interested in contact tracing: keeping track of all the people who have come into contact with individuals infected with COVID-19. Google and Apple are teaming up on a contact-tracing framework to support that effort, but questions persist about its privacy implications.

The idea is to have mobile devices transmit Bluetooth beacons to nearby devices and listen for incoming signals. When one device owner comes into close proximity with another person carrying a device running the app (short-range Bluetooth reaches roughly 30 feet) for more than a few minutes, both devices record each other’s beacon identifiers. Individuals can self-report that they are infected with COVID-19, and the system is designed so that devices that encountered the infected person’s device during the contagion period can be notified. The beacons and key collection will first be handled by dedicated apps, expected in Apple’s App Store and Google Play in mid-May, and later by features built into Android and iOS.
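As a rough illustration of that flow, the sketch below shows how a device might derive short-lived beacon identifiers from a secret key kept on the phone and record identifiers it hears from nearby devices. This is a minimal Python sketch of the general idea, not the actual Apple/Google specification; the key derivation, the constants, and the names (Device, current_beacon_id, record_encounter) are hypothetical.

```python
# Simplified sketch of rolling Bluetooth beacon identifiers (hypothetical,
# not the Apple/Google protocol as specified).
import hmac
import hashlib
import os
import time

ROTATION_SECONDS = 15 * 60          # identifiers cycle roughly every 15 minutes
CONTACT_THRESHOLD_SECONDS = 5 * 60  # "more than a few minutes" of proximity


class Device:
    def __init__(self):
        # Secret key that stays on the device in normal operation.
        self.tracing_key = os.urandom(32)
        # Identifiers of other devices this phone has observed, stored locally.
        self.observed_ids = set()

    def current_beacon_id(self, now=None):
        """Derive the short-lived identifier broadcast over Bluetooth."""
        now = time.time() if now is None else now
        interval = int(now // ROTATION_SECONDS)
        # An HMAC over the interval number changes every rotation period and
        # cannot be linked back to the device without the tracing key.
        return hmac.new(self.tracing_key, str(interval).encode(),
                        hashlib.sha256).hexdigest()[:32]

    def record_encounter(self, other_beacon_id, duration_seconds):
        """Store an observed identifier if the contact lasted long enough."""
        if duration_seconds >= CONTACT_THRESHOLD_SECONDS:
            self.observed_ids.add(other_beacon_id)
```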

“Privacy, transparency and consent are of utmost importance in this effort, and we look forward to building this functionality in consultation with interested stakeholders,” Google and Apple said in a joint statement.

Both companies stressed the extra privacy precautions, such as making the tool opt-in only and requiring explicit user consent. The decision to use Bluetooth instead of GPS is another: the system records only that two individuals were in close proximity, rather than tracking their physical movements over the course of the day. The apps broadcast an anonymous identifier beacon that is randomly generated and cycled every 15 minutes to make it harder to tie back to the actual device. There is no centralized database with identifying phone information about the people the user has been in contact with. The beacon identifiers of an infected user are transmitted to other phones in the system, and each device checks its own records to see whether any of those identifiers have been saved (meaning there was contact). The identifiers of the devices the user came into contact with never leave the user’s phone.
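The on-device matching step can be sketched in the same spirit: the identifiers published for infected users are downloaded, and each phone intersects them with the identifiers it stored locally, so nothing about the user's own encounters is uploaded. Again, the function name and the download step are hypothetical placeholders, not part of the published framework.

```python
# Continuing the sketch above: a purely on-device exposure check.
def check_exposure(observed_ids, published_infected_ids):
    """Return identifiers this device saw that belong to an infected user.

    observed_ids: identifiers recorded locally from nearby devices
    published_infected_ids: identifiers an infected user chose to publish
    """
    return observed_ids & set(published_infected_ids)


# Hypothetical usage; download_infected_ids() stands in for fetching the
# published list from a server:
# matches = check_exposure(my_phone.observed_ids, download_infected_ids())
# if matches:
#     notify_user_of_possible_exposure()
```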

“An immediate observation is that this protocol is strictly no worse than Bluetooth's privacy in the specific case that you don't ever get COVID and the tracing key stays on your phone,” Matt Tait, a cyber fellow at the University of Texas at Austin, wrote on Twitter.

However, once a person tests positive for COVID-19, that presumption of privacy is gone, since the beacon identifiers then become public so that other devices can download them and check them against their internal records. Moxie Marlinspike, the cryptography expert behind the encrypted-messaging app Signal, said that takes privacy backwards, since existing beacon-tracking technologies (such as those used in advertising) can incorporate these lists into their stacks for their own purposes.

“At that point adtech (at minimum) probably knows who you are, where you've been, and that you are covid+,” Marlinspike said.

The American Civil Liberties Union has expressed concerns about policymakers fixating on using mobile devices to track user locations, and about how that capability can be abused. Location data collected on mobile devices can contain an "enormously invasive and personal set of information about each of us, with the potential to reveal such things as people’s social, sexual, religious, and political associations," the ACLU warned in “The Limits of Location Tracking in an Epidemic.” Any attempt to use mobile devices and location data needs a clear purpose and a clear sense of how precise the location data must be. The kind of data being collected has to match that purpose; tracking overall trends has different data requirements from identifying unknown patients, for example.

“To their credit, Apple and Google have announced an approach that appears to mitigate the worst privacy and centralization risks, but there is still room for improvement,” Jennifer Granick, surveillance and cybersecurity counsel at the ACLU, said in a statement. “We will remain vigilant moving forward to make sure any contact tracing app remains voluntary and decentralized, and used only for public health purposes and only for the duration of this pandemic.”

Both companies stressed that access to the API will be granted only to efforts related to the pandemic. Privacy advocates will be watching carefully to see who is allowed to use the API, and developers will have to be careful not to integrate the beacon data with other information they hold in ways that infringe on user privacy. Finally, attention has to be paid to the data's lifecycle: how long the information will be stored, and how it will be destroyed to prevent abuse or mismanagement.

"Any uses of such data should be temporary, restricted to public health agencies and purposes, and should make the greatest possible use of available techniques that allow for privacy and anonymity to be protected, even as the data is used," Granick wrote with ACLU's Jay Stanley.

For some privacy advocates, making a framework “voluntary” or “opt-in” is not enough protection from abuse, since such tools can become compulsory in practice if normal activity gets tied to using them. For example, policymakers could make using this kind of tool a requirement for being allowed to return to work, Ashkan Soltani, former chief technologist of the Federal Trade Commission and a White House senior advisor during the Obama Administration, wrote on Twitter. Something similar happened in China, where citizens had to show a code in a smartphone app before they could use public transportation, according to a New York Times report.

Bluetooth signals can also pass through walls, so in areas of high population density, such as an apartment building, a user can register as having been in “close proximity” to an infected person even though that person was in a completely separate apartment the entire time.

“This type of approach is likely to generate significant FALSE POSITIVES and FALSE NEGATIVES -- which is highly problematic when this data is (eventually) used to make decisions that will affect citizen's freedoms -- voluntarily or not,” said Soltani.