Paper by Dan Feldman and Eldar Haber: “Data-mining practices have become greatly enhanced in the interconnected era. What began with the internet now continues through the Internet of Things (IoT), whereby users can constantly be connected to the internet through various means like televisions, smartphones, wearables and computerized personal assistants, among other “things.” As many of these devices operate in a so-called “always-on” mode, constantly receiving and transmitting data, the increased use of IoT devices might lead society into an “always-on” era, where individuals are constantly datafied. As the current regulatory approach to privacy is sectoral in nature, i.e., protects privacy only within a specific context of information gathering or use, and is directed only at specific pre-defined industries or a specific cohort, the individual’s privacy is at great risk. On the other hand, strict privacy regulation might negatively impact data utility, which serves many purposes and, perhaps mainly, is crucial for technological development and innovation. The tradeoff between data utility and privacy protection is most unlikely to be resolved under the sectoral approach to privacy, but a technological solution that relies mostly on a method called differential privacy might be of great help. It essentially suggests adding “noise” to data deemed sensitive ex-ante, depending on various parameters further suggested in this Article. In other words, using computational solutions combined with formulas that measure the probability of data sensitivity, privacy could be better protected in the always-on era.
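The abstract's core technical idea, differential privacy, is commonly realized with the Laplace mechanism: a query's true answer is released with noise whose scale is calibrated to the query's sensitivity divided by the privacy budget ε. The sketch below illustrates this for a counting query; the `dp_count` helper, the thermostat readings, and the ε value are illustrative assumptions, not the Article's own model.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a zero-centered Laplace distribution.

    The difference of two i.i.d. exponentials with mean `scale`
    is Laplace-distributed with that scale parameter.
    """
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(values, predicate, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    person's record changes the count by at most 1), so Laplace
    noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: how many smart-thermostat readings exceed 25 °C?
readings = [21.0, 26.5, 25.4, 19.8, 27.2]
noisy = dp_count(readings, lambda t: t > 25.0, epsilon=0.5)
```

Smaller ε means stronger privacy but noisier answers; the "various parameters" the Article mentions would govern how ε is chosen per device and data type.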
This Article introduces legal and computational methods that could be used by IoT service providers to optimally balance the tradeoff between data utility and privacy. It comprises several stages. The first Part discusses the protection of privacy under the sectoral approach and estimates what values are embedded in it. The second Part discusses privacy protection in the “always-on” era. First, it assesses how technological changes have shaped sectoral regulation; it then discusses why privacy is negatively impacted by IoT devices and the potential applicability of new regulatory mechanisms to meet the challenges of the “always-on” era. After concluding that the current regulatory framework is severely limited in protecting individuals’ privacy, the third Part discusses technology as a panacea, while offering a new computational model that relies on differential privacy and a modern technique called private coreset. The proposed model seeks to introduce “noise” to data on the user’s side to preserve individuals’ privacy — depending on the probability of data sensitivity of the IoT device — while enabling service providers to utilize the data…(More)”.
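The proposed model perturbs data on the user's side before it ever reaches the service provider, with the amount of noise tied to an estimated probability that the data is sensitive. The Article's actual private-coreset construction is not reproduced here; the following is only a minimal sketch of the local-perturbation idea, where `p_sensitive`, `base_epsilon`, and the scaling rule are all assumed for illustration.

```python
import random

def laplace_noise(scale: float) -> float:
    # Difference of two i.i.d. exponentials with mean `scale`
    # is a zero-centered Laplace sample with that scale.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def perturb_on_device(value: float, p_sensitive: float,
                      base_epsilon: float = 2.0) -> float:
    """Hypothetical user-side perturbation.

    `p_sensitive` stands in for a device-estimated probability that
    the reading is privacy-sensitive. The higher it is, the smaller
    the effective privacy budget and the larger the injected noise,
    so sensitive readings leave the device heavily masked while
    routine ones stay useful to the service provider.
    """
    epsilon = max(base_epsilon * (1.0 - p_sensitive), 0.01)  # floor avoids division by zero
    return value + laplace_noise(1.0 / epsilon)

# A routine reading gets little noise; a likely-sensitive one gets much more.
routine = perturb_on_device(21.5, p_sensitive=0.05)
sensitive = perturb_on_device(21.5, p_sensitive=0.95)
```

Because the noise is added locally, the provider never observes the raw value — a local-model flavor of differential privacy, which is the property the abstract's "always-on" setting calls for.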