Echizen Laboratory National Institute of Informatics (NII)

 

Department of Information and Communication Engineering,
Graduate School of Information Science and Technology, The University of Tokyo

 

Department of Informatics, School of Multidisciplinary Sciences,
The Graduate University for Advanced Studies (SOKENDAI)

Privacy-enhancing technologies for resolving trade-offs between data anonymity and utility

Privacy Visor | Fingerprinting technologies for anonymizing data | Privacy in business processes

Privacy Visor

 With the spread of ubiquitous computing, computers, sensors, and their networks are now present everywhere, and useful services can be received at any time and in any place. At the same time, the popularization of portable terminals with built-in cameras, GPS, and other sensors has made it easy for private information to be disclosed. In particular, the privacy of photographed subjects is becoming a social problem: photos taken without the subjects' permission, or photos in which bystanders are unintentionally captured, are posted by the photographer on social networking services (SNSs) together with photographic metadata. Because of advances in the facial recognition used by Google Images, Facebook, and other services, and because portable terminals append metadata (geotags) such as the location and time of a shot, a photo disclosed without a subject's permission can reveal when and where that person was. Effective measures are therefore needed to prevent the invasion of privacy caused by secretly taken photographs and unintentional capture in camera images. The privacy risk of unintentional capture has already been pointed out in Europe and elsewhere. In experiments at Carnegie Mellon University (CMU), the names of nearly a third of the subjects who had agreed to be photographed for the experiment could be identified by matching their photos against information disclosed on SNSs; in some cases the subjects' interests and even partial social security numbers were also uncovered.
Furthermore, owing to concerns about the invasion of privacy by SNS facial recognition functions, the European Union (EU) requested that Facebook disable its facial recognition feature for European users.

 Against this backdrop, we were the first in the world to develop a technology that protects photographed subjects from the invasion of privacy caused by secretly taken photographs and unintentional capture in camera images. The technology exploits the difference between human visual perception and the spectral sensitivity of camera imaging sensors: face detection can be made to fail only at the moment a photo is taken, without adding any new function to existing cameras. This is achieved by the subject wearing a wearable device, a privacy visor, equipped with near-infrared LEDs that add noise to the captured image without affecting human visibility.
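The effect can be illustrated with a toy sketch (not the published implementation): Viola-Jones-style detectors rely on Haar-like contrast features such as "the eye band is darker than the cheek band below it." A camera sensor also records near-infrared light, so LEDs around the eyes raise the recorded brightness of that region and flip the feature, while a human observer sees no change. The pixel values below are invented for illustration.

```python
# Toy Haar-like feature: faces are detected partly because the eye
# region is darker than the cheek region in the camera image.
def haar_eye_feature(eye_band, cheek_band):
    """Return True if the eye band is darker than the cheek band."""
    mean = lambda px: sum(px) / len(px)
    return mean(eye_band) < mean(cheek_band)

# Toy pixel intensities (0-255) as recorded by the camera sensor.
eye_band = [60, 55, 70, 58]        # eyes/eyebrows: normally dark
cheek_band = [150, 160, 155, 148]  # skin below: brighter

print(haar_eye_feature(eye_band, cheek_band))   # True: feature fires

# Near-infrared LEDs add sensor-visible brightness to the eye region
# only; the human eye does not perceive the added light.
ir_noise = 120
lit_eye_band = [min(255, p + ir_noise) for p in eye_band]

print(haar_eye_feature(lit_eye_band, cheek_band))  # False: feature fails
```

A real detector evaluates thousands of such features in cascade, but breaking the strong eye-region features already causes detection to fail, which is the effect the privacy visor targets.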

Privacy Visor

Execution of face detection

New version of the Privacy Visor without power supply
(designed by Tsuyoshi Ando, Airscape Architects Studio)

Reference

  1. NII Press Release, Privacy Protection Techniques Using Differences in Human and Device Sensitivity -Protecting Photographed Subjects against Invasion of Privacy Caused by Unintentional Capture in Camera Images- December 12, 2012
  2. BBC News (UK), Privacy visor blocks facial recognition software, January 22, 2013
  3. BBC Click (BBC One, BBC Two, BBC News Channel, and BBC World News), Infrared glasses to thwart embarrassing Facebook photos, January 26-27, 2013
  4. NBC News, LED-powered 'privacy visor' thwarts facial recognition June 20, 2013
  5. TIME, Leery of Facial Recognition? These Glasses Might Help June 20, 2013
  6. T. Yamada, S. Gohshi, and I. Echizen, "Use of invisible noise signals to prevent privacy invasion through face recognition from camera images," Proc. of the ACM Multimedia 2012 (ACM MM 2012), pp.1315-1316, (October 2012)
  7. T. Yamada, S. Gohshi, and I. Echizen, “Privacy Visor: Method for Preventing Face Image Detection by Using Differences in Human and Device Sensitivity,” Proc. of the 14th Joint IFIP TC6 and TC11 Conference on Communications and Multimedia Security (CMS 2013), 10 pages, (September 2013)
  8. T. Yamada, S. Gohshi, and I. Echizen, “Privacy Visor: Method based on Light Absorbing and Reflecting Properties for Preventing Face Image Detection,” Proc. of the 2013 IEEE International Conference on Systems, Man, and Cybernetics (IEEE SMC 2013), 6 pages, (October 2013)
  9. NII Today, No.50, Get Control of Personal Data Back in Our Hands -PrivacyVisor raises discussion on arbitrary facial recognition- (June 2014)


Fingerprinting technologies for anonymizing data (collaboration with the Vienna University of Technology)

 Another area of concern is the use of statistical data beyond organizational boundaries while a certain degree of anonymity is maintained, including personal survey data that were collected with the assurance that they would be used only within the collecting company or research institution.

 For example, if medical data, which typically include the patient's name, address, age, disease, and medication, were made publicly available unchanged, the patient could be identified. To prevent this, it is necessary to blur the patient's attributes by deleting the name and address and by generalizing some of the information; for example, "Tokyo" could be generalized to "Japan," and "age 32" could be generalized to "thirties." Since more than one person would usually share the same generalized attributes, individuals could not be identified.
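The generalization step can be sketched as follows. This is a minimal illustration with invented records and rules, in the spirit of k-anonymity: after generalization, each record shares its quasi-identifiers with at least k-1 others.

```python
# Sketch: drop direct identifiers and coarsen quasi-identifiers so that
# each generalized record is shared by a group of people.
from collections import Counter

def generalize(record):
    """Delete the name, generalize city and age, keep the disease."""
    return (
        "Japan" if record["city"] in {"Tokyo", "Osaka"} else "abroad",
        f"{record['age'] // 10 * 10}s",   # e.g. 32 -> "30s" (thirties)
        record["disease"],                # kept for research utility
    )

records = [
    {"name": "A", "city": "Tokyo", "age": 32, "disease": "flu"},
    {"name": "B", "city": "Osaka", "age": 35, "disease": "flu"},
    {"name": "C", "city": "Tokyo", "age": 38, "disease": "flu"},
]

groups = Counter(generalize(r) for r in records)
k = min(groups.values())
print(groups)   # all three records map to ("Japan", "30s", "flu")
print(k)        # 3: no generalized record identifies an individual
```

Coarser rules raise k (stronger anonymity) but discard more detail, which is exactly the anonymity-utility trade-off discussed below.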

 However, this generalization approach to ensuring anonymity impairs the value and accuracy of data. In other words, there is a trade-off between the level of data anonymity and the academic utility of the data. Data anonymity has traditionally been emphasized on the assumption that anonymized data could be made freely available. Nowadays, the degree of anonymization has been lowered to increase data utility while more emphasis has been placed on measures to prevent data leaks.

 We have developed a method, called "fingerprinting of anonymized data," for identifying the source of leaked anonymized data by associating each anonymization process with a user ID. The approach capitalizes on the multiplicity of data anonymization processes: there are many different processes that achieve the same degree of anonymization. Suppose the data consist solely of birth date and gender. User A is provided with a data file containing "1971" and "male," while User B is provided with one containing "August 10, 1971" and "gender unknown." That is, our method prepares, for each user, a data set produced by a user-specific anonymization process, with every prepared set having the same level of anonymization. In the event of data leakage, the association between the anonymization process and the user ID helps identify the person responsible. Moreover, users' awareness of this identification method should make them more careful about data management; the anonymization processes themselves thus deter data leakage.
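The birth-date/gender example above can be sketched in a few lines. The two variant definitions mirror the example in the text; everything else (record values, user names) is invented for illustration.

```python
# Sketch of fingerprinting anonymized data: each user receives a copy
# anonymized by a *different* process of the *same* strength, so the
# variant itself identifies the recipient if a copy leaks.
record = {"birth": "1971-08-10", "gender": "male"}

def variant_a(r):
    """Keep only the birth year; keep the gender (User A's process)."""
    return {"birth": r["birth"][:4], "gender": r["gender"]}

def variant_b(r):
    """Keep the full birth date; suppress the gender (User B's process)."""
    return {"birth": r["birth"], "gender": "unknown"}

# Association between anonymization process (via its output) and user ID.
issued = {
    "user_A": variant_a(record),
    "user_B": variant_b(record),
}

def identify_leaker(leaked):
    """Match a leaked copy against the issued variants."""
    return [user for user, copy in issued.items() if copy == leaked]

leaked = {"birth": "1971", "gender": "male"}   # copy found outside the org
print(identify_leaker(leaked))                 # ['user_A']
```

A production scheme would embed many such variant choices per record (as in the k-anonymity-based fingerprinting of reference 2) so that even a partial leak pinpoints the source.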

 Application of this method to social networking services (SNSs) and blogs would enable the source of a privacy leak to be identified from an analysis of the text containing the leaked information. In this application, not only would the anonymization process used vary with the user, but the degree of anonymization would vary with the group.

Reference

  1. S. Schrittwieser, P. Kieseberg, I. Echizen, S. Wohlgemuth, and N. Sonehara. “Using Generalization Patterns for Fingerprinting Sets of Partially Anonymized Microdata in the Course of Disasters,” In International Workshop on Resilience and IT-Risk in Social Infrastructures (RISI 2011), Proc. of the 6th ARES conference (ARES 2011), IEEE Computer Society, pp. 645-649 (August 2011)
  2. S. Schrittwieser, P. Kieseberg, I. Echizen, S. Wohlgemuth, N. Sonehara, and E. Weippl, “An Algorithm for k-anonymity-based Fingerprinting,” Proc. of the 10th International Workshop on Digital Watermarking (IWDW 2011), LNCS, 14 pages, Springer (October 2011)
  3. H. Nguyen-Son, Q. Nguyen, M. Tran, D. Nguyen, H. Yoshiura, and I. Echizen, "New Approach to Anonymity of User Information on Social Networking Services," The 6th International Symposium on Digital Forensics and Information Security (DFIS-12), Proc. of the 7th FTRA International Conference on Future Information Technology (FutureTech2012), 8 pages (June 2012)

Privacy in business processes (collaboration with the University of Freiburg)

 The objective is to enhance the current trust model, in which data owners must trust data consumers to follow the agreed-upon obligations for processing the owners' data. In this project, data owners should be able to control the enforcement of obligations concerning the disclosure of their personal data by data providers. Service providers should be able to prove the enforcement of obligations and thereby show that personal data are used in accordance with the Japanese Act on the Protection of Personal Information and the European Data Protection Directive. This is intended to support the exchange of personal data between Japanese and European service providers. An information system is being developed that enables data owners to check the enforcement of these obligations. The foundation of this approach is information flow control mechanisms that trace the flow of personal data, e.g., via modified digital watermarking schemes.

 As an ex post enforcement of privacy policies, our proposal for traceable disclosure of personal data to third parties uses data provenance histories and modified digital watermarking schemes. The expected result is a novel form of privacy management featuring higher-level cryptographic protocols that make linkages of personal data traceable across multiple disclosures of the same data via their provenance history.

 The concept is to tag every disclosure of given personal data between two parties (signaling). Tagging gives data providers and consumers the proof they need to show that the disclosure and receipt of the data comply with the agreed-upon obligations (monitoring). The tag for personal data d consists of the identities of the data provider and the data consumer in the service orchestration used, the identity of the corresponding user, and a pointer to the agreed-upon obligations, which should be modifiable in case the purpose of the data usage or the participating service providers change. The tag should stick to d, so that d* = d|tag can be disclosed while the integrity of d* is assured. If d* is disclosed again in compliance with the obligations, the tag is updated by adding the identity of the new data consumer to that of the previous one. The sequence of tags for the same personal data thus constitutes a disclosure chain, which represents the flow of those data.
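The disclosure chain can be sketched as follows. This is an assumed structure for illustration, not the published protocol: tags are appended per disclosure, and an HMAC (standing in for the watermarking/signature scheme) binds the chain to d so the integrity of d* = d|tag can be checked.

```python
# Sketch of a disclosure chain: each disclosure appends a tag naming
# the provider, the consumer, and a pointer to the agreed obligations.
import hashlib
import hmac
import json

KEY = b"shared-audit-key"   # placeholder; a real scheme would use signatures

def disclose(d_star, provider, consumer, obligations_ref):
    """Append one tag to the disclosure chain of d* and re-seal it."""
    tag = {"from": provider, "to": consumer, "obligations": obligations_ref}
    chain = d_star["chain"] + [tag]
    body = json.dumps({"data": d_star["data"], "chain": chain}, sort_keys=True)
    mac = hmac.new(KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"data": d_star["data"], "chain": chain, "mac": mac}

d = {"data": "patient-record-42", "chain": [], "mac": ""}
d = disclose(d, "owner", "provider_P", "obligations-v1")        # first disclosure
d = disclose(d, "provider_P", "consumer_C", "obligations-v1")   # re-disclosure

# The sequence of tags reconstructs the flow of the personal data.
print([(t["from"], t["to"]) for t in d["chain"]])
# [('owner', 'provider_P'), ('provider_P', 'consumer_C')]
```

Keeping the obligations as a pointer, rather than inlining them, is what allows the agreed-upon rules to be updated when the purpose of use or the participating providers change.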

 This is one option for checking the authenticity and confidentiality of data in adaptive ICT systems, in which systems are orchestrated on demand to deliver requested services in real time. Checking confidentiality requires taking the interdependencies of the participating systems into account. Even in the case of an information leak through a covert channel, our approach should identify the data consumer at which the leak occurred and thus provide evidence of whether the orchestration is threatened by a covert channel.

Reference

  1. N. Sonehara, I. Echizen, and S. Wohlgemuth, “Isolation in Cloud Computing and Privacy-Enhancing Technologies: Suitability of Privacy-Enhancing Technologies for Separating Data Usage in Business Processes,” Business Information Systems Engineering (BISE)/WIRTSCHAFTSINFORMATIK, vol. 3, no. 3, pp. 155-162, Gabler (June 2011)
  2. S. Haas, S. Wohlgemuth, I. Echizen, N. Sonehara and G. Mueller, “Aspects of Privacy for Electronic Health Records”, International Journal of Medical Informatics, Special Issue: Security in Health Information Systems 80(2), pp.e26-e31, Elsevier, http://dx.doi.org/10.1016/j.ijmedinf.2010.10.001 (February 2011)
  3. S. Wohlgemuth, I. Echizen, N. Sonehara and G. Mueller, “Privacy-compliant Disclosure of Personal Data to Third Parties”, International Journal it - Information Technology 52(6), Oldenbourg, pp. 350-355 (December 2010)
  4. S. Wohlgemuth, I. Echizen, N. Sonehara, and G. Mueller, “Tagging Disclosures of Personal Data to Third Parties to Preserve Privacy,” Proc. the 25th IFIP TC-11 International Information Security Conference (IFIP SEC 2010), to be published in IFIP AICT series, Springer, pp.241-252 (September 2010) <One of the best papers>