
DARPA’s Brandeis program developing tools and techniques to protect the private and proprietary information of individuals and enterprises

Significant technological advances are being made across a range of fields, including information and communications technology (ICT); artificial intelligence (AI), particularly machine learning and robotics; nanotechnology; space technology; biotechnology; and quantum computing. These technologies are largely dual use: in the hands of hackers and terrorists, they can serve malicious or even lethal purposes. One of the major threats posed by emerging technologies is to privacy. Living in an ever-more connected world gives us easier access to a wealth of useful services and information, but it also exposes large amounts of our personal data, habits, and daily lives to a wider audience. Depending on your browsing habits and the websites and services you visit, everything from your birthday to your address and marital status can be harvested from your online presence. Such data is valuable and can cause harm if released: medical records, purchase histories, and internal company documents are obvious examples.

 

The collection and analysis of information on massive scales has clear benefits for society: it can help businesses optimize online commerce, medical workers address public health issues and governments interrupt terrorist activities. Yet at the same time, respect for privacy is a cornerstone principle of our democracy. The right to privacy, as Louis Brandeis first expounded in 1890, is a consequence of modernity because we better understand that harm comes in more ways than just the physical. Numerous recent incidents involving the disclosure of data have heightened society’s awareness of the extreme vulnerability of private information within cyberspace and of the relationship of private data with personal and national security. There is a growing desire to understand, control and manage the “digital contrail” of personal information continually being produced – data that other people or organizations could exploit.

 

In 2015, DARPA launched the Brandeis program, which seeks to develop the technical means to protect the private and proprietary information of individuals and enterprises. The vision of the Brandeis program is to break the tension between (a) maintaining privacy and (b) being able to tap into the huge value of data. Rather than having to balance the two, Brandeis aims to build a third option: enabling safe and predictable sharing of data in which privacy is preserved.

 

The objective of Brandeis is to develop tools and techniques that enable us to build systems in which private data may be used only for its intended purpose and no other. It seeks to restructure our relationship with data by providing the data owner with mechanisms for protecting their data before sharing it with a data user. It will also tackle a cognitive challenge: the volume and complexity of data mean that individuals and enterprises need a meaningful way to make choices about how to share data, including understanding the implications of the use of any stored data about them. The potential impact of the Brandeis program is dramatic. Assured data privacy can open the doors to personal medicine, effective smart cities, detailed global data and fine-grained internet awareness.

 

The DARPA program’s privacy-protecting approach taps advanced cryptography, differential privacy and machine-learning software in this era of emerging smart cities and uber-connected devices that can track almost every movement. “How can we gather information from lots and lots of devices in a secure way, in a privacy-preserving way, and yet still do interesting computations that may be useful for all sorts of things, from traffic to law enforcement and anti-terrorism?” asks John Launchbury, director of DARPA’s Information Innovation Office (I2O).

 

He outlines an example of the program’s application using the 2013 Boston Marathon bombing. Envision an app for smartphones that can scan photographs and identify people, he says. Police have a possible suspect but cannot simply disseminate a photograph to the public. They do have the technology, however, to tap the app to find out whether any user’s phone contains photographs of the suspect. “When I say it just like that, that sounds like privacy violations all over the place,” Launchbury shares.

 

In the hypothetical scenario Launchbury describes, it “turns out you can cryptographically process the [suspect’s] photograph to find some information about it—a sort of visual fingerprint that could be sent to the phones. Then, the software on the phone could compare that with [the] cryptographic fingerprint of the faces within their photographs. For most people, it will be no match. But maybe on one or two phones, a message will come to the user saying, ‘The police are looking for a suspect, and it appears the suspect is in this photograph of yours. Are you willing to share this photograph with the police?’”
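
To make the scenario concrete, the sketch below (Python, with hypothetical names) mimics the on-device matching step in a simplified, non-cryptographic form: a compact fingerprint derived from a face feature vector is compared locally against fingerprints of faces in the user’s photos, and only a match would trigger a consent prompt. A real Brandeis-style system would additionally wrap this comparison in cryptographic protections, such as secure multiparty computation, so that neither side learns anything beyond the match result; the embedding function that produces the face feature vectors is assumed and not shown.

import numpy as np

def fingerprint(face_vector, n_bits=256, seed=0):
    """Hash a face feature vector to a compact binary fingerprint using
    random hyperplane projections (a simple locality-sensitive hash)."""
    rng = np.random.default_rng(seed)                     # projections agreed on by both sides
    planes = rng.standard_normal((n_bits, face_vector.size))
    return (planes @ face_vector > 0).astype(np.uint8)

def matches(fp_query, fp_local, max_hamming=40):
    """Declare a match when the two fingerprints differ in only a few bits."""
    return int(np.sum(fp_query != fp_local)) <= max_hamming

def scan_local_photos(fp_query, local_face_vectors):
    """Runs on the phone: compare the query fingerprint against local faces.
    No photos leave the device; only a match would prompt the user for consent."""
    return [i for i, vec in enumerate(local_face_vectors)
            if matches(fp_query, fingerprint(vec))]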

 

The potential impact of such technology, and the program itself, could be dramatic, he offers. Imagine if businesses and governments could guarantee data privacy. “Democracy and innovation depend on creativity and the open exchange of diverse ideas, but fear of a loss of privacy can stifle those processes,” Launchbury says. The program marries the best of both worlds, he reasons.

 

The Brandeis project aims to break the polarity between maintaining privacy and accessing valuable data, particularly if the data is useful for law enforcement, according to DARPA. Rather than balancing the two concerns, Brandeis would create a third option: enabling safe and predictable data sharing that preserves privacy.

 

 

Program Awards

In 2016, Galois announced that its TAMBA project had been selected by the Defense Advanced Research Projects Agency (DARPA) to measure the privacy, performance and utility of systems for its Brandeis program, which is focused on developing tools and techniques for building systems in which private data may be used only for its intended purpose and no other.

 

Brandeis seeks to restructure society’s relationship with data by providing the data owner with mechanisms for maintaining control of their data while sharing it with others. The multi-disciplinary TAMBA team, which includes the University of Pennsylvania, the University of Maryland College Park, The Hebrew University of Jerusalem and Charles River Analytics, will evaluate the level of privacy being offered at a system level, while also accounting for users’ motivations, personal notions of data value and other human factors that impact system effectiveness.

 

“Every day we knowingly and unknowingly contribute data to applications and systems that claim to be privacy preserving – ranging from navigation apps on your phone to social networks and medical systems – but for which there are limited means to measure true privacy levels and user privacy expectations,” said Stephen Magill, Research Lead, Software Analysis, Galois. “TAMBA will build the analysis techniques and tools necessary to formally check whether the privacy controls offered by a system match user expectations.”

 

TAMBA’s approach involves a novel collection of analyses and metrics called the Knowledge-Based Measurement Framework (KBMF). The KBMF is built on semantics for tracking how private data leaks from the system over time and can be used to reason about the broad variety of privacy guarantees provided by Brandeis systems. The KBMF tools will measure the damage caused by leaks, including economic measures to capture the impact of any information release. TAMBA research will result in both theoretical advances and practical tools, all of which will be widely shared. The resulting software will be released as open source.
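
As a rough illustration of what “tracking how private data leaks over time” can mean in practice, the toy Python sketch below (not the actual KBMF tooling; all names here are ours) represents an adversary’s knowledge about a secret as a probability distribution, updates it with Bayes’ rule after an observed output, and reports the drop in entropy as a measure of leakage.

import math

def bayes_update(prior, likelihood, observation):
    """prior: dict mapping secret -> probability; likelihood(secret, obs) -> P(obs | secret)."""
    unnorm = {s: p * likelihood(s, observation) for s, p in prior.items()}
    total = sum(unnorm.values())
    return {s: w / total for s, w in unnorm.items()}

def entropy_bits(belief):
    return -sum(p * math.log2(p) for p in belief.values() if p > 0)

# Toy example: the secret is an age between 30 and 39, and the system
# releases the age rounded to the nearest multiple of five.
prior = {age: 0.1 for age in range(30, 40)}
likelihood = lambda age, released: 1.0 if round(age / 5) * 5 == released else 0.0

posterior = bayes_update(prior, likelihood, observation=35)
print("uncertainty before release:", entropy_bits(prior), "bits")      # ~3.32
print("uncertainty after release: ", entropy_bits(posterior), "bits")  # ~2.32, so ~1 bit leaked

Repeating the update across a sequence of releases gives a running picture of cumulative leakage, which is the kind of quantity the KBMF metrics aim to capture at system scale.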

 

In 2020, Charles River Analytics Inc., developer of intelligent systems solutions, partnered with Galois, Inc. to integrate probabilistic modeling into the Defense Advanced Research Projects Agency’s (DARPA) Brandeis program. Under the Brandeis program — which seeks to develop the technical means to protect the private and proprietary information of individuals and enterprises — the Galois-led TAMBA team developed technology to evaluate the effectiveness of privacy-aware systems.

 

“We applied our rich understanding of probabilistic modeling and inference to build a toolkit for evaluating privacy systems. The integration of prior models was a core part of our work on the Brandeis program,” explained Michael Harradon, Scientist at Charles River Analytics and TAMBA teammate. “Alongside Galois, we developed sophisticated adversary models and probabilistic inference approaches to test and evaluate privacy protection systems.”

 

Some new privacy systems employ encryption and differential privacy algorithms with the goal of preventing others from inferring sensitive data. Charles River Analytics developed an approach to test these systems: inference algorithms for probabilistic models are used to determine whether the sensitive information under protection can be inferred. The approach incorporates side-channel information and complex domain prior models to evaluate the strength of the system against sophisticated adversaries. This probabilistic approach also provided a platform for improving the utility of these systems by enhancing accuracy without reducing privacy, according to the company.
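
A minimal sketch of that testing idea, assuming a simple Laplace-noised count as the mechanism under test (Python; not the company’s actual toolkit): a Bayesian adversary who knows the mechanism tries to infer whether a target person is in the dataset, and the distance of the resulting posterior from the prior indicates how much protection the chosen privacy parameter really provides.

import numpy as np

rng = np.random.default_rng(1)

def laplace_count(true_count, epsilon, sensitivity=1.0):
    """Differentially private count: add Laplace noise with scale sensitivity/epsilon."""
    return true_count + rng.laplace(scale=sensitivity / epsilon)

def laplace_pdf(x, loc, scale):
    return np.exp(-abs(x - loc) / scale) / (2 * scale)

def adversary_posterior(released, count_without_target, epsilon, prior_present=0.5):
    """P(target is in the data | released count), assuming their presence adds exactly 1."""
    scale = 1.0 / epsilon
    like_present = laplace_pdf(released, count_without_target + 1, scale)
    like_absent = laplace_pdf(released, count_without_target, scale)
    weighted = prior_present * like_present
    return weighted / (weighted + (1 - prior_present) * like_absent)

released = laplace_count(true_count=101, epsilon=0.1)   # the target really is present
print("adversary's posterior:", adversary_posterior(released, 100, epsilon=0.1))
# A small epsilon keeps the posterior near the 0.5 prior; a large epsilon lets it
# drift toward certainty, flagging weaker protection.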

 

Raytheon Technologies, Two Six Labs Help DARPA Develop Application Privacy Tool

Researchers with Raytheon Technologies’ BBN Technologies subsidiary and Two Six Labs have created a mobile app designed to help application developers identify and apply information privacy techniques. The team developed the Privacy Enhancements, or PE, for Android app under the Defense Advanced Research Projects Agency’s Brandeis program, DARPA said. PE for Android features application programming interfaces, a privacy abstraction layer and services that maintain an application’s privacy throughout the development process. The app also isolates sensitive data away from high-risk areas of development.
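
The sketch below is a conceptual illustration of the privacy-abstraction-layer idea in Python (the real PE for Android interfaces are Android/Java artifacts and are not reproduced here; all names are hypothetical): apps request sensitive data through a broker that enforces a policy, degrading or denying the data rather than handing over raw values.

class PrivacyBroker:
    """Mediates app access to sensitive data sources according to a policy."""
    def __init__(self, policy):
        self.policy = policy                     # e.g. {"location": "coarse"}

    def get_location(self, raw_lat, raw_lon):
        rule = self.policy.get("location", "deny")
        if rule == "deny":
            raise PermissionError("policy forbids location access")
        if rule == "coarse":                     # degrade precision instead of leaking raw GPS
            return round(raw_lat, 2), round(raw_lon, 2)
        return raw_lat, raw_lon                  # full precision only when the policy allows it

broker = PrivacyBroker({"location": "coarse"})
print(broker.get_location(42.36012, -71.05891))  # -> (42.36, -71.06)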

 

Users will also receive privacy policy context that informs decisions on how to manage information access. “User privacy should be a first-rate concern for mobile app development, and we are hoping that open-sourcing PE for Android will galvanize the Android developer community,” said Josh Baron, the Brandeis program manager at DARPA.
