The Next Frontier in Data Privacy

April 17, 2018

In a digital age where data is produced and collected by the second, Cummings searches for a place for privacy

Last month, Facebook and Cambridge Analytica brought data privacy into the spotlight. The UK-based data firm acquired millions of Facebook users’ personal data to build software that could target swing voters during political campaigns. Essentially, Facebook data was leveraged to create targeted ads for political gain, raising questions about the legality and ethics of data privacy.

Data privacy can be a grey area for the thousands of companies that use online behavioral data to target consumers every day – from the ads on the websites we visit to the coupons we get at the grocery store. These choices are tracked, collected and analyzed en masse. That tracking can help consumers: you watched a certain movie on Netflix, so it suggests another in the same genre that you might like. But it can also be intrusive, creating a feeling of ‘big brother’: your running route was recorded by a workout app and shared with others.

Rachel Cummings, an assistant professor at Georgia Tech’s H. Milton Stewart School of Industrial and Systems Engineering, is working to better understand data privacy and how it relates to both human behavior and the economy.

“The issue with Facebook and Cambridge Analytica highlights that data privacy is a highly nuanced issue,” said Cummings. “Unlike traditional data breaches, these two companies were legally sharing data according to an agreed upon contract. The issue in this case is downstream data use: once a person shares their data, who is allowed to use it and for what purposes?” 

Cummings recognizes the bind facing companies that want to use their data – they stand to gain valuable insights from it but hesitate for fear of bad press. Differential privacy, Cummings’ area of focus, offers a way to capitalize on data without being intrusive.

Cummings works within the field of differential privacy – a type of database privacy that guarantees the input data from a single individual (your home address, for example) has a very small impact on the output of a computation (Zillow reporting how many people live in a neighborhood, for instance). The goal of differential privacy is to ensure that conclusions are learned from the database as a whole, not from any specific individual’s record.
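To make the idea concrete, here is a minimal sketch of one standard differential privacy technique, the Laplace mechanism, applied to a neighborhood head count. The dataset, the privacy parameter epsilon, and the function names are illustrative assumptions for this article, not anything from Cummings’ own work.

```python
import numpy as np

def laplace_count(records, epsilon=0.5):
    """Return a differentially private count of records.

    Adding or removing any one person changes the true count by at most 1
    (the "sensitivity"), so Laplace noise with scale 1/epsilon is enough
    to mask each individual's contribution.
    """
    true_count = len(records)  # e.g., residents in a neighborhood
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative data: each entry stands in for one household's record.
neighborhood = ["household_%d" % i for i in range(1, 312)]
print(round(laplace_count(neighborhood, epsilon=0.5)))  # close to 311, but noisy
```

The smaller epsilon is, the more noise is added and the stronger the privacy guarantee: the aggregate stays useful while any single record’s influence is hidden in the noise.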

Cummings’ lab develops and optimizes the algorithms that support differential privacy for corporations like Google, where data can be turned into dollars. Google uses it when the Chrome web browser crashes. To identify the problem without exposing users’ search histories, a differentially private algorithm strips out personally identifiable information. It is designed to protect the privacy of individuals while still giving Google the information it needs to improve its browser.
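Systems like Google’s open-source RAPPOR project apply a local flavor of differential privacy, in which each browser randomizes its own report before sending anything back. Below is a minimal sketch of randomized response, the classic technique behind such local approaches; the yes/no question, the 75/25 probability split, and the function names are illustrative assumptions, not Google’s actual implementation.

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth, otherwise a fair coin flip.

    Any single report is plausibly deniable, yet the true rate can still be
    estimated from many reports.
    """
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the randomization to recover the population-level rate."""
    observed = sum(reports) / len(reports)
    # observed = p_truth * true_rate + (1 - p_truth) * 0.5  =>  solve for true_rate
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Illustrative simulation: 10% of 100,000 users actually hit a crash.
actual = [random.random() < 0.10 for _ in range(100_000)]
reports = [randomized_response(a) for a in actual]
print(estimate_true_rate(reports))  # close to 0.10, with no individual exposed
```

The design choice is that privacy is added on each user’s device before collection, so the company never holds the raw answers, yet the aggregate statistic it cares about is still recoverable.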

Other Fortune 500 companies, like Apple, are leading the way in piloting differential privacy to make better business decisions based on their data. Companies can better understand customers’ preferences, explain why customers made the choices they did, and predict their future behavior. The algorithms can also be applied to healthcare and medical records to find patterns in diseases or discern which treatments work for specific demographics, without violating medical privacy laws.

“With great data comes the potential for great privacy violations,” said Cummings. “As companies make more efficient use of personal data, they must also respect the privacy needs of the individuals who shared their data.  I’m hoping to revolutionize my field, as well as U.S. business practices, by redesigning privacy policies so individuals have some say over how companies use the data they create.”

Cummings plans to continue her work at the intersection of economics, machine learning and data privacy. She proposes that companies need to think about how to incentivize people to share their data, while still giving them privacy guarantees. By striking the right balance between protecting consumer privacy and monetizing data, companies will be able to leverage differential privacy to their advantage.
