We are surrounded by an incredible expanse of data, created at an exponential rate and offered on a multitude of electronic devices: a wealth of significant information ready to benefit a population eager to absorb it. Can this bountiful data be protected from theft, exploitation, and other misuse, yet still be shared with those who need it? It is Dr. Murat Kantarcioglu’s mission to ensure that useful information can be extracted without sacrificing privacy or security. To achieve these goals, Dr. Kantarcioglu is currently working on security, privacy, and accountability issues in machine learning and big data, developing machine learning and graph analysis techniques for detecting cyber attacks, and creating blockchain-based techniques for accountable data sharing.
What research are you currently immersed in?
My research focuses on data security and privacy. For example, we want to use social network information gathered online to better understand how society evolves, but at the same time we don’t want the data misused, and privacy needs to be protected. There is a lot of useful big data to be shared, but my goal is to ensure it is shared safely and privately.
Why did you choose this career path?
In high school I was thinking of becoming a medical doctor but then I watched a movie—Sneakers—about cybersecurity. I don’t want to spoil the movie for you but there was a great quote in the movie from Ben Kingsley to Robert Redford that went something like this: “There is a war out there, old friend. And it’s not about who’s got the most bullets, it’s about who controls the information.” I thought this was a change-the-world type of area. I was in high school, and naïve in a sense, and I just wanted to understand cybersecurity and data and how they really work.
How do you stay ahead of cybercrime?
It’s a mindset. I think I’m a little bit cautious by nature, so I always think about what could go wrong. I think about weaknesses in a system and how someone could exploit those weaknesses. Therefore, it is my team’s practice to imagine these attacks before they actually happen. Take social media data, for instance: we think up ways this information can be abused. We think like attackers.
Is the motivation behind these crimes social science or an economic agenda?
There are many different dimensions depending on who the attacker is. If you’re thinking of petty thieves or cyber criminals, they are motivated by economic gain. If you think about state actors, they usually have a goal such as obtaining important secrets or influencing some political outcome. It depends on who the attacker is, but it ranges from social to economic to getting sensitive information.
What challenges do you face in cybersecurity?
When you think of cybersecurity there are so many different areas. My focus is trying to protect data and data resources but, of course, there are other aspects. The biggest challenge we now have is that more complex systems are being built with large software tools and developed by many different people, and usually these systems are put together quite quickly. The first consequence is that there are a lot of vulnerabilities. The second is that, from a data security point of view, we now have a lot of data collected about individuals that can be accessed and misused. Trying to come up with access control and other protective measures while still enabling use of this data is a challenge. In other words, you want people to continue their jobs by accessing and using this data, but at the same time you want to limit what they can reach. How you balance access and limitations is a big challenge. If you think about medical data, it may be good to share it across hospitals so different organizations can use it to improve how they do things, or to optimize the healthcare system, but there are privacy issues: you don’t want health records to be leaked or misused. Weighing the balance between security and privacy is something I think about a lot.
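The balance described here, letting people do their jobs with the data while limiting what they can reach, can be sketched as a tiny purpose-based access filter. This is a minimal illustration of the general idea, not Dr. Kantarcioglu’s actual system; the roles, purposes, and record fields are all invented for the example.

```python
# A minimal sketch of purpose-based access control for shared records:
# each (role, purpose) pair is granted only the fields it needs.
# Roles, purposes, and field names below are hypothetical.

ALLOWED = {
    # (role, purpose) -> fields that role may read for that purpose
    ("researcher", "population_study"): {"age_band", "diagnosis_code"},
    ("physician", "treatment"): {"name", "age_band", "diagnosis_code", "history"},
}

def filter_record(record: dict, role: str, purpose: str) -> dict:
    """Return only the fields this role may see for this purpose."""
    permitted = ALLOWED.get((role, purpose), set())
    return {k: v for k, v in record.items() if k in permitted}

patient = {"name": "J. Doe", "age_band": "40-49",
           "diagnosis_code": "E11", "history": "..."}

# A researcher doing a population study gets de-identified fields only;
# an unknown (role, purpose) combination gets nothing at all.
print(filter_record(patient, "researcher", "population_study"))
```

The design choice is deny-by-default: any combination not explicitly listed sees an empty record, which errs on the side of privacy.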
What would you say is your biggest discovery?
We’ve done a lot of work on understanding the risks of social network data and how it may be used to violate user privacy. We also looked into how machine learning and AI techniques could be attacked. There is an increase in machine learning and AI techniques used for automating decisions and making predictions. We are looking at how these techniques can be attacked and also how we can make machine learning models more robust against those attacks. Those are some of the discoveries that came out of our research lab.
What do you think of platforms such as Facebook and LinkedIn?
First, there is lots of information on social media. Our past research shows that you can use the information on social networks like Facebook to infer even more information about individuals. For example, maybe you never disclosed your sexual orientation on your Facebook profile, but because of some of the data available, machine learning techniques can predict it very accurately. In other words, the more you reveal in posts, comments, and the like, the more can be inferred. So even if you never stated your sexual orientation, your other activity can reveal it, along with your political affiliation and more. A good example of this is travel: Facebook predicts you’re a frequent flyer if you access Facebook from five different locations within a three-month period. I think people need to be very careful about this. There was some research suggesting you can predict people’s intelligence from their Facebook likes. Our research shows that’s not very accurate, and people can simply lie on Facebook to look smarter to these kinds of algorithms. So there’s a limit on that.
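The kind of attribute inference described here can be illustrated with a toy classifier: given the "likes" of users who did disclose an attribute, predict it for a user who didn’t. This is a hedged sketch of the general idea only; the pages, labels, and training data are invented, and a simple naive Bayes count model stands in for the actual techniques used in the research.

```python
from collections import defaultdict
import math

# Hypothetical training data: each user's public "likes" paired with a
# self-disclosed attribute (here, a made-up political affiliation).
training = [
    ({"PageA", "PageB"}, "party_x"),
    ({"PageA", "PageC"}, "party_x"),
    ({"PageD", "PageE"}, "party_y"),
    ({"PageD", "PageB"}, "party_y"),
]

def train_naive_bayes(data):
    """Count how often each like co-occurs with each attribute value."""
    label_counts = defaultdict(int)
    like_counts = defaultdict(lambda: defaultdict(int))
    for likes, label in data:
        label_counts[label] += 1
        for like in likes:
            like_counts[label][like] += 1
    return label_counts, like_counts

def predict(likes, label_counts, like_counts):
    """Score each attribute value with add-one-smoothed log likelihoods."""
    total = sum(label_counts.values())
    best, best_score = None, float("-inf")
    for label, n in label_counts.items():
        score = math.log(n / total)
        for like in likes:
            score += math.log((like_counts[label][like] + 1) / (n + 2))
        if score > best_score:
            best, best_score = label, score
    return best

model = train_naive_bayes(training)
# A user who never disclosed an affiliation, but likes PageA:
print(predict({"PageA"}, *model))  # → party_x
```

Even this crude counting model recovers the undisclosed attribute from one like, which is the point of the interview: activity you do share leaks attributes you don’t.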
What security advice can you give us?
My advice is to limit the amount of information you put out there. For example, in my case, I do use Facebook but I try not to give my birthday, etc. I suggest to my friends that before you post anything, think about what happens if your mom sees it, if your significant other sees it, and if your boss sees it. I regularly delete old stuff, just in case. I don’t keep posts past two or three years, in case I said something earlier and times have changed. It’s a good idea to do a spring-cleaning type of thing: clean your profiles and the other information you have out there.