Zeynep Tufekci
Inaugural Director, Craig Newmark Center for Journalism Ethics and Security, Columbia University | Assistant Professor, University of North Carolina, Chapel Hill
Activism & Advocacy
Big Data & Data Science
Technology
Social Media
Artificial Intelligence
Techno-sociologist Zeynep Tufekci is an internationally recognized authority on the interplay between technology and social, cultural, and political dynamics. She has particular expertise in how social change movements use social media, and in the social and moral implications of relying on big data and algorithms to make decisions. The author of Twitter and Tear Gas: The Power and Fragility of Networked Protests, she is the Inaugural Director of the Craig Newmark Center for Journalism Ethics and Security at Columbia University, an assistant professor at the University of North Carolina, Chapel Hill, and a monthly contributing opinion writer for the New York Times.

Social science trifecta. Zeynep brings a rare combination of gifts to her audiences. She is a technologist, having begun her career as a computer programmer. She is a brilliant and creative social science researcher and analyst. And she speaks from direct experience as a participant in several of the most important social movements of the past few decades, from the first to use the internet as an organizing tool (the Zapatista movement in Mexico) to the Arab Spring and Tahrir Square in Egypt.

Privacy, security, and big data. In more and more areas of business and society, we rely on big data analysis and algorithms, on machine learning and artificial intelligence, to make our decisions. We do this in the belief that they are better and more “objective” than decisions made by humans. Zeynep challenges these trends, and the assumptions on which they rest, with hard evidence and careful analysis of the results, which do not support the faith we place in these technologies. Her research in this area is featured in the documentary Coded Bias, which sheds light on the biases embedded in artificial intelligence technologies and the human rights violations that arise from them. Drawing on Zeynep's work, the film makes the case that these algorithms and machine-learning systems are only as unbiased as the humans and historical data that train them.

Twitter and Tear Gas. Zeynep has witnessed firsthand the power of social media as a tool for organizing large numbers of people, and she has also seen the weaknesses of movements built this way. In her presentations, as in her landmark book, she takes audiences inside these movements as no one else can, while offering an essential critique not only of these new tools and their impact but, more broadly, of the emerging intersections between authority, technology, and culture.

Credentials. Zeynep Tufekci was a 2022 Pulitzer Prize finalist in Commentary. She is the Inaugural Director of the Craig Newmark Center for Journalism Ethics and Security at Columbia University and an assistant professor at the University of North Carolina, Chapel Hill, in the School of Information and Library Science, with an affiliate appointment in the Department of Sociology. She writes a monthly opinion column for the New York Times and is a regular contributor to The Atlantic. She has given multiple TED talks and is an Andrew Carnegie Fellow. She is a faculty associate at Harvard's Berkman Klein Center for Internet & Society and has been a fellow at Princeton's Center for Information Technology Policy.
Videos
Online social change: easy to organize, hard to win | TED
Machine intelligence makes human morals more important | TED
We’re building a dystopia just to make people click on ads | TED