Write a 250+ word third-person summary of the TED Talk, “We’re Building a Dystopia Just to Make People Click on Ads” https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click_on_ads
10 September 2018
Watch video
Take notes with/for students
Zeynep Tufekci TED Talk
“We’re building a dystopia just to make people click on ads”
https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click_on_ads?referrer=playlist-the_race_for_your_attention
Keywords from first half
Persuasion architecture
Algorithms
Dark posts
Data brokers
FIRST HALF OF TALK
Profiling people – individual profiles
AI is tailor-making “ads” for each individual
Algorithms – Picking up on human behaviors
No person fully understands the algorithms, but the artificial intelligence does, i.e., at Facebook and Google
Like taking a cross-section of someone’s brain without knowing what she is thinking.
Only the AI (artificial intelligence) knows.
Artificial intelligence (in the context of the internet and technology) means that computers can use algorithms and “big data” to learn about an individual
Facebook and Google are examples of companies using “persuasion architecture”
Her example: “selling plane tickets to Las Vegas” by targeting people with mania or bipolar disorder
SECOND HALF OF TALK
“Dark posts”
Targeting people
These are not public posts; they are individualized
Good persuasion architecture makes a company successful
Two parts: Facebook has it, and a company can buy it.
Information/news is individualized so that people don’t have the same knowledge or news.
It can shape your political views: whom to vote for, or whether to vote at all.
They are not considering your opinion; they are trying to give you an opinion
Because of a lack of transparency, voters don’t know who is persuading them
Facebook’s algorithm does not show posts in time order; it selects which ones you see
The selected posts affect you emotionally
Reforming the internet/technology?
We need to take control of the technology, so that information is transparent.
We need to re-structure the “business model.”
We should restrict algorithms; for example, not apply them to politicians
Do not offer data from algorithms to the highest bidder or the biggest company