Fake: Searching For Truth In The Age Of Misinformation | Full Documentary | Connecticut Public
Estimated read time: 1:20
Summary
In an era where misinformation runs rampant on the internet, the struggle to discern truth from fiction has become increasingly challenging. The documentary by Connecticut Public delves into the complexities of the digital age, where propagandists harness the power of social media to spread fake news, thus shaping public opinion and threatening democracy. Experts highlighted in the documentary emphasize the need for media literacy as a crucial skill to navigate this landscape, stressing that misinformation is not a new issue but has been magnified by technology.
Highlights
Imagine grabbing a random paper among 2 million flying in a library; that's the internet now.
Fake news affects democracy, yet people struggle to identify it.
Local newspapers, albeit struggling, hold more trust than social media.
Comedy and satire influence news consumption in this 24-hour cycle.
Educating all generations in digital literacy helps combat fake information.
Key Takeaways
The internet is like a library with 2 million floating papers, making it hard to discern truth.
Fake news isn't just about elections; it's a complex, nuanced issue.
Local journalism remains a trusted news source amid digital chaos.
Satire and comedy play a role in shaping perceptions in the 24-hour news cycle.
Digital and media literacy are crucial for current and future generations.
Overview
Step into a world where information is as disordered as a room full of scattered papers. The internet has turned into a chaotic library, making it increasingly difficult to distinguish verified facts from sheer fiction. With so much information, democracy takes a hit as fake news and misinformation spread like wildfire, impairing our judgment and opinions.
Delving into the core, the documentary by Connecticut Public explores the role of media literacy today. It's not just about dodging electoral manipulation; it's about understanding the intricate nature of misinformation. This isn't merely a fight against fake posts; it's about enhancing our cognitive toolkit to question, analyze, and discern what streams through our digital feed.
In these troubled times, newspapers, especially local outlets, still hold a beacon of trust, offering a dose of reliable news amidst the digital mayhem. However, they face challenges adapting to this swift-paced, ever-evolving media landscape. Meanwhile, satirical shows inject humor and irony into news, shaping public perception, and underscoring the continuing relevance of literacy in both schools and homes to combat fake news effectively.
Chapters
00:00 - 09:00: Introduction and Definition of Fake News The chapter delves into the pervasive issue of fake news and misinformation, particularly how it spreads rapidly on the internet and influences reality. It highlights the impact of false information on discussions surrounding democracy and propaganda, underlining fake news as a significant concern.
09:00 - 16:00: Sandy Hook and Conspiracy Theories The chapter discusses the intertwined relationship between Sandy Hook and conspiracy theories, particularly in the realm of digital media. It highlights the massive change brought by websites and social media, enabling rapid dissemination of information, whether factual or propagandistic.
21:00 - 27:00: Psychological Effects of Repetition This chapter explores the overwhelming amount of information available in the digital age through various communication tools and technologies. It compares the modern internet to a chaotic library where information is unorganized, emphasizing the difficulty of determining the legitimacy of such information. It also touches upon the challenges this poses to democracy, highlighting the importance of consuming and creating content responsibly in this digital environment.
27:00 - 38:30: Misinformation vs. Disinformation Chapter Title: Misinformation vs. Disinformation
The chapter discusses the challenges posed by the flood of information in the modern media landscape. It emphasizes the difficulty in discerning truth from falsehood when there is widespread disagreement on basic facts. The explosion of news from various sources, including cable TV, social media, and countless websites, can lead to the presentation of misinformation or disinformation, often in the form of jokes, hoaxes, or propaganda that appear to be legitimate news.
38:30 - 50:00: Media Literacy and Digital Citizenship This chapter discusses the challenges of discerning truth in the overwhelming volume of news available to the average person. It highlights the tragic event of the Sandy Hook Elementary school shooting as a case study, illustrating how such incidents can temporarily unite the country. However, it also addresses how misinformation can quickly spread in the aftermath of such events, as exemplified by Alex Jones and his platform Infowars, which began disseminating false claims. This narrative underscores the importance of media literacy and digital citizenship in navigating the digital information landscape effectively.
50:00 - 68:00: Challenges Facing Traditional and Local Media This chapter discusses the challenges faced by traditional and local media in the context of misinformation. It highlights a specific case involving conspiracy theories about a shooting incident, where misinformation was spread by claiming the massacre was a hoax. The chapter explores the implications of such actions, whether driven by malice, greed, or psychological reasons, on the public's perception in a deeply divided society. Additionally, it points to the role of the rapid news cycle and tech companies in influencing the dissemination and control of news.
68:00 - 84:00: The Impact of Satire and Humor in Media The chapter explores the role of satire and humor in media and how they can influence the receipt and distribution of information. It highlights a case involving radio host Alex Jones, who is being sued by families of the 2012 Sandy Hook Elementary victims for defamation. Jones has been accused of promoting conspiracy theories by suggesting the media fabricated details of the Sandy Hook shooting. The chapter delves into how such claims can transform serious situations into 'political theater,' affecting public perception and discourse.
84:00 - 100:00: Misinformation Business Models and Ad Revenue The chapter discusses the role of misinformation in online media, particularly focusing on how it is used as a tool for revenue generation through ad models.
100:00 - 125:00: Deepfakes and Media Manipulation This chapter explores the phenomena of deepfakes and media manipulation, highlighting how they are devoid of empathy, sympathy, or cognitive understanding. It discusses how alternate narratives are constructed to fit certain worldviews and questions why people are inclined to believe such manipulated content. It challenges the notion that our reasoning abilities should naturally allow us to detect the falsity of such content.
125:00 - 150:00: Global Computational Propaganda The chapter discusses the illusory truth effect, a concept in cognitive psychology where repeated statements are perceived as more plausible. It describes an experiment where participants read stories, engaged in distractor tasks, and then rated the accuracy of various stories, some of which they'd seen before and some they hadn't.
Fake: Searching For Truth In The Age Of Misinformation | Full Documentary | Connecticut Public Transcription
00:00 - 00:30
This program is made possible with support from Connecticut Humanities.
- As misinformation and so called fake news continues to be rapidly distributed on the internet, our reality has become increasingly shaped by false information.
- The so called fake news.
- Fake news.
- You are fake news.
- Discussions of democracy and discussions of propaganda

00:30 - 01:00
have always gone together.
- The digital space in general, when websites were sort of coming about, had a massive change. Social changed things even more in the sense that there was a lot that people could come across by chance, and by others sharing with them.
- If we can't discriminate between serious arguments and propaganda, then we have problems.
- What does it mean to be a literate citizen in today's world? And from our perspective it's media literacy, right? You really need to understand all of our

01:00 - 01:30
communication tools, all the different types of technology. We have to be able to consume and create using all of them.
- Imagine if you walked into a library today and instead of everything being arranged, there were 2 million pieces of paper just flying around in the air, and you grab one of them, and you have no idea who wrote it, who financed it. Was it legitimate? Is it illegitimate? That's the internet.
- You know, it seems hard to imagine that democracy can

01:30 - 02:00
function that well when there's widespread disagreement on basic facts.
NARRATOR: The daily avalanche of legitimate news from cable TV, social media and endless websites often includes news reports that look real but are actually jokes or hoaxes and propaganda.
02:00 - 02:30
Given the volume of news available, how can the average person separate fact from fiction?
(indistinct chatter)
- There were several fatalities at the scene, both students and staff.
NARRATOR: The Sandy Hook Elementary school shooting shocked the country, and for a moment, like other tragic events in our recent history, brought the country together. Unfortunately, it didn't take long before Alex Jones, the founder of Infowars, began to peddle a

02:30 - 03:00
conspiracy theory about the shooting. Jones has repeatedly claimed the massacre was a giant hoax, carried out by crisis actors in a broad scheme to trample on Second Amendment rights. Whether Jones' actions were motivated by greed, malice or, as he later claimed, psychosis, his story exposed the risks of misinformation in a deeply divided environment, where the breaking news cycle is thriving and tech companies control how we

03:00 - 03:30
receive and distribute information.
- Two families of children who died in the 2012 Sandy Hook Elementary shooting are now suing radio host Alex Jones for defamation. The controversial host of Infowars has long suggested the media faked the information about this shooting.
- They staged Sandy Hook, the evidence is just overwhelming. It took me about a year with Sandy Hook to come to grips with the fact that the whole thing was fake.
- It's almost political theater.
03:30 - 04:00
It's, you know, agitprop, kind of performance art in the most bastardized and sadistic way.
- They've got the kids going in circles in and out of the building with their hands up. I've watched the footage, and it looks like a drill.
- Ultimately, someone like Alex Jones, given the architecture of the internet, is probably going to make more money as an outrage influencer than he would as a small local news organization. It's not actually about children anymore,

04:00 - 04:30
it's stripped of any notes of empathy or sympathy or cognitive understanding about how things happen. But what happens is there's an alternate framing that attempts to make sense out of, you know, an event in a way that fits their own worldview.
- Given how crazy this stuff seems, why is it that people could come to believe it? Shouldn't our reasoning abilities allow us to see that this content is obviously not true?
04:30 - 05:00
This classic effect from cognitive psychology called the illusory truth effect is the finding that just hearing a statement repeated makes it seem more plausible.
♪
We had people read some stories, then do some distractor tasks and random surveys for five minutes, and then we have them rate the accuracy of a bigger set of stories, some of which we showed them in the beginning and some of which we didn't.

05:00 - 05:30
And they rate the ones that we showed them at the beginning as more accurate than the ones we didn't. If they hadn't seen the headline before, about 18% of the headlines got rated as true, but if they had just been shown it five minutes earlier, that went up to 24%.
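To make the design of that experiment concrete, here is a minimal illustrative sketch in Python (not the researchers' actual code or data); the 18% and 24% figures quoted above are used as assumed rating probabilities, and the headline counts are made up for illustration.

    # Minimal sketch of the illusory-truth-effect design described above.
    # Assumed parameters: the reported rates of headlines rated "true",
    # roughly 18% for unseen headlines and 24% for headlines shown earlier.
    import random

    def rate_headline(previously_seen: bool) -> bool:
        """Return True if a participant rates the headline as accurate."""
        p_true = 0.24 if previously_seen else 0.18  # assumed from the interview
        return random.random() < p_true

    def run_session(n_headlines: int = 40) -> dict:
        """One participant: half the headlines were shown earlier, half are new."""
        seen = [rate_headline(True) for _ in range(n_headlines // 2)]
        new = [rate_headline(False) for _ in range(n_headlines // 2)]
        return {"seen_rated_true": sum(seen) / len(seen),
                "new_rated_true": sum(new) / len(new)}

    if __name__ == "__main__":
        results = [run_session() for _ in range(1000)]
        avg_seen = sum(r["seen_rated_true"] for r in results) / len(results)
        avg_new = sum(r["new_rated_true"] for r in results) / len(results)
        print(f"repeated headlines rated true: {avg_seen:.1%}")
        print(f"new headlines rated true:      {avg_new:.1%}")

Running it simply reproduces the gap the researcher describes: headlines seen a few minutes earlier get rated as true more often than new ones, even though nothing about their content changed.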
When it comes to the kind of partisan misinformation that circulates on social media, which is what we've been focusing on, it's really not that reasoning powers are getting hijacked, it's just that people are not bothering to reason in the first place, and people are just kind of

05:30 - 06:00
going with their intuitive gut responses. And what we found is when people stop and think a little bit more, they're actually substantially better at telling what's true versus false.
- One perspective of fake news when it comes to democracy is that it's just about electoral manipulation. It's about, like, duping voters and trying to steal elections. I think fake news and its effects are
06:00 - 06:30
powerful but also much more sophisticated and a little bit more subtle than that.
- We're here today to discuss online imposters and disinformation. Researchers generally define misinformation as information that is false but promulgated with sincerity by a person who believes it is true. Disinformation, on the other hand, is shared with the deliberate intent to deceive.
- Members of the committee, thank you for having me here today.

06:30 - 07:00
As you know, this problem is nuanced and complex. I've been looking at disinformation campaigns for many years. I want to highlight that while we tend to focus on fake content, the most sophisticated actors I have seen operate online actually tend to use authentic content weaponized against their targets. Today, I think there's so many different ways that an organized group can manipulate a conversation. What we use internally is called the ABC framework,

07:00 - 07:30
and basically what it says is there are three different ways in which something could be disinformation. It can be disinformation because of its deceptive actors, right? Perhaps it looks like it's just a normal activist, but in reality it's a Russian military officer; that's a deceptive actor. Or it could be disinformation because of a deceptive behavior, right? The actor is exactly who they say they are, but the way the campaign is amplified,

07:30 - 08:00
the way the message is amplified, is what makes it disinformation. That's when you use a troll farm in order to flood the internet with messages to make it look like there's some collective action, but really there is not, it's coordinated. And then there is C for content. Sometimes the content itself is deceptive, right? It could be a fake image or it could be a fake video, something in which the content itself is the vector for deception. So, in reality, there are a lot of different ways to do disinformation.
08:00 - 08:30
NARRATOR: In the run up to the 2020 presidential election, the United States is headed into what could be one of the most extraordinary years of claims and counterclaims, misinformation, and a renewed public perception that our democracy and truth itself are under attack.
- When we asked Americans about made up news themselves, about 50% say that it's a very big problem facing the country today,

08:30 - 09:00
and that places it above things like the environment or terrorism or some of these other major issues. So they definitely see it as a very big problem; they think it's getting in the way of the country being able to function well, of leaders being able to effectively make decisions and do their work, of Americans being able to stay informed about current events.
- This is not a simple problem and it's not a new problem. It's the problem that so much of democratic political theory in its 2,000-plus years

09:00 - 09:30
has been devoted to addressing.
- Epidemic of malicious fake news.
- We know that the Russians were propagating fake news through Facebook and other outlets.
- The expression fake news is a terrible expression and should never be used. Fake news suggests that there's some new thing that wasn't there before. What we face is the problem of propaganda. Propaganda as a concept begins at the very

09:30 - 10:00
beginning of discussions of democratic political philosophy. In the 20th century, we have propaganda arising in the First World War. In wartime, propaganda is always needed in order to represent the enemy as some sort of uncommon villain that is so beyond the pale and so fearsome that you need to risk your life in order to protect your family. Then we have the National Socialists, who turned propaganda into an art form.
10:00 - 10:30
So the problem of propaganda has always been central to any discussion of democracy, because democracy allows people to say what they want. Plato, in book eight of The Republic, says that democracy will lead immediately to tyranny, because democracy has at its core freedom of speech. You can't have democracy without the freedom of speech. How do we deal with this, consistently with remaining a democracy? So this is the central problem of democracy.

10:30 - 11:00
- We can actually remove a lot of the partisan politics from this and say, "It's not about the left, it's not about the right, it's about being able to trust your sources of information."
- The public puts most of that onus on journalists and the news media to sort of solve that problem of made up news, although most of the public thinks it will get worse over the next five years than they think

11:00 - 11:30
it will get better.
- Decades earlier we could all rely on and trust the fact that what news anchors were telling us on television was socially accepted fact. But with the rise of social media and going from web 1.0 to 2.0 to web 3.0, where anyone can create a social footprint that mirrors that of a media organization, the ability to trust that information is almost completely lost on us.
11:30 - 12:00
NARRATOR: As tech giants and media conglomerates fight for control of the 24-hour cycle, independent newspapers, the backbone of local and regional journalism, remain the most trusted source of news around the country. Is it the threat of Russian maneuvering or the death of trusted local journalism that poses the biggest risk to reliable news in America? Even the best local newspapers have struggled

12:00 - 12:30
to fully adapt their business models and newsrooms to this new media landscape. Thousands of reporters and editors have been cut in the past decade, greatly diminishing the capacity of independent newspapers to consistently cover their communities in depth.
- The Gazette's been around for 125 years, we just celebrated our 125th, and it has been a family newspaper the whole time. Right now we're owned and controlled by a brother and a sister. We really, I think, are the quintessential, you know,

12:30 - 13:00
locally owned family business. It's definitely the worst it's ever been in terms of questioning whether the media is in bed with somebody or is corrupt.
- When you ask questions about trust, you see, you ask about local news organizations and they tend to get the highest level of trust from Americans, about 25% that would say they have a lot of trust. National drops down to close to 20%, but when you ask about social media specifically,

13:00 - 13:30
you're down in the single digits when it comes to trust.
- There's still not a lot of understanding once you pull back the curtain on the newspaper business, I'd say the media business in general. As a reporter I don't have an opinion; it's not my job to have an opinion. You know, my job is to present the facts. A newspaper has an opinion section, which is different than the editorial section; not many people understand that that's not the opinion of the reporters,
13:30 - 14:00
but just a small group of people who form those.
GPS VOICE: Head southwest on Maxon Road Extension toward Van Der Bogart Street.
- Schenectady is a, it's a classic American Rust Belt city. A lot of the, you know, the housing stock in some of the city's neighborhoods is substandard. We've been reporting on these issues because several prominent buildings have been condemned, and we find that people are still living there despite safety issues, but where can they go if they are low income? I know somebody is home 'cause we saw them come out

14:00 - 14:30
as we were driving by. There's kids inside. This is local reporting, man, it's a lot of just waiting on people's porches and being inconvenient.
♪
- Our paper, like a lot of papers once upon a time, made a lot of profit from our print advertising business. We had tons of very loyal advertisers;

14:30 - 15:00
there was this shift to digital journalism, we were slow to respond.
- The digital space in general, when websites were sort of coming about, had a massive change to the way news was structured, sort of breaking up that bundle. But social changed things even more in the sense that there was a lot more that people could come across by chance, and by others sharing with them. Sort of that network of sharers and the bumping into news, as opposed to sort of one specific time I was going to sit down and take it in, through this dedicated, organized fashion.
15:00 - 15:30
- We have fewer resources, that's a reality. We have fewer, far fewer editors.
- And so there's, in many cases in newsrooms, far fewer staff than used to exist, and the news cycle itself has become minute by minute.
- Breaking news tonight, the drama unfolding on Capitol Hill.
- We are breaking in on a very busy news day.
- Busy news night, busy news week.
- Fast moving developments on fast moving fires.
- And so there's a constant feel of the need to stay up to date with whatever is breaking

15:30 - 16:00
or happening at the moment, and with fewer staff, being able to sort of turn away and spend time on a focused, dedicated story becomes harder for news organizations to be able to do.
NARRATOR: It would be impossible to understand today's news environment without understanding the role and impact of satire and late night comedy in the current 24-hour news cycle.
- Thank you, and what kind of real news have you heard out there?
(audience laughter)
16:00 - 16:30
- We've always wanted to do an exhibit about the power of satire and free expression in the First Amendment, and this gave us a perfect opportunity to do it. What better moment than the year before a major presidential election to talk about the power of politics and satire.
- And now for our continued comprehensive coverage of the final blow.
(audience laughter)
You're out of order, he's out of order, this whole trial is sexy.
(audience laughter)

16:30 - 17:00
President Clinton's historic impeachment trial begins Thursday, and the most important issue facing the United -
- Back 20 years ago, when Jon Stewart started doing his show, people dubbed him fake news because they weren't real journalists. They were doing sort of reporting, they were interviewing people, they were gathering facts and adding humor to it, and people called it fake news. Well, that term has become a much more malevolent term these days.
- Waterboarding is how we baptize terrorists.

17:00 - 17:30
(crowd cheering)
- Huh?
(audience laughter)
- We start this exhibit actually back at the beginning of our country. British colonists were making fun of British rule and of King George, so sort of that element of wanting to make fun of people in power is part of our American DNA. And so Jon Stewart comes on the scene in 1999 and kind of makes it into a real cultural powerhouse.
17:30 - 18:00
♪
I think it evolved. Early on he was doing, I guess you could kind of say, a little bit of that juvenile humor that, you know, is so much a part of shows like Saturday Night Live and others, and then he really kind of twisted it to kind of get it into, like, let's talk about some more important issues.
- That's what I want to get to, is that, that vision of what

18:00 - 18:30
21st century government looks like outside of the polarization of it.
- Individuals have to be really careful about the way that they experience information. Does the headline support the facts? Is the story even a legitimate story? You have to ask, "What's the source for this information?" "Does this sound exaggerated?" "Does this sound ridiculous?" And then search a little bit more. Review it and think critically about it. And I think that's what shows like The Daily Show and John Oliver and Sam Bee and Hasan Minhaj

18:30 - 19:00
are trying to get people to do. Take a look at the facts and then really think about it. Really look at the hypocritical nature that politicians have in some cases. Really ask hard questions about the information that you're getting, and about the people who are telling you things.
- If I walk over there and sit next to Mr. Johnson and carry my phone, does Google know that I was sitting here and then I moved over there?
19:00 - 19:30
- I genuinely don't know what type, knowing what type -
- I'm shocked you don't know. I think Google obviously does. Are you familiar with the General Data Protection Regulation by the European Union?
- I think the fear of what happened with the social media platforms, the fear that people now have of, like, privacy issues and data mining and all of these things, has just led people to find ways to counteract the problems that we're seeing. And a lot of people are turning to media literacy

19:30 - 20:00
and the media literacy educators and community to be that answer.
- Okay, so first things first. Let's get to your folders.
- It's kind of a new and enhanced version of literacy, right? So it's really asking the question, you know, what does it mean to be a literate citizen in today's world?
- The idea is that you're teaching people to not just learn how to read and write, but you're learning - you're teaching them how to be able to negotiate all sorts of media forms.
- All of our communication tools,

20:00 - 20:30
all the different types of technology, we have to be able to consume and create using all of them.
- If you're going to start teaching about reading and writing, you should be teaching about how the digital environment operates, and, you know, you should be teaching all of that with an eye towards critical understanding.
NARRATOR: As students spend more and more time in the digital world, the concept of digital citizenship is seen as an important area of educational knowledge, not only in the United States but around the world.
20:30 - 21:00
- So, digital learning, the general definition, is any type of learning that incorporates technology usage within it. Where a more practical use of digital learning, or how digital learning is more formally known across education, is in the type of practices of teaching kids how to use computers in a responsible and respectful manner.
- Go ahead and minimize the game so that we can see your code, Maya. And team, let's get ready to share some shout outs and suggestions for Maya. I see several students want to give you

21:00 - 21:30
a shout out and suggestion.
- They don't know how to use this technology, and when they get online and start to use the internet, there's a whole world that's open to them, right? There's so many things that they're exposed to and there's so many things that they can explore. And unfortunately, without the proper guidance, kids tend to make bad decisions; they tend to not understand how their actions on the internet will affect them later on in their life. A lot of times we have the misconception that because our kids are so exposed to technology that they automatically know how to use it and they know what to do, and that is false.
21:30 - 22:00
Kids need guidance in how to use the technology, just like anything else.
NARRATOR: As we examine the role of media literacy in the lives of our children, we must also reconsider its position in the lives of our parents and grandparents, who've witnessed the drastic shift from a print culture to a social media culture, vulnerable to similar threats.
- I think that there is something very much to be said about the generational gap of how people have used the internet over time. I remember using Napster as a kid,

22:00 - 22:30
and also understanding, like, what is aboveboard and what is not. You know, for older generations, and this is what we've seen in a lot of the academic literature, there is more of kind of a susceptibility or vulnerability to kind of consuming more of it, and then also over time believing it. And that's not to say that older generations are searching for this content by any means; they are being targeted repeatedly.
- Over 65, that population shared more fake information

22:30 - 23:00
than any other population during the 2016 election. And I think culturally and generationally they grew up in a time where, you know, if you read it, it was true.
- So I'm going to rest my finger where it says open seven days a week, right on the word open, and I'm going to let go. And now what I've done is -
- The dirty word that nobody wants to use is ageism, right? We live in a society where we have a bombardment of
23:00 - 23:30
negative stereotypes about people who are older. And we have kind of accepted that somehow we're going to segregate our society on the basis of age. You know, maybe 15 or 20 years ago seniors and older adults were driving a lot of our community and our civic dialogues. So, when you went to a public meeting or a political event, it was 70 year olds who were often the people that had the experience and the confidence to speak out on policy issues. But today we've had this sort of moment where all of those dialogues have shifted online,

23:30 - 24:00
and a lot of the tools and environments that people are using are now digital.
- The speed at which technology changes puts a challenge on us as a population and as a society to continue to educate people outside of formal education. So, the question is how do we, how do we teach, you know, how do we deal with that divide? Because it's there.
- I read, I learn how to do things that I'm interested in. If I'm looking for news I go to look for news stories

24:00 - 24:30
that I'm interested in. I use it, you know, for my daily life.
- Pulling that right handle either to the right and down or -
- What we need today is to get seniors trained and empowered and included in the digital conversations about our country and our communities' futures, so that they can bring those perspectives back in.
- I think that a lot of media literacy education is focused on messages and the importance of interpreting messages and being mindful of the
24:30 - 25:00
messages that you create. Her claim is that we perceive the printed word as more credible than the visual text. Do we agree with that? One of the reasons I love media literacy, I love teaching media literacy, I love talking about media literacy, is because it is so broad and there's so many different ways to practice it, but of course that makes it hard to scale, right? Because I can't just go into a school district and say,

25:00 - 25:30
"This is the way you have to do media literacy." With that said, though, the fact that it is adaptable, the fact that it can be flexible for different communities in different contexts, is a positive.
- When we're online we already have the notion that a lot of things are, like, fake news, or like, not every - you can't, like, trust what you see online. But when it comes to printed, I feel like there's so many people that look over it over and over and over again before it gets printed. So I feel like -
25:30 - 26:00
I guess it may not even be 100% true that printed has more accurate news, but I think it's a perceived notion that it is.
- It's a very interesting point that you just made. The process that something like this goes through to get to print, versus the ease at which we can share information online, I think it's an excellent point. The challenges of scaling media literacy often kind of show up in this idea of "Oh, teachers have too much to do," "Teachers that are, you know, in elementary school

26:00 - 26:30
or middle school or high school in the United States have so much on their plate that you can't also, on top of it, give them media literacy to teach." So, the way that we frame it is media literacy is a way to teach, it's not a subject to teach. What's been so interesting about the fake news conversation and the misinformation and disinformation conversation is that we're making an assumption that the problem is the misinformation and disinformation. The problem is much broader than that.

26:30 - 27:00
Even if we eliminated everything that's fake, even if Facebook could magically, you know, make all the disinformation disappear, we would still need media literacy; we still have so much to understand. We should have always been asking questions around journalism just because -
- It's hard to unify the country, though, with the news media being so split up.
(people talking over each other)
27:00 - 27:30
- People tend to be connected to other similar minded people, and so then that can lead to polarization, because, you know, if the two separate groups are only talking to each other, you know, they can sort of get feedback loops, so they get more and more extreme. If you have two parties in echo chambers, but one party is able to draw some members of the other party into their own echo chamber, then the people that could get drawn in, even though they don't like that party, they're more likely to vote in that direction, because

27:30 - 28:00
they think everybody else is going to vote that way.
- But it is important to point out what is a blatant left-wing double standard in this country.
- The hypocrisy on the Republican side for the last few years, I can't even fathom.
NARRATOR: As adults continue to find new ways to disagree, some kids are working hard to learn better ways to communicate and debate the more challenging topics of our time. The New Haven Urban Debate League, a student organization at Yale University, provides free debate and communication skills,
28:00 - 28:30
critical thinking, teamwork and advocacy that students will use for the rest of their lives.
- This practice is a little bit different than most practices just because we're done in terms of tournaments for this semester. It's going to be a little bit more laid back. We'll go over what we saw in the last tournament and kind of what we learned, a lot of it.
- They want to win the round and they want to succeed. So, even if that means arguing against something they really believe in, they're going to try their best to see the argument from that side.

28:30 - 29:00
So, I think in that way the competitive drive is helpful.
(indistinct chatter)
- So you guys know that was it, that was our last tournament of the semester.
(clapping)
Debate is, you know, exactly what fights polarization. We have two kids debating against two kids. They get about 15 minutes to prep their arguments before they debate against each other; they don't get to choose which side of the argument they're on.
29:00 - 29:30
The issues that we give them, we talk about them in advance, but they don't actually know the resolution, what they're actually arguing, until right before the round. Debate is learning how, learning about both sides of the argument. So, you know, no matter what you believe in, you're going to have to get up and talk about something, and try your hardest to win the argument from that side.
- Like, let's say that you get something wrong online about, like, a political candidate or something, that can affect how you vote in elections and that can,

29:30 - 30:00
like, really damage the kind of coherency of our news sources.
- There's a competitive incentive to win. So you're trying your hardest to find the best arguments for and against your position.
- And our first contention was that banning fake news or censoring fake news destroys the purpose of social media. So we did -
- Awesome. Okay, so if you look at your notes from the last round, we're going to do, like, a quick look at how their final round went, and what they've said and the arguments that they made, and then we'll go over

30:00 - 30:30
into what everybody else said, if that's good.
- Debate is about being able to justify your opinions. So, you know, I remember in high school having really strong opinions about things, and then you go into a debate and you really have to explain why. And so I think that gives people a really healthy way of looking at their own opinions and the opinions of others.
- And I think also a really, really important part of debate is learning effective communication. I see a lot of adults who get very riled up when
30:30 - 31:00
they speak about something that they believe in, and they, they, you know, jump to arguments that really aren't arguments and they start jabbing at others. And what I've learned is it's not just learning the information, understanding how to utilize that, how to process it very quickly, it's also how to effect - how to very effectively communicate that, so that not only are you getting your point across, but you're creating a good conversation.
- I think we're very aware that we are the generation that is going to see the consequences of what happens to our democracy, what happens to our planet,

31:00 - 31:30
and there is a growing sense of urgency.
- It's really up front and personal in terms, in terms of them realizing that things that they're debating right now are things that are in their hands currently at the same time. I can be doing something about this if I wanted to, because I can tell right now that I have the brainpower to do something about this. So maybe I should. The work that they're doing for an extracurricular activity kind of forces them to look at reality.
- You know, we look around the industry and we're...
31:30 - 32:00
I mean, for lack of a better word, horrified about what we see in terms of, you know, involvement of hedge funds, or anybody who, maybe, is interested in the business from a purely business standpoint. The Daily Gazette has about 200 employees all together. It's a mix of part time and full time. We still have to, you know, pay attention to the finances, but we're also exploring other avenues, and we're

32:00 - 32:30
not alone in this, about finding ways to pay for reporters and staff members through various grant money. Right now we're in discussions with a few different philanthropies. So we're looking at all these kinds of models that, that might help. We're going to do it as long as we can do it, and we're going to try to, you know, do it independently as much as possible.
- The traditional media forgot two things in the last 20 years. First of all, they forgot that their content is valuable, and therefore, if it's valuable,

32:30 - 33:00
they should not be giving it away for free. Once you give it away for free you become solely dependent on advertising, which brings us to the second problem, which is: if you're solely dependent on advertising, the most important thing is to have some catchy or deceptive headline in order to get a page view.
- Capitalism makes news deviate from its mission. Under successful democratic times the news is boring.
33:00 - 33:30
People are making concessions; one group wants this and the other group wants this; the politician says... No one's going to turn on the news to watch that. There is no money to be made in straightforwardly doing your democratic job.
- And it's only $4.99 per month. Now you can watch One America News Network live anytime, anywhere in the world. Go to our Facebook page and click here to become a supporter today.
- One of the things about the architecture of the internet

33:30 - 34:00
is that it provides a series of spaces for content creators to monetize not only the content that they create, but also the visitors that they bring to those websites.
- If this bill were to pass, would this prohibit the sale of the Bible that teaches these things about sexual morality?
- Well, literally, according to how this law was written, yes, it would.
NARRATOR: The misinformation factory model is so successful because it can be easily replicated, streamlined,

34:00 - 34:30
and often requires very little expertise to operate. Meanwhile, legitimate local news organizations, which often rely on similar ad-supported infrastructure and industries for their livelihood, are suffering.
- So, oftentimes a lot of the hyper-partisan social media pages and sites that portray themselves as news-like, but are actually just pushing disinformation, are able to use social media to bring people

34:30 - 35:00
to the website, which then brings them money through paid banner advertisements that are usually programmatically placed by big ad exchanges like Google.
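As a rough, hypothetical illustration of why that programmatic model rewards traffic regardless of accuracy, here is a small sketch of the arithmetic; the CPM rate, page view count, and ads-per-page figures below are assumptions for illustration, not numbers from the film.

    # Hypothetical illustration of programmatic ad revenue (not from the film).
    # Programmatic display revenue is typically quoted as CPM:
    # dollars per 1,000 ad impressions.

    def monthly_ad_revenue(page_views: int, ads_per_page: int, cpm_usd: float) -> float:
        """Estimate monthly revenue from programmatically placed banner ads."""
        impressions = page_views * ads_per_page
        return impressions / 1000 * cpm_usd

    # Assumed numbers: 2 million page views per month, 3 banner slots per page,
    # and a $1.50 CPM. All three are illustrative.
    print(f"${monthly_ad_revenue(2_000_000, 3, 1.50):,.0f} per month")  # -> $9,000 per month

The same arithmetic applies whether the page carries careful local reporting or a fabricated outrage story, which is the asymmetry being described here.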
- The structure of the social networks controls what information you have access to.
- People who get into it for political reasons can make it much more of their career because of the financial prospects that are often involved.

35:00 - 35:30
- Sometimes what you really need is just good investigative journalism, right? A lot of these campaigns are exposed by investigative journalists and by the media, and what they've done is to follow the money, is to pick up the phone, is to try to understand who's really behind something.
♪
- I 100% support everyone's right to free speech, but freedom of speech doesn't mean you're entitled to profit from that work. We've entered an era where brands don't know where
35:30 - 36:00
their ad dollars are going.
NARRATOR: When a brand purchases an ad with Google, it uses an algorithm based on keywords to target consumers. However, sometimes these ads can be placed on websites that are unfavorable without the brand knowing.
- There was a study that came out earlier this year from the Global Disinformation Index that found that $235 million every year is going to fund disinformation. Each dollar that's going to fund disinformation

36:00 - 36:30
is $1 that is not going to fund legitimate sources of news, and that's really problematic. The other thing is that brands never asked to be on disinformation sites, so how is it that that much money is going towards disinformation? There's something up. I went on Breitbart News, Breitbart.com, for the first time after hearing about it throughout the election cycle, and I wanted to see for myself what this website was about. And when I went on the site for the first time,

36:30 - 37:00
I was shocked to see ads for brands and companies that I shop with, that I frequent, that I'm a customer of, advertising on this website. I had a hunch, a very strong feeling, that they had no idea they were on this website. So I started by writing a Medium article, like just a blog post online, bringing attention to the fact, and that's how I met my partner, Matt Rivitz, who had already started an account a couple of weeks before
37:00 - 37:30
called Sleeping Giants. We would just take a screenshot of their ad on Breitbart, and we would, you know, we would just tweet at them with that screenshot saying, "Hey, did you know you're appearing on this racist website?" And it really was just a question. It turns out nobody knew; not a single brand was aware that their ads were appearing on that site. But almost across the board they were horrified to find out that they were. So we generally got responses pretty quickly; brands would just... It was Twitter, so they would just respond to us saying,

37:30 - 38:00
"Thanks for letting us know, we will be sure to take it down." Breitbart lost 90% of its ad revenue within three months. It hit them really hard. We had no idea we were that effective. What we did know is that brands were slowly sort of waking up to where their advertising was going. When they sign up and turn on their Google Ads or their Facebook Ads, those companies have promised them that they will not serve their ads on any website or any publication that is objectionable or hateful,

38:00 - 38:30
and they've really reneged on that promise. With Breitbart, that was the first time that a lot of brands understood this, but the bigger problem here that I see is that there's hundreds of thousands of more Breitbarts out there that tech companies are not doing anything about, that they're not paying attention to. And unfortunately a lot of brands still don't understand just how massive this problem is.
- I think on a more structural level, you know,

38:30 - 39:00
this is about freedom of speech versus freedom of reach. Do you have the right to say something online? Sure, one could argue that. However, do you have the right to monetize and amplify it by breaking the rules? That is an area that I think that we can make significant progress on... five or 10 years ago. Consumers need to understand that we can make that difference just by speaking up.
39:00 - 39:30
- There's definitely this tension in the United States between the freedom to access and publish information, and the desire to have, you know, the issue of made up news somehow be addressed, and there's a real tension that exists there. And in most cases when we ask this, the public did not want the government to take these kinds of steps if it was going to be a risk to the freedom to access and publish information; they were more willing for technology companies to do so.

39:30 - 40:00
- One man with total control of billions of people's stolen data, all their secrets, their lives, their futures.
NARRATOR: As you may have already realized, this is not Mark Zuckerberg. Computer generated videos like this one are known as deepfakes, after a 2017 Reddit user of the same name began posting doctored videos on the site.
- To address their concerns, their hopes and their dreams.
- Obviously, the potential for serious harm with these deepfakes is quite great,

40:00 - 40:30
on elections, international states, for diplomatic purposes and even for our private lives. That's why we as a country need to take swift action and invest in the research and the tools for identifying and combating the deepfakes, and create a national strategy immediately, especially for election integrity and ahead of the 2020 presidential election. We already know Russia's intentional campaign to spread disinformation throughout the last one, and I don't even want to imagine what Russia or China or just private players, the havoc they could wreak on our elections and our personal lives.
40:30 - 41:00
- Thanks, Cheryl, for bringing attention to the problems of deepfake technology, and go Navy, beat Army.
- So I will say media manipulation is really not a new thing. That was about 20 years after the first photograph was ever made in human history. You know, about 100 years later, with the computers, with the internet and digital cameras, we have Photoshop, and then we start to see fake photographs,

41:00 - 41:30
and making fake photographs is much easier. Forrest Gump is one famous example where they actually put Gump into this video sequence.
- Congratulations, how do you feel?
- I got to pee.
- I believe he said he had to go pee.
- Making fake videos is difficult but it's also possible, so it's usually what Hollywood big studios can make, with big budgets. My general research interest is in computer vision

41:30 - 42:00
and machine learning, with a special focus on digital media forensics, which is essentially to tell if a piece of digital media, including image, video or audio, has been manipulated digitally in some ways. With the abundance of online media we share, anyone is a potential target of a deepfake attack. Blinking is a subconscious activity; a normal person usually blinks, you know, every

42:00 - 42:30
six seconds to 10 seconds. And if you have a longer video where the person does not blink, this probably gives us some kind of cue that this may not be a real video. The face, the head is moving in 3D, but the face is actually pasted on with a 2D transform, and a 2D transform always leaves some discrepancy.
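As a rough illustration of the blinking cue described here, the following is a minimal, self-contained sketch (not the researcher's actual detection method): it assumes you already have a per-frame eye-openness signal, such as an eye aspect ratio from a facial landmark detector, and it simply counts blinks and flags clips that fall far below the roughly one-blink-every-six-to-ten-seconds baseline mentioned above.

    # Minimal sketch of the blink-rate cue for spotting possible deepfakes.
    # Assumes a precomputed per-frame eye aspect ratio (EAR); how you obtain it
    # (for example, from a facial landmark detector) is outside this sketch.
    from typing import List

    def count_blinks(ear_per_frame: List[float], closed_threshold: float = 0.2) -> int:
        """Count blinks as transitions from open eyes to closed eyes."""
        blinks, eyes_closed = 0, False
        for ear in ear_per_frame:
            if ear < closed_threshold and not eyes_closed:
                blinks += 1          # eye just closed: start of a blink
                eyes_closed = True
            elif ear >= closed_threshold:
                eyes_closed = False  # eye reopened
        return blinks

    def looks_suspicious(ear_per_frame: List[float], fps: float,
                         max_seconds_between_blinks: float = 10.0) -> bool:
        """Flag a clip whose subject blinks far less often than people normally do."""
        duration = len(ear_per_frame) / fps
        expected_min_blinks = duration / max_seconds_between_blinks
        return count_blinks(ear_per_frame) < expected_min_blinks

    # Illustrative usage: a fake 30-second signal at 30 fps with no blinks at all.
    if __name__ == "__main__":
        no_blink_signal = [0.3] * (30 * 30)
        print(looks_suspicious(no_blink_signal, fps=30))  # True: worth a closer look

A real forensic pipeline would combine many such cues, including the 2D-versus-3D head pose discrepancy mentioned above, rather than relying on blinking alone.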
Well, I think there is no doubt that this technology is going to grow as the year...

42:30 - 43:00
as time goes by. And there's a huge amount of interest, actually, on this, on the research side, of generating more realistic images, as close as possible to human voice or human faces. It can be used in either a good way or a bad way; it depends on who's going to use it.
- It's really hard to give a sense of the growing and global

43:00 - 43:30
scale of the issue, but here are a few recent examples. Today a report by my colleagues over at the Oxford Internet Institute highlighted that more than 70 countries currently use computational propaganda techniques to manipulate public opinion online. Since October 2018, Twitter has disclosed information around more than 25,000 accounts associated with information operations in 10 different countries. On Facebook, over 40 million users

43:30 - 44:00
have followed pages that Facebook has taken down for being involved in what they call coordinated inauthentic behavior.
- Now politicians increase their spends, increase their budget for social media operations, and there's a budget for the legitimate official campaigns. So, these are the maintenance of the official profiles on Facebook, but also Instagram, on Twitter, and YouTube, but they also have a budget for these underground black ops operations,
44:00 - 44:30 <b>where it's about creating
these communities.</b> <b>- In reality if I'm a
government trying to</b> <b>manipulate social media,
what I want to do is I</b> <b>really want to hide inside
organic groups of people, right?</b> <b>And it's going to be
really hard to go and look</b> <b>at a group and say,
"Actually, those accounts</b> <b>"are not real people, are
not part of an organic</b> <b>"campaign, they're part of
an information operation</b> <b>"that a state is
conducting."</b> <b>So in order to do that we look
at patterns at a large scale</b>
44:30 - 45:00 <b>because if you have a
small number of accounts,</b> <b>who are trying to
replicate the organic</b> <b>diversity of a large
number of accounts,</b> <b>there are different ways in
which they're going to fail.</b> <b>And we say, "Okay, well,
that campaign doesn't move</b> <b>"like an organic campaign
of people just coming</b> <b>"online together."</b> <b>And so sometimes you have, you
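To make the "organic diversity" idea concrete, here is a small hypothetical Python sketch of the kind of signal an analyst might compute; it is my illustration, not the team's actual tooling. It treats a group of accounts as suspicious when their posting hours cluster tightly and their messages are heavily copy-pasted. The thresholds and the input format are assumptions.

```python
import math
from collections import Counter

def hour_entropy(post_hours):
    """Shannon entropy of posting hours (0-23); organic crowds are spread out,
    while a small coordinated operation often posts in a narrow window."""
    counts = Counter(post_hours)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def duplicate_ratio(messages):
    """Fraction of messages whose exact text is shared with at least one other."""
    counts = Counter(messages)
    dupes = sum(c for c in counts.values() if c > 1)
    return dupes / len(messages)

def looks_coordinated(post_hours, messages, min_entropy=2.5, max_dupes=0.3):
    """Assumed thresholds: low time diversity plus heavy copy-paste is one way
    a small operation fails to mimic a large organic crowd."""
    return hour_entropy(post_hours) < min_entropy and \
           duplicate_ratio(messages) > max_dupes
```

A real investigation would combine many such signals; the interview goes on to mention simpler ones, such as the email addresses and IP addresses behind accounts.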
And so sometimes you have, you know, a campaign with people behind it who didn't try to hide very much.
45:00 - 45:30
And so, for instance, you can look at the email addresses that are behind an account or the IP address where they're coming from. But sometimes it can be really difficult, because you can have actors who are really keen to hide their identities. In cyber security we call it OPSEC, operational security. It can be very difficult to tell who's behind an account. Digital forensics can be quite hard in some cases.
45:30 - 46:00
- To what's being called a bold move by Twitter CEO Jack Dorsey, announcing plans to ban all political ads as the 2020 campaigns ramp up.
- Would I be able to run advertisements on Facebook targeting Republicans in primaries, saying that they voted for the Green New Deal? I mean, if you're not fact-checking political advertisements, I'm just trying to understand the bounds here: what's fair game?
- Congresswoman, I don't know the answer to that off the top of my head.
- So you don't know if I'll be able to do that?
- You know, we talk about the disruptions in the media climate over the past 15 years and, you know,
46:00 - 46:30
Facebook has been a huge contributor to that, you know, just the fact that it's played like a universal role in people's lives.
- What is the right intervention here? And one advocate says, let's just ban these people, let's just take this down, let's lobby Facebook to invest in content moderation, which is precisely about finding, like, hurtful words and banning those people forever from the platform. And we say that that's important, that's one
46:30 - 47:00
component of it, but that's also about taking down content and fake news that is already out there, and it's already out in the wild, it has already been circulated, right? This could be easily weaponized, particularly in communities, in national contexts where there are authoritarian aspiring leaders.
- And I think Duterte has taken out all the rules; he has green-lighted extrajudicial executions, he has harassed and bullied opponents.
47:00 - 47:30
He's insulted the President of the United States and the United Nations.
- We see this in Malaysia, we see this in Indonesia, Thailand and the Philippines, where there are attempts to regulate fake news. And the solution to fake news is actually even worse than the actual disease: to actually silence the opposition, to muffle people who are expressing political dissent. And that's why in our research we're trying to
47:30 - 48:00
argue against content regulation, because we can't rely on the government to tell us what is fake news or not.
- And so the trust level, that sort of gut reaction to trusting the news I get on social media, is very low. We also ask, sort of, do you think it's generally accurate or inaccurate? And you have more people that would say it's largely inaccurate. A majority said also it's not really helping me be more informed about the events of the day. But at the same time, they turn to it, and the main
48:00 - 48:30
reason they say they turn to it is for convenience. You know, there are sort of these caution flags that people may have in their head, but they're definitely still going to use it.
- In the newspaper you could skip from story to story, but the default is, once you start reading a story, to keep reading that story. Whereas in social media the default is to get immediately onto the next story, and you have to put in extra effort to click out, open it up and go through the story in its entirety.
48:30 - 49:00
- A lot of things in that space come to you, as opposed to you needing to seek them out.
- I think it's hard for the artificial intelligence algorithms to keep up with the continually changing target of what makes something misinformation.
- Every time we detect a new technique, we see people inventing another technique.
- What we have been working on a lot is the power of the Wisdom of Crowds. The classic example of this was, you know, from the early 1900s:
49:00 - 49:30
at some county fair there was an ox, and a guy had people guess the weight of the ox, and any individual person had no idea. Once he averaged the answers of a whole bunch of people, it was exactly right. And this has been shown in all different kinds of domains, and so I think the big question right now is how well does the Wisdom of Crowds work in identifying fake news and misinformation? But I think the advantage of the crowd approach
49:30 - 50:00
is you don't need trained experts. If you're Facebook, for example, you have a lot of money to throw at this problem if you are so inclined, and so you can hire a big crowd; the crowd doesn't have to have any particular expertise or skills.
- Right now we have the destruction of truth itself. Right now we have people trying to say "there is no truth", it's just "Which side are you on?" That's just treating the information space
50:00 - 50:30
as a complete game.
- There are always going to be very loud voices on the fringes of both sides advocating things that will generate, you know, conversation. But if we focus on how, you know, they can sustain financial operations by doing this, that's an area that I think everyone can agree deserves action.
- It has to be human first, machine second. On the one hand there is a role for automation,
50:30 - 51:00
for artificial intelligence, for machine learning, because the scale of the problem calls for that. But on the other hand, it has to be guided by humans, of course, right? By deep expertise, by people who truly understand the nuances of the problem and the trade-offs that come with it.
- NewsGuard is a browser plugin currently, and it works on Safari and Chrome and Firefox and Edge,
51:00 - 51:30
and it's the result of humans looking at news and information websites and trying to figure out whether or not those websites are credible or transparent.
- What we try to do at NewsGuard is to restore some order and some sense of context. We've hired dozens of journalists who read and review and rate websites. It's a process that typically involves five or six layers of people, starting with the person who drafts what we call the nutrition label
51:30 - 52:00
and the rating.
- These different shields, these are our reviews, and if you hover over one, you can see that we think this website, USnews.com, is generally reliable, and we explain the different criteria, whether it's a yes or no for those things. And we also have a full report; we call it a nutrition label.
- We've rated about 3,700 news sites in the United States, Italy, France, Germany, and the United Kingdom.
52:00 - 52:30
And in each of those countries those sites are responsible for at least 90% of the news and information consumed online in those countries.
- If you come to a website maybe you're not familiar with, you'll see our red shield, and so we think that this website severely violates basic standards of credibility and transparency. And again we sort of explain all our rationale for any of those "no" ratings; that also shows up in
52:30 - 53:00
Facebook feeds and Twitter feeds. So, if you come across a website and a headline that you've not heard about or don't know anything about, we've probably rated it, and you can, you know, make your own decision whether you think that website is reliable or not.
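As a purely hypothetical illustration of what a rating plugin's lookup might look like, the sketch below maps a domain to a shield color from a local table; the domains, scores, and 60-point cutoff are invented for the example and are not NewsGuard's published data or code.

```python
# Hypothetical domain-rating lookup, in the spirit of a browser plugin that
# shows a green or red shield next to a headline. Table, scores, and cutoff
# are invented for illustration only.

RATINGS = {
    "example-news.com": 87,   # assumed score out of 100
    "example-hoax.net": 20,
}
PASSING_SCORE = 60            # assumed threshold for a "green shield"

def shield_for(domain: str) -> str:
    score = RATINGS.get(domain)
    if score is None:
        return "unrated"      # no review yet; the reader decides on their own
    return "green" if score >= PASSING_SCORE else "red"

print(shield_for("example-hoax.net"))  # -> red
```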
- What we've designed this whole thing to be is the opposite of an algorithm. First of all, we're completely accountable; all of our work is right there, we're completely transparent. We call for comment if any news site seems not to be
53:00 - 53:30
living up to even one of the nine criteria; even if it's relatively minor, we call for comment. The last difference is, unlike an algorithm, we want people to game our system. We want news sites to see, well, if I did this, if I had a corrections policy for example, I can get a higher score from NewsGuard. And so far over 600 different news sites here and around the world have changed something about
53:30 - 54:00
what they do in order to get a higher score.
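Because the checklist is transparent and additive, a site can see exactly which change would raise its score. The sketch below shows that kind of checklist arithmetic in Python; the nine criterion names and point values are assumptions chosen only so the total reaches 100, not NewsGuard's actual weights.

```python
# Hypothetical additive checklist score. Criterion names and point values are
# invented for illustration; the real rubric's weights are not given here.

CRITERIA_POINTS = {
    "does_not_repeatedly_publish_false_content": 22,
    "gathers_and_presents_info_responsibly": 18,
    "regularly_corrects_errors": 12.5,   # e.g. adopting a corrections policy
    "handles_news_vs_opinion_responsibly": 12.5,
    "avoids_deceptive_headlines": 10,
    "discloses_ownership_and_financing": 7.5,
    "clearly_labels_advertising": 7.5,
    "reveals_whos_in_charge": 5,
    "provides_names_of_content_creators": 5,
}

def score(site_meets: dict) -> float:
    """Sum the points for every criterion the site satisfies."""
    return sum(pts for name, pts in CRITERIA_POINTS.items()
               if site_meets.get(name, False))

# Adding a corrections policy earns that criterion's points, which is exactly
# the kind of "gaming" the system is designed to reward.
example_site = {"regularly_corrects_errors": True, "reveals_whos_in_charge": True}
print(score(example_site))   # 17.5 under these assumed weights
```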
NARRATOR: As the volume of data grows, so does the chance of handling misinformation, which challenges both the machine's and the human's ability to uncover the truth.
- So, it's really important, I think, to zoom out and look at this problem set as something that's really about consumer protections and access to information.
NARRATOR: Too many factors, too many viewpoints, too many arguments.
54:00 - 54:30
But what if there were right answers? What if they've been right here under our noses all this time, and we've been too busy trying to prove ourselves right to notice?
- The role of consumers is to become more knowledgeable consumers of information, to be more knowledgeable about all those headlines that they see on their newsfeed.
- That's always the important piece of it: you need to be asking a lot of questions about what it is that you're doing, what you are engaging with,
54:30 - 55:00
whether it's a written text, a video text, a sound text.
- I'm hoping that a lot of these resources and strategies become more accessible to all schools, and that all schools understand how to implement them better.
- If we pretend that the problem is entirely new, we will forget the old solutions. The old solutions tell us to eliminate inequality, to address poverty, to educate our citizenry, to make people less susceptible to fear and anxiety.
55:00 - 55:30
- Information campaigns right before elections are problematic and have been problematic for a long time. Social media didn't invent that, but it's certainly possible that social media exacerbates it by making it easier for things to really spread widely.
- Everyone is really intent on limiting the actual impact of these kinds of disinformation networks. However, the landscape is constantly changing, so it's a constant arms race between
55:30 - 56:00
the disinformation networks and the ad exchanges and other kinds of platforms that do not want disinformation spreading across the internet.
- Going forward, if you're going to deal with these informational problems, you're going to have to make people less susceptible to them.
♪
56:00 - 56:30
This program is made possible with support from