SLIDE 1

Everything I Disagree With is #FakeNews:

Correlating Political Polarization and Spread of Misinformation

Manoel Horta Ribeiro Pedro H. Calais Virgílio A. F. Almeida Wagner Meira Jr.

slide-2
SLIDE 2

Motivation

  • News consumption after Online Social Networks:

Reputation matters less. Profits come from clicks. Content is algorithmically recommended.

Motivation > Methods > Results > Discussion

SLIDE 3
  • Due to this, two phenomena have an increased impact:

Opinion Polarization and the Spread of Misinformation

SLIDE 4

Opinion Polarization

  • Recommendation algorithms may limit users' exposure to ideologically diverse content.

  • The system may fuel partisan news, thus increasing polarization.

“the extent to which opinions on an issue are opposed in relation to some theoretical maximum”

SLIDE 5
Spread of Misinformation

  • Made easier by the decrease in the accountability of sources.

  • Bots may be employed to disseminate misinformation.

“misinformation is false or incorrect information, that is spread intentionally or unintentionally”

SLIDE 6
  • The media suggests an interaction between these:

[diagram: Opinion Polarization ⇄ Spread of Misinformation]

SLIDE 8
  • But previous studies also suggest an interaction:

[diagram: Opinion Polarization → Spread of Misinformation] polarized groups are more susceptible to the dissemination of misinformation

SLIDE 9
  • But previous studies also suggest an interaction:

[diagram: Spread of Misinformation → Opinion Polarization] the dissemination of misinformation plays a key role in creating polarized groups

SLIDE 10
  • Can this interaction happen in some other way?

SLIDE 11
  • Users may incorrectly classify sources as misinformation due to disagreement.
  • Alternate narratives of “what is true”.

???

“everything I disagree with is fake news”

SLIDE 12

Q1: How is polarization quantitatively related to information perceived as, or related to, fake news?

Q2: Are users designating content that they disagree with as misinformation?

SLIDE 13

Method

SLIDE 14
  • We collect a dataset to answer these questions in the following fashion:

SLIDE 15
  • We collect tweets with words and hashtags related to misinformation using the Stream API.

SLIDE 16

{fakenews, #fakenews, fake-news, #fake-news, posttruth, #posttruth, post-truth, #post-truth, alternativefact, #alternativefact, alternative-fact, #alternative-fact}

SLIDE 17

Fake news context: “HuffingComPost is a joke. Nobody believes their #fake polls or #fakenews. #MAGA {URL}”
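The keyword filter above can be sketched as a simple local matcher. This is a hypothetical sketch, not the authors' code; in practice the Stream API's `track` parameter does the matching server-side, and the keyword set below is the one listed on the slide:

```python
# Tracked terms from the slide; a local re-check of what the Stream API's
# `track` parameter would match (sketch, not the authors' implementation).
KEYWORDS = {
    "fakenews", "#fakenews", "fake-news", "#fake-news",
    "posttruth", "#posttruth", "post-truth", "#post-truth",
    "alternativefact", "#alternativefact", "alternative-fact", "#alternative-fact",
}

def matches_track_list(text: str) -> bool:
    """True if any tracked keyword appears as a token of the tweet text."""
    return any(tok.strip(".,!?:;") in KEYWORDS for tok in text.lower().split())
```

For the example tweet above, `matches_track_list("Nobody believes their #fakenews. #MAGA")` returns `True`.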

SLIDE 18
  • We use the URLs from these tweets in the Search API to find more general tweets about them (not necessarily containing the keywords).

SLIDE 19

“Canadian views of U.S. hit an all-time low, poll shows, {URL}”

SLIDE 20
  • With this we obtain, for each URL, many associated tweets.
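The URL-to-tweets mapping can be sketched as follows. The tweet schema (a dict with `text` and `urls` keys) is an assumption for illustration, not the authors' data format:

```python
from collections import defaultdict

def group_by_url(tweets):
    """Map each URL to the texts of the tweets that share it.
    Hypothetical schema: each tweet is a dict with 'text' and 'urls' keys."""
    by_url = defaultdict(list)
    for tw in tweets:
        for url in tw["urls"]:
            by_url[url].append(tw["text"])
    return dict(by_url)
```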

SLIDE 21
  • The second step involves a larger data collection via the Stream API, using broader political hashtags.

SLIDE 22
  • This allows us, with a community detection algorithm, to compute a polarization metric for some of the users.

SLIDE 23
  • Assume that the number of communities K formed around a topic T is known.

  • We build the retweet bipartite graph using the retweets in the collected dataset.

SLIDE 24
  • We select seeds with known political positions (e.g., politicians).

  • A random walker departs from each seed and travels, with some probability of restarting from its original seed.

SLIDE 25
  • The relative proximity of each node to the sets of seeds yields the probability that the node belongs to each community.
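The seeded walk described above can be sketched as a random walk with restart. This is a minimal pure-Python sketch, not the authors' implementation: the adjacency-dict input, the 0.3 restart probability, the fixed iteration count, and the two-community setting are all assumptions:

```python
def rwr_proximity(adj, seeds, restart=0.3, iters=100):
    """Random walk with restart on an undirected graph given as an
    adjacency dict {node: [neighbors]} (no isolated nodes assumed).
    Returns stationary visit probabilities of a walker that, at each
    step, restarts at one of `seeds` with probability `restart`."""
    nodes = list(adj)
    r = {u: (1.0 / len(seeds) if u in seeds else 0.0) for u in nodes}
    v = dict(r)
    for _ in range(iters):
        nxt = {u: restart * r[u] for u in nodes}
        for u in nodes:
            share = (1 - restart) * v[u] / len(adj[u])
            for w in adj[u]:
                nxt[w] += share
        v = nxt
    return v

def community_probability(adj, left_seeds, right_seeds):
    """Relative proximity to each seed set -> P(node in 'left' community)."""
    pl = rwr_proximity(adj, set(left_seeds))
    pr = rwr_proximity(adj, set(right_seeds))
    return {u: pl[u] / (pl[u] + pr[u]) for u in adj}
```

On a path graph a-b-c-d with seed a on one side and seed d on the other, node b comes out with probability above 0.5 for a's community and node c below it, matching the intuition that proximity to a seed set signals membership.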

SLIDE 28
  • With this data we:

(i) Estimate users' political polarization in different domains.

SLIDE 29
  • With this data we:

(ii) Estimate political polarization of URLs.
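One simple way to score a URL, sketched here under the assumption (not stated on the slide) that a URL's polarization is the mean leaning of the users who shared it, with user leanings on a hypothetical [-1, 1] scale:

```python
def url_polarization(user_scores, url_to_users):
    """Score each URL as the mean leaning of the users who shared it.
    `user_scores`: user -> leaning in [-1, 1] (hypothetical scale);
    `url_to_users`: url -> list of users who tweeted it."""
    return {
        url: sum(user_scores[u] for u in users) / len(users)
        for url, users in url_to_users.items()
        if users
    }
```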

SLIDE 30
  • With this data we:

(iii) Qualitatively analyze the domains and the content of the URLs.

SLIDE 31

Results


  • Data collection extracts the political orientation of 374,191 users who commented on some of the collected URLs (29%).
SLIDE 32


  • Although this is a relatively small sample of all the users in a broader context (2.67%), it jumps to 15.72% when we consider only the active users.

SLIDE 33


  • The users in the fake-news-related dataset are more polarized than those in the general politics one. This is evidence that fake-news-related discourse induces polarization.

SLIDE 34


  • Polarization grows with the degree of association with misinformation.

SLIDE 35


  • Polarization decreases with the number of reactions.
SLIDE 36


  • In this fake-news-related context, people cite sources with which they ideologically agree.

SLIDE 37

{DISMISSING A NARRATIVE} New York Post: “FBI clears Michael Flynn in probe linking him to Russia”

  • Qualitatively analyzing top URLs.

SLIDE 38

{HUMOUR} Telegraph: “Prisoner dressed as woman in failed escape bid”


SLIDE 39

{NEWS FAKE TAGGING} Raw Story: “Family blasts right-wing media for spreading fake news story about slain DNC staffer as Russia scandal deepens”


SLIDE 40

{URL FAKE TAGGING} @realdonaldtrump: “The Russia-Trump collusion story is a total hoax, when will this taxpayer funded charade end?”


SLIDE 41
  • We present quantitative evidence of various interactions between polarization and misinformation.

  • We present qualitative evidence of different uses of misinformation-related tags.

Discussion

What does this mean?

SLIDE 42
  • This may present challenges for solutions that use the “wisdom of the crowd” to determine what is fake.

  • Polarization may prove useful as a feature to distinguish between fake and merely biased content.

SLIDE 43
  • Future directions:

  • How to quantify the influence of bias on what is perceived as fake?

  • How to explicitly tell how biased a piece of information is? And should we do this?

SLIDE 44

GitHub: manoelhortaribeiro · Twitter: manoelribeiro · Mail: manoelribeiro at dcc.ufmg.br

Thank You!