Essay about fake news, algorithms, social bots, and more

Essay about fake news, algorithms, social bots, and more

Pine W
I'm finding it encouraging to see that a number of researchers and
journalists are taking these problems seriously, trying to understand them,
and trying to improve the situation.
http://www.pbs.org/wgbh/nova/next/tech/misinformation-on-social-media-could-outfox-technical-solutions-for-now

Pine
_______________________________________________
Wiki-research-l mailing list
[hidden email]
https://lists.wikimedia.org/mailman/listinfo/wiki-research-l

Re: Essay about fake news, algorithms, social bots, and more

James Salsman-2
Pine wrote:
>
> I'm finding it encouraging to see that a number of researchers and
> journalists are taking these problems seriously, trying to understand them,
> and trying to improve the situation.
> http://www.pbs.org/wgbh/nova/next/tech/misinformation-on-social-media-could-outfox-technical-solutions-for-now

I'm encouraged by the studies, but confused about why the fake news
phenomenon is considered novel, rather than a continuation of age-old
disinformation, yellow journalism, aggressive public relations,
manufactured consent, astroturfing, propaganda, and deceptive
marketing. There's nothing new about it other than the term.

_______________________________________________
Wiki-research-l mailing list
[hidden email]
https://lists.wikimedia.org/mailman/listinfo/wiki-research-l

Re: Essay about fake news, algorithms, social bots, and more

Pine W
Agreed that what we're seeing are Internet-enabled implementations of old
practices. I think that there has been a recent renewal of awareness of how
effective these dark arts can be at generating revenue and perhaps
affecting political systems.

Over the years, a number of people and organizations have tried to
manipulate the neutrality of Wikipedia content for political, financial, or
PR advantage. I have the impression that the community's human resources
capacity and technical tools are currently insufficient in comparison to
the scale of the problems. I'm hoping that some of the tools that are being
developed as a part of the anti-harassment initiative will help a little.
I'm also thinking that a good exercise for students in Wikipedia in
Education classes would be to identify content that is noncompliant with
neutrality and verifiability standards, and either change that content
themselves or flag it for review by more experienced editors.
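As a rough illustration of where such a student exercise could start, here is a
minimal sketch (not something proposed in this thread) that uses the MediaWiki
API to list articles the community has already tagged for neutrality or
verifiability problems. The maintenance category names below are assumptions
and will vary by wiki.

import requests

API = "https://en.wikipedia.org/w/api.php"

def tagged_articles(category, limit=20):
    """List main-namespace pages in a maintenance category via the MediaWiki API."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmnamespace": 0,   # articles only
        "cmlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    return [m["title"] for m in data["query"]["categorymembers"]]

# Category names are assumptions; the target wiki's actual maintenance
# categories may be named differently.
for cat in ("Category:NPOV disputes",
            "Category:All articles with unsourced statements"):
    print(cat)
    for title in tagged_articles(cat):
        print("  -", title)

From a list like that, students could pick an article, check the flagged
statements against sources, and either fix the content or escalate it to more
experienced editors, which is essentially the exercise described above.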

Pine


On Sat, May 13, 2017 at 5:53 AM, James Salsman <[hidden email]> wrote:

> Pine wrote:
> >
> > I'm finding it encouraging to see that a number of researchers and
> > journalists are taking these problems seriously, trying to understand them,
> > and trying to improve the situation.
> > http://www.pbs.org/wgbh/nova/next/tech/misinformation-on-social-media-could-outfox-technical-solutions-for-now
>
> I'm encouraged by the studies, but confused about why the fake news
> phenomenon is considered novel, rather than a continuation of age-old
> disinformation, yellow journalism, aggressive public relations,
> manufactured consent, astroturfing, propaganda, and deceptive
> marketing. There's nothing new about it other than the term.
>
> _______________________________________________
> Wiki-research-l mailing list
> [hidden email]
> https://lists.wikimedia.org/mailman/listinfo/wiki-research-l
>
_______________________________________________
Wiki-research-l mailing list
[hidden email]
https://lists.wikimedia.org/mailman/listinfo/wiki-research-l

Re: [Wikimedia Education] Essay about fake news, algorithms, social bots, and more

James Salsman-2
I've studied this question using the same framework I use to track the
WP:SPVA changes. I'm convinced that the English Wikipedia can, given
enough time, handle every kind of controversy except:

(1) religious disputes (e.g., "Historicity of Jesus... Not to be
confused with Historical Jesus"),

(2) international political disputes (any number of disputed borders
and islands, Israel/Palestine etc.),

(3) economic disputes pertaining to http://talknicer.com/ehip.pdf and
http://talknicer.com/egma.pdf.

The issues regarding (1) don't have a material (real-world) impact; (2) are
intractable outside of Wikipedia, so why even bother; but (3) has
profound real-world political and economic impacts which affect the
Foundation's Mission by altering the extent to which free educational
content can be created and effectively disseminated. However, assertions
that the issues pertaining to (3) are a result of systemic bias are met
with ridicule.

So what we have is Wikipedia perpetuating the "fake news" of
trickle-down economics: that tax cuts for the rich are good. How might
that affect electoral outcomes, for example?

The best example at present is at
https://en.wikipedia.org/wiki/Talk:Economics#Tax_cut_claim_in_Fiscal_policy_section
which has stood for months with no interest expressed by any
Wikipedians in addressing the problem.


On Sun, May 14, 2017 at 8:22 AM, Pine W <[hidden email]> wrote:

> Agreed that what we're seeing are Internet-enabled implementations of old
> practices. I think that there has been a recent renewal of awareness of how
> effective these dark arts can be at generating revenue and perhaps
> affecting political systems.
>
> Over the years, a number of people and organizations have tried to
> manipulate the neutrality of Wikipedia content for political, financial, or
> PR advantage. I have the impression that the community's human resources
> capacity and technical tools are currently insufficient in comparison to
> the scale of the problems. I'm hoping that some of the tools that are being
> developed as a part of the anti-harassment initiative will help a little.
> I'm also thinking that a good exercise for students in Wikipedia in
> Education classes would be to identify content that is noncompliant with
> neutrality and verifiability standards, and either change that content
> themselves or flag it for review by more experienced editors.
>
> Pine
>
>
> On Sat, May 13, 2017 at 5:53 AM, James Salsman <[hidden email]> wrote:
>
>> Pine wrote:
>> >
>> > I'm finding it encouraging to see that a number of researchers and
>> > journalists are taking these problems seriously, trying to understand them,
>> > and trying to improve the situation.
>> > http://www.pbs.org/wgbh/nova/next/tech/misinformation-on-social-media-could-outfox-technical-solutions-for-now
>>
>> I'm encouraged by the studies, but confused about why the fake news
>> phenomenon is considered novel, rather than a continuation of age-old
>> disinformation, yellow journalism, aggressive public relations,
>> manufactured consent, astroturfing, propaganda, and deceptive
>> marketing. There's nothing new about it other than the term.
>>
>> _______________________________________________
>> Wiki-research-l mailing list
>> [hidden email]
>> https://lists.wikimedia.org/mailman/listinfo/wiki-research-l
>>
> _______________________________________________
> Education mailing list
> [hidden email]
> https://lists.wikimedia.org/mailman/listinfo/education

_______________________________________________
Wiki-research-l mailing list
[hidden email]
https://lists.wikimedia.org/mailman/listinfo/wiki-research-l