Fwd: [foundation-l] Bot policy on bots operating interwiki

Fwd: [foundation-l] Bot policy on bots operating interwiki

White Cat
I think we have a serious problem with this. When the interwiki bot issue
was last discussed, there were only a handful of wikis. I think it is time
to bring some attention to this.

http://meta.wikimedia.org/wiki/Special:SiteMatrix displays quite a large
number of wikis (I was told around 700). Wikipedia alone has 253 language
editions according to
http://meta.wikimedia.org/wiki/List_of_Wikipedias

I was told that only 60 of these 700-ish wikis have an actual local bot
policy, and most of those are just translations (or mis-translations) of the
en.wiki policy.

Why is this a problem? Well, if a user decides to operate an interwiki bot
on all wikis, he or she (or it?) would have to make about 700 requests on
the individual wikis. Aside from those 60, most of these wikis do not even
have a bot request page, IIRC. Those 700 individual requests would have to
be listed on [[m:Requests for bot status]], and a steward would have to
process all 700 of them, minus the wikis that have active bureaucrats.
That's just one person. As we are a growing community, now imagine just 10
people seeking such interwiki bot operation: that's a workload of 7,000
requests. And Wikimedia keeps growing. There are far more than 700 languages
on earth - about 7,000 according to
http://en.wikipedia.org/wiki/Natural_language#Native_language_learning -
which ultimately means 7,000 * (number of sister projects) wikis per
individual bot. With ten bots, that's 70,000 requests.
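The arithmetic above can be checked with a quick sketch. The wiki and
language counts are the rough figures quoted in this mail, and the sister
project count is a made-up placeholder, not an exact number:

```python
# Rough workload estimates for per-wiki bot approval, using the
# approximate figures quoted above (for illustration only).
wikis_today = 700          # rough count from Special:SiteMatrix
languages_on_earth = 7000  # rough count of natural languages
sister_projects = 9        # hypothetical placeholder value

def approval_requests(operators: int, wikis: int) -> int:
    """One request per operator per wiki under the current process."""
    return operators * wikis

# One operator today: ~700 requests; ten operators: ~7,000.
assert approval_requests(1, wikis_today) == 700
assert approval_requests(10, wikis_today) == 7000

# If every language eventually gets a wiki, ten operators would need
# ~70,000 requests even before counting sister projects.
assert approval_requests(10, languages_on_earth) == 70000

# Counting sister projects multiplies that number again.
print(approval_requests(10, languages_on_earth * sister_projects))
```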

There are a couple of CPU-demanding but mindless bot tasks, all of them
handled by the same code on every wiki. Tasks that come to my mind are:

    * Commons delinking
    * Double redirect fixes
    * Interwiki linking
    * Perhaps even anti-spam bots
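To illustrate how mindless a task like double-redirect fixing is, here is a
minimal sketch over an in-memory redirect table. The table is a stand-in for
real wiki pages; an actual bot would do the same resolution against the live
wiki:

```python
# Minimal double-redirect fixer over an in-memory redirect table.
# Keys are page titles; values are redirect targets (None = article).
redirects = {
    "Old title": "Newer title",
    "Newer title": "Final title",  # double redirect: Old -> Newer -> Final
    "Final title": None,           # an actual article
}

def resolve(title: str) -> str:
    """Follow the redirect chain to its final target, guarding loops."""
    seen = set()
    while redirects.get(title) is not None:
        if title in seen:          # redirect loop: leave it for a human
            raise ValueError(f"redirect loop at {title!r}")
        seen.add(title)
        title = redirects[title]
    return title

def fix_double_redirects(table: dict) -> dict:
    """Point every redirect directly at its final target."""
    return {
        page: (resolve(target) if target is not None else None)
        for page, target in table.items()
    }

fixed = fix_double_redirects(redirects)
assert fixed["Old title"] == "Final title"  # now a single hop
```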


Currently we already have people - such as MediaWiki developers - who make
bot-like alterations to individual wikis without even considering the
opinions of the local wikis. I do not believe anyone finds this problematic.
Also, we elect stewards from a central location; we do not ask the opinion
of individual wikis. The actions a steward has access to are vast, but the
permission they have is quite limited. So centralized decision-making isn't
a new concept. If Wikimedia is a very large family, we should be able to
make certain decisions family-wide.

I think the process for bots operating interwiki should be fundamentally
simplified. Asking every wiki for permission may seem like the nice thing to
do, but it is a serious waste of time for the bot operator, the stewards,
and the local communities alike. There is no real reason to repeatedly
approve "different" bots operating the same code.

My suggestion for a solution to the problem is as follows:

A foundation/meta bot policy should be drafted, establishing a centralized
bot request process for a number of very specific tasks (not everything).
All of these need to be mindless activities such as interwiki linking or
double-redirect fixing. The foundation would not be interfering with "local"
affairs, but instead regulating inter-wiki affairs. All policies on wikis
with a bot policy should be compatible, or be made compatible, with this
foundation policy. Bot requests of this nature would be processed on meta
alone, saving everyone time. The idea is fundamentally "one nomination per
bot" rather than "one nomination per wiki".

If a bot breaks, it can simply be blocked; otherwise the community should
not have any problem with it. How much supervision do interwiki bots really
need, anyway?

Perhaps an interface update is necessary to allow stewards to grant bot
flags in bulk rather than individually, if this hasn't been implemented
already.


   - White Cat
_______________________________________________
foundation-l mailing list
[hidden email]
http://lists.wikimedia.org/mailman/listinfo/foundation-l
Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Teun Spaans
Interwiki bots occasionally need serious attention. Interwiki bots spread
interwiki links, but not always in the right fashion: when one wiki has a
link to the wrong article, interwiki bots tend to spread this error to all
wikis.
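This spreading effect can be shown with a toy model (made-up page titles,
deliberately simplified bot logic): bots typically take the union of all the
link sets they find, so a single wrong link ends up everywhere in the ring.

```python
# Toy model of interwiki link propagation. Each wiki maps a local page
# to the set of (language, title) links it currently carries.
links = {
    "en": {("de", "Haus"), ("fr", "Maison")},
    "de": {("en", "House"), ("fr", "Maison")},
    # fr mistakenly links to the wrong German article:
    "fr": {("en", "House"), ("de", "Hausboot")},
}

def bot_pass(links: dict) -> dict:
    """Naive bot: give every wiki the union of all links seen anywhere."""
    union = set().union(*links.values())
    return {lang: {l for l in union if l[0] != lang} for lang in links}

after = bot_pass(links)
# The wrong link ("de", "Hausboot") has now spread to en as well,
# alongside the correct ("de", "Haus") - exactly the error described.
assert ("de", "Hausboot") in after["en"]
assert ("de", "Haus") in after["en"]
```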


Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Peter van Londen
Hi,

I would like to also turn this into a technical issue. Interwiki/interlanguage
links (there is a difference between interwiki and interlanguage; most people
mean interlanguage when talking about interwiki), organized as they are
(decentralized), are becoming more and more of a problem, because:
* the number of edits needed grows exponentially with the number of languages;
* mistakes (wrong interlanguage links) are multiplied through bot actions;
* bots are set to run automatically, which means that only the easy
interwikis are done. The difficult interwikis, requiring handmade changes,
language knowledge and investigation, are not done.

There is only one real solution, IMHO: organize it centrally, which means
something like a central database hosted on Commons.
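A sketch of what such a central database could look like: one record per
concept, with each wiki deriving its language links from the shared record
instead of storing them in article text. The record layout and names here
are my own invention, not an actual proposal:

```python
# Hypothetical central interlanguage store: one record per concept,
# mapping language code -> local page title.
central = {
    "concept:house": {"en": "House", "de": "Haus", "fr": "Maison"},
}

# Reverse index so a wiki can find its concept record from a title.
by_title = {
    (lang, title): concept
    for concept, titles in central.items()
    for lang, title in titles.items()
}

def language_links(lang: str, title: str) -> dict:
    """Links shown on one wiki, derived from the central record."""
    concept = by_title.get((lang, title))
    if concept is None:
        return {}
    return {l: t for l, t in central[concept].items() if l != lang}

# A fix in the central record is instantly visible everywhere;
# no per-wiki bot edits are needed.
assert language_links("en", "House") == {"de": "Haus", "fr": "Maison"}
```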

Over the years there have been several proposals about that, also on this
list, but until now it was apparently not seen as a huge problem. Maybe that
is still the case, or maybe it is time to plan for a solution?

Talking about interlanguage links, some feature requests come to mind:
* a possibility to limit the shown interwikis, set in the preferences;
* a possibility to set the order of interwikis, also set in the preferences.

I would be interested in a comment from the devs on whether they see this as
a potential problem and whether they see some solutions for interlanguage
links.

Kind regards, Peter van Londen/Londenp

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Brianna Laugher
Some relevant bugs:

167 Metadata (interwiki & category links) should be stored outside the
article text
http://bugzilla.wikimedia.org/show_bug.cgi?id=167

195 links between Wikimedia-sites (request for ability to interwiki
between projects eg wikipedia to wikinews)
http://bugzilla.wikimedia.org/show_bug.cgi?id=195

6628 Special page to edit interwiki table
http://bugzilla.wikimedia.org/show_bug.cgi?id=6628

cheers,
Brianna

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

White Cat
In reply to this post by Teun Spaans
Yes, but that has nothing to do with the "local approval" issue I mentioned.
Such a fix would require interwiki-wide attention, and if there are no bots
operating on all wikis, a fix like that would take a lot of time. All the
more reason we should have more interwiki bots operating on all wikis.

    - White Cat

Fwd: Fwd: [foundation-l] Bot policy on bots operating interwiki

White Cat
In reply to this post by Peter van Londen
Bugzilla has lots and lots of bugs awaiting dev attention, and a fix may not
necessarily happen in a "timely" fashion. Code-wise, interwiki links aren't
critical, unlike many other bugs that need to be fixed as soon as possible:
a wiki can survive without them, but they benefit the project greatly.

Yes, the three challenges you mention are the root of the problem. Having
more interwiki bots operate on all wikis, rather than just a few, would be a
solution. Say I noticed a mistakenly linked language - correcting it would
be quite a challenge.

We are talking about tens of thousands of pages on hundreds of wikis.
Surely you aren't suggesting that edits like this should be done manually.
For the most part, interwiki bots operate without a problem; when they do
make improper edits, that is due to errors made locally. The problem itself
could be solved if we had one bot per wiki scanning local pages. Sadly, few
people have the courage to deal with the 'insane' workload of seeking
permission from individual wikis.

I really wish you would not turn this into a technical issue, as it isn't
one. Or at least there are a lot of dependencies that haven't been
addressed, such as interwiki templates. Only then can we have a serious
discussion on your suggestion, which may have problems that require bot
edits.

   - White Cat

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Andre Engels
In reply to this post by Peter van Londen
2007/9/7, Peter van Londen <[hidden email]>:

> I would like to also turn this into a technical issue.
> Interwiki/interlanguage links (there is a difference between interwiki and
> interlanguage; most people mean interlanguage when talking about
> interwiki), organized as they are (decentralized), are becoming more and
> more of a problem, because:
> * the number of edits needed grows exponentially with the number of
> languages;
> * mistakes (wrong interlanguage links) are multiplied through bot actions;

And even removing them might get reverted if not done well enough, although
on the other hand, in many cases a bot's interwiki run grinds to a halt
after an error, because of what you mention in your next point.

> * bots are set to run automatically, which means that only the easy
> interwikis are done. The difficult interwikis, requiring handmade changes,
> language knowledge and investigation, are not done.

I rarely set my bots to automatic, preferring to disambiguate the
problematic cases myself. I might be the exception, however. And even so,
there are still plenty of cases where I decide to just let it go. Which
brings me to a fourth point:

* The bots work from the assumption that interwiki links form an equivalence
relation - that is, when A has an interwiki to B and B has one to C, there
should also be links from B to A and from A to C - and that there should be
only one interwiki from a given page to a given language. Unfortunately,
there are cases where this assumption fails. For example, some languages
have a municipality in Denmark and its main town as separate pages, others
as one page. To make things worse, in some cases this combined page is
basically about the town, in others about the municipality. One would like
to connect those two pages to each other, but to different pages in a
language that separates the two. In the current setting this is impossible
(well, we could define something to notify bots of this situation, but one
would need to define and add comments for almost every language to make it
work, not to mention that all bots as well as the involved human editors
would have to know them). In a new system such as the one you propose, it
could be worked in (something like "if a language has no page in this
'ring', make an interwiki to that ring instead").
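The failure of the equivalence assumption can be made concrete: bots
effectively merge link "rings" by taking connected components of the link
graph, so a combined article drags two distinct rings into one. A toy
illustration with the Danish municipality example (made-up titles):

```python
# Toy illustration: bots treat interwiki links as an equivalence
# relation, i.e. they merge connected components of the link graph.

# lang:title nodes; an edge means "some wiki links these two pages".
edges = [
    ("da:Town", "en:Town"),                  # town articles
    ("da:Municipality", "en:Municipality"),  # municipality articles
    # language 'xx' has one combined article, linked from both da pages:
    ("da:Town", "xx:Town_and_municipality"),
    ("da:Municipality", "xx:Town_and_municipality"),
]

def components(edges):
    """Merge connected components: each node maps to its full ring."""
    comp = {}
    for a, b in edges:
        merged = comp.get(a, {a}) | comp.get(b, {b})
        for node in merged:
            comp[node] = merged
    return comp

comp = components(edges)
# The combined 'xx' article drags all four pages into one ring, so a
# bot can no longer keep en:Town and en:Municipality apart - the
# equivalence assumption fails exactly as described above.
assert comp["en:Town"] == comp["en:Municipality"]
```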

> Talking about interlanguage links, some feature requests come to mind:
> * a possibility to limit the shown interwikis, set in the preferences;
> * a possibility to set the order of interwikis, also set in the
> preferences.

If one had a system like the one you describe, I would like to add:
* a possibility to also show interwiki links that are already defined but
whose target page does not exist yet.

--
Andre Engels, [hidden email]
ICQ: 6260644  --  Skype: a_engels

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

White Cat
Either way, there is a need for an interwiki bot that operates on all
wikis to apply a fix; otherwise, as you point out, two interwiki bots
may end up reverting each other. Had either bot had permission to edit
every wiki, this problem would have been avoided.

On 9/7/07, Andre Engels <[hidden email]> wrote:

>
> 2007/9/7, Peter van Londen <[hidden email]>:
>
> > I would like to turn this also in a technical issue.
> Interwiki/Interlanguage
> > (there is a difference between interwiki and interlanguage, most people
> mean
> > interlanguage when talking about interwiki) organized as it is
> > (decentralized), is becoming more and more a problem, because:
> > * amount of edits needed, growing exponentially with growth of languages
> > * multiplying the mistakes (wrong interlanguages) through bot actions.
>
> And even removing them might be reverted if not done well enough,
> although on the other hand in many cases bot interwiki grinds to a
> halt after an error because of what you mention in your next point.
>
> > * bots are set to automatic, which means that only the easy interwiki's
> are
> > done. The difficult interwiki's, requiring handmade changes language
> > knowledge and investigations, are not done.
>
> I rarely set my bots automatic, preferring to disambiguate the
> problematic cases. I might be the exception, however. And even so,
> there are still plenty of cases where I decide to just let it go.
> Which brings me to the fourth point:
>
> * The bots work from the assumption that interwiki links form an
> equivalence relationship - that is, that when A has an interwiki to B
> and B has one to C, that there should be interwiki B to A and C to B
> as well - and that there should be only one interwiki from a given
> page to a given language. Unfortunately, there are cases where this
> assumption fails. For example, some languages have a municipality in
> Denmark and its main town as separate pages, others as one page. To
> make things worse, in some cases this latter page is basically about
> the town, in others about the municipality. Those two pages one would
> like to connect to each other, but to different pages in a language
> that separates the two. In the current setting this is impossible
> (well, we could define something to notify bots of this situation, but
> one would need to define and add comments on almost each language to
> make it work, not to mention that all bots as well as involved human
> editors should know them). If one would create a new system as
> proposed by you, it could be worked in (something like "if in a
> language there is no page in this 'ring', make an interwiki to that
> ring instead").
>
> > Talking about interlanguage some feature requests come into my mind:
> > * a possibility to limit the shown interwikis, set in the preferences
> > * a possibility to set the order of interwikis, also set in the
> preferences.
>
> If one would have a system like the one you describe, I would like to add:
> * a possibility to also show interwiki that are already defined, but
> for which the target page does not exist yet.
>
> --
> Andre Engels, [hidden email]
> ICQ: 6260644  --  Skype: a_engels
>
> _______________________________________________
> foundation-l mailing list
> [hidden email]
> http://lists.wikimedia.org/mailman/listinfo/foundation-l
>

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Andre Engels
2007/9/7, White Cat <[hidden email]>:
> Either way there is a need for an interwiki bot operating on all wikis for a
> fix otherwise as you point out two interwiki bots may end up reverting each
> other. Had either bot had permission to edit each wiki this problem would
> have been avoided.

Reverting each other? No, why would they? I don't understand what
situation you mean.

--
Andre Engels, [hidden email]
ICQ: 6260644  --  Skype: a_engels


Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

White Cat
Bots aren't sentient, so they can act stupidly. There are situations
where you have a bad interwiki link. Unless it is removed from every
single page where it forms a chain, it will eventually return to the
list (which makes sense: the bots treat the wrong link as a new member
of the chain). However, if all interwiki bots were able to operate on
all wikis, such problems could easily be avoided.
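A toy illustration of that failure mode (invented names and data layout, not actual interwiki-bot code): if an automatic pass simply merges the link sets of every page in a chain, then a bad link removed from only one page is re-seeded from the surviving copies, while removing it everywhere keeps it gone.

```python
# Toy model only - invented page names, not real bot code.
# `pages` maps a page name to the set of pages it links to; one automatic
# pass gives every page the union of everything seen anywhere in the chain.

def bot_pass(pages):
    union = set()
    for title, links in pages.items():
        union |= links | {title}
    return {title: union - {title} for title in pages}
```

Removing "xx:Wrong" from en:Foo alone achieves nothing: the next pass copies it back from de:Foo. Only removing it from every page in the chain keeps it out.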

   - White Cat

On 9/7/07, Andre Engels <[hidden email]> wrote:

>
> 2007/9/7, White Cat <[hidden email]>:
> > Either way there is a need for an interwiki bot operating on all wikis
> for a
> > fix otherwise as you point out two interwiki bots may end up reverting
> each
> > other. Had either bot had permission to edit each wiki this problem
> would
> > have been avoided.
>
> Reverting each other? No, why would they? I don't understand what
> situation you mean.
>
> --
> Andre Engels, [hidden email]
> ICQ: 6260644  --  Skype: a_engels
>
> _______________________________________________
> foundation-l mailing list
> [hidden email]
> http://lists.wikimedia.org/mailman/listinfo/foundation-l
>

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Tuvic
Indeed, that's right. Just remember that interwiki-bots only spread
the bad link, they don't make it: it is human users who make the bad
link.

It has happened to me on several occasions: I had just spent 20 minutes
untangling a web of interwiki-linked articles, and some user just puts
a bad link back, because he/she thinks that the link should be there.
Very annoying, and not always revertible: after all, I'm just an
interwiki-bot operator, while it's their home wiki most of the time.

So, not all problems would be avoided by having a general bot policy.

Greetings, Tuvic

2007/9/7, White Cat <[hidden email]>:
> Bots aren't sentient so they can act stupidly. There are situations where
> you have a bad interwiki link. Unless that is removed from every single
> instance where it forms a chain it will eventually return to the list (which
> makes sense, the bots think the wrong link as a new member to the chain).
> However if all interwiki bots were able to operate on all wikis such
> problems could be very easily avoided.
>
>    - White Cat
>


Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

White Cat
Yes, what's breaking the bot is human error. And as a fellow interwiki-bot
operator, I think it would be of great help if we were given some slack on
bot-flag bureaucracy. You could just use the bot to fix the bad
interwiki link rather than fixing it manually. The policy would not solve
everything, but it would be a good step in the right direction.

      - White Cat

On 9/7/07, Tuvic <[hidden email]> wrote:

>
> Indeed, that's right. Just remember that interwiki-bots just spread
> the bad link, they don't make it: it are human users who make the bad
> link.
>
> It happened to me on several occasions: I had just spend 20 minutes to
> untangle an web of interwiki-linked articles, and some user just puts
> a bad link back, because he/she thinks that the link should be there.
> Very annoying, and not always revertable: after all, I'm just an
> interwiki-bot-operator, while it's their home wiki most of the time.
>
> So, not all problems would be avoided when having a general bot policy.
>
> Greetings, Tuvic
>
> 2007/9/7, White Cat <[hidden email]>:
> > Bots aren't sentient so they can act stupidly. There are situations
> where
> > you have a bad interwiki link. Unless that is removed from every single
> > instance where it forms a chain it will eventually return to the list
> (which
> > makes sense, the bots think the wrong link as a new member to the
> chain).
> > However if all interwiki bots were able to operate on all wikis such
> > problems could be very easily avoided.
> >
> >    - White Cat
> >
>
> _______________________________________________
> foundation-l mailing list
> [hidden email]
> http://lists.wikimedia.org/mailman/listinfo/foundation-l
>

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Effe iets anders
I think that the situations above have demonstrated perfectly that bots are
not perfect :) And although I think that the advantages outweigh the
disadvantages, that doesn't mean that every community (with 0 to a bazillion
members) agrees with that conclusion. I think it is of the utmost
importance that communities are independent, and are at least able to
protest against a new bot user. I know this is a pain in the ass, I know
this means more work for you guys, and I know that you don't like this. But
when deciding these kinds of things, I think that you should not only look
from the point of view of the bot owner, but even more from the POV of the
community (yes, even if there is only half a person there). Put the request
on the appropriate page (that is, either a bot request page, some much-visited
community page, or even possibly talk:Main_Page in the extreme
case) and give those folks the ability to protest against the new bots. If they
don't want them, well, it's their wiki, their choice. If that is because of
wrong information, well, either inform them properly or leave it be. I
think it is totally wrong if stewards are forcing bots down their throats.

And by the way, I am confident that you are able to write some script to make
filing the requests somewhat easier in the first place... For the stewards
it makes no difference anyway, because we have to grant the rights separately
in any case...

Effeietsanders

2007/9/8, White Cat <[hidden email]>:

>
> Yes, whats breaking the bot is human error. and as a fellow interwiki-bot
> operator I think it would be of great help if we were given some slack on
> bot flag bureaucracy. You could just use the bot to fix the bad
> interwikilink rather than fixing them manually. The policy would not solve
> everything but would be a good step in the right direction.
>
>       - White Cat
>
> On 9/7/07, Tuvic <[hidden email]> wrote:
> >
> > Indeed, that's right. Just remember that interwiki-bots just spread
> > the bad link, they don't make it: it are human users who make the bad
> > link.
> >
> > It happened to me on several occasions: I had just spend 20 minutes to
> > untangle an web of interwiki-linked articles, and some user just puts
> > a bad link back, because he/she thinks that the link should be there.
> > Very annoying, and not always revertable: after all, I'm just an
> > interwiki-bot-operator, while it's their home wiki most of the time.
> >
> > So, not all problems would be avoided when having a general bot policy.
> >
> > Greetings, Tuvic
> >
> > 2007/9/7, White Cat <[hidden email]>:
> > > Bots aren't sentient so they can act stupidly. There are situations
> > where
> > > you have a bad interwiki link. Unless that is removed from every
> single
> > > instance where it forms a chain it will eventually return to the list
> > (which
> > > makes sense, the bots think the wrong link as a new member to the
> > chain).
> > > However if all interwiki bots were able to operate on all wikis such
> > > problems could be very easily avoided.
> > >
> > >    - White Cat
> > >
> >
> > _______________________________________________
> > foundation-l mailing list
> > [hidden email]
> > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> >
> _______________________________________________
> foundation-l mailing list
> [hidden email]
> http://lists.wikimedia.org/mailman/listinfo/foundation-l
>

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

White Cat
I think it is beyond silly to demand that people make over 700 individual
human edits just so they can run an interwiki bot. It takes weeks,
if not months, of work to file all the requests. All these bots run the
same code. I still have yet to see one logical explanation for why
communities need to "approve" a specific script repeatedly. Bots A and B
make identical edits since they run the same code.

No, I cannot write a script. Fundamentally, bots are what you would call a
"script". What you suggest is the use of an unauthorized bot, something
expressly banned. I can't believe you are even suggesting it.

If the local community is unhappy with a bot, they can simply block it or ask
on meta for it to be removed from the wikis that support interwiki bots. If
the local wiki does not have a single admin, then it is not truly ready for a
bot request discussion. Given their article counts, the bots would make only
rare appearances on such wikis anyway.

Wikipedia/Wikimedia isn't a democracy. If devs are allowed to "force"
software upgrades down the local communities' throats, I truly do not see why
interwiki bot operators are not allowed to do the same.

    - White Cat

On 9/8/07, effe iets anders <[hidden email]> wrote:

>
> I think that above situations have described perfectly that bots are not
> perfect :) And although I think that the advantages outweight the
> disadvantages, that doesn't mean that every community (with 0-bizillioin
> members) agrees to that conclusion. I think that it is of the uttermost
> importance that communities are independant, and are at least able to
> protest to another new bot user. I know this is a pain in the ass, I know
> this means more work to you guys, and I know that you don't like this. But
> when determining this kind of things, I think that you should not only
> look
> from the point of view of the bot owner, but even more to the pov of the
> community (yes, even is there is only half a person there). Put the
> request
> on the appropriate page (that is either a bot request page either some
> much
> visited community page or even possibly the talk:Main_Page in the extreme
> case) and give those folks the ability to protest to the new bots. if they
> don't want them, well, it's their wiki, their choise. If that is because
> of
> wrong information, well, either inform them well, either leave it there. I
> think it is totally wrong if stewards are forcing bots up their throat.
>
> And btw, I am confident that you are able to write some script to make
> that
> making the requests somewhat easier in the first place... For the stewards
> it makes no difference btw, because we have to grant hte rights seperately
> anyways...
>
> Effeietsanders
>
> 2007/9/8, White Cat <[hidden email]>:
> >
> > Yes, whats breaking the bot is human error. and as a fellow
> interwiki-bot
> > operator I think it would be of great help if we were given some slack
> on
> > bot flag bureaucracy. You could just use the bot to fix the bad
> > interwikilink rather than fixing them manually. The policy would not
> solve
> > everything but would be a good step in the right direction.
> >
> >       - White Cat
> >
> > On 9/7/07, Tuvic <[hidden email]> wrote:
> > >
> > > Indeed, that's right. Just remember that interwiki-bots just spread
> > > the bad link, they don't make it: it are human users who make the bad
> > > link.
> > >
> > > It happened to me on several occasions: I had just spend 20 minutes to
> > > untangle an web of interwiki-linked articles, and some user just puts
> > > a bad link back, because he/she thinks that the link should be there.
> > > Very annoying, and not always revertable: after all, I'm just an
> > > interwiki-bot-operator, while it's their home wiki most of the time.
> > >
> > > So, not all problems would be avoided when having a general bot
> policy.
> > >
> > > Greetings, Tuvic
> > >
> > > 2007/9/7, White Cat <[hidden email]>:
> > > > Bots aren't sentient so they can act stupidly. There are situations
> > > where
> > > > you have a bad interwiki link. Unless that is removed from every
> > single
> > > > instance where it forms a chain it will eventually return to the
> list
> > > (which
> > > > makes sense, the bots think the wrong link as a new member to the
> > > chain).
> > > > However if all interwiki bots were able to operate on all wikis such
> > > > problems could be very easily avoided.
> > > >
> > > >    - White Cat
> > > >
> > >
> > > _______________________________________________
> > > foundation-l mailing list
> > > [hidden email]
> > > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> > >
> > _______________________________________________
> > foundation-l mailing list
> > [hidden email]
> > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> >
> _______________________________________________
> foundation-l mailing list
> [hidden email]
> http://lists.wikimedia.org/mailman/listinfo/foundation-l
>

Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Thomas Dalton
In reply to this post by White Cat
Unified login ought to make this much easier. While I don't know of
any plan for site-wide bot flags, it shouldn't be too difficult to
implement such a feature once unified login is in place. Then we would
just need one bot request page on meta for all interwiki bots (and the
bot policy would be a superset of all existing bot policies, where
possible).


Re: Fwd: [foundation-l] Bot policy on bots operating interwiki

Effe iets anders
In reply to this post by White Cat
From Andre's explanation I understand that his robot is less automatic
and that he resolves more cases manually. That means that although you might
run the same script, the number of errors is not necessarily the same.
Besides that, there is also a matter of trust involved (do you trust that the
person will only run that specific script? Do you trust that he or she will
make the necessary updates in time? And what about the conflicting
situations?). So there are differences enough. And even if they refused you
just because they want a maximum of three bots or so (which seems
understandable to me), I would consider that a valid argument.

But *even* if they didn't want a bot to have the bot bit for no particular
reason - they just don't want it - so be it. I feel that this is up to
the *community* and not to us to decide, unless the software changes
dramatically, as Peter described before.

Eia

2007/9/8, White Cat <[hidden email]>:

>
> I think it is beyond silly to demand people to make over 700 individual
> human edits just so they can run an interwiki bot. It takes well over weeks
> if not months of work to file all the requests. All these bots operate the
> same code. I still need to see one logical explanation why communities
> need to "approve" a spesific script repetitively. Bot A and B makes
> identical edits since they run the same code.
>
> No I cannot write a script. Fundamentally bots are what you call, a
> "script". What you suggest is the use of an unauthorized bot, something
> exclusively banned. I can't believe you are even suggesting it.
>
> If the local community is unhappy with a bot they can simply block it or
> ask on meta to be removed from wikis that support interwiki bots. If the
> local wiki does not have a single admin they they are not truly ready for a
> bot request discussion. The bot's would make rare appearances in such wikis
> with their article count anyways.
>
> Wikipedia/Wikimedia isn't a democracy. If devs are allowed to "force"
> software upgrades down the local communities throats, I truly do not see why
> interwiki bot operators are not allwed to do the same.
>
>     - White Cat
>
> On 9/8/07, effe iets anders <[hidden email]> wrote:
>
> > I think that above situations have described perfectly that bots are not
> > perfect :) And although I think that the advantages outweight the
> > disadvantages, that doesn't mean that every community (with 0-bizillioin
> >
> > members) agrees to that conclusion. I think that it is of the uttermost
> > importance that communities are independant, and are at least able to
> > protest to another new bot user. I know this is a pain in the ass, I
> > know
> > this means more work to you guys, and I know that you don't like this.
> > But
> > when determining this kind of things, I think that you should not only
> > look
> > from the point of view of the bot owner, but even more to the pov of the
> >
> > community (yes, even is there is only half a person there). Put the
> > request
> > on the appropriate page (that is either a bot request page either some
> > much
> > visited community page or even possibly the talk:Main_Page in the
> > extreme
> > case) and give those folks the ability to protest to the new bots. if
> > they
> > don't want them, well, it's their wiki, their choise. If that is because
> > of
> > wrong information, well, either inform them well, either leave it there.
> > I
> > think it is totally wrong if stewards are forcing bots up their throat.
> >
> > And btw, I am confident that you are able to write some script to make
> > that
> > making the requests somewhat easier in the first place... For the
> > stewards
> > it makes no difference btw, because we have to grant hte rights
> > seperately
> > anyways...
> >
> > Effeietsanders
> >
> > 2007/9/8, White Cat <[hidden email] >:
> > >
> > > Yes, whats breaking the bot is human error. and as a fellow
> > interwiki-bot
> > > operator I think it would be of great help if we were given some slack
> > on
> > > bot flag bureaucracy. You could just use the bot to fix the bad
> > > interwikilink rather than fixing them manually. The policy would not
> > solve
> > > everything but would be a good step in the right direction.
> > >
> > >       - White Cat
> > >
> > > On 9/7/07, Tuvic < [hidden email]> wrote:
> > > >
> > > > Indeed, that's right. Just remember that interwiki-bots just spread
> > > > the bad link, they don't make it: it are human users who make the
> > bad
> > > > link.
> > > >
> > > > It happened to me on several occasions: I had just spend 20 minutes
> > to
> > > > untangle an web of interwiki-linked articles, and some user just
> > puts
> > > > a bad link back, because he/she thinks that the link should be
> > there.
> > > > Very annoying, and not always revertable: after all, I'm just an
> > > > interwiki-bot-operator, while it's their home wiki most of the time.
> > > >
> > > > So, not all problems would be avoided when having a general bot
> > policy.
> > > >
> > > > Greetings, Tuvic
> > > >
> > > > 2007/9/7, White Cat <[hidden email]>:
> > > > > Bots aren't sentient so they can act stupidly. There are
> > situations
> > > > where
> > > > > you have a bad interwiki link. Unless that is removed from every
> > > single
> > > > > instance where it forms a chain it will eventually return to the
> > list
> > > > (which
> > > > > makes sense, the bots think the wrong link as a new member to the
> > > > chain).
> > > > > However if all interwiki bots were able to operate on all wikis
> > such
> > > > > problems could be very easily avoided.
> > > > >
> > > > >    - White Cat
> > > > >
> > > >
> > > > _______________________________________________
> > > > foundation-l mailing list
> > > > [hidden email]
> > > > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> > > >
> > > _______________________________________________
> > > foundation-l mailing list
> > > [hidden email]
> > > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> > >
> > _______________________________________________
> > foundation-l mailing list
> > [hidden email]
> > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> >
>
>

Bot policy on bots operating interwiki

White Cat
To Thomas Dalton:

I think SUL is a distant dream at this point. Even when it becomes available,
we will have problems with how rules are currently set. How compatible are
the local wiki policies with each other?

I am en-N, tr-4 and ja-1. I do not understand local policies outside those
languages (and I understand Japanese only with great difficulty). Most
interwiki bot operators know just a single language. All local bot policies
should be advertised/linked from a page on meta, preferably with an English
translation. After all, I can only follow rules I can find and understand.
It has to be both. It would be beyond silly to require interwiki bot
operators to know every language out there.

All bot policies on all local wikis should be compatible with each other.
There should be a meta bot standard for a few specific tasks, such as
interwiki linking, double-redirect fixing, and commons delinking, to avoid
conflicts. Local wikis would NOT be required to follow this standard, but
they would be jawdroppingly incompetent not to. This is much like how
commons works: wikis aren't required to use/move images on/to commons, but
they are recommended to do so. That has worked pretty well.

This will be a problem even when SUL becomes a reality... that is, if it
becomes a reality.

To: effeietsanders

I bet it is the same code (redirect.py). You can interwiki-link pages like
that with it too. The amount of attention I pay is directly proportional to
the number of wikis my bot operates on (otherwise it is a waste of time, as
explained above, since another interwiki bot would simply revert). I do not
have the time for that at the moment because I have 665 wikis on which to
request a bot flag, which I would rather not do.

Indeed, trust is important - in fact, critical. How would a wiki with half a
user determine whether or not to trust a bot operator? Say the bot operator
is from de.wikipedia and the local wiki user is from a far-eastern-language
wiki (e.g. ml.wikipedia) that does not even use Latin script. How would he
be able to determine whether or not to trust that user?

I think the operation of an interwiki bot needs interwiki consensus. If a
user is deemed trustworthy in an interwiki discussion on meta, I think it is
safe to say he/she/it is trustworthy on all wikis. If a local wiki still
decides to refuse that interwiki consensus, they can and are more than
welcome to block the bot in question.

Let's take your example. Suppose a wiki states its local restriction of a
"3-bot max" on meta. All users who want to operate an interwiki bot would
see it and react to it. That wiki would be ignored by interwiki bot
operators, as per its request; bot operators would not even bother to file a
request for an interwiki bot flag there. This would save everyone a lot of
time. As for the 699 other wikis, no such restriction applies, and they do
not have a problem with the interwiki bots.

I still have yet to see one logical explanation for why communities need to
"approve" a specific script repeatedly.

It only makes sense to handle interwiki affairs, like interwiki bots,
through an interwiki medium such as meta.

    - White Cat


On 9/8/07, effe iets anders <[hidden email]> wrote:

>
> From Andre's explanation I do understand that his robot is less automatic
> and he solves more manually. That means that although you might run the same
> script, the number of errors is not necessarily the same. Besides that,
> there is also a matter of trust involved (do you trust the person (s)he will
> only run that specific script? Do you trust that (s)he will do the necessary
> updates in time? And what about the conflicting situations?) So there are
> differences enough. But even if they'd not allow you, just because they want
> maximal three bots or so (which seems understandable to me) I would think
> that as a valid argument.
>
> But *even* if they wouldn't want a bot to have a bit just because of no
> single reason, they just dont want it, so be it. I feel that this is up to
> the *community* and not to us to decide, unless the software changes
> dramatically, as peter described before.
>
> Eia
>
> 2007/9/8, White Cat < [hidden email]>:
> >
> > I think it is beyond silly to demand people to make over 700 individual
> > human edits just so they can run an interwiki bot. It takes well over weeks
> > if not months of work to file all the requests. All these bots operate the
> > same code. I still need to see one logical explanation why communities
> > need to "approve" a spesific script repetitively. Bot A and B makes
> > identical edits since they run the same code.
> >
> > No I cannot write a script. Fundamentally bots are what you call, a
> > "script". What you suggest is the use of an unauthorized bot, something
> > exclusively banned. I can't believe you are even suggesting it.
> >
> > If the local community is unhappy with a bot they can simply block it or
> > ask on meta to be removed from wikis that support interwiki bots. If the
> > local wiki does not have a single admin they they are not truly ready for a
> > bot request discussion. The bot's would make rare appearances in such wikis
> > with their article count anyways.
> >
> > Wikipedia/Wikimedia isn't a democracy. If devs are allowed to "force"
> > software upgrades down the local communities throats, I truly do not see why
> > interwiki bot operators are not allwed to do the same.
> >
> >     - White Cat
> >
> > On 9/8/07, effe iets anders < [hidden email]> wrote:
> >
> > > I think that the situations above demonstrate perfectly that bots are
> > > not perfect :) And although I think the advantages outweigh the
> > > disadvantages, that doesn't mean that every community (with 0 to a
> > > bazillion members) agrees with that conclusion. I think it is of the
> > > utmost importance that communities are independent, and are at least
> > > able to object to a new bot user. I know this is a pain in the ass, I
> > > know this means more work for you guys, and I know that you don't like
> > > it. But when deciding this kind of thing, I think you should not only
> > > look at it from the point of view of the bot owner, but even more from
> > > the POV of the community (yes, even if there is only half a person
> > > there). Put the request on the appropriate page (either a bot request
> > > page, some much-visited community page, or possibly even the
> > > talk:Main_Page in the extreme case) and give those folks the chance to
> > > object to the new bots. If they don't want them, well, it's their
> > > wiki, their choice. If that is because of wrong information, well,
> > > either inform them properly or leave it be. I think it is totally
> > > wrong if stewards force bots down their throats.
> > >
> > > And by the way, I am confident that you are able to write a script to
> > > make filing the requests somewhat easier in the first place... For the
> > > stewards it makes no difference anyway, because we have to grant the
> > > rights separately in any case...
> > >
> > > Effeietsanders
> > >
> > > 2007/9/8, White Cat < [hidden email] >:
> > > >
> > > > Yes, what's breaking the bot is human error. And as a fellow
> > > > interwiki-bot operator, I think it would be of great help if we were
> > > > given some slack on bot-flag bureaucracy. You could just use the bot
> > > > to fix the bad interwiki link rather than fixing it manually. The
> > > > policy would not solve everything, but it would be a good step in the
> > > > right direction.
> > > >
> > > >       - White Cat
> > > >
> > > > On 9/7/07, Tuvic < [hidden email]> wrote:
> > > > >
> > > > > Indeed, that's right. Just remember that interwiki-bots only
> > > > > spread bad links, they don't create them: it is human users who
> > > > > make the bad links.
> > > > >
> > > > > It happened to me on several occasions: I had just spent 20
> > > > > minutes untangling a web of interwiki-linked articles, and some
> > > > > user simply put a bad link back, because he/she thought the link
> > > > > should be there. Very annoying, and not always revertible: after
> > > > > all, I'm just an interwiki-bot operator, while it's their home
> > > > > wiki most of the time.
> > > > >
> > > > > So, not all problems would be avoided by having a general bot
> > > > > policy.
> > > > >
> > > > > Greetings, Tuvic
> > > > >
> > > > > 2007/9/7, White Cat <[hidden email] >:
> > > > > > Bots aren't sentient, so they can act stupidly. There are
> > > > > > situations where you have a bad interwiki link. Unless it is
> > > > > > removed from every single instance where it forms part of a
> > > > > > chain, it will eventually return to the list (which makes
> > > > > > sense: the bots treat the wrong link as a new member of the
> > > > > > chain). However, if all interwiki bots were able to operate on
> > > > > > all wikis, such problems could be very easily avoided.
> > > > > >
> > > > > >    - White Cat
> > > > > >
> > > > >
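[The chain behaviour described above can be modelled as each bot computing the transitive closure of the interwiki link graph. A minimal sketch, with invented page names and an invented `interwiki_pass` helper purely for illustration, showing why a bad link reappears unless it is removed from every page in the chain at once:]

```python
# Minimal model of interwiki-bot behaviour: one bot pass walks the chain of
# interwiki links from each page and links that page to everything reachable
# from it (the transitive closure). Page names and links are invented.

def interwiki_pass(links):
    """One bot pass: every page gets links to all pages reachable from it."""
    updated = {}
    for page in links:
        seen, stack = set(), [page]
        while stack:
            current = stack.pop()
            for target in links.get(current, ()):
                if target != page and target not in seen:
                    seen.add(target)
                    stack.append(target)
        updated[page] = seen
    return updated

# en:Foo <-> de:Foo is correct; fr:Bar is a bad link a human added on de.
links = {
    "en:Foo": {"de:Foo"},
    "de:Foo": {"en:Foo", "fr:Bar"},   # the bad link
    "fr:Bar": {"de:Foo"},
}

after = interwiki_pass(links)
print("fr:Bar" in after["en:Foo"])  # True: the bad link has spread to en

# Removing the bad link on de:Foo alone is not enough once it has spread:
links_fixed_on_de = dict(after)
links_fixed_on_de["de:Foo"] = {"en:Foo"}
again = interwiki_pass(links_fixed_on_de)
print("fr:Bar" in again["de:Foo"])  # True: en:Foo re-introduces it
```

This is exactly the failure mode Tuvic and White Cat describe: the bots are behaving consistently, but any one surviving copy of the bad link reseeds the whole chain on the next pass.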
> > > > > _______________________________________________
> > > > > foundation-l mailing list
> > > > > [hidden email]
> > > > > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> > > > >
> >
> >
>
Re: Bot policy on bots operating interwiki

Thomas Dalton
> I think SUL is a distant dream at this point.

Really? From what I've heard, the devs are making pretty good progress
towards it.

Re: Bot policy on bots operating interwiki

Andre Engels
In reply to this post by White Cat
2007/9/8, White Cat <[hidden email]>:

> I bet it is the same code (redirect.py). You can interwiki-link pages like
> that with it too.

I suppose you mean interwiki.py rather than redirect.py? In that case,
yes, it's the same code. The options are a bit different (in
particular, I don't use -automatic; as a smaller difference, I do use
-whenneeded:3), but it's the same code. There might be differences in
the decisions it makes because of that (on the one hand, my bot makes
some changes that would not otherwise be made by another bot; on the
other hand, those are the changes that would be more likely to draw
protest). Still, it's mostly the same code.

> The amount of attention I pay to it is directly
> proportional to the number of wikis my bot operates on (otherwise it is a
> waste of time, as explained above, since another interwiki bot would simply
> revert). I do not have the time for that at the moment because I have 665
> wikis on which to request a bot flag, which I would rather not do.

If people want to give my bot a bot flag, I'm happy with that. If they
prefer to have it run without a bot flag, or if they don't mind either
way, I'm happy with that too. Basically, I just start running my bot
on a wiki, and sometimes (often 2 years or so afterward) I get told
that I should request a bot flag. I still think that's silly - I can
imagine having to ask permission, but the bot flag is for the benefit
of the normal users, not for me, so I don't see any reason why I
should be the one to ask for it.


--
Andre Engels, [hidden email]
ICQ: 6260644  --  Skype: a_engels
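[Effeietsanders's suggestion of scripting the requests, and the "665 wikis" workload above, can be sketched without any MediaWiki plumbing: generate the request wikitext once per wiki and hand the postings to whatever bot framework is in use. The bot name, wiki codes, and message template below are all invented for illustration; actually posting would go through a framework such as pywikipedia.]

```python
# Hypothetical sketch of the scripted bot-flag request idea: build one
# request message per target wiki so the operator does not have to write
# hundreds of posts by hand. The template, bot name, and wiki codes are
# invented for illustration only.

REQUEST_TEMPLATE = (
    "== Bot flag request for {bot} ==\n"
    "I request a bot flag for [[User:{bot}]], an interwiki bot running "
    "the standard interwiki.py code on {count} wikis. ~~~~"
)

def build_requests(bot_name, wiki_codes):
    """Return {wiki code: request wikitext} for every target wiki."""
    return {
        code: REQUEST_TEMPLATE.format(bot=bot_name, count=len(wiki_codes))
        for code in wiki_codes
    }

requests = build_requests("ExampleBot", ["en", "de", "fr", "nds"])
print(len(requests))                                     # 4
print(requests["de"].startswith("== Bot flag request"))  # True
```

As effeietsanders notes, this only eases the filing step; the stewards would still grant each flag separately.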

Re: Bot policy on bots operating interwiki

Andre Engels
In reply to this post by Thomas Dalton
2007/9/8, Thomas Dalton <[hidden email]>:
> > I think SUL is a distant dream at this point.
>
> Really? From what I've heard, the devs are making pretty good progress
> towards it.

That's what I heard too. But 'good progress towards it' is farther
from done than what I heard one year ago (then it was essentially
ready, and would be implemented once the board elections had
finished), and about the same as what I heard two and a half years
ago. I guess it will come some day, but whether that day is next week
or in ten years, I have stopped having any expectations.


--
Andre Engels, [hidden email]
ICQ: 6260644  --  Skype: a_engels
