MediaWiki 1.18 release plans

MediaWiki 1.18 release plans

Jeroen De Dauw-2
Hey,

I'm curious as to what the release plans for 1.18 are. 1.18wmf1 has been
deployed on the WMF wikis, but there is still no beta release for 1.18.
When is such an initial beta planned, and what's the target date for the
actual 1.18 release?

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
_______________________________________________
MediaWiki-l mailing list
[hidden email]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l

Re: MediaWiki 1.18 release plans

Siebrand Mazeland
Hi Jeroen,

> Jeroen De Dauw wrote:
> I'm curious as to what the release plans for 1.18 are. 1.18wmf1 has been
> deployed on the WMF wikis, but there is still no beta release for 1.18.
> When is such an initial beta planned, and what's the target date for the
> actual 1.18 release?

Completion of the 1.18 deployment for Wikimedia sites was followed by
multi-day "all staff" and "tech" meetings in San Francisco. Because of
this, resources have been spread extremely thin this week.

This weekend ops and others deeply involved in the 1.18 release process
are participating in the New Orleans hackathon. Today they are en route to
New Orleans.

It simply has not yet been possible to push out a tarball with a beta of
MediaWiki 1.18.

Monday or Tuesday, things will largely be back to normal, and I think it's
safe to say that many of the WMF techies also want to get a beta of 1.18
out the door, and will push hard to make that happen next week.

Siebrand



Re: MediaWiki 1.18 release plans

Mark A. Hershberger
"Siebrand Mazeland" <[hidden email]> writes:

> This weekend ops and others deeply involved in the 1.18 release process
> are participating in the New Orleans hackathon. Today they are en route to
> New Orleans.

I'll be holding a triage here in New Orleans of the 1.18 regressions
that we saw when we deployed 1.18 to enwiki.  You can see the list of
bugs, with notes (to be added shortly), here: http://hexm.de/8b

Fair warning: it is likely that some of these will turn into tarball
blockers.  I hope to use this time in New Orleans to start working on an
acceptance test suite so that we don't have a big batch of regressions
like this again for 1.19.

--
Mark A. Hershberger
Bugmeister
Wikimedia Foundation
[hidden email]


Re: Autolink to Wikipedia

WJhonson

Right now the logic is: if you have a link using the [[ ]] syntax, the code checks whether a page with that name exists.

What I want to do is add a fallback: if there is NOT, it then checks whether a page with that name exists at en.wikipedia.

Is there already an extension that does that?  Does anyone see any problem with modifying the code to do this?
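To make the intended behaviour concrete, here is a minimal sketch of the lookup order being asked for (not an actual extension; the function and names are hypothetical): resolve a [[...]] target locally first, fall back to en.Wikipedia, and only render a redlink if neither wiki has the page.

```python
def resolve_link(title, local_titles, enwiki_titles):
    """Return where a [[title]] link should point: 'local', 'enwiki', or 'red'.

    local_titles and enwiki_titles stand in for whatever existence checks
    the wiki performs (a local DB lookup and a remote API query).
    """
    if title in local_titles:
        return "local"
    if title in enwiki_titles:
        return "enwiki"
    return "red"
```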



Re: Autolink to Wikipedia

Daniel Friesen-4
Could be a little tricky to implement efficiently. I don't know if we
have a way to batch the links parsed from the parser, i.e. without
knowing titles up front it would be an HTTP query to WP for each red [[ ]]
you have in the page.
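For what it's worth, the per-link queries could be collapsed: the MediaWiki web API accepts multiple pipe-separated titles per `action=query` request (up to 50 for normal users), so all redlinks on a page could be checked in a handful of HTTP calls. A rough sketch, assuming the standard `api.php` query response format where missing pages carry a `missing` key:

```python
import urllib.parse

API = "https://en.wikipedia.org/w/api.php"

def batch_query_urls(titles, batch_size=50):
    """Build one action=query URL per batch of up to batch_size titles."""
    urls = []
    for i in range(0, len(titles), batch_size):
        chunk = "|".join(titles[i:i + batch_size])
        params = {"action": "query", "titles": chunk, "format": "json"}
        urls.append(API + "?" + urllib.parse.urlencode(params))
    return urls

def missing_titles(api_response):
    """Titles the decoded response marks as missing (candidates for the WP fallback)."""
    pages = api_response.get("query", {}).get("pages", {})
    return sorted(p["title"] for p in pages.values() if "missing" in p)
```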

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

On 11-10-15 04:55 PM, Wjhonson wrote:
> Right now the logic is that if you have a link using the [[ ]] the code will check that there is a page with that name.
>
> What I want to do is add to that, that if there is NOT, then it checks to see if there is a page with that name at En.Wikipedia
>
> Is there already an extension that does that?  Does anyone see any problem with modding the code to do this?


Re: Autolink to Wikipedia

Jon Davis-5
In reply to this post by WJhonson
I read this question two ways.

#1 - You want an extension that will automatically link words to Wikipedia
that are not currently linked. This would be very difficult to
implement efficiently (with multi-word titles taken into account) and
would end up with half of your text as wikilinks, e.g. [1][2][3][4]

#2 - You want an extension that will replace redlinks, where possible, with
links to Wikipedia.  I don't know how the parser engine handles links like
that, but I'm sure a reasonable hack could be made.

Which were you talking about?
-Jon

[1] https://en.wikipedia.org/wiki/Right
[2] https://en.wikipedia.org/wiki/Now
[3] https://en.wikipedia.org/wiki/The
[4] https://en.wikipedia.org/wiki/Logic

On Sat, Oct 15, 2011 at 16:55, Wjhonson <[hidden email]> wrote:

>
> Right now the logic is that if you have a link using the [[ ]] the code
> will check that there is a page with that name.
>
> What I want to do is add to that, that if there is NOT, then it checks to
> see if there is a page with that name at En.Wikipedia
>
> Is there already an extension that does that?  Does anyone see any problem
> with modding the code to do this?



--
Jon
[[User:ShakataGaNai]] / KJ6FNQ
http://snowulf.com/
http://ipv6wiki.net/

Re: Autolink to Wikipedia

John Du Hart
I'm pretty sure it's #2

On Sun, Oct 16, 2011 at 7:30 PM, Jon Davis <[hidden email]> wrote:

> I read this question 2 ways.
>
> #1 - You want an extension that will automatically link words to Wikipedia,
> that are not currently linked. This would be very difficult to
> implement efficiently (with multi word titles being taken into account) and
> would end up with half of your text as wikilinks, IE [1][2][3][4]
>
> #2 - You want an extension that will replace redlinks, where possible, with
> links to wikipedia.  I don't know how the parser engine handles links like
> that, but I'm sure a hack could be made that would be reasonable.
>
> Which were you talking about?
> -Jon
>
> [1] https://en.wikipedia.org/wiki/Right
> [2] https://en.wikipedia.org/wiki/Now
> [3] https://en.wikipedia.org/wiki/The
> [4] https://en.wikipedia.org/wiki/Logic
>
> On Sat, Oct 15, 2011 at 16:55, Wjhonson <[hidden email]> wrote:
>
> >
> > Right now the logic is that if you have a link using the [[ ]] the code
> > will check that there is a page with that name.
> >
> > What I want to do is add to that, that if there is NOT, then it checks to
> > see if there is a page with that name at En.Wikipedia
> >
> > Is there already an extension that does that?  Does anyone see any
> problem
> > with modding the code to do this?



--
John

Re: Autolink to Wikipedia

John Mark Vandenberg
This might do the trick efficiently:

1. On save, identify redlinks and create pages as '#redirect
[[wikipedia:{{{PAGENAME}}}]]' if a Wikipedia page exists.

2. On HTML generation, or using JavaScript and the API, find linked
redirects containing interwikis and render them in the page as
external links to the redirect target, rather than as internal links
to a local page containing an external link.

If step two is done in PHP and cached, rather than JS, the cache goes
stale whenever a redirect page with an interwiki is turned into a
normal page with text, so all pages that link to the redirect page
will need to be regenerated.
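Step 1 of this scheme can be sketched in a few lines (a hypothetical helper, not working extension code; `exists_on_wp` stands in for whatever remote existence check is used):

```python
def interwiki_redirect(title):
    """Wikitext for a local stub page redirecting to the same title on Wikipedia."""
    return "#redirect [[wikipedia:%s]]" % title

def stubs_to_create(redlinks, exists_on_wp):
    """Map each redlink title that exists on Wikipedia to its redirect wikitext.

    Titles missing on both wikis are skipped and stay red locally.
    """
    return {t: interwiki_redirect(t) for t in redlinks if exists_on_wp(t)}
```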

--
John Vandenberg


Re: Autolink to Wikipedia

Daniel Friesen-4
On Sun, 16 Oct 2011 17:33:04 -0700, John Vandenberg <[hidden email]>  
wrote:

> this might do the trick efficiently:
>
> 1. on save, identify redlinks and create pages as '#redirect
> [[wikipedia:{{{PAGENAME}}}]]' if a wikipedia page exists
>
> 2. on html generation, or using JavaScript and the API, find linked
> redirects containing interwikis and render them in the page as
> external links to the redirect target, rather than as internal links
> to a local page containing a external link.
>
> If step two is done in php and cached, rather than JS, the cache is
> stale whenever a redirect page with an interwiki is turned into a
> normal page with text, so all pages that link to the redirect page
> will need to be regenerated.
>
Editing one of the redirects should trigger a refresh job for the other  
page.

--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]


Re: Autolink to Wikipedia

Platonides
In reply to this post by Daniel Friesen-4
Daniel Friesen wrote:
> Could be a little tricky to implement efficiently. I don't know if we
> have a way to batch the links parsed from the parser. ie: Without
> knowing titles up front it would be a HTTP query to WP for each red [[]]
> you have in the page.

We do batch the link existence checks.

Wjhonson wrote:
> Does anyone see any problem with modding the code to do this?
You don't change the core code. You create an extension that does what
you want.





Re: Autolink to Wikipedia

WJhonson

I agree that some people do that.  I don't particularly walk that path myself.




Platonides wrote: "You don't change the core code. You create an extension that does what
you want."







Re: MediaWiki 1.18 release plans

Brion Vibber
In reply to this post by Mark A. Hershberger
On Sat, Oct 15, 2011 at 9:14 AM, Mark A. Hershberger <
[hidden email]> wrote:

> "Siebrand Mazeland" <[hidden email]> writes:
>
> > This weekend ops and others deeply involved in the 1.18 release process
> > are participating in the New Orleans hackathon. Today they are en route
> to
> > New Orleans.
>
> I'll be holding a triage here in New Orleans of the 1.18 regressions
> that we saw when we deployed 1.18 to enwiki.  You can see the list of
> bugs here with notes (to be added shortly): http://hexm.de/8b
>
> Fair warning: it is likely that some of these will turn into tarball
> blockers.  I hope to use this time in New Orleans to start working on an
> acceptance test suite so that we don't have a big batch of regressions
> like this again for 1.19.
>

Any updated news on release plans?

-- brion

Re: MediaWiki 1.18 release plans

reedy
In reply to this post by Jeroen De Dauw-2
I was planning on working on the 1.18 release last week; however, most of my
time was absorbed by Contest-related work, so this didn't happen.

As of writing this email, there are 27 outstanding revisions tagged as 1.18
[1]. 26 are new, one is fixme. Not all revisions have to be merged to the
REL1_18 branch at this point. Any major bugs that have been fixed (and that
are applicable to 1.18) should be backported if they haven't been already;
that helps limit the number of duplicate bug reports we may get.

On the bug front, there are numerous outstanding bugs; not all of these need
to be fixed for 1.18 [2][3]. Please feel free to remove them as tarball
blockers.

The intention is to push a beta out this week (today, maybe; certainly by
the end of the week).

If anyone knows of anything that really "needs" to be in the beta, please
let me know ASAP.

Sam

[1] https://www.mediawiki.org/wiki/Special:Code/MediaWiki/tag/1.18
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=28425 - Tarball bugs
[3] https://bugzilla.wikimedia.org/show_bug.cgi?id=29876 - WMF deployment-related bugs
