Re: image rebuilding issue - Rich


Re: image rebuilding issue - Rich

kevin zhang
Rich,

To rebuild the images, run rebuildImages.php. Change directory so you are in
the maintenance folder first.
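
For example (a minimal sketch; the install path is an assumption, adjust it
to your wiki root):

    cd /var/www/mediawiki/maintenance
    php rebuildImages.php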

Thanks,

Kevin


On Wed, Feb 6, 2019 at 7:00 AM <[hidden email]>
wrote:

> Send MediaWiki-l mailing list submissions to
>         [hidden email]
>
> To subscribe or unsubscribe via the World Wide Web, visit
>         https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
> or, via email, send a message with subject or body 'help' to
>         [hidden email]
>
> You can reach the person managing the list at
>         [hidden email]
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of MediaWiki-l digest..."
>
>
> Today's Topics:
>
>    1. Re: help importing image into a new wiki (Manuela)
>    2. Re: help importing image into a new wiki
>       (Evans, Richard K. (GRC-H000))
>    3. Re: What's the best way to improve performance, with regard
>       to edit rate (Ariel Glenn WMF)
>    4. Re: What's the best way to improve performance, with regard
>       to edit rate (Brian Wolff)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 5 Feb 2019 10:01:14 -0700 (MST)
> From: Manuela <[hidden email]>
> To: [hidden email]
> Subject: Re: [MediaWiki-l] help importing image into a new wiki
> Message-ID: <[hidden email]>
> Content-Type: text/plain; charset=us-ascii
>
> Hi Rich,
>
> I don't know why your approach does not work.
>
> This is what I do: I export the database as SQL using phpMyAdmin, import it
> to the new host, and copy the images folder. This has always worked for me.
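>
> Roughly, the shell equivalent of that workflow would be (database name,
> user, and paths are placeholders):
>
>     mysqldump -u wikiuser -p wikidb > wikidb.sql      # on the old host
>     mysql -u wikiuser -p wikidb < wikidb.sql          # on the new host
>     rsync -a /old/wiki/images/ /new/wiki/images/      # copy the images folder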
>
> Manuela
>
> ------------------------------
>
> Message: 2
> Date: Tue, 5 Feb 2019 17:13:25 +0000
> From: "Evans, Richard K. (GRC-H000)" <[hidden email]>
> To: MediaWiki announcements and site admin list
>         <[hidden email]>
> Subject: Re: [MediaWiki-l] help importing image into a new wiki
> Message-ID:
>         <[hidden email]>
> Content-Type: text/plain; charset="utf-8"
>
> Unfortunately I'm not exporting the database SQL directly as you are ..
> I'm using the exportDump.php maintenance script to get the pages and page
> content as an XML file. Your approach works because your database
> transfer includes all data in all tables .. my approach is just the page
> names and page content as XML, imported as new articles.
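>
> For reference, the XML round trip is roughly the following (paths are
> placeholders; in current MediaWiki the stock exporter is dumpBackup.php):
>
>     # on the old 1.17 wiki: dump page names and content as XML
>     php maintenance/dumpBackup.php --full > pages.xml
>
>     # on the new 1.30 wiki: import the XML as new articles
>     php maintenance/importDump.php < pages.xml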
>
> I can't (really don't want to) export the database directly as SQL because
> it's a very old version, MW 1.17 .. and the new site is MW 1.30.
>
> The XML export from 1.17 and import to 1.30 went wonderfully well (except
> no images) .. I have the images folder from the 1.17 site copied into the
> 1.30 site, and I just need to figure out how to tell MW to "rebuild the
> image pages" from the images folder.
>
> Anyone?
>
> /Rich
>
> ------------------------------
>
> Message: 3
> Date: Wed, 6 Feb 2019 10:54:00 +0200
> From: Ariel Glenn WMF <[hidden email]>
> To: MediaWiki announcements and site admin list
>         <[hidden email]>
> Subject: Re: [MediaWiki-l] What's the best way to improve performance,
>         with regard to edit rate
> Message-ID:
>         <
> [hidden email]>
> Content-Type: text/plain; charset="UTF-8"
>
> It's not split up (sharded) across servers, at least as far as the page and
> revision tables go. There is one active master at any given time that
> handles all writes; the current host has 160GB of memory and 10 physical
> cores (20 with hyperthreading). The actual revision *content* for all
> projects is indeed split up across several servers, in an 'external
> storage' cluster. The current server configuration is available at
> https://noc.wikimedia.org/conf/highlight.php?file=db-eqiad.php
>
> You can get basic specs as well as load information on these servers by
> looking up each one in Grafana; here's db1067 (the current enwiki master)
> as an example:
>
> https://grafana.wikimedia.org/d/000000607/cluster-overview?orgId=1&var-datasource=eqiad%20prometheus%2Fops&var-cluster=mysql&var-instance=db1067
>
> Ariel
>
> On Wed, Nov 28, 2018 at 4:34 PM Hershel Robinson <[hidden email]>
> wrote:
>
> > > I've heard that Wikimedia splits their enwiki database up among more
> > > than one server; is that how they're able to handle several page saves
> > > per second on the master?
> >
> > That is correct. See here
> > https://meta.wikimedia.org/wiki/Wikimedia_servers for more details.
> >
> > --
> > http://civihosting.com/
> > Simply the best in shared hosting
> >
>
>
> ------------------------------
>
> Message: 4
> Date: Wed, 6 Feb 2019 10:46:03 +0000
> From: Brian Wolff <[hidden email]>
> To: MediaWiki announcements and site admin list
>         <[hidden email]>
> Subject: Re: [MediaWiki-l] What's the best way to improve performance,
>         with regard to edit rate
> Message-ID:
>         <CA+oo+DWr-qEeL=
> [hidden email]>
> Content-Type: text/plain; charset="UTF-8"
>
> What is your caching setup (e.g. $wgMainCacheType and friends)? Caching
> probably has more of an effect on read time than save time, but it will
> also have an effect on save time, probably a significant one. If it's just
> one server, APCu (i.e. CACHE_ACCEL) is probably the easiest to set up.
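>
> A minimal sketch of that, assuming the APCu PHP extension is already
> installed and LocalSettings.php is writable:
>
>     # append the object-cache setting (a PHP line) to LocalSettings.php
>     echo '$wgMainCacheType = CACHE_ACCEL;' >> LocalSettings.php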
>
> There are a number of factors that can affect page save time. In many
> cases it depends on the content of your page edits (e.g. if your edits
> include lots of embedded images, 404 handling can result in significant
> improvements).
>
> The first step I would suggest is profiling -
> https://www.mediawiki.org/wiki/Manual:Profiling - this will tell you which
> part is slow, and we can give more specific advice based on that.
>
> --
> Brian
>
> On Wed, Nov 28, 2018 at 2:17 PM Star Struck <[hidden email]>
> wrote:
>
> > The server is just the localhost that I use for testing; it's Apache and
> > MySQL running on Ubuntu 18.04 on an HP Elite 3.0 GHz with 4GB of RAM
> > <https://www.amazon.com/HP-Elite-Professional-Certified-Refurbished/dp/B0094JF1HA>.
> > I haven't really figured out what kind of hardware or software I want to
> > use for production, because I haven't done a lot of server administration.
> > (I've typically just used a VPS when I needed web hosting, but perhaps my
> > needs have now expanded beyond that, because this is going to be a huge
> > wiki, on the same scale as Wikipedia; although my main concern at the
> > moment is making page saves, rather than page loads, more efficient, since
> > I don't necessarily anticipate having a lot of visitors from the Internet
> > reading the wiki, or else I'd be focusing more on things like caching; I
> > mostly just want to set up a workable proof-of-concept for the moment.)
> >
> > I've heard that Wikimedia splits their enwiki database up among more than
> > one server; is that how they're able to handle several page saves per
> > second on the master?
> >
> > On Wed, Nov 28, 2018 at 8:19 AM Hershel Robinson <[hidden email]>
> > wrote:
> >
> > > > What's the best way to boost performance ...
> > >
> > > Depends on a myriad of factors, such as the OS, web server, database,
> > > hardware, etc.
> > >
> > > The simplest answer is to increase your hardware resources, meaning if
> > > the site has one CPU, give it two. For anything more specific, we
> > > would need more details about the server software and hardware.
> > >
> > > Hershel
> > >
>
>
> ------------------------------
>
> End of MediaWiki-l Digest, Vol 185, Issue 5
> *******************************************
>

Re: image rebuilding issue - Rich

rkevans
Hmm.. it didn't find any images:

> [user@server maintenance]$ wiki WIKI=MyWiki php rebuildImages.php
> Processing image...
> Finished image... 0 of 0 rows updated
> Processing oldimage...
> Finished oldimage... 0 of 0 rows updated
> [user@server maintenance]$

Any thoughts as to why?

-----Original Message-----
From: MediaWiki-l [mailto:[hidden email]] On Behalf Of Kevin Zhang
Sent: Wednesday, February 06, 2019 9:03 AM
To: [hidden email]
Subject: Re: [MediaWiki-l] image rebuilding issue - Rich

Rich,

To rebuild the images, run rebuildImages.php. Change directory so you are in the maintenance folder first.

Thanks,

Kevin


_______________________________________________
MediaWiki-l mailing list
To unsubscribe, go to:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l