[MediaWiki-l] Mediawiki article export


[MediaWiki-l] Mediawiki article export

John Foster-2
Is there any way to export ALL the articles and/or pages from a very slow
but working MediaWiki? I want to move them to a much faster, upgraded
MediaWiki server.
I have tried the dumpBackup script in /maintenance, but that didn't get
all the pages, only some, and I don't know why. Any tips are appreciated.
Thanks
john
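For reference, the maintenance dump the poster describes is normally run from the wiki root roughly as below; the paths, filenames, and the page-count check are assumptions for illustration, not the poster's actual commands.

```shell
# Sketch of a full-history dump plus a completeness check (paths and
# filenames are assumptions). On the old server, from the wiki root:
#   php maintenance/dumpBackup.php --full --quiet > /tmp/wiki-dump.xml
# Then compare the number of <page> elements in the XML against the
# page count the wiki reports on Special:Statistics.
count_pages() {
    grep -c '<page>' "$1"
}

# Demo on a miniature dump so the check itself can be exercised:
cat > /tmp/mini-dump.xml <<'EOF'
<mediawiki>
  <page><title>A</title></page>
  <page><title>B</title></page>
</mediawiki>
EOF
count_pages /tmp/mini-dump.xml   # prints 2
```

If the count falls short of what Special:Statistics reports, the dump stopped early; the PHP error log on the old server may say why.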


_______________________________________________
MediaWiki-l mailing list
[hidden email]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l

Re: Mediawiki article export

Yan Seiner
John W. Foster wrote:
> Is there any way to export ALL the articles and/or pages from a very slow
> but working MediaWiki? I want to move them to a much faster, upgraded
> MediaWiki server.
> I have tried the dumpBackup script in /maintenance, but that didn't get
> all the pages, only some, and I don't know why. Any tips are appreciated.
> Thanks
> john
>
>  
If it's the same version of MediaWiki, you can always try dumping the database directly and importing it into MySQL on the new server. I'm not sure, but you might have to recreate the exact file structure as well....
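That database route, sketched as commands; the database name, user, and filename below are placeholders, not the poster's actual setup.

```shell
# Hedged sketch of a straight database copy between same-version wikis.
DUMP="wikidb-$(date +%Y%m%d).sql"   # timestamped so retries don't clobber
# On the old server:
#   mysqldump --single-transaction --default-character-set=binary \
#       -u wikiuser -p wikidb > "$DUMP"
# On the new server:
#   mysql -u wikiuser -p wikidb < "$DUMP"
echo "$DUMP"
```

The binary character set is the commonly recommended guard against MySQL re-encoding MediaWiki's binary-flagged text columns during the dump.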

--
Project Management Consulting and Training
http://www.ridgelineconsultingllc.com



Re: Mediawiki article export

John Foster
On Mon, 2013-10-21 at 16:46 -0700, Yan Seiner wrote:

> If it's the same version of MediaWiki, you can always try dumping the database directly and importing it into MySQL on the new server. I'm not sure, but you might have to recreate the exact file structure as well....
>
Thanks.
I am aware of that solution, and in fact it is my preferred method for
moving a wiki. However, the reason the MediaWiki is slow is a totally
messed-up MySQL database, and I don't know how to fix it; I tried
for over a year, as the wiki has thousands of pages/articles. Therefore
I don't want to move that database's table structure into the new,
properly functioning wiki.
Anything else, maybe.



Re: Mediawiki article export

WJhonson
How about a script that Googles site:www.myurl.com and then walks every page and copies it ;)

Re: Mediawiki article export

John Doe-27
I can do a little better than that: I can whip up something that copies
everything based off Special:Allpages. If you want me to do that, drop me an
email off-list and we can work out the details. ~~~~
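One way such a copier could avoid scraping HTML at all is to feed a title list to the wiki's built-in Special:Export page, which emits the same XML that importDump.php reads. A hypothetical sketch; the wiki URL and the titles are placeholders, not the poster's actual script.

```shell
# Hypothetical sketch: export pages by title via Special:Export instead
# of walking Special:Allpages in a browser.
#   curl -s 'http://old-wiki/index.php/Special:Export' \
#        --data-urlencode 'pages@/tmp/titles.txt' \
#        --data 'curonly=0' > export.xml        # curonly=0 keeps history
# Special:Export expects one title per line; build and sanity-check the list:
printf '%s\n' 'Main Page' 'Help:Contents' 'Project:About' > /tmp/titles.txt
wc -l < /tmp/titles.txt   # prints 3
```

The resulting export.xml could then be loaded on the new server with maintenance/importDump.php.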

On Mon, Oct 21, 2013 at 8:19 PM, Wjhonson <[hidden email]> wrote:

> How about a script that Googles site:www.myurl.com and then walks every
> page and copies it ;)

Re: Mediawiki article export

Tom Hutchison
In reply to this post by John Foster
> On Oct 21, 2013, at 7:58 PM, John Foster <[hidden email]> wrote:
> Thanks.
> I am aware of that solution, and in fact it is my preferred method for
> moving a wiki. However, the reason the MediaWiki is slow is a totally
> messed-up MySQL database, and I don't know how to fix it; I tried
> for over a year, as the wiki has thousands of pages/articles.

Dump the db.

Use something like XAMPP to recreate the wiki locally.

Raise the PHP script timeout to unlimited, then run the dump script from maintenance.

A roundabout way, yes, but you have specific issues.

Tom

Re: Mediawiki article export

Jan Steinman
In reply to this post by Yan Seiner
cd root-of-destination
wget -r http://site.to.be.copied/wiki

But I don't think that's what the OP wants.

I'd just copy the database tables over. But since he's copying to an "upgraded server," that may not work well.

Maybe copy the database tables over to an install of the same rev, then upgrade the new server.


:::: If all the advertising in the world were to shut down tomorrow, would people still go on buying more soap, eating more apples, giving their children more vitamins, roughage, milk, olive oil, scooters and laxatives, learning more languages by iPod, hearing more virtuosos by radio, re-decorating their houses, refreshing themselves with more non-alcoholic thirst-quenchers, cooking more new, appetizing dishes, affording themselves that little extra touch which means so much? Or would the whole desperate whirligig slow down, and the exhausted public relapse upon plain grub and elbow-grease? -- Dorothy Sayers
:::: Jan Steinman, EcoReality Co-op ::::
:::: (Send email to [hidden email] to get a random quote, or [hidden email] to get 50 random quotes. Put a word in the Subject line to filter for that word.)



Re: Mediawiki article export

John Foster
On Mon, 2013-10-21 at 22:44 -0700, Jan Steinman wrote:
> cd root-of-destination
> wget -r http://site.to.be.copied/wiki

I think that would just copy the directory contents of the existing wiki
to the new root, and that will not solve my issue.

>
> But I don't think that's what the OP wants.
>
> I'd just copy the database tables over. But since he's copying to an "upgraded server," that may not work well.

That is exactly what happened to CAUSE the issue with the old wiki. I
did a MySQL database dump and imported it into a new UPGRADED wiki, and the
tables were all wrong. It takes forever for the wiki to find the
articles. It does eventually..... but that is just unsatisfactory.

>
> Maybe copy the database tables over to an install of the same rev, then upgrade the new server.
Hmmm, haven't tried that. I figure that would either work or crap out
the old, slow but still working wiki. I may try that as a last resort.
Thanks

>
> On 2013-10-21, at 17:43, [hidden email] wrote:
>
> > From: John <[hidden email]>
> >
> > I can do a little better than that: I can whip up something that copies
> > everything based off Special:Allpages. If you want me to do that, drop me an
> > email off-list and we can work out the details. ~~~~
Maybe
> >
> > On Mon, Oct 21, 2013 at 8:19 PM, Wjhonson <[hidden email]> wrote:
> >
> >> How about a script that Googles site:www.myurl.com and then walks every
> >> page and copies it ;)
Beyond my skill set. LOL
