[MediaWiki-l] non-server side wiki dumps

[MediaWiki-l] non-server side wiki dumps

John Doe-27
Are there any tools that allow you to create a dump of a MediaWiki install
without direct database access? My primary focus is on creating a backup of
the wiki contents.
Thanks
_______________________________________________
MediaWiki-l mailing list
To unsubscribe, go to:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l

Re: non-server side wiki dumps

Huib Laurens
XML export, if enabled, should do the trick.
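For reference, Special:Export can be driven directly over HTTP. A minimal sketch of building such a request; the hostname is an example, and exactly which parameters a given wiki honours can vary, so treat this as illustrative:

```python
# Rough sketch: constructing a Special:Export GET URL by hand.
# The base URL is a placeholder; parameter support may differ per wiki.
import urllib.parse

def export_url(base, titles, current_only=True):
    """Build a GET URL for Special:Export covering the given page titles."""
    params = {
        "title": "Special:Export",
        "pages": "\n".join(titles),          # newline-separated page titles
        "action": "submit",
        "curonly": "1" if current_only else "0",  # latest revision only
    }
    return base + "/index.php?" + urllib.parse.urlencode(params)

# e.g. export_url("https://wiki.example.org/w", ["Main Page", "Help:Contents"])
```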

On Fri, 15 Mar 2019 at 18:04, John <[hidden email]> wrote:

--
Kind regards,

Huib Laurens

Re: non-server side wiki dumps

John Doe-27
Unfortunately that doesn’t really work unless it’s a fairly small wiki; on
a larger wiki the export request times out.

What I was hoping for was a tool that uses either Special:Export or the API
to build a dump file across multiple requests.

I can probably write something to do this myself, but I was hoping for an
existing solution.
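Roughly, such a tool could page through the action API with `generator=allpages` and the `export` flag, following the `continue` token between requests so no single request is big enough to time out. A minimal sketch under those assumptions (the endpoint URL is an example, and the `fetch` hook is injectable so the continuation loop can be exercised without a live wiki):

```python
# Sketch of the kind of tool described above: dump a wiki's pages in
# batches via the action API. Real wikis may also need auth, throttling,
# or maxlag handling; this only shows the continuation loop.
import json
import urllib.parse
import urllib.request

API = "https://wiki.example.org/w/api.php"  # hypothetical endpoint

def dump_all_pages(fetch=None, batch=50):
    """Yield one <mediawiki> export-XML chunk per batch of pages.

    `fetch` maps a URL to a decoded JSON response; it is injectable so the
    loop can be tested offline.
    """
    if fetch is None:
        def fetch(url):
            with urllib.request.urlopen(url) as resp:
                return json.load(resp)

    cont = {}
    while True:
        params = {
            "action": "query",
            "generator": "allpages",
            "gaplimit": str(batch),
            "export": "1",       # wrap the batch in export XML
            "format": "json",
        }
        params.update(cont)      # e.g. {"gapcontinue": ..., "continue": ...}
        data = fetch(API + "?" + urllib.parse.urlencode(params))
        yield data["query"]["export"]["*"]
        if "continue" not in data:
            break                # no more batches
        cont = data["continue"]
```

Concatenating the yielded chunks (or feeding them to importDump.php one at a time) would give a full content backup without touching the database.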

On Fri, Mar 15, 2019 at 1:12 PM Sterkebak <[hidden email]> wrote:

> XML export if enabled should do the trick.
>
> On Fri, 15 Mar 2019 at 18:04, John <[hidden email]> wrote:
>
> > Are there any tools that allow you to create a dump of a mediawiki
> install
> > that doesn’t require direct database access? My primary focus is on
> > creating a backup of the wiki contents.
> > Thanks
> > _______________________________________________
> > MediaWiki-l mailing list
> > To unsubscribe, go to:
> > https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
> >
> --
> Met vriendelijke groet,
>
> Huib Laurens
> _______________________________________________
> MediaWiki-l mailing list
> To unsubscribe, go to:
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>
_______________________________________________
MediaWiki-l mailing list
To unsubscribe, go to:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Reply | Threaded
Open this post in threaded view
|

Re: non-server side wiki dumps

Isarra Yos
Not exactly the same thing, but there is a set of grabber scripts designed
to fetch and import an entire wiki via the API. Currently these are mostly
MediaWiki maintenance scripts plus a few Python scripts, and they expect to
be used with an actual second MediaWiki instance.

That said, it would probably be a good idea to rearchitect them to be less
redundant with each other, and in doing so also make their usage a bit more
flexible, for instance for creating dumps and backups instead. Whatever the
case, at the very least they might make a useful reference point.

See: https://www.mediawiki.org/wiki/Manual:Grabbers

-I

On 15/03/2019 18:43, John wrote:


Re: non-server side wiki dumps

K. Peachey-2
There is also MWOffliner, which might work for you; it is maintained by
the Kiwix team, IIRC.

On Sun, 17 Mar 2019 at 00:04, Isarra Yos <[hidden email]> wrote:


Re: non-server side wiki dumps

Greg Rundlett (freephile)
I would think the scripts Isarra mentioned would work best for MediaWiki.
That said, there are a large number of 'archiving' tools that let you take
copies of websites for historical preservation and backup:

e.g.
https://github.com/pirate/ArchiveBox/wiki/Web-Archiving-Community#The-Master-Lists
One noteworthy group is the 'Archive Team': https://www.archiveteam.org/

~ Greg

Greg Rundlett
https://eQuality-Tech.com
https://freephile.org


On Sat, Mar 16, 2019 at 8:18 PM K. Peachey <[hidden email]> wrote:
