Use Teleport, or any similar program, to fetch a local copy of your
site. Then make a PDF from it :)
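If Teleport is not at hand, wget does the same job and is free; a
minimal sketch, assuming the wiki answers at http://wiki.example.com/
(a placeholder, use your own URL):

  # Mirror the site, pull in the images/CSS each page needs, and
  # rewrite the links so the copy can be browsed offline.
  wget --mirror --page-requisites --convert-links --no-parent \
       http://wiki.example.com/wiki/

--convert-links rewrites the links after the download finishes, which
also sidesteps the broken-relative-link problem described below.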
Hello to all,

I have a problem with making an offline copy of my own local wiki.
The scenario: all the admin-related material, and also the entire
documentation of our infrastructure, is stored on a webserver. So the
document describing how to restore the webserver if it crashes is
located on that same webserver... which is why I need an offline copy
of my local wiki. (I know this is not the perfect solution, and there
is a backup of the wiki, but I need a simple way to keep the
important documents stored on my local PC.)
I've searched the web and found some solutions:
o Script 'dumpHTML.php' in the 'maintenance' folder:
The script works and stores all the wiki pages as static HTML pages
in a given folder ('php dumpHTML.php -d /my/folder/'). But the images
are lost if you copy the directory to your local PC (served by
Apache), and the relative links inside the pages are broken (why is
there '../../' in front of all links?). What is the intention of this
script? Am I doing something wrong? (My guess at what is going on
follows right after this item.)
- http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/maintenance/dumpHTML.php?view=markup
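For reference, here is exactly what I ran, plus my guess at the fix;
a sketch only, assuming the dump goes to /my/folder/ and the wiki's
upload directory is /var/www/wiki/images (both paths are from my
setup):

  # Run from the MediaWiki root directory.
  php maintenance/dumpHTML.php -d /my/folder/

  # My guess: the '../../' prefixes are relative links climbing from
  # the hashed article subdirectories back to the dump root, so they
  # only resolve if the whole tree is copied intact:
  rsync -a /my/folder/ /var/www/html/wikidump/

  # The images were not copied into my dump, so presumably the
  # upload directory has to be brought along as well:
  rsync -a /var/www/wiki/images/ /var/www/html/wikidump/images/

Is that the intended workflow, or does the script have an option for
this that I am missing?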
o Several alternative parsers:
See http://meta.wikimedia.org/wiki/Alternative_parsers for more
information. I tried the 'HTML2FPDF and Mediawiki' approach
(http://meta.wikimedia.org/wiki/HTML2FPDF_and_Mediawiki) but I did
not get it working. It is also not good that you have to change some
core files of the MediaWiki installation.
The other projects are IMHO not 'ready', or their intention is
something different. (For the PDF step itself, an alternative tool is
sketched below.)
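On the PDF side, one tool I know works on plain static HTML is
htmldoc; it is nothing MediaWiki-specific and mostly ignores CSS, so
treat this as a rough sketch (the page and output names are just
examples):

  # Convert one dumped page to PDF; --webpage treats the input as an
  # unstructured web page rather than a structured book.
  htmldoc --webpage -f Main_Page.pdf /my/folder/Main_Page.html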
Are there any other solutions or suggestions? I think there is a need
for such an offline copy, because I found many questions asking for a
tool like this. Is there any chance to customize the dumpHTML.php
script?