Re: Making an offline [HTML|PDF] copy of my local


Leon Kolchinsky
Hi,

Use Teleport, or any similar program, to fetch a local copy of your site.
Then make a PDF from it :)
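If the fetched copy is plain static HTML, one way to turn it into a PDF is the htmldoc command-line tool (a sketch, assuming htmldoc is installed; the page filenames are placeholders for whatever the fetch produced):

```shell
# Render a set of fetched HTML pages into a single PDF.
# 'htmldoc' is an assumption (any HTML-to-PDF converter would do);
# Main_Page.html / Other_Page.html are placeholder filenames.
htmldoc --webpage -f wiki.pdf Main_Page.html Other_Page.html
```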

Regards,
Leon Kolchinsky




----------------
Hello to all,

I have a problem making an offline copy of my own local wiki.

The scenario is as follows:
All the admin-related stuff, and also the whole documentation of our
infrastructure, is stored on a webserver. So the document describing
how to restore the webserver if it crashes is located on that same
webserver... which is why I need an offline copy of my local wiki.
(I know this is not the perfect solution, and there is a backup of the
wiki, but I need a simple way to keep the important documents stored
on my local computer.)

I've searched the web and found some solutions:

o Script 'dumpHTML.php' in the 'maintenance' folder:
The script works and stores all the wiki pages as static HTML pages
in a given folder ('php dumpHTML.php -d /my/folder/')
BUT
the images are lost if you copy the directory to your local PC (under
Apache), and the relative links inside the pages are broken (why is
there '../../' in front of all links?). What is the intention of this
script? Am I doing something wrong?
Links:
- http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/maintenance/dumpHTML.php?view=markup
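One workaround for the '../../' prefixes, assuming the dumped pages all end up in a single flat directory: strip the prefix in place with find and sed. The sketch below runs against a throwaway temporary file so it is safe to try; point DUMP at the real dump directory instead.

```shell
# Demo on a temporary directory; use DUMP=/my/folder for a real dump.
DUMP=$(mktemp -d)
printf '<a href="../../images/Foo.png">img</a>\n' > "$DUMP/Page.html"

# Strip the '../../' prefix that dumpHTML.php prepends to every link
# (the script assumes articles are nested two directories deep):
find "$DUMP" -name '*.html' -exec sed -i 's|\.\./\.\./||g' {} +

cat "$DUMP/Page.html"
```

After the run, the link reads 'href="images/Foo.png"', so an 'images' directory copied next to the pages will resolve.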

o Several alternative parsers:
See http://meta.wikimedia.org/wiki/Alternative_parsers for more
information.
I tried 'HTML2FPDF and Mediawiki'
(http://meta.wikimedia.org/wiki/HTML2FPDF_and_Mediawiki), but I did
not get it working. It is also not good that you have to change some
main files of the MediaWiki installation.
The other projects are IMHO not 'ready', or their intention is
something different.

o wget to mirror the wiki site
I also tried mirroring the wiki with
'wget -m http://mydomain.com/mywikidirectory/' (I also tried the URL
http://mydomain.com/mywikidirectory/index.php/MainSite), but it does
not mirror only the wiki; it also mirrors the whole site at
'http://mydomain.com/'. Why? Can I customize the wget command so that
it only mirrors pages under 'http://mydomain.com/mywikidirectory/'?
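The behaviour comes from wget following links up into parent directories during the recursive crawl; the '--no-parent' option restricts the mirror to the starting directory. A sketch using the placeholder URL from the message above:

```shell
# -m   mirror: recursive fetch with timestamping and infinite depth
# -np  --no-parent: never ascend above the start directory, so only
#      pages under /mywikidirectory/ are fetched, not the whole site
# -k   --convert-links: rewrite links in saved pages for local browsing
# -p   --page-requisites: also fetch the images and CSS the pages need
wget -m -np -k -p http://mydomain.com/mywikidirectory/
```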


Are there any other solutions or suggestions? I think there is a need
for such an offline copy, because I have found many questions asking
for a tool like this. Is there any chance to customize the
dumpHTML.php script?

Best regards.

Carsten Marx


Links:
- http://meta.wikimedia.org/wiki/Talk:MediaWiki_FAQ#HTML_export

My wiki installation:
- MediaWiki: 1.7.1
- PHP: 5.1.4-Debian-0.1~sarge1 (apache2handler)
- MySQL: 4.1.11-Debian_4sarge5-log


------------------------------
_______________________________________________
MediaWiki-l mailing list
[hidden email]
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l