Populating the help namespace

Populating the help namespace

Terry Auspitz
Hello all,

I've tried searching, but with little success. I'm hopeful that someone
can explain to me, in reasonably small words, how to properly populate
the "Help" namespace on my new Wiki.

Ideally, if there's a document floating around somewhere that I've
missed, please point me in that direction (after all 'RTFM' *is* a
reasonably small word).

If it helps, the wiki is at http://www.wikitography.org and is third-party
hosted on Red Hat.

Thanks in advance for any help you can provide.
~Terry
_______________________________________________
MediaWiki-l mailing list
[hidden email]
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
Re: Populating the help namespace

Bugzilla from rowan.collins@gmail.com
On 17/01/06, Terry Auspitz <[hidden email]> wrote:
> I've tried searching, but with little success. I'm hopeful that someone
> can explain to me, in reasonably small words, how to properly populate
> the "Help" namespace on my new Wiki.

Unfortunately, there is no easy way (unless someone implemented one
while I wasn't looking) of downloading all the documentation developed
on meta (or the other set which seems to have sprung up on
mediawiki.org). (I guess we could really do with an
export-by-namespace option, which would work if we moved the relevant
templates from their current Template:h* home...)

The best you can do, I think, is get (or make) a list of all the pages
you want, use Special:Export to plonk them all in an XML file, and
then Special:Import (that does work now, right folks?) to shove them
in your own wiki. I make no guarantees about the technical or legal
validity of these instructions, but it oughta work...
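The steps above can be sketched as a shell dry run. This is a sketch only: the wiki URL and page names are placeholders, not a tested recipe, and templates still have to be listed by hand.

```shell
# Sketch: build a Special:Export request for a batch of help pages.
# Special:Export accepts a POST whose 'pages' field lists one title per
# line; 'curonly=1' skips old revisions. Templates are NOT pulled in
# automatically -- add them to the list yourself.
WIKI="http://meta.wikimedia.org/wiki/Special:Export"
PAGES="Help:Editing
Help:Contents"

# Echoed as a dry run; drop the leading 'echo' to actually fetch
# help-pages.xml, then upload it at your own wiki's Special:Import.
echo curl -o help-pages.xml \
     --data-urlencode "pages=$PAGES" \
     --data "curonly=1" \
     "$WIKI"
```

You need to be logged in as a sysop on the target wiki for Special:Import to appear.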

--
Rowan Collins BSc
[IMSoP]
Re: Populating the help namespace

Terry Auspitz-2
Rowan,

Thanks for the quick reply. I tried the Special:Export option and was
moderately successful, but I did run into trouble bringing the
templates over. Looking at
http://www.wikitography.org/wiki/Template:H:f for example should give
you an idea of the trouble I'm encountering.

I feel like I'm close, and I hope it's just the templates that are
hanging me up.

As far as legality, both Meta and my project are under GFDL so I
should be ok there.

I've been documenting what works for me along the way, so once this is
complete I'll try to throw together a coherent howto. I can't code,
but at least I can write.

~Terry

Rowan Collins wrote:

> Unfortunately, there is no easy way (unless someone implemented one
> while I wasn't looking) of downloading all the documentation
> developed on meta (or the other set which seems to have sprung up
> on mediawiki.org). (I guess we could really do with an
> export-by-namespace option, which would work if we moved the
> relevant templates from their current Template:h* home...)
>
> The best you can do, I think, is get (or make) a list of all the
> pages you want, use Special:Export to plonk them all in an XML
> file, and then Special:Import (that does work now, right folks?) to
> shove them in your own wiki. I make no guarantees about the
> technical or legal validity of these instructions, but it oughta
> work...

$wgArticlePath when using CGI

Graeme Canivet
Hello,

I wondered why you must use ugly URLs with the CGI module...?

Also, I wondered how the rewrite $wgArticlePath = "$wgScriptPath/$1"
works.... When I pull up an article in my wiki I use the url
index.php/ArticleName, however, I have never written a rewrite rule so I am
unsure how Apache knows to rewrite "$wgScriptPath/$1" into
"$wgScript?title=$1".

Any information would be appreciated,
Graeme


Re: $wgArticlePath when using CGI

Bugzilla from rowan.collins@gmail.com
In reply to this post by Terry Auspitz
On 17/01/06, Graeme Canivet <[hidden email]> wrote:
> I wondered why you must use ugly URLs with the CGI module...?

Dunno. Bad voodoo, I guess.

> Also, I wondered how the rewrite $wgArticlePath = "$wgScriptPath/$1"
> works....

This is a rewrite Apache does automatically for appropriate scripts,
without the need for any specific setup - since a script can't also be
a directory, the name is unambiguous, so if Apache didn't treat it
specially, it would be a 404 anyway.
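For illustration, the split described above can be mimicked in shell. A sketch only: the path and title are made up, and real titles can themselves contain slashes, which this toy ignores.

```shell
# Illustration: how a PATH_INFO-style URL decomposes. Apache walks the
# request path until it hits a real file (/w/index.php), runs that as the
# script, and hands the leftover "/Main_Page" to PHP as PATH_INFO, which
# the wiki then treats as the article title.
REQUEST_PATH="/w/index.php/Main_Page"
SCRIPT_NAME="${REQUEST_PATH%/*}"   # /w/index.php (assumes one title segment)
TITLE="${REQUEST_PATH##*/}"        # Main_Page
echo "script=$SCRIPT_NAME title=$TITLE"
```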

--
Rowan Collins BSc
[IMSoP]
Re: Populating the help namespace

Bugzilla from rowan.collins@gmail.com
In reply to this post by Terry Auspitz
On 17/01/06, Terry Auspitz <[hidden email]> wrote:
> Thanks for the quick reply. I tried the Special:Export option and was moderately successful, but I did run into trouble bringing the templates over. Looking at http://www.wikitography.org/wiki/Template:H:f for example should give you an idea of the trouble I'm encountering.

Yeah, transferring complex templates is a pain; I'm almost tempted to
come up with some kind of specialised template export utility, since
people will insist on doing crazy nesting etc. (I was already
pondering this because everyone wants to copy en.Wikipedia's funky
Portal designs...) Bob knows how you could automatically enumerate
dependencies on other templates determined by built-in variables or
parameters, though.
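A crude first pass at enumerating those dependencies can be grepped out of an exported dump. A sketch with a tiny stand-in file: it catches only literal {{Name}} transclusions (and will also pick up magic words), not names assembled from parameters or variables, which is exactly the hard case.

```shell
# Stand-in for a real Special:Export dump, just for demonstration.
cat > help-pages.xml <<'EOF'
<page><text>See {{H:f|lang=en}} and {{H:h helpindex}}.</text></page>
EOF

# List template names transcluded literally in the dump; treat the
# output as a starting point only, then export those pages too.
grep -o '{{[^}|{]*' help-pages.xml | sed 's/^{{//' | sort -u
```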

It would also help if people didn't insist on using stupidly cryptic
abbrev'd names for all their templates - even the ones only designed
to be used in one other template - but I guess that really is too much
to ask!

--
Rowan Collins BSc
[IMSoP]
Re: $wgArticlePath when using CGI

Brion Vibber
In reply to this post by Bugzilla from rowan.collins@gmail.com
Graeme Canivet wrote:
> Hello,
>
> I wondered why you must use ugly URLs with the CGI module...?

In most cases, the additional PATH_INFO bit doesn't work with PHP in CGI mode.
All kinds of ugly loops and breakage result.

If you really know it works, you can poke around and change it.

> Also, I wondered how the rewrite $wgArticlePath = "$wgScriptPath/$1"
> works.... When I pull up an article in my wiki I use the url
> index.php/ArticleName, however, I have never written a rewrite rule so I am
> unsure how Apache knows to rewrite "$wgScriptPath/$1" into
> "$wgScript?title=$1".

There's an entire document dedicated to such topics; see the MediaWiki FAQ.
(Link is in the README, the RELEASE-NOTES and can also be easily googled.)

-- brion vibber (brion @ pobox.com)


RE: $wgArticlePath when using CGI

Graeme Canivet
Hey Brion,

I read those FAQs and did not find the answer I'm looking for... How does
Apache translate the $1 part of $wgScript/$1 into the GET variable
'article'? Or perhaps Apache puts it into a predefined variable that
MediaWiki uses -- I checked the source code, but perhaps I have to delve
deeper? Unless you know what it is? I'll have to try some tests and
phpinfo it...

Thank you Brion (and Rowan too),
Graeme

Re: Populating the help namespace

Jan Steinman
In reply to this post by Terry Auspitz
> From: Terry Auspitz <[hidden email]>
>
> I'm hopeful that someone
> can explain to me, in reasonably small words, how to properly populate
> the "Help" namespace on my new Wiki.

I went through that on one site, and decided it wasn't worth the  
effort. There are too many "leaves"!

Luckily, there are relatively few "branches" from which those leaves  
spring, and you can easily find them all in the MediaWiki namespace.  
I changed each one to point to Meta -- changing "Help:Editing" to  
"Meta:Help:Editing", and all is swell. (Oh, I think I had to turn on  
some obscure option like $wgScaryTransclusion or something similar.)

Of course, this is not perfect. Unsophisticated users might not  
realize they're not in Kansas any more, and the Meta help may not be  
perfectly synch'd to your version, and you can't customize or improve  
the help (unless, of course, it improves Meta's help, which is  
negentropic and great!). But for me, it was a heck of a lot better  
than importing stuff, waiting for it to break, then going back and  
importing more stuff, then waiting for it to break, then importing  
more stuff...



:::: We can only continue to use oil as long as it lasts. We should  
be looking for other sources of energy. There's only one that's big  
enough, it's free, and good for at least a billion years. That's the  
sun. We must move into solar energy. -- M. King Hubbert, 1976 ::::
:::: Jan Steinman <http://www.EcoReality.org> ::::



