Ajaxify page requests extension

Ajaxify page requests extension

Cyken Zeraux
I've been working on a little project that intercepts normal wiki links so they
no longer reload the entire page; instead it requests the target page with
AJAX, separates out the content, and swaps it into the current page. So far
I've only gotten this to work with the main content, but I also need a safe
way to grab 'mw-navigation' and 'footer', which appear to be standardized
across most skins.
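
Roughly, the interception looks like this (a sketch, not the exact code; it
assumes the skin exposes the usual #mw-content-text container and a browser
with fetch() and DOMParser):

    // Sketch: hijack internal wiki links and load pages with AJAX.
    // Assumes the skin uses the usual #mw-content-text container.
    document.addEventListener('click', function (e) {
        var link = e.target.closest('a');
        if (!link || link.origin !== location.origin) {
            return; // not an internal wiki link, let the browser handle it
        }
        e.preventDefault();
        fetch(link.href)
            .then(function (resp) { return resp.text(); })
            .then(function (html) {
                var doc = new DOMParser().parseFromString(html, 'text/html');
                var fresh = doc.getElementById('mw-content-text');
                var current = document.getElementById('mw-content-text');
                if (fresh && current) {
                    current.replaceWith(document.importNode(fresh, true));
                    history.pushState(null, '', link.href);
                }
            });
    });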

The only problem with that is that client-side JS cannot safely parse the
returned pages for all three elements, because users can place replica
elements with the same IDs in the page content, and DOMDocument-style parsing
on the client is neither optimized nor well supported.

What I would like is to grab the full page content before it is echoed out,
run it through DOMDocument in PHP, pull out my elements, and echo them as
JSON. The client-side JS plugin would then refresh those elements.
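
The refresh step on the client would then be trivial; roughly something like
this (the ?regions=1 query parameter and the JSON shape are hypothetical --
that server-side piece is exactly what I'm asking about):

    // Sketch: swap in the three regions from a JSON payload of the form
    // { "mw-content-text": "...", "mw-navigation": "...", "footer": "..." }
    // where each value is an HTML string produced server-side.
    function refreshRegions(payload) {
        ['mw-content-text', 'mw-navigation', 'footer'].forEach(function (id) {
            var el = document.getElementById(id);
            if (el && typeof payload[id] === 'string') {
                el.innerHTML = payload[id];
            }
        });
    }

    function loadPage(url) {
        // ?regions=1 is a hypothetical parameter asking for JSON instead
        // of a full HTML page.
        return fetch(url + (url.indexOf('?') === -1 ? '?' : '&') + 'regions=1')
            .then(function (resp) { return resp.json(); })
            .then(refreshRegions);
    }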

I've done a fair bit of trial and error and research in the MediaWiki docs,
and it seems this isn't currently supported, because it is the active skin
that echoes out the entire page, not MediaWiki itself.

Am I wrong in my findings, and are there any plans to have MediaWiki assemble
pages and echo them out itself? It would break the current convention, but I
feel that having skins build up an output string and hand it back to
MediaWiki, rather than delivering the content themselves, is the better
approach.

Thank you for your time.

Re: Ajaxify page requests extension

Purodha Blissenbach
The downside is that page rendering must still work when JavaScript is not
available on the client. That makes things a bit complicated; you will likely
have to overhaul some legacy stuff.
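
The usual way to keep that guarantee is to only take over navigation when
everything the script needs is present, and otherwise do nothing. A rough
sketch (initAjaxNavigation() is a placeholder name for whatever sets up the
link interception):

    // Only take over navigation when the needed APIs exist; otherwise do
    // nothing, so plain full-page loads keep working without JavaScript.
    if (window.fetch && window.DOMParser &&
            window.history && typeof history.pushState === 'function') {
        initAjaxNavigation(); // placeholder for the interception setup
    }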
Purodha

_______________________________________________
Wikitech-l mailing list
[hidden email]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l