BitTorrent Downloads

BitTorrent Downloads

jmerkey-3

Since I have open sourced the wikix program, anyone wanting the
images can now download them directly from Wikipedia.
I am in the process of restructuring the WikiGadugi site, so anyone wanting
the bittorrent downloads needs to finish up this week,
as I will discontinue them shortly now that folks have the ability to
download them directly. The wikix program
is not very intensive on the main Wikimedia servers. The program is
set up to behave as several workstations, and it really
does not take that long to get the images.
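
The multi-workstation behavior described above can be sketched as a fixed
pool of parallel fetch workers. This is a hypothetical illustration in
Python; wikix itself is a C program, and the names (fetch_image,
download_all, NUM_WORKERS) are invented for this sketch, not actual wikix
internals:

```python
# Sketch of a bulk image fetcher that behaves like several independent
# workstations: a fixed number of worker threads each draining a shared
# queue of image URLs. Illustrative only; not the wikix implementation.
import queue
import threading

NUM_WORKERS = 16  # the mail suggests wikix behaves like ~16 workstations


def fetch_image(url):
    # Placeholder for an HTTP GET of one image; here it just returns
    # the URL so the sketch stays self-contained and testable.
    return url


def download_all(urls, num_workers=NUM_WORKERS):
    """Drain a queue of image URLs with a fixed pool of worker threads."""
    work = queue.Queue()
    for u in urls:
        work.put(u)

    done = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                u = work.get_nowait()
            except queue.Empty:
                return  # queue drained; this "workstation" is finished
            result = fetch_image(u)
            with lock:
                done.append(result)

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return done
```

Because each worker fetches independently, the origin servers see the
traffic as a handful of ordinary clients rather than one bulk crawler.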

To date, under ten folks have downloaded the entire image archive, so it
does not appear that there is much interest.

Jeff

_______________________________________________
foundation-l mailing list
[hidden email]
http://lists.wikimedia.org/mailman/listinfo/foundation-l

Re: BitTorrent Downloads

Robert Leverington
Hi,
I think this is a great idea. Any chance you could give me a URL so
that I can download the program?
Thanks.

On 25/04/07, Jeff V. Merkey <[hidden email]> wrote:

>
> Since I have open sourced the wikix program, now anyone wanting the
> images can download them directly from Wikipedia. [...]
>


--
Robert ############################
[[User:Lcarsdata]] ################
http://www.wikitest.co.uk #########
http://roberthl.wikitest.co.uk ####


Re: BitTorrent Downloads

jmerkey-3
Robert Leverington wrote:

>Hi,
>I think this is a great idea, any chance you could give me a URL so
>that I can download the program.
>Thanks.

http://meta.wikimedia.org/wiki/Wikix

ftp://ftp.wikigadugi.org/wiki/MediaWiki/wikix.tar.gz.bz2

Jeff




Re: BitTorrent Downloads

Steve Sanbeg
On Tue, 24 Apr 2007 17:22:57 -0600, Jeff V. Merkey wrote:

>
> Since I have open sourced the wikix program, now anyone wanting the
> images can download them directly from Wikipedia. [...]

I was under the impression that bulk downloads needed to be throttled, and
that it would take a lot longer than that to download everything.  Does
this just grab the images as fast as it can get them?  Is that allowed?




Re: [Wikitech-l] BitTorrent Downloads

jmerkey-3
Steve Sanbeg wrote:

>I was under the impression that bulk downloads needed to be throttled, and
>that it would take a lot longer than that to download everything.  Does
>this just grab the images as fast as it can get them?  Is that allowed?

It's faster to get them from Wikipedia. The bittorrent downloads take
about 1 1/2 weeks to download the archive. Using wikix
directly only takes about 1 1/2 days given the current size of the image set
for commons.
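
(As a quick sanity check of that comparison, illustrative arithmetic only:
1 1/2 weeks is about 10.5 days, so direct download via wikix works out to
roughly a 7x speedup over the torrent.)

```python
# Quick check of the speedup implied above (illustrative arithmetic only;
# both durations are the approximate figures from the mail).
torrent_days = 1.5 * 7   # "about 1 1/2 weeks", expressed in days
wikix_days = 1.5         # "1 1/2 days" via direct download
speedup = torrent_days / wikix_days  # roughly 7x
```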

Getting them from Wikipedia is faster due to the squid caching both
locally and internet-wide. My analysis of the
data sets from Wikipedia indicates that 60% of the images are cached
either locally on squid or at other remote
cache servers.

Since they are cached in a distributed manner, the program will only
access Wikipedia intermittently. Copyvio is a bigger
issue than performance. My image mirroring with wikix has had almost no
noticeable impact on Wikipedia. The program
behaves like 16 workstations, so Wikipedia seems to be able to handle it
with little additional overhead. Given the number
of squid servers Brion has active, I think the impact is minimal in
comparison to the massive amount of access the site gets
daily.
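
A back-of-the-envelope model of those load figures (my own illustration:
the 60% cache-hit rate and the 16 workers come from the text above, while
the one-million-image total is an assumed example, not a figure from the
mail):

```python
# Rough model of origin-server load for a bulk image mirror behind
# distributed squid caches. Illustrative assumptions only.
def origin_requests(total_images, cache_hit_rate):
    """Fetches that miss the caches and actually reach the origin servers."""
    return total_images * (1.0 - cache_hit_rate)


def per_worker_load(total_images, cache_hit_rate, workers):
    """Origin requests handled by each parallel worker 'workstation'."""
    return origin_requests(total_images, cache_hit_rate) / workers


# Example: 1,000,000 images, 60% served from cache, 16 workers.
# Only ~400,000 fetches reach the origin, ~25,000 per worker, which is
# why each worker looks like one modest workstation to the servers.
misses = origin_requests(1_000_000, 0.60)
each = per_worker_load(1_000_000, 0.60, 16)
```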

Jeff




Re: [Wikitech-l] BitTorrent Downloads

jmerkey-3
Also, the number of people actually needing to do this seems small. To
date, only 9 folks have downloaded the image archive in a one-month
period. That's a small number. I will leave the bittorrent active at
wikigadugi, throttled, for the next month or so if folks still want to
get at it, since it has little impact on my bandwidth at present. Wikix
is a better method, and given all the "gloom and doom" talk about
creating backup sites for Wikimedia (which I think is probably not as
big a concern as people think), the wikix tool's time has come. Folks
should get access to it if they feel a need to mirror Wikipedia sites
elsewhere. At least it gives the community the tools to do this.

Jeff

