[MediaWiki-l] Moving large mediawiki site databases


[MediaWiki-l] Moving large mediawiki site databases

John Foster-2
I have to move a MediaWiki site that has a database of approximately 4.1 GB. It
is something I have done before, and I usually just use the command-line
syntax to do it. However, I have tried three times to export it, and the files
are always incomplete due to server (MySQL) disconnects. I finally did get one
that seemed to complete OK, but then I tried several times to import it into
the new server and likewise got server-disconnect errors. I am aware that
there are a multitude of possibilities, so what I'm asking here is for any
tips on moving databases of this size or larger. I'm beginning to wonder if
MySQL is the way to go here.
Thanks!

--
John Foster
JW Foster & Associates



Re: Moving large mediawiki site databases

John Doe-27
Are you using the dump/import scripts or a feature of the database software?
MySQL or MariaDB?


Re: Moving large mediawiki site databases

John Doe-27
See http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki for information
about backing up your wiki. For restoring it, see
http://www.mediawiki.org/wiki/Manual:Moving_a_wiki#Import_the_database_backup.
If you don't have database access, try !grabber.


Re: Moving large mediawiki site databases

Jan Steinman-2
In reply to this post by John Doe-27
> From: John Foster <[hidden email]>
>
> ... I have tried 3 times to export it and the files
> are always either incomplete due to server (Mysql) disconnect.

Been there, done that, fixed it, don't remember exactly how, except there was some /etc/my.cnf timeout parameter that I temporarily set to some ridiculously high value.

I think I used "wait_timeout = 6000".
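
A minimal sketch of that sort of change, assuming it goes under [mysqld] in
/etc/my.cnf (values are illustrative; net_read_timeout is a guess at a related
knob, not something I'm sure was needed):

          [mysqld]
          # temporarily raise connection/idle timeouts for the dump
          wait_timeout     = 6000
          net_read_timeout = 6000

Restart (or reload) mysqld after editing the file, and put the values back
once the move is done.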

:::: What's the use of a fine house if you haven't got a tolerable planet to put it on? -- Henry David Thoreau
:::: Jan Steinman, EcoReality Co-op ::::



Re: Moving large mediawiki site databases

Dave Humphrey
In reply to this post by John Foster-2
Exactly what error message are you getting? If it is something like
"MySQL server has gone away", it may be due to a too-small
"max_allowed_packet" setting. See
http://stackoverflow.com/questions/19214572/can-not-import-large-sql-dump-into-mysql-5-6

My MediaWiki db is going on ~20GB now and I can export/import it fine with
a max_allowed_packet of 16M and a wait_timeout of 90. If you have a lot of
large articles in your wiki you may need a larger max_allowed_packet. For
example, in my "/etc/my.cnf" file I have:

          [mysqldump]
          quick
          max_allowed_packet = 16M
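
The corresponding server-side variables live under [mysqld] in the same file;
a sketch using the values mentioned above (treat the numbers as a starting
point, not a recommendation):

          [mysqld]
          max_allowed_packet = 16M
          wait_timeout       = 90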

I don't see anything else obvious in my config file that would affect the
export/import speed. I know there are a few good results if you google for
"mysqldump fast restore" or similar.

Exports/imports are as simple as:

          mysqldump -avz -u user -p database_name > file.sql
          mysql -u user -p new_database_name < file.sql

If I'm transferring it to a different server, I simply rsync/scp the SQL file
over between the two steps. The export takes about 10 minutes and the import
close to 1 hour for me (I've never had a disconnect error).
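
For the transfer step, something along these lines (host name and destination
path are placeholders):

          rsync -avP file.sql user@newserver:/tmp/
          # or simply
          scp file.sql user@newserver:/tmp/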



--
Dave Humphrey -- [hidden email]
Founder/Server Admin of the Unofficial Elder Scrolls Pages -- www.uesp.net
www.viud.net - Building the world's toughest USB drive

Re: Moving large mediawiki site databases

John Foster-2
In reply to this post by John Doe-27
I'm using basic mysqldump for dumping and mysql syntax for importing, moving
from a Linux Mint LMDE server to an Ubuntu Enterprise server. Pretty vanilla
stuff. The database is too big for phpMyAdmin to handle either operation, and
it's just too slow. I also tried BigDump, but that didn't perform either. I
mostly figure I need to beef up MySQL's timeouts and tuning, and I have no
experience doing that.


--
John Foster
JW Foster & Associates



Re: Moving large mediawiki site databases

Fannon
I've had good experiences with http://www.mysqldumper.net/

It splits the import/export into smaller batches. This takes a while longer,
but it works with bigger databases.

Best,
Simon


Re: Moving large mediawiki site databases

Larry Silverman
The description of MySQLDumper doesn't really sound like you'd end up with
something transactionally consistent.

Larry Silverman
Chief Technology Officer
TrackAbout, Inc.


Re: Moving large mediawiki site databases

Fannon
True, that won't work!


Re: Moving large mediawiki site databases

John Foster-2
In reply to this post by Dave Humphrey
This is the dump command I use:

mysqldump --verbose -u root -p my_wiki > my_wiki.sql

This is the error I get:
mysqldump: Error 2013: Lost connection to MySQL server during query when
dumping table `transcache` at row: 12

This is the relevant part of /etc/mysql/my.cnf:
# * Fine Tuning
#
key_buffer        = 64M
max_allowed_packet    = 64M
thread_stack        = 192K
thread_cache_size       = 8
# This replaces the startup script and checks MyISAM tables if needed
# the first time they are touched
myisam-recover         = BACKUP
#max_connections        = 100
#table_cache            = 64
#thread_concurrency     = 10
#
# * Query Cache Configuration
#
query_cache_limit    = 8M
query_cache_size        = 64M
#
# * Logging and Replication
#
# Both location gets rotated by the cronjob.
# Be aware that this log type is a performance killer.
# As of 5.1 you can enable the log at runtime!
#general_log_file        = /var/log/mysql/mysql.log
#general_log             = 1
#
# Error logging goes to syslog due to
# /etc/mysql/conf.d/mysqld_safe_syslog.cnf.
#
# Here you can see queries with especially long duration
#log_slow_queries    = /var/log/mysql/mysql-slow.log
#long_query_time = 2
#log-queries-not-using-indexes
#
# The following can be used as easy to replay backup logs or for
# replication.
# note: if you are setting up a replication slave, see README.Debian about
#       other settings you may need to change.
#server-id        = 1
#log_bin            = /var/log/mysql/mysql-bin.log
expire_logs_days    = 10
max_binlog_size         = 100M
#binlog_do_db        = include_database_name
#binlog_ignore_db    = include_database_name
#
# * InnoDB
#
# InnoDB is enabled by default with a 10MB datafile in /var/lib/mysql/.
# Read the manual for more InnoDB related options. There are many!
#
net_write_timeout = 360




--
John Foster
JW Foster & Associates



Re: Moving large mediawiki site databases

Larry Silverman
Something to try: pipe the dump through gzip. If you have more CPU oomph
than disk speed, the whole operation might finish faster and you might
avoid getting kicked by whatever timeout is killing your process.

Here's the mysqldump command I use:
          mysqldump --databases devmediawiki --single-transaction \
              --add-drop-database --triggers --routines --events \
              --user=root --password | gzip > /tmp/devmediawiki-sql.gz
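
On the target server, the matching restore is roughly the following (a sketch;
because the dump uses --databases and --add-drop-database, it already contains
the CREATE DATABASE and USE statements, so no database name is needed here):

          gunzip < /tmp/devmediawiki-sql.gz | mysql --user=root --password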

Larry Silverman
Chief Technology Officer
TrackAbout, Inc.


Re: Moving large mediawiki site databases

John Doe-27
Another issue might be that you're dumping the cache tables too, causing
significantly more overhead than needed.
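
For example, a sketch that skips the cache tables during the dump (table names
assume a default MediaWiki schema with no table prefix; adjust to match your
wiki):

          mysqldump --verbose -u root -p \
              --ignore-table=my_wiki.objectcache \
              --ignore-table=my_wiki.l10n_cache \
              --ignore-table=my_wiki.transcache \
              my_wiki > my_wiki.sql

Those tables are just caches, so MediaWiki rebuilds them as needed after the
import.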
