Showing entries 11 to 18
Displaying posts with tag: import
Improved Performance of Data Export/Import for MySQL Utilities 1.3.6 GA

The performance of the mysqldbcopy, mysqldbexport, and mysqldbimport utilities has been optimized in MySQL Utilities 1.3.6. In the case of export/import there have been significant improvements. In particular, multiprocessing support has been added to these utilities and can be enabled with the new --multiprocess option. The option permits concurrent execution and makes the most of the CPU resources available (number of cores).

Multiprocessing is applied at different levels depending on the operating system. On non-POSIX systems, multiprocessing is limited to the database level, whereas POSIX systems can apply multiprocessing at the table level.

More specifically, the mysqldbcopy and mysqldbexport utilities allow multiprocessing at the table level on non-Windows systems and at the database level on Windows systems. The mysqldbimport utility allows multiprocessing at the file level independently of the OS.
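As a rough illustration, here is a hedged sketch of enabling it from the command line (the server credentials, database name, and output file are placeholders, and only --multiprocess comes from the release notes above, so verify the other options against each utility's --help):

# Export a database using 4 worker processes, then import it on another server.
mysqldbexport --server=user:pass@localhost:3306 --export=both \
    --multiprocess=4 mydb > mydb_export.sql
mysqldbimport --server=user:pass@otherhost:3306 --import=both \
    --multiprocess=4 mydb_export.sql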

[Read more]
RDS Migration from 5.5 to 5.6 with mysqldump

Amazon recently announced support for 5.6; unfortunately, a direct upgrade from lower versions is not yet supported. On a recent migration, running mysqldump flat out would’ve meant 6+ hours of downtime. How did we cut it down to 1h45m? Simple: dump each table and pipe it directly to the new 5.6 instance in parallel, using Percona Server’s mysqldump utility to take advantage of --innodb-optimize-keys.

Here’s the base script we used – of course, YMMV and make sure to optimize the destination instance as well!

#!/bin/bash
# export-run.sh
# This is the wrapper script which builds up the list of tables to split into $parallel parts and calls export-tables.sh

parallel=6
dblist="db1 db2 db3"
smysql="mysql -hsource-55.us-east-1.rds.amazonaws.com"
dmysql="mysql -hdest-56.us-east-1.rds.amazonaws.com" …
[Read more]
Check (Rough) Progress of Your CSV Import to MySQL

If you are importing large CSV or SQL dumps into MySQL, chances are you have looked for ways to see how far the import has gone. If you know how many rows the file being imported contains, you can do a SELECT COUNT(*), but that query can take some time to finish, especially on really big imports.

Using lsof with the -o option, you can monitor the current file offset a process is reading from. Knowing the size of the file and taking a few snapshots of the offset, you can get a rough idea of how fast the import is going. Note though that this is only the file-read pace, not the actual import speed, as MySQL import throughput can vary depending on a number of conditions, e.g. table growth, secondary indexes, etc.

Let’s say I am importing a 1.1G CSV file into a table.

[revin@forge msb_5_5_300]$ ls -al /wok/dta/samples/ft_history.csv 
-rw-rw-r--. 1 revin revin 1075456654 Nov 8 23:25 /wok/dta/samples/ft_history.csv …
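A hedged sketch of polling the offset while the import runs (the pgrep lookup and file name are illustrative; lsof's -o switch shows the read offset in the OFFSET column, typically with a 0t prefix for decimal bytes):

# Check how far the mysql client has read into the CSV; dividing the OFFSET
# value by the 1075456654-byte file size gives a rough progress percentage.
lsof -o -p $(pgrep -n mysql) | grep ft_history.csv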
[Read more]
How To - Guide to Importing Data from a MySQL Database to Excel using MySQL for Excel

Fetching data from a database and getting it into an Excel spreadsheet for analysis, reporting, transforming, sharing, etc. is a very common task. It can be accomplished in several different ways and with different tools to the same end, but users may find the process rather complicated, too technical, and lengthy. With MySQL for Excel, importing data from a MySQL database into an Excel spreadsheet becomes easy and accessible to all types of users. Here is a quick guide describing how to import data to Excel using MySQL for Excel.

MySQL JSON import tool version 1.4 is out

This is just a quick notice that there is a new version of the JSON import tool available now. The main thing in it is a fix for a threading issue that caused a threaded load (which is the default) to hang if there was an error or the import was interrupted. There are also some minor fixes, and a runtime status printout is now available by sending a SIGUSR1 signal to the process.
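A hedged example of asking for that status printout (the process name mysqljsonimport is an assumption, so substitute the actual name of the running binary):

# Send SIGUSR1 to the running import to make it print its runtime status.
kill -USR1 $(pidof mysqljsonimport)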

Download from sourceforge.

Cheers
/Karlsson

MySQL errors

Next time you hit an error while running an SQL script into mysql using the usual redirect, e.g. mysql -uroot -pmsandbox -h127.0.0.1 -P3306 dbname < filename, you might want to give the -v option a shot.

MySQL normally outputs an error with the line number like:

[mysql@dcassar-ubuntu /mysql/dumps/stored_procs 11:10:12]$ mysql -uroot -pmsandbox -h127.0.0.1 -P5151 test < file
ERROR 1267 (HY000) at line 375: Illegal mix of collations (latin1_swedish_ci,IMPLICIT) and (latin1_general_cs,IMPLICIT) for operation '='

But that 375 does not include comments and the like, so it’s a bit hard to go through the file and locate the exact section that is failing.

Run the same command with -v:

[mysql@dcassar-ubuntu /mysql/dumps/stored_procs 11:10:30]$ mysql -uroot -pmsandbox -h127.0.0.1 -P5151 -v test < file | tail -5
ERROR 1267 (HY000) at line 375: Illegal mix …

[Read more]
Next Week’s MySQL Sessions at ODTUG Kaleidoscope

By now you know that there is a MySQL Track during next week’s ODTUG Kaleidoscope in Washington, DC. Ronald Bradford and I organized the schedule at the last minute (Ronald did a lot of the work!). It was difficult to fill a schedule with 19 sessions that are either 1 hour or 1.5 hours long, and to do it I ended up with three presentations.

At each presentation I will be giving away a copy of The MySQL Administrator’s Bible, so be sure to show up! All MySQL track sessions are in Maryland C, and all times are Eastern.

On Monday, June 28th from 4 pm – 5:30 pm I will be presenting …

[Read more]
Copying InnoDB tables between servers

The feature I announced some time ago http://www.mysqlperformanceblog.com/2009/06/08/impossible-possible-moving-innodb-tables-between-servers/ is now available in our latest releases of XtraBackup 0.8.1 and XtraDB-6.

Now I am going to show how to use it (the video will also be available on percona.tv).
Let's take the tpcc schema running on standard MySQL ® 5.0.83, and assume we want to copy the order_line table to a different server. Note that I am going to do it online; there is no need to lock or shut down the server.

To export the table you need XtraBackup, and you can …
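The excerpt cuts off here, but as a heavily hedged sketch of the general flow the post describes (the paths, datadir, and server-side settings are placeholders, and the exact commands for XtraBackup 0.8.1 / XtraDB-6 may differ from this outline):

# On the source: take a backup and prepare it with --export so per-table .exp
# files are generated next to the .ibd files.
xtrabackup --backup --target-dir=/backup
xtrabackup --prepare --export --target-dir=/backup

# On the destination (an XtraDB server with tablespace import enabled):
mysql -e "ALTER TABLE tpcc.order_line DISCARD TABLESPACE"
# ... copy order_line.ibd and order_line.exp from the backup into the datadir ...
mysql -e "ALTER TABLE tpcc.order_line IMPORT TABLESPACE"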

[Read more]