Displaying posts with tag: Scripts
Developing a MySQL Workbench plugin

The MySQL Workbench tool is great for development and administration tasks. It's also available on Windows, Linux and Mac OS X which, according to third-party comparisons, is more than can be said for most of the equivalent tools. And Workbench is free. That said, most of the provided functionality is intuitive and of daily use for developers and DBAs alike.

Real time query monitoring on MySQL - with 3rd party tool and without

I've tried out Idera's MySQL Query Explorer, a free tool, and found it easy to use and simple to set up. The only improvement I would suggest is to add, on the technical requirements page, that your MySQL instance (MySQL Server 5.5 and newer) must be running with the performance_schema turned on; otherwise the tool will just display an empty grid. After you've successfully set …
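A quick way to check is to query the server variable; note that performance_schema is read-only at runtime, so it has to be enabled at startup (a minimal sketch, assuming you can edit my.cnf and restart the server):

    shell> mysql -u your_user -p -e "SHOW VARIABLES LIKE 'performance_schema'"
    # if the value is OFF, enable it in my.cnf and restart mysqld:
    #   [mysqld]
    #   performance_schema = ON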

Real time query monitoring on MySQL using the SYS schema

In an earlier post I wrote about real time query monitoring on MySQL with a third party tool and without one. That script is useful because it works with MySQL 5.5 and later. However, if you're on a more recent version of MySQL, you should look at the SYS schema: a collection of views, functions and procedures that help MySQL administrators get insight into MySQL database usage. It's available to …
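As an example of the kind of insight it gives, here is a minimal sketch that lists statements currently executing, assuming a recent sys schema (bundled with MySQL 5.7; older sys versions expose sys.processlist instead of sys.session):

    shell> mysql -u your_user -p -e "
        SELECT conn_id, user, current_statement, statement_latency
        FROM sys.session
        WHERE current_statement IS NOT NULL"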

Yahoo's MySQL Partition Manager is Open Source

The team at Yahoo has released their partition management script on GitHub:

At Yahoo, we manage a massive number of MySQL databases spread across multiple data centers. We have thousands of databases and each database has many partitioned tables. In order to efficiently create and maintain partitions we developed a partition manager which automatically manages these for you with minimal pre …

Get a summary footprint on a MySQL server instance

Landing in an enterprise with ongoing projects means that servers are often handed to IT staff without complete knowledge of what's inside. I've built a simple script, scraped together from here and there, to gather a summary of the relevant information. Once you've gained remote access to the MySQL instance, you can execute the queries to identify the following information about the target database …
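The queries below are a rough sketch of the kind of information such a footprint script collects, not the script itself: server version, data directory, and a per-schema size breakdown from information_schema:

    shell> mysql -u your_user -p -e "SELECT VERSION(); SHOW VARIABLES LIKE 'datadir'"
    shell> mysql -u your_user -p -e "
        SELECT table_schema, engine, COUNT(*) AS tables,
               ROUND(SUM(data_length + index_length)/1024/1024) AS size_mb
        FROM information_schema.tables
        GROUP BY table_schema, engine"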

MySQL sample databases: for testing and training

Sometimes it's useful to have a prepared data set to load onto a fresh MySQL install for testing purposes. Or you might be putting together a training session or workshop and want examples built on fake information. For MySQL there are several choices available:

Employees Sample Database: provides a combination of a large base of data (approximately 160MB) spread over six separate tables and …
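For instance, the Employees database can be loaded with the mysql client in a couple of commands (a sketch based on the datacharmer/test_db repository layout, which may change):

    shell> git clone https://github.com/datacharmer/test_db.git
    shell> cd test_db
    shell> mysql -u your_user -p < employees.sql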

Print tab separated values as table using MySQL

When using the mysql command line utility to get recordsets, row alignment and line breaks are often a mess. You can use the command line tool on a MySQL database server to dump a set of rows into a tab-separated (TSV) output file like so:

    shell> mysql -u your_user -p < your_statement.sql > data.csv

I came up with the following python script to grab the output file and pretty print it:

You can put your own …
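The python script itself is cut off in this excerpt; as a rough stand-in (not the author's script), the column utility can already render a tab-separated file as an aligned table:

    shell> column -t -s $'\t' data.csv
    # caveat: with -t, adjacent delimiters (empty fields) are merged; fine for a quick look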

Reading RBR binary logs with pt-query-digest

For purposes of auditing everything that happens on our servers, we're looking to parse the binary logs of all servers (masters), as we do with "Anemomaster". With Row Based Replication this is problematic, since pt-query-digest does not support parsing RBR binary logs (true as of 2.2.12, the latest at this time).

I've written a simple script that translates RBR logs into SBR-like logs, with a little bit of cheating. My only interest is that pt-query-digest be able to capture and count the queries, nothing else. With some minimal text manipulation on the binary log I can now feed it to pt-query-digest, which seems to be happy.

The script of course does not parse the binary log …
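The full script is in the post; purely to illustrate the general idea (a sketch, not the author's script, and the binlog file name is a placeholder), mysqlbinlog can decode row events into commented pseudo-SQL, which a little text manipulation turns into SBR-like statements for pt-query-digest to fingerprint:

    shell> mysqlbinlog --verbose --base64-output=DECODE-ROWS mysql-bin.000123 > decoded.log
    shell> sed -e 's/^### //' decoded.log > sbr_like.log    # uncomment the pseudo-SQL
    shell> pt-query-digest --type binlog sbr_like.log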

Monitoring DML/slow queries with graphite

pt-query-digest, Anemometer or "Anemomaster" do a great job of analysing your queries and giving you visibility into what's going on with your MySQL servers. However, the place where the query digests are written is just some MySQL tables on some server. Do you have monitoring/alerts on that table? How will you verify a specific query does not exceed some runtime/execution count threshold, and get notified when it does?

At Outbrain we use Graphite to collect almost all of our data. We like it for its simplicity and for the fact it has a "push" strategy as opposed to "pull" strategy: every …
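Graphite's plaintext protocol makes the push side trivial: one line of "metric.path value timestamp" sent to port 2003. A minimal sketch, with made-up host and metric names:

    shell> echo "mysql.myhost.queries.some_fingerprint.count 12 $(date +%s)" | nc graphite.example.com 2003
    # some nc flavors need -q0 or -w1 to close the connection after sending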

Bash script: report largest InnoDB files

The following script reports the largest InnoDB tables under the data directory: schema, table & length in bytes. A table can be non-partitioned, in which case this is simply the size of the corresponding .ibd file, or partitioned, in which case the reported size is the sum of all its partition files. It is assumed tables reside in their own tablespace files, i.e. created with innodb_file_per_table=1.

(
    mysql_datadir=$(grep datadir /etc/my.cnf | cut -d "=" -f 2)
    cd $mysql_datadir
    for frm_file in $(find . -name "*.frm")
    do
        tbl_file=${frm_file//.frm/.ibd}
        table_schema=$(echo $frm_file | cut -d "/" -f 2)
        table_name=$(echo $frm_file | cut -d "/" -f 3 | cut -d "." -f 1)
        if [ -f $tbl_file ]
        then
            # unpartitioned table
            file_size=$(du -cb $tbl_file 2> /dev/null | tail -n 1) 
        else
            # attempt partitioned innodb …