Showing entries 1 to 9

Displaying posts with tag: dump

Provisioning an Oracle slave using Tungsten and MySQL Sandbox

A few years ago, I used MySQL Sandbox to filter binary logs. That was a theoretical exercise; recently I came across a very practical case, where we needed to provision an Oracle database that is the designated slave of a MySQL master.

In this particular case, we needed to provision the Oracle slave with some auditing features in place, so mass-load functions were not an option. What we needed was the contents of the MySQL database as binary rows, i.e. the same format used for row-based replication.

To achieve this as quickly and as painlessly as we could, I decided to employ the services of a MySQL Sandbox. The flow of information would be like this:

  • Install a sandbox, using the same version and
  [Read more...]
    How to exclude a database from your dump with ZRM (MySQL Community help needed)

    Last month, Ronald Bradford, Giuseppe Maxia and Mark Leith spoke about how to simulate a mysqldump --ignore-database option.
    This mysqldump option doesn't exist, and these three guys have given us various helpful workarounds.
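    The common shape of those workarounds can be sketched as follows. This is my own sketch, not any of the three authors' actual scripts: the `unwanted_db` name and system-schema list are placeholders, and connection settings are assumed to live in ~/.my.cnf.

```shell
# emulate the missing --ignore-database: list databases, filter, dump the rest
filter_dbs() {
  # drop the system schemas plus the (hypothetical) database we want to skip
  grep -Ev '^(information_schema|performance_schema|unwanted_db)$'
}

# against a live server:
#   dbs=$(mysql -N -B -e 'SHOW DATABASES' | filter_dbs)
#   mysqldump --databases $dbs > all_but_unwanted.sql
```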

    But for those of us who use ZRM Community to make backups with mysqldump, the --exclude-pattern option seems to do the job:

    --exclude-pattern "pattern" 
    All databases or tables that match the pattern are not backed up. 
    If --all-databases or --databases are

      [Read more...]
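    In ZRM the option normally lives in the backup set's configuration file as well; a hypothetical excerpt (the backup-set path and the pattern are placeholders of mine, not from the post):

```
# /etc/mysql-zrm/dailyrun/mysql-zrm.conf (hypothetical excerpt)
backup-mode=logical
exclude-pattern="scratch_%"
```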
    Dumping HTTP header with ngrep – the network grep

    To get the HTTP header information for specific client connections, use ngrep with a pattern or a regular expression that matches the packets.

    Install ngrep (example for Debian/Ubuntu):

    apt-get install ngrep

    This example dumps the HTTP headers of any connection on port 80 matching the string “images”.

    user@host:~# ngrep -qi -W normal '/images/'  port 80
    interface: lo (127.0.0.1/255.255.255.255)
    match: /images/
    
    T 10.1.1.199:62073 -> 127.0.0.1:80 [AP]
    GET /images/globe_blogs.gif HTTP/1.1..Host: frederikkonietzny.de..User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; de; rv:1.9.2.12) Gecko/20101026 Firefox/3
    .6.12..Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8..Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3..Accept-Encoding:
      [Read more...]
    mysqldump each object separately

    As a continuation of a previous blog post from last week, and inspired by Kedar, I have created a small script to export tables, stored procedures, functions and views into their own files. It works for multiple databases, and you can also specify a list of databases. Some object types, such as events and triggers, are still missing, but they are easily added.

    It is especially useful for dumping stored procedures separately, since that is functionality mysqldump lacks.

    I placed the script on MySQL Forge for anybody to use, give feedback on, and possibly enhance.
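    The Forge script itself is not reproduced in this excerpt, but the core idea can be sketched as a loop over object names. Everything here is a placeholder of mine (database name, output directory, file-naming scheme), not the actual script:

```shell
# one output file per object: "<db>.<object>.sql"
db=test_db
outdir=./ddl
mkdir -p "$outdir"

outfile() { printf '%s/%s.%s.sql' "$outdir" "$1" "$2"; }

# against a live server: a schema-only dump per table, and a
# SHOW CREATE per stored procedure
#   for t in $(mysql -N -e "SHOW TABLES FROM $db"); do
#     mysqldump --no-data "$db" "$t" > "$(outfile "$db" "$t")"
#   done
#   for p in $(mysql -N -e "SELECT ROUTINE_NAME FROM information_schema.ROUTINES
#       WHERE ROUTINE_SCHEMA='$db' AND ROUTINE_TYPE='PROCEDURE'"); do
#     mysql -e "SHOW CREATE PROCEDURE \`$db\`.\`$p\`" > "$(outfile "$db" "$p")"
#   done
```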

    Dumping DDL – mysqldump tables, stored procedures, events, triggers (separately)

    If you like to keep your DDL backed up in a source management tool such as svn or cvs, and want to do it individually for stored procedures, events, triggers, tables and such, rather than having a single file, you can easily do so using the script below. You could even include --skip-dump-date or --skip-comments and use it to compare DDL daily, checking for alterations and thus making sure you are aware of any DDL changes done on the database.

    # connection settings and output location
    user=backup_user
    password=`cat ~/.backup_password`
    hostname=127.0.0.1
    port=3306
    dbname=test_db
    path=/home/mysql/ddl
    date=`date +%Y%m%d`

    # with table creation and data suppressed, this dump contains only the triggers
    mysqldump -u$user -p$password -h$hostname -P$port --no-create-info --no-data --no-create-db --skip-opt $dbname > "$path"/"$dbname"_triggers_"$date".sql
    mysqldump -u$user -p$password

      [Read more...]
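    The excerpt stops mid-command; the remaining dumps presumably continue the same pattern. A guessed continuation for routines and events, reusing the variables defined above (the mysqldump flags are real options, but the filenames and the exact flag combination are my extrapolation, not the original post's):

```shell
# routines and events, each to their own dated file (hypothetical continuation)
mysqldump -u$user -p$password -h$hostname -P$port --routines --skip-triggers \
    --no-create-info --no-data --no-create-db --skip-opt $dbname \
    > "$path"/"$dbname"_routines_"$date".sql
mysqldump -u$user -p$password -h$hostname -P$port --events --skip-triggers \
    --no-create-info --no-data --no-create-db --skip-opt $dbname \
    > "$path"/"$dbname"_events_"$date".sql
```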
    Filtering mysqldump output
    Several people have suggested a more flexible approach to mysqldump output where user privileges are concerned.
    When you dump the structure of views, triggers, and stored routines, you also dump the permissions attached to those objects, in the form of DEFINER clauses.
    It would be nice to have such DEFINER clauses removed, or even replaced with the appropriate users for the new database.

    The mysqldump filter was created with this need in mind. It allows you to remove all DEFINER clauses or, optionally, replace them with a better one.
    For example:

    mysqldump --no-data sakila | dump_filter --delete >

      [Read more...]
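    The dump_filter command itself is cut off above; for reference, a rough hand-rolled stand-in for its --delete mode is a one-line sed filter. This is my own sketch, not the actual dump_filter, and it assumes the `user`@`host` backtick quoting that mysqldump normally emits:

```shell
# strip every DEFINER=`user`@`host` clause from a dump stream
strip_definer() {
  sed -E 's/DEFINER=`[^`]+`@`[^`]+`[[:space:]]*//g'
}

echo 'CREATE DEFINER=`root`@`localhost` VIEW v AS SELECT 1;' | strip_definer
# → CREATE VIEW v AS SELECT 1;
```

    Against a live server it would be used the same way as dump_filter, e.g. `mysqldump --no-data sakila | strip_definer > schema.sql`.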
    Dumping BLOBs the right way (UPDATED)
    I ran into a very interesting issue with a customer recently. The customer was storing JPEGs in MySQL (as BLOBs, obviously) and wanted to set up some additional MySQL slaves. Nothing crazy there. So I did the normal steps of scheduling a mysqldump with the --all-databases and --master-data options. Trouble is, the next day when I started importing the dump, 'mysql' bombed out with "ERROR at line X: Unknown command ''". I used 'sed' to pull out the lines around the error and didn't notice anything out of the ordinary. After a bit of Googling, one suggestion was to do a simultaneous backup and restore (basically a "mysqldump ... | mysql -h ..."). Since I was using an existing slave to do the backups, this was feasible. Unfortunately, the result was the same.

    It turns out that 'mysqldump' will not always produce a

      [Read more...]
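    The excerpt ends before the resolution; the classic cure for binary columns breaking a dump/restore round-trip is mysqldump's --hex-blob option. The option is real, but that it is where this particular post ends up is my guess:

```shell
# --hex-blob writes BLOB columns as 0x... hex literals, which survive
# pipes, editors, and character-set conversion intact
mysqldump --all-databases --master-data --hex-blob > full_dump.sql
```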
    Editing a dump file from the airport
    Keith Murphy asks if you have ever edited your dump files before feeding them to the database, and he recommends using sed. This reminds me of a war story from when I was consulting in Milan. The customer needed, among other things, to import a huge text file (created by a third-party application) into a MySQL database. I set up everything, including a cron job to load the data every day. By the time I left for the airport, the third-party application hadn't yet provided its first sample, but since we knew the field list, it was a simple case of creating a LOAD DATA INFILE command.
    I was boarding my plane when the customer called, saying that the application was giving dates in the wrong format. Instead of 'YYYY-MM-DD', it was using the Italian human-readable format 'DD/MM/YYYY'. No
      [Read more...]
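    A sed one-liner in the spirit of that fix (my reconstruction, not necessarily what was actually deployed) rewrites the dates before LOAD DATA INFILE ever sees them:

```shell
# rewrite Italian-style DD/MM/YYYY dates to MySQL's YYYY-MM-DD
fix_dates() {
  sed -E 's#([0-9]{2})/([0-9]{2})/([0-9]{4})#\3-\2-\1#g'
}

echo 'row1;25/04/2011;ok' | fix_dates
# → row1;2011-04-25;ok
```

    An alternative that needs no preprocessing is converting during the load itself, e.g. `LOAD DATA INFILE ... (col1, @d, col3) SET col2 = STR_TO_DATE(@d, '%d/%m/%Y')`.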
    How To Add A File Extension To vim Syntax Highlighting

    Today I was asked a question about defining custom extensions for vim syntax highlighting such that, for example, vim would know that example.lmx is actually of type xml and apply xml syntax highlighting to it. I know vim already does this automatically, not just based on the extension but also by looking for certain strings inside the text.

    After digging around I found the solution. Add the following to ~/.vimrc (the vim configuration file):

    syntax on
    filetype on
    au BufNewFile,BufRead *.lmx set filetype=xml

    After applying it, my .lmx file is highlighted:

      [Read more...]

    Planet MySQL © 1995, 2014, Oracle Corporation and/or its affiliates   Legal Policies | Your Privacy Rights | Terms of Use

    Content reproduced on this site is the property of the respective copyright holders. It is not reviewed in advance by Oracle and does not necessarily represent the opinion of Oracle or any other party.