Showing entries 41 to 49
« 10 Newer Entries
Displaying posts with tag: Scripts
Parallel mysqldump backup script available. Testers wanted.

Large databases, long mysqldump times, long waits on globally locked tables. These problems never really go away when you rely on mysqldump with --all-databases or a list of databases, because it dumps schemas serially. I’m not going to explain serial vs. parallel processing here, since that’s a larger topic. Suffice it to say that on today’s multi-core / multi-CPU servers, we use only one processor core when we serially export databases with mysqldump. So, I have a new script that attempts to alleviate those issues, and now I need testers to provide feedback and improvements.


In order to keep some sanity when dealing with hundreds of database servers, the script takes care of the following:

  1. low global locking time requirements: solved by parallel tasks / forked processes
  2. backup file checking: for mysqldump files, it checks for “-- Dump completed” at the end of the SQL file …
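The two points above can be sketched in shell. This is a hedged illustration of the approach, not the author’s actual script; the database list, output directory, and mysqldump options are all assumptions:

```shell
#!/bin/sh
# Sketch: fork one mysqldump per schema, wait for all of them, then verify
# each dump ends with mysqldump's "-- Dump completed" footer.
DATABASES="db1 db2 db3"            # assumption: real script discovers these
OUTDIR=${OUTDIR:-/tmp/mysql-backups}
mkdir -p "$OUTDIR"

# A finished dump: mysqldump appends "-- Dump completed on ..." on success.
check_dump() {
    tail -n 1 "$1" | grep -q "Dump completed"
}

# Fork the dumps so each core can work on a different schema.
for DB in $DATABASES; do
    mysqldump --single-transaction "$DB" > "$OUTDIR/$DB.sql" &
done
wait    # block until every forked dump has finished

for DB in $DATABASES; do
    if check_dump "$OUTDIR/$DB.sql"; then
        echo "$DB: OK"
    else
        echo "$DB: INCOMPLETE" >&2
    fi
done
```

The forked loop keeps the global lock window short per schema; the footer check catches dumps that died mid-stream.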
[Read more]
MySQL, PHP, XML = mysql-dba.com

This is a basic heads-up post, perhaps even blatant self-marketing. So, please continue reading.

If anyone recalls the website http://mysql-dba.com, they would know that it’s based on the planet.py codebase, which is written in Python. I originally wrote a simple PHP script that used the lastRSS.php class to parse feeds on the backend for archival purposes, to be used at a later date. I say archival and later date because the site itself did not use any of the relational data storage to run. The site’s Python code and cache were updated by cron scripts every 15 minutes, and new data was scp’d from my dev server to my webhost’s servers. Eventually this process ran only sporadically, since my development server rack in the garage at home gets really hot during the summer months and I ended up taking the servers offline unless I was actively using them for other purposes. You could say the priority of the site came …

[Read more]
Introducing wordpress-scripts 0.1 (0.2 out)

Update: I’ve been suffering some ugly and stupid bugs today, so I’ve fixed them and released version 0.2. It also includes a new script, wp-update-home.


I’ve just published some scripts that help me manage my personal WordPress installations and publish some plugins I’m working on.

Warning: these are early versions which I use for small tasks. If you find a bug or have suggestions, contact me at jbernal@warp.es

Download version 0.1

[Read more]
One backup script that does it all.

This integrates with Monolith, but the database update function can be stripped out for use without Monolith. The idea is that this script is a wrapper for mysqldump that does backup-file consistency checking, email reporting, file-based logging, and directory pruning.

I used to have one script each for daily, weekly, and monthly backups, running out of /etc/cron.daily, /etc/cron.weekly, and /etc/cron.monthly, respectively. But maintaining three scripts is foolish when one can do everything, so I added some variables that check the day of week and day of month to achieve this.
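A minimal sketch of that day-of-week / day-of-month dispatch, assuming the script runs once a day and that the monthly run takes precedence on the first of the month (the real script’s rules may differ):

```shell
#!/bin/sh
# date +%u gives day of week (1=Mon..7=Sun); date +%d gives day of month.

# Decide which kind of backup today's run should be.
backup_type() {   # args: day-of-week, day-of-month
    DOW=$1; DOM=$2
    if [ "$DOM" -eq 1 ]; then
        echo monthly          # first of the month
    elif [ "$DOW" -eq 7 ]; then
        echo weekly           # Sundays
    else
        echo daily
    fi
}

TYPE=$(backup_type "$(date +%u)" "$(date +%d)")
echo "running $TYPE backup"   # the real script would invoke its mysqldump wrapper here
```

With this, a single entry in /etc/cron.daily covers all three schedules.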

Enjoy the code. Script Link here.

Project: RSS Feed Storage Using InnoDB #2

So far so good. I have a bit over two hundred RSS entries logged in the database for testing purposes. Today I changed the table structure for the title and description to support longer entries. Here is the pertinent code.

Add feed function:

function add_feed($item, $site_id) {
    $each_title = $item['title'];
    $each_link = $item['link'];
    $each_desc = $item['description'];
    if (isset($item['guid'])) { $each_guid = $item['guid']; } else { $each_guid = ""; }
    if (isset($item['pubDate'])) { $each_pubDate = $item['pubDate']; } else { $each_pubDate = ""; }
    //print "\n$each_desc\n";
    $sql2 = sprintf("
        INSERT INTO `extrabigassfries`.`feed_items` (
            `id`,
            `rss_site_id`,
            `item_title`,
            `item_link`,
            `item_description`,
            `item_guid`,
            `item_pubDate`,
            `Creation_time`
        )
        VALUES (
            NULL, …

[Read more]
Project: RSS Feed Storage Using InnoDB

I’ve been coding a couple of scripts that run on 5-minute intervals to grab RSS/Atom feed data from http://mysql-dba.com and import it into a MySQL database. The idea is to create a search function for the site that will look at all past data from the aggregated feeds. Since there are multiple pollers running at different intervals, I decided to use InnoDB for the read/write nature of the poller/processing scripts.

This is very simple so far - and as such I felt it should be documented from the start unlike many of my other projects. Here’s the feed table that is storing the information from the RSS feeds.


mysql> show create table feed_items\G
*************************** 1. row ***************************
Table: feed_items
Create Table: CREATE TABLE `feed_items` (
`id` bigint(20) NOT NULL auto_increment,
`rss_site_id` int(11) …
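Pollers on a 5-minute interval like the ones described are typically driven by cron. An illustrative crontab entry (the script path and log location are assumptions, not the site’s actual setup):

```
# m    h   dom  mon  dow   command
*/5  *   *    *    *    /usr/local/bin/feed_poller.php >> /var/log/feed_poller.log 2>&1
```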

[Read more]
Rotating General Query & Slow Logs

Sometimes you need to have the general query log on, and even though it causes more disk I/O than you may want, it’s good for troubleshooting. This log can, and probably will, fill up your disks rather quickly. Then there’s the slow query log: setting log_slow_queries and log_queries_not_using_indexes will write out the queries that take longer than long_query_time to execute, as well as any query not using an index.

So, since MySQL applies the expire_logs_days value only to the binary log (log_bin) and not to these logs, we need another solution. There are probably a bunch of custom scripts out there that do this, but big surprise: we have one as well. This was originally written by Jim Wood until I got my hands on it and made some changes. The changes are listed at the head of the script. This little guy will rotate the logs out to another directory and gzip them. …
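A hedged sketch of the rotate-and-gzip idea, not the linked script itself; the log paths, archive directory, and the FLUSH LOGS step are assumptions:

```shell
#!/bin/sh
ARCHIVE=/var/log/mysql/archive     # assumed archive location
STAMP=$(date +%Y%m%d)

# Move one log into the archive directory and gzip it there.
rotate_log() {
    LOG=$1
    [ -f "$LOG" ] || return 1
    mv "$LOG" "$ARCHIVE/$(basename "$LOG").$STAMP"
    gzip -f "$ARCHIVE/$(basename "$LOG").$STAMP"
}

for LOG in /var/log/mysql/general.log /var/log/mysql/slow.log; do
    rotate_log "$LOG" || echo "skipped $LOG (not present)"
done

# Tell mysqld to reopen its log files after the move
# (credentials assumed to live in ~/.my.cnf):
mysql -e "FLUSH LOGS" || echo "FLUSH LOGS failed (is mysqld running?)" >&2
```

Without the FLUSH LOGS, mysqld keeps writing to the moved file’s open handle, so the rotation wouldn’t actually free the space.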

[Read more]
Schema for DBA duties

I’m starting up a schema to hold stored procedures that automate common DBA tasks. SQL file for import is here: opsmonitor.sql

More SPs will be added in time… whenever I get the time to write them that is.

UPDATE: Jeff Stoner will be heading up this project to house his scripts; the ones currently listed at the first link were written by him.

Bad internets!

So I have a constant battle with Charter Cable, my high-speed internet at home, which constantly causes all kinds of problems. So I decided to make a little script, run from cron every minute, that tracks my outages in a single-table database. Then I’ll graph the outages in real time via jpGraph [as seen here]. Here’s the simple code.

The database table create script for router_uptime

CREATE TABLE `response` (
`id` bigint(32) NOT NULL auto_increment,
`state` tinyint(1) NOT NULL,
`ip_address` varchar(15) NOT NULL,
`Creation_time` datetime NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=3833 ;

The Script

#-(0)> cat check_ping
#!/bin/sh

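The script itself is truncated above. As a rough sketch of the logic described (ping the router once, record an up/down state), with the router IP, table, and database names assumed rather than taken from the real check_ping:

```shell
#!/bin/sh
ROUTER_IP=192.168.1.1    # assumption: the monitored gateway

# Map a ping exit status to the tinyint `state` column: 1 = up, 0 = down.
state_for() {
    if [ "$1" -eq 0 ]; then echo 1; else echo 0; fi
}

if ping -c 1 -W 2 "$ROUTER_IP" > /dev/null 2>&1; then
    RC=0
else
    RC=$?
fi
STATE=$(state_for "$RC")

SQL="INSERT INTO response (state, ip_address, Creation_time)
     VALUES ($STATE, '$ROUTER_IP', NOW())"
# The cron job would pipe this to mysql, e.g.: echo "$SQL" | mysql router_uptime
echo "$SQL"
```

One row per minute gives jpGraph a simple time series of 0/1 states to plot.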
[Read more]