WordPress Blog Upgrade Time

Time to upgrade my WordPress blog software from version 1.5.2 to version 2.0.2, after my latest spam attacks and my Combating Blog Spam attempt.

Here is what I did.

cd /home/arabx/www
tar cvfz blog.backup.20060520.tar.gz wordpress-1.5.2    # Back up the current install
mysqldump -uroot -p arabx_blog > blog.db.20060520.sql   # Back up the database
mv blog.*20060520* /u01/backup    # Not in my WWW
scp blog.*20060520* to@someplacesave:/backup    # Off-host copy as well
wget http://wordpress.org/latest.tar.gz
mv latest.tar.gz wordpress-2.0.2.tar.gz    # I really hate unversioned files

Disable plugins. You have to do this manually from the Admin interface (I'm sure it's just a SQL statement; see the sketch below).
Go to http://blog.arabx.com.au/wp-admin/plugins.php
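
It is indeed just a SQL statement. A minimal sketch, assuming the default wp_ table prefix: WordPress keeps the active plugin list as a serialized array in wp_options, so emptying that value deactivates everything.

-- Deactivate all plugins in one statement (back up wp_options first)
UPDATE wp_options SET option_value = 'a:0:{}' WHERE option_name = 'active_plugins';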

Now some more work.

tar xvfz wordpress-2.0.2.tar.gz  # creates a wordpress directory
mv wordpress wordpress-2.0.2     # version the extracted directory
ln -s wordpress-2.0.2 blog       # point the live site at the new version
cp wordpress-1.5.2/wp-config.php wordpress-2.0.2 …
[Read more]
Have you tried it?

So, I try to answer questions on the MySQL users list. It usually frustrates me, but I also want to help. And often, I learn things.

My second-biggest pet peeve is that people don't think to Google their findings, or even "MySQL" their findings. If you have a short phrase or word, type the following into your address bar and MySQL does the right thing:

http://www.mysql.com/short word or phrase

try it:

http://www.mysql.com/replication
http://www.mysql.com/insert syntax
http://www.mysql.com/can't%20connect

My bigger pet peeve is that people are AFRAID. They shake in their boots when there's something they don't understand. This is why there are such things as test servers. Most …

[Read more]
Group commit and XA

Returning to my post Group commit and real fsync, I ran several experiments:

I ran sysbench update_key benchmarks without --log-bin, with --log-bin, and with --log-bin plus --innodb_support_xa=0 (the default value is 1). Results (in transactions/sec):

threads   without --log-bin   --log-bin   --log-bin and --innodb_support_xa=0
1         1218.68             614.94      1010.44
4         2686.36             667.77      1162.60 …
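
For reference, the same switch can be inspected or flipped on a running server without a restart; a quick sketch (the benchmark above set it on the server command line instead):

SHOW GLOBAL VARIABLES LIKE 'innodb_support_xa';
SET GLOBAL innodb_support_xa = 0;   -- dynamic variable; the default of 1 keeps XA coordination with the binary log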
[Read more]
Find Your Most-Commented Blog Categories in WordPress

Scott recently IMed me and asked how to find out which blogging categories are his most commented, and I thought I’d share the query:

SELECT wp_categories.cat_name, COUNT(wp_comments.comment_id)
FROM wp_categories LEFT JOIN wp_post2cat ON wp_categories.cat_id = wp_post2cat.category_id
LEFT JOIN wp_comments ON wp_post2cat.post_id = wp_comments.comment_post_id
WHERE comment_approved != 'spam'
GROUP BY wp_categories.cat_id
ORDER BY 2 DESC

+------------------------------------------+-------------------------------+
| cat_name                                 | COUNT(wp_comments.comment_id) |
+------------------------------------------+-------------------------------+
| MySQL                                    |                           307 |
| Visual Basic 6                           |                           236 |
| General …
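
One caveat worth noting with the query above: because the spam filter sits in the WHERE clause, a category with no comments at all (its comment_approved is NULL) drops out of the result entirely. If you want those categories listed with a count of 0, move the condition into the join. An untested variant against the same schema:

SELECT wp_categories.cat_name, COUNT(wp_comments.comment_id)
FROM wp_categories LEFT JOIN wp_post2cat ON wp_categories.cat_id = wp_post2cat.category_id
LEFT JOIN wp_comments ON wp_post2cat.post_id = wp_comments.comment_post_id
    AND wp_comments.comment_approved != 'spam'
GROUP BY wp_categories.cat_id
ORDER BY 2 DESC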
[Read more]
Fifty lines of Perl to protect against web site defacing

I was following a conversation on the LogAnalysis mailing list (LogAnalysis@lists.shmoo.com), and one of the members suggested that detecting whether a web site has been defaced is one of the more valuable things to find out.

Realizing that all my hosted web sites are already stored in a MySQL table (for automated management), I wondered: what would it take to add that functionality?

Just add one column for the main page ("page TEXT"), plus a flag ("verify CHAR(1)") in case you don't want checking on a domain... and add 50 lines of Perl...
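
The schema change itself is a one-liner. A sketch, assuming the hosted-sites table is called sites (the real table name isn't shown in the excerpt):

ALTER TABLE sites
    ADD COLUMN page TEXT,                    -- last known good copy of the main page
    ADD COLUMN verify CHAR(1) DEFAULT 'Y';   -- set to 'N' to skip checking a domain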

#!/usr/bin/perl

# NOTE: Your wget needs to be at least v1.8 to handle PHP based pages
# NOTE: Could use Algorithm::Diff -- if it was more widely available

use strict;
use DBI();

my $debug = 0;

# Connect to sites database
my $dbh = DBI->connect("DBI:mysql:database=floyd;host=localhost", "USERNAME", "PASSWORD", {'RaiseError' => …
[Read more]
MySQL :: Developer Zone Quick Polls

I don't get to the MySQL Developer Zone main page often enough. In thinking about which pages I view every day or regularly, it doesn't rate as high as Planet MySQL, the MySQL Forums or even the MySQL Forge.

I was most disappointed in the results of a recent poll, "What did you think of the 2006 Users Conference?" The top response was "I had no idea there was a Users Conference." That's not good to see.

In another interesting poll, "What are you most looking forward to at the MySQL Users Conference (April 24-27)?", the clear …

[Read more]
ZeroLogik Podcast: A Little MySQL and a Lot of Everything Else

If you didn't get your fill of Jay and me on the DBAZine podcast . . . we connected with the guys over at ZeroLogik for the ZeroLogik podcast (number 19b) on Wednesday.

The ZeroLogik podcast promises a minimum of 98% pure opinion. Yes, the topic is MySQL and Pro MySQL, but that doesn't stop the conversation from veering into other topics like limited-functionality versions of software, Web 2.0, VC funding, Google Notepad, bootstrapping a business, selling Hollywood scripts, the Starship Enterprise, MacBooks and much, much …

[Read more]