View currently running MySQL queries in realtime

Today I was playing around with Apache Solr. I was really impressed by its text searching capability, especially the MoreLikeThis search handler. I wanted to configure the DataImportHandler to import data directly from my MySQL database. It was really easy to configure, and I was able to perform a full import quickly. But when I tried to do a delta import, I found that it was not working as expected. Even though I was calling the delta import, it was causing a full import.

You might be wondering why I am mentioning all this here. Well, I suspected that the problem was actually with my SQL query for the delta load. But to be sure, I wanted to see the query being executed by the Solr DataImportHandler. As always, I turned to Google for assistance, and I finally reached the MySQL documentation on the General Query Log. Voila! This was exactly what I wanted. All I had to do was start the server with the --log=[filename] parameter, and all my queries would be logged to the specified file. Nice, isn't it?

Now I have to stop my running MySQL server and restart it with the --log switch, in addition to the other regular options. But there was a problem: I was not sure of the other required parameters. You can use the ps utility, while the MySQL server is running, to find out the current parameters.

ps -ax | grep mysql

For me the output was

/usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --user=mysql --pid-file=/usr/local/mysql/data/localhost.pid --port=3306 --socket=/tmp/mysql.sock

Now shut down the MySQL server.

# On Mac
/Library/StartupItems/MySQLCOM/MySQLCOM stop

# For other Linux/Unix variants, try
/etc/init.d/mysqld stop
service mysql stop

Start mysqld with the --log option

/usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --user=mysql --pid-file=/usr/local/mysql/data/localhost.pid --port=3306 --socket=/tmp/mysql.sock --log=/tmp/query.log

The general query log contains lots of irrelevant information. To view the log after filtering out the unwanted details, use tail and grep as given below

tail -f /tmp/query.log | grep -v Connect | grep -v Quit

The amount of information added to the file is quite large. If you are using this on a production server, I recommend turning off the logging once you are done.
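As an aside, newer MySQL releases (5.1 and later) let you toggle the general query log at runtime, without restarting the server at all. A sketch, assuming a local server and root credentials:

```
# Enable the general query log at runtime (MySQL 5.1+),
# then disable it again when you are done.
mysql -u root -p -e "SET GLOBAL general_log_file = '/tmp/query.log'; SET GLOBAL general_log = 'ON';"

# ... inspect /tmp/query.log ...

mysql -u root -p -e "SET GLOBAL general_log = 'OFF';"
```

This avoids the shutdown/restart dance entirely, which is especially handy on a production server.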

Saving wget file with different filename

Anyone who has worked with Linux must be familiar with the wget utility. The wget utility allows you to download files from a remote server using the HTTP, HTTPS and FTP protocols. Downloading files using wget is as simple as

wget http://www.joycebabu.com/my-photo.jpg
Where http://www.joycebabu.com/my-photo.jpg is the file to be downloaded.

By default wget saves the file with the same name as the fetched file. For example, the above command will save the file as my-photo.jpg in the current working directory. If a file already exists with the given name, the downloaded file will be named my-photo.jpg.1, my-photo.jpg.2, etc., until a nonexistent filename is found.

It is possible to explicitly specify a different name for the downloaded file, using the -O switch (--output-document also works, but I believe short is sweet). The new command is

wget -O photo.jpg http://www.joycebabu.com/my-photo.jpg
Where photo.jpg is the new filename.

But be careful while using the -O switch. If there is an existing file with the same name, it will be overwritten with the new file.
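If overwriting is the worry, wget also has the opposite behaviour built in: the -nc (--no-clobber) switch skips the download entirely when a file with that name already exists, rather than numbering or overwriting it.

```
# -nc / --no-clobber: if my-photo.jpg already exists locally,
# wget leaves it alone and skips the download.
wget -nc http://www.joycebabu.com/my-photo.jpg
```

Note that -nc and -O serve opposite goals, so you would use one or the other depending on whether you want to protect or replace the existing file.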

Copying multiple files simultaneously using scp utility

Happy New Year to all.

I have been using the Secure Copy (scp) utility for copying files between my local server and development server. Sometimes I have to copy more than one file. Previously I used to copy the files one at a time. This is very annoying, as you have to type the password every time you use the command. But it is possible to copy multiple files using scp, just like the copy (cp) utility.
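Incidentally, if retyping the password is the main annoyance, OpenSSH (4.0 and later) supports connection multiplexing, so repeated ssh/scp commands reuse one authenticated connection. A sketch of the relevant ~/.ssh/config entries, using the same host as the examples here (the ControlPath directory must exist beforehand):

```
# ~/.ssh/config
# Reuse one authenticated connection for subsequent ssh/scp calls.
Host joycebabu.com
    ControlMaster auto
    ControlPath ~/.ssh/sockets/%r@%h-%p
    ControlPersist 10m
```

With this in place you type the password once, and later scp invocations to the same host piggyback on the open connection.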

When you have to copy multiple files to your remote server, the syntax is similar to the cp command.

scp file1.sql file2.sh joyce@joycebabu.com:~/upload

Where file1.sql and file2.sh are the files to be copied, joyce is the username, joycebabu.com is the hostname and ~/upload is the destination directory on the remote server.

In order to download multiple files from the remote server, the command to be used is

scp joyce@joycebabu.com:"file1.log file2.log" ~/logs

Where file1.log and file2.log are the files to be downloaded and ~/logs is the destination directory on the local server. Notice the quotes around the filenames. This ensures that the filename list is not expanded by the local shell and is passed to the remote shell as-is. Similarly, when you want to download files using wildcards (*.php, files_?.log, etc.), you should enclose the pattern within quotes to ensure that the expansion is done by the remote server.
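This local-versus-remote expansion can be seen without any server at all, using echo as a stand-in for the command being run:

```shell
# The same glob behaves differently depending on who expands it.
dir=$(mktemp -d)
cd "$dir"
touch file1.log file2.log

# Unquoted: the local shell expands *.log before the command runs.
echo *.log      # file1.log file2.log

# Quoted: the pattern is passed through literally, so a remote
# shell would be the one to expand it.
echo "*.log"    # *.log
```

The unquoted form hands the command a list of local matches; the quoted form hands it the raw pattern.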

The -r option can be used to copy directories recursively.

scp -r joyce@joycebabu.com:~/logs ~/logs

This may not be a lifesaver tip, and the time gained by this method may be small. After all, when a large number of files are to be transferred, I use FTP or tar my files and copy the archive. But at times when things go wrong, even this small gain can help.