History of User Agent string

If you are a web developer, you have probably seen the User Agent strings of the common web browsers. I have always wondered why the UA string begins with Mozilla even for browsers like Internet Explorer and Opera, which are not based on Mozilla code. I found the answer on the WebAIM blog.

History of the browser user-agent string

This is the part that I liked the most:

And then Google built Chrome, and Chrome used Webkit, and it was like Safari, and wanted pages built for Safari, and so pretended to be Safari. And thus Chrome used WebKit, and pretended to be Safari, and WebKit pretended to be KHTML, and KHTML pretended to be Gecko, and all browsers pretended to be Mozilla, and Chrome called itself Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/ Safari/525.13, and the user agent string was a complete mess, and near useless, and everyone pretended to be everyone else, and confusion abounded.

View currently running MySQL queries in realtime

Today I was playing around with Apache Solr. I was really impressed by its text searching capability, especially the MoreLikeThis search handler. I wanted to configure the DataImportHandler to import data directly from my MySQL database. It was really easy to configure, and I was able to perform a full import quickly. But when I tried to do a delta import, I found that it was not working as expected. Even though I was calling the delta import, it was causing a full import.

You might be wondering why I am saying all this here. Well, I suspected that the problem was actually with my SQL query for the delta load. But to be sure, I wanted to see the query being executed by the Solr DataImportHandler. As always, I turned to Google for assistance, and I finally reached the MySQL documentation on the General Query Log. Voila! This was exactly what I wanted. All I had to do was use the --log=[filename] parameter, and all my queries would be logged to the specified log file. Nice, isn't it?

Now I had to stop my running MySQL server and restart it with the --log switch, in addition to the other regular options. But there was a problem: I was not sure of the other required parameters. You can use the ps utility, while the MySQL server is running, to find out the current parameters.

ps -ax | grep mysql

For me the output was

/usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --user=mysql --pid-file=/usr/local/mysql/data/localhost.pid --port=3306 --socket=/tmp/mysql.sock

Now shutdown the MySQL server.

# On Mac
/Library/StartupItems/MySQLCOM/MySQLCOM stop

# On other Linux/Unix variants, try one of
/etc/init.d/mysqld stop
service mysql stop

Start mysqld with the --log option

/usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --user=mysql --pid-file=/usr/local/mysql/data/localhost.pid --port=3306 --socket=/tmp/mysql.sock --log=/tmp/query.log

The general query log contains lots of irrelevant information. To view the log after filtering out the unwanted details use tail and grep as given below

tail -f /tmp/query.log | grep -v Connect | grep -v Quit

The amount of information added to the file is quite large. If you are using this on a production server, I recommend turning off the logging once you are done.
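Incidentally, if your server runs MySQL 5.1 or later, you can skip the restart altogether: the general query log can be toggled at runtime with a couple of SQL statements. A sketch, assuming your account has the SUPER privilege:

```sql
-- Available from MySQL 5.1 onwards; requires the SUPER privilege
SET GLOBAL general_log_file = '/tmp/query.log';
SET GLOBAL general_log = 'ON';

-- ... inspect the queries, then turn logging off again
SET GLOBAL general_log = 'OFF';
```

This way the server keeps running with its existing options, and you do not need to dig out the original startup parameters at all.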

Get free SMS alerts when your website is down

Every webmaster wants their website to have 100% uptime. But the truth is that not even Google is able to ensure that. So the next best thing we can hope for is to be informed whenever a website goes down, so that the necessary actions can be taken to bring it back online. Unfortunately, it is neither possible nor practical to constantly monitor your website yourself. Hence we turn to uptime monitoring services like Pingdom, Mon.itor.us, BasicState etc. The best uptime monitoring services charge a premium fee. Fortunately, the services mentioned above also offer a cut-down version of their premium service for free. Most of the time, the free tier is limited in frequency of checks, alert methods, detailed statistics, number of checks etc. In fact, none of the free uptime monitoring services I have used provided free SMS alerts directly. They either require us to pay or to use our own SMS gateway subscription.

Everything I have mentioned so far might be known to you already. So why the misleading title? Well, I am not finished yet. Yes, you can get a free every-minute uptime check with free SMS notifications when your website is down, depending on your location and mobile service provider.

Pingdom is one of the best uptime monitoring services available. I had been a premium subscriber till last month. When my subscription period was about to end, I started looking for an alternative, since I felt I was not using my subscription fully. I was using only one check, while a premium account allowed up to 5 checks. Most of the free services I checked were limited in frequency of checks (the best being every 5 minutes). Finally, I came across Pingdom's Twitter account and found that they were offering free accounts with one every-minute check. The account also comes with 20 free SMS credits. What happens once those are over? Their alert methods also include sending direct messages to your Twitter account. Twitter, in turn, allows SMS updates for tweets from people you follow. By setting up a Direct Message alert in Pingdom and following PingdomAlert on Twitter with SMS alerts turned on, you can get unlimited free SMS alerts when your website is down.

When you depend on two services, a problem with either provider can cause delays and reduce the overall effectiveness of the system. Even if Pingdom sends the alert within a minute, there is no guarantee that Twitter will deliver the SMS on time. If you have a critical application, do not rely on such a setup; go for a premium subscription instead. When I checked, I got SMS notifications within a minute of receiving a Direct Message. Also, SMS alerts are not that important for me, since I have my mailbox open most of the time and usually detect downtime within 5 minutes from email alerts. Your case might be different, so use this at your own risk.

Enabling gzip compression if mod_deflate is not enabled

Today I saw a tweet by my friend Niyas that Host Gator does not support mod_gzip. This made me wonder whether it is possible to achieve this using PHP. I contacted HostGator support and confirmed that GZip support for PHP is enabled on both shared and dedicated hosting accounts, whereas mod_deflate Apache module is available only for dedicated hosting customers. Fortunately, we can use PHP to compress files on the fly.

Enabling gzip compression in an existing PHP file is very simple. Add the following line at the beginning of the file (it should appear before any output statement).

ob_start("ob_gzhandler");

Very simple, right? But what if we have hundreds of files to edit? Don't worry, there is a solution for that too. We can use the PHP configuration directive auto_prepend_file to automatically include the above code at the beginning of every PHP file. Copy the above code (wrapped in <?php ... ?> tags) to a file named prepend.php and place it at the root of your website (~/public_html in cPanel, ~/httpdocs in Plesk).

Now we are going to automatically include the above file at the beginning of all PHP files using auto_prepend_file. Depending on how PHP is configured on your server, the steps for setting this configuration directive differ.

If PHP is configured in CGI/FastCGI mode (HostGator uses this mode), we will use a php.ini file to make the change. Create a file named php.ini at the root of your website and copy the following line into it. If the file already exists, append the line at the end.

auto_prepend_file = <full-path-to-document-root>/prepend.php
If PHP is loaded as an Apache module (mod_php), we will have to use .htaccess to achieve the same effect. Create a file named .htaccess at the web root and copy the following line into it. If the file already exists, append it at the end.

php_value auto_prepend_file <full-path-to-document-root>/prepend.php

In both methods, replace <full-path-to-document-root> with the full path to your website root directory, for example /home/joyce/public_html.

Now the prepend.php file is automatically included by the PHP interpreter whenever a PHP file is requested. But this method (as of now) does not work for non-PHP files like HTML, CSS, JavaScript etc., which should also be compressed. This is because these files are served directly by Apache and are not passed to the PHP interpreter. To ensure that these files are also compressed, we have to instruct Apache to pass them to the PHP interpreter before sending them to the browser. For this, let's once again go back to .htaccess. Append the following code at the end of your .htaccess file (create one if you don't have it already).

<FilesMatch "\.(htm|html|css|js)$">
	ForceType application/x-httpd-php
</FilesMatch>

The above code tells Apache that files ending with the .htm, .html, .css and .js extensions are also PHP files and should be passed through the PHP interpreter. The only problem remaining is that the Content-type header for all these files is now text/html. To fix this, we need to check the requested file name and set the correct Content-type header depending on the file extension. To do this, open prepend.php and replace the current code with

// Start output buffering with gzip compression
ob_start("ob_gzhandler");

// Set the correct Content-type header depending on the file extension
$arContentType = array('htm' => 'text/html', 'html' => 'text/html', 'css' => 'text/css', 'js' => 'text/javascript');
if(strpos($_SERVER['REQUEST_URI'], '.') !== FALSE){
	$ext = explode('.', $_SERVER['REQUEST_URI']);
	$ext = array_pop($ext);
	// Strip the query string, if any
	if(($pos = strpos($ext, '?')) !== FALSE){
		$ext = substr($ext, 0, $pos);
	}
	if(isset($arContentType[$ext])){
		header("Content-type: {$arContentType[$ext]}; charset=UTF-8");
	}
}

// You can also set expiration headers in this file

That’s all Folks. You have successfully enabled Gzip compression on your server without using any Apache modules.

This code works, but I have tested it only with the mod_php configuration. You have to be very careful when enabling PHP processing for JS/CSS files. There may be flaws in this code; if you find any, please let me know. We can further improve it by adding expiration and cache headers to enable client-side caching.
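To confirm that the compression is actually reaching browsers, you can check for the Content-Encoding response header. Here is a small shell helper for that; the URL in the usage comment is a placeholder, so point it at a CSS or JS file on your own site:

```shell
# Prints the Content-Encoding header line if (and only if) the
# response is gzip-compressed; prints nothing otherwise.
check_gzip() {
  curl -sI -H 'Accept-Encoding: gzip' "$1" | grep -i '^content-encoding:.*gzip'
}

# Usage (placeholder URL):
# check_gzip http://your-site.example/style.css
```

If the function prints nothing for a file type you expected to be compressed, re-check the FilesMatch rules above.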

Google Analytics Reporting Suite on AIR

Today I am going to write about an excellent application which I discovered yesterday. The main reason why I hated checking my Google Analytics reports was that I hated logging in to my Analytics account. My personal Google account is different from the one I use for website-related purposes, including AdSense, Sitemaps and Analytics. Whenever I try to check my personal mail and my Analytics account together, I get logged out of the former. That was when I discovered the Google Analytics Reporting Suite, based on Adobe Integrated Runtime (previously Apollo), developed by Nico.

Google Analytics Reporting Suite is a desktop application based on AIR, which allows you to access your Google Analytics reports directly from your desktop. GARS provides a much better interface compared to the original web application. It also implements almost all the features provided by the web application.

The program allows you to manage multiple profiles from different Google Accounts. The login credentials are saved on your computer, so that you don't have to log in again every time you check your reports.

Here is a screencast of the previous version

Highlighted features of GARS (from AboutNico.be)

  • Easy profile selection and account management
  • Use multiple profiles from different Analytics accounts
  • All visitors, traffic and content reports available
  • Tabbed interface to easily switch between reports
  • Data drilldown, goal values, data segmentation
  • Animated, interactive graphs
  • Advanced data grids with filtering and paging
  • Switch between interactive reports or PDF reports
  • Site overlay view
  • Exports to PDF, Excel and XML

Google Analytics AIR v1.0 sneak peek

The program is still in beta, with beta 2 released on October 5th. I recommend everyone to try out the program and provide feedback to Nico, so that it can be improved further. Version 1.0 of the reporting suite is expected to be released by early 2008. A sneak peek of Google Analytics AIR 1.0 has been provided, and the new version is much prettier.

Google does not provide an official API for external programs to access Google Analytics data. Fortunately, programmers have developed unofficial APIs [1, 2] for doing the same, even though Google remains silent on the legality of using them. The Google Analytics Reporting Suite is also based on an unofficial API developed by Nico, by studying the workings of the web application for months. Fortunately, both Google and Adobe have extended support to Nico for developing the application further. GARS is included in Adobe's Showcase program, and Google has offered assistance in making the program more secure.

Now that I have the Google Analytics Reporting Suite installed, I check my Google Analytics reports 4-5 times a day. Thank you Nico. 🙂


Speed up page loading & save bandwidth by reducing page size

Yesterday, while fixing some browser compatibility issues in my new theme, I was surprised to find that my homepage was over 400 KB in size and took more than 30 seconds to load completely over my DSL connection. More than 80 files, including images, JavaScript and CSS, were embedded in the page.

I decided to optimize the page and found that the following actions can be taken to decrease the load time.

  • Reduce the number of requests
    First of all, I opened up my theme's stylesheet and found several unused classes with image backgrounds and bullets. I removed those classes and all references to non-existent files.
  • Reduce the total download size
    I tried removing unwanted markup and HTML comments left by my theme and plugins, but that did not make much difference in size. Next, I converted some of the images from PNG to JPEG format, which reduced the total size by over 30 KB. But most of the page size was due to the embedded JavaScript and CSS files. For example, the Prototype JavaScript library is over 90 KB in size, which on compression gets reduced to 22 KB. Hence I used the Apache module mod_deflate to compress my files dynamically for gzip-enabled browsers. I used the following code in my .htaccess file to compress files of the following MIME types: text/css, text/html, text/plain, text/xml and application/x-javascript. For this to work on your server, you should have mod_deflate enabled. Most web hosts have it enabled. If not, you may also try PHP's gzip or zlib compression.

    # Insert filter
    SetOutputFilter DEFLATE
    AddOutputFilterByType DEFLATE text/css text/html text/plain text/xml application/x-javascript
    # Netscape 4.x has some problems...
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    # Netscape 4.06-4.08 have some more problems
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    # MSIE masquerades as Netscape, but it is fine
    # BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
    # NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
    # the above regex won't work. You can use the following
    # workaround to get the desired effect:
    BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
    # Make sure proxies don't deliver the wrong content
    Header append Vary User-Agent env=!dont-vary
  • Cache files client side
    Almost all browsers can cache files on your computer for faster loading. If we set the expiry time of a file, the browser will check for updated content only after the specified time. We can use the Apache module mod_expires for this purpose. The following code asks the browser to cache all files for three hours. This makes the browser download each file only once during a session, but it will pick up updated content on the next visit. You may also use the ExpiresByType directive to set different times for images, JavaScript and CSS files.

    # Cache all files for the next 3 hours
    ExpiresActive On
    ExpiresDefault "access plus 3 hours"
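To get a feel for how much DEFLATE can save, you can approximate it locally with the gzip command-line tool, whose output size is close to what mod_deflate produces. The sample file below is made up for illustration; try the same comparison on your own CSS and JS files:

```shell
# Build a repetitive sample file standing in for a JS library,
# then compare its raw and gzip-compressed sizes.
yes 'var example = "hello world";' | head -n 2000 > sample.js
original=$(wc -c < sample.js)
compressed=$(gzip -c sample.js | wc -c)
echo "original=$original bytes, compressed=$compressed bytes"
```

Real scripts compress less dramatically than this artificial sample, but text-based assets routinely shrink to a quarter of their size or better.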


The final result was great. The homepage size got reduced to 159 KB, less than half of the original size. The page load time also dropped from about 32 seconds to 19.3 seconds.

The load time and page size were calculated using the Web Page Speed Report tool from www.websiteoptimization.com.

Make sure you back up your current .htaccess file before trying the above code, so that you can revert if something goes wrong.