Extract single/multiple tables from MySQL dumpfile

A year back, I posted on how to extract a single table from a MySQL dump file. Today, I decided to write a shell script to automate the whole process. Now it is possible to extract a single table or a range of tables from a dump file with a single command.


The script can be invoked with or without parameters. The script usage is

./extract_table.sh mydumpfile.sql tablename tablename2

All parameters are optional. If the third argument is provided, the script will extract all tables from tablename to tablename2. If it is not specified, only tablename will be extracted.

If the first and/or second argument is omitted, the script goes into interactive mode, allowing you to select the file and table name. The interactive mode also lets you view a list of all the tables in the dump file, and you can extract either a single table or a group of tables.
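For reference, the core extraction step that the script automates can be sketched with sed. This is a toy demonstration: demo-dump.sql and the table names below are made up, but real mysqldump output uses the same `-- Table structure` marker comments.

```shell
# Create a tiny demo dump (real mysqldump output uses the same markers)
cat > demo-dump.sql <<'EOF'
-- Table structure for table `users`
CREATE TABLE `users` (id INT);
-- Table structure for table `posts`
CREATE TABLE `posts` (id INT);
EOF

# Print everything from the requested table's marker up to the next
# table's marker (that final marker line itself can be trimmed off).
table="users"
sed -n "/^-- Table structure for table \`$table\`/,/^-- Table structure for table/p" demo-dump.sql
```

The interactive and range features of the script build on top of this same range-matching idea.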


It took me a few hours to write the code. So, with the hope that someone will find this useful, I am releasing the code under the MIT, BSD and GPL licenses. Feel free to contact me if you are a fan of another license šŸ™‚


The script can be downloaded from Github. The current version is 1.0.
MySQL Dump Table Extractor

Enabling gzip compression if mod_deflate is not enabled

Today I saw a tweet by my friend Niyas that Host Gator does not support mod_gzip. This made me wonder whether it is possible to achieve this using PHP. I contacted HostGator support and confirmed that GZip support for PHP is enabled on both shared and dedicated hosting accounts, whereas mod_deflate Apache module is available only for dedicated hosting customers. Fortunately, we can use PHP to compress files on the fly.

Enabling gzip compression in an existing PHP file is very simple. Add the following line at the beginning of the file (it should appear before any output statement).

ob_start ("ob_gzhandler");

Very simple, right? But what if we have hundreds of files to edit? Don’t worry, there is a solution for that too. We can use the PHP configuration variable auto_prepend_file to automatically include the above code at the beginning of all PHP files. Copy the above code (wrapped in <?php ... ?> tags) to a file named prepend.php and place it at the root of your website (~/public_html in cPanel, ~/httpdocs in Plesk).

Now we are going to automatically include this file at the beginning of all PHP files using auto_prepend_file. Depending on how PHP is configured on your server, the steps for modifying this configuration variable are different.

If PHP is configured in CGI/FastCGI mode (HostGator uses this mode), we will use a php.ini file to make the above change. Create a file named php.ini at the root of your website and copy the following line into it. If the file already exists, append the line at the end of the file.

auto_prepend_file = <full-path-to-document-root>/prepend.php

If PHP is loaded as an Apache module (mod_php), we will have to use .htaccess to achieve the same effect. Create a file named .htaccess at the web root and copy the following line into it. If the file already exists, append the line at the end of the file.

php_value auto_prepend_file <full-path-to-document-root>/prepend.php

In both methods, replace <full-path-to-document-root> with the full path to your website root directory, for example /home/joyce/public_html.

Now the prepend.php file is automatically included by the PHP interpreter whenever a PHP file is requested. But this method (as of now) does not work for non-PHP files like HTML, CSS and JavaScript, which should also be compressed. This is because these files are handled directly by Apache and are not passed to the PHP interpreter. To ensure that these files are also compressed, we have to instruct Apache to pass them to the PHP interpreter before sending them to the browser. For this, let’s once again go back to .htaccess. Append the following code at the end of your .htaccess file (create one if you don’t have it already).

<FilesMatch "\.(htm|html|css|js)$">
	ForceType application/x-httpd-php
</FilesMatch>

The above code instructs Apache that files ending with the .htm, .html, .css and .js extensions are also PHP files and should be passed through the PHP interpreter. The only problem remaining is that the Content-type header for all these files has been changed to text/html. To fix this, we need to check the requested file name and set the correct Content-type header depending on the file extension. To do this, open prepend.php and replace the current code with

// Start output buffering with gzip compression
ob_start("ob_gzhandler");

// Set the correct Content-type header depending on the file extension
$arContentType = array('htm' => 'text/html', 'html' => 'text/html', 'css' => 'text/css', 'js' => 'text/javascript');
if(strpos($_SERVER['REQUEST_URI'], '.') !== FALSE){
	$parts = explode('.', $_SERVER['REQUEST_URI']);
	$ext = array_pop($parts);
	// Strip the query string, if any, from the extension
	if(($pos = strpos($ext, '?')) !== FALSE){
		$ext = substr($ext, 0, $pos);
	}
	if(isset($arContentType[$ext])){
		header("Content-type: {$arContentType[$ext]}; charset=UTF-8");
	}
}

// You can also set expiration headers in this file

That’s all, folks! You have successfully enabled gzip compression on your server without using any Apache modules.

This is working code, but I have tested it only with the mod_php configuration. You have to be very careful when enabling PHP processing for JS/CSS files. There may still be flaws in this code; if you find any, please let me know. We can further improve it by adding expiration and cache headers to enable client-side caching.
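As a rough sketch of that improvement, headers like the following could be added to prepend.php alongside the gzip handler. The one-week lifetime is an arbitrary example, not a recommendation:

```php
// Sketch: client-side caching headers (one week, as an example)
header('Cache-Control: public, max-age=604800');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT');
```

Lifetimes would likely need to differ between HTML pages and static CSS/JS files.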

Local Analytics – Errors & Suggestions

First of all, let me thank everyone, especially Joe, Kevin and David, for their active participation and support in reporting the errors in Local Analytics. According to my Google Analytics stats, the plugin has been downloaded over 200 times so far. A few errors were reported and fixed. Now the only unfixed problem is the corrupted zip file. I think it is caused by some problem with my system, because it is showing some other troubles too. I will format my system and recreate the files in a few days.

I request you to report any difficulty or error you are having with the plugin. If no errors are reported during the next two days, then I will declare the current version stable and start working on the next version.

I also request you to suggest any improvements and changes you wish to see in the next version. Alex and David have made very good suggestions, which I will try to incorporate in the new version. Though I don’t promise this, if I have time I will also try to integrate the Analytics reports within the Administration Panel, as Jared and ekowanz asked.

Local Analytics v1.1 Released

Today Kevin reported an error with Local Analytics: the plugin was emitting a warning when a page was accessed from an RSS feed. Though I wasn’t able to replicate the error exactly, this version should fix it. I have also made some more changes to the plugin, especially in the admin section.

As before, you can download the latest version of the plugin from the Local Analytics page.

Change Log

  • Fixed the error pointed out by Kevin
  • Removed the onclick events from RSS feed
  • Dropped domain name from tracked downloads
  • Edited admin panel to display configuration errors

Local Analytics – My first WordPress plugin

Today I am releasing my first WordPress plugin – Local Analytics. Though this is not the first plugin I have started working on, this is the first one to be released. The first plugin I started working on was WapPress – a plugin to make your blog Mobile Friendly.

Local Analytics is based on a program I wrote a few days back on how to speed up Google Analytics by locally hosting urchin.js and automatically updating it. The wonderful idea of creating a WordPress plugin from it was suggested by Carl Mercier. Carl is the author of the anti-spam plugin Defensio, which is still in beta. I have been beta testing that plugin for the past few days and found it much easier to use than Akismet.

Local Analytics is simple to use for a normal user and highly configurable for advanced users. Normal users only need to enter their Analytics account ID, whereas advanced users can control the complete behavior of the plugin.

Please try the plugin and let me know your valuable comments and suggestions.

Speed up Google Analytics using simple PHP Script

Update: Thanks to the wonderful idea by Carl Mercier, the script is now available as a WordPress plugin. Please try Local Analytics.

Google Analytics is one of the best web traffic analysis services available. The best thing about Google Analytics is that it is FREE, while most of the similar services require a premium membership for such detailed reports. Microsoft, too, has announced its own web analytics service.

Back to our topic. Recently I came across an article on JonLee.ca about speeding up Google Analytics by hosting urchin.js locally. Google support says that it is allowed, though not recommended, to host urchin.js locally. Google recommends linking directly to the file hosted on their server, because that ensures your users are downloading the latest version of the file.

A workaround proposed by the above site and many other blogs is to set up a cron job to automatically update the file at regular intervals. But most of these sites do not provide any information on how to do that, and not all web hosts support crontab. There is another issue too: when a gzip-enabled browser requests the file, Google compresses it before sending, thereby reducing the file size from 21KB to 6KB. If your server is not configured to automatically compress all JavaScript files, this can increase the file download time.
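For hosts that do support crontab, the update can be scheduled with an entry like the one below. This is only a sketch: the path reuses the example document root from later in this post, and wget is assumed to be available on the server.

```
# Fetch a fresh copy of urchin.js every day at 04:00
0 4 * * * wget -q -O /home/joyce/public_html/local-urchin.js http://www.google-analytics.com/urchin.js
```

The PHP script below removes the need for this, which is why it works even on hosts without cron access.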

Hence I have created a simple PHP script that automatically downloads the script from Google’s server at regular intervals and provides your users with the latest version of urchin.js. The script also supports gzip compression by default, though it can be turned off by changing $useGzip to false.
Here is the script:

<?php

// Remote file to download
$remoteFile = 'http://www.google-analytics.com/urchin.js';
// Local file name. Must be made writable
$localFile = "local-urchin.js";
// Time to cache in hours
$cacheTime = 24;
// Connection time out in seconds
$connTimeout = 10;
// Use Gzip compression
$useGzip = true;

// Send the correct Content-type and start gzip compression, if enabled
header("Content-type: text/javascript");
if($useGzip){
     ob_start("ob_gzhandler");
}

// Serve the cached copy if it exists and has not expired yet
if(file_exists($localFile) && (time() - ($cacheTime * 3600) < filemtime($localFile))){
     readfile($localFile);
}else{
     $url = parse_url($remoteFile);
     $host = $url['host'];
     $path = isset($url['path']) ? $url['path'] : '/';

     if (isset($url['query'])) {
          $path .= '?' . $url['query'];
     }

     $port = isset($url['port']) ? $url['port'] : '80';

     $fp = @fsockopen($host, $port, $errno, $errstr, $connTimeout);

     if(!$fp){
          // On connection failure return the cached file (if it exists)
          if(file_exists($localFile)){
               readfile($localFile);
          }
     }else{
          // Send the request headers
          $header = "GET $path HTTP/1.0\r\n";
          $header .= "Host: $host\r\n";
          $header .= "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv: Gecko/20070725 Firefox/\r\n";
          $header .= "Accept: */*\r\n";
          $header .= "Accept-Language: en-us,en;q=0.5\r\n";
          $header .= "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n";
          $header .= "Connection: close\r\n";
          $header .= "Referer: http://$host\r\n\r\n";

          fputs($fp, $header);

          // Get the response from the remote server
          $response = '';
          while($line = fread($fp, 4096)){
               $response .= $line;
          }

          // Close the connection
          fclose($fp);

          // Remove the response headers
          $pos = strpos($response, "\r\n\r\n");
          $response = substr($response, $pos + 4);

          // Return the processed response
          echo $response;

          // Save the response to the local file (creating it if needed)
          if($fp = @fopen($localFile, 'w')){
               fwrite($fp, $response);
               fclose($fp);
          }
     }
}

Installing the Script

Installing the script is extremely simple. Save the above code in a PHP file, say analytics.php, or download it from here. Edit your Google Analytics code and change http://www.google-analytics.com/urchin.js to your new file path, say http://www.yourdomain.com/analytics.php.

Now your code will be something like this (UA-xxxxxx-x stands for your own Analytics account ID):

<script src="http://www.yourdomain.com/analytics.php" type="text/javascript"></script>
<script type="text/javascript">
_uacct = "UA-xxxxxx-x";
urchinTracker();
</script>

Use this code instead of your original Google Analytics code.

After uploading analytics.php to the server, directly access the file using your web browser. This will create a local copy of urchin.js, named local-urchin.js, on your server. Make sure that local-urchin.js is writable (chmod to 777).

Important: If your server already uses zlib compression for JavaScript files, turn off gzip compression in the script by changing the $useGzip line to:

$useGzip = false;

Advantages of using the script

  • Your webpage loads faster, since your user’s browser can reuse its existing connection with your web server to download the file and doesn’t need to create a new connection to Google’s server.
  • Sometimes, though very rarely, Google’s servers become overloaded, which can drastically affect your page load time; a local copy avoids this.
  • Some ad blockers now block Google Analytics too. This script works around that as well.
  • Unlike other methods, the file is compressed before being sent to gzip-enabled browsers.

TorPlus includes Kestrel support

I am releasing a minor update of the TorPlus proxy solution. This update changes the way file paths are referenced within the Opera settings files. Previous releases used relative path names, which created minor problems with user stylesheets: the built-in IRC chat and feed reader lost their CSS formatting. Now absolute paths are used within the files. The path is updated every time TorPlus is run, so the program is still portable. It is highly recommended that you don’t run the Opera browser directly.

Felix and Ray requested Kestrel support for TorPlus. Hence I am also releasing TorKestrel+ with this version. For those who are unaware of Kestrel, it is the official alpha release of the upcoming Opera v9.5, which is promised to be much faster than all previous versions. Since Kestrel is still in alpha, TorKestrel+ is provided as a separate zip file, without Tor and Privoxy. To use TorKestrel+, download and extract the zip files of TorOpera+ and TorKestrel+ to the same directory.

Read more and download TorPlus

TorOpera Plus

Ever since I blogged on the topic ā€œProtect your privacy, Hide your identityā€, I have been thinking about creating a package similar to OperaTor. My search for a simple programming solution led me to AutoIt. According to the AutoIt Script homepage,

AutoIt is a freeware Windows automation language. It can be used to script most simple Windows-based tasks (great for PC rollouts or home automation).

I too found AutoIt a helpful tool and was really amazed by its simplicity. It also supports GUIs for your programs. It cannot be used to create resource-intensive programs, but for small programs like the one I intended to create, it is excellent.

I have created TorOpera+, which is a bundle of Opera, Tor and Privoxy. The program is fully configurable and contains many features that OperaTor lacked. It can be configured to copy Opera to the hard disk from your pen drive, which will speed up Opera. I have also included several features like an ad blocker, Flash blocker, image blocker and more.

I plan to create a similar bundle for Firefox too. Please download TorOpera+ and let me know your valuable suggestions and opinions.

jQuery – Fast, Light-weight Javascript Library

A JavaScript library is a collection of functions and methods in JavaScript for performing some of the most common needs like DOM traversal, event handling, adding Ajax interaction etc. There are many free JavaScript libraries available, like jQuery, Prototype, Scriptaculous, Dojo, MochiKit and the Yahoo! UI Library (YUI). Prototype and Scriptaculous together form the JavaScript backbone of the Ruby on Rails project. YUI is a free set of utilities and controls supported by Yahoo!. The Dojo Toolkit is a library with a Java-like syntax. All the libraries have their own highs and lows.

Recently jQuery has started getting a lot of attention. jQuery is a fast and extremely lightweight JavaScript library. jQuery v1.1.3 was just released, and it is said to be up to eight times faster than the previous version. But the biggest advantage is that the compressed size of the library is just 20KB. Don’t you think that is great? Drupal 5.0 is using jQuery instead of the older drupal.js.

Another advantage is that jQuery can be used with other Javascript Libraries without conflict. It also has a clean and modular approach to plugins. Since writing plugins is very simple in jQuery, a number of plugins are available for most of the common uses.

With jQuery, running animations like sliding and fading is very simple. Interface is a jQuery plugin for animation; it is to jQuery what Scriptaculous is to Prototype. It can be used for sorting, drag-and-drop and other complex effects.

Another important feature of jQuery is the ready function. For DOM scripting we have to wait till the complete DOM loads. Using the onload event handler for this purpose wastes time on pages with images, as the event is triggered only when every element, including the images, has been loaded. jQuery provides a workaround, i.e. the ready function, which is called as soon as the DOM has loaded completely.

Sending and retrieving data via Ajax has never been easier. jQuery provides a rich set of functions for managing it.

jQuery has a really simplified format for traversing the DOM. jQuery’s selector syntax is based heavily on CSS3 and XPath, so selecting and traversing through elements has never been simpler. Elements can be selected using an id, a class name, a combination of the two, or an XPath reference.
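As a small sketch of the ready function and selector syntax together (this assumes jQuery is loaded on the page; the #menu id and highlight class are made-up examples):

```javascript
$(document).ready(function () {
	// Runs as soon as the DOM is ready, without waiting for images.
	$("#menu li")          // CSS-style selection: <li> elements inside id "menu"
		.addClass("highlight");
});
```

The same $() function accepts ids, class names, combinations and XPath-style expressions.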

Another important feature of jQuery is method chaining. Every method called on a jQuery object returns the object itself. Hence you can apply multiple methods to an element without typing the selector again: all you have to do is chain the new method onto the end of the previous one.
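The mechanism behind chaining is simple enough to sketch in a few lines of plain JavaScript. This is not jQuery’s real code; Query, addClass and hide below are stand-ins that just show why returning the object makes chaining work:

```javascript
// Minimal illustration of method chaining: each method returns `this`,
// so calls can be strung together on a single object.
function Query(elements) {
	this.elements = elements;
}
Query.prototype.addClass = function (name) {
	this.elements.forEach(function (el) { el.classes.push(name); });
	return this; // returning the object itself is what enables chaining
};
Query.prototype.hide = function () {
	this.elements.forEach(function (el) { el.hidden = true; });
	return this;
};

var el = { classes: [], hidden: false };
new Query([el]).addClass("active").hide(); // two calls, selector typed once
```

jQuery applies this pattern uniformly, which is why almost every jQuery call can be chained.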

jQuery has made Javascript programming fun and enjoyable for me. Try it and feel its power.