Monitor Your Own Uptime

If you manage websites, you know how bad it feels to have someone tell you a site is not only down but has been that way for two hours … or days. There are services out there that will monitor your sites' uptime, but many of them charge money and are clunky at best. I'm going to share a simple PHP script I wrote to monitor the sites I maintain for work along with my personal sites.

I'll assume you can already use PHP; if you can't, start with the PHP basics first. To begin, you want an array of the sites to test:

$sitelist = array(
        "http://asdf.not_a_domain_and_will_fail.com",
        "http://www.jonathanmccarver.com",
        "http://google.com"
);

The first URL is one we know will fail, for testing purposes. The other two should be up and serving content. To test the sites I will be using cURL, because it is already well integrated with PHP and it does everything we need. So, in a loop over the array, let's use cURL to read the contents of each site.

$errormsg = "There is an error with the following sites: \n\n";
$error = False;

foreach ($sitelist as $i => $site)
{
        $crl = curl_init(); // create a curl object
        $timeout = 10; // give it 10 seconds to connect to the site
        curl_setopt ($crl, CURLOPT_URL, $site); // assign the url
        curl_setopt ($crl, CURLOPT_RETURNTRANSFER, 1); // return the page as a string instead of printing it
        curl_setopt ($crl, CURLOPT_CONNECTTIMEOUT, $timeout); // set the connection timeout
        $content = curl_exec($crl); // run the read command

        if( curl_errno($crl) ) // see if the curl object threw an error
        {
                $error = True;
                $errormsg .= $site." was not able to be read. (curl error: ".curl_error($crl).")\n";
                // save the error
        }
}
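A quick aside: CURLOPT_CONNECTTIMEOUT only limits how long cURL waits to establish the connection. If you also want to cap the total time spent reading the page, cURL has a separate option you could add inside the same loop:

        curl_setopt ($crl, CURLOPT_TIMEOUT, $timeout); // also limit the total transfer time, not just the connection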

Now we have a loop that attempts to read each site and saves a message if the site can't be reached. This is good, but unfortunately that is not the only way sites fail. You may get a 404 Not Found, a 503 error, a blank page, or a successful load. Let's start by checking status codes.

        $status = curl_getinfo($crl, CURLINFO_HTTP_CODE);
        $status_range =  (string)$status; // convert to a string
        $status_range = $status_range[0]; // get the first digit

        if( !( $status_range == '2' || $status_range == '3' ) )
        // 200's are a good response and 300's are redirects
        // everything else is some kind of error.
        {
                $error = True;
                $errormsg .= $site." returned a code of ".$status." and needs to be checked.\n";
        }
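As an aside, if pulling the first digit off a string feels clunky, the same test can be written as a plain numeric comparison and it behaves the same way:

        if( $status < 200 || $status >= 400 ) // anything outside the 200s and 300s is treated as an error
        {
                $error = True;
                $errormsg .= $site." returned a code of ".$status." and needs to be checked.\n";
        }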

HTTP status codes are three-digit numbers and there are a lot of them. The 200s all mean success, and the 300s are redirects of some kind, which for my purposes also counts as a working site. By only passing when the first digit is a 2 or a 3, we catch every possible error code. There is one more situation to check, though: if a site renders successfully it returns a 200 code, but what if it delivers no content? We will look at the length of the content to see if that is the case.

        $length = strlen( $content );
        if( $length < 2 )
        {
                $error = True;
                $errormsg .= $site." returned no content.\n";
        }

        curl_close($crl);

This is all happening in the same loop we started with. At the end of this snippet you will see we close the cURL handle. This is good practice to prevent memory leaks, especially since we will be running this code unattended on a timer. Remember, if the server hosting this code crashes, you'll never know the difference.
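If a bare length check feels too blunt, another option is to look for a piece of text you know should appear on the page. This is just a sketch (the 'Expected text' string is a placeholder you would swap out per site) and it would go inside the same loop, before curl_close():

        // optional stricter check: make sure some text you expect is actually in the page
        // 'Expected text' is a placeholder, replace it with something that should appear on your site
        if( $content !== false && strpos($content, 'Expected text') === false )
        {
                $error = True;
                $errormsg .= $site." did not contain the expected text.\n";
        }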

We've done a pretty good job of determining whether the sites on our list are up and working, but this script doesn't communicate with anyone. You could output the string, but that doesn't alert you; it only displays when you run the script. First, let's make it email us if it finds a problem.

if( $error )
{
        echo $errormsg;

        $to      = 'address1@email.com,address2@email.com';
        $subject = '## Websites are down ##';
        $message = $errormsg;
        $headers = 'From: noreply@mywebsite.com' . "\r\n" .
        'X-Mailer: PHP/' . phpversion();

        mail($to, $subject, $message, $headers);
}
else
{
        echo "No errors encountered.\n";
}
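One caveat with mail(): it just hands the message off to whatever sendmail or mail server the host has configured, and a return value of true only means the message was accepted, not that it was delivered. If the alerts never show up, test the server's mail setup first with a throwaway script like this (the address is a placeholder):

<?php
// quick test of the server's mail setup, replace the address with your own
var_dump( mail('you@example.com', 'mail() test', 'If you got this, mail() works.') );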

Now it sends an email, but it needs to run on a timer. I have Linux-based servers, so that means I'll use a cron job. Editing your crontab or cron configuration can be confusing and it is easy to mess up. On the command line, assuming you have the right permissions, you can enter crontab -l (that's a lowercase L) to see what cron jobs your system has. You can enter crontab -e to start editing the file, but be warned: it uses your default editor, which will probably be "vi", a VERY confusing program. You can change your default to the easier-to-use nano by adding this line to your .bash_profile file: "export EDITOR=nano". If none of the last three sentences made any sense to you, you should probably stop here and learn Unix/Linux a little better before trying to set up a cron job.

Here is an example of the line you might add to your crontab.

0,5,10,15,20,25,30,35,40,45,50,55 * * * * /usr/local/bin/php /home/mydirectory/htdocs/uptime/index.php >> /home/mydirectory/htdocs/uptime/Logfile.txt

If that looks like a lot of gibberish, it just means you're paying attention. The "0,5,10,15,20,25,30,35,40,45,50,55 * * * *" part instructs cron to run the script every 5 minutes. Everything after that is just a command you could type on the command line yourself, and that's what gets executed. One important difference from being logged in and typing it yourself is that you need to use full paths. If I were in the right directory I could just enter "php index.php" to run the program, but cron executes the line without your usual environment or working directory, so you have to spell everything out. Find out where your PHP binary is (the first thing after the asterisks) by typing "which php" on the command line. I am also keeping a log by appending the output to Logfile.txt: the >> instructs the system to append the program's output to the end of that file (as opposed to a single > which would overwrite it every time).
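As a shortcut, most cron implementations also accept step values, so the long list of minutes can be written more compactly. This line does the same thing as the one above:

*/5 * * * * /usr/local/bin/php /home/mydirectory/htdocs/uptime/index.php >> /home/mydirectory/htdocs/uptime/Logfile.txt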

I know you just want the full source code, so here it is:

<?php

$sitelist = array(
        "http://asdf.not_a_domain_and_will_fail.com",
        "http://www.jonathanmccarver.com",
        "http://google.com"
);

$errormsg = "There is an error with the following sites: \n\n";
$error = False;

foreach ($sitelist as $i => $site)
{
        $crl = curl_init();
        $timeout = 10;
        curl_setopt ($crl, CURLOPT_URL, $site);
        curl_setopt ($crl, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt ($crl, CURLOPT_CONNECTTIMEOUT, $timeout);
        $content = curl_exec($crl);

        if( curl_errno($crl) )
        {
                $error = True;
                $errormsg .= $site." was not able to be read. (curl error: ".curl_error($crl).")\n";
        }

        $status = curl_getinfo($crl, CURLINFO_HTTP_CODE);
        $status_range =  (string)$status;
        $status_range = $status_range[0]; // status starts as an integer and has to be converted to string to get the first digit

        if( !( $status_range == '2' || $status_range == '3' ) ) // 200's are a good response and 300's are redirects
        {
                $error = True;
                $errormsg .= $site." returned a code of ".$status." and needs to be checked.\n";
        }

        $length = strlen( $content );
        if( $length < 2 )
        {
                $error = True;
                $errormsg .= $site." returned no content.\n";
        }

        curl_close($crl);
}

if( $error )
{
        echo $errormsg;

        $to      = 'address1@email.com,address2@email.com';
        $subject = '## Websites are down ##';
        $message = $errormsg;
        $headers = 'From: noreply@mywebsite.com' . "\r\n" .
        'X-Mailer: PHP/' . phpversion();

        mail($to, $subject, $message, $headers);
}
else
{
        echo "No errors encountered.\n";
}

echo date(DATE_RSS)."\n";

?>

And if you just skipped down here to copy and paste it, good luck to you. If you run into trouble, read the article, and if it still doesn't work, leave me a comment.

One last tip: get another server. If you run the code that checks your site on the same server that hosts it, then you've just wasted a bunch of time. If that server crashes, NEITHER of those two things works, so your website is down and you don't get an email because your checking script went down with it. If you're running a site that's important enough for constant monitoring, it's worth another 8 or 10 dollars a month for a second hosting plan to run the check from.

7 thoughts on “Monitor Your Own Uptime”

  1. Jeff

    Good stuff here. I implemented this and now have monitoring of my site that has been having issues lately. I send the alerts (if there are any) to my cell phone.

  2. Brice

    Thanks a lot for this helpful post.

    I will try it today! I will try to save the results in a database so I can get figures about my websites' uptime rate.

    Have a good day!

  3. Brice

    Tested and perfectly working!

    Thanks again for your script!

    NB: To identify your bot in the logs of your servers, you can add a line to the configuration of your cURL task:
    curl_setopt($crl, CURLOPT_USERAGENT, 'My Uptime Monitoring Bot (PHP and cURL)');

  4. Stiofan

    Thanks for the script, it was exactly what I was looking for so I didn't have to write it myself 🙂 I have combined it with my own script that now opens a ticket with my hosting provider if all 3 of my sites are down. Many thanks.
    Stiofan
