Jan 30, 2012
Shell Scripting

There are many instances where an administrator needs to check the status of application URLs from time to time to ensure that all applications deployed on a server are running fine. The script below is designed to do exactly that, and it helps a lot when you have a large number of application URLs which need to be checked regularly. This bash script to check URL status can be used to check URL availability.

This setup consists of 3 files:

urllist: File storing the application URLs, one per line.

Example "urllist" file, which needs to be created in the same directory where the script will reside (placeholder URLs shown):

http://example.com:8080/app1
http://example.com:8080/app2

Script_Monitor.log: The log file, generated in the same directory where the script is run. It contains the statuses of all URLs checked.

CheckURL.sh: The bash script which will be used to check the application URLs.

Functions and variables inside the script:

SetParam(): Contains all the variables which need to be set in the script.

URL_Status(): Main function, which evaluates the HTTP response codes.

Mail_Admin(): Sends mail to the admin mailing list in case of a server-down issue.

Send_Log(): Sends all logs generated by the script to the admin mailing list.

Main_Menu(): Used to call all the functions.

tee command: Used to record the logs.

curl command: Used to get the status codes.
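The two commands can be tried on their own before reading the full script. A minimal sketch (example.com is a placeholder URL; with no network the code comes back as 000):

```shell
#!/bin/bash
# Grab only the HTTP status code for one URL (placeholder: example.com),
# then print it and append it to the log file in a single step via tee.
code=$(curl --max-time 10 --output /dev/null --silent --head --write-out '%{http_code}' "http://example.com")
echo "status: $code" | tee -a Script_Monitor.log
```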







#!/bin/bash
# Bash script to check url status.
# set -x;  # Enable this to turn on debug mode.
# clear    # Enable this to clear your screen after each run.

SetParam() {
  export URLFILE="urllist"
  export TIME=`date +%d-%m-%Y_%H.%M.%S`
  SAFE_STATUSCODES=( 200 201 202 203 204 205 206 207 208 226 401 )
  export STATUS_UP=`echo -e "\E[32m[ RUNNING ]\E[0m"`
  export STATUS_DOWN=`echo -e "\E[31m[ DOWN ]\E[0m"`
  export MAIL_TO="admin(at)techpaste(dot)com"
  export SCRIPT_LOG="Script_Monitor.log"
}

URL_Status() {
  sed -i '/^$/d' $URLFILE   # Parse URLFILE and remove blank rows
  while read next
  do
    STATUS_CODE=`curl --output /dev/null --silent --head --write-out '%{http_code}\n' $next`
    # If you want to set a timeout then add --max-time 15, here 15 is 15 seconds
    case $STATUS_CODE in
      100) echo "At $TIME: $next url status returned $STATUS_CODE : Continue" ;;
      101) echo "At $TIME: $next url status returned $STATUS_CODE : Switching Protocols" ;;
      102) echo "At $TIME: $next url status returned $STATUS_CODE : Processing (WebDAV) (RFC 2518)" ;;
      103) echo "At $TIME: $next url status returned $STATUS_CODE : Checkpoint" ;;
      122) echo "At $TIME: $next url status returned $STATUS_CODE : Request-URI too long" ;;
      200) echo "At $TIME: $next url status returned $STATUS_CODE : OK" ;;
      201) echo "At $TIME: $next url status returned $STATUS_CODE : Created" ;;
      202) echo "At $TIME: $next url status returned $STATUS_CODE : Accepted" ;;
      203) echo "At $TIME: $next url status returned $STATUS_CODE : Non-Authoritative Information" ;;
      204) echo "At $TIME: $next url status returned $STATUS_CODE : No Content" ;;
      205) echo "At $TIME: $next url status returned $STATUS_CODE : Reset Content" ;;
      206) echo "At $TIME: $next url status returned $STATUS_CODE : Partial Content" ;;
      207) echo "At $TIME: $next url status returned $STATUS_CODE : Multi-Status (WebDAV) (RFC 4918)" ;;
      208) echo "At $TIME: $next url status returned $STATUS_CODE : Already Reported (WebDAV) (RFC 5842)" ;;
      226) echo "At $TIME: $next url status returned $STATUS_CODE : IM Used (RFC 3229)" ;;
      300) echo "At $TIME: $next url status returned $STATUS_CODE : Multiple Choices" ;;
      301) echo "At $TIME: $next url status returned $STATUS_CODE : Moved Permanently" ;;
      302) echo "At $TIME: $next url status returned $STATUS_CODE : Found" ;;
      303) echo "At $TIME: $next url status returned $STATUS_CODE : See Other" ;;
      304) echo "At $TIME: $next url status returned $STATUS_CODE : Not Modified" ;;
      305) echo "At $TIME: $next url status returned $STATUS_CODE : Use Proxy" ;;
      306) echo "At $TIME: $next url status returned $STATUS_CODE : Switch Proxy" ;;
      307) echo "At $TIME: $next url status returned $STATUS_CODE : Temporary Redirect (since HTTP/1.1)" ;;
      308) echo "At $TIME: $next url status returned $STATUS_CODE : Resume Incomplete" ;;
      400) echo "At $TIME: $next url status returned $STATUS_CODE : Bad Request" ;;
      401) echo "At $TIME: $next url status returned $STATUS_CODE : Unauthorized" ;;
      402) echo "At $TIME: $next url status returned $STATUS_CODE : Payment Required" ;;
      403) echo "At $TIME: $next url status returned $STATUS_CODE : Forbidden" ;;
      404) echo "At $TIME: $next url status returned $STATUS_CODE : Not Found" ;;
      405) echo "At $TIME: $next url status returned $STATUS_CODE : Method Not Allowed" ;;
      406) echo "At $TIME: $next url status returned $STATUS_CODE : Not Acceptable" ;;
      407) echo "At $TIME: $next url status returned $STATUS_CODE : Proxy Authentication Required" ;;
      408) echo "At $TIME: $next url status returned $STATUS_CODE : Request Timeout" ;;
      409) echo "At $TIME: $next url status returned $STATUS_CODE : Conflict" ;;
      410) echo "At $TIME: $next url status returned $STATUS_CODE : Gone" ;;
      411) echo "At $TIME: $next url status returned $STATUS_CODE : Length Required" ;;
      412) echo "At $TIME: $next url status returned $STATUS_CODE : Precondition Failed" ;;
      413) echo "At $TIME: $next url status returned $STATUS_CODE : Request Entity Too Large" ;;
      414) echo "At $TIME: $next url status returned $STATUS_CODE : Request-URI Too Long" ;;
      415) echo "At $TIME: $next url status returned $STATUS_CODE : Unsupported Media Type" ;;
      416) echo "At $TIME: $next url status returned $STATUS_CODE : Requested Range Not Satisfiable" ;;
      417) echo "At $TIME: $next url status returned $STATUS_CODE : Expectation Failed" ;;
      500) echo "At $TIME: $next url status returned $STATUS_CODE : Internal Server Error" ;;
      501) echo "At $TIME: $next url status returned $STATUS_CODE : Not Implemented" ;;
      502) echo "At $TIME: $next url status returned $STATUS_CODE : Bad Gateway" ;;
      503) echo "At $TIME: $next url status returned $STATUS_CODE : Service Unavailable" ;;
      504) echo "At $TIME: $next url status returned $STATUS_CODE : Gateway Timeout" ;;
      505) echo "At $TIME: $next url status returned $STATUS_CODE : HTTP Version Not Supported" ;;
      *)   echo "At $TIME: $next url status returned $STATUS_CODE : No valid HTTP response" ;;
    esac
    URL_SafeStatus $STATUS_CODE
  done < $URLFILE
}

URL_SafeStatus() {
  flag=0
  for safestatus in ${SAFE_STATUSCODES[@]}
  do
    # echo "got Value of STATUS CODE= $1";
    # echo "Reading Safe Code= $safestatus";
    if [ $1 -eq $safestatus ] ; then
      echo "At $TIME: Status Of URL $next = $STATUS_UP"
      flag=1
    fi
  done
  if [ $flag -ne 1 ] ; then
    echo "At $TIME: Status Of URL $next = $STATUS_DOWN"
    Mail_Admin $TIME $next
  fi
}

Mail_Admin() {
  echo "At $1 URL $2 is DOWN!!" | mailx -s "Application URL: $2 DOWN!!!" $MAIL_TO
}

Send_Log() {
  if [ -f $SCRIPT_LOG ] ; then
    mailx -s "$0 Script All Url Check Log Details Till $TIME" $MAIL_TO < $SCRIPT_LOG
  fi
}

Main_Menu() {
  URL_Status
  Send_Log
}

SetParam
Main_Menu | tee -a $SCRIPT_LOG


After running the script with the above urllist, the output will look like below. It will also send a mail to the MAIL_TO list in case any url is found down or its status code is not in the safe list defined in the SetParam function.

[Screenshot: url check script output]

Note: Edited MAIL_TO due to the bombardment of test emails from users :). See my comment.


  20 Responses to “Bash script to check url status”

  1. Hi All,

    Please change the email address in MAIL_TO="admin(at)techpaste(dot)com" before you test the script.
    My INBOX is getting bombarded with your test emails. Please make sure you change it to some other email id before you test.


  2. I like this script – it’s easy to understand and it works well when checking application urls.

    It’s worth adding --max-time 10 to the curl command so that it drops out after 10 seconds. Then, if there are network issues or the site is down, the script will not hang.

    PS I hate your share banner as it hides the lhs of the text in Firefox 13.0.1. Annoying.

    • I forgot to add that curl returns a status of 28 if there is no response before max-time is reached, so it’s easy to include a check to see if there was a timeout.

      • Hi Kevin,

        We liked your suggestion and have added a comment to add max-time to the script.

        PS: We liked your PS too and have disabled the share banner… 🙂
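[Editor’s note: a minimal sketch of the timeout check described in this thread; the URL is a placeholder, and curl exits with status 28 when --max-time is exceeded.]

```shell
#!/bin/bash
# Probe a URL with a hard 10-second cap; curl exit code 28 means it timed out.
url="http://example.com"   # placeholder URL
code=$(curl --max-time 10 --output /dev/null --silent --head --write-out '%{http_code}' "$url")
rc=$?
if [ $rc -eq 28 ]; then
    echo "Timed out: no response from $url within 10s"
else
    echo "HTTP status: $code (curl exit code: $rc)"
fi
```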


  3. Hi

    I was looking for the same kind of script which checks URLs.

    I tried running your script but did not get the expected output.

    I guess I missed making the appropriate changes to the lines below:

    STATUS_CODE=`curl --output /dev/null --silent --head --write-out '%{http_code}\n' $next`
    # If you want to set a timeout then add --max-time 15, here 15 is 15 seconds

    could you please explain this …

    Thanks in advance 🙂

    • Hi Veer,

      What’s the error you are getting while running the script on your side?

      STATUS_CODE=`curl --output /dev/null --silent --head --write-out '%{http_code}\n' $next`

      For the above one, max-time is the timeout in seconds for the url check. You can use it like below, but it’s not required; the script should just work fine without the max-time option too.

      STATUS_CODE=`curl --max-time 15 --output /dev/null --silent --head --write-out '%{http_code}\n' $next`

  4. Hi Admin,

    I am trying to use the above script for checking a few application URLs.
    I just need to check whether a particular URL is up and running or not.

    Also, please let me know how I can check https URLs.

    • You can use the -k option in curl for ssl urls.

      -k, --insecure: (SSL) This option explicitly allows curl to perform “insecure” SSL connections and transfers.

  5. Hi Admin,

    Let me clear the picture bit more…

    I am trying to access the application URL which I used to launch via Citrix, i.e. remote Internet Explorer.
    When I am running this script I am getting the status code “302” for a few urls, which means URL FOUND,
    but I can say that the url is up and running only when the return code is “200”, i.e. URL is OK.

    Also, for some of the urls I am getting the code ‘000’, which is not a valid return code.

    Appreciate your help on this.

    Thanks 🙂

    • Hi Veer,

      The HTTP response status code 302 Found is a common way of performing a redirection. A few of your urls are giving 302 as they must be redirecting to some new url, e.g. for authentication, while accessing them. You will never get a 200 status code from this kind of url. In this case what you can do is add 302 to the safe status list like below:

      replace: SAFE_STATUSCODES=( 200 201 202 203 204 205 206 207 208 226 401 )

      with: SAFE_STATUSCODES=( 200 201 202 203 204 205 206 207 208 226 302 401 )

      After this it will report status OK for the urls which are getting redirected to some other url while you access them.

      Check more about 302 behavior here http://en.wikipedia.org/wiki/HTTP_302

      If you have static urls which do not redirect to multiple urls for authentication, you can also use the below script to check them.
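[Editor’s note: separately, on the ‘000’ codes mentioned in the question above — curl writes 000 as %{http_code} when it gets no HTTP response at all (DNS failure, refused connection, or timeout), so it can be handled explicitly. A sketch, using a placeholder URL that is expected to refuse connections:]

```shell
#!/bin/bash
# curl prints "000" as %{http_code} when no HTTP response was received at all.
url="http://localhost:1"   # placeholder; port 1 should refuse the connection
code=$(curl --max-time 5 --output /dev/null --silent --head --write-out '%{http_code}' "$url")
if [ "$code" = "000" ]; then
    echo "No HTTP response from $url (down, unreachable, or timed out)"
else
    echo "Got HTTP $code from $url"
fi
```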



  6. Hi Admin,

    I made a few changes in the above script.
    And now it works perfectly fine for http connections but does not work for https connections, i.e.

    http://XXXXXX.sg.ap.XXXXX.com:12050/mantas :: WORKS FINE

    https://XXXXXX.sg.ap.XXXXXX.com:9017/abn_adm_ae :: DOES NOT WORK

    Is there any way of checking secured connections as well?


    • Hi Veer,

      Glad things are working fine for you. You can use the -k option to check simple ssl urls.

      Replace: STATUS_CODE=`curl --output /dev/null --silent --head --write-out '%{http_code}\n' $next`
      With: STATUS_CODE=`curl -k --output /dev/null --silent --head --write-out '%{http_code}\n' $next`

      From the curl man pages:

      (SSL) This option explicitly allows curl to perform “insecure” SSL connections and transfers. All SSL connections are attempted
      to be made secure by using the CA certificate bundle installed by default. This makes all connections considered “insecure”
      fail unless -k/--insecure is used.

      If this option is used twice, the second time will again disable it.

      • Hi Admin,

        It’s working fine according to my requirement.

        Thanks a lot for your support!!!

        • Hi Admin,

          once again I am here..

          Could you please let me know how the same script can work with the wget command.


          • Something like this
            wget --server-response http://www.google.com 2>&1 | awk '/^  HTTP/{print $2}'
            wget --spider -S "http://url/to/be/checked" 2>&1 | grep "HTTP/" | awk '{print $2}'

            Do a quick Google search and you will get many posts about the same.

  7. Hi Admin

    I made all the changes but it is asking for a user id and password?
    I am very new to shell scripting; could you please help?


  8. Hi Admin

    Script is working fine but https:// xxxxx
    urls are not getting the proper result
    can you please help 🙂

    I tried -k but no luck

    can you give me the exact code to use https:// ?


  9. The output file is created but nothing is written to it.

