I would like to be able to monitor a website through Kaseya. It would need to actually check if content is on the page, not just that port 80 is open.
I am trying to figure out what criteria Kaseya Web Server Monitoring uses to tell if a site is "up".
Does it resolve the IP address and just check if port 80 is open?
Does it look for content on the web page?
Any help would be appreciated.
I believe it sends a request to the website and, as long as it gets any answer, assumes the site is up and running. I recently had a problem with a customer whose sites were having issues and this method wasn't working: although the content wasn't there, the page was still returning some data, albeit a SQL error.
To combat this I wrote a procedure that retrieves the web page using GET URL, then scans the returned data for a word that should be displayed; if the word is missing, it pings off an email to a few people.
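For anyone not using Kaseya procedures, the same logic can be sketched in plain Python: fetch the page, look for an expected word, and email an alert if it's missing. This is only an illustration, not the actual procedure; the URL, expected word, and mail-server details are all placeholders you'd replace.

```python
import smtplib
import urllib.request
from email.message import EmailMessage


def page_is_healthy(html: str, expected_word: str) -> bool:
    """The site counts as 'up' only if the expected content is present."""
    return expected_word in html


def check_site(url: str, expected_word: str) -> bool:
    # Fetch the page body. A broken site can still answer the request
    # (e.g. a SQL error page), so we must inspect the content itself.
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return page_is_healthy(html, expected_word)


def send_alert(url: str, recipients: list[str]) -> None:
    # Placeholder SMTP details -- swap in your own mail server and addresses.
    msg = EmailMessage()
    msg["Subject"] = f"Website check failed: {url}"
    msg["From"] = "monitor@example.com"
    msg["To"] = ", ".join(recipients)
    msg.set_content(f"Expected content was not found on {url}.")
    with smtplib.SMTP("mail.example.com") as smtp:
        smtp.send_message(msg)
```

In use you'd run something like `if not check_site("http://www.yoursite.com", "Welcome"): send_alert(...)` on a schedule.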
I appreciate the information.
That was my concern. I feared it would see something was still there, but not actually let me know if the site was running properly.
I will create a script as you suggested. That makes sense.
Thanks for the reply.
Alistair could you share your script?
Here you go NIKNAK. It goes without saying you'll need to alter it with your own websites and the email addresses to send alerts to.
I run this directly on my kserver, as I know it has better connectivity than most servers, but you could run it from pretty much anywhere.
Just to add to the variety, I figured I'd post what my solution was, too.
I tried it as a procedure as Alistair suggested, but in my case I wanted it to check every minute. When I did that, I would get an email every minute until the issue was fixed.
To remedy this, instead of a procedure I used the same basic format but converted it to a batch file that runs as a "Custom" System Check. That way I can tell Kaseya to ignore alarms for 30 minutes, until I have a chance to look at the problem. This requires having a copy of wget.exe on the monitoring server.
Here's the batch file, which is stored on the server doing the monitoring; swap in your own site URL on the wget line:
wget -q -O index.html http://www.yoursite.com
REM overwrite (>) rather than append, so a stale success can't mask a failure
type index.html | find "Whatever text your index.html file should have" > c:\kworking\testsite\testsiteresults.txt
To set this up, do the following:
Go to: Monitoring: External Monitoring: System Check
Set your alarms and alerts as needed.
Choose to check "Custom"
I set the "Program" to: c:\kworking\testsite\testsite.bat
I set the "Output" to: c:\kworking\testsite\testsiteresults.txt
I told it to alarm if the output does NOT contain: "Whatever"
Of course, you would have to change the website, file and folder names to match what you are doing.
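The whole batch-file-plus-System-Check flow boils down to one rule: the results file contains a marker word only when the page holds the expected text, so Kaseya alarms whenever the marker is absent. A minimal Python sketch of that result-file logic, assuming hypothetical paths and marker text standing in for the ones in the post:

```python
from pathlib import Path

# The string Kaseya's System Check is told to look for in the output file.
MARKER = "Whatever"


def write_result(html: str, expected_text: str, results_path: Path) -> None:
    """Overwrite the results file with the marker only if the page looks right.

    Overwriting (rather than appending) matters: a stale success left in the
    file would prevent Kaseya from ever seeing a later failure.
    """
    if expected_text in html:
        results_path.write_text(MARKER + "\n")
    else:
        results_path.write_text("")  # empty file -> the alarm condition fires
```

Kaseya then reads the output file each interval and raises the alarm when it does NOT contain the marker, which is what lets you snooze alerts for 30 minutes instead of getting one email per check.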
You can also use an online tool like http://www.downnotifier.com, which can likewise check the content of the site.
Looks good, I'll try this over the weekend.
I just tweaked the below script to run for the DNS Changer Malware set to run on the 9th.
Thanks for this answer. I used it to monitor the front end of a SharePoint server. The site would be accessible, but the content would not be visible if something crashed; with this script I'm able to check the content of the site to make sure it's all still there. Thank you.