I'm working on migrating application deployment from Software Deployment to Agent Procedures.
We are a SaaS customer, and downloading from the Kaseya server is frustratingly slow. The one good thing about Software Deployment is how it uses a local cache (if available) to speed up file downloads.
Our agents are geographically dispersed and seldom have access to internal file shares, so I'm looking to use a cloud-based location to store the files to download.
Any recommendations on where to host the files, and how to download them (do I just use curl?)
We found that FTP with any scriptable (command-line) FTP client works really well through the Kaseya Agent. The only reason to use the FTP client would be to password-protect the FTP site.
I am sure there are other methods to do this, but this has worked for us in the past.
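For anyone wanting to try the FTP approach, one simple way to script it on a Windows endpoint is to generate a command file for the built-in ftp.exe client and run it non-interactively. This is just a sketch - the host, credentials, and file paths are placeholders, and it's shown in POSIX shell form for brevity (a batch equivalent would use `echo` redirects):

```shell
# Sketch: non-interactive FTP download via Windows' built-in ftp.exe.
# Host, user, password, and file names below are illustrative placeholders.
cat > ftp_get.txt <<'EOF'
open ftp.example.com
user deployuser deploypass
binary
get installer.msi
bye
EOF
# The Agent Procedure would then execute:  ftp -s:ftp_get.txt
```

Setting `binary` matters for executables and installers; ASCII-mode transfers will corrupt them.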
Our utility software is updated daily by each agent via HTTPS from a cloud server array. We use a custom-developed app that does the work very efficiently, but we prototyped the process using wget.exe and that worked well - it was just slower when downloading the 68-70 files because of the individual invocations of wget.exe. Our tool decides what to download and gets all the needed files in one execution. We chose wget over curl because it was more up-to-date and had some features we were interested in, but for general file downloads via scripts, either should work fine, and downloading 1-2 files should not be a noticeable performance hit.
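Worth noting: plain wget can also avoid the per-file process-startup cost by reading all URLs from an input list (`-i`), so the whole set downloads in one invocation. The server name and file names here are placeholders, not the poster's actual setup:

```shell
# Sketch: batch download with a single wget process via an input list.
# URLs are illustrative placeholders.
cat > files.txt <<'EOF'
https://downloads.example.com/utils/tool1.exe
https://downloads.example.com/utils/tool2.dll
https://downloads.example.com/utils/defs.dat
EOF
# One process for the whole batch; --timestamping skips files that
# haven't changed on the server since the last download:
# wget -q --timestamping -i files.txt -P ./utils
```

That gets close to the "decide what to download, fetch everything in one execution" behavior of a custom tool without writing one.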
We had to define one of our custom file extensions on the web server before files of that type could be downloaded, but otherwise, hosting the files on the web server has worked well. The only other challenge we've experienced was when a couple of systems didn't have the latest root certs and could not validate our certificate. We adjusted our code to fall back to HTTP if HTTPS fails. We're not passing any data to the website, just downloading software, so HTTPS is desired, not mandatory. Updating the root certs on the agents resolved the issue.
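The HTTPS-first, HTTP-fallback pattern described above can be sketched with curl. The function name, argument handling, and URL are my own illustration, not the poster's actual code:

```shell
# Sketch: try HTTPS first, fall back to HTTP if the TLS handshake or
# certificate validation fails. Takes a host/path (no scheme) and an
# output file name.
fetch() {
  url="$1"; out="$2"
  if ! curl -fsS "https://$url" -o "$out"; then
    # Endpoints with stale root certs fail certificate validation.
    # Since we only pull public software and send no data, plain HTTP
    # is an acceptable fallback here.
    curl -fsS "http://$url" -o "$out"
  fi
}
```

Usage would look like `fetch downloads.example.com/tool.exe tool.exe`. In a real deployment you'd want to log when the fallback fires, since it signals an endpoint whose root cert store needs updating.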
Thanks to everyone for your responses.
Question answered + more!
You can host your files elsewhere and use the GetURL command in the VSA, which will distribute the file from your source to the endpoint.
I really like the speed of GetURL for large packages (400+ MB). However, as we discussed in Minneapolis last week, GetURL as implemented exposes an insecure interface outside of the agent's secure channel, whether or not you store files for downloading to the agent behind that interface. I would really like to see Kaseya set up a secure service/method ensuring that no surface of the VSA outside the agent's secure channel can be accessed without a set of credentials, etc.
I would like to hear how the FTP solution was implemented. Can you share? I guess that, at best, keeping the files on a server other than the VSA and using the FTP solution with credentials is a step in the right direction, but the interface on the VSA that allows GetURL as implemented today still needs a lockdown.
Hopefully I am not blowing smoke on this. Please correct me if I am overreacting.
I recall that conversation. Thanks again for attending our local event in Minneapolis - our partners and users had a great time.
As we spoke about - serving files of a sensitive business nature this way could pose a potential security gap from an access standpoint; however, in my particular case the main driver for my GetURL use was third-party/general business-facing software.