Kaseya Community

Read an SQL table on an endpoint

Situation: Many endpoints in my Kaseya environment run an application that has a small local SQL database. Whenever the application (partially) fails and support is needed, the support engineer logs in to the PC remotely and has to start a program (similar to SQL Management Studio) with pre-programmed queries. The queries show the last actions the software took, the user's most recent clicks, and the latest log messages; basically, they give support information. The program shows it nicely in a table.

Issue: The process above is slow and time-consuming. It also renders the endpoint unusable, while Support wants to be able to check the information in the background without interrupting the end user.

Question: Does Kaseya have something to do this from the Kaseya VSA? If so, how is the information presented to the support engineer?

Thanks for reading. I hope there is someone out here who has a good solution!

All Replies
  • Not *directly* in the VSA, but if you know the queries and the credentials for the database, it should be easy enough to throw together an agent procedure that uses "Execute Command Shell" to run the OSQL command-line query program that ships with SQL Server.
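
    For example (untested; the server, credentials, table, and output path are placeholders):

    # OSQL ships with SQL Server; -Q runs the query and exits, -o writes the results to a file
    osql.exe -S "(local)" -d yourclientdbname -U yourclientuserid -P thepassword -Q "SELECT TOP 20 * FROM WhateverTableYouNeedToCheck" -o C:\kworking\query_output.txt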

  • There are a few possible solutions to your problem.

    #1.  If you have access to the actual SQL queries being run, you can use PowerShell and an agent procedure to execute the query, save the results to the Documents tab for the endpoint, and/or open a ticket with the results in the body of the ticket.

    #2.  If the application your engineers run accepts command-line arguments and can output its data in a form that can be redirected to disk, you can use the core concept of solution #1 to capture the output and add it to the Documents tab and/or open a ticket with the details (a rough sketch follows below).

    Neither solution is a pre-designed function of Kaseya, but both are possible with a well-designed procedure and the supporting scripting or executable(s) on the endpoint.
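
    As an untested sketch of #2 (the tool path and switch are hypothetical; substitute the real application's CLI):

    # Hypothetical support tool invocation, redirecting its output to disk
    & "C:\Program Files\SupportTool\SupportTool.exe" /export LastActions | Out-File C:\kworking\lastactions.txt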

  • Thanks for the replies so far.

    John: Talking directly to OSQL came to my mind as well, but I'm struggling to see how the result would present itself to the support engineer.

    Pedro: I don't understand what you mean by "...save the results to the documents tab...". What procedure function would you use for that?

    Writing to disk and downloading the file is a concept I understand :)

  • Maurice: basically exactly what Pedro was talking about. You would redirect the output into a text file and then use the "getFile" function in agent procedures to pull that file back over to the Documents tab. You could then send an email with a link to the document on the Documents tab (assuming the recipients have access to Kaseya).

  • While not widely known, the getFile procedure can write the file to the storage area that is reserved for the Documents tab. The path you want is "..\docs\". If you want a subfolder under the Documents tab, use "..\docs\MyFolderName\". We use this in a high number of our procedures to populate the Documents tab with subfolders of information specific to the process involved.

  • Actually, what we do is call the sendmail procedure with the actual error/diagnostics included in the body of the email message, along with storing it in the Documents tab. It's also possible to add the information to the agent logs so that you can then run reports that pull out that data.

  • Maurice: This is where PoSh is your friend. The results of the SQL query can be placed into a PSObject, which can then be exported to a .CSV (or any other format you prefer) on disk. You can then use agent procedures to read the contents of the file into a variable and from there do whatever you want with it.

    I have a Meltdown/Spectre package posted on the Automation Exchange that is an excellent example of this.
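
    A rough, untested sketch of that flow (the field names and paths are placeholders):

    # Placeholder rows standing in for the PSObjects returned by your SQL query
    $rows = @([pscustomobject]@{ Time = Get-Date; Action = 'Click'; User = 'jdoe' })

    # Export them to CSV on disk...
    $rows | Export-Csv C:\kworking\results.csv -NoTypeInformation

    # ...so an agent procedure step can later read the file back into a variable
    $body = Get-Content C:\kworking\results.csv -Raw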

  • Hey guys, thanks again for your suggestions.

    I tried the getFile to the Documents tab; that works great, I didn't know about that one.

    It is a way to get the information to the support engineer, but I don't think it will be much faster. There are many user actions involved. Say for one call an engineer must check three tables: that means selecting three separate procedures from the (very) long list, downloading each from the Docs tab through a 'save as' dialog, and then opening it in Notepad, three times. Now imagine doing that five times per hour.

    Right now I'm thinking in the direction of a centralized web server that can read and present these tables over the network (ASP.NET or something like that). Is there a way to 'add' a custom web page to the Kaseya VSA interface? That would be awesome :)

    Another option is to put a PowerShell script on all the PCs that we can run from the KLC PowerShell tool.

    PS: Pedro, I was not able to download your script from the AE; the web link I received in the 'view' popup is also invalid.

    Thanks again.

  • Hello Maurice,

    Instead of building a whole web server, you could reuse the SSRS instance that comes with SQL Server.

    We do the same for our custom Add/Remove Programs collection from the endpoints (which contains more information than what Kaseya provides: the installation date, for example).

    We have a procedure that runs on the client, collects some information, and exports it to CSV, then uses a "Transfer File" step to push those files to the SQL Server box.

    On this SQL Server box we schedule a PowerShell command that takes those CSVs and loads them into a custom table that can easily be reported on from SSRS (you can also write queries to filter this data).

    An example of what you would run on the client is this:

    # The Kaseya procedure passes the output file name as the first argument
    $outputFile = $args[0]

    $SQLServer = "(local)"
    $SQLDBName = "yourclientdbname"
    $uid = "yourclientuserid"
    $pwd = "thepassword"

    # Connection to the application's local database
    $SqlConnection = New-Object System.Data.SqlClient.SqlConnection
    $SqlConnection.ConnectionString = "Server = $SQLServer; Database = $SQLDBName; Integrated Security = False; User ID = $uid; Password = $pwd;"

    # Run the query and fill a DataSet with the results
    $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
    $SqlCmd.CommandText = "SELECT * FROM WhateverTableYouNeedToCheck"
    $SqlCmd.Connection = $SqlConnection
    $DataAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $SqlCmd
    $dataset = New-Object System.Data.DataSet
    $DataAdapter.Fill($dataset) | Out-Null

    # Export with an uncommon delimiter ([char]236 = 'ì') so commas inside the data cannot break the later import
    $delimiter = [char]236
    $dataset.Tables[0] | Export-Csv $outputFile -NoTypeInformation -Delimiter $delimiter

    You would pass the output file name to this PowerShell script as a parameter inside the Kaseya procedure.

    The name should be unique so that the CSV files do not overwrite each other when you transfer them to the SQL box for processing.

    Something like #vAgentConfiguration.AgentTempDir#\#vMachine.ComputerName#___#vAgentLabel.DisplayName#_YourTableName.csv
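
    For example (the script path and name are hypothetical), an Execute Shell Command step could invoke it as:

    # Hypothetical script path; the quoted argument uses the Kaseya variables shown above
    powershell.exe -ExecutionPolicy Bypass -File C:\kworking\Read-AppTable.ps1 "#vAgentConfiguration.AgentTempDir#\#vMachine.ComputerName#___#vAgentLabel.DisplayName#_YourTableName.csv"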

    The generated file would be transferred by the Kaseya procedure to your SQL box, where you could schedule (using the Task Scheduler if you like, or even a Kaseya procedure) something like the script below.

    In my case I use the computer name to delete previous values and keep only those obtained from the most recent read (but you could do it differently):

    # Pick up all CSV files pushed over by the Kaseya procedure
    $files = Get-ChildItem -Path "C:\kworking\anypathwhereyoustorethosefiles" -Filter "*.csv"

    $SQLServer = "your sql server instance"
    $SQLDBName = "your db name"
    $uid = "youruserid"
    $pwd = "thepassword"

    ForEach ($f in $files)
    {
        # The computer name is encoded in the file name, up to the "___" separator
        $computerName = $f.Name.Substring(0, $f.Name.IndexOf("___"))

        $SqlConnection = New-Object System.Data.SqlClient.SqlConnection
        $SqlConnection.ConnectionString = "Server = $SQLServer; Database = $SQLDBName; Integrated Security = False; User ID = $uid; Password = $pwd;"
        $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
        $SqlCmd.Connection = $SqlConnection

        # Remove this machine's previous values so only the most recent read is kept
        $SqlCmd.CommandText = "DELETE FROM YourCustomTable WHERE ComputerName = '" + $computerName + "'"
        $SqlConnection.Open()
        $SqlCmd.ExecuteNonQuery() | Out-Null
        $SqlConnection.Close()

        # Bulk-load the CSV. FIRSTROW = 2 skips the header row written by Export-Csv,
        # and FIELDTERMINATOR 'ì' matches the [char]236 delimiter used on the endpoint.
        $SqlCmd.CommandText = "BULK INSERT YourCustomTable FROM '" + $f.FullName + "' WITH(FORMAT = 'CSV', FIRSTROW = 2, FIELDTERMINATOR = 'ì', ROWTERMINATOR = '\n', TABLOCK )"
        $SqlConnection.Open()
        $SqlCmd.ExecuteNonQuery() | Out-Null
        $SqlConnection.Close()

        # Archive the processed file
        Move-Item -Path $f.FullName -Destination ($f.DirectoryName + "\Processed\") -Force
    }

    The above would automate pushing the data from the endpoint to a central database.

    The last thing to do is to write an SSRS report to give technicians the ability to see this data with one click.
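
    The report's dataset query could be as simple as this (using the table from the scripts above, with @ComputerName as a report parameter):

    -- @ComputerName is an SSRS report parameter; the table name is as in the scripts above
    SELECT * FROM YourCustomTable WHERE ComputerName = @ComputerName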

    Alex

  • Hey guys,

    I had written a nice post about what I've done, but it didn't get posted.

    What I did was make a PowerShell script (with the help of the SqlServer cmdlets) to talk to the local DB. It fires queries straight from PowerShell to SQL and presents the table on the PowerShell CLI.

    This is the fastest way to get to the information.
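
    The core of it is something like this sketch (untested; server, database, credentials, and table are placeholders, and the SqlServer module comes from the PowerShell Gallery):

    Import-Module SqlServer

    # Fire a canned support query at the application's local DB and render the
    # result as a table straight on the console
    Invoke-Sqlcmd -ServerInstance "(local)" -Database "yourclientdbname" -Username "yourclientuserid" -Password "thepassword" -Query "SELECT TOP 20 * FROM WhateverTableYouNeedToCheck" | Format-Table -AutoSize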

    Thanks a lot for sharing!