2 Replies Latest reply on Jun 11, 2020 2:07 PM by Michael Kramer

    TSCO - SQL metrics

    Steve Lake

      Interested in the ways our community has set up data pulls from SQL servers. I would like to pull SQL metrics into TSCO. What types of data sources were used to extract from? Pulled from a SQL monitoring tool, or from the SQL server itself?

        • 1. Re: TSCO - SQL metrics
          Ben Davies

          Steve Lake


          It may be helpful to separate an SQL Server (the hardware, virtual or not) from an SQL Service (the software delivering a service).


          Like any other SERVER monitoring, you go where the data is. Who is monitoring the hardware? Where is that data? Make a TSCO ETL to get that data. This is just like any other SERVER monitoring metric.


          Like any other SERVICE monitoring, you go where the data is. An SQL Service is no different from a WEB service, a login service, an authentication service, etc. Who is monitoring the service? Where is that data? Make a TSCO ETL to get that data.


          The 'problem(s)' arise when the data is local to the server or service. That implies an ETL connecting to each server and/or service to get the monitoring data. While this is possible individually, it quickly gets out of hand if you try to scale individual ETLs across an environment.
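          To make the scaling problem concrete, here is a minimal sketch. Every name in it (host names, config fields, the helper functions) is hypothetical, purely to illustrate that per-host ETLs mean one connection definition, credential set and schedule per host, while a consolidated data source needs only one.

```python
# Sketch (hypothetical names throughout): generating one ETL definition per
# monitored host quickly multiplies the configuration you must maintain.

def per_host_etl_configs(hosts, port=1433):
    """Return one ETL connection definition per host.

    Every entry is a separate connection, credential set and schedule
    to manage -- the administrative cost grows linearly with the estate.
    """
    return [
        {
            "name": f"etl-sql-{host}",
            "host": host,
            "port": port,
            "schedule": "hourly",   # each ETL is scheduled independently
        }
        for host in hosts
    ]


def consolidated_etl_config(db_host):
    """A single ETL against a consolidated data source replaces them all."""
    return {"name": "etl-sql-consolidated", "host": db_host, "schedule": "hourly"}


hosts = [f"sqlsrv{i:02d}" for i in range(1, 51)]   # 50 hypothetical servers
print(len(per_host_etl_configs(hosts)))            # 50 ETL definitions to maintain...
print(consolidated_etl_config("monitoring-db"))    # ...versus one
```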


          A 'typical' life-cycle may look like this:

          1. Enable a local (server and/or service) monitoring tool to prove what data can be gathered.

          2. Analyze that data to prove that actionable intelligence can be made from it.

          3. Create an individual ETL (or three) to the data source to prove that the data can be made available to TSCO and used alongside other TSCO-collected data.

          4. To the extent that this data is useful in TSCO, devise a strategy to implement it on a larger scale (to all other servers and/or services). Trying to scale individual ETLs to individual servers and services is a sub-optimal solution (and will eat your life).


          This, I think, is the root of the question: what have others done in similar cases?


          When we had a similar challenge, the server and service teams got together and used NMON to monitor the servers and/or services, then used a utility to push that data into an NMON database (a MySQL database). I do not recall the exact details, but this link seems close to the idea: http://ps-2.kev009.com/AIXtip/nmon2mysql.htm. Then we used TSCO to query the resulting database.
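          As a hedged sketch of that last step, querying the consolidated database: assuming an nmon2mysql-style MySQL schema (the table and column names below are guesses, not the actual nmon2mysql schema), a one-off extractor might pull a time window of CPU samples and write them out as CSV for a file-based TSCO ETL to pick up. `mysql-connector-python` is the assumed driver.

```python
import csv
import sys

# Hypothetical table/column names -- check the real nmon2mysql schema.
EXTRACT_SQL = (
    "SELECT hostname, sample_time, user_pct, sys_pct, wait_pct "
    "FROM cpu_all "
    "WHERE sample_time BETWEEN %s AND %s "
    "ORDER BY hostname, sample_time"
)


def rows_to_csv(rows, out):
    """Write (hostname, time, user, sys, wait) rows as CSV for a file ETL."""
    writer = csv.writer(out)
    writer.writerow(["hostname", "sample_time", "cpu_user", "cpu_sys", "cpu_wait"])
    for row in rows:
        writer.writerow(row)


def extract(start, end, **mysql_kwargs):
    """Pull one time window of samples from the consolidated NMON database."""
    import mysql.connector  # pip install mysql-connector-python (assumed driver)

    conn = mysql.connector.connect(**mysql_kwargs)
    try:
        cur = conn.cursor()
        cur.execute(EXTRACT_SQL, (start, end))
        return cur.fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    # Connection details are placeholders.
    rows = extract("2020-06-01 00:00", "2020-06-02 00:00",
                   host="monitoring-db", user="tsco", password="...",
                   database="nmon")
    rows_to_csv(rows, sys.stdout)
```

          One advantage of landing the data as CSV is that the extraction schedule and the TSCO ingestion schedule stay decoupled.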


          The monitoring setup was handled by the normal system administration process (and team), and the ongoing operation by the normal monitoring administration process (and team). The Capacity Management Team ingested the data with one-off TSCO ETLs to prove that, if the data were available, it would yield actionable intelligence (or not). Our preliminary analysis showed that the monitoring efforts were worth the cost (licensing, processing, transportation, storage, and management) in time, money, and effort. Then we removed the several one-off ETLs and replaced them with a single ETL to the consolidated data source (database).


          In another case we had an 'expensive' monitoring tool with, say, 100 licenses. The server and monitoring teams moved the licenses from one application stack to another to monitor and model performance. TSCO had an ETL to the monitoring service's database and got whatever data was available. For a couple of weeks we got data from servers that supported the BlueWidget Application. Then the agents were shifted to servers that supported the PinkElephants Application, then to something different a few weeks on. TSCO just gathered the data available and augmented other data already in TSCO. On a given server you would see a set of metrics being monitored for a few weeks, then the data collection stopped. I have to say that this effort was soon abandoned, as the server and monitoring teams abandoned the monitoring tool altogether. The takeaway is that TSCO cares not one bit about where the data comes from. TSCO just gets the data it has been told to get, to the extent that data is available.

          • 2. Re: TSCO - SQL metrics
            Michael Kramer

            I'm working on doing this right now, actually. I've set up collection of perfmon counters from those systems into Splunk for real-time monitoring. We're only able to keep 30 days of data there, so I'm working on a query using the Splunk connector to pull data into TSCO so that the database metrics can be used.
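            As a hedged sketch of one way to bridge that 30-day retention window: Splunk's REST export endpoint (`/services/search/jobs/export`) can stream search results as CSV, which a file-based ETL could then ingest. The index, sourcetype, and counter names below are hypothetical, and `requests` is the assumed HTTP client; the Splunk connector mentioned above would replace most of this hand-rolled plumbing.

```python
def build_spl(counter, earliest="-30d", index="perfmon", sourcetype="Perfmon"):
    """Build a Splunk search (SPL) string for one perfmon counter.

    The index, sourcetype, and field names here are hypothetical --
    adjust them to match your actual perfmon inputs.
    """
    return (
        f'search index={index} sourcetype={sourcetype} counter="{counter}" '
        f"earliest={earliest} "
        "| table _time host counter Value"
    )


def export_csv(base_url, user, password, spl):
    """Stream search results from Splunk's REST export endpoint as CSV lines."""
    import requests  # pip install requests (assumed HTTP client)

    resp = requests.post(
        f"{base_url}/services/search/jobs/export",
        auth=(user, password),
        data={"search": spl, "output_mode": "csv"},
        verify=False,  # lab sketch only; verify certificates in production
        stream=True,
    )
    resp.raise_for_status()
    return resp.iter_lines(decode_unicode=True)


if __name__ == "__main__":
    # Host and credentials are placeholders.
    spl = build_spl(r"SQLServer:Buffer Manager\Page life expectancy")
    for line in export_csv("https://splunk.example.com:8089",
                           "tsco", "...", spl):
        print(line)
```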
