11 Replies Latest reply on May 20, 2019 4:47 AM by Stefan Antelmann

    Requirement for higher granularity for exceptional servers

    Oliver Buering

      We have a Business requirement to deliver reports for some servers at a higher granularity to analyze business peak periods (e.g. Christmas period).

      Data is collected by agents. We understand lowest spill interval can be set to 1 minute.

       

      Will we be able to report 1 minute interval with DETAIL granularity setting in a quick analysis ?

       

      Thanks in advance

      Oliver

        • 1. Re: Requirement for higher granularity for exceptional servers
          Dima Seliverstov

The requirements for this type of reporting will be very high in terms of the size of UDR data, processing the data on the GWS, and storing the data in the TSCO database.  You may want to leverage the webInvestigate tool for viewing high-resolution data.

          You can use

1. UDR: collect the data at a 1-minute interval (which increases the size of the UDR data), but process the data in 5-minute, 15-minute, or 1-hour increments.

     (UDR data will be bigger, but no impact to processing or the TSCO database.)

2. Agent History (real time and near real time), with resolutions as low as 10 seconds.

   (No impact to UDR data, processing, or the TSCO database.)
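To see why option 1 grows the UDR data, here is a back-of-envelope sketch of how sample counts scale with the spill interval (illustrative arithmetic only, not a BMC-supplied sizing formula):

```python
# Illustrative: UDR volume scales roughly linearly with sampling frequency,
# so moving from a 5-minute to a 1-minute spill interval multiplies the
# number of samples per metric by 5.

def samples_per_day(interval_minutes: float) -> int:
    """Number of samples one metric produces per day at a given interval."""
    return int(24 * 60 / interval_minutes)

baseline = samples_per_day(5)   # samples/day at a 5-minute spill interval
high_res = samples_per_day(1)   # samples/day at a 1-minute spill interval
growth = high_res / baseline    # relative growth in UDR data per metric
```

Multiplied across every metric group and every server, this is why limiting the 1-minute configuration to a small subset of machines matters.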

           

          Overview

          Setting up and using the Investigate tool - Documentation for BMC TrueSight Capacity Optimization 11.5 - BMC Documentati…

           

          Details

          The Investigate Tool | TrueSight Capacity Optimization - YouTube

           

          Command line support

          Re: Web Investigate command line

          1 of 1 people found this helpful
          • 2. Re: Requirement for higher granularity for exceptional servers
            Darryl Day

            The agent can go as low as 1 second.

             

However, a wider question: can DETAIL be amended to a lower level than 5 minutes overall?  If so, a separate TSCO instance could be used. I have a similar requirement.

            • 3. Re: Requirement for higher granularity for exceptional servers
              Dima Seliverstov

              I would assume that the resolution in the vis file is used in the vis ETL.

              • 4. Re: Requirement for higher granularity for exceptional servers
                Darryl Day

It was a TSCO-in-general question. I meant: in general, can DETAIL be amended to 1 minute? (Not just for VIS.)

                 

                Thanks

                • 5. Re: Requirement for higher granularity for exceptional servers
                  Dima Seliverstov

                  The resolution is dependent on the data source used by the ETL.

The limits are the resolution provided by the data source, as well as the load profile of extracting the data from it (which may scale linearly or non-linearly with resolution).

                  Either the data may not be available at the lowest resolution, or the extraction would impact the product being monitored.

                  • 6. Re: Requirement for higher granularity for exceptional servers
                    Darryl Day

                    Hi Dima, rewording the question

                     

Assuming I can get 1-minute data, can I amend the TSCO config to use 1 minute as DETAIL, rather than the default of 5 minutes? I thought this was possible in early releases.

                    • 7. Re: Requirement for higher granularity for exceptional servers
                      Dima Seliverstov

                      It would depend on a specific data source. The danger is that the measurement will impact what is being measured. Most likely a restriction was put in for this reason.

                      • 8. Re: Requirement for higher granularity for exceptional servers
                        Oliver Buering

                        Thank you Dima for your answer, but not sure why the status is “Answered”. WebInvestigate doesn’t meet our requirement to send reports to business owners on a scheduled basis.

The requirement is to have the data at a higher granularity available in the data warehouse for only a small group of servers, so we can report it on a scheduled basis and also in a TSPS
view. Additionally, we need to find the best candidates for consolidation in a Virtual Planner study. Systems with extreme peaks occurring at the same time shall not be mixed onto the same virtual server.

                         

                        WebInvestigate does not meet these requirements.

                         

The ETL documentation states that DETAIL granularity is configurable, with a default of 5 minutes. Given that we want to limit the number of systems and metric groups to keep the amount of data low, how can we report RAW or DETAIL data at 1-minute granularity?

                        • 9. Re: Requirement for higher granularity for exceptional servers
                          Darryl Day

                          Raw may be a consideration. It is the "by Default 5 minutes" I was also asking about.

                           

There may be an option to have one TSCO instance at a lower granularity (the default amended to, say, 1 minute) for a subset of servers.

                          • 10. Re: Requirement for higher granularity for exceptional servers
                            Dima Seliverstov

                            The shortest summarization period supported for data import into TSCO is a 5 minute interval.

You can configure Manager runs to process data at a 1-minute interval, but the data will not be visible in reports (only in the internal BCO table SYS_DATA_DETAIL).

                             

                            Full details of the limitations.

                            For TSCO, what are the recommendations regarding high granularity VIS parser ETL data being imported into the database?

                             

Article Number: 000160885

Language: English

Article Type: Product/Service Description

Product Version: 11.3.01

Applies to: TrueSight Capacity Optimization 11.3, 11.0, 10.7, 10.5

Detail
                            Officially the smallest interval supported by a TSCO Gateway Manager run is 2 minutes.  There is a workaround to allow VIS files to be created with a 1 minute interval but it isn't officially supported (although it is a configuration we've seen used in the field successfully -- typically for a small subset of the total machines in an environment).

                             

The shortest summarization period supported for data import into TSCO is a 5-minute interval.  That is the minimum interval length for data to be stored in SYS_DATA_DETAIL.  If Manager is configured to create a VIS file with 1- or 2-minute intervals, when that data is imported into TSCO by the TSCO Gateway Server VIS parser ETL, it will be summarized to a 5-minute interval as it is loaded into the SYS_DATA_DETAIL table.  The 1- (or 2-) minute data points will be summarized into a single 5-minute data point, but that data point will carry the MIN, MAX, and AVG statistics.  So, in addition to the typical AVG value, it will be possible to report the MIN or MAX value from the data points summarized together for each 5-minute period.
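The MIN/MAX/AVG summarization can be pictured with a small sketch (illustrative only; the real ETL logic is internal to TSCO). Five 1-minute utilization samples collapse into a single SYS_DATA_DETAIL point, and the MAX statistic preserves a spike that a plain 5-minute average would hide:

```python
# Sketch: collapse one 5-minute window of 1-minute samples into a single
# data point carrying AVG, MIN, and MAX (mirroring SYS_DATA_DETAIL semantics).

def summarize_5min(samples: list[float]) -> dict:
    """Summarize a window of fine-grained samples into one 5-minute point."""
    return {
        "AVG": sum(samples) / len(samples),
        "MIN": min(samples),
        "MAX": max(samples),
    }

window = [12.0, 85.0, 40.0, 33.0, 30.0]   # CPU% at a 1-minute spill interval
point = summarize_5min(window)
# point["MAX"] retains the 85% spike even though the 5-minute AVG is 40%
```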

                             

There is a separate data storage area in TSCO for high-granularity data called the RAW summary level, but using it would require significant technical consideration (possibly a Professional Services engagement), because the RAW data summary level isn't partitioned (a significant performance concern) and isn't managed by the Database Space Manager (so data isn't aged out of it).  But, if 1-minute data were really necessary, it would be possible to re-configure the TSCO Gateway Server ETLs to import the RAW data, and it would then be necessary to build a data management and cleanup regime on top of that.
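Since the RAW level isn't managed by the Database Space Manager, any use of it needs its own age-out job. Here is a heavily hedged sketch of the kind of cleanup such a regime might generate; the table name SYS_DATA_RAW and the timestamp column are assumptions for illustration, not documented TSCO schema:

```python
# Hedged sketch of a RAW-level cleanup job. SYS_DATA_RAW and the "ts"
# column are hypothetical names used for illustration only; check the
# actual schema before building anything like this.
import datetime

RETENTION_DAYS = 14  # assumed retention policy for high-granularity data

def build_cleanup_sql(retention_days: int = RETENTION_DAYS) -> str:
    """Return a DELETE statement aging out RAW rows older than the cutoff."""
    cutoff = datetime.date.today() - datetime.timedelta(days=retention_days)
    return (
        "DELETE FROM SYS_DATA_RAW "
        f"WHERE ts < DATE '{cutoff.isoformat()}'"
    )
```

A real regime would also schedule this statement, commit in batches, and monitor table growth, which is part of why a Professional Services engagement is suggested.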

                             

The hardware requirements for the TSCO Gateway Server and TSCO Application Server/ETL Engine scale roughly linearly with the sampling frequency (that is, inversely with the summarization interval).  So, if you are currently importing data into TSCO at a 15-minute interval, the hardware requirements will be about 3 times greater to import data at a 5-minute summary interval, and 15 times greater at a 1-minute summary interval.
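The scaling rule of thumb above reduces to simple arithmetic (illustrative; actual sizing should follow BMC's published guidelines):

```python
# Rough sizing multiplier under the linear-scaling rule of thumb:
# halving the summary interval roughly doubles the hardware requirement.

def relative_load(current_interval_min: float, target_interval_min: float) -> float:
    """Hardware requirement multiplier when shortening the summary interval."""
    return current_interval_min / target_interval_min

mult_5min = relative_load(15, 5)   # 15 min -> 5 min: about 3x the hardware
mult_1min = relative_load(15, 1)   # 15 min -> 1 min: about 15x the hardware
```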

                             

Questions:
                              * Is the goal to have 1 minute data imported into TSCO for all 6000 TSCO Agents or just a subset of the 6000 TSCO Agents?
                              * Is the goal definitely 1 minute intervals or would 5 minute intervals imported into TSCO be sufficient?

                             

                            Here is the KA that describes the Manager run change required for 1 minute intervals:
                              000030157: How can the minimum VIS interval be reduced to 1 minute in Manager on Linux since by default Manager only allows a minimum VIS interval of 2 minutes? (https://bmcsites.force.com/casemgmt/sc_KnowledgeArticle?sfdcid=000030157)

                             

                            Q: Can near-real-time data from webInvestigate be exported to a 3rd party application for reporting?

webInvestigate supports the creation of near-real-time charts and drill-downs (data over the last few days up to the last few minutes), but since the data is sourced from the TSCO Agent itself, there isn't data 'export' functionality associated with it.  The data can be accessed via the webInvestigate UI, but there isn't a way to extract that near-real-time data for all your agents holistically and make it available to another tool.  There are some limited command-line APIs for extracting data from a single agent (similar to the old printUDR functionality), but that would be for a limited-scope data extraction (and it would require custom scripting to be usable).

                            Q: Does webInvestigate support the presentation of 10 second sample data?

Yes, it can show 10-second samples using Investigate History data (so you can 'scroll back in time'); with Investigate History disabled, it can show 10-second data points from the time you bring up the chart and add new data points as they become available.  The chart and drill-down functionality is very similar to the existing TSCO 10.7 and earlier Investigate functionality provided via the TSCO Gateway Server on Windows.  Where it differs is that it doesn't use "policy" files to define the computers; it uses something called an "Investigate Study".

                             

                            One can find information on webInvestigate here:

                              https://docs.bmc.com/docs/display/btco11/Changes+to+the+Investigate+tool

                             

For new environments that don't yet have access to Investigate through TSPS, it is a tool that can be implemented in a lab environment, separately from your existing TSCO production environment.  It requires a bit of hardware (a lab server with RSSO and TSPS, a separate TSCO AS, a separate TSCO database server [maybe PostgreSQL instead of Oracle], and a separate TSCO Gateway Server [maybe running on the TSCO AS, although that configuration isn't supported in production]).  But it could then access your existing TSCO Agents.  That also gives you good exposure to the TSCO 11.3 product to be prepared for your PROD upgrade.

                            Q: If we wanted to import near-real-time data from the TSCO Gateway Server into the TSCO database is that something that BMC Professional Services could work with us to implement?

                            Based upon input from the TSCO Product Architects the TSCO product is definitely not designed to support near-real-time import of data into the TSCO database from the TSCO Gateway Server ETL and any attempt to do something like that would be outside of the supported configuration of the product.  The BMC tool that is likely more consistent with that use-case is the TrueSight Operations Manager tool where the KMs stream data back to the TSOM database throughout the day (since the purpose of that tool is near-real-time monitoring and alerting -- very similar to TSCO Investigate functionality).

                             

A Manager run configuration attempting to import near-real-time data into the TSCO database isn't something that could be implemented via a Professional Services engagement, because the current data flow design isn't compatible with it.  Having multiple 6- or 12-hour Manager runs per day, although not entirely compatible with some product functionality, is within the realm of what could be considered a supported use case for the product.  Any attempt to stream TSCO Agent data into the TSCO database at a higher frequency than that would be an unsupported configuration of the product.

                            • 11. Re: Requirement for higher granularity for exceptional servers
                              Stefan Antelmann

The approach Dima suggested works, but it has some side effects: it affects every Manager run and is lost when performing product updates. However, there is also a way to control this using the ETL.

                               

When you open a VIS File Parser ETL in Advanced Mode, there is a Detail option under Load Configuration.

                               

STANDARD – default data import; only samples >= 5 min

RAW ALSO – populate samples < 5 min into the RAW tables AND aggregate the data to 5 minutes, hours, and so on

RAW ONLY – populate samples < 5 min ONLY into the RAW tables, without performing data aggregation

                               

This helps to control the data load at the ETL level, which is what we recommend.
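The three Detail options above can be pictured as a routing rule for incoming samples. This is a mental model of the documented behavior, not actual ETL code; the storage-target names are shorthand for the RAW tables and the aggregated DETAIL/summary tables:

```python
# Illustrative routing of samples by the VIS File Parser ETL "Detail" option.
# "RAW" stands for the RAW tables, "DETAIL" for the aggregated summary data.

def route_sample(detail_option: str, interval_min: float) -> set[str]:
    """Return which storage targets a sample of the given interval reaches."""
    if detail_option == "STANDARD":
        # default import: only >= 5-minute data is loaded
        return {"DETAIL"} if interval_min >= 5 else set()
    if detail_option == "RAW ALSO":
        # keep fine-grained samples AND aggregate them to 5 min, hours, ...
        return {"RAW", "DETAIL"} if interval_min < 5 else {"DETAIL"}
    if detail_option == "RAW ONLY":
        # fine-grained samples go to the RAW tables only, no aggregation
        return {"RAW"} if interval_min < 5 else {"DETAIL"}
    raise ValueError(f"unknown Detail option: {detail_option}")
```

For example, a 1-minute sample under RAW ONLY lands only in the RAW tables, so it never inflates the aggregated summary data.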

                               


                              1 of 1 people found this helpful