The requirements for this type of reporting will be very high in terms of UDR data size, processing load on the Gateway Server, and storage in the TSCO database. You may want to leverage the webInvestigate tool for viewing high-resolution data.
You can use:
1. UDR: collect the data at a 1-minute interval (which will increase the UDR data size), but process the data in 5-minute, 15-minute, or 1-hour increments.
(UDR data will be bigger, but there is no impact on processing or the TSCO database.)
2. Agent History (real time and near real time), with resolutions as low as 10 seconds.
(No impact on UDR data, processing, or the TSCO database.)
Command line support:
The agent can go as low as a 1-second interval.
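To put these collection resolutions in perspective, here is a quick back-of-the-envelope calculation of how sample counts per day scale with the interval. This is an illustrative sketch only; actual UDR data volume also depends on how many metric groups and metrics are collected per machine.

```python
# Rough sample-count arithmetic for one machine over one day.
# Figures are illustrative; real UDR size depends on the metric
# groups collected, not just the sampling interval.

SECONDS_PER_DAY = 24 * 60 * 60  # 86400

intervals = {
    "1 second (agent minimum)": 1,
    "10 seconds (Investigate History)": 10,
    "1 minute (proposed DETAIL)": 60,
    "5 minutes (default DETAIL)": 300,
}

for label, seconds in intervals.items():
    samples = SECONDS_PER_DAY // seconds
    print(f"{label:35s}: {samples:6d} samples/day")

# Collecting at 1 minute instead of 5 minutes means 5x as many
# samples: 1440 vs 288 per day, per machine, per metric group.
```

So a 1-minute DETAIL would multiply the per-machine sample volume by five versus the 5-minute default, which is why limiting it to a subset of servers matters.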
However, a wider question: can DETAIL be amended to a lower level than 5 minutes overall? If so, a separate TSCO instance could be used. I have a similar requirement.
I would assume that the resolution in the VIS file is used by the VIS ETL.
It was a TSCO-in-general question. I meant: in general, can DETAIL be amended to 1 minute (not just for VIS)?
The resolution is dependent on the data source used by the ETL.
The limits are the resolution provided by the data source, as well as the utilization profile of extracting the data from that source (which may be linear or non-linear in terms of resolution).
Either the data may not be available at the lowest resolution, or the extraction would impact the product being monitored.
Hi Dima, rewording the question:
assuming I can get 1-minute data, can I amend the TSCO configuration to use 1 minute as DETAIL rather than the default of 5 minutes? I thought this was possible in early releases.
It would depend on the specific data source. The danger is that the measurement will impact what is being measured; most likely a restriction was put in for this reason.
Thank you Dima for your answer, but not sure why the status is “Answered”. WebInvestigate doesn’t meet our requirement to send reports to business owners on a scheduled basis.
The requirement is to have the data at a higher granularity available in the data warehouse only for a small group of servers, so we can report on it on a scheduled basis and also in a TSPS view. Additionally, we need to find the best candidates for consolidation in a Virtual Planner study. Systems with extreme peaks occurring at the same time shall not be mixed onto the same virtual server.
WebInvestigate does not meet these requirements.
The ETL documentation states that DETAIL granularity is configurable (5 minutes by default). Given that we want to limit the number of systems and metric groups to keep the amount of data low, how can we report RAW or DETAIL data at 1-minute granularity?
Raw may be a consideration. It is the "by default 5 minutes" part I was also asking about.
There may be an option to have one TSCO instance at a lower granularity (with the default amended to, say, 1 minute) for a subset of servers.
The shortest summarization period supported for data import into TSCO is a 5-minute interval.
You can configure Manager runs to process data at a 1-minute interval, but the data will not be visible in reports (it is only stored in the internal BCO table SYS_DATA_DETAIL).
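Since 1-minute processing lands only in the internal SYS_DATA_DETAIL table, it is worth estimating the extra row volume for the small subset of servers mentioned earlier. The sketch below is purely illustrative: the server count, metric-group count, and retention period are assumptions, not product defaults.

```python
# Hypothetical sizing sketch: extra detail rows when a small subset
# of servers is processed at a 1-minute interval instead of 5 minutes.
# All input figures below are assumptions for illustration.

servers = 50          # small subset of the estate (assumption)
metric_groups = 10    # metric groups imported per server (assumption)
retention_days = 30   # detail retention period (assumption)

samples_per_day_1min = 24 * 60        # 1440 samples/day
samples_per_day_5min = 24 * 60 // 5   # 288 samples/day

rows_1min = servers * metric_groups * samples_per_day_1min * retention_days
rows_5min = servers * metric_groups * samples_per_day_5min * retention_days

print(f"rows at 1-minute detail: {rows_1min:,}")  # 21,600,000
print(f"rows at 5-minute detail: {rows_5min:,}")  # 4,320,000
```

Even for a modest subset, 1-minute detail multiplies table growth by five, so scoping and retention need to be planned before enabling it.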
Full details of the limitations.
For TSCO, what are the recommendations regarding high granularity VIS parser ETL data being imported into the database?
Article Number: 000160885
Language: English
Article Type: Product/Service Description
Product Version: 11.3.01
Applies to: TrueSight Capacity Optimization 11.3, 11.0, 10.7, 10.5
Detail: Officially, the smallest interval supported by a TSCO Gateway Manager run is 2 minutes. There is a workaround to allow VIS files to be created with a 1-minute interval, but it isn't officially supported (although it is a configuration we've seen used successfully in the field, typically for a small subset of the total machines in an environment).
* Is the goal to have 1 minute data imported into TSCO for all 6000 TSCO Agents or just a subset of the 6000 TSCO Agents?
* Is the goal definitely 1 minute intervals or would 5 minute intervals imported into TSCO be sufficient?
000030157: How can the minimum VIS interval be reduced to 1 minute in Manager on Linux since by default Manager only allows a minimum VIS interval of 2 minutes? (https://bmcsites.force.com/casemgmt/sc_KnowledgeArticle?sfdcid=000030157)
Q: Can near-real-time data from webInvestigate be exported to a 3rd party application for reporting?
webInvestigate supports the creation of near-real-time charts and drilldowns (covering data from the last few days up to the last few minutes), but since the data is sourced from the TSCO Agent itself, there isn't data export functionality associated with it. The data can be accessed via the webInvestigate UI, but there isn't a way to extract that near-real-time data for all your agents holistically and make it available to another tool. There are some limited command-line APIs for extracting data from a single agent (similar to the old printUDR functionality), but that would be a limited-scope data extraction, and it would require custom scripting to be usable.
Q: Does webInvestigate support the presentation of 10 second sample data?
Yes, it can show 10-second samples using Investigate History data (so you can 'scroll back in time'), or, with Investigate History disabled, it can show 10-second data points from the time you bring up the chart, adding new data points as they become available. In terms of charts and drilldowns, it is very similar to the existing TSCO 10.7 and earlier Investigate functionality provided via the TSCO Gateway Server on Windows. Where it differs is that it doesn't use "policy" files to define the computers; it uses something called an "Investigate Study".
One can find information on webInvestigate here:
For new environments that don't yet have access to Investigate through TSPS, it is a tool that can be implemented separately from your existing TSCO production environment, in a lab environment. It requires a bit of hardware: a lab server with RSSO and TSPS, a separate TSCO Application Server, a separate TSCO database server (perhaps Postgres instead of Oracle), and a separate TSCO Gateway Server (perhaps running on the TSCO Application Server, although that configuration isn't supported in production). It could then access your existing TSCO Agents. That also gives you good exposure to the TSCO 11.3 product in preparation for your PROD upgrade.
Q: If we wanted to import near-real-time data from the TSCO Gateway Server into the TSCO database is that something that BMC Professional Services could work with us to implement?
Based upon input from the TSCO Product Architects, the TSCO product is definitely not designed to support near-real-time import of data into the TSCO database from the TSCO Gateway Server ETL, and any attempt to do something like that would be outside the supported configuration of the product. The BMC tool that is more consistent with that use case is TrueSight Operations Manager, where the KMs stream data back to the TSOM database throughout the day (since the purpose of that tool is near-real-time monitoring and alerting -- very similar to TSCO Investigate functionality).
A Manager run configuration that attempts to import near-real-time data into the TSCO database isn't something that could be implemented via a Professional Services engagement, because the current data flow design isn't compatible with it. Having multiple 6- or 12-hour Manager runs per day, although not entirely compatible with some product functionality, is within the realm of what could be considered a supported use case for the product. Any attempt to stream TSCO Agent data into the TSCO database at a higher frequency than that would be an unsupported configuration of the product.
The approach Dima suggested works, but it has some side effects: it affects every Manager run, and the change is lost when doing product updates. However, there is a way to control this per ETL as well.
When you open a Vis File Parser ETL in Advanced Mode, under Load Configuration there is a Detail option:
* STANDARD – default data import (>= 5 min)
* RAW ALSO – populate samples < 5 min into RAW tables AND aggregate the data to 5 min, hours, and so on
* RAW ONLY – populate samples < 5 min ONLY into RAW tables, without performing data aggregation
This helps control the data load per ETL, which is what we recommend.
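The three Detail options above can be summarized as a small decision table. The function below is an illustrative model of the documented behavior, not a product API; the names and return structure are my own.

```python
def detail_option_behavior(option: str) -> dict:
    """Model where sub-5-minute samples go for each Load Configuration
    Detail option (illustrative sketch, not product code)."""
    table = {
        # option:   sub-5-min samples kept in RAW tables? /
        #           data aggregated to 5 min, hours, etc.?
        "STANDARD": {"raw_tables": False, "aggregated": True},
        "RAW ALSO": {"raw_tables": True,  "aggregated": True},
        "RAW ONLY": {"raw_tables": True,  "aggregated": False},
    }
    return table[option]

# STANDARD imports only >= 5-minute data, so nothing lands in RAW tables.
print(detail_option_behavior("RAW ALSO"))
# → {'raw_tables': True, 'aggregated': True}
```

In other words, RAW ALSO gives you both the sub-5-minute samples and the normal aggregated reporting data, while RAW ONLY skips aggregation entirely for that ETL.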