
Hyperformix Mercury Capacity Planning (MCP)

Here are my notes, taken while investigating the internals of this performance engineering product.

 

Topics this page:

  • Overview Modeling
  • Annoyances
  • Installation
  • Model Components
  • Licensing
  • Product Modules
  • Sample Projects
  • Data Flows
  • Your comments???


    Overview: What is the big deal?

      Hyperformix software's IPS (Integrated Performance Suite) gives capacity planners the foresight to reduce "firefighting" through disciplined capacity management.

      The product enables more accurate quantification of the "trade-offs" between performance and cost. In this sample XY scatter plot, alternative "A" was found to cost more than the "current" configuration, yet perform worse!

      The product is worth its $135,000 price because it can identify alternatives like "B" (which both costs less and does more) through "predictive simulation": examining various "what-if" conceptual configurations more quickly and less expensively than installing actual hardware.
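The "B beats current on both axes" idea is Pareto dominance. A minimal sketch, with invented cost and response-time numbers (not product data):

```python
# Hypothetical sketch (numbers invented): picking out alternatives that
# dominate the current configuration on both cost and response time.
alternatives = {
    "current": (100_000, 5.0),  # (annual cost in $, mean response time in s)
    "A":       (120_000, 6.5),  # costs more, performs worse
    "B":       ( 80_000, 3.2),  # costs less, performs better
}

def dominates(x, y):
    """True if x is no worse than y on both axes and better on at least one."""
    return x[0] <= y[0] and x[1] <= y[1] and x != y

current = alternatives["current"]
winners = [name for name, cfg in alternatives.items()
           if name != "current" and dominates(cfg, current)]
```

Here `winners` contains only "B": it is the only alternative that improves on the current configuration without being worse anywhere.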

      "Systems are now too complex for analysts to predict without simulation modeling."
      "Predictive modeling" examines (at the same time) limitations of:

      • response time ("latency") to process business functions;
      • throughput of transactions per second processed by each subsystem tier; and
      • utilization of a server's total CPU capacity and other resources.
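For intuition, these three quantities are linked analytically in the simplest case, a single-server M/M/1 queue. This sketch is illustrative only, not Hyperformix code:

```python
# Analytic sketch (not Hyperformix code): for a single-server M/M/1 queue,
# utilization, throughput, and mean response time follow standard formulas.
def mm1(arrival_rate, service_rate):
    """Return (utilization, throughput, mean response time in seconds)."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable: offered load exceeds capacity")
    utilization = arrival_rate / service_rate
    throughput = arrival_rate                       # all offered work completes
    response = 1.0 / (service_rate - arrival_rate)  # queueing delay + service
    return utilization, throughput, response

util, tput, resp = mm1(arrival_rate=8.0, service_rate=10.0)
# util = 0.8, tput = 8.0 tx/s, resp = 0.5 s
```

Note how response time explodes as utilization approaches 100% — exactly the non-linearity that makes simulation preferable to static spreadsheet arithmetic for real multi-tier systems.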



    Types of Modeling

      The types of performance models:

      • Stochastic models represent components that are subject to chance occurrence.

      • Discrete Event models represent systems that change state instantaneously at the times that events occur.

      • Dynamic models represent systems that change over time.
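A toy illustration of all three types at once, assuming an invented single-server system with random arrivals and service times (parameters are made up):

```python
import random

# Toy sketch combining the three model types above: arrivals and service
# times are stochastic, state changes only at discrete event times, and
# the system evolves over simulated time. Parameters are invented.
def simulate(n_jobs=2000, arrival_rate=8.0, service_rate=10.0, seed=7):
    rng = random.Random(seed)
    clock, server_free_at, total_response = 0.0, 0.0, 0.0
    for _ in range(n_jobs):
        clock += rng.expovariate(arrival_rate)   # next arrival event
        start = max(clock, server_free_at)       # queue if server is busy
        server_free_at = start + rng.expovariate(service_rate)
        total_response += server_free_at - clock # wait + service
    return total_response / n_jobs               # mean response time

mean_response = simulate()   # queueing theory predicts roughly 0.5 s here
```

The simulated mean should land near the analytic M/M/1 prediction for the same rates, which is the kind of "validation against known results" the methodology relies on.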

     

      Mercury sold a version of the Hyperformix IPS (Integrated Performance Suite) as "Mercury Capacity Manager (MCP) — Powered by Hyperformix" until June 2006.



      The Methodology: Dynamic Modeling

      Hyperformix models are not merely accounting formulas in a static spreadsheet.

      Hyperformix modeling is "dynamic" in that 1) predictions are obtained by "running" 2) a modeling program that predicts what happens, moment-by-moment, over simulated time frames in response to a workload (the arrival rate of requests).

      Impact on different hardware within a "virtual data center" can be simulated because Hyperformix has amassed 3) a proprietary (separately licensed) Hardware Model Library containing the results of SPECint tests and other characteristics of over 2,000 hardware components (as of March 31, 2006).

      With the Hyperformix Performance Engineering methodology, simulation run results are generated in the same format as data parsed from actual runs, so that the 4) "Visualizer" utility can "validate" simulated results against real measurements in a bar chart report such as this.

      Measurements from real runs are obtained by parsing 5) "raw" logs captured from running actual systems and summarized into 6) an MS-Excel spreadsheet called a "profile". The 7) proprietary Excel add-in code is called the Hyperformix "Application Model Generator (AMG)".

      The 8) "Profiler" utility parses a wide variety of logs, including 9) result files from LoadRunner runs.

      This wide variety of logs can be accommodated because of the 10) summarization and filtering performed before creating the 11) data used as the basis of simulation runs.

      Fine-grained "behaviors" of an application are coded as Java-like 12) Application Definition Notation (ADN) included in Modeler runtime code.
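The summarization step (raw logs collapsed into a "profile") might look roughly like this; the record layout, business-function names, and numbers are all invented for illustration:

```python
import statistics

# Hypothetical sketch of the summarization the text describes: raw
# per-request records collapsed into one "profile" row per business
# function. Field layout and values are invented.
raw_records = [
    ("Login",  0.41), ("Login",  0.39), ("Search", 1.20),
    ("Search", 0.95), ("Login",  0.44), ("Search", 1.05),
]

by_function = {}
for business_function, response_time in raw_records:
    by_function.setdefault(business_function, []).append(response_time)

profile = {bf: {"count": len(times),
                "mean_response": round(statistics.mean(times), 3)}
           for bf, times in by_function.items()}
```

The real AMG profile holds far more (CPU seconds, bytes on the wire, visits per tier), but the shape is the same: many raw rows in, one summary row per business function out.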

     
    [Data flow diagram: the application under test and LoadRunner produce raw logs and LoadRunner result files; the Profiler parses these into AMG summary files; AMG builds the profile spreadsheet and ADN files; the Modeler (drawing on the Hardware Model Library) runs simulation model files; the Visualizer turns results into reports and charts]


    Annoyances

    1. There are a lot of complicated "moving parts" in a data center. So I drew this (highly technical) data flow diagram to make sense of which file types are passed between modules.
    2. The product comes with a lot of sample files, but forces you to know where they all are. In fact, if you type the wrong path in Visualizer, you'll get a Java error. So I provide complete file paths so you don't have to drill down four or five levels -- assuming that you installed to the default locations.
    3. The number of sample reports is overwhelming, and screenshots of them are not included in the help files. So I've made some of them available here and created this reports matrix to organize all the sample reports provided for the Visualizer utility.


    Installation

      By the time you read this, the version has likely moved on from "3.3", created May 2005.

        Get announcements of new releases: email me.

        Hyperformix offers a class on "Transition to 3.4".

      Before installing MCP, run the standalone Mercury Installation Integrity Checker (IIC.exe in CD folder Bin\Suite\IIC). This outputs (in C:\Documents and Settings\%user%\My Documents\installlog.html) a log of whether a computer meets requirements for running MCP. (1 GB physical memory, OS Version Win XP Pro SP1, Admin permissions for COM access, Impersonation level, Registry update, Excel 2000/2002 - not 2003!, FlexLM Port 27xxx Availability, Switchboard port 5353 ).

      The LrrTransSetup program by default does not install to the "Mercury Interactive" folder within Windows' "Program Files", but to "Mercury".

      By default, the setup program creates a "Mercury" folder in "Program Files", which is what LoadRunner 8.1 also does. This table lists some of its sub-folders used to store the 150,428 bytes written during installation, listed below.

     

      As of April 17, 2006, the Knowledge Base at Mercury Support had no articles associated with this product.

      Hyperformix support line:
      800.336-2107
      I've found Tom Hyde and Mark Spalding both knowledgeable and quick.

     
    AMG

      The sequence of business functions is defined in the AMG Business Process Flow worksheet.

      Modeler ADN (Application Definition Notation) code defines classes and interfaces. ADN is the object-oriented programming language built into Modeler. ADN code describes software processes and controls simulation runs in Modeler simulation models. AMG profiles are used to generate an ADN include file that contains transaction classes defining workloads and business functions.

      Avoid putting your own files within the installation folder. Inevitably, when versions change, you would need to figure out which files need to be saved or end up spending frustrating hours debugging why new versions don't work.

      In "Third Party Applications", you can install "J2SE" and "Adobe Reader".

      The setup program installs these files in C:\WINDOWS\system32\

      • pdh.dll - 04/27/05 10:30 PM - 249KB (254,976 bytes) - 32-bit - v.5.1.2600.1106 (xpsp1.020828-1920) replaces the previous version in folder C:\WINDOWS\system32\dllcache\ as well.
      • MSWINSCK.OCX - 11/15/04 06:06 PM - 107KB (109,248 bytes) - 32-bit - v.6.00.8988
      • VB40032.DLL - 04/27/05 10:30 PM - 705KB (722,192 bytes) - 32-bit - v.4.00.2924
      • olch2d8.dll - 11/15/04 06:06 PM - 573KB (587,072 bytes) - 32-bit - v.8.0.20032.14
      • olch2x8.ocx - 11/15/04 06:06 PM - 1.98MB (2,078,032 bytes) - 32-bit - v.8.0.20032.14
      • tabctl32.ocx - 11/15/04 06:06 PM - 204KB (209,192 bytes) - 32-bit - v.6.00.8169

      The setup program also installs an empty "flexlm" folder in the C:\ root.

      File Extensions

      The setup program installs these file name extensions:

        .pta - Registered ProfilerWorkflowAnalyzer.Document - No Icon C:\PROGRA~1\Mercury\PROFIL~1.6\wfa.exe /dde
        .rpt - Registered Modeler.ReportDocument - No Icon C:\PROGRA~1\Mercury\Modeler4.6\Program\runexcel.exe "C:\Program Files\Mercury\Modeler4.6\FormatReport.xls","%1"
        .tra Modeler Workflow Analyzer simulation trace WorkflowAnalyzer.Document trace icon C:\PROGRA~1\Mercury\Modeler4.6\program\workflow.exe /dde
        .yyyy-mm-dd-21-23-27-0105-00 Not Registered - No Filetype - No Icon No Command

      No file extensions are created for:

        Modeler simulation model (.csm) files.

        Visualizer project (.vop) files.

        Navigator project (.npj) files.

        scenario (.sm) files.

        Old scenario (.sem) files.



    Licensing and Invocation

      After you install, invoke the "HostID" from Start > Programs > Mercury > License Management > HostID for a dialog like this.

      Copy the Host ID and Machine Name into an email to support.Mercury.com to obtain a license key. If you switch NIC cards on your laptop or swap hard drives, mention it because the Host ID is generated from the machine's hard disk serial # and MAC address.

      Separately licensed are the Hyperformix Application Modules for Oracle 7, 8, or 9; PeopleSoft 8; SAP; and WebSphere Application Server.

      The *.lic file should be copied into folder C:\Program Files\Mercury\License.

      Once you are "legal", you can invoke executable modules without getting a pop-up such as this one for the Navigator.

      As I understand it, Mercury's MCP product does not include the Hyperformix Data Manager product — a central repository to store historical heterogeneous data for reuse.

      Mercury's MCP product also does not include the Hyperformix Capacity Manager product that predicts the impact of growth in workload, modeled at a higher level than the MCP product. Unlike its competitor TeamQuest, this slices and dices workload by business unit.

      Metron's Athene is another competitor.

     

      Hyperformix, Inc. is privately owned, with
      $14.5 million in annual sales.
      800.759-6333
      4301 Westbank Drive, Bldg. A,
      Austin, Texas 78746 (West of downtown, N off Hwy 1 on S. Capital of Texas Hwy 360)
      14.3 miles from the Austin-Bergstrom Intl Airport (AUS)

      Leading its 85 people are:

      Doug Neuse, Chief Science & Research Officer
      Jim Browne, expert in "performability"
      Derek Weeks, VP Product Marketing
      Jody Mattingly, Sales
      512-425-5156
      Shauna Osborne, SVP Client Services
      Keith Smith develops training.



    Sample Project Data

      MCP is installed with several sample models in C:\Program Files\Mercury\Modeler4.6\Samples
      eComm3, IntelligentNetworkingModels, LoadBalance, SimpleDemo, WorkloadModel

      eComm3 demonstrates a multi-tier web eCommerce system created by combining the hardware infrastructure from the Modeler "eComm3_Topology.csm" sample file and the application and workload specification from the Application Model Generator "eComm3 Profile.xls" sample file. There are two input workloads of 50 local users and 450 remote users. The remote user transactions arrive from the Internet via a T1 connection and the local user transactions arrive directly via a local Ethernet.

        In Visualizer, copy a line below, then click File, Open, and paste it in the "File name:" field, and click "Open":
        C:\Program Files\Mercury\Visualizer3.0\Samples\eComm3\eComm3.vop
        or
        C:\Program Files\Mercury\Visualizer3.0\Samples\eComm3 with Performance Objectives\eComm3.vop

        These show mean utilization at 100% for the Network component.

      IntelligentNetworkingModels contains four models:

      NetworkTrafficBalancingI - Routing Path Distribution illustrates the impact of using network traffic balancing when routing. The impact is demonstrated by running the same model under several different scenarios:

      • Scenario 1: Running the model with all network components in their default mode.
      • Scenario 2: Running the model with a switch configured to be an intelligent networking device that is balancing traffic across multiple paths using a round-robin balancing algorithm.
      • Scenario 3: Running the model with a switch configured to be an intelligent networking device that is balancing traffic across multiple paths using a uniform distribution.
      • Scenario 4: Running the model with a switch configured to be an intelligent networking device that is balancing traffic across multiple paths using a balancing algorithm that distributes traffic according to the bandwidth of the switch’s outgoing link.

      NetworkTrafficBalancingII Port Distribution illustrates the impact of using network traffic balancing when routing. The impact is demonstrated by running these scenarios:

      • Scenario 1: Running the model with all network components in their default mode.
      • Scenario 2: Running the model with a switch configured to be an intelligent networking device that is balancing request traffic across multiple ports using a round-robin balancing algorithm.
      • Scenario 3: Running the model with a switch configured to be an intelligent networking device that is balancing reply traffic across multiple ports using a uniform distribution.

      NetworkTrafficBalancingII Priority-basedRouting illustrates the impact of using priority-based routing. The impact is demonstrated by running the same model under two different scenarios:

      • Scenario 1: Running the model with all network components in their default mode.
      • Scenario 2: Running the model with a switch configured to be an intelligent networking device.

      UserDefinedRoutingTables-LeastHops illustrates how to define and use custom routing tables. The default routing table uses a bandwidth-based criterion (much like the OSPF protocol) when calculating routing paths. In this example, a least-hops criterion is defined, as in the Routing Information Protocol. The impact of using the custom routing tables over the default routing table is demonstrated by running the same model under two different scenarios:

      • Scenario 1: Running the model with all network components in their default mode.
      • Scenario 2: Running the model with a router configured to be an intelligent networking device that has custom configuration and custom routing logic defined.
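The two routing criteria (least hops vs. bandwidth-derived link cost) can be sketched with a toy Dijkstra; the topology and bandwidths below are invented for illustration:

```python
import heapq

# Sketch of the two routing criteria: fewest hops (RIP-style) vs.
# bandwidth-derived link cost (OSPF-style). Topology is invented.
links = {                       # node -> {neighbor: bandwidth in Mbps}
    "A": {"B": 10, "C": 100},
    "B": {"D": 10},
    "C": {"E": 100},
    "E": {"D": 100},
    "D": {},
}

def best_path(src, dst, link_cost):
    """Dijkstra over `links`, with a pluggable per-link cost function."""
    heap, visited = [(0.0, src, [src])], set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, bandwidth in links[node].items():
            heapq.heappush(heap, (cost + link_cost(bandwidth),
                                  neighbor, path + [neighbor]))

by_hops = best_path("A", "D", link_cost=lambda bw: 1)            # fewest hops
by_bandwidth = best_path("A", "D", link_cost=lambda bw: 100 / bw)  # fast links
```

Here the least-hops criterion picks the short but slow path through B, while the bandwidth criterion prefers the longer path over the fast 100 Mbps links — exactly the kind of divergence these sample scenarios demonstrate.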

      LoadBalance illustrates the use of load-balancing techniques in an environment of several identical client workstations connected by an Ethernet to a pair of servers performing a workload of reading and writing a message. Semaphores are used to synchronize threads in the server process. It demonstrates:

      • run-time query of a built-in statistic
      • user-defined discrete statistic
      • user-defined continuous statistic
      • dynamic display feature

      SimpleDemo demonstrates a model of a customer service center, designed to illustrate the following techniques:

      • connecting clients and servers through a network.
      • specifying the set of services offered by a server.
      • invoking a service from a client.
      • specifying multiple processes on a single computer.
      • specifying multiple identical computers.
      • simulating computer resource usage via the Execute statement.
      • decomposing a behavior into a sequence of statements.
      • parameterizing behaviors.
      • using Application Definition Notation to specify behaviors that involve decisions or loops.
      • using probability distributions to vary workload.
      • using various trace mechanisms to observe the behavior of the model.

      WorkloadModel illustrates modeling applications at a high level. It assumes that detailed modeling of the databases is not required. The environment for this model consists of a token ring with 225 workstations, a mid-range data center which provides database services, and a mainframe which handles remote procedure calls. The mid-range data center and the mainframe are on separate Ethernets. The Token Ring and Ethernets are all connected by routers to a wide-area network.

      C:\Program Files\Mercury\Visualizer3.0\Samples\Application Profile Report\Application Profile Report.vop



    Navigator Method Steps

      The Navigator GUI provides a central command center: it outlines "method steps" which are clickable.

      From within the Navigator, MCP users can start LoadRunner.

      Scaling analysis identifies the bottlenecks by scaling up the workload on a "benchmark" system as it is.

      This measurement provides an initial model for tradeoff analysis.

      Configuration tradeoff analysis uses an initial model from measurements to analyze the ability of different hardware and software configurations to handle various workloads.

      Within C:\Program Files\Mercury\Navigator2.3\MethodTemplates are these method templates:

      • Simple123.mtp - includes all operational steps from the Dashboard.
      • File_Viewer.mtp - each step views a specific type of file.
      • Mercury Capacity Planning Basic Integration.mtp - adds to the System Scalability methodology template, with analysis of T2 LoadRunner tests.
      • Mercury Capacity Planning Automated Integration.mtp - has more steps than MCP Basic to invoke LoadRunner scenarios.
      • Modeler_Quickstart.mtp - uses an existing model as input.
      • SystemScalability.mtp - uses as a starting point the full network and data collection of the System Scalability methodology.
      • SystemScalability_No_DataFile.mtp - uses an AMG profile as a starting point.
      • SystemScalability_No_Res.mtp - uses no resource data, just network data.

      There are MCP "Basic Integration" and MCP "Automated Integration" methods.

      The MCP Automated Integration.mtp method template has this sequence for the sample "FMStocks":

        1. Planning
        • Define the goals of the performance study.
        • Define application business functions, system hardware, and performance objectives.

        2. Data Collection and Analysis
        • Process "T1" Single Business Function Traces
        • Process "T2" Single Business Function load tests
        • Construct a complete Application Profile Report
        • Process "T3" Mixed Business Function load tests.

        3. Model Construction, Verification, and Validation
        • Create the common hardware topology that is used by all model building and validation in this phase.
        • Construct and verify T1 models
        • Construct and verify T2 models
        • Construct a complete Application Profile
        • Construct and verify T3 models

        4. Scalability Scenario Analysis
        • Define the baseline deployment hardware and application
        • Copy and execute scenarios
          • Experimentation by using Scenario Manager
          • Experimentation by explicitly modifying the model
          • Experimentation by explicitly modifying the AMG profile
     

      AMG

      • Define AMG Profile
      • Edit AMG Profile
      • Generate AMG Output
      • Import Transaction Summary into AMG

      Basic

      • Execute Commands
      • For Each
      • For Each File
      • Hardware Environment
      • List
      • Performance Objectives
      • Select Files
      • Text
      • Text List (Bulleted Style)

      LoadRunner

      • LoadRunner Calculate Load Scenario
      • LoadRunner Convert
      • LoadRunner Create Simple Scenario
      • LoadRunner Invoke Scenario

      Modeler

      • Edit Model
      • Execute Model
      • Hardware Environment into Simulation Model
      • Import XML into Model
      • Merge Model Results

      Profiler

      • Convert Raw to NSF
      • Convert Raw to RSF
      • Convert Transaction Report to RSF
      • Convert Web Log to Transaction Report
      • Create Load Test Summary
      • Create NSF Reports
      • Create RSF Reports
      • Create Resource Business Function Summary
      • Create Run Reports
      • Create Transaction Reports
      • Dice
      • Edit Address Map
      • Edit Host Names Map
      • Edit Http Map
      • Join RSF
      • Merge Business Function Summaries
      • Merge RSF
      • Merge Resource Data
      • Merge Transaction Summary
      • Organize Resource Data
      • Organize Runs
      • Pick a Network or Resource Report
      • Profiler Manual Session
      • Recap Network Summary
      • Recap Resource Summary
      • Recap Transaction Summary
      • Replace and Filter by Address
      • Resource Data by Business Function
      • Resource Time Window
      • Scrub Addresses

      Scenario Manager

      • Define Scenario
      • Execute Scenario
      • Increase Workload
      • Run As Is
      • Vary Topology
      • Vary Workload Using AMG
      • Vary Workload and Topology

     

    Data Flows

      This chart is under construction...
      [Chart components: Navigator, Visualizer, AMG, Modeler, Profiler, weblog files, RSF files, NSF files]



      In this chart, "raw" data is processed from the bottom, then combined in the middle, and finally graphically displayed by the visualizer at the top.

      First, various utilities reformat logs from their native format into a common format. Hyperformix has defined different common formats for resource and network data.

      Reformatting utilities are called Profiler utilities because the Profiler needs them to create its reports and diagrams.

      Some Profiler reports are imported into the profile spreadsheet using the Application Model Generator Excel add-in.

      At the top, the "Visualizer" creates reports and graphs as html and graphic files from statistics generated by simulation runs conducted by the "Modeler".

      Hyperformix uses the Performance Manager/Monitor/Agent for zooming within graphs in HP OpenView.
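The reformatting step described above (native logs normalized into a common format) can be sketched as follows; both "native" layouts and all field names here are invented, not the real RSF/NSF schemas:

```python
# Hypothetical sketch of the reformatting idea: two invented "native" log
# layouts normalized into one common record format, so downstream tools
# only need to understand the common layout. Field names are assumptions.
def parse_web_log(line):
    # e.g. "2006-04-17 12:00:01 GET /login 200 0.41"
    date, time, verb, url, status, seconds = line.split()
    return {"timestamp": f"{date}T{time}",
            "operation": f"{verb} {url}",
            "response_time": float(seconds)}

def parse_app_log(line):
    # e.g. "12:00:02|search|1.20"
    time, operation, seconds = line.split("|")
    return {"timestamp": f"2006-04-17T{time}",
            "operation": operation,
            "response_time": float(seconds)}

common = [parse_web_log("2006-04-17 12:00:01 GET /login 200 0.41"),
          parse_app_log("12:00:02|search|1.20")]
```

Once everything is in one shape, the same summarization and charting code works regardless of where the data came from — which is the point of the common-format design.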



    Hardware Components Typology

      The icons are shown twice their normal size here.

      Component: additional data (beyond the name, latency, and cost)

      • Client (Workload File): -
      • Computer (Processor Machines): Server, Mainframe: -
      • Processes and Applications: Server Process/Application, Simple Process: -
      • Data Communications:
        - Data Link: background load, bandwidth, configuration, overhead bytes, arbitration time, minimum payload bytes, maximum payload bytes, simultaneous streams
        - Ethernet, FDDI, Token Ring: background load, bandwidth, configuration
        - Point to Point: background load, bandwidth, configuration, overhead bytes, minimum payload bytes, maximum payload bytes, communications mode, directionality
        - Frame Relay: active status, committed information rate, committed information rate computation interval
        - Router, Switch, Switched Ethernet, Bridge, WAN: -
      • Interconnect [between networks]: -

      Hardware Model Library

      *.txx files are stored in a proprietary format because they come from the separately licensed Hyperformix Hardware Model Library of over 2,000 hardware components (as of March 31, 2006).



    Model Components

      The product's conceptual "model" (a "virtual data center") consists of "components":

        AMG Worksheet/Tab         Software Component   Workload Component
        Application                       x                    x
        Business Process Flow             -                    x
        Business Function Flow            -                    x
        Transaction Flow                  x                    -
        Transaction Properties            x                    -
        Workload                          -                    x
        Client                            -                    x
        Subsystem                         x                    -

        Connector: an Execution Environment of computer hardware and networking components connected into a hardware "topology":

          • Computers, which consist of:
            - CPU
            - Disk
            - I/O Controller
            - Volume
            - Tasks (Processes)
          • Network
          • Internetwork

        Behavior: software components:

        • clients
        • servers
        • subsystem tier (e.g., "WebServer", "AppServer", "DBServer", etc.)
        • threads

        Workload: workload components with an arrival pattern of requests made by the application's users, as measured by:

        • business functions (T1, T2, T3)
        • transactions
        • behavior



    Reports and Charts

    Reports Matrix

    The Visualizer

      From a Navigator method step, open a Visualizer (.vop) file:

      C:\Program Files\Mercury\Visualizer3.0\lib\DefaultChartProperties.vop

      C:\Program Files\Mercury\Visualizer3.0\Samples\Scenario with 18 Runs\Modeler Results 18 Runs.vop

      C:\Program Files\Mercury\Visualizer3.0\Samples\BF Resource Summary\BF Resource Summary.vop

      C:\Program Files\Mercury\Visualizer3.0\Samples\T2 Validation Report BF_ViewSummary\BF_ViewSummary T2 Validation Report.vop

      C:\Program Files\Mercury\Visualizer3.0\PluginTemplates\visualizer.vop

      Questions Answered

      The Navigator's System Scalability wizard asks the same questions I ask in my performance reports:

      • How many users can the system handle?
      • What is the system's maximum throughput?
      • What are the sensitive hardware components?
      • How many servers are required at each tier?

      C:\Program Files\Mercury\Visualizer3.0\Samples\Scenario with 18 Runs\Modeler Results 18 Runs.vop

        has multiple bars by run name to answer these questions:

      Questions and their charts:
      Are the performance objectives being met?

      Most SLAs specify something like "90% of response times are less than 5 seconds" rather than simply specifying averages. The "90th percentile" report provides this statistic.
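The statistic itself is straightforward; a nearest-rank sketch with invented sample times:

```python
import math

# Sketch of the statistic behind the report: a nearest-rank 90th
# percentile over a set of response times (sample values are invented).
def percentile(samples, p):
    """Nearest-rank percentile: smallest sample >= p% of all samples."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(p * len(ordered) / 100))
    return ordered[rank - 1]

response_times = [1.2, 0.8, 4.9, 2.1, 6.3, 1.7, 3.0, 2.4, 0.9, 4.2]
p90 = percentile(response_times, 90)   # the 9th of 10 sorted values
sla_met = p90 < 5.0                    # "90% of responses under 5 seconds"
```

Note that this sample passes the SLA even though one response took 6.3 s — which is exactly why SLAs are phrased in percentiles rather than maxima or averages.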

      Key performance indicators
      • Business function 90th percentile response time information chart
        Business function response time information
      • Business function throughput information chart
        Business function throughput information
      • Non-zero CPU utilization information chart
        CPU utilization information
      • Non-zero disk utilization information chart
        Disk utilization information
      • Non-zero network utilization information chart
        Network utilization information
      • Subsystem population information chart
        Subsystem population information
      Where are the potential bottlenecks?
      • Top 5 mean utilization chart [of components and subcomponents]
        Top 5 mean utilizations
      • Top 5 ending utilization chart
        Top 5 ending utilizations
      • Working tables for potential bottlenecks:
          All mean utilizations
      Where is response time spent? -
        Where is business function response time spent?
      • Business function response time breakdown chart
      • Business function maximum response time breakdown chart
        Where is transaction function response time spent?
      • Transaction response time breakdown [list by Subsystem]
      Application performance reports -
      Execution environment performance reports -

      Performance objectives (PO) are defined in the Visualizer project.

      Visualizer automatically attaches "Model Bottleneck Detection" reports to a component wherever it detects a bottleneck in the model. These reports use pre-defined "MBDException.rox" and "MBDException Multiple Run.rox" from the Templates/Hidden directory where Visualizer is installed.

      Scenario Manager scenario definitions point to Modeler simulation data.

      Switchboard.exe handles communication between Navigator or Modeler to and from the Visualizer.

      Although numbers are reported to four significant digits in the raw simulation output, Excel and Visualizer round to three digits.
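Rounding to significant digits (as opposed to decimal places) can be sketched as follows; the specific display behavior attributed to Excel and Visualizer is as stated in the text, not verified here:

```python
import math

# Sketch of the distinction: rounding to a count of significant digits
# rather than to a fixed number of decimal places.
def round_sig(x, digits):
    if x == 0:
        return 0.0
    magnitude = math.floor(math.log10(abs(x)))
    return round(x, digits - 1 - magnitude)

four_digits = round_sig(0.123456, 4)   # as in the raw simulation output
three_digits = round_sig(0.123456, 3)  # as the displays would show it
```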



      Execution Environment Performance report

      In the Visualizer Execution environment performance reports:

        Run description and model parameters
        Computer Summary
          CPU
          • CPU utilization chart
            CPU utilization
          • CPU utilization by subsystem chart
            CPU utilization breakdown by subsystem
          Disk
          • Non-zero ... utilization chart
            Non-zero ... utilization

          • All ... utilization
          I/O controller
          • Non-zero ... utilization chart
            Non-zero ... utilization

          • All ... utilization
          Volume
          • Volume throughput chart
            Volume throughput
          • Volume response time chart
            Volume response time
          Active Tasks
          • Computer task manager active tasks chart
            Computer task manager active tasks

        Network summary
          ... utilization summary
          ... total throughput summary
          ... data throughput summary
          ... response time summary

        Interconnect summary
          ... utilization summary
          ... data throughput summary
          ... response time summary

        Custom Statistics summary
          ... continuous statistics
          ... discrete statistics



      Application Profile report

      C:\Program Files\Mercury\Visualizer3.0\Samples\Application Profile Report\Application Profile Report.vop

        Response Time Profile:
        • "Response time" (barchart by business function)
        • "Response time subsystem details" (stacked bar chart by business function)

        Network profile:
          "Application turns (visits)" (by business function stacked by subsystem/tier)
          "Network bytes transmitted" (by business function stacked by "Request size wire" and "Reply size wire")
          "Network bytes transmitted subsystem details" (by business function stacked by subsystem for "Request size wire" and "Reply size wire")

        CPU profile:
          "CPU requirements" ("CPU seconds" by business function)

        I/O profile:
          "I/O bytes transferred"
          "I/O bytes transferred subsystem details"



    AMG Configuration

      The Application Model Generator (AMG) executable amg.exe invokes MS-Excel with the Application Model Generator add-in.

      Do not open profile spreadsheets by double-clicking on the .xls spreadsheet icon. Invoke AMG, then pull down the Application Model Generator menu to open a profile such as Program Files > Mercury > AMG4.0.1 > Samples > WebBrowserExample.xls

      Each AMG profile spreadsheet is created to store data in several tabs which are collectively called a "model".

      "Transaction Summary" worksheets are created by opening a Transaction Report created by the Profiler from resource data.

      Transaction Summary worksheets are used to update resource data.

      The "View Profile As" menu filters which tabs are visible.

      Profiles are also created with colored cells. Information is required in rose-colored cells. Information in blue cells can be changed even after files have been generated and updated in the Modeler. Sending changed data is called "model recalibration".

      Information in orange cells is populated as defaults from the "Size" and "Default Values for Read-Write Types" cells on the Oracle Transactions worksheet, a part of the Oracle Application module.

      Changing information in green-colored cells would require new model generation rather than regeneration.

      Default values are placed in orange-colored cells.

      From information entered in green cells within the profile spreadsheet, AMG generates XML import (.xml) files, ADN (Application Definition Notation) Java source include (.adn) files, and Modeler simulation model (.csm) files.

      Critique validates the model or parts of it.

      Generate runs Critique first.

      Optimized?


    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Workload Components

      There are three types of workloads: T1, T2, and T3.

     

     
    Go to Top of this page.
    Previous topic this page
    Next topic this page

      Set screen T1

      "T1" tests exercise a single business function. This is also called a "network trace" test because it is run so that Ethereal can trace the sequence of calls triggered after the business function is requested.

      A raw data file produced by a network monitoring program can be displayed as a transaction traffic diagram (_trans.pta), a form of UML sequence diagram, shown here by the Workflow Analyzer from

      C:\Program Files\Mercury\Profiler3.6\Samples\WFA\SampleWfaTrace.pta

      This graphically displays columns, each representing a tier in the system:

      "0.0.0.2" for the client/injector
      "0.0.0.1" for the web server
      etc.

      Each flow line color represents all transaction request and response threads invoked by a single transaction. Other sample sequence diagrams include:

      C:\Program Files\Mercury\Visualizer3.0\Samples\T1 Run Comparison BF_Login\BF_Login T1 Run Comparison.vop

      C:\Program Files\Mercury\Visualizer3.0\Samples\T1 Validation Report BF_Login\BF_Login T1 Validation Report.vop

     

    Sequence Diagram


    Go to Top of this page.
    Previous topic this page
    Next topic this page

      Set screen T2

      A "T2" test exercises a single business function in runs at varying (but steady) levels of load (from 20% to 80% CPU utilization, with no contention for CPU or disks).

      C:\Program Files\Mercury\Visualizer3.0\Samples\Business Function Profile\Business Function Profile.vop

        is used by the Visualizer to create reports for individual business functions.

      However, this test requires the automatic and repetitive submission of a single business function. The results of the T2 test provide estimates for resource usage of a single business function.

      The computed resources used by one of the runs needs to be chosen for use in a model.

      It takes as input several resource summary files, one for each load level. Navigator creates these files by dicing a single Resource Summary File into several files, one for each load level.
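The dicing step can be sketched as follows. This is an illustration only, not the actual Navigator/Profiler Dice utility; the "Elapsed Time" field name and the interval boundaries are hypothetical:

```python
# Sketch of "dicing": splitting one resource summary's rows into several
# groups, one per load-level time interval. NOT the real Dice utility;
# the "Elapsed Time" field and interval tuples are assumptions.
def dice(rows, intervals, time_field="Elapsed Time"):
    """Group rows into buckets keyed by (start, end) time intervals."""
    buckets = {iv: [] for iv in intervals}
    for row in rows:
        t = float(row[time_field])
        for start, end in intervals:
            if start <= t < end:
                buckets[(start, end)].append(row)
                break
    return buckets

rows = [{"Elapsed Time": "10", "CPU%": "22"},
        {"Elapsed Time": "70", "CPU%": "45"},
        {"Elapsed Time": "130", "CPU%": "78"}]
buckets = dice(rows, [(0, 60), (60, 120), (120, 180)])
print({iv: len(group) for iv, group in buckets.items()})
# → {(0, 60): 1, (60, 120): 1, (120, 180): 1}
```

Each bucket would then be written out as its own per-load-level summary file.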

      Results from one of these runs are combined with network trace data and hardware resource usage data (also called "service demand") to create the Business Function Summary.

      C:\Program Files\Mercury\Visualizer3.0\Samples\T2 Run Comparison BF_Login\BF_Login T2 Run Comparison .vop

        This T2 run comparison report compares system resource usage at different load levels.

        This report was created by using file T2 Validation Report.rox


    Go to Top of this page.
    Previous topic this page
    Next topic this page

      Set screen T3

      A "T3" test stresses a mix of business functions or business processes, driving the system's resource usage up to measurable levels (such as 25, 50, 75, and 100 vusers).

      C:\Program Files\Mercury\Visualizer3.0\Samples\T3 Test Validation Report\T3 Test Validation Report.vop

      C:\Program Files\Mercury\Visualizer3.0\Samples\T3 Test Run Comparison\T3 Test Run Comparison .vop


    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Application performance reports:

      Behavior summary
        Behavior mean response time chart
        Behavior response time

      Business function summary
        ... response time summary
          ... mean response time by subsystem chart
          ... 90th percentile response time chart
          ... mean response time chart
          ... response time
          ... maximum response time by subsystem
          ... response time details

        ... throughput summary
          ... throughput chart
          ... mix chart

      Client workload summary
        ... interarrival summary
        ... response time summary
        ... throughput summary
        ... executions in progress summary

      Subsystem summary
        ... population summary
        ... thread utilization summary
        ... response time summary
        ... throughput summary

     

       
    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Behavior Times

      Discrete statistics collected for a behavior:

      • interarrival time (the times between successive arrivals)
      • message reply time
      • message send time
      • message server time
      • receive response time
      • receive throughput (rate)
      • response time
      • throughput (rate)
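Two of these discrete statistics, interarrival time and throughput, can be illustrated with a short sketch (not product code; the arrival timestamps, in seconds since the start of a run, are arbitrary):

```python
# Illustration: deriving interarrival times and throughput from a list
# of arrival timestamps (seconds from the start of a run).
def interarrival_times(arrivals):
    """Times between successive arrivals."""
    return [b - a for a, b in zip(arrivals, arrivals[1:])]

def throughput(arrivals):
    """Arrival rate (per second) over the observed window."""
    span = arrivals[-1] - arrivals[0]
    return (len(arrivals) - 1) / span if span else 0.0

arrivals = [0.0, 1.5, 3.0, 4.0, 6.0]
print(interarrival_times(arrivals))  # → [1.5, 1.5, 1.0, 2.0]
print(throughput(arrivals))          # 4 arrivals over 6 s, about 0.667/s
```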

      A class is a named collection of data and methods specified as a new reference type. A specified collection of data can include constants, variables, or arrays. A specified method can include a behavior, function, or constructor.

     

     
    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Distribution Functions

    1. Constant
    2. Normal
    3. Lognormal
    4. Exponential
    5. Gamma
    6. Triangular
    7. Uniform
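These distribution types map directly onto Python's standard random module, which can be used to sketch sampling from each (the parameter values below are arbitrary assumptions, not product defaults):

```python
# One draw from each distribution type listed above, using Python's
# standard-library random module. Parameter values are arbitrary.
import random

random.seed(42)  # reproducible draws

samples = {
    "Constant":    2.0,  # degenerate case: always the same value
    "Normal":      random.normalvariate(mu=2.0, sigma=0.5),
    "Lognormal":   random.lognormvariate(mu=0.0, sigma=0.25),
    "Exponential": random.expovariate(1.0 / 2.0),  # mean 2.0
    "Gamma":       random.gammavariate(alpha=2.0, beta=1.0),
    "Triangular":  random.triangular(low=1.0, high=3.0, mode=2.0),
    "Uniform":     random.uniform(1.0, 3.0),
}
for name, value in samples.items():
    print(f"{name:12s} {value:.3f}")
```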

     

     
    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Profiler

      The Profiler provides a graphical interface for executing batch command files.

      • RunOrg: Navigator run organization that associates time intervals with business functions
      • TRSUMM: Profiler Transaction Summary; report is imported into AMG
      • Recap: Profiler Recap utility used to compare test runs
      • AddrMap: Profiler symbolic address map
      • Dice: breaks up a large network or resource standard input file by time intervals

      Set screen Profiler Summary Files

      The output from the Profiler is summary files in two standard formats: Resource Standard Format (RSF) and Network Standard Format (NSF), described below.


    Go to Top of this page.
    Previous topic this page
    Next topic this page

      Set screen "Raw" Log Files

      Raw text log files created by network monitoring programs need to be transformed and filtered into NSF files by the Profiler:

      • EtherealFull: converts Ethereal format files
      • EtherPeek: converts EtherPeek format files
      • NetmonFull: converts Netmon format files
      • Sniffer: converts Sniffer format files (.csv)
      • SnifferPro: converts SnifferPro data to NSF
      • SnifferProCap: converts raw binary data files generated by Network Associates Sniffer Pro to NSF
      • SnifferProFull: converts SnifferPro format files (.prn)
      • WeblogIIS4: converts IIS4.0/W3C format Web logs
      • WeblogNCSA: converts NCSA/Netscape format Web logs

      SAR files from UNIX machines should be saved in ... format.

      The MCP method templates in Navigator rely on Profiler to convert LoadRunner performance data results files into resource standard format (RSF) data analysis files.

      These RSF files, in comma-separated value (CSV) format, can then be used as building blocks for an AMG system profile used to construct a Modeler simulation model.
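Because RSF files are plain CSV, they can be inspected with ordinary tooling. A minimal sketch (the column names here are hypothetical, not the documented RSF layout):

```python
# Reading an RSF-like CSV with the standard csv module. The column
# names ("Time", "Object", "Counter", "Value") are hypothetical.
import csv
import io

rsf_text = """Time,Object,Counter,Value
0,Processor,CPU Utilization,35.2
60,Processor,CPU Utilization,41.7
"""

reader = csv.DictReader(io.StringIO(rsf_text))
values = [float(r["Value"]) for r in reader if r["Counter"] == "CPU Utilization"]
mean_util = round(sum(values) / len(values), 2)
print(mean_util)  # → 38.45
```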

      LoadRunner results data is stored in one of the following formats:
      • LRR, where performance data is spread across numerous text files
      • LRA, a database constructed from one binary (.mdb) file and one text file

      To integrate LoadRunner with MCP, Navigator and Profiler use the LoadRunner Results Translator program to create an .lrv file from LoadRunner performance data results. An .lrv file serves as a single-file bridge between LoadRunner performance data results and a Profiler RSF data analysis file.

      To integrate MCP with LoadRunner, you must install LoadRunner Results Translator software on the following computers:
      • The local computer where you install MCP
      • Any remote computer running LoadRunner to generate results files that you plan to use for a Navigator-driven performance analysis project

      A related tool is ISM's Perfman Capacity Planning and Performance Management (CPM) software.

      Set screen LoadRunner Result Files

      MCP accepts both types of LoadRunner result files:

      • .lrr files which are created during a run by the LoadRunner Controller.
      • .lrs files which are created after a run by the LoadRunner Analysis program.

      Set screen Resource Standard Format (RSF)

      Resource Standard Format (RSF) is the Profiler CSV file format for hardware resource logs. After you specify a hardware resource log as input in Profiler or Navigator, Profiler generates an RSF file to be used by its filtering and data-conversion modules.

      Set screen Network Standard Format (NSF)

      Network Standard Format (NSF) is the Profiler CSV file format for network trace data. After you specify a network trace file as input in Profiler or Navigator, Profiler generates an NSF file to be used by its filtering and data-conversion modules.


    Go to Top of this page.
    Previous topic this page
    Next topic this page

      Set screen Beautifycsv

      This utility launches MS-Excel 8+ to display generic .csv files in an aesthetic, functional format. Time-stamp fields that lack a date are given the date 1900/01/00.

      To conveniently invoke "Beautify CSV" by Right-clicking within Windows Explorer:

      1. Within Windows Explorer, select Tools, then Folder Options...
      2. Click on the "File Types" tab and wait.
      3. Scroll down to "CSV" ("Microsoft Excel Comma Separated Values") and highlight it.
      4. Click the "Advanced" button to open the "Edit File Type" dialog.
      5. Click the "New..." button under the "Actions:" list.
      6. In the "Action" field, enter "Beautify with Excel".
      7. In the "Appl used to perform..." field, enter

          "C:\Program Files\Mercury\Profiler3.6\beautifycsv.exe" %1 -v

          Notice that %1 and other parameters (described in the next section) are to the right of the double quotes.

      8. Click "OK".
      9. Right-click on a .csv file and try it.

      The DOS window pops up and disappears. So alternatively, run from a command window:

        cd \program files\mercury\Profiler3.6
        Beautifycsv SAR_resourcemap.csv -v -nobanner -nodecorate -nolock

          -v
          -verbose
               This switch will cause lots of status information to be
               printed while files are processed. This is mostly useful
               for debugging.
        
          -headers n
               The number of rows before the actual data starts. The
               default is auto-detect.
        
          -nodividers
               Will suppress the dividing lines in the data area.
         
          -nobanner
               Suppresses the copyright information
        
          -noautofilter
               Suppresses the autofiltering of the columns in Excel.
        
          -nodecorate
               Suppresses the colorization of rows in Excel.
        
          -lock -unlock
               Un/lock the worksheet after formatting.
        

      The program scans for *.tpl template files and must find one that matches the input.

      Sample tpl (template) files within C:\program files\mercury\Profiler3.6 include:

        activityreport.tpl
        dice.tpl
        DiceNewTemplate.tpl
        hostmap.tpl
        httpmap.tpl
        ipmap.tpl
        networkinfo.tpl
        networkstd.tpl
        recap.tpl
        resinfo.tpl
        ResourceBFSummary.tpl
        resourcemap.tpl
        resourcestd.tpl
        tcpwin.tpl
        transactioninfo.tpl
        transsummaryinfo.tpl

      Sample csv files within C:\program files\mercury\Profiler3.6 include:

        Best1_resourcemap.csv
        DB2Snapshot_resourcemap.csv
        DefaultHttpMap.csv
        Dstat_resourcemap.csv
        Introscope_resourcemap.csv
        LoadRunner_resourcemap.csv
        Measureware_resourcemap.csv
        PatrolPerform_resourcemap.csv
        Perfmon_resourcemap.csv
        Prstat_resourcemap.csv
        SAR_resourcemap.csv
        Vmstat_resourcemap.csv
        Vxstat_resourcemap.csv


    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen LoadRunner Translation

      The LoadRunner Results Translator needs to be run from the folder where LoadRunner is installed,
      which for LoadRunner 8.1 is C:\Program Files\Mercury\LoadRunner\bin

      So after you run a command window, cd to that folder, then:

        lrrtranslatecl.exe -in C:\Northrop\LoadTests\c15.s1w.p2apass2.noover.8u-7m.1h.r2\c15.s1w.p2apass2.noover.8u-7m.1h.r2.lrr

      lrrtranslatecl.exe looks for the *.eve files specified within the *.lrr file you provide, so if you have moved files from a Controller, make sure to recreate the exact path.

      Command Line Options:

        -in <filepath>      : Full path to .LRA or .LRR file
        -out <filepath>     : Full path to output file (default: .LRV)
        -delim <delimiter>  : Output file column delimiter (default: \t)
        -style <style>      : Conversion style -- plain (default), line#
        -tz <mode>          : Timezone of timestamp -- local (default), original
        -na                 : Fill empty entries with 'n/a' (default)
        -nona               : Fill empty entries with 0
        -exclude <types>    : Exclude type of data
                                t - transaction
                                s - server metric
                                d - datapoints
                                w - web metrics
                                v - vuser status
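For scripted runs, these options can be assembled into a command line programmatically. A sketch that only builds the argument list (actually executing it requires lrrtranslatecl.exe in the LoadRunner bin folder; the example input path is hypothetical):

```python
# Build an lrrtranslatecl.exe argument list from the options documented
# above. This only constructs the command; running it requires LoadRunner.
def translate_cmd(in_path, out_path=None, delim=None, exclude=None):
    cmd = ["lrrtranslatecl.exe", "-in", in_path]
    if out_path:
        cmd += ["-out", out_path]
    if delim:
        cmd += ["-delim", delim]
    if exclude:  # e.g. "wv" excludes web metrics and vuser status
        cmd += ["-exclude", exclude]
    return cmd

print(" ".join(translate_cmd(r"C:\results\run1.lrr", delim=",")))
# → lrrtranslatecl.exe -in C:\results\run1.lrr -delim ,
```

The list form could then be passed to subprocess.run() from the LoadRunner bin folder.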


    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Simulation Runs

      Simulation runs are used in two types of reports:

      1. Run Comparison reports; and
      2. Validation reports

      A Scenario Manager scenario (.sm) file with a multiple-run scenario contains multiple .stx files. The Visualizer project maintains references to the .stx files rather than to the .sm file.

     

     
    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen System Resource Usage Metrics

      As described in my page on performance monitoring

      MCP standard resource statistics converted by Navigator, each with its nickname, Perfmon object, the LoadRunner raw resource statistic collected, and a description:

      • CPU Utilization (nickname: CPU Time); object: Processor; statistic: % Processor Time (Processor_Total). Measures the time that the processor spends executing the thread of the Idle process in each sample interval, and subtracts that value from 100%. Each processor has an Idle thread which consumes cycles when no other threads are ready to run.
      • (none); object: System; statistic: Processor Queue Length. Number of processors servicing the workload.
      • Reads per Sec (nickname: Read Count); object: Physical Disk; statistic: Disk Reads/sec (PhysicalDisk_Total)
      • Read Bytes per Sec (nickname: Read Size); statistic: Disk Read Bytes/sec (PhysicalDisk_Total)
      • Writes per Sec (nickname: Write Count); statistic: Disk Writes/sec (PhysicalDisk_Total)
      • Write Bytes per Sec (nickname: Write Size); statistic: Disk Write Bytes/sec (PhysicalDisk_Total)

      Although numbers are reported to four significant digits in the raw simulation output, Excel and Visualizer round to three digits.
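The effect of that rounding can be reproduced with a small helper (an illustration only, assuming round-to-significant-digits display behavior):

```python
# Round to n significant digits, mimicking the three-digit display that
# Excel and Visualizer apply to four-significant-digit raw output.
def sig_round(x, n=3):
    """Round x to n significant digits via the %g format conversion."""
    return float(f"{x:.{n}g}")

print(sig_round(8765.4))    # → 8770.0
print(sig_round(0.047123))  # → 0.0471
```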


    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Transaction Details

    • Client Time
    • Trans. local time
    • Trans. reply time
    • Trans. request time

    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Transaction Summary Reports

      Transaction Summary reports (_tsummary.csv) are derived from a Profiler Transaction Report or an HTTP Transaction Report to provide Response Time breakdowns:

        By business function:
        • Overall
        • Client Time
        • Transaction Local Time
        • Transaction Reply Time
        • Transaction Request
        • Transaction Visit Count

        By transaction:

          Response Time
          • Overall
          • Request Time
          • Server Time
            • Overall
            • Local Time
              • Overall
              • CPU Time
          • Reply Time
          Requests:
          • Request Frame Count
          • Request Size (Wire)
          • Request Size (Payload)
          Reply:
          • Reply Frame Count
          • Reply Size (Wire)
          • Reply Size (Payload)
          I/O:
          • Read Count
          • Read Size
          • Write Count
          • Write Size

     

     
    Go to Top of this page.
    Previous topic this page
    Next topic this page

    Set screen Simulation Report (.rpt)

    Set screen Simulation ADN Coding

    Set screen Simulation Trace .tra files

    Set screen Debugging

    Portions ©Copyright 1996-2011 Wilson Mar. All rights reserved. | Privacy Policy |

    Related Topics:
    Load Testing Products
    Mercury LoadRunner
    Mercury LoadRunner Scripting
    NT Perfmon / UNIX rstatd Counters
    WinRunner
    Rational Robot
    Free Training!
    Tech Support

