Performance Validator provides performance collectors and class and function filters.
At first glance they appear to do quite similar things: enabling or disabling data collection for specific functions.
Performance collectors switch the global data collection state as you enter and leave a function, while filters control instrumentation (and therefore data collection) locally for a function.
In practice, the differences lie in:
•how much code is instrumented
•flexibility of conditions for when data is collected
•complexity of targeting the functions of interest
Collectors let you keep all of your application instrumented, but selectively collect data on different parts of the application at different times.
Whenever instrumented code is executed, data is collected for functions according to the conditional definitions in the performance collectors.
Performance collectors are more flexible since they allow runtime conditionals.
There are two ways to think about using performance collectors:
•they let you collect data for a specified function and all the called child functions
•they let you collect data for functions depending on the calling functions
Class and function filters
Filters allow you to only instrument certain classes and functions.
Data collection occurs only when code is instrumented as specified by the Class and Function Filter settings.
Class and function filters are more rigid since there are no runtime conditionals.
Instrumentation filters allow code to execute faster since there's less overhead from Performance Validator.
If you're not concerned with the conditional aspects, then you can often achieve the same results using either method, but one way may require you to specify many more functions than the other to include or exclude.
Rather than setting up collectors or filters early in your performance monitoring, it may be easier to first use Performance Validator to inspect the Call Tree, Relations and other Statistics for your application before deciding which functions to eliminate from instrumentation and which you only need in certain situations.
The help topic for Performance Collector settings shows a simple example of conditionally collecting data from a function depending on the parent function.
Absolutely. There is a help section on working with NT Services.
Not using CreateProcess
The Inject and Wait for Application to Start features use CreateRemoteThread to inject into an application.
For the reasons below, injection using CreateRemoteThread does not always work.
Common reasons for injection failure
•A missing DLL in your application
•The target application is a .NET application or .NET service
•A missing DLL in Performance Validator
•The application may have started and finished before the DLL could be injected
•The application security settings do not allow process handles to be opened
•The application is a service and is running with different privileges than Performance Validator
Flush the symbol cache files:
•Settings menu → Settings → Hook safety → Clean Instrumentation Cache → Scan and delete symbol cache files → Close → OK
You may also want to disable the on-disk cache of PDB file symbols for functions and lines:
•Settings menu → Settings → Hook safety → deselect Cache instrumentation data... → OK
Some features such as the Callstack tab can use thread names to make things a bit more intuitive.
From within your application you can provide a name for use by a debugger or debugging tool by using the Win32 RaiseException() API.
Add the function below to your application. It is based on an example from Microsoft; other examples are available on the web, some specifying a buffer size of 8 characters plus one terminator, others specifying no strict buffer size.
After adding this function you can call it from inside the thread procedure of any executing thread to name that thread.
To name a thread from outside of the thread procedure, pass the thread id instead of -1.
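The function itself is not reproduced in this page, so here is a sketch following Microsoft's documented thread-naming example (the 0x406D1388 exception code and THREADNAME_INFO layout come from that example; the nameThread name matches the shipped example application). The non-Windows stub exists only so the snippet compiles everywhere; the technique itself is Windows-specific:

```cpp
#ifdef _WIN32
#include <windows.h>

// Exception code that debuggers and debugging tools recognise as a
// thread-naming request (from Microsoft's documented example).
static const DWORD MS_VC_EXCEPTION = 0x406D1388;

#pragma pack(push, 8)
struct THREADNAME_INFO
{
    DWORD  dwType;     // must be 0x1000
    LPCSTR szName;     // pointer to the thread name (ANSI string)
    DWORD  dwThreadID; // thread id, or -1 for the calling thread
    DWORD  dwFlags;    // reserved for future use, must be zero
};
#pragma pack(pop)

void nameThread(DWORD dwThreadID, const char *threadName)
{
    THREADNAME_INFO info;
    info.dwType     = 0x1000;
    info.szName     = threadName;
    info.dwThreadID = dwThreadID;
    info.dwFlags    = 0;

    __try
    {
        // An attached debugger or debugging tool catches this exception,
        // reads the name from the THREADNAME_INFO block, then continues.
        RaiseException(MS_VC_EXCEPTION, 0,
                       sizeof(info) / sizeof(ULONG_PTR),
                       reinterpret_cast<const ULONG_PTR *>(&info));
    }
    __except (EXCEPTION_EXECUTE_HANDLER)
    {
        // No debugger attached: swallow the exception and carry on.
    }
}
#else
// Thread naming via RaiseException() is Windows-specific; this no-op stub
// only exists so the example compiles on other platforms.
typedef unsigned long DWORD;
void nameThread(DWORD, const char *) {}
#endif
```

Inside a thread procedure, a call such as nameThread((DWORD)-1, "myWorkerThread") names the calling thread (the name here is illustrative); passing a real thread id instead of -1 names that thread from outside its thread procedure.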
The example application shipped with C++ Performance Validator demonstrates how to use nameThread. See nativeExample.cpp.
We have tried to add as many features to Performance Validator as we thought would be useful to our users.
In fact, every feature in Performance Validator has been used to solve problems and bugs for clients who consult us, and in our own business, so we know the features we have are useful.
However, maybe we overlooked a feature that you would find very useful.
We'll happily consider most ideas for new features for Performance Validator. But no Quake, FlightSim or Flappy Bird Easter eggs, sorry!
Please contact us to let us know your thoughts.