Hi,
What is the preferred method of automated testing for drivers? Are there
any OOP patterns, test tools, regression testers, or unit testers available?
kutty
“Kutty Banerjee” wrote in message news:xxxxx@ntdev…
> What is the preferred method of automated testing for drivers? Are there
> any OOP patterns, test tools, regression testers, or unit testers available?
Well, there are a number of things you can do to test a driver. Things I do
are:

1. During development and most testing:
        a. Run the driver under Driver Verifier
        b. Test the driver with the checked build of the kernel/HAL
           and any needed drivers
        c. Run under Win2k3, since it has the latest set of checks
        d. Use the Call Usage Verifier (requires Win2k3)

2. While developing the driver:
        a. Build the driver under PREFAST and fix all the warnings
        b. Build the driver with /W4 on your compile (yes, you have to
           pragma out some things; see the sketch after this list) and
           fix all the warnings
        c. Run the INF file for the driver through ChkInf and fix all
           the errors and warnings

3. Testing the driver:
        a. If you have power management support, use the ACPI tests
           from the DDK tools
        b. Always run dc2 from the DDK tools
        c. For any PnP driver, run pnpdtest from the DDK tools
        d. Run the HCT tests: use the general tests for any device, and
           run the device-specific tests if your device is a supported class

4. Consider getting a code coverage tool (see
http://www.compuware.com or http://www.bullseye.com/) and run your driver
with it; add tests so that all the major code paths are exercised. Note:
Microsoft just started a program to sign “unique devices”; you have to have
75% code coverage.
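
As a rough illustration of the /W4 point in 2b (the warning numbers here are
just common examples of warnings the DDK headers themselves trigger at /W4;
use whatever numbers your own build actually reports):

    /* Sketch: silence specific warnings around the system headers so the
     * rest of the translation unit still builds cleanly at /W4. */
    #pragma warning(push)
    #pragma warning(disable:4201)  /* nonstandard extension: nameless struct/union */
    #pragma warning(disable:4214)  /* bit field types other than int */
    #include <ntddk.h>
    #pragma warning(pop)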
–
Don Burn (MVP, Windows DDK)
Windows 2k/XP/2k3 Filesystem and Driver Consulting
Hi,
So it appears that most of the testing is end-product testing. Are there
any unit testing mechanisms? What about regression testing, or having test
suites per driver? I remember somebody in this group talking about running
regression tests.
kutty
> So it appears that most of the testing is end-product testing. Are there
> any unit testing mechanisms? What about regression testing, or having test
> suites per driver? I remember somebody in this group talking about running
> regression tests.
It just depends on the nature of the driver and the hardware it drives,
if any. You’ll have to develop something specific to your situation.
There are posts in the archive about how various people test drivers -
some of it is quite elaborate indeed.
One amplification to Don’s response - try to test on a dual-proc box, as
well as on a 64-bit box. An ideal target for me is a 2-proc AMD64 - not
expensive, and well worth it.
-sd
> Hi,
> So it appears that most of the testing is end-product testing. Are there
> any unit testing mechanisms? What about regression testing, or having test
> suites per driver? I remember somebody in this group talking about running
> regression tests.
> kutty
Our testing, on graphics drivers, is almost all done in big test suites
like the WHQL HCT and by running many different applications.
Aside from that, we obviously test our individual code, and when big or
sensitive code paths have changed, we cross-run a build on some other
machines before releasing the change to the common code base.
We also have some “run a bit of everything” tests that are pretty quick but
try to cover a wide range of different code paths. These are developed in
house, and I think every driver developer has a few (or lots of) these
test programs that exercise a bunch of things.
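
A rough sketch of what such a quick user-mode exercise program can look like
(the device name and IOCTL code are made up for illustration; a real one
would hit whatever paths the driver actually has):

    #include <windows.h>
    #include <winioctl.h>
    #include <stdio.h>

    /* Illustrative IOCTL code; not from any real driver. */
    #define IOCTL_MYDEV_SELFTEST CTL_CODE(FILE_DEVICE_UNKNOWN, 0x800, \
                                          METHOD_BUFFERED, FILE_ANY_ACCESS)

    int main(void)
    {
        char   buf[512] = {0};
        DWORD  bytes = 0;
        HANDLE h = CreateFileA("\\\\.\\MyTestDevice",
                               GENERIC_READ | GENERIC_WRITE,
                               0, NULL, OPEN_EXISTING, 0, NULL);

        if (h == INVALID_HANDLE_VALUE) {
            printf("open failed: %lu\n", GetLastError());
            return 1;
        }
        /* Hit a few different code paths: a read, a write, and a custom IOCTL. */
        if (!ReadFile(h, buf, sizeof(buf), &bytes, NULL))
            printf("read failed: %lu\n", GetLastError());
        if (!WriteFile(h, buf, sizeof(buf), &bytes, NULL))
            printf("write failed: %lu\n", GetLastError());
        if (!DeviceIoControl(h, IOCTL_MYDEV_SELFTEST, NULL, 0, NULL, 0, &bytes, NULL))
            printf("ioctl failed: %lu\n", GetLastError());
        CloseHandle(h);
        return 0;
    }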
It’s of course possible to build a test suite that tests individual
functions or portions of code, but it’s quite hard for a couple of reasons:
1. Most driver code runs in kernel mode. Running PARTS of a kernel
driver can be very hard, because you still need to support enough of the
driver to make it operate in some suitable way in conjunction with Windows.
You could consider moving the code to user mode, but then all the
kernel-specific code would not work, or would have to be replaced with other
bits of code.
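
One way to do that replacement, as a rough sketch (the KERNEL_BUILD define
and the PORT_* names are purely illustrative): keep the portable logic behind
a small porting header, and map the few kernel calls it makes onto CRT
equivalents in the user-mode test build.

    /* port.h -- sketch of a porting header for unit-testing driver logic in
     * user mode.  KERNEL_BUILD and the PORT_* names are illustrative. */
    #ifdef KERNEL_BUILD
      #include <ntddk.h>
      #define PORT_ALLOC(n)   ExAllocatePoolWithTag(NonPagedPool, (n), 'tseT')
      #define PORT_FREE(p)    ExFreePoolWithTag((p), 'tseT')
      #define PORT_ASSERT(x)  ASSERT(x)
    #else
      #include <stdlib.h>
      #include <assert.h>
      #define PORT_ALLOC(n)   malloc(n)
      #define PORT_FREE(p)    free(p)
      #define PORT_ASSERT(x)  assert(x)
    #endif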
2. Most driver code is fairly trivial [a], but it needs to co-operate with
the rest of the system in a nice way. This is the hard part to figure out.
Driver Verifier is one of the tools that helps with this.
[a] There are certainly a lot of examples of complex driver code, but as a
general rule, the work of a driver is to take something (an IRP), do
some simple processing (figure out the physical address of a page, for
instance) and program some hardware registers (set up DMA, for instance).
Don’t flame me, I’m fully aware that there is lots of complex code in
drivers, not to mention graphics drivers…
It’s a good idea to be a “paranoid developer” when developing drivers, so
that you check and double-check anything that could be passed wrong at any
stage. So check that pointers are not NULL, that values are in the range you
expect, etc., etc.
If you use the correct assert macros, you can leave the checks in for the
debug build, and building for release leaves them out. Of course, you do
most of the testing in a debug build.
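
As a concrete illustration of that (the helper and constant names are made
up): the DDK’s ASSERT macro is active only in checked (DBG) builds and
compiles away in the free build, so checks like these cost nothing in the
shipped driver.

    #include <ntddk.h>

    #define MY_MAX_TRANSFER 4096        /* illustrative limit */

    VOID
    MyValidateWrite(PIRP Irp)           /* illustrative helper name */
    {
        PIO_STACK_LOCATION irpSp;

        ASSERT(Irp != NULL);            /* caller really passed an IRP */

        irpSp = IoGetCurrentIrpStackLocation(Irp);
        ASSERT(irpSp != NULL);
        ASSERT(irpSp->Parameters.Write.Length <= MY_MAX_TRANSFER);  /* in range */
    }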
–
Mats
> One amplification to Don’s response - try to test on a dual-proc box, as
> well as on a 64-bit box. An ideal target for me is a 2-proc AMD64 - not
> expensive, and well worth it.
Ideal because it’s got the capability of being a single-processor 32- or
64-bit machine, and a dual-processor 32- or 64-bit machine. Very nice…
Covers just about everything except Itanium.
Another good idea is to have two different processor manufacturers, so that
you test on at least the Intel and AMD platforms. When I worked at AMD, I
found a fair few cases where companies wouldn’t guarantee that their code
would run on AMD simply because someone couldn’t be bothered to run the
test-suite on an AMD machine… Not that I EVER found a problem with any
software (or hardware), just customers complaining that AMD wasn’t supported
hardware for .
My test-machine is a single processor AMD64, which works well for me, but we
do have multi-proc machines too.
–
Mats
“Kutty Banerjee” wrote in message news:xxxxx@ntdev…
> So it appears that most of the testing is end-product testing. Are there
> any unit testing mechanisms? What about regression testing, or having test
> suites per driver? I remember somebody in this group talking about running
> regression tests.
Well, the HCTs are reasonable tests, and I was pleasantly amazed at data
from a BOF at WinHEC showing that many people run the tests during development.
Perhaps you have a different definition of regression testing, but I’ve
always known that term to mean testing that a known and fixed bug has not
reappeared; using my definition, these are not something you can get off the
shelf.
There are no unit testing mechanisms that I know of; here I define unit
testing as testing a piece of functionality in a driver. While, as I
mentioned, there are things like the ACPI tests for power handling and DC2
for device I/O control edge cases, many drivers have unique functionality,
so how do you test that? I have put together my own unit tests for
components of a driver. For instance, I needed to create a custom AVL tree
in a driver recently; this module was tested by linking it to a simple
NT4.0-style driver that used DbgPrint and DbgPrompt for console I/O to
WinDbg, which allowed me to exercise the tree code independent of the rest
of the driver. I do this fairly often in non-hardware drivers, where the
underlying code can in many cases be easily packaged.
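
A rough sketch of that kind of throwaway NT4-style harness (the Avl* functions
stand in for whatever module you link in; the command handling is deliberately
minimal, and it needs a kernel debugger attached for DbgPrompt to work):

    #include <ntddk.h>

    extern BOOLEAN AvlInsert(ULONG Key);   /* module under test (stand-in names) */
    extern BOOLEAN AvlFind(ULONG Key);

    NTSTATUS
    DriverEntry(PDRIVER_OBJECT DriverObject, PUNICODE_STRING RegistryPath)
    {
        CHAR cmd[64];

        UNREFERENCED_PARAMETER(DriverObject);
        UNREFERENCED_PARAMETER(RegistryPath);

        for (;;) {
            /* Read a command line from the kernel debugger (WinDbg). */
            DbgPrompt("avltest> ", cmd, sizeof(cmd));
            if (cmd[0] == 'q')
                break;
            if (cmd[0] == 'i')
                DbgPrint("insert: %s\n", AvlInsert(1) ? "ok" : "failed");
            else if (cmd[0] == 'f')
                DbgPrint("find:   %s\n", AvlFind(1) ? "found" : "missing");
        }
        /* Fail the load on purpose so the test driver does not stay resident. */
        return STATUS_UNSUCCESSFUL;
    }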
One thing I forgot to mention in my previous list: be sure to run all
testing on a multiprocessor system and on a uniprocessor kernel as well;
there are weird bugs that can occur only on a uniprocessor, and of course
one needs to test on a multiprocessor.
–
Don Burn (MVP, Windows DDK)
Windows 2k/XP/2k3 Filesystem and Driver Consulting
Steve Dispensa wrote:
> One amplification to Don’s response - try to test on a dual-proc box, as
> well as on a 64-bit box. An ideal target for me is a 2-proc AMD64 - not
> expensive, and well worth it.
Don did an excellent job of summarizing “best in class” test activities.
Just to amplify Steve’s amplification: Try to test on a box with as many
procs as possible. A quad is better than a dual; an octo is better than a
quad.
As you go, build both for x86 and for 64-bit AMD64. Test on AMD64.
Also, in terms of CUV: You need to build your driver in the Server 03
build environment, but the created driver will run on XP or later.
The next issue of The NT Insider is scheduled to be a “Testing
Mega-Issue” – We should have some good guidelines there. In fact, I’d
be very happy to hear people’s test guidelines/ideas/suggestions (email
me off list, please) to share with the community.
Peter
OSR
On Thu, 2004-07-08 at 15:32, PeterGV wrote:
> Just to amplify Steve’s amplification
…reminds me of rule 6a
http://www.ietf.org/rfc/rfc1925.txt
I make all my coders grok this before they get to start.
-sd