> Driver signing is the most complex and expensive addition to hit
drivers since I have been writing them for 20 years. I am unsure why it
has to be this way. Answers to any of these questions might help me
understand this situation:
I’ve been putting signatures on Windows code since about 1995. Of course I
was also involved with early Internet commerce security, so I became very
aware of the need for signatures long ago. If I think about a year of work,
say it’s 2000 hours, I’d bet I spend less than 40 hours/year fooling with
digital signature stuff, implying the overhead is perhaps 2% or less.
- Why are there different classes of certificates for driver
developers? For instance, why can’t the VeriSign organization
certificate be used for code signing or a GlobalSign code signing
certificate be used for winqual?
Driver signing is ONLY useful for preventing malicious code if the ability
to get a signing key is tightly controlled. My guess is different groups at
Microsoft have different levels of trust for the various Certificate
Authorities. Just by comparison, if you’re an Apple iPhone developer, you
can ONLY get signing keys directly from Apple, so really Microsoft is
allowing flexibility in the origin of many kinds of signing keys. Perhaps it
would just be simpler if Microsoft were the ONLY source for any keys used to
sign Windows software.
- Why are we absolutely required to do business with VeriSign for
logo? What would happen if they went out of business?
Microsoft would pick another CA, or else do it themselves. The signatures
used for WHQL submissions are short-lived, and it would be quick to shift to
some other root. The actual keys used to sign code need to live much longer,
since the corresponding root certificates are already deployed on systems
out in the world.
- Why are certificates forced to expire every 1-3 years? Why can’t we
just buy one that lasts forever?
That’s a policy choice of the CAs. I assume a business model built on
long-expiration keys is not as viable a business, and you DO want the CAs to
stick around. There is also the security issue that shorter expirations are
safer. For example, employees at a company using a signing key might take
copies of the key when they leave, and if the expirations were, say, 10
years, the only way to deactivate the “stolen” keys would be to use
certificate revocation lists. The problem with CRLs is that if they grow to
be very big, they start to degrade signature-checking performance. Having a
1-year expiration on a signing key limits the lifetime during which a stolen
key can be used. Is it really the expiration length that bothers you, or the
price of the key? If keys were really cheap but expired in 3 months, that
might actually be a better security model, though maybe not as good a
business model for the CAs.
- Why are certificates so expensive? And why is it an annual fee rather
than a single setup fee? How much work does VeriSign do in year 2
compared to year 1?
VeriSign, I believe, is a public company, so you could read their
financials and report back here. If I’m not mistaken, CAs are not exactly
growing money on trees, and some CAs have even gone out of business. I do at
times wonder how it could cost $400 for a few kilobytes of data that are
generated in a few seconds. On the other hand, they do need trustworthy
people and systems to verify that somebody requesting a signing key is who
they claim to be. My guess is you need to pay those people decent wages, and
perhaps run background checks on them, and do a variety of other things
before you can trust them. The systems at VeriSign need to ensure that only
legitimate key requests are honored, even if individual people at VeriSign
are less trustworthy. What do you think it costs to create a system that
makes imperfectly trusted people work together as a system of greater trust?
The problem is a LOT more complex than just running the key generation
algorithms on a processor.
- Why is so much red tape necessary to get a certificate issued? It is
impractical to get a certificate for mobile, internet-based consultants
who need to meet physical-presence tests for somewhere they have barely
stayed or won’t be there much longer anyway.
Like I said, digital signatures are ONLY useful if it’s hard for the bad
guys to get one. I personally have a driver consulting corporation, and we
have a GlobalSign key and the VeriSign WHQL submission key. We don’t view
$400/year for signing keys as that big an expense. The $2000+/year we pay to
Microsoft for an MSDN subscription is a far bigger expense, but pretty much
a requirement.
- Why is signing the driver not part of the build tool? I modified
mine by hand so that every time I press build it pops out a perfectly
release-signed driver, even for checked builds. I and my customers
agree this has every advantage and no disadvantage.
Different companies may want different policies about who the private keys
are controlled by. If the build process uses the key, then the private key
is likely usable on any build machine with no direct user approval of each
signature. Unless the private key is stored in a hardware token, it also
means a security breach of a build machine might allow the theft of a
company’s signing keys. The only remedy to that would be to revoke the key,
which should make all software signed with that key stop working, basically
breaking EVERY customer out in the world.
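For reference, the kind of build-integrated signing being described usually
boils down to a post-build step roughly like the sketch below (the
certificate subject name, driver file name, and timestamp server are
placeholders, not from any real project). Note that whatever certificate the
step names has to be usable, unattended, on that build machine, which is
exactly the exposure described above.

    rem Hypothetical post-build step: sign the freshly built driver with a
    rem certificate from the build machine's personal ("MY") store, and add
    rem a timestamp so the signature outlives the certificate's expiration.
    signtool sign /v /s MY /n "Xyz Inc." /t http://timestamp.digicert.com xyzdriver.sys
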
At many companies I’ve been at, signing keys were pretty loosely controlled,
probably out of ignorance about the danger of them escaping into the wild. I
personally think a company should VERY closely control its private key; my
company does. Originally, the best security was to generate the key on a
hardware token, and connect that token to the machine used for signing. This
has the advantage that it’s impossible to steal the private key, as it’s
forever locked INSIDE the token hardware. Practically speaking, this is a
little inconvenient.

An alternative is to generate the private/public key pair, and get the
public key signed by the CA. You then put this private key/public
certificate on removable media, like a CD or USB fob. On machines that need
to do signing, you then import the key pair as non-exportable. Ideally, you
import the key pair to a hardware token (or TPM) such that the machine can
do signing, but the key can’t be stolen. If you are very into security, you
should generate the key pair on a hardware token, and PHYSICALLY secure that
hardware token so that it’s accessible only for FINAL signing of a driver by
people in the company you trust. Putting the hardware token in a safe, with
the combination known only to trusted people, is one way.

One strategy to balance security and risk is to buy TWO signing keys for
each PRODUCT you make (yes, about $800/product) and generate one key pair
with a name that distinguishes it as for test use only (Xyz Inc. FOR
INTERNAL TESTING ONLY). When you ship the final product, a trusted person
takes the hardware token out of the safe and signs the final product bits
with the real key pair. The advantage of two key pairs is that if the
development/test key pair is compromised, you can just revoke it (you also
revoke it when you ship). Revoking the REAL key pair used for your released
products brings down customers, which is bad.
I’d actually like to see key pair costs go way down, and the security
around them improve. Something like the Apple model, where EACH developer
requires a key pair with a fairly short expiration. You still might do final
product signing with a very carefully controlled key pair.
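To make the non-exportable import mentioned above concrete, here is a rough
sketch using the stock Windows tools. The file names, password, and
certificate name are placeholders, the NoExport modifier needs a reasonably
recent certutil, and you would substitute your own CA’s timestamp server
(for kernel-mode release signing you may also need your CA’s
cross-certificate via /ac, omitted here).

    rem Import the key pair from removable media so this machine can sign,
    rem but the private key cannot be re-exported from it.
    certutil -p "PfxPassword" -importpfx E:\xyz-release.pfx NoExport

    rem Final release signing with the real, closely held key pair, with a
    rem timestamp so signatures remain valid after the certificate expires.
    signtool sign /v /n "Xyz Inc." /t http://timestamp.digicert.com xyzdriver.sys
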
- Why aren’t individuals allowed to write drivers anymore? They are
prohibited from obtaining a certificate and thus barred from access to
new Windows systems.
I’m basically an individual. I go through the trouble of having a
corporation, with a small number of employees, typically me and sometimes
one or two others. I deeply care about the security of products I work on
and the Windows platform in general. Actually, Dun and Bradstreet says my
company has been in business for 30 years now, although it has only been
incorporated in its current legal form since 1998. I fully expect that if
code signed by my company is malicious, men in police uniforms will show up
at my door and take me away. I also want any code YOU write that’s
malicious and distributed in any form to be traceable to YOUR door, so
men in uniforms can show up and take you away. If you’re not willing to be
RESPONSIBLE for your code, I don’t want it running on any of MY or my
customers’ systems. If you’re not willing to assure that traceability of your
responsibility, I’m sorry to say you should not be allowed to distribute
Windows software to the public.
It’s not true that responsible software professionals who take the correct
legal steps can’t write and distribute drivers for Windows anymore. It is
true that random people, who write software as a hobby and who are not
willing to be legally responsible for the code they release into the world,
are deterred from releasing such code. For internal research, personal use,
or even prototypes given to a small number of people, you can just generate
a self-signed test certificate and enable test-signing mode on a specific
machine (sketched below). Anybody with the WDK can generate the self-signed
certificates, and there are no legal requirements of any kind. Random
hackers (using the term as a positive) can freely write 64-bit Windows code
and play with it. The limits start happening when you want to distribute
code to the public. There are no real limits on writing code in your lab.
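For anyone who hasn’t done it, the self-signed test path is roughly the
following sketch (the certificate and file names are made up, and the
bcdedit change only takes effect after rebooting the TEST machine).

    rem Create a self-signed test certificate in a local certificate store.
    makecert -r -pe -ss PrivateCertStore -n "CN=Xyz Test Cert" XyzTest.cer

    rem Sign the driver binary with that test certificate.
    signtool sign /v /s PrivateCertStore /n "Xyz Test Cert" xyzdriver.sys

    rem On the test machine only: allow test-signed kernel-mode code to load.
    bcdedit /set testsigning on
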
Microsoft’s requirements are much less stringent than Apple’s for iPhone
development. Since I’m a responsible software professional, I fully support
Microsoft’s efforts to assure the traceability of malicious software and the
stability of their OS. I’d actually support Microsoft going MUCH farther,
and not allowing execution of ANY code that isn’t signed with trusted
credentials.
This is basically the signature whitelist strategy of anti-virus protection,
instead of the current blacklist strategy.
People writing Java apps meant to be accessed online have needed to sign
everything for years. People writing user-mode downloaded extensions, like
ActiveX controls, have been required to sign things for years.
I actually support PROCESSOR manufacturers adding support for only running
code that is trusted by them, for things like firmware/BIOS.
Sure, I wish we lived in a world where EVERYBODY could be trusted, and no
signatures on code were ever required, but the reality of the world is you
can’t trust every piece of code. So can YOU tell me why YOU don’t feel like
the rules for preventing malicious code should apply to YOU? There are LOTS
of people here on NTDEV who are basically individuals, who do write 64-bit
driver code, and who do take the proper steps to assure responsibility is
traceable.
- Since this forum is riddled with posts about driver signing, is it
time to open a new forum for it?
MOST of us don’t really have a problem with signing. I’m sure people who are
not willing to be responsible for their code do. Yes, there are technical
nuances that need to be learned, but it’s generally a lot simpler than
writing drivers in general. I’m expecting to create a blog for my consulting
company in the near future, and we might have some info about digital
signatures eventually. Just doing a search on Microsoft docs and/or the
NTDEV archives will find all the details you need to do it.
Jan