For what it may or may not be worth, here are some stats from the Wikipedia
“microprocessor” article:
===================
Market statistics
In 2003, about $44 billion (USD) worth of microprocessors were manufactured
and sold. [1] Although about half of that money was spent on CPUs used in
desktop or laptop personal computers, those account for only about 0.2% of all
CPUs sold.
About 55% of all CPUs sold in the world are 8-bit microcontrollers. Over 2
billion 8-bit microcontrollers were sold in 1997. [2]
Less than 10% of all the CPUs sold in the world are 32-bit or more. Of all
the 32-bit CPUs sold, only about 2% are used in desktop or laptop personal
computers; the rest are sold in household appliances such as toasters,
microwaves, vacuum cleaners, and televisions. “Taken as a whole, the average
price for a microprocessor, microcontroller, or DSP is just over $6.” [3]
So let’s see: 10% of the micros are 32-bit or more, and 2% of those end up in
PCs. So the 32+ bit desktop market is 0.2% of the microprocessor market.
Even if we only care about 32+ bit processors, 98% of them end up in
refrigerators, toasters, and car stereos rather than on the desktop. Maybe
it is that 98% market share that should be at least partially driving the
feature content of the processors?
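The arithmetic above is easy to check. A quick sketch (the 10% and 2% figures are the estimates quoted from the Wikipedia article; everything else just multiplies them out):

```python
# Sanity-check the market-share arithmetic quoted above.
share_32bit_or_more = 0.10  # fraction of all CPUs sold that are 32-bit or more
share_in_pcs = 0.02         # fraction of those 32-bit CPUs that go into PCs

# Desktop/laptop share of ALL CPUs sold: 10% of 2% = 0.2%
desktop_share = share_32bit_or_more * share_in_pcs
print(f"Desktop share of all CPUs sold: {desktop_share:.1%}")

# Of the 32-bit-or-more CPUs alone, the non-PC (embedded) share:
embedded_share_of_32bit = 1 - share_in_pcs
print(f"32-bit CPUs sold into embedded uses: {embedded_share_of_32bit:.0%}")
```

which prints 0.2% and 98%, matching the figures in the paragraph.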
Loren
----- Original Message -----
From: “Arlie Davis”
To: “Windows System Software Devs Interest List”
Sent: Tuesday, February 20, 2007 12:00 PM
Subject: RE: [ntdev] 16 Bits Compatibility
You might as well ask why English (and French, German, Hindi, etc.) still
contains words that people used 2000+ years ago.
Backward compatibility! It’s not some scourge, some evil. It simply means
things have value, and people don’t throw valuable things away without a
reason.
Besides, it’s not like 16-bit code is some huge burden on modern processor
designers. Modern x86 processors are nothing, NOTHING like the little
sequential processors of the early 80s. Only the front-end instruction
decoder knows anything about the x86 instruction set, and it simply decodes
it into an internal, intermediate language, which we never see. That
instruction decoder is a tiny, tiny part of the overall processor. When you
consider the complexity of a modern processor, the instruction decoder is
basically negligible. These processors devote a huge amount of silicon to
instruction-level parallelism – instruction reordering, register renaming,
speculative execution, branch prediction, deep pipelining, multiple issue,
etc. So who cares if the processor can still interpret 16-bit opcodes?
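The front-end-decode idea can be sketched in a few lines. This is a toy illustration only, not real hardware: the instruction spellings, micro-op names, and lookup-table approach are all invented here to show the shape of the translation, in which legacy x86 instructions (16-bit or otherwise) are expanded into internal micro-ops before the out-of-order back end ever sees them.

```python
# Toy sketch of a front-end instruction decoder: each architectural
# x86-style instruction expands into one or more internal micro-ops.
# All instruction and micro-op names are invented for illustration.
UOP_TABLE = {
    "ADD AX, BX": ["uop_add r_ax, r_ax, r_bx"],
    "PUSH AX":    ["uop_sub r_sp, r_sp, 2", "uop_store [r_sp], r_ax"],
    "INC CX":     ["uop_add r_cx, r_cx, 1"],
}

def decode(instructions):
    """Translate a list of x86-style instructions into a flat micro-op stream."""
    uops = []
    for insn in instructions:
        uops.extend(UOP_TABLE[insn])  # real decoders do this in hardware
    return uops

# The back end only ever sees the micro-op stream:
for uop in decode(["PUSH AX", "ADD AX, BX", "INC CX"]):
    print(uop)
```

The point of the sketch is that the lookup table is the only place the legacy instruction set appears; everything downstream of `decode` works on the internal representation, which is why keeping 16-bit opcodes around costs the back end nothing.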
We used to think the instruction set mattered… remember CISC vs. RISC?
(Here’s a trick question – is a modern x86 processor CISC or RISC?) At
certain extremes it still does (exotic vector architectures, for example),
but within the basic framework of low-processor-count (including
single-processor) commodity computers, it simply doesn’t matter. That’s why
there’s really only one ISA anymore, ignoring the modest x64 extensions, in
the commodity workstation/server market.
------------------------------------------
From: xxxxx@lists.osr.com
[mailto:xxxxx@lists.osr.com] On Behalf Of amitr0
Sent: Friday, February 16, 2007 5:48 PM
To: Windows System Software Devs Interest List
Subject: [ntdev] 16 Bits Compatibility
Sorry if this is off topic, but I didn’t know who else to ask…
Why are processors still made backward compatible to the extent of 16-bit
code support? Why do we still bootstrap in 16-bit mode when DOS-based
programs are long obsolete?
Can’t the support be dropped now?
–
- amitr0
— Questions? First check the Kernel Driver FAQ at
http://www.osronline.com/article.cfm?id=256 To unsubscribe, visit the List
Server section of OSR Online at
http://www.osronline.com/page.cfm?name=ListServer