Central Processing Unit
The Central Processing Unit (usually
shortened to CPU or processor) is the thingy inside your computer that computes the various processes that make your computer fun and useful, such as rendering pornographic websites and running video games. Processors are notable for being small, hot, and extremely expensive. So is Devon Aoki, but to our knowledge she cannot play Metal Gear Solid or Grand Theft Auto. She is, however, reputed to be able to display pornographic images of her body. Sadly, she is not presently for sale.
And hypothetically, even if she were, you probably wouldn't win the bidding war anyway. Industry experts therefore recommend that you buy a cheap PC on closeout, use the money you saved towards the purchase of a high speed Internet connection, then sequester yourself in a locked room for days at a time to masturbate like a crazed chimpanzee.
The first processors that saw widespread consumer adoption were produced by the Cuisinart Corporation. These were largely self-contained units that did not rely on a computer. They did however include an attached plastic dome monitor, where the activity of the processor could be visually observed. Popular applications that ran on the Cuisinart included: coleslaw, shredded carrot, egg salad, and the ever popular milkshake.
While these processors did not suffer from the heat dispersal problems that current generation CPUs do, they were enormously dangerous. The primitive circuitry of the time did not distinguish between hardware and software, nor between artificial reality and real-world objects. Consequently, many thrill seekers -- as well as the miserably ignorant -- lost fingers, toes, and other dangly bits to the rotary processing might of those old machines. And before you ask, no pictures of the maimed are fit for inclusion here. Shame on you for even thinking it.
First generation processors were largely composed of plastic and stainless steel. Software was run at extremely high speeds (for the time), by the turning of a swirly-whirly rotary blade. Various buttons at the base of the unit allowed for user control, though it wasn't always clear that the various modes produced distinctly different results. Historians have since surmised that the buttons were probably included to provide the appearance of advanced functionality, and to provide weak-minded simpletons something to play with. Not surprisingly, these were a near-instant hit, and are now also regarded as the first generation ancestors of the hand-manipulated controllers that are common to modern day PCs and video game consoles.
Beginning in the early 1980s, a miniaturization craze resulted in processors shrinking down to the size of a book of matches. Thereafter, they ceased to be self-contained units, and instead became separate pieces of the PC. In the newly designed computer case, the clear dome top was phased out along with the neat plastic rammer used to shove in the software. In replacing this with cases sheathed in sheet metal, much of the danger of lost dangly bits was eliminated, along with much of the fun of watching them go whirrr whizzz cha cha cha plub plub plub smush smush smush.
From 1981 onward, the number of fingers, toupees and penises puréed by misadventure dropped to near zero. Insurance companies heaved a collective sigh of relief. Doctors initially mourned, but soon busied themselves with spontaneous, anonymous sex with nurses, orderlies, candy stripers, comatose patients... anything really... while continuing to bill for deceased clients, and others that hadn't been in for a check-up in years.
And meanwhile, the computer industry eventually devised other ways of amusing PC buyers -- a neck-and-neck race between hardware capability and software upgrades. Processors continued to double in speed every two years, double in width every 6 months, and double in height every 5 days (if you include the fan and heat sink). Which brings us to...
State of the Art
Today's generation of processors owe as much to building-trade architects as they do to the intellectually gifted but socially inept idiot savants -- muzzled and chained to computer workstations in marginally developed banana republics and third-world nations that no-one really cares about anyway. It is from these technological sweatshops -- owned by a manufacturer we can't actually name here due to a standing gag order -- that the "skyscraper" processors emerged.
Because single-core processors had enlarged to the point where they were in danger of consuming the entire footprint of a small bungalow, and the heat sinks were nearly as large as the air conditioning unit of an average-sized professional sports arena, the processor industry was in trouble. Then, one particularly foul-smelling but not-terribly-obtuse geek hit upon the idea of building upward instead of outward. Of course, he was perched on the balcony of the 187th floor of his cheaply built company-owned condominium at the time, so it wasn't exactly a stroke of brilliance out of the blinding blue. More like a stray thought as he grasped for his Tamagotchi, which rolled and plummeted down the 187-floor drop, where it cleaved a DHL Express deliveryman clean in two.
Happily, the struggling manufacturer was able to avoid a costly and potentially lethal wrongful death lawsuit when the deliveryman's family found themselves far too poor to sustain legal proceedings against a multinational corporation. This was particularly true after the company made another killing, rolling out the next generation dual-core processor.
This was followed by the quad-core, and the octo-core. Not to be outdone by the company-who-shall-not-be-named, Cuisinart recently announced their forthcoming 1000 core processor -- dubbed the kilo-core -- which will undoubtedly be the subject of much debate, envy, and wet dreams for journalists, technophiles and other miscellaneous weenies for some time to come.
Everybody has to run back and forth to the store every day to get the latest and greatest CPUs. Once 720x1280 pixel computers came out, everyone bought them for $5,000. The next day 721x1282 pixel computers came out, and everyone bought them for $15,000. Nobody thought graphics could be any better than that. Then the next day 722x1284 pixel computers came out, and everyone was impressed by the almost unnoticeably better graphics, and bought the new chip for $60,000.
Simply upgrading CPUs automatically makes graphics better. If you pull out the old CPU from the Atari 2600 (make sure you didn't pull out the farting chip by accident, or else Space Invaders can't fart when you shoot them up the ass) and replace it with a 666 GHz Intel Octcore Pentium-12XZ Turbo, then plug in an old game like Pacman, you'd automatically get 3D high-definition textures and shading, and Pacman is now a game about a crack-addicted smiling Asian man, with a big head, who is in a haunted house full of flesh-eating Zombies, and it's up to the crack-addicted smiling Asian man to eat the Zombies himself with his giant mouth before the Zombies start eating, uhh, fruits. Pacman has always actually been about this; you just didn't realise it because of its low graphics power.