The First Apple Computer Stars In Tomorrow’s Amazing History Of Science Auction

An original Apple 1 computer built by Steve Wozniak in the Jobs family garage is the star attraction in an incredible collection of scientific collectibles being auctioned off in New York tomorrow.

Bonhams History of Science auction – its first – is a history buff’s ultimate shopping trip. Along with the Apple 1, of which just 200 were made, articles belonging to luminaries the likes of Charles Darwin and Ada Lovelace, medical and biological curiosities, artwork and original inventions are on the block.

The auction starts at Bonhams New York, 1pm EDT (4am AEDT). Here’s a selection of the most wanted; you can view the whole collection at Bonhams’ website.

Apple 1 Motherboard

Built by Steve Wozniak in the Jobs family garage, only five of these have been sold in the past four years. Unlike those, this one looks like it’s in original shipping condition. Just 200 were made, and they became the first pre-assembled personal computers on the market. This one is in working order – Bonhams have this video of it in action – and comes with tape decks, keyboard and monitor.
Link: Lot 286 – APPLE 1 COMPUTER.
Expected price: AU$340,000 – 570,000

Ada Lovelace sketch

If you’re hardcore about vintage computers, this lot puts the Apple 1 in the shade – and for a fraction of the price. It’s a “Sketch of the Analytical Engine invented by Charles Babbage” by none other than the Countess of Lovelace, Lord Byron’s only legitimate child, maths prodigy and considered by many to be history’s first computer programmer. Babbage demonstrated his Analytical Engine concept in Turin in 1840 and Italian military engineer L.F. Menabrea took notes. Lovelace not only translated the notes into English, but expanded on them in a way that proved to the world it was a genuinely programmable computer.
Link: Lot 272 – [LOVELACE, AUGUSTA ADA BYRON, COUNTESS OF, translator.] MENABREA, LUIGI FEDERICO.
Expected price: AU$20,000 – 28,000

Helmholtz Sound Synthesizer

Mix some phat 19th Century beats with this wood and brass sound synthesizer built by Max Kohl after the design by Helmholtz. It’s the first electric keyboard, designed to “identify the various frequencies of the pure sine wave components of complex sounds containing multiple tones”. Bonhams says it knows of only one other example, and none of this size or quality; the instrument combines “timbres of 10 harmonics to form various vowel sounds”.
Link: Lot 245 – HELMHOLTZ, HERMANN VON. 1821-1894. Chemnitz: Max Kohl, c.1905.
Expected price: AU$23,000 – 34,000

Charles Darwin signed letter about barnacles

Darwin’s study of barnacles in particular led him to classify a group of organisms according to the principle of ‘common descent,’ the idea that related animals and plants all descend from a common ancestor. In this letter, Darwin tells a colleague that he can’t wait to hear details from a man who claims he has watched a group of barnacles have sex.
Link: Lot 80 – DARWIN, CHARLES. 1809-1882.
Expected price: AU$23,000 – 34,000

Manhattan Project Viewing Window

A viewing window from the secret WWII Manhattan Project bomb program, in which work by scientists including Enrico Fermi led to the development of the first atomic bombs, among them the Fat Man bomb which destroyed Nagasaki. It’s on a wooden cart because, despite the fact it’s 54″ – about the size of a large family TV – it weighs 680kg. It emits a yellow glow due to its high percentage of protective lead oxide. But it’s definitely not radioactive.
Link: Lot 262W.
Expected price: AU$170,000 – 280,000

Letters from Samuel Morse

Morse first attempted to build his famous overland telegraph underground. When that fizzled out, he was persuaded to use a system of poles proposed by the British engineer Charles Wheatstone. This group of documents relates to Morse’s construction of the first electromagnetic telegraph line, in which he orders 400 chestnut posts at a price of 98 cents apiece. Construction began six weeks later, and the line was completed on May 24, 1844, with Morse telegraphing his famous message: “What hath God wrought!”
Link: Lot 225 – MORSE. TELEGRAPH DOCUMENTS.
Expected price: AU$11,000 – 17,000

Magic Lantern

An early form of the motion picture projector, the magic lantern was widely used from the 17th Century by traveling showmen known as “Savoyards”, who gave lantern shows projected onto white backdrops; lanterns were also popular parts of magic acts. This one comes in a beautiful mahogany box, with 10 hand-colored glass slides depicting “a variety of scenes, including the great Pyramids of Egypt, a scholar in his study, a jungle scene with dragons and snakes, and a tropical scene”.
Link: Lot 233 – MAGIC LANTERN. Pettibone New Improved Sciopticon. Cincinnati, Ohio
Expected price: AU$2,300 – 3,400


Article source: http://www.businessinsider.com.au/the-first-apple-computer-stars-in-tomorrows-amazing-history-of-science-auction-2014-10

Auction offers fascinating glimpse into the history of science and technology

Apple-1 Computer. (Image courtesy of Bonhams)

From one of the first Apple computers to a Charles Darwin letter discussing barnacle sex, a major auction of technological and scientific artifacts this week will give a fascinating glimpse into the great minds that helped shape our world.

Spanning centuries, the 288 items in the sale are estimated to raise around $2 million when they go under the hammer at auction house Bonhams on Oct. 22.

One of the key items in the New York auction is one of the first Apple-1 computers, which still works – a product that helped lay the foundations for the Cupertino, Calif.-based consumer tech giant.

“This is pretty much on the top of the list for technological artifacts,” Cassandra Hatton, Bonhams senior specialist in fine books, manuscripts, and the history of science, told FoxNews.com. “It’s one of the first 50 Apple Computers that were made.”

Bonhams estimates that the Apple-1 motherboard, which comes with a vintage keyboard, monitor, and power supply, will be sold for between $300,000 and $500,000. The seller is John Anderson, founder of the AppleSiders Apple user group in Cincinnati.

“It could very easily go to an institution or very easily go to a private collection,” said Hatton, noting that the last operational Apple-1 computer sold for $672,000.

Built in 1976 by Apple co-founder Steve Wozniak, the computer is still operational.  

According to the Apple-1 Registry run by computer expert Mike Willegal, there are only 63 surviving authentic Apple-1s. Only 15 of the 63 are said to have been successfully operated since 2000.

“The condition on this is really just fantastic,” said Hatton. “Hobbyists would really tinker with these things – so often, you see them burnt and scratched, but this is a really clean [mother] board.”

A rare 1857 letter from Charles Darwin will also be auctioned this week. In the four-page letter to zoologist and dentist Charles Spence Bate, Darwin inquires about barnacle reproduction.

“This is really a classic Charles Darwin piece,” noted Hatton. “Darwin letters of this length are not commonly found.”

Signed “C.Darwin,” the letter forms part of the naturalist’s extensive research into barnacles, which helped hone his theory of evolution.

The letter is expected to raise between $20,000 and $30,000 at auction.

Another item of great historical importance in the sale is a large viewing window from the Manhattan Project. Originally used to view the production of plutonium at the Manhattan Project site in Hanford, Washington State, the 1,500-pound window is six inches thick.

Hatton told FoxNews.com that the window, which is not radioactive, is 70% lead oxide.

“Because the lead oxide is such a high percentage, it reacts more like a metal than glass,” she explained. “If you took a grinder to the window, in the same way that a metal would crumble – it wouldn’t shatter.”

Bonhams predicts that the window will sell for between $150,000 and $250,000.

Other items on sale this week include the Helmholtz sound synthesizer, described as the world’s first electric keyboard, and the extensive archive of pioneering astronomer George Willis Ritchey.

The auction house is also selling dozens of globes, some of which are miniature and date back to the eighteenth century. More modern items on sale include an Apple flag from the company’s European headquarters in France and a portrait of Microsoft co-founder Bill Gates. Painted in 2000 for the cover of Wired magazine, the portrait has an estimated value of $700 to $900.


 

Article source: http://www.foxnews.com/tech/2014/10/20/auction-offers-fascinating-glimpse-into-history-science-and-technology/

PARC Alto source code released by computer history museum


The Computer History Museum in Mountain View has released another foundational piece of software to the world at large: some of the code that gave the world the Xerox Alto computer, which among other things helped inspire a couple of young garage developers, Steves Jobs and Wozniak.

To the modern eye, the Alto looks odd – mostly because of the portrait orientation of its screen – but it represents Silicon Valley legend and was the first shot at making the UI visual rather than textual. Want to draw things using a mouse? Check. Want a desktop metaphor for the screen? Ditto. WYSIWYG word processor? Ditto-ditto.


As Xerox PARC alumnus Paul McJones (he worked there on the Star office automation project) describes here, getting the software ready to be shown to the public (with the permission of the Palo Alto Research Centre) was no mean feat.

First the original Alto IFS files had to be archived to nine-track tape (software for this was written by Tim Diebert); the tapes were then transferred to eight-millimetre cartridges (James Foote wrote that software); Al Kossow put that lot on CD, and Dan Swinehart got permission to release the files.

A machine that changed the world: Alto

The file dump includes “snapshots of Alto source code, executables, documentation, font files, and other files from 1975 to 1987”, McJones writes. It includes the Bravo word processor, Markup, Draw and Sil drawing programs, and the Laurel e-mail program. There’s also the BCPL, Mesa, Smalltalk, and Lisp programming environments along with various utilities and the Alto’s Ethernet implementation.

While the Alto was bigger than a single-person project, McJones highlights its principal creators: “Bob Taylor inspired PARC to invent the Alto, Chuck Thacker designed its hardware, Butler Lampson architected its software, Alan Kay envisioned its use as a personal dynamic medium, and many people at PARC and throughout the Xerox Corporation contributed to fulfil the vision.”

The file system archive is here.

If you’re more interested in the non-GUI thread of early personal computer history, the Computer History Museum also dumped some early CP/M source code at the beginning of October, here. ®


Article source: http://www.theregister.co.uk/2014/10/22/chm_releases_parc_alto_source_code/

History of the Personal Computer, Part 5: Computing goes mainstream, mobile …

1996 and Beyond: New Frontiers

Computing Goes Mainstream, Mobile, Ubiquitous

The microprocessor made personal computing possible by opening the door to more affordable machines with a smaller footprint. The 1970s supplied the hardware base, the 80s introduced economies of scale, while the 90s expanded the range of devices and accessible user interfaces.

The new millennium would bring a closer relationship between people and computers. More portable, customizable devices became the conduit that enabled humans’ basic need to connect. It’s no surprise that the computer transitioned from productivity tool to indispensable companion as connectivity proliferated.

As the late 1990s drew to a close, a hierarchy had been established in the PC world. OEMs who previously deposed IBM as market leader found that their influence was now curtailed by Intel. With Intel’s advertising subsidy for the “Intel Inside” campaign, OEMs had largely lost their own individuality in the marketplace.

The Pentium III had a lead role in the ‘Gigahertz War’ against AMD’s Athlon processors between 1999 and 2000. Ultimately it was AMD who crossed the finish line first, shipping the 1GHz Athlon days before Intel could launch theirs (Photo: Wiki Commons)

Intel, in turn, had been usurped by Microsoft as the industry leader after backing off its intention to increase multimedia efficiency by pursuing NSP (Native Signal Processing) software. Microsoft told OEMs that it would not support NSP in its operating systems, including the then-current MS-DOS and Windows 95.

Intel’s attempt to move onto Microsoft’s software turf had almost certainly been at least partly motivated by Microsoft’s increasing influence in the industry and the arrival of the Windows CE operating system, which would decrease Microsoft’s reliance on Intel’s x86 ecosystem with support for RISC-based processors.

Intel’s NSP initiative marked only one facet of the company’s strategy to maintain its position in the industry. A more dramatic change in architectural focus would arrive in the shape of the P7 architecture, renamed Merced in January 1996. The architecture was planned to arrive in two stages: the first would produce a consumer 64-bit processor with full 32-bit compatibility, while the second, more radical stage would be a pure 64-bit design requiring 64-bit software.

The HP 300LX, released in 1997, was one of the first handheld PCs designed to run the Windows CE 1.0 operating system from Microsoft. It was powered by a 44 MHz Hitachi SH3 (Photo: evanpap)

The hurdles associated with both hardware execution and a viable software ecosystem led Intel to focus its efforts on competing with RISC processors in the lucrative enterprise market with the Intel Architecture 64-bit (IA-64) based Itanium, developed in partnership with Hewlett-Packard. The Itanium’s dismal failure — stemming from Intel’s overly optimistic predictions for the VLIW architecture and subsequent sales — provided a sobering realization that throwing prodigious R&D resources into a bad idea just makes for an expensive bad idea. Financially, Itanium’s losses were ameliorated by Intel moving x86 into the professional markets with the Pentium Pro (and later Xeon brand) from late 1995, but Itanium remains an object lesson in hubris.

In contrast, AMD’s transition from second source vendor to independent x86 design and manufacture moved from success to success. The K5 and following K6 architectures successfully navigated AMD away from Intel dependency, rapidly integrating its own IP into both processors and motherboards as Intel moved to the P6 architecture with its Slot 1 and 440-series chipset mainboards (both denied to AMD under the revised cross-license agreement).

AMD first adapted the existing Socket 7 into the Super Socket 7 with a licensed copy of VIA’s Apollo VP2/97 chipset, which provided AGP support to make it more competitive with Intel’s offerings. AMD followed that with its first homegrown chipset (the “Irongate” AMD 750) and Slot A motherboard for the product that would spark real competition with Intel. So much so that Intel initially pressured AMD motherboard makers to downplay AMD products by restricting supply of 440BX chipsets to board makers who stepped out of line.

AMD’s transition from second source vendor to independent x86 design and manufacture moved from success to success.

The K6 and subsequent K6-II and K6-III had increased AMD’s x86 market share by 2% a year following their introduction. Gains in the budget market were augmented with AMD’s first indigenous mobile line with the K6-II-P and K6-III-P variants. By early 2000, the much refined mobile K6-II+ and K6-III+ (Plus) series were added to the line-up, featuring lower voltage requirements and higher clock speeds thanks to a process shrink, as well as AMD’s new PowerNow! dynamic clock adjustment technology to complement the 3DNow! instructions introduced with the K6-II to boost floating point calculations.

Aggressive pricing would be largely offset by Intel’s quick expansion into the server market, and what AMD needed was a flagship product that moved the company out of Intel’s shadow. The company delivered in style with its K7 Athlon.

The K7 traced its origins to the Digital Equipment Corporation, whose Alpha RISC processor architecture seemed to be a product in search of a company that could realize its potential. As DEC mismanaged itself out of existence, Alpha 21064 and 21164 co-architect Derrick (Dirk) Meyer moved to AMD as the K7's design chief, with the final design owing much to the Alpha’s development, including the internal logic and EV6 system bus.

As the K6 architecture continued its release schedule, the K7's debut at the Microprocessor Forum in San Jose on October 13, 1998 rightfully gained the lion’s share of the attention. Clock speeds beginning at 500MHz already eclipsed the fastest Pentium II running at 450MHz, with the promise of 700MHz in the near future thanks to the transition to copper interconnects (from the industry-standard aluminum) used in the 0.18µm (180nm) process at AMD’s new Fab 30 in Dresden, Germany.

The Athlon K7 would ship in June 1999 at 500, 550 and 600MHz to much critical acclaim. Whereas the launch of Intel’s Katmai Pentium III four months earlier was tainted by problems with its flagship 600MHz model, the K7's debut was flawless and quickly followed up by the promised Fab 30 chips at 650 and 700MHz, along with new details about the HyperTransport system data bus developed in part by another DEC alumnus, Jim Keller.

The Athlon’s arrival signaled the opening salvos in what was coined ‘The Gigahertz War’. Less about any material gains than the associated marketing opportunities, the battle between AMD and Intel, together with a crop of new chipsets, provided a spark for a new wave of enthusiasts. Incremental advances in October 1999 from the 700MHz Athlon to the 733MHz Coppermine Pentium III gave way to November’s Athlon 750MHz and Intel’s 800MHz model in December.

The Athlon’s arrival signaled the opening salvos in the so-called ‘Gigahertz War’.

AMD would reach the 1GHz marker on January 6, 2000 when Compaq demonstrated its Presario desktop incorporating an Athlon processor cooled by KryoTech’s Super G phase change cooler at the Winter Consumer Electronics Show in Las Vegas.

This coup was followed by the launch of AMD’s Athlon 850MHz in February and 1000MHz on March 6, two days before Intel debuted its own 1GHz Pentium III. Not content with this state of affairs, Intel noted that its part had begun shipping a week earlier, to which AMD replied that its own Athlon 1000 had begun shipping in the last week of February — a claim easily verified since Gateway was in the process of shipping the first customer orders.

This headlong pursuit of core speed would continue largely unabated for the next two years, until the core speed advantage of Intel’s NetBurst architecture moved AMD to place more reliance on its rated speed instead of actual core frequency. The race was not without casualties. While processor prices tumbled as a constant flow of new models flooded the market, OEM system prices began climbing — particularly those sporting the 1GHz models, as both AMD and Intel had applied small voltage increases to maintain stability, necessitating more robust power supplies and cooling.

AMD’s Athlon would also be held back initially by an off-die cache running slower than the CPU, causing a distinct performance disadvantage against the Coppermine-based Pentium III, and by teething troubles with AMD’s Irongate chipset and Viper southbridge. VIA’s excellent alternative, the KX133 chipset with its AGP 4x bus and 133MHz memory support, helped immensely with the latter, while revising the Athlon to include an on-die, full-speed Level 2 cache enabled the chip to truly show its potential as the Thunderbird.

AMD Athlon CPU and “The Pencil Trick”: using pencil lead to reconnect the L1 bridges unlocked the multiplier, allowing the processor to be set to any clock frequency. (Wikipedia)

With AMD now alongside Intel in the performance segment, the company introduced its cut-down Duron line to compete with Intel’s Celeron, along with chipsets, southbridges and mobile processors for the growing notebook market, where Intel’s presence was all-pervasive in flagship products from Dell (Inspiron), Toshiba (Tecra), Sony (Vaio), Fujitsu (Lifebook), and IBM (ThinkPad).

Within two years of the K7's introduction AMD would claim 15% of the notebook processor market — a tenth of a percent more than its desktop market share at the same point. AMD’s sales fluctuated wildly between 2000 and 2004 as shipping was tied heavily to the manufacturing schedule of AMD’s Dresden foundry.

AMD management had been unprepared for the level of success gained by the K7, resulting in shortages during a critical time where it was beginning to cause anxiety within Intel.

AMD management had been unprepared for the level of success gained by the company’s design, resulting in shortages that severely restricted the brand’s growth and market share during a critical window where AMD was beginning to cause some anxiety within Intel. The supply constraints would also impact company partners, notably Hewlett-Packard, while strengthening Dell, a principal Intel OEM who had been selling systems at a prodigious rate and collecting payments for carrying Intel-only platforms since shortly after the mid-2003 launch of AMD’s K8 “SledgeHammer” Opteron, a product which threatened to derail Intel’s lucrative Xeon server market.

A share of this failure to fully capitalize on the K7 and following K8 architectures arose through the mindset of AMD CEO Jerry Sanders. Like his contemporaries at Intel, Sanders was a traditionalist from an era where a semiconductor company both designed and fabricated its own products. The rise of the fabless companies drew disdain and prompted his “real men have fabs” outburst against newly formed Cyrix.

The stance was softened as Sanders stepped aside for Hector Ruiz to assume the role of CEO, with an outsourcing contract going to Chartered Semiconductor in November 2004 that began producing chips in June 2006, a couple of months after AMD’s own Dresden Fab 36 expansion began shipping processors. Such was the departure from the traditional semiconductor model that companies who design and manufacture their own chips became increasingly rare with foundry costs escalating and a whole mobile-centric industry built upon buying off-the-shelf ARM processor designs while contracting out chip production.

AMD’s Fab 36 in Dresden, now part of spinoff company GlobalFoundries.

Just as Intel had coveted the high margin server market, AMD also looked to the sector as a possible means of expansion where its presence was basically non-existent. Whereas Intel’s strategy was to draw a line under x86 and pursue a new architecture devoid of competition using its own IP, AMD’s answer was more conventional. Both companies looked to 64-bit computing as the future; Intel because it perceived a strong possibility that RISC architectures would in future outperform their x86 CISC designs, and AMD because Intel had the influence as an industry leader to ensure that 64-bit computing became a standard.

With x86 having moved from 16-bit to 32-bit, the next logical step would be to add 64-bit functionality with backwards compatibility for the bulk of existing software to ensure a smooth transition without breaking the current ecosystem. This approach was very much Intel’s fallback position rather than its preferred option.
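
To make the compatibility idea concrete, here is a minimal, hypothetical C sketch (not from the article): the same source file builds unchanged for a 32-bit x86 or a 64-bit x86-64 target (for example with GCC's -m32 and -m64 flags), and only the pointer and long widths differ on typical LP64 systems, while existing 32-bit binaries keep running on a 64-bit operating system.

/* sizes.c - a minimal sketch of source-level 32/64-bit portability.
 * Build 32-bit: gcc -m32 sizes.c -o sizes32
 * Build 64-bit: gcc -m64 sizes.c -o sizes64
 * The source does not change between targets; only the type widths do. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    printf("pointer width: %zu bits\n", 8 * sizeof(void *)); /* 32 vs 64 */
    printf("long width:    %zu bits\n", 8 * sizeof(long));   /* 32 vs 64 on LP64 Unix */
    printf("int width:     %zu bits\n", 8 * sizeof(int));    /* 32 on both targets */
    printf("max uintptr_t: %ju\n", (uintmax_t)UINTPTR_MAX);
    return 0;
}

This source-level continuity, together with the ability to run unmodified 32-bit code under a 64-bit OS, is what made the extend-x86 route far less disruptive than a clean-break design such as IA-64.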

AMD led the way in 64-bit computing, and with Intel’s need to make EM64T compatible with AMD64, the Sunnyvale company gained validation in the wider software community.

Being eager to lead in processor design on multiple fronts, Intel was attracted to steering away from the quagmire of x86 licenses and IP ownership, even if pursuing a 64-bit x86 product line also raised the issue of Intel’s own products working against IA-64's acceptance. AMD had no such issues.

Once the decision had been made to incorporate a 64-bit extension into the x86 framework rather than work on an existing architecture from DEC (Alpha), Sun (SPARC), or Motorola/IBM (PowerPC), what remained was to forge software partnerships in bringing the instruction set to realization since AMD was without the luxury of Intel’s sizeable in-house software development teams. This situation was to work in AMD’s favor as the collaboration fostered strong ties between AMD and software developers, aiding in industry acceptance of what would become AMD64.

Working closely with K8 project chief Fred Weber and Jim Keller would be David Cutler and Robert Short at Microsoft, who along with Dirk Meyer (AMD’s Senior Vice President of the Computation Products Group) had strong working relationships from their time at DEC. AMD would also consult with open source groups including SUSE who would provide the compiler. The collaborative effort allowed for a swift development and publishing of the AMD64 ISA and forced Intel to provide a competing solution.

Six months after Fred Weber’s presentation of AMD’s new K8 architecture at the Microprocessor Forum in November 1999, Intel began working on Yamhill (later Clackamas) which would eventuate as EM64T and later Intel64. With AMD leading the way and Intel’s need to make EM64T compatible with AMD64, the Sunnyvale company gained validation in the wider software community – if any more were needed with Microsoft aboard.

The arrival of the K8-based Opteron server processor in 2003, followed by the desktop and mobile consumer Athlon 64 later that year, marked a period of sustained growth and heightened market presence for AMD. Server market share that had previously been essentially non-existent rose to 22.9% of the x86 market at Intel’s expense by early 2006, prompting aggressive Intel price cuts. Gains in the consumer market were equally impressive, with AMD’s share rising from 15.8% in Q3 2003 to an all-time high of 25.3% in Q1 2006, when the golden age for the company came to an abrupt halt.

AMD’s share rose to an all-time high of 25.3% in Q1 2006 when the golden age for the company came to an abrupt halt.

AMD would suffer a number of reverses that the company has still to recover from. After being snubbed by Dell for a number of years, the companies entered into business together for the first time in 2006 with Dell receiving allocation preference over other OEMs. At the time, Dell was the largest system builder in the world, shipping 31.4 million systems in 2004 and almost 40 million in 2006, but was locked in a fierce battle with Hewlett-Packard for market dominance.

Dell was losing in part to a concerted marketing campaign mounted by HP, but its position was also hit by the sale of systems at a loss, falling sales in the high-margin business sector, poor management, run-ins with the SEC and a recall of over four million laptop batteries. By the time the company turned its fortunes around, it had competition not just from new market leader HP, but from a growing list of OEMs riding the popularity wave of netbooks and light notebooks.

2006 would also see a new competitor from Intel — one that pushed AMD back to being a peripheral player once again. At the August 2005 Intel Developer Forum, CEO Paul Otellini publicly acknowledged the failings of NetBurst with the rising power consumption and heat generation of the then-current Pentium D. Once Intel realized that its high speed, long pipeline NetBurst architecture required increasing amounts of power as clock rates rose, the company shelved its Timna system-on-a-chip (SoC) and set that design team on a course for a low-power processor that borrowed from the earlier P6 Pentium Pro.

The subsequent Centrino platform and Pentium M led directly to the Core architecture, a lineage that leads to the present day model lineup. For its part, AMD continued to tweak the K8 while the following 10h architecture offset a lack of evolution with vigorous price cutting to maintain market share until its plan to integrate graphics and processor architectures could bear fruit.

Bottom side of an Intel Pentium M 1.4. The Pentium M represented a new and radical departure for Intel, optimizing for power efficiency at a time when laptop computer use was growing fast.

Severely constrained by the debt burden from acquiring an overpriced ATI in 2006, AMD was faced not only with Intel recovering with a competent architecture, but also with a computing market that was beginning to embrace mobile systems for which the Intel architectures would be better suited. An early decision to fight for the hard-won server market resulted in AMD designing high-speed multi-core processors for the high-end segment and repurposing those core modules in conjunction with the newly acquired ATI graphics IP.

The Centrino and Pentium M signaled a renewed Intel and led directly to the Core architecture whose lineage extends to the present day model lineup.

The Fusion program was officially unveiled on the day AMD completed its acquisition of ATI on November 25, 2006. Details of the actual architectural makeup, along with Bulldozer, followed in July 2007, with roadmaps calling for both to be introduced in 2009.

The economic realities of debt servicing and poor sales that dropped AMD’s market share back to pre-Athlon 64 days would prompt a major rethink a year later, when the company pushed its roadmaps out to 2011. At the same time, AMD decided against pursuing a smartphone processor and sold its mobile graphics IP to Qualcomm (where it later emerged as the Adreno GPU in Qualcomm’s ARM-based Snapdragon SoCs).

Both Intel and AMD targeted integrated graphics as a key strategy in their future development, a natural extension of the ongoing practice of reducing the number of discrete chips required for any platform. AMD’s Fusion announcement was followed two months later by Intel’s own (intended) plan to move its IGP onto the Nehalem CPU. Integrating graphics proved just as difficult for Intel’s much larger R&D team: the first Westmere-based Clarkdale chips in January 2010 would have an “on package” IGP, not fully integrated into the CPU die.

Sandy Bridge was the second 32nm processor design from Intel and the first to have fully integrated graphics.

In the end, both companies would debut their IGP designs within days of each other in January 2011, with AMD’s low-power Brazos SoC quickly followed by Intel’s mainstream Sandy Bridge architecture. The intervening years have largely seen a continuation of a trend where Intel gradually introduces small incremental performance advances, paced against its own product and process node cadence, safely reaping the maximum financial return for the time being.

Both Intel and AMD targeted integrated graphics as a key strategy in their future development.

Llano became AMD’s first performance oriented Fusion microprocessor, intended for the mainstream notebook and desktop market. The chip was criticised for its poor CPU performance and praised for its better GPU performance.

For its part, AMD finally released the Bulldozer architecture in September 2011, a debut delayed long enough that it fared poorly on most performance metrics thanks to Intel’s foundry process execution and remorseless architecture/die shrink (Tick-Tock) release schedule. AMD’s primary weapon continues to be aggressive pricing, which somewhat offsets Intel’s “top of the mind” brand awareness among vendors and computer buyers, allowing AMD to maintain a reasonably consistent 15-19% market share from year to year.

The microprocessor rose to prominence not because it was superior to mainframes and minicomputers, but because it was good enough for the simpler workloads. The same dynamic unfolded with the arrival of a challenger to the x86 CPU hegemony: ARM.

The microprocessor rose to prominence not because it was superior to mainframes and minicomputers, but because it was good enough for the simpler workloads required of it while being smaller, cheaper, and more versatile than the systems that preceded it. The same dynamic unfolded with the arrival of a challenger to the x86 CPU hegemony in low cost computing, as new classes of products were envisioned, developed and brought to market.

ARM has been instrumental in bringing personal computing to the next step in its evolution, and while it traces its development back over 30 years, it required advances in many other fields of component and connectivity design as well as its own evolution to truly propel it into the ubiquitous architecture we see today.

The ARM processor grew out of the need for a cheap co-processor for the Acorn Business Computer (ABC), which Acorn was developing to challenge IBM’s PC/AT, Apple II, and Hewlett-Packard’s HP-150 in the professional office machine market.

With no chip meeting the requirement, Acorn set about designing its own RISC-based architecture with Sophie Wilson and Steve Furber, who had both previously designed the prototype that later became the BBC Micro educational computer system.

Steve Furber at work around the time of the BBC Micro development in the early 1980s. He led the design of the first ARM microprocessor along with Sophie Wilson. (British Library)

The initial ARM1 development chip would be followed by the ARM2, which would form the processing heart of Acorn’s Archimedes and spark interest from Apple as a suitable processor for its Newton PDA project. The chip’s development timetable would coincide with a slump in demand for personal computers in 1984 that strained Acorn’s resources.

While the Newton wasn’t an economic success, its entry into the field of personal computing elevated ARM’s architecture significantly.

Facing mounting debts from unsold inventory, Acorn spun off ARM as ‘Advanced RISC Machines.’ Acorn (whose major shareholder was Olivetti) and Apple each took a 43% stake in the new company, in exchange for Acorn’s IP and development team and Apple’s development funding. The remaining shares would be held by manufacturing partner VLSI Technology and Acorn co-founder Hermann Hauser.

The first design, the ARM 600, was quickly supplanted by the ARM 610, which replaced AT&T’s Hobbit processor in the Newton PDA. While the Newton and its licensed sibling (the Sharp ExpertPad PI-7000) initially sold 50,000 units in the first 10 weeks after the August 3, 1993 launch, the $499 price and an ongoing memory management bug that affected the handwriting recognition feature slowed sales to the extent that the product line would cost Apple nearly $100 million, including development costs.

The Apple Newton concept would spark imitators. AT&T, which had supplied the Hobbit processor for the original Newton, envisaged a future development where the Newton could incorporate voice messaging — a forerunner to the smartphone — and was tempted to acquire Apple as the Newton neared production status. In the end it would be IBM’s Simon Personal Communicator that would lead the way to the smartphone revolution — albeit briefly.

While the Newton wasn’t an economic success, its entry into the field of personal computing elevated ARM’s architecture significantly. The introduction of the ARM7 core, followed by its influential Thumb-capable ARM7TDMI variant, would lead directly to Texas Instruments signing a licensing deal in 1993, followed by Samsung in 1994, DEC in 1995, and NEC the following year. The landmark ARMv4T architecture would also be instrumental in securing deals with Nokia to power the 6110 and with Nintendo for the DS and Game Boy Advance, and it began a long-running association with Apple’s iPod.

Sales exploded with the evolution of the mobile internet and the increasingly capable machines that provide access to the endless flow of applications, keeping people informed, entertained and titillated. This rapid expansion has led to a convergence of sorts as ARM’s architectures become more complex while x86 based processing pares away its excess and moves down market segments into low-cost, low-power areas previously reserved for ARM’s RISC chips.

Apple MessagePad 120 next to the iPhone 3G (Photo: Flickr user admartinator)

Intel and AMD have hedged their bets by allying themselves with ARM-based architectures — the latter designing its own 64-bit K12 architecture and the former entering into a close relationship with Rockchip to market Intel’s SoFIA initiative (64-bit x86 Silvermont Atom SoCs) and to fabricate its ARM architecture chips.

Both Intel strategies aim to ensure that the company doesn’t fall into the same mire that affected other semiconductor companies who fabricated their own chips, ensuring high production so the foundries continue to operate efficiently. Intel’s foundry base is both extensive and expensive to maintain and thus requires continuous high volume production to remain viable.

Intel and AMD have since hedged their bets by allying themselves with ARM-based architectures.

The biggest obstacles facing the traditional powers of personal computing, namely Microsoft and Intel, and to a lesser degree Apple and AMD, are the speed that the licensed IP model brings to vendor competition and the all-important installed software base.

Apple’s closed hardware and software ecosystem has long been a barrier to fully realizing its potential market penetration, with the company focusing on brand strategy and repeat customers over outright sales and licensing. While this works in developed markets, it’s comparably less successful in vast emerging markets where MediaTek, Huawei, Allwinner, Rockchip and others cash in on affordable ARM-powered, open source Android smartphones, tablets and notebooks.

Such is Android’s all-pervasive presence in the smartphone market that even as Microsoft collects handsome patent royalties from the open source ecosystem and divests itself of Android-based Nokia products, the company looks likely to make overtures to Cyanogen, the software house that caters to people who prefer their Android without Google.

Amazon and Samsung are reportedly matching Microsoft’s interest in either partnering with or outright purchasing Cyanogen, too. As the scale of the world’s mobile computing society becomes apparent, Microsoft’s keystone product, Windows, is at a crossroads between being primarily based on desktop users and moving to a mobile-centric touchscreen GUI.

For once, Microsoft has found itself second in software choice. Along with facing Android’s wrath in the mobile sector, Microsoft failed to learn from Hewlett-Packard’s experience with the touchscreen HP-150 of some 30 years ago — namely that productivity suffers when alternating between mouse/keyboard and touchscreen, not to mention an inherent reluctance of some users to embrace a different technology, leading many desktop users to resist the company’s vision of a unified operating system for all consumer computing needs.

For once, Microsoft has found itself second in software choice, and its keystone product, Windows, is at a crossroads between being primarily based on desktop users and moving to a mobile-centric GUI.

Time will tell how these interrelationships resolve, whether the traditional players merge, fail, or triumph. What seems certain is that computing is coalescing across what used to be distinct, clearly defined market segments. Heterogeneous computing aims to unite the many disparate systems with universal protocols that enable not just increased user-to-user and user-to-machine connectedness, but a vast expansion of machine-to-machine (M2M) communication.

Coined ‘The Internet of Things’, this initiative will rely heavily upon cooperation and common open standards to reach fruition — not exactly a given with the rampant self-interest generally displayed by big business. However, if successful, the interconnectedness would involve over seven billion computing devices, including personal computers (desktop, mobile, tablet), smartphones and wearables and close to 30 billion other smart devices.

This article series has largely been devoted to the hardware and software that defined personal computing from its inception. Conventional mainframe and minicomputer builders first saw the microprocessor as a novelty — a low-cost solution for a range of rudimentary applications. Within 40 years the technology has evolved from being limited to those skilled in component assembly, soldering and coding, to pre-school children being able to access every corner of the world with the swipe of a touchscreen.

The computer has evolved into a fashion accessory and a method of engaging the world without personally engaging with the world — from poring over hexadecimal code and laboriously compiling punch tape to the sensory overload of today’s internet with its siren song of a pseudo-human connectivity. Meanwhile, use of the most personal of computers, the human brain, is increasingly abbreviated, much as modern language has been abbreviated into emoticons and text speak.

People have long been dependent on the microprocessor. While it may have started with company bookkeeping, secretarial typing and stenography pools, much of humanity now relies on computers to tell us how, when and why we move through the day. The next stage in computing history may just center on how we went from shaping our technology to how our technology shaped us.

This is the fifth and final installment on our History of the Microprocessor and the PC. Be sure to check out the complete series for a stroll down a number of milestones from the invention of the transistor to modern day chips powering our connected devices.

Article source: http://www.techspot.com/article/904-history-of-the-personal-computer-part-5/

Computer History Museum Welcomes Silicon Valley Rising Stars to Advisory …

MOUNTAIN VIEW, Calif., Oct. 8, 2014 (GLOBE NEWSWIRE) — The Computer History Museum (CHM), the world’s leading institution exploring the history of computing and its ongoing impact on society, today announced that it has added TechCrunch’s Alexia Tsotsis, Version One Ventures’ Angela Kingyens, and Google’s Victoria Pinchin to its NextGen Advisory Board.

“One of the reasons Silicon Valley is such a special place is all the experience sharing and mentoring that happens here,” said Sunil Nagaraj, co-founder and co-chair of the NextGen Advisory Board. “Alexia, Angela, and [Victoria] will be a huge boost to all the events and activities we organize to this end.”

Tsotsis serves as co-editor at TechCrunch, the world’s leading startup publication, where she’s responsible for managing an editorial team of over 30 people, leading them through 3 large conferences a year in addition to producing news content day in and day out. She has a Bachelor’s degree in English from the University of Southern California.

At Version One Ventures, Kingyens is an associate investing in early-stage consumer internet, SaaS and mobile entrepreneurs across North America. Prior to joining Version One, she was a partner at Insight Data Science, a YCombinator-backed startup helping PhDs transition from academic research to industry careers via a six-week training program. Kingyens holds a PhD in Operations Research and Financial Engineering from the University of Toronto.

Pinchin is a product manager for Google Knowledge (Google Search). Prior to Google, she was a consultant at McKinsey & Company, and worked at Amazon.com and National Instruments in a variety of technical marketing and product roles. Pinchin has an MBA from Harvard Business School.

“I’m delighted to welcome Alexia, Angela and Victoria to the board. NextGen plays a vital role in helping the Museum connect the past to the future, and they connect a new generation to the people and stories that inspire breakthrough thinking,” said Museum President and Chief Executive Officer John Hollar.

Tsotsis, Kingyens and Pinchin join existing board members Vishal Arya, Susie Caulfield, Alec Detwiler, Joel Franusic, Julia Grace, Serge Grossman, Amy Jackson, Sunil Nagaraj, Jason Shah, Jeremiah Stone and Michelle Zatlyn.

About the NextGen Advisory Board

The Computer History Museum’s NextGen Advisory Board was created to bring technology enthusiasts together over our rich history in Silicon Valley. Their events aim to bring together young professionals who love technology and the history of computing. Combined, the board has extensive experience in technology entrepreneurship, venture capital, product management, marketing and public relations, and many other professional fields at the heart of computer history. The group’s “Future History Makers” series has featured guests including Drew Houston of Dropbox, Phil Libin of Evernote, Travis Kalanick of Uber, and many other rising stars in Silicon Valley.

For more information on the NextGen Advisory Board please visit www.computerhistory.org/nextgen.

About the Computer History Museum

The Computer History Museum in Mountain View, California is a nonprofit organization with a four decade history as the world’s leading institution exploring the history of computing and its ongoing impact on society. The Museum is dedicated to the preservation and celebration of computer history, and is home to the largest international collection of computing artifacts in the world, encompassing computer hardware, software, documentation, ephemera, photographs and moving images. The Museum brings computer history to life through large-scale exhibits, an acclaimed speaker series, a dynamic website, docent-led tours and an award-winning education program.

The Museum’s signature exhibition is “Revolution: The First 2000 Years of Computing,” described by USA Today as “the Valley’s answer to the Smithsonian.” Other current exhibits include “Charles Babbage’s Difference Engine No. 2,” “IBM 1401 and PDP‐1 Demo Labs”, and “Where To? The History of Autonomous Vehicles.”

For more information and updates, call (650) 810‐1059, visit www.computerhistory.org, check us out on Facebook, follow @computerhistory on Twitter and the Museum blog @chm.

Carina Sweet

(650) 810-1059

Article source: http://globenewswire.com/news-release/2014/10/08/671676/10101846/en/Computer-History-Museum-Welcomes-Silicon-Valley-Rising-Stars-to-Advisory-Board.html

‘The Innovators’ traces the history of the computer and its creators

If you could pick up your smartphone and look at it with the eyes of an historian, you wouldn’t see a single device – you’d see hundreds. You’d see gears, and punchcards, and vacuum tubes, and transistors, and circuit boards – in short, you’d see the evolution of the modern computer, a dizzying process of increasing efficiency, miniaturization, and organizational insight that has put us in the enviable position of being able to figure out how to drive to and order from the nearest sushi bar without so much as handling anything printed on dead trees.

The Innovators, a new book by Steve Jobs biographer and former Time managing editor Walter Isaacson, does far more than analyze the hardware and software that gave birth to the digital revolution – it fully explores the women and men who created the ideas that birthed the gadgets. In so doing, Isaacson tells stories of vanity and idealism, of greed and sacrifice, and of the kind of profound complexity that lies behind the development of seemingly simple technological improvements.

“Collaboration” is the author’s supporting theme, and he weaves it in throughout his anecdotes and character studies. Approached lazily, this kind of leitmotif would be more irritating than illuminating, but Isaacson fully commits. Throughout the book’s pages, we watch the men and women of the emerging computer industry swap (and steal) ideas, share (and hog) the spotlight, and get rich (or get cheated) after their innovations have been embraced and implemented by the market.

And while Isaacson brings “The Innovators” all the way up to the Google Era, he lays the groundwork for his tale in the Victorian era, as increased mechanization was beginning to change lives (for better or worse) throughout the British Empire. Watching Ada Lovelace (intellectually) swoon over, collaborate with, and eventually bitterly split from Charles Babbage is a treat, and Isaacson takes a great deal of care to explore and present the originality and importance of Lovelace’s contributions to foundational concepts in computing, including an arguable but persuasive assertion that Lovelace first put into words a description of the modern computer, writing:

“The Analytical Engine does not occupy common ground with mere ‘calculating machines.’ It holds a position wholly its own. In enabling a mechanism to combine together general symbols, in successions of unlimited variety and extent, a uniting link is established between the operations of matter and the abstract mental processes.”

His chapter on Lovelace sets the tone for “The Innovators” as a whole: She is painted as a person whose pure intellect and talent for math is united, seamlessly, with an imagination capable of seeing and describing the as-of-yet undiscovered.

Isaacson is skilled at untangling the tangled strands of memory and documentation and then reweaving them into a coherent tapestry that illustrates how something as complicated and important as the microchip emerged from a series of innovations piggybacking off of one another for decades (centuries, ultimately).

One of his most absorbing chapters tells the story of computing pioneer William Shockley, whose work at Bell Labs helped give birth to the transistor, but whose desire for recognition and power alienated him from his patrons, his employees, and – eventually – from polite society as he espoused increasingly crackpot theories about race and intelligence. Isaacson’s portrait of Shockley is harsh but nuanced, and watching the storm clouds gather around one of the book’s most striking antiheroes makes for engrossing reading and rich food for thought.

And while he follows Shockley toward his eventual collapse and disgrace, he manages to advance an astounding parade of incremental and foundational improvements in computer technology in a manner that is both dynamic and, somewhat incredibly, actually quite entertaining to read about. As transistors and microchips lurch from the possible to the real, geniuses and titans talk, clash, laugh, collaborate, and fight. Meanwhile, they forge the beginning of Silicon Valley as we, privileged spectators with ringside seats to the crucible of invention, look on.

If “The Innovators” were stripped of its vivid personal details and reflections on the nature of collaboration, it would still read as a lucid, lively, easy-to-follow scientific history of the evolution of the modern computer. But given its ambition and rich detail, it’s rather more than that. It’s a portrait both of a technology and the culture that nurtured it. That makes it a remarkable book, and an example for other would-be gadget chroniclers to keep readily at hand before getting lost in a labyrinth of ones and zeros – at the expense of the human beings who built the maze in the first place.

Article source: http://www.csmonitor.com/Books/Book-Reviews/2014/1013/The-Innovators-traces-the-history-of-the-computer-and-its-creators

The (complicated) oral history of Hewlett-Packard Co.’s PC business


The Hewlett-Packard Co. logo is displayed on the back of the Envy x2, part of a new class of personal-computer hybrids that look and work like regular laptops, but whose screens pop off to become fully functional tablets. Photographer: David Paul Morris/Bloomberg








Greg Baumann
Editor in Chief- Silicon Valley Business Journal


Hewlett-Packard Co.’s decision to split its personal computer and printer business away from its enterprise hardware, software and services operation is the latest gyration in a long drama. Here, in the company’s own words, is its thinking on the PC business…

HP loves the PC business!— “Being in the consumer business when the ‘consumerization’ of IT is driving the entire industry is an immense competitive advantage.” — February 2011, HP CEO Leo Apotheker

HP hates the PC business! — “To be successful in the consumer device business we would have had to invest a lot of capital and I believe we can invest it in better places.” — August 2011, HP CEO Leo Apotheker

HP really hates the PC business!— “HP also reported that it plans to announce that its board of directors has authorized the exploration of strategic alternatives for its Personal Systems Group (PSG). HP will consider a broad range of options that may include, among others, a full or partial separation of PSG from HP through a spin-off or other transaction.” — August 2011, HP company announcement

HP loves the PC business!— “Our role in our big businesses is to optimize performance from a growth and profit perspective, and I think we do that pretty well. Everyone always says the PC business is really hard. I’m anxious to find a business that’s easy, because I’m ready to sign up.” —October 2011, HP Executive Vice President Todd Bradley

HP really loves the PC business!— “While the operating margin of this business is not as high as some of the other businesses at HP, the return on investment capital is really terrific.” — October 2011, HP CEO Meg Whitman

HP really, for sure loves the PC business!— “HP objectively evaluated the strategic, financial and operational impact of spinning off PSG. It’s clear after our analysis that keeping PSG within HP is right for customers and partners, right for shareholders, and right for employees,” said Meg Whitman, HP president and chief executive officer. “HP is committed to PSG, and together we are stronger.” — October 2011, HP CEO Meg Whitman

HP is *totally* over the PC business— “Let me be clear: One HP was the right approach,” Whitman said. “During the fix and rebuild phase of our turnaround plan, we used the strength found in being together to become stronger throughout, but of course, the marketplace never stands still in our industry.” — October 2014, HP CEO Meg Whitman

HP Inc. loves the PC business!— “As the market leader in printing and personal systems, an independent HP Inc. will be extremely well positioned to deliver that innovation across our traditional markets as well as extend our leadership into new markets like 3-D printing and new computing experiences — inventing technology that empowers people to create, interact and inspire like never before.” — October 2014, HP Inc. CEO Dion Weisler

Greg Baumann is editor in chief at the Silicon Valley Business Journal.




Article source: http://www.bizjournals.com/sanjose/news/2014/10/06/the-complicated-oral-history-of-hewlett-packard-co.html?page=all

History of the Personal Computer, Part 3: IBM PC Model 5150 and the attack of …

1980 – 1984: x86, Rise of the 40-Year Stopgap

IBM PC Model 5150 and the Attack of the Clones

The only remarkable thing about the product that revolutionized the personal computing business was the fact that IBM built it. If any other company of the era built and marketed the IBM Personal Computer Model 5150, it might be looked back on with fondness but not as a product that changed an industry.

IBM’s stature guaranteed that the PC would initiate the level of standardization required for a technology to attain widespread usage. That same stature also ensured competitors would have unfettered access to the technical specifications of the Model 5150, since IBM was obligated to disclose such information under the Department of Justice’s 1956 consent decree, which the company operated under as an accommodation for its previous monopolistic practices.

The third facet of the Model 5150's enduring legacy came about from sourcing components via independent hardware vendors. IBM’s business was built on the company designing and manufacturing nearly all of its hardware and software in house, which maximized profit at the expense of overall agility in the market, as corporate in-fighting and rivalries between divisions within such a monolithic company added inertia to the decision-making processes.

The Datamaster was an all-in-one computer with text-mode CRT display, keyboard, processor, memory, and two 8-inch floppy disk drives all contained in one cabinet. (Photo: Oldcomputers.net)

The Model 5150 wasn’t IBM’s first attempt at building a personal computer, with at least four previous projects being scrapped as the market moved faster than IBM’s corporate decision making. The Intel 8085-equipped System/23 DataMaster business computer also endured a protracted development that started in February 1978. The DataMaster’s entry into the market in July 1981 led to a change in design strategy, and members of its design team were assigned to work on the new PC project.

IBM’s original plan had been to design the personal computer around Motorola’s 6800 processor at its Austin, Texas research center. IBM marketing had arranged for the PC to be sold through the stores of Sears, Roebuck and Co., and the deal hung in the balance as Motorola’s 6800 and its support chips slipped behind schedule.

A contingency plan named Project Chess was set up to run concurrently with the Austin design, and it gained traction after Atari approached IBM with an offer to build a personal computer for the company, should IBM want one. Official IBM sanction was achieved when project director William (Bill) Lowe pledged to have the design finalized in a year. To meet this timescale, Lowe would source components from vendors outside IBM.

Project director William Lowe pledged to have the design finalized in a year sourcing components from vendors outside IBM.

What remained was the choice of processor and operating system for the PC. Lowe and Don Estridge were astute enough to realize that IBM’s senior management would not look kindly upon a PC that posed a performance threat to the company’s lucrative business machines (a System/23 DataMaster terminal with printer listed for around $9,900 at the time).

The original intention seems to have been to use an 8-bit processor, which would have put MOS Tech’s 6502, Zilog’s Z80, and Intel’s 8085 in contention. However, IBM’s engineers favored 16-bit, as did Bill Gates, who lobbied IBM to go 16-bit to fully showcase the operating system he was developing, while the 32-bit architectures from Motorola and National Semiconductor (the 68000 and 16032, respectively) were not set to enter production within the one-year deadline.

The eventual choice was a compromise between 8-bit and 16-bit: it allayed concerns over compatibility with existing software and expansion options, reduced the bill of materials thanks to a cheaper processor and support chips that were already available, and retained a significant performance gap between the PC and IBM’s business machines.

IBM’s decision was made easier as the microprocessor landscape was becoming a war of attrition. MOS Tech, financially decimated by Texas Instruments’ calculator price war, was acquired by Commodore, and its focus shifted from innovation to capitalizing on the success of the 6502. Western Design Center (WDC) would eventually bring 16-bit computing to the 6500 series, but as with many microprocessor companies, the competition had rendered the effort redundant by the time it was ready for market.

Zilog’s fortunes also suffered a downturn, as majority shareholder and later parent company Exxon was happy to see the fledgling company embark on breakneck product diversification. R&D expenditure topped 35% of revenue, while the widening range of development caused slippage in its own 16-bit Z8000 processor as Exxon’s demands and the relative managerial inexperience of Federico Faggin were exposed.

Faggin and Ungermann had started Zilog to build microprocessors, but Exxon had bought Zilog as a cog in a machine, along with a host of other electronics and software company acquisitions, for a grand design it hoped would rival IBM. That design would turn into a billion-dollar failure.

Zilog’s waning fortunes, even as its Z80 powered a prodigious number of computers, terminals, and industrial machines, also cascaded down upon its second-source licensees. AMD’s license for Intel’s 8085 hadn’t translated into an invitation to do likewise with the follow-up 8086 processor. For a viable 16-bit processor, this left Jerry Sanders with the alternative of approaching Motorola or Zilog, as National Semiconductor’s offering was shaping up as promising much but delivering little.

What remained was the choice of processor and operating system for the PC, and IBM’s decision was made easier as the microprocessor landscape was becoming a war of attrition.

With Motorola’s delays and AMD’s inability to produce its own competitive architecture, Zilog looked like the more attractive option: as a relative newcomer it might prove easier to work with, and the Z8000 became AMD’s choice. The Z8000’s lack of backwards compatibility with 8-bit software doomed sales of the processor, and customers that had flocked to the Z80 quickly turned to Intel. Where Zilog had the successful Z80 and the world’s largest oil company covering its losses, AMD had no such options.

For its part, Intel had planned for a vast leap in processor architecture as soon as the 8080 had been completed. The envisaged chip, known internally as the 8816, would have been over four times the size of the existing 8-bit chip, with 16-bit and 32-bit functionality and a host of features that typify a modern processor.

It became apparent in an April 14, 1976 assessment that the architecture would be a formidable challenge to produce, and most certainly not within the timeframe needed to combat the 16-bit chips of Motorola, Zilog, National Semiconductor, and Texas Instruments.

Intel required at least an interim design to fend off the competition and continue its growth, and software group manager Terry Opdendyk accepted the challenge of delivering a new processor architecture inside of 10 weeks — the estimated maximum length of time that would still allow the chip to be completed within a year. Opdendyk chose Steve Morse, an engineer specializing in software and the author of the condemnatory 8816 review that initiated the project, marking the first time that the design of an Intel chip architecture hadn’t been the responsibility of hardware engineers.

Die shot of the 16-bit Intel 8086 microprocessor. The 8086 gave rise to the famed x86 architecture, which eventually became Intel’s most successful line of processors.

Architecture design work started in May with the two-man team of Morse and project manager Bill Pohlman, and the first revision of the architecture was duly completed in August. Two stipulations were imposed on the design: it needed backwards compatibility with the 8080 and it needed to address memory up to 128KB, double that of the 8080.

The second requirement was in fact exceeded: an awkward scheme of segmented addressing let a chip that handled data 16 bits at a time form 20-bit memory addresses, allowing up to 1 megabyte of memory to be addressed. As inelegant as the resulting 8086 solution was when it arrived on June 8, 1978, it allowed Intel to beat both Motorola and Zilog in the race to a commercially viable 16-bit processor.
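
The arithmetic behind that scheme is simple: a 16-bit segment value is shifted left by four bits and added to a 16-bit offset, producing a 20-bit physical address, which also means that many different segment:offset pairs name the same byte. The short Python sketch below is purely illustrative of 8086 real-mode addressing and is not drawn from the article’s sources.

    def physical_address(segment, offset):
        # 8086 real mode: shift the 16-bit segment left by 4 bits, add the
        # 16-bit offset, and wrap at 2**20 to stay within the 1 MB address space.
        assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
        return ((segment << 4) + offset) & 0xFFFFF

    print(hex(physical_address(0xF000, 0xFFF0)))  # prints 0xffff0
    print(hex(physical_address(0xFFFF, 0x0000)))  # also 0xffff0 -- a different pair, same byte

The aliasing shown in the second call is part of what made the scheme feel awkward to programmers, even though it met the goal of a 1 MB address space on a 16-bit chip.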

Intel followed up the 8086 a year later with a cost-reduced 8088 that halved the 8086’s external bus from 16 bits to 8, targeting tighter budgets and customers seeking to extend the life of their 8080- and 8085-based systems and associated software.

While Motorola’s own 16-bit processor, the 68000, wasn’t launched until a full 15 months after the 8086, its design had merits over the 8086 that began to offset Intel’s initial market lead as soon as the chip started sampling. Whereas Zilog was seen as a small company regardless of the amount of money being pumped into it by Exxon, Motorola was an established semiconductor company with a proven track record and high market visibility.

The original Apple Macintosh and its early successors used the Motorola 68000 processor as their CPU.

Up to this point, Intel never really had to “sell” products to customers. Its product lines were generally superior to (or at least the equal of) the competition and demand often exceeded supply. Consequently, Intel salesmanship often amounted to a mix of complacency and arrogance, and customers relished the chance to push back against the company’s attitude.

Against this backdrop, Intel instituted its first national marketing campaign, spurred by an eight-page report from Don Buckhout, an East Coast Field Applications Engineer (FAE), detailing Motorola’s design wins at Intel’s expense. The company set a goal of securing one design win per month for a year from each of its 170 sales representatives (a nominal 2000 design wins by December 1980) as part of Operation Crush, a name inspired by the Denver Broncos’ “Orange Crush” 3-4 defense as well as the campaign’s stated aim of crushing Motorola.

When faced with a superior processor, Intel would emphasize the system as a whole including support chips, an area where Motorola was relatively weak. Intel was processor-centric while microprocessors represented just a small part of Motorola’s diverse company. Intel unashamedly played on the fear, uncertainty and doubt of customers by asking whether Motorola could sustain support, integration, and future products.

Faced with the superior 68000 processor, Intel would emphasize the system as a whole including support chips, an area where Motorola was relatively weak.

While Intel’s corporate character did not ingratiate the company to those who had to deal with it, the quality of its technical support and products was undeniable. Intel abandoned its corporate ethos of never publicizing a product until production was under way, pairing PR for a future product line with an extensive 100-page catalog of its chip developments.

Rather than concentrating on the 68000, where it had a marketing and technical advantage, Motorola took the bait and tried to answer Intel’s yet-to-be-built products with its own more meager alternatives. From that point, Motorola and the 68000’s position of power evaporated.

Intel’s goal had been to gain 2000 design wins by December 1980. Such was the success of Operation Crush that the final figure was closer to 2500, in part due to the added incentive of a prize for the most wins brought in by a salesman and their FAE. Bill Handel won with nearly 100 contracts, including one for use in an electronic temperature-monitoring brassiere that supposedly notified the wearer of the optimum time for conception. Handel and his FAE were each awarded 86 shares of Intel stock as first prize and, along with every other sales representative who met their quota, a junket to Tahiti.

While Bill Handel might have won Operation Crush’s contest, Earl Whetstone, a salesman on the opposite coast, took a chance to add to his own tally by calling IBM’s Boca Raton development laboratory, despite IBM’s history of designing its processors in-house. As luck would have it, IBM’s Project Chess had just been initiated. Operation Crush had been set in motion to maximize the 8086’s exposure, but the biggest design win of the nearly 2500 achieved was the stripped down 8088 8-bit external/16-bit internal bus hybrid chip (developed from the 8086) landing in the IBM PC Model 5150.

IBM Personal Computer model 5150 with IBM CGA monitor, IBM PC keyboard, IBM 5152 printer and paper stand.

Like the company’s mainframe business, the IBM PC was designed for a high degree of customization. However, whereas the modularity of the mainframe came from IBM-built parts, the PC would draw on the wider opportunities afforded by independent vendors. Two graphics adapters were available: the CGA (Color Graphics Adapter) targeting the home user, and the MDA (Monochrome Display Adapter) for commercial users, which included a matrix printer port. The IBM-designed mainboard offered enough expansion slots (five) for most users as well as a fairly wide range of RAM, storage, and printer options.

Intel’s Operation Crush was a resounding success, but the biggest design win of the nearly 2500 achieved was the stripped down 8088 8-bit external/16-bit internal bus hybrid chip landing in the IBM PC Model 5150.

IBM’s choice of operating system was to be outsourced as well. The DataMaster’s lengthy gestation was largely attributed to a change of direction over the choice of BASIC interpreter during its development phase, and IBM had taken the lesson to heart. The exact chain of events that followed IBM’s decision to use Microsoft’s solution isn’t entirely clear, but it began with IBM approaching Gary Kildall’s Digital Research.

During the formative years of the personal computer industry, Kildall’s CP/M had been the operating system of choice for many vendors. While Kildall concentrated on the OS side of the street, Bill Gates and Paul Allen had focused on programming-language variants of BASIC. An unspoken contract seemed to exist that neither company would intrude on the other’s specialty.

This arrangement would sour with Digital Research Inc.’s (DRI) close association with and later acquisition of Compiler Systems, which was run by one of Kildall’s graduate students, Gordon Eubanks, who had written a version of BASIC (CBASIC) for IMSAI. Shopping for an operating system, IBM arranged to meet with Kildall with a view to adapting CP/M to the 8086 and 8088 processors.

When IBM representative Jack Sams arrived at DRI to discuss the proposal, Kildall wasn’t in the office, and when confronted with IBM’s infamous one-sided NDA, Dorothy Kildall refused to sign until she could seek a legal opinion. It seems probable that Kildall arrived for the meeting many hours later, too late to discuss specifics. Sams then flew to Seattle to meet with Bill Gates.

What seems certain is that Kildall’s casual demeanor was at odds with Sams’ (and by extension IBM’s) corporate philosophy. The level of professionalism and deference that, in IBM’s view, should be accorded to one of America’s largest companies was distinctly lacking. Kildall’s lack of urgency in accelerating what would become CP/M-86, a compatible OS for the 8086/8088, also stands out as a major stumbling block. Bill Gates, however, was acutely aware of IBM’s stature and was quite willing to accede to IBM’s demands if it meant building his business.

The only real problem Bill Gates had was that he didn’t have an operating system to sell to IBM — but he knew of someone who did. Gates approached Seattle Computer Products, whose SCP-200B development kit used the 8086. While SCP sold the kits, it had been faced with the same lack of an operating system as other 8086 vendors. Its solution was to design its own — design being a fairly loose term, as SCP’s Tim Paterson borrowed heavily from the existing CP/M by scrupulously copying the OS’s API calls (the interface the OS uses to interact with other software).

Bill Gates was willing to accede to IBM’s demands if it meant building his business. The only real problem was that he didn’t have an OS to sell — but knew of someone who did.

The resultant QDOS (Quick and Dirty Operating System) debuted in July 1980, a full month before IBM went looking for the same thing from DRI and Microsoft, although it wouldn’t start shipping until the end of September. SCP’s loss instantly turned into Microsoft’s exceedingly large gain as Bill Gates and Microsoft’s first business manager Steve Ballmer pitched to IBM a license agreement for the newly found operating system, associated software, and four programming languages on a royalty basis.

Importantly, Microsoft was to remain free to license MS-DOS to other vendors.

Paul Allen and Bill Gates pose next to a few early desktop systems.

IBM duly accepted and handed over a $700,000 advance a week after QDOS had been secured by Microsoft. Obtaining the operating system without alerting SCP to its empire-building promise had fallen to Paul Allen, who negotiated a per-license agreement with SCP at $10,000 per customer, plus $5,000 if the source code was included, and a $10,000 advance to seal the deal. The implication was that Gates and Allen had a large number of licensees locked up, although a clause in the agreement stated that Microsoft’s customer list was confidential — a customer list that numbered a single client.

Tim Paterson was soon added to the Microsoft workforce when 86-DOS, as it was then known, needed revision at IBM’s behest. By the time the IBM PC was ready for release, Rod Brock, SCP’s owner, was nearing bankruptcy as the 86-DOS licensees had failed to materialize. With his programmer long gone and lacking the funds to stay afloat, Brock accepted Gates’ offer to sell 86-DOS to Microsoft for $50,000 — the equivalent of two minutes’ worth of sales at the height of the software’s popularity a decade later as MS-DOS.

The first version of DOS had no support for hard disks, directories or loadable device drivers. (Photo: OS/2 Museum)

A major consideration for IBM was the integrity of its supply chain. With other IBM products that meant one division of the company delivering products to another. Outsourcing components was a common practice in an industry where manufacturing and yield issues were rife, so over the strong objections of Intel, IBM demanded a second source for the 8088.

Second sourcing also had the added advantage of introducing another level of quality control, as processors from each company could be compared for performance and adherence to shipping schedules. IBM’s preferred second source, AMD, just so happened to be looking to replace its ill-selling licensed copy of the Zilog Z8000, the AMZ8000.

With Motorola’s microprocessor market share plummeting to a mere 15% as a consequence of Operation Crush, the standout candidate became fairly obvious.

Intel and AMD hammered out a deal in February 1982. Intel, for its part, fulfilled IBM’s edict, and AMD, knowing that Intel’s position was weakened by IBM’s demands, extracted a lengthy license agreement. AMD would pay royalties to Intel for three years, at which time the payments would be weighted according to value when and if Intel decided to take up options for licensing AMD products. This part of the agreement was to last for at least 5 of the 10 years provided for.

As the IBM PC neared introduction, the main talking points were of an affordable computer for the consumer, made by an iconic American company. Enthusiasts tended to note the use of BIOS, IBM’s attention to system integrity with the 14-step power-on self-test (POST) sequence, and the comprehensive user manual — all taken for granted now, but groundbreaking in 1981.

IBM never set a true base configuration for the PC, but a $1,595 price is often quoted for the machine in its unbundled form. A fully optioned version topped $6,000 with extra hardware such as dual floppy drives and memory expansion kits, plus additional operating systems (BASIC was included; MS-DOS added $40, CP/M-86 $240, and the UCSD p-System $695 — CP/M-86 being offered to forestall any adverse publicity from Digital Research).

CP/M-86 was a version of the CP/M operating system from Digital Research for the Intel 8086 and Intel 8088.

For IBM PC owners desperate to run CP/M, an add-in co-processor board such as Xedex’s Z80-equipped Baby Blue card could be procured. The average price of the 13,533 PCs sold by the year’s end (against advance orders for over 35,000) was around $3,000. The IBM Model 5150’s sales would reach 50,000 within six months, and 200,000 after a year.

The arrival of the Model 5150 did not alter the personal computing landscape at the time. The machine was too expensive for many, and like the Apple II, sales stemmed more from business users. The IBM PC became a safe option because of the reputation of the company standing behind it. The PC’s effect on the industry wouldn’t be felt until “IBM” became less of a selling point than “IBM Compatible”.

The 13,533 PCs sold by IBM in the closing months of 1981 represented less than 1% of total sales and 1.9% of revenue for the $3 billion personal computer market, while Radio Shack and Apple accounted for a combined 37% of the sales (20% and 17% respectively). What the IBM PC would achieve was an almost instantaneous boost to the add-in board market, and a solid base for software developers.

Building on this growth, 1982 saw personal computer shipments double from the previous year, with 2.8 million sold worldwide. A significant part of this growth came with the arrival of the Commodore 64, which redefined the budget end of the PC market, previously the province of the company’s own VIC 20, Atari’s 400 and 800, and the TRS 80, while low-end machines such as Sinclair’s ZX81 intruded on pricing until then reserved for pocket calculators.

The arrival of the Model 5150 did not alter the personal computing landscape at the time, but it became a safe option because of the company standing behind it. Its effect on the industry wouldn’t be felt until “IBM” became less of a selling point than “IBM Compatible”.

These lower-priced entry-level machines boosted the gaming and entertainment landscape, while the Commodore 64 maintained the momentum of the sprite-driven graphics lineage of the Atari 2600, although graphics were about to be given a serious makeover. 1982 also saw the founding of SGI, Hercules, Diamond Multimedia, Orchid Technology, Number Nine, and Electronic Arts, along with the debut of Autodesk’s AutoCAD. On-Line became Sierra On-Line as the company grew in scope through its association with IBM, as did MSA’s Peachtree Accounting software, IUS’s EasyWriter, ISS’s WordPerfect, and the spreadsheet application synonymous with the IBM PC, Lotus 1-2-3.

The venture capital that enabled Lotus Software to begin operations originated in part from the two people who had supplied start-up funding for Silicon Graphics (SGI) and Electronic Arts. L.J. Sevin and Ben Rosen also provided half the initial capital for Compaq Computer, formed by three disgruntled Texas Instruments engineers (soon to be followed by many more hires from T.I.) who had seen a market for a portable version of IBM’s PC.

Compaq was far from alone in quickly noticing that the PC’s sales were driven more by the software available to it than the sum of the hardware in the machine, so the ideal product would be one that could use software that already existed. With IBM leading the way it was a pretty safe bet that its standard would succeed in the marketplace — by force of will if necessary. It was an added bonus being able to ally itself (however tangentially) with Big Blue by marketing its product as “IBM PC Compatible”.

Companies looking to ride the coattails of a market leader were nothing new. Franklin Computer had released an exact copy of the Apple II as the Ace 100, and Apple quickly brought a case against Franklin, which would continue to plagiarize Apple’s design until August 1983, when the U.S. Court of Appeals ruled in Apple’s favor.

IBM’s only protection against similar imitators would be its BIOS, along with the company’s false assumption that competitors could not source the components more cheaply than Big Blue.

Columbia Data Products’ MPC 1600 became the first IBM PC clone in June 1982, and although it was cheaper than the Model 5150 by around a third, its reverse-engineered BIOS wasn’t fully compatible with IBM hardware or the software suite. Eagle Computers, Corona and Handwell attempted the less labor-intensive option of starting with IBM’s own BIOS, the complete code for which IBM had included in the PC’s manual, which only got IBM’s formidable team of lawyers involved. Several vendors would also look to improve upon the Model 5150’s feature set, notably DEC’s Rainbow 100 and Seequa’s Chameleon, both of which featured a Z80 in addition to the 8088; both would fail commercially without full IBM PC compatibility.

Compaq, on the other hand, consulted its lawyers before starting a PC project and, to protect itself, used a “clean room” approach in which the BIOS programmers never saw the original IBM code. Phoenix Technologies would go on to employ the same methodology to engineer its own BIOS ROM, with the result that any company could purchase IBM compatibility off the shelf for $25 a chip and a $290,000 license fee.

The Compaq Portable was the first 100% IBM-compatible PC, and the first portable one. (Photo: Maximum PC)

Compaq’s first product was the Portable Personal Computer, publicly launched on November 4, 1982, and the first machine completely compatible on a hardware and software basis with IBM’s Model 5150. The first 300 machines shipped in January 1983, with 50,000 sold by year end — part of the million 8088-based computers sold to that date. Although the sales numbers for the year didn’t indicate it (IBM sold half a million PCs during 1983 and Apple sold 750,000, which combined equaled sales of the Commodore 64), the arrival of the clones, and Compaq in particular, signaled the end of IBM’s short-lived dominance in the market. The driving forces would from now on be the companies that powered both the PC and its imitators: Intel and Microsoft.

Apple’s business included neither Intel nor Microsoft and on the face of it seemed to be booming. The company shipped its one-millionth Apple II in June 1983 and continued to be a big seller until the Macintosh became the company’s focus. Balanced against this was the relative flop of the Apple III, doomed by poor attention to detail during design and a robotic assembly line that failed to seat chips into circuit boards with enough pressure to ensure correct contact.

The arrival of the clones and Compaq in particular signaled the end of IBM’s short-lived dominance in the market. The driving forces would from now on be companies that powered both the PC and its imitators, Intel and Microsoft.

Apple’s recommendation of raising the machine a few inches and dropping it to reseat components, repeating until the machine regained its function, seemed like a less than professional workaround to its users. The Lisa project, which grew out of Steve Jobs’ eye-opening visit to Xerox PARC in December 1979, also became a drain on resources. R&D spending for Jobs to recreate PARC’s wonderland of networked Alto workstations topped $50 million — a thousand times the Apple II’s development cost — but at least the technology was successfully applied to later products, even if the resulting Lisa was limited to around 100,000 units sold.

The principal drawback of the Apple Lisa was price. At $10,000 per machine, the target market was small, and made smaller by the Lisa’s lack of networking capability.

Apple rectified the cost (and the customer expectations that came with it) with the Macintosh, a project developed in tandem with the Lisa that incorporated many of the same features in a simpler design costing only a quarter of the Lisa’s price tag. A Ridley Scott-directed commercial built on George Orwell’s Nineteen Eighty-Four motif, widely regarded as the pinnacle of advertising, preceded the Macintosh’s successful launch in January 1984.

The Macintosh would continue to drive Apple’s competitive market share with steady sales, although pricing and subsequent revenue fell consistently as Windows 3.0, and more importantly, associated productivity applications such as Excel and Word began to encroach into traditional Apple Mac strongholds.

Total sales of personal computers in the first four years after the Altair’s introduction had amounted to 200,000 machines worldwide. A decade later, 1988 alone would see 19 million personal computers shipped. A business started in garages by hobbyists had turned into a multi-billion-dollar industry where the camaraderie of enthusiasts had been replaced by cutthroat market competition — a competition that was about to see patents and intellectual property wielded as economic weapons.

This article is the third installment in a series of five. If you enjoyed this, make sure to join us next week as we get into a decade of steady growth and the consolidation of power by those that endured the industry’s intense rivalries. If you feel like reading more about the history of computing, check out our feature on iconic PC hardware. Until next week!


Article source: http://www.techspot.com/article/893-history-of-the-personal-computer-part-3/

The Forgotten Female Programmers Who Created Modern Tech

Jean Jennings (left) and Frances Bilas set up the ENIAC in 1946. Bilas is arranging the program settings on the Master Programmer. (Courtesy of University of Pennsylvania)

If your image of a computer programmer is a young man, there’s a good reason: It’s true. Recently, many big tech companies revealed how few of their female employees worked in programming and technical jobs. Google had some of the highest rates: 17 percent of its technical staff is female.

It wasn’t always this way. Decades ago, it was women who pioneered computer programming — but too often, that’s a part of history that even the smartest people don’t know.

I took a trip to ground zero for today’s computer revolution, Stanford University, and randomly asked over a dozen students if they knew who the first computer programmers were. Almost none knew.

“I’m in computer science,” says a slightly embarrassed Stephanie Pham. “This is so sad.”

A few students, like Cheng Dao Fan, get close. “It’s a woman, probably,” she says, searching her mind for a name. “It’s not necessarily [an] electronic computer. I think it’s more like a mechanical computer.”

She’s thinking of Ada Lovelace, also known as the Countess of Lovelace, born in 1815. Walter Isaacson begins his new book, The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution, with her story.

“Ada Lovelace is Lord Byron’s child, and her mother, Lady Byron, did not want her to turn out to be like her father, a romantic poet,” says Isaacson. So Lady Byron “had her tutored almost exclusively in mathematics as if that were an antidote to being poetic.”

Lovelace saw the poetry in math. At 17, she went to a London salon and met Charles Babbage. He showed her plans for a machine that he believed would be able to do complex mathematical calculations. He asked Lovelace to write about his work for a scholarly journal. In her article, Lovelace expresses a vision for his machine that goes beyond calculations.

She envisioned that “a computer can do anything that can be noted logically,” explains Isaacson. “Words, pictures and music, not just numbers. She understands how you take an instruction set and load it into the machine, and she even does an example, which is programming Bernoulli numbers, an incredibly complicated sequence of numbers.”
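
What “programming Bernoulli numbers” entailed is easier to picture with a modern sketch. Lovelace’s Note G laid out a specific sequence of operations for Babbage’s engine; the short Python fragment below computes the same sequence of numbers using the Akiyama-Tanigawa algorithm, chosen here for brevity, and is an illustration rather than a reconstruction of her program.

    from fractions import Fraction

    def bernoulli_numbers(count):
        # Akiyama-Tanigawa algorithm: maintain a working row of fractions;
        # after the m-th pass, row[0] holds the Bernoulli number B_m
        # (using the convention in which B_1 = +1/2).
        row = []
        numbers = []
        for m in range(count):
            row.append(Fraction(1, m + 1))
            for j in range(m, 0, -1):
                row[j - 1] = j * (row[j - 1] - row[j])
            numbers.append(row[0])
        return numbers

    print([str(b) for b in bernoulli_numbers(8)])
    # ['1', '1/2', '1/6', '0', '-1/30', '0', '1/42', '0']

Exact fractions are used because Bernoulli numbers are rationals, which hints at why working them out by hand, or on a mechanical engine, was considered so demanding.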

Babbage’s machine was never built. But his designs and Lovelace’s notes were read by people building the first computer a century later.

The women who would program one of the world’s earliest electronic computers, however, knew nothing of Lovelace and Babbage.

As part of the oral history project of the Computer History Museum, Jean Jennings Bartik recalled how she got the job working on that computer. She was doing calculations on rocket and cannon trajectories by hand in 1945. A job opened to work on a new machine.

“This announcement came around that they were looking for operators of a new machine they were building called the ENIAC,” recalls Bartik. “Of course, I had no idea what it was, but I knew it wasn’t doing hand calculation.”

Bartik was one of six female mathematicians who created programs for one of the world’s first fully electronic general-purpose computers. Isaacson says the men didn’t think it was an important job.

“Men were interested in building, the hardware,” says Isaacson, “doing the circuits, figuring out the machinery. And women were very good mathematicians back then.”

Isaacson says in the 1930s female math majors were fairly common — though mostly they went off to teach. But during World War II, these skilled women signed up to help with the war effort.

Bartik told a live audience at the Computer History Museum in 2008 that the job lacked prestige. The ENIAC wasn’t working the day before its first demo. Bartik’s team worked late into the night and got it working.

“They all went out to dinner at the announcement,” she says. “We weren’t invited and there we were. People never recognized, they never acted as though we knew what we were doing. I mean, we were in a lot of pictures.”

At the time, though, media outlets didn’t name the women in the pictures. After the war, Bartik and her team went on to work on the UNIVAC, one of the first major commercial computers.

The women joined up with Grace Hopper, a tenured math professor who had joined the Navy Reserve during the war. Walter Isaacson says Hopper had a breakthrough: she found a way to program computers using words rather than numbers — most notably with a programming language called COBOL.

“You would be using a programming language that would allow you almost to just give it instructions, almost in regular English, and it would compile it for whatever hardware it happened to be,” explains Isaacson. “So that made programming more important than the hardware, ’cause you could use it on any piece of hardware.”

Hopper retired from the Navy Reserve as a rear admiral. An act of Congress allowed her to stay past mandatory retirement age. She did become something of a public figure and even appeared on the David Letterman show in 1986. Letterman asks her, “You’re known as the Queen of Software. Is that right?”

“More or less,” says the 79-year-old Hopper.

But it was also just about this time that the number of women majoring in computer science began to drop, from close to 40 percent to around 17 percent now. There are a lot of theories about why this is so. It was around this time that Steve Jobs and Bill Gates were appearing in the media; personal computers were taking off.

Computer science degrees got more popular, and boys who had been tinkering with computer hardware at home looked like better candidates to computer science departments than girls who liked math, says Janet Abbate, a professor at Virginia Tech who has studied this topic.

“It’s kind of the classic thing,” she says. “You pick people who look like what you think a computer person is, which is probably a teenage boy that was in the computer club in high school.”

For decades the women who pioneered the computer revolution were often overlooked, but not in Isaacson’s book about the history of the digital revolution.

“When they have been written out of the history, you don’t have great role models,” says Isaacson. “But when you learn about the women who programmed ENIAC or Grace Hopper or Ada Lovelace … it happened to my daughter. She read about all these people when she was in high school, and she became a math and computer science geek.”

Lovelace, the mathematician, died when she was 36. The women who worked on the ENIAC have all passed away, as has Grace Hopper. But every time you write on a computer, play a music file or add up a number with your phone’s calculator, you are using tools that might not exist without the work of these women.

Isaacson’s book reminds us of that fact. And perhaps knowing that history will show a new generation of women that programming is for girls.

Copyright 2014 NPR. To see more, visit http://www.npr.org/.

STEVE INSKEEP, HOST:

The typical computer programmer is a man, a young man; that’s just a reality right now. Tech companies recently revealed how few of their female employees worked in programming and technical jobs. Google was among the highest – with 17 percent. It would be easy to assume it’s always been that way, but it has not. Decades ago, women pioneered computer programming, and in today’s installment of The Changing Lives of Women, we will hear what those women did and why more women did not follow. NPR’s Laura Sydell reports.

LAURA SYDELL, BYLINE: I took a trip to Stanford University, Ground Zero for today’s computer revolution. I figured if any group of random students would know about the pioneers of computer programming I’d find them here.

Who were the first people who programmed computers? Do you know?

ANSHULMAN PRADHAN: No, I don’t think so.

DESHAE JENKINS: I have no idea.

ANTHONY CARRINGTON: No, I don’t.

STEPHANIE PHAM: I’m in computer science. This is so sad. (Laughter).

SYDELL: That was Anshulman Pradhan, Deshae Jenkins, Anthony Carrington and Stephanie Pham. Of the more than a dozen students I interviewed, I got a couple close calls. Here’s one of them, Cheng Dao Fan.

CHENG DAO FAN: It’s a woman probably. Yeah, and she’s – it’s not necessarily electronic computer, I think it’s more like a mechanical computer.

SYDELL: She’s thinking of Ada Lovelace, also known as the Countess of Lovelace, born in 1815. Walter Isaacson begins his new book, “The Innovators,” with her story.

WALTER ISAACSON: Ada Lovelace is Lord Byron’s child. And her mother, Lady Byron, did not want her to turn out to be like her father, a romantic poet.

SYDELL: So Lady Byron…

ISAACSON: Had her tutored almost exclusively in mathematics, as if that were an antidote to being poetic.

SYDELL: Lovelace saw the poetry in math. At 17, she went to a London salon and met Charles Babbage. He showed her plans for a machine, which he believed would be able to do complex mathematical calculations. He asked Lovelace to write about his machine for a scholarly journal. In her article, Lovelace expresses a vision for his machine that goes beyond calculations.

ISAACSON: That a computer can do anything that can be noted logically. In other words, words and pictures and music, not just numbers. Secondly, she understands how you take an instruction set and load it into the machine. And she even does an example, which is programming Bernoulli numbers – an incredibly complicated sequence of numbers.

SYDELL: Babbage’s machine was never built, but his designs and Lovelace’s notes were read by people building the first computer a century later – though the women who would program one of the world’s earliest electronic computers knew nothing of Lovelace and Babbage. As part of the oral history project of the Computer History Museum, Jean Jennings Bartik recalled she was doing calculations on rocket and cannon trajectories by hand. In 1945, a job opened to work on a new machine.

JEAN JENNINGS BARTIK: This announcement came around that they were looking for operators of a new machine they were building called the ENIAC, so of course I had no idea what it was.

SYDELL: Bartik was 1 of 6 women mathematicians who created programs for one of the world’s first fully electronic general-purpose computers – the ENIAC. Isaacson says the men didn’t think this was an important job.

ISAACSON: Men were interested in building the hardware, you know, doing the circuits, figuring out the machinery. And women were very good mathematicians back then.

SYDELL: Isaacson says in the 1930s, women math majors were fairly common, though most of them went off to teach. But during the Second World War, these skilled women signed up to help with the war effort. In front of a live audience at the Computer History Museum in 2008, Jean Bartik recalled that the job lacked prestige. The ENIAC wasn’t working the day before its first demo. Bartik’s team worked late into the night and got it working.

(SOUNDBITE OF ARCHIVED RECORDING)

BARTIK: They all went out to dinner and at the announcement, and we weren’t invited, and there we were. People never recognized – they never acted as though we knew what we were doing. I mean, we were in a lot of pictures…

SYDELL: Though at the time, media outlets didn’t name the women in the pictures. After the war, Bartik and her team went on to work on the UNIVAC, one of the first major commercial computers. The women joined up with Dr. Grace Hopper, a tenured math professor who joined the Navy Reserve. Walter Isaacson says Hopper had a breakthrough. She found a way to program computers using words rather than numbers, most notably a program language called COBOL.

ISAACSON: Because you would be using a programming language that would allow you almost to just give it instructions, almost in regular English. It would compile it for whatever hardware it happened to be. So that made programming more important than the hardware.

SYDELL: Hopper retired from the Navy Reserve as a rear admiral. An act of Congress allowed her to stay past mandatory retirement age. She did become a public figure. Hopper even appeared on David Letterman in 1986.

(SOUNDBITE OF TV SHOW, “LATE SHOW WITH DAVID LETTERMAN”)

DAVID LETTERMAN: But you’re known as the queen of software – is that right?

(LAUGHTER)

SYDELL: But it was also just about this time that the number of women majoring in computer science began to drop, from close to 40 percent to around 17 percent now. It was around this time that Steve Jobs and Bill Gates were appearing in the media. Personal computers were taking off; computer science degrees got more popular; boys who had been tinkering with computer hardware at home looked like better candidates to departments than girls who liked math. That’s according to Janet Abbate, a professor at Virginia Tech who has studied this topic.

JANET ABBATE: It’s kind of the classic thing. You pick people who look like, you know, what you think a computer person is – which is probably a teenage boy who was in the computer club in high school.

SYDELL: And for decades the women who pioneered the computer revolution were often overlooked, but not in Isaacson’s book about the history of the digital revolution.

ISAACSON: When they have been sort of written out of the history, you don’t have great role models, but when you learn about the women who programmed ENIAC or Grace Hopper or Ada Lovelace – it happened to my daughter. She read about all these people when she was in high school, and she became a math and computer science geek.

SYDELL: Ada Lovelace, the mathematician, died when she was 36. The women who worked on the ENIAC have all passed away, as has Grace Hopper. But every time you write on a computer, play a music file or add up a number with your phone’s calculator, you are using tools that might not exist without the work of these women. Isaacson’s book reminds us of this fact, and perhaps knowing that history will show a new generation of women that programming is for girls. Laura Sydell, NPR News.

INSKEEP: It’s MORNING EDITION from NPR News. I’m Steve Inskeep.

RACHEL MARTIN, HOST:

And I’m Rachel Martin. Transcript provided by NPR, Copyright NPR.

Article source: http://www.wbur.org/npr/345799830/the-forgotten-female-programmers-who-created-modern-tech

Rare Apple-1 Computer Could Fetch $500000 at Bonhams’ Upcoming History of …

On October 22 at Bonhams New York, centuries of scientific inquiry will be recapitulated in one afternoon during the auction house’s inaugural History of Science sale. The books, manuscripts, prints, photographs, scientific instruments, and technological devices on offer range from the 16th through the 20th century—including at least one Renaissance printing of an astronomy text from the first century BC.

Astronomy in fact makes up an entire segment of the sale, with the shining star in that category being an extraordinary archive of items from George Willis Ritchey (expected to fetch from $450,000 to $550,000). Ritchey was an astronomer, astro-photographer, and telescope designer who designed and built the telescope at the U.S. Naval Observatory. The lot for sale includes hundreds of his photographs, a number of books and periodicals, notes on the origin of the moon, and other documents relating to his work.

A wide assortment of books and manuscripts on natural history, medicine, and physiology will appeal to collectors and aesthetes alike, with such items as a first edition of Charles Darwin’s The Origin of Species ($25,000 to $35,000), a trio of lizard and frog X-rays ($800 to $1,200), books containing floral and faunal lithographs, exquisitely (and sometimes gruesomely) illustrated anatomical atlases and surgical manuals, and much more. Handwritten personal correspondences from both Darwin and Albert Einstein will be available, including a charmingly enthusiastic inquiry from Darwin to an eyewitness on the mechanics of barnacle reproduction ($20,000 to $30,000).

Perhaps the most exciting items in the sale come from the physics and technology segments. One is a circa-1905 mahogany-and-brass Helmholtz sound synthesizer ($20,000 to $30,000)—an example of the first electric keyboard. These instruments are surpassingly rare; Cassandra Hatton, the senior specialist at Bonhams who is in charge of the sale, knows of only one similar synthesizer in a U.S. institution, and it is smaller than the one on offer this month.

Two other notable items mark thresholds in the development of computing. A first edition of Ada Lovelace’s 1843 publication on Charles Babbage’s Analytical Engine ($18,000 to $25,000) includes her algorithm for the device to compute Bernoulli numbers—one of the first-ever computer programs. Her paper also propounds her insight that any element that could be represented by numbers could be manipulated by the machine.

That idea ultimately led to the first preassembled personal computer ever sold—the Apple-1. Examples of the 1976 machine have occasionally come up for sale recently, but the one that will headline the Bonhams sale ($300,000 to $500,000) is, perhaps uniquely, unmodified, aside from having one of its computing chips replaced. Only about 15 Apple-1s have been successfully operated in this century, and this one is the only one since at least 2010 to come up for public sale with nearly all of its original parts. (bonhams.com)

Article source: http://robbreport.com/art-collectibles/rare-apple-1-computer-could-fetch-500000-bonhams-upcoming-history-science-sale