The Forgotten Female Programmers Who Created Modern Tech

Jean Jennings (left) and Frances Bilas set up the ENIAC in 1946. Bilas is arranging the program settings on the Master Programmer.

Courtesy of University of Pennsylvania

If your image of a computer programmer is a young man, there’s a good reason: It’s true. Recently, many big tech companies revealed how few of their female employees worked in programming and technical jobs. Google had some of the highest rates: 17 percent of its technical staff is female.

It wasn’t always this way. Decades ago, it was women who pioneered computer programming — but too often, that’s a part of history that even the smartest people don’t know.

I took a trip to ground zero for today’s computer revolution, Stanford University, and randomly asked over a dozen students if they knew who the first computer programmers were. Almost none knew.

“I’m in computer science,” says a slightly embarrassed Stephanie Pham. “This is so sad.”

A few students, like Cheng Dao Fan, get close. “It’s a woman, probably,” she says, searching her mind for a name. “It’s not necessarily [an] electronic computer. I think it’s more like a mechanic computer.”

She’s thinking of Ada Lovelace, also known as the Countess of Lovelace, born in 1815. Walter Isaacson begins his new book, The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution, with her story.

Augusta Ada, Countess of Lovelace, was the daughter of poet Lord Byron. The computer language ADA was named after her in recognition of her pioneering work with Charles Babbage.

Hulton Archive/Getty Images

“Ada Lovelace is Lord Byron’s child, and her mother, Lady Byron, did not want her to turn out to be like her father, a romantic poet,” says Isaacson. So Lady Byron “had her tutored almost exclusively in mathematics as if that were an antidote to being poetic.”

Lovelace saw the poetry in math. At 17, she went to a London salon and met Charles Babbage. He showed her plans for a machine that he believed would be able to do complex mathematical calculations. He asked Lovelace to write about his work for a scholarly journal. In her article, Lovelace expresses a vision for his machine that goes beyond calculations.

She envisioned that “a computer can do anything that can be noted logically,” explains Isaacson. “Words, pictures and music, not just numbers. She understands how you take an instruction set and load it into the machine, and she even does an example, which is programming Bernoulli numbers, an incredibly complicated sequence of numbers.”
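
Lovelace’s published notes included a table of operations for generating Bernoulli numbers on the Analytical Engine. As a modern illustration only, and not a reconstruction of her table, the short Python sketch below uses the standard recurrence and shows why the sequence made a demanding demonstration: each term depends on every term before it.

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return B_0 .. B_n as exact fractions, using the classic recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1, with B_0 = 1.
    This convention gives B_1 = -1/2."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B.append(-acc / (m + 1))
    return B

if __name__ == "__main__":
    for i, b in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {b}")
```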

Babbage’s machine was never built. But his designs and Lovelace’s notes were read by people building the first computer a century later.

The women who would program one of the world’s earliest electronic computers, however, knew nothing of Lovelace and Babbage.

As part of the oral history project of the Computer History Museum, Jean Jennings Bartik recalled how she got the job working on that computer. She was doing calculations on rocket and cannon trajectories by hand in 1945 when a job opened up to work on a new machine.

“This announcement came around that they were looking for operators of a new machine they were building called the ENIAC,” recalls Bartik. “Of course, I had no idea what it was, but I knew it wasn’t doing hand calculation.”

Bartik was one of six female mathematicians who created programs for one of the world’s first fully electronic general-purpose computers. Isaacson says the men didn’t think it was an important job.

“Men were interested in building, the hardware,” says Isaacson, “doing the circuits, figuring out the machinery. And women were very good mathematicians back then.”

Isaacson says in the 1930s female math majors were fairly common — though mostly they went off to teach. But during World War II, these skilled women signed up to help with the war effort.

Bartik told a live audience at the Computer History Museum in 2008 that the job lacked prestige. The ENIAC wasn’t working the day before its first demo. Bartik’s team worked late into the night and got it working.

“They all went out to dinner at the announcement,” she says. “We weren’t invited and there we were. People never recognized, they never acted as though we knew what we were doing. I mean, we were in a lot of pictures.”

At the time, though, media outlets didn’t name the women in the pictures. After the war, Bartik and her team went on to work on the UNIVAC, one of the first major commercial computers.

The women joined up with Grace Hopper, a tenured math professor who joined the Navy Reserve during the war. Walter Isaacson says Hopper had a breakthrough: she found a way to program computers using words rather than numbers — most notably with a programming language called COBOL.

“You would be using a programming language that would allow you almost to just give it instructions, almost in regular English, and it would compile it for whatever hardware it happened to be,” explains Isaacson. “So that made programming more important than the hardware, ’cause you could use it on any piece of hardware.”
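
As a rough, hypothetical illustration of the compiler idea Isaacson describes (this is not Hopper’s actual FLOW-MATIC or COBOL syntax), the Python sketch below translates a few English-like statements into instructions for an imaginary machine and then executes them. The same front end could, in principle, target different back ends, which is the point Isaacson is making about hardware independence.

```python
# A toy illustration of the compiler idea: English-like statements are
# translated into instructions for a tiny hypothetical machine.

SOURCE = [
    "SET PRICE TO 19",
    "SET TAX TO 2",
    "ADD TAX TO PRICE",
    "DISPLAY PRICE",
]

def compile_program(lines):
    """Translate English-like statements into (op, args) tuples."""
    program = []
    for line in lines:
        words = line.split()
        if words[0] == "SET":            # SET <var> TO <number>
            program.append(("store", words[1], int(words[3])))
        elif words[0] == "ADD":          # ADD <var> TO <var>
            program.append(("add", words[1], words[3]))
        elif words[0] == "DISPLAY":      # DISPLAY <var>
            program.append(("print", words[1]))
        else:
            raise ValueError(f"unknown statement: {line}")
    return program

def run(program):
    """A stand-in back end for 'whatever hardware it happened to be'."""
    memory = {}
    for op, *args in program:
        if op == "store":
            memory[args[0]] = args[1]
        elif op == "add":
            memory[args[1]] += memory[args[0]]
        elif op == "print":
            print(args[0], "=", memory[args[0]])

run(compile_program(SOURCE))   # prints: PRICE = 21
```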

Grace Hopper originated electronic computer automatic programming for the Remington Rand Division of Sperry Rand Corp.

AP

Hopper retired from the Navy Reserve as a rear admiral. An act of Congress allowed her to stay past mandatory retirement age. She did become something of a public figure and even appeared on the David Letterman show in 1986. Letterman asks her, “You’re known as the Queen of Software. Is that right?”

“More or less,” says the 79-year-old Hopper.

But it was also just about this time that the number of women majoring in computer science began to drop, from close to 40 percent to around 17 percent now. There are a lot of theories about why this is so. It was around this time that Steve Jobs and Bill Gates were appearing in the media; personal computers were taking off.

Computer science degrees got more popular, and boys who had been tinkering with computer hardware at home looked like better candidates to computer science departments than girls who liked math, says Janet Abbate, a professor at Virginia Tech who has studied this topic.

“It’s kind of the classic thing,” she says. “You pick people who look like what you think a computer person is, which is probably a teenage boy that was in the computer club in high school.”

For decades the women who pioneered the computer revolution were often overlooked, but not in Isaacson’s book about the history of the digital revolution.

“When they have been written out of the history, you don’t have great role models,” says Isaacson. “But when you learn about the women who programmed ENIAC or Grace Hopper or Ada Lovelace … it happened to my daughter. She read about all these people when she was in high school, and she became a math and computer science geek.”

Lovelace, the mathematician, died when she was 36. The women who worked on the ENIAC have all passed away, as has Grace Hopper. But every time you write on a computer, play a music file or add up a number with your phone’s calculator, you are using tools that might not exist without the work of these women.

Isaacson’s book reminds us of that fact. And perhaps knowing that history will show a new generation of women that programming is for girls.

Read an excerpt of The Innovators

Article source: http://www.npr.org/blogs/alltechconsidered/2014/10/06/345799830/the-forgotten-female-programmers-who-created-modern-tech

Computer History Museum Welcomes Silicon Valley Rising Stars to Advisory … – Virtual

New Members Hail from TechCrunch, Google and Version One Ventures

MOUNTAIN VIEW, Calif., Oct. 8, 2014 (GLOBE NEWSWIRE) — The Computer History Museum (CHM), the world’s leading institution exploring the history of computing and its ongoing impact on society, today announced that it has added TechCrunch’s Alexia Tsotsis, Version One Ventures’ Angela Kingyens, and Google’s Victoria Pinchin to its NextGen Advisory Board.

“One of the reasons Silicon Valley is such a special place is all the experience sharing and mentoring that happens here,” said Sunil Nagaraj, co-founder and co-chair of the NextGen Advisory Board. “Alexia, Angela, and Victoria will be a huge boost to all the events and activities we organize to this end.”

Tsotsis serves as co-editor at TechCrunch, the world’s leading startup publication, where she’s responsible for managing an editorial team of over 30 people, leading them through 3 large conferences a year in addition to producing news content day in and day out. She has a Bachelor’s degree in English from the University of Southern California.

At Version One Ventures, Kingyens is an associate investing in early-stage consumer internet, SaaS and mobile entrepreneurs across North America. Prior to joining Version One, she was a partner at Insight Data Science, a YCombinator-backed startup helping PhDs transition from academic research to industry careers via a six-week training program. Kingyens holds a PhD in Operations Research and Financial Engineering from the University of Toronto.

Pinchin is a product manager for Google Knowledge (Google Search). Prior to Google, she was a consultant at McKinsey & Company, and worked at Amazon.com and National Instruments in a variety of technical marketing and product roles. Pinchin has an MBA from Harvard Business School.

“I’m delighted to welcome Alexia, Angela and Victoria to the board. NextGen plays a vital role in helping the Museum connect the past to the future, and they connect a new generation to the people and stories that inspire breakthrough thinking,” said Museum President and Chief Executive Officer John Hollar.

Tsotsis, Kingyens and Pinchin join existing board members Vishal Arya, Susie Caulfield, Alec Detwiler, Joel Franusic, Julia Grace, Serge Grossman, Amy Jackson, Sunil Nagaraj, Jason Shah, Jeremiah Stone and Michelle Zatlyn.

About the NextGen Advisory Board

The Computer History Museum’s NextGen Advisory Board was created to bring technology enthusiasts together over our rich history in Silicon Valley. Their events aim to bring together young professionals who love technology and the history of computing. Combined, the board has extensive experience in technology entrepreneurship, venture capital, product management, marketing and public relations, and many other professional fields at the heart of computer history. The group’s “Future History Makers” series has featured guests including Drew Houston of Dropbox, Phil Libin of Evernote, Travis Kalanick of Uber, and many other rising stars in Silicon Valley.

For more information on the NextGen Advisory Board please visit www.computerhistory.org/nextgen.

About the Computer History Museum

(C) Copyright 2014 GlobeNewswire, Inc. All rights reserved.

Article source: http://www.virtual-strategy.com/2014/10/08/computer-history-museum-welcomes-silicon-valley-rising-stars-advisory-board

The (complicated) history of HP’s PC business


HP has a long love-hate relationship with its personal computer business.








Greg Baumann
Editor in Chief, Silicon Valley Business Journal


Hewlett-Packard Co’s decision to split its personal computer and printer business away from its enterprise hardware, software and services operation is the latest gyration in a long drama. The Silicon Valley Business Journal runs down the company’s thinking on the PC business:

HP loves the PC business!— “Being in the consumer business when the ‘consumerization’ of IT is driving the entire industry is an immense competitive advantage.” — February 2011, HP CEO Leo Apotheker

HP hates the PC business! — “To be successful in the consumer device business we would have had to invest a lot of capital and I believe we can invest it in better places.” — August 2011, HP CEO Leo Apotheker

HP really loves the PC business!— “While the operating margin of this business is not as high as some of the other businesses at HP, the return on investment capital is really terrific.” — October 2011, HP CEO Meg Whitman

HP is *totally* over the PC business— “Let me be clear: One HP was the right approach,” Whitman said. “During the fix and rebuild phase of our turnaround plan, we used the strength found in being together to become stronger throughout, but of course, the marketplace never stands still in our industry.” — October 2014, HP CEO Meg Whitman

HP loves the PC business!— “As the market leader in printing and personal systems, an independent HP Inc. will be extremely well positioned to deliver that innovation across our traditional markets as well as extend our leadership into new markets like 3-D printing and new computing experiences — inventing technology that empowers people to create, interact and inspire like never before.” — October 2014, HP Inc. CEO Dion Weisler

Read more at the Silicon Valley Business Journal.

Greg Baumann is editor in chief at the Silicon Valley Business Journal.



Article source: http://www.bizjournals.com/sacramento/blog/morning-roundup/2014/10/the-complicated-oral-history-of-hewlett-packards.html

Walter Isaacson’s ‘The Innovators’ Charts the History of Computing and the …

More than a decade ago, Walter Isaacson began working on a book to highlight the history of computers and the Internet, but the project was sidelined in early 2009 when he took on the task of writing Steve Jobs’ authorized biography. That book, which debuted just weeks after Jobs’ death in October 2011, topped best seller charts and revealed a number of interesting details about Jobs and Apple.

Following the publication of Steve Jobs, Isaacson returned to his earlier project of documenting the history of computing, and that work debuts tomorrow as The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. While Apple and Jobs play relatively minor roles in the book, overall it offers an interesting look at how computers and the Internet developed into what they are today.

Isaacson breaks his book into nearly a dozen different sections, highlighting a number of advancements along the way. It begins with Ada Lovelace and Charles Babbage outlining their thoughts on a mechanical “Analytical Engine” in the 1830s and 1840s before jumping ahead nearly 100 years to Vannevar Bush and Alan Turing and their visions for the earliest computers that would follow soon after. Further sections address advances in programming, transistors, microchips, video games, and the early Internet before broaching the topics of the modern personal computer and the World Wide Web.

Throughout the book, Isaacson focuses on the importance of teamwork rather than individual genius in the development of computers, frequently involving contrasting but complementary personalities of visionaries, technical experts, and managers. Popular examples include Steve Jobs and Steve Wozniak at Apple, or Bob Noyce, Gordon Moore, and Andy Grove at Intel, but the observation extends further as time and time again teams have been responsible for many of the biggest innovations.

Innovation comes from teams more often than from the lightbulb moments of lone geniuses. This was true of every era of creative ferment. [...] But to an even greater extent, this has been true of the digital age. As brilliant as the many inventors of the Internet and computer were, they achieved most of their advances through teamwork.

Isaacson also emphasizes the importance of building on previous discoveries, including collaboration both within and between generations of scientists. A number of characters in the book appear at multiple stages, often first as innovators themselves and later helping to foster discoveries by the next generation.

Other observations include the various roles of government, academia, and business in the development of computing and how they frequently came together, particularly in the early days, to lead advancements. Isaacson also uses several cases to argue that innovation works best when different business models compete against each other, particularly in software development as with Apple’s integrated systems vying with Microsoft’s unbundled model while the free and open-source approach maintained its position in the market.

Each model had its advantages, each had its incentives for creativity, and each had its prophets and disciples. But the approach that worked best was having all three models coexisting, along with various combinations of open and closed, bundled and unbundled, proprietary and free. Windows and Mac, UNIX and Linux, iOS and Android: a variety of approaches competed over the decades, spurring each other on — and providing a check against any one model becoming so dominant that it stifled innovation.

Packing the entire history of computing into 500 pages leaves some topics feeling brief or left out altogether, but Isaacson’s book gives an interesting overview for those who may not be familiar with the technical advances stretching back decades that have given rise to the current state of the art. Focusing more on the people and relationships than the technical details, it offers some insight into how breakthroughs have been made and how some innovators have gained fame and fortune while others slipped into near obscurity.

Article source: http://www.macrumors.com/2014/10/06/the-innovators-isaacson/

History of the Microprocessor and the Personal Computer, Part 4

1984 – 1996: Consolidation of Power

The Mighty Wintel Empire

The infamous quote “There is no reason anyone would want a computer in their home” by Digital Equipment Corporation founder Ken Olsen in 1977 is a perfect study of the prevailing corporate attitude towards personal computing in the early years. Computers were mainframes and minicomputers that could cost a million dollars and were often sold in single digit numbers per month with the initial hardware cost being a fraction of the overall upgrade and service contract.

The decades before the microprocessor-based revolution were also a much more convivial and fraternal environment regarding the sharing of ideas and inventions. Between the low expectations of the companies involved at the inception of consumer electronics and the need for early allies to create a broad base of support for the budding industry, the early days saw a spirit of cooperation that has so completely eroded it is hard to believe it ever existed.

As the integrated circuit industry became more lucrative, former colleagues that had started out in a close knit community created an industrial diaspora as ideas and applications (not to mention the lure of wealth) began to exceed the existing company’s ability to bring them to fruition. Small companies that started out with camaraderie and enthusiasm soon became the monoliths that had prompted many to leave their previous jobs.

Intel could trace its own existence to the breakup of Shockley Semiconductor and Fairchild Semiconductor, and with the departures of Federico Faggin and Ralph Ungermann to start Zilog, and of David Stamm and Raphael Klein to found Daisy Systems and Xicor respectively, Intel’s Andy Grove was determined to see that the company wasn’t gutted the way Fairchild had been. Lawsuits became object lessons to those still employed at Intel, a means of protecting its IP (something Fairchild had failed miserably at), and a method of tying up a competitor’s financial resources while delaying its time to market.

Intel could trace its existence to the breakup of Shockley Semiconductor and Fairchild Semiconductor. Determined to avoid the same fate, Intel turned lawsuits into object lessons for employees, a means of protecting its IP, and a method of tying up competitors’ financial resources.

Some of these suits were completely justified: more than a few competitors saw the success of the 8088/8086 as an invitation to directly copy the Intel design, among them NEC, which went on to win the second legal battle by defending its reverse-engineered 8086 clones, the V series. Others amounted to using the patent and legal system to wage economic war on competitors.

When a group of employees led by Gordon Campbell left to capitalize on the EEPROM market by starting Seeq Technology, Intel promptly sued the new company on Grove’s order. At Arthur Rock’s behest, the lawsuit also targeted the venture capital firm that had supplied the start-up capital, which oddly enough included Intel’s Gordon Moore as an investor. Funds poured into both Seeq and the legal defense, with a relatively peaceful resolution being reached through the moderating influence of Intel’s chief legal counsel, Roger Borovoy, after Seeq had stepped into a legal minefield regarding talk of a licensing arrangement with Zilog.

The next major conflict turned into one of the longest-running and most acrimonious battles in the industry, and it occurred in large part because moderating influences were nowhere to be seen. Borovoy had left Intel, and of the eight original founders of AMD, only Jerry Sanders remained — chip architect and voice of reason Sven-Erik Simonsen being the last to depart.

Sanders led AMD for over thirty years and earned a reputation as a charismatic, outspoken CEO (Robert Cardin)

Intel’s relationship with AMD became openly hostile in September 1984 when the press asked Jerry Sanders how AMD got its Intel-licensed EPROMs to market faster than Intel had itself. Sanders launched into a lengthy tirade on Intel’s lack of manufacturing ability. With Intel anxious to be rid of AMD as a second-source partner for the 386, Sanders’ outburst added an ignition source to an already volatile situation. The agreement then in force called for a cross-license of products, but Intel declined AMD’s offerings (a hard disc controller and a graphics chip, the Quad Pixel Display Manager or QPDM), forcing AMD to make up the shortfall in royalties to Intel for its 286 license.

Litigation started in 1987 with AMD claiming breach of contract. Intel responded with a countersuit for copyright infringement (over Intel’s 287 FPU), followed by an antitrust suit from AMD and then a second copyright suit by Intel over AMD’s Am486 IP.

Both sides won their respective suits and some sense of order had been restored by 1995. AMD received $10 million and the rights to build the 386 in 1993, as well as $18 million and the rights to build the 486 and outsource up to 20% of its x86 production, while Intel received $58 million from AMD for patent infringements.

More importantly for Intel, it stalled AMD’s growth during a boom period of microprocessor growth and personal computing in particular. At a time when AMD was looking to take the next step to higher echelons of semiconductor companies — albeit still relying heavily on licensed production — the company’s expansion was severely curtailed. Worse was to follow for AMD as the original license agreement signed with Intel was due to expire on December 31, 1996.

As their original license agreement was set to expire, Intel negotiated a much tougher one where AMD would have no access to Intel’s microcode after the 486 architecture and future AMD processors after the 586-class could not be compatible with Intel sockets.

Intel negotiated a much tougher agreement just before the old one expired. In exchange for the continued use of existing Intel IP, AMD would have no access to Intel’s microcode after the 486 architecture, and future AMD processors after the 586 class could not be compatible with Intel sockets. This effectively meant that AMD was now racing against Intel’s R&D timetable, not just to produce its own processor architecture but also the supporting chipsets and mainboards. Offering a cheaper alternative to an Intel CPU for what was otherwise an Intel-based system would no longer be enough.

For Intel, the AMD suits were just one of a number centering on the ’338 patent, “the Crawford Patent”, that it was pursuing in the early 1990s, including those against UMC (which Intel won) as well as Chips and Technologies and Cyrix, both of which ended with settlements.

The early years of the microprocessor industry, as with previous IC manufacture, were based on a vertically integrated model in which the same company both designed and manufactured its chips. The mid-80s ushered in the rise of the fabless semiconductor company, which would produce the design but outsource production either to an independent company that only fabricated chips (the pure-play foundry model pioneered by TSMC), or to a design house with manufacturing ability that could produce chips for other companies if the IP was licensed and there was no conflict of interest.

A number of companies sprang up armed with technical knowledge but not the capital to invest in manufacture: NexGen (IBM), Cyrix (Texas Instruments, SGS-Thomson, IBM), Chips and Technologies (Hitachi and Toshiba), and Western Design Center being the most prominent. The list also includes Acorn, a small British company who began designing the Acorn RISC Machine processors (better known by the acronym ARM), to be fabricated by VLSI Technology.

The BBC Micro was designed and built by Acorn Computer for the BBC Computer Literacy Project. The machine also served to simulate and develop the ARM architecture which is widely used in tablets and cellphones today.

None of the x86-based architecture companies lasted the distance, but they still contributed to the furtherance of the industry. Chips and Technologies was run on a shoestring budget but produced consolidated chipsets for the IBM PC-XT and its clones, vastly reducing manufacturing costs, as well as the first IBM-compatible VGA graphics adapter, which laid the groundwork for the first wave of 2D graphics add-in board vendors.

Cyrix became a popular underdog in the processor market, combining excellent integer performance with mediocre floating-point performance, much like AMD’s designs. A lengthy battle with Intel over the right to fabricate x86 designs drained resources and was made much worse by Cyrix’s contract with foundry partner IBM, which sold Cyrix-designed chips under the IBM name at lower cost.

An ailing Cyrix would be acquired by National Semiconductor in 1997 and later sold on to VIA along with National’s pre-existing x86 license, with the exception of the MediaGX line, which would end its days as Geode under the AMD banner.

NexGen would be a peripheral player in the CPU market as its chips, like Cyrix’s, did not use Intel IP and were completely of its own design. Where Cyrix was pin-compatible with Intel sockets and thus had a ready market for the budget buyer, NexGen’s Nx586 required its own 463-pin socket and NxVL chipset, which severely limited opportunities in the market once Intel’s 430FX Triton chipset replaced the underperforming Neptune.

NexGen’s market position was made all the more precarious by the rate of Intel’s development and incremental clock speed increases, along with the rapid devaluation of previous models. This pace of change would prove too much for the company. NexGen’s follow-up processor, the Nx686, would never see the light of day in that guise, as AMD bought the company when its own in-house 686-class K6 project failed to meet performance goals. The final K6 that entered service would be a development of the NexGen design.

While the computer hardware business was being molded through vicious legal strategy, the battle for supremacy in the software market was no less intense. The success of the IBM PC and clones spawned three software empires almost overnight: Microsoft, Lotus, and Ashton-Tate.

The battle for supremacy in software was no less intense. The success of the IBM PC and clones spawned three empires almost overnight: Microsoft, Lotus, and Ashton-Tate.

The early huge success of VisiCalc had turned the two halves of the business into fierce opponents in the courtroom, a battle sparked primarily by the generous 37.5% royalty payment for retail and 50% for OEM copies that publisher Personal Software (later named VisiCorp) owed developer Software Arts. During the turmoil, Mitch Kapor, lead developer of two versatile add-ons for VisiCalc, VisiPlot and VisiTrend, sold his interest in the code to VisiCorp and set up Lotus Software. Banking on both IBM and Microsoft’s DOS succeeding in the market, Kapor and programmer Jonathan Sachs developed Lotus 1-2-3.

Bill Gates and Mitch Kapor, founder of Lotus Software (Cringley.com)

Lotus 1-2-3’s IBM compatibility proved an outstanding success, becoming a compelling reason to purchase the IBM PC just as VisiCalc had been for the Apple II. The spreadsheet program’s success was in part due to coding specifically for the PC’s Intel architecture.

Rival spreadsheet Context MBA was a more comprehensive software package but had been written in UCSD’s p-System to allow use with many architectures (a hedge against IBM and Microsoft failing in the market), at the cost of responsiveness due to the translation layer used to communicate with those dissimilar architectures.

Lotus 1-2-3 Release 3.0 for MS-DOS. (Wikipedia)

Lotus, like many companies before and since (including Intel), succumbed to second-system syndrome, where a huge first product success brings pressure to follow up with a comparable or, preferably, a bigger and better product. The company released the underwhelming Symphony and Jazz spreadsheet programs, which failed to sustain the momentum gained by 1-2-3, and Lotus retreated from software innovation to the purchasing of IP.

As the company grew, it fell into the pattern of litigating to maintain its position after Jim Manzi assumed the company presidency. A suit against Paperback Software filed in 1987 was won in June 1990, while others claiming Mosaic Software had violated the “look and feel” of 1-2-3 with the VP Planner and The Twin programs were won in June 1990 and January 1991. Additionally, Borland Software was required to remove 1-2-3 macros from its Quattro Pro spreadsheet program.

What Lotus was to PC spreadsheet programs, Ashton-Tate would duplicate with dBase, its database software. Vastly successful at first, later versions would be successively less influential, and the company’s slide from prominence was accelerated by the death of founder George Tate, which saw the marketing-oriented Ed Esber become CEO. With the company’s fortunes tied to a single product, Ashton-Tate purchased IP in the form of the Framework office suite from Forefront Corporation along with the MultiMate word processor, but corporate disorganization geared toward reaction rather than action doomed the company.

Lotus’ fortunes waned as Microsoft claimed market after market.

Lotus’ fortunes waned as Microsoft inexorably claimed market after market. A classic example would seem to be the case of Aldus’ PageMaker desktop publishing program, which was developed to integrate Apple’s LaserWriter printer with the Macintosh computer. The program’s success led Aldus to develop a word processor under the name Project Flintstone, since early versions of PageMaker had no direct text input. Upon finding out that Flintstone was a year away from completion, Bill Gates demonstrated Microsoft’s competing Word for Windows to Aldus founder Paul Brainerd, claiming that it would ship in six to nine months when in reality it was two years away from being published. Project Flintstone was summarily shelved.

Though such maneuvering certainly aided the company’s rise, Bill Gates’ Microsoft survived not because of sleight of hand, but because it effectively separated management and administration from those producing the product. It set clearly defined short-, medium- and long-term strategic goals, the longest being a reflection of Bill Gates’ personality: to be number one.

Microsoft had grown at a prodigious rate thanks to IBM, but by the late 80s it was becoming apparent that while IBM was under attack from a slew of more agile competitors, all of them relied on Microsoft’s OS and supporting applications. The expansion of the computing market and its associated software from business platform to personal computing coincided with the growth of the Internet and its accessibility.

While IBM was under attack from a slew of more agile rivals, all relied on Microsoft’s OS and supporting applications.

The Internet evolved from small, mostly closed and incompatible networks restricted to academics and developers. The adoption of common standards (notably TCP/IP and HTTP) encouraged the commercialization of what had until then been a largely government-funded system. With the subsequent expansion of the net from research, data and time sharing into a reflection of what the average consumer wants (email, online shopping, and interaction with a larger community), web browsers became big business.
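
To see how little those common standards demand, here is a minimal Python sketch of an HTTP/1.1 request written directly over a TCP socket; the host example.com is only a placeholder.

```python
# HTTP is just text sent over a TCP connection, which is the layering
# that let any browser talk to any server.
import socket

HOST, PORT = "example.com", 80

request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {HOST}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk

# The status line is the first line of the response, e.g. "HTTP/1.1 200 OK".
print(response.split(b"\r\n", 1)[0].decode())
```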

Of the first wave of browsers, the NCSA’s Mosaic was the most successful and was licensed to many companies. Mosaic developer Marc Andreessen went on to found Netscape Communications, and Netscape Navigator rapidly became the overwhelming browser choice of consumers with over 80% market share within the first year of introduction.

Microsoft’s answer was to license a version of Mosaic from Spyglass to produce Internet Explorer but uptake was slow until the company made the decision to bundle IE with the Windows 95 operating system as a free application, instantly increasing its visibility and cutting royalties to Spyglass to just the base license fee.

Microsoft would later pay Spyglass $8 million to avert legal action. However, it wound up facing long-running antitrust proceedings brought by the European Union and U.S. Department of Justice anyway for including IE with Windows, for making it difficult to use a third-party browser with Windows, and for threatening to withdraw Compaq’s Windows licensing after Compaq’s decision to bundle Netscape Navigator with its systems.

The decision to bundle IE with Windows enabled Microsoft to eclipse Netscape’s market share in three years. By 2002 IE usage had peaked at nearly 96% and would remain the industry leader for another decade until a fundamental shift in the perception of personal computing would alter the balance of power.

Internet Explorer historical usage data (Wikipedia)

That shift was a result of making the personal computer more personal, as people discovered that computing was less a static workstation than a constant companion. If not for work, then mobile computers could certainly serve as entertainment, fashion accessory, and in some cases psychological need.

Mobile personal computing began at the low end of the spectrum with calculators, while “portable” more accurately meant “luggable” given the bulk of early components such as CRT screens and floppy disc drives as well as the general standard for miniaturization of the time. The first wave of true laptop designs were very expensive business status symbols conforming to an Intel processor powered system with a half-clamshell non-backlit LCD screen usually capable of displaying four to eight lines of text, although the Hewlett-Packard HP-110 also included a 480×120 pixel graphics mode (480×200 with the HP-110 Plus), while the GriD Compass 1101 managed 320×240 for its $8,000 to $10,000 price tag.

Cheaper, less capable models also poured into the market from a host of manufacturers such as Epson (HX-20), Sharp (PC-5000), and Kyocera, whose Kyotronic 85 was also licensed out to Olivetti, Tandy, and NEC. The reduced feature set stemmed in large part from the need for cheap, low-power processors, something that would be addressed when Intel introduced the 386SL in 1990, some four years after the 386 debuted.

Intel would spend $100 million developing the 386, but the industry in general was in love with the 286 thanks to its low cost and the number of vendors offering versions of the chip. With a manufacturing cost of $141, up from $34 for the 286, the 386’s price of $900 represented a sizeable increase in expenditure for an industry not quite ready for 32-bit computing.

The PS/2 was IBM’s attempt to recapture control of the PC market by introducing an advanced yet proprietary architecture. Manufacturers stuck to “Wintel” solutions but many PS/2 innovations went on to become standards.

Intel’s “Red X” campaign was designed to bypass system builders and get consumers to identify their computing needs by the manufacturer of the processor.

Compaq’s DeskPro 386 and ALR’s Access both debuted in September 1986, a full seven months before IBM’s PS/2 underlined IBM’s fading status. With Microsoft publicly chafing at an industry clinging to 16-bit (and in some cases 8-bit) compatibility, the OEMs that usurped IBM’s position quickly found that their place as “market leader” was largely illusory. Intel’s “Red X” marketing campaign in October 1989 made a concerted effort to force the industry into 32-bit computing, bypassing system builders and appealing to the buying public.

Large full-page advertisements featuring the numbers 286 with a large red “X” sprayed over the top of them began a strategy to get consumers to identify their computing needs by the manufacturer of the processor.

The campaign was designed as much to marginalize the 286 licensees (Harris, AMD, IBM, Fujitsu, and Siemens) as it was to sell Intel’s 386s, and it quickly established in the consumer’s mind that companies pushing the 286 (including Intel’s own OEM partners) were selling obsolete technology, while elevating Intel’s brand as market leader.

Early Intel Inside Ad 386sx (Flickr user intelphotos)

The Red X campaign presented obvious proof that Intel had little need for second source partners and belatedly spurred the x86 chip producers into making their own 386-class chips. The mark of AMD’s design and manufacturing prowess was shown with the first 18 six-inch wafers, which were ready by August 1990 and yielded only a single defective Am386 die.

Between March 1991 and the end of the year the Am386 racked up $200 million from sales of two million processors, gaining a 14% share of the market, while another two million were sold in the first three months of 1992. Sales would remain buoyant even as Intel transitioned to the 486, which came in a number of guises including both 32-bit and 16-bit external/32-bit internal varieties in addition to Overdrive models, but sales hid a larger truth.

The 386 had required four and a half years to achieve a 25% market share while the 486 achieved the same in one year less, with the impending Pentium likely to better that by a considerable margin (it would achieve the feat in 18 months).

The Pentium era would see Intel distance itself from competitors and elevate its brand with consumers directly by adopting a copyrightable model name. The company would capitalize on and expand the Red X marketing with its long-running “Intel Inside” campaign, which made Intel’s brand the common identifier when the consumer was faced with a variety of system vendors. The program included TV advertising with its soon-to-be well-known five-note jingle, as well as subsidized advertising for vendors who highlighted Intel’s brand in their advertisements.

Early Intel Inside ad “Spot Intel” (Flickr user intelphotos)

Within three years, 1200 companies joined the campaign and the combined exposure elevated Intel’s sales 63% in the first full year of operation. Less publicized would be the company’s move into component manufacturing when it committed to building its own motherboards, driving hundreds of board manufacturers from the market while raising quality assurance levels with the OEMs who sourced the components.

The Pentium era would see Intel distance itself from competitors and elevate its brand with consumers directly by using a copyrightable model name.

Such was the program’s success that the OEM became a secondary consideration after the choice of processor for many shoppers — a complete reversal in five years, assuming many consumers prior to 1989 actually had any preference in CPU manufacturer. From this point forward AMD in particular would be battling against the Intel brand as much as Intel technology, but it would be far from the only company under threat.

Apple and IBM, joined by Motorola, formed the AIM alliance in July 1991 to develop a commercial PowerPC RISC-based architecture as an alternative to the growing influence of x86 “Wintel” solutions. IBM would also hedge its bets by entering into a 10-year joint venture with Intel in November of the same year to develop processors. The big loser in the power struggle would be DEC’s promising 64-bit Alpha AXP RISC architecture: IBM’s decision to pass on it as a development platform followed DEC’s own decision, five months earlier, to decline Apple’s invitation to use the 21064 in future Macintoshes.

All of Intel’s efforts could have been wiped out with Dr. Thomas Nicely’s discovery, during June 1994, of errors in the Pentium processor’s floating-point division lookup table, a little more than a year after the architecture’s introduction. Adverse publicity stemming from the lack of support Dr. Nicely received from Intel caused the issue to enter the mainstream via a CNN television report.

Intel’s official response was that all chips carry errors. In recent times 50,000 of its own 486s had been turned into key rings and Cyrix had halted production of its 486DX to fix a floating-point bug, but Intel, recently elevated into the public eye, was now under the scrutiny of consumers who expected a defective product to be replaced regardless of what it was.

The murmur of discontent became a clamor as IBM suspended Pentium shipments, and Intel’s response was decisive in the face of the earlier equivocation. Intel chose to maintain its brand over immediate profit, offering a public apology and a replacement processor for any affected customer. Intel sustained a $475 million write-down on a million affected processors, but averted any real loss in marketing momentum.

The Pentium FDIV bug, while referenced frequently by detractors, barely caused a ripple in the market, with Intel recording a 31% growth in overall semiconductor sales the following year and its CPU market share rising to 77% worldwide by units and 82% by revenue. As Intel subsumed the motherboard market, it also began relegating chipset manufacturers into also-ran status.
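
The flaw only surfaced for certain divisor bit patterns; the most widely circulated check divided 4195835 by 3145727, where affected chips returned a quotient wrong at roughly the fourth decimal place. A small Python sketch of that check follows (the erroneous value is something only flawed silicon would produce, so it is described only in the comments):

```python
from fractions import Fraction

x, y = 4195835, 3145727

exact = Fraction(x, y)                    # exact rational quotient
print("exact quotient :", float(exact))   # approximately 1.333820449136241
print("double division:", x / y)          # matches the exact value on a correct FPU

# On an affected Pentium this division came back wrong at roughly the
# fourth decimal place (about 1.3337 instead of 1.3338), which is what
# Dr. Nicely and later spreadsheet users noticed.
```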

Of the 76 million chipsets shipped in 1995, Intel contributed a mere 1.5 million while SiS, VIA, OPTi, and Acer Labs (ALi) produced almost 33 million between them. By the end of 1996, Intel’s new motherboard business enabled its chipset production to account for 40 million of the 71.4 million total produced, dooming low-end chipset manufacturers to early retirement and raising the overall quality of available components (there had been a thriving market for cheap boards with grossly overstated specifications — in some cases even non-functional fake components).

Intel’s motherboard business doomed low-end chipset manufacturers to early retirement and raised the overall quality of available components.

AMD eventually managed to get its 586-class processor into the market in March 1996. Announced the previous September as 30% faster than the Pentium on a clock for clock basis, the design fell far short of the rhetoric as well as the competing Cyrix 6×86. The planned K5 name was dropped in favor of SSA/5 prior to its release. After the CPU core and cache structure were reworked, the second wave (and first to use the K5 name) shipped in October 1996.

Much improved, it still did little to motivate sales. As with Cyrix’s part, AMD’s tended to generate more heat than the Pentium, limiting its appeal to the overclockers. AMD’s follow up, the K6, proved to be what the previous design had aspired to.

The K6 was based on the Nx686 microprocessor that NexGen was designing when it was acquired by AMD (Wikipedia)

With Intel beginning to ship samples of the Pentium II to OEM vendors the previous month, the April 2, 1997 launch provided much needed marketing exposure for the company and enabled AMD to cease production of the K5 within a few months. High-profile orders from DEC for the Venturis FX-2 and from IBM for the budget Aptiva line would help bring the K6 into the market. However, overall market share would slip to under 10% in large part due to Intel’s mobile Pentium sales and a strong overall last quarter leading into the holiday season where AMD’s 4.4% of sales paled against Intel’s 91.1%.

AMD’s 586-class processor fell far short of its faster-than-Pentium rhetoric and a reworked, much improved revision did little to motivate sales. The follow up K6 finally proved to be what the previous design had aspired to.

The arrival of the “Chomper” K6-II in May 1998 would lift AMD to 12% of x86 sales for the year as personal computer sales broke through the 100 million barrier. The K6-II would dominate the sub-$1,000 desktop market with its improved performance and friendly price tags (the 366MHz version started at $187), outselling Intel’s Celeron and Cyrix’s MII combined two to one in its first full quarter of availability.

By the late 1990s, personal computing was seeing the full effects of economies of scale. CPUs and chipsets consolidated the functions of many individual integrated circuits into fewer, more cost-effective parts, while processors exhibited enough speed that a range of capable CPUs was available via binning and speed reduction for low-power mobile and low-cost products.

The limiting factor in growth became the ability to manufacture the chips fast enough to satisfy demand. Sales of personal computers grew in excess of 10% a year on the back of home productivity applications and the developing 3D graphics market, which had begun blooming with the arrival of the 3dfx Voodoo Graphics board and major games such as Valve’s Half-Life in 1998.

A sign of the shift in personal computing arrived on February 8, 1999, when Free-PC announced its campaign offering free Compaq PCs with Internet access in exchange for usage tracking and on-screen advertising. By the time the campaign finished in February 2000, 25,000 customers had signed up and many small computer sales companies had been put out of business.

The direct model of shaping the internet and those who use it would in the future become more indirect as personal computing moved into the new millennium.

This article is the fourth installment in a series of five. If you enjoyed this, make sure to join us next week as we wrap up the series with a little more on the Intel-AMD rivalry and ARM’s role in bringing personal computing to the next step in its evolution. If you feel like reading more about the history of computing, check out our feature on the rise and fall of AMD. Until next week!

Header image via Flickr user creative_stock

Article source: http://www.techspot.com/article/899-history-of-the-personal-computer-part-4/

JPMorgan hack exposed data of 83 million, among biggest breaches in history

Article source: http://www.thanhniennews.com/tech/jpmorgan-hack-exposed-data-of-83-million-among-biggest-breaches-in-history-32014.html

Art History Series Kicks off Monday with Lecture on Computer Art 1.0

Since the invention of the computer, artists have sought ways to fuse the traditional arts and emerging digital tools, and after 50 years of varying forms of computer art, a new extension of art history has taken shape. 

On Monday, computer art pioneer Frieder Nake will discuss “Algorithmics and Aesthetics — On Digital Images,” as part of the newly created Ad Astra Lecture Series: Art, Architecture, Science, Technology Research Alliance. The series is part of the efforts of UT Dallas’ Edith O’Donnell Institute of Art History.

Frieder Nake

Nake, a mathematician, is professor of interactive computer graphics and digital media at The University of Bremen and The University of the Arts Bremen, Germany. His lecture, which will be at 4 p.m. in TI Auditorium, ECSS 2.102, will focus on how digital art has changed over time. 

“Computer art established itself as a very special aspect of conceptual art,” said Nake, a recent nominee for the Visionary Pioneer of Media Art award of Prix Ars Electronica. 

“Only with interactive art, however, digital art gained enough autonomy and started formulating its own aesthetic questions. We will take a look at generative aesthetics, and show some, perhaps surprising, connections to the broad stream of art history.”
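
For readers curious what generative aesthetics can look like in practice, the sketch below is a minimal, illustrative Python example that scatters short strokes on a grid and writes them out as an SVG file. It is not a reconstruction of Nake’s own plotter programs, only a nod to the idea that a fully specified algorithm determines the picture.

```python
# A minimal generative-drawing sketch in the spirit of early algorithmic art:
# a grid of short line segments whose angles are chosen pseudo-randomly.
import math
import random

WIDTH, HEIGHT, STEP, SEED = 400, 400, 20, 42

def generate_svg(path="algorithmic_sketch.svg"):
    rng = random.Random(SEED)   # fixed seed: the program fully determines the image
    lines = []
    for y in range(STEP, HEIGHT, STEP):
        for x in range(STEP, WIDTH, STEP):
            angle = rng.uniform(0, math.pi)
            dx, dy = 0.5 * STEP * math.cos(angle), 0.5 * STEP * math.sin(angle)
            lines.append(
                f'<line x1="{x-dx:.1f}" y1="{y-dy:.1f}" '
                f'x2="{x+dx:.1f}" y2="{y+dy:.1f}" stroke="black"/>'
            )
    svg = (
        f'<svg xmlns="http://www.w3.org/2000/svg" width="{WIDTH}" height="{HEIGHT}">'
        + "".join(lines) + "</svg>"
    )
    with open(path, "w") as f:
        f.write(svg)

if __name__ == "__main__":
    generate_svg()
```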

Named after the Latin phrase ad astra, meaning “to the stars,” the Ad Astra Lecture Series takes its energies from the utopian vision of the ancient past.

Luciano Chessa

Under the direction of Dr. Charissa N. Terranova, associate professor of aesthetic studies, the series features emerging and established practitioners from art, science and technology with a goal of expanding art history. 

“With its strong grounding in science and technology, UT Dallas is uniquely poised to expand and take the field of art history in new directions,” she said. “The series builds on the unique expertise of its faculty — in impressionism, post-impressionism, the Renaissance and early modern cartography, big data, complex systems, and biology in the history of art and architecture — in creating a new landscape of art, science and technology united.

“Art history done in ‘UTD fashion’ means bold reciprocity: an art history that benefits from the manifold precepts and utilitarian practices of science and engineering labs and, vice versa, science and engineering programs that sharpen skills of abstract and critical thinking through interaction with the humanists of art history.”

Ad Astra Lecture Series

The series will continue during the spring semester. Times, locations and more details will come later. Here are the scheduled speakers:

Jan. 14, 2015: Sean B. Carroll, award-winning scientist, writer, educator and film producer, will present “Jacques Monod and Albert Camus: A Scientist’s and a Philosopher’s Daring Adventures from the French Resistance to the Nobel Prize.”

March 7-8, 2015: Sophia Roosth, assistant professor, history of science, Harvard University, will give talks on synthetic biology as part of Ad Astra, RAW (the Arts Humanities Graduate Student Association annual conference) and the Arts and Technology Colloquia. Roosth’s research focuses on the 20th- and 21st-century life sciences.

The second lecture in the series will also be next week. Luciano Chessa will present “Music the Dead Can Hear: Theosophical Presences in Luigi Russolo’s Art of Noises” at 3 p.m. Thursday in the Edith O’Donnell Arts and Technology Building, Room 3.805B.

Chessa, musicologist, composer, performer and professor at the San Francisco Conservatory of Music, is the author of Luigi Russolo, Futurist: Noise, Visual Arts, and the Occult. His book and lecture present a new interpretation of the mechanical sound synthesizers that painter and musician Russolo created beginning in 1913.

Chessa, along with other Italian scholars, has established the prominence of the occult in early 20th-century Italian culture. There it operated in tandem with contemporary scientific ideas about X-rays and wireless telegraphy — all with an emphasis on waves and vibrations and their new communicative potential. Chessa argues that Russolo’s multileveled experiments were designed to reach higher states of spiritual consciousness.

Both lectures are free and open to the public.

Media Contact: Chaz Lilly, UT Dallas, (972) 883-4461, charles.lilly@utdallas.edu
or the Office of Media Relations, UT Dallas, (972) 883-2155, newscenter@utdallas.edu.





Article source: http://www.utdallas.edu/news/2014/10/3-31225_Art-History-Series-Kicks-off-Monday-with-Lecture-o_story-wide.html?WT.mc_id=NewsHomePage

Computer History Museum is Going on the Road With Acclaimed Speaker Series …

MOUNTAIN VIEW, Calif., Oct 03, 2014 (GLOBE NEWSWIRE via COMTEX) —

The Computer History Museum, the world’s leading institution exploring the history of computing and its impact on society, announced today that it is taking its acclaimed speaker series on the road, and the first stop will be NPR’s corporate headquarters and digital news center in Washington, DC.

“Revolutionaries” is the Museum’s acclaimed speaker series and is broadcast throughout the world on multiple platforms. It features renowned innovators, business and technology leaders, and authors in enthralling, educational conversations, often with leading journalists. Our audiences gain insight into the remarkable process of innovation, its risks and rewards, and the failures that led to ultimate success.

Participants have included author and Pulitzer Prize winner Jane Smiley; gaming legend Jane McGonigal; journalists Steven Levy, David Kirkpatrick, and John Markoff; Facebook founder Mark Zuckerberg; former IBM chairman Sam Palmisano; Dreamworks co-founder Jeffrey Katzenberg; author Walter Isaacson on his Steve Jobs biography; Yahoo CEO Marissa Mayer; Ford Motor Company chairman Bill Ford; Google executive chairman Eric Schmidt; Tesla founder Elon Musk; Calif. Lt. Gov. Gavin Newsom; Facebook COO Sheryl Sandberg; and many others.

“Taking the series on the road a few times a year gives us the opportunity to expand the influence and awareness of the Museum and ‘Revolutionaries,’” said “Revolutionaries” series creator Carol Stiglic. “We have had the good fortune to have partners like Intel and KQED supporting the series, and we have plans to stage “Revolutionaries” programs at both venues in the New Year. And, there may be some additional geographic surprises coming in 2015.”

Tonight, Museum CEO John Hollar will lead an in-depth conversation with Steve Case, one of America’s best-known and most accomplished entrepreneurs and philanthropists, and a pioneer in making the Internet part of everyday life. They will discuss his passion for starting companies and supporting entrepreneurs, his roller-coaster ride at the top of AOL, his work in the public policy arena, and his philanthropic endeavors. They will also discuss his “Rise of the Rest” tour, giving entrepreneurs outside of Silicon Valley the opportunity to compete for startup funding. For more information on “Revolutionaries” and live shows visit www.computerhistory.org/events.

ABOUT REVOLUTIONARIES

“Revolutionaries” is the Computer History Museum’s acclaimed speaker series, featuring renowned innovators, business and technology leaders, and authors in enthralling conversations often with leading journalists. Our audiences learn about the process of innovation, its risks and rewards, and the failures that led to ultimate success.

ABOUT COMPUTER HISTORY MUSEUM

The Computer History Museum in Mountain View, California, is a nonprofit organization with a four-decade history as the world’s leading institution exploring the history of computing and its ongoing impact on society. The Museum is dedicated to the preservation and celebration of computer history and is home to the largest international collection of computing artifacts in the world, encompassing computer hardware, software, documentation, ephemera, photographs, and moving images. The Museum brings computer history to life through large-scale exhibits, an acclaimed speaker series, a dynamic website, docent-led tours, and an award-winning education program.

The Museum’s signature exhibition is “Revolution: The First 2000 Years of Computing,” described by USA Today as “the Valley’s answer to the Smithsonian.” Other current exhibits include “Charles Babbage’s Difference Engine No. 2,” “Where To: The History of Autonomous Vehicles,” “IBM 1401 Demo Lab,” and “DEC PDP-1.”

For more information and updates visit www.computerhistory.org, check us out on Facebook, and follow @computerhistory on Twitter.

A photo accompanying this release is available at: http://www.globenewswire.com/newsroom/prs/?pkgid=28143

CONTACT: Carina Sweet, csweet@computerhistory.org, (650) 810-1059

Copyright (C) 2014 GlobeNewswire, Inc. All rights reserved.

Article source: http://www.marketwatch.com/story/computer-history-museum-is-going-on-the-road-with-acclaimed-speaker-series-revolutionaries-2014-10-03?reflink=MW_news_stmp

In Level Five, a computer-game programmer tries to erase the history of …

French filmmaker Chris Marker all but created his own genre, interweaving elements of documentary, fiction, experimental, and essay filmmaking into vibrant cine-mosaics. His 1997 feature Level Five, screening for the first time in Chicago this weekend, is characteristically dense and flowing, much easier to watch than to summarize. It centers on Laura (Catherine Belkhodja), a fictional computer programmer working on an interactive online game in which players will “re-create” the Battle of Okinawa in World War II by retrieving historical materials from a vast, decentralized virtual library, then ordering the events down to the last detail. As Laura develops the game, she begins to learn about the atrocities that accompanied the battle and is so devastated that she tries to alter the course of history in her re-creation, only to find that her program has acquired a will of its own.

The movie often departs from Laura’s story—which Marker presents in the form of her video diary—to incorporate not only testimonies that appear in her game but also Marker’s personal reflections, which he delivers in voice-over narration of documentary footage he shot in Japan and elsewhere. Marker connects these various elements in an associative manner, often digressing to consider such topics as the opening of Japan to the West in the 1850s and the musical theme to Otto Preminger’s noirish romance Laura. The narrative structure of Level Five corresponds to that of Laura’s game: rather than arrive at a single insight, we’re meant to roam around a world of ideas.

At one point in Level Five, Marker muses that an exploratory role-playing game might provide avenues into the past that more conventional historical narratives cannot. He might have tested this thesis with any historical episode, but it takes on special resonance when applied to the Battle of Okinawa. The Japanese interviewed here assert that their nation has never truly confronted Okinawa, where more than 150,000 civilians were killed by the Imperial army or committed suicide at its behest before U.S. troops landed in April 1945. That heinous episode has been paved over during Japan’s postwar reconstruction; Marker, a frequent visitor to Japan, says he’s found no traces of it whatsoever: “I’d become so Japanese, I shared in their collective amnesia—as though the war never happened.” Level Five suggests a historical intervention, a call for people everywhere—not only in Japan—to own up to this crime against humanity. Marker presents this reclamation of lost history as a triumph of reality over nationalistic fantasy; just as Laura’s program comes to command her, so too does history have a way of writing us.

Article source: http://www.chicagoreader.com/chicago/chris-marker-level-five-catherine-belkhodja-battle-of-okinawa-online-gaming/Content?oid=15086361

Computer History Museum Makes Historic CP/M Operating System Source Code …

MOUNTAIN VIEW, Calif., Oct. 1, 2014 (GLOBE NEWSWIRE) — The Computer History Museum (CHM) announced today that it has made available original source code for early versions of CP/M (Control Program for Microcomputers). Written by Gary Kildall to transfer data from the era’s new floppy disk drive storage units to an Intel 8080 microprocessor-based computer, CP/M became the dominant operating system for hobbyist and small business system users in the late 1970s.

“CP/M was unlike most other operating systems in that it consumed very little memory space and could be ported to run on many different personal computer models of the era,” said Len Shustek, chairman of the Museum’s board of trustees. “Combined with its early availability and low cost, this made CP/M a runaway success and laid an important foundation for the personal computer revolution.”

The Institute of Electrical and Electronics Engineers (IEEE) has recognized the development of CP/M as an IEEE Milestone in Electrical Engineering and Computing by installing a bronze plaque outside the former headquarters of Kildall’s company Digital Research, Inc. in Pacific Grove, California. To mark the 40th anniversary of the prototype demonstration in Kildall’s backyard toolshed in the fall of 1974, the Computer History Museum is pleased to make available, for non-commercial use, the source code of several of the early releases of CP/M.

The Museum is releasing scanned printer listings and/or machine-readable source code for four early versions of CP/M dating from 1975 to 1979. The first is the earliest source code for CP/M the Museum has been able to locate, dating from before there were official version numbers. It was used at Lawrence Livermore National Laboratory for its Octopus network system. Next is Version 1.3, from 1976, which was the first release to include the BIOS (Basic Input Output System) code that made it easy to modify the software for different computers; this version also includes an amazing 48-page reverse-engineered source code listing with a hand-annotated disassembly of the object code. Lastly, the Museum is releasing Versions 1.4 and 2.0, which allowed compilation and assembly on personal computers and considerably expanded and generalized access to disks.
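To give a sense of what that BIOS layer accomplished, here is a minimal sketch in modern C. It is not CP/M’s actual source, and every name in it (bios_t, console_bios, echo_line) is hypothetical; it only illustrates the general technique of isolating machine-dependent I/O behind a small table of routines so that porting the system means rewriting just that table.

/*
 * Illustrative sketch only -- not CP/M source code. The portable code calls
 * hardware only through a small table of machine-specific routines (a
 * BIOS-like layer); supporting a new machine means supplying a new table.
 */
#include <stdio.h>

/* Machine-dependent entry points, analogous to a BIOS jump table. */
typedef struct {
    int  (*con_in)(void);        /* read one character from the console */
    void (*con_out)(int c);      /* write one character to the console  */
} bios_t;

/* One possible "port": a BIOS backed by the C standard library. */
static int  stdio_con_in(void)   { return getchar(); }
static void stdio_con_out(int c) { putchar(c); }

static const bios_t console_bios = { stdio_con_in, stdio_con_out };

/* Portable code: knows nothing about the hardware, only the BIOS table. */
static void echo_line(const bios_t *bios) {
    int c;
    while ((c = bios->con_in()) != EOF && c != '\n')
        bios->con_out(c);
    bios->con_out('\n');
}

int main(void) {
    echo_line(&console_bios);   /* swap in a different bios_t to "port" it */
    return 0;
}

In CP/M itself, the machine-specific part was a small set of assembly-language entry points supplied for each computer, while the rest of the system called through them; that separation is why adapting it to a new machine meant rewriting only the BIOS.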

“The Museum thinks preserving historic source code like these programs is key to understanding how software has evolved from its primitive roots to become a crucial part of our civilization,” said Shustek.

For a blog post about the release of this source code, please visit:

http://www.computerhistory.org/atchm/early-digital-research-cpm-source-code/

For other releases in the Museum’s historic source code series, please see:

APPLE II DOS –

http://www.computerhistory.org/atchm/apple-ii-dos-source-code/

IBM APL –

http://www.computerhistory.org/atchm/the-apl-programming-language-source-code/

Apple MacPaint and QuickDraw –

http://www.computerhistory.org/atchm/macpaint-and-quickdraw-source-code/

Adobe Photoshop –

http://www.computerhistory.org/atchm/adobe-photoshop-source-code/

Microsoft Word for Windows Version 1.1 –

http://www.computerhistory.org/atchm/microsoft-word-for-windows-1-1a-source-code/

MS-DOS –

http://www.computerhistory.org/atchm/microsoft-ms-dos-early-source-code/

About the Computer History Museum

The Computer History Museum in Mountain View, California, is a nonprofit organization with a four-decade history as the world’s leading institution exploring the history of computing and its ongoing impact on society. The Museum is dedicated to the preservation and celebration of computer history and is home to the largest international collection of computing artifacts in the world, encompassing computer hardware, software, documentation, ephemera, photographs, and moving images. The Museum brings computer history to life through large-scale exhibits, an acclaimed speaker series, a dynamic website, docent-led tours, and an award-winning education program.

The Museum’s signature exhibition is “Revolution: The First 2000 Years of Computing,” described by USA Today as “the Valley’s answer to the Smithsonian.” Other current exhibits include “Charles Babbage’s Difference Engine No. 2,” “IBM 1401 and PDP-1 Demo Labs,” and “Where To? The History of Autonomous Vehicles.”

For more information and updates, call (650) 810-1059, visit www.computerhistory.org, check us out on Facebook, follow @computerhistory on Twitter, and read the Museum blog @chm.

Carina Sweet
(650) 810-1059

Article source: http://globenewswire.com/news-release/2014/10/01/670067/10100880/en/Computer-History-Museum-Makes-Historic-CP-M-Operating-System-Source-Code-Available-to-the-Public.html