Computer History Museum Makes Historic MS-DOS and Word for Windows …

The Computer History Museum (CHM) announced today that it has, with permission from Microsoft Corporation, made available original source code for two historic programs: MS-DOS, the 1982 “Disk Operating System” for IBM-compatible personal computers, and Word for Windows, the 1990 Windows-based version of their word processor.

IBM went outside the company for many hardware and software components of their 1981 personal computer. Though most vendors were kept in the dark about the project, code-named “Chess,” IBM developed a unique relationship between their Boca Raton-based team and Microsoft, then a small company based in Seattle.

Microsoft, which was providing the BASIC language interpreter, agreed to also supply an operating system. Lacking an operating system of their own, they licensed a product from nearby Seattle Computer Products and worked closely with IBM to make the changes IBM wanted. It shipped as “PC-DOS” for IBM and “MS-DOS” for other PC manufacturers. We are today releasing the source code of MS-DOS version 1.1 from 1982, and of version 2.0 from 1983.

“Version 1.1 fits an entire operating system – limited as it was – into only 12K bytes of memory, which is tiny compared to today’s software,” said Len Shustek, Museum Chairman.

Microsoft’s DOS-based version of Word, first released in 1983, was not a success against the dominant word processor of that era, WordPerfect. The 1989 release of Word for Windows changed all that: within four years it was generating over half the worldwide word processing market revenue. It was a remarkable marketing and engineering achievement. We are today revealing the technical magic by releasing the source code to version 1.1a of Word for Windows.

“MS-DOS and Word for Windows built the foundation for Microsoft’s success in the technology industry,” said Roy Levin, distinguished engineer and managing director, Microsoft Research. “By contributing these source codes to the Computer History Museum archives, Microsoft is making these historic systems from the early era of personal computing available to the community for historical and technical scholarship.”

“We think preserving historic source code like these two programs is key to understanding how software has evolved from primitive roots to become a crucial part of our civilization,” said Shustek.

For a blog post about the release of this source code, please visit: Microsoft DOS and MS Word for Windows.

For other releases in the historic source code series, see:

Apple II DOS, IBM APL, Apple MacPaint and QuickDraw, Adobe Photoshop

About the Computer History Museum

The Computer History Museum in Mountain View, California is a nonprofit organization with a four-decade history as the world’s leading institution exploring the history of computing and its ongoing impact on society. The Museum is dedicated to the preservation and celebration of computer history, and is home to the largest international collection of computing artifacts in the world, encompassing computer hardware, software, documentation, ephemera, photographs and moving images.

The Museum brings computer history to life through large-scale exhibits, an acclaimed speaker series, a dynamic website, docent-led tours and an award-winning education program. The Museum’s signature exhibition is “Revolution: The First 2000 Years of Computing,” described by USA Today as “the Valley’s answer to the Smithsonian.” Other current exhibits include “Charles Babbage’s Difference Engine No. 2,” and “Going Places: The History of Google Maps with Street View.”

For more information and updates, call (650) 810-1059, check us out on Facebook, and follow us @computerhistory on Twitter.

        CONTACT: Carina Sweet

(C) Copyright 2014 GlobeNewswire, Inc. All rights reserved.


Digital Den Creator Pushes For A New Boston-Area Computer Museum

The role Massachusetts has played in the history of computing is undeniable.

However, with all the historic ideas, innovators, and companies — SpaceWar, VisiCalc, Digital Equipment Corporation, BBN, Data General, Prime, Wang, and, of course, MIT (more…)


IGN Presents: the History of Atari

To those born into the console era, whose formative gaming education came from Nintendo, Sega, or PlayStation, Atari feels like an amorphous presence in the world of videogames: a once-important name that has been diluted by countless mergers, acquisitions and bankruptcies. A titan of the arcade era whose relevance had dwindled almost to nothingness by the turn of the millennium.

Many younger gamers have little idea of the extent to which this one company laid the foundation of the modern video game industry, beyond recognizing the name, and perhaps knowing that the Atari 2600 was an early home console. But the truth is that the modern video game industry owes almost everything to Atari and its two founders.

Atari co-founders Ted Dabney and Nolan Bushnell, with head of finance Fred Marincic and Pong creator Allan Alcorn.

Atari was a defining force in both arcades and home computers throughout the 1970s and 80s (it wasn’t until 1993 that it finally shut down its computer manufacturing arm). In one form or another, it brought us everything from Pong to Tempest, Centipede to the famously dreadful E.T. The Video Game. But Atari’s games are only part of the story. Atari’s founders invented the video game arcade cabinet, helping to create the arcade culture that gave birth to modern video games. Without Atari, the history of games would have been completely different. The story of its rise and its many, varied deaths is a fascinating one that spans the entirety of modern gaming’s history, from the early 70s to its latest bankruptcy in January 2013.

The variety of corporate metamorphoses that Atari has undergone over the years is such that its history becomes difficult to untangle after a certain point, but Atari’s story starts as world-changing things very often do: with one person and a great idea. Atari’s two founders, Nolan Bushnell and Ted Dabney, met in 1969, when they were both working for a company called Ampex in Redwood City, California. Years earlier, as an electrical engineering student in Utah, Bushnell had developed a fascination with one of the very first video games, Spacewar, developed on an improbably giant computer at the Massachusetts Institute of Technology in 1962 by Steve Russell and his collaborators. He’d sneak into the college’s computing lab at night with a fraternity brother to play it.

Spacewar! in action. Photo by Joi Ito.

Bushnell’s college was important. Computer graphics were invented at the University of Utah in the 1960s by a man named Ivan Sutherland, one of computer science’s pioneers. The University had, at the time, state-of-the-art computer equipment. This made Bushnell one of a relatively small number of people who could play the earliest video games, including Spacewar, on campus computers.

While attending school, Bushnell also worked in an “amusement arcade” called Lagoon Amusement Park during the holidays, and it occurred to him that the electronic game could work as a coin-operated machine. Arcades at that time were halls of pinball cabinets and other coin-operated entertainments, like slot machines and ball-throwing gambits and other trivial games of skill and chance. What Bushnell essentially envisioned, though, was the 1980s arcade, packed with glowing coin-op game cabinets and spellbound teens –  places where an entire generation would fall in love with video games. These places would not have happened without him, and his company, Atari, would become one of the biggest names in this future world.

In post-war America pinball was demonised in the same way that video games frequently have been in the decades since. In the 1940s and 50s, the most rebellious, coolest thing you could do as a young person in many parts of America was to hang out near a pinball machine. Parents and other worried adults banded together to protest the machines, fearing that their children were being corrupted by their bright, noisy influence, transformed into time-wasting entertainment junkies and being led into gambling. Pinball machines were actually made illegal in some parts of the country – perhaps most famously, New York mayor Fiorello LaGuardia ordered the seizure of thousands of machines in January 1942 and smashed them up for materials to help with the war effort. Pinball remained technically illegal in New York until 1976. Imagine, against this backdrop of moral panic, how people reacted to the introduction of electronic video games, and to the transformation that the arcade would undergo.

This was computing, in the 1950s. Photo: U.S. Army.

But in the early 1960s, computers still required a small room to house them. It wasn’t until the tail-end of the decade that Bushnell, along with Ted Dabney, would develop the first ever coin-operated arcade machine for a company called Nutting Associates. It was called Computer Space. The game was released in 1971, and although it fell short of the manufacturer’s expectations and was considered something of a failure by Nutting (it was just too complicated to catch on in a big way outside of college campuses, Nolan later posited), it still sold 1500 units and made Bushnell and Dabney enough money to strike out on their own and continue making coin-operated electronic games.

Pong was the first game program Al Alcorn ever created.

Their company – originally called Syzygy Co. – was founded in 1971. Upon discovering that the name was already in use in California, the duo changed it to Atari, Inc. in 1972. The name came from the ancient Chinese board game Go, of which Bushnell was a fan; he essentially chose the company’s name from amongst its strange jargon. The Japanese word “atari” literally means “to hit the target” and is associated with good fortune; in the context of Go, it means something closer to “I’m about to win” – like “check” in chess. Other name candidates, reportedly, were Sente and Hane.

Dabney invented the early technology that allowed dots to move on a screen without the assistance of an extremely expensive computer, and thereby essentially invented modern video games. It was called the Spot Motion Circuit, and it allowed a dot to move up, down, left and right on a screen. It was a different world from the supercomputers that Spacewar was running on, as it allowed dedicated cabinets to be manufactured at a reasonable cost with built-in boards. It was essentially the invention of the video game arcade cabinet.

The mediocre-performing Computer Space was the first ever commercially-sold video game, but it was the newly-founded Atari’s first game that would set the stage for the rapid evolution and soaring popularity of the arcade. In 1972, Bushnell attended a demonstration of the first-ever home video game console, the Magnavox Odyssey – a brown-and-beige plastic box released in August 1972 that played a small variety of silent games, including Table Tennis, a competitive tennis game that probably looks pretty familiar to you. The Odyssey sold around 330,000 units across North America and Europe, where it was released in 1973.

Pong would change everything. Photo: Chris Rand.

Magnavox’s tennis game was far from the first, of course. On the University of Utah campus computers, Bushnell likely played a few of them; a version of tennis called Tennis for Two was created as far back as 1958.

But none would break out like Atari’s Pong, released in 1972. It wasn’t Bushnell himself who created the program for Atari, but a new hire by the name of Al Alcorn, who had worked at Ampex alongside Atari’s founders as a junior engineer and had never so much as seen a video game until Bushnell showed him Computer Space. Pong was the first game program he ever created. Not bad, as far as starts go.

Nobody actually expected Pong to go anywhere; Al Alcorn, famously, was assigned it as a project to test his abilities, and it was never intended to be a commercial product. But what Al made, after months of work making it more efficient, turned out to be a lot of fun. The differences between Pong and the Magnavox tennis games might not seem that obvious now, but they were hugely significant then, especially within the technical confines of the time. Pong’s ball sped up the longer the game went on, and pinged off the paddles at different angles depending on where it was hit. The gaps at the top of the screen, actually the result of a quirk in the technology rather than intention, ensured that no game of Pong could go on forever, that there was always that tiny space for the ball to slip past. Plus, it had sound. That might not sound like much, but it turned digital tennis from absurdly dull to incredibly addictive.
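The speed-up and angle-of-deflection mechanics described above are easy to sketch in code. This is a hypothetical toy model, not Atari’s actual design (the original Pong was discrete hardware, not software), and all names and constants are invented for illustration:

```python
# Toy sketch of two Pong mechanics: rebound angle depends on where the ball
# strikes the paddle, and the ball speeds up as a rally goes on.

def rebound(hit_offset, speed, hits):
    """Return (angle_degrees, new_speed) for a ball striking a paddle.

    hit_offset: -1.0 (paddle top) through 0.0 (center) to 1.0 (paddle bottom)
    speed: current ball speed
    hits: number of paddle hits so far in the rally
    """
    max_angle = 60.0                    # steepest possible deflection
    angle = hit_offset * max_angle      # a center hit returns the ball flat
    # every fourth exchange, the ball gets a little faster
    new_speed = speed * 1.05 if hits % 4 == 0 else speed
    return angle, new_speed

# A hit halfway down the paddle on the fourth exchange: a 30-degree
# deflection and a slightly faster ball.
angle, speed = rebound(0.5, 4.0, 4)
```

The key idea, whatever the exact numbers, is that the player controls more than whether the ball comes back: where it hits the paddle decides where it goes next.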


How to delete cookies and browsing history in Internet Explorer, Google Chrome …

Since you’re reading this, you’ve probably heard bad things about so-called ‘cookies’ and how you should delete cookies from your PC or laptop. We’ll show you how to delete cookies but before you start, it’s worth understanding a cookie’s role and why you might actually want to keep it.

See all internet tutorials.

Cookies store information about you and your preferences on your hard disk, and websites use this information for various purposes. Some cookies are beneficial because they save you from having to set your preferences each time you visit a website. For example, you might change the currency on a shopping website from Euros to British Pounds. Without a cookie to store that information, you’d have to make that change each time you shopped on that site.
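The currency example can be made concrete with Python’s standard http.cookies module. The cookie name, value and lifetime below are invented for illustration; real sites choose their own:

```python
# Hypothetical sketch of a currency-preference cookie.
from http.cookies import SimpleCookie

# The Set-Cookie header a shop's server might send once you pick Pounds:
cookie = SimpleCookie()
cookie["currency"] = "GBP"
cookie["currency"]["max-age"] = 60 * 60 * 24 * 365  # remember for a year
print(cookie.output())  # e.g. Set-Cookie: currency=GBP; Max-Age=31536000

# On later visits the browser sends the value back, so the site can show
# prices in pounds without asking again:
returned = SimpleCookie("currency=GBP")
print(returned["currency"].value)  # -> GBP
```

Deleting that cookie is exactly what resets the preference: on your next visit the browser has nothing to send back.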

Some cookies store a lot more information, which some people might consider private and sensitive. Again, this could be used to make your life easier, but if that information includes which websites you visited after looking at a previous site – thereby tracking you as you browse the web – then you probably don’t want that on your computer.

Cookies can send data back to the webpages that you visit; and what the website (or owners of the site) do with this data is where the danger lies. This is why deleting your cookies and – to a lesser degree – your browsing history is a good thing to do periodically.

Again, you may not want to delete your history, since it’s convenient to have your browser auto-complete website addresses as you begin typing, prioritising sites you’ve visited recently over search engine results. You might also like to search your history to find a page you forgot to bookmark weeks or months ago.

With this in mind, deleting your cookies and internet browsing history is really very simple. We’ll cover the four main web browsers here, and you’ll need to follow the instructions for all the browsers you use, since they don’t share a common pool of cookies and visited websites.

How to delete cookies and browsing history: Microsoft Internet Explorer

Go to Tools (the cog icon at the top-right), and choose Internet Options. From here you will see a Browsing History heading halfway down the new window; click on Delete… A new window will pop up that will let you delete your cookies, history, temporary internet files, and form data too (this is typically your email, phone number, address and also passwords you’ve allowed your browser to save and use to fill out forms in the future).

In Internet Explorer 11, the latest version, there’s an additional option: Preserve Favourites website data. This automatically keeps cookies and temporary files for your favourite websites, and is worth ticking unless you really want to delete everything.

How to delete cookies and browsing history: Google Chrome

Located to the top right of the Chrome window you will see an icon with three horizontal bars. Click on this and then select Settings from the list.

From here you need to select History from the left-hand column. Now click the Clear Browsing Data button and tick the Cookies box. Note that the drop-down box at the top lets you choose the time period. This is handy as you can (as Google says) obliterate data from the past hour, day, week, last 4 weeks or everything.

Depending on the options you tick in the list, you can use this duration to selectively delete cookies, browsing history or any other items from the list.

Chrome also lets you delete specific cookies. To do this, instead of clicking on History as described above, click on Settings and – if necessary – click Show advanced settings. In the Privacy section, click the Content settings button.

In the Cookies section you can delete and block specific cookies. To see the list of stored cookies, click the All cookies and site data… button.

There will probably be a long list, and it’s an arduous task to go through them all to work out which to keep and which to delete, but it’s possible if you want to keep cookies for your favourite sites.

How to delete cookies and browsing history on Firefox

Click on the Firefox drop-down menu located in the top-left of the main window; from here, hover over History and choose Clear Recent History… from the menu that appears (or just press Ctrl-Shift-Del). This will bring up a new window that will let you delete your cookies and browsing history.

You don’t get quite the extensive list of options as you do with Chrome, but the duration drop-down menu lets you choose from 1 hour, 2 hours, 4 hours, today or everything.

How to delete cookies and browsing history in Safari

Deleting your cookies and browsing history in Apple’s Safari is just as simple as in the previous three browsers. Go to the menu bar, choose Safari > Reset Safari…, then tick Remove all cookies and Clear history. Unfortunately, you can’t choose a duration, so the process will remove all cookies and history when you click Reset.

That’s how to delete cookies and browsing history on Internet Explorer, Chrome, Firefox and Safari.

See also How to get free online storage.


Maryland Day to celebrate state’s history, natural resources, culture

Maryland Day

“Scout” William Hogart, standing, a Historic Interpreter at the Historic Annapolis Foundation’s “Hogshead” home on Pinkney Street, speaks about the home to guests Liz Palermo, left, Alyssa Palermo, center, age 4, and Aaron Palermo, right. The Four Rivers Heritage Area held events around the Annapolis area in honor of Maryland Day.

Maryland Day

Phil and Ilse Reynolds, left, take a tour of the William Paca House and Gardens with docent Christina Csaszar, right. The Four Rivers Heritage Area held events around the Annapolis area in honor of Maryland Day.

Maryland Day

Siblings Rosemary Blomberg, left, age 7, Anne Blomberg, center, age 6, and Thomas Blomberg, right, age 4, get dressed in “colonial” style clothes made from modern garments, with help from Stella Breen-Franklin, owner of One Petticoat Lane, which hosted a colonial fashion show. The Four Rivers Heritage Area held events around the Annapolis area in honor of Maryland Day.


What: Maryland Day.

When: Friday, Saturday and Sunday.

Where: Various sites in Annapolis and south county.

Admission: Free or just $1.

Complete schedule:

More info: 410-222-1805.

Posted: Tuesday, March 18, 2014 8:15 am

Updated: 11:32 am, Tue Mar 18, 2014.



Plays, tours and displays on Colonial life will be part of this weekend’s seventh annual Maryland Day celebration.

Many events will be free or will have a $1 charge, organizers said.



Americans Have Started Saying "Queue." Blame Netflix.

Back in Netflix’s early years, users baffled by the word “queue” used to call customer service to ask, “What’s my kway-way?” recalls Netflix communications director Joris Evers. This isn’t a question Netflix hears much anymore—and they can probably take some credit for that.

Not so long ago, the word “queue” would have sounded out of place outside the tech world or the United Kingdom, but it seems to be cropping up more and more in an American context. In the past month alone, the New York Times has used “queue” in reference to Fort Lee traffic, SXSW registration, and patrons of a San Francisco restaurant. Just last week, the Washington Post used it in an otherwise unremarkable story about new security lanes at Reagan National Airport: Before transport authorities decided to build new lanes, Lori Aratani wrote, the “long narrow hallway” at Terminal A “limited the number of passengers who could queue for screening.”

I can’t prove that Netflix is responsible. But as of January of this year, the company had 33 million subscribers in the U.S. That’s 33 million Americans who add the films they want to watch to a virtual “queue.” In Google searches originating in the U.S. since 2004, the word most commonly associated with “queue” is “Netflix,” though it might get some competition: Hulu has introduced its own “queue” function, and Amazon has adopted the term, too, inviting users to advertise the books they plan to read on a “Book Queue.” In 2011, a New York Times reader asked the site’s “Gadgetwise” blog how to create a “queue” of YouTube clips.

“Queue” has been commonplace in computing, in both British and American English, since the 1960s. “What you’re seeing is the surfacing of tech jargon,” said Grant Barrett, co-host of A Way with Words, a nationwide public radio show about language. “‘Queue’ has long been used in computer-programming to refer to a series of processes, tasks, or actions that happen, or will be run, one after another… Outgoing mail is added to a message ‘queue.’ Calculations are ‘queued’ to be run by a computer’s processor.”
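The computing sense Barrett describes is a first-in, first-out structure, easy to sketch with Python’s collections.deque (the message names here are purely illustrative):

```python
# The computing "queue" in miniature: first-in, first-out, like the
# outgoing-mail example above.
from collections import deque

outbox = deque()                 # an empty message queue
outbox.append("to: alice")       # messages are queued at the back...
outbox.append("to: bob")
outbox.append("to: carol")

first = outbox.popleft()         # ...and sent from the front, in order
print(first)                     # -> to: alice
```

Netflix’s queue works the same way in spirit: titles wait their turn, just like people in a British queue.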

“Even before Netflix, Americans would come across ‘printer queues,’” said Lynne Murphy, a linguist at the University of Sussex.

The increasingly fluid channels between British and American media probably also played a role in the popularization of “queue.” Even if it grew out of computer jargon and was popularized by the likes of Netflix, it’s coming to be used in the more traditional English sense of waiting in line.

According to the Oxford English Dictionary, the first usage of “queue”—as “a line or sequence of people, vehicles, etc., waiting their turn to proceed, or to be attended to”—appears in the Scottish historian Thomas Carlyle’s 1837 The French Revolution: A History. All ten of the quotes the OED editors chose to represent the history of the word “queue,” from 1837 to 2005, are from English, Irish or Scottish authors. By OED definition, the word is “chiefly British.”

“The traffic between American English and British English is a lot heavier than the traffic between American English and any other language,” said University of Colorado Boulder lexicographer Orin Hargraves. “A lot more words travel back and forth and get established in the other dialect than they ever did before, mainly because of media and the Internet.”

But in Netflix’s case, the idea to use “queue” didn’t come from media or the Internet. Evers told me it was the brainchild of Neil Hunt, the company’s chief product officer. His country of origin? England.


Pesco on LSD, computers, and the counterculture

Above, video evidence of my short presentation “Just Say Know: A Cyberdelic History of the Future” at the recent Lift Conference 2014 in Geneva, Switzerland. Albert Hofmann first synthesized LSD in 1938 in Switzerland, so this felt like the right set and setting to share stories about the intersection of psychedelic culture and computer technology from the 1960s to the present and beyond!


A Short History of “Hack”

Clearly, “hack” is the word of the moment; its technological connotations have proliferated in both scope and presence. As used above, and in the halls of Facebook, it derives from a verb that first appeared in English around 1200, meaning to “cut with heavy blows in an irregular or random fashion,” as the Oxford English Dictionary defines it. (Another strain of the word, referring to a person—especially a writer—who does undistinguished work, comes from “hackney,” as in a horse or car for hire.)

It was at M.I.T. that “hack” first came to mean fussing with machines. The minutes of an April, 1955, meeting of the Tech Model Railroad Club state that “Mr. Eccles requests that anyone working or hacking on the electrical system turn the power off to avoid fuse blowing.” The lexicographer Jesse Sheidlower, the president of the American Dialect Society, who has been tracking the recent iterations of “hack” and “hacker” for years, told me that the earliest examples share a relatively benign sense of “working on” a tech problem in a different, presumably more creative way than what’s outlined in an instruction manual.

In the nineteen-sixties, the term seems to have migrated from the M.I.T. context to computer enthusiasts in general, and, in time, became an essential part of their lexicon. The Jargon File, a glossary for computer programmers that was launched in 1975, lists eight definitions for “hacker.” The first reads, “A person who enjoys exploring the details of programmable systems and how to stretch their capabilities, as opposed to most users, who prefer to learn only the minimum necessary.” The following six are equally approving. The eighth, and last, is “[deprecated] A malicious meddler who tries to discover sensitive information by poking around. Hence password hacker, network hacker. The correct term for this sense is cracker.”

That “[deprecated]” was a way of whistling past the graveyard, a self-conscious attempt to marginalize what later came to be called “black hat” hacking (malicious meddling), as opposed to “white hat” hacking (free-spirited creation). The black-hat sense has been around since at least November, 1963, when M.I.T.’s student newspaper, The Tech, noted, “Many telephone services have been curtailed because of so-called hackers, according to Prof. Carlton Tucker, administrator of the Institute phone system. … The hackers have accomplished such things as tying up all the tie-lines between Harvard and M.I.T., or making long-distance calls by charging them to a local radar installation.” The term subsequently migrated to computers. In 1976, a book entitled “Crime by Computer” included a chapter called “Trojan Horses, Time Bombs, Round Down, and the System Hacker.”

The black-hat sense proved irresistible to members of the media and other non-techies, no doubt in part because “hack” sounds malicious—not to mention that “hack” rhymes with “attack.” Steven Levy’s 1984 history of below-the-radar programmers and innovators, “Hackers,” was very much in agreement with the white-hat notion—its subtitle was “Heroes of the Computer Revolution”—but the book was so popular that it served as a sort of Trojan horse for the opposition. As Levy wrote in an afterword to a 1993 edition:

… the popularization of the term was a disaster. Why? The word “hacker” had acquired a specific and negative connotation. The trouble began with some well-publicized arrests of teenagers who electronically ventured into forbidden digital grounds, like government computer systems. It was understandable that the journalists covering these stories would refer to the young perps as hackers. After all, that’s what the kids called themselves. But the word quickly became synonymous with “digital trespasser.”

To give an indication of what Levy means, here are the first three uses of the word in the Times in 1990:

Computer hackers often sell the stolen codes to other students for a few dollars.

Mr. Poulsen, who is charged with the most crimes, has a history as a “hacker,” who began trespassing in university and government computers as a teen-ager using the assumed name Dark Dante, according to a profile in California magazine in 1984.

Mr. Morris, viewed by some as a dedicated computer researcher, by others as a reckless hacker, testified that it was never his intention to slow down computers or damage Internet data.

Although Lifehacker and other neutral or positive applications of the word are increasingly prominent, the black-hat meaning still prevails among the general public. Indeed, it has probably influenced the interpretation and enforcement of the Computer Fraud and Abuse Act. It’s as if the mere existence of the term “hacker” has added ammunition to the prosecution of such figures as Edward Snowden, Julian Assange, Chelsea Manning, and Aaron Swartz, the Internet activist who was indicted and charged with eleven violations of the act in 2011. His alleged offense was downloading many academic articles from a proprietary database; the scene of the crime, perhaps fittingly, was M.I.T.

Even as the mainstream usage of “hacker” took on its darker connotation, the geeks continued using it to mean what it always had: a righteous dude. As linguist Geoff Nunberg pointed out in a recent “Fresh Air” commentary, “Within tech culture, ‘hacker’ has become a shibboleth that identifies one as a member of the tribe.” When an M.I.T. student died in a plane crash in 1993, one of his fraternity brothers eulogized him by saying, “He was a hacker in every sense of the word, and we’re all going to miss him greatly.”

Ben Yagoda teaches journalism at the University of Delaware and is the author, most recently, of “How to Not Write Bad: The Most Common Writing Problems and the Best Ways to Avoid Them.”

Illustration by Jordan Awan.


Sir Clive Sinclair visits the Centre For Computing History’s special …

It must seem to Sir Clive Sinclair that his most infamous creation follows him everywhere he goes.

At a weekend dedicated to his leading role in computer innovation he ended up signing the back of an engineering student’s Sinclair C5.

Sir Clive spent an hour browsing through the Cambridge-based Centre For Computing History’s special exhibition of his firm’s finest achievements.

Jason Fitzpatrick, the centre’s curator, said Sir Clive was the closest the UK had to Apple visionary Steve Jobs.

He said: “He brought computers to the masses. The ZX81 came along and that was £99 and that changed everything. It’s a huge legacy.

“I can’t begin to imagine what it’s like to have that on your shoulders – him and his team – and we just wanted to pay our respects to that.”

The UK’s most popular personal computer, the first pocket calculator, and the first digital watch all came out of Sir Clive’s company, Sinclair Research.

On display this weekend were the leftovers of a battle between Sinclair Research and Acorn for technological supremacy in the households of the 1980s.

The Centre For Computing History’s collection is, according to Mr Fitzpatrick, “one of, if not the best in the UK”.

Charles Cotton, a Cambridge entrepreneur who used to work in sales and marketing at Sinclair Research, said in the 1980s it was “the closest you could get to working in a California tech company without moving to Silicon Valley”.

Ruth Bramley – whose maiden name is Sinclair, though she insists she’s no relation – joined the company in January 1981 and worked there until 1984.

She said: “The ZX80 was already out and the ZX81 was about to go out.

“It was a very exciting time. There were 12 of us working in Cambridge and about the same number in St Ives. You’re in it and everything is just happening around you – it was a fantastic place to work.”

Engineering student Alec Wright, 21, was the fan who arrived in a Sinclair C5.

He said: “It’s not my regular vehicle because it’s a bit impractical, but I take it out once or twice a week.”

It is certainly the only signed Sinclair C5 registered with the Homerton College porters’ lodge.

Although impractical, the Sinclair C5 is still a part of this great British innovator’s eclectic history.

As Mr Fitzpatrick put it: “Not all of it was successful but who can have only successes?

“He was an inventor, and if things didn’t go well he would get up, get on and do something else.

“He saw very early on that home computing could change our lives, and there’s no doubt that it has.”

Tech Time Warp of the Week: Watch Alan Greenspan Hawk Apple Computers …

The Apple IIc. Photo: Wikipedia

An Apple IIc with flat panel display. Photo: Bilby/Wikipedia

Apple IIc unboxed. Photo: Black Patterson/Wikimedia Commons

Apple IIc with a mouse and external floppy drive. Photo: Wikipedia

Photo: Wikimedia Commons

The Apple IIc Plus, the last of the Apple IIs. Photo: SCEhardt/Wikipedia

The Apple II, the c’s forefather. Photo: Musée Bolo/Rama/Wikipedia



Alan Greenspan says that people in high places are always asking him: “Where does the money go?”

That’s only what you’d expect from the man who for nearly 20 years served as the chairman of the Federal Reserve — especially when you consider he was overseeing the U.S. central banking system when bankers were cashing in on derivatives — those “financial weapons of mass destruction” that melted down the economy in 2008. But he’s not talking about the crash of 2008 or anything else that befell the U.S. financial system during his tenure with the Fed. He’s pitching Apple computers.

A mere two years before President Reagan appointed him to the Fed, you see, Greenspan was the star of a 1985 TV commercial dubbed The Apple IIc and Money. No joke. You can see it below. “People in high places are always asking me: ‘Alan, where does the money go?’” Greenspan says in the ad, before letting a world of consumers know that the best way of answering this question is the Apple IIc — and its modem!

Released just three months after the iconic Apple Macintosh, this personal computer was the latest incarnation of the company’s highly successful Apple II system. It was supposed to be smaller — and better! — than past versions. The ‘c’ stood for compact. But it could just as easily have meant “cheap.” Sound familiar? With a price tag of $1,300, it was closer to being the computer for the rest of us than the $2,500 Mac.

The Apple II series debuted in 1977, and until the Mac’s release in 1984 it was Apple’s “main flagship product,” says Dag Spicer, the curator of the Computer History Museum in Mountain View, California. “They sure milked it for a long time,” he says. “It was an astonishing length of time for a computer. They extended the Apple II ecosystem as much as they could.” The last of the IIs — the Apple IIc Plus — was released in 1988 and discontinued in 1990. Other versions lived into the early ’90s. That’s a pretty impressive run — though it doesn’t live up to Steve Jobs’ promise that it would never die.

In 1985, the Apple II was still on the rise, but the country was recovering from the recession of the early ’80s. In an effort to promote the new IIc, Apple turned to Greenspan, who was running a New York consulting firm at the time and had served as a presidential economic adviser, as the company’s TV ad points out.

The Apple IIc and its modem — a dapper and professorial Greenspan says in the 30-second ad spot — let you call up your bank to see how much money you have. It even lets you pay your bills online — automatically. Then he shows you how it all works, bringing up some text on his monochrome screen with the tap of a button. It confirms that he has successfully wired $65 from his checking account to someone called Dr. Ray.

After this act of financial magic, Greenspan turns back to the camera, ready for his close-up. “You can even pay your bills off automatically,” he says. “If you have any money left over, congratulations. You’re doing better than the government is.”

It would be funny if it weren’t so true — time and time again.
