Does your ancient computer belong in a museum?

Before you sell, give away or recycle your old computer and the related accessories and peripherals you once worked or played on, you might look into whether a museum would be interested in adding some or all of the pieces to its collection.

Kim W. Tracy did just that in July. He donated two of his old programmable calculators to the Smithsonian Institution’s National Museum of American History — a TI-57 from the 1970s (rebranded for Radio Shack as the EC4000), and an HP32-C from the early 1980s.

“I bought the TI-57/EC4000 when I was in high school, and then programmed about everything it was capable of doing,” recalled Tracy, now the chief information officer of university technology services for Northeastern Illinois University in Chicago. “It was very influential in my becoming a computer scientist. And I bought the HP32-C as soon as I could afford it.”

David Studebaker, co-founder of Studebaker Technology in Naperville, Illinois, and an active 50-year veteran in the computer industry, also had to find new homes for his old computer-related equipment when he moved to a smaller house.

“I found museum homes for almost all my significant computer memorabilia — computers, including my son’s Atari 400 home computer, all the ROM cartridges, peripherals and manuals for it; my Osborne I ‘luggable’ computer, and much miscellaneous hardware, software, manuals, magazines and other items.”

The bulk — literally — of Studebaker’s accumulation went to the Living Computer Museum in Seattle, the Purdue University archives, and the Smithsonian’s National Museum of American History.

Your piece of technology history doesn’t have to be a particularly rare or valuable item for the right museum to want it.

Who wants your tech treasures?

Dozens of technology-collecting museums and other organizations are out there, and while their interests can overlap, each has some degree of unique focus and display resources that range from “one of everything” to “representative game consoles.”

For example, the Computer History Museum in Mountain View, California, “focuses on items that have historical significance or represent a major tech breakthrough,” said its senior curator, Dag Spicer. “And we also collect products that didn’t succeed, whether for marketing or technical reasons.”

By contrast, Bruce Damer, founder and curator of the DigiBarn in the Santa Cruz Mountains of Northern California, said he is “collecting mostly prototypes, machines that had a historical role, so it matters who had the machine and what it was used for.”

Finding a computer museum and its desires

Some computer museums post wish lists on their websites that they update regularly or in preparation for an upcoming exhibit.

If you look at museum lists like those on Wikipedia, it will seem that there are dozens of computer and tech museums. However, these lists aren’t necessarily comprehensive, current or accurate. Many of the listed museums are defunct, some are informal labors of love, some have little to no exhibit space, some are basically collectors’ garages, and many simply aren’t actively open to contributions or don’t want what you have.

Here are a few museums that in the past year have verified that they are collecting historical computer equipment, along with their wish lists (if any):

How to offer your stuff to a museum

“Donating an old computer is no different than giving anything else to a museum,” said Evan Koblentz, computer historian at the InfoAge Science History Learning Center and Museum. “Do your diligence, have a conversation with them, make sure they’re legit. Many of these things are historical artifacts, not just old tech junk. You want to do right by them. Some places have good intentions but no ability to execute, so your stuff goes into somebody else’s storage bin.”

Here’s what you should do:

  1. Gather together whatever you have for a given product, such as power supplies, cables, peripherals, manuals or disks.
  2. If safe, see whether it works. If it works, make sure none of your data is on it. (If it doesn’t work, and you think there may be sensitive data on it, remove the hard drive.)
  3. Spend a few minutes on eBay to get a sense of how common or rare your item is, and what it might be worth.
  4. Write down the information you have — for example, vendor, make and model, serial number, and when you purchased it (if you remember) — and take a few pictures of it.
  5. Look through the wish lists for some of the museums.
  6. Offer it, one museum at a time, using each museum’s specified approach.
  7. If a museum accepts it, pack it up carefully, ship, repeat.

And if none of the places want your particular piece of technology history, don’t despair. There are lots more technology museums. Plus, many universities, vendors and even stores have museums, museum areas, or hall displays.

Tags: Downtime

Article source:

Posted in computer history | Tagged | Comments Off

Computer building history is made in Gurnee

What if someone laid out the guts of your computer on a table and said, “Now put it back together”?

Better yet, what if you had to not just reconstruct the computer but also finish the job as fast as you could?

That’s what Computer Systems Institute of Gurnee asked of contestants Thursday during a computer build-off, called PC Domination.

For the first time in the three-year history of build-offs at the school, a woman took the top prize. She finished in 48 minutes and 29 seconds, with just one 5-second penalty for not putting in a back plate, an oversight that didn’t affect powering up the computer and logging into Windows, which was the finish line.

Shaquanda Barker, 29, of Lake Villa, took home the honor and won a $25 iTunes gift card.

“Whoohoo, it feels good!” she said.

“It’s a big relief,” she added, after it was done.

Barker said that before she went through the course and got her certification, she knew nothing about computers.

“When I say I didn’t know anything about computers, I mean it. I would abuse the mouse too,” she said with a laugh. A former Head Start teacher, she started looking for a new career and decided on computers, as opposed to health care, which the school also offers.

“I looked at all three: Health Care, Business and Computers. The pressure of health care was too much. If you make a mistake it could mean the end,” she said.

Barker was competing against Luis Tellado, 33, of Waukegan, who is just starting his certification process, but already works at a help desk at ABS Associates in Schaumburg and knows his way around computers.

David Wenkel, the programming manager for the networking career program, said the match-up came from a test where so many students scored high enough to qualify that they had to draw names.

“It’s a fun way to get the students excited about skills that will make them employable,” he said, explaining that companies like CDW in Vernon Hills are looking for graduates with those skills.

“Having some speed is a valuable skill too,” he said.

Barker had just finished the 32-week program and Tellado is just starting, but was described by Wenkel as one of those people who had already taught themselves a lot in the basement or garage. He has the knowledge, but not the certificate to prove it.

The two paired off in a room with a big time clock and a glass wall for other students and teachers to watch through. The competition included plenty of friendly (if geeky) banter.

“He needs to put in the motherboard right now,” said one student.

“Come on, don’t mess up. She’s got a lead on him right now,” said another.

“It’s cold and quiet in there right now. This is intense,” said instructor Kevin Claudio, who turned away to talk to a student during the competition. “I blinked and they got the optical drives in.”

Both competitors shook hands before the match and afterward. Tellado said he had been involved in tech stuff since 2006.

“I’ve got a passion for it,” he said. “I want to take my career to the next level. I want to dive deep and dissect a computer and learn the ins and outs of it.”

At one point he had caught up to Barker and he was the first to try and power up his computer. He did a little dance as he set it down.

“Don’t start your touchdown dance yet,” said one of the students on the other side of the glass wall.

And that student was right.

Both competitors had trouble starting their machines up and had to bring them back to the table to check their connections. Teachers had thrown in a little puzzler by giving them each an extra cord that they pondered over before deciding they didn’t need it. One of the rules was you could not have extra screws left over, but no one said anything about an extra cable.

When it was done, Barker thanked all of her teachers.

“I have some really good instructors,” she said.


D-Wave CEO: Our Next Quantum Processor Will Make Computer Science …

Things get weird at the atomic scale.

The rules of classical physics governing the objects we can see and touch break down. Particles can occupy two places at once or connect across vast distances, conditions known as superposition and entanglement (the latter being what Albert Einstein dismissively described as “spooky action at a distance”).

Scientists have explored for decades the theoretical possibilities of applying quantum mechanics to computing. But D-Wave Systems has been working to push the field into the practical realm, using an approach known as “adiabatic quantum computation.” The Burnaby, British Columbia, company, founded in 1999, released what it describes as the first commercial quantum computer in 2010.

Conventional computers deal with binary bits of information, 1s or 0s. But a quantum computer manipulates what are known as qubits (or quantum bits), which can be 1s and 0s at the same time, leveraging the power of superposition. Such a machine depends on entanglement as well, performing many operations on the same data simultaneously.
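To make superposition concrete, here is a minimal classical simulation of a single qubit in Python. This is an editorial illustration, not anything specific to D-Wave’s hardware: a qubit’s state is just a pair of complex amplitudes, and the Hadamard gate puts it into an equal mix of 0 and 1.

```python
import math

# A classical bit is 0 or 1; a qubit's state is a pair of complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1.
zero = (1 + 0j, 0 + 0j)  # the |0> basis state

def hadamard(q):
    """Apply a Hadamard gate, mapping |0> into an equal superposition."""
    a, b = q
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)
probs = [abs(amp) ** 2 for amp in plus]
print(probs)  # both outcomes near 0.5: the qubit is "0 and 1 at once"

# An n-qubit register needs 2**n amplitudes, which is why each added
# qubit roughly doubles the state space a classical simulator must track.
for n in (1, 2, 10, 30):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```

The last loop hints at why qubits are interesting: simulating 30 qubits classically already requires tracking about a billion amplitudes.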

No one can really say for certain where that extra computational room comes from; it is a matter of interpretation. One framing favored by some physicists is the many-worlds interpretation, under which a quantum computer’s parallelism is sometimes loosely described as offloading processing to parallel universes.

D-Wave has signed big-name customers including Lockheed Martin, Google and NASA, but its claims remain controversial, with conflicting reports on whether the machines really leverage superposition and entanglement. Indeed, whether it’s faster than conventional computers at all seems to depend on the problems and algorithms in question.

Earlier this month, D-Wave’s Chief Executive Vern Brownell and DFJ’s Steve Jurvetson, one of the company’s earliest and biggest investors, sat down for an hour-long interview with Re/code. The discussion spanned the critiques of the company, the science of quantum computing and the next steps for D-Wave.

Notably, the latter includes the forthcoming D-Wave processor, which Brownell says will end all doubt that they’ve leaped ahead of classical systems — and will forever leave them behind.

Brownell added that the company’s 70 or so peer-reviewed scientific papers already confirm they’re tapping into quantum mechanics.

If the machines are achieving “quantum speedup” and growth curves keep pace with predictions, Jurvetson believes we could be on the precipice of a fundamental shift in computing — an exponential upon exponential leap that reshapes our assumptions about what machines can do.

The interview that follows has been edited for length and clarity.

Re/code: What kinds of possibilities does quantum computing open up? How can it change the sort of problems that we can apply computing to? 

Brownell: We’re at the dawn of this computing age, so things will change over time, and we’ll see a broader and broader set of applications. But today we focus on three problem domains that we think are best suited to this particular type of quantum computing.

Those include machine learning, which is one of the most interesting things going on in computer science today. AI 2.0 and useful AI have really revolutionized the way a lot of folks do things today.

The second thing is the broad set of optimization problems. In logistics, for example, you’re trying to find optimal routing and things like that. They are very complex and scale very quickly with the number of variables and interrelationships you’re trying to optimize for.
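As a toy illustration of why the routing problems Brownell mentions scale so badly, here is a brute-force traveling-salesman search over a handful of made-up city coordinates (my example, not D-Wave’s): with n stops there are (n-1)! possible tours to examine.

```python
import math
from itertools import permutations

# Hypothetical city coordinates for a tiny routing problem.
cities = {"A": (0, 0), "B": (1, 5), "C": (4, 1), "D": (6, 4), "E": (2, 2)}

def tour_length(order):
    """Total length of a closed tour visiting the cities in the given order."""
    points = [cities[c] for c in order] + [cities[order[0]]]
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# Brute force: fix a start city and try every ordering of the rest.
start = "A"
rest = [c for c in cities if c != start]
best = min(permutations(rest), key=lambda r: tour_length((start,) + r))
print("best tour:", (start,) + best,
      "length:", round(tour_length((start,) + best), 2))

# The number of tours to examine grows factorially with the stop count:
for n in (5, 10, 15, 20):
    print(n, "stops ->", math.factorial(n - 1), "tours")
```

At 20 stops, brute force already means roughly 10^17 tours, which is why optimization problems like this are candidates for specialized hardware.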

And then the third class is what we call sampling. The best example for this is perhaps in financial services, where Monte Carlo simulations represent the largest workloads in most investment banks. It’s used to model things like risk in portfolios — and that’s a fit that works very well with this type of computer.
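Here is a hypothetical sketch of the kind of Monte Carlo risk workload Brownell describes: estimating a portfolio’s one-day 95% value-at-risk by simulating many daily returns. All the numbers (portfolio size, return, volatility) are made-up assumptions for illustration.

```python
import random

random.seed(42)                 # reproducible illustration
portfolio_value = 1_000_000.0   # assumed portfolio size
mean_daily_return = 0.0005      # assumed average daily return
daily_volatility = 0.02         # assumed daily return volatility

# Simulate 100,000 one-day losses under normally distributed returns.
losses = sorted(
    -portfolio_value * random.gauss(mean_daily_return, daily_volatility)
    for _ in range(100_000)
)

# The 95th-percentile loss: on 95% of simulated days we lose less than this.
var_95 = losses[int(0.95 * len(losses))]
print(f"Estimated 1-day 95% VaR: ${var_95:,.0f}")
```

Real bank workloads run vastly larger simulations over correlated assets, which is why sampling at scale is the claimed fit for this type of machine.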

We’re particularly excited about things like working with DNA-SEQ to find better cancer cures, or doing financial modeling, or, with Lockheed in particular, helping them verify their flight control systems.

But the most important challenge for us is to scale our software capability. The tools aren’t where they need to be; we need to have compilers and higher-level [application programming interfaces] and a full [software development kit] that will allow the technology to be used by a broad set of developers around the world.

Offering this kind of capability in the cloud, so you can access it as you would any other classical resource, will dramatically open up the opportunity here and get a lot of great minds thinking about, “How can I use this new tool that’s unlike anything else in the computing space and help solve human-scale problems?”

(Photo: Steve Jurvetson, DFJ partner and D-Wave board member. Credit: Vjeran Pavic, Re/code)

Why did DFJ decide to invest in this space — and in this company in particular?

Jurvetson: We look for companies that are unlike anything we’ve ever seen before, with a bold vision to change the world and run by passionate entrepreneurs who get you jumping out of your seat.

This qualifies in spades.

The other part of your question is very easy to answer: When we first invested 12 years ago, it was without a doubt the only company trying to ship a commercial-grade quantum computer, as opposed to just doing research.

Clearly, if we can scale computation into new domains, to outstrip Moore’s law itself, this is very valuable. What’s so very different about quantum computing is, as you add qubits, it’s almost like an exponential on top of an exponential. Every qubit is roughly doubling the power of the computer.

Once the performance of one of these machines meets or exceeds a classical computer, there’s no looking back — and there’s nothing the classical computer industry can do to catch up. I believe that’s unprecedented in business.

There has been some controversy over what D-Wave has achieved, with varying results from various tests. In a paper published in Science in July, a research team at the Swiss Federal Institute of Technology reported: “We found no evidence of quantum speedup when the entire data set is considered, and obtained inconclusive results when comparing subsets of instances on an instance-by-instance basis.” What was your response to that report?

Brownell: That group of researchers basically looked at some benchmarking results that were published, not by us but by a third party, and took that one specific problem and built a system that could perform better than we were able to perform. Or at par with the way our computer could perform for that one specific problem.

It turns out it’s almost a mistake because there shouldn’t be a speedup over classical systems for that particular problem that was benchmarked. Another researcher named [Helmut] Katzgraber [at Texas A&M] proved that you really shouldn’t see a speedup in that kind of problem.

Soon, we’re pretty confident, you’ll see results that definitely show us scaling better than the best known classical algorithms for those problem sets. We’re starting to see quantum accelerations, if you will, take off and cross over what classical systems can do.

So stay tuned for information there. We’ve had a history of knocking down our skeptics.

We’re now on the verge of definitively showing, in this scientific way, [quantum] speedups. I think that’s going to become an historic moment in computer science history.

Jurvetson: Another way to look at it is, what do the customers say?

The one I’ve watched for the longest time is Google. So just look at what they’ve said on their blogs. In 2009, they claimed that in their machine learning applications for recognizing images, the D-Wave System was already outperforming their data center.

Then they reaffirmed that with the D-Wave Two purchase, that system outperformed what classical computers could do. So you could say, “Gosh, maybe that’s the ultimate test for a business, do the dogs like the dog food?”

Every one of D-Wave’s customers has asked to purchase, and in most cases has purchased, more than once. It doesn’t answer the science, of course, but it does answer the question of whether there is value in the market.

So what do you make of Google deciding to build its own quantum hardware and hiring John Martinis (a professor of physics at the University of California, Santa Barbara) to lead it? He has praised your efforts but also suggested you’re on the wrong path.

Brownell: Uh, I don’t think so …

I can read one quote. Referring to this notion of coherence (the difficulty of keeping qubits in their quantum states long enough to perform computations), he said: “They conjecture you don’t need much coherence to get good performance. All the rest of the scientific community thinks you need to start with coherence in the qubits and then scale up.”

Brownell: What that particular quote is referring to is that there are different ways to build quantum computers and most of the rest of the community is working, at least as of today, on “gate-model” quantum computing, where coherence is very important.

In fact, they haven’t really been able to build any quantum computers because of this coherence issue.

But the adiabatic quantum computer (D-Wave’s approach) is inherently more robust against the decoherence process. You’ll also note that that’s probably an older quote. I think John is on the record saying he’s going to be working with Google to do “annealing,” which is the adiabatic style of computing.

So we’re pleased that he’s joined that effort; I think it’s a validation of the work that we’ve done.

What’s the best guess as to when this will become a more mainstream approach to how computing gets done?

Our business is basically doubling every year in different dimensions.

In parallel universes?

Ha, yeah, right. And even the more pedestrian revenue and financial metrics like number of customers and so on. We believe we will continue on that path and we’ll accelerate that path.

Then layer on that today’s customers buy systems from us and put them in their data centers. Offering this as a cloud service could allow us to provide this as a service to anyone who needs it. My belief is that, in roughly five years, this could be a service that’s ubiquitous.

Developers developing an iOS app may decide, “I want to access quantum resources for these particular parts of my problem” and use those resources as freely as they use classical resources today in the cloud.

I know of Lockheed, Google and NASA, but you said you’re doubling every year. Are there other companies that haven’t been announced?

There are other customers. We have a relationship with the U.S. intelligence community. We can disclose that In-Q-Tel, the investment arm for the CIA, is one of our investors, so there’s interest and activity going on in those spaces.

Sorry, they’re an investor or customer?

Brownell: Investor.

Jurvetson: But, it’s kind of like, they often have a customer in mind when they invest.

Can you tell us about your product plans and pipeline?

Our next-generation processor will be 1,000 qubits, actually more precisely 1,152, and that’s going to be released early next year. We already have several customers waiting for that processor and we have about four of those systems in our laboratory today undergoing development and tests.

It not only increases the number of qubits, it also has significant improvements in other important dimensions of performance. So certainly this next processor is going to be very exciting.


History of the Personal Computer, Part 2: Intel & Motorola’s virtual duopoly …

Initial development of the 8080 didn’t start until mid-1972, some six months after Federico Faggin began lobbying Intel’s management for its development. By this time, the potential microprocessor markets had started to present themselves. Computers were still seen as an expensive business and research tool, and the markets for a new generation of relatively inexpensive personal machines and industrial controllers didn’t exist.

While the 8080’s initial development had been delayed, its primary competitor, Motorola’s 6800, also had its share of issues, leaving Intel with a new market mostly to itself. The remaining parts of the puzzle, an operating system and consumer-friendly packaging, were also taking their first steps.

This is the second installment in a five-part series, where we look at the history of the microprocessor and personal computing, from the invention of the transistor to the modern-day chips powering our connected devices.

Read the complete article.


Hungary to Host Conference on History of Computer Science

MOSCOW, September 19 (RIA Novosti) – The 8th IT STAR Workshop on the History of Computing will take place on September 19 in Szeged, Hungary.

The conference will be dedicated to the history of computer science in the countries of Central, Eastern and Southern Europe. It will feature more than 10 speakers, from Poland, Italy, Russia, Hungary and other countries.

Petri Paju, Ph.D., a postdoctoral researcher in the Department of Cultural History at the University of Turku in Finland, will present a report on IBM in Eastern Europe. Kiril Boyanov, a full member of the Bulgarian Academy of Sciences, will speak on the history of computer science in Bulgaria. Vladimir Kitov, a lecturer in the Department of Computer Science at Russia’s Plekhanov University of Economics, will present a report on the development and use of the first three Soviet computers.

The conference is an annual event dedicated to different aspects of information technology.


What’s Happening: Living history, Tech talk, free computer classes



A trial lawyer and a Montana Tech professor will debate the U.S. Constitution from 7 to 9 p.m. in Montana Tech’s University Relations Center. They are Shane Krauser of Chandler, Ariz., and John Ray, Ph.D., a Montana Tech political science and public policy professor. Krauser is the director of the Academy for Constitutional Education and an adjunct professor of constitutional law. The debate is sponsored by the Montana Tech College Republicans and Montana Citizens for Truth.


Millicent Firestone of Los Alamos National Lab will talk at 4 p.m. in Montana Tech’s ELC 202. Firestone’s expertise includes the design and synthesis of amphiphilic molecules and monomers.


Bannack State Park’s third annual Living History event will be held Sept. 18-21. Times are 10 a.m. to 5 p.m. Thursday through Saturday and 10 a.m. to 2 p.m. Sunday. The event depicts the first 20 years of Bannack’s history. For details or to schedule a school visit, call 406-834-3413.


The Butte Public Library offers free “Super Basics” computer classes, where students start at the beginning — turning on the computer — and go from there, from 6 to 8 p.m. at the Uptown Library, 224 W. Sign up: 406-723-3361, or visit the library.


Butte-Silver Bow Public Archives, 17 W. Quartz St., will be closed from 2 to 5 p.m. Thursday. It reopens at 9 a.m. on Friday, Sept. 19. 


Anaconda Community Market, which features area food, produce, jewelry, handmade soaps, crafts and wares, runs 10 a.m. to 1 p.m. at Friendship Park, adjacent to the Copper Village Museum and Art Center. Admission is free.


An open house at East Middle School, 2600 Grand Ave., runs from 6 to 7:30 p.m. Thursday. Parents and students also will have a chance to visit booths hosted by community organizations.



  • Butte Camera Club meets at 7 p.m. Thursday in the Butte Plaza Mall meeting room. Ken Herrly will present a program on his wildlife photos. Details: 406-494-3648, 406-563-7518.
  • Butte Rotary Club meets at noon Thursday, Sept. 18, at the Butte Country Club. Prospective members are always welcome.
  • Belly Dance Class, Thursdays 6:30 to 7:30 p.m., Butte-Silver Bow Public Library basement. No experience necessary; $5 per class. Details: 406-723-3164.
  • Silver Bow Kiwanis meet at noon Tuesday, Sept. 23, at Perkins, 2900 Harrison Ave. Guest speaker is Jim Greene, incoming Kiwanis lieutenant governor.
  • Elk Park Ladies Luncheon is 12:30 p.m. Saturday, Sept. 27, at the Butte Country Club. Graduates of Franklin, Holy Savior and Harrison schools are invited. Reservations due by Sept. 24 by calling Gerry, 406-494-4410; Esther, 406-494-4013; or Millie, 406-494-3135.
  • Adult Children of Alcoholics meet at 10 a.m. Saturday in the Atherton Apartments community center room, 4500 Continental Dr. Details: 406-396-4112.


A history of misses haunts RadioShack

In an alternate universe, RadioShack Corp. would rule the world, supplying all of your electronics needs from computers to cellphones, and even making them. But in this world, RadioShack is almost bankrupt, having missed almost every opportunity to be the centre of the technology revolution.

Last week, the electronics retailer announced its latest quarterly loss – $119.4 million (U.S.) – and said that it might not have enough capital to continue as a “going concern.” The announcement was a surprise to no one. RadioShack, despite some terrific marketing, has been in turnaround mode for almost two decades. Now, with 10 consecutive unprofitable quarters and a stock worth a little over $1, RadioShack is a battery running out of charge.

More Related to this Story

RadioShack would be yet another tale about a business failing to adapt to the times, if it were not RadioShack. This is the retailer that sat at the heart of the electronics revolution and had many paths to glory, most of which it took. Yet, in what should be a Harvard Business School case study, it executed all of them badly.

The story of RadioShack begins with failure. The company, founded in 1921, sold radio parts and surplus supplies by outlet and catalog. But it was almost bankrupt when it was purchased in 1963 by Tandy Corp., a leather retailer.

At the time, RadioShack had just nine stores. But it expanded rapidly to become a hobbyist’s dream. RadioShack became a mythical place for all things related to electronics, catering not just to the do-it-yourselfers but also to anyone in search of the latest gadgets.

RadioShack also knew how to ride a wave. During the CB radio craze of the 1970s (you had to be there to understand), RadioShack was the leading retailer of CBs. It was doing so well that at one point, it was opening about three stores a day.

RadioShack entered the 1980s poised to be the centre of the computer revolution. Indeed, in 1977, the company had introduced one of the first mass-produced computers, the TRS-80, and initially outsold Apple using the power of its retail channel and its thousands of locations.

But from that perch, RadioShack went nowhere. RadioShack’s computer business lost traction and was eventually made obsolete as companies like IBM and Dell delivered more powerful computers through different channels.

Failures abounded. RadioShack phased out its computer business in 1993 along with its circuit board business. That year, too, the company sold its cellphone manufacturing business.

Instead of concentrating on RadioShack and building up its offerings, the company tried new concepts with new stores: Computer City to sell computers, Energy Express Plus to sell batteries, Famous Brand Electronics for refurbished electronics, McDuff and Video Concepts for audio and video, and the Incredible Universe, which became the company’s Best Buy knockoff.

None of these worked, and all were either closed or sold off by the late 1990s.

Still, RadioShack had a terrific reputation as the place to go for gear. In the Internet bubble, the stock closed at a high of $78.50 a share.

But the company had already lost its focus. The big box stores like Best Buy began to capture the bulk of the electronics business. RadioShack remained largely your local stop for electronics gear. The problem was that most of the equipment became cables and ancillary things to make the computers go.

Looking yet again for a new business model, RadioShack seized on mobile. But this merely made RadioShack a pawn in the cellphone wars as it tried to profit from selling a commodity.

RadioShack had some initial successes with sales in kiosks in Sam’s Clubs, but when RadioShack became too successful, Sam’s Club’s owner, Wal-Mart, took away the contract. And an attempt to sell phones at Target failed. The move to smartphones squeezed RadioShack’s margins, as did cellphone companies’ move to have their own stores.

In short, mobile has not panned out.

Now, RadioShack is trying again. In February, it spent $4 million on a slick Super Bowl commercial to rebrand itself as something other than the store from the 1980s that you remember. But if RadioShack is not your 1980s store, what is it?

Best Buy may survive because it is so important to retailers and it finally has figured out how to sell in and around the Web.

But what is RadioShack’s purpose? It may make money selling headphones or computer cables, but that business model seems unsustainable. After all, how many cables can you sell to sustain 4,000-plus stores?

The company’s cash is evaporating. A plan to close about 1,100 stores was halted by RadioShack’s current lenders. And while RadioShack’s biggest shareholder, the hedge fund Standard General, is rumored to be in talks to provide new financing, the question would then become whether RadioShack’s latest attempt to leverage its name by adopting cleaner and brighter stores could be pulled off.

That question remains unanswered, but you have to shake your head at the missed opportunities.

RadioShack could have been Best Buy. It could have been Amazon. It could have become Dell. The paths that RadioShack could have taken are numerous. But instead of choosing one, it chose them all, walking away from its place as a hobbyist’s dream.

So what can we learn? RadioShack suffered from poor, often overpaid, leadership, which could not focus on a single plan and then was left grasping for a rescue strategy.

We can see echoes of this in the current mad dash by the Silicon Valley giants to become conglomerates. Google, Facebook and others also fear that their original mission will become obsolete and so they are buying anything new for billions of dollars to avoid a fate like RadioShack’s.

But perhaps the tech giants are missing one thing from the RadioShack story. RadioShack tried many paths. But going in all directions without a full commitment is not enough, particularly when the core brand is not sustained. RadioShack has branded itself well, but it led itself too far from its strengths.

It’s a cliché but true that retailers do not fare well in bankruptcy. Some bankruptcy experts have said the reason may be that the bankruptcy code forces the retailer to decide within nine months whether to assume or reject its leases, too short a period to determine what will succeed or fail. But the problem with retailers is usually not that they have too much debt – something bankruptcy can solve – or even that they cannot decide on their locations. The problem is that bankrupt retailers cannot attract new customers or create a new survival plan.

It’s not an enviable place to be. RadioShack sits waiting for its new financiers to find some new path. As someone who bought a CB radio back in the 1970s, I hope they make it, but I wonder if such a path exists.


Article source:

Posted in computer history | Tagged | Comments Off

History of the Personal Computer: Leading up to Intel’s 4004, the first …

The personal computing business as we know it owes its existence to an environment of enthusiasts, entrepreneurs and happenstance. Before PCs, the mainframe and minicomputer business model was built around a single company providing an entire ecosystem: building the hardware, handling installation and maintenance, writing the software, and training operators.

The invention of the microprocessor, DRAM and EPROM integrated circuits would spark the widespread use of variants of the BASIC high-level language, which would lead to the introduction of the GUI and bring computing to the mainstream. The resulting standardization and commoditization of hardware would finally make computing relatively affordable for anyone.

This is the first installment in a five-part series. Over the next few weeks we’ll be taking a look at the history of the microprocessor and personal computing, from the invention of the transistor to the modern-day chips powering our connected devices.

Read the complete article.


h+ Magazine | Future Ancestry Research

Technology, and particularly computing, is essential to family history. Without it, we could still tell family stories to our children, but we certainly couldn’t substantiate those stories by linking billions of historical records into millions of family trees, as web applications like FamilySearch do today.

In the 1960s, Intel co-founder Gordon Moore observed that the ratio of computing capacity to cost was doubling predictably, every couple years or faster. In other words, a computer built in 1969 had twice as much capacity as a computer built at the same cost in 1968, and over a hundred times as much capacity as a computer built at the same cost in 1962; a computer built in 1969 would also reliably have half the capacity of a computer built at the same cost in 1970, and less than a hundredth the capacity of a computer built at the same cost in 1976.
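The doubling arithmetic behind those ratios is easy to check. Here is a minimal sketch, assuming the faster annual doubling rate the example above uses (the function name and parameters are mine, for illustration):

```python
def capacity_ratio(year_a, year_b, doubling_period_years=1.0):
    """Relative computing capacity per dollar between two years, assuming
    capacity per unit cost doubles every `doubling_period_years`."""
    return 2.0 ** ((year_b - year_a) / doubling_period_years)

# A 1969 machine compared with earlier machines built at the same cost:
assert capacity_ratio(1968, 1969) == 2.0     # twice a 1968 machine
assert capacity_ratio(1962, 1969) == 128.0   # 2**7: over a hundred times a 1962 machine

# ...and compared with later machines built at the same cost:
assert capacity_ratio(1970, 1969) == 0.5     # half the capacity of a 1970 machine
assert capacity_ratio(1976, 1969) < 0.01     # less than a hundredth of a 1976 machine
```

With the slower two-year doubling period, the same spans give ratios of about 1.4x per year instead, which is why the essay hedges with "every couple years or faster."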

That trend, known as Moore’s Law, has continued to the present. Today, a $150 smartphone can store about a million times more data and process that data about a thousand times faster than the $150K Apollo Guidance Computer that took astronauts to the moon in 1969. The smartphone also has wireless access to extended computing capacity on the Internet, including powerful systems such as Google, Amazon, Facebook, Yahoo and eBay, as well as gigantic troves of family history data.

Suppose Moore’s Law continues. Within decades, whatever replaces smartphones would have millions, billions and then trillions of times the overall computing capacity at the same cost. Within a century, $150 would purchase more computing capacity than that of all human brains combined. If that were to happen, what might the intersection of family history and technology look like? What might applications like FamilySearch be like? Of course we don’t really know, but let’s imagine.

One of the things we might do is tell stories about our family and ancestors at a much more massive scale and at a far deeper level, by computing highly detailed family history simulations. Maybe they would be something like a mix of Google Earth enhanced with a full history of maps derived from geological and astronomical research; Oculus Rift enhanced with brain-computer interfacing for an immersive tactile experience; and Second Life enhanced with avatars generated from family trees, photos, journals, and DNA, and abstracted to sub-neuronal degrees of detail to enable artificial intelligence. In deeper, more meaningful ways, we could understand and even feel our family history, as the characters, settings, plots and conflicts unfold before us – as our stories come to life, and we walk in our ancestors’ shoes (literally?).

As it turns out, if ever we compute such family history simulations, detailed to the point of enabling the characters with fully immersive consciousness, there would be a rather shocking philosophical ramification.

Imagine a family history simulation sophisticated enough to enable the characters with artificial intelligence and full awareness and consciousness. If something like $150 could purchase more computing capacity than that of all human brains combined, we might run many thousands, millions, or more family history simulations, for education, entertainment, research and innumerable other purposes.

Now imagine further that you actually live in such a future. Version 42 of such an application has just been released, and all your friends are using it to run family history simulations. One day, while you and a friend are watching some of your ancestors, your friend turns to you and asks, “Are we living in a family history simulation?” You laugh, of course, but your friend insists.

“Seriously! Are we living in a family history simulation? Think about it. The service probably runs at least a million family history simulations. Each of them includes billions of artificial intelligences that, so far as I can tell, experience their world like we experience ours. That’s something like, oh, let’s just say eight quadrillion artificial intelligences. What are the chances that you and I happen to be among eight billion natural intelligences instead of eight quadrillion artificial intelligences? One in a million, or less, because that’s assuming a non-simulated world simulated the others. I’d say we’re almost certainly living in a family history simulation.”

Your friend would be right. If there are a large number of worlds verified to be inside family history simulations, and no worlds verified to be outside family history simulations (despite common assumptions about our own world), the laws of probability entail that any given world, including our own, is most likely to be inside a family history simulation, and worlds outside family history simulations are merely an improbable hypothesis. In other words, if we run many family history simulations, we probably already live in one. Fellow tech-philosophy fans should take a look at a formal development of this idea, known as the Simulation Argument.
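The friend's one-in-a-million figure is just counting. A minimal sketch of that arithmetic, using the hypothetical numbers from the dialogue above (a million simulations, eight billion minds in each, eight billion non-simulated minds):

```python
simulations = 1_000_000                 # hypothetical simulations the service runs
minds_per_simulation = 8_000_000_000    # simulated minds in each simulated world
natural_minds = 8_000_000_000           # non-simulated population, per the dialogue

simulated_minds = simulations * minds_per_simulation  # eight quadrillion
total_minds = simulated_minds + natural_minds

# Chance that a randomly chosen mind is one of the natural ones:
p_natural = natural_minds / total_minds
print(p_natural)  # roughly one in a million
```

The probability comes out to 1/1,000,001: for every natural mind there would be a million simulated ones, which is the whole force of the argument.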

Now this philosophical argument doesn’t purport to prove that we’re actually living in a family history simulation. It only purports to prove that at least one of two possibilities must be true: either (1) we will never compute many family history simulations to a degree of detail that enables characters with artificial intelligence and fully immersive consciousness, or (2) we probably already live in such a family history simulation. However, both possibilities present us with some challenges.

The first suggests various probabilistic or hard limits to our technological progress. Maybe we’re likely to destroy ourselves with powerful new weapons or succumb to natural global catastrophes before attaining the ability to run detailed family history simulations. Perhaps we’re destined for some form of totalitarian control that would stifle such innovations. It may even be the case that the complexity or nature of consciousness is such that completely immersing or embedding it in a simulation is impossible or progress toward it would be asymptotic – indefinitely progressing slower and slower while requiring more and more resources.

Of course, if the first possibility is not true then the second must be, and it entails that reality is not what most of us usually suppose it to be. So what do you think? Are we living in a family history simulation? Could and would we simulate our ancestors to the point of artificial intelligence and immersive consciousness?


This article originally appeared in a somewhat different form on Lincoln’s blog.

And a somewhat different variation originally appeared on the Tech Roots blog.



History Lesson: Dragon 32, the Welsh computer

What, you naturally assumed Wales had no role in the history of home computers? That’ll teach you.

Take a look at the Dragon 32, Swansea toy company Mettoy’s knee-jerk attempt to get some loving as the ZX81 and VIC-20 started cascading into British homes.

They don’t make ads like this any more. Um, which is probably for the best

Mettoy’s source of inspiration was an odd one: Tandy’s American TRS-80 Color Computer, or CoCo to some of its fans.

Using the same chipset and language as the CoCo would, for better or worse, make the Dragon 32 unique in the UK market.

After a feisty start over Christmas 1982, Mettoy set up Dragon Data as its own company at a new site in Port Talbot. You’d expect that to be pretty big news in a low-key town, but… well, not so much.

That first 32k model (upgraded from a 16k prototype to take on the 48k Spectrum) landed cheap and early. It could run cassettes, cartridges and even disks, but games usually had to be built from scratch (or CoCo ports) as most developers who hadn’t already shacked up with Sinclair, Commodore, BBC or Acorn just weren’t cut out to work with Dragon tools.

Sluggish support and ugly graphics kept Dragon 32 sales in check after the first surge. Still, it did well enough for a Dragon 64 follow-up and American distribution through the Tano Corporation of fabled US party city New Orleans, a natural match for Wales’ premier party town Port Talbot.

Dragon Data went through sales and receivership within the next couple of years, but the Dragon remains Wales’ only mainstream market entry. Some may also count the ‘super-Spectrum’ SAM Coupe, but as it was even less successful than the Dragon, we wouldn’t. We’re harsh that way.

Enter the Dragon (32)

The Dragon 32 may not have been packing much in the power department and it may have been a pain in the tail to develop for, but that didn’t mean it wasn’t home to a number of great games.

If you fancy having a dig through the best games Wales’s main hardware contribution had to offer, here are the essential titles you should look out for.
