April 2, 1986: A ‘Laptop’ Computer

Will this catch on? IBM prepares to unveil its “laptop,” or briefcase-size, computer. Today in WSJ History, April 2, 1986.

Article source: http://blogs.wsj.com/wsj125/2014/04/01/april-2-1986-a-laptop-computer/

Today in History: Apple Computer was founded by Steve Jobs, Steve Wozniak …

Today is Tuesday, April 1, the 91st day of 2014. There are 274 days left in the year. This is April Fool’s Day.

Today’s Highlight in History: On April 1, 1789, the U.S. House of Representatives held its first full meeting in New York; Frederick Muhlenberg of Pennsylvania was elected the first House speaker. 

On this date: In 1853, Cincinnati, Ohio, established a fire department made up of paid city employees.

In 1912, the city of Branson, Mo., was incorporated.

In 1924, Adolf Hitler was sentenced to five years in prison for his role in the Beer Hall Putsch in Munich. (Hitler was released in Dec. 1924; during his time behind bars, he wrote his autobiographical screed, “Mein Kampf.”)

In 1933, Nazi Germany staged a daylong national boycott of Jewish-owned businesses.

In 1939, the United States recognized the government of Gen. Francisco Franco in Spain, the same day Franco went on radio to declare victory in the Spanish Civil War.

In 1945, American forces launched the amphibious invasion of Okinawa during World War II.

In 1954, the United States Air Force Academy was established by President Dwight D. Eisenhower.

In 1963, New York City’s daily newspapers resumed publishing after settlement was reached in a 114-day strike. The daytime drama “General Hospital” premiered on ABC-TV.

In 1972, the first Major League Baseball players’ strike began; it lasted 12 days.

In 1976, Apple Computer was founded by Steve Jobs, Steve Wozniak and Ronald Wayne.

In 1984, recording star Marvin Gaye was shot to death by his father, Marvin Gay Sr., in Los Angeles, the day before his 45th birthday. (The elder Gay pleaded guilty to voluntary manslaughter and received probation.)

In 1992, the National Hockey League Players’ Association went on its first-ever strike, which lasted 10 days. 

Ten years ago: President George W. Bush signed into law new protections for the unborn that for the first time made it a separate federal crime to harm a fetus during an assault on the mother. Michigan won the NIT championship with a 62-55 victory over Rutgers. Actress Carrie Snodgress died in Los Angeles at age 57.

Five years ago: President Barack Obama, in London for an economic crisis summit, sought to rally the world’s top and emerging powers to help cope with a global downturn; chanting protesters clashed with riot police in the British capital. Sixteen people, most of them oil workers, were killed when a Super Puma helicopter crashed into the North Sea off Scotland’s northeast coast. In a college baseball blowout, Eastern Kentucky was leading Kentucky State 49-1 when the teams stopped playing after five innings (they also agreed to cancel the second game of a scheduled double-header).

One year ago: Prosecutors announced they would seek the death penalty for James Holmes should he be convicted in the July 2012 Colorado movie theater attack that killed 12 people. A cast member of the MTV reality show “BUCKWILD,” Shain Gandee, 21, was found dead in a sport utility vehicle in a West Virginia ditch along with his uncle and a friend; they had succumbed to carbon monoxide poisoning after the SUV’s tail pipe became submerged. 

Today’s Birthdays: Actress Jane Powell is 85. Actress Grace Lee Whitney is 84. Actress Debbie Reynolds is 82. Country singer Jim Ed Brown is 80. Actor Don Hastings is 80. Baseball Hall of Famer Phil Niekro is 75. Actress Ali MacGraw is 75. Rhythm-and-blues singer Rudolph Isley is 75. Baseball All-Star Rusty Staub is 70. Reggae singer Jimmy Cliff is 66. Supreme Court Justice Samuel Alito is 64. Rock musician Billy Currie (Ultravox) is 64. Actress Annette O’Toole is 62. Movie director Barry Sonnenfeld is 61. Singer Susan Boyle (TV: “Britain’s Got Talent”) is 53. Country singer Woody Lee is 46. Actress Jessica Collins is 43. Rapper-actor Method Man is 43. Movie directors Allen and Albert Hughes are 42. Political commentator Rachel Maddow is 41. Tennis player Magdalena Maleeva is 39. Actor David Oyelowo is 38. Singer Bijou Phillips is 34. Actor Sam Huntington is 32. Comedian-actor Taran Killam is 32. Actor Matt Lanter is 31. Actor Josh Zuckerman is 29. Country singer Hillary Scott (Lady Antebellum) is 28. Actor Asa Butterfield is 17. 

Thought for Today: “Our wisdom comes from our experience, and our experience comes from our foolishness.” — Sacha Guitry, Russian-born French actor-writer-director (1885-1957).

Article source: http://www.gastongazette.com/news/local/today-in-history-apple-computer-was-founded-by-steve-jobs-steve-wozniak-and-ronald-wayne-1.298554/

French crew shoots documentary at west Boynton middle school computer lab

A French film crew Friday shot video of an eighth-grade girl doing her U.S. history schoolwork on a computer at Christa McAuliffe Middle School in suburban Boynton Beach in an effort to show the French public what a good job other countries are doing using technology to educate children.

“The point of this is to ask relevant questions to the government,” said Pascale Labout, director of the documentary “L’École du Futur,” or “School of the Future.” The crew filmed scenes at McAuliffe’s “Launch Pad,” a computer lab that mixes traditional and online instruction.

“This is a place where this blended type of learning has been implemented,” Labout said.

The Launch Pad, started by the school this past year, allows Christa McAuliffe Middle School to teach five periods of U.S. history on the computer each day and one period of high-school-level Spanish. The computer lab, converted from a woodworking shop, allows students to work at their own pace with a teacher there to help.

McAuliffe principal Jeff Silverman said he is building a second Launch Pad lab next year to double the number of students taught on computers. Across the district, the number of similar computer labs at middle schools is being doubled to 24 from 12 next year.

“These kids are going to be so far ahead,” school board member Karen Brill said Friday as she watched the U.S. history students bouncing on their stability balls and rocker chairs as they worked on their computers while the film crew shot its documentary.

Brill added that universities are increasingly moving toward computer-based learning, so the virtual lab will better prepare these middle schoolers for high school and college classes.

While computer access is common in U.S. schools, not all French schools have Internet connections, and the French government has set a goal of 2017 for total Internet connection in schools, Labout said.

Her film crew is shooting footage at Christa McAuliffe, in Colorado and North Carolina, and at the Massachusetts Institute of Technology to show the French people what innovations American schools have made using technology and to start asking questions about how technology should work in French classrooms.

She said a big question they want to examine is the training given to American teachers, adding that in one case a rural French school gave students iPads but didn’t give its teachers any training in how to use the devices.

“L’École du Futur” is slated to run on the European television channel Canal+ in September.

Article source: http://www.palmbeachpost.com/news/news/local/french-crew-shoots-documentary-at-west-boynton-mid/nfM6n/

Microsoft releases source code for MS DOS 1.1 and 2.0, Microsoft Word for …

Microsoft today released the source code for MS-DOS 1.1 and 2.0 as well as Microsoft Word for Windows 1.1a. With the help of the Computer History Museum, the code is now available to the public.

MS-DOS was a renamed version of 86-DOS, written by Tim Paterson of Seattle Computer Products and initially released in August 1980. Microsoft hired Paterson in May 1981, bought 86-DOS 1.10 for $75,000 in July, and renamed it MS-DOS. Microsoft released the first DOS-based version of Microsoft Word in 1983. In 1989, Word for Windows arrived, and within four years was generating over half the revenue of the worldwide word-processing market.

“Thanks to the Computer History Museum, these important pieces of source code will be preserved and made available to the community for historical and technical scholarship,” Microsoft Research managing director Roy Levin said today. We agree, although it’s not clear what took so long.

Computer History Museum

Image Credit: Robert Scoble

Article source: http://thenextweb.com/microsoft/2014/03/25/microsoft-releases-source-code-ms-dos-1-1-2-0-microsoft-word-windows-1-1a/

Computer History Museum Makes Historic MS-DOS and Word for Windows …


MOUNTAIN VIEW, Mar 25, 2014 (GLOBE NEWSWIRE via COMTEX) –
The Computer History Museum (CHM) announced today that it has, with permission from Microsoft Corporation, made available original source code for two historic programs: MS-DOS, the 1982 “Disk Operating System” for IBM-compatible personal computers, and Word for Windows, the 1990 Windows-based version of their word processor.

IBM went outside the company for many hardware and software components of their 1981 personal computer. Though most vendors were kept in the dark about the project, code-named “Chess,” IBM developed a unique relationship between their Boca Raton-based team and Microsoft, then a small company based in Seattle.

Microsoft, which was providing the BASIC language interpreter, agreed to also supply an operating system. Without their own operating system already in place, they licensed a product from nearby Seattle Computer Products and worked closely with IBM to make the changes they wanted. It shipped as “PC-DOS” for IBM and “MS-DOS” for other PC manufacturers. We are today releasing the source code of MS-DOS version 1.1 from 1982, and of version 2.0 from 1983.

“Version 1.1 fits an entire operating system – limited as it was – into only 12K bytes of memory, which is tiny compared to today’s software,” said Len Shustek, Museum Chairman.

Microsoft’s DOS-based version of Word, first released in 1983, was not a success against the dominant word processor of that era, WordPerfect. The 1989 release of Word for Windows changed all that: within four years it was generating over half the worldwide word processing market revenue. It was a remarkable marketing and engineering achievement. We are today revealing the technical magic by releasing the source code to version 1.1a of Word for Windows.

“MS-DOS and Word for Windows built the foundation for Microsoft’s success in the technology industry,” said Roy Levin, distinguished engineer and managing director, Microsoft Research. “By contributing these source codes to the Computer History Museum archives, Microsoft is making these historic systems from the early era of personal computing available to the community for historical and technical scholarship.”

“We think preserving historic source code like these two programs is key to understanding how software has evolved from primitive roots to become a crucial part of our civilization,” says Shustek.

For a blog posting surrounding the release of this source code, please visit: Microsoft DOS and MS Word for Windows.

For other releases in the historic source code series, see:

Apple II DOS, IBM APL, Apple MacPaint and QuickDraw, Adobe Photoshop

About the Computer History Museum

The Computer History Museum in Mountain View, California is a nonprofit organization with a four-decade history as the world’s leading institution exploring the history of computing and its ongoing impact on society. The Museum is dedicated to the preservation and celebration of computer history, and is home to the largest international collection of computing artifacts in the world, encompassing computer hardware, software, documentation, ephemera, photographs and moving images.

The Museum brings computer history to life through large-scale exhibits, an acclaimed speaker series, a dynamic website, docent-led tours and an award-winning education program. The Museum’s signature exhibition is “Revolution: The First 2000 Years of Computing,” described by USA Today as “the Valley’s answer to the Smithsonian.” Other current exhibits include “Charles Babbage’s Difference Engine No. 2,” and “Going Places: The History of Google Maps with Street View.”

For more information and updates, call (650) 810-1059, visit www.computerhistory.org, check us out on Facebook, and follow us @computerhistory on Twitter.

        CONTACT: Carina Sweet
                 650.810.1059
                 csweet@computerhistory.org


(C) Copyright 2014 GlobeNewswire, Inc. All rights reserved.

Article source: http://www.marketwatch.com/story/computer-history-museum-makes-historic-ms-dos-and-word-for-windows-source-code-available-to-the-public-2014-03-25?reflink=MW_news_stmp

Digital Den Creator Pushes For A New Boston-Area Computer Museum

The role Massachusetts has played in the history of computing is undeniable.

However, with all the historic ideas, innovators, and companies — SpaceWar, VisiCalc, Digital Equipment Corporation, BBN, Data General, Prime, Wang, and, of course, MIT (more…)

Article source: http://betaboston.com/news/2014/03/24/digital-den-creator-pushes-for-a-new-boston-area-computer-museum/

IGN Presents: the History of Atari

To those born into the console era, whose formative gaming education came from Nintendo, Sega, or PlayStation, Atari feels like an amorphous presence in the world of videogames: a once-important name that has been diluted by countless mergers, acquisitions and bankruptcies; a titan of the arcade era whose relevance had dwindled almost to nothingness by the turn of the millennium.

Many younger gamers have little idea of the extent to which this one company laid the foundation of the modern video game industry, beyond recognizing the name, and perhaps knowing that the Atari 2600 was an early home console. But the truth is that the modern video game industry owes almost everything to Atari and its two founders.

Atari co-founders Ted Dabney and Nolan Bushnell, with head of finance Fred Marincic and Pong creator Allan Alcorn.

Atari was a defining force in both arcades and home computing throughout the 1970s and ’80s (it wasn’t until 1993 that it finally shut down its computer manufacturing arm). In one form or another, it brought us everything from Pong to Tempest, Centipede to the famously dreadful E.T. The Video Game. But Atari’s games are only part of the story. Atari’s founders invented the video game arcade cabinet, helping to create the arcade culture that gave birth to modern video games. Without Atari, the history of games would have been completely different. The story of its rise and its many, varied deaths is a fascinating one that spans the entirety of modern gaming’s history, from the early ’70s to its latest bankruptcy in January 2013.

The variety of corporate metamorphoses that Atari has undergone over the years is such that its history becomes difficult to untangle after a certain point, but Atari’s story starts as world-changing things very often do: with one person and a great idea. Atari’s two founders, Nolan Bushnell and Ted Dabney, met in 1969, when they were both working for a company called Ampex in Redwood City, California. Years earlier, as an electrical engineering student in Utah, Bushnell had developed a fascination with one of the very first video games, Spacewar, developed on an improbably giant computer at the Massachusetts Institute of Technology in 1962 by programmer Steve Russell and a handful of his colleagues. He’d sneak into the college’s computing lab at night with a fraternity brother to play it.

Spacewar! in action. Photo by Joi Ito.

Bushnell’s college was important. Computer graphics were invented at the University of Utah in the 1960s by a man named Ivan Sutherland, one of computer science’s pioneers. The university had, at the time, state-of-the-art computer equipment. This made Bushnell one of a relatively small number of people who could play the earliest video games, including Spacewar, on campus computers.

While attending school, Bushnell also worked in an “amusement arcade” called Lagoon Amusement Park during the holidays, and it occurred to him that the electronic game could work as a coin-operated machine. Arcades at that time were halls of pinball cabinets and other coin-operated entertainments, like slot machines and ball-throwing gambits and other trivial games of skill and chance. What Bushnell essentially envisioned, though, was the 1980s arcade, packed with glowing coin-op game cabinets and spellbound teens –  places where an entire generation would fall in love with video games. These places would not have happened without him, and his company, Atari, would become one of the biggest names in this future world.

In post-war America pinball was demonised in the same way that video games frequently have been in the decades since. In the 1940s and 50s, the most rebellious, coolest thing you could do as a young person in many parts of America was to hang out near a pinball machine. Parents and other worried adults banded together to protest the machines, fearing that their children were being corrupted by their bright, noisy influence, transformed into time-wasting entertainment junkies and being led into gambling. Pinball machines were actually made illegal in some parts of the country – perhaps most famously, New York mayor Fiorello LaGuardia ordered the seizure of thousands of machines in January 1942 and smashed them up for materials to help with the war effort. Pinball remained technically illegal in New York until 1976. Imagine, against this backdrop of moral panic, how people reacted to the introduction of electronic video games, and to the transformation that the arcade would undergo.

This was computing, in the 1950s. Photo: U.S. Army.

But in the early 1960s, computers still required a small room to house them. It wasn’t until the tail end of the decade that Bushnell, along with Ted Dabney, would develop the first ever coin-operated arcade video game for a company called Nutting Associates. It was called Computer Space. The game was released in 1971, and although it fell short of the manufacturer’s expectations and was considered something of a failure by Nutting (it was just too complicated to catch on in a big way outside of college campuses, Bushnell later posited), it still sold 1,500 units and made Bushnell and Dabney enough money to strike out on their own and continue making coin-operated electronic games.

Pong was the first game program Al Alcorn ever created.


Their company – originally called Syzygy Co. – was founded in 1971. Upon discovering that the name was already in use in California, the duo changed it to Atari, Inc. in 1972. “Atari” comes from the Japanese verb “ataru,” which literally means “to hit a target” and is associated with good fortune. The name is a term from the ancient Chinese board game Go, of which Bushnell was a fan; he essentially chose the company’s name from amongst the game’s strange jargon. In that context, atari means something closer to “I’m about to win” – like “check” in chess. Other name candidates, reportedly, were Sente and Hane.

Dabney invented the early technology that allowed dots to move on a screen without the assistance of an extremely expensive computer, and thereby essentially invented modern video games. It was called the Spot Motion Circuit, and it allowed a dot to move up, down, left and right on a screen. It was a different world from the supercomputers that Spacewar was running on, as it allowed dedicated cabinets to be manufactured at a reasonable cost with built-in boards. It was essentially the invention of the video game arcade cabinet.

The mediocre-performing Computer Space was the first ever commercially sold video game, but it was the newly founded Atari’s first game that would set the stage for the rapid evolution and soaring popularity of the arcade. In 1972, Bushnell attended a demonstration of the first-ever home video game console, the Magnavox Odyssey – a brown-and-beige plastic box released in August 1972 that played a small variety of silent games, including Table Tennis, a competitive tennis game that probably looks pretty familiar to you. The Odyssey sold around 330,000 units across North America and Europe, where it was released in 1973.

Pong would change everything. Photo: Chris Rand.

Magnavox’s tennis game was far from the first, of course. On the University of Utah campus computers, Bushnell likely played a few of them; a version of tennis called Tennis for Two was created as far back as 1958.

But none would break out like Atari’s Pong, released in 1972. It wasn’t Bushnell himself who created the program for Atari, but a new hire by the name of Al Alcorn, who had worked at Ampex alongside Atari’s founders as a junior engineer and had never so much as seen a video game until Bushnell showed him Computer Space. Pong was the first game program he ever created. Not bad, as far as starts go.

Nobody actually expected Pong to go anywhere; Al Alcorn, famously, was assigned it as a project to test his abilities, and it was never intended to be a commercial product. But what Al made, after months of work making it more efficient, turned out to be a lot of fun. The differences between Pong and the Magnavox tennis games might not seem that obvious now, but they were hugely significant then, especially within the technical confines of the time. Pong’s ball sped up the longer the game went on, and pinged off the paddles at different angles depending on where it was hit. The gaps at the top of the screen, actually the result of a quirk in the technology rather than intention, ensured that no game of Pong could go on forever, that there was always that tiny space for the ball to slip past. Plus, it had sound. That might not sound like much, but it turned digital tennis from absurdly dull to incredibly addictive.
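
Pong’s behaviour lived in dedicated circuitry rather than in software, so any code is only a loose analogy – but as a rough, hypothetical sketch of the two mechanics described above (the maximum deflection angle and the speed-up factor here are invented values, not Atari’s), the idea looks something like this in Python:

    # A toy sketch, NOT Atari's actual hardware logic: the deflection angle depends
    # on where the ball meets the paddle, and the ball speeds up with every hit.
    import math

    PADDLE_HALF_HEIGHT = 4.0              # arbitrary units (assumed)
    MAX_BOUNCE_ANGLE = math.radians(60)   # sharpest deflection, at the paddle's edge (assumed)
    SPEEDUP_PER_HIT = 1.05                # 5% faster after each paddle hit (assumed)

    def bounce(ball_y, paddle_center_y, speed):
        """Return the ball's new (vx, vy, speed) after a paddle hit."""
        # -1.0 at the paddle's bottom edge, +1.0 at its top edge
        offset = (ball_y - paddle_center_y) / PADDLE_HALF_HEIGHT
        offset = max(-1.0, min(1.0, offset))
        angle = offset * MAX_BOUNCE_ANGLE     # edge hits deflect more sharply
        speed *= SPEEDUP_PER_HIT              # rallies get faster the longer they last
        return speed * math.cos(angle), speed * math.sin(angle), speed

    # Example: a hit near the top edge of the paddle
    print(bounce(ball_y=3.5, paddle_center_y=0.0, speed=10.0))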

Article source: http://www.ign.com/articles/2014/03/20/ign-presents-the-history-of-atari

How to delete cookies and browsing history in Internet Explorer, Google Chrome …

Since you’re reading this, you’ve probably heard bad things about so-called ‘cookies’ and how you should delete them from your PC or laptop. We’ll show you how to delete cookies, but before you start, it’s worth understanding a cookie’s role and why you might actually want to keep it.

See all internet tutorials.

Cookies store information about you and your preferences on your hard disk, and websites use this information for various purposes. Some cookies are beneficial because they save you from having to set your preferences each time you visit a website. For example, you might change the currency on a shopping website from Euros to British Pounds. Without a cookie to store that information, you’d have to make that change each time you shopped on that site.

Some cookies store a lot more information, which some people might consider private and sensitive. Again, this could be used to make your life easier, but if that information includes which websites you visited after looking at a previous site – thereby tracking you as you browse the web – then you probably don’t want it on your computer.

Cookies can send data back to the webpages that you visit, and what the website (or the site’s owners) does with this data is where the danger lies. This is why deleting your cookies and – to a lesser degree – your browsing history is a good thing to do periodically.
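
To make that concrete, here is a minimal, hypothetical sketch – in Python, using the standard http.cookies module – of the kind of data a website might ask your browser to store. The cookie names are invented for illustration: one is a harmless preference, the other is the sort of unique identifier that can be used to track you.

    # Hypothetical example: what a server-side "Set-Cookie" instruction looks like.
    from http.cookies import SimpleCookie

    cookie = SimpleCookie()

    # A harmless preference cookie: remembers that you shop in British Pounds.
    cookie["currency"] = "GBP"
    cookie["currency"]["max-age"] = 60 * 60 * 24 * 365   # keep for a year
    cookie["currency"]["path"] = "/"

    # A tracking-style cookie: a unique ID that can follow you from visit to visit.
    cookie["visitor_id"] = "a1b2c3d4"
    cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365
    cookie["visitor_id"]["path"] = "/"

    # These are the Set-Cookie headers your browser would store on your hard disk.
    print(cookie.output())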

Again, you may not want to delete your history, since it’s convenient to have your browser auto-complete website addresses as you begin typing, prioritising sites you’ve visited recently over search engine results. You might also like to search your history to find a page you forgot to bookmark weeks or months ago.

With this in mind, deleting your cookies and internet browsing history is really very simple. We’ll cover the four main web browsers here, and you’ll need to follow the instructions for all the browsers you use, since they don’t share a common pool of cookies and visited websites.

How to delete cookies and browsing history: Microsoft Internet Explorer

Go to Tools (the cog icon at the top-right), and choose Internet Options. From here you will see a Browsing History heading halfway down the new window; click on Delete… A new window will pop up that will let you delete your cookies, history, temporary internet files, and form data too (this is typically your email, phone number, address and also passwords you’ve allowed your browser to save and use to fill out forms in the future).

In the latest version of Internet Explorer 11, there’s an additional option: Preserve Favourites website data. This automatically keeps cookies and temporary files for your favourite websites, and is worth ticking unless you really want to delete everything.

How to delete cookies and browsing history: Google Chrome

Located to the top right of the Chrome window you will see an icon with three horizontal bars. Click on this and then select Settings from the list.

From here you need to select History from the left-hand column. Now click the Clear Browsing Data button and tick the Cookies box. Note that the drop-down box at the top lets you choose the time period. This is handy as you can (as Google says) obliterate data from the past hour, day, week, last 4 weeks or everything.

Depending on the options you tick in the list, you can use this duration to selectively delete cookies, browsing history or any other items from the list.

Chrome also lets you delete specific cookies. To do this, instead of clicking on History as described above, click on Settings and – if necessary – click Show advanced settings. In the Privacy section, click the Content settings button.

In the Cookies section you can delete and block specific cookies. To see the list of stored cookies, click the All cookies and site data… button.

There will probably be a long list, and it’s an arduous task to go through them all to work out which to keep and which to delete, but it’s possible if you want to keep cookies for your favourite sites.

How to delete cookies and browsing history on Firefox

Click on the Firefox drop-down menu located in the top-left of the main window, then hover over History and choose Clear Recent History… from the menu that appears (or just press Ctrl-Shift-Del). This will bring up a new window that will let you delete your cookies and browsing history.

You don’t get quite the extensive list of options as you do with Chrome, but the duration drop-down menu lets you choose from 1 hour, 2 hours, 4 hours, today or everything.

How to delete cookies and browsing history in Safari

Deleting your cookies and browsing history in Apple’s Safari is just as simple as in the previous three browsers. In the menu bar, go to Safari > Reset Safari…, then tick Remove all cookies and Clear history. Unfortunately, you can’t choose a duration, so the process will remove all cookies and history when you click Reset.

That’s how to delete cookies and browsing history on Internet Explorer, Chrome, Firefox and Safari.

See also How to get free online storage.

Article source: http://www.pcadvisor.co.uk/how-to/internet/3218163/how-delete-cookies-web-browsing-history/

Maryland Day to celebrate state’s history, natural resources, culture

Maryland Day

“Scout” William Hogart, standing, a Historic Interpreter at the Historic Annapolis Foundation’s “Hogshead” home on Pinkney Street, speaks about the home to guests Liz Palermo, left, Alyssa Palermo, center, age 4, and Aaron Palermo, right. The Four Rivers Heritage Area held events around the Annapolis area in honor of Maryland Day.

Maryland Day

Phil and Ilse Reynolds, left, take a tour of the William Paca House and Gardens with docent Christina Csaszar, right. The Four Rivers Heritage Area held events around the Annapolis area in honor of Maryland Day.

Maryland Day

Siblings Rosemary Blomberg, left, age 7, Anne Blomberg, center, age 6, and Thomas Blomberg, right, age 4, get dressed in “colonial” style clothes made from modern garments, with help from Stella Breen-Franklin, owner of One Petticoat Lane, which hosted a colonial fashion show. The Four Rivers Heritage Area held events around the Annapolis area in honor of Maryland Day.



WHEN YOU GO

What: Maryland Day.

When: Friday, Saturday and Sunday.

Where: Various sites in Annapolis and south county.

Admission: Free or just $1.

Complete schedule: http://marylandday.org.

More info: 410-222-1805.

Posted: Tuesday, March 18, 2014, 8:15 a.m. Updated: 11:32 a.m.


By LAUREN FABER
lfaber@capgaznews.com

CapitalGazette.com

Plays, tours and displays on Colonial life will be part of this weekend’s seventh annual Maryland Day celebration.


Many events will be free or will have a $1 charge, organizers said.


Article source: http://www.capitalgazette.com/news/annapolis/maryland-day-to-celebrate-state-s-history-natural-resources-culture/article_69541d39-f62c-5401-9a39-f02e2e23f1ff.html

Americans Have Started Saying "Queue." Blame Netflix.

Back in Netflix’s early years, users baffled by the word “queue” used to call customer service to ask, “What’s my kway-way?” recalls Netflix communications director Joris Evers. This isn’t a question Netflix hears much anymore—and they can probably take some credit for that.

Not so long ago, the word “queue” would have sounded out of place outside the tech world or the United Kingdom, but it seems to be cropping up more and more in an American context. In the past month alone, the New York Times has used “queue” in reference to Fort Lee traffic, SXSW registration, and patrons of a San Francisco restaurant. Just last week, the Washington Post used it in an otherwise unremarkable story about new security lanes at Reagan National Airport: Before transport authorities decided to build new lanes, Lori Aratani wrote, the “long narrow hallway” at Terminal A “limited the number of passengers who could queue for screening.”

I can’t prove that Netflix is responsible. But as of January of this year, the company had 33 million subscribers in the U.S. That’s 33 million Americans who add the films they want to watch to a virtual “queue.” In Google searches originating in the U.S. since 2004, the word most commonly associated with “queue” is “Netflix,” though it might get some competition: Hulu has introduced its own “queue” function, and Amazon has adopted the term, too, inviting users to advertise the books they plan to read on a “Book Queue.” In 2011, a New York Times reader asked the site’s “Gadgetwise” blog how to create a “queue” of YouTube clips.

“Queue” has been commonplace in computing, in both British and American English, since the 1960s. “What you’re seeing is the surfacing of tech jargon,” said Grant Barrett, co-host of A Way with Words, a nationwide public radio show about language. “‘Queue’ has long been used in computer-programming to refer to a series of processes, tasks, or actions that happen, or will be run, one after another… Outgoing mail is added to a message ‘queue.’ Calculations are ‘queued’ to be run by a computer’s processor.”

“Even before Netflix, Americans would come across ‘printer queues,’” said Lynne Murphy, a linguist at the University of Sussex.
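
For anyone who hasn’t met the computing sense of the word, here is a toy sketch in Python – with made-up document names – of the first-in, first-out “queue” that Barrett and Murphy are describing:

    # A print queue as a simple FIFO data structure: jobs join at the back
    # and are handled strictly in the order they arrived.
    from collections import deque

    print_queue = deque()

    # Documents join the back of the queue...
    print_queue.append("report.docx")
    print_queue.append("holiday-photos.pdf")
    print_queue.append("tax-return.xlsx")

    # ...and the printer takes them from the front, one after another.
    while print_queue:
        job = print_queue.popleft()
        print(f"Printing {job} ({len(print_queue)} still queued)")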

The increasingly fluid channels between British and American media probably also played a role in the popularization of “queue.” Even if it grew out of computer jargon and was popularized by the likes of Netflix, it’s coming to be used in the more traditional English sense of waiting in line.

According to the Oxford English Dictionary, the first usage of “queue”—as “a line or sequence of people, vehicles, etc., waiting their turn to proceed, or to be attended to”—appears in the Scottish historian Thomas Carlyle’s 1837 The French Revolution: A History. All ten of the quotes the OED editors chose to represent the history of the word “queue,” from 1837 to 2005, are from English, Irish or Scottish authors. By OED definition, the word is “chiefly British.”

“The traffic between American English and British English is a lot heavier than the traffic between American English and any other language,” said University of Colorado Boulder lexicographer Orin Hargraves. “A lot more words travel back and forth and get established in the other dialect than they ever did before, mainly because of media and the Internet.”

But in Netflix’s case, the idea to use “queue” didn’t come from media or the Internet. Evers told me it was the brainchild of Neil Hunt, the company’s chief product officer. His country of origin? England.

Article source: http://www.newrepublic.com/article/116996/netflix-queue-and-history-british-word-america