Does It Help to Know History?

About a year ago, I wrote about some attempts to explain why anyone would, or ought to, study English in college. The point, I thought, was not that studying English gives anyone some practical advantage over non-English majors, but that it enables us to enter, as equals, into a long-existing, ongoing conversation. It isn’t productive in a tangible sense; it’s productive in a human sense. The action, whether rewarded or not, really is its own reward. The activity is the answer.

It might be worth asking similar questions about the value of studying, or at least, reading, history these days, since it is a subject that comes to mind many mornings on the op-ed page. Every writer, of every political flavor, has some neat historical analogy, or mini-lesson, with which to preface an argument for why we ought to bomb these guys or side with those guys against the guys we were bombing before. But the best argument for reading history is not that it will show us the right thing to do in one case or the other, but rather that it will show us why even doing the right thing rarely works out. The advantage of having a historical sense is not that it will lead you to some quarry of instructions, the way that Superman can regularly return to the Fortress of Solitude to get instructions from his dad, but that it will teach you that no such crystal cave exists. What history generally “teaches” is how hard it is for anyone to control it, including the people who think they’re making it.

Roger Cohen, for instance, wrote on Wednesday about all the mistakes that the United States is supposed to have made in the Middle East over the past decade, with the implicit notion that there are two histories: one recent, in which everything that the United States has done has been ill-timed and disastrous; and then some other, superior, alternate history, in which imperial Western powers sagaciously, indeed, surgically, intervened in the region, wisely picking the right sides and thoughtful leaders, promoting militants without aiding fanaticism, and generally aiding the cause of peace and prosperity. This never happened. As the Libyan intervention demonstrates, the best will in the world—and, seemingly, the best candidates for our support—can’t cure broken polities quickly. What “history” shows is that the same forces that led to the Mahdi’s rebellion in Sudan more than a century ago—rage at the presence of a colonial master; a mad turn towards an imaginary past as a means to equal the score—keep coming back and remain just as resistant to management, close up or at a distance, as they ever were. ISIS is a horrible group doing horrible things, and there are many factors behind its rise. But it came to be a threat and a power less because of all we didn’t do than because of certain things we did do—foremost among them that massive, forward intervention, the Iraq War. (The historical question to which ISIS is the answer is: What could possibly be worse than Saddam Hussein?)

Another, domestic example of historical blindness is the current cult of the political hypersagacity of Lyndon B. Johnson. L.B.J. was indeed a ruthless political operator and, when he had big majorities, got big bills passed—the Civil Rights Act, for one. He also engineered, and masterfully bullied through Congress, the Vietnam War, a moral and strategic catastrophe that ripped the United States apart and, more important, visited a kind of hell on the Vietnamese. It also led American soldiers to commit war crimes, almost all left unpunished, of a kind that it still shrivels the heart to read about. Johnson did many good things, but to hold him up as a model of leadership with which to rebuke Barack Obama or anyone else is marginally insane.

Johnson’s tragedy was critically tied to the cult of action, of being tough and not just sitting there and watching. But not doing things too disastrously is not some minimal achievement; it is a maximal achievement, rarely managed. Studying history doesn’t argue for nothing-ism, but it makes a very good case for minimalism: for doing the least violent thing possible that might help prevent more violence from happening.

The real sin that the absence of a historical sense encourages is presentism, in the sense of exaggerating our present problems out of all proportion to those that have previously existed. It lies in believing that things are much worse than they have ever been—and, thus, than they really are—or are uniquely threatening rather than familiarly difficult. Every episode becomes an epidemic, every image is turned into a permanent injury, and each crisis is a historical crisis in need of urgent aggressive handling—even if all experience shows that aggressive handling of such situations has, in the past, quite often made things worse. (The history of medicine is that no matter how many interventions are badly made, the experts who intervene make more: the sixteenth-century doctors who bled and cupped their patients and watched them die just bled and cupped others more.) What history actually shows is that nothing works out as planned, and that everything has unintended consequences. History doesn’t show that we should never go to war—sometimes there’s no better alternative. But it does show that the results are entirely uncontrollable, and that we are far more likely to be made by history than to make it. History is past, and singular, and the same year never comes round twice.

Those of us who obsess, for instance, particularly in this centennial year, on the tragedy of August, 1914—on how an optimistic and largely prosperous civilization could commit suicide—don’t believe that the trouble then was that nobody read history. The trouble was that they were reading the wrong history, a make-believe history of grand designs and chess-master-like wisdom. History, well read, is simply humility well told, in many manners. And a few sessions of humility can often prevent a series of humiliations. What should, say, the advisers to Lord Grey, the British foreign secretary, have told him a century ago? Surely something like: Let’s not lose our heads; the Germans are a growing power who can be accommodated without losing anything essential to our well-being and, perhaps, shaping their direction; Serbian nationalism is an incident, not a casus belli; the French are understandably determined to take back Alsace-Lorraine, but this is not terribly important to us—nor to them either, really, if they could be made to see that. And the Ottoman Empire is far from the worst arrangement of things that can be imagined in that part of the world. We will not lose our credibility by failing to sacrifice a generation of our young men. Our credibility lies, exactly, in their continued happy existence.

Many measly compromises would have had to be made by the British; many challenges postponed; many opportunities for aggressive, forward action shirked—and the catastrophe, which set the stage and shaped the characters for the next war, would have been avoided. That is historical wisdom, the only wisdom history supplies. The most tempting lesson that history gives is to not tempt it. Those who simply repeat history are condemned to leave the rest of us to read all about that repetition in the news every morning.

Article source: http://www.newyorker.com/news/daily-comment/help-know-history

Bafétimbi Gomis: I researched Swansea on Football Manager computer game

Bafétimbi Gomis has admitted using the Football Manager game to learn about Swansea City and their players before he signed for the club.

The 29-year-old was aware of Garry Monk’s interest in him while he was playing for Lyon last season, and used his time travelling across Europe to research the club and its players with the aid of the computer game.

Gomis, who scored his first Swansea goal in the 1-0 Capital One Cup win over Rotherham on Tuesday, having impressed from the substitutes’ bench in the 2-1 win at Manchester United on the Premier League’s opening weekend, told his new club’s website: “I play a lot of Football Manager. During my time with my previous club, we travelled quite a bit for European matches. Therefore, I used my spare time on the plane to play. I’ve been playing the game ever since my development stages, but I have found it very helpful in helping me find out more about Swansea.

“Before I signed here, I spent a month playing as Swansea to help me get to know my team-mates – to find out a bit more about them. Of course, I also watched video footage to see how the team played, but it is true that the game helped me learn a lot about each of my team-mates’ characteristics – their age, where they used to play and their attributes.”

Gomis says the best-selling game even helped him learn about the history of the south Wales club – and their manager. “I believe that, when signing for a club, it’s vital that you learn about its history,” he explained. “It was important to know what kind of club Swansea are, their rivalry with Cardiff City, as well as other such things.

“I realised that the manager Garry Monk was a key player for Swansea, who helped the club rise to the top division of English football. Since Swansea have followed me for some time, I have followed them too and I was very surprised with the quality and mind-set within this team. But, now, it is not a surprise to me, given that Garry Monk is the manager.”

This season, Premier League clubs have started using Football Manager’s database of players to help identify and recruit new signings.

Article source: http://www.theguardian.com/football/2014/aug/28/bafetimbi-gomis-football-manager-computer-game-swansea-city-lyon

Computer security threats: A brief history

Print and online media have given extensive coverage to the recent security breach where Russian hackers stole more than a billion passwords, usernames and email addresses.

In light of this and similar threats, IT security and protecting sensitive data are more important than ever. Over time, computer security threats have become much more sophisticated and more damaging. But this evolution has happened over decades. Tracking these changes reveals some fascinating insights into how criminals have worked to change their tactics and how businesses have responded.

Early security problems: moths and Cap’n Crunch

One of the first recorded computer security threats actually didn’t come from a human. In 1947, Grace Murray Hopper (later a rear admiral) found a moth among the relays of a Navy computer and called it a “bug.” From this, the term “debugging” was born. It wasn’t until the 1960s that humans started exploiting networks. From 1964 to 1970, AT&T caught hundreds of people obtaining free phone calls through the use of tone-producing “blue boxes.” Later in the 1970s, John Draper found another way to make free phone calls by using a blue box and a plastic toy whistle that came in Cap’n Crunch cereal boxes. The two items combined to replicate a tone that unlocked AT&T’s phone network.
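The whistle trick worked because AT&T’s long-distance trunks used in-band signalling: the same audio channel that carried the caller’s voice also carried the network’s control tones, and a 2600 Hz tone marked a trunk as idle. As a purely illustrative sketch (it touches no phone network), the short Python script below generates two seconds of a 2600 Hz sine tone and writes it to a WAV file; the file name, sample rate and duration are our own arbitrary choices.

```python
# Generate the 2600 Hz tone the Cap'n Crunch whistle produced and save it
# as a WAV file. Illustration only: modern networks signal out-of-band,
# so this tone controls nothing.
import math
import struct
import wave

SAMPLE_RATE = 8000   # samples per second; telephone-grade audio
FREQ_HZ = 2600       # the supervisory tone old AT&T trunks listened for
DURATION_S = 2.0

with wave.open("tone_2600.wav", "wb") as wav:
    wav.setnchannels(1)   # mono
    wav.setsampwidth(2)   # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    for i in range(int(SAMPLE_RATE * DURATION_S)):
        sample = math.sin(2 * math.pi * FREQ_HZ * i / SAMPLE_RATE)
        wav.writeframes(struct.pack("<h", int(sample * 32767)))
```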

The rise of worms and viruses

By 1979, computer threats had taken on another form. That year, researchers created the first computer worm. Originally intended as a helpful utility, the program was modified by hackers so that it would destroy and alter data. Just a few years later, computer viruses were created. By 1988, damage became widespread as a worm disabled around 6,000 computers connected to the Advanced Research Projects Agency Network. And by 1990, the first self-modifying viruses were created.

Going global: worldwide attacks

When the mid-1990s hit, viruses went international as the first Microsoft Word-based virus using macro commands spread all over the world. In 1998, hackers took control of more than 500 government, military, and private computer systems with the “Solar Sunrise” attacks. Two years later, other hackers were able to crash Amazon, Yahoo and eBay’s websites. In 2001, the Code Red worm ended up causing $2 billion in damage by infecting Microsoft Windows NT and Windows 2000 server software. The large-scale attacks continued into 2006, when anywhere from 469,000 to one million computers were infected with the Nyxem virus.

Explosive connection, rapid infection

In the mid-2000s, as people connected to the Internet like never before, widespread infection rates exploded as well. The Storm Worm virus in 2007 and the Koobface virus in 2008 used emails and social media to spread rapidly, infecting millions of computers. Hackers also stole data with the Conficker worm in 2009. In 2013 one of the most infamous attacks occurred, when hackers gained access to retail giant Target’s servers, leading to the theft of 70 million customer records. And in 2014 the Heartbleed bug was disclosed: a flaw, present in the OpenSSL security software library since 2012, that let attackers read sensitive data such as passwords straight out of a server’s memory.
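Heartbleed belonged to a simple class of bug: a server echoes back as many bytes as a request claims to contain, without checking that claim against the bytes it actually received. The Python below is a deliberately simplified simulation of that pattern, not OpenSSL’s actual code; the buffer, the planted secret and the two-argument handler are all invented for illustration.

```python
# Simulate a Heartbleed-style over-read. The handler trusts the length the
# client *claims* instead of the length it actually sent, so the reply runs
# past the request into neighbouring "memory" and leaks whatever is there.
MEMORY = bytearray(64 * 1024)                    # stand-in for the process heap
SECRET = b"session_key=3f9a...password=hunter2"  # pretend a secret lives nearby
MEMORY[100:100 + len(SECRET)] = SECRET

def receive_heartbeat(payload: bytes, claimed_len: int) -> bytes:
    MEMORY[0:len(payload)] = payload             # request lands at the start of memory
    # BUG: no bounds check; the fix is claimed_len = min(claimed_len, len(payload))
    return bytes(MEMORY[0:claimed_len])

print(receive_heartbeat(b"ping", 4))             # honest client gets b'ping' back
leak = receive_heartbeat(b"ping", 200)           # attacker claims 200 bytes...
print(leak[100:100 + len(SECRET)])               # ...and reads the planted secret
```

In the real bug the over-read happened in C, so the extra bytes came from whatever sat next to the request on the heap: session keys, passwords, even private keys.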

As you can see, computer security threats are nothing new. But as they get bigger and bolder, companies have to protect their data in new ways. Clearly, the “patch and pray” approach can’t keep the bad guys at bay any more. But what can? The race is on for bigger, better connected and more proactive solutions that stay one step ahead. A few years from now, it sure would be great to read a History of Computer Threats article calling this Russian data heist the last big caper of its kind. Meanwhile, change your passwords!


Article source: http://techpageone.dell.com/technology/security-it/computer-security-threats-a-brief-history/

Federal prosecutions not easy in police shootings

Associated Press | Posted: Tuesday, August 26, 2014, 7:27 pm; updated 8:04 pm

WASHINGTON (AP) — As the Justice Department probes the police shooting of an unarmed 18-year-old in Missouri, history suggests there’s no guarantee of a criminal prosecution, let alone a conviction.

Federal authorities investigating possible civil rights violations in the Aug. 9 death of Michael Brown in the St. Louis suburb of Ferguson must meet a difficult standard of proof, a challenge that has complicated the path to prosecution in past police shootings.

Article source: http://www.maryvilledailyforum.com/news/state_news/article_38d4203d-d770-5df0-a641-75262b754b9d.html

Sheena’s Bakery has long history downtown

[Photos: breakfast pastries in the display case, Sheena Tillman at the counter, the turkey-and-pepperjack sandwich, a deli tray of meats, cheeses and chicken salad, and ready-made lunches at Sheena’s Bakery Deli. MICHAEL WYKE/Tulsa World]

SHEENA’S BAKERY DELI
9 E. Fifth St.
918-584-1772
Food: 2.5 stars
Atmosphere: 2 stars
Service: order at counter
(on a scale of 0 to 4 stars)
Hours: 6:30 a.m. to 2 p.m. Monday-Friday; accepts all major credit cards.

By SCOTT CHERRY, World Restaurant Critic, TulsaWorld.com | Posted: Tuesday, August 26, 2014, 10:30 am

Although Sheena Tillman has served thousands of sandwiches, salads and cookies to downtown Tulsa lunch diners, she thinks Sheena’s Bakery Deli still holds “hidden gem” status.

“I still meet people who have worked downtown for 20 years who didn’t know we were here,” Tillman said. “They usually are happy when they find us, though.”

Article source: http://www.tulsaworld.com/weekend/foodreview/sheena-s-bakery-has-long-history-downtown/article_264c16d5-44e2-542e-8b9e-a5b8b9f1339f.html

Web history of alleged foetus thief questioned

Johannesburg – A woman accused of faking her own pregnancy could not have visited pregnancy-related websites at work because she did not have access to the internet, the South Gauteng High Court in Johannesburg heard on Tuesday.

“It is the accused’s version that she did not have any access to the internet [on that computer],” said Carla van Veenendaal, for Loretta Cook.

Cook is on trial for allegedly murdering Velencia Behrens in January 2011 by cutting her open in a bid to steal her unborn child, after faking her own pregnancy.

Cook is also charged with the attempted murder of the child, who survived.

Computer forensics expert Marius Myburgh testified that he examined the hard drive of the computer Cook used at the financial services firm where she worked.

He said the user profile on the computer was under Cook’s name and several sites had been visited in the three months before she took maternity leave in December 2011, including one for “pregnancy signs and symptoms”.

“The computer was definitely connected to the internet and was on the internet,” Myburgh testified.
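For readers curious what that sort of examination looks like in practice, here is a minimal sketch of one step: querying a browser-history database for visits whose URLs match a keyword. It assumes a Firefox-style places.sqlite copied out of a disk image (the path is hypothetical), and it stands in for the dedicated, write-blocked tooling a real forensic examiner would use.

```python
# Query a copied browser-history database for URLs containing a keyword.
import sqlite3

DB_PATH = "evidence/places.sqlite"   # hypothetical path to a copy from the image

def visits_matching(keyword: str):
    """Return (url, visit_date) rows whose URL contains the keyword."""
    con = sqlite3.connect(DB_PATH)
    try:
        return con.execute(
            """SELECT p.url, v.visit_date
                 FROM moz_places AS p
                 JOIN moz_historyvisits AS v ON v.place_id = p.id
                WHERE p.url LIKE ?
                ORDER BY v.visit_date""",
            (f"%{keyword}%",),
        ).fetchall()
    finally:
        con.close()

for url, visit_date in visits_matching("pregnancy"):
    print(visit_date, url)
```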

Last week a doctor who examined Cook shortly after the alleged murder testified that Cook was not pregnant.

Myburgh said he was “guided” by the investigating officer to look for any activity on the computer relating to pregnancy.

Myburgh also said he had found a few deleted Word documents of relevance, but the details of these documents were not revealed in court.

Van Veenendaal argued that her client worked in an open-plan office, that anyone could have used the computer, and that she was not required to enter a password to access it.

She asked Myburgh whether he could accurately testify as to who had used her computer.

“I could see the user profile Loretta Cook was active and logged in; I cannot place a physical person behind the computer,” said Myburgh.

Article source: http://www.news24.com/SouthAfrica/News/Web-history-of-alleged-foetus-thief-questioned-20140826

Missouri History Museum hosts Ferguson town hall

Associated Press | Posted: Sunday, August 24, 2014, 10:01 am; updated 1:03 pm

ST. LOUIS (AP) — The Missouri History Museum in Forest Park is hosting a town hall meeting Monday night on the Ferguson police shooting of Michael Brown.

The event is hosted by New York activist and author Kevin Powell. The St. Louis museum is calling the free event a “safe space for young people to speak their minds and older adults to listen.”

Article source: http://www.maryvilledailyforum.com/news/state_news/article_1d4f9dc1-d94e-5a71-84b1-94ec9f083614.html

Tech Time Warp of the Week: Watch Apple’s Awkwardly Wrong Prediction of the …

Apple has a long history of weird, self-serving company videos that elevate its computer-and-gadget operation to nothing short of a global superpower. But this is something else.

In 1987, two years after founder Steve Jobs was run out of the company, Apple produced a video that predicted a phantasmagorically glorious future for the maker of the Macintosh. It may be the oddest, most brilliant, and most horribly wrong prediction anyone has ever made. In the seven-minute clip, CEO John Sculley, Apple II chief Del Yocam, Apple exec Mike Spindler, and that other cofounder—Steve “The Woz” Wozniak—envisioned what Apple would be like in the year 1997. And let’s just say they didn’t hit the bullseye.

In Cupertino’s vision of a future 1997, Apple dominates the news, the markets, even stand-up comedy. Wall Street loves the company, and its growth is skyrocketing. The original Macs haven’t changed all that much, and Apple computers are everywhere—in living rooms and kitchens, at the airport, on planes, in space, and, well, on your face.

Yes, history would play out somewhat differently than the Applemaniacs hoped it would. By the real 1997, Apple was in the gutter. Sculley had been kicked out of the company four years before, only to be replaced by Spindler, who would join him in the graveyard of ousted CEOs three years later.

Sure, Apple’s late-80s video was meant as a bit of a joke. But humor has never been the company’s strong point. Exhibit A: the video’s prediction that the Apple of 1997 would sell a version of its ancient Apple II desktop computer known as the V.S.O.P. Apparently, this stands for “Very Smooth Old Processor.” As Yocam puts it: “This being 1997, some people think the Apple II concept is getting old. We don’t agree.”

Other bits don’t miss the mark quite so badly. The video predicts something called VistaMac, which isn’t all that different from Google Glass, the digital eyewear that is now very much a reality. Of course, unlike the VistaMac, Google Glass doesn’t take floppy disks—and it doesn’t look so very late-80s.

The video also presages a few things we now take for granted, including recommendation systems and ubiquitous virtual assistants that help us navigate the world. “A computer that talks is no big deal. A computer that listens? That’s a breakthrough,” says Woz. “Apple computers have always been friendly, but we’ve gone from friendly to understanding.” Sounds a bit like Siri—though we hasten to add that even Siri doesn’t quite work as promised.

To be fair, Apple did have the last laugh. Though the company’s crazy predictions didn’t exactly come true, 1997 actually turned out to be a very important year for the company. In 1997, Steve Jobs came back, as Apple purchased his new company, NeXT. And he was smart enough to realize the V.S.O.P. would never fly.

Article source: http://www.wired.com/2014/08/apple-awkward-future/

History Lesson: Commodore VIC-20, the computer forever in its successor’s …

For a machine with a blokey British naming convention about it, like a TV channel called Dave, there’s barely a sniff of Blighty in the VIC-20’s heritage.

After all, having been created in America (the home of Commodore) and initially launched in Japan, it should come as no great surprise that the initials merely stood for Video Interface Chip.

Regardless of its upbringing, back in 1981, among Clive Sinclair’s early dabbling and sci-fi whispers of the BBC Micro, the VIC-20 knew how to draw attention: through affordability.

And so Commodore’s offering, with its chunky keyboard, became many an old-school gamer’s entry point into a life of technological jollies.

Born from Commodore founder Jack Tramiel’s urge to get in before a predicted Japanese PC takeover, the VIC nestled between Commodore’s PET business machines and the beige titan that was the C64, not just chronologically but in its half-education, half-games approach.

That it was “fully expandable to 27.5k user RAM” and had a “full set of upper and lower case characters” were among the pant-dropping bullet points.

But as with its UK competition, and US rivals like the Apple II and Atari 400, tight restrictions only forced the VIC’s bedroom game-builders to whip their brains harder. Memory expansion packs helped, from 3k all the way up to 16k (or even a monstrous third-party 64k).

Hundreds of games loped into the wild on cassette and cartridge as the system pulled in players, but Commodore also offered kit like the VIC Printer and Disk Unit to turn it into a “super-computer for the professional”.

A canny combination of timing, support and marketing sealed the VIC-20’s place among those early ’80s machines: it got four good years and close to three million sales before the C64 built up enough steam to knock it on the head in 1985.

And while its successor may have left a bigger footprint, 33 years down the line you can still find VIC enthusiasts coaxing unlikely tricks out of the old 5k box.

Live at the Old VIC

The C64 and Amiga may be regarded as Commodore’s best systems for games, but the humble VIC-20 had a few crackers too.

Have a look at the best of what it had on offer and if you’re a young ’un, look at the sort of things we used to play before Mario stepped into his overalls.

Article source: http://www.computerandvideogames.com/475111/features/history-lesson-commodore-vic-20-the-computer-forever-in-its-successors-shadow/

How 21st-Century Cities Can Avoid the Fate of 20th-Century Detroit

SA Forum is an invited essay from experts on topical issues in science and technology.

In Coningsby, a Benjamin Disraeli novel published in 1844, a character impressed with the technological spirit of the age remarks, “I see cities peopled with machines. Certainly Manchester is the most wonderful city of modern times.”

Today, of course, Manchester is mainly associated with urban decline. There is a simple economic explanation for this, and one that can help guide cities and nations as they prepare for another technological revolution.

Although new technologies have become available everywhere, only some cities have prospered as a result. As the late economist and historian David Landes famously noted, “Prosperity and success are their own worst enemies.” Prosperous places may indeed become self-satisfied and less interested in progress. But manufacturing cities such as Manchester and Detroit did not decline because of a slowdown in technology adoption. On the contrary, they consistently embraced new technologies and increased the efficiency and output of their industries. Yet they declined. Why?

The reason is that they failed to produce new employment opportunities to replace those that were being eroded by technological change. Instead of taking advantage of technological opportunities to create new occupations and industries, they adopted technologies to increase productivity by automating their factories and displacing labor.

The fate of manufacturing cities such as Manchester and Detroit illustrates an important point: long-run economic growth is not simply about increasing productivity or output—it is about incorporating technologies into new work. Having nearly filed for bankruptcy in 1975, New York City has become a prime case of how to adapt to technological change. Whereas average wages in Detroit were still slightly higher than in New York in 1977, they are now less than 60 percent of New York’s. At a time when Detroit successfully adopted computers and industrial robots to substitute for labor, New York adapted by creating new employment opportunities in professional services, computer programming and software engineering.

Long-run economic growth entails the eclipse of mature industries by new ones. To stave off stagnation, my own research with Thor Berger of Lund University suggests that cities need to manage the transition into new work (pdf).

Such technological resilience requires an understanding of the direction of technological change. Unfortunately, economic history does not necessarily provide obvious guidance for policy makers who want to predict how technological progress will reshape labor markets in the future. For example, although the industrial revolution created the modern middle class, the computer revolution has arguably caused its decline.

To understand how technology will alter the nature of work in the years ahead, we need to look at the tasks computers are and will be able to perform. Whereas computerization historically has been confined to routine tasks involving explicit rule-based activities, it is now spreading to domains commonly defined as nonroutine. In particular, sophisticated algorithms are becoming increasingly good at pattern recognition and are rapidly entering domains long confined to labor. What this means is that a wide range of occupations in transportation and logistics, administration, services and sales will become increasingly vulnerable to automation in the coming decades. Worse, research suggests that the next generation of big data–driven computers will mainly substitute for low-income, low-skill jobs, exacerbating already growing wage inequality (pdf).

If jobs for low-skill workers disappear, those workers will need to find jobs that are not susceptible to computerization. Such work will likely require a higher degree of human social intelligence and creativity—domains where labor will hold a comparative advantage, despite the diffusion of big data–driven technologies.

The reason why Bloom Energy, Tesla Motors, eBay and Facebook all recently emerged in (or moved to) Silicon Valley is straightforward: the presence of adaptable skilled workers who are willing to relocate to the companies with the most promising innovations. Importantly, local universities, such as Stanford and U.C. Berkeley, have incubated ideas, educated workers and fostered technological breakthroughs for decades. Since Frederick Terman, the dean of Stanford’s School of Engineering, encouraged two of his students, William Hewlett and David Packard, to found Hewlett–Packard in 1938, Stanford alumni have created 39,900 companies and about 5.4 million jobs.

For cities to prosper, they need to promote investment in relevant skills to attract new industries and enable workers to shift into new occupations. Big-data architects, cloud services specialists, iOS developers, digital marketing specialists and data scientists provide examples of occupations that barely existed only five years ago, resulting from recent technological progress. According to our estimates, people working in digital industries are on average much better educated and, for any given level of education, are more likely to have a science, technology, engineering or mathematics (STEM) degree. By contrast, workers with professional degrees are seen less often in new industries, reflecting the fact that new work requires adaptable cognitive abilities rather than job-specific skills. It is thus not surprising that we find San Jose, Santa Fe, San Francisco and Washington, D.C., among the places that have most successfully adapted to the digital revolution.

The cities that invest in the creation of an adaptable labor force will remain resilient to technological change. Policies to promote technological resilience thus need to focus on supplying technically skilled individuals and encouraging entrepreneurial risk-taking. For example, the National Science Foundation recently provided a grant to North Carolina Central University to integrate entrepreneurship into scientific education. More such initiatives are needed. Furthermore, immigration policies need to be made attractive to high-skill workers and entrepreneurs. Cities like New York and London owe much of their technological dynamism to their ability to attract talent.

Meanwhile, it is important to bear in mind that policies designed to support output created by old work are not a recipe for prosperity. Whereas General Motors has rebounded since its 2009 bailout, Detroit filed for Chapter 9 bankruptcy in 2013. Instead of propping up old industries, officials should focus on managing the transition of the workforce into new work. The places that do so successfully will be at the frontier. As argued by Jane Jacobs in more colorful terms: “Our remote ancestors did not expand their economies much by simply doing more of what they had already been doing…. They expanded their economies by adding new kinds of work. So do we.”

Article source: http://www.scientificamerican.com/article/how-21st-century-cities-can-avoid-the-fate-of-20th-century-detroit/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+sciam%2Fhistory-of-science+(Topic%3A+History+of+Science)