Federal prosecutions not easy in police shootings

FILE – This Aug. 12, 2014 file photo shows protesters standing on a street in Ferguson, Mo. Racial tensions have run high in the predominantly black city of Ferguson following the shooting death by police of Michael Brown, 18, an unarmed black man. (AP Photo/Jeff Roberson, File)

FILE – This Aug. 20, 2014 file-pool photo shows Attorney General Eric Holder talking with Capt. Ron Johnson of the Missouri State Highway Patrol at Drake’s Place Restaurant in Florissant, Mo. (AP Photo/Pablo Martinez Monsivais, File-Pool)

Posted: Tuesday, August 26, 2014 7:27 pm

Updated: 8:04 pm, Tue Aug 26, 2014.

Associated Press


WASHINGTON (AP) — As the Justice Department probes the police shooting of an unarmed 18-year-old in Missouri, history suggests there’s no guarantee of a criminal prosecution, let alone a conviction.

Federal authorities investigating possible civil rights violations in the Aug. 9 death of Michael Brown in the St. Louis suburb of Ferguson must meet a difficult standard of proof, a challenge that has complicated the path to prosecution in past police shootings.


      Article source: http://www.maryvilledailyforum.com/news/state_news/article_38d4203d-d770-5df0-a641-75262b754b9d.html

      Sheena’s Bakery has long history downtown

Breakfast pastries, such as cinnamon rolls, bagels and turnovers, in the display case at Sheena’s Bakery Deli. MICHAEL WYKE/Tulsa World

Sheena Tillman at the counter of Sheena’s Bakery Deli. MICHAEL WYKE/Tulsa World

The turkey-and-pepperjack sandwich from Sheena’s Bakery Deli. MICHAEL WYKE/Tulsa World

A selection of meats, cheeses and chicken salad on a deli tray from Sheena’s Bakery Deli. MICHAEL WYKE/Tulsa World

Ready-made lunches wait for customers to snag them at Sheena’s Bakery Deli. MICHAEL WYKE/Tulsa World


      9 E. Fifth St.


      Food: 2.5 stars

      Atmosphere: 2 stars

      Service: order at counter

      (on a scale of 0 to 4 stars)

      6:30 a.m. to 2 p.m. Monday-Friday; accepts all major credit cards.

      Posted: Tuesday, August 26, 2014 10:30 am

      World Restaurant Critic



      Although Sheena Tillman has served thousands of sandwiches, salads and cookies to downtown Tulsa lunch diners, she thinks Sheena’s Bakery Deli still holds “hidden gem” status.

      “I still meet people who have worked downtown for 20 years who didn’t know we were here,” Tillman said. “They usually are happy when they find us, though.”


      Article source: http://www.tulsaworld.com/weekend/foodreview/sheena-s-bakery-has-long-history-downtown/article_264c16d5-44e2-542e-8b9e-a5b8b9f1339f.html

      Web history of alleged foetus thief questioned

      Johannesburg – A woman accused of faking her own
      pregnancy could not have visited pregnancy-related websites at work because she
      did not have access to the internet, the South Gauteng High Court in
      Johannesburg heard on Tuesday.

      “It is the accused’s version that she did not have
      any access to the internet [on that computer],” said Carla van Veenendaal,
      for Loretta Cook.

      Cook is on trial for allegedly murdering Velencia Behrens
      in January 2011 by cutting her open in a bid to steal her unborn child, after
      faking her own pregnancy.

      Cook is also charged with the attempted murder of the
      child, who survived.

Computer forensics expert Marius Myburgh testified that he examined the hard drive of the computer Cook used at the financial services firm where she worked.

      He said the user profile of the computer was under Cook’s
      name and several sites had been visited in the three months before she took maternity
      leave in December 2011, including one for “pregnancy signs and

      “The computer was definitely connected to the
      internet and was on the internet,” Myburgh testified.

      Last week a doctor who examined Cook shortly after the
      alleged murder testified Cook was not pregnant.

Myburgh said he was “guided” by the investigating officer to look for any activity on the computer relating to the pregnancy.
Myburgh also said he had found a few deleted Word documents of relevance, but the details of these documents were not revealed in court.
      Van Veenendaal argued that her client worked in an
      open-plan office and anyone could have used the computer and that she was not
      required to enter a password to access her computer.

      She asked Myburgh if he could accurately testify who had
      used her computer.

      “I could see the user profile Loretta Cook was
      active and logged in, I cannot place a physical person behind the
      computer,” said Myburgh.

      Article source: http://www.news24.com/SouthAfrica/News/Web-history-of-alleged-foetus-thief-questioned-20140826

      Missouri History Museum hosts Ferguson town hall

      Posted: Sunday, August 24, 2014 10:01 am

      Updated: 1:03 pm, Sun Aug 24, 2014.

Associated Press


      ST. LOUIS (AP) — The Missouri History Museum in Forest Park is hosting a town hall meeting Monday night on the Ferguson police shooting of Michael Brown.

      The event is hosted by New York activist and author Kevin Powell. The St. Louis museum is calling the free event a “safe space for young people to speak their minds and older adults to listen.”


          Article source: http://www.maryvilledailyforum.com/news/state_news/article_1d4f9dc1-d94e-5a71-84b1-94ec9f083614.html

          Tech Time Warp of the Week: Watch Apple’s Awkwardly Wrong Prediction of the …

          Apple has a long history of weird, self-serving company videos that elevate its computer-and-gadget operation to nothing short of a global superpower. But this is something else.

          In 1987, two years after founder Steve Jobs was run out of the company, Apple produced a video that predicted a phantasmagorically glorious future for the maker of the Macintosh. It may be the oddest, most brilliant, and horribly wrong prediction anyone has ever made. With the 7-minute clip, which you can enjoy above, CEO John Sculley, Apple II chief Del Yocam, Apple exec Mike Spindler, and that other cofounder—Steve “The Woz” Wozniak—envisioned what Apple would be like in the year 1997. And let’s just say they didn’t hit the bullseye.

          In Cupertino’s vision of a future 1997, Apple dominates the news, the markets, even stand-up comedy. Wall Street loves the company, and its growth is skyrocketing. The original Macs haven’t changed all that much, and Apple computers are everywhere—in living rooms and kitchens, at the airport, on planes, in space, and, well, on your face.

          Yes, history would play out somewhat differently than the Applemaniacs hoped it would. By the real 1997, Apple was in the gutter. Sculley had been kicked out of the company four years before, only to be replaced by Spindler, who would join him in the graveyard of ousted CEOs three years later.

Sure, Apple’s late-80s video was meant as a bit of a joke. But humor has never been the company’s strong point. Exhibit A: the video’s prediction that the Apple of 1997 would sell a version of its ancient Apple II desktop computer known as the V.S.O.P. Apparently, this stands for “Very Smooth Old Processor.” As Yocam puts it: “This being 1997, some people think the Apple II concept is getting old. We don’t agree.”

          Other bits don’t miss the mark quite so badly. The video predicts something called VistaMac, which isn’t all that different from Google Glass, the digital eyewear that is now very much a reality. Of course, unlike the VistaMac, Google Glass doesn’t take floppy disks—and it doesn’t look so very late-80s.

          The video also presages a few things we now take for granted, including recommendation systems and ubiquitous virtual assistants that help us navigate the world. “A computer that talks is no big deal. A computer that listens? That’s a breakthrough,” says Woz. “Apple computers have always been friendly, but we’ve gone from friendly to understanding.” Sounds a bit like Siri—though we hasten to add that even Siri doesn’t quite work as promised.

To be fair, Apple did have the last laugh. Though the company’s crazy predictions didn’t exactly come true, 1997 actually turned out to be a very important year for the company. In 1997, Steve Jobs came back, as Apple purchased his new company, NeXT. And he was smart enough to realize the V.S.O.P. would never fly.

          Article source: http://www.wired.com/2014/08/apple-awkward-future/

          History Lesson: Commodore VIC-20, the computer forever in its successor’s …

For a machine with a blokey British naming convention about it, like a TV channel called Dave, there’s barely a sniff of Blighty in the VIC-20’s heritage.

After all, having been created in America (the home of Commodore) and initially launched in Japan, it should come as no great surprise that the initials merely stood for Video Interface Chip.


          Regardless of its upbringing, back in 1981, among Clive Sinclair’s early dabbling and sci-fi whispers of the BBC Micro, the VIC-20 knew how to draw attention: through affordability.

          And so Commodore’s offering, with its chunky keyboard, became many an old-school gamer’s entry point into a life of technological jollies.

          Born from Commodore founder Jack Tramiel’s urge to get in before a predicted Japanese PC takeover, the VIC nestled between Commodore’s PET business machines and the beige titan that was the C64, not just chronologically but in its half-education, half-games approach.

          That it was “fully expandable to 27.5k user RAM” and had a “full set of upper and lower case characters” were among the pant-dropping bullet points.

          But as with its UK competition, and US rivals like the Apple II and Atari 400, tight restrictions only forced the VIC’s bedroom game-builders to whip their brains harder. Memory expansion packs helped, from 3k all the way up to 16k (or even a monstrous third-party 64k).

          Hundreds of games loped into the wild on cassette and cartridge as the system pulled in players, but Commodore also offered kit like the VIC Printer and Disk Unit to turn it into a “super-computer for the professional”.

A canny combination of timing, support and marketing sealed the VIC-20’s place among those early ’80s machines: it got four good years and close to three million sales before the C64 built up enough steam to knock it on the head in 1985.

          And while its successor may have left a bigger footprint, 33 years down the line you can still find VIC enthusiasts coaxing unlikely tricks out of the old 5k box.

          Live at the Old VIC

          The C64 and Amiga may be regarded as Commodore’s best systems for games, but the humble VIC-20 had a few crackers too.

          Have a look at the best of what it had on offer and if you’re a young ‘un, look at the sort of things we used to play before Mario stepped into his overalls.

          Header image courtesy of Marcin Wichary

          Article source: http://www.computerandvideogames.com/475111/features/history-lesson-commodore-vic-20-the-computer-forever-in-its-successors-shadow/

          How 21st-Century Cities Can Avoid the Fate of 20th-Century Detroit

          SA Forum is an invited essay from experts on topical issues in science and technology.

          In Coningsby, a Benjamin Disraeli novel published in 1844, a character impressed with the technological spirit of the age remarks, “I see cities peopled with machines. Certainly Manchester is the most wonderful city of modern times.”

          Today, of course, Manchester is mainly associated with urban decline. There is a simple economic explanation for this, and one that can help guide cities and nations as they prepare for another technological revolution.

          Although new technologies have become available everywhere, only some cities have prospered as a result. As the late economist and historian David Landes famously noted, “Prosperity and success are their own worst enemies.” Prosperous places may indeed become self-satisfied and less interested in progress. But manufacturing cities such as Manchester and Detroit did not decline because of a slowdown in technology adoption. On the contrary, they consistently embraced new technologies and increased the efficiency and output of their industries. Yet they declined. Why?

The reason is that they failed to produce new employment opportunities to replace those that were being eroded by technological change. Instead of taking advantage of technological opportunities to create new occupations and industries, they adopted technologies to increase productivity by automating their factories and displacing labor.

The fate of manufacturing cities such as Manchester and Detroit illustrates an important point: long-run economic growth is not simply about increasing productivity or output—it is about incorporating technologies into new work. Having nearly filed for bankruptcy in 1975, New York City has become a prime case of how to adapt to technological change. Whereas average wages in Detroit were still slightly higher than in New York in 1977, they are now less than 60 percent of New York’s. At a time when Detroit successfully adopted computers and industrial robots to substitute for labor, New York adapted by creating new employment opportunities in professional services, computer programming and software engineering.

          Long-run economic growth entails the eclipse of mature industries by new ones. To stave off stagnation, my own research with Thor Berger of Lund University suggests that cities need to manage the transition into new work (pdf).

          Such technological resilience requires an understanding of the direction of technological change. Unfortunately, economic history does not necessarily provide obvious guidance for policy makers who want to predict how technological progress will reshape labor markets in the future. For example, although the industrial revolution created the modern middle class, the computer revolution has arguably caused its decline.

          To understand how technology will alter the nature of work in the years ahead, we need to look at the tasks computers are and will be able to perform. Whereas computerization historically has been confined to routine tasks involving explicit rule-based activities, it is now spreading to domains commonly defined as nonroutine. In particular, sophisticated algorithms are becoming increasingly good at pattern recognition, and are rapidly entering domains long confined to labor. What this means is that a wide range of occupations in transportation and logistics, administration, services and sales will become increasingly vulnerable to automation in the coming decades. Worse, research suggests that the next generation of big data–driven computers will mainly substitute for low-income, low-skill jobs over the next decades, exacerbating already growing wage inequality (pdf).

If jobs for low-skill workers disappear, those workers will need to find jobs that are not susceptible to computerization. Such work will likely require a higher degree of human social intelligence and creativity—domains where labor will hold a comparative advantage, despite the diffusion of big data–driven technologies.

The reason why Bloom Energy, Tesla Motors, eBay and Facebook all recently emerged in (or moved to) Silicon Valley is straightforward: the presence of adaptable skilled workers who are willing to relocate to the companies with the most promising innovations. Importantly, local universities, such as Stanford and U.C. Berkeley, have incubated ideas, educated workers and fostered technological breakthroughs for decades. Since Frederick Terman, the dean of Stanford’s School of Engineering, encouraged two of his students, William Hewlett and David Packard, to found Hewlett–Packard in 1938, Stanford alumni have created 39,900 companies and about 5.4 million jobs.

          For cities to prosper, they need to promote investment in relevant skills to attract new industries and enable workers to shift into new occupations. Big-data architects, cloud services specialists, iOS developers, digital marketing specialists and data scientists provide examples of occupations that barely existed only five years ago, resulting from recent technological progress. According to our estimates, people working in digital industries are on average much better educated and for any given level of education they are more likely to have a science, technology, engineering or mathematics (STEM) degree. By contrast, workers with professional degrees are seen less often in new industries, reflecting the fact that new work requires adaptable cognitive abilities rather than job-specific skills. It is thus not surprising that we find San Jose, Santa Fe, San Francisco and Washington, D.C., among the places that have most successfully adapted to the digital revolution.

The cities that invest in the creation of an adaptable labor force will remain resilient to technological change. Policies to promote technological resilience thus need to focus on the supply of technically skilled individuals and on encouraging entrepreneurial risk-taking. For example, the National Science Foundation recently provided a grant to North Carolina Central University to integrate entrepreneurship into scientific education. More such initiatives are needed. Furthermore, immigration policies need to be made attractive to high-skill workers and entrepreneurs. Cities like New York and London owe much of their technological dynamism to their ability to attract talent.

          Meanwhile, it is important to bear in mind that policies designed to support output created by old work are not a recipe for prosperity. Whereas General Motors has rebounded since its 2009 bailout, Detroit filed for Chapter 9 bankruptcy in 2013. Instead of propping up old industries, officials should focus on managing the transition of the workforce into new work. The places that do so successfully will be at the frontier. As argued by Jane Jacobs in more colorful terms: “Our remote ancestors did not expand their economies much by simply doing more of what they had already been doing…. They expanded their economies by adding new kinds of work. So do we.”

          Article source: http://www.scientificamerican.com/article/how-21st-century-cities-can-avoid-the-fate-of-20th-century-detroit/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+sciam%2Fhistory-of-science+(Topic%3A+History+of+Science)

          Can IT change the course of history? The story of Graham Tottle, the IT guy …

          It is rare for an IT worker to affect history, but in 1990 Graham Tottle was in Saddam Hussein’s
          HQ when Iraq invaded Kuwait.

          Tottle had been sent to Iraq to redesign the country’s Agricultural Projects database,
          previously held as a Lotus spreadsheet (Arabic Lotus), so that it would form a networked relational
          database built on dBASE.

          The database was designed to hold a huge amount of information on 850 projects – areas, crop,
          inputs, expected outputs, finance, historical production records, and so on, says Tottle.

          This could then be summarised, crop by crop, to predict total production. He says this was a
          more efficient way to model the data than using a spreadsheet such as Lotus.
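Tottle’s dBASE schema itself is not reproduced in the article, but the crop-by-crop summary he describes is a textbook relational aggregation. A minimal sketch of that idea, using SQLite in place of dBASE, with field names and figures invented for illustration:

```python
# Sketch of the relational design described above, using SQLite in place
# of dBASE. The schema, field names and sample figures are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE projects (
        project_id INTEGER PRIMARY KEY,
        crop TEXT NOT NULL,
        area_ha REAL,                 -- planted area, hectares
        expected_yield_t_per_ha REAL  -- expected output per hectare, tonnes
    )
""")
conn.executemany(
    "INSERT INTO projects VALUES (?, ?, ?, ?)",
    [
        (1, "wheat", 1200.0, 2.1),
        (2, "wheat", 800.0, 1.8),
        (3, "barley", 500.0, 1.5),
    ],
)

# Summarise crop by crop to predict total production -- the step that is
# awkward in a flat spreadsheet and trivial as a relational query.
rows = conn.execute("""
    SELECT crop,
           SUM(area_ha) AS total_area,
           SUM(area_ha * expected_yield_t_per_ha) AS predicted_tonnes
    FROM projects
    GROUP BY crop
    ORDER BY crop
""").fetchall()

for crop, area, tonnes in rows:
    print(f"{crop}: {area:.0f} ha, {tonnes:.0f} t predicted")
```

The same GROUP BY query scales unchanged from three sample rows to the 850 projects Tottle mentions, which is the efficiency gain over re-totalling a spreadsheet.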

          Tottle was working as a UN farming consultant, teaching his Iraqi counterparts in Baghdad and
          helping to develop the production database for Iraqi agriculture.

          Fighting to become a coder

          Graham Tottle became interested in computers while serving with the Royal Signals in the 1950s.
          Responding to a quiz in The Times, he ended up being hired by the English Electric Company,
          which later became ICL, the UK’s first major computer company.

“I was hired as a systems analyst and I fought to become a programmer,” he says. This was the era of the mainframe – there were no operating systems. “We wrote our own systems program,” says Tottle. Among the software he created was the UK’s first index sequential file handler.
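Tottle’s handler itself is long gone, but the index sequential idea is simple: keep records in key order, and keep a small sparse index of block boundaries so a lookup binary-searches the index and then scans only one block. A toy sketch of that structure (invented data, not ICL code):

```python
# Toy illustration of an index sequential file: records stored in key
# order in fixed-size blocks, plus a sparse index of the highest key in
# each block. A lookup searches the index, then scans a single block.
from bisect import bisect_left

BLOCK_SIZE = 4

# Records sorted by key, grouped into blocks.
records = sorted((k, f"record-{k}") for k in (3, 7, 12, 19, 25, 31, 44, 58, 60))
blocks = [records[i:i + BLOCK_SIZE] for i in range(0, len(records), BLOCK_SIZE)]
index = [block[-1][0] for block in blocks]   # highest key per block

def lookup(key):
    """Binary-search the sparse index, then scan one block sequentially."""
    b = bisect_left(index, key)
    if b == len(blocks):
        return None          # key beyond the last block
    for k, value in blocks[b]:
        if k == key:
            return value
    return None

print(lookup(25))   # found in the second block
print(lookup(26))   # absent
```

On tape and early disc systems the payoff was that both sequential runs (read the blocks in order) and random access (index plus one block) worked against the same file.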

          The system built a model of Iraq’s agricultural output based on a detailed production return
          from the previous year’s data.

          Clearly, such a database would be a key strategic tool to limit the impact of UK and US
          sanctions on Iraq.

Tottle says: “As I guessed at that time, Saddam’s capacity to be afflicted by sanctions was to become a vital consideration in the US and UK decision whether to set out and rely on sanctions or to go to war.”

          But on 2 August 1990, when Iraq invaded Kuwait, Tottle found himself in the office of the Iraq
          Agriculture Planning Division located within a massively fortified skyscraper, three floors above
          Saddam Hussein’s office. Tottle was among 3,000 foreign nationals who were rounded up and moved
          between hotels.

He eventually took refuge at the UN library. “We used our shortwave radios to listen to Margaret Thatcher ‘vomiting poison like a spotted serpent’, as Saddam Hussein put it,” says Tottle.

          During the day, he spent his time training UN staff and designing a database for peach
          production and playing on Microsoft Flight Simulator 4, “practising take-offs from Saddam
          International airport.”


          A week later, while trying to escape across the Jordanian border, Tottle noticed a missile
          hidden under a motorway bridge. His group was turned back at the border, and on returning to
          Baghdad, he briefed MI6 about the hidden rocket. “I found out it was an Al-Abbas [a variant of the
          Scud] long-range missile, which the Iraqis had kept on the eastern border, and were being shifted
          to attack Tel Aviv,” he says.

          Agriculture planning and war

Iraq’s agriculture program started life as a paper Tottle originally wrote in 1959, based on some of the ideas in the US Program Evaluation and Review Technique (PERT) project management tool, which had been created to support the development of the Polaris submarine weapon system in the 1950s.

          During the mid-1960s to late 1970s, Tottle worked at the English Electric Company, which later
          became ICL. He left to form a software company, Agricultural Computer Systems International, which
          built farming software for developing nations.

          The software was used in countries such as Malaysia, which produces a quarter of the world’s
          rubber. “There are 300,000 rubber farmers in Malaysia,” says Tottle. “The computer program produced
          plans for replanting rubber trees, and an action list.”

          Iraq was not the only time Tottle found his expertise in agricultural IT being used in the midst
          of a conflict. It was also used in the 1990s during the so-called “banana wars”, when the US put
          pressure on Caribbean banana producers over preferential EU tariffs.

          Scottish independence and the swinging 60s

Now, 24 years after the start of the first Iraq war, Tottle’s experiences in the country have been committed to print, as the backdrop to his new novel, 2040, which was published in July.

          Drawing on his own experiences, the book describes a farming consultant working on an
          agricultural production database before the first Iraq war. “The events in Iraq were quite
          traumatic,” says Tottle. “Saddam Hussein had already wiped out 40,000 of his own population by
          bombing them with chemical weapons. What would happen if he bombed the world?”

          In 2040, Tottle explores this premise and the development of computers and information
          systems, looking at the dangers to individual liberty and the surveillance society.


The novel depicts an alternative reality, which begins with the Iraqi dictator using chemical weapons on the West. This alternative universe, called Downside, is set in the future, when Scotland has become independent and government snooping on civilians is taken to extremes.

          Tottle believes people have become far too tolerant of the ever-present surveillance society,
          where CCTV and internet monitoring track all their activities. “We are tracking almost every human
          activity,” he says. “I try to picture this in the book, where individuals are being controlled and

          Now imagine how the state could use information gleaned from the internet of things. In his
          book, Tottle describes how a young woman gets stuck in an internet-controlled toilet in
          Macclesfield. In another example, a couple driving a car are stopped by the police and asked where
          they are going because the car “should not be there”.

But there is hope, in the form of KDF6, a “tight little 1960s mainframe” built by English Electric for one of its first customers, says Tottle. Why choose a 1960s mainframe? Firstly, machines built at that time were pre-internet, says Tottle. “It is well known that due to Microsoft and the web, people’s privacy is not sustainable. So why not use this ancient computer?”

Incidentally, the English Electric Company took over LEO, maker of the world’s first commercial computer, and later merged with International Computers and Tabulators under Harold Wilson’s Labour government in 1968 to form ICL, the UK’s answer to IBM. The English Electric Company was also where Tottle began his IT career.

          As for Scottish independence, Tottle says: “I am against it. I feel people have forgotten what a
          great joint history Britain has had.”

Tottle’s novel, 2040, is available on Kindle and in paperback.


          Article source: http://www.computerweekly.com/news/2240227336/Can-IT-change-the-course-of-history

          Computers can find similarities between paintings – but art history is about …

          Some computer scientists at Rutgers University in New Jersey have written a computer programme that finds connections between paintings and can even discover influences between artists, they claim. This certainly raises some fascinating questions, but not about art history.

          In the paper, Babak Saleh and his colleagues describe how they created a programme to compare paintings so as to establish recurrences of certain features.

          They classified more than 1,700 paintings according to various visual features they contained, from simple object descriptions to style and colour. And many striking comparisons and links were indeed thrown back by the programme.
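The paper’s actual classifiers and feature set are far richer than can be shown here, but the core mechanic (encode each painting as a vector of visual features, then rank pairs by similarity) can be sketched in a few lines. The feature list, flags and titles’ scores below are invented for illustration:

```python
# Toy sketch of feature-vector comparison between paintings. The binary
# feature flags here are made up; the real system used learned visual
# features over 1,700+ works.
import math

features = ["interior", "stove", "figures", "bright_palette", "loose_brushwork"]

paintings = {
    "Bazille, Studio 9 Rue de la Condamine": [1, 1, 1, 0, 1],
    "Rockwell, Shuffleton's Barber Shop":    [1, 1, 1, 0, 0],
    "Monet, Impression Sunrise":             [0, 0, 0, 1, 1],
}

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Rank every pair of paintings, most alike first.
names = list(paintings)
pairs = sorted(
    ((cosine(paintings[a], paintings[b]), a, b)
     for i, a in enumerate(names) for b in names[i + 1:]),
    reverse=True,
)
for score, a, b in pairs:
    print(f"{score:.2f}  {a}  ~  {b}")
```

Note that the ranking only ever reflects whatever features the programmers chose to encode, which is precisely the limitation discussed below.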

          But I’m afraid this is by no means going to help art history. The paper is titled “Toward Automated Discovery of Artistic Influence”. And sorry folks, but art history isn’t just about tracing influence and comparing use of things like space, texture, form and colour.

This is what we might call connoisseurial art history, which is what you might have found in the 19th century. Connoisseurs began to compare works scattered across churches and monasteries, classifying them and trying to discern common authorship. Works were identified by certain similarities of technique or ways of painting – for instance, hands or ears. At this point, in the later 19th and early 20th century, the project was somewhat forensic. Indeed, the founder of this method was a doctor, Giovanni Morelli.

          But unsurprisingly, the discipline has developed somewhat since then. To study art history, we need to know about economics, politics, literature, philosophy, languages, theologies, ideologies while also studying to understand how art thinks. Art thinks through making, through forms, through materials. And over the past century, art history has been enriched by feminist, post-colonial, queer, and trans-national perspectives. We no longer hunt for connections – we ask questions. We are not diagnosticians seeking for common symptoms. We are not criminologists tracing clues that link a with b.

          Even at the most basic level, machines would not be helpful in developing these larger narratives. The idea that machines can see or notice what human beings do not is a fallacy, because the machine is only doing what it is told – and it is the programmers who are setting parameters. But those parameters are based on a woefully old-fashioned and dull misunderstanding of what art historians do, and what they look for.

          The big question is not that Caillebotte (one of the examples given) was influenced by Degas. Instead it is what he did with that “Degas-ness”. Did he get what Degas was doing? Was he arguing against Degas by making sure we saw some reference to his work? Why would it be valuable to work with it, to work it otherwise? What does referencing and deferring to another artist make possible for the one who does so?

          In one example from the article, the programme “discovered” similarities between French Impressionist Frederic Bazille’s Studio 9 Rue de la Condamine (1870) and American Norman Rockwell’s Shuffleton’s Barber Shop (1950), which they suggested art historians could investigate further.

          It is, of course, possible that Rockwell knew Bazille’s painting from an illustration in a book about Impressionist art, and even liked it. But what would we learn from finding pot-bellied stoves in both paintings, except about how people heated rooms pre-central heating? Rockwell’s art was all about creating an American vernacular style in art in opposition to the European modernism of which Bazille was an early part. Such comparisons are shallow, and overlook time, place, history and art politics.

          The real problem is that even in the game of source hunting and influence tracing, ideology is already at work. Influence, linking artists and artworks in a one-way direction, as in a line of family descent, is a dressed-up way of protecting the canon (and the art market), and this machine-aided form of looking for similarity would only reinforce it.

          There was, until recently, virtually no art history that ever asked how women, African-Americans, or non-Europeans “influenced” the direction of art, or even traced any kind of links between such artists and the canonised white men. It is the kind of art history practiced in today’s universities, rather than the auction houses, that is asking precisely these bigger questions.

          Art history studies cultures, societies, histories, and experiences and how they are given form. All we get from exercises in comparison and influence are superficial resemblances at which any artist would laugh. Art history takes art and artists seriously.

          Article source: http://theconversation.com/computers-can-find-similarities-between-paintings-but-art-history-is-about-so-much-more-30752

          Sam Altman on targeting energy and biotech with his first Y Combinator batch

          The first batch of startups to go through Y Combinator since Sam Altman took over as president pitched at Demo Day in Mountain View on Tuesday.

          Cromwell Schubarth
          Senior Technology Reporter- Silicon Valley Business Journal


          Sam Altman has wasted no time in putting his stamp on Y Combinator.

          The first batch of startups under his guidance as president pitched on Demo Day on Tuesday. The cohort was bigger and more varied than I’ve seen in the past. And many of the startups that took the stage at the Computer History Museum talked a good game of aiming higher than ever.

          A pair of nuclear energy startups, a quantum computing company and about a dozen biotech/health companies were the most obvious signs of change under Altman.

          “I definitely have a strong interest in deep science companies,” Altman told me a little after the final company pitched and the beer and wine began to flow at the post-Demo Day party. “Nobody but us really is investing in these sorts of companies right now.”

          Altman said Y Combinator likes industries that aren’t attracting a lot of competing investors. “These are companies that, if they work, will be incredibly valuable and very few people are trying to invest in them.”

          The new focus on energy startups took Altman around the world earlier this year, recruiting startups to join the Mountain View accelerator.

          “They didn’t come to us,” he said. “I went out and hand-selected those guys. I met with every fusion and fission company I could find. Surprisingly there are not that many of them.”

          He makes it clear, too, that he wants to avoid the mistakes made in Silicon Valley’s cleantech love affair of about 10 years ago.

          “Everyone screwed up energy the last time around,” he said emphatically. “The last energy boom featured investments in companies that were trying to deliver energy that was more expensive than what you get off the grid. But there was some vague notion of saving the world and making that OK.”

          Cromwell Schubarth is the Senior Technology Reporter at the Business Journal.

          Article source: http://www.bizjournals.com/sanjose/news/2014/08/20/sam-altman-on-targeting-energy-and-biotech-with.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+bizj_national+(Bizjournals+National+Feed)