Global stocks, euro up after Italian government formed

By Caroline Valetkevitch

NEW YORK | Mon Apr 29, 2013 12:22pm EDT

(Reuters) – World stock indexes and the euro advanced on Monday as the formation of a new government in Italy eased uncertainty about the political future of the country, the third-largest economy in the euro zone, while tame inflation data drove down U.S. Treasury yields.

U.S. price data showed inflation remained quiet, suggesting the Federal Reserve, which will begin a two-day policy meeting on Tuesday, will not be ending its accommodative monetary stance any time soon.

Recent signs of weak U.S. growth had raised expectations the Fed will keep its pace of bond buying unchanged at $85 billion a month at its meeting this week, while the European Central Bank is widely expected to announce an interest rate cut when it meets on Thursday.

Investors welcomed the formation of a broad coalition government in Italy under new Prime Minister Enrico Letta, two months after inconclusive general elections, though they remained cautious over how long the new growth-focused government would survive.

The resolution of Italy’s political stalemate helped bring its five- and 10-year borrowing costs down to their lowest level since October 2010 at a bond sale on Monday, while yields on 10-year debt in the secondary market fell 13 basis points to 3.93 percent.

“Italian sovereign debt is benefiting from the twin effects of central bank liquidity support and political stability of sorts,” Nicholas Spiro, managing director of London-based consultancy Spiro Sovereign Strategy, said.

MSCI’s world equity index .MIWD00000PUS was up 0.6 percent, while the broad FTSE Eurofirst 300 index .FTEU3 of top European shares provisionally closed up 0.5 percent, led higher by Milan’s FTSE MIB .FTMIB, which rose 2.2 percent.

Wall Street helped world stock indexes add to gains, with U.S. stocks buoyed by stronger-than-expected housing data.

“Wall Street appears primed for another assault at record highs,” said Andrew Wilkinson, chief economic strategist at Miller Tabak & Co in New York.

The Dow Jones industrial average .DJI was up 67.12 points, or 0.46 percent, at 14,779.67. The Standard & Poor’s 500 Index .SPX was up 9.06 points, or 0.57 percent, at 1,591.30. The Nasdaq Composite Index .IXIC was up 27.64 points, or 0.84 percent, at 3,306.90.

The Fed’s stimulus measures have helped U.S. stocks rally for much of this year, with both the S&P 500 and Dow hitting record highs in the past two months.

The euro rose 0.5 percent to $1.3095, with hedge funds cited among key buyers. The euro’s session peak of $1.3115, the highest since April 19, was reached midway through the London session.

A Reuters poll of 76 economists last Thursday showed only a narrow majority of 43 expected a 25 basis point cut at this week’s ECB policy meeting, which would take the bank’s refinancing rate to a record low of 0.5 percent.

Inflation, as reflected in the personal consumption expenditure price index, rose just 1 percent over the 12 months through March, the smallest gain since October 2009 and a slowdown from the 1.3 percent logged in the period through February.

Benchmark U.S. 10-year Treasuries were up 3/32 in price to yield 1.658 percent.

U.S. data also showed that contracts to purchase previously owned U.S. homes rose in March as the housing market continued to pick up pace this year.


The uncertain outlook for economic growth, especially in the world’s two big oil consumers, the United States and China, initially kept crude prices under pressure. But prices recovered in early U.S. trading.

China is due to release surveys on activity in its giant factory sector later this week.

Brent crude was up 25 cents to $103.41 a barrel, after making its biggest weekly gain since November last week. U.S. light crude was up 72 cents at $93.72.

(Additional reporting by Richard Hubbard in London and Nick Olivari and Angela Moon in New York; Editing by Dan Grebler and Leslie Adler)

Google Fiber’s Ripple Effect

The threat of superfast Google Fiber is causing other Internet providers to crank up their own offerings.

By David Talbot on April 26, 2013

As Google plans to expand its ultrafast Internet service from a fledgling effort in Kansas City to Austin, Texas, and Provo, Utah, evidence is emerging that the company has forced broadband competitors into offering dramatically better service.

New data from Akamai, which delivers a hefty portion of all Web traffic, reveals a remarkable turn of events in Kansas. In the fourth quarter of 2012, Kansas saw the largest jump in average Internet connection speeds of all U.S. states compared to the fourth quarter of 2011, with an 86 percent surge (see “When Will the Rest of Us Get Google Fiber?”). The next-highest increase was in Wyoming, at 51 percent.

Google began installations in November in Kansas City, Kansas, offering one-gigabit Internet connections—nearly 100 times faster than the U.S. average—for $70 per month, or $120 with television service, a Nexus 7 tablet remote, two terabytes of DVR storage, plus a terabyte of cloud storage. The rollout and TV service had been announced a few months before. “It could be the case that the other incumbent providers were going, ‘Oh, crap, we stand to potentially lose subscribers to this deal with Google if we don’t provide competitive service,’ ” says David Belson, who authored Akamai’s State of the Internet report.

There is no public data that gives a complete picture of the speed improvements or price reductions that Internet service providers in the Kansas City area made in response to the beginning of the Google service, which delivers broadband over fiber-optic lines. But Susan Crawford, a professor at the Benjamin N. Cardozo School of Law in New York and former special assistant for technology policy in the Obama administration, says her research suggests that Google is indeed the driving force in the Kansas market.

In December, Time Warner Cable increased speeds of some services in the Kansas City area, boosting its “turbo” service from 15 megabits per second to 20 megabits per second and its fastest service from 50 to 100 megabits per second. “I see Time Warner Cable in and around Kansas City acting like a bulldog with a bone,” says Crawford, author of Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age. “They want to make sure they hang onto subscribers, not lose them.”

In general, there is plenty that the dominant Internet providers can do to offer better deals without much effort, she says. Cable companies like Time Warner Cable and Comcast have the technical capacity to speed up service, and also plenty of room to lower prices, given the estimate from one analyst—Craig Moffett of the Wall Street firm Bernstein Research—that they typically make 97 percent profit margins on Internet services.

The competition may be even hotter in the newer Google Fiber battlegrounds. After Google announced plans for Austin (see “Google Fiber Takes on Texas”), AT&T quickly announced it would match that effort with its own one-gigabit service, and Time Warner Cable sweetened its Internet plans with free Wi-Fi in public areas for existing customers.

Google has not disclosed how many customers it has in Kansas City, or what plans those customers bought. But Akamai was able to do some forensic work to see just how small Google’s service footprint was, and thus just how little it took to wake up the competition.

According to Belson, in the fourth quarter of last year, Google served less than a tenth of a percent of the 830,000 Internet addresses that Akamai counted in Kansas, or fewer than 830 customers. “Ultimately, we didn’t see enough unique IP addresses from [Google] that those speeds would have unduly influenced the overall [speed] calculation,” Belson says.

Even more remarkable, perhaps, was that Google Fiber customers were using far less than the available blazing speed. IP addresses associated with Google Fiber were seeing average connection speeds of about twice the Kansas average of five megabits per second, and peak speeds about five times the average peak of 25 megabits per second.
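Belson’s footprint estimate and the speed comparison reduce to simple arithmetic. The sketch below uses the figures cited in this article, not Akamai’s actual methodology:

```python
# Illustrative back-of-the-envelope check of the figures quoted above.
kansas_ips = 830_000        # Internet addresses Akamai counted in Kansas
google_share = 0.001        # "less than a tenth of a percent"
google_ips = kansas_ips * google_share
print(f"Google Fiber footprint: fewer than {google_ips:.0f} addresses")

avg_speed_mbps = 5          # Kansas average connection speed
avg_peak_mbps = 25          # Kansas average peak speed
fiber_avg = 2 * avg_speed_mbps    # roughly twice the state average
fiber_peak = 5 * avg_peak_mbps    # roughly five times the average peak
print(f"Fiber customers: ~{fiber_avg} Mbps average, ~{fiber_peak} Mbps peak")
```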

Some of this might be explained by the fact that some Google Fiber customers took only a basic five-megabit hookup for a one-time $300 installation fee, opting out of the gigabit service. But the larger reality is that, so far, “there is not a whole lot of stuff out there today that is really gigabit-capable,” Belson says.

Nonetheless, gigabit speeds have proved to be quite capable of waking up a nation of Internet service monopolies and duopolies (see “A Tale of Two Genachowskis”).

Battery Could Provide a Cheap Way to Store Solar Power

Combining aspects of high-energy lithium-sulfur batteries with flow battery technology can lower costs.

By Kevin Bullis on April 26, 2013

There’s a promising new entry in the race to build cheap batteries for storing energy from solar panels and wind turbines. Stanford researchers led by Yi Cui, a professor of materials science and engineering, have demonstrated a partially liquid battery made of inexpensive lithium and sulfur. Cui says the battery will be easy to make and will last for thousands of charging cycles.

Cui believes that the material and manufacturing costs of the battery might be low enough to meet the Department of Energy’s goal of $100 per kilowatt-hour of storage capacity, which the DOE estimates will make the technology economically attractive to utilities. Existing batteries can cost hundreds of dollars per kilowatt-hour of capacity, although several companies are working to commercialize cheaper ones (see “Ambri’s Better Battery” and “Battery to Take On Diesel and Natural Gas”).

The technology is a cross between a flow battery and an experimental type called a lithium-sulfur battery. In a flow battery, positive and negative liquid electrolytes are stored in swimming-pool-size tanks. The batteries are attractive because the amount of energy they store can be increased simply by expanding these tanks, without increasing the size of the electronic connections and other battery parts needed to extract the energy. But they require expensive ion membranes and large amounts of material.

Lithium-sulfur batteries, meanwhile, consist of two solid electrodes connected by a liquid electrolyte. They have the potential to store large amounts of energy, but they’ve been hard to commercialize because they can’t be recharged often enough. The problem is that compounds called lithium polysulfides, which form during the charging and discharging process, tend to dissolve in the electrolyte, leaving the lithium and sulfur inaccessible for future charging cycles. With each recharge, more energy capacity is lost, limiting the life of these batteries.

But Cui saw that this phenomenon could be useful in a flow battery, where energy is stored in the electrolyte and not in a solid electrode. Indeed, the dissolved lithium polysulfide stores more energy than the materials usually used in flow batteries, such as vanadium, so less material is needed. That, and the fact that lithium and sulfur cost less than vanadium, could help lower the cost of flow batteries.

What’s more, Cui says, his modified flow battery needs no ion membrane. Only one of the electrodes is a liquid; the other is metallic lithium. An inexpensive coating on the lithium serves the purpose of the membrane, allowing ions but not electrons to move between the lithium metal and the polysulfides. That is key to both protecting the lithium and creating an electrical current.

Challenges remain before the battery can be commercialized. For example, the number of times it can be recharged, while currently impressive for a lithium-sulfur battery, still needs to be improved for the technology to be economically competitive. Cui’s battery has been charged 2,000 times, but the DOE target is 5,000 recharges. Even to reach 2,000 cycles, he needed to include extra lithium in the battery to accommodate the fact that the metal degrades a bit with each charging cycle. The extra lithium adds to the cost, which could make it harder to meet the target of $100 per kilowatt-hour.
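Rough arithmetic shows why cycle life matters to the DOE’s cost target. This is an illustrative sketch using the figures above; it ignores degradation, round-trip efficiency losses, and financing costs:

```python
# Spread the capital cost of storage over its lifetime charge cycles.
capital_cost_per_kwh = 100  # DOE target, dollars per kWh of capacity
for cycles in (2_000, 5_000):
    cost = capital_cost_per_kwh / cycles
    print(f"{cycles} cycles -> ${cost:.3f} of capital cost per kWh delivered")
```

Going from 2,000 to 5,000 cycles cuts the per-kilowatt-hour capital cost by more than half, which is why the cycle-count gap matters as much as the headline price.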

Energy Department Backs New Way to Make Diesel from Corn

A novel chemical pathway could address the high cost of transporting cellulosic materials to make diesel fuel.

By Kevin Bullis on April 29, 2013

Within a year, a pilot plant in Indiana will start converting the stalks and leaves of corn plants into diesel and jet fuel. The plant will use a novel approach involving acid as well as processes borrowed from the oil and chemical industry, which its developers hope will make fuel at prices cheap enough to compete with petroleum.

The plant—which will have the capacity to process about 10 tons of biomass a day, enough for about 800 gallons (3,000 liters) of fuel per day—will be built by Mercurius Biofuels of Ferndale, Washington, with the help of a grant from the U.S. Department of Energy of up to $4.3 million.

Cellulosic biomass—corn stalks and other matter like wood chips and grass—is abundant and requires less energy and fertilizer to produce than sugar or corn grain, the main sources of biofuel now. Because of this, producing cellulosic biomass is cheaper and results in lower carbon dioxide emissions.

But so far it’s proved difficult to make fuel economically from these sources (see “Cellulosic Ethanol Inches Forward”). One big problem has been the cost of transporting raw biomass. A solution is to build small biorefineries that are close to the needed feedstocks, but smaller facilities tend to be more expensive per liter of fuel produced.

In Mercurius’s new process, biomass can be converted into a liquid intermediate chemical at small plants located close to sources. That liquid takes up much less volume than the original biomass, making it more economical to ship to a large centralized facility to be converted to fuel.

Mercurius uses acids to break down cellulose and make a chemical called chloromethylfurfural; the process is based on an approach developed by Mark Mascal, a professor of chemistry at the University of California at Davis.

Converting cellulose to this chemical makes more efficient use of the carbon in cellulose than one of the most common approaches to making fuel from cellulose: converting cellulose into sugar and fermenting it to make ethanol. “Fermentation blows out one-third of the carbon as carbon dioxide,” Mascal says. “[Our process] captures all of the available carbon in biomass.”
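Mascal’s “one-third of the carbon” figure follows from the stoichiometry of standard glucose fermentation; the sketch below illustrates that textbook reaction, not Mercurius’s proprietary chemistry:

```python
# Glucose fermentation: C6H12O6 -> 2 C2H5OH + 2 CO2
glucose_carbons = 6
ethanol_carbons = 2 * 2   # two ethanol molecules, two carbons each
co2_carbons = 2 * 1       # two CO2 molecules, one carbon each
assert glucose_carbons == ethanol_carbons + co2_carbons  # carbon balance
lost = co2_carbons / glucose_carbons
print(f"Carbon lost as CO2: {lost:.0%}")  # -> 33%
```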

The chloromethylfurfural, in turn, can be converted into diesel or jet fuel with industrial processes similar to those used in the chemicals industry and at oil refineries. “We have processes that are a lot like petroleum refining processes, so it’s scalable and potentially faster to bring to market,” says CEO Karl Seck.

Using acids can be expensive, so one key to the process is the fact that it will be easy to recycle the acids used. Unlike sugar, the chloromethylfurfural isn’t soluble in water, so it is easy to separate it from the acid so the acid can be used again, Seck says (see “Reinventing Cellulosic Ethanol Production”). He says the process will also be cheaper than using enzymes to break down cellulose, a common approach being developed now.

Other companies and academic groups are developing processes for making biofuels from cellulose. Many of these turn biomass into gases before converting those gases into fuels. In contrast, Mercurius’s approach makes liquids that are cheaper to handle, requiring smaller and cheaper equipment.

The new technology is at an early stage. Each part of the process has been demonstrated, including the final steps of producing diesel and jet fuel that meet specifications for use in vehicles. But everything has only been done at a small scale, and the entire process hasn’t yet been linked together. Some other alternatives are further along.

Kior, for example, uses a catalytic process to break up cellulose to make a sort of crude oil that, as with Mercurius’s technology, can be processed into diesel and other fuels (see “Kior ‘Biocrude’ Plant a Step Toward Advanced Biofuels”).

France’s Hollande to ease entrepreneurs’ capital gains tax

PARIS | Fri Apr 26, 2013 2:58pm EDT

(Reuters) – President Francois Hollande will propose next week easing entrepreneurs’ capital gains tax, an official in his office told French media on Friday, as the Socialist leader struggles to win the confidence of business owners.

Hollande’s government had planned last year to raise capital gains tax on business owners early in its mandate, but backed down after a high-profile revolt and warnings the move would drive start-up companies abroad.

But the episode, and increases in other business taxes, have created a climate of mistrust for the government among business owners at a time when leaders need them to help kick-start the stagnant economy and create jobs.

Eager to dispel the bad blood, Hollande is set to announce plans to reduce the taxable amount of their capital gains by up to 65 percent, Les Echos business newspaper reported on its website on Friday.

Under special cases such as a business owner retiring, the deduction could reach as much as 85 percent, the daily said.

The presidency official did not confirm the numbers but insisted that Hollande wanted to encourage businesses to be set up and help risk-taking investors that finance new firms.

With French firms’ profit margins the thinnest in the euro zone, corporate confidence has fallen to levels not seen since the 2008-2009 financial crisis as companies face plunging demand and high taxes.

Hollande’s Socialist government is struggling to win back corporate France’s confidence, with many business people fearing more tax hikes are in store as the state battles to bring down its budget deficit.

Prime Minister Jean-Marc Ayrault acknowledged last week that the government’s flagship measure for companies, a tax credit aimed at cutting their wage bill indirectly, had received a tepid reception so far.

(Reporting by Elizabeth Pineau; Writing by Leigh Thomas)

Web startup Aereo sets its sights on Boston TV market

Tue Apr 23, 2013 10:58am EDT

(Reuters) – Aereo, the red hot Web startup that has raised the ire of U.S. broadcasters, is planning to expand to Boston starting May 15, the company said on Tuesday.

Backed by Barry Diller’s IAC/InterActiveCorp, Aereo plans to launch first with consumers who pre-registered and then more broadly to the Boston area on May 30.

Aereo is currently available in New York.

The company has caught the attention of the likes of News Corp’s Fox, Walt Disney’s ABC, CBS Corp and Comcast’s NBC because it offers people cut-rate subscriptions to their channels.

The broadcasters collect millions of dollars in fees from cable operators to carry their stations. Aereo does not pay anything to the broadcasters.

This prompted the media companies including News Corp and Disney to file a lawsuit against Aereo. Earlier in April, a U.S. appeals court declined to temporarily shut down the online television venture.

Meanwhile the broadcasters have upped the ante: Fox is threatening to remove itself from the free airwaves and become a cable channel if the courts do not shut down Aereo.

(Reporting By Jennifer Saba in New York; Editing by Nick Zieminski)

UK retools flagship credit scheme to help small firms

By William Schomberg and David Milliken

LONDON | Wed Apr 24, 2013 4:06am EDT

(Reuters) – Britain sought to inject new life into the country’s stagnant economy on Wednesday by giving banks greater incentives to lend to small and medium-sized firms which complain they are starved of credit.

The Bank of England and the Treasury said a new phase of their flagship Funding for Lending Scheme would be heavily skewed towards smaller firms.

Banks taking part in the program will also now be able to lend to alternative providers of credit – such as leasing firms which often work with small companies – as well as mortgage and housing credit corporations.

Under a third change, banks can get funding from the FLS for an extra year until the end of January 2015.

The Bank of England and the government see a lack of credit to small businesses as a major factor behind Britain’s very slow recovery from the financial crisis. On Thursday, data could show the economy slipped into its third recession in under five years.

Finance minister George Osborne is under pressure to boost growth after the International Monetary Fund – previously a supporter of his austerity policies – said he may need to slow the pace of spending cuts.

He announced measures to boost the housing market in March, and employers’ groups welcomed Wednesday’s changes to the FLS. But they said it remained to be seen whether banks would become less risk-averse and lend to such borrowers as start-up firms.

“What a lot of SMEs (small and medium-sized enterprises) will be looking for is money actually getting to the front line on reasonable terms, and not just to the safe bets,” said Adam Marshall, policy director at the British Chambers of Commerce.

Economists said the changes were not a game-changer for the economy. “The FLS is likely to provide a boost when confidence returns to the economy, but confidence is the elusive factor,” analysts at Barclays said in a note to clients.

Alan Clarke, an economist at Scotiabank, said the changes were probably a complement to more broad-based stimulus in the future by the Bank of England, and were unlikely to stop it from buying more government bonds later in the year.


The original FLS was launched last August and offers banks cheap credit if they increase lending to households and businesses. Results have been mixed, with benefits so far mainly going to banks and homebuyers rather than small businesses.

Banks drew 14 billion pounds ($21 billion) in cheap funding from the Bank of England between August and the end of last year but the FLS failed to stop a decline in overall bank loans at the end of 2012, adding to pressure on the government to take more action.

Bank of England Governor Mervyn King said the extension of the FLS would assure banks about their cheap funding rates.

“This innovative extension will now do even more for small and medium-sized businesses so that they can play their full part in creating new jobs,” Osborne said in a statement.

One of the changes announced on Wednesday seeks to get credit to small and medium-sized firms flowing as soon as possible: for every pound of additional lending by banks to the sector in the remainder of 2013, the amount of funding that banks will be able to draw upon increases by 10 pounds.

In 2014, that falls to five pounds of FLS funding for banks for every pound they lend to SMEs.

Lending to other sectors will count on a one-for-one basis towards the allowance for banks accessing the scheme.
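The tiered multipliers described above can be sketched in a few lines of code. This is an illustrative rendering of the article’s arithmetic; the function name and simplified inputs are hypothetical, not the Bank of England’s actual formula:

```python
def fls_allowance(sme_2013, sme_2014, other_lending):
    """Cheap-funding allowance a bank could draw under the extended FLS:
    10 pounds per pound of net SME lending in 2013, 5 pounds per pound
    in 2014, and one-for-one for lending to other sectors."""
    return 10 * sme_2013 + 5 * sme_2014 + other_lending

# A bank lending 100m pounds to SMEs in 2013, 100m in 2014,
# and 500m elsewhere could draw on:
print(fls_allowance(100, 100, 500))  # -> 2000 (million pounds)
```

The steep 2013 multiplier is what makes SME lending attractive to write even at a thin margin, which is the effect Leech describes below.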

Cormac Leech, a banking analyst at Liberum Capital, said the 10-to-1 ratio to increase bank lending to small firms this year would help banks such as Royal Bank of Scotland and Lloyds, which are Britain’s biggest business lenders.

“They are highly incentivised to write SME loans even at an underwriting loss. So it’s a key positive for them and should help to drive their share price and sector earnings,” he said.

Employers’ groups want more competition in Britain’s banking sector as a way to spur fresh lending. Those hopes suffered a blow on Wednesday when the planned sale of 630 bank branches by Lloyds to the Co-Operative Group fell through.

(Editing by Jeremy Gaunt)

Renewables Can’t Keep Up with the Growth in Coal Use Worldwide

An International Energy Agency report calls for more research and a carbon price to help renewables compete.

By Kevin Bullis on April 17, 2013

Despite remarkable growth, solar and wind power aren’t making a dent in carbon emissions, says a new report from the International Energy Agency. Coal consumption is growing too fast to offset any gains from renewables.

According to the report, solar power capacity increased by 42 percent and wind capacity by 19 percent during 2012. In comparison, coal grew by only 6 percent over the last two years. But because the total installed capacity of coal power was already huge, the amount of coal capacity added was much larger than that of solar and wind power. Even the increase in natural gas consumption hasn’t decreased the use of coal worldwide (see “Coal Demand Falls in the U.S., Rises Everywhere Else”).

Renewable energy can’t keep up with coal, let alone decrease its use. From 2001 to 2010, the amount of electricity generated with coal increased by 2,700 terawatt hours. Over the same period, electricity from non-fossil sources—including wind, solar, biomass, hydropower, and nuclear—increased by less than half that amount: 1,300 terawatt hours.
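The decade-long comparison works out as follows, using the figures from the report as quoted above:

```python
# Growth in electricity generation, 2001-2010, in terawatt hours.
coal_growth_twh = 2_700        # added coal-fired generation
nonfossil_growth_twh = 1_300   # added wind, solar, biomass, hydro, nuclear
ratio = nonfossil_growth_twh / coal_growth_twh
print(f"Non-fossil growth was {ratio:.0%} of coal's growth")  # under half
```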

Worldwide, more coal power is being installed because it’s inexpensive, reliable, and easy to incorporate into the grid. Before countries decide to stop building new coal plants, wind and solar and other low-carbon alternatives need to get cheaper, says Matthew Stepp, a senior analyst at the Information Technology and Innovation Foundation.

“In 2011, the last year for which data has been published, China built as many coal plants as there are in Texas and Ohio combined, even as it led the world in wind deployment,” says Stepp. “Even China, with its seemingly endless government budgets, is still implementing fossil fuels because it’s cheap and high-performing.”

“The situation is actually worse than the IEA portrays,” adds David Victor, co-director of the Laboratory on International Law and Regulation at the University of California at San Diego. Data from the agency shows that the world actually emits more carbon per unit of energy than it did a decade ago because of the growth in coal, he says.

The lack of progress on developing and implementing technology for capturing carbon dioxide from power plants is also noteworthy, Victor says. “This is a good case study because a decade ago there were high hopes for [carbon capture and sequestration (CCS)], but in the last decade basically nowhere on the planet has there emerged a viable business model for electric power CCS,” he says.

The IEA report says that while funding should be tripled to provide the kind of technology needed to replace fossil fuels, the actual share of spending on energy R&D is going down. It also calls for a reduction in subsidies for fossil fuels, which at $523 billion are six times higher than subsidies for renewable energy.
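The subsidy comparison implies a rough figure for renewable-energy subsidies, as illustrative arithmetic from the numbers above:

```python
fossil_subsidies_bn = 523                         # IEA figure, $ billion
renewables_subsidies_bn = fossil_subsidies_bn / 6 # "six times higher"
print(f"Implied renewables subsidies: ~${renewables_subsidies_bn:.0f} billion")
```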

Nanoparticle Disguised as a Blood Cell Fights Bacterial Infection

Biomimetic nanoparticles could be an effective treatment against antibiotic-resistant bacteria.

By Mike Orcutt on April 14, 2013

A nanoparticle wrapped in a red blood cell membrane can remove toxins from the body and could be used to fight bacterial infections, according to research published today in Nature Nanotechnology.

The results demonstrate that the nanoparticles could be used to neutralize toxins produced by many bacteria, including some that are antibiotic-resistant, and could counteract the toxicity of venom from a snake or scorpion attack, says Liangfang Zhang, a professor of nanoengineering at the University of California, San Diego. Zhang led the research.

The “nanosponges” work by targeting so-called pore-forming toxins, which kill cells by poking holes in them. One of the most common classes of protein toxins in nature, pore-forming toxins are secreted by many types of bacteria, including Staphylococcus aureus, whose antibiotic-resistant strains, known as MRSA, are endemic in hospitals worldwide and cause tens of thousands of deaths annually. They are also present in many types of animal venom.

A range of existing therapies is designed to target the molecular structure of pore-forming toxins and disable their cell-killing functions. But these therapies must be customized for different diseases and conditions, and there are more than 80 families of these harmful proteins, each with a different structure. Using the new nanosponge therapy, says Zhang, “we can neutralize every single one, regardless of their molecular structure.”

In animal tests, the researchers showed that the new therapy greatly increased the survival rate of mice given a lethal dose of one of the most potent pore-forming toxins. Liver biopsies several days following the injection revealed no damage, indicating that the nanosponges, along with the sequestered toxins, were safely digested after accumulating in the liver.

If the drug can achieve regulatory approval, says Zhang, the major application would be the treatment of bacterial infections, especially those involving antibiotic-resistant bacteria. Neutralizing bacterially produced toxins not only protects the body, but can also weaken the bacteria against the immune system, since the bacteria can no longer rely on the toxins for protection, says Zhang. This is one of the ideas behind a relatively new approach to treating antibiotic-resistant bacterial infections, called anti-virulence therapy.

Zhang says his group hopes to pursue clinical trials of the nanosponge therapy soon, and he’s optimistic about its prospects. The polymer that makes up its core is already FDA-approved, and the red blood cell membrane is safe since it is taken from the body, he says. Compared to other types of drugs, says Zhang, “I envision much less hurdles for clinical trials and approval.”

How Facial Recognition Tech Could Help Trace Terrorism Suspects

The FBI could use software to help identify suspects, and more advanced techniques are around the corner.

By Tom Simonite on April 18, 2013

The FBI appealed to the public Thursday for help identifying two men shown in pixelated photos and video footage who are suspected of involvement in Monday’s bomb attacks in Boston.

The two men, now identified as Tamerlan Tsarnaev and Dzhokhar Tsarnaev, brothers originally from Chechnya, were involved in a dramatic shootout with police in Cambridge, Massachusetts, on Thursday night. The pair robbed a 7-Eleven and killed an MIT police officer before hijacking a car and engaging police in pitched battles in the suburb of Watertown. The older of the two, Tamerlan, was killed; his younger brother, Dzhokhar, remains on the run as of Friday morning.

Experts say the FBI could have used images from the scene of Monday’s bombing—together with facial recognition software—to search through identity databases. The approach is likely to become more common in the future as new technology makes using facial recognition on surveillance and bystander imagery more reliable.

Deploying facial recognition software in the Boston investigation isn’t straightforward because the images available are very different from the evenly lit, frontal, passport-style photos stored in law enforcement databases. Such mug shots can be matched with about 99 percent accuracy, says Anil Jain, a professor at Michigan State University who works on facial recognition, a figure that falls to about 50 percent for images of good quality but with added complications such as a person wearing a hat or glasses.

Attempting facial recognition on images like those released by the FBI Thursday is out of the question, Jain says. However, there may be other images and videos available that contain a better, higher-quality view, he says. “You could search all the other images based on clothing,” he says, “[and then] you could locate the same person and collect multiple images.”

Such a search could be done manually, but the FBI also likely has access to software that could speed the process by matching images and video footage that show the same scene or area, says Brian Martin, director of biometric research at MorphoTrust, a company that provides facial recognition technology to the FBI and the U.S. Department of Defense.

An image found amongst the many provided by witnesses and surveillance cameras wouldn’t have to be a perfect mug shot, either, says Martin. “There are numerous techniques to clean up an image,” he says. “You could improve the resolution, correct shadows, or rotate the pose of the face.”

Facial recognition algorithms struggle once a person’s face is turned by more than about 20 degrees, says Martin, but software from his company can correct turns of up to 45 degrees. It does this using built-in knowledge of facial geometry and by filling in the hidden side of a face by copying from the visible side.

Still, even if the FBI is able to find a photo to submit to its facial recognition search system, it won’t return just a single name, even if the person is on file. “With this type of situation you’re trying to generate leads,” says Martin, and agents would expect to manually screen a list of tens or hundreds of possible matches.

The FBI and other law enforcement and security agencies will see a growing opportunity to use facial recognition, as the volume and quality of surveillance camera and bystander imagery from cell phones grows. That trend is encouraging, and sometimes directly funding, companies like MorphoTrust and academics like Jain to work on technologies that could see facial recognition used routinely in criminal investigations both major and minor.

Martin’s team at MorphoTrust is working on making software better able to handle the kind of images that appear in surveillance and bystander data. “In a case like this you don’t typically get a good frontal view that’s well-lit,” says Martin. “We’re trying to push the boundaries so you can compensate for things like a face more than 45 degrees off to the side.”

With funding from the FBI, Jain at Michigan State is working on software for matching faces from low quality surveillance video against existing image databases. Another project is developing a system that can search a database of faces for a match with a sketch drawn by a forensic artist or a partial or outdated photo.

Other researchers are testing more fundamental rethinks of facial recognition algorithms. Marios Savvides, an assistant professor at Carnegie Mellon and director of its Cylab Biometrics Center, has developed technology that can create an accurate high-resolution image of a face from a poor resolution one, and which can correct for faces turned partly away from the camera.

Savvides’s software matches faces turned to the side by working out what the faces on file would look like when turned by the same angle, and also by tracking features that are still visible. That avoids having to assume the hidden side of a face matches the visible one, as in MorphoTrust’s technology, says Savvides.

“Many cases today, like in Boston and other crimes, law enforcement have low-resolution, off-angle images they can’t do anything with,” says Savvides, “but we can change that.”