If they build it, will we come?

Electric vehicles are the leading alternative for breaking our dependence on gas-powered vehicles as soon as possible. “Electrification is a part of every automaker’s long-term strategy, which is a good indication that it’s not a question of if, but rather how fast electric vehicles will go mainstream,” explains the Electric Drive Transportation Association (EDTA), which represents various electric drive technologies. But, they added, “this is an emerging industry that requires support and cooperation among policymakers and industry at all levels over the long term.”

It’s also a significant change for consumers, for whom cars play an integral role financially and culturally, including the sense of freedom that cars and trucks provide. Alternative fuel vehicles such as electric ones, on the other hand, don’t yet provide that sense of “freedom.” Will it make it? What happens if I run out of “fuel”? If I break down, who can help me? What do I have to do to make it work? Therefore, there needs to be a transition. According to an October 2009 study by the University of Michigan, hybrid vehicles, which are available today, are the most effective transition.

This nationwide study found “a very high responsiveness of demand for hybrid vehicles,” and that providing high customer value is key – that is, reliability, durability and convenience, as well as fuel savings and affordability. A press release on the study states, “42 percent of consumers said there was at least some chance that they would buy a PHEV sometime in the future.” Hybrids have a gas engine and an electric motor that is powered by batteries (the gas engine typically recharges the batteries). An electric vehicle is 100 percent electric (e.g. NEVs). Both types have regenerative braking, a feature common to all EVs, although its ultimate impact is debatable.

Ultimately, the automakers’ long-term EV strategies include various vehicles not yet in production, only in development, except for low-speed neighbourhood electric vehicles (NEVs). The first full-service electric vehicle due out is the Chevy Volt, expected to hit showrooms in November 2010 at a sticker price of $40,000, with eligibility for a $7,500 federal tax credit. But the proof will be in the showroom, as the production date, price and fuel economy have changed a number of times. The heavy marketing of these vehicles, the most intense of which has been for the Volt, could promote electric vehicles to the mass market in a potent way. However, if the vehicle fails to meet performance expectations, it could sour the public on EVs.

The University of Michigan study also validates the behavioural economics theories described in the first of this article series: “Social factors are just as important as economic factors in spurring the adoption of hybrid vehicles, and increasing social forces pushing toward the purchase of hybrids may be cheaper than using economic incentives.” People liked buying the Toyota Prius because its distinctive styling served as “badging” of its owner’s social responsibility, for example. The key requirements that potential customers repeatedly cite for adopting electric cars are cost, convenience and performance. If the cars are perceived to be too expensive, interest declines precipitously (in the University of Michigan survey, at cost premiums of $5,000 and $10,000, adoption likelihood drops to 30 percent and 14 percent respectively). Fuel and environmental savings are not enough to compensate. Batteries are the big cost problem, as well as a key driver of infrastructure decisions, such as battery “swapping.” The “complexity of multiple pack configurations, for different customers, raise issues of complexity in design and operation of such battery switching stations,” according to a recent DOE/EERE report.

“Freedom” demands range and easy, ubiquitous access to recharging – at parking spaces, shopping centres, supermarkets and rest stops. Building codes need to require it. A nascent GPS-based service that might make drivers feel in control, and thus “free,” is Road2, developed by Celadon Applications. It allows you to monitor how much “fuel” the vehicle has remaining, where you will need to recharge (travelling at various speeds) and, importantly, where you can conveniently recharge along your route. In addition, we need our technicians properly trained so we have peace of mind if there’s a problem. As of February 2009, Portland (Oregon) was the number one city in the nation for new hybrid sales, with 12.2 per 100,000 households, according to the DOE/EERE’s September 2009 report. The city takes a multi-pronged approach that includes infrastructure and statewide financial incentives on top of federal tax credits.
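Range prediction of this kind reduces to simple arithmetic over remaining battery charge and speed-dependent consumption. The toy sketch below illustrates the idea; the consumption curve, figures and function names are invented for illustration and are not Celadon’s actual Road2 implementation:

```python
# Toy sketch of the kind of range calculation a service like Road2
# might perform. All figures below are invented for illustration.

def range_km(battery_kwh, speed_kmh):
    """Estimate remaining range; consumption rises with speed (assumed model)."""
    kwh_per_km = 0.12 + 0.001 * max(0, speed_kmh - 50)  # assumed curve
    return battery_kwh / kwh_per_km

def next_charge_stop(battery_kwh, speed_kmh, stations_km):
    """Pick the farthest charging station still reachable along the route."""
    reachable = [s for s in stations_km if s <= range_km(battery_kwh, speed_kmh)]
    return max(reachable) if reachable else None

# 16 kWh remaining, cruising at 90 km/h, stations at the 40/80/120 km marks:
print(round(range_km(16, 90)))                   # → 100 (km)
print(next_charge_stop(16, 90, [40, 80, 120]))   # → 80
```

The same calculation, rerun continuously as charge and speed change, is what would let a driver see in advance where a convenient recharge falls along the route.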

Oregon is now “among leading states with respect to organised support for plug-in vehicles,” with statewide building codes in place for EV charging stations. Oregon’s Department of Transportation Office of Innovative Partnerships and Alternative Funding states that “having one common, open system for charging all types of vehicles is an important factor in making the transition successful. This effort will help gain public recognition and consumer confidence in the EV charging infrastructure by providing uniform performance and safety features throughout Oregon.” Oregon is also developing an Electric Vehicle Supply Equipment installation manual for both residential and commercial sites.

Then there’s the increased demand that all these electric vehicles will place on the grid. Current reports reach “a consistent conclusion that there is plenty of capacity to support a plausible market for many years.” The University of Michigan study also validates the need for a more robust economic incentive programme, reflected in the decline in EV purchase rates as the cost premium rose. The government tax credits are a nice boost, but the buyer still has to finance the full cost of the vehicle, which is a significant deterrent.

Prices of EVs and hybrids are driven by battery costs, and the technology is still evolving. The $40,000 Volt is reportedly competing with cars in its class in the $20,000 range, even though Chevy is pricing the Volt well below cost for marketability. The Tesla sportscar, at $110,000, is clearly not a mass market purchase. Both vehicles use lithium-ion batteries, which are predominantly responsible for the high cost, but as production of these batteries ramps up their cost is expected to fall. However, how these batteries perform in the long run is still unknown. The mass market demands performance – speed, range, smooth driving, reliability – which in EVs depends on the battery technology and its efficiency. Industry sources point out that current incentives are based on battery size, which penalises efficiency, and they suggest rewarding “vehicle capabilities” instead, such as the number of passengers or amount of cargo the vehicle holds, the speed and range (with caps), or safety tests passed.

The EDTA adds that policymakers can help by “establishing support for battery warranties and charging stations,” which will help reduce purchase prices and “overcome initial market barriers.” There is still a perception that all types of electric vehicles are unsafe and slow, but promotions emphasising the performance of current hybrids, the Volt, the Tesla Roadster and the new Fisker Karma sports cars should help overcome those market barriers as well. According to the University of Michigan study, 54 percent of consumers cited “dependence on foreign oil” as the key driver for them to transition to EVs – and this survey was conducted when gas prices were at their peak, between July and November 2008.

To transition from “gas guzzlers” to electric vehicles, we need to change “how” we drive, not just “what” we drive. We need to think before we reflexively get in our SUV to go to the market. Can we walk? Can we take a neighbourhood electric vehicle? Can we bicycle? The technology absolutely needs to rise to meet consumer demands to take to the open road, but consumers also need to take responsibility for their habits. Hybrid electric vehicles don’t break old habits, nor train us to plug in our cars, since most function much like gas cars. NEVs do train us to charge the car, at least from a 110-volt household outlet, and to think about our habits – so they are an important part of the transition.

The know-how to build EVs and the required infrastructure exists, and we are motivated to drive them, so the transition to electric vehicles should be a no-brainer. But… will we? What has to happen for the masses of “garden-variety consumers” who buy Fords and Toyotas today to put their money and lives into an electric vehicle?

If we want to reduce our dependence on foreign oil and save the planet, we must all embrace hard choices and risks, including challenging our own assumptions and changing our own habits, and pushing our governments and private companies to do the same.

A necessary technology

Sometimes you need to face facts head on: like the fact that the electricity powering much of China and India’s fast-rising development is coal-generated. Yet coal is generally perceived as a “dirty” fuel. That means solutions like Carbon Capture and Storage (CCS) are now being readied to help capture the carbon dioxide emitted by coal power, and safely pipe it underground. It could, say some, help save the planet.

Hans Bolscher, Project Director for CCS for the Dutch Ministry of Environment and Ministry of Economy, says a good CCS programme connects all three key stages of trapping, transporting and storing greenhouse gases. But it’s complex, not to mention expensive.

CCS in a nutshell
CCS is a three-step process: capturing CO2 from power plants and industrial sources; transporting it, usually via pipelines, to a storage site; and storing it, usually in deep saline formations or depleted oil or gas fields. Unmineable coal seams and enhanced oil recovery sites can also be used for storage.

So what about the nuclear route instead? “Nuclear energy,” says Bolscher, “cannot be harnessed quickly enough to fill the gap. Only CCS has the capacity to be readied quickly enough.”

“Actually, we would love to believe that reducing individual carbon consumption – by making our cars and light bulbs more efficient, for example – would be sufficient to fulfil the climate objectives. That’s the argument Greenpeace puts forward. But new kinds of behaviour, such as the huge rise in cheap flying, completely destroy this idea. Flying has a truly terrible impact on our carbon footprint.”

The need for CCS incentives and investment
• A robust price for carbon needs to be established quickly so that investors have the incentive needed to develop low-carbon alternatives.

• Policy frameworks have to be established and made credible – businesses will only invest in CCS technology if the rules will not change further down the road.

• Governments need to provide business and consumers with the right mix of incentives and penalties. The right mix of rewards and regulations from government and business will encourage consumers to play their part – and make Europe a low-carbon world leader.

Ambitious targets set
Many countries have been successful with their alternative energy programmes. Germany, for instance, leapt in head first, developing innovative wind and solar programmes. However, the Dutch have been pioneering their own CO2 reduction strategies, says Hans Bolscher. “In many ways the Dutch route has been more effective, combining heat and power sources. Overall it’s been a more effective way of reducing CO2 emissions, we would argue.”

Much of the new Dutch attitude to CO2 emissions stems from a Dutch cabinet, formed back in 2007, that was determined to tackle climate change head on. It made CCS a core component of Dutch ambitions to slash CO2 emissions. “We had a very ambitious programme with CCS at the heart of the programme. Until 2020 our CCS programmes will see relatively modest gains. But from 2020 we predict our CCS programme will see our emissions cut by a third.” And by 2030 the savings will be even more significant.

The Dutch have given their CCS programme serious support: they’ve created a high-level task force staffed by a mix of CEOs and ex-cabinet figures, including a former prime minister and a former minister. “We [the Dutch government] published a letter two months ago describing very clearly our CCS policies and how we intend to achieve them. We don’t say CCS is more important than energy efficiency or sustainable energy. It’s a rather more joined-up approach than that.”

Taking the initiative
The Dutch government is putting money on the table to back its CCS programme. It has allocated €90m to support five separate CCS demonstration initiatives. Three of these projects will capture CO2. The two other demonstrations will show how CO2 can be stored on land. “From that moment the public debate really began,” says Hans Bolscher. “We’ve also got to prepare legislation that will take care of infrastructure issues, as well as storage issues.”

Meanwhile CCS needs major investment funding – and quickly. “We need big investment as soon as possible,” says Bolscher. “The success of CCS depends on the carbon price and the expectation of higher prices in future. That’s why the government is keen to put money on the table. By 2020 the carbon price should be high enough to promote CCS as one of the most efficient measures for heavy industry.”

But if CCS isn’t up and running in time then Bolscher says it may have to be forced on mainstream industry. “Should industry adopt it or not? We think it should be adopted, regardless of the carbon price – and if necessary the Dutch government will be willing to force industry to adopt it, backed by legislation.”

Not under my backyard
There’s also, of course, the question of public perception. Although CCS has won much support from the environmental community, that support is not universal. “Already we have many people opposing it on the grounds of NUMBY-ism – Not Under My Back Yard.”

Meanwhile, organisations like Greenpeace argue that CCS diverts attention away from demand reduction and renewable energy; Greenpeace also claims the technology is hugely expensive and carries significant liability risks. This means that the arguments for CCS need to be clearly explained and defined for the public.

“There are plenty of negative preconceptions about CCS,” acknowledges Bolscher. “The most popular is that CCS is unsafe. That’s a simplified conclusion, but as it is a new approach we accept there is going to be discussion about it. Another preconception is that CCS bypasses sustainable energy. Again, untrue. CCS is also about maximising energy efficiency and sustainable energy as a joint package.”

Others meanwhile argue that CCS is hugely expensive – which it is. “Yes, but it’s much less expensive,” says Hans Bolscher, “than solar energy, and about the same cost as wind power. Also, the costs will come down over time.” The main challenge for the moment is lowering the cost of capture and getting the scale of operations up so that the cost of transport and storage comes down.

A timetable to bring results
So where will CCS technology be in the next five to 10 years, and how will it have changed in that time? “By 2015, we will have two major clusters of demonstrations up and running with 3-5 megatonnes of CO2 captured annually,” says Hans Bolscher.

“The public debate will have eased and we will have a worked-out strategy on how to use CCS in the next decade. By 2020, we will have implemented CCS with 50 percent of all major industry emitters. The amount of carbon dioxide stored will rise from 20 megatonnes by 2020 to 40-50 megatonnes up to 2030.”

The Netherlands’ own action plan calls for annual energy efficiency improvements of two percent by 2020, a 30 percent reduction in greenhouse gas emissions by 2020 (against a 1990 baseline) and 20 percent renewable energy in the energy mix by 2020. The Clean and Efficient programme expects to reduce greenhouse gas emissions from 212 million tonnes in 2005 to 158 million tonnes in 2020.

Although CCS has its drawbacks – such as cost and questions over its long-term viability – many environmentalists generally see it as the best short- to medium-term route for the planet. “I would say three countries – Norway, the UK and the Netherlands – are really ahead of the game regarding CCS,” says Hans Bolscher. “They’re certainly ahead of the rest of Europe.”

Further information www.vrom.nl; http://international.vrom.nl

Ocean warming conundrum

The deep oceans are warming. Not by much, but spread over their vast depths the change is significant, adding to sea level rise and possibly heralding even greater impacts for mankind and the planet.

While scientists aren’t yet certain if the warming is caused by climate change, they are scrambling to learn more about what’s going on.

This is because the layer starting roughly 2km (one mile) from the surface makes up about half the world’s oceans and plays a key role in regulating the planet’s climate.

“A decade or so ago we had this picture in our minds that deep oceans were pretty stable and that things didn’t change much there,” said oceanographer Steve Rintoul of Australia’s state-backed science and research body CSIRO.

“What’s changed in the last decade is that we’ve started to accumulate enough measurements to show there are widespread changes happening in the deep ocean. And those include really remarkably widespread warming of the deepest layers of the ocean,” he told reporters from Hobart, Tasmania.

Water expands as it gets warmer and this, along with the melting of glaciers and ice caps, is a major force behind rising sea levels.

Seas, on average, are rising at a rate of 3mm a year but some studies suggest they could rise by up to a metre by 2100, inundating low-lying coastlines.

“The heat storage aspect is important because over the past 50 years, about 90 percent of the extra heat stored by the earth is now found in the ocean,” said Rintoul. The deep ocean takes up 10 to 20 percent of this.

Scientists say that extra heat is being trapped by greenhouse gases released by agriculture, deforestation and the burning of fossil fuels.

Warmer waters
The greatest warming of the deep oceans has been recorded near Antarctica and the North Atlantic.

These are the two regions where very cold, salty water sinks from the surface to the depths in a motion that helps drive a global circulation of ocean currents that regulates the climate, for example by giving northern Europe its mild weather.

The water that sinks off parts of Antarctica heads north into different ocean basins as it branches out. It can take centuries to make its way back to the surface.

“We’re seeing warming. We’ve only seen this pattern for a decade or two now,” said Gregory Johnson, an oceanographer at the US National Oceanic and Atmospheric Administration.

He pointed to the difficulties of taking measurements in the crushing depths, which have limited scientists to taking samples every decade on costly voyages that transect an area of ocean.

“When we go out and do these measurements, we go out and stop the ship and lower the instrument down to the bottom and bring it back. It’s sort of like going across the ocean at a slow jog because you spend more than half your time stopped and sampling”.

He said the observed warming rate for the deep layers of the Southern Ocean, between Australia and Antarctica, was about 0.03 degrees Celsius per decade.

“It seems very small but it’s actually a huge amount of energy uptake. Compared with mankind’s global energy consumption rate, it’s three times that rate going into the deep ocean,” he told reporters from Seattle.

“That’s about four Hiroshima bombs every five seconds, or five hairdryers for all 6.8 billion people on the planet going continuously,” he said.
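Johnson’s three comparisons can be cross-checked with rough public figures. The sketch below uses assumed constants (the Hiroshima bomb yield, a hairdryer’s wattage and a global primary energy use of about 15 TW around 2010, none of which come from the article) and shows all three framings landing near the same heat flux of roughly 45-50 terawatts:

```python
# Rough sanity check of the heat-uptake comparisons quoted above.
# All constants are assumptions, not figures from the article:
HIROSHIMA_J = 6.3e13           # assumed bomb yield in joules (~15 kt TNT)
HAIRDRYER_W = 1.5e3            # assumed hairdryer power draw in watts
POPULATION = 6.8e9             # world population figure used in the article
GLOBAL_CONSUMPTION_W = 1.5e13  # assumed ~15 TW global energy use (c. 2010)

bombs_flux = 4 * HIROSHIMA_J / 5           # "four Hiroshima bombs every five seconds"
dryer_flux = 5 * HAIRDRYER_W * POPULATION  # "five hairdryers per person, continuously"
consumption_flux = 3 * GLOBAL_CONSUMPTION_W  # "three times mankind's energy use"

for name, watts in [("bombs", bombs_flux), ("hairdryers", dryer_flux),
                    ("3x consumption", consumption_flux)]:
    print(f"{name}: {watts / 1e12:.0f} TW")
```

Under those assumptions the three framings agree to within about 10 percent, which is what makes the vivid comparisons credible.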

Carbon impact
Some areas, such as the Southern Ocean, have been sampled more than others.

And what scientists have found is worrying.

The water sinking off Antarctica is becoming fresher and therefore less dense, though it’s unclear if this will lead to long-term changes in the speed of deep ocean currents.

Changes in wind patterns are also causing more deep, carbon-rich water to come to the surface.

The oceans are a major carbon “sink”, soaking up large amounts of the main greenhouse gas carbon dioxide, including about a quarter of all the CO2 emitted by human activity.

Oceans store about 50 times the CO2 in the atmosphere. And most of this is stored in intermediate and deep ocean waters.

“There are huge amounts of carbon stored in those waters below the 2,000 metre mark,” said Bernadette Sloyan of the CSIRO’s Marine and Atmospheric Research division in Hobart.

“And changing the temperature changes the ability of the ocean to hold and store that carbon as a reservoir,” she said.

Mankind’s fossil fuel emissions are the equivalent of about six billion tonnes of carbon annually, a fraction of the estimated 38 to 40 trillion tonnes of carbon stored in the intermediate and deep ocean layers.

At present, while the ocean naturally releases carbon dioxide gas in upwelling currents off Antarctica and in parts of the tropics, the world’s oceans overall soak up more than they emit.

Putting the heat on
But scientists say that could change.

“The changes we project will happen in the Southern Ocean will tend to make the ocean less effective at storing CO2,” said Rintoul.

The deep oceans are also a major source of nutrients, such as iron, that ocean ecosystems need to survive.

“Changes in the circulation of the ocean and deep ocean and how it interacts with the upper ocean will have significant impacts on bringing nutrients back up to allow ecosystems to keep thriving,” Sloyan said.

For now, scientists are trying to speed up measurements to figure out if mankind has woken up a monster in the deep.

Rintoul and Johnson said more study was needed to pin down any direct climate change connection.

“At the moment we can’t really say that the pattern of deep warming that we see is a signal of human-caused climate change,” said Rintoul.

“And the reason we can’t say that is partly because we only have a few decades of observations and also because we don’t really understand the processes that control variability in the deep ocean as well,” he said.

Self-propelled gliders being developed to take deep water samples will help. Engineers are also working to expand a network of floats, which can presently sink to 2,000 metres to take data readings, with new models that can go even deeper.

“It’s said we know more about Mars than we know about the deep ocean. It’s absolutely true,” Sloyan said.

A touching scene

Displax, a company that makes interactive technologies, says it’s found a way of turning any surface – flat or curved – into a multi-touch screen. The surface can be made from glass, plastic, wood – any material that doesn’t conduct electricity.

The company, based in Braga in north-western Portugal, has developed a transparent polymer film, thinner than paper, that can be fixed to a surface, making it interactive. Significantly, the film can be applied to standard LCD screens, which means that existing television and computer displays can become touch sensitive.

The technology works by processing multiple input signals received from a grid of nanowires embedded in the film. Each time a finger is placed on the screen, it causes a small electrical disturbance.

A microprocessor controller analyses this data and decodes the location of each input on the grid to track movement. The film is so sensitive that users can interact with it not just through touch but by blowing on it. This, says the company, opens up new possibilities for future applications.
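The decoding step can be pictured as peak-finding over the grid of disturbance signals. The toy sketch below is not Displax’s proprietary algorithm; it simply illustrates the general idea of locating each touch as a local maximum in the measured signal grid:

```python
# Toy illustration of decoding touch points from a grid of wire signals.
# Each touch perturbs nearby row/column wires; the controller locates
# the peaks of that disturbance. (Not Displax's actual algorithm.)

def decode_touches(grid, threshold=0.5):
    """Return (row, col) cells whose signal is a local maximum above threshold."""
    touches = []
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            v = grid[r][c]
            if v < threshold:
                continue
            neighbours = [grid[rr][cc]
                          for rr in range(max(0, r - 1), min(rows, r + 2))
                          for cc in range(max(0, c - 1), min(cols, c + 2))
                          if (rr, cc) != (r, c)]
            if all(v >= n for n in neighbours):
                touches.append((r, c))
    return touches

# Two fingers pressing a 4x4 grid produce two signal peaks:
signals = [
    [0.0, 0.1, 0.0, 0.0],
    [0.1, 0.9, 0.1, 0.0],
    [0.0, 0.1, 0.0, 0.1],
    [0.0, 0.0, 0.1, 0.8],
]
print(decode_touches(signals))  # → [(1, 1), (3, 3)]
```

A real controller would also interpolate between wires for sub-cell precision and track each peak across frames to follow gestures.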

The lightweight film can be applied to any surface from 18 centimetres to three metres across the diagonal and can be safely used on outdoor displays. Currently, it can detect 16 fingers on a 50-inch screen, but that is expected to increase as development progresses.

The technology is extremely powerful, precise and versatile, says Miguel Fonseca, chief business officer of Displax.
“Almost everyone who sees it thinks of new applications, from converting LCDs into multi-touch screens, tables into multi-touch tables, to creating interactive information screens in stores, shopping malls or public areas, to developing new exciting gaming environments.”

Displax developed the technology primarily for commercial environments, but it now expects potential customers to come from industries as diverse as telecoms, retail, property, broadcast, pharmaceuticals and finance. The company expects consumer applications to be developed using the new technology as well.

“The technology will open up new opportunities for many market players, technology vendors as well as businesses,” said Fonseca.

In partnership with the sun

In the last few years, Kerself has won the trust of investors and the market in which it operates, to become the Italian leader in the engineering, design, production, installation and distribution of photovoltaic energy plants.

Founded in 1998 by its current chairman and CEO, Pier Angelo Masselli, the company has gradually extended its operations and markets by means of a growth strategy that has always aimed high – from photovoltaic systems to the new frontiers of renewable energy, and from the Italian market to Europe and beyond. Based in Correggio, near Reggio Emilia in northern Italy, this is an ecological company par excellence, offering products and solutions for the development of clean energy while safeguarding the environment.

Steady growth
Over the last few years, Kerself has expanded through organic growth and important, targeted acquisitions in the photovoltaic sector. The forecast turnover of the group for 2009 is €400m, with an EBITDA of €55m. It directly controls five companies, Helios Technology, Nuova Thermosolar, DEA, Saem and Ecoware, all of which operate in the solar energy sector. With a direct and indirect workforce of more than a thousand, the group exports its photovoltaic systems to the main Mediterranean countries, while at the same time keeping a careful eye on the non-European markets, America especially.

Kerself has achieved its competitive advantage not only by means of technological and functional innovation, design and the availability of a broad range of products and solutions, but also through a vertically integrated group structure, with individual specialist companies experienced in different stages of the process, from the production of cells and modules to the engineering, design, installation and distribution of photovoltaic plants of all sizes, from small units to major solar energy fields. Among the main success factors are the quality of the production processes, research and development, a strategy based on creating loyalty among installers of proven experience, and a sales network which operates throughout the country.

“Our company has always been environment-friendly, and we’ve always backed a sustainable energy model based on renewable sources,” says Angelo Masselli, who is proud of his business model and its ability to create value and employment.

Alliances
Another important point of the group strategy is alliances. In 2008, Avelar Energy, a company belonging to the Russian giant Renova, became a shareholder of Kerself, where it plays a role as a solid business and industrial partner, with an agreement to install more than 100 MW by 2011. In the meantime, Masselli has set up important negotiations with some of the biggest international market leaders in the photovoltaic sector and in the production of thin film (the latest frontier in the photovoltaic area), with a view to guaranteeing the finest technology available at all times. A number of cooperation projects are also going ahead in other renewable energy sectors, such as geothermal energy and eco-compatible materials for building and energy-saving applications, as is also taking place in other European countries.

The internal R&D team, in cooperation with the most important Italian university research departments, is focusing on the applications front with a view to guaranteeing products of excellence at all times, tailored for the requirements of each customer.

Taking advantage of opportunities
The international development of Kerself is ongoing, thanks to a few important agreements recently signed, which will contribute to the expansion of the group’s growth strategy. The first of these involves the setting up of a joint venture in Israel, along with important local enterprises operating in the renewable energy sector. The agreement aims at taking advantage of the opportunities offered by the Israeli market, thanks to the prospect of new, incentive-based tariffs and the country’s excellent sunshine exposure. Kerself will provide its know-how in fixed structure technology and biaxial digital trackers, while the Israeli partners will share their commercial contacts network and will install the equipment directly on-site.

The latest important agreement is a contract signed with a fund in the Netherlands for the development of renewable energy projects in Europe, involving the turnkey supply of photovoltaic plants with a total output of 25 MW in 2009. Financial funding has already been received from a major European bank, and the plants will be built in the Marche and Puglia regions, in central and southern Italy respectively. The agreement also extends into the years to come, and involves the supply of turnkey plants with an additional output of 75 MW by 2011.

Further information: www.kerself.eu

Mobile phones bring insurance to Kenyan farmers

Farmers can insure the cost of seeds, fertilisers and pesticides at local agricultural supply shops by paying an extra five percent of their value. If their harvest fails due to bad weather they are reimbursed and can plant again.

For now, the policy only covers wheat and maize and is available for the agricultural inputs of Syngenta and two Kenyan partners supplying seeds and fertiliser not sold by the Swiss agrochemicals company. The companies in turn subsidise some of the insurance costs.

“Last year, when I took out the insurance policy, we had a total crop failure. The crop didn’t even reach the flowering state, it dried up,” said Jane Gathoni Simon, a maize farmer who took part in a pilot programme last year.

“But at the end of the year we were compensated. I managed to get the (replacement) seeds in time and planted.”

Food prices rocketed in much of east Africa in 2008 and 2009 after a succession of failed rains hit staple crops such as maize. The region is now being deluged by above normal rainfall, causing widespread flooding in many areas.

On purchase, dealers use a camera phone to scan a barcode that automatically registers the policy with Kenyan insurance provider UAP over Safaricom’s mobile phone network.

Confirmation of the policy is then sent to the farmer on his mobile phone via a short text message.

Safaricom, 40 percent owned by Britain’s Vodafone, is the leading mobile provider in east Africa’s biggest economy with nearly 15 million subscribers at the end of 2009.

Lost without rain

The local climate is monitored by 30 solar-powered weather stations that transmit rainfall, sun radiation, temperature and wind data every 15 minutes over the mobile data network.

In the case of drought or excessive rains, registered farmers automatically receive insurance payouts through M-Pesa, Safaricom’s successful mobile money transfer service.
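An index-based trigger of this kind is easy to sketch. The thresholds, station names and payout rule below are invented for illustration and are not UAP’s actual policy terms; the point is that payouts follow automatically from the weather-station data, with no claims process:

```python
# Toy sketch of a weather-index trigger like the one described for
# Kilimo Salama. Thresholds and data are invented; real policy terms differ.

def payout_due(season_rain_mm, low=200, high=900):
    """Trigger a payout when seasonal rainfall falls outside the insured band."""
    return season_rain_mm < low or season_rain_mm > high

def settle(policies, rainfall_by_station):
    """Return farmer IDs owed an automatic payout (sent via M-Pesa in practice)."""
    return [p["farmer"] for p in policies
            if payout_due(rainfall_by_station[p["station"]])]

policies = [{"farmer": "F001", "station": "eldoret-1"},
            {"farmer": "F002", "station": "eldoret-2"}]
rain = {"eldoret-1": 120, "eldoret-2": 450}  # mm over the season (invented)
print(settle(policies, rain))  # → ['F001']
```

Because the trigger is the station reading rather than an inspected field, the insurer never has to visit the farm, which is what keeps the costs low enough for a five percent premium.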

Some 9,000 farmers have signed up for the insurance and 100 more are joining daily, said Rose Goslinga, project leader from the Syngenta Foundation for Sustainable Agriculture.

“The main problem that farmers face in the end is the weather, it’s the one thing they can’t control,” Goslinga said.

“They can do all the things that have been taught by agronomists, but if it doesn’t rain, they are lost. With these weather stations, with insurance, they get an option to do something about their fate.”

However, the new crop insurance known as Kilimo Salama – Swahili for safe farming – covers only variability of rain and not crop failure due to pests or disease.

James Wambugu, UAP Insurance executive director, said technology is making insurance more accessible in a country where only six percent of adults have any form of protection.

“By harnessing technologies like M-Pesa and weather stations, our costs are low and we are able to bring down the cost of the insurance,” he said.

Mobile infrastructure
The ease of information transfer is largely due to Kenya’s rapidly growing mobile sector, which is the benchmark for the region both in terms of penetration and innovation.

“A lot of our subscribers are in very remote places, so by paying insurance premiums and payouts through the mobile phone, we cut out one to three months of travel in one or two mobile transactions,” says Wadzanai Chiota, head of value-added services at Safaricom.

The Eldoret project is not the first attempt at using technology to bring insurance to rural Kenyans.

In January, the International Livestock Research Institute set up a pilot insurance programme for pastoralists that uses satellite imagery to document potentially lethal losses of the grasses and shrubbery the livestock relies on.

At the moment, UAP’s insurance is limited to areas that are near one of the 30 weather stations in the fertile Rift Valley. Designers of the project are aiming for 500 stations in Kenya with hopes of reaching as many as 50,000 farmers by 2012.

But not all farmers are fully convinced.

Ezekiel Rop harvested only two bags of maize per acre in 2009 after the devastating drought, instead of the usual 15-20. He had to sell some cows and depend on relief food to get by.

“I’ve heard about insurance for farmers, but I’ve not considered it,” he said. “I want to understand it before I buy it.”

Welcome to the iGeneration

The iGeneration want to multi-task all the time; they want to do as many things as they can, all at once. They want to do more than they could an hour ago, and they want it now. Considering that Apple’s iPhone only launched in the summer of 2007, its rapid proliferation is phenomenal.

Not so long ago, the smartphone was seen merely as a phone with bonus functionality. The iPhone and its applications have changed everything, essentially providing a pocket-sized PC customisable to each user’s needs – business, pleasure or a combination of the two.

The jack of all trades
Applications enable your miniature jack-of-all-trades to e-mail, take pictures, keep track of breaking news, watch films, listen to music, play games, read the dailies – the possibilities are nigh-on endless. Take the example of Sky News’ app: it allows users to send in their own news reports and pictures directly, perfectly epitomising the effect Web 2.0 has had on our consumption of news. Previously, Web 2.0 found its form in smaller, more compartmentalised outlets; for a brand as universal and far-reaching as Sky News to take the bold step of opening its ears and eyes to this all-new bespoke world is a decisive move. The mobile web has become not just entirely bespoke but, moreover, entirely user-centred.

Digression versus work-obsession
For the everyday member of the iGeneration, the most popular apps are likely to be communications-related. 24-hour access to work e-mails, being on call all the time – as much as the iPhone’s boundless capacity to entertain could be seen to facilitate laziness, there is also a danger of it inducing “work crazy” behaviour. The recession provides a convenient excuse to be on call more often than is “healthy”. Social networking and the on-the-go PC are similarly a match made in heaven, allowing Twitter and Facebook users to tweet, update statuses, upload photos and communicate while on the move. Whether the two extremes will counterbalance is yet to be seen.

Multiple-personality disorder
Apple’s keynote at the WWDC last week saw senior vice president of iPhone software Scott Forstall analyse the rapid growth of apps. Less than a year ago, OS 2.0 and the SDK were released and now, we are at a point where there are more than 50,000 apps for sale in the App Store. By April 2009, one billion applications had been downloaded. The statistics are phenomenal and entirely unprecedented – in comparison with the Palm Pre and the BlackBerry, the iPhone is a far more multi-faceted gadget.

In terms of entertainment, the new OS 3.0 games are as polished as a PC user could expect. For the user wanting GPS functionality, the iPhone offers apps to rival standalone navigation systems. The speed of development, at least in the public eye, has been phenomenal; if this is a taster of the future of apps, the population will soon be able to exist without ever having to leave their collective chair. They can do anything at the tap of a touch-screen. Gone are the days of scouring a phone book or dusting down books in the attic in pursuit of information; the iPhone has embedded itself in the 21st century, for better or for worse.

The parallel, portable web
The mobile web is a new and exciting medium; a front door, an open-access platform for developers. The browser has lost importance in this new world, in stark contrast with the parallel world of the PC, where the launch of Google’s new minimalist Chrome browser is still generating a frenzy. Apple’s vision runs somewhat parallel to the internet’s; under Steve Jobs, the iPhone offers a versatile set of content-providing tools. And by placing itself at the hub of this new world, Apple’s iTunes store has become the financial protagonist of this new narrative, taking a 30 percent cut of each sale.

The future
The more popular the iPhone gets – and the greater the demand for more functionality – the more apps will become available. The more apps there are, the more crowded the iTunes store will be. With everything from iPickupLines (an automated app suggesting pick-up lines such as “If you were the new burger at McDonalds you would be the McGorgeous!”) to LogMeInIgnition (allowing users to take remote control of another computer and troubleshoot problems) available right now, the marketplace leaves few gaps that aren’t immediately filled by keen developers. It is a fantastic new platform for entrepreneurs and consumers alike, and the iGeneration live through a portal that the generation preceding them could never have predicted.

Panel rejects Avastin for breast cancer

If regulators follow that advice, the Swiss drugmaker could no longer promote Avastin for that use in the US.

Doctors still could prescribe Avastin for breast cancer as it would retain approval for colon, lung, brain and kidney cancers, but sales would likely fall.

Breast cancer treatment accounts for about $1bn of Avastin’s more than $6bn in annual sales, analysts said. The product is Roche’s top-selling drug.

Members of a Food and Drug Administration panel said they did not see enough of a benefit from Avastin in advanced breast cancer to justify its serious risks. They voted 12-1 to urge the FDA to remove the breast cancer approval.

The drug’s risks include gastrointestinal perforations, bleeding and blood clots.

Roche said rates of those problems were low, at less than four percent in the breast cancer trials.

But panel chairman Wyndham Wilson, a National Cancer Institute researcher, said there was definitive evidence that Avastin causes serious and life-threatening side effects. “Small numbers, but if you’re the one, that’s not what you want to be exposed to,” he said.

Overall data “does not support this being effective” in advanced breast cancer, Wilson added.

The FDA usually follows panel recommendations.

The vote was a rare setback for Avastin, a widely used treatment for a variety of cancers.

Morningstar analyst Karen Andersen said the negative ruling in breast cancer “wouldn’t be something that would change my valuation of the firm.”

“Colorectal and lung cancer are both markets where Avastin has been able to penetrate a very high percentage of the first line patients. (Roche) really relies on those indications for the foundation of Avastin sales,” she said.

Roche unit Genentech said the company stands by its data showing Avastin helps patients with advanced breast cancer. “Avastin should be an option for patients with this incurable disease,” a company statement said.

Avastin won clearance for breast cancer in 2008 under a shorter approval process, but Roche was required to run two follow-up studies to confirm the drug’s effectiveness and receive full approval.

Those studies failed to confirm the level of benefit seen in the initial breast cancer trial, FDA staff said.

Avastin, which is given intravenously, delayed the growth of cancer by 5.5 months in the initial study. In the two later studies, the time ranged from about one month to nearly three months.

Avastin did not extend patients’ overall survival in any of the studies.

The drug’s generic name is bevacizumab.

Drug firm designs licence to thrive

Award-winning ImmuPharma focuses on the creation of medicines aimed at treating serious medical conditions such as cancer, inflammatory and allergic disorders. However, many UK biotech companies are not good at securing licensing deals – or bringing deals to the table that will attract investors.

Not so with ImmuPharma. It has a global licensing partnership with the highly successful Cephalon, a US specialty pharmaceuticals player. This innovative pharma start-up also boasts talented scientists and a low cost base, not to mention access to world-class technology.

Critically, ImmuPharma also has ongoing human clinical trials on a range of drugs. Which is why investors like M&G – the well-regarded asset management company recently took a 10 percent stake in the business – plus Gartmore, Jupiter, Standard Life and Legal & General, are taking such an interest.
 
“ImmuPharma is a complete steal at current levels and its strong balance sheet gives it little dilution risk. We see numerous catalysts ahead to drive interest in the story globally and are bullish that further Lupuzor data expected imminently will be positive,” said Noble research broker in October 2009.

ImmuPharma focuses on five key markets:
• Lupus
• Cancer
• Moderate to severe cancer pain or post-operative pain
• Highly resistant, hospital-acquired infections such as MRSA
• Inflammatory and allergic disorders
Additionally, ImmuPharma focuses on:
• Niche, high-value specialist areas
• Innovative, targeted drugs addressing unmet needs

Lupus, in particular, says ImmuPharma boss Dimitri Dimitriou, is a good example of a condition where ImmuPharma is taking a strong lead. “Lupus is a chronic, potentially life-threatening autoimmune disease that attacks multiple organs such as the skin, joints, kidneys, blood cells, heart and lungs. There are an estimated 1.4 million people diagnosed with the disease in the seven major countries, according to ImmuPharma and analyst estimates – and there is no known cure.”

Dimitriou knows what he is talking about. He has more than 20 years’ experience in the biotech and pharma industry and has been a senior director at GlaxoSmithKline as well as holding deal-making responsibilities at Bristol-Myers Squibb. (He also held senior marketing roles at Procter & Gamble). He is not short of market nous.

Public for four years, ImmuPharma is principally concerned with developing pioneering drugs in specialist therapeutic areas where there is not much competition. “ImmuPharma has an exclusive collaboration with the Centre National de la Recherche Scientifique (CNRS),” continues Dimitri Dimitriou, “the largest fundamental research organisation in Europe, with a budget of 3.3bn euros (2008).”

Real progress
And so far, progress is hugely encouraging. ImmuPharma’s Phase IIb trial – double-blind and placebo-controlled, in 150 lupus patients who received Lupuzor once a month for three months – demonstrated a significant clinical improvement in their condition, says the company.

ImmuPharma was paid $15m before the study was complete, plus $30m last year for the worldwide rights for Lupuzor. “Cephalon is now responsible for the development and commercialisation of the drug worldwide,” explains Dimitriou. “This is part of one of the largest deals in Europe, with ImmuPharma potentially receiving further cash payments from Cephalon of an additional $450m, depending on the achievement of certain regulatory and sales milestones and even more importantly, high royalties on commercial sales of Lupuzor.”

Although ImmuPharma’s Lupuzor is the headline drug, the company has several other innovative biologicals targeting unmet needs. There have been several material developments that have de-risked Lupuzor, say investment brokers, thanks to exciting clinical data and generally improving investment sentiment.

Low overheads
So ImmuPharma is attractive to investors on a range of fronts. Its low cost base is certainly a draw. Although it has operations in France, it operates as a virtual company in London. Cleverly, it utilises external companies for much of its clinical trials work and makes extensive use of sourcing compounds from drug libraries such as the Centre National de la Recherche Scientifique (CNRS) in France.

Analyst projections on ImmuPharma look resilient. “We estimate that the company will generate total revenues in FY 2009 of £21.56m,” says pharmaceuticals analyst Dr Navid Malik, “which, after costs of £5.7m and a royalty pay-away to CNRS of £4.3m, should generate a pre-tax profit of £11.66m (net profit of £9.9m).”

Next year, grant income of £1m and a pre-tax loss of £4.78m are anticipated. Based on estimates of a further significant milestone payment of £15m payable in 2011, that could mean a £6.74m pre-tax profit.

“The timing of a potential filing for Lupuzor in our view is Q4 2012, with a launch in 2013,” adds Malik. “On launch we are assuming substantial milestones will be payable to ImmuPharma, in the order of $50m for the US market alone. We have modelled a royalty on sales of 20 percent (conservatively, net of payments to CNRS).”

Certainly revenue streams are rare in the biotech sector; funding is often raised through new equity issues. ImmuPharma’s Cephalon deal, however, has alleviated the need for such funding and reduced internal costs – Cephalon has assumed all further costs. Analysts’ forecasts range from £1.54 (Singers) to over £3.00 (Panmure and Matrix). Hugely exciting.

Broker upgrade – why ImmuPharma is a buy
“ImmuPharma is a rare breed among UK biotechnology companies,” says pharmaceuticals analyst Dr Navid Malik. “It has a high-value product, Lupuzor, scheduled to enter Phase 3 trials in Q4 next year, which it has licensed to Cephalon, a high-quality US pharmaceutical company. A strong cash position (£27m) combined with a low burn rate should fund existing operations for many years, reducing the risk of dilution and giving investors some downside protection. We believe the market does not appreciate the full potential value of Lupuzor, its lead programme. We are initiating coverage with a BUY rating. Target price 309p.” Critical to ImmuPharma’s progress with Lupuzor is the development pathway taken by GSK for Benlysta, the first new treatment for lupus in 50 years. A strong Benlysta will benefit Lupuzor, as some of the donkey work in developing this new market will have been done by a far larger pharma player. When that is achieved, Lupuzor has a perfect entrée.

Why lupus is critical
Lupus is a deeply challenging, waxing-and-waning disease, characterised by severe and sudden flares of activity. Typically it targets women of childbearing age. It is an autoimmune condition in which the body’s immune system attacks its own tissues. Periods of improvement or remission can follow, but lupus cannot be cured and patients typically endure it for life. Depending on its progression, it can involve specific organs such as the brain and kidneys (lupus nephritis), and there can be severe, life-threatening complications: serious infections become more common, along with the risk of miscarriage, stroke, disability and ultimately death. In the past, lupus drug development has been littered with failures.

Further information: www.immupharma.com

New findings reveal source of empathy

Women have higher levels of emotional intelligence than men; that’s often just taken as read in management circles.

They are much better at empathy – the ability to see a situation from someone else’s perspective – than men. It means they have a natural edge when it comes to the soft skills needed in the modern workplace.

But there could be a quick fix that levels the playing field for men, and it comes in an unexpected form – nasal spray.

Scientists at Bonn University and the Cambridge Babraham Institute say they’ve found a link between exposure to the neuropeptide oxytocin and the ability of men to show empathy.

The substance also makes men more sensitive to what are called “social multipliers”, such as spotting a disapproving look from a colleague.

The team took 48 healthy males and gave half of them a blast of oxytocin nose spray, the other half a placebo.

They showed the men photos depicting a series of emotionally charged situations – among them, a crying child, a girl hugging her cat, and a grieving man.

They then asked the men to express how they felt about the people in the photos.

“Significantly higher emotional empathy levels were recorded for the oxytocin group than for the placebo group,” says René Hurlemann of Bonn University’s Clinic for Psychiatry.

The dose of oxytocin had the effect of enhancing the ability of the men to experience fellow-feeling. The men who had the nose spray achieved results that would normally only be expected in women, says Hurlemann.

Oxytocin is a hormone that is known to trigger labour pains and strengthen the emotional bond between a mother and her newborn child.

The hormone could be useful as a medication for diseases such as schizophrenia, which are frequently associated with reduced social approachability and social withdrawal, Hurlemann says.

And an empathy spray could be just the product many wives and girlfriends have been waiting for.

Volume gains sweeten US pill

Big Pharma will provide somewhat more in savings than the original $80bn agreement under the latest version of healthcare reform, but the bill passed by the House of Representatives creates 32 million more customers for the industry’s products.

Crucially, the government will not impose drug price caps.

“The amount they are paying, essentially in a tax, over a five to 10 year period is potentially more than offset by the increased volume that they have coming in,” said Ben Yeoh, an industry analyst at Atlantic Equities.

“It looks neutral, within forecast error,” he said. “There’s relief that something has got through that drugmakers can live with.”

But the cost would be counteracted by a sharp increase in the number of insured patients and enhanced revenues from the Medicare programme for the elderly, a brokerage said. Drugmakers were unlikely to revise long-term earnings outlooks on the back of the news, it added.

Ending uncertainty
Under the complex deal hammered out in Washington, the drugs industry will have to pay fees of $3bn from next year – rising to a peak of $4.2bn in 2018 – and provide discounts to help ensure Medicare coverage.

Savvas Neophytou, an analyst at Panmure Gordon, put the cost to the sector at between 1.5 and 2.2 percent of EPS per year for the first five years, but said the end to uncertainty should help sentiment.

The vote ended a year-long political battle with Republicans over the vexed issue of healthcare reform and achieved a goal that has eluded many presidents for a century – most recently Bill Clinton in 1994.

“There’s a sense that if no health reform had passed this time around, then everything would have got a really big kicking in three or four years’ time because, inevitably, something had to give,” said Jack Scannell, an industry analyst at Sanford Bernstein.

Lawmakers also rejected an initial plan to end lucrative “pay-for-delay” settlements between brand-name and generic drugmakers – a win for both groups of manufacturers.

The biotechnology industry, too, has reason to be thankful that the legislation was not worse.

A big fear for biotech investors had been that government would give generic alternatives a fast route to the market. In fact, makers of biotech drugs like Amgen Inc and Roche Holding AG’s Genentech unit will still have a 12-year period of exclusive sales before facing competition from generic rivals.

Zooming in on superbugs

An international team led by researchers from Britain’s Wellcome Trust Sanger Institute used very high-throughput gene sequencing machines to compare individual MRSA bugs from patients and show precisely how they were genetically related.

Methicillin-resistant Staphylococcus aureus (MRSA) causes infections such as blood poisoning and pneumonia and can kill. It is one of a group of drug-resistant bacteria, or “superbugs”, that are major problems in hospitals around the world.

Stephen Bentley, who led the study published in the journal Science, said the new technology had allowed scientists for the first time to find precise differences in strains of the bug – a “fundamentally important” step towards tackling infection.

“It allows researchers and public health officials to see how infections are spread, from person to person, from hospital to hospital, from country to country,” he said.

Until now, even the best methods for identifying genetic differences between bacteria have been unable to pick up tiny differences – leaving uncertainty about how infections spread.

The success of the new method relies on comparing whole genetic codes, the scientists said. The ability to track strains in this way will help researchers understand how strains can spread so rapidly, and should lead to new control strategies, not only for MRSA but also for other emerging superbugs.

The researchers looked at 62 MRSA samples. One set of 42 was taken from hospitals in North and South America, Europe, Australia and Asia from patients who became infected with MRSA between 1982 and 2003, and 20 were from a hospital in Thailand, from patients who developed MRSA within seven months of each other.

“We wanted to test whether our method could successfully zoom in and out to allow us to track infection on a global scale – from continent-to-continent, and also on the smallest scale – from person-to-person,” Simon Harris of the Sanger Institute told reporters at a briefing.

The team sequenced the whole genomes of all the samples and were able to spot single-letter changes in the genetic code and identify differences between even the most closely related bugs.

From the results they created an “evolutionary tree” which showed that MRSA infections are often clustered in locations, but can be spread across borders by patients travelling between one place and another and visiting different hospitals.
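The core of the comparison described above is counting single-letter (SNP) differences between aligned genomes: the fewer the differences, the more recently two samples shared an ancestor. A toy sketch of that step follows; the sequences and sample names are invented for illustration, and the real study used whole genomes and proper phylogenetic methods rather than this simple pairwise count.

```python
# Toy sketch of SNP-based strain comparison (invented 10-letter sequences;
# a real genome runs to millions of letters and needs proper tree-building).
from itertools import combinations

def snp_distance(a: str, b: str) -> int:
    """Number of positions at which two aligned sequences differ."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    return sum(1 for x, y in zip(a, b) if x != y)

samples = {
    "hospital_A_1": "ACGTACGTAC",
    "hospital_A_2": "ACGTACGTAT",  # one SNP from A_1: likely the same cluster
    "hospital_B_1": "TCGAACGTGC",  # several SNPs away: a distinct strain
}

# Pairwise distances are the raw material for an evolutionary tree.
distances = {
    (p, q): snp_distance(samples[p], samples[q])
    for p, q in combinations(samples, 2)
}
closest = min(distances, key=distances.get)
print(closest, distances[closest])  # the most closely related pair
```

Clustering samples by these distances is what lets investigators see, for example, that two patients in the same hospital carry near-identical strains while a sample from another country sits on a distant branch.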

Drug-resistant bacteria kill about 25,000 people a year in Europe and about 19,000 in the US. The European Centre for Disease Prevention and Control says superbug infections cost 900 million euros ($1.31bn) a year in extra hospital time and 600 million euros a year in lost productivity.

Sharon Peacock of Britain’s Cambridge University, who also worked on the study, told reporters the work could “flag up hotspots for MRSA transmission … and these could then be examined to improve infection control strategies.”

Dutch researchers said that all hospital patients should be screened for MRSA to try to halt its spread.