Solar Impulse 2 aims to prove the future of flight is photovoltaic

On June 2, 2014, a unique-looking aircraft took off from an airfield in Payerne, Switzerland. With a wingspan of 72m – greater than that of a Boeing 747 – it was powered by four small engines that together produced roughly the same power as a motorbike. The sole occupant was test pilot Markus Scherdel, housed in a tiny cockpit at the very front of the fuselage. Wind whistling over the wings, the aircraft took to the air without fuss, lifted nimbly into a clear sky and, travelling at a maximum speed of 48 knots, rose to an altitude of 8,000ft before returning to terra firma and a relieved welcoming committee.

The maiden flight of Solar Impulse 2 had proved what the backers of the project had always believed in the face of general scepticism: namely, that it is possible for an aircraft weighing 2.3 tonnes – roughly the same as a large car and 100 times lighter than an Airbus A340 – to fly like a bird without consuming a single drop of fuel. The four engines are powered by nothing more than the sun. And, in an important breakthrough for sustainable transport, they can function at night because Solar Impulse 2 stores the solar power captured during the day.

It is possible for an aircraft weighing 2.3 tonnes to fly like a bird without consuming a single drop of fuel

Thus the flight confirmed that the vision of the project’s founders – famed balloonist Bertrand Piccard and former Swiss Air Force pilot and engineer André Borschberg – is not just a pipe dream. Next May, Solar Impulse 2 is scheduled to embark on the first solar-powered, round-the-world flight – a 35,000km journey into history. Taking off from the Gulf, the aircraft will fly over the Arabian Sea to India and then to Burma, into China, across the Pacific, coast-to-coast over the mainland US, across the Atlantic to southern Europe – or, depending on conditions, to North Africa – before landing back in the Gulf.

Although there have been other solar-powered flights, this one breaks through all the barriers of renewable air power in terms of its length, its green credentials and the quality of its science. With just the pilot aboard, Solar Impulse 2 is designed to fly for five consecutive, continent-hopping days and nights, ploughing through the skies without leaving a trace of exhaust in its wake. In fact, the aircraft must be capable of staying aloft for five days because that’s how long it needs just to cross the Atlantic – approximately three and a half times as long as Charles Lindbergh took in the Spirit of St Louis in 1927.

Unlimited endurance
But, apart from the cleverness of the science (and the courage of the pilot), what exactly will the flight of Solar Impulse 2 prove? After all, we already have solar-powered flight. As the many (mainly Swiss) companies backing the project point out, this is about producing an aircraft that has almost unlimited endurance yet takes nothing from the planet. More deeply, they see it as a project that will, in time, enhance the average person’s quality of life. “This partnership pushes the boundaries of technology and innovation to achieve a better world,” argues Ulrich Spiesshofer, Chief Executive of ABB, the Swiss power generation giant. “We believe in Bertrand’s vision, and we are convinced that, by pioneering innovative technologies, we will be able to decouple economic growth from energy consumption and environmental impact.”

Another major backer is Alfred Schindler, founder and Chairman of elevator-manufacturer Schindler Group. Like Spiesshofer, he sees the project as a game-changer for people and planet alike. “It’s not only about saving and conserving energy, it’s all about working smarter instead of harder,” he says. “While staying in the air – day after day – Solar Impulse moves us beyond the idea of conventional belt-tightening. It proves convincingly that one can tap into a virtually unlimited supply of solar energy. Solar Impulse is a unique platform where creativity meets audacity and where action converts a dream into reality.”

35,000km

Length of Solar Impulse 2’s planned round-the-world flight

17,000

Solar cells on the plane’s wing

In a nutshell, in an energy utopia, the world’s industries and homes – not to mention its aircraft – could be powered without whole populations suffering polluted oceans, disfigured landscapes, contaminated waterways and an ozone-depleted atmosphere.

The Solar Impulse 2 project started a decade ago, when Piccard, who comes from a family of explorers, joined up with Borschberg. Piccard, a psychiatrist by training, and Englishman Brian Jones co-piloted Breitling Orbiter 3 – in fact, a balloon with a habitable pod attached – on the first non-stop balloon flight around the globe. It was a remarkable feat but nothing special for the Piccard family: grandfather Auguste was a pioneering balloonist and father Jacques an undersea explorer, while Bertrand had always mixed a fascination with flight with his other activities. Now 56, he was a pioneer of hang-gliders and microlights, the latter being highly accident-prone in their developmental years.

Sustainable flight
The project embodies Piccard’s commitment to a future of sustainable transport. A friend of Richard Branson, he’s a fan of the British businessman’s space project, Virgin Galactic, and a believer in sub-orbital air transport. “Will commercial aviation soon be able to transport passengers on solar power only?” he asks. “I would be crazy to answer ‘yes’ and stupid to answer ‘no’. The technology for it doesn’t exist today but remember it was also impossible to transport passengers with fuel when the Wright brothers did their first flight in 1903.”

Piccard cannot be accused of lacking imagination. On sub-orbital flight, he says: “Can you imagine a take-off in Europe with clean energy, a climb to 100km high, followed by a parabolic flight to Australia and a landing after one hour? And you get a zero gravity experience for the same flight. This is what I see as the future of commercial air transport.”

Although he’s an adventurer by nature, this third-generation Piccard is painstakingly methodical and scientific in his approach. The maiden flight of Solar Impulse 2 was the result of a decade of research conducted by a team of 80 physicists, materials scientists and aerospace experts. Once the initial scepticism wore off, a host of companies threw their considerable resources behind the project. They include watch and instrument maker Omega, materials group Solvay, Schindler, ABB, Bayer MaterialScience, telecommunications giant Swisscom, Toyota (which has contributed hybrid technology), photovoltaic cell manufacturer SunPower and French aerospace group Dassault. Even the European Space Agency has lent a scientific hand. Combined, they add up to a veritable Who’s Who of materials and aerospace sciences.

Piccard believes in walking before running – or flying. Solar Impulse 2 is the second-generation aircraft; the original Solar Impulse was developed to prove the whole concept was viable. With Borschberg at the controls of the single-seater, it made its first, crucial night flight on July 8, 2010. “We can finally prove what is possible without fuel,” declared Piccard after the landing.

Thereafter the founders advanced the project in increments. In 2011, they made a flight to Le Bourget airfield in Paris. A year later, they achieved the first intercontinental flight, from Switzerland to Morocco – crossing the Mediterranean in the process. And last year, they flew across the US, from California to JFK Airport in New York City, with several stops en route.

The science
Rather like his father and grandfather with their highly original projects, Piccard ran into outright disbelief at first. The concept of flying a solar-powered aircraft at night and over such vast distances was dismissed as impossible by a number of experts in the field. How could mere solar power support such a large wingspan? Was it possible to develop materials so light that Solar Impulse 2 could stay aloft for five days and nights? And, above all, how could solar energy be stored so it could be released at night? In short, the main issue was how to build – and fly – an aircraft that was so big and yet so light.

The real engine of Solar Impulse 2 is its solar cells. No fewer than 17,000 of them are built into the giant wing – one reason the wingspan is so big. They provide the ‘fuel’ for the four 10-horsepower electric motors. And, in a major breakthrough, by day the cells recharge the 400kg of lithium batteries that keep the aircraft in the air at night.
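For readers who want to check the arithmetic, the quoted figures convert into more familiar units with a few lines of Python. The battery energy density used here is an illustrative assumption (a typical value for modern lithium cells), not a published specification of the aircraft.

```python
# Back-of-the-envelope power budget for Solar Impulse 2, using only
# the figures quoted in this article. The battery energy density is
# an assumed, illustrative value, not a published specification.

HP_TO_W = 745.7  # watts per mechanical horsepower

motors = 4
power_per_motor_hp = 10
total_power_kw = motors * power_per_motor_hp * HP_TO_W / 1000
print(f"Total motor power: {total_power_kw:.1f} kW")  # ~29.8 kW, motorbike-scale

battery_mass_kg = 400
assumed_density_wh_per_kg = 250  # assumption: typical modern lithium cells
battery_kwh = battery_mass_kg * assumed_density_wh_per_kg / 1000
print(f"Battery capacity: {battery_kwh:.0f} kWh")  # 100 kWh

# Hours aloft on batteries alone at full power; night cruise draws far
# less than full power, which is how the charge lasts until sunrise.
print(f"Full-power endurance: {battery_kwh / total_power_kw:.1f} h")  # ~3.4 h
```

On these rough numbers, the batteries alone could sustain only a few hours at full throttle – night flight depends on cruising at a small fraction of full power, not on brute storage.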

Piccard certainly earns marks for thoroughness. Everything has been tested to death before being installed in the aircraft. The solar panels, for example, were put through their paces on what became the world’s loftiest solar power plant, right on top of the Jungfraujoch in Switzerland at a height of 3,466m above sea level – or roughly as high as the aircraft will fly.

Into The Sun

Solar Impulse 2 is not the only solar-powered aircraft. Among others, a German company called Solar Flight has produced the adaptable Sunseeker Duo, an aircraft that resembles a sailplane – or glider – but has enough solar panels on its 22-metre wings and tail to enable it to maintain a steady climb with two aboard.

An earlier version of the Sunseeker achieved the considerable feat of flying over the Swiss Alps. However, even the latest model can only stay aloft for a maximum of 12 hours.

This is an aircraft for enthusiasts. Once on the ground, the folding-wing Duo can be packed away into a small hangar, or even dismantled and packed into a special trailer.

Even before the aircraft takes off on its circumnavigation of the globe, the science is already trickling down into other applications. A team of researchers at Bayer, whose main responsibility is designing and building the entire one-man cockpit shell, had to develop a new kind of insulating material before the project could be considered viable. It’s a rigid polyurethane foam that makes the cockpit a bearable space for a human. With outside temperatures plummeting to -40°C at night, the inside temperature of the cockpit can fall to -20°C. But during the day the reverse can happen, with the pilot enduring +40°C. The foam both blocks out the cold and lets out the heat.

The human element of the project cannot be overstated. The cockpit cannot be pressurised or heated because that would rob the motors of precious battery power, so the pilot sits in an oven by day and a refrigerator by night. “We need to make sure the pilot is as sustainable as his aircraft,” explains Borschberg. “This is why the round-the-world flight will be as much a human as a technological feat.”

In the meantime, consumers are already reaping some of the benefits of Solar Impulse 2. “The same products fall into the automotive and refrigeration industries,” points out Bayer executive Richard Northcote. Similarly, ABB, which is focusing on clean energy, has applied what it has learnt while working on the aircraft to electric car technology and cleaner energy generation. ABB is already the world’s second-biggest supplier of solar inverters.

Brussels-based Solvay, a specialist in materials, has been involved from day one. Researchers in Belgium, Brazil, France, Germany, Italy and the US have been busy developing bespoke solutions since 2004. As a result, Solar Impulse 2 is practically festooned with Solvay products: some 6,000 in total, many of them entirely original. Among other applications, they waterproof the aircraft – including the precious solar panels and batteries – by means of an ultra-thin polymer film. And because the honeycomb-structured wings flex in the air currents, Solvay has invented an adhesive that moves with the solar cells without gaps opening. The Brussels firm has even applied its science to the pilot’s underwear: it’s made of a special polyamide yarn that, it is claimed, will “interact with the body, stimulating micro-circulation and helping muscle performance”. Cramped as the pilot will be, he’ll need all the circulation and muscle performance he can get. The Solvay products developed for the project are filtering down into a number of promising new markets, including protection of solar panels against the elements, batteries for computers and mobile phones, and baggage compartments on aircraft.

The cockpit cannot be pressurised or heated, so the pilot sits in an oven by day and a refrigerator by night

The outstanding scientific feat, though, is arguably the wings. Essentially, they’re made of paper – but paper of a rather special kind. Impregnated with polymer, it forms the honeycomb core around which the structure is built. As the wings twist, flex and vibrate, the paper maintains the integrity of the entire 72m span.

The aircraft is a flying laboratory. Even its insurance cover is a breakthrough: considered uninsurable because of the high risk and the priceless technology it incorporates, the first aircraft spent much of its development time on the ground without any form of protection. A fire or other disaster might have stopped the project dead. However, Swiss Re came to the rescue: in 2012, its corporate solutions division became the sole insurer of the prototype after a lot of head-scratching and creative risk assessment. And the paperwork arrived in the nick of time for Solar Impulse 2 – the document was presented at its unveiling ceremony in early 2014. “As of today the plane is officially insured – you’re clear for take-off,” announced Agostino Galvagni, Chief Executive of the corporate solutions division.

Success or failure
What are the odds of this feat of scientific and physical derring-do coming to a successful conclusion? A few short months out from the culmination of a decade of work, they look about even. The preliminary flights have been promising and, as far as outsiders know, Solar Impulse 2 has met its targets. However, it remains a flying experiment. Above all, nobody has yet sat at the controls for five days and nights, struggling to stay awake while piloting Solar Impulse 2 across continents and oceans.

Piccard himself seems sanguine about the prospects of success. And anybody who has seen the wind-borne Breitling Orbiter 3 – perhaps the most unlikely-looking form of round-the-world transport – would certainly give its sun-driven successor a better chance of making it. But even if Solar Impulse 2 breaks down and is grounded somewhere en route, it has already produced measurable benefits for mankind.

In bed with Big Brother: Apple and IBM form strategic alliance

When two companies decide to come together in a strategic alliance, they usually trumpet the deal as the exciting start of a happy union. In reality, such alliances are often the result of two desperate companies being flung together in a last-ditch effort to compete with their respective rivals.

Forming a partnership that benefits both parties is often seen as a cost-effective way of expanding a business into new areas without the financial burden of building a new division. However, while sharing expertise across two companies has obvious benefits, it also brings problems that can cause difficulties for both businesses.

The news that two of the world’s leading technology companies – Apple and IBM – would be entering into an exclusive strategic partnership caused shock among many within the industry. The deal, which will see the firms collaborate on developing a leading role in the enterprise software arena, raised eyebrows among those who remember the less than respectful relationship the two have had in the past.

Difficult history
Apple and IBM’s history goes back a long way, and has not exactly been peaceful. Fraught with tension, Apple’s relationship with IBM in the early days of the 1980s was often that of a brash young upstart doing battle with the established, more corporate rival. Such was the competition between the two companies that Apple founder Steve Jobs was photographed making a somewhat uncomplimentary gesture outside IBM’s offices, while his firm ran an advert during the 1984 Super Bowl that showed IBM as ‘Big Brother’ in an Orwellian future: the message was that Apple was the fresh and friendly alternative. Jobs was even more explicit in a keynote speech: “It is now 1984. It appears IBM wants it all. Apple is perceived to be the only hope to offer IBM a run for its money. Dealers initially welcoming IBM with open arms now fear an IBM-dominated and controlled future. They are increasingly turning back to Apple as the only force that can ensure their future freedom. IBM wants it all and is aiming its guns on its last obstacle to industry control: Apple. Will Big Blue dominate the entire computer industry? The entire information age? Was George Orwell right about 1984?”

The partners in an alliance are seen as a bunch of losers clinging to each other with the hope that there’s safety in numbers

The two companies took very different approaches to the computer market, with Apple’s closed-off system standing in stark contrast to the more open platform IBM offered. While Microsoft ultimately succeeded in overtaking both to become the dominant personal computer platform, Apple’s ‘walled-garden’ approach saw it fall far behind IBM.

IBM’s move towards enterprise computing and Apple’s focus on consumer products have meant the two operate in very different spheres nowadays. They have, however, tried to partner up before – most notably in the early 1990s, when they formed the AIM Alliance with Motorola. All three firms were struggling against the dominance of Microsoft, and the alliance was created to develop leading processors and software for both the consumer and enterprise markets. Unfortunately, most of their attempts to crack these markets had failed by the end of the decade, and the AIM Alliance fell apart.

Hands in each other’s pockets
Apple has struggled to break into the corporate market, with most offices around the world preferring Microsoft’s Windows platform to Apple’s OS X. In the personal computing market, Microsoft reportedly outsells Apple at a rate of 19 to one. In the mobile market, Apple’s iOS platform had to play catch-up with RIM’s BlackBerry smartphone, which was favoured by corporate clients. While BlackBerry’s market share has dropped at an alarming rate in recent years, many big companies still hold strong reservations about putting Apple’s services at the centre of their IT operations. Apple has long been seen as a consumer- and media-focused company, and its business services have been seen as lacking in comparison with those offered by its rivals.

The exclusive nature of the Apple-IBM deal gives the former a huge opportunity to tap into the latter’s large base of corporate customers and get them to switch to its iOS platform. Apple plans to launch a “new class” of more than 100 enterprise solutions as part of the deal, many of which may come in the form of native iOS applications. Another feature will be optimised IBM cloud services tailored specifically for iOS, allowing IT departments to manage devices, security and analytics more efficiently. Apple will also offer an exclusive form of its AppleCare service and support product to IBM customers.

Tim Cook, Apple’s CEO, said in a statement accompanying the deal that it represented a “radical step for enterprise”. He added: “iPhone and iPad are the best mobile devices in the world and have transformed the way people work, with over 98 percent of the Fortune 500 and over 92 percent of the Global 500 using iOS devices in their business today.” In particular, Apple wants to harness IBM’s strong analytics business for the benefit of corporate clients. Expressing an admiration for IBM that Jobs would scarcely have countenanced, Cook said: “For the first time ever we’re putting IBM’s renowned big data analytics at iOS users’ fingertips, which opens up a large market opportunity for Apple. This is a radical step for enterprise and something that only Apple and IBM can deliver.”

The woman at the top of IBM was also keen to trumpet the significance of the deal. Ginni Rometty, Big Blue’s Chairman, President and CEO, added in the same statement that the deal would transform the way people work: “Mobility – combined with the phenomena of data and cloud – is transforming business and our industry in historic ways, allowing people to re-imagine work, industries and professions.

“This alliance with Apple will build on our momentum in bringing these innovations to our clients globally, and leverages IBM’s leadership in analytics, cloud, software and services. We are delighted to be teaming with Apple, whose innovations have transformed our lives in ways we take for granted, but can’t imagine living without. Our alliance will bring the same kind of transformation to the way people work, industries operate and companies perform.”

Mutual benefits
Some in the industry sounded a note of caution over the deal. Jean-Louis Gassée was an executive at Apple during its early years in the 1980s, and knows a considerable amount about the two firms’ previously fractious relationship. He wrote: “Alliances generally don’t work because there’s no one really in charge, no one has the power to mete out reward and punishment, to say no, to change course. Often, the partners in an alliance are seen as a bunch of losers clinging to each other with the hope that there’s safety in numbers. It’s a crude but, unfortunately, not inaccurate caricature.”

Starbucks placed its coffee outlets inside bookstores, offering shoppers the chance to sit back and read while enjoying an overly milky coffee

However, Gassée believes the Apple and IBM deal will be different: “Another, more immediate, effect, across a wide range of enterprises, will be the corporate permission to use Apple devices. Recall the age-old mantra You Don’t Get Fired For Buying IBM, which later became DEC, then Microsoft, then Sun… and now Apple. Valley gossip has it that IBM issued an edict stating that Macs were to be supported internally within 30 days. Apparently, at some exec meetings, it’s MacBooks all around the conference room table – except for the lonely Excel jockey who needs to pivot tables.”

For all the failed strategic alliances, there have been a number that have worked and helped bolster the businesses of both partners. There are also a few firms that have successfully employed strategic alliances throughout their history to enhance their brands and widen their reach.

Starbucks, the seemingly ubiquitous global purveyor of weak coffee, saw a dramatic transformation in the early 1990s after two decades as a struggling chain restricted to the West Coast of the US. In 1993, it formed a partnership with book retailer Barnes & Noble, placing its coffee outlets inside the bookstores and offering shoppers the chance to sit back and read while enjoying an overly milky coffee. The success of the partnership led to people spending more time in Barnes & Noble, browsing for longer and buying books they’d had a chance to sit down and try over their coffee. It also gave Starbucks an image boost by associating the brand with a more sophisticated clientele. Starbucks has since replicated the model with many other big chain stores, including PC World in the UK. In 1996, it joined forces with PepsiCo to bottle and sell a Starbucks-branded coffee drink (the Frappuccino) in shops and supermarkets. It has also struck deals with airlines, retail stores and restaurant chains, including United Airlines, Kraft Foods and McDonald’s.

One of the longest standing partnerships between companies operating in different industries is the one between Disney and Hewlett-Packard. The firms have worked together since 1938, when Disney acquired equipment to help create the sound design for its film Fantasia. The two have worked together ever since, with HP currently supplying much of Disney’s IT network.

Even Apple has had a number of strategic alliances – with varying degrees of success. It has offered exclusive deals to telecoms giant AT&T for its iPhone (deals that probably favoured AT&T more than Apple in the end) and entered a partnership with leading e-discovery firm Clearwell Systems in 2010 to bring better analytical services to businesses and law firms through the iPad.

Failed friendships
However, there are plenty of examples of alliances that have failed. In late 2009, German auto giant Volkswagen (VW) announced it was acquiring a substantial stake in Japanese manufacturer Suzuki Motor Corporation. The deal saw VW take 19.9 percent of Suzuki for €1.7bn and sign an agreement to share technologies and global distribution networks. It seemed as though it would help both firms break into each other’s markets, with VW dominant in Europe but struggling to crack Asia. Part of the deal saw VW allow Suzuki to use much of its electric and hybrid vehicle technologies, while the Japanese firm offered its German partner its own technologies, as well as access to its lucrative hold on the Indian market.

Sadly, the partnership quickly unravelled in a storm of disagreements and cultural differences. By October 2011, Suzuki claimed VW had breached its contract, particularly in failing to hand over the hybrid technology. A month later, the two companies terminated their agreement to work together, and Suzuki demanded VW return its near-20 percent stake: something the German firm refused to do. The dispute eventually went to an international arbitration court, where it continues to rumble on. This summer, the London-based court announced it had finished hearing witnesses and would likely decide the case by the end of the year.

US tech giant Cisco Systems has consistently failed in efforts to forge partnerships with other firms. It has attempted in the past to work with both Motorola and Ericsson in an effort to enhance its operations. However, after Google acquired Motorola, the company ceased being a partner of Cisco and became a competitor. Similarly, Cisco’s 2004 alliance with Ericsson to modernise the wireline communications market collapsed after Ericsson made a number of acquisitions that made it a direct competitor with its strategic partner.

As Gassée wrote: “There was a time when strategic alliances were all the rage. In 1993, my friend Denise Caruso published the aptly titled ‘Alliance Fever’, a 14-page litany of more than 500 embraces. The list started at 3DO and ended with Zenith Electronics, neither of which still stands: 3DO went bankrupt in 2003, Zenith was absorbed by LG Electronics. These aren’t isolated bad endings.”

Whether Apple and IBM’s deal will go the same way is unknown. Certainly, Apple needs to do something to challenge the dominance of Microsoft in the corporate market, as well as to persuade IT departments it has the strength of service and security whose perceived absence has so far deterred them from fully signing up to its ecosystem. Likewise, IBM will hope that partnering with Apple will allow some of its former rival’s success to rub off, while also giving it access to Apple’s lucrative consumer market.

In order for it to be a success, there need to be clearly defined boundaries as to what the two firms are offering, as well as trust and full disclosure about the enterprise markets they are pursuing. Cook has maintained this will not be a problem, telling CNBC: “It’s landmark. It takes the best of Apple and the best of IBM and puts those together. There’s no overlap, there’s no competition. They’re totally complementary.”

Why are governments pretending global warming isn’t happening?

Governments will do all they can to tackle climate change – provided it doesn’t cripple the economy, cost jobs or hurt long-term growth. This is to be expected: no country would risk its financial stability to ensure CO2 emissions are slightly reduced and waters are kept cleaner. Yet few in the highest echelons of power will admit to moving backwards on environmental policy, and would rather people just forgot about global warming. Paradoxically, doing nothing to tackle climate change is the surest way of all to destroy jobs and impede long-term economic growth.

It’s unlikely governments will forsake the economy for the environment; the short-term fix is favoured over the complex – but crucial – long-term remedy. And the truth is people value their standard of living more than the environment. So governments can’t really be blamed for cooling efforts to boost renewable energy sources. But the economy versus the environment will continue to be a huge conflict of interest as 65 percent of annual CO2 growth is down to economic activity, according to the New Economics Foundation.

The truth is people value their standard of living more than the environment

The G20 Summit heads to Australia in November and eyes are fixed on the country. Many hope environmental issues will be prominent on the agenda, but the chances of that happening are about as high as those of Australian Prime Minister Tony Abbott choosing to become a monk… again. Abbott is an easy man to single out as an environmental tyrant, but he is only one of a host of leaders who are growing increasingly apathetic about climate change. Here, we highlight some of the most prominent governments that have gone cold on global warming.

Australia: environmental vandal
Air pollution is killing more Australians than road accidents, according to an investigation by The Sydney Morning Herald. Despite this, Australia’s right-wing government is clear on climate change: it will never be as important as economic growth. In July, we reported Tony Abbott was on a destructive mission to reverse Australia’s commitment to the environment with a string of deeply worrying policies. Since assuming office in 2013, Abbott and his Environment Minister, Greg Hunt, have been at the forefront of a worrying number of policy reversals. They have pushed hard for the delisting of a rare Tasmanian forest as a World Heritage Site, exempted Western Australia from laws protecting its endangered sharks and approved offshore infrastructure that will involve dredging 35 million tonnes of seabed.

The Galilee Basin is at the root of the problem. Covering some 96,000sq mi, it is one of the biggest thermal coal reserves in the world. Given its proximity to the Great Barrier Reef, environmentalists are concerned mining will lead to an extra 4,800 ships travelling across the highly sensitive area. Abbott’s unabashed move to ditch environmental responsibility in favour of big business was described by one disgruntled citizen as “environmental vandalism”.

Canada: global warming censor
Meteorologists have been banned from discussing climate change in public by the Canadian Government. The shocking move to gag weather experts highlights Canada’s wavering commitment to environmental issues. In June, the government joined forces with Australia by publicly forsaking climate change in favour of financial growth. Prime Minister Stephen Harper said environmental sustainability was “not the only or even the most important” problem the world faced, speaking at a joint conference with Tony Abbott in Ottawa. The pair played down the possibility of coordinated global action on climate change, despite President Obama’s ambitious package for united action on curbing emissions. Harper and Abbott said they felt no extra pressure to make a concerted effort to combat climate change.

Canada’s CO2 emissions stood at 14.68 metric tonnes per capita in 2010 and are predicted to soar 38 percent by 2030. The country had agreed under the Kyoto Protocol – an international treaty that sets binding obligations on industrialised countries to reduce emissions – to cut emissions to five percent below 1990 levels by 2012, but the government pulled out in 2011. It has continued to exploit its oil sands, even though extracting a barrel of oil from them generates three times the greenhouse gas emissions of a barrel of conventional crude. Stephen Devlin, an economist at the New Economics Foundation, says: “With increasingly ambitious domestic policies coming from China and the US… Stephen Harper is at a real risk of marginalising himself in the international community.”

UK: crazy Pickles
Communities Secretary Eric Pickles was described as “crazy” for his apparent opposition to onshore wind farms by Tom Burke, Chairman of the sustainable energy advocacy group E3G. The former government advisor criticised Pickles for his interference in environmental issues and claimed his stance on renewable energy was “scaring away investors”. Since Pickles has been in office, he has authorised just two of 12 onshore wind farm applications. But the problem is more deep-rooted than one dogmatic politician who opposes renewable energy. Burke says environmental policy is “ideologically against the core values of the right”.

In January, the UK Government came under fire for discreetly pushing through changes to fracking regulation so companies no longer have to inform nearby residents of plans to drill in their area. Prime Minister David Cameron is also alleged to have told junior ministers to “cut the green crap” from energy bills, according to The Sun newspaper. The UK Government has tried to look pro-green by hugging huskies and announcing plans to build a wind turbine at the PM’s home in Oxfordshire (which mysteriously never happened); behind closed doors, however, a tacit agenda against climate action may be in the making. To be truly active, the UK has to set the pace on climate change in Europe. “If Britain led the pack in aggressive measures to reduce CO2 emissions by 40 percent, Europe could cut its gas imports from Russia by 80 percent,” says Burke, citing E3G research. Despite calls for the government to do more on environmental policy, the coalition is still on target to halve emissions by 2025.

Germany: coal hearted
Coal was once a doomed source of electricity in Europe, but it has been in the ascendancy since Germany decided to shutter its 17 nuclear power stations. The closures, due to be completed by 2022, come in the wake of the Fukushima disaster and as consumers bemoan rising electricity costs in Germany (three times higher than US prices). It’s a massive step backwards for Chancellor Angela Merkel, who has helped Germany become one of the top renewable energy producers in the world.

Germany has around 32,400MW of installed renewable generating capacity, and renewables now meet nearly 30 percent of the country’s electricity needs. This commitment to renewable energy is keeping electricity prices high and has pushed Germany to reopen its coal plants – much to the dissatisfaction of environmentalists. Coal-generated electricity is at its highest level in Germany since 2007. And this is not any old coal; it’s lignite, the dirtiest form of coal. Germany is seeking to rebalance green ambition with the reality of sustaining a healthy economy, but Burke believes the two can coexist: “Climate change policy will enhance economic growth – it will stimulate investment in infrastructure and generate value whilst improving energy productivity.”

India: Greenpeace antagonist
India is haemorrhaging $80bn a year by failing to turn around environmental degradation, according to a World Bank report. Growth has brought opportunity and optimism to India – as well as air pollution and water contamination, a “big threat to the livelihood of millions of Indians”, according to Greenpeace India’s Nandikesh Sivalingam. Relatively little has been done to tackle climate change: CO2 emissions rose 7.7 percent last year, and India appears to be on course to exacerbate the issue.

In August, the Indian Government announced plans to tighten control over the funding of Greenpeace India, sparking fears of a wider crackdown on environmental groups. The Ministry of Home Affairs told India’s central bank that any transfer of funds from Greenpeace International or the ClimateWorks Foundation to Greenpeace India would need special approval. The ruling, coincidentally, came several weeks after India’s Intelligence Bureau claimed foreign-funded NGOs were “stalling development projects” and hurting growth – stalling that pro-business Prime Minister Narendra Modi believes sets India’s economy back three percent annually. Greenpeace India described the move as an attempt to “crush and stifle opposing voices in the civil society”. Sivalingam added the NGO continues to face challenges over the “destruction of biodiversity” caused by forest mining and the “unsafe” use of nuclear energy.

How Star Wars changed the special effects industry

A long time ago, in a galaxy far, far away, a young filmmaker called George Lucas changed the way film was made. The year was 1977; the faraway galaxy in question was an inhospitable, barren land known as Hollywood. The young Lucas had a very clear vision and aesthetic in mind when he wrote his space-fantasy magnum opus, but the technology available was still woefully inadequate to produce such challenging visuals. Lucas had created – in his storyboards at least – a rich environment filled with weird and wonderful creatures, space battles and hunk-of-junk spaceships leaping through hyperspace. But at a time when computer-generated imagery (CGI) was still in its infancy (having debuted in Richard T Heffron’s Futureworld only a year earlier), it quickly became clear Lucas was going to have to build his universe by hand.

In our universe, almost four decades later, that young Jedi’s creation remains a big deal. Without going into the merits of plot, character and storylines – a debate almost as controversial as arguments over whether Han Solo or Greedo shot first – the way Star Wars was made set a high bar for the future of special effects. The original trilogy (Episodes IV, V and VI) produced between 1977 and 1983 set the gold standard for visual effects. John Dykstra, the special effects designer who worked on the original Star Wars and was chosen by Lucas to head up the director’s visual effects company, Industrial Light & Magic, has described sitting with a group of friends and building models and robots from scratch in order to achieve the realism Lucas was after.

“Back in the days of Star Wars, we kind of walked into an empty warehouse and sat on the floor and went ‘How are we going to do this?’” he told Den of Geek. “The producers, Gary Kurtz of Fox and George Lucas, took an incredible risk by listening to what myself and my collaborators had to say with regard to how to do this, because we were inventing this stuff from scratch. We have a basic toolkit now, in the form of a computer, and so much of it now is programming.”

UK tax rebates

25%

For films spending £20m or less

20%

For films spending over £20m

71%

Growth of UK film production due to tax incentives

1,050

Films that claimed rebates between January 2007 and May 2013

£1.1bn

Total tax relief claimed in that time

£22m

Rebate claimed for Thor: The Dark World (believed to be the largest yet)

Dykstra and Lucas used extremely detailed miniatures, animation and a pioneering system of computer-controlled motion photography to create special effects (SFX) that still look fresh today. Star Wars was the first big-budget blockbuster to rely on realistic action scenes and explosions, and essentially invented the techniques to achieve this. But a mere two decades after Dykstra destroyed the Death Star in A New Hope using nothing but a cardboard box and titanium shavings, Lucas turned his back on animatronics and practical effects in favour of expanding his Star Wars universe digitally in the prequel Episodes I, II, and III, released in the late 1990s and early 2000s.

I’ve got a bad feeling about this
For all the grace and balance of Lucas’s first trilogy, in which he and Dykstra expertly juxtaposed pioneering optical effects and animatronic models, his last three Star Wars films incited the ire of fans – despite being just as groundbreaking. Star Wars Episode I: The Phantom Menace, released in 1999, featured some of the same elements as A New Hope had 22 years before: the planet Tatooine, and the famous gliding landspeeder driven by Luke Skywalker in the original. The difference was that, in the more recent films, the effects were created digitally and added in post-production by a talented team of VFX compositors. Across those two trilogies, Lucas revolutionised the industry twice with his approach to visual effects: first when he virtually invented the tools he needed to make his universe come to life, and again when he pushed digital effects further than ever before by building fully rendered characters, cities and spaceships out of computer animation. Now, as Disney prepares to launch the first Star Wars film free from Lucas’s control, the series is set to continue innovating as it turns away from flashy VFX and opts for a more analogue approach to filmmaking, true to its roots.

When Industrial Light & Magic pioneered the process of building whole environments digitally and then inserting them into the original shots, it opened up a range of opportunities for art directors and filmmakers. Suddenly they were no longer constrained by the impracticalities of having to build every aspect of a scene from scratch, and could bend or break the laws of physics in order to make shots more powerful. But there was a weightlessness to the images created, which cinemagoers found hard to connect with. Jar Jar Binks – one of the key characters in Lucas’s prequel trilogy – is the clearest example of this: the character was both annoying and at odds with his environment, and was widely ridiculed by audiences. Just a few years later, however, The Lord of the Rings films would introduce audiences to Gollum, who would become one of the most popular characters of the franchise – despite being created almost entirely through CGI.

Visually, the landscapes, cities and characters of the Star Wars prequels were dazzling, but there were issues with how the actors contrasted with the CGI environment, detracting from the films’ stunning visual fabric. Inadvertently, as Lucas pioneered a new way of making films – and a set of techniques that have become increasingly popular over the last decade and a half – he was both creating a new niche in the visual effects market and making audiences acutely aware of the limitations of the medium.

“When they introduced the Jar Jar Binks character, they were confident they could completely satisfy the audience with a full CGI character, whereas we have since learned that was not the case,” explains Owen Jackson, a London-based VFX compositor. “Directors today are certain they can tell a story with a full CG character: take Avatar for instance, which was carried by a series of CGI characters. Though that was only in 2009, and it was a tremendous success, audiences of today would already note the difference in the quality of the CGI available in 2014.”

A bright centre to the universe
In the years since the Star Wars prequel trilogy was launched, CGI technology has come on in leaps and bounds, with London emerging as the capital of the visual effects (VFX) sector. It is a game-changing niche as far as making films is concerned: VFX, even more than conventional special effects and animatronics did in the 1970s, has opened up new worlds and galaxies to be explored in film. It has also helped cement science fiction as a credible genre, rather than a market for teenagers. Take 2014’s Oscar- and BAFTA-winning picture Gravity, which is primarily set in orbit: it is utterly realistic and has, perhaps more than any film since Avatar, helped dispel audiences’ fear of CGI-heavy pictures.

VFX has become a rich industry, innovative both in terms of artistic creativity and cutting-edge technology. In the mid-2000s, it was JK Rowling’s contractual agreement with Warner Bros that the Harry Potter movie franchise be primarily produced in the UK with a British cast and crew that first boosted the sector. This surge of activity by industry-leading companies led the UK Government to introduce a slew of tax credits to incentivise the sector’s growth and compete on a par with other leading industry capitals. According to a recent study by Oxford Economics, film production in the UK would be up to 71 percent smaller were it not for the government’s tax incentives.
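Taken at face value, the rebate figures quoted earlier amount to a simple two-rate schedule. The sketch below illustrates only those quoted rates; the real relief has finer qualifying rules than a single spending threshold, so treat this as an illustration rather than tax guidance.

```python
# A minimal sketch of the UK film tax rebate as the figures quoted in
# this article describe it: 25% for films spending £20m or less, 20%
# for films spending more. The real relief has finer qualifying rules;
# this illustrates only the quoted rates.

def uk_film_rebate(qualifying_spend_gbp: float) -> float:
    """Rebate on a film's qualifying UK expenditure."""
    rate = 0.25 if qualifying_spend_gbp <= 20_000_000 else 0.20
    return qualifying_spend_gbp * rate

print(f"£{uk_film_rebate(15_000_000) / 1e6:.2f}m")   # £3.75m on a £15m film
print(f"£{uk_film_rebate(110_000_000) / 1e6:.0f}m")  # £22m on a £110m film
```

At the 20 percent rate, the reported £22m claim for Thor: The Dark World is consistent with roughly £110m of qualifying UK spend.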

“London and the UK is experiencing an extraordinary amount of film, TV and animation activity as a result of tax credits,” Adrian Wootton, Chief Executive of Film London and the BFI told the Financial Times. “There’s been a huge amount of R&D and expansion. We now have the biggest concentration of VFX in the world, with six of the eight top companies in this sector in London.”

The planet Tatooine appeared in multiple episodes. CGI enhanced the physical sets in later entries

Do or do not
As a result of this unexpected shift to Europe, the production of blockbusters and high-budget showstoppers has been fragmented. Where once a film would have been commissioned, shot, cut and launched from a base in Hollywood, today tax incentives and the emergence of tech hubs such as London mean the process of making a film is no longer centralised. This means films can be cheaper to make, but creatively they are much more complicated products. The advent of modern CGI technology means changes can be made to the plot and art direction of a film up to the last seconds of post-production: a process that would have been extremely costly and time-consuming before digital images invaded the shots.

“Directors have gotten used to the ability to continue to make changes throughout production rather than going in with a clear direction,” explains Jackson. This has led to a slew of CGI-heavy but incoherent features that have cost production companies billions of dollars. Take Disney’s 2012 flop John Carter: an epic reworking of a 1917 adventure novel set on Mars, the film left Disney $200m out of pocket.

The film mostly consisted of epic battles with computer-generated beasts, but lacked a discernible plot or emotional focus for the audience to connect with. A CGI environment that is too extensive can leave audiences feeling alienated from the story and the characters, with nothing recognisable to identify with. It is no coincidence Avatar – the highest-grossing film of all time – expertly blends live-action sequences with those created digitally.

Given the damage done to the reputation of CGI by the likes of John Carter and Jar Jar Binks, it wasn’t coincidental that JJ Abrams, the man tasked by Disney to helm the upcoming Star Wars Episode VII, gleefully allowed images of a full-scale Millennium Falcon model and a fleet of X-wing fighter craft to circulate online. Abrams is drawing a line under Jar Jar and embracing the old school sets, props and animatronics that made the original trilogy so popular with fans and critics.

By combining analogue techniques with CGI effects, directors such as Abrams are looking to make up for some of the inevitable shortcomings of both technologies. However, some in the industry believe that, despite Abrams’ high-profile return to analogue, CGI will inevitably take over for good.

“When you watched Transformers, did it feel like the robot characters were real and did it look like they were alive in the same world as [the film’s star] Shia LaBeouf?” wrote Alex Billington, a film critic and industry commentator on firstshowing.net. “When you watched The Dark Knight, did you ever notice one second of CGI, even in Harvey Dent’s burned face? If Hellboy II is out of the question in the argument for practical effects, the next great example is Iron Man, where the late Stan Winston’s armour was completely hand-built and used while shooting. However, could you tell when his armour changed from practical to CGI? I know I never knew the difference and that’s a testament more to the CGI artists than anyone else. That’s not to say that Winston’s work wasn’t amazing, as it certainly was what gave the CG artists a design to work off of, but it shows that the use of CGI is not a bad thing by any means.”

A technician shows some of the design work behind key scenes from the hugely successful film Gravity in the London offices of a leading visual effects company

But there is still a long way to go for the industry, especially when it comes to human or humanoid CGI characters. “Creating a fully digital CGI human face is the holy grail of visual effects, and we are nowhere near that yet,” says Jackson. “That is why studios like Pixar have very stylised human faces, because they cannot achieve a realistic enough final product. The realism that you naturally get from an animatronics face is more believable because it’s an analogue photographic technique that is easier for audiences to believe. People are so used to looking at human faces that they can pick up on even the most minute discrepancies and nuances of the animation: if anything is off, audiences will spot it even if they cannot put their finger on why it looks wrong.”

It is not a coincidence then that the most successful CGI characters – such as Gollum in The Lord of the Rings trilogy and the Na’vi people in Avatar – were created by blending real actors with many layers of CGI images, rather than by animating a character from scratch. Andy Serkis, who played Gollum, has helped pioneer these types of performance capture roles, which have allowed CGI characters to become more realistic.

The fact of the matter is directors no longer have to choose between traditional special effects and animatronic models on the one hand and going fully digital – filming sequences against a green screen and building the shots later – on the other, although many still do. The most successful productions combine the two disciplines. This world is not one of light and dark; like the mystical Force in Lucas’s sci-fi epic, it is in need of balance.

Major labour reforms ahead after McDonald’s loses wage fight

Typically, one of the four million fast food workers in the US will receive between $7.50 and $9 an hour. When Nancy, a long-term, full-time McDonald’s employee earning $8.25 an hour – which, with her two children, put her below the US poverty line – called the company’s McResources helpline in 2013, the operator suggested she visit food banks. “Are you on SNAP?” her corporate counsellor asked. “It’s Supplemental Nutrition Assistance Programme, or food stamps.” With her two children, Nancy would “most likely be eligible for SNAP. It’s a federal programme”. It turned out the company’s official policy was to direct employees struggling to make ends meet to federal assistance programmes.
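The arithmetic behind Nancy’s situation is stark. A minimal sketch, assuming ‘full-time’ means 40 hours a week worked year-round, and taking the 2013 US federal poverty guideline for a three-person household ($19,530) as the benchmark:

```python
# Rough check of the claim that $8.25 an hour left Nancy and her two
# children below the US poverty line. Assumes full-time = 40 hours a
# week, 52 weeks a year; $19,530 is the 2013 HHS poverty guideline
# for a three-person household.

hourly_wage = 8.25
annual_income = hourly_wage * 40 * 52
poverty_line_3_person = 19_530

print(f"Annual income: ${annual_income:,.0f}")   # $17,160
print(f"Poverty line:  ${poverty_line_3_person:,}")
print(f"Shortfall:     ${poverty_line_3_person - annual_income:,.0f}")  # $2,370
```

Even with no unpaid weeks, a full-time year at $8.25 an hour leaves a family of three more than $2,000 short of the poverty line.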

With a single line in an already explosive ruling, the General Counsel of the National Labour Relations Board (NLRB) potentially changed the archaic nature of US labour relations – and the lives of workers such as Nancy – irrevocably. The US, for all its talk of freedom and modernity, has a shockingly unfair and outdated set of labour laws – or rather, a lack of appropriate legislation in the area. So when workers employed by McDonald’s franchises in low-skill, low-wage jobs decided to hold the multinational behemoth accountable for their woefully low pay, it seemed unlikely they would prevail. But, in a historic decision, they did, ushering in a new era for workers’ rights in the US.

The key phrase is ‘joint employer’: a term that applies to corporate entities and potentially links them to franchisees. If the decision is ratified, it would mean McDonald’s being held jointly responsible for matters of hiring, firing, wage disputes and benefits – potentially challenging its well-established franchise model. Today, up to 90 percent of the US’s 14,000 McDonald’s restaurants are franchises.

The company’s official policy was to direct employees struggling to make ends meet to federal assistance programmes

Though it primarily concerns franchise businesses, the decision is significant because labour rulings against large, extensively represented corporations such as McDonald’s are rare, and this one is nothing short of incendiary. The case was brought by a group of workers who believed McDonald’s should be held jointly responsible for the pay and well-being of employees hired by franchise holders. Unions in the US have long argued big corporations should be at the table when it comes to collective negotiations on pay. McDonald’s and co. have hidden behind the caveat that it is franchise holders who hire, and are therefore ultimately responsible for labour disputes and issues. Essentially, McDonald’s argued staff at its restaurants across the US were subcontracted by franchise holders.

Jointly responsible
Business groups, however, have been vocal in their opposition to the ruling. They are not wrong to fear it: franchisors would suddenly be liable in thousands of legal cases alleging overtime, wage and union-organising violations, brought by their employees and subcontractors. Richard F Griffin Jr, the labour board’s General Counsel, was nonetheless adamant there was merit in 43 of the 181 claims brought before the NLRB, which accused McDonald’s of illegally firing, threatening and penalising workers for getting involved with labour unions and other pro-labour activities. McDonald’s is likely to contest the decision.

The dispute between employees and fast food chains has been heated for some time, and the ruling by the NLRB added fuel to the fire. At the beginning of September, workers from McDonald’s, Burger King and other franchises in California, Missouri, Wisconsin and New York walked out in protest over low wages. The industrial action, coordinated by local union groups, coalitions and the pressure group Fast Food Forward, has been the biggest yet, and the ruling by the NLRB, though favourable for workers, has done little to assuage their anger. The campaign as a whole has been funded and backed by the Service Employees International Union.

“Employers like McDonald’s seek to avoid recognising the rights of their employees by claiming that they are not really their employer, despite exercising control over crucial aspects of the employment relationship,” Julius Getman, a labour law professor at the University of Texas, told The New York Times. “McDonald’s should no longer be able to hide behind its franchisees.”

The ruling referred specifically to 43 cases filed since November 2012, though 64 cases remain pending. It is now up to the plaintiffs and the defendants to settle. “The National Labour Relations Board Office of the General Counsel has investigated charges alleging McDonald’s franchisees and their franchisor, McDonald’s USA, violated the rights of employees as a result of activities surrounding employee protests,” said the NLRB statement that followed the ruling.

Liability and responsibility
McDonald’s case has always appeared rather flimsy. The very nature of a franchise is that an individual or small company buys the right, or licence, to market a larger company’s products or services in a specific territory. The essence of McDonald’s business is to license nearly identical restaurants selling nearly identical burgers all over the world. To suggest that pay is the only aspect of that business not centralised and rigorously monitored by McDonald’s is patently ludicrous. For a McDonald’s franchisee, everything from the uniforms and the training of workers to the way frontline staff greet diners is meticulously set out by head office. Everything apart from their pay, that is, if McDonald’s had its way.

14,000

McDonald’s outlets in the US

90%

of which are franchises

“[The] decision of the NLRB is a vital step in reminding companies that they can’t hide behind labels or a franchise system to avoid complying with our labour laws,” wrote Sarah Belton, a lawyer with Public Justice, in The New York Times. “Our country is in the middle of a critical conversation about poverty, income inequality and the rights of workers to a living wage. Integral to this dialogue is today’s focus on the bottom line that often stresses cutting costs, like labour, in the hopes of raising profits.”

If Belton’s words appear far-fetched or overly dramatic, they are not. Fast Food Forward is campaigning for employees’ minimum wage to be raised to $15 an hour and for workers to be allowed to join a union without fear of reprisals from employers. The wage increase would raise workers’ incomes above the poverty line but keep them below the national average – hardly a grandiose demand from the workforce. According to the pressure group, McDonald’s own employee resources website suggested it would take roughly $12.36 an hour net to make ends meet. Fast Food Forward, however, claims the McDonald’s figure doesn’t factor in food, water and bills, and suggests workers would actually need two jobs to survive on that wage.

Even though McDonald’s seems to have been aware of the tragically low wages paid to its frontline workers, it has specifically chosen to hide behind the excuse that they are subcontracted by franchisees. “McDonald’s can try to hide behind its franchisees, but today’s determination by the NLRB shows there’s no two ways about it: the Golden Arches is an employer, plain and simple,” said Micah Wissinger, a lawyer who filed complaints on behalf of several McDonald’s employees in New York. “The reality is that McDonald’s requires franchisees to adhere to such regimented rules and regulations that there’s no doubt who’s really in charge.”

“Conceptually, this is a big step,” Catherine Fisk, a labour and workplace expert at UC Irvine law school, told the Los Angeles Times of the NLRB counsel’s ruling. “It shows the general counsel is trying to adapt to the evolving nature of the restaurant business.” Though Fisk is sceptical about whether the ruling will lead to any change in terms of the unionisation of fast food workers, she pointed out that, more importantly, it ensured that, if workers did organise, “they can at least bargain with someone who can do something” – i.e. the branded company in question.

“Allowing companies to game the system to prioritise profits comes with a cost,” wrote Belton. “American taxpayers spend over a billion dollars a year on public assistance to McDonald’s workers.”

Bad for small business
The NLRB decision is yet to be ratified but, if it is, McDonald’s and other franchisors will be considered liable, as joint employers, for matters of wages and benefits. If this standard changes, the very nature of franchising in the US could be forced to change too, potentially jeopardising smaller businesses. If McDonald’s were forced to the negotiating table with unions over labour matters, it would likely pass those costs on to franchisees – often much smaller companies with proportionately smaller profit margins. And the ruling is likely to affect franchises across a range of sectors beyond fast food.

Angelo Amador, Vice President for Labour and Work Force Policy for the National Restaurant Association, told The New York Times the decision was another example of the current administration’s quest to undermine small businesses. He said the ruling “overturns 30 years of established law regarding the franchise model in the United States, erodes the proven franchisor-franchisee relationship, and jeopardises the success of 90 percent of America’s restaurants who are independent operators or franchisees”. David French, Senior Vice President with the National Retail Federation, agrees, saying the decision confirmed the labour board was “just a government agency that serves as an adjunct for organised labour, which has fought for this decision for a number of years as a means to more easily unionise entire companies and industries”.

The reality is that McDonald’s requires franchisees to adhere to such regimented rules and regulations that there’s no doubt who’s really in charge

Another risk, detractors of the NLRB ruling suggest, is that, by being held accountable for labour and welfare, branded companies will back out of the traditional franchise model. “It’s a catastrophic event from a legal perspective,” Robert Cresanti, Executive Vice President of Government Relations and Public Policy at the Washington-based International Franchise Association, told Business Insurance. “It shatters fundamental principles of privity and control, and I think we’re going to have long-term repercussions from it.” There is a real danger, it seems, that this ruling will establish a precedent that leaves franchisors exposed to a number of liability issues that go beyond labour.

“This decision goes against decades of established law regarding the franchise model,” wrote Heather Smedstad, McDonald’s USA’s Senior Vice President of Human Resources, in a statement published on the company’s website. “McDonald’s does not direct or co-determine the hiring, termination, wages, hours, or any other essential terms and conditions of employment of our franchisees’ employees – which are the well-established criteria governing the definition of a ‘joint employer’.” She added McDonald’s, “as well as every other company involved in franchising, relies on these existing rules to run successful businesses as part of a system that every day creates significant employment, entrepreneurial and economic opportunities across the country”.

However, despite the cries of jubilation from labour unions and workers’ groups – and grunts of dissatisfaction from businesses – the NLRB ruling is still a small step in what will undoubtedly be a long road to more egalitarian workers’ rights in the US. Regardless of the uproar in the business sector, the unionisation of workers is not necessarily such a scary thing. Better-paid, happier workers will have more time and money to spend consuming products they might not be able to afford if they continue to be paid unfairly low wages. The workforce should be treated with the respect and dignity that human beings deserve, because they are more than just the frontline workers at your local fast food joint: they are also its biggest customers.

Australia digs while the world burns

Australia is home to one of the most diverse ecosystems in the world: from kangaroos to giant mosquitos, from the Great Barrier Reef to the world’s largest mangrove. It is certainly one of the most exuberant countries in the world when it comes to wildlife. According to the Wilderness Society, there are only 17 countries classed as ‘megadiverse’ – that is, countries with “extraordinarily high levels of biodiversity”. Australia is one of only two developed nations that fall into the category. Collectively, these nations hold around two thirds of the world’s biodiversity. Australia has more species endemic to its shores than 98 percent of the world’s countries. Most people agree it is a veritable treasure trove of plant and animal life, of incalculable value, and that this biodiversity should be preserved at any cost. Prime Minister Tony Abbott, however, is not most people.

Abbott has been carefully cultivating a dangerous record when it comes to environmental policies. Since assuming office in 2013, he and Environment Minister Greg Hunt have campaigned for the delisting of rare Tasmanian forests as a World Heritage site, approved the exemption of Western Australia from federal laws protecting endangered white sharks, and approved construction work on ports and coal terminals that will dredge the sea bed and potentially harm the Great Barrier Reef. These have not been popular moves, but Abbott has ploughed ahead, undeterred.

The Abbott government has consistently made the case that environmental concerns will always lose when pitted against economic benefits

“This crowd have been in office for less than six months. Imagine the damage they can do in three years,” wrote a disgruntled reader in The Sydney Morning Herald. “We, the people, do not want your bloody coal terminal. You, the decision makers, should be forced to pay for your avarice and environmental vandalism,” wrote another.

A recent report by the Intergovernmental Panel on Climate Change (IPCC) has given some credence to the concerns expressed by the zealous readers of the Herald. In its Climate Change 2014 report, the organisation is clear about the need to lower global carbon emissions, and cites carbon pricing as a positive step towards achieving that goal. This stands in direct contradiction to a recent policy reversal by the Australian government, which scrapped the country’s cap-and-trade emissions trading scheme entirely.

Curbing emissions
The IPCC report is primarily concerned with the mitigation of the effects carbon emissions are already having on the global climate. Research by over 1,250 specialists concluded that, to limit average temperature rises to only two degrees above pre-industrial levels, there would need to be a drastic increase in the use of low-carbon energy by 2050. The IPCC report concludes the required investment to power the shift towards lower-carbon energy consumption would shave only 0.06 percent off expected annual economic growth rates. In the meantime, carbon pricing can be an efficient and viable solution to keeping emissions down.
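
A quick compounding calculation puts that 0.06 percent figure in context. The sketch below is illustrative only: the 2.5 percent baseline growth rate is an assumption made for the example, not a figure from the report.

```python
# Rough illustration of how a 0.06-percentage-point drag on annual growth
# compounds by mid-century. The 2.5% baseline growth rate is assumed.
baseline = 0.025  # assumed annual economic growth rate
drag = 0.0006     # 0.06 percentage points, per the IPCC estimate
years = 35        # roughly the horizon to 2050

business_as_usual = (1 + baseline) ** years
low_carbon = (1 + baseline - drag) ** years
shortfall = 1 - low_carbon / business_as_usual

print(f"Growth multiple, business as usual: {business_as_usual:.2f}x")
print(f"Growth multiple, low-carbon path:   {low_carbon:.2f}x")
print(f"Relative shortfall after {years} years: {shortfall:.1%}")  # ~2%
```

Even compounded over 35 years, the low-carbon path leaves the economy only around two percent smaller than it would otherwise be – which is why the report treats the cost of the transition as modest.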

“Carbon pricing was a positive step in the right direction and if Australia did link it to the European scheme, the price wouldn’t be high. Scrapping it is seen as a backwards step,” Professor David Stern, one of the lead authors of the IPCC report and lecturer at the Australian National University told The Guardian. “It’s a cost-effective incentive for everyone to look for their best option to lower emissions. Most of the mentions of Australia in the IPCC report are around its emissions trading scheme. The report makes clear that carbon pricing is a cost-effective model, if done correctly.”

The carbon pricing policy was launched in Australia under the previous Labor government, but Abbott’s Liberal-led coalition has long taken issue with it. Dubbed ‘the carbon tax’ by its opponents, the policy is now being replaced altogether.

Hunt recently announced an extension of the pre-existing Emissions Reduction Fund, which includes cash rewards for private companies that lower emissions, as well as a tree-planting initiative.

“The carbon tax was a AUD7.6bn hit on the Australian economy in its first year of operation, yet there was no meaningful reduction in emissions. The carbon tax truly is a policy failure,” said Hunt in a statement. “The Australian Government is a step closer to replacing the destructive carbon tax with an effective, practical and simple approach to reduce emissions. The Emissions Reduction Fund is the centrepiece of the Coalition Government’s Direct Action Plan to reduce Australia’s greenhouse gas emissions by five percent below 2000 levels by 2020.”

However, critics such as Stern have suggested reward schemes like the one currently proposed by Hunt and the Australian government actually cost more in the long run, as taxes need to be raised to pay for the subsidies, whereas carbon trading schemes generate revenue. “The carbon price has long-term benefits for our nation and Australians can see that,” WWF-Australia Climate Change National Manager Kellie Caught told The Australian. “Not only does the carbon price raise revenue for the budget, it tackles climate change by making polluters pay, drives investment in renewable energy, and helps save the Great Barrier Reef.”

Coal heart
Australia remains a major coal producer, however, and up to 85 percent of the country’s energy comes from coal plants. In the latest budget, the Abbott government granted the mining industry handouts worth over AUD100m to keep exploring for new sources of coal and iron ore. Plans are afoot for the opening of nine mines in the near future, which could yield 27 billion tonnes of coal. Covering some 96,000sq mi, the Galilee Basin is among the biggest coal deposits in the world. But the University of Technology, Sydney, has warned that, if all the plans are approved and the mines are excavated, the extra yearly emissions generated will equate to putting an additional 7.59 million cars on the road within the next decade and a half.

The Galilee Basin mines are being developed by Gina Rinehart, the wealthiest woman in the world, along with the Adani Group of India. Three large mines are already being built, the first of which will be about 15 miles wide when finished. The Queensland Government recently granted approval for the AUD16.5bn project, despite widespread condemnation from environmental experts. There are concerns about the effect the mines will have on groundwater, given their proximity to the underlying Great Artesian Basin. And, though the mines themselves are problematic from an environmental perspective, the impact of the mined coal will be almost immeasurable.

Most of the coal mined in Galilee will be exported, meaning an additional 4,800 ships will travel through the Great Barrier Reef each year. Specialised ports on the Queensland coast are being expanded to service the Galilee Basin mines. Up to 150 million tonnes of material will be dredged from around the reef, posing an “unprecedented” threat to the corals, according to a report by the Australian Marine Conservation Society. “The Great Barrier Reef is expected to degrade under all climate change scenarios, reducing its attractiveness,” the report states. “Evidence of the ability of corals to adapt to rising temperatures and acidification is limited and appears insufficient to offset the detrimental effects of warming and acidification.”

But mining remains a powerful and lucrative industry in Australia, and, as growth forecasts continue to be revised down, it is unlikely the government will be willing to antagonise it. Abbott’s coalition has set a target of reducing carbon emissions to five percent below 2000 levels – a figure roughly in line with other similarly developed countries. But, according to Australia’s independent Climate Change Authority, the target should be closer to 15 percent below 2000 levels in order to be “credible”.

It is unlikely the government will review its targets. The Abbott government has consistently made the case that environmental concerns will always lose when pitted against economic benefits: a policy that might be popular with industry chiefs, but which is unlikely to be sustainable in the long term. Numerous steps have been taken to speed up the dismantling of environmental safeguards; in the new budget, Abbott has confirmed the long-rumoured scrapping of the Australian Renewable Energy Agency (ARENA), despite the agency still having about AUD1bn in unallocated funds. When it was founded less than two years ago, ARENA was given an AUD2.5bn budget to foster the development of renewable energy in the country, support research and share expertise. “Axing the Australian Renewable Energy Agency is a damning reflection on the backward thinking of the Abbott government. The Abbott government has no vision for the future, or indeed, the present day,” the leader of the Australian Greens, Christine Milne, told the Environment News Service.

Against the grain
ARENA is not, by a long shot, the only environmental initiative on the chopping block. The Clean Energy Finance Corporation, launched last year and responsible for AUD10bn in investment in clean energy companies, is also being discarded. The coalition is also reviewing the Renewable Energy Target, a measure Abbott’s Liberal Party ostensibly backed before the election to support the goal of supplying 41,000GWh of clean energy by 2020. “What’s disappointing here is that the coalition really went out of its way prior to the election to restate their commitment to ARENA,” Kane Thornton, Deputy Chief Executive of the Clean Energy Council, told The Sydney Morning Herald.

Meanwhile, the world’s leading climate change experts and economists are standing behind reports such as the IPCC’s, which call for a drastic reduction in carbon emissions and a re-evaluation of the energy industry. But the Australian government seems resolute in playing by its own rules. It is a deeply problematic attitude. Because the country has been blessed with such abundant resources, decisions taken by its government can have a global impact. If the Galilee Basin is developed in the way Rinehart and the Adani Group hope, the consequences of the copious unmitigated carbon emissions will be felt everywhere.

The time for the unbridled exploitation of coal is long past; politicians, economists and scientists agree. Abbott, Hunt, and the rest of Australia’s coalition government and their coal mining chums should heed the advice of the IPCC and the Climate Change Authority and review their policies. WWF Australia spokesperson Owen Pascoe put it best in a statement: “The IPCC has made it clear what the world needs to do – the question for Australia is now what sort of country are we going to be? Are we going to sit back and hope the United States, Europe and China will do the work to save our Great Barrier Reef and protect our country – or are we going to pitch in and do our fair share?”

New Holland Agriculture on the path to zero carbon farming

New Holland’s vision for the future of agriculture centres on efficiency and sustainable farming – doing more with less – so that agribusinesses can operate profitably and develop in a way that is sustainable in the long term, while preserving natural resources for future generations. This commitment underpins New Holland’s company ethos and is at the heart of everything it does, in every factory, office and product.

This translates into New Holland’s wide-spectrum approach to sustainability and its belief in innovative technology that maximises efficiency, is easy to use and is accessible to all. The company works closely with farmers on every continent, partners with universities and research institutes, and participates in biodiversity and conservation projects across the globe to develop and promote technologies and farming practices that will contribute to a sustainable future.

If we want to go further in agriculture, we must look at increasing the efficiency of farming operations; to do more with fewer resources

All these efforts come together in New Holland’s Clean Energy Leader programme, the overarching objective of which is to help farmers achieve energy independence, increasing their production efficiency and improving the sustainability of their farms. Since 2006, the programme has driven numerous pioneering initiatives shaped around the needs of farmers, the machinery they use, and the impact their activities have on the population and the environment.

The path to energy independence begins with New Holland’s research into renewable energy sources in agriculture: in addition to growing oilseed rape and sunflowers for energy, the company has been looking at a wide offering of other solutions. These range from turning sugar beet, sugarcane and maize into bioethanol, to unlocking the high energy potential of short rotation coppice and grasses such as miscanthus – or even using sugarcane stover and corn stover to produce energy.

On the way to energy independence
As countries around the world look to reduce their dependency on fossil fuels, research into alternative sources of energy has come to the fore. The agricultural world is taking the lead by developing innovative solutions to identify crops that can be turned into energy and the most efficient processes to achieve this. New Holland has taken a proactive role, conducting numerous research projects in collaboration with academic institutions and specialist industrial partners around the world.

These projects look at a variety of crops that can serve as alternative sources of renewable fuel. For example, New Holland is running two projects – one with the University of Illinois and the other with Penn State University and the US Department of Agriculture – to look into various aspects of growing, crop yields and harvesting miscanthus as a source of energy.

Sugar cane stover and corn stover can be transformed into ethanol using second-generation cellulosic bioethanol production techniques, which produce 30 to 40 percent more ethanol than traditional first-generation techniques. In Brazil, New Holland has partnered with the Sugar Cane Technology Centre in a project to develop energy production from sugar cane and sugar cane straw on two test farms. This solution brings advantages to both farmers and the environment, as it negates the need for stubble burning – which is extremely polluting – and provides an additional source of income for the farmer.

The company has also sponsored research on corn stover bale density. This has resulted in the development of the Cornrower equipment, which can increase bale density by as much as 15 percent.

New Holland is partnering with the University of Nebraska to look into using corn stover as animal feed. In another project, to determine the best methods of harvesting, collecting and storing corn stover, New Holland is working with Iowa State University and the POET and DuPont plants that produce second-generation cellulosic ethanol from the material.

The power of biomass
Biomass as a source of energy is more than a viable solution; it is carbon-neutral energy, as the carbon emitted during the utilisation of the crops is absorbed by the crops grown during the following season. And it doesn’t need to take land away from food production, as unwanted by-products of harvesting, such as corn stover, can be baled and turned into energy. Even animal manure, bedding and excess slurry can be used to generate energy.

Alternative energy solutions

30-40%

Additional ethanol produced through second-generation cellulosic production techniques

15%

Increase in bale density from Cornrower equipment

Biomass can provide farmers with the key to energy independence: it is on every farm, and so is the equipment needed to harvest and process the crops. Farmers can produce the energy to run their equipment and their farm – and they can sell any surplus back to the national grid.

Farmers can already run their New Holland equipment on biodiesel: with their land to grow biomass crops, they are well on the way to being energy independent. New Holland is looking ahead, to a future with zero emissions.

On the path to zero-carbon farming, New Holland has worked on its NH2 hydrogen tractor project, unveiling its second-generation prototype in 2011. This is a truly revolutionary concept that uses a hydrogen tank and fuel cells to generate electricity, which runs the electric motors that power the machine and its implements. No noise, no polluting emissions – just a little water. The NH2 tractor has shown in the field that it is a viable concept. However, the high cost of the fuel cells means that, for now, it is not commercially feasible.

New Holland has turned its attention to developing a solution for the nearer future, which doesn’t depend on such high cost elements: a methane tractor that runs on methane generated from biomass grown on the farm and processed in the farm’s biogas plant. Methane propulsion technology can lower emissions by as much as 80 percent compared with a standard diesel engine, and, when using bio-methane, the machine’s carbon impact is virtually zero. New Holland unveiled its T6.140 Methane Power tractor working prototype last year. The first unit has entered service at New Holland’s pilot Energy Independent Farm in Italy, and the T6.140 could enter production to become commercially available in only a few years, bringing a zero carbon future closer.

Preserving resources and profitability
Cutting emissions from burning fossil fuels by providing alternative renewable sources of energy produced sustainably is one way of reducing our impact on the environment. If we want to go further in agriculture, we must look at increasing the efficiency of farming operations; to do more with fewer resources. New Holland’s track record in developing innovative technologies and features speaks volumes about its belief in the positive role agricultural equipment can have. It can help us feed a fast growing population while preserving natural resources for future generations – contributing to a profitable business model for farms that is sustainable in the long term.

Doing more with less: that is what precision farming is about, and it is dramatically changing the landscape of farming around the world

Doing more with less: that is what precision farming is about, and it is dramatically changing the landscape of farming around the world. New Holland’s Precision Land Management (PLM) technologies analyse and plan the farmer’s tasks in the field, pinpoint areas with lower yields where inputs can be focused, and enable farmers to place seeds, fertiliser and pesticides precisely to ensure maximum returns, achieving uniform planting and crop protection. This not only leads to higher yields, but also prevents potentially harmful and wasteful surpluses from damaging the environment.

Soil is a farm’s lifeblood. That’s where the present and future viability of a farm lies, so protecting it – maintaining the nutrients it contains – is vital to every farmer’s future. With PLM mapping and auto-guidance systems, the farmer is able to control field traffic and cover the field with the minimum number of passes, so they never overlap. This prevents overspraying and potentially harmful run-off, as well as minimising the risks of soil compaction, which dramatically affects its productivity.

This is a powerful technology, but of course it is a tool, and it requires in-depth knowledge of efficient farming practices to yield its full potential. New Holland has partnered with Purdue University’s College of Agriculture in the US – one of the world’s leading academic institutions for agricultural sciences – to develop and implement a programme to train the company’s PLM field staff.

The course covers how precision farming is developing and is expected to evolve around the world, how it fits into and impacts a farmer’s business model, and the challenges and implications of adopting precision solutions. The overarching aim of the programme is to enable staff to identify the opportunities PLM offers farmers and to advise them on integrating these technologies successfully into their businesses.

Low impact agriculture
Businesses around the world are encouraged to monitor and reduce their carbon emissions, and farms are no different, as consumers increasingly ask for produce with a small carbon footprint. The easiest and perhaps most obvious way to do that is to use equipment with low emissions, such as New Holland’s ECOBlue machines. These machines not only meet the strict Tier 4A emissions regulations but also cut down on fuel consumption, further reducing their carbon footprint.

In fact, a New Holland Tier 4A compliant machine’s emissions are so low that it would have to run 100 days to produce the same amount of emissions as a Tier 1 machine built 10 years ago would have in just one day. Farmers can discover the exact carbon emissions of their fleet with New Holland’s carbon footprinting method, so they can calculate how much they could reduce their footprint by replacing some of their equipment with ECOBlue models.
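
Taken at face value, that claim implies a Tier 4A machine emits roughly one-hundredth of what a ten-year-old Tier 1 machine does in the same working time. A minimal sketch of the kind of fleet comparison a farmer might run – with entirely hypothetical usage and emissions figures, not New Holland data – could look like this:

```python
# Hypothetical fleet comparison built on the claimed ~100:1 emissions ratio
# between Tier 1 and Tier 4A machines. All figures are illustrative only.
TIER1_PER_HOUR = 10.0                   # assumed emissions units/hour, Tier 1
TIER4A_PER_HOUR = TIER1_PER_HOUR / 100  # implied Tier 4A rate

fleet_annual_hours = {"tractor": 1200, "harvester": 400}  # hypothetical usage

old_fleet = sum(h * TIER1_PER_HOUR for h in fleet_annual_hours.values())
new_fleet = sum(h * TIER4A_PER_HOUR for h in fleet_annual_hours.values())

print(f"Annual emissions, Tier 1 fleet:  {old_fleet:,.0f} units")
print(f"Annual emissions, Tier 4A fleet: {new_fleet:,.0f} units")
print(f"Reduction from upgrading: {1 - new_fleet / old_fleet:.0%}")  # 99%
```

The point of the exercise is the one the footprinting method makes: at a ratio like this, the age of the fleet dominates any other variable a farmer can control.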

New Holland is investing in methane technology to reduce the environmental impact of farming

Preserving natural resources is not only about efficiency: it is also about reducing the impact of farming activities on the environment. This can be done, profitably, by combining sustainable management of the soil, residue and inputs with advanced crop diversification techniques. We should not forget that, while farms are the guardians of our countryside, they are primarily businesses that must generate the income needed to build a sustainable long-term future.

New Holland is active in conservation agriculture projects around the world, supporting farmers who wish to apply these practices with equipment and advice. Conservation agriculture focuses on eliminating waste and preserving nature’s resources. No-till farming, for example, leaves the soil virtually undisturbed after the growing season, so nutrients and moisture are locked in for the following season’s crops. This not only combats erosion, but also improves the water preservation qualities of the soil as its structure is maintained. In addition to these benefits, no-till farming reduces soil compaction and fuel consumption, as it dramatically reduces traffic in the field – further reducing the farm’s carbon footprint.

Committed to a sustainable future
The Clean Energy Leader strategy is at the heart of everything New Holland does, from supporting sustainable farming practices to participating in biodiversity and conservation projects. It also guides the company’s partnerships with academic and research institutes that promote technologies and farming practices for a sustainable future.

And it informs the way New Holland runs its operations, starting with reducing the environmental impact of its production. It concentrates manufacturing, where possible, close to the customer so transport is reduced, cutting down fuel consumption. New Holland selects the most efficient and lowest-emissions transport solutions. Its manufacturing plants apply World Class Manufacturing practices that require energy reduction and the elimination of waste. It lengthens the lifespan of its spare parts by giving them a second lease of life as remanufactured parts.

As a result of all these efforts, in 2013, New Holland played a key role in CNH Industrial being ranked first in the Dow Jones Sustainability World and Europe Indices for its environmental performance for the third year in a row.

The future of agriculture is bright: applying technology and mechanisation with an eye on the environment, doing more with less while reducing the impact of farming activities is the way to a sustainable and profitable future for farmers – and a healthier environment for all.

How does the music industry make money?

In 1937, the Chicago branch of the American Federation of Musicians arranged a strike in the city that saw musicians withdraw their services in protest at the proliferation of ‘canned’ music. Such had been the rise in the use of recorded music, which was subsequently played on the radio, that musicians who had previously made their money through performing live were now finding work hard to come by.

It seems strange now to think of musicians being against recorded music, but back then it was a relatively new concept and many artists were paid minuscule amounts of money in comparison to what they earned performing live. The idea that people could own recordings of music, which they could then play as and when they liked, meant live musicians were reduced to getting their income from all-too-rare performances, or the meagre royalties that sales and radio plays generated.

The strike cost Chicago musicians $125,000 in lost recording fees, but it did little to stop the rapid growth of record companies or the public’s demand for recorded music. Such was the appetite that it soon became the norm for people to have large collections of music, while record companies became all-powerful industry bodies that decided who became successful.

Ownership vs rental
The idea of ownership of music by consumers is one that has only emerged since recorded music became popular. Consumers have come to assume they own the content they have purchased – and this also includes films and books. However, the rights holders of such content tend to be a combination of record labels that distribute it or pay for it to be made, and the artists who created it in the first place. The user is, in legal terms, merely licensing it.

Some consumers have thought that, once they purchased the content, they’re free to do with it as they choose (such as sharing it with others), but the owners of the copyrighted material have pursued such practices with vigour. Policing the use of licensed material became much harder, however, when the internet created an easy way of sharing content quickly, freely and in a manner difficult to detect. Content rights holders found it hard to keep up with the many ways that were emerging to share their work.

Although streaming services seem to be helping in the fight against piracy, they also mean listeners or viewers don’t have as much control over the content as they used to

The dominance of record labels began to wane when the internet opened up alternative possibilities for musicians and fans who felt they had been short-changed by a bloated and cynical industry. In their efforts to address the decline in sales caused by piracy, large and powerful rights holders scored a number of public relations own goals by going after individuals – including young children – who had traded their work.

The record industry was slow and confused in its reaction to these changes, struggling to find a meaningful and steady form of income with which to replace dwindling sales. It was hoped the music-buying public would be content with downloadable files from the likes of the iTunes Store, sustaining the content model of the labels that owned and distributed music, but turning it digital.

However, while the number of music files being legally downloaded has risen sharply over the last decade, a range of alternative services have emerged to challenge the way in which consumers access such content. Streaming services such as Spotify and Rdio have attempted to turn people away from music ownership and towards a rental system based around advertising and subscriptions. In the world of film and television, Netflix has begun to make serious strides in cutting the number of pirated films downloaded, while presenting a challenge to the physical market.

The fight against music piracy 
While music piracy is a long way from being eradicated, services such as Spotify and iTunes, which allow easier access to music, are helping to address the problem. Recent research has shown these services are both growing rapidly and curbing piracy. In Norway, research by Ipsos MMI found the number of songs pirated in 2012 was 210 million – a mere 17.5 percent of the 1.2 billion copied in 2008. Piracy of film and television had also halved in that period. These figures could reflect a trend for the rest of the world, one in which streaming is killing off piracy and bringing in revenue for rights holders. But many argue the amounts paid to the holders – particularly by Spotify – are neither enough nor fair to newer musicians with fewer fans.

Fall of the Norwegian pirates

1.2bn

Songs pirated in 2008

210m

Songs pirated in 2012

17.5%

Songs pirated in 2012 as a share of 2008 levels

The music industry has taken its time in finding a suitable model, but the ease of use and extensive catalogues the leading streaming services offer mean there is now a viable alternative to piracy. However, there need to be copyright regulations that are enforced, says Julian Hewitt, a music specialist and partner at Australia-based Media Arts Lawyers. He told The New Economy: “It seems like the industry will not litigate, and so are looking for political assistance to help curb piracy. Certainly, if, as a society, we still think there’s a place for copyright – and that’s a position I very strongly support – then we need to uphold it. The market is competitive and legitimate services are not expensive – it’s the same with filmed content – so hopefully piracy will become inconvenient enough to kill itself off – but it has cost our artistic community immeasurably.”

Although streaming services seem to be helping in the fight against piracy, they also mean listeners or viewers don’t have as much control over the content as they used to. Subscribers are still licensing the songs, but they have less control over what they can do with them – such as copying them and distributing them among their friends. Some research has even shown people’s attitudes towards ownership of music appear to be changing. However, what changes now is the amount of money recouped from each listener – much less initially than from a physical or download sale. This has implications for artists and rights holders that have grown dependent on these sales.

Does Spotify pay artists enough? 
The debate over what Spotify pays artists has raged since the company launched in 2008. According to some artists, Spotify pays as little as $0.004 for every play of a song. The rate has been criticised by many for being too low, particularly as artists without a large fan base are seeing their work hosted on a service that allows people an unlimited number of plays. The debate had begun to simmer down in recent months, following a number of high-profile additions to the service, including Pink Floyd and Metallica. However, others, such as Coldplay and The Beatles, have stayed off the service, and more artists are becoming disgruntled with the low royalty fees.

Spotify stats

$1bn

Estimated amount paid by Spotify to rights holders at end of 2013

$0.004

Rights holder payment per play

$0.70

Rights holder payment per download

Nigel Godrich – the acclaimed producer of artists including Radiohead, Beck and Paul McCartney – reignited the debate over what Spotify pays artists in July when he announced on Twitter that he was removing his music from the streaming site. Godrich said the recently released album by Atoms for Peace – a project with Thom Yorke from Radiohead and Flea from the Red Hot Chili Peppers – was taken down from Spotify in protest at the low royalty fees the service pays new artists. Describing the rate new artists get paid, Godrich – backed by Yorke – said: “It’s an equation that just doesn’t work.”

Mark Kelly, keyboardist for English band Marillion, disagrees, however. He told The New Economy that most of the songs Spotify pays out for are newer material: “According to Spotify, most of the money they pay out is for new songs and not the old catalogue artists like Marillion and Radiohead. Furthermore, they pay through 70 percent of their revenue, as do Apple.”

Spotify responds 
In response to the comments from Godrich and Yorke, a spokesman for Spotify said the firm was merely at the early stage of developing its service: “Right now, we’re still in the early stages of a long-term project that’s already having a hugely positive effect on artists and new music. We’ve already paid $500m to rights holders so far, and by the end of 2013 this number will reach $1bn. Much of this money is being invested in nurturing new talent and producing some great new music.”

There is a stark difference between the lifetime fees Spotify generates for artists and the one-off payment received for a single purchase or download. Although the rate of $0.004 per play is small, in the long term it could net the artist more than the $0.70 for each individual download. Also, as Spotify is a relatively new service with fewer paid-up members than iTunes, it is likely that, as it grows in popularity, it will pay out much more to rights holders.
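
Taking the per-play and per-download figures quoted in this article at face value, the break-even point is easy to work out. Below is a quick, illustrative calculation – the rates are the estimates cited above, not official ones, and the listening pattern is assumed purely for the example:

```python
# Illustrative break-even between streaming and download payouts,
# using the per-play and per-download estimates quoted in this article.
per_play = 0.004     # estimated rights-holder payment per play ($)
per_download = 0.70  # estimated rights-holder payment per download ($)

breakeven_plays = per_download / per_play
print(f"Plays needed to match one download: {breakeven_plays:.0f}")  # 175

# An assumed casual listener: five plays a month for three years.
plays_per_month, months = 5, 36
lifetime_revenue = plays_per_month * months * per_play
print(f"Streaming revenue after three years: ${lifetime_revenue:.2f}")  # $0.72
```

In other words, a track needs around 175 plays per listener to match a single download – achievable for a favourite song kept in rotation for a few years, but a long wait compared with an upfront sale.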

However, much of the discontent from artists over what Spotify pays rights holders is to do with who owns the rights and who negotiated the fees in the first place. The record labels that saw their sales and influence dwindle at the turn of the century have invested heavily in Spotify – including through equity stakes. They are happy to negotiate low rates of royalties, safe in the knowledge their extensive back catalogues will earn them money for years to come. Struggling artists, on the other hand, might not be able to last quite so long on such a paltry – if steady – stream of income.

One label that has a different stance to many is the Beggars Group, which pays its artists 50 percent of streaming revenue. According to Kelly, all labels should follow Beggars’ lead: “Part of the problem is that many artists don’t own their rights and are being paid from streaming income as if it were a physical sale. All labels should follow the lead of Beggars Group and pay artists 50 percent of all streaming income. That, coupled with a 10 or 20-fold increase in paying customers, would mean that streaming would become a valuable source of income for artists and labels alike.”
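
Kelly’s point is ultimately arithmetical: a larger share of streaming income multiplied by a much larger subscriber base compounds quickly. A rough sketch of that logic – in which the 15 percent “physical-style” royalty rate and the income figure are assumptions for illustration only – might look like this:

```python
# Rough illustration of Kelly's argument. The 15% 'physical-style' royalty
# rate and today's income figure are assumptions made for this example.
current_income = 100.0         # an artist's streaming income today (arbitrary units)
physical_style_royalty = 0.15  # assumed rate when streams are paid like physical sales
streaming_share = 0.50         # the 50% share of streaming income Beggars Group pays

uplift_from_rate = streaming_share / physical_style_royalty  # ~3.3x from the rate alone

for subscriber_growth in (10, 20):  # Kelly's 10- to 20-fold customer increase
    projected = current_income * uplift_from_rate * subscriber_growth
    print(f"{subscriber_growth}x subscribers: {projected:,.0f} units "
          f"(~{uplift_from_rate * subscriber_growth:.0f}x today's income)")
```

On those assumptions, the combination would multiply an artist’s streaming income by somewhere between 33 and 67 times – which is why Kelly argues the model only becomes viable when both levers move together.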

Hewitt says streaming models are likely to be more beneficial to artists in the long run, although it’s still too early to make a proper estimate of the effect: “Conceptually, the streaming models are incredibly fair because, instead of paying for the upfront purchase of music, your subscription fees are being distributed by plays – so the music that has the most profound place in your life will be paid the highest share of your fees. Doing a ‘per play’ calculation on what artists get paid at this point is pretty nebulous because the subscription pool is still very low.”

Embracing technology
Clearly, new technology is not going away, and the industry – both artist and label – needs to embrace it. Radiohead’s manager, Brian Message, spoke shortly after the comments made by Yorke and Godrich, saying the industry needed to work with firms such as Spotify to develop the best possible way of remunerating artists.

Solving the problem of online music piracy has troubled the industry for well over a decade

He said: “It’s not black and white: it’s a complicated area. There [have] been over 20 attempted reviews of copyright and how it operates in the internet era, and there’s been no satisfactory solution to it. The bottom line is technology is here to stay and evolution of technology is always going to go on. It’s up to me as a manager to work with the likes of Spotify and other streaming services to best facilitate how we monetise those [platforms] for the artists we represent. It’s not easy, but it’s great to have the dialogue.”

Hewitt adds that streaming services are still being developed: “The model is still being refined – the level of royalties paid through from services like Spotify to the artists, labels and publishers is a negotiated rate that isn’t set in stone ad infinitum.” However, Hewitt also says the music industry needs a strong set of competing services to give the consumer choice and ensure a healthy market: “I think it’s important that there are competing services otherwise the music industry runs the risk of another iTunes: a service provider with enough market share and power to screw down the creatives.”

One service that is not paying artists enough, according to Kelly, is YouTube. He says: “All artists, new and old, suffer from the biggest streaming service, YouTube, paying an insulting amount to the people who made YouTube the success it is: the creators.” Some artists are even less enthusiastic about online music, with Talking Heads frontman David Byrne telling The Guardian in October: “The internet will suck all creative content out of the world.”

Solving the problem of online music piracy has troubled the industry for well over a decade. Kelly says these new services are gaining traction because of their ease of use compared to file-sharing websites: “The user experience for people who access content through piracy is terrible. That’s why people are willing to pay for Spotify and iTunes… I think there is a place for regulation too, in order to make piracy an even worse experience and more hassle than it’s worth. The place to apply the pressure is the delivery system: the ISPs. They have been profiting from delivering creative content at ever-faster speeds while not paying us, the creators, a bean.”

The digitalisation of music

Alternatives for artists
Finding an alternative method has troubled both artists and executives. With technology forever evolving, it’s increasingly difficult to predict what method of music consumption will be prevalent in a few years. From the perspective of an artist, some have looked to give away their songs, hoping their income will come from increased touring and merchandise sales. Others have used crowd-funding services such as Kickstarter and Bandcamp to raise money.

Having spent their career at industry giant EMI, Radiohead decided not to sign a new contract, instead choosing, in 2007, to offer their album In Rainbows to fans for whatever they felt like paying. Although they never released the official figures, it’s thought the average amount spent by fans was £4, and that it proved more profitable than their final album at EMI, Hail to the Thief. Radiohead didn’t stick to this model, however. In 2011, the band released a new album in the traditional manner, through independent label XL Recordings.

The consequence of this shift in how people see their media content – emphasising the license over actual ownership – might, in fact, have a profound effect on the control an artist can have over that music. It might make consumers realise the entertainment they enjoy is not really theirs to distribute and share, bringing to an end over a decade of rampant piracy.

Can antibiotics win the fight against bacteria?

Read Part 2 of 2: Healthcare Apocalypse

One evening in November 2011, a consultant in the infectious diseases ward of a UK hospital was treating 27 patients. Seven of them had confirmed cases of E.coli, a bacterium that can cause deadly infections. Six were suffering from severe or life-threatening infections. Of those six, the conditions of two patients meant their treatment was limited to three intravenous antibiotics from two classes.

The patient whose condition was non-severe had only been admitted because there were no oral antibiotics that would work on this strain of E.coli, so he required a drip, despite only having a mild infection. The doctor was stuck for ways to treat the patients, whose conditions were as common as they were deadly. He had already run out of options with which to treat two of his severe cases. This was just one bug, in one ward, in one UK hospital.

According to scientific data, we could be heading to a world where there is no cure for infections such as those caused by E.coli: procedures such as childbirth and appendectomies, which are standard and safe today, will be deadly once again. We are heading towards a world without antibiotics.

Dr Arjun Srinivasan, an Associate Director at the US Centers for Disease Control and Prevention, says: “Antibiotics were one of, if not the most, transformational discoveries in all of medicine. Infections are something that we struggled to treat for many, many years, for centuries before the advent of antibiotics, and infections were a major cause of death before the advent of antibiotics.” For him, a world without antibiotics is unthinkable.

50%

Increase in mortality from drug-resistant pathogens

20 mins

Time it can take bacteria to double their population

The cost of resistant bacteria:

25,000

Deaths in the EU

€1,500,000,000

Cost to EU healthcare system

12,000

Deaths in the US

30,000

Annual deaths from sepsis in Thailand

$2,000,000,000

Cost to Thailand in lost productivity

Bazooka subtlety
Bacterial infections are widespread and can be as harmless as an infected pimple or as deadly as a rampant case of meningitis (which can kill a child in hours). The only way to treat bacterial infections is with antibiotics, and for nearly 90 years that is exactly what we have done. Alexander Fleming discovered penicillin in 1928 and, since then, antibiotics have been the cornerstone of modern medicine. But bacteria are wily microorganisms that evolve extremely fast, so, even as we have been developing ways to kill them, they have been growing immune to our poisons.

This was never a problem in the past, though scientists have been aware of the spectre of resistance in bacteria since 1945: until the late 1980s, researchers were regularly discovering new classes of antibiotics that could be used when a certain type of bacteria developed immunity to existing drugs. Every new drug we can throw at bacteria will only ever be effective for a limited amount of time before they evolve and develop immunity to it.

“The more you use an antibiotic, the more you expose a bacteria to an antibiotic, the greater the likelihood that resistance to that antibiotic is going to develop,” says Srinivasan. “So the more antibiotics we put into people, we put into the environment, we put into livestock, the more opportunities we create for these bacteria to become resistant. We also know that we’ve greatly overused antibiotics and, in overusing these antibiotics, we have set ourselves up for the scenario that we find ourselves in now, where we’re running out of antibiotics.”

Though most consumers are quick to associate the use of antibiotics with the treatment of severe infections, their use is far more widespread. Antibiotic creams and lotions are available over the counter to treat minor cuts and insect bites. Any type of surgery that involves cutting open the body cannot be done without doctors administering courses of antibiotics before and after to avoid infection and sepsis. Even cancer treatments that ravage the body’s immune system would be much more difficult without the use of antibiotics, which help boost the body’s defences.

The end of medicine as we know it
Antibiotics are the vital difference between modern medicine and the carnage that occurred before. “Things as common as strep throat or a child’s scratched knee could once again kill,” says Dr Margaret Chan, Director General of the World Health Organisation. “Antimicrobial resistance is on the rise in Europe and elsewhere in the world.

“We are losing our first-line antimicrobials. Replacement treatments are more costly, more toxic, need much longer durations of treatment, and may require treatment in intensive care units. For patients infected with some drug-resistant pathogens, mortality has been shown to increase by around 50 percent. A post-antibiotic era means, in effect, an end to modern medicine as we know it.”

The problem is modern medicine’s penchant for antibiotic use has made bacteria evolve much faster than we were prepared for. And the issue is getting graver by the day; there are many types of bacteria resistant to some drugs (known as multi-drug resistant strains), but increasingly researchers are finding strains resistant to all drugs. These pan-drug resistant strains are untouchable by antibiotics of any class.

Bacteria reproduce at an alarming rate: some types can double their population numbers in as little as 20 minutes. “Evolution is a natural process. Bacteria evolve to survive in a hostile environment,” says Professor Laura Piddock of the University of Birmingham and the Antibiotic Action public awareness initiative. “They evolve quickly because they grow so quickly, so it doesn’t matter what the stressful environment is; if there is a bacterium within that environment that has changed itself and become resistant, it will survive – but then they have the monopoly in that environment and they grow up.”
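
To see why a 20-minute doubling time matters so much, it helps to run the raw arithmetic of unchecked growth. The sketch below is a deliberately idealised illustration – real populations are limited by nutrients and space long before these numbers are reached:

```python
# Idealised illustration of bacterial doubling every 20 minutes.
# Real growth is capped by nutrients and space; this shows only the raw rate.
doubling_minutes = 20
hours = 24
doublings = (hours * 60) // doubling_minutes  # 72 doublings in a single day

population = 2 ** doublings  # descendants of one resistant survivor
print(f"Doublings in {hours} hours: {doublings}")
print(f"Cells descended from a single resistant cell: {population:.3e}")  # ~4.7e+21
```

A single bacterium that happens to carry a resistance mutation can, in principle, repopulate its environment within a day – exactly the monopoly effect Piddock describes.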

even as we have been developing ways to kill them, they have been growing immune to our poisons

“It’s important to know that this is a phenomenon that plays out in nature,” says Srinivasan. “Most of the antibiotics that we have available to us now were derived from products in nature. So penicillin was an agent that was excreted by moulds in order to kill bacteria. Eventually bacteria will evolve and they’ll adapt ways around that to overcome that obstacle.”

This means the amount of time it takes for genetic mutations to occur is extraordinarily short – much faster than researchers can identify and target them. According to Antibiotic Action, between 1935 and 1968, 14 classes of antibiotics were introduced for human use. Since then, only five classes have been discovered. “Because there are no new classes, that is not to say there are no new antibiotics; there were until the 90s,” says Piddock. “No new drugs are an issue because you can have new versions of current classes, but we are not even getting that.”

The cost of carpet-bombing
The issue has garnered a lot of attention recently, both in the media and in academic circles. Many universities are studying the problem, and doctors are starting to feel the effects in ERs and operating theatres. It is a particular issue when it comes to gram-negative bacteria such as E.coli and salmonella, which have an outer cell structure that makes them difficult to kill.

“They are very difficult to treat if they grow antibiotic resistance,” says Piddock. That is not to say that gram-positive bacteria are not an issue: “The key thing is bacteria that we call ‘gram-positive’ – and this includes staphylococcus and MRSA – have been much easier to discover and develop new antibiotics for. It is just less of an issue at the moment.”

The increasing dearth of antibiotics is a problem with causes and implications that go beyond human and veterinary medicine. Antibiotics are found in paints that coat the hulls of ships in order to keep barnacles off, they are sprayed into the ocean to tackle disease in fish farms, and they are sprayed onto fruit trees. The list goes on.

The slowing arms race

Classes of antibiotics introduced:

1935-68: 14

1968-2014: 5

“I think some of us were shocked to hear for the first time that antibiotics are being put into paint to keep barnacles off ships,” David Willetts, the British science minister, told a G8 meeting. “We need to share information to get a measure of what is happening and the scale of the threat. We need to promote research and development. We also need to look at domestic prescribing practices and try to promote a more responsible approach to prescribing.”

For many researchers, it is this reckless use of antibiotics that has caused the current crisis, but there are no clear ways to limit their consumption. “If we make antibiotics very expensive so that they are not used unless they absolutely have to be used, then we risk excluding countries that have less funds and might never be able to afford antibiotics at all,” says Piddock. But she is quick to emphasise that if we do not limit their use at all then we risk losing a valuable resource. “If antibiotics continue to be used as widely and inappropriately as they have in the past, then resistance to these new drugs will develop too.”

Tackling irresponsibility
Some countries are beginning to legislate against the rampant and irresponsible use of these medications. The EU has declared colistin, used on livestock to prevent disease, should be restricted to use on infected animals, with prophylactic use banned. There are fears that resistance to colistin could become damaging to human health as well; colistin is a vital antibiotic in human medicine, and European scientists now fear its unrestrained use on animals promotes resistance among humans too.

The European Commission is yet to make a formal decision on the matter. It would be a precautionary measure, but, for Piddock and other academics, it would be a step in the right direction. She says: “The reality is that we should question the use of any antibacterial agent outside of human medicine, at least until there is unequivocal evidence showing no effect of animal use upon human health.”

The reality is that we should question the use of any antibacterial agent outside of human medicine

Antibiotic resistance is not only having a tremendously negative effect on human health, but also on the public purse. It has been estimated that, in the EU alone, resistant bacteria kill around 25,000 patients a year, costing the healthcare system up to €1.5bn. In the US, around 12,000 people die of these infections each year. In poorer countries, where health provisions are more precarious, the toll is harder to measure, but Thailand estimates antibiotic resistance costs its economy $2bn a year in lost productivity – and over 30,000 people there die of sepsis alone.

Tsunami in sight
Some countries have a much higher percentage of resistant bacteria than others. That is a concern, because if there are more resistant bacteria circulating there are more patients suffering from those infections. “We have seen in the UK, for instance,” says Piddock, “that people bring in certain types of resistance from other countries, so it becomes a problem in our country and our healthcare system. We cannot be complacent because what is someone else’s problem today, with global travel, could be our problem tomorrow.”

But, over the years, technology has evolved. Our understanding of the risks of antimicrobial resistance has increased and researchers are finding alternative solutions to reducing our over-reliance on antibiotics. “Our scientific understanding has increased beyond belief: for instance we can seek to minimise resistance by dosing differently,” says Piddock. “We can look to get the dose of antibiotics above a level that we call ‘mutant prevention concentration’ – that means it would take a lot of effort to select bacteria that are resistant.”

There are simple hygiene and health initiatives that can help limit the spread of bacteria: washing hands with soap and water or with rubbing alcohol, for instance, can go a long way. But experts still agree that the only way of effectively getting a handle on the situation is to limit antibiotic use to human medicine only – and, even then, only in absolutely necessary cases.

Of course, this is easier said than done. The economic implications would be massive, particularly in agribusiness. The slow response of authorities in limiting the use of the most vital types of drugs is also a major concern. But scientists agree: there is a tsunami just beyond the horizon. We know it’s there and we know it’s heading our way. We just don’t seem to be doing enough to protect ourselves from its effects.

A titan will quake


It’s not been the most straightforward 18 months for pharmaceutical giant GlaxoSmithKline (GSK). The rest of the industry hasn’t exactly been revelling in record profits, of course: patent cliffs, price cuts and increased regulation have put off investors, causing the whole industry to slow down. GSK has been hit by these problems, too, but it’s also needed an aggressive course of reputation management after a scandal and resultant lawsuit in the US. The company was fined $3bn for falsely advertising and misselling antidepressants such as Wellbutrin and Paxil – the largest fine ever in the pharmaceutical industry. The patent cliff has been looming for GSK, along with the rest of the industry, for some years.

The end of a patent is more significant in the pharmaceutical industry than in consumer goods because rapid innovation is not an option in healthcare. In 2010, GSK lost its US patent on asthma drug Advair. Last year, the patent ran out in Europe.

In addition to its patenting problems, GSK is facing an investigation by the Chinese authorities into possible bribery and corruption. The corporation is accused of bribing doctors and healthcare professionals to prescribe specific drugs. It is alleged the company spent up to £320m on bribes to clinch higher prices for its prescription drugs and win market share.

In addition to its patenting problems, GSK is facing an investigation by the Chinese authorities

Whether this bribery, if it occurred, was the result of a few bad employees within the company or of wider corporate policy is not yet clear. Chinese authorities are still looking into the allegations, making it clear the country will not stand for corruption – particularly from foreign companies. Yet it remains to be seen whether the patenting issues or the lawsuits will cause long-term problems for the company, and where investment should be focused to protect it from such problems in the future.

Not quite out of breath
Advair, the most popular drug for chronic obstructive pulmonary disease (COPD), is responsible for 20 percent of GSK’s total annual profit and brings in $8.6bn in annual revenue. It is marketed by GSK all over the world under the names Seretide, Viani and Adoair, among others. As the patent has now expired in both Europe and the US, GSK is looking to invest in alternatives to fill the gap. With Advair responsible for such a large share of the firm’s profits, GSK is keen to maintain its hold on the market.

It’s likely that, in the immediate term, Advair will not prove too problematic. The patent expired in the US three years ago and the impact has not really been felt. The company is fortunate in that, although the patent on the drug has expired, the patent on the technology required to administer the drug (a device known as Diskus) does not run out until 2016. Without this technology, it is remarkably difficult to produce an equivalent COPD drug to Advair.

The medication itself is formed from a combination of two different asthma drugs, making it hard to copy in the correct dosage. Therefore, GSK’s monopoly over Advair is probably safe for the next three years at least. Drug company Mylan is the favourite to replicate the drug quickest, but that will not happen until after Diskus falls off the patent cliff in 2016.

At the moment, Advair’s strongest competitor is a drug owned by Anglo-Swedish pharmaceutical company AstraZeneca, known as Symbicort. This generates $3.2bn in yearly sales, compared with Advair’s $8.6bn. If the prominence of Symbicort increases, the patent loss could become a much more serious problem for GSK. To combat the threat, GSK is focusing its efforts on developing new cancer and HIV drugs so that, by the time Advair begins to lose its worth, the drugs arm of the company will have diversified its profit base. GSK also develops vaccines and consumer products, as well as pharmaceuticals, so it has the potential to diversify in those sectors if necessary.

GSK in numbers

$8.6bn – annual revenue from Advair
$3.2bn – annual revenue of Advair’s nearest competitor
20% – Advair’s contribution to GSK annual profits
$3bn – fine for misselling and false advertising
£320m – alleged cost of bribes in China
$2bn – expected fine in China
4% – China’s contribution to GSK annual profits
5 – factories in China
7,000 – employees in China

The drugs don’t work
It will not be so straightforward with GSK’s ongoing issues in China, where the problems are more to do with reputation and public relations. If found guilty of corruption and bribery, GSK could face heavy fines. The worst outcome for GSK would be if it were forced to leave China. Despite currently making only four percent of its profits in China, GSK has invested heavily in the country, opening five factories and employing over 7,000 people.

Developing markets are of particular importance to GSK as patent cliffs are slowing its growth in developed ones. The company acknowledged the importance of the Chinese market in a press release issued after representatives from GSK met with Chinese officials. Abbas Hussain, President International for GSK, said: “We will actively look at our business model to ensure we make a significant contribution to meeting the economic, healthcare and environmental needs of China and its citizens.”

The company’s sales in China plummeted 61 percent in the third quarter of 2013. The bribery scandal has limited GSK’s ability to market its products and has pushed much of its market share towards its competitors. GSK will need to reclaim this if it is to continue building a strong presence in China. Furthermore, fines for the corruption and bribery accusations are expected to be in the range of $2bn, which would have a big impact on the company’s annual profit.

In 2012, the company was accused of marketing an antidepressant drug in the US as being suitable for younger people, despite evidence it was linked to suicidal tendencies in teenagers. GSK also paid Dr Drew Pinsky to promote the drug on his popular radio show as helping with sexual performance and weight loss, although there was no evidence to support such claims. As a result of this false advertising, GSK was fined $3bn – a record amount in the pharmaceutical industry. While GSK is financially able to recover from such payouts, restoring its reputation will be harder. The company’s popularity has plummeted in the US as a result of the revelations and, where possible, consumers are seeking alternative suppliers.

Drawing a line
No individual was prosecuted after the Wellbutrin fiasco. It is unlikely the Chinese authorities will be so lenient. Four sales executives have already been arrested as part of the investigations. The hammer could also fall on senior members of the company, despite the fact they were not necessarily involved in the groundwork of the schemes: the Chinese authorities say they have evidence to suggest those at managerial level were fully aware of the corrupt practices taking place. The former chief of GSK’s operations in China, Mark Reilly, has been banned from leaving the country. Even in the West, some see the arrests as the only plausible way to stop this sort of corruption and ensure stricter corporate governance. The threat of relatively small-scale fines is not a sufficient deterrent.

Even in the West, some see the arrests as the only plausible way to stop this sort of corruption and ensure stricter corporate governance

If, as seems increasingly likely, it transpires the corruption was not the work of a few individuals but the result of deep-seated company policy, GSK could be at risk of prosecution in the UK under the country’s far-reaching anti-bribery laws. There were rumours this would happen after the misrepresentation of antidepressants in the US, as the drugs were also marketed in the UK as being suitable for teenagers. However, this came to nothing. Yet a British company suffering two similar scandals in as many years is unlikely to go unnoticed by the country’s authorities.

A spokesman for GSK said: “There is no question about our commitment to China. It is a critically important country of the future.” However, BBC Business Editor Robert Peston believes the only feasible option might be “withdrawal from what many would see as the most important market in the world”. GSK sales executives have been banned from hospitals in China and sales have dropped at an unprecedented rate.

Perhaps it is a case of weathering the storm, but China seems keen to be seen to be taking a hard line on corruption, and is using foreign companies as scapegoats. That is not to protest GSK’s innocence but rather to recognise that an exit from China might be the only option. That said, GSK might be saved by the culpability of other pharmaceutical companies.

It is not just GSK that is currently being investigated by the Chinese authorities: Eli Lilly, Novartis and Sanofi have all been the subject of whistleblowing in the media. Sales executives from a number of these companies have also been banned from entering hospitals. Since it is implausible that all of them will be forced to leave China, GSK might be able to retain its position. But that depends, of course, on the outcomes of the various investigations.

Hope for the future
While the situation in China and the shocking drop in sales have deterred investors, it appears that confidence in GSK is not entirely lost. Its diverse revenue streams enable the company to face problems targeted at a particular sector without detriment to the rest of the corporation.

Furthermore, GSK is investing heavily in the development of new drugs, particularly in the respiratory diseases area, where it has held prominence for so many years with the drug Advair. The company is also, as noted, putting money into developing HIV and cancer drugs, as well as trying to increase its philanthropic work.

GSK has pledged money to Save the Children’s projects in areas with high rates of HIV, to compensate for the company’s current HIV drugs being far too expensive for those who most need them. In the third quarter of 2013, GSK won approval for two new drugs in the US (Tivicay for HIV and a flu vaccine) and one in Europe (Tafinlar for metastatic melanoma). It has also been given approval in Japan for Relvar: a respiratory disease drug that could replace Advair.

Andrew Witty, CEO of GSK since 2008, is trying to change the culture of the company. Witty’s strategy has been to diversify the company’s products: he has overseen heavy investment in consumer products, such as Sensodyne, to prevent too great a dependency on sales of drugs to the West. In spite of this, there has been no talk of a policy overhaul in light of the China scandal.

GSK has requested its auditors, Ernst & Young, carry out an independent audit, alongside the Chinese authorities, to establish where the corruption occurred and whether it was the work of a few individuals or the result of a corrupt company strategy. Perhaps, when a fine is set, GSK can consider the best way to prevent this sort of scandal occurring in the future and make a gesture of contrition – or pull out of the Chinese market altogether.

Interview: Harib Al Kitani, CEO, Oman LNG

The Middle East has supplied much of the world’s energy in recent decades, but Saudi Arabia, Qatar and the UAE tend to be seen as the regional heavyweights. However, just to the southeast of these three resource-rich countries lies Oman, a coastal paradise steeped in history. It is one of the more liberal economies in the Middle East, and its citizens enjoy a high standard of living. The government of Oman is eager to diversify its economy, partly by attracting tourists to its varied historic sites and beaches, and is looking to important domestic businesses to spur this diversification.

Oman is fast becoming a key regional player in the energy industry, thanks in part to the partly state-owned firm Oman LNG. With such notable global backers as Shell, Total, Korea LNG and Mitsubishi, the company is seen as a rising star in the region’s energy market. We spoke to CEO Harib Al Kitani about the country’s prospects, the changing energy landscape and the company’s recent integration with Qalhat LNG.

Oman LNG today

1994 – year Oman LNG was established
10.4m tonnes – annual capacity of Oman LNG
51% – the government’s shareholding in the company

How have the country’s economy and hydrocarbons sector developed since Oman LNG was established in 1994?
There’s a marked difference between the period prior to the Oman LNG project coming on stream in 2000 and the era following the project’s take-off. When our wise and visionary leader, His Majesty Sultan Qaboos bin Said, issued the royal decree establishing the company in 1994, it was obvious that the country would no longer be concentrating its efforts on developing its oil resources alone, but would deliberately seek to harness its gas resources as well. Today, Oman is known not just as an oil producer but also as a reliable supplier of liquefied natural gas to the world, meeting about three percent of global demand according to the International Gas Union.

I think it’s fair to say that the success of the Oman LNG project has remained an important justification for players in the country’s energy sector to explore for gas with vigour. Recently, there’s been a strong emphasis on In-Country Value that seeks to develop SMEs to provide services to the hydrocarbon sector. Oman LNG is part of that drive and this will build capacity, expertise and professionalism across the value chain, and retain capital within the country.

In terms of the economy, I believe the facts speak clearly: earnings from LNG exports contribute a critical nine percent to the sultanate’s GDP and rank second only to oil revenues. This has enabled the Government of Oman to pursue an ambitious diversification programme through support for many sectors of the economy, including tourism, agriculture and the development of infrastructure. Recent gas discoveries augur well for increased contribution to the national economy and, consequently, a brighter future for the country in general.

To what extent is the country’s economic growth tethered to energy exports?
Like most economies of the world blessed with natural resources, Oman has for a long time focused on developing its energy resources, with earnings from these resources making up a large part of the country’s revenue. But that focus is gradually shifting as the government continues to execute a far-reaching diversification programme. Tourism, for instance, has grown steadily over the years, with its contribution to the economy rising in a similar fashion.

Earnings from LNG exports contribute a critical nine percent to the sultanate’s GDP and rank second only to oil revenues

Frankly, it is not at all surprising when you consider the natural disposition of the Omani people to be friendly and welcoming, and that Oman is largely a pristine location where much of nature can still be seen in its untouched state, with a coastline of nearly 1,700km.

Other non-oil sectors are also projected to grow in the coming years and, with some of the earnings from the energy sector directed to diversification, including spreading infrastructure, a new economic landscape based on a broader set of commercial endeavours is certainly emerging. Also, considering the government is the main shareholder, with high stakes in the integrated company – it holds a 51 percent shareholding in Oman LNG and 46.8 percent in Qalhat LNG – it benefits from the expected higher revenues that a combined structure positions us to achieve in the long term.

How has the thriving hydrocarbons sector helped the successful diversification of Oman’s economy?
Oman’s export of liquefied natural gas (LNG) has often been described as a game-changer for the country’s economy, because earnings from the sale of LNG cargoes have provided elbow room for investment in other promising sectors, such as tourism. But beyond our direct contribution to the government’s coffers, Oman LNG dedicates 1.5 percent of its net income after tax to a number of carefully thought-through initiatives. These include promoting entrepreneurship and partnering with the Ministry of Manpower to support training-for-employment in other sectors, which has provided long-term employment for more than 1,400 citizens.

Another specific example is the country’s shipping industry and the role LNG has played in its development. The first ships acquired by Oman Shipping Company (OSC) were for transporting LNG cargoes to buyers, but today OSC has grown beyond ferrying just LNG to carrying other types of cargo, thus expanding its revenue base. Let’s not discount how the reliable supply of LNG from Oman over the past 13 years has enhanced the country’s reputation as a thriving destination for business. This has certainly attracted investors, with the world looking at Oman as a good location in which to do profitable business.

How big a part does Oman LNG play in Oman’s hydrocarbons sector?
With the completion of our recent integration with our sister company, Qalhat LNG, in September, Oman LNG has become the sole face of LNG supply from the sultanate to the world. As the country’s only exporter of liquefied natural gas, with a three-train plant operation and a capacity of 10.4 million tonnes per annum, we play quite a significant role in the sector. Moreover, we are ready to take up more gas, should it become available. Oman LNG and Qalhat LNG’s combined contribution to the country’s revenue in 2012 was over $5bn. So while we already play a significant role, we believe we could still accomplish more.

Tell us, if you would, about the company’s integration with Qalhat.
The integration of Oman LNG and Qalhat LNG is based on a number of obvious synergies between the two companies. After years of operating as individual entities and developing their respective niche markets, the government and other shareholders decided it would be wise to capitalise on these synergies, save costs in some respects, and offer a better and more efficient service to our customers.

Prior to the integration, both companies received feed gas from a single supplier (the government-owned Petroleum Development Oman), utilised vessels from OSC and made use of shared facilities at the plant in Qalhat, Sur. The integration brings together our strengths. It shows some clear thinking on the part of shareholders on how we can remain successful and has been well-received by investors, lenders, insurers, buyers and many of our other stakeholders because it positions Oman LNG to do more with its resources.

Furthermore, it eliminates the idea of competition between the Omani companies, and shores up our flexibility to service the market through long-term sales and purchase agreements, cargo diversions and swaps using the quantity of feed gas we receive and the number of vessels available.

Our primary concern isn’t so much to be a regional heavyweight but to ensure we delight our customers with the quality of our service and our ability to cater to their needs. The integration positions us to do this well. If this quality of service leads us to become a regional or global heavyweight, then of course we welcome that opportunity. Besides this, the integration increases Omanisation within the country’s LNG industry, because it enables more citizens to step into the challenging positions that have arisen as part of the combined structure.

Already we have almost hit 90 percent Omanisation of management positions, but in the not-too-distant future we could be seeing much higher figures. This puts a critical industry squarely in the hands of the Omani people and furthers the actualisation of a national ambition.

How do you feel the integration of Oman LNG and Qalhat will transform regional and global markets?
By the simple fact that the integration positions Oman LNG to do more, it follows naturally that we will be able to supply LNG to the markets more efficiently and address demand more readily through swaps and diversions of cargoes, rather than just the traditional long-term sales and purchase contracts. As such, I wouldn’t be surprised if Oman becomes a choice supplier because of its flexibility in meeting new demand.

We are just now entering what many are describing as the Golden Age of Gas

The integration certainly brings clarity to the supply of LNG from the sultanate and dismisses any notion of competition between the two companies. So I believe it reflects the true image of the country as one unified front for business to the world. Considering the push for greater care in dealing with the environment and threats to food security posed by climate change, it’s really just a matter of time before we see a shift to wider uses for natural gas because it is less damaging to the environment than coal or oil.

I think that, in the future, the market will be better served by a healthy combination of long-term supply contracts and a producer’s flexibility in meeting short-term gaps in supply – which is what the integration really enables us to do.

How do Qalhat’s company values align with those of Oman LNG?
For both companies, delivering added value to customers was an important aim – and I should add this has distinguished us in the eyes of buyers of our cargoes. I speak from personal experience of having had the privilege of serving at both entities prior to the integration: I was at Oman LNG for 10 years and left as the company’s marketing manager to assume the position of chief executive officer at Qalhat LNG in 2005. At both organisations, there was a deliberate effort to cultivate value in everything we set out to accomplish.

I was appointed CEO of Oman LNG in August 2012 and, although there had been a number of remarkable changes over the seven years I was away at Qalhat LNG, there remained at the company’s core the desire to add value to the many services delivered to stakeholders. The same mindset was cultivated at Qalhat LNG, where we had a view of our stakeholders as being Partners In Excellence.

How do you see global demand for LNG changing in the coming decade?
I would be surprised to see a drop in global demand for LNG. In many parts of the world we are seeing rising demand for natural gas, even in places that lie outside the so-called traditional markets of Asia and Europe. In the Middle East, for instance, countries within the Gulf Cooperation Council, such as Kuwait and the United Arab Emirates, are looking to import natural gas. LNG is a cleaner form of energy than traditional types, such as oil and coal. We are just now entering what many are describing as the Golden Age of Gas.

With ever-growing concerns about the environment, health and food supply, among other considerations, our ecology will need more deliberate care in terms of the kinds of energy we use. At the moment, LNG presents a clear, viable option for managing these many concerns. Furthermore, this rising demand is being accompanied by a deliberate push to find and develop gas to supply the market: whether through a conventional plant (as could be the case in Tanzania, for example) or through floating technology (as may be the case in parts of North America, Brazil and Malaysia).

Does Oman LNG have any future plans for expansion?
At the moment, our focus is on making the best possible use of what is available to Oman LNG in terms of gas supply to meet demand in the market. We are not taking any options off the table and, as with most exploration activities, we cannot exactly predict what the on-going search for more gas in the country could yield.

This integration will definitely bring good opportunities to our company and Oman’s hydrocarbon industry generally as we seek new ways to deploy resources in a profitable fashion. So everything considered, I am optimistic about the future.

Starving the world: why the global food industry needs to wake up

The global population is set to hit 9.7bn by 2050, up from 7.1bn in 2012. If the food supply industry remains in its current inefficient state, there will not be enough food to feed the world. The rapid increase is already putting considerable strain on the industry, and the amount of waste is astronomical. The problem needs to be corrected quickly.

As much as 37 percent of food is currently wasted during production. This is a particular problem for developing countries, where factors such as undeveloped infrastructure can render produce unsanitary. If the food supply industry is to sustain an increasing population, then methods of production, transportation and storage must be modernised.

Food stats

7.1bn – global population in 2012
9.7bn – global population in 2050 (predicted)
4m tonnes – edible food wasted annually in the UK
600,000 – children golden rice could save every year
37% – food wasted during production
25% – crops lost to pests in the developing world

The world is facing a food crisis. In addition to a rapidly expanding population, an increased use of biofuels and changing dietary requirements are exacerbating the issue. Yet it is difficult to prepare for this because of the volatility of the industry. Professor Tim Benton of Leeds University says the situation is fragile: “If there is a perturbation to the system created by some combination of extreme weather and civil or social unrest, or international wars disrupting supply chains, then the impact on almost any country could be quite profound.”

The cost of going local
Some argue a reduced dependency on external markets and increased self-sufficiency within nations is required to limit this risk. However, self-sufficiency is a fruitless aim. New Zealand’s economy, for example, is dependent on food exports: food and beverages account for 50 percent of its total exports, a third of which go to EU markets. A global shift towards self-sufficiency could ruin it.

While some environmentalists argue the carbon emissions produced by the aviation required to transport food around the world are an argument against exports, locally produced food carries environmental costs of its own. Were the UK to produce all of its own food, the transport required to move that produce across the country would compromise the environment just as much as aviation does. Urban food miles, such as the consumer’s journey to obtain the food and the journey of waste to landfill, also have a significant environmental impact, whether the product is produced locally or thousands of miles away.

A 2003 study by Simon and Mason showed it was more environmentally friendly for the UK to import apples from New Zealand than to eat local produce. Although the transportation produced CO2 emissions, the production process used far less energy in New Zealand than it would have done in the UK thanks to economies of scale. New Zealand also uses fewer pesticides and less fertiliser. A food supply chain reliant on locally sourced produce can be flawed. If consumers wish to continue enjoying a wide selection of food, regardless of whether the produce is in season or not, at an affordable rate, then imports are fundamental.

However, there are also economic and moral arguments against exports. It is harmful for a country’s economy not to support local farmers, especially when cheaply imported food forces them to lower their prices. The low costs of imported food suggest producers compromise on workers’ wages and labour conditions.

Import vs export 
Britain imports 46 percent of its food, leaving it at risk from volatility in foreign markets. If food production in external markets were to slow down, the UK would only be able to feed half its population.

Professor Benton considers efficiency to be the key: “If there is a disruption to one key ingredient, say lentils, should we be growing lentils, recognising that we would not be a competitor on a global market for growing lentils… or should we be finding market alternatives that are sustainable?”

China is experiencing the opposite problem: as a nation, it needs more food than it is able to produce. Much of China’s farmland has been industrialised as part of the rapid urbanisation of the country, so it is unable to produce enough food internally. Fuelling the need for imports is an increasingly affluent population that wants to purchase an eclectic selection of produce.

A Chinese farmer harvesting wheat. A shortage of farmland has caused food prices in China to double in 10 years

Janet Larsen, Director of Research at the Earth Policy Institute, focuses on the population problem. She says: “With more people in the developing countries trying to move up the food chain to consume more meat, milk and eggs – and thus more grain embodied within – we are starting to see our food supply stretched tight”.

Traditionally, the Chinese have consumed a lot of pork. However, the emerging middle classes are diversifying their meat consumption in line with western preferences, so there has been a significant increase in demand for poultry and beef. Farming beef requires more than twice as much grain as farming pork. To produce this grain, China needs more farmland than it has; the only solution is to import, since local farmers are struggling to keep up with demand. This shortage, according to the UN Food and Agriculture Organisation, has caused food prices to double in the last 10 years.

Getting food to mouths
In the developing world, the problem is not so much a lack of space to produce food as a lack of resources to store and transport it to the consumer without it becoming unsanitary and going to waste. One cause of wasted food is pests. In the developing world, up to 25 percent of crops can be lost to pests. This problem will worsen as pests become resistant to pesticides.

In the developing world, up to 25 percent of crops can be lost to pests

In 2009, a new species of caterpillar brought Liberia and neighbouring Guinea to the point of crisis. While some pesticides might be effective in tackling this problem, consumers do not want to eat crops that have been treated with them, as they are seen as a health risk and environmentally damaging.

Undeveloped infrastructure in developing nations also causes inefficiency and mass waste. One of the most straightforward yet most devastating problems is access to markets: if farmers cannot bridge the gap between production and consumer, the produce goes to waste. This is a particular problem in Nigeria during the rainy season, when transport links are crippled.

The Nigerian government is attempting to solve this problem through the launch of the Agriculture Transformation Agenda. The aim is to boost food production by 20 million tonnes per year and create mass employment in processing industries. The agenda was launched in 2011 and, so far, has successfully focused on modernising infrastructure to improve market access.

However, in the more rural areas of Nigeria, it is a five-hour drive along dirt tracks to the nearest city – and even this is nearly impossible during the rainy season. Similarly, poor refrigeration facilities and limited access to sufficient storage during transportation mean the food is not suitable for consumption and is wasted. If transport links were improved, more of the food produced would reach the consumer, creating a more profitable and efficient industry.

Is food waste the fault of the consumer? 
In developed countries, the problem is waste caused by the consumer rather than the producer. In the UK, consumers waste on average 15 million tonnes of food per year, of which at least 4 million tonnes is edible. In a recent study, Tesco showed that 68 percent of its bagged salad is thrown away, along with one in every 10 bananas. In May last year, the supermarket launched a campaign against food waste after discovering that every family in the UK throws away £700 worth of food every year. Tesco released the results of an accompanying study showing the supermarket generated around 30,000 tonnes of waste in the first six months of 2013.

In developed countries, the problem is waste caused by the consumer rather than the producer

Mary McGrath, CEO of FoodCycle, a charity that collects surplus food from supermarkets and distributes it to vulnerable people, was unsurprised by the results. She said in a press release: “We have never received any surplus from Tesco… we of course welcome any report that enables food waste to be discussed openly”.

To combat this waste problem – and save the consumer hundreds of pounds – supermarkets are being encouraged to stop multiple purchase offers and to sell produce in smaller quantities. While this would tackle the problem of waste, it would create further environmental problems within the food industry because of the increased requirement for packaging, along with the energy needed to recycle it or the waste created by disposing of it.

A simpler solution would be for consumers to eat more of the food they currently throw away – especially considering much of it is still edible. Many consumers do not know the difference between “use by” and “best before” labels, for example, and so create unnecessary waste. Either the consumer needs to be educated or the use of “use by” dates should be reconsidered.

The good in GM food

Protesters gather transgenic soybeans. The destruction of similar GM products by activists has been denounced as “wicked”
Rice terraces in southeast Asia. Rice genetically modified to contain vitamin A could save hundreds of thousands of lives

Genetically modified solutions 
Organisations such as the Waste and Resources Action Programme (WRAP) are coming up with solutions to prevent food waste that do not involve increasing the amount of packaging. In January last year, representatives of the UK food industry joined forces to explore and support initiatives to limit food waste. These included donating surplus food to charities such as FoodCycle and FareShare, and encouraging supermarkets to turn surplus food into animal feed or recycle it into renewable fuel.

Andy Dawe, Head of Food and Drink at WRAP, said: “Preventing waste arising not only saves money in tough economic times but also provides environmental savings. Where there is a surplus of food, it is important to make sure it’s being used in the best possible way.”

While these solutions might be appropriate in the short term for a developed nation, the only worldwide way to make the food supply industry more efficient is the (controversial) genetically modified crop. GM crops can be engineered to resist pests, eradicating the need for pesticides and thus limiting the associated health concerns.

GM crops are also resistant to adverse weather conditions, can be adapted to contain the nutrients necessary to tackle malnutrition, and remain edible for longer. The downside is that, over time, GM crops have the potential to drive pests to evolve resistance.

This debate has recently been forced into the spotlight because of ‘golden rice’: rice genetically modified to contain vitamin A. It is believed this GM crop could save the lives of over 600,000 children a year in the developing world and prevent many more from going blind. However, batches of golden rice being readied for testing in the Philippines were destroyed by protesters who said the claimed benefits were misleading – an act labelled “wicked” by Environment Secretary Owen Paterson.

Aside from GM crops, the way to tackle waste in the food supply chain is to address the problem at the source and in distribution. With a more efficient distribution system and intergovernmental agreements in place, the transportation of food on a large scale can be streamlined. And for that to happen, the world’s supermarkets need to get on board.