The rapid evolution and personalisation of the ‘desktop’

What used to be confidently called the ‘desktop’ is undergoing the sort of rapid evolution bound to throw up new and unfamiliar security challenges. Technological developments such as smartphones, tablets and mobile operating systems partly explain this change.

However, it is to the humble user, rather than to computer architectures or network topologies, that we must pay the closest attention if we are to understand how the business desktop will be reshaped from the ground up over the next decade.

Put simply, employees are downloading and accessing a host of ‘grey’ mini-applications, services and browser plug-ins on a sometimes-industrial scale to run in parallel with traditional software licensed or developed to do the everyday work of a business.

As well as introducing a high degree of uncertainty and risk, this turns the established model of software deployment on its head. Where once IT staff decided what ran, now employees have been handed the discretion to run what they fancy.

What use is policy?
Organisations might want to ban alien applications and social media plugins, but they are also aware that some of these services and applications are part of longer-term industry changes that can also generate new possibilities for a business. Can a way be found to reconcile the two worldviews?

Most organisations have a written computer usage policy to define authorised behaviour, which in specific instances will be enforced with an extra layer of technology to control which applications can run on a PC or open a port through the firewall. That offers certainty but is a blunt instrument that fails to address a range of underlying issues.

What happens if users misunderstand, forget or ignore the policy, or are simply socially engineered into installing risky applications? Can organisations any longer rely on mere usage policies to form a reliable part of their compliance stance? In any event, can applications be efficiently managed if IT staff lack reliable tools to perform simple discovery and control on a continuous basis?

One powerful and flexible tool with which to impose order on the chaos is a privilege management system such as Avecto’s Privilege Guard. Technically, privilege management is a way of controlling applications that demand admin rights under Windows to function, a legacy programming model that presents obvious security risks.

Where once IT staff decided what ran, now employees have been handed the discretion to run what they fancy

Using such a system in a least-privilege setting offers a way of blocking harmful applications (which often ask for admin rights to gain control of a target) while allowing ‘standard’ users to elevate these privileges according to pre-defined policies. But it doesn’t stop there.

Privilege management systems also come with a discovery and auditing function that admin staff use to assess the type of applications and rights used on a network over time; this provides a neat starting point from which to create a digital usage policy to then replace the written protocols.

Bad management and the grey area
Once armed with a comprehensive picture of which applications are being used and under what conditions, the next stage is to divide applications into categories according to risk or their use to the business.

Leaving aside the hopefully small number of dangerous applications, there is no simple answer as to which applications and services should run, and which shouldn’t. Suffice it to say, this is a grey area that demands the attention of IT teams and staff consultations. Imposing a digital usage policy from ‘on high’ is bad management.

A particularly difficult example is that of social media applications. For staff in one department, these may offer no concerns to the business, while in another one down the hall data security issues would make unguarded use unthinkable.

Other examples are consumer cloud storage services such as Dropbox, which has risen to prominence for the way it allows users to work with data files across multiple types of ‘desktop’ – whether PC, smartphone, tablet or even home computer – without resorting to insecure flash drives. Many businesses without private clouds are keen to access such services, but worry about the risk to data accessible from multiple systems using uncertain authentication and remotely managed encryption, with no auditable compliance to speak of. Assessing where the limits lie with such services can be complex.

Installing without warning
Adopting privilege management concepts will not necessarily offer a complete solution, thanks to a growing band of apps – Windows 8 ‘Metro’ apps for one – that install without asking for elevated rights.

Granted, Microsoft’s design improves on the mistake of creating applications that require privileges and end up being funnelled inefficiently through Windows User Account Control, but leaves hanging the question of whether even standard user apps should be allowed in the first place.

The challenge of Windows 8 apps is that the number of possibilities increases from the few dozen usual suspects found in today’s desktop environment to potentially thousands or even tens of thousands.

A clear answer could be application whitelisting (allowing a pre-defined group of applications), or its twin: blacklisting (disallowing specific applications). As far as Windows 8 is concerned, Microsoft provides tools to manage Windows Store apps through AppLocker Group Policy. However, privilege management systems will also do the same job in a way that then integrates with broader application management requirements.

Because it is impossible to authorise each and every app dynamically, the best way to proceed is to define a family of acceptable apps using whitelisting, updating this policy as regularly as practical.
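This logic lends itself to a short illustration. The Python sketch below shows the whitelisting idea in its simplest form – a default-deny check against a pre-defined family of acceptable apps, with an explicit blacklist layered on top – and is purely illustrative: the app names, publishers and policy structure are hypothetical, and it does not represent Privilege Guard, AppLocker or any other product’s implementation.

```python
# Illustrative sketch only: a toy whitelist/blacklist check, not any vendor's policy engine.
# The app names and publishers below are hypothetical examples.

from dataclasses import dataclass


@dataclass(frozen=True)
class AppRequest:
    name: str
    publisher: str


# A 'family of acceptable apps', updated as regularly as practical.
WHITELIST = {
    ("Contoso Expenses", "Contoso Ltd"),   # hypothetical line-of-business app
    ("Fabrikam Notes", "Fabrikam Inc"),    # hypothetical productivity app
}

# Specific applications that are always disallowed.
BLACKLIST = {
    ("FreeScreensaverDeluxe", "Unknown"),  # hypothetical risky download
}


def is_allowed(request: AppRequest) -> bool:
    """Return True if the app may run under the digital usage policy."""
    key = (request.name, request.publisher)
    if key in BLACKLIST:
        return False            # explicitly disallowed
    return key in WHITELIST     # anything not whitelisted is blocked by default


if __name__ == "__main__":
    print(is_allowed(AppRequest("Contoso Expenses", "Contoso Ltd")))   # True
    print(is_allowed(AppRequest("FreeScreensaverDeluxe", "Unknown")))  # False
```

The default-deny return at the end is the point of the exercise: a whitelist policy only works if everything outside the approved family is blocked until it has been reviewed and the policy updated.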

The specific example of Windows 8 apps underlines the importance not simply of auditing the applications being used, but of doing the same for the policy itself. Digital policies should never become fixed in stone; a good policy is always as current as possible.

The conclusion from all this is that the new desktop is dynamic, fast-evolving and defined as much by what users do as what IT vendors deem to be useful. The user is now in control of the organisation’s destiny and IT teams need to adapt. That’s a huge change that asks not only for a new mind-set, but the tools to make such a world possible. What admins can’t do is cling on to the past and its fading certainties.

Layers of potential

For graphene

Researchers at the Massachusetts Institute of Technology (MIT) have recently announced the discovery of yet another fascinating property of graphene. The carbon-based material generates a large number of electrons when it absorbs light, making it a viable material with which to manufacture photovoltaic cells.

When researchers at the University of Manchester tried isolating graphene in 2004, it was thought to be too unstable to work with. Not only were they successful in isolating the material, it turned out to be so useful and fascinating that the researchers (Andre Geim and Konstantin Novoselov) were awarded the Nobel Prize for Physics in 2010.

Outstanding mechanical, optical, thermal and physical properties make this material very promising indeed

Graphene is only one atom thick, but it possesses a variety of intriguing properties. The carbon atoms are arranged in a honeycomb lattice, in a single-atom layer, which makes it a great electric conductor. Its outstanding mechanical, optical, thermal and physical properties make this material very promising indeed.

Though research is still underway, graphene’s remarkable diversity of properties means it is an attractive material for flexible electronics. Byung Hee Hong and Jong-Hyun Ahn, researchers at SKKU Advanced Institute of Nanotechnology, have already started experimenting with stacking layers of graphene to produce a film “with properties superior to those of commercial transparent electrodes such as indium tin oxides”.

Because the material absorbs only 2.3 percent of the light that hits it and conducts electricity easily, sandwiching a layer of liquid crystals between two graphene sheets creates a ‘smart’ LCD screen, with light travelling between the electrodes. Graphene has been used as a ‘Hall-bar device’ for sensitive electronic detection, but its photovoltaic capabilities are probably its most commercially appealing property.
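The oft-quoted 2.3 percent is not an arbitrary number: for a single suspended layer of graphene, the absorption of visible light is set by the fine-structure constant α, a well-established result worth spelling out (values rounded).

```latex
% Graphene's universal optical absorption for a single suspended layer
\[
A \approx \pi\alpha \approx \pi \times \tfrac{1}{137} \approx 0.023 \quad (\text{roughly } 2.3\%)
\]
```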

Conventional materials with similar properties usually turn light into electricity at the rate of one electron per photon absorbed. A photon, however, often carries more energy than a single excited electron can make use of, so a lot of the light energy is lost as heat. Graphene appears to be able to generate several electrons per photon.
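To make the heat-loss argument concrete, here is a rough, illustrative energy budget for a conventional one-electron-per-photon material. It assumes a silicon cell with a band gap of about 1.1eV absorbing a blue photon of about 3eV; the photon energy is chosen purely as an example, not a figure from the MIT work.

```latex
% Excess photon energy above the band gap is lost as heat in a one-electron-per-photon cell
\[
E_{\text{lost}} = E_{\text{photon}} - E_{\text{gap}} \approx 3\,\text{eV} - 1.1\,\text{eV} = 1.9\,\text{eV}
\]
```

On those numbers, nearly two-thirds of the photon’s energy ends up as heat, which is why a material that appears to liberate several electrons per photon is so appealing.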

Graphene has another advantage over other photovoltaic materials like silicon and gallium arsenide; its optical properties mean it is a unique conductor. “Graphene can work with every possible wavelength you can think of”, explains Andrea Ferrari, Professor of Nanotechnology at the University of Cambridge. “There is no other material in the world with this behaviour.” This is exciting news for the solar energy industry, though it might still be a while before graphene photovoltaic panels come to be.

The research community is certainly very excited about graphene. As well as the Nobel Physics Prize won by Geim and Novoselov in 2010, and the research being carried out at MIT, a team led by Jari Kinaret from Chalmers University of Technology in Sweden has just been awarded €1bn by the EU to further their research. With that sort of backing, graphene will be a commercially viable material in no time.

Against graphene

Since its discovery by Konstantin Novoselov, Andre Geim and other researchers at the University of Manchester in 2004, much excitement has been stirring in the science world as talk of the potential uses of graphene grows. Many have heralded this tiny, one-atom-thick layer of carbon as the next big technological step forward, with uses as diverse as flexible and transparent electronics, protective coatings, solar cells, architectural designs and advanced transistors.

The resilience that comes with a material made from a hexagonal lattice of carbon atoms means designers have been desperate to introduce graphene as a commercially viable material in the production of a range of items. However, realising this dream has proven harder than expected. Despite the material being discovered almost a decade ago, we are yet to see any mainstream products made from it.

That it has been almost a decade without any real headway means much more research is needed

Much of the research seen so far has centred on mechanical exfoliation as a means of producing the best forms of graphene, but it is both costly and time-consuming. For large-scale manufacturing, it is simply not yet practical. There is also concern that graphene transistors are not ready for production, mostly because graphene lacks a band gap – essential for switching a transistor off – and a solution is not expected for another decade or so.

Even the material’s discoverers have recently written about the challenges facing the development of graphene as a widely used technology: specifically, whether the advantages it brings offer enough benefits to replace currently used materials.

Novoselov and other members of his research team published a paper titled “A Roadmap for Graphene” last October, spelling out their concerns about the commercial viability of the material. They conclude that, instead of graphene replacing other materials already actively in use in product design, it should be used in applications specifically designed to benefit from its unique properties.

Novoselov said graphene has the “potential to revolutionise many aspects of our lives simultaneously,” but that, while some products might be ready within a few years, depending on the quality of graphene required, “Different applications require different grades of graphene and those which use the lowest grade will be the first to appear, probably as soon as in a few years. Those which require the highest quality may well take decades.”

The current range of applications for graphene seems rather narrow. According to Novoselov and his team: “Graphene is a unique crystal in a sense that it has singlehandedly usurped quite a number of superior properties: from mechanical to electronic. This suggests that its full power will only be realised in novel applications, which are designed specifically with this material in mind, rather than when it is called [upon] to substitute other materials in existing applications.”

Graphene is clearly a remarkable discovery that should, in time, transform the design of a number of different products. However, the fact that almost a decade has passed without any real headway being made means much more research is needed to bring down the cost and improve the stability of this tiny piece of carbon.

Pioneering changes and setting standards for solar technology

In the past five years, solar energy has undergone a dramatic transformation: one that has taken it from an interesting technology to a viable and efficient energy source. Technological advances have allowed grids to overcome the intermittent nature of solar power, and redistribute it to reduce waste and bring down energy costs.

Smart energy provider Petra Solar has worked hard to energise the industry with the introduction of reliable and smart solar systems in a distributed way, as well as integrating them with new technology, such as smart and micro grids. Dr Hisham Othman, Chief Technology Officer at Petra Solar, discusses new technology, new uses of energy and intelligent grid solutions.

As cities grow and our energy needs increase, how do grids change?
As we look at grids around the world, we find that some have kept up with demand growth – for example, increased population and consumption – while many have failed to do so. This problem will be further exacerbated as population and consumption continue to grow.

However, we have seen encouraging technological breakthroughs that have brought advancements in competitively priced sources for clean energy, added consumption controls, and the ability to exercise those controls. The grid has had to adapt not only on the business level, but also on the physical level in the way it interacts with these new energy sources.

How do smart grids interact with the increased flow of renewable energy into urban grids?
Grids have to evolve from the existing paradigm of trying to accommodate renewable energy sources to a new model of embracing them and truly integrating renewables as legitimate and fundamental resources of the energy grid. This integration of renewables into the smart grid will drive the need for additional monitoring, control and flexibility in the way the grid manages the two-way power flow.

Truly smart grids will fundamentally push intelligence downstream to various places in the grid instead of being managed centrally. We are seeing more distributed controls, which enable customers to be active participants in balancing energy distribution. We will see controls within renewable electricity generation devices themselves, so that they interact intelligently, and are not just dumping energy into the grid, but are actually sensitive to grid conditions.

Micro-grids have the potential to be the driver of fundamental change in the whole utility industry

Advanced controls will transform solar panels into smart solar systems. We will also see smart homes that will interact with both the physical and economic layers of the grid to shift and reduce energy demand as and when needed. This modernisation and automation of electric power networks will provide a model for the utility of the future as well as the consumer. There will be more intelligence throughout the grid, not only on the side of utility companies but also with consumers and other energy sources that interact with the grid.

What are the current and future trends in terms of investments? 
We see increasing communication and demand response investments by utilities. We also see investments in managing data flows to gain operational intelligence that will enable informed decision making that leads to operational efficiencies.

We also see that investment is needed to fully integrate renewables and manage the intermittent flow of those sources, especially in high penetration areas where renewables have become a significant part of the energy mix.

How have Petra Solar’s products evolved over the years as sustainable energy technology has changed?
We have beefed up our power electronics at the fundamental level to really make solar a smart technology. Smart solar interacts seamlessly with the grid and enables very high penetration of renewables. Petra Solar has also integrated upwards by moving from just being a technology provider to a solution provider. We moved into areas that enable not only distributed solar energy, but also energy efficiency and micro-gridding.

Micro grids have the potential to drive fundamental change throughout the utility industry. For example, micro grids are valuable when combining renewable resources to meet the needs of one area or providing differentiated levels of reliability within an existing grid.

A micro grid is a small grid, either embedded within a larger grid or operating independently, that is able to meet the demand of a community by integrating multiple resources in a reliable way. That will ultimately lead to the democratisation of energy: the ability of consumers, at a residential or industrial level, to provide part of their own energy needs in a reliable way.

The smart energy model Petra Solar invented and introduced into the market provides solar energy in a reliable way and integrates that energy within the grid function so that it is a reliable energy source. This has simplified the way solar can be installed into the grid. It seamlessly integrates without having to remedy the intermittency issue that comes with traditional solar energy.

How do Petra Solar’s products interact with smart city grids in order to work towards greater energy efficiency?
The solutions we provide to the smart city not only include solar panels, but also a two-way communication system that enables a conversation between the grid and the solar modules. We also provide the software tools to monitor and manage that energy portfolio.

Our solar systems drive energy efficiency in the sense that their production peaks coincide with the times of day when energy demand is highest and therefore obviate the need for expensive peaking energy resources. Efficiency is also gained when distributed solar systems within the smart grid supply the energy needs of homes, thus reducing the losses of the transmission and distribution grids.

The two-way communication network enables the proper information flow and real-time controls between the smart grid and the consumption centres to balance the needs for cost efficiency and reliability, while providing high levels of consumer satisfaction.

For example, we introduced intelligent lighting controls for street lighting, which reduce energy consumption costs and demand by turning lights off during off-peak hours. This principle of controlling demand can extend further into reducing the demand from houses and commercial property.

How will energy users be able to make their consumption more efficient?
This is an area that is evolving and it will have a huge impact on the structure and economics of the grid of the future. Demand response and control can only be activated through the mutual dissemination of timely knowledge about the needs and constraints of the grid and of demand, through a robust and secure two-way communication network.

This enables users to make intelligent decisions about their consumption. They can basically become an active – rather than a passive – part of the energy mix, and they can play their natural role as consumers. They move from being just demand to being true consumers in the energy process, in an economic sense.

What are your most significant ongoing projects?
One of our projects in the MENA region is a partnership with the Bahrain oil company Bapco to develop and install the Bahrain Smart Community Project in the town of Awali and at Bahrain University.

This project will be a showcase of distributed solar energy in the Gulf. It demonstrates how large-scale distributed smart solar can actually become a viable and reliable part of the energy mix in places with lots of oil and gas. The project also demonstrates that the challenges of energy security, climate change and economic development can be solved through global partnerships and collaboration.

The changing face of mining sector investment

A recent report on mergers and acquisitions (M&A) in the mining industry predicts a significant rise in activity from non-traditional investors in 2013. The sharp increase in deals is expected to come from private capital, state-owned enterprises (SOEs) and sovereign wealth funds (SWFs). These state-backed and private financial groups have shown steady growth in interest and were responsible for 31 percent of the deals made in 2012: up from 21 percent in 2011.

Traditional investors, who have been responsible for the majority of M&A deals since 2009, have become less willing to invest and the value of transactions reached a five-year low of $2.9bn in 2012. They have been inclined to take smaller, non-controlling stakes in companies and are interested in shorter return timeframes.

Mike Elliot, Global Mining and Metals Leader at Ernst & Young, said: “The funding gap is being filled by private investors and SOEs, who may not be dislodged from their newfound positions once the cautionary investment environment recedes and traditional investors return to the sector.”

A new deal
The restricted funding conditions mean smaller mining companies with development assets will be more inclined to carry out deals with new investor groups in order to finance project construction. Paul Murphy is Australia and Asia-Pacific Mining and Metals Transactions Leader for Ernst & Young, publishers of the report. He said: “They have not got much of a choice right now. Either they raise equity and get diluted, they merge, or they get taken out.”

SOEs are expected to be the biggest of the new wave of backers in 2013, with China the dominant force, driving long-term aspirations to feed the growing demand for metals in its domestic market. China’s main activity has been in Africa, which makes up 75 percent of its foreign mining investment.

In 2012, China was the largest acquirer of African frontier markets and accounted for two-thirds of the growth in demand for steel and aluminium worldwide: as well as being the biggest importer of iron ore and coal, according to research by Bloomberg. It is active in other mining markets and, in February, China’s largest state-owned investment company, Citic Group, paid about $452m for a 13 percent stake in Australia’s Alumina Ltd mining company.

Richard Tory, Head of Natural Resources for Morgan Stanley’s Asia-Pacific region, said: “With stronger and bigger Chinese players emerging, we could see a significant pickup in the volume of overseas acquisitions.” SWFs, particularly those from the Middle East, are also investing heavily in African mining, with the aim of diversifying their portfolios and becoming less reliant on oil and gas revenue. They have been welcomed into Zimbabwe by the Movement for Democracy group, which hopes the activity will help generate capital that can then be redistributed among the population.

The government in Western Australia launched an SWF last year to take long-term benefits from the boom currently provided by the mines in the area. The fund was paid for from $1.1bn of government mining royalties. West Australian Treasurer Christian Porter said he hoped the fund would be worth $4.7bn by 2032.

Interest grows
It is predicted there will be continued growth in interest from non-traditional private capital and hedge funds in 2013. Unlike other new investors, they are putting venture capital into junior mining companies in order to secure small, non-controlling stakes. They hope to profit from commodity prices, which they believe will improve alongside the global economy. According to Ernst & Young, 80 percent of the deals made by private capital and hedge funds in 2012 were for non-controlling interests.

The volatility of commodity prices led to a depression in miners’ share value

Bloomberg Industries’ Global Head of Metals and Mining Research, Ken Hoffman, spoke about the different attitude new private investors are bringing. He said: “Private-equity funds have a completely different outlook on the mining industry from traditional investors. The most important thing for a private-equity fund is locking in guaranteed cash flow from its investment… this means they may look to acquire strategic stakes in companies hedging output…  in 2013 and beyond, you could see a complete transformation of this industry into something that is a product produced for cash flow.”

David Baker, Managing Partner at Baker Steel, believes mining companies have to shoulder some of the blame for the drop in interest from more established backers in 2012. Using the gold industry as an example, he believes businesses are confusing investors by publishing reserve and production figures in gold and then benchmarking against the greenback, which has been depreciating.

Baker said: “Mining companies need to restore trust and give more clarity… we believe the mining companies should be consistent and report in gold. This would then give investors a clearer picture on how much gold it is costing to mine the resource and how many ounces of gold are added to the shareholder vault.”

Losing value
The downturn in majority takeovers has been felt heavily by the industry, leading Bloomberg to speculate that miners are paying the price for the past decade’s $1.1trn surge in deal-making. About $50bn has been wiped off the value of projects at the biggest mining companies in the last year. Rio Tinto announced it is taking a $14bn impairment charge, $3bn of which came from its failed coal-mining project in Mozambique, nearly wiping out the $3.7bn it paid in 2011.

A spokesperson for the company said: “In Mozambique, the development of infrastructure to support the coal assets is more challenging than Rio Tinto originally anticipated.” The remainder of the write-down is from its $38bn acquisition of Alcan in 2007. The group’s CEO, Tom Albanese, bore the responsibility and has been forced to step down.

Elsewhere, Barrick Gold has said it is taking a $4.2bn write-down on the $7.3bn it paid for the Lumwana Copper mine in Zambia. Evy Hambro, Manager of BlackRock’s World Mining Fund, said: “Companies are now starting to come clean with many of the mistakes they’ve made over the last few years… It wouldn’t surprise me to see more write-downs.”

A number of CEOs have shared the same fate as Albanese in the early months of 2013. Cynthia Carroll gave in to criticism from shareholders at Anglo American in relation to cost overruns that led to a $4bn impairment charge at the company’s flagship iron ore project in Brazil.

Marius Kloppers of BHP Billiton stepped down after the company reported a fall of 58 percent in half-year profits. BHP firmly blamed “substantially lower” commodity prices and rising operational costs for the slump. Ernst & Young’s Director of Transaction Advisory Services, Michael Bosman, reinforced this idea. He said: “[The] value of financing, raised last year, fell as traditional investors reduced their exposure due to softer commodity prices.”

Many believe the volatility of commodity prices led to a depression in miners’ share value, in turn causing many risk-averse buyers to offer less. The report says: “Sellers were unwilling to accept lower valuations based on their depleted share prices in 2012, on the grounds that this unfairly reflected near-term uncertainties, rather than the long-term potential of their assets. Consequently, negotiations are taking longer and becoming more complex, resulting in sluggish M&A at best.”

Investments from non-traditional parties are expected to re-ignite interest from the more traditional groups in the coming year. According to Ernst & Young’s report: “In the longer term, further investment is likely to follow from traditional investors as the impact on the region’s infrastructure from the investments made by Chinese SOEs and sovereign wealth funds filters through, translating into future M&A opportunities.”

The vast potential of UK-based tidal energy

The UK marine energy sector is experiencing big developments. The government’s energy bill has created new funding measures, and corporate investment has emerged just as new tidal technology testing is underway.

Trials of a new commercial-scale turbine at the European Marine Energy Centre (EMEC) in Scotland commenced in January as part of the Energy Technologies Institute‘s (ETI) Reliable Data Acquisition Platform for Tidal (ReDAPT) project. The £12.6m project was announced in 2009 and is due for completion mid-2014, when the turbine’s testing period ends.

Deployment of this project has been warmly welcomed, as until now the marine energy industry in the UK has developed on the back of government grants and some venture capital investment. This kind of funding is not sustainable in the long term due to the lengthy gestation period of the technology, although there is now growing confidence that the machinery is close to maturity.

Wave potential
ETI has relied on financial support from its six private-sector partners: BP, Caterpillar, EDF Energy, E.ON, Rolls-Royce and Shell. Its public funds are received from agencies such as the Department for Transport and the Technology Strategy Board. Its latest turbine installation was carried out by Alstom, a French multinational conglomerate that recently acquired tidal turbine designer and manufacturer Tidal Generation Limited (TGL) from Rolls-Royce.

Larger corporate companies, such as Alstom and Siemens, have invested in similar technologies as they align with their corporate strategies and, more importantly, the businesses have the funding and enduring investment timeframes to support the projects.

Alstom’s new 1MW turbine was installed at the EMEC tidal test site in Orkney at the end of January. It was mounted on the same tripod support structure used to deploy the previously tested 500kW device, where it will remain for an 18-month trial period. There are high hopes for this machinery, as TGL’s 500kW turbine generated 200MWh for the national grid between September 2010 and March 2012, demonstrating great potential.

Jacques Jamart, Senior Vice President of Alstom New Energies, said: “This new milestone installation in the development of tidal power generation technology is a step further towards the commercialisation of this new power solution.”

Tidal… is an inexhaustible supply of energy, is as predictable as the turning of the tide [and] requires no fuel

The aim of the ReDAPT project is to increase confidence in turbine technology. The current test will provide a wide range of environmental impact and performance information, while demonstrating the efficiency of the latest turbine design. The introduction of the turbine will demonstrate a new, productive and reliable design.

It boasts features that are as impressive as they are effective: an 18-metre rotor supports three pitchable blades, and the 150-tonne machine is buoyant, keeping installation and maintenance costs to a minimum. It operates at depths of 40 metres, with the ability to rotate to face the incoming tide at the optimal angle to extract the maximum energy potential.

Jérôme Pécresse, President of Alstom Renewable Power, said: “This technology will optimise electricity production, limit maintenance constraints, and thus will help reduce the cost of electricity of this renewable energy source.”

It is anticipated that this, in turn, will accelerate deployment of marine energy technology in the UK, as there are currently many unexploited marine energy resources around the country’s coastline. The next step is to install pilot arrays prior to full commercial production.

This subsequent phase seems more promising, with the recent acquisition of financial support from the Department of Energy and Climate Change. A spokesman said: “We’ve more than doubled support for wave and tidal technologies from April 2013 under our Renewables Obligation scheme, and committed £28m of government support for testing facilities. We’ve also helped develop Marine Energy Parks, which are bringing together manufacturing and expertise to generate new innovation and accelerate the move to commercialisation.”

An additional development fund of £20m for low carbon technology has also been awarded to MeyGen and SeaGeneration Wales to test their innovative turbines in formations out at sea. Climate Minister Greg Barker said: “These projects will provide valuable insight into how best to harness the power of the sea and take us one vital step closer to realising the full potential of marine in our future energy mix.”

Jobs and power
MeyGen is using the investment to deploy a commercial array of 1.4MW tidal turbines, developed and supplied by Andritz Hydro Hammerfest, at the Inner Sound site in the Pentland Firth, off the northern Scottish mainland. These Scottish locations prove popular testing sites, as Scotland boasts 25 percent of Europe’s tidal energy potential, with the Pentland Firth and Orkney Waters hosting the world’s first commercial-scale leasing ground for marine energy.

The results from MeyGen’s turbines will be telling; Andritz Hydro Hammerfest’s pre-commercial 1MW turbine was installed at the EMEC site in December 2011, but did not deliver any energy to the grid until February 2012. MeyGen has also applied for Scotland’s Saltire Prize Challenge – a clean energy grant worth £10m – to accelerate the commercial development of wave and tidal energy technology.

All this investment will be beneficial, as tidal turbines not only have the potential to deliver a significant portion of UK electricity – up to 2GW of deployment by 2020 – but the sector could be worth £6.1bn to the country by 2035, creating nearly 20,000 jobs.

Tidal appears to be a sensible energy resource choice for the UK due to the abundance of shoreline available to the island nation. It is an inexhaustible supply of energy, is as predictable as the turning of the tide, requires no fuel supply and has a smaller carbon footprint than many other energy sources.

But according to a recent RenewableUK report, the strike prices for the new Contracts for Difference regime in 2017 pave the way to more tribulations for the marine energy industry. To attract more investors to commercial-scale tidal arrays, developers will need a guaranteed strike price of £280-300/MWh by 2020, but realistically the prices may be as high as £320/MWh. This would look far less favourable to investors than the current estimated targets of £100/MWh for offshore wind and nuclear energy by 2020.

Emerging results from current tests may give more of an indication of whether tidal energy stands a chance of competing for investors against other renewables. However, EMEC’s Managing Director, Neil Kermode, has himself admitted that the company is facing challenges with grid capacity and connection charges, and has stressed the importance of the momentum needed to overcome these obstacles.

For marine energy to be lifted from its status as an esoteric concept, the momentum Kermode is talking about needs to come from ongoing investments, with potential future backers following suit. Without them, the new tidal technological developments will be left stagnant and the UK’s greatest potential source of renewable energy will continue to be left untapped.

Brazil’s growing dependence on hydropower-based energy

How is the renewable energy sector in Brazil developing?
Brazil has an energy matrix strongly based on renewable resources, with around 80 percent produced by hydroelectric power plants. Brazil is one of the few countries in the world to have such a high percentage of energy coming from water resources. While this is a positive aspect, important challenges arise in its operation, especially regarding energy security goals.

Challenges arise from Brazil’s continental proportions: the country needs a wide-ranging transmission network – in other words, far-reaching transmission lines able to bring energy from where it is generated to the central grid, balancing high-rainfall zones against regions with lower rainfall.

However, the latter is an old challenge and the country has already learnt how to deal with it. In my analysis, the sector’s primary issue at the moment is to ensure energy security goals are accomplished. Here, environmental rules have been changing the way hydroelectric power plants are built.

Because of the requirements for obtaining environmental licences and permits for those projects, energy generators have been reducing the volume of dammed water for each project, frequently opting for ‘run-of-the-river’ stations, and that has been leading to system instabilities.

In parallel, the country, after excessively concentrating its matrix on that kind of resource, currently lacks the diversity in its electrical system to compensate for those new smaller dams and the boom-and-bust cycle in rainfall. This is reflected in significant ups and downs in both pricing and energy availability.

Another key issue is to reach those energy security goals by using cleaner energy resources. After all, if the country’s matrix growth is to rely on those smaller hydroelectric power plants, it must have alternative and complementary resources that can offset any lack of water supply.

Eolic energy is currently one of the few activities that supports the local economy. To Brazil’s north-eastern region, a landscape with windmills is a sign of prosperity and opportunity

In this situation, eolic energy gains relevance within the Brazilian matrix, potentially standing out as a complementary resource. There is also an interesting correlation between wind and rainfall: generally speaking, when it rains, wind flow lowers, and when rainfall is scarce, the wind flow is stronger. In other words, when a windmill generates energy, it can replace hydroelectric generation.

Brazil already understands how important the complementary relationship between those two energy resources is. It is no coincidence that Empresa de Pesquisa Energética, the government’s energy research and planning agency, foresees eolic power having the highest growth rate over the next few years.

Is the government doing enough to support renewable energy firms, and is the industry in need of subsidies?  
Eolic energy development in the country might be taken as a success story. In the late 1990s and early 2000s, eolic energy was virtually non-existent here. The scenario has quickly changed and developed. Nowadays, Brazil already figures as one of the world’s main markets for this industry, mostly thanks to the government’s position: doing as much as possible to foster a thriving eolic industry in the country.

In terms of solar energy, shared generation is already manageable, and specific rules and legislation have been created for it. But the roll out of this model is still very slow.

The incentives led to consistent savings on energy bills, and the installation of photovoltaic panels was a spontaneous move by end users. The problem with this strategy is that the advantages provided by the system, as well as the actual savings on bills for the average Brazilian, are part of a long-term process, and customer engagement tends to be slow.

Large-scale solar energy projects, small hydroelectrical power plants and biomass plants all depend on specific public auctions to grant concessions. If that doesn’t happen, they get stuck in the queue, as they are not as competitive as other resources.

Renova Energia has recently moved into the solar energy market. What opportunities do solar resources offer your business? 
Energy generation from solar resources is still incipient in Brazil, even with solar irradiance levels that reach 6.8 on the measurement scale (a level second only to that found in the Sahara desert). However, we believe the potential for shared solar solutions reaches about 5-10GW.

Even so, it’s an industry that is going to advance gradually. I don’t think there is going to be a boom in demand. Renova is positioning itself in this particular market, closing sales as an important player in the shared energy segment.

Looking at sales in the large-scale solar energy market, which has been growing in global importance, we see good opportunities in the future. In Brazil, solar photovoltaic energy remains much more expensive than the other alternatives – like run-of-the-river power plants – but we believe that, with a public auction similar to that held for the eolic sector in 2009, solar energy might be a viable competitor against other resources like small-sized hydro power and biomass. The potential for solar energy in Brazil is massive.

What do you say to people who believe wind farms are a blight on the landscape and are not efficient enough?  
Caetité, in the state of Bahia, is one of the places where our wind farms are located. It’s a region that lacks resources and job opportunities. For this reason, eolic energy is currently one of the few activities that supports the local economy. To Brazil’s north-eastern region, a landscape with windmills is a sign of prosperity and opportunity.

Windmills integrated into the landscape are a guarantee of revenues in a region lacking resources. Land acquisition for installing windmills brings greater revenue to the region than many social programmes. The city hall in Guanambi, a neighbouring city that also houses Renova’s windmills, has even incorporated the image of the mills on its postcards.

Looking at sales in the large-scale solar energy market, which has been growing in global importance, we see good opportunities in the future

Nonetheless, the consolidation and development of a new industry always come with some scepticism, but the efficiency of the model that has been introduced, together with the social and environmental gains from building the mills, is enough to dismiss initial concerns.

From an environmental point of view, the impact is considerably lower than that caused by a hydropower plant, and from a social point of view, the windmills are helping to keep families in their native regions. It’s the most efficient resource by far: in comparison, Brazilian wind farms have a net capacity factor above 50 percent, while local hydroelectric plants operate with a net capacity factor of 35 percent.
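To put those capacity factors in concrete terms, here is a rough calculation of annual output per megawatt installed, using the 50 percent and 35 percent figures quoted above and a standard 8,760-hour year; this is an illustration only, not Renova’s own data.

```latex
% Annual energy per MW installed: E = P x 8760 h x capacity factor
\[
E_{\text{wind}} \approx 1\,\text{MW} \times 8760\,\text{h} \times 0.50 \approx 4380\,\text{MWh},
\qquad
E_{\text{hydro}} \approx 1\,\text{MW} \times 8760\,\text{h} \times 0.35 \approx 3066\,\text{MWh}
\]
```

On those figures, each megawatt of wind capacity delivers roughly 40 percent more energy over a year than a megawatt of the hydro plants cited.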

How long do you think it will be before renewable energy makes up a significant portion of the whole world’s energy provision? 
This is a tricky question to answer. I imagine it will happen once energy markets become consistently region-centric, given the demand for natural resources and the particular regulations and laws of each region or country. That way, each region should develop local policies to take maximum advantage of its captive natural resources. I believe the next generation, familiar with the responsibility of preserving natural resources, will pursue the issue of sustainability with far more voracity. It’s all a matter of the generations involved and their moral compasses.

Moreover, investments in new technologies are aimed at easing the decision-making process for new ventures that increasingly look to generate energy from renewable sources. In Brazil, we truly believe in their potential, especially wind power, as a complementary solution to the country’s energy demands – mainly because the country is undergoing a turning point as far as the energy sector goes.

With dams registering low water levels due to the dry weather, the Operador Nacional do Sistema ordered thermal plants to come online (a non-sustainable solution triggered by emergencies in the system). Thermal generation acts as a kind of insurance policy on energy supply every time the hydropower structure reduces production.

The cost per MWh generated by these thermal units may vary between BRL400 and BRL500, much higher than that of natural gas units, which have also been used to mitigate low levels in Brazilian dams. Having said that, eolic energy emerges as the most likely complementary solution for the hydroelectric system, especially during the drier seasons, with competitive prices and operating under a sustainable model.

A brief history of PCO’s innovations and successes

PCO emerged 25 years ago as an offshoot of the Technical University of Munich. At its heart were the development, design and manufacturing of scientific camera systems based on image intensifiers and digital cameras. In the 1990s, the company created its first purely digital camera system, sensicam: it’s still used in many scientific laboratories, helping find answers to the questions of the day.

At the start of the millennium, the company entered the world of industrial imaging with one of the smallest digital cameras then produced. The ‘pixelfly’ was capable of delivering a 12-bit intra-scene dynamic range for quality control, and delivered the first high-quality images of the Titanic.

Broad scope
From there, PCO moved further into the high-speed imaging area: our CMOS image sensor delivered slow-motion images of a quality that had not been seen before. With the corresponding camera family, PCO’s products now cover a broad range of applications: from cinema-quality slow motion, via documentation applications in nature and technology, to crash-test applications in car safety.

This progression has produced new applications, while existing fields have benefited from the advances in colour image conversion and quality. The latest step has been the co-development of new scientific image sensor technology, which offers a rich blend of previously impossible performance parameters.

PCO has always tried to exploit image sensor and electronic device technology to its maximum

The pco.edge camera family offers high resolution, extremely low readout noise, a high linear dynamic range and high-speed imaging. The new system has attracted particular interest from the life sciences community and has helped to push the limits in a variety of areas, such as microscopy imaging.

Today’s picture
PCO has always tried to exploit image sensor and electronic device technology to the maximum. This has been challenging and sometimes cumbersome, but it’s also been fun. More importantly, it’s allowed us to offer our customers the best possible tools as many of our camera systems operate at the forefront of technology. Our attitude has helped us open new fields of applications and markets, and to build our worldwide reputation in scientific and industrial research, quality control and metrology.

But technology has developed as well. While 15 years ago it meant a lot if you knew how to properly read out an image sensor, nowadays lots of all-in-one control parts are available and it’s not that difficult to construct a decent digital camera. Because of this, we’re now investing more than ever in the development of new, high-performance CMOS image sensors so we can offer our customers the best performance.

By doing this, we continue to follow the slow-but-steady growth model that has allowed us to achieve our current good position. But above all, we are committed and motivated to continue the advancement of camera technology.

ICF’s ten step guide to professional coaching

Coaching is a powerful tool that can help companies flourish despite uncertain economic times. We look at some of the benefits companies have cited:

1. Coaching can assist organisations with key business goals. Within the coaching partnership, the coach works with employees to identify and create clarity around key business goals, and establish effective management strategies to ensure goals are met.

2. A coach supports employees in confidently pursuing new ideas and alternative solutions with greater resilience and resourcefulness. A coach encourages fresh perspectives and provides inspiration through the questions they ask during sessions, and the actionable goals they co-create with employees.

3. Coaching can manage the change that accompanies growth within an organisation. Many see professional coaching as an important modality for managing change, including growth. A coach helps employees assess current needs, opportunities and challenges, while maximising the potential they already possess.

4. Coaching can boost productivity and effectiveness. This is especially important if employees are taking on new or leadership-level roles. Coaches are trained to work with clients to inspire them to reach their personal and professional potential, thus increasing productivity and effectiveness. Within the coach-client relationship, a focus is placed on learning and clarity for forward action. According to the ICF Global Coaching Client Study, 70 percent of clients saw work performance improvement.

5. Coaching can develop communication skills. The ICF Global Coaching Client Study revealed that 72 percent of those being coached noticed an improvement in communication skills. Furthermore, individuals who engaged in a professional coaching partnership claimed to have gained: fresh perspectives on personal challenges and opportunities; enhanced thinking and decision-making skills; enhanced interpersonal effectiveness; and increased confidence in carrying out chosen work and life roles.

6. Coaching can help organisations attract and retain talented employees. If companies have trouble finding strong employees (or getting them to stick around), they need to commit to investing in employee development. Coaching is one way for companies to develop their employees and show their employees they value their development.

7. Coaching can aid the work-life balance of employees. An employee experiencing the benefits of a balanced life is a happier employee. And a happier employee is a more productive one. A coach can work with employees to discover, clarify and align with what they hope to achieve, including a stable work-life balance. According to the ICF Global Coaching Client Study, 67 percent of those being coached saw an improvement in balancing their work and personal life.

8. Coaching can help employees thrive. Most companies who hire a coach report they are satisfied with the outcomes. The ICF Global Coaching Client Study found that 99 percent of coaching clients were “somewhat” or “very satisfied” with their overall coaching experience, and 96 percent of them would repeat the process.

9. Coaching can help companies despite uncertain economic times. Coaching can be a powerful tool in the face of uncertainty. Organisations of various types and sizes have claimed a number of benefits, including: improved business performance; improved product quality; higher employee retention and morale; greater employee commitment; leadership development; conflict reduction; and team building skills.

10. Coaching can restore self-confidence to organisations hit hard by the recession. Organisations that have experienced workforce reductions through downsizing, restructuring or a merger place extremely high expectations on the remaining workforce. Restoring self-confidence to face the impending challenges is critical to meeting organisational demands. The ICF Global Coaching Client Study shows 80 percent of those being coached saw an improvement in their self-confidence.

Coaching can deliver a significant return on investment for companies. The ICF Global Coaching Client Study found that 86 percent of companies made back at least their investment. Of those, 28 percent saw an ROI of 10 to 49 times the investment and 19 percent saw an ROI of 50 times their investment.

Find out more at www.coachfederation.org/tne.

Asetek’s tech for a growing smartphone market

Most people today take the remarkable convenience and productivity of their smartphones, tablets and personal computers for granted. Too many fail to grasp the immense amount of energy that’s needed in data centres to support their devices.

Data centres are rooms filled with racks of servers, where online data is processed and stored in the so-called cloud. The fundamental problem with the cloud is that a large proportion of the power that goes into a server gets translated into heat that must then be cooled – which takes a lot of energy. With the explosive growth of cloud computing, data centre cooling has emerged as one of the world’s fastest-growing energy problems. In fact, approximately one percent of all electricity used on the planet today goes towards cooling data centres.

Almost all data centres rely on air conditioners for cooling by blowing cold air through each server to remove the heat. “It’s hard to imagine a less efficient cooling system,” says Andre Eriksen, the CEO and founder of Asetek Inc. There is a much better solution.

Eriksen, a native of Denmark and recipient of the country’s Best Entrepreneur award in 2000, says: “It just doesn’t make any sense that you should spend as much energy cooling a data centre as you do running it.” The company he founded has developed a breakthrough liquid cooling technology for data centres at its engineering facilities in Denmark. Asetek manufactures in Denmark and China.

Water cooling the cloud
Water is a far more effective cooling medium than air because it is 4,000 times better at storing and transporting heat. This is why the radiator in your car uses water to cool the engine, and why it feels cold when you step into the ocean, even if the water temperature is the same as the air temperature. Asetek’s Rack Cooling Distribution Unit (RackCDU) system utilises this advantage by bringing water directly to the hottest components inside each server and removing the heat before it escapes into the air.
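The ‘4,000 times’ figure is easiest to see on a per-volume basis. The values below are standard textbook approximations (water at roughly 4.2kJ/kg·K and 1,000kg/m³; air at roughly 1.0kJ/kg·K and 1.2kg/m³), so the exact ratio varies with temperature and pressure.

```latex
% Approximate volumetric heat capacities: water vs air
\[
\frac{c_{\text{water}}\,\rho_{\text{water}}}{c_{\text{air}}\,\rho_{\text{air}}}
\approx \frac{4.2 \times 1000}{1.0 \times 1.2} \approx 3500
\]
```

That is on the order of several thousand, consistent with the figure Asetek quotes.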

We are changing the paradigm of liquid cooling for data centres by making liquid cooling cost-effective for all data centres, not just the small handful of enormous super-computers that use it

The heated water is then pumped out of the data centre. The liquid is completely self-contained in a sealed system of liquid tubes and small-pump/cold plate units that fit inside each server. No liquid ever comes into contact with the electronics. Water is such an efficient coolant that warm water keeps servers running comfortably, meaning no power is required to actively chill the water. It is cooled for free with outside air.

The water that comes out of the servers is surprisingly hot, at a temperature around 140 degrees Fahrenheit (60 degrees Celsius), enabling data centres to reuse this heat as an energy source for other applications (such as central heating or hot water). Data centres around the world generate more than 400trn BTUs of unused waste heat each year.

The University of Tromsø, in Norway, recently selected RackCDU to capture data centre waste heat to warm its campus. Eriksen says: “Now, rather than having to pay an extraordinary amount of money just to remove the waste heat from your data centre, you can actually capture it and reuse the heat for other things, saving even more money.”

Making the future affordable
Compared to air-cooling, liquid cooling can cut data centre cooling energy by up to 80 percent, while allowing the heat to be captured and reused. It has the potential to reduce global data centre energy consumption by up to 50 billion kilowatt-hours annually, equalling the output of almost six nuclear power plants, at a saving of more than $10bn per year.
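The ‘almost six nuclear power plants’ comparison can be sanity-checked with rough assumptions: a typical plant of about 1GW capacity running at around a 90 percent capacity factor (both are generic assumptions, not figures from Asetek).

```latex
% 50 billion kWh is 50TWh; one assumed 1GW plant at a 90 percent capacity factor
\[
E_{\text{plant}} \approx 1\,\text{GW} \times 8760\,\text{h} \times 0.9 \approx 7.9\,\text{TWh per year},
\qquad
\frac{50\,\text{TWh}}{7.9\,\text{TWh}} \approx 6.3
\]
```

On those assumptions, the claimed savings do indeed correspond to the annual output of roughly six large plants.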

With so much money at stake, why isn’t liquid cooling already deployed in every data centre? In fact, liquid cooling for data centres isn’t a new idea; IBM has been doing it since building the earliest mainframe computers. But those previous liquid cooling systems were much too expensive and complicated for general use.

Eriksen said: “We are changing the paradigm of liquid cooling for data centres by making liquid cooling cost-effective for all data centres, not just the small handful of enormous supercomputers that use it today.”

RackCDU is designed for standard commercial servers, and can be fitted at the factory or retrofitted to existing data centres. The cost of an installed system is typically repaid in less than 12 months through a combination of savings on energy, equipment and maintenance.

The coming thing
In the past few months, installations have been announced at multiple high-profile data centres, including the Lawrence Berkeley National Lab, the National Renewable Energy Lab and the US Department of Defense. Cray, one of the biggest names in high-performance computing, launched the first product line featuring the system in November 2012.

This is not Asetek’s first foray into the liquid cooling of computers. It is already the world’s leading supplier of liquid cooling systems for the computer workstation and gaming PC market. The company has more than 1.3 million systems in the market today, sold by big-name manufacturers including Hewlett-Packard, Dell, Asus, Lenovo, Intel and AMD.

The pressing problem of data centre cooling will only grow as more people around the globe buy smartphones and adopt digital lifestyles. Eriksen says he founded Asetek to address this concern with a more efficient and cost-effective approach. The company recently completed its initial public offering. Eriksen says: “This situation presents a gigantic business opportunity for us, while also moving the dial on a huge environmental problem – and that is what’s driving me.”

The dirty little secret about data centres is all too clear: the many benefits of cloud computing come at a substantial price in terms of energy use and climate impact. Computer scientists and environmentalists alike agree that a better way is needed to cool data centres. In its own way, Asetek is working to create a more sustainable planet for everyone, not just data centre managers.

Further information: sem@asetek.com; www.asetek.com 

OKH’s infrastructural contribution to Singapore

OKH Holdings, a premier brand name in the industrial property development scene, is notable for its ability to push boundaries with contemporary designs and eco-friendly features that minimise the environmental impact of the industrial properties it builds.

In addition to its core property development business activities, the company offers a wide range of integrated construction services with its in-house design and construction team. Coupled with a direct procurement strategy, the company is able to ensure the quality and cost-efficiency of the properties and construction projects under its purview.

The journey of OKH truly began in 2001, when the company’s current CEO and Executive Chairman, Thomas Bon, took over the company, which had been established in 1998. Though he started off as a one-man operation and faced the challenges of limited financial and human resources, his vision as a property developer remained undaunted. Bringing an out-of-the-box approach to infrastructure solutions, OKH gradually built its reputation by completing fast-track projects in a cost-efficient and timely manner.

Innovation, vision and progress
The company’s efforts received a big boost when it secured several high-profile infrastructure projects in Singapore, for clients including Singapore Mass Rapid Transit, Singapore Turf Club, the Singapore Formula 1 Grand Prix and Singapore’s Budget Terminal. With these projects under its belt, OKH acquired the knowledge and capabilities to integrate designs and eco-friendly features without compromising on time or cost.

We need to adapt to changing market trends, innovate and most importantly, remain focused on our vision and progress forward

Leveraging these valuable experiences, OKH steadfastly applied its out-of-the-box approach to its primary vision of property development, taking its first step in that direction with the successful completion and sale of Seatown Industrial Centre in 2009. Bon says: “We need to adapt to changing market trends, innovate and, most importantly, remain focused on our vision and progress forward.”

He adds: “Aesthetic features are often understated in buildings, but they make a big difference, as most of our time is spent inside. Hence, one of my primary visions is for OKH’s property development projects to leave a good lasting impression, with our focus on aesthetics on the outside and the inside.”

Current and future success
With the successful completion and 100 percent sold-out sales records of its industrial projects, such as A’Posh BizHub and Primz BizHub, OKH is currently marketing and developing further property development projects, such as Woodlands Horizon, located near Singapore’s Admiralty MRT Station.

OKH has received numerous awards in the past few years. In 2012, the company received the Enterprise 50 award (for the second consecutive year) and Bon was recognised as the overall winner of an Asia Pacific Entrepreneurship Award (organised by Enterprise Asia) as well as the Top Entrepreneur Award from the Entrepreneur Award 2012 (organised by ROTARY-ASME). Going from strength to strength, the company has built up a strong pipeline of property development projects. Bon is aiming to get the company listed on the Singapore Exchange in the near future to further enhance its brand name and progress towards its business aspirations.

While OKH expects to move into its brand-new headquarters at Tai Seng Link in 2015 to accommodate its growing business activities, Bon is already setting his sights on extending the company’s presence beyond Singapore and onto the regional stage. With a strong business foundation in Singapore and proven credentials, OKH is ready to grasp the opportunities that lie ahead.

The artists of invention

Commerce has gone through many changes since our ancestors first established the concept of lending. Economies have grown as populations have increased and lending has developed, and commerce has become integral to our way of living, allowing mankind to develop and sustain itself on a macro scale.

Until recently, innovation was labelled the remit of the technologically and creatively minded. But as scientific know-how has become more accessible and information ubiquitous, The New Economy has taken the opportunity to recognise the ingenious achievements made in business throughout 2012. The following list of 40 companies and individuals identifies triumphs in research and development, and in putting those achievements into practice.

Innovation is considered by many companies to be a risky and expensive liability. When times get tough, many CEOs go straight to their R&D departments in search of potential savings. Focusing on an existing portfolio and continuing with the status quo may seem like a logical move, especially when shareholders come nibbling, but had Apple taken that approach there would be no iPod, MacBook, iPhone or iPad, and the company would have gone out of business. The most successful companies – like a lot of automobile manufacturers in the last few decades of the 20th century – understand that new products work in a similar fashion to the investment multiplier often discussed by economists in terms of national economies: you must innovate to accumulate.

Naturally, countries have different approaches to what they consider innovation. Much of what we traditionally regard as the West looks at the launch of new consumer goods and services as true innovation, while Australasia and the Scandinavian regions tend to categorise it in terms of sustainability and green technology. China and a number of Asian nations celebrate innovation as the bettering of existing processes. Looking at the Innovation 40 2012, these differences aren’t difficult to reconcile. Indeed, we’ve scoured the globe for a range of entities across a host of different industries, selecting those that have moved the boundaries of existing paradigms throughout the year – from those creating new markets to those who succeeded in refining strategies or existing products; all have positioned themselves to evolve regardless of whatever financial constraints may have hindered their efforts.

Tackling innovation head on is a notion that any firm – small or large – must adopt in order to withstand the pressure currently prevalent in global markets. With more and more competition arising from new contenders in once-discounted domiciles – which boast enterprises capable of offering affordable pricing structures – companies that had previously gained large market shares are finding themselves under intense pressure to re-establish themselves. The most immediate winner of this new sense of competition is, of course, the consumer.

What is commendable about this list is that it is made up of not only well-established organisations and multinational superpowers, but of some breakthrough companies pioneering niche developments. We believe this is important for a number of reasons. Firstly, this is an information economy: success is recognised in those capable of understanding their market and offering something different. Secondly, information is in overabundance, so continually being able to analyse and react is a fundamental requirement for pursuing growth, even on a small scale. Lastly, innovation can be fostered in each and every functional and strategic department, and if approached correctly can turn a small enterprise into an industry leader.

With firms and individuals such as those seen on this list ensuring industry moves forward, the future looks promising: for economic growth, for better living standards and for human advancement in general. We feel that the novel nature of this list – recognising innovation as the application of thought to a number of different processes and end-results – offers a diverse and well-rounded overview of the most exciting prospects from 2012.

Sandvik

2,700 – the number of employees active in Sandvik’s R&D division

5,500 – the approximate number of active patents held by Sandvik today

Sandvik is a Swedish engineering group widely recognised for its innovative approach in furthering business-related mechanisms and practices. A world leader, the high-technology engineering group has a long-standing tradition of innovation supported by R&D investments. The company collates data on the needs and expectations of customers, implementing improvements based on the feedback. The commitment demonstrated by Sandvik in maintaining customer satisfaction is unparalleled; the company now operates in 130 countries.