The crucial importance of requirements management

The concept of requirements, which today has become a formal process critical to successful outcomes, is familiar to organisations large and small around the world. In its modern definition, a requirement is a singular physical or functional need that a particular product or service must satisfy.

It is most commonly used in a formal sense in systems engineering and software engineering, but is also used in enterprise engineering. The importance of requirements gathering, and the necessity of using the best tools available, is best demonstrated in standards development, which is becoming more complex, and therefore more challenging, by the day.

Standards being developed for the industries and technologies of today must be able to withstand the pressure of rapid market change, the high-speed evolution of existing technologies, the emergence of new technologies, and associated process change. Simultaneously, harmonisation is widening the range of stakeholder interest in the development of standards.

Faced with these challenges, it is essential that best practice is followed, including the use of powerful tools by anyone managing requirements. Requirements management is the first step in a trail of traceability from requirements through use cases or user stories, model design, code generation, testing and deployment.
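To make the idea concrete, that traceability trail can be sketched as a simple linked model. The element names and structure below are hypothetical, invented for illustration, and not tied to any particular tool:

```python
from dataclasses import dataclass, field

@dataclass
class Artifact:
    """A traceable project element: a requirement, use case, design, test, etc."""
    name: str
    kind: str
    realised_by: list = field(default_factory=list)  # downstream artifacts

def trace(artifact, depth=0):
    """Walk the traceability trail from a requirement down through its realisations."""
    rows = [("  " * depth) + f"{artifact.kind}: {artifact.name}"]
    for child in artifact.realised_by:
        rows.extend(trace(child, depth + 1))
    return rows

# Hypothetical trail: requirement -> use case -> design -> test
req = Artifact("REQ-001 User login", "requirement")
uc = Artifact("UC-01 Log in", "use case")
design = Artifact("LoginController", "design")
test = Artifact("test_login", "test")
req.realised_by.append(uc)
uc.realised_by.append(design)
design.realised_by.append(test)

print("\n".join(trace(req)))
```

Walking the trail in either direction is what lets a team answer questions such as "which tests cover this requirement?" or "which requirement motivated this design element?".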

Preventing chaos
It is crucial to identify errors and discover needs during the requirements phase. Effective requirements management goes a long way to eliminating most design mistakes and reducing failures during the development process: we are all familiar with the saying ‘a stitch in time saves nine’, which advises early remedial action to avoid deterioration later.

This wisdom stands the test of time: many benchmark studies show it is much less expensive to identify and capture requirements early in the development lifecycle. Systematic and effective requirements management based on best practice catches risks earlier in the lifecycle and should help to reduce errors by controlling element characteristics in complex projects.

From the smallest business to the largest enterprise, cost overrun, project failure and in some extreme cases loss of life may well have been avoided had the requirements been complete. The CHAOS Report by the Standish Group, which examines failed projects, identified three factors for project success: a clear statement of requirements; user involvement; and executive management support.

Requirements management is the essence of successful project outcomes whether it is software or product development for business or engineering, change management, or project management. One simple necessity is to ensure that the business and IT stakeholders understand each other.

Seeing is understanding
A visual representation of how requirements interact and depend on one another is powerful; it can help users understand what they want by using mind mapping to give a clear and contextual idea of their requirements.

Collaborative tools, such as Enterprise Architect from Sparx Systems, provide the ability to electronically document requirements and capture any artefacts (audio, video, pdf, etc) in a common model. This requirement model provides a complete, inclusive and easy-to-understand picture for stakeholders, which creates assurance and strengthens stakeholder commitment.

Requirements management is the essence of successful project outcomes

Where users are uncomfortable with written requirements, a picture is worth a thousand words, since a group of related concepts is often easier to remember than written material. This is why the use of mind maps assists with problem solving; different stakeholders, by visualising relationships, can more readily recognise their shared concerns about a given requirement.

Mind mapping improves verbal communication by enabling stakeholders to outline their thoughts spatially and subsequently articulate the meaning behind the concepts they have outlined. Using mind maps, the group can provide support to the individual by helping them find the words to communicate concepts represented by the mind map.

The quality of the system or product being produced is a direct result of effective requirements management. A requirements statement identifies a necessary attribute, capability, characteristic or quality that a system must have in order to be of value and use to its users.

Matching expectations
Quality is no accident: such statements are created through a well-considered and effectively executed process involving all the relevant stakeholders. The objective of requirements management is to increase the probability that a given project will deliver applications with the expected functionality.

With complete traceability from mind mapping, requirements and business process through to software design and deployment, the likelihood that problems will be introduced is greatly reduced. This expectation is more realistic with the ability to store requirements in a central and secure location, track artefact inter-relationships and control changes to single and group requirements.
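The central-store and change-control idea can be illustrated with a minimal sketch. The class, requirement IDs and wording here are invented for illustration and do not reflect any real tool's API:

```python
import datetime

class RequirementStore:
    """Minimal central requirement store with a change log (illustrative only)."""

    def __init__(self):
        self._reqs = {}
        self.history = []  # (timestamp, req_id, old_text, new_text)

    def add(self, req_id, text):
        """Register a new requirement and record its creation."""
        self._reqs[req_id] = text
        self.history.append((datetime.datetime.now(), req_id, None, text))

    def change(self, req_id, new_text):
        """Change a requirement, keeping the old wording in the audit trail."""
        old = self._reqs[req_id]
        self._reqs[req_id] = new_text
        self.history.append((datetime.datetime.now(), req_id, old, new_text))

    def get(self, req_id):
        return self._reqs[req_id]

store = RequirementStore()
store.add("REQ-001", "The system shall authenticate users.")
store.change("REQ-001", "The system shall authenticate users via single sign-on.")
```

The point of the audit trail is that every stakeholder sees not just the current wording of a requirement but how and when it changed, which is what makes controlled change possible on a shared project.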

Today, it is not uncommon for development teams, customers and stakeholders to be geographically dispersed, and efficient tools are necessary to link and organise people working on shared, complex projects. Using tools such as Sparx Systems’ Enterprise Architect, these stakeholders can effectively collaborate on projects by understanding who is working on specific tasks, what roles are to be filled and who has responsibility for the various aspects of the project.

Different team members and stakeholders must be able to input information that is relevant to their roles and activities, and is useful to the other members of the project. This implies the necessity to capture this information in a model that is available to all team members, overcoming their geographical limitation. Since the requirements function is essentially a detailed tapestry of the project elements, their definition, their inclusion and their traceability must be visible to the entire team.

For instance, a database analyst, a programmer and a security analyst could be assigned to work on a requirement. These resources must be managed to ensure their availability to work on assigned tasks, which exposes the need for integration between requirements management and resource management tools. Depending on the particular role, tools such as Enterprise Architect provide many different role-dependent views of the same information.

Infonavit’s contribution to private housing development

Private housing in Mexico was a major issue in the early 1970s. The poorest members of society were restricted, with no access to mortgages, and no savings to build solid homes and a future. This became the motivation behind the 1972 constitutional decree that founded the Instituto del Fondo Nacional de la Vivienda para los Trabajadores (Infonavit), the National Workers Housing Fund Institute.

Under the scheme, Mexican private sector workers must contribute up to five percent of their payroll to the National Housing Fund, which is used by Infonavit to grant a series of mortgages and financial products to low income workers, at competitive interest rates. Employees with access to Infonavit funds have been the fastest-growing segment for homeownership over the past decade, and the institute predicts it will supply 3.3 million new mortgages between 2013 and 2017, allowing large scope for social change and development in the area.

“Its mission is to generate progress in Mexico by accompanying the mortgage-holder: helping to meet their housing needs and increasing the net worth of their families,” explains Alejandro Murat Hinojosa, CEO of Infonavit. “The institute is built on four pillars: better coordination between housing institutions; the reduction of the housing deficit; adapting the urban and social housing models; and the generation of wealth and quality of life through housing for mortgage-holders and their families.”

Social institute
Infonavit is a social institute as well as a financial one. When employers contribute the funds on behalf of employees, it allows the latter group to use these funds to access bigger loans and mortgages. The institute has a huge scope; 68 percent of the mortgages issued in Mexico are provided by Infonavit. Murat Hinojosa says: “When we talk about better coordination among the housing institutions and better articulation and allocation of resources, we have to consider four fundamental elements: community building; re-diversification of verticality; commitment to the environment; and information and technology.”

These are not the words of a profit-minded individual, but of a social crusader committed to bettering the lot of others. “We try to integrate these elements by stimulating better access to public services and infrastructure for communities.”

The institute has a vast history of helping the lower income members of Mexican society onto the property ladder. Last year alone, Infonavit granted over 578,000 loans; of these, 63 percent went to title-holders earning less than four times the minimum wage. “We are attending the lowest segment, and that is very important to us,” says Murat Hinojosa.

Infonavit’s resilience in the face of changing governments is due to its adaptability

“We have a strategic agenda, based around four fundamental points: housing needs, financial sustainability, social balance and scope of policies. When we approach our financial model, we have understood that it is not about a financial product, but about generating quality of life.”

The institute is a fundamental part of the Mexican government’s strategy to combat housing needs in Mexico, and has been for the past three administrations. Infonavit’s resilience in the face of changing governments is due to its adaptability. When it was founded in the 1970s, mortgages were exclusively for buying new-builds; it was a valuable strategy at the time that helped boost Mexico’s building industry.

But in recent years, upon noticing a change in its clients’ demands, Infonavit created loan products to meet several other needs, such as building their own homes, renovating, and buying previously owned property. “Infonavit is a social mortgage financial institution. Our mandate establishes that we should work with low income areas, or segments that commercial banks don’t serve,” says Murat Hinojosa.

“What we have been able to achieve over the past few years is a cross-subsidy model that gives us the opportunity to establish more competitive interest rates, so we can benefit title-holders…

“The great opportunity Infonavit is excited about today is to migrate from a traditional housing model to a more integrated urban and sustainable model of competitive cities.”

Empowering communities
Part of Infonavit’s modernisation strategy has also been to develop a whole range of sustainable financial products. The ‘Green Mortgages‘ programme, which accounted for 400,000 loans in 2012, is aimed at promoting sustainable elements in the housing industry, such as solar panels. “It benefits people by giving them better control of their budgets in energy and water consumption, and gives them the opportunity to lower CO2 emissions,” says Murat Hinojosa. “So not only would we foster a commitment to the environment, but we would enable our title-holders to have more control over their energy and water budgets, and generate huge saving as well as reduce their carbon footprints.”

Infonavit’s Green Mortgages scheme is also central to its urban development strategies. Most of these mortgages are granted to urban developments. “We consider a commitment to the environment with better transportation and better community building,” says the CEO. “This is important because, if people are happy where they live, our financial products will certainly be more robust. But we will also be achieving our most fundamental goal of generating quality of life and net worth. We want to be generating social as well as economic wealth.

The institute has a vast history of helping the lower income members of Mexican society onto the property ladder. Last year alone, Infonavit granted over 578,000 loans

“We have gone beyond the need for new housing to identify the preferences of our clients. So now we have generated products that address the needs of people buying new houses, or expanding or renovating their houses. We are also trying to address the renting model, which Mexico has not been able to tackle properly.”

Infonavit’s approach to risk and return, which is fundamental financially, is also unique. “We have a motto at Infonavit: ‘as long as there is a willingness to pay, there will always be a solution’,” says Murat Hinojosa. The institute is just as concerned about its social scope as it is about returns. “It is important to consider the quality in the housing and how to achieve a community. If we achieve these two fundamental elements we will be generating net worth.”

For Infonavit, it’s vital that loans contribute to the betterment of the communities where the beneficiary-employees choose to settle. One of its most successful programmes in recent years is aimed at improving communities not only for residents, but also for municipalities.

The Municipal Competitiveness Programme in Housing aims to provide much-needed relief for local purse strings. “Today in Mexico, many of the municipalities lack resources, planning capacity and financing to better their communities,” explains Murat Hinojosa. The programme gives incentives to municipalities that meet high standards of community building and sustainable urban development.

Infonavit offers financial support to local projects, and helps implement action plans. “By doing this, we are helping municipalities generate the resources they need to build better infrastructure in their communities.”

But the programme goes one step further and allows municipal authorities to collect property taxes from Infonavit, together with mortgage payments. This has significantly boosted tax collection by ensuring taxpayers can use their Infonavit titles to pay their taxes.

Thinking ahead
The Mexican mortgage lender is clear in its goals for the future: Infonavit will continue to consolidate its human capital and financial sustainability, making its operational model one of the most efficient in Latin America. “We will achieve this with a great strategic agenda and good corporate governance,” says Murat Hinojosa. “We will continue loaning and accompanying our title-holders through life, and supporting them with their housing and financial needs.

“We see great opportunities in the future. We need a model that continues to understand better the preferences of our clients and to continue bringing them the solutions they need in rent, in growing the size of their property, and solutions to buy new or older houses. We need to understand that we need to have a wider scope. We see the possibility of generating housing solutions for all Mexicans.

“In that regard, we are aligned with the President, Enrique Peña Nieto, who has asked all housing institutions to generate mortgage solutions for those underserved workers who don’t have access to social security in housing. These are the municipal and state workers, armed forces, police and small business owners. The three fundamental opportunities for the future are renovation in housing, a wider scope attending underserved workers, and rental schemes, which do not exist in Mexico.”

The financial and sustainable benefits of ‘smart grids’

As energy requirements become more pressing, a number of companies are working on more efficient systems for controlling electricity. Smart grids act as a more flexible and customisable control system for how electricity is used, offering both price and security benefits.

Netherlands-based Locamation has been developing its own set of smart grid solutions that are highly adaptable to changing circumstances. The New Economy spoke to the firm’s Chief Operating Officer, Pascal Bleeker, and its Director of Sales and Marketing, Alexander Schoenfeldt, about how the industry is developing and what the company is able to offer utility firms in the drive towards greater efficiencies.

Maintaining high standards
Bleeker says it is the company’s extensive experience in the industry that enables it to match the needs of its customers: “[We] have 30 years’ experience of automating industrial and electrical systems with real-time information and communication technology. This know-how can be found in all our products and solutions. Our ambition is to continue turning innovative new ideas into available and proven solutions for the smart grid world.”

The company places considerable emphasis on providing a secure and stable solution for its clients, which are primarily utility companies, and it is continually testing the strength of its security to make sure it is up to scratch. Its SASensor product, a substation automation solution based on an open software and system platform, enables Locamation to offer a customisable and secure service to clients.

Bleeker says: “Besides reliability and performance, the highest level of security has always been our continuous goal in our product development, and has been proven in several hacker tests over the last few years. This approach has actually helped us be flexible in meeting customer requirements, and this flexibility is helping our customers to mitigate investment risks and achieve business goals.”

Proving that flexibility
While flexibility is obviously a strength, the added cost of continual change is a challenge too. Locamation says its flexibility is appreciated by utility companies and that it enables its customers to actually cut costs. By using a software approach, the company is able to seamlessly deliver necessary upgrades, new functions and fixes when required. In the past, utility firms would be tied into expensive equipment that would eventually need to be replaced or discarded.

Schoenfeldt says: “Everybody is aware of some uncertainty in the future, but the preparatory steps network companies can take by using SASensor will result in a better preparation to face future demands and prevent stranded costs. For example, utilities may have meters that are not future-proof and provide no security, as hardware encryption is not in place. With SASensor, they will have the flexibility.”

In the past few years… distribution companies have started to prepare for future challenges, rather than waiting for regulatory decisions

Changing regulations can also impact on utility firms, and so increased flexibility allows companies to adjust accordingly, says Schoenfeldt: “Utility firms have changed their viewpoints in the past few years significantly, from a government-owned monopoly to a customer and profit-oriented service provider. This is an ongoing trend. Specifically, distribution companies have started to prepare for future challenges, rather than waiting for regulatory decisions.”

“Within this transformation, the distribution companies realise that moving forward in small technical micro-steps will not help them to manage the cliff of their future market model. So they need to change their business model.”

SASensor will also help in the smart buildings sector, creating greater efficiencies. Bleeker says: “In the near future, buildings will not only become zero emission houses, they will become an active grid component with available capacity for other consumers. The impact to the grid will be significant and does require control in load and frequency management. With SASensor, we can activate and utilise these building capabilities much more efficiently than with conventional technology.”

The importance of collaboration
Working together with other companies, as well as local authorities, is crucial to implementing smarter power solutions that benefit communities. Regulations are not currently designed to help the industry, but local authorities are lobbying central government on its behalf.

Bleeker says: “The regulation is not well planned for supporting smart grid initiatives. We also need to get financial benefits from this. It’s not only about reliability and sustainability; people are also looking at the financial effects. Local government is lobbying central government to change the tax laws to help make these projects financially viable.” The tide is turning, however, and many countries are piloting new smart-grid schemes in the hope that they will prove a success.

Schoenfeldt adds: “A lot of European countries have understood the need to change the regulatory funding model. We have seen this start in the UK. In Germany, Poland, Belgium, and Austria they are preparing the next regulatory period, and every country is coming to an agreement about how the grid costs are covered. Every government is interested in results, and so that is why there are so many pilot projects around to provide them with real experience.”

Last year, Locamation announced it had begun a number of pilot schemes with companies in the UK. In October, it began a year-long pilot with Scottish and Southern Energy Power Distribution that will see a substation equipped with Locamation’s SASensor HMV product. Earlier in the year, the company revealed it had struck a pilot deal with Western Power Distribution and UK Power Networks, two of the largest power distribution networks in the UK.

Working together with other companies, as well as local authorities, is crucial to implementing smarter power solutions that benefit communities

Keeping an eye on progress
In the past, utility firms would only be able to monitor how their grids were running at specific times, creating mere snapshots of how they were operating. Locamation’s approach is to continuously monitor grids, building intelligence into the substations.

Bleeker says: “As we continuously monitor the standard key performance indicators of power quality, load and phase angle, we decided to filter the continuous information stream, and log and record the exceptions in order to get early signals of potential outages or failures at substation level.”

“The network companies need to understand the grid better and start to manage it differently. For this grid insight, we are moving from a yearly to a daily picture, and will end with a live movie of the critical locations and stress points in the network topology. The benefit of this new grid intelligence, including documentation and the possibility of remote control, is keeping the network healthy, cost-efficient, secure and reliable.”
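The filtering approach Bleeker describes, streaming KPI readings and keeping only those that deviate from the norm, can be sketched in a few lines. The thresholds, field names and running-mean baseline below are illustrative assumptions, not Locamation's actual design:

```python
def monitor(readings, threshold=0.15):
    """Flag readings that deviate more than `threshold` from the running mean.

    `readings` is an iterable of dicts mapping KPI names (e.g. "load",
    "phase_angle") to values; returns a list of (index, kpi, value) alerts.
    """
    sums, counts = {}, {}
    alerts = []
    for i, sample in enumerate(readings):
        for kpi, value in sample.items():
            # Only alert once a small baseline has been established.
            if counts.get(kpi, 0) >= 3:
                mean = sums[kpi] / counts[kpi]
                if mean != 0 and abs(value - mean) / abs(mean) > threshold:
                    alerts.append((i, kpi, value))
            sums[kpi] = sums.get(kpi, 0.0) + value
            counts[kpi] = counts.get(kpi, 0) + 1
    return alerts

# A steady load followed by a 30 percent jump triggers one alert.
samples = [{"load": 100.0, "phase_angle": 10.0}] * 4 + [{"load": 130.0, "phase_angle": 10.0}]
print(monitor(samples))
```

In a real substation the baseline, thresholds and logging would be far more sophisticated, but the principle is the same: continuous measurement plus local filtering turns a flood of raw data into a small stream of early-warning signals.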

Smart grids are going to become more important as communities attempt to use their electricity in a more efficient manner. Creating a sustainable network of energy provision is a goal that will ultimately benefit communities everywhere.

Bleeker says: “Just imagine: the electricity grid will become the most powerful network of societies and industrial life, being their blood and nerve system at the same time. Automation and self-healing mechanisms have been put in place to enable society to use electricity at any point, at any time and to meet any demand it has.”

For this network to be achieved, there will need to be support from regulators, as well as the industry, ensuring that a standardised set of systems is in use so that collaboration is possible. As Bleeker says: “Locamation understands its responsibility to provide standardised, intelligent, designed system architecture, and devices and software in combination with basic and customer-specific software functionalities in a collaborative community of customers and partners.”

Schoenfeldt agrees: “Our ambition should be to create a standardised platform for the industry. There are standards for metering, for substation communication, for in-home platforms. The interoperability will make them all work together.”

Apps widen the technological capabilities of taxi services

The familiar sight of the black taxi hurtling through the streets of London has been under threat from the growth of convenient minicab services reliant on drivers with sat nav systems. In recent years, ‘cabbies’, known for their encyclopaedic knowledge of the city’s streets, have seen many customers desert them for minicab firms.

Firms like Addison Lee do not require drivers to undergo the sort of exhaustive training necessary for a black-cab licence (known as ‘the Knowledge’) and have benefited from an easy-to-use phone booking system.

New tech, old industry
In the last year, black-cab drivers have embraced new technology as a means of fighting back against the minicab firms. Hailo, an application initially launched on Apple’s iOS platform towards the end of 2011 by some entrepreneurs and black-cab drivers, has enjoyed a rapid uptake by both customers and drivers.

I’ve always believed it’s what’s been missing in any taxi market in any city in the world

The app enables customers to order a taxi in seconds, even allowing them to track its progress on a map. Payment is taken either through a card account or by cash. Since launching, the app has been downloaded by roughly 10,600 of the 23,000 black-cab drivers in London, with most praising the increase in business they’ve seen. More than 225,000 customers have also downloaded the app.

One of the co-founders, Jay Bregman, explained to Londonlovesbusiness.com why the app has been so successful: “We build better networks that mean cabbies are closer to more hails and availability is higher for customers without paying a fortune. What also gives us an edge is that the taxi community trusts us because former taxi drivers themselves are behind the business.”

Even though Hailo takes a cut of the fares drivers receive, many are pleased with the way it has boosted trade. One of the aspects most appreciated is the freedom drivers have to use the service whenever they want. They aren’t tied to a working schedule, as with minicabs, and retain the freedom and flexible working hours they’re used to.

Bregman told Wired magazine: “We aim to deliver to drivers only 20 to 30 percent more incremental fares each day to fill their downtime. So we always have more drivers than we need. Hailing times in London are mostly down to two minutes, and they’re falling.”

Taxi across the water
Such has been Hailo’s success in London that it’s already expanded to nine cities around the world, with highly competitive – and potentially lucrative – New York added in February. One of the founding drivers from London, Russell Hall, told the BBC he saw the model as transferable to many cities: “I’ve always believed it’s what’s been missing in any taxi market in any city in the world. What we’ve created in London will go to any city.”

In New York, however, the company faces stiff competition from another start-up, called Uber. More established in the US, Uber recently began signing up ordinary drivers so they can offer their own taxi services. Another market many firms are looking at is Japan, with Tokyo offering considerable rewards to whoever cracks it first. Hailo’s Chairman, Ron Zeghibe, told the Financial Times: “Japan’s market alone is worth $25bn and Tokyo weighs in at twice the market value of either London or New York. We will become Japan’s first end-to-end mobile taxi app.”

Hailo recently secured $30.6m in investment from venture capital firms, including Twitter and Foursquare investor Union Square Ventures, as well as Accel Partners, Japanese telecom firm KDDI and Richard Branson. Since its inception, the company has received over $50m in backing, reporting over $100m in sales in 2012.

Propel Fuels offers a green alternative to gasoline

Propel Fuels offers drivers access to gasoline, ethanol and biodiesel fuels at a single pump. Continuing to expand the means by which renewable fuels are breaching the US energy market, Propel is a pioneering, young company whose strategy relies on consumer adaptability, rapid expansion and technological prowess.

A self-proclaimed environmentalist who reportedly spends his free time scaling 200ft California redwood trees, Acting CEO Matt Horton has business methods that are both pragmatic and principled. Having originally worked as a venture capitalist in the Silicon Valley office of @Ventures, Horton invested a great deal of time and entrepreneurial spirit in the formation of Propel’s initial business plan in 2007. Though initially intended as a vendor of high-blend biodiesel, the company was later repositioned to offer multiple fuel types.

The company, while offering a variety of fuels, focuses primarily on ethanol-based fuel derived from corn, most notably used in E85 and E10 fuels. At present, Propel – in partnership with Solazyme – is trialling algae-derived fuels at Propel gas stations throughout California. The pilot programme is the first such instance of algae fuel being sold to US consumers and saw sales rise 35 percent through January and February this year.

Location, location, location
Having opened 31 retail stations across California and Washington as of January 2013, Propel is strengthening the foundations for what Horton terms the “slow, but exciting” transformation of the US automotive industry. Last year, Propel secured $21m worth of funding for the continued construction of 200 fuelling stations over the next two years. The company is backed by both government and private investment, including the recent procurement of $11m in equity capital and $10m in debt financing from a number of venture capital firms.

The alternative fuels industry is dependent on its largest competitor as a pathway into the market

Propel is currently based in California, where the company benefits from the Low Carbon Fuel Standard, through which it aims to reduce the carbon intensity of transportation fuel. Though similar laws exist across the US, few are as stringent or progressive as California’s approach to reforming environmental strategy.

Location is integral to Propel’s business strategy in that it seeks to target those of an adaptable and enterprising sort to best ensure the maximum adoption of its product. Though the oil industry is considered to have a chokehold on US markets, Propel demonstrates a confidence in its consumers beyond that of competing companies. Using software built by the company, Propel chooses locations it believes best suit its technical requirements. Horton said: “In this business, the vehicle drives everything. You can have all the infrastructure in the world, but if there aren’t any vehicles around that use it, it’s not going to make any difference.”

Capturing the base
Propel ultimately seeks to create an extensive network of fuelling stations, as well as to identify like-minded consumer bases whose ultimate benefit lies in the wider acceptance of alternative fuels throughout the US. Horton’s optimism reflects the company’s healthy cash flow and future projections: Propel has averaged 300 percent growth since 2010 and received in excess of $10m in revenues through 2011. With continued commitments to R&D and to spearheading advances in alternative fuel developments, Propel is looking to expand into further states throughout the US.

There are inherent challenges to the furthering of this project. Geoff Cooper of the Renewable Fuels Association said: “The gasoline stations don’t want a competitor, but the alternative fuels industry is dependent on its largest competitor as a pathway into the market.” Regardless of such obstacles, Horton maintains that “the energy industry is ripe for disruption”, as well as stating his belief that Propel can better the cause of renewable fuels across the US.

Most reassuring for Propel’s future is Horton’s confidence in the market: “I believe strongly in the power of business and the American consumer to drive change. A company like Propel moves us in a more sustainable direction, rather than relying on regulation and government mandates.” Horton’s willingness to instigate change and innovation, as opposed to waiting on slow and inconsequential reforms, will surely help Propel reduce the US’ negative environmental impact.

Industries use gaming software to train employees

In February, it was reported that surgeons who played games on the Nintendo Wii console were more skilled at performing keyhole surgery operations. The Play to Become a Surgeon study by researchers at the University of Rome saw resident surgeons playing on the console for an hour a day for a month. The result was an improved accuracy in their procedures: up to 65 percent more accurate than doctors who undertook only conventional training. The skills used to carry out keyhole surgery are similar to those exercised when using the Wii: hand-eye coordination, movement precision, depth perception and 3D spatial visualisation when focusing on 2D on-screen images.

Dr Gregorio Patrizi, who led the study, said: “We hope this may be a trigger to develop dedicated software aimed to help young surgeons as the economic impact of these consoles is significantly lower than traditional laparoscopic [keyhole surgery] simulators and they provide a basic didactic value. The Nintendo Wii might be a helpful, inexpensive and entertaining part of the training of young laparoscopists, in addition to a standard surgical education based on simulators and the operating room.”

Call of duty
A field-medic simulator called Simulation Technology Applied to Trauma Care is a video game that enacts real-world battle scenarios in which doctors and medics must attend to a fallen soldier. The virtual patient exhibits the vital signs of heartbeat and blood pressure in response to treatment, providing medics with a realistic environment in which they can practise performing under pressure. More advanced medical simulators are also available, including a biofeedback mannequin that will become more ill if the doctor makes a mistake. Similar models have also been used to explore the functions of smart implantable pumps to treat brain tumours.

The military also uses video game-based software to aid training. The US Army is currently utilising the Dismounted Soldier Training System (DSTS) as a virtual tool for soldiers of the 157th Infantry Brigade at Camp Atterbury Joint Manoeuvre Training Centre. The system operates in a warehouse, where soldiers gear up with flip-down goggle mounts, have sensors strapped to their arms and legs, and carry computer-enhanced weapon systems. Each soldier stands on a 4ft-diameter pad in the centre of a 10ft² training area, from which they can see and hear the virtual environment and communicate with members of their squad using a helmet-mounted display with headphones and microphones.

The skills used to carry out laparoscopies are similar to those exercised when using the Wii

DSTS operator Matthew Roell said: “A soldier uses his body to perform manoeuvres, such as walking or throwing a hand grenade, by physically making those actions. The sensors capture the soldier’s movements and those movements are translated to control the soldier’s avatar within the simulation. This simulation training is the future of training. The DSTS allows a soldier to wear the simulation instead of sitting inside of a simulator.”

Sgt First Class Aaron Hammond, Operations, 157th Infantry Brigade said: “It’s our job to train soldiers at the lowest level. The DSTS gives leaders and squads a chance to really look at their tactics, techniques and procedures in a safe, but realistic environment. It does not replace training, but it can add to it. We can bring the terrain of Afghanistan to the soldier.”

Rise of flight
Simulated environments also aid preliminary training in the shipping industry. Approximately 80 percent of all accidents and incidents occurring in ship operations are caused by ship management errors, communication failure and lack of skill. Practice with full mission ship simulators is thus of vital importance. While simulation training is not a substitute for the experience of training on an actual vessel, it is an effective preliminary method to thoroughly familiarise students with equipment, procedures and processes.

Similarly, in the aviation industry, flight simulators train pilots and flight crew for both civil and military aircraft and can even be used to train maintenance engineers. Much like military training simulators, flight simulators provide a safe environment in which to learn skills relating to dangerous real-life situations, such as extreme weather, systems failures or even hijack scenarios. There are currently around 900 full-flight simulators in the world, owned by aeroplane manufacturers, airlines and specialist training companies.

Airbus’ Senior Director of Flight Crew Training Policy David Owens said: “The simulator is an extraordinary piece of technology. It has the ability to recreate, as far as the pilot’s concerned, total reality. Every aspect of flying an aeroplane is recreated here.”

“Globally, the aviation industry is forecasting unprecedented growth,” he added, referring to projections that there will be a need for a worldwide fleet of 40,000 jets, double what currently exists, by 2032. Demand for simulators is therefore set to remain strong, as full-flight simulators play a vital role in supporting the pilots needed to fly the growing global fleet. Owens said: “Currently, we squeeze maximum use out of our simulators. We use them 24/7, more or less, 365 days a year.” It is no wonder then that manufacturers are churning out as many full-flight simulators as they possibly can.

Sim city
On a consumer level, people can turn their computers into virtual aeroplanes using software such as Microsoft Flight Simulator. Popular consumer games at the forefront of the industry are predominantly action role-playing shooters like Gears of War and Mass Effect. Both were built on Epic Games’ Unreal Engine 3 (UE3), a game engine whose Unreal Development Kit (UDK) software is used to build virtual scenery.

The software now not only designs gaming scenery, but real-world buildings too. American architectural firm HKS used UDK when constructing the Dallas Cowboys Stadium, completed in 2009, to create a 3D visualisation of the world’s largest domed stadium and allow clients to view the final structure. HKS has now developed custom tools that allow its in-house team to translate the 3D models of its building projects seamlessly into the UE3 world, getting the textures and lighting perfect.

HKS’ Manager of Advanced Technologies Pat Carmichael said: “The order of magnitude that Unreal opens up to us as architects is phenomenal. Architects want every detail in the building to be as accurately acute as the real environment as possible. There are thousands of surfaces in a typical building that we do, especially with the scale of buildings that HKS does. We have to have a lot of different texture surfaces simulated in these environments and it’s a lot of work.”

Tim Sweeney, founder and CEO of Epic Games, sees strong potential in the use of multi-core processors to carry out individual graphics-related tasks. Parallel operations can be used to strengthen artificial intelligence, perform physics calculations and execute programmable special-purpose functions that can improve game realism and responsiveness. Such developments may prove to be of great benefit to companies like HKS as more industries explore the potential non-game uses of video game software.

Companies reap the benefits of furthering eco-policy

Until recently, few companies seemed aware of how important the green issue can be to employees and job candidates. Businesses adopted green policies to meet government-set targets or to use as an advertising strategy. Employees seemed to have been left out. But industries have woken up to the importance of including their workforce in their eco-initiatives.

A Ceridian survey of UK employees in 2007 found 69 percent believed it was highly important their employers were environmentally responsible. A further 40 percent said they would turn down a job with an eco-unfriendly company. They were in favour of ‘green’ company benefits, such as incentives to switch to a sustainable energy provider, discounts on environmentally friendly products and subsidised fares for public transport.

It appears they were listened to; the world over, companies are now reaping the benefits of adopting eco-business practices. They have seen a drop in day-to-day running costs complemented by a rise in productivity.

BT
BT’s energy saving campaign, launched in 2010, continues to be one of the company’s main priorities. The idea is to make staff more aware of efforts within the business to reduce the carbon emissions produced by heating, lighting and equipment.

BT started by aiming the programme at individuals who had more influence within the business, such as heads of departments and engagement managers. The head chef and the auditorium manager were targeted to introduce more eco-friendly policies in their departments. Simple measures like reducing fridge-freezer usage and turning off unused projectors began to make a difference.

BT also retrains drivers in its Openreach division through a fuel-efficiency scheme. Drivers with the highest fuel usage and accident rates are retrained and taught about the new eco-friendly vehicles being brought in.

Energy Champions, selected from among the workforce, conduct energy audits and report faults. They encourage staff to be more eco-friendly by turning off unused hardware and lighting, and by recycling. The company also introduced Carbon Clubs to bring members of staff together to discuss green issues.

A recent employee survey found waste and energy savings were the main issues the staff felt should be addressed. As well as the Carbon Clubs and Energy Champions, BT has brought in carpooling and cycle-to-work schemes. Over 10,000 of the 75,000-strong workforce have signed up for at least one of these programmes.

SAP
In 2009, SAP appointed Peter Graf as Chief Sustainability Officer, and he came with a mission. Not content merely to make the company greener, he wanted to make sustainability part of the culture among its 60,000 employees. Through a basic questionnaire, he discovered that the combined distance travelled to and from work by staff every day was enough to reach the moon, return to Earth and then travel to the moon again.

SAP’s engineers devised a system to share information about which colleagues were available for ride sharing. The programme matched people based on common interests. The carpooling in turn led to idea sharing and networking that helped drive innovation within the company. TwoGo, the software developed for the idea, is now available for purchase. It is common for the CEO to carpool, which Graf says leads to employees wearing their best suits and practising their conversation the night before the commute.

SAP also introduced a PIN-activated print system to prevent a build-up of unclaimed documents. In the past four years, the company has saved $250m by implementing a range of eco-friendly technologies, all of which it has made available for sale.

In 2008, internal surveys showed SAP had a very low rate of employee engagement in eco matters. But this interest has grown as people came to understand and see the benefits of Graf’s mantra that “sustainability is built into how the company creates value.” The adoption of sustainability policy at all levels of SAP has helped employee engagement rise to 91 percent. Graf hopes by the year 2020 to have reduced emissions to the level they were at in 2000, a cut of about 50 percent.

The world over, companies are now reaping the benefits of adopting eco-business practices

NAB
The National Australian Bank (NAB) runs a number of green schemes for employees to join, as well as offering eco-friendly perks. The Green Team Community is an internal voluntary programme for employees who wish to help the bank become more environmentally friendly. The company provides training for members to champion green causes and raise awareness in their local workplaces.

In its first four years, the community helped the company to become carbon neutral – and it was the first bank in Australia to do so. Gavin Slater, Group Executive for Group Business Services at NAB, praised the staff for their contribution. He said it was “[a] testament to the enthusiasm, commitment and activism of employees to make a difference and ensure their workplace is more environmentally sustainable.”

Having achieved this milestone, NAB then launched Beyond Carbon Neutral. The programme saw the introduction of recycled paper across the company, from printing paper to paper towels and cups. Shareholders have been given the option to receive their communications electronically. Some 109,000 have taken up this option, cutting 10.7 tonnes of carbon dioxide equivalent per annum.

Throughout the year, the bank holds a series of Green Speakers talks, where outside experts are invited to talk to staff about carbon, water and waste reduction. It runs various eco-benefits, such as Green Your Life, which provides access to discounts on a range of environmental products, like water tanks and solar hot water panels, sold by third parties. Employees are also encouraged to take advantage of interest-free loans to help them purchase annual public transport tickets, and the bank runs a carpooling service.

Munich Re
Munich Re has embarked on a companywide environmental management system (EMS) to reduce the environmental impact of services within the business. The EMS registers the effect of the company’s day-to-day operations on the environment and looks at ways to restrict any harmful actions. The programme’s goal is to reduce the consumption of paper, energy and water, and to minimise refuse and business travel.

Dr Ulf Mainzer, board member for Human Resources and General Services, said: “Our efforts range from modernising our office technology to sharing environmental tips with our employees.” The scheme was tested at a number of member companies before being introduced across the group in 2011. EMS has seen the introduction of data collection systems, and so far 87 percent of its staff’s activities fall under the project.

The company has introduced codes of conduct and communication initiatives to encourage its staff to consider the effect any action may have on the environment. Munich Re aims to achieve climate neutrality by 2015, a reduction from the 4.90 tonnes recorded in 2009.

Dr Astrid Zwick, Munich Re’s Environmental Officer, says: “Wherever possible we try to bring our ecological activities into harmony with our financial objectives. Especially when staff members contribute their own ideas, it is possible to continually improve environmental management.”

Establishing the Portuguese pillars of sustainability

Caixa Geral de Depósitos (CGD) believes a business based on the pillars of sustainability is important if it is to manage available resources in a balanced fashion, recognise opportunities and create value for the future, all in line with its stakeholders’ expectations. CGD has contributed to Portugal’s economic and business development, to greater competitiveness, and to the international expansion of Portuguese businesses. It has also sought to prevent over-indebtedness, and to foster the social and financial inclusion of people and companies by promoting micro-savings and financial literacy.

The financial sector – and banks in particular – plays an essential role in promoting sustainable development through selective processes that incorporate socio-environmental risk policies and criteria into lending and customer satisfaction management. Rigorously applying such criteria will reduce risks and, as a consequence, allow the greater accumulation of value.

The social pillar
CGD’s commitment to tackling the main social challenges in Portugal is one of the pillars of its sustainable path. It is reflected in the bank’s strong investment in areas such as access to financial products and services, microfinance, the creation of volunteer programmes, incentives towards entrepreneurship and social entrepreneurships, and support for the third sector.

This work was recognised by the European Commission when it selected CGD to form part of the social responsibility committee created by the European Savings Banks Group. In June 2012, CGD was chosen as the sole representative of the Portuguese business sector on the European Commission’s Committee of Experts for the Social Business Initiative.

CGD has also developed several groundbreaking projects aimed at promoting volunteering. Proceeding with its leadership policy, CGD created Young VolunTeam in 2012. The initiative was developed in partnership with the sustainability consulting firm Sair de Casca and the third sector institution ENTRAJUDA, and with the support of the national government through the Director General of Education. One of the project’s goals is to fill the gaps that exist regarding volunteering and its promotion among the youth.

Young VolunTeam aims to promote a culture of volunteering among young people and to create an incentive for the youth to develop and understand the importance of skills such as mobility, curiosity, openness and entrepreneurship in their own professional future and for the overall competitiveness of their country.

The project relies on an innovative dynamic that gives the young people involved (aged 15 to 18) responsibility for its success. They are the leaders of the initiatives in their schools and communities. The activities in which each group of students engages include: training in volunteering and project management; developing training sessions for younger students; and building projects in their schools that will meet the overall goals.

Rigorously applying such criteria will reduce risks and, as a consequence, allow the greater accumulation of value

The students then send reports of their actions to a jury composed of the project’s partners. The five most valuable projects are publicly recognised and strongly promoted as best practices: examples to be admired and followed. As well as being supported by the programme team, some schools count on the help of a CGD Volunteer to develop their actions.

Schools enrolled for the project’s pilot year have already developed several campaigns to promote volunteering. They have also been involved in activities such as food, clothing and book collection campaigns; visits to nursing homes; and the creation of a solidarity store inside the school, where pupils can bring used clothes and take home new ones. The training sessions for younger classes have proven highly motivational for the enrolled students, and have strengthened relationships between schools, creating new joint volunteering projects.

Under its Education/Financial Literacy Programme, CGD has undertaken multiple initiatives with major social impact, promoting education for savings, and responsible and informed consumption via its Positive Balance website (also targeted to SMEs).

In support of entrepreneurship, CGD directs resources towards the creation of new businesses and innovative projects, beginning with credit analysis and leading to support for investment and for innovative projects in universities. By supporting the creation of businesses as ‘first employment’ opportunities (e.g. start-ups), CGD hopes to combat high unemployment rates among young people.

Corporate social responsibility
In 2012, CGD received the Best Sustainable Banking Group in Portugal award, as well as awards for leadership in climate responsibility from the Carbon Disclosure Project and the ACGE climate responsibility index. These reflect the work that has been undertaken as part of the bank’s commitment to sustainable development.

In the same year, Oekom evaluated CGD as the best in class in the financial sector at the international level. The rating is yet another accolade for CGD’s sustainable performance and the commitments it has been making to the future: for the benefit of coming generations, society, the national economy and environment. This has strengthened CGD’s role as a legitimate ambassador for the Portuguese financial sector in the application of the best international management practices.

In addition to several national and international awards, CGD has become a member, by direct invitation, of the Committee of Corporate Social Responsibility of the WSBI/European Savings Bank Group. Its membership of this important European forum allows it to leverage synergies and enter into partnerships that offer opportunities for the development of the Portuguese economy.

The environmental pillar
In 2012, CGD consolidated its commitment to sustainable development as a highly respected market leader by promoting best practice in the Portuguese financial sector. In terms of the environment, CGD has been following a demanding course in managing emissions of greenhouse gases, optimising available processes and resources, and significantly reducing energy bills.

The Caixa Zero Carbon Programme is emblematic, as it represents the bank’s strategy in relation to climate change by promoting a low carbon economy. It is the first structured programme for carbon neutrality in the Portuguese financial sector. Projects including the Solar Plant, Caixa Forest and the Carbon Calculator make CGD a leader in combating climate change. It is the best Portuguese company and the best Iberian financial institution in terms of meeting the requirements of a low carbon economy, according to analysis carried out by the Carbon Disclosure Project (CDP).

CGD is the only Portuguese company in the Iberian top six for tackling climate change and the only Iberian financial institution recognised for its contribution to a low carbon economy, according to the CDP report ‘Iberia 125 Climate Change Report 2012’. CGD was awarded the highest (A) performance rating and entered the Carbon Performance Leadership Index, which highlights companies that have demonstrated a strong strategic approach to combating climate change and reducing emissions.

CGD became the first Portuguese company to be chosen for inclusion in the index, thus taking the lead in performance among institutions in the financial sector on the Iberian Peninsula. This distinction is due to its continued implementation of goals and actions to reduce greenhouse gas emissions, with particular focus on energy efficiency, employee mobility, waste management, reuse of resources and waste minimisation measures.

For several years now, CGD has been involved in the Floresta Caixa (Caixa Forest) project aimed at the restoration and afforestation of a number of areas in Portugal. It began several initiatives for the preservation of forests and biodiversity to mark the International Year of Forests in 2011. Its major goal is to plant a tree for each of its youngest customers (below 14 years) and care for each one for the next 30 years.

CGD has already reached 157,000 trees: some of them in areas that were severely affected by forest fires that compromised the economy and wellbeing of local communities. The bank has managed this by gathering volunteers from CGD, local communities, schools, scouting groups, firefighting departments, entrepreneurs and several other areas.

Strengthening business ties
Within CGD (and the group itself), initiatives in the area of sustainability are worked on by multidisciplinary working groups under a management model based on the General Committee for Sustainability and the Sustainability Steering Committee, where the Executive Committee is represented.

CGD’s commitment to sustainability, in constant dialogue with its various stakeholders, enhances the creation of global partnerships, active networks and strategic alliances, promoting innovation processes. The strategic pillars upon which CGD’s Portuguese retail banking activity is based continue to support the recovery of the economy, companies and households.

By the end of 2012, the number of companies awarded ‘PME Líder’ status through CGD totalled 2,306 (of which 314 had ‘PME Excelência’ status), compared with 1,089 companies (of which 275 were ‘PME Excelência’) the previous year. CGD rose from third to second in the national ranking.

Further information: www.cgd.pt/English/Sustainability/Pages/CGD-Sustainability.aspx

How is Dell to thrive in today’s vastly competitive markets?

At the beginning of February, one of the technology industry’s most successful entrepreneurs stunned observers with the announcement that he would be taking the company that bears his name private, in a deal worth as much as $24bn. Michael Dell, the 47-year-old multibillionaire who turned his small personal computing firm into one of the world’s leading industry players, had already been back in his old job as CEO for six years before announcing his intention in early 2013.

Having launched the company as a 19-year-old in 1984, Michael Dell created one of the leading providers of personal computers in the world. It dominated the market until he stepped down as CEO in 2004. Having built its reputation on a build-to-order model that proved popular with people wanting to customise their computers, Dell lost its way during the middle of the last decade, as Apple, Lenovo and HP carved up large chunks of the personal computing market.

Striking a deal
After the company suffered a rapid decline in sales and market share, Michael Dell returned as CEO and Chairman in 2007, but failed to restore the business to its former glory. Now he wants to take it private, having floated the company in 1988, in a move backed by a number of private equity investors, as well as Microsoft. It is Microsoft – which is going through something of a transitional period itself – that is the most intriguing backer.

Microsoft and Dell have suffered in recent years from their slow and muddled reactions to the changing personal computer market

The deal struck in February will see Michael Dell partner with private equity group Silver Lake, Microsoft and a number of banks to take the company private, pending shareholder approval. The founder will contribute his existing $3.7bn stake, as well as an additional $750m, while Silver Lake will put in $1.4bn in equity. Microsoft, which will be keen to keep one of its more prominent clients afloat, is contributing $2bn.

It is speculated that Michael Dell sees taking his company off the stock market as a way for it to develop new products without the constant glare of shareholder scrutiny and market reactions. Having to make quarterly reports about the firm’s strategy and then suffer the whims of investor sentiment has not been kind to Dell, with the stock price tumbling in recent years.

After the deal was announced, the company’s biggest rivals were quick to give their opinions on what it meant for their businesses. HP released a statement saying Dell “faces an extended period of uncertainty”, and that it was well placed to capture any concerned customers: “With a significant debt load, Dell’s ability to invest in new products and services will be extremely limited. We believe that Dell’s customers will now be eager to explore alternatives, and HP plans to take full advantage of that opportunity.”

Addressing the opposition
Brian Rogers, Chief Investment Officer of T Rowe Price, Dell’s second-largest external shareholder, was opposed to the deal, saying: “We believe the proposed buyout does not reflect the value of Dell, and we do not intend to support the offer as put forward.”

Southeastern Asset Management, the largest outside investor in the company, also opposed the deal. The firm said in a statement: “We would have endorsed a transformative transaction that would have provided full and fair value to Dell’s public shareholders, including a leveraged recapitalisation or a go-private type sale where current shareholders could elect to continue to participate in a new company with a public stub.”

“Unfortunately, the proposed Silver Lake transaction falls significantly short of that, and instead appears to be an effort to acquire Dell at a substantial discount to intrinsic value at the expense of public shareholders.”

Influential hedge-fund manager David Einhorn also aired his strong opposition to the buyout, describing the plan to take the company private as ‘cash hoarding’. Einhorn’s Greenlight Capital had a large stake in Dell last year, although this was later sold. Dell’s stock price fell after shareholder frustration at the company’s strategy, which Einhorn says helped Michael Dell in his buyout plan.

Einhorn added: “Michael Dell probably didn’t mind the stock falling. Now he wants to take Dell private and voila, the balance sheet will be fully utilised to finance his purchase of the company.”

What next?
Both Microsoft and Dell have suffered in recent years from slow and muddled reactions to the changing personal computer market. Dell has struggled to remain relevant with the advent of mobile computing and the decline in use of Windows-based computers. As devices become increasingly mobile, Dell has failed to match the hardware innovations of Apple, while Microsoft was slow to anticipate the entrance of Google into the operating system marketplace.

Microsoft needs a strong player still invested in its operating system

Dell built its reputation as a PC manufacturer, and it is this market that presents it with the most problems. People are no longer tied to their office computers, with more and more using tablets or remotely accessing work data from their home computers. Dell has not really offered a compelling reason for people to stick with its customised computers, and as people become less attached to the Windows platform, the company finds itself propping up a part of its business that is losing customers.

The decision to focus on the higher margins supposedly gained from high-end computers, like its XPS and Latitude ranges, appears misguided as mobile computing continues to be the industry’s main source of growth.

One area the company may look to focus on is upgrading the existing computers of its corporate clients. Dell’s Chief Financial Officer, Brian Gladden, told investors in an earnings call that many of these customers were on old platforms that required a refresh, and Dell would likely focus on serving them.

He said: “I think that’s really tough to get at, but the data that we’ve seen would suggest there’s still somewhere in the range of 40 percent of the corporate installed base for PCs that is XP or Vista, that needs to be upgraded. So that’s, I think, pretty consistent with the data that we see for our installed base, and for what we hear from our corporate customers.”

In fact, Microsoft will cease offering support to its old and still widely used Windows XP operating system in early 2014, meaning Dell could see a considerable amount of new orders for office computers in the coming year. Gladden added: “All the data that we’ve seen, all the conversations we’ve had with customers, would lead us to believe that there’s still a significant refresh activity that has to happen in the next 12 to 14 months.”

Microsoft partnership
The fact that Microsoft has heavily endorsed Michael Dell’s buyout bid shows how important the company is to its business. As HP retreats somewhat from the PC business, Microsoft needs a strong player still invested in its operating system. Michael Dell has been enthusiastic in his praise of the new Windows 8 operating system, saying in December: “…in the customer conversations that we’ve been having, the interest in Windows 8 is quite high, even with commercial customers, who would normally wait a few releases to adopt the new versions.”

He added: “With Windows 8 products… we’re pleased with the incredible experience that [customers] expect, while you get the security and versatility and reliability that your enterprise really requires.”

Aside from software, Microsoft might be eager to make a push into the hardware market, and a heavy investment in one of its biggest clients could signal a greater partnership between the two, in which products are developed together. Google recently announced its own laptop computer, while Apple has been hugely successful offering both computers and the operating systems that run on them. Tighter integration – with Dell computers perhaps offering a customised Microsoft experience with advanced features unavailable on rival platforms – could be one avenue the two firms explore in the future.

Mobile push
While it has struggled to make a dent in the competitive mobile computing market, Dell is expected to make a big push in offering Windows-based touch-screen computers. At the Mobile World Congress 2013 trade show in Barcelona at the end of February, the company’s vice president and general manager of tablets and performance PCs, Neil Hand, was bullish about Dell’s range of products, telling reporters: “At Dell, we have – unequivocally – the most secure, most manageable and most reliable product portfolio in the industry.”

Hand added that the new tablets unveiled, including the enterprise-level Latitude 10 Enhanced Security edition, would help the company to challenge Apple’s dominant iPad. He said: “What I’m proud of is we’re bringing that same kind of resilience and capability to our tablet products as we do to our traditional Latitude and OptiPlex desktops and laptops.”

The company is aiming squarely at the corporate market with its security-focused new tablet, boasting a fingerprint reader and a smart card. With tablets becoming more popular among business executives, Dell hopes its new Latitude 10 will attract IT departments that want to maintain a link between their Windows infrastructure and employees’ mobile devices.

Servers and networking
Another area the company has dedicated considerable resources to is providing servers and networking to corporations. In recent years, management has spent over $12bn buying smaller firms in a drive to offer a comprehensive enterprise IT solution, and it seems as though it is starting to pay off. In February, the company reported that revenues for its servers, led by its Xeon-E5 PowerEdge 12G, grew by 18 percent to $2.62bn. Its networking products increased by 42 percent, although its storage range dropped eight percent.

If Michael Dell is going to reinvigorate the company that bears his name and made him his fortune, then a focus on being the most trusted enterprise-computing provider is the best way to go about it. The company should not try to compete with the likes of Apple and Google in the consumer space, and instead tighten its relationship with Microsoft, and push ahead with its long-term end-to-end business.

According to Krista Macomber, an analyst at Technology Business Research, this strategy is likely to continue and the imminent privatisation will likely enhance it. She said: “Privatisation will enable Dell to align all of its assets around its end-to-end solutions strategy, as it will allow Dell to act strategically on a longer time horizon than the current quarter without the scrutiny of Wall Street. As a relative newcomer to the enterprise, and in delivering end-to-end solutions, Dell faces more work in its journey.”

“Dell’s continued evolution into a provider of end-to-end IT infrastructure is a long-term endeavour that requires heavy investment in R&D and acquisitions and calculated alterations to its go-to-market approach, all under corporate revenue pressures in the form of declining PC scale – as demonstrated by the company’s fourth-quarter financial performance and impending privatisation.”

Technological advancements in healthcare

I recently attended the JP Morgan healthcare conference – the Davos of the medical world. And, like the World Economic Forum’s annual gathering of business leaders, the JP Morgan conference is a Rorschach blot: you find in it what you are looking for.

Personally, I am interested in how healthcare business models are changing – not in a smooth trend line, but one example at a time. The change has less to do with healthcare ‘reform’ than it does with improved access to information beyond the traditional sources of clinical trials and medical billing systems. Now we can find out more about each individual patient (and ultimately aggregate data), about the use and performance of drugs and treatments out in the market (not just during testing), and even about outcomes.

In search of this theme, I met with a variety of startup companies on the fringes of the event. The formal programme was mostly publicly traded companies talking about their earnings outlooks, with one section reserved for privately held companies.

Sharing care
First, there was Andrew Brandeis of SharePractice, a doctor who used to run a high-end medical service called CarePractice in San Francisco, but saw a need for doctors to share information about how they treat patients. The industry standard for such information, Epocrates, offers a mobile app with information about pretty much every drug on the market, but neglects other kinds of treatments. Brandeis also asserts that there is too much advertiser influence – in fact, only Epocrates’ educational content and sponsor messages are driven by ads. Make of that what you will.

Brandeis’s idea is crowd-sourced information: doctors will record for one another what treatments they actually use. He showed me his own cellphone contact list: most of the people on it are other doctors. He shares with them already, and SharePractice will make it easy.

But will they want to share this kind of information? Yes, says Brandeis, because they already do, via text, email and phone: “It’s a cumbersome process, the data is totally unstructured and doctors wind up repeating themselves because searching through six months of text messages makes no sense.”

Someday, health care will be more like highway traffic

SharePractice simplifies the entire process. In the end, SharePractice will adopt more or less the same revenue model as Epocrates: freemium/subscription, perhaps institutional sales and lead generation. What is different is the origin of the data.

Almost by coincidence, one of the next sessions at the conference was led by Athenahealth, announcing its planned acquisition of Epocrates. “We don’t care about Epocrates’ revenues,” to paraphrase Athenahealth CEO Jonathan Bush, “we have 30 percent awareness among doctors: they have 90 percent.” In short, he sees Epocrates as a marketing and distribution channel for Athenahealth and is not overly worried about pleasing advertisers.

In fact, the primary influence on both Epocrates and SharePractice will most likely be the institutions that pay for the medicines (and procedures): “Here is what other doctors suggest, and here is the subset whose costs will be at least partly covered by the patient’s insurance.” Ultimately, formularies (payer restrictions on which drugs they will cover and how much), rather than advertisers, will control how the choices are displayed.

Advice from experience
Finally, I met with another startup, WiserTogether, whose idea is to offer users treatment-choice information on a personalised basis. The two founders are former consultants to health institutions and, true to type, they used a broad survey of both patients and doctors (40,000-plus individuals) to generate their basic data set.

The WiserTogether service uses a unique interface and data-presentation model to provide personal advice to consumers of medical services, though it will be sold primarily to institutions and doctors. Consider it a clinician-patient communication tool.

While SharePractice focuses on day-to-day practice, WiserTogether’s tool is for patients considering whether to try surgery or radiation for cancer, for example, or physical therapy or drugs for back pain. With WiserTogether’s tool, they can get a sense of the choices made by others who share their preferences – for example, regarding effectiveness, speed, cost, side effects or aversion to pain – and their satisfaction with the results.

WiserTogether updates its data over time, but it is still relying on traditional survey techniques, though it is also monitoring what is happening in the field through volunteered claims data, and data from medical records. The healthcare industry is lurching slowly towards real-time, user-generated data. With greater data liquidity, it becomes easy to match health indicators (from both patient and doctor), patients’ preferences, appropriate treatments, and payer constraints. It’s all just data.

In the old days, we used to monitor traffic with road cameras and the occasional TV news helicopter. Now we simply collect signals from the cellphones of millions of drivers around the world. We aren’t invading their privacy because no one cares who they are – only where they are and how fast they are going.

It is the same in health care. Initially, we take formal surveys and run carefully monitored clinical trials; each patient in such a trial represents – more or less – thousands of others. And, indeed, such trials are a good predictive tool.

But someday health care will be more like highway traffic; we will be able to see it unfold in real time and we will see how well people get to their goals by their chosen routes. Millions of people will be uploading daily statistics with tools that monitor not just activity, but also blood composition and – courtesy of the Japanese – chemicals in our urine.

Would you drive around in Manhattan, Moscow or Mumbai without a real-time traffic map? Not anymore. Someday we will think of health interventions in the same way.

Smart Grid Awards 2012

More and more resources are being poured into smart grids and associated technology on a daily basis. As the technology flourishes and develops, market participants seem capable of a growing variety of solutions. 2012 was a particularly strong year for innovations, as a host of new projects came to market. Here, The New Economy celebrates the industry leaders in its Smart Grid Awards, 2012. Congratulations to the winners.

Best Transmission Company
Alstom Grid

Best Distribution Company
Petra Solar

Best Command & Control Company
Locamation 

Best Demand Response
Schneider Electric

Best Smart City Technology Company
Schneider Electric

Best Networking & Communications Company
Schneider Electric

Best Systems Integrator Company
SAP

Best Smart Appliance Company
Samsung

Best eMobility Company
Iberdrola

Best Smart Building Company
Delta Controls

OSU furthers the potential of clean coal technology

After two years of incremental progress, a group of researchers at Ohio State University (OSU) has taken a crucial step forward in clean coal technology.

In February 2013, OSU professor Liang-Shih Fan led a select team of students to a new milestone for clean-coal technology. Fan – having already instigated a number of local environmental initiatives – is Professor of Chemical and Biomolecular Engineering at OSU, as well as Director of Ohio State’s Clean Coal Research Facility. Having made the breakthrough, the team hopes that “not only can we use America’s natural resources such as Ohio coal, but we can keep our air clean and spur the economy with jobs.”

The discovery – termed ‘coal-direct chemical looping’ (CDCL) – is a pioneering technology entailing the chemical conversion of coal to heat while safely retaining 99 percent of the carbon dioxide generated. Fan explains that: “In the simplest sense, combustion is a chemical reaction that consumes oxygen and produces heat… unfortunately, it also produces carbon dioxide, which is difficult to capture and bad for the environment. So we found a way to release heat without burning.”

More specifically, CDCL mixes fine powdered coal with small iron oxide beads, which carry the oxygen needed to initiate the chemical reaction. The mixture is then heated so that the coal binds with oxygen from the iron oxide to produce carbon dioxide. While the carbon dioxide rises and collects atop the reaction chamber, the heat generated creates water vapour for use in powering steam turbines.

The otherwise-contained carbon dioxide is separated and recycled, whereas the iron beads are exposed to the air inside the reactor, resulting in re-oxidisation: this reaction allows the beads to be regenerated indefinitely for use in later processes or for storage. Current research practices demonstrate that each unit produces approximately 25 thermal kilowatts for use in generating electricity.

The associated researchers plan to develop the technology further with the construction of a pilot plant, located at the US Department of Energy‘s National Carbon Capture Centre and scheduled to begin operations in late 2013. Aiming to produce approximately 250 thermal kilowatts using syngas (synthesis gas), the tests are considered of vital importance in the technology’s commercial development.

The building of this plant further demonstrates committed government support for new, environmentally beneficial technologies. Fan has since attested that the State of Ohio has been “very supportive of our research efforts”, crediting government funding with making the team’s “brilliant invention and cutting-edge research” both “successful and progressive”.

Coal dependence 
Coal used in the generation of electricity is currently the US’ second-biggest contributor to carbon dioxide emissions and has come under increasing scrutiny amid growing public concern over global warming. The coal industry has responded by repeatedly emphasising its commitment to clean coal initiatives.

Carbon dioxide is a renewable material and might be used in future electricity generation

Over the past 30 years, more than $50bn has been deployed in the US in the development and implementation of clean coal technologies, with almost $500m devoted to carbon capture and storage research and development.

Increased US capacity for generating coal-based electricity could go some way towards reducing the price of oil on global markets. Meanwhile, Iraq looks set to better realise its potential as a leading oil producer, and to increase its annual output by 600,000 barrels per day for the foreseeable future.

Citigroup’s Edward Morse said in a statement: “Starting this year, North American output… should start to have tangible impacts both on global prices and trading patterns, and will eventually turn the global geopolitics of energy on its head.” The shift would significantly limit the revenues of the producing OPEC nations, as well as Russia and West Africa.

How clean is ‘clean’?
Though clean coal solutions are beginning to gain some presence in the developed world, many dispute the legitimacy of terming these projects ‘clean’. The world’s first ‘clean coal’ power plant – built in Spremberg, Germany in 2008 – prevents the short-term release of carbon dioxide into the atmosphere, but offers no long-term method of keeping the gas contained.

With developments made in both clean coal technology and in the isolation and short-term containment of carbon dioxide, the question is how the gas is to be recycled or contained in the long term. Current techniques include injecting carbon dioxide into depleted natural gas fields or other suitable geological formations. Termed Carbon Capture and Sequestration (CCS), this is an attractive and relatively simple answer for the near term, although it cannot be considered a permanent solution.

CCS currently sits high on the green agenda for many governments, remaining a viable option for mitigating climate change. That said, many scientists and technicians consider the burying of carbon dioxide a serious mismanagement of a potentially valuable energy source. Carbon dioxide is a renewable material – unlike the fuels from which it is expelled – and some chemists have questioned the wisdom of burying something that might be used in future electricity generation.

With recent discoveries in the field of CCS, interest in using carbon dioxide to generate electricity has been steadily rising among scientists and engineers. The theories of how carbon dioxide could generate electricity fall into two broad schools of thought: one converts CO2 into simple oxygenated compounds such as methanol; the other uses organic matter to imitate photosynthesis. Although both offer a sound scientific means of generating electricity, neither can be considered a commercially viable option at present.

Today’s global economy is strongly incentivised, and unflinching, in its efforts to improve environmental conditions. While initiatives led by researchers such as Professor Fan are clearly making headway in the advancement of clean coal technology, there remain important related areas whose development requires vast investment.

Approximately a quarter of the world’s electricity generation comes from coal, and the process is a major contributor to environmental pollution. In this sense, it is clear that any further efforts to restrict harmful pollutants would be of paramount importance worldwide.