Inside the battle for Genzyme’s future

Raman, a business professor at nearby Harvard University, scoured the internet and spoke to dozens of scientists. They told him that if his daughter Nandita could hold on, a treatment being developed by Genzyme Corp might soon be available. In April, 2006, US regulators approved the drug, Myozyme, in time to save Nandita’s life.

“For motivating thousands of employees who collectively added years to our child’s life, and restored a smile to her face, we thank you,” Raman said, addressing Termeer before a packed auditorium at Babson College, which on April 7 inducted him into its Academy of Distinguished Entrepreneurs.

As the 64-year-old Termeer accepted his honor “with humility,” he was painfully aware that trouble was brewing. Federal regulators were about to sanction his company for manufacturing violations that led to shortages of two of its other life-saving drugs: Cerezyme, a treatment for Gaucher disease, and Fabrazyme, a treatment for Fabry disease.

And so it goes for Termeer, an urbane and charismatic Dutchman who has emerged as one of the biotechnology industry’s most celebrated – and controversial – figures. As a result of the latest setbacks, though, he is now in danger of losing control of the company he has spent the better part of three decades building.

Production problems
Patients with Gaucher, Fabry and Pompe diseases are deficient in certain enzymes that break down fats or sugars in the body, leading to organ damage and, without treatment, death. Last year, a viral contamination forced the company to shut down its manufacturing plant in the Allston neighbourhood of Boston. To conserve supplies of Cerezyme and Fabrazyme, patients were forced to skip doses or take less of their medication. And though the plant is up and running again, Genzyme has repeatedly pushed back its estimate of how long it will take to fully meet demand.

That has strained Termeer’s relationship with patient groups, and enraged Wall Street. Genzyme’s shares fell 26 percent in 2009 as the company scaled back earnings projections and yielded ground to competitors. For 16 years Genzyme monopolised the $1.2bn market for Gaucher disease. But in February, the US Food and Drug Administration approved Vpriv, a rival drug made by British drugmaker Shire Plc. It is expected to approve an experimental treatment made by Protalix BioTherapeutics Inc of Israel by the end of the year.

“As a large organisation, the challenges are different from those you face as an entrepreneur, and I’m sure you have read about the challenges we are facing,” Termeer told his Babson audience dryly.

Angry investors
The manufacturing crisis has also opened the door to activist investors. Carl Icahn, the billionaire known for taking control of underperforming companies and shaking up management, has nominated himself along with three other representatives to Genzyme’s board. Ralph Whitworth, another activist who runs the investment firm Relational Investors LLC, and whom Genzyme recently appointed to its board, has called on the company to sell off parts of its business – such as its treatments for kidney disease – that are unrelated to its manufacture of drugs for rare diseases.

On June 16 investors will gather to vote on whether they consider Termeer the right person to steer the company out of its crisis.

Termeer insists he is. “We are on our way to overcoming the manufacturing challenges we have faced,” he said in an interview, “but still ahead is the need to regain the trust of our patients and physicians. I have known this community for many years. In fact, I developed this market from its inception. This isn’t the time to have someone else try to understand it.”

Some industry experts agree.

“Clearly there is a challenge to the company’s reputation, and clearly the products were not being made to the quality that is needed, and he needs to take ownership of that,” said Fred Hassan, the former chief executive of drugmaker Schering-Plough, who turned that company around after it suffered its own manufacturing crisis in 2002.

“But it is important that if investors ever consider changing him, whoever they replace him with must be immediately experienced in handling major manufacturing problems, and there are very few people like that,” he added. “Sometimes it is better to bring in the right crew and get on with it.”

Persuading investors to reinstate their trust in Termeer, however, won’t be easy.

“I think there is a good chance Icahn will get nominated to the board,” said Karen Andersen, an analyst at Morningstar. “A lot of investors are very unhappy at the way he has been leading the company.”

Icahn and his three fellow board nominees said in a proxy filing in April that they “believe that Genzyme’s inability to fully manufacture and supply certain of its products leads to the conclusion that the manufacturing system at Genzyme is ‘broken.'”

Key to his chances of holding on will be whether Termeer can convince investors that the problems are confined to manufacturing and can be “surgically” removed, as he insists, or whether investors believe they reflect a broader breakdown in leadership and operations.

Biotech pioneer
To chart how Genzyme and Termeer have reached this perilous position is to depict the arc of the industry itself. Genzyme is one of the oldest biotechnology companies in the world, and one of the last to remain independent.

It was founded in 1981 by Henry Blair, an enzymologist who had been collaborating with the National Institutes of Health (NIH) to develop a treatment for Gaucher disease using an enzyme known as glucocerebrosidase. The enzyme is found in human placentas, and Genzyme’s job was to collect the placentas, extract and purify the enzyme, and ensure enough quantity to support the clinical trials.

Termeer, a boyish-looking man who still retains his Dutch accent, joined the fledgling company in 1983 after a 10-year career at Baxter, a healthcare company known for producing successful young entrepreneurs.

From the start, Termeer wanted Genzyme to remain independent. To do that he believed it needed to be diversified and it needed to manufacture its own products, not just discover them. With the help of venture capital financing, Genzyme acquired two companies in England that produced raw materials for use in diagnostic testing and pharmaceuticals, and by the time the company went public in 1986, at $2.50 a share, it was generating revenue of nearly $10m.

“We didn’t want to become the research arm of a large pharmaceutical company,” he said. “The notion of building a business, rather than building a research company, became a very important characteristic of Genzyme.”

Independence had its cost. Genzyme was launched on the 15th floor of an old garment building in the red light district of Boston – known as the Combat Zone.

“I would park my car and walk to the office and before I got there I would be propositioned at least three times,” Termeer said. “It was an unusual kind of environment.”

Trials and tribulations
In 1984, the NIH ran its first trial of glucocerebrosidase in a Gaucher disease patient – Brian Berman, who was four. Berman’s belly had swollen to the size of a basketball and surgeons were about to take out his spleen.

“It was pushing all his other organs out of the way,” Termeer said.

The enzyme worked, and Berman’s belly shrank. The result had a powerful impact on Termeer, as well as on Berman’s family. To this day Termeer keeps a picture of Berman on the wall of his office in Cambridge, Massachusetts, and the two remain in touch.

“He is married, he has kids, and he still has his spleen,” Termeer says with pride.

But Genzyme’s scientific advisers were not convinced the company should try to develop a commercial product. It would require expensive clinical trials and, they argued – incorrectly as it turned out – gene therapy was around the corner.

Termeer was determined to press ahead. “I had so many detractors in those days, people who said you are out of your mind,” he said. “But I had seen this boy.”

To make matters worse, the NIH had conducted a new trial, of eight juvenile patients with Gaucher disease, and it failed – mainly because the patients had been bigger than Berman but had not received a greater dose of the drug. When the results were published, most researchers wrote it off.

Termeer, though, remained a believer, as did pioneering Gaucher disease researcher Dr Roscoe Brady and Brian Berman’s mother Robin, a physician. They decided to try to raise $10m to run a new clinical trial.

“We were in a tough situation,” Termeer said. “We had to prove that it worked.”

In June 1987, though more than eight months pregnant, Robin Berman joined Termeer and Brady as they criss-crossed the country in search of funding. It was an arduous task. No-one would give them a cent. The trial results, after all, appeared to speak for themselves.

Then, one day, as Berman was speaking to brokers and wealthy individuals in Albany, New York, her water broke.

“She turned to me for help,” Termeer said. “What was I to do? I’m an economist!”

The episode seemed to jolt the audience.

“Suddenly they got it,” Termeer said. “They called their friends and we raised the money.”

Three weeks later the stock market crashed.

“If we had been any later it would have been a very different story as all financing was then off the table.”

“Crazy dreaming”
The US Food and Drug Administration agreed to let Genzyme conduct a trial of 12 patients – a bold move since it was the height of the AIDS crisis and the agency was concerned about pooling placental fluid.

Patients came from around the world. One flew in twice a month from Germany, courtesy of the German government. Another family moved to Washington DC from South Africa. Two patients in the trial nearly died. And the leg-work involved in producing the enzyme was grueling.

“Every time we ran out of the enzyme we would go around with a little car to all the hospitals in New England and get the placentas,” Termeer said. “Then we would carry them up to the 15th floor where we would fractionate and purify them.”

The purification process took place in two tiny centrifuges. When they were turned on, the entire building, and all the garment workers on the lower floors, shook.

“They tolerated it. They knew it was for a good cause,” Termeer said.

In the end, the drug worked in all the patients, and in 1991 the FDA approved the product, which became known as Ceredase.

“We had a reunion 20 years later,” Termeer said. “We sat together in a hotel here. A lot of crying went on.”

The collection of placentas to produce Ceredase for a larger number of patients became an enormous undertaking. But it happened that Pasteur Merieux, a company in Lyon, France, was using placental blood to produce immunoglobulins and albumin, a substance used to expand blood supplies for use in wars or accidents.

“They would process these placentas in beautiful French wine presses,” Termeer said. Then they would hand over the remaining tissue to Genzyme, which would extract the enzyme. Ultimately, around 30 percent of all placentas from births in the US, and around 70 percent of placentas from Europe, found their way to that plant. Eventually, Genzyme created a genetically-engineered version of Ceredase using Chinese hamster ovary cells. That drug, Cerezyme, was approved by the FDA in 1994, and Genzyme opened its plant in Allston – the plant at the centre of today’s troubles – in 1996.

“When you look at the history of Ceredase, how Henri had to convince the people to do the trial, how it failed, how he went back, it was almost crazy dreaming,” said John Crowley, the chief executive of Amicus Therapeutics Inc and co-founder of Novazyme, a company that had also been developing a treatment for Pompe disease and that Genzyme acquired in 2001. “This was the kind of risk-taking innovation and dreaming that made biotech great. Henri absolutely broke the mold and the paradigm.”

Polarising figure
Termeer is hoping that his history of achievement will be enough to persuade investors to give him the chance to steer Genzyme through its current crisis. But he has his work cut out for him.

“It’s going to be a very difficult thing for him to get a second shot and retain the authority to do the things he wants to do,” said Oliver Pursche, executive vice president at Gary Goldberg Financial Services.

Termeer is, in many ways, a polarising figure, having transformed himself from a scrappy entrepreneur into a leading figure in Boston’s financial, scientific and artistic community. Besides running Genzyme, he sits on the boards of the Massachusetts General Hospital and the Massachusetts Institute of Technology; is chairman of the Federal Reserve Bank of Boston’s board of directors; and is a member of Massachusetts Governor Deval Patrick’s Council of Economic Advisors.

“Henri is the ultimate renaissance man, the Thomas Jefferson of Boston,” said Jack Meyer, a friend who formerly managed Harvard University’s endowment fund and is now chairman of the Boston ballet’s board of trustees. “He thinks deeply, he thinks broadly, he sees all of life and is involved in all.”

But to hear others tell it, there is a darker, more calculating side to Termeer.

In December, 2000, Genzyme acquired Biomatrix Inc, which had developed an injectable treatment for osteoarthritis called Synvisc. Genzyme combined Biomatrix with two of its separately traded units, Genzyme Surgical Products and Genzyme Tissue Repair, to form a new company, called Genzyme Biosurgery. Biosurgery traded independently as a tracking stock.

Tracking stocks are equities tied to the performance of a particular segment of a company’s business and can be a way to unlock the value of a high-growth unit without spinning it off to shareholders. They were popular during the dotcom era of the 1990s but fell out of favour after the Enron accounting scandal in 2001 cast a pall over complex financing vehicles.

Lawsuit challenged buyback
In May 2003, Genzyme eliminated its tracking stock structure. Under its articles of organisation Genzyme could force Biosurgery shareholders to sell their shares back at 130 percent of fair market value. Genzyme acquired Biosurgery shares for $1.77 each.
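The 130 percent formula makes the dispute easy to quantify. As a rough sketch (the arithmetic here is our own inference from the figures above, not a number from the filings), the $1.77 buyback price implies what Genzyme was treating as fair market value:

```python
# Back-of-the-envelope check of the buyback formula described above.
# The 130% multiple and the $1.77 price come from the article; the
# implied fair-market value is our own inference, not a court figure.
buyback_multiple = 1.30
price_paid = 1.77  # dollars per Biosurgery share

implied_fair_value = price_paid / buyback_multiple
print(f"Implied fair market value: ${implied_fair_value:.2f} per share")
```

On those numbers, the buyback treated roughly $1.36 a share as Biosurgery’s fair value, which is the figure shareholders contested.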

To Biosurgery shareholders, the payment amounted to highway robbery. Rory Riggs, the former president of Biomatrix, and John Lewis, a partner at investment firm Gardner Lewis Asset Management and Biosurgery shareholder, filed a class action lawsuit against Genzyme and its executives, including Termeer.

The lawsuit, amended in 2005, charged Genzyme’s executives with deliberately driving down the value of Biosurgery’s stock in order to buy it back at a price beneficial to Genzyme’s shareholders but grossly unfair to Biosurgery shareholders.

“Henri intentionally, with ill will, defrauded his own shareholders to whom he had a fiduciary duty, to make himself more wealthy,” Lewis said. “In my opinion he belongs not in the jail but underneath the jail.”

According to the suit, Genzyme’s own internal valuations put the true value of Biosurgery’s shares at anywhere from $12.75 to more than $50 a share. It alleged that by March 1, 2002, Termeer had already decided to eliminate Biosurgery’s stock. A memorandum from that day’s meeting obtained in discovery states that “we are working toward reabsorbing GZBX,” Biosurgery’s stock symbol.

Between January and July, 2002, Genzyme’s shares fell more than 70 percent amid earnings downgrades and suspicions that the company had kept inventory levels of its kidney disease drug Renagel high in order to artificially boost sales – a practice known as channel-stuffing.

The decline meant a repurchase of Biosurgery could be dilutive to Genzyme’s earnings.

“Against this backdrop, on May 24, 2002, Termeer met with one of his senior colleagues to prepare for the May 2002 Board meeting, to be held five days later,” the suit states. “A remarkable note from that May 24 meeting – ‘letting GZBX success occur – compare pluses and minuses’ – suggests that he decided to solve this problem by turning to an unusual method for a chairman and chief executive of a public company: he decided to work to limit Biosurgery’s apparent success to enable its tracking stock to be repurchased on favourable terms for Genzyme’s shareholders.”

In 2007, shortly before the case was due to go to trial, Genzyme agreed to settle for $64m – or twice the amount it had originally paid Biosurgery shareholders.

To the shareholders, the settlement was an implied admission of guilt. But Termeer says he settled the case to avoid the risk of a trial, and said $64m was a small amount – an indication that the case was “not that big.”

“There was no interest on any of our parts to be anything other than fair.”

Still, Genzyme has never been considered an especially transparent company. Investors have criticised it over the years for its opaque compensation structure, accounting practices and excessively passive board.

Cognizant of that distrust, Termeer has moved to more closely align senior executive pay with performance, alter the company’s accounting practices to bring them in line with its peers, and introduce new people onto the board.

He recently scaled back his own compensation. In 2009, his total package declined 25 percent to $9.5m from $12.7m. Even so, he still received an increase in salary of 8.7 percent and kept the car and driver that ferry him back and forth from his $4m ocean-front home in Marblehead, on Boston’s north shore, at a cost of $81,386.

Termeer concedes there is a price to be paid for a CEO to be at the helm too long.

“I need to move on, both for myself and the health of the company,” he said. “But I want to get us through this problem and start to prepare the company for its next phase in 2011.”

Mining for nanotechnology

Most people wouldn’t naturally associate mining with nano- or green technology, would they? But at Imerys, the world’s leading producer of industrial minerals, that’s exactly what they do, and they’ve dedicated a significant proportion of their R&D effort to both of them.

Nanotechnology is a term that embraces such an enormous range of subjects that it can be hard to understand what it actually means. In material science, however, it’s more specific; it’s about the use of nanometre-sized particles of often quite ordinary substances to obtain new and useful properties that don’t exist at other scales. The earliest examples are becoming well known – suncreams, self-cleaning glass, scratch-resistant coatings and carbon nanotube reinforced composites are typical. All of these use synthetic nanoparticles made by cooking up various chemicals, often in the gas phase, at very low throughput and using large quantities of energy. So, not surprisingly, they come with a hefty price tag.

Natural industrial minerals, by contrast, are found everywhere – in ceramics, beer and wine production, cosmetics, paints, paper, plastics and rubbers – where they reinforce, filter, toughen, stiffen and above all whiten the materials of our daily lives. The need for whiteness specifically requires particles of micrometre rather than nanometre sizes, but nevertheless many mineral products have always contained a significant proportion of nano- and near-nano-dimensioned particles. Over the last few years, Imerys has been busy extracting and processing these for a range of new properties and applications. Examples include a toughening agent for clear coatings, where nano size is needed to achieve the clarity and transparency required for the most demanding applications, and microcapsules made from a polymer/mineral nanocomposite which protect their precious cargo of active ingredients from their surroundings and release them exactly when and where they are needed. And the great thing about using particles extracted this way is that we know they’ve been a safe, tried and tested part of everyday materials for centuries, so there’s no need to fear any unseen health risks from their use.

So how, you might ask, can an industry that extracts minerals by the millions of tonnes be called green? Take a look around you, and consider the fact that the things you see are made up from raw materials, and if these can’t be grown they have to be dug out of the ground. So it’s not whether we mine materials, but what they are, how we do it and how we use them that counts. An example: the conventional role of minerals in petroleum-based plastics is to reinforce and stiffen them, reducing the amount of polymer needed for a given task, and achieving the goals of lower cost and sometimes even lower weight. Do the sums, and you find that, more often than not, this reduces the carbon footprint of the product as well.

It doesn’t stop there; Imerys is currently working with UK partners on minerals that will raise the performance of the new generation of bioplastics to the point where they can take over from the conventional ones made from oil. Add in the fact that the minerals simplify and markedly speed up the moulding and processing steps, and the potential is clear.

Finally, at the end of a product lifecycle the mineral components are relatively easy to recover, as they will survive combustion and other high temperature processes needed to purify mixed wastes. Imerys has already developed specialist materials made from sources as diverse as broken glass bottles and the mountains of sludge created by paper recycling. So take a new look at natural minerals, and realise the potential that Imerys has already begun to unlock.

Further information: www.imerys.com

Nanobody drug to market in 2013

Ablynx specialises in nanobodies – touted as the next generation in antibody treatment – which are derived from llama antibodies. They have the potential to let drugs reach disease targets that are too small for traditional antibodies.

One of its lead drugs is being developed in partnership with US drugs giant Pfizer, which inherited the programme when it took over Wyeth last year. The drug is currently in Phase II trials and could become Ablynx’s first commercialised product.

“The earliest we think it could be on the market is 2013, that’s a tangible distance away,” Chief Executive Edwin Moses told reporters.

Pfizer is hoping it will be able to use the new nanobody drug to replace its arthritis medicine Enbrel – which made almost $1bn in the third quarter of 2008 – Moses said.

The company also has partnerships with Merck, Boehringer Ingelheim and Novartis.

In a twist of nature, llamas have evolved to produce extra-small antibodies, which are in turn used to make nanobodies. They are stable enough to be inhaled or injected under the skin, setting Ablynx’s technology apart from rivals like GlaxoSmithKline’s Domantis, Moses said.

In tests on animals, nanobodies have shown they could replace aspirin and even treat Alzheimer’s disease, and they are currently in mid-size human trials for arthritis and thrombosis.

They also cost about a third of the price of developing traditional antibodies, Moses said.

“We take two or three llamas, give them an injection and go away on holiday [and create] what might have taken 20 medical chemists a year to come up with,” said Moses.

Online market creates too many choices

Many of us turn to personal recommendations when deciding to make a purchase, but the sheer volume of online customer feedback now available can be as daunting as the choice itself.

Leading online retailers – notably Amazon – were quick to develop personal recommendation software, analysing a customer’s buying patterns to suggest other products that they might enjoy.

A team of European researchers are now taking that approach a step further.

Alexander Voss, a researcher at Microsoft’s European Innovation Centre in Aachen, Germany, is coordinating seven companies and universities that are developing advanced methods, models and algorithms to bring personalised recommendations to everything from web content and interactive TV channel guides to e-commerce.

Their algorithms, developed in the EU-funded MyMedia project, combine two sources of data. “Stable information” is derived from user-created profiles – covering details such as age, residence, likes and dislikes – and the star ratings they give to past purchases, such as films they have enjoyed.

“Unstable information” reflects factors such as context, the user’s mood and tastes, and what types of music they tend to listen to.

Most personalised recommendation technology treats unstable information as a black-and-white issue: if a user watched one video, they must want to see more like it.

But the MyMedia researchers are adding a third and very important variable when it comes to finding what people really want. “We’re not just looking at choices in terms of ‘yes’ or ‘no’, but also ‘maybe’,” Voss explains.

“Just because someone didn’t watch a certain video doesn’t mean they wouldn’t want to, they simply might not have had the time or felt like it at that moment.

“Instead of excluding it, we add it to a relative ranking of recommended content that changes over time as the system builds up a better idea of the user’s interests.”

The system is unique in another way too: it allows content providers to fine-tune recommendation parameters to see how the changes affect users’ responses. “There is no other recommender system that can do that,” Voss says.
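The ternary-feedback idea can be sketched in a few lines. This is an illustrative toy, not the actual MyMedia algorithm: the function, the scores and the `maybe_weight` parameter are all invented for the example. Unwatched items are scored as “maybe” and kept in the ranking, and the weight is the kind of parameter a content provider could tune:

```python
# Toy recommender sketch: combines a "stable" profile score with
# ternary "unstable" feedback (yes / no / maybe). All names and
# weights are invented for illustration, not taken from MyMedia.

def rank_items(profile_score, feedback, maybe_weight=0.5):
    """Rank candidate items, keeping unrated ('maybe') items in play.

    profile_score: dict item -> float in [0, 1] from the user's profile
    feedback:      dict item -> 'yes' | 'no' | 'maybe' (missing = 'maybe')
    maybe_weight:  tunable weight a content provider could adjust
    """
    score_of = {"yes": 1.0, "maybe": maybe_weight, "no": 0.0}
    return sorted(
        profile_score,
        key=lambda item: profile_score[item]
        * score_of[feedback.get(item, "maybe")],
        reverse=True,
    )
```

Here a skipped film is down-weighted rather than excluded, and raising `maybe_weight` lets a provider watch how the ranking shifts, which is the kind of fine-tuning the article describes.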

The technology is being tested by e-commerce company Microgénesis, telecoms company BT and the BBC. If it catches on, we could soon have more time to spend enjoying our choices, rather than worrying about what choice to make.

R&D hotspot

Complete with beanbags and coffee served in steel tumblers, an office in Mumbai is helping change the perception that India is no place for top-end research and development.

Staffed with about 60 full-time researchers, many of them Indians with PhDs from top universities in the United States, the centre is at the cutting edge of Microsoft’s R&D. It covers seven areas of research including mobility and cryptography.

Its success, including developing a popular tool for Microsoft’s new search engine Bing, underscores the potential of R&D in India at a time when cost-conscious firms are keen to offshore work to talented researchers abroad.

Showing off the Bing tool which enables searches for locations with incomplete or even incorrect addresses, B. Ashok, a director of a research unit at the centre, said the innovation would never have taken root if the R&D had been done in the United States.

“It was completely inspired by the Indian environment, but is applicable worldwide,” he said.

Structural problems
While India might seem like a natural location to expand offshoring into R&D, it is hampered by some serious structural problems, ranging from too few home-grown researchers to a lack of government support.

India produces about 300,000 computer science graduates a year. Yet it produces only about 100 computer science PhDs, a small fraction of the 1,500-2,000 that get awarded in the United States, or China, every year.

“Students here are not exposed to research from an early age, faculties are not exposed to research and there’s no career path for innovation because there’s a lot of pressure to get a ‘real’ job,” said Vidya Natampally, head of strategy at the Microsoft India Research Centre.

With few government incentives and an education system that emphasises rote learning, India lacks the kind of environment found in say, Silicon Valley, where universities, venture capitalists and startups encourage innovation.

“China has a policy in place for R&D; we don’t,” Natampally said, adding that India could move up the value chain faster if even a small percentage of its engineering graduates went into research.

The small numbers of PhDs and the lack of government incentives for India’s fledgling R&D sector are blunting the country’s edge, analysts warn.

Competition
Rival China has already pulled ahead with more than 1,100 R&D centres, compared with fewer than 800 in India, despite lingering concerns about rule of law and intellectual property rights.

Aside from providing funding to encourage students to complete their PhDs, China also offers fiscal incentives such as tax breaks for R&D centres, and its special economic zones provide infrastructure for hi-tech and R&D industries.

India is also losing out in the patent stakes. In 2006-2007, just 7,000 patents were granted in this country of 1.1 billion people, compared to nearly 160,000 in the United States.

“We’re nowhere near the US or even Israel when it comes to innovations,” said Praveen Bhadada at consultancy Zinnov, which estimates the R&D sector in India is worth about $9.2 billion.

“Our costs are low and our talent pool is ahead of China, Russia and Ukraine, but China gives specific incentives, and produces way more PhDs than we do.”

India is cheaper than China for R&D, those in the industry in Bangalore said. But salaries in India have been rising by about 15 percent every year and may soon reach parity with China. R&D centre costs in Shanghai are currently just 10-15 percent higher than in India.

Beyond coding
Microsoft and other firms have been working around the government’s indifference. Cisco, IBM, Intel, Nokia, Ericsson and Suzuki Motor have all gone beyond low-end coding and tweaking products for the local market, with hefty investments and recruitment.

Their success shows India’s potential if the government starts supporting such ventures and building high-tech parks and incubators.

“If Paris asks for some work, it’s not because they think it’s cheaper but because they want inputs from India,” said Jean Philippe, chief designer of the Renault India Studio, which competes with the French carmaker’s five other global studios.

Texas Instruments and San Jose-based Cadence Design were among the first to set up R&D in India in the mid-80s, drawn by the legions of English-speaking software engineers who could be hired at about 20 percent of the cost of engineers in the United States.

The opening of the economy in the early 1990s and the establishment of the software services industry drew more foreign firms looking to cut costs and tap emerging markets.

“From when a few companies offshored non-critical design work, we have seen India emerge as a preferred destination for design and development of chip, board and embedded software,” said Jaswinder Ahuja, managing director of Cadence India.

Firms first focused on the ‘D’ in R&D, but research has grown in importance in recent years, and many of the facilities in India are now the largest outside their home base.

Half of Cisco’s core R&D work, including innovations in WiMAX and optical networks, and about 40 percent of SAP’s ideas for processes and product development come from India.

“The Indian units are more tuned to the needs of customers in emerging markets. Besides, Bangalore is only a 5-hour flight away from three strategic regions: Southeast Asia, east Asia and the Middle East,” said Aravind Sitaraman, vice president at Cisco.

IBM’s India Research Labs do a “fair share of patenting”, helping swell the parent’s record numbers every year, said director Guruduth Banavar in Bangalore.

Its new $100 million mobile communications research project, Mobile Web, marks the first time a big project has been driven from outside the United States, he said.

“For a research lab it’s the best environment to be in: you can see the problems and the opportunities,” said Banavar, who was previously at IBM’s lab in Boston and has, like several of his peers, returned to India to oversee operations here.

Always backed by a bright IT service

There is one point that often makes interested enterprises hesitant about relocating their infrastructure and applications to the cloud: requirements for data security and related compliance issues. Not all data and documents are business-critical per se. Although some are very much in that category, a company will have different requirements for information it wants to share with business partners, and still others for generally available data. Because the requirements demanded of cloud services vary so greatly, Siemens IT Solutions and Services has developed a new cloud computing approach: “Siemens Bright IT Services”.

Online services for employees, on-demand archive management or a collaboration solution: The best-suited cloud depends on what service a company wants. The “Bright IT Services” approach takes into account these different requirements and offers the right solution for everyone – in the public, the private or the community cloud. Its focus is not on the technology itself, but rather the smartest way for a company to leverage the new opportunities offered by IT. The key question is: How can cloud services be integrated securely in the existing landscape so that they offer the best-possible support for business processes?

Many current cloud service providers are either too small, don’t have the necessary experience in the industry or, like Google or Amazon, offer pure commodity services such as storage space or computing power. In contrast, Siemens IT Solutions and Services focuses on a strategic consulting approach coupled with sector-specific know-how. It analyses security aspects and potential benefits, always from the perspective of the company’s specific requirements: What processes and data are highly sensitive? Are there applications that are less critical and so suitable for the public cloud? What data protection and legal aspects does the company have to consider? What infrastructure already exists and how can it be connected to cloud services?

Integrated IT clouds
More companies are already using cloud services than is generally known. It is often the case that even management or the people in charge of IT don’t know where data is located in the clouds within their own organisation. The reason for this: in many cases, the business departments are themselves buying in the IT systems and applications they need, which offer fast technological support for their own processes. The cloud is usually an obvious choice, since budgets for smaller on-demand IT services can be approved quickly without sign-off from senior management. However, the upshot is frequently a jumbled assortment of different technologies that creates more security gaps instead of plugging them.

It’s therefore advisable to consider an end-to-end, transparent concept sooner rather than later. After all, these diffuse security risks have to be curbed. But the departments still have to be provided with fast and flexible solutions for their day-to-day work. One of the main challenges is therefore to integrate existing IT systems and cloud services into a single overall concept. To name just one example: email services can be implemented quickly and cheaply in the public cloud as a mass application with data that requires a normal level of security. But that may not apply to emails from the managing board or research department. Administrators would then have to manage some of the mailboxes internally and others externally – which involves additional time and effort. An integrated cloud approach from a provider like Siemens IT Solutions and Services makes it easy to choose how many mailboxes are to be obtained from a public and how many from a private cloud. The user interface and configuration remain the same.
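The mailbox example can be pictured in a few lines. This is purely illustrative – the department names and the classification rule below are assumptions for the sketch, not Siemens’ actual policy – but it shows the shape of the decision: each mailbox is placed by sensitivity, while everything the user sees stays identical.

```python
# Illustrative only: route mailboxes to a public or private cloud by
# department sensitivity. The classification rule is an assumption.
SENSITIVE_DEPARTMENTS = {"board", "research"}

def assign_cloud(department):
    """Return which cloud should host a department's mailboxes."""
    return "private" if department in SENSITIVE_DEPARTMENTS else "public"

# The user-facing configuration is the same either way; only the
# hosting decision per mailbox differs.
placement = {d: assign_cloud(d) for d in ("sales", "board", "research", "hr")}
```

In an integrated approach this one decision is all that changes; the mail client, addresses and administration interface are unaffected.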

So that IT service providers can recommend the right mix for their customers, it is crucial for them to maintain multiple partnerships – as Siemens IT Solutions and Services does with Microsoft, VMware and Oracle. That means they are vendor-independent and can evaluate which solutions are best suited for which demands. The solutions can then be integrated simultaneously with both the relevant cloud models and the existing infrastructure.

A service that delivers a business advantage
Cloud computing is not just a means of cutting costs, but also opens up new business models and ways of cooperating with strategic partners. Community clouds are intended to facilitate such collaboration. The focus is on workflows – not just those of one company, but of several, or even an entire industry. The common goal is for the different organisations to be able to work together more agilely and flexibly.

For instance, Siemens IT Solutions and Services has developed such a community cloud for the media and entertainment industry, where work is typically split across several different locations. Media companies frequently outsource particular work steps to external service providers or use offshore capacities. At the same time, the various parties are integrated in the individual processes and must always be able to access up-to-date content promptly. Until now, such activities were coordinated mainly by sending material back and forth on tape by courier. Community cloud services now aim to simplify, speed up and improve the quality of specific production and broadcasting processes in the media business. They cut costs and make it easier to meet compliance requirements. Thanks to cloud computing, any authorised person can now access digitised media content quickly and easily from anywhere in the world using a web browser, and so speed up the approval process.

The cloud – yes please, as long as it’s secure
Before venturing into cloud computing, more than 80 percent of companies want guarantees that their data will be protected. These were the findings of a recent survey by analysts from Information Technology Intelligence Corporation (ITIC) among 300 companies with up to 100,000 employees worldwide. Compliance is also a question on managers’ minds: Where is my data stored and who can access it? How can user identities be controlled and protected and access verified in an auditable way?

A private cloud is suitable for especially security-critical company data and applications, since customers know precisely in which country – and even which data centre – their data is stored. Data can also be transferred in encrypted form or stored encoded in secure databases. As a result, demarcated parts of the cloud can be specifically assigned to a customer and even managed by specially selected system administrators. The drawback: the higher the cloud service’s security level, the less flexible it is, because resources cannot be pooled and used dynamically. In addition, at least some of the economies of scale, and hence cost advantages, are lost.

Authorisation to access the clouds
To protect cloud services for its customers, Siemens IT Solutions and Services uses its own solutions for Identity and Access Management (IAM), such as its flagship product DirX, which administers the roles and rights of employees and external partners and so protects resources and systems against unauthorised access. That is particularly important in community clouds, where a large number of people across different organisations use data from the cloud. Thanks to their authorisations, employees or business partners can also identify each other in the cloud and set up and collaborate in secure networks across departments and companies. Single sign-on means that users have to log on only once to gain access to all the cloud services they are allowed to use. In this way, important compliance guidelines, such as EuroSOX or Basel II, are easy to observe, because it is always possible to track who accessed what data.
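The roles-and-rights model just described can be illustrated with a generic role-based access check. This is a toy sketch, not the DirX API: roles grant rights, and every access attempt is logged, which is what makes the “who accessed what data” audit trail possible.

```python
# Toy role-based access control with an audit trail (generic sketch,
# not the DirX API). Roles map to the rights they grant.
ROLE_RIGHTS = {
    "employee": {"read", "write"},
    "partner":  {"read"},          # external partners get read-only access
}

audit_log = []                     # who accessed what, and whether it was allowed

def check_access(user, role, right, resource):
    """Allow or deny an access attempt, recording it either way."""
    allowed = right in ROLE_RIGHTS.get(role, set())
    audit_log.append((user, resource, right, allowed))
    return allowed
```

Because denied attempts are logged as well as granted ones, an auditor can later reconstruct exactly who touched which resource – the property the compliance guidelines above require.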

Siemens’ internal Computer Emergency Response Team (CERT) also ensures the necessary security by keeping worldwide web threats on its radar, analysing them and taking prompt and appropriate protective measures. Since a secure internet connection is a vital part of cloud computing, repulsing such attacks is crucial. Data protection is also ensured by using certified data centres and, where necessary, anonymising information processing.

Further information: www.siemens.com

Internet security

Ken Olson, founder of DEC, famously said in 1977 that there is “no reason anyone would want a computer in their home.” We have come a long way since Olson made that statement. Hundreds of millions of people around the globe now use a computer in the workplace as well as at home for a multitude of tasks, ranging from banking and entertainment to keeping in touch with friends and family. The basis for the personal computing phenomenon we take for granted today was the launch of Windows 95.

But the possibilities Windows 95 offered to developers also inspired a generation of hackers and virus writers. Whilst the skinhead yob was spray-painting walls, his geeky counterpart was programming viruses. And he was good at his hobby – in 1996, the number of computers infected with a virus was 10 per 1,000, but by 2000 the number had risen to 91 per 1,000. The virus writers of the late 90s caused more damage than physical graffiti ever did. Businesses suffered financial losses and damaged corporate images whenever IT systems had to be repaired, and lost or corrupted data recovered. Not that the antivirus companies minded. The virus explosion was good business. Leading consumer antivirus company Symantec saw its revenue almost double, from $578.4m to $1.071bn, between 1998 and 2002.

Soon, viruses like ‘Melissa’, ‘I love you’, and ‘Kournikova’ were making headlines around the world and consumers flocked to the shops to buy antivirus software. Microsoft and the innovative Windows 95 platform were now perceived by many as the root cause of what was termed the ‘Internet security issue.’ Windows 95 was superseded by Windows 98, then Windows 2000, but the virus phenomenon showed no sign of stopping. Any hopes that the problem would simply disappear like any other fad or anti-establishment protest were short-lived, and Microsoft continued to be the scapegoat.

In January 2002, a memo from Bill Gates announced the formation of the Trustworthy Computing (TwC) division, a group charged with ensuring that all Microsoft products, not just Windows, would be developed with security as a paramount requirement. Gates was making a statement. He saw that security, and Microsoft’s connection with the issue, be it real or perceived, was a major threat to the company’s future success. He wanted to get serious and he wanted employees, and the outside world, to know. Driving almost instant cultural change across a 60,000-plus company was a huge challenge for the company’s leadership. It began with a decision to suspend work on all products, including the already delayed Windows Vista, whilst developers and engineers underwent extensive training on how to develop more secure code. Today, ensuring secure engineering practices company-wide remains a core responsibility for TwC.

The result of training its engineers was the creation of the Security Development Lifecycle (SDL), an engineering process which all Microsoft products go through and which is still the foundation of the company’s security strategy. The SDL demands regular security reviews throughout the development process and a final assessment that determines whether a product can ship or not. Windows Vista was the first operating system to go through the SDL from day one, and since then every internet-facing or enterprise-class product must do the same. In 2008 Microsoft shared the SDL process with developers outside the company so they could also develop more secure code.

Graham Titterington, principal analyst at Ovum, agrees with the practice. “Microsoft’s approach to the problem of producing secure IT systems is a good model that shows the necessity to build in security across every activity,” he says. However, despite Microsoft’s decision to re-train its developers and implement processes such as the SDL, critics initially dismissed TwC as Microsoft’s attempt to deflect blame. Seven years on, many of those critics now acknowledge the impact of Microsoft’s efforts through TwC and agree with the company’s assertion that security is an industry issue and not the exclusive preserve of Microsoft. Eric Domage, research manager, Security Products & Services at IDC EMEA Software Group, says: “It is amazing how much Microsoft has changed. In the early 90s, it was a world-class provider of vulnerabilities and software breaches. Today, Microsoft has become a major IT safety player, from threat and vulnerability detection to investing its own money in educating local law enforcement agencies.

“Whatever your opinion of Microsoft, and mine is not always positive, you must think about the global effort made by the company and look at the effects. Because of Microsoft, antivirus and advanced security tools (MSRT, Windows Defender, etc.) are almost free in the home. Browsers are safer, browsing is safer, operating systems are robust, spammers are sued and sentenced, and global security on the internet has got better. “A lot more has to be done. The pervasive nature of IT in the world makes hacking really attractive for many criminals, as well as non-criminals looking for revenge on something important. Microsoft will never solve the deep issues of IT weaknesses on its own. But its contribution to a safer Internet is real.”

Microsoft’s own Security Intelligence Report shows that in the second half of 2008 nearly 90 percent of disclosed vulnerabilities affected applications – the software programmes that sit on top of the operating system – which suggests that as Microsoft makes Windows more secure, hackers are shifting their focus to attack third-party software vendors, web services providers and original equipment manufacturers. Figures like these show how security has evolved and why it is a bigger issue today than ever – despite Microsoft putting its own house in order. Developing more secure software is only part of the answer, and Microsoft has expanded the focus of Trustworthy Computing. One example is the creation of the Microsoft Security Research and Response Centre, a global security response team created to protect against vulnerabilities discovered in Microsoft products. Once a vulnerability is identified, the Centre assesses its impact, then develops and delivers software security updates on a regular, predictable monthly schedule to deal with it. The updates are tested with the different operating systems and applications they affect, then localised for markets and languages across the globe. “Researching vulnerabilities, engineering and testing resolutions for them and distributing them to a regular and predictable calendar is a huge undertaking,” says Roger Halbheer, Microsoft’s chief security advisor, EMEA. “Despite the industry’s best efforts, vulnerabilities in software code will always exist and, unfortunately, so will criminals looking to exploit them.

All major software companies produce and distribute security updates. We have chosen to be very transparent, predictable and open about our process. We tell our customers that every second Tuesday in the month they can expect to receive security updates from us so they can plan ahead to implement them.” He adds: “I can understand why some people think ‘here we go again, another Microsoft security problem’ but the reality is that all companies are distributing security updates. Our approach is to actively promote the update process and make it predictable so that our customers can plan to install them and ensure their networks or home PCs are secure. The important element to focus on is whether companies are investing in identifying vulnerabilities and being proactive and effective in dealing with them.”

Through Trustworthy Computing, Microsoft’s efforts deserve to be acknowledged. And yet, despite all this, the security issue persists. Microsoft is no longer a soft target for hackers and virus writers. However, perhaps as a consequence of the company’s secure engineering efforts, cybercriminals have shifted their focus to exploiting weaknesses in human nature, using scams and other crimes of deception that have nothing to do with flaws in technology. Fifteen or 20 years ago misguided computer enthusiasts were the source of the problem, but today’s cybercriminal has no interest in technology at all. The Internet is simply a tool to be used for committing fraud and identity theft.

With such a complex, ever-evolving landscape of online threats, it is clear that one organisation, even one as big as Microsoft, cannot have all the answers and solutions. Microsoft has made progress in addressing security since Bill Gates’ TwC Memo in 2002, and the approach has undoubtedly been successful. However, to ensure a safer online experience for all, both the private and public sectors must continue working together, whether by sharing technological innovations, entering partnerships, educating consumers or a combination of all three.

E-readers: Opening a new chapter

Ask C.T. Liu about future growth engines for his company, LCD maker AU Optronics, and he whips out his Kindle e-reader in lieu of an answer.

Sony Corp has joined the paperless wave with its own e-readers, partnering with Google to offer public domain books that are no longer protected by copyright.

Other believers in the dawn of a paperless age include Taiwan’s Netronix, which is making similar models with touchscreens, and Dutch firm Polymer Vision, which is set to introduce a pocket e-reader with a rollable display soon.

“We see it as a new industry,” said Liu, a senior vice president at AU, the world’s No.3 LCD maker whose panels are part of Dell, Hewlett-Packard and Apple PCs, as well as Sony LCD TVs.

“It replaces paper, printing, publishing, text books, and so on,” said Liu, in charge of AU’s consumer display business.

The growing number of models could help to bring down prices and boost sales, making these portable readers the next breed of must-have gadgets.

Weighing less than a typical paperback, e-readers use a new generation of light, flexible and interactive display known as e-paper. Once the power is off, the image remains unchanged on the screen, and no added light source is needed to read it.

Because they require no backlighting, unlike traditional LCDs, e-readers consume far less power and are also much lighter. A typical Kindle can be read for days without recharging.

The bright future of e-books is particularly attractive to major LCD makers in Asia, including AU and hometown rival Chi Mei Optoelectronics Corp, at a time when they are struggling with sluggish sales of PCs and flat-screen TVs.

AU, which booked a record loss in the October-December quarter, is branching out to the new display sector by buying a 21 percent stake in e-paper specialist SiPix Imaging Inc.

The End of Paper?
Amazon.com Inc’s Kindles have proved a hit since their launch in 2007. Citigroup estimated the US online retailer sold a half-million Kindles in 2008, about one-third more than the number of iPods sold by Apple in its first year.

Some critics argue that e-readers could become the victims of their own success if cellphone makers take notice, incorporating the newer display technology into their own models and including similar reading applications.

Other kinds of devices could also try to incorporate ebook-like applications.

“There will be more low-cost digital reading platforms coming out. Netbook PCs, for example, are much cheaper,” said Jeremy Huang, who tracks the e-book market for the Market Intelligence Centre, a private industry researcher in Taiwan.

Netbooks, low-cost mini-notebook computers, sell for as little as $299, while a Kindle retails for $359 and the latest model of the Sony Reader is priced at about $349.

Market research firm iSuppli Corp forecasts global e-book display revenue will rise to $291m by 2012, representing an annual growth rate of 143 percent from 2007. That is still much smaller than about $72bn for large LCDs last year.
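The iSuppli figures can be sanity-checked with the compound-growth formula: at 143 percent annual growth over the five years to 2012, a $291m forecast implies a 2007 base of only about $3.4m, which underlines just how young this market was.

```python
# Back out the implied starting value from a forecast and a compound
# annual growth rate: end = base * (1 + g) ** years.
def implied_base(end_value, annual_growth, years):
    """Invert compound growth to find the starting value."""
    return end_value / (1 + annual_growth) ** years

base_2007 = implied_base(291.0, 1.43, 5)   # in $ millions; roughly 3.4
```

The same function works for any of the growth claims in the article, so quoted rates and endpoints can be checked for mutual consistency.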

To help drive the e-reader market, some analysts say book publishers could subsidise a low-cost e-reader, or even give one away, with a multi-year subscription, similar to how telecom operators subsidise cellphones in return for service contracts.

The use of e-paper displays in other devices, such as signs, could also help to build economies of scale, bringing down costs.

Supporters imagine a day when e-paper versions of portable newspaper and magazine readers might be rolled up or folded, and carried to the beach or read on the train by commuters.

The timing of e-readers seems particularly good: the ink-stained newspaper market is struggling, with some papers cutting their print editions due to the recession and more people going online to get news for free.

In February, US-based Plastic Logic signed strategic deals with media partners, including the Financial Times, USA Today and Zinio, for its first e-readers to target business users. The devices boast a large display and weigh less than many printed magazines.

In South Korea, LG Display Co Ltd has been developing A4-size color flexible e-paper among other potential products for the e-reader market.

“Screens will replace papers, that’s given. But I doubt whether e-books will make a sizeable market in the next two years,” said Park Hyun, an analyst at Prudential Investment & Securities.

“They are preparing for the new business, but it won’t mean much for their business or earnings until the market grows in size.”

The new world of mobile communications

An array of new services has been developed so that users are “always on” (e.g. Facebook and Twitter for mobile, instant messaging, presence, etc.). These new mobile services, operating via the Internet Protocol (IP), have been made feasible by exploiting the huge data networks that mobile operators have deployed over the last few years. What is effectively a “Mobile IP Space” has been created as a result of consumers’ voracious demand for IP services.

An important criterion here is the quality and price of the new services on offer. Data flat rates offered by operators have emerged as a low-cost means for consumers to explore and enjoy any new service as it becomes available. This is an important point because, as increasingly stable mobile voice-over-IP services emerge (offered by third parties such as Skype), operators face a conflict: these new services run on their networks yet eat into their established, lucrative GSM voice revenue.

Smartphones, which have emerged over the last few years, are increasingly becoming mobile entertainment centres. Users can read the latest news, download music, take photos, and send and view video clips. Equipment manufacturers are releasing new hardware all the time, and mobile operators are increasingly focusing their resources on marketing IP data services (iPhone apps, etc.). In spite of this, voice traffic still accounts for the majority of operator sales, with 80 percent of their top line still generated by voice services. Taking this into consideration, it is not surprising that operators have been reluctant to move rapidly to deploying voice services over IP; however, it is only a matter of time until they can no longer ignore the consumer requirement for such a service.

The innovative Swiss start-up Qnective AG offers mobile VoIP and other IP solutions for operators, Internet Service Providers (ISPs) and social networking companies. The solutions stand out because of their technical quality and ease of integration. The company has developed platforms which combine the entrenched GSM world with the advantages of IP-based communications. Development was by no means straightforward when one considers the numerous manufacturers of mobile handsets with different operating systems (Symbian, RIM, Apple, Android, etc.), all operating on various mobile operator network technologies (GPRS, EDGE, HSDPA, etc.).

The dynamic world of mobile comms
Third party service providers have been pioneers in the field of new mobile voice services via IP.  Major technology players such as Google and Microsoft have also pinpointed this field as a key strategic focus for their future development.  Consumers have been able to access the array of new, low-cost IP services offered by these companies and, now that bandwidth in mobile network technology is stabilising, they are becoming genuine competitors to the incumbent mobile operators. Additionally, national regulators are putting pressure on operators such as Vodafone, T-Mobile and O2 to manage their overall network structure to full IP over the next few years.

A summary of the market trends forcing the incumbents to evolve would look as follows:
• GSM is largely limited to SMS and voice, while consumers are demanding additional services that only mobile IP can support, e.g. internet, presence, video streaming, “apps”, social networks, etc.
• Smartphones are the highest-growth area in handset penetration, facilitating the offer of “enhanced communication services” to mobile consumers who are “always on”.
• Technology companies like Google, Apple and Microsoft are steadily moving into voice services and encroaching on the markets of traditional telco operators, forcing them to react and evolve towards IP.
• Consumer-focused IP businesses like Skype are gaining traction as users become more accustomed to using mobile IP (not just desktop) – their success has not gone unnoticed by operators.
• IP communication uses network infrastructure more efficiently, making it cost-effective for operators to provide consumers with more attractive offers, e.g. genuine flat rates.

Qnective AG predicted these trends and market developments and provides operator platforms for mobile voice and multimedia over network data channels. Qnective’s technology is an advanced hybrid GSM/IP solution, enabling operators to execute a smooth transition to full IP. Qnective achieved its first full integration with the Dutch operator 6GMobile, formerly the mobile unit of British Telecom. With regard to his company’s strategic objectives, Oswald Ortiz, CEO of Qnective AG, says: “The overall goal is to establish Qnective as the preferred provider of IP-based, hybrid mobile voice and multimedia services to the leading telecommunication players and to achieve a meaningful portion of the global GSM subscriber base within five years.”

What makes Qnective’s solutions unique?
Firstly, the company has succeeded in implementing GSM-quality voice services via IP data connections. Crucially, its IP-based communication can be fully integrated into the GSM network environment. The result is a hybrid solution which brings together the advantages of the IP world with those of traditional telephony. This software-based solution does not need any new hardware infrastructure.
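The heart of such a hybrid client can be pictured as a single bearer-selection decision. This is a sketch under assumed quality thresholds, not Qnective’s actual logic: place the call over the IP data channel when its measured quality is good enough, and otherwise fall back to the ordinary GSM circuit so the call always goes through.

```python
# Illustrative bearer selection for a hybrid GSM/IP voice client.
# The thresholds below are assumptions for the sketch, not product values.
def choose_bearer(ip_available, packet_loss, jitter_ms,
                  max_loss=0.02, max_jitter_ms=40):
    """Pick 'voip' when the data channel is usable, else fall back to 'gsm'."""
    if ip_available and packet_loss <= max_loss and jitter_ms <= max_jitter_ms:
        return "voip"
    return "gsm"
```

Because the fallback path is the existing GSM circuit, the user experiences a seamless service even when data coverage degrades mid-journey, which is the essence of the hybrid approach described above.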

Non-traditional telecommunications companies could therefore enter the field of mobile communications or extend their existing IP services with a manageable financial outlay. Low-cost data channels, which are still somewhat underused, can be deployed for mobile IP communications. The interfaces are designed in such a way that the solution is suitable for universal use and is compatible with different set-ups.  

Qnective’s unique differentiating technology
• Highest-quality IP applications for mobile telephony (carrier-grade voice clarity, etc.).
• Clients for many operating systems and end devices like Symbian, Blackberry, Android & Windows Mobile (including Windows PCs).
• Innovative social services for interacting with Facebook, chats and other similar platforms and people-to-people networks.
• Can be fully integrated into the GSM world (HLR integration, CLI display, business rules, etc.).
• Seamless antenna handover to avoid call drops.
• Lowest data usage amongst competing offerings.
• One single number for Desktop/Cell phone, online on multiple devices.
• Optional highly secure solution (encrypted IP telephony), deployable autonomously.

Further information: www.qnective.com

Untangling the world wide web

Making sense of the media content available on the internet is a daunting task. It would take 24 hours just to watch the videos that are added to YouTube every sixty seconds.

For organisations that need to keep track of what is being said about them and about their products, this is a big headache.

Among those working on a potential solution are researchers at the Boemie project. The name is an acronym for the not-snappily-titled Bootstrapping Ontology Evolution with Multimedia Information Extraction.

Turning the “ore” available on the internet into “gold” is how they describe their aim.

The Boemie team are building highly structured “knowledge bases” that can automatically – or, for now, semi-automatically – identify, analyse and index almost any multimedia content.

Video, text and audio content from a multitude of sources can be categorised, labelled, indexed, searched and retrieved as needed.

The system needs a bit of human intervention to get started. Someone with knowledge of a particular topic – sport is the one they’ve been experimenting with – defines a few key concepts. They might define “tennis match” as a type of sporting event, and the concept “Wimbledon finals” as an example of a tennis match.

The computer then takes over. It builds an ontology – a formal way of linking concepts together – and uses it to extract useful information from a variety of multimedia sources.

As it learns more about tennis, the computer suggests new concepts to add to the ontology, which an operator who knows about sport can accept, reject or modify.

What makes Boemie special is that once it’s built some knowledge about tennis, it goes back and analyses all of its information again, reviewing it in the light of what it has learned.

It can repeat this process again and again.
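The iterative process just described can be caricatured in a few lines of code. This toy sketch is nothing like the real Boemie machinery, which extracts from multimedia rather than matching “is a” phrases, but it shows the key idea: concepts learned in one pass let the system find more concepts when it re-analyses the same corpus, until nothing new appears.

```python
def strip_article(phrase):
    """Drop a leading article so 'a Wimbledon final' matches 'Wimbledon final'."""
    words = phrase.split()
    return " ".join(words[1:]) if words and words[0] in ("a", "an", "the") else phrase

def bootstrap(corpus, ontology):
    """Repeatedly extract 'X is a <known concept>' facts until a fixed point."""
    changed = True
    while changed:                       # re-analyse everything after each pass
        changed = False
        for doc in corpus:
            for concept in list(ontology):
                marker = f" is a {concept}"
                if marker in doc:
                    subject = strip_article(doc.split(marker)[0].strip())
                    if subject and subject not in ontology:
                        ontology[subject] = concept   # a reviewer would vet this
                        changed = True
    return ontology

corpus = [
    "the 2008 men's singles is a Wimbledon final",   # only understood on pass two
    "a Wimbledon final is a tennis match",
]
ontology = bootstrap(corpus, {"tennis match": "sporting event"})
```

The first sentence yields nothing on the first pass; only after the second sentence teaches the system what a “Wimbledon final” is does re-analysis extract the remaining fact, which is exactly the bootstrapping behaviour the project is named for.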

The Boemie project has significant commercial potential, says George Paliouras, its technical manager.

“Without semantic indexing, it’s very difficult to retrieve multimedia content,” he says.

“Boemie offers a new approach to do this at a large scale and with high precision. It can speed up and improve the analysis, categorisation, indexing and retrieval of almost any kind of multimedia content.”

We might finally be able to untangle the web.

Combining cloud and in-house IT

Experienced system integrators like Siemens IT Solutions and Services are needed to analyse which applications are suitable for cloud computing and to integrate them into existing infrastructures. Cloud computing is driven by cost pressure and the need for increasing flexibility in a fast-changing business environment.

In uncertain times like these, why would a CIO be keen to make significant investments in server parks and hardware that are not utilised to their full capacity? But to maximise both security and cost-effectiveness, companies need to carry out a detailed analysis of which applications and data are security-critical and which are not. They can then run non-critical applications in a cost-efficient public cloud, while sensitive workflows and data remain in the company’s own data centres. To ensure smooth workflows, both are interlinked using a common platform.

Reaching the full potential of the cloud
Siemens IT Solutions and Services not only offers industry-specific consulting and security assessments on cloud strategies and solutions; it also carries out the transformation and integration of the systems. Siemens is one of only a few providers that cover the entire spectrum, from software as a service (SaaS) through infrastructure as a service (IaaS) to platform as a service (PaaS).

Boasting a broad vendor ecosystem, Siemens is able to deliver the solution that works best in almost any given environment. Combining its integration experience, architecture know-how and security knowledge, Siemens delivers hybrid models that provide cloud flexibility in a controllable and measurable setting.

Software as a service
The advantage of software as a service lies in the fact that highly standardised applications are available on a pay-per-use basis, making SaaS both highly affordable and flexible. Providers can smoothly install upgrades in the background, ensuring that users always have the latest software version and that the same version is immediately available to all parts of the organisation across the globe.

Infrastructure as a service
With IaaS, companies can purchase infrastructure services such as computing power, storage, or archiving space within minutes via a web browser portal. Users also benefit from easily adaptable and very flexible computing and storage services that meet dynamically changing requirements. In such an on-demand model, users can cancel services they no longer need at any time. For companies that wish to take advantage of multiple IaaS providers, Siemens offers services based on its hybrid architecture, which can maximise the potential of using several on-demand environments.
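The economics of the on-demand model above can be sketched in a few lines: resources are provisioned when needed, cancelled at any time, and billed only while active. The class and method names below (`OnDemandAccount`, `provision`, `cancel`, `bill`) and the rates are hypothetical, not any vendor's API:

```python
# Minimal sketch of on-demand IaaS billing: the customer pays hourly rates
# only for resources that are currently provisioned, and cancelling a
# resource immediately stops future charges. Purely illustrative.

class OnDemandAccount:
    def __init__(self) -> None:
        self.active: dict[str, float] = {}   # resource id -> hourly rate

    def provision(self, res_id: str, hourly_rate: float) -> None:
        """Spin up a resource; it starts accruing charges from now on."""
        self.active[res_id] = hourly_rate

    def cancel(self, res_id: str) -> None:
        """Cancel a resource at any time; no further charges accrue."""
        self.active.pop(res_id, None)

    def bill(self, hours: float) -> float:
        """Charge for the given period based on currently active resources."""
        return sum(self.active.values()) * hours

acct = OnDemandAccount()
acct.provision("vm-1", 0.10)      # a virtual machine at 0.10 per hour
acct.provision("store-1", 0.02)   # archive storage at 0.02 per hour
first_day = acct.bill(24)         # both resources running
acct.cancel("vm-1")               # demand drops: scale down immediately
second_day = acct.bill(24)        # only the storage is still billed
```

The point of the sketch is the contrast with owning hardware: cancelling `vm-1` halves the bill the same day, whereas an under-utilised server park keeps costing money regardless of demand.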

Platform as a service
PaaS goes beyond IaaS by also providing development tools for applications that run in the cloud. These tools can be used to develop new applications or adapt existing applications to run on the cloud-based platform. There are two ways for customers to enjoy the benefits that PaaS can offer. Either they set up and maintain their own application, or they choose a partner which aggregates and/or integrates many applications on the same platform into a complete solution. In both cases, an on-demand business model means that a vendor, such as Siemens, provides a secure and stable application platform.

Further information: Siemens IT Solutions and Services, Anne Beck, Otto-Hahn-Ring 6, 81739 Munich, Tel. 089-636 47982, Fax: 089-636 42162, E-Mail: anne.beck@siemens.com, www.siemens.com/it-solutions

Jobs troubled by Foxconn deaths

Jobs was making his first public comments about a series of apparent suicides by employees at a complex operated by Foxconn, the contract-manufacturing unit of Hon Hai Precision Industry, which also counts Hewlett-Packard and Dell among its clients.

At this year’s All Things Digital conference, an annual gathering of A-list technology and media executives in California, Jobs sniped at Adobe Systems Inc’s “waning” Flash technology, vowed not to get into a search battle with Google, and waxed lyrical about the future of tablet PCs.

Jobs also talked about how he conceived the iPad even before the iPhone. Apple released the iPad in April and it has quickly defined the tablet computer market, selling more than two million units in the first 60 days.

But a string of deaths at Foxconn’s base in southern China, which critics blame on stressful working conditions, threatens to cast a shadow over the device’s success.

“It’s a difficult situation,” Jobs, dressed in his customary black turtleneck and jeans, said on stage. “We’re trying to understand right now, before we go in and say we know the solution.”

The iPad’s momentum has helped drive share gains.

Apple recently overtook long-time nemesis Microsoft to become the world’s largest technology company by market value – an event unthinkable a decade ago – and Apple’s shares have spent much of 2010 hitting new highs.

“For those of us that have been in the industry a long time, it’s surreal. But it doesn’t matter very much, it’s not what’s important,” Jobs said. “It’s not what makes you come to work every morning.”

Top dog
Jobs has appeared at the event in previous years, but not since 2007. Much has changed for Apple – and its helmsman – in that period. A pancreatic cancer survivor, the company’s founder underwent a liver transplant a year ago.

The company’s growing clout and business ambitions have also increasingly put it at the centre of several high-profile disputes and in the regulatory spotlight.

The US Justice Department is making preliminary inquiries into whether Apple unfairly dominates the digital music market through its iTunes store, sources say.

Hostility between Apple and Adobe has been brewing for months. Apple has criticised Flash as a buggy battery hog, while Adobe has accused Apple of exerting tyrannical control over developers creating programs for the iPhone and iPad.

“We didn’t start off to have a war with Flash or anything else. We just made a technical decision,” Jobs said.

Jobs also addressed criticisms of Apple’s decision to offer the iPhone in the US only on the AT&T wireless network, which is often faulted for sluggish performance.

Without explicitly naming AT&T, Jobs acknowledged in response to a question that the network was having “troubles,” but said he believed quality would improve.

“I’m convinced that any other network had you put this many iPhones on it would have had the same problems,” he said.

Asked if there might be advantages to offering the iPhone in the US on more than a single wireless carrier, Jobs said that there might be, but declined to provide further details.

Some tech blogs have speculated Verizon Wireless could soon offer the iPhone, which continues to be the standard-bearer in the smartphone market amid growing competition from handsets running Google’s Android platform.