Pioneering sustainability in the UAE: Bee’ah advises

Waste management has become a high priority in the United Arab Emirates due to rising migration and consumption levels. This has prompted Bee’ah, an environmental and waste management company in the city of Sharjah, to lead the charge in recycling by setting an ambitious ‘Zero Waste to Landfill’ target, which would make Sharjah the first environmentally responsible city in the Middle East and one of the most responsible in the world.

Waste management objectives are never without substantial challenges – particularly in the context of the Middle East. Bee’ah has identified ‘source segregation’ as the main priority for achieving our Zero Waste goal in Sharjah, so that less waste ends up in landfills. This is because an estimated 70 percent of the waste generated in Sharjah can be recycled: 40 percent dry recyclables and 30 percent organic material.

For this reason, it is of vital importance that our community is well informed about the methods and benefits of separating their waste so most of it can be recycled at Bee’ah’s Material Recovery Facility (MRF). This is the largest waste-sorting plant in the Middle East and the third-largest in the world, with a total annual capacity of up to 500,000 tonnes of recyclable waste.

Through proper segregation, waste management and recovery systems, much of the waste produced by our community can be recovered for composting and recycling. That includes paper, plastic bottles, and aluminium and steel cans. Indeed, most of the waste generated contains valuable recyclables that could be processed and successfully re-used in our economy.

70% of waste in Sharjah can be recycled

In a country that has consistently ranked high for waste per capita, the failure to segregate costs the UAE economy around AED1.5bn ($408m) every year. Educating and informing the public about proper waste management and recycling methods is all about closing the loop, and preventing and reducing further unnecessary wastage.

Turning garbage into new raw materials creates a much more efficient and prosperous economy. Given the UAE’s lack of natural resources, developing infrastructure, raising awareness and introducing regulations are all vital to long-term sustainable success.

The cost of inaction
Waste management in the Gulf Cooperation Council countries is the responsibility of municipalities, most of which have general regulations and rules for the management and disposal of solid waste, along with programmes to reduce the amount of solid waste generated. Right now in the UAE, you can see the infrastructure being built and businesses showing support for the Sharjah Municipality’s successful efforts to raise public awareness and increase recycling. But our regulations need to be accompanied by public awareness, cultural change and practical action if we are to achieve Zero Waste to Landfill.

The regulatory challenges we face in this waste management objective are inconsequential when compared to the future impact of taking no action at all. For instance, it takes almost 400 years for a plastic bottle to naturally degrade if it ends up in landfill: given the average person in the UAE is responsible for approximately 350 to 450 bottles of water going to waste every year, this represents an enormous environmental tragedy. The UAE is one of the highest consumers of bottled water globally, so the local bottled water manufacturing industry could save millions of bottles and vastly increase its revenue by using recycled raw materials to produce new bottles. Furthermore, recycling plastic products requires 20 to 40 percent less energy than manufacturing new ones and, as the international community has demonstrated, recycling can save the wider economy millions of dollars in resource and energy expenditure.
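As a back-of-the-envelope illustration of the scale involved, the per-person figures above can simply be multiplied out. The population figure below is an assumption for illustration only, not a number from the article:

```python
def bottles_wasted(population, per_person_range):
    """Annual (low, high) bottle counts for a population, given a
    per-person range such as the 350-450 bottles cited above."""
    low, high = per_person_range
    return population * low, population * high

# Assuming a population of roughly nine million for illustration:
low, high = bottles_wasted(9_000_000, (350, 450))
print(f"{low:,} to {high:,} bottles per year")  # 3,150,000,000 to 4,050,000,000
```

At billions of bottles per year, even the lower bound of the 20 to 40 percent energy saving from recycling compounds quickly.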

Encouraging people to recycle is a long road and the journey has only just begun in the United Arab Emirates. Bee’ah is just eight years old and has already built the facilities and raised the awareness needed to encourage measurable recycling in Sharjah: there has been an increasing level of consciousness and, as a result, exponential growth in the number of plastic bottles arriving at Bee’ah’s facilities.

Successes to date
Our experience to date has been that, whenever we provide recycling facilities, people do use them. That said, Bee’ah has also introduced important education and recycling initiatives to accelerate progress: it has been implementing residential recycling initiatives in communities across Sharjah and will roll them out in more areas of the city over the next year, and it has just concluded the fourth academic year of the Bee’ah School of Environment initiative, which educates 164,000 students in 206 schools annually, with the aim of reaching even more students and teachers next year.

After launching residential recycling within some of Sharjah’s sectors in 2012 and achieving 70 percent progress towards Zero Waste to Landfill in 2014, Bee’ah has begun plans to launch residential recycling across all sectors within the city. We are extremely confident that, by providing the right tools and infrastructure, Bee’ah will help our community lead environmental change for many years to come and will fully accomplish our waste management goals.

The whole aim of recycling is to turn what is now waste into valuable raw materials for industry to use; this creates jobs in the UAE and substantially reduces the need for imported materials. We are all trained to look at recycling from an environmental and an economic viewpoint. The environmental benefits are obvious: a good recycling programme in the UAE would save hundreds of millions of tonnes of resources every year. By recycling, we are also capturing valuable resources and feeding them back into the economy of the UAE.

Bee’ah’s Material Recovery Facility is the largest waste-sorting plant in the Middle East

The next challenge
The next challenge for Bee’ah in reaching its Zero Waste target is tackling the remaining 30 percent of waste that is unrecyclable. To do so, Bee’ah is building the region’s first, and the world’s largest, advanced thermal waste-to-energy facility, utilising state-of-the-art gasification technology. The waste-to-energy plant will process 400,000 tonnes of non-recyclable waste annually, generating 85MW of clean, green energy. This facility, valued at £300m, will be built next to the Bee’ah waste management centre in partnership with Chinook Sciences, a UK-based research group that supports the nuclear, metal and industrial gas industries.
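As a rough sanity check on those two figures, the implied electrical output per tonne of waste can be derived by assuming (hypothetically) that the plant runs at full capacity all year round; real plants have downtime, so this is an upper bound:

```python
def implied_mwh_per_tonne(capacity_mw, tonnes_per_year, hours_per_year=8760):
    """Electrical energy produced per tonne of waste, assuming the plant
    runs at full capacity for the given number of hours per year."""
    return capacity_mw * hours_per_year / tonnes_per_year

# 85MW over 400,000 tonnes per year implies roughly 1.86MWh per tonne
print(round(implied_mwh_per_tonne(85, 400_000), 2))
```

The same calculation gives a quick way to compare the headline figures of any waste-to-energy plant.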

Bee’ah is confident that spreading waste management education and awareness programmes throughout Sharjah will continue to encourage recycling and lead to an environmentally sustainable emirate. Ultimately, waste management in the United Arab Emirates relies on our efforts to continue developing our waste management programmes’ capacities, and on our residents reducing the amount of waste generated by implementing the three Rs: reduce, re-use and recycle. This strategy not only decreases the size of Sharjah’s landfills but also ensures an environmentally sustainable and clean emirate as it moves ever closer to the ambitious Zero Waste to Landfill target.

Shadow banking grows thanks to distracted regulators

In November 2014, the Financial Stability Board (FSB) published Transforming Shadow Banking into Resilient Market-Based Financing. The FSB was concerned that a largely unregulated sector had grown globally to $75trn in 2013 – an increase of $5trn on the preceding year. The paper also included an updated “roadmap” from the G20 for the ‘Strengthened Oversight and Regulation of Shadow Banking in 2015’.

Banking reform has long been focused on clipping the wings of retail and investment banks, but it appears the strength of shadow banking has not been on supervisors’ radar. It has recently become a subject of much debate, and the statistics alone suggest it could pose a significant risk of systemic failure to the global economy.

In China, the shadow-banking sector has risen to an estimated $6trn – or 69 percent of the economy

The FSB defines shadow banking as “credit intermediation involving entities and activities (fully or partly) outside the regular banking system”. This covers derivatives, money-market funds, securities lending and repurchase agreements, as well as the riskier investment products and loan-shark activities. The FSB is now proposing to combine ‘bank and market-based finance’ (its new name for shadow banking) into a more diverse financial system, hoping it will prove more resilient and, through greater competition, more effective. In the meantime, hedge funds and private equity firms in particular will be well aware they face increased regulatory scrutiny and the threat of market intervention.

A surprisingly big problem
We should be mindful that net credit growth since the economic crisis has essentially been in bond rather than bank finance, and the importance of shadow banking to the economic success of China and India cannot be overemphasised. In practice, many shadow banks are based in tax havens and rely on short-term funding – causing great alarm to those monitoring its stellar rise. Action by our regulators appears long overdue as problems have arisen across the globe.

Although the collapse of Shanxi Platinum Assemblage Investment in December 2014 was relatively small fry, it serves to highlight the financial risks lurking in China’s shadow banking system – especially where high-yielding wealth management products blur into grey-market lending. In China, at least 90 percent of small businesses fail to get bank loans, and the shadow-banking sector has accordingly risen to an estimated $6trn – or 69 percent of the economy. There is, of course, no prospect of a public bailout if failure occurs, as this is unregulated territory. The Financial Times stressed: “A series of similar incidents… suggests China’s slowing economy has created fertile ground for hucksters as companies become increasingly desperate for funds amid a pullback in lending from banks as well as more mainstream non-bank lenders, such as trust companies.”

Shadow banking also makes it far harder for countries to control their economies through monetary policy. Argentina, for example, whose economy could implode at any time, saw a 50 percent increase in its shadow-banking sector in 2013. Worldwide, nearly half of the $70trn in managed assets is in funds offering investors redemption at short notice but, conversely, funds are investing increasingly in higher-yielding, less liquid assets. The writing is on the wall.

Global fault lines
With the benefit of hindsight, Mark Carney (the FSB chairman) has declared the global economic crisis exposed significant “fault lines” in shadow banking activities. It is clear regulators did not understand the interconnectivity between banks and shadow banks. Heavy reliance by the latter on short-term wholesale funding and a lack of transparency hid the leverage and maturity mismatch and camouflaged the inherent risks. Credit intermediation through the shadow banking system came to an abrupt halt, threatening the viability of the entire financial system when the funding markets simply dried up. The FSB has accordingly adopted a two-pronged, high-level strategy to deal with these fault lines, although there is no obvious panacea.

First, it has created a ‘global’ monitoring framework to track developments in the shadow-banking sector – specifically, to identify systemic risks and apply corrective actions if necessary. This monitoring started in 2011 and now covers 25 jurisdictions, representing 80 percent of global GDP.

Second, to prevent the re-emergence of systemic risks from shadow banking, the FSB is looking to significantly increase the regulation of the sector. It will coordinate the development of policy measures in five core areas. These are: mitigating the risks in banks’ interactions with shadow banking entities; reducing the susceptibility of money market funds to ‘runs’; improving transparency and aligning incentives in securitisation; dampening pro-cyclicality and other financial stability risks in securities financing transactions; and assessing and mitigating financial stability risks posed by other shadow banking entities and activities.

FSB policies will focus on the economic functions involved in shadow banking activities in order to cater for innovations and adaptations on or outside the regulatory perimeter. The G20 roadmap of planned activities for 2015 covers actions by the FSB, the International Organisation of Securities Commissions and the Basel Committee on Banking Supervision. It includes an analysis of the global hedge fund sector, a review of the progress of money market fund reforms and a study of global securities financing data. Clearly, a massive change programme is in the pipeline.

The FSB strategy is to implement a series of integrated policies worldwide to mitigate the financial stability risks and, for sustainable economic growth, to transform shadow banking into “resilient market-based financing”. To many observers, this is wildly optimistic. Experience tells us regulators are often poorly coordinated. They will also face intensive lobbying from the banking industry and distrust from third world countries. In the emerging world, shadow banking fills a vacuum and the FSB has come up with no viable alternatives. Although the FSB says avoiding a repeat of past failures is not a recipe for success, in many ways it is a fear of the past that is guiding G20 policy – not the specific needs of today. Banking can never be risk-free.

Mark Carney has said the world has turned a corner in reducing the probability of another financial crisis, but globally there is a voracious appetite for market-based finance that will not succumb easily to regulatory restraint. We have not heard the end of shadow banking.

University rankings: the institutions that are paying to be good

In 1983 US News launched its national university rankings. Although various attempts to compile statistics and rank universities had existed since the late 19th century, beginning with efforts by the Commission of the US Bureau of Education in the 1870s, this was the first attempt at a comprehensive national league table. However, this modest supplement from the media publisher was to spark what Dr Simon Marginson, Professor of International Higher Education at the Institute of Education, University College London, has called “the biggest thing to hit tertiary education, especially university education, for some time.”

The most brazen attempt to use the increasing importance and prestige of global university rankings has been by QS World Rankings

University league tables went global in 2003 when Nian Cai Liu of Shanghai Jiao Tong University, as he recently told The Economist, came up with six indicators with which to measure the performance of institutions from around the world. This endeavour was to become the Academic Ranking of World Universities, also known as the Shanghai Rankings. What soon followed was a proliferation of alternative attempts at ranking universities globally. Among the dozens of global rankings, the Times Higher Education World University Rankings, the QS World University Rankings and the Shanghai Rankings have for many years been seen as the most authoritative. The university league tables pioneer US News has also recently branched out into ranking universities globally, launching its Best Global Universities Ranking in 2014.

“Prestigious Events in Spectacular Settings”
With the exception of a few government efforts, such as the European Commission’s U-Multirank, university ratings have always served a commercial purpose. In the early years of rankings, before the internet, a yearly rankings supplement in a magazine would boost sales for that edition. This is still the case for rankings published by magazines, which now also enjoy increased internet traffic. Rankings also raise the profile of the name behind them, advertising its other products. Although the actual credibility and worth of ranking universities, both nationally and globally, has always been hotly disputed, this commercial aspect seemingly posed no threat to the objectivity of the rankings.

University rankings, according to Professor Ellen Hazelkorn, Policy Advisor to the Higher Education Authority of Ireland, have “thrived in a vacuum about comparative data on higher education.” This has allowed them to become a profitable business. Along with the “importance of higher education to knowledge-intensive economic growth and the world-wide pursuit of talent” and the importance placed upon rankings by governments and students, universities around the world are “convulsed around how to achieve better scores – thus creating yet another industry around consultancy”, she continues. One way in which university rankings now act as consultants is by organising what Richard Holmes, author of the book Watching the Rankings, describes as “prestigious events in spectacular settings.” He describes these events as “a lucrative mix of business opportunities including customised rankings, conferences, consultancy, and workshops for world-class wannabes” and “a huge commercial success based on the participation rates charged, and their frequency.” According to Holmes, they too often “produce convenient results for hosts and sponsors and profit for the rankers.”

Cash for Gold
The most brazen attempt to use the increasing importance and prestige of global university rankings has been by QS World Rankings. Alongside its ostensibly objective list of what it considers the world’s top 800 universities, QS World Rankings offers a parallel rating system: a paid-for, opt-in gold-star service. Universities that pay for this service are evaluated against a set of 51 criteria and awarded between one and five (or even five-plus) stars in eight different categories. Alongside its published list of 800, the awarded number of gold stars appears next to each university’s name. Universities that do not appear in the rankings can also use this gold-star service. Take, for instance, Karaganda Economic University, located in the fourth-largest city in Kazakhstan. According to the criteria set out by QS for its World Rankings, this institution is not among the top 800 universities in the world. Yet under the QS gold-star service, Karaganda Economic University is rated with three gold stars.

Similar universities around the world, either not included in the world rankings or sitting at the lower end of the league table, pay large amounts for this service, presumably to market themselves as gold-star accredited universities. According to Inside Higher Education in 2013, this service cost $9,850, as well as “an additional annual licensing fee of $6,850 for each of the three years the license is valid, allowing them [the universities] to use QS’s graphics and logos in their promotional materials. This brings the total cost to a participating university to $30,400 for three years.” The prestige of the QS name, along with the increasing use of business-style marketing at universities, seemingly justifies universities paying such prices.
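The quoted fees add up as a simple one-off audit fee plus an annual licence, which can be checked directly from the figures Inside Higher Education reported:

```python
def star_service_cost(initial_audit, annual_licence, years):
    """Total cost: a one-off audit fee plus an annual licence fee
    for each year the licence is valid."""
    return initial_audit + annual_licence * years

# $9,850 audit plus $6,850 per year over the three-year licence
print(star_service_cost(9_850, 6_850, 3))  # 30400
```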

QS maintains that this gold-star service is separate from its world rankings and is simply intended to provide more in-depth information to students deciding upon their university. Dr Marginson is not convinced, claiming that “the use of client-orientated, paid for ‘stars’…conflicts with the rankings process, in that it adds a fake valuation – any valuation that you can purchase has no valid foundation, surely – alongside what is meant to be an objective ranking.” Currently, QS World Rankings is the only top global ranking to offer such a paid-for service. When asked if he sees any of the other rankings following a similar path, Dr Marginson replied that “The Times Higher may develop more business functions, now that it has taken over its data processing from Thomson-Reuters, but we are yet to see if it will attempt something like the ‘star’ system. That would be a major error if it did.”

Monetising Data
Off the back of the growth of university rankings has also come the growth of Higher Education intelligence, with those within the university rankings industry well placed to take the lead. Those engaged in ranking universities have, in order to complete their task, collected vast amounts of data on universities around the world. This data is then often monetised by, according to Professor Hazelkorn, “selling it back often to the same institutions which provided the data in the first place.” This allows institutions to compare their own data and performance with those of other universities. Thomson Reuters sells access to its data on universities through its Global Institutional Profiles Project, which it advertises as being able to help universities measure their research impact and output, teaching practices and income for research. Likewise, the Shanghai Rankings has its Global Research University Profiles initiative, and the Times Higher has announced plans to build its own database, although whether this will be monetised in the way Thomson Reuters’ data has been has not yet been announced. This all demonstrates, according to Professor Hazelkorn, “a growing market for monetising HE [Higher Education] data to inform policy and institutional decision-making.”

University rankings have come a long way since their birth in the 1980s. No longer are they merely a supplement for undecided students and concerned parents to mull over. Rankings have gone from being magazine supplements to key players in the world of Higher Education. From monetising data to convening “prestigious events in spectacular settings” (for a price, of course), as well as offering paid-for, opt-in valuation services, rankings are not shy of using their authoritative names and vast amounts of Higher Education data for profitable ends. This has raised serious questions about objectivity. According to Dr Marginson, “there is a conflict of interest between what should be objective ranking of universities, and client-orientated business services for universities.” In her book Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence, Professor Hazelkorn writes that the university rankings industry is now “replete with perceptions of conflict of interest and self-interest, along with self-appointed auditors – all of which, in this post global financial crisis age, would almost certainly provoke concerns in other sectors”. Despite this, rankings remain influential among governments, parents and students. In a world where more and more people are going to university and where competitive job markets place such emphasis upon education, university rankings and their different business services seem likely only to increase.

Fighting cancer with innovative immunotherapy techniques

To understand how cancer immunotherapies work, it is useful to understand cancer’s defining properties. Cancer occurs when normal cells in the body multiply at an unusual rate and old cells do not die. Cancer also has the ability to invade and damage normal tissue, either locally (close to the site of the initial tumour) or distantly (at other sites throughout the body). A healthy immune system may have trouble fighting cancer, first because cancer cells are internal and not foreign to the body, so the immune system might not recognise them as dangerous, and, second, because some cancer actually suppresses immune activity.

NBS20 uses the patient’s own immune cells and tumour-initiating cells to create a therapeutic vaccine

Immunotherapies are treatments that fight disease using components of a patient’s own immune system – either by stimulating the patient’s immune system or by providing the patient with manmade products that stimulate the immune system. Cancer immunotherapies currently in development can be divided into a few major groups, such as monoclonal antibodies, checkpoint inhibitors and cancer vaccines.

Antigens are the molecules on viruses, bacteria or tumours that are recognised as foreign by the immune system. In a healthy person, the immune system develops proteins called antibodies that target these antigens, resulting in the destruction of the illness-causing microbes. Researchers have discovered how to make, in the laboratory, monoclonal antibodies (mAbs) that target particular antigens, including those found on cancer cells. Some mAbs currently available as cancer therapies include Bevacizumab (Avastin) for the treatment of colorectal, non-small cell lung and metastatic breast cancer, and Cetuximab (Erbitux) for the treatment of head and neck and colorectal cancer.

Scientists have discovered that one particular antigen, CTLA-4, prevents T cells (immune system cells that can destroy cancer cells) from doing their job, thereby making it easier for cancer to develop and spread unchecked. So researchers have developed an antibody in the lab called anti-CTLA-4 that blocks this ‘immune checkpoint’ antigen, freeing the immune system to attack and inactivate tumours. Anti-CTLA-4 is one example of what is known as a checkpoint inhibitor and has been approved by the US Food and Drug Administration (FDA) as ipilimumab (Yervoy) for the treatment of metastatic melanoma. Other checkpoint inhibitors react with programmed death 1 (PD1), which can inactivate cytotoxic T lymphocytes so they do not kill tumour cells they recognise. Nivolumab (Opdivo) and Pembrolizumab (Keytruda) have been approved for the treatment of metastatic melanoma.

Fig 1

Traditional preventive vaccines contain weakened versions of antigens that cause specific diseases, such as measles, mumps or flu. When injected into the body, the immune system can recognise and remember the antigen, so that, if a person becomes exposed later, the immune system will fight it, preventing illness. Therapeutic cancer vaccines are different in that they aim to treat, not prevent, cancer.

Other immunotherapies available include CAR-T therapy, which involves engineering immune T cells to help them recognise specific antigens on tumour cells, and TIL therapy, in which immune cells called tumour-infiltrating lymphocyte (TIL) cells are taken from the patient and the most effective cancer-killers among them are reproduced in a laboratory and then injected back into the patient.

Some immunotherapies are given in conjunction with other therapies, such as chemotherapy, while others are given afterwards, either to allow other therapies to weaken or shrink the cancer first, or as a second or third option if other treatments prove ineffective.

Helping the immune system
A patient-specific cancer immunotherapy, known as NBS20 (USAN generic name eltrapuldencel-T), is currently under late-stage (Phase III) clinical investigation by NeoStem. The proposed treatment is intended to help the immune system target cancer or tumour-initiating cells (commonly referred to as ‘cancer stem cells’), which are thought to rapidly generate new cancer cells, fuelling tumour growth and ultimately spreading the disease throughout the body. NBS20 uses the patient’s own immune cells and tumour-initiating cells to create a therapeutic vaccine. NBS20 is unique in targeting these cancer or tumour-initiating cells; other therapies, which target different tumour cells, may treat existing cancer but may not be as likely to prevent tumour recurrence (see Fig. 1).

The goal is for NBS20 to eliminate these cancer or tumour-initiating cells after a patient has already undergone other treatments that may have reduced tumour size but are unlikely to result in five-year survival (see Fig. 2). Results from a randomised Phase II trial, which showed a survival advantage for patients treated with DC-TC (now NBS20) compared with an investigational control treatment, were presented and published in 2012. Enrolment for the Phase III trial is now taking place at sites across the country.

Clinical efficacy and reasonable cost
Despite advancements in cancer immunotherapy, the cost of developing autologous treatments – those developed from a patient’s own cells – can be prohibitive. Maintaining the necessary clean-room laboratory environments, employing skilled scientists, and manufacturing each patient’s therapy individually, often through extensive manual production processes, can all lead to a product cost of goods that is not sustainable through commercialisation, and therefore to treatments that may not be affordable for patients. To achieve success as an industry, manufacturers of cell therapies must be able to produce a high-quality product at a reasonable cost of goods that also meets demand over the commercial life of the product.

NeoStem acquired PCT, a contract development and manufacturing organisation, in 2011. This acquisition provides NeoStem with significant cell therapy development expertise, as well as cGMP (current good manufacturing practices) manufacturing facilities with controlled environments, thereby facilitating the clinical manufacturing of NeoStem’s own pipeline.

Fig 2

Through the application of PCT’s expertise and experience, NeoStem expects commercial-scale cost of goods to be manageable for its therapeutic candidates and for client product candidates, while still meeting FDA and cGMP standards. This is accomplished by implementing a structured manufacturing development methodology called Development by Design (DbD). Expanding on the foundational focus on quality of cell therapy products, DbD aims to help clients develop a commercially viable manufacturing process that focuses on four key drivers: quality, cost of goods, scalability, and sustainability.

An important step for cell therapy companies to achieve this is to define a product profile early in the development programme. Elements to consider in the profile include: the characteristic profile (i.e. description, formulation, dosage, potency, volume and shelf life); the safety profile (i.e. microbial assurance, cellular impurities and manufacturing residuals); the use profile (i.e. indications for use, treatment timing, preparation and use); and the business profile (i.e. geographic market projections, cost of goods targets, and intellectual property).
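One way to make such a profile concrete is to capture its four element groups as a single structured record. The field names and example values below are illustrative assumptions, not an industry schema or NeoStem's actual profile:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProductProfile:
    """Hypothetical sketch of the four profile element groups."""
    # Characteristic profile
    description: str
    formulation: str
    dosage: str
    shelf_life_days: int
    # Safety profile
    microbial_assurance: str
    cellular_impurities: str
    manufacturing_residuals: str
    # Use profile
    indications: List[str] = field(default_factory=list)
    treatment_timing: str = ""
    # Business profile
    market_geographies: List[str] = field(default_factory=list)
    cost_of_goods_target: float = 0.0

# Illustrative example only:
profile = ProductProfile(
    description="autologous dendritic cell vaccine",
    formulation="cryopreserved cell suspension",
    dosage="single intradermal injection",
    shelf_life_days=180,
    microbial_assurance="sterility tested",
    cellular_impurities="<5% non-target cells",
    manufacturing_residuals="process cytokines cleared",
    indications=["metastatic melanoma"],
)
```

Defining such a record early forces the characteristic, safety, use and business questions to be answered together, which is the point of the strategic roadmap described below.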

Considering these factors as early in development as possible enables cell therapy companies to create a strategic development roadmap that anticipates challenges and manages the changes that will be required as development progresses. Based on the roadmap, we engage PCT’s deep scientific and engineering expertise and pursue innovation to overcome the challenges and meet the DbD goals for commercially viable manufacturing.

We believe our unique focus on targeting the tumour-initiating ‘stem’ cells within a tumour, combined with an innovative business model that consists of a rich cell therapy pipeline, externally recognised manufacturing expertise, and initiatives intended to use innovation to solve the challenges specific to the cell therapy industry, puts NeoStem in the unique position of simultaneously taking part in and shaping the future of cancer immunotherapy.

Kraft charged with manipulating wheat prices

The US Commodity Futures Trading Commission (CFTC) has announced that it is filing charges against Kraft Foods Group and its subsidiary, Mondelēz Global LLC, in the state of Illinois, where they are both headquartered. According to a press release published on April 1, the civil enforcement complaint against the two multinational corporations is for the manipulation and attempted manipulation of both cash wheat prices and wheat futures.

Other violations of exchange rules were also found and included in the charges

The allegation contends that, following a summer of high wheat prices, in December 2011 the pair orchestrated an approved strategy to purchase six months’ worth of stock – in other words, $90m of wheat futures. Yet the CFTC has found that Kraft and Mondelēz never intended to actually take delivery, and so the bluff was executed to cause a market reaction to the mammoth purchase. The plan to lower the price of cash wheat and bolster the futures spread over the following three months was successful. The CFTC calculated the profit made to amount to at least $5.4m.

In addition, the CFTC claims there have been other occasions on which Kraft and its snack foods subsidiary, Mondelēz, have influenced the wheat market by holding long positions without a valid hedge exemption or an actual requirement for the stock in question. Other violations of exchange rules were also found and included in the charges.

“This case goes to the core of the CFTC’s mission: protecting market participants and the public from manipulation and abusive practices that undermine the integrity of the derivatives markets. A market participant who is not happy with cash prices available to it may not resort to manipulative trading strategies in an attempt to artificially lower that price,” the CFTC’s Director of Enforcement, Aitan Goelman, said in the statement.

Why do astronauts have bad skin?

NASA plans to send humans to Mars within the next 20 years. The Red Planet is more than 55 million kilometres away and it would take at least half a year to get there. How safe is such a trip? How do you protect the astronaut’s health in the hostile environment of interplanetary space? The astronaut will face cancer-inducing radiation and maybe massive deep-space solar storms. In-spacecraft air pollution, such as skin particles from other crewmembers, may cause problems. And, of course, the body will change as a result of the long-term lack of gravity.

Human spaceflights have been performed since 1961; 541 astronauts and cosmonauts have been to space, corresponding to about 123 man-years. The record for the longest spaceflight is held by Russian cosmonaut and space medicine expert Valeri Polyakov, who spent 438 days non-stop in space.

15%

Decrease in skin thickness among some astronauts

At the moment, most health information on humans in space is obtained from scientific projects performed during six-month stays at the International Space Station (ISS), in a low Earth orbit at an altitude of 430km. A series of pathophysiological modifications have been observed, but not everybody is a responder. The question arises: why do some astronauts develop health problems, such as impaired eyesight, and others do not?

Getting under the skin
On Earth, the body is attuned to gravity, with the heart pumping fluids upwards. In weightlessness, the body does not need this function, which can alter the distribution of fluids in the body. The journal Radiology reported that about 60 percent of investigated astronauts had high fluid pressure in their skull, according to MRI data. Many astronauts suffered from space headaches (0.56 events per astronaut). In general, astronauts face bone loss similar to osteoporosis (up to one percent per month) and suffer immune system deficits.

Skin impairments are one of the most frequent health problems that occur during space missions. For example, skin rashes are common on spaceflights. However, so far no major studies on the effects of extended space travel on human skin have been performed. There was one pilot study conducted in 2006 with a European astronaut who performed in-flight and post-flight skin measurements. Modifications of the dermis, the skin surface, and the hydration level were observed. Furthermore, the measured increase in the transepidermal water loss reflected an impairment of the barrier function of the outermost skin layer. It seemed the astronaut underwent an accelerated skin ageing process in space.

Skin problems, including impaired wound healing in space, require further research to protect astronauts’ health during long-term flights and to better understand ageing and wound-healing processes. Therefore, NASA and the European Space Agency (ESA) launched the ‘Skin B’ project, which covers pre-flight, in-flight (on the ISS) and post-flight skin measurements on five astronauts. For the first time in space research, medical femtosecond lasers are being employed to obtain label-free optical biopsies of astronauts’ skin with superior intracellular resolution. Physical biopsies are not required to get a precise look inside the skin and to monitor modifications of the tissue architecture and the cellular metabolism.

The novel imaging technology is called multiphoton tomography (MPT). MPT was introduced by the German company JenLab, with facilities in Jena and Saarbruecken. So far, MPT is mainly used in major hospitals in Australia, California, Russia and Western Europe for early detection of skin cancer and to monitor tumour borders during neurosurgery. Furthermore, it became an important tool for major companies such as P&G, Chanel, L’Oreal, Beiersdorf and Shiseido to evaluate pharmaceutics and cosmetics, including nanoparticle-based sunscreens. In 2014, JenLab was awarded a New Economy Award in London and an IAIR Award in Milan for the development of the certified multiphoton tomograph MPTflex.

The Skin B project is ongoing. The in-flight measurements take place aboard Columbus (the ISS’s space lab module) to obtain information on skin hydration, the function of the skin barrier and the skin surface. Pre- and post-flight multiphoton measurements are performed at the European Astronaut Centre in Cologne. So far, ESA astronauts Luca Parmitano, Alexander Gerst and Samantha Cristoforetti have been studied. Parmitano launched on March 28, 2013 from the Baikonur Cosmodrome in Kazakhstan, Gerst on May 28, 2014 and Cristoforetti on November 23, 2014. Their journeys to the ISS were performed with the novel ‘express spacecraft’ and took less than six hours. Parmitano and Gerst spent almost half a year on the ISS. Cristoforetti is still working there.

Alexander Gerst, here seen aboard the ISS, was one of the astronauts studied as part of the Skin B project

Dangerous thinning
The novel, high-resolution MPT imaging technology revealed astonishing long-term space effects on the skin.

Normally our skin thickness stays relatively constant. Cells produced in the lowermost skin cell layer, the stratum basale, migrate to the skin’s surface. On the way, they die and form the outermost layer, the stratum corneum. That dead cell layer acts as a barrier and keeps, for example, the water inside our body. Finally, we lose the outer dead cells as horny dead biomaterial (e.g. by mechanical friction) and it drops to the ground due to gravity. This turnover of the epidermis takes one month – at least on Earth.

On the ISS, skin physiology is different. Bioparticles from the stratum corneum are ‘floating’ in the air and may easily be inhaled. Mechanical friction is reduced. Microgravity impacts cellular metabolism, cell migration and reproduction.

Interestingly, MPT provides clear evidence the epidermis of the two male astronauts became thinner by more than 10 percent: the mean epidermis thickness decreased from 57µm to 49µm (14 percent) on the thinnest area, and from 67µm down to 57µm (15 percent) on the thickest investigated site. The decrease was observed within the living cell layers of the epidermis and not within the dead-cell layer stratum corneum. The observed thinning of the epidermis did not correspond to other significant signs of skin ageing such as flattening of the epidermal-dermal junction.

The lower part of the skin, the dermis, also changed. In particular, the extracellular matrix components elastin and collagen underwent modifications. Interestingly, there was a strong increase in the amount of collagen and in the ratio of collagen to elastin compared with pre-flight conditions. This observation is in contrast to ageing effects on Earth. In fact, the monitored increase in the collagen-elastin ratio results in an ‘anti-ageing’ effect, as observed with certain cosmetic products on Earth.

However, thinning of the viable epidermis is of no advantage. Dangerous radiation, such as short-wavelength ultraviolet and cosmic radiation, can easily reach the basal cell layer at the epidermis-dermis junction. That most important skin layer contains the stem cells and is also the location where skin cancer often starts. If an astronaut’s viable epidermis is only 30µm thick after a six-month expedition, their viable skin cell layers could shrink to fewer than two cell layers during a one-year Mars expedition.

The good news is the skin effects are reversible. Back on Earth, the epidermis of the two astronauts became thicker again. Studies are currently being conducted to monitor the repair process and to evaluate memory effects, which will improve our knowledge of how the body readapts to gravity.

Skin can be employed as an early recognition system for physical and mental health status. Further studies are recommended to understand why some astronauts show severe skin reactions, such as thinning of the epidermis, modifications of the dermis, and allergic reactions, before starting the trip to the Red Planet.

Axiom Foods on new approaches to global food challenges

It is human nature to hold tightly onto the past. There is comfort in what has already been experienced. As such, it’s a natural reaction to vehemently protect technologies, economies, the state of nature, and communities that have been our old friends and game-changers in the world; watching them disappear or be dismantled isn’t easy to accept.

One would never have imagined, 40 years ago, that we’d have a machine in our pocket that, at the swipe of a finger, could make even the most obscure information available immediately. What, then, would happen to the door-to-door encyclopaedia salesman? When free airline meals disappeared from our seat trays, we bemoaned our love/hate relationship with them, and an entire industry melted overnight. Next we had push-button food service on Virgin Airlines, and it was fun. Somehow we figured it out, and thrived as a result.

61.3m

Metric tons of wheat produced, 2012

55.1m

Metric tons of wheat produced, 2015 (predicted)

Nevertheless, the world, and everything in it, changes every day. It can be disheartening to witness ice caps melting, economies weakening and leaders falling short, but change can also be incredibly exciting and surprising. We have a choice: to brand change as ‘bad’ and make efforts to turn back time, or to accept the past and focus on what we do today and tomorrow. The subject of food, and the current, massive shift in how global crops serve us, is of global significance. It may appear daunting, maybe even scary, but there is innovation that has changed the face of the earth, our food economy, and perhaps even our biology.

One significant change in the food world is that the two largest global crops, wheat and corn, are waning in production and use; the third largest, rice, is poised to become ‘the new wheat’. Recent clinical trials at the University of Tampa have shown protein fractioned from brown rice can provide equal benefit to that derived from animals in the sports nutrition arena. Wheat, which for centuries was the most common food crop grown on Earth, has steadily decreased from worldwide production of 61.3 million metric tons in 2012 to a predicted 55.1 million metric tons in 2015. Though wheat was once celebrated as the poster child of the health food movement, consumers are now rejecting this ‘king of grains’, with some identifying it as being at the very root of their health problems. Rates of gluten intolerance and celiac disease have skyrocketed.

Studies have been conducted at Johns Hopkins Medical Centre to find the safest ways to test children for allergies. Silicon Valley entrepreneur Sean Parker, known for creating Napster and being the first President of Facebook, donated $24m to Stanford Medicine to launch an allergy centre. His gift was the largest private donation to allergy research in the US and was inspired by first-hand experience of life-threatening allergies.

A study by Dr Ruchi Gupta of the Mayo Clinic found childhood food allergies cost the US $24.8bn a year. According to a study done at the Mayo Clinic in 2013, food allergies affect an estimated eight percent of children, 40 percent of whom have a history of “severe reactions that if not treated immediately with proper medication can lead to hospitalisation or even death”. Every three minutes in the US, food allergies are the reason for an emergency room visit.

Consumer concern
Never in history have we witnessed our bodies rejecting food as we do today. The epidemic is creating changes and trends in how food products are created and marketed, how grocers present options to shoppers, and even how restaurateurs differentiate themselves and serve up sustenance. The largest-scale problems seem to be with wheat, in regard to gluten intolerance, and with consumer reaction against genetically modified corn. Soy has also fallen from favour, especially in the over-$6bn protein supplement industry. The FDA lists soy as one of the ‘Big Eight Allergens’, and some consumers have expressed concerns regarding its naturally occurring phytoestrogens. Rice is emerging as the new wheat, in that it can substitute not only for allergenic animal-based and soy proteins and milks, but also for key ingredients from all major crops, such as wheat flour, bran fibre and oil, corn syrup, and starch.

“Some of America’s largest food production companies, such as Kellogg’s, General Mills and Pepsi, are developing new products specifically without wheat and corn, in favour of non-allergen grains, such as rice”, says David Janow, CEO of Axiom Foods, one of the leading fractioners of brown rice, making it into protein, syrup, milks and meat analogues.

The pace of Axiom’s development of allergen-friendly food products with rice is ferocious. Food and beverage giants are knocking on Janow’s door to source his innovative plant-based ingredients after he created proprietary all-natural processes to fraction the ancient grain. Since 2005, the Los Angeles-based company has supplied thousands of food manufacturers with the patent-pending Oryzatein, available in various protein concentrations – including the first 90 percent brown rice protein isolate. Janow’s efforts in this, and in fractioning protein from plants such as pea, sacha inchi, hemp and, most recently, flax, have won Axiom Foods The New Economy Award for Best Food and Beverage Solutions, 2014; the awards recognise the highest achievers with big ideas changing the global economy for the better.

The world’s second crop, corn, is suffering a hit as well. According to Dr Alette Christine Coble-Temple of John F Kennedy University, “the proliferation of genetically modified corn is now being linked to celiac disease and gluten intolerance”. Dr Coble-Temple is a consultant for families and organisations on the impact of severe food allergies, and how they affect biopsychosocial development, attachment theory, and resiliency in children and adolescents. “Speaking of genetics”, says Dr Coble-Temple, “a study by Dr Grace Yu showed if a child has an allergy and they have an identical twin, that child has a 67 percent chance of having the same allergy, but the same child’s older or younger sibling’s propensity to share the intolerance drops down to a pale seven percent.”

The race is on to supply nutritious products with ‘clean labels’ featuring everything people want and nothing they don’t, plus ingredients they understand. Thankfully, the change is occurring from the inside out.


Groceries and restaurateurs
In the grocery sector, specialty aisles have been a long-term trend, their themes evolving with consumer demand. In the 1980s, fat-free was the direction, followed by low-carb. Billions of dollars of products were developed to satiate consumers’ hunger for these foods. These trends ran deep into the processed food space and across demographics and psychographics, far beyond the assumed alternative health food or fitness aficionado consumer. At one point, Walmart had one of the largest low-carb food programmes in the US, even appointing a team of food company entrepreneurs to show the way, making recommendations on what foods consumers were demanding and giving direction on how to formulate them for the retailer.

The rise of gluten allergies and consumer concern about genetically modified corn has sparked a mainstream interest in natural foods and organics. In Spring 2014, Walmart partnered with natural food grocer Wild Oats to add an organic section, and is attempting to compete in the space by offering natural food and organic products at a 25 percent discount over speciality retailers such as Whole Foods. Spokespeople from Walmart say 91 percent of its customers are asking for organics. According to Mintel Research, the gluten-free industry in the US alone is pegged at $10.5bn.

The last of the ‘food frontier’ segments to respond to this shift in food quality offerings is the restaurant industry – but, even there, changes are well underway. Some of the biggest names in the restaurant business – people who built global empires such as Pizza Hut and TGI Fridays, and have been involved with brands such as Baja Fresh and Wendy’s – are at the forefront of providing nutrient-dense food concepts. Just as Walmart is doing to mainstream organic grocery offerings, the creators of Sharky’s Woodfired Mexican Grill, Steve Paperno and David Goldstein, are applying the chain restaurant format to provide Americans with what they describe as “food to feel good about”. This 21-location chain in Southern California has been named the “fastest growing” chain by Restaurant Business Magazine and the Los Angeles Business Journal.

Paperno and Goldstein are among a healthy handful of restaurant barons heading in a ‘good for you’ direction. Greg Dollarhyde, a driver behind Pizza Hut and Taco Bell, is hatching the Veggie Grill, a 26-location chain providing “craveable vegetarian food that enables you to move veggies to the centre of your plate”. The effect these types of restaurants are having on the global crop switch is mostly a drive away from anything GMO – such as corn, canola, cottonseed and soy – favouring organic over conventionally grown anything.


Rice, challenges and opportunities
Like anything, rice has had its share of issues and challenges, most of which concern educating people about how it is grown and about its nature as a water-saturated ‘translocator’ plant (i.e. what it takes from the soil). Organisations such as the World Rice Alliance are at the forefront of global movements to ensure pristine products from the cleanest growing regions of the world get onto our plates. Companies such as Janow’s Axiom Foods are global proponents of the allergy-free food movement, and the food industry is hungry for what they have to offer. These changes are so massive that organisations such as the US Food and Drug Administration are putting new regulations into place and looking to experts such as Janow while forming their best practices.

With the rise in the number of people consuming more plant-based meals, rice has even more opportunities as a sustainable, environmentally friendly alternative to animal-based proteins and milks.

“When we fraction apart an organic brown rice kernel”, explains Janow, “there is so much to be reaped from it. We feel good about the fact that we’ve hit upon nutritional solutions for such a huge problem in the world. We never realised how much gluten and other food allergies were going to be a player in our business, or that the incredible science of genetics could affect the food supply as it has, but here we are.”

Jay-Z causes storm with Tidal music streaming service

The ongoing shift away from music ownership towards subscription streaming services has taken a new turn with the news that a number of artists – led by rapper Jay-Z – have relaunched online platform Tidal. Known for offering high definition ‘lossless’ audio to subscribers, Tidal was bought by Jay-Z earlier this year after originally launching in Sweden.

Jay-Z told music industry magazine Billboard that he believes Tidal will deliver more money in the long-run to artists

The musician led a wave of high-profile artists – including Jack White, Kanye West, Beyoncé, Daft Punk, Coldplay’s Chris Martin, and members of Arcade Fire – in unveiling a service that will take aim at the growing number of listeners on services like Spotify, Rdio and Deezer.

Whereas Apple had previously led the digital music market with its download store iTunes, music fans have increasingly favoured subscription services that offer unlimited playback of songs. Spotify – also launched in Sweden – has expanded rapidly since its launch in 2006 and now has over 60m users. However, many artists are disgruntled with the low cost of the service, in particular Spotify’s ad-supported free tier, which accounts for the majority of its users. To date, only around 15m users pay a subscription to remove the adverts.

The debate over streaming services and whether they provide enough money to artists has rumbled on for the last few years, and Tidal seems to be a reaction against that. The main service offers streaming of high definition audio that has not been compressed for just under $20 a month. A cheaper service of standard audio quality is also available at $10 a month – in line with rivals like Spotify.

Jay-Z told music industry magazine Billboard that he believes Tidal will deliver more money in the long-run to artists, as well as giving them greater control over their catalogues. “We didn’t like the direction music was going and thought maybe we could get in and strike an honest blow. Will artists make more money? Even if it means less profit for our bottom line, absolutely. That’s easy for us. We can do that. Less profit for our bottom line, more money for the artist; fantastic.”

Apple is set to launch its own streaming service in June, which will merge its iTunes platform with the Beats Music service it acquired as part of its $3bn deal last year. Where the new Apple product and Tidal differ from Spotify is their focus on curation by industry figures and artists, rather than a digital algorithm that tries to predict what a listener will like. Both services will also offer exclusive content, such as advance album releases and sessions, in order to entice people away from rivals.

What will the Nicaragua Canal mean for locals?

At the edge of the Pacific sits the small Nicaraguan town of Brito. This little-known place could soon become the country’s second biggest city and a global hub for free trade if a $50bn project, now underway, is completed.

The project is the Nicaragua Canal: an Atlantic-to-Pacific crossing that would mark one of the largest engineering ventures the world has ever seen. Set to be up and running by 2019, the waterway, led by Hong Kong-based developer HKND and its enigmatic billionaire head, Wang Jing, would form a rival to the Panama Canal – at 172 miles long, it would be three times its length and double its depth. The canal would intersect Lake Nicaragua, the largest lake in Central America, forming an alternative for bulk carriers and other boats too large for the Panama passage.

Local unease
Wang Jing and co gained support from the Nicaraguan government on the grounds that the canal could fuel job creation and drive investment in what is currently the second poorest country in the Western Hemisphere. But the plans have faced strong local opposition, sparking protests among those who fear for their homes and lives as they know them. It’s little wonder: head of the canal authority, Manuel Coronel Kautz, has compared the impact of the project to that of colonisation.

There are also environmental concerns: Lake Nicaragua is currently the country’s main source of drinking water and is home to 38 species of fish. “By building the canal going through the lake, we know that there’s going to be serious environmental damage”, Jorge Huete-Pérez, of the Academy of Sciences in Nicaragua, told The Guardian.

According to biologists, the plans will also split the Mesoamerican Biological Corridor in two, dividing gene pools and further threatening wildlife. Environmental assessments are being made, but critics say they are being rushed – along with the project itself, which saw a rapid approval process (just two days were devoted to debating it in parliament).

Increasing China’s power
The canal would likely increase China’s influence in the region, while giving the country greater access to Venezuela’s crude oil. Such benefits have led to widespread suspicion that HKND head Wang Jing is being funded by Beijing, although the government has refuted the claims and Wang has insisted it is a private venture.

Increased Chinese influence could threaten the strong ties the US has with Central America. The American embassy expressed concern in a statement, saying it was “worried by the lack of information and transparency that has existed, and continues to exist, over many of the important aspects of this project”.

It might not be time to fear just yet, though; 70 separate plans have been laid out for a Nicaragua Canal in the past, and this could be another on the list of failures. Some believe the project will struggle to get private investors on board given there is already a major Atlantic-Pacific waterway, with HKND’s growth and profit projections considered overly optimistic. The canal would also face tough competition from the Middle East’s Suez Canal, usage of which is predicted to increase over the coming years as global manufacturing in Southeast Asia grows.

But some believe this project, which has progressed further than most of its predecessors, will reach completion. If it does, it’s likely to bring the world’s two biggest superpowers head to head, and to change Nicaragua and the lives of its inhabitants forever. Whether that’s for good or ill remains to be seen, but many don’t want to find out, and for good reason.

Insight

Artificial intelligence is scary. But that doesn’t mean we should stop it

For something that does not yet exist, artificial intelligence (AI) has managed to cultivate more terror among human beings than Spielberg did about going for a dip in the ocean. The cult of fear that surrounds AI is not limited to those of the tin-foil hat club either. Some of the world’s brightest minds have offered their two cents on the potential dangers of bringing AI into existence, something which has helped foster an irrational level of anxiety among the public over a technology that has the capacity to usher in a new age of discovery for mankind.

Those waving the red flag claim this technology, once created, will have the power to dramatically reduce the size of the world’s labour force, stealing jobs from people across the career spectrum. Those out of work will be left wondering what it all means in a world without purpose, as artificial beings replace human ones in every area of life. Until, after permeating all facets of society, the machines rise up and destroy mankind once and for all. Scary stuff when you put it like that.

But the thing about fear, as the Indian philanthropist Jaggi Vasudev explains, is that it “is always about that which does not yet exist”. That is not to say there aren’t potential pitfalls that should be considered and taken seriously, as should be the case when developing any new technology. But whether someone chooses to look upon AI with trepidation or not depends far more on the philosophical beliefs held by the individual than on any inherent evil that could potentially exist inside machines.

No matter how insane it may be to fear the non-existent, it has not stopped humanity seeing AI as something to be afraid of. In fact, a report by the Global Challenges Foundation has put the technology alongside nuclear war, climate change, super-volcanic eruptions and other potential “risks that threaten human civilisation”. However, the report cannot seem to make up its mind, suggesting on the one hand that machines and software in possession of “human-level intelligence” may one day wreak havoc against humanity, but on the other that the technology could open up a whole new world – one that could help offset most other catastrophic risks, and in which scientific breakthroughs once thought out of reach could be within man’s grasp.

“Such extreme intelligences could not easily be controlled (either by the groups creating them, or by some international regulatory regime), and would probably act to boost their own intelligence and acquire maximal resources for almost all initial AI motivations”, propose the report’s authors, Dennis Pamlin and Stuart Armstrong. “And if these motivations do not detail the survival and value of humanity, the intelligence will be driven to construct a world without humans.”

50%

Of jobs will potentially be automatable within two decades

God’s image
The idea that a super intelligent AI would wish to obliterate an entire species is based on the assumption an advanced intelligence would care about us enough to hit the kill switch. “Perhaps what we really fear, even more than a Big Machine that wants to kill us, is one that sees us as irrelevant”, wrote theorist Benjamin H Bratton in an op-ed piece for The New York Times. “Worse than being seen as an enemy is not being seen at all.”

Is it not more plausible to imagine this scenario, rather than one where an AI sees man as some sort of threat worthy of being disposed of? If anything, the notion that a highly intelligent life form, artificial or otherwise, would perceive us as a challenge worth vanquishing says more about man’s own arrogance than it does the possible motivations of a sentient supercomputer.

“That we would wish to define the very existence of AI in relation to its ability to mimic how humans think that humans think will be looked back upon as a weird sort of speciesism”, wrote Bratton. “The legacy of that conceit helped to steer some older AI research down disappointingly fruitless paths, hoping to recreate human minds from available parts. It just doesn’t work that way.”

Technological obsolescence
A much more likely scenario than being murdered by machines in our sleep is the prospect of being rendered obsolete by our new robot overlords. As the report by the Global Challenges Foundation warns, the risk that “economic collapse may follow from mass unemployment as humans are replaced by copyable human capital” is completely within the realm of possibility. Indeed, it is already happening.

Many assembly line workers have had to come to terms with the fact there are machines out there that can do their jobs quicker, more safely and more efficiently – not to mention the fact robots don’t tend to take sick leave.

But factory workers are not the only people who are going to feel the squeeze of mechanisation. Any worker whose role entails a routine, task-driven activity will one day have to contend with technology – and they are going to lose. With the advent of Google’s driverless car, not to mention the accessibility of drones, which will soon be capable of self-piloting, it won’t be long before huge swathes of truck drivers, delivery boys and bike couriers are out of a job too.

But don’t think this is just an issue for low-skilled workers, and that those cushy middle-income jobs are far too complicated for machines to master. Computers will be capable of outperforming us in a wider variety of far more complex tasks. Realistically, the only jobs in which we will be able to trump machines will be those that require social intelligence (e.g. persuasion and negotiation) or creative intelligence (which is simply the ability to imagine new ideas). Nearly 50 percent of jobs are potentially automatable within a decade or two.

Combine mechanisation with a super-intelligent AI and you are looking at something with the potential to make human beings redundant across the board. Even one of the most influential economists in the world is concerned about the threat of technological obsolescence. “People are scared – and they are right to be scared”, said Robert Shiller at the World Economic Forum in Davos. “Artificial intelligence is coming, and it will replace your job.”

Drone aircraft are increasingly common and will soon be capable of self-piloting

No work, no sweat
So the machines are coming to steal everyone’s jobs. Sounds reminiscent of the rhetoric employed by nationalist political parties when trying to stir up fear of immigrants. The thing is, the vast majority of the jobs taken by immigrants are low-level roles most nationals don’t want to do anyway. The same can be said of automation.

Take Google’s driverless cars and Amazon’s plans for delivery drones for example. Yes, they will render many people jobless, forced to retrain or find another career path, but, in the grand scheme of things, autonomous machines doing the work of human beings is a net positive for society.

Remember that cities are responsible for 80 percent of all greenhouse gas emissions, and that one of the biggest challenges facing the planet is combating climate change so mankind can avoid a catastrophic event. The innovative application of AI could help shrink cities’ carbon footprints, especially if the machinery used is fossil fuel free.

The technology would have the added benefit of dramatically reducing the congestion seen on city streets and removing human error from the equation, helping to reduce the 3,287 road-related deaths that occur each day. Not only would that save lives, it would also reduce the strain on A&E departments around the world.

Yet there are those who worry about what the planet, and more importantly the economy, would look like if machines were to take over the workplace. Much of this concern stems from people fearing for their role in a meaningless universe, deprived of purpose. A world without traditional jobs is one without full employment or money – and that scares the hell out of people. But it needn’t. The anxiety about such a future is the result of individuals, politicians and businesses looking at the problem through the current economic prism.

Why not adapt the economic model to suit the evolving landscape as advanced technologies begin to emerge, become viable and start remoulding the world around them? The concept of unconditional basic income (UBI) has gained a small amount of traction in recent years, but failed to take hold because there is not yet the imperative to accept such a radical idea. People still have jobs; unemployment, although high in many countries, has not reached crisis point, and there is still a real need for human workers. However, in a world of machines, concepts such as UBI could help prevent an economic collapse and provide security to people as mankind makes the transition into a new world.

The ancient Egyptians were able to develop an advanced culture because they were one of the first peoples to practise large-scale agriculture. It is impossible to develop language, plan great feats of architecture or do much thinking about anything at all when you are constantly worrying where your next meal will come from. By using AI and automation to free ourselves from the shackles of traditional work, the acquisition of wealth and the consumption race, we would be able to unlock the full creative potential of mankind on a scale previously impossible, even unimaginable.

People are fearful of AI because they cannot envisage their place in a world where it exists. What mankind can do is try to picture the type of world that we want to live in, so AI can help us get there.

Driverless vehicles could put a large number of professional drivers out of work

Exercising caution
It is important we develop AI to be friend, not foe. One man working on that problem is Luke Muehlhauser, Executive Director of the Singularity Institute in Silicon Valley. “Anything intelligent is dangerous if it has different goals than you do, and any constraint we could devise for the AI merely pits human intelligence against superhuman intelligence, and we should expect the latter to prevail”, he told Wired. “That’s why we need advanced AIs to want the same things we want. So friendly AI is an AI that has a positive rather than negative effect on human beings. To be a friendly AI, we think an AI must want what humans want. Once a super intelligent AI wants something different than what we want, we’ve already lost.”

That raises the question of what it is that man wants. Whatever it is, it’s important to think long and hard about it before unleashing AI into the world. However, unsurprisingly, there is very little discussion of the subject – and that is what bothers people such as Stephen Hawking.

In a piece for The Telegraph, Hawking wrote: “Facing possible futures of incalculable benefits and risks, the experts are surely doing everything possible to ensure the best outcome, right? Wrong. If a superior alien civilisation sent us a message saying, ‘We’ll arrive in a few decades’, would we just reply, ‘OK, call us when you get here – we’ll leave the lights on’? Probably not – but this is more or less what is happening with AI. Although we are facing potentially the best or worst thing to happen to humanity in history, little serious research is devoted to these issues outside non-profit institutes such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute. All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks.”

Unfortunately, history has shown that forethought about the wider ramifications of developing new technology is rare. Even when it is exercised, we tend to throw caution to the wind, preferring to take a chance at moving forwards, instead of standing still. AI should not be looked at with fear, but with a sense of caution, as the only thing more frightening than taking a chance is not taking it at all.