The AI governance challenge

On the sidelines of the last World Economic Forum Annual Meeting in Davos, Singapore’s Minister of Communications and Information quietly announced the launch of the world’s first national framework for governing AI. While the global media has glossed over this announcement, its significance reaches well beyond the borders of Singapore or the Swiss town where it was made. It is an example that the rest of the world should urgently follow and build upon.

Over the last few years, Singapore’s government, through the state-led AI Singapore initiative, has been working to position the country to become the world’s leader in the AI sector. And it is making solid progress: Singapore, along with Shanghai and Dubai, attracted the most AI-related investment in the world last year. According to one recent estimate, AI investment should enable Singapore to double the size of its economy in 13 years.

Managing the risks
Of course, AI’s impact extends globally. According to a recent McKinsey report, Notes from the AI Frontier, AI could boost global GDP by as much as 16 percent by 2030. Given this potential, the competition for AI investment and innovation is heating up, with the US and China predictably leading the way. Yet, until now, no government or supranational body has sought to develop the governance mechanisms needed to maximise AI’s potential and manage its risks.

Strengthening AI-related governance is in many ways as important as addressing failures in corporate or political governance

This is not because governments consider AI governance trivial, but because doing so requires policymakers and corporations to open a Pandora’s box of questions. Consider AI’s social impact, which is much more difficult to quantify – and mitigate, when needed – than its economic effects. Of course, AI applications in sectors like healthcare can yield major social benefits. However, the potential for the mishandling or manipulation of data collected by governments and companies to enable these applications creates risks far greater than those associated with past data-privacy scandals – and reputational risks that governments and corporations have not internalised.

Another McKinsey report notes: “Realising AI’s potential to improve social welfare will not happen organically.” Success will require “structural interventions from policymakers combined with a greater commitment from industry participants”. As much as governments and policymakers may want to delay such action, the risks of doing so – including to their own reputation – must not be underestimated.

In fact, at a time when many countries face a crisis of trust and confidence in government, strengthening AI-related governance is in many ways as important as addressing failures in corporate or political governance. After all, as Google CEO Sundar Pichai put it in 2018: “AI is one of the most important things humanity is working on. It is more profound than, I don’t know, electricity or fire.”

The European Commission seems to be among the few actors that recognise this, having issued its draft ethics guidelines for trustworthy AI at the end of last year. Whereas Singapore’s guidelines are focused on building consumer confidence and ensuring compliance with data-treatment standards, the European model aspires to shape the creation of human-centric AI with an ethical purpose.

Accepting responsibility
Yet neither Singapore’s AI governance framework nor the EU’s preliminary guidelines address one of the most fundamental questions about AI governance: where do ownership of the AI sector and responsibility for its related technologies actually lie? The answer will help determine whether AI delivers enormous social progress or introduces a Kafkaesque system of data appropriation and manipulation.

The EU guidelines promise that “a mechanism will be put in place that enables all stakeholders to formally endorse and sign up to the guidelines on a voluntary basis”. Singapore’s framework, which also remains voluntary, does not address the issue at all, though the recommendations are clearly aimed at the corporate sector.

If AI is to deliver social progress, responsibility for its governance will need to be shared between the public and private sectors. To this end, corporations developing or investing in AI applications must develop strong linkages with their ultimate users, and governments must make explicit the extent to which they are committed to protecting citizens from potentially damaging technologies. Indeed, a system of shared responsibility for AI will amount to a litmus test for the broader ‘stakeholder capitalism’ model under discussion today.

‘Public versus private’ is not the only tension with which we must grapple. As political economist Francis Fukuyama once pointed out: “As modern technology unfolds, it shapes national economies in a coherent fashion, interlocking them in a vast global economy.” At a time when technology and data are flowing freely across borders, the power of national policies to manage AI may be limited.

As attempts at internet governance have shown, creating a supranational entity to govern AI will be challenging, owing to conflicting political imperatives. In 1998, the US-based Internet Corporation for Assigned Names and Numbers was established to protect the internet as a public good by ensuring, through database maintenance, the stability and security of the network’s operation. Yet approximately half of the world’s internet users still experience online censorship.

The sky-high stakes of AI will compound the challenge of establishing a supranational entity, as leaders will need to address similar – and potentially even thornier – political issues.

Masayoshi Son, CEO of Japanese multinational conglomerate SoftBank and enthusiastic investor in AI, recently said his company seeks “to develop affectionate robots that can make people smile”. To achieve that goal, governments and the private sector need to conceive robust collaborative models to govern critical AI today. The outcome of this effort will determine whether humankind will prevail in creating AI technologies that benefit us without destroying us.

©️ Project Syndicate 2019

Why the Airbus A380 failed to take off

In 2005, when Virgin Atlantic placed an order for six Airbus A380 jets, Richard Branson joked that the airliners, which featured double beds and a casino, would offer customers “two ways to get lucky”, according to The New York Times. Airbus had placed its bet: as more passengers took to the skies each day, it predicted that airlines would demand larger jets as airports and routes became increasingly congested.

The aerospace manufacturer envisioned that the future of flying would be based on the ‘hub and spoke’ model, with passengers travelling between major airports and using connecting flights to reach their final destination.

Designed to compete with Boeing’s models, the A380 was never able to match the success of the 747

Fast-forward to 2019, and Airbus’ forecast has unravelled. Last year, Virgin cancelled its long-standing order of six A380s. Then, in February 2019, Emirates – the most loyal customer of the superjumbo programme – cut its order from 53 to 14 aircraft, while Qantas cancelled the delivery of its last eight A380s. This was the final nail in the coffin for the project: Airbus reluctantly announced the “painful” decision to permanently halt the A380 production line in 2021, following years of speculation regarding the model’s future. The plane that many thought would revolutionise aviation lasted just 16 years.

Efficiency is key
Meanwhile, on the other side of the Atlantic, Boeing made a bet of its own. In contrast to Airbus, the Seattle-based manufacturer envisioned customers flying point to point between smaller airports. As the A380 entered commercial service in 2007 – just as the global financial crisis was starting to hit the pockets of operators and customers – Boeing unveiled the smaller, sleeker, more efficient 787 Dreamliner, a jet that could fly the same distance as Airbus’ jumbo on two fewer engines.

As margins were stretched, airlines struggled to afford the risks associated with jumbo jets. An aircraft that was smaller and quicker to fill was a far more appealing offer. “The A380 is an aircraft that frightens airline CFOs; the risk of failing to sell so many seats is just too high,” a senior aerospace industry source told Reuters in February. As Airbus grappled to find potential customers for its flagship craft, orders for the 787 were piling up.

“The future of aviation is being driven not by the big, established global players, but by small, low-cost start-up operators,” Martin Pugsley, Head of Financial Services at law firm DWF, told The New Economy. “These companies look to scale up quickly by offering lots of flights connecting lots of locations. To do this, they need smaller, lower-cost, fuel-efficient, single-aisle aircraft that have a quick turnaround time. The fastest-growing markets for aviation services are in the developing world, and most of these new players are following the low-cost model too.”

Airbus relied on Emirates to keep the A380 programme running. Prior to cutting its order, the Dubai-based airline had ordered a total of 162 jets (with 109 currently in operation) – nearly seven times as many as the second-largest operator, Singapore Airlines (with 24). Only three other airlines have more than 10 in their fleet, while just one Chinese carrier – China Southern – operates the model. Not one American airline has made an order.

In total, there were 290 firm orders for the jet, of which 234 have been delivered – well short of Airbus’ target of 700. Analysts believe the company has recouped only a small fraction of the estimated $20bn spent on research and development. The jet, first dreamed up in 1988 and blighted by years of delays as Airbus attempted to get it into the sky, never really took off.

Wrong place, wrong time
Many have asserted that the A380 programme was doomed to fail well before production began. Aviation analyst Richard Aboulafia slated the aircraft, calling it “simply the dumbest programme of modern times” in a 2019 interview with travel blog the Points Guy. Designed to compete with Boeing’s models, the A380 was never able to match the success of Boeing’s own jumbo, the 747 – an aeroplane that came to be known as the ‘Queen of the Skies’. Boeing sold roughly 1,500 of the aircraft, with its ‘hump’ becoming the most recognisable silhouette in the sky.

However, regulations have significantly changed since the jumbo jet’s introduction more than 50 years ago. Originally, Extended-Range Twin-Engine Operational Performance Standards (ETOPS) decreed that two-engine aircraft had to fly within 60 minutes of an airport in case they encountered engine issues. As technology progressed and aircraft manufacturers improved their safety records, these regulations were gradually relaxed. Today, Boeing’s 777 and 787 models are able to fly as far as five hours from the nearest airport, leaving them free to operate on effectively any route in the world and opening up new opportunities to airlines – opportunities that have crowded out the A380.

The A380 had its first test flight in 2005 – two years before the new ETOPS directive was issued. With the aircraft designed to take advantage of the previous rules, the change to regulations was a huge blow. “What we are seeing here is the end of the large, four-engined aircraft,” Airbus CEO Tom Enders said in February. “There has been speculation that we were 10 years too early; I think it is clear that we were 10 years too late.” The 747 was a roaring success because it entered service during a period of strict regulation, making it the only option. The A380, meanwhile, deemed a financial flop, was overwhelmed by increased competition as a result of relaxed rules.

Oversized expectations
The spaciousness of the A380 made it a favourite with passengers, while its scale made it a favourite among plane spotters. A dedicated website was set up to help passengers choose a route served by the jet, and in February 2019, The Guardian’s transport correspondent Gwyn Topham said “it felt a miracle” that this enormous plane could fly.

But in spite of this reverence, business travellers – integral to the profit margins of legacy airlines and flag carriers on long-haul routes – demonstrated an overwhelming preference for flight frequency and flexible schedules over aircraft size. Airlines found they could fly two 777s on a typical long-haul route at a lower total cost than flying one A380, making the aircraft economically uncompetitive on all but a handful of routes worldwide.
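The arithmetic behind that comparison is simple to sketch. The cost and capacity figures below are hypothetical assumptions chosen purely for illustration, not actual airline or manufacturer data; they simply show how two part-filled twinjets can undercut one superjumbo on a per-sold-seat basis:

```python
# Illustrative per-seat trip-cost comparison: one A380 vs two 777s.
# All cost and capacity figures are hypothetical assumptions for
# demonstration only, not actual airline or manufacturer data.

def cost_per_filled_seat(trip_cost: float, seats: int, load_factor: float) -> float:
    """Trip cost divided by the number of seats actually sold."""
    return trip_cost / (seats * load_factor)

# One A380: ~500 seats, but harder to fill on most routes.
a380 = cost_per_filled_seat(trip_cost=950_000, seats=500, load_factor=0.75)

# Two 777s: ~350 seats each, flown at a higher load factor and
# offering passengers two departure times instead of one.
b777 = cost_per_filled_seat(trip_cost=2 * 400_000, seats=2 * 350, load_factor=0.85)

print(f"A380: ${a380:,.0f} per filled seat")
print(f"777s: ${b777:,.0f} per filled seat")
```

With these assumed numbers, the pair of 777s comes out markedly cheaper per sold seat; the gap narrows only on the handful of routes where an A380 can be reliably filled.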

“When Airbus unveiled the prototype in 2005, its main selling point was its sheer size. Ironically, the aircraft’s size is what turned potential airline customers off,” said Pugsley. “Each unit is naturally far more expensive than smaller rival models, which meant that Airbus was operating in a small market from the very beginning.”

The limited appeal was further compounded by a lack of airport facilities capable of handling the plane. Airbus had out-engineered the airports: gates had to be refitted to accommodate a plane of such magnitude; runways had to be strengthened to cope with its weight; and even terminals had to be adjusted to ease potential passenger congestion issues stemming from the A380’s vast capacity. “When you deal with new technologies where the industry has not enough previous experience, it’s easy to underestimate the challenges during the design,” Paolo Colombo, Global Industry Director at software developer ANSYS, told The New Economy.

As one innovation meets its demise, another is born. The Boeing 777-9 is due to make its first flight later this year and, though it has just a single deck, its expected capacity of around 400 passengers near enough matches that of the A380. Further, despite having half as many engines, it can fly just as far. Equipped with folding wing tips, the jet can operate in and out of all of the world’s major airports.

With the aircraft already outstripping the A380 in terms of orders a year before its formal launch, it looks like the new ‘Triple Seven’ could be the future of aviation. But, then again, the same was once said of the A380.

The attention economy commodifies human engagement, but it’s difficult to monetise

Brands don’t want your dollars anymore – they want your eyeballs. The ominously named concept of ‘eyeball marketing’, where a business’ value is derived from the amount of attention it garners, rather than its revenue, has become the modus operandi for today’s digitally focused brands. These companies are eschewing the notion that paying customers are loyal customers, and are instead looking to less tangible metrics.

Loyalty is the ultimate attention filter as it prevents consumers from even looking for other information, products or services

This move away from real currency has proved problematic, particularly for digital media companies with complex business models and no physical product revenue to rely on. As online advertising dollars are increasingly captured by social media sites, the so-called ‘attention economy’ has left firms with all eyes on their platforms, but no cash in the bank.

New rules
The attention economy has its roots in a 1971 essay by Herbert A Simon titled Designing Organizations for an Information-Rich World. In the essay, Simon argued that the increasingly information-rich age in which we live has created an overwhelming catalogue of data that humans simply cannot process or absorb.

Attention, then, is used as a differentiating factor in the consumption of this information; a tool to prioritise what is deemed interesting, relevant or important.

In a commercial context, the theory is predicated on an understanding that consumers devote their attention to brands that appeal to them the most. According to Paul Armstrong, founder of technology consultancy HERE/FORTH, “eyeballs and clicks” have become a new currency – except, rather than consumers spending money with a brand, they’re spending time. This mark of success is far more immaterial and challenging to measure.

Metrics such as bounce rates, clicks and time spent viewing a page are used as measures of consumer attention, but the true value of these figures is questionable. While brands that sell physical products may be able to identify a clear correlation between attention and sales, for platforms that disseminate their content freely, the attention economy creates a far more challenging environment.

Attention seekers
Digital news sites are particularly vulnerable in the attention economy. Their entire business model is based on building a larger readership – one that does not pay for what it reads. These platforms have traditionally relied on advertising to support them, but it has become increasingly clear in recent months that this is not enough to sustain many businesses within the sector.

This was proven earlier this year when more than 2,000 employees at publishers including Vice Media, HuffPost and BuzzFeed News were made redundant in the space of several weeks. Around 9.5 percent of HuffPost’s workforce was laid off in late January after its parent company, Verizon Media, slashed the value of two recent acquisitions by $4.6bn, admitting competition for digital advertising was leading to shortfalls in revenue and profit. BuzzFeed, meanwhile, cut 220 jobs in a bid to boost profitability, following pressure from its venture capital backers.

The redundancies are symptomatic of shifts in the attention economy, where once-safe digital advertising has been upended by tech giants like Facebook and Google, and companies are now struggling to find new sources of revenue. “Digital publishers are being beaten at all turns by big players,” Armstrong told The New Economy. “Few seem to be proactively looking for new models and it’s this lack of innovation that’s really hurting their businesses.”

The rise of the internet has also created an economy where products – in this case, the news – cost virtually nothing to reproduce, meaning suppliers face issues in adding value. Newness and findability are not enough to sustain a business model, so publishers must find ways to position themselves as a trusted source or produce personalised content to capture the attention and loyalty of their customers.

“Loyalty has never been harder to achieve thanks to the choice available and the transient nature of brands,” said Armstrong. Yet, it’s vital for brands to achieve this if they are to survive in the attention economy. Loyalty is the ultimate attention filter as it prevents consumers from even looking for other information, products or services. Some have hit the nail on the head: exclusive clothing brand Supreme, for example, creates loyalty through scarce product availability and a strong online presence, generating a “cultural phenomenon and legions of obsessive fans”, said Armstrong. Others, such as short-form video-hosting service Vine, have seen their customers running for the door in favour of more relevant or exciting options.

Innovate to survive
The only way to beat the metrics is to “focus on core strengths – remind people why they come to that brand and what that brand stands for”, Armstrong told The New Economy. But establishing a loyal audience is only the first step – the real challenge is monetising the attention of that audience.

Digital advertising alone is not sufficient, so some brands have opted to put up paywalls, meaning readers must pay a fee before they can access certain products or content, which serves to lock in attention to that brand. It’s proved a successful strategy for some – for example, The New York Times’ 2.5 million digital subscribers accounted for almost two thirds of its $417m revenue in Q3 2018. However, there is a caveat, explained Armstrong: “Paywalls depend on an existing audience and strong traffic. If you have the former, go forth and test whether it will pay for content. But, if you just have traffic, feel free to test a paywall, but be warned that it’s unlikely to become a huge source of revenue if interest is transitory.”

Others have established systems where users enter their personal information to access content, such as inputting an email address to download a report. Brands can use this information in future email marketing, increasing their consumer database and the likelihood of future sales. Similarly, though, “selling data relies on a core understanding of who your audience is and what they are willing to accept [in terms of subsequent marketing],” said Armstrong. “Too often data is abused, or brands apply too-broad strokes”, which has a detrimental effect on loyalty.

The fundamental issue with the attention economy, though, is that attention as a metric is valorised entirely by brands. “Attention is a currency in so much that brands like it and want more of it,” said Armstrong. “The average consumer doesn’t think about giving their attention to a brand’s content.” If consumers aren’t making a conscious decision about where they direct their attention, brands must undertake a significant amount of guesswork to establish how to influence them.

“Attention is a fool’s errand as a [key performance indicator],” said Armstrong. Rather than obsessing over the amount of time consumers spend viewing their content, brands should focus on creating pioneering products and experiences that have inherent value and are relevant to their target audience; those will ultimately draw long-lasting attention. Placing too much importance on where consumers spend their attention locks content creators into a never-ending feedback loop and slows innovation to a drip. As Armstrong concluded: “The best brand strategies are agile and show willingness to try new things, without necessarily being sure of the [return on investment].”

Fatal dam collapse in Brazil casts doubt over the mining industry

Before 2019, few had heard of Brumadinho, a small municipality of Brazil. Located in the state of Minas Gerais (which means ‘general mines’), the town is largely dependent on mining for its livelihood. On January 25, tragedy put Brumadinho on the map.

The local mining dam suddenly collapsed and a mudslide engulfed the area, bursting through the mine’s offices before descending on the community below. The most recent figures from the Civil Police of Minas Gerais, released on May 5, put the death toll at 237, with 33 people still missing, making it the deadliest incident in the history of Brazil’s mining industry.

The mining industry has tended to treat waste management as an afterthought and is now suffering the consequences

The owner of the complex, the mining company Vale, now faces public condemnation. Since the incident, 11 employees and two contractors have been arrested. This is not the first time Vale has been implicated in a dam collapse. A similar accident occurred in 2015, when a dam owned by Vale and BHP burst in Mariana (also located in Minas Gerais), killing 19 people and flooding the local area.

In both instances, the dams were tailings dams, which store the waste produced during the mining process. There is one particular type of tailings dam – the upstream dam – that has been known for more than 10 years to have a higher risk of collapse than any other. Upstream dams use the mine waste itself to form part of their structure, which can make them less stable over time. There are 88 dams of this kind in Brazil. Both the Brumadinho dam and the Mariana dam were upstream dams.

Erica Schoenberger, Professor of Environmental Health and Engineering at Johns Hopkins University, told The New Economy: “Upstream dams, where the sequential stages of the dam are built inwards, are cheap but not so good at retaining water and are particularly susceptible to seismic shocks.”

In spite of these safety concerns, dams built using the upstream method constitute the vast majority of tailings dams. There are approximately 3,500 tailings dams worldwide and they are among the largest man-made structures on Earth. Yet few people are aware of them. As containers for a toxic sludge of water and minerals, tailings dams exist to keep mining waste out of sight and out of mind. The invisibility of these dams within public consciousness reflects their place in the mining industry, which has tended to treat waste management as an afterthought and is now suffering the consequences.

Behind the Vale
Vale is the world’s largest iron ore exporter. Given the significant role that natural resources play in the Brazilian economy, the organisation has enjoyed huge success over the years. This can be attributed in part to the commodities boom of the mid-2000s, which was spurred on by high resource demand from China. As a major job provider in Brazil, the mining industry has also tended to enjoy support from politicians. For instance, President Jair Bolsonaro campaigned on promises to increase mining in the Amazon rainforest. The sector has been incentivised to maximise production, seemingly at the expense of safety regulations.

Across many industries, the traditional approach to waste is to deal with it as cheaply as possible. For example, water dams are functionally very similar structures to tailings dams but their security is taken more seriously at the early stages of design since water is a valuable asset; there is profit to be made from storing water effectively.

By comparison, there is no economic incentive for storing mine waste – or rather, there is no clear economic incentive until disaster strikes. After the crisis, Vale’s shares fell 24 percent. The company recorded a $1.6bn loss in the first quarter of 2019. Union Investment is among the investors who have since walked away from the company.

Over the last decade, tailings dam collapses have become more common, and researchers at World Mine Tailings Failures predict that 19 very serious failures will occur between 2018 and 2027. As a result, investors are now realising the extent of the risk across the mining industry. In April 2019, the Church of England Pensions Board and Council on Ethics of the Swedish National Pension Funds called upon almost 700 listed mining companies to disclose information about their management of tailings dam facilities. They posited that investors have a right to better assess the risk of their holdings in mining companies.

Vale maintains that it has always abided by global regulatory standards, a claim that is being assessed by the ongoing criminal investigation. However, if true, it would suggest that the current global standards do not do enough to reduce the risk of collapses.

Searching for alternatives
In part, the recent dam collapses stem from poor design and engineering. Tailings dams are outdated structures that have seen little innovation since they were created approximately 100 years ago. The Ecologist argues that this lack of innovation can be attributed to cutbacks to research and development departments in the downturn of the mining supercycle. Now many are calling for the use of alternatives to tailings dams.

One such alternative is the dry stack method. This involves dewatering the mine waste and compacting it for storage. This not only means that less water is used in the mining process but also makes the structure much more stable. However, it is not currently considered to be financially viable for larger mines and only works in certain environments. Its use was rejected for a new mine in British Columbia because it is less feasible in wet, mountainous regions.

A particularly attractive prospect looks to minimise the amount of waste produced by mining or even eliminate it altogether. Dr Bernhard Dold, Professor of Applied Geochemistry at Luleå University of Technology, believes such techniques could prove instrumental in making the mining industry more sustainable. “We are at the starting point of this transition of the mining process,” said Dold. “The final goal is to eliminate the waste (tailings and waste rocks) and hence the risk of dam failures and pollution.”

According to Dold, one approach is to separate the more dangerous, reactive materials in the tailings from non-reactive silicates like sand, which could be repurposed for construction and other industrial processes. An even more exciting prospect, however, would be to find a way of turning tailings into a georesource. If this became commonplace within the industry, then the mining sector would naturally invest in the safety of tailings dams, as these would now store an asset as opposed to a waste product.

Taking responsibility
It is unlikely that the right engineering solution will present itself immediately. It may be that what the industry requires first is a fundamental shift in mindset.

“The planning and design of tailings storage facilities (TSFs) need to be integrated into the overall plan for the mine,” said Schoenberger. “This may seem a no-brainer, but it has not been standard practice in the industry. Expertise in TSF engineering needs to be nourished within the firm so that expert advice is consistently part of the process. Given that tailings dam failures are the largest cause of mining disasters, mine management needs to fully embrace the idea that TSF design and construction must attain the highest standards regardless of the cost, if for no other reason than that a TSF failure will cost the company more.”

According to Schoenberger, the tailings dam failures are more the result of insufficient regulations and poor governance than they are a question of technology. In support of this argument, The Wall Street Journal reported that Brumadinho miners had warned managers that the dam was going to collapse and still nothing was done. Schoenberger argues for greater responsibility at the decision-making level, both within companies and also from governments.

“Government regulation of the industry – and robust enforcement of those regulations – is an absolute necessity if we want to reduce the number and severity of mining disasters,” said Schoenberger. “Regulations on paper must be translated into practices on the ground. This means also that governments have to embrace the same principles of best practice and commit to enforcing them.”

Historically, the economics of disaster risk has not been factored into the overall cost of building a mine. The Brumadinho disaster shows just how detrimental this shortcoming can be, both to mining companies and, more importantly, the communities they operate in. As shareholders become more aware of the risks posed by tailings dams and as governments place pressure on the industry to act, mining companies may be forced to find new, safer ways of managing their waste.

Graphene is the new wonder material transforming the energy sector

Stories of supposedly groundbreaking technologies seem to be around every corner. It is rare, however, for these innovations to actually change the world. One material that succeeded in disproving its sceptics was plastic – dating back to the mid-19th century, synthetic polymers have had a profound impact on the planet. As their price tumbled in the 1950s and 60s, plastics were used in the mass production of items that never would have been possible without the material’s pliability, strength and lightweight quality. Plastic products, from disposable syringes to water bottles to contact lenses, have helped boost living standards around the world.

Graphene boasts an impressive collection of superlatives: not only is it the world’s thinnest material, but it is also the strongest

However, plastic is now falling out of favour due to its contribution to environmental pollution and its role in encouraging a throw-away culture. But there is another wonder material promising disruption on a global scale: graphene. According to the University of Manchester, which was the site of graphene’s discovery, “combining all of graphene’s amazing properties could create an impact of the scale last seen with the Industrial Revolution”.

Groundbreaking graphene
Graphene is made up of a single microscopic layer of carbon atoms laid out in a honeycomb-like lattice. Although it can be found in an object as ordinary as a pencil, graphene is a completely extraordinary material. In 2004, two researchers from the University of Manchester – Andre Geim and Kostya Novoselov – became the first to isolate graphene by sticking adhesive tape to a piece of graphite. When they pulled the tape away, they were able to separate a single layer of carbon, opening the door to a new class of two-dimensional materials. The pair won the 2010 Nobel Prize in Physics for their “groundbreaking experiments” with graphene.

Graphene boasts an impressive collection of superlatives: not only is it the world’s thinnest material, but it is also the strongest. It is tougher than diamond, more conductive than copper and incredibly flexible. Graphene is also almost completely transparent, yet dense enough to be impermeable even to helium atoms.

Around the world, entrepreneurs and businesses are taking note of the remarkable material. Between 2010 and 2017, the number of patent filings related to graphene grew at an average rate of nearly 61 percent per year to reach 13,371, according to market research firm ReportLinker.

Graphene’s appeal is nearly universal. Its flexibility and transparency could one day contribute to a phone that could be rolled up like a newspaper, while its incredible strength allows it to take a hit twice as well as Kevlar, the fabric currently used in bulletproof vests. As a replacement for health-tracking wearables like Fitbits, graphene could be tattooed directly onto the skin. In the future, aeroplanes, cars and countless other machines could be manufactured out of lightweight, superstrong graphene. Already the material has been injected into tennis rackets to enhance their durability.

Value of the graphene market

$85m

2017

$200m

2018

$1bn

2023 (predicted)

In the energy sector, there are a number of ways graphene could enhance power generation, storage and infrastructure. As Craig Dawson, a graphene applications manager at the University of Manchester’s Graphene Engineering Innovation Centre, told The New Economy: “Because graphene has the ability to serve several purposes, often at the same time, we could see its use [becoming] ubiquitous across the sector.”

Rapidly growing interest in the material has seen the value of the graphene market soar from $85m in 2017 to nearly $200m in 2018, ReportLinker said. Predictions vary about how quickly the market will continue to grow. While market research firm IDTechEx expects the industry to be worth $300m by 2027, ReportLinker said it could reach as much as $1bn by just 2023, with China leading the way in research and development.
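Those forecasts imply a steep compound annual growth rate. As a rough arithmetic check (using the $85m 2017 figure and ReportLinker’s $1bn 2023 forecast quoted above, not any rate published by the firms themselves):

```python
# Implied compound annual growth rate (CAGR) between two of the market estimates above
start_value = 85e6   # $85m market value in 2017
end_value = 1e9      # $1bn forecast for 2023
years = 2023 - 2017

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"implied growth: {cagr:.1%} per year")  # roughly 50 percent per year
```

In other words, hitting the $1bn figure by 2023 would require the market to keep growing at around half again its size every year.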

Khasha Ghaffarzadeh, Research Director at IDTechEx, wrote in 2018 that China has built a commanding position in technology development, dominating sectors from solar photovoltaics to 3D printing. It is no surprise, then, that the country quickly became the centre of graphene innovation. According to Ghaffarzadeh, the largest suppliers of advanced materials like graphene are Chinese, and nearly 70 percent of total nominal production capacity is located in the country.

Super-charged applications
Globally, the amount of electricity generated by renewable energy sources is on the rise. In 2017, nearly 24 percent of electricity produced around the world came from renewables. By 2023, the International Energy Agency (IEA) expects that figure to rise to almost 30 percent. Led by solar, wind and hydropower, renewables are set to meet more than 70 percent of global electricity-generation growth, the IEA said.

One big snag in the development of the renewables industry has been the fact that green sources of energy are produced intermittently – when the sun is shining or the wind blowing, for example. To ensure excess energy can be held back and deployed once again when the clouds come out, energy storage will be a vital element of the renewable energy mix.

Dawson and his colleagues at the University of Manchester see graphene as a good candidate for the creation of next-generation batteries, thanks to the material’s mechanical and chemical robustness alongside its impressive conductivity. “Graphene’s impermeability could be useful within devices such as fuel cells and grid-scale redox batteries where protons can hop across the graphene-enhanced membranes,” Dawson explained.

Graphene meets all the qualities of an ideal energy storage device, according to Rob Dryfe, a professor at the University of Manchester. Speaking about graphene in a video for the university, Dryfe said: “[You] want a high-surface, high-volume, light, stable, conducting material, and the advantage of graphene is that it basically ticks all of those boxes.”

Most energy storage devices today use lithium-ion batteries, which can be made up of various formulations of lithium, cobalt, aluminium, nickel and manganese. While lithium-ion batteries are prized for their ability to hold a charge, they are not efficient when it comes to shifting energy in and out of batteries quickly. For this reason, graphene supercapacitors would be ideal for electric vehicles – especially high-end supercars. As well as enabling cars to reach a high speed in seconds, supercapacitors could allow electric vehicles – as well as other electronics like phones and laptops – to be fully charged almost instantly.

Supercapacitors and batteries each have their benefits and downfalls. “Where high power is required, a supercapacitor could be advantageous, and where a large charge capacity is required, a battery is probably more suitable,” Dawson explained. Graphene researchers are also looking into the development of graphene supercapacitor hybrid batteries, which Dawson said would offer the best of both worlds.

Some players within the graphene supply chain, including students and partners of the University of Manchester and corporations like Samsung, have seen “tangible results” of such graphene-modified batteries and supercapacitors, Dawson said. “We could, therefore, expect to see these enter the market in the near future.”

For now, these devices need further research to become fully optimised for use in the energy sector. Nunzio Motta, a professor of science and engineering at the Queensland University of Technology, told The New Economy this is another area where China is ahead of the curve. Some Chinese companies are already producing commercial graphene batteries and supercapacitors, he said, which are being used to power electric buses.

Beyond batteries
Graphene’s utility extends far beyond batteries and supercapacitors, but because it was discovered less than two decades ago, many of its possibilities are still unknown. Researchers are working on experiments today that just five or 10 years ago wouldn’t have seemed possible.

For instance, Dawson said graphene’s dual transparency and conductivity mean the material could be used to improve the efficiency of solar cells. This could help end the industry’s reliance on rare materials such as indium tin oxide, an expensive compound used as a transparent, conductive thin film. “[Graphene] is also impermeable to almost everything, protecting energy infrastructure from the elements,” he added.

In 2012, researchers made carbon-based solar cells using graphene, which they said would be more easily produced and recycled than existing solar cell designs. The prototypes were expensive to make, however, and the scientists said more research was needed in order to optimise efficiencies.

Other projects are ongoing, including a joint venture between UK-based energy technology developer Verditek and graphene specialist Paragraf. The two teamed up in 2017 to create “a new generation of highly robust, ultra-lightweight” graphene-based solar panels that could “potentially revolutionise the photovoltaic market”. Researchers at three Australian universities, meanwhile, joined together to develop a light-absorbing, ultra-thin film that they said has “great potential” for use in solar thermal energy harvesting. Upon presenting their findings in March, the scientists said the graphene film offers a “new concept” for solar energy, opening new research areas, including in the direct conversion of heat to electricity and the desalination of water.

Motta told The New Economy that he does not see graphene as a suitable material for solar cells, but that it could be useful in concentrating solar power plants, which use mirrors to focus sunlight into a central tower where water is boiled to generate superheated steam. “Mirrors covered with graphene could have some application because graphene could conduct the heat very well,” Motta said. The possible applications of graphene elsewhere in the energy industry are far-reaching, including reinforcing materials used in wind turbine blades and lacing concrete used to build other energy structures with graphene.

Opening doors
Graphene is not only revolutionary because of what it can do – it has also changed scientists’ understanding of similar resources. Take superconductors, for example: these materials are able to conduct electricity with no resistance, meaning energy in a superconductor could flow continuously without ever degrading.

The problem with most superconductors is that they only work at temperatures near absolute zero (–273.15 degrees Celsius), where atomic motion all but ceases. Because it is impractical to run superconductors with the expensive cooling systems needed to make them work, scientists have spent decades searching for a material that could act as a superconductor at room temperature.

Graphene, while not a high-temperature superconductor in itself, has brought scientists a step closer to solving this puzzle. Two papers published in the science journal Nature in 2018 showed that graphene could work as a superconductor if two layers were stacked together at the so-called ‘magic angle’. The researchers found that graphene works in a similar way to other ‘unconventional’ superconductors called cuprates, which can conduct electricity at temperatures as high as 133 kelvin, or around –140 degrees Celsius.

While scientists have struggled to understand the workings of cuprates, the graphene system was relatively well understood. “The stunning implication is that cuprate superconductivity was something simple all along. It was just hard to calculate properly,” Robert Laughlin, a physicist and Nobel laureate at Stanford University, said at the time of the announcement. While Laughlin said it was not yet clear whether the behaviour of cuprates would match that of graphene, he said enough behaviours were present to “give cause for celebration”.

Tipping point
In the coming years, China will continue to dominate the graphene market, according to Ghaffarzadeh, who wrote that China has “the technology, the investment, the resolve and the value chain, including all the major target markets”. Elsewhere in the world, there remains a disconnect between the technical research into graphene and the businesses that can create practical, innovative products. Dawson said the umbrella group Graphene@Manchester, which includes the university’s Graphene Engineering Innovation Centre and its National Graphene Institute, aims to bridge the “innovation gap” between academia and the commercial world by forging collaborative partnerships between researchers and corporate entities.

Going forward, experts expect the global graphene industry to reach a critical juncture. Ghaffarzadeh has suggested a “major turning point” for graphene could come in 2020 or 2021, while Graphene@Manchester CEO James Baker told China Daily that the tipping point could arrive as soon as the end of the year.

However, in a report published in 2016, consultancy firm Deloitte said the research and prototyping phase for graphene would likely extend for another decade. The report noted that throughout history, the materials that make the biggest impact tend to take decades of development before achieving mainstream adoption. Graphene products created in 2016 were just offering a glimpse of the material’s full potential, the Deloitte report said: “Some of the future technologies and benefits of graphene, which could embody the ‘graphene era’ and change our world, only currently exist within the realms of our imagination.”

Although it is almost certain that graphene will not solve every problem researchers claim it will, its discovery in 2004 has already opened the door to countless new areas of research and product design. For the energy sector, graphene’s arrival has led to groundbreaking discoveries and innovative new ideas – but this could just be the beginning. As Motta said: “We’re probably just scratching the surface regarding what graphene can do for the future energy sector.”

As the technology becomes more viable, we must assess the ethics of mind-reading

For decades, scientists have worked to decode the secrets of the brain. Whether through tracking the peaks and troughs of electrical activity via electroencephalography or examining the structure of the brain with computerised tomography and MRI scans, neuroimaging has helped us glean a greater understanding of the inner workings of the mind. Now, thanks to the rapid development of AI and machine learning, we are closer than ever to unlocking new methods of communication with so-called ‘mind-reading’ technology.

Through brain-computer interfaces (BCIs), scientists can ‘read’ an individual’s mind by opening a pathway for communication between the brain and a computer. According to Ana Matran-Fernandez, an AI industry fellow at the University of Essex who studies BCI technology, this could be a significant catalyst for change in the lives of people who are currently unable to communicate.

Action should be taken now to ensure powerful corporations, hackers and governments cannot exploit people through BCIs when the technology becomes more sophisticated

By integrating BCI technology with smart home devices, for example, we could one day give those who are fully paralysed with locked-in syndrome a renewed sense of independence and freedom. But even in the face of these potentially life-changing benefits, the ethical implications of mind-reading technology are immense.

Getting in your head
Both public and private groups have thrown money at research projects in a bid to get the nascent mind-reading industry off the ground. In 2018 alone, the US Government invested $400m in the development of neurotechnology through its BRAIN Initiative, while in 2017 the US Defense Advanced Research Projects Agency funnelled $65m into the development of neural implants. Elsewhere, the European Commission has put €1bn ($1.1bn) towards a 10-year programme that aims to model the entire human brain, known as the Human Brain Project.

Companies are also working to push the limits of BCI technology. In a 2017 article published by the science journal Nature, a team of neuroscientists, ethicists and machine-intelligence engineers called the Morningside Group estimated that for-profit entities were already spending $100m per year on neurotechnology development. Allied Market Research, meanwhile, has predicted the global market for BCIs will reach as much as $1.46bn by 2020.

Facebook is one of the most prominent companies currently experimenting with mind-reading technology. At an annual developer conference in 2017, Regina Dugan, then head of Facebook’s division for new hardware projects, Building 8, said users would one day be able to type on the platform using only their minds.

“It sounds impossible, but it’s closer than you may realise,” Dugan said.

Mind-reading technology industry

$1.46bn

Estimated value by 2020

$400m

Amount invested by the US Government in 2018

$100m

Amount invested by for-profit companies each year

Although Dugan left Facebook after just 18 months and Building 8 was dismantled at the end of 2018, Facebook founder Mark Zuckerberg recently reiterated his interest in BCI technology during an interview with Jonathan Zittrain, a professor of international law at Harvard Law School. Zuckerberg had previously called telepathy the “ultimate communication technology”. Other prominent names exploring BCIs include billionaire entrepreneur Elon Musk and Japanese carmaker Nissan. The former is working on a secretive project called Neuralink, which seeks to make humans “symbiotic” with AI through an implant in the brain, while the latter revealed plans for its ‘brain-to-vehicle’ technology in 2018.

Despite these thought-provoking plans, Matran-Fernandez pointed out a number of hurdles to the commercial development of BCIs, principally the cost. In fact, most of the research being conducted into BCIs is currently confined to laboratories because the necessary equipment is so expensive. But even if this technology were cheap enough for mass-market production, Matran-Fernandez believes the proprietary algorithms underlying such devices are still not up to scratch: “In practice, the headsets I’ve seen that will be cheap enough for anyone – I’m talking $100, $200 – that would be cheap enough to buy for personal use, they are not… good enough.” This is due, in part, to the fact that every brain is different, and the minds of those with conditions like locked-in syndrome have an added layer of complexity.

Matran-Fernandez told The New Economy: “I think the technology is getting there, and the algorithms are getting there. It’s just a matter of putting it all together and finding someone who’s willing to work on that and make it open-access.”

Brain trust
While the industry must tackle these practical concerns to boost the widespread adoption of mind-reading technology, serious ethical concerns about the practice continue to threaten its viability. For now, BCI technology remains relatively rudimentary, and it could be decades before it becomes advanced enough to be used in everyday life. But the Morningside Group has argued that action should be taken now to ensure powerful corporations, hackers and governments cannot influence or exploit people through BCIs when the technology becomes more sophisticated.

The group warned: “[We] are on a path to a world in which it will be possible to decode people’s mental processes and directly manipulate the brain mechanisms underlying their intentions, emotions and decisions, where individuals could communicate with others simply by thinking, and where… mental and physical abilities are greatly enhanced.” In such a scenario, the group believes the existing ethical guidelines on research and human testing – namely the Declaration of Helsinki and the Belmont Report – would be “insufficient”.

The matter of privacy and security is further complicated by the unique fragility of the human mind. In some cases where people received deep brain stimulation through electrode implants, for example, participants reported feeling a loss of identity and agency afterwards. As researchers Katherine Pratt and Eran Klein pointed out in an article written for The Conversation, neural data is unlike other forms of personal data because it is “intimately connected to the mind and who we take ourselves to be”.

Matran-Fernandez, however, is not particularly concerned about issues of privacy and identity just yet. As she explained, no mind-reading actually takes place; instead, the process is like being outside the door of an auditorium when an orchestra is playing. In Matran-Fernandez’s analogy, the orchestra is the brain, with different instruments making up different neural activities. “From outside you can hear, in general, the music,” she said. “[But] it’s going to be distorted. You don’t hear it as if you’re sitting inside… You cannot distinguish the different instruments.”

It is similar for researchers using BCI technology. If two categories are created – ‘yes’ and ‘no’, for example – researchers can study the brain’s response to each option and distinguish between the two, but they cannot pinpoint what a participant is actually thinking about.
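The two-category approach can be illustrated with a toy simulation. Everything below is synthetic – the ‘brain responses’ are just noisy sine waves – so this is a minimal sketch of template matching under invented assumptions, not a real EEG pipeline:

```python
import math
import random

# Simulate averaged responses for two categories ('yes' and 'no'), then
# classify new trials by which class's mean template they sit closest to.
random.seed(0)
N = 64  # samples per simulated trial

template_yes = [math.sin(math.pi * i / (N - 1)) for i in range(N)]  # hypothetical 'yes' response
template_no = [0.0] * N                                             # hypothetical 'no' response

def simulate_trial(template, noise=0.5):
    """One noisy observation of an underlying response."""
    return [x + random.gauss(0, noise) for x in template]

def mean_trials(trials):
    """Average many trials sample-by-sample to build a class template."""
    return [sum(vals) / len(trials) for vals in zip(*trials)]

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(trial, mean_yes, mean_no):
    # Nearest-template rule: researchers can tell the two categories apart
    # without knowing anything else about what the participant is thinking.
    return "yes" if distance(trial, mean_yes) < distance(trial, mean_no) else "no"

# Build mean templates from 50 noisy training trials per class
mean_yes = mean_trials([simulate_trial(template_yes) for _ in range(50)])
mean_no = mean_trials([simulate_trial(template_no) for _ in range(50)])

# Classify 40 fresh trials
trials = [("yes", simulate_trial(template_yes)) for _ in range(20)]
trials += [("no", simulate_trial(template_no)) for _ in range(20)]
correct = sum(classify(t, mean_yes, mean_no) == label for label, t in trials)
accuracy = correct / len(trials)
print(f"accuracy: {accuracy:.0%}")  # near-perfect on this easy synthetic task
```

The point of the sketch is the limitation it shares with real systems: the classifier only ever answers “which of the two templates is this closer to?” – it cannot recover the content of a thought.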

Mind over matter
Although mind-reading technology prompts numerous ethical debates, there are promising practical uses for BCIs, especially in the healthcare sector, where it has already had a degree of success. A consortium of leading scientists called BrainGate, for instance, is using BCI technology to help people with neurological diseases, injuries or lost limbs regain movement and the ability to communicate.

In one of the group’s case studies, paralysed individuals with sensors implanted in their brains were able to type up to eight words per minute on smartphones and tablets. In another proof-of-concept demonstration, BrainGate showed how people with chronic tetraplegia resulting from spinal cord injuries could regain limb movements with the help of BCI technology.

A separate group of researchers at Columbia University, meanwhile, was recently able to construct synthetic speech using a device called a vocoder, which deciphered what people wanted to say through brain activity alone. With more research, it is hoped that BCI devices will allow people to control wheelchairs, robotic prosthetics and paralysed limbs, as well as enabling those with severe locked-in syndrome to communicate.

Today, the very first steps are being made into these areas of research. As we develop a better understanding of the brain, there is great potential for further improvements in healthcare and medicine. While researchers must tread carefully through the ethical dilemmas these technologies pose, the adoption of mind-reading technology throughout society appears to be a question of when, not if.

Cleaning up crypto’s dirty legacy

In Irkutsk, Eastern Siberia, the temperature barely creeps above freezing for at least seven months of the year. Life is hard for the city’s 600,000 residents – much of the local economy is based on energy revenues, meaning it was hit particularly hard by the 2014 oil crash. Suffering food shortages, unemployment and a bootleg alcohol tragedy that killed at least 76 people, residents could be forgiven for feeling there were few prospects left in this isolated, frozen wasteland.

In 2017, however, they found a glimmer of hope in an unlikely place: the nascent bitcoin market. The process of ‘mining’ for cryptocurrencies relies upon the one thing that Irkutsk has in spades – access to cheap energy. Bitcoin mining in particular is a hugely energy-intensive industry, as it requires multiple high-power servers working at full capacity to solve mathematical problems and generate currency. The technology also produces a huge amount of heat, so powerful cooling systems are vital – except in Irkutsk, where the naturally frigid Siberian climate does the job for free.

The bitcoin industry has a huge carbon footprint, which could threaten any global progress made on tackling climate change

But while bitcoin mining may have boosted this struggling economy, it’s also causing significant environmental damage on a global level. The industry has a huge carbon footprint and produces a significant amount of electronic waste, which could threaten to derail any global progress made on tackling climate change.

Consumptive industry
Bitcoin, the decentralised currency system, was developed by Satoshi Nakamoto, the name given to the unknown individual or group that published the original open-source software in 2009. The value of each unit of the cryptocurrency has since risen exponentially, hitting a high of around $20,000 at the end of 2017.

To earn bitcoins, miners solve mathematical problems to verify ‘blocks’ that are stored in a blockchain ledger. “[Miners] are creating what we call ‘crypto hashes’ – they’re essentially trying out random numbers to try and find a signature for a new block,” explained Alex de Vries, a blockchain expert at PwC. Around two billion attempts are needed to identify this signature, which is why most miners have multiple computers working on problems simultaneously. This speeds up the process, but also increases the amount of energy used for each ‘mine’.

The energy needed to produce one bitcoin is also rising over time, due to a feature of the original software that only allows a new block of the currency to be mined roughly every 10 minutes. As the number of miners across the globe grows, each must attempt an increasing number of hashes in the competition to win the reward for that 10-minute period. “Every machine that’s added to the network takes a little bit of the same pie that everyone is eating,” said de Vries. More attempts mean more computers working, which means more energy will be used. Mining computers have become more efficient, with the latest generation of machines able to create roughly 20 percent more crypto hashes per MWh of electricity. However, Alex Hern, Technology Editor at The Guardian, explained in a January 2018 article: “In the zero-sum game of bitcoin mining, that just means a miner can afford to run more machines at the same time, leaving their power usage roughly stable.”
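The hash lottery de Vries describes can be sketched in a few lines. This is a toy proof-of-work search with a trivially low difficulty target (three leading zero hex digits rather than bitcoin’s real, vastly harder target), and the block contents are invented for illustration:

```python
import hashlib

def mine(block_data, difficulty=3):
    """Try successive nonces until the block's SHA-256 hash meets the target."""
    target = "0" * difficulty  # toy difficulty: hash must start with this many zeros
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

# Hypothetical block contents, purely for illustration
nonce, digest = mine("block 1: Alice pays Bob 5 BTC")
print(f"found nonce {nonce}: {digest[:16]}...")
```

Each extra leading zero multiplies the expected number of attempts by 16, and real difficulty is periodically adjusted to keep blocks arriving every 10 minutes or so regardless of how much hardware joins the race – which is exactly the ‘zero-sum game’ Hern describes.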

Power needs
The sheer volume of energy needed means that areas such as Irkutsk are popular with bitcoin miners, as electricity there costs just four cents per kilowatt-hour. For context, in New York City, electricity costs 19.4 cents per kilowatt-hour. Part of the reason energy is so cheap in Irkutsk (and in other areas that are popular with bitcoin miners, such as China) is that it is produced by hydroelectric dams. This source of clean, renewable energy happens to be extremely inexpensive, as water is an abundant natural resource that is not affected by market volatility. Hydropower dams also require little maintenance, have low operating costs and produce a consistent and reliable source of energy for up to 100 years.

4¢

Cost per kilowatt-hour of electricity in Siberia

19.4¢

Cost per kilowatt-hour of electricity in New York

83%

of available bitcoins have already been mined

3.4m

Total bitcoins left to be mined

The use of hydroelectric energy to power the bitcoin mining industry has drawn criticism from environmental campaign groups for two reasons. First, cryptocurrency has no intrinsic value – its worth is entirely defined by market speculation. If the entire sector were to implode tomorrow, the most ‘successful’ miners would be left holding huge amounts of an immaterial product suddenly rendered worthless. Cryptocurrency is also a duplicative industry: it replicates the existing currency system without solving any new problem or fulfilling any societal need. As such, the use of one of the world’s cheapest clean energy sources to power this unnecessary and highly consumptive process is controversial, to say the least.

Second, the bitcoin mining industry demands constant access to power, which renewable sources don’t provide. As de Vries explained: “The sun isn’t always shining, the wind isn’t always blowing, and it’s the same for hydropower.” Therefore, in some areas, hydroelectric energy is supplemented by coal-based power, which is the most detrimental for the environment. “The presence of bitcoin miners could actually be a reason to either build new coal-based power plants or reopen existing ones,” de Vries told The New Economy. “That… happened in Australia last year, where a coal-based power plant was reopened for bitcoin mining. That’s the worst possible outcome of the process.”

An end in sight
Bitcoin’s vast energy needs have caused huge issues in countries with less-than-robust energy sectors. In Venezuela, according to cryptocurrency news website Bitcoinist.com, mining operations have caused widespread blackouts and have drained the national grid of electricity that could be used to power homes – particularly bad news for a country currently in the grip of the worst economic crisis in its history. According to estimates by the Bitcoin Energy Consumption Index, bitcoin consumes around 49TWh of energy per year – an amount that could power more than 4.5 million US households if it were redirected.
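The household comparison checks out with simple arithmetic. The per-household consumption figure below is an assumed approximation of the US average, not a number from the index itself:

```python
# Rough sanity check of the households-powered comparison above
bitcoin_twh_per_year = 49                 # Bitcoin Energy Consumption Index estimate cited above
kwh_per_us_household_per_year = 10_900    # assumed approximate US average consumption

households = bitcoin_twh_per_year * 1e9 / kwh_per_us_household_per_year
print(f"{households / 1e6:.1f} million households")  # about 4.5 million
```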

In addition to draining the world of energy, bitcoin is also causing significant environmental damage. The cryptocurrency’s current annual carbon footprint is roughly the same size as Singapore’s, according to data from Digiconomist. By effectively adding another country’s carbon impact to the global total, the world becomes increasingly unlikely to hit the warming targets set out under the Paris climate agreement. If these are not met, it will have disastrous consequences for weather patterns and ocean ecosystems.

The industry also creates a huge amount of electronic waste, as mining hardware burns out over time. “These machines cannot be repurposed, so they effectively become single-purpose, as they are hardwired for mining,” explained de Vries. “When they’re done, the only thing you can do with them is to throw them away – they become electronic waste immediately after they become unprofitable.” Just 20 percent of this waste is recycled globally, with the rest ending up in already-brimming landfills or incinerators.

“The total electronic waste output of the bitcoin network is as much as Luxembourg, by my initial estimates, and that’s a very conservative number,” said de Vries. Add to that bitcoin’s massive energy consumption and carbon footprint and we’re looking at what can only be described as the planet’s worst environmental nightmare. Humanity is already decades behind in tackling our carbon impact – while we’re now taking some action on pollutants like fossil fuels, change simply isn’t happening fast enough. The last thing we need is another vastly consumptive industry adding to our substantial footprint.

There may be one slight silver lining, though, in the fact that there is a finite number of bitcoins. Nakamoto capped the total supply at 21 million coins, of which 83 percent have already been issued, meaning around 3.4 million are left to be mined. Once all of those have been released, there’s a chance the industry may simply cease to exist. However, that’s little relief when you consider the damage bitcoin will cause to the environment between now and then. As de Vries concluded: “[The industry] could have a natural end, but then we’ve really wasted all this energy.”

Technological developments are making retail cashiers obsolete

In recent years, the retail industry has seen a dramatic decrease in the number of cashiers present in stores, with self-service checkouts filling their positions instead. Many of us bemoan the automated machines; the thought of locating oft-hidden barcodes, scanning at exactly the right angle and packing all at the same time just seems like too much. Then there are the frequent issues that arise, which see customers patiently waiting for a human employee to come to their assistance, type in a secret code or attest to the fact that they are old enough to purchase cough medicine.

A large portion of the workforce – not just in the US and China, but across the globe – could become redundant as a result of our incessant pursuit of convenience

The idea of these DIY checkouts is to improve efficiency and convenience for customers, but how effective are they? They may work when you’ve only got a handful of items, but anything more seems to send the whole system into meltdown. Despite these frustrations, self-service checkouts continue to make their way into shops – and as the technology develops, these teething problems may soon be a thing of the past. It seems that we may be set to say goodbye to supermarket cashiers forever.

Grab and go
In December 2016, e-commerce behemoth Amazon launched Amazon Go, its first cashier-less grocery store, selling ready meals prepared onsite, basic groceries, drinks and snacks. To date, it has 10 stores in Seattle, Chicago and San Francisco, with an 11th on the way; rumours of a London branch have been circulating since last year.

Using sensor fusion, deep learning, computer vision and hundreds of cameras, Amazon’s ‘just walk out’ technology automatically identifies when items are taken off shelves (and returned, for that matter). It keeps track of everything in a virtual basket and when customers are done, they simply walk out. An e-receipt of the charge to their Amazon account follows soon after. While many supermarkets are dipping their toes into the self-checkout game, Amazon Go stores signal a landmark moment for the retail landscape.
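Amazon has not published the implementation details of ‘just walk out’, but the virtual-basket bookkeeping described above can be sketched as a simple tally that grows when a ‘take’ event is detected and shrinks on a ‘return’. The class and method names below are hypothetical, purely for illustration:

```python
from collections import Counter


class VirtualBasket:
    """Illustrative tally of items a shopper is detected holding.

    In a real cashier-less store, take/return events would come from
    computer-vision and shelf-sensor pipelines; here they are plain
    method calls.
    """

    def __init__(self):
        self._items = Counter()

    def take(self, sku: str) -> None:
        # Sensors detected an item leaving the shelf
        self._items[sku] += 1

    def put_back(self, sku: str) -> None:
        # Item returned to the shelf: remove it from the basket
        if self._items[sku] > 0:
            self._items[sku] -= 1

    def checkout(self, prices: dict) -> float:
        # On walking out, the basket is priced and charged to the account
        return sum(prices[sku] * n for sku, n in self._items.items())


basket = VirtualBasket()
basket.take("sandwich")
basket.take("cola")
basket.put_back("cola")
print(basket.checkout({"sandwich": 3.50, "cola": 1.20}))  # 3.5
```

The hard engineering problem, of course, is not the tally but reliably generating those take/return events from camera footage.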

In China, though the technology is less advanced than Amazon’s, the number of cashier-less shops is in the hundreds and continues to soar thanks to dozens of start-ups in this burgeoning market. One prominent player is BingoBox, which features radio-frequency identification (RFID) tags on all its products. To check out, shoppers scan their items using a typical self-checkout machine and pay via multi-purpose app WeChat.

China’s biggest players – Alibaba, JD.com and Tencent – are also jumping on the bandwagon. With more than 20 stores already in operation, e-commerce giant JD.com has plans to roll out hundreds more using increasingly sophisticated technology, such as facial recognition. In August 2018, meanwhile, JD.com opened its first unmanned store in Indonesia, signalling its plans for regional expansion.

They took our jobs
The rise of the self-checkout will inevitably result in job losses. In the US, retail jobs are the most common type of employment, with women holding more than 70 percent of the positions. According to a recent report by Cornerstone Capital Group, in the coming years we can expect to see a loss of between six million and 7.5 million retail jobs in the US alone as a result of automation. A large portion of the workforce – not just in the US and China, but across the globe – could become redundant as a result of our incessant pursuit of convenience. A number of companies, including Amazon, have guaranteed that technological developments would not see cashiers replaced with machines, but let’s not forget that banks made – and broke – similar promises when they first introduced ATMs many years ago.

There’s also something to be said for the human contact we receive when grocery shopping. For some, it may be the only social interaction they have for days and, as such, is a welcome breath of fresh air. In today’s fast-paced, technologically driven world, individuals are becoming increasingly isolated. Let’s not march even more rapidly towards that future – there is a place for convenience, but not at the cost of millions of jobs and one of the last bastions of social interaction.

The longevity industry comes of age

Legends of a miracle cure for old age have been told for millennia, but the concept flourished around the 16th century after rumours spread that Spanish explorer Juan Ponce de León was on a quest to find a fountain of youth, rumoured to be located in Florida. To this day, tourists flock to a stone well in St Augustine in the hope that its sulphur-smelling water will smooth the wrinkles on their faces.

For now, death remains a certainty, but it is true that we are living longer than ever before. Life expectancies have been trending upwards since the late 19th century and, according to the United Nations, the global population of those aged 60 and older more than doubled from 1980 to 2017, reaching 962 million. By the end of the century, this group is poised to hit 3.1 billion. With birth rates tumbling, the global population is inevitably becoming older. By 2100, a quarter of the Earth’s residents will be aged 60 or above.

Having an older population is not necessarily a bad thing. Ageing only becomes a problem when a person’s quality of life, otherwise known as their health span or healthy life years, stagnates. “Sadly, the [healthy life years] index does not grow as fast as life expectancy does: people live longer, but they are also sick for longer,” Elena Milova, a board member and outreach officer at the longevity non-profit Life Extension Advocacy Foundation (LEAF), told The New Economy.

Ageing gracefully
Current public health approaches to ageing are ineffective, according to the World Health Organisation’s (WHO) 2015 World Report on Ageing and Health. The Geneva-based agency explained that, even in high-income countries, current health systems are not prepared to meet the needs of older populations. Long-term care models are “inadequate and unsustainable”, and physical and social environments are filled with barriers that stop the elderly from participating.

Many companies will thrive in ‘hyperageing’ societies by targeting older populations with new products and services that extend healthy life years

These issues must be addressed by supporting healthy ageing, combating ageism and enabling autonomy in the elderly, the WHO concluded. Some ways of doing this, detailed in the group’s 2017 action plan on ageing, include investing in health systems and infrastructure, creating age-friendly housing and establishing health services that focus on meeting the multidimensional needs of the elderly.

Assistive technology, or ‘agetech’, is one way that older people can gain greater independence. The elderly are able to live self-sufficiently for longer with smart devices that automatically dispense medication, technologies that monitor cognitive skills and networking apps that improve social connectivity.

Agetech can also include financial products and services optimised for older users. “Today, most fintech banks are aimed at younger people using smartphones,” Dmitry Kaminskiy told The New Economy. Kaminskiy is working to build the Global Longevity Consortium, a group of companies that provide resources for the longevity industry.

In the next two or three years, Kaminskiy expects big financial corporations to show agetech the same level of excitement they had for fintech in the mid-2010s. “A lot of venture investors, angel investors and big financial institutions will recognise the opportunity of the market, and they will start to invest in it,” he said.

Ageing Analytics Agency, which Kaminskiy co-founded as part of his consortium, published a report in 2019 on the proactive steps Singapore has taken to address the problems caused by its ageing population. In 2015, the city-state launched an action plan for successful ageing. This included preventative and active ageing programmes that begin at age 40, such as the National Silver Academy – a network of organisations that offer seniors educational programmes – and the Silver Generation Ambassadors, who make health services and government schemes more accessible through home visits.

Global population aged 60+: 382 million (1980); 962 million (2017); 3.1 billion (end of 21st century)

Japan is another interesting country for longevity industry players. Currently, it boasts the world’s longest life expectancy, with half the population aged over 50 and more than a quarter over 65. Japan also has the world’s highest proportion of centenarians.

But these credentials have come with a price: the country is facing a dementia crisis, with one in five people aged 65 or older expected to suffer from the memory-loss disease by 2025. Because of Japan’s unique position, a study by the consultancy firm McKinsey said the country’s response to the “unprecedented” economic and social challenge of ageing would become a roadmap for other governments contending with ageing populations, such as Spain and Italy.

Corporate leaders have a big role to play in these countries too. In fact, many companies will thrive in ‘hyperageing’ societies by targeting older populations with new products and services that extend healthy life years, McKinsey said. For example, the report described a social network in Japan where users aged 50 and older can chat with others about their hobbies and interests. The product, created by Kozocom, aims to tackle social isolation in older people, who often live alone, by helping them become part of a community.

A disease-free world
While one side of the longevity industry prepares for a world with significantly more elderly people, another aims to stop ageing from being a concern at all. Numerous scientific research firms are taking a preventative approach to the health implications of ageing by developing medical therapies that address the causes. Drugs that aim to extend the healthy period of people’s lives are already being trialled on humans.

“Science is increasingly showing that biological ageing consists of about a dozen root mechanisms; it is basically the accumulation of several types of damage that happen due to normal bodily functions, regardless of our lifestyle,” Milova said. “These root mechanisms of ageing can be addressed by medical means and, as a result, people will age more slowly with an extended period of good health, while the development of age-related diseases will be significantly postponed or even prevented.”

In the long run, focusing on the root causes of ageing will reduce the strain on nursing homes, healthcare providers, caregivers and assistive technologies in a world with a growing elderly population. This mission is already big business in Silicon Valley.

Facebook creator Mark Zuckerberg and his wife, paediatrician Priscilla Chan, announced in 2016 that they would donate $3bn through the Chan Zuckerberg Initiative to curing, preventing or managing all diseases by the end of the century. In their announcement, they said current spending on treatments is 50 times higher than investments into preventative medicine, which would stop people from getting sick in the first place.

Google’s parent company, Alphabet, has been working to unlock the secrets of ageing since 2013 with its biotech company, Calico. The firm has received a budget of $2.5bn to date, but the details of its operations are mostly a mystery. In 2018, the firm made a rare announcement about its research on naked mole rats, which defy the usual ageing process. In a statement on its findings, Calico said research found that naked mole rats’ risk of death does not increase with age, as is typical with other mammals.

In fact, the rodents showed little to no sign of ageing, and their risk of death did not increase even at 25 times past their reproductive maturity. “These findings reinforce our belief that naked mole rats are exceptional animals to study to further our understanding of the biological mechanisms of longevity,” said Rochelle Buffenstein, Senior Principal Investigator at Calico, in a statement.

Reaching for immortality
Anti-ageing medicine is not just a Silicon Valley trend; today, these once-fringe ideas are gaining traction all around the world. Even big banks are beginning to realise their disruptive force.

According to a recent report by investment bank Citi, the current anti-ageing market is worth about $200bn globally, but this only involves non-therapeutics such as cosmetic products and procedures. Recent breakthroughs in the science of ageing could produce commercial therapeutics within the next decade.

Citi claimed the most promising anti-ageing approach was one undertaken by biotech firms, including Unity Biotechnology, a US-based company that is backed by Amazon founder Jeff Bezos. These firms are developing a class of drugs called senolytics that are designed to eliminate senescent cells, or cells that have ceased to divide and replicate.

The removal of these cells in mice delayed age-related diseases, according to research funded by the US National Institutes of Health and published in Nature Medicine. In naturally ageing mice, the drugs also extended both their life and health span. One senolytic therapy made by Unity is designed to treat patients with osteoarthritis of the knee by essentially returning the knee tissue to a more youthful state. If successful, the therapy could be commercially available by 2023.

Steve Hill, Board Member and Editor in Chief at LEAF, told The New Economy that it currently takes about 17 years for a new drug to be developed from start to finish. Treatments such as senolytics, which are making waves now, have been in the works for years. Hill said: “The defeat of age-related diseases is a long haul, not something that will suddenly occur overnight; it will likely happen in small steps, each of which brings us closer to the goal.”

Therapies that delay or even reverse certain diseases of ageing could hit the market within the next decade, Hill said, but a more comprehensive control of age-related diseases will likely take much longer. “However, as our knowledge grows and automation and AI become increasingly present in the research setting, we could see that progress happening faster than it currently is,” he added.

Another barrier to the development of new medicines is cost. Most funding for early-stage scientific ventures comes from government investment and philanthropy, according to LEAF. Profit-focused investors are often wary of these projects as returns can take decades to materialise, if at all. But what was unthinkable to corporate investors like insurance companies or pension funds just half a decade ago is not so unconventional anymore.

“There is a tremendous change in the general perception towards longevity and actually even an acceptance of the technical possibility of extending life, or at least the healthy period of life,” Kaminskiy said. He expects 2019 to be a turning point for many bigger investors.

Longevity investor and billionaire Jim Mellon said in a white paper published by British bank Barclays that “billions of dollars” would soon be flowing into the longevity sector as its growth continues. “There are very few companies available to the general public today, but there will be hundreds lining up over the next two or three years.”

The economics of ageing
As the population turns increasingly grey, the economy will change in a number of ways. The European Commission’s 2018 Ageing Report said the EU’s working-age population, defined as those aged 15 to 64, is shrinking “significantly” due to a combination of increased life expectancy, declining fertility and migration flow dynamics. As a result, the EU will go from having 3.3 working-age people for every person over 65 to just two by 2070.
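The support ratio the commission cites is simply the working-age population divided by the 65-plus population. A minimal sketch – the population figures below are hypothetical, chosen only to reproduce the headline 3.3-to-2 fall:

```python
def support_ratio(working_age: float, over_65: float) -> float:
    """People aged 15-64 per person aged 65 or over."""
    return working_age / over_65


# Hypothetical populations (millions), illustrating the EU projection
print(round(support_ratio(265, 80), 1))   # 3.3 (today)
print(round(support_ratio(230, 115), 1))  # 2.0 (projected for 2070)
```

Note that the ratio falls both because the working-age numerator shrinks and because the over-65 denominator grows.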

The commission also expects spending on healthcare and pharmaceuticals for age-related diseases to surge. It predicted spending on long-term care systems in the EU would rise by 73 percent to a total of 2.7 percent of GDP by 2070, from 1.6 percent of GDP in 2018. Meanwhile, spending on public health in the EU, which was 6.8 percent of GDP in 2016, could rise between 0.9 and 1.6 percentage points by 2070. As Milova points out, these issues create a double burden for healthcare and social welfare systems, which are dependent on taxpayer funding.

However, the advent of drugs that improve our healthy life years will have different implications for the global economy. In the bank’s white paper, Barclays’ Chief Economist for Europe, Antonio Garcia Pascual, said rising life expectancies – so long as they coincide with rising health spans – could be advantageous for the global economy. If people remain fit and healthy into their old age, they could continue to work for longer and boost long-term economic growth. Governments would also be collecting more taxes and thus have bigger budgets.

Elsewhere, the traditional economic dynamic will be turned on its head. Savings and investment plans will change as people work out how to finance their longer lives. This impact will extend to monetary policy, as Pascual explained in the white paper: “As a population ages, savings tend to rise and that brings down interest rates. That’s one reason why we are already seeing central banks struggling to lift interest rates; the long-term equilibrium rate is being dragged down by population dynamics.”

A handful of cities could benefit further by positioning themselves as hubs for longevity research. Aside from Singapore and Japan, the UK stands out as having expertise in a unique mix of three sectors that the longevity industry relies on: biotechnology, AI and finance. It is also home to an ageing population, with centenarians the fastest-growing age group in the country.

The British Government has already displayed an interest in the sector, providing a £98m ($130m) challenge fund for healthy ageing. Kaminskiy said accelerators could also be built within the next couple of years to kick-start the growth of advanced biomedicine, agetech and other aspects of the longevity industry. According to a report by Ageing Analytics Agency, there were 260 longevity-related companies operating in the UK as of 2018.

As far-fetched as the idea of curing all diseases sounds today, similar feats have already been done on a smaller scale with vaccines. To date, only smallpox has been declared completely eradicated by the WHO, though other diseases have been eliminated in certain areas. This work gives hope to the idea that, with more supportive environments, preventative medicine and anti-ageing therapies, humans could live longer and healthier lives. While the longevity industry still has some growing up to do, the way we age is already transforming.

Declining insect numbers will have a devastating effect on ecosystems

Insects often get something of a bad rap. Most are considered little more than pests and (colourful species aside) are lucky if they don’t end up flattened by a slipper or a rolled-up newspaper. Recent research, however, indicates that humans have been taking these unpopular critters for granted. A study published earlier this year in the journal Biological Conservation found that 40 percent of insect species are currently threatened with extinction. Although previous research had noted falling insect numbers, the rate of decline – scientists from the Chinese Academy of Agricultural Sciences and the universities of Sydney and Queensland believe global insect biomass is falling by 2.5 percent every year – has come as a shock.

Municipal planners should work on making cities more hospitable for their smaller inhabitants, allowing insects to live alongside humans

Insects play an important role that extends far beyond annoying picnickers. They are essential to the planet’s ecological health, with many other animals and plants relying on them. If insect numbers continue to collapse, the effects are likely to be devastating. Fewer insects may mean fewer pests, but a world with no insects at all would be one of crippling starvation and mass extinction.

On their last legs
Although headlines telling of an insect Armageddon may sound melodramatic, the situation facing the world’s beetles, bees and butterflies is pretty dire. At the current rate of decline, half of the world’s insects will be gone in 50 years and they will be wiped out entirely in a century. The reasons behind such a nosedive require further study, but some potential causes are coming to the fore.
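The round figures above blur a distinction worth noting: a 2.5 percent annual fall taken as a fixed share of today’s biomass (a linear decline) behaves differently from one that compounds year on year. A minimal sketch of the compound reading – an illustration only, not the study’s own extrapolation:

```python
def biomass_remaining(annual_decline: float, years: int) -> float:
    """Fraction of today's insect biomass left if a constant annual
    percentage decline compounds year on year."""
    return (1 - annual_decline) ** years


print(round(biomass_remaining(0.025, 50) * 100))   # 28 (% left after 50 years)
print(round(biomass_remaining(0.025, 100) * 100))  # 8 (% left after a century)
```

Either way the trajectory is grim; the compound reading merely delays, rather than prevents, a collapse.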

Dr David Wagner, Professor of Ecology and Evolutionary Biology at the University of Connecticut, explained: “Agricultural intensification… and all it entails, relative to the more nature-friendly small family farms of the past, and urbanisation are the two greatest threats to insect diversity in Europe and North America. In the tropics, deforestation and climate change appear to be the primary threats. The threats to Mother Nature are many. I fear that her plight is one of death by a thousand cuts.”

Interestingly, although Wagner is right to highlight climate change as a contributing factor to declining insect populations, it comes relatively low down in the pecking order. While a recent study did note plummeting invertebrate numbers in a relatively undisturbed patch of the Puerto Rican rainforest, there is little conclusive evidence to suggest that climate change, so often the modern-day villain, is behind the worldwide decline.

“Evidence is limited as to how much impact, so far, climate change has had on insect numbers,” Professor Dave Goulson from the University of Sussex told The New Economy. “We’ve seen about a one-degree increase in temperature so far and there is evidence that, for example, some species like butterflies have shifted their ranges northwards in response to climate change. Bumblebees seem to be disappearing from the southern edges of their ranges as well. So it is beginning to have an effect. But probably in the grand scheme of things, climate change hasn’t had a major impact on insect levels.”

Habitat loss appears to be the main driver behind the recent drop in insect numbers: large commercial farms have replaced the flower-rich, semi-natural habitats that were once home to many insects. The increased use of chemicals may have resulted in larger agricultural yields, but those chemicals have proved deadly for the tiny creatures that once enjoyed their share of the annual harvest.

“Pesticides are designed to kill things and lots of the ones we use are insecticides,” Goulson said. “If you spray lots of insecticide, you shouldn’t be surprised if insects don’t thrive.”

Grub’s up
No one is yet suggesting that insects are in short supply – they remain the most abundant type of animal on Earth and outnumber humans in terms of biomass by a factor of 17. Yet, in spite of their overall volume, their small size and considerable mobility make them hugely difficult to monitor.

“It’s worth stressing that we have enormous knowledge gaps when it comes to insect numbers,” Goulson said. “So far, we’ve named about a million species of insects but most people agree that there are probably at least another four million that we’ve yet to name. That means that we have absolutely no data for about 80 percent of the world’s insect fauna. But even for the one million insects that we have named, in the vast majority of cases, we have no long-term monitoring scheme in place to say what’s happening to their numbers. All we do have is isolated studies from particular places.”

While these isolated studies are painting a bleak picture, it should be pointed out that not all insects are struggling. As insect numbers fall overall, more hardy or adaptable species are likely to thrive – unfortunately, though, these are likely to be the species most commonly classified as pests.

“No matter what humans do, nature produces winners and losers,” Wagner said. “If the world is converted to a massive anthroscape for our exploitation, with little regard for wild lands or wildlife, we will still have dogs, cats, rats, starlings, robins, cockroaches and dandelions. What will be lost will be tragic: millions of species that have fought tooth and nail, alongside us, on their own struggle for existence, through millennia, only to be extinguished by the rapacious self-interests of mankind.”

Not so long ago, insects were being talked up as a potentially protein-rich food for livestock, or even for humans. Currently, more than 1,900 species of insect are deemed fit for human consumption, but recent fears over an insect population collapse may scupper the few insect farms that view this as a viable proposition. On the other hand, there may not be much else to eat other than cockroaches and houseflies if most of the world’s pollinating insects die out.

Looking out for the little guy
If a bleak future is to be avoided, mankind needs to act fast to arrest the insect decline. Urbanisation is one of the main drivers behind the fall and there is unlikely to be an easy way of reversing this. At present, 55 percent of the world’s population lives in urban areas, with this figure set to hit 68 percent by 2050, according to UN projections. Trying to halt, let alone reverse, this trend is likely to prove futile. Instead, municipal planners should work on making cities more hospitable for their smaller inhabitants, allowing insects to live alongside humans.

“If we change the way we use parks, road verges, roundabouts and any place where there is spare land, planting wildflowers and reducing pesticide use, then perhaps cities could become giant insect nature reserves,” Goulson explained. “This would also have the advantage of helping people reconnect with nature because increasingly people spend all their time in cities. Some children grow up without ever seeing butterflies and perhaps we should try to change that.”

Agricultural businesses and national governments also need to take their fair share of responsibility. Financial incentives should be increased to encourage more organic farming, while providing access to an independent agronomic advisory service would help farmers make decisions about how many chemicals to use on their crops, without the interference of pesticide manufacturers.

Members of the public are also not without blame. Increased demand for organic food and local, seasonal produce would make it easier for growers to transition from industrial farming to smaller-scale, mixed horticultural operations. It’s true that saving insects from extinction will require major changes relying on both top-down efforts and buy-ins from consumers, but the alternative – a life devoid of the ecological vibrancy we enjoy today – is surely much worse.

The unexpected environmental drawbacks of concentrated solar power plants

Stretching across 3,500 acres of the Mojave Desert, the Ivanpah Solar Electric Generating System is the largest concentrated solar power (CSP) plant in the world. Costing $2.2bn and taking more than three years to build, the facility is the best-known project in the growing CSP market. Yet few people are aware that facilities like Ivanpah exist: while solar energy in general has become well established, concentrated solar power remains a little-known technology.

When most people talk of solar power, they are usually referring to photovoltaic cells that directly convert sunlight into electrical energy. By contrast, CSP works by using a large number of mirrors to reflect and concentrate sunlight onto a central receiver, which then converts it to heat. It comes with benefits and drawbacks when compared with photovoltaic solar, but for some parts of the world, CSP is a perfect fit.

Ivanpah’s current energy output of 392MW – enough to power 140,000 homes – may be impressive, but its real benefit could be as a trailblazer for other CSP plants

The Ivanpah plant, which formally opened in February 2014, has not been without its critics. In February this year, California’s San Bernardino County – home to the Ivanpah facility – passed a measure banning the further construction of large solar projects. Clearly, even green technology is not always welcomed.

A place in the sun
A bird’s-eye view of Ivanpah shows its sheer scale, but it reveals little about how the solar facility actually works. The plant produces energy in a similar way to how most of the world’s electricity is made: using steam-turned turbines. Across three distinct sites, more than 300,000 software-controlled mirrors follow the path of the Sun, reflecting its rays onto the summits of three 459ft-tall towers. At the top of each of these towers is a boiler that is rapidly heated by the concentrated sunlight, producing high-temperature steam that is piped back down to turbines on the ground.

Ivanpah Solar Electric Generating System in numbers: $2.2bn cost to build; more than three years to build; 392MW energy output

Ivanpah’s current energy output of 392MW – enough to power 140,000 homes – may be impressive, but its real benefit could be as a trailblazer for other CSP plants. One of the major problems facing solar power is how to generate energy when cloud cover (or nightfall) gets in the way. However, CSP plants can be paired with thermal storage – usually in the form of molten salt – to enable energy production to continue even after the Sun has gone down.
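The 140,000-homes figure implies an average demand of roughly 2.8kW per home, as a quick back-of-envelope check shows (an illustration, not a figure published by the plant’s operators):

```python
capacity_mw = 392   # Ivanpah's stated output
homes = 140_000     # homes it is said to power

# Convert megawatts to kilowatts and divide across the homes served
kw_per_home = capacity_mw * 1_000 / homes
print(f"Implied average demand: {kw_per_home:.1f}kW per home")  # 2.8kW
```

This is an average over time, which is exactly why intermittency matters: without storage, output falls to zero after dark.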

Though Ivanpah does not boast thermal storage, one of the facility’s operators, BrightSource Energy, has used information gleaned from the plant’s operations to include thermal energy storage in subsequent facilities, including the Huanghe Qinghai Delingha Solar Thermal Power Project in China.

In the firing line
While environmentalists normally endorse solar power projects, this hasn’t always been the case at Ivanpah. Although the land upon which the facility is built may look fairly desolate, the Mojave Desert boasts a surprisingly rich level of biodiversity. As well as milkweed and queen butterflies, the area around the Ivanpah site is home to a number of desert tortoises. With growing fears that the solar plant could be harming tortoise numbers, operators have spent $55m on mitigating the ecological damage.

Despite these efforts, environmental concerns persist. One particularly grim side effect of the plant’s construction is that birds, attracted to the insects gathering at the top of the towers, are incinerated as they pass through the beams of concentrated sunlight. According to estimates, this results in some 6,000 deaths every year.

Just as the Ivanpah Solar Electric Generating System represents a significant leap forward in the development of solar power, it also serves as an important case study on the unintended consequences that can arise from the deployment of green technology. As more CSP plants are built around the world, operators would do well to consider both the good and bad that they could be doing to the environment.

China plans to breed stink bug army to fight crop infestations

Scientists in China plan to unleash an army of stink bugs on the fall armyworms that are spreading across its provinces and destroying the country’s crops, according to a report published by the South China Morning Post on June 11.

Originally native to America, the crop-devouring caterpillars reached Africa through imported produce before spreading to China in early January

Armyworms are so named because they march across landscapes like an invading army. Feasting on a diet of many plant species, including sorghum, corn and sugarcane, they can infest hundreds of hectares of crops in a single night.

Originally native to America, the crop-devouring caterpillars reached Africa through imported produce before spreading to China in early January. On account of the devastation the insects have caused in recent years, the international agricultural community has become increasingly concerned about them.

The Centre for Agriculture and Biosciences International warned in 2017 that armyworms posed a major threat to food security and agricultural trade, after the pest devastated maize crops in Africa, destroying farmers’ livelihoods.

China’s stink bug army offers a promising solution. Stink bugs are the natural predators of armyworms and make for highly effective exterminators. During a field trial in the south-western province of Yunnan, scientists found that a mature stink bug can eliminate up to 41 fall armyworm larvae a day.

To combat the armyworm infestation, the Institute of Plant Protection at the Chinese Academy of Agricultural Sciences has created a stink bug factory, where it hopes to breed 10 million stink bugs a year.
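Taken together, the figures above put a theoretical ceiling on the programme’s reach – a back-of-envelope illustration only, since real predation rates vary with the bugs’ age, the temperature and the availability of prey:

```python
bugs_per_year = 10_000_000    # planned annual output of the stink bug factory
larvae_per_bug_per_day = 41   # peak predation rate from the Yunnan field trial

# Theoretical daily ceiling if every bug fed at the trial's peak rate
larvae_per_day = bugs_per_year * larvae_per_bug_per_day
print(f"Up to {larvae_per_day:,} armyworm larvae per day")  # 410,000,000
```

Even a small fraction of that ceiling would represent meaningful biological control across the affected provinces.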

Since its arrival in China, the armyworm has affected crops in 18 provinces and has been found in at least 92,000 hectares of Chinese farmland. It’s predicted to reach China’s corn belt in the north-east of the country this month.

China’s Ministry of Agriculture and Rural Affairs warned that the pest “severely threatens the agriculture and grain production security of China”. A solution is urgently needed in order to stave off the widespread destruction of crops. If successful, the stink bugs could be harnessed in other countries to battle this growing threat to global crop supplies and farmers’ livelihoods.