Neo Lithium is mining success in South America’s lithium triangle

From construction and energy to manufacturing and technology, dozens of industries depend on the supply of commodities that are sourced from deep beneath the Earth’s surface, such as metals and minerals. The mining industry, therefore, is pivotal to the global economy, as it extracts and delivers the raw materials – for example, gold, copper, coal and iron – that are transformed into valuable products and services.

Mining can be a challenging exercise. Projects must be well researched and planned, otherwise they risk vast operational expenses that result in low returns. Preventing this situation from arising requires a thorough evaluation in which key factors of size, grade and quality are assessed to ensure the project is financially viable.

When it comes to mining projects, making informed decisions is the most important part of the process; that cannot be done without adequate data

These three factors are mainstays of any mining project, regardless of the commodity being extracted, and their assessment is classified as a concept-level study. Other factors such as location, access to infrastructure, access to power and corporate social responsibility must also be considered on a more individualised basis. These are known as feasibility studies.

Concept studies must be undertaken to ascertain whether a project is viable according to these three essential factors. Feasibility studies then incorporate additional detail and begin to consider costing. Companies and financial institutions will only decide whether to invest in a project once all preparatory work has been completed.

Evaluating viability
Neo Lithium’s 3Q Project has recently undergone concept-level studies and provides a useful model for the necessity of these evaluative parameters. The 3Q Project is located in the southern end of the lithium triangle that spans Argentina, Bolivia and Chile. The area is characterised by high-altitude salt flats with a high concentration of lithium. Argentina, together with neighbouring Chile, holds the majority of the world’s lithium brine resources.

Neo Lithium’s 3Q Project in numbers

7m

Tonnes of lithium carbonate equivalent

35 years

Predicted lifespan

$319m

Total investment

$1.14bn

Net present value

To prepare for the project, Neo Lithium completed a concept-level study and financial appraisal to produce a preliminary economic assessment, a document required by the Toronto Stock Exchange. In March 2019, the company also completed a pre-feasibility study, which demonstrated that the site's resources can be economically processed. As such, the site can now be classified as a reserve. Armed with this information, the company will continue to assess the project against the key criteria of size, grade and quality.

With regards to size, the 3Q site looks to have resources totalling the equivalent of more than seven million tonnes of lithium carbonate, making it the fifth-largest brine project worldwide. Its substantial size means the project will likely be in production for many years. At pre-feasibility level, it was predicted to have a lifespan of at least 35 years, which, for mining projects, is quite long. Understanding the size of any project is crucial, as it allows companies to use economies of scale. The larger the project, the greater the cost advantages that can be obtained.
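The link between a project's lifespan, its annual cash flow and headline figures like the $1.14bn net present value can be sketched with a standard discounted-cash-flow calculation. The annual cash flow and discount rate below are illustrative assumptions chosen to show the mechanics, not figures from Neo Lithium's actual financial model.

```python
# Hypothetical sketch of a net-present-value (NPV) calculation for a
# long-life mining project. Only the $319m investment and 35-year lifespan
# come from the article; the cash flow and discount rate are assumptions.

def npv(initial_investment, annual_cash_flow, lifespan_years, discount_rate):
    """Discount a constant annual cash flow back to present value."""
    discounted = sum(
        annual_cash_flow / (1 + discount_rate) ** year
        for year in range(1, lifespan_years + 1)
    )
    return discounted - initial_investment

# Assumed inputs: $319m capital cost, 35-year life, a constant $125m
# annual cash flow, discounted at 8 percent.
value = npv(319e6, 125e6, 35, 0.08)
print(f"NPV: ${value / 1e9:.2f}bn")  # ≈ $1.14bn under these assumptions
```

The sketch also shows why lifespan matters to valuation: cash flows in the distant future are discounted heavily, so a long-life, low-cost project sustains its value even if later years contribute less.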

In terms of grade, the project is extremely valuable: the higher-grade area of 3Q has the second-best grade of known brines in the world, while the full seven million tonnes have the sixth-highest grade. As the resource has proven to be so rich, the team of geologists and hydrogeologists working on the project has decided to start extraction in the area with the highest concentration of lithium. This makes good economic sense as the higher the grade, the smaller the ponds required to evaporate brine. According to the plan, after 10 years the brine’s grade will drop, at which point the company will need to add more ponds to the extraction process. The same will be required after 20 years, too. All things considered, this is the most cost-effective way to start the project and also emphasises the importance of evaluating a material’s grade at the concept stage.

Quality control
While both the size and grade evaluations of the project have brought positive results, it’s the quality metric that is particularly advantageous at 3Q. Quality of brine is an important factor in reducing operating costs. The more contaminants present in the brine, the higher the cost of production, as contaminants need to be separated from the product through processing. In the case of lithium brine, sulphates and magnesium are the two most common contaminants.

The brine extracted at 3Q contains the lowest volume of these impurities ever seen in the market, making it a highly valuable product. High-quality brine that contains few contaminants can be treated using solar evaporation ponds, thereby reducing energy consumption during processing and lowering costs. The use of renewable energy has the added bonus of shrinking the company’s carbon footprint. Low-quality materials with many impurities cannot be processed using solar energy, driving up processing costs and increasing the environmental impact. The better the materials’ quality, the lower the initial capital and ongoing operating costs, making the project more attractive for investors. Quality, therefore, is a crucial metric that will fundamentally determine whether a project can progress to the feasibility assessment stage.

Securing returns
Through evaluation of the 3Q site using these three metrics, Neo Lithium established that the project is certainly viable and is likely to be highly lucrative. It benefits from economies of scale, high concentrations of valuable materials and the lowest impurity levels in the industry. These key factors will ensure the site becomes not only a lithium producer, but also a resilient, low-cost producer that can withstand the ups and downs of the market. What's more, as a result of the low level of impurities and high grade of the site, the initial capital investment will not be very high, making it easier for the company and its shareholders to finance the project.

Had the company not undertaken these original concept-level studies, it would be in the dark as to the commercial viability of the site. Instead, it has been able to build on those initial factors with additional studies to further ensure the success of the project. Neo Lithium has now completed pilot ponds, a pilot sulphonation plant, a pilot carbonation plant and permanent camps and offices. To ensure the project reaches its full potential, the company has built roads, helped local communities with a number of initiatives, installed renewable sources of solar energy, completed environmental and social baseline studies, trained local residents, and hired the best professionals in the industry.

None of this progress would have been possible without the initial assessments. While they may seem like simple metrics, they are crucial in evaluating the viability of any mining project. These tests also help producers understand projects from a market perspective, allowing Neo Lithium to position its work to obtain optimal investment. As companies progress from concept-level to feasibility-level studies, they usually appreciate in share value and price. Waiting too long could mean the market ends up valuing the project like a futures-producing asset. Therefore, it's imperative to promote investment opportunities at the correct moment in the evaluation process.

When it comes to mining projects, making informed decisions is the most important part of the process; that cannot be done without adequate data. By accruing as much knowledge as possible, companies like Neo Lithium are able to pursue only the most worthwhile projects and ensure long-lasting, significant returns.

Medical research is conducted in developing countries to avoid ethics legislation

South Africa’s Northern Cape is one of the most ethnographically unique locations on Earth. It is home to the San people, a nomadic hunter-gatherer group that carries some of the oldest human DNA. While their unique genetic make-up could reveal much about human development, it has also made them a target for scrutiny by curious scientists and opened them up to exploitation.

Research organisations have conducted investigations on the San people without respect for their heritage or culture. “Our knowledge has been taken by clever people who come and tempt us with 10 rand or five rand,” San elder Petrus Vaalbooi explained in an interview with the TRUST project. This exploitation is part of a practice known as ‘ethics dumping’.

Until now, the San had no way to protect themselves from such mistreatment. But TRUST, a collaborative effort between the EU and the San people, has seen the development of a code of ethical guidelines that will work to eliminate exploitative practices in research across the globe.

Dumping ground
The term ‘ethics dumping’ was coined in 2013 by the European Commission (EC) to describe the practice of conducting research abroad, in settings with laxer regulations, that would be deemed unethical in a scientist’s home country. The concept is part of a wider story of unethical domestic research practices documented in the mid-20th century, such as Nazi physicians’ experimentation on minority groups, or the infamous Tuskegee trials, which saw 600 African American sharecroppers enrolled in a 40-year trial to observe the impact of syphilis if left untreated. None of the participants were informed that they had the disease, or offered treatment. By the time the trial came to an end in 1972, 128 of the original participants had died from syphilis or related complications.

Due to the nature of political and legislative progress across developed and emerging economies, ethical guidelines have not been adopted in a uniform manner worldwide

“Within Germany and the US, extremely vulnerable populations that couldn’t defend themselves were exploited for research,” explained Professor Doris Schroeder, Director of the Centre for Professional Ethics at the University of Central Lancashire. “These horrendous experiments were possible because of a divided domestic society and extreme racism.” What took place during the Second World War and at Tuskegee emphasised the dire need for ethical regulation of research trials, which was gradually implemented at a national level over the course of the 20th century.

Due to the nature of political and legislative progress across developed and emerging economies, however, ethical guidelines have not been adopted in a uniform manner worldwide. This has paved the way for ethics dumping as we know it today. “Ethics dumping requires strong ethics legislation and compliance mechanisms in one setting (high income) and the lack [thereof] in another setting (low or middle income),” explained Schroeder. “This legislative and compliance gap is then used to undertake research abroad that would be prohibited or severely restricted at home.” As such, ethics dumping should not be conflated with unethical domestic research practices, but understood as the act of travelling abroad to evade ethical guidelines.

Destination for exploitation
Over the past 50 or so years, many prominent pharmaceutical companies and prestigious research organisations have been known to engage in ethics dumping, particularly as part of clinical trials. One recent example involved three cervical cancer screening trials that took place in India between 1995 and 2015 in Mumbai, Osmanabad and Dindigul. The trials, which were partly funded by the US National Cancer Institute and the Bill and Melinda Gates Foundation, aimed to determine whether trained healthcare workers could conduct screenings using inexpensive materials, which would help reduce incidences of cervical cancer in poorer communities.

The Tuskegee trials in numbers

1932

Trials began

40 years

Duration

600

Unwitting participants

128

Participants died from syphilis or related complications

According to a study by Sandhya Srinivasan, Veena Johari and Amar Jesani, the trials were conducted on 374,000 women, 141,000 of whom were placed in a control group and given no screening, risking the disease going undetected if it developed. In that group, 254 women died from cervical cancer. Had this research taken place in the US, a control group with no access to the screening would not have been allowed, as it would have been classed as an ethical violation. However, because the trials took place abroad, where regulations are more lax, US funders were able to bypass the usual protections.

India in particular has become something of a hotspot for ethics dumping. Denny John, an evidence synthesis specialist at the Delhi-based Campbell Collaboration, explained: “A clinical trial comes at a huge cost, so bringing that down is a key consideration for a lot of research organisations.” By relocating to countries such as India, where costs are lower, companies can cut their overheads. John added: “India also has a large and varied population, so that’s also part of the reason why trials are being conducted in this part of the world.” The country became an even more attractive environment for medical research in 2005, when ethics laws were relaxed in a bid to strengthen India’s position as a key player on the global pharmaceutical stage.

However, the development of this advantageous environment for trials has inadvertently opened Indian people up to exploitation by unscrupulous researchers. In 2012, a BBC investigation found that US pharmaceutical firm Biogen Idec had enrolled people from the lowest caste of Indian society into trials for a heart-failure-prevention drug without seeking their consent. The women interviewed were from the Dalit caste, a group that has been ostracised, impoverished and poorly educated in Indian society, meaning they were unable to read the documents given to them or fully understand the process they were entering into. According to the BBC, one of the women, Chandrakala Bai, suffered heart abnormalities as a result of taking the trial drug; she died of a heart attack less than a month after being taken off the medication.

Survivor of the Tuskegee trials, Herman Shaw, speaks at the White House in 1997

Biogen Idec is not the only organisation to have been accused of exploiting Indian people in this way: AstraZeneca, GlaxoSmithKline, Johns Hopkins and Sanofi have all been implicated in Indian trials where consent was not sought from participants. John explained that much of the issue lies in a lack of local regulatory oversight. “India’s [national] ethics committee cannot actually monitor on the ground, and that is where it is needed,” he said. In terms of corporations, “the funding for studies is done by [large pharmaceutical companies], but the recruitment is done by local hospitals”, meaning there can be a mismatch between the protocols set and the actual trial carried out. This can also result in the exploitation of those from lower socioeconomic classes, as was discovered by the BBC’s investigation.

Ethics dumping of this nature flouts basic rules of consent, both within the field of medical research and as a human right. The inequitable power dynamic between research organisations and unwitting test subjects also makes it very difficult for the latter to claim any kind of recourse if they do suffer ill effects as a result of a trial. Many of those implicated in unethical Indian clinical research are not in the position to instruct lawyers given their lack of social mobility and low-income economic background. “In India, there’s no structured patient groups to fight for compensation,” added John, which means almost all avenues for recourse are also closed off. In this sense, trialling drugs on these people carries little risk of repercussions for large companies, the majority of which have an arsenal of lawyers and funds at their disposal to fend off legal challenges.

Money talks
While national regulations aren’t able to prevent companies from conducting trials in India and other low-income societies, there are international regulations designed to do just that. The first of these, the Nuremberg Code, was drawn up during the eponymous trials and published in 1947 to prevent a repeat of the atrocities committed by Nazi physicians. The document sets out initial principles of “absolutely essential” voluntary consent, the need to “avoid all unnecessary physical and mental suffering and injury” and the requirement to only undertake experiments that “yield fruitful results for the good of society”. However, many saw the code as a largely needless endeavour. Physician and Yale Law School professor Jay Katz notably called it “a good code for barbarians but an unnecessary code for ordinary physicians”.

The code’s successor, the Declaration of Helsinki, was more widely adopted when it was published by the World Medical Association in 1964. Its principles have since been codified into international documentation, such as the UN International Covenant on Civil and Political Rights and the World Health Organisation’s research ethics guidelines. Yet, these are not legally binding instruments under international law.

“I do believe that a legal instrument such as a code of conduct tailored to the problem of ethics dumping can reduce the practice significantly and hopefully eradicate it in the future,” Schroeder told The New Economy. “However, a code needs strong support, and this support is best given by the funder of research.” With this in mind, Schroeder worked alongside other renowned academics, lawyers and research experts on the TRUST task force to “catalyse a global collaborative effort to improve adherence to high ethical standards around the world”. The group developed a value framework, from which two codes of conduct were drawn up. One code “was adopted by the EC as a mandatory reference document for framework programmes and covers all research in resource-poor settings”, explained Schroeder.

This decision is particularly powerful given that the EC manages Horizon 2020, an €80bn ($90bn) programme that any EU-based scientist can apply to for financial assistance in conducting research. By making funding contingent on ethical practices from the outset, the EC is sending a powerful message that it will do everything in its power to ensure ethics dumping doesn’t take place under its jurisdiction. “It is much better if funders are proactive, not reactive,” Schroeder added.

It is worth noting that the practice is not always predicated on a conscious decision to exploit a group of people, but rather can be borne out of a lack of cultural understanding. “Because of ignorance, [ethics dumping] flourishes in an environment where researcher mobility is put above all else and the home culture is taken abroad without asking self-reflective questions,” explained Schroeder. In the best case, this ignorance leads to wasted resources as methods developed by researchers in high-income settings are proved to be unsuitable for local needs. “At the maximum harm end, lives can be lost, as in [the cervical cancer study in India],” Schroeder said.

To go about solving this, TRUST has developed a second code, in partnership with the San people, that researchers must adhere to when undertaking projects in the region. The code states that studies must respect the San’s values and way of life. Codes like these ensure local communities retain their autonomy, and research becomes an equitable, not exploitative, exchange.

Under the microscope
While the action taken by the TRUST project is certainly a step in the right direction, there is more to be done to eliminate ethics dumping, particularly at a corporate level. This is more difficult than targeting individual researchers or learning organisations, as businesses are often less reliant on external funding to conduct trials. As such, a two-pronged approach is needed: legally binding international ethical regulations must be combined with court challenges for past transgressions.

The inequitable power dynamic between research organisations and unwitting test subjects makes it very difficult for the latter to claim any kind of recourse

In January this year, for example, a US district judge said that Johns Hopkins, Bristol-Myers Squibb and the Rockefeller Foundation should face a $1bn lawsuit over their role in a US Government experiment in the 1940s that infected hundreds of Guatemalans with syphilis. The plaintiffs’ incredulous lawyer said: “This experiment began 72 years ago. It’s hard to believe.” But while the three companies may have put it to the back of their minds, the 444 Guatemalan victims and their relatives certainly haven’t. The case will ensure they receive the recognition they deserve. After all, justice has no expiry date.

Processes must also be put in place to ensure research participants are fully cognisant of the possible side effects during trials and are appropriately compensated if they do suffer any ill effects. Given the substantial financial gap between patients and large corporations, it’s vital that organisations are set up to allow individuals to seek compensation through collective legal action. This will go some way in closing the substantial power gap between the two parties.

Delivering financial recompense for past cases and making it impossible to gain funding for exploitative future research will go some way towards eliminating ethics dumping on a systematic level. Sadly, chances are there will still be individuals that engage in dubious ethical practices, such as scientist He Jiankui, who modified the DNA of unborn twin girls late last year. However, if such practices are denounced at an intergovernmental and corporate level, as well as within the international research community, those cases will become fewer and farther between.

Although not all cases of ethics dumping are born from a choice to mistreat members of another culture, the practice, whether consciously or not, legitimises the belief that some people are candidates for exploitation. It implicitly categorises those from lower-income societies as pawns in the race for progress, less important than scientific and economic goals. This discriminatory belief has no place in our modern, globalised society, and only serves to discredit vital work undertaken by the research community.

Horseshoe crabs are being exploited for their life-saving blood

There are many noteworthy things about horseshoe crabs. First, they are not actually crabs at all – a recent study confirmed a long-held scientific belief that they are in fact arachnids, more closely related to spiders and scorpions than any crustacean. More interesting still is the fact that the blood of the horseshoe crab is one of the most valuable liquids on Earth, changing hands for thousands of dollars per litre.

The crab’s blood – which, it should also be noted, is blue in colour – plays an essential role in the medical profession, enabling scientists to identify bacterial contamination. Every medical device that enters the human body, from implants to injections, is first checked for toxins using the blood of these strange, helmet-shaped sea creatures.

The fact that horseshoe crabs have inhabited the planet’s oceans for more than 450 million years has seen them termed ‘living fossils’. They have survived multiple mass extinction events and outlived the dinosaurs. But now, their time may be coming to an end. In 2016, the International Union for Conservation of Nature added the Atlantic horseshoe crab to its ‘vulnerable’ list in response to falling numbers.

Overharvesting is certainly not helping the plight of the horseshoe crab, nor the many other coastal species that rely on them as a food source. Perhaps just as worryingly, the species’ decline is prompting the biomedical industry to question whether it will be able to continue protecting individuals from contamination in the future.

Quite the catch
While three extant species of horseshoe crab are found in Asia, much of the biomedical demand is currently met by Limulus polyphemus, the Atlantic horseshoe crab found on the east coast of the US. Every year, thousands of these crabs are caught by the pharmaceutical industry and shipped to a laboratory. A drain is then inserted close to the heart and around 30 percent of the animal’s blood is syphoned off. It’s a process that (most of) the crabs survive, but a distressing one nonetheless. As a result, the Atlantic States Marine Fisheries Commission (ASMFC) closely monitors their collection.

Horseshoe crabs have survived multiple mass extinction events and outlived the dinosaurs, but now their time may be coming to an end

“From 2004 to 2017 along the US Atlantic coast, an average of approximately 417,700 crabs were harvested and bled annually by the biomedical industry,” ASMFC staff told The New Economy. “A few crabs do die at the facility and are reported to the ASMFC. After crabs are bled, they are returned alive to the water and the ASMFC applies a 15 percent mortality rate to those and adds them to those that died during the collection and time at the facility. Therefore, the ASMFC believes that on average, from 2004 [to] 2017, approximately 61,500 horseshoe crabs died annually from biomedical practices along the Atlantic coast of the US.”
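The two figures the ASMFC reports can be checked against each other: dividing the estimated annual deaths by the average annual harvest gives the overall mortality rate the commission's methodology implies. Only the 417,700 and 61,500 figures below come from the quote; the calculation itself is just a sanity check.

```python
# Quick check of the ASMFC figures quoted above: the commission applies a
# 15 percent mortality rate to bled-and-released crabs and adds deaths at
# the facility. The two reported numbers imply an overall mortality rate of
# just under 15 percent of the annual biomedical harvest.

harvested_per_year = 417_700   # average annual harvest, 2004-17
deaths_per_year = 61_500       # ASMFC's estimated annual deaths

overall_rate = deaths_per_year / harvested_per_year
print(f"Implied overall mortality: {overall_rate:.1%}")  # ≈ 14.7%
```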

For a long time, the loss of a few thousand horseshoe crabs was viewed as a necessary price to pay for the huge medical benefits derived from their capture. Blood cells taken from horseshoe crabs are used to make Limulus amebocyte lysate (LAL), a substance that produces an instantaneous and clearly visible reaction to contaminants. Without LAL, anyone receiving an injection, medical implant or intravenous drip would be at risk of a potentially lethal bacterial infection.

Shell shocked
Concerns over the long-term health of horseshoe crab populations off the US East Coast have caused some to question whether their current use should be reconsidered. Although mortality rates remain low, there is growing evidence that even when the crabs survive, the blood harvesting process has detrimental effects.

“There are limited studies that address the sub-lethal effects of biomedical bleeding,” ASMFC staff noted, citing research by Rebecca Anderson and Meghan Owings. “These studies both had relatively low sample sizes and were not conducted at or with biomedical facilities, but the studies indicated that bleeding may affect activity levels, immune function and mating frequency. Preliminary data collected from horseshoe crabs that have been bled, tagged, released and recaptured at a later time indicate that survival on a long-term basis is not very different from unbled crabs, but analysis of this data is ongoing.”

Horseshoe crabs in numbers

450m

Years of existence

2016

Declared a vulnerable species

417,700

Crabs harvested from the US Atlantic coast annually

61,500

Crabs that die annually as a result

A 2014 study by researchers at the University of New Hampshire and Plymouth State University found that the bleeding process results in lower activity levels and a decreased expression of tidal rhythms. While it is true that further study is required, the data available does suggest the horseshoe crabs that survive bleeding are left disorientated and weak, potentially affecting the ability of female crabs to spawn. Further research has found that the mortality rate of horseshoe crabs post-bleeding is only eight percent for males, but as high as 30 percent for females.

Even if it is difficult to conjure up empathy for a creature that looks like something from the Alien film series, reasons remain to care about the plight of the horseshoe crab. The species is an important source of food for many other coastal animals and its falling numbers could have a far-reaching impact on other populations. The number of red knot birds, for example, has plummeted across the East Coast in tandem with horseshoe crab numbers.

“Horseshoe crabs play an ecological role throughout their range as part of the marine food web and as a substrate for barnacles and limpets,” explained ASMFC staff. “They are consumed by other crabs, fish, conchs, sharks and sea turtles, including the endangered loggerhead, in different stages of their life cycle, and they consume many species as well.”

Climate change, habitat destruction and commercial fisheries are all putting pressure on horseshoe crab numbers, in addition to demand from the biomedical industry. While it is true that the LAL test is an essential asset to the medical community, this fact should not serve as a reason to ignore the effects that crab harvesting is having on the species and the wider coastal environment.

Put to the test
The biomedical industry is facing a dilemma. Although the three Asian species of horseshoe crabs could be used for blood, this simply pushes population pressures elsewhere, while still raising the same ethical questions. A synthetic alternative to the LAL test offers hope, but progress on this front has been slow. Dr Ling Ding and Dr Bow Ho of the National University of Singapore developed recombinant Factor C (rFC) to replace the LAL test back in 1997. Since then, despite numerous studies showing that rFC is at least as effective as LAL, if not more so, adoption has been limited.

However, a recent patent protection expiry regarding rFC offers hope that more commercial pharmaceutical companies will begin manufacturing it. As is so often the case, an economic incentive may provide the horseshoe crab’s most likely route to salvation. With around 70 million LAL tests performed every year, a man-made alternative offers a potentially lucrative opportunity for pharmaceutical manufacturers.

Still, a synthetic LAL substitute may not be enough to save the species. In addition, more emphasis must be placed on cleaning up coastal ecosystems and enforcing tighter restrictions on the animal’s use as bait. The close relationship between biomedical firms and regulatory bodies also creates a concerning conflict of interest that should be investigated further.

The creation of protected areas has resulted in local horseshoe crab populations recovering somewhat in recent years, and this approach should be rolled out on a much larger scale. Over the past three decades or so that the LAL test has been in use, horseshoe crabs have saved the lives of countless humans. It’s about time we returned the favour.

World View’s surveillance balloon completes longest mission to date

On June 5, Arizona-based company World View announced that its Stratollite balloon had completed a 16-day test flight, its longest successful mission to date. The Stratollite is a remotely operated, navigable, high-altitude balloon that can sail through the stratosphere for long periods of time, monitoring the Earth below. It offers an alternative to expensive satellite launches and high-cost drone flights.

The US military is planning on using the Stratollite in the fight against human and drug trafficking and to combat maritime piracy

World View believes the Stratollite will revolutionise planetary observation. In addition to being more cost-effective than satellites, the Stratollite offers higher image quality and greater endurance over any given area. This gives it a myriad of applications, including monitoring crops in real time, aiding disaster response and providing military surveillance.

According to World View, the US military is planning on using the Stratollite in the fight against human and drug trafficking and to combat maritime piracy.

The Stratollite itself is a pyramid-shaped structure that can be fitted with payloads of up to 4,500kg, which can include cameras, environmental sensors and communications equipment. To gain peak altitude, the Stratollite uses a primary lift balloon. Then, once in the stratosphere, its secondary balloons help it rise and fall.

Until its flight on June 5, the Stratollite had only remained aloft for a maximum of five days. This latest test flight is therefore a significant milestone for the company. By the end of 2019, World View hopes to have completed 30- and 60-day missions, with commercial services up and running in 2020.

“This is a great accomplishment for our team and a key step on the path towards productising the Stratollite and the unique data sets it will provide for our customers,” Ryan Hartman, World View President and CEO, said in the company’s press release.

Given the low cost of the Stratollite relative to drones and satellites, the technology is raising concerns among legal experts about the ramifications for privacy worldwide, as it makes surveillance capabilities much more affordable. Nonetheless, the Stratollite has wide-reaching benefits in disaster relief, military communications and weather forecasting, and could render expensive satellite launches obsolete.

How Wimbledon scales up instantly with IBM Cloud

The Championships at Wimbledon is the oldest tennis tournament in the world. While its brand remains traditional, the tournament is also known as an early adopter of technological innovations that connect the sport with media and fans. Bill Jinks and Karen Dewar discuss how the Championships use IBM’s Hybrid Cloud services, AI capabilities and security products.

Learn more about IBM’s secure cloud solutions.

Karen Dewar: Bill Jinks is the IT Director at Wimbledon. So, Bill: Wimbledon is this amazing global sporting event and brand, but I’ve also heard it referred to as a data-driven media business. How do these two things co-exist?

Bill Jinks: Yeah, so: we’ve recognised that we can’t just be a sporting event. We are on a global stage, and it’s really important for us to act like that. We need to own our own content, own our audience. Own the data that we drive. So more and more we have to behave like a media organisation in those ways. And they co-exist nicely. The sporting event’s really important to us, but by those things we can actually get more value out of it.

Karen Dewar: Wimbledon has been using IBM’s hybrid cloud capabilities for some years now; can you tell me why that’s important from a business perspective, and how it’s keeping you competitive?

Bill Jinks: Yeah, obviously year-round we’re actually quite a small business. But then for the Championships we have to…

Karen Dewar: There’s a massive explosion.

Bill Jinks: It’s a massive explosion! We have to scale for millions of people who are going to watch the event on the digital platforms. And cloud allows us to scale up for that audience. It also provides the security, the availability, and gives us the confidence that it won’t go wrong during the two weeks of the year when all eyes are on Wimbledon.

Karen Dewar: So, Wimbledon is this fabulous traditional brand, but you also have a really good reputation in terms of being highly innovative. Can you talk a little bit about how you’re using AI in terms of connecting with your fans on a very, sort of, real and personal sense?

Bill Jinks: Increasingly fans want to engage in a conversational style on their social platforms. So we’ve used IBM’s AI within Facebook Messenger to create a chat interface, which allows fans to follow their favourite players, to get match information. And then they can ask further questions to get more information and insight about the Championships.

Karen Dewar: So for such a prestigious event, security must be a huge focus. So can you tell us how Watson Cyber Security has helped you in this regard?

Bill Jinks: Information security is a significant concern. And we’ve been very lucky that IBM’s security products have helped protect wimbledon.com for many years. But the threat landscape and the exploits that are coming along are ever increasing. And the amount of information you’re getting from your security infrastructure is becoming almost more than your security teams can deal with.

So I like to see Watson Cyber Security as an additional member of our team. It’s analysing all of that data, and it’s providing the security team with the insight they need to help protect wimbledon.com.

Karen Dewar: Bill, thank you very much for your time today, really appreciate it.

Bill Jinks: Thank you.

How IBM Cloud helps Royal Mail Group deliver faster, efficient services

The UK’s Royal Mail Group is one of the oldest mail carriers in the world – pre-dated only by the mail services of Portugal and the Roman Empire. The modern Royal Mail, explains Martin Moore, has been innovating with cloud for some time – exposing its APIs in a controlled and secure way, and helping the business understand its customers’ needs. Martin and Karen Dewar discuss the competitive advantages that IBM Cloud brings to Royal Mail Group.

Learn how to move your business applications to the IBM Cloud.

Karen Dewar: Martin Moore is Platforms Delivery Director for Royal Mail Group; one of the oldest mail carriers in the world. What some people might not know is that certainly from an IBM perspective, we consider you as really an innovator in your use of cloud technologies to drive competitive advantage into your business. So, tell us a little bit more about your use of cloud-based technologies.

Martin Moore: So at Royal Mail we’ve been innovating with cloud for quite a while now. We started off with the IBM private cloud; and that’s where we implemented our API Connect platform. You know, the key driver for that was to expose our APIs in a secure way out to our customers. And that went tremendously well. It took us about three months from inception through to the actual final delivery of the project.

That now exposes our APIs in a much more easily controlled way. We get analytics, we get data, from all those interactions. Even down to which customers are making what sort of enquiries. So we can really start to understand our customers’ needs.

And due to the success of that, we’ve actually moved from the public cloud now to a dedicated IBM Cloud platform. And that’s really enabled us to ensure that we’ve got a highly scalable solution as our APIs really take off. And – yeah – it’s a really good foundation for the future.

Karen Dewar: Brilliant! How are you seeing the cloud bringing you some competitive advantage? Because I guess the speed is one of the key aspects of that.

Martin Moore: The speed is the key thing. That’s why there’s a cloud first strategy now. And I think we need to innovate more, we need to fail fast, as other people do. We need to just get out there and just try things. So, we’ve changed our ways of working.

And the benefit of that is, we’re now able to on-board our customers in a handful of hours or days, instead of what used to take weeks or months. And the way that we on-board them is now far more secure and modern; so our customers are happier, we’re happier, and – you know, it just works really, really well.

Karen Dewar: Fabulous, and: what does the future hold for Royal Mail Group? What do you see as the sort of, challenges, and maybe advantages, of adopting a multi-cloud strategy?

Martin Moore: For me, the key thing is about laying down the foundations of how you govern and manage your multi-cloud environment from day one. You need to consciously decide what processing powers you need to run where. Different clouds will give you different advantages in different places.

But coupled with that, we need to ensure that our traditional, sort of, hub-and-spoke integration method is modernised. And we need to have more of a federated integration service. And be cloud agnostic.

Karen Dewar: Thanks Martin.

Martin Moore: Okay! Thanks Karen.

Karantis360 and IBM Cloud: Helping people in care live at home for longer

Karantis360 is an innovative health tech business using smart technologies to transform the care industry. Using IBM Cloud, Karantis360 combines the Internet of Things (IoT), sensors, artificial intelligence and machine learning to help people in care live at home for longer, while giving their families greater peace of mind. Helen Dempster and Karen Dewar discuss how the solution works, why the business moved to the IBM Cloud platform, and Karantis360’s hopes for the future.

Learn more about IBM’s data and AI solutions.

Karen Dewar: Helen Dempster is Chief Visionary Officer at Karantis360; so Helen I understand you’ve developed this solution that is disrupting and transforming the care industry through the use of smart technologies. Things like internet of things, sensors, AI, and also machine learning. So can you tell us a little bit more about your solution?

Helen Dempster: What we’ve done is, we’ve taken – as you say – IoT and AI, and created a solution that helps people stay in their own homes for longer.

And AI is used to create patterns. If there is an issue that deviates from the learned behaviour pattern, it will push out an alert to the – either care provider or the family member – who’s overseeing the care. That will then give the individual overseeing the care the time to be able to manage that properly. And it’s instantaneous.
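The learn-a-pattern, alert-on-deviation approach Dempster describes is a standard anomaly-detection pattern, and can be illustrated with a minimal sketch. The function names, the seven-day history and the three-standard-deviation threshold below are illustrative assumptions, not Karantis360’s actual implementation:

```python
from statistics import mean, stdev

def check_reading(history, reading, threshold=3.0):
    """Flag a sensor reading that deviates from the learned pattern.

    history: past readings for this sensor (the 'learned behaviour').
    reading: the new value to test.
    threshold: how many standard deviations count as an anomaly.
    """
    mu = mean(history)
    sigma = stdev(history) or 1e-9  # guard against zero variance
    z = abs(reading - mu) / sigma
    return z > threshold  # True -> push an alert to the carer or family member

# Illustrative example: minutes of kitchen activity recorded each morning
usual_mornings = [22, 25, 19, 24, 21, 23, 20]
check_reading(usual_mornings, 0)   # no activity at all -> True, raise an alert
check_reading(usual_mornings, 22)  # a normal morning -> False, no alert
```

A production system would learn richer patterns (time of day, sequences of room transitions) rather than a single statistic, but the principle – alert only when behaviour deviates from what has been learned – is the same.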

Karen Dewar: Wonderful, so: you’re a cloud-native company, but you weren’t born on the IBM Cloud, you moved to us. Tell us a little bit about that decision process.

Helen Dempster: So initially we were born on Microsoft Azure. We opted back in December 2017 to migrate across to IBM.

Security is a huge issue for us – especially given the industry we’re in. IBM’s a very well known name, a very well known brand, and one that people trust. So the security that IBM offer as a brand was amazing for us.

Scalability – and instant scalability, for us – is key. IBM offered a huge range of AI capabilities to put into the system. And reliability. And those were our key factors.

Karen Dewar: Fabulous. And finally, what does the future hold for your solution? I mean, what do you envisage from a technology and a functionality perspective, going forward?

Helen Dempster: So for us, we’d like to see this go global. To utilise the data centres that IBM has around the world.

But the business is one of care, and care is key to our hearts. And giving transparency to families – whether they’re here, there, wherever they may be – is one of the key components to this.

So the technology is based on a mobile platform. That will give family members key sight as to what’s going on inside the family home. And equally, using IBM as a brand name to propel us is one that we could only have dreamed of in December 2017.

Karen Dewar: Brilliant. Helen, thank you so much for your time today.

Helen Dempster: Thank you.

TOMRA offers a circular solution to the world’s plastic problem

In recent years, there has been a huge shift in attitudes towards waste and recycling, with new regulations and targets being set by policymakers and governments alike. At the heart of these discussions is the need to create a circular economy – one that ensures countries handle resources in a sustainable and effective way. This is particularly true of plastic, which, despite being the workhorse of the global economy, is often produced and disposed of unsustainably.

The sheer amount of plastic used around the world each year is staggering: since plastic production entered the mainstream in the 1950s, more than 8.3 billion tonnes have been produced, 6.3 billion tonnes of which is now regarded as plastic waste. According to the Ellen MacArthur Foundation, only 14 percent of plastic is recycled globally. By comparison, the global recycling rate for paper is 58 percent, while 70 to 90 percent of iron and steel is reprocessed. Clearly, plastic is by far the most problematic material in circulation; if the current trend continues, there will be 12 billion tonnes of plastic waste in our landfills and ecosystems by 2050.

A load of rubbish
The past 40 years have seen our unsustainable linear economic model – take, make, dispose – accelerate at an alarming rate, with approximately one million plastic bottles now bought every minute. To put this into context, more than 800,000 of these will be incinerated, disposed of in landfills or lost into the natural environment. In other words, plastic packaging has become a key driver of our waste crisis.

Progressive schemes should be set up to allow consumers, policymakers and brand owners to adopt the mindset that plastic is not disposable, but is a valuable resource

Today, the battle against plastic waste is taking place on a global scale, with new regulations demanding that governments and policymakers reassess their approach to handling refuse. Recent import bans in China, Malaysia and Thailand, for example, have left waste-exporting countries sitting on a lot of plastic. As a result, these states are now being forced to look at their own recycling infrastructures in order to find sustainable solutions to their domestic waste management problems.

By adopting a circular model, it will become possible for such countries to collect and recycle plastics on a large scale and to a high standard, creating a closed loop that has positive environmental impacts, such as reducing carbon emissions and limiting the production of virgin materials. Fortunately, policymakers, industry leaders and technology providers are beginning to wake up to this fact.

Finding value in waste
There isn’t just one simple solution to creating a truly circular economy. Instead, a number of factors need to work in harmony to ensure the idea of reducing plastic waste is promoted effectively and efficiently. To achieve this, consumers, brand owners and manufacturers must all play their part.

At TOMRA, we believe the implementation of a deposit return scheme (DRS) is an effective way of encouraging consumers to recycle their plastic drinks containers and keep materials within the closed loop. These schemes see consumers pay a fixed deposit that is refunded when a bottle has been returned for recycling. As well as motivating consumers to take the plastic container back to be recycled, the financial incentive helps to communicate the message that plastic has value and should be treated as a resource, rather than a disposable piece of waste.

Global recycling rates: iron and steel – 90%; paper – 58%; plastic – 14%

Source: The Ellen MacArthur Foundation

What’s more, a DRS makes it easier to collect high-quality plastics through a reverse vending machine, which separates the contents to reduce the risk of contamination – an important consideration when maintaining food-grade standards. Many industries have been working hard to develop this holistic process in a bid to remove plastic from mixed-waste streams, reduce our reliance on downcycling and replace virgin plastics. Globally, around 40 markets have adopted DRS to date, experiencing return rates of up to 98 percent.

Tackling plastic refuse at the waste and recycling-plant stage is also a key step to accomplishing a circular economy. Thanks to a number of technological developments, sorting machines within waste/recycling plants can detect and recover plastics that have been incorrectly disposed of. This ensures more plastic is kept in the circular economy and out of the environment. Sensor-based sorting machines, for example, present an efficient, economically viable and reliable way of recovering a valuable resource and allowing it to be reused for other purposes.

The range of everyday plastic products that can become mixed up with general waste is huge, so being able to detect, recover and recycle them is a sustainable step towards achieving a closed loop. Brand owners must also start advocating the use of recycled goods and be willing to purchase recycled plastics. This, in turn, will showcase to manufacturers that there is a demand for recyclable goods.

Today, 11 leading brands, retailers and packaging companies are aiming to provide 100 percent reusable, recyclable or compostable packaging by 2025. If successful, this would mean a move away from downcycling plastics, as well as the creation of a high-quality material that meets all necessary regulations.

Closing the loop
The facts surrounding waste do not lie – we need to take action on a global scale to improve how we manage plastic. Progressive schemes should be set up to allow consumers, policymakers and brand owners to adopt the mindset that plastic is not disposable, but is instead a valuable resource. Through instilling and promoting this ideology, a circular economy for plastics can be developed, ensuring products and packaging are recycled in a sustainable, energy-efficient and cost-effective manner.

If we are to take meaningful strides towards achieving global sustainability targets, we must consider the benefits of a circular economy and increase the use of post-consumer plastics within product manufacturing and packaging. We’ve already witnessed improvements in the fast-moving consumer goods market, but debates around quality, cost and cleanliness rumble on. Clearly, more work needs to be done.

At TOMRA, our vision is to lead the resource revolution. We are dedicated to improving the recycling rates of plastic and many other materials, and have been developing solutions since 1972 to support the circular economy. This year, we will build on this commitment through our Circular Economy Unit, a team devoted to supporting the circular economy, protecting the environment and encouraging sustainable practices in relation to resource productivity. By working together, we can solve our planet’s waste crisis.

Microscopic ‘smart dust’ sensors are set to revolutionise a range of sectors

The future of computing is microscopic. For decades, technology has followed the same pattern: as speed and capability increase, cost and size shrink. It can be seen in the transformation from the mainframes of the 1960s that filled entire rooms to the bulky PCs that became ubiquitous in the 1990s to the paper-thin laptops, tablets and smartphones we use today.

“Based on that [trend], we should be arriving very soon at millimetre-scale computing systems,” said David Blaauw, a professor of electrical engineering and computer sciences at the University of Michigan, who has spent decades working to miniaturise computing systems.

In 1997, the researcher Kristofer Pister coined the term ‘smart dust’ to describe these millimetre-sized devices. Pister and his colleagues at the University of California, Berkeley, aimed to create a network of sensors made up of tiny wireless computer systems called ‘motes’. Acting as microscopic eyes, ears and arms, these motes could rove around the world collecting all kinds of data: visual, thermal, chemical and biological. In theory, smart dust could revolutionise industries by reaching places scientists never thought possible.

Internet of Things 2.0
Since Pister first presented the idea of smart dust, the concept of building a distributed network of wireless microelectromechanical systems (MEMS) has only gained steam. Although it may sound like a radical concept, smart dust is the natural next step for today’s Internet of Things (IoT). The IoT market has quickly established itself as an essential component of the modern world. Devices vary wildly, from consumer technologies like smart thermostats to products designed for the corporate world, such as small sensors that monitor oil wells to ensure optimised production.

It is unlikely that smart dust technology will trickle down into consumers’ homes before it has penetrated sectors such as industrial monitoring and medicine

In 2017 there were already 27 billion connected IoT devices, but by 2030, research firm IHS Markit expects the number to more than quadruple to 125 billion devices. Dust Networks, a company founded by Pister that is now owned by US semiconductor firm Analog Devices, has successfully deployed one-centimetre scale sensors in several commercial markets, including oil refineries, Pister told The New Economy: “That technology is still selling and being deployed today.”

These sensors have already transformed the way many sectors collect data, but with more advanced and even smaller technology, they could go much further. Not only could smart dust dive into oil wells; it could also sit on the wings of a butterfly to monitor migration patterns or be deployed inside the human body to oversee the recovery process of broken bones or damaged organs.

A number of uses are already being considered: tiny motes of smart dust could be deployed across a farm’s crops to monitor the needs of the plants, from determining watering times to pest control. Elsewhere, smart dust could track bees to find out where they encounter various chemicals that threaten their populations.

In medicine, embedded defibrillators and pacemakers already carry out health-monitoring processes, but smart dust could take that to the next level. Researchers at the University of California, Berkeley, were the first to propose what they called ‘neural dust’ – millimetre-sized sensors that could be implanted in the body and used to stimulate nerves and muscles, as well as to monitor the activity of different organs. In 2016, they successfully built the first such sensor, measuring three millimetres long, one millimetre wide and one millimetre thick.

One day, Blaauw hopes millimetre-scale devices like neural dust could not only take measurements, but also take action. For instance, a project his team is working on involves implanting tiny sensors into tumours to monitor the effectiveness of treatments. “This is very advantageous over blindly treating the tumour,” Blaauw said. “But down the road, it would be great if those little sensors, those little systems, could also themselves provide treatment by emitting certain hormones or medications or other kinds of bio-simulations.”

The construction industry has previously demonstrated success with larger smart-dust-like sensors. According to Bill Ray, a research director at the consulting firm Gartner, when workers laid the concrete for the basement of One World Trade Centre in New York City, they threw sensors directly into the mix. Throughout construction, the sensors enabled workers to monitor what was going on inside a block of concrete, ensuring that it set properly. By the time the batteries in the sensors died, their job was done. Although these sensors were not millimetre-sized smart dust, Ray told The New Economy that the example gives the industry a look at just how varied the applications of smart dust could be.

In the hands of researchers, smart dust could open the door to innumerable groundbreaking discoveries. “There is actually a lot of information out there that we cannot access… because we’re not small enough,” Blaauw said. “With millimetre-scale systems, we can actually illuminate knowledge into regions of our environment that we just don’t have access to today.”

Weighing up the benefits
It is unlikely that smart dust technology will trickle down into consumers’ homes before it has penetrated sectors such as industrial monitoring and medicine, where motes could more easily be developed at larger scales. But that hasn’t stopped researchers from dreaming up creative uses for consumer-ready smart dust applications.

For instance, Pister envisions a wireless device that he could attach to his fingernails, allowing him to type or sketch anywhere, on the spot. “When we can get down to the one millimetre-scale devices commercially, there are all sorts of fun applications,” he said.

Engineering industry magazine IEEE Spectrum reported details of how smart dust could be embedded into packaged foods so consumers could test whether a product was still safe to eat by waving their smartphone over it. This idea, like many smart dust applications, is only realistic if the technology becomes cheap enough for motes to be disposed of after a single use. Cost, therefore, will be a critical factor holding smart dust back from widespread deployment. “Cost is always a barrier,” Blaauw said. “Definitely right now, these kinds of things, in general, are boutique. They are expensive.” The hope is that, after a chicken-and-egg dance, a company that can tolerate the high initial cost of the technology will drive production up, thus bringing down costs for others.

Number of connected IoT devices worldwide: 27bn in 2017, rising to a projected 125bn by 2030

But companies will only take the leap to a new technology if they can identify a strong return on investment, and currently silicon is far too expensive for use in disposable MEMS sensors. IEEE Spectrum calculated that to make one sensor at a cost of $0.01, silicon prices would have to drop by a fifth.

“With a new technology, this is always a tricky part: to identify a killer application that will help drive the final production to volumes where the cost can be low. Then that can springboard many other applications that can then take advantage of that previous investment,” Blaauw said.

Researchers are tackling the cost issue by creating sensors made out of different materials, including paper and plastic. For sensors designed to be put in the body, planted in the ground or used in food packaging, the potential waste caused by the disposal of hundreds or thousands of smart dust motes becomes another issue. To address both of these concerns, researchers at the University of Pennsylvania are working with biodegradable materials, such as those used in dissolvable surgical sutures.

Gathering dust
Before smart dust can radically change the way businesses around the world operate, researchers must first create microsystems that work. Over the past decade, significant progress has been made in shrinking the size of the computing system itself. Blaauw and his team at the University of Michigan have created the Michigan Micro Mote, the world’s smallest computer at just two millimetres wide. During its development, one of the biggest problems the group encountered was reducing the size of the battery.

While shrinking the size of microchips is relatively easy, battery size is dictated by power consumption, and current motes eat up too much power for microscopic batteries. For this reason, many researchers are focused on creating millimetre-scale systems that rely on very low power; when on standby, the Michigan Micro Mote runs on about a million times less power than the average smartphone. Electronics manufacturer Ambiq Micro, which Blaauw co-founded, claims to be the world leader in energy-efficient semiconductor design. The company makes ultra-low-power solutions for IoT devices, wearables and other technologies.
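The battery constraint is, at bottom, simple arithmetic: runtime is capacity divided by average current draw. A back-of-the-envelope sketch with illustrative figures (not the Michigan Micro Mote’s actual specifications) shows why millimetre-scale batteries force standby power down to nanoamp levels:

```python
def runtime_days(capacity_uah, avg_current_ua):
    """Battery life in days = capacity (microamp-hours) / average draw (microamps) / 24."""
    return capacity_uah / avg_current_ua / 24

# Illustrative assumption: a millimetre-scale thin-film cell of roughly 100 uAh.
cell_uah = 100

# At a modest 1 uA average draw, the cell is flat in a few days...
runtime_days(cell_uah, 1.0)   # ~4.2 days

# ...but cut the average draw to 10 nA and the same cell lasts over a year.
runtime_days(cell_uah, 0.01)  # ~417 days
```

Since capacity shrinks roughly with battery volume, the only lever left at millimetre scale is the denominator, which is why ultra-low-power standby modes dominate mote design.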

Alongside vast opportunities, smart dust brings sweeping new concerns about privacy and ethics

Researchers are also considering different ways to power smart dust. For instance, the Michigan Micro Mote was able to harvest energy from ambient light via a one-millimetre-by-one-millimetre solar panel. Energy harvesting allows sensors to grab power from a number of external sources, including ambient light, heat or motion. Through ambient backscatter, devices can also take energy from existing radio frequency signals or Wi-Fi. Matrix Industries, for example, focuses on capitalising on ambient heat. The company has already designed a watch that is powered by its wearer’s body heat, and it is also working on systems that capture energy from daily fluctuations in air temperatures.

Sensors used in the body, meanwhile, can be powered by ultrasound technology, which is optimal for hospital use. The researchers behind neural dust, which does not use any batteries, utilise ultrasound vibrations to both power their sensors and read out measurements. “Ultrasound is much more efficient when you are targeting devices that are on the millimetre scale or smaller and that are embedded deep in the body,” electrical engineering and computer sciences graduate Dongjin Seo said in a statement at the time of the announcement. “You can get a lot of power into it and a lot more efficient transfer of energy and communication when using ultrasound as opposed to electromagnetic waves, which has been the go-to method for wirelessly transmitting power to miniature implants.”

No silver bullet
Because of these roadblocks, smart dust could still be decades away from revolutionising the IoT on a commercial scale. According to Gartner’s Hype Cycle, it will take more than 10 years for smart dust to achieve mainstream use. On the graph’s curve, it is at the very beginning of its life cycle, slightly ahead of flying autonomous vehicles.

But the arrival of smart dust will not occur with a big bang. “There are lots of things that will be smart and dust-like on the way to that final destination,” Ray said. Even once the technical kinks are straightened out, the barriers to deploying smart dust on a wider scale could continue to expand. Like any new technological development, alongside its vast opportunities, smart dust brings sweeping new concerns about privacy and ethics. There has already been an outcry over how easily smart home devices can be hacked, and the commercialisation of smart dust would only increase the scale of data being collected by IoT-related products.

Whether it is Google, Amazon or a construction company you have never heard of, the thought of large institutions placing microscopic sensors capable of collecting audio and visual data anywhere they choose brings up a raft of questions around privacy and security: how is that data stored, and by whom? Will consumers be able to opt out of having their data collected?

Experts agree that as long as the issue of security is addressed from the start of the development cycle, it will not be difficult to ensure smart dust is protected from hackers. “[Security] will always be a battle, but even at a sub-millimetre scale there is no reason that dust motes can’t have banking-level encryption and authentication on their communications,” Pister said.

But the trouble with the gradual evolution of new technologies like smart dust is that researchers tend to focus first on making the technology function. Security only becomes a concern when the devices seep into the commercial and consumer markets – and even then, typically after a security scandal occurs. For researchers using smart dust technology to study bees or butterflies, it is easy to see why security is of little concern, according to Blaauw. “But when this technology starts becoming widely used, which is what I think we’re going to see in the next decade or so, then security will have to be added in,” he said.

Nevertheless, the real questions around the security of smart dust are part of a bigger story around data collection and privacy. “We can make [smart dust] secure; technically making it secure is not the problem,” Ray said. “The issue is, how do we decide what level of security we like?” For example, Facebook is a secure system – that, however, has not stopped the social media giant from becoming embroiled in one scandal after another in recent years. These scandals are not about technical security, but about Facebook’s role in sharing data without its users’ knowledge or consent. As Ray explained: “The issue… is not the level of security; it’s the decisions about how to manage that data.”

The IoT industry and social media firms have so far demonstrated how not to approach these complex questions. While 10 years may seem like a long time for business leaders and scientists to tease out the answers to these concerns, it is important that privacy and security are part of the conversation from the start.

Once a security framework is established, smart dust can start to act as the eyes and ears of businesses and organisations. Medical professionals could soon streamline diagnostics and treatments with the help of internal sensors, while a sprinkling of smart dust could allow manufacturers to monitor their inventory from the factory to the shelf. The arrival of networks of minuscule sensors will allow unprecedented levels of data collection. Companies must act before the dust settles on this new era.

Sectors on the front line of smart dust innovation

Industrial monitoring

A number of industrial sectors have already installed networks of small sensors to monitor machinery across a plant or factory. Oil refineries, chemical plants and mines are all places where existing sensor networks could be upgraded to smart dust as soon as the technology is available. As well as monitoring their equipment, consumer goods businesses with large operations (such as packaged-food makers or breweries) could use smart dust to boost inventory control and enhance security by wirelessly monitoring their products.

Medicine

Smart dust has the potential to vastly improve medical treatments and health monitoring by shedding light on what is happening deep inside the body. But even on the surface of the skin, millimetre-scale systems could revolutionise medicine. A process as simple as fitting a prosthetic limb could be improved by using smart dust: by scattering the tiny sensors around the area where the prosthesis will attach, smart dust can look out for areas of increased temperature, showing where the stress points will be.

Scientific research

Countless scientific research projects could be kick-started by the emergence of smart dust. In theory, scientists could send a swarm of smart dust to Mars – or even further into space – to collect data on new frontiers. Back on Earth, smart dust could be used to monitor the conditions of a wildfire or give early warning of a natural disaster like a volcanic eruption or an earthquake. On an even smaller scale, the microscopic sensors could measure – and control – energy or water usage and costs in homes or businesses.

Indonesia is the world’s fastest-growing e-commerce market

With communities spread across 17,000 islands, Indonesia seems like an unlikely place for online shopping to take off. Although cities like Jakarta are densely populated, a significant proportion of the population lives in rural areas like Kalimantan, where they make a living from coal and palm oil. Delivering to these areas is costly, making e-commerce a logistical nightmare. Yet, in many ways, the fact that some Indonesians are so cut off from many consumer goods has created a growing appetite for online shopping. The country is now home to the fastest-growing e-commerce market in the world, according to PPRO Group, and is predicted to be worth $53bn by 2025.

The rise of e-commerce
The introduction of the smartphone to Indonesia was nothing short of revolutionary for the country. Unlike their Western counterparts, who typically went online for the first time via a computer, most Indonesians first experienced the internet through a mobile phone. Now, 95 percent of the country’s internet users are mobile and Indonesians spend 206 minutes a day on social media compared with the global average of 124. The launch of the mobile economy unlocked a wealth of opportunities for Indonesia’s consumers and fostered a growing demand for online shopping.

For more Indonesians to partake in the digital economy, the country will need to see an increase in its fibre optic network

Tokopedia is one of the most successful start-ups to emerge from this trend. Founded in 2009, the online marketplace arrived in perfect time to connect a growing base of online consumers with retailers and independent sellers. In 2017, the e-commerce platform won $1.1bn in investment from Chinese e-commerce giant Alibaba.

“When Tokopedia was established in 2009, Tokopedia saw the disparity in opportunities across Indonesia,” Nuraini Razak, Vice President of Corporate Communications at Tokopedia, told The New Economy. “Small businesses in rural areas have to move to urban areas to expand their market reach, while consumers across Indonesia have limited access to goods or have to pay more for the same goods just because of where they live.”

Platforms like Tokopedia have also facilitated the country’s burgeoning excitement for entrepreneurship. Traditionally, as one of the most resource-rich countries in the world, Indonesia’s labour force has been largely concentrated in the copper, tin, oil and gas industries or in Indonesia’s largest job provider, the agricultural sector. Now, the idea of starting an online business is gaining traction among the digitally savvy youth.

“Tokopedia enables more than 5.5 million merchants in Indonesia, 70 percent of whom consider themselves first-time entrepreneurs,” said Razak. “Increasingly, Indonesian users have options, convenience and financial access… [which] further drive[s] the Indonesian digital economy.”

Pushing for digital innovation
The Indonesian Government recognises the critical role that start-ups and SMEs have played in driving digital innovation within the country and has pledged to support the digitalisation of approximately eight million SMEs by 2020. There are also hopes that the country’s president, Joko Widodo, who was recently re-elected for a second term, will help to further the country’s digital transformation. In his last term, Widodo completed various infrastructure projects, building much-needed roads, railways and ports. Now he wants to continue to push infrastructure spending, both urban and digital.

Razak argues that the country has been steadily improving its internet penetration, citing “much cheaper smartphones and better network infrastructure, such as 4G/LTE”. However, unlocking true digital transformation will be no easy task for Widodo. According to McKinsey & Company, despite its digitally savvy citizens, Indonesia is well behind other ASEAN countries in the digital revolution. Bandwidth is poor compared with the rest of the region, and in some provinces, the proportion of people with access to the internet is as low as 2.5 percent.

For more Indonesians to partake in the digital economy, the country will need to see an increase in its fibre optic network, as well as enhancement of its 4G infrastructure. Fortunately, the Palapa Ring project will bring 13,000km of undersea fibre-optic cables to Indonesia’s major islands, while Teleglobal and SES Networks will improve internet access in remote areas with their satellite-based network.

Another key obstacle is the question of human capital. Currently, Indonesia lacks the tech talent needed to significantly drive growth. The country invests less than Singapore and Malaysia in education on a per capita basis, and highly skilled workers account for only 10 percent of total employment, according to the World Economic Forum. As a result, Indonesian companies often offshore tech work or bring in foreign advisors on a temporary basis.

Widodo aims to improve Indonesia’s education system and believes this change will depend partly on foreign investment. Indonesia has a free trade deal with Australia, which includes zero-tariff access for many goods and services and allows Australian universities to set up campuses in Indonesia. Widodo hopes that such moves will improve the transfer of skills from Australia to Indonesia.

Indonesia’s digital economy is poised to become the largest in South-East Asia. Already, e-commerce has had a transformative impact on the country, largely because of small businesses and start-ups driving innovation. Now, Indonesia needs to encourage and facilitate that innovation by implementing the right infrastructure and improving the education system. Only then can the country ensure that it truly reaps the benefits of digital transformation.

WikiHow has an answer for everything

Prior to January 2005, if you wanted to learn how to tie a tie, treat an ear infection or overcome shame, you’d have to ask those around you in the hope that they happened to have the particular advice you were in need of. Today, a quick search on the world’s most popular ‘how to’ website, wikiHow, will yield answers to these questions and 375,000 others. More than 150 million people visit the site each month, learning from the vast range of practical, step-by-step guidance, all for free.

Before the glory days of the internet, founder Jack Herrick had his own personal library filled with ‘how to’ guides. “The light bulb went off when I really started diving into the web,” he told The New Economy. His greatest inspiration was Wikipedia, the most ubiquitous free encyclopaedia on the planet. “When I discovered Wikipedia and saw how well that was working… I realised that the same model could be used for ‘how to’.”

Getting Wiki with it
WikiHow wasn’t Herrick’s only attempt at creating a vast archive of free ‘how to’ guides. Between 2004 and 2006, Herrick served as owner and co-CEO of eHow, which published ‘how to’ articles written by paid contributors. But this model would not be sustainable if Herrick were to achieve his vision “to serve every single person on the planet and cover every topic”.

By using open-source software, wikiHow has been opened up for editing by millions. Anyone, anywhere can contribute their knowledge

“We needed something that had the potential to produce more topics in more languages, and also [of a] higher quality,” Herrick explained. He switched to the ‘wiki method’ to solve this conundrum: by using open-source software, the site has been opened up for editing by millions. Anyone, anywhere can contribute their knowledge, adding step-by-step guides based on whatever their expertise might be. As a result, wikiHow now covers a plethora of incredibly niche topics, from how to care for hummingbirds to how to paint saw blades.

The decision to run an open collaboration site is closely tied to the ethos Herrick set out with: “I wanted to build the world I wanted to live in… where people can use and reuse other people’s information, and share and improve [it] – that ethic appeals to me.”

WikiHow also promises the ‘right to fork’, which means that, should the project derail at some point, anyone can step in and take over. This acts as a safeguard to wikiHow’s mission. “As time has gone on, it’s become clear that the open-source model is crucial and the right to fork is necessary,” said Herrick. He uses Facebook as an example. At the start, he said, “Facebook just seemed like this wonderful garden that we’re all invited to play in, and now here we are, 14 years later, and it’s not this wonderful garden – it feels more like everyone is like a prisoner trapped inside their data-sucking cage”. He added: “If open-source software was being used and the users of Facebook had the right to fork, Facebook never would have abused us [in] this way.”

A very unique bird
This altruistic approach ties in with wikiHow’s unique hybrid business model. Elizabeth Douglas, who has served as CEO of the company since 2017, told The New Economy: “What it means is that we are a very mission-focused and mission-driven company, so our mission is to teach everyone in the world how to do anything. All of the decisions that we make as a team, in terms of what we’re going to do, what we’re going to invest in and what matters to us, always match to our mission. For instance, we will sometimes make decisions that are more mission-focused than they are revenue-focused.” One such decision saw the site translated into Indonesian, one of the most spoken languages in South-East Asia.

375,000

Topics on wikiHow

150m

Visitors to wikiHow each month

When Herrick started wikiHow, a term for this type of business model didn’t exist. “I [kind of] invented this term ‘hybrid organisation’, described as a community service project, which happens to be a for-profit company,” he told The New Economy. “We are a very unique bird relative to the rest of the flock in Silicon Valley.”

Interestingly, this mission-centric approach has allowed the company to grow much more organically than other businesses in the valley. “We’ve been profitable almost since the beginning,” said Douglas. “What’s different about wikiHow [compared with] a typical Silicon Valley business is that we’re not trying to grow as fast as we can. We grow our team or our product when we see really great opportunities that are aligned with our mission; we don’t grow just because we’re looking for an exit. So knowing that we’re going to be around for a long time, we’re able to make decisions that are really, really good for the company and for the mission in the long term… so it actually ends up being a business that is very much focused on long-term sustainability instead of short-term revenue opportunity.”

In line with this hybrid approach is the decision to refuse outside funding and buy-outs from bigger companies, despite numerous offers over the years. As Herrick explained: “Making money is great, certainly it’s something I have nothing against… However, we don’t want to do that at the expense of our mission – we just never found anyone who prioritises that the same way that we do.”

Douglas added: “When you achieve outside funding, whoever is funding will generally have a goal of return. Because we’re not trying to return someone’s money on a certain schedule… we don’t have the pressure of the needs of an outside investor.”

While it’s not the case in Silicon Valley, the number of hybrid organisations around the world is slowly starting to grow. “WikiHow has been leaning against the wind for a very long time here, and I’m now starting to feel the wind shift to be more of that,” said Herrick. “[The company is] no longer this total oddball.”

The kaizen approach
In order to achieve its mission, wikiHow operates under a modus operandi of constant evolution. Each article on the site can be edited numerous times, with tweaks or new images added by editors, both internal and external, to make it as helpful as possible. In discussing this mindset, Herrick referred to the Japanese concept of kaizen, or continuous improvement: “It’s funny, every time we finish something we’re like ‘this is the best, we can never get it any better’ – then six months later we find a way to make it better again.”

Herrick uses the site’s ‘how to tie a tie’ article as an example of wikiHow’s kaizen philosophy in action. Over the past 14 years, he says, it’s been changed hundreds of times. Today, the article reflects the way wikiHow sees itself evolving for the times ahead. “The article has a variety of different things. It has videos, and it has some illustrations on top of those videos, it has great quality text and still images… It’s really multimedia,” he explained. This dynamic page design helps users learn as quickly and as effectively as possible.

To push this goal further, each page on the site uses data to assess how helpful visitors find it so the necessary adjustments can be made. For guides of a medical or legal nature, wikiHow’s in-house team of experts carefully scrutinise each piece to ensure its accuracy. Examples such as ‘how to recognise spinal meningitis symptoms’ and ‘how to perform the Heimlich manoeuvre’ attest to how these physician-approved articles can go as far as saving the lives of people around the world.

WikiHow is a company unlike others: it acts like a charity, but doesn’t take donations; it turns a profit, but making money is a secondary factor in decision-making; growth is key to the company’s mission, but it refuses external investment that would accelerate growth. Through its hybrid model and open-source approach, wikiHow is able to maintain a constant cycle of improvement and sustainable growth. WikiHow’s mission is to teach the average user how to do any number of tasks, but the site has become a role model in the business world, too, thanks to its pioneering hybrid model.

Schneider Electric is helping to bring reliable electricity to Africa

Solar power continues to grow in popularity, with a number of new business models emerging in order to meet increased demand. But despite gaining favour in many markets around the world, there are still some regions that are yet to reap the benefits of this renewable energy – a fact that is made all the more troubling by the power inequality in such areas. In Africa, for example, only half of the continent’s vast population currently has access to a reliable source of electricity.

Fortunately, solar is a scalable power-generation technology that is well suited to the task of electrifying Africa. From the moment a single photovoltaic (PV) cell comes online, electrical current starts to flow; as more cells, modules, panels and arrays are added, power output rises proportionally. Over time, this growth in solar capacity will allow a greater number of people across the continent to gain access to electricity.
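The proportional scaling described above can be sketched in a few lines of code. This is a minimal illustration, not a model of any real installation: the per-panel rating and household demand figures are assumptions chosen purely for the example.

```python
# Minimal sketch of solar PV's linear scalability: total output grows
# in direct proportion to the number of panels installed. The figures
# below are illustrative assumptions, not data from any real project.

PANEL_WATTS = 300             # assumed nominal rating per panel (W)
HOUSEHOLD_DEMAND_WATTS = 500  # assumed average household load (W)


def array_output_watts(num_panels: int) -> int:
    """Output rises proportionally as panels are added to the array."""
    return num_panels * PANEL_WATTS


def households_served(num_panels: int) -> int:
    """Rough count of households an array of this size could supply."""
    return array_output_watts(num_panels) // HOUSEHOLD_DEMAND_WATTS


# Doubling the array doubles the output - no redesign required.
assert array_output_watts(200) == 2 * array_output_watts(100)
```

The point of the sketch is the absence of any fixed minimum: unlike a centralised plant, a PV array delivers useful power from the first panel onwards, and capacity can be grown incrementally as demand and funding allow.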

Growth in solar capacity will allow a greater number of people across Africa to gain access to electricity

Further, it’s important to note that solar power can be consumed onsite without the need for expensive transmission lines and doesn’t suffer from transmission losses. These are significant benefits over centralised power models, which require extensive infrastructure and significant investment to ensure residents receive electricity.

Electrifying Africa
Few technologies can rival the bankability and scalability of solar power. Nigeria, for example, has found great success with Green Village Electricity, a local distributed energy service company that operates more than 20 solar PV plants across the country. Each of the firm’s microgrid facilities can generate enough energy to power up to 2,000 households, businesses, schools and municipal offices per village.

As a result of this success, almost every country in the Economic Community of West African States is following Nigeria’s lead, with decentralised solar power generation now moving from a vague election promise to official public policy. Some countries have even created decentralised energy agencies that are tasked with fostering greater collaboration between governments, investors, non-governmental organisations and local populations. Off-grid applications are also helping to ensure residents have a reliable source of electricity.

As well as providing comfort and security to citizens, this greater access presents many growth opportunities, notably through the maintenance of new ecosystems and pay-as-you-go services. For this new business model to work, though, bankability is key.

Storage solutions
A key economic driver behind the adoption of solar power is the falling cost of onsite storage. Cheaper solar batteries allow telecoms operators to install more cellular towers, which, in turn, help to spur economic activity in the local area. The same is true of data centres, gas stations, water treatment facilities and other critical services. At Schneider Electric, for example, we have built a unique solution that provides a high-end real estate development in South Africa with energy independence. We have achieved this by combining community-level centralised storage with decentralised power generation.

Homeowners paying to be off-grid want the process to be easy and simple – they don’t want to have large and potentially dangerous battery banks or noisy diesel generators that require constant refuelling standing in their garages. Instead, they want the benefits of power security without having to adjust their lifestyles or pay premiums.

Africa currently spends $1.2bn per annum on diesel generators for telecoms tower farms that pollute the atmosphere. By implementing a solar-diesel hybrid system, however, CO2 emissions can be reduced and operating expenditure lowered. Solar power has the potential to amplify these savings further than any other decarbonised energy generation technology.
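The savings logic of a hybrid system can be illustrated with simple arithmetic. All figures below are hypothetical assumptions for the sake of the sketch; only the continent-wide $1.2bn diesel spend comes from the article.

```python
# Illustrative arithmetic for a solar-diesel hybrid at a single
# telecoms tower site. Every input here is an assumed, hypothetical
# figure chosen only to demonstrate the calculation.

DIESEL_COST_PER_KWH = 0.45  # assumed all-in diesel generation cost ($/kWh)
ANNUAL_LOAD_KWH = 50_000    # assumed yearly electrical load of one site
SOLAR_FRACTION = 0.6        # assumed share of the load met by solar


def annual_diesel_spend(solar_fraction: float = 0.0) -> float:
    """Fuel spend for the load share still carried by the generator."""
    return ANNUAL_LOAD_KWH * (1 - solar_fraction) * DIESEL_COST_PER_KWH


diesel_only = annual_diesel_spend()           # $22,500.00 per year
hybrid = annual_diesel_spend(SOLAR_FRACTION)  # $9,000.00 per year
savings = diesel_only - hybrid                # $13,500.00 per year
```

Under these assumed numbers, every kilowatt-hour shifted from diesel to solar comes straight off the fuel bill, which is why operating expenditure falls alongside emissions.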

Scalable, affordable and easy to distribute, PV technology levels the playing field by ensuring everyone has access to reliable electricity. Indeed, African countries that embrace solar power enjoy a competitive advantage that few Western nations can match. Against this backdrop, Africa is proving fertile ground for new business models and cleaner forms of energy.