US pharmaceuticals giant Eli Lilly will acquire Loxo Oncology for around $8bn in cash – $235 per share – as it seeks to push further into the cancer drugs market. The final sale price represents a 68 percent premium and is a substantial acquisition for Lilly, one of the world’s largest drugmakers.
Loxo, which was founded in 2013 in Stamford, Connecticut, is a specialist producer of genetically focused cancer drugs. While most medicines on the market treat a single type of cancer, Loxo’s products are able to treat multiple types, as they target single gene mutations.
“The acquisition of Loxo Oncology represents an exciting and immediate opportunity to expand the breadth of our portfolio into precision medicines and target cancers that are caused by specific gene abnormalities,” Anne White, the president of Eli Lilly’s oncology division, said in a statement.
Lilly also used the announcement to highlight four promising cancer treatments that are currently in development by Loxo, including Vitrakvi, a pioneering oral TRK inhibitor that was recently approved for sale by the US Food and Drug Administration (FDA).
Loxo’s COO, Jacob Van Naarden, expressed his excitement for the partnership in a statement: “We are confident that the work we have started, which includes an FDA approved drug, and a pipeline spanning from Phase 2 to discovery, will continue to thrive in Lilly’s hands.”
Eli Lilly was founded in 1876 by an American Civil War veteran and has a market capitalisation of approximately $110bn. It was the first company to mass-produce insulin and the polio vaccine, but it is arguably best known for its synthesis of Prozac.
Speaking to CNBC, Lilly CEO David Ricks said the company would like to grow its presence in oncology: “We have a good set of medicines there but we’d like to expand that because there’s so much exciting science for patients emerging in oncology to invest in.”
This deal is the second large-scale acquisition of a cancer drugmaker in as many weeks: Bristol-Myers Squibb, another large American pharmaceuticals company, announced on January 3 that it would acquire cancer drug specialist Celgene for $74bn. The deal is expected to close in Q3 2019.
The global market for cancer drugs is worth around $123bn per year. With these acquisitions, both Lilly and Bristol-Myers Squibb are carving out a competitive position in the sector, adding cutting-edge treatments to their respective arsenals in the process. These deals are, therefore, strategic plays for both companies’ product offerings and market shares.
The war on plastic is well underway. In January, the European Union declared that all packaging across the bloc must be reusable or recyclable by 2030. Businesses all over the world are getting on board too, targeting single-use plastics in particular.
Part of the reason this cause has captured the imagination of so many is the growing public awareness of the damage plastic is causing to marine life. According to the Ellen MacArthur Foundation, more than eight million tonnes of plastic enters our oceans every year. By some estimates, there will be more plastic in our seas than fish by 2050.
1.6m sq km
Estimated surface area of the Great Pacific Garbage Patch
Cutting down our plastic use will help, but it won’t remove the huge quantities of plastic that have already entered the marine ecosystem. Inspired by this problem, Dutch inventor Boyan Slat founded the Ocean Cleanup in 2013 to tackle the largest concentration of waste in our oceans, the Great Pacific Garbage Patch (GPGP).
Slat’s organisation now has more than 70 members of staff and has received millions of dollars of investment, but cleaning up a mass of plastic covering an estimated surface area of 1.6 million square kilometres still represents a huge challenge.
Taking out the trash
Over the years, Slat has refined the Ocean Cleanup system extensively. Today, it simply consists of a 600-metre-long floater and a three-metre-deep skirt attached just below the water’s surface. Wind, waves and the ocean current propel the passive, free-floating system.
And because the plastic in the GPGP is carried largely by the current alone, the wind- and wave-assisted Ocean Cleanup system moves faster than the waste it seeks to remove. This allows the underwater skirt to concentrate the plastic into a single area.
By 2040, an entire fleet of clean-up systems could have removed 90 percent of the oceans’ plastic
Once natural forces have accumulated the plastic in the centre of the Ocean Cleanup system, a garbage removal vessel is dispatched to collect the plastic and take it back to land to be sorted and recycled.
This process only needs to take place once every few months in order to rapidly reduce the size of the GPGP. Simulations indicate this method could remove 50 percent of the plastic in just five years.
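As a rough illustration of what such projections imply – a back-of-the-envelope sketch of our own, not the Ocean Cleanup’s modelling – suppose the fleet removes a constant fraction of whatever plastic remains each year:

```python
import math

# Toy model (our own illustration, not the Ocean Cleanup's simulation): assume
# the fleet removes a fixed fraction of the *remaining* plastic each year, so
# the mass decays exponentially: m(t) = m0 * exp(-k * t).

half_removed_years = 5.0                  # "50 percent of the plastic in just five years"
k = math.log(2) / half_removed_years      # implied removal rate, roughly 0.139 per year

def fraction_remaining(years: float) -> float:
    return math.exp(-k * years)

years_to_90_percent = math.log(10) / k    # time until only 10 percent remains

print(f"Plastic left after 5 years: {fraction_remaining(5):.0%}")   # 50%
print(f"Years to remove 90 percent: {years_to_90_percent:.1f}")     # ~16.6
```

Under that simple assumption, halving the patch every five years would put 90 percent removal roughly 17 years out – broadly in line with the organisation’s 2040 ambition.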
Sea change
On September 8, the Ocean Cleanup project finally launched, setting sail from San Francisco to embark on its 450km journey to the test location. Once there, however, it will face a number of challenges.
8m tonnes
Estimated amount of plastic entering our oceans each year
Although there are no heavy-traffic shipping routes currently passing through the GPGP, there is potential for the Ocean Cleanup system to encounter an ocean vessel. In order to prevent collisions, the system is equipped with lanterns, radar reflectors, GPS and the automatic identification system used extensively for maritime navigation.
In addition, the US Coast Guard has earmarked the area as a special-operations zone and will continue to inform mariners of the project.
As well as passing ships, the Ocean Cleanup system will have to contend with the violent weather that is to be expected in the middle of the Pacific Ocean. In order to safeguard against even the most testing storms, the apparatus has been designed with flexibility in mind, ensuring it can follow high waves and strong currents without experiencing undue strain.
2050
The year by which some expect plastic to outstrip fish in our oceans
The Ocean Cleanup team’s ambitions extend beyond the Pacific Ocean, however. Ultimately, Slat and the rest of the crew aim to tackle waste accumulated in other ocean gyres – the global systems of ocean currents where the majority of the Earth’s plastic waste ends up.
By 2040, an entire fleet of clean-up systems could have removed 90 percent of the oceans’ plastic. For a free-floating system that only relies on natural forces for its movement, that would be a truly remarkable achievement.
When the Fukushima Daiichi Nuclear Power Plant went into meltdown in March 2011, the consequences stretched far beyond the surrounding evacuation zone.
In the months following the incident, many prominent politicians in Germany who had previously been supportive of nuclear energy – including Chancellor Angela Merkel – changed their position. In May 2011, the German Government announced it would shut all nuclear reactors in the country by 2022.
Events like Fukushima – and the Chernobyl disaster that occurred some 25 years earlier – hang over nuclear power like a toxic mushroom cloud, dominating public discussion.
Even as proponents talk up its benefits – chiefly, a continuous supply of cheap, low-carbon energy – they often struggle to be heard over those pointing to these disastrous events.
Today, a number of cutting-edge start-ups are trying to change this narrative. Driven by entrepreneurial passion, as well as the possibility of government subsidies and significant profits, these businesses are set to shake up an industry that has long been in need of new ideas.
The forgotten man
The environmental challenges facing our planet are well documented: average global temperatures continue to increase and sea levels are rising at rates not seen for thousands of years.
Although the majority of scientists are in agreement as to the causes of climate change, coming to a consensus on how to tackle them has not been easy. Certainly, not every expert has been brave enough to admit nuclear power should play a key role in the world’s future energy makeup.
The Paris Agreement was rightly lauded for the way it attempted to bring unity to a fractious debate, but there are no easy answers when it comes to reducing carbon emissions.
Crucially, the agreement makes just one reference to renewable energy across its 32 pages. Its technology-neutral approach is long overdue and makes nuclear power a more viable option for carbon mitigation strategies.
According to Simon Irish, CEO of Terrestrial Energy, one of a number of firms taking a novel approach to nuclear technology, this can only be a good thing for the fight against climate change.
11%
Nuclear power’s contribution to the global energy supply
“In recent decades, nuclear energy has always suffered from a negative perception,” Irish told The New Economy. “However unfair and inaccurate these views, they have held nuclear technology back from achieving [its] full potential. Attitudes in the 1990s and 2000s also held back nuclear innovation. That has now changed: the potential of advanced reactors is driving growing excitement.”
In an ideal world, power generated from solar, wind and various other forms of renewable energy would be enough to meet humanity’s needs. However, challenges concerning the reliability and storage of some renewables make this optimistic at best.
In an op-ed published by The Guardian shortly after the Paris Agreement was signed, scientists James Hansen, Kerry Emanuel, Ken Caldeira and Tom Wigley argued that dismissing nuclear energy makes it more difficult to move towards a decarbonised planet.
They wrote: “The climate issue is too important for us to delude ourselves with wishful thinking. Throwing tools such as nuclear out of the box constrains humanity’s options and makes climate mitigation more likely to fail. We urge an all-of-the-above approach that includes increased investment in renewables combined with an accelerated deployment of new nuclear reactors.”
This is not to suggest those with concerns regarding nuclear power should be ignored. According to the Guardian article, nuclear reactors have prevented approximately 60 billion tonnes of carbon dioxide from entering the atmosphere over the past 50 years – but, in the process, they have produced radioactive waste that, if improperly managed, could pose a serious threat to humans and wildlife.
When this by-product of the nuclear industry is classified as high-level waste, it must be kept isolated for thousands of years.
Old and new
Commercial nuclear power has been utilised since the 1950s, but early hopes for nuclear-powered space shuttles, artificial hearts driven by radioactive batteries and plutonium-heated swimming pools for scuba divers never came to fruition. Today, the technology only meets around 11 percent of the world’s energy needs.
While negative headlines have certainly held nuclear back, so too has a lack of technological development. This wasn’t much of a problem when conventional reactors were turning a profit, but doing so has recently become more challenging.
The falling cost of renewables is eroding nuclear’s competitive advantage, and the industry is struggling to find the funds to decommission older reactors, let alone build new ones.
In addition, recent failings have further undermined investor confidence in nuclear projects. Last year, South Carolina Electric & Gas confirmed it would be shelving plans to build two large nuclear reactors in Fairfield County despite having already committed $9bn to their construction.
Estimates indicate the companies involved in the project would recoup just $861m if they were to sell the unfinished reactors for parts.
And yet, the industry’s future is not entirely bleak. While conventional reactor projects continue to struggle across Europe and the US, more inventive approaches are finding favour. According to the centre-left think tank Third Way, there are now 75 advanced nuclear projects in the US alone – a significant increase from the 48 recorded in 2015. ‘Next-generation reactors’ can also be found in Russia and China.
Events such as Fukushima have plagued the nuclear energy industry for some years
It’s hardly surprising Irish believes it is an exciting time to be working in nuclear power: “The tremendous potential of advanced nuclear technology, encouraged by supportive industry policy in many markets, has been the business incentive for dozens of nuclear energy start-ups in the UK, Canada and the US, representing investments of over $1.5bn in private capital.”
The new developments are taking a variety of forms and have different benefits. So-called ‘fast reactors’, for example, have been trialled in a number of countries to process the nuclear waste produced by conventional reactors.
Other ideas have focused on creating small modular reactors capable of operating in places where large-scale reactors wouldn’t be feasible. The commercial viability of nuclear fusion, as opposed to fission, is also being explored. These ideas could all give the industry a new lease of life.
Worth its salt
Traditional nuclear reactors use a continuously flowing system of water to prevent overheating. When this water is disrupted – as with the Fukushima plant when the tsunami hit – the reactor quickly overheats, leading to explosions and nuclear meltdowns. Creating a new type of reactor that avoids this safety flaw is a key concern for the advanced nuclear movement.
One of the proposals currently gaining traction in the market involves the use of a molten salt reactor that automatically cools down if it starts to overheat. Terrestrial Energy is one of a number of companies exploring this technology, which is capable of creating reactors that are ‘walk-away safe’.
This means that even if plant operators turned off the safety systems and walked away, the salts would continue to cool the reactor. This is because the liquid salt expands as it heats up, pushing the uranium atoms apart and slowing the reaction.
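To make that feedback loop concrete, the deliberately crude sketch below uses assumed, illustrative numbers (not Terrestrial Energy’s design data) to show what happens when all external cooling is lost in a reactor whose reactivity falls as the salt heats up:

```python
# Crude sketch of the negative feedback described above, with assumed numbers
# (not Terrestrial Energy's design data): hotter salt expands, reactivity falls,
# and fission power winds itself down without any operator action.

ALPHA = -0.005          # assumed reactivity change per degree C above nominal
NOMINAL_TEMP = 650.0    # illustrative operating temperature, degrees C

temp, power = NOMINAL_TEMP, 1.0   # normal operation, then all external cooling is lost

for minute in range(0, 121, 10):
    print(f"t={minute:3d} min  temp={temp:6.1f} C  power={power:5.2f}")
    for _ in range(10):   # advance the toy model by ten one-minute steps
        power *= 1.0 + ALPHA * (temp - NOMINAL_TEMP)   # expansion slows the chain reaction
        temp += 2.0 * power                            # with cooling lost, fission heat accumulates
```

In this toy model, the power winds itself down as the salt warms by a few tens of degrees – the essence of the ‘walk-away safe’ claim, albeit vastly simplified.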
According to Irish, there are a number of other advantages associated with this technology: “Terrestrial Energy’s advanced nuclear power plant design, called the Integral Molten Salt Reactor [IMSR], represents a transformative clean energy technology, fundamentally different from today’s commercial reactors.
“IMSR power plants are a scalable and cost-effective clean energy alternative to fossil fuel combustion. They can be built at a fraction of the upfront cost of a conventional nuclear power plant and in half the time.”
The safety benefits molten salt reactors can deliver could be worth far more than any financial savings
According to the Terrestrial Energy CEO, an IMSR power plant could generate electricity at a cost of $50/MWh, a significant reduction from the average $97/MWh delivered by conventional reactors. If achieved, this figure would also prove competitive against electricity generated by fossil fuels – particularly coal.
Although it would be easy to dismiss Terrestrial’s estimates as optimistic, technological developments could see the cost of energy generated by molten salt reactors come down further. More importantly, the safety benefits they can deliver could be worth far more than any financial savings.
Getting the green light
Nuclear power has long been overdue a revamp, but the delay is, to some extent, understandable. Time frames are always going to be stretched when the technologies involved are producing large quantities of hazardous waste.
Regulatory challenges can be particularly difficult to overcome and can leave entrepreneurs waiting decades between a project’s formulation and its commercialisation. In the US, the Nuclear Regulatory Commission has been heavily criticised for being something of a closed shop, unwilling to create relationships with the new wave of nuclear businesses.
With investors unsure as to when, or indeed if, they’ll make a financial return, nuclear start-ups face an uphill battle to acquire the funding needed to gain a foothold in a prohibitively expensive industry. In turn, this can cause nuclear start-ups to exaggerate their potential profits in order to outshine competitors.
While this may help firms plug momentary gaps in funding, it erodes long-term confidence in the industry. Still, the hurdles facing advanced nuclear start-ups are not enough to deter everyone.
“We plan for [the] first commercial deployments of IMSR power plants in the 2020s, and are making steady progress towards this goal,” Irish noted. “Any truly innovative technology takes time to develop, but our IMSR design removes many of the limitations that have held back traditional reactor designs, and has advantages even when compared to other advanced reactors.”
Lending weight to Terrestrial’s optimism, the US Senate’s energy committee passed the Advanced Nuclear Energy Technologies Act earlier this year, authorising the secretary of energy to push ahead with four advanced nuclear reactors by 2028. However, even with the boost this will give nuclear start-ups, time may not be on their side.
In order for nuclear power to supply just a fraction of the clean energy required over the coming decades, governments around the world will need to approve new plants at a far faster rate than is currently the norm.
This will require a shift in the regulatory mindset to one that is more receptive to new ideas. In other words, government incentives for low-carbon fuels will need to encompass more than just renewables.
The potential effects of climate change are devastating. Even if the two-degree-Celsius target set out by the Paris Agreement is met, the most pessimistic predictions indicate that sea levels could still rise by as much as six metres by 2100.
As humanity struggles to curtail its consumption, its margin for error gets smaller and smaller. To reject nuclear power as a possible addition to our decarbonisation efforts, just as new technological advancements are gaining ground, could prove to be an incredibly short-sighted decision.
For the well-off in both rich and poor countries around the world, lives are enriched by plentiful access to energy, which provides light, fresh food and clean water, as well as powering technology and allowing temperatures to be controlled.
Abundant energy provides the same life-transforming labour as hundreds of servants: without a refrigerator, we would need to locate fresh food daily, store shelves would be half-empty, and a lot of food would go bad before we could eat it. In fact, this is one reason why stomach cancer was the leading cancer in the US in 1930. Without synthetic fertiliser, which is produced almost entirely with fossil fuels, half the world’s food consumption would be imperilled.
Without modern stoves and heaters, we would need to find our own firewood, and we would risk being poisoned in our own houses by killer air pollution. And without fuel-powered trucks, ships and machines, humans would need to do nearly all the hard labour.
Moving the goalposts
Worldwide, fossil fuels produce two thirds of all electricity, with nuclear and hydro producing another 27 percent. According to the International Energy Agency (IEA), solar, wind, wave and bioenergy produce just 9.8 percent of electricity in the OECD, and this is possible only because of huge subsidies, cumulatively totalling over $160bn this year. Even ultra-environmentally aware Germany still produces more than half of its electricity with fossil fuels.
Yet there is a disturbing movement in the West to tell the 1.1 billion people who still lack these myriad benefits that they should go without. A familiar refrain suggests that instead of dirty, coal-fired power plants, poor countries should leapfrog straight to cleaner energy sources like off-grid solar technology. Influential donors – including the World Bank, which no longer funds coal energy projects – endorse this view.
The underlying motivation is understandable: policymakers must address global warming. Eventually moving away from fossil fuels is crucial, and innovation is required to make green energy cheap and reliable. But this message to the world’s poor is hypocritical and dangerous. While fossil fuels contribute to global warming, they also contribute to prosperity, growth and wellbeing.
There is a strong, direct connection between power and poverty: the more of the former, the less of the latter. A study in Bangladesh showed that grid electrification has significant positive effects on household income, expenditure and education. Electrified households experienced a jump of up to 21 percent in income and a 1.5 percent reduction in poverty each year.
Reliance on coal is not ending soon. While we would wish otherwise, it often remains the cheapest, most dependable energy source: the IEA estimates that, by 2040, coal will still be cheaper, on average, than solar and wind energy, even with a sizeable carbon tax.
Problems come to light
Over the past 16 years, nearly every person who gained access to electricity did so through a grid connection, mostly powered by fossil fuels. And yet donors say that many of the 1.1 billion people who are still without electricity should instead try solar panels.
Compared with expensive grid expansion, providing an off-grid solar cell is very cheap. But for the recipient, it is a poor substitute. It offers just enough power to keep a light bulb going and to recharge a mobile phone, which is better than nothing – but only barely. The IEA expects that each of the 195 million people with off-grid solar will get just 170kWh per year – or half of what one US flat-screen TV uses in a year.
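Some quick household arithmetic – using assumed appliance wattages of our own, not IEA figures – shows how little headroom 170kWh a year actually leaves:

```python
# Illustrative household arithmetic with assumed appliance wattages
# (our own figures, not the IEA's).

annual_kwh = 170
daily_wh = annual_kwh * 1000 / 365      # roughly 466 Wh available per day

led_bulb_watts, bulb_hours = 10, 5      # one small LED bulb, five hours a night
phone_charge_wh = 10                    # roughly one smartphone charge
small_fridge_wh_per_day = 1000          # assumed draw of a small, efficient fridge

basic_use_wh = led_bulb_watts * bulb_hours + phone_charge_wh
print(f"Daily budget: {daily_wh:.0f} Wh")
print(f"Light plus phone: {basic_use_wh} Wh, leaving {daily_wh - basic_use_wh:.0f} Wh")
print(f"A small fridge alone would need ~{small_fridge_wh_per_day} Wh - over twice the budget")
```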
While fossil fuels contribute to global warming, they also contribute to prosperity, growth and wellbeing
Perhaps not surprisingly, the first rigorous test published on the impact of solar panels on the lives of poor people found that while they got a little more electricity, there was no measurable impact on their lives: they did not increase savings or spending; they did not work more or start more businesses; and their children did not study more.
Little wonder: 170kWh is not what most of us would consider real access to electricity. Off-grid energy at this level will never power a factory or a farm, so it cannot reduce poverty or create jobs.
And it will not help fight the world’s biggest environmental killer: indoor air pollution, which is mostly caused by open fires fuelled by wood, cardboard and dung, and claims 3.8 million lives annually.
This is not a concern in rich countries, where stoves and heaters are hooked up to the grid. But because solar is too weak to power stoves and ovens, recipients of off-grid solar panels will continue suffering.
Balance of power
In 2016, the Nigerian finance minister, Kemi Adeosun, called the West out for its “hypocrisy” in attempting to block Africa from using coal to solve its power shortages: “After polluting the environment for hundreds of years… now that Africa wants to use coal, they deny us.”
A Copenhagen Consensus study for Bangladesh found that building new coal-fired power plants there would, over the next 15 years, generate global climate damage eventually costing around $592m.
But the benefits from electrification through higher economic growth would be almost 500 times greater, at $258bn – equivalent to more than an entire year of the country’s GDP. By 2030, the average Bangladeshi would be 16 percent better off.
Denying Bangladesh this benefit in the name of combatting global warming means focusing on avoiding 23 cents of global climate costs for every $100 of development benefits that we ask Bangladeshis to forgo – and this in a country where energy shortages cost an estimated 0.5 percent of GDP, and around 21 million people survive on less than $1.25 per day.
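The ‘23 cents for every $100’ figure follows directly from the study’s headline numbers, as a quick calculation shows:

```python
# Quick check of the Copenhagen Consensus figures quoted above.
climate_damage = 592e6         # $592m in eventual global climate costs
development_benefit = 258e9    # $258bn in benefits from grid electrification

cents_per_100_dollars = 100 * 100 * climate_damage / development_benefit
print(f"Climate cost avoided per $100 of benefits forgone: {cents_per_100_dollars:.0f} cents")
# prints roughly 23 cents, matching the figure in the text
```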
There is no choice: we must fight energy poverty and fix climate change. But that requires a huge increase in green energy research and development, so that clean sources eventually outcompete fossil fuels.
And it means recognising that it is hypocritical for the world’s wealthy, who would never accept survival on a tiny amount of power, to demand this from the world’s poorest.
As digital technologies and automation have advanced, fears about workers’ futures have increased. But the end result does not have to be negative: the key is education.
Already, robots are taking over a growing number of routine and repetitive tasks, putting workers in some sectors under serious pressure. In South Korea, which has the world’s highest density of industrial robots (631 per 10,000 workers), manufacturing employment is declining, and youth unemployment is high. In the US, the increased use of robots has, according to a 2017 study, hurt employment and wages.
But while technological progress undoubtedly destroys jobs, it also creates them. The invention of motor vehicles largely wiped out jobs building or operating horse-drawn carriages, but generated millions more – not just in automobile factories, but also in related sectors like road construction.
Recent studies indicate that the net effects of automation on employment, achieved through upstream industry linkages and demand spillovers, have been positive.
The challenge today lies in the fact that the production and use of increasingly advanced technologies demand new, often higher-level skills, which cannot simply be picked up on the job.
Given this, countries need to ensure that all of their residents have access to high-quality education and training programmes that meet the needs of the labour market. The outcome of the race between technology and education will determine whether the opportunities presented by major innovations are seized, and whether the benefits of progress are widely shared.
Primary focus
In many countries, technology has taken the lead. The recent rise in income inequality in China and other East Asian economies, for example, reflects the widening gap between those who are able to adopt advanced technologies and those who aren’t.
But education-job mismatches plague economies worldwide, partly because formal education fails to produce graduates with skills and technical competencies relevant to the labour market.
In a report by the Economist Intelligence Unit (EIU), 66 percent of the executives surveyed were dissatisfied with the skill level of young employees, and 52 percent said a skills gap was an obstacle to their firm’s performance. Meanwhile, according to an OECD survey, 21 percent of workers reported feeling over-educated for their jobs.
This suggests that formal education is teaching workers the wrong things, and that deep reform is essential to facilitate the development of digital knowledge and technical skills, as well as non-routine cognitive and non-cognitive (or ‘soft’) skills.
This includes the four Cs of 21st-century learning (critical thinking, creativity, collaboration and communication) – areas where humans retain a considerable advantage over artificially intelligent machines.
The education challenges raised by advancing technologies affect everyone, so countries should work together to address them, including through exchanges of students and teachers
The process must begin during primary education, because only with a strong foundation can people take full advantage of later education and training. And in the economy of the future, that training will never really end.
Given rapid technological progress, improved opportunities for effective lifelong learning will be needed to enable workers to upgrade their skills continuously or learn new ones. At all levels of education, curricula should be made more flexible and responsive to changing technologies and market demands.
Online learning
One potential barrier to this approach is a dearth of well-trained teachers. In sub-Saharan African countries, for example, there are some 44 pupils for every qualified secondary school teacher; for primary schools, the ratio is even worse, at 58 to one. Building a quality teaching force will require both monetary and non-monetary incentives for teachers and higher investment in their professional development.
This includes ensuring that teachers have the tools they need to take full advantage of information and communications technology (ICT), which is not being widely used, despite its potential to ensure broad access to lifelong learning through formal and informal channels. According to the EIU report, only 28 percent of secondary school students surveyed said that their school was actively using ICT in lessons.
ICT can also help to address shortages of qualified teachers and other educational resources by providing access across long distances via online learning platforms. For example, the Massachusetts Institute of Technology’s OpenCourseWare enables students to reach some of the world’s foremost teachers.
This points to the broader value of international cooperation. The education challenges raised by advancing technologies affect everyone, so countries should work together to address them, including through exchanges of students and teachers, as well as the construction and upgrading of ICT infrastructure.
All efforts to bolster education should emphasise accessibility, so that those who are starting out with weaker educational backgrounds or lower skill levels can compete in the changing labour market.
Well-designed and comprehensive social safety nets – including, for example, unemployment insurance and public health insurance – will also be needed to protect vulnerable workers amid rapid change.
The AI revolution will be hugely disruptive, but it will not make humans obsolete. With revamped education systems, we can ensure that technological progress makes all of our lives more hopeful, fulfilling and prosperous.
When two NASA employees created the first affordable 3D printer prototype in 2013, enthusiasm from all sectors reverberated around the globe. For medical practitioners, the idea that 3D printers could be used to build human organs became the ultimate goal.
The 3D printer first took form in 1984 when Charles ‘Chuck’ Hull filed a patent for stereolithography technology, which is used for 3D printing in the layer-by-layer method that is so recognisable today. The technique uses photopolymerisation, a process through which light links molecules together, forming polymers. This first generation of 3D printers, however, used materials that were not robust enough for final products, limiting them to rapid prototyping.
20μm
Resolution achieved by 3D Systems
Around a decade later, Hull’s company, 3D Systems, introduced nanocomposites, a new generation of materials for 3D printing. Made of blended plastics and powdered metals, they were far more durable than their predecessors, opening the door to 3D printing for durable end-products. The medical sector soon came calling.
Over the following years, new processes and techniques were developed, paving the way for 3D printed organs. A seminal moment came in 1999 when scientists at the Wake Forest Institute for Regenerative Medicine printed an artificial scaffold of a human bladder. Once covered in human cells, the scaffold was able to grow into a working organ; the stage for bioprinting was set.
Bringing organs to life
Numerous milestones have been met since then: in 2002, functional (though miniature) kidneys were transplanted back into animal subjects, and in 2010 the first blood vessel was printed. Progress has been slow, but for good reason: while bioprinting can successfully create human skin, cartilage and retinas, it can only do so because such tissue is very thin, small or devoid of blood vessels.
Zhengchu Tan, a researcher at Imperial College London’s Department of Mechanical Engineering, explained the challenge facing scientists: “In order to 3D print a full organ, all the different cells that are found in the organ must be implanted or derived from stem cells. These cells must then create the complex biological structure of the organ, including the arrangement of blood vessels (vasculature). Then we need to test whether the organ is able to complete its normal function in the body.”
Fortunately, the field has seen significant developments of late. Scientists at United Therapeutics, a biotech company based in Maryland, US, are working on a printer that could recreate the intricate details that make up a lung, including all 23 branches, as well as accompanying alveoli and capillaries.
United Therapeutics is using a collagen printer made by 3D Systems, and is also working with start-up firm 3Scan to create detailed maps of the structure of lungs. So far, United Therapeutics has printed just a few branches with no cells, but there’s a great deal of enthusiasm that an entire lung could be possible in a decade or so.
Creating a 3D printed version of an organ is just the first step – scientists must then evaluate whether it can perform its normal function within the body
To make that happen, the company must overcome the challenge of resolution size. At present, 3D Systems’ printer can achieve a resolution of 20 micrometres. A lung, however, will require components less than a micrometre in size.
In an interview with MIT Technology Review, Pedro Mendoza, Director of Bioprinting at 3D Systems, said he plans to use techniques from the semiconductor industry, such as mirrors, masks and lasers, to improve the resolution currently being achieved.
Speed also needs to be addressed – at the current rate, a completed lung scaffold would take around a year to build – while bringing the matrix to life is another challenge.
“There’s a great deal of enthusiasm that an entire lung could be possible in a decade or so”
San Diego-based Organovo – which is, incidentally, the company that created the first 3D printed blood vessel – has been making strides in the field too. The company has developed a process that can turn cells from donor organs into printable ‘bioink’.
Layers are then printed to build sections of liver tissue, which Organovo plans to use for clinical trials – a welcome alternative to animal testing. It has also started helping patients directly: in December 2017, Organovo was granted orphan drug designation to treat a rare condition, alpha-1 antitrypsin deficiency, using its 3D printed liver tissue.
Blood runs cold
Imperial College London, in collaboration with King’s College London, meanwhile, is approaching the task by using cryogenics to create 3D printed structures. Tan, one of the scientists working on the project, told The New Economy how the technique works: “The printing material is a liquid solution. To turn it into a soft, solid material, it must be frozen and then thawed. This is where we use cryogenics to freeze the printing solution, hence solidifying it.”
By using this method, the team is able to create structures that can imitate the characteristics of organs. Tan added: “By solidifying each layer, further layers can be built on top to create 3D structures. After the frozen structure is created, it is thawed and it turns into a super-soft hydrogel that mimics the mechanical properties of the brain.”
Structures made from hydrogels can be used to create scaffolds, which act as templates for tissue regeneration. This method allows cell regeneration without the issues often associated with tissue transplants – namely, rejection.
That said, hydrogel structures come with their own issues. “One of the main challenges of 3D cell culturing is that cells seeded deep inside a cell seeding substrate cannot survive if there is an inadequate flow of cell culture medium,” Tan explained. This includes oxygen, water and nutrients.
Fortunately, cryogenic 3D printing can solve this problem. The system is able to create hollow structures, meaning the cell culture medium can flow through the entire structure. “This brings the research closer to achieving full 3D tissue regeneration,” Tan explained. “The 3D scaffold also provides a structure that the cells can grow on to aid the development of their own extracellular matrix and eventual shape, instead of growing from a flat 2D petri dish.”
<1μm
Resolution needed to print a human lung
She added: “The cryogenic technique is unique because it utilises the liquid to solid phase change of the printing ink in order to build up layers in 3D. The printing material is able to crosslink after the frozen structure is thawed, which means that no toxic chemicals (crosslinking agents) or harsh UV radiation (photocrosslinking) that might harm cells are required. This makes the technique very friendly to cells and tissue regeneration, as cells are often frozen and thawed for storage.”
Printing on the brain
While the team at Organovo works to increase the size of tissue it can grow, United Therapeutics is trying to circumvent the current limits of printing resolution. For the team at Imperial College London, meanwhile, research is now focused on improving the cryogenics system to print much larger and more complex structures.
“There has also been a development on the composite hydrogel printing material: it has been formulated to mimic [the] brain, lung and liver so we will be able to print structures that mimic various organs,” Tan said.
“The main application for these structures is to provide a more accurate in-vitro testing method and a more ethical alternative to animal testing for destructive cell studies (chemo, radio and immunotherapy). Specifically, the end goal will be to study how different brain tumour cells respond to different lines of treatment. The ultimate goal will, of course, be to create a full human artificial brain.”
There is still a long way to go until organs can be printed on a large scale, eliminating fatally long transplant waiting lists and the unnecessary deaths that result. But progress is being made and the scientists driving it are incredibly committed. Whether through cryogenics or printable bioink, it seems certain that 3D printed organs will eventually be brought to life.
While the mania for cryptocurrencies may have peaked, new units continue to be announced, seemingly by the day. Prominent among the new arrivals are so-called ‘stablecoins’. Bearing names like Tether, Basis and Sagacoin, their value is rigidly tied to the dollar, the euro or a basket of national currencies.
It’s easy to see the appeal of these units: viable monies provide a reliable means of payment, unit of account and store of value. But conventional cryptocurrencies, such as bitcoin, trade at wildly fluctuating prices, which means that their purchasing power – their command over goods and services – is highly unstable. Hence, they are unattractive as units of account.
Face value
No grocer in their right mind would price the goods on their shelves in bitcoin. No worker would want a long-term employment contract that paid in a fixed number of those units. Furthermore, because their ability to command goods and services in the future similarly fluctuates wildly, cryptocurrencies like bitcoin are unattractive as a store of value. (Cryptocurrencies are also challenged as a means of payment, but leave that aside for the moment.)
Stablecoins purport to solve these problems. Because their value is stable in terms of dollars or their equivalent, they are attractive as units of account and stores of value. They are not mere vehicles for financial speculation.
But this doesn’t mean that they are viable. To understand why, it is useful to distinguish three types of stablecoin. The first type is fully collateralised: the operator holds reserves equalling or exceeding the value of the coins in circulation.
Tether, which is pegged one-to-one to the dollar, claims to hold dollar deposits equal to the value of its circulation. But the veracity of this claim has been disputed.
Conventional cryptocurrencies trade at wildly fluctuating prices, which means that their purchasing power is highly unstable
This points to yet another problem with this model: expense. To issue one dollar’s worth of Tether to you or me, the platform must attract one dollar of investment capital from you or me, and place it in a dollar bank account.
One of us will then have traded a perfectly liquid dollar, supported by the full faith and credit of the US Government, for a cryptocurrency with questionable backing that is awkward to use.
This exchange may be attractive to money launderers and tax evaders, but not to others. In other words, it is not obvious that the model will scale, or that governments will let it.
Closing the stable door
The second type of stablecoin is partly collateralised. In this case, the platform holds dollars equal to, say, 50 percent of the value of the coins in circulation. The problem with this variant will be familiar to any monetary policymaker whose central bank has sought to peg an exchange rate while holding reserves that are only a fraction of its liabilities.
If some coin owners harbour doubts about the durability of the peg, they will sell their holdings. The platform will have to purchase them using its dollar reserves to keep their price from falling.
But, because the stock of dollar reserves is limited, other investors will scramble to get out before the cupboard is bare. The result will be the equivalent of a bank run, leading to the collapse of the peg.
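A stylised simulation – with purely illustrative numbers – of a peg backed by reserves covering half the coins in circulation makes the dynamic plain: once holders start heading for the exit, the reserves run out long before everyone is made whole.

```python
# Stylised run on a 50 percent-collateralised stablecoin (illustrative numbers only).

coins_outstanding = 1_000_000   # coins in circulation, each redeemable at $1
dollar_reserves = 500_000       # reserves cover only half of the circulation

day = 0
while coins_outstanding > 0:
    day += 1
    redemptions = int(0.10 * coins_outstanding)   # assume 10% of remaining holders exit each day
    if redemptions > dollar_reserves:
        print(f"Day {day}: reserves exhausted - peg collapses with "
              f"{coins_outstanding:,} coins still outstanding and unbacked")
        break
    dollar_reserves -= redemptions
    coins_outstanding -= redemptions
    print(f"Day {day}: reserves ${dollar_reserves:,}, coins outstanding {coins_outstanding:,}")
```

In this toy run, the peg fails within a week, with more than half of the coins still in circulation and nothing left to back them.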
The third type of stablecoin, which is uncollateralised, has this problem in spades. Here, the platform issues not just crypto coins but also crypto bonds. If the price of the coins begins to fall, the platform buys them back in exchange for additional bonds.
The bonds are supposed to appeal to investors because they trade at a discount – so that, in principle, their price can rise – and because the issuer promises to pay interest to the bondholders in the form of additional coins. That interest is to be funded out of the income earned from future coin issuance.
Here, too, the flaw in the model will be obvious to even a novice central banker. The issuer’s ability to service the bonds depends on the growth of the platform, which is not guaranteed. If that growth becomes less certain, the price of the bonds will fall.
More bonds will then have to be issued to prevent a given fall in the value of the coin, making it even harder to meet interest obligations. Under plausible circumstances, there may be no price, however low, that attracts willing buyers of additional bonds. Again, the result will be the collapse of the peg.
All of this will be familiar to anyone who has encountered even a single study of speculative attacks on pegged exchange rates, or to anyone who has had a coffee with an emerging-market central banker.
But this doesn’t mean that it is familiar to the wet-behind-the-ears software engineers touting stablecoins. And it doesn’t mean that the flaws in their currently fashionable schemes will be familiar to investors.
The market for biopharmaceuticals – medicines created using biotechnology – has exploded over the past two decades, with annual sales of recombinant proteins and antibodies standing at around $140bn, and vaccine sales totalling approximately $36bn per annum.
These medicines have a massive impact on human health, providing improved and even life-saving treatments for diseases ranging from cancer, haemophilia and diabetes to infectious diseases.
Examples of blockbuster products – those with annual sales over $1bn – include insulin (diabetes; $34bn each year), erythropoietin (anaemia; $10bn), Remicade (arthritis; $7bn), Prevnar 13 (pneumococcal disease; $5bn) and Gardasil (cervical cancer; $2bn).
The enormous sales success and societal impact of these medicines help to fuel the R&D of novel products, with $127bn spent each year on the development of new and improved versions of biopharmaceuticals.
Based on the number of novel compounds currently in the pipeline, the US is by far the most active market, with around 4,000, according to Batavia Biosciences’ research. Europe and Japan are the second- and third-largest contributors, producing 2,000 and 500 compounds respectively, while the rest of the global market provides a further 3,000 compounds.
Of the $127bn flowing into R&D each year, some $117bn is dedicated to conducting clinical trials, while the remaining $10bn is spent on developing manufacturing processes to ensure products are produced to a consistently high standard.
A lack of efficacy, poor biodistribution and adverse effects can all prevent drugs from progressing to Phase 1 clinical trials
The cost of progress
The journey from compound discovery to market introduction starts with the R&D phase – in which the initial discovery and product refinement takes place – and is followed by a series of clinical trials.
Phase 1 clinical trials seek to determine the safety and dosage of the candidate drug by testing it on tens of healthy volunteers. Phase 2 trials then test the drug on hundreds of patients to assess efficacy and identify any potential side effects.
Phase 3 clinical trials extend this sample of patients to the thousands, again monitoring for adverse reactions and assessing efficacy.
Batavia Biosciences has built a world-renowned R&D organisation that understands the challenges associated with biopharmaceutical product development
According to Batavia’s research, the attrition rate in the development of novel biopharmaceuticals is substantial, with only around five percent of candidate drugs successfully making it through the R&D phase into the first round of clinical trials.
Approximately 63 percent of these drugs then pass Phase 1 clinical trials, with only 31 percent of that number moving on to Phase 3. Less than 50 percent of novel biopharmaceuticals then advance to Phase 4 – also known as market introduction.
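Treating those quoted figures as stage-by-stage pass rates – and taking ‘less than 50 percent’ as an upper bound – a quick calculation gives a sense of the cumulative odds facing a new compound:

```python
# Rough cumulative-odds calculation using the attrition figures quoted above,
# read as stage-by-stage pass rates ("less than 50 percent" taken as an upper bound).

stages = {
    "R&D to Phase 1":     0.05,   # ~5% of candidates survive R&D into the clinic
    "Phase 1 to Phase 2":  0.63,  # ~63% pass Phase 1
    "Phase 2 to Phase 3":  0.31,  # ~31% move on to Phase 3
    "Phase 3 to market":   0.50,  # "less than 50 percent" reach market introduction
}

survival = 1.0
for stage, pass_rate in stages.items():
    survival *= pass_rate
    print(f"{stage:<20} pass rate {pass_rate:.0%}  cumulative survival {survival:.2%}")

print(f"\nRoughly 1 in {1 / survival:,.0f} candidates reaches the market on these assumptions")
```

On those assumptions, only around one candidate in 200 makes it from the laboratory bench to the market.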
The R&D phase, therefore, has the highest attrition rate by far, with many putative products failing to progress into a regulatory-approved investigational new drug (IND) dossier – a prerequisite for conducting Phase 1 clinical trials.
Reasons for failure can include a lack of efficacy, poor biodistribution and adverse effects, among many others. In essence, an IND dossier must answer three pivotal questions: what scientific rationale is there for testing the candidate medicine on humans?
Has a process been established that allows for robust and reproducible manufacturing of the candidate medicine? And what safety data has been generated that allows a solid risk assessment of the candidate medicine to be tested on humans?
$127bn
Approximate spend on the development of new and improved biopharmaceuticals
As the development of highly complex biopharmaceuticals is extremely expensive (costing at least $100m to bring to market) and risky (given the high attrition rate), the commercial price of such medicines is typically very high.
Finding a remedy
It is in the R&D phase that Batavia and other contract development and manufacturing organisations (CDMOs) seek to offer their know-how and experience to improve market outcomes.
Batavia has a broad client base, comprising biotech companies, multinational biopharmaceuticals firms and non-profit organisations, including a range of academic, governmental and charitable institutions.
The company has state-of-the-art laboratory infrastructure in both the Netherlands and the US, and is well equipped to develop any biopharmaceutical – be it an antibody, recombinant protein or vaccine – from early discovery all the way through to clinical manufacturing.
Batavia collaborates with institutions such as the World Health Organisation, the Bill & Melinda Gates Foundation (BMGF) and the National Institutes of Health, as well as a number of prominent academic institutes, to develop novel vaccines.
Examples of countermeasures currently under development include novel vaccines against respiratory syncytial virus, chikungunya virus and Ebola virus, as well as potent neutralising antibodies that protect against the Zika virus.
More recently, Batavia has been working with Univercells and Merck to create a low-cost manufacturing platform that will significantly reduce the cost of global health vaccines.
$117bn
Amount dedicated to conducting clinical trials
The platform, which is known as OMNIVAX and is sponsored by the BMGF, aims to reduce the per-dose price of the trivalent poliovirus vaccine from around $2 to $0.15.
OMNIVAX brings together novel high-cell-density packed-bed reactors, novel high-efficiency purification membranes and continuous processing to deliver lower costs through massive process intensification. The platform is also being rolled out around the world for a range of global health and epidemic-preparedness vaccines.
Batavia collaborates with a number of renowned global institutions to bring transformative products to market
Culture of success
Batavia has built a world-renowned R&D organisation that understands the challenges associated with biopharmaceutical product development and can provide sponsors and collaborators with realistic and practical solutions.
As well as providing know-how, Batavia actively develops a range of tools and technologies to facilitate the development and manufacturing of biopharmaceuticals at reduced costs and accelerated timelines.
Examples of such technologies include the company’s SCOUT, SATIRN, SIDUS, STEP and SCOPE platforms, which help to provide a more thorough understanding of the manufacturing issues associated with novel compounds early in the product development chain. This gives companies the opportunity to either ‘fail fast’ or invest more capital into a prospective product.
$10bn
Amount dedicated to developing manufacturing processes
The market for CDMOs is expected to continue expanding, with some industry commentators even predicting year-on-year growth of up to 15 percent. A major driver of this optimism is the anticipated surge in the number of biotech companies operating around the world – due, in part, to exciting breakthroughs in fields like immuno-oncology and gene therapy.
It’s also important to note that multinational biopharmaceutical companies currently only contribute around 12 percent to the commercial outsourcing market. With this contribution expected to rise as companies look to outsource more of their commodity-based activities, it is likely we will see greater internal resource flexibility and a shift in focus towards activities that add direct company value, such as intellectual property development.
Naturally, a continued demonstration of the value-added model that Batavia and other CDMOs present to their clients will further develop the outsourcing market for biopharmaceutical development and product manufacturing, making life-saving treatments cheaper for everyone.
In a world increasingly defined by long working hours and fierce competition, optimum productivity reigns. Whether it’s at work or in our personal lives, reaching peak performance – and doing so efficiently – is the dream. This ambition to rise above the crowd is particularly pronounced in cities famed for creativity and technological development; in Silicon Valley, it takes on a life of its own.
The West Coast is a bubble of sorts: Silicon Valley is home to many of the world’s brightest engineers, technicians and developers. It is also regarded as something of a start-up paradise, attracting thousands of ambitious entrepreneurs each year as they bid to change the world one platform or device at a time.
In this highly competitive space, workers are forced to push themselves to the absolute limit. But to these software engineers and CEOs, achieving optimum productivity isn’t an unattainable goal: it’s an equation ready to be solved. No longer limiting themselves to formulas of the technological kind, individuals in Silicon Valley have turned their attentions to matters of the body.
Food for thought
In terms of mainstream attention, ‘biohacking’ first came to prominence in 2014 with the introduction of Soylent, a meal-replacement drink created by Silicon Valley engineer Rob Rhinehart.
Tired of eating fast food at his desk – which was not only unhealthy, but also time-consuming – Rhinehart set about creating an alternative. After studying the myriad nutrients required for the body to function efficiently, Rhinehart formulated a drink that doubles as “a complete meal” – or so the Soylent website claims.
Intermittent fasting
<8 days
Typical fasting period
50%
Reduction in the risk of cancer in mice
With a grainy consistency, the drink has a purposefully bland taste, which some have even referred to as “rancid”. But flavour was never part of the equation: the cessation of hunger is Soylent’s primary purpose.
And if the name sounds familiar, it’s because it was indeed inspired by the 1973 film Soylent Green, a sci-fi thriller that centres on a ‘miracle food’ – which, as it transpires, is made from human corpses.
The name, however, hasn’t put investors off: in March 2017, a group of venture capitalist firms invested $50m into Soylent, adding to the $22.4m secured in previous funding rounds. Among those to invest were major players GV (formerly Google Ventures), Tao Capital Partners, Lerer Hippeau and Andreessen Horowitz. Clearly, there is an appetite for products that circumvent the rules of nature in order to maximise individual output.
“We are in a culture where optimisation is highly valued,” Dr Niketa Kumar, a clinical psychologist based in Silicon Valley, told The New Economy. “Biohacking is marketed as a method for gaining a competitive edge. We are also in a culture where productivity (output in relation to effort) is lionised.” It’s perhaps unsurprising, then, that various other forms of biohacking have entered the Silicon Valley zeitgeist since Rhinehart concocted Soylent.
Power hungry
Perhaps the most popular form of biohacking is intermittent fasting. Silicon Valley’s movers and shakers are now abstaining from food for days at a time in order to optimise both their professional and personal lives.
One of the loudest voices on the scene is Geoffrey Woo, CEO and co-founder of HVMN, a start-up specialising in biohacking. Woo told The New Economy: “While building HVMN, I began tinkering and investigating all sorts of self-experiments and ‘biohacks’ to improve cognitive and physical performance.”
Silicon Valley’s competitive nature encourages some to go to extraordinary lengths to not only succeed, but to become the very best
Inspired by TED Talks on how fasting can jump-start the growth of new brain cells, as well as the work of Dr Valter Longo on the longevity benefits of such a routine, Woo began fasting.
“I was already primed to give it a go, and jumped into doing 60-hour fasts (Sunday dinner to Wednesday breakfast),” Woo said. “At first it was very difficult, but eventually the entire company started fasting together.”
Participants fast for anywhere between 14 hours and eight days at a time. For most of us, the idea of not eating for days on end conjures images of distraction and exhaustion – hardly elements conducive to high productivity.
But proponents of intermittent fasting maintain that it works wonders. In fact, Woo said his own personal benefits have included “mental clarity and increased focus”.
Woo added: “Productivity is not just improved by the increased focus and clarity, [intermittent fasting] also removes the distraction of having to think about food every few hours. So when I fast, I have very long working blocks.”
The scientific theory behind intermittent fasting centres on ketones. When the body lacks calories to burn, it turns to fat for energy instead. This process produces ketones, which some consider to be a ‘super fuel’ for the brain that enhances mental clarity and performance.
There are other surprising benefits, too: studies on mice, for example, have shown intermittent fasting can extend life expectancy and improve cognitive performance.
“In mice, we extended lifespan and reduced [the risk of] cancer and inflammatory diseases by about 50 percent,” Longo told The New Economy. “In humans, we saw decreases in risk factors for diseases including cholesterol, blood pressure, inflammatory markers, IGF-1 [insulin-like growth factor 1], C-reactive proteins [and] fasting glucose.”
A 2014 study conducted by Longo and published in Cell Stem Cell indicated that prolonged fasting reduces the immunosuppression and mortality brought on by chemotherapy. More recently, a 2018 study published in Cell Metabolism by scientists from the National Institute on Ageing – a division of the National Institutes of Health – found that eating just one meal a day ensured “better outcomes for common, age-related liver disease and metabolic disorders” in mice. The study also supported Longo’s longevity hypothesis.
With a growing base of evidence to support the health benefits of intermittent fasting, it’s unsurprising the practice is rising in popularity. Through HVMN, Woo has created an online community called WeFast, which boasts some 13,000 members on Facebook alone.
The group provides advice and support to fellow online members, while Silicon Valley residents often meet in person to literally ‘break fast’ each month. With many participants wearing monitors at all times, glucose and ketone levels can also be shared, enabling individuals to ‘hack’ their bodies in every sense of the term.
Smart drugs
For those seeking a shortcut to ketone power – or, indeed, to double up during their non-fasting days – nootropics present an attractive solution. Also known as ‘smart drugs’, nootropics are supplements that can enhance cognitive function, with advocates contending they positively impact creativity, memory and motivation.
“Nootropics help by doubling down on certain metabolic pathways triggered by fasting, or supply helpful cellular building blocks or signalling molecules that aren’t typically found in a normal diet,” Woo explained.
One of the best-known players in the market is Moon Juice. According to the company’s website, its flagship product, Brain Dust, is an “adaptogenic blend of enlightening superherbs and supermushrooms” that enhances focus and mental stamina, as well as alleviates stress and promotes a positive mood.
Nootropics are supplements that can enhance cognitive function, positively impacting creativity and memory
Such products aren’t just limited to tech engineers in Silicon Valley, either: smart drugs are finding favour with students, too. According to a study of close to 80,000 people published in the International Journal of Drug Policy in August, 14 percent of participants said they had used cognitive-enhancing stimulants at least once in 2017, up from five percent in 2015.
As pressure mounts to achieve exceptional results in a saturated graduate pool, it is easy to see why more students are turning to smart drugs to instantaneously enhance their performance during pivotal moments in their studies, such as examinations.
13,000
Number of WeFast Facebook members
But some, like Kumar, have their doubts about resorting to such hasty remedies: “It is an example of an extreme ‘quick fix’ way of seeking change. I believe that, on one level, it is easier to take extreme measures than to try and make more balanced change over the long term. With biohacking, you feel something pretty quickly and people notice this new feeling and see it as an affirmation of the change they’re seeking.”
Uncanny valley
While biohacking is gaining traction in other markets, the trend continues to be concentrated within Silicon Valley – for the time being, at least. Undoubtedly, this can be explained by the area’s unique culture and exceptionally competitive environment.
“It’s akin to an intellectual battlefield where there are drastically different social and economic rewards for being the number one versus the number two start-up in a market,” Woo said. “Hence, engineers and entrepreneurs in Silicon Valley are highly motivated to optimise every single possible edge against the competition.”
Woo believes Silicon Valley’s “especially open and experimental” environment is also a factor: “[It] accelerates the translational work necessary to bring research and data buried in scientific literature and translate it to best practices. What might have taken a concept years to broadly disseminate now takes weeks given the interconnectedness of the Silicon Valley communities.”
The tech hub’s competitive nature, meanwhile, encourages some to go to extraordinary lengths to not only succeed, but to become the very best. “I think the values of optimisation, productivity and competition are rampant in Silicon Valley,” Kumar said. “I also think the culture… prioritises professional achievement above other areas of life. When self-worth depends on our success professionally, we become motivated to go to extreme measures.”
As with anything in life, quick fixes are rarely sustainable. A trend like biohacking may work for some, but it could very well be dangerous for others. With research into intermittent fasting, meal-replacement drinks and nootropics limited – particularly with regard to the long-term physical and psychological implications for different genders – it is dangerous to take their supposed benefits at face value.
“Hunger – and the pursuit of nourishment – is one of the most basic human drives,” Kumar said. “One of my concerns with biohacking is that people are teaching themselves to ignore basic signals within themselves.” Who knows what impact this could have on an individual further down the line.
Silicon Valley is home to some of the most intelligent individuals on the planet – if anyone were to be aware of the potential risks of biohacking, it would be them. And there certainly is some logic behind the practice: self-optimisation is hard to argue with.
Excess, on the other hand, is not. Perhaps a balance of eating less and taking proven nootropics – while not forgetting one of the greatest pleasures on this Earth, food – is key to not only optimum productivity, but an optimal life, too.