The Sun Belt shines for US tech and manufacturing

How the US should be divided regionally is a source of constant debate. Even the seemingly simple demarcation between south and north is unclear. Virginia’s status as ‘southern’ is disputed, despite Richmond once serving as capital of the predominantly southern Confederacy. The proper designation of border states such as Kentucky and Maryland is open to question. Texas is sometimes thought of as a southern state, though it is officially designated part of the south-west, and is occasionally treated as a region in itself. Even Delaware, the US’ second-smallest state, is divided, with its lower portion culturally southern and the upper part northern. It is, perhaps, for this reason that the US is often divided into various ‘belts’: groupings of states bound together by shared political or economic characteristics.

The strong influence of religion has earned some southern states the title of Bible Belt, while in the comparatively secular Pacific north-west one can find the Unchurched Belt. The area of land from the Atlantic north-east region to the Great Lakes, once known as the Factory Belt has, since the 1980s, been referred to as the Rust Belt. The former slave states from Virginia down to eastern Texas were once collectively known as the Black Belt, owing to their large African-American population, while the dominance of maize growing from Illinois to Nebraska and the Dakotas earned the region the name of Corn Belt. Louisiana, Mississippi, Arkansas and Texas together form the Rice Belt – again, owing to the dominance of that particular crop. The north-east is sometimes referred to as the Salt Belt, due to its need for salt to apply to icy and snowy roads in the winter, while the unusually high incidence of cardiovascular disease has won the south-east the unfortunate title of the Stroke Belt. The frequency with which these belts are spoken of varies, of course: some have reached the point of cliché, while others have fallen out of use or have been rendered obsolete by changing trends.

The new south
Of all the belts that crisscross the US, the largest is the Sun Belt. As with all US regional designations, the exact borders are disputed, but in its widest definition, the Sun Belt stretches from the Carolinas and Florida in the east, right down to Georgia, Alabama, Mississippi and Louisiana in the south-east, and through to south-western states such as Texas, New Mexico and Arizona, and as far as Nevada and California. Stretching from the Pacific to the Atlantic, the region houses around 33 percent of the US population.

The term rose to prominence in the 1970s in reference to the politically changing and resurgent southern third of the US. Kevin Phillips, a political writer and Republican Party strategist, first used the term in his book The Emerging Republican Majority to describe the social trends and political changes that helped Richard Nixon win the 1968 presidential election. The term referred to states that formed the lower ring of the US; states that were becoming increasingly important at the time, both politically and economically.

The south-east of the US has historically been economically poor. However, in the mid-20th century, many of these states started to see rising fortunes, and, along with their south-western neighbours, came to form the Sun Belt. The reasons for this economic boom are varied: the invention of air conditioning made the warm climate more tolerable to live in, while federal spending programmes provided a much-needed spark to economic development. The region enjoyed strong growth in the post-war decades, around the same time the once-prosperous north-east and mid-west were turning into the Rust Belt.

The south-east of the US has historically been economically poor. However, in the mid-20th century, many of these states started to see rising fortunes

Much of this economic growth came from the technology sector. At the onset of the Cold War, the vast and undeveloped Sun Belt states formed the perfect location for the expansion of the defence industry. Heavy investment was pumped into research centres around the Sun Belt, alongside the sprawling university complexes of states such as California and Texas. As Bruce Schulman noted in his From Cotton Belt to Sun Belt: “Research parks…received billions of dollars in defence contracts and attracted scientists and engineers from across the nation.” These science and tech hubs created by military investment soon generated a buzz within the consumer tech economy, which by the 1980s had given rise to California’s Silicon Valley and Texas’ Silicon Prairie.

Certain states, such as Florida, Arizona and New Mexico, also saw an influx of elderly people from the north of the country. A combination of the Sun Belt’s mild winters and its newly acquired air conditioning capacity meant pensioners looked to these states as ideal places to retire. Likewise, tourism grew to become a major industry for many of the states, particularly Florida, Nevada and California. In Arizona, too, local politicians were able to successfully lobby for federally supported light industry. However, the technology sector – largely growing out of military defence dollars – has remained the Sun Belt’s most successful industry, and the key to its development.

The boom was both the result and cause of increasing migration to the area from northern regions: for parts of the Sun Belt, it was the first serious inward migration in over a century. Previously, southerners tended to migrate northward in hope of increased prosperity, but the reverse became true, signalling the shifting centres of economic power in the US. Now, the region accounts for a third of the entire US population – around 109 million people.

For decades, the south-east languished in the shadow of the north’s success, stuck in a perpetual sense of backwardness. At the same time, the south-west was an outpost, a dusty pioneering edge of the country – Arizona and New Mexico only received statehood in 1912. But together as the Sun Belt, these regions, by the 1980s, had come to be seen collectively as the most dynamic and important part of the country.

The region transitioned from a “south of boll weevils, magnolias, juleps and sharecroppers into a Sun Belt of skyscrapers, dealmakers, military bases and air-conditioned comfort”, wrote Schulman. The region even gave the south a new identity; in the mid-20th century, the south was still confronting its legacy of racism, and was seen by much of the country as socially and economically backward. However, as Schulman observed, “the idea of the Sun Belt offered the [region] a fresh identity, linking it with the west”. As one analyst put it, according to Schulman: “Yankee executives might balk at moving to the south, but they might seek to locate to the Sun Belt.”

Shining bright

The region was hit hard by the economic recession, stemming from the collapse of the US housing market. Between 2010 and 2015, however, Houston, Dallas, Miami, Phoenix and Atlanta saw their populations grow by a collective 1.57 million. This increase was made up of both domestic and international migrants. What’s more, in every non-recession year of the 21st century, the fastest-growing city in the US has been in the Sun Belt. Economic and employment growth has also been strong: as the urban studies theorist Richard Florida recently noted in an article for CityLab, among US counties with leading business and job growth, many “are located in large Sun Belt metros like Miami, Tampa, Orlando, Phoenix and Charlotte”.

When it comes to employment in the US technology sector, the Sun Belt continues to keep pace with the rest of the nation. According to CompTIA’s 2016 Cyberstates report, though California predictably had the highest number of tech industry workers in 2015, Texas was in second place, with over 585,000 tech employees – nearly twice the number in New York, which was in third place. Florida came fourth, beating education and research powerhouse Massachusetts.

The region was hit hard by the economic recession, stemming from the collapse of the US housing market

According to a study by the Martin Prosperity Institute, Rise of the Urban Start-Up Neighbourhood, Californian neighbourhoods in San Francisco, Los Angeles and San Diego, alongside those of Boston and New York, receive the largest amount of venture capital funding for start-ups. As the study noted: “The top 20 neighbourhoods or zip codes for venture investment include nine in San Francisco, five in San Jose, three in Boston-Cambridge (one in suburban Waltham and two in Cambridge close to MIT), and one each in San Diego (close to the University of California, San Diego), Dallas and New York (close to New York University).” Even outside California, the Sun Belt does well in this regard: Frisco in Dallas receives 1.11 percent of all venture capital in the US, while the Dunwoody zip code in Atlanta is among the top 20 recipients of venture capital for software start-ups.

That said, this success does not extend to the entire Sun Belt region. The tech sector, according to the study, contributed only 1.8 percent of Louisiana’s total GDP, and 2.6 percent of Mississippi’s. In contrast to other Sun Belt states, tech accounts for 10.5 percent of California’s GDP, 5.6 percent of Florida’s, 7.9 percent of Georgia’s and 6.2 percent of Texas’. Heavy investment in tech flowed into California, Texas and Florida for national defence reasons in years past, forming technology hubs that soon saw their benefits transfer to the civilian economy. For other regions in the Sun Belt, however, the benefits of the tech industry have not been felt.

Some states have prospered without cultivating a tech sector at all, while others have growing tech industries that have yet to turn them into technology hubs. South Carolina, for instance, saw employment grow by 3.3 percent in 2015, primarily on the back of its expanding automotive and aviation manufacturing industries; a $500m manufacturing operation in the Charleston area, potentially employing nearly 3,800 workers, is now being planned by Volvo and Mercedes-Benz. Nevada, meanwhile, has a relatively small technology sector, its economy traditionally dominated by gambling, mining and agriculture, but its tech contribution is set to grow as the state becomes home to world-leading data centres that power the cloud economy.

Setting sun

Since the start of the post-recession recovery, many Sun Belt states have been seeing the sort of success – in terms of population and job growth – that led to them being grouped under the same umbrella in the first place. However, the constituent states’ success and growth has increasingly diverged in both pace and direction. California, Texas and Florida have all seen, and continue to see, a powerful technology sector boost their economies. Georgia also seems to be cultivating a tech start-up culture, particularly in Atlanta. Other states, however, are going down different roads: Nevada may be seeing some gains in its tech sector, but South Carolina is becoming an increasingly important manufacturing hub.

At the same time, the gap in prosperity and overall economic development between states such as Mississippi and the Sun Belt’s tech hubs shows huge differences exist in this disparate collection. What appears to tie the Sun Belt together is the fact it is still on its way to becoming the centre of both political and economic gravity in the US. That said, some aspects of unity in the region are beginning to fray: California’s tech industry is more likely to be spoken of alone, as Silicon Valley, rather than as part of the Sun Belt; Texas’ fortunes are increasingly tied to fellow energy-producing states to the north, such as Oklahoma and the Dakotas; North Carolina’s growing biomedical research sector is making it a region in its own right; and South Carolina’s manufacturing surge is now seen as part of a distinctively south-eastern reshoring phenomenon. As the US economic recovery continues and the southern third of the country surges ahead, the sun may be setting on the Sun Belt as a regional grouping, but as long as its member states continue to move from strength to strength, the end will not be mourned.

Conscious consumerism might be nothing more than lip service

Despite scarcely getting a mention 20 years ago, ‘ethical’ products are now plentiful and commonplace on supermarket shelves. For seemingly every item, there exists a cleaner, greener or more sustainably sourced alternative. Customers can now get the product they want without compromising their morals in the process. Entire businesses have emerged on the back of the ‘ethical’, or ‘conscious consumer’, revolution – aiming products at consumers who tailor the majority of their purchases to the ethical significance of the product, not just its quality or function.

It has even become a marketing tool, with many businesses choosing to construct their entire brand identity around the needs of the conscious consumers they hope to attract. Symbols representing ethical organisations, and certifications advertising how products are made, now feature as prominently on packaging and in advertising as the product itself.

However, despite its best intentions, ethical consumerism is extremely limited in what it can achieve. While supporting products that feature these certifications may make a small difference, ultimately, it is not enough to prompt widespread social change.

Many businesses choose to construct their entire brand identity around the needs of the conscious consumers they hope to attract

Tied hands
Conscious consumerism is based on the idea that the collective purchasing decisions of a group are able to influence the wider market. If enough people buy Rainforest Alliance coffee, for example, then companies will be forced to take notice of this shift in the market, and make higher ethical standards the norm. By reflecting their personal values in their purchasing decisions, a group can make the world a better place. Unfortunately, the global market is more complicated than that.

Dr Terry Hathaway, a political economist at the University of York, suggested consumers do not necessarily have the power they think they do, with many broadly unaware of what they are actually supporting.

“We buy many, many different things over the course of a day, and the amount of information that you need to know to make the best choice, or even a reasonably good choice, based on these things is impossibly large. I mean, you start looking at supply chains. What things are manufactured in China that become feed stocks for certain things?”

Even seemingly simple products can be composed of a number of individual elements, and, inevitably, these components are sourced from a diverse range of suppliers and locations. While a customer may be confident they are promoting one positive action, there may be dozens of negative implications they support indirectly and unwittingly.

Hathaway said his classic example is the Fairtrade chocolate bar. While purchasing one may be considered a vote for supporting the fair payment of cocoa farmers around the globe, the customer is likely ignorant of the multitude of factors that went into its production and distribution. Finer details, such as the production of the wrapper, the working conditions of the driver who delivered the product, and the parent company’s other product lines, may not mirror the ethical standards the product itself promotes.

Another issue with conscious consumerism is the lack of clear feedback a company receives regarding the purchase of its product. Hathaway explained that, to a large company, a purchase is just a purchase, and the rationale behind choosing one product over another is often unknown.

“You could buy a car because it’s blue, but how the corporation or company understands that is a different thing”, Hathaway said. “You can purchase something for any number of idiosyncratic reasons, and all the company knows is that you purchased it.”

In order to encourage the production of more responsible products from companies, Hathaway suggested consumers might be better off making their ethical preferences known through responses to consumer surveys. However, in order for this to work, consumers would need to follow through on their intentions.

You can purchase something for any number of idiosyncratic reasons, and all the company knows is that you purchased it

Following through
In most surveys, customers show a particular fondness for ethical and sustainable products. In Nielsen’s 2015 Global Corporate Sustainability Report, the company painted a seemingly straightforward picture of how customers value sustainability in their brands. Across 30,000 consumers in 60 countries, 66 percent of respondents said they were willing to pay more for sustainable goods. The survey also reported that people who were earning less were more willing to pay for fair-trade products than those on higher salaries; consumers who earned under $20,000 per year were five percent more likely to say they would pay a premium for ethical goods than those in higher wage brackets.

However, the startling reality is that, while consumers may be enthusiastic advocates of ethically produced goods, the majority of the time, they do not follow through on their intentions. Professor Timothy Devinney, co-author of the book Myth of the Ethical Consumer, illustrated his findings in an article published by The Conversation in 2011.

Devinney’s research showed people are far less likely to choose an ethical product if it lacks some core functionality. In another study, Devinney’s team found people were more likely to choose an ethical product if they were prompted to, and knew their choices were being observed. For items that were the same price, 70 percent of people chose the ethical product when reminded of product ethics. However, if people were left to their own devices and given anonymity, less than one percent chose the ethical product. While customers may like the idea of conscious consumerism on paper, the majority are unwilling to bear the cost.

People are far less likely to choose an ethical product if it lacks some core functionality

Single choices
Despite these shortfalls, conscious consumerism can prompt some degree of market change, but only in specific products that have limited variables, like the recent push for consumers to choose cage-free eggs.

“An egg is an egg, and you can either buy this egg or you can buy that egg, and the differences are price and production methods that are all clearly laid out for you”, said Hathaway. “In those situations, I think the market can effectively communicate an ethical approach that’s good – an ethical approach as desired.”

Hathaway conceded that, in this case, it would only take a single level of abstraction, the purchase of mayonnaise for example, for the customer to be left in the dark once more.

However, another more significant opportunity for change is available to companies that are making purchasing decisions. Hathaway said, during the tendering process, companies have far more insight than the average consumer might.

“You’ve got an established social relationship there so you can, as a business, actually go around and say ‘look, these are our requirements, this is what we want, and what evidence do you have of that?’ – it’s a much more active way of relating to the thing that you’re buying from than when you go to the supermarket or when you buy a car.”

For individual consumers, partial ignorance may be the answer. Simply making purchases that ‘feel’ right is likely to have a positive impact overall, even if this impact is limited. Buying green energy, for example, is a clear sign of support for renewables, even if the working conditions of the builders constructing the turbines are unknown.

While not a completely pointless endeavour, conscious consumerism alone is not enough to prompt the widespread, substantial changes ethical consumers hope to achieve.

Maintaining the arts and social sciences should be a higher education priority

As I look at the recent unrest occurring in the western world, I cannot help but believe a perceived lack of opportunity is the primary driving force for these sad events. This perception is being exacerbated by the internet: every day, on their phones, individuals are able to see the differences between the ‘haves’ and the ‘have-nots’, and the social and economic privilege wealth brings. Because a clear, attainable path to advancing their economic and social position is not apparent, the current state of society is perceived by many as unfair and discriminatory. This discontent can fuel actions that, to say the least, are destructive and, in some cases, murderous. While the feeling of exclusion in society is not an excuse, it does provide insight into actions that are needed to create a more harmonious world, and perhaps redefine the mission of our institutions of higher education.

In the last two decades, the role of public universities has evolved. To a large extent, public universities have had their mission redefined by their respective stakeholders, turning them into institutions whose primary objective is to act as accelerators for the economy. Lost in this transition is the fact institutions of higher education are the primary and most effective tool for enabling upward mobility. In the drive to reduce the cost of higher education while increasing the direct economic impact of university activities, a more hardline business model for funding investment is being applied. The result is that many programmes, in particular those related to the arts and social sciences, are seen to have marginal value and, as such, receive limited investment in resources. What is missing in this analysis is the longstanding benefit universities can provide to society beyond the direct translation of technology.

By focusing disproportionately on knowledge creation and translation, the impact of universities on society is diminished

Public universities need to satisfy a number of objectives. First, they provide a means to educate the population. They also act as a critical medium for the expansion of our understanding of the world beyond the context of local experience. This knowledge enables university graduates to interact and successfully compete internationally. The second objective of a public university is to create new knowledge. This helps drive industrial development and, almost as importantly, enhances pride in the community. It helps define a community’s position on the world stage, while advancing economic prosperity. Third, and most importantly, public universities provide the path to upward mobility. A university is a place where aspirations are encouraged and realised; a place where dreams can start to become reality. It is this third feature that truly differentiates public institutions from private ones.

Aim higher
By focusing disproportionately on knowledge creation and translation, the impact of universities on society is diminished. Devaluing the arts and social sciences discourages students from learning the lessons of history and recognising how the past has shaped the current world. Without this knowledge, prior societal experiences cannot be leveraged to realise new opportunities. This lack of educational breadth also stifles the pursuit of an understanding of differences in cultures and value systems, differences that can directly impact both personal and business transactions. Lastly, in failing to foster an appreciation for artistic creativity – the defining feature of humanity – an important tool for nurturing global partnerships and understanding is lost.

The net result of failing to place appropriate emphasis on non-technical subject matter is the graduation of students who are not well equipped to interact and compete globally. It directly impedes their understanding of people from different cultures. Beyond failing to recognise market opportunities, this ignorance can lead to missed chances to prevent conflict and promote growth. All students should be given the opportunity, and actively encouraged, to have in-depth experiences in the soft sciences. Discounting the value of investing in and financially supporting advances in these fields is poor economics that ultimately devalues the societal investment in higher education.

Let me in

Of even greater concern is access to higher education. A university education is the single most important governmental tool for promoting socio-economic upward mobility. It provides hope and promise. It is the defining brass ring of opportunity. This is the reason public universities were created. They were funded by communities with the goal that the next generation, regardless of financial circumstance or heritage, would have the ability to access better paying jobs and an improved quality of life. Entry to the university would only be limited by personal talent and drive. Unfortunately, the reality is the opportunity for many to attend university is blocked because of a lack of resources or poor primary education systems. Not having access to higher education quashes aspirations for a better life, often generating resentment against those who are more prosperous. This resentment can take many forms, including active expressions of intolerance.

An active reaffirmation of the role of public universities in society is necessary if we wish to build a more harmonious world and capitalise on its potential. Reasserting the optimism of a promise of a better tomorrow for all is essential if we wish to restore societal stability. Public universities are the primary vehicle with which to achieve this reality. This requires that universities, and those who fund them, expand their metrics for measuring success beyond the goal of advancing economic prosperity: a new vision that is focused on improving societal prosperity should be adopted. We need to appreciate both that investment in non-technical subject areas has great value, and that we need to redouble efforts to enable access and success for all who have the talent to attend a university. These are critical investments. Public universities can be society’s vanguard, not only in delivering technology to drive and advance the world’s economic engine, but also in expanding inclusiveness, hope and the promise of a better future for all.

Climate causes fall in global wine production

The International Organisation of Vine and Wine (OIV) has estimated a fall of five percent in the global production of wine in 2016. In a new report, the group attributed the fall in output to ‘climatic events’ significantly impacting southern hemisphere crops – in particular those in Chile and Argentina.

In the report released this week, the OIV put total global wine production for this year at 259 million hectolitres. This figure represents one of the lowest outputs recorded in the last 20 years.

According to the report, Italy, France and Spain – the three largest international producers of wine in terms of volume – all posted falls in output. The OIV estimated that while Italy and Spain recorded only a slight drop in output, France posted a fall of 12 percent. The most significant falls in production, however, are predicted in South America – with Argentina expected to post a fall of 35 percent, while Chile anticipates a 21 percent decrease in production.

In the case of Chile and Argentina, high levels of rainfall caused a significant number of harvests to be lost. OIV chief executive Jean-Marie Aurand stated: “The El Niño climate phenomenon seems to be back in Latin America, where production was affected by fairly exceptional weather, with lots of rain.”

One positive, however, was Romania. After two years of poor harvests, the country is expected to return to normal output levels with a 37 percent rise over the previous year.

Since grapes are a crop particularly sensitive to climate conditions, their harvests have become something of a litmus test, marking the wider impact of climate change. While harvest quantities have dwindled, higher temperatures have helped improve the quality of some recent vintages.

According to a report published in the journal Nature Climate Change this year, higher global temperatures and delayed rains have pushed harvests in France forward by as much as two weeks – with early harvests generally considered to produce higher quality vintages.

Despite this, many will fear there is a tipping point approaching, with higher temperatures posing the risk of making some varieties of grape unsuitable for the regions they are most popular in.

The Impossible Burger that will revolutionise food-tech

Since it first appeared on the menu at New York’s trendy Momofuku Nishi back in August, the Impossible Burger has become the city’s most famous patty, drawing in crowds of food critics and meat-lovers alike. With its tender texture, pinkish medium-rare hue and succulent flavour, the sought-after burger seems like the perfect carnivorous feast. Meat, however, is the one ingredient you won’t find in the Impossible Burger.

Marketed as a ‘meaty masterpiece’, the pioneering veggie burger is said to be indistinguishable from real beef. After five years of extensive scientific research into the burger-eating experience, former Stanford University biochemistry professor Patrick Brown and his team at Impossible Foods have finally cracked the perfect plant-based formula. By analysing beef at a molecular level, the Silicon Valley start-up has managed to pinpoint what gives a burger its juicy, meaty flavour: an iron-rich molecule called heme.

The iron-rich molecule is found in abundance in animal muscle tissue, and is what gives red meat its characteristic colour and slightly metallic taste. Luckily for the team at Impossible Foods, heme can also be found in plant cells, allowing burger engineers to extract, ferment and inject the molecule into patties. As hungry diners bite into the Impossible Burger, the added heme causes the patty to ‘bleed’, fooling even staunch meat-eaters. The burger’s place on the menu at the proudly carnivorous Momofuku Nishi confirms its quality, particularly as this is a restaurant that once defiantly declared: “We do not serve vegetarian-friendly items here.”

“We are seeing the beginning of a food-tech revolution”, said Maria Lettini, Director of the Farm Animal Investment Risk and Return Initiative (FAIRR). “Companies are becoming innovators, and reports suggest that there’ll be a year-on-year increase in the plant-based protein market of over 8.5 percent for the next five years.”

Beast impact

The runaway success of the Impossible Burger is the latest in a long line of breakthroughs for plant-based protein alternatives. In 2014, the global meat substitutes market was valued at $3.4bn, with big-name investors such as Bill Gates rushing to fund research. Now, as meat replacement creators compete for valuable venture capital, they are united by a common goal: to create a future for food that is better for the planet and better for people.

For meat alternatives to really take off, they simply need to be more readily available and affordable

Global meat consumption is growing at an unprecedented rate. In the early 19th century, the average person consumed just 22lbs of meat a year. By 2013, that figure had risen to an incredible 95lbs annually. According to the World Health Organisation (WHO), annual meat production is projected to further swell from 218 million tonnes in 1997-99 to 376 million tonnes by 2030. As this demand for meat continues to grow, so does the environmental toll of global animal agriculture. Raising animals for food uses an unsustainable amount of land and water. Just one kilogram of beef requires 150 square metres of land, while a single quarter-pounder hamburger uses the same amount of water as two months of daily showers. Eating into the planet’s resources at an alarming rate, animal agriculture has emerged as the leading cause of deforestation, species extinction, water pollution and ocean dead zones.

On top of this, the pollutant gases from meat farming are threatening to push the planet towards an unprecedented environmental catastrophe. Shockingly, animal agriculture is responsible for more greenhouse gas emissions than all the world’s cars, trains, ships and planes combined, with farming emissions projected to rise a further 80 percent by 2050.

Clearly, meat farming is making the planet sick, and the same can be said for its people. Last year, the WHO placed processed meats in the same carcinogenic category as plutonium, while also publishing evidence that links red meat to diagnoses of cancer, heart disease, diabetes and obesity.

Despite such troubling health and environmental risks, we can’t expect everyone to become vegetarian. For many, eating meat is not just a part of their daily diet, but also plays an important role in their family and social lives, with meat dishes taking centre stage at celebratory gatherings and religious holidays across the globe. Given the undeniable social significance of meat consumption, future global food habits must focus on a reduction of meat intake. As pioneering meat replacements are slowly convincing carnivores to occasionally opt for a plant-based option, meat-free meals are finally gaining momentum.

Mass-market mock-meat

For meat alternatives to really take off, however, they simply need to be more readily available and affordable. That’s where organisations like ShareAction and FAIRR come in. In an attempt to bring meat replacements to the mass market, these UK-based organisations have collaborated to bring together a $1trn coalition of investors, focused on urging powerful multinational food companies to expand into plant-based products.

As part of this initiative, the investor coalition is targeting Kraft Heinz, Nestlé, Unilever and Tesco, among other food producers, advising them of the grave financial and environmental risks tied to animal agriculture. These big-name brands will then be encouraged to establish strategies in order to respond to the risks of meat farming, with investment in plant protein at the top of the list.

“It’s only really within the past 50 years that we’ve arrived at this situation where meat and animal products constitute, for most people, a central aspect of almost every single meal”, said ShareAction’s Campaigns Manager, Clare Richards. “Food companies must realise that we simply can’t continue on this trajectory. We are asking brands to incorporate this into their planning by including protein diversity as part of their approach to sustainable diets.”

If consumer health and environmental concerns aren’t motivation enough for food companies to reassess their over-reliance on animal agriculture, then the money-saving potential of plant products may just do the trick. According to a recent Oxford University study, a global shift away from meat-heavy diets could save the world’s economy $1.5trn in climate change and healthcare costs by 2050. The report revealed a significant dietary change could also save employers tens of billions of dollars in lost working days by reducing meat consumption-related sickness.

“This is a call to action”, said Lettini. “Expenses associated with a continued and increasing consumption of animal products are soaring, and investors need to spend some time understanding what the financial risks are around this industry.”

The project may hope to see plant protein incorporated into the world’s weekly shop, but both ShareAction and FAIRR accept animal products won’t be banished from supermarket shelves anytime soon. “This isn’t about telling people that they can’t have meat; it’s about moderation and moving things in a more sustainable direction”, said Richards.

Sustainable horizon
The evidence points to an uncomfortable truth: quite simply, if our diets don’t change, our planet irreversibly will. With new techniques and big-name brands beginning to embrace plant protein, however, meat-free options will soon be more affordable, more widely available, and more delicious than ever. So far, food scientists have only explored around eight percent of the world’s plant proteins as potential meat alternatives, and this limited research has already given us the Impossible Burger – and we’ve not even mentioned the critic-approved Beyond Meat chicken taco and the internationally acclaimed Vegetarian Butcher. As investors turn their attentions to this largely untapped market, plant-based proteins may soon usher in a new sustainable future for food.

Tesla sets its sights on fully autonomous vehicles by 2017

All new Tesla vehicles will feature hardware capable of driving completely autonomously, Chief Executive Elon Musk announced at a press conference on October 19. In an ambitious acceleration of Tesla’s self-driving programme, the car manufacturer has begun to equip all Model S and Model X vehicles with a new hardware system, including eight cameras – to provide 360-degree visibility – and twelve ultrasonic sensors.

The software set to accompany the update is still being tested and currently awaits regulator approval. Once validated, each software update will progressively increase the vehicle’s self-driving capabilities – with Tesla aiming for full automation by the end of 2017.

“Self-driving vehicles will play a crucial role in improving transportation safety and accelerating the world’s transition to a sustainable future”, the carmaker said in an official statement detailing the hardware update.

“Full autonomy will enable a Tesla to be substantially safer than a human driver, lower the financial cost of transportation for those who own a car and provide low-cost on-demand mobility for those who do not.”

Tesla has sought to reassure regulators of the safety of its self-driving features after a driver using the company’s existing Autopilot mode was involved in a fatal crash in May. An investigation by safety regulators revealed that the car’s front-facing camera had failed to recognise a white truck against a bright, sunny sky – leading the vehicles to collide. The crash prompted the company to overhaul its Autopilot feature, moving the system away from a reliance on cameras and increasing the role of radar in navigation.

New Tesla vehicles will initially lack this existing Autopilot system, with current features such as automatic braking, lane holding and cruise control being temporarily disabled as the company prioritises the next generation of hardware. These Autopilot features will be enabled in instalments through ‘over-the-air’ software updates as the company works on validating the new system.

After watching stock prices fall in recent months, the car manufacturer has been setting increasingly ambitious deadlines for its autonomous technologies in an attempt to maintain investor confidence. Yet, as regulators continue to scrutinise Tesla’s existing Autopilot systems, the company’s new self-driving software could well face a lengthy approval process.

Ingenuity Lab advances to second stage of XPRIZE competition

This year’s XPRIZE event sees innovators from around the world compete to transform CO2 emissions into high-value products. Ingenuity Lab’s team of 14 has proposed converting CO2 waste emitted from a natural gas power plant into usable chemical products.

The group is up against semi-finalists from Canada, China, India, Switzerland, Scotland and the US – including carbon capture technology companies, academic institutions, non-profits and startups. Some of the solutions they’ve put forward include converting CO2 into concrete, biofuels, toothpaste, nanotubes, fish food and fertilisers.

The Carbon XPRIZE competition was launched in September 2015, and aims to combat global CO2 emissions by incentivising scientists to change them from a liability into an asset.

In the second round of XPRIZE, Ingenuity Lab Carbon Solutions and 26 other teams will have to demonstrate their technology at a pilot scale, at their own chosen location – using either real flue gas or a simulated flue gas stream.

Over a 10-month period, teams will then be monitored according to how much CO2 they convert, and the net value of their products. Teams that score highest will share a $2.5m purse, and move on to the finals of the competition, where they will demonstrate their technology at real-world plants.

Ingenuity Lab provides a platform for interdisciplinary research and innovation at the intersection of materials science, nanotechnology and informatics. It hopes that with its carbon technology it can greatly reduce the amount of CO2 released into the environment, while simultaneously providing increased economic opportunity.

Speech recognition software now as good as humans

Microsoft has developed a piece of software that can transcribe a conversation as accurately as a human, in a significant breakthrough for artificial intelligence systems. The software not only listens to words, but also places them in context, allowing for more accurate transcriptions.

The latest program from Microsoft’s research team is capable of transcribing a conversation with a word error rate of 5.9 percent, a figure comparable to the error rate of a professional transcriptionist. Xuedong Huang, Microsoft’s chief speech scientist, said in a statement this represents a historic achievement.
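Word error rate of this kind is conventionally computed as the word-level edit distance between a reference transcript and the system’s output, divided by the length of the reference. The sketch below illustrates that standard calculation – it is not Microsoft’s implementation, and the example sentences are hypothetical:

```python
def word_error_rate(reference, hypothesis):
    """Word error rate: (substitutions + deletions + insertions) / reference length,
    computed via word-level Levenshtein distance with dynamic programming."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return dp[len(ref)][len(hyp)] / len(ref)

# 'a' misheard as 'the': one substitution in a five-word reference -> 0.2
print(word_error_rate("give me a quick answer", "give me the quick answer"))
```

A single substitution of the kind the researchers describe – ‘a’ for ‘the’ – in a five-word sentence yields a rate of 0.2, or 20 percent; the reported 5.9 percent corresponds to roughly one such slip every 17 words.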

The result doesn’t represent perfect speech transcription, but rather offers something very close to the way humans mishear fragments of conversations. Mistakes are generally quite straightforward, such as confusing ‘have’ for ‘is’ or ‘a’ for ‘the’. Transcription mistakes from both humans and this system come from minor misinterpretations of sentences rather than a physical mishearing.

The breakthrough that led to this achievement is the use of a neural network and the grouping of words that not only sound similar, but also have similar meanings. For example, the words ‘fast’ and ‘quick’ are close together in the virtual dictionary of the neural network, since the use of one increases the likelihood of the other. This lets the system generalise meaning in the same way a human might.
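The ‘close together in the virtual dictionary’ idea can be illustrated with cosine similarity between word vectors. The three-dimensional vectors below are hypothetical toy values, not the embeddings Microsoft’s system learned – real embeddings have hundreds of dimensions – but the geometry is the same:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: near 1 means 'pointing
    the same way', i.e. the words behave similarly in context."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical toy embeddings: 'fast' and 'quick' point in nearly the
# same direction, while 'slow' points elsewhere.
embeddings = {
    "fast":  [0.90, 0.80, 0.10],
    "quick": [0.85, 0.82, 0.12],
    "slow":  [-0.70, -0.60, 0.20],
}

print(cosine_similarity(embeddings["fast"], embeddings["quick"]))  # close to 1
print(cosine_similarity(embeddings["fast"], embeddings["slow"]))   # negative
```

Because ‘fast’ and ‘quick’ sit so close together, hearing one raises the predicted probability of contexts associated with the other – which is how the system generalises without understanding.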

What the system cannot do is understand what it is listening to. While it is able to accurately transcribe speech, it does not understand what is being said, so cannot, for example, answer a question.

The primary use of the software is likely to be in Microsoft products that use speech recognition, like the Xbox and the Cortana virtual assistant. The next stage for the research team is to modify the system so it can still function in places with a large amount of background noise, or listen to multiple voices.

Molok’s innovative waste system allows for cleaner, fresher city streets

Waste disposal is an escalating challenge that continues to afflict society, with no sign of abating. Landfills are filling at a rapid pace, rising in number and taking up invaluable space that simply cannot afford to be wasted on waste. Likewise, refuse continues to burden local government authorities, drawing considerable resources for collection, organisation and disposal. Aside from the fiscal drain, myriad impracticalities exist. For instance, our current system causes a near-permanent stench in the streets of many world cities, which is detrimental to business as well as residential quality of life.

Despite the vast sums ploughed into waste management, a viable alternative to the status quo still does not exist. Even with this being a communal problem that affects every single country in the world, we are still – in some respects – living in the Stone Age.

New approach
The main purpose of waste management is to keep the micro and macro environment clean, thus preventing the spread of bacteria and disease, while also reducing pollution levels. In terms of basic human requirements, it sits just behind the need for shelter, water and nutrition. Though developed countries have seen some progress in terms of recycling, many nations still endure insufficient waste facilities – or a total lack of them – due to financial constraints.

In order to progress and develop, we need an alternative system. Of course, this requires an innovative, forward-thinking approach that is not focused on the bottom line, but instead makes the environment the primary concern.

Thankfully, Finnish waste management specialist Molok has a solution that fits this bill. Molok’s system uses gravity to do most of the work and reduces the number of collections by more than 50 percent, while completely eliminating the unpleasant odours usually so inextricably linked to waste.

The Molok system for waste disposal was originally developed in the 1980s, with the company itself being officially established in 1991 in Nokia, Finland.

New wave
Traditional waste collection started with small bins that were collected daily or, at the very least, weekly. Then things developed to cope with population growth, with bins becoming bigger and more mobile. Next came the wheelie bin, a contraption that is still used to this day by millions of households around the world.

“In all cases, these developments have always been on the surface of the land, no matter what size”, explained Veikko Salli, inventor of the Molok system and founder of the company. “Before I came up with the idea for Molok, we had a horizontal capacity waste container for our new hotel that simply did not work. When the waste container was even just partly filled, much of the waste spilled and would be left scattered around the yard. The basic idea I had came from a well, but the next challenge was to work out how to empty it. It took years to develop a bag that could be emptied through a bottom that opened.

“Deep Collection is what we called the method used for Molok containers, and we were the first to use this phrase in English. Molok Deep Collection is a new method of waste collection that is now being introduced around the world.”

So successful is the Molok model that many have tried to imitate it. “As a company, Molok is actually quite small compared to the scope of its brand name”, said Salli. What sets the company apart, however – aside from being the inventor of the technology – is its continual commitment to innovation, both in terms of improving current methods and products, and in terms of creating new solutions.

“Currently, we are investing in the Internet of Things, and different types of electronic devices that can be used with Molok containers to further help our clients”, said Salli. “That said, we only focus on Molok Deep Collection products and do not venture into other areas, like so many of our competitors do, so that we do not dilute our focus and core objectives. At Molok, there is a real atmosphere of wanting to figure things out and solve problems, instead of merely manufacturing a product.”

Waste wealth
Obviously, there is a technological lag between the progression of waste management and that of society as a whole. This problem is compounded by the vast sums of money that are made from collection, and even from recycling. Our current methods, though inefficient and insufficient, suit those enterprises involved in the process. Regular collection means regular pay, while endless volumes of waste mean an endless demand for landfills, incineration and recycling.

“When it comes to the business of waste management, an important thing to understand is that most set out to make as much profit as possible from each tonne of waste – that’s always the goal. And that’s why recycling can sometimes have ugly adaptations, which actually cause greater damage to the environment”, said Salli.

Traditional systems are designed to suit heavy truck access for collection, but that does not necessarily mean they are easy to access for the people who use them. Another difficulty is having enough capacity to carry out a sufficient number of regular collections, given sprawling urban populations and escalating societal demands. While both challenges spell trouble for current systems and local governments, they inevitably translate into more business for industry players.

“At Molok, we are always thinking of how to work for the environment, as opposed to against it. With this objective at the core of everything we do, all the design rules we have originate from the laws of nature.” Specifically, Salli refers to the law of gravity. With his technology, gravity acts as a waste compressor, aided by the vertical capacity of the Molok system. However, space is only one piece of the puzzle. Having a convenient and effective point for collection is crucial as well. For this reason, Molok stations are both easy to locate and easy to use for people of all ages.

True sustainability
As the major component of Molok systems is a basic well, which is never moved, it has a very long lifetime. “Even after 25 years of experience with the Molok system, its endpoint is still a question mark”, said Salli. The system is far more environmentally sustainable than conventional alternatives, particularly as the typical lifetime for wheelie bins is around five to seven years.

Given today’s mounting demand for space in and around living areas, an investment in Molok systems generates a swift return, given that it offers more space for car parking, children’s playgrounds and green spaces, among many other things. “Nowadays, when it has become both necessary and possible to locate more capacity for housing, waste collection traffic can actually be reduced – many times by half, or even less.” Moreover, as the containers themselves are never moved, landscaping can be carried out around them, making areas more green and attractive. Finally, due to their partly underground structure, Molok containers are not affected by storms – a weather occurrence that is increasing in frequency worldwide.

Finland’s capital, Helsinki, which was rather slow in adopting Molok containers, started using the system in 1999, some nine years after it was first proposed. Then, in 2011, Helsinki introduced a new rule regarding how often Molok containers must be emptied: simply, “when it is full”.

Given the success it has had so far, Finland has now moved on to introducing recycling points using Molok technology as well. “When developing recycling in many areas, we have found that it is important to start with places that are big enough for mixed waste collection, as they are also suitable for the location of Molok bins for paper, metal and glass. This makes it far easier for people that live in the area to bring their recyclables there”, Salli explained.

Challenges ahead
Molok’s growth has been driven by the fact that end users invariably like the system. That said, there are still numerous political and practical hurdles to overcome before such a revolutionary system can spread worldwide. In the meantime, Molok is working on new ways to solve existing issues, whether through adapting products to suit a particular environment, or through tweaking business models so they work best in each individual area.

“Different countries not only have different circumstances, but cultural differences also present a challenge”, Salli said. “Our approach is to first gain an understanding of the country in question. This means that our employees spend time in that particular country to find out more about the current circumstances there. Then, it is important for us to have a local partner who knows the market intimately, and is already involved in or connected to the industry there. Depending on the country and the situation, we deal with the local partner and either export directly with a distribution partner, set up a Molok-owned distribution centre in collaboration with a partner, or enter into a licence agreement with the local partner who then manufactures the product in the country.”

Waste as we know it is not going to disappear any time soon – it is part of our lives, and its management is necessary in any functioning society. As such, it has changed little over many decades, particularly when compared to the technological progression that has taken place in almost all other facets of life. Finally, however, the future of waste management has also arrived. Due to the way the Molok system is designed, odour is practically non-existent, while valuable space is freed. In this way, it benefits the environment, local governments, residents and businesses. Times have changed, and so has society. There is no longer any reason for waste management to remain stuck in the past – the technology exists, and we need only learn to embrace it.

VWM’s ColiMinder system allows for real-time microbial water monitoring

A breakthrough technology for microbiological online monitoring is paving the way for a technological leap in the water sector. This new technology changes the measurement of bacterial contamination from a 24-hour lab procedure to a 15-minute, fully automated measurement, enabling microbiological online monitoring and process control.

Microbiological contamination is one of the most important quality parameters of water. Without the ability to perform microbiological online monitoring, all existing processes in water treatment addressing microbiological contamination have to be built as oversized and rigid processes. These processes are, of course, neither efficient nor sustainable. They require extensive amounts of chemicals and energy in order to guarantee desired output quality, without any knowledge of the actual contamination levels they have to address. Solving this problem is a huge leap forward for the industry.

Petri dish to IoT
In contrast to traditional lab methods, which use the formation of colonies to measure contamination, the new technology works by measuring metabolic (enzymatic) activity related to the respective target organism. Both methods (traditional cultivation and enzymatic) use genuine properties of the living organisms as a measure, which differentiates this new technology from other methods developed for rapid microbiological contamination measurement.

The short measurement time of 15 minutes is extremely important, as it allows the measurement device – named ColiMinder – to be deployed not only in process monitoring, but also in process control.

Microbiological contamination is now available as a real-time parameter, measured automatically. The measurement device is controllable through an internet connection, the results being viewable online and in real time. Based on this new online availability, existing processes can be conducted much more efficiently and securely. In addition, new processes for water treatment can be developed and optimised.

Controlled disinfection is a good example to demonstrate the potential of the new technology. The dosage of chlorine or the amount of ozone used in the disinfection process should address the contamination actually present [blue line in Fig 1]. Without information about the actual contamination level, the disinfection intensity must be at a level that guarantees sufficient disinfection [green line in Fig 1], covering the highest contamination peaks. Measuring the contamination level online, however, allows us to exactly address the actual contamination level [red line in Fig 1], which leads to a reduction in process costs of up to 50 percent. At the same time, the result of the disinfection can be monitored online to safeguard the process. The same, of course, applies in various other processes, like vegetable washing lines or biocide dosage in cooling water.
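The dosing logic described above can be sketched in a few lines. The hourly contamination readings below are hypothetical, and the ‘dose’ is taken as directly proportional to the reading; the point is only to show how dosing to the measured level, rather than to the worst-case peak, cuts chemical use:

```python
# Hypothetical hourly contamination readings (arbitrary units) over 12 hours.
contamination = [40, 45, 50, 60, 100, 80, 55, 45, 40, 35, 30, 20]

# Fixed-dose strategy: every hour is dosed for the worst-case peak
# (the 'green line' approach, sized to cover the highest contamination).
fixed_dose_total = max(contamination) * len(contamination)

# Measured-dose strategy: each hour is dosed in proportion to the
# real-time reading (the 'red line' approach).
measured_dose_total = sum(contamination)

saving = 1 - measured_dose_total / fixed_dose_total
print(f"chemical saving: {saving:.0%}")
```

With these illustrative numbers the saving comes out at 50 percent, in line with the reduction in process costs the article cites; the actual saving depends on how far typical contamination sits below the peaks.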

Membrane integrity
One of the fastest growing technological fields in water treatment, membrane filtration is increasingly used in desalination, sewage treatment and water purification. However, the technique cannot be safeguarded using traditional microbiological lab methods. Any change in a process, such as adding more membranes or switching from one membrane batch to another, must be evaluated. Using classic lab methods on every change in process would not be affordable, and the results would only be available 24 hours later. Membrane integrity monitoring is a real challenge, and microbiological online monitoring is the solution to it.

Key applications include wastewater treatment and reuse, surface and raw water monitoring, membrane integrity testing, drinking water treatment, and food production. It could also help fight antimicrobial resistance through contamination monitoring, and support the development of new treatments and methods in agriculture and aquaculture.

This unlocks a huge market potential in safeguarding, optimising, improving and developing new processes in water treatment and distribution. At present, reagent systems are available to measure the most important microbiological indicator organisms: E. coli, coliform bacteria, enterococci, and total bacteria.

[Fig 1]

Madhav’s solar solutions allow deprived areas to convert to renewable sources

The Indian solar PV market has exploded in recent years, fuelled by the country’s insatiable need for energy. By 2035, India’s energy demands will make it the world’s second-largest consumer of energy, responsible for an 18 percent share of global energy consumption. Of India’s population of 1.2 billion, about 70 percent live in rural locations. This fact alone creates significant opportunity for off-grid projects, and underscores the enormous potential of solar energy.

Green dream
India is on a focused mission to go green. It aspires to have the world’s largest solar PV portfolio by 2022. As Prime Minister Narendra Modi put it at the COP21 climate conference in Paris: “As the developing world lifts billions of people into prosperity, our hope for a sustainable planet rests on a bold, global initiative.”

MI Solar is based in Vadodara, Gujarat. The company is on a mission to create a self-sustainable enterprise in the solar PV sector. The challenges of this complex industry do not deter us: we have a laser-sharp focus on our business objectives, and instead of chasing every shiny object, we only go after sustainability. For us, the word ‘sustainable’ carries utmost importance in business. This ideology has developed through our association with Madhav, the parent company of MI Solar.

In business, it is easy to slip into chasing multiple projects. Financially, however, it often makes more sense to devote one’s energy to a less glamorous project that brings consistent returns. Madhav has done extensive work in India’s core infrastructure sector. The company’s portfolio ranges from public-private partnerships to engineering procurement construction projects. Successful execution of these projects comes through years of experience in the construction sector. A move into the solar PV sector came as a natural progression for the group’s solar arm: MI Solar.

India’s solar PV market is diverse and complex, and the sector features immense competition. Domestic and international investors are increasingly attracted to this lucrative area, which is a positive for the country, but makes individual excellence more difficult to achieve.

Indeed, caution is essential for anyone who wants to invest in the industry, as it does have pitfalls and downsides along with opportunities. The independent power producer subsector is large, yet fiercely competitive. Some investors have taken up large portions of investment in it, and it is notable that the few who have scaled up have done so thanks to their singular ability to execute projects.

Rising to the challenge

The Indian solar PV market has shown signs of having a bright future ahead of it. With the end goal of an evenly distributed power generation market, the federal and state governments are bringing in policy changes to bridge gaps.

On the infrastructure side, MI Solar is proud to have built large-scale plants in diverse locations across India. The highest standards of quality and adherence to completion timelines are the defining features of these projects, and the secret to their success. MI Solar even goes one step further, by acquiring the land for plant locations. This works to ensure seamless project execution, as land use permissions are frequently the cause of major bottlenecks when executing projects. MI Solar understands the Indian construction sector and the sectors associated with it, and is thus able to adapt to changing winds in the market. The same is true of the company’s knowledge of metering policies, which differ from state to state. A strong understanding of these allows MI Solar to tailor its approach for each location, maximising the bottom line.

Power to the people

A sizable portion of India’s population is not connected to the mainstream grid, and very few of these people can afford a private electric supply by way of a diesel-powered generator (the primary alternative).

Indeed, such a setup is both costly and inconvenient. Diesel, being a fossil fuel derivative, is a scarce and therefore expensive resource. Diesel-fuelled power generators are also noisy and produce a lot of smoke, which is both socially disruptive and detrimental to the environment.

To deal with this problem, MI Solar is working with industries that are diesel-dependent. This includes the mining and stone crushing industries. Due to the nature of their business, such industries are predominantly located in far-flung regions, which only adds to the difficulties they face in doing business. MI Solar has taken up the challenge to bring relief to these sectors. The first step is to remove dependence on rudimentary means of power.

A hybrid generator model, with both diesel and solar PV integration, works best in this case. Operations can run on direct solar energy, with a storage backup in place. Machines can work on stored energy, with diesel models still plugged in. This setup ensures consistent uptime and a significant reduction in costs, fumes, and dependence on fossil fuels. The successful execution of this project will have huge implications for India. Once its industrial use is established, the same generator model can be extended to the rural population for domestic use. In essence, the more industrial projects that adopt this system, the more likely it is for the technology to benefit individual households as well.

Due to the absence of crowdfunding setups in this sector, however, the path to economies of scale for micro-setups seems limited. This could slow down many ambitious plans for rural improvement. If a platform in the vein of Kickstarter and Indiegogo existed for hybrid power solutions in India, such projects could scale far more quickly, bringing the goal of a sustainable planet within closer reach.

FarmLink’s data platform allows farmers to maximise yields on any type of land

Today’s global agriculture industry is experiencing a level of change on a par with mechanisation and biotechnology. With the recent rapid growth of the agtech segment globally, data science solutions represent the next catalyst for radical innovation across the global food system. Helping to lead these efforts is the winner of the 2016 New Economy Award for Best Agriculture Solutions – FarmLink, a US data science company located in Kansas City, Missouri.

“Advanced data science and associated technology are essential ingredients to addressing the most complex challenge of our time – feeding our growing global population in a sustainable manner”, said Dan Glickman, former US Secretary of Agriculture. “With greater access to high-quality data, farmers and researchers can uncover the insights needed to increase productivity locally while protecting our natural resources in a more efficient and effective manner.”

Farming data
Seven years ago, FarmLink began building its proprietary analytics platform by connecting the managed combine fleet of sister company MachineryLink. The fleet of more than 250 combine harvesters collected real-time yield data and related calibration data during harvests across six million acres in 26 states. Introducing the first Internet-of-Things concept in agriculture, the fleet’s live data stream, paired with field and weather data, enabled the company to build proprietary data indices that included more than 50 natural variables (including sun, soil and moisture), along with accurate yield data at an unprecedented micro-field resolution of 150 sq ft. By reverse engineering these datasets, FarmLink’s team of data scientists determined the various drivers of yield on a specific micro-field, and thereby isolated the effect of product decisions and farming practices on production under comparable conditions. In extensive back-testing, the company achieved R² values between 0.92 and 0.98 when its proprietary indices were compared with historical USDA data at county level across key US growing regions.
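For readers unfamiliar with the metric, the R² values cited above measure how much of the variation in the reference data a model's predictions explain, with 1.0 being a perfect fit. The following is the standard textbook calculation, not FarmLink's proprietary method, with hypothetical yield figures:

```python
def r_squared(predicted, observed):
    """Coefficient of determination between model output and reference data.

    R² = 1 - (residual sum of squares / total sum of squares).
    A value near 1 means the predictions track the observations closely.
    """
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_tot = sum((y - mean_obs) ** 2 for y in observed)
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))
    return 1 - ss_res / ss_tot
```

Comparing a modelled index against, say, county-level USDA yield figures in this way is what allows a claim like "R² between 0.92 and 0.98" to be verified against historical records.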

“The result is that we’ve been able to create one of the world’s largest agricultural datasets as the foundation for our sophisticated data analytics platform”, said Ron LeMay, Chairman and CEO of FarmLink. “After working with farmers and agribusiness leaders, we believe it will take a combination of industry knowhow and grit, coupled with robust data science and analytics, to ensure a positive outcome for the nine billion people expected by 2050.”

The company wasted little time finding partners who could leverage the dataset and hasten innovation. This initial capability has been applied to land benchmarking, agricultural practices benchmarking, grain trading decisions, product benchmarking and other scientific and environmental tests of effectiveness within the last two years alone.

Accurate modelling
While growth in the agtech segment continues to accelerate, data science and associated technology remain at a very early stage of development for most agriculture companies, due in large part to the lack of high-quality data required for scientifically reliable, predictive models.

Since 2014, and with more than six years of quality data verified, FarmLink has been working with public and private organisations to sustainably solve complex biological, environmental and economic challenges, including how to maximise yields, how to optimise inputs, and how to minimise environmental degradation. FarmLink’s unique dataset consists of more than one billion test plots of corn, wheat and soybean yield data, which partners have been able to leverage for analytics and insights in new research and development.

“Our custom analytic platforms are easily used by agribusinesses and others to transform and amplify limited research datasets into millions of ‘product by environment’ observations. These platforms can be used to measure the value and sustainability of new technology, not only in farmers’ fields, but also those under development in R&D labs”, said Bob McClure, FarmLink’s Chief Data Scientist. “In a very short period of time, we’ve been able to help significantly improve the way data inputs in a single research project are analysed and used to create predictive models with broad relevance.”

For example, instead of simply looking at small test plots to evaluate the product performance of a new fertiliser, one of FarmLink’s agribusiness partners shifted from relying on these individual test plots to a network of field data, analysed down to 150 sq ft, and then compared product performance using FarmLink’s advanced weather and yield productivity models to achieve a broader view of potential efficacy.

“By using the enhanced product performance data, we can model how the product is expected to perform, given the soil, weather and yield forecast at the zone level”, McClure said. In addition to helping to advance research findings, this granular level of analysis ultimately allows ag retailers and farmers to understand the likely economic benefit, field by field, before buying or applying a new product.
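The underlying comparison is essentially a matched experiment: a product's effect is estimated by pairing treated and untreated zones that share comparable soil, weather and yield potential. The sketch below illustrates the idea with hypothetical data and function names, and is not FarmLink's actual model:

```python
def product_lift(zone_pairs):
    """Average treated-minus-control yield difference over matched zones.

    `zone_pairs` is a list of (treated_yield, control_yield) tuples,
    where each pair shares comparable soil, weather and yield-potential
    conditions, so the difference isolates the product's effect.
    """
    diffs = [treated - control for treated, control in zone_pairs]
    return sum(diffs) / len(diffs)
```

Because the pairs are matched on environmental conditions, the averaged difference approximates the product's yield effect rather than the effect of weather or soil, which is why zone-level matching gives a broader view than isolated test plots.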

While inputs are often essential to ensuring yield, farmers also remain committed to ensuring the soil and water health of their lands. FarmLink’s data science was recently applied to helping understand areas of opportunity for land conservation on farms. Using a proprietary algorithm based on FarmLink’s yield, weather and soil indices, the company was able to reliably predict the effect of nitrate leaching in a given field at a zone level. This information is vital to both farmers and stakeholders in pinpointing which zones are the most economically and environmentally advantageous to preserve as conservation land. FarmLink’s ability to deliver these solutions has helped farmers, NGOs and governments prioritise land stewardship, and align on optimal long-term conservation efforts that improve the sustainability of agriculture overall.

Product innovation
As today’s global economic pressures continue to stress farm operations, experts believe precision technology must come with actionable, precision data to realise the full benefits. “At FarmLink, we believe farmers in rural communities around the world deserve access to the same level of technology-driven decision making tools present in other industries today”, said LeMay.

An Illinois farmer recently recounted that, when his grandfather started farming, he only needed two things: a good work ethic and a strong back. In 2016, this young farmer said, you also need reliable data science.

The first year FarmLink started collecting data, the company said it threw away two-thirds of the data because it didn’t meet its self-imposed quality standards. “We learned that, unless you have a reliable, repeatable, and managed end-to-end process, it is highly likely the quality of data collected will not be sufficiently accurate to be actionable. In essence – worthless to farmers”, LeMay said. “As an industry focused on achieving global food security, we must commit to the rigour of data collection, management and sharing at the highest standards. As we’ve imagined new tools for farmers and others across the food system, we’ve remained committed to these quality standards.”

In 2014, FarmLink introduced TrueHarvest, the industry’s first and most sophisticated yield benchmarking tool, which allows farmers to compare and validate individual farm data at the zone level and across the farming operation. During the following two years, the company developed additional products to allow farmers and their advisors to use the actionable data necessary to select and validate crop management decisions that address unique opportunities in each field. Introduced in 2015, Discovery provides a glimpse into land potential, and is designed to help non-technical farmers interact with data science for free. The company said its goal is to help drive adoption of data science tools across the industry by providing an easily accessible, high-quality product that builds trust in this new technology.

FarmLink also partnered with DTN and The Progressive Farmer to create MarketVision. This is a tool, powered by FarmLink, that gives farmers and landowners access to grain marketing technology in one convenient place. Farmers can use MarketVision for insights into when to sell their grain, along with real-time updates on crop yield potential and risk management data. The product’s accuracy has been demonstrated over a five-year period using millions of acres of yield data: in testing the tool, FarmLink plugged in real data from various farmers around the country, and MarketVision, in most instances, was able to project corn yield within two bushels of the actual individual yields at the field level, providing a reliable base for marketing decisions.
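A "within two bushels" claim of this kind amounts to a simple tolerance check over projected and actual field-level yields. The sketch below uses hypothetical figures and is not MarketVision's actual evaluation code:

```python
def within_tolerance(projected, actual, tol=2.0):
    """Share of fields where the projection lands within `tol` bushels/acre.

    Counts each field whose absolute projection error is at most `tol`,
    and returns the hit rate as a fraction of all fields tested.
    """
    hits = sum(abs(p - a) <= tol for p, a in zip(projected, actual))
    return hits / len(actual)
```

Reporting the hit rate across many fields, rather than a single average error, reflects the article's framing: in most instances the projection is close enough to anchor a marketing decision.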

Wide appeal
While the company’s offerings benefit US farmers today, the technology is applicable to agriculture worldwide, and helps create a common language to accelerate solutions. “FarmLink connects the dots to build technology that magnifies human intent and capability. These agtech platforms can inspire the next generation of agricultural innovators to make meaningful changes in global agriculture at the farm level”, said Rikin Gandhi, founder of Digital Green, a non-profit international development organisation that uses an innovative digital platform to bring together technology and social organisations to improve agriculture in rural communities in south Asia and sub-Saharan Africa.

For example, the platform could prove to be invaluable in further improving productivity in Australia or evaluating land conservation in South America. It can be deployed to assess the value of range or pasture restoration, or evaluate specific land for agricultural or non-agricultural uses.

“As the agtech space continues to expand, FarmLink remains at the forefront of innovation. The next generation of farmers – from the US to eastern Europe to sub-Saharan Africa – will need technology and data science to prosper”, said LeMay. “FarmLink is committed to continuing to place new tools and innovation at their fingertips.”