Lego cuts 1,400 jobs following first drop in sales in 13 years

On September 5, Danish toy maker Lego announced plans to cut 1,400 jobs from its global workforce following a five percent drop in sales in the first half of 2017. This decline marks the first sales fall in 13 years.

Jørgen Vig Knudstorp, Chairman of Lego Brand Group, said in an interview: “The car has gone off the road and landed in a ditch and now we have to pull it out and get it back up to speed again,” according to The Wall Street Journal.

In the last few years, the company has been trying to adapt its strategy in line with new digital trends, becoming a benchmark of innovation in the industry. But the approach has faltered in recent months, and the company is now seeking to speed up product launches.

The transition also includes changes in management; in August, Lego appointed its second CEO in eight months. However, the company said Bali Padda’s replacement was not related to his performance. Niels Christiansen, the new Chief Executive, will take over the position in October.

Meanwhile, the company – one of the world’s largest toy manufacturers – said it would shed eight percent of its workforce. As Bloomberg described, the move is the company’s biggest cut in more than a decade, showing that “the world’s most profitable toymaker isn’t invincible”.

Results posted in the first half of the year raised concerns about the course that the company has taken. The company has failed to meet expectations on licensed products, like those related to the Star Wars and Batman franchises.

Speaking to Bloomberg, Lutz Muller, CEO of Vermont-based toy consultancy firm Klosters Trading Corporation, said: “It’s in the action figure segment that Lego is facing its problems.” This is the most competitive segment of the toy market.

Angry Birds maker Rovio plans IPO

Rovio Entertainment, the Finnish start-up behind the hit smartphone game Angry Birds and The Angry Birds Movie, has announced that it is planning to sell shares in an initial public offering in Helsinki, seeking funds to pursue its games-driven growth strategy. The offering will comprise a share issue of approximately $35.7m as well as a secondary share issue by shareholder Trema International Holdings.

Angry Birds surged into public consciousness in 2009 as a smash hit smartphone game, becoming the first mobile game to reach one billion downloads. It has now expanded its gaming portfolio with new launches such as Angry Birds Evolution. It has also cashed in on the success of its diversification into film entertainment, with the release of The Angry Birds Movie generating approximately $350m in box office revenues.

Building on this momentum, it has recently entered into a film licensing, production and distribution agreement with Sony-owned Columbia Pictures Industries for the film’s sequel, which is scheduled for release in 2019.

Despite the success of the film, Rovio CEO Kati Levoranta has repeatedly underscored that the company intends to prioritise its game-making division, expressing confidence in its “games-first entertainment” strategy. She further explained in a statement: “All of our recent launches – Angry Birds Evolution, Battle Bay and Angry Birds Match – have shown better performance in key performance indicators than any previously launched Rovio game, thus suggesting additional growth potential ahead.”

The company’s press release expressed that it aims to develop “fewer, bigger and better” games that have the “potential to reach top-grossing lists and have a long lifetime”.

In addition, the company hopes the IPO will provide a push for growth, improve strategic flexibility and drive brand recognition. It will also enable the company to use its shares in potential acquisitions.

Walmart and Google challenge Amazon with voice shopping partnership

Google and Walmart have teamed up in a deal that will enable consumers to shop online using voice recognition software. The partnership was announced on August 23, and is viewed by many as an attempt to challenge Amazon’s industry dominance.

From September, shoppers will be able to link their Walmart accounts to Google Express, the company’s e-commerce platform, and use Google Assistant to order groceries by talking into their smartphone. Users can also receive personalised shopping results and make use of Walmart’s easy reorder service to enable fast repeat purchases. Further integration is also expected in the coming months.

According to Walmart President and CEO Marc Lore: “Next year, we will also leverage our 4,700 US stores and our fulfilment network to create customer experiences that don’t currently exist within voice shopping anywhere else, including choosing to pick up an order in store (often for a discount) or using voice shopping to purchase fresh groceries across the country.”

Google Express was launched in 2013, but has struggled to compete with Amazon’s superior range of products. Similarly, Amazon’s foray into food and grocery items has not gone unnoticed by Walmart. Amazon’s presence in the voice-based shopping market is also relatively well established, as consumers have been able to order through the company’s personal assistant Alexa for a number of months now.

Although shoppers may be sceptical about how much of a game changer voice searching really is, the partnership between Google and Walmart likely has one eye on future consumer trends. Online shopping is already ubiquitous across smartphones, while 20 percent of all mobile search queries are voice-based. Voice shopping, therefore, could be a potential growth area for both companies.

Google undoubtedly has hopes that the Walmart collaboration will help the company to improve its patchy record in the e-commerce industry. Aside from the disappointing performance of Google Express, the company’s mobile payments platform, Android Pay, lags behind market leader Apple Pay.

BHP set to sell underperforming US shale business

The world’s largest mining company, BHP, intends to sell its underperforming US shale oil and gas division, it announced on August 22. The Anglo-Australian firm made a $20bn investment in the shale sector back in 2011 – a move that Chairman Jacques Nasser now admits was a mistake.

Six years ago, oil prices were trading at around $100 a barrel, approximately double today’s prices. BHP’s move into the shale business made it a top 10 player in the US market, but the prolonged fall in the value of oil has proved costly to the company’s finances.

“We have determined that our onshore US assets are non-core and we are actively pursuing options to exit these assets for value,” BHP revealed as part of its annual results.

Although other aspects of BHP’s recent strategy have also faced shareholder criticism, including its $13bn potash project, company finances were boosted by an upturn in the industrial commodities market. Net profits of $6.7bn tracked below expectations for the 12 months ending June 2017, but the company was able to slash net debt by $10bn and pay dividends of $4.4bn to investors.

It is also expected that BHP’s demerger of its US shale business could bank the company $10bn, with oil prices making a recovery earlier this year. News of the intended sale saw BHP shares rise by 1.5 percent, but the mining firm will have to juggle the need to extract value against investor wishes for a quick sale.

What is clear, however, is that BHP will not be able to recoup the money that it paid six years ago. The company will not be the last to bemoan the volatility of the commodity price market, but it will be wary not to acquire assets at the top of their value cycles in future.

Total acquires Maersk Oil in $7.45bn deal

On August 21, French oil giant Total announced it would buy Maersk Oil for $7.45bn, in one of the largest transactions since oil prices collapsed in 2014, according to Bloomberg.

The agreement involves the transfer of $4.95bn in 97.5 million Total shares (3.76 percent of the company) to Moller Maersk, the Danish group selling its oil division. The other $2.5bn will be covered when Total assumes part of Maersk Oil’s debt.

“This transaction delivers an exceptional opportunity for Total to acquire, via an equity transaction, a company with high-quality assets which are an excellent fit with many of Total’s core regions,” said Total Chairman and CEO Patrick Pouyanné in a statement.

Moreover, the deal is expected to “boost earnings and cash flow, and bolster its dividend prospects”, according to Reuters.

In Pouyanné’s words, the acquisition will give Total a new anchor point in Denmark to strengthen its existing offshore producing business, transforming it into the second-largest operator in the North Sea.

On the other side of the deal, Moller Maersk said the sale is part of its strategy of focusing on other activities, including its shipping division, which has been struggling in recent years.

The deal, led by one of the largest players in the global oil market, takes place at a moment of recovery in the energy sector, following years of downturn. For example, in January 2016, oil prices fell below $30 per barrel in the international markets for the first time in 12 years, amid fears of a stall in China’s growth. In recent days, prices have been around $50 per barrel.

Walmart’s online sales soar in the US

Walmart’s online sales in the US grew by 60 percent year-on-year in Q2 2017, helping to lift comparable US sales by 1.8 percent, according to the company’s results posted on August 17.

The retailer’s new figures represented its 12th consecutive quarter of growth, contrasting with the industry’s downward trend. They also showed that Walmart’s latest investments to strengthen its digital performance are paying off.

In recent months, the retailer, which is the world’s largest, acquired Jet.com for $3.3bn and unveiled the purchase of the clothing seller Bonobos for $310m, among other e-commerce deals.

However, in June 2017, Amazon announced its acquisition of Whole Foods in a $13.4bn deal, signalling that competition in the grocery industry keeps getting stronger.

On August 15, Amazon sold $16bn worth of bonds, most of them aimed at funding the acquisition of the supermarket chain. The move has been regarded as Amazon’s first step towards conquering the sector.

Speaking to reporters on August 17, Greg Foran, Walmart’s CEO in the US, admitted: “Amazon is an unbelievably good competitor.”

However, having announced five deals in the space of a year, Walmart has shown it is unwilling to relinquish its leadership in the sector, modernising the structure that made it the US’ largest brick-and-mortar retailer.

Walmart’s revenues rose 2.1 percent to $123.4bn in the second quarter, beating forecasts, with more customers not only buying online but also walking into stores, Financial Times reported.

However, high spending and lower prices, especially in groceries, impacted Walmart’s profits, which are now expected to remain low.

Rural Mexican farmers rally against NAFTA renegotiations

A day after Mexico, the US and Canada met to begin renegotiating the North American Free Trade Agreement (NAFTA), Mexican farmers took to the streets to protest their government’s attempts to save the trade deal.

While representatives from Mexico and Canada battle to save a deal US President Trump described during his campaign as “the worst trade deal in history”, Mexican farmers complain that NAFTA has pummelled their industry.

There is general support for NAFTA within Mexico, but today’s protests echo the deal’s reception when it came into force in 1994, amid fears that free trade with the US would undercut Mexican farming produce.

While Mexican industry has largely benefitted from the deal, racking up a trade surplus with the US totalling $63.2bn in 2016, farming has suffered. Mexico imported a total of $18bn worth of agricultural products from the US last year, making it the US’ third largest agricultural export market.

Many rural farmers blame NAFTA for competition that has driven down the price of their goods, and unions representing farmers are calling for the government’s support of NAFTA to be put to public vote. Meanwhile, Mexico’s President, Enrique Peña Nieto, is keen to save the deal before elections next July, noting how the deal has led to job growth.

Trump has expressed his distaste for the deal on many occasions, and has threatened to walk away from it altogether, but decided against this in April. However, the option remains a threat should he deem the renegotiated terms to be unfavourable to the US.

In a tough opening speech to kick off this week’s talks, US Trade Representative Robert Lighthizer highlighted the importance of modernising the deal to acknowledge the growth of the digital and service sectors in the 23 years since NAFTA came into force. He also issued some stern rhetoric outlining the US’ expectations from the talks.

“For countless Americans, this agreement has failed. We cannot ignore the huge trade deficits, the lost manufacturing jobs, the businesses that have closed or moved because of incentives – intended or not – in the current agreement,” he said.

Microsoft advances in the cloud computing space with latest acquisition

On August 15, Microsoft stepped up its efforts to dominate the cloud computing market by completing the acquisition of cloud orchestration start-up Cycle Computing.

Cycle Computing may not be one of the better-known industry players, but it has enabled high-profile clients in the financial, manufacturing and science industries to access high-performance computing capabilities via the cloud. This allows them to analyse big data and run huge workloads without requiring a data centre of their own.

Most significantly, the platform is currently used by all three of the major cloud vendors (Google, Amazon Web Services and Microsoft Azure), giving added significance to the acquisition. Microsoft has confirmed that it will continue to support customers using rival cloud platforms, but that future versions will be “Azure focused”.

“We’ve already seen explosive growth on Azure in the areas of artificial intelligence, the Internet of Things and deep learning,” explained Jason Zander, Microsoft Azure’s Vice President. “As customers continue to look for faster, more efficient ways to run their workloads, Cycle Computing’s depth and expertise around massively scalable applications make them a great fit to join our Microsoft team. Their technology will further enhance our support of Linux HPC workloads and make it easier to extend on-premise workloads to the cloud.”

The financial details of Microsoft’s acquisition have not been revealed, but the company will no doubt be satisfied with its purchase if it encourages businesses using rival cloud solutions to join its Azure platform. Microsoft also stands to gain access to the significant profits being generated by some of the high-performance applications operating in the cloud.

In 2012, a cancer research application built using Cycle Computing’s software cost in the region of $5,000 an hour to run. That particular piece of software was operating on Amazon’s cloud offering, but today’s acquisition could see a higher proportion of these high-end, high-cost cloud solutions running across Microsoft Azure instead.

Wikipedia and the meaning of truth

With little notice from the outside world, the community-written encyclopedia Wikipedia has redefined the commonly accepted use of the word “truth.”

Why should we care? Because Wikipedia’s articles are the first- or second-ranked results for most Internet searches. Type “iron” into Google, and Wikipedia’s article on the element is the top-ranked result; its article on the Iron Cross is also first. Google’s search algorithms rank a story in part by how many times it has been linked to; people are linking to Wikipedia articles a lot.

This means that the content of these articles really matters. Wikipedia’s standards of inclusion – what’s in and what’s not – affect the work of journalists, who routinely read Wikipedia articles and then repeat the wikiclaims as “background” without bothering to cite them. These standards affect students, whose research on many topics starts (and often ends) with Wikipedia. And since I used Wikipedia to research large parts of this article, these standards are affecting you, dear reader, at this very moment.

Many people, especially academic experts, have argued that Wikipedia’s articles can’t be trusted, because they are written and edited by volunteers who have never been vetted. Nevertheless, studies have found that the articles are remarkably accurate. The reason is that Wikipedia’s community of more than seven million registered users has organically evolved a set of policies and procedures for removing untruths. This also explains Wikipedia’s explosive growth: if the stuff in Wikipedia didn’t seem “true enough” to most readers, they wouldn’t keep coming back to the website.

These policies have become the social contract for Wikipedia’s army of apparently insomniac volunteers. Thanks to them, incorrect information generally disappears quite quickly.

So how do the Wikipedians decide what’s true and what’s not? On what is their epistemology based?

Unlike the laws of mathematics or science, wikitruth isn’t based on principles such as consistency or observability. It’s not even based on common sense or firsthand experience. Wikipedia has evolved a radically different set of epistemological standards – standards that aren’t especially surprising given that the site is rooted in a Web-based community, but that should concern those of us who are interested in traditional notions of truth and accuracy. On Wikipedia, objective truth isn’t all that important, actually. What makes a fact or statement fit for inclusion is that it appeared in some other publication – ideally, one that is in English and is available free online. “The threshold for inclusion in Wikipedia is verifiability, not truth,” states Wikipedia’s official policy on the subject.

Verifiability is one of Wikipedia’s three core content policies; it was codified back in August 2003. The two others are “no original research” (December 2003) and “neutral point of view,” which the Wikipedia project inherited from Nupedia, an earlier volunteer-written Web-based free encyclopedia that existed from March 2000 to September 2003 (Wikipedia’s own NPOV policy was codified in December 2001). These policies have made Wikipedia a kind of academic agora where people on both sides of politically charged subjects can rationally discuss their positions, find common ground, and unemotionally document their differences. Wikipedia is successful because these policies have worked.

Unlike Wikipedia’s articles, Nupedia’s were written and vetted by experts. But few experts were motivated to contribute. Well, some wanted to write about their own research, but Larry Sanger, Nupedia’s editor in chief, immediately put an end to that practice.

“I said, ‘If it hasn’t been vetted by the relevant experts, then basically we are setting ourselves up as a frontline source of new, original information, and we aren’t set up to do that,’” Sanger (who is himself, ironically or not, a former philosophy instructor and by training an epistemologist) recalls telling his fellow Nupedians.

With experts barred from writing about their own work and having no incentive to write about anything else, Nupedia struggled. Then Sanger and Jimmy Wales, Nupedia’s founder, decided to try a different policy on a new site, which they launched on January 15, 2001. They adopted the newly invented “wiki” technology, allowing anybody to contribute to any article – or create a new one – on any topic, simply by clicking “Edit this page.”

Soon the promoters of oddball hypotheses and outlandish ideas were all over Wikipedia, causing the new site’s volunteers to spend a good deal of time repairing damage – not all of it the innocent work of the misguided or deluded. (A study recently published in Communications of the Association for Computing Machinery found that 11 percent of Wikipedia articles have been vandalised at least once.) But how could Wikipedia’s volunteer editors tell if something was true? The solution was to add references and footnotes to the articles, “not in order to help the reader, but in order to establish a point to the satisfaction of the [other] contributors,” says Sanger, who left Wikipedia before the verifiability policy was formally adopted. (Sanger and Wales, now the chairman emeritus of the Wikimedia Foundation, fell out about the scale of Sanger’s role in the creation of Wikipedia. Today, Sanger is the creator and editor in chief of Citizendium, an alternative to Wikipedia that is intended to address the inadequacy of its “reliability and quality.”)

Verifiability is really an appeal to authority – not the authority of truth, but the authority of other publications. Any other publication, really. These days, information that’s added to Wikipedia without an appropriate reference is likely to be slapped with a “citation needed” badge by one of Wikipedia’s self-appointed editors. Remove the badge and somebody else will put it back. Keep it up and you might find yourself face to face with another kind of authority – one of the English-language Wikipedia’s 1,500 administrators, who have the ability to place increasingly restrictive protections on contentious pages when the policies are ignored.

To be fair, Wikipedia’s verifiability policy states that “articles should rely on reliable, third-party published sources” that themselves adhere to Wikipedia’s NPOV policy. Self-published articles should generally be avoided, and non-English sources are discouraged if English articles are available, because many people who read, write, and edit En.Wikipedia (the English-language version) can read only English.

Mob Rules

In a May 2006 essay on the technology and culture website Edge.org, futurist Jaron Lanier called Wikipedia an example of “digital Maoism” – the closest humanity has come to a functioning mob rule.

Lanier was moved to write about Wikipedia because someone kept editing his Wikipedia entry to say that he was a film director. Lanier describes himself as a “computer scientist, composer, visual artist, and author.” He is good at all those things, but he is no director. According to his essay, he made one short experimental film in the 1990s, and it was “awful.”

“I have attempted to retire from directing films in the alternative universe that is the Wikipedia a number of times, but somebody always overrules me,” Lanier wrote. “Every time my Wikipedia entry is corrected, within a day I’m turned into a film director again.”

Since Lanier’s attempted edits to his own Wikipedia entry were based on firsthand knowledge of his own career, he was in direct violation of Wikipedia’s three core policies. He had a point of view; he was writing on the basis of his own original research; and what he wrote couldn’t be verified by following a link to some kind of legitimate, authoritative, and verifiable publication.

Wikipedia’s standard for “truth” makes good technical and legal sense, given that anyone can edit its articles. There was no way for Wikipedia, as a community, to know whether the person revising the article about Jaron Lanier was really Jaron Lanier or a vandal. So it’s safer not to take people at their word, and instead to require an appeal to the authority of another publication from everybody who contributes, expert or not.

An interesting thing happens when you try to understand Wikipedia: the deeper you go, the more convoluted it becomes. Consider the verifiability policy. Wikipedia considers the “most reliable sources” to be “peer-reviewed journals and books published in university presses,” followed by “university-level textbooks,” then magazines, journals, “books published by respected publishing houses,” and finally “mainstream newspapers” (but not the opinion pages of newspapers).

Once again, this makes sense, given Wikipedia’s inability to vet the real-world identities of authors. Lanier’s complaints when his Wikipedia page claimed that he was a film director couldn’t be taken seriously by Wikipedia’s “contributors” until Lanier persuaded the editors at Edge to print his article bemoaning the claim. This Edge article by Lanier was enough to convince the Wikipedians that the Wikipedia article about Lanier was incorrect – after all, there was a clickable link! Presumably the editors at Edge did their fact checking, so the wikiworld could now be corrected.

As fate would have it, Lanier was subsequently criticised for engaging in the wikisin of editing his own wikientry. The same criticism was leveled against me when I corrected a number of obvious errors in my own Wikipedia entry.

“Criticism” is actually a mild word for the kind of wikijustice meted out to people who are foolish enough to get caught editing their own Wikipedia entries: the entries get slapped with a banner headline that says “A major contributor to this article, or its creator, may have a conflict of interest regarding its subject matter.” The banner is accompanied by a little picture showing the scales of justice tilted to the left. Wikipedia’s “Autobiography” policy explains in great detail how drawing on your own knowledge to edit the Wikipedia entry about yourself violates all three of the site’s cornerstone policies – and illustrates the point with yet another appeal to authority, a quotation from The Hitchhiker’s Guide to the Galaxy.

But there is a problem with appealing to the authority of other people’s written words: many publications don’t do any fact checking at all, and many of those that do simply call up the subject of the article and ask if the writer got the facts wrong or right. For instance, Dun and Bradstreet gets the information for its small-business information reports in part by asking those very same small businesses to fill out questionnaires about themselves.

“No Original Research”

What all this means is hard to say. I am infrequently troubled by Wikipedia’s unreliability. (The quality of the writing is a different subject.) As a computer scientist, I find myself using Wikipedia on a daily basis. Its discussions of algorithms, architectures, microprocessors, and other technical subjects are generally excellent. When they aren’t excellent and I know better, I just fix them. And when they’re wrong and I don’t know better – well, I don’t know any better, do I?

I’ve also spent quite a bit of time reviewing Wikipedia’s articles about such things as the “Singularity Scalpel,” the “Treaty of Algeron,” and “Number Six.” Search for these terms and you’ll be directed to Wikipedia articles with the titles “List of Torchwood items” and “List of treaties in Star Trek,” and to one about a Cylon robot played by Canadian actress Tricia Helfer. These articles all hang their wikiexistence upon scholarly references to original episodes of Dr. Who, Torchwood, Star Trek, and Battlestar Galactica – popular television shows that the Wikipedia contributors dignify with the word “canon.”

I enjoy using these articles as sticks to poke at Wikipedia, but they represent a tiny percentage of Wikipedia’s overall content. On the other hand, they’ve been an important part of Wikipedia culture from the beginning. Sanger says that early on, Wikipedia made a commitment to having a wide variety of articles: “There’s plenty of disk space, and as long as there are people out there who are able to write a decent article about a subject, why not let them?… I thought it was kind of funny and cool that people were writing articles about every character in The Lord of the Rings. I didn’t regard it as a problem the way some people do now.”

What’s wrong with the articles about fantastical worlds is that they are at odds with Wikipedia’s “no original research” rule, since almost all of them draw their “references” from the fictions themselves and not from the allegedly more reliable secondary sources. I haven’t nominated these articles for speedy deletion because Wikipedia makes an exception for fiction – and because, truth be told, I enjoy reading them. And these days, most such entries are labelled as referring to fictional universes.

So what is Truth? According to Wikipedia’s entry on the subject, “the term has no single definition about which the majority of professional philosophers and scholars agree.” But in practice, Wikipedia’s standard for inclusion has become its de facto standard for truth, and since Wikipedia is the most widely read online reference on the planet, it’s the standard of truth that most people are implicitly using when they type a search term into Google or Yahoo. On Wikipedia, truth is received truth: the consensus view of a subject.

That standard is simple: something is true if it was published in a newspaper article, a magazine or journal, or a book published by a university press – or if it appeared on Dr. Who.