Tuesday, October 7. 2014
It's always fun to look at the various predictions of the Next New Things, given they vary quite widely by predictor and by year - us being no exception. Anyway, here are Gartner's for the next few years (abridged by Broadstuff):
For what it's worth, based on Broadstuff's advanced TTID algorithm and patented BGA prediction methodology (see end of post for definitions), we can safely predict that:
1. Most of these will take far longer to play out than predicted, and many that do play out will have far less impact than supposed
Which of course is what Gartner's other great prediction invention, the Hype Curve, tells us - only when something passes out of the hype phase does it finally become useful.
The other thing I am left scratching my head about is that, given the Great Hollowing Out*, if all these come to pass (most of these trends imply yet more waged employees being dumped or offshored), where will the money come from to buy all those marvellous new things these new lean businesses make? Even Henry Ford saw that one coming when he upped wages so employees could buy his cars....
* I am always amused that The Economist called the Hollowing Out trend a Myth in 2004, when it was patently obvious it was already happening. But of course, for that we use the MRD approach
TTID = This Time It's Different. The algorithm states that whenever this claim is made, it isn't - so put your hand tightly on your wallet
BGA = Bill Gates Algorithm - this (X) will have far less impact in 2 years than we think, and far more in 10. We apply the Chasm upgrade though, which states that most New New Things will never jump over the chasm and will be dashed on the rocks in trying. About 3 of the above will survive, on average - which 3 would you bet on?
MRD = Mandy Rice Davies corollary - "Well, They Would, Wouldn't They" - Always look carefully at where someone is coming from before following where they lead....
Friday, August 22. 2014
Richard Dawkins (respected/hated Evolutionary Biologist and loved/hated Atheist) has touched off yet another Twitterstorm du Jour, via that unfortunate habit Evolutionary Biologists (and Vulcans) have of looking at humans as mass participants in mathematical game theory rather than as - well, humans. The Huffington Post summarises it best:
The irony of Dawkins being called "immoral" by religious and various other "strong belief" based groups that often propose far worse things is piquant, but there is an even bigger irony in Mr Dawkins doing this: he was the first person to coin the term "meme" and to postulate how memes work. So, depending on your point of view on Mr D, he is either a master memeticist or a complete c*nt who has been hoist on his own memetic petard by the #Offended on Social Media.
Oh yes - there were 12 points to the Huffpo article, and these are the clinchers I think:
11. Attempting to squeeze a few last hits out of the now-subsiding "outrage", a journalist will write a meta-piece attempting to explain the anatomy of a Dawkins Twitter scandal*.
He is therefore clearly a master memetic tactician - Twitterstorms and meta-pieces being the sign of memetic success - but whether continuing to offend large numbers of people in exchange for viral Twitter publicity is a good strategic memetic play is less clear. Wildean theory (the only thing worse than being talked about is not being talked about) says it is effective, but in a Social Media Age where everything you say remains online to be held against you, it may not be. After all, one of the first lessons of social game theory is that being nice wins - eventually...
*13. Attempting to extract the last ounces of traffic, a blogger writes a snarky piece on the whole affaire...
Monday, August 11. 2014
Very interesting article in the Economist about "Entrepreneurial" vs "Innovation" economics. The whole article is well worth reading, as it is one of those very rare items in the UK "Tech Startup" space - a systemic analysis with actual numbers. There are 3 key points dealing with the UK's current Startup / "Every Person is an Entrepreneur" craze:
Firstly, State money is wasted on funding too many entrepreneurial SMEs with too little money; they do nothing for the economy overall:
....once you take into account the number of SME jobs lost after the first three years of their creation, there is very little net job creation by these firms. Only 1% of new enterprises have sales of more than £1 million six years after they start. Research at the University of Sussex shows that median sales of a six-year-old firm is less than £23,000 (Storey, 2006). These firms also tend to be the least productive and least innovative (R&D spending—the best measure we have for inputs in the innovation process—in Tech City is not higher than in other parts of London or Britain). Indeed, the few high growth innovative firms (about 6% of the total SME group, Nesta, 2011)—those that really should be supported—do not directly benefit from the hype that surrounds SMEs and startups: once they get the funds these are too diluted to make a difference.
Secondly, what the State should be funding is an Innovation ecosystem, not an Entrepreneurial/Startup one per se:
Innovation-led “smart” growth has occurred mainly in countries with a big group of medium to large companies, and a small group of SMEs that is spun out from some of those large companies or from universities. These firms have benefited immensely from government funded research. Indeed, in my book I show how many firms in Silicon Valley have benefitted directly from early-stage funding by government, as well as the ability to build their products on top of government funded technologies.
The author points out that nearly every "entrepreneurial startup" in Silicon Valley today would not exist without the huge US government funded projects that underpin its technology, plus direct low cost (aka non-VC) early days investment - Apple is a case in point:
Every technology that makes the iPhone smart was government-funded (internet, GPS, touch-screen display, SIRI). Apple spends relatively little on R&D compared with other IT firms precisely because it uses existing technology. It applies its remarkable design skills to these technologies, effectively surfing on a government-funded wave. Apple, Compaq and Intel also all enjoyed the benefits of early-stage public funds (SBIC in the case of Apple, SBIR in the case of Compaq and Intel).
Thirdly, the UK's state spend on innovation and pull-through is small by competitive standards. Silicon Valley was largely built on huge government backed spending, not the VC community - and that is probably still the case:
Silicon Valley firms were initially not funded mainly by venture capital. It came in after the ball had got rolling thanks to funding by the Department of Defence, the Department of Health and, more recently, the Department of Energy. In fact, there is increasing evidence that many startups are told by venture-capital firms to go first to SBIR and then come back (Block and Keller, 2013).Venture-capital funds are not providing the kind of patient long-term finance needed for radical innovations. They are too focused on a profitable “exit”—usually through an IPO or a sale to a bigger company—within 3-5 years. But innovation often takes 15-20 years.
This sort of state pull-through is what China is using too, and the numbers are measured in $ Trillions. In fact, even within Europe the UK underperforms hugely on this sort of state investment:
In Germany such links are created by well-funded Fraunhofer Institutes. In Britain these are being imitated through the Catapult centres, which in theory should be linked to Tech City-type projects, either through procurement policy or via learning. Currently there are no links between these. And whereas the Fraunhofer system has an annual research budget of €1.8 billion ($2.4 billion) and a network of 20,000 staff across 60 centres (in 2010), Britain’s Catapult centres were given just £200m to spend over 4 years. When the Tech-City gurus in Number 10 Downing Street criticise the Technology Strategy Board, which is in charge of the Catapult strategy, for not being more like Darpa, they ignore the very different size of TSB’s budget in comparison with Darpa—and even more the fact that the TSB does not have the market creating potential that Darpa does.
This leads to a fourth point, about the competence of No 10's "Tech City Gurus" and advisors - but that's for another post. To end, though, the observation is that if small beer is what the government is willing to put into the game, it's better to spend it on tertiary education and R&D, where impacts are proven, rather than launch a million underfunded startups:
Research at the University of Cambridge (Hughes 2008) suggests that the British government spends (directly and indirectly) close to £8 billion ($13 billion) annually on SMEs—more than it spends on the police and close to the amount it spends on universities. Is this warranted? How do we know it would not be better to simply direct that money to teachers where there is plenty of evidence that quality education raises human capital and growth.
This post is just a summary; I recommend reading the whole article.
Friday, March 28. 2014
Interesting 2 paragraphs from Fred Wilson's blog, talking about "what's next":
But the roadmap has been clear for the past seven years (maybe longer). The next thing was mobile. Mobile is now the last thing. And all of these big tech companies are looking for the next thing to make sure they don't miss it. And they will pay real money (to you and me) for a call option on the next thing.
I'm intrigued by the idea of a call option - though I think it could be executed better than via VC funding, Fred. Now that would be disruptive.
But I think Fred's largely right that Mobile was the last Next Thing - though strictly speaking it's not "Mobile" per se, but PC-level processing power riding Moore's Law, shrinking in size and price until it became easily portable, with a damn good UI (think iPaq, then iPhone). These "Smart" phones and "tablets" killed good old Planet Mobile dead in about 3 years (Motorola, Nokia, Blackberry - where are they now? They were earth-shaking giants a few short years ago!)
Anyway, where is the Next Big Thing to be found? That is the question Fred asks. The future is of course already here, just unevenly spread, so the trick is to see which bits of the future are here now - and actually going somewhere. Ten things that have changed exponentially in the "networked technology" areas we follow, in the time we've been writing Broadstuff (est. 2006), are:
- Robotics (including the flying type)
As you can see, these are hardly New New Things, just things that were already here in 2006 and even then clearly had high potential. What's interesting is that they were all already on very predictable development vectors in 2006, but no one looked at them as killer technologies in those days - at that time, their rate of development was still mainly theoretical and not provably valuable. To compare, here are 10 other things that were also floating around in 2006/7 that I thought could happen sooner. They haven't yet, but still may, as they are all potentially Big Next Next Things.
These are all here today, unevenly distributed, and still chugging along - but at slower rates than the various laws of networking, learning, Moore et al would predict. Typically there is something in them that is missing, obstinately stuck at current capability or economically unavailable, awaiting the "key" to their leap over the Chasm. But all it takes is a small shift (think iPaq vs iPhone again) and over they go.
All you have to do to build your own mind-boggling portfolio of New Next Things To Watch is read the various Gartner Hype Curves for the last 10 years: you will see a slew of things on the hot S curve one year that disappear 2-3 years later. They don't go away, though - they are still evolving in the Darwinian mud of technology species; something just hasn't quite worked out for them yet. And somewhere in that stew, already, are the next 10 New New Things.
Tuesday, February 25. 2014
Impact of mathematical techniques on operations, by industry - McKinsey
McKinsey has discovered you can use Operations Research (or Decision Maths, as it is known these days) mathematical techniques to analyse and optimise manufacturing operations - McKinsey Insights:
The application of larger data sets, faster computational power, and more advanced analytic techniques is spurring progress on a range of lean-management priorities. Sophisticated modeling can help to identify waste, for example, thus empowering workers and opening up new frontiers where lean problem solving can support continuous improvement. Powerful data-driven analytics also can help to solve previously unsolvable (and even unknown) problems that undermine efficiency in complex manufacturing environments: hidden bottlenecks, operational rigidities, and areas of excessive variability. Similarly, the power of data to support improvement efforts in related areas, such as quality and production planning, is growing as companies get better at storing, sharing, integrating, and understanding their data more quickly and easily.
Not only that, but you can apply Lean operating techniques in manufacturing companies too:
Nonetheless, to get the most from data-fueled lean production, companies have to adjust their traditional approach to kaizen (the philosophy of continuous improvement). In our experience, many find it useful to set up special data-optimization labs or cells within their existing operations units. This approach typically requires forming a small team of econometrics specialists, operations-research experts, and statisticians familiar with the appropriate tools. By connecting these analytics experts with their frontline colleagues, companies can begin to identify opportunities for improvement projects that will both increase performance and help operators learn to apply their lean problem-solving skills in new ways.
Amazing stuff....except it's very, very old news. Monte Carlo simulations and capacity planning algorithms have been around for decades - some of this even pre-dates WW2. Value analysis started at 3M in the 1960's. Richard Schonberger wrote the groundbreaking Japanese Manufacturing Techniques in 1982 (I still have my copy), and he was merely Westernising something the Japanese had been doing for 2 decades by then. And then I saw this, which really made me smile wryly:
Similarly, a leading steel producer used advanced analytics to identify and capture margin-improvement opportunities worth more than $200 million a year across its production value chain. This result is noteworthy because the company already had a 15-year history of deploying lean approaches and had recently won an award for quality and process excellence. The steelmaker began with a Monte Carlo simulation, widely used in biology, computational physics, engineering, finance, and insurance to model ranges of possible outcomes and their probabilities
The wry smile was because I did much the same, in 1994-5, for a steelmaker, using some of these exact same techniques - while consulting at McKinsey, to boot. I have the obligatory picture of big rolling mills from a grateful client, and the prize I won in the McKinsey internal "Practice Olympics", to prove it. In fact I'd bet the McKinsey Quarterly of the 1970's, 80's and 90's is full of analyses like this one. There truly is nothing new under the sun.
But with New Improved Big Data it can all be rebadged bright and new....except it doesn't work this way. There was a shedload of Big Data in the Old Days too (shop floor data capture techniques underpin most of the Internet of Things, and did you know some of the first broadband networks in the world went in at manufacturers in the 1980's?). Manufacturing has always had a lot of data, and Big Manufacturers bought Big Iron to process Big Datasets then too (except it was called data with a small "d" then). The Monte Carlo methods, or N jobs on M machines optimisation (for example), are still the same algorithms they were in the 1930's and 50's.
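To make the point, the sort of Monte Carlo model in the quote needs nothing modern at all. Here is a minimal sketch in Python (the stations, cycle times and spreads are invented for illustration, not from any real mill) - the identical logic ran happily on 1960s Big Iron:

```python
import random

def simulate_line(cycle_times, n_runs=10_000, seed=42):
    """Monte Carlo estimate of end-to-end time for a serial production line.

    cycle_times: list of (mean, spread) per workstation. Each run samples a
    uniform cycle time per station and sums them; repeating many runs gives
    a distribution of line times rather than a single point estimate.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        total = sum(rng.uniform(mean - spread, mean + spread)
                    for mean, spread in cycle_times)
        totals.append(total)
    totals.sort()
    return {
        "mean": sum(totals) / n_runs,          # expected line time
        "p95": totals[int(0.95 * n_runs)],     # 95th percentile (tail risk)
    }

# Three hypothetical stations: (mean minutes, +/- spread)
stats = simulate_line([(10, 2), (15, 5), (8, 1)])
```

A distribution of outcomes and their probabilities - which is, at heart, all the "advanced analytics" in the quote amounts to.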
And you know what - you just cannot simulate the minute, operation-laden details of a shop floor or logistics network reliably. No matter how big your dataset, your computers, or your machine tool onboard intelligence, there is just too much variability. Which is why the Just In Time/Lean movement came about as the better approach: rather than hit the problem with huge algorithmic models and simulations so complex that no one fully understood what they were doing anymore (just ask the banks what happens going down that route), the aim of JiT/Lean was to actually reduce the problem's variability - to get back to Small Data, if you like.
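The variability point can even be made quantitative with Kingman's queueing approximation - a 1960s result, and my gloss rather than anything in the McKinsey piece. Waiting time grows with the square of variability, so halving variability cuts queues to a quarter; no giant simulation required. The utilisation and timing numbers below are invented:

```python
def kingman_wait(utilisation, ca, cs, mean_service):
    """Kingman's approximation for mean queue wait at a single station:
        W ~= u/(1-u) * (Ca^2 + Cs^2)/2 * t
    where u is utilisation, Ca and Cs are the coefficients of variation
    of arrivals and service, and t is the mean service time.
    """
    return (utilisation / (1 - utilisation)) * ((ca**2 + cs**2) / 2) * mean_service

# Same 85% utilisation and 10-minute service time in both cases:
busy = kingman_wait(0.85, 1.0, 1.0, 10.0)   # noisy shop floor
lean = kingman_wait(0.85, 0.5, 0.5, 10.0)   # variability halved -> wait quartered
```

That asymmetry is exactly the JiT/Lean insight: shrinking the variability terms beats throwing ever-bigger models at the utilisation term.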
And you know what else - despite the analytical miracles I and many others performed in the day, despite the extraordinary efforts by managements and workers, so many of those steel mills (and clothing companies, and manufacturers of a million other widgets) moved East. There is only so much you can do against cheap labour, national subsidies and guaranteed government contracts.
And that brings me to something else in the story, which is what I suspect is really going on here - it's not Big Data, it's Big Economics:
Sure, it's partly about raw material prices changing - when they are too high to buy or too low to sell, you really have to be efficient at manufacturing. But when you are getting to this level of number crunching after 20 years of Lean projects, in my experience it's because the endgame is appearing on the horizon - it's a last-of-the-summer-wine story, the end of an S curve. Interestingly, it seems all the McKinsey consultants and the project were in India, and Eastern labour costs are rising, as is oil for those long ship rides back to the European and US markets - so much so, in fact, that there is an increasing trend to re-shoring, as production comes back to the US and EU. Big picture: the low cost Eastern windfall is ending, and you have to start getting much smarter again about the actual manufacturing process. You can get benefits from doing it right with Big Iron and Big Algorithms, no doubt - but this sounds like back to the future....I suspect they are now using bigger and bigger number crunching to eke out the last 20% of improvement from the various ongoing kaizen projects, trying to keep the factories in situ as the Big Economics shift yet again.
And you didn't need Big Data to tell you that....
(Hat tip to my colleagues at the Agile Elephant for the link)
Thursday, February 20. 2014
News just in that Facebook has bought WhatsApp for $SillyMoney - $19.6bn - TechCrunch:
With 450 million monthly users and a million more signing up each day, WhatsApp was just too far ahead in the international mobile messaging race for Facebook to catch up, as you can see in the chart above we made last year. Facebook either had to surrender the linchpin to mobile social networking abroad, or pony up and acquire WhatsApp before it got any bigger. It chose the latter.
Facebook couldn't afford not to have it - if someone else had bought it, that would have been a direct attack on Facebook's chosen strategic way out of its own declining user engagement, i.e. mobile applications and messaging. Facebook is absolutely determined not to be overtaken by the "next wave" Social Networks, but its own "new wave" systems were kludgy, so the price of not having your lunch eaten in 2016 is c. $20bn in 2014. There is a bit of irony here, in that WhatsApp strongly does not believe in advertising, its founder once saying that:
"There's nothing more personal to you than communicating with friends and family, and interrupting that with advertising is not the right solution,"
Clearly $19bn is a mind-changing amount.
What this really shows is that the days of Facebook's organic growth are over, and from now on they are going to have to acquire revenue - and that's a very expensive way of doing it at their size, if you continually need to buy the guys who will eat your lunch. You had to believe a lot to believe Facebook's valuation - that just got harder. But it's the Bubbletime, so all will be good - for a while.
Update 1 - According to El Reg, WhatsApp does indulge in datascraping of a user's address book, so that does make it an interesting prospect for added-value datamining.
Update 2 - Azeem Azhar of PeerIndex has a smart bit of analysis - if WhatsApp had stayed independent it would still have destroyed Facebook's mobile story, which really outlines why Facebook had to act to stop itself being eaten for lunch:
Sunday, February 16. 2014
Bertrand Duperrin's summary presentation
I've written up the 2 days of case studies from the Enterprise 2.0 Summit I attended last week in Paris; they are over here - Day 1 and Day 2 - on the Agile Elephant blog. There are a number of common threads emerging from the studies. Being lazy, I've copied Emanuele Quintarelli's list to start with, based on his study (his conclusions concur with my analysis); I've added my thoughts in italics:
- The project is explicitly supported and sponsored by the top management (70% vs 34% for laggards). Long lasting processes, technology and process change should be somewhat mandated by the formal organization. Informal projects are easy to start but need formal acceptance to embed themselves in the enterprise
Other stand-out observations so far from the case studies:
- Any system needs time to embed/mature/settle in (words varied, but the concept was the same) before it becomes stable and self-sustaining - you can't "make a baby in 1 month with 9 mothers", as it were.
Over the 2 days I did sense a divergence between what the case studies were showing and what some of the Social Business theorists were espousing. In general, the case studies showed that pragmatism and evolutionary development (what Dachis' Dion Hinchliffe called "Sustainable Transformation") were the order of the day, versus more revolutionary/dramatic transformational approaches.
Update - I have also put up Bertrand Duperrin's slides (at top) now that they have been posted; it was a very good talk on the subject as well.
Tuesday, February 4. 2014
Just had a chance to finally read Janet Parkinson's latest post on this subject. It was, to me, the stand-out concept of our recent Patchwork Elephant conference, but she has pushed her ideas further - in essence, she argues that "The future is here, it's just not evenly distributed yet" for businesses becoming just another part of the social network:
In other words "the social network" has become a business market that has already allowed a new form of hotel business to rapidly become almost the biggest market for beds in just a few years. As Janet points out, this is just the beginning:
And eventually this will just be a part of "the matrix". Janet also mentions Ronald Coase's theories about transaction costs in making this happen (i.e. the social network will match buyers and sellers at a fraction of present costs) - I hope that's because she's heard me bang on about them all the time - but the endgame she sees is actually more Nick Carr:
Over time, this mesh will become regulated – infrastructures always do. Electricity, water, telephony all ended up as part of the utility infrastructure and this will be no different. The main problem for the individual will be the sheer scale of the mesh – we will need tools to navigate it. Some tools will come from the infrastructure itself but we imagine that some tools will come from yourself. This ties in closely with the VRM concept of tools being created for individuals to manage and control their own data, allowing access only to those to whom they give permission. We could imagine us all owning our own smart systems with data controlled by ourselves – a bit like owning an electric appliance which you plug into the mesh – that could source the relevant data, barter the deal and present the options in order of importance, then automatically make all the necessary arrangements for you. The opportunity for profiteering in these transactions would be minimal – regulation would be complex.
I think that is still the optimistic outlook - I can imagine quite a few stones on the road - but history has shown that other once hard-to-do things have been absorbed into the infrastructure....eventually.
Wednesday, January 29. 2014
Google sold Motorola to Lenovo for a lot less than they bought it for - TechCrunch:
Motorola Mobility is being sold to Lenovo, in a deal worth $2.91B. Google is divesting itself of the handset division it purchased for $12.5B in 2011, but it will keep some of the assets — including patents.
Why the drop in value? Well, nothing crashes quite like last year's Mobile Tech Company. Just ask Blackberry, or Nokia, or Palm. And smartphone innovation is slowing anyway.
But those patents, of course, have real value. There are 12,000+ of them - ammunition for defending barriers to entry against future entrants for years, in all sorts of related technologies (like, oh, mobile Internet of Things tech), and for getting cross-licensing deals with the other big boys. And just when you've got a new mobile play in your nest.
Innovation is great, M&A is faster - but just in case, lawyers and loads of patents never hurt.
And for Google, writing down $10bn of sunk cost is chump change for an option on 12,000 patents. And they keep $3bn in cash. And a nice little tax write-off to boot.
Wednesday, December 4. 2013
We are told fairly frequently these days that data is the new Oil (or not?). If it is, then is Google the new Standard Oil?
To refresh everyone's memories (Wikipedia):
Standard Oil Co. Inc. was an American oil producing, transporting, refining, and marketing company. Established in 1870 as a corporation in Ohio, it was the largest oil refiner in the world. By 1890, Standard Oil controlled 88 percent of the refined oil flows in the United States. Its controversial history as one of the world's first and largest multinational corporations ended in 1911, when the United States Supreme Court ruled that Standard was an illegal monopoly.
In other words, Standard Oil built up a near monopoly position in oil, not unlike Google's near monopoly position in digital data. Like Standard, Google also uses profits in one area (advertising) to offer free services in many other areas, making it very difficult for competition to emerge in those spaces.
Opinions are still divided about the breakup of Standard by the way:
Some economic historians have observed that Standard Oil was in the process of losing its monopoly at the time of its breakup in 1911. Although Standard had 90 percent of American refining capacity in 1880, by 1911 that had shrunk to between 60 and 65 percent, due to the expansion in capacity by competitors. Numerous regional competitors had organized themselves into competitive vertically integrated oil companies, the industry structure pioneered years earlier by Standard itself. In addition, demand for petroleum products was increasing more rapidly than the ability of Standard to expand. The result was that although in 1911 Standard still controlled most production in the older US regions of the Appalachian Basin (78 percent share, down from 92 percent in 1880), Lima-Indiana (90 percent, down from 95 percent in 1906), and the Illinois Basin (83 percent, down from 100 percent in 1906), its share was much lower in the rapidly expanding new regions that would dominate US oil production in the 20th century.
We are certainly seeing China emerge as an independent digital data region, and others are splitting off various segments of the digital data mining and refining business. It remains to be seen whether the expected rapid growth in digital data moves at such a pace that Google can no longer supply it all. That will probably be the acid test in the Digital Oil business.
Interestingly, after the breakup, bits of Standard slowly got back together. Two of the resulting companies were Jersey Standard ("Standard Oil Co. of New Jersey"), which eventually became Exxon, and Socony ("Standard Oil Co. of New York"), which eventually became Mobil. They are now ExxonMobil. Other Standard spin-offs re-combined to form Amoco and Chevron.
Microsoft also came under antitrust investigation for being inherently too large for market competition, but the only company since Standard Oil to actually be broken up in the same way was AT&T - and, as with Standard, many of the "Baby Bells" ended up merging together again after changes in regulations and technology, with one of them eventually buying AT&T and adopting the AT&T name.
But it is little surprise that Google is ramping up its lobbying engine in Washington, as reported today:
Google Inc. is moving its Washington office closer to Capitol Hill after spending $18.2 million on lobbying, more than Northrop Grumman Corp. and enough to rank the technology company as the eighth-biggest advocacy spender.
Google is one company that understands that those that forget the past are doomed to repeat it.
Creative Commons Licence
Original content in this work is licensed under a Creative Commons License