Last week Adam Ozimek posted an interesting piece, Big Data Versus Hayek. He noted how centralized algorithmic price setting, for real estate, airlines and hotels, was displacing local and decentralized decision making. Because it works better and is more profitable. And how this development was very “un-Hayekian.” Friedrich Hayek, of course, famously argued that centralized planning and pricing could never do as well as the man on the spot at taking local conditions into account.
Hugo award winning science fiction author Greg Egan complained recently about science fiction movies, starting his post with the line “Why is almost every contemporary science fiction movie irredeemably stupid?” He digs into three movies: Her, Ex Machina and Interstellar. Regarding Her, he noted:
Like many who spend a lot of time reading on the internet, I love twitter. It’s an invaluable source of information. One especially prized by journalists and infovores. But the product has stagnated. In particular, casual users have struggled with it. One billion people have tried it (!) but only about a quarter of those stayed with the product. So it was no surprise when twitter announced in early June that current CEO Dick Costolo would step down.
When taxi-like service Uber (order a car instantly from your smartphone) first became successful, it created a trend for copycats. These companies were marketed and mocked as “Uber for X“, e.g., Uber for flowers, Uber for shopping, Uber for laundry, Uber for pizza. You get the idea. But Uber’s explosive growth had another side. The company fought tooth and nail, lawsuit by lawsuit, against entrenched taxi interests to expand. And as Google unleashes the full potential of machine learning (especially talking computers), it risks a similar battle on privacy, becoming an “Uber for lawsuits.” I’ve mentioned this in previous posts, but only as an aside. It’s worth exploring in more depth.
With Apple’s announcements at WWDC and Google’s announcements at Google I/O, there’s a reasonable case to be made that 2015 will be looked back on as the year we transitioned from the mobile tech era into the machine learning era. To be clear, that’s a huge oversimplification. Smartphone mobile tech is still changing rapidly (watch versus phone) and machine learning is itself tightly coupled to mobile’s rise. And there’s plenty of other technology vying for a similar claim: solar, genomics (CRISPR), Internet of Things, Bitcoin, 3D printing, other big data and cloud, etc. And yet. The world is so complex. Homing in on a single simplifying theme can provide insight. So let’s run with this one to see where it leads.
Ben Thompson starts off his Peak Google post saying “Despite the hype about disruption, the truth is most tech giants, particularly platform providers, are not so much displaced as they are eclipsed.” By this he means old platforms and companies don’t fail or go away. They continue to dominate their old platforms. It’s just that new companies create new platforms that are so much bigger they eclipse the old ones. His examples are IBM mainframes being eclipsed by PCs, and PCs being eclipsed by smartphones. I want to pause here to note that both of his eclipse examples are driven by the invention of new and more personal input methods. Yes, it’s true PCs continued using command line input at first. But once PCs shifted to mouse/keyboard and graphical interfaces, IBM dropped out and PC use exploded. We entered the Microsoft era. For smartphones of course the input shift was moving to touchscreen interfaces, where Apple iOS and Google Android now dominate. History seems to show that computer platforms have such strong lock-in that the original owners never lose control. Instead what happens is new entrants have a window of opportunity to eclipse old platform owners when new and more personal input methods become technically feasible.
In the episode The Conscience of the King, from the original version of the TV show Star Trek, Captain Kirk suspects the actor Anton Karidian is actually the evil mass murderer Kodos the Executioner. So Kirk asks the computer for information:
Even though it’s only April, it’s already clear 2015 will be looked back on as the year cord cutting (replacing cable TV with internet streaming) started going mainstream. HBO is finally allowing customers to stream HBO content without requiring a cable subscription. Apple is expected to launch a TV streaming service later this year. Existing internet streaming services like Netflix, Amazon Instant Video, Sling TV, Hulu are all growing rapidly. Netflix alone already accounts for a third of all US internet traffic. Just this week Verizon got so aggressive in how they unbundled ESPN that they’re getting sued for breach of contract. Not a move a company like Verizon would have attempted even a few years ago.
The Search for Extraterrestrial Intelligence (SETI) has traditionally used radio telescopes to search nearby stars. Excellent. But I really loved a new study published last week by Griffith et al., which differs from traditional radio searches in two ways. First, Griffith and team did not look for direct signals sent by extraterrestrial intelligences (ETs). Instead they looked for excess heat produced as a waste byproduct of energy use at galactic scales. Second, they did not look nearby inside our own galaxy. Instead they looked at galaxies far, far away. 100,000 of those galaxies in fact. This approach may seem counterintuitive. But I think it’s one of the best ways to look for ETs.
I’m a sucker for year-end lists and predictions. So here are five for 2015 that I’ll scorecard next December.