We’ve had 3G wireless systems for years, but, realistically, the devices in use on them are telephones with some whistles and bells attached. And let’s face it: even today’s fastest 3G networks are akin to the jump from a V.34 to a V.90 modem back in the day. Just how well do you think Google Earth would have operated on that infrastructure?
Mr. Jenson’s plea for the “miracle app” is probably going to have to wait a little while. At least from my perspective, the so-called “miracles” need a precursor or two in place in order to become miracles. When you look at where we are today with the “immobile Web,” consider some of the hurdles that were necessary to get to this point:
- First we needed a lot of people to have their own access devices, but at $5,000 a pop the PC wasn’t a particularly attractive candidate. However, when Intel Corp. (Nasdaq: INTC) unilaterally slashed prices on microprocessors, realizing it could earn much more by accelerating penetration rates, the prices of PCs plummeted and penetration exploded.
- The second part of the exploding demand for PCs was that they had to be reasonably easy to use and be a little more than a substitute for a calculator and typewriter. That’s where Microsoft Corp. (Nasdaq: MSFT) and the introduction of Windows 3.0 come into play. I know this may be difficult for those who view them as the devil incarnate, but Windows 3.0 was an important event – it freed developers from the 640k memory barrier. Now, you could have real apps!
- Fast forward to the mid-’90s and the introduction of Mosaic and its successor, Netscape. That made the Internet “easy” for the masses and marked the beginning of the end of all those America Online disks clogging up your mailbox.
- As consumers, we could have had access to DSL long before it actually showed up, but, true to form, the offspring of Ma Bell were too interested in protecting their cash-cow T1 business. But with the cable companies rolling out high-speed access and a handful of upstarts offering DSL service on telco lines, the writing was on the wall.
Obviously the first needed change on my list is faster networks – preferably not measured in kilobits per second. As Unstrung has pointed out before, if you’re going to have hundreds or thousands of people within a given area trying to swap stuff at tens of megabits per second, the weakest link in the chain is the existing backhaul network, which wasn’t built for it. (See 4G Backhaul: A Problem for All?) Whether you are a Mobile WiMax believer or not, it is on its way, to be followed at some point by LTE (Long Term Evolution), and both were designed for speed. So at least on that front, relax, Scott – help would appear to be on the horizon.
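To put rough numbers on that backhaul concern, here’s a back-of-the-envelope sketch. The user count and per-user rate are illustrative assumptions drawn from the “hundreds or thousands of people at tens of megabits” scenario above, measured against the 1.544 Mbit/s of a single T1 line:

```python
# Back-of-the-envelope: aggregate radio-side demand vs. legacy backhaul.
# Assumed figures (illustrative, not from the article's reporting):
users = 1_000          # active users in a given coverage area
per_user_mbps = 10     # "tens of megabits per second" per user
t1_mbps = 1.544        # standard T1/DS1 line rate

demand_gbps = users * per_user_mbps / 1_000   # 10 Gbit/s aggregate
t1_links_needed = demand_gbps * 1_000 / t1_mbps

print(f"Aggregate demand: {demand_gbps:.0f} Gbit/s")
print(f"Equivalent T1 lines: {t1_links_needed:,.0f}")
```

Even if only a tenth of those users are active at once, the cell site would still need the equivalent of hundreds of T1s behind it – which is why the backhaul, not the air interface, is the weak link.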
Over on the device side of the equation, the playing field is wide open, as you would expect. We’re not going to have a single platform that every developer targets. One size obviously isn’t going to fit all in the mobile device market, but offering 31 flavors isn’t going to help either. Google’s Android platform may be well intentioned (?), but it only adds to the fragmentation that already exists. (Obviously, Google is of a different opinion.)
Apple Inc. (Nasdaq: AAPL) certainly provided an interesting new interface with its iPhone, and its touch-driven screen may become the de facto standard for a certain category of device. What I found interesting was a comment from a subscriber at RealMoney who told me that his iPhone was the first handset he’d had that he thought of as a computer; all the others had been portable telephones. I think the reason he felt that way was that Apple was the only company with sufficient brand power to force AT&T Inc. (NYSE: T) (or any other domestic carrier) to open its network by allowing the device to support WiFi. He now had fast network access, ease of use, and a reasonable amount of power literally at his fingertips – the same combination we’ve had on the “immobile Web” side for a while.
But there are still a couple of pieces of the puzzle that will enhance the current experience and enable the “miracles.”
First, the bane of every portable device has been power usage. I suppose one day we’ll have a device that magically recharges itself, but until that point arrives we’ll have to come up with increasingly efficient components within the device. Obviously, talking with friends or sending email isn’t exactly a huge drain these days, as battery life has steadily improved for years. However, as these devices start doing more (bigger screens, higher resolutions, video compression/decompression, etc.), the focus will be on efficiency. It will be interesting to watch the implementation and success (or lack thereof) of Intel’s new Atom family of processors – the company’s first designed specifically for the mobile world below notebooks.
The second issue is storage. Granted, not every device needs a lot of it, but it provides flexibility. The portable world has been talking about solid-state disks (SSDs) for years, and we’re actually on the cusp of an acceleration in adoption. Prices of NAND flash have declined, and densities have increased sufficiently, so it’s now a reasonable alternative that won’t break the bank. The march to higher densities is a never-ending process, with IM Flash – a joint venture between Intel and Micron Technology Inc. (Nasdaq: MU) – recently announcing 32-Gbit parts to be available in the second half of this year. One such chip can store 1,000 MP3 files; stack eight of them in a single package and they’ll hold 40 hours of high-definition video. Samsung is expected to have its own 32-Gbit part available soon and has a 64-Gbit part planned for 2009.
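The chip arithmetic above is easy to check. A minimal sketch, assuming roughly 4 MB per MP3 file (a common rule of thumb; the article doesn’t state a file size) and using decimal gigabytes:

```python
# Capacity check for a 32-Gbit NAND part and an 8-chip package.
GBIT_PER_CHIP = 32                        # announced IM Flash part
bytes_per_chip = GBIT_PER_CHIP * 1e9 / 8  # 4 GB (decimal) per chip

MP3_BYTES = 4e6                           # assumed ~4 MB per MP3 file
mp3_files = bytes_per_chip / MP3_BYTES    # ~1,000 files, matching the claim

package_bytes = 8 * bytes_per_chip        # eight stacked chips: 32 GB
HD_HOURS = 40                             # capacity claimed for the package
implied_mbps = package_bytes * 8 / (HD_HOURS * 3600) / 1e6

print(f"MP3 files per chip: {mp3_files:.0f}")
print(f"Implied HD bitrate: {implied_mbps:.2f} Mbit/s")
```

The implied video bitrate works out to under 2 Mbit/s, so the “40 hours of HD” figure assumes fairly aggressive compression – which is exactly why the codec efficiency discussed above matters as much as the raw capacity.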
Make the resources (communications speed, processing power, memory, and storage) available to developers, and they’ll come up with the miracles that you’re looking for, Scott. They may not be here tomorrow, but they’re not that far off either.
Full Disclosure: Bob Faulkner is long MU; The Telecom Connection model portfolio is long MSFT and MU.
— Bob Faulkner, Special to Light Reading