I've been a strong supporter of what most people call fixed/mobile convergence, the automatic, bi-directional handoff of connections between cellular and Wi-Fi, since I first learned of the concept about a decade ago. This all seems so obvious: cellular has limited range and is expensive to deploy in terms of spectrum, capital, and operating costs. There's a ton of unlicensed spectrum available everywhere, and its limited range (a consequence of the low-power requirements inherent in the unlicensed bands) enforces a high degree of spectral re-use. The carriers can never hope to offer everything that consumers want to buy (most notably wire-like broadband data and video on demand) with their licensed spectrum alone, especially in high-population-density venues, and current service pricing and policy models reflect this. So, it would seem that the carriers would be well served by handing off calls, and especially data connections, to Wi-Fi in high-density, high-demand areas. And even if the carriers aren't game here, enterprises should be flocking to this solution as a way to cut costs and maximize throughput and capacity within their four walls, where more than a few cellular calls are routinely being made.
And yet converged solutions (I prefer the term mobile/mobile convergence (MMC), BTW, as the handoff is between two mobile technologies and no fixed anything beyond the usual infrastructure is actually required) are rare today, in both the carrier and enterprise domains. While I still believe that converged solutions remain the best option for meeting user needs over the long run, I was curious as to why the uptake rate hasn't been greater. Indeed, it should be off the charts.
So, after a few calls and a little research, the conclusion became smack-yourself-in-the-head obvious: device diversity. Convergence technology suppliers have reported to me time and time again that the sheer number of differing handset implementations, coupled with reticence (possibly, I think, strategic in nature) on the part of device manufacturers to either (a) open their APIs or (b) implement them if they've not done so already, coupled in either case with (c) a lack of standards and constantly-shifting OS platforms, provides a serious economic disincentive for progress in the convergence space. Software costs more than ever to develop today, and investments in client-side functionality can't be made without some assurance of a return on same.
So, once again we have the carriers firmly in the driver's seat. No matter - I'm still predicting that these guys will turn to Wi-Fi in a big way; they have no other choice. Why are we finding Wi-Fi in every smartphone today? So users can access their network at home? Nope. How about Atheros' recent announcement of their 6004 very-low-power 2x2 component for handsets? Remember, 802.11n is as much about capacity as it is about throughput, if not more. And remember that Atheros is about to become part of Qualcomm, the most important company in wireless and a major player in handset chipsets. And how about Ruckus Wireless recently introducing a very interesting Wireless Services Gateway (WSG) product that, in addition to many capabilities available today, will eventually offer cellular/Wi-Fi roaming? In short, what we have here is another painfully obvious case of the challenge defined in Geoffrey Moore's classic Crossing the Chasm - with this being a chasm that simply must be crossed. Femtocells and distributed antenna systems will get us only so far.
Check out this article at CNET News for a great overview (and a bit more) of the opportunity here. Despite appearances at present, convergence is not dead - it's just resting, and not in the sense of Monty Python's infamous dead parrot. Really - convergence remains the best alternative for addressing the capacity problem, and it will be back once the carriers are ready.