Guest Column

Decisions for the Wireless Industry: Technology Clarity but Business Uncertainty

Columnist:
Andrew J. Viterbi
President
Viterbi Group, LLC

In a career which spans nearly five decades, I have been privileged to participate in the evolution of wireless communication as an academic researcher, an entrepreneur and an active investor, and even as a consumer. As such, I am a good example of the inverse funnel: a person who starts out knowing a great deal about a very narrow field and progresses to eventually know very little about a very broad field. Thus I began in the nineteen fifties and sixties with research on information and communication theory, progressed in the nineteen seventies and eighties with R&D for military, space and satellite communication, then in the eighties and nineties for commercial satellite and cellular personal communication; in the twenty-first century, besides investment and advisory roles, I enjoy the status of partly informed consumer and of putative industry and technology expert called upon for opinion articles such as this one.

Technology Lessons Learned

In all this time there must be a few lessons which I have learned, and perhaps even taught. Let’s start on firm ground with a wireless technology lesson. Simply stated: “Averaging is good, and smart averaging is better.” Transmitted signals can contain elements in multiple dimensions or “degrees of freedom.” For a signal of duration T seconds and bandwidth W Hz, the number of (nearly independent) dimensions is approximately 2TW. Thus over a bandwidth W, the transmission rate in dimensions/second is 2W. Suppose we were to transmit just one bit per dimension using quadrature phase shift keying (QPSK) modulation; this amounts to modulating the phase of a sinusoidal carrier (which is two-dimensional) in one of four equally spaced angles. Thus the bit rate is R=2W bits/sec., but there is a limitation imposed by noise. If the noise is wideband and uniformly distributed with density No Watts/Hz (the white Gaussian noise model) and the received signal power is S Watts, then the error rate decreases approximately exponentially in the signal-to-noise ratio, S/(NoR). Thus to achieve a respectable error rate of no more than one error in 100,000 bits, we require that S/(NoR) exceed 10 dB (a factor of 10), and consequently the rate is limited to R<(1/10)S/No bits/sec.
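To make the numbers concrete, here is a minimal Python sketch (my own illustration; the function names are arbitrary) that evaluates the standard coherent-QPSK bit error expression Q(√(2·S/(NoR))). It shows the error rate crossing below one in 100,000 near 10 dB, which is why the uncoded rate is limited to roughly (1/10)S/No.

```python
import math

def q_function(x):
    """Tail probability of the standard normal distribution."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def qpsk_bit_error_rate(snr_per_bit):
    """Approximate bit error probability of coherent QPSK; snr_per_bit = S/(No*R) in linear units."""
    return q_function(math.sqrt(2 * snr_per_bit))

# Sweep S/(No*R) in dB and watch where the error rate drops below 1e-5.
for snr_db in range(4, 13):
    snr = 10 ** (snr_db / 10)
    print(f"S/(No*R) = {snr_db:2d} dB  ->  bit error rate ~ {qpsk_bit_error_rate(snr):.1e}")
# The 1e-5 threshold is reached near 10 dB, i.e. S/(No*R) ~ 10, hence R < (1/10)*S/No.
```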

On the other hand, were we to aggregate a large number of bits and employ them to collectively modulate a large number of independent dimensions, we would in effect be averaging over a large number of noise components. Claude Shannon showed in 1948 that by so doing, employing “error-correcting coding,” in the limit error rates can be made arbitrarily small as long as the number of transmitted bits/dimension is no greater than (1/2) Log(1+S/NoW), where the logarithm is to the base 2. Thus the limitation is merely R<W Log(1+S/NoW). If we place no limit on bandwidth, the bit rate limit becomes R<(1/ln 2)S/No, which is about 14 times as large as for the uncoded single-dimension modulation. Even if we constrain the bandwidth to be the same as for the uncoded example, so that R=2W, still Shannon’s limit implies R<(2/3)S/No, an improvement of a factor of 6.7 over the uncoded case with the same bandwidth.
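The three rate limits quoted above can be checked with a few lines of arithmetic; the brief Python sketch below (illustrative only) reproduces the factors of roughly 14 and 6.7 directly from Shannon’s formula.

```python
import math

s_over_no = 1.0  # express every limit as a multiple of S/No

# Uncoded QPSK at one bit per dimension (previous paragraph): R < (1/10)*S/No.
uncoded = 0.1 * s_over_no

# Shannon limit with unlimited bandwidth: R < S/(No*ln 2).
unlimited_bw = s_over_no / math.log(2)

# Shannon limit with the same bandwidth as the uncoded case (R = 2W):
# 2W = W*log2(1 + S/(No*W)) requires S/(No*W) = 3, so R = 2W < (2/3)*S/No.
same_bw = (2.0 / 3.0) * s_over_no

print(f"uncoded QPSK        : {uncoded:.2f} * S/No")
print(f"unlimited bandwidth : {unlimited_bw:.2f} * S/No  ({unlimited_bw / uncoded:.1f}x)")
print(f"same bandwidth      : {same_bw:.2f} * S/No  ({same_bw / uncoded:.1f}x)")
```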

Greater improvements result when the channel is fading. In the best case, where the modulation is such that successive fading dimensions can be made nearly independent, averaging over multiple dimensions, even without coding, produces major improvements. This results from the fact that if each dimension is modulated independently, the error rate is inversely proportional to the signal-to-noise ratio; more precisely, it equals (½)/(1+S/NoR), where S is now the average received power for the Rayleigh fading signal. This means that to achieve an error rate of one error in 100,000 bits, we would need a signal-to-noise ratio on the order of 50 dB. But if we spread the signal over n independent dimensions, allocating only 1/n-th of the power to each dimension, then as n increases the error rate approaches the exponential behavior of the unfaded case. Applying coding on top of this, performance can even approach the Shannon limit of the unfaded case. We may conclude that, since dimensionality is proportional to bandwidth and the more dimensions available the more we can average over, a wideband system is far preferable to a narrowband one. By the way, what we have chosen to call “averaging” is also referred to as spectral or temporal diversity.
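The benefit of spreading over independently faded dimensions can also be seen numerically. The sketch below is my own semi-analytic Monte Carlo illustration, assuming coherent binary signaling with maximal-ratio combining of the n dimensions (a slightly different detection model than the one behind the (½)/(1+S/NoR) expression, but it shows the same trend).

```python
import math
import random

def faded_ber(avg_snr_db, n_dims, trials=100_000):
    """Estimate the bit error rate when the signal energy is split equally over
    n_dims independent Rayleigh-faded dimensions and combined coherently."""
    avg_snr = 10 ** (avg_snr_db / 10)
    per_dim = avg_snr / n_dims
    total = 0.0
    for _ in range(trials):
        # Each dimension's power gain |h|^2 is exponentially distributed (Rayleigh amplitude).
        combined_snr = sum(random.expovariate(1.0) * per_dim for _ in range(n_dims))
        total += 0.5 * math.erfc(math.sqrt(combined_snr))  # error probability given this fade
    return total / trials

random.seed(1)
for n in (1, 2, 4, 8):
    print(f"n = {n}: BER ~ {faded_ber(20, n):.1e} at 20 dB average SNR")
# With n = 1 the error rate falls only inversely with SNR; larger n drives it
# down far faster, approaching the exponential behavior of the unfaded channel.
```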

There are, however, two other means for improving performance beyond spectral averaging through wideband transmission. One is to provide more dimensions through spatial diversity. This can be achieved with multiple antenna arrays at base stations and perhaps dual antennas in user terminals; the degree of independence among spatial dimensions depends on their separation relative to the wavelength. “Space-time” coding can be employed to better exploit the combined spectral and spatial diversity.
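A concrete and widely used instance of space-time coding is the two-antenna Alamouti block code; the column does not name it, so the sketch below should be read as one hypothetical illustration of the transmit-side mapping only.

```python
def alamouti_encode(s1: complex, s2: complex):
    """Map a pair of modulation symbols onto two antennas over two symbol periods
    (the Alamouti space-time block code). A single-antenna receiver can then
    separate the two symbols with simple linear combining, obtaining two-fold
    spatial diversity without extra bandwidth."""
    period_1 = (s1, s2)                           # antenna 1 sends s1, antenna 2 sends s2
    period_2 = (-s2.conjugate(), s1.conjugate())  # orthogonal retransmission of the pair
    return [period_1, period_2]

# Example with two QPSK symbols:
print(alamouti_encode(1 + 1j, 1 - 1j))
```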

While coding is certainly one form of “smart” averaging, there is another which may be employed in a complementary manner to achieve even greater performance and efficiency gains. This other technique, which we may call smart scheduling, relies on knowledge of the current channel characteristics, obtained through measurements. In personal communication systems, measurement is facilitated by the two-way links between user terminal and base stations, or access points. Thus the user’s terminal can measure the quality of the signal it receives on any of the dimensions and relay this information to the base station. The base station can then employ the better dimensions (spectral, temporal or spatial) to transmit at a higher data rate and the worse ones at a lower rate. The same can be done in the reverse direction, with the base station performing measurements and relaying them to the terminal. Of course, performing measurements and relaying them can introduce some latency, which may be detrimental to voice conversations but has negligible impact on data exchange. Incidentally, as applied to spectral diversity, this approach is a time-honored information theory concept known as “water filling.” Another term for this approach is “weighted” averaging.
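For the curious, a minimal water-filling sketch follows (my own illustration; the function name and example numbers are arbitrary). It allocates a fixed power budget across dimensions of unequal measured quality, giving more to the better dimensions and abandoning the worst.

```python
def water_fill(channel_gains, total_power):
    """Allocate total_power across dimensions with the classic water-filling rule.
    channel_gains[i] is the signal-to-noise ratio of dimension i at unit power."""
    ranked = sorted(enumerate(channel_gains), key=lambda kv: kv[1], reverse=True)
    allocation = [0.0] * len(channel_gains)
    # Use only the k best dimensions, shrinking k until every allocation is positive.
    for k in range(len(ranked), 0, -1):
        active = ranked[:k]
        water_level = (total_power + sum(1.0 / g for _, g in active)) / k
        if all(water_level > 1.0 / g for _, g in active):
            for idx, g in active:
                allocation[idx] = water_level - 1.0 / g
            break
    return allocation

# Four dimensions of very unequal quality sharing one unit of power:
print(water_fill([4.0, 1.0, 0.25, 0.1], total_power=1.0))
# -> roughly [0.875, 0.125, 0.0, 0.0]: the two poor dimensions get nothing.
```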

To summarize our “technology lesson,” wideband signaling is highly advantageous. Its favored implementation has been through code division multiple access (CDMA). Coding and scheduling, with their required channel measurements, are easily integrated with this wideband modulation, but other wideband modulation techniques also show promise for implementing the basic concepts of “smart” averaging.

Industry and Market Structure

While the fruits of a half century of evolution of communication theory and a quarter century of dramatic improvements in its implementation by integrated semiconductor circuits have brought clarity to the technology, the same cannot generally be said for the business side of personal and mobile communication. There is, however, one trend which seems to be clearly established. This is the nature of the three-tiered “pipeline” from the innovator to the ultimate user of new services. The innovator is a technologist employed by a (usually small) enterprise whose business is the design of semiconductor chips which provide ever improved performance or novel features. The enterprise is often a so-called “fabless semiconductor” company, meaning that it creates and proves the feasibility of the desired functions, usually through software simulation or hardware emulation; it then designs the chip to implement the functions and contracts with a semiconductor fabrication facility, known as a “foundry,” to fabricate the resulting design in quantity. If the likely quantities are only small to moderate, a less expensive approach is to use standard digital signal processing chips or programmable logic devices known as Field Programmable Gate Arrays (FPGAs).

The next tier is that of the manufacturers who produce the user’s terminal and the network and base station infrastructure of the cellular system; in some notable cases, manufacturers concentrate on either the consumer terminal or the infrastructure, but not both. Success in this most competitive arena requires massive size and manufacturing capacity. In fact, only a handful of companies worldwide have proven successful. For the most part, they have both the design and foundry capabilities as well as the necessary powerful manufacturing and marketing muscle. Yet the innovations have usually come from much smaller entrepreneurial organizations. Furthermore, the infrastructure manufacturers, and to a lesser extent the handset manufacturers, have often been required to finance sales to their customers, the carriers which provide the service to the users, thereby sharing the pain of the final tier, whose woes we consider next.

The carriers or service providers have been to a certain extent victims of their early success. With hundreds of millions of subscribers in each of the three major regions and over a billion subscribers worldwide, wireless voice communication has surpassed all expectations. While the service is twenty years old, the major growth has come in the second, “digital” decade. Voice telephony has been the predominant application, but a few early data applications have been successful, notably NTT DoCoMo’s i-mode offerings in Japan and SMS (short message service) primarily in Europe.

Whereas these are low data-rate applications consuming only the equivalent of voice bandwidth, it was assumed that high data-rate services, such as streaming video and real-time game playing, would be even more successful. With this hope and dream, carriers in many regions, notably North America and Europe, invested billions in wideband spectrum licenses, usually awarded through auctions. This not only raised their debt load to irrational levels, but it also increased competition, potentially lowering profitability. Beyond this, infrastructure had to be put in place or upgraded to handle the new high-speed services. In some cases the vendor manufacturers fully financed their equipment sales, but ultimately the loans came due.

By this time CDMA had been almost universally accepted as the technology of choice for data as well as voice. The International Telecommunication Union (ITU) endorsed three versions, the two prevailing ones being CDMA2000, an evolution of a previous CDMA standard, and WCDMA, also known as UMTS, not much different but favored by those carriers who had previously implemented time-division multiple access (TDMA) standards such as GSM in Europe and PDC in Japan. Needless to say, the evolutionary CDMA2000 standard was much cheaper and easier to deploy for carriers who already had up to seven years of experience with the basic technology, resulting in rapid acceptance, with tens of millions of subscribers already using it and achieving data rates two to three times those of fixed dialup lines. On the other hand, conversion of TDMA users to the new WCDMA standard has so far been very slow, with Japan leading, though with only a few hundred thousand users.

To compound the carriers’ difficulties, the necessary consumer education and training in the advantages of high-speed data to the handheld terminal has been woefully slow and in some cases totally lacking. Fortunately for the CDMA2000 carrier community, this evolutionary standard has almost doubled the spectral efficiency for voice, so the investments are paying off even though data usage is still low. The GSM/TDMA community, on the other hand, has been reluctant to invest in the new WCDMA standard, opting instead to upgrade GSM to somewhat higher data speeds through a standard called GPRS.

Overall, some carriers in the Americas and Asia are moving ahead well with CDMA2000. Others in the same regions and throughout Europe are slowly and tentatively converting to WCDMA. The major growth areas are now China and India, which will most likely tip the balance for the future of wideband data and voice services.

The Wireless Access Muddle

To compound the complexities and challenges for the wireless industry, new paradigms of service have arisen in the past few years. These have been characterized in the following manner:

Wide Area Networks (WAN),

Local Area Networks (LAN),

Personal Area Networks (PAN).

(The first two are sometimes preceded by a “W” for Wireless, since the term also applies to fixed wireline networks.)

WAN is essentially another name for the current cellular networks. PAN is the generic nomenclature for “Bluetooth,” the short-range wireless links which replace cabling over small distances and which are also replacing infrared in controllers. But it has been the wireless LAN which has caused the most furor in the industry, with dire predictions for the future of high-speed data via WANs. The development of the LAN began primarily with the adoption of the IEEE 802.11 standard for the unlicensed 900 and 2400 MHz bands. This comes in various flavors (a through g), but it is the 802.11b standard which predominates, commercially known as Wi-Fi. Originally intended to operate at 1 Mbit/second, with reasonable robustness provided by a moderate degree of wideband spectrum spreading, it is being pushed up to 11 and even 22 Mbits/sec., with far less robustness, particularly in the presence of other (unlicensed) emissions in the same band. For the home and the enterprise, the market has blossomed rapidly. With this low-cost device, the individual can roam with laptop or PDA about his or her surroundings and maintain connectivity to the server or Internet. In a sense this is the equivalent for data of the cordless phone for voice.

But it is the potential of its use outside the home or office, at so-called “Hot Spots,” which has drawn the attention of investors who see it as the low-cost replacement for the high-speed WAN (WCDMA/UMTS in particular). In our opinion, this is not likely to happen, for at least two reasons. The first is the cost of connecting to broadband wired networks, given the limited customer attendance per location. This has driven out previous entrepreneurial ventures; the service is currently provided primarily by a traditional cellular carrier as an auxiliary to its core cellular business. The other reason is that it lacks mobility and universal coverage, the two strong suits of the wireless WAN. CDMA2000 data service, for example, is provided throughout all metropolitan areas in the U.S. by two carriers. While its typical data rate is only one-tenth that of Wi-Fi, it suffices for most requirements. An extension known as CDMA2000 1xEV-DO reaches downlink speeds comparable to Wi-Fi and provides an experience better than DSL for streaming video to the laptop, for example.

Ultimately, though, the real impediment for high-speed data is the input-output bottleneck. Data-savvy users of the Internet and of shared files and services are accustomed to a “desktop experience,” meaning large screens and comfortable keyboards. Handsets, even with the enhanced features of palm-sized PDAs, provide neither. For these, truly high-speed data provides little or no advantage over medium speeds in the low hundreds of Kbits/second. Only when this input-output bottleneck is widened will multi-megabit/sec. data be useful. In the meantime, the value of mobility and ubiquity, as well as low cost (as a consequence of economies of scale through coexistence with voice services), will preserve the primacy of the wireless cellular WAN.


About the Author


Dr. Andrew Viterbi is a co-founder and retired Vice Chairman and Chief Technical Officer of QUALCOMM Incorporated. He spent equal portions of his career in industry, having previously co-founded Linkabit Corporation, and in academia as Professor in the Schools of Engineering and Applied Science, first at UCLA and then at UCSD, at which he is now Professor Emeritus. He is currently president of the Viterbi Group, a technical advisory and investment company.

His principal research contribution, the Viterbi Algorithm, is used in most digital cellular phones and digital satellite receivers, as well as in such diverse fields as magnetic recording, voice recognition and DNA sequence analysis. More recently, he concentrated his efforts on establishing CDMA as the multiple access technology of choice for cellular telephony and wireless data communication.

Dr. Viterbi has received numerous honors both in the U.S. and internationally. Among these are four honorary doctorates, from the Universities of Waterloo (Canada), Rome (Italy), Technion (Israel) and Notre Dame, as well as memberships in the National Academy of Engineering, the National Academy of Sciences and the American Academy of Arts and Sciences. He has received the Marconi International Fellowship Award, the IEEE Alexander Graham Bell and Claude Shannon Awards, the NEC C&C Award, the Eduard Rhein Foundation Award and the Christopher Columbus Medal. He has received an honorary title from the President of Italy and he has served on the U.S. President’s Information Technology Advisory Committee.

Viterbi serves on boards of numerous non-profit institutions, including the University of Southern California, UC President’s Council for the National Laboratories, MIT Visiting Committee for Electrical Engineering and Computer Science, Mathematical Sciences Research Institute, Burnham Institute and Scripps Cancer Center.

(3/21/2003)

 

