Tuesday, September 30, 2008

Expect whining: The Boomers in Retirement

McKinsey did a study (free registration required) of the impact of retiring baby boomers in November 2007. As one might expect, the report is brimming with factoids. The ones that jumped out at me:
  • 60 percent of the boomers won’t be able to maintain a lifestyle close to their current one without continuing to work.
  • The same percentage of older boomers already suffers from chronic health problems.
  • 43 percent already are frustrated that they aren’t leading the lives they expected.
But society will not be able to ignore their (our – I just scraped into the category) whining. More factoids:
  • Boomers will control nearly 60 percent of US net wealth in 2015 (see Exhibit 2).
  • There will be more than 45 million households with people from 51 to 70 years old, compared with about 25 million for the “silent” generation, born from 1925 to 1945.
  • Their real disposable income and consumption will be roughly 40 percent higher than the silent generation’s.
Doing a Rumsfeldian analysis of knowns and unknowns: 24 million boomers have not prepared for retirement (this excludes 11 million who are simply disadvantaged and 10 million who are affluent); of these, 13 million know they’re unprepared, but 11 million don’t know they’re unprepared (Exhibit 3).

Guess who’ll be demanding a bail-out of their own in 2015, just as we’re (hopefully) getting over the current one?

Monday, September 22, 2008

Spectrum auctions aren’t neutral

Spectrum allocation isn’t neutral regarding technology or services, even when there are no obvious strings attached. Even when regulators appear not to be making technology bets, as in contemporary auctions, they are in fact doing so.

In the old days, radio licenses were issued via beauty contests, with stringent conditions on the kind of services and technology that a licensee could use. Television is a good example: the license limits use to TV broadcasts, and specifies the technology to be used (such as NTSC analog and ATSC digital in the US). Such “command and control” regulation has been much decried in recent years: industry is better placed to make technology decisions than the government, say the Right, and beauty contests lead to windfall profits, say the Left.

The Right favors the auction of flexible-use licenses which don’t specify the service to be offered or the technology to be used. This has been largely implemented in the cellular bands, where operators could choose whether and when to move from analog to digital technology, and which technology to use. [1] AT&T, for example, uses the TDMA technology, while Verizon uses CDMA.

However, even if conditions aren’t attached to the frequencies, the way in which frequencies are packaged into bands limits the technologies that can be used, and thus the services that can be offered. A current example is the debate in Europe about allocating spectrum for future wireless services. Unstrung reports:
. . . the CEPT working group will likely recommend two band plan options: one for frequency division duplex (FDD), which uses different frequencies to transmit and receive signals, and the other for time division duplex (TDD), which uses one channel and timed transmissions to send and receive. Cellular operators have traditionally favored FDD systems. [my italics]
Regulators will have to make a choice between FDD and TDD, which entails a choice between services and vendors. FDD is voice-oriented and aligned with the cellular industry (UMTS) and Qualcomm, while TDD is data-oriented and aligned with the WiMAX camp and Intel.

--- Note ---

[1] Some remnants of the old philosophy remain. The PCS rules, for example, allow licensees to provide any mobile communications service, but prohibit broadcasting as defined in the Communications Act.

Wednesday, September 17, 2008

Rebooting the broadband debate

Rob Atkinson and his colleagues at ITIF have written an even-handed and insightful report on “Explaining International Broadband Leadership”.

They found that while the United States is behind other countries in broadband deployment, speed and price, it can’t all be blamed on the government – but that good policies can make a difference. It’s harder than many on the Left claim to find a silver bullet in the experience of some other country (France, South Korea, etc.), but despite claims from the Right, one can learn something from their best practices. Government leadership and competition matter, but so do market incentives on both the supply and demand side.

Atkinson, Correa and Hedlund’s headline policy recommendation is that we end the “either-or shouting matches”. But how? They call for a “pragmatic discussion”, but that’s the end result, not the means to it. It’s true, as they say, that we should be able to agree that the United States can do better on broadband, but we can only move beyond a divisive and unproductive debate if the conversation is reframed – and if we can recruit new, less entrenched, participants to the table.

One could engage industry and society at large (rather than just companies and activists with narrow issue agendas) if broadband were tied to commercial and personal success.

Workforce development: “Telecommuting” is a very tired meme nowadays, but it had power back in the day. If the Fortune 500 came to believe that universal affordable broadband would make them more competitive, and if the AFL-CIO came to believe that it would make workers more employable, then the debate might shift.

More sales: The “information superhighway” is just as tired, but the notion that the interstates and local roads are good for business is as true now as it ever was. If US retailers of goods and services (including entertainment) came to believe that they’d generate more profitable sales if the network was faster and cheaper, and if populist protectionists came to believe that fast local broadband was a bulwark against losing business to them furriners, then the debate might shift.

Energy: If one could make an argument that the US could get to energy independence sooner by moving bits rather than atoms, then the debate might shift. Gas prices will fluctuate, but the trend will be up. If you’re moving atoms, the world isn’t flat. Broadband can enable gains from local specialization based on knowledge, rather than production costs of commodities.

Wednesday, September 10, 2008

Protecting receivers vs. authorizing transmitters


When governments hand out permissions to operate radios – licenses, for example – they think in terms of transmitters: within the licensed frequency range you can broadcast at such-and-such a power, and outside those frequencies you can transmit only at some other, much lower, power. [1] This distribution of broadcast power is often called a “transmission mask”.
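As a concrete (and entirely hypothetical) illustration, a transmission mask can be thought of as a table of power limits per frequency range, and compliance as a simple lookup. The band edges and power limits below are made up for the sketch, not taken from any real license:

```python
# Hypothetical transmission mask: maximum allowed power (dBm) per frequency
# range (MHz). All numbers are illustrative, not from any actual license.
TRANSMISSION_MASK = [
    (1900.0, 1910.0, 43.0),   # licensed band: high power permitted
    (1910.0, 1915.0, -13.0),  # adjacent frequencies: much lower limit
]

def is_compliant(emissions):
    """Check a list of (freq_mhz, power_dbm) measurements against the mask."""
    for freq, power in emissions:
        for lo, hi, limit in TRANSMISSION_MASK:
            if lo <= freq < hi and power > limit:
                return False
    return True

print(is_compliant([(1905.0, 40.0), (1912.0, -20.0)]))  # True: both under limits
print(is_compliant([(1912.0, 0.0)]))                    # False: out-of-band too hot
```

The point of the structure is that the license says nothing about what the licensee receives – only what it may emit, in and out of band.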

In thinking about new ways of regulating radio, I’ve come to believe that a transmission mask is not sufficient; it helps to include receiver parameters. [2] But that’s not the point of this post; if you’re interested, read my paper at SSRN.

Today’s question is: if transmitter parameters are not sufficient, could one do without them completely? Can you go the whole hog, and only specify a receiver mask? I think you can, and I’m encouraged that Dale Hatfield tells me Robert Matheson concluded this some time ago, though I haven’t found a reference yet.

Receiver and transmitter parameters are figure and ground. Assume a steady state where all systems operate without interfering with each other. Transmissions will by definition have been chosen to prevent interference with receivers in adjacent frequency bands. The result of all the transmissions is a level of electromagnetic energy at every receiver which is low enough that no harmful interference results. (I’m ignoring receiver specifications [2] and the geographical distribution of transmitters in this discussion.) Each transmission propagates through space to a receiver, resulting in the allowed receiver mask:
{all transmission masks} -> {propagation} -> {resulting receiver mask}
One can also invert the calculation: given a receiver mask and propagation model, one can determine what the allowed transmissions should be.
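The forward calculation can be sketched in a few lines, assuming free-space path loss as the propagation model; the transmitter powers, distances, and receiver mask level below are made-up numbers:

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB (standard formula for metres and MHz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

def received_power_dbm(transmitters, freq_mhz):
    """Aggregate power at one receiver from all transmitters.
    transmitters: list of (tx_power_dbm, distance_m). Powers add in milliwatts,
    so convert out of dB, sum, and convert back."""
    total_mw = sum(10 ** ((p - fspl_db(d, freq_mhz)) / 10) for p, d in transmitters)
    return 10 * math.log10(total_mw)

# Two hypothetical transmitters near one receiver, at 1900 MHz.
level = received_power_dbm([(43.0, 1000.0), (30.0, 500.0)], 1900.0)
RECEIVER_MASK_DBM = -40.0   # illustrative allowed level at the receiver
print(level < RECEIVER_MASK_DBM)  # True: aggregate energy stays under the mask
```

Running the inversion the post describes amounts to solving the same relationship the other way: fixing the allowed level at the receiver and the propagation model, then backing out the transmit powers that keep the aggregate under it.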

A license defined in terms of receiver masks would allow the licensee to transmit any amount of energy, as long as it does not exceed the mask of anyone else. It would guarantee that nobody else is allowed to radiate energy that causes the allowed levels at the licensee’s receiver to be exceeded.

One can compare reception-based vs. transmission-based licenses by thinking about property rights. The two important attributes here are exclusivity (if I’m a licensee, I can prevent anyone from transmitting in my frequency “band”), and autonomy (within my “band”, I can do what I like, notably acting in a way that makes money). [3], [4]

A transmitter-based license focuses on autonomy by defining transmission parameters. It specifies what a licensee is allowed to do, but it doesn’t provide any guarantee of exclusivity. A receiver-only right reverses this emphasis: by specifying what would amount to harm to a receiver, it provides a way to make exclusivity real in practice. The constraints on autonomy are implicit in the receiver-rights of others: as long as a licensee doesn’t interfere with other licensees’ exclusivity, it can do what it likes.

Anything transmitter-only rights can do, receiver-only rights can also do: they are mirror images of each other. The information burdens are mirror images, too. Receiver rights place a burden on all the other rights holders to figure out whether their transmissions will transgress a receiver spec. Transmission rights don’t impose an information overhead up front, but the rights holder bears the burden of uncertainty: they may be blindsided at some future date by a new transmission right that reduces the value of the system they’ve deployed. The fight between M2Z and T-Mobile is a good example: T-Mobile claims that the terms of the new cellular license M2Z seeks (AWS-3) will cause harmful interference to its operations under a current license (AWS-1).

I like receiver-only rights because they put the focus on the ability of a wireless system to operate in the presence of noise, which one can only do by taking receiver parameters into account. However, enforcement proceedings may appear more complicated in this case, since it isn’t obvious which of many transmissions is responsible for a receiver mask being exceeded: a spectrum analyzer at a receiver can only measure the sum of all radiated power. Today the regulator has what looks like a big stick: it can objectively check whether a licensee’s equipment meets or violates its approved transmission mask. The stick doesn’t actually help solve the most difficult interference problem, the case where all transmissions meet their masks, but it gives the regulator power that it will be loath to give up.
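The attribution problem is easy to illustrate: because received powers add in linear (milliwatt) units, very different mixes of transmissions can produce essentially the same aggregate reading on a spectrum analyzer. The numbers below are illustrative:

```python
import math

def aggregate_dbm(contributions_dbm):
    """What a spectrum analyzer sees at a receiver: the sum, in milliwatts,
    of every individual received power, expressed back in dBm."""
    return 10 * math.log10(sum(10 ** (p / 10) for p in contributions_dbm))

# Two different scenarios that are indistinguishable at the receiver:
scenario_a = [-50.0, -50.0]    # two equal contributors
scenario_b = [-47.02, -70.0]   # one dominant contributor, one negligible
print(round(aggregate_dbm(scenario_a), 1))  # -47.0
print(round(aggregate_dbm(scenario_b), 1))  # -47.0
```

An enforcement agency measuring −47 dBm at the receiver cannot tell, from that number alone, which of these situations it is in – hence the appeal, to a regulator, of per-transmitter masks it can check one radio at a time.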

Conclusion

There is a choice in creating radio rights between protecting receivers and authorizing transmitters – or some mixture of the two. The current approach limits the discussion to ways of authorizing transmitters. A more nuanced analysis of the trade-offs is required; this post has only begun it.

Notes

[1] This is going to get pretty geeky, so I’m going to leave out a lot of other important stuff, including geography (radio licenses are typically restricted to a certain area) and duty cycles (how often transmitters operate).

[2] Receiver parameters. I distinguish between a receiver mask, by which I mean a distribution of RF energy (i.e., a spectrum) which represents the worst-case noise environment in which a receiver needs to be able to operate, and a receiver specification, by which I mean the ability of the receiver to detect and decode wanted signals in the presence of unwanted ones.

[3] According to Gary Libecap in Contracting for Property Rights (1989), p. 2: “Private ownership . . . may involve a variety of rights, including the right to exclude nonowners from access, the right to appropriate the stream of rents from use of and investments in the resource, and the right to sell or otherwise transfer the resource to others.”

[4] “Band” here means the collection of constraints on operation. In the conventional approach, it’s usually taken to mean a frequency band and geographical region.