Inside Intel's Intellectually Dubious Patent Study

Joseph Schuman
July 17, 2014

At the heart of policy disputes over standard essential patents is a simple truth: Companies whose products depend on standardized technologies want to increase their profit margins by cutting input costs – the royalties they pay to use standardized technologies invented and patented by other companies.

In other words, policy conflicts over standard essential patents (SEPs) tend to pit implementing companies against inventing-and-licensing companies, one business model against another.

So when in the course of patent policymaking it becomes necessary to examine the worthiness of alleged scholarship about SEPs, a decent respect for the consumers and markets ultimately affected requires policy makers to examine the scholarship’s origin and separate fact from advocacy.

Such is the case for a new “working paper” that entered the standards debate last month with a controversial thesis that generated headlines and a lot of discussion in patent circles. Its title: “The Smartphone Royalty Stack: Surveying Royalty Demands for the Components Within Modern Smartphones.” Its authors: Ann Armstrong, associate general counsel at Intel, as well as Joseph J. Mueller and Timothy D. Syrett, two lawyers at Wilmer Cutler Pickering Hale & Dorr who work for Intel.

Before we get into the gist of the paper, readers will benefit from a quick historical recap of Intel’s interest and role in the SEP debate, and some general background on SEPs and how standard setting works.

Why SEPs Matter

Most people rarely, if ever, think about patents and the legal and economic innovation ecosystem that helps bring to the world the inventions they use every day. And almost no one outside the technology fields that require standards thinks about SEPs. That’s understandable. People take for granted that, for example, their smartphone or Wi-Fi-connected laptop connects them to their grandmother, their bosses and their favorite TV show whether they’re in Bakersfield, Bern or Beijing, and whether they’re using the hottest new device or a cheaper, less feature-filled rival. To the uninitiated, it doesn’t matter that this connectivity is the result of decades of high-risk R&D and hundreds of thousands of hours of work by engineers from different companies at standard-setting organizations (SSOs), collaborating to agree on the best technical solutions for harnessing the laws of physics for the sake of connecting people.

And if standards are too esoteric a subject to be on the average person’s radar, why should they care about SEPs?

Here’s why, and we’ll use as an example a general rundown of how things work at 3GPP, the main SSO defining the ubiquitous 3G and 4G wireless cellular standards that make it possible for different devices and networks to communicate with one another and support their ever-increasing functionality. Participants in 3GPP, which is voluntary and open, include a handful of companies that invent most of the patented technologies included in the standard and many, many other companies that use those technologies in their products. Engineers from all these companies regularly get together to discuss whether to adopt a new technical solution – discussions that can take years and often require several rounds of new work on the technology. Then they vote on whether to accept it. 3GPP uses a consensus-based approach, and any member can object; if consensus fails, a super-majority of 71 percent is required for adoption. To recap: A strong majority of the SSO member companies who get to decide which patented technologies to include in the standard are the companies who will be using and paying to license those technologies once they have become part of the standard.

Meanwhile, the SSOs’ intellectual-property rules generally require a commitment from the inventing companies. In essence, the inventing companies agree that if their patented technology is adopted in the standard, they will offer the technology to willing licensees at terms that are FRAND (sometimes called RAND): fair, reasonable and non-discriminatory. The goal is to maintain the patent-protection incentive for inventing companies to invest and take risks, while assuring that owners of technology in the standard don’t refuse to license their SEPs on reasonable terms.

Beyond the world of standards, it is easy to forget that investment in the development of new technologies is risky, and that the early developers of new technologies take much bigger risks than the late entrants to a market who employ the technology only after it is widely adopted. Inventors know they will likely fail a lot before they make something work, and that even with technological success there’s no guarantee commercialization will happen. Engineers working on new technologies cannot know in advance whether their finished product will be adopted by an SSO.

The current FRAND regime has resulted in a global smartphone ecosystem in which consumers continually benefit from improving technology and declining prices. Old and new rival handset makers all have access to the same technology standards. This means that to compete for consumers’ attention at the high end of the smartphone market they must be more innovative with design and functionality, while at the low end commoditization of the product pushes prices down. The global ubiquity and constantly increasing variety of smartphones and cellphones available to consumers speaks for itself.

Smartphones are not an exception among SEP-heavy industries. Stanford University economist Stephen Haber, University of California at Berkeley economist Ross Levine and Alexander Galetovic of the Universidad de los Andes looked at six decades of price data from the U.S. Bureau of Economic Analysis and the Bureau of Labor Statistics, and concluded that consumer prices fall much more quickly in SEP-heavy industries than they do in industries dominated by non-standard-essential patents. (Patent Truth will take a deeper look at this study sometime soon.)

At the risk of belaboring the point, we’ll say again: Most people, most consumers never have to think about SEPs because the system enabled by SEPs works – at least in practice. But in theory – or rather, in specific theories – there are problems galore.

So Much for Smartphones

Such theories go back at least as far as 2007, when Stanford law professor Mark Lemley published his seminal SEP cri de coeur, “Ten Things to Do about Patent Holdup of Standards (and One Not To).” In the paper, Lemley pointed out that “hundreds of thousands of patents cover semiconductor, software, telecommunications and Internet inventions,” and that “innovation often requires the combination of a number of different patents.” He then argued that if a given product includes technologies covered by 5,000 patents, with each patent holder seeking a paltry share of the product’s price – say, 1 percent – the total royalties could stack up to 5,000 percent! It was a curious theory, but Lemley offered no empirical evidence to back it up.
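The arithmetic behind Lemley’s hypothetical is a straight summation, and it can be spelled out in a few lines of Python. The function name and the no-offsets assumption are ours, for illustration only; the 5,000-patent figure and 1 percent rate are his hypothetical, not measured data.

```python
# Lemley's hypothetical, spelled out: 5,000 patents, each holder seeking
# 1 percent of the product's price, naively summed with no cross-licensing,
# portfolio discounts or negotiation (his assumption, not observed data).

def stacked_royalty_pct(num_patents: int, rate_pct_each: float) -> float:
    """Naive royalty stack: every holder's demand simply added up."""
    return num_patents * rate_pct_each

total = stacked_royalty_pct(5_000, 1.0)
print(f"Naive stacked royalty: {total:,.0f}% of the product price")
# Prints 5,000% -- fifty times the product's price, which is exactly why
# the figure is rhetorically striking and empirically never observed.
```

The striking result depends entirely on the assumption that every demand is paid in full and in cash, with none of the cross-licensing the article discusses below.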

One product Lemley did cite as a potential candidate at risk for such a business-ruining threat was an Intel microprocessor. We’ll get back to the Intel example in a moment.

Now, Lemley also cited the threat of so-called royalty stacking to the 3G telecom industry, which he noted was especially encumbered by so many standard-essential patents. “This is not a formula for a successful product,” Lemley said.

But it’s interesting to note that a term Lemley didn’t use in this paper was “smartphone,” which these days is the best-known product to use 3G and now 4G technology.

That omission is easy to understand. As a fully fledged product category, the smartphone didn’t yet exist when he posited this theory.

As it happens, in mid-2007, the same year Lemley published his paper, Apple introduced the iPhone to the world – a successful product by nearly any measure. The iPhone wasn’t the first smartphone, but it shook up the industry, with Apple stealing Nokia’s long-held wireless crown. It didn’t take long before Samsung was breathing down Apple’s neck. With its thriving competition and a market of consumer choices rife with innovation, the smartphone industry would seem to be a bad example for anyone seeking to prove the existence of royalty stacking or other theoretical problems attributable to SEPs.

“Intel Inside”

It was Patent Truth’s colleague, Kirti Gupta, who brought to mind the 2007 Lemley study. Gupta is an engineer who took part in the standards process before going back to school to get her PhD in economics and becoming an economist who studies these issues. (Full disclosure: Gupta and the author of Patent Truth both work for Qualcomm, which makes a considerable share of its revenue from licensing its smartphone-empowering technologies.)

History, she noted, “has provided us with clear evidence. Intel’s microprocessors have historically demonstrated a huge success in the marketplace [while] bundling thousands of patented inventions. Several of us carry laptops with the moniker ‘Intel Inside,’ referring to the microprocessors which, according to Lemley, may be covered by 5,000 patents or more. The prices for these microprocessors dropped significantly over time, again, defying the theory of royalty stacking.”

One of the principal mitigating factors is the significant amount of cross-licensing that takes place in industries built on products covered by multiple patents, according to economists who surveyed such firms. “The research finds that in the complex technologies, [with] fragmented patent rights, firms display extensive cross-licensing agreements. Therefore, innovation does not stop,” Gupta said.

Flash forward to the working paper on royalty stacking recently produced by Intel.

The authors of the paper say that using publicly available information, they find evidence of smartphone royalty stacking, which they define as a situation “in which the cumulative demands of patent holders across the relevant technology or the device threaten to make it economically unviable to offer the product.” [Underline added.] Specifically, they “estimate potential patent royalties in excess of $120 on a hypothetical $400 smartphone,” which they say is nearly equal to the cost of the physical components that go into the device.

A caveat, or two or three: The authors say that if they are privy to “confidential licensing information through our in-house or litigation work, we do not report it in this article, in any way.” And in arriving at the theoretical cost of stacked smartphone royalties, they also decided to exclude “off-sets such as ‘payments’ made in the form of cross-licenses and patent exhaustion arising from licensed sales by component suppliers.” In other words, cross-licensing, one of the most significant industry factors affecting royalties, is one of the factors being ignored.

The authors express this caveat several times at the start of their working paper, and they then add: “Conducting such an analysis on an industry-wide basis presents significant practical challenges, because of the paucity of publicly-available information about the scope of many existing cross-licenses. Cross-licenses and pass-through rights could be expected to significantly decrease the monetary payments made by companies with large patent portfolios.”

The purported value of the paper would seem to diminish significantly at that point. On top of that, the authors don’t say whether, to avoid the speculation that defines their paper, they tried to obtain the hard data that does exist on such royalties from the handset makers, for whom royalties are an important input cost.

They simply don’t have the data, which leaves them with theory and little else.

Lastly, the theories about “royalty stacking” all begin with and rely on a presumption that should appear absurd to any reader who takes a moment to step back and consider it: that the patented technologies behind the core functions of a tech product should be treated as less valuable than the physical materials, assembly, distribution or marketing costs that also determine the product’s consumer price.

As readers of Patent Truth well know, this preference for theory over empirical data – and especially theory cloaked in the guise of assuring fair competition, with big, headline-catching numbers – characterizes most prominent anti-patent studies that have entered public debate. (Readers, please feel free to explore on your own the origins of the $1.5 billion that, according to the Consumer Electronics Association, patent trolls are costing the American economy every week. Then check back at Patent Truth soon to see our take.)

“A Grave Threat”

It is easy to speculate, based on its merits and methodology, about the motive behind the Intel study. At a time when policy makers in the United States and other countries are again considering measures that would devalue patents, the study seems aimed at providing ammunition for arguments to do so.

But we don’t need to speculate on Intel’s motive. Intel is quite open about it.

In another paper produced this year for the China Institute of International Antitrust and Investment, Intel declares that “the high-technology industry today faces a grave threat from patent owners that make industry-wide commitments to license their patents on fair, reasonable, and nondiscriminatory (‘FRAND’) terms, but later renege on their promises.”

Citing theoretical studies by Lemley and others, Intel says “significant competitive harm” is created by breaches of FRAND commitments. But Intel never gets around to showing actual violations or harm that have taken place.

Instead Intel focuses on the potential evils of royalty stacking without acknowledging that market values for validly patented technologies exist because those technologies improve a standard and strengthen the industry that uses it. Intel notes that U.S. judges in some cases have set specific FRAND royalties lower than those sought by inventing companies – but Intel then argues that even these court-determined FRAND royalties represent a breach of FRAND commitments. And to support its royalty-stacking argument, Intel presents a theoretical “stack” price computed by multiplying the rate sought for one SEP by the estimated total number of SEPs in a standard. By using only a rate sought, as opposed to a rate accepted by an implementer or approved by a court; by ignoring cross-licensing; and by falsely assuming that all patents are worth the same, Intel arrives at a large number intended – it seems – to provoke fears about the “grave threat.”
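The extrapolation criticized above, and the corrections it omits, can be contrasted in a short Python sketch. Every figure below – the asked rate, the SEP count, the court-rate fraction and the cross-license offset – is a hypothetical placeholder of ours, not a number from the Intel paper.

```python
# Contrast the headline "stack" (one demanded rate times the estimated
# SEP count) with the same figure after two adjustments the paper omits.
# All inputs are hypothetical placeholders, not figures from any study.

def naive_stack(asked_rate_per_sep: float, sep_count: int) -> float:
    """Headline stack: one demanded rate multiplied by the SEP count."""
    return asked_rate_per_sep * sep_count

def adjusted_stack(asked_rate_per_sep: float, sep_count: int,
                   court_rate_fraction: float,
                   cross_license_offset: float) -> float:
    """Same stack using a court-approved rate (a fraction of the asked
    rate) and netting out cross-license offsets from the gross bill."""
    gross = asked_rate_per_sep * court_rate_fraction * sep_count
    return gross * (1.0 - cross_license_offset)

# Hypothetical: $0.10 asked per SEP, 1,000 SEPs in the standard.
headline = naive_stack(0.10, 1_000)             # roughly $100
netted = adjusted_stack(0.10, 1_000, 0.2, 0.5)  # roughly $10
print(f"Headline stack: ${headline:.2f}; after adjustments: ${netted:.2f}")
```

The point of the sketch is structural, not numerical: because the headline figure scales linearly with a rate no implementer has agreed to pay, choosing the demanded rate instead of an accepted or court-approved one inflates the result before any offset is even considered.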

These arguments sound like negotiating tactics in contract talks – and essentially, that’s what they are. They are points made during an effort by an implementation company to lower its input costs and increase its profits.

But they purport to be a disinterested sifting of facts aimed at guiding policy.

This just isn’t true.