In the year 2021, it’s almost impossible not to be a techno-pessimist.
Silicon Valley’s promises of revolutionary technological advancement just a decade ago have gone unfulfilled. No self-driving cars, just more expensive cabs. Sure, we got next-day delivery with Amazon Prime, but at the expense of our local corner store. The promise of instant connection with anyone in the world has only brought us closer to those who would do us harm. Our screen time gives us anxiety, and our eyes are getting tired.
The problem is that the way we build technology is fundamentally broken. In the old days, the government, academic researchers, and private enterprise worked symbiotically to create exciting new technology, such as Ethernet and computer operating systems. But the way we make technology has become siloed, both intellectually and financially. Researchers chase knowledge, and private enterprise chases profit. Neither collaborates to find innovative ways to solve real-world problems. And when innovation does happen, entrepreneurs live in fear their work will enter the “kill zone,” where they will be clobbered or cloned by monopolistic tech giants such as Facebook, Amazon, and Google.
The result is a Silicon Valley that has become lazy and derivative, leaning almost entirely on its greatest hits. Venture capitalists give money only to startups that jump on the hottest trend, and entrepreneurs in turn are stuck trying to iterate on those trends, rather than innovating. A lot of people are running hard and fast at the same unoriginal ideas. Tech stars may talk about changing the world, but tech has always been about making money. Deep in the Valley’s DNA is the soul of a ’49er — a gold-rush prospector heading West for riches, digging as close as possible to the spot where someone else struck gold.
It’s coming from inside the house
Our basic internet infrastructure may have held together during the coronavirus pandemic, but not even the titans of tech have been satisfied with what Silicon Valley provided us over the past year. In an April 2020 essay to everyone in particular, the venture capitalist Marc Andreessen, who cofounded the storied firm Andreessen Horowitz, wondered where all our stuff was — all the useful technology that could have helped get us through the COVID-19 calamity.
Where, he wondered, were the cities of the future? Why didn’t we have the capacity to scale up the manufacturing of medical supplies? Or to create more affordable housing? Or delivery drones and supersonic aircraft?
Why, he asked, haven’t we been building?
It’s a good question. Set aside, for the moment, that Andreessen is the man who, a decade ago, cheered that software would eat the world. Or that his latest successful ventures have all surrounded cryptocurrency — a technology that has no use case aside from speculation and crime. His sentiment is correct, and we can see it in the numbers. American productivity has ground to a halt. During the most recent tech boom, from the late 1990s to the early 2000s, annual productivity growth — a measure of how much a worker can produce an hour — hovered at about 2.5%. Since 2007, according to the Bureau of Labor Statistics, it has averaged only 1.5%. For all its success on the stock market, Silicon Valley has not been the innovation engine that we hoped it would be, or that it promised to be.
Publicly, Silicon Valley will tell you everything is working as it should. Services like Facebook, they point out, are free, so their benefits aren’t reflected in the productivity numbers. In a recent CNBC interview, the Founders Fund general partner Keith Rabois said the movement in Washington to break up Big Tech was nothing but a political fabrication. Everyone outside the nation’s capital, he insisted, loves tech just the way it is.
But privately, tech leaders acknowledge something is wrong. The incentives surrounding who gets money for which idea are askew, and funding is being allocated in ways that actually stifle innovation. Some entrepreneurs acknowledge Silicon Valley’s promise to save the world was always part manipulation — a way to get employees to buy in to long hours and to get consumers to overlook growing monopolization. Others will tell you the worst thing to happen to tech entrepreneurship was to become the toast of elite business schools, to be glamorized as a way to get rich, to be turned into a Hollywood production written by Aaron Sorkin.
One veteran entrepreneur told me venture capitalists didn’t get docked for failing to invest in innovative, useful technology — but they would for not hitting a hype cycle. VCs that ride the latest hype cycle are more likely to get markups — follow-up investments from other VCs, at higher prices — and markups get them more cash in their funds. The Silicon Valley hoodies may not love Wall Street suits, but they do love Wall Street money, and Wall Street has never met a hype cycle it didn’t love.
That means startups outside the sectors that produce hype cycles are starved for cash. According to PwC, 83% of all venture-capital investments from 1995 to 2019 went to startups dealing in life sciences or in information and communication technologies. That leaves crucial sectors, like energy, out of the mix.
Thanks to this perverse incentive structure, a lot of entrepreneurs invent (or reinvent) with an eye toward the exits. And right now those exit options are getting acquired (if a company has managed to avoid entering the kill zone, where bigger tech companies either clone or kill their product) or going public into our current super-bubbly stock market.
Before it went public, Facebook attempted to acquire Snap for $3 billion. When that didn’t work, it simply cloned its features. Google bought the rival navigation service Waze and then slowly started integrating its navigation features into Google Maps.

Going public, meanwhile, has never been easier for tech companies now that the stock market is in the throes of a SPAC — special-purpose acquisition company — boom. Companies that go public through a SPAC are allowed to offer forward-looking projections to investors, rather than the detailed prospectus required for an initial public offering. That works out just fine for the SPAC sponsors, who collect richer fees than IPO underwriters, but it ends up rushing companies that are nowhere near ready for a public debut.
It’s expensive to be this cheap
To blame Silicon Valley alone for the lack of game-changing innovation would be a mistake. America’s great technological innovations of the past century were in large part born of a marriage between the public and private sectors — and the government side of that equation isn’t holding up its end of the bargain either.
After World War II, Washington embraced a simple insight — that basic scientific research opened the door to the discoveries that would propel us toward the future. To promote innovation, Congress created the National Science Foundation, which, along with the Defense Department, directed public funds to universities and private labs.
Those private labs included Bell Labs, which produced innovations like the transistor, the laser, and all sorts of programming languages, and Xerox’s Palo Alto Research Center, which developed the computer interface as we know it and inspired a young Steve Jobs. The public funding helped sustain in-house research teams at corporations and created a direct line of communication between those asking questions about how to advance science and those asking questions about how to advance commerce.
Corporations also innovated in-house because they were scared of running afoul of the government if they simply copied products from competitors. As a 2019 working paper by economists at Duke University and the University of East Anglia put it, antitrust enforcement — i.e., government crackdowns on mergers and anticompetitive copying — “convinced managers that buying other firms would be a costlier way to grow than by introducing new products derived from in-house research.”
Antitrust scrutiny also forced companies to open their technology to the rest of the world. In 1969, the government sued IBM over claims it illegally maintained a monopoly on general-purpose computing. The case lasted 13 years, six of them in trial, before the Reagan administration — no friend of antitrust enforcement — dropped the case. But the scrutiny forced IBM to stop tying together its hardware and its software, a move that helped launch the independent software industry. Later antitrust moves also helped spur innovation, such as the case against Microsoft in the 1990s that cleared the way for rival web browsers and, ultimately, Google.
This successful system of innovation changed because the way Americans thought about the economy changed. In the 1980s, private industry, the government, and scholars decided the market, free of government assistance and regulation, could take care of innovation on its own. At the same time, Wall Street grew tired of big corporate research-and-development teams and encouraged CEOs to cut costs by going outside their companies to find innovation. About the same time, the government stopped pursuing any meaningful antitrust enforcement, and legal thought turned against marketplace interventions meant to drive competition.
But it turned out that the pro-market forces were wrong. Instead of driving more groundbreaking tech, the hands-off approach proved to be a disaster. In 1971, Fortune 500 companies won 41% of the R&D 100 Awards, the most prestigious honor for innovative technology. By 2006, that number had declined to just 6%.
At the same time, the Duke-University of East Anglia researchers found, corporate research dried up. From 1980 to 2006, the average company’s publication of scientific research declined by 20% a decade. Most of the research came from just two firms: Microsoft and Google. Even publication at a tech giant like Apple declined relative to its sales, as major corporations began leaving R&D to the universities.
The result was a communication breakdown that made it harder to innovate. “Although specialization means universities and firms can become better at producing research and developing products respectively,” the researchers wrote, “this division of innovative labor has made it more challenging for innovative research to turn into useful products.”
It isn’t just private enterprise that has bailed out of scientific research. According to the MIT economist John van Reenen, federal spending on R&D in 1964 was about 2% of economic output. Today it hovers at about 0.7%. “In today’s dollars,” he wrote, “the United States spends roughly $240 billion less per year on R&D than it did at its peak.”
Research is just like any other commodity: The less you pay, the less you get. Even a slight increase in R&D would be a game changer for American technology. “Increasing R&D investment by $100 billion,” van Reenen concluded, “would represent one-half of 1% of GDP and would be transformative for the future of US innovation.”
Back to the future
It would be nearly impossible for America to rebuild the corporate R&D juggernaut it had 40 years ago. But we can and must accept that the way the government and industries allocate money needs serious adjusting. If we can change the way we finance and build tech, perhaps we can once again create a truly novel innovation ecosystem.
Surprisingly, there is bipartisan awareness in Washington of the need for reform. Republicans and Democrats want to rein in the power of Big Tech, and that means opening the playing field to would-be rivals. It means stopping Apple and Google from using the information they collect from third-party apps to undercut their own clients. It means getting Amazon to stop treating its third-party selling “partners” like what they’re actually called in-house — “internal competitors.” It means getting rid of the kill zone.
Six pieces of legislation moving through Congress are designed to address such antitrust concerns. The leader of the movement, Rep. David Cicilline of Rhode Island, said he divided the reforms into six bills to make it harder for Silicon Valley’s lobbyists to fight them. So far, it’s worked. In June, all six bills passed through a marathon Judiciary Committee markup.
The battle has forged odd divides, even by Washington standards. It has pitted California Democrats against one another and has put the archconservative Rep. Matt Gaetz of Florida on the same side as the progressive Rep. Pramila Jayapal of Washington state. The cross-party fault lines make it almost impossible to predict how the legislation will shake out.
What is certain, though, is that the White House supports the effort. In another surprise, the Biden administration put the legal scholar Lina Khan forward as its nominee for the head of the Federal Trade Commission, which enforces antitrust laws. During her confirmation process, the lions once again lay down with the lambs, as eight of the committee’s 12 Republicans voted with Democrats in support of her nomination.
Khan is something of a wunderkind. She is known in academic and policy circles for writing the definitive research paper explaining how Amazon used anticompetitive behavior to ensure its dominance. Both Amazon and Facebook argue that Khan is biased and should recuse herself from any FTC decision regarding their businesses.
They are scared, and they should be. During her first open commission meeting, she and her fellow Democrats on the FTC rescinded an Obama-era rule discouraging the commission from taking action against monopolistic behavior. Khan argued that the FTC needed more flexibility to regulate the peculiarities of our modern tech industry, so the commission untied its own hands.
Boogeymen get attention, so all things “Lina” — yes, she’s achieved first-name status among the haters — have been covered breathlessly on the business channels by distressed infotainment analysts, who are worried poor Jeff Bezos and his little trading post Amazon are being bullied by the government.
But the government isn’t just showing a willingness to enforce the rules of the game; it’s also showing a renewed willingness to get back in and play the game. The Senate recently passed the US Innovation and Competition Act, a $250 billion measure meant to fund tech research in an effort to counter China. The bill, which has the White House’s support, is awaiting passage in the House — where a push from Democrats is likely to turn it into law. If that happens, it will help to revive the public side of America’s innovation engine.
In his April 2020 essay, Marc Andreessen said the reason we didn’t have all the nice things we deserved, like high-speed trains or hyperloops, was that we didn’t want them. But that’s not it. It’s because we don’t correctly value the innovation that would create them. We want the product without the pain. We want something for nothing. Nice things are expensive, and we have not prioritized a funding mechanism that is willing to do the heavy lifting. We have not correctly valued the damage of underinvesting.
Technological advancement is worth more than Silicon Valley’s incentive structure is willing to pay for it. It always has been. The rewards don’t come fast enough in research — not fast enough to guarantee returns — so we need to rethink how innovation really happens if we want to reinvigorate it. It doesn’t come from the lone college dropout, prospecting through science to find the golden discovery that will lead to riches. Invention is an ecosystem, one that requires support from society. It’s not enough to want it. We have to want it enough to pay for it.
© Linette Lopez, written exclusively for Business Insider