What Biotech and Hollywood Have in Common
Biotechnology is not an isolated case. It is
representative of how many innovation clusters get started. Throughout history,
hubs of innovative activity have agglomerated in unlikely places. Consider
another important industry whose success depends on stars: motion pictures. At
the beginning of the twentieth century, film was the hot new thing, competing
with theater to establish itself as a respectable medium of entertainment and
facing formidable technological and managerial challenges common to all new
industries. Everything had to be invented from scratch, from the shooting and
editing of a film to its production and distribution.
In 1913, the year before World War I began,
the movie industry was largely concentrated in New York, where the major
studios and the biggest stars were, with smaller outposts in Chicago,
Philadelphia, Jacksonville, Santa Barbara, and Los Angeles. By 1919, one year after the end of the war, 80 percent
of American movies were made in California. Charlie Chaplin and countless other
stars had moved west, and Los Angeles had three times as many motion picture
establishments as New York. The golden age of Hollywood had begun. By the
mid-1920s, Los Angeles had further consolidated its position as the world’s
premier location for film; the term Hollywood no longer referred only to a quiet neighborhood west of downtown but had become shorthand for the entire filmmaking world. This era of
unprecedented artistic achievement and commercial success peaked in 1940, when
Hollywood studios produced about four hundred films a year and 90 million Americans
went to the movies each week. By then the economy, society, and culture of Los
Angeles had changed forever. Movies had become big business, generating tens of
thousands of local jobs, and were responsible for a significant portion of the
city’s prosperity.
The transformation of Los Angeles from a small,
provincial outpost, remote from everyone and everything, into a cosmopolitan
center of artistic creation is a truly breathtaking tale. Its dynamics track
what we see happening in more modern innovation clusters. As more and more
actors, studios, and specialized service providers (stage technicians,
musicians, location scouts, costume designers, and so forth) congregated in
Hollywood, the forces of agglomeration sustained an accelerating upward trajectory.
This agglomeration made Los Angeles the place to be, and made it increasingly
difficult for other locations to compete. In exactly the same way that Silicon
Valley and Seattle today attract the best and the brightest Chinese and Indian
engineers, Hollywood at the time became a magnet for talented immigrants,
mostly European, many of them Jewish: great directors like Ernst Lubitsch,
Alfred Hitchcock, Fritz Lang, and Michael Curtiz and great actors like Rudolph
Valentino, Marlene Dietrich, and Ronald Colman.
Although the economic forces that are
responsible for Los Angeles’s rapid rise are clear, the initial seed is not.
Why Los Angeles? The conventional explanation has always been that the motion
picture industry needed to be based in Los Angeles because of its good weather;
the cold New York winters presented technical challenges for outdoor filming.
But while the weather was important, it could not have been the decisive
factor. This is another case of after-the-fact rationalization. Los Angeles is
not the only city in America with a good climate. And Berlin, London, Paris,
and Moscow—none of them with mild winters—all remained film capitals.
In 2006, the UCLA geographer Allen Scott proposed
a much better explanation. He pointed out that Los Angeles's rise hinged on the year 1915, when a powerful combination of commercial and cultural forces
transformed the city. The event that precipitated
Hollywood’s ascent had to do with a real star, the pioneering director D. W.
Griffith. The inventor of a number of new techniques that would define
filmmaking for decades, including the close-up, the flashback, and the
fade-out, Griffith became so influential that Charlie Chaplin called him “the
teacher of us all." The key moment for Hollywood came that year, when Griffith
shot the first big-budget blockbuster in history, The Birth
of a Nation. With a production cost of $85,000—five times more than that
of any film made before—The Birth of a Nation earned
over $18 million in sales, far more than any other film of the silent era. It
was the film that brought motion pictures firmly into the mainstream and made
them appealing to middle-class audiences, who until that point had considered
film inferior to theater. In the process it planted the seed of Los Angeles’s
future success. Three years after the film was made, the city already had twice
as many workers in the film industry as New York, and the gap kept growing
every year for the next two decades. The process of agglomeration had started,
and there was no turning back.
With the benefit of hindsight, the location of
industries appears inevitable. Today we immediately associate Los Angeles with
movies, New York with finance, Silicon Valley with computers, Seattle with
software, and the Raleigh-Durham area with medical research. But this is not
how people saw it before these industries settled in their respective cities.
In 1910 there was little in Los Angeles that suggested it was going to become
the film capital of the world. There was nothing in the Raleigh-Durham region
in the 1960s that indicated it would become a biomedical research capital. In
the 1970s, Seattle seemed like the last place that would become a global hub
for software development. Cambridge, San Diego, and San Francisco happened to
have the right kind of stars at the right time. By contrast, the location of
traditional manufacturing is easier to explain, because it can often be traced
to such physical factors as access to a harbor or proximity to natural
resources. There is a reason that Chicago, Detroit, Toledo, Buffalo, and
Cleveland grew into sprawling manufacturing clusters in the nineteenth and
twentieth centuries, and it has to do with cheap transportation of heavy
materials over waterways.
The history of high-tech clusters indicates
that while we understand fairly well what happens after clusters are
established, we often have a hard time predicting where they will arise. We have an even harder
time creating them. Even Silicon Valley, arguably the most important cluster in
the United States, hardly appears planned. Military research had a lot to do
with its beginning, but the cluster in the Valley did not take root because
military officials sat down and decided to create an innovation hub in the
region. In 1940 the peninsula south of San Francisco was a quiet agricultural
region with a comparative advantage in fruit production. The arrival among the
orchards of William Shockley, the legendary high-tech pioneer who co-invented the transistor, was the seed from which the local innovation industry grew.
When some of Shockley's disciples created the first commercially practical integrated circuit at
Fairchild Semiconductor, it became clear that the seed had germinated: the
process of clustering had begun. That serendipitous seedling was the starting
point of an economic miracle that eventually brought millions of jobs to the
region.
While it is true that Shockley was connected to
Stanford—a fact that most histories of the Valley point to as proof that the
Valley owes its existence to the university—at the time Stanford was just one among many universities in America, and not even the best one. Of course
Stanford did play a role, but it was less deterministic than many people think.
A research university was necessary but far from sufficient for the birth and
coming-of-age of the Valley. If Shockley had decided to locate in, say,
Providence, which was then an area with a significantly more developed
industrial base than Palo Alto, Silicon Valley might today be clustered in
Rhode Island, and we would be reading dozens of books on how Brown University
caused the cluster.
Visionaries have been trying to build thriving
cities from the time that people started living in them. Utopian communities
have always ignited people’s imaginations, with their promise of curing social
ills through enlightened planning and strong values. In most cases these
communities have not lasted. In 1928, Henry Ford tried to establish a new
industrial center called Fordlandia, building it from scratch on virgin land.
His vision was to apply the rational efficiency of Ford engineering to the creation of an ideal community in the middle of the Brazilian rainforest to
harvest rubber for Ford’s tires. As it turned out, it was difficult to engineer
utopia. Ford’s experiment proved to be a disaster for residents and investors
alike. It was sold at a big loss just seventeen years after it had been
inaugurated with great fanfare.
Struggling communities all across America are
now trying to reinvent themselves and attract good jobs. How should governments
aid in this effort? Ever since Michael Porter popularized the catchy concept of
cluster building in the early 1990s, cities and states have been trying to
engineer clusters through a variety of public policy measures, which economists
call place-based policies. They are effectively a form
of welfare, but they target cities, not individuals. About $60 billion is spent
annually by states and the federal government on these policies—more than is
spent on unemployment compensation in a normal year. But the economic logic behind
such measures is rarely discussed and even less frequently understood.
Do these policies work? To answer this question,
we must examine the underlying ideas more closely and rigorously evaluate their
economic rationale. We will discover that just as Henry Ford faced challenges
in building a city from scratch, local governments face challenges in
reorienting regional economies. Understanding when government intervention
makes sense and when it doesn’t is a crucial first step in setting sound
policies.