Do immigrants from poor countries hurt native workers? A study by an influential immigration economist at Harvard University recently found that a famous flood of Cuban immigrants into Miami dramatically reduced the wages of native workers.
That study lit a small fire last year in The Atlantic, National Review, and The New Yorker. Celebrity advocates of restricting immigration declared that the study had “nuked” their opponents’ views. The study, by George Borjas, is a centerpiece of his own mass-market book on immigration, We Wanted Workers, cited by the U.S. Attorney General as proving the economic harms of immigration.
But there’s a problem. The Borjas study had a critical flaw that makes the finding spurious. That flaw is explained in a new research paper that I co-authored with Jennifer Hunt, who is the James G. Cullen Professor of economics at Rutgers University. In this blog post, I explain the flaw and why it reinforces earlier findings that the Mariel Boatlift influx of Cuban immigrants did not reduce wages of Miami workers.
The flaw was not known to the academic peers who reviewed the study for publication. The finding that the Cuban immigrants caused Miami wages to collapse is an artifact of how the study handles its data. The study is based on a wage survey from a sample of workers, and it focuses on a small group within that larger sample: a group where the sample shifted, simultaneously with the Boatlift, to include many more black male workers with relatively low wages. This sharply reduced the average wage of people in the sample, but it had nothing to do with the Cuban influx.
This is noteworthy because what happened in Miami is the one historical event that has most shaped how economists view immigration. To explain what went wrong, I have to pan out and describe the historical setting under study.
The Mariel Boatlift
For an economist there’s a straightforward way to study how low-skill immigration affects native workers. Find a large, sudden wave of low-skill immigrants arriving in one city only. Watch what happens to wages and employment for native workers in that city, and compare that to other cities where the immigrants didn’t go.
An ideal experiment like this actually happened in Miami in 1980. Over just a few months, 125,000 mostly low-skill immigrants arrived from Mariel Bay, Cuba. This vast seaborne exodus is known as the Mariel Boatlift. Miami’s workforce rose by 8%, and its low-skill workforce shot up by 20%. If immigrants compete with native workers, this is exactly where you should see natives’ wages drop. Of course, if they did drop, you would want to make sure that it wasn’t just a regional or national trend. You would compare wage trends in Miami after 1980 to the same trends in other, similar cities that didn’t get a migrant surge.
Economist David Card of U.C. Berkeley did exactly this, in a massively influential study in 1990. Card’s work became one of the most cited economic studies of immigration. The design of the study was elegant and transparent. But even more than that, what made the study memorable was what Card found.
Nothing. The Card study found no difference in wage or employment trends between Miami, flooded with new low-skill workers, and other cities—not for any workers, including low-skill workers. That study concluded that “the Mariel immigration had essentially no effect on the wages or employment outcomes of non-Cuban workers in the Miami labor market.”
Economists ever since have tried to explain this remarkable result. Was it that the affected U.S. workers had simply moved away? Had low-skill Cubans made native Miamians more productive, stimulating the local economy? Was it that the Cubans’ own demand for goods and services had generated as many jobs as they filled? Was it that Miami employers shifted to production technologies that used more low-skill labor? Regardless, the real-life economy was evidently more complex than an ‘Econ 101’ model requiring wages to fall when immigrant labor arrives.
A critical flaw
This is where Borjas’s study came in, in 2015. The new paper by Borjas claimed that Card’s earlier analysis had obscured a large fall in the wages of native workers by using too general a definition of ‘low-skill worker’. Card’s study had looked at the wages of U.S. workers whose education extended only to high school or less.
Borjas’s paper divided up this group. It separately measured the wages of two slices of that larger group: 1) people who never finished high school, and 2) people who finished high school but went no further. It found that in the less-than-high-school group, wages plummeted right in 1980, just the year when all those low-skill Cubans had arrived. That happened in Miami to a much greater degree than in other cities. The study estimated that the Mariel Boatlift had slashed the wages of U.S. workers with less than high school by somewhere between 10 and 30 percent. It seemed like a smoking gun.
But it was in that act of slicing the data that the spurious result was generated. It created data samples that, exactly in 1980, suddenly included far more low-wage black males—accounting for the whole wage decline in those samples relative to other cities. Understanding how that happened requires understanding the raw data.
Both the Card and Borjas papers use data from the Current Population Survey or CPS, a representative sample survey of U.S. workers. This survey happens every month; it’s where we get the estimates for the U.S. unemployment rate. Two datasets taken from the CPS also report workers’ wages. These two datasets are called the ‘March Supplement’ and the ‘Outgoing Rotation Group’ (ORG). These aren’t surveys of all workers, but of a small number of workers chosen randomly from within subsets of the population, so that their answers to questions about wages will be representative of the wages of others like them.
Right in 1980, the Census Bureau—which ran the CPS surveys—improved its survey methods to cover more low-skill black men. The 1970 census and again the 1980 census had greatly undercounted low-skill black men, both by failing to identify their residences and by failing to sufficiently probe survey respondents about marginal or itinerant household members. There was massive legislative and judicial pressure to count blacks better, particularly in Miami. Starting in the 1981 CPS, survey coverage of lower-skill black men shifted sharply to include relatively more black men with less-than-high-school, and relatively fewer black men who had completed high school.
You can see the sample shift sharply in the graph below. In all graphs here, the 1981 CPS is shown above the year 1980 (as in the Borjas paper) because the wages reported there were mostly earned in late 1980.
On the left is the fraction of black workers in the exact March CPS samples used in the Borjas paper: male, non-Hispanic employed workers age 25–59, with less than high school. On the right is the same ‘fraction black’ in the same dataset, using the broader definition of low-skill used in the Card paper: high school or less. The black line shows workers in Miami.
On the left, that’s a huge jump in the composition of the Miami sample in 1980. The fraction of blacks in the Borjas paper’s sample goes from just one third of the sample to two thirds, exactly when the Boatlift happens. Those new workers being surveyed aren’t the Cubans themselves: remember these are the non-Hispanic workers. And in the years thereafter the black fraction rises still more, reaching 91% (!) in 1985. That doesn’t happen among the broader education category used in the Card paper, on the right.
Those graphs show the same trends in the ‘control’ cities. Remember that both researchers test the effects of the Boatlift by comparing wages in Miami to wages in other, comparable cities that didn’t receive an immigrant surge. The two researchers choose these cities in slightly different ways, both of them based on finding cities with employment trends similar to Miami’s trend around the same time. Card uses Atlanta, Los Angeles, Houston, and Tampa-St. Petersburg; Borjas uses Anaheim, Rochester, Nassau-Suffolk, and San Jose.
But the black fraction doesn’t jump up in the samples from either group of ‘control’ cities. In fact, the fraction black in the control cities that the Borjas paper focuses on falls after 1980, actually reaching zero in 1983. At some points the Borjas study is comparing a city whose survey dataset is massively shifting to cover more low-wage blacks to cities where the CPS didn’t interview a single black man at this skill level.
The spurious finding
This shift in sample coverage of blacks is what creates the fall in measured wages. Among men in Miami with less than high school at this time, wages were much lower for blacks than for non-black workers. So including more blacks in the sample makes the average wage in the sample fall, even if nothing happened to the wages of any workers in the real population of Miami from which the sample was drawn. Take the shift in the black share of the sample, multiply it by the black-nonblack wage gap, and you get the fall in wages that would appear in the sample used in the Borjas study.
For example, the black fraction in the survey sample went up between 1979 and 1985 by 51 percentage points more in Miami than it did in the samples from the main control cities considered in the Borjas study. Multiply that by the 49% relative gap in wages between blacks and non-blacks among very low-skill workers in this time and place, and you get 25% lower wages by 1985 just from including more low-wage blacks in the survey. The graph on the left below does this for all the years after 1980:
The vertical axis there is roughly comparable to a percent change in the wage, where –0.1 is a 10% decline and –0.2 is a 20% decline (it’s the change in logarithm points). Looking at that graph you can see that the inclusion of more low-wage blacks in the sample would cause the average wage to fall drastically, a decline that hits about 25% a few years after 1980.
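The rough reading of log points as percents can be checked exactly: a change of x log points corresponds to a proportional change of exp(x) − 1. Here is a quick standalone illustration in Python (my own arithmetic, not taken from either paper):

```python
import math

# A change of x in log points corresponds to a proportional change of exp(x) - 1.
for log_change in (-0.10, -0.20, -0.25):
    pct = math.exp(log_change) - 1
    print(f"{log_change:+.2f} log points -> {pct:+.1%}")
```

So −0.1 log points is a fall of about 9.5%, and −0.25 log points a fall of about 22%; reading log points as percent changes is a close approximation at these magnitudes.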
The effect of the Mariel Boatlift measured by Borjas is “between 10 and 30 percent”. In other words, the spurious decline in wages explains the entire effect estimated in Borjas’s study.
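The arithmetic behind that conclusion is simple enough to check in a few lines. Here is a minimal sketch in Python, using only the figures quoted above; it illustrates the composition decomposition, not the paper’s exact estimator:

```python
# Spurious wage decline from a pure composition shift: the sample-average
# wage falls when more low-wage workers enter the sample, even if no
# individual's wage changes. Figures are the ones quoted in the text.

share_shift = 0.51  # extra rise in the black share of Miami's sample vs. controls, 1979-1985
wage_gap = 0.49     # relative black/non-black wage gap among very low-skill workers

spurious_decline = share_shift * wage_gap
print(f"Spurious decline in the sample-average wage: about {spurious_decline:.0%}")
```

This prints a decline of about 25%, squarely inside the “between 10 and 30 percent” effect attributed to the Boatlift.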
On the right, in the figure above, is the spurious wage decline that you would get if you studied wages in the Card paper’s broader education category of high-school-or-less. You saw above that there was a much smaller shift in the number of blacks surveyed in that broader group. There’s also a smaller gap between black and nonblack wages in that group. So the estimated wage drop following the Mariel Boatlift would be negligible using the grouping of workers in Card’s paper: just 2 or 3% for a couple of years, not enough to stand out from the statistical noise. Card’s study was therefore essentially unaffected by the bias of shifting sample coverage. This can explain why the Card and Borjas studies reach such sharply different conclusions.
The shift in sample composition I’m discussing here can also explain other discrepancies between earlier studies of the Boatlift. One such discrepancy: a study by Giovanni Peri and Vasil Yasenov of U.C. Davis showed that the results in the Borjas study are sensitive to which dataset one uses. I mentioned above that the CPS has two wage surveys each year. The graphs above are for the March CPS that the Card paper used and the Borjas paper focuses on. Peri and Yasenov showed that the result is much smaller in the other wage survey (the ‘ORG’ survey), a result that is discussed in revised versions of the Borjas study. Specifically, the wage-effect estimate in the Borjas study is three times larger in the March CPS data than in the ORG data. There is no clear reason why the true effect of an immigration wave would change that much between two different surveys. But our paper has a simple and definitive explanation: the post-1980 increase in coverage of low-wage blacks is three times larger in the March CPS than in the ORG data. The glove fits, in this way and several others that we discuss in our paper.
The origin of this problem is the way that the survey data are sliced in the Borjas study. When the Census Bureau picks people to survey who will represent the population, they only choose them to represent broad categories of people. If researchers slice up the data too finely, the Census Bureau can’t promise that those slices will represent the population anymore. For example, if you narrowly sliced the data to include only 53 year-old female Peruvian-American surgeons, you might find that very few (or no) such people happened to have been included in the survey sample. But such a very small (or empty) group would not necessarily represent all people of that description in any given city.
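To see why thin slices misbehave, here is a toy simulation with purely hypothetical numbers (not CPS data): in repeated random surveys of the same unchanged city, the subgroup share is stable in a broad sample but swings widely in a narrow slice of the kind a fine education cut produces.

```python
import random
import statistics

random.seed(1)

# Hypothetical city: 10,000 workers, 30% of whom belong to some subgroup.
population = [1] * 3_000 + [0] * 7_000  # 1 = subgroup member

# Draw many independent surveys and record the subgroup share in a
# broad sample (n = 500) and in a narrow slice (n = 15).
broad_shares = [sum(random.sample(population, 500)) / 500 for _ in range(200)]
narrow_shares = [sum(random.sample(population, 15)) / 15 for _ in range(200)]

print(f"broad sample share:  mean {statistics.mean(broad_shares):.2f}, "
      f"sd {statistics.pstdev(broad_shares):.3f}")
print(f"narrow sample share: mean {statistics.mean(narrow_shares):.2f}, "
      f"sd {statistics.pstdev(narrow_shares):.3f}")
```

Both samples are unbiased on average, but the narrow slice’s share fluctuates several times more from survey to survey. A sharp year-to-year jump in such a slice need not reflect any change in the city itself.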
A less extreme example is what happened in the March CPS data for Miami. Under pressure to improve coverage of low-skill black men, the Census Bureau shifted the composition of its sample of black men with high school or less. Within that group, right after the Mariel Boatlift, it began covering far more black men with less than high school, and relatively fewer who had finished high school. That wouldn’t change the racial representativeness of the broader high-school-or-less group. But by slicing that group too finely, the Borjas study created a sub-group whose representativeness of the population changed sharply, at precisely the moment when a wage decline would be inaccurately attributed to the Boatlift.
More light, less heat
The evidence from the Mariel Boatlift remains as found in David Card’s seminal research: there is no evidence that wages fell, or unemployment rose, among the least-skilled workers in Miami even after a sudden refugee wave raised the size of that workforce by 20%.
This does not by any means imply that large waves of low-skill immigration do not or could not displace any native workers, especially in the short term. But facile pronouncements by politicians that immigrants necessarily harm native workers must grapple with rigorously studied real-world experiences to the contrary. The Mariel Boatlift remains one of the most enlightening experiences of this kind.
Read more in Part Two: Answering Questions about Our Research.