Nationwide, the 2015 election had the lowest voter turnout the country has seen in 72 years: 36%. Countless state, county, city, and school races across the US went scarcely noticed by voters. San Francisco held a hugely controversial election that many commentators called a battle for the city’s soul, with millions of dollars spent on ballot initiatives aimed at the city’s spiraling housing costs and rapid gentrification. Yet only 41% of registered voters cast ballots. Closer to my home, the city of Santa Barbara held a historic election, its first since switching to city council districts, which promised the potential to shake up City Hall. Yet voter turnout was just 38%.
Why Odd-Year Elections Keep People From Voting
Local governments that choose to hold their elections in odd-numbered years typically see far lower voter turnout, often dropping by half, and the voters that cast ballots are overwhelmingly whiter, older, and wealthier than those who participate in general elections.
Imagine a working immigrant mother who recently became a US citizen. She’s excited to vote, but has never done it before. After working long hours cleaning houses, picking her kids up from childcare, cooking them dinner and washing the dishes, she realizes it’s election day and the polls close in an hour. The local city council elections haven’t really been covered on TV, which focuses mostly on national news, and are rarely mentioned in the weekly local Spanish newspaper. The candidates don’t bother knocking on doors in her apartment complex, where few residents are eligible or registered to vote, and even fewer turn out during odd-year elections. She doesn’t know who is running for city council, what they stand for, or what issues are being debated. With time running out before the polling booths close, she decides she’ll wait to vote next year, when she can cast her ballot for the president.
The gap between voter turnout in national elections and odd-year local elections has widened over the years, with a few potential causes:
- Demographic change means there are more voters like the woman mentioned above, young people or immigrants who are new to voting and have less access to information about local politics.
- Americans are working longer hours, which means they feel more and more strained for time to follow local politics, research issues, and vote.
- As campaigns have become longer and more expensive, people living in cities or states where an election takes place every year feel overwhelmed and fatigued by trying to research and sort through information in seemingly endless election seasons.
- Local newspapers and TV stations have declined, gone bankrupt, and laid off investigative journalists, while national cable news like Fox News and national online news sites like the Huffington Post have boomed, leaving voters with scarce access to information about local issues.
The problem with odd-year elections made national headlines after the civil unrest in Ferguson, Missouri, where off-cycle elections were one of the primary reasons the city government so starkly lacked representation from the majority-black community.
The Anti-Irish History of Odd-Year Elections
But why do these off-cycle elections even exist? What reason does a city have to hold an election separate from the state and national elections? Why spend extra taxpayer dollars to run a separate election when it clearly leads to lower voter turnout?
The answer lies in history. Off-cycle elections are mostly credited to Progressive Era reformers in the late 1800’s who saw them as a way to fight corruption in big cities. But they were also a favorite policy of anti-immigrant political groups who blamed rapidly growing populations of Irish and other immigrants for using urban political machines to get jobs and services for their communities.
Sarah Anzia is probably the leading academic scholar studying odd-year elections. While much of the attention on her work has focused on her suggestion that public employee unions are one of the major factors keeping municipal elections in odd years, I think something much more interesting is buried in her earlier examination of the history of odd-year elections. Their original intent was primarily to break the backs of Irish political organizations in big cities.
Anzia found that by the 1890’s, when Progressive Era reformers took up the cause of off-cycle elections for cities, there had already been a long history of politicians changing the dates of city elections to manipulate outcomes. There is no thorough national record of this history, but it can be dug up in case studies of individual cities. Off-cycle elections emerged during the mid-1800’s through what Anzia refers to as “partisan power plays”, political parties jockeying to change the rules of the game to help them win. Specifically in cities like New York and San Francisco, it was a result of an alliance between anti-corruption reform parties and nativist anti-immigrant parties who found a common enemy in the Democratic Party, which in many big cities had become dominated by a well-organized urban Irish voter turnout machine.
An Alliance Between Anti-Immigrant and Anti-Corruption Activists
For many reformers in the 1800’s, Irish and corruption were synonymous. The era was the height of a wave of immigration to the rapidly industrializing US from Ireland and Southern and Eastern Europe. Immigrants lived in extreme poverty, worked under highly exploitative conditions, and received little assistance or rights from the government. More than any other group, the Irish built political power in the US’s biggest cities in response to the intense racism Irish immigrants met when they arrived. Tammany Hall and other Irish-dominated political organizations ensured immigrant communities access to basic services, jobs and emergency assistance, built infrastructure and charities, and were rewarded by a loyal bloc of voters. Yet they also became a symbol of corruption, rewarding their supporters with government jobs and giving bribes to get what they wanted, especially under New York’s notorious Boss Tweed.
Of course history is written by the victors, and the late 1800’s political battles between middle-class Protestant whites of English descent and the working poor Catholic and Jewish immigrants are simplistically depicted as the good reformers versus the corrupt mobsters. There was corruption in the urban immigrant political machines no doubt. But poor people and immigrants voted for them because they provided basic infrastructure and human services in their neighborhoods and defended their rights, as opposed to the intensely racist treatment they got from parties like the Whigs or the Know Nothings. As we make policy today, we should examine this history with a critical eye to separate real anti-corruption efforts like civil service reform from shameless attempts to break Irish political power like odd-year elections.
The reform movements of the late 1800’s certainly had their discriminatory undertones, walking the fine line between hating corrupt Irish political machines and hating Irish people. Legendary reformer cartoonist Thomas Nast, whose work is shown in this post, is credited in history textbooks with taking down notorious Boss Tweed but often depicted Irish people as drunken violent monkey-like creatures who had taken over the country. The movement’s belief in rational scientific progress flirted at times with eugenics, the idea that keeping the poor and uneducated from breeding would further the human race. And the push for alcohol prohibition was often tied to the idea that Irish, Russians and other urban immigrant groups were drunks who were ruining the moral fiber of American society.
San Francisco and New York
But in the case of off-cycle elections, the switch was often won through a direct alliance between anti-corruption reformers and anti-immigrant bigots. In 1850’s New York, the racist nativist Know Nothing party allied with the Whigs (precursors to the Republicans) in the state legislature to separate New York’s city elections from state and national elections. Voter turnout for city elections plunged, especially for Democrats, who depended on working-class immigrant voters who failed to turn out in off-cycle elections.
Irish who came to San Francisco during the Gold Rush brought Tammany Hall-style political organization to the West Coast in the 1850’s. The People’s Party, a local San Francisco party that drew its support from both the financial elite and anti-Irish nativists, was born in response. During their decade of control of San Francisco, the People’s Party led a successful push to switch San Francisco to off-cycle elections by allying with Republicans in the state legislature to change the city’s charter.
These cities set the precedent for a trend that swept the country decades later. Today, our cities are facing low voter turnout and unequal representation because of a policy rooted in anti-Irish racism. There is no evidence now that cities with even-year elections have any more corruption than those with odd-year elections. But the much greater threat facing our democracy, the power of unlimited corporate money, is made much more powerful in low-turnout off-years, when voters are disengaged and tuned out, and it’s easy to buy an election.
Today’s defenders of odd-year elections say that if local elections are moved to even-years that local issues will be drowned out by national politics. They say that the small turnouts for odd-year elections are actually a good thing—that a small group of citizens who are well-informed and pay attention to local issues are the ones who should make the decisions.
But is it possible that the “uninformed” voter has something meaningful to contribute to their community? That a young person or low-wage worker who rides the bus every day might actually have a better perspective on the city’s public transit system than a member of the Chamber of Commerce who has seen a presentation by a city official on the subject? That an undocumented immigrant or young black person may not go to the same dinner parties as city councilmembers and school board trustees, but they’ve experienced harassment at the hands of city police that the members of the Rotary Club have no idea about? That while some people’s definition of local issues are limited to parking and potholes, the family who just got evicted because they can’t afford rent might consider raising the city’s minimum wage to be an important local issue?
Odd-year elections are driven by a fear of the people that tears at the fabric of our democracy. It’s a fear that the people are too stupid to govern themselves. Although it might be couched in more polite language today, it’s the same fear of the ignorant Irish masses, mindlessly mobilized by political machines. Today’s defenders of odd-year elections should know the history of what they’re defending because they carry on its legacy today.
While mandatory paid maternity (and often paternity) leave is nearly universal across the globe and broadly popular with policy experts and the public, it’s had difficulty gaining traction in Congress. But by learning from the lessons of the Fight For 15 movement that has increased the minimum wage in cities across the US, advocates could soon find this policy sweeping the country like wildfire, with DC as the first spark.
Why a Popular Policy Goes Nowhere in Congress
Much like paid family leave, the public overwhelmingly supports raising the minimum wage, which has absolutely no effect on whether a congressional bill will be signed into law. Momentum for a higher minimum wage is being fueled by the combination of a political landscape dominated by a national debate over economic inequality and an economic landscape where a wageless economic “recovery” has failed to raise average workers’ incomes. Support for raising the wage is shared broadly across race, age, income, gender and even political party divides because for most people it’s a simple moral issue: no one who works full-time should live in poverty. Yet while few people support a low minimum wage, lobbying powers like the Chamber of Commerce and National Restaurant Association have managed to grind the issue to a halt in Congress. Corporate interests with deep pockets are able to hold Republican lawmakers tightly in line with the business agenda while also maintaining a firm grip on Democrats in swing districts seeking big money donors for tough reelection battles. In the gridlocked era where virtually zero meaningful legislation has been signed into law since the Tea Party wave of 2010, something like the minimum wage is dead on arrival, no matter how much popularity it has with the public.
Paid family leave has similar broad support, including a majority of Republicans—who would be against parents being allowed to spend time with their newborn children? Its growing popularity is tied to rising concerns about American work-life balance as the average workweek reaches 47 hours and American women’s presence in the workplace has stalled while continuing to rise in other countries. Major companies like Netflix have gained recent national attention and praise for adopting paid family leave for their workers (although they exclude their low-wage workers who need it most, showing why we can’t rely on the benevolence of our corporation-people-friends). It’s become a major campaign issue in the 2016 presidential election, playing a prominent role in the first Democratic debate and even getting lip service from Marco Rubio. Yet despite being one of the most popular kids at the public policy party, family leave faces the same impossible odds in Congress as the minimum wage.
Why the Fight for 15 Movement is Working Anyway
Despite a Congress made dysfunctional by GOP obstruction and corporate money, the national movement to raise the minimum wage went in two years from impossible to unstoppable. When fast food workers first began striking in 2013, demanding $15/hour wages, serious journalists and political pundits inside the Beltway dismissed the cause as laughable. But the labor and social justice organizers laying the groundwork of the Fight for 15 movement knew what they were doing. The strategy had already been tested with a push for a modest $10 minimum wage ballot initiative in San Jose, which delivered a win in 2012. The first $15/hour minimum wage victory came in 2013 with a massive and expensive battle in the tiny town of SeaTac, WA, whose economy is anchored by Seattle-Tacoma International Airport. SeaTac was the perfect place to prove that 15 was possible. Meanwhile nearby, the $15 minimum wage debate had landed in the center of the Seattle mayoral race, and after the election the city council negotiated an agreement with business interests to pass an increase, bringing national attention as the first major city to pass a $15 minimum wage. Wage increases continued to sweep the left-leaning West Coast, especially the many cities of the San Francisco Bay Area. Moderate minimum wage hikes were put on the ballot across the country in the 2014 election, passing in four rural red states. When the Los Angeles City Council reached an agreement this year to pass a $15 wage in the second largest city in the US, raising up a low-wage workforce many times the size of Seattle’s or San Francisco’s, there was no denying that $15 had gone from pipe dream to national benchmark.
The strategy was a tectonic shift for the labor movement. Traditionally unions have invested massive resources into electing Democrats to Washington, DC and trying to push them to take a pro-labor stance on federal legislation, a strategy which has had little success on key issues like opposing trade agreements and removing barriers to workers unionizing. Yet over the past few years, organized labor has experimented with investing heavily in local grassroots organizing, including fast food and retail workers who face long odds of forming unions under current laws. They’ve pushed full steam ahead with minimum wage campaigns, often using ballot initiatives to bypass elected officials influenced by corporate donors and ride strong support among regular people to victory.
Fight for 15’s strategic brilliance is based on a few key concepts perfectly tailored to the political environment of the 2010’s:
- Going Hard: Winning these battles requires maximizing the one asset we have: people power. By staking out a position like $15/hour bold enough to actually excite and mobilize regular people (even if the conventional wisdom of political elites said it was impossible), Fight for 15 built an unstoppable movement from the ground up.
- Going Local: The farther away from regular people the decisionmaking process gets, the less power everyday working people have and the more power corporate lobbyists have. Pushing for citywide or sometimes statewide minimum wage hikes built grassroots momentum and kept the movement from being bogged down in Washington DC.
- Going Simple: Of the many policy ideas to address economic inequality, the minimum wage is one of the simplest, which paints the choice for voters in clear moral terms. The more this battle is fought out in broad daylight rather than in backroom negotiations over the wonky details of obscure policy, the more it draws a clear divide between corporate lobbyists and regular people.
Why Paid Family Leave is Next
The DC proposal for paid family leave picks up on all of these strategic elements. It’s the first time paid family leave has ever been done at a city level. It’s also a far bolder proposal than any state has adopted, with no state offering more than 8 weeks or coming close to fully replacing workers’ normal income during that time. (Here in California you can get up to 6 weeks at 55% of your normal wages by tapping into your state disability benefits.) The DC plan is 16 weeks of fully paid leave for workers who earn up to $52k a year, with half pay above that, and it covers adoptive and LGBT families. And while it’s a little more complex than a minimum wage increase, the overall concept is a simple one that makes obvious sense to the average voter.
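As a rough sketch, the benefit formula can be written out in a few lines of code. This is one plausible reading of the plan as described above: the function name, and the interpretation that income up to $1,000/week ($52k/year) is fully replaced while income above that line is replaced at 50%, are my assumptions, not the bill’s text.

```python
def weekly_benefit(annual_salary: float) -> float:
    """Weekly paid-leave benefit under one plausible reading of the DC plan:
    full wage replacement on income up to $52k/year, 50% on income above it."""
    weekly_wage = annual_salary / 52
    threshold = 52_000 / 52  # $1,000 per week
    return min(weekly_wage, threshold) + 0.5 * max(0.0, weekly_wage - threshold)

# A $40k earner keeps their full ~$769 weekly wage; an $80k earner
# gets $1,000 plus half of the remaining ~$538.
for salary in (40_000, 52_000, 80_000):
    print(salary, round(weekly_benefit(salary), 2))
```

Under this reading, the benefit is progressive by design: lower-wage workers, who are least able to afford unpaid time off, lose none of their income while on leave.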
While a majority of American workers earn above $15 an hour, only 11% of Americans have paid family leave. Paid family leave makes the biggest difference in the lives of working-class women, but it also helps bring in the solidarity of professional-class women who know how precarious their own economic status can be and how awful family care policy is in the US. And it taps into a growing number of men, especially young men who came of age amid shifting gender roles and genuinely want to be present in their children’s lives but are held back by Stone Age workplace policies and cultures that don’t accommodate paternal leave. In fact, men’s share of family leave claims doubled after California adopted paid family leave in 2004.
A good campaign can be led by the people who are most directly affected, brings in new people to the movement and energizes those who are already part of it, makes tangible lasting change in people’s lives, exposes the bad guys for how shitty they truly are, and ultimately shifts the balance of power. That’s what Fight for 15 has done and that’s what paid family leave has the potential to do too.
It’s Part of Something Bigger
What’s happening right now is not just a series of campaigns to raise the minimum wage. It’s the revival of a labor movement that engages the vast majority of Americans who aren’t union members. It’s collective bargaining at a mass scale of not just one company’s employees, but the population of entire regional economies like the San Francisco Bay Area and Los Angeles. It’s not just minimum wage increases that are being won by this strategy. Many of the ballot initiatives and ordinances have also included paid sick days and wage theft enforcement. San Francisco has even begun to lay out the right to a predictable, sane, work schedule.
In the 21st century, grassroots local movements are not just going to lead the way on increasing the minimum wage. They’re going to push cities and counties and states to pass stronger enforcement of existing wage laws, enact paid sick days, paid family leave, reasonable hours and scheduling, health and safety standards, and perhaps even equality for the most disenfranchised workers excluded from many labor laws like domestic workers and farmworkers.
Movements like Fight for 15 that raise standards for all workers from the bottom up are reminding us why we ever had a labor movement in the first place. They’re reminding us why fighting for the dignity of working people matters. They’re reminding us that when it comes to the national debate on economic inequality, workers outnumber and outvote bosses. They’re reminding us that when we organize, we win.
San Francisco has become a flashpoint in the national political battle over eye-popping rent increases in America’s cities.
San Francisco real estate developer Michael Cohen, who used to run the city’s economic development department, says “The single most important land use debate that goes on in San Francisco is whether you believe that the laws of supply and demand exist.”
This explanation hasn’t satisfied many longtime residents, whose swelling anger against gentrification and displacement has broken out into mass protests against new luxury developments in historically working-class neighborhoods and a ballot measure this fall to halt all new luxury development in the city’s Mission District.
Cohen, the growing SF Bay Area Renters Federation (SFBARF) and others, say more of these new developments are the only way to bring down rents citywide, because they will relieve the city’s chronic shortage of housing. They say protests against new development are actually the reason for rising rents, as the city’s constrained housing supply is lagging far behind the demand of young professionals flocking to San Francisco and other urban centers to work in booming well-paid tech industries in recent years. Others, largely led by community organizations like Causa Justa/Just Cause made up of longtime residents, predominantly working-class families and people of color, call this “trickle down housing” and say it doesn’t work.
So why are these Bay Area native folks so skeptical of basic economic theory? Well maybe it’s because your basic economic theory is hella basic, bruh. Economic models are about simplification of the real world—we need models to teach theories, but sometimes those simplifications become a problem when we apply them to real life.
The people who lived and worked and raised their families in low-income urban neighborhoods long before the hipsters thought it was cool, in the decades when the middle-class fled to white suburbia, have a lifetime of experience with the economic reality of urban housing markets, not the basic economic theory.
Here’s what they see: The same neighborhoods that are seeing the biggest rent increases are those seeing the most new housing developments. They see the East/West divide of the city of San Francisco: The densely populated eastern neighborhoods like the Mission and SoMa are booming with new development while facing eye-popping rent increases and overwhelming numbers of evictions. Meanwhile, the western side of the city, with its many low-density middle-class enclaves that are hotbeds of NIMBYism, sees relatively little change. They’ve seen a boom and bust cycle of interest in San Francisco by developers and yuppies: the long flight of white middle-class families and employers from the city for decades, followed by a tech bubble in the 90’s that led to a development rush before popping spectacularly, and then a resurgence of the tech industry in the last few years leading to another cycle of development, gentrification and rising rents. It’s hard to shake the gut feeling that development isn’t for them, that in fact it hurts their community. They’ve been tossed around by this boom and bust cycle, families losing their homes, friends losing their stores, feeling like strangers in their own neighborhoods, in a story that just doesn’t align with that supply and demand graph.
[Figure: Housing cost increases in SF]
[Figure: New housing developments being proposed in SF]
For longtime residents, the basic supply and demand theory just doesn’t pass the bullshit smell test. Maybe it’s not because they’re stupid and don’t understand economics. Maybe it’s because thanks to their actual lived experience, they understand how urban housing markets work in practice better than the Patiently Explaining Gentrifiers understand them in theory. So my dear Patiently Explaining Gentrifiers, the next time you roll your fixie past the black family in your apartment and they look at you sideways, please refer to this helpful guide to break down the economics that they understand and you don’t.
1. Luxury housing doesn’t really substitute for low-income housing
Housing marketed to young urban professionals is not a perfect substitute for housing built for blue-collar families. That means the two have separate markets setting their prices, driven by different levels of supply and demand. (Not to mention these two segments of the housing market diverge further the more economic inequality grows.)
Imagine a world where you could only buy two cars: Cadillacs and Honda Civics. Most people who own Cadillacs wouldn’t be caught dead driving a Honda Civic and most people who push a Civic can barely dream of owning a Cadillac. So if GM decided to build a lot more Cadillacs next year, you’d expect to see the price of Cadillacs go down, but it probably wouldn’t have much effect on the price of Civics.
But that doesn’t mean the two prices are totally unrelated. A bargain item can be a substitute for a luxury item, it’s just not a very good one. Say for example, there’s a sudden rush of rich people wanting brand new Caddies. They start buying up all the Cadillacs on the car lots, but the people who already own Cadillacs insist that to retain the “Cadillac brand” (and maybe improve the selling value of their car), GM must not produce more than a limited amount per year. The resulting shortage sends Caddie prices sky high as the dealerships are swarmed with people trying to outbid each other. But those who can’t get their hands on a Cadillac still need a car. So the second-class yuppies (you know, the ones who work at Ask.com instead of Google) start buying Honda Civics and tricking them out, adding heated seats and TV screens. Seeing Civics have gained a new customer base with more money to spend, the Honda dealerships start raising their prices too.
Replace cars with housing and dealerships with landlords, and this is part of what we’re seeing with the housing market in cities like San Francisco, and it’s the basic argument of the folks at SFBARF.
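The Cadillac/Civic spillover can be sketched as a deliberately crude toy model. Every number here is made up for illustration, as is the bidding rule (each unit of excess demand bids the price up by $1); the point is only the direction of the effect, not its size.

```python
def budget_price(luxury_supply, luxury_buyers, budget_buyers, budget_supply,
                 base_price=100, spillover=0.5):
    """Price in the budget market after priced-out luxury buyers spill over.

    spillover: assumed fraction of priced-out luxury buyers who settle
    for a budget unit instead (an imperfect substitute, so < 1).
    """
    priced_out = max(0, luxury_buyers - luxury_supply)
    effective_demand = budget_buyers + spillover * priced_out
    excess = max(0.0, effective_demand - budget_supply)
    return base_price + excess  # toy rule: +$1 per unit of excess demand

# No luxury shortage: budget prices stay flat.
print(budget_price(luxury_supply=100, luxury_buyers=100,
                   budget_buyers=500, budget_supply=500))  # 100.0
# Cap the luxury supply (the limited Cadillac run) and budget prices rise too.
print(budget_price(luxury_supply=60, luxury_buyers=100,
                   budget_buyers=500, budget_supply=500))  # 120.0
```

Notice that because the goods are imperfect substitutes, only part of the luxury shortage leaks into the budget market, which is exactly why building more luxury units helps somewhat, but, as the next sections argue, not nearly as much as the simple supply-and-demand graph suggests.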
So the natural solution is of course to build more Cadillacs (aka high-end housing developments) to address this problem at its root?
Yes, that’s true to some extent. Solving the housing crisis will require building more luxury housing somewhere. But there’s more.
2. Neighborhood speculation raises land values
We can’t just replace cars with housing and dealerships with landlords in a simple model of the world, because cars aren’t like housing. If you park your Caddie in the spot next to my Civic, it doesn’t cause my monthly car note to get more expensive. But with housing, only part of what you’re paying for is the physical structure; most of what you’re paying for is the location, the land, the neighborhood.
When you build expensive new condos next to low-income apartments, it has an external effect: it raises the value of all the land in the neighborhood surrounding it. (The same way if you put a toxic waste dump in a neighborhood it lowers the surrounding land values.) Land is a speculative asset: unlike the buildings on top of the land, you can’t build more of it. There’s a limited supply and the best way to make money from it is to buy up as much as you can get your hands on now if you think the price is going to go up in the future. Developers start rushing to get in first on this hot new neighborhood, bidding up land values (“Did you see the New York Times wrote an article about this totally up-and-coming neighborhood??”)
Once the new residents move in, they create demand for someone to open up an artisanal kale wrap deli and kombucha bar next door and a barbershop on the corner that will trim your fixed gear bicycle’s decorative moustache. Those new amenities create even more demand from yuppies and hipsters to live in that neighborhood. The cycle of rising land values continues to spiral out of control.
So what the hell happened? Why isn’t increasing the housing supply bringing down rents?
3. Rising land values reduce the supply of low-income housing
Now if I’m a landlord who currently owns a fairly cheap apartment building and rents primarily to working-class immigrant families, when neighborhood land values rise, suddenly I own an asset that I’m not using to its full potential. I’m better off completely shifting my business model of what kind of housing I’m providing on the land. I’m basically burning free money unless I either sell the land to a developer or fix the place up myself and charge higher rents to new customers willing and able to pay more. This will likely involve evicting my low-income tenants from their homes. If my city has strong renter protection and/or rent control laws, I’ll have to do whatever I can to harass my tenants, threaten to call the police or immigration on them, refuse to make repairs, or otherwise make their life a living hell until they move out. I might even convert my rental apartments into condos for sale using what’s called an “Ellis Act eviction” to evade tenant protection laws. I’m reducing the supply of low-rent housing by converting it to a different type of housing.
Here’s a similar example: Say I grow wheat in North Dakota. If shale oil is discovered in my region and everyone around me is fracking for oil, I’m sure as hell not going to keep farming my land for wheat. But if I take my land out of wheat production and sell it into oil production, guess what that does to the supply of wheat?
And if I’m a developer looking to build cheap, low-cost housing marketed to working-class families, I’m sure as hell not going to do it anymore now that the first step is buying premium-priced land, which now describes every single neighborhood in cities like SF and NY. New development of housing for working-class families basically grinds to a halt, because it’s not profitable to buy expensive land and then rent it for cheap. The supply of low-income rentals is strangled, despite the fact that the booming growth of tech is also creating tons of low-paid jobs for janitors, landscapers, cooks, childcare workers, and security guards who are fueling rising demand for this type of housing.
Say I want to open a cheap sandwich shop. I’m trying to decide between selling Mexican tortas for $5 or Vietnamese banh mi for $4. Suddenly the price of wheat spikes, and bread now costs $3 a loaf. My old business ideas would no longer turn a profit once you factor in the other costs, so I either don’t open my shop or I change my business plan and open up an Artisan Torta/Banh-Mi Fusion Deli and charge $15 a sandwich. The higher price of an input, wheat, has restricted the supply of cheap sandwiches, just as a higher price of land restricted the supply of low-income rental housing.
That’s why between 2007 and 2014, San Francisco built more than twice the high-income housing California’s housing department projected it would need, but only half of what it needed for low-income families and a third of what it needed for middle-income households.
So as speculation raises land values in a neighborhood, landlords shift their buildings away from the low-income market and orient towards the high-income housing market. And developers are unable to build new housing affordable to low-income families because land values are so high. Both trends result in a massive sucking away from the neighborhood’s housing supply for working-class families.
Yuppies gobble up more housing space per person
But it gets worse. This assumes a 1:1 ratio of replacement as developers and landlords shift from supplying low-income housing to high-income housing. But blue-collar households tend to have more people in the same space than white-collar households. Apartments that once held a working-class immigrant family of five are now being converted to become home to a young tech worker, his bandana-wearing pug, and his girlfriend who stays over sometimes to watch Netflix and chill. That means landlords and developers are responding to speculation by taking low-rent housing supply off the market even faster than they’re putting high-rent housing supply into the market.
Every new high-rent development in a low-income neighborhood feeds the cycle: speculation raises land values around it, bringing new low-rent development to a halt and converting the existing nearby supply of low-rent housing, at an alarming rate, into high-rent housing for people who demand much more square footage per person. Thus a development that helps relieve the shortage of high-rent housing can actually create a much worse shortage of low-rent housing.
That’s why residents of rapidly gentrifying neighborhoods like San Francisco’s Mission District are protesting new luxury developments. When you’ve lived through this kind of speculative development, you don’t need an economics degree to know the math.
But it doesn’t have to be a zero-sum game.
We can increase the supply of expensive housing while also increasing the supply of affordable housing. But we need to remove the factor of neighborhood speculation from the equation.
Before I go further, let me make it abundantly clear that I agree we absolutely need to build more housing suitable for young professionals in central cities. In fact, we need a lot more of it. Too often, social justice activists struggling every day to defend our communities’ right to live in their own homes forget that white flight to the suburbs in the late 20th century was one of the worst things that ever happened to low-income communities of color in the US. It devastated funding for urban schools and social services as public resources were shifted out to suburban bedroom communities. Rising economic and racial segregation widened income inequality and reduced economic mobility as the rich and poor lived in two separate worlds. And the massive environmental toll of millions of commuters driving out to far-flung quiet neighborhoods every day manifested itself in the air pollution and climate change whose burden falls most heavily on low-income communities of color. We need to ask ourselves: What’s our endgame? Maintaining the pre-boom segregated status quo? Because that’s an awful future.
We know that the reversal of last century’s white flight is a good thing. But not if it simply leads to displacement of the urban working class and communities of color. When the last affordable neighborhoods in San Francisco and New York disappear, and the displaced working-class families are all forced out of the cities into a ring of high-poverty suburbs, both the economic isolation and the environmental devastation will remain just as they were before.
We need shared cities free of land speculation.
We can do this by building high-end housing in urban neighborhoods that are already historically middle and upper class, where it won’t fuel speculation that a neighborhood is “up and coming”. Every city has these neighborhoods. They’re on the west sides of Los Angeles and San Francisco and the north side of Seattle (I’m a die-hard West Coaster). They’re the neighborhoods in your city with the oldest median age, the lowest population density, the highest home ownership rate, the whitest residents, and the highest incomes. They’re the places seeing virtually no new housing being built right now. Often strict zoning codes limit new building in these neighborhoods to two stories or single-family homes, and the residents are fiercely opposed to denser apartments nearby, using their abundance of free time to rail against the parking problems, crime, and noise new apartments will supposedly bring (often code for younger, poorer, or browner people). The residents of these neighborhoods tend to be better resourced, better organized, and better connected than those in low-income neighborhoods, who often face language and educational barriers and are too busy working long hours for low wages to attend planning commission meetings. Developers quake in fear of their wrath, and major new housing projects are rarely proposed, let alone make it to the review phase to be fought over. These are the real NIMBYhoods, and they need to be upzoned. Simply changing zoning codes in the lowest-density parts of cities to allow taller, denser buildings could lead to a housing development surge without raising rents in low-income neighborhoods.
It won’t be easy. The mayor of Seattle managed to negotiate an agreement with business interests to pass the first $15 minimum wage in any major American city, but when he backed ending neighborhood bans on apartment buildings by eliminating single-family-only zoning citywide, he met staunch opposition and withdrew the proposal.
Beyond political opposition, there’s also a logistical problem. These neighborhoods tend to have limited public transit service (part of their exclusion of young people, poor people, and people of color), which makes it hard to add more apartment-dwellers, especially those dependent on public transit. We’ll need to build out more transit between the NIMBYhoods and the downtown areas where young professionals work.
But building high-end housing in already high-end neighborhoods is the only way to increase supply without triggering the spiral of speculation that raises land values and pushes poor people out.
Meanwhile, we need to ensure we’re also increasing the supply of housing for the working class, which is bound to erode away if yuppies keep arriving faster than our cities can build housing for them.
Longstanding tools that cities use to nudge developers to pay for affordable housing, like inclusionary housing ordinances and density bonuses, definitely help, especially as private development booms. But alone they’re not enough to maintain the balance of different types of housing needed in growing cities where both software programmers and their janitors need a place to live. With the sharp decline in affordable housing funds from the state and federal governments, this will require new sources of revenue.
A land value tax could finance public transit expansions into high-end neighborhoods while also creating a new affordable housing fund. This fund could buy empty plots of land or buildings that go up for sale in low-income neighborhoods and add them into a Community Land Trust—publicly-owned land made permanently affordable to low-income families, where the benefits of increases in land values are captured by the public instead of landlords. This kind of tax would fall mainly on landlords who are riding the wave of speculation, sitting on their land and extracting bigger and bigger profits by charging higher and higher rents to tenants. It wouldn’t tax building more housing on top of your land. And it could protect a large chunk of urban neighborhoods from the wild swings of speculation—or at least make sure that longtime residents actually reap their share of the benefits.
I won’t pretend to know all the answers and I’m not an actual economist (to be fair, neither is Matt Yglesias, the intellectual father of the movement to increase housing supply, whose book The Rent is Too Damn High is the only e-book I’ve ever bought). I’m a guy who lived in Oakland as a kid and knows my rent would be double what I’m paying right now if I moved back there. And I’d hate to be one of the kids in Oakland right now looking around and thinking maybe my city doesn’t have a place for me anymore.
But it doesn’t have to be that way.
A shared city is possible. A city where people from a diverse set of racial backgrounds and economic classes cross paths in public spaces, learn from each other, and create things together. A city where we all contribute to and benefit from the same school districts, transportation networks, libraries, parks and city services. A city that allows more people to live lifestyles that are walkable and transit accessible, energy and water efficient, allowing us to sustain our planet. A city where children can grow up, become adults and get good jobs and support their own families, and one day retire in peace. But unleashing the animal spirits of unchecked speculation, much like the gold rush that once built San Francisco upon the violent displacement of its native inhabitants, will carry us down a much different path.
We’re tinkering at the margins of disaster, putting 40 million people in jeopardy because we’re terrified of upsetting politically powerful corporate interests.
While the news has been buzzing with Governor Jerry Brown imposing California’s first-ever mandatory water restrictions in response to the catastrophic drought, what often goes unspoken is that the new constraints leave untouched the state’s biggest water consumer by far: agribusiness. Agriculture uses 80% of California’s water, yet the only thing Brown is requiring agricultural companies to do is provide more information about their water use. Gov. Brown’s response to criticism? “Some people have a right to more water than others.”
This is a preview of the broader politics we’ll see unfold as America struggles to adapt to climate change. On a global scale, climate change is primarily being caused by the unchecked consumption of the rich and the reckless path of the powerful. Meanwhile, the people harmed most by drought and sea level rise here in California, and other negative impacts all over the world, will be the poor and powerless. Much like during our recent economic disaster, as we face environmental disaster, lawmakers and other very serious people will tell us that we all need to tighten our belts and make sacrifices for the greater good in a harsh new world. Yet at the end of the day, it always seems that the only sacrifices made are from everyday people whose contributions are metaphorically and literally a drop in the bucket. Meanwhile, the wealthy interests that lie at the root cause of the problem sail along with their profits, subsidies, and guarantees intact.
The people of California didn’t cause this drought. The people of California are not luxuriously long-shower-taking germophobes who have crushed our environment beneath the weight of our excessively detail-oriented dishwashing. This drought is the result of generations of poorly managed water policy driven by big agribusiness’s heavyweight lobbyists, who demand ultra-cheap water rates. This drought is the result of a housing bubble driven by real estate developers and banks who financed endless expansion of suburban sprawl across the scorching heat of inland California for families who could no longer afford to live in increasingly expensive coastal cities. This drought is the result of hopeless inaction on climate change, where overwhelming warnings from the scientific community are being screamed down by the political megaphone of the fossil fuel lobby.
A real drought response would focus on the root causes of our water consumption. It cannot be emphasized enough that 80% of our water is used by agribusiness, and much of the rest goes to golf courses and country clubs. But even our residential water consumption is not equal: the majority of residential water use (and the vast majority of the water not needed to keep people alive and healthy) goes to outdoor landscaping like lawns. Single-family homes have twice the outdoor water use of multifamily apartments, and rich neighborhoods use three times the water of poor neighborhoods.
Here’s what a real drought response might look like if we weren’t so afraid of powerful special interests:
1) Build sustainable affordable housing in coastal cities
In the peak summer months, an average San Francisco resident uses 46 gallons of water a day. Other coastal cities range around 50-100 gallons, while inland cities average around 200-500 gallons a day per person. It takes massive amounts of water to keep lawns green in suburban subdivisions sprawling out from scorching hot inland cities. We need housing growth policies that encourage dense, affordable, water/energy-efficient multi-family housing in cool coastal urban areas rather than McMansions in the hot inland parts of the state. But right now our housing regulations do the exact opposite: it’s easy for developers to build cheap new housing in Bakersfield, Palmdale or San Bernardino, not so much in San Francisco, Santa Cruz or Santa Barbara, where longtime wealthy homeowners are vehemently opposed to higher density apartments being built in their neighborhoods.
Although more people are migrating out of California than in, our population is still inevitably growing as more children are born here. So unless we want some sort of draconian policy restricting childbirth, the question is not whether more people will live in California, but where they will live. Unfortunately, because of the crushing unaffordability of California’s coastal and urban areas, the vast majority of population growth has been moving to inland areas like the Central Valley, High Desert and Inland Empire with cheaper housing, but much higher water needs. Promoting dense infill development of affordable housing in coastal urban areas would help increase economic opportunity for working families while creating the serious systemic reform we need to manage California’s water resources long term.
2) Keep fracking from endangering our water supply
Land is rapidly being snatched up across the Central Valley and Central Coast to open new oil wells above the Monterey Shale. Fracking uses significant amounts of water (70 million gallons in California last year), but the bigger problem is that it threatens to pollute our limited water supply with the undisclosed chemicals used in new drilling methods. Fracking produces massive amounts of toxic wastewater, with the challenge of wastewater disposal becoming a ticking time bomb which could contaminate our dwindling clean water supplies. Oil companies have shown a blatant disregard for California’s weak regulations, with hundreds of illegal wastewater pits being discovered right next to farmland and above groundwater supplies in rural California. Yet along with agribusiness, the oil industry was also given a free pass on Governor Brown’s new water restrictions.
But even worse, oil in the Monterey Shale is as dirty as the Canadian Tar Sands. Fracking California’s shale creates the potential to put over 6 billion tons of carbon into the atmosphere, nearly as much as the Keystone XL pipeline, driving forward the climate change that is fueling this extreme drought. The science is clear that climate change increases the frequency and severity of catastrophic drought in California. There can no longer be any doubt for drought-stricken Californians that the climate is changing, and if we want to keep it from getting worse, we need to stop the relentless digging for more dirty energy.
3) Stop subsidizing water-guzzling agribusiness
While agribusiness uses 80% of California’s water, not all farms are created equal when it comes to water consumption. Growers choose between planting different crops, some of which use many times more water than others. Even within the same crop, different growers choose to use more and less efficient irrigation methods. Like any business, California’s growers are making basic mathematical calculations of how to maximize their profits. So when agribusiness is provided artificially cheap water by the government, typically at lower rates than you and I pay as residential consumers, growers pick profitable but thirsty crops, and cheap but wasteful irrigation methods.
Anyone who’s passed an intro economics class could tell you that when there’s a shortage of something, the price is supposed to go up. But agribusiness, with its powerful lobbyists in Sacramento, has long been coddled by lawmakers and protected from actually paying fair market prices for water (big business is always all about the free market until it’s not). By keeping agricultural water prices artificially low, the government is directly and massively subsidizing drought-intensive industries like almonds and cattle. It takes a gallon of water to grow a single almond, and 10% of the state’s water goes to the almond industry alone. A pound of beef takes 2,500 gallons to produce, and one hamburger takes about as much water as a month of daily showers.
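The hamburger-versus-showers comparison checks out on a napkin. Here’s a rough sketch of the math, where the 2,500 gallons per pound of beef is the figure cited above, but the patty size, showerhead flow rate, and shower length are my own illustrative assumptions, not numbers from this post:

```python
# Back-of-the-envelope check of the beef vs. shower comparison.
# Assumed figures: a quarter-pound patty, a 2.1 gal/min showerhead,
# and one 8-minute shower per day for 30 days.
GALLONS_PER_LB_BEEF = 2500          # figure cited in the post
patty_lbs = 0.25
burger_gallons = GALLONS_PER_LB_BEEF * patty_lbs        # 625 gallons

shower_flow_gpm = 2.1               # typical low-flow showerhead
shower_minutes = 8
days = 30
month_of_showers = shower_flow_gpm * shower_minutes * days  # ~504 gallons

print(f"One hamburger: ~{burger_gallons:.0f} gallons")
print(f"A month of daily showers: ~{month_of_showers:.0f} gallons")
```

Under these assumptions the two come out within about 20% of each other, which is close enough to justify the comparison.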
I’m sympathetic to the concern that raising the cost of water as an input to growing food will raise costs at the grocery store for struggling families. But it wouldn’t be hard to design a simple policy that keeps overall food prices low while shifting growers toward more drought-resistant crops. California could put an emergency drought surcharge on the sale of water to agribusiness, then use that revenue to subsidize low-water-use fruits and vegetables at the point of sale to the consumer. A grower would then face a different calculation when deciding whether to plant another crop instead of almonds, or whether to invest in that new irrigation system. This would shift the behavior of both food consumers and food producers toward more drought-resistant foods, as prices of water-intensive foods go up while prices of water-efficient foods go down. For example, in Ventura County where I live (the 10th-biggest agriculture-producing county in the US), that could mean land shifting from water-sucking strawberries to other major local crops that are more drought-resistant, like lemons and avocados. A policy like this could even make it easier for low-income California families to afford healthy foods, a major challenge facing poverty-stricken communities.
We need to step up and take real responsibility for a serious long-term water management plan if we want to sustain life for 40 million people in California and growing. There is simply no way to protect our water supply for future generations without meaningful systemic reforms addressing agricultural water use, oil drilling, and housing development. Yes, they would raise howls of protest from some of the state’s wealthiest and most powerful political interests: agribusiness, oil companies, and real estate developers. But allowing money and corporate interests to control our politics is what’s got us stuck in this climate change mess in the first place. At some point, we have to stop fucking around with our planet, put on our big kid pants, and do the right thing.
The ALS Ice Bucket Challenge, like any internet phenomenon, has had its backlash and the inevitable backlash against the backlash. But whether or not you like it, no one can deny that it’s one of the most effective fundraising campaigns in recent memory.
But at the end of the day, one horribly depressing fact makes it all seem like a heartwarming act of staggering futility: The tens of millions of dollars raised by fundraising gimmicks like this are drops in the bucket (excuse the bad pun) compared to the tens of billions spent by the federal government on medical research. By far the largest contributor to ALS research is normally the National Institutes of Health, a taxpayer-funded government agency which has lost 25% of its purchasing power over the last decade as an insatiable thirst for budget cuts has become the new normal on Capitol Hill.
As of this post, the ice bucket challenge has raised $70 million, which means that this year private ALS research funding will actually surpass public funding. But the problem with internet phenomena is they die quickly. Of course The ALS Association will probably receive some permanent bump from cultivating long-term donors, but no one expects this level of funding or even anything close to it to continue indefinitely.
Let’s say the ALS ice bucket challenge plateaus out after raising about $100 million. Federal government funding for ALS research has declined from $59 million annually in 2010 to $40 million this year. That would mean over five years, federal budget cuts completely wipe out the gains from all those ice buckets. Unless the ALS Association can come up with an equally successful online fundraising campaign every five years, in the long run the future of ALS research looks pretty bleak.
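The five-year arithmetic above can be made explicit. The $100 million plateau is the hypothetical stated in the paragraph; the federal figures are the post’s $59 million in 2010 and $40 million this year:

```python
# Sketch of the funding math described above (all figures in $ millions).
viral_total = 100     # hypothetical plateau for ice bucket fundraising
funding_2010 = 59     # annual federal ALS research funding, 2010
funding_now = 40      # annual federal ALS research funding, this year
annual_shortfall = funding_2010 - funding_now   # $19M lost per year

# Years until cumulative federal cuts cancel out the one-time viral windfall
years_to_wipe_out = viral_total / annual_shortfall
print(f"Cuts erase the windfall in about {years_to_wipe_out:.1f} years")  # ~5.3
```

In other words, at the current rate of cuts, the entire windfall evaporates in just over five years.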
Once you look not just at ALS, but the broader picture of countless deadly diseases the scientific community is simultaneously trying to combat, it becomes abundantly clear how impossible it is to adequately fund medical research through social media fundraising campaigns. It’s difficult to imagine research on another disease having an equally popular viral marketing campaign at the same time—there’s simply limited space in our social media newsfeeds and our attention spans. Even if every couple years, research on one particular disease saw a surge in a few tens of millions in funding from momentarily trending on social media, it will never be enough to make up for tens of billions in slashed federal funding for disease research as a whole.
The larger question we need to ask ourselves is: How should we as a society be funding medical research?
As Republicans in Congress have forced billions in cuts to public medical research, far outstripping anything that can be raised from individual donors on the internet, one can only wonder: What about the diseases that don’t have such a brilliant viral social media campaign? Hell, what about ALS a year from now? Are we moving towards a society where public priorities like curing diseases must rely on appealing to the whims of social media trends, competing for our short attention spans in the jungle of the internet by coming up with increasingly flashy ways to raise money? Are we becoming a society where charities must devote enormous resources to trying to come up with the next viral video or trending hashtag to fill the gap of services the government should be providing? A society where resources are distributed not based on scientific expertise, but based on which cause has the best marketing campaign?
Government is and always will be more effective at raising money to cure diseases than the internet. Tens of millions of dollars for ALS, which required a one-in-a-million viral social media success, could be financed by literally pennies added to the average American’s taxes.
But we don’t like this because taxes mean coercion and coercion means controversy. If I personally don’t want to contribute a few cents every year in my taxes to research ALS, should I be forced to?
The answer is yes: this is what democracy is for.
As a society we can collectively decide some priorities are too important to leave charities scrambling to scrape together resources, and we can democratically choose to raise much larger sums of money through taxing ourselves to fund public goods like scientific research. We can adjust the amount people are required to contribute based on their income, so CEOs give more than janitors. We can have scientists, public health experts, and health economists make decisions about where to spend that money so that even if I have no idea what ALS is (I didn’t before the ice bucket challenge) some small portion of my income is still directed to finding a cure.
We can get serious about curing and preventing disease, ending poverty, improving education, caring for the elderly, keeping our air and water clean. But only if we’re willing to do the hard thing. If we’re willing to say to people: “I don’t care if you don’t know what ALS is. I don’t care that even if you did know, you wouldn’t contribute 50 cents a year to cure it. You can’t get out of this by dumping an ice bucket on your head. Those of us who do care outvote you.”
The ALS Association is doing a great thing, but they are hopelessly outmatched by the callousness and political power of the budget-slashers in Washington. We will never, ever, ever be able to give medical researchers the resources they deserve, no matter how many internet fundraising campaigns we have, unless we recognize the politics of this issue and take a stand against those who would gut medical research in order to pay less in taxes, who place private profits over public good. What we need is not fleeting interest from the American public to string together temporary private dollars for the latest cause. What we need is a commitment to using democracy to achieve our goals. Democracy means controversy, democracy means conflict, but democracy is the way to create true lasting systemic change.
The hottest trend in education right now seems to be buying an iPad for every student, especially in high poverty schools. By providing tablets to students who may not have computer access at home, the theory goes, we can ensure all children in America have the skills they need to succeed in a 21st century economy.
But the sudden popularity of iPads among school administrators despite opposition from many teachers and parents should raise questions: Are iPads actually the most effective tool to bridge the digital divide? If our education system is preparing low-income children for the 21st century, what role are they being trained to play: producers of digital content or consumers of it?
Working at a community group engaging the public in major decisions on spending new funding in several California school districts, I’ve encountered mostly negative reactions to the iPad trend. Teachers bemoan distracted students (LA schools recalled their iPads after students figured out within a week how to unblock access to sites like Facebook and YouTube). Parents worry that children will get jumped walking home in rough neighborhoods with iPads in their backpacks. Most students are happy to get a free iPad, but often say they think it’s a waste of money when compared with other more urgent school needs.
With such thin community support, why are iPads being adopted at such a ferocious pace? Part of the answer is Common Core, new education standards whose testing is now done on computers. Another part is strong marketing from Apple, which reaps major profits by controlling a staggering 94% of the market for school tablets (while building long-term brand loyalty with a huge future customer base). Finally, superintendents face an incentive to spend funds on things like iPads for everyone, which are highly visible and often generate positive media attention, rather than something like restoring furlough days cut from the school calendar, which is barely noticed by the public.
None of this is to argue against school districts investing in technology. I believe in integrating technology in schools and I’ve personally benefited from these efforts. My elementary school in the ’90s was stocked with donated Apple computers, which I remember exploring with awe. I attended a technology magnet high school that had classes from video editing to web design to computer repair, as well as a mandatory tech literacy curriculum, which included learning to use Excel, PowerPoint, Publisher, and Photoshop, and even creating basic Flash animation. I rolled my eyes at being forced to learn these programs then, but now use most of them on a regular basis at work.
Schools should be making targeted efforts to close the digital divide. More and more, college classes and middle-class jobs assume a basic level of computer skills. A lack of familiarity with Microsoft Excel or PowerPoint can cripple the career success of people from low-income families.
But the digital divide is more complicated than it appears. Surprisingly enough, smartphone ownership in the US is actually higher among blacks and Latinos than whites. We live in a society that’s difficult to participate in without the internet and many low-income families who can’t afford home computers or wi-fi use smartphones as their primary source of internet access.
The real digital divide isn’t about unequal access to mobile technology like smartphones and tablets. It’s about unequal access to real computers.
Here’s the difference: computers are producer tools, tablets are consumer tools.
If you teach a kid from a poor family how to use a tablet to surf the web, they’ve learned how to be a consumer of online content. But if you want them to learn how to make a webpage, rather than just look at one, they’ll probably need to learn on a computer, not an iPad.
But this isn’t just about teaching children to be web designers and software engineers. A major barrier that shuts low-income people out of white collar jobs in general is lack of more basic computer skills like being able to make a slideshow presentation for a meeting, design a simple publication about a topic, analyze and manipulate a spreadsheet of data, or even type quickly on a keyboard. None of these are skills you learn on an iPad.
It’s hard to predict the advances of technology, and maybe in twenty years I’ll look back and think this was naïve of me to say. But at a fundamental level, the whole point of a tablet is simplicity and mobility—it’s a product intentionally kept simple to allow it to be small, slick and mobile—which means it’s meant to supplement computers, not replace them. A tablet’s main purpose is to easily access content that’s actually created on a computer.
Let’s ask ourselves what we’re really trying to do here: What’s the deeper shift we’re trying to create through these school tech initiatives? Are we trying to widen the consumer base for the tech industry by making it possible for more people to watch videos and read articles online? Or are we trying to create a world that opens access to low-income communities of color as not just consumers, but producers of digital content as well?
It’s not only more cost effective, but more useful to invest in shared computer labs at school sites where students can learn to actually make things: Whether it’s writing code, editing videos, doing graphic design, turning data into charts and graphs, or making powerpoints and posters, these are 21st century skills that empower rather than commodify students.
If we’re about real meaningful access to the 21st century economy—about kids having a fair shot at living wage jobs and getting out of poverty—iPads for everyone is not the answer.
On Monday the NY Times covered a fascinating new study on social mobility. As you can see from the map above, there’s huge variation in the likelihood a kid from a low-income family will end up making it out of poverty depending on where they live.
Despite popular misconceptions about the “American Dream”, kids who grow up poor in the US are less likely to climb into the middle and upper classes than their counterparts in Western Europe, Canada, and elsewhere. This phenomenon has been dubbed “The Great Gatsby Curve”: countries with high levels of inequality also have low levels of mobility; in other words, if the rungs on the ladder are farther apart, it’s harder to climb the ladder. Poor kids in the US suffer from a weaker social safety net, worse health care and nutrition, more unstable housing, and limited access to childcare, preschool, and college, and they have to compete with rich kids who are even richer than rich kids in other countries (with all the resulting advantages in life).
Now let’s assume the American Dream is not just some bullshit platitude and we really do care about opportunity for all, regardless of the circumstances of one’s birth.
We might be feeling pretty hopeless right now. Fixing America’s crisis of inequality sounds overwhelming, and this useless Congress has a snowball’s chance in hell of creating any costly new social programs like universal preschool.
But what this study says to me is that it’s possible to make a significant difference at the local level. The trap of intergenerational poverty in places like Atlanta, Memphis, and Charlotte might be worse than in any country in the industrialized world. But places like Salt Lake City, the San Francisco Bay Area, and Seattle have social mobility comparable to Norway and Denmark.
What explains the differences? Why are rags-to-riches stories much less common in Chicago than in its rival metropolises of NY and LA? Why is a poor kid in San Francisco twice as likely to become successful as a poor kid in St. Louis? Can growing up poor in Seattle really give you a four times better chance in life than growing up poor in Memphis?
When you look at the map, the worst regions are clearly the Deep South and the urban industrial core of the Midwest. I associate these areas with entrenched black-white patterns of inequality and segregation. But the article notes that in cities like Atlanta, while opportunities to rise in society are scarce for poor blacks, they are just as scarce for poor whites. Maybe communities like Atlanta are more likely to see poverty in racialized terms (“Those people are not like me”), weakening their support of attempts to advance opportunities for poor folks of all races.
But let’s get into the nitty-gritty.
The authors were looking for proof that tax credits aimed at reducing child poverty, like the EITC, lead to better social mobility. But unfortunately they found that local tax policies had only a small correlation with class mobility.
So what can we do at the local level to increase opportunity? Which of these factors is most easily and clearly impacted by public policy?
It’s interesting to note that the factors that most strongly correlate with lack of opportunity are things you might associate with a social conservative policy agenda: promoting traditional family structures, community ties, and religious participation. So why then is the South, the heartland of social conservatism, the huge red swath on the map with the least opportunity? I’d argue this is because social conservative policies are extremely ineffective at actually accomplishing things like reducing teen birthrates, and often counterproductive (see: birth control, sex ed).
I don’t think anywhere in America we’ve actually developed effective policies to make people have stronger family ties or be more active in their communities. (I do think we could mitigate some of the negative effects of widespread single motherhood with things like universal preschool and paid maternal leave though.)
Unfortunately, the factors most directly tied to government policy (college tuition, local public spending, etc.) are at the bottom of the list. It seems that if state and local governments want to raise social mobility, the best thing they can do is increase per-pupil school funding, but even that has a pretty weak correlation. Clearly the focus should be on reducing high school dropout rates, but exactly how policymakers should do that is the tougher question.
I think the most interesting part of this study is the link it establishes between social mobility and segregation along racial and economic class lines. In a sprawled out, highly segregated city like Atlanta, people in poor black neighborhoods are much more isolated from decent job opportunities, good schools, social networks and other resources.
Cities and counties should be paying close attention to this. Plan dense, walkable, mixed-income neighborhoods. Provide quality public transit connecting low-income communities to job and educational opportunities. Focus economic development and infrastructure spending on the urban center, not just on the suburban outskirts. Don’t allow wealthy NIMBYs to block affordable housing in the suburbs and don’t allow developers to gentrify poor people out of revitalizing urban neighborhoods. Smart growth is not just about sustainability and hippy shit. It’s about the goddamn American Dream. It’s about everyone having a fair chance to make it. Bald eagles and apple pie and all that.
Last but not least, for the community organizers out there: Notice that “Social Capital Index” there at the top? That measures people’s civic engagement and level of involvement in community groups. Whether you’re organizing community activists to increase school funding, provide subsidized childcare, or improve public transit, the act of organizing people itself enhances economic opportunity as much as any policy change. Helping strengthen people’s ties to each other and to their community is one of the key foundations of social mobility. The best thing we can do is organize from the grassroots to make this American Dream a reality.