Externalities: Where the “Invisible Hand” Gets Cramps

Adam Smith, in his economic philosophy book The Wealth of Nations, penned a famous passage that reads as follows:

[E]very individual necessarily labours to render the annual revenue of the society as great as he can. He generally, indeed, neither intends to promote the public interest, nor knows how much he is promoting it. By preferring the support of domestic to that of foreign industry, he intends only his own security; and by directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention. Nor is it always the worse for the society that it was no part of it. By pursuing his own interest he frequently promotes that of the society more effectually than when he really intends to promote it. I have never known much good done by those who affected to trade for the public good.

The first thing that a modern reader may notice about this passage is Smith’s assumption that an individual will prefer “the support of domestic to that of foreign industry,” which is observably not the case today. This may suggest that there is a flaw in the theory, and indeed there is.

Today, free-market advocates use the idea of the “invisible hand” to advocate leaving the market as free of government regulation as possible, asserting that it will inevitably regulate itself to the benefit of society, and that attempts by the government to shape it will inevitably produce a poorer result. This is taking the idea much further than Smith himself would have done. Smith recognized that for his “invisible hand” to operate, a structure of laws and government enforcement of contract obligations, property rights, and so on was required, but the idea that government is the foe of a free market rather than its enabler has become common currency on the economic right.

To what extent is Smith’s idea of an invisible hand directing self-interest to produce benign outcomes accurate? Under what circumstances does this in fact work? Under what circumstances does it fail to work?

Where The Invisible Hand Works

The mechanism behind Smith’s invisible hand idea is competition and the requirement that a business satisfy customers. This is what prevents a business from selling defective merchandise or charging exorbitant prices for it. If it does, a competitor will seize market share by offering a better product and/or charging a lower price. By seeking his own self-interest, a business owner will serve the interests of his customers as well, because the one is dependent on the other.

This does work to an extent. It clearly breaks down under monopoly conditions, where no effective competition exists. One finds that problem in the pharmaceutical industry, where customers are captive and patent law gives companies a monopoly over many of their products.

Aside from real competition, another necessity for the operation of the invisible hand is that the person making the decision to act owns both the benefits and the costs of that action. In the simple case of a company choosing to put in the time and effort to offer a good product for a good price, that’s so. The company will reap the benefit in increased sales and market share. (The consumer also benefits, but the company’s competitors do not.) The company also pays the cost by investing capital to improve its product, or by lowering per-transaction revenue by holding prices down.

As a technical term, we may say that the benefits and costs of the business decision are both internal to the business. The person making the decision pays those costs and reaps those benefits, and so the decision is informed by both.

But what happens when that’s not so?

External Costs

What happens when a decision by a business has consequences and costs that the business does not pay? For example, when a manufacturer dumps the wastes from the manufacturing process into the air, into a local river, or otherwise on public ground, the cost in the form of health consequences and other damaging effects of pollution is borne by the public.

A part of that cost is, in fact, borne by the business, in the sense that the business owner is part of the community and has to live in it, and its employees (or even its owner) may be impacted by the negative public health effects of the pollution. But these costs don’t impact the business in particular. Most importantly, they don’t impact the business any more (or less) than they do its competitors. That being the case, the business has no incentive to reduce its pollution, since while that would slightly benefit the business itself, it would benefit the competition just as much, and hence provide no net gain.
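
To put rough numbers on that reasoning, here is a small sketch of the arithmetic a firm faces when the cost of abating pollution is private but the benefit is spread across the whole community. Every figure below is invented purely for illustration, not drawn from any study.

```python
# Toy sketch of an externalized cost, using made-up numbers.
# The abatement bill is paid entirely by the deciding firm; the health benefit
# is spread across everyone downwind, including the firm's competitors.

abatement_cost = 100_000           # private cost to the firm
total_health_benefit = 250_000     # community-wide benefit of cleaner air
community_members = 1_000          # households and businesses sharing that benefit

firm_share_of_benefit = total_health_benefit / community_members  # about $250

private_payoff = firm_share_of_benefit - abatement_cost  # what the firm weighs
social_payoff = total_health_benefit - abatement_cost    # what everyone together gains

print(f"Firm's private payoff from abating: {private_payoff:,.0f}")  # about -99,750
print(f"Society's net payoff from abating:  {social_payoff:,.0f}")   # +150,000

# Abatement is clearly worth doing for society as a whole, but a self-interested
# firm will rationally decline: nearly all of the benefit lands on people who
# paid nothing for it, rivals included.
```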

There are many things that businesses do in pursuit of self-interest that are very much not to the public good. This includes trying to hold down wages, lobbying for government subsidies, collusion and price-fixing, allowing unsafe working environments, on and on. Taken all together, these things carry costs far in excess of their benefits. But because the benefits are almost all realized by the business while the costs are mostly paid by others, the business does them anyway — and that’s a perfectly rational, sound decision.

So there’s the first situation in which the invisible hand gets cramps. It doesn’t work when costs are externalized. Under those conditions, a business’s pursuit of self-interest will not accrue to the public good.

External Benefits

But costs aren’t the only thing that can be externalized. Sometimes benefits are external to the actor, too. In that situation, it’s not that a business will do something harmful, but that it will not do something needful. When the costs are internal but the benefits are mostly external, it makes no sense in terms of self-interest to take an action.

Let’s go back to the example of wages. A business pays wages to its employees because it has to. You can’t get people to work for nothing. Just won’t happen, sorry. Not usually, anyway. So the business pays the money (an internal cost) and gets the work done (an internal benefit).

But what about raising wages across the board? What about voluntarily deciding to pay its employees more? Obviously there’s a cost to that, but is there also a benefit?

Sure. Higher wages mean more consumer spending, which generates more sales and boosts the economy. Everyone wins. But that’s exactly the problem. Everyone wins — a shared benefit — but the business foots the bill all by itself — a private cost. While the business will indeed benefit from raising its wages, so will its competitors, who will not be sharing in the cost (unless they also raise wages). Something that benefits you and your competitors equally, but that only you pay for, is not a net gain.
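
The same arithmetic applies here, just run in the other direction. The following sketch, again with numbers that are pure assumptions for illustration, shows why a raise that is privately funded but whose demand effects are shared does not pencil out for a firm acting alone.

```python
# Toy sketch of an externalized benefit, using made-up numbers.
# One firm funds an across-the-board raise; the extra consumer spending it
# creates is shared by every seller in the local economy, competitors included.

extra_payroll = 500_000          # the raising firm's private cost per year
extra_local_spending = 600_000   # new consumer demand generated by the raise
local_sellers = 50               # firms that split the new demand roughly equally

firm_share_of_demand = extra_local_spending / local_sellers  # $12,000

private_payoff = firm_share_of_demand - extra_payroll  # about -488,000

print(f"Share of new demand captured by the raising firm: {firm_share_of_demand:,.0f}")
print(f"Private payoff of acting alone:                   {private_payoff:,.0f}")

# The raise creates more spending than it costs, but the firm that pays for it
# captures only a sliver of that spending. Unless every competitor is made to
# raise wages at the same time, self-interest says don't do it.
```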

Externalized benefits cramp the invisible hand every bit as much as externalized costs. The same rule applies to things that we don’t expect a business to do, like defending the nation, enforcing the law, educating poor children, building highways, and so on. All of these things would benefit a business that took on the task. Invasion by a hostile power, public disorder, an ignorant workforce, and lack of infrastructure are all bad for business. But they’re equally bad for my business and my competitors’ businesses. It’s certainly in my self-interest for these things to be taken care of, but not for me alone to foot the bill for taking care of them. There’s no profit in that.

What It All Means

The invisible hand metaphor is in fact sometimes valid. But it is invalid more often than it is valid. The invisible hand works without cramping up only in very limited circumstances and for very limited purposes. It works if and only if both costs and benefits are internal to the business or person making the decision. We can trust a business to make decisions in the public good wherever that holds true.

But where it doesn’t — and it doesn’t an awful lot of the time — something else, usually the government, must step in and either require the business to behave itself, or take on a task that business simply has no reason to do.


Why Supply-Side Economics Is a Bust

Of course, every liberal rejects supply-side economics, but most of us do that because we feel it’s unfair, hard-hearted, and generally nasty. Which it is. But if it worked — which it doesn’t — the fact that it’s unfair, hard-hearted, and generally nasty wouldn’t be reason enough to reject it.

“This is unfair” is a politically losing argument, if it gets framed as a question of fairness versus economic utility. If supply-side economics produced a more robust, faster-growing, richer economy, but one in which the benefits went disproportionately to the rich, most people would support it. Hell, I would support it. Why not? We can always implement welfare measures to help the poor, taking advantage of all the prosperity that throwing money at rich people is supposed to produce. If it really comes down to a choice between great riches for a few and poverty for everyone, who wouldn’t choose to let the rich get richer?

Well, maybe someone who thinks with his heart instead of his head. And that’s the only reason I can think of why so many liberals let the argument be framed in exactly that way. Because if they were thinking with their heads (and if they understood economics, which unfortunately most people of all political persuasions don’t), they would see that the biggest and strongest criticism applicable to supply-side economics isn’t that it’s unfair, but that it doesn’t work.

It doesn’t work. It doesn’t produce prosperity; it dampens it. It produces sluggish economic performance and an unstable economy likely to break down under financial stresses that a healthier economy would shrug off.

The supply-side promise is that most people will get a smaller piece of a bigger pie. But the reality is that under supply-side policies, most people get a smaller slice of a smaller pie. It’s not a question of prosperity versus fairness. It’s one of prosperity and fairness — or neither.

What Is Supply-Side Economics?

The term “supply-side” is meant to draw a distinction with Keynesian economics, which emphasizes the problem of consumer demand. A Keynesian approach is to keep wages high and income broadly distributed, so as to maintain strong demand for goods and services, which strengthens sales economy-wide and prompts increased investment in enterprises that create more jobs. More jobs means more demand which means more investment and more jobs — and so on.
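
That “and so on” is the textbook Keynesian multiplier: each round of new spending becomes someone’s income, part of which is spent again. The sketch below traces the loop with an assumed marginal propensity to consume of 0.8, a number chosen only for illustration.

```python
# The "more demand -> more jobs -> more demand" loop from the paragraph above,
# traced with an assumed marginal propensity to consume (MPC) of 0.8.

mpc = 0.8                  # assumed share of each new dollar of income that gets spent
initial_spending = 100.0   # an initial injection of demand, in arbitrary units

total_demand = 0.0
round_spending = initial_spending
for _ in range(50):            # fifty rounds is more than enough to converge
    total_demand += round_spending
    round_spending *= mpc      # only the spent fraction feeds the next round

print(f"Demand summed over the loop:        {total_demand:.1f}")   # ~500.0
print(f"Closed-form multiplier 1/(1 - MPC): {1 / (1 - mpc):.1f}")  # 5.0

# The loop is a geometric series: 100 + 80 + 64 + ... = 100 / (1 - 0.8) = 500.
# Broadly distributed income keeps the MPC high, which keeps the multiplier high;
# that is the Keynesian argument for keeping wages up.
```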

The downside of the Keynesian approach is that it argues that wealth should not be allowed to concentrate too much. We can afford to have some people be richer than others, but not by so much that demand becomes depressed. It argues for such policies as a progressive tax system, high spending on public works and services, and measures to encourage unions, restrict immigration, and discourage outsourcing. This creates an incentive in some quarters to find a competing theory that allows people to become rich without restriction.

Supply-side economics is that theory. It argues that the limiting factor on investment isn’t consumer demand but capital formation and rate of return. If the rich are allowed to keep more of what their investments earn for them, they’ll have more money to invest and a bigger incentive to invest it. Hence the rationale for doing exactly the opposite of what Keynesian economics calls for in almost every situation. Instead of keeping wages high, keep them low to maximize corporate profits. Instead of a graduated tax system, have a flat one, or even one that taxes the rich hardly at all, while resting the bulk of revenue generation for the government on the middle class. This, we were told, would result in more investment and, over time, a better standard of living for everyone. There’s a surface plausibility to all this, which is why so many people bought into it whose immediate interests weren’t being served.

By now we should be aware that the promise hasn’t been met. We’ve seen real wages drop for most people, and the economy has been both slower-growing and far less stable than it was before supply-side policies were implemented in the 1980s. So we can see that it’s not working, but without understanding why it’s not working — why, in fact, it was predicted not to work long ago — there’s a tendency to screen out the data and go on believing what we think should be true, rather than what we can see is true.

It’s Not Whether They Can, But Whether They Will

The reason supply-side economics doesn’t work is that the question is not, and has never been, whether rich people can invest in wealth-creating, job-creating ventures. It’s whether or not they will.

What I mean by a wealth-creating venture is one that produces goods or services (or both) for sale on the market. An investor with a sum of money to invest may do so by building a company, or by buying stock in a company, or otherwise funding the expansion of business to create wealth. Alternatively, he can invest in financial instruments that make money by lending money to others, or by gambling (essentially) on doing so.

Wealth creation has the potential to provide a higher return in the long run, but financial instruments are likely to pay off faster. More importantly, wealth creation pays off only if and when the goods and services produced are sold to customers. If that doesn’t happen, the investment won’t pay off well, and may end up being lost altogether.

Faced with slack consumer demand, investors are less likely to invest in wealth creation and more likely to invest in financial instruments and gambling, which produce few to no jobs and have little or no “trickle down” effect.

In a situation like that, putting more money in the hands of investors doesn’t help; it just gives them more money to gamble with. Allowing investors to keep a higher percentage of their returns doesn’t help, either, when the desired type of investment isn’t likely to produce any returns at all. Worse, it encourages investments with a quick payoff, while higher taxes encourage investments with a longer payoff term, spreading the profits out over multiple years and thus taxing them at a lower rate.

With higher demand for the goods and services that investment in wealth creation produces, more such investment will occur. This, not increasing available capital, is how to boost economic growth.

Where Does Demand Come From?

At first glance, it might not seem important how widely money is spread around. As long as someone has it, someone will spend it, right?

Not necessarily. While the very rich do spend more on consumption than poorer people do, the difference is nowhere near proportional to the difference in income. There are only so many fancy suits of clothes, second and third and fourth homes, luxury cars, and so on that any one consumer needs or even wants. Desire to consume is not infinite, and it is quite possible for one’s means to exceed one’s desires.

The more money a person makes, the smaller the portion of it that is spent on consumption and the larger the portion that is saved and invested. A million dollars will be used to buy a lot more goods and services if it is shared among twenty people who have $50,000 each than if it is held by a single millionaire.
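
The million-dollar comparison is easy to put in numbers. The spending rates below are assumptions chosen only to show the shape of the claim (the richer the household, the smaller the share of income spent), not empirical estimates.

```python
# The million-dollar comparison from the paragraph above, with assumed spending rates.

million = 1_000_000

# Case 1: one household holds the whole million and, we assume, spends 30% of it.
spending_if_concentrated = million * 0.30                    # $300,000

# Case 2: twenty households hold $50,000 each and, we assume, each spends 90%.
households = 20
income_each = million / households                           # $50,000
spending_if_dispersed = households * income_each * 0.90      # $900,000

print(f"Consumer spending, one millionaire:   {spending_if_concentrated:,.0f}")
print(f"Consumer spending, twenty households: {spending_if_dispersed:,.0f}")

# Same million dollars, three times the consumer demand when it is spread around.
# The exact rates are invented; only their ordering matters to the argument.
```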

Consumer demand is a combination of desire to buy and ability to buy. Too much concentration of wealth results in a few people whose ability exceeds their desire, and a lot of people whose desires exceed their ability. By spreading the wealth around from those who have too much to those who have too little, demand will be increased, and this will boost investment in wealth creation — just as Keynesian theory predicts.

It’s Not a Trade-Off

What this means is that supply-side economics is not just unfair but also economically unsound: it depresses economic growth and produces economic instability. It’s an idea that serves only one purpose: the desire of plutocrats to rip off the rest of us.

It’s a bill of goods that we should never have bought. If we reverse course and undo the entire line of thought that began with the Reagan years in the United States, we will have an economy that performs better and is fairer for most people.

So don’t believe anyone who suggests you have to choose between the two. You can have both.

Or you can have neither.


The American South (Part III)

Like the election of 1860, that of 2008 provoked an over-the-top reaction from the Confederacy. So far, it hasn’t been nearly as bloody, and let’s hope that endures. But it’s presented its own set of problems.

The obvious reason why the 2008 election provoked that reaction is racial. That year, the United States elected an African-American president. He is characterized by his detractors as many things that he isn’t, including some that I wish he were (a socialist, for example). He’s actually a moderate Democrat much in the Bill Clinton mold, but to hear his foes on the right talk, he’s the second coming of Che Guevara.

Understanding why Barack Obama provokes this reaction is important. It’s not so much that the president is a black man as the fact that a black man could be elected president, and what that means about how the country is changing. We’re past the time when substantial numbers of white people sincerely believed in racial stereotypes, and Obama’s intelligence and ability are obvious. But when I was a boy, a black man of his abilities (or any abilities) could not possibly have been elected president. Today, he could be, and he was. What that means is that the United States is not the same place into which I was born. It has changed — much for the better, in my opinion, but the Confederacy disagrees.

As discussed in the previous two sections, the Confederacy is an authoritarian subculture within the culture of the United States and opposed to its basic ideals. (I’m calling it that because “the South” is misleading for reasons that will shortly become clear. I’m referring to a cultural reality in using that term, not to the historical Confederate States which, of course, no longer exist, and did not include all of the cultural Confederacy when they did. Maryland and Kentucky are, or at least were at one time, both part of the cultural Confederacy although neither state seceded. What’s more, Virginia and Florida, which were part of the historical Confederacy, seem to have left the cultural Confederacy.) It is a holdover, a last relic of the feudal/agrarian pattern that once prevailed in civilized societies everywhere. Founded on the growing of cash crops by forced labor, it is a culture that is antithetical to anything that could be called “freedom,” “democracy,” or “equality,” and those three concepts are central tenets of the defining values of the United States. (Which is not, of course, to suggest that the United States has a perfect record of living up to them; such is clearly not the case. But the Union believes in them. The Confederacy does not.) In its politics (consistently a one-party state, formerly Democratic, today Republican), in its religion (overwhelmingly Evangelical Christian), in its economics (brutally anti-labor and characterized by extreme social stratification), the Confederacy remains at odds with everything America is supposed to stand for.

That’s been the case for literal centuries. But there’s one new fact about the Confederacy that is provoking a surge in activism today, like a desperate attempt to hold back the tide of change, and to destroy the United States as most people think of it before it’s too late to do so.

The Confederacy is dying. And it knows it.

Urbanization, Mobility and the Internet: The Triple Kiss of Death

Three things are destroying the Confederate subculture. These are the increasing urbanization of the South, the migration into it of people who grew up outside of it, and the vast increase in idea exchange provided by the Internet.

In 1860, the urbanization of the South (measured as the percentage of the people who live in urban areas) was under ten percent. Even as late as 1950, it was still under 50%. It has always lagged behind the national urbanization percentage, and still does, but as of the 2010 census, the South was over 75% urban — barely behind the Midwest. A generation of Southerners has grown up in an environment where the values and attitudes of the Confederacy make no sense and cannot be defended. Young white Southerners very naturally don’t buy into those values and attitudes. They have become Americans, not Confederates, adopting the values of the Union (which are themselves evolving, but that’s nothing new). In addition to urbanization, much of the South has finally industrialized, which means the realities confronting people are those of capitalism, not feudalism, and so are the political issues that matter. With industrialization has come prosperity for much (although not all) of the South, and with prosperity has come a set of foreign attitudes.

At the same time, the ethnic mix of the Southern population is changing and becoming more diverse. In 1860, virtually everyone who lived in the South was a white person of British or German ancestry, a slave or free person of African ancestry, or a Native American, and almost all of them were born in the South and grew up in the South. That’s no longer true. In 1980, an estimated 20 percent of the Southern population overall was born elsewhere, and that trend has accelerated. This varies widely by state. More than half of Floridians were born outside of Florida, while less than ten percent of residents of Louisiana and Mississippi were born outside those states. When someone moves to the South from outside the Confederacy, these days it’s usually for economic reasons, and the outsider brings modern values and attitudes along with the luggage. The percentage of Hispanics and people of Asian ancestry living in the South is also on the way up. None of these people buy into the Confederacy, either. (Distorting the political picture is that most immigrants are non-citizens and so not eligible to vote. However, they still interact with young white Southerners and this results in cultural change.)

Finally, the Internet brings foreign ideas into the South even faster than modern mobility is bringing new people. While this also permits the entrenched Confederates to build informational lacunas and echo chambers, any Southerner with a shred of curiosity can find information about all kinds of things that, in earlier times, would have been less accessible. This presents a challenge to the cultural values and religious beliefs that form the corpus of the Confederacy.

These changes are reflected in national elections. In 2008, Obama won the states of Virginia, North Carolina, and Florida. He won Virginia and Florida again in 2012. He won Maryland both years, but Maryland ceased to be part of the Confederacy a long time ago. Virginia and Florida represent the leading edge of the change. It is inconceivable that a state remaining part of the cultural Confederacy could vote for a black president, and so it’s reasonable to assert that both of these states have now left the Confederacy and are, in the meaning used in this series of posts, no longer Southern. The Carolinas and Georgia will follow. Texas will take longer, but that will happen, too, as Texas continues to urbanize and as its large Hispanic population acquires citizenship. Mississippi, Alabama, and Louisiana will be the final stronghold of the Confederacy, most likely. Their assimilation could take as long as another lifetime, but by themselves, they can’t sustain the Confederacy against foreign pressure.

None of this is unknown to Confederates. Like the election of Barack Obama, other signs point to the Confederacy’s decline. And it’s fighting back. As in the 1860s, the inevitability of defeat neither deters the quixotic attempt to win nor protects America from the damage they may do in trying.

The Last Gamble

The Confederacy is dying, but it’s not quite dead yet. The hold that it has over the Republican Party at the national level allows it to exert influence over public policy — not enough to reshape the national government, let alone the national culture, to its own values, but enough to paralyze the government and prevent progress towards a more advanced, humane, and responsible Union.

Marxian analysis is again useful here. The United States, an advanced capitalist country, should be having a debate between capitalism and socialism. The question of whether we should have an industrial economy, with all of the government involvement that always must go along with that, should have been settled long ago. In one sense, it was — we do have such an economy, and ought now to be engaged in trying to humanize it and debating whether that economy and the wealth it produces properly belongs to an elite class of rich capitalists or to the people in general. But because of the Confederacy, we can’t have that debate yet. Instead, we must deal with a faction that remains committed to values and an approach to government appropriate to a nation of farmers, not one of industrialists, wildly out of touch with modern reality, and consequently nihilistic and destructive.

At this juncture, the Confederates — who are the driving force behind the Tea Party movement and, increasingly, the dominant core of the Republican Party — aren’t interested in enacting a governing agenda. Their agenda is to destroy, not to govern. What they want is to wipe the United States out of existence. In fact, this isn’t even secret. References to “starving the beast” and “drowning the government in a bathtub” show what the goal is. The glee with which Republicans in Congress have brought the United States to the brink of default and caused government shutdowns further reveals the Confederate agenda. These people know very well that the United States is a foreign country that is conquering the South culturally, and that this process has been aided by the federal government at times, from the forcible desegregation of the schools to the encouragement of modern-day carpetbagging. Their ancestors tried to fight the United States militarily and lost; today’s generation is trying to demolish the country from within the government instead. The United States is the enemy to them, every bit as much as it was in the 1860s.

And as in the 1860s, the only way to deal with this situation is by acknowledging it. The Confederates regard the United States as their enemy, and so, as an American, I must regard them as my enemy, too. We must all do that. Whatever their legal status, the presence of almost the entire Republican membership of Congress — certainly most, if not all, of those from the South — should be regarded as morally illegitimate. There can be no compromise. They must be defeated. It’s that simple. It’s that cut and dried.

The Confederacy is dying, and that reality will be reflected in the next few elections. In the meantime, we must hold to a minimum the damage that the beast can do in its death throes. It’s too late to prevent this latest phase of the American Civil War.

All we can do is to ensure that, once again, the Union wins it.


The American South (Part II)

The 1860 election was an oddity, similar in key aspects to the 1912 election, but with far grimmer consequences. That year, the Republican Party ran its second presidential candidate, Abraham Lincoln. Lincoln was, for a Republican, moderate on the issue of slavery. He opposed it, but proposed no actions against it except a pledge that all new states created under his watch would be free states.

Despite this, his candidacy provoked fury in the South. He would probably have lost the election with a respectable showing, as John C. Frémont had in 1856, except that the Democratic Party split in two that year, with two nominating conventions presenting two candidates for the White House. This happened when Southern delegates walked out of the Democratic National Convention — twice — over a refusal to adopt a plank that would have forcibly extended slavery into territories where the inhabitants voted against it. Eventually, the pro-slavery Democrats held their own convention and nominated their own candidate.

It’s been suggested that the fissure in the party was deliberately intended to throw the election to Lincoln, in the hope of provoking secession. Certainly the demand that slavery be extended where it wasn’t wanted was a radical proposal and violated the concept of popular sovereignty, of democracy itself, and the ideals on which the United States was ostensibly founded, but then, so did slavery and so does the entire authoritarian culture of the South. Whether this conspiracy theory is correct or not, the outcome is clear enough. Lincoln won a majority of the Electoral College with a plurality but not a majority of the popular vote.

While the 1860 election resembled the 1912 election in this respect, it more closely resembled the 2008 election in its aftermath, but again, the consequences were far more dire. Between Lincoln’s election and his inauguration, seven Southern states seceded from the United States. These states came together and formed the Confederate States of America, adopting a constitution almost identical to that of the United States, but with three significant changes, two of which showed the nature of Southern society. One of these changes was to protect slavery from interference by either the Confederate government or any state government. A second was a change to Article I, Section 8 of the U.S. Constitution, which enumerates the powers of Congress. In the U.S. Constitution, that clause reads in pertinent part:

The Congress shall have Power To lay and collect Taxes, Duties, Imposts and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States . . . To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes;

The Confederate Constitution altered this to read:

The Congress shall have power To lay and collect taxes, duties, imposts, and excises for revenue, necessary to pay the debts, provide for the common defense, and carry on the Government of the Confederate States; but no bounties shall be granted from the Treasury; nor shall any duties or taxes on importations from foreign nations be laid to promote or foster any branch of industry . . . To regulate commerce with foreign nations, and among the several States, and with the Indian tribes; but neither this, nor any other clause contained in the Constitution, shall ever be construed to delegate the power to Congress to appropriate money for any internal improvement intended to facilitate commerce; except for the purpose of furnishing lights, beacons, and buoys, and other aids to navigation upon the coasts, and the improvement of harbors and the removing of obstructions in river navigation;

In this change we see reflected the fact that the South held to the paradigm of agrarian civilization (minus monarchy and hereditary nobility, but I would say only because the United States forbade both and the South had become used to that situation). All government efforts to spur industrialization were forbidden, except those that facilitated the moving of cash crops to market.

(The third significant difference between the two was that the Confederate Constitution limited the president to a single six-year term.)

The Civil War

While the secession of the South is understandable given the economic and political realities, a much greater mystery is presented by the attack on Fort Sumter. Had the South not attacked, Lincoln would have faced popular opposition to using force to restore the Union. Why provoke a war that, given the realities of manpower and industrial capacity, the Confederacy was almost sure to lose? Again one is tempted toward conspiracy hypotheses, but the action may be adequately explained by hot-headed stupidity, which is more likely what happened. Foreign countries have sometimes made this mistake about the American character, misunderstanding the swiftness with which opposition to war can turn to fervent support after the nation is attacked. The South had no excuse, but made the same error — which once again points up how foreign that region of the country is to the rest of the United States.

After the attack on Fort Sumter, Lincoln summoned and federalized the militias of the loyal states and planned an invasion of the South to restore the Union. This action provoked the secession of four more states and began the most gruesome war in U.S. history. The final death toll from the war was more than 600,000 on both sides, meaning that America lost roughly half again as many people in the Civil War as in World War II, from a much smaller population base. The Confederacy did surprisingly well, likely because the military tradition of the Southern quasi-aristocrats meant that the best military leadership of the United States was Southern and joined the rebellion, but in the end, inevitably, the Union won.

During the war, with the Southern Senators and Representatives absent, Congress passed measures promoting industrialization that had been blocked by the South up to then. The building of the trans-continental railroad, the creation of a new national banking system, the Morrill Tariff, and the Homestead Act all emerged during this time. Again we see that the conflict between the South and the rest of the nation was one between an agrarian economy and an industrial capitalist economy, with slavery the fulcrum of the conflict and the moral flash point.

Reconstruction

After the war, the United States added three hugely important amendments to the Constitution. The 13th Amendment abolished slavery. The 14th Amendment guaranteed equal protection under the law regardless of race, defined all persons born or naturalized here as U.S. citizens, and extended the protections of the Bill of Rights to cover actions by state governments. The 15th Amendment guaranteed the right to vote regardless of race or “previous condition of servitude.” These amendments together with the government’s reconstruction policies sought nothing less than the eradication of the South as a separate culture and its assimilation to the rest of the United States.

It was an ambitious goal that could not succeed, or not within a reasonable time frame. In the end, the Southern elite adjusted to their loss of the war and implemented laws and economic structures that preserved the authoritarian, racially stratified culture of the South despite the end of slavery. The former slaves were kept bound to forced labor by economic arrangements amounting to a kind of serfdom. Their right to vote was curtailed by a mix of Byzantine restrictive laws and clandestine terror.

One thing needs to be clearly understood. The Civil War was fought over slavery, but if the North-South conflict had only been about slavery, it would have ended with the passage of the 13th Amendment outlawing the practice. Having lost the war and lost the slaves, the planter interests would have faded away and the South would have become just like the rest of America. That didn’t happen. Slavery was a large part of what created the authoritarian culture of the Confederacy, but that culture exists independently of the institution and encompasses much more.

Slavery as such was gone. The hold of the South on the federal government was also gone. The industrialization of the country outside the South proceeded at a rapid pace. By the end of the 19th century, the United States had become a first-tier economic power. The South, however, languished behind, as the entrenched planter interests maintained their grip on power and preserved, as best they could, the agrarian character of the South. While in the 20th century the United States for the most part entered the classic dispute between capitalist and socialist ideas and between owners and the working class, the South stayed stuck in a pre-capitalist condition and acted as a drag weight on the nation’s evolution.

The South and 20th Century Politics

The Democratic Party remained the party of the South after the Civil War, which cost it dearly in power over the national government. Between the presidential election of 1868, won by Republican Ulysses S. Grant, and that of 1928, won by Republican Herbert Hoover, Democrats won the White House exactly four times. Grover Cleveland, a Northern Democrat (from New York) who was indistinguishable from conservative Republicans apart from the party label, won a razor-thin victory in 1884 against a weak GOP candidate, lost his reelection bid in 1888, and barely won a second term in 1892. Woodrow Wilson was the beneficiary of the 1912 election anomaly mentioned above; that year, it was the Republicans who split, with former president Theodore Roosevelt running on a third-party ticket against both Wilson and the GOP nominee, President W.H. Taft. With Roosevelt and Taft splitting the Republican vote, Wilson was able to win an Electoral College majority on a popular vote plurality. He won reelection in 1916 on an implied promise to keep America out of World War I, a promise he did not keep.

Through all this time, the South used its limited influence over Congress to protect its culture and institutions from federal encroachment and prevent effective enforcement of the 14th and 15th Amendments in the South.

The Great Depression began a process that would change all of that. The Depression was capitalism’s great failure and fostered a move towards socialism. Because the Republicans at that time were committed to capitalism and unable to make the necessary changes, it fell to the Democrats to seize the political opportunity, which happened of course under the leadership of Franklin Roosevelt. Roosevelt put together a new political coalition capable of winning national elections, something Democrats had been denied for decades. That coalition included labor, women, and minorities — as well as the white South. As with many political alliances, this one featured strange bedfellows.

The alliance held together through the Depression and World War II, but began to come apart after the war. President Truman’s executive order desegregating the armed services in 1948 started the ungluing. The passage of the Civil Rights Act in 1964 in a Democratic Congress and its signature by a Democratic president (from the South, no less) finalized it. The South was up for grabs after that. But in order to grab it, the Republican Party had to adopt positions that violated its founding principles and the stance for racial equality that had defined the party from inception.

It did. And that brings us to the position we are in today, with the neo-Confederates having swallowed the Party of Lincoln in one of the most ironic hostile takeovers in history. The Confederacy is using that power in an attempt to demolish the United States government from within.

Next week: The American South (Part III), about the approaching demographic demise of the Confederacy as a separate subculture, and its desperate attempt to take the United States with it to oblivion.

1 Comment

Filed under Politics & Economics

World Building: Politics and Economics

There’s a particular literary sin — or it’s a sin to my own nitpicking mind — that bothers me in the science fiction and fantasy genres. I’m referring to presenting a political or economic reality that, given the technology in common use, cannot possibly exist. This is the reason I can’t watch the TV show “Firefly,” which presented a world culturally, politically, and economically indistinguishable from ours, transplanted into space with technology that would insist on something new. It’s a flaw in Asimov’s Foundation series: the very existence of a monarchical Galactic Empire is absurd. It’s what made me grit my teeth in frustration on reading the Wild Card shared world series, which depicted a high-tech feudal monarchy and a high-tech robber-baron capitalist society, neither of which can exist.

To make the world you’re building or reading real, it’s important to take the prevailing technology (and also the prevailing magic) into consideration when determining the culture’s politics and economics. You can’t slap an ideal democracy onto your Bronze Age empire just because you like it better than a monarch and his satraps. The latter can actually govern a Bronze Age empire, while the former cannot (unless of course you have some magic that replaces the technological underpinnings necessary for widespread democracy).

Government is the making and implementing of collective decisions and the resolving of disputes in a community. How that is done — how it can be done — and what disputes can arise are functions of material circumstances, and more than anything else, material circumstances are functions of technology. Of all areas of technology, the most important for purposes of governance are those related to communication.

Collective decisions are made by (at least tacit) agreement, and agreement requires communication. As long as people are in the same place, communication travels at the speed of sound — they stand or sit there and talk to each other. Get past the range of talking and listening, and communication happens as fast as a message can travel, and as well as it can be repeated and understood. This limits the ability of people to participate in the making of collective decisions when they are distant from the conversation.

Democracy is very old in origin. It’s the most natural form of government, but in a low-tech society it only works in a small community, where people can gather in the same place, hash things out, and vote. Athens had a democracy. Alexander’s empire, though steeped in Athenian values and culture, did not. Why not? Aside from Alexander’s ego, it would have been impossible for all of the residents of the eastern Mediterranean region from Persia to Egypt to Macedonia to get together in a town meeting, hash things out, and vote.

For this reason, although there were a few exceptions such as the Athenian democracy and the Roman Republic, the prevailing government form throughout the ancient world was monarchy: a strong head of a privileged class who made collective decisions for everyone, decisions that most people went along with because they weren’t asked for all that much, they got protection from bandits and neighboring enemy kingdoms, and they didn’t want to get their heads cut off by the king’s men.

Today, it’s quite different. Today, the prevailing government form is representative democracy. Why? Because of the advance of technology, with the most important inventions being the printing press and representation itself. The printing press led to widespread literacy, which made people less inclined to go along with collective decisions in which they weren’t allowed to participate. Representation allowed people to participate in a democratic government by proxy, when they were still unable to do so directly.

We have new communication technology now that is once again changing the nature of governance: the internet, which permits instantaneous, widespread participation in the global debate. Because of this development, governance in a hundred years (assuming civilization survives) will include far more elements of direct, participatory democracy, cutting through and dominating the remaining representative mechanisms. We will see economic changes, too, deriving from advances in computers and robotics that make it possible to produce wealth without human labor.

The end result of all this is that it’s anachronistic to have in a story a modern representative democracy governing a low-tech, illiterate society, or a feudal monarchy governing a high-tech industrialized one, or anything fully recognizable from any era in history governing a future society with more advanced technology still. It’s lazy, thoughtless world-building and should provoke snorts of disbelief and head-shaking.

There are constraints on world-building that come from the prevailing technology and magic. Magic can change the basic picture derived from technology, but it should always be possible to see how it does so. Not just anything goes. It can (and should) be imaginative, but it all has to make sense.
