The Fracturing World of Streaming

An Editorial

I suspect Hulu’s days are numbered. Here’s why.

Hulu was founded as a joint venture between several big media companies as an answer to the rise of Netflix, the independent DVD-rental-by-mail service that quickly transformed itself into a pioneer of the new market for streaming video, offering movies and TV shows for subscribers to watch anywhere, at any time. Hulu, being owned by the likes of NBC Universal, Fox, and the Walt Disney Company, had an advantage in securing the streaming rights to a vast library of television programming, as it was owned by the very companies that produced those shows. Netflix, in contrast, has to continually negotiate the streaming rights to the films and TV shows it offers, apart from the original programming it makes in-house.

Then, Disney bought 21st Century Fox, and as a consequence of that merger, it now owns a 67% share of Hulu. Comcast, NBC Universal’s parent, has agreed to basically let Disney run Hulu outright and has promised to sell its remaining 33% share in five years. So, okay, Disney owns Hulu now. Why does that matter?

Disney is planning to launch its own streaming service on November 12. Disney+ will take advantage of the massive scale the company has reached in recent years, offering not only Disney films, but also movies and TV shows from Pixar, Marvel, Lucasfilm, Fox, and even National Geographic. At the same time, Comcast is going to slowly pull NBC Universal’s content off of Hulu to start its own streaming service, expected to launch next year. Basically, Hulu is going to be made redundant. From Disney’s point of view, if it merges Hulu with Disney+, it is guaranteed a massive subscriber base from all the former Hulu subscribers, and will inherit all of Hulu’s original programming as well. It just makes sense to me that Disney would want to do this.

Disney and NBC aren’t the only big media companies seeking to claim a piece of the hot, new streaming market. WarnerMedia is launching its own streaming service, HBO Max, also arriving next year. CBS already has its own service, CBS All Access, offering the channel’s TV lineup and some original shows, most notably the latest versions of Star Trek. In a way, I am starting to feel sorry for Netflix, the innovator that started this rush, now constantly losing programming to its competitors as the big media companies pull their libraries off Netflix to put them on their own, rival streaming services. No wonder Netflix has made a huge push to increase its in-house library.

In fact, Netflix, Amazon Prime Video, Vudu, and other streaming services have another hurdle to overcome. The FCC has repealed net neutrality in the United States, and so far, attempts to have Congress reinstate it have stalled, in spite of wide, bipartisan support for net neutrality among the American people. I have written about net neutrality on this blog before, but to summarize, it is the principle that internet providers have to treat all data that users want to access equally. Without it, ISPs can legally throttle streaming speeds and charge higher premiums for access to certain internet services.

Well, AT&T owns WarnerMedia, and Comcast owns NBC Universal. Is it any wonder these companies want to get into the streaming business? Think about it – they can force people who want to stream movies and TV shows on Netflix or Amazon or Disney+ to pay higher prices for a “premium internet package”, but allow people who use their own in-house streaming service to use it at no extra charge. So long as they are up-front about it and not deceptive, that’s perfectly legal now.

Speaking of cost…

In general, streaming services are not free. Netflix runs up to $15.99 a month. CBS All Access will run you up to $9.99 a month. Hulu’s ad-free version costs $11.99 a month. CuriosityStream, a niche streaming service offering documentaries about science, nature, and history, costs $19.99 per year. Amazon Prime Video requires an Amazon Prime subscription, which currently costs $119 per year. At least Crunchyroll, a streaming service for anime, is actually free – though if you want to remove ads, you have to pay $7.99 a month. The more streaming services you sign up for, the more expensive it gets. Many Americans, myself included, have gotten rid of cable and now mainly use streaming for entertainment, but it is really easy to end up paying enough in monthly streaming bills to equal a cable bill.
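To put some numbers on that claim, here is a quick back-of-the-envelope tally of the prices quoted above (a rough sketch – real bills vary by plan, promotions, and tax; the per-month conversion of annual plans is my own simplification):

```python
# Back-of-the-envelope math on the subscription prices listed above.
# Annual subscriptions are converted to an effective monthly cost.

monthly_services = {
    "Netflix": 15.99,
    "CBS All Access": 9.99,
    "Hulu (ad-free)": 11.99,
    "Crunchyroll (ad-free)": 7.99,
}

annual_services = {
    "CuriosityStream": 19.99,
    "Amazon Prime": 119.00,
}

total_monthly = sum(monthly_services.values()) + sum(
    cost / 12 for cost in annual_services.values()
)

print(f"Total: ${total_monthly:.2f} per month")  # roughly $57.54 per month
```

Sign up for all six and you are already in the neighborhood of a basic cable bill.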

And it’s not like Disney+, HBO Max, or the new NBC streaming service will be free, either.

When the streaming market was new, everyone had Netflix and Hulu and that was about all anyone needed. Now, streamers have to pick and choose which streaming services they want to subscribe to, and so these companies have to compete with each other on what their service offers viewers to watch. There are literally websites that exist to track which movies and shows are on which services, and these have to be updated regularly as contracts expire and new ones are negotiated.

Hence, the push for more streaming-only original content that is linked to a specific service. Do you want to watch the upcoming Star Trek: Picard? Better sign up for CBS All Access, then, as you can’t watch it anywhere else.

Legally, anyway.

This forces people to pick and choose what programming is worth paying an extra $10-$20 a month for, and many will miss out on films and shows they may have otherwise enjoyed because the price tag is too high. People will end up segregating themselves by preferred streaming service, only seeing shows available on other services when visiting a friend’s or relative’s house. The streaming market is fracturing as we speak, and at least for the foreseeable future, will continue to do so. What impact will this have on what films and TV shows get made, or on how audiences respond to them? Only time will tell. However, it will have an impact, and a big one. Of that, I can be sure.

The Inspirations behind the U.S. Constitution


US constitution and flag by wynpnt at goodfreephotos

When I was in elementary school in Ohio back in the 1990s, schools were first starting to do standardized testing of their students. Teachers had to set aside a couple of days in the school year for all the students in the school to sit down and bubble in multiple-choice answers in a cheap booklet printed on thin, easily-torn paper. The teachers and the school were going to be graded on how well we students performed on this boring, menial test. Therefore, as I sat in the classroom waiting for the Social Studies portion of the test to be passed out, the teacher told us all, “Okay, class, on question 14 when it asks ‘Where did the Founding Fathers get the idea for the Constitution?’, the correct answer is ‘B: The Iroquois Confederacy.’ Yes, I know that isn’t actually the correct answer, and you all know that, too, but that’s the answer the state wants.”

Yes, really, she actually said that.

I remember that specifically because I thought it was incredibly strange. I mean, we had just recently learned in history class that the Founding Fathers drew inspiration from many sources when writing the Constitution. Why was the state government wanting us to say that only one of those inspirations mattered? Looking back on this today, as a full-grown adult, it seems clear to me that someone in either Ohio’s education department or the company that made these standardized tests had a political agenda and was trying to get all the schools to teach to that agenda. I’ve talked on this blog before about how history classes in schools don’t actually teach the whole truth, and this is an example of why.

I’m not sure what happened to fifth-grade history classes in Ohio after I moved to California. Maybe future generations of Ohio fifth-graders grew up learning that the Founding Fathers were just inspired by the Iroquois Confederacy. Unless someone with a different agenda came along and changed the tests. Maybe now they learn that the Founding Fathers were inspired by the ancient Greeks and Romans. In any case, as I said, the actual truth of the matter is that there were many inspirations that they drew from when writing the U.S. Constitution, and today, I thought I would go over some of the main ones. Starting, of course, with:

The Iroquois Confederacy

Iroquois flag image by Himasaram and Zach Harden

Long before Christopher Columbus sailed west into the unknown, the tribes of what is now Upstate New York lived in a constant state of war. Then, three people worked together to convince the five tribes to come together and make peace: Hiawatha, a warrior disillusioned by the constant fighting, Jigonhsasee, a woman known for her hospitality toward guests from any tribe, and a spiritual leader known to history as the Great Peacemaker. This trio managed to get the Seneca, Cayuga, Onondaga, Oneida, and Mohawk to join forces in a federation called the Haudenosaunee, governed by a constitution that they called the Great Law of Peace.

In the federal government that they created, political power was based around clan mothers, a reflection of the Iroquois’ matriarchal society. Clan mothers had the power to appoint whomever they wanted to serve as their tribe’s chiefs, and could dismiss them at any time for any reason. One type of chief was the “Sachem”, who represented the clan at the Haudenosaunee Great Council, made up of 50 Sachems from all five tribes who would make decisions by consensus.

The American colonists would have been familiar with the Iroquois Confederacy, partly due to proximity, and partly because the colonists and Iroquois were partners in the fur trade. In the colonial wars between the English and French colonies, the Iroquois usually sided with the English. Benjamin Franklin, in particular, spent some time among the Iroquois and was inspired by their successful federation. In the mid-18th century, he called for the British colonies in North America to join forces in a federation of their own, in what he called the “Albany Plan of Union”. He was unsuccessful in convincing the colonies to adopt his plan – they already reported to London, and didn’t need an extra layer of government above them to obey and pay taxes to. However, when the colonies rose up against the British, the idea of joining forces was resurrected, as now they had a reason to stick together as they fought for their independence.

A few specific ideas that made it into the Constitution came from the Iroquois, such as having a Congress that represented all of the states, creating a balance of power between the states and the federal government, and barring any person from holding more than one political office in the U.S. government at a time.

Ancient Athens and Rome

Ancient Rome scene illustration by Edgar S Shumway

Of course, since the Founding Fathers were educated men of European descent, they would have been intimately familiar with the historical roots of Western civilization, namely, ancient Greece and Rome. The city-states of Athens and Rome developed political systems that were unique for their time and laid the ideological foundations for democracy and republicanism. Indeed, “democracy” is a word with Greek roots meaning “rule by the people”, and “republic” comes from the Latin for “public matters”.

The idea of democracy came from ancient Athens, a Greek city-state that had overthrown its kings. Initially, after the monarchy was deposed, an oligarchy of the city’s wealthiest families ran things, but infighting between these families led to factionalism that paralyzed the city’s government and wreaked havoc on the political process. To resolve this, a reformer named Solon advocated for the creation of a new political system that got every male Athenian citizen directly involved in political decision-making: democracy. He believed that if every citizen, regardless of class, could vote and have a say in political matters, there would be no more factionalism.

Laughter image from Rawpixel

He thought WHAT?

As a result, the ancient Athenians would pass laws and make important decisions, such as going to war or making peace, by a vote of all the citizens of Athens. A council known as the “Boule” decided what went before the Athenians for a vote, and a committee known as the “Prytaneis” worked to implement the citizens’ decisions. Athenian elections to these offices were a bit different from what we are used to – technically, the vote only determined which candidates would be eligible to be picked in a random lottery. This was officially a mechanism to let the gods have a vote, though it also made it much harder to buy an election through corruption.

Many city-states across the ancient Mediterranean would be inspired by the Athenians and experiment with their own political systems. One of the most successful of these was Rome, a city-state that would grow to conquer the entire region and become one of history’s most famous empires. The Roman Republic, like the democracy in Athens, came about after the overthrow of a king. The Senate, the king’s council of advisers, took power for itself. Initially, they intended to keep power in the exclusive hands of the patricians (the Roman nobility), but after several revolts by the plebeians (commoners), a carefully-constructed political system that balanced power between different political groups was established.

The Romans would divide power between the Senate, the people, and the various elected public officials who would follow a traditional career path called the cursus honorum. At the top of the pyramid of public officials were two consuls, who would share the duty of chief executive. There would always be two of them in order to ensure that neither could become too powerful, and they would only serve a one-year term in order to further limit their power. How much can one do in a year, after all? Meanwhile, the plebeians would elect Tribunes of the Plebs, whose job it would be to oversee the Senate and the officials and ensure they didn’t abuse their powers. Most notably, a tribune could veto any government action that they felt threatened the interests of the plebeians. This was important, as all Roman elections, except the elections for the tribunes, were rigged in favor of the wealthy elite.

The idea of a government with no king, where the people have a say in their own governance and choose their leaders, and where a careful balance of power keeps any official from becoming too powerful, was clearly a major influence on the thoughts of the Founding Fathers as they framed the new U.S. government that they were creating.

The very same British that they were rebelling against

Redcoats image by Jerry Saslav

While the Founding Fathers had declared their independence from Great Britain, they did so in large part because they saw the British king and parliament as having violated their rights as Englishmen. There were many parts of the English political tradition that they absolutely weren’t about to reject.

While the Kingdom of England started out as just another feudal, absolute monarchy, during the Middle Ages the nature of the kingdom’s government and the role of the crown evolved. A rebellion against King John forced him to sign the Magna Carta, the first law that explicitly put limits on the king’s power, most notably by requiring him to get the people’s consent to raise taxes. This “consent” eventually took the form of Parliament, a body chosen by the king’s subjects that would meet to examine and decide on the king’s request for money. Over the centuries, Parliament would use this power to win concessions from the crown, forcing the monarch to accept further restrictions on his power or the granting of further rights to the people in return for approving new taxes. This process was very slow and gradual, and it was not always peaceful. In fact, at times it led to out-and-out civil wars. However, it was successful in securing such important concessions as a Bill of Rights and restrictions on the power of the king’s officials to lock people up.

This legal tradition found its way into both the original Constitution itself and the American Bill of Rights that was added to it shortly thereafter. Just as the British Parliament had two houses, the U.S. Congress would have two houses. Just as British laws need the assent of the monarch, American laws would need to be signed by the President. Just as all Englishmen would have the right to challenge the legality of their arrests in court, so, too, would all Americans. Just as a suspected English criminal would be tried by a jury of his peers, so would an American suspect.

The Age of Enlightenment

Perhaps the most important influence of all, though, was the political movement that was captivating the minds of many educated, middle-class Europeans at the time: the Enlightenment. People would gather in coffeehouses and in their caffeine-induced highs, they would apply the notions of science and reason that had powered the discoveries of the Scientific Revolution toward human society. Not content to just accept that the social order was the way it was because God said so or because it had always been that way, they would question and challenge and debate instead. Famous political philosophers like John Locke, Voltaire, Jean-Jacques Rousseau, and the Baron de Montesquieu would write about their ideas as to how human societies could be improved.

Well, the newly-free United States was the perfect place to experiment with putting these ideas into practice. The Founding Fathers were avid readers of the works of Enlightenment philosophers. The Declaration of Independence’s line about how everyone is entitled to “life, liberty, and the pursuit of happiness” is a paraphrase of John Locke’s words in his Two Treatises of Government. Indeed, the idea that a Declaration of Independence was needed at all came from the “social contract” theory of government – the idea that society agrees to have a government in return for protection from murderers and thieves. This theory, and its name, come most famously from the works of Rousseau.

It was Montesquieu who proposed a government with three explicit branches: executive, legislative, and judicial. The influence of his writings is why the Founding Fathers created a President, Congress, and Supreme Court. Similarly, Voltaire’s arguments that people should have freedom of speech, freedom of the press, and freedom of religion led directly to the First Amendment.

United States Constitution image from Wikimedia Commons

What the Founding Fathers did with the constitution they wrote was synthesize various ideas and inspirations that were floating around in the late 18th century into a single, experimental document that tried to create their ideal government. In turn, what is most fascinating of all is just how much later constitutions around the world were influenced by the American one. Today, most countries around the world have a written constitution that lays out some form of legislative, executive, and judicial branch, as well as a bill of rights that includes freedom of speech, press, and religion and protections for accused criminals.

North Korea image by Conan Mizuta

Now, the degree to which they actually respect those rights… well, it can vary.

This is what was truly, well, revolutionary about the American Revolution. Sure, the ideas it was inspired by had been around, in some cases for a while. Yet it sparked a revolution in the collective mind of the world, showing that, yes, we actually can run a country like this. Successfully. And that, well, it’s pretty inspirational.


Cat Flag: Utah Edition


I just got back from a wonderful week-long visit to the Beehive State. Yes, that is the official state nickname of Utah – it was chosen to represent the hard-working Mormon pioneers who built the state into a prosperous community out of the desert soil. Indeed, according to the beliefs of the Church of Jesus Christ of Latter-day Saints, “Deseret”, the name Brigham Young first proposed for the state, is an ancient name for the honeybee. Congress rejected this name, instead insisting on naming the state after the Ute people who lived there first. Still, one can see bee-related symbols all over the state, including on Utah’s state flag:

I have made multiple trips to Arizona in the past few years, but I had never been to its northern neighbor before. This year, I decided to make a road trip up to Utah to see what I’ve been missing.

Specifically, the part of Utah I visited was southern Utah, the lands just to the north of the Grand Canyon, where the geological complex that forms one of my favorite places on Earth begins. This region is home to numerous national parks that preserve the upper canyons, and I visited three of them: Arches, Canyonlands, and Zion.



While on my tour of southern Utah, I noticed a few interesting things about this part of the country. One thing I noticed was that there were constant references to “Dixie” everywhere you looked. Businesses in the area often have “Dixie” in their name, the local college is named Dixie State University, and much of the land between the national parks and up in the mountains is part of the Dixie National Forest. It turns out that the southern part of the state is known colloquially as “Utah’s Dixie”.

Why? Well, it is located in the south of Utah, some of the early Mormon settlers in this area came from southern states, and the area was home to many cotton farms. I find that both hilarious and endearing.

Another thing I noticed was that the street names in Utah’s cities all follow an interesting, and consistent, pattern. In town after town, the streets had names like “100 South”, “1500 East”, and “900 North”. In seemingly every town I visited, most streets had a numerical designation that was a multiple of 100, followed by a cardinal direction. Now, I’ve been to cities that had a “First Street” or “Fifth Avenue”, but this was something new. After some research, I discovered that this street-name convention originates with Salt Lake City, where the streets are laid out on a grid originating at Temple Square, where the LDS Church built its headquarters and largest temple. The idea was that the streets emanating away from this focal point would act as grid coordinates – when you tell someone that you are at the corner of 700 South and 400 West, he or she will know you are seven blocks south and four blocks west of Temple Square. Apparently other Utah cities and towns copied this same naming pattern, using their main street or a major local landmark as the origin points of their grids.
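The coordinate trick can be sketched in a few lines of code (a toy illustration, not an official addressing specification – the helper name and the one-block-per-100 assumption are mine):

```python
# Toy illustration of Salt Lake City-style grid addresses: each increment
# of 100 in a street name corresponds to one block from the origin point.

def blocks_from_origin(street_name: str) -> tuple[int, str]:
    """Parse a street name like '700 South' into (blocks, direction)."""
    number, direction = street_name.split()
    return int(number) // 100, direction

# The corner of 700 South and 400 West, as in the example above:
for street in ("700 South", "400 West"):
    blocks, direction = blocks_from_origin(street)
    print(f"{street}: {blocks} blocks {direction.lower()} of Temple Square")
```

In other words, a street name doubles as a distance and a direction, which is why locals can navigate from an intersection name alone.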

What was far more surprising to me, however, was just how diverse of a landscape Utah has. You can drive through the stereotypical hot, dry Southwestern desert, fertile valleys lush with greenery, and snow-covered mountain vistas in a single afternoon. I know because I did exactly that.

Three environments 1


Three environments 2

Three environments 3

These three images were all taken on the same day

Sure, I saw many parts of Utah that were stark, rocky deserts with red canyons like the ones I’ve seen in Arizona. On the other hand, I also drove through many miles of farmland and green pastures that wouldn’t look out of place in the Midwest. On top of that, it actually snowed while I was there. As someone who lives in coastal California, I was captivated by the white flakes drifting to the ground, and looked in awe at the mountainsides as I saw them grow even whiter and more brilliant. Utah says that it has the “Greatest snow on Earth”, and while I admit I am no expert on snow by any means, I thought the snow that I saw was absolutely wonderful. For a visit. I imagine scraping my car off and shoveling my driveway would get old real quick.

Still, the three national parks I visited were all definitely within the “rocky red desert” part of the state, which has a wonder and beauty all its own:

I also noticed a pattern in the national parks I visited and the other tourists who were at these parks with me. These parks are clearly geared toward the outdoorsy, adventurous, doesn’t-mind-roughing-it crowd. There were campgrounds and hiking trails aplenty, especially in Canyonlands, and most people I saw had brought at least a backpack, water bottle, and hiking shoes. I saw plenty of bikes, rock-climbing equipment, tents, RVs, and off-road vehicles. This was especially true at Canyonlands, the park with the fewest services and creature comforts. It had a single, tiny visitor center and outhouses for restrooms. The other people I saw at the park didn’t seem to mind all that much; they were usually too busy getting their gear out and getting ready to hike the trails.

My favorite of the national parks I visited, though, would have to be Zion. It was far more, for lack of a better word, civilized than Canyonlands was. Not only does it have plenty of creature comforts like actual restrooms, a shuttle that takes you up the canyon, and a very nice lodge, but it also grades its trails so that you know which ones are good for beginners, intermediate hikers, and experts. I really appreciated that about it.

I also appreciated the amazing beauty of the place. The trail I took went up to the Temple of Sinawava, the part of Zion Canyon where it first opens up around the Virgin River. It has some amazing sights, including the Weeping Cliffs, so-called because snowmelt seeps through the rock itself and runs down into the river.

The presence of the river in Zion really makes a difference in the wildlife found there. The canyon has become an oasis and refuge in the Utah desert. According to the park’s museum, 70% of all plant species in Utah can be found in that canyon! It is also home to 289 species of birds, 28 species of reptiles, and 79 species of mammals. The river itself is home to 7 species of fish.

As you can tell, Zion was my favorite of the national parks that I visited during my trip, and I fully intend to go back there someday and explore even more of it. I still think that northern Arizona is my favorite travel destination, but Utah is now a very close second. This likely won’t be the last time you see me blogging about this wonderful state!

Strange Politics: The Emperor of Japan

Crown Prince Naruhito of Japan, who is set to take the throne as Emperor on Tuesday. Image by Michel Temer.

The current Emperor of Japan, who is 85 years old and has reigned since 1989, is set to abdicate the throne on April 30 in favor of his son, Crown Prince Naruhito. On his accession, the crown prince will become the 126th member of his dynasty to reign over the world’s oldest monarchy. All emperors of Japan, including the soon-to-reign Naruhito, trace their descent to the Shinto goddess of the sun Amaterasu through her descendant Jimmu, who is said in legends to have become the first emperor of Japan in 660 BC. Of course, modern historians and archaeologists tend not to believe such things, but have still found evidence that the Japanese imperial line dates back at least as early as the Kofun period around the 5th century AD – which still means the line of Japanese emperors goes back more than 1500 years!

How is this possible? Well, Japan’s emperors play a unique role in Japanese society that has no equivalent in any other country. Indeed, “Emperor” is a title we Westerners gave them, a way to roughly conceptualize their status and position. The actual Japanese title is Tennō, meaning “heavenly sovereign”. The Japanese language refers to foreign emperors as “kōtei”, in order to distinguish them.

The Tennō reigns from the “takamikura” or Chrysanthemum Throne and during his reign, he has no name; he is just the Tennō. Many Western news media outlets and reference works will call the current monarch “Emperor Akihito”, referring to His Imperial Majesty by the name he used as a prince, but to the Japanese this would be considered quite disrespectful. Having said that, the traditional Japanese calendar divides Japanese history into “eras” that are each given a name, and a tradition has arisen that a new era name is selected upon the succession of a new emperor and that former emperors are referred to by the era name of their reign. Thus, Tuesday will be the first day of the “Reiwa” era, and the current emperor will then be referred to as “former Emperor Heisei”.

The Tennō is not only a reigning monarch, but the head of the Shinto religion as well. He is in charge of the three most sacred objects in Shinto, which are presented to him upon taking the throne: the Sacred Mirror that Shinto worshipers believe lured Amaterasu out of hiding, the Sacred Sword that her brother, the storm god Susanoo, pulled from the corpse of a dragon, and the Sacred Jewel that Amaterasu gifted her mortal descendants when she sent them to Earth. Indeed, as the role of the Tennō is considered sacred, he only very rarely speaks in public, which means that when he does speak, his words carry quite a lot of weight. The Tennō is so revered in Japanese society that the country’s national anthem is a poem singing his praises.

Having said all of that, one would think that the Tennō is an extremely powerful figure in Japan. At least politically, however, nothing could be further from the truth. Indeed, from a constitutional standpoint, he is the least powerful monarch in the world. See, while most constitutional monarchs, like Queen Elizabeth II of the United Kingdom, legally retain some important political powers such as the ability to veto laws, appoint the Prime Minister and other important officials, declare war, command the armed forces, and ratify peace treaties, Japan’s constitution explicitly rips those powers away from the Emperor. The 1947 Constitution of Japan describes the Emperor as “The symbol of the state and the unity of the people”, and specifically instructs him to only exercise his functions and duties in accordance with the instructions of Japan’s democratically-elected politicians. For example, while the Queen may be theoretically free to choose whatever Prime Minister she wants and is only bound by long-standing tradition and custom to name Parliament’s preferred candidate, the Emperor is bound by the text of the constitution to choose the Prime Minister that the Diet picks for him. The Emperor’s political role has been described as a “rubber stamp”, but I think a more apt description might be something akin to a human flag. Just as a national flag is an object with immense symbolic value for a country, the Emperor is a person with immense symbolic value for Japan.

Why is this the case, though? Well…

Let’s just say a certain date that will live in infamy was involved.

The actual, on-the-ground political power of the Tennō has waxed and waned many times over the centuries due to a variety of historical factors. In the 7th century AD, the Emperor Kōtoku implemented a number of political reforms known as the Taika Reform, remodelling Japan’s government along Chinese lines. At this point, we can call Kōtoku a true emperor, as he was assuming powers similar to the Chinese emperor’s. However, by the Heian period (AD 794-1185), the Fujiwara clan were actually running the show in the Emperor’s name. This was in part due to the fact that the Fujiwara frequently intermarried with the imperial family, and many emperors at this time had Fujiwara mothers who acted as regents for their sons. Toward the end of the Heian period, though, the Fujiwara’s power declined and civil war broke out between rival clans for power. This anarchic phase ended with the victory of Minamoto no Yoritomo, who seized power and became Japan’s first shogun. For centuries thereafter, the shoguns ruled Japan as military dictators of a feudal society. This is the age people think of when they think of historical Japan, with its castles, samurai, and ninja.

In theory, the shogun was appointed by the Tennō and ruled in his name, and the Tennō could dismiss a shogun that displeased him. In practice, however, this was very much not the case, as the Emperor Go-Daigo learned the hard way in the 14th century when he tried to do exactly that and ended up causing another civil war. Power remained firmly in the hands of the shoguns until 1853.

Why 1853? Well, that was the year that an expedition by the U.S. Navy led by Commodore Matthew Perry arrived in Japan on a mission to convince the isolationist Japanese to open up their ports to trade with the United States. The massive steam-powered gunboats armed to the teeth with powerful cannons shocked and frightened the Japanese, who now saw how far behind the west they had become technologically. This precipitated a political conflict that led to another civil war between factions supporting the shogun and emperor, with the imperial faction (with some British backing) winning the day in the end.

This led to the Meiji Restoration of 1868, with the Emperor Meiji deposing the last shogun and re-establishing imperial rule for the first time in nearly a millennium. In 1890, Japan’s first constitution took effect, establishing a constitutional monarchy modeled on those of Europe at the time. While this constitution allowed for some limited democracy through the election of a Diet with legislative power, it also preserved the emperor’s role as an active political player with immense power. The emperor’s power was further magnified through the establishment of State Shinto, a form of the Shinto religion infused with political ideology, most notably the belief that the emperor was more than just a descendant of Amaterasu but a divine being in his own right who should be worshiped as such.

In the 1930s, a series of militaristic, imperialistic prime ministers allied with war-hungry military commanders took power and launched a campaign to conquer China. This led to the United States imposing an oil embargo on Japan, to which Japan responded by bombing Pearl Harbor on December 7, 1941. During World War II, the regime justified its actions through the lens of State Shinto, holding that the world should know the benefits of the emperor’s divine rule and that it was glorious to die for the emperor. When Japan lost the war, the victorious Allies had to decide what to do about this imperial cult. Some called for the abolition of the Japanese emperor’s role entirely, or at least for then-Emperor Shōwa (known in the west by the name he had as a prince, Hirohito) to be deposed. Ultimately, the decision was made to do neither, but instead to have the emperor publicly renounce his divinity and to make Japan adopt a new constitution that stripped away all his political power.

In a way, then, the role of the Tennō has gone back to the way it was during the shogunate, only instead of the Tennō being a symbolic puppet of a military dictator, he is the symbolic puppet of a modern democracy. It’s amusing to me how things have come full-circle with a modern twist like that. It just goes to show that everywhere in the world, and throughout all of history, politics is always very strange.

Why Democrats and Republicans?

An Editorial

It’s April of 2019, so naturally, everyone is gearing up for the 2020 U.S. presidential election. Already. I’ve discussed before why our presidential election cycle here in the U.S. takes forever, but today, I wanted to look a little deeper at an often-overlooked aspect of American politics, one we don’t tend to appreciate until something happens that reminds us of it.

See, former Starbucks CEO Howard Schultz is currently looking at the possibility of running for president, but not as a Democrat or as a Republican. He is looking at pursuing an independent run for office in 2020. This got me thinking about the fact that America has been politically dominated by the same two political parties since the mid-19th century. The Democratic Party is the oldest continuously-existing political party in the world, and the Republicans are also among the world’s oldest. Why those two parties? What has kept them in power for so long?

Well, I think there are a couple of factors at play. The first one being:

How Americans Vote

Most elections in the United States are decided by the oldest, simplest, and easiest-to-understand election system in the world, known to political scientists who study these things as “First-Past-the-Post”, or FPTP. Under this rule, whoever gets the most votes wins. Simple, right? That sounds fair.

Or it does, until you consider this scenario:

  • You have three candidates running: Jill, Jane, and John. Jill is liked by faction “A”, Jane is liked by some people from faction “B”, and John is liked by other people from faction “B”.
  • When the votes come in, Jill wins 40% of the vote, Jane wins 30%, and John wins 30%. Under FPTP, Jill wins.
  • Notice, though, that 60% of the votes cast were for candidates supported by faction “B”. This means that even though the majority of people support “B”, they are now going to be ruled by “A”. In essence, the minority won.
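A quick way to see the arithmetic is to tally the scenario above in a few lines of Python (the candidate names and vote counts are just the hypothetical example from the bullets, not real data):

```python
from collections import Counter

def fptp_winner(ballots):
    """Return the FPTP winner and their vote share.

    FPTP only requires a plurality, so the winner's share
    can be well under 50%.
    """
    tally = Counter(ballots)
    winner, votes = tally.most_common(1)[0]
    return winner, votes / len(ballots)

# 40 ballots for Jill (faction A), 30 each for Jane and John (faction B)
ballots = ["Jill"] * 40 + ["Jane"] * 30 + ["John"] * 30
winner, share = fptp_winner(ballots)
print(winner, share)  # Jill wins with just a 40% share
```

Faction B's 60% is split between Jane and John, so under a plurality rule it never gets counted together, which is the whole spoiler problem in miniature.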

This is called the spoiler effect, and it is a major factor in the logic of voters when they go to the polls in countries, like the U.S., that use FPTP. Voters don’t want to “waste their vote” on the candidate they actually support if he or she has no chance of winning, so they will instead vote for whichever viable candidate lines up most closely with their political beliefs.

Now, I personally think that the spoiler effect is a bit overstated. Clear-cut scenarios like the one I presented are rare. People are complicated, politics is complicated, and voters’ political agendas are very personalized and not likely to overlap neatly. A more realistic scenario is that of Ross Perot, who ran for president in 1992 and 1996, both times as neither a Democrat nor a Republican. His message was popular with a wide swath of Americans, and he pulled in liberal, moderate, and conservative voters. I have heard people argue that he “swung the election” to Bill Clinton in each of those races, but that is a really hard claim to prove. If he hadn’t run, who knows how many people who ended up voting for him would have instead voted Democrat or Republican? Perhaps Clinton would have won regardless, perhaps not.

Still, the spoiler effect does matter, as it matters in the minds of voters as they decide for whom they should cast their ballots. In this way, FPTP creates an environment that favors a two-party system: one big party on the political left, and one big party on the political right. Third-party and independent candidates in an FPTP system like Perot (and perhaps Schultz) have a much more daunting challenge, as they have to break voters out of the mindset of worrying about the spoiler effect.

So, that’s part of the answer. However, FPTP does NOT guarantee that the same two political parties will remain the “big two” indefinitely. In the FPTP-using United Kingdom, the Labour Party overtook the Liberals as the main party of the left in the 1920s. More recently, Canada, which also uses FPTP, saw their main party of the right, the Progressive Conservatives, completely collapse in the 1990s, eventually replaced by the Conservative Party of Canada. Even here in the United States, there were a number of political parties that rose and fell before we Americans settled on the Democrats and Republicans. So, how did those two manage to solidify and entrench their power so completely?

The Civil War

In 1860, Abraham Lincoln, the first-ever president from the new, antislavery Republican Party, was elected. Almost immediately, the country broke apart into a war between the states, north vs. south. When the war ended in a Union victory, the Republicans claimed credit for abolishing slavery for good and reunifying the nation. That’s why their nickname is “the Grand Old Party”. Many Union veterans, freed slaves, and former abolitionists remained faithful Republican voters for the rest of their lives, as did many early feminists, since Republicans led the way in winning women the right to vote.

Meanwhile, Reconstruction in the south was, to say the least, controversial. The more radical faction in the Republican party wanted to protect the rights of African-Americans while punishing white southerners for daring to rebel. Democrats, on the other hand, argued for reconciliation with white southerners and turning a blind eye to discrimination and violent attacks against the African-American community. As for why America didn’t try reconciling with white southerners while also protecting African-Americans, well, the only man who advocated for such a plan had been shot in Ford’s Theater by John Wilkes Booth.

White southern voters remembered the Republicans as the party of the Union, the party of the war, the party of Reconstruction. So, they became the most reliable Democratic voters in America for generations. The southern states were known as the “Solid South”, as it was said the Democrats could nominate a dog or a lamppost and the south would vote for it. In many parts of the south, the local Republican Party organization simply ceased to exist. For generations, the divide set by the Civil War remained the main divide in American politics, as both parties coasted on the loyalties of the voters the war had most directly affected.

That’s why, in 2016, 89% of African-American voters supported Democrat Hillary Clinton over Republican Donald Trump… wait, what?

Hang on. Let me look at the 2016 election map by state:

Looks like someone has some explaining to do.

The New Deal and the Southern Strategy

One of the side-effects of the Democratic-Republican divide being rooted in the Civil War was that both parties had liberal, moderate, and conservative wings. The divide between the parties was NOT based on ideology, at least at first.

Then, the generation that had lived through the Civil War started to grow old and pass away, and their children started to grow old and pass away as well. Over time, as the memory of the Civil War faded, people just weren’t as married to the political parties of their parents and grandparents anymore.

The first sign of a major shift came during the presidency of Franklin D. Roosevelt. Many of his New Deal policies brought economic aid and benefits to impoverished African-American communities. This brought hope to a large section of the population that had been denied it for generations. FDR also appointed African-American leaders such as Mary McLeod Bethune to important political positions. Gradually, the younger generation of African-American voters was pulled toward the Democrats.

This caused some tension within the Democratic Party, which was still the party of Jim Crow segregation in the South. Yet President Lyndon B. Johnson, himself a white southern Democrat, pulled together a cross-party coalition that passed the Civil Rights Act of 1964 and Voting Rights Act of 1965. Many conservative white southern Democrats were so upset by what they saw as a betrayal by their own party that they formed a new political party, the American Independent Party, which ran the outspokenly pro-segregation Alabama Governor George Wallace as its candidate in 1968.

Many see this as the tipping point, the final break between the former Solid South and the Democrats. But that’s not entirely true. Many white southern Democrats from that generation stayed lifelong Democrats. Sen. Robert Byrd (D-WV) had the distinction of having fought against both the Civil Rights Act of 1964 and the Iraq War. In fact, in 1972, George Wallace ran for president again, but this time as a Democrat, with one of his opponents in the Democratic primary being none other than Shirley Chisholm, the first African-American woman to serve in Congress.

Still, as the Democratic Party grew more and more openly liberal, the Republicans grew more and more openly conservative. The Republican leadership recognized that this was creating an opening to attract southern conservative ex-Democrat voters, so they pursued “the Southern Strategy” to get these voters to switch to the Republican Party.

The Southern Strategy involved keeping northern Republicans loyal to the GOP even as a more mobile society let them relocate to states with less snow in the winter, while also attracting socially conservative voters in the south to the Republican cause. It was begun in earnest by Richard Nixon, through specific policy initiatives and careful message management (sometimes referred to by his opponents as “dog-whistling”), and was continued through Ronald Reagan’s outspoken support for evangelical Christian voters in the Bible Belt. Today, the transformation is complete: the Republicans are the party of America’s conservatives, and the Democrats are the party of America’s liberals.

There are a few lessons to take away from this. First, the reason the Democrats and the Republicans have remained in power in the United States for so long is largely their flexibility, adaptability, and tactical thinking. The Democrats and Republicans of today are nothing like their mid-19th-century ancestors, and they have survived by being willing to win no matter the cost.

That leads to the second lesson, though. Ultimately, neither political party actually cares about you. They only care about winning your vote if it will win them the election, and as we have seen, they are more than happy to just dump one demographic of voters to attract another if they think it will win them power. So, my advice is not to vote for any particular political party because you are “supposed” to because of your race, religion, gender, class, age, orientation, career choice, region of the country, or anything else. Vote based on which political party or candidate will actually bring YOU the most benefits or bring about the changes YOU want to see in America. Because at the end of the day, your own conscience is all that should matter to you when you fill out that ballot.