Why is housing so expensive in California?

Los Angeles has more than 15,000 homeless residents living in their vehicles. In San Francisco, the median home price is $1.4 million, 59% of Silicon Valley tech workers can’t afford a place to live, and 88% of all households can’t afford to purchase a home. In San Diego, some have turned parking lots into de facto apartment complexes whose “residents” live out of their cars. Silicon Valley firm MainStreet has gone so far as to pay some employees $10,000 to relocate out of state. Facebook is donating millions to build an apartment complex for teachers. Apple has pledged to spend $2.5 billion to do what it can to alleviate the housing crisis.

The cost of housing has steadily increased since 2012, according to Zillow, which at the time of this writing lists the median California home price as $550,800. Four of the top five most expensive metro areas for housing are in California. Indeed, the cost of housing is one of the main reasons cited by people who have either left the state altogether or are considering doing so, and about half of California voters have considered moving out of state. That’s hardly surprising when many Californians spend more than half their income on housing.

Except, that doesn’t actually make sense. In 2018, 700,000 people left California, the eighth year in a row that the number of Californians moving away has increased and the 15th in which the state has lost more residents to other states than it has gained. A basic understanding of supply and demand would make one think California’s housing prices should be going down, not up.

For comparison, neighboring Arizona’s median housing price is half of California’s. Now, you may say, that’s Arizona, a land of arid deserts with 100+ degree weather for much of the year. Yet Oregon, another state on the beautiful and temperate West Coast, is still far cheaper than California for housing. On the East Coast, meanwhile, states like South Carolina are cheaper even than Arizona.

So, why is California’s housing so expensive?

Because California hasn’t built enough houses.

I mean, on the surface, that seems like an obvious answer. If demand is decreasing, but prices are still rising, it logically follows that the demand must still be far higher than the supply. California’s high housing costs are a simple result of there being too many people trying to buy or rent places to live in California. Mystery solved!

Except, that immediately leads to a follow-up question: why hasn’t California built enough homes?

There are a few reasons. For starters, building in California is difficult. Look at a map of the state, and you will see a rugged terrain of mountains and coastlines that aren’t exactly amenable to tons of construction.

California map from the USGS

Not only that, but California is known for its earthquakes, presenting architects and engineers with the expensive challenge of making structures that will be safe when the ground shakes. Then there are the wildfires, some of which have recently destroyed once-thriving communities. News headlines like these raise safety concerns about letting new developments sprawl into areas that could be vulnerable to fires, such as forested hills and mountains.

Another factor is the labor pool. The housing crisis is something of a self-reinforcing vicious cycle. The types of skilled and unskilled labor that are required for a healthy construction industry don’t generally pay enough to afford the cost of housing in California, and that means a shortage of construction workers and higher labor costs for construction projects. This, in turn, makes it harder to build affordable housing.

Even with these factors taken into account, there is still another key reason housing in this state is so expensive, one that popped up again and again in the sources I found while researching this topic: the state and local governments.

Let’s say you want to build a brand-new affordable housing complex in some California community. First, you’d better do your research: California has extensive land-use regulations that have grown in number year after year for decades, making it far harder to even get a land-use permit. Next, you have to have your proposal reviewed by a myriad of government agencies and bureaucracies, each of which can impose additional restrictions you have to abide by in order to get permission to build (making your project more expensive). This process can take months, by the way.

Oh, and let’s not forget that you are building in CALIFORNIA, a state that has a long history of taking environmentalism seriously. Any new building project has to have its environmental impact assessed, a process that can literally take years.

Did I mention that you have to pay fees at every stage of this process? Many of these fees are not exactly easy to learn about in advance, either, with hidden fees pushing the total you have to pay to as much as 18% of the cost of your project in some jurisdictions.

Then, you have to get final approval from the city council or county board of supervisors. These local politicians are under immense pressure from NIMBY voters (short for “Not In My BackYard”), most of whom already own property in the community and like high housing prices because they make their own homes more valuable. They will probably give long-winded speeches about how your development could “change the character” of their community. Pressure from these voters has already pushed quite a few communities to adopt laws that explicitly restrict housing growth.

Now, a few observant Cat Flaggers may be wondering, “Won’t the high housing costs cause these people to have to pay skyrocketing property taxes?” Well, no, thanks to a 1978 ballot initiative passed by California voters known as Prop 13. This law caps the property taxes that can be assessed on housing. Basically, California bases property taxes on a home’s value at the time of purchase and only allows a slight annual increase for inflation. This means that Californians who have owned their homes for decades pay far less in taxes than the current market value of their properties would suggest.
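As a rough illustration – and this is only a sketch, assuming Prop 13’s 1% base tax rate and 2% annual cap on assessment increases, with a hypothetical purchase price and ownership period – here is what that gap can look like in practice:

```python
# A rough sketch of how a Prop 13 assessment works. Assumes the 1% base tax
# rate and the 2% annual cap on assessment increases; the purchase prices
# and years owned below are hypothetical examples.
def prop13_tax(purchase_price, years_owned, cap=0.02, rate=0.01):
    """Annual property tax after `years_owned` years, with the assessed
    value growing at most `cap` per year from the original purchase price."""
    assessed_value = purchase_price * (1 + cap) ** years_owned
    return rate * assessed_value

# A longtime owner who bought a home for $200,000 thirty years ago...
longtime_owner = prop13_tax(200_000, years_owned=30)   # roughly $3,600/year

# ...versus a new buyer paying today's median price of about $550,800.
new_buyer = prop13_tax(550_800, years_owned=0)         # about $5,500/year

print(f"Longtime owner: ${longtime_owner:,.0f} per year")
print(f"New buyer:      ${new_buyer:,.0f} per year")
```

Even if both homes would sell for roughly the same amount on today’s market, the longtime owner’s tax bill stays anchored to a decades-old purchase price.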

This law is incredibly popular among Californians, especially homeowners (for obvious reasons). Yet one big unintended side effect is that it eliminates much of the incentive for local governments to approve residential developments: a city or county earns far more revenue from approving a retail store, which generates sales tax, than from new housing whose property taxes are capped.

Is it any wonder, then, that California’s housing is so expensive? It is remarkably difficult and expensive to even get a housing project past all the red tape that the state and local governments put up and all the objections of those who already have their own houses. Add to this the higher cost of labor and the practical challenges of building in California at all, and it seems a miracle that any housing is built here. A 2016 report by McKinsey & Company estimates that California needs 3.5 million new homes by 2025 to alleviate this crisis, yet the system is stacked against housing construction at every step. Meanwhile, California is now home to one out of four homeless Americans.

Thus, we can see that burdensome regulations, a voter-mandated tax policy, and the high cost of construction in California have left people stuck either scrambling and competing against each other to be able to find any housing at all, or giving up altogether and leaving for other states. Until all of these problems are addressed, this crisis will only continue.

The Most Important Army from World War II You’ve Never Heard of

Chadian soldier in the First Free French Army from World War II. Image colorized by Cassowary Colorations.

Recently, I was reading over some of my old blog posts, and a very unfortunate and startling realization hit me. I have never actually talked about African history on Cat Flag before! Yikes! That is changing today, as I return to a topic that I covered last year and promised more of: What were the less-famous countries of the world doing during World War II? Today, I will be looking at how the war affected Africa, specifically those countries that were under the colonial rule of France at the time.

First some background…

In the 19th century, the major European powers engaged in the “Scramble for Africa”, claiming and conquering as much of the continent as they possibly could and incorporating it into their empires. Portugal, Germany, and even Belgium colonized parts of Africa, but most of it got swallowed up by the British and French.

France’s attitude toward its own imperialism was colored by the ideals of the French Revolution and the national motto of “Liberté, égalité, fraternité.” Conquering and subjugating peoples and lands across the globe to gain resources for one’s industrialization kind of clashes with the ideals of freedom, democracy, human rights, and equality for all. The way the French resolved this contradiction was to assimilate the natives of the lands they colonized. To the French, their ideal was for all the people living under the tricolor to be one, big, happy empire where all were equal French citizens regardless of skin color, and everyone had equal rights, including the right to vote. Or, at least, that was the theory. In practice, French citizenship was not given automatically; instead, it was reserved for the small handful of natives who had “evolved” (yes, that was the term they used) into proper Frenchmen by completely abandoning their native culture and ways and fully assimilating into French culture.

This was the world into which Félix Éboué was born. A black man from French Guiana, he rose through the ranks to become a key colonial administrator in French Equatorial Africa and was named governor of Chad in 1938.

In 1940, Nazi Germany’s blitzkrieg overran northern France and captured Paris. An emergency government meeting in the town of Vichy granted Marshal Philippe Pétain emergency dictatorial powers, as he negotiated a surrender and armistice with Adolf Hitler. In the aftermath, northern France would be under German military occupation, with Pétain in Vichy leading a Nazi collaborationist puppet regime. French general Charles de Gaulle, who had escaped to London, refused to accept this turn of events and called on the French people to resist their occupiers and fight for their liberty.

This turn of events was quite sudden, and the administrators of France’s global colonial empire weren’t entirely sure how to react. They now governed colonies of a country that was under foreign occupation. Should they keep taking orders from France as if nothing had happened? Declare their loyalty to Pétain?

They weren’t the only colonies in this predicament. Germany had occupied the Netherlands and Belgium as well. However, both of those countries still had semi-intact governments in place that just happened to be operating from exile in London. The colonial administrations in both the Dutch East Indies and the Belgian Congo pledged their loyalty to these governments-in-exile and supported the British. France, however, did have a government that was still based in France. At first, basically every colonial administration in France’s overseas empire recognized the Vichy regime as the legitimate government of France. When the British tried to force the issue by attacking the port of Dakar in French-ruled Senegal, the result was a humiliating defeat for the attackers.

The foundation of the Free French Army

However, all of this started to change when Éboué declared that Chad would side with de Gaulle. After all, as oppressive as French colonial rule could be, at least the French opened avenues, however limited, for their subjects to advance themselves. Compared to the racist and genocidal Nazis, a free France was clearly the less-bad option for non-white French subjects.

Félix Éboué and Charles de Gaulle

Before long, French Cameroon joined Chad, and the two invaded Gabon, a colony that had declared for Vichy. Éboué’s forces, organized as part of the Free French Army, won. The Free French later took control of Madagascar as well. These “Free French” forces included many of the French soldiers who had escaped to England with de Gaulle, but they also included many thousands of African soldiers recruited from these colonies.

However, some colonies still professed their loyalty to Pétain, most notably those in northern and western Africa that served the Vichy regime as a breadbasket. Located just across the Mediterranean from France, Spain, and Italy, these colonies supplied Axis-occupied Europe with food and offered numerous ports that were useful as naval bases. This made it imperative for the Axis powers to maintain control of them, and it also made them one of the first major targets of the Allied plans to liberate Europe.

American and British forces invaded North Africa in a campaign dubbed “Operation Torch”. While this invasion was happening in northwestern Africa, British forces operating in Egypt in the northeastern corner of the continent were joined by the Free French Army invading Libya, an Italian colony, forcing the Axis armies to divide their attention and their forces across multiple fronts. By May 1943, the Allies had squeezed the Axis powers’ forces off the African continent entirely.

While the Free French Army was fighting on the battlefields of Africa, some important political maneuvers were happening behind the scenes in the conference rooms of the Allied commanders. In a stunning turn of events, Gen. Dwight D. Eisenhower (yes, that one) convinced the Vichy French forces they were fighting in North Africa to switch sides, merging with the Free French Army. As a result, de Gaulle’s forces expanded immensely, with about 60% of the new Free French soldiers being native North Africans and West Africans. This multiracial, pan-African army participated in the Allied invasion of Italy, with about 130,000 fighting in the Italian countryside, of whom 6,331 were wounded in battle and 1,726 gave their lives.

After the D-Day landings in Normandy on June 6, 1944, one of the units that came ashore at Utah Beach that summer was the French 2nd Armored Division, a Free French unit formed from the African and French veterans of the aforementioned invasion of Libya. For the rest of 1944, the Free French Army played a major role in the Allied liberation of France.

So, why haven’t I heard of this army before?

While the Free French Army was chasing the Axis forces out of Africa and fighting in Italy, there were people back home in France who were actively resisting the Nazi occupation of their homeland with a campaign of sabotage and hit-and-run guerrilla ambushes. Many were former soldiers who refused to accept their nation’s surrender. However, there were also Communists fighting to establish a Marxist state, Jews trying to escape the Holocaust, and ordinary citizens trying to escape forced labor in Germany. Those who were caught by the Germans would be killed, and in some cases their families and villages would suffer retribution as a result. The sacrifices made by these brave fighters and their importance in the war shouldn’t be downplayed. Their fight for liberty is nothing short of heroic.

However, there was a consensus among the Allied leaders that the French people should feel that they had liberated their own country, and the fact that so many of those who had fought to defeat Vichy and its Nazi puppet-masters were Africans from places like Chad, Cameroon, or Morocco didn’t fit that narrative. In fact, as the Allied forces drew closer to Paris, the top generals agreed that the first unit to enter the city on the day of its liberation would be the 2nd Armored Division – but only those among its ranks who were “100% white”. Ironically, de Gaulle was never consulted about this; it was a decision made by the American and British top brass.

In the months that followed, the French Resistance was officially merged with the Free French, forming a provisional government and a new French Army, both of which tended to put figures from the Resistance in their uppermost ranks. Once the war was over, many overseas French colonies began to advocate for their independence, and France initially repressed these movements with brutal force. It was only after de Gaulle took over as President of France in 1958 that the French colonial empire was finally broken up and its colonies were allowed to become independent nations.

Today, when we look at the history of World War II, we generally tend to ignore how it affected Africa and the many battles and struggles that were fought there. When we think of the role France had in World War II, we mostly think of the country’s quick surrender early in the war, with the more generous also mentioning the French Resistance as a side note. Yet in many ways the war could have gone quite differently if it weren’t for the brave men of Africa who recognized that they would be better off fighting for the freedom of the France they knew, racist and flawed as it was, than potentially living under the thumb of the Nazis. I think it is truly a shame that their sacrifices have been forgotten and ignored because they don’t fit neatly into the story we tell about World War II. Personally, I think it may be time for that story to change.

To the Stars through a Difficult Viewing

Ad Astra Per Aspera – Latin for ‘to the stars through difficulties’ – is the official motto of the state of Kansas and the South African Air Force, and is widely used among many other governments, organizations, and universities, as well as being referenced in music, literature, and popular culture. It’s almost surprising that it has taken this long for a science fiction film about space exploration to reference this well-known phrase in its title. I was first made aware of the film Ad Astra while watching baseball on TV, as a trailer for the film played during the commercial break. A major Hollywood production about a space adventure starring Brad Pitt and Tommy Lee Jones? And it isn’t an adaptation of a comic book or part of a decades-old franchise? I just had to see it for myself.

Then I left the theater, and it took me a while to put my finger on why, exactly, I didn’t like this movie.

At first glance, I should love it. Ad Astra is a work of “hard sci-fi”, a sub-genre I really enjoy that focuses on the “science” part of science fiction with super-realistic and grounded predictions of future technology and actual physics. Think movies like The Martian or Gravity. Ad Astra attempts to imagine what a future where humanity has colonized the moon and Mars would actually look like given the state of current technology and global politics. It stars two very good actors giving strong performances. It has an excellent premise: astronaut Roy McBride (Pitt) learns that his long-lost father (Jones) may still be alive at the edge of the solar system, conducting a deep-space experiment that threatens to wipe out all life on Earth. It’s a premise that has built-in emotional tension and a mystery you want to see the hero solve.

I also have to compliment the film on its attitude toward exposition. All too often, sci-fi and fantasy stories have to explain the background for what’s happening through either a ton of awkward narration or the inclusion of an audience-point-of-view “fish out of water” character who has to have everything explained to him or her. Ad Astra eschews that. Instead, it takes a “show, don’t tell” approach to world-building, trusting the audience to be smart enough to put the pieces together. All the characters we follow live in this world and treat it as everyday and normal. The futuristic and fantastical elements of this world are revealed to the audience through the characters’ mundane interactions with them. It’s a challenge to do this type of exposition well, and Ad Astra definitely succeeds.

Yet for all this film has going for it, I can’t help but feel it is constantly undercutting itself. The biggest problem is the direction. The cinematography and style of a hard sci-fi film should feel just as grounded and real as its setting, like a PBS documentary. Instead, director James Gray (The Immigrant, The Lost City of Z) decided to go for a super-stylized, surreal, dreamlike look and feel for his movie. This film is super-artsy, with odd cuts, fancy fades between scenes, bizarre lighting choices, and lots of Dutch angles, to the point where it becomes quite distracting and you get the impression he was trying too hard. Even worse, it was apparently decided at some point along the way that we needed to constantly hear Roy McBride’s inner thoughts as he goes about his mission. The voice-over quickly wears thin.

That’s another thing I couldn’t enjoy about this movie: its hero. Our main character is a real jerk. On the surface, he looks and acts like a stoic, calm-under-pressure, competent astronaut, but as we hear practically everything he’s thinking, we can see that he sees himself as far superior to all the idiot normals around him and doesn’t really like or trust anyone. Admittedly, this is just a personal preference of mine, but I want to like the protagonist of a movie I’m watching. I want to root for the hero, and I can’t do that if the hero is a terrible person.

Lastly, the film has a few plot twists that seem very contrived and convenient. Sure, the film knows how to world-build when it comes to technology and society, but when it comes to the actual interactions between the characters and events that advance the story, the writing is just weak. It feels like they were more interested in forcing the film to move on to the next plot point than letting the characters get there naturally. Not only that, but some decisions that the characters make during this film’s runtime just seem downright stupid.

I don’t hate Ad Astra, and there are parts of it that I genuinely like. Unfortunately, for me its strengths don’t outweigh its flaws. As disappointing as it is for me to do so, I have to give it 4 out of 10.

The tale of King James and the many, many Bible versions

I have mentioned my Christian faith on this blog a few times over the years. Growing up, I was taught all about the Bible, with my first one given to me when I was still in elementary school. My family owned many different copies of the Bible, each a different “version”. There was the King James Version, New King James Version, Revised Standard Version, New International Version, New American Bible, and many others. Each of these translated the Bible’s passages with slightly different wording, and as such I came to appreciate the importance of comparing these editions to get a sense of how different scholars and translators saw fit to interpret each passage.

In fact, there are still new versions of the Bible being produced to this day – the Christian Standard Bible was published just two years ago! Yet, by far, the most popular version of the Bible in the United States is the King James Version, first published in 1611. This is the classic text most people picture when they think of the Bible and its passages, giving us such phrases as “Thou shalt not”, “feet of clay“, and the annual Christmas blessing of “Peace on Earth, Goodwill toward men.” According to a study conducted in 2011, 55% of Americans who read the Bible regularly prefer the King James Version. This far outshines the second-most-popular version, the New International Version (19%), let alone any other edition (all with usage in the single digits). Indeed, there is a religious movement that insists the King James Version is the only correct English translation of the Bible, far superior to all others.

Recently, I was re-reading Wide as the Waters: The Story of the English Bible and the Revolution it Inspired by Benson Bobrick, a book that goes into detail about the origins of the King James Version. This got me thinking about just why there are so many versions of the Bible in English in the first place, and why, in spite of this, the King James version is still so popular.

The list of English-language translations of the Bible is quite long, and there are a few key reasons for this:

  • How the text is translated
  • How English evolves over time
  • What text is being translated

Let’s start with the translation process itself. When translating anything from one language to another, you are going to run into idioms, turns of phrase, and grammatical differences that are unique to the source language. Do you take these and translate them literally, word-for-word, or do you try to convey the intended meaning in the idioms, turns of phrase, and grammar of the receiving language? This is one of the major sources of variability between Bible versions. Some versions prefer literal, word-for-word translations, others prefer a “sense-for-sense” translation, and still others attempt to walk a fine line between the two methods.

Another major reason that there are so many Bible translations is that the English language itself has evolved over time. The King James Version’s use of “thee”, “thou”, “thy”, and a whole host of other archaic words preserves the English language as spoken at the start of the 17th century, but this is not how we speak today. Most modern editions of the Bible try to update the text for a modern audience in plain, everyday language. There are even some versions that are explicitly tailored to American English as opposed to British English, such as the American Standard Version or New American Bible.

The Great Isaiah Scroll, one of the Dead Sea Scrolls, is the earliest known copy of the Book of Isaiah.

Finally, you have to consider what text, exactly, you want to translate. See, the Bible wasn’t originally a single text, but a collection of writings gathered over the centuries into an accepted “canon of scripture” that the faithful agreed to be the Word of God, written by early prophets and apostles through divine inspiration.

It was, ironically enough, a pagan Pharaoh of Egypt descended from Alexander the Great’s general Ptolemy whom we have to thank for starting this process. According to legend, as Ptolemy II Philadelphus built his Library of Alexandria, intending it to be the great repository of all knowledge, he hired 70 Jewish scholars to translate their most sacred writings into Greek. This meant those scholars had to determine which ancient Hebrew writings were to be considered “sacred”. The result of their work was the Septuagint, the first codified Bible. This was the sacred text used by faithful Jews in Jesus’ day. Today, thanks to numerous archaeological discoveries in recent years, most notably the Dead Sea Scrolls, we know that these books of the Bible had been around for centuries and were treated as sacred by the people of Israel long before this time, but the Septuagint brought them all together into a single edition for the first time.

However, it was not to reign supreme for long. The followers of Jesus Christ collected their own set of writings that were added to the canon by early church fathers, collectively known as the “New Testament”; the earlier writings they dubbed the “Old Testament”. Meanwhile, Jewish religious leaders and scholars began editing down the books of the Septuagint, removing some writings and verses that they felt were not truly divine, resulting in the Masoretic Text, the standard text of the Hebrew Bible to this day. The edited-out books and verses were dubbed the “Apocrypha”, and their inclusion in the Christian Bible varies by denomination – Roman Catholics and Orthodox Christians include these books in their canon, while Protestants do not.

The question of what text is being translated, however, doesn’t just apply to which books of the Bible you are including in your canon, but also what physical copy you are using as the basis of your translation. See, we have numerous early collections of Biblical writings, some complete and some incomplete, dating back centuries. In addition to the aforementioned Dead Sea Scrolls, we have the Codex Vaticanus (a 4th-century Greek manuscript of the Bible held in the Vatican Library), the Codex Sinaiticus (another Greek manuscript of similar age discovered in Egypt in the 19th century), the Peshitta (a Syriac-language translation used by some churches in the Middle East believed to date from the 3rd century), the Vulgate (a Latin translation compiled by St. Jerome in the 4th century), the Cairo Codex and Aleppo Codex (two partial Hebrew Masoretic texts dating to the 9th and 10th centuries), the Leningrad Codex (the oldest complete copy of the Hebrew Old Testament, stored in the Russian National Library), and the Textus Receptus (a collection and synthesis of medieval Greek New Testament manuscripts produced by the Dutch humanist scholar Erasmus). Today, it is common for Biblical scholars seeking to translate new editions of the Bible to compare these ancient texts and try to produce the best-fit translation from them.

See, which text you use matters, as demonstrated by that verse I mentioned earlier, “Peace on Earth, Goodwill toward Men”. This is a correct translation of the Textus Receptus reading epi gēs eirēnē en anthrōpois eudokia, but other early Greek manuscripts, including the Codex Sinaiticus, read epi gēs eirēnē en anthrōpois eudokias, the addition of that single letter changing the translation to “Peace on Earth to Men of Goodwill,” or as the New International Version renders it, “on earth peace to those on whom his favor rests.” This is quite the change in connotation!

The first page of the original printing of the King James Version in 1611

So, where does the King James Version fit in all of this?

In the late Middle Ages, the Vulgate was the official version used by the Roman Catholic Church for all its services, and it was rare to encounter any other translation, as the church placed heavy restrictions on translations of the Bible into the vernacular. This didn’t stop William Tyndale, a Protestant scholar in early 16th-century England, from trying his hand at translating the Bible into English. He used both the Vulgate and the Textus Receptus to translate the New Testament. He then turned to the Old Testament and was about halfway done when he was caught by the authorities, convicted of heresy, and strangled to death, his corpse burned at the stake. Ironically, King Henry VIII had broken the Church of England away from the Catholic Church shortly before Tyndale’s death, and Tyndale’s Bible would be used by the new church as the basis for the Great Bible, the official English-language Protestant Bible that all churches in England were required to use by royal decree.

However, the Great Bible was a bit of a mess. It was rushed out to meet the king’s demands, with poor Myles Coverdale, the cleric asked to prepare the work for printing, having to fill in the gaps left by Tyndale’s unfinished Old Testament by translating from German copies acquired from early Lutherans, as well as making edits to various passages meant to satisfy the conflicting political demands of both more conservative and more radical church leaders. According to Bobrick’s book, when King Henry was informed of the many flaws in the translation, he asked, “Are there any heresies maintained thereby?” Upon hearing a “no” as the reply, he said, “Then in God’s name let it go abroad among our people!”

So it was that the Great Bible became the standard Anglican Bible, flaws and all, for a few decades while the struggle for supremacy between Protestants and Catholics in England raged under the reigns of Henry VIII’s children. Queen Elizabeth I of England ended up cementing the Church of England’s place on the Protestant side of the equation, but she was aware of the Great Bible’s deficiencies. In the meantime, Calvinist English nonconformists who had fled into exile produced the Geneva Bible, a far superior translation that was praised by the scholars of its day and widely read by many in England. However, the Geneva Bible also included a host of margin notes that “helped” in “interpreting” various verses, invariably arguing for a Calvinist interpretation, and these notes were offensive to the Anglican church leadership. So, the queen authorized the creation of the Bishops’ Bible in 1568 as the new official Bible of the Church of England. However, this version was deliberately written to be different from its unofficial competitor, sometimes twisting sentence structure into odd formations just to avoid being the same. For example, where the Geneva Bible said “mother-in-law”, the Bishops’ Bible would say “wife’s mother.” Thus, it was unpopular with the general public, who just kept buying Geneva Bibles.

Then, in 1603, Queen Elizabeth passed away, and King James VI of Scotland inherited the throne of England as King James I.

Hey, that’s me!

King James had already decided that the existing English Bibles weren’t good enough for the Church of Scotland, and had discussed the matter with the Scottish clergy as early as 1601. In 1604, he called a conference of English and Scottish church leaders and demanded that a new official translation be created. This version would need to be based on the “original text”, meaning the oldest texts available at the time – the Masoretic Text and Septuagint for the Old Testament, the Textus Receptus and Peshitta for the New Testament. The translation would use the Bishops’ Bible as a guide, but replace the awkward wording and unnecessarily complex sentence structure with plain English (or, what would have been considered “plain English” at the time), consulting the earlier English translations to help determine the easiest phrasing to read and say. There was also an emphasis on using the traditional interpretations of the original Hebrew and Greek words rather than the Puritan interpretations that were floating around at the time – “church” instead of “congregation”, for example. Lastly, unlike the Geneva Bible, there would be no margin notes.

King James wanted this task done right, so instead of entrusting it to one person or a handful of people, he hired 47 of the best biblical scholars in England and Scotland to write this new version. Breaking into six committees, each of which included people across the spectrum from Catholic-leaning high Anglican to radical Puritan, the scholars would constantly translate and re-translate the same passages, comparing drafts and debating the meanings of the text until they were satisfied they had the perfect translation.

The result was published in 1611 and became the official English-language version of the Bible in the realm. It wasn’t adopted uniformly all at once, however: the Mayflower pilgrims brought the Geneva Bible with them to America, and the Book of Common Prayer used by the Church of England continued to quote from the Great Bible until 1662. By the mid-18th century, however, virtually all Bibles printed in English were the King James Version. This would have been the version used in colonial America during the War of Independence, and throughout the English-speaking world for much of the 19th century.

The only exception would have been the Douay-Rheims Bible, an English translation of the Vulgate made for Catholics, but even this wouldn’t have been seen often, as until the 1960s most Catholic Masses and liturgies were still conducted in Latin.

In the mid-19th century, however, there was a growing interest among academics and scholars in revisiting the Bible and examining its text more critically. In 1862, Scottish publisher Robert Young produced Young’s Literal Translation, a version that translated every single word as literally as possible from the earliest available Hebrew and Greek texts. In 1885, the Church of England completed the Revised Version, created to update the language of the King James Version from Shakespearean English to Victorian English and to take advantage of the discovery of older copies of ancient scriptures such as the Codex Sinaiticus. A version in American English based on the Revised Version was published in 1901 as the American Standard Version. Every version of the Bible published since has been made in this tradition.

That’s not to say that everyone agreed the Bible needed updating. By the time Biblical scholars of the 19th, 20th, and 21st centuries started preparing new translations, the King James Version had been so pervasive in the English-speaking world for so long that it had taken on a life of its own, permeating both religious and popular culture. For many people, the King James Version is the first thing that comes to mind when someone mentions “the Bible.” People love the beautiful poetry of 17th-century English, after all.

Let’s be honest, though: it is a bit hard for a modern English-speaker to read that language, beautiful as it may be. Interestingly, a new group of translators has come up with a solution: the 21st Century King James Version! Yes, this is a thing – an edition of the King James Version that leaves in the verbs ending in “-eth” that people love, but replaces some archaic words that nobody uses today with modern equivalents. For example, the original KJV text of Ezra 9:3 reads “And when I heard this thing, I rent my garment and my mantle, and plucked off the hair of my head and of my beard, and sat down astonied.” The KJ21 (yes, that’s how they want it abbreviated) reads “And when I heard this thing, I rent my garment and my mantle, and plucked off the hair of my head and of my beard, and sat down stunned.” Credit where credit is due, that is an interesting compromise.

Now that you know all of this, here’s an experiment you might want to conduct. Next time you happen to be in a book store, pay attention to the Bibles on sale. What version or versions are they selling? This might be interesting to know.

Wait… who am I kidding. People don’t go to bookstores anymore. They shop on Amazon. I forgot.

The Fracturing World of Streaming

An Editorial

I suspect Hulu’s days are numbered. Here’s why.

Hulu was founded as a joint venture between multiple big media companies as an answer to the rise of Netflix, the independent DVD rental-by-mail service that quickly transformed itself into an innovator in the new market of videos streamed online, offering movies and TV shows for subscribers to watch anywhere, at any time. Hulu, owned by the likes of NBC Universal, Fox, and the Walt Disney Company, had an advantage in securing the streaming rights to a vast library of television programming: it was owned by the very companies that produced those shows. Netflix, in contrast, has to continually negotiate the streaming rights to the films and TV shows it offers, apart from the original programming it makes in-house.

Then, Disney bought 21st Century Fox, and as a consequence of that merger, it now owns a 67% share of Hulu, with NBC Universal’s parent, Comcast, agreeing to basically let Disney run Hulu outright and promising to sell its remaining 33% share in five years. So, okay, Disney owns Hulu now. Why does that matter?

Disney is planning to launch its own streaming service on November 12. Disney+ is a service that will take advantage of the massive size the company has grown to in recent years, to offer not only Disney films, but also movies and TV shows from Pixar, Marvel, Lucasfilm, Fox, and even National Geographic. At the same time, Comcast is going to slowly pull NBC Universal’s content off of Hulu to start their own streaming service that is expected to launch next year. Basically, Hulu is going to be made redundant, and from Disney’s point of view, if it merges Hulu with Disney+, it is guaranteed a massive subscriber base from all the former Hulu subscribers, and will inherit all of Hulu’s own original programming as well. It just makes sense to me that Disney would want to do this.

Disney and NBC aren’t the only big media companies seeking to claim a piece of the hot, new streaming market. WarnerMedia is also launching their own streaming service, HBO Max, also arriving next year. CBS already has its own service, CBS All Access, offering the channel’s TV lineup and some original shows, most notably the latest versions of Star Trek. In a way, I am starting to feel sorry for Netflix, the innovator that started this rush, now constantly losing programming to its competitors as the big media companies pull their libraries off Netflix to put them on their own, rival streaming services. No wonder Netflix has made a huge push to increase its in-house library.

In fact, Netflix, Amazon Prime Video, Vudu, and other streaming services have another hurdle to overcome. The FCC has repealed net neutrality in the United States, and so far attempts to have Congress reinstate it have stalled, in spite of wide, bipartisan support for net neutrality among the American people. I have written about net neutrality on this blog before, but to summarize, it is the principle that internet providers have to treat all data that users want to access equally. Without it, ISPs can legally throttle streaming speeds and charge higher premiums for access to certain internet services.

Well, AT&T owns WarnerMedia, and Comcast owns NBC Universal. Is it any wonder these companies want to get into the streaming business? Think about it – they can force people who want to stream movies and TV shows on Netflix or Amazon or Disney+ to pay higher prices for a “premium internet package”, but allow people who use their own in-house streaming service to use it at no extra charge. So long as they are up-front about it and not deceptive, that’s perfectly legal now.

Speaking of cost…

In general, streaming services are not free. Netflix runs up to $15.99 a month. CBS All Access will run you up to $9.99 a month. Hulu’s ad-free version costs $11.99 a month. CuriosityStream, a niche streaming service offering documentaries about science, nature, and history, costs $19.99 per year. Amazon Prime Video requires an Amazon Prime subscription, which currently costs $119 per year. At least Crunchyroll, a streaming service for anime, actually is free – though if you want to remove ads, you have to pay $7.99 a month. The more streaming services you sign up for, the more expensive it gets. Many Americans, myself included, have gotten rid of cable and now mainly use streaming for entertainment, but it is really easy to end up paying enough in monthly streaming bills to equal a cable bill.
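To put a rough number on that, here is a quick back-of-the-envelope tally using the prices quoted above – a sketch only: annual subscriptions are converted to monthly equivalents, and your actual total depends on which plans and tiers you pick:

```python
# A back-of-envelope tally of the monthly streaming prices quoted above.
# Annual subscriptions are converted to monthly equivalents; actual plans
# and prices vary.
monthly_prices = {
    "Netflix (top plan)": 15.99,
    "Hulu (ad-free)": 11.99,
    "CBS All Access": 9.99,
    "Crunchyroll (ad-free)": 7.99,
    "CuriosityStream": 19.99 / 12,  # billed yearly
    "Amazon Prime": 119.00 / 12,    # billed yearly
}

for name, price in monthly_prices.items():
    print(f"{name:24s} ${price:6.2f}/month")

total = sum(monthly_prices.values())
print(f"{'Total':24s} ${total:6.2f}/month")
```

Stack up half a dozen services like that and you are already in the neighborhood of a typical cable bill.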

And it’s not like Disney+, HBO Max, or the new NBC streaming service will be free, either.

When the streaming market was new, everyone had Netflix and Hulu and that was about all anyone needed. Now, streamers have to pick and choose which streaming services they want to subscribe to, and so these companies have to compete with each other on what their service offers viewers to watch. There are literally websites that exist to track which movies and shows are on which services, and these have to be updated regularly as contracts expire and new ones are negotiated.

Hence, the push for more streaming-only original content that is linked to a specific service. Do you want to watch the upcoming Star Trek: Picard? Better sign up for CBS All Access, then, as you can’t watch it anywhere else.

Legally, anyway

This forces people to pick and choose what programming is worth paying an extra $10-$20 a month for, and many will miss out on films and shows they may have otherwise enjoyed because the price tag is too high. People will end up segregating themselves by preferred streaming service, only seeing shows available on other services when visiting a friend’s or relative’s house. The streaming market is fracturing as we speak, and at least for the foreseeable future, it will continue to do so. What impact will this have on what films and TV shows get made, or on how audiences respond to them? Only time will tell. However, it will have an impact, and a big one. Of that, I can be sure.