
Americanism and Christianity

America is not the Kingdom of God. America is not the new Israel. Americanism is not Christianity. Though some nations have been more influenced by Judeo-Christian ethics than others, it is not accurate to call the United States a “Christian” nation, nor has it ever been one. The concept of a Christian nation is mythical, for it is an impossibility. What is it about America that leads some people to call it a Christian nation? Because some of the founding fathers claimed to be Christians? Because the Ten Commandments and other biblical references were inscribed on buildings and currency? Because people have invoked the name of God to justify the nation’s military activities? No one can deny that good things have occurred in the history of the United States, but to attribute them to the US government instead of the Body of Christ is at best misguided, and at worst blasphemous.


America is not the world’s hope. In fact, for all of the points that some people want to make about the good that has stemmed from the establishment of the US, they often neglect to mention, or explain away, the atrocities that have been committed in this nation. Slavery is a blight on this country. Many white Americans will attempt to explain this away, but the simple truth is that black people were considered a lesser race by many in the country’s early years, a sentiment that continued long after the Civil War and still exists in some respects. Yet these same people want to condemn Hitler and the Germans for thinking the Jews were a lesser “race.” What is “Christian” about this?


Ku Klux Klan members would quote the Bible to defend their atrocities; does that give them legitimacy because they referenced the scriptures? The Jim Crow laws were detestable. What is distinctly “Christian” about this? Native Americans were slaughtered in the name of American expansionism. How is this different from the actions of the Nazis? What is “Christian” about this? The disgusting acts committed by American military personnel during WWII and other wars are glossed over because they were considered exceptional events that occurred during “just” military campaigns. Have American wars been justified? The Revolutionary War can be viewed as a defensive war, and the Bible does not seem to condemn self-defense. That exception aside, what wars can or should be labeled “just” or “necessary”? WWII is often hailed as a just war by those who view it through a shallow lens, but some point out that America’s involvement was hypocritical because the governmental actions of its allies were not that different from the actions of its enemies, and that America’s interest was not in helping the Jews but in furthering its imperial pursuits. Seriously, is there much difference between what Hitler did to the Jews and what some people in this country did to black people and Native Americans? Do you think it would have been just for another country to come into the US and kill those who held slaves, and also to massacre those who did not, treating them as mere casualties of war?


How many people have been killed in American wars? How many innocent foreigners were killed? What is “Christian” about that? “American Conservatism,” often conflated with Christianity, decries the abortion of American babies but does not lift a voice when millions of people in other countries are killed in the name of American Patriotism. Hypocrites. Do we believe in the sanctity of human life? Or is this just a label used to defend a political agenda? Does God care more about the American unborn than he does about the men, women, and children of other countries? If you say yes, your religion is Americanism, not Christianity. Why do so many people pledge allegiance to a nation that has committed so many atrocities, and lift up as gods the military personnel who have carried out activities that are antithetical to Christianity?


Pointing to the missions that God gave to Israel in the Old Testament is no justification for claiming that the US has a mission from God to destroy other nations. God is not advancing the interests of any earthly nation today (though Israel remains his elect nation); rather, his concern is the Kingdom of God, which is made up of people from all nations and all languages. America is not Israel. The president is not King David. America’s interests have not been God’s interests. Yet many will defend their actions “for God and country.” So did Hitler when he killed “foreigners.” Do you think that all Muslims should be exterminated? Some people who profess to be Christians do. Was that the mission that Christ gave his disciples? Do you hate and want to kill Muslims? Or are you concerned with their well-being and conversion? Your answer may indicate where your allegiance lies: with Americanism, or with Christ. When did Christ ever commission his followers to kill in his name? Is the Christian’s duty to kill to protect Americanism, or is it to be a living sacrifice for Christ?


Of course, I know that people are prepared to defend their actions, claiming “God and country” as their mantra. But the Americanism I see around me, which mixes God into its false religion, is not the Christianity I see in the Bible. God’s vehicle for good is the Body of Christ. Ideally, when the Body of Christ is strong in a nation, it will influence the nation’s government, but that influence only goes so far. No government operates the way God’s church is meant to operate: with pastors, elders, administrators, evangelists, Christian ordinances, and so on. America is not a theocracy.


Consider some of the differences between Americanism and Christianity:


Americanism teaches self-reliance – Christianity teaches we should share one another’s burdens.

Americanism teaches self-preservation – Christianity teaches self-denial.

Americanism teaches an “us vs. them” ideology – Christianity teaches there is one human race, and we are all precious in the eyes of God.

Americanism says “Don’t tread on me” – Christianity teaches us to be living sacrifices, to do good to all, and to live peaceably with all men, as far as it is possible. (I read these on the web somewhere but can’t remember where.)

America is not God’s chosen nation, and the US military is not God’s army. Christianity will thrive regardless of the environment it is in. Real Christianity, that is. The US government does not promote Christianity. It never has. The Body of Christ promotes Christianity, though its members are few. Where does your loyalty lie? Do you worship at the altar of Americanism, which seeks to fulfill the flesh while giving lip service to God? Or are you a follower of Christ?
