Bible Questions Answered
 
  

Is the United States a Christian nation?


Question: "Is the United States a Christian nation?"

Answer:
It may seem intuitive, at first, to answer this question by focusing on government. But the better way to determine whether the United States is a Christian nation is to compare the philosophy of its people to the Word of God.

The Declaration of Independence states that every person is endowed with certain unalienable, God-given rights, among them “Life, Liberty and the pursuit of Happiness.” This philosophy is what we could call the “American Worldview,” and it shapes everything about the nation, from its economic and foreign policy to the private lives of its people. This is the atmosphere in which most of us have grown up. But can this American Worldview be called a Christian worldview? Can we really call the United States a Christian nation?

Life
First, what does “life” mean to a Christian? Most Americans would say we have a right to be alive, just by virtue of having been born. Most Americans would also say we have the right to do with our lives as we choose, because our lives belong to us. Christianity agrees that we have the “right to life” and recognizes that life comes from the Creator, just as the Declaration says. However, the Christian (biblical) view is that the right to live exists not by virtue of being born but by virtue of being created first in the mind of God (Jeremiah 1:5). Acts 17:25 says that God “gives to all mankind life and breath and everything.” The Bible is saying here that the life of man is sustained by God, and as such, it belongs to Him. But Americans generally believe that we are free to do with our lives just as we please, because we believe our lives belong, primarily, to us. For a Christian, God’s law is the absolute truth and the final authority. It tells the Christian “Thou shalt not murder” and “Thou shalt not bear false witness.” But the United States shows, both by the lives of her citizens and by the laws she enacts, that she does not recognize the authority of God or respect His laws.

Liberty
What does “liberty” mean to a Christian? Freedom of speech, freedom of the press, freedom of religion, and the right to bear arms are some of the rights outlined in the Bill of Rights. All of these freedoms are good things, valued greatly by Americans. Today, however, our nation has, for the most part, rejected the Bible as the standard of right and wrong. So liberty now carries an additional meaning for our citizens: it means that we are ultimately free to do whatever we want. It means that we control our own destiny, or that we should, and that nobody can tell another person what is right or what he should value. This mindset has had disastrous results. In America now, everything is subjective. In the face of the monstrous tragedy of abortion, Americans echo the words of Pontius Pilate: “What is truth?” Our personal choice has become the only thing we truly value. We are tolerant above all, but only because to put down another person’s freedom is to endanger our own liberty. Practically speaking, since such a wide variety of religions are now represented among our citizens, how can we say “America is a Christian nation” without emptying the Christian faith of its meaning? A Christian will not harm someone of another religion for refusing to convert. However, the Bible is clear that we are not to tell people all roads lead to heaven. There is one Way, and His name is Jesus Christ. The Bible informs Christians that freedom and liberty are good and right. But it also gives us the context of that freedom: we have freedom as Christ’s followers because we trust in His righteousness instead of our own. We were slaves to this world and to sin; now we are slaves to Christ, and that is a Christian’s definition of freedom. That doesn’t sound much like the definition of liberty held by the government or the citizens of America.

The Pursuit of Happiness
Now, the pursuit of happiness: what is it, to a Christian? In the Bible, happiness is an emotion that is welcomed but not to be sought after. We seek God, and joy is a result of closeness to Him. But joy is different from happiness. Joy is a spiritual contentment and pleasure that comes from the Holy Spirit. A person must be in fellowship with the Spirit to experience joy, and it transcends circumstance. The apostle Paul said that he had “learned to be content whatever the circumstances” (Philippians 4:11), and Paul’s circumstances were hardly the sort to produce happiness: beatings, stoning, shipwreck, hunger, thirst, and danger. But his joy and peace were from God, not from his circumstances. In contrast, Americans tend to believe we are to pursue happiness in this world at all costs. If it makes you happy to leave your wife and children, do it. If it makes you happy to devote your life to stardom at the expense of friends and family, follow your dream. If you are a man but you think being a woman will make you happy, have a sex change. Play video games 10 hours a day? Drink yourself to death? Get married to your dog? Sure, if it makes you happy! Perhaps when the founding documents were framed, the Judeo-Christian ethic of “love thy neighbor” was understood as the foundational principle upon which the right to pursue happiness rested. But over the years that right has come to mean a right to pursue individual pleasure, no matter how strange the means, without being judged by one’s fellow man and without regard for how that pursuit affects another person’s rights or freedoms, or the fabric of society itself.

But consider Mark 8:36: “For what does it profit a man to gain the whole world and forfeit his soul?” For the Christian, this thought is central: nothing is gained from pursuing comfort and happiness here on earth. Nothing is really gained, for a Christian, by “life, liberty and the pursuit of happiness.” The Christian pursues other things: “Pursue righteousness, faith, love and peace, along with those who call on the Lord out of a pure heart” (2 Timothy 2:22). “Let us therefore make every effort to do what leads to peace and to mutual edification” (Romans 14:19). “But you, man of God, flee from all this, and pursue righteousness, godliness, faith, love, endurance and gentleness” (1 Timothy 6:11). Christians are concerned with the spiritual because they belong to another country; they are citizens of the Kingdom of Heaven.

So, is the United States a Christian nation? No. Not in its philosophy, or in what it loves, or in what it does. Despite its Judeo-Christian roots and heritage, and the beliefs of some of its founders, the United States today is a nation that follows other gods, and lives a lifestyle that is not compatible with Christianity.

Recommended Resources: What if America Were a Christian Nation Again? by D. James Kennedy and Logos Bible Software.


Related Topics:

Should a Christian be a Republican or a Democrat?

What does the Bible say about abortion?

Does God expect Christians to vote?

How should a Christian view the separation of church and state?

Were the Founding Fathers of the United States Christians?


