Sunday, August 8, 2010

The Decline of Religion in the West?

This is going to be a short post and I mostly just want to hear the opinions of others.

I think it is pretty apparent that in the last century religion has taken a strong hit and has been in fairly rapid decline. According to polls, fewer and fewer people call themselves religious, and certainly an increasing chunk of the population no longer goes to church or reads the Scriptures. This is due to a lot of factors, but I think the early ones that really shook up the foundations of the Church were the rise of science and technology and the advent of Darwin's foundation-shattering theory of evolution in the 19th century. Perhaps the global atrocities of the twentieth century, including the two world wars and the Holocaust, also led to a decline in faith. At the very least, many Jews found themselves questioning the existence of God in the wake of the absolute devastation of the Holocaust. Increased urbanization and then suburbanization have broken apart the traditional community ties that were based largely on the church (see my comment on Matt's post about the importance of community in religion).

All of these things, and undoubtedly many more, have led to Christianity's diminished importance in our country. Do you guys feel that this increased secularization is a bad thing? Do you think that the disappearance of faith, which had characterized Western civilization since the fall of the Roman Empire and the Christian Church's successful bid to fill the void the Empire left behind, plays a large part in the alienation, aimlessness, and overall unhappiness and anxiety that characterize our modern, technologically insane society? With the disappearance of the Church as the primary foundation of people's lives, there has also been a change in morals and values. Are these changes affecting people in a positive or negative way?

And finally, since I am rather uninformed on this subject, will the same thing happen to other societies as has happened to Western civilization? I sense that traditional religions in other cultures, such as those of Africa, Latin America, or Asia, are being pushed out by Westernization, which is spreading rapidly across the globe and dominating everything.

Personally, I do feel that the decay of religious faith is greatly changing our cultural landscape. I think there is less of a sense of community than there used to be, although other things affect this too (everyone commuting great distances in their cars; people spending all their time on the computer or their cell phone; an overall increase in entertainment outlets, which pushes people in all different directions). Although I strongly disagree with the way religious conservatism has dominated large chunks of our national history (the incredibly conservative Puritans of New England, or the Southern Protestants who used religion to justify slavery), I think our country and civilization are lacking a sort of overarching spirituality that could bring us together.

Anyways, this is just some food for thought, and I'd like to hear others' comments on the issue, since I think it is a very, very important one.

--Edward

1 comment:

  1. As I wrote in this post, I was pretty sure that I had written an additional comment on Matt's post, but for whatever reason (I'm guessing the bad Internet connection at my house), it seems not to be there now. Anyways, I will try to sum up what I said in that comment here.

    Mostly I talked about how I think an important part of organized religion is bringing the community together and making for greater social ties among people. In this day and age, when everyone is doing their own thing, dropping their kids off at college prep classes, taking yoga classes in a trendy spot in the nearby city, and visiting their local organic-homegrown-fair trade-nonprocessed food store, people seem united less and less by geography. From my understanding, in the old days, people would gather at least every Sunday at the church for worship, and then afterwards have feasts and talk, and the guys would basically go to church to pick up all the hot girls with the sexy ankles and erotic bonnets. This seems like perhaps a romantic, idealized view of the past, but I genuinely believe that one of the great pragmatic reasons for organized religion is that it helps bring people together, creating a larger sense of family than the nuclear family unit that has become the norm in modern America. You get baptized there, you go to Sunday school there, you get first communion--whatever, I don't really know all of the organizational tenets of Christianity, but I do see the good in them, even if I haven't taken much part in them. While it seems likely that if I have kids, I would probably not end up taking them to church regularly, perhaps it would do them some good. Undoubtedly, if I had gone to church growing up, I would've been even angrier at religion in the past, but in recent years I think I have matured in my view of religion, and I now see a lot of the good it does, and how that good is disappearing in the increasingly secular society we live in.

