Religion is an important part of society, especially in a place like the U.S., which is considered to be a Christian country. However, people are becoming more independent and believe that they don't need help from anyone. This is where religion is losing its influence, because religion is all about help: helping yourself and helping others. As the trend of self-reliance becomes more and more popular, religion loses its influence.
Looking at the U.S., several recent changes, including the Gay Rights campaign, show that ideas and lifestyles that were once considered unacceptable are now becoming accepted and maybe even the norm.
Also, a recent Gallup poll (http://www.gallup.com/poll/162803/americans-say-religion-losing-influence.aspx) shows that 77 percent of Americans think that religion is losing its influence on the country, its ideas, and its government.
No, religion is not losing influence in the U.S., because other religions, such as Islam, are growing more and more in the United States. It may have lost its influence in the government, though, because the U.S. government is taking God out of schools and the Pledge of Allegiance. Other religions are spreading in the United States, not just Christianity, and since we have freedom of religion, some may say that the government shouldn't be founded on Christianity alone. Because some people believe in other religions, the government may be removing Christianity to show that it isn't supporting just one religion, but I don't think that means religion is losing influence in the U.S. With bad things always happening, I think people still need somewhere to turn, and that is religion, so I think religion is gaining influence.