Is religion losing its grip on America?
I believe it's true.
Back in the old days, religion seemed to control almost everything. It reached into every corner of our government and laws.
It shamed women who became unwed mothers, forcing many of them to give their babies up for adoption to married couples or relatives.
The use of contraception was frowned upon, and the religious fought tooth and nail to keep it from ever becoming legal, but they lost.
They fought for abortion to remain illegal, even in cases of rape and incest, but they lost there as well.
Even now, religion still seems to want to keep women barefoot, pregnant, and in the kitchen, but today many women are rejecting those religious beliefs and finding success, happiness, and freedom in their own careers and lives.
Men and women can now legally divorce and leave loveless, unhappy marriages without being shamed for it.
Gay people are becoming more accepted, and we are getting closer and closer to seeing them gain the same rights as any straight person and marry their partners.
Not to mention the continuing rise of atheists in America who are now coming out. With all of this, I can only see religion losing more of its grip on this country, and America becoming a better, more peaceful, happier, and more accepting place because of it.
What do you think?