What about religion? Does it play a part in our healing process and mindset? Apart from anything else, faith builds a community: groups you can join (even if you don't believe, or are not sure).
As a Christian, I believe it does; however, I also believe that sometimes it makes things harder. Let me rephrase that a bit: sometimes other Christians make it worse, rather than the faith itself. I cannot speak on behalf of other faiths, agnostics, or atheists, but I can speak as a Christian.
People interpret the Bible in their own way, giving their opinions on situations, and to their own advantage. This is because we have God-given free will; we are not God's puppets and shouldn't be brainwashed. We all know of many cults where this has been the case in the past, is the case now, and will be in the future.
The New Testament is full of Jesus' explanations, followed by the disciples continuing to explain, because people had got hold of the wrong end of the stick.
Very often I hear people exclaim that the Bible is not relevant today; I disagree. Granted, parts of it seem very harsh and brutal, and don't make sense, but if we read them in the context in which they were written, we can make more sense of them. Take, for example, the Old Testament laws on being unclean and on which animals were not permitted to be eaten.
Think about that with these questions in mind:
Did they have running water in those days, let alone showers?
Did they have cleaning products?
Did they have hygienically maintained farms?
I think you'll find that the answer to them all is a resounding "No!" The unclean rules were there for good reason.
The Old Testament is full of bloodthirsty battles and sacrifices offered in a bid to wipe out bad and evil practices. Clearly these things didn't work. This is why Jesus came: to fulfil the Old Testament and show us that there is a better way, with two simple, positive commandments rather than a whole list of complicated rules and regulations to get tied in knots over.
Love God before everything else.
Love your neighbour as yourself (all of mankind - humankind if you're politically correct; I prefer the original!)
If you have faith it does give life a purpose and hope.
How can atheists have anything to look forward to? You're born, and you die. Gone forever!
How can agnostics have hope? Imagine sitting on a fence separating two fields. One might be safe and the other might contain a bull. One day it might be too late to decide which is the right field to be in, as you might suddenly fall into the wrong field, unable to balance on the fence any longer.
I understand that Christianity is the only faith that encourages a personal relationship with God. Yes, we have church leaders, priests, and vicars, but Christianity is not about the leader of the church group; it is about accepting Jesus as your personal Saviour, knowing that He was abandoned by His friends, had the authorities breathing down His neck, was picked on, humiliated, and ridiculed, and lost loved ones...but He paid the price for our eternal salvation and assured us that we do not have to be tied up in knots with umpteen rules and regulations. Simply observe the two I mentioned above; if we all do that, everything else will fall into place.
Christianity gives you the Holy Spirit: a Helper, Strengthener, and Enabler, Who empowers us to carry on.
In my book, Life After Death: A Mother's Story, I talk about my childhood and how I begged God to make the bad things stop. I understand now, although I didn't then, that He never did because the perpetrators chose their own actions. They didn't keep the two commandments, and so I suffered; however, I did find comfort in my faith.
It's worth considering.
Jeany Pavett, author of Life After Death: A Mother's Story