I think part of the problem is that people try to make the Bible say what they want it to say, instead of letting the Bible speak with the help of the Holy Spirit.
Do we not all believe that the body has the ability to heal itself? Where did that belief come from? Does anyone know? I think if we knew the truth about many things, we would all have something to hang our heads about.