I think people are brainwashed into believing that doctors know everything and we mere mortals are ignorant. I'm the absolute expert on my own body, regardless of what doctors say. It could be a generational thing: my mum viewed doctors as on a par with gods and did everything they told her to do, to her great harm. I, on the other hand, met plenty of wannabe doctors at uni when I was studying and found them to be some of the least empathetic and the most greedy and materialistic people I'd come across. They don't change when they become 'qualified'.