I'm sorry, but people need to do their own research and become empowered with the knowledge of what will heal them. Doctors really just aren't as informed as they should be. IMHO, they lack a real understanding of how the body works, and worse, they are almost worshipped as gods in our culture. They either want to take organs out or put you on synthetic meds. They are bought and paid for by the pharmaceutical industry, and it's really just that simple. I realize that many of them are caring doctors who genuinely want to help; their training simply falls short.