Monday, May 24, 2010

The Doctors is a TV show supposedly geared toward educating people about personal health. In practice it feels more like domestic terrorism aimed at the weak-minded, keeping viewers in line by scaring them with medical nightmares. The strategy: keep people placated with junk science and medicated with junk treatments. There's no money in cures, and no market if everyone believes they're healthy. So the show spreads the idea that no one is healthy, and that everyone could use a little medicine of one kind or another. Disgusting.
