Organic food found to be no healthier
I saw this article about organic food being no healthier than ordinary food. It doesn't surprise me. It has often felt to me like the whole "eating organic" thing is more about feeling in control, or perhaps superior to others, than it is about health. What do you think?