I need your opinion on something that has been driving me nuts.
We have all heard expressions like "eat your veggies", but my question is: are vegetables absolutely, 100% necessary?
People say that vegetables contain a lot of good stuff, and I agree. But they aren't the only source of that good stuff.
Fiber: Can also be found in whole-wheat bread, fruit, oatmeal, beans, brown rice, etc.
Vitamins and minerals: I have yet to hear of a specific vitamin or mineral that can't be found in other foods as well.
Water content: Water is widely available on its own.
Alkalizing effect: They counteract the effects of acidic foods like meat and fish, but almost all fruits are alkalizing as well.
Contain "Mysterious phytochemicals": Someone could argue that other foods have "mysterious phytochemicals" too, like all the "superfoods" that taste great and are loaded with antioxidants (goji berries, blueberries, etc).
I'll also leave this link here, for fun:
If possible, I would like this to be a LOGICAL discussion, not an emotional one; I understand that people get emotional when their sacred beliefs are questioned. I'd be interested to hear answers about "vegetables" - NOT "fruit and vegetables" or "plants" in general - and about whether they are necessary or not, not about whether tasty recipes are available or whether someone ate vegetables and lived to be 120 years old.