We can no longer stomach our food system. It's killing more and more Americans and costing billions in healthcare. 78% of Americans eat organic food because they think it's healthier. But is organic really better for us, or just a marketing scam?
When corporations got into the business and "organic" became a brand, everything changed: the philosophy and the label grew apart. Can gummy bears, or bananas flown halfway around the world, truly be organic?
"In Organic We Trust" is an eye-opening food documentary that looks beyond organic for practical solutions for me and you. Local farmer's markets, school gardens, and urban farms are revolutionizing the way we eat. Change is happening from the soil up.