Your diet plays a big role in your health. The foods you eat can either nourish and strengthen you, or they can make it easier for illness and disease to take root. When it comes to cancer, researchers have spent decades trying to understand whether dietary choices can raise or lower the risk of developing the disease. Many of their findings are inconclusive, and the jury is still out: does your diet affect your chances of getting cancer? Keep reading to find out.