I talk to a lot of people (particularly from the Northeast) who say they don't like college football because it's "corrupt," and much prefer the NFL. Why do people from that part of the country have that perception?
Do they not think the NFL is corrupt? Anytime there are millions or billions of dollars at stake, there's going to be a certain level of corruption, whether it's things like Spygate or something else. Even in MLB, many recent statistics are tainted because of steroids.
My question is: why do so many people from the Northeast have this perception of college football? There's no question that college football has never been snow white, but it's as if these people don't understand what the sport is all about.