WHERE’S THE BEEF? Well, barring major economic and political changes in this country, the day isn’t far off when it won’t be raised in the U.S. at all, because, according to The Epoch Times’ Kevin Stocklin, the cattle industry is steadily disappearing. And it’s not because health-fad-following Americans have dropped red meat from their diets.