From dusty trails to epic gunfights, Westerns once ruled Hollywood, then vanished, only to return in surprising new ways. For decades, the genre was the backbone of American cinema, as studios churned out tales of ...