Cowboys have always had a romantic image. When people first watched Hollywood movies, they came to see being a cowboy not as a job but as a lifestyle, full of adventure, freedom, and horses, and a classic symbol of the United States. In reality, American cowboys have lived and worked in the western and southwestern United States for over three centuries, long before Hollywood, and the lifestyle has always been about hard work and long hours.