Cowboys have always had a romantic image. To the people who watched the first Hollywood films, being a cowboy wasn't a job; it was a life of adventure, freedom, and horses, and a classic symbol of the United States of America. In reality, real American cowboys have lived and worked in the West and South-west of the United States for over three centuries, long before Hollywood.