Cowboys have always had a romantic image. When people first watched Hollywood films, being a cowboy wasn't a job; it was a life of adventure, freedom, and horses. It was a classic symbol of the United States of America. In reality, real American cowboys have lived and worked here in the west and south-west of the United States for over three centuries, long before Hollywood. The adventure and romance have disappeared, but the hard work and long hours are the same as they've always been.