Having gained control of the continent, the Americans began to expand across it, continually pushing westwards into the wilds - and displacing and dispossessing the Native Americans in the process. By the end of the nineteenth century this form of continuous colonizing, or "pioneering", had led to the settlement of the entire United States from the east coast to the west.