From the late eighteenth century onward, white Americans moved steadily westward. They displaced local tribes and transformed the landscape into vast agricultural belts, eventually extending their control to the west coast. By the early twentieth century, the landscape of the USA had changed radically, and the country had come to dominate the world market in agricultural produce.