the Wild West
noun [S] UK /ˌwaɪld ˈwest/ US /ˌwaɪld ˈwest/
(of the US) the wild, untamed West
the name given to the western part of the US during the time when Europeans were first beginning to live there and when there was fighting between them and the Native Americans