western United States

Definitions

Noun
the region of the United States lying to the west of the Mississippi River

Synonyms