Definition of 'western United States'

(from WordNet)
noun
The region of the United States lying to the west of the Mississippi River [syn: West, western United States]