Definition of 'western United States'

From: WordNet
noun
The region of the United States lying to the west of the Mississippi River [syn: West, western United States]