'western United States' definitions:
Definition of 'western United States'
From: WordNet
noun
The region of the United States lying to the west of the Mississippi River [syn: West, western United States]