Definition of 'West Germany'

(from WordNet)
noun
A republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990 [syn: West Germany, Federal Republic of Germany]