Definition of 'tropical medicine'

From: WordNet
noun
The branch of medicine that deals with the diagnosis and treatment of diseases that are found most often in tropical regions