Thailand is known as the “Land of Smiles.” While smiles are prevalent and often a sign of happiness and goodwill, that is not the whole story (see below). According to Culture Shock! Thailand: in the West, a smile is about something, generally an expression of amusement; in Thailand, a smile is a natural part of life, sometimes serving social functions as well.