It is a cherished belief among Thai people that their country was never colonized. Yet politicians, scholars, and other public figures routinely inveigh against Western colonialism and the imperialist theft of Thai territory. Thai historians insist that the country adapted to the Western-dominated world order more successfully than other Southeast Asian kingdoms, and they celebrate its proud history of independence. But many Thai leaders view the West as a threat and portray Thailand as a victim. Clearly, Thailand's relationship with the West is ambivalent.