I've asked cassielsander the same thing, but why was establishing diplomatic ties with China, in and of itself, a "good thing"? Did we win some important concessions from them in return? Did the ties make the world--or even just East Asia--a safer, better place? Did they lay the groundwork for later gains?