I blame it (and Americans' preoccupation with insincere smiling in general) on Carnegie's "How to Win Friends and Influence People". I know, I know, one man single-handedly changing a huge part of a nation's culture? But Carnegie himself notes in the book that the reason the smile works is that sellers don't, as a rule, smile at would-be buyers during the transaction — so the smile must not have been part of the culture yet. And we all know what happened next: the book became a bestseller, everybody read it, it got taught in MBA courses, etc., and voilà, now everybody's smiling. You can even see it in, e.g., photos of politicians and businessmen: in the 1930s–40s you see mostly serious, somber-looking men. By the 70s–80s almost all of them have that rigor mortis smile.