The dental industry in America is massive. Why is it such an important part of the American lifestyle?

  • aidan@lemmy.world · 1 year ago

    Honestly, I would really disagree with your claim that Americans are more obsessed with appearance and have a more negative impression of poverty. In the Midwest/South I never saw anyone judged for their financial situation. It might be more common in coastal cities. I think there are vain people everywhere, and I haven't noticed any more or fewer of them living outside the US. I really don't understand why you think it's so much more ingrained in the US, and I wonder if it comes down to the people you interacted with in the US versus outside it. (For example, maybe you were in middle/high school in the US, which is full of judgmental people everywhere.)