American Culture Makes Women Unattractive
By promoting a thoroughly un-feminine attitude and identity
American culture makes women unattractive.1 It’s not just the unnatural gender identities that people are encouraged to have, although that’s a significant part of it. It’s also the prevailing attitudes that Americans pick up and internalize until those attitudes shape their entire personalities. Postmodern American attitudes are ugly.
To be sure, our culture’s confusion about gender roles is important. Traditional femininity is discouraged — unless it’s a grotesque caricature of womanhood being performed by mentally ill men who think they’re girls, in which case our depraved country celebrates it. And of course, traditional masculinity is denigrated and opposed even more forcefully. Males are feminized; females are masculinized; and surprise, surprise, young people are having less sex than ever.
But there’s much more to the story than just the promotion of “gender nonconformity” as an ideal. American culture also promotes pathological attitudes that make people repugnant. This would obviously include dark-triad personality disorders, as manifested in things like hysterical crybullying, disingenuous virtue-signaling, and other signs of obvious narcissism among adults with the emotional maturity of bratty toddlers — something that is becoming increasingly common with each passing generation. But even if we ignore the narcissists among us, I think we have to admit that we Americans (and Westerners) tend to have generally negative attitudes, at least compared to people from more traditional cultures. And these negative attitudes shape our personalities in negative ways, making us less attractive than we could otherwise be.
The typical American disposition could be described as “dour cynicism.” By that, I mean an attitude of disdain, disillusionment, and depression; an I’m-sick-and-tired-of-everything-because-it’s-all-bullshit kind of energy.