Men live to oppress you!
Or that’s what they say, anyway.
And since we are talking taboos, I may as well add that it only means "white men," but nobody says that part because, duh, it's obvious!
So really the message is, “White men live to oppress you!”
But do they? Really? Or are they a scapegoat, just another group to "vilify" to take the focus off the real stuff, like the fact that the world is basically an international s#it show, and getting worse by the day?
Or am I the only one seeing it that way? Is this utopia? Equality? Peace? Love? Rainbows? Getting along?
After 40 years of government support, are we there yet? If not, what is missing? The same goes for women's rights: at what point will it be right? Never? Now? Someday?
What is the objective? To splinter apart with increasing hate, or to turn the page and realize that, for the most part, none of us were even born then, and those who were were at best children, unable to change much?
Do humans need a "scapegoat"? Or can we evolve past that?
Please share your thoughts in the comments.