When we look back at history, people have always been oppressed; it was normal. Even today most nations continue to oppress their people, and we choose to ignore it. The freedom we had in the West seems to have been a mirage, a blip in history. Those of us growing up in post-war Britain saw conscription end, and capital punishment too; workers' rights were fought for and won; freedom of religion, sexuality and speech became human rights; gender equality was strived for, and racism was apparently going to be a thing of the past. It was a 'window of time' when nationalism was a dirty word.
Was it all just a dream? Were we lulled into a false sense of hope, of liberalism, of enlightenment? Is the reality, the norm, oppression and a shift of power and money only to the rich?
At what point did all the cultures and societies before us realise they were oppressed? Those who were violently and suddenly oppressed and enslaved were under no illusions, but almost every society in history was eventually oppressed, if not enslaved.
My question is... At what point does a society realise it is oppressed?
My answer is... Probably too late!