Is Your ‘Woke Credit Score’ High Enough to Keep You From Being Denied Service by Major Corporations?
May 5, 2021

Commentary

America is no longer a country divided between conservative and progressive, between Republican and Democrat. It is now divided between the “woke” and the “non-woke.” And thus far, the woke appear to be gaining the upper hand.

To be “woke” means to have been “awakened” to the awful truth about America: just how bad a place it is, and just how much it needs to change, and fast. To hear the woke tell it, the United States is a country filled to the brim with racism, sexism, transphobia, and just about every other evil -ism in the lexicon. Woke activists see their job as getting the rest of the country to “wake up,” acknowledge, and reject the insidious white supremacy that supposedly permeates … well, just about everything in America. The woke insist that not only does the United States need a vast cultural makeover, but that this crucial makeover needs …