Microsoft Word introduces new "woke" feature to monitor your language

GURPS

Microsoft Pushing Woke Culture, Includes Word Checker To Help You Use Woke Terms


“Grammar & Refinements” checks age bias, cultural bias, ethnic slurs, gender bias, gender-neutral pronouns, gender-specific language, racial bias, sexual orientation bias and socioeconomic bias



Some of the issues reportedly targeted by Microsoft in the latest version of Office 365, a subscription service used by 250 million people, include age, ethnicity, gender, sexual orientation, and “socioeconomic status,” The Sun reports, offering some examples of suggested changes.

Legendary astronaut Neil Armstrong’s famed “one giant leap for mankind,” spoken upon landing on the moon, could see “mankind” altered to “humankind” or “humanity.” In pop singer Barry Manilow’s hit “Copacabana,” Lola the “showgirl” could become a “dancer,” “performer,” or “performing artist.”

A reference to former British Prime Minister Margaret Thatcher as “Mrs. Thatcher” could become “Ms. Thatcher.” “Postman Pat” could be changed to “mail carrier” or “postal worker.” “Headmaster” becomes “principal,” “mistress” becomes “lover,” “master” becomes “expert,” “manpower” becomes “workforce,” and “heroine” becomes “hero,” among other changes reported by The Sun.
 

GURPS


Microsoft quietly released a little feature and suddenly it caused outrage



But it took the Daily Mail, of all publications, to emit a little stink in Microsoft's direction. The paper calmly explained how Redmond's "woke filter was capturing words like mankind, blacklist, whitewash, mistress and even maid."

The Mail even scoffed that former British Prime Minister Margaret Thatcher couldn't be referred to as "Mrs. Thatcher." No, she is now "Ms. Thatcher." I'm not sure she'd have liked that.

And then there's "dancer" not being inclusive, while "performance artist" is.

Naturally -- or, some might say, thankfully -- this problematic-solver is an opt-in selection. It doesn't autocorrect. It merely whispers gently that you may be sounding like someone not everyone will like. Or like someone only particular people will like.

The feature also offers alternative suggestions across subjects such as age bias, cultural bias, ethnic slurs, gender bias and racial bias.
 