I do not like Facebook. I hate using it, but many readers depend on our TPC page for our posts. My son used Facebook to make sure we knew he was okay after the Boston bombings (he was only two doors away from the second explosion). My former students use it to stay in touch with me. But if I felt I could eliminate Facebook from my life, I would.
It constantly glitches, it's annoying to use, the format keeps changing, and I keep getting pop-ups explaining what seem like daily adjustments to its features. And don't get me started on the privacy issues and Mark Zuckerberg's political group's support of the filthy Keystone XL tar sands pipeline.
I do not like Facebook. But I use it, very reluctantly. I look forward to the day I no longer rely on it, and I resent the reliance that I admit to, along with my own unwillingness to delete my account.
My personal page was once suspended for including this image as part of a post on women's health care:
They thought it was pornographic, apparently. My account was restored after I protested.
So a beautiful photo of a pregnant woman was considered porn, and my innocuous page was dispensed with immediately, but it took Facebook this long to crack down on posts of actual, you know, misogyny and violence against women. Via the L.A. Times:
Activists say an online campaign to curb misogynist content on Facebook could be a watershed moment for a growing movement to remove posts and images that promote violence against women on the Web.
Women, Action and the Media launched the campaign last week, urging major companies to pull their advertisements from Facebook that could run alongside graphic language and images of rape, abuse and other violence against women.
Heeding the call, more than a dozen advertisers, including Nissan Motor Co. and Nationwide Building Society, removed their ads from Facebook, while others, such as American Express and Unilever's Dove brand, pressured Facebook Inc. to remove the offending pages. [...]
Facebook, which makes the bulk of its revenue from advertising, said Tuesday that it is reviewing its guidelines to evaluate content that violates its standards and will train moderators to identify and remove hate speech.
Yes, money talks.
"Review" away, FB, because that's worked so well for you in the past.