I hope everyone had a good holiday weekend. Marie and the kids were gone. I spent my time "putting my garden in" (a phrase I've long disliked for some reason) and cleaning out the areas of the house and garage I had usurped for gardening purposes. Marie was glad to get the space back.
I listened to a ton of podcasts and lectures this weekend. If you're not acquainted with the Teaching Company's Great Courses series, you ought to check it out. It's awfully pricey but awfully good. I got a few lectures through an Audible membership, well worth the price. I've also been listening to a lot of Al Kresta. If you're not listening to his daily radio show (available by podcast), you're missing out.
Among other things learned while listening: Although estimates vary, it is possible that fewer than 1/10th of 1% of the population experiences gender dysphoria. And for that we're opening up girls' restrooms to all men who claim they suffer from it? This, despite the prevalence of voyeurism. From Wikipedia: "Subsequent research showed that 65% of men had engaged in peeping, which suggests that this behavior is widely spread throughout the population. Congruent with this, research found voyeurism to be the most common sexual law-breaking behavior in both clinical and general populations. In the same study it was found that 42% of college males who had never been convicted of a crime had watched others in sexual situations. An earlier study indicates that 54% of men have voyeuristic fantasies, and that 42% have tried voyeurism."
This is a very good piece about Facebook's anti-conservative (anti-libertarian) practices. It seems to explain what really happened. In short, there were problems with Facebook's approach, but it isn't as insidious as some conservative news sources would have us believe. Excerpt:
In the case of Facebook's Trending Topics, if the top stories on Facebook were being identified purely by an algorithm, the story that sparked the controversy may never have happened. . . .
But in the case of Trending Topics, humans – imperfect and biased as we all are – were needed to step in where algorithms weren't enough. For instance, without a light touch of human judgment, "lunch" would be a trending topic by noon every day, and #JeSuisCharlie and "Charlie Hebdo" might be treated as separate topics instead of being counted as the same news topic. Such a light touch theoretically shouldn't be problematic.
But because we are all human, we are all imperfect. And we all have biases we can't see. Of course, nobody says they have biases. "I really just try to look at the facts and be impartial," is a refrain I've heard in a thousand focus groups. . . .
What is extraordinary about a platform like Facebook is that it allows conservatives to bypass the filters, conscious and unconscious. News stories about people using Second Amendment rights to defend themselves or heartwarming stories of churches doing good work can make their way to people who want to read them, without a news editor in Manhattan deciding if they're really "news." Social media has, in many ways, democratized news and created a true "marketplace of ideas," which is also a core part of Facebook's stated mission.
The way Facebook's Trending Topics product was set up, it had a chance to further that goal. Rather than an editor picking and choosing from the get-go, Facebook's algorithm would surface the "most talked about" items.
So far, so good.
But imagine that you're 24 years old, working as a contractor for Facebook. You've probably got a degree from an elite institution, and you get much of your news from watching John Oliver. Suddenly, a story breaks on a conservative website you've never heard of about, say, a pro-life march. You don't have anything against conservatives; you're just cleaning up the results you're given. But this article isn't on your list of trusted news sources, and are there really that many people talking about it?
You can see how, even with good intentions, things could go awry. Adding a feature that decides what is "trending," even if it is mostly driven by data, meant Facebook risked replicating the editorial processes that conservatives had fought against.