FACEBOOK: JEKYLL AND HYDE?

By John Gibson

Dr. Jekyll was the good man; his alternate personality, Mr. Hyde, was the monster.

The smart young techies at the top and bottom of the Facebook hierarchy should watch the movie. Spencer Tracy was great.

The movie might give them a clue about what's happening in their company.

To illustrate: yesterday I put on Facebook a short video of what turned out to be a torrential rainstorm at the Texas ranch. Totally inconsequential, but most videos on FB are just that--nothing particularly important. At the same time, God only knows who was posting a video of something awful and revolting and perhaps even a crime. Both things exist side by side.

Evidently people upload 400 videos a minute to FB. That's a lot, and the number may even be an undercount. Success and problems come together at FB.

In fact, a big problem. At today's big FB conference in San Jose--called F8--founder and uber-boss Mark Zuckerberg may be--should be--asking his army of hungry coders to please develop an algorithm that will recognize a real-life murder.

That's because of the obvious, at least for anybody watching either FB or the news: the "live" murder, victim chosen at random, committed by Steve Stephens in Cleveland and posted on FB without FB being able to block it.

As for Stephens himself, at a noon news conference today police announced that he had been found near Erie, Pa., and had shot himself to death.

But he leaves an aftermath that focuses not on murder by crazy people (can there be any doubt?), but on FB itself. There needs to be a police-style All Points Bulletin on a solution to the problem of violent videos on FB--which FB now recognizes as a big problem needing a big solution. Can Facebook figure out how to prevent the worst possible videos from gaining a spot on FB without FB execs and engineers knowing about it in advance?

Evidently not.

FB seems as surprised as anyone when one of these videos shows up. It makes FB look like a car careening down the road driving itself while the so-called drivers--those inside FB itself--are busy closing deals on multi-million-dollar houses and watching in amazed delight as their swollen bank accounts click off digits like the pump at a gas station.

If the driverless car crashes--as it did with Steve Stephens' "live" murder on FB--the so-called drivers of the careening online vehicle known as FB express shock that such a thing could happen.

There are endless stupid things posted on FB that the world probably shouldn't see but that don't do any great harm.

But how to sort them out? How does technology recognize what is acceptable and what is not? The wizards are going to try to figure it out. Zuckerberg says "young people are just smarter," and he's got a ton of young minds to prove that rule true. But it seems to me that when a technology company becomes a media company--FB's audiences would make a TV exec drool--the responsibility becomes different.

FB needs to have human eyes on what it "airs." That's its responsibility, just like any other television, film, or media company.

Instead, we're going to hear about new algorithms, I promise you. That's because even FB can't afford the thousands of censors it would otherwise have to employ to solve the problem.

But I also think it will be a fool's errand even FB engineers won't take seriously. Instead, FB HQ is going to just ask the world to accept that while you're scrolling through funny dog videos and people bungee jumping and kids having birthday parties, you'll run across the occasional murder.