Today, I want to talk about Facebook.
Yes, Facebook, the social media website, I’m sure you have heard of it.
Facebook currently gets a lot of media attention.
And not in a good way.
That’s because not only has Facebook collected and passed on users' information without those
users' consent, it has also provided a platform for the organized spread of political misinformation,
aka “fake news”.
I doubt you were surprised by this.
It’s hardly a breakthrough insight that a near-monopoly on the filtering of information
is bad news for democracy.
This is, after all, why we have freedom of the press written into the constitution: to
prevent an information monopoly.
And when it comes to the internet, this is a problem that scientists have written about
for two decades at least.
Originally, however, scientists focused their worry on search engines as news providers.
This role, we now know, has been taken over by social media, but it’s the same problem:
Some company’s algorithm comes to select what information users do and do not see prominently.
And this way of selecting our information can dramatically affect our opinion.
A key paper on this problem is a 2003 article by three political scientists who coined the term “googlearchy”.
They wrote back then:
“Though no one expected that every page on the Web would receive an exactly
equal share of attention, many have assumed that the Web would be dramatically more egalitarian
in this regard than traditional media.
Our empirical results, however, suggest enormous disparities in the number of links pointing
to political sites in a given category.
In each of the highly diverse political communities we study, a small number of heavily-linked
sites receive more links than the rest of the sites combined, effectively dominating
the community they are a part of […]
We introduce a new term to describe the organizational structure we find: “googlearchy” – the
rule of the most heavily linked.
We ultimately conclude that the structure of the Web funnels users to only a few heavily-linked
sites in each political category.”
We have now become so used to this “rule of the most heavily linked” that we have
stopped even complaining about it, though maybe we should now call it the “rule of the most heavily liked.”
But what these political scientists did not discuss back then was that, of course, people
would try to exploit these algorithms to deliberately misinform others.
So really the situation is much worse now than they made it sound in 2003.
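The “googlearchy” finding quoted above can be illustrated with a toy computation. This is only a sketch, not the method of the 2003 paper; the link graph and site names are made up for illustration:

```python
# Toy illustration of "googlearchy": count inbound links per site and check
# whether the most heavily linked site alone receives more links than all
# the other sites in its community combined.

from collections import Counter

# Hypothetical link graph: (from_site, to_site) pairs within one community.
links = [
    ("blogA", "hub"), ("blogB", "hub"), ("blogC", "hub"),
    ("blogD", "hub"), ("blogE", "hub"),
    ("blogA", "blogB"), ("blogC", "blogD"),
]

inbound = Counter(target for _, target in links)
top_site, top_links = inbound.most_common(1)[0]
rest = sum(inbound.values()) - top_links

# Here the "hub" dominates: 5 inbound links versus 2 for everyone else.
print(top_site, top_links, rest)
```

In a real study one would of course crawl actual link data; the point here is only how lopsided such inbound-link distributions can be.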
Why am I telling you this?
Because it occurred to me recently that the problem with Facebook’s omnipotent algorithm
is very similar to a problem we see with scientific publishing.
In scientific publishing, we also have a widely used technique for filtering information that
is causing trouble.
In this case, we filter which publications or authors we judge as promising.
For this filtering, it has become common to look at the number of citations that a paper has received.
And this does cause problems, because the number of citations may be entirely disconnected
from the real world impact of a research direction.
The only thing the number of citations really demonstrates is popularity.
Citations are a measure that’s as disconnected from scientific relevance as the number of
likes is from the truth value of an article on Facebook.
Of course the two situations are different in some ways.
For example on social media there is little tradition of quoting sources.
This has the effect that a lot of outlets copy news from each other, and that it is
extra hard to check the accuracy of a statement.
Another difference is that social media has a much faster turnover rate than scientific publishing.
This means on social media people don’t have a lot of time to think before they pass information on.
But in both cases we have a problem caused by the near monopoly of a single algorithm.
Now, when it comes to scientific publishing, we have an obvious solution.
The problem both with the dominance of a few filtering algorithms and with the possibility
of gaming comes from users being unable to customize the filtering algorithm.
So with scientific publishing, just make it easier for scientists to use other ways to
evaluate research works.
This is the idea behind our website SciMeter dot org.
The major reason that most scientists presently use the number of citations, or the number
of publications, or the number of publications in journals with high impact factor, to decide
what counts as “good science” is that these numbers are information they can easily
access, while other numbers are not.
Add to this that no one really knows how measures like the journal impact factor
are calculated.
Likewise, the problem with Facebook’s algorithm is that no one knows how it works, and it
can’t be customized.
If it were possible for users to customize what information they see, gaming would be
much less of a problem.
Well, needless to say, I am assuming here that the users’ customization settings would remain private.
You may object that most users wouldn’t want to deal with the details, but this isn’t necessary.
It is sufficient if a small group of people generates templates that users can then choose from.
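Such templates could amount to little more than interchangeable ranking rules. A minimal sketch, assuming a hypothetical post format and template names that are purely illustrative, not any real Facebook API:

```python
# Hypothetical sketch: templates as interchangeable ranking rules for a feed.
# All names (Post, TEMPLATES, sort_feed) are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int

# A "template" is just a scoring rule that a small group of people could
# publish and that ordinary users pick from.
TEMPLATES = {
    # Default: the rule of the most heavily liked.
    "most_liked": lambda post, prefs: post.likes,
    # Custom: prefer close friends and chosen topics, ignore raw popularity.
    "close_friends_science": lambda post, prefs: (
        2 * (post.author in prefs["close_friends"])
        + (post.topic in prefs["topics"])
    ),
}

def sort_feed(posts, template, prefs):
    """Rank a feed with a user-chosen template instead of one fixed algorithm."""
    score = TEMPLATES[template]
    return sorted(posts, key=lambda p: score(p, prefs), reverse=True)

feed = [
    Post("stranger", "politics", likes=9000),
    Post("alice", "science", likes=12),
    Post("stranger", "recipes", likes=500),
]
prefs = {"close_friends": {"alice"}, "topics": {"science"}}

# The same feed, ranked two different ways:
print([p.author for p in sort_feed(feed, "most_liked", prefs)])
print([p.author for p in sort_feed(feed, "close_friends_science", prefs)])
```

With the popularity template the viral post from a stranger wins; with the custom template the close friend’s science post comes first, which is exactly the kind of per-user control I have in mind.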
Let me give you a concrete example.
I use Facebook mostly to share and discuss science news and to stay in touch with people
I have on my “Close friends” list.
I don’t want political news from Facebook, I am not interested in the social lives of
people I don’t know, and if I want entertainment, I look for that elsewhere.
However, other people use Facebook entirely differently.
Some spend a lot of time with groups, use it to organize events, look for distraction,
or, I don’t know, to share cooking recipes, whatever.
But right now, Facebook offers very few options to customize your news feed to suit
your personal interests.
The best you can do, really, is to sort people onto lists.
But this is cumbersome and solves only some aspects of the sorting-problem.
So, I think an easy way to solve at least some of the problems with Facebook would be
to allow a third-party plug-in to sort your news feed.
This would give users more control and also relieve Facebook of some responsibility.
Mark Zuckerberg once declared his motto clearly: “Move fast and break things.
Unless you are breaking stuff, you are not moving fast enough.”
Well, maybe it’s time to break Facebook’s dominance over information filtering.