Science to perpetuate the status quo

Lyotard warned several decades ago, in his seminal work The Postmodern Condition, about science’s lack of legitimation. At least, about the lack of a “beautiful” legitimation: he observed that instead of searching for truth or trying to improve mankind’s living conditions, science is nowadays ruled by what he called the performativity principle. In brief, science must be lucrative, returning money on investment in the same way other businesses do. Or returning more, if you want to secure a regular flow of money into research. For those to whom “performativity” sounds like cryptic philosophy, Feyerabend provided (why not Plato?) a straightforward explanation:

Twentieth-century science has renounced any philosophical pretension and become a big business. It is no longer a threat to society, but one of its firmest pillars.

Yep, a threat to society sounds bad … unless you think of society as an ideological system whose function is to reproduce inequality by supporting and transmitting the scheme that keeps a few people at the zenith of the social pyramid: those who own the resources and hold the power.
There are plenty of examples of how science is no longer a weapon for mankind’s progress but a way to perpetuate the status quo. Here is one of my favorites: citation indexes.

Citation Indexes? What’s that?

Citation indexes (CI) are essentially lists in which scientific journals are ranked according to their impact factor, a measure of how important their articles are to the scientific community. It sounds nice and helpful, since it shows

‘a journal’s true place in the scholarly research world’ and ‘Measure research influence and impact at the journal and category levels’ (Thomson Reuters, the editor of the JCR ranking, dixit).

Perhaps it sounds nice, but it isn’t. In the same way the JCR rates journals, those journals transitively single out the good researchers (the ones whose work gets published in them) and exclude everyone else.
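For readers curious about where that headline number comes from: the JCR-style two-year impact factor is simply a ratio of citations to citable items. Here is a minimal sketch, with figures made up purely for illustration:

```python
# Minimal sketch of a two-year impact factor calculation
# (the JCR-style metric discussed above). All numbers are invented.

def impact_factor(citations_this_year, items_published_prev_two_years):
    """Citations received this year to articles published in the previous
    two years, divided by the number of citable items published in those
    two years."""
    return citations_this_year / items_published_prev_two_years

# Hypothetical journal: 1,200 citations this year to papers from the two
# previous years, during which it published 400 citable items.
print(impact_factor(1200, 400))  # -> 3.0
```

The point is that the whole ranking machinery boils down to a single average of citation counts, which is then used to sort journals (and, transitively, the people who publish in them).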

CI rankings promote inequality

First, top-ranked journals are expensive, so there is no global access to them, and this becomes a powerful source of inequality. Countless researchers cannot reach the highest-impact articles because of the journals’ abusive prices: each paper costs about $30, and you need to read a lot of papers. If these articles are, arguably, the best scientific work, then people without access to them have a much harder time producing brilliant, innovative results, given that science is a cumulative process.
Additionally, citation indexes misrepresent countless researchers all over the world and make them systematically invisible. Their work is excluded from mainstream research not because of its quality but because of where it is published and, indirectly but no less importantly, because of language (the vast majority of first-quartile journals are in English) or the researchers’ connections.
Of course, those researchers are not explicitly excluded. But the symbolic violence of this segregation is brutal: first, because it is explained and legitimated in terms of the quality of the research work; and second, because the segregation itself is relatively invisible.

An alternative to citation indexes?

Criticism has been dethroned by pseudo-democratic or pseudo-intersubjective mechanisms that steer the consumption of literature and entertainment. Habermas complains about intellectuals’ lack of authority to direct public discussion. Science, an engine of change by definition, seems to be one of the few places resisting this democratizing wave, preserving the argument from authority in the form of peer-review committees with shamanic powers to commune with the deities of Knowledge and decide what is good and what is not.

That is even worse when you know that those peer reviews can sometimes be fabricated, or just hilariously sloppy, made only to justify taking money out of young researchers’ pockets.

I am openly not in love with mass pseudo-democratic mechanisms, which are easily influenced and cooked up by advertising constructs or filter bubbles. But it is clear that we need to give a voice to horizontal, open peer-review systems in which anyone can be a peer. And national research certification systems could also take into account more open and modern impact measures, better aligned with what science and research should mean.
Is there anything like that? There is.

What do you think about Academia.edu, for instance?


Academia.edu is a platform for academics to share research papers. The company’s mission is to accelerate the world’s research.

Academics use Academia.edu to share their research, monitor deep analytics around the impact of their research, and track the research of academics they follow. 32,590,050 academics have signed up to Academia.edu, adding 9,815,878 papers and 1,817,127 research interests. Academia.edu attracts over 36 million unique visitors a month.

[https://www.academia.edu/about]

Raúl Antón Cuadrado