The Politics of Altmetrics

Thursday 27 September

by Dr Suze Kundu, Head of Public Engagement at Digital Science


This session covered some of the challenges of adopting and utilising altmetrics, including the ethical issues they raise, their responsible use, and their governance. Three speakers took to the stage to share some of their research on the topic.

First up, Gabrielle Samuel from Lancaster University stood in for Gemma Derrick to present their paper on how ethical decisions are made in social media research. Given that altmetrics could be considered a way of researching social media, Samuel wanted to ask conference attendees how they think the field is evolving, whether any of it is ethically problematic, and how we should negotiate these issues as a community.

Using the example of the ESRC, which insists on ethical approval for social media research, in contrast to most other funding bodies where it is not required, Samuel asked whether this lack of consistency across the board was good enough. Often, the funding bodies that did not require ethical approval for social media research deferred the decision to universities and institutions, shifting the responsibility elsewhere, with each institution giving a different answer.

Generally speaking, Samuel believes that ethical guidelines surrounding research on social media are too woolly and vague, particularly when individual universities are left to lead on this. Even many ethics committee members were unsure how to assess social media research, judging proposals on a case-by-case basis. Without set guidance, it is possible that one project would be approved while a similar one would be rejected. In many cases, however, researchers themselves were left to judge whether their project raised ethical concerns at all.

Given that this is a new area, using new tools to analyse a topic, surely this sort of research should go through the same checks as, for example, a physical piece of lab equipment, which could not be used unless all aspects of its safety had been reviewed. Should research around social media be subjected to the same level of rigorous scrutiny? Should this be led by funders or publishing houses, and do they require that ethical approval has been sought? No. Many journals do not require this either. In fact, only 8% of published social media research stated that ethical approval of any form had been sought.

Currently there are no gatekeepers for the ethical governance process surrounding social media data. Furthermore, if researchers make these decisions themselves, it places a heavy burden on them. Is this fair? And can such self-assessment really be objective, or even ethical in itself? Samuel asks whether all social media research should go through an ethics committee, what research ethics governance and regulation would look like, and who should be responsible. Samuel’s presentation certainly opened a discussion around these research ethics issues and, by using the recent high-profile example of Cambridge Analytica, highlighted the need for such regulation sooner rather than later.

Next up, Kirstine McDermid from Leeds University discussed hidden or forgotten clinical trial data. McDermid argues that health data should be open in order to make the most of the results. The EC, the World Health Organisation and the US Food and Drug Administration all require that data from clinical trials are made open. There is, however, a cultural lack of reporting, particularly in Big Pharma. McDermid cited Tamiflu as a classic example: the UK wasted half a billion pounds stockpiling the influenza drug, when in fact it was largely ineffective. The trial data were not available for public analysis; had the trials been open, the drug’s success rates may well have prompted further research before such a financial commitment was made.

The medic and science writer Ben Goldacre recently released a report stating that 80-90% of clinical trials carried out by European universities are missing their results. Perhaps the only way to push for more open clinical trial reporting is to provide guidelines on how to make these results available. ‘Transparency means progress’, as McDermid says, and researchers should be able to easily find clinical trials when deciding whether to push forward with new drugs.

Using altmetrics can help researchers find clinical trials and also understand the context of the conversation and other activity around them, notes McDermid. Within the Altmetric Explorer tool you can find clinical trial records, making the process of discovering this activity a lot easier and giving a good idea of the bigger picture. The activity surfaced here might even include people tweeting about how effective (or not) particular drugs are, which can help shape further research.

McDermid’s closing statement pushed for greater openness and more responsible reporting of clinical trial data, to prevent cherry-picked reporting and to serve the greater good: the public, who ultimately benefit from the development of these drugs.

Simon Linacre from Cabells was the final speaker in this session, discussing “Research Metrics in the Fake News Era”. Linacre drew on the similarities between fake news and predatory publishing; both have victims, and both can have dire consequences. What role can altmetrics play in tackling both of these issues? Cabells exists to help people find out more about journals and work out where best to publish their work. Its White List, the ‘good guys’, lists 11,200 journals which follow most regulations, while its new Black List of predatory journals is already around 9,000 entries long.

Linacre conducted a quick poll of the audience to see how many academics had received an unsolicited email asking them to publish in an unknown journal within the last week. A third of the room put their hands up. But how easy are these emails to spot if you don’t know what to look out for? There are some classic characteristics: they will invariably ask for an article processing fee, they try to upsell the journal’s reputation by using certain words, and they try to affiliate themselves with a country or university through their name. And that is before we even get to the spelling mistakes and poor grasp of grammar.
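As a purely illustrative aside (not something presented in the session), the red flags Linacre describes lend themselves to a rough heuristic check. The sketch below is one way such a check might look; the phrase lists, keywords and function names are assumptions made for illustration, not a validated detector or anything Cabells actually uses.

```python
import re

# Illustrative red-flag phrases; these word lists are assumptions, not a vetted ruleset.
FEE_PHRASES = ["article processing charge", "processing fee", "apc"]
REPUTATION_PHRASES = ["prestigious", "world-renowned", "esteemed", "high impact factor"]

def predatory_red_flags(email_text: str, journal_name: str) -> list[str]:
    """Return the red flags spotted in an unsolicited publication invitation."""
    text = email_text.lower()
    flags = []
    if any(p in text for p in FEE_PHRASES):
        flags.append("asks for an article processing fee up front")
    if any(p in text for p in REPUTATION_PHRASES):
        flags.append("upsells its reputation with grand-sounding words")
    if re.search(r"\b(american|british|european|international|university)\b",
                 journal_name.lower()):
        flags.append("borrows credibility from a country or institution in its name")
    return flags

# Example usage with an invented email and journal name
print(predatory_red_flags(
    "Dear esteemed professor, submit now! APC only $99.",
    "American University Journal of Everything",
))
```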

How can researchers use the numerous metrics at their disposal to help them make good decisions about where to publish their work? Linacre thinks that rather than relying on one metric, we could look at citations and altmetrics together, because more evidence should lead to a better decision. The extra information provided by altmetrics gives the researcher a clearer, more accurate idea of the reputation and reach of each journal.
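To make the “more evidence” idea concrete, here is a minimal sketch of weighing a citation-based signal and an attention-based signal side by side when comparing candidate journals. The journal names, figures and weights are invented for illustration and are not from the talk; in practice a researcher would look at the underlying evidence rather than a single blended number.

```python
from dataclasses import dataclass

@dataclass
class JournalEvidence:
    name: str
    median_citations: float   # citation-based signal, e.g. citations per article
    median_altmetric: float   # attention-based signal, e.g. Altmetric score per article

def combined_score(j: JournalEvidence, w_citations: float = 0.5) -> float:
    """Naive weighted blend of the two signals; the weighting is an assumption."""
    return w_citations * j.median_citations + (1 - w_citations) * j.median_altmetric

# Hypothetical candidates a researcher might compare
candidates = [
    JournalEvidence("Journal A", median_citations=4.2, median_altmetric=1.5),
    JournalEvidence("Journal B", median_citations=3.8, median_altmetric=9.0),
]
for j in sorted(candidates, key=combined_score, reverse=True):
    print(f"{j.name}: {combined_score(j):.2f}")
```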

Early career researchers must be encouraged to ‘research their research’, learning more about citations, altmetrics, and what they may be telling us. Many ECRs spend years on their research, but when it comes to selecting a journal to publish in, they may take mere seconds. We also need to better protect the victims of predatory publishing. The ‘publish or perish’ culture of academia drives some researchers, desperate to keep up, to publish in predatory journals, but through a greater awareness and understanding of the information that altmetrics can provide, Linacre hopes that the predatory journal’s time is nearly up.

The session gave three interesting examples of how altmetrics can be used: in social media research, in searching for clinical trial data, and in journal selection. In each of these cases, altmetrics were able to give a richer, more accurate overview of the issue. In each case, altmetrics were also used alongside citations, suggesting once again that no single metric should be used on its own; rather, a range of metrics can be combined to generate a more accurate analysis of the scholarly publishing world, whether that be social media research, clinical trial data, or even how to avoid being fooled by that famous greeting, “dear esteemed professor”...