Apart from looking at the most popular tools for single research activities, we can also look at which tools are used together.
Below is a first attempt at visualizing this for the first 1000 responses to our survey on scholarly communication. For this figure, we looked at the four most popular tools listed for each research activity (including, in this case, tools mentioned by respondents as ‘others’). For each tool combination, we visualized the absolute number of people (out of 1000) who use both tools.
[click on the figure for an interactive version made with plot.ly]
This figure is merely intended as an exploration of what is possible with our survey data. These preliminary results are expected to be biased due to the distribution channels (Twitter, mailing lists) used in the early weeks of the survey. Also, results plotted as relative rather than absolute values will be much more informative; this is something we are currently working on.
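The counting behind such a figure can be sketched in a few lines. The snippet below is a minimal illustration, not our actual analysis pipeline: the respondent data and tool names are made up, and the "relative" measure shown (share of one tool's users who also use the other) is just one possible normalization.

```python
from collections import Counter
from itertools import combinations

# Hypothetical sample: the set of tools each respondent reported using
# (names and data are illustrative, not actual survey results)
respondents = [
    {"ResearchGate", "LaTeX", "Mendeley"},
    {"ResearchGate", "SPSS"},
    {"LaTeX", "Mendeley"},
]

# Absolute co-occurrence: number of respondents using both tools in a pair
pair_counts = Counter()
for tools in respondents:
    for pair in combinations(sorted(tools), 2):
        pair_counts[pair] += 1

# One possible relative value: among users of tool A, the fraction
# who also use tool B
tool_counts = Counter(t for tools in respondents for t in tools)
relative = {(a, b): pair_counts[(a, b)] / tool_counts[a]
            for (a, b) in pair_counts}
```

With this toy data, the pair (LaTeX, Mendeley) is counted twice in absolute terms, and all LaTeX users in the sample also use Mendeley, so its relative value is 1.0.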
However, even looking at the patterns in the figure above, some interesting observations can be made:
- For some activities, like sharing protocols and peer review outside that organized by journals, very few people mention using any tools at all;
- Some tools are clearly being used for multiple activities (like ResearchGate for sharing publications and researcher profiling, among other activities), which is obviously what the makers of the tool (or the publishers that buy them) are hoping for;
- We can identify some popular tools that are hardly ever used together (like LaTeX and SPSS). While this specific example most likely reflects separate populations of researchers, these kinds of observations are also interesting in view of interoperability of tools.
Of course, we need way more responses to be able to draw any meaningful conclusions and to break down the results for researchers of different disciplines, career stages and countries. We especially hope that institutions/libraries will take the opportunity to distribute the survey and get the data for their institution.
Please help us spread the survey!
In this post we present a straight percentage count of the preselected options for the 17 research activities that we asked about in our survey Innovations in Scholarly Communication. These figures represent the first 1000 responses.
Please keep in mind that these first 1000 responses are biased due to the predominant distribution methods in the first weeks of the survey (Twitter and other social media, and mailing lists). The final results will probably deviate substantially from what you see below, and be more reliable. We need way more responses and hope that institutions/libraries will take the opportunity to distribute the survey and get the data for their institution.
Also note that many respondents specified which other tools they use, in addition to the preselected options. Together the 1000 respondents mentioned over 1000 (!) different tools not yet included in these counts. Further, sizeable groups skipped questions (which they were allowed to) or specified that they used “none”. More about these issues in a later post.
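A straight percentage count of this kind can be sketched as follows. This is only an illustration of the arithmetic, with made-up responses for a single activity: skipped questions are marked as `None` and excluded from the denominator, which is one way (not necessarily ours) of handling them.

```python
# Hypothetical responses for one research activity; None marks a skipped
# question (tool names and data are illustrative, not actual survey results)
responses = ["Mendeley", "Zotero", "Mendeley", None, "EndNote", "Mendeley", None]

# Percentage of answering respondents who selected each option
answered = [r for r in responses if r is not None]
percentages = {tool: 100 * answered.count(tool) / len(answered)
               for tool in set(answered)}
```

In this toy sample, 5 of 7 respondents answered, so the three mentions of Mendeley come out at 60%.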
We are not going to formally analyse these figures now but merely present them to show what types of counts are possible. We’re seeing interesting patterns already but are careful not to jump to conclusions at this early stage. Please leave comments if you wish.
On these pages we inform you on the progress being made with the survey Innovations in Scholarly Communication. We will share preliminary results, some methodological backgrounds, experiences and reactions that people share with us, and more.
Within the first month, we received over 1000 responses to our survey. A good start! (but not nearly enough, so please keep spreading the word).
In the next few posts, we’ll present some preliminary results based on the first 1000 responses. Please bear in mind these results are expected to be biased due to the way we have distributed the survey so far (mainly through Twitter and other social media and via mailing lists).
The responses show a nice distribution across research roles and disciplines, although librarians are somewhat overrepresented so far. We hope librarians will not only fill in the survey themselves, but also promote it within their institution! With more responses, we will be able to break down results according to research role etc.
Looking at the geographical distribution of respondents (based on the answers to the question ‘What is the country of your current (or last) affiliation?’) reveals that we have received responses from people in 64 countries, with most coming from the US, the UK and the Netherlands. Still quite a way to go before we have a truly global distribution!