Image: Photo by fran innocenti on Unsplash

– a discussion with researchers and educators

By Tamara Heck (@tamaraheck), Ina Blümel (@inablu), Sigrid Fahrer, David Lohner (@davidlohner), Jürgen Schneider (@artzyatfailing2), and Linda Visser.

How can we make the shift from closed to open practice in research and education? What incentives do researchers have to apply open science and open educational practices, and what hinders them from doing so?

Study participants chose open scenarios for their daily research or teaching practices and tested them for six to twelve months. They wrote down their experiences with and opinions on open practices in diaries. The kickoff workshop was held in April 2019 with the first round of participants. Initial discussions on relevant topics were collected in Wikiversity and online pads.

Five of the ten participants and the project leaders from DIPF Frankfurt and TIB Hannover met again one year later, on March 20th, 2020 — due to Corona in an online meeting — and shared the experiences they had gained throughout the year. The following blog post was collaboratively written by this group a few days after the workshop and summarizes the relevant aspects of open practices that came up during the meeting.

Meaning of open science and digitalization

Open science is a term which has different and highly individual meanings for researchers. “Currently, I prioritise making my research reproducible, i.e. bundling my survey, code, data and commented analyses in an effective way to make them comprehensible for others”, a participant said. Depending on their context, researchers concentrate on different open practices, such as applying them in their teaching scenarios or trying to use open source tools for their research. That means there is no “one practice fits all” logic.

In our final meeting, we intensely discussed whether open science is a kind of “add-on” to traditional research practices or whether research needs a profound change to make open science happen. Aspects mentioned here were questions about researchers’ incentives to foster open science and traditional practices that hinder a change. One participant named an example from his own experience: ”When I had to assess research candidates as a committee member, I found publication and citation numbers quite helpful as they tend to be objective criteria to compare the quality of the candidates.” The participants discussed the option of introducing alternative metrics for open science in addition to traditional indicators. They criticized the latter as being biased with regard to different disciplines and publication types, and as privileging publications on new phenomena with significant results over replication studies and research on rather unknown topics.

Further, quality-enhanced and broad-based science communication is a crucial part of open science. This includes, for example, feedback from participants of an empirical study, and the “translation” of scientific results for practitioners and interested people. To take these steps, researchers need incentives for synthesizing their research results and collaborating with practitioners to make them applicable.

Digitalization, i.e. innovations in digital infrastructure and tools, promises participation and the transformation of the research system. The participants named examples: Decentralized distribution channels enable researchers to self-publish their research data and results and get constant feedback on their research processes from the online community. A further potential of digital machine-readable formats of research output, like data and publications, is that they allow for new algorithm-based research, “like for example in Digital Humanities”, says a participant.

Felix Stalder describes in The Digital Condition the interplay between technical potentials like algorithms, communality and inter-linkage, and argues that digital information and media lead to a change in society. The research community would need to keep pace with this change as well.

Effects of openness

Open science practices come with challenges: they are time-consuming and lack incentives, was the opinion of our participating researchers. For example, many researchers think that publications in an open access journal are less reputable than publications in a journal with a high impact factor — although the impact factor has its critics and open access might lead to higher citation rates. Despite those discussions, positive “side-effects” of open science practices are perceived. “There are positive side-effects of open science though. Due to my practice of sharing research data, I felt an increased perception of my expertise by other researchers”, a study participant said, reporting on his own experience over the last twelve months.

Open practices like sharing research artefacts and processes can foster relevant collaboration and new networks. The participants see potential in applying open practices in one’s own research process (e.g. cOAlition S) to strengthen one’s reputation — open science as a sine qua non of state-of-the-art research.

Recommendations for open practices

Address all members of society

Open science does not address researchers only, but claims to open up research for society as well. “But who do we address, when we write about our research in open formats?”, a participant asked in our meeting. Does society use open research and open data, or realise they exist at all? (Initial research studies like the SALIENT-Projekt investigate this question.) Here, the participants stated again that open science practices need to come along with sound and broadly-ranged research communication. Open science should not be a practice confined to the research community.

Consider openness in its context

Another discussion developed around differing opinions about openness and the different reasons for it. For example, the sharing of research data is influenced by one’s personality, but the practical application of open science comes with barriers as well. For example, many researchers and institutions cannot afford APCs (article processing charges) to publish their research open access. This makes researchers doubt open approaches, and they tend to stick to traditional ways.

Another influencing factor is instructions by research institutions. Here, it has to be considered that top-down approaches cannot always be applied in each research context. For example, qualitative researchers face the challenge of making their data de-identified and at the same time transparent for others. Beyond this challenge, many qualitative researchers subscribe to a research paradigm that does not consider the reproducibility of research important, as research processes are highly individual. Those researchers do not see the importance of sharing research data.

The meaning of open science and its explicit application in practice should be considered in each research context, the participants concluded. Researchers should internalize the motto: as much openness as possible.

Improve technical infrastructure

“The technical infrastructure really has to be improved”, was the first statement by a study participant, who tested open tools and practices in her higher education teaching seminar and faced basic problems that hinder digital and open practices.

Even if proper infrastructure is available, alternative open tools are often not as reliable as proprietary tools, as some participants experienced. For example, open source tools like the online editing pad Cryptpad are great for collaborative and asynchronous text work, even for larger student groups. In practice, however, the online pad often crashed and students were irritated. If lecturers experience such challenges, they often decide to choose proprietary tools again. The current coronavirus situation shows that there is a need for reliable digital infrastructure and tools, which are at best open source and affordable, not only for open science, but for digital learning and teaching as well.

Guide students to open practice

“Open educational practices are new and innovative for students as well”, says a participant who tested open practices with students in her university seminar. They require skills like applying software, learning management systems and digital tools that allow open practices like collaborative working. When lecturers have those skills, they face another challenge. Collaborative working and student assessment need to be designed carefully and require organizational and methodological considerations. “Moreover, we need a ‘culture of sharing’ among students, which we cannot take for granted in all contexts and disciplines”, says the study participant. Open practices require students to organize their learning processes. Transferring this responsibility to students can be perceived as enrichment, but for some students it might rather be pressure. In most cases, we should not presuppose that students welcome it. Theories on research-based learning approaches discuss similar aspects.

Another challenge for lecturers is to communicate to students the importance of learning digital skills. Students concentrate on learning subject topics and research methods; applying digital open tools to communicate and collaborate with peers comes second. Students expect to see some benefit in open practices and to understand why they need to apply them. Lecturers need to make their decision to “invest” in open practices transparent. Here, lecturers often stress the need for digital literacy for lifelong learning and future working life. This is not an easy task. If lecturers fail, students experience open practices as additional workload.

Limiting open practices might be advantageous to ensure better learning. When students need to gain open digital skills first, or when they fail to self-manage their learning process, lecturers need to guide them. One option to foster open practice learning is to include this topic in propaedeutics or in a preparatory study course to set the basics for students to apply open practices in their further studies.

Network and best practice sharing

Our various discussions show that we cannot determine a general guideline for applying open practices in research and in learning and teaching environments. The participants face challenges like missing digital infrastructure and tools that could be solved quite easily. The understanding and perception of openness among researchers, lecturers and learners is complex, and each single research or learning context requires its own thoughts and agreements on openness. The participants think that sharing experiences and best practices is crucial to continue fostering open practices. Many of the projects, ideas and best practices developed from sharing them and communicating about them with peers. Helpful communities and networks are for example the Leibniz Research Alliance Open Science and the Wikimedia Open Science Fellows Program. Many other local and international networks are currently being established and are collected in an online sheet, where everyone can record their network and activities.

About the study

The study Open Practices of Educational Researchers (OPER) collected experiences from ten educational researchers between April 2019 and March 2020. The study participants chose open scenarios they wanted to test and applied them in their research or teaching. They wrote down their experiences and opinions in diaries and spoke about them in interviews. The study aimed at analyzing influencing factors that support or hinder researchers and educators in higher education in applying open practices. It investigates conditions for the success of open science and open educational practices. The study was funded by the Leibniz Research Alliance Open Science.

The scientific findings of the study are currently being analysed and will be published later in the year. For more participant statements and further material please refer to our meeting pad and the Wikiversity page.

Cite as

Tamara Heck, Ina Blümel, Sigrid Fahrer, David Lohner, Jürgen Schneider, & Linda Visser. (2020, April 9). Open practice in science and education – a discussion with researchers and educators who tested to be open. Zenodo.

A German version of the blog post was originally posted on the DIPF blog: