An invitation to contribute!
The focus of the theme is to look at new creative ways researchers are in dialogue, engaging, and co-creating with the wider public: what is being learned from creative styles on YouTube, social media, and the like, and what science and scholarship are putting in place for their communications from the tech sector’s practices of ‘design research’, co-creation, and agile approaches.
Virtually no new product or service from the private or public sector is brought to market without some form of ‘design research’: essentially, talking with communities and asking them questions. So how is R&D using ‘design research’ methods? One size does not fit all, and sometimes new can also mean the rediscovery of forgotten conventions, which is why we want to look at a variety of practices: the projects, literature, and programmes that are engaging in these processes.
Simply having better, more scientific, or evidence-led policy and communications no longer seems enough when we are confronted by a ‘post-truth’ world where ‘anti-science’ views are being encouraged. These new forms of deliberative public engagement would seem to offer the public a richer experience, a sense of ownership, and exposure to the workings of science and scholarship.
The United Nations Sustainable Development Goals (SDGs) have provided a framework to anchor the use value of R&D, with Open Science clearly flagged as important, to the extent of being called ‘Science for the Future’. The recent announcement of a project partnership between the World Health Organization (WHO) and the Wikimedia Foundation on trusted information about COVID-19 shows the need for trusted channels, and how new engagement can be practised with Wikimedia’s 250,000 volunteer editors to combat what the WHO calls an ‘infodemic’:
‘an overabundance of information and the rapid spread of misleading or fabricated news, images, and videos.’
WHO
So what’s so different about Open Science Communication compared to standard Science Communication? Part of the answer is that Open Science acknowledges that there are systemic problems in the functioning of academia that need fixing, from its basic efficacy to knowledge equity and the diversity of participation in scholarship.
Digitisation has brought a new toolbox for scholars, scientists, and the public. The Open Science community is using this moment of change to try to fix the social and technical problems that, in one way or another, revolve around the broken knowledge chain of:
research > review > condensed authoritative information
If you would like to contribute or have any questions, please contact Simon Worthington, GenR Editor, via Twitter @gen_R_ (DMs open), chat on Matrix / Element, or email simon@genr.eu.
Image by Joseph Mucira from Pixabay – Pixabay License. Free for commercial use. No attribution required
There are scientists in the fields of climate and health research who review press articles and give feedback on how credible they are. I am a member of the climate branch: Climate Feedback.
https://climatefeedback.org
As these reviews take time, they are mainly feedback for the newspapers/editors and science enthusiasts. The reviews tend to be too late to warn the public.
I have been wondering whether one could set up a system where everyone can make such reviews, and the reviews of scientists are used to determine how well each participant does. Such a system could review many more press articles, and do so much faster.
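To make the calibration idea concrete, here is a minimal sketch in Python, purely as an illustration rather than part of the proposal: each lay reviewer is weighted by how closely their past grades matched the scientists’ grades on the same articles, and an article’s combined grade is the weighted average. The 1–5 scale, names, and numbers are all assumptions.

# Illustrative sketch only (not part of the proposal): weight each lay
# reviewer by past agreement with the scientists' grades, then combine
# grades for a new article. The 1-5 scale and all numbers are invented.

def reviewer_weight(past_grades, expert_grades):
    # Mean absolute error against expert grades; smaller error -> larger weight.
    errors = [abs(g - e) for g, e in zip(past_grades, expert_grades)]
    mae = sum(errors) / len(errors)
    return 1.0 / (1.0 + mae)

def combined_grade(article_grades, weights):
    # Weighted average of the grades that reviewers gave one article.
    total = sum(weights[r] * g for r, g in article_grades.items())
    return total / sum(weights[r] for r in article_grades)

# Two reviewers calibrated on articles that Climate Feedback also graded.
weights = {
    "reviewer_a": reviewer_weight([4, 2, 5], [4, 3, 5]),
    "reviewer_b": reviewer_weight([1, 5, 2], [4, 3, 5]),
}
print(combined_grade({"reviewer_a": 4, "reviewer_b": 2}, weights))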
Legacy social media uses fact checkers from reputable groups (such as Climate Feedback). I have no idea whether they would be willing to use such a more statistical product, but I would be happy to use it myself and would install a browser add-on to see these reviews. Open Social Media may also be more willing to implement such a system. It is a distributed system (like email), so people could opt in to using it, rather than having it forced onto them when they use Facebook or Twitter. I maintain a micro-blogging server for scientists and would implement such a system.
https://fediscience.org
Hi Victor,
Great idea. More pathways of this type for scientists to put their expertise to work are needed.
I think we can see the parts of this coming together. Plaudit has a browser plugin to rate academic works, so it could be extended to these reviews. PREreview adds a multiple-choice format as a review, and they should have a federated, portable review format: https://prereview.org/
Open Social platforms like Element / Matrix are in use in education and the public sector and allow third-party application integration, so you can federate on Open Social Media and reach large parts of the public sector workforce, who can add to reviews.
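As a thought experiment only, and not PREreview’s actual schema, a portable review could be a small structured record that any platform in the federation exchanges; the field names below are assumptions.

# Hypothetical sketch of a portable review record that federated platforms
# could exchange; field names are assumptions, not an existing schema.
from dataclasses import dataclass, asdict
import json

@dataclass
class PortableReview:
    article_url: str    # the press article being reviewed
    reviewer: str       # e.g. a Fediverse handle
    credibility: int    # grade on an agreed scale, e.g. 1-5
    confidence: str     # "low" / "medium" / "high"
    comment: str = ""

review = PortableReview(
    article_url="https://example.com/climate-article",
    reviewer="@someone@fediscience.org",
    credibility=4,
    confidence="medium",
)
print(json.dumps(asdict(review)))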
How about you write up a short blog post and we float the idea with people and see who’s working in this area?
Keep up the great work – Simon
Hi Victor,
I got talking with the authors of the paper ‘Open Science Saves Lives: Lessons from the COVID-19 Pandemic’. They have a Reddit #AMA (Ask Me Anything) on 11.11, and I mentioned your idea of a federated review system for reviewing the use of science in the media.
They suggested bringing up the idea in the question session next week.
So, to get the ball rolling and get input from people who work on open peer review, I made this Open Notebook to gather questions and sketch out what such a system could look like technically and socially.
Hope you like it https://demo.codimd.org/s/HJpVfj-tD
Hi Gen R,
Great to see enthusiasm for the idea. I will try to write my ideas up, but I am not sure I will manage to do so before the AMA.
Earlier I had a look at whether my readers on Twitter and Mastodon were interested. In addition to what I commented above, the thread also explains a first step we could take to see whether such a system is feasible.
“#ClimateFeedback is a group of climate scientists reviewing press articles on climate change, but there is only so much a few scientists can do & by the time the reviews are in & summarized the article is mostly old news.
With a larger group we could review more articles & have results while people are still reading it.
We could test how well such a larger group does (& how to statistically combine their grades) by comparing their assessments with the ones later published by Climate Feedback.
If we were to make such a feasibility study, would you be willing to participate?
You would get a mail when Climate Feedback starts a review & when it is finished, have that time to grade the article, maybe add how confident you are or a classification on the topic.” https://bonn.social/@VictorVenema/105017838789986179
https://twitter.com/VariabilityBlog/status/1315375013877481475
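One way to run that comparison, sketched here only as an illustration with invented numbers, is to measure how closely the group’s combined grades track the grades Climate Feedback later publishes for the same articles.

# Illustrative feasibility check with invented data: how closely do the
# group's combined grades track Climate Feedback's published grades?
from statistics import correlation, mean  # correlation needs Python 3.10+

group_grades = [4.1, 2.3, 1.8, 4.6, 3.2]    # combined crowd grade per article
expert_grades = [4.0, 2.0, 2.5, 5.0, 3.0]   # Climate Feedback grade per article

mae = mean(abs(g - e) for g, e in zip(group_grades, expert_grades))
r = correlation(group_grades, expert_grades)
print(f"mean absolute difference: {mae:.2f}, correlation: {r:.2f}")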
Fitting my prejudice, people on Twitter were more willing to review and people on Mastodon were more willing to build software.
P.S. Do try to crosspost the AMA on /r/open_science a few hours or a day in advance if the AMA post is available earlier.
A feasibility study with #ClimateFeedback: great stuff.
If I get it right, you’re suggesting getting a group together to compare their reviews with reviews from #ClimateFeedback, on speed, etc.
I’m up for it @mrchristian, and I can send out the message and include it in the #AMA.
The #AMA link gets announced today, 11.11, at 13:55 CET. I’ll keep you updated.