Study Design
10 “simple rules” for socially responsible science
Guidelines concerning the potentially harmful effects of scientific studies have historically focused on minimizing risk to participants. However, studies can also indirectly inflict harm on individuals and social groups through how they are designed, reported, and disseminated. As evidenced by recent criticisms and retractions of high-profile studies dealing with a wide variety of social issues, there is a scarcity of resources and guidance on how to conduct research in a socially responsible manner. As such, even motivated researchers might publish work with negative social impacts simply due to a lack of awareness. To address this, we propose 10 recommendations (“simple rules”) for researchers who wish to conduct more socially responsible science. These recommendations cover major considerations throughout the life cycle of a study, from inception to dissemination. They are not intended as a prescriptive list or a deterministic code of conduct. Rather, they are meant to help motivated scientists reflect on their social responsibility as researchers and actively engage with the potential social impact of their research.
Where do problem spaces come from? On metaphors and representational change
The challenges of problem solving do not lie exclusively in how to perform heuristic search; they begin with how we understand a given task: how the task domain and its components are cognitively represented can determine how quickly someone progresses towards a solution, whether advanced strategies can be discovered, or even whether a solution is found at all. While this challenge of constructing and changing representations was acknowledged early on in problem solving research, it has for the most part been sidestepped by focussing on simple, well-defined problems whose representation is almost fully determined by the task instructions. The established theory of problem solving as heuristic search in problem spaces therefore has little to say about it. In this talk, I will present a study designed to explore this issue, in which finding and refining an adequate problem representation is the main challenge. This exploratory case study investigated how pairs of participants acquaint themselves with a complex spatial transformation task in the domain of iterated mental paper folding over the course of several days. Participants have to understand the geometry of edges that emerges when a sheet of paper is repeatedly folded mentally in alternating directions, without the use of external aids. Faced with the difficulty of handling increasingly complex folds with limited cognitive capacity, participants are forced to look for ways to represent folds more efficiently. In a qualitative analysis of video recordings of the participants' behaviour, the development of their conceptualisation of the task domain was traced over the course of the study, focussing especially on their use of gesture and on the spontaneous occurrence and use of metaphors in the construction of new representations. Based on these observations, I will conclude the talk with several theoretical speculations regarding the roles of metaphor and cognitive capacity in representational change.
Impact evaluation for COVID-19 non-pharmaceutical interventions: what is (un)knowable?
COVID-19 non-pharmaceutical intervention (NPI) policies have been among the most important and contentious decisions of our time. Beyond even the "normal" inherent difficulties of impact evaluation with observational data, COVID-19 NPI policy evaluation is complicated by additional challenges related to infectious disease dynamics and lags, lack of direct observation of key outcomes, and a multiplicity of interventions occurring on an accelerated time scale. Randomized controlled trials are also limited by what is feasible and ethical to randomize, as well as by the sheer scale, scope, time, and resources required for an NPI trial to be informative (or at least not misinformative). In this talk, Dr. Haber will discuss the challenges of generating useful evidence for COVID-19 NPIs, survey the landscape of the literature, and highlight key controversies in several high-profile studies over the course of the pandemic. Chasing after unknowables poses major problems for the metascience/replicability movement, institutional research science, and decision makers. If the only choices for informing an important topic are "weak study design" vs "do nothing," when is "do nothing" the best choice?