#Metasci
Posts tagged #Metasci on Bluesky

For similar results, see...

#MetaSci #SocialPsyc

African Reproducibility Network (AREN)
Reproducibility Network (RN) in Africa

➡️ Applications Open: The African Reproducibility Network (AREN) Local Network Leads Training Program 2026:
africanrn.org/announcement... 
📅 Application Deadline is March 16th 2026!

#openscience #openresearch #reproducibility #metasci

#SocialPsyc #MetaSci

The Triumph of Ego Depletion
The Real Story Behind One of Social Psychology’s Most Replicated Findings

And for a recent defense of the replicability of the ego-depletion effect, see...

#SocialPsyc #MetaSci

#MetaSci #Methodology

"On the other hand, proponents of a universal approach may argue that fundamental principles such as accountability and the pursuit of truth are common to all research fields."

#MetaSci #Ethics #Methodology #AcademicSky

The inflation of Type I error rates is thought to be one of the causes of the replication crisis. Questionable research practices such as p-hacking are thought to inflate Type I error rates above their nominal level, leading to unexpectedly high levels of false positives in the literature and, consequently, unexpectedly low replication rates. In this article, I offer an alternative view. I argue that questionable and other research practices do not usually inflate relevant Type I error rates. I begin by introducing the concept of Type I error rates and distinguishing between statistical errors and theoretical errors. I then illustrate my argument with respect to model misspecification, multiple testing, selective inference, forking paths, exploratory analyses, p-hacking, optional stopping, double dipping, and HARKing. In each case, I demonstrate that relevant Type I error rates are not usually inflated above their nominal level, and in the rare cases that they are, the inflation is easily identified and resolved. I conclude that the replication crisis may be explained, at least in part, by researchers’ misinterpretation of statistical errors and their underestimation of theoretical errors.

Finally, it provides a clearer philosophical rationale for my 2024 article arguing that questionable research practices do not usually inflate relevant Type I error rates.

Open Access: doi.org/10.36850/4d3...

#PhilSci #stats #MetaSci #Methodology #AcademicSky

"The problem is that good science is, well, really hard, and irreproducibility simply comes with the job."

#MetaSci #PhilSci

Efforts to replicate portions of the scientific literature have led to widely varying and often low rates of replicability. This has raised concerns over a “replication crisis” whereby many of the statistically significant claims in the published literature are thought to be false positives, due to some combination of publication bias and widespread use of questionable research practices. However, formal meta-scientific models invoking false positives lead to conclusions that often conflict with observational findings and require additional assumptions to reconcile varying rates of replicability across areas of research. Here, we present a minimal, alternative model of how replication failures can occur even in the absence of false positives. Using our model, we show that variation in estimates of replicability across social science is well explained as an artifact of replication sample size. We additionally demonstrate that key features of reformed science and multi-site replications can be explained without false positives. Our results are consistent with evidence suggesting that file-drawer sizes are likely much smaller, and questionable research practices less abundant, than required by false positive models. We anticipate our findings will be a starting point for more formal and nuanced discussion of the health of the scientific literature and areas for improvement.

"We show that variation in estimates of replicability across social science is well explained as an artifact of replication sample size."

doi.org/10.31235/osf...

#MetaSci

#EconSky #sociology #PoliSky #PolicySky #NumbersDay #FinTwitter #PsychSky #AcademicSky #PhDSky #metasci #SciencePublishing #ResearchPublishing #stats
#StatsSky #science #Criminology #environment

If progress is not to falter, students must be trained in open research
The how and why of conducting transparent, rigorous, ethical research must be explicitly taught, say Madeleine Pownall, Charlotte Pennington and Flavio Azevedo

“Open research is about more than the tightening of analytical and methodological standards. The movement also invites us to reconsider how, and by whom, knowledge is created, shared and evaluated”

By @maddipow.bsky.social, @drcpennington.bsky.social, & @flavioazevedo.bsky.social

#MetaSci #OpenSci

At the Metascience conference in London in June 2024, the same question came up again and again. One of the panellists said something that, for me, landed as the most convincing answer yet: “Metascience ticks boxes for policy people. It sounds empirical. It sounds like better science, and less wasted resources.” And there is an implicit assumption beneath that. It carries that aura of objectivity, treats science as an “investment of capital” that should be optimised to create more innovations that are profitable to capital. This is the language of productivity, efficiency, economic growth and returns.

“Metascience ticks boxes for policy people. It sounds empirical. It sounds like better science, and less wasted resources.”

#PhilSci #STS #MetaSci #HistSci #AcademicSky

Might be interesting to colleagues from #metasci (e.g., currently at #PSE8). Thanks for your support!

Behavioural science is unlikely to change the world without a heterogeneity revolution | Nature Human Behaviour
Behavioural science increasingly informs policy, but findings are not always replicated. Bryan et al. describe an emerging heterogeneity revolution. They recommend that researchers use heterogeneity i...

For more on the "heterogeneity revolution" see Bryan et al. (2021).

#MetaSci #PsycSci

#metasci #emimcc

Psychology needs a… humility revolution | BPS
Madeleine Pownall argues that Psychology is ‘necessarily limited and incomplete’.

Humility Revolution

"Humility is not a threat to scientific authority; it is a strength. It shows disciplinary maturity, intellectual honesty, and methodological pluralism. A humility revolution must, surely, lead to better science."

By @maddipow.bsky.social

#PsycSci #MetaSci #Methodology

Psychology’s Recurring Crises: Lessons from History and Philosophy of Science
We examine what generated crisis discussions, how they tended to unfold, and how they were resolved. And we derive some lessons from history for the current replication crisis.

“Methodological and administrative solutions are valuable, but they are not enough. Rather, we need to engage with the epistemic processes and ideals at the heart of psychological knowledge production and engage very closely with critical perspectives...”

#PsycSci #Methodology #MetaSci

Revisiting Ego Depletion: Evidence from Multi-Lab Collaborations
Junhua Dang, Shanshan Xiao, Lihua Mao, Xiaoping Liu, Anna Baumert, Solenne Bonneterre, Shiyu Cai, Xiaoxi Chen, Margaux de Chanaleille...
The ego depletion effect posits that initial exertion of self-control impairs subsequent self-regulatory performance. Despite being examined in over 1000 indepe...

#MetaSci #SocialPsyc

Cultures of Trial and Error: The Narrative Side of (Open) Science
Heterogeneous communities with explicit commitments to science corrections, or what this blog series summarises under the descriptor ‘cultures of trial and error’, have existed for the longest time.

#MetaSci #OpenSci

What is Critical Metascience and Why is it Important?
If science is the subject of metascience, then metascience is the subject of critical metascience!

See also...

#MetaSci #PhilSci #Methodology

NIH Director Jay Bhattacharya talks 'replication crisis' at Duke panel, omits funding cuts
Throughout the second Trump administration, the NIH has frozen billions of dollars in research funding to universities. Those cuts were not the topic of discussion at a Duke Clinical Research Institut...

NIH - “The research arm of MAHA”

Director of the National Institutes of Health, Jay Bhattacharya, calls for a greater emphasis on applied research, especially in connection with Make America Healthy Again.

#AcademicSky #MetaSci 🧪

Promotional image announcing a research article in the open-access journal Quantitative Science Studies (QSS). The title reads “Using peer review to evaluate the societal relevance of humanities research.” The authors listed are Stijn Conix, Leander Vignero, Olivier Lemaire, Pei-Shan Chi, and Li Lin, with institutional affiliations in Belgian universities (including KU Leuven). The left side features the red QSS journal logo and the label “an open access journal,” while the right side presents the article title, authors, affiliations, and keywords.

Can peer review reliably assess the societal value and relevance of #humanities research, especially the societal contribution of published outputs? Our researchers took part in an interdisciplinary study examining these issues 👇📃 direct.mit.edu/qss/article/... #philsky #AcademicSky #HPS #metasci

Too many guns are smoking
Is the smoking gun evidence sufficient to make a scientific claim?

Author's blog...

"All four experiments are cases of finetuning and data selection."

#Physics #MetaSci #OpenSci 🧪

Effects of international sanctions on age-specific mortality: a cross-national panel data analysis
Sanctions have substantial adverse effects on public health, with a death toll similar to that of wars. Our findings underscore the need to rethink sanctions as a foreign-policy tool, highlighting the...

These causal methods are unknown to me (except instrumental variables), but how on earth can one disentangle the direct causal effect of sanctions from whatever causes both the sanctions and the poor outcomes? Are the strong claims actually well supported by the methods? #stats #metasci www.thelancet.com/journals/lan...

Ideological bias in the production of research findings
There is a robust statistical relation between ideology and research findings when researchers test a policy-relevant hypothesis.

An experiment w/ 71 research teams showed that analytical decisions made on the same dataset varied according to teams’ pre-existing ideology toward or against immigration. These choices also received ⬇️ grades from "peer reviewers". #metasci #socialpsyc #cogpsyc www.science.org/doi/full/10....

Benefits of being involved in a ReproducibiliTea journal club

📢 New preprint! Ten Simple Rules for Running a ReproducibiliTea Journal Club

🔗 doi.org/10.31222/osf...

Our aim is to equip you as early career researchers with the tools needed to lead grassroots change in research culture.

#reproducibility #openresearch #openscience #metasci #academicsky

ReproducibiliTea Calendar
ReproducibiliTea - Journal Clubs for Open Research

📅 Upcoming events - ReproducibiliTea online
Check out our calendar for more details:
reproducibilitea.org/calendar

#OpenResearch #OpenScience #metasci #academicsky

Was there a metascience study that shows how long it takes people to publish a paper on average?

I have a memory of something like x amount of months for writing, revising, submitting, etc.

#psych #metasci

Innovations in Scientific Institutions
Exciting times

"When all institutions look the same, their output tends to be the same as well... If we want to see a higher rate of scientific breakthroughs, we should pursue institutional diversity for that reason alone" by @stuartbuck.bsky.social h/t @tomstafford.mastodon.online.ap.brid.gy #MetaSci #AcademicSky

One of the most dangerous ideas in science was a misunderstanding
It's only getting more popular

At least @physicsworld.bsky.social admits to using nonsense from the literature. YouTube "warriors for science" don't need to. #metasci

www.redteamofscience.com/p/one-of-the...
