Lasting change cannot be sustained by a subset of researchers. Change must be embraced by the full community. Further, "Big Ideas" require large-scale cooperative efforts among many researchers and institutions.
We are currently running several workshop events intended to bolster community support and involvement in our efforts. These events build on the momentum and growing recognition within the catalysis research community of the need to improve the rigor and reproducibility of catalysis research, and they help translate this groundswell into a set of individual and collective actions that researchers can implement. Our current plan consists of three phases that build toward this goal.
On April 4, 2022, we hosted a virtual workshop to discuss the current state of rigor and reproducibility in the field of thermal heterogeneous catalysis. This free, virtual seminar included talks from six speakers from a diverse set of fields and one panel discussion. The goal of this seminar was to open a dialogue within the community and introduce a common language that we can all use to discuss the issues. Most of us have anecdotal stories of failures to reproduce data internally within our own groups, or have observed an apparent lack of rigor in reports in the literature, but how widespread is the problem? What are possible systemic, institutional, or individual causes? How do researchers in other fields think about these problems? Can this problem be “solved,” and what would even count as progress? What does it mean for data to be reproducible?
Recordings of the individual talks can be accessed below:
Welcome and Introduction OR "What is this effort?"
- Neil Schweitzer, Northwestern University
The Ongoing Battle for More Credible Science: Identifying Interdisciplinary Lessons
- Jennifer Tackett, Northwestern University and Editor-in-Chief, Clinical Psychological Science
Lessons Learned From Systematic Studies of Experimental Replication in Adsorption Science
- David Sholl, Georgia Institute of Technology and Oak Ridge National Lab
The Importance of Standard Operating Procedures For Catalysis Research Accelerated By Artificial Intelligence
- Annette Trunschke, Fritz Haber Institute
- John Kitchin, Carnegie Mellon University
A Unique Journal for the Publication of Reproducible Methods for the Synthesis of Organic Compounds
- Rick Danheiser, Massachusetts Institute of Technology and Editor-in-Chief, Organic Syntheses
Panel Discussion with Journal Editors
- Moderator: Bruce Gates (UC Davis)
- Panel: Susannah Scott (ACS Catalysis), Johannes Lercher (Journal of Catalysis), Davide Esposito (Nature Catalysis), Junwang Tang (Chinese Journal of Catalysis)
"... benchmarking requires communication and collaboration within a community to establish consensus about which questions are valid and how to evaluate their answers. Users of benchmarks must select problems and methods of measurement that are realistic, to avoid biasing their research toward results that are not broadly applicable. A successful benchmark will be accessible, affordable, clear, relevant, solvable, portable, reproducible, and scalable." -Bligaard et al.
As the quote above emphasizes, any actionable plan to move forward requires the community as a whole to agree on acceptable standards for published data. Importantly, our definition of community encompasses all researchers in the field, not just academic researchers at R1 institutions. To this end, we are planning a hybrid (in-person and virtual) workshop that will take place in Rosemont, IL. The two-day workshop will cover two specific topics: 1) best practices for “standard” measurements and data reporting, and 2) standard reporting/use of benchmarking materials. To some degree, community standards already exist, but they are scattered across hard-to-search literature articles, specialized textbooks, and the oral histories of some academic trees. For a researcher trained outside our field, these resources can be difficult or expensive to obtain.
The direct outcome of this workshop will be a report written for both new researchers in the field and reviewers. For researchers, the report will act as an easily accessible, open-access resource on best practices for common methods used in catalysis research. Each section, covering a specific method, will contain common applications, known limitations, specific recommendations for reporting data, and a list of references describing best practices. Additionally, the report will provide researchers with hard-to-find information about benchmark materials, such as a list of well-documented industrial catalysts for specific types of chemical transformations, references for synthesis recipes of benchmark/control materials, and recommendations for reporting synthesis/characterization/activation data in the literature.
For reviewers, the report will serve as a standard for evaluating submitted manuscripts. In this way, this resource will help establish consistent grounds for accepting or rejecting proposals and manuscripts across different journals and funding agencies. One point we will emphasize is that the report will NOT establish strict rules or methods that all researchers must follow for manuscript/proposal acceptance. Innovation is at the core of our research activities, and innovative work often strays from conventional approaches. Strict, prescriptive guidelines for particular measurements or analyses, therefore, may stifle innovation.
That being said, even for brand-new types of measurements, there are minimum standards that should be upheld to demonstrate their significance. For example, guided by communally accepted statistical criteria, researchers used rigorous statistical analysis and years’ worth of repeated measurements to clearly demonstrate the existence of the Higgs boson. These tools were just as important to the discovery as the state-of-the-art experimental facilities and data analysis methods developed alongside them. While the report from this workshop should not dictate how researchers perform measurements, it can give reviewers grounds to ask authors to justify deviations from conventional standards in their methods.
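As a small illustration of what minimum statistical reporting can look like in practice, the sketch below reports a mean with a 95% confidence interval from repeated measurements. It uses only the Python standard library; the rate values and units are hypothetical, not data from any real catalyst study.

```python
# Hypothetical example: reporting a mean reaction rate with a 95%
# confidence interval from repeated measurements. The numbers below
# are illustrative placeholders, not real experimental data.
import math
import statistics

# Five hypothetical repeated rate measurements, in mol/(g*s)
rates = [1.02e-5, 0.97e-5, 1.05e-5, 0.99e-5, 1.01e-5]

n = len(rates)
mean = statistics.mean(rates)
stdev = statistics.stdev(rates)   # sample standard deviation (n - 1)
t95 = 2.776                       # Student's t for 95% CI, 4 degrees of freedom
half_width = t95 * stdev / math.sqrt(n)

print(f"rate = {mean:.2e} +/- {half_width:.2e} mol/(g*s)  (n = {n})")
```

Reporting the number of replicates, the dispersion, and the interval alongside the mean is the kind of minimal, method-agnostic standard the report could recommend without constraining how the measurement itself is made.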
Once the community has agreed on standardized reporting practices, we can begin to strategize actionable plans that can further enhance R&R in our field. These efforts are the focus of Phase 3, which will consist of a series of virtual, topical, strategizing sessions. The sessions we currently have planned are: producing training media and workshops; establishing a database of experimental data; and forming a network of testing labs. Discussions will focus on creating inclusive, accessible infrastructure for the entire research community and on potential funding sources for essential capital and personnel effort. These sessions will result in plans for submitting funding proposals. We are actively soliciting feedback on other topics. If you have other ideas or would like to participate specifically in the planning of one of these areas, please let us know!