"... benchmarking requires communication and collaboration within a community to establish consensus about which questions are valid and how to evaluate their answers. Users of benchmarks must select problems and methods of measurement that are realistic, to avoid biasing their research toward results that are not broadly applicable. A successful benchmark will be accessible, affordable, clear, relevant, solvable, portable, reproducible, and scalable." -Bligaard et al.
As emphasized in the quote above, any actionable plan to move forward requires the community as a whole to agree on acceptable standards for published data. Importantly, our definition of community encompasses all researchers in the field, not just academic researchers at R1 institutions. To this end, we organized a hybrid (in-person and virtual) workshop that took place in July 2022 in Rosemont, IL. The two-day workshop covered two specific topics: 1) best practices for “standard” measurements and data reporting, and 2) standard reporting and use of benchmarking materials. To some degree, established community standards already exist, but they are scattered across hard-to-search literature articles, specialized textbooks, and the oral histories of certain academic lineages. From the perspective of a researcher trained outside our field, these resources can be difficult or expensive to obtain.
The direct outcome of this workshop is a report written for both new researchers in the field and reviewers. For researchers, the report is an easily accessible, open-access resource on best practices for common methods used in catalysis research. Each section, devoted to a specific method, will describe common applications, known limitations, specific recommendations for reporting data, and a list of references describing best practices. Additionally, the report provides researchers with hard-to-find information about benchmark materials, such as a list of well-documented industrial catalysts for specific types of chemical transformations, references for synthesis recipes of benchmark/control materials, and recommendations for reporting synthesis, characterization, and activation data in the literature.
For reviewers, the report is a standard for evaluating submitted manuscripts. In this way, it will help establish consistent grounds for acceptance or rejection of proposals and manuscripts across different journals and funding agencies. One point that we will emphasize is that the report is NOT intended to establish strict rules or methods that all researchers must follow for manuscript/proposal acceptance. Innovation is at the core of our research activities, and innovative work often strays from conventional approaches. Strict, prescriptive guidelines for particular measurements or analyses may therefore stifle innovation.
That being said, even for brand-new types of measurements, there are minimum standards that should be upheld to demonstrate their significance. For example, guided by communally accepted statistical criteria, researchers used rigorous statistical analysis and years’ worth of repeated measurements to clearly demonstrate the existence of the Higgs boson. These tools were just as important to its discovery as the state-of-the-art experimental facilities and data analysis methods that were developed alongside them. While the report from this workshop should not dictate how researchers perform measurements, it can still give reviewers grounds for asking researchers to justify deviations from conventional methods.