Potential evaluation tools and guidelines for Unjournal evaluators
As of January 2025, this is an incomplete list of tools we've tried or are considering. We aim to make this a more carefully curated and vetted list.
The Unjournal encourages the responsible use of AI tools to enhance evaluation quality. For the key principles, see the full policy: Unjournal's AI/LLM policy proposal.
- RegCheck: Automated regression and statistical analysis checking for research papers.
- Statistical reporting consistency checks: verify that p-values, test statistics, and degrees of freedom in papers are internally consistent (see the minimal sketch after this list).
- ChatGPT: AI agent-based and LLM-based tool, including fact checking and reasoning checking. The Pro model shows substantial insight and potential; see our piloting here.
- Deep-dive analysis of individual papers: upload a PDF and ask targeted questions about methodology, findings, and limitations.
- Paper-Wizard: AI research paper analysis providing structured feedback and summaries.
- refine.ink: AI-powered feedback on research papers, finding issues and limitations.
- Claude: AI assistant; the Opus model is useful for deep analysis of research papers and evaluation assistance.
- Perplexity: AI-powered search with cited sources; useful for quick literature checks and fact verification.
- AI-powered research assistant for finding related papers, extracting key findings, and answering research questions.
- Search engine for research findings: quickly see what the scientific consensus is on specific claims.
- AI-powered academic search engine for finding and exploring research papers and citation networks.
- Smart citations showing whether papers support or contradict cited claims; helps verify how a paper's references and citing works relate to its key claims.
- Vector-space approach for finding related concepts and research; we are exploring integration with Unjournal evaluation data and RePEc (see the embedding sketch after this list).
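The statistical reporting consistency check mentioned above can be illustrated with a minimal sketch. This is not any listed tool's implementation: it simply recomputes a two-sided p-value from a reported t statistic and degrees of freedom and flags a mismatch with the reported p-value. The tolerance value is an arbitrary illustrative choice; real checkers also account for rounding in the reported statistics.

```python
from scipy import stats

def check_t_test(t_value: float, df: int, reported_p: float,
                 tolerance: float = 0.005) -> bool:
    """Recompute a two-sided p-value from a t statistic and df,
    and compare it with the p-value reported in the paper."""
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)
    consistent = abs(recomputed_p - reported_p) <= tolerance
    print(f"t({df}) = {t_value}: reported p = {reported_p}, "
          f"recomputed p = {recomputed_p:.4f} -> "
          f"{'consistent' if consistent else 'INCONSISTENT'}")
    return consistent

# Example: "t(45) = 2.10, p = .04" is roughly consistent,
# while "t(45) = 2.10, p = .01" would be flagged.
check_t_test(2.10, 45, 0.04)
check_t_test(2.10, 45, 0.01)
```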
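The vector-space approach in the last item refers to embedding-based similarity search. The sketch below is only a rough illustration of the general idea, assuming the sentence-transformers library and an arbitrary embedding model; the example texts are toy strings, and this is not a description of any planned Unjournal or RePEc integration.

```python
from sentence_transformers import SentenceTransformer, util

# Illustrative model choice; any sentence-embedding model would do.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Toy example abstracts (illustrative only).
abstracts = [
    "Cash transfers and child health outcomes in low-income settings.",
    "A randomized trial of deworming and school attendance.",
    "Monetary policy shocks and firm-level investment.",
]
query = "effects of unconditional cash transfers on health"

# Embed the query and the candidate abstracts into the same vector space.
emb_abstracts = model.encode(abstracts, convert_to_tensor=True)
emb_query = model.encode(query, convert_to_tensor=True)

# Rank abstracts by cosine similarity to the query.
scores = util.cos_sim(emb_query, emb_abstracts)[0]
for abstract, score in sorted(zip(abstracts, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {abstract}")
```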
The Transparency and Openness Promotion (TOP) guidelines from the Center for Open Science provide a framework for evaluating transparency standards across eight dimensions: citation standards, data transparency, analytic methods, research materials, design and analysis, study preregistration, analysis plan preregistration, and replication.
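For evaluators who want to keep structured notes against these dimensions, here is a purely illustrative sketch: the dimension names come from the guidelines above, while the 0-3 rating scale, the dictionary layout, and the example entry are assumptions for illustration, not an official COS or Unjournal instrument.

```python
# Hypothetical structure for recording notes against the eight TOP dimensions.
# Dimension names follow the TOP guidelines; the 0-3 rating scale and the
# dictionary layout are illustrative assumptions, not an official instrument.
top_dimensions = [
    "Citation standards",
    "Data transparency",
    "Analytic methods",
    "Research materials",
    "Design and analysis",
    "Study preregistration",
    "Analysis plan preregistration",
    "Replication",
]

# Start with empty entries, then fill in a rating (0-3) and a note per dimension.
evaluation = {dim: {"rating": None, "note": ""} for dim in top_dimensions}
evaluation["Data transparency"] = {"rating": 2, "note": "Data shared, but no codebook."}

for dim, entry in evaluation.items():
    print(f"{dim}: rating={entry['rating']}, note={entry['note']!r}")
```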