SourceVerify Reference Identity Standard
The only way to trust citation verification results.
No black boxes. No guessing. Just transparent, auditable, explainable outcomes.
Hallucinated citations mix real and fake details in endless combinations. Any field could be wrong—or almost right.
Guessing isn't an option. You need a systematic way to compare what's claimed against what's real, and show your work.
LLMs are pattern matchers, not fact-checkers. When an AI reports "60% confidence" or "likely valid", it isn't calculating a probability; it's producing plausible text with no connection to ground truth. To trust verification results, you need transparent rules that can be audited.
"This citation is probably valid (60% confidence)."
...but how did it calculate 60%? It didn't, and it can't.
...what evidence did it check?
...pattern matching isn't verification.
Every field in a citation gets exactly one label. No fuzzy logic, no probability scores: just clear, explainable results. A minimal code sketch of the label set follows the definitions below.
MATCH: Citation field is identical to the evidence after normalization. For websites with no listed authors, an organizational author counts as MATCH.
CLOSE MATCH: Same value with minor spelling errors. Displayed separately for transparency, but counts as MATCH in verification logic.
CONTAINS: Citation field is a substring or abbreviation of the evidence.
ABSENT: The citation doesn't provide this field. An omission is not an error; it's neutral.
INCONCLUSIVE: Citation provides this field, but the evidence source lacks data for it. Not found ≠ wrong.
CONTRADICTION: Evidence explicitly shows a DIFFERENT value than the citation claims.
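To make the labels concrete, here is a minimal TypeScript sketch. The label identifiers are inferred from the rule text below (match, contains, absent, inconclusive, contradiction), and the normalize and labelField functions are illustrative stand-ins, not the standard's actual normalization or comparison rules.

```typescript
// Field labels as inferred from the SVRIS rule text. A "close match"
// (same value with minor spelling errors) is displayed separately but
// folds into MATCH for verification logic, so it needs no label of its own.
type FieldLabel =
  | "match"          // identical to the evidence after normalization
  | "contains"       // substring or abbreviation of the evidence
  | "absent"         // citation omits the field: neutral, not an error
  | "inconclusive"   // evidence source has no data for this field
  | "contradiction"; // evidence shows a different value

// Illustrative normalization only: lowercase, strip punctuation,
// collapse whitespace. The standard's actual rules may differ.
function normalize(value: string): string {
  return value
    .toLowerCase()
    .replace(/[^\p{L}\p{N}\s]/gu, "")
    .replace(/\s+/g, " ")
    .trim();
}

// Toy comparator for a single field (no close-match handling).
function labelField(cited: string | null, found: string | null): FieldLabel {
  if (cited === null) return "absent";        // citation omits the field
  if (found === null) return "inconclusive";  // evidence lacks the field
  const c = normalize(cited);
  const f = normalize(found);
  if (c === f) return "match";
  if (f.includes(c)) return "contains";
  return "contradiction";
}
```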
Based on the field labels, SVRIS computes exactly one of four outcomes. The logic is public and reproducible; a code sketch follows the rule listing below.
Document confirmed with strong evidence
Title or DOI match found with at least one other matching field.
= (title OR DOI) match + ≥2 other matches (no contradiction)
= (title OR DOI) match + ≥1 other match + ≥2 contains (no contradiction)
Document found with minor issues
Partial match with corroboration, or strong match with one contradiction.
= title MATCH + identifier MATCH (definitive)
= title match + ≥1 other match + ≤1 contradiction
= title absent + ≥2 match (no contradiction)
= ≥3 matches + title/ID absent (no contradiction)
= title contains + match (no contradiction)
Found something but uncertain
A title match or partial match was found, but without enough corroboration or with a contradiction.
= title strong match alone (no contradiction)
= title match + ≥2 positive signals
= title match + contains + 1 contradiction
= ≥2 contains + ≤2 inconclusives
= ≥2 positives + contradiction in a supporting field (not title)
= title CONTRADICTION + identifier MATCH + positives
Document not found
No title or DOI match found, or too many contradictions.
= title unconfirmed + no DOI match (article not found)
= title CONTRADICTION + no identifier match (different article)
= no title/DOI match (only other fields match)
= single contains + mostly inconclusives
= title contains alone (no support)
= matches ≤ contradictions (signals evenly split or worse)
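To make the tiers concrete, here is a TypeScript sketch of how an implementation might encode them, covering a representative subset of the rules above in order. The outcome names VERIFIED, PARTIAL_MATCH, UNCERTAIN, and NOT_FOUND are placeholders for the four tiers, not identifiers from the specification; only the published standard is authoritative.

```typescript
// FieldLabel as in the sketch above.
type FieldLabel = "match" | "contains" | "absent" | "inconclusive" | "contradiction";

// Hypothetical identifiers for the four outcome tiers.
type Outcome = "VERIFIED" | "PARTIAL_MATCH" | "UNCERTAIN" | "NOT_FOUND";

// fields maps field names ("title", "doi", "authors", "year", ...) to labels.
function svrisOutcome(fields: Record<string, FieldLabel>): Outcome {
  const title = fields["title"] ?? "absent";
  const doi = fields["doi"] ?? "absent";
  const idMatch = title === "match" || doi === "match";

  // Labels of supporting fields (everything except title and DOI).
  const support = Object.entries(fields)
    .filter(([name]) => name !== "title" && name !== "doi")
    .map(([, label]) => label);
  const n = (l: FieldLabel) => support.filter((x) => x === l).length;
  const matches = n("match");
  const contains = n("contains");
  const contradictions = Object.values(fields)
    .filter((l) => l === "contradiction").length;

  // Tier 1: document confirmed with strong evidence.
  if (idMatch && contradictions === 0) {
    if (matches >= 2) return "VERIFIED";                  // ≥2 other matches
    if (matches >= 1 && contains >= 2) return "VERIFIED"; // ≥1 match + ≥2 contains
  }

  // Tier 2: document found with minor issues.
  if (title === "match" && doi === "match") return "PARTIAL_MATCH"; // definitive pair
  if (title === "match" && matches >= 1 && contradictions <= 1) return "PARTIAL_MATCH";
  if (title === "absent" && matches >= 2 && contradictions === 0) return "PARTIAL_MATCH";
  if (title === "contains" && matches >= 1 && contradictions === 0) return "PARTIAL_MATCH";

  // Tier 3: found something but uncertain.
  if (title === "match" && contradictions === 0) return "UNCERTAIN";    // title alone
  if (title === "match" && matches + contains >= 2) return "UNCERTAIN"; // ≥2 positives
  if (title === "match" && contains >= 1 && contradictions === 1) return "UNCERTAIN";
  if (contains >= 2 && n("inconclusive") <= 2 && contradictions === 0) return "UNCERTAIN";
  if (title === "contradiction" && doi === "match" && matches + contains >= 1) return "UNCERTAIN";

  // Tier 4: document not found. The default when no rule above fires,
  // e.g. no title/DOI signal, or matches ≤ contradictions.
  return "NOT_FOUND";
}
```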
See how SVRIS computes outcomes. Select labels for each field and watch the result update in real time.
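As a usage example, here is the hypothetical svrisOutcome sketch above applied to a citation whose title and DOI match but whose year is contradicted:

```typescript
// Title + DOI both match (the "definitive" pair), but the year is
// contradicted, so the sketch lands in the second tier: found with
// minor issues, not fully verified.
const outcome = svrisOutcome({
  title: "match",
  doi: "match",
  authors: "match",
  year: "contradiction",
  journal: "inconclusive",
});
console.log(outcome); // "PARTIAL_MATCH"
```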
See exactly why a citation was verified or rejected. No hidden logic, no unexplainable AI decisions.
Every result can be manually verified against the standard. Perfect for research integrity and compliance.
The full specification is public. Build your own implementation, or verify ours. No vendor lock-in.
SourceVerify is powered by SVRIS. Try it free.