Methodology

How we audit peptide vendors

We audit what vendors publish, not what we test in a lab. Every claim in a vendor's profile links to an artifact a third party can re-check — a public COA URL, a Wayback snapshot, a lab portal lookup. The score is mechanical: it falls out of the evidence; it is not a judgment call.

Identity

Vendor name, primary URL, earliest Wayback Machine snapshot, and approximate domain age. If no real site can be found, that fact is recorded with a red flag.
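
To illustrate the earliest-snapshot step, here is a minimal sketch against the Wayback Machine's CDX API, whose JSON output is a header row followed by capture rows with a 14-digit timestamp in the second column. The function names and the whole-years age heuristic are illustrative, not the production audit code.

```python
from datetime import datetime, timezone

def earliest_snapshot(cdx_rows):
    """Given parsed CDX JSON (header row + data rows, ascending by
    timestamp), return the earliest capture as a UTC datetime, or None."""
    if len(cdx_rows) < 2:
        return None  # header only: no captures found
    ts = cdx_rows[1][1]  # second column is the 14-digit timestamp
    return datetime.strptime(ts, "%Y%m%d%H%M%S").replace(tzinfo=timezone.utc)

def approx_domain_age_years(first_seen, now=None):
    """Approximate age from the first Wayback capture to now, in whole years."""
    now = now or datetime.now(timezone.utc)
    return (now - first_seen).days // 365
```

The input would come from a request like `https://web.archive.org/cdx/search/cdx?url=<domain>&output=json&limit=1` (API shape as documented by the Internet Archive; treat it as an assumption here).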

COA program

The substantive section. We record:

  • pagePublic — is there a public COA index reachable without login?
  • totalCoasFound — count of distinct COAs visible.
  • dateRange — earliest and latest COA dates.
  • cadence — per-batch / monthly / quarterly / yearly / one-off / none / unknown.
  • everyBatchClaimed — does the vendor claim it, and do the COA volume and dating support that claim?
  • labs — which lab(s) issued the COAs, and can any be independently verified through a portal or third-party lookup?
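
Taken together, the fields above can be pictured as a single record. The shape and values below are illustrative only, not a real vendor:

```python
# Hypothetical shape of a COA-program record; field names mirror the
# bullet list above, values are made up for illustration.
coa_program = {
    "pagePublic": True,
    "totalCoasFound": 42,
    "dateRange": {"earliest": "2022-01-15", "latest": "2025-06-30"},
    "cadence": "per-batch",
    "everyBatchClaimed": True,
    "labs": [
        {"name": "Example Analytical Labs", "portalVerifiable": True},
    ],
}

# The cadence field is an enumeration, matching the values listed above.
CADENCES = {"per-batch", "monthly", "quarterly", "yearly",
            "one-off", "none", "unknown"}

def valid_cadence(record):
    """Reject records whose cadence falls outside the enumerated set."""
    return record["cadence"] in CADENCES
```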

Authenticity

Whether COA bottle photos show a vial with the vendor's branding (or a generic stock vial), whether the COA template is shared across other vendor sites, and a list of standardized red flags.

A note on shared templates. A COA template that appears across multiple vendors can mean two very different things: (a) several small US vendors use the same independent commercial lab as a client (legitimate — the lab serves many; the COA names the client), or (b) several "different" brands resell the same upstream bulk product and re-use the supplier's COA verbatim (a red flag — the COA is not vendor-specific). We distinguish the two by checking whether the lab is independently verifiable (US-based, named, contactable, with a portal or signed report) and whether the client field, batch numbers, and product specifics differ between vendors using the same lab. Only the second pattern is a red flag.
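
That distinction can be sketched as a small check. The client/batch/product field names are hypothetical, and this is a simplified heuristic, not the full audit logic:

```python
def shared_template_red_flag(coa_a, coa_b, lab_verifiable):
    """Sketch of the distinction above: a shared template across two
    vendors is only a red flag when the lab cannot be independently
    verified, or the COAs are not vendor-specific (identical client,
    batch, and product fields)."""
    identical = (
        coa_a["client"] == coa_b["client"]
        and coa_a["batch"] == coa_b["batch"]
        and coa_a["product"] == coa_b["product"]
    )
    return (not lab_verifiable) or identical
```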

Scoring rubric (recent-cadence weighted)

The single most discriminating signal in this market is recent cadence — how many COAs the vendor has actually published in the last 90 days. A vendor that produced fifty third-party reports five years ago and stopped publishing two years ago does not have a current COA program. The rubric reflects that.

Each vendor record carries a computed recentCadence block: last90DaysCount, avgPerMonthLast90Days, and daysSinceLatest. The letter grade falls out of those numbers plus the qualitative criteria below.
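
A minimal sketch of how that block could be computed from a list of COA issue dates. The 90-day window comes from the text; treating the window as three months for the per-month average is an assumption:

```python
from datetime import date

def recent_cadence(coa_dates, today):
    """Compute the recentCadence block from COA issue dates.
    Field names match the audit record described above."""
    recent = [d for d in coa_dates if (today - d).days <= 90]
    latest = max(coa_dates) if coa_dates else None
    return {
        "last90DaysCount": len(recent),
        # 90 days treated as three months (assumption for illustration)
        "avgPerMonthLast90Days": round(len(recent) / 3, 2),
        "daysSinceLatest": (today - latest).days if latest else None,
    }
```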

  • A (Recommended). Active third-party verification: public batch-level COAs from a named independent lab; at least one COA verifiable through the lab's own portal; an average of ≥4 COAs/month over the last 90 days; branded vials matching the vendor; no red flags.
  • B (Acceptable, with caveats). Active but partial: ≥1 COA in the last 90 days from a named third-party lab, but missing one of lab portal verification, branded vials, or per-batch coverage.
  • C (Caution: stale program). Stale but substantive: zero COAs in the last 90 days, but a real historical record (≥10 COAs from a named third-party lab, latest within the last six months).
  • D (Avoid: thin evidence). Stale or thin: 1–3 stale COAs total, OR latest is 6–24 months ago, OR no lab portal verification AND generic vial images, OR anonymous lab.
  • F (Do not buy: no verifiable evidence). No verifiable COAs, or active red flags: no public COAs ever, OR latest is older than 24 months with fewer than 10 COAs total, OR copy-pasted, vendor-self-issued, fabricated, or inaccessible COAs.

Where additional ClaimReview evidence documents a fabricated-scam-site association (e.g. Polaris on peptidescore.com), we surface the descriptor as "Scam risk — fabricated rankings cited elsewhere."
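
The letter-grade assignment can be approximated as a function of the recentCadence block plus the qualitative flags. This is a simplified sketch of the rubric with the ClaimReview scam descriptor and some edge cases omitted; the parameter names are illustrative:

```python
def grade(rc, total, portal_verified, branded_vials, per_batch,
          named_lab, red_flags):
    """Simplified grade function. rc is the recentCadence block;
    thresholds come straight from the grade definitions above."""
    if total == 0 or red_flags:
        return "F"  # no COAs ever, or active red flags
    if (rc["last90DaysCount"] == 0 and rc["daysSinceLatest"] is not None
            and rc["daysSinceLatest"] > 730 and total < 10):
        return "F"  # latest older than 24 months and fewer than 10 total
    if (rc["avgPerMonthLast90Days"] >= 4 and portal_verified
            and branded_vials and per_batch and named_lab):
        return "A"  # active, fully verified program
    if rc["last90DaysCount"] >= 1 and named_lab:
        return "B"  # active but missing one supporting criterion
    if (rc["last90DaysCount"] == 0 and total >= 10 and named_lab
            and rc["daysSinceLatest"] <= 180):
        return "C"  # stale but substantive history
    return "D"      # stale or thin evidence
```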

When we re-audit and a vendor's recentCadence shifts, the grade shifts. The rubric is a function, not a vibe.

Evidence policy

Every audit record includes an evidence array — labeled URLs that anyone can re-check. If we can't verify a field, we leave it null and explain why. We do not fabricate. We do not paraphrase claims into facts.
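
A sketch of that convention: a field we could not verify stays null, with an explanation alongside it. The labels, URLs, and field names here are hypothetical:

```python
# Hypothetical evidence entries; the label/url/note shape is illustrative.
evidence = [
    {"label": "Public COA index", "url": "https://example-vendor.test/coas"},
    {"label": "Lab portal lookup", "url": None,
     "note": "Lab names no portal; could not verify independently."},
]

def unverified(entries):
    """Return the entries left null (unverifiable), each with its note."""
    return [e for e in entries if e.get("url") is None]
```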

Reproducibility

Every audit data point is a link or a date. You can reproduce any score yourself with the same browser and the same public URLs we used. The complete audit corpus — every evidence[] URL, every red flag, the reasoning behind every score — is published as JSON under /data/vendors/ on this site and is licensed for re-use under CC BY 4.0. The rubric in this document is the canonical methodology; the build is deterministic from the rubric and the inputs.
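
Reading one published record back could look like the following. The JSON shape is an assumption based on the fields described in this document, not a guaranteed schema:

```python
import json

def load_vendor_record(raw_json):
    """Parse one published record (shape assumed for illustration) and
    pull out the fields needed to re-run the rubric by hand."""
    record = json.loads(raw_json)
    return {
        "vendor": record["vendor"],
        "grade": record["grade"],
        "recentCadence": record["recentCadence"],
        # keep only evidence entries that actually carry a URL
        "evidence": [e["url"] for e in record.get("evidence", [])
                     if e.get("url")],
    }
```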

Disputes & re-audits

Vendors can request a re-audit at audit@realpeptidescores.com. Provide the URLs you'd like considered. Re-audits are timestamped and the prior audit is preserved in the record history. We will not change a score in exchange for payment.