# analyst
Assess the credibility and reliability of a publication, website, or individual source. Covers ownership, funding, editorial standards, track record, and known biases. Use before relying on a source for research or decisions.
npx claudepluginhub hpsgd/turtlestack --plugin analyst

This skill is limited to using the following tools:
Assess the credibility of $ARGUMENTS.
## Source type

Different source types call for different credibility frameworks:
| Type | What to assess |
|---|---|
| News publication | Ownership, editorial standards, corrections policy, press council membership |
| Trade/industry publication | Who funds it, advertiser relationships, editorial independence |
| Think tank / research organisation | Funding sources, declared mission, publication peer review |
| Government / regulatory body | Statutory mandate, transparency obligations |
| Academic institution | Peer review process, retraction history, funding disclosure |
| Company / PR | By definition advocacy — assess transparency, not neutrality |
| Individual / expert | Credentials, institutional affiliation, track record, conflicts of interest |
| Aggregator / social platform | No editorial standard — assess the underlying sources it links to |
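The framework in the table above can be sketched as a lookup table, so a script driving this skill could pull the right checklist for a given source type. This is a hypothetical illustration; the names `CREDIBILITY_CHECKLISTS` and `checklist_for` are not part of the skill.

```python
# Hypothetical sketch: the source-type framework above as a lookup table.
# Keys and checklist items come directly from the table in this document.
CREDIBILITY_CHECKLISTS = {
    "news publication": [
        "ownership", "editorial standards", "corrections policy",
        "press council membership",
    ],
    "trade/industry publication": [
        "funding", "advertiser relationships", "editorial independence",
    ],
    "think tank / research organisation": [
        "funding sources", "declared mission", "publication peer review",
    ],
    "government / regulatory body": [
        "statutory mandate", "transparency obligations",
    ],
    "academic institution": [
        "peer review process", "retraction history", "funding disclosure",
    ],
    "company / pr": [
        # advocacy by definition; assess transparency, not neutrality
        "transparency of advocacy",
    ],
    "individual / expert": [
        "credentials", "institutional affiliation", "track record",
        "conflicts of interest",
    ],
    "aggregator / social platform": [
        # no editorial standard of its own; assess what it links to
        "underlying sources",
    ],
}

def checklist_for(source_type: str) -> list[str]:
    """Return the assessment checklist for a source type (case-insensitive)."""
    return CREDIBILITY_CHECKLISTS[source_type.strip().lower()]
```

Keeping the framework as data rather than prose makes it easy to extend with new source types without touching the assessment logic.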
## Ownership and funding

Who owns the source and who funds it? This doesn't determine whether individual pieces are accurate, but it shapes systematic biases and blind spots.
Search the source's "About" page, press releases, and public records. For significant publications:
For think tanks and research organisations: check annual reports, funding pages, and donation transparency disclosures.
## Editorial standards

Does the source have published editorial standards? Assess:
Publications outside a press council or regulatory body have no external accountability mechanism.
## Track record

Has the source been accurate and reliable historically?
A single error doesn't define a source. A pattern does.
## Declared mission and known biases

What does the source say its purpose is? Is that consistent with its output?
Most sources have a perspective — the question is whether they're transparent about it. A declared advocacy organisation that publishes transparently is more credible than a neutral-seeming publication with undisclosed funding.
Common bias patterns to identify:
Note the distinction between bias (a systematic pattern) and error (a specific inaccuracy). Both matter but they're different problems.
## Output format

Produce the assessment in the following format:
## Source credibility: [Source name]
**Date of assessment:** [today]
**Source type:** [news / trade / think tank / government / academic / company / individual / other]
### Ownership and funding
[Who owns it, who funds it, transparency of disclosure]
### Editorial standards
[Press council membership, corrections policy, bylines, attribution standards]
### Track record
[Notable accuracy failures or strong reliability record — specific examples where found]
### Declared mission and known biases
[What the source says it is vs what the output pattern shows]
### Credibility assessment
| Dimension | Rating | Evidence |
|---|---|---|
| Ownership transparency | High / Medium / Low / Unknown | — |
| Editorial accountability | High / Medium / Low / Unknown | — |
| Accuracy track record | High / Medium / Low / Unknown | — |
| Bias transparency | High / Medium / Low / Unknown | — |
**Overall credibility:** [High / Medium / Low / Insufficient information]
**Appropriate use:** [What this source is reliable for, and what it's not]
### Sources used in this assessment
1. [Source](URL)
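The dimension table above leaves the roll-up to judgment. As a minimal sketch of one possible aggregation rule, assuming a conservative "weakest link" policy and treating two or more Unknowns as insufficient information (neither assumption is prescribed by this skill):

```python
# Hypothetical sketch: roll the four dimension ratings into an overall call.
# Assumes a conservative "weakest link" policy; the skill text does not
# prescribe any aggregation rule.
ORDER = {"Low": 0, "Medium": 1, "High": 2}

def overall_credibility(ratings: dict[str, str]) -> str:
    """ratings maps dimension name -> 'High' / 'Medium' / 'Low' / 'Unknown'."""
    unknowns = sum(1 for r in ratings.values() if r == "Unknown")
    if unknowns >= 2:
        # Too little evidence to rate at all.
        return "Insufficient information"
    known = [r for r in ratings.values() if r != "Unknown"]
    # Overall credibility is capped by the weakest known dimension.
    return min(known, key=ORDER.__getitem__)
```

A "weakest link" rule reflects how the dimensions interact: strong accuracy cannot compensate for opaque ownership, since undisclosed interests undermine trust in the whole record.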