From anysite-skills
Tracks emerging trends, viral content, hashtags, momentum, and sentiment across Twitter/X, Reddit, YouTube, LinkedIn, and Instagram using the anysite MCP server. Useful for social listening and market-shift analysis.
npx claudepluginhub anysiteio/agent-skills --plugin anysite-cli

This skill uses the workspace's default tool permissions.
Discover emerging trends and track viral content across social platforms using anysite MCP. Identify what's gaining momentum before it peaks.
Coverage: 75% - Good for Twitter, Reddit, YouTube, LinkedIn, Instagram
All data fetching uses the universal execute() meta-tool. Call discover(source, category) first whenever you need to verify endpoint names or available parameters.
Core tools:
- execute(source, category, endpoint, params) - Fetch data. Returns the first page plus a cache_key.
- get_page(cache_key, offset, limit) - Load more results from a previous execute().
- query_cache(cache_key, conditions, sort_by, aggregate, group_by) - Filter, sort, or aggregate cached data without new API calls.
- export_data(cache_key, format) - Export the full dataset as CSV, JSON, or JSONL.

Error handling: If execute() returns an error with llm_hint, follow the hint to fix the request (e.g., correcting a parameter name or adjusting the query).
Step 1: Search for Trending Content
By platform:
- execute("twitter", "search", "search_tweets", {"query": "<topic>", "count": 100}) - sort by engagement
- execute("reddit", "search", "search", {"query": "<topic>"}) - sort by upvotes
- execute("youtube", "search", "search_videos", {"query": "<topic>", "count": 50}) - sort by recency
- execute("linkedin", "post", "search_posts", {"keywords": "<topic>"}) - sort by engagement
- execute("instagram", "search", "search_users", {"query": "<topic>"}) - for hashtag/topic discovery

Step 2: Analyze Momentum
Use query_cache() to filter and sort cached results:
query_cache(cache_key, sort_by="engagement_desc", conditions=[{"field": "date", "op": ">", "value": "2024-01-01"}])
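Conceptually, query_cache applies an ordinary filter, sort, or aggregate over the cached records. A minimal plain-Python sketch of those semantics (the record fields here are illustrative assumptions, not the actual cache schema):

```python
# Plain-Python sketch of query_cache semantics over cached records.
records = [
    {"date": "2023-12-30", "engagement": 500},
    {"date": "2024-02-01", "engagement": 120},
    {"date": "2024-03-15", "engagement": 900},
]

# conditions=[{"field": "date", "op": ">", "value": "2024-01-01"}]
filtered = [r for r in records if r["date"] > "2024-01-01"]

# sort_by="engagement_desc"
ranked = sorted(filtered, key=lambda r: r["engagement"], reverse=True)

# aggregate={"field": "engagement", "op": "avg"}
avg_engagement = sum(r["engagement"] for r in filtered) / len(filtered)

print([r["date"] for r in ranked])  # most-engaged first
print(avg_engagement)
```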
Check indicators:
Step 3: Track Over Time
Monitor changes:
Step 4: Report Insights
Use export_data(cache_key, "csv") to generate downloadable reports.
Deliver:
Scenario: Identify what's trending in tech/AI space
Steps:
# Twitter
execute("twitter", "search", "search_tweets", {"query": "AI OR artificial intelligence", "count": 100})
→ Filter for: Posted within 24-48h, high engagement
→ Save cache_key as twitter_cache
# Reddit
execute("reddit", "search", "search", {"query": "artificial intelligence"})
→ Filter: r/technology, r/MachineLearning, r/singularity
→ Save cache_key as reddit_cache
# YouTube
execute("youtube", "search", "search_videos", {"query": "AI news", "count": 50})
→ Filter: Published this week, views >10k
→ Save cache_key as youtube_cache
# LinkedIn
execute("linkedin", "post", "search_posts", {"keywords": "artificial intelligence"})
→ Filter: High engagement, recent
→ Save cache_key as linkedin_cache
# Filter Twitter for high-engagement posts
query_cache(twitter_cache, sort_by="engagement_desc", conditions=[{"field": "likes", "op": ">", "value": 100}])
# Filter Reddit for specific subreddits
query_cache(reddit_cache, conditions=[{"field": "subreddit", "op": "contains", "value": "technology"}])
# Aggregate YouTube view counts
query_cache(youtube_cache, aggregate={"field": "views", "op": "avg"})
# If execute() returned next_offset, paginate
get_page(twitter_cache, offset=10, limit=50)
get_page(reddit_cache, offset=10, limit=50)
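When execute() keeps returning a next_offset, the calls above generalize to a loop. A sketch of that pattern against a stand-in list (the local get_page only mimics the shape of the real MCP tool):

```python
# Stand-in for a cached result set of 25 rows from execute().
fake_cache = [{"id": i} for i in range(25)]

def get_page(cache, offset, limit):
    # Mimics get_page(cache_key, offset, limit): returns an empty
    # page once the offset runs past the end of the dataset.
    return cache[offset:offset + limit]

all_rows, offset, limit = [], 0, 10
while True:
    page = get_page(fake_cache, offset, limit)
    if not page:
        break
    all_rows.extend(page)
    offset += limit

print(len(all_rows))
```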
Analyze content for recurring:
- Keywords and phrases
- Company/product mentions
- Events or announcements
- Questions or concerns
For each theme:
- Platform count (how many platforms)
- Total engagement
- Growth velocity
- Sentiment distribution
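The per-theme rollup can be sketched as a simple aggregation. The post records and theme labels below are illustrative assumptions:

```python
from collections import defaultdict

# Illustrative posts; "theme" would come from your keyword analysis.
posts = [
    {"theme": "ai-agents", "platform": "twitter", "engagement": 1200},
    {"theme": "ai-agents", "platform": "reddit",  "engagement": 800},
    {"theme": "ai-agents", "platform": "youtube", "engagement": 5000},
    {"theme": "chips",     "platform": "twitter", "engagement": 300},
]

# Per theme: distinct platform count and total engagement.
summary = defaultdict(lambda: {"platforms": set(), "engagement": 0})
for p in posts:
    s = summary[p["theme"]]
    s["platforms"].add(p["platform"])
    s["engagement"] += p["engagement"]

for theme, s in summary.items():
    print(theme, len(s["platforms"]), s["engagement"])
```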
Flag trends with:
- Presence on 3+ platforms
- Engagement growing >50% daily
- Positive or controversial sentiment
- Coverage by influencers/media
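Those criteria combine naturally into a single predicate. A hypothetical flagging rule using the thresholds from the list above:

```python
# Hypothetical rule combining the flagging criteria; thresholds
# (3+ platforms, >50% daily growth) come from the checklist above.
def is_hot_trend(platform_count, daily_growth_pct, sentiment, influencer_coverage):
    return (platform_count >= 3
            and daily_growth_pct > 50
            and sentiment in {"positive", "controversial"}
            and influencer_coverage)

print(is_hot_trend(4, 80, "positive", True))   # qualifies
print(is_hot_trend(2, 120, "positive", True))  # too few platforms
```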
Expected Output:
Scenario: Monitor hashtag growth and adoption
Steps:
# Instagram - discover users/content around the hashtag
execute("instagram", "search", "search_users", {"query": "sustainability"})
→ Save cache_key as ig_cache
# Twitter
execute("twitter", "search", "search_tweets", {"query": "#sustainability", "count": 100})
→ Track tweet volume over time
→ Save cache_key as tw_cache
# LinkedIn
execute("linkedin", "post", "search_posts", {"keywords": "sustainability"})
→ Check professional adoption
→ Save cache_key as li_cache
# Sort by recency and engagement
query_cache(tw_cache, sort_by="date_desc")
query_cache(ig_cache, sort_by="followers_desc")
Hashtag velocity:
- Posts in last 24h vs. previous 24h
- Engagement rate change
- New accounts using hashtag
- Geographic spread
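The 24h-vs-previous-24h comparison reduces to counting post timestamps in two windows. A sketch with made-up timestamps:

```python
from datetime import datetime, timedelta

# Illustrative timestamps; in practice these come from the cached posts.
now = datetime(2024, 6, 2, 12, 0)
post_times = (
    [now - timedelta(hours=h) for h in (1, 3, 5, 8, 20, 23)]  # last 24h
    + [now - timedelta(hours=h) for h in (30, 40, 47)]        # previous 24h
)

last_24h = sum(1 for t in post_times if now - t <= timedelta(hours=24))
prev_24h = sum(1 for t in post_times
               if timedelta(hours=24) < now - t <= timedelta(hours=48))

growth_pct = (last_24h - prev_24h) / prev_24h * 100 if prev_24h else float("inf")
print(last_24h, prev_24h, growth_pct)
```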
Compare early vs. recent posts:
- Topic shifts
- Audience changes
- Influencer involvement
- Commercial adoption
export_data(tw_cache, "csv")
export_data(ig_cache, "json")
→ Share downloadable reports
Based on growth curve:
- Early stage (accelerating)
- Peak stage (plateauing)
- Decline stage (slowing)
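A rough way to place a hashtag on that curve is to compare consecutive day-over-day changes in post volume. A hypothetical classifier:

```python
# Hypothetical stage classifier from a series of daily post counts.
def classify_stage(daily_counts):
    """Compare the last two day-over-day changes in volume."""
    d1 = daily_counts[-2] - daily_counts[-3]
    d2 = daily_counts[-1] - daily_counts[-2]
    if d2 > d1 > 0:
        return "early (accelerating)"
    if d2 < 0:
        return "decline (slowing)"
    return "peak (plateauing)"

print(classify_stage([10, 30, 80]))  # growth speeding up
print(classify_stage([80, 90, 92]))  # growth flattening
print(classify_stage([90, 80, 60]))  # volume falling
```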
Expected Output:
Scenario: Find emerging discussions in specific communities
Steps:
execute("reddit", "posts", "get", {"subreddit": "technology"})
→ Get top posts from last week
→ Save cache_key as reddit_tech_cache
# Sort cached posts by engagement
query_cache(reddit_tech_cache, sort_by="upvotes_desc")
# Aggregate engagement metrics
query_cache(reddit_tech_cache, aggregate={"field": "upvotes", "op": "avg"})
For each high-momentum post:
execute("reddit", "search", "search", {"query": "<post topic>"})
→ Deeper analysis
Calculate:
- Upvotes per hour
- Comment velocity
- Award count
- Controversial score
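The velocity metrics reduce to simple rates; award count and controversial score come straight from the post record. A sketch with an illustrative post (field names assumed):

```python
# Illustrative momentum metrics for one Reddit post; field names assumed.
post = {"upvotes": 1200, "num_comments": 340, "age_hours": 8}

upvotes_per_hour = post["upvotes"] / post["age_hours"]
comments_per_hour = post["num_comments"] / post["age_hours"]

print(upvotes_per_hour, comments_per_hour)
```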
From high-momentum posts:
- What problems are discussed?
- What solutions are proposed?
- What companies/products mentioned?
- What sentiment (positive, negative, concerned)?
Check if trending Reddit topics appear on:
- Twitter: execute("twitter", "search", "search_tweets", {"query": "<topic>"})
- LinkedIn: execute("linkedin", "post", "search_posts", {"keywords": "<topic>"})
- YouTube: execute("youtube", "search", "search_videos", {"query": "<topic>"})
Expected Output:
- execute("twitter", "search", "search_tweets", {"query": ..., "count": N}) - Find tweets, filter by engagement
- execute("twitter", "user", "get", {"username": ...}) - Check influencer adoption
- execute("reddit", "search", "search", {"query": ...}) - Find discussions
- execute("reddit", "posts", "get", {"subreddit": ...}) - Get subreddit posts and momentum
- execute("reddit", "user", "get", {"username": ...}) - Get user details
- execute("youtube", "search", "search_videos", {"query": ..., "count": N}) - Find trending videos
- execute("youtube", "video", "video", {"video": ...}) - Track view velocity
- execute("youtube", "video", "video_comments", {"video": ..., "count": N}) - Gauge interest
- execute("linkedin", "post", "search_posts", {"keywords": ...}) - Professional trends
- execute("linkedin", "company", "get", {"company": ...}) - Company details
- execute("instagram", "search", "search_users", {"query": ...}) - Discover users/hashtags
- execute("instagram", "post", "post", {"post": ...}) - Engagement metrics
- get_page(cache_key, offset, limit) - Load additional results from any execute() call
- query_cache(cache_key, conditions, sort_by, aggregate, group_by) - Filter, sort, aggregate cached data
- export_data(cache_key, "csv"|"json"|"jsonl") - Export datasets for reporting

Trend Stages:
Emergence (0-20% awareness)
Growth (20-50% awareness)
Peak (50-80% awareness)
Decline (80-100% awareness)
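The awareness bands above map directly to stage labels. A small helper, assuming awareness has been estimated as a percentage:

```python
# Map the awareness bands above to trend-stage labels.
def trend_stage(awareness_pct):
    if awareness_pct < 20:
        return "Emergence"
    if awareness_pct < 50:
        return "Growth"
    if awareness_pct < 80:
        return "Peak"
    return "Decline"

print(trend_stage(10))  # Emergence
print(trend_stage(65))  # Peak
```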
Momentum Indicators:
Chat Summary:
CSV Export (via export_data(cache_key, "csv")):
JSON Export (via export_data(cache_key, "json")):
Ready to discover trends? Ask Claude to help you identify emerging topics, track viral content, or monitor market shifts across social platforms!