From anysite-skills
Tracks and analyzes content performance across Instagram, YouTube, LinkedIn, Twitter/X, Reddit using anysite MCP. Measures engagement metrics, identifies top posts, benchmarks strategies.
npx claudepluginhub anysiteio/agent-skills --plugin anysite-cli

This skill uses the workspace's default tool permissions.
Measure and optimize content performance across social platforms using anysite MCP. Track engagement, identify top performers, and refine your content strategy.
Coverage: 80% - Strong for Instagram, YouTube, LinkedIn, Twitter, Reddit
All data fetching uses the anysite MCP v2 universal meta-tools:
execute(source, category, endpoint, params) - Fetch data from any source. Returns the first page plus a cache_key.
get_page(cache_key, offset, limit) - Load more items from a previous execute() call when next_offset is returned.
query_cache(cache_key, conditions?, sort_by?, aggregate?, group_by?) - Filter, sort, and aggregate cached data without new API calls.
export_data(cache_key, format) - Export the full dataset as CSV, JSON, or JSONL. Returns a download URL.

v2 responses may include llm_hint fields with guidance on how to resolve errors; check llm_hint in error responses for specific resolution steps.

Step 1: Collect Content Data
Platform-specific:

execute("instagram", "user", "user_posts", {"user": "username", "count": 50})
execute("linkedin", "user", "user_posts", {"urn": "fsd_profile:ACoAAA...", "count": 50})
execute("twitter", "user", "user_posts", {"user": "username", "count": 100})
execute("youtube", "channel", "channel_videos", {"channel": "channel_id", "count": 30})

Step 2: Analyze Engagement
Use query_cache() on the returned cache_key to analyze without re-fetching:
query_cache(cache_key, sort_by="likes desc", aggregate="avg:likes,comments")
Calculate metrics:
Step 3: Identify Patterns
Look for:
Step 4: Optimize Strategy
Based on findings:
Step 5: Export Results
export_data(cache_key, "csv")
Returns a download URL for the full dataset.
Steps:
execute("instagram", "user", "user_posts", {"user": "username", "count": 100})
→ returns cache_key + first page of results
If more posts exist (response includes next_offset):
get_page(cache_key, offset=next_offset, limit=50)
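The execute()/get_page() pagination loop above can be sketched as a small helper. Here `execute` and `get_page` are passed in as callables standing in for the anysite MCP meta-tools, and the response shape (items, cache_key, next_offset) follows the meta-tool descriptions earlier in this document; this is a sketch, not the MCP client itself.

```python
def fetch_all(execute, get_page, source, category, endpoint, params, page_size=50):
    """Drain every page from an execute() call by following next_offset."""
    resp = execute(source, category, endpoint, params)
    items = list(resp["items"])
    # Keep requesting pages from the cache until next_offset is exhausted.
    while resp.get("next_offset") is not None:
        resp = get_page(resp["cache_key"], offset=resp["next_offset"], limit=page_size)
        items.extend(resp["items"])
    return items

# Demo with stubbed tools: two pages of cached results.
pages = {0: {"items": [1, 2], "cache_key": "k", "next_offset": 2},
         2: {"items": [3], "cache_key": "k", "next_offset": None}}
fake_execute = lambda *args: pages[0]
fake_get_page = lambda key, offset, limit: pages[offset]
all_items = fetch_all(fake_execute, fake_get_page,
                      "instagram", "user", "user_posts", {"user": "username"})
# all_items == [1, 2, 3]
```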
For each post:
- Engagement rate = (likes + comments) / follower_count
- Engagement per hour = engagement / hours_since_posted
- Content type (Reel, carousel, single image, video)
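The per-post metrics above can be computed locally once the posts are cached. A minimal sketch, assuming each post dict carries `likes`, `comments`, and an ISO 8601 `taken_at` timestamp (actual anysite field names may differ):

```python
from datetime import datetime, timezone

def engagement_metrics(post, follower_count, now=None):
    """Engagement rate and engagement-per-hour for one Instagram post."""
    now = now or datetime.now(timezone.utc)
    engagement = post["likes"] + post["comments"]
    hours = (now - datetime.fromisoformat(post["taken_at"])).total_seconds() / 3600
    return {
        "engagement_rate": engagement / follower_count,
        "engagement_per_hour": engagement / hours if hours > 0 else None,
    }

post = {"likes": 900, "comments": 100, "taken_at": "2024-01-01T00:00:00+00:00"}
m = engagement_metrics(post, follower_count=10_000,
                       now=datetime(2024, 1, 2, tzinfo=timezone.utc))
# 1,000 engagements / 10,000 followers -> rate 0.1; 1,000 / 24 h -> ~41.7 per hour
```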
Use query_cache to sort and filter:
query_cache(cache_key, sort_by="likes desc", aggregate="avg:likes,comments")
query_cache(cache_key, sort_by="likes desc")
Top 10%: Analyze for common patterns
- Topics/themes
- Visual style
- Caption style and length
- Hashtag strategy
query_cache(cache_key, group_by="type", aggregate="count:id,avg:likes,avg:comments")
Results show:
- Reels: X% of posts, Y% of engagement
- Carousels: X% of posts, Y% of engagement
- Single images: X% of posts, Y% of engagement
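The "X% of posts, Y% of engagement" breakdown mirrors what query_cache() returns with group_by="type"; if you prefer to compute it yourself on exported data, a sketch (the `type`, `likes`, and `comments` field names are assumptions):

```python
from collections import defaultdict

def type_breakdown(posts):
    """Share of posts and share of total engagement per content type."""
    counts, engagement = defaultdict(int), defaultdict(int)
    total_engagement = 0
    for p in posts:
        e = p["likes"] + p["comments"]
        counts[p["type"]] += 1
        engagement[p["type"]] += e
        total_engagement += e
    n = len(posts)
    return {t: {"post_share": counts[t] / n,
                "engagement_share": engagement[t] / total_engagement}
            for t in counts}

posts = [
    {"type": "reel", "likes": 300, "comments": 50},
    {"type": "reel", "likes": 250, "comments": 50},
    {"type": "carousel", "likes": 100, "comments": 0},
    {"type": "image", "likes": 50, "comments": 0},
]
breakdown = type_breakdown(posts)
# Reels: 50% of posts, 81.25% of engagement
```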
For each competitor:
execute("instagram", "user", "user_posts", {"user": "competitor", "count": 50})
Compare:
- Posting frequency
- Engagement rates
- Content types
- Top themes
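Once each competitor's posts are cached, the frequency and engagement comparison can be tabulated. A sketch under an assumed input shape (handle mapped to posts, follower count, and the day span the posts cover — not an anysite response format):

```python
def competitor_summary(accounts):
    """Posting frequency and average engagement rate per account."""
    out = {}
    for handle, data in accounts.items():
        rates = [(p["likes"] + p["comments"]) / data["followers"]
                 for p in data["posts"]]
        out[handle] = {
            "posts_per_week": len(data["posts"]) / (data["days"] / 7),
            "avg_engagement_rate": sum(rates) / len(rates) if rates else 0.0,
        }
    return out

accounts = {
    "me":         {"posts": [{"likes": 90, "comments": 10}], "followers": 1_000, "days": 7},
    "competitor": {"posts": [{"likes": 40, "comments": 10},
                             {"likes": 50, "comments": 0}], "followers": 2_000, "days": 7},
}
summary = competitor_summary(accounts)
# "me": 1 post/week at 10% engagement; "competitor": 2 posts/week at 2.5%
```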
export_data(cache_key, "csv")
Expected Output:
Steps:
execute("linkedin", "user", "user_posts", {"urn": "fsd_profile:ACoAAA...", "count": 100})
→ returns cache_key + first page
For company page posts:
execute("linkedin", "company", "company_posts", {"urn": {"type": "company", "value": "1441"}, "count": 100})
Use get_page(cache_key, offset, limit) if more posts exist.
Group by type:
- Text-only posts
- Image posts
- Video posts
- Article shares
- LinkedIn articles
- Polls
query_cache(cache_key, aggregate="avg:comment_count,avg:share_count", group_by="type")
For each content type:
- Average reactions
- Average comments
- Average shares
- Engagement rate
Extract themes from top posts:
- Industry insights
- Personal stories
- How-to/educational
- Company news
- Thought leadership
Group posts by:
- Day of week
- Time of day
Calculate average engagement for each group
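The day-of-week grouping above is straightforward to compute from post timestamps. A sketch, assuming each post has an ISO 8601 `posted_at` plus `reactions` and `comments` counts (field names are assumptions):

```python
from collections import defaultdict
from datetime import datetime

def avg_engagement_by_day(posts):
    """Average engagement (reactions + comments) grouped by weekday."""
    buckets = defaultdict(list)
    for p in posts:
        day = datetime.fromisoformat(p["posted_at"]).strftime("%A")
        buckets[day].append(p["reactions"] + p["comments"])
    return {day: sum(vals) / len(vals) for day, vals in buckets.items()}

posts = [
    {"posted_at": "2024-01-01T09:00:00", "reactions": 40, "comments": 10},  # a Monday
    {"posted_at": "2024-01-08T14:00:00", "reactions": 60, "comments": 10},  # a Monday
    {"posted_at": "2024-01-03T09:00:00", "reactions": 20, "comments": 0},   # a Wednesday
]
by_day = avg_engagement_by_day(posts)
```

The same bucketing works for time of day by swapping `"%A"` for the posting hour.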
Expected Output:
Steps:
execute("youtube", "channel", "channel_videos", {"channel": "channel_id", "count": 50})
→ returns cache_key + first page
Use get_page(cache_key, offset, limit) for additional videos.
For each video:
execute("youtube", "video", "video", {"video": "video_id"})
Metrics:
- Views
- Likes/dislikes
- Comments
- View velocity (views per day since upload)
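View velocity as defined above is just views divided by days since upload. A sketch, assuming the video detail response exposes a view count and an ISO 8601 publish timestamp:

```python
from datetime import datetime, timezone

def view_velocity(views, published_at, now=None):
    """Views per day since upload."""
    now = now or datetime.now(timezone.utc)
    days = (now - datetime.fromisoformat(published_at)).total_seconds() / 86_400
    return views / days if days > 0 else float(views)

v = view_velocity(50_000, "2024-01-01T00:00:00+00:00",
                  now=datetime(2024, 1, 11, tzinfo=timezone.utc))
# 50,000 views over 10 days -> 5,000 views/day
```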
query_cache(cache_key, sort_by="views desc")
Analyze top 20% by views:
- Video length
- Titles (keywords, style)
- Thumbnail patterns
- Topics/themes
- Upload timing
Check comments:
execute("youtube", "video", "video_comments", {"video": "video_id", "count": 100})
Analyze:
- Comment quality
- Questions asked
- Sentiment
- Engagement timing
Compare:
- Long-form (>10 min) vs short (<5 min)
- Tutorial vs entertainment vs review
- Series vs one-offs
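The long-form vs. short comparison can be sketched as a simple duration bucketing; `duration_seconds` and `views` are assumed field names, with the 5-minute and 10-minute cutoffs taken from the list above:

```python
def compare_by_length(videos, short_cutoff=300, long_cutoff=600):
    """Average views for short (<5 min) vs long-form (>10 min) videos."""
    short = [v["views"] for v in videos if v["duration_seconds"] < short_cutoff]
    long_form = [v["views"] for v in videos if v["duration_seconds"] > long_cutoff]
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return {"short_avg_views": avg(short), "long_avg_views": avg(long_form)}

videos = [
    {"duration_seconds": 200, "views": 1_000},
    {"duration_seconds": 250, "views": 3_000},
    {"duration_seconds": 900, "views": 10_000},
]
result = compare_by_length(videos)
# short avg 2,000 views; long-form avg 10,000 views
```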
Expected Output:
execute("instagram", "user", "user_posts", {"user": username, "count": N}) - Get posts with engagement
execute("instagram", "post", "post", {"post": post_id}) - Get detailed post metrics
execute("instagram", "post", "post_likes", {"post": post_id, "count": N}) - Analyze likers
execute("instagram", "post", "post_comments", {"post": post_id, "count": N}) - Get comments
execute("linkedin", "user", "user_posts", {"urn": "fsd_profile:ACoAAA...", "count": N}) - Get user post history
execute("linkedin", "company", "company_posts", {"urn": {"type": "company", "value": "ID"}, "count": N}) - Company page posts
execute("twitter", "user", "user_posts", {"user": username, "count": N}) - Get tweets
execute("twitter", "search", "search_posts", {"query": query, "count": N}) - Find trending tweets
execute("youtube", "channel", "channel_videos", {"channel": channel, "count": N}) - All videos
execute("youtube", "video", "video", {"video": video_id}) - Video details and metrics
execute("youtube", "video", "video_comments", {"video": video_id, "count": N}) - Comments
execute("reddit", "user", "user_posts", {"username": username, "count": N}) - User's posts
execute("reddit", "search", "search_posts", {"query": query, "count": N}) - Find popular posts
get_page(cache_key, offset, limit) - Fetch next page of results from any execute() call
query_cache(cache_key, conditions?, sort_by?, aggregate?, group_by?) - Filter/sort/aggregate cached results
export_data(cache_key, "csv"|"json"|"jsonl") - Export dataset as downloadable file

Engagement Rate:
Content Performance Score:
Score = (Engagement Rate x 40) +
(Comments/Likes Ratio x 30) +
(Share Rate x 30)
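The weighted score above is easy to compute once the components are defined. A sketch: the comments/likes ratio and a share rate of shares divided by reach are assumptions about how the components are measured, since the formula itself does not pin them down:

```python
def performance_score(engagement_rate, comments, likes, shares, reach):
    """Weighted content performance score per the formula above."""
    comments_likes_ratio = comments / likes if likes else 0.0
    share_rate = shares / reach if reach else 0.0
    return (engagement_rate * 40) + (comments_likes_ratio * 30) + (share_rate * 30)

score = performance_score(engagement_rate=0.05, comments=20, likes=400,
                          shares=10, reach=1_000)
# 0.05*40 + 0.05*30 + 0.01*30 = 2.0 + 1.5 + 0.3 = 3.8
```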
Viral Potential Indicators:
Chat Summary:
CSV Export (via export_data(cache_key, "csv")):
JSON Export (via export_data(cache_key, "json")):
Ready to analyze content? Ask Claude to help you track performance, identify top content, or optimize your posting strategy!