Guides setup, configuration, management, and troubleshooting of GEOFlow: an open-source PHP/PostgreSQL system for AI-powered GEO/SEO content generation, review workflows, and Docker-based publishing.
npx claudepluginhub joshuarweaver/cascade-ai-ml-agents-misc-1 --plugin aradotso-trending-skills-37

This skill uses the workspace's default tool permissions.
> Skill by [ara.so](https://ara.so) — Daily 2026 Skills collection.
GEOFlow is an open-source PHP/PostgreSQL system for automated GEO/SEO content production. It chains model configuration, material management, task scheduling, draft review, and front-end publishing into a single pipeline. It supports any OpenAI-compatible API, runs on Docker Compose, and exposes both a REST API and a CLI.
git clone https://github.com/yaojingang/GEOFlow.git
cd GEOFlow
cp .env.example .env
# Edit .env — set APP_SECRET_KEY, SITE_URL, and DB credentials
vi .env
# Start web + postgres + scheduler + worker
docker compose --profile scheduler up -d --build
# Front-end
open http://localhost:18080
# Admin panel
open http://localhost:18080/geo_admin/
git clone https://github.com/yaojingang/GEOFlow.git
cd GEOFlow
export DB_DRIVER=pgsql
export DB_HOST=127.0.0.1
export DB_PORT=5432
export DB_NAME=geo_system
export DB_USER=geo_user
export DB_PASSWORD=geo_password
export APP_SECRET_KEY=$(openssl rand -hex 32)
export SITE_URL=http://localhost:8080
php -S localhost:8080 router.php
open http://localhost:8080/geo_admin/
Requirements: PHP 7.4+, the pdo_pgsql and curl extensions, and a running PostgreSQL instance.
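Before a manual install, it can help to verify the extensions above are actually loaded. This is a small illustrative helper (not part of GEOFlow); the optional second argument exists only to make the function testable:

```php
<?php
// Return the names of required extensions that are not loaded.
// $loaded defaults to this PHP process's extension list.
function missing_extensions(array $required, ?array $loaded = null): array
{
    $loaded = $loaded ?? get_loaded_extensions();
    return array_values(array_diff($required, $loaded));
}

$missing = missing_extensions(['pdo_pgsql', 'curl']);
if ($missing) {
    fwrite(STDERR, "Missing extensions: " . implode(', ', $missing) . "\n");
    exit(1);
}
echo "All required extensions loaded\n";
```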
# .env (copy from .env.example)
HOST_PORT=18080
SITE_URL=http://localhost:18080
APP_SECRET_KEY=$APP_SECRET_KEY # 32+ char random string — never hardcode
CRON_INTERVAL=60 # Scheduler poll interval in seconds
TZ=Asia/Shanghai
# Database
DB_DRIVER=pgsql
DB_HOST=postgres
DB_PORT=5432
DB_NAME=geo_system
DB_USER=geo_user
DB_PASSWORD=$DB_PASSWORD # Set via environment, not hardcoded
Generate a secure key:
openssl rand -hex 32
# or
php -r "echo bin2hex(random_bytes(32));"
After first boot, log in at /geo_admin/ with:
admin / admin888

Change both immediately and rotate APP_SECRET_KEY before any public deployment.
| Service | Purpose | Profile |
|---|---|---|
| web | Front-end + admin HTTP | always |
| postgres | PostgreSQL database | always |
| scheduler | Scans tasks, writes job queue | scheduler |
| worker | Calls AI API, generates content | scheduler |
# Web only (no generation)
docker compose up -d
# Full stack
docker compose --profile scheduler up -d
# Logs
docker compose logs -f worker
docker compose logs -f scheduler
# Restart a single service
docker compose restart worker
# Stop everything
docker compose --profile scheduler down
Admin: configure model + prompts + materials
↓
Create Task (title library, model, prompt, image library, publish rules)
↓
Scheduler (bin/cron.php) → writes to job_queue table
↓
Worker (bin/worker.php) → calls AI API → generates article body
↓
Optional: insert images, build SEO meta
↓
Article enters draft → review → publish states
↓
Front-end renders article + SEO/OG tags
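The draft → review → publish lifecycle above can be sketched as a small state machine. This is a hypothetical illustration only; the real transition logic lives in includes/article_service.php:

```php
<?php
// Allowed article state transitions, as described in the pipeline.
// 'draft' -> 'published' directly is only taken on publish_mode = auto.
function can_transition(string $from, string $to): bool
{
    $allowed = [
        'draft'     => ['review', 'published'], // 'published' only via auto-publish
        'review'    => ['published', 'draft'],  // approve, or send back for edits
        'published' => [],                      // terminal state
    ];
    return in_array($to, $allowed[$from] ?? [], true);
}
```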
Navigate to AI配置中心 → AI模型管理 (AI Config Center → AI Model Management) and fill in:
API Base URL → e.g. https://api.openai.com/v1
Model Name → e.g. gpt-4o
API Key → store as $AI_API_KEY in your env

Navigate to 任务管理 → 新建任务 (Task Management → New Task):
Title Library → selects article titles
Model → which AI model to call
Prompt → which prompt template
Image Library → optional illustration set
Publish Mode → draft | auto-publish
Schedule → immediate | cron-based
The CLI (bin/geoflow) is the programmatic interface used by automation scripts and the companion skill yaojingang/yao-geo-skills.
# List available commands
php bin/geoflow help
# Check task status
php bin/geoflow task:status --task-id=<id>
# Create a task
php bin/geoflow task:create \
--title-library=<id> \
--model=<id> \
--prompt=<id> \
--publish-mode=draft
# Upload an article draft
php bin/geoflow article:create \
--title="Article Title" \
--body="<markdown content>" \
--status=draft
# Review and publish an article
php bin/geoflow article:review --article-id=<id> --action=approve
php bin/geoflow article:publish --article-id=<id>
# Run scheduler manually (one cycle)
php bin/cron.php
# Run worker manually (processes one job then exits)
php bin/worker.php --once
All endpoints under /api/v1 require Bearer token authentication. Generate a token:
php bin/api/create_token.php --name="my-integration"
# Outputs token — store in $GEOFLOW_API_TOKEN
BASE=http://localhost:18080/api/v1
TOKEN=$GEOFLOW_API_TOKEN
# List tasks
curl -H "Authorization: Bearer $TOKEN" "$BASE/tasks"
# Get task detail
curl -H "Authorization: Bearer $TOKEN" "$BASE/tasks/<id>"
# Create task
curl -X POST \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{
"title_library_id": 1,
"model_id": 2,
"prompt_id": 3,
"publish_mode": "draft"
}' \
"$BASE/tasks"
# List articles
curl -H "Authorization: Bearer $TOKEN" "$BASE/articles"
# Publish article
curl -X POST \
-H "Authorization: Bearer $TOKEN" \
"$BASE/articles/<id>/publish"
# Check job queue
curl -H "Authorization: Bearer $TOKEN" "$BASE/jobs?status=pending"
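The same calls can be made from PHP. A minimal client sketch wrapping the curl examples above (paths and auth header taken from those examples; the class name and thin error handling are illustrative, not part of GEOFlow):

```php
<?php
// Hypothetical API client for the /api/v1 endpoints shown above.
class GeoFlowClient
{
    private string $base;
    private string $token;

    public function __construct(string $base, string $token)
    {
        $this->base  = rtrim($base, '/');
        $this->token = $token;
    }

    // Join the base URL and an endpoint path.
    public function url(string $path): string
    {
        return $this->base . '/' . ltrim($path, '/');
    }

    // GET an endpoint and decode the JSON body (empty array on failure).
    public function get(string $path): array
    {
        $ch = curl_init($this->url($path));
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPHEADER     => ['Authorization: Bearer ' . $this->token],
        ]);
        $body = curl_exec($ch);
        curl_close($ch);
        return json_decode((string)$body, true) ?? [];
    }
}

// Usage:
// $client = new GeoFlowClient('http://localhost:18080/api/v1', getenv('GEOFLOW_API_TOKEN'));
// $tasks  = $client->get('tasks');
```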
<?php
// includes/db_support.php provides the connection helper
require_once __DIR__ . '/includes/db_support.php';
$pdo = get_db_connection(); // Returns a PDO instance
// Always use prepared statements
$stmt = $pdo->prepare("SELECT * FROM articles WHERE status = :status ORDER BY created_at DESC LIMIT 10");
$stmt->execute([':status' => 'published']);
$articles = $stmt->fetchAll(PDO::FETCH_ASSOC);
<?php
require_once __DIR__ . '/includes/article_service.php';
// Create a draft
$articleId = create_article([
'title' => 'My Article Title',
    'body' => "## Introduction\n\nContent here...", // double quotes so \n become real newlines
'status' => 'draft',
'task_id' => 42,
'seo_title' => 'My Article Title | Site Name',
'seo_desc' => 'Short description for search engines.',
]);
// Approve and publish
approve_article($articleId);
publish_article($articleId);
<?php
require_once __DIR__ . '/includes/ai_engine.php';
// Execute a single task job (task_id, job_id)
$result = run_task_job(5, 101); // positional args: PHP 7.4 has no named arguments
if ($result['success']) {
echo "Article created: " . $result['article_id'];
} else {
echo "Error: " . $result['error'];
}
<?php
require_once __DIR__ . '/includes/job_queue_service.php';
// Claim next available job (used by worker)
$job = claim_next_job();
if ($job) {
try {
// ... process the job ...
complete_job($job['id']);
} catch (Exception $e) {
fail_job($job['id'], $e->getMessage());
// Eligible jobs will be retried by scheduler
}
}
// Manually enqueue a task
enqueue_task_jobs(5); // task_id = 5
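When a failed job becomes eligible for retry is internal to the scheduler; a common policy is exponential backoff, sketched here as an assumption (the function name and formula are illustrative, not GEOFlow's actual code):

```php
<?php
// Illustrative retry delay: attempt 1 -> 60s, 2 -> 120s, 3 -> 240s ...
// capped at one hour.
function retry_delay_seconds(int $attempts, int $base = 60, int $cap = 3600): int
{
    return min($cap, $base * (2 ** max(0, $attempts - 1)));
}
```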
<?php
// Prompt templates support placeholder substitution
$template = "You are an SEO writer. Write a 600-word article about: {{title}}\n\nBackground knowledge:\n{{knowledge}}";
$prompt = str_replace(
['{{title}}', '{{knowledge}}'],
[$articleTitle, $knowledgeBaseContent],
$template
);
// Pass to AI service
require_once __DIR__ . '/includes/ai_service.php';
$response = call_ai_model(2, $prompt); // positional args: model_id, prompt
echo $response['content'];
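The str_replace approach above can be generalized so any {{name}} placeholder is filled from an array, with unknown placeholders left intact for easier debugging. A hypothetical helper; GEOFlow's own template handling may differ:

```php
<?php
// Replace every {{name}} placeholder with the matching value from $vars.
// Placeholders with no matching key are left untouched.
function render_prompt(string $template, array $vars): string
{
    return preg_replace_callback('/\{\{(\w+)\}\}/', function (array $m) use ($vars) {
        return array_key_exists($m[1], $vars) ? (string)$vars[$m[1]] : $m[0];
    }, $template);
}
```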
GEOFlow/
├── index.php Front-end article list
├── article.php Article detail page (SEO + OG)
├── router.php Local dev router for php -S
├── admin/ All admin UI pages
├── api/v1/index.php REST API single entry point
├── bin/
│ ├── geoflow CLI entrypoint
│ ├── cron.php Scheduler (run every CRON_INTERVAL seconds)
│ └── worker.php AI generation worker (long-running)
├── includes/
│ ├── config.php Global config + constants
│ ├── database.php Front-end data access
│ ├── ai_engine.php Core generation orchestration
│ ├── ai_service.php OpenAI-compatible HTTP client
│ ├── job_queue_service.php Claim / complete / fail / retry
│ ├── task_service.php Task CRUD
│ ├── article_service.php Article lifecycle
│ └── api_auth.php Bearer token auth
└── docs/ Deployment + API + FAQ docs
When creating a task, set publish_mode = auto. The worker will call publish_article() immediately after generation, skipping the review queue.
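The branch is simple enough to sketch; finalize_article is a hypothetical name for illustration, while the real call is publish_article() inside the worker's flow:

```php
<?php
// Simplified sketch of the worker's post-generation step.
function finalize_article(string $publishMode): string
{
    if ($publishMode === 'auto') {
        // publish_article($articleId); // real call, needs includes/article_service.php
        return 'published';            // review queue skipped
    }
    return 'draft';                    // waits for manual approval in the review center
}
```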
# Via CLI
php bin/geoflow job:retry-failed --task-id=<id>
# Via database (emergency)
psql $DATABASE_URL -c "UPDATE job_queue SET status='pending', attempts=0 WHERE status='failed' AND task_id=5;"
# Add to crontab — runs every minute
* * * * * cd /path/to/GEOFlow && php bin/cron.php >> /var/log/geoflow-cron.log 2>&1
# Or loop manually
while true; do php bin/cron.php; sleep 60; done
php bin/db_maintenance.php --vacuum
php bin/db_maintenance.php --clean-old-jobs --days=30
| Symptom | Likely Cause | Fix |
|---|---|---|
| Worker generates nothing | Scheduler not running | Start with --profile scheduler or run cron.php |
| pdo_pgsql not found | Extension missing | Add extension=pdo_pgsql to php.ini or install php-pgsql |
| 500 on admin pages | DB not initialized | Let the web container complete its entrypoint migration on first boot |
| AI calls return 401 | Wrong API key | Re-enter the key in AI模型管理 (AI Model Management); check for leading/trailing spaces |
| Articles stuck in draft | Auto-publish off | Set publish_mode=auto on the task, or manually approve in 审核中心 (Review Center) |
| Jobs pile up in queue | Worker crashed | docker compose restart worker; check docker compose logs worker |
| CSRF token mismatch | Session expired | Re-login to admin; ensure APP_SECRET_KEY is stable across restarts |
# All services
docker compose logs -f
# Only errors
docker compose logs worker 2>&1 | grep -i error
# Scheduler activity
docker compose logs scheduler | tail -50
# Check pending jobs in DB
docker compose exec postgres psql -U geo_user -d geo_system \
-c "SELECT status, COUNT(*) FROM job_queue GROUP BY status;"
Change the default admin / admin888 credentials immediately
Set APP_SECRET_KEY to a 32+ character random string via environment variable
Never commit .env — it is in .gitignore by default
Set SITE_URL to your https:// domain
Protect /geo_admin/ and /api/v1/ with a network-level firewall in production
Issue API tokens only via php bin/api/create_token.php