Run datalake_fdw automated tests: smoke, feature, performance, or individual test categories.
/datalake:test [smoke|feature|performance|all|<category>]
Arguments:

- smoke — Run smoke tests (quick sanity checks)
- feature — Run feature tests
- performance — Run performance tests
- all — Run all test suites via scripts/test/run_all_tests.sh
- <category> — Run a specific test category (e.g., s3, iceberg, hdfs, hive, orc, parquet, avro)

Default: smoke

You are executing the /datalake:test command. Follow these steps precisely:
Important paths:
DB_ROOT=/Volumes/ZHITAITiPlus71002TBMedia/liuxiaoyu/git/hashdata-lightning-umbrella2/hashdata-lightning-umbrella/database
AUTOMATION_DIR=$DB_ROOT/contrib/datalake_fdw/automation
FDW_DIR=$DB_ROOT/contrib/datalake_fdw
Automation Makefile targets:

- make check-services — Verify required services are running
- make smoke-test — Smoke tests (depends on check-services)
- make feature-test — Feature tests
- make performance-test — Performance tests
- make clean — Clean test artifacts

Test categories (located in $AUTOMATION_DIR/sqlrepo/smoke/<category>/):
Each category has its own Makefile using pg_regress installcheck.
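As an illustration, such a per-category Makefile typically follows the standard PGXS pattern; the REGRESS names below are placeholders, not the actual category tests:

```makefile
# Hypothetical per-category Makefile built on PGXS.
# On `make installcheck`, pg_regress runs each name in REGRESS from
# sql/<name>.sql and diffs the output against expected/<name>.out.
REGRESS = create_server basic_scan
PG_CONFIG ?= pg_config
PGXS := $(shell $(PG_CONFIG) --pgxs)
include $(PGXS)
```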
Configuration:
$AUTOMATION_DIR/config/test_config.env — Centralized test config

Key scripts:

- $AUTOMATION_DIR/scripts/setup/check_services.sh — Service health check
- $AUTOMATION_DIR/scripts/test/run_smoke_tests.sh — Smoke test runner
- $AUTOMATION_DIR/scripts/utils/common_functions.sh — Shared utilities

FDW built-in regression tests (via make installcheck in $FDW_DIR):
Verify the test environment is ready:
cd $AUTOMATION_DIR
# Check services
bash scripts/setup/check_services.sh
If services are not running, tell the user:
Required services are not running. Please start the environment first with /datalake:singlecluster deploy.
Also verify that datalake_fdw is built and installed:
ls $(pg_config --pkglibdir)/datalake_fdw.so 2>/dev/null
If not found:
datalake_fdw is not installed. Run /datalake:build first.
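The two environment checks above can be combined into one preflight step. This is a minimal sketch, assuming the path variables defined earlier and pg_config on PATH; the function name preflight is illustrative:

```shell
# Sketch: fail fast if services are down or the extension is missing.
preflight() {
    # Service health check (script path from the automation layout above)
    bash "$AUTOMATION_DIR/scripts/setup/check_services.sh" || {
        echo "Required services are not running. Run /datalake:singlecluster deploy first."
        return 1
    }
    # Extension binary check
    if [ ! -f "$(pg_config --pkglibdir)/datalake_fdw.so" ]; then
        echo "datalake_fdw is not installed. Run /datalake:build first."
        return 1
    fi
    echo "Environment ready."
}
```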
Determine the test type. Default to smoke if no argument.
smoke

cd $AUTOMATION_DIR
make smoke-test
This runs check-services first, then scripts/test/run_smoke_tests.sh.
feature

cd $AUTOMATION_DIR
make feature-test
performance

cd $AUTOMATION_DIR
make performance-test
all

cd $AUTOMATION_DIR
# If run_all_tests.sh exists, use it
if [ -f scripts/test/run_all_tests.sh ]; then
    bash scripts/test/run_all_tests.sh
else
    # Otherwise run the suites sequentially
    make smoke-test
    make feature-test
    make performance-test
fi
Continue to the next suite even if the previous one fails. Collect results from all suites.
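The sequential fallback with result collection can be sketched as follows (a minimal illustration, assuming the make targets from the automation Makefile above):

```shell
# Run every suite even if an earlier one fails, then report overall status.
overall=PASSED
failed_suites=""
for suite in smoke-test feature-test performance-test; do
    if ! make "$suite"; then
        overall=FAILED
        failed_suites="$failed_suites $suite"
    fi
done
echo "Overall: $overall${failed_suites:+ (failed:$failed_suites)}"
```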
<category> (specific category)

For a specific test category like s3, iceberg, hdfs, or hive:
cd $AUTOMATION_DIR/sqlrepo/smoke/<category>
make installcheck
If the category directory doesn't exist, check if it's a FDW regression test instead:
cd $FDW_DIR
make installcheck REGRESS="<category>"
To run the full built-in regression suite:
cd $FDW_DIR
make installcheck
If tests fail, show the regression diffs:
if [ -f regression.diffs ]; then
    cat regression.diffs
fi
After tests complete, parse output and display:
## Datalake FDW Test Results
Test Type: smoke
Automation Dir: $AUTOMATION_DIR
| Category | Tests | Passed | Failed | Status |
|-------------|-------|--------|--------|---------|
| s3 | 2 | 2 | 0 | PASSED |
| iceberg | 3 | 2 | 1 | FAILED |
| hdfs | 4 | 4 | 0 | PASSED |
| hive | 5 | 5 | 0 | PASSED |
Overall: FAILED (1 failure)
If exact counts cannot be parsed from the output, report the exit code and show relevant error output.
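One way to extract counts is to parse the pg_regress summary line. This is a hedged sketch, assuming the usual summary format (" All 5 tests passed. " or " 2 of 5 tests failed. "); the helper name is illustrative:

```shell
# Hypothetical helper: turn a pg_regress summary line into counts.
parse_pg_regress_summary() {
    summary="$1"
    case "$summary" in
        *"All "*" tests passed"*)
            total=$(printf '%s\n' "$summary" | sed -n 's/.*All \([0-9][0-9]*\) tests passed.*/\1/p')
            failed=0
            ;;
        *" of "*" tests failed"*)
            failed=$(printf '%s\n' "$summary" | sed -n 's/[^0-9]*\([0-9][0-9]*\) of .*/\1/p')
            total=$(printf '%s\n' "$summary" | sed -n 's/.* of \([0-9][0-9]*\) tests failed.*/\1/p')
            ;;
        *)
            total=0; failed=0
            ;;
    esac
    echo "total=$total failed=$failed passed=$((total - failed))"
}

parse_pg_regress_summary " All 5 tests passed. "   # prints: total=5 failed=0 passed=5
parse_pg_regress_summary " 2 of 5 tests failed. "  # prints: total=5 failed=2 passed=3
```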
For failures, automatically show the regression.diffs content:
find $AUTOMATION_DIR -name "regression.diffs" -exec echo "=== {} ===" \; -exec cat {} \;
Notes:

- Required services must be running (start them with /datalake:singlecluster deploy)
- Test reports are written under $AUTOMATION_DIR/reports/
- When all is used, failures in one suite do not block subsequent suites
- Run make clean in $AUTOMATION_DIR to clean test artifacts if needed