Test connection to configured storage provider
Tests connection to configured storage provider and verifies authentication, permissions, and basic operations.
Installation: /plugin marketplace add fractary/claude-plugins, then /plugin install fractary-file@fractary
Tests the current file plugin configuration by attempting a list operation.
<CONTEXT> You are the test-connection command for the fractary-file plugin. Your role is to verify that the configured storage handler is properly set up and can communicate with the storage provider. You test authentication, permissions, and basic operations. </CONTEXT>
<INPUTS>
Examples:
# Test current active handler
/fractary-file:test-connection
# Test specific handler
/fractary-file:test-connection --handler s3
# Verbose output
/fractary-file:test-connection --verbose
# Quick test only
/fractary-file:test-connection --quick
</INPUTS>
<WORKFLOW>
Locate and load configuration:
source plugins/file/skills/common/functions.sh
# Find config file
if [ -f ".fractary/plugins/file/config.json" ]; then
CONFIG_PATH=".fractary/plugins/file/config.json"
CONFIG_SOURCE="Project"
elif [ -f "$HOME/.config/fractary/file/config.json" ]; then
CONFIG_PATH="$HOME/.config/fractary/file/config.json"
CONFIG_SOURCE="Global"
else
echo "ℹ️  No configuration found, testing default (local handler)"
CONFIG_PATH=""
CONFIG_SOURCE="Default"
fi
# Load or use default
if [ -n "$CONFIG_PATH" ]; then
CONFIG=$(cat "$CONFIG_PATH")
ACTIVE_HANDLER=$(echo "$CONFIG" | jq -r '.active_handler')
else
ACTIVE_HANDLER="local"
CONFIG='{"schema_version":"1.0","active_handler":"local","handlers":{"local":{"base_path":".","create_directories":true}}}'
fi
# Override if --handler specified
if [ -n "$SPECIFIED_HANDLER" ]; then
ACTIVE_HANDLER="$SPECIFIED_HANDLER"
fi
🔍 Testing Connection
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Handler: {handler}
Config Source: {Project|Global|Default}
Test Time: {timestamp}
If verbose mode:
Configuration:
File: {config_path}
Handler: {handler}
{handler-specific details with masked credentials}
# Check handler exists in config
if ! echo "$CONFIG" | jq -e --arg h "$ACTIVE_HANDLER" '.handlers[$h]' > /dev/null 2>&1; then
echo "❌ Error: Handler '$ACTIVE_HANDLER' is not configured"
echo ""
echo "Available handlers:"
echo "$CONFIG" | jq -r '.handlers | keys[]' | sed 's/^/  • /'
echo ""
echo "To configure: /fractary-file:init --handler $ACTIVE_HANDLER"
exit 1
fi
# Extract handler config
HANDLER_CONFIG=$(echo "$CONFIG" | jq --arg h "$ACTIVE_HANDLER" '.handlers[$h]')
Verify dependencies and requirements for the handler.
echo " • Checking base path..."
BASE_PATH=$(echo "$HANDLER_CONFIG" | jq -r '.base_path')
if [ ! -d "$BASE_PATH" ]; then
echo "   ⚠️  Directory doesn't exist: $BASE_PATH"
CREATE_DIRS=$(echo "$HANDLER_CONFIG" | jq -r '.create_directories // true')
if [ "$CREATE_DIRS" = "true" ]; then
echo "   ℹ️  Will create on first operation"
else
echo "   ❌ create_directories is false"
exit 1
fi
else
echo "   ✓ Directory exists"
fi
echo " • Checking write permissions..."
if [ -w "$BASE_PATH" ] || [ ! -e "$BASE_PATH" ]; then
echo "   ✓ Writable"
else
echo "   ❌ Not writable"
exit 1
fi
# Check required CLI tools
case "$ACTIVE_HANDLER" in
r2)
echo " • Checking rclone..."
if ! command -v rclone &> /dev/null; then
echo "   ❌ rclone not installed"
echo "   Install: https://rclone.org/install/"
exit 1
fi
echo "   ✓ rclone found ($(rclone --version | head -1))"
;;
s3)
echo " • Checking AWS CLI..."
if ! command -v aws &> /dev/null; then
echo "   ❌ aws cli not installed"
echo "   Install: https://aws.amazon.com/cli/"
exit 1
fi
echo "   ✓ aws cli found ($(aws --version))"
;;
gcs)
echo " • Checking gcloud CLI..."
if ! command -v gcloud &> /dev/null; then
echo "   ❌ gcloud not installed"
echo "   Install: https://cloud.google.com/sdk/docs/install"
exit 1
fi
echo "   ✓ gcloud found ($(gcloud --version | head -1))"
;;
esac
# Check environment variables
echo " • Checking credentials..."
EXPANDED_CONFIG=$(expand_env_vars "$HANDLER_CONFIG")
# Validate credentials are set (not empty after expansion)
case "$ACTIVE_HANDLER" in
r2)
ACCESS_KEY=$(echo "$EXPANDED_CONFIG" | jq -r '.access_key_id // empty')
SECRET_KEY=$(echo "$EXPANDED_CONFIG" | jq -r '.secret_access_key // empty')
if [ -z "$ACCESS_KEY" ] || [ -z "$SECRET_KEY" ]; then
echo "   ❌ Credentials not set"
echo "   Set: R2_ACCESS_KEY_ID and R2_SECRET_ACCESS_KEY"
exit 1
fi
echo "   ✓ Credentials set"
;;
s3)
ACCESS_KEY=$(echo "$EXPANDED_CONFIG" | jq -r '.access_key_id // empty')
SECRET_KEY=$(echo "$EXPANDED_CONFIG" | jq -r '.secret_access_key // empty')
if [ -z "$ACCESS_KEY" ] && [ -z "$SECRET_KEY" ]; then
echo "   ℹ️  Using IAM roles (no credentials)"
else
echo "   ✓ Credentials set"
fi
;;
gcs)
SA_KEY=$(echo "$EXPANDED_CONFIG" | jq -r '.service_account_key // empty')
if [ -z "$SA_KEY" ]; then
echo "   ℹ️  Using Application Default Credentials"
else
if [ ! -f "$SA_KEY" ]; then
echo "   ❌ Service account key file not found: $SA_KEY"
exit 1
fi
echo "   ✓ Service account key found"
fi
;;
esac
# Verify bucket/container exists (r2, s3, gcs handlers)
BUCKET=$(echo "$EXPANDED_CONFIG" | jq -r '.bucket_name')
echo " • Checking bucket/container..."
echo "   Target: $BUCKET"
# For the gdrive handler: verify rclone and the configured remote
echo " • Checking rclone..."
if ! command -v rclone &> /dev/null; then
echo "   ❌ rclone not installed"
exit 1
fi
echo "   ✓ rclone found"
echo " • Checking rclone remote..."
REMOTE=$(echo "$HANDLER_CONFIG" | jq -r '.rclone_remote')
if ! rclone listremotes | grep -q "^${REMOTE}:$"; then
echo "   ❌ rclone remote '$REMOTE' not configured"
echo "   Configure: rclone config"
echo "   See: plugins/file/skills/handler-storage-gdrive/docs/oauth-setup-guide.md"
exit 1
fi
echo "   ✓ Remote '$REMOTE' configured"
Perform actual connection test using the handler.
Display:
Testing Connection:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Use the @agent-fractary-file:file-manager agent to perform a list operation:
{
"operation": "list",
"parameters": {
"path": "",
"limit": 1
}
}
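The agent's JSON response is expected to have roughly this shape (inferred from the fields the jq parsing reads; exact fields may vary by handler):

```json
{
  "success": true,
  "files": [
    {
      "name": "example.txt",
      "size": 1024,
      "modified": "2024-01-01T00:00:00Z"
    }
  ],
  "error": null
}
```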
Parse the response:
if echo "$RESULT" | jq -e '.success == true' > /dev/null; then
echo "   ✓ Connection successful"
echo "   ✓ Authentication working"
echo "   ✓ Storage accessible"
# Check if files were returned
FILE_COUNT=$(echo "$RESULT" | jq '.files | length')
if [ "$FILE_COUNT" -gt 0 ]; then
echo "   ✓ Files found ($FILE_COUNT)"
if [ "$VERBOSE" = "true" ]; then
echo ""
echo "Sample file:"
echo "$RESULT" | jq -r '.files[0] | "  Name: \(.name)\n  Size: \(.size) bytes\n  Modified: \(.modified)"'
fi
else
echo "   ℹ️  Storage is empty (this is okay)"
fi
TEST_SUCCESS=true
else
ERROR=$(echo "$RESULT" | jq -r '.error // "Unknown error"')
echo "   ❌ Connection failed"
echo ""
echo "Error: $ERROR"
TEST_SUCCESS=false
fi
If test succeeded and not in quick mode, perform additional checks:
Extended Checks:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
echo " • Testing write permissions..."
# Create a tiny test file
TEST_FILE="/tmp/fractary_test_$$"
echo "test" > "$TEST_FILE"
# Attempt upload
UPLOAD_RESULT=$(Use @agent-fractary-file:file-manager to upload {
"operation": "upload",
"parameters": {
"local_path": "$TEST_FILE",
"remote_path": ".fractary_connection_test"
}
})
if echo "$UPLOAD_RESULT" | jq -e '.success == true' > /dev/null; then
echo "   ✓ Write permissions OK"
# Clean up test file
DELETE_RESULT=$(Use @agent-fractary-file:file-manager to delete {
"operation": "delete",
"parameters": {
"remote_path": ".fractary_connection_test"
}
})
if echo "$DELETE_RESULT" | jq -e '.success == true' > /dev/null; then
echo "   ✓ Delete permissions OK"
fi
else
echo "   ⚠️  Write test failed (might be read-only access)"
fi
rm -f "$TEST_FILE"
if [ "$VERBOSE" = "true" ]; then
echo " • Measuring latency..."
# Note: %N (nanoseconds) requires GNU date; on macOS/BSD install coreutils and use gdate
START_TIME=$(date +%s%N)
# Simple list operation
LIST_RESULT=$(Use @agent-fractary-file:file-manager to list {
"operation": "list",
"parameters": {"path": "", "limit": 1}
})
END_TIME=$(date +%s%N)
LATENCY=$(( (END_TIME - START_TIME) / 1000000 ))
echo "   Latency: ${LATENCY}ms"
if [ "$LATENCY" -lt 500 ]; then
echo "   ✓ Excellent"
elif [ "$LATENCY" -lt 2000 ]; then
echo "   ✓ Good"
else
echo "   ⚠️  Slow (might be network or provider)"
fi
fi
Display final test results:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
✅ Connection Test Passed!
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
All checks completed successfully.
The file plugin is ready to use.
Next steps:
• Upload a file:
Use @agent-fractary-file:file-manager to upload:
{
"operation": "upload",
"parameters": {
"local_path": "./myfile.txt",
"remote_path": "folder/myfile.txt"
}
}
• View configuration:
/fractary-file:show-config
Documentation: plugins/file/README.md
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
❌ Connection Test Failed
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Error: {specific error message}
Troubleshooting:
{handler-specific troubleshooting steps}
Commands:
• View config: /fractary-file:show-config
• Reconfigure: /fractary-file:init
• Documentation: plugins/file/README.md
</WORKFLOW>
<COMPLETION_CRITERIA>
Success:
✅ Connection test passed
All operations working correctly
Failure:
❌ Connection test failed: {error}
Troubleshooting steps provided
</COMPLETION_CRITERIA>
<ERROR_HANDLING>
Directory doesn't exist:
Error: Base directory not found
Fix:
1. Check base_path in config: /fractary-file:show-config
2. Create directory: mkdir -p {base_path}
3. Or enable auto-creation: set create_directories: true in config
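For reference, a minimal local-handler config with auto-creation enabled, matching the default config shown in the workflow above, looks like:

```json
{
  "schema_version": "1.0",
  "active_handler": "local",
  "handlers": {
    "local": {
      "base_path": "./storage",
      "create_directories": true
    }
  }
}
```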
Permission denied:
Error: Permission denied writing to {path}
Fix:
sudo chown -R $USER:$USER {base_path}
chmod 0755 {base_path}
Invalid credentials:
Error: Authentication failed
Fix:
1. Verify credentials at: https://dash.cloudflare.com
2. Check environment variables:
echo $R2_ACCESS_KEY_ID
echo $R2_SECRET_ACCESS_KEY
3. Reconfigure: /fractary-file:init --handler r2
Bucket not found:
Error: Bucket '{bucket}' not found
Fix:
1. Verify bucket exists in R2 dashboard
2. Check bucket name spelling in config
3. Verify account_id is correct
Credentials invalid:
Error: The AWS Access Key Id you provided does not exist
Fix:
1. If using IAM roles: verify EC2 instance profile or ECS task role
2. If using access keys:
⢠Verify keys in AWS Console
⢠Check environment variables: echo $AWS_ACCESS_KEY_ID
3. Reconfigure: /fractary-file:init --handler s3
Bucket not found or wrong region:
Error: The specified bucket does not exist
Fix:
1. Verify bucket exists in AWS Console
2. Check bucket region matches config
3. For S3-compatible services, verify endpoint URL
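As an illustration only (the exact field name depends on the handler schema, so treat `endpoint` as a hypothetical placeholder), an S3-compatible endpoint override might be configured like:

```json
{
  "handlers": {
    "s3": {
      "bucket_name": "my-bucket",
      "region": "us-east-1",
      "endpoint": "https://minio.example.com"
    }
  }
}
```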
ADC not configured:
Error: Application Default Credentials not found
Fix:
1. Set up ADC: gcloud auth application-default login
2. Or use service account key:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/key.json"
3. Reconfigure: /fractary-file:init --handler gcs
Permission denied:
Error: Permission denied
Fix:
1. Verify service account has "Storage Admin" role
2. Check IAM permissions in GCP Console
3. Verify project ID is correct
Remote not configured:
Error: rclone remote not found
Fix:
1. Configure rclone: rclone config
2. Follow OAuth setup guide:
plugins/file/skills/handler-storage-gdrive/docs/oauth-setup-guide.md
3. Test rclone: rclone lsd {remote}:
OAuth token expired:
Error: Token expired
Fix:
rclone will automatically refresh the token.
If this fails, reconfigure: rclone config reconnect {remote}:
Network connectivity:
Error: Network timeout / Connection refused
Fix:
1. Check internet connection
2. Verify firewall rules
3. Check proxy settings
4. Try again with --verbose for more details
Missing CLI tool:
Error: Command not found: {tool}
Fix:
{tool-specific installation instructions}
</ERROR_HANDLING>