Execute bash commands and scripts safely with validation, error handling, and security checks. Use for system operations, file management, text processing, and command-line tools.
/plugin marketplace add gekko68/my-first-plugin
/plugin install my-first-plugin@ts-tools
This skill inherits all available tools. When active, it can use any tool Claude has access to.
example.sh
scripts/execute_bash.sh
scripts/generate_script.sh
scripts/validate_command.sh
Execute bash commands and shell scripts safely with comprehensive error handling, security validation, and best practices.
Plugin: my-first-plugin | Version: 1.0.0
Use this skill when the user needs to:
- Run system operations
- Manage files and directories
- Process text from the command line
- Work with other command-line tools
Determine if bash is the right tool:
Check for dangerous patterns:
bash scripts/validate_command.sh "<command>"
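The plugin ships the actual validator as scripts/validate_command.sh; purely as an illustration, a minimal validator of this kind might look like the following (this is a sketch, not the plugin's real implementation):
#!/bin/bash
# Illustrative sketch only -- not the plugin's actual validate_command.sh
set -euo pipefail
cmd="${1:?Usage: validate_command.sh \"<command>\"}"
# Hypothetical deny-list drawn from the dangerous patterns listed below
patterns=('rm -rf /' 'dd if=/dev/zero' ':(){' 'chmod 777 /' 'mkfs.')
for p in "${patterns[@]}"; do
  if [[ "$cmd" == *"$p"* ]]; then
    echo "BLOCKED: matches dangerous pattern: $p" >&2
    exit 1
  fi
done
echo "OK: no dangerous patterns detected"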
Use the helper script:
bash scripts/execute_bash.sh "<command>" [timeout]
# OR for scripts:
bash scripts/execute_bash.sh <script-file> [timeout]
Check exit codes and provide meaningful feedback:
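For example (some_command is a placeholder for whatever was run):
some_command
status=$?
if [[ $status -eq 0 ]]; then
  echo "Command succeeded"
else
  echo "Command failed with exit code $status" >&2
fi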
❌ NEVER Allow:
rm -rf / # System destruction
dd if=/dev/zero of=/dev/sda # Disk wiping
:(){ :|:& };: # Fork bombs
chmod 777 / # Permission destruction
curl <url> | bash # Arbitrary code execution
eval $user_input # Code injection
mkfs.* # Filesystem formatting
✅ Safe Patterns:
ls -la # Directory listing
grep "pattern" file.txt # Text search
find . -name "*.log" # File finding
tar -czf backup.tar.gz dir/ # Archiving
sed 's/old/new/g' file.txt # Text replacement
#!/bin/bash
set -euo pipefail # Exit on error, undefined vars, pipe failures
if some_command; then
echo "Success"
else
echo "Failed with exit code: $?"
exit 1
fi
#!/bin/bash
cleanup() {
echo "Cleaning up..."
rm -f /tmp/tempfile
}
trap cleanup EXIT ERR
# Your commands here
#!/bin/bash
FILE="${1:-}"
if [[ -z "$FILE" ]]; then
echo "Error: File argument required" >&2
exit 1
fi
if [[ ! -f "$FILE" ]]; then
echo "Error: File not found: $FILE" >&2
exit 1
fi
Create Directory Structure:
mkdir -p project/{src,tests,docs,config}
echo "Created project structure"
Find and Process Files:
#!/bin/bash
# Find all .log files modified in last 7 days
find . -name "*.log" -mtime -7 -type f -print0 | while IFS= read -r -d '' file; do
echo "Processing: $file"
# Process each file
done
Bulk Rename Files:
#!/bin/bash
# Rename all .txt files to .md
for file in *.txt; do
if [[ -f "$file" ]]; then
mv "$file" "${file%.txt}.md"
echo "Renamed: $file -> ${file%.txt}.md"
fi
done
Search and Extract:
#!/bin/bash
# Extract email addresses from file
grep -Eo '[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}' input.txt | \
sort -u > emails.txt
echo "Extracted $(wc -l < emails.txt) unique emails"
CSV Processing:
#!/bin/bash
# Extract specific columns from CSV
cut -d',' -f1,3,5 input.csv | \
grep -v "^#" | \
sort > output.csv
Log Analysis:
#!/bin/bash
# Count error types in log file
awk '/ERROR/ {print $5}' app.log | \
sort | uniq -c | sort -rn
Disk Usage Analysis:
#!/bin/bash
# Find largest directories
du -h --max-depth=1 2>/dev/null | \
sort -hr | head -20
Process Management:
#!/bin/bash
# Check if process is running
if pgrep -f "myapp" > /dev/null; then
echo "Process is running"
pgrep -f "myapp" | xargs ps -p
else
echo "Process is not running"
fi
Archive and Compress:
#!/bin/bash
# Create dated backup
DATE=$(date +%Y%m%d)
tar -czf "backup_${DATE}.tar.gz" \
--exclude='node_modules' \
--exclude='.git' \
./project/
echo "Backup created: backup_${DATE}.tar.gz"
Download and Process:
#!/bin/bash
set -euo pipefail
URL="https://example.com/data.json"
OUTPUT="processed_data.json"
# Download
curl -fsSL "$URL" -o raw_data.json
# Process with jq
jq '.items[] | select(.status == "active")' raw_data.json > "$OUTPUT"
echo "Processed data saved to $OUTPUT"
Multi-Step Pipeline:
#!/bin/bash
set -euo pipefail
# Step 1: Download
echo "Downloading data..."
wget -q https://example.com/data.csv -O data.csv
# Step 2: Clean
echo "Cleaning data..."
sed 's/\r$//' data.csv | # Remove carriage returns
grep -v '^$' | # Remove empty lines
tr -s ' ' > cleaned.csv
# Step 3: Analyze
echo "Analyzing..."
awk -F',' '{sum+=$3} END {print "Total:", sum}' cleaned.csv
echo "Pipeline completed"
Repository Setup:
#!/bin/bash
set -euo pipefail
REPO_NAME="${1:-my-repo}"
mkdir -p "$REPO_NAME"
cd "$REPO_NAME"
git init
echo "# $REPO_NAME" > README.md
echo "node_modules/" > .gitignore
echo ".env" >> .gitignore
git add .
git commit -m "Initial commit"
echo "Repository initialized: $REPO_NAME"
Branch Management:
#!/bin/bash
# Create feature branch from main
git checkout main
git pull origin main
git checkout -b feature/new-feature
echo "Created and switched to feature/new-feature"
Container Management:
#!/bin/bash
# Stop and remove all containers
docker ps -aq | xargs -r docker stop
docker ps -aq | xargs -r docker rm
echo "All containers stopped and removed"
Image Cleanup:
#!/bin/bash
# Remove dangling images
docker images -f "dangling=true" -q | xargs -r docker rmi
echo "Dangling images removed"
#!/bin/bash
# Process files in parallel
find . -name "*.txt" -print0 | \
# NOTE: process_file must be an executable on PATH or a function
# exported to child shells with 'export -f process_file',
# since bash -c starts a new shell
xargs -0 -P 4 -I {} bash -c 'process_file "$@"' _ {}
#!/bin/bash
# Show progress for long operations
files=(*.txt)          # count via array instead of parsing ls output
TOTAL=${#files[@]}
COUNT=0
for file in "${files[@]}"; do
COUNT=$((COUNT + 1))
echo "Processing $COUNT/$TOTAL: $file"
# Process file
done
#!/bin/bash
# Retry on failure
MAX_RETRIES=3
RETRY_DELAY=5
for i in $(seq 1 $MAX_RETRIES); do
if some_command; then
echo "Success on attempt $i"
break
else
echo "Failed attempt $i/$MAX_RETRIES"
if [[ $i -lt $MAX_RETRIES ]]; then
echo "Retrying in ${RETRY_DELAY}s..."
sleep $RETRY_DELAY
fi
fi
done
bash scripts/execute_bash.sh "<command>" [timeout]
bash scripts/execute_bash.sh script.sh [timeout]
bash scripts/validate_command.sh "<command>"
bash scripts/generate_script.sh script_name.sh "Description"
DRY_RUN=1 bash scripts/execute_bash.sh "<command>"
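For example, previewing before executing (this assumes DRY_RUN=1 makes the helper print the command instead of running it):
DRY_RUN=1 bash scripts/execute_bash.sh "rm -f /tmp/old.log"   # preview only
bash scripts/execute_bash.sh "rm -f /tmp/old.log"             # real run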
#!/bin/bash
set -x # Print commands as they execute
# Your commands here
set +x # Disable verbose mode
#!/bin/bash
set -u # Error on undefined variables
echo "DEBUG: Variable value: ${MY_VAR:-not set}"
bash -x script.sh # Run with tracing
# Slower: useless use of cat spawns an extra process
result=$(cat file.txt | grep pattern)
# Faster: let grep read the file directly
result=$(grep pattern < file.txt)
# Slow
cat file.txt | grep pattern | wc -l
# Fast
grep -c pattern file.txt
# Serial (slow)
for file in *.txt; do
process "$file"
done
# Parallel (fast); null-delimited so filenames with spaces are safe
printf '%s\0' *.txt | xargs -0 -P 4 -I {} process {}
HOME # User home directory
PATH # Executable search path
USER # Current username
PWD # Present working directory
OLDPWD # Previous directory
SHELL # Current shell
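For example, a quick script to inspect them:
#!/bin/bash
echo "User:     $USER"
echo "Home:     $HOME"
echo "Shell:    $SHELL"
echo "Current:  $PWD"
echo "Previous: ${OLDPWD:-unset}"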
#!/bin/bash
# Set with default
VAR="${VAR:-default_value}"
# Export for child processes
export API_KEY="secret"
# Unset after use
unset API_KEY
# Check permissions
ls -la script.sh
# Fix permissions
chmod +x script.sh
# Check if command exists
command -v mycommand
# Check PATH
echo $PATH
# Use full path
/usr/bin/mycommand
# Check file exists
[[ -f file.txt ]] && echo "Exists" || echo "Not found"
# Use absolute path
realpath file.txt
# Use timeout
timeout 30s ./script.sh
# Or in script
set -euo pipefail
Best practices:
- Use set -euo pipefail for safety
- Quote variables: "$variable", not $variable
- Check exit status directly: if command; then ...
- Use [[ ]] for conditions, not [ ]
- Never eval with user input
- Never parse ls output
For more complex bash operations:
scripts/execute_bash.sh - Safe command executor
scripts/validate_command.sh - Command validator
scripts/generate_script.sh - Script generator
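For example, validating a command before handing it to the executor (the trailing 30 is an assumed timeout in seconds; check the script for its actual unit):
CMD='tar -czf backup.tar.gz project/'
if bash scripts/validate_command.sh "$CMD"; then
  bash scripts/execute_bash.sh "$CMD" 30
fi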