# Newman Testing
> CRITICAL: This skill's `description` frontmatter field controls when Claude auto-loads the skill.
## Overview
Provides comprehensive patterns, templates, and scripts for Newman/Postman API testing including collection structure, environment management, test assertions, reporting, and CI/CD integration.
## Instructions
### 1. Collection Structure and Organization
**Understand Collection Architecture:**
- Organize requests into logical folders (auth, users, products, etc.)
- Use pre-request scripts for setup (tokens, dynamic data)
- Use test scripts for assertions and validation
- Leverage collection-level variables and scripts
**Create Collections:**
- Use `scripts/init-collection.sh` to scaffold a new collection structure
- Use templates to create well-structured requests
- Follow naming conventions (verb + resource: GET Users, POST Product)
- Group related endpoints into folders
### 2. Environment Variable Management
**Set Up Environments:**
- Use `scripts/setup-environment.sh` to create environment files
- Define variables for different deployment stages (dev, staging, prod)
- Store sensitive data in environment files (not in collections)
- Use dynamic variables for timestamps, UUIDs, random data
**Variable Hierarchy** (each scope is shown in the sketch after this list):
- Global variables: Shared across all collections
- Environment variables: Environment-specific (URLs, credentials)
- Collection variables: Collection-specific configuration
- Local variables: Request/script-specific temporary data
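Each scope maps directly onto the `pm` API available in pre-request and test scripts. A minimal sketch (variable names here are illustrative, not part of the shipped templates):

```javascript
// Global: shared across all collections
pm.globals.set("apiVersion", "v2");

// Environment: stage-specific values such as base URLs and credentials
const baseUrl = pm.environment.get("baseUrl");

// Collection: configuration that travels with the collection
pm.collectionVariables.set("defaultPageSize", "25");

// Local: temporary data scoped to the current request/iteration
pm.variables.set("requestStartedAt", String(Date.now()));

// Dynamic variables for timestamps, UUIDs, and random data
const payload = pm.variables.replaceIn(
    '{"id":"{{$guid}}","createdAt":{{$timestamp}},"quantity":{{$randomInt}}}'
);
```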
### 3. Test Assertion Patterns
**Write Effective Tests:**
- Reference `templates/test-assertions-basic.js` for common patterns
- Reference `templates/test-assertions-advanced.js` for complex validations
- Test status codes, response times, headers, body content
- Chain requests using `pm.environment.set()` for dynamic workflows (see the chaining sketch after this list)
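A sketch of request chaining, assuming a hypothetical `POST Users` request whose response body contains an `id` field:

```javascript
// Test script on "POST Users": persist the new id so later requests
// can reference it as {{userId}} in their URL, params, or body.
pm.test("user created", function () {
    pm.response.to.have.status(201);
    const body = pm.response.json();
    pm.expect(body).to.have.property("id");
    pm.environment.set("userId", body.id);
});
```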
**Test Organization** (assertion examples in the sketch after this list):
- Status code validation (200, 201, 400, 401, 404, 500)
- Response structure validation (schema, required fields)
- Data validation (types, formats, ranges)
- Business logic validation (calculations, relationships)
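How these categories typically translate into `pm.test()` blocks; the response shape and thresholds below are placeholders, not a specific API contract:

```javascript
// Status code validation
pm.test("status code is 200", function () {
    pm.response.to.have.status(200);
});

// Response time and header validation
pm.test("responds quickly with a Content-Type header", function () {
    pm.expect(pm.response.responseTime).to.be.below(500);
    pm.response.to.have.header("Content-Type");
});

// Structure, type, and range validation
pm.test("body has the expected structure and types", function () {
    const body = pm.response.json();
    pm.expect(body).to.have.property("items");
    pm.expect(body.items).to.be.an("array");
    body.items.forEach(function (item) {
        pm.expect(item.price).to.be.a("number");
        pm.expect(item.price).to.be.at.least(0);
    });
});
```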
### 4. Running Newman Tests
**Execute Collections:**
- Use `scripts/run-newman.sh` for basic execution
- Use `scripts/run-newman-ci.sh` for CI/CD pipeline integration
- Specify environment files with the `-e` flag
- Generate reports with reporters (cli, json, html, junit)
**Common Execution Patterns:**
```bash
# Run with environment
bash scripts/run-newman.sh collection.json -e env.json

# Run with multiple reporters
bash scripts/run-newman.sh collection.json --reporters cli,json,html

# Run in CI/CD with junit output
bash scripts/run-newman-ci.sh collection.json -e ci-env.json
```
### 5. Reporting and Output
**Generate Reports:**
- Use `scripts/generate-reports.sh` for comprehensive reporting
- Configure multiple reporters (HTML for humans, JUnit for CI)
- Parse JSON output for custom analysis (a parsing sketch follows the format list below)
- Track test results over time
**Available Report Formats:**
- CLI: Console output for quick feedback
- JSON: Machine-readable for analysis
- HTML: Visual reports with charts
- JUnit: CI/CD integration (Jenkins, GitLab, GitHub Actions)
- TeamCity: TeamCity-specific format
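For custom analysis, the JSON report can be post-processed with a small Node script. A sketch, assuming the run exported its report via `--reporter-json-export ./results/newman-run.json` (verify the exact report fields against your Newman version):

```javascript
const fs = require("fs");

// Load the report written by the json reporter
const report = JSON.parse(fs.readFileSync("./results/newman-run.json", "utf8"));

// Overall assertion counts
const { total, failed } = report.run.stats.assertions;
console.log(`assertions: ${total}, failed: ${failed}`);

// One line per failed assertion
report.run.failures.forEach(function (failure) {
    console.log(`${failure.source.name}: ${failure.error.message}`);
});
```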
### 6. CI/CD Integration
**Pipeline Integration:**
- Reference `examples/github-actions-integration.md` for GitHub Actions
- Reference `examples/gitlab-ci-integration.md` for GitLab CI
- Install Newman as part of the build process
- Run tests as a separate pipeline stage
- Parse results and fail builds on test failures (see the sketch after this list)
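One way to parse results and fail the stage is Newman's Node.js library, which exposes the run summary directly. A sketch with placeholder file paths:

```javascript
const newman = require("newman");

newman.run({
    collection: "./collection.json",
    environment: "./ci-env.json",
    reporters: ["cli", "junit"],
    reporter: { junit: { export: "./results/junit.xml" } }
}, function (err, summary) {
    // Any runtime error or failed assertion fails the pipeline stage
    if (err || summary.run.failures.length > 0) {
        process.exit(1);
    }
});
```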
**Best Practices:**
- Use environment variables for secrets
- Run Newman in Docker containers for consistency
- Cache Newman installation for faster builds
- Archive test reports as build artifacts
### 7. Error Handling and Debugging
**Debugging Tests:**
- Use `console.log()` in pre-request and test scripts (logging example after this list)
- Enable verbose output with the `--verbose` flag
- Use `--delay-request` to troubleshoot race conditions
- Export Newman run data with the `--export-*` flags
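Newman prints `console.log()` output to the terminal during the run, so script-side logging is often the quickest diagnostic. A small sketch (variable names are illustrative):

```javascript
// In a pre-request script: confirm which environment values are in play
console.log("baseUrl:", pm.environment.get("baseUrl"));

// In a test script: dump status and body when an assertion fails unexpectedly
console.log("status:", pm.response.code);
console.log("body:", pm.response.text());
```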
**Common Issues:**
- Authentication failures: Check token refresh logic (a token-refresh sketch follows this list)
- Timing issues: Add delays with `setTimeout()` in scripts or use Newman's `--delay-request` option
- Environment mismatch: Verify the correct environment file is loaded
- SSL errors: Use the `--insecure` flag for self-signed certs
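For the authentication case, a common pattern is a pre-request script that refreshes the token before it is used. A sketch assuming a hypothetical `{{authUrl}}/token` endpoint that returns an `access_token` field:

```javascript
// Pre-request script: fetch a fresh token and store it as {{authToken}}
pm.sendRequest({
    url: pm.environment.get("authUrl") + "/token",
    method: "POST",
    header: "Content-Type:application/json",
    body: {
        mode: "raw",
        raw: JSON.stringify({
            clientId: pm.environment.get("clientId"),
            clientSecret: pm.environment.get("clientSecret")
        })
    }
}, function (err, res) {
    if (err) {
        console.log("token refresh failed:", err);
        return;
    }
    pm.environment.set("authToken", res.json().access_token);
});
```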
## Available Scripts
- `init-collection.sh`: Initialize new Postman collection structure with folders and basic requests
- `setup-environment.sh`: Create environment JSON files with variable templates
- `run-newman.sh`: Execute Newman collections with various options and reporters
- `run-newman-ci.sh`: CI/CD-optimized Newman execution with proper exit codes and reporting
- `generate-reports.sh`: Generate comprehensive test reports in multiple formats
## Available Templates
- `collection-basic.json`: Basic collection structure with folders and sample requests
- `collection-advanced.json`: Advanced collection with auth, pre-request scripts, and test chains
- `environment-template.json`: Environment file template with common variables
- `test-assertions-basic.js`: Common test assertion patterns (status, headers, body)
- `test-assertions-advanced.js`: Advanced assertions (schema validation, chaining, conditional tests)
- `pre-request-scripts.js`: Pre-request script patterns (auth tokens, dynamic data, setup)
## Available Examples
- `basic-usage.md`: Simple Newman execution with a single collection and environment
- `advanced-testing.md`: Complex test scenarios with chaining, data-driven tests, and workflows
- `github-actions-integration.md`: Complete GitHub Actions workflow for Newman testing
- `gitlab-ci-integration.md`: GitLab CI configuration for automated API testing
- `error-handling-debugging.md`: Common errors, troubleshooting steps, and debugging techniques
## Requirements
- Newman CLI installed (`npm install -g newman`)
- Valid Postman collection JSON files
- Environment files for different stages
- Proper variable management (no hardcoded secrets)
- Clear test descriptions and assertions
- CI/CD integration following best practices
## Progressive Disclosure
For additional reference material:
- Read `examples/basic-usage.md` for a quick start
- Read `examples/advanced-testing.md` for complex scenarios
- Read `examples/github-actions-integration.md` or `examples/gitlab-ci-integration.md` for CI/CD setup
- Read `examples/error-handling-debugging.md` when troubleshooting
**Skill Location:** `plugins/05-quality/skills/newman-testing/SKILL.md`
**Version:** 1.0.0