GitHub Actions Integration Guide
This guide explains how to integrate the LoadFocus JMeter API Client with GitHub Actions for automated performance testing.
Setup Steps
1. Store Credentials as GitHub Secrets
First, store your LoadFocus API credentials as GitHub repository secrets:
- Go to your GitHub repository
- Navigate to Settings > Secrets and variables > Actions
- Add the following repository secrets:
- `LOADFOCUS_API_KEY`: Your LoadFocus API key
- `LOADFOCUS_TEAM_ID`: Your LoadFocus team ID
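If you prefer the command line, the same secrets can be created with the GitHub CLI; a minimal sketch, assuming `gh` is installed and authenticated for the repository:

```bash
# Each command prompts for the secret value (or reads it from stdin)
gh secret set LOADFOCUS_API_KEY
gh secret set LOADFOCUS_TEAM_ID
```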
2. Create a GitHub Actions Workflow
Create a new file in your repository at `.github/workflows/performance-test.yml`:
```yaml
name: Performance Tests

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]
  # Optional: Run on a schedule
  schedule:
    - cron: '0 0 * * 1' # Run at midnight every Monday

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json
```
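Before committing the workflow, you can sanity-check the credentials and CLI locally using the same commands the workflow runs; the test name and credential values below are placeholders:

```bash
npm install -g @loadfocus/loadfocus-api-client

# Same configuration commands as the workflow's Configure step
loadfocus-api config set apikey YOUR_API_KEY   # placeholder
loadfocus-api config set teamid YOUR_TEAM_ID   # placeholder

# Trigger a run and inspect the JSON output directly
loadfocus-api jmeter run-test \
  --name "local_smoke_test" \
  --thresholds "avgresponse<=200,errors==0,p95<=250" \
  --format json
```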
3. Add Performance Testing to Your Deployment Workflow
To make deployment dependent on performance test results:
```yaml
name: Build, Test, and Deploy

on:
  push:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Your build steps...

  performance-test:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json

  deploy:
    needs: performance-test
    runs-on: ubuntu-latest
    steps:
      # Your deployment steps...
```
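Because `deploy` declares `needs: performance-test`, GitHub Actions skips the deployment whenever the performance job fails. This gating assumes the `run-test` command exits with a non-zero status when a threshold is breached, which is what marks the step, and therefore the job, as failed.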
Advanced Configuration
Matrix Testing for Multiple Environments
Run tests against multiple environments or configurations:
```yaml
jobs:
  performance-test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        environment: [dev, staging, production]
        test-type: [api, frontend]
    steps:
      # Setup steps...

      - name: Run Performance Tests
        run: |
          TEST_NAME="${{ matrix.test-type }}_test_${{ matrix.environment }}"

          # Adjust thresholds based on environment
          if [ "${{ matrix.environment }}" == "production" ]; then
            THRESHOLDS="avgresponse<=150,errors==0,p95<=200"
          else
            THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
          fi

          loadfocus-api jmeter run-test \
            --name "$TEST_NAME" \
            --thresholds "$THRESHOLDS" \
            --format json > "performance_results_${{ matrix.environment }}_${{ matrix.test-type }}.json"
```
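By default, one failing matrix combination cancels the others still in flight; if you want results for every environment regardless, disable fail-fast (standard GitHub Actions `strategy` syntax):

```yaml
strategy:
  fail-fast: false   # let all matrix combinations run to completion
  matrix:
    environment: [dev, staging, production]
    test-type: [api, frontend]
```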
Creating Performance Test Reports
Generate HTML reports from JSON results:
```yaml
- name: Generate HTML Report
  run: |
    # Install report generator
    npm install -g performance-report-generator

    # Generate HTML report into ./reports so the Pages step below can publish it
    mkdir -p reports
    performance-report-generator \
      --input performance_results.json \
      --output reports/performance_report.html

- name: Upload HTML Report
  uses: actions/upload-artifact@v3
  with:
    name: performance-test-report
    path: reports/performance_report.html

# Optional: Publish to GitHub Pages
- name: Publish to GitHub Pages
  uses: peaceiris/actions-gh-pages@v3
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    publish_dir: ./reports
    destination_dir: performance-reports
```
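Uploaded artifacts are kept for 90 days by default; if reports accumulate quickly, the `retention-days` input of `actions/upload-artifact` shortens that window:

```yaml
- name: Upload HTML Report
  uses: actions/upload-artifact@v3
  with:
    name: performance-test-report
    path: reports/performance_report.html
    retention-days: 14   # keep reports for two weeks instead of 90 days
```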
Commenting Test Results on PRs
Add performance test results as a comment on pull requests:
```yaml
- name: Comment PR
  uses: actions/github-script@v6
  if: github.event_name == 'pull_request'
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    script: |
      const fs = require('fs');
      const results = JSON.parse(fs.readFileSync('performance_results.json', 'utf8'));

      let comment = '## Performance Test Results\n\n';
      comment += `**Overall Result:** ${results.overallResult}\n\n`;
      comment += '### Results by Label\n\n';

      for (const label of results.labels) {
        comment += `#### ${label.label}\n`;
        comment += `- **Result:** ${label.result}\n`;
        comment += `- **Samples:** ${label.metrics.samples}\n`;
        comment += `- **Avg Response:** ${label.metrics.avgresponse}ms\n`;
        comment += `- **Error Rate:** ${label.metrics.errors}\n\n`;
      }

      github.rest.issues.createComment({
        issue_number: context.issue.number,
        owner: context.repo.owner,
        repo: context.repo.repo,
        body: comment
      });
```
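For reference, the script above assumes the results JSON is shaped roughly as follows; the field names come from the script itself, while the values are purely illustrative:

```json
{
  "overallResult": "PASSED",
  "labels": [
    {
      "label": "Homepage",
      "result": "PASSED",
      "metrics": {
        "samples": 1000,
        "avgresponse": 185,
        "errors": 0
      }
    }
  ]
}
```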
Reusable Workflow
Create a reusable workflow for performance testing:
```yaml
# .github/workflows/reusable-performance-test.yml
name: Reusable Performance Test

on:
  workflow_call:
    inputs:
      test-name:
        required: true
        type: string
      thresholds:
        required: false
        type: string
        default: "avgresponse<=200,errors==0,p95<=250"
      wait-timeout:
        required: false
        type: number
        default: 1800
    secrets:
      LOADFOCUS_API_KEY:
        required: true
      LOADFOCUS_TEAM_ID:
        required: true

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "${{ inputs.test-name }}" \
            --thresholds "${{ inputs.thresholds }}" \
            --waitTimeout ${{ inputs.wait-timeout }} \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json
```
Then call it from another workflow:
```yaml
# .github/workflows/main.yml
jobs:
  call-performance-test:
    uses: ./.github/workflows/reusable-performance-test.yml
    with:
      test-name: "API_Performance_Test"
      thresholds: "avgresponse<=150,errors==0"
    secrets:
      LOADFOCUS_API_KEY: ${{ secrets.LOADFOCUS_API_KEY }}
      LOADFOCUS_TEAM_ID: ${{ secrets.LOADFOCUS_TEAM_ID }}
```
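When the caller and the reusable workflow live in the same repository or organization, `secrets: inherit` forwards all secrets in one line instead of listing each one:

```yaml
jobs:
  call-performance-test:
    uses: ./.github/workflows/reusable-performance-test.yml
    with:
      test-name: "API_Performance_Test"
    secrets: inherit
```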
Tips for GitHub Actions Integration
Caching: Cache npm dependencies to speed up workflow runs:
```yaml
- name: Cache Node modules
  uses: actions/cache@v3
  with:
    path: ~/.npm
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-node-
```

Concurrency Control: Limit concurrent performance tests:
```yaml
concurrency:
  group: performance-test-${{ github.ref }}
  cancel-in-progress: false
```

Environment-specific Tests: Use GitHub environments for different test configurations:
```yaml
jobs:
  performance-test:
    runs-on: ubuntu-latest
    environment: staging
    # Environment-specific variables are available here
```

Conditional Testing: Only run performance tests when explicitly requested, for example via a PR label, a scheduled run, or a commit-message tag (a file-change variant is sketched after the example):
```yaml
jobs:
  performance-test:
    if: |
      contains(github.event.pull_request.labels.*.name, 'performance-test') ||
      github.event_name == 'schedule' ||
      contains(github.event.head_commit.message, '[perf-test]')
```
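To gate on file changes instead, GitHub Actions supports `paths` filters on the trigger itself; in this sketch the glob patterns are placeholders to adjust to your project layout:

```yaml
on:
  pull_request:
    paths:
      - 'src/**'          # placeholder: code whose changes warrant a load test
      - 'load-tests/**'   # placeholder: test plan files
```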
For more information, refer to the GitHub Actions documentation and the LoadFocus API Client documentation.