---
title: "GitHub Actions"
metaTitle: "GitHub Actions Integration Guide | LoadFocus"
metaDescription: "Step-by-step guide to running LoadFocus JMeter load tests from GitHub Actions: workflows, thresholds, matrix testing, reports, and PR comments."
order: 3
---

This guide explains how to integrate the LoadFocus JMeter API Client with GitHub Actions for automated performance testing.

## Setup Steps

### 1. Store Credentials as GitHub Secrets

First, store your LoadFocus API credentials as GitHub repository secrets:

1. Go to your GitHub repository
2. Navigate to Settings > Secrets and variables > Actions
3. Add the following repository secrets:
   - `LOADFOCUS_API_KEY`: Your LoadFocus API key
   - `LOADFOCUS_TEAM_ID`: Your LoadFocus team ID

### 2. Create a GitHub Actions Workflow

Create a new file in your repository at `.github/workflows/performance-test.yml`:

```yaml
name: Performance Tests

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]
  # Optional: Run on a schedule
  schedule:
    - cron: '0 0 * * 1'  # Run at midnight every Monday

jobs:
  performance-test:
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v4

    - name: Setup Node.js
      uses: actions/setup-node@v4
      with:
        node-version: '20'

    - name: Install LoadFocus JMeter API Client
      run: npm install -g @loadfocus/loadfocus-api-client

    - name: Configure LoadFocus API Client
      run: |
        loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
        loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

    - name: Run Performance Tests
      run: |
        loadfocus-api jmeter run-test \
          --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
          --thresholds "avgresponse<=200,errors==0,p95<=250" \
          --format json > performance_results.json

    - name: Upload Performance Test Results
      uses: actions/upload-artifact@v4
      with:
        name: performance-test-results
        path: performance_results.json
```
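Note that if `run-test` exits non-zero when a threshold is breached (behavior may vary by client version), the upload step above is skipped and the JSON results are lost. Adding `if: always()` keeps the artifact in either case:

```yaml
    - name: Upload Performance Test Results
      if: always()  # upload results even when a threshold breach fails the job
      uses: actions/upload-artifact@v4
      with:
        name: performance-test-results
        path: performance_results.json
```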

### 3. Add Performance Testing to Your Deployment Workflow

To make deployment dependent on performance test results:

```yaml
name: Build, Test, and Deploy

on:
  push:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Your build steps...

  performance-test:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v4
        with:
          name: performance-test-results
          path: performance_results.json

  deploy:
    needs: performance-test
    runs-on: ubuntu-latest
    steps:
      # Your deployment steps...
```
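Load tests can hang if the remote run never completes. A job-level timeout (a standard GitHub Actions setting; the value here is illustrative) prevents a stuck test from blocking the deploy job and consuming runner minutes:

```yaml
  performance-test:
    needs: build
    runs-on: ubuntu-latest
    timeout-minutes: 45  # fail the job if the load test has not finished by then
    steps:
      # ... same steps as above
```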

## Advanced Configuration

### Matrix Testing for Multiple Environments

Run tests against multiple environments or configurations:

```yaml
jobs:
  performance-test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        environment: [dev, staging, production]
        test-type: [api, frontend]

    steps:
      # Setup steps...

      - name: Run Performance Tests
        run: |
          TEST_NAME="${{ matrix.test-type }}_test_${{ matrix.environment }}"

          # Adjust thresholds based on environment
          if [ "${{ matrix.environment }}" = "production" ]; then
            THRESHOLDS="avgresponse<=150,errors==0,p95<=200"
          else
            THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
          fi

          loadfocus-api jmeter run-test \
            --name "$TEST_NAME" \
            --thresholds "$THRESHOLDS" \
            --format json > "performance_results_${{ matrix.environment }}_${{ matrix.test-type }}.json"
```
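By default a matrix cancels all in-progress jobs as soon as one combination fails, so a threshold breach in one environment can hide the results of the others. Setting `fail-fast: false` lets every combination run to completion:

```yaml
    strategy:
      fail-fast: false  # let every environment/test-type combination finish
      matrix:
        environment: [dev, staging, production]
        test-type: [api, frontend]
```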

### Creating Performance Test Reports

Generate HTML reports from JSON results:

```yaml
- name: Generate HTML Report
  run: |
    # Install report generator
    npm install -g performance-report-generator

    # Generate the HTML report into the directory published below
    mkdir -p reports
    performance-report-generator \
      --input performance_results.json \
      --output reports/performance_report.html

- name: Upload HTML Report
  uses: actions/upload-artifact@v4
  with:
    name: performance-test-report
    path: reports/performance_report.html

# Optional: Publish to GitHub Pages
- name: Publish to GitHub Pages
  uses: peaceiris/actions-gh-pages@v4
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    publish_dir: ./reports
    destination_dir: performance-reports
```

### Commenting Test Results on PRs

Add performance test results as a comment on pull requests:

```yaml
- name: Comment PR
  uses: actions/github-script@v7
  if: github.event_name == 'pull_request'
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    script: |
      const fs = require('fs');
      const results = JSON.parse(fs.readFileSync('performance_results.json', 'utf8'));

      let comment = '## Performance Test Results\n\n';
      comment += `**Overall Result:** ${results.overallResult}\n\n`;

      comment += '### Results by Label\n\n';
      for (const label of results.labels) {
        comment += `#### ${label.label}\n`;
        comment += `- **Result:** ${label.result}\n`;
        comment += `- **Samples:** ${label.metrics.samples}\n`;
        comment += `- **Avg Response:** ${label.metrics.avgresponse}ms\n`;
        comment += `- **Error Rate:** ${label.metrics.errors}\n\n`;
      }

      await github.rest.issues.createComment({
        issue_number: context.issue.number,
        owner: context.repo.owner,
        repo: context.repo.repo,
        body: comment
      });
```
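In repositories where the default `GITHUB_TOKEN` permissions are set to read-only, the comment call above fails with a 403. Granting write access to pull requests (shown here at the job level; it can also be set at the workflow level) fixes this:

```yaml
jobs:
  performance-test:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: write  # required for github-script to comment on the PR
```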

## Reusable Workflow

Create a reusable workflow for performance testing:

```yaml
# .github/workflows/reusable-performance-test.yml
name: Reusable Performance Test

on:
  workflow_call:
    inputs:
      test-name:
        required: true
        type: string
      thresholds:
        required: false
        type: string
        default: "avgresponse<=200,errors==0,p95<=250"
      wait-timeout:
        required: false
        type: number
        default: 1800
    secrets:
      LOADFOCUS_API_KEY:
        required: true
      LOADFOCUS_TEAM_ID:
        required: true

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "${{ inputs.test-name }}" \
            --thresholds "${{ inputs.thresholds }}" \
            --waitTimeout ${{ inputs.wait-timeout }} \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v4
        with:
          name: performance-test-results
          path: performance_results.json
```

Then call it from another workflow:

```yaml
# .github/workflows/main.yml
jobs:
  call-performance-test:
    uses: ./.github/workflows/reusable-performance-test.yml
    with:
      test-name: "API_Performance_Test"
      thresholds: "avgresponse<=150,errors==0"
    secrets:
      LOADFOCUS_API_KEY: ${{ secrets.LOADFOCUS_API_KEY }}
      LOADFOCUS_TEAM_ID: ${{ secrets.LOADFOCUS_TEAM_ID }}
```
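When the caller and the reusable workflow live in the same repository and you want to pass through every secret unchanged, `secrets: inherit` avoids listing them one by one:

```yaml
jobs:
  call-performance-test:
    uses: ./.github/workflows/reusable-performance-test.yml
    with:
      test-name: "API_Performance_Test"
      thresholds: "avgresponse<=150,errors==0"
    secrets: inherit  # forwards all of the caller's secrets to the reusable workflow
```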

## Tips for GitHub Actions Integration

1. **Caching**: Cache npm dependencies to speed up workflow runs:

   ```yaml
   - name: Cache Node modules
     uses: actions/cache@v4
     with:
       path: ~/.npm
       key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
       restore-keys: |
         ${{ runner.os }}-node-
   ```

2. **Concurrency Control**: Limit concurrent performance tests:

   ```yaml
   concurrency:
     group: performance-test-${{ github.ref }}
     cancel-in-progress: false
   ```

3. **Environment-specific Tests**: Use GitHub environments for different test configurations:

   ```yaml
   jobs:
     performance-test:
       runs-on: ubuntu-latest
       environment: staging
       # Environment-specific variables are available here
   ```

4. **Conditional Testing**: Only run performance tests when specific files change:

   ```yaml
   jobs:
     performance-test:
       if: |
         contains(github.event.pull_request.labels.*.name, 'performance-test') ||
         github.event_name == 'schedule' ||
         contains(github.event.head_commit.message, '[perf-test]')
   ```
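As an alternative to the manual cache step in tip 1, `actions/setup-node` has built-in npm caching; note that it expects a lockfile such as `package-lock.json` in the repository (or a `cache-dependency-path` pointing to one):

```yaml
- name: Setup Node.js
  uses: actions/setup-node@v4
  with:
    node-version: '20'
    cache: 'npm'  # caches the npm cache directory, keyed on the lockfile
```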

For more information, refer to the [GitHub Actions documentation](https://docs.github.com/en/actions) and the [LoadFocus API Client documentation](https://bitbucket.org/loadfocus/loadfocus-api-client).
