GitHub Actions
This guide explains how to integrate the LoadFocus JMeter API Client with GitHub Actions for automated performance testing.
Setup Steps
1. Store Credentials as GitHub Secrets
First, store your LoadFocus API credentials as GitHub repository secrets:
- Go to your GitHub repository
- Navigate to Settings > Secrets and variables > Actions
- Add the following repository secrets:
  - LOADFOCUS_API_KEY: your LoadFocus API key
  - LOADFOCUS_TEAM_ID: your LoadFocus team ID
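If you prefer the command line, the same secrets can be added with the GitHub CLI. This is a minimal sketch; it assumes gh is installed and authenticated, and the placeholder values must be replaced with your actual credentials:

```bash
# Run from inside the repository directory.
# --body passes the secret value directly; replace the placeholders.
gh secret set LOADFOCUS_API_KEY --body "your-loadfocus-api-key"
gh secret set LOADFOCUS_TEAM_ID --body "your-loadfocus-team-id"
```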
2. Create the GitHub Actions Workflow
Create a new file in your repository at .github/workflows/performance-test.yml:
```yaml
name: Performance Tests

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]
  # Optional: Run on a schedule
  schedule:
    - cron: '0 0 * * 1' # Run at midnight every Monday

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json
```
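If you also want the job to fail explicitly when thresholds are not met, a step like the following can inspect the JSON output. This is a sketch, not confirmed LoadFocus behavior: it assumes the results expose an overallResult field (as in the PR comment script further below) and that a passing run reports the value "passed". jq is preinstalled on GitHub-hosted Ubuntu runners.

```yaml
      # Hypothetical gate step: fail the job unless overallResult is "passed".
      # Field name and value are assumptions based on the JSON shape used
      # in the PR comment example in this guide.
      - name: Enforce Thresholds
        run: |
          RESULT=$(jq -r '.overallResult' performance_results.json)
          echo "Overall result: $RESULT"
          if [ "$RESULT" != "passed" ]; then
            exit 1
          fi
```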
3. Add Performance Tests to Your Deployment Workflow
To make deployment depend on the performance test results:
```yaml
name: Build, Test, and Deploy

on:
  push:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Your build steps...

  performance-test:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json

  deploy:
    needs: performance-test
    runs-on: ubuntu-latest
    steps:
      # Your deployment steps...
```
Advanced Configuration
Matrix Testing Across Multiple Environments
Run tests against multiple environments or configurations, as shown below:
```yaml
jobs:
  performance-test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        environment: [dev, staging, production]
        test-type: [api, frontend]
    steps:
      # Setup steps...

      - name: Run Performance Tests
        run: |
          TEST_NAME="${{ matrix.test-type }}_test_${{ matrix.environment }}"

          # Adjust thresholds based on environment
          if [ "${{ matrix.environment }}" == "production" ]; then
            THRESHOLDS="avgresponse<=150,errors==0,p95<=200"
          else
            THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
          fi

          loadfocus-api jmeter run-test \
            --name "$TEST_NAME" \
            --thresholds "$THRESHOLDS" \
            --format json > "performance_results_${{ matrix.environment }}_${{ matrix.test-type }}.json"
```
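Because every matrix job writes its own results file, each job should also upload its results under a unique artifact name so the jobs do not collide. A sketch reusing the upload step from step 2; the artifact naming scheme is an assumption:

```yaml
      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          # Unique name per matrix combination to avoid artifact collisions
          name: performance-test-results-${{ matrix.environment }}-${{ matrix.test-type }}
          path: performance_results_${{ matrix.environment }}_${{ matrix.test-type }}.json
```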
Generating Performance Test Reports
Generate HTML reports from the JSON results:
```yaml
- name: Generate HTML Report
  run: |
    # Install report generator
    npm install -g performance-report-generator

    # Generate HTML report
    performance-report-generator \
      --input performance_results.json \
      --output performance_report.html

    # Collect the report where the Pages action expects it (publish_dir below)
    mkdir -p reports
    cp performance_report.html reports/

- name: Upload HTML Report
  uses: actions/upload-artifact@v3
  with:
    name: performance-test-report
    path: performance_report.html

# Optional: Publish to GitHub Pages
- name: Publish to GitHub Pages
  uses: peaceiris/actions-gh-pages@v3
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    publish_dir: ./reports
    destination_dir: performance-reports
```
Posting Test Results as a PR Comment
Add performance test results as a comment on pull requests:
```yaml
- name: Comment PR
  uses: actions/github-script@v6
  if: github.event_name == 'pull_request'
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    script: |
      const fs = require('fs');
      const results = JSON.parse(fs.readFileSync('performance_results.json', 'utf8'));

      let comment = '## Performance Test Results\n\n';
      comment += `**Overall Result:** ${results.overallResult}\n\n`;
      comment += '### Results by Label\n\n';

      for (const label of results.labels) {
        comment += `#### ${label.label}\n`;
        comment += `- **Result:** ${label.result}\n`;
        comment += `- **Samples:** ${label.metrics.samples}\n`;
        comment += `- **Avg Response:** ${label.metrics.avgresponse}ms\n`;
        comment += `- **Error Rate:** ${label.metrics.errors}\n\n`;
      }

      github.rest.issues.createComment({
        issue_number: context.issue.number,
        owner: context.repo.owner,
        repo: context.repo.repo,
        body: comment
      });
```
Reusable Workflow
Create a reusable workflow for performance tests:
```yaml
# .github/workflows/reusable-performance-test.yml
name: Reusable Performance Test

on:
  workflow_call:
    inputs:
      test-name:
        required: true
        type: string
      thresholds:
        required: false
        type: string
        default: "avgresponse<=200,errors==0,p95<=250"
      wait-timeout:
        required: false
        type: number
        default: 1800
    secrets:
      LOADFOCUS_API_KEY:
        required: true
      LOADFOCUS_TEAM_ID:
        required: true

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "${{ inputs.test-name }}" \
            --thresholds "${{ inputs.thresholds }}" \
            --waitTimeout ${{ inputs.wait-timeout }} \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json
```
Then call it from another workflow:
```yaml
# .github/workflows/main.yml
jobs:
  call-performance-test:
    uses: ./.github/workflows/reusable-performance-test.yml
    with:
      test-name: "API_Performance_Test"
      thresholds: "avgresponse<=150,errors==0"
    secrets:
      LOADFOCUS_API_KEY: ${{ secrets.LOADFOCUS_API_KEY }}
      LOADFOCUS_TEAM_ID: ${{ secrets.LOADFOCUS_TEAM_ID }}
```
Tips for the GitHub Actions Integration
Caching: cache npm dependencies to speed up workflow runs:
```yaml
- name: Cache Node modules
  uses: actions/cache@v3
  with:
    path: ~/.npm
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-node-
```
Concurrency control: limit concurrent performance test runs:
```yaml
concurrency:
  group: performance-test-${{ github.ref }}
  cancel-in-progress: false
```
Environment-specific tests: use GitHub environments for different test configurations:
```yaml
jobs:
  performance-test:
    runs-on: ubuntu-latest
    environment: staging
    # Environment-specific variables are available here
```
Conditional tests: run performance tests only under specific conditions, such as a PR label, a scheduled run, or a commit-message flag:
```yaml
jobs:
  performance-test:
    if: |
      contains(github.event.pull_request.labels.*.name, 'performance-test') ||
      github.event_name == 'schedule' ||
      contains(github.event.head_commit.message, '[perf-test]')
```
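If you instead want to run the tests only when performance-relevant files change, a paths filter on the trigger handles this at the workflow level. A sketch; the listed paths are hypothetical and should be adjusted to your repository layout:

```yaml
on:
  pull_request:
    # Hypothetical paths - replace with the files that affect performance
    paths:
      - 'src/**'
      - 'jmeter/**'
      - '.github/workflows/performance-test.yml'
```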
For more information, see the GitHub Actions documentation and the LoadFocus API Client documentation.