GitLab CI/CD

This guide explains how to integrate the LoadFocus JMeter API Client with GitLab CI/CD for automated performance testing.

Setup Steps

1. Store Credentials as GitLab CI/CD Variables

First, store your LoadFocus API credentials as GitLab CI/CD variables:

  1. Go to your GitLab project
  2. Navigate to Settings > CI/CD > Variables
  3. Add the following variables:
    • LOADFOCUS_API_KEY: your LoadFocus API key (mark it as "Masked")
    • LOADFOCUS_TEAM_ID: your LoadFocus team ID
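The same variables can also be created non-interactively through GitLab's project-level variables REST API. The sketch below only assembles and prints the curl command rather than executing it, so it is safe to run anywhere; the project ID and the token/value placeholders are assumptions you would replace with your own values:

```shell
# Dry-run sketch: print the GitLab API call that would create a masked CI/CD variable.
# PROJECT_ID, <your-token>, and <your-api-key> are placeholders.
PROJECT_ID=12345

CMD="curl --request POST \
  --header 'PRIVATE-TOKEN: <your-token>' \
  'https://gitlab.com/api/v4/projects/${PROJECT_ID}/variables' \
  --form 'key=LOADFOCUS_API_KEY' \
  --form 'value=<your-api-key>' \
  --form 'masked=true'"

# Print the command instead of running it; remove the echo to actually send it.
echo "$CMD"
```

Repeat with key=LOADFOCUS_TEAM_ID for the second variable; only the API key needs masked=true.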

2. Create the GitLab CI/CD Pipeline

Create or update the .gitlab-ci.yml file in your repository:

stages:
  - build
  - test
  - performance
  - deploy

variables:
  NODE_VERSION: "16"

build:
  stage: build
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week

test:
  stage: test
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm test

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Install LoadFocus JMeter API Client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure LoadFocus API Client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    # Run Performance Tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always
  # Optional: only run on specific branches
  only:
    - main
    - develop

deploy:
  stage: deploy
  script:
    - echo "Deploying application..."
  only:
    - main
  # Only deploy if all previous stages succeeded
  when: on_success
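The --thresholds value passed to run-test above is a single comma-separated string of metric checks. A small shell sketch, independent of the LoadFocus CLI, that splits the example string into its individual checks for inspection:

```shell
# Split the example thresholds string into one check per line.
THRESHOLDS="avgresponse<=200,errors==0,p95<=250"
echo "$THRESHOLDS" | tr ',' '\n'
# →
# avgresponse<=200
# errors==0
# p95<=250
```

Each entry pairs a metric name with a comparison the test run must satisfy; the pipeline fails the job when a check is violated.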

3. View Test Results

After the pipeline has run:

  1. Go to your GitLab project
  2. Navigate to CI/CD > Pipelines
  3. Find your pipeline and click on it
  4. Open the "performance_test" job
  5. Click "Browse" in the right sidebar to view the artifacts
  6. Download and review the performance_results.json file
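Instead of going through the UI, the results file can also be fetched with GitLab's job-artifacts API. The sketch below only assembles and prints the raw-artifact URL (project ID, branch, and job name are placeholders; an authenticated curl request would be needed to actually download it):

```shell
# Build the raw-artifact URL for performance_results.json from the performance_test job.
PROJECT_ID=12345      # placeholder: your numeric project ID
REF="main"            # branch whose latest successful pipeline should be used
JOB="performance_test"

URL="https://gitlab.com/api/v4/projects/${PROJECT_ID}/jobs/artifacts/${REF}/raw/performance_results.json?job=${JOB}"

# Printing instead of fetching keeps the sketch runnable without credentials;
# in practice, pass the URL to curl with a PRIVATE-TOKEN header.
echo "$URL"
```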

Advanced Configuration

Environment-Specific Tests

Run different performance tests for different environments:

.performance_test_template: &performance_test_definition
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always

performance_test_develop:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop

performance_test_staging:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Staging"
    THRESHOLDS: "avgresponse<=250,errors==0,p95<=350"
  only:
    - staging

performance_test_production:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Production"
    THRESHOLDS: "avgresponse<=200,errors==0,p95<=250"
  only:
    - main

Parallel Tests

Run multiple performance tests in parallel (jobs in the same stage run concurrently):

performance_test_api:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "API_Performance_Test" \
        --thresholds "avgresponse<=150,errors==0" \
        --format json > api_performance_results.json
  artifacts:
    paths:
      - api_performance_results.json
    expire_in: 1 week

performance_test_ui:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "UI_Performance_Test" \
        --thresholds "avgresponse<=300,errors==0" \
        --format json > ui_performance_results.json
  artifacts:
    paths:
      - ui_performance_results.json
    expire_in: 1 week

Tips for the GitLab CI/CD Integration

  1. Caching: cache npm dependencies to speed up pipeline runs:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      cache:
        key: ${CI_COMMIT_REF_SLUG}
        paths:
          - node_modules/
      script:
        - npm install -g @loadfocus/loadfocus-api-client
        # Rest of the script...
  2. Timeout settings: set a timeout for long-running performance tests:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      timeout: 2h # Set a 2-hour timeout
      script:
        # Performance test script...
  3. Manual triggers: allow performance tests to be triggered manually:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      script:
        # Performance test script...
      when: manual

For more information, see the GitLab CI/CD documentation and the LoadFocus API Client documentation.