GitLab CI/CD
This guide explains how to integrate the LoadFocus JMeter API client with GitLab CI/CD for automated performance testing.
Setup Steps
1. Store credentials as GitLab CI/CD variables
First, store your LoadFocus API credentials as GitLab CI/CD variables:
- Go to your GitLab project
- Navigate to Settings > CI/CD > Variables
- Add the following variables:
  - LOADFOCUS_API_KEY: Your LoadFocus API key (mark it as "Masked")
  - LOADFOCUS_TEAM_ID: Your LoadFocus team ID
2. Create a GitLab CI/CD pipeline
Create or update the .gitlab-ci.yml file in your repository:
```yaml
stages:
  - build
  - test
  - performance
  - deploy

variables:
  NODE_VERSION: "16"

build:
  stage: build
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week

test:
  stage: test
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm test

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Install the LoadFocus JMeter API client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure the LoadFocus API client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    # Run performance tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always
  # Optional: only run on specific branches
  only:
    - main
    - develop

deploy:
  stage: deploy
  script:
    - echo "Deploying application..."
  only:
    - main
  # Deploy only if all previous stages succeeded
  when: on_success
```
3. View test results
After the pipeline has run:
- Go to your GitLab project
- Navigate to CI/CD > Pipelines
- Find your pipeline and click on it
- Go to the "performance_test" job
- Click "Browse" in the right sidebar to view the artifacts
- Download and view the performance_results.json file
Advanced Configuration
Environment-specific testing
Run different performance tests for different environments:
```yaml
.performance_test_template: &performance_test_definition
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always

performance_test_develop:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop

performance_test_staging:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Staging"
    THRESHOLDS: "avgresponse<=250,errors==0,p95<=350"
  only:
    - staging

performance_test_production:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Production"
    THRESHOLDS: "avgresponse<=200,errors==0,p95<=250"
  only:
    - main
```
Parallel testing
Run multiple performance tests in parallel:
```yaml
performance_test_api:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "API_Performance_Test" \
        --thresholds "avgresponse<=150,errors==0" \
        --format json > api_performance_results.json
  artifacts:
    paths:
      - api_performance_results.json
    expire_in: 1 week

performance_test_ui:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "UI_Performance_Test" \
        --thresholds "avgresponse<=300,errors==0" \
        --format json > ui_performance_results.json
  artifacts:
    paths:
      - ui_performance_results.json
    expire_in: 1 week
```
Then create a pipeline schedule in GitLab:
- Go to your GitLab project
- Navigate to CI/CD > Schedules
- Click "New schedule"
- Configure a schedule (e.g. every day at midnight)
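As written, the parallel jobs above run on every pipeline, scheduled or not. If they should run only on the schedule, each job can be gated with a `rules` clause on GitLab's predefined `CI_PIPELINE_SOURCE` variable (a sketch, shown for one job):

```yaml
performance_test_api:
  rules:
    # Run this job only in pipelines started by a schedule
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
```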
Creating performance reports
Generate HTML reports from the JSON results:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Generate an HTML report
    - npm install -g performance-report-generator  # Replace with an actual report generator
    - performance-report-generator --input performance_results.json --output performance_report.html
  artifacts:
    paths:
      - performance_results.json
      - performance_report.html
    expire_in: 1 week
    when: always

# Optional: publish the report with GitLab Pages
pages:
  stage: deploy
  dependencies:
    - performance_test
  script:
    - mkdir -p public/performance-reports
    - cp performance_report.html public/performance-reports/index.html
  artifacts:
    paths:
      - public
  only:
    - main
```
Integration with GitLab Features
Merge request widgets
Display performance test results in merge requests using GitLab's JUnit report feature:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Convert to JUnit format (using a custom script)
    - node convert-to-junit.js performance_results.json junit-report.xml
  artifacts:
    reports:
      junit: junit-report.xml
    paths:
      - performance_results.json
    expire_in: 1 week
```
GitLab metrics
Use GitLab's metrics report feature to track performance over time:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Extract metrics and report them to GitLab
    - |
      # Parse the JSON and extract metrics
      RESPONSE_TIME=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
      ERROR_RATE=$(jq '.labels[0].metrics.errors' performance_results.json)
      # Report the metrics
      echo "performance_avg_response_time ${RESPONSE_TIME}" >> metrics.txt
      echo "performance_error_rate ${ERROR_RATE}" >> metrics.txt
  artifacts:
    reports:
      metrics: metrics.txt
    paths:
      - performance_results.json
    expire_in: 1 week
```
GitLab environments
Tie performance tests to specific environments:
```yaml
performance_test_staging:
  stage: performance
  image: node:${NODE_VERSION}
  environment:
    name: staging
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "Staging_Performance_Test" \
        --thresholds "avgresponse<=250,errors==0" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
  only:
    - staging
```
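GitLab can also associate the environment with the address being tested. The `environment:url` keyword (shown here with a placeholder URL, which you would replace with your real staging address) makes the environment clickable in the GitLab UI:

```yaml
performance_test_staging:
  environment:
    name: staging
    # Placeholder; replace with your actual staging URL
    url: https://staging.example.com
```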
Tips for GitLab CI/CD Integration
Caching: Cache npm dependencies to speed up pipeline runs:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - node_modules/
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    # Rest of the script...
```

Timeout settings: Set timeouts for long-running performance tests:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  timeout: 2h  # Set a 2-hour timeout
  script:
    # Performance test script...
```

Manual triggers: Allow performance tests to be triggered manually:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Performance test script...
  when: manual
```

Dynamic test configuration: Use GitLab's predefined variables to configure tests dynamically:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - |
      # Set thresholds based on the branch
      if [ "$CI_COMMIT_REF_NAME" = "main" ]; then
        THRESHOLDS="avgresponse<=200,errors==0,p95<=250"
      else
        THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
      fi
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "$THRESHOLDS" \
        --format json > performance_results.json
```

Notifications: Send notifications when performance tests fail. Note that `after_script` runs in a separate shell, so `$?` does not carry the job's exit status; use the predefined `CI_JOB_STATUS` variable instead:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Performance test script...
  after_script:
    - |
      if [ "$CI_JOB_STATUS" = "failed" ]; then
        # Send a notification via curl, the GitLab API, or another method
        curl -X POST -H "Content-Type: application/json" \
          -d "{\"text\":\"Performance test failed for ${CI_PROJECT_NAME}\"}" \
          $WEBHOOK_URL
      fi
```
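An alternative to checking the job status in `after_script` is a dedicated notification job with `when: on_failure`, which GitLab runs only if an earlier job in the pipeline failed. A sketch, assuming `$WEBHOOK_URL` is stored as a CI/CD variable like the credentials above:

```yaml
notify_performance_failure:
  stage: .post        # built-in stage that always runs last
  when: on_failure    # run only if an earlier job failed
  script:
    - |
      curl -X POST -H "Content-Type: application/json" \
        -d "{\"text\":\"Performance test failed for ${CI_PROJECT_NAME}\"}" \
        $WEBHOOK_URL
```

This keeps the test job itself free of notification logic and makes the failure path visible as its own job in the pipeline view.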
For more information, see the GitLab CI/CD documentation and the LoadFocus API client documentation.