GitLab CI/CD
This guide explains how to integrate the LoadFocus JMeter API Client with GitLab CI/CD for automated performance testing.
Setup Steps
1. Store Your Credentials as GitLab CI/CD Variables
First, store your LoadFocus API credentials as GitLab CI/CD variables:
- Go to your GitLab project
- Navigate to Settings > CI/CD > Variables
- Add the following variables:
  - LOADFOCUS_API_KEY: your LoadFocus API key (mark it as "Masked")
  - LOADFOCUS_TEAM_ID: your LoadFocus team ID
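If you prefer the command line, the same variables can be created through the GitLab REST API (`POST /projects/:id/variables`). The token, project ID, and key value below are placeholders; substitute your own before running:

```shell
GITLAB_TOKEN="<your-personal-access-token>"   # placeholder
PROJECT_ID="12345"                            # placeholder: your project's numeric ID

# Create a masked project-level CI/CD variable for the API key.
curl --request POST \
  --header "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
  --form "key=LOADFOCUS_API_KEY" \
  --form "value=<your-loadfocus-api-key>" \
  --form "masked=true" \
  "https://gitlab.com/api/v4/projects/${PROJECT_ID}/variables"

# Repeat for the team ID (no masking needed).
curl --request POST \
  --header "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
  --form "key=LOADFOCUS_TEAM_ID" \
  --form "value=<your-loadfocus-team-id>" \
  "https://gitlab.com/api/v4/projects/${PROJECT_ID}/variables"
```

The token needs the `api` scope; masked values must meet GitLab's masking requirements (e.g. no spaces).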
2. Create the GitLab CI/CD Pipeline
Create or update the .gitlab-ci.yml file in your repository:
```yaml
stages:
  - build
  - test
  - performance
  - deploy

variables:
  NODE_VERSION: "16"

build:
  stage: build
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week

test:
  stage: test
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm test

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Install LoadFocus JMeter API Client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure LoadFocus API Client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    # Run Performance Tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always
  # Optional: Only run on specific branches
  only:
    - main
    - develop

deploy:
  stage: deploy
  script:
    - echo "Deploying application..."
  only:
    - main
  # Only deploy if all previous stages succeeded
  when: on_success
```
3. View the Test Results
After the pipeline runs:
- Go to your GitLab project
- Navigate to CI/CD > Pipelines
- Find your pipeline and click on it
- Go to the "performance_test" job
- Click "Browse" in the right sidebar to view the artifacts
- Download and view the performance_results.json file
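Once downloaded, the results can be inspected from the command line with `jq`. The JSON schema used here (`.labels[0].metrics.*`) is an assumption based on the jq queries shown later in this guide, and the sample payload is fabricated for illustration:

```shell
# Fabricated sample of what performance_results.json might contain;
# adjust the jq paths to match your actual LoadFocus output.
cat > performance_results.json <<'EOF'
{"labels":[{"metrics":{"avgresponse":183,"errors":0,"p95":241}}]}
EOF

# Pull out the headline metrics in one line.
jq -r '.labels[0].metrics | "avg=\(.avgresponse)ms errors=\(.errors) p95=\(.p95)ms"' \
  performance_results.json
# → avg=183ms errors=0 p95=241ms
```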
Advanced Configuration
Environment-Specific Testing
Run different performance tests against different environments:
```yaml
.performance_test_template: &performance_test_definition
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always

performance_test_develop:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop

performance_test_staging:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Staging"
    THRESHOLDS: "avgresponse<=250,errors==0,p95<=350"
  only:
    - staging

performance_test_production:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Production"
    THRESHOLDS: "avgresponse<=200,errors==0,p95<=250"
  only:
    - main
```
Parallel Testing
Run multiple performance tests in parallel:
```yaml
performance_test_api:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "API_Performance_Test" \
        --thresholds "avgresponse<=150,errors==0" \
        --format json > api_performance_results.json
  artifacts:
    paths:
      - api_performance_results.json
    expire_in: 1 week

performance_test_ui:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "UI_Performance_Test" \
        --thresholds "avgresponse<=300,errors==0" \
        --format json > ui_performance_results.json
  artifacts:
    paths:
      - ui_performance_results.json
    expire_in: 1 week
```
Then create a pipeline schedule in GitLab:
- Go to your GitLab project
- Navigate to CI/CD > Schedules
- Click "New schedule"
- Set the schedule (for example, daily at midnight)
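Schedules can also be created through the GitLab REST API (`POST /projects/:id/pipeline_schedules`). The token and project ID below are placeholders:

```shell
GITLAB_TOKEN="<your-personal-access-token>"   # placeholder
PROJECT_ID="12345"                            # placeholder: your project's numeric ID

# Create a schedule that runs the pipeline on main every day at midnight UTC.
curl --request POST \
  --header "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
  --form "description=Nightly performance tests" \
  --form "ref=main" \
  --form "cron=0 0 * * *" \
  --form "cron_timezone=UTC" \
  "https://gitlab.com/api/v4/projects/${PROJECT_ID}/pipeline_schedules"
```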
Creating Performance Reports
Generate HTML reports from the JSON results:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Generate HTML report
    - npm install -g performance-report-generator # Replace with actual report generator
    - performance-report-generator --input performance_results.json --output performance_report.html
  artifacts:
    paths:
      - performance_results.json
      - performance_report.html
    expire_in: 1 week
    when: always

# Optional: Publish report as GitLab Pages
pages:
  stage: deploy
  dependencies:
    - performance_test
  script:
    - mkdir -p public/performance-reports
    - cp performance_report.html public/performance-reports/index.html
  artifacts:
    paths:
      - public
  only:
    - main
```
Integration with GitLab Features
Merge Request Widgets
Display performance test results in merge requests using GitLab's JUnit report feature:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Convert to JUnit format (using a custom script)
    - node convert-to-junit.js performance_results.json junit-report.xml
  artifacts:
    reports:
      junit: junit-report.xml
    paths:
      - performance_results.json
    expire_in: 1 week
```
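The pipeline above calls a custom `convert-to-junit.js` script that is not shown here. As an illustration only, the sketch below shows roughly what such a conversion can do, written in shell with `jq` instead of Node; the input schema (`.labels[0].metrics.*`), the failure rule, and the sample payload are all assumptions:

```shell
# Fabricated input so the sketch is self-contained; in the real pipeline
# this file is produced by the loadfocus-api run.
cat > performance_results.json <<'EOF'
{"labels":[{"metrics":{"avgresponse":183,"errors":0}}]}
EOF

AVG=$(jq -r '.labels[0].metrics.avgresponse' performance_results.json)
ERRORS=$(jq -r '.labels[0].metrics.errors' performance_results.json)
FAILURES=0
if [ "$ERRORS" != "0" ]; then FAILURES=1; fi

# Emit a minimal JUnit XML document that GitLab's MR widget can render.
{
  echo '<?xml version="1.0" encoding="UTF-8"?>'
  echo "<testsuite name=\"loadfocus-performance\" tests=\"1\" failures=\"$FAILURES\">"
  echo "  <testcase name=\"avg-response-time\" time=\"$AVG\">"
  if [ "$FAILURES" -ne 0 ]; then
    echo "    <failure message=\"errors=$ERRORS reported by LoadFocus\"/>"
  fi
  echo '  </testcase>'
  echo '</testsuite>'
} > junit-report.xml
```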
GitLab Metrics
Use GitLab's metrics reports to track performance over time:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Extract metrics and report them to GitLab
    - |
      # Parse JSON and extract metrics
      RESPONSE_TIME=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
      ERROR_RATE=$(jq '.labels[0].metrics.errors' performance_results.json)
      # Report metrics
      echo "performance_avg_response_time ${RESPONSE_TIME}" >> metrics.txt
      echo "performance_error_rate ${ERROR_RATE}" >> metrics.txt
  artifacts:
    reports:
      metrics: metrics.txt
    paths:
      - performance_results.json
    expire_in: 1 week
```
GitLab Environments
Tie performance tests to specific environments:
```yaml
performance_test_staging:
  stage: performance
  image: node:${NODE_VERSION}
  environment:
    name: staging
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "Staging_Performance_Test" \
        --thresholds "avgresponse<=250,errors==0" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
  only:
    - staging
```
Tips for GitLab CI/CD Integration
Caching: Cache npm dependencies to speed up pipeline runs:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - node_modules/
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    # Rest of the script...
```
Timeout settings: Set timeouts for long-running performance tests:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  timeout: 2h # Set a 2-hour timeout
  script:
    # Performance test script...
```
Manual triggers: Allow performance tests to be triggered manually:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Performance test script...
  when: manual
```
Dynamic test configuration: Use GitLab's predefined variables to configure tests dynamically:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - |
      # Set thresholds based on branch
      if [ "$CI_COMMIT_REF_NAME" = "main" ]; then
        THRESHOLDS="avgresponse<=200,errors==0,p95<=250"
      else
        THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
      fi
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "$THRESHOLDS" \
        --format json > performance_results.json
```
Notifications: Send notifications when performance tests fail:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Performance test script...
  after_script:
    - |
      # after_script runs in a fresh shell, so $? does not carry the job
      # status; check the predefined CI_JOB_STATUS variable instead.
      if [ "$CI_JOB_STATUS" = "failed" ]; then
        # Send notification using curl, GitLab API, or other method
        curl -X POST -H "Content-Type: application/json" \
          -d "{\"text\":\"Performance test failed for ${CI_PROJECT_NAME}\"}" \
          $WEBHOOK_URL
      fi
```
For more information, see the GitLab CI/CD documentation and the LoadFocus API Client documentation.