GitLab CI/CD
This guide explains how to integrate the LoadFocus JMeter API Client with GitLab CI/CD for automated performance testing.
Setup Steps
1. Store Credentials as GitLab CI/CD Variables
First, store your LoadFocus API credentials as GitLab CI/CD variables:
- Go to your GitLab project
- Navigate to Settings > CI/CD > Variables
- Add the following variables:
  - LOADFOCUS_API_KEY: Your LoadFocus API key (mark it as "Masked")
  - LOADFOCUS_TEAM_ID: Your LoadFocus team ID
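These variables are exposed to the pipeline as $LOADFOCUS_API_KEY and $LOADFOCUS_TEAM_ID. If you want to sanity-check the credentials before wiring them into CI, you can run the same client commands the pipeline uses locally (requires Node.js; replace the placeholders with your real values):

```bash
# Same commands the pipeline runs, executed locally as a quick credential check
npm install -g @loadfocus/loadfocus-api-client
loadfocus-api config set apikey YOUR_API_KEY
loadfocus-api config set teamid YOUR_TEAM_ID
```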
2. Create a GitLab CI/CD Pipeline
Create or update the .gitlab-ci.yml file in your repository:
```yaml
stages:
  - build
  - test
  - performance
  - deploy

variables:
  NODE_VERSION: "16"

build:
  stage: build
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week

test:
  stage: test
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm test

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Install LoadFocus JMeter API Client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure LoadFocus API Client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    # Run Performance Tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always
  # Optional: Only run on specific branches
  only:
    - main
    - develop

deploy:
  stage: deploy
  script:
    - echo "Deploying application..."
  only:
    - main
  # Only deploy if all previous stages succeeded
  when: on_success
```
3. View Test Results
After the pipeline runs:
- Go to your GitLab project
- Navigate to CI/CD > Pipelines
- Find your pipeline and click on it
- Go to the "performance_test" job
- Click on "Browse" in the right sidebar to view artifacts
- Download and view the performance_results.json file
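To inspect the downloaded results from the command line, you can query the JSON with jq. The field path below matches the one used in the GitLab Metrics example later in this guide; adjust it if your results file is structured differently:

```bash
# Print the average response time and error count from the downloaded artifact
jq '.labels[0].metrics.avgresponse' performance_results.json
jq '.labels[0].metrics.errors' performance_results.json
```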
Advanced Configuration
Environment-specific Testing
Run different performance tests for different environments:
```yaml
.performance_test_template: &performance_test_definition
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always

performance_test_develop:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop

performance_test_staging:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Staging"
    THRESHOLDS: "avgresponse<=250,errors==0,p95<=350"
  only:
    - staging

performance_test_production:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Production"
    THRESHOLDS: "avgresponse<=200,errors==0,p95<=250"
  only:
    - main
```
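The same reuse can also be expressed with GitLab's extends: keyword instead of a YAML anchor, which some teams find easier to read. A minimal sketch of one job rewritten this way (intended to behave the same as the anchor-based version above):

```yaml
performance_test_develop:
  extends: .performance_test_template   # inherit the hidden template job defined above
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop
```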
Parallel Testing
Run multiple performance tests in parallel:
```yaml
performance_test_api:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "API_Performance_Test" \
        --thresholds "avgresponse<=150,errors==0" \
        --format json > api_performance_results.json
  artifacts:
    paths:
      - api_performance_results.json
    expire_in: 1 week

performance_test_ui:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "UI_Performance_Test" \
        --thresholds "avgresponse<=300,errors==0" \
        --format json > ui_performance_results.json
  artifacts:
    paths:
      - ui_performance_results.json
    expire_in: 1 week
```
To run these performance tests on a schedule rather than only on pushes, create a pipeline schedule in GitLab (a schedule-only job sketch follows these steps):
- Go to your GitLab project
- Navigate to CI/CD > Schedules
- Click "New schedule"
- Set up a schedule (e.g., every day at midnight)
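If the performance job should run only from that schedule and not on every push, you can restrict it to scheduled pipelines. A minimal sketch, where the job and test names are illustrative:

```yaml
performance_test_scheduled:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "Scheduled_${CI_PROJECT_NAME}" \
        --thresholds "avgresponse<=200,errors==0" \
        --format json > performance_results.json
  only:
    - schedules   # run only when triggered by a pipeline schedule
```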
Creating Performance Reports
Generate HTML reports from JSON results:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Generate HTML report
    - npm install -g performance-report-generator  # Replace with actual report generator
    - performance-report-generator --input performance_results.json --output performance_report.html
  artifacts:
    paths:
      - performance_results.json
      - performance_report.html
    expire_in: 1 week
    when: always

# Optional: Publish report as GitLab Pages
pages:
  stage: deploy
  dependencies:
    - performance_test
  script:
    - mkdir -p public/performance-reports
    - cp performance_report.html public/performance-reports/index.html
  artifacts:
    paths:
      - public
  only:
    - main
```
Integration with GitLab Features
Merge Request Widgets
Display performance test results in merge requests by using GitLab's JUnit report feature:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Convert to JUnit format (using a custom script)
    - node convert-to-junit.js performance_results.json junit-report.xml
  artifacts:
    reports:
      junit: junit-report.xml
    paths:
      - performance_results.json
    expire_in: 1 week
```
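The convert-to-junit.js file referenced above is a custom script you provide; this guide does not ship one. As a rough sketch of what it needs to produce, here is a shell/jq equivalent that writes a minimal one-test JUnit file. The JSON field paths are the same assumptions used in the metrics example below and may need adjusting for your results:

```bash
#!/bin/sh
# Sketch only: build a minimal JUnit report from the LoadFocus JSON results
AVG=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
ERRORS=$(jq '.labels[0].metrics.errors' performance_results.json)
FAILURES=0
if [ "$ERRORS" != "0" ]; then FAILURES=1; fi

cat > junit-report.xml <<EOF
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="LoadFocus performance test" tests="1" failures="${FAILURES}">
  <testcase classname="performance" name="load_test">
EOF
if [ "$FAILURES" = "1" ]; then
  echo "    <failure message=\"errors=${ERRORS}\"/>" >> junit-report.xml
fi
cat >> junit-report.xml <<EOF
    <system-out>avgresponse=${AVG}ms errors=${ERRORS}</system-out>
  </testcase>
</testsuite>
EOF
```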
GitLab Metrics
Use GitLab's metrics feature to track performance over time:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Extract metrics and report them to GitLab
    - |
      # Parse JSON and extract metrics
      RESPONSE_TIME=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
      ERROR_RATE=$(jq '.labels[0].metrics.errors' performance_results.json)
      # Report metrics
      echo "performance_avg_response_time ${RESPONSE_TIME}" >> metrics.txt
      echo "performance_error_rate ${ERROR_RATE}" >> metrics.txt
  artifacts:
    reports:
      metrics: metrics.txt
    paths:
      - performance_results.json
    expire_in: 1 week
```
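The metrics report expects plain "metric_name value" lines (OpenMetrics-style text). With the echo commands above, the uploaded metrics.txt would look roughly like this (the numbers are illustrative):

```text
performance_avg_response_time 187
performance_error_rate 0
```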
GitLab Environments
Associate performance tests with specific environments:
```yaml
performance_test_staging:
  stage: performance
  image: node:${NODE_VERSION}
  environment:
    name: staging
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "Staging_Performance_Test" \
        --thresholds "avgresponse<=250,errors==0" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
  only:
    - staging
```
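If the environment under test has a stable URL, you can attach it to the environment block so the GitLab environment page links to the deployment being load tested. The URL below is a placeholder, not a LoadFocus value:

```yaml
environment:
  name: staging
  url: https://staging.example.com   # placeholder: URL of the deployment being load tested
```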
Tips for GitLab CI/CD Integration
Caching: Cache npm dependencies to speed up pipeline runs:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      # Note: a global (-g) install does not populate node_modules/, so this cache
      # only helps if the job also runs a local npm install; otherwise cache .npm/.
      - node_modules/
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    # Rest of the script...
```
Timeout Settings: Set timeouts for long-running performance tests:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  timeout: 2h  # Set a 2-hour timeout
  script:
    # Performance test script...
```
Manual Triggers: Allow performance tests to be triggered manually:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Performance test script...
  when: manual
```
Dynamic Test Configuration: Use GitLab predefined variables to dynamically configure tests:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - |
      # Set thresholds based on branch
      if [ "$CI_COMMIT_REF_NAME" = "main" ]; then
        THRESHOLDS="avgresponse<=200,errors==0,p95<=250"
      else
        THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
      fi

      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "$THRESHOLDS" \
        --format json > performance_results.json
```
Notifications: Send notifications when performance tests fail:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Performance test script...
  after_script:
    - |
      # after_script runs in a separate shell, so check the job status variable
      # rather than $? to detect a failed job
      if [ "$CI_JOB_STATUS" = "failed" ]; then
        # Send notification using curl, GitLab API, or other method
        curl -X POST -H "Content-Type: application/json" \
          -d "{\"text\":\"Performance test failed for ${CI_PROJECT_NAME}\"}" \
          $WEBHOOK_URL
      fi
```
For more information, refer to the GitLab CI/CD documentation and the LoadFocus API Client documentation.