Jenkins
This tutorial explains how to integrate the LoadFocus JMeter API client with Jenkins for automated performance testing.
Setup Steps
1. Store Credentials in Jenkins
First, store your LoadFocus API credentials securely in Jenkins:
- Navigate to Jenkins Dashboard > Manage Jenkins > Manage Credentials
- Select the appropriate credentials domain (for example, global)
- Click "Add Credentials"
- Add the following credentials:
- Kind: Secret text
- Scope: Global
- Secret: Your LoadFocus API key
- ID: loadfocus-api-key
- Description: LoadFocus API Key
- Repeat for your team ID with ID: loadfocus-team-id
2. Create a Jenkins Pipeline
Create a Jenkinsfile in your repository:
pipeline {
    agent {
        docker {
            image 'node:16-alpine'
        }
    }
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        stage('Build') {
            steps {
                // Your build steps
                sh 'npm install'
                sh 'npm run build'
            }
        }
        stage('Test') {
            steps {
                // Your test steps
                sh 'npm test'
            }
        }
        stage('Performance Test') {
            steps {
                // Install LoadFocus JMeter API Client
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                // Configure LoadFocus API Client
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                // Run Performance Tests
                sh '''
                    loadfocus-api jmeter run-test \
                        --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                        --thresholds "avgresponse<=200,errors==0,p95<=250" \
                        --format json > performance_results.json
                '''
                // Archive the results
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
        stage('Deploy') {
            when {
                expression {
                    return currentBuild.resultIsBetterOrEqualTo('SUCCESS')
                }
            }
            steps {
                // Your deployment steps
                echo 'Deploying...'
            }
        }
    }
    post {
        always {
            // Clean up workspace
            cleanWs()
        }
    }
}
3. Configure the Jenkins Job
- Create a new Pipeline job in Jenkins
- Configure the pipeline to use your Jenkinsfile
- Set the appropriate SCM configuration to fetch your repository
Advanced Configuration
Declarative Pipeline with Parallel Testing
Run multiple performance tests in parallel:
pipeline {
    agent any
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        // Previous stages...
        stage('Performance Tests') {
            parallel {
                stage('API Performance') {
                    agent {
                        docker {
                            image 'node:16-alpine'
                        }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "API_Performance_Test" \
                                --thresholds "avgresponse<=150,errors==0" \
                                --format json > api_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'api_performance_results.json', fingerprint: true
                    }
                }
                stage('UI Performance') {
                    agent {
                        docker {
                            image 'node:16-alpine'
                        }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "UI_Performance_Test" \
                                --thresholds "avgresponse<=300,errors==0" \
                                --format json > ui_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'ui_performance_results.json', fingerprint: true
                    }
                }
            }
        }
        // Next stages...
    }
}
Scripted Pipeline
For greater flexibility, use a scripted pipeline:
node {
    def performanceTestPassed = false
    stage('Checkout') {
        checkout scm
    }
    stage('Build & Test') {
        // Your build and test steps
    }
    stage('Performance Test') {
        docker.image('node:16-alpine').inside {
            withCredentials([
                string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
                string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
            ]) {
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                try {
                    sh '''
                        loadfocus-api jmeter run-test \
                            --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                            --thresholds "avgresponse<=200,errors==0,p95<=250" \
                            --format json > performance_results.json
                    '''
                    // Check if test passed by examining the JSON
                    def testResults = readJSON file: 'performance_results.json'
                    if (testResults.overallResult == 'PASSED') {
                        performanceTestPassed = true
                        echo "Performance test passed!"
                    } else {
                        echo "Performance test failed to meet thresholds!"
                        // Optional: Fail the build
                        // error "Performance test failed"
                    }
                } catch (Exception e) {
                    echo "Error running performance test: ${e.message}"
                }
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
    }
    stage('Deploy') {
        if (performanceTestPassed) {
            echo 'Deploying...'
            // Your deployment steps
        } else {
            echo 'Skipping deployment due to performance test failure'
        }
    }
}
Shared Library
Create a shared library for reusable performance testing:
// vars/performanceTest.groovy
def call(Map config = [:]) {
    def testName = config.testName ?: "Jenkins_${env.JOB_NAME}_${env.BUILD_NUMBER}"
    def thresholds = config.thresholds ?: "avgresponse<=200,errors==0,p95<=250"
    def waitTimeout = config.waitTimeout ?: 1800
    def resultsFile = config.resultsFile ?: "performance_results.json"
    docker.image('node:16-alpine').inside {
        withCredentials([
            string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
            string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
        ]) {
            sh 'npm install -g @loadfocus/loadfocus-api-client'
            sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
            sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
            sh """
                loadfocus-api jmeter run-test \\
                    --name "${testName}" \\
                    --thresholds "${thresholds}" \\
                    --waitTimeout ${waitTimeout} \\
                    --format json > ${resultsFile}
            """
            archiveArtifacts artifacts: resultsFile, fingerprint: true
            // Return the test results
            def testResults = readJSON file: resultsFile
            return testResults
        }
    }
}
Then, in your Jenkinsfile:
@Library('my-shared-library') _
pipeline {
    agent any
    stages {
        stage('Performance Test') {
            steps {
                script {
                    def results = performanceTest(
                        testName: "API_Performance_Test",
                        thresholds: "avgresponse<=150,errors==0"
                    )
                    if (results.overallResult != 'PASSED') {
                        error "Performance test failed"
                    }
                }
            }
        }
    }
}
Integration with Jenkins Plugins
Performance Plugin
Use the Jenkins Performance plugin to visualize test results:
- Install the Performance plugin in Jenkins
- Convert the LoadFocus results into a format the plugin supports (JMeter CSV or JUnit XML)
- Configure your pipeline:
stage('Performance Test') {
    steps {
        // Run LoadFocus test
        sh '''
            loadfocus-api jmeter run-test \
                --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                --thresholds "avgresponse<=200,errors==0,p95<=250" \
                --format json > performance_results.json

            # Convert to JMeter CSV format (using a custom script)
            node convert-to-jmeter.js performance_results.json performance_results.csv
        '''
        // Use Performance Plugin
        perfReport sourceDataFiles: 'performance_results.csv',
            errorFailedThreshold: 0,
            errorUnstableThreshold: 0,
            errorUnstableResponseTimeThreshold: '200'
    }
}
Email Notifications
Send email notifications with the test results:
post {
    always {
        script {
            if (fileExists('performance_results.json')) {
                def results = readJSON file: 'performance_results.json'
                def resultStatus = results.overallResult == 'PASSED' ? 'SUCCESS' : 'FAILURE'
                def subject = "Performance Test ${resultStatus}: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
                // Create email body
                def body = """
                    <h2>Performance Test Results</h2>
                    <p><strong>Overall Result:</strong> ${results.overallResult}</p>
                    <h3>Results by Label</h3>
                    <table border="1">
                        <tr>
                            <th>Label</th>
                            <th>Result</th>
                            <th>Avg Response</th>
                            <th>Errors</th>
                        </tr>
                """
                results.labels.each { label ->
                    body += """
                        <tr>
                            <td>${label.label}</td>
                            <td>${label.result}</td>
                            <td>${label.metrics.avgresponse}ms</td>
                            <td>${label.metrics.errors}</td>
                        </tr>
                    """
                }
                body += "</table>"
                emailext (
                    subject: subject,
                    body: body,
                    to: 'team@example.com',
                    attachmentsPattern: 'performance_results.json',
                    mimeType: 'text/html'
                )
            }
        }
    }
}
Tips for Jenkins Integration
Timeout handling: Set timeouts for long-running performance tests:
stage('Performance Test') {
    options {
        timeout(time: 60, unit: 'MINUTES')
    }
    steps {
        // Performance test steps
    }
}
Conditional execution: Run performance tests only on specific branches:
stage('Performance Test') {
    when {
        anyOf {
            branch 'main'
            branch 'develop'
            tag pattern: "v\\d+\\.\\d+\\.\\d+", comparator: "REGEXP"
        }
    }
    steps {
        // Performance test steps
    }
}
Scheduled testing: Run performance tests on a schedule:
pipeline {
    agent any
    triggers {
        cron('0 0 * * *') // Run at midnight every day
    }
    stages {
        // Pipeline stages
    }
}
Parameterized tests: Allow test parameters to be customized:
pipeline {
    agent any
    parameters {
        string(name: 'TEST_NAME', defaultValue: 'API_Performance_Test', description: 'Name of the LoadFocus test to run')
        string(name: 'THRESHOLDS', defaultValue: 'avgresponse<=200,errors==0', description: 'Performance thresholds')
        string(name: 'WAIT_TIMEOUT', defaultValue: '1800', description: 'Maximum wait time in seconds')
    }
    stages {
        stage('Performance Test') {
            steps {
                // Run test with parameters
                sh """
                    loadfocus-api jmeter run-test \\
                        --name "${params.TEST_NAME}" \\
                        --thresholds "${params.THRESHOLDS}" \\
                        --waitTimeout ${params.WAIT_TIMEOUT} \\
                        --format json > performance_results.json
                """
            }
        }
    }
}
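The threshold strings used throughout this guide (for example `avgresponse<=200,errors==0,p95<=250`) follow a comma-separated metric/operator/value pattern. The LoadFocus service evaluates them server-side; the sketch below is a local re-implementation of that pattern, an assumption rather than the service's actual logic, which could be used to validate a `THRESHOLDS` job parameter before starting a run.

```javascript
// Illustrative sketch: parse and evaluate threshold expressions of the form
// "metric<op>value" joined by commas, mirroring the strings passed to
// `--thresholds` in this guide. Not part of the LoadFocus client.
function parseThresholds(spec) {
  return spec.split(',').map(part => {
    const m = part.trim().match(/^([a-z0-9]+)(<=|>=|==|<|>)(\d+(?:\.\d+)?)$/);
    if (!m) throw new Error(`Invalid threshold expression: ${part}`);
    return { metric: m[1], op: m[2], value: Number(m[3]) };
  });
}

// Check a set of measured metrics (e.g. { avgresponse: 150, errors: 0 })
// against every expression in the spec.
function meetsThresholds(metrics, spec) {
  const ops = {
    '<=': (a, b) => a <= b,
    '>=': (a, b) => a >= b,
    '==': (a, b) => a === b,
    '<':  (a, b) => a < b,
    '>':  (a, b) => a > b,
  };
  return parseThresholds(spec).every(t => ops[t.op](metrics[t.metric], t.value));
}
```

Running `parseThresholds` on a parameter value at the top of the pipeline fails fast on typos instead of after a full test run.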
For more information, see the Jenkins documentation and the LoadFocus API client documentation.