Jenkins Integration
This guide explains how to integrate the LoadFocus JMeter API Client with Jenkins for automated performance testing.
Setup Steps
1. Store Credentials in Jenkins
First, store your LoadFocus API credentials securely in Jenkins:
- Navigate to Jenkins Dashboard > Manage Jenkins > Manage Credentials
- Select the appropriate credentials domain (e.g. global)
- Click "Add Credentials"
- Add the following credential:
  - Kind: Secret text
  - Scope: Global
  - Secret: Your LoadFocus API key
  - ID: loadfocus-api-key
  - Description: LoadFocus API Key
- Repeat for your team ID with ID: loadfocus-team-id
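If you prefer to automate credential creation (for example, for reproducible Jenkins setups), the same two secret-text credentials can be added from the Jenkins Script Console. This is a sketch that assumes the Credentials and Plain Credentials plugins are installed; `YOUR_API_KEY` and `YOUR_TEAM_ID` are placeholders you must replace:

```groovy
import com.cloudbees.plugins.credentials.CredentialsScope
import com.cloudbees.plugins.credentials.SystemCredentialsProvider
import com.cloudbees.plugins.credentials.domains.Domain
import hudson.util.Secret
import org.jenkinsci.plugins.plaincredentials.impl.StringCredentialsImpl

def store = SystemCredentialsProvider.instance.store

// Secret-text credential for the LoadFocus API key (placeholder value)
store.addCredentials(Domain.global(), new StringCredentialsImpl(
    CredentialsScope.GLOBAL, 'loadfocus-api-key', 'LoadFocus API Key',
    Secret.fromString('YOUR_API_KEY')))

// Secret-text credential for the LoadFocus team ID (placeholder value)
store.addCredentials(Domain.global(), new StringCredentialsImpl(
    CredentialsScope.GLOBAL, 'loadfocus-team-id', 'LoadFocus Team ID',
    Secret.fromString('YOUR_TEAM_ID')))
```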
2. Create a Jenkins Pipeline
Create a Jenkinsfile in your repository:
```groovy
pipeline {
    agent {
        docker { image 'node:16-alpine' }
    }
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        stage('Build') {
            steps {
                // Your build steps
                sh 'npm install'
                sh 'npm run build'
            }
        }
        stage('Test') {
            steps {
                // Your test steps
                sh 'npm test'
            }
        }
        stage('Performance Test') {
            steps {
                // Install the LoadFocus JMeter API Client
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                // Configure the LoadFocus API Client
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                // Run the performance tests
                sh '''
                    loadfocus-api jmeter run-test \
                        --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                        --thresholds "avgresponse<=200,errors==0,p95<=250" \
                        --format json > performance_results.json
                '''
                // Archive the results
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
        stage('Deploy') {
            when {
                expression {
                    return currentBuild.resultIsBetterOrEqualTo('SUCCESS')
                }
            }
            steps {
                // Your deployment steps
                echo 'Deploying...'
            }
        }
    }
    post {
        always {
            // Clean up the workspace
            cleanWs()
        }
    }
}
```
3. Configure the Jenkins Job
- Create a new Pipeline job in Jenkins
- Configure the Pipeline to use your Jenkinsfile
- Set up the appropriate SCM configuration to fetch your repository
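If you manage Jenkins jobs as code, the steps above can also be expressed with the Job DSL plugin. This is a sketch under the assumption that the Job DSL plugin is installed; the job name, repository URL, and branch are placeholders:

```groovy
// Defines a Pipeline job that checks out the repository and runs its Jenkinsfile
pipelineJob('loadfocus-performance') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://github.com/your-org/your-repo.git')
                    }
                    branch('main')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}
```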
Advanced Configuration
Declarative Pipeline with Parallel Tests
Run multiple performance tests in parallel:
```groovy
pipeline {
    agent any
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        // Previous stages...
        stage('Performance Tests') {
            parallel {
                stage('API Performance') {
                    agent {
                        docker { image 'node:16-alpine' }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "API_Performance_Test" \
                                --thresholds "avgresponse<=150,errors==0" \
                                --format json > api_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'api_performance_results.json', fingerprint: true
                    }
                }
                stage('UI Performance') {
                    agent {
                        docker { image 'node:16-alpine' }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "UI_Performance_Test" \
                                --thresholds "avgresponse<=300,errors==0" \
                                --format json > ui_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'ui_performance_results.json', fingerprint: true
                    }
                }
            }
        }
        // Next stages...
    }
}
```
Scripted Pipeline
For greater flexibility, use a scripted pipeline:
```groovy
node {
    def performanceTestPassed = false
    stage('Checkout') {
        checkout scm
    }
    stage('Build & Test') {
        // Your build and test steps
    }
    stage('Performance Test') {
        docker.image('node:16-alpine').inside {
            withCredentials([
                string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
                string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
            ]) {
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                try {
                    sh '''
                        loadfocus-api jmeter run-test \
                            --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                            --thresholds "avgresponse<=200,errors==0,p95<=250" \
                            --format json > performance_results.json
                    '''
                    // Check whether the test passed by examining the JSON
                    def testResults = readJSON file: 'performance_results.json'
                    if (testResults.overallResult == 'PASSED') {
                        performanceTestPassed = true
                        echo "Performance test passed!"
                    } else {
                        echo "Performance test failed to meet thresholds!"
                        // Optional: fail the build
                        // error "Performance test failed"
                    }
                } catch (Exception e) {
                    echo "Error running performance test: ${e.message}"
                }
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
    }
    stage('Deploy') {
        if (performanceTestPassed) {
            echo 'Deploying...'
            // Your deployment steps
        } else {
            echo 'Skipping deployment due to performance test failure'
        }
    }
}
```
Shared Library
Create a shared library for reusable performance tests:
```groovy
// vars/performanceTest.groovy
def call(Map config = [:]) {
    def testName = config.testName ?: "Jenkins_${env.JOB_NAME}_${env.BUILD_NUMBER}"
    def thresholds = config.thresholds ?: "avgresponse<=200,errors==0,p95<=250"
    def waitTimeout = config.waitTimeout ?: 1800
    def resultsFile = config.resultsFile ?: "performance_results.json"
    docker.image('node:16-alpine').inside {
        withCredentials([
            string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
            string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
        ]) {
            sh 'npm install -g @loadfocus/loadfocus-api-client'
            sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
            sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
            sh """
                loadfocus-api jmeter run-test \\
                    --name "${testName}" \\
                    --thresholds "${thresholds}" \\
                    --waitTimeout ${waitTimeout} \\
                    --format json > ${resultsFile}
            """
            archiveArtifacts artifacts: resultsFile, fingerprint: true
            // Return the test results
            def testResults = readJSON file: resultsFile
            return testResults
        }
    }
}
```
Then, in your Jenkinsfile:
```groovy
@Library('my-shared-library') _

pipeline {
    agent any
    stages {
        stage('Performance Test') {
            steps {
                script {
                    def results = performanceTest(
                        testName: "API_Performance_Test",
                        thresholds: "avgresponse<=150,errors==0"
                    )
                    if (results.overallResult != 'PASSED') {
                        error "Performance test failed"
                    }
                }
            }
        }
    }
}
```
Integration with Jenkins Plugins
Performance Plugin
Use the Jenkins Performance Plugin to visualize test results:
- Install the Performance Plugin in Jenkins
- Convert the LoadFocus results into a format the plugin supports (JMeter CSV or JUnit XML)
- Configure your pipeline:
```groovy
stage('Performance Test') {
    steps {
        // Run the LoadFocus test
        sh '''
            loadfocus-api jmeter run-test \
                --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                --thresholds "avgresponse<=200,errors==0,p95<=250" \
                --format json > performance_results.json

            # Convert to JMeter CSV format (using a custom script)
            node convert-to-jmeter.js performance_results.json performance_results.csv
        '''
        // Use the Performance Plugin
        perfReport sourceDataFiles: 'performance_results.csv',
                   errorFailedThreshold: 0,
                   errorUnstableThreshold: 0,
                   errorUnstableResponseTimeThreshold: '200'
    }
}
```
Email Notifications
Send email notifications with the test results:
```groovy
post {
    always {
        script {
            if (fileExists('performance_results.json')) {
                def results = readJSON file: 'performance_results.json'
                def resultStatus = results.overallResult == 'PASSED' ? 'SUCCESS' : 'FAILURE'
                def subject = "Performance Test ${resultStatus}: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
                // Create the email body
                def body = """
                    <h2>Performance Test Results</h2>
                    <p><strong>Overall Result:</strong> ${results.overallResult}</p>
                    <h3>Results by Label</h3>
                    <table border="1">
                        <tr>
                            <th>Label</th><th>Result</th><th>Avg Response</th><th>Errors</th>
                        </tr>
                """
                results.labels.each { label ->
                    body += """
                        <tr>
                            <td>${label.label}</td>
                            <td>${label.result}</td>
                            <td>${label.metrics.avgresponse}ms</td>
                            <td>${label.metrics.errors}</td>
                        </tr>
                    """
                }
                body += "</table>"
                emailext(
                    subject: subject,
                    body: body,
                    to: 'team@example.com',
                    attachmentsPattern: 'performance_results.json',
                    mimeType: 'text/html'
                )
            }
        }
    }
}
```
Tips for Jenkins Integration
Timeout Handling: Set timeouts for long-running performance tests:
```groovy
stage('Performance Test') {
    options {
        timeout(time: 60, unit: 'MINUTES')
    }
    steps {
        // Performance test steps
    }
}
```
Conditional Execution: Run performance tests only on specific branches:
```groovy
stage('Performance Test') {
    when {
        anyOf {
            branch 'main'
            branch 'develop'
            tag pattern: "v\\d+\\.\\d+\\.\\d+", comparator: "REGEXP"
        }
    }
    steps {
        // Performance test steps
    }
}
```
Scheduled Tests: Run performance tests on a schedule:
```groovy
pipeline {
    agent any
    triggers {
        cron('0 0 * * *') // Run at midnight every day
    }
    stages {
        // Pipeline stages
    }
}
```
Parameterized Tests: Allow customization of test parameters:
```groovy
pipeline {
    agent any
    parameters {
        string(name: 'TEST_NAME', defaultValue: 'API_Performance_Test', description: 'Name of the LoadFocus test to run')
        string(name: 'THRESHOLDS', defaultValue: 'avgresponse<=200,errors==0', description: 'Performance thresholds')
        string(name: 'WAIT_TIMEOUT', defaultValue: '1800', description: 'Maximum wait time in seconds')
    }
    stages {
        stage('Performance Test') {
            steps {
                // Run the test with the supplied parameters
                sh """
                    loadfocus-api jmeter run-test \\
                        --name "${params.TEST_NAME}" \\
                        --thresholds "${params.THRESHOLDS}" \\
                        --waitTimeout ${params.WAIT_TIMEOUT} \\
                        --format json > performance_results.json
                """
            }
        }
    }
}
```
For more information, see the Jenkins documentation and the LoadFocus API Client documentation.