Jenkins
This guide explains how to integrate the LoadFocus JMeter API client with Jenkins for automated performance testing.
Setup steps
1. Store the credentials in Jenkins
First, store your LoadFocus API credentials securely in Jenkins:
- Go to Jenkins Dashboard > Manage Jenkins > Manage Credentials
- Select the appropriate credentials domain (for example, global)
- Click "Add Credentials"
- Add the following credential:
  - Kind: Secret text
  - Scope: Global
  - Secret: Your LoadFocus API key
  - ID: loadfocus-api-key
  - Description: LoadFocus API Key
- Repeat the operation for your team ID, with ID: loadfocus-team-id
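If you prefer to create these credentials as code rather than through the UI, the same secret-text entries can be added from the Jenkins Script Console. This is only a sketch: it assumes the Credentials and Plain Credentials plugins are installed, and the placeholder secret values must be replaced with your own.

```groovy
import com.cloudbees.plugins.credentials.CredentialsScope
import com.cloudbees.plugins.credentials.SystemCredentialsProvider
import com.cloudbees.plugins.credentials.domains.Domain
import org.jenkinsci.plugins.plaincredentials.impl.StringCredentialsImpl
import hudson.util.Secret

// Create the two secret-text credentials used throughout this guide.
def store = SystemCredentialsProvider.getInstance().getStore()

store.addCredentials(Domain.global(), new StringCredentialsImpl(
    CredentialsScope.GLOBAL, 'loadfocus-api-key', 'LoadFocus API Key',
    Secret.fromString('REPLACE_WITH_YOUR_API_KEY'))) // placeholder secret

store.addCredentials(Domain.global(), new StringCredentialsImpl(
    CredentialsScope.GLOBAL, 'loadfocus-team-id', 'LoadFocus Team ID',
    Secret.fromString('REPLACE_WITH_YOUR_TEAM_ID'))) // placeholder secret
```

This runs inside a live Jenkins instance (Manage Jenkins > Script Console), so treat it as an administrative fragment rather than a standalone script.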
2. Create a Jenkins pipeline
Create a Jenkinsfile in your repository:
```groovy
pipeline {
    agent {
        docker {
            image 'node:16-alpine'
        }
    }
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        stage('Build') {
            steps {
                // Your build steps
                sh 'npm install'
                sh 'npm run build'
            }
        }
        stage('Test') {
            steps {
                // Your test steps
                sh 'npm test'
            }
        }
        stage('Performance Test') {
            steps {
                // Install the LoadFocus JMeter API client
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                // Configure the LoadFocus API client
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                // Run the performance tests
                sh '''
                    loadfocus-api jmeter run-test \
                        --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                        --thresholds "avgresponse<=200,errors==0,p95<=250" \
                        --format json > performance_results.json
                '''
                // Archive the results
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
        stage('Deploy') {
            when {
                expression {
                    return currentBuild.resultIsBetterOrEqualTo('SUCCESS')
                }
            }
            steps {
                // Your deployment steps
                echo 'Deploying...'
            }
        }
    }
    post {
        always {
            // Clean up the workspace
            cleanWs()
        }
    }
}
```
3. Configure the Jenkins job
- Create a new Pipeline job in Jenkins
- Configure the Pipeline to use your Jenkinsfile
- Set up the appropriate SCM settings to fetch your repository
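If you manage Jenkins jobs as code, the same Pipeline job can be described with the Job DSL plugin instead of being created by hand. A minimal sketch, assuming the Job DSL plugin is installed; the job name and repository URL are placeholders:

```groovy
// Job DSL sketch: a Pipeline job that checks out the repository
// and runs the Jenkinsfile found at its root.
pipelineJob('loadfocus-performance-pipeline') { // placeholder job name
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://github.com/your-org/your-repo.git') // placeholder URL
                    }
                    branch('main')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}
```

Run this from a seed job (Process Job DSLs build step) so the Pipeline job is recreated whenever the DSL changes.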
Advanced configuration
Declarative pipeline with parallel tests
Run several performance tests in parallel:
```groovy
pipeline {
    agent any
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        // Previous stages...
        stage('Performance Tests') {
            parallel {
                stage('API Performance') {
                    agent {
                        docker {
                            image 'node:16-alpine'
                        }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "API_Performance_Test" \
                                --thresholds "avgresponse<=150,errors==0" \
                                --format json > api_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'api_performance_results.json', fingerprint: true
                    }
                }
                stage('UI Performance') {
                    agent {
                        docker {
                            image 'node:16-alpine'
                        }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "UI_Performance_Test" \
                                --thresholds "avgresponse<=300,errors==0" \
                                --format json > ui_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'ui_performance_results.json', fingerprint: true
                    }
                }
            }
        }
        // Subsequent stages...
    }
}
```
Scripted pipeline
For more flexibility, use a scripted pipeline:
```groovy
node {
    def performanceTestPassed = false
    stage('Checkout') {
        checkout scm
    }
    stage('Build & Test') {
        // Your build and test steps
    }
    stage('Performance Test') {
        docker.image('node:16-alpine').inside {
            withCredentials([
                string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
                string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
            ]) {
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                try {
                    sh '''
                        loadfocus-api jmeter run-test \
                            --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                            --thresholds "avgresponse<=200,errors==0,p95<=250" \
                            --format json > performance_results.json
                    '''
                    // Check whether the test passed by inspecting the JSON
                    def testResults = readJSON file: 'performance_results.json'
                    if (testResults.overallResult == 'PASSED') {
                        performanceTestPassed = true
                        echo "Performance test passed!"
                    } else {
                        echo "Performance test failed to meet thresholds!"
                        // Optional: fail the build
                        // error "Performance test failed"
                    }
                } catch (Exception e) {
                    echo "Error running performance test: ${e.message}"
                }
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
    }
    stage('Deploy') {
        if (performanceTestPassed) {
            echo 'Deploying...'
            // Your deployment steps
        } else {
            echo 'Skipping deployment due to performance test failure'
        }
    }
}
```
Shared library
Create a shared library for reusable performance tests:
```groovy
// vars/performanceTest.groovy
def call(Map config = [:]) {
    def testName = config.testName ?: "Jenkins_${env.JOB_NAME}_${env.BUILD_NUMBER}"
    def thresholds = config.thresholds ?: "avgresponse<=200,errors==0,p95<=250"
    def waitTimeout = config.waitTimeout ?: 1800
    def resultsFile = config.resultsFile ?: "performance_results.json"
    docker.image('node:16-alpine').inside {
        withCredentials([
            string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
            string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
        ]) {
            sh 'npm install -g @loadfocus/loadfocus-api-client'
            sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
            sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
            sh """
                loadfocus-api jmeter run-test \\
                    --name "${testName}" \\
                    --thresholds "${thresholds}" \\
                    --waitTimeout ${waitTimeout} \\
                    --format json > ${resultsFile}
            """
            archiveArtifacts artifacts: resultsFile, fingerprint: true
            // Return the test results
            def testResults = readJSON file: resultsFile
            return testResults
        }
    }
}
```
Then, in your Jenkinsfile:
```groovy
@Library('my-shared-library') _

pipeline {
    agent any
    stages {
        stage('Performance Test') {
            steps {
                script {
                    def results = performanceTest(
                        testName: "API_Performance_Test",
                        thresholds: "avgresponse<=150,errors==0"
                    )
                    if (results.overallResult != 'PASSED') {
                        error "Performance test failed"
                    }
                }
            }
        }
    }
}
```
Integration with Jenkins plugins
Performance plugin
Use the Jenkins Performance plugin to visualize test results:
- Install the Performance plugin in Jenkins
- Convert the LoadFocus results into a format the plugin supports (JMeter CSV or JUnit XML)
- Configure your pipeline:
```groovy
stage('Performance Test') {
    steps {
        // Run the LoadFocus test
        sh '''
            loadfocus-api jmeter run-test \
                --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                --thresholds "avgresponse<=200,errors==0,p95<=250" \
                --format json > performance_results.json

            # Convert to JMeter CSV format (using a custom script)
            node convert-to-jmeter.js performance_results.json performance_results.csv
        '''
        // Use the Performance plugin
        perfReport sourceDataFiles: 'performance_results.csv',
            errorFailedThreshold: 0,
            errorUnstableThreshold: 0,
            errorUnstableResponseTimeThreshold: '200'
    }
}
```
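The `convert-to-jmeter.js` script is left for you to write. As a hedged alternative, the conversion can also be sketched directly in the pipeline with `readJSON` and `writeFile` (Pipeline Utility Steps), assuming the result JSON exposes `labels[].label` and `labels[].metrics` (`avgresponse`, `errors`) as in the other examples in this guide:

```groovy
// Hypothetical sketch: build a minimal JMeter-style CSV from the LoadFocus
// JSON results inside the pipeline itself, instead of a separate Node script.
// The JSON field names are assumptions based on the shapes used elsewhere here.
script {
    def results = readJSON file: 'performance_results.json'
    def lines = ['timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success']
    def now = System.currentTimeMillis()
    results.labels.each { l ->
        def ok = (l.metrics.errors == 0)
        // One synthetic sample row per label, using the average response time
        lines << "${now},${l.metrics.avgresponse},${l.label},${ok ? '200' : '500'},,main,,${ok}"
    }
    writeFile file: 'performance_results.csv', text: lines.join('\n')
}
```

This collapses each label to a single averaged sample, which is enough for the plugin's trend graphs but loses per-request detail; use a real per-sample export if you need percentile charts.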
Email notifications
Send email notifications with the test results:
```groovy
post {
    always {
        script {
            if (fileExists('performance_results.json')) {
                def results = readJSON file: 'performance_results.json'
                def resultStatus = results.overallResult == 'PASSED' ? 'SUCCESS' : 'FAILURE'
                def subject = "Performance Test ${resultStatus}: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
                // Build the email body
                def body = """
                    <h2>Performance test results</h2>
                    <p><strong>Overall result:</strong> ${results.overallResult}</p>
                    <h3>Results per label</h3>
                    <table border="1">
                        <tr>
                            <th>Label</th>
                            <th>Result</th>
                            <th>Average response time</th>
                            <th>Errors</th>
                        </tr>
                """
                results.labels.each { label ->
                    body += """
                        <tr>
                            <td>${label.label}</td>
                            <td>${label.result}</td>
                            <td>${label.metrics.avgresponse}ms</td>
                            <td>${label.metrics.errors}</td>
                        </tr>
                    """
                }
                body += "</table>"
                emailext(
                    subject: subject,
                    body: body,
                    to: 'team@example.com',
                    attachmentsPattern: 'performance_results.json',
                    mimeType: 'text/html'
                )
            }
        }
    }
}
```
Tips for the Jenkins integration
Timeout handling: Set timeouts for long-running performance tests:

```groovy
stage('Performance Test') {
    options {
        timeout(time: 60, unit: 'MINUTES')
    }
    steps {
        // Performance test steps
    }
}
```

Conditional execution: Run performance tests only on specific branches:

```groovy
stage('Performance Test') {
    when {
        anyOf {
            branch 'main'
            branch 'develop'
            tag pattern: "v\\d+\\.\\d+\\.\\d+", comparator: "REGEXP"
        }
    }
    steps {
        // Performance test steps
    }
}
```

Scheduled tests: Run performance tests on a schedule:

```groovy
pipeline {
    agent any
    triggers {
        cron('0 0 * * *') // Run at midnight every day
    }
    stages {
        // Pipeline stages
    }
}
```

Parameterized tests: Allow test settings to be customized:

```groovy
pipeline {
    agent any
    parameters {
        string(name: 'TEST_NAME', defaultValue: 'API_Performance_Test', description: 'Name of the LoadFocus test to run')
        string(name: 'THRESHOLDS', defaultValue: 'avgresponse<=200,errors==0', description: 'Performance thresholds')
        string(name: 'WAIT_TIMEOUT', defaultValue: '1800', description: 'Maximum wait time in seconds')
    }
    stages {
        stage('Performance Test') {
            steps {
                // Run the test with the given parameters
                sh """
                    loadfocus-api jmeter run-test \\
                        --name "${params.TEST_NAME}" \\
                        --thresholds "${params.THRESHOLDS}" \\
                        --waitTimeout ${params.WAIT_TIMEOUT} \\
                        --format json > performance_results.json
                """
            }
        }
    }
}
```
For more information, see the Jenkins documentation and the LoadFocus API client documentation.