Jenkins

This guide explains how to integrate the LoadFocus JMeter API client with Jenkins for automated performance testing.

Setup Steps

1. Store Credentials in Jenkins

First, store your LoadFocus API credentials securely in Jenkins:

  1. Go to Jenkins Dashboard > Manage Jenkins > Manage Credentials
  2. Select the appropriate credentials domain (for example, global)
  3. Click "Add Credentials"
  4. Add the following credentials:
    • Kind: Secret text
    • Scope: Global
    • Secret: Your LoadFocus API key
    • ID: loadfocus-api-key
    • Description: LoadFocus API Key
  5. Repeat the process for your team ID with ID: loadfocus-team-id

2. Create a Jenkins Pipeline

Create a Jenkinsfile in your repository:

pipeline {
    agent {
        docker {
            image 'node:16-alpine'
        }
    }
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        stage('Build') {
            steps {
                // Your build steps
                sh 'npm install'
                sh 'npm run build'
            }
        }
        stage('Test') {
            steps {
                // Your test steps
                sh 'npm test'
            }
        }
        stage('Performance Test') {
            steps {
                // Install the LoadFocus JMeter API client
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                // Configure the LoadFocus API client
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                // Run the performance tests
                sh '''
                    loadfocus-api jmeter run-test \
                        --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                        --thresholds "avgresponse<=200,errors==0,p95<=250" \
                        --format json > performance_results.json
                '''
                // Archive the results
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
        stage('Deploy') {
            when {
                expression {
                    return currentBuild.resultIsBetterOrEqualTo('SUCCESS')
                }
            }
            steps {
                // Your deployment steps
                echo 'Deploying...'
            }
        }
    }
    post {
        always {
            // Clean up the workspace
            cleanWs()
        }
    }
}
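
The thresholds string ("avgresponse<=200,errors==0,p95<=250") is a comma-separated list of metric comparisons. LoadFocus evaluates these server-side; purely as an illustration of the string's semantics (not LoadFocus's actual implementation), a hypothetical local evaluator could look like this:

```javascript
// evaluate_thresholds.js -- hypothetical sketch; LoadFocus evaluates thresholds
// server-side, this only illustrates what the comma-separated string encodes.
function evaluateThresholds(thresholds, metrics) {
  // "avgresponse<=200,errors==0,p95<=250" -> one check per comma-separated term
  return thresholds.split(',').map((term) => {
    const m = term.match(/^(\w+)(<=|>=|==|<|>)(\d+(?:\.\d+)?)$/);
    if (!m) throw new Error(`Unparseable threshold: ${term}`);
    const [, metric, op, rawLimit] = m;
    const limit = Number(rawLimit);
    const value = metrics[metric];
    const checks = {
      '<=': value <= limit,
      '>=': value >= limit,
      '==': value === limit,
      '<': value < limit,
      '>': value > limit,
    };
    return { metric, passed: checks[op] };
  });
}

// Example: 180 ms average response, no errors, p95 of 240 ms passes all three checks.
const results = evaluateThresholds('avgresponse<=200,errors==0,p95<=250', {
  avgresponse: 180,
  errors: 0,
  p95: 240,
});
console.log(results.every((r) => r.passed)); // true
```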

3. Configure the Jenkins Job

  1. Create a new Pipeline job in Jenkins
  2. Configure the Pipeline to use your Jenkinsfile
  3. Set up the appropriate SCM configuration to fetch your repository

Advanced Configuration

Declarative Pipeline with Parallel Tests

Run multiple performance tests in parallel:

pipeline {
    agent any
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        // Previous stages...
        stage('Performance Tests') {
            parallel {
                stage('API Performance') {
                    agent {
                        docker {
                            image 'node:16-alpine'
                        }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "API_Performance_Test" \
                                --thresholds "avgresponse<=150,errors==0" \
                                --format json > api_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'api_performance_results.json', fingerprint: true
                    }
                }
                stage('UI Performance') {
                    agent {
                        docker {
                            image 'node:16-alpine'
                        }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "UI_Performance_Test" \
                                --thresholds "avgresponse<=300,errors==0" \
                                --format json > ui_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'ui_performance_results.json', fingerprint: true
                    }
                }
            }
        }
        // Subsequent stages...
    }
}

Scripted Pipeline

For more flexibility, use a scripted pipeline:

node {
    def performanceTestPassed = false
    stage('Checkout') {
        checkout scm
    }
    stage('Build & Test') {
        // Your build and test steps
    }
    stage('Performance Test') {
        docker.image('node:16-alpine').inside {
            withCredentials([
                string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
                string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
            ]) {
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                try {
                    sh '''
                        loadfocus-api jmeter run-test \
                            --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                            --thresholds "avgresponse<=200,errors==0,p95<=250" \
                            --format json > performance_results.json
                    '''
                    // Check whether the test passed by inspecting the JSON
                    // (readJSON requires the Pipeline Utility Steps plugin)
                    def testResults = readJSON file: 'performance_results.json'
                    if (testResults.overallResult == 'PASSED') {
                        performanceTestPassed = true
                        echo "Performance test passed!"
                    } else {
                        echo "Performance test failed to meet thresholds!"
                        // Optional: fail the build
                        // error "Performance test failed"
                    }
                } catch (Exception e) {
                    echo "Error running performance test: ${e.message}"
                }
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
    }
    stage('Deploy') {
        if (performanceTestPassed) {
            echo 'Deploying...'
            // Your deployment steps
        } else {
            echo 'Skipping deployment due to performance test failure'
        }
    }
}

Shared Library

Create a shared library for reusable performance tests:

// vars/performanceTest.groovy
def call(Map config = [:]) {
    def testName = config.testName ?: "Jenkins_${env.JOB_NAME}_${env.BUILD_NUMBER}"
    def thresholds = config.thresholds ?: "avgresponse<=200,errors==0,p95<=250"
    def waitTimeout = config.waitTimeout ?: 1800
    def resultsFile = config.resultsFile ?: "performance_results.json"
    docker.image('node:16-alpine').inside {
        withCredentials([
            string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
            string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
        ]) {
            sh 'npm install -g @loadfocus/loadfocus-api-client'
            sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
            sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
            sh """
                loadfocus-api jmeter run-test \\
                    --name "${testName}" \\
                    --thresholds "${thresholds}" \\
                    --waitTimeout ${waitTimeout} \\
                    --format json > ${resultsFile}
            """
            archiveArtifacts artifacts: resultsFile, fingerprint: true
            // Return the test results
            def testResults = readJSON file: resultsFile
            return testResults
        }
    }
}

Then, in your Jenkinsfile:

@Library('my-shared-library') _
pipeline {
    agent any
    stages {
        stage('Performance Test') {
            steps {
                script {
                    def results = performanceTest(
                        testName: "API_Performance_Test",
                        thresholds: "avgresponse<=150,errors==0"
                    )
                    if (results.overallResult != 'PASSED') {
                        error "Performance test failed"
                    }
                }
            }
        }
    }
}

Integration with Jenkins Plugins

Performance Plugin

Use the Jenkins Performance plugin to visualize test results:

  1. Install the Performance plugin in Jenkins
  2. Convert the LoadFocus results into a format the plugin supports (JMeter CSV or JUnit XML)
  3. Configure your pipeline:
stage('Performance Test') {
    steps {
        // Run the LoadFocus test
        sh '''
            loadfocus-api jmeter run-test \
                --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                --thresholds "avgresponse<=200,errors==0,p95<=250" \
                --format json > performance_results.json
            # Convert to JMeter CSV format (using a custom script)
            node convert-to-jmeter.js performance_results.json performance_results.csv
        '''
        // Use the Performance plugin
        perfReport sourceDataFiles: 'performance_results.csv',
            errorFailedThreshold: 0,
            errorUnstableThreshold: 0,
            errorUnstableResponseTimeThreshold: '200'
    }
}

Email Notifications

Send email notifications with the test results:

post {
    always {
        script {
            if (fileExists('performance_results.json')) {
                def results = readJSON file: 'performance_results.json'
                def resultStatus = results.overallResult == 'PASSED' ? 'SUCCESS' : 'FAILURE'
                def subject = "Performance Test ${resultStatus}: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
                // Build the email body
                def body = """
                    <h2>Performance Test Results</h2>
                    <p><strong>Overall result:</strong> ${results.overallResult}</p>
                    <h3>Results by label</h3>
                    <table border="1">
                    <tr><th>Label</th><th>Result</th><th>Average response time</th><th>Errors</th></tr>
                """
                results.labels.each { label ->
                    body += """
                        <tr>
                        <td>${label.label}</td>
                        <td>${label.result}</td>
                        <td>${label.metrics.avgresponse}ms</td>
                        <td>${label.metrics.errors}</td>
                        </tr>
                    """
                }
                body += "</table>"
                emailext (
                    subject: subject,
                    body: body,
                    to: 'team@example.com',
                    attachmentsPattern: 'performance_results.json',
                    mimeType: 'text/html'
                )
            }
        }
    }
}

Tips for Jenkins Integration

  1. Timeout handling: Set timeouts for long-running performance tests:

    stage('Performance Test') {
        options {
            timeout(time: 60, unit: 'MINUTES')
        }
        steps {
            // Performance test steps
        }
    }
  2. Conditional execution: Run performance tests only on specific branches:

    stage('Performance Test') {
        when {
            anyOf {
                branch 'main'
                branch 'develop'
                tag pattern: "v\\d+\\.\\d+\\.\\d+", comparator: "REGEXP"
            }
        }
        steps {
            // Performance test steps
        }
    }
  3. Scheduled tests: Run performance tests on a schedule:

    pipeline {
        agent any
        triggers {
            cron('0 0 * * *') // Run at midnight every day
        }
        stages {
            // Pipeline stages
        }
    }
  4. Parameterized tests: Allow test parameters to be customized:

    pipeline {
        agent any
        parameters {
            string(name: 'TEST_NAME', defaultValue: 'API_Performance_Test', description: 'Name of the LoadFocus test to run')
            string(name: 'THRESHOLDS', defaultValue: 'avgresponse<=200,errors==0', description: 'Performance thresholds')
            string(name: 'WAIT_TIMEOUT', defaultValue: '1800', description: 'Maximum wait time in seconds')
        }
        stages {
            stage('Performance Test') {
                steps {
                    // Run the test with the supplied parameters
                    sh """
                        loadfocus-api jmeter run-test \\
                            --name "${params.TEST_NAME}" \\
                            --thresholds "${params.THRESHOLDS}" \\
                            --waitTimeout ${params.WAIT_TIMEOUT} \\
                            --format json > performance_results.json
                    """
                }
            }
        }
    }

For more information, see the Jenkins documentation and the LoadFocus API client documentation.