Jenkins Integration

This guide explains how to integrate the LoadFocus JMeter API Client with Jenkins for automated performance testing.

Setup Steps

1. Store Credentials in Jenkins

First, store your LoadFocus API credentials securely in Jenkins:

  1. Navigate to Jenkins Dashboard > Manage Jenkins > Manage Credentials
  2. Select the appropriate credentials domain (e.g. global)
  3. Click "Add Credentials"
  4. Add the following credentials:
    • Kind: Secret text
    • Scope: Global
    • Secret: Your LoadFocus API key
    • ID: loadfocus-api-key
    • Description: LoadFocus API Key
  5. Repeat for your team ID with ID: loadfocus-team-id

2. Create a Jenkins Pipeline

Create a Jenkinsfile in your repository:

pipeline {
    agent {
        docker {
            image 'node:16-alpine'
        }
    }
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        stage('Build') {
            steps {
                // Your build steps
                sh 'npm install'
                sh 'npm run build'
            }
        }
        stage('Test') {
            steps {
                // Your test steps
                sh 'npm test'
            }
        }
        stage('Performance Test') {
            steps {
                // Install LoadFocus JMeter API Client
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                // Configure LoadFocus API Client
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                // Run Performance Tests
                sh '''
                    loadfocus-api jmeter run-test \
                        --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                        --thresholds "avgresponse<=200,errors==0,p95<=250" \
                        --format json > performance_results.json
                '''
                // Archive the results
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
        stage('Deploy') {
            when {
                expression {
                    return currentBuild.resultIsBetterOrEqualTo('SUCCESS')
                }
            }
            steps {
                // Your deployment steps
                echo 'Deploying...'
            }
        }
    }
    post {
        always {
            // Clean up workspace
            cleanWs()
        }
    }
}

3. Configure the Jenkins Job

  1. Create a new Pipeline job in Jenkins
  2. Configure the Pipeline to use your Jenkinsfile
  3. Set up the appropriate SCM configuration to fetch your repository

Advanced Configuration

Declarative Pipeline with Parallel Tests

Run multiple performance tests in parallel:

pipeline {
    agent any
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        // Previous stages...
        stage('Performance Tests') {
            parallel {
                stage('API Performance') {
                    agent {
                        docker {
                            image 'node:16-alpine'
                        }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "API_Performance_Test" \
                                --thresholds "avgresponse<=150,errors==0" \
                                --format json > api_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'api_performance_results.json', fingerprint: true
                    }
                }
                stage('UI Performance') {
                    agent {
                        docker {
                            image 'node:16-alpine'
                        }
                    }
                    steps {
                        sh 'npm install -g @loadfocus/loadfocus-api-client'
                        sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                        sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                        sh '''
                            loadfocus-api jmeter run-test \
                                --name "UI_Performance_Test" \
                                --thresholds "avgresponse<=300,errors==0" \
                                --format json > ui_performance_results.json
                        '''
                        archiveArtifacts artifacts: 'ui_performance_results.json', fingerprint: true
                    }
                }
            }
        }
        // Next stages...
    }
}

Scripted Pipeline

For greater flexibility, use a scripted pipeline:

node {
    def performanceTestPassed = false
    stage('Checkout') {
        checkout scm
    }
    stage('Build & Test') {
        // Your build and test steps
    }
    stage('Performance Test') {
        docker.image('node:16-alpine').inside {
            withCredentials([
                string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
                string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
            ]) {
                sh 'npm install -g @loadfocus/loadfocus-api-client'
                sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
                try {
                    sh '''
                        loadfocus-api jmeter run-test \
                            --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                            --thresholds "avgresponse<=200,errors==0,p95<=250" \
                            --format json > performance_results.json
                    '''
                    // Check if test passed by examining the JSON
                    def testResults = readJSON file: 'performance_results.json'
                    if (testResults.overallResult == 'PASSED') {
                        performanceTestPassed = true
                        echo "Performance test passed!"
                    } else {
                        echo "Performance test failed to meet thresholds!"
                        // Optional: Fail the build
                        // error "Performance test failed"
                    }
                } catch (Exception e) {
                    echo "Error running performance test: ${e.message}"
                }
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
    }
    stage('Deploy') {
        if (performanceTestPassed) {
            echo 'Deploying...'
            // Your deployment steps
        } else {
            echo 'Skipping deployment due to performance test failure'
        }
    }
}

Shared Library

Create a shared library for reusable performance tests:

// vars/performanceTest.groovy
def call(Map config = [:]) {
    def testName = config.testName ?: "Jenkins_${env.JOB_NAME}_${env.BUILD_NUMBER}"
    def thresholds = config.thresholds ?: "avgresponse<=200,errors==0,p95<=250"
    def waitTimeout = config.waitTimeout ?: 1800
    def resultsFile = config.resultsFile ?: "performance_results.json"
    docker.image('node:16-alpine').inside {
        withCredentials([
            string(credentialsId: 'loadfocus-api-key', variable: 'LOADFOCUS_API_KEY'),
            string(credentialsId: 'loadfocus-team-id', variable: 'LOADFOCUS_TEAM_ID')
        ]) {
            sh 'npm install -g @loadfocus/loadfocus-api-client'
            sh 'loadfocus-api config set apikey $LOADFOCUS_API_KEY'
            sh 'loadfocus-api config set teamid $LOADFOCUS_TEAM_ID'
            sh """
                loadfocus-api jmeter run-test \\
                    --name "${testName}" \\
                    --thresholds "${thresholds}" \\
                    --waitTimeout ${waitTimeout} \\
                    --format json > ${resultsFile}
            """
            archiveArtifacts artifacts: resultsFile, fingerprint: true
            // Return the test results
            def testResults = readJSON file: resultsFile
            return testResults
        }
    }
}

Then, in your Jenkinsfile:

@Library('my-shared-library') _
pipeline {
    agent any
    stages {
        stage('Performance Test') {
            steps {
                script {
                    def results = performanceTest(
                        testName: "API_Performance_Test",
                        thresholds: "avgresponse<=150,errors==0"
                    )
                    if (results.overallResult != 'PASSED') {
                        error "Performance test failed"
                    }
                }
            }
        }
    }
}

Integration with Jenkins Plugins

Performance Plugin

Use the Jenkins Performance Plugin to visualize test results:

  1. Install the Performance Plugin in Jenkins
  2. Convert the LoadFocus results to a format the plugin supports (JMeter CSV or JUnit XML)
  3. Configure your pipeline:
stage('Performance Test') {
    steps {
        // Run LoadFocus test
        sh '''
            loadfocus-api jmeter run-test \
                --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                --thresholds "avgresponse<=200,errors==0,p95<=250" \
                --format json > performance_results.json
            # Convert to JMeter CSV format (using a custom script)
            node convert-to-jmeter.js performance_results.json performance_results.csv
        '''
        // Use Performance Plugin
        perfReport sourceDataFiles: 'performance_results.csv',
                   errorFailedThreshold: 0,
                   errorUnstableThreshold: 0,
                   errorUnstableResponseTimeThreshold: '200'
    }
}

Email Notifications

Send email notifications with the test results:

post {
    always {
        script {
            if (fileExists('performance_results.json')) {
                def results = readJSON file: 'performance_results.json'
                def resultStatus = results.overallResult == 'PASSED' ? 'SUCCESS' : 'FAILURE'
                def subject = "Performance Test ${resultStatus}: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
                // Create email body
                def body = """
                    <h2>Performance Test Results</h2>
                    <p><strong>Overall Result:</strong> ${results.overallResult}</p>
                    <h3>Results by Label</h3>
                    <table border="1">
                    <tr><th>Label</th><th>Result</th><th>Avg Response</th><th>Errors</th></tr>
                """
                results.labels.each { label ->
                    body += """
                        <tr>
                        <td>${label.label}</td>
                        <td>${label.result}</td>
                        <td>${label.metrics.avgresponse}ms</td>
                        <td>${label.metrics.errors}</td>
                        </tr>
                    """
                }
                body += "</table>"
                emailext(
                    subject: subject,
                    body: body,
                    to: 'team@example.com',
                    attachmentsPattern: 'performance_results.json',
                    mimeType: 'text/html'
                )
            }
        }
    }
}

Jenkins Integration Tips

  1. Timeout Handling: Set timeouts for long-running performance tests:

    stage('Performance Test') {
        options {
            timeout(time: 60, unit: 'MINUTES')
        }
        steps {
            // Performance test steps
        }
    }
  2. Conditional Execution: Run performance tests only on specific branches:

    stage('Performance Test') {
        when {
            anyOf {
                branch 'main'
                branch 'develop'
                tag pattern: "v\\d+\\.\\d+\\.\\d+", comparator: "REGEXP"
            }
        }
        steps {
            // Performance test steps
        }
    }
  3. Scheduled Tests: Run performance tests on a schedule:

    pipeline {
        agent any
        triggers {
            cron('0 0 * * *') // Run at midnight every day
        }
        stages {
            // Pipeline stages
        }
    }
  4. Parameterized Tests: Allow test parameters to be customized:

    pipeline {
        agent any
        parameters {
            string(name: 'TEST_NAME', defaultValue: 'API_Performance_Test', description: 'Name of the LoadFocus test to run')
            string(name: 'THRESHOLDS', defaultValue: 'avgresponse<=200,errors==0', description: 'Performance thresholds')
            string(name: 'WAIT_TIMEOUT', defaultValue: '1800', description: 'Maximum wait time in seconds')
        }
        stages {
            stage('Performance Test') {
                steps {
                    // Run test with parameters
                    sh """
                        loadfocus-api jmeter run-test \\
                            --name "${params.TEST_NAME}" \\
                            --thresholds "${params.THRESHOLDS}" \\
                            --waitTimeout ${params.WAIT_TIMEOUT} \\
                            --format json > performance_results.json
                    """
                }
            }
        }
    }

For more information, see the Jenkins documentation and the LoadFocus API Client documentation.