GitLab CI/CD
This guide explains how to integrate the LoadFocus JMeter API client with GitLab CI/CD for automated performance testing.
Configuration Steps
1. Store Credentials as GitLab CI/CD Variables
First, store your LoadFocus API credentials as GitLab CI/CD variables:
- Navigate to your GitLab project
- Go to Settings > CI/CD > Variables
- Add the following variables:
  - LOADFOCUS_API_KEY: your LoadFocus API key (mark it as "Masked")
  - LOADFOCUS_TEAM_ID: your LoadFocus team ID
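If you prefer to script this step, the same variables can also be created through GitLab's REST API. This is a sketch with placeholder values; it assumes a personal access token with the api scope and your numeric project ID:

```bash
# Create the two CI/CD variables via the GitLab API.
# <your_access_token>, <project_id>, <your_api_key>, and <your_team_id>
# are placeholders to fill in.
curl --request POST \
  --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.com/api/v4/projects/<project_id>/variables" \
  --form "key=LOADFOCUS_API_KEY" \
  --form "value=<your_api_key>" \
  --form "masked=true"

curl --request POST \
  --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.com/api/v4/projects/<project_id>/variables" \
  --form "key=LOADFOCUS_TEAM_ID" \
  --form "value=<your_team_id>"
```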
2. Create a GitLab CI/CD Pipeline
Create or update the .gitlab-ci.yml file in your repository:
```yaml
stages:
  - build
  - test
  - performance
  - deploy

variables:
  NODE_VERSION: "16"

build:
  stage: build
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week

test:
  stage: test
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm test

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Install the LoadFocus JMeter API client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure the LoadFocus API client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    # Run the performance tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always
  # Optional: run only on specific branches
  only:
    - main
    - develop

deploy:
  stage: deploy
  script:
    - echo "Deploying application..."
  only:
    - main
  # Deploy only if all previous stages succeeded
  when: on_success
```
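Before committing, you can optionally sanity-check the file with GitLab's CI Lint API. This is a sketch; <project_id> and the token are placeholders, and it assumes jq is installed on the machine you run it from:

```bash
# Ask GitLab to validate .gitlab-ci.yml without running a pipeline.
# jq -Rs turns the raw file into a JSON-encoded string for the request body.
curl --request POST \
  --header "PRIVATE-TOKEN: <your_access_token>" \
  --header "Content-Type: application/json" \
  --data "{\"content\": $(jq -Rs . < .gitlab-ci.yml)}" \
  "https://gitlab.com/api/v4/projects/<project_id>/ci/lint"
```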
3. Review the Test Results
After the pipeline runs:
- Navigate to your GitLab project
- Go to CI/CD > Pipelines
- Find your pipeline and click on it
- Open the "performance_test" job
- Click "Browse" in the right sidebar to view the artifacts
- Download and review the performance_results.json file
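For a quick look at the numbers the thresholds are evaluated against, you can inspect the downloaded artifact locally. This is a sketch; the exact JSON schema is an assumption inferred from the jq paths used in the GitLab metrics example later in this guide:

```bash
# Print the metrics block for the first label.
# The .labels[0].metrics path mirrors the jq expressions used in the
# "GitLab Metrics" example below and is an assumption about the schema.
jq '.labels[0].metrics' performance_results.json
```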
Advanced Configuration
Environment-Specific Tests
Run different performance tests for different environments:
```yaml
.performance_test_template: &performance_test_definition
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always

performance_test_develop:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop

performance_test_staging:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Staging"
    THRESHOLDS: "avgresponse<=250,errors==0,p95<=350"
  only:
    - staging

performance_test_production:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Production"
    THRESHOLDS: "avgresponse<=200,errors==0,p95<=250"
  only:
    - main
```
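The same template can also be expressed with GitLab's extends: keyword instead of YAML anchors; unlike anchors, extends: also works across files pulled in with include:. A sketch of the develop job only:

```yaml
.performance_test_template:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json

performance_test_develop:
  # Inherits everything from the hidden template job above
  extends: .performance_test_template
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop
```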
Parallel Tests
Run multiple performance tests in parallel:
```yaml
performance_test_api:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "API_Performance_Test" \
        --thresholds "avgresponse<=150,errors==0" \
        --format json > api_performance_results.json
  artifacts:
    paths:
      - api_performance_results.json
    expire_in: 1 week

performance_test_ui:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "UI_Performance_Test" \
        --thresholds "avgresponse<=300,errors==0" \
        --format json > ui_performance_results.json
  artifacts:
    paths:
      - ui_performance_results.json
    expire_in: 1 week
```
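Since the two jobs differ only in their variables, GitLab's parallel:matrix can generate both from a single definition. A sketch; note the result file name is derived from TEST_NAME so the parallel jobs do not collide:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  # One job is spawned per entry in the matrix, running in parallel
  parallel:
    matrix:
      - TEST_NAME: "API_Performance_Test"
        THRESHOLDS: "avgresponse<=150,errors==0"
      - TEST_NAME: "UI_Performance_Test"
        THRESHOLDS: "avgresponse<=300,errors==0"
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > "${TEST_NAME}_results.json"
  artifacts:
    paths:
      - "*_results.json"
    expire_in: 1 week
```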
Then, create a pipeline schedule in GitLab:
- Navigate to your GitLab project
- Go to CI/CD > Schedules
- Click "New schedule"
- Set up a schedule (for example, daily at midnight); to restrict a job to these scheduled runs, see the sketch after this list
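A sketch of restricting a job to scheduled pipelines, using GitLab's predefined CI_PIPELINE_SOURCE variable:

```yaml
performance_test_nightly:
  stage: performance
  image: node:${NODE_VERSION}
  rules:
    # Run this job only when the pipeline was started by a schedule
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
  script:
    # Performance test script...
    - echo "Running scheduled performance tests"
```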
Creating Performance Reports
Generate HTML reports from the JSON results:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Generate the HTML report
    - npm install -g performance-report-generator  # Replace with your actual report generator
    - performance-report-generator --input performance_results.json --output performance_report.html
  artifacts:
    paths:
      - performance_results.json
      - performance_report.html
    expire_in: 1 week
    when: always

# Optional: publish the report with GitLab Pages
pages:
  stage: deploy
  dependencies:
    - performance_test
  script:
    - mkdir -p public/performance-reports
    - cp performance_report.html public/performance-reports/index.html
  artifacts:
    paths:
      - public
  only:
    - main
```
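Until you have settled on a real report generator, a minimal stopgap is to wrap the raw JSON in an HTML page with standard shell tools. A sketch; it assumes jq is available in the job image, as the metrics example later in this guide also does:

```bash
# Produce a bare-bones HTML "report" from the JSON results.
{
  echo "<html><body><h1>Performance results</h1><pre>"
  jq . performance_results.json
  echo "</pre></body></html>"
} > performance_report.html
```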
Integrating with GitLab Features
Merge Request Widgets
Display performance test results in merge requests by using GitLab's JUnit report feature:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Convert to JUnit format (using a custom script)
    - node convert-to-junit.js performance_results.json junit-report.xml
  artifacts:
    reports:
      junit: junit-report.xml
    paths:
      - performance_results.json
    expire_in: 1 week
```
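This assumes a custom convert-to-junit.js in your repository, which is not shown above. A minimal sketch might look like the following; the labels[0].metrics.* field names mirror the jq paths used in the metrics example below and are assumptions about the LoadFocus JSON schema:

```javascript
// convert-to-junit.js -- minimal sketch of a JSON-to-JUnit converter.
// Usage: node convert-to-junit.js <input.json> <output.xml>
const fs = require('fs');

const [input, output] = process.argv.slice(2);
const results = JSON.parse(fs.readFileSync(input, 'utf8'));

// Assumed schema: metrics live under labels[0].metrics (see the jq
// expressions in the "GitLab Metrics" example); adjust to the real payload.
const metrics = (results.labels && results.labels[0] && results.labels[0].metrics) || {};
const failed = Number(metrics.errors || 0) > 0;

// JUnit's "time" attribute is in seconds; avgresponse is assumed to be ms.
const xml = `<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="performance" tests="1" failures="${failed ? 1 : 0}">
  <testcase name="thresholds" time="${Number(metrics.avgresponse || 0) / 1000}">
    ${failed ? '<failure message="error count above threshold"/>' : ''}
  </testcase>
</testsuite>
`;

fs.writeFileSync(output, xml);
```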
GitLab Metrics
Use GitLab's metrics report feature to track performance over time:
```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Extract the metrics and report them to GitLab
    - |
      # Parse the JSON and extract the metrics
      RESPONSE_TIME=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
      ERROR_RATE=$(jq '.labels[0].metrics.errors' performance_results.json)
      # Report the metrics
      echo "performance_avg_response_time ${RESPONSE_TIME}" >> metrics.txt
      echo "performance_error_rate ${ERROR_RATE}" >> metrics.txt
  artifacts:
    reports:
      metrics: metrics.txt
    paths:
      - performance_results.json
    expire_in: 1 week
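If a field can be absent from the JSON, jq prints the literal null, which is not a valid metric value. A defensive variant falls back to 0 (a sketch):

```bash
# The // operator supplies a default when the path is missing or null;
# -r emits raw output without surrounding quotes.
RESPONSE_TIME=$(jq -r '.labels[0].metrics.avgresponse // 0' performance_results.json)
ERROR_RATE=$(jq -r '.labels[0].metrics.errors // 0' performance_results.json)
```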
GitLab Environments
Associate performance tests with specific environments:
```yaml
performance_test_staging:
  stage: performance
  image: node:${NODE_VERSION}
  environment:
    name: staging
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "Staging_Performance_Test" \
        --thresholds "avgresponse<=250,errors==0" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
  only:
    - staging
```
Tips for GitLab CI/CD Integration
Caching: cache npm dependencies to speed up pipeline runs:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - node_modules/
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    # Rest of the script...
```
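Note that npm install -g writes to npm's global prefix, not to node_modules/, so the cache above does not actually cover the LoadFocus client. Installing it as a local dependency and invoking it through npx keeps it cacheable; a sketch, assuming the package exposes the same loadfocus-api binary:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - node_modules/
  script:
    # A local install lands in node_modules/, which the cache covers
    - npm install @loadfocus/loadfocus-api-client
    # npx runs the locally installed binary
    - npx loadfocus-api config set apikey $LOADFOCUS_API_KEY
    # Rest of the script...
```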
Timeout settings: set timeouts for long-running performance tests:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  timeout: 2h  # Set a 2-hour timeout
  script:
    # Performance test script...
```
Manual triggers: allow performance tests to be triggered manually:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Performance test script...
  when: manual
```
Dynamic test configuration: use GitLab's predefined variables to configure tests dynamically:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - |
      # Set the thresholds based on the branch
      if [ "$CI_COMMIT_REF_NAME" = "main" ]; then
        THRESHOLDS="avgresponse<=200,errors==0,p95<=250"
      else
        THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
      fi
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "$THRESHOLDS" \
        --format json > performance_results.json
```
Notifications: send a notification when the performance tests fail:

```yaml
performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Performance test script...
  after_script:
    - |
      # after_script runs in a separate shell, so $? does not reflect the
      # job result; use the predefined CI_JOB_STATUS variable instead.
      if [ "$CI_JOB_STATUS" = "failed" ]; then
        # Send a notification via curl, the GitLab API, or another method
        curl -X POST -H "Content-Type: application/json" \
          -d "{\"text\":\"Performance test failed for ${CI_PROJECT_NAME}\"}" \
          $WEBHOOK_URL
      fi
```
For more information, see the GitLab CI/CD documentation and the LoadFocus API client documentation.