GitLab CI/CD

This guide explains how to integrate the LoadFocus JMeter API client with GitLab CI/CD for automated performance testing.

Setup steps

1. Store your credentials as GitLab CI/CD variables

First, store your LoadFocus API credentials as GitLab CI/CD variables:

  1. Navigate to your GitLab project
  2. Go to Settings > CI/CD > Variables
  3. Add the following variables:
    • LOADFOCUS_API_KEY: Your LoadFocus API key (mark it as "Masked")
    • LOADFOCUS_TEAM_ID: Your LoadFocus team ID
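Before wiring these variables into a pipeline, it can help to fail fast when they are missing. The variable names below come from the list above; the guard function itself is a sketch of ours, not part of the LoadFocus client:

```shell
#!/bin/sh
# check_loadfocus_env: verify that the CI/CD variables from the steps
# above are present in the job's environment.
check_loadfocus_env() {
  missing=0
  [ -n "$LOADFOCUS_API_KEY" ] || { echo "LOADFOCUS_API_KEY is not set" >&2; missing=1; }
  [ -n "$LOADFOCUS_TEAM_ID" ] || { echo "LOADFOCUS_TEAM_ID is not set" >&2; missing=1; }
  return $missing
}

# Example: run the check before configuring the client.
if check_loadfocus_env; then
  echo "LoadFocus credentials found"
else
  echo "LoadFocus credentials missing" >&2
fi
```

Dropping this at the top of the `script:` section makes a misconfigured variable fail with a clear message instead of an opaque API error.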

2. Create a GitLab CI/CD pipeline

Create or update the .gitlab-ci.yml file in your repository:

stages:
  - build
  - test
  - performance
  - deploy

variables:
  NODE_VERSION: "16"

build:
  stage: build
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week

test:
  stage: test
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm test

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Install the LoadFocus JMeter API client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure the LoadFocus API client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    # Run the performance tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always
  # Optional: run only on specific branches
  only:
    - main
    - develop

deploy:
  stage: deploy
  script:
    - echo "Deploying application..."
  only:
    - main
  # Deploy only if all previous stages succeeded
  when: on_success

3. View the test results

After the pipeline runs:

  1. Navigate to your GitLab project
  2. Go to CI/CD > Pipelines
  3. Find your pipeline and click on it
  4. Open the "performance_test" job
  5. Click "Browse" in the right-hand sidebar to view the artifacts
  6. Download and review the performance_results.json file
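To inspect the downloaded file from a terminal, the metrics can be pulled out with jq. The JSON shape assumed here (`.labels[0].metrics.avgresponse`, `.errors`) matches the jq queries used in the GitLab metrics section later in this guide; adjust the paths if your results file differs:

```shell
#!/bin/sh
# Create a small sample file in the assumed result shape, then query it
# with jq the same way the metrics job later in this guide does.
cat > performance_results.json <<'EOF'
{"labels":[{"metrics":{"avgresponse":187,"errors":0,"p95":240}}]}
EOF

avg=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
errors=$(jq '.labels[0].metrics.errors' performance_results.json)
echo "avg response: ${avg} ms, errors: ${errors}"
```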

Advanced configuration

Environment-specific tests

Run different performance tests for different environments:

.performance_test_template: &performance_test_definition
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always

performance_test_develop:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop

performance_test_staging:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Staging"
    THRESHOLDS: "avgresponse<=250,errors==0,p95<=350"
  only:
    - staging

performance_test_production:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Production"
    THRESHOLDS: "avgresponse<=200,errors==0,p95<=250"
  only:
    - main
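GitLab also offers the extends: keyword as a built-in alternative to YAML anchors; it behaves like the template above but survives includes across files. A sketch of the same develop job rewritten that way:

```yaml
.performance_test_template:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json

performance_test_develop:
  extends: .performance_test_template
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop
```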

Parallel tests

Run several performance tests in parallel:

performance_test_api:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "API_Performance_Test" \
        --thresholds "avgresponse<=150,errors==0" \
        --format json > api_performance_results.json
  artifacts:
    paths:
      - api_performance_results.json
    expire_in: 1 week

performance_test_ui:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "UI_Performance_Test" \
        --thresholds "avgresponse<=300,errors==0" \
        --format json > ui_performance_results.json
  artifacts:
    paths:
      - ui_performance_results.json
    expire_in: 1 week

Next, create a pipeline schedule in GitLab:

  1. Navigate to your GitLab project
  2. Go to CI/CD > Schedules
  3. Click "New schedule"
  4. Configure a schedule (for example, every day at midnight)
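To make a job run only from the schedule rather than on every push, GitLab's rules: syntax can gate on the pipeline source. A sketch for the API job above:

```yaml
performance_test_api:
  stage: performance
  # ... same image and script as above ...
  rules:
    # Run this job only in pipelines started by a schedule
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
```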

Creating performance reports

Generate HTML reports from the JSON results:

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Generate the HTML report
    - npm install -g performance-report-generator # Replace with your actual report generator
    - performance-report-generator --input performance_results.json --output performance_report.html
  artifacts:
    paths:
      - performance_results.json
      - performance_report.html
    expire_in: 1 week
    when: always

# Optional: publish the report with GitLab Pages
pages:
  stage: deploy
  dependencies:
    - performance_test
  script:
    - mkdir -p public/performance-reports
    - cp performance_report.html public/performance-reports/index.html
  artifacts:
    paths:
      - public
  only:
    - main

Integration with GitLab features

Merge request widgets

Display performance test results in merge requests using GitLab's JUnit report feature:

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Convert to JUnit format (using a custom script)
    - node convert-to-junit.js performance_results.json junit-report.xml
  artifacts:
    reports:
      junit: junit-report.xml
    paths:
      - performance_results.json
    expire_in: 1 week
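convert-to-junit.js is a custom script you supply yourself. As an illustration of the report shape GitLab expects, here is a minimal shell sketch; the JSON path and the 200 ms limit are assumptions carried over from the examples above, and it emits a single JUnit test case for the average-response threshold:

```shell
#!/bin/sh
# Sketch: turn one metric from the results JSON into a JUnit report.
# Assumes the .labels[0].metrics.avgresponse path used elsewhere in this guide.
cat > performance_results.json <<'EOF'
{"labels":[{"metrics":{"avgresponse":187,"errors":0}}]}
EOF

avg=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
limit=200

# Attach a <failure> element only when the threshold is breached.
if [ "$avg" -le "$limit" ]; then
  failure=""
else
  failure="<failure message=\"avgresponse ${avg} > ${limit}\"/>"
fi

cat > junit-report.xml <<EOF
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="performance" tests="1">
  <testcase name="avgresponse &lt;= ${limit}">${failure}</testcase>
</testsuite>
EOF
echo "wrote junit-report.xml"
```

A real converter would emit one test case per threshold in the --thresholds string.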

GitLab metrics

Use GitLab's metrics feature to track performance over time:

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Extract the metrics and report them to GitLab
    - |
      # Parse the JSON and extract the metrics
      RESPONSE_TIME=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
      ERROR_RATE=$(jq '.labels[0].metrics.errors' performance_results.json)
      # Report the metrics
      echo "performance_avg_response_time ${RESPONSE_TIME}" >> metrics.txt
      echo "performance_error_rate ${ERROR_RATE}" >> metrics.txt
  artifacts:
    reports:
      metrics: metrics.txt
    paths:
      - performance_results.json
    expire_in: 1 week

GitLab environments

Associate performance tests with specific environments:

performance_test_staging:
  stage: performance
  image: node:${NODE_VERSION}
  environment:
    name: staging
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "Staging_Performance_Test" \
        --thresholds "avgresponse<=250,errors==0" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
  only:
    - staging

Tips for GitLab CI/CD integration

  1. Caching: Cache npm dependencies to speed up pipeline runs:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      cache:
        key: ${CI_COMMIT_REF_SLUG}
        paths:
          - node_modules/
      script:
        - npm install -g @loadfocus/loadfocus-api-client
        # Rest of the script...
  2. Timeout settings: Set timeouts for long-running performance tests:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      timeout: 2h # Set a 2-hour timeout
      script:
        # Performance test script...
  3. Manual triggers: Allow performance tests to be triggered manually:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      script:
        # Performance test script...
      when: manual
  4. Dynamic test configuration: Use GitLab's predefined variables to configure tests dynamically:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      script:
        - |
          # Set the thresholds based on the branch
          if [ "$CI_COMMIT_REF_NAME" = "main" ]; then
            THRESHOLDS="avgresponse<=200,errors==0,p95<=250"
          else
            THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
          fi
          loadfocus-api jmeter run-test \
            --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
            --thresholds "$THRESHOLDS" \
            --format json > performance_results.json
  5. Notifications: Send notifications when performance tests fail:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      script:
        # Performance test script...
      after_script:
        - |
          # $? is not reliable in after_script, which runs in a separate
          # context; GitLab exposes the job result in CI_JOB_STATUS instead.
          if [ "$CI_JOB_STATUS" = "failed" ]; then
            # Send a notification via curl, the GitLab API, or another method
            curl -X POST -H "Content-Type: application/json" \
              -d "{\"text\":\"Performance test failed for ${CI_PROJECT_NAME}\"}" \
              $WEBHOOK_URL
          fi
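The branch-based threshold logic from tip 4 can also be factored into a small function so it is easy to exercise outside the pipeline. A sketch reusing the same threshold strings as above:

```shell
#!/bin/sh
# thresholds_for_branch: map a branch name to the threshold string
# used in tip 4 (strict limits on main, relaxed limits elsewhere).
thresholds_for_branch() {
  if [ "$1" = "main" ]; then
    echo "avgresponse<=200,errors==0,p95<=250"
  else
    echo "avgresponse<=300,errors==0,p95<=500"
  fi
}

# Example: derive the thresholds from GitLab's predefined branch variable,
# falling back to "develop" when running outside CI.
THRESHOLDS=$(thresholds_for_branch "${CI_COMMIT_REF_NAME:-develop}")
echo "Using thresholds: $THRESHOLDS"
```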

For more information, see the GitLab CI/CD documentation and the LoadFocus API client documentation.