GitLab CI/CD

This guide explains how to integrate the LoadFocus JMeter API Client with GitLab CI/CD for automated performance testing.

Setup Steps

1. Store Credentials as GitLab CI/CD Variables

First, store your LoadFocus API credentials as GitLab CI/CD variables:

  1. Go to your GitLab project
  2. Navigate to Settings > CI/CD > Variables
  3. Add the following variables:
    • LOADFOCUS_API_KEY: Your LoadFocus API key (mark it as "Masked")
    • LOADFOCUS_TEAM_ID: Your LoadFocus team ID
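
If you prefer to script this step, the same variables can be created through the GitLab REST API. A minimal sketch, assuming a personal access token with the api scope in GITLAB_TOKEN and your numeric project ID in PROJECT_ID (both placeholders):

# Create the CI/CD variables via the GitLab API; <your-api-key> and <your-team-id> are placeholders
curl --request POST \
  --header "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
  "https://gitlab.com/api/v4/projects/${PROJECT_ID}/variables" \
  --form "key=LOADFOCUS_API_KEY" \
  --form "value=<your-api-key>" \
  --form "masked=true"

curl --request POST \
  --header "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
  "https://gitlab.com/api/v4/projects/${PROJECT_ID}/variables" \
  --form "key=LOADFOCUS_TEAM_ID" \
  --form "value=<your-team-id>"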

2. Create a GitLab CI/CD Pipeline

Create or update the .gitlab-ci.yml file in your repository:

stages:
  - build
  - test
  - performance
  - deploy

variables:
  NODE_VERSION: "16"

build:
  stage: build
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week

test:
  stage: test
  image: node:${NODE_VERSION}
  script:
    - npm install
    - npm test

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Install the LoadFocus JMeter API Client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure the LoadFocus API Client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    # Run the performance tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always
  # Optional: only run on specific branches
  only:
    - main
    - develop

deploy:
  stage: deploy
  script:
    - echo "Deploying application..."
  only:
    - main
  # Deploy only runs if all previous stages succeeded
  when: on_success

3. View Test Results

After the pipeline runs:

  1. Go to your GitLab project
  2. Navigate to CI/CD > Pipelines
  3. Find your pipeline and click on it
  4. Open the "performance_test" job
  5. Click "Browse" in the right sidebar to view the artifacts
  6. Download and inspect the performance_results.json file
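
To check the downloaded results from the command line, jq is convenient. The path below assumes the JSON layout used later in this guide, with metrics under .labels[0].metrics:

# Print the headline metrics from the results file (assumed structure)
jq '.labels[0].metrics | {avgresponse, errors}' performance_results.json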

Advanced Configuration

Environment-Specific Tests

Run different performance tests for different environments. A YAML anchor keeps the shared job definition in one place:

.performance_test_template: &performance_test_definition
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "${TEST_NAME}" \
        --thresholds "${THRESHOLDS}" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
    when: always

performance_test_develop:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop

performance_test_staging:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Staging"
    THRESHOLDS: "avgresponse<=250,errors==0,p95<=350"
  only:
    - staging

performance_test_production:
  <<: *performance_test_definition
  variables:
    TEST_NAME: "API_Test_Production"
    THRESHOLDS: "avgresponse<=200,errors==0,p95<=250"
  only:
    - main
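
GitLab's extends keyword is a more readable alternative to YAML anchors and works with the same hidden-job pattern. For example, the develop job above could equivalently be written as:

performance_test_develop:
  extends: .performance_test_template
  variables:
    TEST_NAME: "API_Test_Develop"
    THRESHOLDS: "avgresponse<=300,errors==0,p95<=500"
  only:
    - develop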

Parallel Tests

Run multiple performance tests in parallel. Jobs in the same stage run concurrently by default (given enough available runners), so it is enough to define both jobs in the performance stage:

performance_test_api:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "API_Performance_Test" \
        --thresholds "avgresponse<=150,errors==0" \
        --format json > api_performance_results.json
  artifacts:
    paths:
      - api_performance_results.json
    expire_in: 1 week

performance_test_ui:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "UI_Performance_Test" \
        --thresholds "avgresponse<=300,errors==0" \
        --format json > ui_performance_results.json
  artifacts:
    paths:
      - ui_performance_results.json
    expire_in: 1 week

Scheduled Tests

To run performance tests on a recurring schedule, create a pipeline schedule in GitLab:

  1. Go to your GitLab project
  2. Navigate to CI/CD > Schedules
  3. Click "New schedule"
  4. Configure a schedule (for example, every day at midnight)
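
A scheduled pipeline runs every job that is eligible on the scheduled branch. To make a job run only when started by the schedule, rather than on every push, restrict it with only: schedules. A minimal sketch (the job name and test name here are illustrative):

performance_test_scheduled:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "Scheduled_Performance_Test" \
        --thresholds "avgresponse<=200,errors==0" \
        --format json > performance_results.json
  only:
    # Run this job only when the pipeline is started by a schedule
    - schedules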

Create Performance Reports

Generate HTML reports from the JSON results:

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Generate an HTML report
    - npm install -g performance-report-generator # Replace with your actual report generator
    - performance-report-generator --input performance_results.json --output performance_report.html
  artifacts:
    paths:
      - performance_results.json
      - performance_report.html
    expire_in: 1 week
    when: always

# Optional: publish the report with GitLab Pages
pages:
  stage: deploy
  dependencies:
    - performance_test
  script:
    - mkdir -p public/performance-reports
    - cp performance_report.html public/performance-reports/index.html
  artifacts:
    paths:
      - public
  only:
    - main
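
Once the pages job has run on main, GitLab Pages serves everything under public/ at your project's Pages URL, so the report above becomes available at <pages-url>/performance-reports/.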

Integration with GitLab Features

Merge Request Widgets

Display performance test results in merge requests using GitLab's JUnit report feature:

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Convert the results to JUnit format (using a custom script)
    - node convert-to-junit.js performance_results.json junit-report.xml
  artifacts:
    reports:
      junit: junit-report.xml
    paths:
      - performance_results.json
    expire_in: 1 week
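
The convert-to-junit.js referenced above is a custom script and is not part of the LoadFocus client; any script that emits JUnit XML will do. As an illustration only, a minimal shell equivalent using jq, assuming the metrics live under .labels[0].metrics as elsewhere in this guide:

#!/bin/sh
# convert-to-junit.sh <input.json> <output.xml> — illustrative sketch, not the actual script
INPUT="$1"
OUTPUT="$2"
AVG=$(jq -r '.labels[0].metrics.avgresponse' "$INPUT")
ERRORS=$(jq -r '.labels[0].metrics.errors' "$INPUT")

# Mark the single test case as failed if any errors were recorded
if [ "$ERRORS" = "0" ]; then
  FAILURES=0
  FAILURE_TAG=""
else
  FAILURES=1
  FAILURE_TAG="<failure message=\"${ERRORS} errors recorded\"/>"
fi

cat > "$OUTPUT" <<EOF
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="LoadFocus Performance" tests="1" failures="${FAILURES}">
  <testcase name="avgresponse=${AVG}ms">${FAILURE_TAG}</testcase>
</testsuite>
EOF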

GitLab Metrics

Use GitLab's metrics reports feature to track performance over time:

performance_test:
  stage: performance
  image: node:${NODE_VERSION}
  script:
    # Run the performance test
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
    # Extract metrics and report them to GitLab
    - |
      # Parse the JSON and pull out the key metrics
      RESPONSE_TIME=$(jq '.labels[0].metrics.avgresponse' performance_results.json)
      ERROR_RATE=$(jq '.labels[0].metrics.errors' performance_results.json)
      # Write them in the "metric_name value" format expected by metrics reports
      echo "performance_avg_response_time ${RESPONSE_TIME}" >> metrics.txt
      echo "performance_error_rate ${ERROR_RATE}" >> metrics.txt
  artifacts:
    reports:
      metrics: metrics.txt
    paths:
      - performance_results.json
    expire_in: 1 week
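
The metrics report is rendered in the merge request widget, where GitLab compares the reported values against the target branch. Note that metrics reports are, at the time of writing, limited to paid GitLab tiers.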

GitLab Environments

Associate performance tests with specific environments:

performance_test_staging:
  stage: performance
  image: node:${NODE_VERSION}
  environment:
    name: staging
  script:
    - npm install -g @loadfocus/loadfocus-api-client
    - loadfocus-api config set apikey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamid $LOADFOCUS_TEAM_ID
    - |
      loadfocus-api jmeter run-test \
        --name "Staging_Performance_Test" \
        --thresholds "avgresponse<=250,errors==0" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
  only:
    - staging

Tips for GitLab CI/CD Integration

  1. Caching: Cache npm dependencies to speed up pipeline runs:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      cache:
        key: ${CI_COMMIT_REF_SLUG}
        paths:
          - node_modules/
      script:
        - npm install -g @loadfocus/loadfocus-api-client
        # Rest of the script...
  2. Timeout Settings: Set timeouts for long-running performance tests:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      timeout: 2h # Set a 2-hour timeout
      script:
        # Performance test script...
  3. Manual Triggering: Allow performance tests to be triggered manually:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      script:
        # Performance test script...
      when: manual
  4. Dynamic Test Configuration: Use GitLab's predefined variables to configure tests dynamically:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      script:
        # Install and configure the client as in the earlier examples, then:
        - |
          # Choose thresholds based on the branch
          if [ "$CI_COMMIT_REF_NAME" = "main" ]; then
            THRESHOLDS="avgresponse<=200,errors==0,p95<=250"
          else
            THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
          fi
          loadfocus-api jmeter run-test \
            --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
            --thresholds "$THRESHOLDS" \
            --format json > performance_results.json
  5. Notifications: Send notifications when performance tests fail:

    performance_test:
      stage: performance
      image: node:${NODE_VERSION}
      script:
        # Performance test script...
      after_script:
        - |
          # after_script runs in its own shell, so $? does not reflect the job result;
          # use the predefined CI_JOB_STATUS variable instead
          if [ "$CI_JOB_STATUS" = "failed" ]; then
            # Send a notification using curl, the GitLab API, or another method
            curl -X POST -H "Content-Type: application/json" \
              -d "{\"text\":\"Performance test failed for ${CI_PROJECT_NAME}\"}" \
              $WEBHOOK_URL
          fi

For more information, see the GitLab CI/CD documentation and the LoadFocus API Client documentation.