GitHub Actions

This guide explains how to integrate the LoadFocus JMeter API Client with GitHub Actions for automated performance testing.

Setup Steps

1. Store Credentials as GitHub Secrets

First, store your LoadFocus API credentials as GitHub repository secrets:

  1. Go to your GitHub repository
  2. Navigate to Settings > Secrets and variables > Actions
  3. Add the following repository secrets:
    • LOADFOCUS_API_KEY: your LoadFocus API key
    • LOADFOCUS_TEAM_ID: your LoadFocus team ID
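If you prefer the command line, the same secrets can be added with the GitHub CLI. A minimal sketch, assuming `gh` is installed and authenticated against the repository:

```shell
# Add the LoadFocus credentials as repository secrets via the GitHub CLI.
# Each command prompts for the value, so the keys never land in shell history.
gh secret set LOADFOCUS_API_KEY
gh secret set LOADFOCUS_TEAM_ID

# Confirm the secrets exist (values are never displayed).
gh secret list
```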

2. Create a GitHub Actions Workflow

Create a .github/workflows/performance-test.yml file in your repository:

name: Performance Tests

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]
  # Optional: run on a schedule
  schedule:
    - cron: '0 0 * * 1'  # every Monday at midnight

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'
      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client
      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}
      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository_owner }}_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json
      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json
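The JSON results can also be gated on in a follow-up step so the job fails on a regression. A minimal sketch, assuming the results JSON exposes the `overallResult` field that the PR-comment script later in this guide also reads:

```shell
# Simulate a results file with the assumed shape (in CI this file is
# produced by `loadfocus-api jmeter run-test --format json`).
cat > performance_results.json <<'EOF'
{"overallResult": "PASSED", "labels": []}
EOF

# Exit non-zero (failing the CI step) unless the overall result is PASSED.
if grep -q '"overallResult": *"PASSED"' performance_results.json; then
  echo "performance gate: passed"
else
  echo "performance gate: failed" >&2
  exit 1
fi
```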

3. Add Performance Tests to a Deployment Workflow

To make deployment depend on the performance test results:

name: Build, Test, and Deploy

on:
  push:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Build steps...

  performance-test:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'
      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client
      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}
      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json
      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json

  deploy:
    needs: performance-test
    runs-on: ubuntu-latest
    steps:
      # Deployment steps...

Advanced Configuration

Matrix Testing for Multiple Environments

Run tests across multiple environments or configurations:

jobs:
  performance-test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        environment: [dev, staging, production]
        test-type: [api, frontend]
    steps:
      # Setup steps...
      - name: Run Performance Tests
        run: |
          TEST_NAME="${{ matrix.test-type }}_test_${{ matrix.environment }}"
          # Adjust thresholds by environment
          if [ "${{ matrix.environment }}" == "production" ]; then
            THRESHOLDS="avgresponse<=150,errors==0,p95<=200"
          else
            THRESHOLDS="avgresponse<=300,errors==0,p95<=500"
          fi
          loadfocus-api jmeter run-test \
            --name "$TEST_NAME" \
            --thresholds "$THRESHOLDS" \
            --format json > "performance_results_${{ matrix.environment }}_${{ matrix.test-type }}.json"
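The `--thresholds` string packs several comparisons into one argument, which is easy to get wrong when it is assembled conditionally as above. A small sketch that sanity-checks the string's shape before handing it to the CLI; the `metric<=value` grammar here is inferred from the examples in this guide:

```shell
# Validate a thresholds string such as "avgresponse<=200,errors==0,p95<=250".
# Each comma-separated part must look like <metric><operator><number>.
validate_thresholds() {
  local spec="$1" part
  local re='^[a-z0-9]+(<=|==|>=)[0-9]+$'
  IFS=',' read -ra parts <<< "$spec"
  for part in "${parts[@]}"; do
    if ! [[ "$part" =~ $re ]]; then
      echo "invalid threshold: $part" >&2
      return 1
    fi
  done
  echo "thresholds ok"
}

validate_thresholds "avgresponse<=150,errors==0,p95<=200"
```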

Generating Performance Test Reports

Generate an HTML report from the JSON results:

- name: Generate HTML Report
  run: |
    # Install the report generator
    npm install -g performance-report-generator
    # Generate the HTML report
    performance-report-generator \
      --input performance_results.json \
      --output performance_report.html
- name: Upload HTML Report
  uses: actions/upload-artifact@v3
  with:
    name: performance-test-report
    path: performance_report.html
# Optional: publish to GitHub Pages
- name: Publish to GitHub Pages
  uses: peaceiris/actions-gh-pages@v3
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    publish_dir: ./reports
    destination_dir: performance-reports

Commenting Test Results on PRs

Post the performance test results as a comment on pull requests:

- name: Comment PR
  uses: actions/github-script@v6
  if: github.event_name == 'pull_request'
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    script: |
      const fs = require('fs');
      const results = JSON.parse(fs.readFileSync('performance_results.json', 'utf8'));
      let comment = '## Performance Test Results\n\n';
      comment += `**Overall result:** ${results.overallResult}\n\n`;
      comment += '### Results by Label\n\n';
      for (const label of results.labels) {
        comment += `#### ${label.label}\n`;
        comment += `- **Result:** ${label.result}\n`;
        comment += `- **Samples:** ${label.metrics.samples}\n`;
        comment += `- **Avg response:** ${label.metrics.avgresponse}ms\n`;
        comment += `- **Errors:** ${label.metrics.errors}\n\n`;
      }
      github.rest.issues.createComment({
        issue_number: context.issue.number,
        owner: context.repo.owner,
        repo: context.repo.repo,
        body: comment
      });

Reusable Workflows

Create a reusable workflow for performance testing:

# .github/workflows/reusable-performance-test.yml
name: Reusable Performance Test

on:
  workflow_call:
    inputs:
      test-name:
        required: true
        type: string
      thresholds:
        required: false
        type: string
        default: "avgresponse<=200,errors==0,p95<=250"
      wait-timeout:
        required: false
        type: number
        default: 1800
    secrets:
      LOADFOCUS_API_KEY:
        required: true
      LOADFOCUS_TEAM_ID:
        required: true

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'
      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client
      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apikey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamid ${{ secrets.LOADFOCUS_TEAM_ID }}
      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "${{ inputs.test-name }}" \
            --thresholds "${{ inputs.thresholds }}" \
            --waitTimeout ${{ inputs.wait-timeout }} \
            --format json > performance_results.json
      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json

To call it from another workflow:

# .github/workflows/main.yml
jobs:
  call-performance-test:
    uses: ./.github/workflows/reusable-performance-test.yml
    with:
      test-name: "API_Performance_Test"
      thresholds: "avgresponse<=150,errors==0"
    secrets:
      LOADFOCUS_API_KEY: ${{ secrets.LOADFOCUS_API_KEY }}
      LOADFOCUS_TEAM_ID: ${{ secrets.LOADFOCUS_TEAM_ID }}

GitHub Actions Integration Tips

  1. Caching: speed up workflow runs by caching npm dependencies:

    - name: Cache Node modules
      uses: actions/cache@v3
      with:
        path: ~/.npm
        key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
        restore-keys: |
          ${{ runner.os }}-node-
  2. Concurrency control: limit concurrent performance tests:

    concurrency:
      group: performance-test-${{ github.ref }}
      cancel-in-progress: false
  3. Environment-specific testing: use GitHub environments for different test configurations:

    jobs:
      performance-test:
        runs-on: ubuntu-latest
        environment: staging
        # Environment-specific variables are available here
  4. Conditional testing: run performance tests only when explicitly requested, e.g. via a PR label, a schedule, or a commit-message tag:

    jobs:
      performance-test:
        if: |
          contains(github.event.pull_request.labels.*.name, 'performance-test') ||
          github.event_name == 'schedule' ||
          contains(github.event.head_commit.message, '[perf-test]')

For more information, see the GitHub Actions documentation and the LoadFocus API Client documentation.