mirror of https://github.com/RT-Thread/rt-thread.git
synced 2025-11-16 12:34:33 +00:00

Compare commits: 1 commit, revert-108 ... v5.2.2 (ddf52e2cdd)

.github/workflows/README_CI_RESULTS.md (vendored, 139 lines removed)
@@ -1,139 +0,0 @@
# CI Results Comment Workflow

## Overview / 概述

This feature automatically posts CI test results as comments on Pull Requests, making it easier for contributors and reviewers to see the status of all CI checks at a glance.

此功能自动将 CI 测试结果作为评论发布到 Pull Request 中,使贡献者和审阅者更容易一目了然地看到所有 CI 检查的状态。

## Implementation / 实现方式

The feature uses **two complementary approaches** to ensure CI results are always visible:

该功能使用**两种互补方法**来确保 CI 结果始终可见:

### 1. Direct Workflow Integration (Immediate) / 直接工作流集成(立即生效)

Each main CI workflow includes a `post-ci-status` job that:

每个主要 CI 工作流都包含一个 `post-ci-status` 作业,它:

- ✅ Works immediately on PR branches (no merge required) / 立即在 PR 分支上生效(无需合并)
- 📝 Updates a single comment with the workflow status / 使用工作流状态更新单个评论
- 🔄 Runs after each workflow completes / 在每个工作流完成后运行

**Modified Workflows:**

- `bsp_buildings.yml`
- `static_code_analysis.yml`
- `format_check.yml`
- `utest_auto_run.yml`

### 2. Workflow Run Trigger (After Merge) / 工作流运行触发器(合并后)

The `ci_results_comment.yml` workflow:

`ci_results_comment.yml` 工作流:

- ⏰ Triggers when CI workflows complete / 在 CI 工作流完成时触发
- 📊 Provides a comprehensive summary of all workflows / 提供所有工作流的全面摘要
- 🔍 Shows detailed job-level information / 显示详细的作业级信息
- ⚠️ **Only works after being merged to master** / **仅在合并到 master 后才有效**

## Features / 功能特性

1. **Automatic Updates / 自动更新**: The comment is automatically created when CI workflows complete and updated as new workflows finish. / 当 CI 工作流完成时自动创建评论,并在新工作流完成时更新。

2. **Comprehensive Summary / 全面总结**: Shows the status of all major CI workflows, including: / 显示所有主要 CI 工作流的状态,包括:
   - RT-Thread BSP Static Build Check / BSP 静态构建检查
   - Static code analysis / 静态代码分析
   - Check File Format and License / 文件格式和许可证检查
   - utest_auto_run / 单元测试自动运行

3. **Status Indicators / 状态指示器**:
   - ✅ Success / 成功
   - ❌ Failure / 失败
   - 🟠 Queued / 排队中
   - 🟡 In Progress / 进行中
   - ⏭️ Skipped / 已跳过

4. **Detailed Information / 详细信息**: Expandable sections show individual job results within each workflow. / 可展开的部分显示每个工作流中的各个作业结果。

## How It Works / 工作原理

1. The workflow is triggered when any of the monitored CI workflows complete. / 当任何受监控的 CI 工作流完成时,将触发此工作流。

2. It collects the status of all workflows and jobs for the associated Pull Request. / 它收集关联 Pull Request 的所有工作流和作业的状态。

3. A formatted comment is posted (or updated if one already exists) with the current CI status. / 发布(或更新已存在的)格式化评论,显示当前 CI 状态。
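The post-or-update decision in step 3 can be sketched in plain Python. The marker string matches the hidden HTML comment the workflows embed; the comment dicts are simplified stand-ins for the GitHub API objects, used only for illustration:

```python
# Minimal sketch of the "post or update" logic: a hidden HTML marker
# identifies the bot's own comment so it is updated instead of duplicated.
MARKER = "<!-- CI Results Comment -->"

def find_existing_comment(comments):
    """Return the first bot comment carrying the marker, else None."""
    for c in comments:
        if c.get("user") == "github-actions[bot]" and MARKER in c.get("body", ""):
            return c
    return None

def post_or_update(comments):
    """Decide whether to update the existing marker comment or create one."""
    existing = find_existing_comment(comments)
    if existing is not None:
        return ("update", existing["id"])
    return ("create", None)
```

Searching by the marker rather than by comment position keeps the logic stable even when other bots or reviewers comment in between.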
## Comment Format / 评论格式

The comment includes:

评论包括:

- **Overall Summary / 总体摘要**: Quick statistics showing the count of passed, failed, queued, in-progress, and skipped workflows. / 快速统计数据,显示通过、失败、排队、进行中和跳过的工作流数量。

- **Detailed Results / 详细结果**: Collapsible sections for each workflow with links to individual jobs. / 每个工作流的可折叠部分,包含指向各个作业的链接。
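The Overall Summary counts can be derived from each run's `status` and `conclusion` fields. A small sketch — the field names follow the GitHub Actions run objects the workflow consumes, but the input here is hand-made:

```python
from collections import Counter

def summarize(runs):
    """Tally workflow runs into the five summary buckets used in the comment."""
    counts = Counter(success=0, failure=0, queued=0, in_progress=0, skipped=0)
    for run in runs:
        if run.get("conclusion") == "success":
            counts["success"] += 1
        elif run.get("conclusion") == "failure":
            counts["failure"] += 1
        elif run.get("status") == "queued":
            counts["queued"] += 1
        elif run.get("status") != "completed":
            counts["in_progress"] += 1
        elif run.get("conclusion") in ("skipped", "cancelled"):
            counts["skipped"] += 1
    return dict(counts)
```

Note that `conclusion` is only meaningful once `status` is `completed`, which is why the branches are ordered this way.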
## Benefits / 优势

1. **Visibility / 可见性**: Contributors can immediately see which CI checks have passed or failed without navigating to the Actions tab. / 贡献者无需导航到 Actions 选项卡即可立即查看哪些 CI 检查通过或失败。

2. **Efficiency / 效率**: Reviewers can quickly assess the CI status before reviewing the code. / 审阅者可以在审查代码之前快速评估 CI 状态。

3. **Transparency / 透明度**: All stakeholders have a clear view of the PR's CI status. / 所有利益相关者都可以清楚地了解 PR 的 CI 状态。

## Permissions Required / 所需权限

The workflow requires the following permissions:

工作流需要以下权限:

- `pull-requests: write` - To create and update comments / 创建和更新评论
- `issues: write` - To post comments on PR issues / 在 PR 问题上发布评论
- `actions: read` - To read workflow run status / 读取工作流运行状态
- `checks: read` - To read check run status / 读取检查运行状态

## Configuration / 配置

The workflow monitors the following workflows by default:

工作流默认监控以下工作流:

```yaml
workflows:
  - "RT-Thread BSP Static Build Check"
  - "Static code analysis"
  - "Check File Format and License"
  - "utest_auto_run"
```

To monitor additional workflows, edit the `.github/workflows/ci_results_comment.yml` file and add the workflow names to the `workflows` list.

要监控更多工作流,请编辑 `.github/workflows/ci_results_comment.yml` 文件并将工作流名称添加到 `workflows` 列表中。

## Troubleshooting / 故障排除

### Comment not appearing / 评论未出现

1. Ensure the workflow has the required permissions / 确保工作流具有所需权限
2. Check that the PR is from a branch in the repository (not a fork) / 检查 PR 是否来自存储库中的分支(而非 fork 仓库)
3. Verify the workflow is enabled in the repository settings / 验证工作流在存储库设置中已启用

### Comment not updating / 评论未更新

1. The comment updates when a monitored workflow completes / 当受监控的工作流完成时,评论会更新
2. Check the Actions tab to see if the workflow is running / 检查 Actions 选项卡以查看工作流是否正在运行
3. Look for errors in the workflow logs / 在工作流日志中查找错误

## Contributing / 贡献

Contributions to improve this workflow are welcome! Please follow the standard contribution process outlined in the CONTRIBUTING.md file.

欢迎改进此工作流的贡献!请遵循 CONTRIBUTING.md 文件中概述的标准贡献流程。
.github/workflows/bsp_buildings.yml (vendored, 22 lines changed)
@@ -13,6 +13,9 @@ name: RT-Thread BSP Static Build Check

# Controls when the action will run. Triggers the workflow on push or pull request
# events but only for the RT-Thread organization master branch
on:
  # Runs at 16:00 UTC (Beijing 00:00) every day
  schedule:
    - cron: '0 16 * * *'
  push:
    branches:
      - master

@@ -43,12 +46,6 @@ on:
    types:
      - online-pkgs-static-building-trigger-event
  workflow_dispatch:
    inputs:
      trigger_type:
        description: '触发类型'
        required: false
        default: 'manual'
        type: string

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}

@@ -292,16 +289,3 @@ jobs:
      with:
        name: 00_all_bsp_output_${{ github.sha }}
        path: output/

  # Post CI status to PR comment
  post-ci-status:
    needs: build
    if: always() && github.event_name == 'pull_request' && github.repository_owner == 'RT-Thread'
    uses: ./.github/workflows/post_ci_status.yml
    with:
      workflow_name: "RT-Thread BSP Static Build Check"
      workflow_status: ${{ needs.build.result }}
      pr_number: ${{ github.event.pull_request.number }}
    permissions:
      pull-requests: write
      issues: write
.github/workflows/ci_results_comment.yml (vendored, 303 lines removed)
@@ -1,303 +0,0 @@
#
# Copyright (c) 2025, RT-Thread Development Team
#
# SPDX-License-Identifier: Apache-2.0
#
# Change Logs:
# Date           Author           Notes
# 2025-10-27     GitHub Copilot   Post CI results to PR comments

name: CI Results Comment

on:
  workflow_run:
    workflows:
      - "RT-Thread BSP Static Build Check"
      - "Static code analysis"
      - "Check File Format and License"
      - "utest_auto_run"
      - "ToolsCI"
      - "pkgs_test"
    types:
      - completed

permissions:
  pull-requests: write
  issues: write
  actions: read
  checks: read

jobs:
  comment-ci-results:
    runs-on: ubuntu-22.04
    if: github.event.workflow_run.event == 'pull_request' && github.repository_owner == 'RT-Thread'
    steps:
      - name: Get PR number
        id: get-pr
        uses: actions/github-script@v7
        with:
          script: |
            // Get the PR number from the workflow_run payload
            const prNumber = context.payload.workflow_run.pull_requests[0]?.number;
            if (!prNumber) {
              console.log('No PR found in workflow_run');
              // Fallback: search for an open PR by head branch
              const pulls = await github.rest.pulls.list({
                owner: context.repo.owner,
                repo: context.repo.repo,
                state: 'open',
                head: `${context.repo.owner}:${context.payload.workflow_run.head_branch}`
              });

              if (pulls.data.length === 0) {
                console.log('No open PR found for this branch');
                return null;
              }

              const pr = pulls.data[0];
              console.log(`Found PR #${pr.number}`);
              return pr.number;
            }

            console.log(`Found PR #${prNumber}`);
            return prNumber;

      - name: Get workflow run details
        if: steps.get-pr.outputs.result != 'null'
        id: workflow-details
        uses: actions/github-script@v7
        with:
          script: |
            const prNumber = ${{ steps.get-pr.outputs.result }};
            if (!prNumber) {
              return { success: false, message: 'No PR found' };
            }

            // Get all pull_request workflow runs in the repository
            const workflowRuns = await github.rest.actions.listWorkflowRunsForRepo({
              owner: context.repo.owner,
              repo: context.repo.repo,
              event: 'pull_request',
              per_page: 100
            });

            // Keep only the runs belonging to this specific PR
            const prRuns = workflowRuns.data.workflow_runs.filter(run => {
              return run.pull_requests.some(pr => pr.number === prNumber);
            });

            // Keep the latest run for each workflow name
            const workflowMap = new Map();
            for (const run of prRuns) {
              const existing = workflowMap.get(run.name);
              if (!existing || new Date(run.created_at) > new Date(existing.created_at)) {
                workflowMap.set(run.name, run);
              }
            }

            // Prepare the results summary
            const results = [];
            for (const [name, run] of workflowMap) {
              let status = '🟡';
              let statusText = 'In Progress';

              if (run.status === 'completed') {
                if (run.conclusion === 'success') {
                  status = '✅';
                  statusText = 'Success';
                } else if (run.conclusion === 'failure') {
                  status = '❌';
                  statusText = 'Failure';
                } else if (run.conclusion === 'cancelled') {
                  status = '⏭️';
                  statusText = 'Cancelled';
                } else if (run.conclusion === 'skipped') {
                  status = '⏭️';
                  statusText = 'Skipped';
                }
              } else if (run.status === 'queued') {
                status = '🟠';
                statusText = 'Queued';
              }

              results.push({
                name: name,
                status: status,
                statusText: statusText,
                url: run.html_url,
                conclusion: run.conclusion,
                runId: run.id
              });
            }

            return {
              success: true,
              results: results,
              prNumber: prNumber
            };

      - name: Get job details
        if: steps.get-pr.outputs.result != 'null'
        id: job-details
        uses: actions/github-script@v7
        with:
          script: |
            const workflowDetails = ${{ steps.workflow-details.outputs.result }};
            if (!workflowDetails || !workflowDetails.success) {
              return { jobs: [] };
            }

            const allJobs = [];

            for (const result of workflowDetails.results) {
              try {
                const jobs = await github.rest.actions.listJobsForWorkflowRun({
                  owner: context.repo.owner,
                  repo: context.repo.repo,
                  run_id: result.runId,
                  per_page: 100
                });

                for (const job of jobs.data.jobs) {
                  let jobStatus = '⌛';
                  if (job.status === 'completed') {
                    if (job.conclusion === 'success') {
                      jobStatus = '✅';
                    } else if (job.conclusion === 'failure') {
                      jobStatus = '❌';
                    } else if (job.conclusion === 'skipped') {
                      jobStatus = '⏭️';
                    }
                  } else if (job.status === 'in_progress') {
                    jobStatus = '🔄';
                  } else if (job.status === 'queued') {
                    jobStatus = '🟠';
                  }

                  allJobs.push({
                    workflow: result.name,
                    name: job.name,
                    status: jobStatus,
                    conclusion: job.conclusion || job.status,
                    url: job.html_url
                  });
                }
              } catch (error) {
                console.log(`Error getting jobs for workflow ${result.name}: ${error.message}`);
              }
            }

            return { jobs: allJobs };

      - name: Post or update comment
        if: steps.get-pr.outputs.result != 'null'
        uses: actions/github-script@v7
        with:
          script: |
            const prNumber = ${{ steps.get-pr.outputs.result }};
            const workflowDetails = ${{ steps.workflow-details.outputs.result }};
            const jobDetails = ${{ steps.job-details.outputs.result }};

            if (!workflowDetails || !workflowDetails.success) {
              console.log('No workflow details available');
              return;
            }

            // Prepare the comment body
            const now = new Date();
            const timestamp = now.toISOString();
            const results = workflowDetails.results;
            const jobs = jobDetails.jobs || [];

            let commentBody = '<!-- CI Results Comment -->\n';
            commentBody += '## 🤖 CI Test Results\n\n';
            commentBody += `**Last Updated:** ${timestamp}\n\n`;
            commentBody += '### Test Spec & Results:\n\n';
            commentBody += '✅ Success | ❌ Failure | 🟠 Queued | 🟡 Progress | ⏭️ Skipped | ⚠️ Quarantine\n\n';

            // Group jobs by workflow
            const jobsByWorkflow = new Map();
            for (const job of jobs) {
              if (!jobsByWorkflow.has(job.workflow)) {
                jobsByWorkflow.set(job.workflow, []);
              }
              jobsByWorkflow.get(job.workflow).push(job);
            }

            // Calculate overall statistics
            let totalSuccess = 0;
            let totalFailure = 0;
            let totalQueued = 0;
            let totalProgress = 0;
            let totalSkipped = 0;

            for (const result of results) {
              if (result.conclusion === 'success') totalSuccess++;
              else if (result.conclusion === 'failure') totalFailure++;
              else if (result.statusText === 'Queued') totalQueued++;
              else if (result.statusText === 'In Progress') totalProgress++;
              else if (result.conclusion === 'skipped' || result.conclusion === 'cancelled') totalSkipped++;
            }

            // Summary lines
            commentBody += '#### Overall Summary\n\n';
            commentBody += `- ✅ **Success:** ${totalSuccess}\n`;
            commentBody += `- ❌ **Failure:** ${totalFailure}\n`;
            commentBody += `- 🟠 **Queued:** ${totalQueued}\n`;
            commentBody += `- 🟡 **In Progress:** ${totalProgress}\n`;
            commentBody += `- ⏭️ **Skipped:** ${totalSkipped}\n\n`;

            commentBody += '---\n\n';
            commentBody += '### Detailed Results\n\n';

            // Build the detailed results
            for (const result of results) {
              commentBody += `<details>\n`;
              commentBody += `<summary>${result.status} <strong>${result.name}</strong> - ${result.statusText}</summary>\n\n`;
              commentBody += `**Workflow:** [${result.name}](${result.url})\n\n`;

              // Show jobs for this workflow
              const workflowJobs = jobsByWorkflow.get(result.name) || [];
              if (workflowJobs.length > 0) {
                commentBody += '**Jobs:**\n\n';
                for (const job of workflowJobs) {
                  commentBody += `- ${job.status} [${job.name}](${job.url})\n`;
                }
              }
              commentBody += '\n</details>\n\n';
            }

            commentBody += '\n---\n';
            commentBody += '*🤖 This comment is automatically generated and updated by the CI system.*\n';

            // Check whether a marker comment already exists
            const comments = await github.rest.issues.listComments({
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: prNumber
            });

            const existingComment = comments.data.find(comment =>
              comment.user.login === 'github-actions[bot]' &&
              comment.body.includes('<!-- CI Results Comment -->')
            );

            if (existingComment) {
              // Update the existing comment
              await github.rest.issues.updateComment({
                owner: context.repo.owner,
                repo: context.repo.repo,
                comment_id: existingComment.id,
                body: commentBody
              });
              console.log(`Updated comment ${existingComment.id} on PR #${prNumber}`);
            } else {
              // Create a new comment
              await github.rest.issues.createComment({
                owner: context.repo.owner,
                repo: context.repo.repo,
                issue_number: prNumber,
                body: commentBody
              });
              console.log(`Created new comment on PR #${prNumber}`);
            }
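A subtle part of the workflow above is keeping only the newest run per workflow name (the `workflowMap` loop), so stale re-runs never shadow the latest result. The same selection in Python — the run dicts are simplified stand-ins for GitHub's API objects:

```python
from datetime import datetime

def latest_run_per_workflow(runs):
    """Keep the most recently created run for each workflow name."""
    latest = {}
    for run in runs:
        created = datetime.fromisoformat(run["created_at"])
        existing = latest.get(run["name"])
        if existing is None or created > datetime.fromisoformat(existing["created_at"]):
            latest[run["name"]] = run
    return latest
```

Comparing parsed timestamps rather than strings avoids surprises if the API ever returns mixed formats.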
.github/workflows/format_check.yml (vendored, 13 lines changed)
@@ -30,16 +30,3 @@ jobs:
      run: |
        pip install click chardet PyYaml
        python tools/ci/file_check.py check 'https://github.com/RT-Thread/rt-thread' 'master'

  # Post CI status to PR comment
  post-ci-status:
    needs: scancode_job
    if: always() && github.event_name == 'pull_request' && github.repository_owner == 'RT-Thread'
    uses: ./.github/workflows/post_ci_status.yml
    with:
      workflow_name: "Check File Format and License"
      workflow_status: ${{ needs.scancode_job.result }}
      pr_number: ${{ github.event.pull_request.number }}
    permissions:
      pull-requests: write
      issues: write
.github/workflows/post_ci_status.yml (vendored, 108 lines removed)
@@ -1,108 +0,0 @@
#
# Copyright (c) 2025, RT-Thread Development Team
#
# SPDX-License-Identifier: Apache-2.0
#
# Change Logs:
# Date           Author           Notes
# 2025-10-27     GitHub Copilot   Reusable workflow to post CI status

name: Post CI Status Comment

on:
  workflow_call:
    inputs:
      workflow_name:
        description: 'Name of the workflow'
        required: true
        type: string
      workflow_status:
        description: 'Status of the workflow (success/failure)'
        required: true
        type: string
      pr_number:
        description: 'Pull request number'
        required: true
        type: number

permissions:
  pull-requests: write
  issues: write

jobs:
  post-comment:
    runs-on: ubuntu-22.04
    if: github.repository_owner == 'RT-Thread'
    steps:
      - name: Post or update CI status comment
        uses: actions/github-script@v7
        with:
          script: |
            const prNumber = ${{ inputs.pr_number }};
            const workflowName = '${{ inputs.workflow_name }}';
            const workflowStatus = '${{ inputs.workflow_status }}';

            // Status emoji mapping
            const statusEmoji = workflowStatus === 'success' ? '✅' : '❌';
            const timestamp = new Date().toISOString();

            // Try to find an existing marker comment
            const comments = await github.rest.issues.listComments({
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: prNumber
            });

            const botComment = comments.data.find(comment =>
              comment.user.login === 'github-actions[bot]' &&
              comment.body.includes('<!-- CI Status Comment -->')
            );

            // Collect all workflow statuses recorded so far for this PR
            let allStatuses = {};

            if (botComment) {
              // Parse the existing statuses out of the comment body
              const statusRegex = /- (✅|❌|🟡) \*\*(.+?)\*\*/g;
              let match;
              while ((match = statusRegex.exec(botComment.body)) !== null) {
                allStatuses[match[2]] = match[1];
              }
            }

            // Update the current workflow's status
            allStatuses[workflowName] = statusEmoji;

            // Build the comment body
            let commentBody = '<!-- CI Status Comment -->\n';
            commentBody += '## 🤖 CI Test Results\n\n';
            commentBody += `**Last Updated:** ${timestamp}\n\n`;
            commentBody += '### Workflow Status:\n\n';

            for (const [name, emoji] of Object.entries(allStatuses)) {
              commentBody += `- ${emoji} **${name}**\n`;
            }

            commentBody += '\n---\n';
            commentBody += '✅ Success | ❌ Failure | 🟡 In Progress\n\n';
            commentBody += '*This comment is automatically updated as CI workflows complete.*\n';

            if (botComment) {
              // Update the existing comment
              await github.rest.issues.updateComment({
                owner: context.repo.owner,
                repo: context.repo.repo,
                comment_id: botComment.id,
                body: commentBody
              });
              console.log(`Updated comment ${botComment.id} on PR #${prNumber}`);
            } else {
              // Create a new comment
              await github.rest.issues.createComment({
                owner: context.repo.owner,
                repo: context.repo.repo,
                issue_number: prNumber,
                body: commentBody
              });
              console.log(`Created new comment on PR #${prNumber}`);
            }
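The reusable workflow is stateless between runs: it recovers earlier workflow statuses by re-parsing its own previous comment with a regex. The equivalent parse in Python, with a made-up sample body:

```python
import re

# Mirrors statusRegex in post_ci_status.yml: capture the emoji and the
# workflow name from lines shaped like "- ✅ **Workflow Name**".
STATUS_RE = re.compile(r"- (✅|❌|🟡) \*\*(.+?)\*\*")

def parse_statuses(body):
    """Return a {workflow name: emoji} map from an existing comment body."""
    return {name: emoji for emoji, name in STATUS_RE.findall(body)}
```

The non-greedy `(.+?)` matters: with a greedy match, a name followed by further bold text on the same line would swallow too much.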
.github/workflows/scheduled-ci-trigger.yml (vendored, 200 lines removed)
@@ -1,200 +0,0 @@
name: Weekly CI Scheduler

on:
  # Runs at 08:00 Beijing time (00:00 UTC) every day
  schedule:
    - cron: '0 0 * * *'
  workflow_dispatch:
    inputs:
      debug:
        description: 'Debug mode'
        required: false
        default: 'false'

env:
  TARGET_WORKFLOWS: '["RT-Thread BSP Static Build Check", "utest_auto_run"]'
  DISCUSSION_CATEGORY: "Github Action Exception Reports"

jobs:
  trigger-and-monitor:
    name: Trigger and Monitor CIs
    runs-on: ubuntu-latest
    outputs:
      failed_workflows: ${{ steps.collect-results.outputs.failed_workflows }}
      total_workflows: ${{ steps.collect-results.outputs.total_workflows }}
      has_results: ${{ steps.collect-results.outputs.has_results }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Install Python dependencies
        run: |
          python -m pip install --upgrade pip
          pip install requests

      - name: Record start time
        id: start-time
        run: |
          echo "start_time=$(date -u +'%Y-%m-%dT%H:%M:%SZ')" >> $GITHUB_OUTPUT
          echo "Start time: $(date -u +'%Y-%m-%dT%H:%M:%SZ')"

      - name: Trigger CI workflows directly
        id: trigger-ci
        run: |
          python tools/ci/scheduled-ci-trigger/trigger_workflows_direct.py
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          TARGET_WORKFLOWS: ${{ env.TARGET_WORKFLOWS }}

      - name: Wait for workflows to appear
        id: wait-for-workflows
        run: |
          echo "Waiting for workflows to appear in API..."
          python tools/ci/scheduled-ci-trigger/wait_for_workflows.py "${{ steps.start-time.outputs.start_time }}"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          TARGET_WORKFLOWS: ${{ env.TARGET_WORKFLOWS }}

      - name: Monitor CI workflows
        id: monitor-ci
        run: |
          python tools/ci/scheduled-ci-trigger/monitor_workflows.py "${{ steps.start-time.outputs.start_time }}"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          TARGET_WORKFLOWS: ${{ env.TARGET_WORKFLOWS }}

      - name: Collect monitoring results
        id: collect-results
        run: |
          echo "Checking for monitoring results..."
          if [ -f "monitoring_results.json" ]; then
            echo "monitoring_results.json found"
            FAILED_COUNT=$(python -c "import json; data=json.load(open('monitoring_results.json')); print(len([w for w in data if w.get('conclusion') == 'failure']))")
            TOTAL_COUNT=$(python -c "import json; data=json.load(open('monitoring_results.json')); print(len(data))")
            echo "failed_workflows=$FAILED_COUNT" >> $GITHUB_OUTPUT
            echo "total_workflows=$TOTAL_COUNT" >> $GITHUB_OUTPUT
            echo "has_results=true" >> $GITHUB_OUTPUT
            echo "Results: $FAILED_COUNT failed out of $TOTAL_COUNT total"
          else
            echo "monitoring_results.json not found"
            echo "failed_workflows=0" >> $GITHUB_OUTPUT
            echo "total_workflows=0" >> $GITHUB_OUTPUT
            echo "has_results=false" >> $GITHUB_OUTPUT
          fi

      - name: Generate detailed report
        if: steps.collect-results.outputs.has_results == 'true' && steps.collect-results.outputs.failed_workflows != '0'
        id: generate-report
        run: |
          echo "Generating detailed report..."
          python tools/ci/scheduled-ci-trigger/generate_report.py
          echo "Report generation completed"

      - name: Upload report artifact
        if: steps.collect-results.outputs.has_results == 'true' && steps.collect-results.outputs.failed_workflows != '0'
        uses: actions/upload-artifact@v4
        with:
          name: ci-failure-report
          path: |
            monitoring_results.json
            failure_details.md
          retention-days: 7

  create-discussion:
    name: Create Discussion Report
    needs: trigger-and-monitor
    if: needs.trigger-and-monitor.outputs.has_results == 'true' && needs.trigger-and-monitor.outputs.failed_workflows != '0'
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Download report artifact
        uses: actions/download-artifact@v4
        with:
          name: ci-failure-report

      - name: Create Discussion
        uses: actions/github-script@v6
        env:
          DISCUSSION_CATEGORY: ${{ env.DISCUSSION_CATEGORY }}
        with:
          script: |
            const fs = require('fs');

            const reportPath = './failure_details.md';

            let reportContent = fs.readFileSync(reportPath, 'utf8');

            // Extract the date from the first line: # YYYYMMDD_ci_integration-failed-report
            const lines = reportContent.split('\n');
            const firstLine = lines[0].trim();
            const dateMatch = firstLine.match(/# (\d{8})_ci_integration-failed-report/);

            if (!dateMatch) {
              console.error('Failed to extract date from first line:', firstLine);
              process.exit(1);
            }

            const dateString = dateMatch[1];
            const discussionTitle = `${dateString}_ci_integration-failed-report`;

            // Key fix: drop the first line (a hidden line used only for date extraction)
            reportContent = lines.slice(1).join('\n').trim();

            // Fetch the repository ID and the discussion category ID
            const getRepoQuery = `
              query($owner: String!, $repo: String!) {
                repository(owner: $owner, name: $repo) {
                  id
                  discussionCategories(first: 20) {
                    nodes {
                      id
                      name
                    }
                  }
                }
              }
            `;

            const repoData = await github.graphql(getRepoQuery, {
              owner: context.repo.owner,
              repo: context.repo.repo
            });

            const repositoryId = repoData.repository.id;
            const categories = repoData.repository.discussionCategories.nodes;
            const targetCategory = categories.find(cat => cat.name === process.env.DISCUSSION_CATEGORY);

            if (!targetCategory) {
              console.error('Category not found:', process.env.DISCUSSION_CATEGORY);
              process.exit(1);
            }

            const createDiscussionMutation = `
              mutation($repositoryId: ID!, $categoryId: ID!, $title: String!, $body: String!) {
                createDiscussion(input: {
                  repositoryId: $repositoryId
                  categoryId: $categoryId
                  title: $title
                  body: $body
                }) {
                  discussion {
                    id
                    title
                    url
                  }
                }
              }
            `;

            const result = await github.graphql(createDiscussionMutation, {
              repositoryId: repositoryId,
              categoryId: targetCategory.id,
              title: discussionTitle,
              body: reportContent // use the cleaned content (first line removed)
            });

            console.log('Discussion created successfully:', result.createDiscussion.discussion.url);
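The discussion title hinges on the date regex applied to the report's first line in the step above. An equivalent check in Python — the sample report text is invented for illustration:

```python
import re

# Mirrors dateMatch in the Create Discussion step: a first line like
# "# 20251027_ci_integration-failed-report" yields the date and title.
TITLE_RE = re.compile(r"# (\d{8})_ci_integration-failed-report")

def extract_title(report_text):
    """Return (title, body without the first line), or None if the header is absent."""
    lines = report_text.split("\n")
    m = TITLE_RE.match(lines[0].strip())
    if not m:
        return None
    title = f"{m.group(1)}_ci_integration-failed-report"
    return title, "\n".join(lines[1:]).strip()
```

Dropping the first line afterwards matches the workflow's behaviour: the header exists only so the title can be derived, and should not appear in the published discussion body.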
.github/workflows/static_code_analysis.yml (vendored, 13 lines changed)
@@ -55,16 +55,3 @@ jobs:
        cppcheck --version
        cd ..
        python tools/ci/cpp_check.py check

  # Post CI status to PR comment
  post-ci-status:
    needs: scancode_job
    if: always() && github.event_name == 'pull_request' && github.repository_owner == 'RT-Thread'
    uses: ./.github/workflows/post_ci_status.yml
    with:
      workflow_name: "Static code analysis"
      workflow_status: ${{ needs.scancode_job.result }}
      pr_number: ${{ github.event.pull_request.number }}
    permissions:
      pull-requests: write
      issues: write
.github/workflows/utest_auto_run.yml (vendored, 19 lines changed)
@@ -18,13 +18,6 @@ on:
      - documentation/**
      - '**/README.md'
      - '**/README_zh.md'
  workflow_dispatch:
    inputs:
      trigger_type:
        description: '触发类型'
        required: false
        default: 'manual'
        type: string

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}

@@ -305,15 +298,3 @@ jobs:
          break
        fi
      done

  # Post CI status to PR comment
  post-ci-status:
    needs: test
    if: always() && github.event_name == 'pull_request' && github.repository_owner == 'RT-Thread'
    uses: ./.github/workflows/post_ci_status.yml
    with:
      workflow_name: "utest_auto_run"
      workflow_status: ${{ needs.test.result }}
      pr_number: ${{ github.event.pull_request.number }}
    permissions:
      pull-requests: write
      issues: write
@@ -7,42 +7,3 @@ devices.i2c:
    - CONFIG_RT_USING_I2C=y
    - CONFIG_BSP_USING_I2C=y
    - CONFIG_BSP_USING_I2C0=y
devices.adc:
  <<: *scons
  kconfig:
    - CONFIG_RT_USING_ADC=y
    - CONFIG_BSP_USING_ADC=y
devices.hwtimer:
  <<: *scons
  kconfig:
    - CONFIG_RT_USING_HWTIMER=y
    - CONFIG_BSP_USING_TIMERS=y
    - CONFIG_BSP_USING_TIMER0=y
devices.pdma:
  <<: *scons
  kconfig:
    - CONFIG_RT_USING_PDMA=y
    - CONFIG_BSP_USING_PDMA=y
    - CONFIG_BSP_USING_PDMA_CHANNEL0=y
devices.pwm:
  <<: *scons
  kconfig:
    - CONFIG_RT_USING_PWM=y
    - CONFIG_BSP_USING_PWM=y
    - CONFIG_BSP_USING_PWM0=y
devices.rtc:
  <<: *scons
  kconfig:
    - CONFIG_RT_USING_RTC=y
    - CONFIG_BSP_USING_RTC=y
devices.ts:
  <<: *scons
  kconfig:
    - CONFIG_RT_USING_TS=y
    - CONFIG_BSP_USING_TS=y
devices.wdt:
  <<: *scons
  kconfig:
    - CONFIG_RT_USING_WDT=y
    - CONFIG_BSP_USING_WDT=y
    - CONFIG_BSP_USING_WDT0=y
@@ -13,13 +13,13 @@
 */

/**
- * @addtogroup cortex-m33
+ * @addtogroup cortex-m4
 */
/*@{*/

#include <rtconfig.h>

-.cpu cortex-m33
+.cpu cortex-m4
.syntax unified
.thumb
.text
@@ -1,135 +0,0 @@
#!/usr/bin/env python3
import json
import os
from datetime import datetime, timedelta
from typing import List, Dict, Any

def load_monitoring_results() -> List[Dict[str, Any]]:
    """Load monitoring_results.json."""
    if not os.path.exists("monitoring_results.json"):
        print("No monitoring results found")
        return []
    try:
        with open("monitoring_results.json", "r", encoding="utf-8") as f:
            return json.load(f)
    except (json.JSONDecodeError, OSError) as e:
        print(f"Error loading monitoring_results.json: {e}")
        return []

def get_beijing_time() -> datetime:
    return datetime.utcnow() + timedelta(hours=8)

def format_time(dt: datetime) -> str:
    return dt.strftime("%Y-%m-%d %H:%M")

def classify_error(step_name: str, job_name: str) -> str:
    """Classify the error type from the failing step's name."""
    step_lower = step_name.lower()
    if any(x in step_lower for x in ["test", "suite", "pytest", "unittest"]):
        return "TEST_FAILURE"
    if "lint" in step_lower or "flake8" in step_lower:
        return "LINT_ERROR"
    if "build" in step_lower or "compile" in step_lower:
        return "BUILD_ERROR"
    if "deploy" in step_lower or "upload" in step_lower or "publish" in step_lower:
        return "DEPLOY_ERROR"
    if "check" in step_lower or "validate" in step_lower or "verify" in step_lower:
        return "VALIDATION_ERROR"
    if "generate" in step_lower or "render" in step_lower:
        return "GENERATION_ERROR"
    return "UNKNOWN"
def generate_report():
|
||||
"""生成符合最新样式的故障聚合报告"""
|
||||
results = load_monitoring_results()
|
||||
if not results:
|
||||
return
|
||||
|
||||
failed_workflows = [r for r in results if r.get('conclusion') == 'failure']
|
||||
if not failed_workflows:
|
||||
print("No failed workflows to report")
|
||||
return
|
||||
|
||||
now = get_beijing_time()
|
||||
date_str = now.strftime("%Y%m%d")
|
||||
|
||||
# 时间范围
|
||||
created_times = [
|
||||
datetime.fromisoformat(r["created_at"].replace("Z", "+00:00")) + timedelta(hours=8)
|
||||
for r in failed_workflows
|
||||
]
|
||||
updated_times = [
|
||||
datetime.fromisoformat(r["updated_at"].replace("Z", "+00:00")) + timedelta(hours=8)
|
||||
for r in failed_workflows
|
||||
]
|
||||
start_time = min(created_times)
|
||||
end_time = max(updated_times)
|
||||
|
||||
total = len(results)
|
||||
failed_count = len(failed_workflows)
|
||||
success_rate = 0.0 if total == 0 else round((total - failed_count) / total * 100, 1)
|
||||
|
||||
# === 第一行:用于 JS 提取标题(必须)===
|
||||
report = f"# {date_str}_ci_integration-failed-report\n\n"
|
||||
|
||||
# === 第二行:用户看到的主标题(H1)===
|
||||
report += f"# 🚨 {date_str} GitHub Actions 故障聚合报告 | Incident Aggregate Report\n\n"
|
||||
|
||||
# === 执行概览 ===
|
||||
report += f"## 执行概览 | Executive Summary\n"
|
||||
report += f"- **监控时间范围 | Monitoring Period**: {format_time(start_time)}–{format_time(end_time)} (UTC+8)\n"
|
||||
report += f"- **检测到失败运行 | Failed Runs Detected**: {failed_count}个\n"
|
||||
report += f"- **成功率 | Success Rate**: {success_rate}% \n\n"
|
||||
|
||||
# === 故障详情 ===
|
||||
report += f"## 🔍 故障详情 | Failure Details\n\n"
|
||||
|
||||
for wf in failed_workflows:
|
||||
run_id = wf.get("run_id", "N/A")
|
||||
name = wf["name"]
|
||||
html_url = wf.get("html_url", "#")
|
||||
details = wf.get("failure_details", [])
|
||||
|
||||
report += f"**📌 Run-{run_id}** | [{name}]({html_url})\n"
|
||||
|
||||
if not details:
|
||||
report += "└─ 无失败作业详情 | No details of failed jobs\n\n"
|
||||
continue
|
||||
|
||||
failed_jobs = [j for j in details if j.get("steps")]
|
||||
for i, job in enumerate(failed_jobs):
|
||||
job_name = job["name"]
|
||||
steps = job["steps"]
|
||||
job_prefix = "└─" if i == len(failed_jobs) - 1 else "├─"
|
||||
report += f"{job_prefix} **失败作业 | Failed Job**: {job_name}\n"
|
||||
|
||||
for j, step in enumerate(steps):
|
||||
step_name = step["name"]
|
||||
step_num = step["number"]
|
||||
error_type = classify_error(step_name, job_name)
|
||||
step_prefix = " └─" if j == len(steps) - 1 else " ├─"
|
||||
report += f"{step_prefix} **失败步骤 | Failed Step**: {step_name} (Step {step_num})\n"
|
||||
indent = " " if j == len(steps) - 1 else " │ "
|
||||
report += f"{indent}**错误类型 | Error Type**: `{error_type}`\n"
|
||||
report += "\n"
|
||||
|
||||
# === Team Collaboration & Support ===
|
||||
report += f"## 👥 团队协作与支持 | Team Collaboration & Support\n\n"
|
||||
report += f"请求维护支持:本报告需要RT-Thread官方团队的专业经验进行审核与指导。 \n"
|
||||
report += f"Call for Maintenance Support: This report requires the expertise of the RT-Thread official team for review and guidance.\n\n"
|
||||
report += f"提审负责人:@Rbb666 @kurisaW\n"
|
||||
report += f"Requested Reviewers from RT-Thread: @Rbb666 @kurisaW\n\n"
|
||||
report += f"烦请尽快关注此事,万分感谢。 \n"
|
||||
report += f"Your prompt attention to this matter is greatly appreciated.\n"
|
||||
|
||||
# 保存
|
||||
try:
|
||||
with open("failure_details.md", "w", encoding="utf-8") as f:
|
||||
f.write(report.rstrip() + "\n")
|
||||
print("Report generated: failure_details.md")
|
||||
print(f"Report size: {os.path.getsize('failure_details.md')} bytes")
|
||||
except Exception as e:
|
||||
print(f"Error writing report: {e}")
|
||||
|
||||
if __name__ == "__main__":
|
||||
generate_report()
|
||||
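The `classify_error` heuristic above keys only on keywords in the step name, which makes its behavior easy to spot-check; note that a step like "Checkout code" matches the "check" keyword. A self-contained copy of the heuristic with a few sample inputs:

```python
def classify_error(step_name: str, job_name: str) -> str:
    # Same keyword heuristic as the deleted report script above.
    step_lower = step_name.lower()
    if any(x in step_lower for x in ["test", "suite", "pytest", "unittest"]):
        return "TEST_FAILURE"
    if "lint" in step_lower or "flake8" in step_lower:
        return "LINT_ERROR"
    if "build" in step_lower or "compile" in step_lower:
        return "BUILD_ERROR"
    if "deploy" in step_lower or "upload" in step_lower or "publish" in step_lower:
        return "DEPLOY_ERROR"
    if "check" in step_lower or "validate" in step_lower or "verify" in step_lower:
        return "VALIDATION_ERROR"
    if "generate" in step_lower or "render" in step_lower:
        return "GENERATION_ERROR"
    return "UNKNOWN"

print(classify_error("Run utest suite", "utest"))    # TEST_FAILURE
print(classify_error("scons build bsp", "build"))    # BUILD_ERROR
print(classify_error("Upload artifacts", "deploy"))  # DEPLOY_ERROR
print(classify_error("Checkout code", "setup"))      # VALIDATION_ERROR ("check" matches "checkout")
```

The last case shows why ordering matters: earlier, more specific keyword groups win over the broad "check" group.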
@@ -1,227 +0,0 @@
#!/usr/bin/env python3
import os
import json
import requests
import time
import sys
from datetime import datetime, timezone


def monitor_workflows(github_token, repo, workflow_names, start_time):
    """Monitor the given workflow runs."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    monitoring_results = []

    for workflow_name in workflow_names:
        print(f"\n=== Monitoring {workflow_name} ===")

        try:
            workflow_id = get_workflow_id(github_token, repo, workflow_name)
            if not workflow_id:
                monitoring_results.append({
                    "name": workflow_name,
                    "status": "error",
                    "conclusion": "error",
                    "error": "Workflow not found"
                })
                continue

            # Look for runs created after start_time
            runs = get_recent_runs(github_token, repo, workflow_id, start_time)

            if not runs:
                print(f"No runs found for {workflow_name} after {start_time}")
                # Fall back to the latest run of this workflow, if any
                all_runs = get_all_runs(github_token, repo, workflow_id, 10)
                if all_runs:
                    latest_run = all_runs[0]
                    print(f"Using latest run instead: {latest_run['id']} created at {latest_run['created_at']}")
                    result = monitor_single_run(github_token, repo, latest_run["id"], workflow_name)
                    monitoring_results.append(result)
                else:
                    monitoring_results.append({
                        "name": workflow_name,
                        "status": "not_found",
                        "conclusion": "not_found",
                        "error": f"No runs found after {start_time}"
                    })
            else:
                # Monitor the run that was found
                run_to_monitor = runs[0]  # take the most recent one
                print(f"Monitoring run: {run_to_monitor['id']}")
                result = monitor_single_run(github_token, repo, run_to_monitor["id"], workflow_name)
                monitoring_results.append(result)

        except Exception as e:
            print(f"Error monitoring {workflow_name}: {str(e)}")
            monitoring_results.append({
                "name": workflow_name,
                "status": "error",
                "conclusion": "error",
                "error": str(e)
            })

    return monitoring_results


def get_all_runs(github_token, repo, workflow_id, per_page=10):
    """Fetch recent runs of a workflow."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    url = f"https://api.github.com/repos/{repo}/actions/workflows/{workflow_id}/runs"
    params = {"per_page": per_page}

    response = requests.get(url, headers=headers, params=params)
    if response.status_code == 200:
        return response.json()["workflow_runs"]
    return []


def get_recent_runs(github_token, repo, workflow_id, start_time):
    """Fetch runs created after start_time."""
    all_runs = get_all_runs(github_token, repo, workflow_id, 10)
    start_time_dt = datetime.fromisoformat(start_time.replace('Z', '+00:00'))

    recent_runs = []
    for run in all_runs:
        run_time = datetime.fromisoformat(run["created_at"].replace('Z', '+00:00'))
        if run_time >= start_time_dt:
            recent_runs.append(run)

    return recent_runs


def monitor_single_run(github_token, repo, run_id, workflow_name):
    """Monitor a single run until it completes or the wait times out."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    max_wait_time = 1800  # 30 minutes
    check_interval = 30
    start_time = time.time()

    print(f"Monitoring {workflow_name} (run {run_id})")

    while time.time() - start_time < max_wait_time:
        url = f"https://api.github.com/repos/{repo}/actions/runs/{run_id}"
        response = requests.get(url, headers=headers)

        if response.status_code != 200:
            print(f"Error getting run status: {response.status_code}")
            time.sleep(check_interval)
            continue

        run_data = response.json()
        status = run_data["status"]
        conclusion = run_data.get("conclusion")

        print(f"  {workflow_name}: status={status}, conclusion={conclusion}")

        if status == "completed":
            result = {
                "name": workflow_name,
                "run_id": run_id,
                "status": status,
                "conclusion": conclusion,
                "html_url": run_data["html_url"],
                "created_at": run_data["created_at"],
                "updated_at": run_data["updated_at"]
            }

            if conclusion == "failure":
                result["failure_details"] = get_failure_logs(github_token, repo, run_id)

            return result

        time.sleep(check_interval)

    # Timed out
    return {
        "name": workflow_name,
        "run_id": run_id,
        "status": "timed_out",
        "conclusion": "timed_out",
        "html_url": f"https://github.com/{repo}/actions/runs/{run_id}",
        "error": "Monitoring timed out after 30 minutes"
    }


def get_failure_logs(github_token, repo, run_id):
    """Collect the failed jobs and steps of a run."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    try:
        jobs_url = f"https://api.github.com/repos/{repo}/actions/runs/{run_id}/jobs"
        jobs_response = requests.get(jobs_url, headers=headers)

        failure_details = []

        if jobs_response.status_code == 200:
            jobs_data = jobs_response.json()["jobs"]
            for job in jobs_data:
                if job["conclusion"] == "failure":
                    job_info = {
                        "name": job["name"],
                        "steps": []
                    }

                    for step in job["steps"]:
                        if step["conclusion"] == "failure":
                            job_info["steps"].append({
                                "name": step["name"],
                                "number": step["number"]
                            })

                    failure_details.append(job_info)

        return failure_details
    except Exception as e:
        print(f"Error getting failure logs: {e}")
        return []


def get_workflow_id(github_token, repo, workflow_name):
    """Look up a workflow ID by name."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    url = f"https://api.github.com/repos/{repo}/actions/workflows"
    response = requests.get(url, headers=headers)

    if response.status_code == 200:
        workflows = response.json()["workflows"]
        for workflow in workflows:
            if workflow["name"] == workflow_name:
                return workflow["id"]
    return None


def main():
    github_token = os.getenv("GITHUB_TOKEN")
    repo = os.getenv("GITHUB_REPOSITORY")
    workflows_json = os.getenv("TARGET_WORKFLOWS")
    start_time = sys.argv[1] if len(sys.argv) > 1 else datetime.now(timezone.utc).isoformat()

    if not all([github_token, repo, workflows_json]):
        raise ValueError("Missing required environment variables")

    workflows = json.loads(workflows_json)
    results = monitor_workflows(github_token, repo, workflows, start_time)

    with open("monitoring_results.json", "w") as f:
        json.dump(results, f, indent=2)

    print(f"\n=== Monitoring Summary ===")
    for result in results:
        status_icon = "✅" if result.get("conclusion") == "success" else "❌" if result.get("conclusion") == "failure" else "⚠️"
        print(f"{status_icon} {result['name']}: {result.get('conclusion', 'unknown')}")


if __name__ == "__main__":
    main()

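The `replace('Z', '+00:00')` dance that appears throughout these scripts exists because GitHub returns timestamps like `2024-05-01T10:30:00Z`, and `datetime.fromisoformat` on Python versions before 3.11 rejects the trailing `Z`. A minimal sketch of the run-filtering comparison (timestamps are illustrative):

```python
from datetime import datetime

def run_is_after(created_at: str, start_time: str) -> bool:
    # GitHub returns e.g. "2024-05-01T10:30:00Z"; fromisoformat on
    # Python < 3.11 rejects the trailing "Z", so swap it for "+00:00".
    run_time = datetime.fromisoformat(created_at.replace("Z", "+00:00"))
    start = datetime.fromisoformat(start_time.replace("Z", "+00:00"))
    return run_time >= start

print(run_is_after("2024-05-01T10:30:00Z", "2024-05-01T10:00:00Z"))  # True
print(run_is_after("2024-05-01T09:59:59Z", "2024-05-01T10:00:00Z"))  # False
```

Both operands are converted the same way, so the comparison is between two timezone-aware datetimes and never mixes naive and aware values.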
@@ -1 +0,0 @@
requests>=2.25.1

@@ -1,98 +0,0 @@
#!/usr/bin/env python3
import os
import json
import requests
import time
from datetime import datetime, timezone


def trigger_workflow_directly(workflow_name, github_token, repo):
    """Trigger a workflow via the workflow_dispatch API."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    # First resolve the workflow ID
    workflow_id = get_workflow_id(github_token, repo, workflow_name)
    if not workflow_id:
        print(f"✗ Workflow '{workflow_name}' not found")
        return False

    # Dispatch directly through the workflow_dispatch API
    dispatch_url = f"https://api.github.com/repos/{repo}/actions/workflows/{workflow_id}/dispatches"

    # Adjust inputs to match what the target workflow actually declares
    dispatch_data = {
        "ref": "master",
        "inputs": {
            "trigger_type": "scheduled"  # input declared by the target workflow
        }
    }

    try:
        print(f"Triggering workflow: {workflow_name} (ID: {workflow_id})")
        response = requests.post(dispatch_url, headers=headers, json=dispatch_data)

        if response.status_code == 204:
            print(f"✓ Successfully triggered workflow: {workflow_name}")
            return True
        else:
            print(f"✗ Failed to trigger {workflow_name}: {response.status_code}")
            print(f"Response: {response.text}")
            return False

    except Exception as e:
        print(f"✗ Error triggering {workflow_name}: {str(e)}")
        return False


def get_workflow_id(github_token, repo, workflow_name):
    """Look up a workflow ID by name."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    url = f"https://api.github.com/repos/{repo}/actions/workflows"
    response = requests.get(url, headers=headers)

    if response.status_code == 200:
        workflows = response.json()["workflows"]
        for workflow in workflows:
            if workflow["name"] == workflow_name:
                return workflow["id"]
        print(f"Available workflows: {[w['name'] for w in workflows]}")
    else:
        print(f"Failed to get workflows: {response.status_code}")

    return None


def main():
    github_token = os.getenv("GITHUB_TOKEN")
    repo = os.getenv("GITHUB_REPOSITORY")
    workflows_json = os.getenv("TARGET_WORKFLOWS")

    if not all([github_token, repo, workflows_json]):
        raise ValueError("Missing required environment variables")

    try:
        workflows = json.loads(workflows_json)
    except json.JSONDecodeError:
        raise ValueError("Invalid TARGET_WORKFLOWS JSON format")

    print(f"Directly triggering {len(workflows)} workflows...")

    success_count = 0
    for i, workflow in enumerate(workflows):
        success = trigger_workflow_directly(workflow, github_token, repo)
        if success:
            success_count += 1

        # Pause between triggers
        if i < len(workflows) - 1:
            print("Waiting 10 seconds before next trigger...")
            time.sleep(10)

    print(f"Triggering completed: {success_count}/{len(workflows)} successful")


if __name__ == "__main__":
    main()

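The dispatch call above POSTs a JSON body with a `ref` and an `inputs` object to the GitHub "create a workflow dispatch event" endpoint, which answers 204 on success and rejects inputs the target workflow does not declare. A sketch of assembling the request pieces without performing any network I/O (the repo, workflow ID, and input values are illustrative):

```python
import json

def build_dispatch(repo: str, workflow_id: int, ref: str, inputs: dict):
    # URL and body shape for the GitHub REST workflow_dispatch endpoint;
    # the actual POST is done with requests in the script above.
    url = f"https://api.github.com/repos/{repo}/actions/workflows/{workflow_id}/dispatches"
    body = {"ref": ref, "inputs": inputs}
    return url, json.dumps(body)

url, body = build_dispatch("RT-Thread/rt-thread", 12345, "master",
                           {"trigger_type": "scheduled"})
print(url)
print(body)
```

Separating request construction from transport like this makes the payload easy to unit-test before pointing the script at a live repository.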
@@ -1,113 +0,0 @@
#!/usr/bin/env python3
import os
import json
import requests
import time
import sys
from datetime import datetime, timezone


def wait_for_workflows_to_appear(github_token, repo, workflow_names, start_time, max_wait=300):
    """Wait for the workflows to show up in the API."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    print(f"Waiting for {len(workflow_names)} workflows to appear...")
    print(f"Start time: {start_time}")
    print(f"Max wait time: {max_wait} seconds")

    found_workflows = set()
    start_timestamp = time.time()

    while time.time() - start_timestamp < max_wait:
        all_found = True

        for workflow_name in workflow_names:
            if workflow_name in found_workflows:
                continue

            workflow_id = get_workflow_id(github_token, repo, workflow_name)
            if not workflow_id:
                print(f"Workflow {workflow_name} not found, skipping")
                found_workflows.add(workflow_name)
                continue

            # Check whether a new run has appeared
            runs = get_recent_runs(github_token, repo, workflow_id, start_time)
            if runs:
                print(f"✓ Found new run for {workflow_name}: {runs[0]['id']}")
                found_workflows.add(workflow_name)
            else:
                print(f"⏳ Waiting for {workflow_name}...")
                all_found = False

        if all_found:
            print("✓ All workflows have started!")
            return True

        time.sleep(10)  # poll every 10 seconds

    print("⚠️ Timeout waiting for workflows to appear")
    print(f"Found {len(found_workflows)} out of {len(workflow_names)} workflows")
    return False


def get_workflow_id(github_token, repo, workflow_name):
    """Look up a workflow ID by name."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    url = f"https://api.github.com/repos/{repo}/actions/workflows"
    response = requests.get(url, headers=headers)

    if response.status_code == 200:
        workflows = response.json()["workflows"]
        for workflow in workflows:
            if workflow["name"] == workflow_name:
                return workflow["id"]
    return None


def get_recent_runs(github_token, repo, workflow_id, start_time):
    """Fetch runs created after start_time."""
    headers = {
        "Authorization": f"token {github_token}",
        "Accept": "application/vnd.github.v3+json"
    }

    url = f"https://api.github.com/repos/{repo}/actions/workflows/{workflow_id}/runs"
    params = {"per_page": 5}

    response = requests.get(url, headers=headers, params=params)
    if response.status_code != 200:
        return []

    runs = response.json()["workflow_runs"]
    start_time_dt = datetime.fromisoformat(start_time.replace('Z', '+00:00'))

    recent_runs = []
    for run in runs:
        run_time = datetime.fromisoformat(run["created_at"].replace('Z', '+00:00'))
        if run_time >= start_time_dt:
            recent_runs.append(run)

    return recent_runs


def main():
    github_token = os.getenv("GITHUB_TOKEN")
    repo = os.getenv("GITHUB_REPOSITORY")
    workflows_json = os.getenv("TARGET_WORKFLOWS")
    start_time = sys.argv[1] if len(sys.argv) > 1 else datetime.now(timezone.utc).isoformat()

    if not all([github_token, repo, workflows_json]):
        raise ValueError("Missing required environment variables")

    workflows = json.loads(workflows_json)
    success = wait_for_workflows_to_appear(github_token, repo, workflows, start_time)

    if not success:
        print("Proceeding anyway, some workflows may not be detected...")


if __name__ == "__main__":
    main()
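Both the monitor and the wait script share the same poll-until-deadline loop: check a condition, sleep a fixed interval, give up after a maximum wait. The pattern distilled, with the clock injected so the loop can be exercised without actually sleeping (function and variable names here are illustrative, not from the scripts):

```python
def poll_until(check, clock, max_wait, interval=30):
    """Call check() until it returns a truthy value or max_wait elapses.

    clock() is injected (time.time in the real scripts) so the loop is
    testable; the real loop sleeps `interval` seconds between attempts.
    """
    start = clock()
    while clock() - start < max_wait:
        result = check()
        if result:
            return result
        # time.sleep(interval) goes here in production
    return None

# Fake clock advancing 10 "seconds" per call; check succeeds on attempt 3.
ticks = iter(range(0, 10_000, 10))
calls = {"n": 0}

def check():
    calls["n"] += 1
    return "done" if calls["n"] >= 3 else None

result = poll_until(check, lambda: next(ticks), max_wait=120)
print(result, calls["n"])  # done 3
```

Returning `None` on timeout (rather than raising) mirrors how the scripts degrade gracefully: the monitor records a `timed_out` result, and the wait script proceeds anyway.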