
Tests: Add Codecov Test Analytics support#7

Merged
ScarletKuro merged 2 commits into master from copilot/add-codecov-test-analytics-support
Jan 16, 2026

Conversation

Contributor

Copilot AI commented Jan 16, 2026

Enables Codecov Test Analytics to track test results, failures, and trends across CI runs.

Changes

  • Test project: Added JunitXml.TestLogger v4.1.0 to generate JUnit XML reports
  • Build script: Modified Test target to output JUnit XML to artifacts/test-results/ alongside existing coverage reports
  • CI workflow: Added codecov/test-results-action@v1 step to upload test results with !cancelled() condition to capture failures
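The test-project change boils down to a single package reference; in the test project's .csproj it would look like this (a sketch, assuming a standard SDK-style project file):

```xml
<ItemGroup>
  <!-- Emits JUnit-format XML when dotnet test is run with the junit logger -->
  <PackageReference Include="JunitXml.TestLogger" Version="4.1.0" />
</ItemGroup>
```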

Implementation

JUnit logger configured inline with dotnet test:

DotNetTest(_ => _
    .SetProjectFile(project.Path)
    .SetConfiguration(Configuration)
    .EnableNoBuild()
    .EnableCollectCoverage()
    .SetCoverletOutputFormat(CoverletOutputFormat.cobertura)
    .SetCoverletOutput(CoverageDirectory / $"{project.Name}.xml")
    .SetLoggers($"junit;LogFilePath={TestResultsDirectory / $"{project.Name}.xml"}")
);

Test results upload in CI after test execution:

- name: 'Upload test results to Codecov'
  if: ${{ !cancelled() }}
  uses: codecov/test-results-action@v1
  with:
    token: ${{ secrets.CODECOV_TOKEN }}
    files: artifacts/test-results/*.xml
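To illustrate what the files glob picks up: the Test target writes one JUnit file per test project into artifacts/test-results/, and the wildcard uploads all of them (project names below are hypothetical):

```shell
# Simulate the layout produced by the build script: one JUnit XML per
# test project under artifacts/test-results/ (names are made up).
mkdir -p artifacts/test-results
printf '<testsuites/>' > artifacts/test-results/MyLib.Tests.xml
printf '<testsuites/>' > artifacts/test-results/MyLib.IntegrationTests.xml

# The workflow's files glob expands to every per-project report:
ls artifacts/test-results/*.xml
```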

Original prompt

Add support for codecov "Test Analytics"

The documentation shows these steps:

Step 1: Output a JUnit XML file in your CI

Select the framework below to generate a JUnit XML file that contains the results of your test run.

npm i --save-dev jest-junit 
JEST_JUNIT_CLASSNAME="{filepath}" jest --reporters=jest-junit

Step 2: Add the script codecov/test-results-action@v1 to your CI YAML file.

In your CI YAML file, add below scripts to the end of your test run.

- name: Upload test results to Codecov
  if: ${{ !cancelled() }}
  uses: codecov/test-results-action@v1
  with:
    token: ${{ secrets.CODECOV_TOKEN }}

This action will download the Codecov CLI, and upload the junit.xml file generated in the previous step to Codecov.
Step 3: Run your test suite

You can inspect the workflow logs to see if the call to Codecov succeeded.

Run your tests as usual. You need to have some failed tests to view the failed tests report.
Step 4: View results and insights

After the test run completes, you'll be able to see the failed-test results in the following areas:

    GitHub pull request comment
    Failed Tests dashboard

The only catch is that there is no native JUnit report support for NUnit, so the docs suggest doing it this way (keep in mind this example is for CI Insights, not Codecov):

This guide shows how to generate JUnit reports from your NUnit tests and upload them to CI Insights using a GitHub Actions workflow.
Generate a JUnit Report with NUnit

NUnit can generate JUnit-compatible XML reports using the built-in test adapter and various loggers available for dotnet test.
Using dotnet test with JUnit Logger

Install the JUnit test logger:

dotnet add package JunitXml.TestLogger

Run tests with JUnit output:

dotnet test --logger "junit;LogFilePath=junit.xml"
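For reference, the junit.xml that the logger writes follows the common JUnit schema. A minimal hand-written sketch (suite, class, and test names are invented):

```shell
# A small JUnit-style report, written by hand for illustration only.
cat > junit.xml <<'EOF'
<?xml version="1.0" encoding="utf-8"?>
<testsuites>
  <testsuite name="MyLib.Tests" tests="2" failures="1">
    <testcase classname="MyLib.Tests.MathTests" name="Adds" time="0.01" />
    <testcase classname="MyLib.Tests.MathTests" name="Divides" time="0.02">
      <failure message="Expected 2 but was 3" />
    </testcase>
  </testsuite>
</testsuites>
EOF

# Count <testcase> elements (one per line here) -> prints 2
grep -c '<testcase' junit.xml
```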

Using NUnit Console Runner

If you’re using the NUnit Console Runner directly:

nunit3-console.exe your-test.dll --result=junit.xml;format=junit

Using NUnit Test Adapter

Add the NUnit test adapter to your test project:

dotnet add package NUnit3TestAdapter

Then run:

dotnet test --logger junit

Using Multiple Test Result Formats

You can generate multiple formats simultaneously:

dotnet test --logger "console;verbosity=detailed" --logger "junit;LogFilePath=junit.xml" --logger "trx;LogFileName=results.trx"

Using NUnit with Custom Settings

You can also configure test settings in a .runsettings file, for example (a sketch of a logger configuration that writes to junit.xml):

<RunSettings>
  <LoggerRunSettings>
    <Loggers>
      <Logger friendlyName="junit">
        <Configuration>
          <LogFilePath>junit.xml</LogFilePath>
        </Configuration>
      </Logger>
    </Loggers>
  </LoggerRunSettings>
</RunSettings>

Then run:

dotnet test --settings test.runsettings

Update Your GitHub Actions Workflow

Note
Ensure your application key with ci scope (MERGIFY_TOKEN) is set up as explained in the installation docs.

After generating the JUnit report, add a step to upload the results to CI Insights using the mergifyio/gha-mergify-ci action.

For example, in your workflow file:

- name: Run NUnit Tests and Generate JUnit Report
  continue-on-error: true
  run: dotnet test --logger "junit;LogFilePath=junit.xml"

- name: Mergify CI Upload
  if: success() || failure()
  uses: mergifyio/gha-mergify-ci@v8
  with:
    token: ${{ secrets.MERGIFY_TOKEN }}
    report_path: junit.xml

Key Points:

  • if: success() || failure(): runs the upload step even if tests fail, ensuring CI Insights has the full report.
  • report_path: junit.xml: points to where your JUnit file is located. Make sure it matches the path you set in your CI job.

If you use a job matrix in your workflow (e.g., to test across multiple versions), ensure you set the job_name input (or MERGIFY_JOB_NAME environment variable) so CI Insights can properly distinguish reports for each matrix job.

For example, with:

jobs:
  example_matrix:
    strategy:
      matrix:
        version: [10, 12, 14]

Your upload step should look like:

- name: Mergify CI Upload
  if: success() || failure()
  uses: mergifyio/gha-mergify-ci@v8
  with:
    job_name: example_matrix (${{ matrix.version }})
    token: ${{ secrets.MERGIFY_TOKEN }}
    report_path: junit.xml

Did you know?

Setting job_name helps CI Insights group and display reports correctly for each matrix variation.

In order to benefit from CI Insights Quarantine, you need to add continue-on-error: true to the GitHub Actions step that executes your tests and generates the JUnit report.



Co-authored-by: ScarletKuro <19953225+ScarletKuro@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Add support for codecov test analytics" to "Add Codecov Test Analytics support" Jan 16, 2026
Copilot AI requested a review from ScarletKuro January 16, 2026 14:39
@ScarletKuro ScarletKuro marked this pull request as ready for review January 16, 2026 14:40
@ScarletKuro ScarletKuro changed the title from "Add Codecov Test Analytics support" to "Tests: Add Codecov Test Analytics support" Jan 16, 2026

codecov bot commented Jan 16, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 92.30%. Comparing base (25be883) to head (ade0d25).
⚠️ Report is 1 commit behind head on master.
✅ All tests successful. No failed tests found.

Additional details and impacted files
@@           Coverage Diff           @@
##           master       #7   +/-   ##
=======================================
  Coverage   92.30%   92.30%           
=======================================
  Files          13       13           
  Lines         156      156           
  Branches       35       35           
=======================================
  Hits          144      144           
  Misses          7        7           
  Partials        5        5           

☔ View full report in Codecov by Sentry.

@ScarletKuro ScarletKuro merged commit 5d7bafb into master Jan 16, 2026
3 checks passed
@ScarletKuro ScarletKuro deleted the copilot/add-codecov-test-analytics-support branch January 16, 2026 14:42