Fix: Unexpected failing of vitest in PR workflows (Urgent) #3601

Merged
merged 3 commits into from
Feb 16, 2025

Conversation

@JaiPannu-IITI commented Feb 14, 2025

Closes issue #3600

Change Implemented

  • A waitFor timeout larger than the vitest config allows was being used in the spec file for rendering; it has been reduced (see the sketch below)
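
For context, a minimal sketch of the shape of this change, assuming @testing-library/react's waitFor and the jest-dom toHaveTextContent matcher; the test body and identifiers below are illustrative, not the project's exact spec code:

import { screen, waitFor } from '@testing-library/react';
import { expect, it } from 'vitest';

it('Search by Category name', async () => {
  // ...render the Actions screen and trigger the category search here (omitted)...
  await waitFor(
    () => {
      // The real spec asserts on the rendered assignee name; this mirrors that shape.
      expect(screen.getAllByTestId('assigneeName')[0]).toHaveTextContent('Teresa Bradley');
    },
    // was { timeout: 10000 }, which matched vitest's own test timeout and left no headroom
    { timeout: 2500 },
  );
});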

Summary by CodeRabbit

  • Tests
    • Optimized internal testing parameters to streamline automated quality checks (this update does not impact user experience).

Contributor

coderabbitai bot commented Feb 14, 2025

Walkthrough

The pull request adjusts a unit test in the Actions.spec.tsx file by reducing the timeout value in the waitFor function from 10,000 milliseconds to 2,500 milliseconds for the "Search by Category name" test case. No other changes or modifications to functionality were introduced.

Changes

File: src/screens/.../Actions.spec.tsx
Change Summary: Reduced the waitFor timeout from 10,000 ms to 2,500 ms in the "Search by Category name" test case

Possibly related PRs

Suggested labels

ignore-sensitive-files-pr

Suggested reviewers

  • varshith257
  • palisadoes

Poem

Hop along the test lane, light and free,
I trimmed the timeout with a tweak of glee,
From 10k to 2.5k, a skip so spry,
A rabbit’s hop in code flying by,
With whiskers twitched and spirit bright—
CodeRabbit sings in the moonlit night!


📜 Recent review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 02400f3 and 74bfef7.

📒 Files selected for processing (1)
  • src/screens/UserPortal/Volunteer/Actions/Actions.spec.tsx (1 hunks)
🔇 Additional comments (1)
src/screens/UserPortal/Volunteer/Actions/Actions.spec.tsx (1)

217-223:

❓ Verification inconclusive

Verify if reducing the timeout aligns with the PR objectives.

The PR objectives mention adjusting the timeout to exceed vitest's timeout, but this change reduces the timeout from 10,000ms to 2,500ms. This reduction might make the test more flaky if the search operation occasionally takes longer.

Let's verify vitest's default timeout and check if there are any timeout-related configurations in the project:

Consider these alternatives:

  1. If the test consistently passes within 2,500ms:

    await waitFor(
      () => {
        const assigneeName = screen.getAllByTestId('assigneeName');
        expectElementToHaveTextContent(assigneeName[0], 'Teresa Bradley');
      },
    -  { timeout: 2500 },
    +  { timeout: vitest.getConfig().testTimeout || 5000 },
    );
  2. If the test needs more time:

    await waitFor(
      () => {
        const assigneeName = screen.getAllByTestId('assigneeName');
        expectElementToHaveTextContent(assigneeName[0], 'Teresa Bradley');
      },
    -  { timeout: 2500 },
    +  { timeout: 15000 },
    );

🏁 Script executed:

#!/bin/bash
# Description: Check for vitest timeout configurations

# Search for vitest.config files
echo "Searching for vitest config files..."
fd -e ts -e js -e mjs -e json -g "*vitest*config*"

# Search for timeout settings in package.json
echo "Checking package.json for vitest settings..."
fd -g "package.json" -x cat {} | jq -r '.vitest // empty'

# Search for inline timeout configurations
echo "Checking for other timeout configurations..."
rg -A 2 "timeout.*[0-9]+" --type ts --type tsx

Length of output: 487


Attention: Confirm the Adequacy of the Reduced Timeout Value

The PR objectives indicate that the timeout should be set to exceed Vitest’s default test timeout. However, reducing the wait time from 10,000ms to 2,500ms—without an accompanying configuration change—raises concerns about potential test flakiness if operations occasionally run longer than 2,500ms.

  • The vitest config file (vitest.config.ts) exists, but no explicit timeout override was found in either it or the package configuration.
  • The default Vitest timeout (commonly 5,000ms) is likely higher than 2,500ms, which suggests that the current waitFor timeout might be insufficient.

Recommendation:

  • Verify: Double-check that the 2,500ms timeout is indeed appropriate for the test’s performance expectations. If not, consider:
    • Basing the timeout on Vitest’s configuration (e.g., using vitest.getConfig().testTimeout || 5000)
    • Increasing the timeout value to accommodate operations that might occasionally exceed 2,500ms.

Example alternative implementation:

await waitFor(
  () => {
    const assigneeName = screen.getAllByTestId('assigneeName');
    expectElementToHaveTextContent(assigneeName[0], 'Teresa Bradley');
  },
-  { timeout: 2500 },
+  { timeout: vitest.getConfig().testTimeout || 5000 },
);

Please review the actual timeout behavior (perhaps by checking the vitest.config.ts in more detail or verifying against actual test execution metrics) and adjust the value accordingly.




Our Pull Request Approval Process

Thanks for contributing!

Testing Your Code

Remember, your PRs won't be reviewed until these criteria are met:

  1. We don't merge PRs with poor code quality.
    1. Follow coding best practices such that CodeRabbit.ai approves your PR.
  2. We don't merge PRs with failed tests.
    1. When tests fail, click on the Details link to learn more.
    2. Write sufficient tests for your changes (CodeCov Patch Test). Your test coverage must exceed the repository's target threshold.
    3. Tests may fail if you edit sensitive files. Ask to add the ignore-sensitive-files-pr label if the edits are necessary.
  3. We cannot merge PRs with conflicting files. These must be fixed.

Our policies make our code better.

Reviewers

Do not assign reviewers. Our Queue Monitors will review your PR and assign them.
When your PR has been assigned reviewers, contact them to get your code reviewed and approved via:

  1. comments in this PR or
  2. our slack channel

Reviewing Your Code

Your reviewer(s) will have the following roles:

  1. arbitrators of future discussions with other contributors about the validity of your changes
  2. point of contact for evaluating the validity of your work
  3. person who verifies whether matching issues opened by others should be closed.
  4. person who gives general guidance in fixing your tests

CONTRIBUTING.md

Read our CONTRIBUTING.md file. Most importantly:

  1. PRs with issues not assigned to you will be closed by the reviewer
  2. Fix the first comment in the PR so that each issue listed automatically closes

Other

  1. 🎯 Please be considerate of our volunteers' time. Contacting the person who assigned the reviewers is not advised unless they ask for your input. Do not @ the person who did the assignment otherwise.
  2. Read the CONTRIBUTING.md file.

@JaiPannu-IITI changed the title from "Fix: Excessive runtime" to "Fix: Unexpected failing of vitest in PR workflows (Urgent)" on Feb 14, 2025
@JaiPannu-IITI
Author

@coderabbitai full review

Contributor

coderabbitai bot commented Feb 14, 2025

✅ Actions performed

Full review triggered.


codecov bot commented Feb 14, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 85.76%. Comparing base (825a2fc) to head (6b0703d).
Report is 1 commit behind head on develop-postgres.

Additional details and impacted files
@@                 Coverage Diff                  @@
##           develop-postgres    #3601      +/-   ##
====================================================
+ Coverage             85.75%   85.76%   +0.01%     
====================================================
  Files                   358      358              
  Lines                  9030     9030              
  Branches               1933     1933              
====================================================
+ Hits                   7744     7745       +1     
  Misses                  924      924              
+ Partials                362      361       -1     

☔ View full report in Codecov by Sentry.

@Cioppolo14
Contributor

There isn’t an issue assigned to you for this PR. Please follow the guidelines in our PR_GUIDELINES.md file. We have the procedures in place so that everyone has a fair chance of contributing. I will be closing this pull request. Please follow the procedures and resubmit when ready.

@Cioppolo14 closed this on Feb 15, 2025
@JaiPannu-IITI
Author

JaiPannu-IITI commented Feb 15, 2025

I apologize for that; I just did it to help others with failing PRs.

@JaiPannu-IITI
Author

@palisadoes PR is updated and ready for review.

@palisadoes merged commit 2ae1b19 into PalisadoesFoundation:develop-postgres on Feb 16, 2025
19 checks passed
@palisadoes
Contributor

Why did reducing the timeout fix the issue?

@JaiPannu-IITI
Author

Why did reducing the timeout fix the issue?

Sir, the timeout of 10,000 ms is exactly equal to the maximum limit allowed in the vitest config, so the test sometimes fails and sometimes passes depending on server latency. A timeout of 2,500 ms is enough for even heavier frontend components to render, and it leaves sufficient time (7,500 ms) for the rest of the test to execute.

I tried to prevent this problem earlier too: if you remember, in my codeRefactoring v1 PR I increased the vitest timeout from the default (5,000 ms) to 10,000 ms, because at that time I encountered the same problem in an even greater number of tests. I calculated that 10,000 ms would be enough for 1,600 tests, with a maximum of 25 test cases per file, which gives testers enough room to write good tests. I have no idea why this test's author used a 10,000 ms timeout just to render a simple component.
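
For readers wondering where these two numbers live, a minimal sketch of a suite-level timeout in vitest.config.ts, assuming the standard Vitest configuration API; the file contents are illustrative rather than copied from the repository:

// vitest.config.ts (illustrative sketch, not the project's actual file)
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    // Per-test ceiling: every test() must finish within this budget.
    // The discussion above says this was raised from the 5,000 ms default to 10,000 ms.
    testTimeout: 10_000,
  },
});

With that ceiling, a waitFor timeout of 2,500 ms inside a test leaves roughly 7,500 ms of headroom, whereas a waitFor timeout of 10,000 ms consumes the entire budget and lets Vitest abort the test while it is still waiting.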
