Incremental check for coding standards in pull requests #2936
Conversation
Can you share some examples of how this looks in real life?
Downloads are not part of API rate limiting; no rate limit headers are present (302 to AWS).
@Majkl578 see the most recent build failure. If the entire file were checked, the report would look like the output of phpcs lib/Doctrine/DBAL/Connection.php.
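For illustration only (both commands are taken from elsewhere in this thread; the ref range is the one used in the .travis.yml snippet further down):

# Full-file check: every violation in the file is reported, including untouched lines
phpcs lib/Doctrine/DBAL/Connection.php
# Incremental check: the phar limits the report to lines changed in the given range
php git-phpcs.phar origin/$TRAVIS_BRANCH...$TRAVIS_PULL_REQUEST_SHA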
This script uses an encrypted key. According to the documentation, the encrypted key will not work for pull requests, so if there's a rate-limiting problem, I'll have to rework the script to use an actual clone of the git repository rather than the API (the original design didn't account for CI like Travis and didn't require a local repository clone).
I like this concept. We just need to find out how likely we are to hit this API limit; I'm not really sure how to do it in a deterministic way.
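Not part of the original discussion, but one way to get a rough deterministic answer would be to query GitHub's rate-limit endpoint from the build itself and compare the remaining quota with the number of files a pull request touches; a minimal sketch:

# /rate_limit reports the current quota and does not count against it;
# unauthenticated clients get 60 requests per hour per IP
curl -s https://api.github.com/rate_limit
# the same information comes back as headers on any API response:
curl -sI https://api.github.com/repos/doctrine/dbal | grep -i '^x-ratelimit'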
I wouldn't consider this particular case a false positive. If a line is modified, it should be formatted according to the standards, even if it's not modified entirely. This way we'll achieve better compliance without producing extra diffs. Of course, some false positives are inevitable but we should be able to cope with them.
It appears that for Travis the limits are the same as for other unauthenticated clients. Unless there's a way to authenticate Travis to the API, we cannot use this approach. We'll need to get the diff and blobs from the local clone.
This is really inefficient. Why use the API when you can just do a git diff on Travis?
@ostrolucky because it was designed long ago to not require a local clone. Depending on the repo and pull request, it may be much more efficient to just pull a couple of files using the API instead of cloning a repository with a ten-year-old history. I've almost finished re-implementing it to use a local repo.
Yes, I can see this being useful for other use cases, though not for a CI. Good to hear you are working on a more efficient solution! Here's some old code from a CI we use that might help you.
I don't have much experience with Travis, but I believe all CIs work by cloning the project. In the case of Travis, it seems to be a clone with a depth of the last 50 commits, so you don't need to do an additional clone; the git repo should be available right there. That's why I'm saying this is really inefficient.
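As a rough sketch of that diff-based idea, assuming the default Travis checkout and a phpcs binary installed via Composer (the file filter and vendor path are illustrative, not from this PR):

# List PHP files the PR adds, copies, modifies or renames relative to the target branch,
# then run phpcs on just those files (xargs -r skips the run when the list is empty)
git diff --name-only --diff-filter=ACMR origin/$TRAVIS_BRANCH...$TRAVIS_PULL_REQUEST_SHA -- '*.php' \
  | xargs -r ./vendor/bin/phpcs

Note that this still reports whole files rather than just the changed lines, which is the refinement git-phpcs.phar adds on top.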
@Majkl578 @ostrolucky please see the updated patch and build failure. |
LGTM 👍, and once we finish doctrine/coding-standard#9 this may be great for PRs until we enforce CS globally in the repo. |
Looks like ORM is not going to need this after all: develop will be merged into master in the upcoming days, and we plan to apply CS globally to the whole repo afterwards, followed by increasing the PHPStan level. But it can still be useful here and in other repos that don't yet adhere to our CS. :)
.travis.yml
Outdated
php: 7.2
script:
  - wget https://github.com/diff-sniffer/git/releases/download/0.1.0/git-phpcs.phar
  - php git-phpcs.phar origin/$TRAVIS_BRANCH...$TRAVIS_PULL_REQUEST_SHA
Careful here: Travis clone operations are generally shallow, which means that this may fail after 100 commits or even fewer.
From the documentation:
Rather than test the commits that have been pushed to the branch the pull request is from, we test the merge between the origin and the upstream branch.
I was under the impression that both origin/$TRAVIS_BRANCH and $TRAVIS_PULL_REQUEST_SHA would be immediate parents of the merge commit, so they should both be at the second level of depth. Not sure how a shallow clone handles multiple parents, though.
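Not from the thread, but this is easy to verify in the build itself; a quick sketch using plain git on the Travis checkout:

# On a pull_request build, HEAD is the merge commit Travis created;
# its parents should be the target branch tip and the PR head
git cat-file -p HEAD | grep '^parent'
# equivalently, resolve both parents directly:
git rev-parse "HEAD^1" "HEAD^2"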
In theory, there may indeed be a problem when the merge base of the two parents is more than 50 commits behind either of them. So even if we fetch the merge base manually, there's no guarantee that all commits in between are available. Hm…
if: type = pull_request
php: 7.2
script:
  - git merge-base origin/$TRAVIS_BRANCH $TRAVIS_PULL_REQUEST_SHA || git fetch origin +refs/pull/$TRAVIS_PULL_REQUEST/merge --unshallow
@Ocramius what about this? The non-shallow clone of this PR's branch is ~18 MB, which isn't that bad, and it should be fetched only in edge cases.
Works for me 👌
By the way, Travis allows configuring the clone depth or even disabling the shallow clone entirely: https://docs.travis-ci.com/user/customizing-the-build/#Git-Clone-Depth
@Majkl578 it does, but a deep clone by default would be overkill, and we don't know the exact depth we need for a given pull request. The current solution is a trade-off between performance and simplicity. The default depth of 50 will be enough in ~99% of cases; in the rest, we fetch the missing commits.
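For reference, the setting mentioned above lives in .travis.yml; a minimal example (the values are illustrative, see the linked docs):

git:
  depth: 150      # fetch more history than the default 50
# or disable the shallow clone entirely:
# git:
#   depth: false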
WIP. It's unclear whether we'll need the key to access the GitHub API without exceeding the rate limit.
Fixes #2917.