
Add chance to draw local constants in HypothesisProvider #4356


Merged — 22 commits merged into HypothesisWorks:master, Apr 17, 2025

Conversation

@tybug (Member) commented Apr 10, 2025

I'll be back with tests tomorrow, but figured it was worth PRing this for review.

An alternative to passing the local constants to PrimitiveProvider is to store them in BuildContext. Downside: ConjectureData instance behavior is no longer isolated and depends on the surrounding context, which makes it harder to test. Upside: PrimitiveProvider implementers don't need to adjust their interface to access the constants; they can just opt in to the BuildContext.

That said, CrosshairProvider uses def __init__(self, *args, **kwargs), so this shouldn't actually be a breaking change for them.
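
For illustration, a minimal sketch of the constructor-argument approach — the names here (LocalConstantsProvider, the local_constants keyword) are invented for the sketch, not the merged interface:

# Sketch only: illustrative names, not the real Hypothesis interface.
class LocalConstantsProvider:
    def __init__(self, conjecturedata, *, local_constants=()):
        # The engine passes the constants it collected from the code
        # under test when it constructs the provider.
        self.conjecturedata = conjecturedata
        self.local_constants = tuple(local_constants)

# A provider written like CrosshairProvider, with a catch-all
# signature, tolerates the new keyword argument without changes:
class TolerantProvider:
    def __init__(self, *args, **kwargs):
        self.local_constants = kwargs.get("local_constants", ())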

@tybug tybug requested review from DRMacIver and Zac-HD as code owners April 10, 2025 03:58
@Zac-HD (Member) commented Apr 10, 2025

Looks pretty sensible to me! The only design change I'd make: instead of passing local_constants as an argument to the provider, let the provider collect them at construction time from whatever helper function we provide. My intuition is that few third-party providers will want to use this; it's basically a hypothesis-and-hypofuzz internals thing.
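
A sketch of that alternative, with local_constants() standing in for whatever helper function we'd expose (the name is hypothetical):

def local_constants():
    # Hypothetical helper: stands in for whatever function Hypothesis
    # would provide to fetch constants collected from the code under test.
    return {0, 1, "", b"hello"}

class HelperBasedProvider:
    def __init__(self, conjecturedata):
        # The provider fetches the constants itself at construction
        # time, so the provider interface does not change at all.
        self.conjecturedata = conjecturedata
        self.constants = local_constants()

Third-party providers that don't care about constants would simply never call the helper.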

@tybug (Member, Author) commented Apr 11, 2025

The crosshair failures (e.g.) are from the latest crosshair version. I added a crosshair version pin and a corresponding task to update it.

Looking into the other, actually-relevant failures on this branch.

@tybug (Member, Author) commented Apr 11, 2025

Maybe we should split the probabilities for GLOBAL_CONSTANTS and local_constants, so that we still have a good chance of drawing weird floats and strings even in the presence of very many local constants.
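
To illustrate the idea (all probabilities and names here are made up for the sketch):

import random

GLOBAL_CONSTANTS = ["", "\x00", 0.0, float("inf"), float("nan")]

def draw_value(local_constants, fresh_draw, rng=random):
    # Split chances: global and local constants each get their own
    # fixed probability, so a huge local pool cannot crowd out the
    # weird global values (odd floats, tricky strings).
    r = rng.random()
    if r < 0.02:
        return rng.choice(GLOBAL_CONSTANTS)
    if r < 0.04 and local_constants:
        return rng.choice(list(local_constants))
    return fresh_draw()

With a single combined pool, thousands of local constants would make any particular global constant vanishingly unlikely to be drawn.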

@tybug (Member, Author) commented Apr 12, 2025

I've tracked down the pareto test failure to a potentially very old latent bug (manual front removal bypassing listeners) that was masked by our deterministic_PRNG context manager. Refactoring the test here rejigs the PRNG and hides the failure again.

I think we should merge this knowing that the test is incorrect, and follow up in the next PR. I have a fix, but I want to write a new test for it and make sure it's correct:

diff --git i/hypothesis-python/src/hypothesis/internal/conjecture/pareto.py w/hypothesis-python/src/hypothesis/internal/conjecture/pareto.py
index 73f3e6a53..c282fa36e 100644
--- i/hypothesis-python/src/hypothesis/internal/conjecture/pareto.py
+++ w/hypothesis-python/src/hypothesis/internal/conjecture/pareto.py
@@ -248,7 +248,7 @@ class ParetoFront:
                 i -= 1
 
             for v in to_remove:
-                self.__remove(v)
+                self._remove(v)
             return data in self.front
         finally:
             self.__pending = None
@@ -272,7 +272,7 @@ class ParetoFront:
     def __len__(self) -> int:
         return len(self.front)
 
-    def __remove(self, data: ConjectureResult) -> None:
+    def _remove(self, data: ConjectureResult) -> None:
         try:
             self.front.remove(data)
         except ValueError:
@@ -338,7 +338,7 @@ class ParetoOptimiser:
                     # the front, or it was not added to it because it was
                     # dominated by something in it.
                     try:
-                        self.front.front.remove(source)
+                        self.front._remove(source)
                     except ValueError:
                         pass
                     return True
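
For context on the rename: __remove is name-mangled to _ParetoFront__remove, so ParetoOptimiser can't call it from outside the class — which is presumably why it reached into self.front.front directly, bypassing the eviction listeners. A single leading underscore keeps the method private by convention while making it callable. A minimal demonstration of the mangling rule:

class Front:
    def __init__(self):
        self.items = [1]

    def __remove(self, item):  # mangled to _Front__remove
        self.items.remove(item)

f = Front()
# f.__remove(1) raises AttributeError outside the class; only the
# mangled spelling works, hence the rename to a single underscore.
f._Front__remove(1)
assert f.items == []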

@Zac-HD (Member) left a comment

(one comment, will have more later in the week)

@Zac-HD (Member) left a comment

Only minor comments, considering the sheer size of this PR. I eagerly anticipate the many new bugs this will uncover downstream... merge when addressed to your satisfaction.

Comment on lines +1 to +3
crosshair-tool==0.0.85
hypothesis-crosshair==0.0.20
-c test.in # match test.in attrs pin
@Zac-HD (Member) commented

I have a bad feeling about these pins. Can we just list hypothesis-crosshair (no pin) and -r test.in, and then install from crosshair.txt without test.txt?

Also happy to merge this PR and follow up later.
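
Concretely, the unpinned file might look something like this (sketch only):

hypothesis-crosshair  # no version pin
-r test.in            # include the shared test requirements, rather than only constraining against them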

@tybug (Member, Author) commented Apr 17, 2025

I'll follow up with this tomorrow, if only to unblock CI for other PRs!

@tybug (Member, Author) commented Apr 17, 2025

I'm happy with this PR from my end, but I'm about to sleep and there's no need to rush things. I'll merge tomorrow, or you're welcome to merge as well :)

@Zac-HD Zac-HD enabled auto-merge April 17, 2025 08:06
@Zac-HD Zac-HD merged commit ee7487b into HypothesisWorks:master Apr 17, 2025
118 of 119 checks passed
@tybug tybug deleted the constant-pool branch April 17, 2025 16:29