This repository was archived by the owner on Dec 7, 2021. It is now read-only.

Added converters argument to some optimization algorithms #1260

Merged
merged 19 commits into from
Oct 5, 2020

Conversation

a-matsuo
Contributor

Summary

Added a converters argument to GroverOptimizer, RecursiveMinimumEigenOptimizer, and MinimumEigenOptimizer to specify the quadratic program converters to be used.

Details and comments

By adding a converters argument, users can specify which quadratic program converters are applied. This allows users to plug their own converters into the optimization algorithms.
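To make the pattern concrete, here is a minimal pure-Python sketch of an optimizer that accepts user-defined converters. The class and method names loosely mirror the Qiskit Aqua API under discussion, but this is a standalone illustration with hypothetical names, not the actual implementation:

```python
# Hypothetical sketch of the "converters argument" pattern discussed in
# this PR. Names are illustrative, not the real Qiskit Aqua API.

class IdentityConverter:
    """A trivial user-defined converter: leaves the problem unchanged."""
    def convert(self, problem):
        return problem

    def interpret(self, result):
        return result


class MinimumEigenOptimizerSketch:
    def __init__(self, converters=None):
        # Accept a single converter or a list of converters.
        if converters is None:
            converters = []
        elif not isinstance(converters, (list, tuple)):
            converters = [converters]
        self._converters = list(converters)

    def solve(self, problem):
        # Apply converters in order, then undo them in reverse on the result.
        for conv in self._converters:
            problem = conv.convert(problem)
        result = problem  # placeholder for the actual eigensolver step
        for conv in reversed(self._converters):
            result = conv.interpret(result)
        return result


opt = MinimumEigenOptimizerSketch(converters=IdentityConverter())
print(opt.solve({"objective": "x + y"}))
```

The point of the design is that any object exposing `convert`/`interpret` can be dropped in, so users are not limited to the converters shipped with the library.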

@woodsp-ibm
Member

In #1266 it was observed that the penalty parameter for the minimum_eigen_optimizer is not used/applied at all (I think it should have been set on the default internal QUBO converter that was constructed, but it was not). Anton figured that since this PR updates the converters, it would be a better place to fix that issue; the referenced PR has more discussion, but I think the above captures the essence.

@a-matsuo
Contributor Author

I checked #1266. I added the penalty parameter to MinimumEigenOptimizer so it could be passed to the ProblemToQubo converter, but apparently I only added the parameter and forgot to actually pass it through.

Anyway, I don't think we need to add getters/setters for penalty on MinimumEigenOptimizer. The penalty parameter belongs to the ProblemToQubo converter. If users want to change the penalty value, they can do something like algorithm.converter.penalty_factor = <new value> and explicitly set the properties of the converters themselves.
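The argument here can be sketched as follows. Assuming a converter that owns a penalty parameter (the class and attribute names below are hypothetical stand-ins, not the real Qiskit Aqua API), the user configures the converter directly, so the optimizer does not need its own setter:

```python
# Hypothetical illustration: the penalty lives on the converter, so users
# configure it there rather than via a setter on the optimizer class.

class QuadraticProgramToQuboSketch:
    """Stand-in for a QP-to-QUBO converter with a penalty parameter."""
    def __init__(self, penalty=None):
        # None means "derive a penalty automatically during conversion".
        self.penalty = penalty


converter = QuadraticProgramToQuboSketch()
converter.penalty = 10.0  # user sets the penalty on the converter itself
print(converter.penalty)
```

Keeping the knob on the converter avoids duplicating every converter option on every algorithm class.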

@woodsp-ibm
Member

Regarding the penalty setter: do you want to comment in the other PR that it should be removed, if that is what you think is best?

I will note, though, that for your example of algorithm.converter.penalty_factor I did not see any converter property on the algorithm to retrieve this. In practice, converters can be a single one or a list, and not all converters support penalty, so it's a bit more complex. The setter that was in the other PR, I would have thought, dealt more with the default QP2Qubo converter that gets created; maybe the default will always be on autoconvert. Certainly there is still the penalty param in the constructor of minimum_eigen_optimizer that is never used, something that should be taken care of however you want to handle penalty.
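The complication noted above (single converter vs. a list, and only some converters exposing a penalty) can be sketched like this. All names are illustrative, not the real Qiskit Aqua API; the `hasattr` guard is one assumed way to skip converters without a penalty:

```python
# Sketch of handling a single converter or a list, where only some
# converters expose a penalty parameter. Names are hypothetical.

class QuboConverterSketch:
    def __init__(self):
        self.penalty = None  # this converter supports a penalty


class IntegerToBinarySketch:
    pass  # this converter has no penalty parameter


def apply_penalty(converters, penalty):
    """Set penalty on every converter that supports it; return those touched."""
    # Normalize to a list so a single converter and a list are handled alike.
    if not isinstance(converters, (list, tuple)):
        converters = [converters]
    touched = []
    for conv in converters:
        if hasattr(conv, "penalty"):
            conv.penalty = penalty
            touched.append(conv)
    return touched


convs = [QuboConverterSketch(), IntegerToBinarySketch()]
print(len(apply_penalty(convs, 5.0)))  # only the QUBO converter is touched
```

This is why a simple penalty setter on the algorithm is awkward: it would have to know which of the user-supplied converters the value actually applies to.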

@a-matsuo
Contributor Author

Sure, I will comment in the other PR and pass the penalty to the default QuadraticProgramToQubo.

algorithm.converter.penalty_factor was just an example. What I wanted to say is that users can explicitly set the parameters of the converters themselves. Since we are changing the algorithms to accept users' own converters, adding setters/getters for converters in the algorithm classes would be complex.

@a-matsuo a-matsuo requested a review from levbishop as a code owner September 29, 2020 16:03
@woodsp-ibm
Member

In the GroverOptimizer I noticed this

# TODO: Utilize Grover's incremental feature - requires changes to Grover.

Given that Grover has been updated, did that update accommodate whatever changes were required by that TODO? Since this is not really related to this PR, maybe it should be addressed in a separate PR, to ensure that the changes to Grover done for this release meet the needs there.

@a-matsuo
Contributor Author

a-matsuo commented Oct 3, 2020

Hmm, if we use the sample_from_iterations option of Grover, it might be possible to replace the process below the #TODO. I didn't write that TODO, so I'm not sure; let me ask Julien.
By the way, I'm planning to open a new PR for minor fixes to Grover (e.g. I forgot to update some docstrings). I'll address this TODO in that PR, #1309.

@woodsp-ibm woodsp-ibm merged commit 18b0901 into qiskit-community:master Oct 5, 2020
manoelmarques added a commit to qiskit-community/qiskit-optimization that referenced this pull request Jan 14, 2021
…ommunity/qiskit-aqua#1260)

* added converter argument to opt algorithms

* now penalty parameter is passed properly

* added docstrings for penalty

Co-authored-by: Manoel Marques <[email protected]>
Co-authored-by: Cryoris <[email protected]>
Co-authored-by: Steve Wood <[email protected]>