
Add Effective Dimension class #364


Merged · 126 commits · Apr 21, 2022

Conversation

@ElePT (Collaborator) commented Apr 4, 2022

Summary

This PR closes #340 by adding the `LocalEffectiveDimension` class introduced in https://arxiv.org/abs/2112.04807, using the original experiments' code as a reference: https://github.com/amyami187/effective_dimension.

Details and comments

The local effective dimension shares all of the logic of the global effective dimension algorithm; the only difference is the restriction on the number of parameter sets used. However, it was agreed in discussion that it is conceptually distinct enough to warrant its own class.

Please let me know if you disagree with any variable names.

@ElePT ElePT added the Changelog: New Feature Include in the Added section of the changelog label Apr 4, 2022
@ElePT ElePT self-assigned this Apr 4, 2022
Comment on lines 41 to 42
```python
weight_samples: Union[List[float], np.ndarray, int] = 1,
input_samples: Union[List[float], np.ndarray, int] = 1,
```
Member:

I brought this up before, but I am not sure it's resolved. The type hint here says `List[float]`, but should it be `List[List[float]]`, since this is really a two-dimensional affair, i.e. an array of neural network parameters (weights) of shape (num_weight_samples, num_weights), or an int? It could be reshaped from a flat list, but I did not see that happen.

What I see in the setter, for instance, is this:

```python
len(weight_samples.shape) > 2
or weight_samples.shape[1] is not self._model.num_weights
```

That checks whether the number of dimensions (the length of the shape) is greater than 2, but if it is 1 the check will fail on the next line: it indexes `shape[1]`, which is out of bounds for a flat list.

I suggested adding unit tests that pass in plain lists, to make sure things run as expected.
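The failure mode described above can be sketched as follows. This is an illustration, not the PR's actual code; `num_weights` and the validator names are assumptions:

```python
import numpy as np

num_weights = 3  # assumed weight count of the model


def validate_buggy(weight_samples):
    """Mirrors the shape check quoted above."""
    weight_samples = np.asarray(weight_samples)
    # A flat list gives shape (n,): the "> 2" branch is False, so the
    # second operand indexes shape[1] and raises IndexError.
    if len(weight_samples.shape) > 2 or weight_samples.shape[1] != num_weights:
        raise ValueError("bad shape")
    return weight_samples


def validate_fixed(weight_samples):
    """One possible fix: promote flat input to shape (1, num_weights) first."""
    weight_samples = np.atleast_2d(np.asarray(weight_samples, dtype=float))
    if weight_samples.ndim > 2 or weight_samples.shape[1] != num_weights:
        raise ValueError("bad shape")
    return weight_samples
```

With this, `validate_buggy([0.1, 0.2, 0.3])` raises `IndexError` instead of a clean validation error, while `validate_fixed` accepts the same flat list and reshapes it to `(1, 3)`.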

Collaborator:

While you are absolutely right, I'm not sure I want to keep `List` in the type hints at all. They tend to cause problems, especially when multidimensional arrays are required. I'd prefer to drop support for `List` entirely in favor of plain `np.ndarray`. I think I know where this type hint originates: `NeuralNetwork` has the same list type hint, and it is also incorrect, unfortunately. In `NeuralNetwork` a multidimensional array is expected, usually of shape (num_samples, num_features).

Therefore, `input_samples` may be a single value, a plain list, or at least a list of lists. I could extend the type hints, but I'd prefer to remove list support, and this is a good chance to start. There's an old issue to consider as well: #159.
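The `Union[np.ndarray, int]` interface being discussed could be normalized in one place, roughly like this. This is a sketch under assumptions; the sampling range and names are illustrative, not the class's actual behaviour:

```python
from typing import Union

import numpy as np

rng = np.random.default_rng(seed=42)
num_weights = 3  # assumed weight count of the model


def coerce_weight_samples(weight_samples: Union[np.ndarray, int]) -> np.ndarray:
    """Accept either an explicit (num_samples, num_weights) array,
    or an int requesting that many randomly drawn parameter sets."""
    if isinstance(weight_samples, int):
        # Sampling uniformly in [0, 1) is an assumption for this sketch.
        return rng.uniform(0, 1, size=(weight_samples, num_weights))
    # Everything else is coerced to a float ndarray, no list hint needed.
    return np.asarray(weight_samples, dtype=float)
```

Callers then get a consistent 2-D array whether they pass `coerce_weight_samples(5)` or an explicit array.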

Collaborator:

OK, I removed lists from the type hints.

Comment on lines 134 to 137
```python
effdim1 = global_ed1.get_effective_dimension(self.n)
effdim2 = global_ed2.get_effective_dimension(self.n)

self.assertAlmostEqual(effdim1, effdim2, 5)
```
Member:

This asserts that two results are the same. In my mind it would be good to compare at least one of them against an expected reference value, to ensure they don't both go bad together without us ever noticing. I see this is done in other tests too, but I will just comment here.

Collaborator:

Added an explicit assert.
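An explicit reference assertion might look like the sketch below. The reference value, the stand-in results, and the test name are all hypothetical; in a real test the pinned value would come from a trusted run:

```python
import unittest


class EffectiveDimensionReferenceTest(unittest.TestCase):
    # Hypothetical pinned value, recorded once from a trusted computation.
    EXPECTED_EFFDIM = 4.2

    def test_against_reference(self):
        # Stand-ins for global_ed1/global_ed2.get_effective_dimension(self.n).
        effdim1 = 4.2000004
        effdim2 = 4.1999997
        # Compare the two implementations to each other...
        self.assertAlmostEqual(effdim1, effdim2, places=5)
        # ...and also pin one of them to the recorded reference, so both
        # cannot drift together unnoticed.
        self.assertAlmostEqual(effdim1, self.EXPECTED_EFFDIM, places=5)
```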

Comment on lines 180 to 181
```python
print(effdim1)
print(effdim2)
```
Member:

We should not leave debugging print statements in the code.

Collaborator:

Yes, I forgot to remove the prints while I was working on the tests.

```python
# input setter uses self._model
self.input_samples = input_samples  # type: ignore

def num_weights(self) -> int:
```
Member:

This ends up being a method, i.e. you would call it as `ed.num_weights()`, versus accessing it as a property on the neural network, i.e. `self._model.num_weights`. Would we prefer a property getter here, if this functionality is needed at all?
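A property-based version would look roughly like this. The class and model below are minimal stand-ins for illustration, not the PR's actual code:

```python
class FakeModel:
    """Hypothetical stand-in for the neural network given to the constructor."""

    num_weights = 3


class EffectiveDimensionSketch:
    """Sketch: expose the model's weight count as a read-only property."""

    def __init__(self, qnn):
        self._model = qnn

    @property
    def num_weights(self) -> int:
        """Number of trainable weights of the underlying network."""
        return self._model.num_weights
```

With `@property`, callers write `ed.num_weights` with no parentheses, matching how the attribute reads on the network itself.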

Collaborator:

I'm not sure it is required. I removed this method.

Comment on lines 79 to 80
```python
"""Returns the dimension of the model according to the definition from [1]."""
return self._model.num_weights
```
Member:

"Returns the dimension of the model..." So far we have passed a `NeuralNetwork` into the constructor as `qnn`. Yes, it is stored as `_model`, but to understand what "model" means here I have to read the reference; it is not what we call `qnn` in the constructor docstring. This is also called `num_weights`, not `dim` or `dimension`. I think something more like this would be better: "Returns the number of weights of the supplied neural network. This is the dimension of the model according to the definition from [1]." That explains it in terms of the code and states the correspondence to the paper, rather than making the paper's definition the main thing. Also, since we pass in `qnn` and can read `num_weights` from it directly, is this indirect access really useful?

Collaborator:

I believe num_weights can be easily retrieved from qnn, so I don't see a real need for this method.


```python
@weight_samples.setter
def weight_samples(self, weight_samples: Union[List[float], np.ndarray, int]) -> None:
    """Sets network parameters (weights)."""
```
Member:

The input samples setter/getter is more direct and says "Sets network input samples." Would this setter and the getter above be more direct as "Sets network weight samples."? Why do we need the term "parameters" here, only to qualify it with "(weights)" for it to be understood in context?

Collaborator:

Updated.

```python
weight_samples = np.asarray(weight_samples)
if (
    len(weight_samples.shape) > 2
    or weight_samples.shape[1] is not self._model.num_weights
```
Member:

`is not` - do we mean `!=` here?

Collaborator:

Oh, good point, of course it must be !=.
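To illustrate why identity comparison is wrong here: `is` checks object identity, not value equality. CPython happens to cache small integers (roughly -5 to 256, an implementation detail), so `is` can appear to work for small weight counts and then silently break for larger ones. A minimal sketch:

```python
import numpy as np

# Shape elements come back from NumPy as freshly created Python ints.
shape = np.zeros((10, 300)).shape
num_weights = 300

# Value equality is always the correct comparison here.
assert shape[1] == num_weights

# shape[1] is num_weights   <- identity may be False even when the values
# match, because 300 lies outside CPython's small-int cache. Relying on
# this is an implementation detail, never a correctness guarantee.
```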

@adekusar-drl adekusar-drl merged commit 6a7c8c0 into qiskit-community:main Apr 21, 2022
oscar-wallis pushed a commit that referenced this pull request Feb 16, 2024
* Add eff dim class and demo

* Refactor get_fisher

* Refactor get_fisher

* Refactor docstrings

* Update effdim examples

* Add mock runtime code and demo

* Add effdim runtime examples

* Refactor variable names, remove loops

* Add unit tests

* add eff dim class

* add unit tests with fixed random seed

* Remove tutorial

* Remove runtime code

* Fix lint

* Fix style

* Fix date

* Add reno

* add reno for effective dim.

* Fix mypy

* For updating local branch

* Remove unnecesary files

* Restore files

* Fix spelling and tipe hinting

* Address style comments

* Add sphinx class refs.

* Reno effective dimension

* Modify random seed

* Update docstring

Co-authored-by: Anton Dekusar <[email protected]>

* Update docstring

Co-authored-by: Anton Dekusar <[email protected]>

* Address docstring comment (d)

Co-authored-by: Anton Dekusar <[email protected]>

* Apply suggestion

Co-authored-by: Anton Dekusar <[email protected]>

* Change method names

* Add more verbose callback

* Remove d (num_weights), add return shapes

* Address getter/setter feedback

* Fix local eff dim setter

* Address unittest feedback

* Fix formatting

* Add references

* Fix lint

* Refactor docstrings

* Add to pylintdict

* Bugfix

* Add expressibility

* Fix callable type

* Attempt to remove tutorial

* Remove tutorial

* Fix signature

* Apply suggestion reference

Co-authored-by: dlasecki <[email protected]>

* Apply suggestion doctoring

Co-authored-by: dlasecki <[email protected]>

* Apply suggestion unpack

Co-authored-by: dlasecki <[email protected]>

* Update test/algorithms/test_effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update effective_dimension.py

* For undoing commit

* Revert "Apply suggestion doctoring"

This reverts commit 4b15c48.

* Revert unpack commit

* Apply algorithm globals change

* Add classes to init

* Add check to local eff dim setter

* Remove seed

* Add class to init

* Commit changes

* Changes to test

* Fix checks

* Fix mypy

* Remove local num params

* Add classification example

* Add classification example

* Update qiskit_machine_learning/algorithms/__init__.py

Co-authored-by: Steve Wood <[email protected]>

* Change num_inputs, num_params

* Update qiskit_machine_learning/algorithms/effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update qiskit_machine_learning/algorithms/effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update qiskit_machine_learning/algorithms/effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update qiskit_machine_learning/algorithms/effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update qiskit_machine_learning/algorithms/effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update qiskit_machine_learning/algorithms/effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update qiskit_machine_learning/algorithms/effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update qiskit_machine_learning/algorithms/effective_dimension.py

Co-authored-by: dlasecki <[email protected]>

* Update test/algorithms/test_effective_dimension.py

Co-authored-by: Anton Dekusar <[email protected]>

* Adapt unit tests for num_inputs, num_params

* Remove tutorial

* Change location, add callback to local eff dim

* Bugfix after applying suggestions

* Fix style

* Add ED to init docstring

* Fix type assignment

* Remove ED from algorithms init

* Fix black

* Add ED to init

* Add Eff. Dim reno

* Add input unit tests

* Change vars names

* Fix style

* Modify callbacks, add input checks

* Fix black

* Add monte and carlo

* Fix copyright

* Modify param setter check

* Fix style

* Fix mypy

* change namings

* refactor tests, fix sphinx

* more renamings

* Update qiskit_machine_learning/neural_networks/effective_dimension.py

Co-authored-by: Steve Wood <[email protected]>

* Update qiskit_machine_learning/neural_networks/effective_dimension.py

Co-authored-by: Steve Wood <[email protected]>

* Update test/neural_networks/test_effective_dimension.py

Co-authored-by: Steve Wood <[email protected]>

* Update qiskit_machine_learning/neural_networks/effective_dimension.py

Co-authored-by: Steve Wood <[email protected]>

* Update qiskit_machine_learning/neural_networks/effective_dimension.py

Co-authored-by: Steve Wood <[email protected]>

* code review

* fix black

* fix !=

* Update qiskit_machine_learning/neural_networks/effective_dimension.py

Co-authored-by: Steve Wood <[email protected]>

* remove lists from type hints

* Update qiskit_machine_learning/neural_networks/effective_dimension.py

Co-authored-by: Steve Wood <[email protected]>

Co-authored-by: Anton Dekusar <[email protected]>
Co-authored-by: dlasecki <[email protected]>
Co-authored-by: Amira Abbas <[email protected]>
Co-authored-by: Steve Wood <[email protected]>
Co-authored-by: Anton Dekusar <[email protected]>