
Will you provide the pretrained model? #1

Open · GuideWsp opened this issue Aug 6, 2020 · 4 comments

@GuideWsp commented Aug 6, 2020

No description provided.

@xiezw5 (Owner) commented Aug 6, 2020

Yes, it will be released soon.

@aligoglos commented

Please upload it to Google Drive.

@ABDOELSHEMY commented

Please upload the pretrained models to Google Drive or somewhere else that is easy to download from, thank you.

@zhihongp commented

Ran into problems with the pre-trained models when running ./test_models_pc.sh cdc_x2_test ./CDC_test.py ./models/HGSR-MHR_X2_CDC.pth 2

Could you help double-check? By the way, the x4 version seems to work, but x3 has a load_state_dict issue too.

Initializing DataSet, image list: ./DRealSR/Test_x2 ...
Found 83 HR 83 LR ...
Build Generator Net...
Loading from DataParallel module......
Traceback (most recent call last):
File "../TorchTools/TorchNet/tools.py", line 178, in load_weights
model.load_state_dict(model_weights, strict=strict)
File "./lib/python3.6/site-packages/torch/nn/modules/module.py", line 845, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for HourGlassNetMultiScaleInt:
Missing key(s) in state_dict: "flat_map.2.1.weight", "flat_map.2.1.bias", "flat_map.4.0.weight", "flat_map.4.0.bias", "edge_map.2.1.weight", "edge_map.2.1.bias", "edge_map.4.0.weight", "edge_map.4.0.bias", "corner_map.2.1.weight", "corner_map.2.1.bias", "corner_map.4.0.weight", "corner_map.4.0.bias", "upsample_flat.2.1.weight", "upsample_flat.2.1.bias", "upsample_flat.4.0.weight", "upsample_flat.4.0.bias", "upsample_edge.2.1.weight", "upsample_edge.2.1.bias", "upsample_edge.4.0.weight", "upsample_edge.4.0.bias", "upsample_corner.2.1.weight", "upsample_corner.2.1.bias", "upsample_corner.4.0.weight", "upsample_corner.4.0.bias".
Unexpected key(s) in state_dict: "flat_map.2.0.weight", "flat_map.2.0.bias", "edge_map.2.0.weight", "edge_map.2.0.bias", "corner_map.2.0.weight", "corner_map.2.0.bias", "upsample_flat.2.0.weight", "upsample_flat.2.0.bias", "upsample_edge.2.0.weight", "upsample_edge.2.0.bias", "upsample_corner.2.0.weight", "upsample_corner.2.0.bias".
size mismatch for flat_map.3.0.weight: copying a param with shape torch.Size([3, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 64, 3, 3]).
size mismatch for flat_map.3.0.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for edge_map.3.0.weight: copying a param with shape torch.Size([3, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 64, 3, 3]).
size mismatch for edge_map.3.0.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for corner_map.3.0.weight: copying a param with shape torch.Size([3, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 64, 3, 3]).
size mismatch for corner_map.3.0.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for upsample_flat.3.0.weight: copying a param with shape torch.Size([3, 64, 3, 3]) from checkpoint, the shape in current model is torch.Size([64, 64, 3, 3]).
size mismatch for upsample_flat.3.0.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for upsample_edge.3.0.weight: copying a param with shape torch.Size([3, 64, 3, 3]) from checkpoint, the shape in current model is torch.Size([64, 64, 3, 3]).
size mismatch for upsample_edge.3.0.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for upsample_corner.3.0.weight: copying a param with shape torch.Size([3, 64, 3, 3]) from checkpoint, the shape in current model is torch.Size([64, 64, 3, 3]).
size mismatch for upsample_corner.3.0.bias: copying a param with shape torch.Size([3]) from checkpoint, the shape in current model is torch.Size([64]).
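
In case it helps to pin down where the released weights diverge from the current code, here is a minimal sketch (not from the repo; the path and the checkpoint layout are assumptions based on the failing command above) that dumps the keys and tensor shapes stored in the .pth file so they can be compared against the model the current code builds:

```python
# Minimal diagnostic sketch: list the parameter names and shapes stored in the
# released checkpoint. Path and layout are assumptions based on the command above.
import torch

ckpt_path = "./models/HGSR-MHR_X2_CDC.pth"
ckpt = torch.load(ckpt_path, map_location="cpu")

# Some checkpoints store the weights under a 'state_dict' key; others are the
# state_dict itself.
state_dict = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

for name, value in state_dict.items():
    if not torch.is_tensor(value):
        continue
    # Drop the 'module.' prefix that nn.DataParallel adds, so the names can be
    # compared with a single-GPU model definition.
    clean_name = name[len("module."):] if name.startswith("module.") else name
    print(f"{clean_name}: {tuple(value.shape)}")
```

For what it's worth, the missing/unexpected key pattern above (flat_map.2.1.* expected but flat_map.2.0.* stored, and 64-channel 3x3 layers expected where the checkpoint holds 3-channel 1x1 ones) looks like the x2/x3 weights were saved from a different layer layout than the current model code, so loading with strict=False alone would not fix it.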
