When I run "python test.py ctdet --exp_id coco_dla --keep_res --load_model ../models/ctdet_coco_dla_2x.pth", I get the following error:
```
coco_dla |#### | [653/5000]|Tot: 0:00:59 |ETA: 0:06:07 |tot 0.046s (0.053s) |load 0.000s (0.000s) |pre 0.000s (0.001s) |net 0.040s (0.046s) |dec 0.003s (0.003s) |post 0.003s (0.003s) |merge 0.000s (0.000s)
[[ 96 133 171 ... 206 207 217]
 [ 89 120 131 ... 212 216 212]
 [ 92 130 155 ... 208 219 208]
 ...
 [126 122 129 ...  90  96  97]
 [117 120 124 ... 100  93  91]
 [101 107 109 ... 120 108 117]]
Traceback (most recent call last):
  File "test.py", line 127, in <module>
    prefetch_test(opt)
  File "test.py", line 70, in prefetch_test
    for ind, (img_id, pre_processed_images) in enumerate(data_loader):
  File "c:\programdata\anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 345, in __next__
    data = self._next_data()
  File "c:\programdata\anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 385, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "c:\programdata\anaconda3\lib\site-packages\torch\utils\data\_utils\fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "c:\programdata\anaconda3\lib\site-packages\torch\utils\data\_utils\fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "test.py", line 42, in __getitem__
    images[scale], meta[scale] = self.pre_process_func(image, scale)
  File "E:\CenterNet-windows-master\CenterNet-windows-master\src\lib\detectors\base_detector.py", line 57, in pre_process
    inp_image = ((inp_image / 255. - self.mean) / self.std).astype(np.float32)
ValueError: operands could not be broadcast together with shapes (480,672) (1,1,3)
```
Environment: PyTorch 1.4.0, torchvision 0.5.0, Python 3.7.4.
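The broadcast failure suggests the image at that index was decoded as a single-channel grayscale array of shape (480, 672), while self.mean and self.std in base_detector.py are shaped (1, 1, 3). A minimal sketch of a possible workaround, assuming the image is read with cv2.imread somewhere upstream of pre_process; image_path is a hypothetical placeholder, and the mean/std values below stand in for whatever opt.mean / opt.std hold:

```python
import cv2
import numpy as np

# Sketch only: ensure the image has 3 channels before normalization.
# image_path is a hypothetical placeholder for the dataset's file path.
image = cv2.imread(image_path)
if image is None:
    raise FileNotFoundError(image_path)
if image.ndim == 2:
    # Grayscale decode: replicate the single channel so it broadcasts with (1, 1, 3).
    image = cv2.cvtColor(image, cv2.COLOR_GRAY2BGR)

# Placeholder mean/std; in CenterNet these come from opt.mean / opt.std.
mean = np.array([0.408, 0.447, 0.470], dtype=np.float32).reshape(1, 1, 3)
std = np.array([0.289, 0.274, 0.278], dtype=np.float32).reshape(1, 1, 3)

inp_image = ((image / 255. - mean) / std).astype(np.float32)
print(inp_image.shape)  # expected: (H, W, 3), e.g. (480, 672, 3)
```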