train.sh.o1418525
1062 lines (987 loc) · 81.3 KB
init meta
Near: 1.7382310142504496
Far: 2.055487166864391
Loader initialized.
Number of Joints: 20
KPE: RelDist, BPE: VecNorm, VPE: VecNorm
Embedder class: <class 'core.cutoff_embedder.Embedder'>
Embedder class: <class 'core.cutoff_embedder.Embedder'>
Embedder class: <class 'core.cutoff_embedder.Embedder'>
###### 300, 900 and 900 ###
RayCaster(
(network): NeRF(
(pts_linears): ModuleList(
(0): Linear(in_features=1200, out_features=256, bias=True)
(1-4): 4 x Linear(in_features=256, out_features=256, bias=True)
(5): Linear(in_features=1456, out_features=256, bias=True)
(6-7): 2 x Linear(in_features=256, out_features=256, bias=True)
)
(views_linears): ModuleList(
(0): Linear(in_features=1156, out_features=128, bias=True)
)
(output_linear): Linear(in_features=256, out_features=5, bias=True)
)
(network_fine): NeRF(
(pts_linears): ModuleList(
(0): Linear(in_features=1200, out_features=256, bias=True)
(1-4): 4 x Linear(in_features=256, out_features=256, bias=True)
(5): Linear(in_features=1456, out_features=256, bias=True)
(6-7): 2 x Linear(in_features=256, out_features=256, bias=True)
)
(views_linears): ModuleList(
(0): Linear(in_features=1156, out_features=128, bias=True)
)
(output_linear): Linear(in_features=256, out_features=5, bias=True)
)
(embed_fn): Embedder()
(embedbones_fn): Embedder()
(embeddirs_fn): Embedder()
)
Found ckpts ['./logs/rat_default/010000.tar', './logs/rat_default/020000.tar']
Reloading from ./logs/rat_default/020000.tar
load optimizer from ckpt
#parameters: 1224581
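The reported parameter total can be cross-checked against the Linear shapes printed in the RayCaster dump above: eight pts_linears layers with a skip concatenation at layer 5 (1200 + 256 = 1456 inputs), one views layer (1156 = 900-dim direction embedding + 256 features), and a 5-channel output head. A minimal sketch in plain Python (no framework needed) that sums weights and biases for one NeRF network:

```python
# Cross-check of "#parameters: 1224581" against the Linear shapes printed
# in the RayCaster dump. Layer sizes are taken verbatim from that printout.
def linear_params(in_f, out_f):
    # Parameter count of a torch.nn.Linear(in_f, out_f, bias=True):
    # weight matrix (out_f x in_f) plus bias vector (out_f).
    return in_f * out_f + out_f

pts_layers = [(1200, 256)] + [(256, 256)] * 4 + [(1456, 256)] + [(256, 256)] * 2
views_layers = [(1156, 128)]
output_layer = (256, 5)

total = sum(linear_params(i, o) for i, o in pts_layers + views_layers + [output_layer])
print(total)  # 1224581, matching the logged "#parameters" count for one network
```

This matches the logged count exactly, which suggests the `#parameters` line refers to a single NeRF network (the fine network duplicates the same shapes). The `###### 300, 900 and 900 ###` line is consistent with the layer widths (300 + 900 = 1200 point/bone inputs, 900 + 256 = 1156 view inputs), though how the code actually groups those embeddings is an inference, not something this log states.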
done creating popt
Saved checkpoints at ./logs/rat_default/020000.tar
Bounding Box Debug Info:
x_min: -0.016303576469421557, x_max: 0.13524496459960925
y_min: 0.13631874084472656, y_max: 0.2370219573974607
z_min: 0.03008332061767555, z_max: 0.13808740234375
Bounding Box Debug Info:
x_min: 0.20284332275390612, x_max: 0.27888725280761706
y_min: -0.10573136138916016, y_max: 0.04925373077392578
z_min: 0.007576402664184343, z_max: 0.10356272888183571
Bounding Box Debug Info:
x_min: -0.016059140205383415, x_max: 0.14520207214355438
y_min: -0.14237770080566428, y_max: -0.05226047515869141
z_min: 0.01878752899169899, z_max: 0.10969234466552712
Bounding Box Debug Info:
x_min: 0.04626746368408189, x_max: 0.12963674926757793
y_min: -0.117417900085449, y_max: 0.037399066925048824
z_min: 0.022188674926757586, z_max: 0.10861693572998048
Bounding Box Debug Info:
x_min: 0.05005798721313454, x_max: 0.15824763488769517
y_min: -0.04143564224243164, y_max: 0.10374114990234376
z_min: 0.027995925903320087, z_max: 0.16608097839355446
Bounding Box Debug Info:
x_min: 0.018135158538818245, x_max: 0.16837405395507798
y_min: 0.14043280029296876, y_max: 0.26155438232421874
z_min: 0.017705509801884546, z_max: 0.10332373046874978
Bounding Box Debug Info:
x_min: -0.22748263549804704, x_max: -0.07638658905029319
y_min: 0.11107346343994118, y_max: 0.22296235656738259
z_min: 0.01744731903076149, z_max: 0.11063830566406228
Bounding Box Debug Info:
x_min: -0.2488652343750003, x_max: -0.11536257934570324
y_min: -0.017549812316894533, y_max: 0.11087640380859376
z_min: 0.005051356315612565, z_max: 0.08966143035888649
Bounding Box Debug Info:
x_min: -0.21859460449218762, x_max: -0.12297888183593772
y_min: 0.0287499084472654, y_max: 0.10050344085693337
z_min: 0.022367975234985123, z_max: 0.17305021667480458
Bounding Box Debug Info:
x_min: -0.2076027832031251, x_max: -0.10125875854492215
y_min: -0.05550039672851585, y_max: 0.07307929992675782
z_min: 0.00415080495777488, z_max: 0.09007923889160134
Bounding Box Debug Info:
x_min: -0.17041168212890642, x_max: -0.036025394439697385
y_min: 0.11028720092773414, y_max: 0.22693572998046876
z_min: 0.022087579727172624, z_max: 0.1054314270019529
Bounding Box Debug Info:
x_min: -0.1705599060058596, x_max: -0.038622104644775564
y_min: 0.1101719741821289, y_max: 0.2275417175292969
z_min: 0.021979381561279068, z_max: 0.10639647674560525
Bounding Box Debug Info:
x_min: -0.17065034484863298, x_max: -0.03775608825683617
y_min: 0.11022159576415992, y_max: 0.22663485717773438
z_min: 0.021593749999999773, z_max: 0.10601044464111306
Bounding Box Debug Info:
x_min: -0.1705158691406251, x_max: -0.03697266769409191
y_min: 0.10976802825927712, y_max: 0.22746969604492165
z_min: 0.021498142242431413, z_max: 0.10463920593261719
Bounding Box Debug Info:
x_min: -0.17116410827636735, x_max: -0.03743118286132818
y_min: 0.11016936492919921, y_max: 0.22665449523925782
z_min: 0.02142527198791481, z_max: 0.1051624069213865
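Each "Bounding Box Debug Info" block above is an axis-aligned min/max over a set of 3D points. The code that emits them is not visible in this log; a minimal, hypothetical sketch of how such a block could be produced:

```python
# Hypothetical sketch of the "Bounding Box Debug Info" output above: an
# axis-aligned bounding box is the per-axis min and max over the points.
def bbox(points):
    # points: iterable of (x, y, z) tuples; returns ((x_min, x_max), ...).
    xs, ys, zs = zip(*points)
    return (min(xs), max(xs)), (min(ys), max(ys)), (min(zs), max(zs))

def print_bbox(points):
    (x0, x1), (y0, y1), (z0, z1) = bbox(points)
    print("Bounding Box Debug Info:")
    print(f"x_min: {x0}, x_max: {x1}")
    print(f"y_min: {y0}, y_max: {y1}")
    print(f"z_min: {z0}, z_max: {z1}")

print_bbox([(0.0, 0.1, 0.2), (0.5, -0.1, 0.3)])
```

The fifteen blocks per evaluation pass line up with the fifteen poses reported below, so each box plausibly bounds one posed subject, but that mapping is an inference from the log, not a documented fact.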
center: None
Pose 0: Valid indices count = 1391744
center: None
Pose 1: Valid indices count = 1391744
center: None
Pose 2: Valid indices count = 1391744
center: None
Pose 3: Valid indices count = 1391744
center: None
Pose 4: Valid indices count = 1391744
center: None
Pose 5: Valid indices count = 1391744
center: None
Pose 6: Valid indices count = 1391744
center: None
Pose 7: Valid indices count = 1391744
center: None
Pose 8: Valid indices count = 1391744
center: None
Pose 9: Valid indices count = 1391744
center: None
Pose 10: Valid indices count = 1391744
center: None
Pose 11: Valid indices count = 1391744
center: None
Pose 12: Valid indices count = 1391744
center: None
Pose 13: Valid indices count = 1391744
center: None
Pose 14: Valid indices count = 1391744
0 0.0005850791931152344
1 46.87179684638977
2 46.84775996208191
3 47.089300870895386
4 47.00591516494751
5 47.150734663009644
6 47.06116008758545
7 47.05800008773804
8 46.93541216850281
9 47.08892273902893
10 46.96255326271057
11 47.13358426094055
12 46.974615812301636
13 47.14869427680969
14 46.98773717880249
Predicted image shape (th_rgbs): torch.Size([15, 3, 1048, 1328])
Ground truth image shape (th_gt): torch.Size([15, 3, 1048, 1328])
Any NaNs in th_rgbs: tensor(False, device='cpu')
Any NaNs in th_gt: tensor(False, device='cpu')
Any Infs in th_rgbs: tensor(False, device='cpu')
Any Infs in th_gt: tensor(False, device='cpu')
Evaluate PSNR: 20.90448258740002 (13.224236935400489), SSIM: 0.9701443314552307 (0.7066879385281541)
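PSNR is conventionally computed from the mean squared error as PSNR = 10 · log10(MAX² / MSE), with MAX = 1.0 for images scaled to [0, 1]; whether this codebase applies masks or other normalization before the reported values is not visible in the log. A minimal sketch of the standard formula:

```python
import math

def psnr(mse, max_val=1.0):
    # Peak signal-to-noise ratio in dB for a given mean squared error,
    # assuming pixel values in [0, max_val].
    return 10.0 * math.log10(max_val ** 2 / mse)

print(psnr(0.01))  # 20.0 dB
```

Note that in the [TRAIN] lines below, -10 · log10(Loss) does not reproduce the printed PSNR (e.g. at iter 20000 it gives about 10.4 dB versus the logged 13.36), which is consistent with the training loss summing more than one rendering term, as is common in coarse-plus-fine NeRF setups; that reading is an inference, not something the log states.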
[TRAIN] Iter: 20000 Loss: 0.09187431633472443 PSNR: 13.35507869720459, Alpha: 0.9970220327377319, GradNorm 0.5533041440870232, Mem: 11550.86865234375
[TRAIN] Iter: 20100 Loss: 0.08531677722930908 PSNR: 13.492626190185547, Alpha: 0.9990776777267456, GradNorm 0.4837602169151964, Mem: 11558.173828125
[TRAIN] Iter: 20200 Loss: 0.08788514137268066 PSNR: 13.372926712036133, Alpha: 0.9994758367538452, GradNorm 0.8073980473968272, Mem: 11558.173828125
[TRAIN] Iter: 20300 Loss: 0.07260923087596893 PSNR: 14.102751731872559, Alpha: 0.9975651502609253, GradNorm 0.3563044155845599, Mem: 11558.173828125
[TRAIN] Iter: 20400 Loss: 0.08192630112171173 PSNR: 13.925466537475586, Alpha: 0.9999901056289673, GradNorm 0.34090528769100953, Mem: 11558.173828125
[TRAIN] Iter: 20500 Loss: 0.08252661675214767 PSNR: 13.800107955932617, Alpha: 0.9947586059570312, GradNorm 0.3787243030012972, Mem: 11558.173828125
[TRAIN] Iter: 20600 Loss: 0.07785485684871674 PSNR: 14.172076225280762, Alpha: 0.9959685802459717, GradNorm 0.32599397626132925, Mem: 11558.173828125
[TRAIN] Iter: 20700 Loss: 0.07660361379384995 PSNR: 14.09618091583252, Alpha: 0.996320903301239, GradNorm 0.4694360875372283, Mem: 11558.173828125
[TRAIN] Iter: 20800 Loss: 0.0738246887922287 PSNR: 14.10276985168457, Alpha: 0.9998658895492554, GradNorm 0.364227770480065, Mem: 11558.173828125
[TRAIN] Iter: 20900 Loss: 0.07962501049041748 PSNR: 13.840386390686035, Alpha: 0.9965566396713257, GradNorm 0.3924038067361124, Mem: 11558.173828125
[TRAIN] Iter: 21000 Loss: 0.08093756437301636 PSNR: 13.795390129089355, Alpha: 0.9987408518791199, GradNorm 0.5816660190368934, Mem: 11558.173828125
[TRAIN] Iter: 21100 Loss: 0.08430813997983932 PSNR: 13.906656265258789, Alpha: 0.9985388517379761, GradNorm 0.6461499268601819, Mem: 11558.173828125
[TRAIN] Iter: 21200 Loss: 0.0878308042883873 PSNR: 13.508638381958008, Alpha: 0.9990124702453613, GradNorm 0.30275096890167363, Mem: 11558.173828125
[TRAIN] Iter: 21300 Loss: 0.07432159781455994 PSNR: 14.288253784179688, Alpha: 0.997304379940033, GradNorm 0.2504567510855751, Mem: 11558.173828125
[TRAIN] Iter: 21400 Loss: 0.06811893731355667 PSNR: 14.502509117126465, Alpha: 0.9947421550750732, GradNorm 0.33261993135656637, Mem: 11558.173828125
[TRAIN] Iter: 21500 Loss: 0.07633478194475174 PSNR: 14.08181095123291, Alpha: 0.9981745481491089, GradNorm 0.41521816918505866, Mem: 11558.173828125
[TRAIN] Iter: 21600 Loss: 0.07802622020244598 PSNR: 14.15048599243164, Alpha: 0.9984371662139893, GradNorm 0.542238358817115, Mem: 11558.173828125
[TRAIN] Iter: 21700 Loss: 0.07174915075302124 PSNR: 14.326741218566895, Alpha: 0.9998818635940552, GradNorm 0.3922517501115669, Mem: 11558.173828125
[TRAIN] Iter: 21800 Loss: 0.08134793490171432 PSNR: 13.968099594116211, Alpha: 0.9953367710113525, GradNorm 0.3123546814183452, Mem: 11558.173828125
[TRAIN] Iter: 21900 Loss: 0.0771191343665123 PSNR: 14.262150764465332, Alpha: 0.9954769611358643, GradNorm 0.5452848335502578, Mem: 11558.173828125
[TRAIN] Iter: 22000 Loss: 0.08277855813503265 PSNR: 13.866737365722656, Alpha: 0.9994852542877197, GradNorm 0.47082814423350167, Mem: 11558.173828125
[TRAIN] Iter: 22100 Loss: 0.08455199003219604 PSNR: 13.592362403869629, Alpha: 0.9940594434738159, GradNorm 0.5418142832078272, Mem: 11558.173828125
[TRAIN] Iter: 22200 Loss: 0.08227120339870453 PSNR: 13.730535507202148, Alpha: 0.9986370801925659, GradNorm 0.464973838566184, Mem: 11558.173828125
[TRAIN] Iter: 22300 Loss: 0.07231767475605011 PSNR: 14.354056358337402, Alpha: 0.9970613718032837, GradNorm 0.47133053875719155, Mem: 11558.173828125
[TRAIN] Iter: 22400 Loss: 0.08185861259698868 PSNR: 13.88145923614502, Alpha: 0.9974803924560547, GradNorm 0.5072328627013896, Mem: 11558.173828125
[TRAIN] Iter: 22500 Loss: 0.0828903466463089 PSNR: 13.568238258361816, Alpha: 0.9992596507072449, GradNorm 0.40462193208553937, Mem: 11558.173828125
[TRAIN] Iter: 22600 Loss: 0.0767061784863472 PSNR: 14.119461059570312, Alpha: 0.9990846514701843, GradNorm 0.5920923298419999, Mem: 11558.173828125
[TRAIN] Iter: 22700 Loss: 0.0717887133359909 PSNR: 14.325491905212402, Alpha: 0.996785581111908, GradNorm 0.28650063653074537, Mem: 11558.173828125
[TRAIN] Iter: 22800 Loss: 0.08097124099731445 PSNR: 14.06324291229248, Alpha: 0.9965195059776306, GradNorm 0.4720237025665524, Mem: 11558.173828125
[TRAIN] Iter: 22900 Loss: 0.07622748613357544 PSNR: 14.188032150268555, Alpha: 0.9981414675712585, GradNorm 0.3625657563523184, Mem: 11558.173828125
[TRAIN] Iter: 23000 Loss: 0.07645894587039948 PSNR: 13.865768432617188, Alpha: 0.9969179034233093, GradNorm 0.44867234444862686, Mem: 11558.173828125
[TRAIN] Iter: 23100 Loss: 0.07996388524770737 PSNR: 13.888413429260254, Alpha: 0.9969024658203125, GradNorm 0.5099949404364503, Mem: 11558.173828125
[TRAIN] Iter: 23200 Loss: 0.07839914411306381 PSNR: 13.98352336883545, Alpha: 0.9979802966117859, GradNorm 0.40477795307676384, Mem: 11558.173828125
[TRAIN] Iter: 23300 Loss: 0.07804441452026367 PSNR: 14.301065444946289, Alpha: 0.9989490509033203, GradNorm 0.3989437300784871, Mem: 11558.173828125
[TRAIN] Iter: 23400 Loss: 0.07814480364322662 PSNR: 14.07808780670166, Alpha: 0.9999055862426758, GradNorm 0.38704052951096, Mem: 11558.173828125
[TRAIN] Iter: 23500 Loss: 0.07699927687644958 PSNR: 14.072565078735352, Alpha: 0.9977272748947144, GradNorm 0.4630417431458195, Mem: 11558.173828125
[TRAIN] Iter: 23600 Loss: 0.081876739859581 PSNR: 13.949226379394531, Alpha: 0.9967402219772339, GradNorm 0.4336730504590777, Mem: 11558.173828125
[TRAIN] Iter: 23700 Loss: 0.07485326379537582 PSNR: 14.216598510742188, Alpha: 0.9999977946281433, GradNorm 0.39377669536973325, Mem: 11558.173828125
[TRAIN] Iter: 23800 Loss: 0.07591955363750458 PSNR: 14.305036544799805, Alpha: 0.9907438158988953, GradNorm 0.42399095361958317, Mem: 11558.173828125
[TRAIN] Iter: 23900 Loss: 0.07331385463476181 PSNR: 14.30931568145752, Alpha: 0.9993669390678406, GradNorm 0.6333403510602539, Mem: 11558.173828125
[TRAIN] Iter: 24000 Loss: 0.07596548646688461 PSNR: 14.114215850830078, Alpha: 0.9983625411987305, GradNorm 0.4302525592913186, Mem: 11558.173828125
[TRAIN] Iter: 24100 Loss: 0.0714925080537796 PSNR: 14.619836807250977, Alpha: 0.9983319044113159, GradNorm 0.39694970935950846, Mem: 11558.173828125
[TRAIN] Iter: 24200 Loss: 0.08767987042665482 PSNR: 13.589244842529297, Alpha: 0.9973155856132507, GradNorm 0.6399362575881578, Mem: 11558.173828125
[TRAIN] Iter: 24300 Loss: 0.06981907039880753 PSNR: 14.724467277526855, Alpha: 0.995250403881073, GradNorm 0.5206291278372536, Mem: 11558.173828125
[TRAIN] Iter: 24400 Loss: 0.07763618230819702 PSNR: 13.988527297973633, Alpha: 0.996616780757904, GradNorm 0.44609977297345116, Mem: 11558.173828125
[TRAIN] Iter: 24500 Loss: 0.06953435391187668 PSNR: 14.606993675231934, Alpha: 0.9987635612487793, GradNorm 0.4386018231988141, Mem: 11558.173828125
[TRAIN] Iter: 24600 Loss: 0.0846921056509018 PSNR: 13.5090970993042, Alpha: 0.9906825423240662, GradNorm 0.3126314217763862, Mem: 11558.173828125
[TRAIN] Iter: 24700 Loss: 0.07848750054836273 PSNR: 14.064452171325684, Alpha: 0.9987735748291016, GradNorm 0.46411751019433034, Mem: 11558.173828125
[TRAIN] Iter: 24800 Loss: 0.07040543854236603 PSNR: 14.319046974182129, Alpha: 0.9973897933959961, GradNorm 0.36020483259237335, Mem: 11558.173828125
[TRAIN] Iter: 24900 Loss: 0.0648261308670044 PSNR: 15.05359935760498, Alpha: 0.9988207817077637, GradNorm 0.4105529059251305, Mem: 11558.173828125
[TRAIN] Iter: 25000 Loss: 0.08251875638961792 PSNR: 13.839550018310547, Alpha: 0.9983231425285339, GradNorm 0.556006728277684, Mem: 11558.173828125
[TRAIN] Iter: 25100 Loss: 0.07756344974040985 PSNR: 14.181941986083984, Alpha: 0.99518221616745, GradNorm 0.3597847446466235, Mem: 11558.173828125
[TRAIN] Iter: 25200 Loss: 0.07492969930171967 PSNR: 14.311877250671387, Alpha: 0.9995919466018677, GradNorm 0.30758264290040893, Mem: 11558.173828125
[TRAIN] Iter: 25300 Loss: 0.0694572776556015 PSNR: 14.671506881713867, Alpha: 0.9984287023544312, GradNorm 0.3499178059046643, Mem: 11558.173828125
[TRAIN] Iter: 25400 Loss: 0.06521525979042053 PSNR: 14.821541786193848, Alpha: 0.9999579787254333, GradNorm 0.3128618273429369, Mem: 11558.173828125
[TRAIN] Iter: 25500 Loss: 0.0768374651670456 PSNR: 14.012709617614746, Alpha: 0.9977542757987976, GradNorm 0.3902878968926631, Mem: 11558.173828125
[TRAIN] Iter: 25600 Loss: 0.07402903586626053 PSNR: 14.281792640686035, Alpha: 0.9998131990432739, GradNorm 0.43296111042962937, Mem: 11558.173828125
[TRAIN] Iter: 25700 Loss: 0.07458431273698807 PSNR: 14.165722846984863, Alpha: 0.9987342357635498, GradNorm 0.29991038572666195, Mem: 11558.173828125
[TRAIN] Iter: 25800 Loss: 0.08276161551475525 PSNR: 13.77078628540039, Alpha: 0.9949955940246582, GradNorm 0.5411048220554973, Mem: 11558.173828125
[TRAIN] Iter: 25900 Loss: 0.07003507018089294 PSNR: 14.514130592346191, Alpha: 0.9978060126304626, GradNorm 0.33359362143958426, Mem: 11558.173828125
[TRAIN] Iter: 26000 Loss: 0.08119158446788788 PSNR: 13.803071022033691, Alpha: 0.9901233911514282, GradNorm 0.504617379369841, Mem: 11558.173828125
[TRAIN] Iter: 26100 Loss: 0.07455047965049744 PSNR: 14.273865699768066, Alpha: 0.9988510608673096, GradNorm 0.2976951788123948, Mem: 11558.173828125
[TRAIN] Iter: 26200 Loss: 0.0792585015296936 PSNR: 14.133430480957031, Alpha: 0.9954676032066345, GradNorm 0.4264635226569596, Mem: 11558.173828125
[TRAIN] Iter: 26300 Loss: 0.06996788084506989 PSNR: 14.421030044555664, Alpha: 0.9980108141899109, GradNorm 0.2720977725564593, Mem: 11558.173828125
[TRAIN] Iter: 26400 Loss: 0.07354241609573364 PSNR: 14.445884704589844, Alpha: 0.9984304308891296, GradNorm 0.38029517768986615, Mem: 11558.173828125
[TRAIN] Iter: 26500 Loss: 0.06412147730588913 PSNR: 14.871671676635742, Alpha: 0.9956356883049011, GradNorm 0.4418306007275976, Mem: 11558.173828125
[TRAIN] Iter: 26600 Loss: 0.08094987273216248 PSNR: 14.137208938598633, Alpha: 0.9992690086364746, GradNorm 0.6195906737726714, Mem: 11558.173828125
[TRAIN] Iter: 26700 Loss: 0.08651387691497803 PSNR: 13.731367111206055, Alpha: 0.9992103576660156, GradNorm 0.7975835339388935, Mem: 11558.173828125
[TRAIN] Iter: 26800 Loss: 0.07255770266056061 PSNR: 14.339106559753418, Alpha: 0.9973243474960327, GradNorm 0.31211882525530454, Mem: 11558.173828125
[TRAIN] Iter: 26900 Loss: 0.07621078193187714 PSNR: 13.885469436645508, Alpha: 0.990019679069519, GradNorm 0.42446396344686715, Mem: 11558.173828125
[TRAIN] Iter: 27000 Loss: 0.08167152851819992 PSNR: 13.817895889282227, Alpha: 0.9988175630569458, GradNorm 0.5514891252826954, Mem: 11558.173828125
[TRAIN] Iter: 27100 Loss: 0.07581669092178345 PSNR: 14.385797500610352, Alpha: 0.9966281652450562, GradNorm 0.48231016343458694, Mem: 11558.173828125
[TRAIN] Iter: 27200 Loss: 0.06857401132583618 PSNR: 14.492484092712402, Alpha: 0.9909374117851257, GradNorm 0.35078601163029566, Mem: 11558.173828125
[TRAIN] Iter: 27300 Loss: 0.08069954812526703 PSNR: 13.769777297973633, Alpha: 0.9977288246154785, GradNorm 0.4641511011027435, Mem: 11558.173828125
[TRAIN] Iter: 27400 Loss: 0.06979846954345703 PSNR: 14.556965827941895, Alpha: 0.9956493377685547, GradNorm 0.5084507809257904, Mem: 11558.173828125
[TRAIN] Iter: 27500 Loss: 0.07276251912117004 PSNR: 14.38466739654541, Alpha: 0.995644748210907, GradNorm 0.46203565502760957, Mem: 11558.173828125
[TRAIN] Iter: 27600 Loss: 0.0695338100194931 PSNR: 14.543490409851074, Alpha: 0.9958414435386658, GradNorm 0.347548395054924, Mem: 11558.173828125
[TRAIN] Iter: 27700 Loss: 0.07488296926021576 PSNR: 14.251503944396973, Alpha: 0.9946555495262146, GradNorm 0.5413783447284479, Mem: 11558.173828125
[TRAIN] Iter: 27800 Loss: 0.0716518759727478 PSNR: 14.233119010925293, Alpha: 0.9947341680526733, GradNorm 0.37424319680044593, Mem: 11558.173828125
[TRAIN] Iter: 27900 Loss: 0.06600909680128098 PSNR: 14.738411903381348, Alpha: 0.9973373413085938, GradNorm 0.342382996582034, Mem: 11558.173828125
[TRAIN] Iter: 28000 Loss: 0.06844326108694077 PSNR: 14.41538143157959, Alpha: 0.9991683959960938, GradNorm 0.41519439040643097, Mem: 11558.173828125
[TRAIN] Iter: 28100 Loss: 0.07824773341417313 PSNR: 13.990424156188965, Alpha: 0.9882264137268066, GradNorm 0.5129933805306058, Mem: 11558.173828125
[TRAIN] Iter: 28200 Loss: 0.07361412048339844 PSNR: 14.077268600463867, Alpha: 0.9928281307220459, GradNorm 0.3510414559886135, Mem: 11558.173828125
[TRAIN] Iter: 28300 Loss: 0.06790332496166229 PSNR: 14.67627239227295, Alpha: 0.9995721578598022, GradNorm 0.3312170369482608, Mem: 11558.173828125
[TRAIN] Iter: 28400 Loss: 0.07714250683784485 PSNR: 14.108842849731445, Alpha: 0.9934637546539307, GradNorm 0.3557592828589973, Mem: 11558.173828125
[TRAIN] Iter: 28500 Loss: 0.06583665311336517 PSNR: 14.752737045288086, Alpha: 0.9968787431716919, GradNorm 0.3314789077914225, Mem: 11558.173828125
[TRAIN] Iter: 28600 Loss: 0.07690326869487762 PSNR: 14.044428825378418, Alpha: 0.9936227798461914, GradNorm 0.35778760626511047, Mem: 11558.173828125
[TRAIN] Iter: 28700 Loss: 0.06795874238014221 PSNR: 14.873968124389648, Alpha: 0.9968633055686951, GradNorm 0.41624091413428466, Mem: 11558.173828125
[TRAIN] Iter: 28800 Loss: 0.07098785042762756 PSNR: 14.50261116027832, Alpha: 0.9991388320922852, GradNorm 0.38367228042329526, Mem: 11558.173828125
[TRAIN] Iter: 28900 Loss: 0.0761304497718811 PSNR: 14.141548156738281, Alpha: 0.997944712638855, GradNorm 0.4549434178427883, Mem: 11558.173828125
[TRAIN] Iter: 29000 Loss: 0.07032103836536407 PSNR: 14.493751525878906, Alpha: 0.9976891875267029, GradNorm 0.35510424543501146, Mem: 11558.173828125
[TRAIN] Iter: 29100 Loss: 0.06558066606521606 PSNR: 14.788487434387207, Alpha: 0.9965200424194336, GradNorm 0.2787997129163128, Mem: 11558.173828125
[TRAIN] Iter: 29200 Loss: 0.06587890535593033 PSNR: 14.907171249389648, Alpha: 0.9955436587333679, GradNorm 0.29888976704798037, Mem: 11558.173828125
[TRAIN] Iter: 29300 Loss: 0.07495665550231934 PSNR: 14.105388641357422, Alpha: 0.9972041845321655, GradNorm 0.46317122487224255, Mem: 11558.173828125
[TRAIN] Iter: 29400 Loss: 0.07574787735939026 PSNR: 14.035564422607422, Alpha: 0.99823397397995, GradNorm 0.45168046610038853, Mem: 11558.173828125
[TRAIN] Iter: 29500 Loss: 0.07399596273899078 PSNR: 14.479714393615723, Alpha: 0.9967593550682068, GradNorm 0.476329786211062, Mem: 11558.173828125
[TRAIN] Iter: 29600 Loss: 0.0685841515660286 PSNR: 14.444747924804688, Alpha: 0.9897578954696655, GradNorm 0.2883182877994978, Mem: 11558.173828125
[TRAIN] Iter: 29700 Loss: 0.07369697093963623 PSNR: 14.218693733215332, Alpha: 0.9957265853881836, GradNorm 0.47680529290321844, Mem: 11558.173828125
[TRAIN] Iter: 29800 Loss: 0.07101497054100037 PSNR: 14.869857788085938, Alpha: 0.9965362548828125, GradNorm 0.3212098595579922, Mem: 11558.173828125
[TRAIN] Iter: 29900 Loss: 0.07469536364078522 PSNR: 14.045900344848633, Alpha: 0.993876576423645, GradNorm 0.307260356458544, Mem: 11558.173828125
Saved checkpoints at ./logs/rat_default/030000.tar
Bounding Box Debug Info:
x_min: -0.016303576469421557, x_max: 0.13524496459960925
y_min: 0.13631874084472656, y_max: 0.2370219573974607
z_min: 0.03008332061767555, z_max: 0.13808740234375
Bounding Box Debug Info:
x_min: 0.20284332275390612, x_max: 0.27888725280761706
y_min: -0.10573136138916016, y_max: 0.04925373077392578
z_min: 0.007576402664184343, z_max: 0.10356272888183571
Bounding Box Debug Info:
x_min: -0.016059140205383415, x_max: 0.14520207214355438
y_min: -0.14237770080566428, y_max: -0.05226047515869141
z_min: 0.01878752899169899, z_max: 0.10969234466552712
Bounding Box Debug Info:
x_min: 0.04626746368408189, x_max: 0.12963674926757793
y_min: -0.117417900085449, y_max: 0.037399066925048824
z_min: 0.022188674926757586, z_max: 0.10861693572998048
Bounding Box Debug Info:
x_min: 0.05005798721313454, x_max: 0.15824763488769517
y_min: -0.04143564224243164, y_max: 0.10374114990234376
z_min: 0.027995925903320087, z_max: 0.16608097839355446
Bounding Box Debug Info:
x_min: 0.018135158538818245, x_max: 0.16837405395507798
y_min: 0.14043280029296876, y_max: 0.26155438232421874
z_min: 0.017705509801884546, z_max: 0.10332373046874978
Bounding Box Debug Info:
x_min: -0.22748263549804704, x_max: -0.07638658905029319
y_min: 0.11107346343994118, y_max: 0.22296235656738259
z_min: 0.01744731903076149, z_max: 0.11063830566406228
Bounding Box Debug Info:
x_min: -0.2488652343750003, x_max: -0.11536257934570324
y_min: -0.017549812316894533, y_max: 0.11087640380859376
z_min: 0.005051356315612565, z_max: 0.08966143035888649
Bounding Box Debug Info:
x_min: -0.21859460449218762, x_max: -0.12297888183593772
y_min: 0.0287499084472654, y_max: 0.10050344085693337
z_min: 0.022367975234985123, z_max: 0.17305021667480458
Bounding Box Debug Info:
x_min: -0.2076027832031251, x_max: -0.10125875854492215
y_min: -0.05550039672851585, y_max: 0.07307929992675782
z_min: 0.00415080495777488, z_max: 0.09007923889160134
Bounding Box Debug Info:
x_min: -0.17041168212890642, x_max: -0.036025394439697385
y_min: 0.11028720092773414, y_max: 0.22693572998046876
z_min: 0.022087579727172624, z_max: 0.1054314270019529
Bounding Box Debug Info:
x_min: -0.1705599060058596, x_max: -0.038622104644775564
y_min: 0.1101719741821289, y_max: 0.2275417175292969
z_min: 0.021979381561279068, z_max: 0.10639647674560525
Bounding Box Debug Info:
x_min: -0.17065034484863298, x_max: -0.03775608825683617
y_min: 0.11022159576415992, y_max: 0.22663485717773438
z_min: 0.021593749999999773, z_max: 0.10601044464111306
Bounding Box Debug Info:
x_min: -0.1705158691406251, x_max: -0.03697266769409191
y_min: 0.10976802825927712, y_max: 0.22746969604492165
z_min: 0.021498142242431413, z_max: 0.10463920593261719
Bounding Box Debug Info:
x_min: -0.17116410827636735, x_max: -0.03743118286132818
y_min: 0.11016936492919921, y_max: 0.22665449523925782
z_min: 0.02142527198791481, z_max: 0.1051624069213865
center: None
Pose 0: Valid indices count = 1391744
center: None
Pose 1: Valid indices count = 1391744
center: None
Pose 2: Valid indices count = 1391744
center: None
Pose 3: Valid indices count = 1391744
center: None
Pose 4: Valid indices count = 1391744
center: None
Pose 5: Valid indices count = 1391744
center: None
Pose 6: Valid indices count = 1391744
center: None
Pose 7: Valid indices count = 1391744
center: None
Pose 8: Valid indices count = 1391744
center: None
Pose 9: Valid indices count = 1391744
center: None
Pose 10: Valid indices count = 1391744
center: None
Pose 11: Valid indices count = 1391744
center: None
Pose 12: Valid indices count = 1391744
center: None
Pose 13: Valid indices count = 1391744
center: None
Pose 14: Valid indices count = 1391744
0 0.0010066032409667969
1 47.06617593765259
2 46.951910734176636
3 46.98934030532837
4 47.03982353210449
5 46.99688267707825
6 47.012953758239746
7 46.981428146362305
8 46.99828124046326
9 46.9595890045166
10 46.98960089683533
11 46.97118616104126
12 46.997732162475586
13 47.02458834648132
14 47.03943705558777
Predicted image shape (th_rgbs): torch.Size([15, 3, 1048, 1328])
Ground truth image shape (th_gt): torch.Size([15, 3, 1048, 1328])
Any NaNs in th_rgbs: tensor(False, device='cpu')
Any NaNs in th_gt: tensor(False, device='cpu')
Any Infs in th_rgbs: tensor(False, device='cpu')
Any Infs in th_gt: tensor(False, device='cpu')
Evaluate PSNR: 21.37133488431873 (15.020993110251101), SSIM: 0.9704369306564331 (0.7318582761461678)
[TRAIN] Iter: 30000 Loss: 0.07213651388883591 PSNR: 14.30387020111084, Alpha: 0.992568850517273, GradNorm 0.30727440541605505, Mem: 11558.173828125
[TRAIN] Iter: 30100 Loss: 0.08477917313575745 PSNR: 13.75953483581543, Alpha: 0.9988416433334351, GradNorm 0.44869735111012155, Mem: 11558.173828125
[TRAIN] Iter: 30200 Loss: 0.07288145273923874 PSNR: 14.45975112915039, Alpha: 0.9992894530296326, GradNorm 0.39974331910641264, Mem: 11558.173828125
[TRAIN] Iter: 30300 Loss: 0.07372286915779114 PSNR: 14.10248851776123, Alpha: 0.9992413520812988, GradNorm 0.4576056702283829, Mem: 11558.173828125
[TRAIN] Iter: 30400 Loss: 0.08309918642044067 PSNR: 13.64865493774414, Alpha: 0.9906929731369019, GradNorm 0.4605249584040232, Mem: 11558.173828125
[TRAIN] Iter: 30500 Loss: 0.06784942001104355 PSNR: 14.710752487182617, Alpha: 0.9940316081047058, GradNorm 0.3822628517296798, Mem: 11558.173828125
[TRAIN] Iter: 30600 Loss: 0.0758289098739624 PSNR: 14.230960845947266, Alpha: 0.9970929622650146, GradNorm 0.4831038056278433, Mem: 11558.173828125
[TRAIN] Iter: 30700 Loss: 0.07190226763486862 PSNR: 14.20216178894043, Alpha: 0.9963430166244507, GradNorm 0.3238979052947748, Mem: 11558.173828125
[TRAIN] Iter: 30800 Loss: 0.07037240266799927 PSNR: 14.417102813720703, Alpha: 0.9944609999656677, GradNorm 0.33104259583485274, Mem: 11558.173828125
[TRAIN] Iter: 30900 Loss: 0.057570651173591614 PSNR: 15.320127487182617, Alpha: 0.9972307085990906, GradNorm 0.2679583339896809, Mem: 11558.173828125
[TRAIN] Iter: 31000 Loss: 0.06529482454061508 PSNR: 14.504904747009277, Alpha: 0.999701976776123, GradNorm 0.48166750425667404, Mem: 11558.173828125
[TRAIN] Iter: 31100 Loss: 0.06666803359985352 PSNR: 14.802014350891113, Alpha: 0.9845073223114014, GradNorm 0.27143542165704077, Mem: 11558.173828125
[TRAIN] Iter: 31200 Loss: 0.07639352977275848 PSNR: 14.073837280273438, Alpha: 0.9975263476371765, GradNorm 0.35853803833261055, Mem: 11558.173828125
[TRAIN] Iter: 31300 Loss: 0.0699920654296875 PSNR: 14.645929336547852, Alpha: 0.9979199767112732, GradNorm 0.37030348091650533, Mem: 11558.173828125
[TRAIN] Iter: 31400 Loss: 0.07072766870260239 PSNR: 14.707077026367188, Alpha: 0.9982064962387085, GradNorm 0.3247924125373689, Mem: 11558.173828125
[TRAIN] Iter: 31500 Loss: 0.07036430388689041 PSNR: 14.601802825927734, Alpha: 0.9955451488494873, GradNorm 0.4155852095568069, Mem: 11558.173828125
[TRAIN] Iter: 31600 Loss: 0.06647947430610657 PSNR: 14.659797668457031, Alpha: 0.9966022372245789, GradNorm 0.5114237610044601, Mem: 11558.173828125
[TRAIN] Iter: 31700 Loss: 0.06804834306240082 PSNR: 14.63945198059082, Alpha: 0.9947237968444824, GradNorm 0.2893825126545416, Mem: 11558.173828125
[TRAIN] Iter: 31800 Loss: 0.06271186470985413 PSNR: 15.140131950378418, Alpha: 0.9900357127189636, GradNorm 0.31784064448417576, Mem: 11558.173828125
[TRAIN] Iter: 31900 Loss: 0.07694245874881744 PSNR: 14.172248840332031, Alpha: 0.9969844222068787, GradNorm 0.41703730429727626, Mem: 11558.173828125
[TRAIN] Iter: 32000 Loss: 0.06167332082986832 PSNR: 14.838668823242188, Alpha: 0.9951095581054688, GradNorm 0.4759001167101412, Mem: 11558.173828125
[TRAIN] Iter: 32100 Loss: 0.06010328233242035 PSNR: 15.214550971984863, Alpha: 0.9945110082626343, GradNorm 0.3177875934926934, Mem: 11558.173828125
[TRAIN] Iter: 32200 Loss: 0.06603819131851196 PSNR: 14.79265022277832, Alpha: 0.9991294145584106, GradNorm 0.4683218523697907, Mem: 11558.173828125
[TRAIN] Iter: 32300 Loss: 0.06595678627490997 PSNR: 14.844271659851074, Alpha: 0.9883241653442383, GradNorm 0.35980521771425555, Mem: 11558.173828125
[TRAIN] Iter: 32400 Loss: 0.05864569917321205 PSNR: 15.3392972946167, Alpha: 0.9955724477767944, GradNorm 0.25157039084716204, Mem: 11558.173828125
[TRAIN] Iter: 32500 Loss: 0.07037877291440964 PSNR: 14.33360767364502, Alpha: 0.9957829713821411, GradNorm 0.3265169186442539, Mem: 11558.173828125
[TRAIN] Iter: 32600 Loss: 0.07191150635480881 PSNR: 14.447547912597656, Alpha: 0.9935752749443054, GradNorm 0.3762776379375804, Mem: 11558.173828125
[TRAIN] Iter: 32700 Loss: 0.06780068576335907 PSNR: 14.893731117248535, Alpha: 0.9994590282440186, GradNorm 0.43379878949766476, Mem: 11558.173828125
[TRAIN] Iter: 32800 Loss: 0.06292621791362762 PSNR: 15.020451545715332, Alpha: 0.9987596273422241, GradNorm 0.3395439644308556, Mem: 11558.173828125
[TRAIN] Iter: 32900 Loss: 0.057691074907779694 PSNR: 15.50700569152832, Alpha: 0.9988917708396912, GradNorm 0.30660492467453454, Mem: 11558.173828125
[TRAIN] Iter: 33000 Loss: 0.06903114914894104 PSNR: 14.665007591247559, Alpha: 0.9951984286308289, GradNorm 0.511899955346338, Mem: 11558.173828125
[TRAIN] Iter: 33100 Loss: 0.07593315839767456 PSNR: 14.030905723571777, Alpha: 0.9969836473464966, GradNorm 0.5503930200881153, Mem: 11558.173828125
[TRAIN] Iter: 33200 Loss: 0.0666099339723587 PSNR: 14.711423873901367, Alpha: 0.9941115379333496, GradNorm 0.3741920228425078, Mem: 11558.173828125
[TRAIN] Iter: 33300 Loss: 0.0672687441110611 PSNR: 14.816298484802246, Alpha: 0.9911221265792847, GradNorm 0.4256135305380222, Mem: 11558.173828125
[TRAIN] Iter: 33400 Loss: 0.06147979572415352 PSNR: 15.038351058959961, Alpha: 0.9999959468841553, GradNorm 0.2709167317087116, Mem: 11558.173828125
[TRAIN] Iter: 33500 Loss: 0.0594363734126091 PSNR: 15.258538246154785, Alpha: 0.9996747374534607, GradNorm 0.2721872156268885, Mem: 11558.173828125
[TRAIN] Iter: 33600 Loss: 0.06995741277933121 PSNR: 14.482847213745117, Alpha: 0.9932740926742554, GradNorm 0.5018653227129359, Mem: 11558.173828125
[TRAIN] Iter: 33700 Loss: 0.06606128066778183 PSNR: 14.840986251831055, Alpha: 0.9960150718688965, GradNorm 0.3425039431615131, Mem: 11558.173828125
[TRAIN] Iter: 33800 Loss: 0.0664965957403183 PSNR: 14.87355899810791, Alpha: 0.9969398379325867, GradNorm 0.4366092350256546, Mem: 11558.173828125
[TRAIN] Iter: 33900 Loss: 0.07654775679111481 PSNR: 13.960235595703125, Alpha: 0.9973047971725464, GradNorm 0.42781830354781525, Mem: 11558.173828125
[TRAIN] Iter: 34000 Loss: 0.06807838380336761 PSNR: 14.631058692932129, Alpha: 0.9954320788383484, GradNorm 0.30424342046989317, Mem: 11558.173828125
[TRAIN] Iter: 34100 Loss: 0.06232819706201553 PSNR: 14.931653022766113, Alpha: 0.9896658658981323, GradNorm 0.48224246485687305, Mem: 11558.173828125
[TRAIN] Iter: 34200 Loss: 0.07271525263786316 PSNR: 14.476004600524902, Alpha: 0.9952120780944824, GradNorm 0.5452800835113768, Mem: 11558.173828125
[TRAIN] Iter: 34300 Loss: 0.06635601818561554 PSNR: 14.730703353881836, Alpha: 0.9997681379318237, GradNorm 0.3863084784405741, Mem: 11558.173828125
[TRAIN] Iter: 34400 Loss: 0.07122237980365753 PSNR: 14.275368690490723, Alpha: 0.9990891218185425, GradNorm 0.5766967548468129, Mem: 11558.173828125
[TRAIN] Iter: 34500 Loss: 0.06208750605583191 PSNR: 14.925012588500977, Alpha: 0.9980449676513672, GradNorm 0.42416627197664436, Mem: 11558.173828125
[TRAIN] Iter: 34600 Loss: 0.07726624608039856 PSNR: 14.017037391662598, Alpha: 0.9925546646118164, GradNorm 0.3789634460050178, Mem: 11558.173828125
[TRAIN] Iter: 34700 Loss: 0.05511750280857086 PSNR: 15.450047492980957, Alpha: 0.9999381303787231, GradNorm 0.2831236187346073, Mem: 11558.173828125
[TRAIN] Iter: 34800 Loss: 0.06724397838115692 PSNR: 14.528141021728516, Alpha: 0.9961955547332764, GradNorm 0.3965636945681134, Mem: 11558.173828125
[TRAIN] Iter: 34900 Loss: 0.07003599405288696 PSNR: 14.324361801147461, Alpha: 0.9958547353744507, GradNorm 0.3943435499800273, Mem: 11558.173828125
[TRAIN] Iter: 35000 Loss: 0.06958357989788055 PSNR: 14.647502899169922, Alpha: 0.9992309808731079, GradNorm 0.4080844833756868, Mem: 11558.173828125
[TRAIN] Iter: 35100 Loss: 0.07345978915691376 PSNR: 14.258734703063965, Alpha: 0.992795467376709, GradNorm 0.3988877199684538, Mem: 11558.173828125
[TRAIN] Iter: 35200 Loss: 0.06873226910829544 PSNR: 14.430109024047852, Alpha: 0.9867998957633972, GradNorm 0.46100161635565734, Mem: 11558.173828125
[TRAIN] Iter: 35300 Loss: 0.0630650743842125 PSNR: 14.9495849609375, Alpha: 0.9986239671707153, GradNorm 0.32501924665233395, Mem: 11558.173828125
[TRAIN] Iter: 35400 Loss: 0.05899481102824211 PSNR: 15.415192604064941, Alpha: 0.9965686202049255, GradNorm 0.37211289484392895, Mem: 11558.173828125
[TRAIN] Iter: 35500 Loss: 0.06684710085391998 PSNR: 14.77835750579834, Alpha: 0.9980272054672241, GradNorm 0.48337070111552527, Mem: 11558.173828125
[TRAIN] Iter: 35600 Loss: 0.07525724917650223 PSNR: 14.207860946655273, Alpha: 0.9975347518920898, GradNorm 0.34579545845707216, Mem: 11558.173828125
[TRAIN] Iter: 35700 Loss: 0.06588057428598404 PSNR: 14.876081466674805, Alpha: 0.9999161958694458, GradNorm 0.4410860676938699, Mem: 11558.173828125
[TRAIN] Iter: 35800 Loss: 0.06530914455652237 PSNR: 14.86019229888916, Alpha: 0.9977065920829773, GradNorm 0.3058312009366289, Mem: 11558.173828125
[TRAIN] Iter: 35900 Loss: 0.062333106994628906 PSNR: 15.226061820983887, Alpha: 0.9996174573898315, GradNorm 0.35858244949934026, Mem: 11558.173828125
[TRAIN] Iter: 36000 Loss: 0.06903006136417389 PSNR: 14.8812255859375, Alpha: 0.9981603026390076, GradNorm 0.31899148179509995, Mem: 11558.173828125
[TRAIN] Iter: 36100 Loss: 0.057255737483501434 PSNR: 15.549270629882812, Alpha: 0.999863862991333, GradNorm 0.4160599612365918, Mem: 11558.173828125
[TRAIN] Iter: 36200 Loss: 0.06871870160102844 PSNR: 14.70432186126709, Alpha: 0.9971593618392944, GradNorm 0.35915218371361235, Mem: 11558.173828125
[TRAIN] Iter: 36300 Loss: 0.061197854578495026 PSNR: 15.111434936523438, Alpha: 0.9948693513870239, GradNorm 0.29556399142571266, Mem: 11558.173828125
[TRAIN] Iter: 36400 Loss: 0.06935475021600723 PSNR: 14.404176712036133, Alpha: 0.9960058927536011, GradNorm 0.4768227674764621, Mem: 11558.173828125
[TRAIN] Iter: 36500 Loss: 0.062035590410232544 PSNR: 15.091710090637207, Alpha: 0.9965261816978455, GradNorm 0.3537343868257693, Mem: 11558.173828125
[TRAIN] Iter: 36600 Loss: 0.08057915419340134 PSNR: 13.978107452392578, Alpha: 0.9975348711013794, GradNorm 0.5943433883894454, Mem: 11558.173828125
[TRAIN] Iter: 36700 Loss: 0.06105542927980423 PSNR: 15.097787857055664, Alpha: 0.9989720582962036, GradNorm 0.3604764559675207, Mem: 11558.173828125
[TRAIN] Iter: 36800 Loss: 0.06521028280258179 PSNR: 15.017525672912598, Alpha: 0.9998647570610046, GradNorm 0.3013267900185372, Mem: 11558.173828125
[TRAIN] Iter: 36900 Loss: 0.06354732066392899 PSNR: 14.939696311950684, Alpha: 0.99510657787323, GradNorm 0.37134930111202546, Mem: 11558.173828125
[TRAIN] Iter: 37000 Loss: 0.06877867877483368 PSNR: 14.495482444763184, Alpha: 0.9980462789535522, GradNorm 0.7190527905478431, Mem: 11558.173828125
[TRAIN] Iter: 37100 Loss: 0.06348437070846558 PSNR: 14.76634693145752, Alpha: 0.9971158504486084, GradNorm 0.44917165721474434, Mem: 11558.173828125
[TRAIN] Iter: 37200 Loss: 0.06844134628772736 PSNR: 14.750186920166016, Alpha: 0.9833506941795349, GradNorm 0.6731477829407206, Mem: 11558.173828125
[TRAIN] Iter: 37300 Loss: 0.060241393744945526 PSNR: 15.065364837646484, Alpha: 0.9984006881713867, GradNorm 0.3315978997730848, Mem: 11558.173828125
[TRAIN] Iter: 37400 Loss: 0.05717800557613373 PSNR: 15.395343780517578, Alpha: 0.9997563362121582, GradNorm 0.302886530278021, Mem: 11558.173828125
[TRAIN] Iter: 37500 Loss: 0.06464120000600815 PSNR: 14.880130767822266, Alpha: 0.9961671829223633, GradNorm 0.29159475466781976, Mem: 11558.173828125
[TRAIN] Iter: 37600 Loss: 0.05779410898685455 PSNR: 15.482939720153809, Alpha: 0.9923428893089294, GradNorm 0.3071901565303681, Mem: 11558.173828125
[TRAIN] Iter: 37700 Loss: 0.06717249751091003 PSNR: 14.760726928710938, Alpha: 0.9985036849975586, GradNorm 0.2982527300328501, Mem: 11558.173828125
[TRAIN] Iter: 37800 Loss: 0.06220659613609314 PSNR: 14.999661445617676, Alpha: 0.9913196563720703, GradNorm 0.5039014351078633, Mem: 11558.173828125
[TRAIN] Iter: 37900 Loss: 0.07293715327978134 PSNR: 14.532902717590332, Alpha: 0.9919548034667969, GradNorm 0.30763553190404674, Mem: 11558.173828125
[TRAIN] Iter: 38000 Loss: 0.07144412398338318 PSNR: 14.48179817199707, Alpha: 0.9881129264831543, GradNorm 0.28095618891965696, Mem: 11558.173828125
[TRAIN] Iter: 38100 Loss: 0.0637359619140625 PSNR: 14.890506744384766, Alpha: 0.9972547292709351, GradNorm 0.42823162755219746, Mem: 11558.173828125
[TRAIN] Iter: 38200 Loss: 0.07066385447978973 PSNR: 14.441457748413086, Alpha: 0.999038577079773, GradNorm 0.3684906408644917, Mem: 11558.173828125
[TRAIN] Iter: 38300 Loss: 0.06277932226657867 PSNR: 14.933694839477539, Alpha: 0.999304473400116, GradNorm 0.318428539616697, Mem: 11558.173828125
[TRAIN] Iter: 38400 Loss: 0.06597168743610382 PSNR: 14.681987762451172, Alpha: 0.996488094329834, GradNorm 0.6244442921835669, Mem: 11558.173828125
[TRAIN] Iter: 38500 Loss: 0.06106036156415939 PSNR: 15.016288757324219, Alpha: 0.996790885925293, GradNorm 0.2590706149857804, Mem: 11558.173828125
[TRAIN] Iter: 38600 Loss: 0.0580185204744339 PSNR: 15.405230522155762, Alpha: 0.9946569204330444, GradNorm 0.4005649994535107, Mem: 11558.173828125
[TRAIN] Iter: 38700 Loss: 0.0607675276696682 PSNR: 14.96713924407959, Alpha: 0.9946756362915039, GradNorm 0.3074305441830724, Mem: 11558.173828125
[TRAIN] Iter: 38800 Loss: 0.06617005169391632 PSNR: 14.818761825561523, Alpha: 0.9930927753448486, GradNorm 0.3169020303131572, Mem: 11558.173828125
[TRAIN] Iter: 38900 Loss: 0.07463806867599487 PSNR: 14.342111587524414, Alpha: 0.9902682304382324, GradNorm 0.42375790561838, Mem: 11558.173828125
[TRAIN] Iter: 39000 Loss: 0.07103808224201202 PSNR: 14.268620491027832, Alpha: 0.9899341464042664, GradNorm 0.38224792011638664, Mem: 11558.173828125
[TRAIN] Iter: 39100 Loss: 0.06873877346515656 PSNR: 14.712340354919434, Alpha: 0.9942666888237, GradNorm 0.5402286172755876, Mem: 11558.173828125
[TRAIN] Iter: 39200 Loss: 0.062334224581718445 PSNR: 14.884342193603516, Alpha: 0.9829503893852234, GradNorm 0.31930007343623973, Mem: 11558.173828125
[TRAIN] Iter: 39300 Loss: 0.06374157965183258 PSNR: 14.867518424987793, Alpha: 0.9996334910392761, GradNorm 0.35710713602426375, Mem: 11558.173828125
[TRAIN] Iter: 39400 Loss: 0.06474065780639648 PSNR: 14.623903274536133, Alpha: 0.9954198598861694, GradNorm 0.3605612178616854, Mem: 11558.173828125
[TRAIN] Iter: 39500 Loss: 0.06896379590034485 PSNR: 14.480658531188965, Alpha: 0.9989134669303894, GradNorm 0.39765320889766337, Mem: 11558.173828125
[TRAIN] Iter: 39600 Loss: 0.06655129045248032 PSNR: 14.678607940673828, Alpha: 0.9936563968658447, GradNorm 0.3656689228549784, Mem: 11558.173828125
[TRAIN] Iter: 39700 Loss: 0.07748734205961227 PSNR: 13.93455982208252, Alpha: 0.9943124055862427, GradNorm 0.3006806510477336, Mem: 11558.173828125
[TRAIN] Iter: 39800 Loss: 0.06999894976615906 PSNR: 14.538573265075684, Alpha: 0.9962007403373718, GradNorm 0.47427251051803687, Mem: 11558.173828125
[TRAIN] Iter: 39900 Loss: 0.06891077011823654 PSNR: 14.735848426818848, Alpha: 0.9899019002914429, GradNorm 0.3484309434560732, Mem: 11558.173828125
Saved checkpoints at ./logs/rat_default/040000.tar
Bounding Box Debug Info:
x_min: -0.016303576469421557, x_max: 0.13524496459960925
y_min: 0.13631874084472656, y_max: 0.2370219573974607
z_min: 0.03008332061767555, z_max: 0.13808740234375
Bounding Box Debug Info:
x_min: 0.20284332275390612, x_max: 0.27888725280761706
y_min: -0.10573136138916016, y_max: 0.04925373077392578
z_min: 0.007576402664184343, z_max: 0.10356272888183571
Bounding Box Debug Info:
x_min: -0.016059140205383415, x_max: 0.14520207214355438
y_min: -0.14237770080566428, y_max: -0.05226047515869141
z_min: 0.01878752899169899, z_max: 0.10969234466552712
Bounding Box Debug Info:
x_min: 0.04626746368408189, x_max: 0.12963674926757793
y_min: -0.117417900085449, y_max: 0.037399066925048824
z_min: 0.022188674926757586, z_max: 0.10861693572998048
Bounding Box Debug Info:
x_min: 0.05005798721313454, x_max: 0.15824763488769517
y_min: -0.04143564224243164, y_max: 0.10374114990234376
z_min: 0.027995925903320087, z_max: 0.16608097839355446
Bounding Box Debug Info:
x_min: 0.018135158538818245, x_max: 0.16837405395507798
y_min: 0.14043280029296876, y_max: 0.26155438232421874
z_min: 0.017705509801884546, z_max: 0.10332373046874978
Bounding Box Debug Info:
x_min: -0.22748263549804704, x_max: -0.07638658905029319
y_min: 0.11107346343994118, y_max: 0.22296235656738259
z_min: 0.01744731903076149, z_max: 0.11063830566406228
Bounding Box Debug Info:
x_min: -0.2488652343750003, x_max: -0.11536257934570324
y_min: -0.017549812316894533, y_max: 0.11087640380859376
z_min: 0.005051356315612565, z_max: 0.08966143035888649
Bounding Box Debug Info:
x_min: -0.21859460449218762, x_max: -0.12297888183593772
y_min: 0.0287499084472654, y_max: 0.10050344085693337
z_min: 0.022367975234985123, z_max: 0.17305021667480458
Bounding Box Debug Info:
x_min: -0.2076027832031251, x_max: -0.10125875854492215
y_min: -0.05550039672851585, y_max: 0.07307929992675782
z_min: 0.00415080495777488, z_max: 0.09007923889160134
Bounding Box Debug Info:
x_min: -0.17041168212890642, x_max: -0.036025394439697385
y_min: 0.11028720092773414, y_max: 0.22693572998046876
z_min: 0.022087579727172624, z_max: 0.1054314270019529
Bounding Box Debug Info:
x_min: -0.1705599060058596, x_max: -0.038622104644775564
y_min: 0.1101719741821289, y_max: 0.2275417175292969
z_min: 0.021979381561279068, z_max: 0.10639647674560525
Bounding Box Debug Info:
x_min: -0.17065034484863298, x_max: -0.03775608825683617
y_min: 0.11022159576415992, y_max: 0.22663485717773438
z_min: 0.021593749999999773, z_max: 0.10601044464111306
Bounding Box Debug Info:
x_min: -0.1705158691406251, x_max: -0.03697266769409191
y_min: 0.10976802825927712, y_max: 0.22746969604492165
z_min: 0.021498142242431413, z_max: 0.10463920593261719
Bounding Box Debug Info:
x_min: -0.17116410827636735, x_max: -0.03743118286132818
y_min: 0.11016936492919921, y_max: 0.22665449523925782
z_min: 0.02142527198791481, z_max: 0.1051624069213865
center: None
Pose 0: Valid indices count = 1391744
center: None
Pose 1: Valid indices count = 1391744
center: None
Pose 2: Valid indices count = 1391744
center: None
Pose 3: Valid indices count = 1391744
center: None
Pose 4: Valid indices count = 1391744
center: None
Pose 5: Valid indices count = 1391744
center: None
Pose 6: Valid indices count = 1391744
center: None
Pose 7: Valid indices count = 1391744
center: None
Pose 8: Valid indices count = 1391744
center: None
Pose 9: Valid indices count = 1391744
center: None
Pose 10: Valid indices count = 1391744
center: None
Pose 11: Valid indices count = 1391744
center: None
Pose 12: Valid indices count = 1391744
center: None
Pose 13: Valid indices count = 1391744
center: None
Pose 14: Valid indices count = 1391744
0 0.0010209083557128906
1 46.919886350631714
2 46.901962757110596
3 46.980144023895264
4 47.03640007972717
5 47.02134966850281
6 47.03787851333618
7 47.01145100593567
8 47.03399968147278
9 47.00906801223755
10 47.02764892578125
11 46.9969277381897
12 47.01318860054016
13 46.99541139602661
14 47.006776571273804
Predicted image shape (th_rgbs): torch.Size([15, 3, 1048, 1328])
Ground truth image shape (th_gt): torch.Size([15, 3, 1048, 1328])
Any NaNs in th_rgbs: tensor(False, device='cpu')
Any NaNs in th_gt: tensor(False, device='cpu')
Any Infs in th_rgbs: tensor(False, device='cpu')
Any Infs in th_gt: tensor(False, device='cpu')
Evaluate PSNR: 22.75172487303406 (15.200958656646261), SSIM: 0.9726582765579224 (0.7331493302032122)
[TRAIN] Iter: 40000 Loss: 0.06026335805654526 PSNR: 15.11142635345459, Alpha: 0.9955965280532837, GradNorm 0.32309284471886074, Mem: 11558.173828125
[TRAIN] Iter: 40100 Loss: 0.07044088840484619 PSNR: 14.509525299072266, Alpha: 0.9848423004150391, GradNorm 0.33632293114479883, Mem: 11558.173828125
[TRAIN] Iter: 40200 Loss: 0.06093515455722809 PSNR: 15.25975227355957, Alpha: 0.9988438487052917, GradNorm 0.3535657929963094, Mem: 11558.173828125
[TRAIN] Iter: 40300 Loss: 0.06345811486244202 PSNR: 14.814788818359375, Alpha: 0.9976140856742859, GradNorm 0.33077274009515445, Mem: 11558.173828125
[TRAIN] Iter: 40400 Loss: 0.07432683557271957 PSNR: 14.278756141662598, Alpha: 0.9954264163970947, GradNorm 0.49932401167078194, Mem: 11558.173828125
[TRAIN] Iter: 40500 Loss: 0.0670669674873352 PSNR: 14.758288383483887, Alpha: 0.9932855367660522, GradNorm 0.29896367281919006, Mem: 11558.173828125
[TRAIN] Iter: 40600 Loss: 0.06291566789150238 PSNR: 15.102864265441895, Alpha: 0.9973945617675781, GradNorm 0.27530478986060064, Mem: 11558.173828125
[TRAIN] Iter: 40700 Loss: 0.058217570185661316 PSNR: 15.37808609008789, Alpha: 0.9933458566665649, GradNorm 0.38790409654255303, Mem: 11558.173828125
[TRAIN] Iter: 40800 Loss: 0.06356412917375565 PSNR: 14.902010917663574, Alpha: 0.9980705976486206, GradNorm 0.5423817269595289, Mem: 11558.173828125
[TRAIN] Iter: 40900 Loss: 0.05964215099811554 PSNR: 15.186602592468262, Alpha: 0.9962046146392822, GradNorm 0.4721439571308287, Mem: 11558.173828125
[TRAIN] Iter: 41000 Loss: 0.07414574921131134 PSNR: 13.991476058959961, Alpha: 0.98455810546875, GradNorm 0.47953922570881474, Mem: 11558.173828125
[TRAIN] Iter: 41100 Loss: 0.06186044216156006 PSNR: 15.153182983398438, Alpha: 0.9954009056091309, GradNorm 0.3568919928465437, Mem: 11558.173828125
[TRAIN] Iter: 41200 Loss: 0.056734099984169006 PSNR: 15.45553970336914, Alpha: 0.998896598815918, GradNorm 0.2965239922980511, Mem: 11558.173828125
[TRAIN] Iter: 41300 Loss: 0.06655921041965485 PSNR: 14.704672813415527, Alpha: 0.9957375526428223, GradNorm 0.32302319466483903, Mem: 11558.173828125
[TRAIN] Iter: 41400 Loss: 0.06859402358531952 PSNR: 14.61442756652832, Alpha: 0.994320273399353, GradNorm 0.35769402490672886, Mem: 11558.173828125
[TRAIN] Iter: 41500 Loss: 0.06857025623321533 PSNR: 14.509525299072266, Alpha: 0.9980823397636414, GradNorm 0.4201186645848711, Mem: 11558.173828125
[TRAIN] Iter: 41600 Loss: 0.06403973698616028 PSNR: 14.945178031921387, Alpha: 0.9910011291503906, GradNorm 0.3995259530336346, Mem: 11558.173828125
[TRAIN] Iter: 41700 Loss: 0.06178385019302368 PSNR: 15.078171730041504, Alpha: 0.9975438117980957, GradNorm 0.34270161065771737, Mem: 11558.173828125
[TRAIN] Iter: 41800 Loss: 0.06193602830171585 PSNR: 15.031960487365723, Alpha: 0.9968239665031433, GradNorm 0.49315042094353756, Mem: 11558.173828125
[TRAIN] Iter: 41900 Loss: 0.06568743288516998 PSNR: 14.72323989868164, Alpha: 0.9948193430900574, GradNorm 0.4184903134291114, Mem: 11558.173828125
[TRAIN] Iter: 42000 Loss: 0.06095592677593231 PSNR: 15.002605438232422, Alpha: 0.9919247627258301, GradNorm 0.2569499657617977, Mem: 11558.173828125
[TRAIN] Iter: 42100 Loss: 0.06248226761817932 PSNR: 14.950459480285645, Alpha: 0.9989416599273682, GradNorm 0.3366666152459606, Mem: 11558.173828125
[TRAIN] Iter: 42200 Loss: 0.054491184651851654 PSNR: 15.567501068115234, Alpha: 0.9990166425704956, GradNorm 0.2609230093119295, Mem: 11558.173828125
[TRAIN] Iter: 42300 Loss: 0.057700589299201965 PSNR: 15.391874313354492, Alpha: 0.9966907501220703, GradNorm 0.4984189549780012, Mem: 11558.173828125
[TRAIN] Iter: 42400 Loss: 0.06255150586366653 PSNR: 15.249772071838379, Alpha: 0.9978551864624023, GradNorm 0.39398098735658066, Mem: 11558.173828125
[TRAIN] Iter: 42500 Loss: 0.06420333683490753 PSNR: 14.851030349731445, Alpha: 0.9786956310272217, GradNorm 0.3462555538175047, Mem: 11558.173828125
[TRAIN] Iter: 42600 Loss: 0.06347671896219254 PSNR: 15.123757362365723, Alpha: 0.9996600151062012, GradNorm 0.3309676615758138, Mem: 11558.173828125
[TRAIN] Iter: 42700 Loss: 0.07136398553848267 PSNR: 14.274288177490234, Alpha: 0.987263560295105, GradNorm 0.5209275674340877, Mem: 11558.173828125
[TRAIN] Iter: 42800 Loss: 0.06217275187373161 PSNR: 14.79295825958252, Alpha: 0.9924108386039734, GradNorm 0.457353558990739, Mem: 11558.173828125
[TRAIN] Iter: 42900 Loss: 0.05887734144926071 PSNR: 15.190601348876953, Alpha: 0.996353030204773, GradNorm 0.462488066151906, Mem: 11558.173828125
[TRAIN] Iter: 43000 Loss: 0.06274916231632233 PSNR: 15.010895729064941, Alpha: 0.9979531168937683, GradNorm 0.4711960667593109, Mem: 11558.173828125
[TRAIN] Iter: 43100 Loss: 0.0698748230934143 PSNR: 14.707219123840332, Alpha: 0.9885330200195312, GradNorm 0.35300043223404765, Mem: 11558.173828125
[TRAIN] Iter: 43200 Loss: 0.05962388962507248 PSNR: 15.273103713989258, Alpha: 0.9905163645744324, GradNorm 0.30540626030485446, Mem: 11558.173828125
[TRAIN] Iter: 43300 Loss: 0.06320229917764664 PSNR: 14.974052429199219, Alpha: 0.9926189184188843, GradNorm 0.2930374095348284, Mem: 11558.173828125
[TRAIN] Iter: 43400 Loss: 0.0676405280828476 PSNR: 14.81754207611084, Alpha: 0.9871574640274048, GradNorm 0.3363737606901633, Mem: 11558.173828125
[TRAIN] Iter: 43500 Loss: 0.056475963443517685 PSNR: 15.456110000610352, Alpha: 0.9952532052993774, GradNorm 0.24706278987527452, Mem: 11558.173828125
[TRAIN] Iter: 43600 Loss: 0.06526242196559906 PSNR: 14.899370193481445, Alpha: 0.9919456243515015, GradNorm 0.3960271681276382, Mem: 11558.173828125
[TRAIN] Iter: 43700 Loss: 0.06048848479986191 PSNR: 15.210874557495117, Alpha: 0.9867396354675293, GradNorm 0.2500266347608301, Mem: 11558.173828125
[TRAIN] Iter: 43800 Loss: 0.055998995900154114 PSNR: 15.614248275756836, Alpha: 0.9965593814849854, GradNorm 0.4177851542461779, Mem: 11558.173828125
[TRAIN] Iter: 43900 Loss: 0.058033883571624756 PSNR: 15.41997241973877, Alpha: 0.9935322999954224, GradNorm 0.2626214157854874, Mem: 11558.173828125
[TRAIN] Iter: 44000 Loss: 0.06427422165870667 PSNR: 14.980792045593262, Alpha: 0.9968828558921814, GradNorm 0.41960106910069234, Mem: 11558.173828125
[TRAIN] Iter: 44100 Loss: 0.06381412595510483 PSNR: 15.049248695373535, Alpha: 0.9999970197677612, GradNorm 0.5815065326319259, Mem: 11558.173828125
[TRAIN] Iter: 44200 Loss: 0.06422969698905945 PSNR: 14.914631843566895, Alpha: 0.9990212321281433, GradNorm 0.26936975381729705, Mem: 11558.173828125
[TRAIN] Iter: 44300 Loss: 0.06263037025928497 PSNR: 15.110718727111816, Alpha: 0.9823076128959656, GradNorm 0.3253264805182251, Mem: 11558.173828125
[TRAIN] Iter: 44400 Loss: 0.06484408676624298 PSNR: 14.92439079284668, Alpha: 0.9982744455337524, GradNorm 0.27502623911877416, Mem: 11558.173828125
[TRAIN] Iter: 44500 Loss: 0.05735905468463898 PSNR: 15.062281608581543, Alpha: 0.9941033124923706, GradNorm 0.31623477952359336, Mem: 11558.173828125
[TRAIN] Iter: 44600 Loss: 0.062471017241477966 PSNR: 15.118406295776367, Alpha: 0.9923000335693359, GradNorm 0.3348737958465407, Mem: 11558.173828125
[TRAIN] Iter: 44700 Loss: 0.06350202113389969 PSNR: 14.81750774383545, Alpha: 0.9882221221923828, GradNorm 0.3850064291989722, Mem: 11558.173828125
[TRAIN] Iter: 44800 Loss: 0.06420037150382996 PSNR: 15.31371784210205, Alpha: 0.9989868402481079, GradNorm 0.6483292173238554, Mem: 11558.173828125
[TRAIN] Iter: 44900 Loss: 0.059009164571762085 PSNR: 15.248708724975586, Alpha: 0.9972981214523315, GradNorm 0.3303721731009428, Mem: 11558.173828125
[TRAIN] Iter: 45000 Loss: 0.06000060215592384 PSNR: 15.33598518371582, Alpha: 0.9935247302055359, GradNorm 0.29458714736154723, Mem: 11558.173828125
[TRAIN] Iter: 45100 Loss: 0.059061937034130096 PSNR: 15.296406745910645, Alpha: 0.995445728302002, GradNorm 0.6446984377377383, Mem: 11558.173828125
[TRAIN] Iter: 45200 Loss: 0.05689052492380142 PSNR: 15.313490867614746, Alpha: 0.9964094161987305, GradNorm 0.32707733169944664, Mem: 11558.173828125
[TRAIN] Iter: 45300 Loss: 0.06293270736932755 PSNR: 15.132230758666992, Alpha: 0.9996517896652222, GradNorm 0.39038462746416014, Mem: 11558.173828125
[TRAIN] Iter: 45400 Loss: 0.06254787743091583 PSNR: 15.168198585510254, Alpha: 0.9996625781059265, GradNorm 0.43390114341570063, Mem: 11558.173828125
[TRAIN] Iter: 45500 Loss: 0.06284557282924652 PSNR: 14.949583053588867, Alpha: 0.9908807277679443, GradNorm 0.478154944345663, Mem: 11558.173828125
[TRAIN] Iter: 45600 Loss: 0.0588466040790081 PSNR: 15.354253768920898, Alpha: 0.9940263628959656, GradNorm 0.3298122943103268, Mem: 11558.173828125
[TRAIN] Iter: 45700 Loss: 0.06480202078819275 PSNR: 14.872370719909668, Alpha: 0.9958197474479675, GradNorm 0.3445887098149994, Mem: 11558.173828125
[TRAIN] Iter: 45800 Loss: 0.0665280744433403 PSNR: 14.657292366027832, Alpha: 0.9954926371574402, GradNorm 0.41906164792335643, Mem: 11558.173828125
[TRAIN] Iter: 45900 Loss: 0.06129935383796692 PSNR: 15.003866195678711, Alpha: 0.9838894605636597, GradNorm 0.2845206315536629, Mem: 11558.173828125
[TRAIN] Iter: 46000 Loss: 0.06478478014469147 PSNR: 14.81215763092041, Alpha: 0.9984545707702637, GradNorm 0.3715516350148181, Mem: 11558.173828125
[TRAIN] Iter: 46100 Loss: 0.06362609565258026 PSNR: 14.9805269241333, Alpha: 0.9900335669517517, GradNorm 0.3730953321241716, Mem: 11558.173828125
[TRAIN] Iter: 46200 Loss: 0.060312412679195404 PSNR: 15.192756652832031, Alpha: 0.9912204742431641, GradNorm 0.4338127371978715, Mem: 11558.173828125
[TRAIN] Iter: 46300 Loss: 0.057733479887247086 PSNR: 15.382120132446289, Alpha: 0.998034656047821, GradNorm 0.37264838096454456, Mem: 11558.173828125
[TRAIN] Iter: 46400 Loss: 0.059273459017276764 PSNR: 15.218443870544434, Alpha: 0.9912663698196411, GradNorm 0.36934175340152114, Mem: 11558.173828125
[TRAIN] Iter: 46500 Loss: 0.061207838356494904 PSNR: 15.101162910461426, Alpha: 0.9829525947570801, GradNorm 0.4183816499840461, Mem: 11558.173828125
[TRAIN] Iter: 46600 Loss: 0.052688851952552795 PSNR: 15.573709487915039, Alpha: 0.9930338263511658, GradNorm 0.42458664199874374, Mem: 11558.173828125
[TRAIN] Iter: 46700 Loss: 0.053076691925525665 PSNR: 15.815766334533691, Alpha: 0.999125599861145, GradNorm 0.3620047364355365, Mem: 11558.173828125
[TRAIN] Iter: 46800 Loss: 0.05844370275735855 PSNR: 15.101482391357422, Alpha: 0.9921358823776245, GradNorm 0.32890199133165865, Mem: 11558.173828125
[TRAIN] Iter: 46900 Loss: 0.0618489533662796 PSNR: 15.081158638000488, Alpha: 0.9963786005973816, GradNorm 0.3752058629164176, Mem: 11558.173828125
[TRAIN] Iter: 47000 Loss: 0.06768728792667389 PSNR: 14.725130081176758, Alpha: 0.9950650930404663, GradNorm 0.36867702460409424, Mem: 11558.173828125
[TRAIN] Iter: 47100 Loss: 0.06367281079292297 PSNR: 15.185117721557617, Alpha: 0.9877790212631226, GradNorm 0.44844959682188656, Mem: 11558.173828125
[TRAIN] Iter: 47200 Loss: 0.05600505694746971 PSNR: 15.493731498718262, Alpha: 0.9955963492393494, GradNorm 0.33625984871055964, Mem: 11558.173828125
[TRAIN] Iter: 47300 Loss: 0.06567969918251038 PSNR: 14.636714935302734, Alpha: 0.9897472858428955, GradNorm 0.34479799949602596, Mem: 11558.173828125
[TRAIN] Iter: 47400 Loss: 0.07311083376407623 PSNR: 14.026487350463867, Alpha: 0.9876173734664917, GradNorm 0.5639887888980059, Mem: 11558.173828125
[TRAIN] Iter: 47500 Loss: 0.058427613228559494 PSNR: 15.381183624267578, Alpha: 0.9995390176773071, GradNorm 0.40068402629862643, Mem: 11558.173828125
[TRAIN] Iter: 47600 Loss: 0.06759899854660034 PSNR: 14.631183624267578, Alpha: 0.9863094091415405, GradNorm 0.406674062280046, Mem: 11558.173828125
[TRAIN] Iter: 47700 Loss: 0.0652020052075386 PSNR: 14.87861156463623, Alpha: 0.9922118186950684, GradNorm 0.4138543673454866, Mem: 11558.173828125
[TRAIN] Iter: 47800 Loss: 0.05909428745508194 PSNR: 15.160724639892578, Alpha: 0.9790000319480896, GradNorm 0.40872348642145057, Mem: 11558.173828125
[TRAIN] Iter: 47900 Loss: 0.05793038010597229 PSNR: 15.1986083984375, Alpha: 0.9982537031173706, GradNorm 0.39168649856254434, Mem: 11558.173828125
[TRAIN] Iter: 48000 Loss: 0.06622669845819473 PSNR: 14.824636459350586, Alpha: 0.9965116381645203, GradNorm 0.31266909068646204, Mem: 11558.173828125
[TRAIN] Iter: 48100 Loss: 0.05892721563577652 PSNR: 15.64183235168457, Alpha: 0.9937596321105957, GradNorm 0.33336450883545327, Mem: 11558.173828125
[TRAIN] Iter: 48200 Loss: 0.05861327797174454 PSNR: 15.70448112487793, Alpha: 0.998167872428894, GradNorm 0.5624347025416343, Mem: 11558.173828125
[TRAIN] Iter: 48300 Loss: 0.050607629120349884 PSNR: 15.98928451538086, Alpha: 0.9989184141159058, GradNorm 0.2486718568460715, Mem: 11558.173828125
[TRAIN] Iter: 48400 Loss: 0.0659506693482399 PSNR: 14.753314018249512, Alpha: 0.9870612025260925, GradNorm 0.3845636688195509, Mem: 11558.173828125
[TRAIN] Iter: 48500 Loss: 0.06482657790184021 PSNR: 14.853557586669922, Alpha: 0.9904563426971436, GradNorm 0.42376916656924285, Mem: 11558.173828125
[TRAIN] Iter: 48600 Loss: 0.07749173045158386 PSNR: 14.506294250488281, Alpha: 0.9939370155334473, GradNorm 0.9236460549799186, Mem: 11558.173828125
[TRAIN] Iter: 48700 Loss: 0.05603981390595436 PSNR: 15.511957168579102, Alpha: 0.9912629127502441, GradNorm 0.2628473756218261, Mem: 11558.173828125
[TRAIN] Iter: 48800 Loss: 0.06566785275936127 PSNR: 15.041994094848633, Alpha: 0.9864538908004761, GradNorm 0.492780303014872, Mem: 11558.173828125
[TRAIN] Iter: 48900 Loss: 0.06574802100658417 PSNR: 14.867985725402832, Alpha: 0.9955222010612488, GradNorm 0.517769792093925, Mem: 11558.173828125
[TRAIN] Iter: 49000 Loss: 0.057714466005563736 PSNR: 15.328323364257812, Alpha: 0.9985362887382507, GradNorm 0.31026175669949274, Mem: 11558.173828125
[TRAIN] Iter: 49100 Loss: 0.06499660015106201 PSNR: 14.918743133544922, Alpha: 0.9903026819229126, GradNorm 0.3673902449463751, Mem: 11558.173828125
[TRAIN] Iter: 49200 Loss: 0.06306901574134827 PSNR: 14.945324897766113, Alpha: 0.9926246404647827, GradNorm 0.3437909614775867, Mem: 11558.173828125
[TRAIN] Iter: 49300 Loss: 0.05755721777677536 PSNR: 15.177298545837402, Alpha: 0.9959596395492554, GradNorm 0.4035346931472126, Mem: 11558.173828125
[TRAIN] Iter: 49400 Loss: 0.05525907874107361 PSNR: 15.469616889953613, Alpha: 0.9987165331840515, GradNorm 0.3195836985696464, Mem: 11558.173828125
[TRAIN] Iter: 49500 Loss: 0.05641057342290878 PSNR: 15.330248832702637, Alpha: 0.997900664806366, GradNorm 0.42370865132568614, Mem: 11558.173828125
[TRAIN] Iter: 49600 Loss: 0.054580122232437134 PSNR: 15.67291259765625, Alpha: 0.9878174662590027, GradNorm 0.3913024172729286, Mem: 11558.173828125
[TRAIN] Iter: 49700 Loss: 0.058702751994132996 PSNR: 15.396486282348633, Alpha: 0.997992753982544, GradNorm 0.31381874372034546, Mem: 11558.173828125
[TRAIN] Iter: 49800 Loss: 0.06515157222747803 PSNR: 14.757065773010254, Alpha: 0.9988183975219727, GradNorm 0.5722365419259784, Mem: 11558.173828125
[TRAIN] Iter: 49900 Loss: 0.06314641982316971 PSNR: 14.997566223144531, Alpha: 0.9907368421554565, GradNorm 0.2795880914999428, Mem: 11558.173828125
Saved checkpoints at ./logs/rat_default/050000.tar
Bounding Box Debug Info:
x_min: -0.016303576469421557, x_max: 0.13524496459960925
y_min: 0.13631874084472656, y_max: 0.2370219573974607
z_min: 0.03008332061767555, z_max: 0.13808740234375
Bounding Box Debug Info:
x_min: 0.20284332275390612, x_max: 0.27888725280761706
y_min: -0.10573136138916016, y_max: 0.04925373077392578
z_min: 0.007576402664184343, z_max: 0.10356272888183571
Bounding Box Debug Info:
x_min: -0.016059140205383415, x_max: 0.14520207214355438
y_min: -0.14237770080566428, y_max: -0.05226047515869141
z_min: 0.01878752899169899, z_max: 0.10969234466552712
Bounding Box Debug Info:
x_min: 0.04626746368408189, x_max: 0.12963674926757793
y_min: -0.117417900085449, y_max: 0.037399066925048824
z_min: 0.022188674926757586, z_max: 0.10861693572998048
Bounding Box Debug Info:
x_min: 0.05005798721313454, x_max: 0.15824763488769517
y_min: -0.04143564224243164, y_max: 0.10374114990234376
z_min: 0.027995925903320087, z_max: 0.16608097839355446
Bounding Box Debug Info:
x_min: 0.018135158538818245, x_max: 0.16837405395507798
y_min: 0.14043280029296876, y_max: 0.26155438232421874
z_min: 0.017705509801884546, z_max: 0.10332373046874978
Bounding Box Debug Info:
x_min: -0.22748263549804704, x_max: -0.07638658905029319
y_min: 0.11107346343994118, y_max: 0.22296235656738259
z_min: 0.01744731903076149, z_max: 0.11063830566406228
Bounding Box Debug Info:
x_min: -0.2488652343750003, x_max: -0.11536257934570324
y_min: -0.017549812316894533, y_max: 0.11087640380859376
z_min: 0.005051356315612565, z_max: 0.08966143035888649
Bounding Box Debug Info:
x_min: -0.21859460449218762, x_max: -0.12297888183593772
y_min: 0.0287499084472654, y_max: 0.10050344085693337
z_min: 0.022367975234985123, z_max: 0.17305021667480458
Bounding Box Debug Info:
x_min: -0.2076027832031251, x_max: -0.10125875854492215
y_min: -0.05550039672851585, y_max: 0.07307929992675782
z_min: 0.00415080495777488, z_max: 0.09007923889160134
Bounding Box Debug Info:
x_min: -0.17041168212890642, x_max: -0.036025394439697385
y_min: 0.11028720092773414, y_max: 0.22693572998046876
z_min: 0.022087579727172624, z_max: 0.1054314270019529
Bounding Box Debug Info:
x_min: -0.1705599060058596, x_max: -0.038622104644775564
y_min: 0.1101719741821289, y_max: 0.2275417175292969
z_min: 0.021979381561279068, z_max: 0.10639647674560525
Bounding Box Debug Info:
x_min: -0.17065034484863298, x_max: -0.03775608825683617
y_min: 0.11022159576415992, y_max: 0.22663485717773438
z_min: 0.021593749999999773, z_max: 0.10601044464111306
Bounding Box Debug Info:
x_min: -0.1705158691406251, x_max: -0.03697266769409191
y_min: 0.10976802825927712, y_max: 0.22746969604492165
z_min: 0.021498142242431413, z_max: 0.10463920593261719
Bounding Box Debug Info:
x_min: -0.17116410827636735, x_max: -0.03743118286132818
y_min: 0.11016936492919921, y_max: 0.22665449523925782
z_min: 0.02142527198791481, z_max: 0.1051624069213865
center: None
Pose 0: Valid indices count = 1391744
center: None
Pose 1: Valid indices count = 1391744
center: None
Pose 2: Valid indices count = 1391744
center: None
Pose 3: Valid indices count = 1391744
center: None
Pose 4: Valid indices count = 1391744
center: None
Pose 5: Valid indices count = 1391744
center: None
Pose 6: Valid indices count = 1391744
center: None
Pose 7: Valid indices count = 1391744
center: None
Pose 8: Valid indices count = 1391744
center: None
Pose 9: Valid indices count = 1391744
center: None
Pose 10: Valid indices count = 1391744
center: None
Pose 11: Valid indices count = 1391744
center: None
Pose 12: Valid indices count = 1391744
center: None
Pose 13: Valid indices count = 1391744
center: None
Pose 14: Valid indices count = 1391744
0 0.0009961128234863281
1 46.858123540878296
2 46.89125418663025
3 46.97939968109131
4 47.00898504257202
5 46.98870587348938
6 46.9944167137146
7 46.96762299537659
8 46.98552060127258
9 46.952847480773926
10 46.96713876724243
11 46.929147720336914
12 46.94833278656006
13 46.92840123176575
14 46.93993806838989
Predicted image shape (th_rgbs): torch.Size([15, 3, 1048, 1328])
Ground truth image shape (th_gt): torch.Size([15, 3, 1048, 1328])
Any NaNs in th_rgbs: tensor(False, device='cpu')
Any NaNs in th_gt: tensor(False, device='cpu')
Any Infs in th_rgbs: tensor(False, device='cpu')
Any Infs in th_gt: tensor(False, device='cpu')
Evaluate PSNR: 22.151882450964994 (15.204380459828826), SSIM: 0.9716911315917969 (0.73394471422684)
[TRAIN] Iter: 50000 Loss: 0.058105532079935074 PSNR: 15.189362525939941, Alpha: 0.9979998469352722, GradNorm 0.2840148607227131, Mem: 11558.173828125
[TRAIN] Iter: 50100 Loss: 0.05895264446735382 PSNR: 15.220385551452637, Alpha: 0.9835258722305298, GradNorm 0.38727034049459647, Mem: 11558.173828125
[TRAIN] Iter: 50200 Loss: 0.06578987836837769 PSNR: 14.668155670166016, Alpha: 0.9871050119400024, GradNorm 0.5323695298708102, Mem: 11558.173828125
[TRAIN] Iter: 50300 Loss: 0.05647384002804756 PSNR: 15.292465209960938, Alpha: 0.9977972507476807, GradNorm 0.4511156000309645, Mem: 11558.173828125
[TRAIN] Iter: 50400 Loss: 0.05369134247303009 PSNR: 15.612868309020996, Alpha: 0.9952961802482605, GradNorm 0.29123665945117716, Mem: 11558.173828125
[TRAIN] Iter: 50500 Loss: 0.06571171432733536 PSNR: 15.036233901977539, Alpha: 0.9970635175704956, GradNorm 0.41067177711498226, Mem: 11558.173828125
[TRAIN] Iter: 50600 Loss: 0.05596110224723816 PSNR: 15.527755737304688, Alpha: 0.9953183531761169, GradNorm 0.3017420617136427, Mem: 11558.173828125
[TRAIN] Iter: 50700 Loss: 0.06471474468708038 PSNR: 14.865389823913574, Alpha: 0.9922960996627808, GradNorm 0.37171601512821273, Mem: 11558.173828125
[TRAIN] Iter: 50800 Loss: 0.06971690058708191 PSNR: 14.692752838134766, Alpha: 0.9878718256950378, GradNorm 0.41942440054198543, Mem: 11558.173828125
[TRAIN] Iter: 50900 Loss: 0.059634312987327576 PSNR: 15.399162292480469, Alpha: 0.9900102019309998, GradNorm 0.37563786182013137, Mem: 11558.173828125
[TRAIN] Iter: 51000 Loss: 0.06428678333759308 PSNR: 14.89380168914795, Alpha: 0.9915561676025391, GradNorm 0.3577377036732193, Mem: 11558.173828125
[TRAIN] Iter: 51100 Loss: 0.0664040595293045 PSNR: 14.550498008728027, Alpha: 0.9929485321044922, GradNorm 0.39331133369125243, Mem: 11558.173828125
[TRAIN] Iter: 51200 Loss: 0.061399899423122406 PSNR: 15.005654335021973, Alpha: 0.9908628463745117, GradNorm 0.3866057860920289, Mem: 11558.173828125
[TRAIN] Iter: 51300 Loss: 0.05776628106832504 PSNR: 15.46190357208252, Alpha: 0.9934760928153992, GradNorm 0.35201066674416814, Mem: 11558.173828125
[TRAIN] Iter: 51400 Loss: 0.05603978782892227 PSNR: 15.377033233642578, Alpha: 0.9881032705307007, GradNorm 0.2590555431437716, Mem: 11558.173828125
[TRAIN] Iter: 51500 Loss: 0.060580506920814514 PSNR: 15.246253967285156, Alpha: 0.9961029291152954, GradNorm 0.2938522405741412, Mem: 11558.173828125
[TRAIN] Iter: 51600 Loss: 0.05653645098209381 PSNR: 15.664031028747559, Alpha: 0.99943608045578, GradNorm 0.372264025379156, Mem: 11558.173828125
[TRAIN] Iter: 51700 Loss: 0.05940168350934982 PSNR: 15.251336097717285, Alpha: 0.9883896708488464, GradNorm 0.2684890029941094, Mem: 11558.173828125
[TRAIN] Iter: 51800 Loss: 0.058517858386039734 PSNR: 15.351390838623047, Alpha: 0.9983496069908142, GradNorm 0.30776522540897283, Mem: 11558.173828125
[TRAIN] Iter: 51900 Loss: 0.06432626396417618 PSNR: 15.203666687011719, Alpha: 0.992949366569519, GradNorm 0.546373812994929, Mem: 11558.173828125
[TRAIN] Iter: 52000 Loss: 0.06645672768354416 PSNR: 14.789229393005371, Alpha: 0.9945947527885437, GradNorm 0.39641947723623916, Mem: 11558.173828125
[TRAIN] Iter: 52100 Loss: 0.055972471833229065 PSNR: 15.574606895446777, Alpha: 0.9927202463150024, GradNorm 0.2858569088117984, Mem: 11558.173828125
[TRAIN] Iter: 52200 Loss: 0.05917409434914589 PSNR: 15.254250526428223, Alpha: 0.995121419429779, GradNorm 0.33830629098554305, Mem: 11558.173828125
[TRAIN] Iter: 52300 Loss: 0.0532609298825264 PSNR: 15.800980567932129, Alpha: 0.9903849959373474, GradNorm 0.3205262052061225, Mem: 11558.173828125
[TRAIN] Iter: 52400 Loss: 0.06221266835927963 PSNR: 15.126782417297363, Alpha: 0.9913926124572754, GradNorm 0.29532575583361803, Mem: 11558.173828125
[TRAIN] Iter: 52500 Loss: 0.06813754886388779 PSNR: 14.745637893676758, Alpha: 0.9907919764518738, GradNorm 0.5402200903624608, Mem: 11558.173828125
[TRAIN] Iter: 52600 Loss: 0.06291498243808746 PSNR: 15.050938606262207, Alpha: 0.9840816259384155, GradNorm 0.44179048431173823, Mem: 11558.173828125
[TRAIN] Iter: 52700 Loss: 0.05269574746489525 PSNR: 15.821185111999512, Alpha: 0.9974943995475769, GradNorm 0.35057121099084243, Mem: 11558.173828125
[TRAIN] Iter: 52800 Loss: 0.0647878497838974 PSNR: 14.779779434204102, Alpha: 0.9787309169769287, GradNorm 0.4604729957228669, Mem: 11558.173828125
[TRAIN] Iter: 52900 Loss: 0.058308519423007965 PSNR: 15.095874786376953, Alpha: 0.9914320707321167, GradNorm 0.3698005519311737, Mem: 11558.173828125
[TRAIN] Iter: 53000 Loss: 0.05521899834275246 PSNR: 15.689702987670898, Alpha: 0.9961808919906616, GradNorm 0.3815587115860932, Mem: 11558.173828125
[TRAIN] Iter: 53100 Loss: 0.05604460462927818 PSNR: 15.515490531921387, Alpha: 0.9957966208457947, GradNorm 0.2911862310383203, Mem: 11558.173828125
[TRAIN] Iter: 53200 Loss: 0.04966666176915169 PSNR: 16.015470504760742, Alpha: 0.9983052611351013, GradNorm 0.3606774120554297, Mem: 11558.173828125
[TRAIN] Iter: 53300 Loss: 0.06414413452148438 PSNR: 14.986684799194336, Alpha: 0.9986133575439453, GradNorm 0.3347257312354418, Mem: 11558.173828125
[TRAIN] Iter: 53400 Loss: 0.06665113568305969 PSNR: 14.931570053100586, Alpha: 0.9920744299888611, GradNorm 0.38655010025542336, Mem: 11558.173828125
[TRAIN] Iter: 53500 Loss: 0.05652924254536629 PSNR: 15.21403694152832, Alpha: 0.9928452372550964, GradNorm 0.3034945419758555, Mem: 11558.173828125
[TRAIN] Iter: 53600 Loss: 0.05967526137828827 PSNR: 15.399975776672363, Alpha: 0.9942139387130737, GradNorm 0.42014945970953077, Mem: 11558.173828125
[TRAIN] Iter: 53700 Loss: 0.057706866413354874 PSNR: 15.229686737060547, Alpha: 0.9950081706047058, GradNorm 0.32379654595387747, Mem: 11558.173828125
[TRAIN] Iter: 53800 Loss: 0.05413372814655304 PSNR: 15.810196876525879, Alpha: 0.9945660829544067, GradNorm 0.6159664055313017, Mem: 11558.173828125
[TRAIN] Iter: 53900 Loss: 0.061323486268520355 PSNR: 14.841399192810059, Alpha: 0.9873390197753906, GradNorm 0.4717260487074443, Mem: 11558.173828125
[TRAIN] Iter: 54000 Loss: 0.054029516875743866 PSNR: 15.829535484313965, Alpha: 0.9992579221725464, GradNorm 0.3099340594793708, Mem: 11558.173828125
[TRAIN] Iter: 54100 Loss: 0.06321076303720474 PSNR: 15.131776809692383, Alpha: 0.9882491827011108, GradNorm 0.28262035315940826, Mem: 11558.173828125
[TRAIN] Iter: 54200 Loss: 0.05876796692609787 PSNR: 15.14811897277832, Alpha: 0.9972754716873169, GradNorm 0.3722245405130614, Mem: 11558.173828125
[TRAIN] Iter: 54300 Loss: 0.05378064513206482 PSNR: 15.575678825378418, Alpha: 0.9973217248916626, GradNorm 0.30027867949691495, Mem: 11558.173828125
[TRAIN] Iter: 54400 Loss: 0.06435342133045197 PSNR: 15.085831642150879, Alpha: 0.9927409887313843, GradNorm 0.8661561139985581, Mem: 11558.173828125
[TRAIN] Iter: 54500 Loss: 0.056262869387865067 PSNR: 15.56800365447998, Alpha: 0.9973823428153992, GradNorm 0.3509946532453024, Mem: 11558.173828125
[TRAIN] Iter: 54600 Loss: 0.05062917619943619 PSNR: 15.761314392089844, Alpha: 0.9962481260299683, GradNorm 0.34838038259932935, Mem: 11558.173828125
[TRAIN] Iter: 54700 Loss: 0.05606471002101898 PSNR: 15.45236873626709, Alpha: 0.9961941242218018, GradNorm 0.5446949463375016, Mem: 11558.173828125
[TRAIN] Iter: 54800 Loss: 0.05638191103935242 PSNR: 15.491366386413574, Alpha: 0.9956643581390381, GradNorm 0.3992045079220274, Mem: 11558.173828125
[TRAIN] Iter: 54900 Loss: 0.053719133138656616 PSNR: 15.473672866821289, Alpha: 0.992242693901062, GradNorm 0.32368995193149547, Mem: 11558.173828125
[TRAIN] Iter: 55000 Loss: 0.05765252560377121 PSNR: 15.341387748718262, Alpha: 0.9946484565734863, GradNorm 0.4005849536056774, Mem: 11558.173828125
[TRAIN] Iter: 55100 Loss: 0.06703218817710876 PSNR: 14.542020797729492, Alpha: 0.9907339811325073, GradNorm 0.3959524003460306, Mem: 11558.173828125
[TRAIN] Iter: 55200 Loss: 0.06182589381933212 PSNR: 15.28776741027832, Alpha: 0.9936490058898926, GradNorm 0.4521116959426028, Mem: 11558.173828125
[TRAIN] Iter: 55300 Loss: 0.05752597749233246 PSNR: 15.507491111755371, Alpha: 0.9881829023361206, GradNorm 0.2997876985156495, Mem: 11558.173828125
[TRAIN] Iter: 55400 Loss: 0.0611291229724884 PSNR: 14.974214553833008, Alpha: 0.9831762313842773, GradNorm 0.4300154218602808, Mem: 11558.173828125
[TRAIN] Iter: 55500 Loss: 0.05995774641633034 PSNR: 15.342151641845703, Alpha: 0.995067298412323, GradNorm 0.30099340048020257, Mem: 11558.173828125
[TRAIN] Iter: 55600 Loss: 0.05718737095594406 PSNR: 15.29163932800293, Alpha: 0.993287205696106, GradNorm 0.3503937758550557, Mem: 11558.173828125
[TRAIN] Iter: 55700 Loss: 0.051094040274620056 PSNR: 15.795233726501465, Alpha: 0.995969295501709, GradNorm 0.321528994938153, Mem: 11558.173828125
[TRAIN] Iter: 55800 Loss: 0.05370189994573593 PSNR: 15.699714660644531, Alpha: 0.9968480467796326, GradNorm 0.33549941118757304, Mem: 11558.173828125
[TRAIN] Iter: 55900 Loss: 0.05553240329027176 PSNR: 15.456072807312012, Alpha: 0.9982190132141113, GradNorm 0.36247902328255965, Mem: 11558.173828125
[TRAIN] Iter: 56000 Loss: 0.061712466180324554 PSNR: 15.150819778442383, Alpha: 0.9933829307556152, GradNorm 0.35667886889987616, Mem: 11558.173828125
[TRAIN] Iter: 56100 Loss: 0.054931506514549255 PSNR: 15.618511199951172, Alpha: 0.9974164366722107, GradNorm 0.31650972361446716, Mem: 11558.173828125
[TRAIN] Iter: 56200 Loss: 0.05372673273086548 PSNR: 15.512728691101074, Alpha: 0.9954649806022644, GradNorm 0.3042403392116199, Mem: 11558.173828125
[TRAIN] Iter: 56300 Loss: 0.06087212264537811 PSNR: 15.32063102722168, Alpha: 0.9962644577026367, GradNorm 0.42243560064408464, Mem: 11558.173828125
[TRAIN] Iter: 56400 Loss: 0.05006065219640732 PSNR: 16.047319412231445, Alpha: 0.9972860813140869, GradNorm 0.4415435022103416, Mem: 11558.173828125
[TRAIN] Iter: 56500 Loss: 0.05098789185285568 PSNR: 16.052732467651367, Alpha: 0.9972233772277832, GradNorm 0.3641407177818706, Mem: 11558.173828125
[TRAIN] Iter: 56600 Loss: 0.05514749139547348 PSNR: 15.778087615966797, Alpha: 0.9961304068565369, GradNorm 0.34587851299563643, Mem: 11558.173828125
[TRAIN] Iter: 56700 Loss: 0.06436574459075928 PSNR: 15.011810302734375, Alpha: 0.995250403881073, GradNorm 0.3386358084430265, Mem: 11558.173828125
[TRAIN] Iter: 56800 Loss: 0.056435488164424896 PSNR: 15.285340309143066, Alpha: 0.9967214465141296, GradNorm 0.41271382275156226, Mem: 11558.173828125
[TRAIN] Iter: 56900 Loss: 0.05457620322704315 PSNR: 15.979351043701172, Alpha: 0.9992966651916504, GradNorm 0.44493457764151195, Mem: 11558.173828125
[TRAIN] Iter: 57000 Loss: 0.06147640198469162 PSNR: 15.222270011901855, Alpha: 0.9912230968475342, GradNorm 0.3799770892958643, Mem: 11558.173828125
[TRAIN] Iter: 57100 Loss: 0.05592219531536102 PSNR: 15.813860893249512, Alpha: 0.9962358474731445, GradNorm 0.3971468815085494, Mem: 11558.173828125
[TRAIN] Iter: 57200 Loss: 0.057438820600509644 PSNR: 15.185134887695312, Alpha: 0.9879419207572937, GradNorm 0.24155794931356336, Mem: 11558.173828125
[TRAIN] Iter: 57300 Loss: 0.047984108328819275 PSNR: 16.232345581054688, Alpha: 0.9955338835716248, GradNorm 0.26258677495408594, Mem: 11558.173828125
[TRAIN] Iter: 57400 Loss: 0.05343136936426163 PSNR: 15.737112998962402, Alpha: 0.9994131326675415, GradNorm 0.26467632660911866, Mem: 11558.173828125
[TRAIN] Iter: 57500 Loss: 0.053275950253009796 PSNR: 15.634912490844727, Alpha: 0.9999350309371948, GradNorm 0.3069671462819184, Mem: 11558.173828125
[TRAIN] Iter: 57600 Loss: 0.054779402911663055 PSNR: 15.463504791259766, Alpha: 0.9928545951843262, GradNorm 0.392867451025022, Mem: 11558.173828125
[TRAIN] Iter: 57700 Loss: 0.055730465799570084 PSNR: 15.431191444396973, Alpha: 0.9869550466537476, GradNorm 0.33694976464219184, Mem: 11558.173828125
[TRAIN] Iter: 57800 Loss: 0.06273643672466278 PSNR: 14.869658470153809, Alpha: 0.9971306324005127, GradNorm 0.6433756967187971, Mem: 11558.173828125
[TRAIN] Iter: 57900 Loss: 0.051855772733688354 PSNR: 15.782047271728516, Alpha: 0.9889594912528992, GradNorm 0.2995418529751881, Mem: 11558.173828125
[TRAIN] Iter: 58000 Loss: 0.05642227455973625 PSNR: 15.461297035217285, Alpha: 0.9947780966758728, GradNorm 0.433683027778311, Mem: 11558.173828125
[TRAIN] Iter: 58100 Loss: 0.0633750855922699 PSNR: 14.903092384338379, Alpha: 0.9874871969223022, GradNorm 0.32876750668210036, Mem: 11558.173828125
[TRAIN] Iter: 58200 Loss: 0.05729319900274277 PSNR: 15.33437728881836, Alpha: 0.9916062355041504, GradNorm 0.31246813471665436, Mem: 11558.173828125
[TRAIN] Iter: 58300 Loss: 0.05123782902956009 PSNR: 15.834428787231445, Alpha: 0.9976340532302856, GradNorm 0.35491808107564293, Mem: 11558.173828125
[TRAIN] Iter: 58400 Loss: 0.058154575526714325 PSNR: 15.393817901611328, Alpha: 0.9949058294296265, GradNorm 0.28192403514431696, Mem: 11558.173828125
[TRAIN] Iter: 58500 Loss: 0.05728883296251297 PSNR: 15.587623596191406, Alpha: 0.9999325275421143, GradNorm 0.3603256947954725, Mem: 11558.173828125
[TRAIN] Iter: 58600 Loss: 0.054293327033519745 PSNR: 15.855406761169434, Alpha: 0.9848021268844604, GradNorm 0.27436872248636246, Mem: 11558.173828125
[TRAIN] Iter: 58700 Loss: 0.05691192299127579 PSNR: 15.460902214050293, Alpha: 0.9892644882202148, GradNorm 0.3340681403302985, Mem: 11558.173828125
[TRAIN] Iter: 58800 Loss: 0.05777928978204727 PSNR: 15.37510871887207, Alpha: 0.9904429316520691, GradNorm 0.4927137913473384, Mem: 11558.173828125
[TRAIN] Iter: 58900 Loss: 0.04910638928413391 PSNR: 15.99137020111084, Alpha: 0.9951162338256836, GradNorm 0.3632079171154266, Mem: 11558.173828125
[TRAIN] Iter: 59000 Loss: 0.06296167522668839 PSNR: 14.884933471679688, Alpha: 0.9932724833488464, GradNorm 0.3631487330109711, Mem: 11558.173828125
[TRAIN] Iter: 59100 Loss: 0.04937943071126938 PSNR: 15.976011276245117, Alpha: 0.9851778745651245, GradNorm 0.3016970914767622, Mem: 11558.173828125
[TRAIN] Iter: 59200 Loss: 0.05393635481595993 PSNR: 15.833044052124023, Alpha: 0.9936081767082214, GradNorm 0.2560019307279969, Mem: 11558.173828125
[TRAIN] Iter: 59300 Loss: 0.05420570820569992 PSNR: 15.64539909362793, Alpha: 1.0, GradNorm 0.29499178449413765, Mem: 11558.173828125
[TRAIN] Iter: 59400 Loss: 0.06286279857158661 PSNR: 14.964385986328125, Alpha: 0.9972966909408569, GradNorm 0.3698350388797063, Mem: 11558.173828125
[TRAIN] Iter: 59500 Loss: 0.05446171760559082 PSNR: 15.643678665161133, Alpha: 0.98838210105896, GradNorm 0.4124199257243224, Mem: 11558.173828125
[TRAIN] Iter: 59600 Loss: 0.06369943916797638 PSNR: 14.86291790008545, Alpha: 0.9839510321617126, GradNorm 0.3969134592906294, Mem: 11558.173828125
[TRAIN] Iter: 59700 Loss: 0.06250420212745667 PSNR: 14.908791542053223, Alpha: 0.9873477220535278, GradNorm 0.308908370714365, Mem: 11558.173828125
[TRAIN] Iter: 59800 Loss: 0.05851300060749054 PSNR: 15.719139099121094, Alpha: 0.996955156326294, GradNorm 0.565542604706601, Mem: 11558.173828125
[TRAIN] Iter: 59900 Loss: 0.0568920373916626 PSNR: 15.44765853881836, Alpha: 0.9950200319290161, GradNorm 0.3308633938004227, Mem: 11558.173828125
Saved checkpoints at ./logs/rat_default/060000.tar
Bounding Box Debug Info:
x_min: -0.016303576469421557, x_max: 0.13524496459960925
y_min: 0.13631874084472656, y_max: 0.2370219573974607
z_min: 0.03008332061767555, z_max: 0.13808740234375
Bounding Box Debug Info:
x_min: 0.20284332275390612, x_max: 0.27888725280761706
y_min: -0.10573136138916016, y_max: 0.04925373077392578
z_min: 0.007576402664184343, z_max: 0.10356272888183571
Bounding Box Debug Info:
x_min: -0.016059140205383415, x_max: 0.14520207214355438
y_min: -0.14237770080566428, y_max: -0.05226047515869141
z_min: 0.01878752899169899, z_max: 0.10969234466552712
Bounding Box Debug Info:
x_min: 0.04626746368408189, x_max: 0.12963674926757793
y_min: -0.117417900085449, y_max: 0.037399066925048824
z_min: 0.022188674926757586, z_max: 0.10861693572998048
Bounding Box Debug Info:
x_min: 0.05005798721313454, x_max: 0.15824763488769517
y_min: -0.04143564224243164, y_max: 0.10374114990234376
z_min: 0.027995925903320087, z_max: 0.16608097839355446
Bounding Box Debug Info:
x_min: 0.018135158538818245, x_max: 0.16837405395507798
y_min: 0.14043280029296876, y_max: 0.26155438232421874
z_min: 0.017705509801884546, z_max: 0.10332373046874978
Bounding Box Debug Info:
x_min: -0.22748263549804704, x_max: -0.07638658905029319
y_min: 0.11107346343994118, y_max: 0.22296235656738259
z_min: 0.01744731903076149, z_max: 0.11063830566406228
Bounding Box Debug Info:
x_min: -0.2488652343750003, x_max: -0.11536257934570324
y_min: -0.017549812316894533, y_max: 0.11087640380859376
z_min: 0.005051356315612565, z_max: 0.08966143035888649
Bounding Box Debug Info:
x_min: -0.21859460449218762, x_max: -0.12297888183593772