
#######################################################################
Please cite the following paper when using nnU-Net:
Isensee, F., Jaeger, P. F., Kohl, S. A., Petersen, J., & Maier-Hein, K. H. (2021). nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation. Nature Methods, 18(2), 203-211.
#######################################################################
 

This is the configuration used by this training:
Configuration name: 2d
 {'data_identifier': 'nnUNetPlans_2d', 'preprocessor_name': 'DefaultPreprocessor', 'batch_size': 198, 'patch_size': [128, 128], 'median_image_size_in_voxels': [128.0, 128.0], 'spacing': [1.5, 1.5], 'normalization_schemes': ['ZScoreNormalization'], 'use_mask_for_norm': [False], 'UNet_class_name': 'PlainConvUNet', 'UNet_base_num_features': 32, 'n_conv_per_stage_encoder': [2, 2, 2, 2, 2, 2], 'n_conv_per_stage_decoder': [2, 2, 2, 2, 2], 'num_pool_per_axis': [5, 5], 'pool_op_kernel_sizes': [[1, 1], [2, 2], [2, 2], [2, 2], [2, 2], [2, 2]], 'conv_kernel_sizes': [[3, 3], [3, 3], [3, 3], [3, 3], [3, 3], [3, 3]], 'unet_max_num_features': 512, 'resampling_fn_data': 'resample_data_or_seg_to_shape', 'resampling_fn_seg': 'resample_data_or_seg_to_shape', 'resampling_fn_data_kwargs': {'is_seg': False, 'order': 3, 'order_z': 0, 'force_separate_z': None}, 'resampling_fn_seg_kwargs': {'is_seg': True, 'order': 1, 'order_z': 0, 'force_separate_z': None}, 'resampling_fn_probabilities': 'resample_data_or_seg_to_shape', 'resampling_fn_probabilities_kwargs': {'is_seg': False, 'order': 1, 'order_z': 0, 'force_separate_z': None}, 'batch_dice': True} 
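The plan above fully determines the network geometry: six resolution stages (one per entry in `pool_op_kernel_sizes`), with five 2x poolings per axis, so the 128x128 patch shrinks to 4x4 at the bottleneck, and feature maps double per stage up to the `unet_max_num_features` cap. A minimal sketch of these derived quantities (plain Python, values copied from the plan; not nnU-Net code):

```python
# Values copied from the 2d configuration above.
plan = {
    "patch_size": [128, 128],
    "num_pool_per_axis": [5, 5],
    "pool_op_kernel_sizes": [[1, 1], [2, 2], [2, 2], [2, 2], [2, 2], [2, 2]],
    "UNet_base_num_features": 32,
    "unet_max_num_features": 512,
}

n_stages = len(plan["pool_op_kernel_sizes"])  # 6 resolution stages

# Spatial size at the bottleneck: each axis is halved once per pooling.
bottleneck = [p // 2 ** n
              for p, n in zip(plan["patch_size"], plan["num_pool_per_axis"])]

# Feature maps double per stage, capped at unet_max_num_features.
features = [min(plan["UNet_base_num_features"] * 2 ** s,
                plan["unet_max_num_features"])
            for s in range(n_stages)]

print(n_stages, bottleneck, features)  # 6 [4, 4] [32, 64, 128, 256, 512, 512]
```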
 
These are the global plan.json settings:
 {'dataset_name': 'Dataset223_ICTS2023', 'plans_name': 'nnUNetPlans', 'original_median_spacing_after_transp': [1.5, 1.5, 1.5], 'original_median_shape_after_transp': [128, 128, 128], 'image_reader_writer': 'SimpleITKIO', 'transpose_forward': [0, 1, 2], 'transpose_backward': [0, 1, 2], 'experiment_planner_used': 'ExperimentPlanner', 'label_manager': 'LabelManager', 'foreground_intensity_properties_per_channel': {'0': {'max': 1.0, 'mean': 0.3944229185581207, 'median': 0.4175388216972351, 'min': -1.0, 'percentile_00_5': -0.658677875995636, 'percentile_99_5': 0.9519757628440857, 'std': 0.26940569281578064}}} 
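The plan names `ZScoreNormalization` as the normalization scheme. For non-CT channels this normalizes each image by its own mean and standard deviation (the per-channel foreground statistics above are collected for planning, not applied directly) — an assumption based on the scheme name, sketched here on a plain list rather than an image array:

```python
from statistics import fmean, pstdev

def zscore(values, eps=1e-8):
    """Per-image z-score normalization, as named by the plan's
    'ZScoreNormalization' scheme (sketch; nnU-Net operates on arrays)."""
    mu = fmean(values)
    sd = pstdev(values)
    return [(v - mu) / max(sd, eps) for v in values]

out = zscore([0.0, 0.5, 1.0])  # zero mean, unit variance afterwards
```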
 
2023-11-22 15:36:14.175167: unpacking dataset... 
2023-11-22 15:36:18.887230: unpacking done... 
2023-11-22 15:36:18.893133: do_dummy_2d_data_aug: False 
2023-11-22 15:36:18.903549: Using splits from existing split file: /home/xfr/nnUNet/preprocessed/Dataset223_ICTS2023/splits_final.json 
2023-11-22 15:36:18.904855: The split file contains 5 splits. 
2023-11-22 15:36:18.904935: Desired fold for training: 0 
2023-11-22 15:36:18.904998: This split has 1600 training and 400 validation cases. 
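The split file read here is plain JSON: a list with one dict per fold, each holding `train` and `val` lists of case identifiers (an assumption about the layout; the case names below are made up, the real file lives at the path in the log). A self-contained sketch of reading fold 0:

```python
import json

# Hypothetical splits in the nnU-Net splits_final.json layout:
# one dict per fold with 'train' and 'val' case-identifier lists.
splits = [
    {"train": [f"case_{i:04d}" for i in range(1600)],
     "val":   [f"case_{i:04d}" for i in range(1600, 2000)]},
]

fold0 = json.loads(json.dumps(splits))[0]  # round-trip as the file would be read
n_train, n_val = len(fold0["train"]), len(fold0["val"])
print(n_train, n_val)  # matches the log line above: 1600 training, 400 validation
```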
2023-11-22 15:36:18.921630: Unable to plot network architecture: 
2023-11-22 15:36:18.921685: No module named 'hiddenlayer' 
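The two lines above are non-fatal: architecture plotting is optional, and the trainer simply logs the `ImportError` and continues training. A sketch of such a guard (an assumption, not the exact nnU-Net code):

```python
# Optional-dependency guard: plotting is skipped, training proceeds.
try:
    import hiddenlayer  # optional package used only for architecture plots
    can_plot = True
    reason = ""
except ImportError as e:
    can_plot = False
    reason = str(e)  # e.g. "No module named 'hiddenlayer'"

print(can_plot, reason)
```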
2023-11-22 15:36:18.945258:  
2023-11-22 15:36:18.945380: Epoch 0 
2023-11-22 15:36:18.945533: Current learning rate: 0.01 
2023-11-22 15:39:47.660260: train_loss 0.1003 
2023-11-22 15:39:47.660462: val_loss 0.0237 
2023-11-22 15:39:47.660617: Pseudo dice [0.0, nan] 
2023-11-22 15:39:47.660743: Epoch time: 208.72 s 
2023-11-22 15:39:47.660838: Yayy! New best EMA pseudo Dice: 0.0 
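The per-epoch learning rates in this log (0.01, 0.00999, 0.00998, ...) are consistent with nnU-Net's default polynomial decay over 1000 epochs with exponent 0.9. A reconstruction under that assumption:

```python
def poly_lr(epoch, initial_lr=0.01, max_epochs=1000, exponent=0.9):
    """Polynomial LR decay; reproduces the 'Current learning rate' values
    in this log (assumed: nnU-Net's default PolyLRScheduler settings)."""
    return initial_lr * (1 - epoch / max_epochs) ** exponent

for e in (0, 1, 2, 10):
    print(e, round(poly_lr(e), 5))  # 0.01, 0.00999, 0.00998, 0.00991
```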
2023-11-22 15:39:48.888382:  
2023-11-22 15:39:48.888496: Epoch 1 
2023-11-22 15:39:48.888598: Current learning rate: 0.00999 
2023-11-22 15:43:10.209475: train_loss -0.0307 
2023-11-22 15:43:10.209685: val_loss -0.204 
2023-11-22 15:43:10.209764: Pseudo dice [0.5301, nan] 
2023-11-22 15:43:10.209834: Epoch time: 201.32 s 
2023-11-22 15:43:10.209918: Yayy! New best EMA pseudo Dice: 0.053 
2023-11-22 15:43:11.313102:  
2023-11-22 15:43:11.313225: Epoch 2 
2023-11-22 15:43:11.313326: Current learning rate: 0.00998 
2023-11-22 15:46:32.932100: train_loss -0.2119 
2023-11-22 15:46:32.932287: val_loss -0.2636 
2023-11-22 15:46:32.932361: Pseudo dice [0.5839, nan] 
2023-11-22 15:46:32.932431: Epoch time: 201.62 s 
2023-11-22 15:46:32.932491: Yayy! New best EMA pseudo Dice: 0.1061 
2023-11-22 15:46:34.110329:  
2023-11-22 15:46:34.110522: Epoch 3 
2023-11-22 15:46:34.110645: Current learning rate: 0.00997 
2023-11-22 15:49:55.641326: train_loss -0.2708 
2023-11-22 15:49:55.641554: val_loss -0.3075 
2023-11-22 15:49:55.641634: Pseudo dice [0.6437, nan] 
2023-11-22 15:49:55.641706: Epoch time: 201.53 s 
2023-11-22 15:49:55.641763: Yayy! New best EMA pseudo Dice: 0.1599 
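The "EMA pseudo Dice" sequence (0.0, 0.053, 0.1061, 0.1599, ...) is an exponential moving average, with an assumed smoothing factor of 0.9, of each epoch's mean foreground Dice (the `nan` class is excluded from the mean). A reconstruction that reproduces the values printed above:

```python
def ema_pseudo_dice(per_epoch_dice, alpha=0.9):
    """EMA of the per-epoch pseudo Dice; alpha=0.9 reproduces the
    'New best EMA pseudo Dice' values in this log (assumed factor)."""
    ema, out = None, []
    for d in per_epoch_dice:
        ema = d if ema is None else alpha * ema + (1 - alpha) * d
        out.append(round(ema, 4))
    return out

# Per-epoch pseudo Dice from epochs 0-3 (nan class dropped from the mean):
ema = ema_pseudo_dice([0.0, 0.5301, 0.5839, 0.6437])
print(ema)  # [0.0, 0.053, 0.1061, 0.1599] -- matches the log
```

Because the EMA starts at the first epoch's value, a single good epoch moves it only slowly, which is why "New best EMA pseudo Dice" keeps climbing for many epochs even while the raw pseudo Dice plateaus around 0.7.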
2023-11-22 15:49:56.782266:  
2023-11-22 15:49:56.782389: Epoch 4 
2023-11-22 15:49:56.782488: Current learning rate: 0.00996 
2023-11-22 15:53:17.802092: train_loss -0.2701 
2023-11-22 15:53:17.802318: val_loss -0.3088 
2023-11-22 15:53:17.802398: Pseudo dice [0.6618, nan] 
2023-11-22 15:53:17.802469: Epoch time: 201.02 s 
2023-11-22 15:53:17.802538: Yayy! New best EMA pseudo Dice: 0.2101 
2023-11-22 15:53:18.959347:  
2023-11-22 15:53:18.959463: Epoch 5 
2023-11-22 15:53:18.959596: Current learning rate: 0.00995 
2023-11-22 15:56:39.712698: train_loss -0.2828 
2023-11-22 15:56:39.712892: val_loss -0.2994 
2023-11-22 15:56:39.712974: Pseudo dice [0.634, nan] 
2023-11-22 15:56:39.713065: Epoch time: 200.75 s 
2023-11-22 15:56:39.713133: Yayy! New best EMA pseudo Dice: 0.2524 
2023-11-22 15:56:40.820759:  
2023-11-22 15:56:40.820897: Epoch 6 
2023-11-22 15:56:40.821005: Current learning rate: 0.00995 
2023-11-22 16:00:01.583168: train_loss -0.289 
2023-11-22 16:00:01.583387: val_loss -0.3069 
2023-11-22 16:00:01.583473: Pseudo dice [0.6413, nan] 
2023-11-22 16:00:01.583554: Epoch time: 200.76 s 
2023-11-22 16:00:01.583613: Yayy! New best EMA pseudo Dice: 0.2913 
2023-11-22 16:00:02.711107:  
2023-11-22 16:00:02.711222: Epoch 7 
2023-11-22 16:00:02.711319: Current learning rate: 0.00994 
2023-11-22 16:03:23.196437: train_loss -0.3058 
2023-11-22 16:03:23.196648: val_loss -0.3084 
2023-11-22 16:03:23.196727: Pseudo dice [0.6441, nan] 
2023-11-22 16:03:23.196798: Epoch time: 200.49 s 
2023-11-22 16:03:23.196892: Yayy! New best EMA pseudo Dice: 0.3266 
2023-11-22 16:03:24.350527:  
2023-11-22 16:03:24.350653: Epoch 8 
2023-11-22 16:03:24.350751: Current learning rate: 0.00993 
2023-11-22 16:06:45.530006: train_loss -0.3097 
2023-11-22 16:06:45.530249: val_loss -0.325 
2023-11-22 16:06:45.530326: Pseudo dice [0.6759, nan] 
2023-11-22 16:06:45.530393: Epoch time: 201.18 s 
2023-11-22 16:06:45.530450: Yayy! New best EMA pseudo Dice: 0.3615 
2023-11-22 16:06:46.689187:  
2023-11-22 16:06:46.689308: Epoch 9 
2023-11-22 16:06:46.689400: Current learning rate: 0.00992 
2023-11-22 16:10:08.176281: train_loss -0.3111 
2023-11-22 16:10:08.176480: val_loss -0.2957 
2023-11-22 16:10:08.176573: Pseudo dice [0.6144, nan] 
2023-11-22 16:10:08.176668: Epoch time: 201.49 s 
2023-11-22 16:10:08.176737: Yayy! New best EMA pseudo Dice: 0.3868 
2023-11-22 16:10:09.402854:  
2023-11-22 16:10:09.403044: Epoch 10 
2023-11-22 16:10:09.403147: Current learning rate: 0.00991 
2023-11-22 16:13:30.525732: train_loss -0.3032 
2023-11-22 16:13:30.525935: val_loss -0.3261 
2023-11-22 16:13:30.526013: Pseudo dice [0.6748, nan] 
2023-11-22 16:13:30.526083: Epoch time: 201.12 s 
2023-11-22 16:13:30.526142: Yayy! New best EMA pseudo Dice: 0.4156 
2023-11-22 16:13:31.632802:  
2023-11-22 16:13:31.632910: Epoch 11 
2023-11-22 16:13:31.632994: Current learning rate: 0.0099 
2023-11-22 16:16:53.073347: train_loss -0.3028 
2023-11-22 16:16:53.073545: val_loss -0.3493 
2023-11-22 16:16:53.073624: Pseudo dice [0.7137, nan] 
2023-11-22 16:16:53.073695: Epoch time: 201.44 s 
2023-11-22 16:16:53.073753: Yayy! New best EMA pseudo Dice: 0.4454 
2023-11-22 16:16:54.183543:  
2023-11-22 16:16:54.183667: Epoch 12 
2023-11-22 16:16:54.183765: Current learning rate: 0.00989 
2023-11-22 16:20:15.392768: train_loss -0.319 
2023-11-22 16:20:15.392979: val_loss -0.3145 
2023-11-22 16:20:15.393057: Pseudo dice [0.6498, nan] 
2023-11-22 16:20:15.393126: Epoch time: 201.21 s 
2023-11-22 16:20:15.393182: Yayy! New best EMA pseudo Dice: 0.4659 
2023-11-22 16:20:16.517330:  
2023-11-22 16:20:16.517451: Epoch 13 
2023-11-22 16:20:16.517545: Current learning rate: 0.00988 
2023-11-22 16:23:37.807632: train_loss -0.3167 
2023-11-22 16:23:37.807806: val_loss -0.3402 
2023-11-22 16:23:37.807912: Pseudo dice [0.7084, nan] 
2023-11-22 16:23:37.807982: Epoch time: 201.29 s 
2023-11-22 16:23:37.808041: Yayy! New best EMA pseudo Dice: 0.4901 
2023-11-22 16:23:38.938465:  
2023-11-22 16:23:38.938591: Epoch 14 
2023-11-22 16:23:38.938694: Current learning rate: 0.00987 
2023-11-22 16:27:00.287223: train_loss -0.3229 
2023-11-22 16:27:00.287415: val_loss -0.3409 
2023-11-22 16:27:00.287490: Pseudo dice [0.6982, nan] 
2023-11-22 16:27:00.287559: Epoch time: 201.35 s 
2023-11-22 16:27:00.287655: Yayy! New best EMA pseudo Dice: 0.5109 
2023-11-22 16:27:01.536724:  
2023-11-22 16:27:01.536840: Epoch 15 
2023-11-22 16:27:01.536938: Current learning rate: 0.00986 
2023-11-22 16:30:22.702003: train_loss -0.3068 
2023-11-22 16:30:22.702203: val_loss -0.331 
2023-11-22 16:30:22.702283: Pseudo dice [0.6827, nan] 
2023-11-22 16:30:22.702351: Epoch time: 201.17 s 
2023-11-22 16:30:22.702409: Yayy! New best EMA pseudo Dice: 0.5281 
2023-11-22 16:30:23.860485:  
2023-11-22 16:30:23.860715: Epoch 16 
2023-11-22 16:30:23.860818: Current learning rate: 0.00986 
2023-11-22 16:33:44.962178: train_loss -0.3161 
2023-11-22 16:33:44.962372: val_loss -0.3424 
2023-11-22 16:33:44.962448: Pseudo dice [0.7099, nan] 
2023-11-22 16:33:44.962516: Epoch time: 201.1 s 
2023-11-22 16:33:44.962571: Yayy! New best EMA pseudo Dice: 0.5463 
2023-11-22 16:33:46.131152:  
2023-11-22 16:33:46.131266: Epoch 17 
2023-11-22 16:33:46.131362: Current learning rate: 0.00985 
2023-11-22 16:37:07.214224: train_loss -0.3118 
2023-11-22 16:37:07.214430: val_loss -0.3621 
2023-11-22 16:37:07.214505: Pseudo dice [0.7413, nan] 
2023-11-22 16:37:07.214573: Epoch time: 201.08 s 
2023-11-22 16:37:07.214629: Yayy! New best EMA pseudo Dice: 0.5658 
2023-11-22 16:37:08.374390:  
2023-11-22 16:37:08.374505: Epoch 18 
2023-11-22 16:37:08.374598: Current learning rate: 0.00984 
2023-11-22 16:40:29.622568: train_loss -0.3116 
2023-11-22 16:40:29.622804: val_loss -0.3245 
2023-11-22 16:40:29.622883: Pseudo dice [0.6762, nan] 
2023-11-22 16:40:29.622987: Epoch time: 201.25 s 
2023-11-22 16:40:29.623048: Yayy! New best EMA pseudo Dice: 0.5768 
2023-11-22 16:40:30.780781:  
2023-11-22 16:40:30.780903: Epoch 19 
2023-11-22 16:40:30.780998: Current learning rate: 0.00983 
2023-11-22 16:43:51.742617: train_loss -0.3247 
2023-11-22 16:43:51.742837: val_loss -0.3269 
2023-11-22 16:43:51.742913: Pseudo dice [0.6808, nan] 
2023-11-22 16:43:51.742983: Epoch time: 200.96 s 
2023-11-22 16:43:51.743039: Yayy! New best EMA pseudo Dice: 0.5872 
2023-11-22 16:43:52.998194:  
2023-11-22 16:43:52.998312: Epoch 20 
2023-11-22 16:43:52.998407: Current learning rate: 0.00982 
2023-11-22 16:47:14.287657: train_loss -0.3224 
2023-11-22 16:47:14.287859: val_loss -0.3332 
2023-11-22 16:47:14.287932: Pseudo dice [0.6756, nan] 
2023-11-22 16:47:14.288000: Epoch time: 201.29 s 
2023-11-22 16:47:14.288056: Yayy! New best EMA pseudo Dice: 0.5961 
2023-11-22 16:47:15.464177:  
2023-11-22 16:47:15.464367: Epoch 21 
2023-11-22 16:47:15.464473: Current learning rate: 0.00981 
2023-11-22 16:50:36.658528: train_loss -0.3204 
2023-11-22 16:50:36.658726: val_loss -0.3117 
2023-11-22 16:50:36.658803: Pseudo dice [0.6413, nan] 
2023-11-22 16:50:36.658872: Epoch time: 201.2 s 
2023-11-22 16:50:36.658929: Yayy! New best EMA pseudo Dice: 0.6006 
2023-11-22 16:50:37.772881:  
2023-11-22 16:50:37.772997: Epoch 22 
2023-11-22 16:50:37.773093: Current learning rate: 0.0098 
2023-11-22 16:53:59.226092: train_loss -0.3314 
2023-11-22 16:53:59.226315: val_loss -0.3086 
2023-11-22 16:53:59.226398: Pseudo dice [0.6432, nan] 
2023-11-22 16:53:59.226499: Epoch time: 201.45 s 
2023-11-22 16:53:59.226559: Yayy! New best EMA pseudo Dice: 0.6048 
2023-11-22 16:54:00.358517:  
2023-11-22 16:54:00.358650: Epoch 23 
2023-11-22 16:54:00.358754: Current learning rate: 0.00979 
2023-11-22 16:57:21.626259: train_loss -0.3259 
2023-11-22 16:57:21.626456: val_loss -0.3294 
2023-11-22 16:57:21.626532: Pseudo dice [0.6814, nan] 
2023-11-22 16:57:21.626599: Epoch time: 201.27 s 
2023-11-22 16:57:21.626654: Yayy! New best EMA pseudo Dice: 0.6125 
2023-11-22 16:57:22.837261:  
2023-11-22 16:57:22.837387: Epoch 24 
2023-11-22 16:57:22.837481: Current learning rate: 0.00978 
2023-11-22 17:00:15.425559: train_loss -0.3295 
2023-11-22 17:00:15.425754: val_loss -0.3395 
2023-11-22 17:00:15.425829: Pseudo dice [0.6849, nan] 
2023-11-22 17:00:15.425899: Epoch time: 172.59 s 
2023-11-22 17:00:15.425956: Yayy! New best EMA pseudo Dice: 0.6197 
2023-11-22 17:00:16.547188:  
2023-11-22 17:00:16.547315: Epoch 25 
2023-11-22 17:00:16.547440: Current learning rate: 0.00977 
2023-11-22 17:01:23.344254: train_loss -0.3246 
2023-11-22 17:01:23.344438: val_loss -0.3137 
2023-11-22 17:01:23.344514: Pseudo dice [0.6555, nan] 
2023-11-22 17:01:23.344594: Epoch time: 66.8 s 
2023-11-22 17:01:23.344655: Yayy! New best EMA pseudo Dice: 0.6233 
2023-11-22 17:01:24.510747:  
2023-11-22 17:01:24.510863: Epoch 26 
2023-11-22 17:01:24.510960: Current learning rate: 0.00977 
2023-11-22 17:02:31.495146: train_loss -0.3292 
2023-11-22 17:02:31.495344: val_loss -0.3319 
2023-11-22 17:02:31.495450: Pseudo dice [0.6722, nan] 
2023-11-22 17:02:31.495521: Epoch time: 66.99 s 
2023-11-22 17:02:31.495579: Yayy! New best EMA pseudo Dice: 0.6282 
2023-11-22 17:02:32.666398:  
2023-11-22 17:02:32.666526: Epoch 27 
2023-11-22 17:02:32.666626: Current learning rate: 0.00976 
2023-11-22 17:03:39.502372: train_loss -0.341 
2023-11-22 17:03:39.502584: val_loss -0.3299 
2023-11-22 17:03:39.502665: Pseudo dice [0.681, nan] 
2023-11-22 17:03:39.502736: Epoch time: 66.84 s 
2023-11-22 17:03:39.502820: Yayy! New best EMA pseudo Dice: 0.6335 
2023-11-22 17:03:40.658668:  
2023-11-22 17:03:40.658788: Epoch 28 
2023-11-22 17:03:40.658885: Current learning rate: 0.00975 
2023-11-22 17:04:47.723372: train_loss -0.3163 
2023-11-22 17:04:47.723687: val_loss -0.3397 
2023-11-22 17:04:47.723835: Pseudo dice [0.7, nan] 
2023-11-22 17:04:47.723968: Epoch time: 67.07 s 
2023-11-22 17:04:47.724082: Yayy! New best EMA pseudo Dice: 0.6401 
2023-11-22 17:04:48.885606:  
2023-11-22 17:04:48.885742: Epoch 29 
2023-11-22 17:04:48.885844: Current learning rate: 0.00974 
2023-11-22 17:05:55.921185: train_loss -0.3234 
2023-11-22 17:05:55.921417: val_loss -0.3612 
2023-11-22 17:05:55.921498: Pseudo dice [0.7434, nan] 
2023-11-22 17:05:55.921568: Epoch time: 67.04 s 
2023-11-22 17:05:55.921627: Yayy! New best EMA pseudo Dice: 0.6505 
2023-11-22 17:05:57.184604:  
2023-11-22 17:05:57.184730: Epoch 30 
2023-11-22 17:05:57.184831: Current learning rate: 0.00973 
2023-11-22 17:07:04.185054: train_loss -0.3355 
2023-11-22 17:07:04.185316: val_loss -0.3408 
2023-11-22 17:07:04.185402: Pseudo dice [0.6907, nan] 
2023-11-22 17:07:04.185474: Epoch time: 67.0 s 
2023-11-22 17:07:04.185533: Yayy! New best EMA pseudo Dice: 0.6545 
2023-11-22 17:07:05.354735:  
2023-11-22 17:07:05.354858: Epoch 31 
2023-11-22 17:07:05.354967: Current learning rate: 0.00972 
2023-11-22 17:08:12.451191: train_loss -0.3283 
2023-11-22 17:08:12.451406: val_loss -0.3632 
2023-11-22 17:08:12.451489: Pseudo dice [0.7476, nan] 
2023-11-22 17:08:12.451585: Epoch time: 67.1 s 
2023-11-22 17:08:12.451647: Yayy! New best EMA pseudo Dice: 0.6638 
2023-11-22 17:08:13.625546:  
2023-11-22 17:08:13.625808: Epoch 32 
2023-11-22 17:08:13.625939: Current learning rate: 0.00971 
2023-11-22 17:09:20.690821: train_loss -0.3255 
2023-11-22 17:09:20.691051: val_loss -0.3638 
2023-11-22 17:09:20.691135: Pseudo dice [0.7481, nan] 
2023-11-22 17:09:20.691203: Epoch time: 67.07 s 
2023-11-22 17:09:20.691263: Yayy! New best EMA pseudo Dice: 0.6722 
2023-11-22 17:09:21.863444:  
2023-11-22 17:09:21.863638: Epoch 33 
2023-11-22 17:09:21.863774: Current learning rate: 0.0097 
2023-11-22 17:10:28.838672: train_loss -0.3311 
2023-11-22 17:10:28.838833: val_loss -0.3531 
2023-11-22 17:10:28.838903: Pseudo dice [0.7164, nan] 
2023-11-22 17:10:28.838967: Epoch time: 66.98 s 
2023-11-22 17:10:28.839047: Yayy! New best EMA pseudo Dice: 0.6766 
2023-11-22 17:10:30.112989:  
2023-11-22 17:10:30.113155: Epoch 34 
2023-11-22 17:10:30.113288: Current learning rate: 0.00969 
2023-11-22 17:11:37.107255: train_loss -0.3327 
2023-11-22 17:11:37.107455: val_loss -0.3505 
2023-11-22 17:11:37.107531: Pseudo dice [0.7106, nan] 
2023-11-22 17:11:37.107598: Epoch time: 67.0 s 
2023-11-22 17:11:37.107661: Yayy! New best EMA pseudo Dice: 0.68 
2023-11-22 17:11:38.300489:  
2023-11-22 17:11:38.300611: Epoch 35 
2023-11-22 17:11:38.300707: Current learning rate: 0.00968 
2023-11-22 17:12:45.103228: train_loss -0.3311 
2023-11-22 17:12:45.103416: val_loss -0.3193 
2023-11-22 17:12:45.103492: Pseudo dice [0.6626, nan] 
2023-11-22 17:12:45.103591: Epoch time: 66.8 s 
2023-11-22 17:12:46.176028:  
2023-11-22 17:12:46.176154: Epoch 36 
2023-11-22 17:12:46.176259: Current learning rate: 0.00968 
2023-11-22 17:13:52.926549: train_loss -0.3332 
2023-11-22 17:13:52.926715: val_loss -0.3518 
2023-11-22 17:13:52.926789: Pseudo dice [0.7237, nan] 
2023-11-22 17:13:52.926858: Epoch time: 66.75 s 
2023-11-22 17:13:52.926914: Yayy! New best EMA pseudo Dice: 0.6828 
2023-11-22 17:13:54.125691:  
2023-11-22 17:13:54.125834: Epoch 37 
2023-11-22 17:13:54.125935: Current learning rate: 0.00967 
2023-11-22 17:15:01.001285: train_loss -0.3279 
2023-11-22 17:15:01.001497: val_loss -0.3525 
2023-11-22 17:15:01.001575: Pseudo dice [0.7223, nan] 
2023-11-22 17:15:01.001643: Epoch time: 66.88 s 
2023-11-22 17:15:01.001700: Yayy! New best EMA pseudo Dice: 0.6868 
2023-11-22 17:15:02.199453:  
2023-11-22 17:15:02.199578: Epoch 38 
2023-11-22 17:15:02.199672: Current learning rate: 0.00966 
2023-11-22 17:16:09.105897: train_loss -0.3197 
2023-11-22 17:16:09.106118: val_loss -0.3329 
2023-11-22 17:16:09.106199: Pseudo dice [0.6851, nan] 
2023-11-22 17:16:09.106269: Epoch time: 66.91 s 
2023-11-22 17:16:10.301189:  
2023-11-22 17:16:10.301311: Epoch 39 
2023-11-22 17:16:10.301408: Current learning rate: 0.00965 
2023-11-22 17:17:17.165784: train_loss -0.3304 
2023-11-22 17:17:17.165962: val_loss -0.3596 
2023-11-22 17:17:17.166037: Pseudo dice [0.7326, nan] 
2023-11-22 17:17:17.166111: Epoch time: 66.87 s 
2023-11-22 17:17:17.166168: Yayy! New best EMA pseudo Dice: 0.6912 
2023-11-22 17:17:18.363900:  
2023-11-22 17:17:18.364039: Epoch 40 
2023-11-22 17:17:18.364149: Current learning rate: 0.00964 
2023-11-22 17:18:25.300869: train_loss -0.3333 
2023-11-22 17:18:25.301097: val_loss -0.3425 
2023-11-22 17:18:25.301177: Pseudo dice [0.7051, nan] 
2023-11-22 17:18:25.301246: Epoch time: 66.94 s 
2023-11-22 17:18:25.301305: Yayy! New best EMA pseudo Dice: 0.6926 
2023-11-22 17:18:26.503439:  
2023-11-22 17:18:26.503562: Epoch 41 
2023-11-22 17:18:26.503686: Current learning rate: 0.00963 
2023-11-22 17:19:33.441812: train_loss -0.3295 
2023-11-22 17:19:33.442045: val_loss -0.3513 
2023-11-22 17:19:33.442122: Pseudo dice [0.7188, nan] 
2023-11-22 17:19:33.442189: Epoch time: 66.94 s 
2023-11-22 17:19:33.442247: Yayy! New best EMA pseudo Dice: 0.6952 
2023-11-22 17:19:34.580732:  
2023-11-22 17:19:34.580849: Epoch 42 
2023-11-22 17:19:34.580939: Current learning rate: 0.00962 
2023-11-22 17:20:41.579998: train_loss -0.3257 
2023-11-22 17:20:41.580193: val_loss -0.342 
2023-11-22 17:20:41.580265: Pseudo dice [0.6994, nan] 
2023-11-22 17:20:41.580331: Epoch time: 67.0 s 
2023-11-22 17:20:41.580387: Yayy! New best EMA pseudo Dice: 0.6956 
2023-11-22 17:20:42.741434:  
2023-11-22 17:20:42.741625: Epoch 43 
2023-11-22 17:20:42.741722: Current learning rate: 0.00961 
2023-11-22 17:21:49.792263: train_loss -0.3281 
2023-11-22 17:21:49.792439: val_loss -0.3505 
2023-11-22 17:21:49.792512: Pseudo dice [0.7146, nan] 
2023-11-22 17:21:49.792593: Epoch time: 67.05 s 
2023-11-22 17:21:49.792683: Yayy! New best EMA pseudo Dice: 0.6975 
2023-11-22 17:21:51.047593:  
2023-11-22 17:21:51.047710: Epoch 44 
2023-11-22 17:21:51.047805: Current learning rate: 0.0096 
2023-11-22 17:22:58.255469: train_loss -0.3296 
2023-11-22 17:22:58.255654: val_loss -0.3234 
2023-11-22 17:22:58.255733: Pseudo dice [0.6752, nan] 
2023-11-22 17:22:58.255800: Epoch time: 67.21 s 
2023-11-22 17:22:59.289195:  
2023-11-22 17:22:59.289312: Epoch 45 
2023-11-22 17:22:59.289412: Current learning rate: 0.00959 
2023-11-22 17:24:06.499120: train_loss -0.335 
2023-11-22 17:24:06.499313: val_loss -0.3439 
2023-11-22 17:24:06.499390: Pseudo dice [0.7123, nan] 
2023-11-22 17:24:06.499466: Epoch time: 67.21 s 
2023-11-22 17:24:07.533066:  
2023-11-22 17:24:07.533194: Epoch 46 
2023-11-22 17:24:07.533295: Current learning rate: 0.00959 
2023-11-22 17:25:14.715294: train_loss -0.322 
2023-11-22 17:25:14.715470: val_loss -0.348 
2023-11-22 17:25:14.715543: Pseudo dice [0.704, nan] 
2023-11-22 17:25:14.715641: Epoch time: 67.18 s 
2023-11-22 17:25:14.715703: Yayy! New best EMA pseudo Dice: 0.6977 
2023-11-22 17:25:15.852000:  
2023-11-22 17:25:15.852189: Epoch 47 
2023-11-22 17:25:15.852340: Current learning rate: 0.00958 
2023-11-22 17:26:22.847159: train_loss -0.3246 
2023-11-22 17:26:22.847366: val_loss -0.3322 
2023-11-22 17:26:22.847451: Pseudo dice [0.6844, nan] 
2023-11-22 17:26:22.847518: Epoch time: 67.0 s 
2023-11-22 17:26:23.881494:  
2023-11-22 17:26:23.881614: Epoch 48 
2023-11-22 17:26:23.881730: Current learning rate: 0.00957 
2023-11-22 17:27:30.766024: train_loss -0.3294 
2023-11-22 17:27:30.766204: val_loss -0.3472 
2023-11-22 17:27:30.766277: Pseudo dice [0.713, nan] 
2023-11-22 17:27:30.766345: Epoch time: 66.89 s 
2023-11-22 17:27:30.766472: Yayy! New best EMA pseudo Dice: 0.698 
2023-11-22 17:27:32.027329:  
2023-11-22 17:27:32.027505: Epoch 49 
2023-11-22 17:27:32.027614: Current learning rate: 0.00956 
2023-11-22 17:28:38.932790: train_loss -0.3399 
2023-11-22 17:28:38.932977: val_loss -0.3471 
2023-11-22 17:28:38.933078: Pseudo dice [0.71, nan] 
2023-11-22 17:28:38.933147: Epoch time: 66.91 s 
2023-11-22 17:28:39.064208: Yayy! New best EMA pseudo Dice: 0.6992 
2023-11-22 17:28:40.218489:  
2023-11-22 17:28:40.218622: Epoch 50 
2023-11-22 17:28:40.218725: Current learning rate: 0.00955 
2023-11-22 17:29:47.199493: train_loss -0.3385 
2023-11-22 17:29:47.199682: val_loss -0.336 
2023-11-22 17:29:47.199757: Pseudo dice [0.6878, nan] 
2023-11-22 17:29:47.199829: Epoch time: 66.98 s 
2023-11-22 17:29:48.238102:  
2023-11-22 17:29:48.238225: Epoch 51 
2023-11-22 17:29:48.238325: Current learning rate: 0.00954 
2023-11-22 17:30:55.181538: train_loss -0.3402 
2023-11-22 17:30:55.181735: val_loss -0.3609 
2023-11-22 17:30:55.181813: Pseudo dice [0.7385, nan] 
2023-11-22 17:30:55.181882: Epoch time: 66.94 s 
2023-11-22 17:30:55.181938: Yayy! New best EMA pseudo Dice: 0.7021 
2023-11-22 17:30:56.351867:  
2023-11-22 17:30:56.351979: Epoch 52 
2023-11-22 17:30:56.352079: Current learning rate: 0.00953 
2023-11-22 17:32:03.265584: train_loss -0.3421 
2023-11-22 17:32:03.265778: val_loss -0.335 
2023-11-22 17:32:03.265856: Pseudo dice [0.6797, nan] 
2023-11-22 17:32:03.265934: Epoch time: 66.91 s 
2023-11-22 17:32:04.307078:  
2023-11-22 17:32:04.307339: Epoch 53 
2023-11-22 17:32:04.307480: Current learning rate: 0.00952 
2023-11-22 17:33:11.242730: train_loss -0.3317 
2023-11-22 17:33:11.242901: val_loss -0.3601 
2023-11-22 17:33:11.242975: Pseudo dice [0.7387, nan] 
2023-11-22 17:33:11.243040: Epoch time: 66.94 s 
2023-11-22 17:33:11.243122: Yayy! New best EMA pseudo Dice: 0.7038 
2023-11-22 17:33:12.506490:  
2023-11-22 17:33:12.506609: Epoch 54 
2023-11-22 17:33:12.506703: Current learning rate: 0.00951 
2023-11-22 17:34:19.624556: train_loss -0.3323 
2023-11-22 17:34:19.624749: val_loss -0.3698 
2023-11-22 17:34:19.624826: Pseudo dice [0.7512, nan] 
2023-11-22 17:34:19.624893: Epoch time: 67.12 s 
2023-11-22 17:34:19.624948: Yayy! New best EMA pseudo Dice: 0.7085 
2023-11-22 17:34:20.785620:  
2023-11-22 17:34:20.785760: Epoch 55 
2023-11-22 17:34:20.785862: Current learning rate: 0.0095 
2023-11-22 17:35:28.113111: train_loss -0.3274 
2023-11-22 17:35:28.113315: val_loss -0.3409 
2023-11-22 17:35:28.113395: Pseudo dice [0.6934, nan] 
2023-11-22 17:35:28.113466: Epoch time: 67.33 s 
2023-11-22 17:35:29.165256:  
2023-11-22 17:35:29.165383: Epoch 56 
2023-11-22 17:35:29.165490: Current learning rate: 0.00949 
2023-11-22 17:36:36.399153: train_loss -0.3438 
2023-11-22 17:36:36.399378: val_loss -0.3371 
2023-11-22 17:36:36.399496: Pseudo dice [0.6937, nan] 
2023-11-22 17:36:36.399570: Epoch time: 67.23 s 
2023-11-22 17:36:37.448591:  
2023-11-22 17:36:37.448710: Epoch 57 
2023-11-22 17:36:37.448806: Current learning rate: 0.00949 
2023-11-22 17:37:44.537282: train_loss -0.3381 
2023-11-22 17:37:44.537553: val_loss -0.3508 
2023-11-22 17:37:44.537629: Pseudo dice [0.7165, nan] 
2023-11-22 17:37:44.537698: Epoch time: 67.09 s 
2023-11-22 17:37:45.680340:  
2023-11-22 17:37:45.680458: Epoch 58 
2023-11-22 17:37:45.680593: Current learning rate: 0.00948 
2023-11-22 17:38:52.835055: train_loss -0.3371 
2023-11-22 17:38:52.835227: val_loss -0.3485 
2023-11-22 17:38:52.835308: Pseudo dice [0.7127, nan] 
2023-11-22 17:38:52.835390: Epoch time: 67.16 s 
2023-11-22 17:38:53.911007:  
2023-11-22 17:38:53.911117: Epoch 59 
2023-11-22 17:38:53.911216: Current learning rate: 0.00947 
2023-11-22 17:40:01.154432: train_loss -0.3403 
2023-11-22 17:40:01.154636: val_loss -0.3523 
2023-11-22 17:40:01.154715: Pseudo dice [0.7151, nan] 
2023-11-22 17:40:01.154818: Epoch time: 67.24 s 
2023-11-22 17:40:02.215180:  
2023-11-22 17:40:02.215307: Epoch 60 
2023-11-22 17:40:02.215408: Current learning rate: 0.00946 
2023-11-22 17:41:09.293490: train_loss -0.3366 
2023-11-22 17:41:09.293673: val_loss -0.3414 
2023-11-22 17:41:09.293753: Pseudo dice [0.7017, nan] 
2023-11-22 17:41:09.293821: Epoch time: 67.08 s 
2023-11-22 17:41:10.361690:  
2023-11-22 17:41:10.361808: Epoch 61 
2023-11-22 17:41:10.361907: Current learning rate: 0.00945 
2023-11-22 17:42:17.498667: train_loss -0.3395 
2023-11-22 17:42:17.498841: val_loss -0.3503 
2023-11-22 17:42:17.498915: Pseudo dice [0.7078, nan] 
2023-11-22 17:42:17.498983: Epoch time: 67.14 s 
2023-11-22 17:42:18.669579:  
2023-11-22 17:42:18.669703: Epoch 62 
2023-11-22 17:42:18.669805: Current learning rate: 0.00944 
2023-11-22 17:43:25.810788: train_loss -0.3338 
2023-11-22 17:43:25.810989: val_loss -0.3299 
2023-11-22 17:43:25.811074: Pseudo dice [0.6698, nan] 
2023-11-22 17:43:25.811150: Epoch time: 67.14 s 
2023-11-22 17:43:26.872283:  
2023-11-22 17:43:26.872396: Epoch 63 
2023-11-22 17:43:26.872491: Current learning rate: 0.00943 
2023-11-22 17:44:34.295789: train_loss -0.339 
2023-11-22 17:44:34.295979: val_loss -0.333 
2023-11-22 17:44:34.296057: Pseudo dice [0.6785, nan] 
2023-11-22 17:44:34.296125: Epoch time: 67.42 s 
2023-11-22 17:44:35.362153:  
2023-11-22 17:44:35.362322: Epoch 64 
2023-11-22 17:44:35.362478: Current learning rate: 0.00942 
2023-11-22 17:45:42.812541: train_loss -0.3282 
2023-11-22 17:45:42.812759: val_loss -0.3593 
2023-11-22 17:45:42.812837: Pseudo dice [0.7308, nan] 
2023-11-22 17:45:42.812905: Epoch time: 67.45 s 
2023-11-22 17:45:43.864536:  
2023-11-22 17:45:43.864669: Epoch 65 
2023-11-22 17:45:43.864779: Current learning rate: 0.00941 
2023-11-22 17:46:51.370073: train_loss -0.327 
2023-11-22 17:46:51.370278: val_loss -0.3476 
2023-11-22 17:46:51.370355: Pseudo dice [0.7062, nan] 
2023-11-22 17:46:51.370424: Epoch time: 67.51 s 
2023-11-22 17:46:52.594120:  
2023-11-22 17:46:52.594254: Epoch 66 
2023-11-22 17:46:52.594355: Current learning rate: 0.0094 
2023-11-22 17:48:00.061513: train_loss -0.3328 
2023-11-22 17:48:00.061715: val_loss -0.3659 
2023-11-22 17:48:00.061791: Pseudo dice [0.7463, nan] 
2023-11-22 17:48:00.061866: Epoch time: 67.47 s 
2023-11-22 17:48:00.061924: Yayy! New best EMA pseudo Dice: 0.7086 
2023-11-22 17:48:01.233150:  
2023-11-22 17:48:01.233273: Epoch 67 
2023-11-22 17:48:01.233375: Current learning rate: 0.00939 
2023-11-22 17:49:08.618705: train_loss -0.3295 
2023-11-22 17:49:08.618915: val_loss -0.3246 
2023-11-22 17:49:08.618992: Pseudo dice [0.663, nan] 
2023-11-22 17:49:08.619061: Epoch time: 67.39 s 
2023-11-22 17:49:09.692453:  
2023-11-22 17:49:09.692589: Epoch 68 
2023-11-22 17:49:09.692679: Current learning rate: 0.00939 
2023-11-22 17:50:16.827192: train_loss -0.3352 
2023-11-22 17:50:16.827374: val_loss -0.3371 
2023-11-22 17:50:16.827450: Pseudo dice [0.6904, nan] 
2023-11-22 17:50:16.827517: Epoch time: 67.14 s 
2023-11-22 17:50:17.898803:  
2023-11-22 17:50:17.898925: Epoch 69 
2023-11-22 17:50:17.899031: Current learning rate: 0.00938 
2023-11-22 17:51:25.185374: train_loss -0.3384 
2023-11-22 17:51:25.185577: val_loss -0.3483 
2023-11-22 17:51:25.185655: Pseudo dice [0.7026, nan] 
2023-11-22 17:51:25.185722: Epoch time: 67.29 s 
2023-11-22 17:51:26.251934:  
2023-11-22 17:51:26.252106: Epoch 70 
2023-11-22 17:51:26.252259: Current learning rate: 0.00937 
2023-11-22 17:52:33.315210: train_loss -0.3421 
2023-11-22 17:52:33.315394: val_loss -0.3345 
2023-11-22 17:52:33.315497: Pseudo dice [0.6772, nan] 
2023-11-22 17:52:33.315574: Epoch time: 67.06 s 
2023-11-22 17:52:34.389246:  
2023-11-22 17:52:34.389362: Epoch 71 
2023-11-22 17:52:34.389453: Current learning rate: 0.00936 
2023-11-22 17:53:41.673338: train_loss -0.3386 
2023-11-22 17:53:41.673522: val_loss -0.3131 
2023-11-22 17:53:41.673596: Pseudo dice [0.6456, nan] 
2023-11-22 17:53:41.673664: Epoch time: 67.28 s 
2023-11-22 17:53:42.738664:  
2023-11-22 17:53:42.738876: Epoch 72 
2023-11-22 17:53:42.739135: Current learning rate: 0.00935 
2023-11-22 17:54:49.930279: train_loss -0.3309 
2023-11-22 17:54:49.930478: val_loss -0.3449 
2023-11-22 17:54:49.930551: Pseudo dice [0.6977, nan] 
2023-11-22 17:54:49.930620: Epoch time: 67.19 s 
2023-11-22 17:54:50.990275:  
2023-11-22 17:54:50.990521: Epoch 73 
2023-11-22 17:54:50.990678: Current learning rate: 0.00934 
2023-11-22 17:55:58.177333: train_loss -0.3492 
2023-11-22 17:55:58.177536: val_loss -0.3525 
2023-11-22 17:55:58.177613: Pseudo dice [0.7153, nan] 
2023-11-22 17:55:58.177703: Epoch time: 67.19 s 
2023-11-22 17:55:59.239392:  
2023-11-22 17:55:59.239503: Epoch 74 
2023-11-22 17:55:59.239597: Current learning rate: 0.00933 
2023-11-22 17:57:06.420505: train_loss -0.328 
2023-11-22 17:57:06.420706: val_loss -0.3054 
2023-11-22 17:57:06.420784: Pseudo dice [0.624, nan] 
2023-11-22 17:57:06.420853: Epoch time: 67.18 s 
2023-11-22 17:57:07.591260:  
2023-11-22 17:57:07.591381: Epoch 75 
2023-11-22 17:57:07.591481: Current learning rate: 0.00932 
2023-11-22 17:58:14.887639: train_loss -0.3272 
2023-11-22 17:58:14.887827: val_loss -0.3373 
2023-11-22 17:58:14.887902: Pseudo dice [0.696, nan] 
2023-11-22 17:58:14.887972: Epoch time: 67.3 s 
2023-11-22 17:58:15.964808:  
2023-11-22 17:58:15.964947: Epoch 76 
2023-11-22 17:58:15.965047: Current learning rate: 0.00931 
2023-11-22 17:59:23.366501: train_loss -0.3405 
2023-11-22 17:59:23.366688: val_loss -0.3471 
2023-11-22 17:59:23.366762: Pseudo dice [0.7117, nan] 
2023-11-22 17:59:23.366853: Epoch time: 67.4 s 
2023-11-22 17:59:24.445474:  
2023-11-22 17:59:24.445589: Epoch 77 
2023-11-22 17:59:24.445687: Current learning rate: 0.0093 
2023-11-22 18:00:31.915399: train_loss -0.3225 
2023-11-22 18:00:31.915586: val_loss -0.3536 
2023-11-22 18:00:31.915661: Pseudo dice [0.7187, nan] 
2023-11-22 18:00:31.915728: Epoch time: 67.47 s 
2023-11-22 18:00:33.000602:  
2023-11-22 18:00:33.000724: Epoch 78 
2023-11-22 18:00:33.000821: Current learning rate: 0.0093 
2023-11-22 18:01:40.482003: train_loss -0.3319 
2023-11-22 18:01:40.482215: val_loss -0.3182 
2023-11-22 18:01:40.482309: Pseudo dice [0.6534, nan] 
2023-11-22 18:01:40.482380: Epoch time: 67.48 s 
2023-11-22 18:01:41.564064:  
2023-11-22 18:01:41.564183: Epoch 79 
2023-11-22 18:01:41.564283: Current learning rate: 0.00929 
2023-11-22 18:02:48.893963: train_loss -0.3402 
2023-11-22 18:02:48.894145: val_loss -0.3226 
2023-11-22 18:02:48.894221: Pseudo dice [0.6672, nan] 
2023-11-22 18:02:48.894295: Epoch time: 67.33 s 
2023-11-22 18:02:50.094081:  
2023-11-22 18:02:50.094199: Epoch 80 
2023-11-22 18:02:50.094294: Current learning rate: 0.00928 
2023-11-22 18:03:57.601650: train_loss -0.3374 
2023-11-22 18:03:57.601827: val_loss -0.352 
2023-11-22 18:03:57.601903: Pseudo dice [0.7162, nan] 
2023-11-22 18:03:57.601968: Epoch time: 67.51 s 
2023-11-22 18:03:58.685585:  
2023-11-22 18:03:58.685712: Epoch 81 
2023-11-22 18:03:58.685817: Current learning rate: 0.00927 
2023-11-22 18:05:06.189042: train_loss -0.3413 
2023-11-22 18:05:06.189255: val_loss -0.3435 
2023-11-22 18:05:06.189368: Pseudo dice [0.6984, nan] 
2023-11-22 18:05:06.189440: Epoch time: 67.5 s 
2023-11-22 18:05:07.265790:  
2023-11-22 18:05:07.265905: Epoch 82 
2023-11-22 18:05:07.266003: Current learning rate: 0.00926 
2023-11-22 18:06:14.812763: train_loss -0.3464 
2023-11-22 18:06:14.812943: val_loss -0.3559 
2023-11-22 18:06:14.813017: Pseudo dice [0.7225, nan] 
2023-11-22 18:06:14.813083: Epoch time: 67.55 s 
2023-11-22 18:06:15.837952:  
2023-11-22 18:06:15.838089: Epoch 83 
2023-11-22 18:06:15.838222: Current learning rate: 0.00925 
2023-11-22 18:07:23.364127: train_loss -0.3421 
2023-11-22 18:07:23.364308: val_loss -0.3358 
2023-11-22 18:07:23.364382: Pseudo dice [0.6923, nan] 
2023-11-22 18:07:23.364484: Epoch time: 67.53 s 
2023-11-22 18:07:24.490418:  
2023-11-22 18:07:24.490535: Epoch 84 
2023-11-22 18:07:24.490638: Current learning rate: 0.00924 
2023-11-22 18:08:32.098266: train_loss -0.3351 
2023-11-22 18:08:32.098462: val_loss -0.3535 
2023-11-22 18:08:32.098537: Pseudo dice [0.7231, nan] 
2023-11-22 18:08:32.098644: Epoch time: 67.61 s 
2023-11-22 18:08:33.117559:  
2023-11-22 18:08:33.117682: Epoch 85 
2023-11-22 18:08:33.117780: Current learning rate: 0.00923 
2023-11-22 18:09:40.685668: train_loss -0.3449 
2023-11-22 18:09:40.685878: val_loss -0.337 
2023-11-22 18:09:40.685951: Pseudo dice [0.6954, nan] 
2023-11-22 18:09:40.686020: Epoch time: 67.57 s 
2023-11-22 18:09:41.714514:  
2023-11-22 18:09:41.714645: Epoch 86 
2023-11-22 18:09:41.714746: Current learning rate: 0.00922 
2023-11-22 18:10:49.088030: train_loss -0.3354 
2023-11-22 18:10:49.088215: val_loss -0.3183 
2023-11-22 18:10:49.088292: Pseudo dice [0.6601, nan] 
2023-11-22 18:10:49.088361: Epoch time: 67.37 s 
2023-11-22 18:10:50.111811:  
2023-11-22 18:10:50.111969: Epoch 87 
2023-11-22 18:10:50.112082: Current learning rate: 0.00921 
2023-11-22 18:11:57.517125: train_loss -0.3401 
2023-11-22 18:11:57.517310: val_loss -0.3476 
2023-11-22 18:11:57.517387: Pseudo dice [0.7074, nan] 
2023-11-22 18:11:57.517489: Epoch time: 67.41 s 
2023-11-22 18:11:58.661982:  
2023-11-22 18:11:58.662100: Epoch 88 
2023-11-22 18:11:58.662199: Current learning rate: 0.0092 
2023-11-22 18:13:06.042747: train_loss -0.3333 
2023-11-22 18:13:06.042921: val_loss -0.3445 
2023-11-22 18:13:06.042996: Pseudo dice [0.7024, nan] 
2023-11-22 18:13:06.043062: Epoch time: 67.38 s 
2023-11-22 18:13:07.068158:  
2023-11-22 18:13:07.068331: Epoch 89 
2023-11-22 18:13:07.068455: Current learning rate: 0.0092 
2023-11-22 18:14:14.523315: train_loss -0.3376 
2023-11-22 18:14:14.523516: val_loss -0.3373 
2023-11-22 18:14:14.523592: Pseudo dice [0.6977, nan] 
2023-11-22 18:14:14.523663: Epoch time: 67.46 s 
2023-11-22 18:14:15.543983:  
2023-11-22 18:14:15.544180: Epoch 90 
2023-11-22 18:14:15.544306: Current learning rate: 0.00919 
2023-11-22 18:15:23.006905: train_loss -0.345 
2023-11-22 18:15:23.007089: val_loss -0.3661 
2023-11-22 18:15:23.007164: Pseudo dice [0.7453, nan] 
2023-11-22 18:15:23.007237: Epoch time: 67.46 s 
2023-11-22 18:15:24.035180:  
2023-11-22 18:15:24.035295: Epoch 91 
2023-11-22 18:15:24.035398: Current learning rate: 0.00918 
2023-11-22 18:16:31.433183: train_loss -0.3411 
2023-11-22 18:16:31.433355: val_loss -0.3434 
2023-11-22 18:16:31.433428: Pseudo dice [0.6954, nan] 
2023-11-22 18:16:31.433494: Epoch time: 67.4 s 
2023-11-22 18:16:32.570133:  
2023-11-22 18:16:32.570255: Epoch 92 
2023-11-22 18:16:32.570350: Current learning rate: 0.00917 
2023-11-22 18:17:40.035021: train_loss -0.3492 
2023-11-22 18:17:40.035204: val_loss -0.3289 
2023-11-22 18:17:40.035315: Pseudo dice [0.67, nan] 
2023-11-22 18:17:40.035390: Epoch time: 67.47 s 
2023-11-22 18:17:41.067732:  
2023-11-22 18:17:41.067852: Epoch 93 
2023-11-22 18:17:41.067955: Current learning rate: 0.00916 
2023-11-22 18:18:48.458509: train_loss -0.3431 
2023-11-22 18:18:48.458716: val_loss -0.3577 
2023-11-22 18:18:48.458807: Pseudo dice [0.7254, nan] 
2023-11-22 18:18:48.458874: Epoch time: 67.39 s 
2023-11-22 18:18:49.480129:  
2023-11-22 18:18:49.480256: Epoch 94 
2023-11-22 18:18:49.480381: Current learning rate: 0.00915 
2023-11-22 18:20:02.918919: train_loss -0.3271 
2023-11-22 18:20:02.919125: val_loss -0.342 
2023-11-22 18:20:02.919208: Pseudo dice [0.7019, nan] 
2023-11-22 18:20:02.919283: Epoch time: 73.44 s 
2023-11-22 18:20:03.939669:  
2023-11-22 18:20:03.939786: Epoch 95 
2023-11-22 18:20:03.939954: Current learning rate: 0.00914 
2023-11-22 18:21:11.414390: train_loss -0.3299 
2023-11-22 18:21:11.414601: val_loss -0.3582 
2023-11-22 18:21:11.414681: Pseudo dice [0.734, nan] 
2023-11-22 18:21:11.414780: Epoch time: 67.48 s 
2023-11-22 18:21:12.451969:  
2023-11-22 18:21:12.452133: Epoch 96 
2023-11-22 18:21:12.452246: Current learning rate: 0.00913 
2023-11-22 18:22:19.692212: train_loss -0.3392 
2023-11-22 18:22:19.692435: val_loss -0.3614 
2023-11-22 18:22:19.692517: Pseudo dice [0.7301, nan] 
2023-11-22 18:22:19.692603: Epoch time: 67.24 s 
2023-11-22 18:22:20.829344:  
2023-11-22 18:22:20.829528: Epoch 97 
2023-11-22 18:22:20.829658: Current learning rate: 0.00912 
2023-11-22 18:23:28.309777: train_loss -0.3457 
2023-11-22 18:23:28.309978: val_loss -0.3456 
2023-11-22 18:23:28.310055: Pseudo dice [0.7124, nan] 
2023-11-22 18:23:28.310123: Epoch time: 67.48 s 
2023-11-22 18:23:29.351256:  
2023-11-22 18:23:29.351395: Epoch 98 
2023-11-22 18:23:29.351495: Current learning rate: 0.00911 
2023-11-22 18:24:36.727839: train_loss -0.3438 
2023-11-22 18:24:36.728026: val_loss -0.3308 
2023-11-22 18:24:36.728104: Pseudo dice [0.6746, nan] 
2023-11-22 18:24:36.728171: Epoch time: 67.38 s 
2023-11-22 18:24:37.771380:  
2023-11-22 18:24:37.771514: Epoch 99 
2023-11-22 18:24:37.771716: Current learning rate: 0.0091 
2023-11-22 18:25:45.172202: train_loss -0.3404 
2023-11-22 18:25:45.172385: val_loss -0.2994 
2023-11-22 18:25:45.172460: Pseudo dice [0.6297, nan] 
2023-11-22 18:25:45.172528: Epoch time: 67.4 s 
2023-11-22 18:25:46.327940:  
2023-11-22 18:25:46.328066: Epoch 100 
2023-11-22 18:25:46.328168: Current learning rate: 0.0091 
2023-11-22 18:26:53.728251: train_loss -0.3434 
2023-11-22 18:26:53.728457: val_loss -0.3365 
2023-11-22 18:26:53.728534: Pseudo dice [0.6833, nan] 
2023-11-22 18:26:53.728610: Epoch time: 67.4 s 
2023-11-22 18:26:54.876420:  
2023-11-22 18:26:54.876647: Epoch 101 
2023-11-22 18:26:54.876781: Current learning rate: 0.00909 
2023-11-22 18:28:02.330342: train_loss -0.3237 
2023-11-22 18:28:02.330547: val_loss -0.3566 
2023-11-22 18:28:02.330627: Pseudo dice [0.7328, nan] 
2023-11-22 18:28:02.330696: Epoch time: 67.45 s 
2023-11-22 18:28:03.384494:  
2023-11-22 18:28:03.384646: Epoch 102 
2023-11-22 18:28:03.384747: Current learning rate: 0.00908 
2023-11-22 18:29:10.862697: train_loss -0.3279 
2023-11-22 18:29:10.862886: val_loss -0.3425 
2023-11-22 18:29:10.862962: Pseudo dice [0.6986, nan] 
2023-11-22 18:29:10.863034: Epoch time: 67.48 s 
2023-11-22 18:29:11.904329:  
2023-11-22 18:29:11.904630: Epoch 103 
2023-11-22 18:29:11.904789: Current learning rate: 0.00907 
2023-11-22 18:30:19.244384: train_loss -0.356 
2023-11-22 18:30:19.244573: val_loss -0.3625 
2023-11-22 18:30:19.244677: Pseudo dice [0.7354, nan] 
2023-11-22 18:30:19.244762: Epoch time: 67.34 s 
2023-11-22 18:30:20.278311:  
2023-11-22 18:30:20.278445: Epoch 104 
2023-11-22 18:30:20.278551: Current learning rate: 0.00906 
2023-11-22 18:31:27.653481: train_loss -0.3441 
2023-11-22 18:31:27.653660: val_loss -0.3243 
2023-11-22 18:31:27.653737: Pseudo dice [0.6697, nan] 
2023-11-22 18:31:27.653805: Epoch time: 67.38 s 
2023-11-22 18:31:28.686260:  
2023-11-22 18:31:28.686415: Epoch 105 
2023-11-22 18:31:28.686539: Current learning rate: 0.00905 
2023-11-22 18:32:36.101576: train_loss -0.3344 
2023-11-22 18:32:36.101769: val_loss -0.3665 
2023-11-22 18:32:36.101846: Pseudo dice [0.7472, nan] 
2023-11-22 18:32:36.101915: Epoch time: 67.42 s 
2023-11-22 18:32:37.247796:  
2023-11-22 18:32:37.247910: Epoch 106 
2023-11-22 18:32:37.248012: Current learning rate: 0.00904 
2023-11-22 18:33:44.700022: train_loss -0.3348 
2023-11-22 18:33:44.700197: val_loss -0.3669 
2023-11-22 18:33:44.700269: Pseudo dice [0.7439, nan] 
2023-11-22 18:33:44.700375: Epoch time: 67.45 s 
2023-11-22 18:33:45.745063:  
2023-11-22 18:33:45.745187: Epoch 107 
2023-11-22 18:33:45.745291: Current learning rate: 0.00903 
2023-11-22 18:34:53.257575: train_loss -0.3413 
2023-11-22 18:34:53.257764: val_loss -0.3373 
2023-11-22 18:34:53.257841: Pseudo dice [0.6883, nan] 
2023-11-22 18:34:53.257909: Epoch time: 67.51 s 
2023-11-22 18:34:54.301204:  
2023-11-22 18:34:54.301324: Epoch 108 
2023-11-22 18:34:54.301419: Current learning rate: 0.00902 
2023-11-22 18:36:01.791622: train_loss -0.3355 
2023-11-22 18:36:01.791791: val_loss -0.345 
2023-11-22 18:36:01.791865: Pseudo dice [0.6971, nan] 
2023-11-22 18:36:01.791936: Epoch time: 67.49 s 
2023-11-22 18:36:02.844671:  
2023-11-22 18:36:02.844790: Epoch 109 
2023-11-22 18:36:02.844880: Current learning rate: 0.00901 
2023-11-22 18:37:10.400774: train_loss -0.3343 
2023-11-22 18:37:10.400952: val_loss -0.343 
2023-11-22 18:37:10.401027: Pseudo dice [0.7029, nan] 
2023-11-22 18:37:10.401096: Epoch time: 67.56 s 
2023-11-22 18:37:11.543588:  
2023-11-22 18:37:11.543703: Epoch 110 
2023-11-22 18:37:11.543791: Current learning rate: 0.009 
2023-11-22 18:38:19.089424: train_loss -0.3376 
2023-11-22 18:38:19.089582: val_loss -0.3706 
2023-11-22 18:38:19.089653: Pseudo dice [0.7535, nan] 
2023-11-22 18:38:19.089720: Epoch time: 67.55 s 
2023-11-22 18:38:19.089775: Yayy! New best EMA pseudo Dice: 0.7098 
2023-11-22 18:38:20.253693:  
2023-11-22 18:38:20.253866: Epoch 111 
2023-11-22 18:38:20.253965: Current learning rate: 0.009 
2023-11-22 18:39:27.799283: train_loss -0.344 
2023-11-22 18:39:27.799489: val_loss -0.3598 
2023-11-22 18:39:27.799569: Pseudo dice [0.7297, nan] 
2023-11-22 18:39:27.799639: Epoch time: 67.55 s 
2023-11-22 18:39:27.799698: Yayy! New best EMA pseudo Dice: 0.7117 
2023-11-22 18:39:28.957789:  
2023-11-22 18:39:28.957943: Epoch 112 
2023-11-22 18:39:28.958051: Current learning rate: 0.00899 
2023-11-22 18:40:36.403231: train_loss -0.3378 
2023-11-22 18:40:36.403426: val_loss -0.3626 
2023-11-22 18:40:36.403536: Pseudo dice [0.742, nan] 
2023-11-22 18:40:36.403610: Epoch time: 67.45 s 
2023-11-22 18:40:36.403667: Yayy! New best EMA pseudo Dice: 0.7148 
2023-11-22 18:40:37.555581:  
2023-11-22 18:40:37.555701: Epoch 113 
2023-11-22 18:40:37.555792: Current learning rate: 0.00898 
2023-11-22 18:41:45.039124: train_loss -0.3161 
2023-11-22 18:41:45.039301: val_loss -0.3378 
2023-11-22 18:41:45.039383: Pseudo dice [0.6836, nan] 
2023-11-22 18:41:45.039452: Epoch time: 67.48 s 
2023-11-22 18:41:46.077514:  
2023-11-22 18:41:46.077636: Epoch 114 
2023-11-22 18:41:46.077734: Current learning rate: 0.00897 
2023-11-22 18:42:53.423704: train_loss -0.3473 
2023-11-22 18:42:53.423867: val_loss -0.3546 
2023-11-22 18:42:53.423940: Pseudo dice [0.7281, nan] 
2023-11-22 18:42:53.424007: Epoch time: 67.35 s 
2023-11-22 18:42:54.581002:  
2023-11-22 18:42:54.581115: Epoch 115 
2023-11-22 18:42:54.581208: Current learning rate: 0.00896 
2023-11-22 18:44:02.013475: train_loss -0.3271 
2023-11-22 18:44:02.013685: val_loss -0.3337 
2023-11-22 18:44:02.013763: Pseudo dice [0.6797, nan] 
2023-11-22 18:44:02.013830: Epoch time: 67.43 s 
2023-11-22 18:44:03.061253:  
2023-11-22 18:44:03.061371: Epoch 116 
2023-11-22 18:44:03.061465: Current learning rate: 0.00895 
2023-11-22 18:45:10.341120: train_loss -0.3443 
2023-11-22 18:45:10.341315: val_loss -0.3492 
2023-11-22 18:45:10.341389: Pseudo dice [0.7127, nan] 
2023-11-22 18:45:10.341458: Epoch time: 67.28 s 
2023-11-22 18:45:11.501719:  
2023-11-22 18:45:11.501922: Epoch 117 
2023-11-22 18:45:11.502034: Current learning rate: 0.00894 
2023-11-22 18:46:18.834982: train_loss -0.3423 
2023-11-22 18:46:18.835162: val_loss -0.35 
2023-11-22 18:46:18.835238: Pseudo dice [0.7126, nan] 
2023-11-22 18:46:18.835307: Epoch time: 67.33 s 
2023-11-22 18:46:19.885847:  
2023-11-22 18:46:19.885976: Epoch 118 
2023-11-22 18:46:19.886075: Current learning rate: 0.00893 
2023-11-22 18:47:27.106724: train_loss -0.3488 
2023-11-22 18:47:27.106926: val_loss -0.3148 
2023-11-22 18:47:27.107039: Pseudo dice [0.6446, nan] 
2023-11-22 18:47:27.107116: Epoch time: 67.22 s 
2023-11-22 18:47:28.266342:  
2023-11-22 18:47:28.266459: Epoch 119 
2023-11-22 18:47:28.266564: Current learning rate: 0.00892 
2023-11-22 18:48:35.596449: train_loss -0.3439 
2023-11-22 18:48:35.596672: val_loss -0.3468 
2023-11-22 18:48:35.596742: Pseudo dice [0.7059, nan] 
2023-11-22 18:48:35.596809: Epoch time: 67.33 s 
2023-11-22 18:48:36.648832:  
2023-11-22 18:48:36.648955: Epoch 120 
2023-11-22 18:48:36.649053: Current learning rate: 0.00891 
2023-11-22 18:49:43.956955: train_loss -0.3385 
2023-11-22 18:49:43.957158: val_loss -0.3193 
2023-11-22 18:49:43.957238: Pseudo dice [0.6593, nan] 
2023-11-22 18:49:43.957308: Epoch time: 67.31 s 
2023-11-22 18:49:45.024462:  
2023-11-22 18:49:45.024645: Epoch 121 
2023-11-22 18:49:45.024765: Current learning rate: 0.0089 
2023-11-22 18:50:52.292164: train_loss -0.3365 
2023-11-22 18:50:52.292377: val_loss -0.3695 
2023-11-22 18:50:52.292456: Pseudo dice [0.749, nan] 
2023-11-22 18:50:52.292554: Epoch time: 67.27 s 
2023-11-22 18:50:53.350731:  
2023-11-22 18:50:53.350890: Epoch 122 
2023-11-22 18:50:53.350996: Current learning rate: 0.00889 
2023-11-22 18:52:00.411875: train_loss -0.3436 
2023-11-22 18:52:00.412062: val_loss -0.3619 
2023-11-22 18:52:00.412139: Pseudo dice [0.7367, nan] 
2023-11-22 18:52:00.412207: Epoch time: 67.06 s 
2023-11-22 18:52:01.587696:  
2023-11-22 18:52:01.587822: Epoch 123 
2023-11-22 18:52:01.587916: Current learning rate: 0.00889 
2023-11-22 18:53:08.712323: train_loss -0.3423 
2023-11-22 18:53:08.712539: val_loss -0.3628 
2023-11-22 18:53:08.712622: Pseudo dice [0.743, nan] 
2023-11-22 18:53:08.712690: Epoch time: 67.13 s 
2023-11-22 18:53:09.759447:  
2023-11-22 18:53:09.759559: Epoch 124 
2023-11-22 18:53:09.759654: Current learning rate: 0.00888 
2023-11-22 18:54:16.929301: train_loss -0.332 
2023-11-22 18:54:16.929478: val_loss -0.3411 
2023-11-22 18:54:16.929554: Pseudo dice [0.693, nan] 
2023-11-22 18:54:16.929619: Epoch time: 67.17 s 
2023-11-22 18:54:17.985683:  
2023-11-22 18:54:17.985869: Epoch 125 
2023-11-22 18:54:17.985970: Current learning rate: 0.00887 
2023-11-22 18:55:25.112868: train_loss -0.3339 
2023-11-22 18:55:25.113039: val_loss -0.3535 
2023-11-22 18:55:25.113168: Pseudo dice [0.7161, nan] 
2023-11-22 18:55:25.113261: Epoch time: 67.13 s 
2023-11-22 18:55:26.162153:  
2023-11-22 18:55:26.162345: Epoch 126 
2023-11-22 18:55:26.162493: Current learning rate: 0.00886 
2023-11-22 18:56:33.272096: train_loss -0.3351 
2023-11-22 18:56:33.272323: val_loss -0.3391 
2023-11-22 18:56:33.272406: Pseudo dice [0.6997, nan] 
2023-11-22 18:56:33.272475: Epoch time: 67.11 s 
2023-11-22 18:56:34.425608:  
2023-11-22 18:56:34.425721: Epoch 127 
2023-11-22 18:56:34.425817: Current learning rate: 0.00885 
2023-11-22 18:57:41.533064: train_loss -0.3377 
2023-11-22 18:57:41.533246: val_loss -0.3365 
2023-11-22 18:57:41.533321: Pseudo dice [0.6938, nan] 
2023-11-22 18:57:41.533388: Epoch time: 67.11 s 
2023-11-22 18:57:42.590365:  
2023-11-22 18:57:42.590503: Epoch 128 
2023-11-22 18:57:42.590605: Current learning rate: 0.00884 
2023-11-22 18:58:49.625372: train_loss -0.335 
2023-11-22 18:58:49.625545: val_loss -0.3618 
2023-11-22 18:58:49.625620: Pseudo dice [0.7358, nan] 
2023-11-22 18:58:49.625689: Epoch time: 67.04 s 
2023-11-22 18:58:50.678884:  
2023-11-22 18:58:50.679015: Epoch 129 
2023-11-22 18:58:50.679117: Current learning rate: 0.00883 
2023-11-22 18:59:57.811189: train_loss -0.3465 
2023-11-22 18:59:57.811390: val_loss -0.3666 
2023-11-22 18:59:57.811492: Pseudo dice [0.748, nan] 
2023-11-22 18:59:57.811563: Epoch time: 67.13 s 
2023-11-22 18:59:58.870095:  
2023-11-22 18:59:58.870213: Epoch 130 
2023-11-22 18:59:58.870311: Current learning rate: 0.00882 
2023-11-22 19:01:06.196040: train_loss -0.332 
2023-11-22 19:01:06.196222: val_loss -0.3724 
2023-11-22 19:01:06.196294: Pseudo dice [0.7592, nan] 
2023-11-22 19:01:06.196360: Epoch time: 67.33 s 
2023-11-22 19:01:06.196417: Yayy! New best EMA pseudo Dice: 0.7186 
2023-11-22 19:01:07.471000:  
2023-11-22 19:01:07.471193: Epoch 131 
2023-11-22 19:01:07.471299: Current learning rate: 0.00881 
2023-11-22 19:02:14.879499: train_loss -0.3384 
2023-11-22 19:02:14.879712: val_loss -0.3384 
2023-11-22 19:02:14.879791: Pseudo dice [0.69, nan] 
2023-11-22 19:02:14.879860: Epoch time: 67.41 s 
2023-11-22 19:02:15.945994:  
2023-11-22 19:02:15.946120: Epoch 132 
2023-11-22 19:02:15.946218: Current learning rate: 0.0088 
2023-11-22 19:03:23.354773: train_loss -0.3387 
2023-11-22 19:03:23.354951: val_loss -0.3466 
2023-11-22 19:03:23.355057: Pseudo dice [0.7095, nan] 
2023-11-22 19:03:23.355134: Epoch time: 67.41 s 
2023-11-22 19:03:24.415532:  
2023-11-22 19:03:24.415709: Epoch 133 
2023-11-22 19:03:24.415849: Current learning rate: 0.00879 
2023-11-22 19:04:31.838240: train_loss -0.3303 
2023-11-22 19:04:31.838433: val_loss -0.3668 
2023-11-22 19:04:31.838519: Pseudo dice [0.7398, nan] 
2023-11-22 19:04:31.838596: Epoch time: 67.42 s 
2023-11-22 19:04:32.895805:  
2023-11-22 19:04:32.895994: Epoch 134 
2023-11-22 19:04:32.896133: Current learning rate: 0.00879 
2023-11-22 19:05:40.254681: train_loss -0.3545 
2023-11-22 19:05:40.254858: val_loss -0.357 
2023-11-22 19:05:40.254934: Pseudo dice [0.7181, nan] 
2023-11-22 19:05:40.255000: Epoch time: 67.36 s 
2023-11-22 19:05:41.423700:  
2023-11-22 19:05:41.423841: Epoch 135 
2023-11-22 19:05:41.423946: Current learning rate: 0.00878 
2023-11-22 19:06:48.778676: train_loss -0.3309 
2023-11-22 19:06:48.778866: val_loss -0.3413 
2023-11-22 19:06:48.778948: Pseudo dice [0.703, nan] 
2023-11-22 19:06:48.779056: Epoch time: 67.36 s 
2023-11-22 19:06:49.854598:  
2023-11-22 19:06:49.854724: Epoch 136 
2023-11-22 19:06:49.854825: Current learning rate: 0.00877 
2023-11-22 19:07:57.345052: train_loss -0.3259 
2023-11-22 19:07:57.345262: val_loss -0.3457 
2023-11-22 19:07:57.345342: Pseudo dice [0.7009, nan] 
2023-11-22 19:07:57.345411: Epoch time: 67.49 s 
2023-11-22 19:07:58.414352:  
2023-11-22 19:07:58.414480: Epoch 137 
2023-11-22 19:07:58.414582: Current learning rate: 0.00876 
2023-11-22 19:09:06.072921: train_loss -0.3431 
2023-11-22 19:09:06.073135: val_loss -0.3579 
2023-11-22 19:09:06.073213: Pseudo dice [0.7291, nan] 
2023-11-22 19:09:06.073280: Epoch time: 67.66 s 
2023-11-22 19:09:07.139779:  
2023-11-22 19:09:07.139926: Epoch 138 
2023-11-22 19:09:07.140046: Current learning rate: 0.00875 
2023-11-22 19:10:14.655054: train_loss -0.3466 
2023-11-22 19:10:14.655241: val_loss -0.3519 
2023-11-22 19:10:14.655324: Pseudo dice [0.7175, nan] 
2023-11-22 19:10:14.655440: Epoch time: 67.52 s 
2023-11-22 19:10:15.728862:  
2023-11-22 19:10:15.729031: Epoch 139 
2023-11-22 19:10:15.729131: Current learning rate: 0.00874 
2023-11-22 19:11:23.220678: train_loss -0.3388 
2023-11-22 19:11:23.220882: val_loss -0.357 
2023-11-22 19:11:23.220961: Pseudo dice [0.7223, nan] 
2023-11-22 19:11:23.221030: Epoch time: 67.49 s 
2023-11-22 19:11:24.392252:  
2023-11-22 19:11:24.392409: Epoch 140 
2023-11-22 19:11:24.392521: Current learning rate: 0.00873 
2023-11-22 19:12:31.819840: train_loss -0.3502 
2023-11-22 19:12:31.820062: val_loss -0.3508 
2023-11-22 19:12:31.820141: Pseudo dice [0.7173, nan] 
2023-11-22 19:12:31.820210: Epoch time: 67.43 s 
2023-11-22 19:12:32.899055:  
2023-11-22 19:12:32.899215: Epoch 141 
2023-11-22 19:12:32.899310: Current learning rate: 0.00872 
2023-11-22 19:13:40.467347: train_loss -0.3505 
2023-11-22 19:13:40.467521: val_loss -0.3399 
2023-11-22 19:13:40.467593: Pseudo dice [0.6971, nan] 
2023-11-22 19:13:40.467660: Epoch time: 67.57 s 
2023-11-22 19:13:41.543440:  
2023-11-22 19:13:41.543569: Epoch 142 
2023-11-22 19:13:41.543671: Current learning rate: 0.00871 
2023-11-22 19:14:49.117864: train_loss -0.337 
2023-11-22 19:14:49.118041: val_loss -0.3285 
2023-11-22 19:14:49.118109: Pseudo dice [0.6772, nan] 
2023-11-22 19:14:49.118174: Epoch time: 67.58 s 
2023-11-22 19:14:50.194748:  
2023-11-22 19:14:50.194868: Epoch 143 
2023-11-22 19:14:50.194967: Current learning rate: 0.0087 
2023-11-22 19:15:57.784281: train_loss -0.3307 
2023-11-22 19:15:57.784494: val_loss -0.3389 
2023-11-22 19:15:57.784582: Pseudo dice [0.6787, nan] 
2023-11-22 19:15:57.784692: Epoch time: 67.59 s 
2023-11-22 19:15:58.863506:  
2023-11-22 19:15:58.863682: Epoch 144 
2023-11-22 19:15:58.863827: Current learning rate: 0.00869 
2023-11-22 19:17:06.318490: train_loss -0.3484 
2023-11-22 19:17:06.318664: val_loss -0.3618 
2023-11-22 19:17:06.318740: Pseudo dice [0.7374, nan] 
2023-11-22 19:17:06.318808: Epoch time: 67.46 s 
2023-11-22 19:17:07.509197:  
2023-11-22 19:17:07.509305: Epoch 145 
2023-11-22 19:17:07.509425: Current learning rate: 0.00868 
2023-11-22 19:18:15.025221: train_loss -0.3463 
2023-11-22 19:18:15.025419: val_loss -0.3283 
2023-11-22 19:18:15.025504: Pseudo dice [0.6767, nan] 
2023-11-22 19:18:15.025576: Epoch time: 67.52 s 
2023-11-22 19:18:16.089219:  
2023-11-22 19:18:16.089345: Epoch 146 
2023-11-22 19:18:16.089445: Current learning rate: 0.00868 
2023-11-22 19:19:23.244755: train_loss -0.3323 
2023-11-22 19:19:23.244928: val_loss -0.3753 
2023-11-22 19:19:23.245033: Pseudo dice [0.7624, nan] 
2023-11-22 19:19:23.245108: Epoch time: 67.16 s 
2023-11-22 19:19:24.335916:  
2023-11-22 19:19:24.336054: Epoch 147 
2023-11-22 19:19:24.336164: Current learning rate: 0.00867 
2023-11-22 19:20:31.597800: train_loss -0.3365 
2023-11-22 19:20:31.598047: val_loss -0.357 
2023-11-22 19:20:31.598128: Pseudo dice [0.73, nan] 
2023-11-22 19:20:31.598197: Epoch time: 67.26 s 
2023-11-22 19:20:32.701834:  
2023-11-22 19:20:32.701951: Epoch 148 
2023-11-22 19:20:32.702044: Current learning rate: 0.00866 
2023-11-22 19:21:39.951284: train_loss -0.3293 
2023-11-22 19:21:39.951442: val_loss -0.3613 
2023-11-22 19:21:39.951516: Pseudo dice [0.7364, nan] 
2023-11-22 19:21:39.951584: Epoch time: 67.25 s 
2023-11-22 19:21:41.134625:  
2023-11-22 19:21:41.134743: Epoch 149 
2023-11-22 19:21:41.134841: Current learning rate: 0.00865 
2023-11-22 19:22:48.440382: train_loss -0.3452 
2023-11-22 19:22:48.440595: val_loss -0.3444 
2023-11-22 19:22:48.440674: Pseudo dice [0.7033, nan] 
2023-11-22 19:22:48.440770: Epoch time: 67.31 s 
2023-11-22 19:22:49.637459:  
2023-11-22 19:22:49.637588: Epoch 150 
2023-11-22 19:22:49.637680: Current learning rate: 0.00864 
2023-11-22 19:23:56.949405: train_loss -0.3316 
2023-11-22 19:23:56.949588: val_loss -0.3498 
2023-11-22 19:23:56.949666: Pseudo dice [0.7128, nan] 
2023-11-22 19:23:56.949738: Epoch time: 67.31 s 
2023-11-22 19:23:58.032191:  
2023-11-22 19:23:58.032312: Epoch 151 
2023-11-22 19:23:58.032411: Current learning rate: 0.00863 
2023-11-22 19:25:05.301198: train_loss -0.3478 
2023-11-22 19:25:05.301399: val_loss -0.342 
2023-11-22 19:25:05.301478: Pseudo dice [0.6952, nan] 
2023-11-22 19:25:05.301548: Epoch time: 67.27 s 
2023-11-22 19:25:06.585389:  
2023-11-22 19:25:06.585508: Epoch 152 
2023-11-22 19:25:06.585613: Current learning rate: 0.00862 
2023-11-22 19:26:13.945123: train_loss -0.3295 
2023-11-22 19:26:13.945312: val_loss -0.3608 
2023-11-22 19:26:13.945388: Pseudo dice [0.7239, nan] 
2023-11-22 19:26:13.945458: Epoch time: 67.36 s 
2023-11-22 19:26:15.037710:  
2023-11-22 19:26:15.037830: Epoch 153 
2023-11-22 19:26:15.037934: Current learning rate: 0.00861 
2023-11-22 19:27:22.226310: train_loss -0.3516 
2023-11-22 19:27:22.226533: val_loss -0.3501 
2023-11-22 19:27:22.226609: Pseudo dice [0.7092, nan] 
2023-11-22 19:27:22.226680: Epoch time: 67.19 s 
2023-11-22 19:27:23.415995:  
2023-11-22 19:27:23.416114: Epoch 154 
2023-11-22 19:27:23.416210: Current learning rate: 0.0086 
2023-11-22 19:28:30.874940: train_loss -0.3348 
2023-11-22 19:28:30.875161: val_loss -0.355 
2023-11-22 19:28:30.875241: Pseudo dice [0.7206, nan] 
2023-11-22 19:28:30.875312: Epoch time: 67.46 s 
2023-11-22 19:28:31.961387:  
2023-11-22 19:28:31.961508: Epoch 155 
2023-11-22 19:28:31.961607: Current learning rate: 0.00859 
2023-11-22 19:29:39.461440: train_loss -0.3403 
2023-11-22 19:29:39.461642: val_loss -0.3048 
2023-11-22 19:29:39.461722: Pseudo dice [0.6316, nan] 
2023-11-22 19:29:39.461794: Epoch time: 67.5 s 
2023-11-22 19:29:40.553228:  
2023-11-22 19:29:40.553377: Epoch 156 
2023-11-22 19:29:40.553483: Current learning rate: 0.00858 
2023-11-22 19:30:48.108234: train_loss -0.3397 
2023-11-22 19:30:48.108413: val_loss -0.3737 
2023-11-22 19:30:48.108489: Pseudo dice [0.7633, nan] 
2023-11-22 19:30:48.108557: Epoch time: 67.56 s 
2023-11-22 19:30:49.206065:  
2023-11-22 19:30:49.206412: Epoch 157 
2023-11-22 19:30:49.206607: Current learning rate: 0.00858 
2023-11-22 19:31:56.615261: train_loss -0.3436 
2023-11-22 19:31:56.615465: val_loss -0.3579 
2023-11-22 19:31:56.615588: Pseudo dice [0.7315, nan] 
2023-11-22 19:31:56.615663: Epoch time: 67.41 s 
2023-11-22 19:31:57.829800:  
2023-11-22 19:31:57.829918: Epoch 158 
2023-11-22 19:31:57.830016: Current learning rate: 0.00857 
2023-11-22 19:33:05.399560: train_loss -0.3325 
2023-11-22 19:33:05.399767: val_loss -0.3527 
2023-11-22 19:33:05.399845: Pseudo dice [0.7162, nan] 
2023-11-22 19:33:05.399914: Epoch time: 67.57 s 
2023-11-22 19:33:06.494941:  
2023-11-22 19:33:06.495094: Epoch 159 
2023-11-22 19:33:06.495198: Current learning rate: 0.00856 
2023-11-22 19:34:13.853014: train_loss -0.3404 
2023-11-22 19:34:13.853191: val_loss -0.3547 
2023-11-22 19:34:13.853267: Pseudo dice [0.7243, nan] 
2023-11-22 19:34:13.853335: Epoch time: 67.36 s 
2023-11-22 19:34:14.949592:  
2023-11-22 19:34:14.949781: Epoch 160 
2023-11-22 19:34:14.949888: Current learning rate: 0.00855 
2023-11-22 19:35:22.442984: train_loss -0.3306 
2023-11-22 19:35:22.443161: val_loss -0.3552 
2023-11-22 19:35:22.443235: Pseudo dice [0.7241, nan] 
2023-11-22 19:35:22.443346: Epoch time: 67.49 s 
2023-11-22 19:35:23.545499:  
2023-11-22 19:35:23.545627: Epoch 161 
2023-11-22 19:35:23.545724: Current learning rate: 0.00854 
2023-11-22 19:36:27.879647: train_loss -0.3413 
2023-11-22 19:36:27.879816: val_loss -0.3645 
2023-11-22 19:36:27.879890: Pseudo dice [0.742, nan] 
2023-11-22 19:36:27.879957: Epoch time: 64.33 s 
2023-11-22 19:36:28.973741:  
2023-11-22 19:36:28.973883: Epoch 162 
2023-11-22 19:36:28.973987: Current learning rate: 0.00853 
2023-11-22 19:37:28.500783: train_loss -0.3354 
2023-11-22 19:37:28.500994: val_loss -0.3629 
2023-11-22 19:37:28.501071: Pseudo dice [0.7381, nan] 
2023-11-22 19:37:28.501145: Epoch time: 59.53 s 
2023-11-22 19:37:28.501207: Yayy! New best EMA pseudo Dice: 0.7205 
2023-11-22 19:37:29.819038:  
2023-11-22 19:37:29.819260: Epoch 163 
2023-11-22 19:37:29.819377: Current learning rate: 0.00852 
2023-11-22 19:38:29.422575: train_loss -0.3429 
2023-11-22 19:38:29.422758: val_loss -0.367 
2023-11-22 19:38:29.422833: Pseudo dice [0.7483, nan] 
2023-11-22 19:38:29.422938: Epoch time: 59.6 s 
2023-11-22 19:38:29.422996: Yayy! New best EMA pseudo Dice: 0.7233 
2023-11-22 19:38:30.623834:  
2023-11-22 19:38:30.623949: Epoch 164 
2023-11-22 19:38:30.624052: Current learning rate: 0.00851 
2023-11-22 19:39:30.241697: train_loss -0.3459 
2023-11-22 19:39:30.241881: val_loss -0.3392 
2023-11-22 19:39:30.241965: Pseudo dice [0.6973, nan] 
2023-11-22 19:39:30.242042: Epoch time: 59.62 s 
2023-11-22 19:39:31.302447:  
2023-11-22 19:39:31.302595: Epoch 165 
2023-11-22 19:39:31.302699: Current learning rate: 0.0085 
2023-11-22 19:40:30.947457: train_loss -0.3377 
2023-11-22 19:40:30.947634: val_loss -0.3494 
2023-11-22 19:40:30.947711: Pseudo dice [0.7075, nan] 
2023-11-22 19:40:30.947777: Epoch time: 59.65 s 
2023-11-22 19:40:32.001261:  
2023-11-22 19:40:32.001416: Epoch 166 
2023-11-22 19:40:32.001518: Current learning rate: 0.00849 
2023-11-22 19:41:31.707195: train_loss -0.3489 
2023-11-22 19:41:31.707378: val_loss -0.3599 
2023-11-22 19:41:31.707498: Pseudo dice [0.7273, nan] 
2023-11-22 19:41:31.707580: Epoch time: 59.71 s 
2023-11-22 19:41:32.875357:  
2023-11-22 19:41:32.875470: Epoch 167 
2023-11-22 19:41:32.875569: Current learning rate: 0.00848 
2023-11-22 19:42:32.576329: train_loss -0.3497 
2023-11-22 19:42:32.576510: val_loss -0.33 
2023-11-22 19:42:32.576602: Pseudo dice [0.6791, nan] 
2023-11-22 19:42:32.576679: Epoch time: 59.7 s 
2023-11-22 19:42:33.666452:  
2023-11-22 19:42:33.666569: Epoch 168 
2023-11-22 19:42:33.666671: Current learning rate: 0.00847 
2023-11-22 19:43:33.381292: train_loss -0.3483 
2023-11-22 19:43:33.381463: val_loss -0.3609 
2023-11-22 19:43:33.381537: Pseudo dice [0.7327, nan] 
2023-11-22 19:43:33.381603: Epoch time: 59.72 s 
2023-11-22 19:43:34.464651:  
2023-11-22 19:43:34.464767: Epoch 169 
2023-11-22 19:43:34.464865: Current learning rate: 0.00847 
2023-11-22 19:44:34.211947: train_loss -0.3378 
2023-11-22 19:44:34.212128: val_loss -0.3532 
2023-11-22 19:44:34.212201: Pseudo dice [0.7286, nan] 
2023-11-22 19:44:34.212296: Epoch time: 59.75 s 
2023-11-22 19:44:35.289638:  
2023-11-22 19:44:35.289848: Epoch 170 
2023-11-22 19:44:35.289961: Current learning rate: 0.00846 
2023-11-22 19:45:34.954873: train_loss -0.3362 
2023-11-22 19:45:34.955058: val_loss -0.3579 
2023-11-22 19:45:34.955133: Pseudo dice [0.7196, nan] 
2023-11-22 19:45:34.955200: Epoch time: 59.67 s 
2023-11-22 19:45:36.153401:  
2023-11-22 19:45:36.153534: Epoch 171 
2023-11-22 19:45:36.153633: Current learning rate: 0.00845 
2023-11-22 19:46:35.882861: train_loss -0.3427 
2023-11-22 19:46:35.883076: val_loss -0.3526 
2023-11-22 19:46:35.883156: Pseudo dice [0.7174, nan] 
2023-11-22 19:46:35.883224: Epoch time: 59.73 s 
2023-11-22 19:46:36.971226:  
2023-11-22 19:46:36.971339: Epoch 172 
2023-11-22 19:46:36.971435: Current learning rate: 0.00844 
2023-11-22 19:47:36.717851: train_loss -0.3595 
2023-11-22 19:47:36.718082: val_loss -0.3449 
2023-11-22 19:47:36.718155: Pseudo dice [0.7064, nan] 
2023-11-22 19:47:36.718222: Epoch time: 59.75 s 
2023-11-22 19:47:37.805748:  
2023-11-22 19:47:37.805874: Epoch 173 
2023-11-22 19:47:37.805975: Current learning rate: 0.00843 
2023-11-22 19:48:37.583601: train_loss -0.3388 
2023-11-22 19:48:37.583798: val_loss -0.3563 
2023-11-22 19:48:37.583879: Pseudo dice [0.7265, nan] 
2023-11-22 19:48:37.583949: Epoch time: 59.78 s 
2023-11-22 19:48:38.675945:  
2023-11-22 19:48:38.676068: Epoch 174 
2023-11-22 19:48:38.676165: Current learning rate: 0.00842 
2023-11-22 19:49:38.336648: train_loss -0.3393 
2023-11-22 19:49:38.336870: val_loss -0.3433 
2023-11-22 19:49:38.336949: Pseudo dice [0.7022, nan] 
2023-11-22 19:49:38.337020: Epoch time: 59.66 s 
2023-11-22 19:49:39.428598:  
2023-11-22 19:49:39.428790: Epoch 175 
2023-11-22 19:49:39.428931: Current learning rate: 0.00841 
2023-11-22 19:50:39.118972: train_loss -0.3356 
2023-11-22 19:50:39.119196: val_loss -0.3462 
2023-11-22 19:50:39.119279: Pseudo dice [0.7078, nan] 
2023-11-22 19:50:39.119353: Epoch time: 59.69 s 
2023-11-22 19:50:40.304537:  
2023-11-22 19:50:40.304786: Epoch 176 
2023-11-22 19:50:40.304933: Current learning rate: 0.0084 
2023-11-22 19:51:40.002031: train_loss -0.3385 
2023-11-22 19:51:40.002203: val_loss -0.3238 
2023-11-22 19:51:40.002282: Pseudo dice [0.662, nan] 
2023-11-22 19:51:40.002351: Epoch time: 59.7 s 
2023-11-22 19:51:41.089840:  
2023-11-22 19:51:41.090011: Epoch 177 
2023-11-22 19:51:41.090147: Current learning rate: 0.00839 
2023-11-22 19:52:40.772911: train_loss -0.3391 
2023-11-22 19:52:40.773111: val_loss -0.3581 
2023-11-22 19:52:40.773329: Pseudo dice [0.7332, nan] 
2023-11-22 19:52:40.773407: Epoch time: 59.68 s 
2023-11-22 19:52:41.868988:  
2023-11-22 19:52:41.869112: Epoch 178 
2023-11-22 19:52:41.869214: Current learning rate: 0.00838 
2023-11-22 19:53:41.656814: train_loss -0.3334 
2023-11-22 19:53:41.657006: val_loss -0.3671 
2023-11-22 19:53:41.657088: Pseudo dice [0.7478, nan] 
2023-11-22 19:53:41.657159: Epoch time: 59.79 s 
2023-11-22 19:53:42.734503:  
2023-11-22 19:53:42.734692: Epoch 179 
2023-11-22 19:53:42.734831: Current learning rate: 0.00837 
2023-11-22 19:54:42.459177: train_loss -0.3317 
2023-11-22 19:54:42.459388: val_loss -0.3651 
2023-11-22 19:54:42.459492: Pseudo dice [0.7394, nan] 
2023-11-22 19:54:42.459592: Epoch time: 59.73 s 
2023-11-22 19:54:43.660632:  
2023-11-22 19:54:43.660760: Epoch 180 
2023-11-22 19:54:43.660848: Current learning rate: 0.00836 
2023-11-22 19:55:43.306992: train_loss -0.3356 
2023-11-22 19:55:43.307206: val_loss -0.3313 
2023-11-22 19:55:43.307290: Pseudo dice [0.6838, nan] 
2023-11-22 19:55:43.307406: Epoch time: 59.65 s 
2023-11-22 19:55:44.386406:  
2023-11-22 19:55:44.386525: Epoch 181 
2023-11-22 19:55:44.386627: Current learning rate: 0.00836 
2023-11-22 19:56:44.161201: train_loss -0.3494 
2023-11-22 19:56:44.161381: val_loss -0.3517 
2023-11-22 19:56:44.161457: Pseudo dice [0.7162, nan] 
2023-11-22 19:56:44.161526: Epoch time: 59.78 s 
2023-11-22 19:56:45.236603:  
2023-11-22 19:56:45.236848: Epoch 182 
2023-11-22 19:56:45.236958: Current learning rate: 0.00835 
2023-11-22 19:57:44.917144: train_loss -0.3471 
2023-11-22 19:57:44.917376: val_loss -0.3673 
2023-11-22 19:57:44.917470: Pseudo dice [0.745, nan] 
2023-11-22 19:57:44.917539: Epoch time: 59.68 s 
2023-11-22 19:57:45.994522:  
2023-11-22 19:57:45.994702: Epoch 183 
2023-11-22 19:57:45.994860: Current learning rate: 0.00834 
2023-11-22 19:58:45.798971: train_loss -0.3423 
2023-11-22 19:58:45.799147: val_loss -0.3874 
2023-11-22 19:58:45.799252: Pseudo dice [0.7803, nan] 
2023-11-22 19:58:45.799323: Epoch time: 59.81 s 
2023-11-22 19:58:45.799414: Yayy! New best EMA pseudo Dice: 0.7244 
2023-11-22 19:58:46.994600:  
2023-11-22 19:58:46.994726: Epoch 184 
2023-11-22 19:58:46.994824: Current learning rate: 0.00833 
2023-11-22 19:59:46.781770: train_loss -0.3521 
2023-11-22 19:59:46.781960: val_loss -0.3639 
2023-11-22 19:59:46.782034: Pseudo dice [0.7386, nan] 
2023-11-22 19:59:46.782104: Epoch time: 59.79 s 
2023-11-22 19:59:46.782166: Yayy! New best EMA pseudo Dice: 0.7258 
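The "best EMA pseudo Dice" lines track an exponential moving average of the per-epoch mean foreground Dice rather than the raw value. A decay factor of 0.9 is consistent with the numbers in this log (e.g. 0.9 * 0.7244 + 0.1 * 0.7386 ≈ 0.7258, the new best reported at epoch 184); a minimal sketch under that assumption:

```python
def update_ema_dice(prev_ema: float, current_dice: float, alpha: float = 0.9) -> float:
    # Exponential moving average of pseudo Dice; alpha = 0.9 reproduces
    # the values printed in this log (assumed, not stated in the log itself).
    return alpha * prev_ema + (1 - alpha) * current_dice

# Epoch 183 reported best EMA 0.7244; epoch 184's pseudo Dice was 0.7386:
print(round(update_ema_dice(0.7244, 0.7386), 4))  # 0.7258, the new best at epoch 184
```

Because the EMA smooths out per-epoch noise, a single high pseudo Dice (e.g. 0.7803 at epoch 183) moves the best-checkpoint criterion only slightly.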
2023-11-22 19:59:48.096298:  
2023-11-22 19:59:48.096464: Epoch 185 
2023-11-22 19:59:48.096611: Current learning rate: 0.00832 
2023-11-22 20:00:47.784010: train_loss -0.3416 
2023-11-22 20:00:47.784190: val_loss -0.3176 
2023-11-22 20:00:47.784295: Pseudo dice [0.6578, nan] 
2023-11-22 20:00:47.784376: Epoch time: 59.69 s 
2023-11-22 20:00:48.971841:  
2023-11-22 20:00:48.971972: Epoch 186 
2023-11-22 20:00:48.972071: Current learning rate: 0.00831 
2023-11-22 20:01:48.653799: train_loss -0.3525 
2023-11-22 20:01:48.653990: val_loss -0.352 
2023-11-22 20:01:48.654102: Pseudo dice [0.7175, nan] 
2023-11-22 20:01:48.654176: Epoch time: 59.68 s 
2023-11-22 20:01:49.755517:  
2023-11-22 20:01:49.755648: Epoch 187 
2023-11-22 20:01:49.755754: Current learning rate: 0.0083 
2023-11-22 20:02:49.509622: train_loss -0.3437 
2023-11-22 20:02:49.509840: val_loss -0.3328 
2023-11-22 20:02:49.510020: Pseudo dice [0.6904, nan] 
2023-11-22 20:02:49.510100: Epoch time: 59.75 s 
2023-11-22 20:02:50.593308:  
2023-11-22 20:02:50.593443: Epoch 188 
2023-11-22 20:02:50.593592: Current learning rate: 0.00829 
2023-11-22 20:03:50.383467: train_loss -0.3353 
2023-11-22 20:03:50.383679: val_loss -0.334 
2023-11-22 20:03:50.383792: Pseudo dice [0.6837, nan] 
2023-11-22 20:03:50.383869: Epoch time: 59.79 s 
2023-11-22 20:03:51.465517:  
2023-11-22 20:03:51.465719: Epoch 189 
2023-11-22 20:03:51.465874: Current learning rate: 0.00828 
2023-11-22 20:04:51.053074: train_loss -0.3245 
2023-11-22 20:04:51.053265: val_loss -0.3412 
2023-11-22 20:04:51.053349: Pseudo dice [0.6881, nan] 
2023-11-22 20:04:51.053464: Epoch time: 59.59 s 
2023-11-22 20:04:52.248427:  
2023-11-22 20:04:52.248626: Epoch 190 
2023-11-22 20:04:52.248741: Current learning rate: 0.00827 
2023-11-22 20:05:51.945756: train_loss -0.3399 
2023-11-22 20:05:51.945957: val_loss -0.3559 
2023-11-22 20:05:51.946041: Pseudo dice [0.7308, nan] 
2023-11-22 20:05:51.946153: Epoch time: 59.7 s 
2023-11-22 20:05:53.024829:  
2023-11-22 20:05:53.024955: Epoch 191 
2023-11-22 20:05:53.025071: Current learning rate: 0.00826 
2023-11-22 20:06:52.747676: train_loss -0.3447 
2023-11-22 20:06:52.747866: val_loss -0.3344 
2023-11-22 20:06:52.747954: Pseudo dice [0.6815, nan] 
2023-11-22 20:06:52.748046: Epoch time: 59.72 s 
2023-11-22 20:06:53.841996:  
2023-11-22 20:06:53.842121: Epoch 192 
2023-11-22 20:06:53.842230: Current learning rate: 0.00825 
2023-11-22 20:07:53.505510: train_loss -0.3408 
2023-11-22 20:07:53.505694: val_loss -0.3272 
2023-11-22 20:07:53.505772: Pseudo dice [0.6643, nan] 
2023-11-22 20:07:53.505841: Epoch time: 59.66 s 
2023-11-22 20:07:54.597534:  
2023-11-22 20:07:54.597729: Epoch 193 
2023-11-22 20:07:54.597870: Current learning rate: 0.00824 
2023-11-22 20:08:54.268243: train_loss -0.3466 
2023-11-22 20:08:54.268438: val_loss -0.3225 
2023-11-22 20:08:54.268598: Pseudo dice [0.6569, nan] 
2023-11-22 20:08:54.268701: Epoch time: 59.67 s 
2023-11-22 20:08:55.475417:  
2023-11-22 20:08:55.475605: Epoch 194 
2023-11-22 20:08:55.475753: Current learning rate: 0.00824 
2023-11-22 20:09:55.096541: train_loss -0.3458 
2023-11-22 20:09:55.096775: val_loss -0.3544 
2023-11-22 20:09:55.096858: Pseudo dice [0.7196, nan] 
2023-11-22 20:09:55.096930: Epoch time: 59.62 s 
2023-11-22 20:09:56.199569:  
2023-11-22 20:09:56.199698: Epoch 195 
2023-11-22 20:09:56.199827: Current learning rate: 0.00823 
2023-11-22 20:10:55.972875: train_loss -0.3336 
2023-11-22 20:10:55.973072: val_loss -0.3609 
2023-11-22 20:10:55.973149: Pseudo dice [0.7263, nan] 
2023-11-22 20:10:55.973218: Epoch time: 59.77 s 
2023-11-22 20:10:57.084340:  
2023-11-22 20:10:57.084507: Epoch 196 
2023-11-22 20:10:57.084624: Current learning rate: 0.00822 
2023-11-22 20:11:56.807789: train_loss -0.3377 
2023-11-22 20:11:56.807987: val_loss -0.356 
2023-11-22 20:11:56.808068: Pseudo dice [0.7164, nan] 
2023-11-22 20:11:56.808150: Epoch time: 59.72 s 
2023-11-22 20:11:57.917122:  
2023-11-22 20:11:57.917247: Epoch 197 
2023-11-22 20:11:57.917363: Current learning rate: 0.00821 
2023-11-22 20:12:57.549682: train_loss -0.3383 
2023-11-22 20:12:57.549872: val_loss -0.3355 
2023-11-22 20:12:57.549981: Pseudo dice [0.6777, nan] 
2023-11-22 20:12:57.550055: Epoch time: 59.63 s 
2023-11-22 20:12:58.739728:  
2023-11-22 20:12:58.739897: Epoch 198 
2023-11-22 20:12:58.740066: Current learning rate: 0.0082 
2023-11-22 20:13:58.502864: train_loss -0.349 
2023-11-22 20:13:58.503049: val_loss -0.3566 
2023-11-22 20:13:58.503127: Pseudo dice [0.7327, nan] 
2023-11-22 20:13:58.503199: Epoch time: 59.76 s 
2023-11-22 20:13:59.625831:  
2023-11-22 20:13:59.625999: Epoch 199 
2023-11-22 20:13:59.626173: Current learning rate: 0.00819 
2023-11-22 20:14:59.374293: train_loss -0.3442 
2023-11-22 20:14:59.374512: val_loss -0.3604 
2023-11-22 20:14:59.374589: Pseudo dice [0.7371, nan] 
2023-11-22 20:14:59.374665: Epoch time: 59.75 s 
2023-11-22 20:15:00.590217:  
2023-11-22 20:15:00.590407: Epoch 200 
2023-11-22 20:15:00.590539: Current learning rate: 0.00818 
2023-11-22 20:16:00.322639: train_loss -0.3476 
2023-11-22 20:16:00.322819: val_loss -0.3594 
2023-11-22 20:16:00.322901: Pseudo dice [0.7276, nan] 
2023-11-22 20:16:00.323007: Epoch time: 59.73 s 
2023-11-22 20:16:01.425557:  
2023-11-22 20:16:01.425682: Epoch 201 
2023-11-22 20:16:01.425796: Current learning rate: 0.00817 
2023-11-22 20:17:01.102606: train_loss -0.3346 
2023-11-22 20:17:01.102818: val_loss -0.3544 
2023-11-22 20:17:01.102901: Pseudo dice [0.7181, nan] 
2023-11-22 20:17:01.102973: Epoch time: 59.68 s 
2023-11-22 20:17:02.204905:  
2023-11-22 20:17:02.205082: Epoch 202 
2023-11-22 20:17:02.205235: Current learning rate: 0.00816 
2023-11-22 20:18:01.784105: train_loss -0.3272 
2023-11-22 20:18:01.784289: val_loss -0.3493 
2023-11-22 20:18:01.784379: Pseudo dice [0.7073, nan] 
2023-11-22 20:18:01.784460: Epoch time: 59.58 s 
2023-11-22 20:18:02.983638:  
2023-11-22 20:18:02.983960: Epoch 203 
2023-11-22 20:18:02.984087: Current learning rate: 0.00815 
2023-11-22 20:19:02.580653: train_loss -0.3396 
2023-11-22 20:19:02.580849: val_loss -0.3694 
2023-11-22 20:19:02.580951: Pseudo dice [0.753, nan] 
2023-11-22 20:19:02.581026: Epoch time: 59.6 s 
2023-11-22 20:19:03.687472:  
2023-11-22 20:19:03.687594: Epoch 204 
2023-11-22 20:19:03.687702: Current learning rate: 0.00814 
2023-11-22 20:20:03.461090: train_loss -0.3418 
2023-11-22 20:20:03.461279: val_loss -0.3468 
2023-11-22 20:20:03.461359: Pseudo dice [0.708, nan] 
2023-11-22 20:20:03.461431: Epoch time: 59.77 s 
2023-11-22 20:20:04.563876:  
2023-11-22 20:20:04.564017: Epoch 205 
2023-11-22 20:20:04.564141: Current learning rate: 0.00813 
2023-11-22 20:21:04.213028: train_loss -0.3322 
2023-11-22 20:21:04.213240: val_loss -0.3467 
2023-11-22 20:21:04.213362: Pseudo dice [0.7119, nan] 
2023-11-22 20:21:04.213438: Epoch time: 59.65 s 
2023-11-22 20:21:05.257583:  
2023-11-22 20:21:05.257712: Epoch 206 
2023-11-22 20:21:05.257821: Current learning rate: 0.00813 
2023-11-22 20:22:04.881108: train_loss -0.3405 
2023-11-22 20:22:04.881283: val_loss -0.3375 
2023-11-22 20:22:04.881386: Pseudo dice [0.6876, nan] 
2023-11-22 20:22:04.881469: Epoch time: 59.62 s 
2023-11-22 20:22:05.944633:  
2023-11-22 20:22:05.944807: Epoch 207 
2023-11-22 20:22:05.944929: Current learning rate: 0.00812 
2023-11-22 20:23:05.616689: train_loss -0.3473 
2023-11-22 20:23:05.616883: val_loss -0.3661 
2023-11-22 20:23:05.616957: Pseudo dice [0.7473, nan] 
2023-11-22 20:23:05.617026: Epoch time: 59.67 s 
2023-11-22 20:23:06.661968:  
2023-11-22 20:23:06.662099: Epoch 208 
2023-11-22 20:23:06.662203: Current learning rate: 0.00811 
2023-11-22 20:24:06.412354: train_loss -0.3555 
2023-11-22 20:24:06.412534: val_loss -0.3773 
2023-11-22 20:24:06.412661: Pseudo dice [0.7655, nan] 
2023-11-22 20:24:06.412745: Epoch time: 59.75 s 
2023-11-22 20:24:07.974767:  
2023-11-22 20:24:07.974910: Epoch 209 
2023-11-22 20:24:07.975015: Current learning rate: 0.0081 
2023-11-22 20:25:16.289692: train_loss -0.3312 
2023-11-22 20:25:16.289878: val_loss -0.3663 
2023-11-22 20:25:16.289958: Pseudo dice [0.7455, nan] 
2023-11-22 20:25:16.290027: Epoch time: 68.32 s 
2023-11-22 20:27:34.706472:  
2023-11-22 20:27:34.706603: Epoch 210 
2023-11-22 20:27:34.706729: Current learning rate: 0.00809 
2023-11-22 20:29:42.358426: train_loss -0.3464 
2023-11-22 20:29:42.358606: val_loss -0.3482 
2023-11-22 20:29:42.358688: Pseudo dice [0.7097, nan] 
2023-11-22 20:29:42.358761: Epoch time: 127.65 s 
2023-11-22 20:29:46.436709:  
2023-11-22 20:29:46.436853: Epoch 211 
2023-11-22 20:29:46.436969: Current learning rate: 0.00808 
2023-11-22 20:30:54.680614: train_loss -0.3415 
2023-11-22 20:30:54.680797: val_loss -0.3561 
2023-11-22 20:30:54.680870: Pseudo dice [0.7303, nan] 
2023-11-22 20:30:54.680971: Epoch time: 68.24 s 
2023-11-22 20:30:55.728826:  
2023-11-22 20:30:55.729023: Epoch 212 
2023-11-22 20:30:55.729131: Current learning rate: 0.00807 
2023-11-22 20:32:14.470862: train_loss -0.3405 
2023-11-22 20:32:14.471081: val_loss -0.3501 
2023-11-22 20:32:14.471156: Pseudo dice [0.7081, nan] 
2023-11-22 20:32:14.471229: Epoch time: 78.74 s 
2023-11-22 20:32:23.951743:  
2023-11-22 20:32:23.951875: Epoch 213 
2023-11-22 20:32:23.951977: Current learning rate: 0.00806 
2023-11-22 20:33:39.693543: train_loss -0.3397 
2023-11-22 20:33:39.693748: val_loss -0.354 
2023-11-22 20:33:39.693825: Pseudo dice [0.724, nan] 
2023-11-22 20:33:39.693894: Epoch time: 75.74 s 
2023-11-22 20:33:40.760697:  
2023-11-22 20:33:40.760811: Epoch 214 
2023-11-22 20:33:40.760906: Current learning rate: 0.00805 
2023-11-22 20:34:40.125041: train_loss -0.3427 
2023-11-22 20:34:40.125232: val_loss -0.3642 
2023-11-22 20:34:40.125315: Pseudo dice [0.7336, nan] 
2023-11-22 20:34:40.125425: Epoch time: 59.37 s 
2023-11-22 20:34:41.172584:  
2023-11-22 20:34:41.172708: Epoch 215 
2023-11-22 20:34:41.172807: Current learning rate: 0.00804 
2023-11-22 20:35:40.644239: train_loss -0.3435 
2023-11-22 20:35:40.644431: val_loss -0.354 
2023-11-22 20:35:40.644508: Pseudo dice [0.7165, nan] 
2023-11-22 20:35:40.644594: Epoch time: 59.47 s 
2023-11-22 20:35:41.687052:  
2023-11-22 20:35:41.687240: Epoch 216 
2023-11-22 20:35:41.687350: Current learning rate: 0.00803 
2023-11-22 20:36:41.267720: train_loss -0.3424 
2023-11-22 20:36:41.267906: val_loss -0.3326 
2023-11-22 20:36:41.267981: Pseudo dice [0.6752, nan] 
2023-11-22 20:36:41.268051: Epoch time: 59.58 s 
2023-11-22 20:36:42.316004:  
2023-11-22 20:36:42.316120: Epoch 217 
2023-11-22 20:36:42.316214: Current learning rate: 0.00802 
2023-11-22 20:37:41.731689: train_loss -0.3406 
2023-11-22 20:37:41.731891: val_loss -0.3572 
2023-11-22 20:37:41.731966: Pseudo dice [0.7248, nan] 
2023-11-22 20:37:41.732034: Epoch time: 59.42 s 
2023-11-22 20:37:42.787532:  
2023-11-22 20:37:42.787652: Epoch 218 
2023-11-22 20:37:42.787740: Current learning rate: 0.00801 
2023-11-22 20:38:42.402277: train_loss -0.3384 
2023-11-22 20:38:42.402483: val_loss -0.3548 
2023-11-22 20:38:42.402563: Pseudo dice [0.7274, nan] 
2023-11-22 20:38:42.402633: Epoch time: 59.62 s 
2023-11-22 20:38:43.449979:  
2023-11-22 20:38:43.450094: Epoch 219 
2023-11-22 20:38:43.450192: Current learning rate: 0.00801 
2023-11-22 20:39:43.018060: train_loss -0.336 
2023-11-22 20:39:43.018281: val_loss -0.3801 
2023-11-22 20:39:43.018360: Pseudo dice [0.7701, nan] 
2023-11-22 20:39:43.018428: Epoch time: 59.57 s 
2023-11-22 20:39:44.064639:  
2023-11-22 20:39:44.064757: Epoch 220 
2023-11-22 20:39:44.064857: Current learning rate: 0.008 
2023-11-22 20:40:43.688598: train_loss -0.347 
2023-11-22 20:40:43.688782: val_loss -0.3701 
2023-11-22 20:40:43.688857: Pseudo dice [0.7569, nan] 
2023-11-22 20:40:43.688925: Epoch time: 59.62 s 
2023-11-22 20:40:43.688981: Yayy! New best EMA pseudo Dice: 0.7273 
2023-11-22 20:40:44.850842:  
2023-11-22 20:40:44.850960: Epoch 221 
2023-11-22 20:40:44.851056: Current learning rate: 0.00799 
2023-11-22 20:41:44.476806: train_loss -0.3268 
2023-11-22 20:41:44.477000: val_loss -0.3431 
2023-11-22 20:41:44.477078: Pseudo dice [0.7004, nan] 
2023-11-22 20:41:44.477149: Epoch time: 59.63 s 
2023-11-22 20:41:45.537444:  
2023-11-22 20:41:45.537580: Epoch 222 
2023-11-22 20:41:45.537680: Current learning rate: 0.00798 
2023-11-22 20:42:45.281052: train_loss -0.3444 
2023-11-22 20:42:45.281447: val_loss -0.356 
2023-11-22 20:42:45.281526: Pseudo dice [0.7258, nan] 
2023-11-22 20:42:45.281593: Epoch time: 59.74 s 
2023-11-22 20:42:46.325213:  
2023-11-22 20:42:46.325401: Epoch 223 
2023-11-22 20:42:46.325520: Current learning rate: 0.00797 
2023-11-22 20:43:45.834707: train_loss -0.3503 
2023-11-22 20:43:45.834882: val_loss -0.3547 
2023-11-22 20:43:45.834955: Pseudo dice [0.7287, nan] 
2023-11-22 20:43:45.835028: Epoch time: 59.51 s 
2023-11-22 20:43:46.996917:  
2023-11-22 20:43:46.997074: Epoch 224 
2023-11-22 20:43:46.997181: Current learning rate: 0.00796 
2023-11-22 20:44:46.624159: train_loss -0.3317 
2023-11-22 20:44:46.624342: val_loss -0.3559 
2023-11-22 20:44:46.624419: Pseudo dice [0.7236, nan] 
2023-11-22 20:44:46.624485: Epoch time: 59.63 s 
2023-11-22 20:44:47.685247:  
2023-11-22 20:44:47.685369: Epoch 225 
2023-11-22 20:44:47.685472: Current learning rate: 0.00795 
2023-11-22 20:45:47.380553: train_loss -0.3403 
2023-11-22 20:45:47.380746: val_loss -0.3528 
2023-11-22 20:45:47.380855: Pseudo dice [0.7208, nan] 
2023-11-22 20:45:47.380927: Epoch time: 59.7 s 
2023-11-22 20:45:48.412879:  
2023-11-22 20:45:48.412994: Epoch 226 
2023-11-22 20:45:48.413088: Current learning rate: 0.00794 
2023-11-22 20:46:48.230210: train_loss -0.3417 
2023-11-22 20:46:48.230420: val_loss -0.3539 
2023-11-22 20:46:48.230493: Pseudo dice [0.7084, nan] 
2023-11-22 20:46:48.230563: Epoch time: 59.82 s 
2023-11-22 20:46:49.263510:  
2023-11-22 20:46:49.263688: Epoch 227 
2023-11-22 20:46:49.263819: Current learning rate: 0.00793 
2023-11-22 20:47:48.838080: train_loss -0.3551 
2023-11-22 20:47:48.838266: val_loss -0.3517 
2023-11-22 20:47:48.838346: Pseudo dice [0.7196, nan] 
2023-11-22 20:47:48.838412: Epoch time: 59.58 s 
2023-11-22 20:47:49.975977:  
2023-11-22 20:47:49.976101: Epoch 228 
2023-11-22 20:47:49.976202: Current learning rate: 0.00792 
2023-11-22 20:48:49.765267: train_loss -0.3499 
2023-11-22 20:48:49.765439: val_loss -0.3579 
2023-11-22 20:48:49.765513: Pseudo dice [0.7184, nan] 
2023-11-22 20:48:49.765619: Epoch time: 59.79 s 
2023-11-22 20:48:50.805106:  
2023-11-22 20:48:50.805230: Epoch 229 
2023-11-22 20:48:50.805331: Current learning rate: 0.00791 
2023-11-22 20:49:50.583561: train_loss -0.3392 
2023-11-22 20:49:50.583720: val_loss -0.3705 
2023-11-22 20:49:50.583792: Pseudo dice [0.7541, nan] 
2023-11-22 20:49:50.583865: Epoch time: 59.78 s 
2023-11-22 20:49:51.625011:  
2023-11-22 20:49:51.625138: Epoch 230 
2023-11-22 20:49:51.625241: Current learning rate: 0.0079 
2023-11-22 20:50:51.448452: train_loss -0.3499 
2023-11-22 20:50:51.448668: val_loss -0.3685 
2023-11-22 20:50:51.448757: Pseudo dice [0.7589, nan] 
2023-11-22 20:50:51.448837: Epoch time: 59.82 s 
2023-11-22 20:50:51.448899: Yayy! New best EMA pseudo Dice: 0.7287 
2023-11-22 20:50:52.598636:  
2023-11-22 20:50:52.598758: Epoch 231 
2023-11-22 20:50:52.598854: Current learning rate: 0.00789 
2023-11-22 20:51:52.267513: train_loss -0.3458 
2023-11-22 20:51:52.267704: val_loss -0.356 
2023-11-22 20:51:52.267823: Pseudo dice [0.7269, nan] 
2023-11-22 20:51:52.267893: Epoch time: 59.67 s 
2023-11-22 20:51:53.304315:  
2023-11-22 20:51:53.304574: Epoch 232 
2023-11-22 20:51:53.304753: Current learning rate: 0.00789 
2023-11-22 20:52:52.923775: train_loss -0.3304 
2023-11-22 20:52:52.923980: val_loss -0.3624 
2023-11-22 20:52:52.924059: Pseudo dice [0.7381, nan] 
2023-11-22 20:52:52.924129: Epoch time: 59.62 s 
2023-11-22 20:52:52.924185: Yayy! New best EMA pseudo Dice: 0.7295 
2023-11-22 20:52:54.075933:  
2023-11-22 20:52:54.076085: Epoch 233 
2023-11-22 20:52:54.076195: Current learning rate: 0.00788 
2023-11-22 20:53:53.744701: train_loss -0.3413 
2023-11-22 20:53:53.744923: val_loss -0.3366 
2023-11-22 20:53:53.745014: Pseudo dice [0.6897, nan] 
2023-11-22 20:53:53.745095: Epoch time: 59.67 s 
2023-11-22 20:53:54.775151:  
2023-11-22 20:53:54.775259: Epoch 234 
2023-11-22 20:53:54.775357: Current learning rate: 0.00787 
2023-11-22 20:54:54.580989: train_loss -0.3357 
2023-11-22 20:54:54.581178: val_loss -0.3814 
2023-11-22 20:54:54.581294: Pseudo dice [0.7728, nan] 
2023-11-22 20:54:54.581365: Epoch time: 59.81 s 
2023-11-22 20:54:54.581422: Yayy! New best EMA pseudo Dice: 0.7302 
2023-11-22 20:54:55.736069:  
2023-11-22 20:54:55.736189: Epoch 235 
2023-11-22 20:54:55.736366: Current learning rate: 0.00786 
2023-11-22 20:55:55.499062: train_loss -0.3515 
2023-11-22 20:55:55.499243: val_loss -0.3332 
2023-11-22 20:55:55.499319: Pseudo dice [0.6983, nan] 
2023-11-22 20:55:55.499386: Epoch time: 59.76 s 
2023-11-22 20:55:56.538483:  
2023-11-22 20:55:56.538619: Epoch 236 
2023-11-22 20:55:56.538722: Current learning rate: 0.00785 
2023-11-22 20:56:56.161828: train_loss -0.3536 
2023-11-22 20:56:56.162007: val_loss -0.3646 
2023-11-22 20:56:56.162080: Pseudo dice [0.7305, nan] 
2023-11-22 20:56:56.162147: Epoch time: 59.62 s 
2023-11-22 20:56:57.300554:  
2023-11-22 20:56:57.300691: Epoch 237 
2023-11-22 20:56:57.300792: Current learning rate: 0.00784 
2023-11-22 20:57:56.845222: train_loss -0.3419 
2023-11-22 20:57:56.845454: val_loss -0.3322 
2023-11-22 20:57:56.845534: Pseudo dice [0.6725, nan] 
2023-11-22 20:57:56.845605: Epoch time: 59.55 s 
2023-11-22 20:57:57.889680:  
2023-11-22 20:57:57.889804: Epoch 238 
2023-11-22 20:57:57.889909: Current learning rate: 0.00783 
2023-11-22 20:58:57.460319: train_loss -0.3356 
2023-11-22 20:58:57.460557: val_loss -0.3185 
2023-11-22 20:58:57.460654: Pseudo dice [0.6591, nan] 
2023-11-22 20:58:57.460722: Epoch time: 59.57 s 
2023-11-22 20:58:58.502166:  
2023-11-22 20:58:58.502312: Epoch 239 
2023-11-22 20:58:58.502405: Current learning rate: 0.00782 
2023-11-22 20:59:57.991586: train_loss -0.3513 
2023-11-22 20:59:57.991794: val_loss -0.3519 
2023-11-22 20:59:57.991870: Pseudo dice [0.71, nan] 
2023-11-22 20:59:57.991988: Epoch time: 59.49 s 
2023-11-22 20:59:59.041696:  
2023-11-22 20:59:59.041816: Epoch 240 
2023-11-22 20:59:59.041921: Current learning rate: 0.00781 
2023-11-22 21:00:58.564015: train_loss -0.3367 
2023-11-22 21:00:58.564225: val_loss -0.3473 
2023-11-22 21:00:58.564369: Pseudo dice [0.709, nan] 
2023-11-22 21:00:58.564448: Epoch time: 59.52 s 
2023-11-22 21:00:59.614822:  
2023-11-22 21:00:59.614954: Epoch 241 
2023-11-22 21:00:59.615062: Current learning rate: 0.0078 
2023-11-22 21:01:59.256872: train_loss -0.3326 
2023-11-22 21:01:59.257053: val_loss -0.3712 
2023-11-22 21:01:59.257198: Pseudo dice [0.7548, nan] 
2023-11-22 21:01:59.257288: Epoch time: 59.64 s 
2023-11-22 21:02:00.315584:  
2023-11-22 21:02:00.315708: Epoch 242 
2023-11-22 21:02:00.315802: Current learning rate: 0.00779 
2023-11-22 21:02:59.846721: train_loss -0.3372 
2023-11-22 21:02:59.846894: val_loss -0.3526 
2023-11-22 21:02:59.846970: Pseudo dice [0.7111, nan] 
2023-11-22 21:02:59.847038: Epoch time: 59.53 s 
2023-11-22 21:03:00.902313:  
2023-11-22 21:03:00.902434: Epoch 243 
2023-11-22 21:03:00.902528: Current learning rate: 0.00778 
2023-11-22 21:04:00.412735: train_loss -0.3342 
2023-11-22 21:04:00.412920: val_loss -0.3722 
2023-11-22 21:04:00.412995: Pseudo dice [0.7598, nan] 
2023-11-22 21:04:00.413098: Epoch time: 59.51 s 
2023-11-22 21:04:01.471137:  
2023-11-22 21:04:01.471272: Epoch 244 
2023-11-22 21:04:01.471380: Current learning rate: 0.00777 
2023-11-22 21:05:00.930109: train_loss -0.3429 
2023-11-22 21:05:00.930289: val_loss -0.3609 
2023-11-22 21:05:00.930360: Pseudo dice [0.7342, nan] 
2023-11-22 21:05:00.930425: Epoch time: 59.46 s 
2023-11-22 21:05:02.081086:  
2023-11-22 21:05:02.081203: Epoch 245 
2023-11-22 21:05:02.081306: Current learning rate: 0.00777 
2023-11-22 21:06:01.601820: train_loss -0.3374 
2023-11-22 21:06:01.602004: val_loss -0.3574 
2023-11-22 21:06:01.602079: Pseudo dice [0.7276, nan] 
2023-11-22 21:06:01.602149: Epoch time: 59.52 s 
2023-11-22 21:06:02.659133:  
2023-11-22 21:06:02.659256: Epoch 246 
2023-11-22 21:06:02.659361: Current learning rate: 0.00776 
2023-11-22 21:07:02.238187: train_loss -0.3447 
2023-11-22 21:07:02.238393: val_loss -0.3339 
2023-11-22 21:07:02.238469: Pseudo dice [0.6822, nan] 
2023-11-22 21:07:02.238542: Epoch time: 59.58 s 
2023-11-22 21:07:03.285487:  
2023-11-22 21:07:03.285608: Epoch 247 
2023-11-22 21:07:03.285705: Current learning rate: 0.00775 
2023-11-22 21:08:02.967878: train_loss -0.3403 
2023-11-22 21:08:02.968060: val_loss -0.3328 
2023-11-22 21:08:02.968150: Pseudo dice [0.6741, nan] 
2023-11-22 21:08:02.968226: Epoch time: 59.68 s 
2023-11-22 21:08:04.015903:  
2023-11-22 21:08:04.016019: Epoch 248 
2023-11-22 21:08:04.016120: Current learning rate: 0.00774 
2023-11-22 21:09:03.630709: train_loss -0.3387 
2023-11-22 21:09:03.630931: val_loss -0.3565 
2023-11-22 21:09:03.631013: Pseudo dice [0.7245, nan] 
2023-11-22 21:09:03.631084: Epoch time: 59.62 s 
2023-11-22 21:09:04.790224:  
2023-11-22 21:09:04.790346: Epoch 249 
2023-11-22 21:09:04.790445: Current learning rate: 0.00773 
2023-11-22 21:10:04.456978: train_loss -0.3487 
2023-11-22 21:10:04.457161: val_loss -0.3471 
2023-11-22 21:10:04.457239: Pseudo dice [0.7097, nan] 
2023-11-22 21:10:04.457312: Epoch time: 59.67 s 
2023-11-22 21:10:05.637756:  
2023-11-22 21:10:05.637909: Epoch 250 
2023-11-22 21:10:05.638018: Current learning rate: 0.00772 
2023-11-22 21:11:05.193597: train_loss -0.3401 
2023-11-22 21:11:05.193770: val_loss -0.3475 
2023-11-22 21:11:05.193845: Pseudo dice [0.711, nan] 
2023-11-22 21:11:05.193912: Epoch time: 59.56 s 
2023-11-22 21:11:06.241967:  
2023-11-22 21:11:06.242131: Epoch 251 
2023-11-22 21:11:06.242235: Current learning rate: 0.00771 
2023-11-22 21:12:06.000922: train_loss -0.3361 
2023-11-22 21:12:06.001124: val_loss -0.3497 
2023-11-22 21:12:06.001237: Pseudo dice [0.7195, nan] 
2023-11-22 21:12:06.001307: Epoch time: 59.76 s 
2023-11-22 21:12:07.045954:  
2023-11-22 21:12:07.046084: Epoch 252 
2023-11-22 21:12:07.046187: Current learning rate: 0.0077 
2023-11-22 21:13:06.675921: train_loss -0.3438 
2023-11-22 21:13:06.676123: val_loss -0.3313 
2023-11-22 21:13:06.676198: Pseudo dice [0.6743, nan] 
2023-11-22 21:13:06.676265: Epoch time: 59.63 s 
2023-11-22 21:13:07.839970:  
2023-11-22 21:13:07.840093: Epoch 253 
2023-11-22 21:13:07.840232: Current learning rate: 0.00769 
2023-11-22 21:14:07.437508: train_loss -0.3627 
2023-11-22 21:14:07.437683: val_loss -0.3616 
2023-11-22 21:14:07.437758: Pseudo dice [0.7339, nan] 
2023-11-22 21:14:07.437824: Epoch time: 59.6 s 
2023-11-22 21:14:08.491313:  
2023-11-22 21:14:08.491512: Epoch 254 
2023-11-22 21:14:08.491643: Current learning rate: 0.00768 
2023-11-22 21:15:08.206195: train_loss -0.3429 
2023-11-22 21:15:08.206371: val_loss -0.353 
2023-11-22 21:15:08.206445: Pseudo dice [0.7227, nan] 
2023-11-22 21:15:08.206549: Epoch time: 59.72 s 
2023-11-22 21:15:09.259037:  
2023-11-22 21:15:09.259152: Epoch 255 
2023-11-22 21:15:09.259249: Current learning rate: 0.00767 
2023-11-22 21:16:08.964761: train_loss -0.3443 
2023-11-22 21:16:08.964924: val_loss -0.332 
2023-11-22 21:16:08.964997: Pseudo dice [0.6696, nan] 
2023-11-22 21:16:08.965070: Epoch time: 59.71 s 
2023-11-22 21:16:10.010662:  
2023-11-22 21:16:10.010778: Epoch 256 
2023-11-22 21:16:10.010879: Current learning rate: 0.00766 
2023-11-22 21:17:09.665366: train_loss -0.3441 
2023-11-22 21:17:09.665554: val_loss -0.3576 
2023-11-22 21:17:09.665633: Pseudo dice [0.73, nan] 
2023-11-22 21:17:09.665709: Epoch time: 59.66 s 
2023-11-22 21:17:10.819696:  
2023-11-22 21:17:10.819823: Epoch 257 
2023-11-22 21:17:10.819924: Current learning rate: 0.00765 
2023-11-22 21:18:10.467283: train_loss -0.3391 
2023-11-22 21:18:10.467491: val_loss -0.36 
2023-11-22 21:18:10.467567: Pseudo dice [0.7358, nan] 
2023-11-22 21:18:10.467634: Epoch time: 59.65 s 
2023-11-22 21:18:11.529757:  
2023-11-22 21:18:11.529952: Epoch 258 
2023-11-22 21:18:11.530064: Current learning rate: 0.00764 
2023-11-22 21:19:11.177302: train_loss -0.3518 
2023-11-22 21:19:11.177487: val_loss -0.3686 
2023-11-22 21:19:11.177561: Pseudo dice [0.7439, nan] 
2023-11-22 21:19:11.177632: Epoch time: 59.65 s 
2023-11-22 21:19:12.239277:  
2023-11-22 21:19:12.239396: Epoch 259 
2023-11-22 21:19:12.239506: Current learning rate: 0.00764 
2023-11-22 21:20:11.887703: train_loss -0.3519 
2023-11-22 21:20:11.887963: val_loss -0.372 
2023-11-22 21:20:11.888044: Pseudo dice [0.7559, nan] 
2023-11-22 21:20:11.888112: Epoch time: 59.65 s 
2023-11-22 21:20:12.942992:  
2023-11-22 21:20:12.943116: Epoch 260 
2023-11-22 21:20:12.943204: Current learning rate: 0.00763 
2023-11-22 21:21:12.621292: train_loss -0.3465 
2023-11-22 21:21:12.621476: val_loss -0.3519 
2023-11-22 21:21:12.621549: Pseudo dice [0.7176, nan] 
2023-11-22 21:21:12.621616: Epoch time: 59.68 s 
2023-11-22 21:21:13.686780:  
2023-11-22 21:21:13.686927: Epoch 261 
2023-11-22 21:21:13.687034: Current learning rate: 0.00762 
2023-11-22 21:22:13.426412: train_loss -0.3497 
2023-11-22 21:22:13.426592: val_loss -0.3455 
2023-11-22 21:22:13.426671: Pseudo dice [0.708, nan] 
2023-11-22 21:22:13.426751: Epoch time: 59.74 s 
2023-11-22 21:22:14.583828:  
2023-11-22 21:22:14.583939: Epoch 262 
2023-11-22 21:22:14.584031: Current learning rate: 0.00761 
2023-11-22 21:23:14.217919: train_loss -0.341 
2023-11-22 21:23:14.218161: val_loss -0.3601 
2023-11-22 21:23:14.218292: Pseudo dice [0.7294, nan] 
2023-11-22 21:23:14.218366: Epoch time: 59.63 s 
2023-11-22 21:23:15.266278:  
2023-11-22 21:23:15.266391: Epoch 263 
2023-11-22 21:23:15.266488: Current learning rate: 0.0076 
2023-11-22 21:24:14.909364: train_loss -0.3401 
2023-11-22 21:24:14.909574: val_loss -0.3502 
2023-11-22 21:24:14.909653: Pseudo dice [0.7294, nan] 
2023-11-22 21:24:14.909723: Epoch time: 59.64 s 
2023-11-22 21:24:15.966674:  
2023-11-22 21:24:15.966795: Epoch 264 
2023-11-22 21:24:15.966934: Current learning rate: 0.00759 
2023-11-22 21:25:15.630218: train_loss -0.3332 
2023-11-22 21:25:15.630385: val_loss -0.3608 
2023-11-22 21:25:15.630458: Pseudo dice [0.7349, nan] 
2023-11-22 21:25:15.630524: Epoch time: 59.66 s 
2023-11-22 21:25:16.683843:  
2023-11-22 21:25:16.683970: Epoch 265 
2023-11-22 21:25:16.684076: Current learning rate: 0.00758 
2023-11-22 21:26:16.304960: train_loss -0.3326 
2023-11-22 21:26:16.305151: val_loss -0.3583 
2023-11-22 21:26:16.305230: Pseudo dice [0.7286, nan] 
2023-11-22 21:26:16.305336: Epoch time: 59.62 s 
2023-11-22 21:26:17.362822:  
2023-11-22 21:26:17.362998: Epoch 266 
2023-11-22 21:26:17.363093: Current learning rate: 0.00757 
2023-11-22 21:27:16.979445: train_loss -0.3456 
2023-11-22 21:27:16.979674: val_loss -0.3324 
2023-11-22 21:27:16.979754: Pseudo dice [0.6827, nan] 
2023-11-22 21:27:16.979823: Epoch time: 59.62 s 
2023-11-22 21:27:18.040372:  
2023-11-22 21:27:18.040577: Epoch 267 
2023-11-22 21:27:18.040730: Current learning rate: 0.00756 
2023-11-22 21:28:17.753254: train_loss -0.3493 
2023-11-22 21:28:17.753463: val_loss -0.3208 
2023-11-22 21:28:17.753543: Pseudo dice [0.6615, nan] 
2023-11-22 21:28:17.753614: Epoch time: 59.71 s 
2023-11-22 21:28:18.818659:  
2023-11-22 21:28:18.818785: Epoch 268 
2023-11-22 21:28:18.818885: Current learning rate: 0.00755 
2023-11-22 21:29:18.436411: train_loss -0.351 
2023-11-22 21:29:18.436709: val_loss -0.3599 
2023-11-22 21:29:18.436866: Pseudo dice [0.7364, nan] 
2023-11-22 21:29:18.437033: Epoch time: 59.62 s 
2023-11-22 21:29:19.491794:  
2023-11-22 21:29:19.491908: Epoch 269 
2023-11-22 21:29:19.492007: Current learning rate: 0.00754 
2023-11-22 21:30:19.167316: train_loss -0.3378 
2023-11-22 21:30:19.167526: val_loss -0.3649 
2023-11-22 21:30:19.167606: Pseudo dice [0.7363, nan] 
2023-11-22 21:30:19.167675: Epoch time: 59.68 s 
2023-11-22 21:30:20.225109:  
2023-11-22 21:30:20.225221: Epoch 270 
2023-11-22 21:30:20.225315: Current learning rate: 0.00753 
2023-11-22 21:31:19.987686: train_loss -0.349 
2023-11-22 21:31:19.987889: val_loss -0.347 
2023-11-22 21:31:19.987967: Pseudo dice [0.7034, nan] 
2023-11-22 21:31:19.988034: Epoch time: 59.76 s 
2023-11-22 21:31:21.046993:  
2023-11-22 21:31:21.047109: Epoch 271 
2023-11-22 21:31:21.047204: Current learning rate: 0.00752 
2023-11-22 21:32:20.733730: train_loss -0.3571 
2023-11-22 21:32:20.733934: val_loss -0.3501 
2023-11-22 21:32:20.734010: Pseudo dice [0.7053, nan] 
2023-11-22 21:32:20.734078: Epoch time: 59.69 s 
2023-11-22 21:32:21.797945:  
2023-11-22 21:32:21.798133: Epoch 272 
2023-11-22 21:32:21.798238: Current learning rate: 0.00751 
2023-11-22 21:33:21.627769: train_loss -0.3532 
2023-11-22 21:33:21.627960: val_loss -0.3573 
2023-11-22 21:33:21.628034: Pseudo dice [0.7319, nan] 
2023-11-22 21:33:21.628102: Epoch time: 59.83 s 
2023-11-22 21:33:22.684778:  
2023-11-22 21:33:22.684981: Epoch 273 
2023-11-22 21:33:22.685089: Current learning rate: 0.00751 
2023-11-22 21:34:22.351598: train_loss -0.345 
2023-11-22 21:34:22.351805: val_loss -0.3462 
2023-11-22 21:34:22.352093: Pseudo dice [0.7074, nan] 
2023-11-22 21:34:22.352179: Epoch time: 59.67 s 
2023-11-22 21:34:23.396993:  
2023-11-22 21:34:23.397183: Epoch 274 
2023-11-22 21:34:23.397285: Current learning rate: 0.0075 
2023-11-22 21:35:23.184211: train_loss -0.346 
2023-11-22 21:35:23.184421: val_loss -0.3165 
2023-11-22 21:35:23.184498: Pseudo dice [0.6547, nan] 
2023-11-22 21:35:23.184576: Epoch time: 59.79 s 
2023-11-22 21:35:24.235636:  
2023-11-22 21:35:24.235756: Epoch 275 
2023-11-22 21:35:24.235893: Current learning rate: 0.00749 
2023-11-22 21:36:23.932470: train_loss -0.3379 
2023-11-22 21:36:23.932648: val_loss -0.3675 
2023-11-22 21:36:23.932722: Pseudo dice [0.7426, nan] 
2023-11-22 21:36:23.932789: Epoch time: 59.7 s 
2023-11-22 21:36:24.981760:  
2023-11-22 21:36:24.981871: Epoch 276 
2023-11-22 21:36:24.981964: Current learning rate: 0.00748 
2023-11-22 21:37:24.554795: train_loss -0.3381 
2023-11-22 21:37:24.555006: val_loss -0.3467 
2023-11-22 21:37:24.555085: Pseudo dice [0.7055, nan] 
2023-11-22 21:37:24.555196: Epoch time: 59.57 s 
2023-11-22 21:37:25.709298:  
2023-11-22 21:37:25.709423: Epoch 277 
2023-11-22 21:37:25.709515: Current learning rate: 0.00747 
2023-11-22 21:38:25.336714: train_loss -0.3481 
2023-11-22 21:38:25.336907: val_loss -0.3637 
2023-11-22 21:38:25.336989: Pseudo dice [0.7367, nan] 
2023-11-22 21:38:25.337068: Epoch time: 59.63 s 
2023-11-22 21:38:26.405532:  
2023-11-22 21:38:26.405650: Epoch 278 
2023-11-22 21:38:26.405751: Current learning rate: 0.00746 
2023-11-22 21:39:26.064805: train_loss -0.344 
2023-11-22 21:39:26.064988: val_loss -0.3675 
2023-11-22 21:39:26.065065: Pseudo dice [0.7334, nan] 
2023-11-22 21:39:26.065131: Epoch time: 59.66 s 
2023-11-22 21:39:27.131529:  
2023-11-22 21:39:27.131657: Epoch 279 
2023-11-22 21:39:27.131771: Current learning rate: 0.00745 
2023-11-22 21:40:26.763735: train_loss -0.3413 
2023-11-22 21:40:26.763918: val_loss -0.3527 
2023-11-22 21:40:26.763992: Pseudo dice [0.714, nan] 
2023-11-22 21:40:26.764060: Epoch time: 59.63 s 
2023-11-22 21:40:27.813768:  
2023-11-22 21:40:27.813899: Epoch 280 
2023-11-22 21:40:27.814002: Current learning rate: 0.00744 
2023-11-22 21:41:27.438160: train_loss -0.3464 
2023-11-22 21:41:27.438359: val_loss -0.3503 
2023-11-22 21:41:27.438444: Pseudo dice [0.7155, nan] 
2023-11-22 21:41:27.438517: Epoch time: 59.63 s 
2023-11-22 21:41:28.589700:  
2023-11-22 21:41:28.589827: Epoch 281 
2023-11-22 21:41:28.589926: Current learning rate: 0.00743 
2023-11-22 21:42:28.383167: train_loss -0.3438 
2023-11-22 21:42:28.383415: val_loss -0.3595 
2023-11-22 21:42:28.383502: Pseudo dice [0.7295, nan] 
2023-11-22 21:42:28.383578: Epoch time: 59.79 s 
2023-11-22 21:42:29.434390:  
2023-11-22 21:42:29.434511: Epoch 282 
2023-11-22 21:42:29.434612: Current learning rate: 0.00742 
2023-11-22 21:43:29.124787: train_loss -0.3502 
2023-11-22 21:43:29.124978: val_loss -0.3533 
2023-11-22 21:43:29.125060: Pseudo dice [0.7221, nan] 
2023-11-22 21:43:29.125131: Epoch time: 59.69 s 
2023-11-22 21:43:30.191184:  
2023-11-22 21:43:30.191365: Epoch 283 
2023-11-22 21:43:30.191531: Current learning rate: 0.00741 
2023-11-22 21:44:29.801152: train_loss -0.3516 
2023-11-22 21:44:29.801366: val_loss -0.3567 
2023-11-22 21:44:29.801450: Pseudo dice [0.7255, nan] 
2023-11-22 21:44:29.801521: Epoch time: 59.61 s 
2023-11-22 21:44:30.853958:  
2023-11-22 21:44:30.854078: Epoch 284 
2023-11-22 21:44:30.854175: Current learning rate: 0.0074 
2023-11-22 21:45:30.543644: train_loss -0.3444 
2023-11-22 21:45:30.543883: val_loss -0.3681 
2023-11-22 21:45:30.544029: Pseudo dice [0.748, nan] 
2023-11-22 21:45:30.544141: Epoch time: 59.69 s 
2023-11-22 21:45:31.613801:  
2023-11-22 21:45:31.613924: Epoch 285 
2023-11-22 21:45:31.614025: Current learning rate: 0.00739 
2023-11-22 21:46:31.350646: train_loss -0.336 
2023-11-22 21:46:31.350823: val_loss -0.3853 
2023-11-22 21:46:31.350906: Pseudo dice [0.7798, nan] 
2023-11-22 21:46:31.350977: Epoch time: 59.74 s 
2023-11-22 21:46:32.406106:  
2023-11-22 21:46:32.406228: Epoch 286 
2023-11-22 21:46:32.406384: Current learning rate: 0.00738 
2023-11-22 21:47:32.158530: train_loss -0.3381 
2023-11-22 21:47:32.158712: val_loss -0.3544 
2023-11-22 21:47:32.158791: Pseudo dice [0.7254, nan] 
2023-11-22 21:47:32.158862: Epoch time: 59.75 s 
2023-11-22 21:47:33.229243:  
2023-11-22 21:47:33.229378: Epoch 287 
2023-11-22 21:47:33.229508: Current learning rate: 0.00738 
2023-11-22 21:48:32.818799: train_loss -0.337 
2023-11-22 21:48:32.818992: val_loss -0.3415 
2023-11-22 21:48:32.819103: Pseudo dice [0.6938, nan] 
2023-11-22 21:48:32.819240: Epoch time: 59.59 s 
2023-11-22 21:48:33.891408:  
2023-11-22 21:48:33.891531: Epoch 288 
2023-11-22 21:48:33.891632: Current learning rate: 0.00737 
2023-11-22 21:49:33.617202: train_loss -0.3527 
2023-11-22 21:49:33.617384: val_loss -0.356 
2023-11-22 21:49:33.617463: Pseudo dice [0.7225, nan] 
2023-11-22 21:49:33.617532: Epoch time: 59.73 s 
2023-11-22 21:49:34.686758:  
2023-11-22 21:49:34.686876: Epoch 289 
2023-11-22 21:49:34.686980: Current learning rate: 0.00736 
2023-11-22 21:50:34.460189: train_loss -0.3342 
2023-11-22 21:50:34.460376: val_loss -0.326 
2023-11-22 21:50:34.460458: Pseudo dice [0.6688, nan] 
2023-11-22 21:50:34.460534: Epoch time: 59.77 s 
2023-11-22 21:50:35.534953:  
2023-11-22 21:50:35.535076: Epoch 290 
2023-11-22 21:50:35.535174: Current learning rate: 0.00735 
2023-11-22 21:51:35.239477: train_loss -0.3449 
2023-11-22 21:51:35.239673: val_loss -0.3633 
2023-11-22 21:51:35.239753: Pseudo dice [0.7421, nan] 
2023-11-22 21:51:35.239876: Epoch time: 59.71 s 
2023-11-22 21:51:36.313472:  
2023-11-22 21:51:36.313590: Epoch 291 
2023-11-22 21:51:36.313690: Current learning rate: 0.00734 
2023-11-22 21:52:36.181327: train_loss -0.3448 
2023-11-22 21:52:36.181513: val_loss -0.3352 
2023-11-22 21:52:36.181590: Pseudo dice [0.6896, nan] 
2023-11-22 21:52:36.181660: Epoch time: 59.87 s 
2023-11-22 21:52:37.255563:  
2023-11-22 21:52:37.255760: Epoch 292 
2023-11-22 21:52:37.255865: Current learning rate: 0.00733 
2023-11-22 21:53:37.042372: train_loss -0.3458 
2023-11-22 21:53:37.042595: val_loss -0.3723 
2023-11-22 21:53:37.042681: Pseudo dice [0.7548, nan] 
2023-11-22 21:53:37.042753: Epoch time: 59.79 s 
2023-11-22 21:53:38.115072:  
2023-11-22 21:53:38.115194: Epoch 293 
2023-11-22 21:53:38.115295: Current learning rate: 0.00732 
2023-11-22 21:54:37.853725: train_loss -0.3409 
2023-11-22 21:54:37.853924: val_loss -0.3532 
2023-11-22 21:54:37.854026: Pseudo dice [0.721, nan] 
2023-11-22 21:54:37.854102: Epoch time: 59.74 s 
2023-11-22 21:54:39.030703:  
2023-11-22 21:54:39.030876: Epoch 294 
2023-11-22 21:54:39.031016: Current learning rate: 0.00731 
2023-11-22 21:55:38.857876: train_loss -0.3547 
2023-11-22 21:55:38.858068: val_loss -0.3755 
2023-11-22 21:55:38.858150: Pseudo dice [0.7654, nan] 
2023-11-22 21:55:38.858228: Epoch time: 59.83 s 
2023-11-22 21:55:39.929784:  
2023-11-22 21:55:39.929962: Epoch 295 
2023-11-22 21:55:39.930111: Current learning rate: 0.0073 
2023-11-22 21:56:39.648883: train_loss -0.3475 
2023-11-22 21:56:39.649108: val_loss -0.3628 
2023-11-22 21:56:39.649227: Pseudo dice [0.7353, nan] 
2023-11-22 21:56:39.649303: Epoch time: 59.72 s 
2023-11-22 21:56:40.716167:  
2023-11-22 21:56:40.716454: Epoch 296 
2023-11-22 21:56:40.716612: Current learning rate: 0.00729 
2023-11-22 21:57:40.385410: train_loss -0.3365 
2023-11-22 21:57:40.385626: val_loss -0.3504 
2023-11-22 21:57:40.385712: Pseudo dice [0.7073, nan] 
2023-11-22 21:57:40.385795: Epoch time: 59.67 s 
2023-11-22 21:57:41.458172:  
2023-11-22 21:57:41.458303: Epoch 297 
2023-11-22 21:57:41.464698: Current learning rate: 0.00728 
2023-11-22 21:58:41.189075: train_loss -0.345 
2023-11-22 21:58:41.189278: val_loss -0.3457 
2023-11-22 21:58:41.189362: Pseudo dice [0.6962, nan] 
2023-11-22 21:58:41.189433: Epoch time: 59.73 s 
2023-11-22 21:58:42.368157:  
2023-11-22 21:58:42.368281: Epoch 298 
2023-11-22 21:58:42.368413: Current learning rate: 0.00727 
2023-11-22 21:59:42.105483: train_loss -0.3416 
2023-11-22 21:59:42.105683: val_loss -0.3334 
2023-11-22 21:59:42.105801: Pseudo dice [0.6732, nan] 
2023-11-22 21:59:42.105884: Epoch time: 59.74 s 
2023-11-22 21:59:43.180740:  
2023-11-22 21:59:43.180867: Epoch 299 
2023-11-22 21:59:43.180977: Current learning rate: 0.00726 
2023-11-22 22:00:42.772886: train_loss -0.3446 
2023-11-22 22:00:42.773078: val_loss -0.3422 
2023-11-22 22:00:42.773154: Pseudo dice [0.6919, nan] 
2023-11-22 22:00:42.773234: Epoch time: 59.59 s 
2023-11-22 22:00:43.959046:  
2023-11-22 22:00:43.959213: Epoch 300 
2023-11-22 22:00:43.959357: Current learning rate: 0.00725 
2023-11-22 22:01:43.769105: train_loss -0.3315 
2023-11-22 22:01:43.769310: val_loss -0.3301 
2023-11-22 22:01:43.769394: Pseudo dice [0.6757, nan] 
2023-11-22 22:01:43.769468: Epoch time: 59.81 s 
2023-11-22 22:01:44.829979:  
2023-11-22 22:01:44.830143: Epoch 301 
2023-11-22 22:01:44.830292: Current learning rate: 0.00724 
2023-11-22 22:02:44.549227: train_loss -0.3466 
2023-11-22 22:02:44.549442: val_loss -0.3511 
2023-11-22 22:02:44.549527: Pseudo dice [0.7126, nan] 
2023-11-22 22:02:44.549653: Epoch time: 59.72 s 
2023-11-22 22:02:45.725800:  
2023-11-22 22:02:45.726017: Epoch 302 
2023-11-22 22:02:45.726141: Current learning rate: 0.00724 
2023-11-22 22:03:45.462225: train_loss -0.3561 
2023-11-22 22:03:45.462452: val_loss -0.3542 
2023-11-22 22:03:45.462535: Pseudo dice [0.723, nan] 
2023-11-22 22:03:45.462608: Epoch time: 59.74 s 
2023-11-22 22:03:46.533939:  
2023-11-22 22:03:46.534116: Epoch 303 
2023-11-22 22:03:46.534254: Current learning rate: 0.00723 
2023-11-22 22:04:46.252712: train_loss -0.3343 
2023-11-22 22:04:46.252919: val_loss -0.3304 
2023-11-22 22:04:46.253033: Pseudo dice [0.6763, nan] 
2023-11-22 22:04:46.253108: Epoch time: 59.72 s 
2023-11-22 22:04:47.334337:  
2023-11-22 22:04:47.334467: Epoch 304 
2023-11-22 22:04:47.334600: Current learning rate: 0.00722 
2023-11-22 22:05:47.069097: train_loss -0.3415 
2023-11-22 22:05:47.069288: val_loss -0.3342 
2023-11-22 22:05:47.069368: Pseudo dice [0.6816, nan] 
2023-11-22 22:05:47.069437: Epoch time: 59.74 s 
2023-11-22 22:05:48.138688:  
2023-11-22 22:05:48.138820: Epoch 305 
2023-11-22 22:05:48.138929: Current learning rate: 0.00721 
2023-11-22 22:06:47.895811: train_loss -0.3527 
2023-11-22 22:06:47.896177: val_loss -0.3513 
2023-11-22 22:06:47.896385: Pseudo dice [0.7045, nan] 
2023-11-22 22:06:47.896585: Epoch time: 59.76 s 
2023-11-22 22:06:48.975106:  
2023-11-22 22:06:48.975237: Epoch 306 
2023-11-22 22:06:48.975332: Current learning rate: 0.0072 
2023-11-22 22:07:48.626656: train_loss -0.3396 
2023-11-22 22:07:48.626883: val_loss -0.3322 
2023-11-22 22:07:48.626967: Pseudo dice [0.674, nan] 
2023-11-22 22:07:48.627038: Epoch time: 59.65 s 
2023-11-22 22:07:49.697178:  
2023-11-22 22:07:49.697289: Epoch 307 
2023-11-22 22:07:49.697406: Current learning rate: 0.00719 
2023-11-22 22:08:49.416661: train_loss -0.3464 
2023-11-22 22:08:49.416856: val_loss -0.3581 
2023-11-22 22:08:49.416936: Pseudo dice [0.7307, nan] 
2023-11-22 22:08:49.417008: Epoch time: 59.72 s 
2023-11-22 22:08:50.486932:  
2023-11-22 22:08:50.487115: Epoch 308 
2023-11-22 22:08:50.487227: Current learning rate: 0.00718 
2023-11-22 22:09:50.113385: train_loss -0.3486 
2023-11-22 22:09:50.113556: val_loss -0.3468 
2023-11-22 22:09:50.113641: Pseudo dice [0.6956, nan] 
2023-11-22 22:09:50.113722: Epoch time: 59.63 s 
2023-11-22 22:09:51.175704:  
2023-11-22 22:09:51.175823: Epoch 309 
2023-11-22 22:09:51.175952: Current learning rate: 0.00717 
2023-11-22 22:10:50.784523: train_loss -0.3496 
2023-11-22 22:10:50.784730: val_loss -0.3357 
2023-11-22 22:10:50.784870: Pseudo dice [0.6739, nan] 
2023-11-22 22:10:50.784956: Epoch time: 59.61 s 
2023-11-22 22:10:51.861044:  
2023-11-22 22:10:51.861167: Epoch 310 
2023-11-22 22:10:51.861266: Current learning rate: 0.00716 
2023-11-22 22:11:51.476851: train_loss -0.3412 
2023-11-22 22:11:51.477039: val_loss -0.3559 
2023-11-22 22:11:51.477116: Pseudo dice [0.7141, nan] 
2023-11-22 22:11:51.477187: Epoch time: 59.62 s 
2023-11-22 22:11:52.548289:  
2023-11-22 22:11:52.548494: Epoch 311 
2023-11-22 22:11:52.548635: Current learning rate: 0.00715 
2023-11-22 22:12:52.279557: train_loss -0.3536 
2023-11-22 22:12:52.279768: val_loss -0.3765 
2023-11-22 22:12:52.279860: Pseudo dice [0.764, nan] 
2023-11-22 22:12:52.279937: Epoch time: 59.73 s 
2023-11-22 22:12:53.359970:  
2023-11-22 22:12:53.360098: Epoch 312 
2023-11-22 22:12:53.360199: Current learning rate: 0.00714 
2023-11-22 22:13:53.119204: train_loss -0.3474 
2023-11-22 22:13:53.119398: val_loss -0.3489 
2023-11-22 22:13:53.119479: Pseudo dice [0.712, nan] 
2023-11-22 22:13:53.119598: Epoch time: 59.76 s 
2023-11-22 22:13:54.196970:  
2023-11-22 22:13:54.197163: Epoch 313 
2023-11-22 22:13:54.197304: Current learning rate: 0.00713 
2023-11-22 22:14:53.988426: train_loss -0.3483 
2023-11-22 22:14:53.988635: val_loss -0.3663 
2023-11-22 22:14:53.988727: Pseudo dice [0.7406, nan] 
2023-11-22 22:14:53.988820: Epoch time: 59.79 s 
2023-11-22 22:14:55.080173:  
2023-11-22 22:14:55.080305: Epoch 314 
2023-11-22 22:14:55.080401: Current learning rate: 0.00712 
2023-11-22 22:15:54.893582: train_loss -0.3437 
2023-11-22 22:15:54.893778: val_loss -0.3435 
2023-11-22 22:15:54.893859: Pseudo dice [0.6832, nan] 
2023-11-22 22:15:54.893930: Epoch time: 59.81 s 
2023-11-22 22:15:56.097048:  
2023-11-22 22:15:56.097186: Epoch 315 
2023-11-22 22:15:56.097301: Current learning rate: 0.00711 
2023-11-22 22:16:55.810732: train_loss -0.3506 
2023-11-22 22:16:55.810933: val_loss -0.3655 
2023-11-22 22:16:55.811019: Pseudo dice [0.7505, nan] 
2023-11-22 22:16:55.811113: Epoch time: 59.71 s 
2023-11-22 22:16:56.889103:  
2023-11-22 22:16:56.889236: Epoch 316 
2023-11-22 22:16:56.889336: Current learning rate: 0.0071 
2023-11-22 22:17:56.681191: train_loss -0.3558 
2023-11-22 22:17:56.681396: val_loss -0.346 
2023-11-22 22:17:56.681478: Pseudo dice [0.7082, nan] 
2023-11-22 22:17:56.681561: Epoch time: 59.79 s 
2023-11-22 22:17:57.769780:  
2023-11-22 22:17:57.769908: Epoch 317 
2023-11-22 22:17:57.770007: Current learning rate: 0.0071 
2023-11-22 22:18:57.617232: train_loss -0.3417 
2023-11-22 22:18:57.617455: val_loss -0.3366 
2023-11-22 22:18:57.617536: Pseudo dice [0.6873, nan] 
2023-11-22 22:18:57.617607: Epoch time: 59.85 s 
2023-11-22 22:18:58.690148:  
2023-11-22 22:18:58.690273: Epoch 318 
2023-11-22 22:18:58.690387: Current learning rate: 0.00709 
2023-11-22 22:19:58.501885: train_loss -0.3436 
2023-11-22 22:19:58.502079: val_loss -0.3171 
2023-11-22 22:19:58.502164: Pseudo dice [0.6528, nan] 
2023-11-22 22:19:58.502237: Epoch time: 59.81 s 
2023-11-22 22:19:59.590513:  
2023-11-22 22:19:59.590694: Epoch 319 
2023-11-22 22:19:59.590799: Current learning rate: 0.00708 
2023-11-22 22:20:59.290553: train_loss -0.3429 
2023-11-22 22:20:59.290772: val_loss -0.3322 
2023-11-22 22:20:59.290857: Pseudo dice [0.6837, nan] 
2023-11-22 22:20:59.290941: Epoch time: 59.7 s 
2023-11-22 22:21:00.469950:  
2023-11-22 22:21:00.470088: Epoch 320 
2023-11-22 22:21:00.470190: Current learning rate: 0.00707 
2023-11-22 22:22:00.184808: train_loss -0.3408 
2023-11-22 22:22:00.184991: val_loss -0.3594 
2023-11-22 22:22:00.185107: Pseudo dice [0.7256, nan] 
2023-11-22 22:22:00.185180: Epoch time: 59.72 s 
2023-11-22 22:22:01.260098:  
2023-11-22 22:22:01.260232: Epoch 321 
2023-11-22 22:22:01.260340: Current learning rate: 0.00706 
2023-11-22 22:23:00.986520: train_loss -0.3441 
2023-11-22 22:23:00.986739: val_loss -0.3621 
2023-11-22 22:23:00.986822: Pseudo dice [0.732, nan] 
2023-11-22 22:23:00.986894: Epoch time: 59.73 s 
2023-11-22 22:23:02.059061:  
2023-11-22 22:23:02.059181: Epoch 322 
2023-11-22 22:23:02.059330: Current learning rate: 0.00705 
2023-11-22 22:24:01.703709: train_loss -0.3469 
2023-11-22 22:24:01.703896: val_loss -0.3395 
2023-11-22 22:24:01.703973: Pseudo dice [0.6904, nan] 
2023-11-22 22:24:01.704046: Epoch time: 59.65 s 
2023-11-22 22:24:02.779706:  
2023-11-22 22:24:02.779830: Epoch 323 
2023-11-22 22:24:02.779929: Current learning rate: 0.00704 
2023-11-22 22:25:02.472367: train_loss -0.3454 
2023-11-22 22:25:02.472592: val_loss -0.3447 
2023-11-22 22:25:02.472674: Pseudo dice [0.6978, nan] 
2023-11-22 22:25:02.472791: Epoch time: 59.69 s 
2023-11-22 22:25:03.644335:  
2023-11-22 22:25:03.644466: Epoch 324 
2023-11-22 22:25:03.644578: Current learning rate: 0.00703 
2023-11-22 22:26:03.360025: train_loss -0.3442 
2023-11-22 22:26:03.360225: val_loss -0.3464 
2023-11-22 22:26:03.360293: Pseudo dice [0.7052, nan] 
2023-11-22 22:26:03.360364: Epoch time: 59.72 s 
2023-11-22 22:26:04.437102:  
2023-11-22 22:26:04.437229: Epoch 325 
2023-11-22 22:26:04.437349: Current learning rate: 0.00702 
2023-11-22 22:27:04.165266: train_loss -0.3406 
2023-11-22 22:27:04.165474: val_loss -0.3531 
2023-11-22 22:27:04.165556: Pseudo dice [0.7154, nan] 
2023-11-22 22:27:04.165628: Epoch time: 59.73 s 
2023-11-22 22:27:05.238044:  
2023-11-22 22:27:05.238165: Epoch 326 
2023-11-22 22:27:05.238288: Current learning rate: 0.00701 
2023-11-22 22:28:04.865749: train_loss -0.3475 
2023-11-22 22:28:04.865964: val_loss -0.3643 
2023-11-22 22:28:04.866055: Pseudo dice [0.7427, nan] 
2023-11-22 22:28:04.866134: Epoch time: 59.63 s 
2023-11-22 22:28:05.935582:  
2023-11-22 22:28:05.935708: Epoch 327 
2023-11-22 22:28:05.935812: Current learning rate: 0.007 
2023-11-22 22:29:05.614870: train_loss -0.3391 
2023-11-22 22:29:05.615062: val_loss -0.3434 
2023-11-22 22:29:05.615198: Pseudo dice [0.6879, nan] 
2023-11-22 22:29:05.615299: Epoch time: 59.68 s 
2023-11-22 22:29:06.705365:  
2023-11-22 22:29:06.705498: Epoch 328 
2023-11-22 22:29:06.705599: Current learning rate: 0.00699 
2023-11-22 22:30:06.369046: train_loss -0.3443 
2023-11-22 22:30:06.369283: val_loss -0.3612 
2023-11-22 22:30:06.369388: Pseudo dice [0.7315, nan] 
2023-11-22 22:30:06.369459: Epoch time: 59.66 s 
2023-11-22 22:30:07.444170:  
2023-11-22 22:30:07.444327: Epoch 329 
2023-11-22 22:30:07.444445: Current learning rate: 0.00698 
2023-11-22 22:31:07.164268: train_loss -0.348 
2023-11-22 22:31:07.164448: val_loss -0.3674 
2023-11-22 22:31:07.164527: Pseudo dice [0.7586, nan] 
2023-11-22 22:31:07.164622: Epoch time: 59.72 s 
2023-11-22 22:31:08.242487:  
2023-11-22 22:31:08.242666: Epoch 330 
2023-11-22 22:31:08.242795: Current learning rate: 0.00697 
2023-11-22 22:32:07.963805: train_loss -0.3562 
2023-11-22 22:32:07.964030: val_loss -0.3589 
2023-11-22 22:32:07.964108: Pseudo dice [0.7247, nan] 
2023-11-22 22:32:07.964185: Epoch time: 59.72 s 
2023-11-22 22:32:09.250255:  
2023-11-22 22:32:09.250388: Epoch 331 
2023-11-22 22:32:09.250490: Current learning rate: 0.00696 
2023-11-22 22:33:08.874958: train_loss -0.3495 
2023-11-22 22:33:08.875150: val_loss -0.3362 
2023-11-22 22:33:08.875279: Pseudo dice [0.6861, nan] 
2023-11-22 22:33:08.875361: Epoch time: 59.63 s 
2023-11-22 22:33:09.947751:  
2023-11-22 22:33:09.947877: Epoch 332 
2023-11-22 22:33:09.947978: Current learning rate: 0.00696 
2023-11-22 22:34:09.540766: train_loss -0.3443 
2023-11-22 22:34:09.540981: val_loss -0.3512 
2023-11-22 22:34:09.541067: Pseudo dice [0.7092, nan] 
2023-11-22 22:34:09.541141: Epoch time: 59.59 s 
2023-11-22 22:34:10.708318:  
2023-11-22 22:34:10.708438: Epoch 333 
2023-11-22 22:34:10.708613: Current learning rate: 0.00695 
2023-11-22 22:35:10.384807: train_loss -0.3429 
2023-11-22 22:35:10.385040: val_loss -0.3329 
2023-11-22 22:35:10.385134: Pseudo dice [0.6847, nan] 
2023-11-22 22:35:10.385206: Epoch time: 59.68 s 
2023-11-22 22:35:11.456524:  
2023-11-22 22:35:11.456853: Epoch 334 
2023-11-22 22:35:11.456987: Current learning rate: 0.00694 
2023-11-22 22:36:11.317405: train_loss -0.3441 
2023-11-22 22:36:11.317604: val_loss -0.3577 
2023-11-22 22:36:11.317685: Pseudo dice [0.7257, nan] 
2023-11-22 22:36:11.317808: Epoch time: 59.86 s 
2023-11-22 22:36:12.404132:  
2023-11-22 22:36:12.404257: Epoch 335 
2023-11-22 22:36:12.404355: Current learning rate: 0.00693 
2023-11-22 22:37:12.139469: train_loss -0.3618 
2023-11-22 22:37:12.139657: val_loss -0.3539 
2023-11-22 22:37:12.139736: Pseudo dice [0.7132, nan] 
2023-11-22 22:37:12.139809: Epoch time: 59.74 s 
2023-11-22 22:37:13.236359:  
2023-11-22 22:37:13.236495: Epoch 336 
2023-11-22 22:37:13.236629: Current learning rate: 0.00692 
2023-11-22 22:38:12.906018: train_loss -0.3463 
2023-11-22 22:38:12.906219: val_loss -0.3874 
2023-11-22 22:38:12.906301: Pseudo dice [0.7904, nan] 
2023-11-22 22:38:12.906376: Epoch time: 59.67 s 
2023-11-22 22:38:13.988411:  
2023-11-22 22:38:13.988589: Epoch 337 
2023-11-22 22:38:13.988743: Current learning rate: 0.00691 
2023-11-22 22:39:13.796771: train_loss -0.3459 
2023-11-22 22:39:13.796962: val_loss -0.3552 
2023-11-22 22:39:13.797038: Pseudo dice [0.7168, nan] 
2023-11-22 22:39:13.797108: Epoch time: 59.81 s 
2023-11-22 22:39:14.988559:  
2023-11-22 22:39:14.988718: Epoch 338 
2023-11-22 22:39:14.988832: Current learning rate: 0.0069 
2023-11-22 22:40:14.612711: train_loss -0.3416 
2023-11-22 22:40:14.612968: val_loss -0.3339 
2023-11-22 22:40:14.613055: Pseudo dice [0.6831, nan] 
2023-11-22 22:40:14.613129: Epoch time: 59.62 s 
2023-11-22 22:40:15.699800:  
2023-11-22 22:40:15.699969: Epoch 339 
2023-11-22 22:40:15.700129: Current learning rate: 0.00689 
2023-11-22 22:41:15.388720: train_loss -0.3444 
2023-11-22 22:41:15.388930: val_loss -0.3474 
2023-11-22 22:41:15.389012: Pseudo dice [0.7096, nan] 
2023-11-22 22:41:15.389082: Epoch time: 59.69 s 
2023-11-22 22:41:16.470927:  
2023-11-22 22:41:16.471047: Epoch 340 
2023-11-22 22:41:16.471171: Current learning rate: 0.00688 
2023-11-22 22:42:16.087236: train_loss -0.3405 
2023-11-22 22:42:16.087427: val_loss -0.359 
2023-11-22 22:42:16.087507: Pseudo dice [0.7278, nan] 
2023-11-22 22:42:16.087576: Epoch time: 59.62 s 
2023-11-22 22:42:17.181980:  
2023-11-22 22:42:17.182101: Epoch 341 
2023-11-22 22:42:17.182199: Current learning rate: 0.00687 
2023-11-22 22:43:16.947258: train_loss -0.342 
2023-11-22 22:43:16.947441: val_loss -0.3517 
2023-11-22 22:43:16.947518: Pseudo dice [0.7059, nan] 
2023-11-22 22:43:16.947592: Epoch time: 59.77 s 
2023-11-22 22:43:18.039742:  
2023-11-22 22:43:18.039867: Epoch 342 
2023-11-22 22:43:18.039966: Current learning rate: 0.00686 
2023-11-22 22:44:17.729499: train_loss -0.3621 
2023-11-22 22:44:17.729706: val_loss -0.3707 
2023-11-22 22:44:17.729852: Pseudo dice [0.7508, nan] 
2023-11-22 22:44:17.729934: Epoch time: 59.69 s 
2023-11-22 22:44:18.927777:  
2023-11-22 22:44:18.927904: Epoch 343 
2023-11-22 22:44:18.928005: Current learning rate: 0.00685 
2023-11-22 22:45:18.540777: train_loss -0.3359 
2023-11-22 22:45:18.541006: val_loss -0.3476 
2023-11-22 22:45:18.541104: Pseudo dice [0.72, nan] 
2023-11-22 22:45:18.541180: Epoch time: 59.61 s 
2023-11-22 22:45:19.633108:  
2023-11-22 22:45:19.633236: Epoch 344 
2023-11-22 22:45:19.633394: Current learning rate: 0.00684 
2023-11-22 22:46:19.260016: train_loss -0.3414 
2023-11-22 22:46:19.260211: val_loss -0.3094 
2023-11-22 22:46:19.260291: Pseudo dice [0.6517, nan] 
2023-11-22 22:46:19.260367: Epoch time: 59.63 s 
2023-11-22 22:46:20.346445:  
2023-11-22 22:46:20.346785: Epoch 345 
2023-11-22 22:46:20.346920: Current learning rate: 0.00683 
2023-11-22 22:47:19.995347: train_loss -0.3387 
2023-11-22 22:47:19.995531: val_loss -0.3625 
2023-11-22 22:47:19.995622: Pseudo dice [0.7348, nan] 
2023-11-22 22:47:19.995735: Epoch time: 59.65 s 
2023-11-22 22:47:21.199410:  
2023-11-22 22:47:21.199534: Epoch 346 
2023-11-22 22:47:21.199631: Current learning rate: 0.00682 
2023-11-22 22:48:20.938422: train_loss -0.3345 
2023-11-22 22:48:20.938634: val_loss -0.3384 
2023-11-22 22:48:20.938708: Pseudo dice [0.6849, nan] 
2023-11-22 22:48:20.938783: Epoch time: 59.74 s 
2023-11-22 22:48:22.130773:  
2023-11-22 22:48:22.130905: Epoch 347 
2023-11-22 22:48:22.131035: Current learning rate: 0.00681 
2023-11-22 22:49:21.831304: train_loss -0.3524 
2023-11-22 22:49:21.831503: val_loss -0.3503 
2023-11-22 22:49:21.831608: Pseudo dice [0.7158, nan] 
2023-11-22 22:49:21.831682: Epoch time: 59.7 s 
2023-11-22 22:49:22.918704:  
2023-11-22 22:49:22.918827: Epoch 348 
2023-11-22 22:49:22.918922: Current learning rate: 0.0068 
2023-11-22 22:50:22.542342: train_loss -0.3403 
2023-11-22 22:50:22.542653: val_loss -0.3392 
2023-11-22 22:50:22.542774: Pseudo dice [0.6861, nan] 
2023-11-22 22:50:22.542946: Epoch time: 59.62 s 
2023-11-22 22:50:23.636044:  
2023-11-22 22:50:23.636169: Epoch 349 
2023-11-22 22:50:23.636269: Current learning rate: 0.0068 
2023-11-22 22:51:23.402149: train_loss -0.3372 
2023-11-22 22:51:23.402359: val_loss -0.3449 
2023-11-22 22:51:23.402445: Pseudo dice [0.6954, nan] 
2023-11-22 22:51:23.402515: Epoch time: 59.77 s 
2023-11-22 22:51:24.604320:  
2023-11-22 22:51:24.604449: Epoch 350 
2023-11-22 22:51:24.604548: Current learning rate: 0.00679 
2023-11-22 22:52:24.275206: train_loss -0.347 
2023-11-22 22:52:24.275460: val_loss -0.3519 
2023-11-22 22:52:24.275547: Pseudo dice [0.7258, nan] 
2023-11-22 22:52:24.275620: Epoch time: 59.67 s 
2023-11-22 22:52:25.362157:  
2023-11-22 22:52:25.362306: Epoch 351 
2023-11-22 22:52:25.362396: Current learning rate: 0.00678 
2023-11-22 22:53:24.813112: train_loss -0.3523 
2023-11-22 22:53:24.813360: val_loss -0.3373 
2023-11-22 22:53:24.813443: Pseudo dice [0.6901, nan] 
2023-11-22 22:53:24.813516: Epoch time: 59.45 s 
2023-11-22 22:53:26.006881:  
2023-11-22 22:53:26.007014: Epoch 352 
2023-11-22 22:53:26.007110: Current learning rate: 0.00677 
2023-11-22 22:54:25.714441: train_loss -0.3427 
2023-11-22 22:54:25.714610: val_loss -0.3473 
2023-11-22 22:54:25.714694: Pseudo dice [0.7047, nan] 
2023-11-22 22:54:25.714770: Epoch time: 59.71 s 
2023-11-22 22:54:26.803714:  
2023-11-22 22:54:26.803849: Epoch 353 
2023-11-22 22:54:26.803952: Current learning rate: 0.00676 
2023-11-22 22:55:26.413694: train_loss -0.3658 
2023-11-22 22:55:26.413895: val_loss -0.3535 
2023-11-22 22:55:26.414020: Pseudo dice [0.7163, nan] 
2023-11-22 22:55:26.414105: Epoch time: 59.61 s 
2023-11-22 22:55:27.587543:  
2023-11-22 22:55:27.587663: Epoch 354 
2023-11-22 22:55:27.587771: Current learning rate: 0.00675 
2023-11-22 22:56:27.196867: train_loss -0.3393 
2023-11-22 22:56:27.197055: val_loss -0.3578 
2023-11-22 22:56:27.197179: Pseudo dice [0.7283, nan] 
2023-11-22 22:56:27.197280: Epoch time: 59.61 s 
2023-11-22 22:56:28.289195:  
2023-11-22 22:56:28.289329: Epoch 355 
2023-11-22 22:56:28.289484: Current learning rate: 0.00674 
2023-11-22 22:57:27.846498: train_loss -0.3525 
2023-11-22 22:57:27.846699: val_loss -0.3459 
2023-11-22 22:57:27.846783: Pseudo dice [0.7063, nan] 
2023-11-22 22:57:27.846859: Epoch time: 59.56 s 
2023-11-22 22:57:28.935354:  
2023-11-22 22:57:28.935482: Epoch 356 
2023-11-22 22:57:28.935585: Current learning rate: 0.00673 
2023-11-22 22:58:28.404203: train_loss -0.3498 
2023-11-22 22:58:28.404385: val_loss -0.3475 
2023-11-22 22:58:28.404523: Pseudo dice [0.6963, nan] 
2023-11-22 22:58:28.404613: Epoch time: 59.47 s 
2023-11-22 22:58:29.595397:  
2023-11-22 22:58:29.595518: Epoch 357 
2023-11-22 22:58:29.595634: Current learning rate: 0.00672 
2023-11-22 22:59:29.203795: train_loss -0.3365 
2023-11-22 22:59:29.203982: val_loss -0.3546 
2023-11-22 22:59:29.204060: Pseudo dice [0.7164, nan] 
2023-11-22 22:59:29.204130: Epoch time: 59.61 s 
2023-11-22 22:59:30.289023:  
2023-11-22 22:59:30.289146: Epoch 358 
2023-11-22 22:59:30.289243: Current learning rate: 0.00671 
2023-11-22 23:00:28.025102: train_loss -0.3469 
2023-11-22 23:00:28.025322: val_loss -0.3311 
2023-11-22 23:00:28.025405: Pseudo dice [0.6791, nan] 
2023-11-22 23:00:28.025485: Epoch time: 57.74 s 
2023-11-22 23:00:29.111974:  
2023-11-22 23:00:29.112107: Epoch 359 
2023-11-22 23:00:29.112209: Current learning rate: 0.0067 
2023-11-22 23:01:24.811755: train_loss -0.3412 
2023-11-22 23:01:24.811973: val_loss -0.3274 
2023-11-22 23:01:24.812073: Pseudo dice [0.6799, nan] 
2023-11-22 23:01:24.812204: Epoch time: 55.7 s 
2023-11-22 23:01:25.896780:  
2023-11-22 23:01:25.896906: Epoch 360 
2023-11-22 23:01:25.897016: Current learning rate: 0.00669 
2023-11-22 23:02:21.534911: train_loss -0.3456 
2023-11-22 23:02:21.535118: val_loss -0.3475 
2023-11-22 23:02:21.535222: Pseudo dice [0.7091, nan] 
2023-11-22 23:02:21.535298: Epoch time: 55.64 s 
2023-11-22 23:02:22.729382:  
2023-11-22 23:02:22.729529: Epoch 361 
2023-11-22 23:02:22.729634: Current learning rate: 0.00668 
2023-11-22 23:03:18.351891: train_loss -0.3447 
2023-11-22 23:03:18.352102: val_loss -0.3644 
2023-11-22 23:03:18.352185: Pseudo dice [0.7407, nan] 
2023-11-22 23:03:18.352264: Epoch time: 55.62 s 
2023-11-22 23:03:19.441189:  
2023-11-22 23:03:19.441324: Epoch 362 
2023-11-22 23:03:19.441435: Current learning rate: 0.00667 
2023-11-22 23:04:15.183470: train_loss -0.3523 
2023-11-22 23:04:15.183674: val_loss -0.3474 
2023-11-22 23:04:15.183759: Pseudo dice [0.707, nan] 
2023-11-22 23:04:15.183850: Epoch time: 55.74 s 
2023-11-22 23:04:16.271122:  
2023-11-22 23:04:16.271298: Epoch 363 
2023-11-22 23:04:16.271450: Current learning rate: 0.00666 
2023-11-22 23:05:11.867246: train_loss -0.3415 
2023-11-22 23:05:11.867470: val_loss -0.3554 
2023-11-22 23:05:11.867555: Pseudo dice [0.7282, nan] 
2023-11-22 23:05:11.867637: Epoch time: 55.6 s 
2023-11-22 23:05:12.972744:  
2023-11-22 23:05:12.972883: Epoch 364 
2023-11-22 23:05:12.972978: Current learning rate: 0.00665 
2023-11-22 23:06:08.685285: train_loss -0.3413 
2023-11-22 23:06:08.685681: val_loss -0.3538 
2023-11-22 23:06:08.685851: Pseudo dice [0.7198, nan] 
2023-11-22 23:06:08.686036: Epoch time: 55.71 s 
2023-11-22 23:06:09.775071:  
2023-11-22 23:06:09.775197: Epoch 365 
2023-11-22 23:06:09.775293: Current learning rate: 0.00665 
2023-11-22 23:07:05.461071: train_loss -0.3452 
2023-11-22 23:07:05.461251: val_loss -0.3298 
2023-11-22 23:07:05.461356: Pseudo dice [0.6756, nan] 
2023-11-22 23:07:05.461456: Epoch time: 55.69 s 
2023-11-22 23:07:06.565744:  
2023-11-22 23:07:06.565904: Epoch 366 
2023-11-22 23:07:06.566047: Current learning rate: 0.00664 
2023-11-22 23:08:02.293022: train_loss -0.3434 
2023-11-22 23:08:02.293220: val_loss -0.3805 
2023-11-22 23:08:02.293297: Pseudo dice [0.7772, nan] 
2023-11-22 23:08:02.293371: Epoch time: 55.73 s 
2023-11-22 23:08:03.382794:  
2023-11-22 23:08:03.382917: Epoch 367 
2023-11-22 23:08:03.383013: Current learning rate: 0.00663 
2023-11-22 23:08:59.040489: train_loss -0.3355 
2023-11-22 23:08:59.040721: val_loss -0.3694 
2023-11-22 23:08:59.040841: Pseudo dice [0.7466, nan] 
2023-11-22 23:08:59.040926: Epoch time: 55.66 s 
2023-11-22 23:09:00.136264:  
2023-11-22 23:09:00.136384: Epoch 368 
2023-11-22 23:09:00.136482: Current learning rate: 0.00662 
2023-11-22 23:09:55.838353: train_loss -0.3492 
2023-11-22 23:09:55.838545: val_loss -0.3503 
2023-11-22 23:09:55.838624: Pseudo dice [0.7112, nan] 
2023-11-22 23:09:55.838695: Epoch time: 55.7 s 
2023-11-22 23:09:56.940850:  
2023-11-22 23:09:56.940965: Epoch 369 
2023-11-22 23:09:56.941058: Current learning rate: 0.00661 
2023-11-22 23:10:52.551932: train_loss -0.3477 
2023-11-22 23:10:52.552133: val_loss -0.3377 
2023-11-22 23:10:52.552217: Pseudo dice [0.6894, nan] 
2023-11-22 23:10:52.552293: Epoch time: 55.61 s 
2023-11-22 23:10:53.749561:  
2023-11-22 23:10:53.749703: Epoch 370 
2023-11-22 23:10:53.749842: Current learning rate: 0.0066 
2023-11-22 23:11:49.455453: train_loss -0.3503 
2023-11-22 23:11:49.455687: val_loss -0.3521 
2023-11-22 23:11:49.455811: Pseudo dice [0.7014, nan] 
2023-11-22 23:11:49.455933: Epoch time: 55.71 s 
2023-11-22 23:11:50.548810:  
2023-11-22 23:11:50.548932: Epoch 371 
2023-11-22 23:11:50.549036: Current learning rate: 0.00659 
2023-11-22 23:12:46.214100: train_loss -0.3591 
2023-11-22 23:12:46.214289: val_loss -0.361 
2023-11-22 23:12:46.214370: Pseudo dice [0.7376, nan] 
2023-11-22 23:12:46.214441: Epoch time: 55.67 s 
2023-11-22 23:12:47.312528:  
2023-11-22 23:12:47.312886: Epoch 372 
2023-11-22 23:12:47.312993: Current learning rate: 0.00658 
2023-11-22 23:13:42.988185: train_loss -0.339 
2023-11-22 23:13:42.988394: val_loss -0.3566 
2023-11-22 23:13:42.988471: Pseudo dice [0.728, nan] 
2023-11-22 23:13:42.988536: Epoch time: 55.68 s 
2023-11-22 23:13:44.083620:  
2023-11-22 23:13:44.083758: Epoch 373 
2023-11-22 23:13:44.083865: Current learning rate: 0.00657 
2023-11-22 23:14:39.864537: train_loss -0.3383 
2023-11-22 23:14:39.864756: val_loss -0.3535 
2023-11-22 23:14:39.864931: Pseudo dice [0.7007, nan] 
2023-11-22 23:14:39.865018: Epoch time: 55.78 s 
2023-11-22 23:14:40.983361:  
2023-11-22 23:14:40.983485: Epoch 374 
2023-11-22 23:14:40.983594: Current learning rate: 0.00656 
2023-11-22 23:15:36.759821: train_loss -0.3477 
2023-11-22 23:15:36.760035: val_loss -0.3319 
2023-11-22 23:15:36.760132: Pseudo dice [0.6742, nan] 
2023-11-22 23:15:36.760205: Epoch time: 55.78 s 
2023-11-22 23:15:37.937838:  
2023-11-22 23:15:37.937955: Epoch 375 
2023-11-22 23:15:37.938048: Current learning rate: 0.00655 
2023-11-22 23:16:33.581210: train_loss -0.35 
2023-11-22 23:16:33.581412: val_loss -0.3734 
2023-11-22 23:16:33.581494: Pseudo dice [0.7578, nan] 
2023-11-22 23:16:33.581566: Epoch time: 55.64 s 
2023-11-22 23:16:34.655676:  
2023-11-22 23:16:34.655795: Epoch 376 
2023-11-22 23:16:34.655894: Current learning rate: 0.00654 
2023-11-22 23:17:30.216604: train_loss -0.3374 
2023-11-22 23:17:30.216816: val_loss -0.3726 
2023-11-22 23:17:30.216903: Pseudo dice [0.7512, nan] 
2023-11-22 23:17:30.216978: Epoch time: 55.56 s 
2023-11-22 23:17:31.315707:  
2023-11-22 23:17:31.315868: Epoch 377 
2023-11-22 23:17:31.315982: Current learning rate: 0.00653 
2023-11-22 23:18:27.059038: train_loss -0.3339 
2023-11-22 23:18:27.059231: val_loss -0.3419 
2023-11-22 23:18:27.059308: Pseudo dice [0.6882, nan] 
2023-11-22 23:18:27.059377: Epoch time: 55.74 s 
2023-11-22 23:18:28.154647:  
2023-11-22 23:18:28.154818: Epoch 378 
2023-11-22 23:18:28.154974: Current learning rate: 0.00652 
2023-11-22 23:19:23.877142: train_loss -0.3552 
2023-11-22 23:19:23.877446: val_loss -0.3416 
2023-11-22 23:19:23.877665: Pseudo dice [0.6965, nan] 
2023-11-22 23:19:23.877813: Epoch time: 55.72 s 
2023-11-22 23:19:24.977729:  
2023-11-22 23:19:24.977856: Epoch 379 
2023-11-22 23:19:24.977957: Current learning rate: 0.00651 
2023-11-22 23:20:20.671672: train_loss -0.3421 
2023-11-22 23:20:20.671861: val_loss -0.3684 
2023-11-22 23:20:20.671942: Pseudo dice [0.7469, nan] 
2023-11-22 23:20:20.672016: Epoch time: 55.69 s 
2023-11-22 23:20:21.766539:  
2023-11-22 23:20:21.766660: Epoch 380 
2023-11-22 23:20:21.766810: Current learning rate: 0.0065 
2023-11-22 23:21:17.633436: train_loss -0.3458 
2023-11-22 23:21:17.633624: val_loss -0.3487 
2023-11-22 23:21:17.633707: Pseudo dice [0.7093, nan] 
2023-11-22 23:21:17.633781: Epoch time: 55.87 s 
2023-11-22 23:21:18.722131:  
2023-11-22 23:21:18.722255: Epoch 381 
2023-11-22 23:21:18.722354: Current learning rate: 0.00649 
2023-11-22 23:22:14.401509: train_loss -0.3419 
2023-11-22 23:22:14.401699: val_loss -0.3468 
2023-11-22 23:22:14.401777: Pseudo dice [0.7142, nan] 
2023-11-22 23:22:14.401848: Epoch time: 55.68 s 
2023-11-22 23:22:15.510572:  
2023-11-22 23:22:15.510710: Epoch 382 
2023-11-22 23:22:15.510809: Current learning rate: 0.00648 
2023-11-22 23:23:11.273269: train_loss -0.3558 
2023-11-22 23:23:11.273435: val_loss -0.3652 
2023-11-22 23:23:11.273515: Pseudo dice [0.7465, nan] 
2023-11-22 23:23:11.273587: Epoch time: 55.76 s 
2023-11-22 23:23:12.388105:  
2023-11-22 23:23:12.388289: Epoch 383 
2023-11-22 23:23:12.388402: Current learning rate: 0.00648 
2023-11-22 23:24:07.830802: train_loss -0.337 
2023-11-22 23:24:07.831003: val_loss -0.3628 
2023-11-22 23:24:07.831087: Pseudo dice [0.7381, nan] 
2023-11-22 23:24:07.831162: Epoch time: 55.44 s 
2023-11-22 23:24:09.035891:  
2023-11-22 23:24:09.036028: Epoch 384 
2023-11-22 23:24:09.036136: Current learning rate: 0.00647 
2023-11-22 23:25:04.781106: train_loss -0.3463 
2023-11-22 23:25:04.781340: val_loss -0.3484 
2023-11-22 23:25:04.781445: Pseudo dice [0.7137, nan] 
2023-11-22 23:25:04.781531: Epoch time: 55.75 s 
2023-11-22 23:25:05.899061:  
2023-11-22 23:25:05.899187: Epoch 385 
2023-11-22 23:25:05.899311: Current learning rate: 0.00646 
2023-11-22 23:26:01.590567: train_loss -0.3369 
2023-11-22 23:26:01.590805: val_loss -0.3479 
2023-11-22 23:26:01.590912: Pseudo dice [0.6963, nan] 
2023-11-22 23:26:01.591034: Epoch time: 55.69 s 
2023-11-22 23:26:02.696626:  
2023-11-22 23:26:02.696761: Epoch 386 
2023-11-22 23:26:02.696868: Current learning rate: 0.00645 
2023-11-22 23:26:58.375196: train_loss -0.3537 
2023-11-22 23:26:58.375389: val_loss -0.3441 
2023-11-22 23:26:58.375468: Pseudo dice [0.6976, nan] 
2023-11-22 23:26:58.375546: Epoch time: 55.68 s 
2023-11-22 23:26:59.481278:  
2023-11-22 23:26:59.481405: Epoch 387 
2023-11-22 23:26:59.481531: Current learning rate: 0.00644 
2023-11-22 23:27:55.263741: train_loss -0.3355 
2023-11-22 23:27:55.263941: val_loss -0.3443 
2023-11-22 23:27:55.264023: Pseudo dice [0.7047, nan] 
2023-11-22 23:27:55.264094: Epoch time: 55.78 s 
2023-11-22 23:27:56.372602:  
2023-11-22 23:27:56.372736: Epoch 388 
2023-11-22 23:27:56.372859: Current learning rate: 0.00643 
2023-11-22 23:28:51.894014: train_loss -0.348 
2023-11-22 23:28:51.894212: val_loss -0.3608 
2023-11-22 23:28:51.894293: Pseudo dice [0.7339, nan] 
2023-11-22 23:28:51.894366: Epoch time: 55.52 s 
2023-11-22 23:28:53.106336:  
2023-11-22 23:28:53.106468: Epoch 389 
2023-11-22 23:28:53.106567: Current learning rate: 0.00642 
2023-11-22 23:29:48.839523: train_loss -0.348 
2023-11-22 23:29:48.839723: val_loss -0.3364 
2023-11-22 23:29:48.839831: Pseudo dice [0.6861, nan] 
2023-11-22 23:29:48.839906: Epoch time: 55.73 s 
2023-11-22 23:29:49.947654:  
2023-11-22 23:29:49.947784: Epoch 390 
2023-11-22 23:29:49.947906: Current learning rate: 0.00641 
2023-11-22 23:30:45.537046: train_loss -0.3445 
2023-11-22 23:30:45.537218: val_loss -0.3254 
2023-11-22 23:30:45.537298: Pseudo dice [0.675, nan] 
2023-11-22 23:30:45.537371: Epoch time: 55.59 s 
2023-11-22 23:30:46.648335:  
2023-11-22 23:30:46.648462: Epoch 391 
2023-11-22 23:30:46.648583: Current learning rate: 0.0064 
2023-11-22 23:31:42.464270: train_loss -0.3536 
2023-11-22 23:31:42.464483: val_loss -0.3553 
2023-11-22 23:31:42.464580: Pseudo dice [0.7285, nan] 
2023-11-22 23:31:42.464657: Epoch time: 55.82 s 
2023-11-22 23:31:43.569741:  
2023-11-22 23:31:43.569953: Epoch 392 
2023-11-22 23:31:43.570066: Current learning rate: 0.00639 
2023-11-22 23:32:39.225918: train_loss -0.3346 
2023-11-22 23:32:39.226122: val_loss -0.3471 
2023-11-22 23:32:39.226218: Pseudo dice [0.7031, nan] 
2023-11-22 23:32:39.226295: Epoch time: 55.66 s 
2023-11-22 23:32:40.432446:  
2023-11-22 23:32:40.432595: Epoch 393 
2023-11-22 23:32:40.432705: Current learning rate: 0.00638 
2023-11-22 23:33:36.079226: train_loss -0.333 
2023-11-22 23:33:36.079423: val_loss -0.3286 
2023-11-22 23:33:36.079501: Pseudo dice [0.6735, nan] 
2023-11-22 23:33:36.079574: Epoch time: 55.65 s 
2023-11-22 23:33:37.185810:  
2023-11-22 23:33:37.186004: Epoch 394 
2023-11-22 23:33:37.186173: Current learning rate: 0.00637 
2023-11-22 23:34:32.921345: train_loss -0.354 
2023-11-22 23:34:32.921540: val_loss -0.3318 
2023-11-22 23:34:32.921620: Pseudo dice [0.6775, nan] 
2023-11-22 23:34:32.921693: Epoch time: 55.74 s 
2023-11-22 23:34:34.033357:  
2023-11-22 23:34:34.033549: Epoch 395 
2023-11-22 23:34:34.033666: Current learning rate: 0.00636 
2023-11-22 23:35:29.725001: train_loss -0.3488 
2023-11-22 23:35:29.725193: val_loss -0.3627 
2023-11-22 23:35:29.725274: Pseudo dice [0.7355, nan] 
2023-11-22 23:35:29.725346: Epoch time: 55.69 s 
2023-11-22 23:35:30.843898:  
2023-11-22 23:35:30.844023: Epoch 396 
2023-11-22 23:35:30.844136: Current learning rate: 0.00635 
2023-11-22 23:36:26.489166: train_loss -0.3538 
2023-11-22 23:36:26.489360: val_loss -0.3361 
2023-11-22 23:36:26.489439: Pseudo dice [0.6833, nan] 
2023-11-22 23:36:26.489511: Epoch time: 55.65 s 
2023-11-22 23:36:27.605443:  
2023-11-22 23:36:27.605619: Epoch 397 
2023-11-22 23:36:27.605773: Current learning rate: 0.00634 
2023-11-22 23:37:23.166721: train_loss -0.3358 
2023-11-22 23:37:23.166960: val_loss -0.3379 
2023-11-22 23:37:23.167046: Pseudo dice [0.693, nan] 
2023-11-22 23:37:23.167119: Epoch time: 55.56 s 
2023-11-22 23:37:24.290779:  
2023-11-22 23:37:24.290908: Epoch 398 
2023-11-22 23:37:24.291004: Current learning rate: 0.00633 
2023-11-22 23:38:19.983641: train_loss -0.3237 
2023-11-22 23:38:19.983864: val_loss -0.3452 
2023-11-22 23:38:19.983948: Pseudo dice [0.7082, nan] 
2023-11-22 23:38:19.984024: Epoch time: 55.69 s 
2023-11-22 23:38:21.083405:  
2023-11-22 23:38:21.083534: Epoch 399 
2023-11-22 23:38:21.083657: Current learning rate: 0.00632 
2023-11-22 23:39:16.668670: train_loss -0.3287 
2023-11-22 23:39:16.668869: val_loss -0.3667 
2023-11-22 23:39:16.668963: Pseudo dice [0.7403, nan] 
2023-11-22 23:39:16.669039: Epoch time: 55.59 s 
2023-11-22 23:39:17.896878:  
2023-11-22 23:39:17.897058: Epoch 400 
2023-11-22 23:39:17.897178: Current learning rate: 0.00631 
2023-11-22 23:40:13.573059: train_loss -0.3381 
2023-11-22 23:40:13.573220: val_loss -0.3586 
2023-11-22 23:40:13.573303: Pseudo dice [0.7289, nan] 
2023-11-22 23:40:13.573377: Epoch time: 55.68 s 
2023-11-22 23:40:14.689768:  
2023-11-22 23:40:14.689952: Epoch 401 
2023-11-22 23:40:14.690048: Current learning rate: 0.0063 
2023-11-22 23:41:10.266205: train_loss -0.3467 
2023-11-22 23:41:10.266418: val_loss -0.3664 
2023-11-22 23:41:10.266506: Pseudo dice [0.7285, nan] 
2023-11-22 23:41:10.266580: Epoch time: 55.58 s 
2023-11-22 23:41:11.485746:  
2023-11-22 23:41:11.485871: Epoch 402 
2023-11-22 23:41:11.485975: Current learning rate: 0.0063 
2023-11-22 23:42:07.091909: train_loss -0.3467 
2023-11-22 23:42:07.092103: val_loss -0.3393 
2023-11-22 23:42:07.092188: Pseudo dice [0.6812, nan] 
2023-11-22 23:42:07.092269: Epoch time: 55.61 s 
2023-11-22 23:42:08.206434:  
2023-11-22 23:42:08.206563: Epoch 403 
2023-11-22 23:42:08.206666: Current learning rate: 0.00629 
2023-11-22 23:43:03.996084: train_loss -0.3412 
2023-11-22 23:43:03.996306: val_loss -0.3627 
2023-11-22 23:43:03.996392: Pseudo dice [0.7404, nan] 
2023-11-22 23:43:03.996467: Epoch time: 55.79 s 
2023-11-22 23:43:05.106286:  
2023-11-22 23:43:05.106438: Epoch 404 
2023-11-22 23:43:05.106554: Current learning rate: 0.00628 
2023-11-22 23:44:00.810913: train_loss -0.3572 
2023-11-22 23:44:00.811106: val_loss -0.3575 
2023-11-22 23:44:00.811187: Pseudo dice [0.7275, nan] 
2023-11-22 23:44:00.811262: Epoch time: 55.71 s 
2023-11-22 23:44:01.917412:  
2023-11-22 23:44:01.917547: Epoch 405 
2023-11-22 23:44:01.917655: Current learning rate: 0.00627 
2023-11-22 23:44:57.600352: train_loss -0.3428 
2023-11-22 23:44:57.600546: val_loss -0.3598 
2023-11-22 23:44:57.600647: Pseudo dice [0.734, nan] 
2023-11-22 23:44:57.600724: Epoch time: 55.68 s 
2023-11-22 23:44:58.813463:  
2023-11-22 23:44:58.813584: Epoch 406 
2023-11-22 23:44:58.813685: Current learning rate: 0.00626 
2023-11-22 23:45:54.492338: train_loss -0.3409 
2023-11-22 23:45:54.492540: val_loss -0.3624 
2023-11-22 23:45:54.492640: Pseudo dice [0.7343, nan] 
2023-11-22 23:45:54.492715: Epoch time: 55.68 s 
2023-11-22 23:45:55.600353:  
2023-11-22 23:45:55.600530: Epoch 407 
2023-11-22 23:45:55.600694: Current learning rate: 0.00625 
2023-11-22 23:46:51.335425: train_loss -0.3522 
2023-11-22 23:46:51.335608: val_loss -0.3384 
2023-11-22 23:46:51.335691: Pseudo dice [0.7039, nan] 
2023-11-22 23:46:51.335764: Epoch time: 55.74 s 
2023-11-22 23:46:52.448528:  
2023-11-22 23:46:52.448663: Epoch 408 
2023-11-22 23:46:52.448759: Current learning rate: 0.00624 
2023-11-22 23:47:48.156688: train_loss -0.3371 
2023-11-22 23:47:48.156888: val_loss -0.3349 
2023-11-22 23:47:48.156968: Pseudo dice [0.6839, nan] 
2023-11-22 23:47:48.157041: Epoch time: 55.71 s 
2023-11-22 23:47:49.257436:  
2023-11-22 23:47:49.257562: Epoch 409 
2023-11-22 23:47:49.257661: Current learning rate: 0.00623 
2023-11-22 23:48:44.889592: train_loss -0.3452 
2023-11-22 23:48:44.889794: val_loss -0.3563 
2023-11-22 23:48:44.889876: Pseudo dice [0.7227, nan] 
2023-11-22 23:48:44.889951: Epoch time: 55.63 s 
2023-11-22 23:48:46.096492:  
2023-11-22 23:48:46.096708: Epoch 410 
2023-11-22 23:48:46.096861: Current learning rate: 0.00622 
2023-11-22 23:49:41.796191: train_loss -0.3528 
2023-11-22 23:49:41.796391: val_loss -0.341 
2023-11-22 23:49:41.796476: Pseudo dice [0.6967, nan] 
2023-11-22 23:49:41.796551: Epoch time: 55.7 s 
2023-11-22 23:49:42.858879:  
2023-11-22 23:49:42.859000: Epoch 411 
2023-11-22 23:49:42.859120: Current learning rate: 0.00621 
2023-11-22 23:50:38.592597: train_loss -0.3536 
2023-11-22 23:50:38.592774: val_loss -0.3418 
2023-11-22 23:50:38.592853: Pseudo dice [0.6929, nan] 
2023-11-22 23:50:38.592926: Epoch time: 55.73 s 
2023-11-22 23:50:39.645349:  
2023-11-22 23:50:39.645467: Epoch 412 
2023-11-22 23:50:39.645567: Current learning rate: 0.0062 
2023-11-22 23:51:35.359750: train_loss -0.3432 
2023-11-22 23:51:35.359983: val_loss -0.3491 
2023-11-22 23:51:35.360080: Pseudo dice [0.7083, nan] 
2023-11-22 23:51:35.360155: Epoch time: 55.72 s 
2023-11-22 23:51:36.409322:  
2023-11-22 23:51:36.409453: Epoch 413 
2023-11-22 23:51:36.409549: Current learning rate: 0.00619 
2023-11-22 23:52:32.089146: train_loss -0.3513 
2023-11-22 23:52:32.089338: val_loss -0.3732 
2023-11-22 23:52:32.089416: Pseudo dice [0.7601, nan] 
2023-11-22 23:52:32.089486: Epoch time: 55.68 s 
2023-11-22 23:52:33.238306:  
2023-11-22 23:52:33.238435: Epoch 414 
2023-11-22 23:52:33.238540: Current learning rate: 0.00618 
2023-11-22 23:53:28.880370: train_loss -0.3454 
2023-11-22 23:53:28.880599: val_loss -0.3287 
2023-11-22 23:53:28.880687: Pseudo dice [0.6718, nan] 
2023-11-22 23:53:28.880763: Epoch time: 55.64 s 
2023-11-22 23:53:29.946247:  
2023-11-22 23:53:29.946375: Epoch 415 
2023-11-22 23:53:29.946473: Current learning rate: 0.00617 
2023-11-22 23:54:25.696892: train_loss -0.3409 
2023-11-22 23:54:25.697099: val_loss -0.3415 
2023-11-22 23:54:25.697178: Pseudo dice [0.6808, nan] 
2023-11-22 23:54:25.697254: Epoch time: 55.75 s 
2023-11-22 23:54:26.749266:  
2023-11-22 23:54:26.749402: Epoch 416 
2023-11-22 23:54:26.749512: Current learning rate: 0.00616 
2023-11-22 23:55:22.605728: train_loss -0.347 
2023-11-22 23:55:22.605910: val_loss -0.3295 
2023-11-22 23:55:22.605991: Pseudo dice [0.6741, nan] 
2023-11-22 23:55:22.606064: Epoch time: 55.86 s 
2023-11-22 23:55:23.657644:  
2023-11-22 23:55:23.657802: Epoch 417 
2023-11-22 23:55:23.657960: Current learning rate: 0.00615 
2023-11-22 23:56:19.242949: train_loss -0.3455 
2023-11-22 23:56:19.243136: val_loss -0.3506 
2023-11-22 23:56:19.243212: Pseudo dice [0.7154, nan] 
2023-11-22 23:56:19.243279: Epoch time: 55.59 s 
2023-11-22 23:56:20.405991:  
2023-11-22 23:56:20.406113: Epoch 418 
2023-11-22 23:56:20.406208: Current learning rate: 0.00614 
2023-11-22 23:57:16.151895: train_loss -0.3525 
2023-11-22 23:57:16.152107: val_loss -0.3393 
2023-11-22 23:57:16.152188: Pseudo dice [0.6863, nan] 
2023-11-22 23:57:16.152260: Epoch time: 55.75 s 
2023-11-22 23:57:17.202248:  
2023-11-22 23:57:17.202411: Epoch 419 
2023-11-22 23:57:17.202514: Current learning rate: 0.00613 
2023-11-22 23:58:12.840203: train_loss -0.3455 
2023-11-22 23:58:12.840390: val_loss -0.347 
2023-11-22 23:58:12.840467: Pseudo dice [0.7169, nan] 
2023-11-22 23:58:12.840537: Epoch time: 55.64 s 
2023-11-22 23:58:13.895217:  
2023-11-22 23:58:13.895333: Epoch 420 
2023-11-22 23:58:13.895436: Current learning rate: 0.00612 
2023-11-22 23:59:09.501040: train_loss -0.3464 
2023-11-22 23:59:09.501243: val_loss -0.3346 
2023-11-22 23:59:09.501323: Pseudo dice [0.6866, nan] 
2023-11-22 23:59:09.501396: Epoch time: 55.61 s 
2023-11-22 23:59:10.557336:  
2023-11-22 23:59:10.557462: Epoch 421 
2023-11-22 23:59:10.557599: Current learning rate: 0.00612 
2023-11-23 00:00:06.093435: train_loss -0.3336 
2023-11-23 00:00:06.093685: val_loss -0.3533 
2023-11-23 00:00:06.093770: Pseudo dice [0.7157, nan] 
2023-11-23 00:00:06.093844: Epoch time: 55.54 s 
2023-11-23 00:00:07.279023:  
2023-11-23 00:00:07.279153: Epoch 422 
2023-11-23 00:00:07.279277: Current learning rate: 0.00611 
2023-11-23 00:01:03.023941: train_loss -0.3392 
2023-11-23 00:01:03.024149: val_loss -0.3516 
2023-11-23 00:01:03.024261: Pseudo dice [0.7153, nan] 
2023-11-23 00:01:03.024341: Epoch time: 55.75 s 
2023-11-23 00:01:04.086799:  
2023-11-23 00:01:04.086920: Epoch 423 
2023-11-23 00:01:04.087024: Current learning rate: 0.0061 
2023-11-23 00:01:59.920435: train_loss -0.3261 
2023-11-23 00:01:59.920667: val_loss -0.3459 
2023-11-23 00:01:59.920757: Pseudo dice [0.714, nan] 
2023-11-23 00:01:59.920852: Epoch time: 55.83 s 
2023-11-23 00:02:00.969587:  
2023-11-23 00:02:00.969778: Epoch 424 
2023-11-23 00:02:00.969886: Current learning rate: 0.00609 
2023-11-23 00:02:56.567336: train_loss -0.3411 
2023-11-23 00:02:56.567535: val_loss -0.3502 
2023-11-23 00:02:56.567618: Pseudo dice [0.7167, nan] 
2023-11-23 00:02:56.567693: Epoch time: 55.6 s 
2023-11-23 00:02:57.628599:  
2023-11-23 00:02:57.628723: Epoch 425 
2023-11-23 00:02:57.628823: Current learning rate: 0.00608 
2023-11-23 00:03:53.128613: train_loss -0.347 
2023-11-23 00:03:53.128816: val_loss -0.3533 
2023-11-23 00:03:53.128896: Pseudo dice [0.7215, nan] 
2023-11-23 00:03:53.129004: Epoch time: 55.5 s 
2023-11-23 00:03:54.289512:  
2023-11-23 00:03:54.289761: Epoch 426 
2023-11-23 00:03:54.289948: Current learning rate: 0.00607 
2023-11-23 00:04:49.958979: train_loss -0.3299 
2023-11-23 00:04:49.959181: val_loss -0.3521 
2023-11-23 00:04:49.959263: Pseudo dice [0.7107, nan] 
2023-11-23 00:04:49.959336: Epoch time: 55.67 s 
2023-11-23 00:04:51.024978:  
2023-11-23 00:04:51.025115: Epoch 427 
2023-11-23 00:04:51.025242: Current learning rate: 0.00606 
2023-11-23 00:05:46.651168: train_loss -0.3355 
2023-11-23 00:05:46.651395: val_loss -0.3398 
2023-11-23 00:05:46.651483: Pseudo dice [0.6984, nan] 
2023-11-23 00:05:46.651558: Epoch time: 55.63 s 
2023-11-23 00:05:47.707419:  
2023-11-23 00:05:47.707634: Epoch 428 
2023-11-23 00:05:47.707758: Current learning rate: 0.00605 
2023-11-23 00:06:43.477964: train_loss -0.3495 
2023-11-23 00:06:43.478190: val_loss -0.3514 
2023-11-23 00:06:43.478274: Pseudo dice [0.7066, nan] 
2023-11-23 00:06:43.478346: Epoch time: 55.77 s 
2023-11-23 00:06:44.521526:  
2023-11-23 00:06:44.521656: Epoch 429 
2023-11-23 00:06:44.521759: Current learning rate: 0.00604 
2023-11-23 00:07:40.199667: train_loss -0.3475 
2023-11-23 00:07:40.199910: val_loss -0.3566 
2023-11-23 00:07:40.199996: Pseudo dice [0.7243, nan] 
2023-11-23 00:07:40.200075: Epoch time: 55.68 s 
2023-11-23 00:07:41.359678:  
2023-11-23 00:07:41.359799: Epoch 430 
2023-11-23 00:07:41.359907: Current learning rate: 0.00603 
2023-11-23 00:08:37.088769: train_loss -0.325 
2023-11-23 00:08:37.088971: val_loss -0.3414 
2023-11-23 00:08:37.089053: Pseudo dice [0.6945, nan] 
2023-11-23 00:08:37.089126: Epoch time: 55.73 s 
2023-11-23 00:08:38.148216:  
2023-11-23 00:08:38.148347: Epoch 431 
2023-11-23 00:08:38.148494: Current learning rate: 0.00602 
2023-11-23 00:09:33.876300: train_loss -0.3332 
2023-11-23 00:09:33.876501: val_loss -0.3322 
2023-11-23 00:09:33.876630: Pseudo dice [0.6818, nan] 
2023-11-23 00:09:33.876717: Epoch time: 55.73 s 
2023-11-23 00:09:34.932692:  
2023-11-23 00:09:34.932813: Epoch 432 
2023-11-23 00:09:34.932987: Current learning rate: 0.00601 
2023-11-23 00:10:30.710406: train_loss -0.3346 
2023-11-23 00:10:30.710634: val_loss -0.3351 
2023-11-23 00:10:30.710719: Pseudo dice [0.6881, nan] 
2023-11-23 00:10:30.710793: Epoch time: 55.78 s 
2023-11-23 00:10:31.779690:  
2023-11-23 00:10:31.779818: Epoch 433 
2023-11-23 00:10:31.779914: Current learning rate: 0.006 
2023-11-23 00:11:27.361688: train_loss -0.3423 
2023-11-23 00:11:27.361905: val_loss -0.3402 
2023-11-23 00:11:27.361988: Pseudo dice [0.6908, nan] 
2023-11-23 00:11:27.362059: Epoch time: 55.58 s 
2023-11-23 00:11:28.521780:  
2023-11-23 00:11:28.521917: Epoch 434 
2023-11-23 00:11:28.522031: Current learning rate: 0.00599 
2023-11-23 00:12:24.074614: train_loss -0.3372 
2023-11-23 00:12:24.074816: val_loss -0.3661 
2023-11-23 00:12:24.074916: Pseudo dice [0.7356, nan] 
2023-11-23 00:12:24.074990: Epoch time: 55.55 s 
2023-11-23 00:12:25.150509:  
2023-11-23 00:12:25.150646: Epoch 435 
2023-11-23 00:12:25.150755: Current learning rate: 0.00598 
2023-11-23 00:13:20.816094: train_loss -0.3444 
2023-11-23 00:13:20.816304: val_loss -0.3639 
2023-11-23 00:13:20.816387: Pseudo dice [0.7381, nan] 
2023-11-23 00:13:20.816461: Epoch time: 55.67 s 
2023-11-23 00:13:21.876071:  
2023-11-23 00:13:21.876254: Epoch 436 
2023-11-23 00:13:21.876405: Current learning rate: 0.00597 
2023-11-23 00:14:17.544740: train_loss -0.3413 
2023-11-23 00:14:17.544964: val_loss -0.313 
2023-11-23 00:14:17.545045: Pseudo dice [0.6526, nan] 
2023-11-23 00:14:17.545116: Epoch time: 55.67 s 
2023-11-23 00:14:18.607618:  
2023-11-23 00:14:18.607780: Epoch 437 
2023-11-23 00:14:18.607890: Current learning rate: 0.00596 
2023-11-23 00:15:14.195988: train_loss -0.3447 
2023-11-23 00:15:14.196181: val_loss -0.3582 
2023-11-23 00:15:14.196262: Pseudo dice [0.734, nan] 
2023-11-23 00:15:14.196342: Epoch time: 55.59 s 
2023-11-23 00:15:15.375245:  
2023-11-23 00:15:15.375388: Epoch 438 
2023-11-23 00:15:15.375512: Current learning rate: 0.00595 
2023-11-23 00:16:11.091747: train_loss -0.3517 
2023-11-23 00:16:11.091987: val_loss -0.3621 
2023-11-23 00:16:11.092075: Pseudo dice [0.7378, nan] 
2023-11-23 00:16:11.092149: Epoch time: 55.72 s 
2023-11-23 00:16:12.152149:  
2023-11-23 00:16:12.152281: Epoch 439 
2023-11-23 00:16:12.152429: Current learning rate: 0.00594 
2023-11-23 00:17:07.845215: train_loss -0.3381 
2023-11-23 00:17:07.845401: val_loss -0.3328 
2023-11-23 00:17:07.845484: Pseudo dice [0.6718, nan] 
2023-11-23 00:17:07.845555: Epoch time: 55.69 s 
2023-11-23 00:17:08.908141:  
2023-11-23 00:17:08.908304: Epoch 440 
2023-11-23 00:17:08.908442: Current learning rate: 0.00593 
2023-11-23 00:18:04.624576: train_loss -0.3512 
2023-11-23 00:18:04.624769: val_loss -0.3523 
2023-11-23 00:18:04.624855: Pseudo dice [0.7063, nan] 
2023-11-23 00:18:04.624921: Epoch time: 55.72 s 
2023-11-23 00:18:05.687232:  
2023-11-23 00:18:05.687361: Epoch 441 
2023-11-23 00:18:05.687464: Current learning rate: 0.00592 
2023-11-23 00:19:01.336140: train_loss -0.3382 
2023-11-23 00:19:01.336357: val_loss -0.3501 
2023-11-23 00:19:01.336462: Pseudo dice [0.7071, nan] 
2023-11-23 00:19:01.336552: Epoch time: 55.65 s 
2023-11-23 00:19:02.394801:  
2023-11-23 00:19:02.394936: Epoch 442 
2023-11-23 00:19:02.395031: Current learning rate: 0.00592 
2023-11-23 00:19:58.097304: train_loss -0.3446 
2023-11-23 00:19:58.097519: val_loss -0.3662 
2023-11-23 00:19:58.097595: Pseudo dice [0.7469, nan] 
2023-11-23 00:19:58.097669: Epoch time: 55.7 s 
2023-11-23 00:19:59.145236:  
2023-11-23 00:19:59.145366: Epoch 443 
2023-11-23 00:19:59.145484: Current learning rate: 0.00591 
2023-11-23 00:20:54.971457: train_loss -0.3447 
2023-11-23 00:20:54.971668: val_loss -0.3214 
2023-11-23 00:20:54.971751: Pseudo dice [0.6571, nan] 
2023-11-23 00:20:54.971832: Epoch time: 55.83 s 
2023-11-23 00:20:56.029020:  
2023-11-23 00:20:56.029210: Epoch 444 
2023-11-23 00:20:56.029345: Current learning rate: 0.0059 
2023-11-23 00:21:51.796278: train_loss -0.329 
2023-11-23 00:21:51.796479: val_loss -0.3535 
2023-11-23 00:21:51.796559: Pseudo dice [0.7191, nan] 
2023-11-23 00:21:51.796640: Epoch time: 55.77 s 
2023-11-23 00:21:52.842042:  
2023-11-23 00:21:52.842183: Epoch 445 
2023-11-23 00:21:52.842394: Current learning rate: 0.00589 
2023-11-23 00:22:48.477510: train_loss -0.3383 
2023-11-23 00:22:48.477706: val_loss -0.3518 
2023-11-23 00:22:48.477786: Pseudo dice [0.718, nan] 
2023-11-23 00:22:48.477859: Epoch time: 55.64 s 
2023-11-23 00:22:49.655785:  
2023-11-23 00:22:49.655918: Epoch 446 
2023-11-23 00:22:49.656016: Current learning rate: 0.00588 
2023-11-23 00:23:45.310588: train_loss -0.3478 
2023-11-23 00:23:45.310800: val_loss -0.3671 
2023-11-23 00:23:45.310906: Pseudo dice [0.7428, nan] 
2023-11-23 00:23:45.310982: Epoch time: 55.66 s 
2023-11-23 00:23:46.360984:  
2023-11-23 00:23:46.361105: Epoch 447 
2023-11-23 00:23:46.361200: Current learning rate: 0.00587 
2023-11-23 00:24:42.050491: train_loss -0.3362 
2023-11-23 00:24:42.050702: val_loss -0.3476 
2023-11-23 00:24:42.050789: Pseudo dice [0.7057, nan] 
2023-11-23 00:24:42.050865: Epoch time: 55.69 s 
2023-11-23 00:24:43.098451:  
2023-11-23 00:24:43.098583: Epoch 448 
2023-11-23 00:24:43.098679: Current learning rate: 0.00586 
2023-11-23 00:25:38.727123: train_loss -0.3438 
2023-11-23 00:25:38.727334: val_loss -0.3846 
2023-11-23 00:25:38.727423: Pseudo dice [0.7796, nan] 
2023-11-23 00:25:38.727495: Epoch time: 55.63 s 
2023-11-23 00:25:39.781864:  
2023-11-23 00:25:39.782028: Epoch 449 
2023-11-23 00:25:39.782135: Current learning rate: 0.00585 
2023-11-23 00:26:35.479048: train_loss -0.3576 
2023-11-23 00:26:35.479243: val_loss -0.3636 
2023-11-23 00:26:35.479336: Pseudo dice [0.7379, nan] 
2023-11-23 00:26:35.479414: Epoch time: 55.7 s 
2023-11-23 00:26:36.661160:  
2023-11-23 00:26:36.661308: Epoch 450 
2023-11-23 00:26:36.661414: Current learning rate: 0.00584 
2023-11-23 00:27:32.314874: train_loss -0.3473 
2023-11-23 00:27:32.315080: val_loss -0.3308 
2023-11-23 00:27:32.315165: Pseudo dice [0.6721, nan] 
2023-11-23 00:27:32.315246: Epoch time: 55.65 s 
2023-11-23 00:27:33.486630:  
2023-11-23 00:27:33.486766: Epoch 451 
2023-11-23 00:27:33.486860: Current learning rate: 0.00583 
2023-11-23 00:28:29.080247: train_loss -0.3455 
2023-11-23 00:28:29.080426: val_loss -0.3603 
2023-11-23 00:28:29.080504: Pseudo dice [0.7274, nan] 
2023-11-23 00:28:29.080589: Epoch time: 55.59 s 
2023-11-23 00:28:30.131806:  
2023-11-23 00:28:30.131993: Epoch 452 
2023-11-23 00:28:30.132123: Current learning rate: 0.00582 
2023-11-23 00:29:25.908541: train_loss -0.3533 
2023-11-23 00:29:25.908747: val_loss -0.3491 
2023-11-23 00:29:25.908827: Pseudo dice [0.7019, nan] 
2023-11-23 00:29:25.908898: Epoch time: 55.78 s 
2023-11-23 00:29:26.950589:  
2023-11-23 00:29:26.950719: Epoch 453 
2023-11-23 00:29:26.950816: Current learning rate: 0.00581 
2023-11-23 00:30:22.648329: train_loss -0.3515 
2023-11-23 00:30:22.648549: val_loss -0.3448 
2023-11-23 00:30:22.648752: Pseudo dice [0.7029, nan] 
2023-11-23 00:30:22.648828: Epoch time: 55.7 s 
2023-11-23 00:30:23.694951:  
2023-11-23 00:30:23.695082: Epoch 454 
2023-11-23 00:30:23.695184: Current learning rate: 0.0058 
2023-11-23 00:31:19.297308: train_loss -0.3345 
2023-11-23 00:31:19.297500: val_loss -0.3891 
2023-11-23 00:31:19.297632: Pseudo dice [0.7876, nan] 
2023-11-23 00:31:19.297707: Epoch time: 55.6 s 
2023-11-23 00:31:20.441807:  
2023-11-23 00:31:20.441938: Epoch 455 
2023-11-23 00:31:20.442041: Current learning rate: 0.00579 
2023-11-23 00:32:16.067319: train_loss -0.3504 
2023-11-23 00:32:16.067512: val_loss -0.3687 
2023-11-23 00:32:16.067585: Pseudo dice [0.7515, nan] 
2023-11-23 00:32:16.067650: Epoch time: 55.63 s 
2023-11-23 00:32:17.121583:  
2023-11-23 00:32:17.121770: Epoch 456 
2023-11-23 00:32:17.121887: Current learning rate: 0.00578 
2023-11-23 00:33:12.807119: train_loss -0.3344 
2023-11-23 00:33:12.807299: val_loss -0.3479 
2023-11-23 00:33:12.807379: Pseudo dice [0.7168, nan] 
2023-11-23 00:33:12.807455: Epoch time: 55.69 s 
2023-11-23 00:33:13.852852:  
2023-11-23 00:33:13.853052: Epoch 457 
2023-11-23 00:33:13.853165: Current learning rate: 0.00577 
2023-11-23 00:34:09.525254: train_loss -0.3338 
2023-11-23 00:34:09.525495: val_loss -0.3437 
2023-11-23 00:34:09.525585: Pseudo dice [0.6969, nan] 
2023-11-23 00:34:09.525670: Epoch time: 55.67 s 
2023-11-23 00:34:10.579937:  
2023-11-23 00:34:10.580064: Epoch 458 
2023-11-23 00:34:10.580168: Current learning rate: 0.00576 
2023-11-23 00:35:06.252680: train_loss -0.3328 
2023-11-23 00:35:06.252857: val_loss -0.361 
2023-11-23 00:35:06.252960: Pseudo dice [0.7311, nan] 
2023-11-23 00:35:06.253033: Epoch time: 55.67 s 
2023-11-23 00:35:07.302930:  
2023-11-23 00:35:07.303116: Epoch 459 
2023-11-23 00:35:07.303263: Current learning rate: 0.00575 
2023-11-23 00:36:03.059949: train_loss -0.3557 
2023-11-23 00:36:03.060137: val_loss -0.3338 
2023-11-23 00:36:03.060217: Pseudo dice [0.6833, nan] 
2023-11-23 00:36:03.060289: Epoch time: 55.76 s 
2023-11-23 00:36:04.104643:  
2023-11-23 00:36:04.104775: Epoch 460 
2023-11-23 00:36:04.104869: Current learning rate: 0.00574 
2023-11-23 00:36:59.667517: train_loss -0.3496 
2023-11-23 00:36:59.667723: val_loss -0.3507 
2023-11-23 00:36:59.667816: Pseudo dice [0.715, nan] 
2023-11-23 00:36:59.667898: Epoch time: 55.56 s 
2023-11-23 00:37:00.710575:  
2023-11-23 00:37:00.710795: Epoch 461 
2023-11-23 00:37:00.710935: Current learning rate: 0.00573 
2023-11-23 00:37:56.300298: train_loss -0.3512 
2023-11-23 00:37:56.300492: val_loss -0.3635 
2023-11-23 00:37:56.300577: Pseudo dice [0.7364, nan] 
2023-11-23 00:37:56.300650: Epoch time: 55.59 s 
2023-11-23 00:37:57.358446:  
2023-11-23 00:37:57.358643: Epoch 462 
2023-11-23 00:37:57.358809: Current learning rate: 0.00572 
2023-11-23 00:38:52.978260: train_loss -0.3526 
2023-11-23 00:38:52.978466: val_loss -0.3258 
2023-11-23 00:38:52.978549: Pseudo dice [0.6699, nan] 
2023-11-23 00:38:52.978616: Epoch time: 55.62 s 
2023-11-23 00:38:54.132652:  
2023-11-23 00:38:54.132856: Epoch 463 
2023-11-23 00:38:54.132972: Current learning rate: 0.00571 
2023-11-23 00:39:49.927396: train_loss -0.3459 
2023-11-23 00:39:49.927601: val_loss -0.3623 
2023-11-23 00:39:49.927747: Pseudo dice [0.7406, nan] 
2023-11-23 00:39:49.927823: Epoch time: 55.8 s 
2023-11-23 00:39:50.977252:  
2023-11-23 00:39:50.977418: Epoch 464 
2023-11-23 00:39:50.977572: Current learning rate: 0.0057 
2023-11-23 00:40:46.672998: train_loss -0.3471 
2023-11-23 00:40:46.673195: val_loss -0.3353 
2023-11-23 00:40:46.673275: Pseudo dice [0.6846, nan] 
2023-11-23 00:40:46.673352: Epoch time: 55.7 s 
2023-11-23 00:40:47.717085:  
2023-11-23 00:40:47.717202: Epoch 465 
2023-11-23 00:40:47.717331: Current learning rate: 0.0057 
2023-11-23 00:41:43.595680: train_loss -0.3518 
2023-11-23 00:41:43.595892: val_loss -0.3528 
2023-11-23 00:41:43.595968: Pseudo dice [0.7161, nan] 
2023-11-23 00:41:43.596047: Epoch time: 55.88 s 
2023-11-23 00:41:44.640838:  
2023-11-23 00:41:44.640971: Epoch 466 
2023-11-23 00:41:44.641071: Current learning rate: 0.00569 
2023-11-23 00:42:40.165785: train_loss -0.3516 
2023-11-23 00:42:40.165997: val_loss -0.3515 
2023-11-23 00:42:40.166191: Pseudo dice [0.7065, nan] 
2023-11-23 00:42:40.166265: Epoch time: 55.53 s 
2023-11-23 00:42:41.314856:  
2023-11-23 00:42:41.314988: Epoch 467 
2023-11-23 00:42:41.315087: Current learning rate: 0.00568 
2023-11-23 00:43:37.000154: train_loss -0.3444 
2023-11-23 00:43:37.000346: val_loss -0.3429 
2023-11-23 00:43:37.000427: Pseudo dice [0.6986, nan] 
2023-11-23 00:43:37.000508: Epoch time: 55.69 s 
2023-11-23 00:43:38.050385:  
2023-11-23 00:43:38.050560: Epoch 468 
2023-11-23 00:43:38.050712: Current learning rate: 0.00567 
2023-11-23 00:44:33.811472: train_loss -0.3458 
2023-11-23 00:44:33.811660: val_loss -0.3478 
2023-11-23 00:44:33.811741: Pseudo dice [0.708, nan] 
2023-11-23 00:44:33.811814: Epoch time: 55.76 s 
2023-11-23 00:44:34.861378:  
2023-11-23 00:44:34.861504: Epoch 469 
2023-11-23 00:44:34.861615: Current learning rate: 0.00566 
2023-11-23 00:45:30.573470: train_loss -0.3531 
2023-11-23 00:45:30.573702: val_loss -0.3489 
2023-11-23 00:45:30.573785: Pseudo dice [0.7054, nan] 
2023-11-23 00:45:30.573857: Epoch time: 55.71 s 
2023-11-23 00:45:31.637985:  
2023-11-23 00:45:31.638271: Epoch 470 
2023-11-23 00:45:31.638376: Current learning rate: 0.00565 
2023-11-23 00:46:27.283902: train_loss -0.3489 
2023-11-23 00:46:27.284110: val_loss -0.3416 
2023-11-23 00:46:27.284209: Pseudo dice [0.6925, nan] 
2023-11-23 00:46:27.284282: Epoch time: 55.65 s 
2023-11-23 00:46:28.432460:  
2023-11-23 00:46:28.432588: Epoch 471 
2023-11-23 00:46:28.432691: Current learning rate: 0.00564 
2023-11-23 00:47:24.007910: train_loss -0.3458 
2023-11-23 00:47:24.008101: val_loss -0.3544 
2023-11-23 00:47:24.008192: Pseudo dice [0.7236, nan] 
2023-11-23 00:47:24.008271: Epoch time: 55.58 s 
2023-11-23 00:47:25.055791:  
2023-11-23 00:47:25.055968: Epoch 472 
2023-11-23 00:47:25.056113: Current learning rate: 0.00563 
2023-11-23 00:48:20.662055: train_loss -0.3209 
2023-11-23 00:48:20.662248: val_loss -0.3633 
2023-11-23 00:48:20.662332: Pseudo dice [0.7339, nan] 
2023-11-23 00:48:20.662406: Epoch time: 55.61 s 
2023-11-23 00:48:21.709557:  
2023-11-23 00:48:21.709755: Epoch 473 
2023-11-23 00:48:21.709862: Current learning rate: 0.00562 
2023-11-23 00:49:17.423849: train_loss -0.3362 
2023-11-23 00:49:17.424058: val_loss -0.3264 
2023-11-23 00:49:17.424144: Pseudo dice [0.6767, nan] 
2023-11-23 00:49:17.424220: Epoch time: 55.72 s 
2023-11-23 00:49:18.470509:  
2023-11-23 00:49:18.470696: Epoch 474 
2023-11-23 00:49:18.470819: Current learning rate: 0.00561 
2023-11-23 00:50:14.134845: train_loss -0.3503 
2023-11-23 00:50:14.135055: val_loss -0.3683 
2023-11-23 00:50:14.135139: Pseudo dice [0.7504, nan] 
2023-11-23 00:50:14.135214: Epoch time: 55.67 s 
2023-11-23 00:50:15.180508:  
2023-11-23 00:50:15.180841: Epoch 475 
2023-11-23 00:50:15.180949: Current learning rate: 0.0056 
2023-11-23 00:51:10.838255: train_loss -0.3418 
2023-11-23 00:51:10.838448: val_loss -0.3211 
2023-11-23 00:51:10.838537: Pseudo dice [0.6623, nan] 
2023-11-23 00:51:10.838610: Epoch time: 55.66 s 
2023-11-23 00:51:11.891724:  
2023-11-23 00:51:11.891866: Epoch 476 
2023-11-23 00:51:11.891997: Current learning rate: 0.00559 
2023-11-23 00:52:07.516745: train_loss -0.3611 
2023-11-23 00:52:07.516978: val_loss -0.3518 
2023-11-23 00:52:07.517070: Pseudo dice [0.7225, nan] 
2023-11-23 00:52:07.517161: Epoch time: 55.63 s 
2023-11-23 00:52:08.569644:  
2023-11-23 00:52:08.569785: Epoch 477 
2023-11-23 00:52:08.569885: Current learning rate: 0.00558 
2023-11-23 00:53:04.088834: train_loss -0.3532 
2023-11-23 00:53:04.089059: val_loss -0.3769 
2023-11-23 00:53:04.089158: Pseudo dice [0.758, nan] 
2023-11-23 00:53:04.089284: Epoch time: 55.52 s 
2023-11-23 00:53:05.166204:  
2023-11-23 00:53:05.166394: Epoch 478 
2023-11-23 00:53:05.166545: Current learning rate: 0.00557 
2023-11-23 00:54:00.828382: train_loss -0.3589 
2023-11-23 00:54:00.828606: val_loss -0.3688 
2023-11-23 00:54:00.828691: Pseudo dice [0.75, nan] 
2023-11-23 00:54:00.828762: Epoch time: 55.66 s 
2023-11-23 00:54:01.904517:  
2023-11-23 00:54:01.904657: Epoch 479 
2023-11-23 00:54:01.904780: Current learning rate: 0.00556 
2023-11-23 00:54:57.601818: train_loss -0.3532 
2023-11-23 00:54:57.602033: val_loss -0.355 
2023-11-23 00:54:57.602113: Pseudo dice [0.7247, nan] 
2023-11-23 00:54:57.602180: Epoch time: 55.7 s 
2023-11-23 00:54:58.775450:  
2023-11-23 00:54:58.775720: Epoch 480 
2023-11-23 00:54:58.775880: Current learning rate: 0.00555 
2023-11-23 00:55:54.424047: train_loss -0.3504 
2023-11-23 00:55:54.424263: val_loss -0.359 
2023-11-23 00:55:54.424341: Pseudo dice [0.7353, nan] 
2023-11-23 00:55:54.424408: Epoch time: 55.65 s 
2023-11-23 00:55:55.495069:  
2023-11-23 00:55:55.495255: Epoch 481 
2023-11-23 00:55:55.495358: Current learning rate: 0.00554 
2023-11-23 00:56:51.104335: train_loss -0.3435 
2023-11-23 00:56:51.104535: val_loss -0.3553 
2023-11-23 00:56:51.104641: Pseudo dice [0.7244, nan] 
2023-11-23 00:56:51.104755: Epoch time: 55.61 s 
2023-11-23 00:56:52.164364:  
2023-11-23 00:56:52.164500: Epoch 482 
2023-11-23 00:56:52.164615: Current learning rate: 0.00553 
2023-11-23 00:57:47.867263: train_loss -0.3446 
2023-11-23 00:57:47.867461: val_loss -0.3597 
2023-11-23 00:57:47.867543: Pseudo dice [0.7317, nan] 
2023-11-23 00:57:47.867623: Epoch time: 55.7 s 
2023-11-23 00:57:48.931922:  
2023-11-23 00:57:48.932094: Epoch 483 
2023-11-23 00:57:48.932218: Current learning rate: 0.00552 
2023-11-23 00:58:44.374154: train_loss -0.3629 
2023-11-23 00:58:44.374356: val_loss -0.3461 
2023-11-23 00:58:44.374440: Pseudo dice [0.7044, nan] 
2023-11-23 00:58:44.374512: Epoch time: 55.44 s 
2023-11-23 00:58:45.545550:  
2023-11-23 00:58:45.545686: Epoch 484 
2023-11-23 00:58:45.545785: Current learning rate: 0.00551 
2023-11-23 00:59:41.199106: train_loss -0.3463 
2023-11-23 00:59:41.199304: val_loss -0.3461 
2023-11-23 00:59:41.199384: Pseudo dice [0.6899, nan] 
2023-11-23 00:59:41.199460: Epoch time: 55.65 s 
2023-11-23 00:59:42.257638:  
2023-11-23 00:59:42.257809: Epoch 485 
2023-11-23 00:59:42.257911: Current learning rate: 0.0055 
2023-11-23 01:00:37.821578: train_loss -0.345 
2023-11-23 01:00:37.821780: val_loss -0.3518 
2023-11-23 01:00:37.821866: Pseudo dice [0.7045, nan] 
2023-11-23 01:00:37.821944: Epoch time: 55.56 s 
2023-11-23 01:00:38.882752:  
2023-11-23 01:00:38.882922: Epoch 486 
2023-11-23 01:00:38.883081: Current learning rate: 0.00549 
2023-11-23 01:01:34.524509: train_loss -0.3553 
2023-11-23 01:01:34.524722: val_loss -0.3704 
2023-11-23 01:01:34.524804: Pseudo dice [0.7526, nan] 
2023-11-23 01:01:34.524881: Epoch time: 55.64 s 
2023-11-23 01:01:35.597422:  
2023-11-23 01:01:35.597652: Epoch 487 
2023-11-23 01:01:35.597781: Current learning rate: 0.00548 
2023-11-23 01:02:31.041383: train_loss -0.3547 
2023-11-23 01:02:31.041575: val_loss -0.3496 
2023-11-23 01:02:31.041654: Pseudo dice [0.7164, nan] 
2023-11-23 01:02:31.041731: Epoch time: 55.44 s 
2023-11-23 01:02:32.207726:  
2023-11-23 01:02:32.207863: Epoch 488 
2023-11-23 01:02:32.207985: Current learning rate: 0.00547 
2023-11-23 01:03:27.811368: train_loss -0.3456 
2023-11-23 01:03:27.811562: val_loss -0.3668 
2023-11-23 01:03:27.811642: Pseudo dice [0.7421, nan] 
2023-11-23 01:03:27.811713: Epoch time: 55.6 s 
2023-11-23 01:03:28.880574:  
2023-11-23 01:03:28.880785: Epoch 489 
2023-11-23 01:03:28.880900: Current learning rate: 0.00546 
2023-11-23 01:04:24.432944: train_loss -0.3415 
2023-11-23 01:04:24.433123: val_loss -0.3486 
2023-11-23 01:04:24.433200: Pseudo dice [0.7149, nan] 
2023-11-23 01:04:24.433269: Epoch time: 55.55 s 
2023-11-23 01:04:25.497192:  
2023-11-23 01:04:25.497382: Epoch 490 
2023-11-23 01:04:25.497552: Current learning rate: 0.00546 
2023-11-23 01:05:21.219028: train_loss -0.3437 
2023-11-23 01:05:21.219296: val_loss -0.3604 
2023-11-23 01:05:21.219382: Pseudo dice [0.7325, nan] 
2023-11-23 01:05:21.219455: Epoch time: 55.72 s 
2023-11-23 01:05:22.300123:  
2023-11-23 01:05:22.300253: Epoch 491 
2023-11-23 01:05:22.300377: Current learning rate: 0.00545 
2023-11-23 01:06:17.910701: train_loss -0.3506 
2023-11-23 01:06:17.910871: val_loss -0.369 
2023-11-23 01:06:17.910948: Pseudo dice [0.7394, nan] 
2023-11-23 01:06:17.911020: Epoch time: 55.61 s 
2023-11-23 01:06:19.074451:  
2023-11-23 01:06:19.074645: Epoch 492 
2023-11-23 01:06:19.074774: Current learning rate: 0.00544 
2023-11-23 01:07:14.799348: train_loss -0.3411 
2023-11-23 01:07:14.799508: val_loss -0.3516 
2023-11-23 01:07:14.799585: Pseudo dice [0.7149, nan] 
2023-11-23 01:07:14.799656: Epoch time: 55.73 s 
2023-11-23 01:07:15.856799:  
2023-11-23 01:07:15.856923: Epoch 493 
2023-11-23 01:07:15.857064: Current learning rate: 0.00543 
2023-11-23 01:08:11.582967: train_loss -0.345 
2023-11-23 01:08:11.583169: val_loss -0.3417 
2023-11-23 01:08:11.583262: Pseudo dice [0.6903, nan] 
2023-11-23 01:08:11.583345: Epoch time: 55.73 s 
2023-11-23 01:08:12.660333:  
2023-11-23 01:08:12.660521: Epoch 494 
2023-11-23 01:08:12.660701: Current learning rate: 0.00542 
2023-11-23 01:09:08.406360: train_loss -0.3296 
2023-11-23 01:09:08.406583: val_loss -0.3347 
2023-11-23 01:09:08.406686: Pseudo dice [0.6809, nan] 
2023-11-23 01:09:08.406772: Epoch time: 55.75 s 
2023-11-23 01:09:09.467269:  
2023-11-23 01:09:09.467403: Epoch 495 
2023-11-23 01:09:09.467531: Current learning rate: 0.00541 
2023-11-23 01:10:04.991261: train_loss -0.3451 
2023-11-23 01:10:04.991471: val_loss -0.3333 
2023-11-23 01:10:04.991555: Pseudo dice [0.6791, nan] 
2023-11-23 01:10:04.991631: Epoch time: 55.52 s 
2023-11-23 01:10:06.167628:  
2023-11-23 01:10:06.167755: Epoch 496 
2023-11-23 01:10:06.167857: Current learning rate: 0.0054 
2023-11-23 01:11:01.872865: train_loss -0.3334 
2023-11-23 01:11:01.873060: val_loss -0.3336 
2023-11-23 01:11:01.873136: Pseudo dice [0.678, nan] 
2023-11-23 01:11:01.873205: Epoch time: 55.71 s 
2023-11-23 01:11:02.931204:  
2023-11-23 01:11:02.931339: Epoch 497 
2023-11-23 01:11:02.931444: Current learning rate: 0.00539 
2023-11-23 01:11:58.525321: train_loss -0.3329 
2023-11-23 01:11:58.525518: val_loss -0.3329 
2023-11-23 01:11:58.525598: Pseudo dice [0.676, nan] 
2023-11-23 01:11:58.525669: Epoch time: 55.59 s 
2023-11-23 01:11:59.592139:  
2023-11-23 01:11:59.592311: Epoch 498 
2023-11-23 01:11:59.592454: Current learning rate: 0.00538 
2023-11-23 01:12:55.288278: train_loss -0.3494 
2023-11-23 01:12:55.288496: val_loss -0.3552 
2023-11-23 01:12:55.288606: Pseudo dice [0.7131, nan] 
2023-11-23 01:12:55.288698: Epoch time: 55.7 s 
2023-11-23 01:12:56.353897:  
2023-11-23 01:12:56.354042: Epoch 499 
2023-11-23 01:12:56.354164: Current learning rate: 0.00537 
2023-11-23 01:13:52.135041: train_loss -0.3524 
2023-11-23 01:13:52.135256: val_loss -0.363 
2023-11-23 01:13:52.135376: Pseudo dice [0.7391, nan] 
2023-11-23 01:13:52.135471: Epoch time: 55.78 s 
2023-11-23 01:13:53.323748:  
2023-11-23 01:13:53.323928: Epoch 500 
2023-11-23 01:13:53.324054: Current learning rate: 0.00536 
2023-11-23 01:14:48.917648: train_loss -0.3455 
2023-11-23 01:14:48.917830: val_loss -0.3304 
2023-11-23 01:14:48.917909: Pseudo dice [0.6767, nan] 
2023-11-23 01:14:48.917980: Epoch time: 55.59 s 
2023-11-23 01:14:49.995452:  
2023-11-23 01:14:49.995613: Epoch 501 
2023-11-23 01:14:49.995753: Current learning rate: 0.00535 
2023-11-23 01:15:45.638090: train_loss -0.344 
2023-11-23 01:15:45.638312: val_loss -0.3527 
2023-11-23 01:15:45.638400: Pseudo dice [0.714, nan] 
2023-11-23 01:15:45.638482: Epoch time: 55.64 s 
2023-11-23 01:15:46.700849:  
2023-11-23 01:15:46.701098: Epoch 502 
2023-11-23 01:15:46.701247: Current learning rate: 0.00534 
2023-11-23 01:16:42.479060: train_loss -0.3328 
2023-11-23 01:16:42.479260: val_loss -0.336 
2023-11-23 01:16:42.479353: Pseudo dice [0.6954, nan] 
2023-11-23 01:16:42.479432: Epoch time: 55.78 s 
2023-11-23 01:16:43.542167:  
2023-11-23 01:16:43.542298: Epoch 503 
2023-11-23 01:16:43.542402: Current learning rate: 0.00533 
2023-11-23 01:17:39.327228: train_loss -0.3305 
2023-11-23 01:17:39.327418: val_loss -0.3463 
2023-11-23 01:17:39.327498: Pseudo dice [0.7097, nan] 
2023-11-23 01:17:39.327580: Epoch time: 55.79 s 
2023-11-23 01:17:40.424093:  
2023-11-23 01:17:40.424296: Epoch 504 
2023-11-23 01:17:40.424413: Current learning rate: 0.00532 
2023-11-23 01:18:35.968264: train_loss -0.3365 
2023-11-23 01:18:35.968449: val_loss -0.3413 
2023-11-23 01:18:35.968529: Pseudo dice [0.691, nan] 
2023-11-23 01:18:35.968626: Epoch time: 55.54 s 
2023-11-23 01:18:37.023107:  
2023-11-23 01:18:37.023235: Epoch 505 
2023-11-23 01:18:37.023341: Current learning rate: 0.00531 
2023-11-23 01:19:32.565592: train_loss -0.3569 
2023-11-23 01:19:32.565811: val_loss -0.351 
2023-11-23 01:19:32.565900: Pseudo dice [0.7152, nan] 
2023-11-23 01:19:32.565976: Epoch time: 55.54 s 
2023-11-23 01:19:33.625005:  
2023-11-23 01:19:33.625130: Epoch 506 
2023-11-23 01:19:33.625242: Current learning rate: 0.0053 
2023-11-23 01:20:29.228055: train_loss -0.3517 
2023-11-23 01:20:29.228240: val_loss -0.3444 
2023-11-23 01:20:29.228324: Pseudo dice [0.6957, nan] 
2023-11-23 01:20:29.228396: Epoch time: 55.6 s 
2023-11-23 01:20:30.294826:  
2023-11-23 01:20:30.294946: Epoch 507 
2023-11-23 01:20:30.295074: Current learning rate: 0.00529 
2023-11-23 01:21:25.929652: train_loss -0.352 
2023-11-23 01:21:25.929862: val_loss -0.3549 
2023-11-23 01:21:25.929986: Pseudo dice [0.7213, nan] 
2023-11-23 01:21:25.930066: Epoch time: 55.64 s 
2023-11-23 01:21:26.992855:  
2023-11-23 01:21:26.992988: Epoch 508 
2023-11-23 01:21:26.993113: Current learning rate: 0.00528 
2023-11-23 01:22:22.604757: train_loss -0.3574 
2023-11-23 01:22:22.604994: val_loss -0.3655 
2023-11-23 01:22:22.605080: Pseudo dice [0.7402, nan] 
2023-11-23 01:22:22.605161: Epoch time: 55.61 s 
2023-11-23 01:22:23.666175:  
2023-11-23 01:22:23.666301: Epoch 509 
2023-11-23 01:22:23.666430: Current learning rate: 0.00527 
2023-11-23 01:23:19.444424: train_loss -0.3519 
2023-11-23 01:23:19.444649: val_loss -0.3214 
2023-11-23 01:23:19.444740: Pseudo dice [0.6468, nan] 
2023-11-23 01:23:19.444822: Epoch time: 55.78 s 
2023-11-23 01:23:20.520628:  
2023-11-23 01:23:20.520856: Epoch 510 
2023-11-23 01:23:20.520962: Current learning rate: 0.00526 
2023-11-23 01:24:16.187364: train_loss -0.3508 
2023-11-23 01:24:16.187556: val_loss -0.3558 
2023-11-23 01:24:16.187639: Pseudo dice [0.7124, nan] 
2023-11-23 01:24:16.187711: Epoch time: 55.67 s 
2023-11-23 01:24:17.245511:  
2023-11-23 01:24:17.245634: Epoch 511 
2023-11-23 01:24:17.245764: Current learning rate: 0.00525 
2023-11-23 01:25:12.871962: train_loss -0.339 
2023-11-23 01:25:12.872162: val_loss -0.3471 
2023-11-23 01:25:12.872247: Pseudo dice [0.7042, nan] 
2023-11-23 01:25:12.872324: Epoch time: 55.63 s 
2023-11-23 01:25:14.033134:  
2023-11-23 01:25:14.033380: Epoch 512 
2023-11-23 01:25:14.033575: Current learning rate: 0.00524 
2023-11-23 01:26:09.762522: train_loss -0.341 
2023-11-23 01:26:09.762702: val_loss -0.3468 
2023-11-23 01:26:09.762785: Pseudo dice [0.7029, nan] 
2023-11-23 01:26:09.762864: Epoch time: 55.73 s 
2023-11-23 01:26:10.842362:  
2023-11-23 01:26:10.842546: Epoch 513 
2023-11-23 01:26:10.842685: Current learning rate: 0.00523 
2023-11-23 01:27:06.420126: train_loss -0.3342 
2023-11-23 01:27:06.420322: val_loss -0.3617 
2023-11-23 01:27:06.420406: Pseudo dice [0.729, nan] 
2023-11-23 01:27:06.420483: Epoch time: 55.58 s 
2023-11-23 01:27:07.490518:  
2023-11-23 01:27:07.490791: Epoch 514 
2023-11-23 01:27:07.490951: Current learning rate: 0.00522 
2023-11-23 01:28:03.166917: train_loss -0.3393 
2023-11-23 01:28:03.167113: val_loss -0.3327 
2023-11-23 01:28:03.167194: Pseudo dice [0.6756, nan] 
2023-11-23 01:28:03.167270: Epoch time: 55.68 s 
2023-11-23 01:28:04.240067:  
2023-11-23 01:28:04.240213: Epoch 515 
2023-11-23 01:28:04.240340: Current learning rate: 0.00521 
2023-11-23 01:28:59.931109: train_loss -0.3519 
2023-11-23 01:28:59.931316: val_loss -0.3533 
2023-11-23 01:28:59.931398: Pseudo dice [0.7173, nan] 
2023-11-23 01:28:59.931489: Epoch time: 55.69 s 
2023-11-23 01:29:00.998436:  
2023-11-23 01:29:00.998617: Epoch 516 
2023-11-23 01:29:00.998775: Current learning rate: 0.0052 
2023-11-23 01:29:56.709850: train_loss -0.338 
2023-11-23 01:29:56.710046: val_loss -0.3287 
2023-11-23 01:29:56.710127: Pseudo dice [0.6772, nan] 
2023-11-23 01:29:56.710199: Epoch time: 55.71 s 
2023-11-23 01:29:57.772897:  
2023-11-23 01:29:57.773062: Epoch 517 
2023-11-23 01:29:57.773218: Current learning rate: 0.00519 
2023-11-23 01:30:53.407265: train_loss -0.3442 
2023-11-23 01:30:53.407465: val_loss -0.3623 
2023-11-23 01:30:53.407550: Pseudo dice [0.7376, nan] 
2023-11-23 01:30:53.407625: Epoch time: 55.64 s 
2023-11-23 01:30:54.483802:  
2023-11-23 01:30:54.483926: Epoch 518 
2023-11-23 01:30:54.484048: Current learning rate: 0.00518 
2023-11-23 01:31:50.137074: train_loss -0.348 
2023-11-23 01:31:50.137262: val_loss -0.3343 
2023-11-23 01:31:50.137410: Pseudo dice [0.6876, nan] 
2023-11-23 01:31:50.137518: Epoch time: 55.65 s 
2023-11-23 01:31:51.212566:  
2023-11-23 01:31:51.212693: Epoch 519 
2023-11-23 01:31:51.212799: Current learning rate: 0.00518 
2023-11-23 01:32:46.879439: train_loss -0.3464 
2023-11-23 01:32:46.879707: val_loss -0.352 
2023-11-23 01:32:46.879827: Pseudo dice [0.7149, nan] 
2023-11-23 01:32:46.879926: Epoch time: 55.67 s 
2023-11-23 01:32:47.965384:  
2023-11-23 01:32:47.965512: Epoch 520 
2023-11-23 01:32:47.965637: Current learning rate: 0.00517 
2023-11-23 01:33:43.650570: train_loss -0.3476 
2023-11-23 01:33:43.650744: val_loss -0.3645 
2023-11-23 01:33:43.650849: Pseudo dice [0.7426, nan] 
2023-11-23 01:33:43.650923: Epoch time: 55.69 s 
2023-11-23 01:33:44.715478:  
2023-11-23 01:33:44.715822: Epoch 521 
2023-11-23 01:33:44.715997: Current learning rate: 0.00516 
2023-11-23 01:34:40.378389: train_loss -0.3445 
2023-11-23 01:34:40.378607: val_loss -0.3421 
2023-11-23 01:34:40.378687: Pseudo dice [0.6893, nan] 
2023-11-23 01:34:40.378761: Epoch time: 55.66 s 
2023-11-23 01:34:41.453343:  
2023-11-23 01:34:41.453492: Epoch 522 
2023-11-23 01:34:41.453614: Current learning rate: 0.00515 
2023-11-23 01:35:37.108918: train_loss -0.3347 
2023-11-23 01:35:37.109141: val_loss -0.3314 
2023-11-23 01:35:37.109227: Pseudo dice [0.6693, nan] 
2023-11-23 01:35:37.109305: Epoch time: 55.66 s 
2023-11-23 01:35:38.181741:  
2023-11-23 01:35:38.181908: Epoch 523 
2023-11-23 01:35:38.182054: Current learning rate: 0.00514 
2023-11-23 01:36:33.802402: train_loss -0.3454 
2023-11-23 01:36:33.802591: val_loss -0.3653 
2023-11-23 01:36:33.802668: Pseudo dice [0.7429, nan] 
2023-11-23 01:36:33.802740: Epoch time: 55.62 s 
2023-11-23 01:36:34.967139:  
2023-11-23 01:36:34.967281: Epoch 524 
2023-11-23 01:36:34.967389: Current learning rate: 0.00513 
2023-11-23 01:37:30.716302: train_loss -0.3601 
2023-11-23 01:37:30.716499: val_loss -0.3843 
2023-11-23 01:37:30.716605: Pseudo dice [0.7787, nan] 
2023-11-23 01:37:30.716683: Epoch time: 55.75 s 
2023-11-23 01:37:31.796302:  
2023-11-23 01:37:31.796438: Epoch 525 
2023-11-23 01:37:31.796581: Current learning rate: 0.00512 
2023-11-23 01:38:27.581638: train_loss -0.3514 
2023-11-23 01:38:27.581828: val_loss -0.3466 
2023-11-23 01:38:27.581911: Pseudo dice [0.7055, nan] 
2023-11-23 01:38:27.581985: Epoch time: 55.79 s 
2023-11-23 01:38:28.655443:  
2023-11-23 01:38:28.655646: Epoch 526 
2023-11-23 01:38:28.655793: Current learning rate: 0.00511 
2023-11-23 01:39:24.344010: train_loss -0.3336 
2023-11-23 01:39:24.344222: val_loss -0.3346 
2023-11-23 01:39:24.344305: Pseudo dice [0.6736, nan] 
2023-11-23 01:39:24.344380: Epoch time: 55.69 s 
2023-11-23 01:39:25.403659:  
2023-11-23 01:39:25.403772: Epoch 527 
2023-11-23 01:39:25.403893: Current learning rate: 0.0051 
2023-11-23 01:40:21.024858: train_loss -0.3533 
2023-11-23 01:40:21.025079: val_loss -0.3639 
2023-11-23 01:40:21.025164: Pseudo dice [0.7454, nan] 
2023-11-23 01:40:21.025237: Epoch time: 55.62 s 
2023-11-23 01:40:22.191572:  
2023-11-23 01:40:22.191710: Epoch 528 
2023-11-23 01:40:22.191818: Current learning rate: 0.00509 
2023-11-23 01:41:17.750643: train_loss -0.3378 
2023-11-23 01:41:17.750845: val_loss -0.3551 
2023-11-23 01:41:17.750943: Pseudo dice [0.7251, nan] 
2023-11-23 01:41:17.751019: Epoch time: 55.56 s 
2023-11-23 01:41:18.821683:  
2023-11-23 01:41:18.821846: Epoch 529 
2023-11-23 01:41:18.821996: Current learning rate: 0.00508 
2023-11-23 01:42:14.561513: train_loss -0.346 
2023-11-23 01:42:14.561712: val_loss -0.3458 
2023-11-23 01:42:14.561793: Pseudo dice [0.7006, nan] 
2023-11-23 01:42:14.561862: Epoch time: 55.74 s 
2023-11-23 01:42:15.623793:  
2023-11-23 01:42:15.623997: Epoch 530 
2023-11-23 01:42:15.624178: Current learning rate: 0.00507 
2023-11-23 01:43:11.286282: train_loss -0.3437 
2023-11-23 01:43:11.286476: val_loss -0.3408 
2023-11-23 01:43:11.286592: Pseudo dice [0.6987, nan] 
2023-11-23 01:43:11.286669: Epoch time: 55.66 s 
2023-11-23 01:43:12.355104:  
2023-11-23 01:43:12.355367: Epoch 531 
2023-11-23 01:43:12.355525: Current learning rate: 0.00506 
2023-11-23 01:44:07.967287: train_loss -0.345 
2023-11-23 01:44:07.967463: val_loss -0.3643 
2023-11-23 01:44:07.967545: Pseudo dice [0.7425, nan] 
2023-11-23 01:44:07.967618: Epoch time: 55.61 s 
2023-11-23 01:44:09.148162:  
2023-11-23 01:44:09.148284: Epoch 532 
2023-11-23 01:44:09.148388: Current learning rate: 0.00505 
2023-11-23 01:45:04.837521: train_loss -0.3512 
2023-11-23 01:45:04.837718: val_loss -0.3546 
2023-11-23 01:45:04.837799: Pseudo dice [0.7197, nan] 
2023-11-23 01:45:04.837872: Epoch time: 55.69 s 
2023-11-23 01:45:05.902454:  
2023-11-23 01:45:05.902583: Epoch 533 
2023-11-23 01:45:05.902702: Current learning rate: 0.00504 
2023-11-23 01:46:01.620964: train_loss -0.3477 
2023-11-23 01:46:01.621176: val_loss -0.3544 
2023-11-23 01:46:01.621261: Pseudo dice [0.718, nan] 
2023-11-23 01:46:01.621333: Epoch time: 55.72 s 
2023-11-23 01:46:02.687794:  
2023-11-23 01:46:02.687913: Epoch 534 
2023-11-23 01:46:02.688011: Current learning rate: 0.00503 
2023-11-23 01:46:58.381689: train_loss -0.3486 
2023-11-23 01:46:58.381897: val_loss -0.349 
2023-11-23 01:46:58.381982: Pseudo dice [0.7128, nan] 
2023-11-23 01:46:58.382055: Epoch time: 55.69 s 
2023-11-23 01:46:59.438597:  
2023-11-23 01:46:59.438853: Epoch 535 
2023-11-23 01:46:59.438979: Current learning rate: 0.00502 
2023-11-23 01:47:55.055035: train_loss -0.3535 
2023-11-23 01:47:55.055237: val_loss -0.3589 
2023-11-23 01:47:55.055315: Pseudo dice [0.7285, nan] 
2023-11-23 01:47:55.055393: Epoch time: 55.62 s 
2023-11-23 01:47:56.228037:  
2023-11-23 01:47:56.228287: Epoch 536 
2023-11-23 01:47:56.228459: Current learning rate: 0.00501 
2023-11-23 01:48:51.794736: train_loss -0.3571 
2023-11-23 01:48:51.794922: val_loss -0.3552 
2023-11-23 01:48:51.795003: Pseudo dice [0.7215, nan] 
2023-11-23 01:48:51.795082: Epoch time: 55.57 s 
2023-11-23 01:48:52.859052:  
2023-11-23 01:48:52.859196: Epoch 537 
2023-11-23 01:48:52.859300: Current learning rate: 0.005 
2023-11-23 01:49:48.634561: train_loss -0.3416 
2023-11-23 01:49:48.634773: val_loss -0.3607 
2023-11-23 01:49:48.634861: Pseudo dice [0.7361, nan] 
2023-11-23 01:49:48.634937: Epoch time: 55.78 s 
2023-11-23 01:49:49.705154:  
2023-11-23 01:49:49.705287: Epoch 538 
2023-11-23 01:49:49.705419: Current learning rate: 0.00499 
2023-11-23 01:50:45.334621: train_loss -0.3451 
2023-11-23 01:50:45.334844: val_loss -0.3387 
2023-11-23 01:50:45.334928: Pseudo dice [0.6989, nan] 
2023-11-23 01:50:45.335005: Epoch time: 55.63 s 
2023-11-23 01:50:46.402081:  
2023-11-23 01:50:46.402435: Epoch 539 
2023-11-23 01:50:46.402632: Current learning rate: 0.00498 
2023-11-23 01:51:42.031500: train_loss -0.3338 
2023-11-23 01:51:42.031714: val_loss -0.3434 
2023-11-23 01:51:42.031790: Pseudo dice [0.6982, nan] 
2023-11-23 01:51:42.031858: Epoch time: 55.63 s 
2023-11-23 01:51:43.204544:  
2023-11-23 01:51:43.204734: Epoch 540 
2023-11-23 01:51:43.204845: Current learning rate: 0.00497 
2023-11-23 01:52:38.850043: train_loss -0.3484 
2023-11-23 01:52:38.850262: val_loss -0.37 
2023-11-23 01:52:38.850341: Pseudo dice [0.7529, nan] 
2023-11-23 01:52:38.850411: Epoch time: 55.65 s 
2023-11-23 01:52:39.918798:  
2023-11-23 01:52:39.919019: Epoch 541 
2023-11-23 01:52:39.919210: Current learning rate: 0.00496 
2023-11-23 01:53:35.671165: train_loss -0.344 
2023-11-23 01:53:35.671366: val_loss -0.3416 
2023-11-23 01:53:35.671457: Pseudo dice [0.7006, nan] 
2023-11-23 01:53:35.671526: Epoch time: 55.75 s 
2023-11-23 01:53:36.737604:  
2023-11-23 01:53:36.737758: Epoch 542 
2023-11-23 01:53:36.737871: Current learning rate: 0.00495 
2023-11-23 01:54:32.413308: train_loss -0.3554 
2023-11-23 01:54:32.413494: val_loss -0.3419 
2023-11-23 01:54:32.413577: Pseudo dice [0.6945, nan] 
2023-11-23 01:54:32.413650: Epoch time: 55.68 s 
2023-11-23 01:54:33.479404:  
2023-11-23 01:54:33.479528: Epoch 543 
2023-11-23 01:54:33.479625: Current learning rate: 0.00494 
2023-11-23 01:55:29.044878: train_loss -0.347 
2023-11-23 01:55:29.045078: val_loss -0.3613 
2023-11-23 01:55:29.045157: Pseudo dice [0.7375, nan] 
2023-11-23 01:55:29.045232: Epoch time: 55.57 s 
2023-11-23 01:55:30.213565:  
2023-11-23 01:55:30.213726: Epoch 544 
2023-11-23 01:55:30.213902: Current learning rate: 0.00493 
2023-11-23 01:56:25.905957: train_loss -0.3419 
2023-11-23 01:56:25.906174: val_loss -0.3364 
2023-11-23 01:56:25.906260: Pseudo dice [0.688, nan] 
2023-11-23 01:56:25.906334: Epoch time: 55.69 s 
2023-11-23 01:56:26.968513:  
2023-11-23 01:56:26.968649: Epoch 545 
2023-11-23 01:56:26.968745: Current learning rate: 0.00492 
2023-11-23 01:57:22.744774: train_loss -0.3553 
2023-11-23 01:57:22.744973: val_loss -0.3768 
2023-11-23 01:57:22.745056: Pseudo dice [0.7708, nan] 
2023-11-23 01:57:22.745132: Epoch time: 55.78 s 
2023-11-23 01:57:23.814810:  
2023-11-23 01:57:23.814929: Epoch 546 
2023-11-23 01:57:23.815028: Current learning rate: 0.00491 
2023-11-23 01:58:19.505186: train_loss -0.3594 
2023-11-23 01:58:19.505393: val_loss -0.346 
2023-11-23 01:58:19.505478: Pseudo dice [0.7132, nan] 
2023-11-23 01:58:19.505563: Epoch time: 55.69 s 
2023-11-23 01:58:20.578150:  
2023-11-23 01:58:20.578295: Epoch 547 
2023-11-23 01:58:20.578405: Current learning rate: 0.0049 
2023-11-23 01:59:16.213706: train_loss -0.3418 
2023-11-23 01:59:16.213902: val_loss -0.3522 
2023-11-23 01:59:16.213983: Pseudo dice [0.7168, nan] 
2023-11-23 01:59:16.214054: Epoch time: 55.64 s 
2023-11-23 01:59:17.385769:  
2023-11-23 01:59:17.385925: Epoch 548 
2023-11-23 01:59:17.386034: Current learning rate: 0.00489 
2023-11-23 02:00:13.076376: train_loss -0.3415 
2023-11-23 02:00:13.076637: val_loss -0.349 
2023-11-23 02:00:13.076721: Pseudo dice [0.7149, nan] 
2023-11-23 02:00:13.076798: Epoch time: 55.69 s 
2023-11-23 02:00:14.145538:  
2023-11-23 02:00:14.145681: Epoch 549 
2023-11-23 02:00:14.145782: Current learning rate: 0.00488 
2023-11-23 02:01:09.781413: train_loss -0.3593 
2023-11-23 02:01:09.781626: val_loss -0.3359 
2023-11-23 02:01:09.781720: Pseudo dice [0.6903, nan] 
2023-11-23 02:01:09.781804: Epoch time: 55.64 s 
2023-11-23 02:01:10.967642:  
2023-11-23 02:01:10.967767: Epoch 550 
2023-11-23 02:01:10.967868: Current learning rate: 0.00487 
2023-11-23 02:02:06.648906: train_loss -0.3489 
2023-11-23 02:02:06.649123: val_loss -0.3399 
2023-11-23 02:02:06.649205: Pseudo dice [0.6885, nan] 
2023-11-23 02:02:06.649279: Epoch time: 55.68 s 
2023-11-23 02:02:07.708293:  
2023-11-23 02:02:07.708450: Epoch 551 
2023-11-23 02:02:07.708571: Current learning rate: 0.00486 
2023-11-23 02:03:03.372599: train_loss -0.3336 
2023-11-23 02:03:03.372817: val_loss -0.3595 
2023-11-23 02:03:03.372896: Pseudo dice [0.7334, nan] 
2023-11-23 02:03:03.372968: Epoch time: 55.67 s 
2023-11-23 02:03:04.445718:  
2023-11-23 02:03:04.445868: Epoch 552 
2023-11-23 02:03:04.445973: Current learning rate: 0.00485 
2023-11-23 02:04:00.157645: train_loss -0.3307 
2023-11-23 02:04:00.157818: val_loss -0.3478 
2023-11-23 02:04:00.157911: Pseudo dice [0.6994, nan] 
2023-11-23 02:04:00.157985: Epoch time: 55.71 s 
2023-11-23 02:04:01.345403:  
2023-11-23 02:04:01.345530: Epoch 553 
2023-11-23 02:04:01.345634: Current learning rate: 0.00484 
2023-11-23 02:04:57.044001: train_loss -0.356 
2023-11-23 02:04:57.044210: val_loss -0.3543 
2023-11-23 02:04:57.044288: Pseudo dice [0.719, nan] 
2023-11-23 02:04:57.044359: Epoch time: 55.7 s 
2023-11-23 02:04:58.108248:  
2023-11-23 02:04:58.108394: Epoch 554 
2023-11-23 02:04:58.108500: Current learning rate: 0.00484 
2023-11-23 02:05:53.773457: train_loss -0.3516 
2023-11-23 02:05:53.773698: val_loss -0.3687 
2023-11-23 02:05:53.773786: Pseudo dice [0.7433, nan] 
2023-11-23 02:05:53.773864: Epoch time: 55.67 s 
2023-11-23 02:05:54.833244:  
2023-11-23 02:05:54.833364: Epoch 555 
2023-11-23 02:05:54.833467: Current learning rate: 0.00483 
2023-11-23 02:06:50.437803: train_loss -0.3435 
2023-11-23 02:06:50.438019: val_loss -0.3459 
2023-11-23 02:06:50.438105: Pseudo dice [0.7093, nan] 
2023-11-23 02:06:50.438180: Epoch time: 55.61 s 
2023-11-23 02:06:51.504647:  
2023-11-23 02:06:51.504775: Epoch 556 
2023-11-23 02:06:51.504877: Current learning rate: 0.00482 
2023-11-23 02:07:47.138727: train_loss -0.3448 
2023-11-23 02:07:47.138939: val_loss -0.337 
2023-11-23 02:07:47.139024: Pseudo dice [0.6911, nan] 
2023-11-23 02:07:47.139097: Epoch time: 55.63 s 
2023-11-23 02:07:48.202661:  
2023-11-23 02:07:48.202861: Epoch 557 
2023-11-23 02:07:48.202972: Current learning rate: 0.00481 
2023-11-23 02:08:43.806944: train_loss -0.3523 
2023-11-23 02:08:43.807141: val_loss -0.3798 
2023-11-23 02:08:43.807261: Pseudo dice [0.7685, nan] 
2023-11-23 02:08:43.807335: Epoch time: 55.61 s 
2023-11-23 02:08:44.873482:  
2023-11-23 02:08:44.873626: Epoch 558 
2023-11-23 02:08:44.873738: Current learning rate: 0.0048 
2023-11-23 02:09:40.480703: train_loss -0.3418 
2023-11-23 02:09:40.480914: val_loss -0.3567 
2023-11-23 02:09:40.480993: Pseudo dice [0.7266, nan] 
2023-11-23 02:09:40.481066: Epoch time: 55.61 s 
2023-11-23 02:09:41.552640:  
2023-11-23 02:09:41.552778: Epoch 559 
2023-11-23 02:09:41.552922: Current learning rate: 0.00479 
2023-11-23 02:10:37.173714: train_loss -0.3516 
2023-11-23 02:10:37.173930: val_loss -0.3435 
2023-11-23 02:10:37.174016: Pseudo dice [0.7043, nan] 
2023-11-23 02:10:37.174088: Epoch time: 55.62 s 
2023-11-23 02:10:38.246330:  
2023-11-23 02:10:38.246497: Epoch 560 
2023-11-23 02:10:38.246603: Current learning rate: 0.00478 
2023-11-23 02:11:33.829158: train_loss -0.3514 
2023-11-23 02:11:33.829340: val_loss -0.3545 
2023-11-23 02:11:33.829419: Pseudo dice [0.7206, nan] 
2023-11-23 02:11:33.829489: Epoch time: 55.58 s 
2023-11-23 02:11:34.893717:  
2023-11-23 02:11:34.893848: Epoch 561 
2023-11-23 02:11:34.893944: Current learning rate: 0.00477 
2023-11-23 02:12:30.521212: train_loss -0.3439 
2023-11-23 02:12:30.521412: val_loss -0.3339 
2023-11-23 02:12:30.521489: Pseudo dice [0.6797, nan] 
2023-11-23 02:12:30.521564: Epoch time: 55.63 s 
2023-11-23 02:12:31.594666:  
2023-11-23 02:12:31.594796: Epoch 562 
2023-11-23 02:12:31.594896: Current learning rate: 0.00476 
2023-11-23 02:13:27.358295: train_loss -0.3456 
2023-11-23 02:13:27.358507: val_loss -0.3399 
2023-11-23 02:13:27.358583: Pseudo dice [0.6951, nan] 
2023-11-23 02:13:27.358662: Epoch time: 55.76 s 
2023-11-23 02:13:28.427780:  
2023-11-23 02:13:28.427951: Epoch 563 
2023-11-23 02:13:28.428087: Current learning rate: 0.00475 
2023-11-23 02:14:24.155931: train_loss -0.3634 
2023-11-23 02:14:24.156153: val_loss -0.3558 
2023-11-23 02:14:24.156237: Pseudo dice [0.7171, nan] 
2023-11-23 02:14:24.156312: Epoch time: 55.73 s 
2023-11-23 02:14:25.214396:  
2023-11-23 02:14:25.214521: Epoch 564 
2023-11-23 02:14:25.214648: Current learning rate: 0.00474 
2023-11-23 02:15:20.786661: train_loss -0.3503 
2023-11-23 02:15:20.786865: val_loss -0.3581 
2023-11-23 02:15:20.786947: Pseudo dice [0.7318, nan] 
2023-11-23 02:15:20.787020: Epoch time: 55.57 s 
2023-11-23 02:15:21.947791:  
2023-11-23 02:15:21.947920: Epoch 565 
2023-11-23 02:15:21.948009: Current learning rate: 0.00473 
2023-11-23 02:16:17.693637: train_loss -0.3432 
2023-11-23 02:16:17.693858: val_loss -0.3305 
2023-11-23 02:16:17.693944: Pseudo dice [0.6738, nan] 
2023-11-23 02:16:17.694028: Epoch time: 55.75 s 
2023-11-23 02:16:18.762150:  
2023-11-23 02:16:18.762299: Epoch 566 
2023-11-23 02:16:18.762399: Current learning rate: 0.00472 
2023-11-23 02:17:14.390861: train_loss -0.3478 
2023-11-23 02:17:14.391091: val_loss -0.3517 
2023-11-23 02:17:14.391181: Pseudo dice [0.7098, nan] 
2023-11-23 02:17:14.391276: Epoch time: 55.63 s 
2023-11-23 02:17:15.446408:  
2023-11-23 02:17:15.446546: Epoch 567 
2023-11-23 02:17:15.446671: Current learning rate: 0.00471 
2023-11-23 02:18:11.220316: train_loss -0.3493 
2023-11-23 02:18:11.220521: val_loss -0.3792 
2023-11-23 02:18:11.220641: Pseudo dice [0.7668, nan] 
2023-11-23 02:18:11.220737: Epoch time: 55.77 s 
2023-11-23 02:18:12.297398:  
2023-11-23 02:18:12.297527: Epoch 568 
2023-11-23 02:18:12.297656: Current learning rate: 0.0047 
2023-11-23 02:19:07.941865: train_loss -0.3538 
2023-11-23 02:19:07.942060: val_loss -0.3541 
2023-11-23 02:19:07.942163: Pseudo dice [0.722, nan] 
2023-11-23 02:19:07.942264: Epoch time: 55.65 s 
2023-11-23 02:19:09.114784:  
2023-11-23 02:19:09.114905: Epoch 569 
2023-11-23 02:19:09.115005: Current learning rate: 0.00469 
2023-11-23 02:20:04.811971: train_loss -0.3649 
2023-11-23 02:20:04.812160: val_loss -0.3476 
2023-11-23 02:20:04.812282: Pseudo dice [0.7102, nan] 
2023-11-23 02:20:04.812357: Epoch time: 55.7 s 
2023-11-23 02:20:05.903837:  
2023-11-23 02:20:05.903970: Epoch 570 
2023-11-23 02:20:05.904112: Current learning rate: 0.00468 
2023-11-23 02:21:01.713730: train_loss -0.3377 
2023-11-23 02:21:01.713925: val_loss -0.3487 
2023-11-23 02:21:01.714002: Pseudo dice [0.7064, nan] 
2023-11-23 02:21:01.714075: Epoch time: 55.81 s 
2023-11-23 02:21:02.787752:  
2023-11-23 02:21:02.787888: Epoch 571 
2023-11-23 02:21:02.787997: Current learning rate: 0.00467 
2023-11-23 02:21:58.511375: train_loss -0.3486 
2023-11-23 02:21:58.511554: val_loss -0.3527 
2023-11-23 02:21:58.511633: Pseudo dice [0.7153, nan] 
2023-11-23 02:21:58.511704: Epoch time: 55.72 s 
2023-11-23 02:21:59.582105:  
2023-11-23 02:21:59.582289: Epoch 572 
2023-11-23 02:21:59.582390: Current learning rate: 0.00466 
2023-11-23 02:22:55.164520: train_loss -0.3424 
2023-11-23 02:22:55.164726: val_loss -0.3491 
2023-11-23 02:22:55.164817: Pseudo dice [0.7084, nan] 
2023-11-23 02:22:55.164893: Epoch time: 55.58 s 
2023-11-23 02:22:56.355700:  
2023-11-23 02:22:56.355871: Epoch 573 
2023-11-23 02:22:56.356003: Current learning rate: 0.00465 
2023-11-23 02:23:51.982806: train_loss -0.3673 
2023-11-23 02:23:51.983022: val_loss -0.3783 
2023-11-23 02:23:51.983104: Pseudo dice [0.7763, nan] 
2023-11-23 02:23:51.983180: Epoch time: 55.63 s 
2023-11-23 02:23:53.065892:  
2023-11-23 02:23:53.066019: Epoch 574 
2023-11-23 02:23:53.066138: Current learning rate: 0.00464 
2023-11-23 02:24:48.747928: train_loss -0.3287 
2023-11-23 02:24:48.748122: val_loss -0.3609 
2023-11-23 02:24:48.748204: Pseudo dice [0.7354, nan] 
2023-11-23 02:24:48.748275: Epoch time: 55.68 s 
2023-11-23 02:24:49.825598:  
2023-11-23 02:24:49.825720: Epoch 575 
2023-11-23 02:24:49.825819: Current learning rate: 0.00463 
2023-11-23 02:25:45.492360: train_loss -0.342 
2023-11-23 02:25:45.492546: val_loss -0.3422 
2023-11-23 02:25:45.492650: Pseudo dice [0.6934, nan] 
2023-11-23 02:25:45.492731: Epoch time: 55.67 s 
2023-11-23 02:25:46.574222:  
2023-11-23 02:25:46.574344: Epoch 576 
2023-11-23 02:25:46.574447: Current learning rate: 0.00462 
2023-11-23 02:26:42.219692: train_loss -0.3531 
2023-11-23 02:26:42.219875: val_loss -0.3684 
2023-11-23 02:26:42.219964: Pseudo dice [0.7559, nan] 
2023-11-23 02:26:42.220053: Epoch time: 55.65 s 
2023-11-23 02:26:43.407962:  
2023-11-23 02:26:43.408073: Epoch 577 
2023-11-23 02:26:43.408158: Current learning rate: 0.00461 
2023-11-23 02:27:39.016159: train_loss -0.3553 
2023-11-23 02:27:39.016375: val_loss -0.3385 
2023-11-23 02:27:39.016464: Pseudo dice [0.686, nan] 
2023-11-23 02:27:39.016548: Epoch time: 55.61 s 
2023-11-23 02:27:40.092124:  
2023-11-23 02:27:40.092284: Epoch 578 
2023-11-23 02:27:40.092389: Current learning rate: 0.0046 
2023-11-23 02:28:35.810197: train_loss -0.3458 
2023-11-23 02:28:35.810392: val_loss -0.3542 
2023-11-23 02:28:35.810474: Pseudo dice [0.7208, nan] 
2023-11-23 02:28:35.810546: Epoch time: 55.72 s 
2023-11-23 02:28:36.886273:  
2023-11-23 02:28:36.886466: Epoch 579 
2023-11-23 02:28:36.886608: Current learning rate: 0.00459 
2023-11-23 02:29:32.495194: train_loss -0.3484 
2023-11-23 02:29:32.495403: val_loss -0.3452 
2023-11-23 02:29:32.495485: Pseudo dice [0.7084, nan] 
2023-11-23 02:29:32.495571: Epoch time: 55.61 s 
2023-11-23 02:29:33.580008:  
2023-11-23 02:29:33.580139: Epoch 580 
2023-11-23 02:29:33.580243: Current learning rate: 0.00458 
2023-11-23 02:30:29.198933: train_loss -0.3424 
2023-11-23 02:30:29.199125: val_loss -0.3534 
2023-11-23 02:30:29.199205: Pseudo dice [0.7222, nan] 
2023-11-23 02:30:29.199278: Epoch time: 55.62 s 
2023-11-23 02:30:30.267251:  
2023-11-23 02:30:30.267371: Epoch 581 
2023-11-23 02:30:30.267516: Current learning rate: 0.00457 
2023-11-23 02:31:25.816202: train_loss -0.3382 
2023-11-23 02:31:25.816380: val_loss -0.3606 
2023-11-23 02:31:25.816455: Pseudo dice [0.7355, nan] 
2023-11-23 02:31:25.816533: Epoch time: 55.55 s 
2023-11-23 02:31:27.004579:  
2023-11-23 02:31:27.004715: Epoch 582 
2023-11-23 02:31:27.004843: Current learning rate: 0.00456 
2023-11-23 02:32:22.509792: train_loss -0.3497 
2023-11-23 02:32:22.509992: val_loss -0.3587 
2023-11-23 02:32:22.510098: Pseudo dice [0.7295, nan] 
2023-11-23 02:32:22.510214: Epoch time: 55.51 s 
2023-11-23 02:32:23.597968:  
2023-11-23 02:32:23.598223: Epoch 583 
2023-11-23 02:32:23.598423: Current learning rate: 0.00455 
2023-11-23 02:33:19.312956: train_loss -0.3504 
2023-11-23 02:33:19.313156: val_loss -0.3582 
2023-11-23 02:33:19.313237: Pseudo dice [0.7284, nan] 
2023-11-23 02:33:19.313313: Epoch time: 55.72 s 
2023-11-23 02:33:20.397873:  
2023-11-23 02:33:20.398059: Epoch 584 
2023-11-23 02:33:20.398186: Current learning rate: 0.00454 
2023-11-23 02:34:16.129782: train_loss -0.3513 
2023-11-23 02:34:16.130000: val_loss -0.3465 
2023-11-23 02:34:16.130114: Pseudo dice [0.7081, nan] 
2023-11-23 02:34:16.130190: Epoch time: 55.73 s 
2023-11-23 02:34:17.221764:  
2023-11-23 02:34:17.221936: Epoch 585 
2023-11-23 02:34:17.222062: Current learning rate: 0.00453 
2023-11-23 02:35:12.852846: train_loss -0.3412 
2023-11-23 02:35:12.853063: val_loss -0.3538 
2023-11-23 02:35:12.853149: Pseudo dice [0.7188, nan] 
2023-11-23 02:35:12.853223: Epoch time: 55.63 s 
2023-11-23 02:35:14.051820:  
2023-11-23 02:35:14.051943: Epoch 586 
2023-11-23 02:35:14.052056: Current learning rate: 0.00452 
2023-11-23 02:36:09.767111: train_loss -0.3544 
2023-11-23 02:36:09.767308: val_loss -0.334 
2023-11-23 02:36:09.767389: Pseudo dice [0.6817, nan] 
2023-11-23 02:36:09.767460: Epoch time: 55.72 s 
2023-11-23 02:36:10.849701:  
2023-11-23 02:36:10.849873: Epoch 587 
2023-11-23 02:36:10.849981: Current learning rate: 0.00451 
2023-11-23 02:37:06.560332: train_loss -0.3497 
2023-11-23 02:37:06.560534: val_loss -0.358 
2023-11-23 02:37:06.560630: Pseudo dice [0.7266, nan] 
2023-11-23 02:37:06.560705: Epoch time: 55.71 s 
2023-11-23 02:37:07.656264:  
2023-11-23 02:37:07.656403: Epoch 588 
2023-11-23 02:37:07.656540: Current learning rate: 0.0045 
2023-11-23 02:38:03.439749: train_loss -0.3392 
2023-11-23 02:38:03.439955: val_loss -0.3599 
2023-11-23 02:38:03.440036: Pseudo dice [0.7313, nan] 
2023-11-23 02:38:03.440116: Epoch time: 55.78 s 
2023-11-23 02:38:04.519411:  
2023-11-23 02:38:04.519542: Epoch 589 
2023-11-23 02:38:04.519676: Current learning rate: 0.00449 
2023-11-23 02:39:00.182099: train_loss -0.3386 
2023-11-23 02:39:00.182263: val_loss -0.3496 
2023-11-23 02:39:00.182354: Pseudo dice [0.7037, nan] 
2023-11-23 02:39:00.182428: Epoch time: 55.66 s 
2023-11-23 02:39:01.264907:  
2023-11-23 02:39:01.265092: Epoch 590 
2023-11-23 02:39:01.265209: Current learning rate: 0.00448 
2023-11-23 02:39:56.871936: train_loss -0.3432 
2023-11-23 02:39:56.872137: val_loss -0.3502 
2023-11-23 02:39:56.872234: Pseudo dice [0.7089, nan] 
2023-11-23 02:39:56.872307: Epoch time: 55.61 s 
2023-11-23 02:39:57.956790:  
2023-11-23 02:39:57.957039: Epoch 591 
2023-11-23 02:39:57.957146: Current learning rate: 0.00447 
2023-11-23 02:40:53.709362: train_loss -0.3481 
2023-11-23 02:40:53.709558: val_loss -0.354 
2023-11-23 02:40:53.709672: Pseudo dice [0.7205, nan] 
2023-11-23 02:40:53.709745: Epoch time: 55.75 s 
2023-11-23 02:40:54.797865:  
2023-11-23 02:40:54.798058: Epoch 592 
2023-11-23 02:40:54.798166: Current learning rate: 0.00446 
2023-11-23 02:41:50.532903: train_loss -0.35 
2023-11-23 02:41:50.533072: val_loss -0.3473 
2023-11-23 02:41:50.533144: Pseudo dice [0.7053, nan] 
2023-11-23 02:41:50.533214: Epoch time: 55.74 s 
2023-11-23 02:41:51.616333:  
2023-11-23 02:41:51.616479: Epoch 593 
2023-11-23 02:41:51.616600: Current learning rate: 0.00445 
2023-11-23 02:42:47.267661: train_loss -0.35 
2023-11-23 02:42:47.267877: val_loss -0.3777 
2023-11-23 02:42:47.267958: Pseudo dice [0.7654, nan] 
2023-11-23 02:42:47.268050: Epoch time: 55.65 s 
2023-11-23 02:42:48.361636:  
2023-11-23 02:42:48.361771: Epoch 594 
2023-11-23 02:42:48.361895: Current learning rate: 0.00444 
2023-11-23 02:43:44.157267: train_loss -0.3421 
2023-11-23 02:43:44.157449: val_loss -0.3572 
2023-11-23 02:43:44.157525: Pseudo dice [0.7317, nan] 
2023-11-23 02:43:44.157595: Epoch time: 55.8 s 
2023-11-23 02:43:45.247225:  
2023-11-23 02:43:45.247403: Epoch 595 
2023-11-23 02:43:45.247564: Current learning rate: 0.00443 
2023-11-23 02:44:41.077145: train_loss -0.3502 
2023-11-23 02:44:41.077337: val_loss -0.3407 
2023-11-23 02:44:41.077417: Pseudo dice [0.6863, nan] 
2023-11-23 02:44:41.077489: Epoch time: 55.83 s 
2023-11-23 02:44:42.151586:  
2023-11-23 02:44:42.151765: Epoch 596 
2023-11-23 02:44:42.151926: Current learning rate: 0.00442 
2023-11-23 02:45:37.908931: train_loss -0.3485 
2023-11-23 02:45:37.909122: val_loss -0.3584 
2023-11-23 02:45:37.909203: Pseudo dice [0.7206, nan] 
2023-11-23 02:45:37.909274: Epoch time: 55.76 s 
2023-11-23 02:45:38.996357:  
2023-11-23 02:45:38.996479: Epoch 597 
2023-11-23 02:45:38.996585: Current learning rate: 0.00441 
2023-11-23 02:46:34.665031: train_loss -0.3508 
2023-11-23 02:46:34.665247: val_loss -0.3716 
2023-11-23 02:46:34.665331: Pseudo dice [0.7565, nan] 
2023-11-23 02:46:34.665404: Epoch time: 55.67 s 
2023-11-23 02:46:35.745613:  
2023-11-23 02:46:35.745744: Epoch 598 
2023-11-23 02:46:35.745841: Current learning rate: 0.0044 
2023-11-23 02:47:31.300576: train_loss -0.3535 
2023-11-23 02:47:31.300773: val_loss -0.3504 
2023-11-23 02:47:31.300854: Pseudo dice [0.7084, nan] 
2023-11-23 02:47:31.300927: Epoch time: 55.56 s 
2023-11-23 02:47:32.383313:  
2023-11-23 02:47:32.383441: Epoch 599 
2023-11-23 02:47:32.383540: Current learning rate: 0.00439 
2023-11-23 02:48:28.135085: train_loss -0.3371 
2023-11-23 02:48:28.135275: val_loss -0.3547 
2023-11-23 02:48:28.135352: Pseudo dice [0.7279, nan] 
2023-11-23 02:48:28.135423: Epoch time: 55.75 s 
2023-11-23 02:48:29.332055:  
2023-11-23 02:48:29.332185: Epoch 600 
2023-11-23 02:48:29.332281: Current learning rate: 0.00438 
2023-11-23 02:49:24.997810: train_loss -0.3437 
2023-11-23 02:49:24.998012: val_loss -0.3297 
2023-11-23 02:49:24.998093: Pseudo dice [0.6677, nan] 
2023-11-23 02:49:24.998164: Epoch time: 55.67 s 
2023-11-23 02:49:26.085922:  
2023-11-23 02:49:26.086058: Epoch 601 
2023-11-23 02:49:26.086180: Current learning rate: 0.00437 
2023-11-23 02:50:21.689682: train_loss -0.3471 
2023-11-23 02:50:21.689880: val_loss -0.3349 
2023-11-23 02:50:21.689958: Pseudo dice [0.6825, nan] 
2023-11-23 02:50:21.690030: Epoch time: 55.6 s 
2023-11-23 02:50:22.876086:  
2023-11-23 02:50:22.876241: Epoch 602 
2023-11-23 02:50:22.876389: Current learning rate: 0.00436 
2023-11-23 02:51:18.480449: train_loss -0.3445 
2023-11-23 02:51:18.480660: val_loss -0.3656 
2023-11-23 02:51:18.480743: Pseudo dice [0.7458, nan] 
2023-11-23 02:51:18.480814: Epoch time: 55.61 s 
2023-11-23 02:51:19.568012:  
2023-11-23 02:51:19.568125: Epoch 603 
2023-11-23 02:51:19.568222: Current learning rate: 0.00435 
2023-11-23 02:52:15.178431: train_loss -0.341 
2023-11-23 02:52:15.178627: val_loss -0.3546 
2023-11-23 02:52:15.178741: Pseudo dice [0.7281, nan] 
2023-11-23 02:52:15.178820: Epoch time: 55.61 s 
2023-11-23 02:52:16.268925:  
2023-11-23 02:52:16.269067: Epoch 604 
2023-11-23 02:52:16.269181: Current learning rate: 0.00434 
2023-11-23 02:53:11.987451: train_loss -0.3502 
2023-11-23 02:53:11.987641: val_loss -0.3327 
2023-11-23 02:53:11.987722: Pseudo dice [0.6826, nan] 
2023-11-23 02:53:11.987794: Epoch time: 55.72 s 
2023-11-23 02:53:13.060572:  
2023-11-23 02:53:13.060714: Epoch 605 
2023-11-23 02:53:13.060818: Current learning rate: 0.00433 
2023-11-23 02:54:08.894947: train_loss -0.3517 
2023-11-23 02:54:08.895159: val_loss -0.3511 
2023-11-23 02:54:08.895260: Pseudo dice [0.7099, nan] 
2023-11-23 02:54:08.895343: Epoch time: 55.84 s 
2023-11-23 02:54:10.079960:  
2023-11-23 02:54:10.080083: Epoch 606 
2023-11-23 02:54:10.080210: Current learning rate: 0.00432 
2023-11-23 02:55:05.686567: train_loss -0.3584 
2023-11-23 02:55:05.686765: val_loss -0.3461 
2023-11-23 02:55:05.686888: Pseudo dice [0.7031, nan] 
2023-11-23 02:55:05.686964: Epoch time: 55.61 s 
2023-11-23 02:55:06.763347:  
2023-11-23 02:55:06.763494: Epoch 607 
2023-11-23 02:55:06.763636: Current learning rate: 0.00431 
2023-11-23 02:56:02.434772: train_loss -0.3499 
2023-11-23 02:56:02.434986: val_loss -0.3476 
2023-11-23 02:56:02.435069: Pseudo dice [0.7035, nan] 
2023-11-23 02:56:02.435143: Epoch time: 55.67 s 
2023-11-23 02:56:03.523440:  
2023-11-23 02:56:03.523601: Epoch 608 
2023-11-23 02:56:03.523705: Current learning rate: 0.0043 
2023-11-23 02:56:59.144913: train_loss -0.3541 
2023-11-23 02:56:59.145169: val_loss -0.3409 
2023-11-23 02:56:59.145254: Pseudo dice [0.7049, nan] 
2023-11-23 02:56:59.145328: Epoch time: 55.62 s 
2023-11-23 02:57:00.221551:  
2023-11-23 02:57:00.221716: Epoch 609 
2023-11-23 02:57:00.221816: Current learning rate: 0.00429 
2023-11-23 02:57:55.787377: train_loss -0.3252 
2023-11-23 02:57:55.787589: val_loss -0.3606 
2023-11-23 02:57:55.787669: Pseudo dice [0.7356, nan] 
2023-11-23 02:57:55.787744: Epoch time: 55.57 s 
2023-11-23 02:57:56.972730:  
2023-11-23 02:57:56.972853: Epoch 610 
2023-11-23 02:57:56.972949: Current learning rate: 0.00429 
2023-11-23 02:58:52.613630: train_loss -0.3415 
2023-11-23 02:58:52.613832: val_loss -0.3379 
2023-11-23 02:58:52.613918: Pseudo dice [0.6895, nan] 
2023-11-23 02:58:52.613993: Epoch time: 55.64 s 
2023-11-23 02:58:53.706017:  
2023-11-23 02:58:53.706141: Epoch 611 
2023-11-23 02:58:53.706259: Current learning rate: 0.00428 
2023-11-23 02:59:49.323030: train_loss -0.3451 
2023-11-23 02:59:49.323221: val_loss -0.356 
2023-11-23 02:59:49.323302: Pseudo dice [0.7208, nan] 
2023-11-23 02:59:49.323374: Epoch time: 55.62 s 
2023-11-23 02:59:50.420058:  
2023-11-23 02:59:50.420247: Epoch 612 
2023-11-23 02:59:50.420356: Current learning rate: 0.00427 
2023-11-23 03:00:46.052428: train_loss -0.3499 
2023-11-23 03:00:46.052654: val_loss -0.3622 
2023-11-23 03:00:46.052732: Pseudo dice [0.7296, nan] 
2023-11-23 03:00:46.052811: Epoch time: 55.63 s 
2023-11-23 03:00:47.136994:  
2023-11-23 03:00:47.137154: Epoch 613 
2023-11-23 03:00:47.137300: Current learning rate: 0.00426 
2023-11-23 03:01:42.789142: train_loss -0.3585 
2023-11-23 03:01:42.789360: val_loss -0.3545 
2023-11-23 03:01:42.789462: Pseudo dice [0.7215, nan] 
2023-11-23 03:01:42.789537: Epoch time: 55.65 s 
2023-11-23 03:01:43.870109:  
2023-11-23 03:01:43.870238: Epoch 614 
2023-11-23 03:01:43.870334: Current learning rate: 0.00425 
2023-11-23 03:02:39.603279: train_loss -0.3388 
2023-11-23 03:02:39.603491: val_loss -0.3581 
2023-11-23 03:02:39.603572: Pseudo dice [0.732, nan] 
2023-11-23 03:02:39.603650: Epoch time: 55.73 s 
2023-11-23 03:02:40.799257:  
2023-11-23 03:02:40.799402: Epoch 615 
2023-11-23 03:02:40.799513: Current learning rate: 0.00424 
2023-11-23 03:03:36.424344: train_loss -0.3317 
2023-11-23 03:03:36.424576: val_loss -0.3633 
2023-11-23 03:03:36.424667: Pseudo dice [0.7408, nan] 
2023-11-23 03:03:36.424754: Epoch time: 55.63 s 
2023-11-23 03:03:37.513442:  
2023-11-23 03:03:37.513570: Epoch 616 
2023-11-23 03:03:37.513675: Current learning rate: 0.00423 
2023-11-23 03:04:33.282287: train_loss -0.347 
2023-11-23 03:04:33.282471: val_loss -0.3621 
2023-11-23 03:04:33.282552: Pseudo dice [0.7331, nan] 
2023-11-23 03:04:33.282624: Epoch time: 55.77 s 
2023-11-23 03:04:34.369684:  
2023-11-23 03:04:34.369812: Epoch 617 
2023-11-23 03:04:34.369912: Current learning rate: 0.00422 
2023-11-23 03:05:30.196127: train_loss -0.3357 
2023-11-23 03:05:30.196317: val_loss -0.3734 
2023-11-23 03:05:30.196399: Pseudo dice [0.7474, nan] 
2023-11-23 03:05:30.196473: Epoch time: 55.83 s 
2023-11-23 03:05:31.276215:  
2023-11-23 03:05:31.276415: Epoch 618 
2023-11-23 03:05:31.276522: Current learning rate: 0.00421 
2023-11-23 03:06:27.003774: train_loss -0.3561 
2023-11-23 03:06:27.003963: val_loss -0.3531 
2023-11-23 03:06:27.004040: Pseudo dice [0.7081, nan] 
2023-11-23 03:06:27.004125: Epoch time: 55.73 s 
2023-11-23 03:06:28.090852:  
2023-11-23 03:06:28.090978: Epoch 619 
2023-11-23 03:06:28.091072: Current learning rate: 0.0042 
2023-11-23 03:07:23.733666: train_loss -0.3432 
2023-11-23 03:07:23.733908: val_loss -0.351 
2023-11-23 03:07:23.733992: Pseudo dice [0.7054, nan] 
2023-11-23 03:07:23.734067: Epoch time: 55.64 s 
2023-11-23 03:07:24.929195:  
2023-11-23 03:07:24.929363: Epoch 620 
2023-11-23 03:07:24.929497: Current learning rate: 0.00419 
2023-11-23 03:08:20.631808: train_loss -0.3605 
2023-11-23 03:08:20.632056: val_loss -0.346 
2023-11-23 03:08:20.632141: Pseudo dice [0.7043, nan] 
2023-11-23 03:08:20.632226: Epoch time: 55.7 s 
2023-11-23 03:08:21.721555:  
2023-11-23 03:08:21.721690: Epoch 621 
2023-11-23 03:08:21.721795: Current learning rate: 0.00418 
2023-11-23 03:09:17.464850: train_loss -0.3476 
2023-11-23 03:09:17.465045: val_loss -0.3366 
2023-11-23 03:09:17.465120: Pseudo dice [0.6878, nan] 
2023-11-23 03:09:17.465189: Epoch time: 55.74 s 
2023-11-23 03:09:18.559081:  
2023-11-23 03:09:18.559205: Epoch 622 
2023-11-23 03:09:18.559317: Current learning rate: 0.00417 
2023-11-23 03:10:14.387190: train_loss -0.347 
2023-11-23 03:10:14.387400: val_loss -0.3326 
2023-11-23 03:10:14.387482: Pseudo dice [0.6841, nan] 
2023-11-23 03:10:14.387557: Epoch time: 55.83 s 
2023-11-23 03:10:15.473712:  
2023-11-23 03:10:15.473843: Epoch 623 
2023-11-23 03:10:15.474054: Current learning rate: 0.00416 
2023-11-23 03:11:11.091910: train_loss -0.355 
2023-11-23 03:11:11.092107: val_loss -0.322 
2023-11-23 03:11:11.092227: Pseudo dice [0.657, nan] 
2023-11-23 03:11:11.092302: Epoch time: 55.62 s 
2023-11-23 03:11:12.184823:  
2023-11-23 03:11:12.184952: Epoch 624 
2023-11-23 03:11:12.185055: Current learning rate: 0.00415 
2023-11-23 03:12:07.916971: train_loss -0.342 
2023-11-23 03:12:07.917177: val_loss -0.3439 
2023-11-23 03:12:07.917255: Pseudo dice [0.6876, nan] 
2023-11-23 03:12:07.917345: Epoch time: 55.73 s 
2023-11-23 03:12:09.104090:  
2023-11-23 03:12:09.104231: Epoch 625 
2023-11-23 03:12:09.104344: Current learning rate: 0.00414 
2023-11-23 03:13:04.793937: train_loss -0.337 
2023-11-23 03:13:04.794131: val_loss -0.3726 
2023-11-23 03:13:04.794210: Pseudo dice [0.759, nan] 
2023-11-23 03:13:04.794281: Epoch time: 55.69 s 
2023-11-23 03:13:05.889335:  
2023-11-23 03:13:05.889471: Epoch 626 
2023-11-23 03:13:05.889596: Current learning rate: 0.00413 
2023-11-23 03:14:01.710108: train_loss -0.3568 
2023-11-23 03:14:01.710283: val_loss -0.3419 
2023-11-23 03:14:01.710362: Pseudo dice [0.7044, nan] 
2023-11-23 03:14:01.710435: Epoch time: 55.82 s 
2023-11-23 03:14:02.796428:  
2023-11-23 03:14:02.796553: Epoch 627 
2023-11-23 03:14:02.796675: Current learning rate: 0.00412 
2023-11-23 03:14:58.582759: train_loss -0.3456 
2023-11-23 03:14:58.582998: val_loss -0.3545 
2023-11-23 03:14:58.583085: Pseudo dice [0.727, nan] 
2023-11-23 03:14:58.583162: Epoch time: 55.79 s 
2023-11-23 03:14:59.670589:  
2023-11-23 03:14:59.670760: Epoch 628 
2023-11-23 03:14:59.670883: Current learning rate: 0.00411 
2023-11-23 03:15:55.293879: train_loss -0.3426 
2023-11-23 03:15:55.294094: val_loss -0.3721 
2023-11-23 03:15:55.294219: Pseudo dice [0.7536, nan] 
2023-11-23 03:15:55.294302: Epoch time: 55.62 s 
2023-11-23 03:15:56.379456:  
2023-11-23 03:15:56.379644: Epoch 629 
2023-11-23 03:15:56.379796: Current learning rate: 0.0041 
2023-11-23 03:16:51.833466: train_loss -0.3367 
2023-11-23 03:16:51.833656: val_loss -0.3629 
2023-11-23 03:16:51.833747: Pseudo dice [0.7279, nan] 
2023-11-23 03:16:51.833822: Epoch time: 55.45 s 
2023-11-23 03:16:53.023694:  
2023-11-23 03:16:53.023831: Epoch 630 
2023-11-23 03:16:53.023957: Current learning rate: 0.00409 
2023-11-23 03:17:48.501631: train_loss -0.3491 
2023-11-23 03:17:48.501832: val_loss -0.3514 
2023-11-23 03:17:48.501923: Pseudo dice [0.7134, nan] 
2023-11-23 03:17:48.502009: Epoch time: 55.48 s 
2023-11-23 03:17:49.586776:  
2023-11-23 03:17:49.587066: Epoch 631 
2023-11-23 03:17:49.587223: Current learning rate: 0.00408 
2023-11-23 03:18:45.251584: train_loss -0.3537 
2023-11-23 03:18:45.251788: val_loss -0.3428 
2023-11-23 03:18:45.251868: Pseudo dice [0.6902, nan] 
2023-11-23 03:18:45.251942: Epoch time: 55.67 s 
2023-11-23 03:18:46.337934:  
2023-11-23 03:18:46.338068: Epoch 632 
2023-11-23 03:18:46.338166: Current learning rate: 0.00407 
2023-11-23 03:19:41.898719: train_loss -0.342 
2023-11-23 03:19:41.898943: val_loss -0.3472 
2023-11-23 03:19:41.899037: Pseudo dice [0.7045, nan] 
2023-11-23 03:19:41.899124: Epoch time: 55.56 s 
2023-11-23 03:19:42.982987:  
2023-11-23 03:19:42.983156: Epoch 633 
2023-11-23 03:19:42.983310: Current learning rate: 0.00406 
2023-11-23 03:20:38.600918: train_loss -0.3535 
2023-11-23 03:20:38.601119: val_loss -0.3551 
2023-11-23 03:20:38.601209: Pseudo dice [0.7267, nan] 
2023-11-23 03:20:38.601283: Epoch time: 55.62 s 
2023-11-23 03:20:39.693430:  
2023-11-23 03:20:39.693558: Epoch 634 
2023-11-23 03:20:39.693659: Current learning rate: 0.00405 
2023-11-23 03:21:35.324681: train_loss -0.3417 
2023-11-23 03:21:35.324928: val_loss -0.3483 
2023-11-23 03:21:35.325015: Pseudo dice [0.7195, nan] 
2023-11-23 03:21:35.325096: Epoch time: 55.63 s 
2023-11-23 03:21:36.402362:  
2023-11-23 03:21:36.402489: Epoch 635 
2023-11-23 03:21:36.402591: Current learning rate: 0.00404 
2023-11-23 03:22:32.125150: train_loss -0.3405 
2023-11-23 03:22:32.125368: val_loss -0.3603 
2023-11-23 03:22:32.125470: Pseudo dice [0.7335, nan] 
2023-11-23 03:22:32.125572: Epoch time: 55.72 s 
2023-11-23 03:22:33.210882:  
2023-11-23 03:22:33.211021: Epoch 636 
2023-11-23 03:22:33.211157: Current learning rate: 0.00403 
2023-11-23 03:23:28.813519: train_loss -0.3505 
2023-11-23 03:23:28.813706: val_loss -0.3618 
2023-11-23 03:23:28.813788: Pseudo dice [0.7374, nan] 
2023-11-23 03:23:28.813864: Epoch time: 55.6 s 
2023-11-23 03:23:29.908239:  
2023-11-23 03:23:29.908369: Epoch 637 
2023-11-23 03:23:29.908471: Current learning rate: 0.00402 
2023-11-23 03:24:25.623633: train_loss -0.3289 
2023-11-23 03:24:25.623829: val_loss -0.3407 
2023-11-23 03:24:25.623910: Pseudo dice [0.6924, nan] 
2023-11-23 03:24:25.623984: Epoch time: 55.72 s 
2023-11-23 03:24:26.827890:  
2023-11-23 03:24:26.828077: Epoch 638 
2023-11-23 03:24:26.828189: Current learning rate: 0.00401 
2023-11-23 03:25:22.457426: train_loss -0.3423 
2023-11-23 03:25:22.457616: val_loss -0.3463 
2023-11-23 03:25:22.457695: Pseudo dice [0.7036, nan] 
2023-11-23 03:25:22.457766: Epoch time: 55.63 s 
2023-11-23 03:25:23.543060:  
2023-11-23 03:25:23.543246: Epoch 639 
2023-11-23 03:25:23.543353: Current learning rate: 0.004 
2023-11-23 03:26:19.175143: train_loss -0.341 
2023-11-23 03:26:19.175345: val_loss -0.3427 
2023-11-23 03:26:19.175426: Pseudo dice [0.678, nan] 
2023-11-23 03:26:19.175500: Epoch time: 55.63 s 
2023-11-23 03:26:20.261500:  
2023-11-23 03:26:20.261636: Epoch 640 
2023-11-23 03:26:20.261739: Current learning rate: 0.00399 
2023-11-23 03:27:15.914071: train_loss -0.3401 
2023-11-23 03:27:15.914259: val_loss -0.3434 
2023-11-23 03:27:15.914342: Pseudo dice [0.6922, nan] 
2023-11-23 03:27:15.914415: Epoch time: 55.65 s 
2023-11-23 03:27:16.996028:  
2023-11-23 03:27:16.996150: Epoch 641 
2023-11-23 03:27:16.996250: Current learning rate: 0.00398 
2023-11-23 03:28:12.595334: train_loss -0.3482 
2023-11-23 03:28:12.595547: val_loss -0.346 
2023-11-23 03:28:12.595632: Pseudo dice [0.7086, nan] 
2023-11-23 03:28:12.595703: Epoch time: 55.6 s 
2023-11-23 03:28:13.775647:  
2023-11-23 03:28:13.775828: Epoch 642 
2023-11-23 03:28:13.775929: Current learning rate: 0.00397 
2023-11-23 03:29:09.504335: train_loss -0.3426 
2023-11-23 03:29:09.504530: val_loss -0.3724 
2023-11-23 03:29:09.504628: Pseudo dice [0.7606, nan] 
2023-11-23 03:29:09.504705: Epoch time: 55.73 s 
2023-11-23 03:29:10.580260:  
2023-11-23 03:29:10.580391: Epoch 643 
2023-11-23 03:29:10.580497: Current learning rate: 0.00396 
2023-11-23 03:30:06.157585: train_loss -0.3402 
2023-11-23 03:30:06.157775: val_loss -0.3534 
2023-11-23 03:30:06.157852: Pseudo dice [0.7208, nan] 
2023-11-23 03:30:06.157944: Epoch time: 55.58 s 
2023-11-23 03:30:07.246020:  
2023-11-23 03:30:07.246153: Epoch 644 
2023-11-23 03:30:07.246256: Current learning rate: 0.00395 
2023-11-23 03:31:02.943664: train_loss -0.3466 
2023-11-23 03:31:02.943890: val_loss -0.3532 
2023-11-23 03:31:02.943972: Pseudo dice [0.7185, nan] 
2023-11-23 03:31:02.944052: Epoch time: 55.7 s 
2023-11-23 03:31:04.024473:  
2023-11-23 03:31:04.024664: Epoch 645 
2023-11-23 03:31:04.024826: Current learning rate: 0.00394 
2023-11-23 03:31:59.624163: train_loss -0.3457 
2023-11-23 03:31:59.624357: val_loss -0.344 
2023-11-23 03:31:59.624435: Pseudo dice [0.7001, nan] 
2023-11-23 03:31:59.624523: Epoch time: 55.6 s 
2023-11-23 03:32:00.706720:  
2023-11-23 03:32:00.706894: Epoch 646 
2023-11-23 03:32:00.707028: Current learning rate: 0.00393 
2023-11-23 03:32:56.337191: train_loss -0.3479 
2023-11-23 03:32:56.337380: val_loss -0.3716 
2023-11-23 03:32:56.337459: Pseudo dice [0.7397, nan] 
2023-11-23 03:32:56.337534: Epoch time: 55.63 s 
2023-11-23 03:32:57.414337:  
2023-11-23 03:32:57.414464: Epoch 647 
2023-11-23 03:32:57.414563: Current learning rate: 0.00392 
2023-11-23 03:33:53.013100: train_loss -0.338 
2023-11-23 03:33:53.013303: val_loss -0.3258 
2023-11-23 03:33:53.013383: Pseudo dice [0.6679, nan] 
2023-11-23 03:33:53.013453: Epoch time: 55.6 s 
2023-11-23 03:33:54.120836:  
2023-11-23 03:33:54.121061: Epoch 648 
2023-11-23 03:33:54.121201: Current learning rate: 0.00391 
2023-11-23 03:34:49.739157: train_loss -0.3598 
2023-11-23 03:34:49.739344: val_loss -0.3326 
2023-11-23 03:34:49.739424: Pseudo dice [0.6817, nan] 
2023-11-23 03:34:49.739496: Epoch time: 55.62 s 
2023-11-23 03:34:50.828323:  
2023-11-23 03:34:50.828456: Epoch 649 
2023-11-23 03:34:50.828576: Current learning rate: 0.0039 
2023-11-23 03:35:46.466743: train_loss -0.3546 
2023-11-23 03:35:46.466937: val_loss -0.3661 
2023-11-23 03:35:46.467014: Pseudo dice [0.7527, nan] 
2023-11-23 03:35:46.467086: Epoch time: 55.64 s 
2023-11-23 03:35:47.677241:  
2023-11-23 03:35:47.677466: Epoch 650 
2023-11-23 03:35:47.677586: Current learning rate: 0.00389 
2023-11-23 03:36:43.316865: train_loss -0.3587 
2023-11-23 03:36:43.317077: val_loss -0.332 
2023-11-23 03:36:43.317163: Pseudo dice [0.6837, nan] 
2023-11-23 03:36:43.317240: Epoch time: 55.64 s 
2023-11-23 03:36:44.507332:  
2023-11-23 03:36:44.507468: Epoch 651 
2023-11-23 03:36:44.507572: Current learning rate: 0.00388 
2023-11-23 03:37:40.158785: train_loss -0.3345 
2023-11-23 03:37:40.159065: val_loss -0.3332 
2023-11-23 03:37:40.159150: Pseudo dice [0.6869, nan] 
2023-11-23 03:37:40.159223: Epoch time: 55.65 s 
2023-11-23 03:37:41.238490:  
2023-11-23 03:37:41.238658: Epoch 652 
2023-11-23 03:37:41.238799: Current learning rate: 0.00387 
2023-11-23 03:38:36.953335: train_loss -0.3451 
2023-11-23 03:38:36.953557: val_loss -0.3445 
2023-11-23 03:38:36.953669: Pseudo dice [0.7097, nan] 
2023-11-23 03:38:36.953744: Epoch time: 55.72 s 
2023-11-23 03:38:38.037590:  
2023-11-23 03:38:38.037717: Epoch 653 
2023-11-23 03:38:38.037815: Current learning rate: 0.00386 
2023-11-23 03:39:33.681518: train_loss -0.345 
2023-11-23 03:39:33.681729: val_loss -0.3468 
2023-11-23 03:39:33.681813: Pseudo dice [0.7012, nan] 
2023-11-23 03:39:33.681889: Epoch time: 55.64 s 
2023-11-23 03:39:34.764036:  
2023-11-23 03:39:34.764170: Epoch 654 
2023-11-23 03:39:34.764270: Current learning rate: 0.00385 
2023-11-23 03:40:30.462285: train_loss -0.336 
2023-11-23 03:40:30.462477: val_loss -0.3476 
2023-11-23 03:40:30.462557: Pseudo dice [0.7108, nan] 
2023-11-23 03:40:30.462629: Epoch time: 55.7 s 
2023-11-23 03:40:31.538770:  
2023-11-23 03:40:31.538897: Epoch 655 
2023-11-23 03:40:31.539003: Current learning rate: 0.00384 
2023-11-23 03:41:27.163910: train_loss -0.3463 
2023-11-23 03:41:27.164103: val_loss -0.3565 
2023-11-23 03:41:27.164182: Pseudo dice [0.7195, nan] 
2023-11-23 03:41:27.164255: Epoch time: 55.63 s 
2023-11-23 03:41:28.370263:  
2023-11-23 03:41:28.370406: Epoch 656 
2023-11-23 03:41:28.370533: Current learning rate: 0.00383 
2023-11-23 03:42:24.236159: train_loss -0.3591 
2023-11-23 03:42:24.236354: val_loss -0.3672 
2023-11-23 03:42:24.236433: Pseudo dice [0.7438, nan] 
2023-11-23 03:42:24.236505: Epoch time: 55.87 s 
2023-11-23 03:42:25.322254:  
2023-11-23 03:42:25.322421: Epoch 657 
2023-11-23 03:42:25.322567: Current learning rate: 0.00382 
2023-11-23 03:43:21.035281: train_loss -0.3447 
2023-11-23 03:43:21.035519: val_loss -0.3519 
2023-11-23 03:43:21.035611: Pseudo dice [0.7108, nan] 
2023-11-23 03:43:21.035692: Epoch time: 55.71 s 
2023-11-23 03:43:22.117007:  
2023-11-23 03:43:22.117125: Epoch 658 
2023-11-23 03:43:22.117221: Current learning rate: 0.00381 
2023-11-23 03:44:17.794889: train_loss -0.347 
2023-11-23 03:44:17.795071: val_loss -0.3443 
2023-11-23 03:44:17.795151: Pseudo dice [0.6943, nan] 
2023-11-23 03:44:17.795224: Epoch time: 55.68 s 
2023-11-23 03:44:18.880953:  
2023-11-23 03:44:18.881077: Epoch 659 
2023-11-23 03:44:18.881177: Current learning rate: 0.0038 
2023-11-23 03:45:14.472873: train_loss -0.3365 
2023-11-23 03:45:14.473084: val_loss -0.3293 
2023-11-23 03:45:14.473167: Pseudo dice [0.6727, nan] 
2023-11-23 03:45:14.473253: Epoch time: 55.59 s 
2023-11-23 03:45:15.559568:  
2023-11-23 03:45:15.559697: Epoch 660 
2023-11-23 03:45:15.559798: Current learning rate: 0.00379 
2023-11-23 03:46:11.064949: train_loss -0.358 
2023-11-23 03:46:11.065138: val_loss -0.3591 
2023-11-23 03:46:11.065232: Pseudo dice [0.7338, nan] 
2023-11-23 03:46:11.065305: Epoch time: 55.51 s 
2023-11-23 03:46:12.150326:  
2023-11-23 03:46:12.150497: Epoch 661 
2023-11-23 03:46:12.150614: Current learning rate: 0.00378 
2023-11-23 03:47:07.749675: train_loss -0.3662 
2023-11-23 03:47:07.749872: val_loss -0.3425 
2023-11-23 03:47:07.749967: Pseudo dice [0.689, nan] 
2023-11-23 03:47:07.750042: Epoch time: 55.6 s 
2023-11-23 03:47:08.836986:  
2023-11-23 03:47:08.837121: Epoch 662 
2023-11-23 03:47:08.837245: Current learning rate: 0.00377 
2023-11-23 03:48:04.488579: train_loss -0.3504 
2023-11-23 03:48:04.488762: val_loss -0.3541 
2023-11-23 03:48:04.488842: Pseudo dice [0.7205, nan] 
2023-11-23 03:48:04.488916: Epoch time: 55.65 s 
2023-11-23 03:48:05.598722:  
2023-11-23 03:48:05.598863: Epoch 663 
2023-11-23 03:48:05.598980: Current learning rate: 0.00376 
2023-11-23 03:49:01.321254: train_loss -0.3506 
2023-11-23 03:49:01.321474: val_loss -0.3576 
2023-11-23 03:49:01.321562: Pseudo dice [0.7223, nan] 
2023-11-23 03:49:01.321645: Epoch time: 55.72 s 
2023-11-23 03:49:02.426172:  
2023-11-23 03:49:02.426305: Epoch 664 
2023-11-23 03:49:02.426525: Current learning rate: 0.00375 
2023-11-23 03:49:57.927058: train_loss -0.3466 
2023-11-23 03:49:57.927271: val_loss -0.3726 
2023-11-23 03:49:57.927358: Pseudo dice [0.7578, nan] 
2023-11-23 03:49:57.927425: Epoch time: 55.5 s 
2023-11-23 03:49:59.106336:  
2023-11-23 03:49:59.106464: Epoch 665 
2023-11-23 03:49:59.106591: Current learning rate: 0.00374 
2023-11-23 03:50:54.660540: train_loss -0.3429 
2023-11-23 03:50:54.660738: val_loss -0.3502 
2023-11-23 03:50:54.660819: Pseudo dice [0.7205, nan] 
2023-11-23 03:50:54.660896: Epoch time: 55.55 s 
2023-11-23 03:50:55.743258:  
2023-11-23 03:50:55.743381: Epoch 666 
2023-11-23 03:50:55.743479: Current learning rate: 0.00373 
2023-11-23 03:51:51.472378: train_loss -0.3448 
2023-11-23 03:51:51.472606: val_loss -0.3663 
2023-11-23 03:51:51.472694: Pseudo dice [0.7377, nan] 
2023-11-23 03:51:51.472780: Epoch time: 55.73 s 
2023-11-23 03:51:52.569118:  
2023-11-23 03:51:52.569295: Epoch 667 
2023-11-23 03:51:52.569426: Current learning rate: 0.00372 
2023-11-23 03:52:48.252490: train_loss -0.3458 
2023-11-23 03:52:48.252697: val_loss -0.362 
2023-11-23 03:52:48.252799: Pseudo dice [0.7349, nan] 
2023-11-23 03:52:48.252887: Epoch time: 55.68 s 
2023-11-23 03:52:49.346778:  
2023-11-23 03:52:49.346903: Epoch 668 
2023-11-23 03:52:49.347045: Current learning rate: 0.00371 
2023-11-23 03:53:44.864572: train_loss -0.3454 
2023-11-23 03:53:44.864774: val_loss -0.3512 
2023-11-23 03:53:44.864857: Pseudo dice [0.72, nan] 
2023-11-23 03:53:44.864932: Epoch time: 55.52 s 
2023-11-23 03:53:46.063735:  
2023-11-23 03:53:46.063946: Epoch 669 
2023-11-23 03:53:46.064089: Current learning rate: 0.0037 
2023-11-23 03:54:41.635417: train_loss -0.3497 
2023-11-23 03:54:41.635631: val_loss -0.3581 
2023-11-23 03:54:41.635716: Pseudo dice [0.7323, nan] 
2023-11-23 03:54:41.635789: Epoch time: 55.57 s 
2023-11-23 03:54:42.729677:  
2023-11-23 03:54:42.729850: Epoch 670 
2023-11-23 03:54:42.729975: Current learning rate: 0.00369 
2023-11-23 03:55:38.437198: train_loss -0.3393 
2023-11-23 03:55:38.437405: val_loss -0.3518 
2023-11-23 03:55:38.437485: Pseudo dice [0.7224, nan] 
2023-11-23 03:55:38.437556: Epoch time: 55.71 s 
2023-11-23 03:55:39.533335:  
2023-11-23 03:55:39.533657: Epoch 671 
2023-11-23 03:55:39.533830: Current learning rate: 0.00368 
2023-11-23 03:56:35.203846: train_loss -0.3491 
2023-11-23 03:56:35.204045: val_loss -0.3529 
2023-11-23 03:56:35.204131: Pseudo dice [0.7206, nan] 
2023-11-23 03:56:35.204206: Epoch time: 55.67 s 
2023-11-23 03:56:36.307753:  
2023-11-23 03:56:36.307891: Epoch 672 
2023-11-23 03:56:36.308032: Current learning rate: 0.00367 
2023-11-23 03:57:31.869381: train_loss -0.3473 
2023-11-23 03:57:31.869567: val_loss -0.304 
2023-11-23 03:57:31.869649: Pseudo dice [0.6359, nan] 
2023-11-23 03:57:31.869722: Epoch time: 55.56 s 
2023-11-23 03:57:32.977286:  
2023-11-23 03:57:32.977430: Epoch 673 
2023-11-23 03:57:32.977541: Current learning rate: 0.00366 
2023-11-23 03:58:28.662793: train_loss -0.3368 
2023-11-23 03:58:28.663004: val_loss -0.3746 
2023-11-23 03:58:28.663105: Pseudo dice [0.7526, nan] 
2023-11-23 03:58:28.663184: Epoch time: 55.69 s 
2023-11-23 03:58:29.872358:  
2023-11-23 03:58:29.872497: Epoch 674 
2023-11-23 03:58:29.872619: Current learning rate: 0.00365 
2023-11-23 03:59:25.523134: train_loss -0.3528 
2023-11-23 03:59:25.523336: val_loss -0.3739 
2023-11-23 03:59:25.523417: Pseudo dice [0.7625, nan] 
2023-11-23 03:59:25.523499: Epoch time: 55.65 s 
2023-11-23 03:59:26.629991:  
2023-11-23 03:59:26.630119: Epoch 675 
2023-11-23 03:59:26.630242: Current learning rate: 0.00364 
2023-11-23 04:00:22.265718: train_loss -0.3551 
2023-11-23 04:00:22.265929: val_loss -0.3425 
2023-11-23 04:00:22.266039: Pseudo dice [0.701, nan] 
2023-11-23 04:00:22.266129: Epoch time: 55.64 s 
2023-11-23 04:00:23.391916:  
2023-11-23 04:00:23.392031: Epoch 676 
2023-11-23 04:00:23.392192: Current learning rate: 0.00363 
2023-11-23 04:01:19.114643: train_loss -0.352 
2023-11-23 04:01:19.114835: val_loss -0.3538 
2023-11-23 04:01:19.114918: Pseudo dice [0.7197, nan] 
2023-11-23 04:01:19.114990: Epoch time: 55.72 s 
2023-11-23 04:01:20.208146:  
2023-11-23 04:01:20.208275: Epoch 677 
2023-11-23 04:01:20.208375: Current learning rate: 0.00362 
2023-11-23 04:02:15.929285: train_loss -0.3464 
2023-11-23 04:02:15.929508: val_loss -0.3534 
2023-11-23 04:02:15.929596: Pseudo dice [0.7226, nan] 
2023-11-23 04:02:15.929674: Epoch time: 55.72 s 
2023-11-23 04:02:17.035609:  
2023-11-23 04:02:17.035733: Epoch 678 
2023-11-23 04:02:17.035837: Current learning rate: 0.00361 
2023-11-23 04:03:12.587659: train_loss -0.3466 
2023-11-23 04:03:12.587869: val_loss -0.3281 
2023-11-23 04:03:12.587952: Pseudo dice [0.6678, nan] 
2023-11-23 04:03:12.588026: Epoch time: 55.55 s 
2023-11-23 04:03:13.794837:  
2023-11-23 04:03:13.794961: Epoch 679 
2023-11-23 04:03:13.795064: Current learning rate: 0.0036 
2023-11-23 04:04:09.393458: train_loss -0.3514 
2023-11-23 04:04:09.393667: val_loss -0.3381 
2023-11-23 04:04:09.393752: Pseudo dice [0.6865, nan] 
2023-11-23 04:04:09.393832: Epoch time: 55.6 s 
2023-11-23 04:04:10.501760:  
2023-11-23 04:04:10.501880: Epoch 680 
2023-11-23 04:04:10.501982: Current learning rate: 0.00359 
2023-11-23 04:05:06.254035: train_loss -0.3288 
2023-11-23 04:05:06.254238: val_loss -0.3461 
2023-11-23 04:05:06.254321: Pseudo dice [0.709, nan] 
2023-11-23 04:05:06.254406: Epoch time: 55.75 s 
2023-11-23 04:05:07.353759:  
2023-11-23 04:05:07.353879: Epoch 681 
2023-11-23 04:05:07.353984: Current learning rate: 0.00358 
2023-11-23 04:06:02.985971: train_loss -0.3393 
2023-11-23 04:06:02.986157: val_loss -0.3576 
2023-11-23 04:06:02.986237: Pseudo dice [0.7329, nan] 
2023-11-23 04:06:02.986320: Epoch time: 55.63 s 
2023-11-23 04:06:04.096212:  
2023-11-23 04:06:04.096403: Epoch 682 
2023-11-23 04:06:04.096549: Current learning rate: 0.00357 
2023-11-23 04:06:59.798341: train_loss -0.346 
2023-11-23 04:06:59.798544: val_loss -0.3378 
2023-11-23 04:06:59.798666: Pseudo dice [0.6915, nan] 
2023-11-23 04:06:59.798741: Epoch time: 55.7 s 
2023-11-23 04:07:00.911371:  
2023-11-23 04:07:00.911493: Epoch 683 
2023-11-23 04:07:00.911594: Current learning rate: 0.00356 
2023-11-23 04:07:56.506710: train_loss -0.3422 
2023-11-23 04:07:56.506897: val_loss -0.3337 
2023-11-23 04:07:56.506977: Pseudo dice [0.6879, nan] 
2023-11-23 04:07:56.507045: Epoch time: 55.6 s 
2023-11-23 04:07:57.714158:  
2023-11-23 04:07:57.714296: Epoch 684 
2023-11-23 04:07:57.714406: Current learning rate: 0.00355 
2023-11-23 04:08:53.292909: train_loss -0.3423 
2023-11-23 04:08:53.293100: val_loss -0.3737 
2023-11-23 04:08:53.293193: Pseudo dice [0.7547, nan] 
2023-11-23 04:08:53.293267: Epoch time: 55.58 s 
2023-11-23 04:08:54.395772:  
2023-11-23 04:08:54.395916: Epoch 685 
2023-11-23 04:08:54.396048: Current learning rate: 0.00354 
2023-11-23 04:09:49.983379: train_loss -0.347 
2023-11-23 04:09:49.983580: val_loss -0.3643 
2023-11-23 04:09:49.983670: Pseudo dice [0.7444, nan] 
2023-11-23 04:09:49.983749: Epoch time: 55.59 s 
2023-11-23 04:09:51.106841:  
2023-11-23 04:09:51.106978: Epoch 686 
2023-11-23 04:09:51.107083: Current learning rate: 0.00353 
2023-11-23 04:10:46.588765: train_loss -0.346 
2023-11-23 04:10:46.588997: val_loss -0.3568 
2023-11-23 04:10:46.589084: Pseudo dice [0.7277, nan] 
2023-11-23 04:10:46.589161: Epoch time: 55.48 s 
2023-11-23 04:10:47.691623:  
2023-11-23 04:10:47.691773: Epoch 687 
2023-11-23 04:10:47.691877: Current learning rate: 0.00352 
2023-11-23 04:11:43.364582: train_loss -0.3483 
2023-11-23 04:11:43.364785: val_loss -0.3597 
2023-11-23 04:11:43.364870: Pseudo dice [0.7311, nan] 
2023-11-23 04:11:43.364948: Epoch time: 55.67 s 
2023-11-23 04:11:44.573737:  
2023-11-23 04:11:44.573867: Epoch 688 
2023-11-23 04:11:44.573969: Current learning rate: 0.00351 
2023-11-23 04:12:40.224399: train_loss -0.3631 
2023-11-23 04:12:40.224593: val_loss -0.3407 
2023-11-23 04:12:40.224675: Pseudo dice [0.6973, nan] 
2023-11-23 04:12:40.224754: Epoch time: 55.65 s 
2023-11-23 04:12:41.328490:  
2023-11-23 04:12:41.328624: Epoch 689 
2023-11-23 04:12:41.328723: Current learning rate: 0.0035 
2023-11-23 04:13:36.911611: train_loss -0.3463 
2023-11-23 04:13:36.911801: val_loss -0.3362 
2023-11-23 04:13:36.911895: Pseudo dice [0.6871, nan] 
2023-11-23 04:13:36.911967: Epoch time: 55.58 s 
2023-11-23 04:13:38.015949:  
2023-11-23 04:13:38.016078: Epoch 690 
2023-11-23 04:13:38.016207: Current learning rate: 0.00349 
2023-11-23 04:14:33.700884: train_loss -0.3473 
2023-11-23 04:14:33.701114: val_loss -0.3586 
2023-11-23 04:14:33.701195: Pseudo dice [0.7291, nan] 
2023-11-23 04:14:33.701276: Epoch time: 55.69 s 
2023-11-23 04:14:34.816888:  
2023-11-23 04:14:34.817082: Epoch 691 
2023-11-23 04:14:34.817248: Current learning rate: 0.00348 
2023-11-23 04:15:30.576911: train_loss -0.3235 
2023-11-23 04:15:30.577106: val_loss -0.3452 
2023-11-23 04:15:30.577187: Pseudo dice [0.7019, nan] 
2023-11-23 04:15:30.577258: Epoch time: 55.76 s 
2023-11-23 04:15:31.681609:  
2023-11-23 04:15:31.681800: Epoch 692 
2023-11-23 04:15:31.681908: Current learning rate: 0.00346 
2023-11-23 04:16:27.380225: train_loss -0.3412 
2023-11-23 04:16:27.380422: val_loss -0.3672 
2023-11-23 04:16:27.380506: Pseudo dice [0.7564, nan] 
2023-11-23 04:16:27.380595: Epoch time: 55.7 s 
2023-11-23 04:16:28.583985:  
2023-11-23 04:16:28.584105: Epoch 693 
2023-11-23 04:16:28.584203: Current learning rate: 0.00345 
2023-11-23 04:17:24.135746: train_loss -0.341 
2023-11-23 04:17:24.135936: val_loss -0.3548 
2023-11-23 04:17:24.136021: Pseudo dice [0.7203, nan] 
2023-11-23 04:17:24.136093: Epoch time: 55.55 s 
2023-11-23 04:17:25.237359:  
2023-11-23 04:17:25.237486: Epoch 694 
2023-11-23 04:17:25.237598: Current learning rate: 0.00344 
2023-11-23 04:18:20.939264: train_loss -0.3516 
2023-11-23 04:18:20.939456: val_loss -0.3414 
2023-11-23 04:18:20.939538: Pseudo dice [0.6993, nan] 
2023-11-23 04:18:20.939610: Epoch time: 55.7 s 
2023-11-23 04:18:22.041463:  
2023-11-23 04:18:22.041591: Epoch 695 
2023-11-23 04:18:22.041697: Current learning rate: 0.00343 
2023-11-23 04:19:17.764621: train_loss -0.3451 
2023-11-23 04:19:17.764817: val_loss -0.3579 
2023-11-23 04:19:17.764903: Pseudo dice [0.7263, nan] 
2023-11-23 04:19:17.764974: Epoch time: 55.72 s 
2023-11-23 04:19:18.866159:  
2023-11-23 04:19:18.866297: Epoch 696 
2023-11-23 04:19:18.866400: Current learning rate: 0.00342 
2023-11-23 04:20:14.601164: train_loss -0.3526 
2023-11-23 04:20:14.601383: val_loss -0.3544 
2023-11-23 04:20:14.601480: Pseudo dice [0.7165, nan] 
2023-11-23 04:20:14.601561: Epoch time: 55.74 s 
2023-11-23 04:20:15.706547:  
2023-11-23 04:20:15.706684: Epoch 697 
2023-11-23 04:20:15.706853: Current learning rate: 0.00341 
2023-11-23 04:21:11.270505: train_loss -0.3423 
2023-11-23 04:21:11.270710: val_loss -0.3541 
2023-11-23 04:21:11.270792: Pseudo dice [0.7207, nan] 
2023-11-23 04:21:11.270865: Epoch time: 55.56 s 
2023-11-23 04:21:12.472773:  
2023-11-23 04:21:12.472903: Epoch 698 
2023-11-23 04:21:12.472999: Current learning rate: 0.0034 
2023-11-23 04:22:08.011506: train_loss -0.3606 
2023-11-23 04:22:08.011693: val_loss -0.3536 
2023-11-23 04:22:08.011768: Pseudo dice [0.7331, nan] 
2023-11-23 04:22:08.011847: Epoch time: 55.54 s 
2023-11-23 04:22:09.123444:  
2023-11-23 04:22:09.123570: Epoch 699 
2023-11-23 04:22:09.123671: Current learning rate: 0.00339 
2023-11-23 04:23:04.693203: train_loss -0.3531 
2023-11-23 04:23:04.693395: val_loss -0.3564 
2023-11-23 04:23:04.693480: Pseudo dice [0.7268, nan] 
2023-11-23 04:23:04.693553: Epoch time: 55.57 s 
2023-11-23 04:23:05.919609:  
2023-11-23 04:23:05.919780: Epoch 700 
2023-11-23 04:23:05.919892: Current learning rate: 0.00338 
2023-11-23 04:24:01.613691: train_loss -0.3574 
2023-11-23 04:24:01.613912: val_loss -0.2983 
2023-11-23 04:24:01.614024: Pseudo dice [0.5717, nan] 
2023-11-23 04:24:01.614134: Epoch time: 55.69 s 
2023-11-23 04:24:02.710256:  
2023-11-23 04:24:02.710378: Epoch 701 
2023-11-23 04:24:02.710480: Current learning rate: 0.00337 
2023-11-23 04:24:58.386746: train_loss -0.3603 
2023-11-23 04:24:58.386943: val_loss -0.3514 
2023-11-23 04:24:58.387037: Pseudo dice [0.7169, nan] 
2023-11-23 04:24:58.387120: Epoch time: 55.68 s 
2023-11-23 04:24:59.479329:  
2023-11-23 04:24:59.479573: Epoch 702 
2023-11-23 04:24:59.479736: Current learning rate: 0.00336 
2023-11-23 04:25:55.106209: train_loss -0.3415 
2023-11-23 04:25:55.106406: val_loss -0.3278 
2023-11-23 04:25:55.106498: Pseudo dice [0.6789, nan] 
2023-11-23 04:25:55.106592: Epoch time: 55.63 s 
2023-11-23 04:25:56.321923:  
2023-11-23 04:25:56.322221: Epoch 703 
2023-11-23 04:25:56.322375: Current learning rate: 0.00335 
2023-11-23 04:26:52.049781: train_loss -0.3544 
2023-11-23 04:26:52.050006: val_loss -0.3432 
2023-11-23 04:26:52.050104: Pseudo dice [0.7036, nan] 
2023-11-23 04:26:52.050183: Epoch time: 55.73 s 
2023-11-23 04:26:53.167864:  
2023-11-23 04:26:53.167982: Epoch 704 
2023-11-23 04:26:53.168083: Current learning rate: 0.00334 
2023-11-23 04:27:48.875036: train_loss -0.346 
2023-11-23 04:27:48.875240: val_loss -0.3641 
2023-11-23 04:27:48.875329: Pseudo dice [0.7446, nan] 
2023-11-23 04:27:48.875424: Epoch time: 55.71 s 
2023-11-23 04:27:49.978330:  
2023-11-23 04:27:49.978510: Epoch 705 
2023-11-23 04:27:49.978627: Current learning rate: 0.00333 
2023-11-23 04:28:45.576914: train_loss -0.3634 
2023-11-23 04:28:45.577116: val_loss -0.3365 
2023-11-23 04:28:45.577195: Pseudo dice [0.6882, nan] 
2023-11-23 04:28:45.577267: Epoch time: 55.6 s 
2023-11-23 04:28:46.689695:  
2023-11-23 04:28:46.689819: Epoch 706 
2023-11-23 04:28:46.689917: Current learning rate: 0.00332 
2023-11-23 04:29:42.362978: train_loss -0.348 
2023-11-23 04:29:42.363198: val_loss -0.3314 
2023-11-23 04:29:42.363281: Pseudo dice [0.6817, nan] 
2023-11-23 04:29:42.363352: Epoch time: 55.67 s 
2023-11-23 04:29:43.472784:  
2023-11-23 04:29:43.472918: Epoch 707 
2023-11-23 04:29:43.473022: Current learning rate: 0.00331 
2023-11-23 04:30:39.085500: train_loss -0.3483 
2023-11-23 04:30:39.085700: val_loss -0.3465 
2023-11-23 04:30:39.085783: Pseudo dice [0.7054, nan] 
2023-11-23 04:30:39.085859: Epoch time: 55.61 s 
2023-11-23 04:30:40.298148:  
2023-11-23 04:30:40.298278: Epoch 708 
2023-11-23 04:30:40.298416: Current learning rate: 0.0033 
2023-11-23 04:31:35.888952: train_loss -0.3432 
2023-11-23 04:31:35.889180: val_loss -0.3553 
2023-11-23 04:31:35.889274: Pseudo dice [0.7178, nan] 
2023-11-23 04:31:35.889360: Epoch time: 55.59 s 
2023-11-23 04:31:37.009607:  
2023-11-23 04:31:37.009728: Epoch 709 
2023-11-23 04:31:37.009831: Current learning rate: 0.00329 
2023-11-23 04:32:32.687086: train_loss -0.3553 
2023-11-23 04:32:32.687316: val_loss -0.3571 
2023-11-23 04:32:32.687408: Pseudo dice [0.7161, nan] 
2023-11-23 04:32:32.687479: Epoch time: 55.68 s 
2023-11-23 04:32:33.796383:  
2023-11-23 04:32:33.796507: Epoch 710 
2023-11-23 04:32:33.796633: Current learning rate: 0.00328 
2023-11-23 04:33:29.414833: train_loss -0.3452 
2023-11-23 04:33:29.415032: val_loss -0.3576 
2023-11-23 04:33:29.415131: Pseudo dice [0.7308, nan] 
2023-11-23 04:33:29.415217: Epoch time: 55.62 s 
2023-11-23 04:33:30.533999:  
2023-11-23 04:33:30.534127: Epoch 711 
2023-11-23 04:33:30.534225: Current learning rate: 0.00327 
2023-11-23 04:34:26.083447: train_loss -0.3426 
2023-11-23 04:34:26.083679: val_loss -0.3543 
2023-11-23 04:34:26.083771: Pseudo dice [0.7152, nan] 
2023-11-23 04:34:26.083853: Epoch time: 55.55 s 
2023-11-23 04:34:27.189597:  
2023-11-23 04:34:27.189721: Epoch 712 
2023-11-23 04:34:27.189816: Current learning rate: 0.00326 
2023-11-23 04:35:22.710916: train_loss -0.3346 
2023-11-23 04:35:22.711102: val_loss -0.3573 
2023-11-23 04:35:22.711200: Pseudo dice [0.7254, nan] 
2023-11-23 04:35:22.711274: Epoch time: 55.52 s 
2023-11-23 04:35:24.031195:  
2023-11-23 04:35:24.031326: Epoch 713 
2023-11-23 04:35:24.031435: Current learning rate: 0.00325 
2023-11-23 04:36:19.666927: train_loss -0.3404 
2023-11-23 04:36:19.667153: val_loss -0.3595 
2023-11-23 04:36:19.667239: Pseudo dice [0.7329, nan] 
2023-11-23 04:36:19.667313: Epoch time: 55.64 s 
2023-11-23 04:36:20.778199:  
2023-11-23 04:36:20.778326: Epoch 714 
2023-11-23 04:36:20.778425: Current learning rate: 0.00324 
2023-11-23 04:37:16.422730: train_loss -0.348 
2023-11-23 04:37:16.422948: val_loss -0.3635 
2023-11-23 04:37:16.423064: Pseudo dice [0.7255, nan] 
2023-11-23 04:37:16.423139: Epoch time: 55.65 s 
2023-11-23 04:37:17.538726:  
2023-11-23 04:37:17.538858: Epoch 715 
2023-11-23 04:37:17.538969: Current learning rate: 0.00323 
2023-11-23 04:38:13.142166: train_loss -0.3575 
2023-11-23 04:38:13.142354: val_loss -0.361 
2023-11-23 04:38:13.142438: Pseudo dice [0.7342, nan] 
2023-11-23 04:38:13.142512: Epoch time: 55.6 s 
2023-11-23 04:38:14.248235:  
2023-11-23 04:38:14.248362: Epoch 716 
2023-11-23 04:38:14.248463: Current learning rate: 0.00322 
2023-11-23 04:39:09.918036: train_loss -0.3565 
2023-11-23 04:39:09.918252: val_loss -0.3814 
2023-11-23 04:39:09.918336: Pseudo dice [0.7747, nan] 
2023-11-23 04:39:09.918408: Epoch time: 55.67 s 
2023-11-23 04:39:11.013713:  
2023-11-23 04:39:11.013835: Epoch 717 
2023-11-23 04:39:11.013931: Current learning rate: 0.00321 
2023-11-23 04:40:06.525600: train_loss -0.3505 
2023-11-23 04:40:06.525795: val_loss -0.3575 
2023-11-23 04:40:06.525874: Pseudo dice [0.723, nan] 
2023-11-23 04:40:06.525965: Epoch time: 55.51 s 
2023-11-23 04:40:07.630349:  
2023-11-23 04:40:07.630500: Epoch 718 
2023-11-23 04:40:07.630640: Current learning rate: 0.0032 
2023-11-23 04:41:03.237672: train_loss -0.3555 
2023-11-23 04:41:03.237864: val_loss -0.372 
2023-11-23 04:41:03.237945: Pseudo dice [0.7648, nan] 
2023-11-23 04:41:03.238016: Epoch time: 55.61 s 
2023-11-23 04:41:04.345325:  
2023-11-23 04:41:04.345512: Epoch 719 
2023-11-23 04:41:04.345625: Current learning rate: 0.00319 
2023-11-23 04:41:59.943664: train_loss -0.3477 
2023-11-23 04:41:59.943866: val_loss -0.3572 
2023-11-23 04:41:59.943951: Pseudo dice [0.7206, nan] 
2023-11-23 04:41:59.944023: Epoch time: 55.6 s 
2023-11-23 04:42:01.059199:  
2023-11-23 04:42:01.059333: Epoch 720 
2023-11-23 04:42:01.059434: Current learning rate: 0.00318 
2023-11-23 04:42:56.703060: train_loss -0.3535 
2023-11-23 04:42:56.703250: val_loss -0.34 
2023-11-23 04:42:56.703334: Pseudo dice [0.693, nan] 
2023-11-23 04:42:56.703406: Epoch time: 55.64 s 
2023-11-23 04:42:57.815972:  
2023-11-23 04:42:57.816102: Epoch 721 
2023-11-23 04:42:57.816224: Current learning rate: 0.00317 
2023-11-23 04:43:53.390919: train_loss -0.3559 
2023-11-23 04:43:53.391100: val_loss -0.3579 
2023-11-23 04:43:53.391179: Pseudo dice [0.7331, nan] 
2023-11-23 04:43:53.391250: Epoch time: 55.58 s 
2023-11-23 04:43:54.609116:  
2023-11-23 04:43:54.609249: Epoch 722 
2023-11-23 04:43:54.609371: Current learning rate: 0.00316 
2023-11-23 04:44:50.231695: train_loss -0.3558 
2023-11-23 04:44:50.231939: val_loss -0.3455 
2023-11-23 04:44:50.232023: Pseudo dice [0.7071, nan] 
2023-11-23 04:44:50.232102: Epoch time: 55.62 s 
2023-11-23 04:44:51.334113:  
2023-11-23 04:44:51.334246: Epoch 723 
2023-11-23 04:44:51.334345: Current learning rate: 0.00315 
2023-11-23 04:45:46.959189: train_loss -0.3435 
2023-11-23 04:45:46.959413: val_loss -0.3563 
2023-11-23 04:45:46.959494: Pseudo dice [0.7268, nan] 
2023-11-23 04:45:46.959574: Epoch time: 55.63 s 
2023-11-23 04:45:48.148072:  
2023-11-23 04:45:48.148205: Epoch 724 
2023-11-23 04:45:48.148301: Current learning rate: 0.00314 
2023-11-23 04:46:43.846969: train_loss -0.3405 
2023-11-23 04:46:43.847187: val_loss -0.3374 
2023-11-23 04:46:43.847270: Pseudo dice [0.6938, nan] 
2023-11-23 04:46:43.847343: Epoch time: 55.7 s 
2023-11-23 04:46:44.954794:  
2023-11-23 04:46:44.954921: Epoch 725 
2023-11-23 04:46:44.955015: Current learning rate: 0.00313 
2023-11-23 04:47:40.575836: train_loss -0.3555 
2023-11-23 04:47:40.576055: val_loss -0.3632 
2023-11-23 04:47:40.576137: Pseudo dice [0.7308, nan] 
2023-11-23 04:47:40.576211: Epoch time: 55.62 s 
2023-11-23 04:47:41.802208:  
2023-11-23 04:47:41.802327: Epoch 726 
2023-11-23 04:47:41.802446: Current learning rate: 0.00312 
2023-11-23 04:48:37.418985: train_loss -0.3674 
2023-11-23 04:48:37.419169: val_loss -0.3551 
2023-11-23 04:48:37.419268: Pseudo dice [0.7256, nan] 
2023-11-23 04:48:37.419342: Epoch time: 55.62 s 
2023-11-23 04:48:38.528096:  
2023-11-23 04:48:38.528292: Epoch 727 
2023-11-23 04:48:38.528430: Current learning rate: 0.00311 
2023-11-23 04:49:34.163755: train_loss -0.3472 
2023-11-23 04:49:34.163968: val_loss -0.3592 
2023-11-23 04:49:34.164052: Pseudo dice [0.7254, nan] 
2023-11-23 04:49:34.164125: Epoch time: 55.64 s 
2023-11-23 04:49:35.277940:  
2023-11-23 04:49:35.278060: Epoch 728 
2023-11-23 04:49:35.278157: Current learning rate: 0.0031 
2023-11-23 04:50:30.800021: train_loss -0.3483 
2023-11-23 04:50:30.800235: val_loss -0.3355 
2023-11-23 04:50:30.800317: Pseudo dice [0.6804, nan] 
2023-11-23 04:50:30.800397: Epoch time: 55.52 s 
2023-11-23 04:50:31.914505:  
2023-11-23 04:50:31.914626: Epoch 729 
2023-11-23 04:50:31.914727: Current learning rate: 0.00309 
2023-11-23 04:51:27.645732: train_loss -0.3455 
2023-11-23 04:51:27.645936: val_loss -0.3524 
2023-11-23 04:51:27.646023: Pseudo dice [0.7132, nan] 
2023-11-23 04:51:27.646096: Epoch time: 55.73 s 
2023-11-23 04:51:28.760700:  
2023-11-23 04:51:28.760869: Epoch 730 
2023-11-23 04:51:28.760991: Current learning rate: 0.00308 
2023-11-23 04:52:24.322659: train_loss -0.3424 
2023-11-23 04:52:24.322874: val_loss -0.349 
2023-11-23 04:52:24.322982: Pseudo dice [0.6976, nan] 
2023-11-23 04:52:24.323060: Epoch time: 55.56 s 
2023-11-23 04:52:25.537934:  
2023-11-23 04:52:25.538076: Epoch 731 
2023-11-23 04:52:25.538182: Current learning rate: 0.00307 
2023-11-23 04:53:21.211764: train_loss -0.3553 
2023-11-23 04:53:21.212015: val_loss -0.3484 
2023-11-23 04:53:21.212094: Pseudo dice [0.7143, nan] 
2023-11-23 04:53:21.212174: Epoch time: 55.67 s 
2023-11-23 04:53:22.314283:  
2023-11-23 04:53:22.314434: Epoch 732 
2023-11-23 04:53:22.314559: Current learning rate: 0.00306 
2023-11-23 04:54:18.056006: train_loss -0.3466 
2023-11-23 04:54:18.056210: val_loss -0.3651 
2023-11-23 04:54:18.056313: Pseudo dice [0.739, nan] 
2023-11-23 04:54:18.056393: Epoch time: 55.74 s 
2023-11-23 04:54:19.167698:  
2023-11-23 04:54:19.167830: Epoch 733 
2023-11-23 04:54:19.167930: Current learning rate: 0.00305 
2023-11-23 04:55:14.787014: train_loss -0.3592 
2023-11-23 04:55:14.787193: val_loss -0.3674 
2023-11-23 04:55:14.787302: Pseudo dice [0.7504, nan] 
2023-11-23 04:55:14.787409: Epoch time: 55.62 s 
2023-11-23 04:55:15.920337:  
2023-11-23 04:55:15.920457: Epoch 734 
2023-11-23 04:55:15.920553: Current learning rate: 0.00304 
2023-11-23 04:56:11.551081: train_loss -0.3567 
2023-11-23 04:56:11.551285: val_loss -0.3465 
2023-11-23 04:56:11.551497: Pseudo dice [0.701, nan] 
2023-11-23 04:56:11.551579: Epoch time: 55.63 s 
2023-11-23 04:56:12.667806:  
2023-11-23 04:56:12.667928: Epoch 735 
2023-11-23 04:56:12.668028: Current learning rate: 0.00303 
2023-11-23 04:57:08.330099: train_loss -0.3726 
2023-11-23 04:57:08.330321: val_loss -0.3595 
2023-11-23 04:57:08.330400: Pseudo dice [0.7382, nan] 
2023-11-23 04:57:08.330479: Epoch time: 55.66 s 
2023-11-23 04:57:09.437668:  
2023-11-23 04:57:09.437793: Epoch 736 
2023-11-23 04:57:09.437891: Current learning rate: 0.00302 
2023-11-23 04:58:05.042200: train_loss -0.3648 
2023-11-23 04:58:05.042416: val_loss -0.3393 
2023-11-23 04:58:05.042502: Pseudo dice [0.6816, nan] 
2023-11-23 04:58:05.042580: Epoch time: 55.61 s 
2023-11-23 04:58:06.143144:  
2023-11-23 04:58:06.143269: Epoch 737 
2023-11-23 04:58:06.143371: Current learning rate: 0.00301 
2023-11-23 04:59:01.821686: train_loss -0.3366 
2023-11-23 04:59:01.821901: val_loss -0.3643 
2023-11-23 04:59:01.821984: Pseudo dice [0.7433, nan] 
2023-11-23 04:59:01.822057: Epoch time: 55.68 s 
2023-11-23 04:59:02.925749:  
2023-11-23 04:59:02.925872: Epoch 738 
2023-11-23 04:59:02.925972: Current learning rate: 0.003 
2023-11-23 04:59:58.628118: train_loss -0.3436 
2023-11-23 04:59:58.628309: val_loss -0.3616 
2023-11-23 04:59:58.628390: Pseudo dice [0.7351, nan] 
2023-11-23 04:59:58.628461: Epoch time: 55.7 s 
2023-11-23 04:59:59.731008:  
2023-11-23 04:59:59.731136: Epoch 739 
2023-11-23 04:59:59.731238: Current learning rate: 0.00299 
2023-11-23 05:00:55.311428: train_loss -0.3402 
2023-11-23 05:00:55.311622: val_loss -0.3481 
2023-11-23 05:00:55.311700: Pseudo dice [0.7051, nan] 
2023-11-23 05:00:55.311771: Epoch time: 55.58 s 
2023-11-23 05:00:56.412461:  
2023-11-23 05:00:56.412638: Epoch 740 
2023-11-23 05:00:56.412741: Current learning rate: 0.00297 
2023-11-23 05:01:52.063475: train_loss -0.3414 
2023-11-23 05:01:52.063721: val_loss -0.3419 
2023-11-23 05:01:52.063807: Pseudo dice [0.7007, nan] 
2023-11-23 05:01:52.063884: Epoch time: 55.65 s 
2023-11-23 05:01:53.165409:  
2023-11-23 05:01:53.165540: Epoch 741 
2023-11-23 05:01:53.165636: Current learning rate: 0.00296 
2023-11-23 05:02:48.890447: train_loss -0.3412 
2023-11-23 05:02:48.890637: val_loss -0.3656 
2023-11-23 05:02:48.890724: Pseudo dice [0.7465, nan] 
2023-11-23 05:02:48.890805: Epoch time: 55.73 s 
2023-11-23 05:02:49.993023:  
2023-11-23 05:02:49.993161: Epoch 742 
2023-11-23 05:02:49.993285: Current learning rate: 0.00295 
2023-11-23 05:03:45.657628: train_loss -0.3457 
2023-11-23 05:03:45.657826: val_loss -0.3722 
2023-11-23 05:03:45.657909: Pseudo dice [0.7513, nan] 
2023-11-23 05:03:45.657980: Epoch time: 55.67 s 
2023-11-23 05:03:46.764192:  
2023-11-23 05:03:46.764328: Epoch 743 
2023-11-23 05:03:46.764427: Current learning rate: 0.00294 
2023-11-23 05:04:42.369673: train_loss -0.3435 
2023-11-23 05:04:42.369866: val_loss -0.3561 
2023-11-23 05:04:42.369952: Pseudo dice [0.7266, nan] 
2023-11-23 05:04:42.370037: Epoch time: 55.61 s 
2023-11-23 05:04:43.581946:  
2023-11-23 05:04:43.582223: Epoch 744 
2023-11-23 05:04:43.582403: Current learning rate: 0.00293 
2023-11-23 05:05:39.186870: train_loss -0.3496 
2023-11-23 05:05:39.187060: val_loss -0.3573 
2023-11-23 05:05:39.187139: Pseudo dice [0.7311, nan] 
2023-11-23 05:05:39.187211: Epoch time: 55.61 s 
2023-11-23 05:05:40.294872:  
2023-11-23 05:05:40.295003: Epoch 745 
2023-11-23 05:05:40.295126: Current learning rate: 0.00292 
2023-11-23 05:06:35.930114: train_loss -0.3655 
2023-11-23 05:06:35.930307: val_loss -0.3535 
2023-11-23 05:06:35.930386: Pseudo dice [0.7227, nan] 
2023-11-23 05:06:35.930461: Epoch time: 55.64 s 
2023-11-23 05:06:37.030622:  
2023-11-23 05:06:37.030749: Epoch 746 
2023-11-23 05:06:37.030864: Current learning rate: 0.00291 
2023-11-23 05:07:32.626514: train_loss -0.3526 
2023-11-23 05:07:32.626709: val_loss -0.3186 
2023-11-23 05:07:32.626792: Pseudo dice [0.6323, nan] 
2023-11-23 05:07:32.626868: Epoch time: 55.6 s 
2023-11-23 05:07:33.732538:  
2023-11-23 05:07:33.732683: Epoch 747 
2023-11-23 05:07:33.732786: Current learning rate: 0.0029 
2023-11-23 05:08:29.392303: train_loss -0.3491 
2023-11-23 05:08:29.392513: val_loss -0.3292 
2023-11-23 05:08:29.392596: Pseudo dice [0.6737, nan] 
2023-11-23 05:08:29.392677: Epoch time: 55.66 s 
2023-11-23 05:08:30.490929:  
2023-11-23 05:08:30.491050: Epoch 748 
2023-11-23 05:08:30.491150: Current learning rate: 0.00289 
2023-11-23 05:09:26.086918: train_loss -0.351 
2023-11-23 05:09:26.087133: val_loss -0.321 
2023-11-23 05:09:26.087219: Pseudo dice [0.6313, nan] 
2023-11-23 05:09:26.087303: Epoch time: 55.6 s 
2023-11-23 05:09:27.289438:  
2023-11-23 05:09:27.289621: Epoch 749 
2023-11-23 05:09:27.289748: Current learning rate: 0.00288 
2023-11-23 05:10:23.014304: train_loss -0.3398 
2023-11-23 05:10:23.014531: val_loss -0.3491 
2023-11-23 05:10:23.014634: Pseudo dice [0.7082, nan] 
2023-11-23 05:10:23.014729: Epoch time: 55.73 s 
2023-11-23 05:10:24.258040:  
2023-11-23 05:10:24.258182: Epoch 750 
2023-11-23 05:10:24.258305: Current learning rate: 0.00287 
2023-11-23 05:11:19.847710: train_loss -0.3358 
2023-11-23 05:11:19.847917: val_loss -0.3559 
2023-11-23 05:11:19.848000: Pseudo dice [0.7233, nan] 
2023-11-23 05:11:19.848075: Epoch time: 55.59 s 
2023-11-23 05:11:20.948385:  
2023-11-23 05:11:20.948519: Epoch 751 
2023-11-23 05:11:20.948626: Current learning rate: 0.00286 
2023-11-23 05:12:16.483402: train_loss -0.3373 
2023-11-23 05:12:16.483620: val_loss -0.3317 
2023-11-23 05:12:16.483731: Pseudo dice [0.6786, nan] 
2023-11-23 05:12:16.483805: Epoch time: 55.54 s 
2023-11-23 05:12:17.598783:  
2023-11-23 05:12:17.598905: Epoch 752 
2023-11-23 05:12:17.599004: Current learning rate: 0.00285 
2023-11-23 05:13:13.097400: train_loss -0.3375 
2023-11-23 05:13:13.097625: val_loss -0.3525 
2023-11-23 05:13:13.097710: Pseudo dice [0.7142, nan] 
2023-11-23 05:13:13.097792: Epoch time: 55.5 s 
2023-11-23 05:13:14.309656:  
2023-11-23 05:13:14.309785: Epoch 753 
2023-11-23 05:13:14.309885: Current learning rate: 0.00284 
2023-11-23 05:14:09.875114: train_loss -0.3515 
2023-11-23 05:14:09.875304: val_loss -0.342 
2023-11-23 05:14:09.875407: Pseudo dice [0.7052, nan] 
2023-11-23 05:14:09.875482: Epoch time: 55.57 s 
2023-11-23 05:14:10.985076:  
2023-11-23 05:14:10.985216: Epoch 754 
2023-11-23 05:14:10.985385: Current learning rate: 0.00283 
2023-11-23 05:15:06.709513: train_loss -0.344 
2023-11-23 05:15:06.709708: val_loss -0.3302 
2023-11-23 05:15:06.709818: Pseudo dice [0.6867, nan] 
2023-11-23 05:15:06.709966: Epoch time: 55.73 s 
2023-11-23 05:15:07.816523:  
2023-11-23 05:15:07.816670: Epoch 755 
2023-11-23 05:15:07.816772: Current learning rate: 0.00282 
2023-11-23 05:16:03.575959: train_loss -0.346 
2023-11-23 05:16:03.576176: val_loss -0.3498 
2023-11-23 05:16:03.576258: Pseudo dice [0.7103, nan] 
2023-11-23 05:16:03.576331: Epoch time: 55.76 s 
2023-11-23 05:16:04.690393:  
2023-11-23 05:16:04.690625: Epoch 756 
2023-11-23 05:16:04.690734: Current learning rate: 0.00281 
2023-11-23 05:17:00.209699: train_loss -0.3542 
2023-11-23 05:17:00.209941: val_loss -0.347 
2023-11-23 05:17:00.210024: Pseudo dice [0.7065, nan] 
2023-11-23 05:17:00.210098: Epoch time: 55.52 s 
2023-11-23 05:17:01.306780:  
2023-11-23 05:17:01.306891: Epoch 757 
2023-11-23 05:17:01.307015: Current learning rate: 0.0028 
2023-11-23 05:17:56.810827: train_loss -0.3317 
2023-11-23 05:17:56.811024: val_loss -0.3309 
2023-11-23 05:17:56.811109: Pseudo dice [0.678, nan] 
2023-11-23 05:17:56.811183: Epoch time: 55.5 s 
2023-11-23 05:17:57.928476:  
2023-11-23 05:17:57.928615: Epoch 758 
2023-11-23 05:17:57.928744: Current learning rate: 0.00279 
2023-11-23 05:18:53.458277: train_loss -0.3486 
2023-11-23 05:18:53.458477: val_loss -0.3267 
2023-11-23 05:18:53.458557: Pseudo dice [0.6728, nan] 
2023-11-23 05:18:53.458628: Epoch time: 55.53 s 
2023-11-23 05:18:54.577341:  
2023-11-23 05:18:54.577469: Epoch 759 
2023-11-23 05:18:54.577579: Current learning rate: 0.00278 
2023-11-23 05:19:50.324963: train_loss -0.3344 
2023-11-23 05:19:50.325161: val_loss -0.3587 
2023-11-23 05:19:50.325242: Pseudo dice [0.7302, nan] 
2023-11-23 05:19:50.325315: Epoch time: 55.75 s 
2023-11-23 05:19:51.439243:  
2023-11-23 05:19:51.439424: Epoch 760 
2023-11-23 05:19:51.439567: Current learning rate: 0.00277 
2023-11-23 05:20:47.144483: train_loss -0.3452 
2023-11-23 05:20:47.144668: val_loss -0.3642 
2023-11-23 05:20:47.144752: Pseudo dice [0.7313, nan] 
2023-11-23 05:20:47.144832: Epoch time: 55.71 s 
2023-11-23 05:20:48.254981:  
2023-11-23 05:20:48.255104: Epoch 761 
2023-11-23 05:20:48.255239: Current learning rate: 0.00276 
2023-11-23 05:21:43.830965: train_loss -0.3419 
2023-11-23 05:21:43.831182: val_loss -0.3824 
2023-11-23 05:21:43.831275: Pseudo dice [0.7773, nan] 
2023-11-23 05:21:43.831350: Epoch time: 55.58 s 
2023-11-23 05:21:45.065742:  
2023-11-23 05:21:45.065862: Epoch 762 
2023-11-23 05:21:45.065982: Current learning rate: 0.00275 
2023-11-23 05:22:40.628956: train_loss -0.3453 
2023-11-23 05:22:40.629154: val_loss -0.3349 
2023-11-23 05:22:40.629255: Pseudo dice [0.6872, nan] 
2023-11-23 05:22:40.629331: Epoch time: 55.56 s 
2023-11-23 05:22:41.747881:  
2023-11-23 05:22:41.748001: Epoch 763 
2023-11-23 05:22:41.748103: Current learning rate: 0.00274 
2023-11-23 05:23:37.383756: train_loss -0.3473 
2023-11-23 05:23:37.383946: val_loss -0.3789 
2023-11-23 05:23:37.384028: Pseudo dice [0.7816, nan] 
2023-11-23 05:23:37.384103: Epoch time: 55.64 s 
2023-11-23 05:23:38.504255:  
2023-11-23 05:23:38.504379: Epoch 764 
2023-11-23 05:23:38.504482: Current learning rate: 0.00273 
2023-11-23 05:24:34.101846: train_loss -0.3479 
2023-11-23 05:24:34.102049: val_loss -0.35 
2023-11-23 05:24:34.102129: Pseudo dice [0.7084, nan] 
2023-11-23 05:24:34.102210: Epoch time: 55.6 s 
2023-11-23 05:24:35.223235:  
2023-11-23 05:24:35.223382: Epoch 765 
2023-11-23 05:24:35.223497: Current learning rate: 0.00272 
2023-11-23 05:25:30.994025: train_loss -0.3372 
2023-11-23 05:25:30.994244: val_loss -0.3227 
2023-11-23 05:25:30.994331: Pseudo dice [0.6565, nan] 
2023-11-23 05:25:30.994408: Epoch time: 55.77 s 
2023-11-23 05:25:32.106791:  
2023-11-23 05:25:32.106968: Epoch 766 
2023-11-23 05:25:32.107074: Current learning rate: 0.00271 
2023-11-23 05:26:27.672525: train_loss -0.3421 
2023-11-23 05:26:27.672762: val_loss -0.3401 
2023-11-23 05:26:27.672844: Pseudo dice [0.7024, nan] 
2023-11-23 05:26:27.672919: Epoch time: 55.57 s 
2023-11-23 05:26:28.789849:  
2023-11-23 05:26:28.790008: Epoch 767 
2023-11-23 05:26:28.790136: Current learning rate: 0.0027 
2023-11-23 05:27:24.413495: train_loss -0.3361 
2023-11-23 05:27:24.413702: val_loss -0.3616 
2023-11-23 05:27:24.413783: Pseudo dice [0.7355, nan] 
2023-11-23 05:27:24.413858: Epoch time: 55.62 s 
2023-11-23 05:27:25.530536:  
2023-11-23 05:27:25.530671: Epoch 768 
2023-11-23 05:27:25.530771: Current learning rate: 0.00268 
2023-11-23 05:28:21.304578: train_loss -0.3439 
2023-11-23 05:28:21.304771: val_loss -0.3621 
2023-11-23 05:28:21.304852: Pseudo dice [0.728, nan] 
2023-11-23 05:28:21.304923: Epoch time: 55.77 s 
2023-11-23 05:28:22.426396:  
2023-11-23 05:28:22.426624: Epoch 769 
2023-11-23 05:28:22.426766: Current learning rate: 0.00267 
2023-11-23 05:29:18.158217: train_loss -0.3367 
2023-11-23 05:29:18.158433: val_loss -0.3484 
2023-11-23 05:29:18.158517: Pseudo dice [0.699, nan] 
2023-11-23 05:29:18.158592: Epoch time: 55.73 s 
2023-11-23 05:29:19.271818:  
2023-11-23 05:29:19.271949: Epoch 770 
2023-11-23 05:29:19.272064: Current learning rate: 0.00266 
2023-11-23 05:30:14.811390: train_loss -0.3474 
2023-11-23 05:30:14.811568: val_loss -0.3596 
2023-11-23 05:30:14.811648: Pseudo dice [0.7328, nan] 
2023-11-23 05:30:14.811724: Epoch time: 55.54 s 
2023-11-23 05:30:16.060100:  
2023-11-23 05:30:16.060302: Epoch 771 
2023-11-23 05:30:16.060450: Current learning rate: 0.00265 
2023-11-23 05:31:11.723536: train_loss -0.3462 
2023-11-23 05:31:11.723722: val_loss -0.3395 
2023-11-23 05:31:11.723799: Pseudo dice [0.6952, nan] 
2023-11-23 05:31:11.723871: Epoch time: 55.66 s 
2023-11-23 05:31:12.847139:  
2023-11-23 05:31:12.847324: Epoch 772 
2023-11-23 05:31:12.847430: Current learning rate: 0.00264 
2023-11-23 05:32:08.532399: train_loss -0.3585 
2023-11-23 05:32:08.532618: val_loss -0.3281 
2023-11-23 05:32:08.532708: Pseudo dice [0.6809, nan] 
2023-11-23 05:32:08.532782: Epoch time: 55.69 s 
2023-11-23 05:32:09.659571:  
2023-11-23 05:32:09.659707: Epoch 773 
2023-11-23 05:32:09.659813: Current learning rate: 0.00263 
2023-11-23 05:33:05.305146: train_loss -0.3543 
2023-11-23 05:33:05.305343: val_loss -0.3604 
2023-11-23 05:33:05.305424: Pseudo dice [0.725, nan] 
2023-11-23 05:33:05.305496: Epoch time: 55.65 s 
2023-11-23 05:33:06.431451:  
2023-11-23 05:33:06.431574: Epoch 774 
2023-11-23 05:33:06.431673: Current learning rate: 0.00262 
2023-11-23 05:34:02.010451: train_loss -0.3408 
2023-11-23 05:34:02.010645: val_loss -0.3538 
2023-11-23 05:34:02.010740: Pseudo dice [0.717, nan] 
2023-11-23 05:34:02.010814: Epoch time: 55.58 s 
2023-11-23 05:34:03.127661:  
2023-11-23 05:34:03.127965: Epoch 775 
2023-11-23 05:34:03.128077: Current learning rate: 0.00261 
2023-11-23 05:34:58.746325: train_loss -0.3394 
2023-11-23 05:34:58.746523: val_loss -0.327 
2023-11-23 05:34:58.746604: Pseudo dice [0.665, nan] 
2023-11-23 05:34:58.746677: Epoch time: 55.62 s 
2023-11-23 05:34:59.880865:  
2023-11-23 05:34:59.880997: Epoch 776 
2023-11-23 05:34:59.881105: Current learning rate: 0.0026 
2023-11-23 05:35:55.556207: train_loss -0.3485 
2023-11-23 05:35:55.556396: val_loss -0.3537 
2023-11-23 05:35:55.556477: Pseudo dice [0.7216, nan] 
2023-11-23 05:35:55.556549: Epoch time: 55.68 s 
2023-11-23 05:35:56.678485:  
2023-11-23 05:35:56.678683: Epoch 777 
2023-11-23 05:35:56.678858: Current learning rate: 0.00259 
2023-11-23 05:36:52.237720: train_loss -0.3444 
2023-11-23 05:36:52.237908: val_loss -0.3359 
2023-11-23 05:36:52.237991: Pseudo dice [0.6885, nan] 
2023-11-23 05:36:52.238066: Epoch time: 55.56 s 
2023-11-23 05:36:53.366248:  
2023-11-23 05:36:53.366377: Epoch 778 
2023-11-23 05:36:53.366480: Current learning rate: 0.00258 
2023-11-23 05:37:48.943100: train_loss -0.344 
2023-11-23 05:37:48.943412: val_loss -0.3484 
2023-11-23 05:37:48.943513: Pseudo dice [0.7145, nan] 
2023-11-23 05:37:48.943612: Epoch time: 55.58 s 
2023-11-23 05:37:50.063179:  
2023-11-23 05:37:50.063336: Epoch 779 
2023-11-23 05:37:50.063448: Current learning rate: 0.00257 
2023-11-23 05:38:45.641611: train_loss -0.351 
2023-11-23 05:38:45.641790: val_loss -0.3356 
2023-11-23 05:38:45.641871: Pseudo dice [0.6791, nan] 
2023-11-23 05:38:45.641943: Epoch time: 55.58 s 
2023-11-23 05:38:46.868334:  
2023-11-23 05:38:46.868497: Epoch 780 
2023-11-23 05:38:46.868707: Current learning rate: 0.00256 
2023-11-23 05:39:42.524648: train_loss -0.3509 
2023-11-23 05:39:42.524843: val_loss -0.3445 
2023-11-23 05:39:42.524924: Pseudo dice [0.7067, nan] 
2023-11-23 05:39:42.524997: Epoch time: 55.66 s 
2023-11-23 05:39:43.661116:  
2023-11-23 05:39:43.661309: Epoch 781 
2023-11-23 05:39:43.661477: Current learning rate: 0.00255 
2023-11-23 05:40:39.277779: train_loss -0.3549 
2023-11-23 05:40:39.277997: val_loss -0.3537 
2023-11-23 05:40:39.278085: Pseudo dice [0.7237, nan] 
2023-11-23 05:40:39.278214: Epoch time: 55.62 s 
2023-11-23 05:40:40.405500:  
2023-11-23 05:40:40.405630: Epoch 782 
2023-11-23 05:40:40.405742: Current learning rate: 0.00254 
2023-11-23 05:41:36.035509: train_loss -0.3449 
2023-11-23 05:41:36.035754: val_loss -0.3592 
2023-11-23 05:41:36.035843: Pseudo dice [0.7338, nan] 
2023-11-23 05:41:36.035918: Epoch time: 55.63 s 
2023-11-23 05:41:37.146811:  
2023-11-23 05:41:37.146939: Epoch 783 
2023-11-23 05:41:37.147057: Current learning rate: 0.00253 
2023-11-23 05:42:32.691083: train_loss -0.3553 
2023-11-23 05:42:32.691286: val_loss -0.3553 
2023-11-23 05:42:32.691430: Pseudo dice [0.7211, nan] 
2023-11-23 05:42:32.691521: Epoch time: 55.55 s 
2023-11-23 05:42:33.934888:  
2023-11-23 05:42:33.935015: Epoch 784 
2023-11-23 05:42:33.935110: Current learning rate: 0.00252 
2023-11-23 05:43:29.697235: train_loss -0.3527 
2023-11-23 05:43:29.697476: val_loss -0.3534 
2023-11-23 05:43:29.697573: Pseudo dice [0.7206, nan] 
2023-11-23 05:43:29.697656: Epoch time: 55.76 s 
2023-11-23 05:43:30.814790:  
2023-11-23 05:43:30.814922: Epoch 785 
2023-11-23 05:43:30.815042: Current learning rate: 0.00251 
2023-11-23 05:44:26.523103: train_loss -0.3557 
2023-11-23 05:44:26.523301: val_loss -0.3696 
2023-11-23 05:44:26.523447: Pseudo dice [0.7537, nan] 
2023-11-23 05:44:26.523549: Epoch time: 55.71 s 
2023-11-23 05:44:27.676181:  
2023-11-23 05:44:27.676302: Epoch 786 
2023-11-23 05:44:27.676405: Current learning rate: 0.0025 
2023-11-23 05:45:23.375644: train_loss -0.3495 
2023-11-23 05:45:23.375852: val_loss -0.3681 
2023-11-23 05:45:23.375947: Pseudo dice [0.7289, nan] 
2023-11-23 05:45:23.376030: Epoch time: 55.7 s 
2023-11-23 05:45:24.512494:  
2023-11-23 05:45:24.512641: Epoch 787 
2023-11-23 05:45:24.512748: Current learning rate: 0.00249 
2023-11-23 05:46:20.148887: train_loss -0.347 
2023-11-23 05:46:20.149073: val_loss -0.3686 
2023-11-23 05:46:20.149157: Pseudo dice [0.7486, nan] 
2023-11-23 05:46:20.149227: Epoch time: 55.64 s 
2023-11-23 05:46:21.280180:  
2023-11-23 05:46:21.280343: Epoch 788 
2023-11-23 05:46:21.280456: Current learning rate: 0.00248 
2023-11-23 05:47:16.805027: train_loss -0.3529 
2023-11-23 05:47:16.805211: val_loss -0.3795 
2023-11-23 05:47:16.805289: Pseudo dice [0.7751, nan] 
2023-11-23 05:47:16.805360: Epoch time: 55.53 s 
2023-11-23 05:47:18.033914:  
2023-11-23 05:47:18.034050: Epoch 789 
2023-11-23 05:47:18.034147: Current learning rate: 0.00247 
2023-11-23 05:48:13.709125: train_loss -0.3303 
2023-11-23 05:48:13.709305: val_loss -0.3343 
2023-11-23 05:48:13.709383: Pseudo dice [0.6886, nan] 
2023-11-23 05:48:13.709460: Epoch time: 55.68 s 
2023-11-23 05:48:14.836522:  
2023-11-23 05:48:14.836736: Epoch 790 
2023-11-23 05:48:14.836882: Current learning rate: 0.00245 
2023-11-23 05:49:10.544733: train_loss -0.3433 
2023-11-23 05:49:10.544948: val_loss -0.3553 
2023-11-23 05:49:10.545023: Pseudo dice [0.7207, nan] 
2023-11-23 05:49:10.545092: Epoch time: 55.71 s 
2023-11-23 05:49:11.674975:  
2023-11-23 05:49:11.675092: Epoch 791 
2023-11-23 05:49:11.675188: Current learning rate: 0.00244 
2023-11-23 05:50:07.348553: train_loss -0.3446 
2023-11-23 05:50:07.348771: val_loss -0.329 
2023-11-23 05:50:07.348848: Pseudo dice [0.6626, nan] 
2023-11-23 05:50:07.348915: Epoch time: 55.67 s 
2023-11-23 05:50:08.461570:  
2023-11-23 05:50:08.461743: Epoch 792 
2023-11-23 05:50:08.461889: Current learning rate: 0.00243 
2023-11-23 05:51:04.141879: train_loss -0.3371 
2023-11-23 05:51:04.142064: val_loss -0.351 
2023-11-23 05:51:04.142140: Pseudo dice [0.7025, nan] 
2023-11-23 05:51:04.142210: Epoch time: 55.68 s 
2023-11-23 05:51:05.371709:  
2023-11-23 05:51:05.371889: Epoch 793 
2023-11-23 05:51:05.371993: Current learning rate: 0.00242 
2023-11-23 05:52:01.096671: train_loss -0.3339 
2023-11-23 05:52:01.096869: val_loss -0.3562 
2023-11-23 05:52:01.096948: Pseudo dice [0.7249, nan] 
2023-11-23 05:52:01.097017: Epoch time: 55.73 s 
2023-11-23 05:52:02.220493:  
2023-11-23 05:52:02.220812: Epoch 794 
2023-11-23 05:52:02.220943: Current learning rate: 0.00241 
2023-11-23 05:52:57.912734: train_loss -0.327 
2023-11-23 05:52:57.912918: val_loss -0.3541 
2023-11-23 05:52:57.913010: Pseudo dice [0.7148, nan] 
2023-11-23 05:52:57.913080: Epoch time: 55.69 s 
2023-11-23 05:52:59.031707:  
2023-11-23 05:52:59.031883: Epoch 795 
2023-11-23 05:52:59.032026: Current learning rate: 0.0024 
2023-11-23 05:53:54.617592: train_loss -0.3368 
2023-11-23 05:53:54.617781: val_loss -0.3825 
2023-11-23 05:53:54.617857: Pseudo dice [0.7699, nan] 
2023-11-23 05:53:54.617927: Epoch time: 55.59 s 
2023-11-23 05:53:55.743604:  
2023-11-23 05:53:55.743722: Epoch 796 
2023-11-23 05:53:55.743821: Current learning rate: 0.00239 
2023-11-23 05:54:51.375179: train_loss -0.3525 
2023-11-23 05:54:51.375365: val_loss -0.3517 
2023-11-23 05:54:51.375443: Pseudo dice [0.7183, nan] 
2023-11-23 05:54:51.375519: Epoch time: 55.63 s 
2023-11-23 05:54:52.496486:  
2023-11-23 05:54:52.496623: Epoch 797 
2023-11-23 05:54:52.496726: Current learning rate: 0.00238 
2023-11-23 05:55:48.131927: train_loss -0.3415 
2023-11-23 05:55:48.132153: val_loss -0.3535 
2023-11-23 05:55:48.132238: Pseudo dice [0.7069, nan] 
2023-11-23 05:55:48.132310: Epoch time: 55.64 s 
2023-11-23 05:55:49.280853:  
2023-11-23 05:55:49.280996: Epoch 798 
2023-11-23 05:55:49.281104: Current learning rate: 0.00237 
2023-11-23 05:56:44.961025: train_loss -0.3485 
2023-11-23 05:56:44.961229: val_loss -0.3464 
2023-11-23 05:56:44.961307: Pseudo dice [0.707, nan] 
2023-11-23 05:56:44.961377: Epoch time: 55.68 s 
2023-11-23 05:56:46.088225:  
2023-11-23 05:56:46.088353: Epoch 799 
2023-11-23 05:56:46.088459: Current learning rate: 0.00236 
2023-11-23 05:57:41.714080: train_loss -0.3382 
2023-11-23 05:57:41.714285: val_loss -0.34 
2023-11-23 05:57:41.714365: Pseudo dice [0.6928, nan] 
2023-11-23 05:57:41.714441: Epoch time: 55.63 s 
2023-11-23 05:57:42.975809:  
2023-11-23 05:57:42.975977: Epoch 800 
2023-11-23 05:57:42.976086: Current learning rate: 0.00235 
2023-11-23 05:58:38.709110: train_loss -0.3497 
2023-11-23 05:58:38.709344: val_loss -0.3681 
2023-11-23 05:58:38.709441: Pseudo dice [0.7452, nan] 
2023-11-23 05:58:38.709512: Epoch time: 55.73 s 
2023-11-23 05:58:39.848488:  
2023-11-23 05:58:39.848672: Epoch 801 
2023-11-23 05:58:39.848783: Current learning rate: 0.00234 
2023-11-23 05:59:35.489132: train_loss -0.3485 
2023-11-23 05:59:35.489306: val_loss -0.377 
2023-11-23 05:59:35.489380: Pseudo dice [0.7703, nan] 
2023-11-23 05:59:35.489450: Epoch time: 55.64 s 
2023-11-23 05:59:36.706019:  
2023-11-23 05:59:36.706144: Epoch 802 
2023-11-23 05:59:36.706242: Current learning rate: 0.00233 
2023-11-23 06:00:46.177326: train_loss -0.3533 
2023-11-23 06:00:46.177504: val_loss -0.3446 
2023-11-23 06:00:46.177616: Pseudo dice [0.7111, nan] 
2023-11-23 06:00:46.177710: Epoch time: 69.47 s 
2023-11-23 06:00:47.308377:  
2023-11-23 06:00:47.308526: Epoch 803 
2023-11-23 06:00:47.308687: Current learning rate: 0.00232 
2023-11-23 06:02:07.126832: train_loss -0.366 
2023-11-23 06:02:07.127028: val_loss -0.3526 
2023-11-23 06:02:07.127108: Pseudo dice [0.7287, nan] 
2023-11-23 06:02:07.127194: Epoch time: 79.82 s 
2023-11-23 06:02:08.258625:  
2023-11-23 06:02:08.258803: Epoch 804 
2023-11-23 06:02:08.258916: Current learning rate: 0.00231 
2023-11-23 06:03:26.408211: train_loss -0.3619 
2023-11-23 06:03:26.408394: val_loss -0.3653 
2023-11-23 06:03:26.408500: Pseudo dice [0.7482, nan] 
2023-11-23 06:03:26.408585: Epoch time: 78.15 s 
2023-11-23 06:03:27.537915:  
2023-11-23 06:03:27.538037: Epoch 805 
2023-11-23 06:03:27.538136: Current learning rate: 0.0023 
2023-11-23 06:04:45.489151: train_loss -0.36 
2023-11-23 06:04:45.489462: val_loss -0.3796 
2023-11-23 06:04:45.489657: Pseudo dice [0.7751, nan] 
2023-11-23 06:04:45.489819: Epoch time: 77.95 s 
2023-11-23 06:04:45.489953: Yayy! New best EMA pseudo Dice: 0.7305 
2023-11-23 06:04:46.752376:  
2023-11-23 06:04:46.752534: Epoch 806 
2023-11-23 06:04:46.752721: Current learning rate: 0.00229 
2023-11-23 06:06:04.834171: train_loss -0.3499 
2023-11-23 06:06:04.834365: val_loss -0.3566 
2023-11-23 06:06:04.834583: Pseudo dice [0.7279, nan] 
2023-11-23 06:06:04.834657: Epoch time: 78.08 s 
2023-11-23 06:06:06.075913:  
2023-11-23 06:06:06.076063: Epoch 807 
2023-11-23 06:06:06.076172: Current learning rate: 0.00228 
2023-11-23 06:07:23.989450: train_loss -0.3574 
2023-11-23 06:07:23.989634: val_loss -0.3642 
2023-11-23 06:07:23.989714: Pseudo dice [0.7471, nan] 
2023-11-23 06:07:23.989783: Epoch time: 77.91 s 
2023-11-23 06:07:23.989849: Yayy! New best EMA pseudo Dice: 0.7319 
2023-11-23 06:07:25.245680:  
2023-11-23 06:07:25.245802: Epoch 808 
2023-11-23 06:07:25.245900: Current learning rate: 0.00226 
2023-11-23 06:08:42.782379: train_loss -0.3576 
2023-11-23 06:08:42.782571: val_loss -0.3535 
2023-11-23 06:08:42.782648: Pseudo dice [0.7199, nan] 
2023-11-23 06:08:42.782719: Epoch time: 77.54 s 
2023-11-23 06:08:43.902749:  
2023-11-23 06:08:43.902914: Epoch 809 
2023-11-23 06:08:43.903085: Current learning rate: 0.00225 
2023-11-23 06:10:01.016495: train_loss -0.3425 
2023-11-23 06:10:01.016768: val_loss -0.3335 
2023-11-23 06:10:01.016883: Pseudo dice [0.6819, nan] 
2023-11-23 06:10:01.016976: Epoch time: 77.11 s 
2023-11-23 06:10:02.147201:  
2023-11-23 06:10:02.147323: Epoch 810 
2023-11-23 06:10:02.147430: Current learning rate: 0.00224 
2023-11-23 06:11:18.802808: train_loss -0.3588 
2023-11-23 06:11:18.803005: val_loss -0.3578 
2023-11-23 06:11:18.803085: Pseudo dice [0.7202, nan] 
2023-11-23 06:11:18.803159: Epoch time: 76.66 s 
2023-11-23 06:11:20.035320:  
2023-11-23 06:11:20.035481: Epoch 811 
2023-11-23 06:11:20.035633: Current learning rate: 0.00223 
2023-11-23 06:12:37.157533: train_loss -0.3644 
2023-11-23 06:12:37.157718: val_loss -0.3367 
2023-11-23 06:12:37.157795: Pseudo dice [0.6819, nan] 
2023-11-23 06:12:37.157870: Epoch time: 77.12 s 
2023-11-23 06:12:38.369954:  
2023-11-23 06:12:38.370145: Epoch 812 
2023-11-23 06:12:38.370291: Current learning rate: 0.00222 
2023-11-23 06:13:55.324176: train_loss -0.3601 
2023-11-23 06:13:55.324398: val_loss -0.3628 
2023-11-23 06:13:55.324478: Pseudo dice [0.7529, nan] 
2023-11-23 06:13:55.324548: Epoch time: 76.95 s 
2023-11-23 06:13:56.460887:  
2023-11-23 06:13:56.461070: Epoch 813 
2023-11-23 06:13:56.461212: Current learning rate: 0.00221 
2023-11-23 06:15:12.808764: train_loss -0.3331 
2023-11-23 06:15:12.808979: val_loss -0.3279 
2023-11-23 06:15:12.809060: Pseudo dice [0.6859, nan] 
2023-11-23 06:15:12.809130: Epoch time: 76.35 s 
2023-11-23 06:15:13.933909:  
2023-11-23 06:15:13.934027: Epoch 814 
2023-11-23 06:15:13.934130: Current learning rate: 0.0022 
2023-11-23 06:16:30.569057: train_loss -0.3386 
2023-11-23 06:16:30.569251: val_loss -0.3637 
2023-11-23 06:16:30.569328: Pseudo dice [0.7414, nan] 
2023-11-23 06:16:30.569394: Epoch time: 76.64 s 
2023-11-23 06:16:31.686619:  
2023-11-23 06:16:31.686790: Epoch 815 
2023-11-23 06:16:31.686933: Current learning rate: 0.00219 
2023-11-23 06:17:48.716199: train_loss -0.3371 
2023-11-23 06:17:48.716383: val_loss -0.3433 
2023-11-23 06:17:48.716459: Pseudo dice [0.7005, nan] 
2023-11-23 06:17:48.716528: Epoch time: 77.03 s 
2023-11-23 06:17:49.943196:  
2023-11-23 06:17:49.943317: Epoch 816 
2023-11-23 06:17:49.943422: Current learning rate: 0.00218 
2023-11-23 06:19:06.836457: train_loss -0.3591 
2023-11-23 06:19:06.836684: val_loss -0.3473 
2023-11-23 06:19:06.836778: Pseudo dice [0.7157, nan] 
2023-11-23 06:19:06.836860: Epoch time: 76.89 s 
2023-11-23 06:19:07.967674:  
2023-11-23 06:19:07.967849: Epoch 817 
2023-11-23 06:19:07.967969: Current learning rate: 0.00217 
2023-11-23 06:20:25.512469: train_loss -0.3435 
2023-11-23 06:20:25.512706: val_loss -0.3546 
2023-11-23 06:20:25.512791: Pseudo dice [0.7219, nan] 
2023-11-23 06:20:25.512870: Epoch time: 77.55 s 
2023-11-23 06:20:26.630123:  
2023-11-23 06:20:26.630277: Epoch 818 
2023-11-23 06:20:26.630381: Current learning rate: 0.00216 
2023-11-23 06:21:44.258368: train_loss -0.3604 
2023-11-23 06:21:44.258592: val_loss -0.3844 
2023-11-23 06:21:44.258675: Pseudo dice [0.7814, nan] 
2023-11-23 06:21:44.258766: Epoch time: 77.63 s 
2023-11-23 06:21:45.387838:  
2023-11-23 06:21:45.387962: Epoch 819 
2023-11-23 06:21:45.388065: Current learning rate: 0.00215 
2023-11-23 06:23:02.542372: train_loss -0.3519 
2023-11-23 06:23:02.542546: val_loss -0.3533 
2023-11-23 06:23:02.542627: Pseudo dice [0.7084, nan] 
2023-11-23 06:23:02.542701: Epoch time: 77.16 s 
2023-11-23 06:23:03.606855:  
2023-11-23 06:23:03.607046: Epoch 820 
2023-11-23 06:23:03.607156: Current learning rate: 0.00214 
2023-11-23 06:24:20.936696: train_loss -0.3398 
2023-11-23 06:24:20.936891: val_loss -0.3447 
2023-11-23 06:24:20.936995: Pseudo dice [0.7051, nan] 
2023-11-23 06:24:20.937079: Epoch time: 77.33 s 
2023-11-23 06:24:22.204026:  
2023-11-23 06:24:22.204154: Epoch 821 
2023-11-23 06:24:22.204268: Current learning rate: 0.00213 
2023-11-23 06:25:39.357027: train_loss -0.3443 
2023-11-23 06:25:39.357248: val_loss -0.3409 
2023-11-23 06:25:39.357332: Pseudo dice [0.6921, nan] 
2023-11-23 06:25:39.357419: Epoch time: 77.15 s 
2023-11-23 06:25:40.422938:  
2023-11-23 06:25:40.423102: Epoch 822 
2023-11-23 06:25:40.423214: Current learning rate: 0.00212 
2023-11-23 06:26:57.683507: train_loss -0.3569 
2023-11-23 06:26:57.683698: val_loss -0.3628 
2023-11-23 06:26:57.683780: Pseudo dice [0.7329, nan] 
2023-11-23 06:26:57.683851: Epoch time: 77.26 s 
2023-11-23 06:26:58.755005:  
2023-11-23 06:26:58.755194: Epoch 823 
2023-11-23 06:26:58.755337: Current learning rate: 0.0021 
2023-11-23 06:28:15.511277: train_loss -0.3503 
2023-11-23 06:28:15.511517: val_loss -0.3621 
2023-11-23 06:28:15.511606: Pseudo dice [0.7418, nan] 
2023-11-23 06:28:15.511684: Epoch time: 76.76 s 
2023-11-23 06:28:16.580661:  
2023-11-23 06:28:16.580786: Epoch 824 
2023-11-23 06:28:16.580889: Current learning rate: 0.00209 
2023-11-23 06:29:33.673740: train_loss -0.3412 
2023-11-23 06:29:33.673931: val_loss -0.3729 
2023-11-23 06:29:33.674024: Pseudo dice [0.7496, nan] 
2023-11-23 06:29:33.674105: Epoch time: 77.09 s 
2023-11-23 06:29:34.737379:  
2023-11-23 06:29:34.737501: Epoch 825 
2023-11-23 06:29:34.737598: Current learning rate: 0.00208 
2023-11-23 06:30:52.289655: train_loss -0.3434 
2023-11-23 06:30:52.289849: val_loss -0.3581 
2023-11-23 06:30:52.289942: Pseudo dice [0.7268, nan] 
2023-11-23 06:30:52.290017: Epoch time: 77.55 s 
2023-11-23 06:30:53.355573:  
2023-11-23 06:30:53.355697: Epoch 826 
2023-11-23 06:30:53.355817: Current learning rate: 0.00207 
2023-11-23 06:32:10.804614: train_loss -0.3564 
2023-11-23 06:32:10.804794: val_loss -0.3526 
2023-11-23 06:32:10.804875: Pseudo dice [0.7195, nan] 
2023-11-23 06:32:10.804960: Epoch time: 77.45 s 
2023-11-23 06:32:11.870390:  
2023-11-23 06:32:11.870576: Epoch 827 
2023-11-23 06:32:11.870686: Current learning rate: 0.00206 
2023-11-23 06:33:29.278964: train_loss -0.3471 
2023-11-23 06:33:29.279158: val_loss -0.3654 
2023-11-23 06:33:29.279235: Pseudo dice [0.7444, nan] 
2023-11-23 06:33:29.279313: Epoch time: 77.41 s 
2023-11-23 06:33:30.346656:  
2023-11-23 06:33:30.346835: Epoch 828 
2023-11-23 06:33:30.346975: Current learning rate: 0.00205 
2023-11-23 06:34:48.159974: train_loss -0.3556 
2023-11-23 06:34:48.160157: val_loss -0.3468 
2023-11-23 06:34:48.160262: Pseudo dice [0.7071, nan] 
2023-11-23 06:34:48.160335: Epoch time: 77.81 s 
2023-11-23 06:34:49.437860:  
2023-11-23 06:34:49.438000: Epoch 829 
2023-11-23 06:34:49.438107: Current learning rate: 0.00204 
2023-11-23 06:36:07.205567: train_loss -0.3524 
2023-11-23 06:36:07.205765: val_loss -0.3744 
2023-11-23 06:36:07.205843: Pseudo dice [0.7742, nan] 
2023-11-23 06:36:07.205918: Epoch time: 77.77 s 
2023-11-23 06:36:08.276279:  
2023-11-23 06:36:08.276403: Epoch 830 
2023-11-23 06:36:08.276509: Current learning rate: 0.00203 
2023-11-23 06:37:25.990975: train_loss -0.3485 
2023-11-23 06:37:25.991171: val_loss -0.3454 
2023-11-23 06:37:25.991284: Pseudo dice [0.7008, nan] 
2023-11-23 06:37:25.991369: Epoch time: 77.72 s 
2023-11-23 06:37:27.067439:  
2023-11-23 06:37:27.067571: Epoch 831 
2023-11-23 06:37:27.067672: Current learning rate: 0.00202 
2023-11-23 06:38:44.833196: train_loss -0.3567 
2023-11-23 06:38:44.833444: val_loss -0.3503 
2023-11-23 06:38:44.833529: Pseudo dice [0.7072, nan] 
2023-11-23 06:38:44.833607: Epoch time: 77.77 s 
2023-11-23 06:38:46.007010:  
2023-11-23 06:38:46.007140: Epoch 832 
2023-11-23 06:38:46.007236: Current learning rate: 0.00201 
2023-11-23 06:40:03.594789: train_loss -0.3532 
2023-11-23 06:40:03.594987: val_loss -0.3552 
2023-11-23 06:40:03.595067: Pseudo dice [0.726, nan] 
2023-11-23 06:40:03.595139: Epoch time: 77.59 s 
2023-11-23 06:40:04.662105:  
2023-11-23 06:40:04.662292: Epoch 833 
2023-11-23 06:40:04.662405: Current learning rate: 0.002 
2023-11-23 06:41:22.116840: train_loss -0.3758 
2023-11-23 06:41:22.117182: val_loss -0.3487 
2023-11-23 06:41:22.117266: Pseudo dice [0.703, nan] 
2023-11-23 06:41:22.117340: Epoch time: 77.46 s 
2023-11-23 06:41:23.187930:  
2023-11-23 06:41:23.188062: Epoch 834 
2023-11-23 06:41:23.188160: Current learning rate: 0.00199 
2023-11-23 06:42:40.668962: train_loss -0.3539 
2023-11-23 06:42:40.669201: val_loss -0.3513 
2023-11-23 06:42:40.669285: Pseudo dice [0.7049, nan] 
2023-11-23 06:42:40.669359: Epoch time: 77.48 s 
2023-11-23 06:42:41.738583:  
2023-11-23 06:42:41.738712: Epoch 835 
2023-11-23 06:42:41.738819: Current learning rate: 0.00198 
2023-11-23 06:43:58.885573: train_loss -0.3528 
2023-11-23 06:43:58.885761: val_loss -0.3467 
2023-11-23 06:43:58.885838: Pseudo dice [0.706, nan] 
2023-11-23 06:43:58.885921: Epoch time: 77.15 s 
2023-11-23 06:44:00.065600:  
2023-11-23 06:44:00.065799: Epoch 836 
2023-11-23 06:44:00.065914: Current learning rate: 0.00196 
2023-11-23 06:45:17.541263: train_loss -0.3556 
2023-11-23 06:45:17.541450: val_loss -0.3415 
2023-11-23 06:45:17.541655: Pseudo dice [0.6928, nan] 
2023-11-23 06:45:17.541734: Epoch time: 77.48 s 
2023-11-23 06:45:18.609442:  
2023-11-23 06:45:18.609620: Epoch 837 
2023-11-23 06:45:18.609727: Current learning rate: 0.00195 
2023-11-23 06:46:36.074685: train_loss -0.3512 
2023-11-23 06:46:36.074862: val_loss -0.3417 
2023-11-23 06:46:36.074941: Pseudo dice [0.6972, nan] 
2023-11-23 06:46:36.075012: Epoch time: 77.47 s 
2023-11-23 06:46:37.144045:  
2023-11-23 06:46:37.144221: Epoch 838 
2023-11-23 06:46:37.144351: Current learning rate: 0.00194 
2023-11-23 06:47:54.754482: train_loss -0.3604 
2023-11-23 06:47:54.754713: val_loss -0.2965 
2023-11-23 06:47:54.754838: Pseudo dice [0.6052, nan] 
2023-11-23 06:47:54.754911: Epoch time: 77.61 s 
2023-11-23 06:47:55.816912:  
2023-11-23 06:47:55.817044: Epoch 839 
2023-11-23 06:47:55.817144: Current learning rate: 0.00193 
2023-11-23 06:49:12.729484: train_loss -0.3485 
2023-11-23 06:49:12.729699: val_loss -0.3618 
2023-11-23 06:49:12.729808: Pseudo dice [0.7307, nan] 
2023-11-23 06:49:12.729900: Epoch time: 76.91 s 
2023-11-23 06:49:13.804401:  
2023-11-23 06:49:13.804528: Epoch 840 
2023-11-23 06:49:13.804680: Current learning rate: 0.00192 
2023-11-23 06:50:31.025745: train_loss -0.3517 
2023-11-23 06:50:31.025967: val_loss -0.3253 
2023-11-23 06:50:31.026050: Pseudo dice [0.6734, nan] 
2023-11-23 06:50:31.026123: Epoch time: 77.22 s 
2023-11-23 06:50:32.099670:  
2023-11-23 06:50:32.099797: Epoch 841 
2023-11-23 06:50:32.099895: Current learning rate: 0.00191 
2023-11-23 06:51:49.311312: train_loss -0.3495 
2023-11-23 06:51:49.311500: val_loss -0.3244 
2023-11-23 06:51:49.311576: Pseudo dice [0.6589, nan] 
2023-11-23 06:51:49.311652: Epoch time: 77.21 s 
2023-11-23 06:51:50.382514:  
2023-11-23 06:51:50.382632: Epoch 842 
2023-11-23 06:51:50.382746: Current learning rate: 0.0019 
2023-11-23 06:53:07.866104: train_loss -0.3607 
2023-11-23 06:53:07.866300: val_loss -0.331 
2023-11-23 06:53:07.866383: Pseudo dice [0.6922, nan] 
2023-11-23 06:53:07.866457: Epoch time: 77.48 s 
2023-11-23 06:53:08.935720:  
2023-11-23 06:53:08.935839: Epoch 843 
2023-11-23 06:53:08.935939: Current learning rate: 0.00189 
2023-11-23 06:54:26.568768: train_loss -0.3386 
2023-11-23 06:54:26.569098: val_loss -0.3514 
2023-11-23 06:54:26.569331: Pseudo dice [0.7146, nan] 
2023-11-23 06:54:26.569487: Epoch time: 77.63 s 
2023-11-23 06:54:27.724579:  
2023-11-23 06:54:27.724791: Epoch 844 
2023-11-23 06:54:27.725017: Current learning rate: 0.00188 
2023-11-23 06:55:45.441588: train_loss -0.3556 
2023-11-23 06:55:45.441794: val_loss -0.3509 
2023-11-23 06:55:45.441877: Pseudo dice [0.7189, nan] 
2023-11-23 06:55:45.441957: Epoch time: 77.72 s 
2023-11-23 06:55:46.506843:  
2023-11-23 06:55:46.507121: Epoch 845 
2023-11-23 06:55:46.507313: Current learning rate: 0.00187 
2023-11-23 06:57:03.759114: train_loss -0.3455 
2023-11-23 06:57:03.759309: val_loss -0.3482 
2023-11-23 06:57:03.759396: Pseudo dice [0.7042, nan] 
2023-11-23 06:57:03.759471: Epoch time: 77.25 s 
2023-11-23 06:57:04.825351:  
2023-11-23 06:57:04.825475: Epoch 846 
2023-11-23 06:57:04.825572: Current learning rate: 0.00186 
2023-11-23 06:58:21.722079: train_loss -0.3451 
2023-11-23 06:58:21.722270: val_loss -0.3554 
2023-11-23 06:58:21.722351: Pseudo dice [0.7316, nan] 
2023-11-23 06:58:21.722424: Epoch time: 76.9 s 
2023-11-23 06:58:22.797155:  
2023-11-23 06:58:22.797354: Epoch 847 
2023-11-23 06:58:22.797498: Current learning rate: 0.00185 
2023-11-23 06:59:40.102819: train_loss -0.3468 
2023-11-23 06:59:40.103020: val_loss -0.3421 
2023-11-23 06:59:40.103099: Pseudo dice [0.6877, nan] 
2023-11-23 06:59:40.103179: Epoch time: 77.31 s 
2023-11-23 06:59:41.171346:  
2023-11-23 06:59:41.171508: Epoch 848 
2023-11-23 06:59:41.171696: Current learning rate: 0.00184 
2023-11-23 07:02:27.725616: train_loss -0.3462 
2023-11-23 07:02:27.725831: val_loss -0.3561 
2023-11-23 07:02:27.725953: Pseudo dice [0.7216, nan] 
2023-11-23 07:02:27.726029: Epoch time: 166.55 s 
2023-11-23 07:02:28.807235:  
2023-11-23 07:02:28.807365: Epoch 849 
2023-11-23 07:02:28.807471: Current learning rate: 0.00182 
2023-11-23 07:05:46.461832: train_loss -0.3576 
2023-11-23 07:05:46.462042: val_loss -0.3548 
2023-11-23 07:05:46.462147: Pseudo dice [0.7118, nan] 
2023-11-23 07:05:46.462222: Epoch time: 197.66 s 
2023-11-23 07:05:47.662908:  
2023-11-23 07:05:47.663126: Epoch 850 
2023-11-23 07:05:47.663254: Current learning rate: 0.00181 
2023-11-23 07:09:03.707472: train_loss -0.3665 
2023-11-23 07:09:03.707672: val_loss -0.3468 
2023-11-23 07:09:03.707753: Pseudo dice [0.7161, nan] 
2023-11-23 07:09:03.707828: Epoch time: 196.05 s 
2023-11-23 07:09:04.764432:  
2023-11-23 07:09:04.764601: Epoch 851 
2023-11-23 07:09:04.764768: Current learning rate: 0.0018 
2023-11-23 07:12:21.097066: train_loss -0.3641 
2023-11-23 07:12:21.097297: val_loss -0.3786 
2023-11-23 07:12:21.097385: Pseudo dice [0.7786, nan] 
2023-11-23 07:12:21.097462: Epoch time: 196.33 s 
2023-11-23 07:12:22.168885:  
2023-11-23 07:12:22.169023: Epoch 852 
2023-11-23 07:12:22.169131: Current learning rate: 0.00179 
2023-11-23 07:15:38.530195: train_loss -0.3612 
2023-11-23 07:15:38.530396: val_loss -0.3644 
2023-11-23 07:15:38.530477: Pseudo dice [0.7415, nan] 
2023-11-23 07:15:38.530552: Epoch time: 196.36 s 
2023-11-23 07:15:39.707072:  
2023-11-23 07:15:39.707281: Epoch 853 
2023-11-23 07:15:39.707412: Current learning rate: 0.00178 
2023-11-23 07:18:56.901932: train_loss -0.3579 
2023-11-23 07:18:56.902129: val_loss -0.3741 
2023-11-23 07:18:56.902208: Pseudo dice [0.7552, nan] 
2023-11-23 07:18:56.902280: Epoch time: 197.2 s 
2023-11-23 07:18:57.959670:  
2023-11-23 07:18:57.959922: Epoch 854 
2023-11-23 07:18:57.960041: Current learning rate: 0.00177 
2023-11-23 07:22:15.159175: train_loss -0.3571 
2023-11-23 07:22:15.159432: val_loss -0.3532 
2023-11-23 07:22:15.159516: Pseudo dice [0.7015, nan] 
2023-11-23 07:22:15.159592: Epoch time: 197.2 s 
2023-11-23 07:22:16.220866:  
2023-11-23 07:22:16.221012: Epoch 855 
2023-11-23 07:22:16.221130: Current learning rate: 0.00176 
2023-11-23 07:25:33.182432: train_loss -0.3623 
2023-11-23 07:25:33.182646: val_loss -0.3554 
2023-11-23 07:25:33.182731: Pseudo dice [0.7161, nan] 
2023-11-23 07:25:33.182811: Epoch time: 196.96 s 
2023-11-23 07:25:34.241032:  
2023-11-23 07:25:34.241159: Epoch 856 
2023-11-23 07:25:34.241279: Current learning rate: 0.00175 
2023-11-23 07:28:51.073699: train_loss -0.3607 
2023-11-23 07:28:51.073921: val_loss -0.3714 
2023-11-23 07:28:51.074019: Pseudo dice [0.766, nan] 
2023-11-23 07:28:51.074093: Epoch time: 196.83 s 
2023-11-23 07:28:52.140939:  
2023-11-23 07:28:52.141067: Epoch 857 
2023-11-23 07:28:52.141195: Current learning rate: 0.00174 
2023-11-23 07:32:08.388458: train_loss -0.3505 
2023-11-23 07:32:08.388694: val_loss -0.3308 
2023-11-23 07:32:08.388793: Pseudo dice [0.6774, nan] 
2023-11-23 07:32:08.388873: Epoch time: 196.25 s 
2023-11-23 07:32:09.447837:  
2023-11-23 07:32:09.447959: Epoch 858 
2023-11-23 07:32:09.448056: Current learning rate: 0.00173 
2023-11-23 07:35:26.390308: train_loss -0.3544 
2023-11-23 07:35:26.390499: val_loss -0.3758 
2023-11-23 07:35:26.390604: Pseudo dice [0.7617, nan] 
2023-11-23 07:35:26.390698: Epoch time: 196.94 s 
2023-11-23 07:35:27.450316:  
2023-11-23 07:35:27.450500: Epoch 859 
2023-11-23 07:35:27.450659: Current learning rate: 0.00172 
2023-11-23 07:38:43.670738: train_loss -0.3583 
2023-11-23 07:38:43.670924: val_loss -0.3386 
2023-11-23 07:38:43.671002: Pseudo dice [0.6798, nan] 
2023-11-23 07:38:43.671073: Epoch time: 196.22 s 
2023-11-23 07:38:44.725586:  
2023-11-23 07:38:44.725738: Epoch 860 
2023-11-23 07:38:44.725860: Current learning rate: 0.0017 
2023-11-23 07:42:00.172936: train_loss -0.3731 
2023-11-23 07:42:00.173132: val_loss -0.3595 
2023-11-23 07:42:00.173236: Pseudo dice [0.7424, nan] 
2023-11-23 07:42:00.173313: Epoch time: 195.45 s 
2023-11-23 07:42:01.235269:  
2023-11-23 07:42:01.235609: Epoch 861 
2023-11-23 07:42:01.235726: Current learning rate: 0.00169 
2023-11-23 07:45:16.559633: train_loss -0.3678 
2023-11-23 07:45:16.559850: val_loss -0.3569 
2023-11-23 07:45:16.559973: Pseudo dice [0.7351, nan] 
2023-11-23 07:45:16.560051: Epoch time: 195.33 s 
2023-11-23 07:45:17.737725:  
2023-11-23 07:45:17.737910: Epoch 862 
2023-11-23 07:45:17.738058: Current learning rate: 0.00168 
2023-11-23 07:48:33.450715: train_loss -0.3606 
2023-11-23 07:48:33.450916: val_loss -0.3593 
2023-11-23 07:48:33.450998: Pseudo dice [0.7398, nan] 
2023-11-23 07:48:33.451072: Epoch time: 195.71 s 
2023-11-23 07:48:34.514188:  
2023-11-23 07:48:34.514375: Epoch 863 
2023-11-23 07:48:34.514527: Current learning rate: 0.00167 
2023-11-23 07:51:50.663070: train_loss -0.3552 
2023-11-23 07:51:50.663301: val_loss -0.3494 
2023-11-23 07:51:50.663383: Pseudo dice [0.7177, nan] 
2023-11-23 07:51:50.663455: Epoch time: 196.15 s 
2023-11-23 07:51:51.722871:  
2023-11-23 07:51:51.723054: Epoch 864 
2023-11-23 07:51:51.723200: Current learning rate: 0.00166 
2023-11-23 07:55:07.818563: train_loss -0.3587 
2023-11-23 07:55:07.818760: val_loss -0.3569 
2023-11-23 07:55:07.818837: Pseudo dice [0.7243, nan] 
2023-11-23 07:55:07.818907: Epoch time: 196.1 s 
2023-11-23 07:55:08.877575:  
2023-11-23 07:55:08.877697: Epoch 865 
2023-11-23 07:55:08.877798: Current learning rate: 0.00165 
2023-11-23 07:58:24.222638: train_loss -0.3583 
2023-11-23 07:58:24.222846: val_loss -0.3524 
2023-11-23 07:58:24.222926: Pseudo dice [0.7179, nan] 
2023-11-23 07:58:24.222995: Epoch time: 195.35 s 
2023-11-23 07:58:25.278966:  
2023-11-23 07:58:25.279142: Epoch 866 
2023-11-23 07:58:25.279278: Current learning rate: 0.00164 
2023-11-23 08:00:29.836035: train_loss -0.353 
2023-11-23 08:00:29.836255: val_loss -0.3685 
2023-11-23 08:00:29.836338: Pseudo dice [0.7497, nan] 
2023-11-23 08:00:29.836409: Epoch time: 124.56 s 
2023-11-23 08:00:30.906104:  
2023-11-23 08:00:30.906364: Epoch 867 
2023-11-23 08:00:30.906562: Current learning rate: 0.00163 
2023-11-23 08:01:26.406337: train_loss -0.3489 
2023-11-23 08:01:26.406539: val_loss -0.3406 
2023-11-23 08:01:26.406620: Pseudo dice [0.6958, nan] 
2023-11-23 08:01:26.406698: Epoch time: 55.5 s 
2023-11-23 08:01:27.475311:  
2023-11-23 08:01:27.475435: Epoch 868 
2023-11-23 08:01:27.475546: Current learning rate: 0.00162 
2023-11-23 08:02:23.299915: train_loss -0.3548 
2023-11-23 08:02:23.300150: val_loss -0.3538 
2023-11-23 08:02:23.300231: Pseudo dice [0.7227, nan] 
2023-11-23 08:02:23.300301: Epoch time: 55.83 s 
2023-11-23 08:02:24.357489:  
2023-11-23 08:02:24.357660: Epoch 869 
2023-11-23 08:02:24.357805: Current learning rate: 0.00161 
2023-11-23 08:03:20.121649: train_loss -0.368 
2023-11-23 08:03:20.121842: val_loss -0.3361 
2023-11-23 08:03:20.121920: Pseudo dice [0.6828, nan] 
2023-11-23 08:03:20.121990: Epoch time: 55.76 s 
2023-11-23 08:03:21.181629:  
2023-11-23 08:03:21.181801: Epoch 870 
2023-11-23 08:03:21.181942: Current learning rate: 0.00159 
2023-11-23 08:04:16.863468: train_loss -0.3604 
2023-11-23 08:04:16.863675: val_loss -0.3149 
2023-11-23 08:04:16.863755: Pseudo dice [0.6351, nan] 
2023-11-23 08:04:16.863825: Epoch time: 55.68 s 
2023-11-23 08:04:18.024228:  
2023-11-23 08:04:18.024536: Epoch 871 
2023-11-23 08:04:18.024683: Current learning rate: 0.00158 
2023-11-23 08:05:13.686670: train_loss -0.3574 
2023-11-23 08:05:13.686860: val_loss -0.3386 
2023-11-23 08:05:13.686938: Pseudo dice [0.6955, nan] 
2023-11-23 08:05:13.687006: Epoch time: 55.66 s 
2023-11-23 08:05:14.749031:  
2023-11-23 08:05:14.749226: Epoch 872 
2023-11-23 08:05:14.749339: Current learning rate: 0.00157 
2023-11-23 08:06:10.458778: train_loss -0.3562 
2023-11-23 08:06:10.458970: val_loss -0.3411 
2023-11-23 08:06:10.459053: Pseudo dice [0.6937, nan] 
2023-11-23 08:06:10.459125: Epoch time: 55.71 s 
2023-11-23 08:06:11.512665:  
2023-11-23 08:06:11.512786: Epoch 873 
2023-11-23 08:06:11.512883: Current learning rate: 0.00156 
2023-11-23 08:07:07.298912: train_loss -0.3428 
2023-11-23 08:07:07.299153: val_loss -0.3417 
2023-11-23 08:07:07.299230: Pseudo dice [0.6889, nan] 
2023-11-23 08:07:07.299302: Epoch time: 55.79 s 
2023-11-23 08:07:08.362721:  
2023-11-23 08:07:08.362853: Epoch 874 
2023-11-23 08:07:08.362954: Current learning rate: 0.00155 
2023-11-23 08:08:04.034646: train_loss -0.3489 
2023-11-23 08:08:04.034821: val_loss -0.3324 
2023-11-23 08:08:04.034894: Pseudo dice [0.681, nan] 
2023-11-23 08:08:04.034964: Epoch time: 55.67 s 
2023-11-23 08:08:05.202747:  
2023-11-23 08:08:05.202869: Epoch 875 
2023-11-23 08:08:05.202965: Current learning rate: 0.00154 
2023-11-23 08:09:00.781019: train_loss -0.3493 
2023-11-23 08:09:00.781243: val_loss -0.3249 
2023-11-23 08:09:00.781366: Pseudo dice [0.6608, nan] 
2023-11-23 08:09:00.781446: Epoch time: 55.58 s 
2023-11-23 08:09:01.849550:  
2023-11-23 08:09:01.849687: Epoch 876 
2023-11-23 08:09:01.849795: Current learning rate: 0.00153 
2023-11-23 08:09:57.475544: train_loss -0.3532 
2023-11-23 08:09:57.475735: val_loss -0.3479 
2023-11-23 08:09:57.475814: Pseudo dice [0.7161, nan] 
2023-11-23 08:09:57.475882: Epoch time: 55.63 s 
2023-11-23 08:09:58.534756:  
2023-11-23 08:09:58.534928: Epoch 877 
2023-11-23 08:09:58.535077: Current learning rate: 0.00152 
2023-11-23 08:10:54.199185: train_loss -0.3689 
2023-11-23 08:10:54.199436: val_loss -0.3672 
2023-11-23 08:10:54.199533: Pseudo dice [0.763, nan] 
2023-11-23 08:10:54.199613: Epoch time: 55.67 s 
2023-11-23 08:10:55.265231:  
2023-11-23 08:10:55.265418: Epoch 878 
2023-11-23 08:10:55.265558: Current learning rate: 0.00151 
2023-11-23 08:11:50.960275: train_loss -0.356 
2023-11-23 08:11:50.960479: val_loss -0.3418 
2023-11-23 08:11:50.960554: Pseudo dice [0.7103, nan] 
2023-11-23 08:11:50.960634: Epoch time: 55.7 s 
2023-11-23 08:11:52.123529:  
2023-11-23 08:11:52.123643: Epoch 879 
2023-11-23 08:11:52.123740: Current learning rate: 0.00149 
2023-11-23 08:12:47.682913: train_loss -0.3632 
2023-11-23 08:12:47.683140: val_loss -0.3406 
2023-11-23 08:12:47.683225: Pseudo dice [0.6868, nan] 
2023-11-23 08:12:47.683297: Epoch time: 55.56 s 
2023-11-23 08:12:48.752581:  
2023-11-23 08:12:48.752735: Epoch 880 
2023-11-23 08:12:48.752846: Current learning rate: 0.00148 
2023-11-23 08:13:44.359000: train_loss -0.3719 
2023-11-23 08:13:44.359181: val_loss -0.3486 
2023-11-23 08:13:44.359260: Pseudo dice [0.705, nan] 
2023-11-23 08:13:44.359334: Epoch time: 55.61 s 
2023-11-23 08:13:45.424660:  
2023-11-23 08:13:45.424783: Epoch 881 
2023-11-23 08:13:45.424886: Current learning rate: 0.00147 
2023-11-23 08:14:41.224882: train_loss -0.3453 
2023-11-23 08:14:41.225083: val_loss -0.3529 
2023-11-23 08:14:41.225163: Pseudo dice [0.7235, nan] 
2023-11-23 08:14:41.225233: Epoch time: 55.8 s 
2023-11-23 08:14:42.288383:  
2023-11-23 08:14:42.288707: Epoch 882 
2023-11-23 08:14:42.288869: Current learning rate: 0.00146 
2023-11-23 08:15:37.936939: train_loss -0.3677 
2023-11-23 08:15:37.937127: val_loss -0.3636 
2023-11-23 08:15:37.937205: Pseudo dice [0.7464, nan] 
2023-11-23 08:15:37.937280: Epoch time: 55.65 s 
2023-11-23 08:15:39.003470:  
2023-11-23 08:15:39.003591: Epoch 883 
2023-11-23 08:15:39.003690: Current learning rate: 0.00145 
2023-11-23 08:16:55.544749: train_loss -0.3532 
2023-11-23 08:16:55.544937: val_loss -0.3558 
2023-11-23 08:16:55.545015: Pseudo dice [0.7242, nan] 
2023-11-23 08:16:55.545084: Epoch time: 76.54 s 
2023-11-23 08:16:56.723376:  
2023-11-23 08:16:56.723655: Epoch 884 
2023-11-23 08:16:56.723876: Current learning rate: 0.00144 
2023-11-23 08:17:52.305707: train_loss -0.3471 
2023-11-23 08:17:52.305887: val_loss -0.354 
2023-11-23 08:17:52.305965: Pseudo dice [0.7226, nan] 
2023-11-23 08:17:52.306034: Epoch time: 55.58 s 
2023-11-23 08:17:53.371986:  
2023-11-23 08:17:53.372098: Epoch 885 
2023-11-23 08:17:53.372193: Current learning rate: 0.00143 
2023-11-23 08:18:48.959091: train_loss -0.3679 
2023-11-23 08:18:48.959276: val_loss -0.3377 
2023-11-23 08:18:48.959356: Pseudo dice [0.6871, nan] 
2023-11-23 08:18:48.959425: Epoch time: 55.59 s 
2023-11-23 08:18:50.021230:  
2023-11-23 08:18:50.021403: Epoch 886 
2023-11-23 08:18:50.021502: Current learning rate: 0.00142 
2023-11-23 08:19:45.615184: train_loss -0.364 
2023-11-23 08:19:45.615370: val_loss -0.3524 
2023-11-23 08:19:45.615445: Pseudo dice [0.7105, nan] 
2023-11-23 08:19:45.615521: Epoch time: 55.59 s 
2023-11-23 08:19:46.677433:  
2023-11-23 08:19:46.677660: Epoch 887 
2023-11-23 08:19:46.677804: Current learning rate: 0.00141 
2023-11-23 08:20:42.307436: train_loss -0.3665 
2023-11-23 08:20:42.307706: val_loss -0.358 
2023-11-23 08:20:42.307840: Pseudo dice [0.7366, nan] 
2023-11-23 08:20:42.307965: Epoch time: 55.63 s 
2023-11-23 08:20:43.402125:  
2023-11-23 08:20:43.402297: Epoch 888 
2023-11-23 08:20:43.402402: Current learning rate: 0.00139 
2023-11-23 08:21:38.970744: train_loss -0.3549 
2023-11-23 08:21:38.970930: val_loss -0.3509 
2023-11-23 08:21:38.971008: Pseudo dice [0.7227, nan] 
2023-11-23 08:21:38.971076: Epoch time: 55.57 s 
2023-11-23 08:21:40.042999:  
2023-11-23 08:21:40.043130: Epoch 889 
2023-11-23 08:21:40.043227: Current learning rate: 0.00138 
2023-11-23 08:22:35.718138: train_loss -0.3652 
2023-11-23 08:22:35.718308: val_loss -0.3621 
2023-11-23 08:22:35.718383: Pseudo dice [0.7221, nan] 
2023-11-23 08:22:35.718455: Epoch time: 55.68 s 
2023-11-23 08:22:36.780611:  
2023-11-23 08:22:36.780869: Epoch 890 
2023-11-23 08:22:36.780978: Current learning rate: 0.00137 
2023-11-23 08:23:32.373774: train_loss -0.3657 
2023-11-23 08:23:32.373954: val_loss -0.3474 
2023-11-23 08:23:32.374033: Pseudo dice [0.7156, nan] 
2023-11-23 08:23:32.374106: Epoch time: 55.59 s 
2023-11-23 08:23:33.443976:  
2023-11-23 08:23:33.444272: Epoch 891 
2023-11-23 08:23:33.444416: Current learning rate: 0.00136 
2023-11-23 08:24:28.977038: train_loss -0.3635 
2023-11-23 08:24:28.977236: val_loss -0.3626 
2023-11-23 08:24:28.977317: Pseudo dice [0.7345, nan] 
2023-11-23 08:24:28.977388: Epoch time: 55.53 s 
2023-11-23 08:24:30.033893:  
2023-11-23 08:24:30.034015: Epoch 892 
2023-11-23 08:24:30.034120: Current learning rate: 0.00135 
2023-11-23 08:25:25.685163: train_loss -0.3715 
2023-11-23 08:25:25.685347: val_loss -0.3616 
2023-11-23 08:25:25.685423: Pseudo dice [0.7439, nan] 
2023-11-23 08:25:25.685490: Epoch time: 55.65 s 
2023-11-23 08:25:26.858222:  
2023-11-23 08:25:26.858340: Epoch 893 
2023-11-23 08:25:26.858436: Current learning rate: 0.00134 
2023-11-23 08:26:22.636105: train_loss -0.3634 
2023-11-23 08:26:22.636322: val_loss -0.3639 
2023-11-23 08:26:22.636405: Pseudo dice [0.7469, nan] 
2023-11-23 08:26:22.636474: Epoch time: 55.78 s 
2023-11-23 08:26:23.690105:  
2023-11-23 08:26:23.690252: Epoch 894 
2023-11-23 08:26:23.690368: Current learning rate: 0.00133 
2023-11-23 08:27:19.315591: train_loss -0.3607 
2023-11-23 08:27:19.315778: val_loss -0.3595 
2023-11-23 08:27:19.315858: Pseudo dice [0.7431, nan] 
2023-11-23 08:27:19.315934: Epoch time: 55.63 s 
2023-11-23 08:27:20.371956:  
2023-11-23 08:27:20.372088: Epoch 895 
2023-11-23 08:27:20.372189: Current learning rate: 0.00132 
2023-11-23 08:28:16.026363: train_loss -0.3587 
2023-11-23 08:28:16.026546: val_loss -0.3689 
2023-11-23 08:28:16.026623: Pseudo dice [0.7486, nan] 
2023-11-23 08:28:16.026691: Epoch time: 55.66 s 
2023-11-23 08:28:17.092180:  
2023-11-23 08:28:17.092306: Epoch 896 
2023-11-23 08:28:17.092406: Current learning rate: 0.0013 
2023-11-23 08:29:12.725816: train_loss -0.3675 
2023-11-23 08:29:12.726009: val_loss -0.3431 
2023-11-23 08:29:12.726088: Pseudo dice [0.7059, nan] 
2023-11-23 08:29:12.726156: Epoch time: 55.63 s 
2023-11-23 08:29:13.890198:  
2023-11-23 08:29:13.890316: Epoch 897 
2023-11-23 08:29:13.890418: Current learning rate: 0.00129 
2023-11-23 08:30:09.492182: train_loss -0.3544 
2023-11-23 08:30:09.492369: val_loss -0.3206 
2023-11-23 08:30:09.492446: Pseudo dice [0.6669, nan] 
2023-11-23 08:30:09.492520: Epoch time: 55.6 s 
2023-11-23 08:30:10.547804:  
2023-11-23 08:30:10.547966: Epoch 898 
2023-11-23 08:30:10.548072: Current learning rate: 0.00128 
2023-11-23 08:31:06.299329: train_loss -0.3614 
2023-11-23 08:31:06.299519: val_loss -0.3824 
2023-11-23 08:31:06.299595: Pseudo dice [0.7888, nan] 
2023-11-23 08:31:06.299667: Epoch time: 55.75 s 
2023-11-23 08:31:07.370735:  
2023-11-23 08:31:07.370855: Epoch 899 
2023-11-23 08:31:07.370957: Current learning rate: 0.00127 
2023-11-23 08:32:03.061452: train_loss -0.3761 
2023-11-23 08:32:03.061643: val_loss -0.3401 
2023-11-23 08:32:03.061721: Pseudo dice [0.6935, nan] 
2023-11-23 08:32:03.061791: Epoch time: 55.69 s 
2023-11-23 08:32:04.280616:  
2023-11-23 08:32:04.280764: Epoch 900 
2023-11-23 08:32:04.280860: Current learning rate: 0.00126 
2023-11-23 08:32:59.946105: train_loss -0.3627 
2023-11-23 08:32:59.946292: val_loss -0.3524 
2023-11-23 08:32:59.946373: Pseudo dice [0.7287, nan] 
2023-11-23 08:32:59.946442: Epoch time: 55.67 s 
2023-11-23 08:33:01.007334:  
2023-11-23 08:33:01.007457: Epoch 901 
2023-11-23 08:33:01.007556: Current learning rate: 0.00125 
2023-11-23 08:33:56.686930: train_loss -0.3679 
2023-11-23 08:33:56.687121: val_loss -0.3519 
2023-11-23 08:33:56.687205: Pseudo dice [0.7232, nan] 
2023-11-23 08:33:56.687277: Epoch time: 55.68 s 
2023-11-23 08:33:57.879536:  
2023-11-23 08:33:57.879658: Epoch 902 
2023-11-23 08:33:57.879761: Current learning rate: 0.00124 
2023-11-23 08:34:53.515766: train_loss -0.373 
2023-11-23 08:34:53.515949: val_loss -0.3257 
2023-11-23 08:34:53.516026: Pseudo dice [0.6727, nan] 
2023-11-23 08:34:53.516093: Epoch time: 55.64 s 
2023-11-23 08:34:54.604005:  
2023-11-23 08:34:54.604194: Epoch 903 
2023-11-23 08:34:54.604342: Current learning rate: 0.00122 
2023-11-23 08:35:50.197297: train_loss -0.3704 
2023-11-23 08:35:50.197520: val_loss -0.3558 
2023-11-23 08:35:50.197602: Pseudo dice [0.7177, nan] 
2023-11-23 08:35:50.197672: Epoch time: 55.59 s 
2023-11-23 08:35:51.265531:  
2023-11-23 08:35:51.265653: Epoch 904 
2023-11-23 08:35:51.265770: Current learning rate: 0.00121 
2023-11-23 08:36:46.874883: train_loss -0.3672 
2023-11-23 08:36:46.875069: val_loss -0.348 
2023-11-23 08:36:46.875146: Pseudo dice [0.7082, nan] 
2023-11-23 08:36:46.875222: Epoch time: 55.61 s 
2023-11-23 08:36:47.937578:  
2023-11-23 08:36:47.937702: Epoch 905 
2023-11-23 08:36:47.937806: Current learning rate: 0.0012 
2023-11-23 08:37:43.677978: train_loss -0.3765 
2023-11-23 08:37:43.678172: val_loss -0.349 
2023-11-23 08:37:43.678255: Pseudo dice [0.7076, nan] 
2023-11-23 08:37:43.678325: Epoch time: 55.74 s 
2023-11-23 08:37:44.856335:  
2023-11-23 08:37:44.856454: Epoch 906 
2023-11-23 08:37:44.856552: Current learning rate: 0.00119 
2023-11-23 08:38:40.427079: train_loss -0.3597 
2023-11-23 08:38:40.427288: val_loss -0.3549 
2023-11-23 08:38:40.427367: Pseudo dice [0.7215, nan] 
2023-11-23 08:38:40.427438: Epoch time: 55.57 s 
2023-11-23 08:38:41.483073:  
2023-11-23 08:38:41.483285: Epoch 907 
2023-11-23 08:38:41.483430: Current learning rate: 0.00118 
2023-11-23 08:39:37.119418: train_loss -0.371 
2023-11-23 08:39:37.119624: val_loss -0.347 
2023-11-23 08:39:37.119704: Pseudo dice [0.7128, nan] 
2023-11-23 08:39:37.119781: Epoch time: 55.64 s 
2023-11-23 08:39:38.181597:  
2023-11-23 08:39:38.181774: Epoch 908 
2023-11-23 08:39:38.181927: Current learning rate: 0.00117 
2023-11-23 08:40:33.868090: train_loss -0.3674 
2023-11-23 08:40:33.868266: val_loss -0.3611 
2023-11-23 08:40:33.868336: Pseudo dice [0.7336, nan] 
2023-11-23 08:40:33.868400: Epoch time: 55.69 s 
2023-11-23 08:40:34.943144:  
2023-11-23 08:40:34.943269: Epoch 909 
2023-11-23 08:40:34.943369: Current learning rate: 0.00116 
2023-11-23 08:41:30.555073: train_loss -0.3624 
2023-11-23 08:41:30.555263: val_loss -0.3439 
2023-11-23 08:41:30.555337: Pseudo dice [0.6884, nan] 
2023-11-23 08:41:30.555412: Epoch time: 55.61 s 
2023-11-23 08:41:31.722067:  
2023-11-23 08:41:31.722187: Epoch 910 
2023-11-23 08:41:31.722300: Current learning rate: 0.00115 
2023-11-23 08:42:27.289161: train_loss -0.3525 
2023-11-23 08:42:27.289372: val_loss -0.354 
2023-11-23 08:42:27.289451: Pseudo dice [0.725, nan] 
2023-11-23 08:42:27.289521: Epoch time: 55.57 s 
2023-11-23 08:42:28.357430:  
2023-11-23 08:42:28.357555: Epoch 911 
2023-11-23 08:42:28.357656: Current learning rate: 0.00113 
2023-11-23 08:43:24.079526: train_loss -0.3584 
2023-11-23 08:43:24.079714: val_loss -0.3615 
2023-11-23 08:43:24.079792: Pseudo dice [0.7439, nan] 
2023-11-23 08:43:24.079858: Epoch time: 55.72 s 
2023-11-23 08:43:25.142432:  
2023-11-23 08:43:25.142609: Epoch 912 
2023-11-23 08:43:25.142745: Current learning rate: 0.00112 
2023-11-23 08:44:20.655868: train_loss -0.3608 
2023-11-23 08:44:20.656095: val_loss -0.3365 
2023-11-23 08:44:20.656173: Pseudo dice [0.6914, nan] 
2023-11-23 08:44:20.656245: Epoch time: 55.51 s 
2023-11-23 08:44:21.723251:  
2023-11-23 08:44:21.723392: Epoch 913 
2023-11-23 08:44:21.723500: Current learning rate: 0.00111 
2023-11-23 08:45:17.385078: train_loss -0.3668 
2023-11-23 08:45:17.385290: val_loss -0.3412 
2023-11-23 08:45:17.385369: Pseudo dice [0.7012, nan] 
2023-11-23 08:45:17.385440: Epoch time: 55.66 s 
2023-11-23 08:45:18.458119:  
2023-11-23 08:45:18.458249: Epoch 914 
2023-11-23 08:45:18.458348: Current learning rate: 0.0011 
2023-11-23 08:46:13.952515: train_loss -0.3719 
2023-11-23 08:46:13.952718: val_loss -0.3477 
2023-11-23 08:46:13.952805: Pseudo dice [0.7258, nan] 
2023-11-23 08:46:13.952875: Epoch time: 55.5 s 
2023-11-23 08:46:15.124303:  
2023-11-23 08:46:15.124425: Epoch 915 
2023-11-23 08:46:15.124530: Current learning rate: 0.00109 
2023-11-23 08:47:10.722615: train_loss -0.3739 
2023-11-23 08:47:10.722825: val_loss -0.3543 
2023-11-23 08:47:10.722909: Pseudo dice [0.712, nan] 
2023-11-23 08:47:10.722982: Epoch time: 55.6 s 
2023-11-23 08:47:11.787900:  
2023-11-23 08:47:11.788022: Epoch 916 
2023-11-23 08:47:11.788121: Current learning rate: 0.00108 
2023-11-23 08:48:07.483401: train_loss -0.3749 
2023-11-23 08:48:07.483607: val_loss -0.3528 
2023-11-23 08:48:07.483682: Pseudo dice [0.7145, nan] 
2023-11-23 08:48:07.483757: Epoch time: 55.7 s 
2023-11-23 08:48:08.540532:  
2023-11-23 08:48:08.540721: Epoch 917 
2023-11-23 08:48:08.540822: Current learning rate: 0.00106 
2023-11-23 08:49:04.205392: train_loss -0.3708 
2023-11-23 08:49:04.205602: val_loss -0.3418 
2023-11-23 08:49:04.205679: Pseudo dice [0.7192, nan] 
2023-11-23 08:49:04.205750: Epoch time: 55.67 s 
2023-11-23 08:49:05.277476:  
2023-11-23 08:49:05.277591: Epoch 918 
2023-11-23 08:49:05.277696: Current learning rate: 0.00105 
2023-11-23 08:50:00.770970: train_loss -0.379 
2023-11-23 08:50:00.771161: val_loss -0.3479 
2023-11-23 08:50:00.771235: Pseudo dice [0.6899, nan] 
2023-11-23 08:50:00.771303: Epoch time: 55.49 s 
2023-11-23 08:50:01.946010:  
2023-11-23 08:50:01.946137: Epoch 919 
2023-11-23 08:50:01.946233: Current learning rate: 0.00104 
2023-11-23 08:50:57.646451: train_loss -0.3696 
2023-11-23 08:50:57.646660: val_loss -0.3556 
2023-11-23 08:50:57.646742: Pseudo dice [0.7189, nan] 
2023-11-23 08:50:57.646915: Epoch time: 55.7 s 
2023-11-23 08:50:58.714459:  
2023-11-23 08:50:58.714651: Epoch 920 
2023-11-23 08:50:58.714804: Current learning rate: 0.00103 
2023-11-23 08:51:54.389097: train_loss -0.3661 
2023-11-23 08:51:54.389334: val_loss -0.3532 
2023-11-23 08:51:54.389417: Pseudo dice [0.7233, nan] 
2023-11-23 08:51:54.389489: Epoch time: 55.68 s 
2023-11-23 08:51:55.452870:  
2023-11-23 08:51:55.452986: Epoch 921 
2023-11-23 08:51:55.453083: Current learning rate: 0.00102 
2023-11-23 08:52:51.092760: train_loss -0.3672 
2023-11-23 08:52:51.092972: val_loss -0.3362 
2023-11-23 08:52:51.093051: Pseudo dice [0.6783, nan] 
2023-11-23 08:52:51.093119: Epoch time: 55.64 s 
2023-11-23 08:52:52.166899:  
2023-11-23 08:52:52.167017: Epoch 922 
2023-11-23 08:52:52.167112: Current learning rate: 0.00101 
2023-11-23 08:53:47.836641: train_loss -0.3636 
2023-11-23 08:53:47.836866: val_loss -0.353 
2023-11-23 08:53:47.836971: Pseudo dice [0.72, nan] 
2023-11-23 08:53:47.837061: Epoch time: 55.67 s 
2023-11-23 08:53:48.912077:  
2023-11-23 08:53:48.912198: Epoch 923 
2023-11-23 08:53:48.912298: Current learning rate: 0.001 
2023-11-23 08:54:59.990202: train_loss -0.3638 
2023-11-23 08:54:59.990439: val_loss -0.3176 
2023-11-23 08:54:59.990521: Pseudo dice [0.6524, nan] 
2023-11-23 08:54:59.990592: Epoch time: 71.08 s 
2023-11-23 08:55:01.052464:  
2023-11-23 08:55:01.052600: Epoch 924 
2023-11-23 08:55:01.052702: Current learning rate: 0.00098 
2023-11-23 08:56:20.479908: train_loss -0.3659 
2023-11-23 08:56:20.480102: val_loss -0.3652 
2023-11-23 08:56:20.480181: Pseudo dice [0.7354, nan] 
2023-11-23 08:56:20.480253: Epoch time: 79.43 s 
2023-11-23 08:56:21.541209:  
2023-11-23 08:56:21.541364: Epoch 925 
2023-11-23 08:56:21.541462: Current learning rate: 0.00097 
2023-11-23 08:57:39.684683: train_loss -0.3668 
2023-11-23 08:57:39.684896: val_loss -0.3252 
2023-11-23 08:57:39.684986: Pseudo dice [0.6563, nan] 
2023-11-23 08:57:39.685074: Epoch time: 78.14 s 
2023-11-23 08:57:40.752438:  
2023-11-23 08:57:40.752609: Epoch 926 
2023-11-23 08:57:40.752782: Current learning rate: 0.00096 
2023-11-23 08:58:58.295824: train_loss -0.3705 
2023-11-23 08:58:58.296089: val_loss -0.3589 
2023-11-23 08:58:58.296170: Pseudo dice [0.7269, nan] 
2023-11-23 08:58:58.296242: Epoch time: 77.54 s 
2023-11-23 08:58:59.357953:  
2023-11-23 08:58:59.358068: Epoch 927 
2023-11-23 08:58:59.358166: Current learning rate: 0.00095 
2023-11-23 09:00:12.580250: train_loss -0.3653 
2023-11-23 09:00:12.580467: val_loss -0.3547 
2023-11-23 09:00:12.580537: Pseudo dice [0.7322, nan] 
2023-11-23 09:00:12.580622: Epoch time: 73.22 s 
2023-11-23 09:00:13.752276:  
2023-11-23 09:00:13.752408: Epoch 928 
2023-11-23 09:00:13.752514: Current learning rate: 0.00094 
2023-11-23 09:01:26.175152: train_loss -0.3684 
2023-11-23 09:01:26.175375: val_loss -0.3383 
2023-11-23 09:01:26.175455: Pseudo dice [0.6943, nan] 
2023-11-23 09:01:26.175525: Epoch time: 72.42 s 
2023-11-23 09:01:27.239941:  
2023-11-23 09:01:27.240055: Epoch 929 
2023-11-23 09:01:27.240157: Current learning rate: 0.00092 
2023-11-23 09:02:45.137485: train_loss -0.3734 
2023-11-23 09:02:45.137678: val_loss -0.3407 
2023-11-23 09:02:45.137754: Pseudo dice [0.6845, nan] 
2023-11-23 09:02:45.137826: Epoch time: 77.9 s 
2023-11-23 09:02:46.202846:  
2023-11-23 09:02:46.202971: Epoch 930 
2023-11-23 09:02:46.203075: Current learning rate: 0.00091 
2023-11-23 09:04:02.713855: train_loss -0.3689 
2023-11-23 09:04:02.714045: val_loss -0.3397 
2023-11-23 09:04:02.714120: Pseudo dice [0.6915, nan] 
2023-11-23 09:04:02.714188: Epoch time: 76.51 s 
2023-11-23 09:04:03.781385:  
2023-11-23 09:04:03.781505: Epoch 931 
2023-11-23 09:04:03.781600: Current learning rate: 0.0009 
2023-11-23 09:06:35.722753: train_loss -0.3828 
2023-11-23 09:06:35.722992: val_loss -0.3716 
2023-11-23 09:06:35.723078: Pseudo dice [0.7552, nan] 
2023-11-23 09:06:35.723148: Epoch time: 151.94 s 
2023-11-23 09:06:36.901608:  
2023-11-23 09:06:36.901738: Epoch 932 
2023-11-23 09:06:36.901835: Current learning rate: 0.00089 
2023-11-23 09:09:52.960097: train_loss -0.3667 
2023-11-23 09:09:52.960283: val_loss -0.3562 
2023-11-23 09:09:52.960359: Pseudo dice [0.7328, nan] 
2023-11-23 09:09:52.960438: Epoch time: 196.06 s 
2023-11-23 09:09:54.024654:  
2023-11-23 09:09:54.024821: Epoch 933 
2023-11-23 09:09:54.024919: Current learning rate: 0.00088 
2023-11-23 09:13:09.701336: train_loss -0.3698 
2023-11-23 09:13:09.701550: val_loss -0.3732 
2023-11-23 09:13:09.701632: Pseudo dice [0.7572, nan] 
2023-11-23 09:13:09.701710: Epoch time: 195.68 s 
2023-11-23 09:13:10.761600:  
2023-11-23 09:13:10.761786: Epoch 934 
2023-11-23 09:13:10.761890: Current learning rate: 0.00087 
2023-11-23 09:16:26.008506: train_loss -0.3757 
2023-11-23 09:16:26.008721: val_loss -0.3589 
2023-11-23 09:16:26.008798: Pseudo dice [0.738, nan] 
2023-11-23 09:16:26.008876: Epoch time: 195.25 s 
2023-11-23 09:16:27.074525:  
2023-11-23 09:16:27.074702: Epoch 935 
2023-11-23 09:16:27.074806: Current learning rate: 0.00085 
2023-11-23 09:19:42.762553: train_loss -0.3831 
2023-11-23 09:19:42.762753: val_loss -0.349 
2023-11-23 09:19:42.762831: Pseudo dice [0.7127, nan] 
2023-11-23 09:19:42.762901: Epoch time: 195.69 s 
2023-11-23 09:19:43.859346:  
2023-11-23 09:19:43.859471: Epoch 936 
2023-11-23 09:19:43.859573: Current learning rate: 0.00084 
2023-11-23 09:22:59.517749: train_loss -0.371 
2023-11-23 09:22:59.517953: val_loss -0.3498 
2023-11-23 09:22:59.518033: Pseudo dice [0.7107, nan] 
2023-11-23 09:22:59.518104: Epoch time: 195.66 s 
2023-11-23 09:23:00.584048:  
2023-11-23 09:23:00.584182: Epoch 937 
2023-11-23 09:23:00.584285: Current learning rate: 0.00083 
2023-11-23 09:26:16.175196: train_loss -0.3736 
2023-11-23 09:26:16.175397: val_loss -0.3239 
2023-11-23 09:26:16.175477: Pseudo dice [0.659, nan] 
2023-11-23 09:26:16.175546: Epoch time: 195.59 s 
2023-11-23 09:26:17.254682:  
2023-11-23 09:26:17.254805: Epoch 938 
2023-11-23 09:26:17.254915: Current learning rate: 0.00082 
2023-11-23 09:29:33.666211: train_loss -0.3683 
2023-11-23 09:29:33.666423: val_loss -0.3416 
2023-11-23 09:29:33.666502: Pseudo dice [0.6831, nan] 
2023-11-23 09:29:33.666607: Epoch time: 196.41 s 
2023-11-23 09:29:34.730704:  
2023-11-23 09:29:34.730820: Epoch 939 
2023-11-23 09:29:34.730914: Current learning rate: 0.00081 
2023-11-23 09:32:51.544748: train_loss -0.3642 
2023-11-23 09:32:51.544987: val_loss -0.3534 
2023-11-23 09:32:51.545063: Pseudo dice [0.7211, nan] 
2023-11-23 09:32:51.545132: Epoch time: 196.81 s 
2023-11-23 09:32:52.604062:  
2023-11-23 09:32:52.604189: Epoch 940 
2023-11-23 09:32:52.604293: Current learning rate: 0.00079 
2023-11-23 09:36:09.340777: train_loss -0.369 
2023-11-23 09:36:09.340965: val_loss -0.3557 
2023-11-23 09:36:09.341041: Pseudo dice [0.7239, nan] 
2023-11-23 09:36:09.341109: Epoch time: 196.74 s 
2023-11-23 09:36:10.522882:  
2023-11-23 09:36:10.523008: Epoch 941 
2023-11-23 09:36:10.523109: Current learning rate: 0.00078 
2023-11-23 09:39:27.345832: train_loss -0.3711 
2023-11-23 09:39:27.345994: val_loss -0.3153 
2023-11-23 09:39:27.346067: Pseudo dice [0.6585, nan] 
2023-11-23 09:39:27.346142: Epoch time: 196.82 s 
2023-11-23 09:39:28.413840:  
2023-11-23 09:39:28.413973: Epoch 942 
2023-11-23 09:39:28.414081: Current learning rate: 0.00077 
2023-11-23 09:42:44.708724: train_loss -0.3731 
2023-11-23 09:42:44.708940: val_loss -0.3606 
2023-11-23 09:42:44.709020: Pseudo dice [0.7314, nan] 
2023-11-23 09:42:44.709091: Epoch time: 196.3 s 
2023-11-23 09:42:45.776982:  
2023-11-23 09:42:45.777099: Epoch 943 
2023-11-23 09:42:45.777198: Current learning rate: 0.00076 
2023-11-23 09:46:01.883933: train_loss -0.3821 
2023-11-23 09:46:01.884135: val_loss -0.3636 
2023-11-23 09:46:01.884216: Pseudo dice [0.7466, nan] 
2023-11-23 09:46:01.884286: Epoch time: 196.11 s 
2023-11-23 09:46:02.952318:  
2023-11-23 09:46:02.952450: Epoch 944 
2023-11-23 09:46:02.952553: Current learning rate: 0.00075 
2023-11-23 09:49:18.523927: train_loss -0.3764 
2023-11-23 09:49:18.524166: val_loss -0.3431 
2023-11-23 09:49:18.524247: Pseudo dice [0.7033, nan] 
2023-11-23 09:49:18.524315: Epoch time: 195.57 s 
2023-11-23 09:49:19.592551:  
2023-11-23 09:49:19.592686: Epoch 945 
2023-11-23 09:49:19.592782: Current learning rate: 0.00074 
2023-11-23 09:52:35.984547: train_loss -0.373 
2023-11-23 09:52:35.984769: val_loss -0.3419 
2023-11-23 09:52:35.984846: Pseudo dice [0.6956, nan] 
2023-11-23 09:52:35.984914: Epoch time: 196.39 s 
2023-11-23 09:52:37.049142:  
2023-11-23 09:52:37.049261: Epoch 946 
2023-11-23 09:52:37.049353: Current learning rate: 0.00072 
2023-11-23 09:55:53.369851: train_loss -0.3655 
2023-11-23 09:55:53.370052: val_loss -0.3574 
2023-11-23 09:55:53.370131: Pseudo dice [0.7247, nan] 
2023-11-23 09:55:53.370199: Epoch time: 196.32 s 
2023-11-23 09:55:54.437721:  
2023-11-23 09:55:54.437839: Epoch 947 
2023-11-23 09:55:54.437938: Current learning rate: 0.00071 
2023-11-23 09:59:11.167217: train_loss -0.3786 
2023-11-23 09:59:11.167453: val_loss -0.3687 
2023-11-23 09:59:11.167534: Pseudo dice [0.7471, nan] 
2023-11-23 09:59:11.167603: Epoch time: 196.73 s 
2023-11-23 09:59:12.226071:  
2023-11-23 09:59:12.226205: Epoch 948 
2023-11-23 09:59:12.226302: Current learning rate: 0.0007 
2023-11-23 10:02:29.782586: train_loss -0.3796 
2023-11-23 10:02:29.782793: val_loss -0.3586 
2023-11-23 10:02:29.782873: Pseudo dice [0.7202, nan] 
2023-11-23 10:02:29.782942: Epoch time: 197.56 s 
2023-11-23 10:02:30.851689:  
2023-11-23 10:02:30.851811: Epoch 949 
2023-11-23 10:02:30.851906: Current learning rate: 0.00069 
2023-11-23 10:05:48.236542: train_loss -0.3913 
2023-11-23 10:05:48.236750: val_loss -0.3394 
2023-11-23 10:05:48.236827: Pseudo dice [0.6982, nan] 
2023-11-23 10:05:48.236896: Epoch time: 197.39 s 
2023-11-23 10:05:49.546849:  
2023-11-23 10:05:49.546983: Epoch 950 
2023-11-23 10:05:49.547096: Current learning rate: 0.00067 
2023-11-23 10:09:06.216018: train_loss -0.3779 
2023-11-23 10:09:06.216249: val_loss -0.3519 
2023-11-23 10:09:06.216332: Pseudo dice [0.72, nan] 
2023-11-23 10:09:06.216404: Epoch time: 196.67 s 
2023-11-23 10:09:07.283298:  
2023-11-23 10:09:07.283420: Epoch 951 
2023-11-23 10:09:07.283520: Current learning rate: 0.00066 
2023-11-23 10:12:24.192134: train_loss -0.3815 
2023-11-23 10:12:24.192401: val_loss -0.3632 
2023-11-23 10:12:24.192483: Pseudo dice [0.7371, nan] 
2023-11-23 10:12:24.192559: Epoch time: 196.91 s 
2023-11-23 10:12:25.261162:  
2023-11-23 10:12:25.261299: Epoch 952 
2023-11-23 10:12:25.261405: Current learning rate: 0.00065 
2023-11-23 10:15:41.868364: train_loss -0.3738 
2023-11-23 10:15:41.868553: val_loss -0.3648 
2023-11-23 10:15:41.868646: Pseudo dice [0.7412, nan] 
2023-11-23 10:15:41.868715: Epoch time: 196.61 s 
2023-11-23 10:15:42.934521:  
2023-11-23 10:15:42.934642: Epoch 953 
2023-11-23 10:15:42.934744: Current learning rate: 0.00064 
2023-11-23 10:18:59.367880: train_loss -0.3778 
2023-11-23 10:18:59.368066: val_loss -0.3686 
2023-11-23 10:18:59.368142: Pseudo dice [0.7523, nan] 
2023-11-23 10:18:59.368217: Epoch time: 196.43 s 
2023-11-23 10:19:00.452112:  
2023-11-23 10:19:00.452232: Epoch 954 
2023-11-23 10:19:00.452329: Current learning rate: 0.00063 
2023-11-23 10:22:16.639873: train_loss -0.3823 
2023-11-23 10:22:16.640074: val_loss -0.3715 
2023-11-23 10:22:16.640149: Pseudo dice [0.7602, nan] 
2023-11-23 10:22:16.640218: Epoch time: 196.19 s 
2023-11-23 10:22:17.713304:  
2023-11-23 10:22:17.713482: Epoch 955 
2023-11-23 10:22:17.713635: Current learning rate: 0.00061 
2023-11-23 10:25:35.137346: train_loss -0.3755 
2023-11-23 10:25:35.137541: val_loss -0.3775 
2023-11-23 10:25:35.137612: Pseudo dice [0.7719, nan] 
2023-11-23 10:25:35.137681: Epoch time: 197.42 s 
2023-11-23 10:25:36.226939:  
2023-11-23 10:25:36.227064: Epoch 956 
2023-11-23 10:25:36.227165: Current learning rate: 0.0006 
2023-11-23 10:28:53.175913: train_loss -0.3825 
2023-11-23 10:28:53.176108: val_loss -0.3667 
2023-11-23 10:28:53.176187: Pseudo dice [0.7403, nan] 
2023-11-23 10:28:53.176260: Epoch time: 196.95 s 
2023-11-23 10:28:54.267272:  
2023-11-23 10:28:54.267395: Epoch 957 
2023-11-23 10:28:54.267499: Current learning rate: 0.00059 
2023-11-23 10:32:10.978957: train_loss -0.3823 
2023-11-23 10:32:10.979164: val_loss -0.3562 
2023-11-23 10:32:10.979240: Pseudo dice [0.7189, nan] 
2023-11-23 10:32:10.979314: Epoch time: 196.71 s 
2023-11-23 10:32:12.058758:  
2023-11-23 10:32:12.058881: Epoch 958 
2023-11-23 10:32:12.058980: Current learning rate: 0.00058 
2023-11-23 10:35:28.970598: train_loss -0.3794 
2023-11-23 10:35:28.970797: val_loss -0.3475 
2023-11-23 10:35:28.970871: Pseudo dice [0.7164, nan] 
2023-11-23 10:35:28.970939: Epoch time: 196.91 s 
2023-11-23 10:35:30.042918:  
2023-11-23 10:35:30.043028: Epoch 959 
2023-11-23 10:35:30.043168: Current learning rate: 0.00056 
2023-11-23 10:38:47.213170: train_loss -0.3782 
2023-11-23 10:38:47.213370: val_loss -0.3234 
2023-11-23 10:38:47.213448: Pseudo dice [0.6782, nan] 
2023-11-23 10:38:47.213519: Epoch time: 197.17 s 
2023-11-23 10:38:48.290986:  
2023-11-23 10:38:48.291179: Epoch 960 
2023-11-23 10:38:48.291284: Current learning rate: 0.00055 
2023-11-23 10:42:05.350778: train_loss -0.3784 
2023-11-23 10:42:05.350981: val_loss -0.3716 
2023-11-23 10:42:05.351061: Pseudo dice [0.751, nan] 
2023-11-23 10:42:05.351131: Epoch time: 197.06 s 
2023-11-23 10:42:06.426636:  
2023-11-23 10:42:06.426820: Epoch 961 
2023-11-23 10:42:06.426941: Current learning rate: 0.00054 
2023-11-23 10:45:23.582468: train_loss -0.3813 
2023-11-23 10:45:23.582700: val_loss -0.3622 
2023-11-23 10:45:23.582783: Pseudo dice [0.7466, nan] 
2023-11-23 10:45:23.582852: Epoch time: 197.16 s 
2023-11-23 10:45:24.761189:  
2023-11-23 10:45:24.761303: Epoch 962 
2023-11-23 10:45:24.761402: Current learning rate: 0.00053 
2023-11-23 10:48:41.429708: train_loss -0.3807 
2023-11-23 10:48:41.429909: val_loss -0.3471 
2023-11-23 10:48:41.429987: Pseudo dice [0.6974, nan] 
2023-11-23 10:48:41.430056: Epoch time: 196.67 s 
2023-11-23 10:48:42.509899:  
2023-11-23 10:48:42.510020: Epoch 963 
2023-11-23 10:48:42.510118: Current learning rate: 0.00051 
2023-11-23 10:51:59.703439: train_loss -0.3875 
2023-11-23 10:51:59.703647: val_loss -0.3579 
2023-11-23 10:51:59.703724: Pseudo dice [0.7119, nan] 
2023-11-23 10:51:59.703795: Epoch time: 197.19 s 
2023-11-23 10:52:00.796903:  
2023-11-23 10:52:00.797056: Epoch 964 
2023-11-23 10:52:00.797200: Current learning rate: 0.0005 
2023-11-23 10:55:16.810304: train_loss -0.382 
2023-11-23 10:55:16.810510: val_loss -0.3498 
2023-11-23 10:55:16.810590: Pseudo dice [0.7147, nan] 
2023-11-23 10:55:16.810659: Epoch time: 196.01 s 
2023-11-23 10:55:17.885049:  
2023-11-23 10:55:17.885170: Epoch 965 
2023-11-23 10:55:17.885269: Current learning rate: 0.00049 
2023-11-23 10:58:34.268512: train_loss -0.3817 
2023-11-23 10:58:34.268731: val_loss -0.3588 
2023-11-23 10:58:34.268807: Pseudo dice [0.7323, nan] 
2023-11-23 10:58:34.268877: Epoch time: 196.38 s 
2023-11-23 10:58:35.458521:  
2023-11-23 10:58:35.458652: Epoch 966 
2023-11-23 10:58:35.458763: Current learning rate: 0.00048 
2023-11-23 11:01:52.063946: train_loss -0.3896 
2023-11-23 11:01:52.064148: val_loss -0.34 
2023-11-23 11:01:52.064225: Pseudo dice [0.7021, nan] 
2023-11-23 11:01:52.064296: Epoch time: 196.61 s 
2023-11-23 11:01:53.146333:  
2023-11-23 11:01:53.146452: Epoch 967 
2023-11-23 11:01:53.146554: Current learning rate: 0.00046 
2023-11-23 11:05:10.592898: train_loss -0.3892 
2023-11-23 11:05:10.593134: val_loss -0.3697 
2023-11-23 11:05:10.593215: Pseudo dice [0.7553, nan] 
2023-11-23 11:05:10.593285: Epoch time: 197.45 s 
2023-11-23 11:05:11.680320:  
2023-11-23 11:05:11.680452: Epoch 968 
2023-11-23 11:05:11.680557: Current learning rate: 0.00045 
2023-11-23 11:08:29.188504: train_loss -0.3833 
2023-11-23 11:08:29.188771: val_loss -0.3567 
2023-11-23 11:08:29.188901: Pseudo dice [0.7351, nan] 
2023-11-23 11:08:29.188974: Epoch time: 197.51 s 
2023-11-23 11:08:30.277786:  
2023-11-23 11:08:30.277906: Epoch 969 
2023-11-23 11:08:30.278013: Current learning rate: 0.00044 
2023-11-23 11:11:47.917323: train_loss -0.3824 
2023-11-23 11:11:47.917540: val_loss -0.3398 
2023-11-23 11:11:47.917669: Pseudo dice [0.6986, nan] 
2023-11-23 11:11:47.917745: Epoch time: 197.64 s 
2023-11-23 11:11:49.112755:  
2023-11-23 11:11:49.112909: Epoch 970 
2023-11-23 11:11:49.113040: Current learning rate: 0.00043 
2023-11-23 11:15:06.834347: train_loss -0.3888 
2023-11-23 11:15:06.834553: val_loss -0.3452 
2023-11-23 11:15:06.834630: Pseudo dice [0.6921, nan] 
2023-11-23 11:15:06.834705: Epoch time: 197.72 s 
2023-11-23 11:15:07.918362:  
2023-11-23 11:15:07.918486: Epoch 971 
2023-11-23 11:15:07.918589: Current learning rate: 0.00041 
2023-11-23 11:18:24.730515: train_loss -0.3941 
2023-11-23 11:18:24.730747: val_loss -0.3633 
2023-11-23 11:18:24.730848: Pseudo dice [0.7358, nan] 
2023-11-23 11:18:24.730939: Epoch time: 196.81 s 
2023-11-23 11:18:25.812823:  
2023-11-23 11:18:25.812964: Epoch 972 
2023-11-23 11:18:25.813065: Current learning rate: 0.0004 
2023-11-23 11:21:42.429925: train_loss -0.3859 
2023-11-23 11:21:42.430111: val_loss -0.3433 
2023-11-23 11:21:42.430192: Pseudo dice [0.7053, nan] 
2023-11-23 11:21:42.430263: Epoch time: 196.62 s 
2023-11-23 11:21:43.512448:  
2023-11-23 11:21:43.512587: Epoch 973 
2023-11-23 11:21:43.512696: Current learning rate: 0.00039 
2023-11-23 11:25:00.075109: train_loss -0.3876 
2023-11-23 11:25:00.075319: val_loss -0.3396 
2023-11-23 11:25:00.075425: Pseudo dice [0.6971, nan] 
2023-11-23 11:25:00.075503: Epoch time: 196.56 s 
2023-11-23 11:25:01.255792:  
2023-11-23 11:25:01.255972: Epoch 974 
2023-11-23 11:25:01.256124: Current learning rate: 0.00037 
2023-11-23 11:28:17.044812: train_loss -0.3859 
2023-11-23 11:28:17.045036: val_loss -0.3823 
2023-11-23 11:28:17.045132: Pseudo dice [0.7773, nan] 
2023-11-23 11:28:17.045209: Epoch time: 195.79 s 
2023-11-23 11:28:18.129066:  
2023-11-23 11:28:18.129179: Epoch 975 
2023-11-23 11:28:18.129282: Current learning rate: 0.00036 
2023-11-23 11:31:34.607239: train_loss -0.3925 
2023-11-23 11:31:34.607444: val_loss -0.3674 
2023-11-23 11:31:34.607569: Pseudo dice [0.7508, nan] 
2023-11-23 11:31:34.607645: Epoch time: 196.48 s 
2023-11-23 11:31:35.698518:  
2023-11-23 11:31:35.698646: Epoch 976 
2023-11-23 11:31:35.698747: Current learning rate: 0.00035 
2023-11-23 11:34:51.797198: train_loss -0.3858 
2023-11-23 11:34:51.797397: val_loss -0.3728 
2023-11-23 11:34:51.797523: Pseudo dice [0.753, nan] 
2023-11-23 11:34:51.797599: Epoch time: 196.1 s 
2023-11-23 11:34:52.891163:  
2023-11-23 11:34:52.891331: Epoch 977 
2023-11-23 11:34:52.891474: Current learning rate: 0.00034 
2023-11-23 11:38:09.702022: train_loss -0.3789 
2023-11-23 11:38:09.702226: val_loss -0.3497 
2023-11-23 11:38:09.702309: Pseudo dice [0.7162, nan] 
2023-11-23 11:38:09.702386: Epoch time: 196.81 s 
2023-11-23 11:38:10.909534:  
2023-11-23 11:38:10.909847: Epoch 978 
2023-11-23 11:38:10.910030: Current learning rate: 0.00032 
2023-11-23 11:41:27.787633: train_loss -0.3812 
2023-11-23 11:41:27.787832: val_loss -0.3644 
2023-11-23 11:41:27.787946: Pseudo dice [0.7454, nan] 
2023-11-23 11:41:27.788042: Epoch time: 196.88 s 
2023-11-23 11:41:28.885931:  
2023-11-23 11:41:28.886064: Epoch 979 
2023-11-23 11:41:28.886174: Current learning rate: 0.00031 
2023-11-23 11:44:45.253780: train_loss -0.3902 
2023-11-23 11:44:45.253973: val_loss -0.3746 
2023-11-23 11:44:45.254070: Pseudo dice [0.7631, nan] 
2023-11-23 11:44:45.254145: Epoch time: 196.37 s 
2023-11-23 11:44:45.254206: Yayy! New best EMA pseudo Dice: 0.733 
2023-11-23 11:44:46.493041:  
2023-11-23 11:44:46.493167: Epoch 980 
2023-11-23 11:44:46.493274: Current learning rate: 0.0003 
2023-11-23 11:48:02.677056: train_loss -0.3804 
2023-11-23 11:48:02.677268: val_loss -0.3606 
2023-11-23 11:48:02.677362: Pseudo dice [0.7348, nan] 
2023-11-23 11:48:02.677438: Epoch time: 196.18 s 
2023-11-23 11:48:02.677502: Yayy! New best EMA pseudo Dice: 0.7332 
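The "New best EMA pseudo Dice" messages track an exponential moving average of each epoch's mean foreground pseudo Dice, with nan entries (the second class here, which never appears in the validation batches) excluded from the mean. A minimal sketch of that update, assuming nnU-Net v2's 0.9/0.1 weighting:

```python
import math

def nanmean(values):
    """Mean over non-nan entries, mirroring how the logged [dice, nan] lists are reduced."""
    finite = [v for v in values if not math.isnan(v)]
    return sum(finite) / len(finite)

def ema_update(prev_ema, pseudo_dice, decay=0.9):
    """EMA of the mean foreground Dice; decay=0.9 is assumed from nnU-Net v2's trainer."""
    return prev_ema * decay + (1 - decay) * nanmean(pseudo_dice)

# Epoch 979 ended with EMA 0.733; epoch 980 logged pseudo Dice [0.7348, nan].
ema = ema_update(0.733, [0.7348, float("nan")])
# round(ema, 4) reproduces the 0.7332 printed above.
```

One EMA step from the logged epoch-979 value reproduces the epoch-980 "new best" of 0.7332, which is why a single strong epoch (e.g. 0.7773 at epoch 974) does not immediately move the best checkpoint.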
2023-11-23 11:48:03.899934:  
2023-11-23 11:48:03.900068: Epoch 981 
2023-11-23 11:48:03.900175: Current learning rate: 0.00028 
2023-11-23 11:51:21.371910: train_loss -0.3809 
2023-11-23 11:51:21.372123: val_loss -0.3492 
2023-11-23 11:51:21.372207: Pseudo dice [0.7144, nan] 
2023-11-23 11:51:21.372283: Epoch time: 197.47 s 
2023-11-23 11:51:22.486920:  
2023-11-23 11:51:22.487104: Epoch 982 
2023-11-23 11:51:22.487257: Current learning rate: 0.00027 
2023-11-23 11:54:39.140235: train_loss -0.3874 
2023-11-23 11:54:39.140427: val_loss -0.3619 
2023-11-23 11:54:39.140533: Pseudo dice [0.7452, nan] 
2023-11-23 11:54:39.140666: Epoch time: 196.65 s 
2023-11-23 11:54:40.326777:  
2023-11-23 11:54:40.326911: Epoch 983 
2023-11-23 11:54:40.327009: Current learning rate: 0.00026 
2023-11-23 11:57:56.615492: train_loss -0.3761 
2023-11-23 11:57:56.615710: val_loss -0.3458 
2023-11-23 11:57:56.615790: Pseudo dice [0.7226, nan] 
2023-11-23 11:57:56.615863: Epoch time: 196.29 s 
2023-11-23 11:57:57.696753:  
2023-11-23 11:57:57.696880: Epoch 984 
2023-11-23 11:57:57.696977: Current learning rate: 0.00024 
2023-11-23 12:01:14.030845: train_loss -0.3813 
2023-11-23 12:01:14.031096: val_loss -0.3694 
2023-11-23 12:01:14.031182: Pseudo dice [0.7596, nan] 
2023-11-23 12:01:14.031255: Epoch time: 196.33 s 
2023-11-23 12:01:14.031319: Yayy! New best EMA pseudo Dice: 0.7345 
2023-11-23 12:01:15.264899:  
2023-11-23 12:01:15.265025: Epoch 985 
2023-11-23 12:01:15.265123: Current learning rate: 0.00023 
2023-11-23 12:04:31.593045: train_loss -0.3949 
2023-11-23 12:04:31.593263: val_loss -0.3629 
2023-11-23 12:04:31.593344: Pseudo dice [0.7312, nan] 
2023-11-23 12:04:31.593426: Epoch time: 196.33 s 
2023-11-23 12:04:32.675928:  
2023-11-23 12:04:32.676050: Epoch 986 
2023-11-23 12:04:32.676157: Current learning rate: 0.00021 
2023-11-23 12:07:48.971934: train_loss -0.3949 
2023-11-23 12:07:48.972156: val_loss -0.3725 
2023-11-23 12:07:48.972244: Pseudo dice [0.7481, nan] 
2023-11-23 12:07:48.972321: Epoch time: 196.3 s 
2023-11-23 12:07:48.972383: Yayy! New best EMA pseudo Dice: 0.7355 
2023-11-23 12:07:50.199596:  
2023-11-23 12:07:50.199785: Epoch 987 
2023-11-23 12:07:50.199936: Current learning rate: 0.0002 
2023-11-23 12:11:06.674971: train_loss -0.3876 
2023-11-23 12:11:06.675174: val_loss -0.357 
2023-11-23 12:11:06.675255: Pseudo dice [0.7305, nan] 
2023-11-23 12:11:06.675329: Epoch time: 196.48 s 
2023-11-23 12:11:07.861032:  
2023-11-23 12:11:07.861177: Epoch 988 
2023-11-23 12:11:07.861290: Current learning rate: 0.00019 
2023-11-23 12:14:24.532481: train_loss -0.392 
2023-11-23 12:14:24.532726: val_loss -0.3527 
2023-11-23 12:14:24.532845: Pseudo dice [0.7208, nan] 
2023-11-23 12:14:24.532922: Epoch time: 196.67 s 
2023-11-23 12:14:25.612445:  
2023-11-23 12:14:25.612580: Epoch 989 
2023-11-23 12:14:25.612709: Current learning rate: 0.00017 
2023-11-23 12:17:42.104780: train_loss -0.3822 
2023-11-23 12:17:42.104962: val_loss -0.3637 
2023-11-23 12:17:42.105056: Pseudo dice [0.7214, nan] 
2023-11-23 12:17:42.105134: Epoch time: 196.49 s 
2023-11-23 12:17:43.189975:  
2023-11-23 12:17:43.190099: Epoch 990 
2023-11-23 12:17:43.190229: Current learning rate: 0.00016 
2023-11-23 12:21:00.155118: train_loss -0.3917 
2023-11-23 12:21:00.155329: val_loss -0.3588 
2023-11-23 12:21:00.155412: Pseudo dice [0.7307, nan] 
2023-11-23 12:21:00.155490: Epoch time: 196.97 s 
2023-11-23 12:21:01.250379:  
2023-11-23 12:21:01.250580: Epoch 991 
2023-11-23 12:21:01.250793: Current learning rate: 0.00014 
2023-11-23 12:24:18.369730: train_loss -0.3945 
2023-11-23 12:24:18.369968: val_loss -0.3591 
2023-11-23 12:24:18.370055: Pseudo dice [0.7258, nan] 
2023-11-23 12:24:18.370131: Epoch time: 197.12 s 
2023-11-23 12:24:19.557308:  
2023-11-23 12:24:19.557439: Epoch 992 
2023-11-23 12:24:19.557544: Current learning rate: 0.00013 
2023-11-23 12:27:36.190140: train_loss -0.3872 
2023-11-23 12:27:36.190349: val_loss -0.3641 
2023-11-23 12:27:36.190470: Pseudo dice [0.7376, nan] 
2023-11-23 12:27:36.190549: Epoch time: 196.63 s 
2023-11-23 12:27:37.274132:  
2023-11-23 12:27:37.274257: Epoch 993 
2023-11-23 12:27:37.274436: Current learning rate: 0.00011 
2023-11-23 12:30:54.273268: train_loss -0.3898 
2023-11-23 12:30:54.273482: val_loss -0.3361 
2023-11-23 12:30:54.273587: Pseudo dice [0.6959, nan] 
2023-11-23 12:30:54.273663: Epoch time: 197.0 s 
2023-11-23 12:30:55.374849:  
2023-11-23 12:30:55.374978: Epoch 994 
2023-11-23 12:30:55.375081: Current learning rate: 0.0001 
2023-11-23 12:34:11.880224: train_loss -0.3956 
2023-11-23 12:34:11.880408: val_loss -0.3624 
2023-11-23 12:34:11.880486: Pseudo dice [0.7316, nan] 
2023-11-23 12:34:11.880558: Epoch time: 196.51 s 
2023-11-23 12:34:12.958701:  
2023-11-23 12:34:12.958826: Epoch 995 
2023-11-23 12:34:12.958960: Current learning rate: 8e-05 
2023-11-23 12:37:29.582166: train_loss -0.393 
2023-11-23 12:37:29.582376: val_loss -0.3837 
2023-11-23 12:37:29.582460: Pseudo dice [0.781, nan] 
2023-11-23 12:37:29.582540: Epoch time: 196.62 s 
2023-11-23 12:37:30.667230:  
2023-11-23 12:37:30.667419: Epoch 996 
2023-11-23 12:37:30.667531: Current learning rate: 7e-05 
2023-11-23 12:40:47.335641: train_loss -0.3906 
2023-11-23 12:40:47.335819: val_loss -0.3643 
2023-11-23 12:40:47.335897: Pseudo dice [0.7514, nan] 
2023-11-23 12:40:47.335968: Epoch time: 196.67 s 
2023-11-23 12:40:47.336029: Yayy! New best EMA pseudo Dice: 0.7358 
2023-11-23 12:40:48.548105:  
2023-11-23 12:40:48.548226: Epoch 997 
2023-11-23 12:40:48.548329: Current learning rate: 5e-05 
2023-11-23 12:44:05.413106: train_loss -0.3834 
2023-11-23 12:44:05.413339: val_loss -0.3571 
2023-11-23 12:44:05.413425: Pseudo dice [0.7315, nan] 
2023-11-23 12:44:05.413502: Epoch time: 196.87 s 
2023-11-23 12:44:06.506734:  
2023-11-23 12:44:06.506863: Epoch 998 
2023-11-23 12:44:06.506966: Current learning rate: 4e-05 
2023-11-23 12:47:23.684194: train_loss -0.3949 
2023-11-23 12:47:23.684380: val_loss -0.3739 
2023-11-23 12:47:23.684459: Pseudo dice [0.7621, nan] 
2023-11-23 12:47:23.684533: Epoch time: 197.18 s 
2023-11-23 12:47:23.684604: Yayy! New best EMA pseudo Dice: 0.7381 
2023-11-23 12:47:24.898800:  
2023-11-23 12:47:24.898927: Epoch 999 
2023-11-23 12:47:24.899048: Current learning rate: 2e-05 
2023-11-23 12:50:42.249720: train_loss -0.3918 
2023-11-23 12:50:42.249938: val_loss -0.3659 
2023-11-23 12:50:42.250021: Pseudo dice [0.7429, nan] 
2023-11-23 12:50:42.250095: Epoch time: 197.35 s 
2023-11-23 12:50:42.250156: Yayy! New best EMA pseudo Dice: 0.7385 
2023-11-23 12:50:43.819580: Training done. 
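The learning rates printed at each epoch (0.00064 at epoch 953 down to 2e-05 at epoch 999) are consistent with nnU-Net's default polynomial decay, lr = initial_lr * (1 - epoch / max_epochs)^0.9. The defaults initial_lr = 0.01 and max_epochs = 1000 are assumed here rather than read from this log:

```python
def poly_lr(epoch, initial_lr=0.01, max_epochs=1000, exponent=0.9):
    """Polynomial LR decay as used by nnU-Net v2's PolyLRScheduler (defaults assumed)."""
    return initial_lr * (1 - epoch / max_epochs) ** exponent

# round(poly_lr(953), 5) -> 0.00064 and round(poly_lr(999), 5) -> 2e-05,
# matching the values logged at the start of those epochs.
```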
2023-11-23 12:50:43.845473: Using splits from existing split file: /home/xfr/nnUNet/preprocessed/Dataset223_ICTS2023/splits_final.json 
2023-11-23 12:50:43.846746: The split file contains 5 splits. 
2023-11-23 12:50:43.846809: Desired fold for training: 0 
2023-11-23 12:50:43.846859: This split has 1600 training and 400 validation cases. 
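The validation pass begins by selecting fold 0 from splits_final.json, which in nnU-Net holds one {"train": [...], "val": [...]} entry per fold. A small sketch of that selection (the file structure is assumed from nnU-Net's conventions; the synthetic case IDs are illustrative only):

```python
import json

def select_fold(splits, fold):
    """Return (train_ids, val_ids) for one fold of a parsed splits_final.json list."""
    entry = splits[fold]  # splits = json.load(open(".../splits_final.json"))
    return entry["train"], entry["val"]

# With 5 folds over 2000 cases, fold 0 yields the 1600 train / 400 validation
# cases reported above.
```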
2023-11-23 12:50:43.853469: predicting ICTS_0001 
2023-11-23 12:50:46.622981: predicting ICTS_0004 
2023-11-23 12:50:48.871774: predicting ICTS_0009 
2023-11-23 12:50:51.059525: predicting ICTS_0018 
2023-11-23 12:50:53.135421: predicting ICTS_0023 
2023-11-23 12:50:55.271463: predicting ICTS_0026 
2023-11-23 12:50:57.372457: predicting ICTS_0027 
2023-11-23 12:50:59.474476: predicting ICTS_0031 
2023-11-23 12:51:01.580256: predicting ICTS_0034 
2023-11-23 12:51:03.675914: predicting ICTS_0042 
2023-11-23 12:51:05.815702: predicting ICTS_0057 
2023-11-23 12:51:07.884814: predicting ICTS_0064 
2023-11-23 12:51:09.951244: predicting ICTS_0073 
2023-11-23 12:51:12.021330: predicting ICTS_0077 
2023-11-23 12:51:14.088017: predicting ICTS_0083 
2023-11-23 12:51:16.157238: predicting ICTS_0088 
2023-11-23 12:51:18.222216: predicting ICTS_0093 
2023-11-23 12:51:20.292011: predicting ICTS_0096 
2023-11-23 12:51:22.357533: predicting ICTS_0100 
2023-11-23 12:51:24.431087: predicting ICTS_0101 
2023-11-23 12:51:26.509554: predicting ICTS_0102 
2023-11-23 12:51:28.604899: predicting ICTS_0104 
2023-11-23 12:51:30.666558: predicting ICTS_0105 
2023-11-23 12:51:32.722399: predicting ICTS_0110 
2023-11-23 12:51:34.788460: predicting ICTS_0112 
2023-11-23 12:51:36.854468: predicting ICTS_0116 
2023-11-23 12:51:38.907834: predicting ICTS_0128 
2023-11-23 12:51:40.969696: predicting ICTS_0132 
2023-11-23 12:51:43.030478: predicting ICTS_0137 
2023-11-23 12:51:45.091191: predicting ICTS_0138 
2023-11-23 12:51:47.154033: predicting ICTS_0149 
2023-11-23 12:51:49.220389: predicting ICTS_0150 
2023-11-23 12:51:51.279176: predicting ICTS_0151 
2023-11-23 12:51:53.357905: predicting ICTS_0152 
2023-11-23 12:51:55.421598: predicting ICTS_0154 
2023-11-23 12:51:57.493240: predicting ICTS_0155 
2023-11-23 12:51:59.558170: predicting ICTS_0163 
2023-11-23 12:52:01.651725: predicting ICTS_0165 
2023-11-23 12:52:03.737194: predicting ICTS_0168 
2023-11-23 12:52:05.827886: predicting ICTS_0172 
2023-11-23 12:52:07.920246: predicting ICTS_0179 
2023-11-23 12:52:10.006269: predicting ICTS_0181 
2023-11-23 12:52:12.095562: predicting ICTS_0182 
2023-11-23 12:52:14.186777: predicting ICTS_0188 
2023-11-23 12:52:16.281882: predicting ICTS_0189 
2023-11-23 12:52:18.370303: predicting ICTS_0191 
2023-11-23 12:52:20.458584: predicting ICTS_0194 
2023-11-23 12:52:22.541872: predicting ICTS_0200 
2023-11-23 12:52:24.602407: predicting ICTS_0202 
2023-11-23 12:52:26.761423: predicting ICTS_0207 
2023-11-23 12:52:28.817024: predicting ICTS_0208 
2023-11-23 12:52:30.880516: predicting ICTS_0209 
2023-11-23 12:52:32.947719: predicting ICTS_0210 
2023-11-23 12:52:35.017849: predicting ICTS_0220 
2023-11-23 12:52:37.109487: predicting ICTS_0222 
2023-11-23 12:52:39.174354: predicting ICTS_0223 
2023-11-23 12:52:41.232934: predicting ICTS_0228 
2023-11-23 12:52:43.294882: predicting ICTS_0230 
2023-11-23 12:52:45.369769: predicting ICTS_0239 
2023-11-23 12:52:47.477777: predicting ICTS_0251 
2023-11-23 12:52:49.540233: predicting ICTS_0261 
2023-11-23 12:52:51.612802: predicting ICTS_0265 
2023-11-23 12:52:53.672120: predicting ICTS_0266 
2023-11-23 12:52:55.810688: predicting ICTS_0275 
2023-11-23 12:52:57.975203: predicting ICTS_0284 
2023-11-23 12:53:00.046100: predicting ICTS_0285 
2023-11-23 12:53:02.120171: predicting ICTS_0299 
2023-11-23 12:53:04.217281: predicting ICTS_0301 
2023-11-23 12:53:06.329481: predicting ICTS_0303 
2023-11-23 12:53:08.432465: predicting ICTS_0309 
2023-11-23 12:53:10.522505: predicting ICTS_0315 
2023-11-23 12:53:12.616154: predicting ICTS_0325 
2023-11-23 12:53:14.712914: predicting ICTS_0328 
2023-11-23 12:53:16.811571: predicting ICTS_0329 
2023-11-23 12:53:18.916447: predicting ICTS_0332 
2023-11-23 12:53:21.048918: predicting ICTS_0336 
2023-11-23 12:53:23.224667: predicting ICTS_0339 
2023-11-23 12:53:25.311059: predicting ICTS_0348 
2023-11-23 12:53:27.405841: predicting ICTS_0358 
2023-11-23 12:53:29.503093: predicting ICTS_0359 
2023-11-23 12:53:31.634501: predicting ICTS_0362 
2023-11-23 12:53:33.725679: predicting ICTS_0363 
2023-11-23 12:53:35.848859: predicting ICTS_0370 
2023-11-23 12:53:37.949439: predicting ICTS_0371 
2023-11-23 12:53:40.040583: predicting ICTS_0375 
2023-11-23 12:53:42.134974: predicting ICTS_0382 
2023-11-23 12:53:44.228098: predicting ICTS_0384 
2023-11-23 12:53:46.418460: predicting ICTS_0391 
2023-11-23 12:53:48.512175: predicting ICTS_0392 
2023-11-23 12:53:50.602822: predicting ICTS_0395 
2023-11-23 12:53:52.696061: predicting ICTS_0396 
2023-11-23 12:53:54.790431: predicting ICTS_0404 
2023-11-23 12:53:56.958106: predicting ICTS_0413 
2023-11-23 12:53:59.170256: predicting ICTS_0414 
2023-11-23 12:54:01.321490: predicting ICTS_0415 
2023-11-23 12:54:03.506760: predicting ICTS_0419 
2023-11-23 12:54:05.727944: predicting ICTS_0424 
2023-11-23 12:54:07.911799: predicting ICTS_0428 
2023-11-23 12:54:10.006218: predicting ICTS_0429 
2023-11-23 12:54:12.102188: predicting ICTS_0431 
2023-11-23 12:54:14.195252: predicting ICTS_0433 
2023-11-23 12:54:16.307929: predicting ICTS_0437 
2023-11-23 12:54:18.405330: predicting ICTS_0450 
2023-11-23 12:54:20.501465: predicting ICTS_0454 
2023-11-23 12:54:22.591961: predicting ICTS_0455 
2023-11-23 12:54:24.713870: predicting ICTS_0456 
2023-11-23 12:54:26.914889: predicting ICTS_0463 
2023-11-23 12:54:29.009480: predicting ICTS_0466 
2023-11-23 12:54:31.127226: predicting ICTS_0469 
2023-11-23 12:54:33.226165: predicting ICTS_0470 
2023-11-23 12:54:35.314874: predicting ICTS_0488 
2023-11-23 12:54:37.474089: predicting ICTS_0489 
2023-11-23 12:54:39.572481: predicting ICTS_0490 
2023-11-23 12:54:41.703141: predicting ICTS_0503 
2023-11-23 12:54:43.800197: predicting ICTS_0509 
2023-11-23 12:54:45.929934: predicting ICTS_0510 
2023-11-23 12:54:47.990920: predicting ICTS_0514 
2023-11-23 12:54:50.155834: predicting ICTS_0522 
2023-11-23 12:54:52.237433: predicting ICTS_0523 
2023-11-23 12:54:54.298972: predicting ICTS_0527 
2023-11-23 12:54:56.485604: predicting ICTS_0529 
2023-11-23 12:54:58.558900: predicting ICTS_0530 
2023-11-23 12:55:00.620997: predicting ICTS_0545 
2023-11-23 12:55:02.795624: predicting ICTS_0552 
2023-11-23 12:55:04.865617: predicting ICTS_0553 
2023-11-23 12:55:06.928346: predicting ICTS_0564 
2023-11-23 12:55:08.987791: predicting ICTS_0566 
2023-11-23 12:55:11.067854: predicting ICTS_0569 
2023-11-23 12:55:13.140085: predicting ICTS_0572 
2023-11-23 12:55:15.222991: predicting ICTS_0581 
2023-11-23 12:55:17.309713: predicting ICTS_0587 
2023-11-23 12:55:19.392963: predicting ICTS_0589 
2023-11-23 12:55:21.467578: predicting ICTS_0590 
2023-11-23 12:55:23.554911: predicting ICTS_0604 
2023-11-23 12:55:25.684396: predicting ICTS_0607 
2023-11-23 12:55:27.755197: predicting ICTS_0613 
2023-11-23 12:55:29.821854: predicting ICTS_0618 
2023-11-23 12:55:31.885560: predicting ICTS_0623 
2023-11-23 12:55:33.948457: predicting ICTS_0630 
2023-11-23 12:55:36.019145: predicting ICTS_0638 
2023-11-23 12:55:38.078031: predicting ICTS_0639 
2023-11-23 12:55:40.146884: predicting ICTS_0649 
2023-11-23 12:55:42.226292: predicting ICTS_0659 
2023-11-23 12:55:44.295364: predicting ICTS_0665 
2023-11-23 12:55:46.449645: predicting ICTS_0667 
2023-11-23 12:55:48.621818: predicting ICTS_0669 
2023-11-23 12:55:50.723401: predicting ICTS_0673 
2023-11-23 12:55:52.794910: predicting ICTS_0674 
2023-11-23 12:55:54.890303: predicting ICTS_0675 
2023-11-23 12:55:56.997961: predicting ICTS_0678 
2023-11-23 12:55:59.097215: predicting ICTS_0687 
2023-11-23 12:56:01.251379: predicting ICTS_0693 
2023-11-23 12:56:03.441383: predicting ICTS_0699 
2023-11-23 12:56:05.567966: predicting ICTS_0711 
2023-11-23 12:56:07.663160: predicting ICTS_0715 
2023-11-23 12:56:09.807617: predicting ICTS_0716 
2023-11-23 12:56:11.902582: predicting ICTS_0721 
2023-11-23 12:56:13.995119: predicting ICTS_0722 
2023-11-23 12:56:16.186913: predicting ICTS_0725 
2023-11-23 12:56:18.363384: predicting ICTS_0729 
2023-11-23 12:56:20.453650: predicting ICTS_0733 
2023-11-23 12:56:22.553239: predicting ICTS_0734 
2023-11-23 12:56:24.651696: predicting ICTS_0737 
2023-11-23 12:56:26.879265: predicting ICTS_0747 
2023-11-23 12:56:28.975039: predicting ICTS_0751 
2023-11-23 12:56:31.074790: predicting ICTS_0757 
2023-11-23 12:56:33.162889: predicting ICTS_0764 
2023-11-23 12:56:35.240510: predicting ICTS_0771 
2023-11-23 12:56:37.328236: predicting ICTS_0785 
2023-11-23 12:56:39.393666: predicting ICTS_0788 
2023-11-23 12:56:41.496196: predicting ICTS_0794 
2023-11-23 12:56:43.588509: predicting ICTS_0796 
2023-11-23 12:56:45.776998: predicting ICTS_0809 
2023-11-23 12:56:47.965705: predicting ICTS_0813 
2023-11-23 12:56:50.036175: predicting ICTS_0815 
2023-11-23 12:56:52.121144: predicting ICTS_0817 
2023-11-23 12:56:54.192525: predicting ICTS_0820 
2023-11-23 12:56:56.377987: predicting ICTS_0821 
2023-11-23 12:56:58.453855: predicting ICTS_0824 
2023-11-23 12:57:00.525927: predicting ICTS_0843 
2023-11-23 12:57:02.605376: predicting ICTS_0847 
2023-11-23 12:57:04.701480: predicting ICTS_0848 
2023-11-23 12:57:06.889880: predicting ICTS_0850 
2023-11-23 12:57:08.957122: predicting ICTS_0853 
2023-11-23 12:57:11.030590: predicting ICTS_0861 
2023-11-23 12:57:13.100977: predicting ICTS_0864 
2023-11-23 12:57:15.171339: predicting ICTS_0866 
2023-11-23 12:57:17.288397: predicting ICTS_0870 
2023-11-23 12:57:19.362709: predicting ICTS_0872 
2023-11-23 12:57:21.435693: predicting ICTS_0874 
2023-11-23 12:57:23.514386: predicting ICTS_0879 
2023-11-23 12:57:25.591393: predicting ICTS_0888 
2023-11-23 12:57:27.662239: predicting ICTS_0904 
2023-11-23 12:57:29.735999: predicting ICTS_0911 
2023-11-23 12:57:31.805098: predicting ICTS_0912 
2023-11-23 12:57:33.874125: predicting ICTS_0915 
2023-11-23 12:57:36.007755: predicting ICTS_0920 
2023-11-23 12:57:38.120239: predicting ICTS_0927 
2023-11-23 12:57:40.188270: predicting ICTS_0928 
2023-11-23 12:57:42.269470: predicting ICTS_0931 
2023-11-23 12:57:44.341069: predicting ICTS_0934 
2023-11-23 12:57:46.475422: predicting ICTS_0943 
2023-11-23 12:57:48.547483: predicting ICTS_0951 
2023-11-23 12:57:50.617591: predicting ICTS_0954 
2023-11-23 12:57:52.687563: predicting ICTS_0955 
2023-11-23 12:57:54.756894: predicting ICTS_0956 
2023-11-23 12:57:56.868904: predicting ICTS_0960 
2023-11-23 12:57:58.943266: predicting ICTS_0970 
2023-11-23 12:58:01.017381: predicting ICTS_0972 
2023-11-23 12:58:03.088597: predicting ICTS_0975 
2023-11-23 12:58:05.190166: predicting ICTS_0976 
2023-11-23 12:58:07.371589: predicting ICTS_0977 
2023-11-23 12:58:09.539064: predicting ICTS_0982 
2023-11-23 12:58:11.649910: predicting ICTS_0988 
2023-11-23 12:58:13.723974: predicting ICTS_0990 
2023-11-23 12:58:15.862853: predicting ICTS_0991 
2023-11-23 12:58:17.943245: predicting ICTS_0992 
2023-11-23 12:58:20.101607: predicting ICTS_1005 
2023-11-23 12:58:22.175030: predicting ICTS_1012 
2023-11-23 12:58:24.251877: predicting ICTS_1018 
2023-11-23 12:58:26.323667: predicting ICTS_1020 
2023-11-23 12:58:28.397431: predicting ICTS_1021 
2023-11-23 12:58:30.487460: predicting ICTS_1031 
2023-11-23 12:58:32.564251: predicting ICTS_1032 
2023-11-23 12:58:34.638716: predicting ICTS_1035 
2023-11-23 12:58:36.705484: predicting ICTS_1048 
2023-11-23 12:58:38.774820: predicting ICTS_1050 
2023-11-23 12:58:40.839055: predicting ICTS_1053 
2023-11-23 12:58:42.905407: predicting ICTS_1068 
2023-11-23 12:58:44.968045: predicting ICTS_1072 
2023-11-23 12:58:47.041194: predicting ICTS_1073 
2023-11-23 12:58:49.098312: predicting ICTS_1086 
2023-11-23 12:58:51.157839: predicting ICTS_1095 
2023-11-23 12:58:53.220793: predicting ICTS_1100 
2023-11-23 12:58:55.277025: predicting ICTS_1109 
2023-11-23 12:58:56.681218: predicting ICTS_1110 
2023-11-23 12:58:58.727350: predicting ICTS_1112 
2023-11-23 12:59:00.788589: predicting ICTS_1118 
2023-11-23 12:59:02.219024: predicting ICTS_1119 
2023-11-23 12:59:04.280461: predicting ICTS_1136 
2023-11-23 12:59:06.341993: predicting ICTS_1148 
2023-11-23 12:59:08.403863: predicting ICTS_1154 
2023-11-23 12:59:10.447378: predicting ICTS_1155 
2023-11-23 12:59:11.750577: predicting ICTS_1159 
2023-11-23 12:59:13.375338: predicting ICTS_1160 
2023-11-23 12:59:15.436613: predicting ICTS_1173 
2023-11-23 12:59:17.478945: predicting ICTS_1178 
2023-11-23 12:59:19.366047: predicting ICTS_1180 
2023-11-23 12:59:21.427309: predicting ICTS_1181 
2023-11-23 12:59:23.134427: predicting ICTS_1206 
2023-11-23 12:59:25.184522: predicting ICTS_1208 
2023-11-23 12:59:27.164837: predicting ICTS_1209 
2023-11-23 12:59:29.208958: predicting ICTS_1211 
2023-11-23 12:59:31.232592: predicting ICTS_1213 
2023-11-23 12:59:33.297021: predicting ICTS_1214 
2023-11-23 12:59:35.358207: predicting ICTS_1215 
2023-11-23 12:59:37.405250: predicting ICTS_1217 
2023-11-23 12:59:39.402583: predicting ICTS_1218 
2023-11-23 12:59:41.447114: predicting ICTS_1219 
2023-11-23 12:59:43.489053: predicting ICTS_1222 
2023-11-23 12:59:45.534698: predicting ICTS_1223 
2023-11-23 12:59:47.065368: predicting ICTS_1226 
2023-11-23 12:59:49.119198: predicting ICTS_1237 
2023-11-23 12:59:51.210310: predicting ICTS_1240 
2023-11-23 12:59:53.185318: predicting ICTS_1249 
2023-11-23 12:59:55.281597: predicting ICTS_1253 
2023-11-23 12:59:57.366771: predicting ICTS_1256 
2023-11-23 12:59:59.466570: predicting ICTS_1263 
2023-11-23 13:00:01.528458: predicting ICTS_1264 
2023-11-23 13:00:03.592527: predicting ICTS_1279 
2023-11-23 13:00:05.662204: predicting ICTS_1287 
2023-11-23 13:00:06.922760: predicting ICTS_1292 
2023-11-23 13:00:08.964267: predicting ICTS_1303 
2023-11-23 13:00:11.035947: predicting ICTS_1306 
2023-11-23 13:00:13.104588: predicting ICTS_1324 
2023-11-23 13:00:15.179276: predicting ICTS_1329 
2023-11-23 13:00:17.253829: predicting ICTS_1337 
2023-11-23 13:00:19.371891: predicting ICTS_1347 
2023-11-23 13:00:21.449501: predicting ICTS_1359 
2023-11-23 13:00:23.479318: predicting ICTS_1361 
2023-11-23 13:00:25.553974: predicting ICTS_1362 
2023-11-23 13:00:27.625540: predicting ICTS_1373 
2023-11-23 13:00:29.680639: predicting ICTS_1377 
2023-11-23 13:00:31.733030: predicting ICTS_1391 
2023-11-23 13:00:33.806759: predicting ICTS_1400 
2023-11-23 13:00:35.827858: predicting ICTS_1401 
2023-11-23 13:00:37.931996: predicting ICTS_1416 
2023-11-23 13:00:40.015315: predicting ICTS_1424 
2023-11-23 13:00:42.114795: predicting ICTS_1433 
2023-11-23 13:00:44.121633: predicting ICTS_1446 
2023-11-23 13:00:46.224402: predicting ICTS_1449 
2023-11-23 13:00:48.319966: predicting ICTS_1451 
2023-11-23 13:00:50.406332: predicting ICTS_1453 
2023-11-23 13:00:52.428877: predicting ICTS_1471 
2023-11-23 13:00:54.526186: predicting ICTS_1472 
2023-11-23 13:00:56.623322: predicting ICTS_1477 
2023-11-23 13:00:58.442431: predicting ICTS_1480 
2023-11-23 13:01:00.542126: predicting ICTS_1483 
2023-11-23 13:01:02.628150: predicting ICTS_1487 
2023-11-23 13:01:04.734022: predicting ICTS_1489 
2023-11-23 13:01:06.831434: predicting ICTS_1504 
2023-11-23 13:01:08.881979: predicting ICTS_1506 
2023-11-23 13:01:10.963319: predicting ICTS_1508 
2023-11-23 13:01:13.057598: predicting ICTS_1514 
2023-11-23 13:01:15.149502: predicting ICTS_1521 
2023-11-23 13:01:17.242822: predicting ICTS_1536 
2023-11-23 13:01:19.273562: predicting ICTS_1538 
2023-11-23 13:01:21.377496: predicting ICTS_1544 
2023-11-23 13:01:23.478414: predicting ICTS_1552 
2023-11-23 13:01:25.572452: predicting ICTS_1553 
2023-11-23 13:01:27.672062: predicting ICTS_1556 
2023-11-23 13:01:29.750466: predicting ICTS_1563 
2023-11-23 13:01:31.848365: predicting ICTS_1565 
2023-11-23 13:01:33.945091: predicting ICTS_1567 
2023-11-23 13:01:36.041499: predicting ICTS_1569 
2023-11-23 13:01:38.135093: predicting ICTS_1572 
2023-11-23 13:01:40.214100: predicting ICTS_1575 
2023-11-23 13:01:42.261130: predicting ICTS_1600 
2023-11-23 13:01:44.356195: predicting ICTS_1602 
2023-11-23 13:01:46.451360: predicting ICTS_1604 
2023-11-23 13:01:47.974398: predicting ICTS_1610 
2023-11-23 13:01:50.067528: predicting ICTS_1636 
2023-11-23 13:01:51.625850: predicting ICTS_1640 
2023-11-23 13:01:53.671141: predicting ICTS_1644 
2023-11-23 13:01:55.674976: predicting ICTS_1648 
2023-11-23 13:01:57.756723: predicting ICTS_1650 
2023-11-23 13:01:59.851478: predicting ICTS_1651 
2023-11-23 13:02:01.947098: predicting ICTS_1657 
2023-11-23 13:02:04.041244: predicting ICTS_1664 
2023-11-23 13:02:06.137452: predicting ICTS_1665 
2023-11-23 13:02:08.212768: predicting ICTS_1674 
2023-11-23 13:02:10.304550: predicting ICTS_1679 
2023-11-23 13:02:12.380176: predicting ICTS_1689 
2023-11-23 13:02:14.455595: predicting ICTS_1700 
2023-11-23 13:02:16.556953: predicting ICTS_1713 
2023-11-23 13:02:18.651350: predicting ICTS_1719 
2023-11-23 13:02:20.701218: predicting ICTS_1728 
2023-11-23 13:02:22.776212: predicting ICTS_1738 
2023-11-23 13:02:24.819642: predicting ICTS_1741 
2023-11-23 13:02:26.910063: predicting ICTS_1749 
2023-11-23 13:02:28.672706: predicting ICTS_1754 
2023-11-23 13:02:30.717627: predicting ICTS_1761 
2023-11-23 13:02:32.558252: predicting ICTS_1762 
2023-11-23 13:02:34.113635: predicting ICTS_1765 
2023-11-23 13:02:36.213779: predicting ICTS_1776 
2023-11-23 13:02:38.290435: predicting ICTS_1777 
2023-11-23 13:02:39.951923: predicting ICTS_1781 
2023-11-23 13:02:41.473087: predicting ICTS_1783 
2023-11-23 13:02:43.512670: predicting ICTS_1785 
2023-11-23 13:02:45.561996: predicting ICTS_1786 
2023-11-23 13:02:47.641919: predicting ICTS_1792 
2023-11-23 13:02:49.290632: predicting ICTS_1795 
2023-11-23 13:02:51.374356: predicting ICTS_1803 
2023-11-23 13:02:52.890926: predicting ICTS_1808 
2023-11-23 13:02:54.968833: predicting ICTS_1812 
2023-11-23 13:02:56.847657: predicting ICTS_1816 
2023-11-23 13:02:58.645081: predicting ICTS_1817 
2023-11-23 13:03:00.618478: predicting ICTS_1819 
2023-11-23 13:03:02.840138: predicting ICTS_1830 
2023-11-23 13:03:04.916898: predicting ICTS_1837 
2023-11-23 13:03:06.993130: predicting ICTS_1838 
2023-11-23 13:03:09.084535: predicting ICTS_1841 
2023-11-23 13:03:11.176392: predicting ICTS_1843 
2023-11-23 13:03:13.248004: predicting ICTS_1852 
2023-11-23 13:03:15.327115: predicting ICTS_1853 
2023-11-23 13:03:17.420789: predicting ICTS_1857 
2023-11-23 13:03:19.512448: predicting ICTS_1860 
2023-11-23 13:03:21.605490: predicting ICTS_1870 
2023-11-23 13:03:23.599644: predicting ICTS_1871 
2023-11-23 13:03:25.754409: predicting ICTS_1885 
2023-11-23 13:03:27.544819: predicting ICTS_1886 
2023-11-23 13:03:29.622862: predicting ICTS_1887 
2023-11-23 13:03:31.633410: predicting ICTS_1889 
2023-11-23 13:03:33.684321: predicting ICTS_1894 
2023-11-23 13:03:35.743716: predicting ICTS_1895 
2023-11-23 13:03:37.823513: predicting ICTS_1896 
2023-11-23 13:03:39.867315: predicting ICTS_1897 
2023-11-23 13:03:41.877321: predicting ICTS_1898 
2023-11-23 13:03:43.938977: predicting ICTS_1902 
2023-11-23 13:03:46.004391: predicting ICTS_1905 
2023-11-23 13:03:48.049417: predicting ICTS_1907 
2023-11-23 13:03:50.095157: predicting ICTS_1910 
2023-11-23 13:03:52.136461: predicting ICTS_1911 
2023-11-23 13:03:53.668384: predicting ICTS_1923 
2023-11-23 13:03:55.731056: predicting ICTS_1938 
2023-11-23 13:03:57.795746: predicting ICTS_1945 
2023-11-23 13:03:59.842791: predicting ICTS_1948 
2023-11-23 13:04:01.889477: predicting ICTS_1953 
2023-11-23 13:04:03.868821: predicting ICTS_1955 
2023-11-23 13:04:05.594233: predicting ICTS_1960 
2023-11-23 13:04:07.638461: predicting ICTS_1968 
2023-11-23 13:04:09.707299: predicting ICTS_1970 
2023-11-23 13:04:11.788808: predicting ICTS_1973 
2023-11-23 13:04:13.833868: predicting ICTS_1974 
2023-11-23 13:04:15.786224: predicting ICTS_1975 
2023-11-23 13:04:17.836065: predicting ICTS_1977 
2023-11-23 13:04:19.739898: predicting ICTS_1981 
2023-11-23 13:04:21.784833: predicting ICTS_1986 
2023-11-23 13:04:23.829931: predicting ICTS_1991 
2023-11-23 13:04:25.881742: predicting ICTS_1994 
2023-11-23 13:04:34.399034: Validation complete 
2023-11-23 13:04:34.399135: Mean Validation Dice:  nan 
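Note on the `nan` mean Dice above (an illustration, not part of the original log): a NaN typically enters when a per-case Dice is undefined, e.g. when both the prediction and the reference mask are empty, so the denominator of the Dice coefficient is 0. A plain arithmetic mean then propagates that NaN into the reported summary. A minimal sketch, assuming binary NumPy masks (the function and variable names below are illustrative, not nnU-Net's actual evaluation code):

```python
import numpy as np

def dice(pred: np.ndarray, ref: np.ndarray) -> float:
    """Soerensen-Dice coefficient for binary masks; returns NaN when both masks are empty (0/0)."""
    intersection = np.sum(pred & ref)
    denom = pred.sum() + ref.sum()
    return 2.0 * intersection / denom if denom > 0 else float("nan")

# One undefined case is enough to turn the plain mean into NaN:
scores = [
    dice(np.zeros((4, 4), bool), np.zeros((4, 4), bool)),  # empty vs. empty -> NaN
    dice(np.ones((4, 4), bool), np.ones((4, 4), bool)),    # perfect overlap -> 1.0
]
print(np.mean(scores))     # nan  (NaN propagates)
print(np.nanmean(scores))  # 1.0  (NaN cases ignored)
```

Under this assumption, checking the per-case results (e.g. the fold's `summary.json`) for cases with empty reference segmentations is the natural first debugging step.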
