
Juristische Konsilien Tübingen

Stefan Weil edited this page Dec 4, 2022 · 14 revisions

Training

2022-11-26

Training with augmentation (failed)

Training was started with the following command:

(venv3.9_20221126) stweil@ocr-01:~/src/gitlab/scripta/escriptorium/Juristische_Konsilien_Tuebingen/Transkribus_Exporte$ time nice ketos train -f page -t list.train -e list.eval -o Juristische_Konsilien_Tuebingen+256 -d cuda:0 --augment --workers 24 -r 0.0001 -B 1 --min-epochs 200 --lag 20 -w 0 -s '[256,64,0,1 Cr4,2,8,4,2 Cr4,2,32,1,1 Mp4,2,4,2 Cr3,3,64,1,1 Mp1,2,1,2 S1(1x0)1,3 Lbx256 Do0.5 Lbx256 Do0.5 Lbx256 Do0.5 Cr255,1,85,1,1]'
Torch version 1.14.0.dev20221125+cu117 has not been tested with coremltools. You may run into unexpected errors. Torch 1.12.1 is the most recent version that has been tested.
[11/26/22 09:28:53] WARNING  alphabet mismatch: chars in training set only: {'’', '=', '‡', 'º', 'ꝸ', 'X', '╒', '†', '♃', '[', 'Ü', '½', ']', 'û', 'ꝯ', 'ꝟ'} (not included in accuracy test during training) train.py:307
GPU available: True (cuda), used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
`Trainer(val_check_interval=1.0)` was configured so validation will run at the end of the training epoch..
LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]
┏━━━━┳━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━┓
┃    ┃ Name      ┃ Type                     ┃ Params ┃
┡━━━━╇━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━┩
│ 0  │ net       │ MultiParamSequential     │ 15.2 M │
│ 1  │ net.C_0   │ ActConv2D                │     72 │
│ 2  │ net.C_1   │ ActConv2D                │  2.1 K │
│ 3  │ net.Mp_2  │ MaxPool                  │      0 │
│ 4  │ net.C_3   │ ActConv2D                │ 18.5 K │
│ 5  │ net.Mp_4  │ MaxPool                  │      0 │
│ 6  │ net.S_5   │ Reshape                  │      0 │
│ 7  │ net.L_6   │ TransposedSummarizingRNN │  921 K │
│ 8  │ net.Do_7  │ Dropout                  │      0 │
│ 9  │ net.L_8   │ TransposedSummarizingRNN │  1.6 M │
│ 10 │ net.Do_9  │ Dropout                  │      0 │
│ 11 │ net.L_10  │ TransposedSummarizingRNN │  1.6 M │
│ 12 │ net.Do_11 │ Dropout                  │      0 │
│ 13 │ net.C_12  │ ActConv2D                │ 11.1 M │
│ 14 │ net.O_13  │ LinSoftmax               │ 10.4 K │
└────┴───────────┴──────────────────────────┴────────┘
Trainable params: 15.2 M                                                                                                                                                                                              
Non-trainable params: 0                                                                                                                                                                                               
Total params: 15.2 M                                                                                                                                                                                                  
Total estimated model params size (MB): 60                                                                                                                                                                            
stage 0/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:21 val_accuracy: 0.00452  early_stopping: 0/20 0.00452
stage 1/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.04285  early_stopping: 0/20 0.04285
stage 2/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.03587  early_stopping: 1/20 0.04285
stage 3/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.03603  early_stopping: 2/20 0.04285
stage 4/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.21317  early_stopping: 0/20 0.21317
stage 5/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:19 val_accuracy: 0.27053  early_stopping: 0/20 0.27053
stage 6/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.33514  early_stopping: 0/20 0.33514
stage 7/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.36232  early_stopping: 0/20 0.36232
stage 8/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.41037  early_stopping: 0/20 0.41037
stage 9/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.43547  early_stopping: 0/20 0.43547
stage 10/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.48354  early_stopping: 0/20 0.48354
stage 11/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.53467  early_stopping: 0/20 0.53467
stage 12/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.57232  early_stopping: 0/20 0.57232
stage 13/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.60660  early_stopping: 0/20 0.60660
stage 14/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.63319  early_stopping: 0/20 0.63319
stage 15/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:20 val_accuracy: 0.66243  early_stopping: 0/20 0.66243
stage 16/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.67968  early_stopping: 0/20 0.67968
stage 17/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:19 val_accuracy: 0.69946  early_stopping: 0/20 0.69946
stage 18/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.71565  early_stopping: 0/20 0.71565
stage 19/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.73442  early_stopping: 0/20 0.73442
stage 20/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.74554  early_stopping: 0/20 0.74554
stage 21/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.75933  early_stopping: 0/20 0.75933
stage 22/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.77222  early_stopping: 0/20 0.77222
stage 23/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.76541  early_stopping: 1/20 0.77222
stage 24/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.77811  early_stopping: 0/20 0.77811
stage 25/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.78512  early_stopping: 0/20 0.78512
stage 26/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:20 val_accuracy: 0.79349  early_stopping: 0/20 0.79349
stage 27/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.79869  early_stopping: 0/20 0.79869
stage 28/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.79875  early_stopping: 0/20 0.79875
stage 29/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.80726  early_stopping: 0/20 0.80726
stage 30/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.80875  early_stopping: 0/20 0.80875
stage 31/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.81809  early_stopping: 0/20 0.81809
stage 32/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.82413  early_stopping: 0/20 0.82413
stage 33/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:19 val_accuracy: 0.82815  early_stopping: 0/20 0.82815
stage 34/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.82612  early_stopping: 1/20 0.82815
stage 35/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.82837  early_stopping: 0/20 0.82837
stage 36/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.82768  early_stopping: 1/20 0.82837
stage 37/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.83363  early_stopping: 0/20 0.83363
stage 38/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.82684  early_stopping: 1/20 0.83363
stage 39/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.83497  early_stopping: 0/20 0.83497
stage 40/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.83908  early_stopping: 0/20 0.83908
stage 41/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.83861  early_stopping: 1/20 0.83908
stage 42/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:20 val_accuracy: 0.84163  early_stopping: 0/20 0.84163
stage 43/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.84017  early_stopping: 1/20 0.84163
stage 44/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.84144  early_stopping: 2/20 0.84163
stage 45/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.84219  early_stopping: 0/20 0.84219
stage 46/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.85063  early_stopping: 0/20 0.85063
stage 47/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.84509  early_stopping: 1/20 0.85063
stage 48/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.85661  early_stopping: 0/20 0.85661
stage 49/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.85188  early_stopping: 1/20 0.85661
stage 50/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.85399  early_stopping: 2/20 0.85661
stage 51/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.86290  early_stopping: 0/20 0.86290
stage 52/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.85091  early_stopping: 1/20 0.86290
stage 53/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.85384  early_stopping: 2/20 0.86290
stage 54/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.85116  early_stopping: 3/20 0.86290
stage 55/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:19 val_accuracy: 0.85813  early_stopping: 4/20 0.86290
stage 56/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.85742  early_stopping: 5/20 0.86290
stage 57/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.85689  early_stopping: 6/20 0.86290
stage 58/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.85798  early_stopping: 7/20 0.86290
stage 59/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.85954  early_stopping: 8/20 0.86290
stage 60/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.85692  early_stopping: 9/20 0.86290
stage 61/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.85371  early_stopping: 10/20 0.86290
stage 62/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.85792  early_stopping: 11/20 0.86290
stage 63/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.85798  early_stopping: 12/20 0.86290
stage 64/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:19 val_accuracy: 0.86168  early_stopping: 13/20 0.86290
stage 65/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.87140  early_stopping: 0/20 0.87140
stage 66/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.86573  early_stopping: 1/20 0.87140
stage 67/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.86586  early_stopping: 2/20 0.87140
stage 68/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.86682  early_stopping: 3/20 0.87140
stage 69/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.86137  early_stopping: 4/20 0.87140
stage 70/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.86365  early_stopping: 5/20 0.87140
stage 71/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.87015  early_stopping: 6/20 0.87140
stage 72/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.86816  early_stopping: 7/20 0.87140
stage 73/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.86984  early_stopping: 8/20 0.87140
stage 74/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.85617  early_stopping: 9/20 0.87140
stage 75/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.86200  early_stopping: 10/20 0.87140
stage 76/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:20 val_accuracy: 0.86595  early_stopping: 11/20 0.87140
stage 77/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87314  early_stopping: 0/20 0.87314
stage 78/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.85795  early_stopping: 1/20 0.87314
stage 79/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.86997  early_stopping: 2/20 0.87314
stage 80/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.86025  early_stopping: 3/20 0.87314
stage 81/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.86486  early_stopping: 4/20 0.87314
stage 82/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.87193  early_stopping: 5/20 0.87314
stage 83/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.86558  early_stopping: 6/20 0.87314
stage 84/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87296  early_stopping: 7/20 0.87314
stage 85/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.86931  early_stopping: 8/20 0.87314
stage 86/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.86720  early_stopping: 9/20 0.87314
stage 87/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.86947  early_stopping: 10/20 0.87314
stage 88/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87532  early_stopping: 0/20 0.87532
stage 89/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87068  early_stopping: 1/20 0.87532
stage 90/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.86729  early_stopping: 2/20 0.87532
stage 91/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.87974  early_stopping: 0/20 0.87974
stage 92/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.86913  early_stopping: 1/20 0.87974
stage 93/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87339  early_stopping: 2/20 0.87974
stage 94/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87177  early_stopping: 3/20 0.87974
stage 95/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.87137  early_stopping: 4/20 0.87974
stage 96/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.86358  early_stopping: 5/20 0.87974
stage 97/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.87103  early_stopping: 6/20 0.87974
stage 98/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:20 val_accuracy: 0.86362  early_stopping: 7/20 0.87974
stage 99/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87688  early_stopping: 8/20 0.87974
stage 100/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87280  early_stopping: 9/20 0.87974
stage 101/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:20 val_accuracy: 0.86349  early_stopping: 10/20 0.87974
stage 102/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87349  early_stopping: 11/20 0.87974
stage 103/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.87299  early_stopping: 12/20 0.87974
stage 104/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.87458  early_stopping: 13/20 0.87974
stage 105/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.87258  early_stopping: 14/20 0.87974
stage 106/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.87059  early_stopping: 15/20 0.87974
stage 107/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.87286  early_stopping: 16/20 0.87974
stage 108/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.87865  early_stopping: 17/20 0.87974
stage 109/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.87676  early_stopping: 18/20 0.87974
stage 110/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.87542  early_stopping: 19/20 0.87974
stage 111/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:18 val_accuracy: 0.87760  early_stopping: 20/20 0.87974
stage 112/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0/7366 -:--:-- 0:00:00  early_stopping: 20/20 0.87974
Trainer was signaled to stop but the required `min_epochs=200` or `min_steps=None` has not been met. Training will continue...
stage 112/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 20908/7366 0:00:00 0:08:02 val_accuracy: 0.87570  early_stopping: 20/20 0.87974
Validation  ━━━━━━━━━━━╺━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 231/826    0:00:38 0:00:17
Terminated

real    619m37.975s
user    10487m38.918s
sys     6080m15.838s

The training process had to be killed manually: early stopping had already triggered at stage 111 (20/20 epochs without improvement), but `--min-epochs 200` forced training to continue, and the validation step then ran again and again in an endless loop.
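The parameter counts in the model summary above follow directly from the VGSL network spec. A quick sanity check (assuming bias terms are included and that `Lbx256` layers are bidirectional, i.e. output 2×256 = 512 channels):

```python
def conv_params(in_ch, kh, kw, filters):
    """Weights plus biases of a 2D convolution layer."""
    return in_ch * kh * kw * filters + filters

# Cr4,2,8 on the 1-channel input image        -> net.C_0
assert conv_params(1, 4, 2, 8) == 72
# Cr4,2,32 on 8 channels                      -> net.C_1
assert conv_params(8, 4, 2, 32) == 2080       # ~2.1 K
# Cr3,3,64 on 32 channels                     -> net.C_3
assert conv_params(32, 3, 3, 64) == 18496     # ~18.5 K
# Cr255,1,85 on 512 channels (2x256 from Lbx) -> net.C_12
assert conv_params(512, 255, 1, 85) == 11097685  # ~11.1 M
```

All four values match the Lightning summary table, which confirms that the large `Cr255,1,85,1,1` output convolution accounts for almost three quarters of the 15.2 M total parameters.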

Training without augmentation

(venv3.9_20221126) stweil@ocr-01:~/src/gitlab/scripta/escriptorium/Juristische_Konsilien_Tuebingen/Transkribus_Exporte$ time nice ketos train -f page -t list.train -e list.eval -o Juristische_Konsilien_Tuebingen+256 -d cuda:0 --workers 24 -r 0.0001 -B 1 --lag 10 -w 0 -s '[256,64,0,1 Cr4,2,8,4,2 Cr4,2,32,1,1 Mp4,2,4,2 Cr3,3,64,1,1 Mp1,2,1,2 S1(1x0)1,3 Lbx256 Do0.5 Lbx256 Do0.5 Lbx256 Do0.5 Cr255,1,85,1,1]'
Torch version 1.14.0.dev20221125+cu117 has not been tested with coremltools. You may run into unexpected errors. Torch 1.12.1 is the most recent version that has been tested.
[11/26/22 19:55:29] WARNING  alphabet mismatch: chars in training set only: {'ꝟ', 'º', '=', '½', 'Ü', ']', '’', '‡', 'ꝸ', '╒', 'ꝯ', '♃', 'û', 'X', '[', '†'} (not included in accuracy test during training) train.py:307
GPU available: True (cuda), used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
`Trainer(val_check_interval=1.0)` was configured so validation will run at the end of the training epoch..
LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]
┏━━━━┳━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━┓
┃    ┃ Name      ┃ Type                     ┃ Params ┃
┡━━━━╇━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━┩
│ 0  │ net       │ MultiParamSequential     │ 15.2 M │
│ 1  │ net.C_0   │ ActConv2D                │     72 │
│ 2  │ net.C_1   │ ActConv2D                │  2.1 K │
│ 3  │ net.Mp_2  │ MaxPool                  │      0 │
│ 4  │ net.C_3   │ ActConv2D                │ 18.5 K │
│ 5  │ net.Mp_4  │ MaxPool                  │      0 │
│ 6  │ net.S_5   │ Reshape                  │      0 │
│ 7  │ net.L_6   │ TransposedSummarizingRNN │  921 K │
│ 8  │ net.Do_7  │ Dropout                  │      0 │
│ 9  │ net.L_8   │ TransposedSummarizingRNN │  1.6 M │
│ 10 │ net.Do_9  │ Dropout                  │      0 │
│ 11 │ net.L_10  │ TransposedSummarizingRNN │  1.6 M │
│ 12 │ net.Do_11 │ Dropout                  │      0 │
│ 13 │ net.C_12  │ ActConv2D                │ 11.1 M │
│ 14 │ net.O_13  │ LinSoftmax               │ 10.4 K │
└────┴───────────┴──────────────────────────┴────────┘
Trainable params: 15.2 M                                                                                                                                                                                              
Non-trainable params: 0                                                                                                                                                                                               
Total params: 15.2 M                                                                                                                                                                                                  
Total estimated model params size (MB): 60                                                                                                                                                                            
stage 0/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:10 val_accuracy: 0.00853  early_stopping: 0/10 0.00853
stage 1/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.04182  early_stopping: 0/10 0.04182
stage 2/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.04148  early_stopping: 1/10 0.04182
stage 3/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.04369  early_stopping: 0/10 0.04369
stage 4/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:10 val_accuracy: 0.24313  early_stopping: 0/10 0.24313
stage 5/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.28628  early_stopping: 0/10 0.28628
stage 6/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.34143  early_stopping: 0/10 0.34143
stage 7/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.38649  early_stopping: 0/10 0.38649
stage 8/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.41831  early_stopping: 0/10 0.41831
stage 9/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.47224  early_stopping: 0/10 0.47224
stage 10/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.50615  early_stopping: 0/10 0.50615
stage 11/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:10 val_accuracy: 0.56201  early_stopping: 0/10 0.56201
stage 12/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.59215  early_stopping: 0/10 0.59215
stage 13/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:10 val_accuracy: 0.62332  early_stopping: 0/10 0.62332
stage 14/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.64951  early_stopping: 0/10 0.64951
stage 15/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.67850  early_stopping: 0/10 0.67850
stage 16/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.69425  early_stopping: 0/10 0.69425
stage 17/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.71867  early_stopping: 0/10 0.71867
stage 18/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.73539  early_stopping: 0/10 0.73539
stage 19/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.74012  early_stopping: 0/10 0.74012
stage 20/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.75326  early_stopping: 0/10 0.75326
stage 21/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:08 val_accuracy: 0.76678  early_stopping: 0/10 0.76678
stage 22/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.78122  early_stopping: 0/10 0.78122
stage 23/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.78932  early_stopping: 0/10 0.78932
stage 24/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.79620  early_stopping: 0/10 0.79620
stage 25/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.80455  early_stopping: 0/10 0.80455
stage 26/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.81090  early_stopping: 0/10 0.81090
stage 27/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.81196  early_stopping: 0/10 0.81196
stage 28/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.81924  early_stopping: 0/10 0.81924
stage 29/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.82407  early_stopping: 0/10 0.82407
stage 30/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.82768  early_stopping: 0/10 0.82768
stage 31/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.83192  early_stopping: 0/10 0.83192
stage 32/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.83544  early_stopping: 0/10 0.83544
stage 33/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.83322  early_stopping: 1/10 0.83544
stage 34/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.83955  early_stopping: 0/10 0.83955
stage 35/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.84204  early_stopping: 0/10 0.84204
stage 36/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.84176  early_stopping: 1/10 0.84204
stage 37/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.84792  early_stopping: 0/10 0.84792
stage 38/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.84724  early_stopping: 1/10 0.84792
stage 39/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.85023  early_stopping: 0/10 0.85023
stage 40/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.85044  early_stopping: 0/10 0.85044
stage 41/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.85368  early_stopping: 0/10 0.85368
stage 42/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.85340  early_stopping: 1/10 0.85368
stage 43/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.85552  early_stopping: 0/10 0.85552
stage 44/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.85655  early_stopping: 0/10 0.85655
stage 45/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.85480  early_stopping: 1/10 0.85655
stage 46/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.85879  early_stopping: 0/10 0.85879
stage 47/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.85789  early_stopping: 1/10 0.85879
stage 48/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.85935  early_stopping: 0/10 0.85935
stage 49/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.85873  early_stopping: 1/10 0.85935
stage 50/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.85807  early_stopping: 2/10 0.85935
stage 51/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.86072  early_stopping: 0/10 0.86072
stage 52/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:17 val_accuracy: 0.86059  early_stopping: 1/10 0.86072
stage 53/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.85882  early_stopping: 2/10 0.86072
stage 54/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.86137  early_stopping: 0/10 0.86137
stage 55/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.86421  early_stopping: 0/10 0.86421
stage 56/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.86433  early_stopping: 0/10 0.86433
stage 57/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.86424  early_stopping: 1/10 0.86433
stage 58/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.86430  early_stopping: 2/10 0.86433
stage 59/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.86692  early_stopping: 0/10 0.86692
stage 60/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:10 val_accuracy: 0.86449  early_stopping: 1/10 0.86692
stage 61/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.86782  early_stopping: 0/10 0.86782
stage 62/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.86561  early_stopping: 1/10 0.86782
stage 63/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:16 val_accuracy: 0.86822  early_stopping: 0/10 0.86822
stage 64/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.86869  early_stopping: 0/10 0.86869
stage 65/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.86648  early_stopping: 1/10 0.86869
stage 66/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.86564  early_stopping: 2/10 0.86869
stage 67/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.86975  early_stopping: 0/10 0.86975
stage 68/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.86869  early_stopping: 1/10 0.86975
stage 69/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.86966  early_stopping: 2/10 0.86975
stage 70/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.86947  early_stopping: 3/10 0.86975
stage 71/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.87208  early_stopping: 0/10 0.87208
stage 72/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.87215  early_stopping: 0/10 0.87215
stage 73/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.87047  early_stopping: 1/10 0.87215
stage 74/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.87224  early_stopping: 0/10 0.87224
stage 75/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:15 val_accuracy: 0.87177  early_stopping: 1/10 0.87224
stage 76/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.86919  early_stopping: 2/10 0.87224
stage 77/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.86763  early_stopping: 3/10 0.87224
stage 78/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.87162  early_stopping: 4/10 0.87224
stage 79/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.87308  early_stopping: 0/10 0.87308
stage 80/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.87099  early_stopping: 1/10 0.87308
stage 81/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.86987  early_stopping: 2/10 0.87308
stage 82/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.87043  early_stopping: 3/10 0.87308
stage 83/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.87386  early_stopping: 0/10 0.87386
stage 84/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.87520  early_stopping: 0/10 0.87520
stage 85/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.87423  early_stopping: 1/10 0.87520
stage 86/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:12 val_accuracy: 0.87426  early_stopping: 2/10 0.87520
stage 87/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.87289  early_stopping: 3/10 0.87520
stage 88/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.87193  early_stopping: 4/10 0.87520
stage 89/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.87345  early_stopping: 5/10 0.87520
stage 90/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:11 val_accuracy: 0.87196  early_stopping: 6/10 0.87520
stage 91/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.87370  early_stopping: 7/10 0.87520
stage 92/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:14 val_accuracy: 0.87286  early_stopping: 8/10 0.87520
stage 93/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.87302  early_stopping: 9/10 0.87520
stage 94/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7366/7366 0:00:00 0:05:13 val_accuracy: 0.87314  early_stopping: 10/10 0.87520
Moving best model Juristische_Konsilien_Tuebingen+256_84.mlmodel (0.875198483467102) to Juristische_Konsilien_Tuebingen+256_best.mlmodel

real    500m7.467s
user    8686m2.071s
sys     4967m2.416s
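With ~95 epochs of progress lines above, picking out the best checkpoint by eye is error-prone. A minimal sketch of a helper that does it programmatically, assuming the console output was captured to a log file (`train.log` below is a hypothetical name):

```shell
# best_val: print the highest val_accuracy value found on stdin.
# Reads lines like "... val_accuracy: 0.87520  early_stopping: ..." and
# keeps the numeric maximum ("+ 0" forces awk to compare numerically).
best_val() {
  grep -o 'val_accuracy: [0-9.]*' \
    | awk '{ if ($2 + 0 > best + 0) best = $2 } END { print best }'
}
# Usage (assuming the training output was saved): best_val < train.log
```

For the run above this should report 0.87520, matching the epoch-84 model that ketos moved to `*_best.mlmodel`.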

ls -lt *.mlmodel
-rw-r--r-- 1 stweil stweil 60806268 Nov 27 04:14 Juristische_Konsilien_Tuebingen+256_best.mlmodel
-rw-r--r-- 1 stweil stweil 60807382 Nov 27 04:14 Juristische_Konsilien_Tuebingen+256_94.mlmodel
-rw-r--r-- 1 stweil stweil 60807270 Nov 27 04:09 Juristische_Konsilien_Tuebingen+256_93.mlmodel
-rw-r--r-- 1 stweil stweil 60807158 Nov 27 04:04 Juristische_Konsilien_Tuebingen+256_92.mlmodel
-rw-r--r-- 1 stweil stweil 60807046 Nov 27 03:59 Juristische_Konsilien_Tuebingen+256_91.mlmodel
-rw-r--r-- 1 stweil stweil 60806934 Nov 27 03:53 Juristische_Konsilien_Tuebingen+256_90.mlmodel
-rw-r--r-- 1 stweil stweil 60806822 Nov 27 03:48 Juristische_Konsilien_Tuebingen+256_89.mlmodel
-rw-r--r-- 1 stweil stweil 60806713 Nov 27 03:43 Juristische_Konsilien_Tuebingen+256_88.mlmodel
-rw-r--r-- 1 stweil stweil 60806601 Nov 27 03:38 Juristische_Konsilien_Tuebingen+256_87.mlmodel
-rw-r--r-- 1 stweil stweil 60806492 Nov 27 03:32 Juristische_Konsilien_Tuebingen+256_86.mlmodel
-rw-r--r-- 1 stweil stweil 60806380 Nov 27 03:27 Juristische_Konsilien_Tuebingen+256_85.mlmodel
-rw-r--r-- 1 stweil stweil 60806268 Nov 27 03:22 Juristische_Konsilien_Tuebingen+256_84.mlmodel
-rw-r--r-- 1 stweil stweil 60806159 Nov 27 03:17 Juristische_Konsilien_Tuebingen+256_83.mlmodel
-rw-r--r-- 1 stweil stweil 60806047 Nov 27 03:11 Juristische_Konsilien_Tuebingen+256_82.mlmodel
-rw-r--r-- 1 stweil stweil 60805935 Nov 27 03:06 Juristische_Konsilien_Tuebingen+256_81.mlmodel
-rw-r--r-- 1 stweil stweil 60805823 Nov 27 03:01 Juristische_Konsilien_Tuebingen+256_80.mlmodel
-rw-r--r-- 1 stweil stweil 60805711 Nov 27 02:56 Juristische_Konsilien_Tuebingen+256_79.mlmodel
-rw-r--r-- 1 stweil stweil 60805599 Nov 27 02:50 Juristische_Konsilien_Tuebingen+256_78.mlmodel
-rw-r--r-- 1 stweil stweil 60805487 Nov 27 02:45 Juristische_Konsilien_Tuebingen+256_77.mlmodel
-rw-r--r-- 1 stweil stweil 60805375 Nov 27 02:40 Juristische_Konsilien_Tuebingen+256_76.mlmodel
-rw-r--r-- 1 stweil stweil 60805263 Nov 27 02:34 Juristische_Konsilien_Tuebingen+256_75.mlmodel
-rw-r--r-- 1 stweil stweil 60805151 Nov 27 02:29 Juristische_Konsilien_Tuebingen+256_74.mlmodel
[...]
-rw-r--r-- 1 stweil stweil 60798244 Nov 26 21:03 Juristische_Konsilien_Tuebingen+256_12.mlmodel
-rw-r--r-- 1 stweil stweil 60798134 Nov 26 20:58 Juristische_Konsilien_Tuebingen+256_11.mlmodel
-rw-r--r-- 1 stweil stweil 60798024 Nov 26 20:53 Juristische_Konsilien_Tuebingen+256_10.mlmodel
-rw-r--r-- 1 stweil stweil 60797914 Nov 26 20:48 Juristische_Konsilien_Tuebingen+256_9.mlmodel
-rw-r--r-- 1 stweil stweil 60797803 Nov 26 20:42 Juristische_Konsilien_Tuebingen+256_8.mlmodel
-rw-r--r-- 1 stweil stweil 60797690 Nov 26 20:37 Juristische_Konsilien_Tuebingen+256_7.mlmodel
-rw-r--r-- 1 stweil stweil 60797580 Nov 26 20:32 Juristische_Konsilien_Tuebingen+256_6.mlmodel
-rw-r--r-- 1 stweil stweil 60797470 Nov 26 20:27 Juristische_Konsilien_Tuebingen+256_5.mlmodel
-rw-r--r-- 1 stweil stweil 60797360 Nov 26 20:21 Juristische_Konsilien_Tuebingen+256_4.mlmodel
-rw-r--r-- 1 stweil stweil 60797250 Nov 26 20:16 Juristische_Konsilien_Tuebingen+256_3.mlmodel
-rw-r--r-- 1 stweil stweil 60797134 Nov 26 20:11 Juristische_Konsilien_Tuebingen+256_2.mlmodel
-rw-r--r-- 1 stweil stweil 60797018 Nov 26 20:06 Juristische_Konsilien_Tuebingen+256_1.mlmodel
-rw-r--r-- 1 stweil stweil 60796903 Nov 26 20:00 Juristische_Konsilien_Tuebingen+256_0.mlmodel
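Each snapshot above is about 60 MB, so a full run leaves several gigabytes of intermediate `.mlmodel` files behind. A hypothetical cleanup helper that deletes the numbered snapshots while keeping the selected `*_best.mlmodel` (verify the best model first before running anything like this):

```shell
# prune_snapshots PREFIX: remove PREFIX_<n>.mlmodel snapshots,
# keeping only PREFIX_best.mlmodel.
prune_snapshots() {
  for f in "$1"_*.mlmodel; do
    [ -e "$f" ] || continue        # glob did not match: nothing to do
    case "$f" in
      *_best.mlmodel) ;;           # keep the selected model
      *) rm -- "$f" ;;             # remove numbered snapshot
    esac
  done
}
# Usage: prune_snapshots Juristische_Konsilien_Tuebingen+256
```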

2022-11-27

Pretrain

(venv3.9_20221126) stweil@ocr-01:~/src/gitlab/scripta/escriptorium/Juristische_Konsilien_Tuebingen/Transkribus_Exporte$ time nice ketos pretrain -f page -t list.train -e list.eval -o pretrain -d cuda:0 --workers 24
Torch version 1.14.0.dev20221125+cu117 has not been tested with coremltools. You may run into unexpected errors. Torch 1.12.1 is the most recent version that has been tested.

GPU available: True (cuda), used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
`Trainer(val_check_interval=1.0)` was configured so validation will run at the end of the training epoch..
LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]
Adjusting learning rate of group 0 to 1.0000e-06.
┏━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━┓
┃    ┃ Name                   ┃ Type                     ┃ Params ┃
┡━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━┩
│ 0  │ net                    │ MultiParamSequential     │  4.0 M │
│ 1  │ net.C_0                │ ActConv2D                │  1.3 K │
│ 2  │ net.Do_1               │ Dropout                  │      0 │
│ 3  │ net.Mp_2               │ MaxPool                  │      0 │
│ 4  │ net.C_3                │ ActConv2D                │ 40.0 K │
│ 5  │ net.Do_4               │ Dropout                  │      0 │
│ 6  │ net.Mp_5               │ MaxPool                  │      0 │
│ 7  │ net.C_6                │ ActConv2D                │ 55.4 K │
│ 8  │ net.Do_7               │ Dropout                  │      0 │
│ 9  │ net.Mp_8               │ MaxPool                  │      0 │
│ 10 │ net.C_9                │ ActConv2D                │  110 K │
│ 11 │ net.Do_10              │ Dropout                  │      0 │
│ 12 │ net.S_11               │ Reshape                  │      0 │
│ 13 │ net.L_12               │ TransposedSummarizingRNN │  1.9 M │
│ 14 │ net.Do_13              │ Dropout                  │      0 │
│ 15 │ net.L_14               │ TransposedSummarizingRNN │  963 K │
│ 16 │ net.Do_15              │ Dropout                  │      0 │
│ 17 │ net.L_16               │ TransposedSummarizingRNN │  963 K │
│ 18 │ net.Do_17              │ Dropout                  │      0 │
│ 19 │ features               │ MultiParamSequential     │  207 K │
│ 20 │ wav2vec2mask           │ Wav2Vec2Mask             │  388 K │
│ 21 │ wav2vec2mask.mask_emb  │ Embedding                │  3.8 K │
│ 22 │ wav2vec2mask.project_q │ Linear                   │  384 K │
│ 23 │ encoder                │ MultiParamSequential     │  3.8 M │
└────┴────────────────────────┴──────────────────────────┴────────┘
Trainable params: 4.4 M                                                                                                                                                                                               
Non-trainable params: 0                                                                                                                                                                                               
Total params: 4.4 M                                                                                                                                                                                                   
Total estimated model params size (MB): 17                                                                                                                                                                            
Adjusting learning rate of group 0 to 1.0000e-06.
stage 0/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:19 loss: 1.94e+03  early_stopping: 0/5 2036.78601
Adjusting learning rate of group 0 to 1.0000e-06.
stage 1/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:22 loss: 1.86e+03  early_stopping: 0/5 2030.18481
Adjusting learning rate of group 0 to 1.0000e-06.
stage 2/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:20 loss: 1.79e+03  early_stopping: 0/5 2016.71643
Adjusting learning rate of group 0 to 1.0000e-06.
stage 3/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 1.7e+03  early_stopping: 0/5 2008.26001
Adjusting learning rate of group 0 to 1.0000e-06.
stage 4/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 1.62e+03  early_stopping: 0/5 1993.02002
Adjusting learning rate of group 0 to 1.0000e-06.
stage 5/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 1.52e+03  early_stopping: 0/5 1979.84839
Adjusting learning rate of group 0 to 1.0000e-06.
stage 6/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:20 loss: 1.43e+03  early_stopping: 0/5 1971.93323
Adjusting learning rate of group 0 to 1.0000e-06.
stage 7/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 1.32e+03  early_stopping: 0/5 1956.12549
Adjusting learning rate of group 0 to 1.0000e-06.
stage 8/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:20 loss: 1.23e+03  early_stopping: 0/5 1952.79724
Adjusting learning rate of group 0 to 1.0000e-06.
stage 9/∞  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 1.14e+03  early_stopping: 0/5 1943.24829
Adjusting learning rate of group 0 to 1.0000e-06.
stage 10/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 1.05e+03  early_stopping: 0/5 1934.16309
Adjusting learning rate of group 0 to 1.0000e-06.
stage 11/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:20 loss: 975  early_stopping: 0/5 1925.65979
Adjusting learning rate of group 0 to 1.0000e-06.
stage 12/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:22 loss: 919  early_stopping: 0/5 1916.17700
Adjusting learning rate of group 0 to 1.0000e-06.
stage 13/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 865  early_stopping: 0/5 1911.95837
Adjusting learning rate of group 0 to 1.0000e-06.
stage 14/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 796  early_stopping: 0/5 1900.35925
Adjusting learning rate of group 0 to 1.0000e-06.
stage 15/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 745  early_stopping: 1/5 1900.35925
Adjusting learning rate of group 0 to 1.0000e-06.
stage 16/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:20 loss: 705  early_stopping: 0/5 1895.72766
Adjusting learning rate of group 0 to 1.0000e-06.
stage 17/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:22 loss: 683  early_stopping: 0/5 1891.54480
Adjusting learning rate of group 0 to 1.0000e-06.
stage 18/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 636  early_stopping: 1/5 1891.54480
Adjusting learning rate of group 0 to 1.0000e-06.
stage 19/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 607  early_stopping: 2/5 1891.54480
Adjusting learning rate of group 0 to 1.0000e-06.
stage 20/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:22 loss: 577  early_stopping: 3/5 1891.54480
Adjusting learning rate of group 0 to 1.0000e-06.
stage 21/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 548  early_stopping: 0/5 1889.50232
Adjusting learning rate of group 0 to 1.0000e-06.
stage 22/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 524  early_stopping: 0/5 1880.25671
Adjusting learning rate of group 0 to 1.0000e-06.
stage 23/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:20 loss: 494  early_stopping: 1/5 1880.25671
Adjusting learning rate of group 0 to 1.0000e-06.
stage 24/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 480  early_stopping: 2/5 1880.25671
Adjusting learning rate of group 0 to 1.0000e-06.
stage 25/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 469  early_stopping: 3/5 1880.25671
Adjusting learning rate of group 0 to 1.0000e-06.
stage 26/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 444  early_stopping: 0/5 1870.65674
Adjusting learning rate of group 0 to 1.0000e-06.
stage 27/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 442  early_stopping: 0/5 1850.01819
Adjusting learning rate of group 0 to 1.0000e-06.
stage 28/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 425  early_stopping: 1/5 1850.01819
Adjusting learning rate of group 0 to 1.0000e-06.
stage 29/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 410  early_stopping: 0/5 1849.98108
Adjusting learning rate of group 0 to 1.0000e-06.
stage 30/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:22 loss: 399  early_stopping: 1/5 1849.98108
Adjusting learning rate of group 0 to 1.0000e-06.
stage 31/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:22 loss: 394  early_stopping: 2/5 1849.98108
Adjusting learning rate of group 0 to 1.0000e-06.
stage 32/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 379  early_stopping: 3/5 1849.98108
Adjusting learning rate of group 0 to 1.0000e-06.
stage 33/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:19 loss: 359  early_stopping: 4/5 1849.98108
Adjusting learning rate of group 0 to 1.0000e-06.
stage 34/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:22 loss: 365  early_stopping: 0/5 1843.54395
Adjusting learning rate of group 0 to 1.0000e-06.
stage 35/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:20 loss: 352  early_stopping: 1/5 1843.54395
Adjusting learning rate of group 0 to 1.0000e-06.
stage 36/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 342  early_stopping: 2/5 1843.54395
Adjusting learning rate of group 0 to 1.0000e-06.
stage 37/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:22 loss: 328  early_stopping: 3/5 1843.54395
Adjusting learning rate of group 0 to 1.0000e-06.
stage 38/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:21 loss: 325  early_stopping: 4/5 1843.54395
Adjusting learning rate of group 0 to 1.0000e-06.
stage 39/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 116/116 0:00:00 0:04:20 loss: 320  early_stopping: 5/5 1843.54395
stage 40/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0/116 -:--:-- 0:00:00  early_stopping: 5/5 1843.54395
Trainer was signaled to stop but the required `min_epochs=100` or `min_steps=None` has not been met. Training will continue...
Adjusting learning rate of group 0 to 1.0000e-06.
stage 40/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:04 loss: 306  early_stopping: 0/5 1837.04565
Adjusting learning rate of group 0 to 1.0000e-06.
stage 41/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 301  early_stopping: 0/5 1836.88599
Adjusting learning rate of group 0 to 1.0000e-06.
stage 42/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 299  early_stopping: 1/5 1836.88599
Adjusting learning rate of group 0 to 1.0000e-06.
stage 43/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 292  early_stopping: 2/5 1836.88599
Adjusting learning rate of group 0 to 1.0000e-06.
stage 44/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:03 loss: 279  early_stopping: 3/5 1836.88599
Adjusting learning rate of group 0 to 1.0000e-06.
stage 45/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:04 loss: 279  early_stopping: 0/5 1834.09973
Adjusting learning rate of group 0 to 1.0000e-06.
stage 46/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:04 loss: 272  early_stopping: 0/5 1828.76099
Adjusting learning rate of group 0 to 1.0000e-06.
stage 47/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:04 loss: 266  early_stopping: 0/5 1824.72766
Adjusting learning rate of group 0 to 1.0000e-06.
stage 48/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:14 loss: 255  early_stopping: 1/5 1824.72766
Adjusting learning rate of group 0 to 1.0000e-06.
stage 49/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:03 loss: 255  early_stopping: 2/5 1824.72766
Adjusting learning rate of group 0 to 1.0000e-06.
stage 50/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:03 loss: 256  early_stopping: 3/5 1824.72766
Adjusting learning rate of group 0 to 1.0000e-06.
stage 51/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:02 loss: 250  early_stopping: 4/5 1824.72766
Adjusting learning rate of group 0 to 1.0000e-06.
stage 52/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:04 loss: 241  early_stopping: 0/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 53/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 236  early_stopping: 1/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 54/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 236  early_stopping: 2/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 55/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:03 loss: 229  early_stopping: 3/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 56/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 223  early_stopping: 4/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 57/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 225  early_stopping: 5/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 58/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 218  early_stopping: 6/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 59/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 216  early_stopping: 7/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 60/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 210  early_stopping: 8/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 61/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 208  early_stopping: 9/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 62/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 206  early_stopping: 10/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 63/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:09 loss: 202  early_stopping: 11/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 64/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 198  early_stopping: 12/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 65/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:08 loss: 205  early_stopping: 13/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 66/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 193  early_stopping: 14/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 67/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 192  early_stopping: 15/5 1821.01440
Adjusting learning rate of group 0 to 1.0000e-06.
stage 68/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 197  early_stopping: 0/5 1819.53687
Adjusting learning rate of group 0 to 1.0000e-06.
stage 69/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 188  early_stopping: 1/5 1819.53687
Adjusting learning rate of group 0 to 1.0000e-06.
stage 70/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:04 loss: 183  early_stopping: 2/5 1819.53687
Adjusting learning rate of group 0 to 1.0000e-06.
stage 71/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 186  early_stopping: 3/5 1819.53687
Adjusting learning rate of group 0 to 1.0000e-06.
stage 72/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 182  early_stopping: 0/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 73/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:10 loss: 177  early_stopping: 1/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 74/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 178  early_stopping: 2/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 75/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:08 loss: 168  early_stopping: 3/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 76/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 166  early_stopping: 4/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 77/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:09 loss: 172  early_stopping: 5/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 78/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 163  early_stopping: 6/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 79/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 164  early_stopping: 7/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 80/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:08 loss: 164  early_stopping: 8/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 81/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:09 loss: 161  early_stopping: 9/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 82/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 158  early_stopping: 10/5 1809.54614
Adjusting learning rate of group 0 to 1.0000e-06.
stage 83/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 161  early_stopping: 0/5 1809.19214
Adjusting learning rate of group 0 to 1.0000e-06.
stage 84/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 153  early_stopping: 1/5 1809.19214
Adjusting learning rate of group 0 to 1.0000e-06.
stage 85/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 153  early_stopping: 0/5 1801.30469
Adjusting learning rate of group 0 to 1.0000e-06.
stage 86/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 156  early_stopping: 1/5 1801.30469
Adjusting learning rate of group 0 to 1.0000e-06.
stage 87/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 151  early_stopping: 2/5 1801.30469
Adjusting learning rate of group 0 to 1.0000e-06.
stage 88/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:08 loss: 154  early_stopping: 3/5 1801.30469
Adjusting learning rate of group 0 to 1.0000e-06.
stage 89/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 152  early_stopping: 0/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 90/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 150  early_stopping: 1/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 91/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 147  early_stopping: 2/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 92/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 145  early_stopping: 3/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 93/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 141  early_stopping: 4/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 94/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:08 loss: 148  early_stopping: 5/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 95/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:07 loss: 143  early_stopping: 6/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 96/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:08 loss: 144  early_stopping: 7/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 97/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:10 loss: 144  early_stopping: 8/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 98/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:06 loss: 138  early_stopping: 9/5 1790.89856
Adjusting learning rate of group 0 to 1.0000e-06.
stage 99/∞ ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1442/116 0:00:00 0:09:05 loss: 135  early_stopping: 0/5 1787.32471
Moving best model pretrain_0.mlmodel (0.0) to pretrain_best.mlmodel

real    5874m38.127s
user    65817m47.515s
sys     44613m53.079s
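A quick sanity check on the wall-clock time: 5874m38s for 100 pretraining epochs is roughly 59 minutes per epoch on average. The average hides that epochs from stage 40 onward process 1442 batches instead of 116 (visible in the progress lines and the checkpoint timestamps), so the later epochs dominate the total:

```shell
# 5874 min 38 s of wall-clock time spread over 100 epochs (stages 0-99).
seconds_per_epoch=$(( (5874 * 60 + 38) / 100 ))
echo "$seconds_per_epoch"   # 3524 s, i.e. about 59 min per epoch on average
```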

(venv3.9_20221126) stweil@ocr-01:~/src/gitlab/scripta/escriptorium/Juristische_Konsilien_Tuebingen/Transkribus_Exporte$ ls -lt pretrain_*
-rw-r--r-- 1 stweil stweil 17511996 Dec  1 09:59 pretrain_best.mlmodel
-rw-r--r-- 1 stweil stweil 17817375 Dec  1 09:59 pretrain_99.mlmodel
-rw-r--r-- 1 stweil stweil 17812116 Dec  1 08:23 pretrain_98.mlmodel
-rw-r--r-- 1 stweil stweil 17806866 Dec  1 06:48 pretrain_97.mlmodel
-rw-r--r-- 1 stweil stweil 17801607 Dec  1 05:13 pretrain_96.mlmodel
-rw-r--r-- 1 stweil stweil 17796546 Dec  1 03:37 pretrain_95.mlmodel
-rw-r--r-- 1 stweil stweil 17791506 Dec  1 02:02 pretrain_94.mlmodel
-rw-r--r-- 1 stweil stweil 17786463 Dec  1 00:27 pretrain_93.mlmodel
-rw-r--r-- 1 stweil stweil 17781420 Nov 30 22:52 pretrain_92.mlmodel
-rw-r--r-- 1 stweil stweil 17776376 Nov 30 21:17 pretrain_91.mlmodel
-rw-r--r-- 1 stweil stweil 17771320 Nov 30 19:42 pretrain_90.mlmodel
-rw-r--r-- 1 stweil stweil 17766284 Nov 30 18:07 pretrain_89.mlmodel
-rw-r--r-- 1 stweil stweil 17761246 Nov 30 16:32 pretrain_88.mlmodel
-rw-r--r-- 1 stweil stweil 17756190 Nov 30 14:57 pretrain_87.mlmodel
-rw-r--r-- 1 stweil stweil 17751176 Nov 30 13:22 pretrain_86.mlmodel
-rw-r--r-- 1 stweil stweil 17746142 Nov 30 11:47 pretrain_85.mlmodel
-rw-r--r-- 1 stweil stweil 17741095 Nov 30 10:12 pretrain_84.mlmodel
-rw-r--r-- 1 stweil stweil 17736040 Nov 30 08:36 pretrain_83.mlmodel
-rw-r--r-- 1 stweil stweil 17730995 Nov 30 07:01 pretrain_82.mlmodel
-rw-r--r-- 1 stweil stweil 17725925 Nov 30 05:26 pretrain_81.mlmodel
-rw-r--r-- 1 stweil stweil 17720860 Nov 30 03:51 pretrain_80.mlmodel
-rw-r--r-- 1 stweil stweil 17715847 Nov 30 02:16 pretrain_79.mlmodel
-rw-r--r-- 1 stweil stweil 17710802 Nov 30 00:41 pretrain_78.mlmodel
-rw-r--r-- 1 stweil stweil 17705731 Nov 29 23:06 pretrain_77.mlmodel
-rw-r--r-- 1 stweil stweil 17700672 Nov 29 21:31 pretrain_76.mlmodel
-rw-r--r-- 1 stweil stweil 17695646 Nov 29 19:56 pretrain_75.mlmodel
-rw-r--r-- 1 stweil stweil 17690595 Nov 29 18:20 pretrain_74.mlmodel
-rw-r--r-- 1 stweil stweil 17685521 Nov 29 16:45 pretrain_73.mlmodel
-rw-r--r-- 1 stweil stweil 17680460 Nov 29 15:10 pretrain_72.mlmodel
-rw-r--r-- 1 stweil stweil 17675403 Nov 29 13:35 pretrain_71.mlmodel
-rw-r--r-- 1 stweil stweil 17670345 Nov 29 12:00 pretrain_70.mlmodel
-rw-r--r-- 1 stweil stweil 17665302 Nov 29 10:25 pretrain_69.mlmodel
-rw-r--r-- 1 stweil stweil 17660278 Nov 29 08:51 pretrain_68.mlmodel
-rw-r--r-- 1 stweil stweil 17655222 Nov 29 07:15 pretrain_67.mlmodel
-rw-r--r-- 1 stweil stweil 17650169 Nov 29 05:40 pretrain_66.mlmodel
-rw-r--r-- 1 stweil stweil 17645115 Nov 29 04:06 pretrain_65.mlmodel
-rw-r--r-- 1 stweil stweil 17640097 Nov 29 02:31 pretrain_64.mlmodel
-rw-r--r-- 1 stweil stweil 17635043 Nov 29 00:56 pretrain_63.mlmodel
-rw-r--r-- 1 stweil stweil 17629991 Nov 28 23:21 pretrain_62.mlmodel
-rw-r--r-- 1 stweil stweil 17624930 Nov 28 21:46 pretrain_61.mlmodel
-rw-r--r-- 1 stweil stweil 17619873 Nov 28 20:11 pretrain_60.mlmodel
-rw-r--r-- 1 stweil stweil 17614826 Nov 28 18:35 pretrain_59.mlmodel
-rw-r--r-- 1 stweil stweil 17609783 Nov 28 17:00 pretrain_58.mlmodel
-rw-r--r-- 1 stweil stweil 17604733 Nov 28 15:25 pretrain_57.mlmodel
-rw-r--r-- 1 stweil stweil 17599661 Nov 28 13:50 pretrain_56.mlmodel
-rw-r--r-- 1 stweil stweil 17594602 Nov 28 12:15 pretrain_55.mlmodel
-rw-r--r-- 1 stweil stweil 17589549 Nov 28 10:40 pretrain_54.mlmodel
-rw-r--r-- 1 stweil stweil 17584507 Nov 28 09:05 pretrain_53.mlmodel
-rw-r--r-- 1 stweil stweil 17579467 Nov 28 07:31 pretrain_52.mlmodel
-rw-r--r-- 1 stweil stweil 17574479 Nov 28 05:56 pretrain_51.mlmodel
-rw-r--r-- 1 stweil stweil 17569448 Nov 28 04:21 pretrain_50.mlmodel
-rw-r--r-- 1 stweil stweil 17564370 Nov 28 02:47 pretrain_49.mlmodel
-rw-r--r-- 1 stweil stweil 17559305 Nov 28 01:12 pretrain_48.mlmodel
-rw-r--r-- 1 stweil stweil 17554260 Nov 27 23:37 pretrain_47.mlmodel
-rw-r--r-- 1 stweil stweil 17549216 Nov 27 22:03 pretrain_46.mlmodel
-rw-r--r-- 1 stweil stweil 17544162 Nov 27 20:28 pretrain_45.mlmodel
-rw-r--r-- 1 stweil stweil 17539082 Nov 27 18:54 pretrain_44.mlmodel
-rw-r--r-- 1 stweil stweil 17534069 Nov 27 17:19 pretrain_43.mlmodel
-rw-r--r-- 1 stweil stweil 17529009 Nov 27 15:44 pretrain_42.mlmodel
-rw-r--r-- 1 stweil stweil 17523973 Nov 27 14:10 pretrain_41.mlmodel
-rw-r--r-- 1 stweil stweil 17518940 Nov 27 12:35 pretrain_40.mlmodel
-rw-r--r-- 1 stweil stweil 17513890 Nov 27 11:00 pretrain_39.mlmodel
-rw-r--r-- 1 stweil stweil 17513842 Nov 27 10:56 pretrain_38.mlmodel
-rw-r--r-- 1 stweil stweil 17513794 Nov 27 10:52 pretrain_37.mlmodel
-rw-r--r-- 1 stweil stweil 17513744 Nov 27 10:47 pretrain_36.mlmodel
-rw-r--r-- 1 stweil stweil 17513694 Nov 27 10:43 pretrain_35.mlmodel
-rw-r--r-- 1 stweil stweil 17513647 Nov 27 10:39 pretrain_34.mlmodel
-rw-r--r-- 1 stweil stweil 17513600 Nov 27 10:34 pretrain_33.mlmodel
-rw-r--r-- 1 stweil stweil 17513551 Nov 27 10:30 pretrain_32.mlmodel
-rw-r--r-- 1 stweil stweil 17513501 Nov 27 10:26 pretrain_31.mlmodel
-rw-r--r-- 1 stweil stweil 17513452 Nov 27 10:21 pretrain_30.mlmodel
-rw-r--r-- 1 stweil stweil 17513403 Nov 27 10:17 pretrain_29.mlmodel
-rw-r--r-- 1 stweil stweil 17513353 Nov 27 10:12 pretrain_28.mlmodel
-rw-r--r-- 1 stweil stweil 17513308 Nov 27 10:08 pretrain_27.mlmodel
-rw-r--r-- 1 stweil stweil 17513258 Nov 27 10:04 pretrain_26.mlmodel
-rw-r--r-- 1 stweil stweil 17513210 Nov 27 09:59 pretrain_25.mlmodel
-rw-r--r-- 1 stweil stweil 17513160 Nov 27 09:55 pretrain_24.mlmodel
-rw-r--r-- 1 stweil stweil 17513110 Nov 27 09:51 pretrain_23.mlmodel
-rw-r--r-- 1 stweil stweil 17513060 Nov 27 09:46 pretrain_22.mlmodel
-rw-r--r-- 1 stweil stweil 17513010 Nov 27 09:42 pretrain_21.mlmodel
-rw-r--r-- 1 stweil stweil 17512960 Nov 27 09:38 pretrain_20.mlmodel
-rw-r--r-- 1 stweil stweil 17512914 Nov 27 09:33 pretrain_19.mlmodel
-rw-r--r-- 1 stweil stweil 17512865 Nov 27 09:29 pretrain_18.mlmodel
-rw-r--r-- 1 stweil stweil 17512816 Nov 27 09:24 pretrain_17.mlmodel
-rw-r--r-- 1 stweil stweil 17512766 Nov 27 09:20 pretrain_16.mlmodel
-rw-r--r-- 1 stweil stweil 17512716 Nov 27 09:16 pretrain_15.mlmodel
-rw-r--r-- 1 stweil stweil 17512668 Nov 27 09:11 pretrain_14.mlmodel
-rw-r--r-- 1 stweil stweil 17512618 Nov 27 09:07 pretrain_13.mlmodel
-rw-r--r-- 1 stweil stweil 17512568 Nov 27 09:03 pretrain_12.mlmodel
-rw-r--r-- 1 stweil stweil 17512519 Nov 27 08:58 pretrain_11.mlmodel
-rw-r--r-- 1 stweil stweil 17512469 Nov 27 08:54 pretrain_10.mlmodel
-rw-r--r-- 1 stweil stweil 17512422 Nov 27 08:49 pretrain_9.mlmodel
-rw-r--r-- 1 stweil stweil 17512372 Nov 27 08:45 pretrain_8.mlmodel
-rw-r--r-- 1 stweil stweil 17512324 Nov 27 08:41 pretrain_7.mlmodel
-rw-r--r-- 1 stweil stweil 17512278 Nov 27 08:36 pretrain_6.mlmodel
-rw-r--r-- 1 stweil stweil 17512230 Nov 27 08:32 pretrain_5.mlmodel
-rw-r--r-- 1 stweil stweil 17512183 Nov 27 08:28 pretrain_4.mlmodel
-rw-r--r-- 1 stweil stweil 17512137 Nov 27 08:23 pretrain_3.mlmodel
-rw-r--r-- 1 stweil stweil 17512090 Nov 27 08:19 pretrain_2.mlmodel
-rw-r--r-- 1 stweil stweil 17512042 Nov 27 08:15 pretrain_1.mlmodel
-rw-r--r-- 1 stweil stweil 17511996 Nov 27 08:10 pretrain_0.mlmodel