OPUS-MT Dashboard

newstest2018.fin-eng.log
Translating sentences from newstest2018.fi to eng

 - number of sentences to translate:
3000
 - size of the input to translate (subword tokenized):
   3000   67582  487796
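   (The three values follow the usual wc ordering, so they most likely report 3,000 lines (sentences), 67,582 subword tokens, and 487,796 characters in the tokenized input; the log does not label these columns.)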
 - temporary log-files: /scratch/project_2002688/tmp.xyMXLLxm6p[.gpu]
 - energy-consumption counter (start): 
     energy counter GPU 0: 691279724793 mJ
[2023-07-03 14:40:15] [marian] Marian v1.12.0 65bf82ff 2023-02-21 09:56:29 -0800
[2023-07-03 14:40:15] [marian] Running on g4101.mahti.csc.fi as process 1579372 with command line:
[2023-07-03 14:40:15] [marian] /users/tiedeman/.local/bin/marian-decoder -i /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/newstest2018.fin-eng.input -c /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/work/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/model/decoder.yml -b 4 -n1 -d 0 --quiet-translation -w 10000 --mini-batch 256 --maxi-batch 512 --maxi-batch-sort src --max-length 500 --max-length-crop
[2023-07-03 14:40:15] [config] alignment: ""
[2023-07-03 14:40:15] [config] allow-special: false
[2023-07-03 14:40:15] [config] allow-unk: false
[2023-07-03 14:40:15] [config] authors: false
[2023-07-03 14:40:15] [config] beam-size: 4
[2023-07-03 14:40:15] [config] bert-class-symbol: "[CLS]"
[2023-07-03 14:40:15] [config] bert-mask-symbol: "[MASK]"
[2023-07-03 14:40:15] [config] bert-masking-fraction: 0.15
[2023-07-03 14:40:15] [config] bert-sep-symbol: "[SEP]"
[2023-07-03 14:40:15] [config] bert-train-type-embeddings: true
[2023-07-03 14:40:15] [config] bert-type-vocab-size: 2
[2023-07-03 14:40:15] [config] best-deep: false
[2023-07-03 14:40:15] [config] build-info: ""
[2023-07-03 14:40:15] [config] check-nan: false
[2023-07-03 14:40:15] [config] cite: false
[2023-07-03 14:40:15] [config] cpu-threads: 0
[2023-07-03 14:40:15] [config] data-threads: 8
[2023-07-03 14:40:15] [config] dec-cell: ssru
[2023-07-03 14:40:15] [config] dec-cell-base-depth: 2
[2023-07-03 14:40:15] [config] dec-cell-high-depth: 1
[2023-07-03 14:40:15] [config] dec-depth: 2
[2023-07-03 14:40:15] [config] devices:
[2023-07-03 14:40:15] [config]   - 0
[2023-07-03 14:40:15] [config] dim-emb: 256
[2023-07-03 14:40:15] [config] dim-rnn: 1024
[2023-07-03 14:40:15] [config] dim-vocabs:
[2023-07-03 14:40:15] [config]   - 32000
[2023-07-03 14:40:15] [config]   - 32000
[2023-07-03 14:40:15] [config] dump-config: ""
[2023-07-03 14:40:15] [config] enc-cell: gru
[2023-07-03 14:40:15] [config] enc-cell-depth: 1
[2023-07-03 14:40:15] [config] enc-depth: 6
[2023-07-03 14:40:15] [config] enc-type: bidirectional
[2023-07-03 14:40:15] [config] factors-combine: sum
[2023-07-03 14:40:15] [config] factors-dim-emb: 0
[2023-07-03 14:40:15] [config] force-decode: false
[2023-07-03 14:40:15] [config] gemm-type: float32
[2023-07-03 14:40:15] [config] ignore-model-config: false
[2023-07-03 14:40:15] [config] input:
[2023-07-03 14:40:15] [config]   - /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/newstest2018.fin-eng.input
[2023-07-03 14:40:15] [config] input-types:
[2023-07-03 14:40:15] [config]   []
[2023-07-03 14:40:15] [config] interpolate-env-vars: false
[2023-07-03 14:40:15] [config] layer-normalization: false
[2023-07-03 14:40:15] [config] lemma-dependency: ""
[2023-07-03 14:40:15] [config] lemma-dim-emb: 0
[2023-07-03 14:40:15] [config] log: ""
[2023-07-03 14:40:15] [config] log-level: info
[2023-07-03 14:40:15] [config] log-time-zone: ""
[2023-07-03 14:40:15] [config] max-length: 500
[2023-07-03 14:40:15] [config] max-length-crop: true
[2023-07-03 14:40:15] [config] max-length-factor: 3
[2023-07-03 14:40:15] [config] maxi-batch: 512
[2023-07-03 14:40:15] [config] maxi-batch-sort: src
[2023-07-03 14:40:15] [config] mini-batch: 256
[2023-07-03 14:40:15] [config] mini-batch-words: 0
[2023-07-03 14:40:15] [config] model-mmap: false
[2023-07-03 14:40:15] [config] models:
[2023-07-03 14:40:15] [config]   - /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/work/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/model/opusTCv20210807+nopar+ft95-sepvoc.spm32k-spm32k.transformer-tiny11-align.model1.npz.best-perplexity.npz
[2023-07-03 14:40:15] [config] n-best: false
[2023-07-03 14:40:15] [config] no-spm-decode: false
[2023-07-03 14:40:15] [config] normalize: 1
[2023-07-03 14:40:15] [config] num-devices: 0
[2023-07-03 14:40:15] [config] optimize: false
[2023-07-03 14:40:15] [config] output: stdout
[2023-07-03 14:40:15] [config] output-approx-knn:
[2023-07-03 14:40:15] [config]   []
[2023-07-03 14:40:15] [config] output-omit-bias: false
[2023-07-03 14:40:15] [config] output-sampling:
[2023-07-03 14:40:15] [config]   []
[2023-07-03 14:40:15] [config] precision:
[2023-07-03 14:40:15] [config]   - float32
[2023-07-03 14:40:15] [config] quantize-range: 0
[2023-07-03 14:40:15] [config] quiet: false
[2023-07-03 14:40:15] [config] quiet-translation: true
[2023-07-03 14:40:15] [config] relative-paths: false
[2023-07-03 14:40:15] [config] right-left: false
[2023-07-03 14:40:15] [config] seed: 0
[2023-07-03 14:40:15] [config] shortlist:
[2023-07-03 14:40:15] [config]   []
[2023-07-03 14:40:15] [config] skip: false
[2023-07-03 14:40:15] [config] skip-cost: false
[2023-07-03 14:40:15] [config] stat-freq: 0
[2023-07-03 14:40:15] [config] tied-embeddings: true
[2023-07-03 14:40:15] [config] tied-embeddings-all: false
[2023-07-03 14:40:15] [config] tied-embeddings-src: false
[2023-07-03 14:40:15] [config] transformer-aan-activation: swish
[2023-07-03 14:40:15] [config] transformer-aan-depth: 2
[2023-07-03 14:40:15] [config] transformer-aan-nogate: false
[2023-07-03 14:40:15] [config] transformer-decoder-autoreg: rnn
[2023-07-03 14:40:15] [config] transformer-decoder-dim-ffn: 0
[2023-07-03 14:40:15] [config] transformer-decoder-ffn-depth: 0
[2023-07-03 14:40:15] [config] transformer-depth-scaling: false
[2023-07-03 14:40:15] [config] transformer-dim-aan: 2048
[2023-07-03 14:40:15] [config] transformer-dim-ffn: 1536
[2023-07-03 14:40:15] [config] transformer-ffn-activation: swish
[2023-07-03 14:40:15] [config] transformer-ffn-depth: 2
[2023-07-03 14:40:15] [config] transformer-guided-alignment-layer: last
[2023-07-03 14:40:15] [config] transformer-heads: 8
[2023-07-03 14:40:15] [config] transformer-no-projection: false
[2023-07-03 14:40:15] [config] transformer-pool: false
[2023-07-03 14:40:15] [config] transformer-postprocess: dan
[2023-07-03 14:40:15] [config] transformer-postprocess-emb: d
[2023-07-03 14:40:15] [config] transformer-postprocess-top: ""
[2023-07-03 14:40:15] [config] transformer-preprocess: ""
[2023-07-03 14:40:15] [config] transformer-rnn-projection: false
[2023-07-03 14:40:15] [config] transformer-tied-layers:
[2023-07-03 14:40:15] [config]   []
[2023-07-03 14:40:15] [config] transformer-train-position-embeddings: false
[2023-07-03 14:40:15] [config] tsv: false
[2023-07-03 14:40:15] [config] tsv-fields: 0
[2023-07-03 14:40:15] [config] type: transformer
[2023-07-03 14:40:15] [config] ulr: false
[2023-07-03 14:40:15] [config] ulr-dim-emb: 0
[2023-07-03 14:40:15] [config] ulr-trainable-transformation: false
[2023-07-03 14:40:15] [config] version: v1.10.24; 4dd30b5 2021-09-08 14:02:21 +0100
[2023-07-03 14:40:15] [config] vocabs:
[2023-07-03 14:40:15] [config]   - /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/work/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/model/opusTCv20210807+nopar+ft95-sepvoc.spm32k-spm32k.src.vocab
[2023-07-03 14:40:15] [config]   - /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/work/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/model/opusTCv20210807+nopar+ft95-sepvoc.spm32k-spm32k.trg.vocab
[2023-07-03 14:40:15] [config] weights:
[2023-07-03 14:40:15] [config]   []
[2023-07-03 14:40:15] [config] word-penalty: 0
[2023-07-03 14:40:15] [config] word-scores: false
[2023-07-03 14:40:15] [config] workspace: 10000
[2023-07-03 14:40:15] [config] Loaded model has been created with Marian v1.10.24; 4dd30b5 2021-09-08 14:02:21 +0100
[2023-07-03 14:40:15] [data] Loading vocabulary from text file /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/work/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/model/opusTCv20210807+nopar+ft95-sepvoc.spm32k-spm32k.src.vocab
[2023-07-03 14:40:15] [data] Loading vocabulary from text file /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/work/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/model/opusTCv20210807+nopar+ft95-sepvoc.spm32k-spm32k.trg.vocab
[2023-07-03 14:40:15] [MPI rank 0 out of 1]: GPU[0]
[2023-07-03 14:40:15] Loading model from /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/work/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/model/opusTCv20210807+nopar+ft95-sepvoc.spm32k-spm32k.transformer-tiny11-align.model1.npz.best-perplexity.npz
[2023-07-03 14:40:15] [memory] Extending reserved space to 10112 MB (device gpu0)
[2023-07-03 14:40:15] Loaded model config
[2023-07-03 14:40:15] Loading scorer of type transformer as feature F0
[2023-07-03 14:40:15] [memory] Reserving 95 MB, device gpu0
[2023-07-03 14:40:16] [gpu] 16-bit TensorCores enabled for float32 matrix operations
[2023-07-03 14:40:17] Total time: 2.14322s wall
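   (Marian reports about 2.14 s of wall time for the whole decoding run; with 3,000 input sentences that is roughly 1,400 sentences per second, model loading included, assuming all sentences were decoded within this window.)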
 - resources used according to the time command:
	Command being timed: "/users/tiedeman/.local/bin/marian-decoder -i /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/newstest2018.fin-eng.input -c /scratch/project_2003093/OPUS-MT-leaderboard/models/Tatoeba-MT-models/work/fin-eng/opusTCv20210807+nopar+ft95-sepvoc_transformer-tiny11-align_2023-07-03/model/decoder.yml -b 4 -n1 -d 0 --quiet-translation -w 10000 --mini-batch 256 --maxi-batch 512 --maxi-batch-sort src --max-length 500 --max-length-crop"
	User time (seconds): 2.06
	System time (seconds): 1.00
	Percent of CPU this job got: 97%
	Elapsed (wall clock) time (h:mm:ss or m:ss): 0:03.14
	Average shared text size (kbytes): 0
	Average unshared data size (kbytes): 0
	Average stack size (kbytes): 0
	Average total size (kbytes): 0
	Maximum resident set size (kbytes): 1661928
	Average resident set size (kbytes): 0
	Major (requiring I/O) page faults: 0
	Minor (reclaiming a frame) page faults: 259206
	Voluntary context switches: 3162
	Involuntary context switches: 4464
	Swaps: 0
	File system inputs: 0
	File system outputs: 16
	Socket messages sent: 0
	Socket messages received: 0
	Signals delivered: 0
	Page size (bytes): 4096
	Exit status: 0
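   (Sanity check on the time report: (2.06 s user + 1.00 s system) / 3.14 s elapsed ≈ 0.97, matching the 97% CPU share; the maximum resident set size of 1,661,928 kB is roughly 1.6 GB of host memory.)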
 - energy-consumption counter (end): 
     energy counter GPU 0: 691280159050 mJ
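   (End minus start gives the energy drawn by GPU 0 during the run, assuming the counters are cumulative millijoule readings as labelled: 691,280,159,050 mJ - 691,279,724,793 mJ = 434,257 mJ ≈ 434 J ≈ 0.12 Wh.)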
 - GPU utilization:
timestamp, name, pci.bus_id, driver_version, pstate, pcie.link.gen.max, pcie.link.gen.current, temperature.gpu, utilization.gpu [%], utilization.memory [%], power.draw [W], memory.total [MiB], memory.free [MiB], memory.used [MiB]
2023/07/03 14:40:14.025, NVIDIA A100-SXM4-40GB, 00000000:84:00.0, 525.105.17, P0, 4, 4, 45, 0 %, 0 %, 77.47 W, 40960 MiB, 40337 MiB, 0 MiB
2023/07/03 14:40:15.030, NVIDIA A100-SXM4-40GB, 00000000:84:00.0, 525.105.17, P0, 4, 4, 43, 0 %, 0 %, 56.46 W, 40960 MiB, 40337 MiB, 0 MiB
2023/07/03 14:40:16.035, NVIDIA A100-SXM4-40GB, 00000000:84:00.0, 525.105.17, P0, 4, 4, 43, 5 %, 0 %, 64.16 W, 40960 MiB, 29568 MiB, 10769 MiB
2023/07/03 14:40:17.039, NVIDIA A100-SXM4-40GB, 00000000:84:00.0, 525.105.17, P0, 4, 4, 48, 72 %, 27 %, 187.61 W, 40960 MiB, 29344 MiB, 10993 MiB
2023/07/03 14:40:18.047, NVIDIA A100-SXM4-40GB, 00000000:84:00.0, 525.105.17, P0, 4, 4, 46, 70 %, 6 %, 84.33 W, 40960 MiB, 39562 MiB, 775 MiB
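
The table above has the shape of periodic nvidia-smi CSV output; the timestamps are about one second apart, matching nvidia-smi's loop mode. A minimal sketch of a query that would produce exactly these columns (the exact invocation is an assumption; the monitoring command is not shown in the log):

    nvidia-smi --query-gpu=timestamp,name,pci.bus_id,driver_version,pstate,pcie.link.gen.max,pcie.link.gen.current,temperature.gpu,utilization.gpu,utilization.memory,power.draw,memory.total,memory.free,memory.used --format=csv -l 1

The last two samples show the decoder at work: GPU utilization rises to 72% and power draw to about 188 W while roughly 11 GB of the A100's memory is allocated (consistent with the 10,112 MB workspace Marian reserves on gpu0), and the memory is released again once translation finishes.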