SNOMED-CT German Validation Form
title
Concept ID: 19923001
coid
INSTRUCTIONS: Based on the following descriptions in English, Spanish, and Swedish, please assign one label (Correct, Acceptable, or Wrong) to each German translation candidate. If all candidates are wrong, you may suggest your own translation at the bottom of the form.
help
line1
English description:
Catheter
Spanish description:
catéter
Swedish description:
kateter
or
line2
Katheter
Correct
Acceptable
Wrong
candidate_1
Catheter
Correct
Acceptable
Wrong
candidate_2
катетер
Correct
Acceptable
Wrong
candidate_3
Correct
Acceptable
Wrong
candidate_4
Cather
Correct
Acceptable
Wrong
candidate_5
line3
Insert your translation here:
own
Submit
submit
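For readers who process the collected judgements programmatically, the following is a minimal sketch of how one submission of this form could be represented, keyed by the field names shown above (coid, candidate_1 through candidate_5, own). The Label enum, the ValidationResponse class, and the example values are illustrative assumptions, not part of the form or of any actual dataset.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, Optional


class Label(str, Enum):
    """The three judgement labels offered for each candidate."""
    CORRECT = "Correct"
    ACCEPTABLE = "Acceptable"
    WRONG = "Wrong"


@dataclass
class ValidationResponse:
    """One completed validation form, keyed by the field names used above."""
    coid: int                                       # Concept ID, e.g. 19923001
    candidates: Dict[str, Label] = field(default_factory=dict)  # field key -> assigned label
    own: Optional[str] = None                       # free-text translation if all candidates are Wrong


# Hypothetical example submission for concept 19923001 ("Catheter"):
response = ValidationResponse(
    coid=19923001,
    candidates={
        "candidate_1": Label.CORRECT,   # "Katheter"
        "candidate_2": Label.WRONG,     # "Catheter" (English spelling)
        "candidate_3": Label.WRONG,     # "катетер" (Cyrillic)
        "candidate_5": Label.WRONG,     # "Cather" (misspelling)
    },
)
```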