Omission and other sins: Tracking the quality of online machine translation output over four years
Abstract
Online machine translation (MT) has empowered ordinary language users to have texts translated on their own. But are these users aware of the pitfalls? This article draws on a longitudinal study that explored the quality of the output of the online MT application Google Translate in the language combination Afrikaans–English. We investigated the distribution of errors in two sets of translations (slide-show text and news report text) that we had Google Translate produce annually over a period of four years, 2010–2013. Omission, Mistranslation, Non-translation and Grammar were the error categories that scored high in the analyses. In addition, we found that although the quality of the translations seemed to improve up to 2012, the pattern of improvement levelled off, with some of the 2013 output containing more errors than that of the previous year. We believe users should be made aware of the risks they unknowingly take when using online MT.
Keywords: error categories, Google Translate, machine translation, mistranslation, non-translation, translation quality