Exploring Localization Trends for 2017
Localization is an important aspect of an online business today. It is the process of adapting a product or its content to suit the specific geographical and cultural needs of a territory a business looks to expand into. Localization comprises several elements, translation being one of them. The process of localization has evolved over the years and its trends keep changing.
Let us look at some of the vital localization trends that are expected to rule the market in the coming year. Localization trends for 2017 include:
1. Fresh Business Models
The translation industry has long stuck to a pay-per-word pricing methodology. However, this mode of pricing does not suit services such as multimedia and transcreation. For instance, transcreating five words for a TV show on Netflix can be a tedious and lengthy exercise, taking a few days of research and word-craft. Would paying per word then justify the translator's hard work? Apparently, the answer is a BIG NO. In the case of multimedia, it is imperative to account for the production hours invested by audio engineers, motion graphics artists, voice-over artists and post-production QA. The cost of a project of this magnitude cannot be captured by the traditional pay-per-word method. Provided the video meets the necessary length and complexity requirements, it is possible to chalk out a sensible pricing model that lets buyers purchase localized videos at a flat price. This model is results-driven and has its own benefits, simplifying budgeting for both buyers and sellers. It is set to gain traction in 2017.
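The gap between the two pricing models can be sketched in a few lines. All rates and figures below are illustrative assumptions, not industry quotes, and the function names are hypothetical:

```python
def per_word_cost(word_count: int, rate_per_word: float) -> float:
    """Traditional pricing: the translator is paid only for words translated."""
    return word_count * rate_per_word

def per_minute_cost(video_minutes: float, rate_per_minute: float) -> float:
    """Results-driven pricing: a flat rate per localized minute, which folds
    in voice-over, audio engineering, motion graphics and post-production QA."""
    return video_minutes * rate_per_minute

# A short promo clip: only 50 on-screen words, but 2 minutes of production work.
word_based = per_word_cost(50, 0.12)      # 6.00 -- ignores the studio hours
minute_based = per_minute_cost(2, 75.0)   # 150.00 -- reflects the full effort
```

Even with these toy numbers, the per-word total plainly cannot cover days of research, recording and QA, which is why the flat results-driven model simplifies budgeting for both sides.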
2. Quality Criterion
The localization industry has evolved over the past years and this trend is set to continue in the years ahead. Going forward, however, the industry is set to stiffen its quality criteria, or quality benchmarks, for the greater good of translation and its impact on overall business expansion. The year 2017 will witness a massive shift in how quality benchmarks are set. Many such benchmarks will start to shape content profiles based on the data collected from the buyers and sellers of translation services. TAUS DQF (Dynamic Quality Framework) is the technology set to drive this quality-benchmarking trend, and interestingly, it has gathered massive support from the industry. The aim of adopting DQF is to standardize the methodology and instruments used to measure quality, to aggregate the scores and measurements, and at the same time to make them available as industry-shared metrics. DQF is project-friendly and can be enabled by adding a plugin, readily available across translation management systems and translation tools. DQF can measure the time a translator takes to translate each segment. As the project draws to a close, a reviewer marks the errors that fail the quality criteria, and these scores are then transmitted to the DQF central database, which stores scores and productivity ratings for each content profile.
Buyers and sellers of translation services will be able to apply these benchmarks in their SLAs, thus formalizing quality expectations.
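The workflow described above can be sketched as a small data model. This is not the actual DQF plugin API, just a minimal illustration of the kind of per-segment records and derived metrics such a framework tracks; all names and numbers are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class SegmentRecord:
    """One translated segment, as a quality framework might log it."""
    segment_id: str
    word_count: int
    seconds_spent: float   # time the translator spent on the segment
    errors: int = 0        # errors marked by the reviewer at project close

@dataclass
class ContentProfile:
    """Aggregated scores and productivity ratings for one content profile."""
    segments: list = field(default_factory=list)

    def add(self, record: SegmentRecord) -> None:
        self.segments.append(record)

    def productivity_wpm(self) -> float:
        """Words translated per minute across all segments."""
        words = sum(s.word_count for s in self.segments)
        minutes = sum(s.seconds_spent for s in self.segments) / 60
        return words / minutes if minutes else 0.0

    def errors_per_1000_words(self) -> float:
        """Reviewer-marked errors normalized per 1,000 words."""
        words = sum(s.word_count for s in self.segments)
        marked = sum(s.errors for s in self.segments)
        return 1000 * marked / words if words else 0.0

# Example: 150 words translated in 3 minutes with one marked error.
profile = ContentProfile()
profile.add(SegmentRecord("seg-1", word_count=100, seconds_spent=120.0, errors=1))
profile.add(SegmentRecord("seg-2", word_count=50, seconds_spent=60.0))
```

An SLA could then reference thresholds on exactly these aggregates, e.g. a minimum words-per-minute rating or a maximum error density, which is what makes shared metrics actionable for buyers and sellers alike.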
3. Neural Machine Translation
The robust machine translation technology is emerging stronger than ever. Neural MT, known for simulating the brain's interconnected networks of neurons to render translations, is replacing the phrase-based statistical method. Tech giants including Google, Microsoft and Facebook have unveiled neural MT technologies. Neural MT produces high-quality output for various language pairs, qualitatively superior to statistical MT. Meanwhile, Google's neural MT develops its own version of an interlingua between language pairs: a conceptual representation of language within the network. At present, neural MT produces high-quality results for many language pairs including Chinese, Arabic, Japanese, Turkish and Korean. However, statistical MT still wins for languages like French, Portuguese and Spanish. This is why independent MT developers are not doing away with statistical MT just yet. Instead, neural MT is deployed as one component in hybrid architectures combining neural, statistical and rules-based MT engines.
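The hybrid deployment described above amounts to routing each language pair to the engine that currently serves it best. The sketch below is a hypothetical router, not any vendor's actual system; the language lists loosely follow the examples in this section:

```python
# Languages where, per the discussion above, neural MT already excels,
# versus those where statistical MT still wins. Illustrative only.
NEURAL_STRONG = {"zh", "ar", "ja", "tr", "ko"}   # Chinese, Arabic, Japanese, Turkish, Korean
STATISTICAL_STRONG = {"fr", "pt", "es"}          # French, Portuguese, Spanish

def pick_engine(target_lang: str) -> str:
    """Choose an MT back end for a given ISO 639-1 target language code."""
    if target_lang in NEURAL_STRONG:
        return "neural"
    if target_lang in STATISTICAL_STRONG:
        return "statistical"
    return "rules-based"  # fallback component of the hybrid architecture
```

As neural MT catches up on further language pairs, a router like this lets a developer shift traffic pair by pair rather than retiring statistical MT wholesale.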
Only a few more days to go before we can actually witness these changing trends take effect in the process of localization.