
English → Russian
generative pre-trained transformer
AI. трансформер, обученный для генерации текста ["transformer trained to generate text"] (MichaelBurov); обученный генерации текста трансформер ["transformer trained for text generation"] (MichaelBurov); предварительно обученный генеративный преобразователь ["pre-trained generative transformer"] (Generative pre-trained transformers (GPT) are a family of language models, generally trained on a large corpus of text data, that generate human-like text. They are built from several blocks of the transformer architecture and can be fine-tuned for various natural language processing tasks such as text generation, language translation, and text classification. The "pre-training" in the name refers to the initial training on a large text corpus, during which the model learns to predict the next word in a passage; this gives the model a solid foundation for performing well on downstream tasks with limited amounts of task-specific data. According to the researchers, LLMs demonstrate "state-of-the-art capabilities" in translation quality assessment at the system level; however, they emphasized that only GPT-3.5 and larger models achieve state-of-the-art accuracy when compared with human judgments. Those findings provide "a first glimpse into the usefulness of pre-trained, generative large language models for quality assessment of translations," they said. wikipedia.org Alexander Demidov); генеративный предобученный трансформер ["generative pre-trained transformer"] (Alex_Odeychuk)
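The definition above describes the GPT workflow: pre-train on next-word prediction over a large text corpus, then generate text (or fine-tune for a downstream task). As a minimal sketch of the generation step, the Python snippet below loads a small public GPT-family checkpoint with the Hugging Face transformers library and continues a prompt one predicted token at a time. The model name ("gpt2"), the prompt, and the sampling parameters are illustrative assumptions, not part of the dictionary entry.

    # Minimal sketch: generate text with a pre-trained GPT-family model.
    # Assumes the Hugging Face "transformers" package and the public "gpt2"
    # checkpoint; both are illustrative choices, not taken from the entry above.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # Encode a prompt; generation extends it by repeatedly predicting the
    # next token, the same objective the model was pre-trained on.
    prompt = "Generative pre-trained transformers are"
    inputs = tokenizer(prompt, return_tensors="pt")

    outputs = model.generate(
        **inputs,
        max_new_tokens=40,        # length of the continuation
        do_sample=True,           # sample for varied, human-like text
        top_p=0.9,                # nucleus sampling cutoff
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same checkpoint could instead be fine-tuned on task-specific data, which is the point the definition makes about performing well on downstream tasks with limited amounts of such data.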