NLP-KG

Field of Study: Transformers & Large Language Models

Transformers & Large Language Models is concerned with models based on the Transformer neural network architecture, which is built around the (self-)attention mechanism and has become foundational across machine learning applications. The concept also covers Large Language Models (LLMs), Transformers pre-trained on large amounts of textual data that excel at a wide range of natural language processing tasks.
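For illustration, a minimal NumPy sketch of the scaled dot-product self-attention at the core of the Transformer architecture; the dimensions, variable names, and single-head setup are assumptions chosen for brevity, not any particular model's implementation.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v).
        d_k = Q.shape[-1]
        # Similarity of every query to every key, scaled for numerical stability.
        scores = Q @ K.T / np.sqrt(d_k)
        # Softmax over the key dimension yields attention weights per query.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Each output position is a weighted average of the value vectors.
        return weights @ V

    # Illustrative usage with random token embeddings (shapes are assumptions).
    rng = np.random.default_rng(0)
    x = rng.normal(size=(5, 16))       # 5 tokens, 16-dimensional embeddings
    W_q, W_k, W_v = (rng.normal(size=(16, 16)) for _ in range(3))
    out = scaled_dot_product_attention(x @ W_q, x @ W_k, x @ W_v)
    print(out.shape)                   # (5, 16)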

Synonyms:

Transformer Language Models, Transformer, Pre-trained Language Models, LLMs, Pre-trained Foundation Models, Pretrained Model, LLM, Pretrained Foundation Models, Pretrained Language Models, Pre-trained Model, Transformer Language Model, Large Language Model, Transformer Models, Foundation Models

Papers published in this field over the years:

Hierarchy


Publications for Transformers & Large Language Models


Researchers for Transformers & Large Language Models
