NLP-KG

Field of Study: Language Model Adapters

Language Model Adapters are a parameter-efficient method for fine-tuning pre-trained language models. Instead of updating all of the model's parameters, only a small set of parameters in newly added adapter modules is trained. This makes adapting the model to a specific task much cheaper while preserving the original model weights. It also supports multi-task and transfer learning, since different adapters can be plugged into the same frozen model for different tasks.
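A minimal sketch of this idea in PyTorch, assuming the common bottleneck design (down-projection, non-linearity, up-projection, residual connection); the class names, dimensions, and wrapper below are illustrative, not the API of any particular adapter library.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, plus a residual."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual connection keeps the adapted layer close to identity at initialization.
        return x + self.up(self.act(self.down(x)))

class AdaptedLayer(nn.Module):
    """Wraps a frozen pre-trained sub-layer and inserts a trainable adapter after it."""
    def __init__(self, base_layer: nn.Module, hidden_dim: int):
        super().__init__()
        self.base_layer = base_layer
        self.adapter = Adapter(hidden_dim)
        # Freeze the original weights; only the adapter parameters receive gradients.
        for p in self.base_layer.parameters():
            p.requires_grad = False

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.base_layer(x))

if __name__ == "__main__":
    hidden_dim = 768
    layer = AdaptedLayer(nn.Linear(hidden_dim, hidden_dim), hidden_dim)
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    total = sum(p.numel() for p in layer.parameters())
    print(f"trainable parameters: {trainable} / {total}")
```

Because only the adapter's parameters keep requires_grad=True, an optimizer built over the trainable parameters updates a small fraction of the weights a full fine-tune would touch, and a different Adapter instance can be swapped in to retarget the same frozen backbone to another task.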

Synonyms: Adapters


