NLP-KG

Field of Study: Sentence Embeddings

Sentence Embeddings are numerical representations of sentences or short paragraphs in a multi-dimensional vector space. They are a form of feature extraction in which a sentence or short paragraph is converted into a vector. These vectors capture the semantic meaning of the text, so that semantically similar sentences are mapped close to each other in the vector space.

Synonyms: Sentence Representations, Document Embeddings
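
As a concrete illustration, the sketch below computes embeddings for a few sentences and compares them with cosine similarity, so that the two paraphrases score higher with each other than with the unrelated sentence. It assumes the sentence-transformers Python library and the all-MiniLM-L6-v2 checkpoint, neither of which is prescribed by this page; any sentence encoder could be substituted.

from sentence_transformers import SentenceTransformer, util

# Load a pretrained sentence encoder (assumed model choice, not from this page).
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "A cat sits on the mat.",
    "A kitten is resting on a rug.",
    "The stock market fell sharply today.",
]

# Each sentence is mapped to a fixed-size vector.
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity in the embedding space approximates semantic similarity:
# the first two sentences should be closer to each other than to the third.
print(util.cos_sim(embeddings, embeddings))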

Papers published in this field over the years: [chart]

Publications for Sentence Embeddings

Researchers for Sentence Embeddings
