A simple lookup table that stores embeddings of a fixed dictionary and size.
This module is often used to store word embeddings and retrieve them using indices. The input to the module is a list of indices, and the output is the corresponding word embeddings.
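As a minimal illustration of the lookup semantics (a hypothetical NumPy sketch, not this module's implementation), retrieving embeddings amounts to row indexing into the weight table:

```python
import numpy as np

# Hypothetical toy table: num_embeddings = 5 vectors of embedding_dim = 3.
num_embeddings, embedding_dim = 5, 3
rng = np.random.default_rng(0)
weight = rng.standard_normal((num_embeddings, embedding_dim))

# A lookup is just row indexing: each id selects one embedding vector.
ids = np.array([0, 2, 2, 4])
output = weight[ids]

print(output.shape)  # (4, 3): one embedding_dim-sized vector per input id
```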
If padding_idx is non-negative, the embedding vector at padding_idx is filled with zeros. The num_embeddings, embedding_dim, and padding_idx parameters are described below.

Tensor parallelism is performed along the embedding dimension of the weight. After the devices in the same communication world perform their local embedding lookups, an all-gather combines the partial results into the full output.
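The tensor-parallel step can be sketched as follows. This is an illustrative simulation with NumPy arrays standing in for device shards (names such as world_size are assumptions, not this module's API); it assumes the weight is split along the embedding dimension and the per-device partial outputs are concatenated by the all-gather:

```python
import numpy as np

num_embeddings, embedding_dim, world_size = 5, 4, 2
rng = np.random.default_rng(1)
weight = rng.standard_normal((num_embeddings, embedding_dim))

# Split the weight along the embedding dimension: one shard per device.
shards = np.split(weight, world_size, axis=1)

ids = np.array([1, 3, 0])

# Each device performs the lookup on its local shard...
partial_outputs = [shard[ids] for shard in shards]

# ...then an all-gather concatenates the partial results into the full output.
output = np.concatenate(partial_outputs, axis=-1)

# The gathered result matches a lookup against the unsharded weight.
assert np.array_equal(output, weight[ids])
```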
Parameters:
    num_embeddings (int): Number of embedding vectors in the embedding weight.
    embedding_dim (int): Dimension of each vector in the embedding weight.
    padding_idx (int): Enables padding when non-negative; the embedding vector at padding_idx is filled with zeros.
    max_norm (float): If greater than 0, each embedding vector with norm larger than max_norm is renormalized to have norm max_norm.
    norm_type (float): The p of the p-norm to compute for the max_norm option.
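The interaction between max_norm and norm_type can be sketched as follows (a NumPy illustration of the stated semantics; renormalize is a hypothetical helper, not this module's code):

```python
import numpy as np

def renormalize(weight, max_norm, norm_type=2.0):
    # p-norm of each embedding vector, with p = norm_type.
    norms = np.sum(np.abs(weight) ** norm_type, axis=1) ** (1.0 / norm_type)
    # Scale down only the vectors whose norm exceeds max_norm.
    scale = np.where(norms > max_norm, max_norm / norms, 1.0)
    return weight * scale[:, None]

weight = np.array([[3.0, 4.0],   # 2-norm 5.0 -> rescaled to norm 1.0
                   [0.3, 0.4]])  # 2-norm 0.5 -> left unchanged
out = renormalize(weight, max_norm=1.0)
assert np.allclose(np.linalg.norm(out, axis=1), [1.0, 0.5])
```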
Inputs:
    input (Tensor): Input token ids; each value should be between 0 and num_embeddings - 1.
        Shape: (*), where * is any shape of indices.

Attributes:
    weight (Tensor): Embedding weight.
        Shape: (num_embeddings, embedding_dim).

Outputs:
    Tensor: The embedding vectors corresponding to the input ids.
        Shape: (*, embedding_dim), where * is the input shape.
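The shapes can be illustrated concretely (a NumPy sketch; the batch dimensions here are arbitrary examples):

```python
import numpy as np

num_embeddings, embedding_dim = 10, 6
# Weight shape: (num_embeddings, embedding_dim).
weight = np.zeros((num_embeddings, embedding_dim))

ids = np.zeros((2, 3), dtype=np.int64)  # input shape (*) = (2, 3)
output = weight[ids]                    # output shape (*, embedding_dim)

print(output.shape)  # (2, 3, 6)
```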