Bag of Words
A model that represents a text as an unordered collection of words, keeping word counts but disregarding grammar and word order.
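A minimal sketch of the idea, assuming a simple lowercase-and-split preprocessing step (real systems typically tokenize more carefully):

```python
from collections import Counter

def bag_of_words(text):
    # Lowercase and split on whitespace; word order is discarded
    # and only per-word counts are kept.
    return Counter(text.lower().split())

bow = bag_of_words("the cat sat on the mat")
# bow["the"] is 2; the original sentence order is not recoverable.
```

Because order is discarded, two sentences with the same words in different orders produce identical representations.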
Fine-Tuning
A method of further training a pre-trained model on a specific dataset to improve performance on a particular task.
Inference
The process of using a trained model to generate predictions or outputs based on new input data.
Natural Language Processing
A field of artificial intelligence that focuses on the interaction between computers and human language.
Overfitting
A modeling error in which a model learns the training data too closely, including its noise, and fails to generalize to new data.
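An extreme illustration of the idea, as a toy sketch: a "model" that simply memorizes its training pairs achieves zero training error but cannot answer anything about unseen inputs.

```python
# Underlying rule the data follows: y = 2 * x.
train = {1: 2, 2: 4, 3: 6}

# The memorizing "model" stores every training example verbatim.
model = dict(train)

# Training error is exactly zero...
train_error = sum(abs(model[x] - y) for x, y in train.items())

# ...but the model has no prediction at all for an unseen input.
prediction = model.get(4)
```

A model that had instead learned the rule y = 2x would generalize to x = 4; the memorizer cannot.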
Parameters
The internal variables of a model that are adjusted during training to minimize prediction error.
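A minimal sketch of what "adjusted during training" means, assuming the simplest possible model (one weight w fit by gradient descent to data following y = 2x; the learning rate and step count are illustrative choices):

```python
# Toy dataset generated by the rule y = 2 * x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # the model's single internal parameter
lr = 0.05  # learning rate

for _ in range(200):
    # Gradient of mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # Adjust the parameter to reduce prediction error.
    w -= lr * grad
# w converges toward 2.0, the value that minimizes the error.
```

Large neural networks train the same way in principle, just with billions of parameters updated simultaneously.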
Semantic Analysis
The process of interpreting the meaning of words and phrases in context.
Tokenization
The process of breaking text into smaller pieces, called tokens, which can be words or subwords.
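A simple word-level sketch, assuming a regex that separates words from punctuation (production systems typically use learned subword vocabularies such as BPE instead):

```python
import re

def tokenize(text):
    # Emit runs of word characters, or single punctuation marks,
    # as separate tokens; whitespace is dropped.
    return re.findall(r"\w+|[^\w\s]", text.lower())

tokens = tokenize("Hello, world!")
# ["hello", ",", "world", "!"]
```

Subword tokenizers go further, splitting rare words into frequent fragments so the vocabulary stays fixed-size.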
Training Corpus
A large set of texts used to train a model to understand and generate language.
Transformer
A neural network architecture that uses self-attention mechanisms to process and generate sequences of data.
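The self-attention step at the heart of the architecture can be sketched as follows, assuming toy 2-dimensional query/key/value vectors (real models use learned projections and many attention heads):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(queries, keys, values):
    # Scaled dot-product attention: each query scores every key,
    # and the output is the attention-weighted sum of the values.
    d = len(queries[0])
    out = []
    for q in queries:
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[t] for w, v in zip(weights, values))
                    for t in range(len(values[0]))])
    return out

x = [[1.0, 0.0], [0.0, 1.0]]   # two toy token vectors
out = self_attention(x, x, x)  # each token attends over both tokens
```

Each output row is a convex combination of the value vectors, so a token's representation blends in information from every position it attends to.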