# Pre-training
In deep learning, pre-training refers to training a model on a large amount of general-purpose data (e.g., entire text corpora, large image datasets) before specializing it for a specific task. Through this stage the model acquires general knowledge and features, which makes the subsequent "fine-tuning" step more efficient. This corresponds to the stage of acquiring existing knowledge in "metaphysical learning."
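The idea can be sketched with a toy model: pre-train on a large "general" dataset, then fine-tune the resulting weights on a small task-specific one. This is a minimal illustrative example (the function names and datasets are invented here, not from any library); real pre-training uses deep networks and far larger corpora, but the weight-reuse pattern is the same.

```python
def train(w, data, lr=0.01, steps=100):
    """Gradient descent on mean squared error for the model y = w * x."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

# "General" data: true relation y = 2x (stands in for a large corpus).
general = [(x, 2.0 * x) for x in range(1, 11)]
# "Specific" task: y = 2.2x, related to but not identical with the general data.
specific = [(x, 2.2 * x) for x in range(1, 4)]

w_pretrained = train(0.0, general, steps=200)          # pre-training
w_finetuned = train(w_pretrained, specific, steps=20)  # fine-tuning
w_scratch = train(0.0, specific, steps=20)             # same budget, no pre-training

# Starting from the pre-trained weight, a few fine-tuning steps land
# closer to the task optimum than training from scratch does.
```

With the same small fine-tuning budget, the pre-trained starting point ends up closer to the target weight than random initialization, which is the efficiency gain the paragraph above describes.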