The Problem:
When training a sequence-to-sequence model with HuggingFace Transformers’ Seq2SeqTrainer, the user encounters a deprecation warning about using a deprecated strategy to control generation. The user wants to switch to the recommended approach but cannot access the documentation link provided in the warning message. The warning appears despite using recent versions of Transformers (4.28.1) and Python (3.9.7).
The Solutions:
Solution 1: Use Generation Configuration File
A generation configuration object is the recommended way to control generation parameters. To resolve the deprecation warning, create a GenerationConfig object and pass it to the generate method, instead of setting generation attributes on the model’s configuration directly.
from transformers import GenerationConfig

# Deprecated: setting generation parameters on model.config directly
# model.config.max_new_tokens = 10
# model.config.min_length = 1

# Recommended: put generation parameters in a GenerationConfig
gen_cfg = GenerationConfig.from_model_config(model.config)
gen_cfg.max_new_tokens = 10
gen_cfg.min_length = 1

summary_ids = model.generate(inputs["input_ids"], generation_config=gen_cfg)
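A GenerationConfig can also be built from scratch and attached to the model (via its generation_config attribute) or saved alongside a checkpoint, so that code such as Seq2SeqTrainer picks it up without extra arguments. A minimal sketch, with illustrative parameter values and a hypothetical checkpoint directory name:

```python
from transformers import GenerationConfig

# Build a generation config from scratch; the values are illustrative.
gen_cfg = GenerationConfig(max_new_tokens=10, min_length=1)

# Option 1: attach it to the model so subsequent generate() calls
# (and trainers that call generate) use it by default.
# model.generation_config = gen_cfg

# Option 2: persist it next to the model checkpoint so it is
# reloaded automatically by from_pretrained.
# gen_cfg.save_pretrained("my-checkpoint-dir")  # hypothetical directory

print(gen_cfg.max_new_tokens, gen_cfg.min_length)
```

Parameters not set explicitly keep the library defaults, so the config stays a small, explicit record of only the values you changed.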
Q&A
In HuggingFace, how can I fix the ‘deprecated strategy’ warning when modifying the model configuration for text generation?
Use a ‘GenerationConfig’ object instead of setting configuration attributes directly.
Video Explanation:
The following video, titled "Accelerate Transformer Model Training with Hugging Face and ...", provides additional insights and in-depth exploration related to the topics discussed in this post.
Transformer models deliver state-of-the-art performance on a wide range of machine learning tasks, such as natural language processing, ...