
Update transformer version checks and documentation for lr_scheduler_kwargs workaround #4876

Merged
qgallouedec merged 4 commits into main from update-version-comment on Jan 20, 2026

Conversation

@qgallouedec
Member

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

```python
@require_torch_accelerator
def test_training_with_transformers_paged(self, config_name):
    if Version(transformers.__version__) < Version("4.57.0"):
        pytest.xfail("Upstream bug in transformers (GH#40692). Fix merged; awaiting release >= 4.57.0")
```

Member


I think we should keep the xfail and just update the message.

  • We want the test to be marked as xfail if run in a venv with transformers < 4.57.0

Member Author


Ah right, it does fail when running with transformers==4.56.2 (the minimum supported version). We just don't test it, because this test lives in the experimental suite.
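The version gate discussed above boils down to "xfail when the installed transformers predates the release containing the fix". A minimal dependency-free sketch of that comparison (the `parse_version` and `should_xfail` helpers are hypothetical stand-ins for `packaging.version.Version`, which the real test uses):

```python
def parse_version(v: str) -> tuple:
    """Parse a simple 'X.Y.Z' release string into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def should_xfail(installed: str, fixed_in: str = "4.57.0") -> bool:
    """True when the installed transformers predates the release with the fix."""
    return parse_version(installed) < parse_version(fixed_in)

# Minimum supported version (4.56.2) predates the fix, so the test is xfailed;
# once 4.57.0 or later is installed, the test runs normally.
print(should_xfail("4.56.2"))  # → True
print(should_xfail("4.57.0"))  # → False
```

Note that tuple comparison handles the multi-digit minor version correctly ((4, 56, 2) < (4, 57, 0)), whereas naive string comparison would not in general.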

Member Author


```python
def test_training_with_transformers_paged(self, model_name):
    """Test that training works with transformers paged implementation (requires GPU)."""
    if Version(transformers.__version__) < Version("4.57.0"):
        pytest.xfail("Upstream bug in transformers (GH#40692). Fix merged; awaiting release >= 4.57.0")
```

Member

@albertvillanova albertvillanova left a comment


Thanks for the updates.

Just one comment, in the review thread above.


@qgallouedec qgallouedec merged commit 14a8ed7 into main Jan 20, 2026
14 checks passed
@qgallouedec qgallouedec deleted the update-version-comment branch January 20, 2026 20:41
marcandrelarochelle pushed a commit to marcandrelarochelle/trl that referenced this pull request Mar 25, 2026


3 participants