
BERT Layer Built-In Backward Pass #2213

Closed

Conversation

MaximilianSchreff
Contributor

This PR adds the backward pass for the previously introduced full BERT layer from the BERT transformer architecture to SystemDS as a built-in operation.

This PR is part of a series of PRs to support the BERT architecture in SystemDS. The BERT layer is the core building block of the BERT architecture.

Includes

  • Backward pass

Testing:

Added 2 test cases that validate the backward pass for correctness against the HuggingFace Transformers library's implementation, which computes gradients via PyTorch's Autograd module.

  • The tests validate:
    • The backward pass against HuggingFace's transformers.models.bert.modeling_bert.BertLayer
    • Every gradient produced by the whole layer
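The actual tests compare the SystemDS DML built-in against HuggingFace/PyTorch Autograd. As an illustrative sketch of the same gradient-checking idea, here is a minimal NumPy example that validates an analytic backward pass against central-difference numerical gradients, using a simplified dense-plus-GELU sublayer (the function names and the reduced layer are hypothetical, not the SystemDS implementation):

```python
import numpy as np

def gelu(x):
    # tanh approximation of GELU, as used in BERT
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def forward(X, W, b):
    # simplified sublayer: affine transform followed by GELU
    return gelu(X @ W + b)

def backward(X, W, b, dout):
    # analytic gradients via the chain rule
    Z = X @ W + b
    c = np.sqrt(2 / np.pi)
    t = np.tanh(c * (Z + 0.044715 * Z**3))
    # derivative of the tanh-approximated GELU
    dgelu = 0.5 * (1 + t) + 0.5 * Z * (1 - t**2) * c * (1 + 3 * 0.044715 * Z**2)
    dZ = dout * dgelu
    return dZ @ W.T, X.T @ dZ, dZ.sum(axis=0)  # dX, dW, db

def num_grad(f, P, eps=1e-6):
    # central-difference numerical gradient of scalar f w.r.t. parameter array P
    g = np.zeros_like(P)
    it = np.nditer(P, flags=["multi_index"])
    for _ in it:
        i = it.multi_index
        old = P[i]
        P[i] = old + eps; fp = f()
        P[i] = old - eps; fm = f()
        P[i] = old
        g[i] = (fp - fm) / (2 * eps)
    return g

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2))
b = rng.standard_normal(2)
dout = rng.standard_normal((4, 2))

dX, dW, db = backward(X, W, b, dout)
# scalar proxy loss whose gradients equal the analytic ones above
loss = lambda: np.sum(forward(X, W, b) * dout)
assert np.allclose(dW, num_grad(loss, W), atol=1e-5)
```

The PR's tests apply the same principle at full scale: every gradient of the complete BERT layer is checked against a trusted reference (PyTorch Autograd) rather than a hand-derived numerical baseline.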


codecov bot commented Feb 4, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 72.38%. Comparing base (e022eaf) to head (91ec288).
Report is 8 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff            @@
##               main    #2213   +/-   ##
=========================================
  Coverage     72.38%   72.38%           
  Complexity    45325    45325           
=========================================
  Files          1467     1467           
  Lines        170629   170629           
  Branches      33260    33260           
=========================================
+ Hits         123512   123513    +1     
+ Misses        37728    37725    -3     
- Partials       9389     9391    +2     


@Baunsgaard
Contributor

LGTM

@Baunsgaard Baunsgaard closed this in 22642a1 Feb 5, 2025