
[Bug] AttributeError & TypeError in bertwarper.py with Transformers v5.x #196

@t8ja

Description:
The current implementation of `bertwarper.py` within the `groundingdino` dependency is incompatible with Transformers 5.3.0 and Torch 2.10. Several legacy BERT methods used by GroundingDINO have been removed, or had their signatures altered, in these newer releases.

Environment Info (Verified):

| Component | Version |
| --- | --- |
| OS | Linux 6.19.9-1-cachyos |
| Python | 3.10.20 |
| Torch | 2.10.0+cu126 |
| Transformers | 5.3.0 |
| GroundingDINO-py | 0.4.0 |

The Fixes:
The following patches to groundingdino/models/GroundingDINO/bertwarper.py resolve the initialization and inference crashes:

1. Fix AttributeError (Line ~29)

`BertModel` in v5.x no longer exposes `get_head_mask` directly.

```python
# Use getattr to safely handle missing legacy methods
self.get_head_mask = getattr(bert_model, "get_head_mask", None)
```

2. Fix TypeError (Line ~109)

The `device` argument has been removed from `get_extended_attention_mask`, as it is now inferred from the input tensors.

```python
# Updated for Transformers 5.x API
extended_attention_mask: torch.Tensor = self.get_extended_attention_mask(
    attention_mask, input_shape
)
```
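For codebases that must run under both signatures, the call can also be dispatched by inspecting the method's parameters rather than pinning a version. This is a hedged sketch: `extended_attention_mask_compat` is a hypothetical helper, and `Legacy`/`Modern` are stand-in classes illustrating the two signatures, not the real Transformers models.

```python
import inspect

def extended_attention_mask_compat(model, attention_mask, input_shape, device):
    """Call get_extended_attention_mask with or without `device`,
    depending on which signature the installed library exposes."""
    fn = model.get_extended_attention_mask
    if "device" in inspect.signature(fn).parameters:
        return fn(attention_mask, input_shape, device)  # legacy 4.x signature
    return fn(attention_mask, input_shape)              # 5.x signature

# Stand-ins that mimic the two signatures for illustration:
class Legacy:
    def get_extended_attention_mask(self, attention_mask, input_shape, device):
        return "legacy"

class Modern:
    def get_extended_attention_mask(self, attention_mask, input_shape):
        return "modern"
```

The signature probe happens once per call; if that overhead matters on a hot path, the result can be cached at `__init__` time instead.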

3. Fix NoneType Exception (Line ~131)

Prevents a crash when the code attempts to call the now-missing `get_head_mask` method.

```python
if self.get_head_mask is not None:
    head_mask = self.get_head_mask(head_mask, self.config.num_hidden_layers)
else:
    head_mask = [None] * self.config.num_hidden_layers
```
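Fixes 1 and 3 form a single getattr-plus-fallback pattern. The sketch below shows it end to end in isolation, assuming stand-in `LegacyBert`/`ModernBert` classes (hypothetical names, not GroundingDINO's actual classes):

```python
def resolve_head_mask(bert_model, head_mask, num_hidden_layers):
    """Use the model's get_head_mask when present (Transformers 4.x),
    otherwise fall back to a list of Nones (Transformers 5.x)."""
    get_head_mask = getattr(bert_model, "get_head_mask", None)
    if get_head_mask is not None:
        return get_head_mask(head_mask, num_hidden_layers)
    return [None] * num_hidden_layers

class LegacyBert:  # 4.x: method exists
    def get_head_mask(self, head_mask, num_hidden_layers):
        if head_mask is None:
            return [None] * num_hidden_layers
        return head_mask

class ModernBert:  # 5.x: method removed
    pass
```

A list of `None`s is what the legacy method itself produced for an unset `head_mask`, so the fallback preserves the old behavior.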

Additional Note:
These changes are backwards compatible and allow the node to function across both legacy (4.x) and modern (5.x) Transformers releases.
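An alternative to attribute probing is gating the patches on the installed major version. A minimal sketch, assuming a hypothetical helper name (`transformers_major` is not part of GroundingDINO or Transformers):

```python
from importlib.metadata import PackageNotFoundError, version

def transformers_major() -> int:
    """Return the installed Transformers major version, or 0 if absent."""
    try:
        return int(version("transformers").split(".")[0])
    except PackageNotFoundError:
        return 0

# e.g. use the legacy get_extended_attention_mask signature when < 5
LEGACY_API = 0 < transformers_major() < 5
```

The getattr-based patches above are more robust, since they track the actual API surface rather than a version number, but a version gate can be easier to audit.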
