
Commit 9e95028

Add check for hpu and wrap_in_hpu_graph availability. (#3249)
Prevent AttributeError by verifying that habana_frameworks.torch.hpu and its wrap_in_hpu_graph function exist before calling it.
Parent: 309a9ac

File tree: 1 file changed, +3 −2 lines

sentence_transformers/SentenceTransformer.py (3 additions, 2 deletions)

@@ -529,8 +529,9 @@ def encode(
         if self.device.type == "hpu" and not self.is_hpu_graph_enabled:
             import habana_frameworks.torch as ht

-            ht.hpu.wrap_in_hpu_graph(self, disable_tensor_cache=True)
-            self.is_hpu_graph_enabled = True
+            if hasattr(ht, "hpu") and hasattr(ht.hpu, "wrap_in_hpu_graph"):
+                ht.hpu.wrap_in_hpu_graph(self, disable_tensor_cache=True)
+                self.is_hpu_graph_enabled = True

         self.eval()
         if show_progress_bar is None:
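
For context, a minimal, self-contained sketch of the guard pattern this commit applies. The DummyModel class is a hypothetical stand-in for a SentenceTransformer instance, and the try/except around the import is an addition for illustration: on a machine without the Habana software stack the import itself fails, so the sketch handles that case too and runs anywhere.

    # Sketch of the commit's guard pattern: probe an optional accelerator
    # API with hasattr() before calling it, so a missing or older
    # habana_frameworks build raises no AttributeError.

    class DummyModel:
        """Hypothetical stand-in for a SentenceTransformer instance."""
        is_hpu_graph_enabled = False


    model = DummyModel()

    try:
        import habana_frameworks.torch as ht  # present only on Gaudi/HPU systems
    except ImportError:
        ht = None

    if ht is not None and hasattr(ht, "hpu") and hasattr(ht.hpu, "wrap_in_hpu_graph"):
        # wrap_in_hpu_graph records the model's forward pass as an HPU graph;
        # disable_tensor_cache=True mirrors the call in SentenceTransformer.encode.
        ht.hpu.wrap_in_hpu_graph(model, disable_tensor_cache=True)
        model.is_hpu_graph_enabled = True
    else:
        # Without the API, encoding simply proceeds unwrapped instead of crashing.
        print("wrap_in_hpu_graph unavailable; continuing without HPU graphs")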
