Commit b163c5d

Update usage of numpy to reflect numpy 2.0 changes (#6871)
## Motivation for features / changes

Nightlies are failing due to the new numpy release (one example: https://github.com/tensorflow/tensorboard/actions/runs/9546526095/job/26309613783).

## Technical description of changes

The new numpy release replaced np.string_ with np.bytes_ and np.unicode_ with np.str_ (see the short sketch after the commit message).

## Screenshots of UI changes (or N/A)

## Detailed steps to verify changes work correctly (as executed by you)

I ran the tests that were failing and they passed.

## Alternate designs / implementations considered (or N/A)

Locking the numpy version to < 2 was considered. However, this would eventually leave us very out of date and should be avoided.

---------

Co-authored-by: James Hollyer <[email protected]>
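
For context, a minimal sketch (plain numpy, not part of this commit) of the rename: np.bytes_ and np.str_ already exist in numpy 1.x, so switching to them keeps the code working on both major versions, while the old aliases raise AttributeError on numpy >= 2.

```python
import numpy as np

# np.bytes_ and np.str_ are the canonical scalar string types; these names
# also exist in numpy 1.x, so the switch is backward compatible.
assert np.dtype("S").type is np.bytes_
assert np.dtype("U").type is np.str_

# The pre-2.0 aliases were removed in numpy 2.0 and now raise AttributeError.
for old_name in ("string_", "unicode_"):
    try:
        getattr(np, old_name)
        print(f"np.{old_name} is still available (numpy < 2)")
    except AttributeError:
        print(f"np.{old_name} was removed (numpy >= 2)")
```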
1 parent c19a63a commit b163c5d

File tree

2 files changed: +2 -2 lines changed


tensorboard/compat/tensorflow_stub/dtypes.py (+1 -1)
@@ -674,7 +674,7 @@ def as_dtype(type_value):
         # dtype with a single constant (np.string does not exist) to decide
         # dtype is a "string" type. We need to compare the dtype.type to be
         # sure it's a string type.
-        if type_value.type == np.string_ or type_value.type == np.unicode_:
+        if type_value.type == np.bytes_ or type_value.type == np.str_:
             return string
 
     if isinstance(type_value, (type, np.dtype)):
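
As a side note (not from the commit), the effect of this check can be reproduced with plain numpy; the helper name is_string_dtype below is hypothetical and only mirrors the dtype.type comparison used above.

```python
import numpy as np

# Hypothetical helper mirroring the check in as_dtype()/GetNumpyAppendFn():
# variable-length string dtypes (e.g. "S10", "U5") cannot be compared against
# a single dtype constant, so the scalar type on dtype.type is compared instead.
def is_string_dtype(dtype):
    dtype = np.dtype(dtype)
    return dtype.type == np.bytes_ or dtype.type == np.str_

print(is_string_dtype("S10"))       # True: bytes strings of any length
print(is_string_dtype("U5"))        # True: unicode strings of any length
print(is_string_dtype(np.float32))  # False: numeric dtype
```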

tensorboard/util/tensor_util.py (+1 -1)
@@ -137,7 +137,7 @@ def GetNumpyAppendFn(dtype):
     # dtype with a single constant (np.string does not exist) to decide
     # dtype is a "string" type. We need to compare the dtype.type to be
     # sure it's a string type.
-    if dtype.type == np.string_ or dtype.type == np.unicode_:
+    if dtype.type == np.bytes_ or dtype.type == np.str_:
         return SlowAppendObjectArrayToTensorProto
     return GetFromNumpyDTypeDict(_NP_TO_APPEND_FN, dtype)
