np.logical_xor.accumulate fails on 1.24 on mac #22841
Comments
Again ping @seiko2plus and maybe @Developer-Ecosystem-Engineering. This looks like the same issue as gh-22840 (just slightly different, via the reduction). Presumably introduced by gh-22167. With a view (although I am not sure how important that is directly without digging in):
Memory overlaps! The comparison loops for all data types lack overlap checking:

numpy/numpy/core/src/umath/fast_loop_macros.h, lines 381 to 392 at 0d1bb8e
numpy/numpy/core/src/umath/loops_comparison.dispatch.c.src, lines 315 to 328 at 0d1bb8e
```diff
+        // multiply by sizeof(@type@) due to SIMD unroll
         /* argument one scalar */
-        if (IS_BLOCKABLE_BINARY_SCALAR1_BOOL(sizeof(@type@), NPY_SIMD_WIDTH)) {
+        if (IS_BLOCKABLE_BINARY_SCALAR1(sizeof(@type@), NPY_SIMD_WIDTH*sizeof(@type@))) {
             simd_binary_scalar1_@kind@_@sfx@(args, dimensions[0]);
             return;
         }
         /* argument two scalar */
-        else if (IS_BLOCKABLE_BINARY_SCALAR2_BOOL(sizeof(@type@), NPY_SIMD_WIDTH)) {
+        else if (IS_BLOCKABLE_BINARY_SCALAR2(sizeof(@type@), NPY_SIMD_WIDTH*sizeof(@type@))) {
             simd_binary_scalar2_@kind@_@sfx@(args, dimensions[0]);
             return;
         }
-        else if (IS_BLOCKABLE_BINARY_BOOL(sizeof(@type@), NPY_SIMD_WIDTH)) {
+        else if (IS_BLOCKABLE_BINARY(sizeof(@type@), NPY_SIMD_WIDTH*sizeof(@type@))) {
             simd_binary_@kind@_@sfx@(args, dimensions[0]);
             return;
         }
```
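To see why overlap matters specifically for `accumulate`, here is a reference implementation in Python (an illustrative sketch, not NumPy's actual inner loop): each step reads the element written on the previous iteration, so the input and output buffers of the underlying binary loop overlap, which is exactly the condition a SIMD fast path must detect and avoid.

```python
import numpy as np

def xor_accumulate_ref(a):
    """Reference accumulate for logical_xor (illustrative sketch only).

    out[i] depends on out[i-1], the value written one step earlier,
    so the binary inner loop's input and output memory overlap.
    """
    out = np.empty_like(a)
    out[0] = a[0]
    for i in range(1, len(a)):
        out[i] = np.logical_xor(out[i - 1], a[i])
    return out

a = np.array([True, False, True, True, False, True])
assert xor_accumulate_ref(a).tolist() == np.logical_xor.accumulate(a).tolist()
```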
Describe the issue:
accumulate on logical_xor gives a wrong result on the newest version. I suspect the same for bitwise_xor.
Reproduce the code example:
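(The original code example did not survive extraction. A minimal sketch exercising the reported path, with values chosen purely for illustration, might look like:)

```python
import numpy as np

a = np.array([True, False, True, False])
# On an unaffected build this accumulates as
# [a[0], a[0]^a[1], a[0]^a[1]^a[2], ...]:
print(np.logical_xor.accumulate(a))
# expected: [True, True, False, False]
```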
Error message:
No response
NumPy/Python version information:
NumPy 1.24
Python 3.8
macOS 12.5.1
Context for the issue:
No response