@stdlib/ndarray-base-ind2sub
Convert a linear index to an array of subscripts.
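As a concrete illustration, here is a minimal sketch of the conversion for a contiguous row-major array. The function name echoes the package, but the two-argument signature is a simplification for illustration; the package itself also accounts for strides, an index offset, array order, and an out-of-bounds mode.

```javascript
// Sketch: convert a linear index to subscripts for a contiguous
// row-major array. Illustrative only; not the package implementation.
function ind2sub( shape, idx ) {
    var ndims = shape.length;
    var sub = new Array( ndims );
    var i;
    for ( i = ndims - 1; i >= 0; i-- ) {
        // The remainder is the subscript along dimension `i`...
        sub[ i ] = idx % shape[ i ];

        // ...and the quotient carries to the next dimension:
        idx = ( idx - sub[ i ] ) / shape[ i ];
    }
    return sub;
}

console.log( ind2sub( [ 3, 3, 3 ], 17 ) );
// => [ 1, 2, 2 ]
```

Walking dimensions from last to first works because, in row-major order, the last dimension varies fastest.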
Related packages matching "strides" (minimal sketches of several of these operations follow the list):
Compute the minimum and maximum linear indices in an underlying data buffer which are accessible to an array view.
Given a stride array, determine array iteration order.
Determine the order of a multidimensional array based on a provided stride array.
Convert a linear index in an array view to a linear index in an underlying data buffer.
Determine if a buffer length is compatible with provided ndarray metadata.
Generate a stride array from an array shape.
Determine the index offset which specifies the location of the first indexed value in a multidimensional array based on a stride array.
Return the strides of a provided ndarray.
Return the stride along a specified dimension for a provided ndarray.
Given a stride array, determine whether an array is row-major.
Determine if an array is row-major contiguous.
Compute the maximum linear index in an underlying data buffer accessible to an array view.
Determine if an array is column-major contiguous.
Convert subscripts to a linear index.
Compute the minimum linear index in an underlying data buffer accessible to an array view.
Determine if an array is compatible with a single memory segment.
Return the minimum accessible index based on a set of provided strided array parameters.
Determine if an array is contiguous.
Given a stride array, determine whether an array is column-major.
Convert a linear index in an underlying data buffer to a linear index in an array view.
Return the maximum accessible index based on a set of provided strided array parameters.
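Several of the operations above reduce to short stride arithmetic. Generating a stride array from a shape is a running product over dimensions, taken in an order determined by the memory layout. A sketch, with `shape2strides` as an illustrative name:

```javascript
// Sketch: build a stride array from a shape for a given memory layout.
function shape2strides( shape, order ) {
    var ndims = shape.length;
    var strides = new Array( ndims );
    var s = 1;
    var i;
    if ( order === 'column-major' ) {
        // Column-major: first dimension varies fastest:
        for ( i = 0; i < ndims; i++ ) {
            strides[ i ] = s;
            s *= shape[ i ];
        }
    } else {
        // Row-major: last dimension varies fastest:
        for ( i = ndims - 1; i >= 0; i-- ) {
            strides[ i ] = s;
            s *= shape[ i ];
        }
    }
    return strides;
}

console.log( shape2strides( [ 3, 2 ], 'row-major' ) );
// => [ 2, 1 ]
console.log( shape2strides( [ 3, 2 ], 'column-major' ) );
// => [ 1, 3 ]
```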
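The index offset locating the first indexed value is nonzero exactly when some stride is negative, since a negative stride walks backward through the buffer. A sketch:

```javascript
// Sketch: compute the buffer index of element (0,0,...,0) given strides.
function strides2offset( shape, strides ) {
    var offset = 0;
    var i;
    for ( i = 0; i < shape.length; i++ ) {
        if ( strides[ i ] < 0 ) {
            // Shift forward so backward iteration stays in bounds:
            offset -= strides[ i ] * ( shape[ i ] - 1 );
        }
    }
    return offset;
}

console.log( strides2offset( [ 3, 2 ], [ -2, 1 ] ) );
// => 4
```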
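Iteration order can be read off a stride array alone: stride magnitudes that never increase from left to right indicate row-major order, and magnitudes that never decrease indicate column-major order. A sketch of the row-major check (the column-major check mirrors it):

```javascript
// Sketch: classify memory layout from stride magnitudes.
function isRowMajor( strides ) {
    var prev = Math.abs( strides[ 0 ] );
    var s;
    var i;
    for ( i = 1; i < strides.length; i++ ) {
        s = Math.abs( strides[ i ] );
        if ( s > prev ) {
            return false;
        }
        prev = s;
    }
    return true;
}

console.log( isRowMajor( [ 9, 3, 1 ] ) );
// => true
console.log( isRowMajor( [ 1, 3, 9 ] ) );
// => false
```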
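Converting subscripts to a linear index is the inverse of the `ind2sub` sketch near the top: a dot product of subscripts with strides plus the index offset (bounds handling elided here):

```javascript
// Sketch: map subscripts to a linear index in the underlying buffer.
function sub2ind( strides, offset, subscripts ) {
    var idx = offset;
    var i;
    for ( i = 0; i < strides.length; i++ ) {
        idx += subscripts[ i ] * strides[ i ];
    }
    return idx;
}

console.log( sub2ind( [ 9, 3, 1 ], 0, [ 1, 2, 2 ] ) );
// => 17
```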
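The minimum and maximum linear indices accessible to a view follow from accumulating, per dimension, the negative and positive stride contributions at the dimension's far edge. A sketch:

```javascript
// Sketch: compute the min and max buffer indices reachable by a view.
function minmaxViewBufferIndex( shape, strides, offset ) {
    var min = offset;
    var max = offset;
    var extent;
    var i;
    for ( i = 0; i < shape.length; i++ ) {
        extent = strides[ i ] * ( shape[ i ] - 1 );
        if ( extent > 0 ) {
            max += extent;
        } else {
            min += extent;
        }
    }
    return [ min, max ];
}

console.log( minmaxViewBufferIndex( [ 3, 2 ], [ -2, 1 ], 4 ) );
// => [ 0, 5 ]
```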
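Converting a linear index in a view to a linear index in the underlying buffer composes two earlier sketches: decompose the view index into subscripts, then apply strides and the offset. A sketch for a row-major view (the reverse mapping inverts these steps):

```javascript
// Sketch: map a linear index in a row-major view to the corresponding
// index in the underlying data buffer.
function vind2bind( shape, strides, offset, idx ) {
    var ind = offset;
    var s;
    var i;
    for ( i = shape.length - 1; i >= 0; i-- ) {
        // Subscript along dimension `i`:
        s = idx % shape[ i ];
        idx = ( idx - s ) / shape[ i ];

        // Accumulate the stride contribution:
        ind += s * strides[ i ];
    }
    return ind;
}

console.log( vind2bind( [ 3, 2 ], [ -2, 1 ], 4, 5 ) );
// => 1
```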
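The contiguity and compatibility predicates all compare an element count against the reachable index range: a view is compatible with a single memory segment when it addresses exactly (max - min + 1) buffer elements. A sketch, folding the min/max computation inline so it stands alone:

```javascript
// Sketch: a view fits a single memory segment when its element count
// equals the size of its reachable buffer index range.
function isSingleSegmentCompatible( shape, strides, offset ) {
    var min = offset;
    var max = offset;
    var n = 1;
    var ext;
    var i;
    for ( i = 0; i < shape.length; i++ ) {
        n *= shape[ i ];
        ext = strides[ i ] * ( shape[ i ] - 1 );
        if ( ext > 0 ) {
            max += ext;
        } else {
            min += ext;
        }
    }
    return ( n === ( max - min ) + 1 );
}

console.log( isSingleSegmentCompatible( [ 3, 2 ], [ 2, 1 ], 0 ) );
// => true
console.log( isSingleSegmentCompatible( [ 3, 2 ], [ 4, 1 ], 0 ) );
// => false
```

Under the same accounting, a buffer of length `len` is plausibly compatible with provided metadata when `min >= 0` and `max < len`, and the row-major/column-major contiguity checks additionally require a matching iteration order.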