@stdlib/ndarray-base-ind2sub
Convert a linear index to an array of subscripts.
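For intuition, the conversion can be sketched as follows. This is a simplified, hypothetical helper assuming a zero-offset, row-major array with element-unit strides; the actual package handles more general layouts.

```javascript
// Hypothetical sketch (not the package's API): convert a linear
// index into subscripts for a zero-offset, row-major array.
function ind2sub( shape, idx ) {
    var strides = [];
    var stride = 1;
    var out = [];
    var i;

    // Compute row-major strides (in units of elements), where the
    // last dimension varies fastest:
    for ( i = shape.length - 1; i >= 0; i-- ) {
        strides[ i ] = stride;
        stride *= shape[ i ];
    }
    // Resolve each subscript via integer division by its stride:
    for ( i = 0; i < shape.length; i++ ) {
        out.push( Math.floor( idx / strides[ i ] ) );
        idx -= out[ i ] * strides[ i ];
    }
    return out;
}

console.log( ind2sub( [ 3, 3, 3 ], 17 ) );
// => [ 1, 2, 2 ]
```

For a 3×3×3 array, the row-major strides are [9, 3, 1], so linear index 17 resolves to subscripts [1, 2, 2] (17 = 1·9 + 2·3 + 2·1).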
Determine the order of a multidimensional array based on a provided stride array.
Compute the minimum and maximum linear indices in an underlying data buffer which are accessible to an array view.
Given a stride array, determine array iteration order.
Determine if a buffer length is compatible with provided ndarray meta data.
Convert a linear index in an array view to a linear index in an underlying data buffer.
Determine the index offset which specifies the location of the first indexed value in a multidimensional array based on a stride array.
Generate a stride array from an array shape.
Return the strides of a provided ndarray.
Return the stride along a specified dimension for a provided ndarray.
Return the minimum accessible index based on a set of provided strided array parameters.
Given a stride array, determine whether an array is row-major.
Determine if an array is row-major contiguous.
Determine if an array is column-major contiguous.
Determine if an array is compatible with a single memory segment.
Determine if an array is contiguous.
Given a stride array, determine whether an array is column-major.
Compute the maximum linear index in an underlying data buffer accessible to an array view.
Compute the minimum linear index in an underlying data buffer accessible to an array view.
Convert a linear index in an underlying data buffer to a linear index in an array view.
Convert subscripts to a linear index.
Return the maximum accessible index based on a set of provided strided array parameters.
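Several of the operations above are companions of `ind2sub`; in particular, converting subscripts back to a linear index in an underlying data buffer reduces to a dot product of subscripts and strides, plus an index offset. A minimal, hypothetical sketch (the helper name and signature are illustrative, not the package's API):

```javascript
// Hypothetical sketch: map subscripts to a linear index in an
// underlying data buffer, given a stride array and the index
// offset of the first indexed element.
function sub2ind( strides, offset, subs ) {
    var idx = offset;
    var i;
    for ( i = 0; i < strides.length; i++ ) {
        idx += subs[ i ] * strides[ i ];
    }
    return idx;
}

// Column-major strides for a 2x3 array (first dimension varies
// fastest):
var strides = [ 1, 2 ];

console.log( sub2ind( strides, 0, [ 1, 2 ] ) );
// => 5
```

The same formula underpins the minimum and maximum accessible indices described above: each extreme is obtained by choosing, per dimension, the subscript (0 or shape[i]-1) whose product with the stride is smallest or largest, which matters when strides are negative.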