Slice

Description

Produces a slice of the input tensor along multiple axes. Similar to NumPy slicing and striding: https://numpy.org/doc/stable/user/basics.indexing.html?highlight=slice#slicing-and-striding
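As a quick illustration of the NumPy behavior that Slice mirrors (start, exclusive end, and step, including negative steps for backward slicing):

```python
import numpy as np

x = np.arange(10)      # [0 1 2 3 4 5 6 7 8 9]
print(x[2:7])          # start=2, end=7 (exclusive) -> [2 3 4 5 6]
print(x[2:7:2])        # step=2                     -> [2 4 6]
print(x[8:1:-2])       # negative step goes backward -> [8 6 4 2]
```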

Slice uses the starts, ends, axes, and steps inputs to select a sub-tensor of its input data tensor.

An effective starts[i], ends[i], and steps[i] must be computed for each i in [0, ... r-1], where r = rank(input), as follows:

If axes are omitted, they are set to [0, ..., r-1]. If steps are omitted, they are set to [1, ..., 1] of length len(starts).

The effective values are initialized as starts[i] = 0, ends[i] = dims[i] (where dims are the dimensions of input), and steps[i] = 1.

All negative elements of axes are made non-negative by adding r to them, where r = rank(input).

All negative values in starts[i] and ends[i] have dims[axes[i]] added to them, where dims are the dimensions of input. Then starts[axes[i]] is set to the adjusted starts[i], clamped into the range [0, dims[axes[i]]] for positive stepping and [0, dims[axes[i]]-1] for negative stepping.

The clamping for the adjusted ends[i] depends on the sign of steps[i] and must accommodate copying 0 through dims[axes[i]] elements, so for positive stepping ends[axes[i]] is clamped to [0, dims[axes[i]]], while for negative stepping it is clamped to [-1, dims[axes[i]]-1].

Finally, steps[axes[i]] = steps[i].

For slicing to the end of a dimension with unknown size, it is recommended to pass in INT_MAX when slicing forward and INT_MIN when slicing backward.
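The steps above can be sketched in NumPy. `slice_tensor` below is a hypothetical helper written for illustration, not part of the library; it normalizes starts/ends/steps exactly as described and then applies standard NumPy indexing:

```python
import numpy as np

def slice_tensor(data, starts, ends, axes=None, steps=None):
    """Sketch of the Slice semantics described above (illustrative only)."""
    r = data.ndim
    dims = data.shape
    # If axes are omitted, default to [0, ..., r-1]; if steps are omitted, default to 1s.
    if axes is None:
        axes = list(range(len(starts)))
    if steps is None:
        steps = [1] * len(starts)
    # Effective values: start=0, end=dims[i], step=1 for every axis.
    eff_starts, eff_ends, eff_steps = [0] * r, list(dims), [1] * r
    for i, axis in enumerate(axes):
        if axis < 0:
            axis += r                                  # make negative axes non-negative
        s, e, st = starts[i], ends[i], steps[i]
        if s < 0:
            s += dims[axis]                            # negative indices count from the back
        if e < 0:
            e += dims[axis]
        if st > 0:
            s = min(max(s, 0), dims[axis])             # clamp start to [0, dims]
            e = min(max(e, 0), dims[axis])             # clamp end to [0, dims]
        else:
            s = min(max(s, 0), dims[axis] - 1)         # clamp start to [0, dims-1]
            e = min(max(e, -1), dims[axis] - 1)        # clamp end to [-1, dims-1]
        eff_starts[axis], eff_ends[axis], eff_steps[axis] = s, e, st
    # end == -1 with a negative step means "run past the first element",
    # which NumPy expresses as a stop of None.
    slices = tuple(
        slice(b, None if (st < 0 and e == -1) else e, st)
        for b, e, st in zip(eff_starts, eff_ends, eff_steps)
    )
    return data[slices]

print(slice_tensor(np.arange(10), [8], [1], [0], [-2]))  # -> [8 6 4 2]
```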

Input parameters

specified_outputs_name : array, this parameter lets you manually assign custom names to the output tensors of a node.

 Graphs in : cluster, ONNX model architecture.

data (heterogeneous) – T : object, tensor of data to extract slices from.
starts (heterogeneous) – Tind : object, 1-D tensor of starting indices of corresponding axis in axes.
ends (heterogeneous) – Tind : object, 1-D tensor of ending indices (exclusive) of corresponding axis in axes.
axes (optional, heterogeneous) – Tind : object, 1-D tensor of axes that starts and ends apply to. Negative value means counting dimensions from the back. Accepted range is [-r, r-1] where r = rank(data). Behavior is undefined if an axis is repeated.
steps (optional, heterogeneous) – Tind : object, 1-D tensor of slice step of corresponding axis in axes. Negative value means slicing backward. ‘steps’ cannot be 0. Defaults to 1s.
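As a hypothetical illustration of how these inputs map onto NumPy indexing (the starts/ends/axes/steps values below are made up for the example):

```python
import numpy as np

data = np.arange(20).reshape(4, 5)

# starts=[1], ends=[4], axes=[1], steps=[2]: slice only axis 1,
# leaving axis 0 untouched. Equivalent NumPy indexing:
out = data[:, 1:4:2]
print(out.shape)  # -> (4, 2)

# Negative axis: axes=[-1] also refers to the last axis (rank 2, so -1 -> 1).
# steps=[-1] with starts=[4], ends=[-6] reverses axis 1
# (-6 + 5 = -1, which signals "run past the first element"):
rev = data[:, 4::-1]
print(np.array_equal(rev, data[:, ::-1]))  # -> True
```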

 Parameters : cluster,

 training? : boolean, whether the layer is in training mode (can store data for backward).
Default value “True”.
 lda coeff : float, defines the coefficient by which the loss derivative will be multiplied before being sent to the previous layer (since during the backward run we go backwards).
Default value “1”.

 name (optional) : string, name of the node.

Output parameters

output (heterogeneous) – T : object, sliced data tensor.

Type Constraints

T in (tensor(bfloat16), tensor(bool), tensor(complex128), tensor(complex64), tensor(double), tensor(float), tensor(float16), tensor(int16), tensor(int32), tensor(int64), tensor(int8), tensor(string), tensor(uint16), tensor(uint32), tensor(uint64), tensor(uint8)) : Constrain input and output types to all tensor types.

Tind in (tensor(int32), tensor(int64)) : Constrain indices to integer types.

Example

All these examples are PNG snippets; you can drop a snippet onto the block diagram and the depicted code will be added to your VI (do not forget to install the Deep Learning library to run it).