class MXNet::Gluon::NN::Conv1D

Overview

1D convolution layer (e.g. temporal convolution).

This layer creates a convolution kernel that is convolved with the input over a single spatial (or temporal) dimension to produce a tensor of outputs. If use_bias is true, a bias vector is created and added to the outputs. If activation is not nil, the activation is applied to the outputs. If in_channels is not specified, parameter initialization will be deferred to the first time #forward is called and in_channels will be inferred from the shape of input data.
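
For illustration, here is a minimal usage sketch (not taken from the library's own documentation). The shapes are arbitrary, and MXNet::NDArray.zeros is assumed to be available only as a convenient way to produce input data of the right shape; everything else uses methods listed on this page.

require "mxnet"

# Sketch: a Conv1D layer with 16 output channels and a width-3 kernel.
# in_channels is left at its default (0), so it is inferred from the
# input's shape the first time the layer is run.
conv = MXNet::Gluon::NN::Conv1D.new(channels: 16, kernel_size: [3])
conv.init(ctx: MXNet.cpu)

# Layout is "NCW": (batch, channels, width).
# NDArray.zeros is assumed here purely to create dummy data.
x = MXNet::NDArray.zeros([8, 4, 100])

# Block#call takes and returns an array of NDArrays.
y = conv.call([x]).first
# With kernel_size 3, no padding and stride 1, the output width is
# 100 - 3 + 1 = 98, so y has shape [8, 16, 98] (in_channels inferred as 4).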

Defined in:

mxnet/gluon/nn/layers.cr

Constructors

new(*, channels, kernel_size, strides = 1, padding = 0, dilation = 1, layout = "NCW", in_channels = 0, use_bias = true, activation = nil, **kwargs)

Instance methods inherited from class MXNet::Gluon::NN::Internal::Conv

act, act=(act : MXNet::Gluon::NN::Activation?), act? : MXNet::Gluon::NN::Activation?, bias, bias=(bias : MXNet::Gluon::Parameter?), bias? : MXNet::Gluon::Parameter?, hybrid_forward(inputs : Array(T), params : Hash(String, T)) : Array(T) forall T, weight, weight=(weight : MXNet::Gluon::Parameter?), weight? : MXNet::Gluon::Parameter?

Constructor methods inherited from class MXNet::Gluon::NN::Internal::Conv

new(*, channels : Int32, kernel_size : Array(Int32), strides : Array(Int32) | Int32, padding : Array(Int32) | Int32, dilation : Array(Int32) | Int32, layout : String, in_channels = 0, use_bias = true, activation = nil, **kwargs)

Instance methods inherited from class MXNet::Gluon::HybridBlock

export(filename, epoch = 0), forward(inputs : Array(T)) : Array(T) forall T, hybrid_forward(inputs : Array(T), params : Hash(String, T) = {} of String => T) : Array(T) forall T, hybridize(active = true, flags = {} of String => String), register_child(block, name = nil)

Instance methods inherited from module MXNet::Gluon::CachedGraph

clear_cache, infer_dtype(args), infer_shape(args)

Constructor methods inherited from module MXNet::Gluon::CachedGraph

new(**kwargs)

Instance methods inherited from class MXNet::Gluon::Block

call(inputs : Array(T)) : Array(T) forall T, children, collect_params(selector = nil), forward(inputs : Array(T)) : Array(T) forall T, get_attr(name : String) : Block | Parameter | Nil, hybridize(active = true), init(init = nil, ctx = nil, force_reinit = false), load_parameters(fname, ctx = MXNet.cpu, allow_missing = false, ignore_extra = false), params : MXNet::Gluon::ParameterDict, prefix : String, register_child(block, name = nil), register_parameter(param, name = nil), save_parameters(fname), scope : MXNet::Gluon::BlockScope?, set_attr(name : String, value : Block | Parameter | Nil), with_name_scope(&)

Constructor methods inherited from class MXNet::Gluon::Block

new(prefix = nil, params = nil)

Constructor Detail

def self.new(*, channels, kernel_size, strides = 1, padding = 0, dilation = 1, layout = "NCW", in_channels = 0, use_bias = true, activation = nil, **kwargs) #

Creates a new instance.

Parameters

  • channels (Int32) The dimensionality of the output space (the number of output channels in the convolution).
  • kernel_size (Array(Int32) of 1 integer) Specifies the dimensions of the convolution window.
  • strides (Int32 or Array(Int32) of 1 integer, default = 1) Specifies the strides of the convolution.
  • padding (Int32 or Array(Int32) of 1 integer, default = 0) If padding is non-zero, the input is implicitly zero-padded on both sides by padding points.
  • dilation (Int32 or Array(Int32) of 1 integer, default = 1) Specifies the dilation rate to use for dilated convolution.
  • layout (String, default = "NCW") Dimension ordering of data and weight. Only the "NCW" layout is supported for now. "N", "C", and "W" stand for the batch, channel, and width (time) dimensions, respectively. Convolution is applied on the "W" dimension.
  • in_channels (Int32, default = 0) The number of input channels to this layer. If not specified, initialization will be deferred to the first time #forward is called and in_channels will be inferred from the shape of the input data.
  • use_bias (Bool, default = true) Whether the layer uses a bias vector.
  • activation (String, optional) Activation function to use. If not specified, no activation is applied (i.e. "linear" activation: a(x) = x).

[View source]
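
As a hedged sketch of these constructor options (the parameter values are chosen purely for illustration, and "relu" is assumed to be an accepted activation name):

require "mxnet"

# Explicit in_channels, stride-2 convolution with zero-padding of 1 on
# each side, a bias vector, and a ReLU activation applied to the outputs.
conv = MXNet::Gluon::NN::Conv1D.new(
  channels: 32,
  kernel_size: [5],
  strides: 2,
  padding: 1,
  in_channels: 4,
  use_bias: true,
  activation: "relu"
)
conv.init(ctx: MXNet.cpu)

# Optional: as a HybridBlock, the layer can be compiled into a cached
# symbolic graph.
conv.hybridize

# For an input of width W, the output width is
#   (W + 2 * padding - dilation * (kernel_size - 1) - 1) // strides + 1
# e.g. W = 100 gives (100 + 2 - 4 - 1) // 2 + 1 = 49.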