# Layer Types

All 48 layer types supported by NNV. Every layer implements `evaluate(x)`
for the forward pass and `reach(inputSet, method, ...)` for reachability.
## Common Interface

All layers follow this interface:

```matlab
% Forward pass
y = layer.evaluate(x)

% Reachability (called internally by NN.reach)
outputSet = layer.reach(inputSet, method, option, relaxFactor, dis_opt, lp_solver)

% Static: parse from a MATLAB layer (used by matlab2nnv)
L = LayerClass.parse(matlab_layer)
```
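For intuition about what this interface abstracts, here is a minimal Python sketch (not NNV's MATLAB code): a layer exposes a concrete forward pass and a set-based counterpart. Axis-aligned interval boxes stand in for NNV's star sets, and the class name is purely illustrative.

```python
import numpy as np

class ToyReLULayer:
    """Toy stand-in for an NNV layer: evaluate() maps a concrete input,
    reach() maps an input *set* (here an interval box, not a star set)."""

    def evaluate(self, x):
        # Concrete forward pass.
        return np.maximum(x, 0.0)

    def reach(self, lb, ub):
        # ReLU is monotone, so applying it elementwise to the bounds
        # gives the exact output box for a box input.
        return np.maximum(lb, 0.0), np.maximum(ub, 0.0)

layer = ToyReLULayer()
y = layer.evaluate(np.array([-1.0, 2.0]))                      # -> [0, 2]
lb, ub = layer.reach(np.array([-1.0, -0.5]), np.array([0.5, 1.0]))
# -> lower bounds [0, 0], upper bounds [0.5, 1]
```

Real NNV layers take richer set representations and method options (`'exact-star'`, `'approx-star'`, ...), but the evaluate/reach pairing is the same shape.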
## Input Layers

| Layer | Constructor | Notes |
|---|---|---|
|  |  | Feature vector input |
|  |  | 2D image input (H x W x C) |
|  |  | 3D volume input (H x W x D x C) |
|  |  | Sequential/time-series input |
## Linear Layers

| Layer | Constructor | Notes |
|---|---|---|
|  |  | Dense: `y = Wx + b` |
|  |  | Per-element: `y = scale * x + offset` |
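Affine layers like these are the easy case for reachability: an affine map sends a box to a set whose tightest bounding box can be computed exactly by splitting the weight matrix into its positive and negative parts. A sketch with intervals (illustrative math, not NNV's star-set implementation):

```python
import numpy as np

def dense_reach(W, b, lb, ub):
    """Tightest elementwise bounds on y = W @ x + b over x in [lb, ub]."""
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    lo = Wp @ lb + Wn @ ub + b   # negative weights take the upper input bound
    hi = Wp @ ub + Wn @ lb + b   # and vice versa
    return lo, hi

W = np.array([[1.0, -2.0]])
b = np.array([0.5])
lo, hi = dense_reach(W, b, np.array([0.0, 0.0]), np.array([1.0, 1.0]))
# -> lo = [-1.5], hi = [1.5]
```

Star-set methods go further: they keep the affine map symbolic, so no over-approximation is introduced at linear layers at all.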
## Convolutional Layers

| Layer | Constructor | Notes |
|---|---|---|
|  |  | 1D convolution with stride, padding, dilation |
|  |  | 2D convolution (workhorse for CNN verification) |
|  |  | 3D convolution for volumetric data |
|  |  | 1D deconvolution |
|  |  | 2D deconvolution (segmentation decoders) |
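A convolution is itself a linear map, which is why these layers slot into set-based verification the same way dense layers do. As a reference point, a minimal single-channel 2D cross-correlation with stride and zero padding (illustrative only; NNV's layers handle multi-channel tensors and dilation):

```python
import numpy as np

def conv2d(x, k, stride=1, pad=0):
    """Single-channel 2D cross-correlation (the per-channel core of a
    conv layer) with stride and symmetric zero padding."""
    x = np.pad(x, pad)
    kh, kw = k.shape
    oh = (x.shape[0] - kh) // stride + 1
    ow = (x.shape[1] - kw) // stride + 1
    y = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = x[i*stride:i*stride+kh, j*stride:j*stride+kw]
            y[i, j] = np.sum(patch * k)
    return y

x = np.arange(16.0).reshape(4, 4)
k = np.ones((2, 2))
out = conv2d(x, k, stride=2)   # 2x2 block sums: [[10, 18], [42, 50]]
```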
## Pooling Layers

| Layer | Constructor | Notes |
|---|---|---|
|  |  | 2D max pooling |
|  |  | 2D average pooling |
|  |  | 3D average pooling |
|  |  | Global average over 1D spatial dim |
|  |  | Global average over 2D spatial dims |
|  |  | Global average over 3D spatial dims |
|  |  | Unpooling for segmentation |
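Average pooling is affine and therefore exact under set propagation, while max pooling is piecewise-linear and can force case splits, much like ReLU. For reference, the concrete forward pass of non-overlapping 2D max pooling (a sketch; NNV's layer also supports strides and padding):

```python
import numpy as np

def maxpool2d(x, size=2):
    """Non-overlapping 2D max pooling (stride equals pool size)."""
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]      # drop ragged edges
    blocks = x.reshape(h // size, size, w // size, size)
    return blocks.max(axis=(1, 3))           # max over each size x size tile

x = np.arange(16.0).reshape(4, 4)
out = maxpool2d(x)   # tile maxima: [[5, 7], [13, 15]]
```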
## Activation Layers

| Layer | Constructor | Notes |
|---|---|---|
|  |  | ReLU: exact-star and approx-star reachability |
|  |  | Leaky ReLU: `y = max(alpha*x, x)` |
|  |  | Sigmoid activation |
|  |  | Hyperbolic tangent |
|  |  | Piecewise-linear sigmoid approximation |
|  |  | Clip to [0, 1] |
|  |  | Clip to [-1, 1] |
|  |  | Sign/step function |
|  |  | Softmax for classification output |
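The exact-star/approx-star distinction for ReLU comes down to case splitting versus over-approximation: exact methods split each unstable neuron into its `x <= 0` and `x >= 0` cases (doubling the set count in the worst case), while approximate methods bound the neuron with linear constraints. A sketch of the standard single-neuron triangle relaxation on an interval `[l, u]` (illustrative math, not NNV code):

```python
def relu_triangle(l, u):
    """Upper bounding line (slope, intercept) for ReLU over [l, u].
    The lower bound is max(x, 0) itself; exact methods would split
    the unstable case into two sets instead of relaxing it."""
    if u <= 0.0:        # neuron always inactive: output is 0
        return 0.0, 0.0
    if l >= 0.0:        # neuron always active: output is x
        return 1.0, 0.0
    slope = u / (u - l)             # line through (l, 0) and (u, u)
    return slope, -slope * l

coeffs = relu_triangle(-1.0, 3.0)   # -> (0.75, 0.75)
```

The `relaxFactor` argument of `reach` controls how aggressively such relaxations are applied in place of exact splits.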
## Normalization Layers

| Layer | Constructor | Notes |
|---|---|---|
|  |  | Fused with preceding linear layer during verification |
|  |  | Layer normalization |
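Fusing batch normalization into the preceding linear layer is pure algebra: at inference time both are affine maps, so they compose into a single one and introduce no approximation error. A sketch of the folding (illustrative; parameter names are generic, not NNV's):

```python
import numpy as np

def fuse_bn(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold y = gamma * (Wx + b - mean) / sqrt(var + eps) + beta
    into a single affine layer y = W' x + b'."""
    s = gamma / np.sqrt(var + eps)
    return s[:, None] * W, s * (b - mean) + beta

rng = np.random.default_rng(0)
W, b = rng.normal(size=(3, 2)), rng.normal(size=3)
gamma, beta = rng.normal(size=3), rng.normal(size=3)
mean, var = rng.normal(size=3), rng.uniform(0.5, 2.0, size=3)
x = rng.normal(size=2)

fW, fb = fuse_bn(W, b, gamma, beta, mean, var)
unfused = gamma * (W @ x + b - mean) / np.sqrt(var + 1e-5) + beta
ok = np.allclose(fW @ x + fb, unfused)   # fused layer matches
```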
## Graph Layers

| Layer | Constructor | Notes |
|---|---|---|
|  |  | Graph Convolutional Network layer |
|  |  | Graph Isomorphism Network with Edge features |
## Recurrent Layers

| Layer | Constructor | Notes |
|---|---|---|
|  |  | LSTM cell |
|  |  | Simple RNN with `Wi`, `Wh`, `bh`, `Wo` kernels |
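Given the `Wi`, `Wh`, `bh`, `Wo` naming above, one plausible reading is the classic Elman-style update: hidden state from input and previous hidden, then a linear readout. A sketch under that assumption (the exact NNV semantics may differ):

```python
import numpy as np

def rnn_step(x, h, Wi, Wh, bh, Wo):
    """One assumed recurrent step: h' = tanh(Wi x + Wh h + bh), y = Wo h'."""
    h_new = np.tanh(Wi @ x + Wh @ h + bh)
    return h_new, Wo @ h_new

Wi = np.eye(2); Wh = np.zeros((2, 2)); bh = np.zeros(2); Wo = np.ones((1, 2))
h, y = rnn_step(np.zeros(2), np.zeros(2), Wi, Wh, bh, Wo)
# zero input and zero state stay at zero
```

Reachability through recurrent layers unrolls this step over the sequence, propagating the set through each time step in turn.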
## Reshaping & Combination Layers

FlattenLayer, ReshapeLayer, Resize2DLayer, UpsampleLayer,
AdditionLayer, ConcatenationLayer, DepthConcatenationLayer
## Special Layers

ODEblockLayer (Neural ODE integration), PixelClassificationLayer
(segmentation output), LayerS (custom layer support)