Supported Architectures & Layers¶
NNV supports verification of 10+ neural network architecture types
through a unified NN class and 48 individual layer types.
Network Types¶
| Architecture | Description | Key Set Type |
|---|---|---|
| Feed-Forward (FFNN) | Fully-connected layers with activation functions | Star |
| Convolutional (CNN) | Conv2D, pooling, batch normalization for image classification | ImageStar |
| Recurrent (RNN/LSTM) | Sequential data with recurrent/LSTM cells | Star (per time step) |
| Semantic Segmentation | Pixel-level classification (transposed conv, unpooling) | ImageStar |
| Neural ODE | Continuous-time dynamics learned as ODE blocks | Star + Zono (via CORA) |
| Graph Neural Network | GCN and GINE layers for graph-structured data | GraphStar |
| Binary Neural Network | Networks with binary weights/activations | Star |
| Vision Transformer | Attention-based image classification | Star |
| 3D / Volumetric CNN | Conv3D for medical images and video classification | VolumeStar |
| Time-Dependent NN | Variable-length sequence verification | Star (union over time horizons) |
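The set types in the table are built directly from numeric bounds. As a minimal sketch, assuming the bound-based `Star(lb, ub)` and `ImageStar(LB, UB)` constructors used in the NNV tutorials:

```matlab
% Sketch (assumed constructors): build the two most common input set types.

% Star: a box [lb, ub] in R^n, the usual input set for FFNN verification
lb = [-1; -1];              % per-dimension lower bounds
ub = [ 1;  1];              % per-dimension upper bounds
S  = Star(lb, ub);          % star set covering the box

% ImageStar: a pixel-wise L_inf ball around an image, for CNN verification
img     = rand(32, 32, 3);  % nominal image (H x W x C)
epsilon = 0.01;             % perturbation radius
LB = img - epsilon;         % lower-bound image
UB = img + epsilon;         % upper-bound image
IS = ImageStar(LB, UB);     % image star covering all perturbed images
```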
Feature Overview¶
The table below summarizes the major features available across NNV versions 1.0 through NNV3.
| Feature | Supported (NNV 1.0, NNV 2.0, NNV3) |
|---|---|
| Network Types | FFNN, CNN, NeuralODE, SSNN, RNN, GNN, TDNN, 3D CNN |
| Layers | MaxPool, Conv, BN, AvgPool, FC, MaxUnpool, TC, DC, NODE, GCN, GINE, Conv3D |
| Activations | ReLU, Satlin, Sigmoid, Tanh, Leaky ReLU, Satlins |
| Plant Dynamics | Linear ODE, Nonlinear ODE, Continuous & Discrete Time, Hybrid Automata |
| Set Representations | Polyhedron, Zonotope, Star, ImageStar, VolumeStar, ModelStar, GraphStar |
| Reach Methods | exact, approx, abs-dom, relax-* |
| Visualization | Exact and over-approximate reachable sets |
| Verification | Safety, Robustness, VNNLIB, Fairness, Weight Perturbation, Probabilistic |
| Miscellaneous | Parallel computing, counterexample generation, ONNX, CI/CD |
NNV vs Other Tools¶
The following table compares NNV3 against 13 other verification tools across supported architectures and applications. A ~ indicates possible or partial support.
| | NNV3 | a,b-Crown | CORA | FastBATLLNN | Marabou | NeuralSAT | NeVer2 | nnenum | ReachNN | Reluplex | SobolBox | Verinet | Verisig | StarV |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| FFNN | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes | - | Yes | Yes | Yes | - | Yes |
| CNN | Yes | Yes | Yes | - | Yes | Yes | Yes | Yes | - | - | Yes | Yes | - | Yes |
| RNN | Yes | Yes | - | - | - | - | - | - | - | - | - | - | - | Yes |
| SSNN | Yes | ~ | - | - | - | - | - | - | - | - | - | - | - | Yes |
| TDNNs | Yes | - | - | - | - | - | - | - | - | - | - | - | - | - |
| GNNs | Yes | - | Yes | - | ~ | - | - | - | - | - | - | - | - | - |
| 3D Data | Yes | - | - | - | - | - | - | - | - | - | - | - | - | - |
| Weight Perturbation | Yes | - | - | - | - | - | - | - | - | - | - | - | - | - |
| Neural ODEs | Yes | - | - | - | - | - | - | - | - | - | - | - | - | Yes |
| NNCS | Yes | ~ | Yes | - | - | - | - | - | Yes | - | - | - | Yes | Yes |
The NN Class¶
The NN class is the unified interface for all feed-forward-style networks
(FFNNs, CNNs, segmentation nets, BNNs, etc.):
```matlab
% Create from a cell array of layers
layers = {FullyConnectedLayer(W1, b1), ReLULayer(), FullyConnectedLayer(W2, b2)};
net = NN(layers);

% Or convert from MATLAB/ONNX format
dlnet = importNetworkFromONNX('model.onnx', InputDataFormats='BC');
net = matlab2nnv(dlnet);
```
Key Properties:
| Property | Type | Description |
|---|---|---|
| | cell array | Ordered list of layer objects |
| | table | DAG connectivity (for non-sequential architectures) |
| | string | Network name |
| | int | Number of layers |
| | int | Total neuron count |
| | array | Input dimensions |
| | array | Output dimensions |
Key Methods:
| Method | Description |
|---|---|
| | Forward pass on a concrete input vector/image |
| | Compute reachable output set(s) from an input set |
| | Verify a safety property over an input region |
| | Verify properties from a VNNLIB specification file |
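A rough sketch of how these methods chain together in practice. The `evaluate` and `reach` method names and the `reachOptions.reachMethod` field follow the NNV tutorials; the `Star(lb, ub)` constructor and the concrete values are assumptions for illustration:

```matlab
% Sketch (assumed API): concrete evaluation, then set-based reachability.
x = [0.5; -0.2];                           % concrete input vector
y = net.evaluate(x);                       % forward pass on the point

S = Star([0.4; -0.3], [0.6; -0.1]);        % box input set around x
reachOptions = struct;
reachOptions.reachMethod = 'approx-star';  % over-approximate analysis
R = net.reach(S, reachOptions);            % reachable output set(s)
```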
The GNN Class¶
The GNN class handles graph neural networks with adjacency-based message passing:
```matlab
% Create GNN with layers and graph structure
gnn = GNN(layers, A_norm, adj_list);

% Compute reachable set for node feature perturbation
input_set = GraphStar(NF, LB, UB);
output_sets = gnn.reach(input_set, reachOptions);
```
Key Properties:
| Property | Type | Description |
|---|---|---|
| | cell array | GCN/GINE and activation layers |
| | matrix | Normalized adjacency matrix |
| | matrix | Edge list representation |
| | matrix | Edge feature matrix (for GINE layers) |
| | vector | Optional edge weight vector |
Layer Reference¶
NNV implements 48 layer types organized into the following categories.
Input Layers¶
| Layer | Description |
|---|---|
| | Input layer for feature vectors |
| | Input layer for 2D images (H x W x C) |
| | Input layer for 3D volumes (H x W x D x C) |
| | Input layer for sequential/time-series data |
| | Generic placeholder for unsupported import layers |
Linear / Affine Layers¶
| Layer | Description |
|---|---|
| | Dense layer: y = Wx + b |
| | Per-element scale and offset: y = scale * x + offset |
Convolutional Layers¶
| Layer | Description |
|---|---|
| | 1D convolution with stride, padding, dilation |
| | 2D convolution for images (the workhorse of CNN verification) |
| | 3D convolution for volumetric data / video frames |
| | 1D transposed convolution (deconvolution) |
| | 2D transposed convolution (used in segmentation decoders) |
Pooling Layers¶
| Layer | Description |
|---|---|
| | 2D max pooling with configurable pool size and stride |
| | 2D average pooling |
| | 3D average pooling for volumetric data |
| | Global average over 1D spatial dimension |
| | Global average over 2D spatial dimensions |
| | Global average over 3D spatial dimensions |
| | Unpooling for segmentation decoder networks |
Activation Layers¶
| Layer | Description |
|---|---|
| | ReLU: y = max(0, x) – supports exact and approximate-star reachability |
| | Leaky ReLU: y = max(alpha*x, x) |
| | Sigmoid: y = 1/(1 + exp(-x)) |
| | Hyperbolic tangent |
| | Piecewise-linear approximation of sigmoid |
| | Clipped linear: y = clip(x, 0, 1) |
| | Symmetric saturation: y = clip(x, -1, 1) |
| | Sign/step function |
| | Softmax for classification output |
| | Base class for custom activation functions |
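The exact/approximate distinction for ReLU reachability can be sketched as follows. Assumed details (taken from the NNV tutorials, not guaranteed here): a bound-based `Star(lb, ub)` constructor and `'exact-star'` / `'approx-star'` values for `reachOptions.reachMethod`:

```matlab
% Sketch (assumed API): exact vs. over-approximate star reachability.
S = Star([-1; -1], [1; 1]);                % box input set
reachOptions = struct;

reachOptions.reachMethod = 'exact-star';   % sound and complete; each ReLU may
R_exact = net.reach(S, reachOptions);      % split a star (exponential worst case)

reachOptions.reachMethod = 'approx-star';  % sound over-approximation; keeps a
R_approx = net.reach(S, reachOptions);     % single star, much faster
```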
Normalization Layers¶
| Layer | Description |
|---|---|
| | Batch normalization (fused with preceding conv/FC during verification) |
| | Layer normalization |
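The batch-norm fusion mentioned above is plain affine arithmetic, not a verification step. A sketch of the algebra (illustrative only, not NNV's implementation): substituting z = W*x + b into y = gamma.*(z - mu)./sqrt(v + e) + beta gives an equivalent single affine layer.

```matlab
% Fold BN parameters into a preceding fully-connected layer (example values).
W     = [1 2; 3 4];  b    = [0.5; -0.5];   % FC weights and bias
gamma = [2; 0.5];    beta = [0.1; 0.2];    % BN scale and offset
mu    = [0; 1];      v    = [1; 4];        % BN running mean and variance
e     = 1e-5;                              % numerical epsilon

s       = gamma ./ sqrt(v + e);  % per-channel scale factor
W_fused = s .* W;                % row-wise scaling via implicit expansion
b_fused = s .* (b - mu) + beta;  % fold mean and offset into the bias
% For every x: gamma.*((W*x + b) - mu)./sqrt(v + e) + beta == W_fused*x + b_fused
```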
Graph Neural Network Layers¶
| Layer | Description |
|---|---|
| | Graph Convolutional Network layer (Kipf & Welling) |
| | Graph Isomorphism Network with Edge features |
Recurrent Layers¶
| Layer | Description |
|---|---|
| | Long Short-Term Memory cell with input/recurrent/output gates |
| | Generic simple recurrent layer (Wi, Wh kernels) |
Reshaping & Structural Layers¶
| Layer | Description |
|---|---|
| | Flatten multi-dimensional input to 1D vector |
| | Reshape tensor dimensions |
| | Reshape for concatenation compatibility |
| | Resize spatial dimensions (interpolation) |
| | Upsample spatial dimensions |
Combination Layers¶
| Layer | Description |
|---|---|
| | Element-wise addition of multiple inputs (for skip connections) |
| | Channel/feature concatenation |
| | Depth-wise concatenation |
Special Layers¶
| Layer | Description |
|---|---|
| | Neural ODE integration block (wraps LinearODE/NonLinearODE) |
| | Pixel-level classification output (for segmentation) |
| | Custom/generic layer support |
Model Loading Utilities¶
NNV provides several utilities for loading models from different formats:
```matlab
% From ONNX format
dlnet = importNetworkFromONNX('model.onnx', InputDataFormats='BC');
net = matlab2nnv(dlnet);

% From MATLAB deep learning format
net = matlab2nnv(trained_network); % SeriesNetwork, DAGNetwork, or dlnetwork

% Using the dedicated ONNX converter
net = onnx2nnv('model.onnx');

% From .mat file
net = load_NN_from_mat('network.mat');
```
See ONNX & VNNLIB Support for detailed ONNX and VNNLIB workflow documentation.