MarabouNetworkNNet¶
- Top contributors (to current version):
Christopher Lazarus
Andrew Wu
Kyle Julian
Alex Usvyatsov
This file is part of the Marabou project. Copyright (c) 2017-2019 by the authors listed in the file AUTHORS in the top-level source directory and their institutional affiliations. All rights reserved. See the file COPYING in the top-level source directory for licensing information.
MarabouNetworkNNet represents neural networks with piecewise linear constraints derived from the NNet format
-
class maraboupy.MarabouNetworkNNet.MarabouNetworkNNet(filename='', normalize=False)[source]¶
Bases: maraboupy.MarabouNetwork.MarabouNetwork
Class representing a Marabou network from an NNet file. Inherits from the MarabouNetwork class.
-
weights[source]¶ Network weight matrices, where the outer index corresponds to layer number
- Type
list
- Parameters
filename (str) – Path to the .nnet file
normalize (bool) – True if network parameters should be adjusted to incorporate network input/output normalization. Otherwise, properties must be written with the normalization already incorporated.
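For orientation, a minimal loading sketch (the file path below is hypothetical; any file in the NNet format works). The network can be built through the top-level Marabou.read_nnet helper or by constructing the class directly:

```python
from maraboupy import Marabou
from maraboupy.MarabouNetworkNNet import MarabouNetworkNNet

# Hypothetical path; substitute any network stored in the NNet format
network = Marabou.read_nnet("acas_xu_1_1.nnet")

# Direct construction; normalize=True folds the input/output normalization
# constants stored in the NNet file into the network parameters
normalized = MarabouNetworkNNet("acas_xu_1_1.nnet", normalize=True)

print(len(network.weights))  # one weight matrix per layer
```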
-
addAbsConstraint(b, f)[source]¶ Function to add a new Abs constraint
- Parameters
b (int) – Variable representing input of the Abs constraint
f (int) – Variable representing output of the Abs constraint
-
addDisjunctionConstraint(disjuncts)[source]¶ Function to add a new Disjunction constraint
- Parameters
disjuncts (list of list of Equations) – Each inner list represents a disjunct
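A sketch of building two single-equation disjuncts. It assumes the Equation helper class in MarabouUtils (with addAddend/setScalar) and the LE equation type exposed by MarabouCore; the output variable indices are hypothetical:

```python
from maraboupy import Marabou, MarabouCore, MarabouUtils

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path
y0, y1 = 5, 6                                    # hypothetical output variable indices

# Disjunct 1: y0 <= 0
eq1 = MarabouUtils.Equation(MarabouCore.Equation.LE)
eq1.addAddend(1.0, y0)
eq1.setScalar(0.0)

# Disjunct 2: y1 <= 0
eq2 = MarabouUtils.Equation(MarabouCore.Equation.LE)
eq2.addAddend(1.0, y1)
eq2.setScalar(0.0)

# Require that at least one disjunct holds: y0 <= 0 OR y1 <= 0
network.addDisjunctionConstraint([[eq1], [eq2]])
```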
-
addEquality(vars, coeffs, scalar)[source]¶ Function to add equality constraint to network
\[\sum_i \textrm{vars}_i \cdot \textrm{coeffs}_i = \textrm{scalar}\]
- Parameters
vars (list of int) – Variable numbers
coeffs (list of float) – Coefficients
scalar (float) – Right hand side constant of equation
-
addEquation(x)[source]¶ Function to add new equation to the network
- Parameters
x (Equation) – New equation to add
-
addInequality(vars, coeffs, scalar)[source]¶ Function to add inequality constraint to network
\[\sum_i \textrm{vars}_i \cdot \textrm{coeffs}_i \le \textrm{scalar}\]
- Parameters
vars (list of int) – Variable numbers
coeffs (list of float) – Coefficients
scalar (float) – Right hand side constant of inequality
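A short sketch of both linear-constraint helpers, using hypothetical variable indices 0 and 1:

```python
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path
x0, x1 = 0, 1                                    # hypothetical variable indices

# Equality: 1.0*x0 + 2.0*x1 = 1.0
network.addEquality([x0, x1], [1.0, 2.0], 1.0)

# Inequality: 1.0*x0 - 1.0*x1 <= 0.5
network.addInequality([x0, x1], [1.0, -1.0], 0.5)
```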
-
addMaxConstraint(elements, v)[source]¶ Function to add a new Max constraint
- Parameters
elements (set of int) – Variables representing inputs to the max constraint
v (int) – Variable representing output of max constraint
-
addRelu(v1, v2)[source]¶ Function to add a new Relu constraint
- Parameters
v1 (int) – Variable representing input of Relu
v2 (int) – Variable representing output of Relu
-
addSignConstraint(b, f)[source]¶ Function to add a new Sign constraint
- Parameters
b (int) – Variable representing input of Sign
f (int) – Variable representing output of Sign
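The piecewise-linear helpers above (addRelu, addAbsConstraint, addSignConstraint, addMaxConstraint) all relate existing variable numbers. The sketch below allocates fresh variables by bumping the numVars counter inherited from MarabouNetwork, which is an assumption about the base class rather than part of this file's documented API:

```python
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path

# Manually allocate five fresh variables (assumes the inherited numVars counter)
b        = network.numVars   # pre-activation value
relu_out = b + 1
abs_out  = b + 2
sign_out = b + 3
max_out  = b + 4
network.numVars += 5

network.addRelu(b, relu_out)                            # relu_out = max(b, 0)
network.addAbsConstraint(b, abs_out)                    # abs_out  = |b|
network.addSignConstraint(b, sign_out)                  # sign_out = sign(b)
network.addMaxConstraint({relu_out, abs_out}, max_out)  # max_out  = max(relu_out, abs_out)
```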
-
createRandomInputsForNetwork()[source]¶ Create a random input for the network.
The value for each input variable is chosen uniformly at random between the lower and the upper bounds for that variable.
- Returns
(list of float)
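A small sketch (hypothetical file path). It assumes the loader has recorded lower and upper bounds for every input variable, as NNet files store per-input minima and maxima:

```python
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path

point = network.createRandomInputsForNetwork()
print(point)  # one float per input variable, each within its [min, max] range
```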
-
evaluate(inputValues, useMarabou=True, options=None, filename='evaluateWithMarabou.log')[source]¶ Function to evaluate network at a given point
- Parameters
inputValues (list of np arrays) – Inputs to evaluate
useMarabou (bool) – Whether to use Marabou solver or TF/ONNX, defaults to True
options (Options) – Object for specifying Marabou options, defaults to None
filename (str) – Path to redirect output if using Marabou solver, defaults to “evaluateWithMarabou.log”
- Returns
Values representing the output of the network or None if output cannot be computed
- Return type
(np array)
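A sketch comparing both evaluation paths on the same point (the path and the 5-dimensional input are hypothetical):

```python
import numpy as np
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path
point = np.array([0.0, 0.0, 0.0, 0.0, 0.0])      # hypothetical 5-dimensional input

# Evaluate by encoding the point as a Marabou query (output goes to evaluateWithMarabou.log)
out_marabou = network.evaluate([point], useMarabou=True)

# Evaluate by running the stored NNet weights directly, without the solver
out_direct = network.evaluate([point], useMarabou=False)

print(out_marabou, out_direct)
```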
-
evaluateNNet(inputs, first_layer=0, last_layer=-1, normalize_inputs=False, normalize_outputs=False, activate_output_layer=False)[source]¶ Evaluate the NNet directly (without Marabou) at a given point
Evaluates the network on the given inputs from layer first_layer (the input layer by default) to layer last_layer (the output layer by default)
- Parameters
inputs (list of float) – Network inputs to be evaluated
first_layer (int) – The initial layer of the evaluation
last_layer (int) – The last layer of the evaluation
normalize_inputs (bool) – If True and first_layer==0, normalization of inputs is performed
normalize_outputs (bool) – If True, normalization of output is undone
activate_output_layer (bool) – If True, the last layer is activated, otherwise it is not.
- Returns
the result of the evaluation
- Return type
(list of float)
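A sketch of a partial forward pass; it assumes a 5-input network and that layer indices count from the input layer (both assumptions of this example):

```python
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path
point = [0.0, 0.0, 0.0, 0.0, 0.0]                # hypothetical 5-dimensional input

# Full forward pass through all layers
full = network.evaluateNNet(point)

# Stop after layer 2 and apply the activation, yielding that layer's f values
hidden = network.evaluateNNet(point, last_layer=2, activate_output_layer=True)

print(full, hidden)
```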
-
evaluateWithMarabou(inputValues, filename='evaluateWithMarabou.log', options=None)[source]¶ Function to evaluate network at a given point using Marabou as solver
- Parameters
inputValues (list of np arrays) – Inputs to evaluate
filename (str) – Path to redirect output if using Marabou solver, defaults to “evaluateWithMarabou.log”
options (Options) – Object for specifying Marabou options, defaults to None
- Returns
Values representing the output of the network or None if system is UNSAT
- Return type
(np array)
-
evaluateWithoutMarabou(inputValues)[source]¶ Evaluate network directly (without Marabou) at a given point
- Parameters
inputValues (list of np array) – Input to network
- Returns
Output of the network
- Return type
(np array)
-
findError(inputValues, options=None, filename='evaluateWithMarabou.log')[source]¶ Function to find the error between the Marabou solver and the TF/NNet evaluation at a given point
- Parameters
inputValues (list of np arrays) – Input values to evaluate
options (Options) – Object for specifying Marabou options, defaults to None
filename (str) – Path to redirect output if using Marabou solver, defaults to “evaluateWithMarabou.log”
- Returns
Values representing the error in each output variable
- Return type
(np array)
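A sanity-check sketch (hypothetical path and input); it reports the largest per-output discrepancy between the two evaluation paths:

```python
import numpy as np
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path
point = np.array([0.0, 0.0, 0.0, 0.0, 0.0])      # hypothetical input

err = network.findError([point])
print("max output discrepancy:", np.max(np.abs(err)))
```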
-
getBoundsForLayer(layer, b=True)[source]¶ Returns a tuple of two lists: the lower and upper bounds for the variables corresponding to the given layer
If b is True, the b variables are considered; otherwise, the f variables
- Parameters
layer (int) – Layer number
b (bool) – If True, the b variable is considered, otherwise the f variable
- Returns
list of lower bounds, list of upper bounds
- Return type
(tuple of two lists of float)
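A sketch for layer 1 (hypothetical file path); it assumes bounds for that layer's variables are already known, e.g. from the input ranges in the file or from earlier setLowerBound/setUpperBound calls:

```python
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path

b_lower, b_upper = network.getBoundsForLayer(1, b=True)    # pre-activation bounds
f_lower, f_upper = network.getBoundsForLayer(1, b=False)   # post-activation bounds
print(b_lower, b_upper, f_lower, f_upper)
```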
-
getInputMaximum(input_var)[source]¶ Get maximum value for given input variable
- Parameters
input_var (int) – index of input variable
- Returns
(float)
-
getInputMinimum(input_var)[source]¶ Get minimum value for given input variable
- Parameters
input_var (int) – Index of input variable
- Returns
(float)
-
getLowerBound(layer, node, b=True)[source]¶ Get lower bound for the variable corresponding to layer, node
If b is True, the b variable is considered; otherwise, the f variable
- Parameters
layer (int) – Layer number
node (int) – Node index within layer
b (bool) – If True, the b variable is considered, otherwise the f variable
- Returns
(float)
-
getLowerBoundsForLayer(layer, b=True)[source]¶ Returns a list of lower bounds for the given layer
If b is True, the b variables are considered; otherwise, the f variables
- Parameters
layer (int) – Layer number
b (bool) – If True, the b variable is considered, otherwise the f variable
- Returns
(list of float)
-
getUpperBound(layer, node, b=True)[source]¶ Get upper bound for the variable corresponding to layer, node
If b is True, the b variable is considered; otherwise, the f variable
- Parameters
layer (int) – Layer number
node (int) – Node index within layer
b (bool) – If True, the b variable is considered, otherwise the f variable
- Returns
(float)
-
getUpperBoundsForLayer(layer, b=True)[source]¶ Returns a list of upper bounds for the given layer
If b is True, the b variables are considered; otherwise, the f variables
- Parameters
layer (int) – Layer number
b (bool) – If True, the b variable is considered, otherwise the f variable
- Returns
(list of float)
-
getVariable(layer, node, b=True)[source]¶ Get variable number corresponding to layer, node
If b is True, the b variable is considered; otherwise, the f variable. If layer is 0 (the input layer), the f variable is always returned
- Parameters
layer (int) – Layer number
node (int) – Node index within layer
b (bool) – If True, the b variable is considered, otherwise the f variable
- Returns
(int)
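A sketch tying the lookup helpers together (hypothetical path; node 3 of layer 1 is an arbitrary choice, and the bound queries assume those bounds have been set):

```python
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path

x0    = network.getVariable(0, 0)            # input layer: always the f variable
b_var = network.getVariable(1, 3, b=True)    # pre-activation variable of node 3 in layer 1
f_var = network.getVariable(1, 3, b=False)   # post-activation variable of the same node

lb = network.getLowerBound(1, 3, b=True)     # bounds for that node, if set
ub = network.getUpperBound(1, 3, b=True)
print(x0, b_var, f_var, lb, ub)
```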
-
lowerBoundExists(x)[source]¶ Function to check whether lower bound for a variable is known
- Parameters
x (int) – Variable to check
-
resetNetworkFromParameters(weights: list, biases: list, normalize=False, inputMinimums=[], inputMaximums=[], inputMeans=[], inputRanges=[], outputMean=0, outputRange=1)[source]¶ Recompute the attributes of the network from basic arguments
- Parameters
weights (list of list of list of float) – Outer index corresponds to layer number
biases (list of list of float) – Outer index corresponds to layer number
inputMinimums (list of float) – Minimum value for each input
inputMaximums (list of float) – Maximum value for each input
inputMeans (list of float) – Mean value for each input
inputRanges (list of float) – Range for each input
outputMean (float) – Mean value of outputs
outputRange (float) – Range of output values
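A sketch that builds a tiny 2-3-1 network from hand-written parameters. The architecture and values are made up, and it assumes a network object created with the default empty filename can be populated this way:

```python
from maraboupy.MarabouNetworkNNet import MarabouNetworkNNet

network = MarabouNetworkNNet()   # empty network, no file read (assumption)

weights = [
    [[1.0, -1.0], [0.5, 0.5], [2.0, 0.0]],   # layer 1: 3 neurons, 2 inputs each
    [[1.0, 1.0, -1.0]],                      # layer 2: 1 neuron, 3 inputs
]
biases = [
    [0.0, 0.1, -0.2],                        # layer 1 biases
    [0.0],                                   # layer 2 bias
]

network.resetNetworkFromParameters(weights, biases)
print(network.evaluateNNet([1.0, 2.0]))      # forward pass through the rebuilt network
```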
-
saveQuery(filename='')[source]¶ Serializes the inputQuery to the given filename
- Parameters
filename (str) – Path of the file to which the serialized inputQuery is written
-
setLowerBound(x, v)[source]¶ Function to set lower bound for variable
- Parameters
x (int) – Variable number to set
v (float) – Value representing lower bound
-
setUpperBound(x, v)[source]¶ Function to set upper bound for variable
- Parameters
x (int) – Variable number to set
v (float) – Value representing upper bound
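A sketch that boxes every input into [-0.1, 0.1]; it assumes the inputVars attribute inherited from MarabouNetwork holds the input variable indices as its first entry:

```python
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")  # hypothetical path

for v in network.inputVars[0]:   # assumption: inputVars[0] lists the input variables
    network.setLowerBound(v, -0.1)
    network.setUpperBound(v, 0.1)
```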
-
solve(filename='', verbose=True, options=None)[source]¶ Function to solve query represented by this network
- Parameters
filename (string) – Path for redirecting output
verbose (bool) – If True, print out solution after solve finishes
options (Options) – Object for specifying Marabou options, defaults to None
- Returns
- tuple containing:
vals (Dict[int, float]): Empty dictionary if UNSAT, otherwise a dictionary of SATisfying values for variables
stats (Statistics): A Statistics object describing how Marabou performed
- Return type
(tuple)
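An end-to-end sketch following the (vals, stats) return documented above. The file path is hypothetical, and the createOptions helper is assumed to be available from the top-level Marabou module:

```python
from maraboupy import Marabou

network = Marabou.read_nnet("acas_xu_1_1.nnet")   # hypothetical path
options = Marabou.createOptions(verbosity=0)      # assumed helper for building Options

# Box the inputs, then ask Marabou whether the query is satisfiable
for v in network.inputVars[0]:                    # assumption: inputVars[0] lists the inputs
    network.setLowerBound(v, -0.1)
    network.setUpperBound(v, 0.1)

vals, stats = network.solve(verbose=False, options=options)
if vals:
    print("SAT; variable 0 =", vals[0])
else:
    print("UNSAT")
```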
-