How PerceptiLabs Works With TensorFlow

PerceptiLabs is built on top of TensorFlow 1.15 and provides a visually intuitive user interface (UI) that assists you with building TensorFlow-based machine learning models. This topic helps existing TensorFlow users get started with PerceptiLabs by explaining how PerceptiLabs uses TensorFlow and how visual modeling correlates with the underlying TensorFlow code.

Layers as Components

Since models are built visually in PerceptiLabs, the layers of a model are represented as components in PerceptiLabs' Modeling Tool. However, the following categories of components do not represent the "traditional" layers that you might think of in a machine learning model's structure:

  • Data: provides a source of data to the model. If the data needs to be transformed, the Data component can then optionally be connected to a subsequent "layer" component (e.g., a Processing component) to transform the data into the correct input format for the model (e.g., input vector).

  • Training: ties the whole model together and handles its training. PerceptiLabs will find and use your model's training component to invoke training.
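The Data-into-Processing flow described above can be sketched in plain Python. This is an illustrative mock only (the function names are hypothetical, and PerceptiLabs generates equivalent component classes for you); it simply shows a data source being transformed into the input format a model expects:

```python
import numpy as np

# Hypothetical stand-in for a Data component: yields one flattened
# 28x28 grayscale sample as a 784-element vector.
def data_component():
    return np.zeros(784, dtype=np.float32)

# Hypothetical stand-in for a Processing component: reshapes the flat
# vector into the (height, width, channels) format the model expects.
def reshape_component(x):
    return x.reshape(28, 28, 1)

sample = reshape_component(data_component())
print(sample.shape)  # (28, 28, 1)
```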

Fundamentals of Components

Each component in your model represents an auto-generated class written in Python, which implements that component's logic with TensorFlow.

Class Naming Convention

The name of a component's auto-generated class encodes the component's category, component type, and an instance name appended with the instance number. The instance number corresponds to when the component was added to the model in relation to other components of the same type. For example, the following is a code snippet from a Reshape component in a model:

class ProcessReshape_Reshape_2(Tf1xLayer):
    def __call__(self, x: tf.Tensor, is_training: tf.Tensor = None) -> tf.Tensor:
        """Takes a tensor as input and reshapes it."""
        shp = [28, 28, 1]
        perm = [0, 1, 2]
        shp = [i for i in shp if i != 0]
        ...

The class name ProcessReshape_Reshape_2 can be broken down as follows:

  • Process: the component category from which the component was dragged onto the model.

  • Reshape: the type of the component.

  • Reshape_2: the name of the component instance. In this example, 2 means it was the second reshape component instance added to the model.
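The naming scheme above can be demonstrated with a small helper. Note that this function is hypothetical and not part of PerceptiLabs; it just splits a generated class name into the three parts described:

```python
def parse_class_name(name):
    """Split an auto-generated class name like 'ProcessReshape_Reshape_2'
    into (category, component type, instance name)."""
    prefix, type_name, number = name.split("_")
    category = prefix[: -len(type_name)]   # 'ProcessReshape' -> 'Process'
    instance = f"{type_name}_{number}"     # 'Reshape_2'
    return category, type_name, instance

print(parse_class_name("ProcessReshape_Reshape_2"))
# ('Process', 'Reshape', 'Reshape_2')
```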

Elements of the Component Class's Interface

Each component's class derives from a PerceptiLabs-defined parent class that defines the interface required for PerceptiLabs' Training component to invoke operations and pass results on to subsequent components. PerceptiLabs regenerates this class when changes are made on the component's Settings screen (i.e., changes to hyperparameters). The following are common interface elements found in many classes.

A "Layer" component derives from the PerceptiLabs-defined Tf1xLayer class, which defines the interface (i.e., the functions and properties) for the layer with the following:

  • weights: Returns a dictionary of weights that will be updated during training.

  • biases: Returns a dictionary of biases that will be updated during training.

  • __call__(): Invoked by the model's Training component to run the component's logic. This is likely the only method you would modify when customizing the class's algorithm.

  • get_sample(): Returns the first data element computed by that component so that PerceptiLabs can render a visual preview of that component's computation.
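The interface above can be sketched as a plain-Python mock. This is illustrative only, assuming a simple dense layer backed by NumPy rather than the real Tf1xLayer base class and tf.Tensor objects; it shows the shape of the interface, not PerceptiLabs' actual implementation:

```python
import numpy as np

# Mock layer exposing the same interface elements as a generated class:
# weights, biases, __call__(), and get_sample(). All names beyond those
# four are assumptions for this sketch.
class MockDenseLayer:
    def __init__(self, n_in, n_out):
        self._w = np.zeros((n_in, n_out), dtype=np.float32)
        self._b = np.zeros(n_out, dtype=np.float32)
        self._last_output = None

    @property
    def weights(self):
        # Dictionary of weights that would be updated during training.
        return {"W": self._w}

    @property
    def biases(self):
        # Dictionary of biases that would be updated during training.
        return {"b": self._b}

    def __call__(self, x, is_training=None):
        # The component's logic; the usual customization point.
        self._last_output = x @ self._w + self._b
        return self._last_output

    def get_sample(self):
        # First element of the latest output, used for the visual preview.
        return self._last_output[0]

layer = MockDenseLayer(4, 2)
out = layer(np.ones((3, 4), dtype=np.float32))
print(out.shape, layer.get_sample().shape)  # (3, 2) (2,)
```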

Components may implement the following:

  • run(): Found in training components, this method is invoked by PerceptiLabs to start training the model.

  • variables: Returns a dictionary of variables. These variables can be passed on to the next component as well as previewed. Since it also automatically collects all tensors, minimal or no modification is needed. However, you can add variables that you want to visualize or include as part of the component's output.

  • trainable_variables: For components that represent a "layer" in a model, returns a dictionary of tensor parameters that will be updated during training, specifically during backpropagation.

  • __init__(): Invoked by the model's Training component to initialize the component's members when instantiating the component.

How PerceptiLabs Trains and Runs the Model

PerceptiLabs makes use of TensorFlow's Eager and Graph modes.

PerceptiLabs uses Eager mode while you're developing your model. This allows for better debugging and enables PerceptiLabs to render immediate visualizations of each component's output.

Once you invoke training (i.e., by clicking Run), PerceptiLabs uses Graph mode to build a Graph and then runs it using TensorFlow's Session class. The code for this is located in the run() method of your model's Training component, which is why every model must include a Training component to enable training. In other words, clicking Run tells PerceptiLabs to invoke the run() method of your model's Training component.
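The difference between the two modes can be illustrated with a plain-Python analogy (no TensorFlow required). In Eager mode each operation executes immediately, so intermediate results can be inspected right away; in Graph mode the computation is first recorded and only executed later in one step:

```python
# "Eager" analogy: each operation runs immediately, so the intermediate
# result is available right away -- as PerceptiLabs uses for previews.
eager_result = (2 + 3) * 4
print(eager_result)  # 20

# "Graph" analogy: the computation is recorded without running it,
# then executed later in one step -- as happens when training runs.
graph = lambda: (2 + 3) * 4   # build the computation, nothing executes yet
graph_result = graph()        # "session run": execute the whole graph
print(graph_result)  # 20
```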

The following example shows part of the code taken from the run() method of a Classification training component. In this example, a build_graph() method is defined to construct the graph based on the layers that were set up by the other components, and is then invoked:

class TrainNormal_Normal_1(ClassificationLayer):
    def run(self, graph: Graph):
        """Called as the main entry point for training. Responsible for training the model."""
        ...
        def build_graph(input_tensor, label_tensor):
            layer_output_tensors = {
                input_data_node.layer_id: input_tensor,
                label_data_node.layer_id: label_tensor
            }
            for node in graph.inner_nodes:
                args = []
                for input_node in graph.get_input_nodes(node):
                    args.append(layer_output_tensors[input_node.layer_id])
                args.append(is_training)
                y = node.layer_instance(*args)
                layer_output_tensors[node.layer_id] = y
            return layer_output_tensors

        layer_output_tensors = build_graph(input_tensor, label_tensor)
        ...
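The pattern in build_graph() is a topological walk: each node is called with the outputs of its input nodes, and its own output is recorded for downstream nodes. A simplified, runnable sketch of that pattern follows, using plain functions as stand-ins for component instances; the node and edge structures here are hypothetical, whereas the real method wires tf.Tensor objects:

```python
# Simplified sketch of the build_graph() pattern: walk nodes in order,
# feeding each one the recorded outputs of its input nodes.
def build_graph(nodes, edges, seed_outputs):
    outputs = dict(seed_outputs)
    for node_id, fn in nodes:
        args = [outputs[src] for src in edges[node_id]]
        outputs[node_id] = fn(*args)
    return outputs

# Toy two-component "model": a reshape step feeding a dense step.
nodes = [
    ("reshape", lambda x: ("reshaped", x)),
    ("dense",   lambda x: ("dense", x)),
]
edges = {"reshape": ["data"], "dense": ["reshape"]}

outputs = build_graph(nodes, edges, {"data": "raw"})
print(outputs["dense"])  # ('dense', ('reshaped', 'raw'))
```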

Further down in the class's run() method, a TensorFlow Session object is created. The Session's run() method is then invoked both to initialize global variables and as part of the train_step() subroutine used to train the model:

class TrainNormal_Normal_1(ClassificationLayer):
    def run(self, graph: Graph):
        ...
        sess = None
        config = tf.ConfigProto()
        config.gpu_options.allow_growth = True
        sess = tf.Session(config=config)
        self._sess = sess
        ...
        sess.run(tf.global_variables_initializer())
        ...
        def train_step():
            if not self._headless:
                _, self._loss_training, self._accuracy_training, \
                    self._layer_outputs, self._layer_weights, self._layer_biases, \
                    self._layer_gradients \
                    = sess.run([
                        update_weights, loss_tensor, accuracy_tensor,
                        layer_output_tensors, layer_weight_tensors,
                        layer_bias_tensors, layer_gradient_tensors
                    ])
            else:
                _, self._loss_training, self._accuracy_training \
                    = sess.run([
                        update_weights, loss_tensor, accuracy_tensor
                    ])