Export View
PerceptiLabs allows you to export your model to TensorFlow's SavedModel format, which produces a .pb model file.
Note
You must first train your model in PerceptiLabs' Modeling Tool (by clicking Run on the Modeling Tool toolbar) before it can be exported.
Click the Export tab to display the Export View which presents the following options:
Select Trained Models: lists the trained models that you can select for export.
Save to: allows you to specify the location to export the model to.
Format: the type of export to perform. Select TensorFlow Model to export to TensorFlow's exported model format.
Compress (available for TensorFlow model exports): provides options to compress and/or quantize your exported model(s) during export (see the quantization sketch below).
Click Export at the bottom of the screen to start the export process.
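The Compress option performs compression and quantization for you during export. For reference, if you want to apply a comparable post-training quantization step yourself to a model that has already been exported, a minimal sketch using the TensorFlow Lite converter might look like the following; the directory and file names are placeholders, not values produced by PerceptiLabs:

```python
import tensorflow as tf

# Path to the exported SavedModel directory (placeholder name).
saved_model_dir = "path/to/MyModel"

# Build a TensorFlow Lite converter from the SavedModel and enable the
# default post-training optimizations, which include weight quantization.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Convert the model and write the smaller .tflite file to disk.
tflite_model = converter.convert()
with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```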
After the export is complete, a directory with the same name as your model will be created in the location you specified in the Save to field. The structure of the directory will look as follows:
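The exact contents can vary slightly between TensorFlow versions, but a SavedModel export typically contains a saved_model.pb file plus assets and variables subdirectories; MyModel below is a placeholder for your model's name:

```
MyModel/
    saved_model.pb        # model graph and serving signature(s)
    assets/               # auxiliary files (e.g., vocabularies), if any
    variables/
        variables.data-00000-of-00001
        variables.index
```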
To use the exported model for inference, take the entire directory and use it for serving the model; this structure is standard for TensorFlow Serving. The variables subdirectory contains a standard training checkpoint, which is needed to load the model unless it's frozen. A frozen model (or frozen graph) is a minimized model that can only be used for inference: all variables needed for training are removed, and the remaining variables are stored together with their definitions in a single protobuf (.pb) file. Note that TensorFlow 2.0 no longer generates frozen graph models.
The exported model also includes all of the pre-processing options you specified in the Data Wizard. This means that you can pass raw, non-preprocessed data to the exported model for inference, and the model will pre-process the data for you in the same manner as when you built and trained the model in PerceptiLabs.
For additional information see Serving a TensorFlow Model.
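To sanity-check an export locally before serving it, you can load the SavedModel directly in Python. The following is a minimal sketch rather than PerceptiLabs-specific code: the directory name, input shape, and dtype are placeholder assumptions that you would replace with values matching your own model.

```python
import numpy as np
import tensorflow as tf

# Path to the exported SavedModel directory (placeholder name).
model_dir = "path/to/MyModel"

# Load the SavedModel and look up its default serving signature.
model = tf.saved_model.load(model_dir)
infer = model.signatures["serving_default"]

# Build a batch of raw (non-preprocessed) input data. The shape and dtype
# below are only an example; use whatever your model's Data component expects.
raw_batch = np.random.rand(1, 28, 28, 1).astype(np.float32)

# Run inference. Signatures return a dict of output tensors keyed by name.
outputs = infer(tf.constant(raw_batch))
print({name: tensor.numpy() for name, tensor in outputs.items()})
```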
PerceptiLabs allows you to export your model and its supporting files to a new repo under your existing GitHub account. This allows you to back up your model on GitHub, keep a revision history for each file, and share your model as a package with other PerceptiLabs users. As part of the export process, PerceptiLabs will also generate and include a template README.md file that you can modify to describe your repo in more detail.
Follow the steps below to export your model to GitHub:
Note
You will need Git installed before you can perform these steps.
1. Ensure your model is saved by selecting File > Save.
2. Start the export by selecting File > Export to GitHub. Your browser will redirect you to the GitHub website, which will request your authorization to allow PerceptiLabs to create a new repository under your GitHub user account and push files to that repository. Once you've granted authorization, your browser will redirect back to PerceptiLabs and display the Export to GitHub popup:
3. Enter a descriptive name for the new repo.
4. (Optional) Enable Include TensorFlow Files if you want to include your trained TensorFlow model files (e.g., checkpoints) as part of the export. For a description of these files, see Exporting Your Trained Model above.
5. (Optional) Enable Include Data file if you want to include the files referenced by the Data component(s) of your model as part of the export.
6. Click Export to start the export process.