Deploy View

Overview

The PerceptiLabs Deploy View lets you export and deploy your models to different targets.

The view displays the following options to select the model(s) to export/deploy:

1. Model Selection Checkbox: enable this to select the model(s) for export/deployment.

2. Search bar: allows you to filter the list of models by name.

To the right of the model selection area are the export/deployment targets that you can click.

The following subsections describe these targets:

Export Options

The current export options include:

  • TensorFlow: exports to TensorFlow's SavedModel format or to TensorFlow Lite.

  • FastAPI Server: generates a TensorFlow model along with a Python server app that exposes a simple API for running inference on your model (see the request sketch at the end of this subsection).

  • PL Package: exports a zipped package containing your PerceptiLabs model that you can easily share and load.

Selecting any of these targets displays a popup with some or all of the following options:

  • Save to: lets you specify the location where the exported model files will be placed.

  • Optimize (available for TensorFlow model exports): provides options to compress and/or quantize your model(s) during export. Selecting either of these options exports the model in TensorFlow Lite format.
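
As an illustration of how the FastAPI Server export is typically consumed, the sketch below sends an image to the generated server using Python's requests library. The host, port, endpoint path, and payload shape are hypothetical placeholders; check the generated server app's code or its interactive docs (FastAPI serves them at /docs) for the actual API.

```python
# Minimal sketch of querying a FastAPI inference server.
# NOTE: the URL, endpoint path, and payload shape are hypothetical --
# inspect the generated server app (or its /docs page) for the real API.
import base64
import requests

with open("sample.jpg", "rb") as f:
    payload = {"image": base64.b64encode(f.read()).decode("utf-8")}

response = requests.post("http://localhost:8000/predict", json=payload, timeout=30)
response.raise_for_status()
print(response.json())  # prediction returned by the server
```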

Deployment Options

Select Gradio to export and deploy your model as a Gradio app.
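
PerceptiLabs generates the Gradio app for you, but for context, such an app is essentially a small wrapper around an inference function. The sketch below is illustrative only; the predict function and the input/output components are placeholders for your own model and data types.

```python
# Minimal sketch of a Gradio inference app (illustrative only; the app
# PerceptiLabs generates may be structured differently).
import gradio as gr
import numpy as np

def predict(image: np.ndarray) -> str:
    # Placeholder: run your exported model here and return its prediction.
    return "predicted label"

demo = gr.Interface(fn=predict, inputs=gr.Image(), outputs=gr.Label())
demo.launch()  # serves the app locally and prints its URL
```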

Using the Exported/Deployed Model

After you complete the export/deployment, the model can be used for inference.

See Exporting and Deploying Models for information on how to use your exported/deployed model.
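
For example, a model exported in TensorFlow's SavedModel format can be loaded and queried directly with TensorFlow. This is a minimal sketch, assuming the export was written to a directory named exported_model and that the model takes a single batched image tensor; adjust the path, input shape, and dtype to match your own model. A TensorFlow Lite export would instead be loaded with tf.lite.Interpreter.

```python
# Minimal sketch of running inference on an exported TensorFlow SavedModel.
# Assumes the export directory is "exported_model" and the model expects one
# batched image tensor; adapt the path and shape to your own model.
import numpy as np
import tensorflow as tf

model = tf.saved_model.load("exported_model")
infer = model.signatures["serving_default"]  # default inference signature

dummy_input = np.random.rand(1, 224, 224, 3).astype(np.float32)  # placeholder input
outputs = infer(tf.constant(dummy_input))
print({name: tensor.numpy() for name, tensor in outputs.items()})
```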
