Current version: 1.9.0

About Bitnami TensorFlow Serving Stack

TensorFlow Serving is an open source system for serving a wide variety of machine learning models. Developed and released by the Google Brain team in 2015, the system uses a standard architecture and set of APIs for new and existing machine learning algorithms and frameworks.

The Bitnami TensorFlow Serving stack comes with the Inception v3 model pre-installed and configured. Inception v3 was developed for classifying complete images into 1,000 classes (such as llama, zebra, aircraft carrier, electric fan) as part of the ImageNet Large Scale Visual Recognition Challenge. This enables image classification out of the box, while also allowing users to add or develop new machine learning models.
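As a rough illustration of what querying a served model looks like, the sketch below builds a request for TensorFlow Serving's REST predict endpoint. The host, port (8501 is TensorFlow Serving's default REST port), and the model name `inception` are assumptions for this example, not values guaranteed by the Bitnami stack; adjust them for your deployment.

```python
# Minimal sketch: build a TensorFlow Serving REST "predict" request.
# Assumptions: server on localhost:8501, model exported under the name
# "inception" -- substitute the values from your own deployment.
import base64
import json

def build_predict_request(image_bytes, model_name="inception"):
    """Return the URL and JSON body for a TensorFlow Serving predict call."""
    url = f"http://localhost:8501/v1/models/{model_name}:predict"
    # TensorFlow Serving expects raw binary inputs to be base64-encoded
    # and wrapped in an object with a "b64" key.
    body = {
        "instances": [
            {"b64": base64.b64encode(image_bytes).decode("ascii")}
        ]
    }
    return url, json.dumps(body)

url, payload = build_predict_request(b"...raw image bytes...")
print(url)
# Send the payload with any HTTP client, e.g. urllib.request, with the
# header Content-Type: application/json, once the server is running.
```

The response is a JSON object whose `predictions` field carries the model's output, in this case class scores for the submitted image.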

TensorFlow Serving Getting Started Guides

Want to get a leg up on using your TensorFlow Serving stack? You can get up and running quickly with our comprehensive getting started guide in the Bitnami documentation.

In addition to cloud images, native installers, and VMs, Bitnami also publishes a TensorFlow Serving Docker container. You can use our in-depth guide to be up and running with TensorFlow Serving on Kubernetes in minutes!

Download installers and virtual machines, or run your own TensorFlow Serving server in the cloud.

Why use the Bitnami TensorFlow Serving Stack?

  • Up-to-date
  • Secure
  • Consistent across platforms