About TensorFlow Serving packaged by Bitnami
TensorFlow Serving is an open source system for serving a wide variety of machine learning models. Developed and released by the Google Brain team in 2015, the system uses a standard architecture and set of APIs for new and existing machine learning algorithms and frameworks.
The Bitnami TensorFlow Serving stack comes with the Inception v3 model pre-installed and configured. Inception v3 was developed for classifying complete images into 1,000 classes (such as llama, zebra, aircraft carrier, electric fan) as part of the ImageNet Large Scale Visual Recognition Challenge. This enables image classification out of the box, while also allowing users to add or develop new machine learning frameworks.
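As a sketch of what "out of the box" classification looks like, the snippet below builds a request for TensorFlow Serving's standard REST predict endpoint (`/v1/models/<model>:predict` on port 8501). The host, the port, and the model name `inception` are assumptions for illustration; the Bitnami stack may register the model under a different name or serve it over gRPC instead.

```python
import base64
import json

def build_predict_request(image_bytes, model_name="inception"):
    """Build the URL and JSON body for a TensorFlow Serving REST predict call.

    Assumes the server listens on localhost:8501 and serves a model named
    `model_name` -- both are illustrative defaults, not guaranteed by the
    Bitnami stack.
    """
    # TensorFlow Serving's REST API accepts binary inputs as
    # base64-encoded strings wrapped in a {"b64": ...} object.
    encoded = base64.b64encode(image_bytes).decode("ascii")
    payload = {"instances": [{"b64": encoded}]}
    url = f"http://localhost:8501/v1/models/{model_name}:predict"
    return url, json.dumps(payload)

# POSTing this body to the URL (e.g. with requests.post) would return
# the model's class predictions as JSON.
url, body = build_predict_request(b"raw image bytes here")
```

A real client would read the image file from disk and send the body with an HTTP library; the server's JSON response contains the predicted class scores.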
TensorFlow Serving Getting Started Guides
Want to get a leg up on using your TensorFlow Serving Stack? You can get up and running quickly with the comprehensive getting started guide in the Bitnami docs pages.
In addition to cloud images, native installers, and VMs, Bitnami also publishes a TensorFlow Serving Docker container. You can use our in-depth guide to be up and running with TensorFlow Serving on Kubernetes in minutes!
Why use TensorFlow Serving packaged by Bitnami?
- Consistent between platforms
If you work for a large business and are looking to use TensorFlow Serving packaged by Bitnami in production environments, please check out VMware Application Catalog, the enterprise edition of Bitnami Application Catalog.