Google has announced TensorFlow Lite Model Maker, a tool for converting an existing TensorFlow model to the TensorFlow Lite format used to serve predictions on lightweight hardware such as mobile devices.
TensorFlow models can be quite large, and serving predictions remotely from beefy hardware capable of handling them isn't always feasible. Google created the TensorFlow Lite model format to make it more efficient to serve predictions locally, but building a TensorFlow Lite version of a model previously required some work.
In a blog post, Google described how TensorFlow Lite Model Maker adapts existing TensorFlow models to the Lite format with only a few lines of code. The adaptation process uses one of a small number of task types to analyze the model and produce a Lite version. The drawback is that only a few task types (namely image and text classification) are available right now, so models for other tasks (e.g., machine vision) are not yet supported.
Other TensorFlow Lite tools announced in the same post include a tool to automatically generate platform-specific wrapper code for working with a given model. Because hand-coding wrappers for models can be error-prone, the tool generates the wrapper automatically from metadata in the model, which Model Maker produces. The tool is currently available as a pre-release beta and supports only Android right now, with plans to eventually integrate it into Android Studio.
Copyright © 2020 IDG Communications, Inc.