ImageClassifier for building an image classification model.
mediapipe_model_maker.image_classifier.ImageClassifier(
    model_spec: mediapipe_model_maker.image_classifier.ModelSpec,
    label_names: List[str],
    hparams: mediapipe_model_maker.image_classifier.HParams,
    model_options: mediapipe_model_maker.image_classifier.ModelOptions
)

Methods
create
@classmethod
create(
    train_data: mediapipe_model_maker.face_stylizer.dataset.classification_dataset.ClassificationDataset,
    validation_data: mediapipe_model_maker.face_stylizer.dataset.classification_dataset.ClassificationDataset,
    options: mediapipe_model_maker.image_classifier.ImageClassifierOptions
) -> 'ImageClassifier'
Creates and trains an ImageClassifier.
Loads data and trains the model for image classification. If a checkpoint file exists in the {options.hparams.export_dir}/checkpoint/ directory, the training process loads the weights from that checkpoint for continual training.
| Args | |
|---|---|
| train_data | Training data. |
| validation_data | Validation data. |
| options | Configuration used to create the image classifier. |
| Returns | |
|---|---|
| An instance of ImageClassifier. |
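For example, a minimal end-to-end sketch of training a classifier. The `image_data/` folder path, split ratio, and hyperparameter values are illustrative; the data is assumed to be organized as one subfolder per class label.

```python
from mediapipe_model_maker import image_classifier

# Load images organized as image_data/<label_name>/* (path is hypothetical).
data = image_classifier.Dataset.from_folder('image_data/')
train_data, validation_data = data.split(0.9)

# Pick a supported backbone and basic hyperparameters.
options = image_classifier.ImageClassifierOptions(
    supported_model=image_classifier.SupportedModels.MOBILENET_V2,
    hparams=image_classifier.HParams(epochs=10, export_dir='exported_model'),
)

# create() builds the model and runs training; it resumes from
# exported_model/checkpoint/ if a checkpoint is present.
model = image_classifier.ImageClassifier.create(
    train_data=train_data,
    validation_data=validation_data,
    options=options,
)
```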
evaluate
evaluate(
    data: mediapipe_model_maker.model_util.dataset.Dataset,
    batch_size: int = 32
) -> Any

Evaluates the classifier with the provided evaluation dataset.
| Args | |
|---|---|
| data | Evaluation dataset. |
| batch_size | Number of samples per evaluation step. |
| Returns | |
|---|---|
| The loss value and accuracy. |
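A brief sketch of evaluating the trained model, assuming `model` and `validation_data` come from the create() example above:

```python
# Evaluate on a held-out split; returns the loss value and accuracy.
loss, accuracy = model.evaluate(validation_data, batch_size=32)
print(f'Loss: {loss:.4f}, accuracy: {accuracy:.4f}')
```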
export_labels
export_labels(
    export_dir: str,
    label_filename: str = 'labels.txt'
)

Exports classification labels into a label file.
| Args | |
|---|---|
| export_dir | The directory to save exported files. |
| label_filename | File name for the saved labels. The full export path is {export_dir}/{label_filename}. |
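For instance, the `exported_model` directory below is illustrative and should match the export directory used elsewhere:

```python
# Write labels.txt alongside the other exported artifacts.
model.export_labels(export_dir='exported_model', label_filename='labels.txt')
```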
export_model
export_model(
    model_name: str = 'model.tflite',
    quantization_config: Optional[mediapipe_model_maker.quantization.QuantizationConfig] = None
)

Converts and saves the model to a TFLite file with metadata included.
Note that only the TFLite file is needed for deployment. This function also saves a metadata.json file to the same directory as the TFLite file which can be used to interpret the metadata content in the TFLite file.
| Args | |
|---|---|
| model_name | File name to save the TFLite model with metadata. The full export path is {self._hparams.export_dir}/{model_name}. |
| quantization_config | The configuration for model quantization. |
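A sketch of exporting with and without post-training quantization. Reusing `train_data` as the representative dataset is purely illustrative; `QuantizationConfig.for_int8` is the int8 helper from `mediapipe_model_maker.quantization`.

```python
from mediapipe_model_maker import quantization

# Default float32 export: writes {export_dir}/model.tflite and metadata.json.
model.export_model()

# Optional int8 post-training quantization; the training split is reused here
# as the representative dataset for illustration only.
config = quantization.QuantizationConfig.for_int8(representative_data=train_data)
model.export_model(model_name='model_int8.tflite', quantization_config=config)
```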
export_tflite
export_tflite(
    export_dir: str,
    tflite_filename: str = 'model.tflite',
    quantization_config: Optional[mediapipe_model_maker.quantization.QuantizationConfig] = None,
    preprocess: Optional[Callable[..., bool]] = None
)

Converts the model to TFLite format and saves it to {export_dir}/{tflite_filename}.
| Args | |
|---|---|
| export_dir | The directory to save exported files. |
| tflite_filename | File name to save the TFLite model. The full export path is {export_dir}/{tflite_filename}. |
| quantization_config | The configuration for model quantization. |
| preprocess | A callable to preprocess the representative dataset for quantization. The callable takes three arguments in order: feature, label, and is_training. |
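A sketch of exporting only the TFLite file (without the metadata bundling done by export_model). The float16 config via `QuantizationConfig.for_float16()` is an assumption about the available quantization helpers; omit `quantization_config` for a plain float32 export.

```python
from mediapipe_model_maker import quantization

# Export the TFLite flatbuffer to exported_model/model_fp16.tflite.
# for_float16() is assumed to be available in the quantization helpers.
model.export_tflite(
    export_dir='exported_model',
    tflite_filename='model_fp16.tflite',
    quantization_config=quantization.QuantizationConfig.for_float16(),
)
```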
summary
summary()

Prints a summary of the model.