MobileBERT Classifier (MediaPipe)
MobileBERT is a compact version of the BERT (Bidirectional Encoder Representations from Transformers) model designed specifically for on-device usage.
The MobileBERT architecture uses a variety of compression techniques, such as smaller embedding dimensions, fewer parameters, and knowledge distillation, to shrink the model while maintaining state-of-the-art (SOTA) performance. For more information, see "MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices" by Sun et al. (2020).
The exact MobileBERT pre-trained variant used for this task is uncased_L-24_H-128_B-512_A-4_F-4_OPT. The model checkpoint was initially published in the MobileBERT GitHub repository.
The MobileBERT model is intended for on-device use cases. Using the notebook, you can create a custom MobileBERT model in TFLite, which can be deployed on-device (Android, iOS, Web, desktop, etc.) using MediaPipe Tasks TextClassifier. Use MediaPipe Studio to evaluate the model through an interactive live demo.
This model can be used in a notebook. Click Open notebook to use the model in Colab.
Input text is preprocessed using tokenization and truncated to the length specified by the max_seq_len parameter.
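The truncation step can be illustrated with a minimal sketch. Note the assumptions: a plain whitespace split stands in for the real WordPiece tokenizer, and max_seq_len=8 is an arbitrary value chosen for illustration.

```python
def preprocess(text: str, max_seq_len: int = 8) -> list[str]:
    """Toy tokenize-and-truncate sketch (whitespace split, not real WordPiece)."""
    # Prepend the [CLS] marker, as BERT-style models do.
    tokens = ["[CLS]"] + text.lower().split()
    # Truncate so that, with the closing [SEP] marker, the
    # sequence never exceeds max_seq_len tokens.
    tokens = tokens[: max_seq_len - 1] + ["[SEP]"]
    return tokens
```

The real pipeline also pads shorter sequences and maps tokens to vocabulary IDs; this sketch only shows the truncation behavior the max_seq_len parameter controls.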
Given a piece of text, the model outputs a vector of confidence scores, one for each class the model identifies.
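As an illustration of what such an output looks like, the sketch below applies a softmax to hypothetical logits for a two-class classifier; the class names and logit values are invented for the example.

```python
import math

def softmax(logits: list[float]) -> list[float]:
    """Convert raw logits into confidence scores that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for classes ["negative", "positive"].
scores = softmax([1.2, 3.4])
```

Each score lies between 0 and 1, and the scores across all classes sum to 1, so the highest-scoring class is the model's prediction.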
For best results, deploy the output TFLite model on-device with MediaPipe Tasks TextClassifier, which ensures the same preprocessing logic is used during training and inference.
| Resource ID | Release date | Release stage | Description |
|---|---|---|---|
| mediapipe/mobilebert-001 | 2024-04-01 | General Availability | Fine-tuning and on-device serving |