
Short Summary:
This model combines IndoBERT for language understanding with a Convolutional Neural Network (CNN) for capturing local patterns in text. It is designed for multi-label classification of e-commerce customer reviews across three aspects: Product, Customer Service, and Shipping/Delivery.

Detailed Description:
The model is based on IndoBERT-base-p1, a pre-trained IndoBERT model for Indonesian text, and is fine-tuned on a dataset of e-commerce reviews, allowing it to capture the nuances of customer sentiment in this domain. Following the IndoBERT layer, a 1D convolutional layer (Conv1d) extracts local features from the sequence of hidden-state vectors. An adaptive average pooling layer (AdaptiveAvgPool1d) then reduces the convolutional output to a fixed-size vector, which is fed into a final linear layer. The linear layer produces logits for each of the three classes (Produk, Layanan Pelanggan, Pengiriman), and a sigmoid activation converts the logits into independent per-class probabilities, enabling multi-label classification.
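For orientation, here is a minimal sketch of the tensor shapes through this pipeline, using the same layer sizes as the code below; the batch size and sequence length are arbitrary examples, and random tensors stand in for IndoBERT's output:

import torch
import torch.nn as nn

batch, seq_len, hidden_size = 2, 128, 768                     # 768 is IndoBERT-base's hidden size
hidden_states = torch.randn(batch, seq_len, hidden_size)      # stand-in for last_hidden_state

x = hidden_states.permute(0, 2, 1)                            # [2, 768, 128], channels-first for Conv1d
x = nn.Conv1d(hidden_size, 512, kernel_size=3, padding=1)(x)  # [2, 512, 128]
x = nn.AdaptiveAvgPool1d(1)(x).squeeze(-1)                    # [2, 512]
logits = nn.Linear(512, 3)(x)                                 # [2, 3], one logit per class
probabilities = torch.sigmoid(logits)                         # independent per-class probabilities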

The model predicts three output classes; a short sketch of decoding its probabilities into these labels follows the list:

  • Produk (Product): Customer satisfaction with product quality, performance, and description accuracy.
  • Layanan Pelanggan (Customer Service): Interaction with sellers, their responsiveness, and complaint handling.
  • Pengiriman (Shipping/Delivery): Speed of delivery, item condition upon arrival, and timeliness.
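Because each probability comes from an independent sigmoid, a single review can be assigned to several classes at once. The decoding sketch below assumes an output index order of 0 = Produk, 1 = Layanan Pelanggan, 2 = Pengiriman and a 0.5 threshold; both are illustrative assumptions rather than values stated in this card:

LABELS = ["Produk", "Layanan Pelanggan", "Pengiriman"]  # assumed index order

def decode(probabilities, threshold=0.5):
    # Keep every label whose probability clears the threshold (multi-label decision)
    return [label for label, p in zip(LABELS, probabilities) if p >= threshold]

print(decode([0.91, 0.12, 0.78]))  # -> ['Produk', 'Pengiriman']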

How to load the model in PyTorch:

import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin
from transformers import BertModel, AutoTokenizer

class IndoBertCNNEcommerceReview(nn.Module, PyTorchModelHubMixin):
    def __init__(self, bert):
        super().__init__()
        self.bert = bert
        self.conv1 = nn.Conv1d(in_channels=bert.config.hidden_size, out_channels=512, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.linear = nn.Linear(512, 3)
        self.sigmoid = nn.Sigmoid()

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        last_hidden_state = outputs.last_hidden_state

        # Permute to [batch_size, hidden_size, seq_len] for Conv1d
        last_hidden_state = last_hidden_state.permute(0, 2, 1)

        conv1_output = self.conv1(last_hidden_state)
        pooled_output = self.pool(conv1_output).squeeze(-1)
        logits = self.linear(pooled_output)
        probabilities = self.sigmoid(logits)
        return probabilities

bert = BertModel.from_pretrained(
    "indobenchmark/indobert-base-p1",
    num_labels=3,
    problem_type="multi_label_classification",
)
tokenizer = AutoTokenizer.from_pretrained("fahrendrakhoirul/indobert-cnn-finetuned-ecommerce-reviews")
model = IndoBertCNNEcommerceReview.from_pretrained("fahrendrakhoirul/indobert-cnn-finetuned-ecommerce-reviews", bert=bert)
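A minimal end-to-end usage sketch with the tokenizer and model loaded above; the example review, the max_length value, and the 0.5 decision threshold are illustrative assumptions:

import torch

review = "Barang sesuai deskripsi, tapi pengirimannya lambat."  # "Item matches the description, but delivery was slow."
inputs = tokenizer(review, return_tensors="pt", truncation=True, padding=True, max_length=512)

model.eval()
with torch.no_grad():
    probabilities = model(input_ids=inputs["input_ids"], attention_mask=inputs["attention_mask"])

labels = ["Produk", "Layanan Pelanggan", "Pengiriman"]  # assumed output order
predicted = [l for l, p in zip(labels, probabilities[0]) if p >= 0.5]
print(probabilities, predicted)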

This model has been pushed to the Hub using the PyTorchModelHubMixin integration:

  • Library: [More Information Needed]
  • Docs: [More Information Needed]
Model size: 126M parameters (F32, Safetensors)
