

BERT for Malang QA (bert-mlg)

This model is fine-tuned for question-answering (QA) tasks. It was trained on a custom dataset to answer questions related to the city of Malang in Indonesia.

Model Description

This model is based on the BERT architecture and has been fine-tuned for question-answering on a custom set of questions about Malang, covering topics such as its famous landmarks, universities, and tourist destinations.

Task

The model is fine-tuned for the question-answering task, which takes a context (paragraph) and a question as input and provides an answer based on the context.

Intended Use

This model is intended for answering questions about the city of Malang. It can be used for applications such as:

  • Tourism applications
  • Geographical knowledge systems
  • Educational applications

How to Use

To query this model from Node.js through the Hugging Face Inference API (using the axios HTTP client):

const axios = require("axios");

const HF_API_URL = "https://api-inference.huggingface.co/models/untiltomorrow/bert-mlg";
const HF_API_TOKEN = "YOUR_HUGGING_FACE_TOKEN";

async function askModel(question, context) {
  try {
    const response = await axios.post(
      HF_API_URL,
      {
        inputs: { question, context },
      },
      {
        headers: { Authorization: `Bearer ${HF_API_TOKEN}` },
      }
    );
    console.log("Model's response:", response.data);
  } catch (error) {
    console.error("Error:", error.response ? error.response.data : error.message);
  }
}

// Example Usage:
const context = "Malang is a city in East Java, Indonesia, known for its cool climate and apple cultivation.";
const question = "What is Malang known for?";
askModel(question, context);
Model Details

  • Model size: 109M parameters
  • Tensor type: F32 (Safetensors)
  • Downloads last month: 7
