Latest commit: Upload tokenizer · ca83915 · verified
1.52 kB · initial commit
MCQA-specialized-t5-base-int4.pth
Detected Pickle imports (37)
- "torch.device",
- "torch._utils._rebuild_wrapper_subclass",
- "torch._utils._rebuild_parameter",
- "quanto.tensor.qtype.qtype",
- "transformers.models.t5.configuration_t5.T5Config",
- "torch._tensor._rebuild_from_type_v2",
- "quanto.tensor.qbits.qbits.QBitsTensor",
- "transformers.models.t5.modeling_t5.T5DenseGatedActDense",
- "torch.nn.modules.container.ModuleList",
- "torch.nn.modules.sparse.Embedding",
- "transformers.models.t5.modeling_t5.T5LayerNorm",
- "transformers.models.t5.modeling_t5.T5LayerSelfAttention",
- "transformers.models.t5.modeling_t5.T5Stack",
- "torch._utils._rebuild_tensor_v2",
- "transformers.activations.NewGELUActivation",
- "torch.CharStorage",
- "torch.ByteStorage",
- "torch.serialization._get_layout",
- "quanto.nn.qlinear.QLinear",
- "torch.float32",
- "transformers.models.t5.modeling_t5.T5ForConditionalGeneration",
- "transformers.models.t5.modeling_t5.T5LayerFF",
- "torch.int8",
- "transformers.quantizers.quantizer_quanto.QuantoHfQuantizer",
- "collections.OrderedDict",
- "transformers.utils.quantization_config.QuantoConfig",
- "torch.uint8",
- "transformers.models.t5.modeling_t5.T5Block",
- "transformers.utils.quantization_config.QuantizationMethod",
- "transformers.models.t5.modeling_t5.T5LayerCrossAttention",
- "__builtin__.set",
- "torch.nn.modules.linear.Linear",
- "torch.nn.modules.dropout.Dropout",
- "transformers.models.t5.modeling_t5.T5Attention",
- "torch.FloatStorage",
- "transformers.generation.configuration_utils.GenerationConfig",
- "quanto.tensor.qbits.packed.PackedTensor"
305 MB
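The "Detected Pickle imports" list above is produced by statically scanning the pickle stream for the classes it references, without executing it. A minimal sketch of such a scanner, using only the standard library (the function name `pickle_imports` and the in-memory demo are my own; a real `.pth` checkpoint is a zip archive whose embedded `data.pkl` would be scanned the same way):

```python
import collections
import io
import pickle
import pickletools

def pickle_imports(stream):
    """Collect module.attr references from a pickle stream without executing it.

    Walks the opcode stream with pickletools.genops and records every
    GLOBAL / STACK_GLOBAL reference. STACK_GLOBAL takes its module and
    attribute names from the stack, so we track recent string arguments;
    this heuristic suffices for straightforward pickles.
    """
    imports = set()
    strings = []  # recent string opcode arguments
    for opcode, arg, _pos in pickletools.genops(stream):
        if isinstance(arg, str):
            strings.append(arg)
        if opcode.name == "GLOBAL":
            # GLOBAL's argument is "module name" as a single space-joined string
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            imports.add(f"{strings[-2]}.{strings[-1]}")
    return imports

# Demo on an in-memory pickle: an OrderedDict pickles a reference to its class.
blob = pickle.dumps(collections.OrderedDict(a=1))
found = sorted(pickle_imports(io.BytesIO(blob)))
print(found)
```

Because unpickling a checkpoint like this reconstructs arbitrary classes (`quanto.nn.qlinear.QLinear`, `T5ForConditionalGeneration`, and the rest of the list), it executes code from those modules: inspect the import list first and only load files you trust.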
Upload MCQA-specialized-t5-base-int4.pth · 305 MB
Upload MCQA-specialized-t5-base-int4_state_dict.pth · 305 MB
Upload T5ForConditionalGeneration · 5.17 kB
Upload T5ForConditionalGeneration · 2.06 kB
Upload T5ForConditionalGeneration · 142 Bytes
Upload T5ForConditionalGeneration · 340 MB
Upload tokenizer · 2.54 kB
Upload tokenizer · 792 kB
Upload tokenizer · 2.42 MB
Upload tokenizer · 20.9 kB