Latest commit: Update training_args.bin (ee47d9b)

File                       Size       Last commit
.gitattributes             345 Bytes  initial commit
config.json                832 Bytes  Update config.json
dataset-metadata.json      126 Bytes  Update dataset-metadata.json
pytorch_model.bin          712 MB     Update pytorch_model.bin
special_tokens_map.json    112 Bytes  Update special_tokens_map.json
test_predictions.txt       1.98 kB    Update test_predictions.txt
tf_model.h5                712 MB     Update tf_model.h5
tokenizer_config.json      48 Bytes   Update tokenizer_config.json
training_args.bin          715 MB     Update training_args.bin
vocab.txt                  996 kB     Update vocab.txt
training_args.bin: Detected Pickle imports (30)
- "transformers.modeling_bert.BertForSequenceClassification",
- "transformers.modeling_bert.BertAttention",
- "transformers.modeling_bert.BertOutput",
- "torch.FloatStorage",
- "torch.nn.modules.linear.Linear",
- "transformers.modeling_bert.BertSelfOutput",
- "torch.nn.modules.normalization.LayerNorm",
- "torch.nn.modules.activation.Tanh",
- "transformers.modeling_bert.BertModel",
- "transformers.modeling_bert.BertEncoder",
- "transformers.modeling_bert.BertSelfAttention",
- "torch._utils._rebuild_tensor_v2",
- "torch.nn.modules.dropout.Dropout",
- "transformers.tokenization_bert.BasicTokenizer",
- "collections.OrderedDict",
- "torch.nn.modules.container.ModuleList",
- "transformers.tokenization_bert.BertTokenizer",
- "transformers.modeling_bert.BertPooler",
- "transformers.modeling_bert.BertIntermediate",
- "torch.device",
- "torch.nn.modules.sparse.Embedding",
- "transformers.configuration_bert.BertConfig",
- "torch._utils._rebuild_parameter",
- "transformers.modeling_bert.BertEmbeddings",
- "transformers.modeling_bert.BertLayer",
- "__builtin__.set",
- "lingualytics.learner.Learner",
- "transformers.tokenization_bert.WordpieceTokenizer",
- "torch.nn.functional.gelu",
- "pathlib.PosixPath"
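The list above is what the Hub's pickle scanner surfaces: the globals a pickle references are imported, and may be called, during unpickling, which is why a 715 MB opaque training_args.bin deserves inspection before torch.load. A minimal sketch of the same inspection, assuming the zip-based layout torch.save has used since PyTorch 1.6 (older checkpoints are raw pickle streams and would need different handling); pickle_imports is a hypothetical helper name, not part of any library:

    import pickletools
    import zipfile

    # List the globals a pickle would import at load time, without ever
    # unpickling anything. Assumes the pickle stream sits in the archive's
    # "<name>/data.pkl" entry; tensors live in separate zip entries, so
    # data.pkl itself stays small even for a 715 MB file.
    def pickle_imports(path):
        with zipfile.ZipFile(path) as zf:
            pkl_name = next(n for n in zf.namelist() if n.endswith("data.pkl"))
            data = zf.read(pkl_name)
        imports = set()
        ops = list(pickletools.genops(data))
        for i, (op, arg, _pos) in enumerate(ops):
            if op.name == "GLOBAL":
                # protocol <= 3: arg is one space-joined "module qualname" string
                imports.add(tuple(arg.split(" ", 1)))
            elif op.name == "STACK_GLOBAL":
                # protocol 4+: module and qualname are the two most recent
                # string constants in the stream (a heuristic, adequate here)
                strings = [a for _o, a, _p in ops[:i] if isinstance(a, str)][-2:]
                if len(strings) == 2:
                    imports.add(tuple(strings))
        return imports

    for module, name in sorted(pickle_imports("training_args.bin")):
        print(f"{module}.{name}")

Scanning opcodes this way never executes anything from the file, unlike torch.load, which unpickles and therefore imports every module listed above.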
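Apart from training_args.bin, the listing is a standard BERT sequence-classification checkpoint (config.json, pytorch_model.bin / tf_model.h5, vocab.txt plus the tokenizer maps), so it can be used without touching the pickle at all; unpickling training_args.bin, by contrast, would require lingualytics to be importable, since the scan shows it pickles a lingualytics.learner.Learner. A minimal loading sketch, assuming the files above were downloaded to the current directory (the repo id is not visible on this page, so "." is a placeholder):

    from transformers import BertForSequenceClassification, BertTokenizer

    # Load the checkpoint from a local directory containing the listed files.
    tokenizer = BertTokenizer.from_pretrained(".")
    model = BertForSequenceClassification.from_pretrained(".")
    model.eval()

    inputs = tokenizer("An example sentence to classify", return_tensors="pt")
    logits = model(**inputs)[0]  # [0] works for both tuple and ModelOutput returns
    print(logits)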