Commit e2b1d98 ("Upload 88 files"): every entry below was last modified in this commit.

PY3/ (directory)
File                Size     Detected Pickle imports
czech.pickle        1.27 MB  9
danish.pickle       1.26 MB  9
dutch.pickle        743 kB   9
english.pickle      433 kB   9
estonian.pickle     1.6 MB   9
finnish.pickle      1.95 MB  9
french.pickle       583 kB   9
german.pickle       1.53 MB  9
greek.pickle        1.95 MB  9
italian.pickle      658 kB   9
malayalam.pickle    221 kB   7
norwegian.pickle    1.26 MB  9
polish.pickle       2.04 MB  9
portuguese.pickle   649 kB   9
russian.pickle      33 kB    7
slovene.pickle      833 kB   9
spanish.pickle      598 kB   9
swedish.pickle      1.03 MB  9
turkish.pickle      1.23 MB  9

Every file with 9 detected imports reports the same set:
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktToken",
- "collections.defaultdict",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "__builtin__.object",
- "copy_reg._reconstructor",
- "nltk.tokenize.punkt.PunktParameters"

Two files differ:
- malayalam.pickle (7 imports) omits "__builtin__.object" and "copy_reg._reconstructor".
- russian.pickle (7 imports) omits "__builtin__.int", "__builtin__.object", and "copy_reg._reconstructor", and adds "__builtin__.long".
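The scan above only lists which globals each pickle pulls in at load time; deserializing the files still executes whatever those globals allow. As a rough illustration of how one of these tokenizer pickles could be loaded while rejecting anything outside the detected import set, here is a minimal sketch (not NLTK's own loading path) based on the restricted-Unpickler recipe from the Python pickle documentation. The english.pickle path and the latin-1 encoding are assumptions for the example.

```python
# Minimal sketch: load a Punkt pickle while allowing only the globals that the
# scan above reports. This is NOT NLTK's own loader; it follows the
# "restricting globals" recipe from the Python pickle documentation.
import collections
import copyreg
import pickle

from nltk.tokenize.punkt import (
    PunktLanguageVars,
    PunktParameters,
    PunktSentenceTokenizer,
    PunktToken,
)

# (module, name) pairs permitted during unpickling. The Python 2 names from the
# scan (__builtin__, copy_reg) are mapped onto their Python 3 equivalents; the
# builtins/copyreg variants are included in case a file was re-pickled under
# Python 3.
_ALLOWED = {
    ("__builtin__", "int"): int,
    ("__builtin__", "long"): int,  # russian.pickle references __builtin__.long
    ("__builtin__", "set"): set,
    ("__builtin__", "object"): object,
    ("builtins", "int"): int,
    ("builtins", "set"): set,
    ("builtins", "object"): object,
    ("copy_reg", "_reconstructor"): copyreg._reconstructor,
    ("copyreg", "_reconstructor"): copyreg._reconstructor,
    ("collections", "defaultdict"): collections.defaultdict,
    ("nltk.tokenize.punkt", "PunktToken"): PunktToken,
    ("nltk.tokenize.punkt", "PunktLanguageVars"): PunktLanguageVars,
    ("nltk.tokenize.punkt", "PunktSentenceTokenizer"): PunktSentenceTokenizer,
    ("nltk.tokenize.punkt", "PunktParameters"): PunktParameters,
}


class PunktUnpickler(pickle.Unpickler):
    """Unpickler that refuses any global outside the allow-list above."""

    def find_class(self, module, name):
        try:
            return _ALLOWED[(module, name)]
        except KeyError:
            raise pickle.UnpicklingError(f"blocked pickle global: {module}.{name}")


def load_punkt(path):
    """Deserialize one of the .pickle files listed above into a tokenizer."""
    with open(path, "rb") as f:
        # encoding="latin-1" is a common choice for Python-2-era pickles that
        # contain 8-bit strings; adjust if the file decodes differently.
        return PunktUnpickler(f, encoding="latin-1").load()


if __name__ == "__main__":
    tokenizer = load_punkt("english.pickle")  # placeholder path
    print(tokenizer.tokenize("Dr. Smith went to Washington. He arrived at 5 p.m."))
```

Allow-listing exact (module, name) pairs fails closed on anything unexpected, which is the point of the import scan. When the same files live under NLTK's data directory, the usual entry point is nltk.data.load("tokenizers/punkt/english.pickle"), which locates and unpickles the resource itself; the sketch above is only relevant when loading a pickle fetched directly from a repository like this one.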