Tasks: Text Generation
Modalities: Text
Formats: parquet
Languages: English
Size: 10K - 100K
Dataset Viewer issue #1
opened by sedthh
I reuploaded my dataset in response to https://discuss.huggingface.co/t/one-of-my-datasets-was-marked-unsafe/33727/6
I also changed the name of one of the columns. This did not cause an issue last time, but now I get a KeyError.
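For context, renaming a column through the datasets library keeps the dataset's recorded features in sync with the data, so the old name no longer appears in the schema. A minimal sketch, assuming a hypothetical repository ID and column names:

from datasets import load_dataset

# Hypothetical repository and column names, for illustration only.
ds = load_dataset("sedthh/my_dataset", split="train")

# rename_column updates both the rows and the declared features.
ds = ds.rename_column("METADATA", "metadata")

# Re-pushing regenerates the parquet files and updates the dataset metadata on the Hub.
ds.push_to_hub("sedthh/my_dataset")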
Error details:
Error code: StreamingRowsError
Exception: KeyError
Message: 'METADATA'
Traceback: Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/first_rows.py", line 570, in compute_first_rows_response
    rows = get_rows(
  File "/src/services/worker/src/worker/job_runners/first_rows.py", line 161, in decorator
    return func(*args, **kwargs)
  File "/src/services/worker/src/worker/job_runners/first_rows.py", line 217, in get_rows
    rows_plus_one = list(itertools.islice(ds, rows_max_number + 1))
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 941, in __iter__
    yield _apply_feature_types_on_example(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 698, in _apply_feature_types_on_example
    encoded_example = features.encode_example(example)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/features.py", line 1809, in encode_example
    return encode_nested_example(self, example)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/features.py", line 1212, in encode_nested_example
    {
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/features.py", line 1212, in <dictcomp>
    {
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/utils/py_utils.py", line 302, in zip_dict
    yield key, tuple(d[key] for d in dicts)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/utils/py_utils.py", line 302, in <genexpr>
    yield key, tuple(d[key] for d in dicts)
KeyError: 'METADATA'
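The KeyError comes from the viewer applying the dataset's declared features to each streamed row: the features still list a METADATA column, but the re-uploaded rows no longer contain it after the rename. A minimal sketch that should reproduce the same failure (only the METADATA name is taken from the traceback; the other column and values are assumed):

from datasets import Features, Value

# Features as previously declared for the dataset (the old schema).
features = Features({"METADATA": Value("string"), "TEXT": Value("string")})

# A row from the re-uploaded data, where the column was renamed.
example = {"metadata": "{}", "TEXT": "hello"}

# encode_example iterates over the declared feature keys, so the missing
# METADATA column raises KeyError: 'METADATA', as in the traceback above.
features.encode_example(example)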
sedthh changed discussion status to closed
Thanks for the fix, @sedthh.