Dataset Viewer issue: StreamingRowsError

#8 opened by AnanthZeke
AI4Bharat org

The dataset viewer is not working.

Error details:

Error code:   StreamingRowsError
Exception:    CastError
Message:      Couldn't cast
doc_id: string
asm_Beng: string
-- schema metadata --
huggingface: '{"info": {"features": {"doc_id": {"dtype": "string", "_type' + 65
to
{'doc_id': Value(dtype='string', id=None), 'text': Value(dtype='string', id=None)}
because column names don't match
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/split/first_rows.py", line 320, in compute
                  compute_first_rows_from_parquet_response(
                File "/src/services/worker/src/worker/job_runners/split/first_rows.py", line 88, in compute_first_rows_from_parquet_response
                  rows_index = indexer.get_rows_index(
                File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 631, in get_rows_index
                  return RowsIndex(
                File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 512, in __init__
                  self.parquet_index = self._init_parquet_index(
                File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 529, in _init_parquet_index
                  response = get_previous_step_or_raise(
                File "/src/libs/libcommon/src/libcommon/simple_cache.py", line 566, in get_previous_step_or_raise
                  raise CachedArtifactError(
              libcommon.simple_cache.CachedArtifactError: The previous step failed.
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/utils.py", line 91, in get_rows_or_raise
                  return get_rows(
                File "/src/libs/libcommon/src/libcommon/utils.py", line 183, in decorator
                  return func(*args, **kwargs)
                File "/src/services/worker/src/worker/utils.py", line 68, in get_rows
                  rows_plus_one = list(itertools.islice(ds, rows_max_number + 1))
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 1389, in __iter__
                  for key, example in ex_iterable:
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 282, in __iter__
                  for key, pa_table in self.generate_tables_fn(**self.kwargs):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 97, in _generate_tables
                  yield f"{file_idx}_{batch_idx}", self._cast_table(pa_table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 75, in _cast_table
                  pa_table = table_cast(pa_table, self.info.features.arrow_schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2295, in table_cast
                  return cast_table_to_schema(table, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2249, in cast_table_to_schema
                  raise CastError(
              datasets.table.CastError: Couldn't cast
              doc_id: string
              asm_Beng: string
              -- schema metadata --
              huggingface: '{"info": {"features": {"doc_id": {"dtype": "string", "_type' + 65
              to
              {'doc_id': Value(dtype='string', id=None), 'text': Value(dtype='string', id=None)}
              because column names don't match

cc @albertvillanova @lhoestq @severo.
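
For reference, the underlying cast failure can be reproduced locally with a minimal sketch using `datasets.table.table_cast`, the same helper the worker calls in the traceback above (the row values here are made up):

```python
import pyarrow as pa
from datasets import Features, Value
from datasets.table import table_cast

# A table shaped like the stale parquet files (doc_id + asm_Beng)
old_table = pa.table({"doc_id": ["d0"], "asm_Beng": ["..."]})

# The features the viewer expects for this config (doc_id + text)
features = Features({"doc_id": Value("string"), "text": Value("string")})

# Raises datasets.table.CastError:
# "Couldn't cast ... because column names don't match"
table_cast(old_table, features.arrow_schema)
```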

AnanthZeke changed discussion status to closed
AI4Bharat org

The old and new parquet files were present in the same folder, so the viewer tried to cast the old files (which still had the `asm_Beng` column) to the new schema (which expects a `text` column), and the column names did not match. Removing the stale files fixed it.
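
If anyone hits the same error, a quick way to find stale files is to compare each parquet file's schema against the expected columns. A sketch, assuming a local checkout of the repo (`data/**/*.parquet` is a placeholder for the dataset's actual folder layout):

```python
import glob
import pyarrow.parquet as pq

EXPECTED = {"doc_id", "text"}

# Placeholder glob; adjust to the dataset's actual folder layout
for path in sorted(glob.glob("data/**/*.parquet", recursive=True)):
    cols = set(pq.read_schema(path).names)
    if cols != EXPECTED:
        print(f"stale schema in {path}: {sorted(cols)}")
```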
