Sub Arctic Polar Bear
"Sub Arctic Polar Bear" is a fine tuning text-to-image dataset consisting of dynamic photographs captured in 30MP+ resolution using Canon R5 and a Canon 600mm RF lens.
Key Features
- 📷 Detail: Every photograph is captured at proper exposure and full sharpness, delivering rich detail ideal for fine-tuning models on polar bears.
- 🎞️ Variance: Includes close-up details, behavioural moments, low ground-level perspectives, all angles of the polar bears, medium and far shots in habitat, and autumn and winter scenes.
- ⚙️ Consistency: Training data is re-thought at the point of capture by "overshooting" each subject, enabling models to learn more nuanced relationships and views across scenes.
- 🌅 Light: Shot in early morning and sunset light for optimal color contrast and dynamic range, as well as in bright overcast conditions, maximizing visual quality for color- and lighting-sensitive tasks.
- 🔍 Curation: Curated specifically for machine learning, providing clean, high-quality data for next-generation model training.
Dataset Details
- Total photos: 1,801
- Total size: 55.01 GB
- Photo types: Close-ups, medium distance, habitat, portraits, side profile, all angles, awake, walking, sleeping, eating.
Technical Details
- Cameras: Canon R5
- Lenses: Canon 600mm RF, Canon 70-200mm EF
- Resolution: 8192x5464
- Colorspace: Adobe RGB 1998
- Image Size: 35MP+
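Because the files are delivered in Adobe RGB (1998) at full resolution, pipelines that assume sRGB inputs may want to convert and downscale before training. A minimal Pillow sketch follows, assuming each file embeds its ICC profile; the helper name and target size are illustrative, not part of the dataset tooling.

```python
import io
from PIL import Image, ImageCms

def to_srgb_training_copy(path, max_side=2048):
    """Return an sRGB copy of an Adobe RGB (1998) capture, resized for training.

    Assumes the source file embeds its ICC profile; if no profile is found,
    the pixel values are left untouched.
    """
    im = Image.open(path).convert("RGB")
    icc_bytes = im.info.get("icc_profile")
    if icc_bytes:
        src_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
        dst_profile = ImageCms.createProfile("sRGB")
        im = ImageCms.profileToProfile(im, src_profile, dst_profile, outputMode="RGB")
    im.thumbnail((max_side, max_side), Image.LANCZOS)  # preserves aspect ratio
    return im
```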
Content Credentials
Each photograph in the "Sub Arctic Polar Bear" dataset contains C2PA Content Credentials metadata and can be used with the Content Credentials Space by Truepic to begin testing model provenance.
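As a rough way to inspect those credentials locally, one could shell out to the open-source c2patool CLI, assuming it is installed and on PATH; the helper below is an illustrative sketch, not part of the dataset.

```python
import subprocess

def read_c2pa_manifest(path):
    """Return the C2PA manifest report for a file, or None if none is found.

    Assumes c2patool (https://github.com/contentauth/c2patool) is installed;
    its default output is a JSON-formatted manifest report on stdout.
    """
    result = subprocess.run(["c2patool", path], capture_output=True, text=True)
    if result.returncode != 0:
        return None
    return result.stdout

print(read_c2pa_manifest("polar_bear_0001.jpg"))  # hypothetical filename
```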
What is Overlai.ai?
Overlai.ai specializes in 8K+ fine-tuning datasets with the goal of improving photo and video models. Contact us for the complete dataset list or access to this dataset in full resolution.
Contact