---
license:
- other
pretty_name: python copilot large coding dataset
dataset_info:
- config_name: view_schema
  splits:
  - name: view_schema
configs:
- config_name: view_schema
  data_files:
  - split: view_schema
    path: files/lok-python-code-large-v1_00000013.parquet
size_categories:
- 100K<n<1M
- 1M<n<10M
tags:
- python-copilot
- python-coding
- fine-tuning
- training
- alpaca
- text
- coding
task_categories:
- text-generation
task_ids:
- parsing
---
# Python Copilot Large Coding Dataset
This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
## Details
Each row contains Python code for either a class method or a global function, along with its imported modules, base classes (if any), and its raised exceptions, return values, and arguments (each listed in the order they appear in the code), and more.
- Rows: 2350782
- Size: 3.1 GB
- Data type: text
- Format: code extracted with Python's `ast` module (see the sketch after this list)
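
The rows were produced by parsing source files with Python's built-in `ast` module. The sketch below is illustrative only, not the dataset's actual extraction pipeline, and `example.py` is a placeholder path:

```python
# Illustrative sketch of AST-based extraction -- not the dataset's actual
# pipeline. "example.py" is a placeholder for any local Python source file.
import ast

with open("example.py", "r", encoding="utf-8") as handle:
    source = handle.read()

tree = ast.parse(source)

# Walk the tree and print each function/method definition together with the
# exact source segment it covers.
for node in ast.walk(tree):
    if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
        code = ast.get_source_segment(source, node)
        print(f"{node.name}: lines {node.lineno}-{node.end_lineno}")
        print(code)
```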
## Schema
```json
{
  "args": "string",
  "class_bases": "string",
  "class_docstr": "string",
  "class_docstr_tok": "string",
  "class_name": "string",
  "code": "string",
  "code_tok": "string",
  "docstr": "string",
  "docstr_tok": "string",
  "file_path": "string",
  "filename": "string",
  "imports": "string",
  "is_member": "bool",
  "label_desc": "string",
  "label_desc_len": "int64",
  "label_id": "string",
  "lend": "int64",
  "lstart": "int64",
  "name": "string",
  "num_all_bases": "float64",
  "num_bases": "float64",
  "num_classes": "float64",
  "num_functions": "int64",
  "num_imports": "int64",
  "num_methods": "float64",
  "raises": "string",
  "returns": "string",
  "total_objects": "int64"
}
```
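
To sanity-check these column types locally without loading the full dataset, one option is to read only the parquet schema with `pyarrow` (a minimal sketch, assuming `pyarrow` is installed and the shard named in the configs section above has been downloaded to `files/`):

```python
# Minimal sketch: read only the schema of one downloaded parquet shard.
# The filename matches the shard referenced in the configs section above.
import pyarrow.parquet as pq

schema = pq.read_schema("files/lok-python-code-large-v1_00000013.parquet")
for field in schema:
    print(f"{field.name}: {field.type}")
```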
## How to use the dataset
```python
from datasets import load_dataset

ds = load_dataset("matlok/python-copilot-training-from-many-repos-large", data_dir="files")
```
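
A quick follow-up sketch to confirm the load worked and peek at one row; the split name is not assumed here but read from whatever splits the loader created:

```python
# Minimal sketch: list the splits the loader created and look at the first row.
print(ds)

split_name = list(ds.keys())[0]
row = ds[split_name][0]
print(row["name"])       # function or method name
print(row["file_path"])  # source file the code was extracted from
print(row["code"])       # the extracted Python code
```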