---
base_model: sentence-transformers/paraphrase-mpnet-base-v2
library_name: setfit
metrics:
- f1
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 'Title: "Deep Residual Learning for Image Recognition". Abstract: In this paper, we propose a new deep residual learning framework for image classification. We introduce a novel residual block architecture that learns to represent high-level features in an image. Our approach is based on the idea of residual learning, where the network learns to represent the difference between the input and the output of a layer, rather than learning to represent the output directly. We evaluate our approach on several benchmark datasets, including ImageNet and CIFAR-10, and show that it achieves state-of-the-art performance. Our results demonstrate the effectiveness of residual learning for image classification, and show that it can be used to improve the performance of deep neural networks. We also provide a detailed analysis of the residual block architecture, and show how it can be used to improve the performance of other deep learning models. This paper provides a comprehensive overview of the residual learning framework, and demonstrates its effectiveness for image classification tasks.'
- text: Let G be a finite group and let V be a finite-dimensional representation of G over an algebraically closed field k. We say that V is a representation of G in characteristic zero if the characteristic of k is zero. In this paper, we investigate the structure of the representation ring R(G) of a finite group G in characteristic zero. We show that R(G) is isomorphic to the group ring k[G] if and only if G is a cyclic group. Furthermore, we provide a characterization of the representation rings of finite abelian groups in terms of their irreducible representations. Our results have implications for the study of the representation theory of finite groups in characteristic zero.
- text: 'Denotational Semantics of Programming Languages: A Survey Abstract: Denotational semantics is a branch of programming language theory that focuses on the meaning of programming languages. In this survey, we provide an overview of the key concepts and results in denotational semantics, including the use of domain theory and categorical semantics. We also discuss the relationship between denotational semantics and other areas of programming language theory, such as operational semantics and axiomatic semantics. Introduction Denotational semantics is a mathematical approach to understanding the meaning of programming languages. It is based on the idea that a programming language can be viewed as a mapping from a set of programs to a set of meanings, where the meanings are represented as mathematical objects. The key concept in denotational semantics is the notion of a denotation, which is a function that maps a program to its meaning. Domain Theory Domain theory is a branch of mathematics that provides a framework for understanding the notion of a denotation. It is based on the idea that a denotation is a function from a set of programs to a set of meanings, where the meanings are represented as elements of a domain. The key concept in domain theory is the notion of a continuous function, which is a function that preserves the order relation on the domain. Categorical Semantics Categorical semantics is a branch of mathematics that provides a framework for understanding the notion of a denotation in a categorical setting. It is based on the idea that a denotation is a function from a set of programs to a set of meanings, where the meanings are represented as objects in a category. The key concept in categorical semantics is the notion of a functor, which is a function that preserves the morphisms in the category. Conclusion In this survey, we have provided an overview of the key concepts and results in denotational semantics. We have also discussed the relationship between denotational semantics and other areas of programming language theory. The results presented in this survey demonstrate the importance of denotational semantics in understanding the meaning of programming languages.'
- text: 'A Novel Robust Control Approach for Uncertain Systems with Time-Varying Delays Abstract: This paper presents a new robust control method for uncertain systems with time-varying delays. The proposed approach combines the advantages of model predictive control (MPC) and sliding mode control (SMC) to achieve robust stability and performance. The MPC algorithm is used to predict the future behavior of the system, while the SMC algorithm is employed to reject disturbances and uncertainties. The stability and performance of the proposed controller are analyzed using Lyapunov theory and simulation results. The effectiveness of the proposed approach is demonstrated through numerical examples and comparisons with existing methods. Keywords: Robust control, uncertain systems, time-varying delays, model predictive control, sliding mode control, Lyapunov theory.'
- text: 'A Novel Compiler Framework for Parallel Computing: Design and Implementation Abstract: With the increasing demand for high-performance computing, parallel computing has become a crucial aspect of modern computing systems. However, the complexity of parallel programming models and the lack of efficient compilation techniques hinder the widespread adoption of parallel computing. In this paper, we propose a novel compiler framework for parallel computing, which aims to bridge the gap between parallel programming models and efficient compilation techniques. Our framework, called ParComp, is designed to support a wide range of parallel programming models, including OpenMP, MPI, and CUDA. ParComp consists of three main components: a parallelization module, a scheduling module, and a code generation module. The parallelization module is responsible for identifying parallelizable loops and transforming them into parallel code. The scheduling module is responsible for scheduling the parallel tasks and allocating resources to them. The code generation module is responsible for generating efficient parallel code from the scheduled tasks. We evaluate the performance of ParComp using a set of benchmark programs and compare it with state-of-the-art parallel compilers. The results show that ParComp outperforms the existing compilers in terms of execution time and scalability. Our framework is implemented using C++ and is available online for public use. Index Terms: Parallel computing, compiler design, parallel programming models, OpenMP, MPI, CUDA, code generation, scheduling, resource allocation.'
inference: true
model-index:
- name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: f1
      value: 0.6184
      name: F1
---

# SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
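The second step above can be sketched in isolation. The following is a minimal illustration, not this model's actual training code: random vectors stand in for the embeddings that the fine-tuned Sentence Transformer body would produce in step 1, and a scikit-learn `LogisticRegression` plays the role of the classification head. The cluster layout, noise scale, and shot counts are illustrative assumptions.

```python
# Conceptual sketch of SetFit's second step: fitting a LogisticRegression
# head on sentence embeddings. Random clustered vectors stand in for the
# embeddings the fine-tuned Sentence Transformer would produce (step 1).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
num_classes, shots, dim = 11, 16, 768  # mpnet-base embeddings are 768-d

# One cluster of stand-in "embeddings" per class, a few shots each.
centers = rng.normal(size=(num_classes, dim))
X = np.vstack([c + 0.1 * rng.normal(size=(shots, dim)) for c in centers])
y = np.repeat(np.arange(num_classes), shots)

# The classification head: a plain LogisticRegression on the features.
head = LogisticRegression(max_iter=1000).fit(X, y)
train_acc = head.score(X, y)
```

Because step 1 pulls same-class embeddings together, the head only has to separate well-formed clusters, which is why a simple linear classifier suffices even with ~12-21 examples per class.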
## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 11 classes

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label                     |
|:--------------------------|
| Data Structures           |
| Programming Languages     |
| Information Theory        |
| Group Theory              |
| Neural and Evolutionary   |
| Commutative Algebra       |
| Systems and Control       |
| Statistics Theory         |
| Artificial Intelligence   |
| Computational Engineering |
| Computer Vision           |

## Evaluation

### Metrics
| Label   | F1     |
|:--------|:-------|
| **all** | 0.6184 |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("setfit_model_id")
# Run inference
preds = model("Let G be a finite group and let V be a finite-dimensional representation of G over an algebraically closed field k. We say that V is a representation of G in characteristic zero if the characteristic of k is zero. In this paper, we investigate the structure of the representation ring R(G) of a finite group G in characteristic zero. We show that R(G) is isomorphic to the group ring k[G] if and only if G is a cyclic group. Furthermore, we provide a characterization of the representation rings of finite abelian groups in terms of their irreducible representations. Our results have implications for the study of the representation theory of finite groups in characteristic zero.")
```

## Training Details

### Training Set Metrics
| Training set | Min | Median   | Max  |
|:-------------|:----|:---------|:-----|
| Word count   | 69  | 220.7380 | 1079 |

| Label                     | Training Sample Count |
|:--------------------------|:----------------------|
| Commutative Algebra       | 15                    |
| Computer Vision           | 12                    |
| Artificial Intelligence   | 16                    |
| Systems and Control       | 19                    |
| Group Theory              | 21                    |
| Computational Engineering | 16                    |
| Programming Languages     | 13                    |
| Information Theory        | 21                    |
| Data Structures           | 21                    |
| Neural and Evolutionary   | 21                    |
| Statistics Theory         | 12                    |

### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (5, 5)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True

### Training Results
| Epoch   | Step     | Training Loss | Validation Loss |
|:-------:|:--------:|:-------------:|:---------------:|
| 0.0005  | 1        | 0.158         | -               |
| 0.0253  | 50       | 0.1482        | -               |
| 0.0505  | 100      | 0.1408        | -               |
| 0.0758  | 150      | 0.1071        | -               |
| 0.1011  | 200      | 0.1294        | -               |
| 0.1263  | 250      | 0.0782        | -               |
| 0.1516  | 300      | 0.0628        | -               |
| 0.1769  | 350      | 0.0909        | -               |
| 0.2021  | 400      | 0.0161        | -               |
| 0.2274  | 450      | 0.0068        | -               |
| 0.2527  | 500      | 0.011         | -               |
| 0.2779  | 550      | 0.0027        | -               |
| 0.3032  | 600      | 0.0018        | -               |
| 0.3284  | 650      | 0.0011        | -               |
| 0.3537  | 700      | 0.0037        | -               |
| 0.3790  | 750      | 0.0015        | -               |
| 0.4042  | 800      | 0.0012        | -               |
| 0.4295  | 850      | 0.0006        | -               |
| 0.4548  | 900      | 0.0013        | -               |
| 0.4800  | 950      | 0.0004        | -               |
| 0.5053  | 1000     | 0.0003        | -               |
| 0.5306  | 1050     | 0.0001        | -               |
| 0.5558  | 1100     | 0.0007        | -               |
| 0.5811  | 1150     | 0.0001        | -               |
| 0.6064  | 1200     | 0.0004        | -               |
| 0.6316  | 1250     | 0.0001        | -               |
| 0.6569  | 1300     | 0.0001        | -               |
| 0.6822  | 1350     | 0.0006        | -               |
| 0.7074  | 1400     | 0.0002        | -               |
| 0.7327  | 1450     | 0.0002        | -               |
| 0.7580  | 1500     | 0.0001        | -               |
| 0.7832  | 1550     | 0.0001        | -               |
| 0.8085  | 1600     | 0.0001        | -               |
| 0.8338  | 1650     | 0.0001        | -               |
| 0.8590  | 1700     | 0.0002        | -               |
| 0.8843  | 1750     | 0.0001        | -               |
| 0.9096  | 1800     | 0.0001        | -               |
| 0.9348  | 1850     | 0.0001        | -               |
| 0.9601  | 1900     | 0.0001        | -               |
| 0.9853  | 1950     | 0.0001        | -               |
| **1.0** | **1979** | **-**         | **0.0359**      |
| 1.0106  | 2000     | 0.0001        | -               |
| 1.0359  | 2050     | 0.0001        | -               |
| 1.0611  | 2100     | 0.0002        | -               |
| 1.0864  | 2150     | 0.0001        | -               |
| 1.1117  | 2200     | 0.0002        | -               |
| 1.1369  | 2250     | 0.0001        | -               |
| 1.1622  | 2300     | 0.0           | -               |
| 1.1875  | 2350     | 0.0003        | -               |
| 1.2127  | 2400     | 0.0001        | -               |
| 1.2380  | 2450     | 0.0001        | -               |
| 1.2633  | 2500     | 0.0001        | -               |
| 1.2885  | 2550     | 0.0           | -               |
| 1.3138  | 2600     | 0.0           | -               |
| 1.3391  | 2650     | 0.0001        | -               |
| 1.3643  | 2700     | 0.0046        | -               |
| 1.3896  | 2750     | 0.0044        | -               |
| 1.4149  | 2800     | 0.0005        | -               |
| 1.4401  | 2850     | 0.0002        | -               |
| 1.4654  | 2900     | 0.0001        | -               |
| 1.4907  | 2950     | 0.0           | -               |
| 1.5159  | 3000     | 0.0001        | -               |
| 1.5412  | 3050     | 0.0001        | -               |
| 1.5664  | 3100     | 0.0001        | -               |
| 1.5917  | 3150     | 0.0001        | -               |
| 1.6170  | 3200     | 0.0           | -               |
| 1.6422  | 3250     | 0.0           | -               |
| 1.6675  | 3300     | 0.0           | -               |
| 1.6928  | 3350     | 0.0           | -               |
| 1.7180  | 3400     | 0.0001        | -               |
| 1.7433  | 3450     | 0.0           | -               |
| 1.7686  | 3500     | 0.0           | -               |
| 1.7938  | 3550     | 0.0001        | -               |
| 1.8191  | 3600     | 0.0           | -               |
| 1.8444  | 3650     | 0.0           | -               |
| 1.8696  | 3700     | 0.0           | -               |
| 1.8949  | 3750     | 0.0           | -               |
| 1.9202  | 3800     | 0.0           | -               |
| 1.9454  | 3850     | 0.0           | -               |
| 1.9707  | 3900     | 0.0           | -               |
| 1.9960  | 3950     | 0.0           | -               |
| 2.0     | 3958     | -             | 0.0579          |
| 2.0212  | 4000     | 0.0           | -               |
| 2.0465  | 4050     | 0.0           | -               |
| 2.0718  | 4100     | 0.0001        | -               |
| 2.0970  | 4150     | 0.0001        | -               |
| 2.1223  | 4200     | 0.0           | -               |
| 2.1475  | 4250     | 0.0           | -               |
| 2.1728  | 4300     | 0.0           | -               |
| 2.1981  | 4350     | 0.0           | -               |
| 2.2233  | 4400     | 0.0           | -               |
| 2.2486  | 4450     | 0.0           | -               |
| 2.2739  | 4500     | 0.0           | -               |
| 2.2991  | 4550     | 0.0           | -               |
| 2.3244  | 4600     | 0.0001        | -               |
| 2.3497  | 4650     | 0.0           | -               |
| 2.3749  | 4700     | 0.0001        | -               |
| 2.4002  | 4750     | 0.0           | -               |
| 2.4255  | 4800     | 0.0           | -               |
| 2.4507  | 4850     | 0.0001        | -               |
| 2.4760  | 4900     | 0.0           | -               |
| 2.5013  | 4950     | 0.0           | -               |
| 2.5265  | 5000     | 0.0           | -               |
| 2.5518  | 5050     | 0.0           | -               |
| 2.5771  | 5100     | 0.0           | -               |
| 2.6023  | 5150     | 0.0           | -               |
| 2.6276  | 5200     | 0.0           | -               |
| 2.6529  | 5250     | 0.0           | -               |
| 2.6781  | 5300     | 0.0           | -               |
| 2.7034  | 5350     | 0.0001        | -               |
| 2.7287  | 5400     | 0.0           | -               |
| 2.7539  | 5450     | 0.0           | -               |
| 2.7792  | 5500     | 0.0001        | -               |
| 2.8044  | 5550     | 0.0           | -               |
| 2.8297  | 5600     | 0.0           | -               |
| 2.8550  | 5650     | 0.0           | -               |
| 2.8802  | 5700     | 0.0           | -               |
| 2.9055  | 5750     | 0.0           | -               |
| 2.9308  | 5800     | 0.0           | -               |
| 2.9560  | 5850     | 0.0           | -               |
| 2.9813  | 5900     | 0.0           | -               |
| 3.0     | 5937     | -             | 0.0557          |
| 3.0066  | 5950     | 0.0           | -               |
| 3.0318  | 6000     | 0.0           | -               |
| 3.0571  | 6050     | 0.0           | -               |
| 3.0824  | 6100     | 0.0           | -               |
| 3.1076  | 6150     | 0.0           | -               |
| 3.1329  | 6200     | 0.0           | -               |
| 3.1582  | 6250     | 0.0           | -               |
| 3.1834  | 6300     | 0.0           | -               |
| 3.2087  | 6350     | 0.0           | -               |
| 3.2340  | 6400     | 0.0           | -               |
| 3.2592  | 6450     | 0.0           | -               |
| 3.2845  | 6500     | 0.0           | -               |
| 3.3098  | 6550     | 0.0           | -               |
| 3.3350  | 6600     | 0.0           | -               |
| 3.3603  | 6650     | 0.0           | -               |
| 3.3855  | 6700     | 0.0           | -               |
| 3.4108  | 6750     | 0.0           | -               |
| 3.4361  | 6800     | 0.0           | -               |
| 3.4613  | 6850     | 0.0           | -               |
| 3.4866  | 6900     | 0.0           | -               |
| 3.5119  | 6950     | 0.0           | -               |
| 3.5371  | 7000     | 0.0           | -               |
| 3.5624  | 7050     | 0.0           | -               |
| 3.5877  | 7100     | 0.0           | -               |
| 3.6129  | 7150     | 0.0           | -               |
| 3.6382  | 7200     | 0.0           | -               |
| 3.6635  | 7250     | 0.0           | -               |
| 3.6887  | 7300     | 0.0           | -               |
| 3.7140  | 7350     | 0.0           | -               |
| 3.7393  | 7400     | 0.0           | -               |
| 3.7645  | 7450     | 0.0           | -               |
| 3.7898  | 7500     | 0.0           | -               |
| 3.8151  | 7550     | 0.0           | -               |
| 3.8403  | 7600     | 0.0           | -               |
| 3.8656  | 7650     | 0.0           | -               |
| 3.8909  | 7700     | 0.0           | -               |
| 3.9161  | 7750     | 0.0           | -               |
| 3.9414  | 7800     | 0.0           | -               |
| 3.9666  | 7850     | 0.0           | -               |
| 3.9919  | 7900     | 0.0           | -               |
| 4.0     | 7916     | -             | 0.0543          |
| 4.0172  | 7950     | 0.0           | -               |
| 4.0424  | 8000     | 0.0           | -               |
| 4.0677  | 8050     | 0.0           | -               |
| 4.0930  | 8100     | 0.0           | -               |
| 4.1182  | 8150     | 0.0           | -               |
| 4.1435  | 8200     | 0.0           | -               |
| 4.1688  | 8250     | 0.0           | -               |
| 4.1940  | 8300     | 0.0           | -               |
| 4.2193  | 8350     | 0.0           | -               |
| 4.2446  | 8400     | 0.0           | -               |
| 4.2698  | 8450     | 0.0           | -               |
| 4.2951  | 8500     | 0.0           | -               |
| 4.3204  | 8550     | 0.0           | -               |
| 4.3456  | 8600     | 0.0           | -               |
| 4.3709  | 8650     | 0.0           | -               |
| 4.3962  | 8700     | 0.0           | -               |
| 4.4214  | 8750     | 0.0           | -               |
| 4.4467  | 8800     | 0.0           | -               |
| 4.4720  | 8850     | 0.0           | -               |
| 4.4972  | 8900     | 0.0           | -               |
| 4.5225  | 8950     | 0.0           | -               |
| 4.5478  | 9000     | 0.0           | -               |
| 4.5730  | 9050     | 0.0           | -               |
| 4.5983  | 9100     | 0.0           | -               |
| 4.6235  | 9150     | 0.0           | -               |
| 4.6488  | 9200     | 0.0           | -               |
| 4.6741  | 9250     | 0.0           | -               |
| 4.6993  | 9300     | 0.0           | -               |
| 4.7246  | 9350     | 0.0           | -               |
| 4.7499  | 9400     | 0.0           | -               |
| 4.7751  | 9450     | 0.0           | -               |
| 4.8004  | 9500     | 0.0           | -               |
| 4.8257  | 9550     | 0.0           | -               |
| 4.8509  | 9600     | 0.0           | -               |
| 4.8762  | 9650     | 0.0           | -               |
| 4.9015  | 9700     | 0.0           | -               |
| 4.9267  | 9750     | 0.0           | -               |
| 4.9520  | 9800     | 0.0           | -               |
| 4.9773  | 9850     | 0.0           | -               |
| 5.0     | 9895     | -             | 0.0537          |

* The bold row denotes the saved checkpoint.
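The contrastive fine-tuning stage (CosineSimilarityLoss with an oversampling strategy) trains on sentence *pairs* rather than individual labeled texts: texts sharing a label become positive pairs, texts with different labels become negative pairs. The following is a simplified, pure-Python sketch of that pair-generation idea; the `make_pairs` helper and example texts are hypothetical and do not reproduce setfit's internal sampler.

```python
# Sketch of contrastive pair generation: same-label texts form positive
# pairs (target cosine similarity 1.0), different-label texts form
# negative pairs (target 0.0). Simplified illustration, not setfit's
# actual oversampling sampler.
from itertools import combinations

def make_pairs(texts, labels):
    pairs = []
    for (t1, l1), (t2, l2) in combinations(zip(texts, labels), 2):
        pairs.append((t1, t2, 1.0 if l1 == l2 else 0.0))
    return pairs

texts = ["paper on finite groups", "paper on rings", "paper on compilers"]
labels = ["math", "math", "pl"]
pairs = make_pairs(texts, labels)
# 3 texts yield 3 pairs: one positive (math/math) and two negative
```

This pairing is why SetFit works with so few examples per class: n labeled texts yield on the order of n² training pairs for the embedding model.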
### Framework Versions
- Python: 3.9.19
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.0.1
- Transformers: 4.39.0
- PyTorch: 2.4.0
- Datasets: 2.20.0
- Tokenizers: 0.15.2

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```