---
base_model: sentence-transformers/all-mpnet-base-v2
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
- dot_accuracy@1
- dot_accuracy@3
- dot_accuracy@5
- dot_accuracy@10
- dot_precision@1
- dot_precision@3
- dot_precision@5
- dot_precision@10
- dot_recall@1
- dot_recall@3
- dot_recall@5
- dot_recall@10
- dot_ndcg@10
- dot_mrr@10
- dot_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:714
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: What does the term 'rights, opportunities, or access' encompass
    in this framework?
  sentences:
  - "10 \nGAI systems can ease the unintentional production or dissemination of false,\
    \ inaccurate, or misleading \ncontent (misinformation) at scale, particularly\
    \ if the content stems from confabulations.  \nGAI systems can also ease the deliberate\
    \ production or dissemination of false or misleading information \n(disinformation)\
    \ at scale, where an actor has the explicit intent to deceive or cause harm to\
    \ others. Even \nvery subtle changes to text or images can manipulate human and\
    \ machine perception. \nSimilarly, GAI systems could enable a higher degree of\
    \ sophistication for malicious actors to produce \ndisinformation that is targeted\
    \ towards specific demographics. Current and emerging multimodal models \nmake\
    \ it possible to generate both text-based disinformation and highly realistic\
    \ “deepfakes” – that is, \nsynthetic audiovisual content and photorealistic images.12\
    \ Additional disinformation threats could be \nenabled by future GAI models trained\
    \ on new data modalities."
  - '74. See, e.g., Heather Morrison. Virtual Testing Puts Disabled Students at a
    Disadvantage. Government

    Technology. May 24, 2022.

    https://www.govtech.com/education/k-12/virtual-testing-puts-disabled-students-at-a-disadvantage;

    Lydia X. Z. Brown, Ridhi Shetty, Matt Scherer, and Andrew Crawford. Ableism And
    Disability

    Discrimination In New Surveillance Technologies: How new surveillance technologies
    in education,

    policing, health care, and the workplace disproportionately harm disabled people.
    Center for Democracy

    and Technology Report. May 24, 2022.

    https://cdt.org/insights/ableism-and-disability-discrimination-in-new-surveillance-technologies-how­

    new-surveillance-technologies-in-education-policing-health-care-and-the-workplace­

    disproportionately-harm-disabled-people/

    69'
  - "persons, Asian Americans and Pacific Islanders and other persons of color; members\
    \ of religious minorities; \nwomen, girls, and non-binary people; lesbian, gay,\
    \ bisexual, transgender, queer, and intersex (LGBTQI+) \npersons; older adults;\
    \ persons with disabilities; persons who live in rural areas; and persons otherwise\
    \ adversely \naffected by persistent poverty or inequality. \nRIGHTS, OPPORTUNITIES,\
    \ OR ACCESS: “Rights, opportunities, or access” is used to indicate the scoping\
    \ \nof this framework. It describes the set of: civil rights, civil liberties,\
    \ and privacy, including freedom of speech, \nvoting, and protections from discrimination,\
    \ excessive punishment, unlawful surveillance, and violations of \nprivacy and\
    \ other freedoms in both public and private sector contexts; equal opportunities,\
    \ including equitable \naccess to education, housing, credit, employment, and\
    \ other programs; or, access to critical resources or"
- source_sentence: What are some broad negative risks associated with GAI design,
    development, and deployment?
  sentences:
  - "actually occurring, or large-scale risks could occur); and broad GAI negative\
    \ risks, \nincluding: Immature safety or risk cultures related to AI and GAI design,\
    \ \ndevelopment and deployment, public information integrity risks, including\
    \ impacts \non democratic processes, unknown long-term performance characteristics\
    \ of GAI. \nInformation Integrity; Dangerous, \nViolent, or Hateful Content; CBRN\
    \ \nInformation or Capabilities \nGV-1.3-007 Devise a plan to halt development\
    \ or deployment of a GAI system that poses \nunacceptable negative risk. \nCBRN\
    \ Information and Capability; \nInformation Security; Information \nIntegrity\
    \ \nAI Actor Tasks: Governance and Oversight \n \nGOVERN 1.4: The risk management\
    \ process and its outcomes are established through transparent policies, procedures,\
    \ and other \ncontrols based on organizational risk priorities. \nAction ID \n\
    Suggested Action \nGAI Risks \nGV-1.4-001 \nEstablish policies and mechanisms\
    \ to prevent GAI systems from generating"
  - "39 \nMS-3.3-004 \nProvide input for training materials about the capabilities\
    \ and limitations of GAI \nsystems related to digital content transparency for\
    \ AI Actors, other \nprofessionals, and the public about the societal impacts\
    \ of AI and the role of \ndiverse and inclusive content generation. \nHuman-AI\
    \ Configuration; \nInformation Integrity; Harmful Bias \nand Homogenization \n\
    MS-3.3-005 \nRecord and integrate structured feedback about content provenance\
    \ from \noperators, users, and potentially impacted communities through the use\
    \ of \nmethods such as user research studies, focus groups, or community forums.\
    \ \nActively seek feedback on generated content quality and potential biases.\
    \ \nAssess the general awareness among end users and impacted communities \nabout\
    \ the availability of these feedback channels. \nHuman-AI Configuration; \nInformation\
    \ Integrity; Harmful Bias \nand Homogenization \nAI Actor Tasks: AI Deployment,\
    \ Affected Individuals and Communities, End-Users, Operation and Monitoring, TEVV"
  - "NOTICE & \nEXPLANATION \nWHY THIS PRINCIPLE IS IMPORTANT\nThis section provides\
    \ a brief summary of the problems which the principle seeks to address and protect\
    \ \nagainst, including illustrative examples. \nAutomated systems now determine\
    \ opportunities, from employment to credit, and directly shape the American \n\
    public’s experiences, from the courtroom to online classrooms, in ways that profoundly\
    \ impact people’s lives. But this \nexpansive impact is not always visible. An\
    \ applicant might not know whether a person rejected their resume or a \nhiring\
    \ algorithm moved them to the bottom of the list. A defendant in the courtroom\
    \ might not know if a judge deny­\ning their bail is informed by an automated\
    \ system that labeled them “high risk.” From correcting errors to contesting \n\
    decisions, people are often denied the knowledge they need to address the impact\
    \ of automated systems on their lives."
- source_sentence: Who should conduct the assessment of the impact of surveillance
    on rights and opportunities?
  sentences:
  - "APPENDIX\n•\nJulia Simon-Mishel, Supervising Attorney, Philadelphia Legal Assistance\n\
    •\nDr. Zachary Mahafza, Research & Data Analyst, Southern Poverty Law Center\n\
    •\nJ. Khadijah Abdurahman, Tech Impact Network Research Fellow, AI Now Institute,\
    \ UCLA C2I1, and\nUWA Law School\nPanelists separately described the increasing\
    \ scope of technology use in providing for social welfare, including \nin fraud\
    \ detection, digital ID systems, and other methods focused on improving efficiency\
    \ and reducing cost. \nHowever, various panelists individually cautioned that\
    \ these systems may reduce burden for government \nagencies by increasing the\
    \ burden and agency of people using and interacting with these technologies. \n\
    Additionally, these systems can produce feedback loops and compounded harm, collecting\
    \ data from \ncommunities and using it to reinforce inequality. Various panelists\
    \ suggested that these harms could be"
  - "assessments, including data retention timelines and associated justification,\
    \ and an assessment of the \nimpact of surveillance or data collection on rights,\
    \ opportunities, and access. Where possible, this \nassessment of the impact of\
    \ surveillance should be done by an independent party. Reporting should be \n\
    provided in a clear and machine-readable manner.  \n35"
  - "access to education, housing, credit, employment, and other programs; or, access\
    \ to critical resources or \nservices, such as healthcare, financial services,\
    \ safety, social services, non-deceptive information about goods \nand services,\
    \ and government benefits. \n10"
- source_sentence: How can voting-related systems impact privacy and security?
  sentences:
  - "as custody and divorce information, and home, work, or school environmental data);\
    \ or have the reasonable potential \nto be used in ways that are likely to expose\
    \ individuals to meaningful harm, such as a loss of privacy or financial harm\
    \ \ndue to identity theft. Data and metadata generated by or about those who are\
    \ not yet legal adults is also sensitive, even \nif not related to a sensitive\
    \ domain. Such data includes, but is not limited to, numerical, text, image, audio,\
    \ or video \ndata. “Sensitive domains” are those in which activities being conducted\
    \ can cause material harms, including signifi­\ncant adverse effects on human\
    \ rights such as autonomy and dignity, as well as civil liberties and civil rights.\
    \ Domains \nthat have historically been singled out as deserving of enhanced data\
    \ protections or where such enhanced protections \nare reasonably expected by\
    \ the public include, but are not limited to, health, family planning and care,\
    \ employment,"
  - "agreed upon the importance of advisory boards and compensated community input\
    \ early in the design process \n(before the technology is built and instituted).\
    \ Various panelists also emphasized the importance of regulation \nthat includes\
    \ limits to the type and cost of such technologies. \n56"
  - "Surveillance and criminal justice system algorithms such as risk assessments,\
    \ predictive  \n    policing, automated license plate readers, real-time facial\
    \ recognition systems (especially  \n    those used in public places or during\
    \ protected activities like peaceful protests), social media  \n    monitoring,\
    \ and ankle monitoring devices; \nVoting-related systems such as signature matching\
    \ tools; \nSystems with a potential privacy impact such as smart home systems\
    \ and associated data,  \n    systems that use or collect health-related data,\
    \ systems that use or collect education-related  \n    data, criminal justice\
    \ system data, ad-targeting systems, and systems that perform big data  \n   \
    \ analytics in order to build profiles or infer personal information about individuals;\
    \ and \nAny system that has the meaningful potential to lead to algorithmic discrimination.\
    \ \n• Equal opportunities, including but not limited to:"
- source_sentence: What impact do automated systems have on underserved communities?
  sentences:
  - "generation, summarization, search, and chat. These activities can take place\
    \ within organizational \nsettings or in the public domain. \nOrganizations can\
    \ restrict AI applications that cause harm, exceed stated risk tolerances, or\
    \ that conflict \nwith their tolerances or values. Governance tools and protocols\
    \ that are applied to other types of AI \nsystems can be applied to GAI systems.\
    \ These plans and actions include: \n• Accessibility and reasonable \naccommodations\
    \ \n• AI actor credentials and qualifications  \n• Alignment to organizational\
    \ values \n• Auditing and assessment \n• Change-management controls \n• Commercial\
    \ use \n• Data provenance"
  - "automated systems make on underserved communities and to institute proactive\
    \ protections that support these \ncommunities. \n•\nAn automated system using\
    \ nontraditional factors such as educational attainment and employment history\
    \ as\npart of its loan underwriting and pricing model was found to be much more\
    \ likely to charge an applicant who\nattended a Historically Black College or\
    \ University (HBCU) higher loan prices for refinancing a student loan\nthan an\
    \ applicant who did not attend an HBCU. This was found to be true even when controlling\
    \ for\nother credit-related factors.32\n•\nA hiring tool that learned the features\
    \ of a company's employees (predominantly men) rejected women appli­\ncants for\
    \ spurious and discriminatory reasons; resumes with the word “women’s,” such as\
    \ “women’s\nchess club captain,” were penalized in the candidate ranking.33\n\
    •\nA predictive model marketed as being able to predict whether students are likely\
    \ to drop out of school was"
  - "on a principle of local control, such that those individuals closest to the data\
    \ subject have more access while \nthose who are less proximate do not (e.g.,\
    \ a teacher has access to their students’ daily progress data while a \nsuperintendent\
    \ does not). \nReporting. In addition to the reporting on data privacy (as listed\
    \ above for non-sensitive data), entities devel-\noping technologies related to\
    \ a sensitive domain and those collecting, using, storing, or sharing sensitive\
    \ data \nshould, whenever appropriate, regularly provide public reports describing:\
    \ any data security lapses or breaches \nthat resulted in sensitive data leaks;\
    \ the number, type, and outcomes of ethical pre-reviews undertaken; a \ndescription\
    \ of any data sold, shared, or made public, and how that data was assessed to\
    \ determine it did not pres-\nent a sensitive data risk; and ongoing risk identification\
    \ and management procedures, and any mitigation added"
model-index:
- name: SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: Unknown
      type: unknown
    metrics:
    - type: cosine_accuracy@1
      value: 0.8881578947368421
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.9868421052631579
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.9868421052631579
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 1.0
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.8881578947368421
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.32894736842105265
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.19736842105263155
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09999999999999999
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.8881578947368421
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.9868421052631579
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.9868421052631579
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 1.0
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.9499393562918366
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.9331140350877194
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.9331140350877194
      name: Cosine Map@100
    - type: dot_accuracy@1
      value: 0.8881578947368421
      name: Dot Accuracy@1
    - type: dot_accuracy@3
      value: 0.9868421052631579
      name: Dot Accuracy@3
    - type: dot_accuracy@5
      value: 0.9868421052631579
      name: Dot Accuracy@5
    - type: dot_accuracy@10
      value: 1.0
      name: Dot Accuracy@10
    - type: dot_precision@1
      value: 0.8881578947368421
      name: Dot Precision@1
    - type: dot_precision@3
      value: 0.32894736842105265
      name: Dot Precision@3
    - type: dot_precision@5
      value: 0.19736842105263155
      name: Dot Precision@5
    - type: dot_precision@10
      value: 0.09999999999999999
      name: Dot Precision@10
    - type: dot_recall@1
      value: 0.8881578947368421
      name: Dot Recall@1
    - type: dot_recall@3
      value: 0.9868421052631579
      name: Dot Recall@3
    - type: dot_recall@5
      value: 0.9868421052631579
      name: Dot Recall@5
    - type: dot_recall@10
      value: 1.0
      name: Dot Recall@10
    - type: dot_ndcg@10
      value: 0.9499393562918366
      name: Dot Ndcg@10
    - type: dot_mrr@10
      value: 0.9331140350877194
      name: Dot Mrr@10
    - type: dot_map@100
      value: 0.9331140350877194
      name: Dot Map@100
    - type: cosine_accuracy@1
      value: 0.8828125
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.96875
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.9921875
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 1.0
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.8828125
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.32291666666666663
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.19843750000000004
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.10000000000000002
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.8828125
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.96875
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.9921875
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 1.0
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.9458381646710927
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.9279296875
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.9279296875
      name: Cosine Map@100
    - type: dot_accuracy@1
      value: 0.8828125
      name: Dot Accuracy@1
    - type: dot_accuracy@3
      value: 0.96875
      name: Dot Accuracy@3
    - type: dot_accuracy@5
      value: 0.9921875
      name: Dot Accuracy@5
    - type: dot_accuracy@10
      value: 1.0
      name: Dot Accuracy@10
    - type: dot_precision@1
      value: 0.8828125
      name: Dot Precision@1
    - type: dot_precision@3
      value: 0.32291666666666663
      name: Dot Precision@3
    - type: dot_precision@5
      value: 0.19843750000000004
      name: Dot Precision@5
    - type: dot_precision@10
      value: 0.10000000000000002
      name: Dot Precision@10
    - type: dot_recall@1
      value: 0.8828125
      name: Dot Recall@1
    - type: dot_recall@3
      value: 0.96875
      name: Dot Recall@3
    - type: dot_recall@5
      value: 0.9921875
      name: Dot Recall@5
    - type: dot_recall@10
      value: 1.0
      name: Dot Recall@10
    - type: dot_ndcg@10
      value: 0.9458381646710927
      name: Dot Ndcg@10
    - type: dot_mrr@10
      value: 0.9279296875
      name: Dot Mrr@10
    - type: dot_map@100
      value: 0.9279296875
      name: Dot Map@100
---

# SentenceTransformer based on sentence-transformers/all-mpnet-base-v2

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) <!-- at revision 84f2bcc00d77236f9e89c8a360a00fb1139bf47d -->
- **Maximum Sequence Length:** 384 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 384, 'do_lower_case': False}) with Transformer model: MPNetModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
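
The trailing `Normalize()` module L2-normalizes every embedding, so dot-product similarity coincides with cosine similarity for this model; this is why the `dot_*` and `cosine_*` metrics reported below are identical. A quick check:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jet-taekyo/mpnet_finetuned_semantic")
emb = model.encode(["Any sentence will do."])
print(np.linalg.norm(emb, axis=1))  # ~[1.0]: embeddings come out unit-length
```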

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("jet-taekyo/mpnet_finetuned_semantic")
# Run inference
sentences = [
    'What impact do automated systems have on underserved communities?',
    "automated systems make on underserved communities and to institute proactive protections that support these \ncommunities. \n•\nAn automated system using nontraditional factors such as educational attainment and employment history as\npart of its loan underwriting and pricing model was found to be much more likely to charge an applicant who\nattended a Historically Black College or University (HBCU) higher loan prices for refinancing a student loan\nthan an applicant who did not attend an HBCU. This was found to be true even when controlling for\nother credit-related factors.32\n•\nA hiring tool that learned the features of a company's employees (predominantly men) rejected women appli\xad\ncants for spurious and discriminatory reasons; resumes with the word “women’s,” such as “women’s\nchess club captain,” were penalized in the candidate ranking.33\n•\nA predictive model marketed as being able to predict whether students are likely to drop out of school was",
    'on a principle of local control, such that those individuals closest to the data subject have more access while \nthose who are less proximate do not (e.g., a teacher has access to their students’ daily progress data while a \nsuperintendent does not). \nReporting. In addition to the reporting on data privacy (as listed above for non-sensitive data), entities devel-\noping technologies related to a sensitive domain and those collecting, using, storing, or sharing sensitive data \nshould, whenever appropriate, regularly provide public reports describing: any data security lapses or breaches \nthat resulted in sensitive data leaks; the number, type, and outcomes of ethical pre-reviews undertaken; a \ndescription of any data sold, shared, or made public, and how that data was assessed to determine it did not pres-\nent a sensitive data risk; and ongoing risk identification and management procedures, and any mitigation added',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
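
Because the model was trained with `MatryoshkaLoss` over the dimensions 768, 512, 256, 128, and 64 (see Training Details below), its embeddings can also be truncated to a smaller dimensionality at load time with only a modest quality drop. A minimal sketch, assuming a Sentence Transformers version that supports `truncate_dim` (v2.7+):

```python
from sentence_transformers import SentenceTransformer

# Keep only the first 256 Matryoshka dimensions of each embedding
model = SentenceTransformer("jet-taekyo/mpnet_finetuned_semantic", truncate_dim=256)
embeddings = model.encode(["What impact do automated systems have on underserved communities?"])
print(embeddings.shape)
# (1, 256)
```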

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.8882     |
| cosine_accuracy@3   | 0.9868     |
| cosine_accuracy@5   | 0.9868     |
| cosine_accuracy@10  | 1.0        |
| cosine_precision@1  | 0.8882     |
| cosine_precision@3  | 0.3289     |
| cosine_precision@5  | 0.1974     |
| cosine_precision@10 | 0.1        |
| cosine_recall@1     | 0.8882     |
| cosine_recall@3     | 0.9868     |
| cosine_recall@5     | 0.9868     |
| cosine_recall@10    | 1.0        |
| cosine_ndcg@10      | 0.9499     |
| cosine_mrr@10       | 0.9331     |
| **cosine_map@100**  | **0.9331** |
| dot_accuracy@1      | 0.8882     |
| dot_accuracy@3      | 0.9868     |
| dot_accuracy@5      | 0.9868     |
| dot_accuracy@10     | 1.0        |
| dot_precision@1     | 0.8882     |
| dot_precision@3     | 0.3289     |
| dot_precision@5     | 0.1974     |
| dot_precision@10    | 0.1        |
| dot_recall@1        | 0.8882     |
| dot_recall@3        | 0.9868     |
| dot_recall@5        | 0.9868     |
| dot_recall@10       | 1.0        |
| dot_ndcg@10         | 0.9499     |
| dot_mrr@10          | 0.9331     |
| dot_map@100         | 0.9331     |

#### Information Retrieval

* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.8828     |
| cosine_accuracy@3   | 0.9688     |
| cosine_accuracy@5   | 0.9922     |
| cosine_accuracy@10  | 1.0        |
| cosine_precision@1  | 0.8828     |
| cosine_precision@3  | 0.3229     |
| cosine_precision@5  | 0.1984     |
| cosine_precision@10 | 0.1        |
| cosine_recall@1     | 0.8828     |
| cosine_recall@3     | 0.9688     |
| cosine_recall@5     | 0.9922     |
| cosine_recall@10    | 1.0        |
| cosine_ndcg@10      | 0.9458     |
| cosine_mrr@10       | 0.9279     |
| **cosine_map@100**  | **0.9279** |
| dot_accuracy@1      | 0.8828     |
| dot_accuracy@3      | 0.9688     |
| dot_accuracy@5      | 0.9922     |
| dot_accuracy@10     | 1.0        |
| dot_precision@1     | 0.8828     |
| dot_precision@3     | 0.3229     |
| dot_precision@5     | 0.1984     |
| dot_precision@10    | 0.1        |
| dot_recall@1        | 0.8828     |
| dot_recall@3        | 0.9688     |
| dot_recall@5        | 0.9922     |
| dot_recall@10       | 1.0        |
| dot_ndcg@10         | 0.9458     |
| dot_mrr@10          | 0.9279     |
| dot_map@100         | 0.9279     |
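
Metrics like these can be reproduced with an `InformationRetrievalEvaluator` built from a query-to-relevant-passage mapping. A minimal sketch; the `queries`, `corpus`, and `relevant_docs` contents below are illustrative placeholders, not the actual held-out split:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("jet-taekyo/mpnet_finetuned_semantic")

# Placeholder data: query IDs to text, doc IDs to text, and query IDs to relevant doc IDs
queries = {"q1": "What impact do automated systems have on underserved communities?"}
corpus = {"d1": "automated systems make on underserved communities and to institute proactive protections ..."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs)
results = evaluator(model)  # dict of accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100
```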

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset


* Size: 714 training samples
* Columns: <code>sentence_0</code> and <code>sentence_1</code>
* Approximate statistics based on the first 714 samples:
  |         | sentence_0                                                                       | sentence_1                                                                          |
  |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                              |
  | details | <ul><li>min: 7 tokens</li><li>mean: 17.7 tokens</li><li>max: 36 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 176.29 tokens</li><li>max: 384 tokens</li></ul> |
* Samples:
  | sentence_0                                                                                 | sentence_1                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   |
  |:-------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>What are the key characteristics of high-integrity information?</code>               | <code>This information can be linked to the original source(s) with appropriate evidence. High-integrity <br>information is also accurate and reliable, can be verified and authenticated, has a clear chain of custody, <br>and creates reasonable expectations about when its validity may expire.”11 <br> <br> <br>11 This definition of information integrity is derived from the 2022 White House Roadmap for Researchers on <br>Priorities Related to Information Integrity Research and Development. </code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            |
  | <code>How can the validity of information be verified and authenticated?</code>            | <code>This information can be linked to the original source(s) with appropriate evidence. High-integrity <br>information is also accurate and reliable, can be verified and authenticated, has a clear chain of custody, <br>and creates reasonable expectations about when its validity may expire.”11 <br> <br> <br>11 This definition of information integrity is derived from the 2022 White House Roadmap for Researchers on <br>Priorities Related to Information Integrity Research and Development. </code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            |
  | <code>What should trigger the use of a human alternative in the attainment process?</code> | <code>In many scenarios, there is a reasonable expectation <br>of human involvement in attaining rights, opportunities, or access. When automated systems make up part of <br>the attainment process, alternative timely human-driven processes should be provided. The use of a human <br>alternative should be triggered by an opt-out process. Timely and not burdensome human alternative. Opting out should be timely and not unreasonably <br>burdensome in both the process of requesting to opt-out and the human-driven alternative provided. Provide timely human consideration and remedy by a fallback and escalation system in the <br>event that an automated system fails, produces error, or you would like to appeal or con­<br>test its impacts on you <br>Proportionate. The availability of human consideration and fallback, along with associated training and <br>safeguards against human bias, should be proportionate to the potential of the automated system to meaning­<br>fully impact rights, opportunities, or access. Automated systems that have greater control over outcomes, <br>provide input to high-stakes decisions, relate to sensitive domains, or otherwise have a greater potential to <br>meaningfully impact rights, opportunities, or access should have greater availability (e.g., staffing) and over­<br>sight of human consideration and fallback mechanisms. Accessible. Mechanisms for human consideration and fallback, whether in-person, on paper, by phone, or <br>otherwise provided, should be easy to find and use. These mechanisms should be tested to ensure that users <br>who have trouble with the automated system are able to use human consideration and fallback, with the under­<br>standing that it may be these users who are most likely to need the human assistance. Similarly, it should be <br>tested to ensure that users with disabilities are able to find and use human consideration and fallback and also <br>request reasonable accommodations or modifications. Convenient. Mechanisms for human consideration and fallback should not be unreasonably burdensome as <br>compared to the automated system’s equivalent. 49<br></code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesRankingLoss",
      "matryoshka_dims": [
          768,
          512,
          256,
          128,
          64
      ],
      "matryoshka_weights": [
          1,
          1,
          1,
          1,
          1
      ],
      "n_dims_per_step": -1
  }
  ```
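
In code, this configuration corresponds to wrapping `MultipleNegativesRankingLoss` (in-batch negatives over the `(sentence_0, sentence_1)` pairs) in `MatryoshkaLoss`, which applies the inner loss at each truncated dimensionality. A sketch:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[768, 512, 256, 128, 64])
```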

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 20
- `per_device_eval_batch_size`: 20
- `num_train_epochs`: 5
- `multi_dataset_batch_sampler`: round_robin
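
For reference, a training setup matching these non-default values might look like the sketch below; the training pair and output path are illustrative placeholders:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
loss = MatryoshkaLoss(
    model, MultipleNegativesRankingLoss(model), matryoshka_dims=[768, 512, 256, 128, 64]
)

# Illustrative pair; the actual (unnamed) dataset had 714 samples
train_dataset = Dataset.from_dict({
    "sentence_0": ["What are the key characteristics of high-integrity information?"],
    "sentence_1": ["This information can be linked to the original source(s) ..."],
})

args = SentenceTransformerTrainingArguments(
    output_dir="mpnet_finetuned_semantic",  # illustrative output path
    num_train_epochs=5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=20,
    eval_strategy="steps",
    multi_dataset_batch_sampler="round_robin",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # illustrative; use a held-out split in practice
    loss=loss,
)
trainer.train()
```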

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 20
- `per_device_eval_batch_size`: 20
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin

</details>

### Training Logs

The epoch counter restarts at 1.0 partway through the table below, so the rows appear to cover two separate runs; their final cosine_map@100 values (0.9331 and 0.9279) match the two metrics tables above.

| Epoch  | Step | cosine_map@100 |
|:------:|:----:|:--------------:|
| 1.0    | 36   | 0.9395         |
| 1.3889 | 50   | 0.9320         |
| 2.0    | 72   | 0.9298         |
| 2.7778 | 100  | 0.9348         |
| 3.0    | 108  | 0.9304         |
| 4.0    | 144  | 0.9342         |
| 4.1667 | 150  | 0.9342         |
| 5.0    | 180  | 0.9331         |
| 1.0    | 31   | 0.9163         |
| 1.6129 | 50   | 0.9279         |


### Framework Versions
- Python: 3.11.9
- Sentence Transformers: 3.1.0
- Transformers: 4.44.2
- PyTorch: 2.4.1+cu121
- Accelerate: 0.34.2
- Datasets: 3.0.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->