
Search results



Hits: 218
211.
  • CERT: Continual Pre-Training on Sketches for Library-Oriented Code Generation
Zan, Daoguang; Chen, Bei; Yang, Dejian ... arXiv (Cornell University), 06/2022
    Paper, Journal Article
Open access

    Code generation is a longstanding challenge, aiming to generate a code snippet based on a natural language description. Usually, expensive text-code paired data is essential for training a code ...
Full text
212.
  • Input-Tuning: Adapting Unfamiliar Inputs to Frozen Pretrained Models
    An, Shengnan; Li, Yifei; Lin, Zeqi ... arXiv (Cornell University), 03/2022
    Paper, Journal Article
Open access

    Recently the prompt-tuning paradigm has attracted significant attention. By only tuning continuous prompts with a frozen pre-trained language model (PLM), prompt-tuning takes a step towards deploying ...
Full text
213.
  • Revisiting Iterative Back-Translation from the Perspective of Compositional Generalization
    Guo, Yinuo; Zhu, Hualei; Lin, Zeqi ... arXiv (Cornell University), 12/2020
    Paper, Journal Article
Open access

    Human intelligence exhibits compositional generalization (i.e., the capacity to understand and produce unseen combinations of seen components), but current neural seq2seq models lack such ability. In ...
Full text
214.
  • Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone
Abdin, Marah; Jacobs, Sam Ade; Awan, Ammar Ahmad ... arXiv.org, 04/2024
    Paper, Journal Article
Open access

    We introduce phi-3-mini, a 3.8 billion parameter language model trained on 3.3 trillion tokens, whose overall performance, as measured by both academic benchmarks and internal testing, rivals that of ...
Full text
215.
  • Learning Algebraic Recombination for Compositional Generalization
    Liu, Chenyao; An, Shengnan; Lin, Zeqi ... arXiv (Cornell University), 07/2021
    Paper, Journal Article
Open access

    Neural sequence models exhibit limited compositional generalization ability in semantic parsing tasks. Compositional generalization requires algebraic recombination, i.e., dynamically recombining ...
Full text
216.
  • Compositional Generalization by Learning Analytical Expressions
Liu, Qian; An, Shengnan; Lou, Jian-Guang ... arXiv (Cornell University), 10/2020
    Paper, Journal Article
Open access

    Compositional generalization is a basic and essential intellective capability of human beings, which allows us to recombine known parts readily. However, existing neural network based models have ...
Full text
217.
Full text
218.
Full text
