Articles tagged with: Seminar

  • Trace-of-Thought: Enhancing Arithmetic Reasoning in Small Language Models Through Distillation of Reasoning

    Join us Wednesday, May 8, for an online presentation:

    Speaker: Tyler McDonald

    Abstract: Large Language Models (LLMs) are an emergent tool in many disciplines, with commercially accessible language models providing a simple interface to help solve a variety of problems and daily tasks. While these models are optimized for various common tasks, such as arithmetic reasoning, the exhaustive employment of commercialized language models comes with many challenges, such as potential resource unavailability, cost burdens, and constraints on flexibility. To help cultivate support for open-source language modeling, an often more accessible alternative that allows flexibility in scale and size, we introduce Trace-of-Thought Prompting: a simple, zero-shot prompt engineering technique for creating observable and rational subproblems designed for delegation to smaller-scale systems. By harnessing the joint benefits of large-scale reasoning and small-scale accessibility, we aim to produce small language models (those below 7 billion parameters) capable of enhanced and transparent reasoning on arithmetic reasoning tasks. Our applications of Trace-of-Thought to these arithmetic reasoning problems yield gains as large as 125% on small language models, while providing observable reasoning pathways that are easily corrected on tasks of increasing difficulty. Our findings introduce a critical paradigm for leveraging the abilities of language models at a variety of scales for accessible problem solving.
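
    To make the delegation idea in the abstract concrete, here is a minimal, hypothetical sketch of a two-stage pipeline: a larger model is prompted (zero-shot) to write out an explicit trace of subproblems, and a smaller model solves each subproblem in turn with the intermediate results kept visible. The prompt wording, the helper names (decompose_prompt, solve_step_prompt, trace_of_thought), and the model-callable interface are illustrative assumptions only, not the implementation presented in the talk.

        # Illustrative sketch only; prompt wording and interfaces are assumptions,
        # not the authors' implementation of Trace-of-Thought Prompting.
        from typing import Callable, List

        def decompose_prompt(problem: str) -> str:
            # Zero-shot instruction asking the large model for numbered subproblems.
            return (
                "Break the following arithmetic problem into a numbered list of "
                "small, self-contained subproblems. Do not solve them.\n\n"
                f"Problem: {problem}"
            )

        def solve_step_prompt(problem: str, solved_so_far: List[str], step: str) -> str:
            # The small model sees the original problem, prior results, and one step.
            context = "\n".join(solved_so_far) if solved_so_far else "(none yet)"
            return (
                f"Problem: {problem}\n"
                f"Results so far:\n{context}\n"
                f"Now solve only this subproblem and state the result:\n{step}"
            )

        def trace_of_thought(
            problem: str,
            large_model: Callable[[str], str],   # e.g. a wrapper around a hosted LLM API
            small_model: Callable[[str], str],   # e.g. a local model below 7B parameters
        ) -> List[str]:
            # 1) The large model produces the observable reasoning trace (subproblems).
            trace = large_model(decompose_prompt(problem))
            steps = [line.strip() for line in trace.splitlines() if line.strip()]

            # 2) The small model is delegated each subproblem in order; every
            #    intermediate result stays visible and can be inspected or corrected.
            results: List[str] = []
            for step in steps:
                results.append(small_model(solve_step_prompt(problem, results, step)))
            return results

    Because each subproblem and its result are plain text, a user (or a larger model) can inspect the trace and correct an individual step without rerunning the whole pipeline, which is the observability benefit the abstract highlights.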

    Time: 11:00 – 11:45, Wednesday, May 8 (online)

    Teams Meeting info:

    Microsoft Teams
    Meeting ID: 289 571 693 504
    Passcode: DwRdJk

    Categories: News