• 801: Merged LLMs Are Smaller And More Capable, with Arcee AI's Mark McQuade and Charles Goddard

  • Jul 16 2024
  • Length: 1 hr and 17 mins
  • Podcast

  • Summary

  • Merged LLMs are the future, and Mark McQuade and Charles Goddard of Arcee AI join Jon Krohn on this episode to explore how. Learn how to combine multiple LLMs without adding bulk, train more efficiently, and compare expert approaches such as Mixture of Experts and Mixture of Agents. Discover how smaller models can outperform larger ones and how open-source projects can deliver big enterprise wins. This episode is packed with must-know insights for data scientists and ML engineers. Don’t miss out!

    Interested in sponsoring a SuperDataScience Podcast episode? Email natalie@superdatascience.com for sponsorship information.

    In this episode you will learn:
    • Charles' job title, Chief of Frontier Research, explained [03:31]
    • Model merging: combining multiple LLMs without increasing model size [04:43]
    • Using MergeKit for model merging (see the sketch after this list) [14:49]
    • Evolutionary model merging: using evolutionary algorithms to find merge recipes [22:55]
    • Commercial applications and success stories [28:10]
    • Mixture of Experts (MoE) vs. Mixture of Agents [37:57]
    • The Spectrum project: efficient training by targeting specific modules [54:28]
    • The future of Small Language Models (SLMs) and their advantages [01:01:22]
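    For listeners who want to try model merging hands-on, below is a minimal sketch of driving Arcee AI's open-source MergeKit from Python. The slices/merge_method config layout and the mergekit-yaml CLI come from the MergeKit project; the specific model names, layer count, and interpolation factor are illustrative assumptions, not settings discussed in the episode.

    # A minimal MergeKit sketch: SLERP-merge two same-architecture 7B models.
    # Model names, layer count, and t=0.5 are illustrative assumptions.
    import subprocess
    import textwrap

    config = textwrap.dedent("""\
        slices:
          - sources:
              - model: mistralai/Mistral-7B-v0.1      # assumed example model
                layer_range: [0, 32]
              - model: HuggingFaceH4/zephyr-7b-beta   # assumed example model
                layer_range: [0, 32]
        merge_method: slerp
        base_model: mistralai/Mistral-7B-v0.1
        parameters:
          t: 0.5          # interpolation factor between the two parents
        dtype: bfloat16
        """)

    with open("merge_config.yml", "w") as f:
        f.write(config)

    # mergekit-yaml reads the config and writes the merged model to ./merged-model.
    subprocess.run(["mergekit-yaml", "merge_config.yml", "./merged-model"], check=True)

    The merged checkpoint has the footprint of a single 7B parent rather than the sum of both, which is the "smaller and more capable" point the episode title refers to.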

    Additional materials: www.superdatascience.com/801
