Training Data

By: Sequoia Capital
  • Summary

  • Join us as we train our neural nets on the theme of the century: AI. Sonya Huang, Pat Grady and more Sequoia Capital partners host conversations with leading AI builders and researchers to ask critical questions and develop a deeper understanding of the evolving technologies—and their implications for technology, business and society. The content of this podcast does not constitute investment advice, an offer to provide investment advisory services, or an offer to sell or solicitation of an offer to buy an interest in any investment fund.
Episodes
  • Sierra Co-Founder Clay Bavor on Making Customer-Facing AI Agents Delightful
    Aug 27 2024
    Customer service is hands down the first killer app of generative AI for businesses. The reasons are simple: the costs of existing solutions are so high, the satisfaction so low and the margin for ROI so wide. But trusting your interactions with customers to hallucination-prone LLMs can be daunting. Enter Sierra. Co-founder Clay Bavor walks us through the sophisticated engineering challenges his team solved along the way to delivering AI agents for all aspects of the customer experience that are delightful, safe and reliable—and that are being deployed widely by Sierra’s customers. The company’s AgentOS enables businesses to create branded AI agents that interact with customers, follow nuanced policies and even handle customer retention and upsell. Clay describes how companies can capture their brand voice, values and internal processes to create AI agents that truly represent the business.
    Hosted by: Ravi Gupta and Pat Grady, Sequoia Capital
    Mentioned in this episode:
    • Bret Taylor: co-founder of Sierra
    • Towards a Human-like Open-Domain Chatbot: 2020 Google paper that introduced Meena, a predecessor of ChatGPT (followed by LaMDA in 2021)
    • PaLM: Scaling Language Modeling with Pathways: 2022 Google paper about their unreleased 540B-parameter transformer model (GPT-3, at the time, had 175B)
    • Avocado chair: images generated by OpenAI’s DALL·E model in 2022
    • Large Language Models Understand and Can be Enhanced by Emotional Stimuli: 2023 Microsoft paper on how models like GPT-4 can be manipulated into providing better results
    • 𝛕-bench: A Benchmark for Tool-Agent-User Interaction in Real-World Domains: 2024 paper authored by the Sierra research team, led by Karthik Narasimhan (co-author of the 2022 ReAct paper and the 2023 Reflexion paper)
    Timestamps:
    00:00:00 Introduction
    00:01:21 Clay’s background
    00:03:20 Google before the ChatGPT moment
    00:07:31 What is Sierra?
    00:12:03 What’s possible now that wasn’t possible 18 months ago?
    00:17:11 AgentOS
    00:23:45 The solution to many problems with AI is more AI
    00:28:37 𝛕-bench
    00:33:19 Engineering task vs research task
    00:37:27 What tasks can you trust an agent with now?
    00:43:21 What metrics will move?
    00:46:22 The reality of deploying AI to customers today
    00:53:33 The experience manager
    01:03:54 Outcome-based pricing
    01:05:55 Lightning Round
    1 hr 13 min
  • Phaidra’s Jim Gao on Building the Fourth Industrial Revolution with Reinforcement Learning
    Aug 20 2024
    After AlphaGo beat Lee Sedol, a young mechanical engineer at Google thought of another game reinforcement learning could win: energy optimization at data centers. Jim Gao convinced his bosses on the Google data center team to let him work with the DeepMind team to try. The initial pilot resulted in a 40% energy savings and led him and his co-founders to start Phaidra to turn this technology into a product. Jim discusses the challenges of AI readiness in industrial settings and how we have to build on top of the control systems of the 70s and 80s to achieve the promise of the Fourth Industrial Revolution. He believes this new world of self-learning systems and self-improving infrastructure is a key factor in addressing global climate change.
    Hosted by: Sonya Huang and Pat Grady, Sequoia Capital
    Mentioned in this episode:
    • Mustafa Suleyman: co-founder of DeepMind and Inflection AI and currently CEO of Microsoft AI, known to his friends as “Moose”
    • Joe Kava: Google VP of data centers to whom Jim sent his initial email pitching the idea that would eventually become Phaidra
    • Constrained optimization: the class of problem that reinforcement learning can be applied to in real-world systems
    • Vedavyas Panneershelvam: co-founder and CTO of Phaidra; one of the original engineers on the AlphaGo project
    • Katie Hoffman: co-founder, President and COO of Phaidra
    • Demis Hassabis: CEO of DeepMind
    51 min
  • Fireworks Founder Lin Qiao on How Fast Inference and Small Models Will Benefit Businesses
    Aug 13 2024
    In the first wave of the generative AI revolution, startups and enterprises built on top of the best closed-source models available, mostly from OpenAI. As the AI customer journey moves from training to inference and these first products find PMF, many are hitting a wall on latency and cost. Fireworks Founder and CEO Lin Qiao led the PyTorch team at Meta that rebuilt the whole stack to meet the complex needs of the world’s largest B2C company. Meta moved PyTorch to its own non-profit foundation in 2022, and Lin started Fireworks with the mission to compress the timeframe of training and inference and democratize access to GenAI beyond the hyperscalers to let a diversity of AI applications thrive. Lin predicts when open and closed source models will converge and reveals her goal to build simple API access to the totality of knowledge.
    Hosted by: Sonya Huang and Pat Grady, Sequoia Capital
    Mentioned in this episode:
    • PyTorch: the leading framework for building deep learning models, originated at Meta and now under the Linux Foundation umbrella
    • Caffe2 and ONNX: ML frameworks Meta used that PyTorch eventually replaced
    • Conservation of complexity: the idea that every computer application has inherent complexity that cannot be reduced but merely moved between the backend and frontend, originated by Xerox PARC researcher Larry Tesler
    • Mixture of Experts: a class of transformer models that route requests between different subsets of a model based on use case
    • Fathom: a product the Fireworks team uses for video conference summarization
    • LMSYS Chatbot Arena: crowdsourced open platform for LLM evals hosted on Hugging Face
    Timestamps:
    00:00 - Introduction
    02:01 - What is Fireworks?
    02:48 - Leading PyTorch
    05:01 - What do researchers like about PyTorch?
    07:50 - How Fireworks compares to open source
    10:38 - Simplicity scales
    12:51 - From training to inference
    17:46 - Will open and closed source converge?
    22:18 - Can you match OpenAI on the Fireworks stack?
    26:53 - What is your vision for the Fireworks platform?
    31:17 - Competition for Nvidia?
    32:47 - Are returns to scale starting to slow down?
    34:28 - Competition
    36:32 - Lightning round
    39 min

What listeners say about Training Data

Average customer ratings
