Episodes

  • #110 Unpacking Bayesian Methods in AI with Sam Duffield
    Jul 10 2024

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    • My Intuitive Bayes Online Courses
    • 1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    • Use mini-batch methods to efficiently process large datasets within Bayesian frameworks in enterprise AI applications.
    • Apply approximate inference techniques, such as stochastic gradient MCMC and the Laplace approximation, to make Bayesian analysis tractable in practical settings (a minimal sketch follows this list).
    • Explore thermodynamic computing to significantly speed up Bayesian computations, enhancing model efficiency and scalability.
    • Leverage the Posteriors Python package for flexible and integrated Bayesian analysis in modern machine learning workflows.
    • Overcome challenges in Bayesian inference by simplifying complex concepts for non-expert audiences, ensuring the practical application of statistical models.
    • Address the intricacies of model assumptions and communicate effectively to non-technical stakeholders to enhance decision-making processes.
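
    To make the mini-batch / stochastic gradient MCMC idea concrete, here is a minimal, generic sketch of stochastic gradient Langevin dynamics (SGLD) on a toy Bayesian logistic regression, written in plain PyTorch. It is not the Posteriors API discussed in the episode, just an illustration of the update rule; the data, step size, and batch size are made up for the example.

    ```python
    import torch

    # Toy data: 10,000 points, 5 features, synthetic labels (illustrative only).
    torch.manual_seed(0)
    N, D = 10_000, 5
    X = torch.randn(N, D)
    true_w = torch.randn(D)
    y = torch.bernoulli(torch.sigmoid(X @ true_w))

    def log_joint(w, xb, yb):
        """Mini-batch estimate of log p(w) + (N / batch_size) * log p(y_batch | w)."""
        log_prior = -0.5 * (w ** 2).sum()  # standard normal prior on the weights
        log_lik = -torch.nn.functional.binary_cross_entropy_with_logits(
            xb @ w, yb, reduction="sum"
        )
        return log_prior + (N / xb.shape[0]) * log_lik

    # SGLD: half a gradient step on the log joint, plus Gaussian noise scaled
    # by the square root of the step size. Each iteration only sees a mini-batch.
    w = torch.zeros(D, requires_grad=True)
    step, batch = 1e-4, 256
    samples = []
    for t in range(2_000):
        idx = torch.randint(0, N, (batch,))
        grad, = torch.autograd.grad(-log_joint(w, X[idx], y[idx]), w)
        with torch.no_grad():
            w -= 0.5 * step * grad
            w += step ** 0.5 * torch.randn_like(w)
        if t >= 1_000:  # keep the second half as (approximate) posterior draws
            samples.append(w.detach().clone())

    posterior = torch.stack(samples)
    print("posterior mean:", posterior.mean(0))
    print("posterior sd:  ", posterior.std(0))
    ```

    The same loop structure (one gradient on a mini-batch, plus injected noise) is what lets Bayesian updates scale to datasets that never fit in memory at once.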

    Chapters:

    00:00 Introduction to Large-Scale Machine Learning

    11:26 Scalable and Flexible Bayesian Inference with Posteriors

    25:56 The Role of Temperature in Bayesian Models

    32:30 Stochastic Gradient MCMC for Large Datasets

    36:12 Introducing Posteriors: Bayesian Inference in Machine Learning

    41:22 Uncertainty Quantification and Improved Predictions

    52:05 Supporting New Algorithms and Arbitrary Likelihoods

    59:16 Thermodynamic Computing

    01:06:22 Decoupling Model Specification, Data Generation, and Inference

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal

    1 h 12 min
  • #109 Prior Sensitivity Analysis, Overfitting & Model Selection, with Sonja Winter
    Jun 25 2024

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    • My Intuitive Bayes Online Courses
    • 1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways

    • Bayesian methods align better with researchers' intuitive understanding of research questions and provide more tools to evaluate and understand models.
    • Prior sensitivity analysis is crucial for checking how robust findings are to changes in priors and helps put research results in context (a minimal sketch follows this list).
    • Bayesian methods offer an elegant and efficient way to handle missing data in longitudinal studies, providing more flexibility and information for researchers.
    • Fit indices in Bayesian model selection are effective in detecting underfitting but may struggle to detect overfitting, highlighting the need for caution in model complexity.
    • Bayesian methods have the potential to revolutionize educational research by addressing the challenges of small samples, complex nesting structures, and longitudinal data.
    • Posterior predictive checks are valuable for model evaluation and selection.
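
    As a concrete illustration of the prior sensitivity idea, here is a minimal PyMC sketch that refits the same simple model under several prior widths and compares the resulting posteriors. The model and data are invented for the example; in an SEM setting the same loop would wrap the structural model instead.

    ```python
    import arviz as az
    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(42)
    y = rng.normal(loc=1.0, scale=2.0, size=50)  # toy data, illustrative only

    # Refit the same model under increasingly diffuse priors on the mean,
    # then inspect how much the posterior for mu moves across prior choices.
    summaries = {}
    for prior_sd in [0.1, 1.0, 10.0]:
        with pm.Model():
            mu = pm.Normal("mu", mu=0.0, sigma=prior_sd)
            sigma = pm.HalfNormal("sigma", sigma=5.0)
            pm.Normal("obs", mu=mu, sigma=sigma, observed=y)
            idata = pm.sample(1000, tune=1000, progressbar=False, random_seed=42)
        summaries[prior_sd] = az.summary(idata, var_names=["mu"])

    for prior_sd, summ in summaries.items():
        print(f"prior sd = {prior_sd}")
        print(summ, end="\n\n")
    ```

    If the posterior for mu barely changes across the three prior widths, the finding is robust to the prior; if it swings, that is exactly the context a sensitivity analysis is meant to surface.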

    Chapters

    00:00 The Power and Importance of Priors

    09:29 Updating Beliefs and Choosing Reasonable Priors

    16:08 Assessing Robustness with Prior Sensitivity Analysis

    34:53 Aligning Bayesian Methods with Researchers' Thinking

    37:10 Detecting Overfitting in SEM

    43:48 Evaluating Model Fit with Posterior Predictive Checks

    47:44 Teaching Bayesian Methods

    54:07 Future Developments in Bayesian Statistics

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi...

    1 h 11 min
  • #108 Modeling Sports & Extracting Player Values, with Paul Sabin
    Jun 14 2024

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    • My Intuitive Bayes Online Courses
    • 1:1 Mentorship with me

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways

    • Convincing non-stats stakeholders in sports analytics can be challenging, but building trust and confirming their prior beliefs can help in gaining acceptance.
    • Combining subjective beliefs with objective data in Bayesian analysis leads to more accurate forecasts.
    • The availability of massive data sets has revolutionized sports analytics, allowing for more complex and accurate models.
    • Sports analytics models should consider factors like rest, travel, and altitude to capture the full picture of team performance.
    • The impact of budget on team performance in American sports and the use of plus-minus models in basketball and American football are important considerations in sports analytics (a minimal plus-minus sketch follows this list).
    • The future of sports analytics lies in making analysis more accessible and digestible for everyday fans.
    • There is a need for more focus on estimating distributions and variance around estimates in sports analytics.
    • AI tools can empower analysts to do their own analysis and make better decisions, but it's important to ensure they understand the assumptions and structure of the data.
    • Measuring the value of certain positions, such as midfielders in soccer, is a challenging problem in sports analytics.
    • Game theory plays a significant role in sports strategies, and optimal strategies can change over time as the game evolves.
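
    The plus-minus idea mentioned above can be sketched as a regularized regression: each stint's point differential is modeled as the sum of the on-court players' latent ratings, with a hierarchical prior providing the shrinkage. Below is a minimal, hypothetical PyMC version; the design matrix (+1 for home players on court, -1 for away players) and the outcomes are placeholders, not real data.

    ```python
    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(1)
    n_stints, n_players = 500, 60

    # X[i, j] = +1 if player j was on court for the home team during stint i,
    #           -1 if on court for the away team, 0 otherwise (toy values here).
    X = rng.choice([-1, 0, 1], size=(n_stints, n_players), p=[0.08, 0.84, 0.08])
    point_diff = rng.normal(0, 6, size=n_stints)  # placeholder point differentials

    with pm.Model():
        # Hierarchical shrinkage: most players sit near average, a few stand out.
        rating_sd = pm.HalfNormal("rating_sd", sigma=3.0)
        rating = pm.Normal("rating", mu=0.0, sigma=rating_sd, shape=n_players)
        home_adv = pm.Normal("home_adv", mu=0.0, sigma=3.0)
        noise = pm.HalfNormal("noise", sigma=10.0)
        pm.Normal("diff", mu=home_adv + pm.math.dot(X, rating), sigma=noise,
                  observed=point_diff)
        idata = pm.sample(1000, tune=1000, progressbar=False, random_seed=1)

    # Posterior means act as regularized adjusted plus-minus estimates, and the
    # posterior sds give the variance around those estimates mentioned above.
    print(idata.posterior["rating"].mean(("chain", "draw")).values[:10])
    ```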

    Chapters

    00:00 Introduction and Overview

    09:27 The Power of Bayesian Analysis in Sports Modeling

    16:28 The Revolution of Massive Data Sets in Sports Analytics

    31:03 The Impact of Budget in Sports Analytics

    39:35 Introduction to Sports Analytics

    52:22 Plus-Minus Models in American Football

    01:04:11 The Future of Sports Analytics

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi...

    1 h 18 min
  • #107 Amortized Bayesian Inference with Deep Neural Networks, with Marvin Schmitt
    May 29 2024

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    • My Intuitive Bayes Online Courses
    • 1:1 Mentorship with me

    In this episode, Marvin Schmitt introduces the concept of amortized Bayesian inference, where the upfront training phase of a neural network is followed by fast posterior inference.

    Marvin will guide us through this new concept, discussing his work in probabilistic machine learning and uncertainty quantification, using Bayesian inference with deep neural networks.

    He also introduces BayesFlow, a Python library for amortized Bayesian workflows, and discusses its use cases in various fields, while also touching on the concept of deep fusion and its relation to multimodal simulation-based inference.
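
    To give a flavor of the amortization idea (and only that; this is a toy sketch, not BayesFlow's API), the code below simulates (parameter, data) pairs from a simple normal model, trains a small network once to map data summaries to an approximate posterior, and then delivers near-instant posteriors for any new dataset. All names and settings are invented for the illustration.

    ```python
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Simulator: mu ~ N(0, 3^2), then 20 observations y ~ N(mu, 1).
    def simulate(n_sims, n_obs=20):
        mu = 3.0 * torch.randn(n_sims, 1)
        y = mu + torch.randn(n_sims, n_obs)
        summaries = torch.stack([y.mean(1), y.std(1)], dim=1)  # hand-crafted summaries
        return summaries, mu

    # Upfront (expensive) phase: train a network to output a Gaussian posterior
    # approximation (mean and log-sd of mu) from the data summaries.
    net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 2))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(3000):
        s, mu = simulate(256)
        mean, log_sd = net(s).chunk(2, dim=1)
        # Negative log-density of the true mu under the predicted Gaussian.
        loss = (log_sd + 0.5 * ((mu - mean) / log_sd.exp()) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Amortized (cheap) phase: a posterior for a brand-new dataset in one forward pass.
    y_obs = 1.5 + torch.randn(1, 20)
    s_obs = torch.stack([y_obs.mean(1), y_obs.std(1)], dim=1)
    mean, log_sd = net(s_obs).chunk(2, dim=1)
    print(f"approx posterior for mu: {mean.item():.2f} +/- {log_sd.exp().item():.2f}")
    ```

    BayesFlow replaces the hand-crafted summaries and the Gaussian output with learned summary networks and normalizing flows, but the train-once, infer-instantly structure is the same.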

    A PhD student in computer science at the University of Stuttgart, Marvin is supervised by two LBS guests you surely know — Paul Bürkner and Aki Vehtari. Marvin’s research combines deep learning and statistics, to make Bayesian inference fast and trustworthy.

    In his free time, Marvin enjoys board games and is a passionate guitar player.

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    Takeaways:

    • Amortized Bayesian inference...
    1 h 22 min
  • #106 Active Statistics, Two Truths & a Lie, with Andrew Gelman
    May 16 2024

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    • My Intuitive Bayes Online Courses
    • 1:1 Mentorship with me

    If there is one guest I don’t need to introduce, it’s mister Andrew Gelman. So… I won’t! I will refer you back to his two previous appearances on the show though, because learning from Andrew is always a pleasure. So go ahead and listen to episodes 20 and 27.

    In this episode, Andrew and I discuss his new book, Active Statistics, which focuses on teaching and learning statistics through active student participation. Like this episode, the book is divided into three parts: 1) The ideas of statistics, regression, and causal inference; 2) The value of storytelling to make statistical concepts more relatable and interesting; 3) The importance of teaching statistics in an active learning environment, where students are engaged in problem-solving and discussion.

    And Andrew is so active and knowledgeable that we of course touched on a variety of other topics — but for that, you’ll have to listen ;)

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Active learning is essential for teaching and learning statistics.

    - Storytelling can make...

    1 h 17 min
  • #105 The Power of Bayesian Statistics in Glaciology, with Andy Aschwanden & Doug Brinkerhoff
    May 2 2024

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    • My Intuitive Bayes Online Courses
    • 1:1 Mentorship with me

    In this episode, Andy Aschwanden and Doug Brinkerhoff tell us about their work in glaciology and the application of Bayesian statistics in studying glaciers. They discuss the use of computer models and data analysis in understanding glacier behavior and predicting sea level rise, and a lot of other fascinating topics.

    Andy grew up in the Swiss Alps, and studied Earth Sciences, with a focus on atmospheric and climate science and glaciology. After his PhD, Andy moved to Fairbanks, Alaska, and became involved with the Parallel Ice Sheet Model, the first open-source and openly-developed ice sheet model.

    His first PhD student was none other than… Doug Brinkerhoff! Doug did an MS in computer science at the University of Montana, focusing on numerical methods for ice sheet modeling, and then moved to Fairbanks to complete his PhD. While in Fairbanks, he became an ardent Bayesian after “seeing that uncertainty needs to be embraced rather than ignored”. Doug has since moved back to Montana, becoming faculty in the University of Montana’s computer science department.

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero and Will Geary.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    1 h 15 min
  • #104 Automated Gaussian Processes & Sequential Monte Carlo, with Feras Saad
    Apr 16 2024

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    • My Intuitive Bayes Online Courses
    • 1:1 Mentorship with me

    GPs are extremely powerful… but hard to handle. One of the bottlenecks is learning the appropriate kernel. What if you could learn the structure of GP kernels automatically? Sounds really cool, but also a bit futuristic, doesn’t it?

    Well, think again, because in this episode, Feras Saad will teach us how to do just that! Feras is an Assistant Professor in the Computer Science Department at Carnegie Mellon University. He received his PhD in Computer Science from MIT, and, most importantly for our conversation, he’s the creator of AutoGP.jl, a Julia package for automatic Gaussian process modeling.

    Feras discusses the implementation of AutoGP, how it scales, what you can do with it, and how you can integrate its outputs into your models.

    Finally, Feras provides an overview of Sequential Monte Carlo and its usefulness in AutoGP, highlighting the ability of SMC to incorporate new data in a streaming fashion and explore multiple modes efficiently.
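
    As a rough illustration of what "learning the structure of GP kernels" means (a naive greedy scoring of candidate compositions with scikit-learn, not AutoGP.jl and not SMC), the sketch below scores a handful of kernel structures by their log marginal likelihood and keeps the best one. The data and candidate set are made up for the example.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import (
        RBF, ExpSineSquared, RationalQuadratic, WhiteKernel,
    )

    rng = np.random.default_rng(0)
    X = np.linspace(0, 10, 80)[:, None]
    y = np.sin(2 * X[:, 0]) + 0.05 * X[:, 0] ** 2 + rng.normal(0, 0.2, 80)  # toy series

    # Candidate structures: base kernels plus their pairwise sums and products.
    base = {"RBF": RBF(), "Periodic": ExpSineSquared(), "RQ": RationalQuadratic()}
    candidates = dict(base)
    candidates.update({f"{a}+{b}": base[a] + base[b] for a in base for b in base if a < b})
    candidates.update({f"{a}*{b}": base[a] * base[b] for a in base for b in base if a < b})

    # Fit each candidate (plus a noise term) and score it by the log marginal
    # likelihood, the quantity a structure search tries to maximize.
    scores = {}
    for name, kernel in candidates.items():
        gp = GaussianProcessRegressor(kernel=kernel + WhiteKernel(), normalize_y=True)
        gp.fit(X, y)
        scores[name] = gp.log_marginal_likelihood_value_

    print(sorted(scores.items(), key=lambda kv: -kv[1])[:3])
    ```

    AutoGP goes much further, searching over full kernel expressions and maintaining a posterior over structures with Sequential Monte Carlo rather than picking a single winner, but the scoring idea is the same.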

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell and Gal Kampel.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    Takeaways:

    - AutoGP is a Julia package for automatic Gaussian process modeling that learns the

    1 h 31 min
  • #103 Improving Sampling Algorithms & Prior Elicitation, with Arto Klami
    Apr 5 2024

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    • My Intuitive Bayes Online Courses
    • 1:1 Mentorship with me

    Changing perspective is often a great way to solve burning research problems. Riemannian spaces are such a perspective change, as Arto Klami, an Associate Professor of computer science at the University of Helsinki and member of the Finnish Center for Artificial Intelligence, will tell us in this episode.

    He explains the concept of Riemannian spaces, their application in inference algorithms, how they can help with sampling Bayesian models, and their similarity to normalizing flows, which we discussed in episode 98.

    Arto also introduces PreliZ, a tool for prior elicitation, and highlights its benefits in simplifying the process of setting priors, thus improving the accuracy of our models.
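
    For a taste of what prior elicitation with PreliZ looks like, here is a small example. It is hedged: this is how I understand PreliZ's maxent helper, so treat the exact call signature as an assumption and check the PreliZ docs. The idea is to state a constraint on observable quantities ("about 90% of the mass between 1 and 10") and let the tool solve for the matching prior.

    ```python
    import preliz as pz

    # Ask for the maximum-entropy Gamma prior that places roughly 90% of its
    # mass between 1 and 10. maxent updates the distribution's parameters in
    # place (and, in an interactive session, plots the elicited prior).
    dist = pz.Gamma()
    pz.maxent(dist, lower=1, upper=10, mass=0.9)
    print(dist)  # the elicited parameters live on the distribution object
    ```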

    When Arto is not solving mathematical equations, you’ll find him cycling, or around a good board game.

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.

    Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

    Takeaways:

    - Riemannian spaces offer a way to improve computational efficiency and accuracy in Bayesian inference by considering the curvature of the posterior distribution.

    - Riemannian spaces can be used in Laplace approximation and Markov chain Monte Carlo...

    1 h 15 min