
The New Stack Podcast

By: The New Stack
  • Summary

  • The New Stack Podcast is all about the developers, software engineers and operations people who build at-scale architectures that change the way we develop and deploy software. For more content from The New Stack, subscribe on YouTube at: https://www.youtube.com/c/TheNewStack
    All rights reserved
Episodes
  • Who’s Keeping the Python Ecosystem Safe?
    Jun 6 2024

    Mike Fiedler, a PyPI safety and security engineer at the Python Software Foundation, prefers the title “code gardener,” reflecting his role in maintaining and securing open source projects. Recorded at PyCon US, Fiedler describes his task as “pulling the weeds” in code: handling the unglamorous but crucial aspects of open source contributions. Since August, funded by Amazon Web Services, Fiedler has focused on enhancing the security of the Python Package Index (PyPI). His efforts include ensuring that both the packages and the publishing pipeline are secure, and he emphasizes the importance of vetting third-party modules before deployment (a brief vetting sketch appears after the episode list below).

    One of Fiedler’s significant initiatives was enforcing mandatory two-factor authentication (2FA) for all PyPI user accounts by January 1, following a community awareness campaign. This transition was smooth, thanks to proactive outreach. Additionally, the foundation collaborates with security researchers and the public to report and address malicious packages.

    In late 2023, a security audit by Trail of Bits, funded by the Open Technology Fund, identified medium-severity vulnerabilities that were quickly resolved, improving PyPI's overall security. More details on Fiedler's work are available in the full interview video.

    Learn more from The New Stack about PyPI:

    PyPI Strives to Pull Itself Out of Trouble

    How Python Is Evolving

    Poisoned Lolip0p PyPI Packages

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    18 mins
  • How Training Data Differentiates Falcon, the LLM from the UAE
    May 30 2024

    The name "Falcon" for the UAE’s large language model (LLM) symbolizes the national bird's qualities of courage and perseverance, reflecting the vision of the Technology Innovation Institute (TII) in Abu Dhabi. TII, launched in 2020, addresses AI’s rapid advancements and unintended consequences by fostering an open-source approach to enhance community understanding and control of AI. In this New Stack Makers, Dr. Hakim Hacid, Executive Director and Acting Chief Researcher, Technology Innovation Institute emphasized the importance of perseverance and innovation in overcoming challenges. Falcon gained attention for being the first truly open model with capabilities matching many closed-source models, opening new possibilities for practitioners and industry.

    Last June, Falcon introduced a 40-billion-parameter model that outperformed LLaMA-65B, with smaller variants enabling local inference without the cloud. The latest 180-billion-parameter model, trained on 3.5 trillion tokens, illustrates Falcon’s commitment to quality and efficiency over sheer size. Falcon’s distinctiveness lies in its data quality: it relies on over 80% RefinedWeb data, based on CommonCrawl, which ensures cleaner, deduplicated data and yields high-quality outcomes. This data-centric approach, combined with powerful computational resources, sets Falcon apart in the AI landscape. (A brief local-inference sketch using a smaller, openly available Falcon checkpoint appears after the episode list below.)

    Learn more from The New Stack about Open Source AI:

    Open Source Initiative Hits the Road to Define Open Source AI

    Linus Torvalds on Security, AI, Open Source and Trust

    Transparency and Community: An Open Source Vision for AI

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    23 mins
  • Out with C and C++, In with Memory Safety
    May 22 2024

    Crash-level bugs continue to pose a significant challenge due to the lack of memory safety in programming languages, an issue persisting since the punch card era. This enduring problem, described as "the Joker to the Batman" by Anil Dash, VP of developer experience at Fastly, is highlighted in a recent episode of The New Stack Makers. The White House has emphasized memory safety, advocating for the adoption of memory-safe programming languages and better software measurability. The Office of the National Cyber Director (ONCD) noted that languages like C and C++ lack memory safety traits and are prevalent in critical systems. They recommend using memory-safe languages, such as Java, C#, and Rust, to develop secure software. Memory safety is particularly crucial for the US government due to the high stakes, especially in space exploration, where reliability standards are exceptionally stringent. Dash underscores the importance of resilience and predictability in missions that may outlast their creators, necessitating rigorous memory safety practices.

    Learn more from The New Stack about Memory Safety:

    White House Warns Against Using Memory-Unsafe Languages

    Can C++ Be Saved? Bjarne Stroustrup on Ensuring Memory Safety

    Bjarne Stroustrup's Plan for Bringing Safety to C++

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    36 mins
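
The PyPI episode above stresses vetting third-party modules before deployment. The snippet below is a minimal sketch of one way to do that; it is not the PSF's or Fiedler's tooling, and the project name, version, and file path are placeholders. It compares a locally downloaded artifact against the sha256 digests that PyPI publishes through its JSON API.

    # Minimal sketch (placeholder names, not PSF tooling): check a downloaded
    # package artifact against the sha256 digests PyPI publishes in its JSON API.
    import hashlib
    import json
    import sys
    from urllib.request import urlopen

    def pypi_release_digests(project: str, version: str) -> set[str]:
        """Collect the sha256 digests PyPI lists for every file in a release."""
        url = f"https://pypi.org/pypi/{project}/{version}/json"
        with urlopen(url) as resp:
            metadata = json.load(resp)
        return {f["digests"]["sha256"] for f in metadata["urls"]}

    def sha256_of(path: str) -> str:
        """Hash the locally downloaded artifact for comparison."""
        digest = hashlib.sha256()
        with open(path, "rb") as fh:
            for chunk in iter(lambda: fh.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    if __name__ == "__main__":
        # Usage: python vet_package.py <project> <version> <downloaded-file>
        project, version, artifact = sys.argv[1:4]
        if sha256_of(artifact) in pypi_release_digests(project, version):
            print("OK: artifact matches a digest published on PyPI")
        else:
            print("WARNING: artifact does not match any published digest")

The same digests can also be pinned in a requirements file and enforced at install time with pip's --require-hashes mode.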
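
The Falcon episode notes that the smaller models enable local inference without the cloud. Below is a minimal sketch assuming the Hugging Face transformers library and the openly published Falcon checkpoints on the Hugging Face Hub; the model ID, prompt, and generation settings are illustrative rather than taken from the episode.

    # Minimal sketch: run a smaller Falcon checkpoint locally with Hugging Face
    # transformers (recent versions support the Falcon architecture natively;
    # the `accelerate` package is assumed for device_map="auto").
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="tiiuae/falcon-7b-instruct",  # smaller Falcon variant suited to local use
        device_map="auto",                  # place weights on whatever hardware is available
    )

    result = generator(
        "Explain why training data quality matters for a large language model.",
        max_new_tokens=120,
        do_sample=True,
        temperature=0.7,
    )
    print(result[0]["generated_text"])

The larger 40B and 180B checkpoints published by TII expose the same interface but require substantially more memory, which is why the smaller variants are the ones suited to running outside the cloud.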
