Nvidia DGX Cloud: Train Your Own ChatGPT in a Web Browser For $37K a Month


An anonymous reader writes: Last week, we learned that Microsoft spent hundreds of millions of dollars to buy tens of thousands of Nvidia A100 graphics chips so that partner OpenAI could train the large language models (LLMs) behind Bing's AI chatbot and ChatGPT. Don't have access to all that capital or space for all that hardware for your own LLM project? Nvidia's DGX Cloud is an attempt to sell remote web access to the very same thing.

Announced today at the company's 2023 GPU Technology Conference, the service rents virtual versions of its DGX Server boxes, each containing eight Nvidia H100 or A100 GPUs and 640GB of memory. The service includes interconnects that scale up to the neighborhood of 32,000 GPUs, storage, software, and "direct access to Nvidia AI experts who optimize your code," starting at $36,999 a month for the A100 tier. Meanwhile, a physical DGX Server box can cost upwards of $200,000 for the same hardware if you're buying it outright, and that doesn't count the efforts companies like Microsoft say they made to build working data centers around the technology. Read more of this story at Slashdot.
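For a rough sense of the rent-vs-buy trade-off implied by those figures, here is a back-of-the-envelope sketch using only the two numbers quoted in the story ($36,999/month for the A100 cloud tier, roughly $200,000 for a physical box). It deliberately ignores the data-center, power, and staffing costs the article mentions, so the real break-even point for buying would come later than this naive estimate.

```python
# Back-of-the-envelope rent-vs-buy comparison using the figures quoted
# in the story. Power, staffing, and data-center build-out are ignored,
# so this understates the true cost of owning the hardware.

CLOUD_MONTHLY_USD = 36_999      # A100-tier DGX Cloud rental, per month
SERVER_PURCHASE_USD = 200_000   # approximate outright price of one DGX Server

def breakeven_months(purchase_usd: float, monthly_usd: float) -> float:
    """Months of cloud rental whose cumulative cost equals the purchase price."""
    return purchase_usd / monthly_usd

months = breakeven_months(SERVER_PURCHASE_USD, CLOUD_MONTHLY_USD)
print(f"Cloud rental matches the purchase price after ~{months:.1f} months")
# → Cloud rental matches the purchase price after ~5.4 months
```

In other words, the cloud rental overtakes the sticker price of the hardware in under half a year, which is why the article stresses the unquantified data-center costs that buyers take on but renters avoid.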
