
    Microsoft unveils Cobalt 100 chips and new AI innovations ahead of Build conference

    Microsoft is set to debut its custom Cobalt 100 chips to customers as a public preview at the upcoming Build conference, TechCrunch has exclusively reported. During an analyst briefing ahead of the event, Scott Guthrie, Microsoft’s Executive Vice President of the Cloud and AI group, compared the new Cobalt chips to AWS’s Graviton chips, highlighting that Cobalt will offer 40% better performance than other ARM chips currently available. Major companies like Adobe and Snowflake have already started utilizing these new chips.

    Microsoft Azure Cobalt 100 chips

    First announced last November, the Cobalt chips are 64-bit, Arm-based processors with 128 cores. Alongside them, Microsoft will also bring AMD’s MI300X accelerators to Azure customers next week. Although AMD has traditionally lagged behind Nvidia in the AI sector, the MI300X is gaining traction as AMD’s software support improves and as cloud providers look for more cost-effective alternatives to Nvidia’s expensive GPUs. Guthrie emphasized that the MI300X is currently the “most cost-effective GPU for Azure OpenAI.”

    Another significant announcement from Microsoft involves a pricing reduction for accessing and running large language models, with details to be revealed at Build. Additionally, Microsoft will preview a new “real-time intelligence system” designed to stream data into Fabric, the company’s data analytics platform. This system will include native Kafka integration and support for AWS Kinesis and Google Cloud’s Pub/Sub data-streaming systems.
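
    For readers unfamiliar with what “native Kafka integration” implies in practice, the minimal sketch below shows a generic Kafka producer pushing JSON events onto a stream. The broker address and topic name are placeholders, and this is an illustration of the Kafka producer pattern in general, not Fabric’s actual ingestion API.

```python
# Minimal sketch of a Kafka producer pushing JSON events to a stream.
# Assumes the kafka-python package; the broker address and topic name
# are placeholders, not Fabric-specific endpoints.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker.example.com:9092",  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a sample telemetry event to a hypothetical topic.
producer.send("telemetry-events", {"device_id": "sensor-42", "temperature_c": 21.7})
producer.flush()
```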

    Microsoft will also announce a partnership with Snowflake that brings support for the Apache Iceberg table format, which Snowflake uses, to Fabric. The shared format enables seamless interoperability, allowing data to move freely between Snowflake and Fabric.
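
    As a rough illustration of why a shared table format enables this kind of interoperability, the sketch below reads an Iceberg table through the open-source PyIceberg library. The catalog settings and table name are hypothetical, and neither Snowflake’s nor Fabric’s specific integration is shown.

```python
# Minimal sketch of reading an Apache Iceberg table via PyIceberg.
# The catalog name, URI, token, and table identifier are hypothetical placeholders.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "demo_catalog",
    **{"uri": "https://catalog.example.com", "token": "<access-token>"},
)

# Any engine that understands Iceberg metadata can read the same table,
# which is what makes cross-platform interoperability possible.
table = catalog.load_table("sales.orders")
df = table.scan().to_pandas()
print(df.head())
```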

    Furthermore, Microsoft plans to enhance its Copilot so that developers can manage Azure resources directly through natural-language commands. The capability is built on a common extensibility mechanism that lets other providers plug in and offer similar functionality, with the aim of creating a tighter developer loop across the development stack and Azure.
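
    As a point of reference for what “managing Azure resources” looks like programmatically today, the sketch below lists resource groups with the Azure SDK for Python. It is illustrative only, with a placeholder subscription ID, and says nothing about how the Copilot feature itself is implemented.

```python
# Minimal sketch of listing Azure resource groups with the Azure SDK for Python.
# The subscription ID is a placeholder; credentials come from the environment.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, "<subscription-id>")

# A natural-language request like "show my resource groups" might ultimately
# resolve to a management call along these lines.
for group in client.resource_groups.list():
    print(group.name, group.location)
```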

    These announcements highlight Microsoft’s continued efforts to innovate and provide competitive, efficient solutions in the cloud and AI markets. The upcoming Build conference promises to deliver a range of exciting updates and new technologies for developers and businesses alike.

    Shamit Shankara
