• decerian@lemmy.world
    4 months ago

    If this actually did lead to faster matrix multiplication, then essentially anything that runs on a GPU would benefit. That definitely could include games and physics simulations, along with a bunch of other applications (and yes, also AI stuff).
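
    For anyone curious why one speedup would help all of those at once: here's a minimal sketch (plain NumPy, nothing to do with the paper's actual method) showing that a physics-style transform and a neural-network layer boil down to the exact same matrix multiplication:

    ```python
    import numpy as np

    # Both workloads below reduce to one matmul, so any speedup to
    # matrix multiplication helps them equally.

    # Physics/graphics: rotate a batch of 3D points around the z-axis.
    theta = np.pi / 4
    rotation = np.array([
        [np.cos(theta), -np.sin(theta), 0.0],
        [np.sin(theta),  np.cos(theta), 0.0],
        [0.0,            0.0,           1.0],
    ])
    points = np.random.rand(10_000, 3)   # 10k points in a scene
    rotated = points @ rotation.T        # one matrix multiplication

    # AI: a single dense neural-network layer over the same batch.
    weights = np.random.rand(3, 128)     # hypothetical layer parameters
    activations = points @ weights       # the exact same operation
    ```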

    I’m sure the paper's authors know all of that, but somewhere along the line the article just became "faster and better AI".