• sushibowl@feddit.nl
    3 months ago

    but is this prompt the entirety of what differentiates it from other GPT-4 LLMs?

    Yes. Probably 90% of AI implementations based on GPT use this technique.

    you can really have a product that’s just someone else’s extremely complicated product but you staple some shit to the front of every prompt?

Oh yeah. In fact, that's OpenAI's whole business model: they get paid by Gab for every conversation people have with this thing.
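The "staple some shit to the front of every prompt" technique can be sketched in a few lines. This is a minimal illustration, not Gab's actual code: `SYSTEM_PROMPT` is a placeholder, and the payload shape follows the OpenAI chat completions API's `messages` format.

```python
# A minimal sketch of a GPT "wrapper" product: the only real customization
# is a hidden system prompt prepended to every conversation before it is
# forwarded to the upstream GPT-4 API. SYSTEM_PROMPT is a placeholder here,
# not the real (leaked) Gab prompt.

SYSTEM_PROMPT = "You are a helpful assistant with a custom persona."

def build_payload(conversation, model="gpt-4"):
    """Prepend the hidden system prompt to the user's messages."""
    return {
        "model": model,
        "messages": [{"role": "system", "content": SYSTEM_PROMPT}]
                    + conversation,
    }

payload = build_payload([{"role": "user", "content": "Hello!"}])
# Every request the wrapper sends starts with the same system message.
print(payload["messages"][0]["role"])
```

The wrapper just forwards `payload` to the API and relays the response back to the user; nothing model-side is actually retrained or fine-tuned.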

    • elrik@lemmy.world
      3 months ago

      Not only that, but the API cost is per token, so every message exchange in every conversation costs more because of the length of the system prompt.
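      A quick back-of-the-envelope calculation shows why this adds up. The price used below is a hypothetical figure for illustration, not a real OpenAI rate, and token counts are assumed rather than measured:

      ```python
      # Input tokens are billed per request, so a long hidden system prompt
      # is re-billed on every single exchange in every conversation.
      # PRICE_PER_1K_INPUT is a made-up illustrative rate, not a real price.

      PRICE_PER_1K_INPUT = 0.03  # hypothetical $ per 1,000 input tokens

      def exchange_cost(system_tokens, message_tokens):
          """Input cost of one request, system prompt included every time."""
          return (system_tokens + message_tokens) / 1000 * PRICE_PER_1K_INPUT

      # A 2,000-token system prompt attached to a 50-token user message:
      with_prompt = exchange_cost(2000, 50)   # 0.0615
      bare_message = exchange_cost(0, 50)     # 0.0015
      ```

      In this example the hidden prompt accounts for roughly 98% of the input cost of each exchange, and that overhead recurs on every message, since the API is stateless and the full system prompt must be resent each time.
      
      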