For LLMs, sure. My question is whether it's the same in practice for the LLMs behind said API. As far as I can tell, there is no official documentation guaranteeing we'll get exactly the same result.
And no one here has touched on how high the cost multiple is, so I assume it's pretty high.