Calling Azure OpenAI API in stream mode from an SPFx solution
As mentioned in my previous article, we have integrated ChatGPT in our Atlas product. This was both challenging and fun. The OpenAI API is well documented, and so is the Azure OpenAI service. However, one thing that is not fully documented is the Stream option. This lets you call the API and receive the response as a stream of data, so you start getting data from the very first moment and don't have to wait for the entire response. Depending on your request, ChatGPT can take several seconds to produce the full answer, so from a UX point of view it is potentially a better experience to start showing data before the response is fully completed.
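To give you an idea of what that looks like, here is a minimal sketch of a streaming call using the browser fetch API. The resource name, deployment name, API version and key are placeholders you would replace with your own values, and the code simply logs the raw chunks; we will refine this later.

```typescript
// Placeholder endpoint: swap in your own Azure OpenAI resource, deployment and API version
const endpoint =
  "https://<your-resource>.openai.azure.com/openai/deployments/<your-deployment>/chat/completions?api-version=2023-05-15";

const response = await fetch(endpoint, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "api-key": "<your-api-key>", // placeholder key
  },
  body: JSON.stringify({
    messages: [{ role: "user", content: "Tell me about SPFx" }],
    stream: true, // ask the service to stream the completion as it is generated
  }),
});

// Read the body chunk by chunk instead of awaiting the full JSON response
const reader = response.body!.getReader();
const decoder = new TextDecoder("utf-8");

while (true) {
  const { value, done } = await reader.read();
  if (done) {
    break;
  }
  // Each chunk contains one or more "data: {...}" lines with partial tokens
  console.log(decoder.decode(value));
}
```

With `stream: true`, the service answers with server-sent-event style `data:` lines, each carrying a small delta of the completion, which is what makes it possible to render the answer progressively in the UI.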