OpenAI API

The OpenAI library provides access to OpenAI's cloud-based LLM services.

It enables applications to create LLM prompts on-device and send them to powerful cloud-based models.

Function Index

Json *openaiChatCompletion(Json *props)
 Submit a request to the OpenAI Chat Completion API.
Url *openaiRealTimeConnect(Json *props)
 Submit a request to the OpenAI Real Time API.
Json *openaiResponse(Json *props)
 Submit a request to the OpenAI Response API.

Functions

Json * openaiChatCompletion (Json *props)

Submit a request to the OpenAI Chat Completion API.

Description:
The following defaults are set: {model: 'gpt-4o-mini'}.
Parameters:
props is a JSON object of Chat Completion API parameters.
Returns:
Returns a JSON object with the response from the OpenAI Chat Completion API. Caller must free the returned JSON object with jsonFree.
API Stability:
Evolving.
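
Example:
A minimal usage sketch, assuming the companion JSON module's jsonParse and jsonGet helpers, an "openai.h" header name, and a "choices[0].message.content" result path; only jsonFree is documented on this page.

    #include <stdio.h>
    #include "openai.h"    /* Assumed header name for this library */

    static void chatExample(void)
    {
        /* Build the request. The model defaults to gpt-4o-mini if not supplied. */
        Json *props = jsonParse("{messages: [{role: 'user', content: 'Say hello'}]}", 0);

        Json *response = openaiChatCompletion(props);
        if (response) {
            /* jsonGet and the result path are assumptions for illustration */
            printf("%s\n", jsonGet(response, 0, "choices[0].message.content", ""));
            jsonFree(response);
        }
        jsonFree(props);
    }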

Url * openaiRealTimeConnect (Json *props)

Submit a request to the OpenAI Real Time API.

Parameters:
props is a JSON object of Real Time API parameters.
Returns:
Returns a Url object for the real-time connection on success, or 0 on failure.
API Stability:
Evolving.
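
Example:
A minimal connection sketch, assuming the companion JSON module's jsonParse helper, an "openai.h" header name, a urlFree call from the Url module, and an illustrative model name; only the openaiRealTimeConnect signature above is documented.

    #include <stdio.h>
    #include "openai.h"    /* Assumed header name for this library */

    static void realTimeExample(void)
    {
        /* Real Time API parameters; the model name is illustrative only */
        Json *props = jsonParse("{model: 'gpt-4o-realtime-preview'}", 0);

        Url *connection = openaiRealTimeConnect(props);
        if (connection == 0) {
            fprintf(stderr, "Cannot connect to the OpenAI Real Time API\n");
        } else {
            /* Exchange real-time events over the returned Url connection,
               then release it. urlFree is an assumed Url module API. */
            urlFree(connection);
        }
        jsonFree(props);
    }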

Json * openaiResponse (Json *props)

Submit a request to the OpenAI Response API.

Description:
The following defaults are set: {model: 'gpt-4o-mini', truncation: 'auto'}. The API will aggregate the output text into "output_text" for convenience.
Parameters:
props is a JSON object of Response API parameters.
Returns:
Returns a JSON object with the response from the OpenAI Response API. Caller must free the returned JSON object with jsonFree.
API Stability:
Evolving.
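
Example:
A minimal usage sketch that reads the aggregated "output_text" field described above; the jsonParse and jsonGet helpers and the "openai.h" header name are assumptions.

    #include <stdio.h>
    #include "openai.h"    /* Assumed header name for this library */

    static void responseExample(void)
    {
        /* Only the input is supplied; the model and truncation defaults apply */
        Json *props = jsonParse("{input: 'Write a short greeting'}", 0);

        Json *response = openaiResponse(props);
        if (response) {
            /* The response aggregates the output text into "output_text" */
            printf("%s\n", jsonGet(response, 0, "output_text", ""));
            jsonFree(response);
        }
        jsonFree(props);
    }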