October 3: Support tool calling, ingestBatch mutation, Gemini Flash 1.5 8b, bug fixes
New Features
💡 Graphlit now supports the `ingestBatch` mutation, which accepts an array of URIs to files or web pages, and asynchronously ingests them into content objects.

💡 Graphlit now supports the `continueConversation` mutation, which accepts an array of called tool responses. Also, `promptConversation` now accepts an array of tool definitions. When tools are called by the LLM, the assistant message returned from `promptConversation` will include a list of `toolCalls`, which need to be responded to from your calling code. These responses are then provided back to the LLM via the `continueConversation` mutation.

💡 Graphlit now supports tool calling with the OpenAI, Mistral, Deepseek, Groq, and Cerebras model services. Anthropic, Google Gemini, and Cohere support will come later.
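As a sketch, a call to the `ingestBatch` mutation described above might look like the following; the argument and selection field names (`uris`, `id`, `state`) are illustrative assumptions, not taken from the published schema.

```graphql
mutation IngestBatch {
  # Submit multiple URIs in one call; ingestion happens asynchronously.
  ingestBatch(
    uris: [
      "https://example.com/whitepaper.pdf"
      "https://example.com/blog/launch-post"
    ]
  ) {
    id     # identifier of each created content object
    state  # ingestion state of each content object
  }
}
```

Because ingestion is asynchronous, the returned content objects can be polled until processing completes.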
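The tool-calling round trip described above could be sketched as the two mutations below. Apart from the mutation names, the `tools` argument, and the `toolCalls` field named in this release note, the field shapes shown are hypothetical, for illustration only.

```graphql
# 1) Prompt the conversation, passing tool definitions.
mutation PromptWithTools {
  promptConversation(
    prompt: "What is the weather in Seattle?"
    tools: [
      {
        name: "get_weather"
        description: "Look up current weather for a city"
        schema: "{\"type\":\"object\",\"properties\":{\"city\":{\"type\":\"string\"}}}"
      }
    ]
  ) {
    message {
      role
      toolCalls {   # present when the LLM chose to call a tool
        id
        name
        arguments
      }
    }
  }
}

# 2) Execute the tools in your own code, then return the responses.
mutation ContinueWithToolResponses {
  continueConversation(
    responses: [
      { id: "tool-call-id-from-step-1", content: "{\"temperatureF\":54}" }
    ]
  ) {
    message {
      role
      message   # final assistant answer incorporating the tool output
    }
  }
}
```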
Added support for prefilled user and assistant messages with the `createConversation` mutation. Now you can send an array of messages when creating a new conversation, which will bootstrap the conversation with the LLM. These must be provided in user/assistant pairs.

Added support for the Google Gemini Flash 1.5 8b model.
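A prefilled conversation might be created as in the sketch below; apart from the `createConversation` mutation name, the input and result shapes shown here are assumptions for illustration.

```graphql
mutation CreatePrefilled {
  createConversation(
    conversation: {
      name: "Support session"
      # Prefilled messages must be provided in user/assistant pairs.
      messages: [
        { role: USER, message: "How do I ingest a PDF into Graphlit?" }
        { role: ASSISTANT, message: "Provide its URI, and Graphlit will extract the text asynchronously." }
      ]
    }
  ) {
    id
    name
  }
}
```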
⚡ We have deprecated the `tools` property in the Specification object; it will be removed at a later date. Tools are now to be sent directly to the `extractContents` and `promptConversation` mutations.
Bugs Fixed
GPLA-3207: Models shouldn't be required on update specification call
GPLA-3220: Don't send system prompt with OpenAI o1 models