October 3: Support tool calling, ingestBatch mutation, Gemini Flash 1.5 8b, bug fixes

New Features

  • 💡 Graphlit now supports the ingestBatch mutation, which accepts an array of URIs to files or web pages, and will asynchronously ingest these into content objects.
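
    As a minimal sketch, a batch ingest could look like the following GraphQL. The `uris` argument name and the selected fields are assumptions for illustration, not confirmed API details:

    ```graphql
    mutation IngestBatch {
      ingestBatch(
        uris: [
          "https://example.com/report.pdf"
          "https://example.com/article.html"
        ]
      ) {
        # Each URI is ingested asynchronously into its own content object
        id
        state
      }
    }
    ```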

  • 💡 Graphlit now supports the continueConversation mutation, which accepts an array of tool call responses. In addition, promptConversation now accepts an array of tool definitions. When the LLM calls tools, the assistant message returned from promptConversation includes a list of toolCalls, which your calling code needs to respond to. Provide these responses back to the LLM via the continueConversation mutation.
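
    The round trip might look roughly like this; the exact argument and field names (`tools`, `responses`, `toolCalls`, etc.) are illustrative assumptions based on the description above, not verified schema:

    ```graphql
    # 1. Prompt with tool definitions; the assistant message may contain toolCalls.
    mutation PromptWithTools {
      promptConversation(
        prompt: "What is the weather in Paris?"
        tools: [
          {
            name: "get_weather"
            description: "Look up current weather for a city"
            schema: "{ \"type\": \"object\", \"properties\": { \"city\": { \"type\": \"string\" } } }"
          }
        ]
      ) {
        message {
          role
          message
          toolCalls {
            id
            name
            arguments
          }
        }
      }
    }

    # 2. Execute the tools in your own code, then return the results.
    mutation ContinueWithToolResponses {
      continueConversation(
        id: "<conversation-id>"
        responses: [
          { id: "<tool-call-id>", content: "{ \"temperature\": \"18C\" }" }
        ]
      ) {
        message {
          role
          message
        }
      }
    }
    ```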

  • 💡 Graphlit now supports tool calling with the OpenAI, Mistral, Deepseek, Groq, and Cerebras model services. Anthropic, Google Gemini, and Cohere support will come later.

  • Added support for prefilled user and assistant messages with the createConversation mutation. You can now send an array of messages when creating a new conversation, which will bootstrap the conversation with the LLM. These must be provided in user/assistant pairs.
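
    A bootstrapped conversation could be created along these lines; the input shape, field names, and enum values shown are assumptions for illustration:

    ```graphql
    mutation CreateBootstrappedConversation {
      createConversation(
        conversation: {
          name: "Bootstrapped chat"
          # Prefilled messages must come in user/assistant pairs
          messages: [
            { role: USER, message: "Summarize our last discussion." }
            { role: ASSISTANT, message: "We covered the Q3 roadmap and open action items." }
          ]
        }
      ) {
        id
      }
    }
    ```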

  • Added support for Google Gemini Flash 1.5 8b model.

  • ⚡ We have deprecated the tools property in the Specification object; it will be removed at a later date. Tools are now to be sent directly to the extractContents and promptConversation mutations.

Bugs Fixed

  • GPLA-3207: Models shouldn't be required on update specification call

  • GPLA-3220: Don't send system prompt with OpenAI o1 models
