September 30: Support for Azure AI Inference models, Mistral Pixtral and latest Google Gemini models

New Features

  • 💡 Graphlit now supports the Azure AI Model Inference API (aka Models as a Service), which offers serverless hosting for many models, such as Meta Llama 3.2, Cohere Command-R, and more. For Azure AI, all models are 'custom': after provisioning the model of your choice, you will need to provide the serverless endpoint, the API key, and the number of tokens accepted in the context window.
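As a rough sketch of what configuring one of these 'custom' Azure AI models might look like (the field names below are illustrative assumptions, not the exact Graphlit schema; check the API reference for the real shape):

```graphql
# Hypothetical sketch — field names are assumptions, not the verified Graphlit schema.
mutation CreateAzureAISpecification {
  createSpecification(
    specification: {
      name: "Llama 3.2 (Azure AI serverless)"
      serviceType: AZURE_AI
      azureAI: {
        endpoint: "https://your-deployment.eastus2.models.ai.azure.com" # serverless endpoint from Azure
        key: "YOUR_API_KEY"                                             # API key for the deployment
        tokenLimit: 128000                                              # tokens accepted in the context window
      }
    }
  ) {
    id
    name
  }
}
```

The three values you supply correspond directly to the three items called out above: the serverless endpoint, the API key, and the context window size of the model you provisioned.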

  • We have added support for Mistral's multimodal Pixtral model, available as the model enum PIXTRAL_12B_2409.

  • We have added versioned model enums for Google Gemini, so you can pin a specific model version: GEMINI_1_5_FLASH_001, GEMINI_1_5_FLASH_002, GEMINI_1_5_PRO_001, and GEMINI_1_5_PRO_002.
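For example, pinning a specific Gemini version in a specification might look like this (again a sketch — the field names are assumptions rather than the verified Graphlit schema, but the enum values are those listed above):

```graphql
# Hypothetical sketch — field names are assumptions; the model enum is from this release.
mutation CreateGeminiSpecification {
  createSpecification(
    specification: {
      name: "Gemini 1.5 Flash 002"
      serviceType: GOOGLE
      google: { model: GEMINI_1_5_FLASH_002 } # versioned enum pins this exact model revision
    }
  ) {
    id
  }
}
```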
