feat: Add LiteLLM AI Gateway support#40

Open
devhimanshuu wants to merge 1 commit into bug0inc:main from devhimanshuu:ai_gateway
Conversation

@devhimanshuu

Support for LiteLLM Gateway & Enterprise Models

Description

This PR introduces support for LiteLLM, a popular open-source AI proxy. Adding LiteLLM as an officially supported AI gateway lets Passmark integrate with enterprise providers such as Azure OpenAI and AWS Bedrock.

Enterprise users who are subject to strict compliance requirements and cannot use public APIs can now configure Passmark to route requests securely through their internal infrastructure via LiteLLM.

Changes Made

  • src/config.ts:
    • Added 'litellm' to the AIGateway union type.
    • Updated configuration error messages to include LiteLLM setup instructions (e.g., configuring LITELLM_BASE_URL and LITELLM_API_KEY).
  • src/models.ts:
    • Added the @ai-sdk/openai provider to interact with LiteLLM's OpenAI-compatible endpoints.
    • Implemented getLiteLLMProvider() to securely instantiate the model client based on environment variables.
    • Created resolveLiteLLMModelId() to map Passmark's default canonical model prefixes (e.g., google/ to gemini/) to the formats LiteLLM expects.
    • Integrated the new gateway routing logic into resolveModel().
  • Testing (src/__tests__/):
    • Added unit tests to config.test.ts to verify the new gateway configuration parsing.
    • Created models.test.ts to fully cover the environment validation, provider creation, and prefix-mapping logic for LiteLLM.
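For reviewers, the prefix mapping in resolveLiteLLMModelId() might look roughly like the sketch below. The mapping table and helper shape are assumptions based on the description above, not the actual contents of src/models.ts:

```typescript
// Hypothetical sketch of the prefix mapping described above; the real
// implementation in src/models.ts may use a different table or structure.
const PREFIX_MAP: Record<string, string> = {
  // Passmark's canonical prefix -> LiteLLM's expected prefix
  "google/": "gemini/",
};

function resolveLiteLLMModelId(modelId: string): string {
  for (const [from, to] of Object.entries(PREFIX_MAP)) {
    if (modelId.startsWith(from)) {
      return to + modelId.slice(from.length);
    }
  }
  // IDs that already match LiteLLM's format pass through unchanged.
  return modelId;
}
```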

How to Test

  1. Set the environment variable LITELLM_BASE_URL to your local or hosted LiteLLM instance (e.g., export LITELLM_BASE_URL=http://localhost:4000/v1).
  2. Optionally configure LITELLM_API_KEY if your proxy requires authentication.
  3. Configure Passmark to use the new gateway:
    configure({ ai: { gateway: 'litellm' } });
  4. Run your test suite. Passmark should now securely proxy all requests through your LiteLLM gateway to Azure, Bedrock, or any other configured backend.
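As a rough illustration of steps 1–2 and the environment validation mentioned under Changes Made, the configuration handling could be sketched as follows. The function name, config shape, and error text here are hypothetical; Passmark's actual getLiteLLMProvider() constructs an @ai-sdk/openai client rather than returning a plain object:

```typescript
// Hypothetical sketch of the env handling, not Passmark's actual code.
interface LiteLLMConfig {
  baseURL: string; // from LITELLM_BASE_URL (step 1)
  apiKey?: string; // from LITELLM_API_KEY, optional (step 2)
}

function readLiteLLMConfig(
  env: Record<string, string | undefined>
): LiteLLMConfig {
  const baseURL = env.LITELLM_BASE_URL;
  if (!baseURL) {
    // Mirrors the configuration error messages added in src/config.ts.
    throw new Error(
      "gateway 'litellm' requires LITELLM_BASE_URL, e.g. http://localhost:4000/v1"
    );
  }
  return { baseURL, apiKey: env.LITELLM_API_KEY };
}
```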
