
1tomany/llm-sdk

Latest stable version: v0.8.0

Composer install command:

composer require 1tomany/llm-sdk

Package description

A single, unified, framework-independent library for integration with many popular AI platforms and large language models

README

This library provides a single, unified, framework-independent interface for integrating with several popular AI platforms and large language models.

Installation

Install the library using Composer:

composer require 1tomany/llm-sdk

Usage

There are two ways to use this library:

  1. Direct: Instantiate the AI client you wish to use and send a request object to it. This method is easier to use, but your application will be less flexible and harder to test.
  2. Actions: Register the clients you wish to use with a OneToMany\LlmSdk\Factory\ClientFactory instance, inject that instance into each action you wish to take, and interact with the action rather than with the client directly.
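A minimal sketch of the Actions approach follows. Only the OneToMany\LlmSdk\Factory\ClientFactory class name comes from this README; the action class, the create() method, and everything else below are hypothetical placeholders, not the library's documented API:

```php
<?php

use OneToMany\LlmSdk\Factory\ClientFactory;

// Hypothetical action class: the real library's action names and
// constructor signatures will differ.
final class SummarizeAction
{
    public function __construct(
        private readonly ClientFactory $clientFactory, // injected, not constructed inline
    ) {
    }

    public function execute(string $text): string
    {
        // Resolve a registered client by platform name (assumed method name).
        $client = $this->clientFactory->create('openai');

        // Build and send a request through the client here. Because the
        // factory is injected, a test can register the Mock platform instead.
        return '...';
    }
}
```

Because callers depend on the action rather than on a concrete client, swapping in the Mock platform for testing only requires registering a different client with the factory.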

Note: A Symfony bundle is available if you wish to integrate this library into your Symfony applications with autowiring and configuration support.

Examples

Review the examples below to get an idea of how the library works.

  • Embeddings
  • Files
  • Outputs
  • Search Stores

Supported platforms

  • Anthropic
  • Gemini
  • Mock
  • OpenAI

Platform feature support

Note: Each platform refers to generating output (inference) differently; OpenAI uses the word "Responses" while Gemini uses the word "Content". I've decided the word "Output" best represents what a large language model produces in the case of generative models, and "Embedding" in the case of embedding models.

To generate output or create an embedding, you must first compile a "Query". A query is made up of different input components: text prompts, files, a JSON schema, and/or system instructions.

This library allows you to compile a query before sending it to the model for two reasons:

  1. You can log/analyze the request payload before sending it to the model.
  2. You can compile individual requests for batching.
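To make the two reasons above concrete, here is a self-contained sketch of the compile-then-send flow; this Query stand-in and its compile() method are illustrative assumptions, not the library's actual types:

```php
<?php

// Illustrative stand-in written for this sketch; the library's real
// Query type and its components will differ.
final class Query
{
    public function __construct(
        private string $prompt,
        private ?string $instructions = null,
    ) {
    }

    // Compile the query's components into a request payload array,
    // dropping any component that was not provided.
    public function compile(): array
    {
        return array_filter([
            'prompt' => $this->prompt,
            'instructions' => $this->instructions,
        ]);
    }
}

$query = new Query('Summarize this document.', 'Be concise.');

// Reason 1: log or analyze the payload before sending it to the model.
$payload = $query->compile();
echo json_encode($payload), "\n";

// Reason 2: accumulate individually compiled requests for batching.
$batch = [$payload];
```

The key point is that compilation is a separate, side-effect-free step, so the same payload can be inspected, logged, or queued before any network call happens.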
Feature support by category; the platform columns are Anthropic, Gemini, Mock, and OpenAI:

  • Batches: Create, Read, Cancel
  • Embeddings: Create
  • Files: Upload, Read, List, Download, Delete
  • Outputs: Generate
  • Queries: Compile
  • Search Stores: Create, Read, Search, ImportFile

Credits

License

The MIT License

Statistics

  • Total downloads: 124
  • Monthly downloads: 0
  • Daily downloads: 0
  • Favorites: 0
  • Hits: 3
  • Dependent projects: 1
  • Recommendations: 0

GitHub information

  • Stars: 0
  • Watchers: 0
  • Forks: 0
  • Language: PHP

Other information

  • License: MIT
  • Last updated: 2026-02-20
