Merge branch 'main' into feat/rig-rust

This commit is contained in:
David Maple 2025-02-17 20:07:49 -08:00 committed by GitHub
commit 3189f88d81
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
57 changed files with 2420 additions and 64 deletions

README.md

@@ -8,7 +8,7 @@
Integrate the DeepSeek API into popular software. Access [DeepSeek Open Platform](https://platform.deepseek.com/) to get an API key.
English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/README_cn.md)
English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/README_cn.md)/[日本語](https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/README_ja.md)
</div>
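Nearly every integration below ultimately sends the same OpenAI-compatible request to DeepSeek's endpoint, authorized with the key from the Open Platform. As a minimal, stdlib-only sketch of what such a request looks like (the `https://api.deepseek.com/chat/completions` endpoint and `deepseek-chat` model name follow DeepSeek's public docs, but verify against the current API reference):

```python
import json
import os
import urllib.request

API_URL = "https://api.deepseek.com/chat/completions"  # OpenAI-compatible endpoint

def build_chat_request(user_message: str) -> urllib.request.Request:
    """Build (but do not send) a DeepSeek chat completion request."""
    payload = {
        "model": "deepseek-chat",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "stream": False,
    }
    headers = {
        "Content-Type": "application/json",
        # API key obtained from https://platform.deepseek.com/
        "Authorization": f"Bearer {os.environ.get('DEEPSEEK_API_KEY', '')}",
    }
    return urllib.request.Request(
        API_URL, data=json.dumps(payload).encode("utf-8"), headers=headers
    )

req = build_chat_request("Hello!")
print(req.full_url)  # https://api.deepseek.com/chat/completions
```

Sending it with `urllib.request.urlopen(req)` should return an OpenAI-style JSON body with a `choices` array; the official OpenAI SDKs work the same way when pointed at the same base URL.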
@@ -18,6 +18,11 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
### Applications
<table>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/171659527?s=400&u=39906ab3b6e2066f83046096a66a77fb3f8bb836&v=4" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/quantalogic/quantalogic">Quantalogic</a> </td>
<td> QuantaLogic is a ReAct (Reasoning & Action) framework for building advanced AI agents. </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/13600976/224d547a-6fbc-47c8-859f-aa14813e2b0f" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/chatbox/README.md">Chatbox</a> </td>
@@ -42,6 +47,16 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <img src="https://www.librechat.ai/librechat.svg" alt="LibreChat" width="64" height="auto" /> </td>
<td> <a href="https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/deepseek">LibreChat</a> </td>
<td> LibreChat is a customizable open-source app that seamlessly integrates DeepSeek for enhanced AI interactions. </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/longevity-genie/chat-ui/11c6647c83f9d2de21180b552474ac5ffcf53980/static/geneticsgenie/icon-128x128.png" alt="Icon" width="64" height="auto"/> </td>
<td> <a href="https://github.com/longevity-genie/just-chat">Just-Chat</a> </td>
<td> Make your LLM agent and chat with it, simple and fast!</td>
</tr>
<tr>
<td> <img src="https://www.papersgpt.com/images/logo/favicon.ico" alt="PapersGPT" width="64" height="auto" /> </td>
<td> <a href="https://github.com/papersgpt/papersgpt-for-zotero">PapersGPT</a> </td>
<td> PapersGPT is a Zotero plugin that seamlessly integrates DeepSeek and multiple other AI models for quickly reading papers in Zotero. </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/rss-translator/RSS-Translator/main/core/static/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
@@ -83,12 +98,31 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="docs/raycast/README.md">Raycast</a></td>
<td> <a href="https://raycast.com/?via=ViGeng">Raycast</a> is a productivity tool for macOS that lets you control your tools with a few keystrokes. It supports various extensions including DeepSeek AI.</td>
</tr>
<tr> <td> <img src="https://niceprompt.app/favicon.ico" alt="Icon" width="64" height="auto" /> </td> <td> <a href="https://niceprompt.app">Nice Prompt</a></td> <td> Organize, share, and use your prompts in your code editor, with Cursor and VS Code.</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/193405629?s=200&v=4" alt="PHP Client" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-php/deepseek-php-client/blob/master/README.md">PHP Client</a> </td>
<td> Deepseek PHP Client is a robust and community-driven PHP client library for seamless integration with the Deepseek API. </td>
</tr>
<tr>
<td>
<img
src="https://github.com/tornikegomareli/DeepSwiftSeek/blob/main/logo.webp"
alt="DeepSwiftSeek Logo"
width="64"
height="auto"
/>
</td>
<td>
<a href="https://github.com/tornikegomareli/DeepSwiftSeek/blob/main/README.md">DeepSwiftSeek</a>
</td>
<td>
DeepSwiftSeek is a lightweight yet powerful Swift client library with solid DeepSeek API integration.
It provides easy-to-use Swift concurrency for chat, streaming, FIM (Fill-in-the-Middle) completions, and more.
</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/958072?s=200&v=4" alt="Laravel Integration" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-php/deepseek-laravel/blob/master/README.md">Laravel Integration</a> </td>
<td> Laravel wrapper for the DeepSeek PHP client, enabling seamless DeepSeek API integration with Laravel applications.</td>
@@ -99,8 +133,8 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://www.zotero.org">Zotero</a> is a free, easy-to-use tool to help you collect, organize, annotate, cite, and share research.</td>
</tr>
<tr>
<td> <img src="./docs/Siyuan/assets/image-20250122162731-7wkftbw.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/Siyuan/README.md">SiYuan</a> </td>
<td> <img src="https://b3log.org/images/brand/siyuan-128.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/SiYuan/README.md">SiYuan</a> </td>
<td> SiYuan is a privacy-first personal knowledge management system that supports complete offline usage, as well as end-to-end encrypted data sync.</td>
</tr>
<tr>
@@ -138,11 +172,64 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://agenticflow.ai/">AgenticFlow</a> </td>
<td> <a href="https://agenticflow.ai/">AgenticFlow</a> is a no-code platform where marketers build agentic AI workflows for go-to-market automation, powered by hundreds of everyday apps as tools for your AI agents.</td>
</tr>
<tr>
<td> <img src="https://github.com/ZGGSONG/STranslate/raw/main/img/favicon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://stranslate.zggsong.com/en/">STranslate</a></td>
<td> <a href="https://stranslate.zggsong.com/en/">STranslate</a> is a ready-to-go translation and OCR tool for Windows, built with WPF. </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/5e16beb0-993e-47bf-807e-7c8804b313a2" alt="Asp Client" width="64" height="auto" /> </td>
<td> <a href="https://github.com/Anwar-alhitar/Deepseek.Asp.Client/blob/master/README.md">ASP Client</a> </td>
<td><a href="https://github.com/Anwar-alhitar/Deepseek.Asp.Client/blob/master/README.md">Deepseek.ASPClient</a> is a lightweight ASP.NET wrapper for the DeepSeek AI API, designed to simplify AI-driven text processing in .NET applications. </td>
</tr>
<tr>
<td> <img src="https://www.gptaiflow.tech/logo.png" alt="gpt-ai-flow-logo" width="64" height="auto" /> </td>
<td> <a href="https://www.gptaiflow.tech/docs/product/api-keys-setup#setup-deepseek-api-keys">GPT AI Flow</a></td>
<td>
The ultimate productivity weapon built by engineers for efficiency enthusiasts (themselves): <a href="https://www.gptaiflow.tech/">GPT AI Flow</a>
<ul>
<li><code>Shift+Alt+Space</code>: wake the desktop intelligent hub</li>
<li>Local encrypted storage</li>
<li>Custom instruction engine</li>
<li>On-demand calling without subscription bundling</li>
</ul>
</td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/b09f17a8-936d-4dac-8b24-1682d52c9a3c" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/alecm20/story-flicks">Story-Flicks</a></td>
<td>With just one sentence, you can quickly generate high-definition story short videos, supporting models such as DeepSeek.</td>
</tr>
<tr>
<td> <img src="https://prompt.16x.engineer/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/16x_prompt/README.md">16x Prompt</a> </td>
<td> <a href="https://prompt.16x.engineer/">16x Prompt</a> is an AI coding tool with context management. It helps developers manage source code context and craft prompts for complex coding tasks on existing codebases.</td>
</tr>
<tr>
<td> <img src="https://www.petercat.ai/images/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.petercat.ai">PeterCat</a> </td>
<td> A conversational Q&A agent configuration system, self-hosted deployment solutions, and a convenient all-in-one application SDK, letting you create intelligent Q&A bots for your GitHub repositories.</td>
</tr>
</table>
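Several of the clients above, DeepSwiftSeek among them, support FIM (Fill-in-the-Middle) completions: the model sees the code before and after a gap and generates the missing middle. A sketch of the request body (the field names mirror DeepSeek's beta completion API as an assumption; check the official docs before relying on them):

```python
import json

def build_fim_payload(prefix: str, suffix: str, max_tokens: int = 64) -> str:
    """JSON body for a Fill-in-the-Middle completion: the model fills
    the gap between `prefix` and `suffix`."""
    return json.dumps({
        "model": "deepseek-chat",
        "prompt": prefix,   # code before the cursor
        "suffix": suffix,   # code after the cursor
        "max_tokens": max_tokens,
    })

payload = build_fim_payload("def add(a, b):\n    return ", "\n\nprint(add(1, 2))")
```

Editors and plugins (code completion, inline edits) generally wrap exactly this shape, streaming the filled-in middle back as the user types.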
### AI Agent frameworks
<table>
<tr>
<td> <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/smolagents/mascot_smol.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/huggingface/smolagents/tree/main"> smolagents </a> </td>
<td> The simplest way to build great agents. Agents write Python code to call tools and orchestrate other agents. Priority support for open models like DeepSeek-R1! </td>
</tr>
<tr>
<td><img src="https://yomo.run/yomo-logo.png" alt="Icon" width="64" height="auto" /></td>
<td><a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/yomo/README.md">YoMo</a></td>
<td>Stateful Serverless LLM Function Calling Framework with Strongly-typed Language Support</td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/superagentxai/superagentX/refs/heads/master/docs/logo/icononly_transparent_nobuffer.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/superagentx/README.md">SuperAgentX</a> </td>
<td>SuperAgentX: A Lightweight Open Source AI Framework Built for Autonomous Multi-Agent Applications with Artificial General Intelligence (AGI) Capabilities.</td>
</tr>
<tr>
<td> <img src="https://panda.fans/_assets/favicons/apple-touch-icon.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/anda/README.md">Anda</a> </td>
@@ -152,6 +239,26 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <img src="https://rig.rs/assets/favicon.png" alt="Rig (Rust)" width="64" height="auto" /> </td>
<td> <a href="https://rig.rs/">Rig</a> </td>
<td>Build modular and scalable LLM Applications in Rust.</td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/longevity-genie/chat-ui/11c6647c83f9d2de21180b552474ac5ffcf53980/static/geneticsgenie/icon-128x128.png" alt="Icon" width="64" height="auto"/> </td>
<td> <a href="https://github.com/longevity-genie/just-agents">Just-Agents</a> </td>
<td>A lightweight, straightforward library for LLM agents - no over-engineering, just simplicity!</td>
</tr>
<tr>
<td> <img src="https://alice.fun/alice-logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/bob-robert-ai/bob/blob/main/alice/readme.md">Alice</a> </td>
<td>An autonomous AI agent on ICP, leveraging LLMs like DeepSeek for on-chain decision-making. Alice combines real-time data analysis with a playful personality to manage tokens, mine BOB, and govern ecosystems.</td>
</tr>
<tr>
<td> <img src="https://github.com/Upsonic/Upsonic/blob/9d2e6d43b44defc6744817330625661ca3a2184e/Upsonic%20pp.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/Upsonic/Upsonic">Upsonic</a> </td>
<td>Upsonic offers a cutting-edge enterprise-ready agent framework where you can orchestrate LLM calls, agents, and computer use to complete tasks cost-effectively.</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/173022229" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/APRO-com">ATTPs</a> </td>
<td>A foundational protocol framework for trusted communication between agents. Any agent based on DeepSeek can integrate the <a href="https://docs.apro.com/attps">ATTPs</a> SDK to access features such as agent registration, sending verifiable data, and retrieving verifiable data, enabling trusted communication with agents on other platforms. </td>
</tr>
</table>
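QuantaLogic and smolagents above are both built around the ReAct pattern: the model alternates reasoning with tool-calling actions until it can produce a final answer. The control flow can be sketched with a scripted stand-in for the DeepSeek call (the `ACTION`/`FINAL` protocol here is illustrative, not any framework's actual API):

```python
from typing import Callable

# Tool registry: name -> function the "agent" may call.
TOOLS: dict[str, Callable[[str], str]] = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
}

def react_loop(question: str, model: Callable[[str], str], max_steps: int = 5) -> str:
    """Minimal ReAct loop: feed the transcript to the model; the model either
    emits `ACTION tool: input` (run the tool, append the observation) or
    `FINAL: answer` (stop)."""
    transcript = f"Question: {question}"
    for _ in range(max_steps):
        reply = model(transcript)
        if reply.startswith("FINAL:"):
            return reply[len("FINAL:"):].strip()
        if reply.startswith("ACTION "):
            tool_name, tool_input = reply[len("ACTION "):].split(":", 1)
            observation = TOOLS[tool_name.strip()](tool_input.strip())
            transcript += f"\n{reply}\nObservation: {observation}"
    return "gave up"

# Scripted stand-in for an LLM: act once, then answer with the observation.
def scripted_model(transcript: str) -> str:
    if "Observation:" in transcript:
        return "FINAL: " + transcript.rsplit("Observation: ", 1)[1]
    return "ACTION calculator: 6 * 7"

print(react_loop("What is 6 * 7?", scripted_model))  # 42
```

A real framework replaces `scripted_model` with a DeepSeek chat call and parses tool calls from the model's structured output, but the loop shape is the same.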
@@ -163,6 +270,42 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/ragflow/README.md"> RAGFlow </a> </td>
<td> An open-source RAG (Retrieval-Augmented Generation) engine based on deep document understanding. It offers a streamlined RAG workflow for businesses of any scale, combining LLM (Large Language Models) to provide truthful question-answering capabilities, backed by well-founded citations from various complex formatted data. </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/pingcap/tidb.ai/main/frontend/app/public/nextra/icon-dark.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/autoflow/README.md"> Autoflow </a> </td>
<td> <a href="https://github.com/pingcap/autoflow">AutoFlow</a> is an open-source knowledge base tool based on GraphRAG (Graph-based Retrieval-Augmented Generation), built on <a href="https://www.pingcap.com/ai?utm_source=tidb.ai&utm_medium=community">TiDB</a> Vector, LlamaIndex, and DSPy. It provides a Perplexity-like search interface and allows easy integration of AutoFlow's conversational search window into your website by embedding a simple JavaScript snippet. </td>
</tr>
<tr>
<td> <img src="https://assets.zilliz.com/Zilliz_Logo_Mark_White_20230223_041013_86057436cc.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/zilliztech/deep-searcher"> DeepSearcher </a> </td>
<td> DeepSearcher combines powerful LLMs (DeepSeek, OpenAI, etc.) and vector databases (Milvus, etc.) to perform search, evaluation, and reasoning over private data, providing highly accurate answers and comprehensive reports. </td>
</tr>
</table>
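RAGFlow, AutoFlow, and DeepSearcher share one core loop: retrieve the documents most relevant to a query, then pack them into the prompt sent to the LLM. A dependency-free sketch of that retrieve-then-prompt step, with word overlap standing in for a real embedding model and vector database:

```python
def score(query: str, doc: str) -> float:
    """Toy relevance score: fraction of query words found in the document."""
    q = set(query.lower().split())
    d = set(doc.lower().replace(".", "").split())
    return len(q & d) / len(q) if q else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents with the highest toy relevance score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_rag_prompt(query: str, docs: list[str]) -> str:
    """Assemble the grounded prompt a RAG engine would send to DeepSeek."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Milvus is an open-source vector database.",
    "DeepSeek-R1 is a reasoning model.",
    "TiDB scales distributed SQL.",
]
prompt = build_rag_prompt("what is a vector database", docs)
```

Production engines swap the word-overlap `score` for embedding similarity and add chunking, reranking, and citation tracking around the same skeleton.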
### Solana frameworks
<table>
<tr>
<td> <img src="./docs/solana-agent-kit/assets/sendai-logo.png" alt="Icon" width="128" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/solana-agent-kit/README.md"> Solana Agent Kit </a> </td>
<td>An open-source toolkit for connecting AI agents to Solana protocols. Now, any agent, using any DeepSeek LLM, can autonomously perform 60+ Solana actions. </td>
</tr>
</table>
### Synthetic data curation
<table>
<tr>
<td> <img src="https://raw.githubusercontent.com/bespokelabsai/curator/main/docs/Bespoke-Labs-Logomark-Red-crop.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/curator/README.md"> Curator </a> </td>
<td> An open-source tool to curate large-scale datasets for post-training LLMs. </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/8455694b-c52e-40ec-847e-adf6a5ac064f" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/Kiln-AI/Kiln"> Kiln </a> </td>
<td>Generate synthetic datasets and distill R1 models into custom fine-tunes. </td>
</tr>
</table>
### IM Application Plugins
@@ -174,9 +317,14 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td>Domain knowledge assistant in personal WeChat and Feishu, focusing on answering questions.</td>
</tr>
<tr>
<td> <img src="https://github.com/RockChinQ/QChatGPT/blob/master/res/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/RockChinQ/QChatGPT">QChatGPT<br/>QQ</a> </td>
<td> A QQ chatbot with high stability, plugin support, and real-time networking. </td>
<td> <img src="https://github.com/RockChinQ/LangBot/blob/master/res/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/RockChinQ/LangBot">LangBot<br/>QQ, Lark, WeCom</a> </td>
<td> An LLM-based IM bot framework that supports QQ, Lark, WeCom, and more platforms.</td>
</tr>
<tr>
<td> <img src="https://nonebot.dev/logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/KomoriDev/nonebot-plugin-deepseek">NoneBot<br/>QQ, Lark, Discord, TG, etc.</a> </td>
<td> Based on the NoneBot framework, it provides intelligent chat and deep-thinking functions and supports QQ, Lark, Discord, TG, and more platforms.</td>
</tr>
</table>
@@ -213,13 +361,33 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://fluent.thinkstu.com/"> FluentRead </a> </td>
<td> A revolutionary open-source browser translation plugin that enables everyone to have a native-like reading experience </td>
</tr>
<tr>
<td> <img src="https://www.ncurator.com/_next/image?url=%2Ffavicon.ico&w=96&q=75" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.ncurator.com/"> Ncurator </a> </td>
<td> Knowledge Base AI Q&A Assistant - Let AI help you organize and analyze knowledge</td>
</tr>
<tr>
<td> <img src="https://github.com/oinzen/RSSFlow-doc/blob/main/docs/images/en/icon64.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://rssflow.oinchain.com"> RssFlow </a> </td>
<td>An intelligent RSS reader browser extension with AI-powered RSS summarization and multi-dimensional feed views. Supports DeepSeek model configuration for enhanced content understanding. </td>
</tr>
<tr>
<td> <img src="https://www.typral.com/_next/image?url=%2Ffavicon.ico&w=96&q=75" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.typral.com/"> Typral </a> </td>
<td> A fast AI writing assistant - let AI help you quickly improve articles, papers, and more.</td>
</tr>
<tr>
<td> <img src="https://static.trancy.org/assets/trancy_logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.trancy.org/"> Trancy </a> </td>
<td>Immersive bilingual translation, video bilingual subtitles, sentence/word selection translation extension</td>
</tr>
</table>
### VS Code Extensions
<table>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/e4d082de-6f64-44b9-beaa-0de55d70cfab" alt="Icon" width="64" height="auto" /> </td>
<td> <img src="https://github.com/continuedev/continue/blob/main/docs/static/img/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/continue/README.md"> Continue </a> </td>
<td> Continue is an open-source autopilot in IDE. </td>
</tr>
@@ -228,26 +396,56 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/cline/README.md"> Cline </a> </td>
<td> Meet Cline, an AI assistant that can use your CLI aNd Editor. </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/Sitoi/ai-commit/refs/heads/main/images/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/Sitoi/ai-commit/blob/main/README.md"> AI Commit </a> </td>
<td> Use AI to generate git commit messages in VS Code. </td>
</tr>
</table>
### Visual Studio Extensions
<table>
<tr>
<td> <img src="https://merryyellow.gallerycdn.vsassets.io/extensions/merryyellow/comment2gpt/2.0.5/1739475434185/Microsoft.VisualStudio.Services.Icons.Default" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://marketplace.visualstudio.com/items?itemName=MerryYellow.Comment2GPT"> Comment2GPT </a> </td>
<td> Use OpenAI ChatGPT, Google Gemini, Anthropic Claude, DeepSeek and Ollama through your comments </td>
</tr>
<tr>
<td> <img src="https://merryyellow.gallerycdn.vsassets.io/extensions/merryyellow/codelens2gpt/2.0.5/1739475875714/Microsoft.VisualStudio.Services.Icons.Default" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://marketplace.visualstudio.com/items?itemName=MerryYellow.CodeLens2GPT"> CodeLens2GPT </a> </td>
<td> Use OpenAI ChatGPT, Google Gemini, Anthropic Claude, DeepSeek and Ollama through the CodeLens </td>
</tr>
<tr>
<td> <img src="https://merryyellow.gallerycdn.vsassets.io/extensions/merryyellow/uca-lite/1.4.2/1739392928984/Microsoft.VisualStudio.Services.Icons.Default" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://marketplace.visualstudio.com/items?itemName=MerryYellow.UCA-Lite"> Unity Code Assist Lite </a> </td>
<td> Code assistance for Unity scripts </td>
</tr>
</table>
### neovim Extensions
<table>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <img src="https://github.com/user-attachments/assets/c316f70a-0a3c-4a32-b148-4df15e609acc" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/avante.nvim/README.md"> avante.nvim </a> </td>
<td> avante.nvim is an open-source autopilot in IDE. </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/llm.nvim/README.md"> llm.nvim </a> </td>
<td> A free large language model(LLM) plugin that allows you to interact with LLM in Neovim. Supports any LLM, such as Deepseek, GPT, GLM, Kimi or local LLMs (such as ollama). </td>
<td> A free large language model (LLM) plugin that lets you interact with LLMs in Neovim. Supports any LLM, such as DeepSeek, GPT, GLM, Kimi, or local LLMs (such as Ollama). </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/codecompanion.nvim/README.md"> codecompanion.nvim </a> </td>
<td> AI-powered coding, seamlessly in Neovim. </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/minuet-ai.nvim/README.md"> minuet-ai.nvim </a> </td>
<td> Minuet offers code completion as-you-type from popular LLMs including Deepseek, OpenAI, Gemini, Claude, Ollama, Codestral, and more. </td>
</tr>
</table>
### JetBrains Extensions
@@ -264,7 +462,7 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td>Onegai Copilot is an AI coding assistant in JetBrains IDEs. </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/e4d082de-6f64-44b9-beaa-0de55d70cfab" alt="Icon" width="64" height="auto" /> </td>
<td> <img src="https://github.com/continuedev/continue/blob/main/docs/static/img/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/continue/README.md"> Continue </a> </td>
<td> Continue is an open-source autopilot in IDE. </td>
</tr>
@@ -280,13 +478,28 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
</tr>
</table>
### Cursor
### Discord Bots
<table>
<tr>
<td> <img src="https://geneplore.com/img/geneplore_color_logo_circular.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/Geneplore AI/README.md"> Geneplore AI </a> </td>
<td> Geneplore AI runs one of the largest AI Discord bots, now with Deepseek v3 and R1. </td>
</tr>
</table>
### Native AI Code Editor
<table>
<tr>
<td> <img src="https://global.discourse-cdn.com/flex020/uploads/cursor1/original/2X/a/a4f78589d63edd61a2843306f8e11bad9590f0ca.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.cursor.com/"> Cursor </a> </td>
<td>The AI Code Editor</td>
<td>The AI Code Editor based on VS Code</td>
</tr>
<tr>
<td> <img src="https://exafunction.github.io/public/images/windsurf/windsurf-app-icon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://codeium.com/windsurf"> WindSurf </a> </td>
<td>Another AI Code Editor based on VS Code by Codeium</td>
</tr>
</table>
@@ -305,9 +518,34 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
</tr>
</table>
### Security
<table>
<tr>
<td> <img src="https://github.com/lukehinds/awesome-deepseek-integration/blob/codegate/docs/codegate/assets/codegate.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/stacklok/codegate/"> CodeGate </a> </td>
<td> CodeGate: secure AI code generation</td>
</tr>
</table>
### Others
<table>
<tr>
<td style="font-size: 64px">&#128032;</td>
<td> <a href="https://github.com/lunary-ai/abso/blob/main/README.md"> Abso </a></td>
<td>TypeScript SDK to interact with any LLM provider using the OpenAI format.</td>
</tr>
<tr>
<td> <img src="https://i.imgur.com/IsQYInJ.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/djcopley/ShellOracle/"> ShellOracle </a> </td>
<td> A terminal utility for intelligent shell command generation. </td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/178783630?s=200&v=4" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/bolna-ai/bolna/"> Bolna </a> </td>
<td> Use DeepSeek as the LLM for conversational voice AI agents</td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/c1e47b01-1766-4f7e-bfe6-ab3cb3991c30" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/tree/main/docs/siri_deepseek_shortcut"> siri_deepseek_shortcut </a> </td>
@@ -318,6 +556,11 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://github.com/rubickecho/n8n-deepseek"> n8n-nodes-deepseek </a> </td>
<td> An N8N community node that supports direct integration with the DeepSeek API into workflows. </td>
</tr>
<tr>
<td> <img src="https://framerusercontent.com/images/TSKshn2UFdTyvUi85EDMIXrXgs.png?scale-down-to=512" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/Portkey-AI/gateway"> Portkey AI </a> </td>
<td> Portkey is a unified API for interacting with over 1,600 LLM models, offering advanced tools for control, visibility, and security in your DeepSeek apps. Python & Node SDKs available. </td>
</tr>
<tr>
<td> <img src="https://framerusercontent.com/images/8rF2JOaZ8l9AvM4H6ezliw44aI.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/BerriAI/litellm"> LiteLLM </a> </td>
@@ -328,14 +571,34 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://github.com/mem0ai/mem0"> Mem0 </a> </td>
<td> Mem0 enhances AI assistants with an intelligent memory layer, enabling personalized interactions and continuous learning over time. </td>
</tr>
<tr>
<td> <img src="https://geneplore.com/img/geneplore_color_logo_circular.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://geneplore.com/bot"> Geneplore AI </a> </td>
<td> Geneplore AI runs one of the largest AI Discord bots, now with Deepseek v3 and R1. </td>
</tr>
<tr>
<td> <img src="https://www.promptfoo.dev/img/logo-panda.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/promptfoo/README.md"> promptfoo </a> </td>
<td> Test and evaluate LLM prompts, including DeepSeek models. Compare different LLM providers, catch regressions, and evaluate responses. </td>
</tr>
<tr>
<td> </td>
<td> <a href="https://github.com/AndersonBY/deepseek-tokenizer"> deepseek-tokenizer </a> </td>
<td> An efficient and lightweight tokenization library for DeepSeek models, relying solely on the `tokenizers` library without heavy dependencies like `transformers`. </td>
</tr>
<tr>
<td> <img src="https://langfuse.com/icon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://langfuse.com/docs/integrations/deepseek"> Langfuse </a> </td>
<td> Open-source LLM observability platform that helps teams collaboratively debug, analyze, and iterate on their DeepSeek applications. </td>
</tr>
<tr>
<td> CR </td>
<td> <a href="https://github.com/hustcer/deepseek-review"> deepseek-review </a> </td>
<td> 🚀 Sharpen your code, ship with confidence: elevate your workflow with DeepSeek Code Review 🚀 </td>
</tr>
<tr>
<td> <img src="http://gptlocalhost.com/wp-content/uploads/2025/01/icon_1024.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://youtu.be/T1my2gqi-7Q"> GPTLocalhost </a> </td>
<td> Use DeepSeek-R1 in Microsoft Word locally. No inference costs. </td>
</tr>
<tr>
<td> <img src="https://github.com/suqicloud/wp-ai-chat/raw/main/ic_logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/suqicloud/wp-ai-chat"> WordPress ai助手 </a> </td>
<td> A WordPress plugin that connects the DeepSeek API to provide an AI conversation assistant, post generation, and post summaries. </td>
</tr>
</table>
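Gateways such as LiteLLM, Portkey, and Abso present many providers behind one OpenAI-style interface; LiteLLM, for example, selects the provider from a `provider/model` prefix. A minimal sketch of that routing idea (the URL table is illustrative, not any gateway's actual registry):

```python
# Map provider prefixes to OpenAI-compatible base URLs (illustrative values).
PROVIDER_BASE_URLS = {
    "deepseek": "https://api.deepseek.com",
    "openai": "https://api.openai.com/v1",
}

def route(model_id: str) -> tuple[str, str]:
    """Split 'provider/model' and return (base_url, bare model name)."""
    provider, _, model = model_id.partition("/")
    if not model or provider not in PROVIDER_BASE_URLS:
        raise ValueError(f"unknown provider in {model_id!r}")
    return PROVIDER_BASE_URLS[provider], model

base_url, model = route("deepseek/deepseek-chat")
# base_url == "https://api.deepseek.com", model == "deepseek-chat"
```

A real gateway layers key management, retries, and response normalization on top of this dispatch step, which is what makes swapping DeepSeek in and out of existing OpenAI-format apps a one-line change.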

README_cn.md

@@ -8,7 +8,7 @@
Easily integrate DeepSeek large-model capabilities into all kinds of software. Visit the [DeepSeek Open Platform](https://platform.deepseek.com/) to get your API key.
[English](https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/README.md)/简体中文
[English](https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/README.md)/简体中文/[日本語](https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/README_ja.md)
</div>
@@ -18,6 +18,11 @@
### Applications
<table>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/171659527?s=400&u=39906ab3b6e2066f83046096a66a77fb3f8bb836&v=4" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/quantalogic/quantalogic">Quantalogic</a> </td>
<td> QuantaLogic is a ReAct (Reasoning & Action) framework for building advanced AI agents. </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/13600976/224d547a-6fbc-47c8-859f-aa14813e2b0f" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/chatbox/README_cn.md">Chatbox</a> </td>
@@ -43,9 +48,14 @@
<td> <a href="https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/deepseek">LibreChat</a> </td>
<td> LibreChat is a customizable open-source app that seamlessly integrates DeepSeek for an enhanced AI interaction experience. </td>
</tr>
<tr>
<td> <img src="https://www.papersgpt.com/images/logo/favicon.ico" alt="PapersGPT" width="64" height="auto" /> </td>
<td> <a href="https://github.com/papersgpt/papersgpt-for-zotero">PapersGPT</a> </td>
<td> PapersGPT is a Zotero plugin that integrates DeepSeek and multiple other AI models to assist with reading papers in Zotero. </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/rss-translator/RSS-Translator/main/core/static/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="hhttps://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/rss_translator/README_cn.md"> RSS翻译器 </a> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/rss_translator/README_cn.md"> RSS翻译器 </a> </td>
<td> An open-source, simple, self-hostable RSS translator. </td>
</tr>
<tr>
@@ -77,14 +87,15 @@
<td> <a href="docs/raycast/README_cn.md">Raycast</a></td>
<td> <a href="https://raycast.com/?via=ViGeng">Raycast</a> is a macOS productivity tool that lets you control your tools with a few keystrokes. It supports various extensions, including DeepSeek AI.</td>
</tr>
<tr> <td> <img src="https://niceprompt.app/favicon.ico" alt="Icon" width="64" height="auto" /> </td> <td> <a href="https://niceprompt.app">Nice Prompt</a></td> <td> A platform combining prompt engineering with social features, letting users efficiently create, share, and collaborate on AI prompts.</td> </tr>
<tr>
<td> <img src="./docs/zotero/assets/zotero-icon.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/zotero/README_cn.md">Zotero</a></td>
<td> <a href="https://www.zotero.org">Zotero</a> is a free, easy-to-use reference management tool designed to help you collect, organize, annotate, cite, and share research.</td>
</tr>
<tr>
<td> <img src="./docs/Siyuan/assets/image-20250122162731-7wkftbw.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/Siyuan/README_cn.md">思源笔记</a> </td>
<td> <img src="https://b3log.org/images/brand/siyuan-128.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/SiYuan/README_cn.md">思源笔记</a> </td>
<td> SiYuan is a privacy-first personal knowledge management system that supports complete offline usage and provides end-to-end encrypted data sync.</td>
</tr>
<tr>
@@ -112,6 +123,34 @@
<td> <a href="https://bobtranslate.com/">Bob</a></td>
<td> <a href="https://bobtranslate.com/">Bob</a> is a translation and OCR app for macOS. You can use Bob to translate and OCR in any application, ready whenever you need it.</td>
</tr>
<tr>
<td> <img src="https://github.com/ZGGSONG/STranslate/raw/main/img/favicon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://stranslate.zggsong.com/">STranslate</a></td>
<td> <a href="https://stranslate.zggsong.com/">STranslate</a>Windows是一款基于 WPF 开发的即用即走的翻译、OCR 工具 </td>
</tr>
<tr>
<td> <img src="https://www.gptaiflow.tech/logo.png" alt="gpt-ai-flow-logo" width="64" height="auto" /> </td>
<td> <a href="https://www.gptaiflow.tech/zh/docs/product/api-keys-setup#setup-deepseek-api-keys">GPT AI Flow</a></td>
<td>
工程师为效率狂人(他们自己)打造的终极生产力武器: <a href="https://www.gptaiflow.tech/zh/">GPT AI Flow</a>
<ul>
<li><code>Shift+Alt+空格</code> 唤醒桌面智能中枢</li>
<li>本地加密存储</li>
<li>自定义指令引擎</li>
<li>按需调用拒绝订阅捆绑</li>
</ul>
</td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/b09f17a8-936d-4dac-8b24-1682d52c9a3c" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/alecm20/story-flicks">Story-Flicks</a></td>
<td>通过一句话即可快速生成高清故事短视频,支持 DeepSeek 等模型。</td>
</tr>
<tr>
<td> <img src="https://www.petercat.ai/images/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.petercat.ai">PeterCat</a> </td>
<td> 我们提供对话式答疑 Agent 配置系统、自托管部署方案和便捷的一体化应用 SDK让您能够为自己的 GitHub 仓库一键创建智能答疑机器人,并快速集成到各类官网或项目中,为社区提供更高效的技术支持生态。</td>
</tr>
</table>
### AI Agent 框架
@ -122,6 +161,21 @@
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/anda/README_cn.md">Anda</a> </td>
<td>一个专为 AI 智能体开发设计的 Rust 语言框架,致力于构建高度可组合、自主运行且具备永久记忆能力的 AI 智能体网络。</td>
</tr>
<tr>
<td><img src="https://yomo.run/yomo-logo.png" alt="Icon" width="64" height="auto" /></td>
<td><a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/yomo/README.md">YoMo</a></td>
<td>Stateful Serverless LLM Function Calling Framework with Strongly-typed Language Support</td>
</tr>
<tr>
<td> <img src="https://alice.fun/alice-logo.png" alt="图标" width="64" height="auto" /> </td>
<td> <a href="https://github.com/bob-robert-ai/bob/blob/main/alice/readme.md">Alice</a> </td>
<td>一个基于 ICP 的自主 AI 代理,利用 DeepSeek 等大型语言模型进行链上决策。Alice 结合实时数据分析和独特的个性,管理代币、挖掘 BOB 并参与生态系统治理。</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/173022229" alt="图标" width="64" height="auto" /> </td>
<td> <a href="https://github.com/APRO-com">ATTPs</a> </td>
<td>一个用于 Agent 之间可信通信的基础协议框架。基于 DeepSeek 的 Agent 可以接入 <a href="https://docs.apro.com/attps">ATTPs</a> 的 SDK获得注册 Agent、发送可验证数据、获取可验证数据等功能从而与其他平台的 Agent 进行可信通信。</td>
</tr>
</table>
### RAG 框架
@ -132,6 +186,26 @@
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/ragflow/README_cn.md"> RAGFlow </a> </td>
<td> 一款基于深度文档理解构建的开源 RAGRetrieval-Augmented Generation引擎。RAGFlow 可以为各种规模的企业及个人提供一套精简的 RAG 工作流程结合大语言模型LLM针对用户各类不同的复杂格式数据提供可靠的问答以及有理有据的引用。 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/pingcap/tidb.ai/main/frontend/app/public/nextra/icon-dark.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/autoflow/README_cn.md"> Autoflow </a> </td>
<td> <a href="https://github.com/pingcap/autoflow">AutoFlow</a> 是一个开源的基于 GraphRAG 的知识库工具,构建于 <a href="https://www.pingcap.com/ai?utm_source=tidb.ai&utm_medium=community">TiDB</a> Vector、LlamaIndex 和 DSPy 之上。提供类 Perplexity 的搜索页面,并可以嵌入简单的 JavaScript 代码片段,轻松将 Autoflow 的对话式搜索窗口集成到您的网站。 </td>
</tr>
<tr>
<td> <img src="https://assets.zilliz.com/Zilliz_Logo_Mark_White_20230223_041013_86057436cc.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/zilliztech/deep-searcher"> DeepSearcher </a> </td>
<td> DeepSearcher 结合强大的 LLMDeepSeek、OpenAI 等和向量数据库Milvus 等),根据私有数据进行搜索、评估和推理,提供高度准确的答案和全面的报告。</td>
</tr>
</table>
### Solana 框架
<table>
<tr>
<td> <img src="./docs/solana-agent-kit/assets/sendai-logo.png" alt="Icon" width="128" height="auto" /> </td>
<td> <a href="https://github.com/sendaifun/solana-agent-kit"> Solana Agent Kit </a> </td>
<td>一个用于连接 AI 智能体到 Solana 协议的开源工具包。现在,任何使用 DeepSeek LLM 的智能体都可以自主执行 60+ 种 Solana 操作。</td>
</tr>
</table>
### 即时通讯插件
@ -143,9 +217,14 @@
<td> 一个集成到个人微信群/飞书群的领域知识助手,专注解答问题不闲聊</td>
</tr>
<tr>
<td> <img src="https://github.com/RockChinQ/LangBot/blob/master/res/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/RockChinQ/LangBot">LangBot<br/>QQ、企微、飞书</a> </td>
<td> 大模型原生即时通信机器人平台,适配 QQ / QQ频道 / 飞书 / OneBot / 企业微信wecom等多种消息平台 </td>
</tr>
<tr>
<td> <img src="https://nonebot.dev/logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/KomoriDev/nonebot-plugin-deepseek">NoneBot<br/>QQ、飞书、Discord、TG 等)</a> </td>
<td> 基于 NoneBot 框架,支持智能对话与深度思考功能。适配 QQ / 飞书 / Discord / TG 等多种消息平台 </td>
</tr>
</table>
@ -182,13 +261,33 @@
<td> <a href="https://fluent.thinkstu.com/"> 流畅阅读 </a> </td>
<td> 一款革新性的浏览器开源翻译插件,让所有人都能够拥有基于母语般的阅读体验 </td>
</tr>
<tr>
<td> <img src="https://www.ncurator.com/_next/image?url=%2Ffavicon.ico&w=96&q=75" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.ncurator.com/"> 馆长 </a> </td>
<td> 知识库AI问答助手 - 让AI帮助你整理与分析知识</td>
</tr>
<tr>
<td> <img src="https://github.com/oinzen/RSSFlow-doc/blob/main/docs/images/en/icon64.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://rssflow.oinchain.com"> RssFlow </a> </td>
<td>一款智能的RSS阅读器浏览器扩展具有AI驱动的RSS摘要和多维度订阅视图功能。支持配置DeepSeek模型以增强内容理解能力。</td>
</tr>
<tr>
<td> <img src="https://www.typral.com/_next/image?url=%2Ffavicon.ico&w=96&q=75" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.typral.com/"> Typral </a> </td>
<td>超快的 AI 写作助手 - 让 AI 帮你快速优化日报、文章、文本等</td>
</tr>
<tr>
<td> <img src="https://static.trancy.org/assets/trancy_logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.trancy.org/"> Trancy </a> </td>
<td>沉浸双语对照翻译、视频双语字幕、划句/划词翻译插件</td>
</tr>
</table>
### VS Code 插件
<table>
<tr>
<td> <img src="https://github.com/continuedev/continue/blob/main/docs/static/img/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/continue/README_cn.md"> Continue </a> </td>
<td> 开源 IDE 插件,使用 LLM 做你的编程助手 </td>
</tr>
@ -197,13 +296,18 @@
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/cline/README.md"> Cline </a> </td>
<td> Cline 是一款能够使用您的 CLI 和编辑器的 AI 助手。 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/Sitoi/ai-commit/refs/heads/main/images/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/Sitoi/ai-commit/blob/main/README.md"> AI Commit </a> </td>
<td> 使用 AI 生成 git commit message 的 VS Code 插件。 </td>
</tr>
</table>
### neovim 插件
<table>
<tr>
<td> <img src="https://github.com/user-attachments/assets/c316f70a-0a3c-4a32-b148-4df15e609acc" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/avante.nvim/README_cn.md"> avante.nvim </a> </td>
<td> 开源 IDE 插件,使用 LLM 做你的编程助手 </td>
</tr>
@ -212,6 +316,11 @@
<td> <a href="docs/llm.nvim/README.md"> llm.nvim </a> </td>
<td> 免费的大语言模型插件让你在Neovim中与大模型交互支持任意一款大模型比如DeepSeek、GPT、GLM、Kimi或者本地运行的大模型比如Ollama </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/minuet-ai.nvim/README_cn.md"> minuet-ai.nvim </a> </td>
<td> Minuet 提供实时代码补全功能,支持多个主流大语言模型,包括 Deepseek、OpenAI、Gemini、Claude、Ollama、Codestral 等。 </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/codecompanion.nvim/README.md"> codecompanion.nvim </a> </td>
@ -234,9 +343,34 @@
</tr>
</table>
### AI Code编辑器
<table>
<tr>
<td> <img src="https://global.discourse-cdn.com/flex020/uploads/cursor1/original/2X/a/a4f78589d63edd61a2843306f8e11bad9590f0ca.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.cursor.com/"> Cursor </a> </td>
<td>基于VS Code进行扩展的AI Code编辑器</td>
</tr>
<tr>
<td> <img src="https://exafunction.github.io/public/images/windsurf/windsurf-app-icon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://codeium.com/windsurf"> WindSurf </a> </td>
<td>另一个基于VS Code的AI Code编辑器由Codeium出品</td>
</tr>
</table>
### 其它
<table>
<tr>
<td><p style="font-size: 84px">&#128032;</p></td>
<td> <a href="https://github.com/lunary-ai/abso/blob/main/README.md"> Abso </a> </td>
<td> TypeScript SDK 使用 OpenAI 格式与任何 LLM 提供商进行交互。</td>
</tr>
<tr>
<td> <img src="https://i.imgur.com/IsQYInJ.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/djcopley/ShellOracle/"> ShellOracle </a> </td>
<td> 一种用于智能 shell 命令生成的终端工具。 </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/c1e47b01-1766-4f7e-bfe6-ab3cb3991c30" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/tree/main/docs/siri_deepseek_shortcut"> 深度求索(快捷指令) </a> </td>
@ -252,4 +386,18 @@
<td> <a href="docs/promptfoo/README.md"> promptfoo </a> </td>
<td> 测试和评估LLM提示包括DeepSeek模型。比较不同的LLM提供商捕获回归并评估响应。 </td>
</tr>
<tr>
<td> </td>
<td> <a href="https://github.com/AndersonBY/deepseek-tokenizer"> deepseek-tokenizer </a> </td>
<td> 一个高效的轻量级 tokenization 库,仅依赖 <code>tokenizers</code> 库,不依赖 <code>transformers</code> 等重量级依赖。 </td>
</tr>
<tr>
<td> CR </td>
<td> <a href="https://github.com/hustcer/deepseek-review"> deepseek-review </a> </td>
<td> 🚀 使用 DeepSeek 进行代码审核,支持 GitHub Action 和本地 🚀 </td>
</tr>
<tr>
<td> <img src="https://github.com/suqicloud/wp-ai-chat/raw/main/ic_logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/suqicloud/wp-ai-chat"> WordPress AI 助手 </a> </td>
<td> 对接 DeepSeek API用于 WordPress 站点的 AI 对话助手、AI 文章生成、AI 文章总结插件。 </td>
</tr>
</table>

README_ja.md Normal file
@ -0,0 +1,411 @@
<div align="center">
<p align="center">
<img width="1000px" alt="Awesome DeepSeek Integrations" src="docs/Awesome DeepSeek Integrations.png">
</p>
# Awesome DeepSeek Integrations ![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)
DeepSeek API を人気のソフトウェアに統合します。API キーを取得するには、[DeepSeek Open Platform](https://platform.deepseek.com/)にアクセスしてください。
[English](https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/README.md)/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/README_cn.md)/日本語
</div>
</br>
</br>
### アプリケーション
<table>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/171659527?s=400&u=39906ab3b6e2066f83046096a66a77fb3f8bb836&v=4" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/quantalogic/quantalogic">Quantalogic</a> </td>
<td> QuantaLogicは、高度なAIエージェントを構築するためのReAct推論と行動フレームワークです。 </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/13600976/224d547a-6fbc-47c8-859f-aa14813e2b0f" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/chatbox/README.md">Chatbox</a> </td>
<td> Chatboxは、Windows、Mac、Linuxで利用可能な複数の最先端LLMモデルのデスクトップクライアントです。 </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/bb65404c-f867-42d8-ae2b-281fe953ab54" alt="Icon" width="64" height="auto"/> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/chatgpt_next_web/README.md"> ChatGPT-Next-Web </a> </td>
<td> ChatGPT Next Webは、GPT3、GPT4、Gemini ProをサポートするクロスプラットフォームのChatGPTウェブUIです。 </td>
</tr>
<tr>
<td> <img src="./docs/liubai/assets/liubai-logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/liubai/README.md">Liubai</a> </td>
<td> Liubaiは、WeChat上でDeepSeekを使用してノート、タスク、カレンダー、ToDoリストを操作できるようにします </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/1ac9791b-87f7-41d9-9282-a70698344e1d" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/pal/README.md"> Pal - AI Chat Client<br/>(iOS, iPadOS) </a> </td>
<td> Palは、iOS上でカスタマイズされたチャットプレイグラウンドです。 </td>
</tr>
<tr>
<td> <img src="https://www.librechat.ai/librechat.svg" alt="LibreChat" width="64" height="auto" /> </td>
<td> <a href="https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/deepseek">LibreChat</a> </td>
<td> LibreChatは、DeepSeekをシームレスに統合してAIインタラクションを強化するカスタマイズ可能なオープンソースアプリです。 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/rss-translator/RSS-Translator/main/core/static/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/rss_translator/README.md"> RSS Translator </a> </td>
<td> RSSフィードをあなたの言語に翻訳します </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/ysnows/enconvo_media/main/logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/enconvo/README.md"> Enconvo </a> </td>
<td> Enconvoは、AI時代のランチャーであり、すべてのAI機能のエントリーポイントであり、思いやりのあるインテリジェントアシスタントです。</td>
</tr>
<tr>
<td><img src="https://github.com/kangfenmao/cherry-studio/blob/main/src/renderer/src/assets/images/logo.png?raw=true" alt="Icon" width="64" height="auto" style="border-radius: 10px" /></td>
<td><a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/cherrystudio/README.md">Cherry Studio</a></td>
<td>プロデューサーのための強力なデスクトップAIアシスタント</td>
</tr>
<tr>
<td> <img src="https://tomemo.top/images/logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/tomemo/README.md"> ToMemo (iOS, iPadOS) </a> </td>
<td> フレーズブック+クリップボード履歴+キーボードiOSアプリで、キーボードでの迅速な出力にAIマクロモデリングを統合しています。</td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/buxuku/video-subtitle-master/refs/heads/main/resources/icon.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/buxuku/video-subtitle-master">Video Subtitle Master</a></td>
<td> ビデオの字幕を一括生成し、字幕を他の言語に翻訳することができます。これはクライアントサイドのツールで、MacとWindowsの両方のプラットフォームをサポートし、Baidu、Volcengine、DeepLx、OpenAI、DeepSeek、Ollamaなどの複数の翻訳サービスと統合されています。</td>
</tr>
<tr>
<td> <img src="https://github.com/UnknownEnergy/chatgpt-api/blob/master/dist/assets/chatworm-72x72.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/UnknownEnergy/chatgpt-api/blob/master/README.md">Chatworm</a> </td>
<td> Chatwormは、複数の最先端LLMモデルのためのウェブアプリで、オープンソースであり、Androidでも利用可能です。 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/tisfeng/ImageBed/main/uPic/icon_512x512@2x.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/tisfeng/Easydict">Easydict</a></td>
<td> Easydictは、単語の検索やテキストの翻訳を簡単かつエレガントに行うことができる、簡潔で使いやすい翻訳辞書macOSアプリです。大規模言語モデルAPIを呼び出して翻訳を行うことができます。</td>
</tr>
<tr>
<td> <img src="https://www.raycast.com/favicon-production.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/raycast/README.md">Raycast</a></td>
<td> <a href="https://raycast.com/?via=ViGeng">Raycast</a>は、macOSの生産性ツールで、いくつかのキーストロークでツールを制御できます。DeepSeek AIを含むさまざまな拡張機能をサポートしています。</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/193405629?s=200&v=4" alt="PHP Client" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-php/deepseek-php-client/blob/master/README.md">PHP Client</a> </td>
<td> Deepseek PHP Clientは、Deepseek APIとのシームレスな統合のための堅牢でコミュニティ主導のPHPクライアントライブラリです。 </td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/958072?s=200&v=4" alt="Laravel Integration" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-php/deepseek-laravel/blob/master/README.md">Laravel Integration</a> </td>
<td> LaravelアプリケーションとのシームレスなDeepseek API統合のためのLaravelラッパー。</td>
</tr>
<tr>
<td> <img src="./docs/zotero/assets/zotero-icon.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/zotero/README.md">Zotero</a></td>
<td> <a href="https://www.zotero.org">Zotero</a>は、研究成果を収集、整理、注釈、引用、共有するのに役立つ無料で使いやすいツールです。</td>
</tr>
<tr>
<td> <img src="https://b3log.org/images/brand/siyuan-128.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/SiYuan/README.md">SiYuan</a> </td>
<td> SiYuanは、完全にオフラインで使用できるプライバシー優先の個人知識管理システムであり、エンドツーエンドの暗号化データ同期を提供します。</td>
</tr>
<tr>
<td> <img src="https://github.com/ArvinLovegood/go-stock/raw/master/build/appicon.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/ArvinLovegood/go-stock/blob/master/README.md">go-stock</a> </td>
<td>go-stockは、Wailsを使用してNativeUIで構築され、LLMによって強化された中国株データビューアです。</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/102771702?s=200&v=4" alt="Wordware" width="64" height="auto" /> </td>
<td> <a href="docs/wordware/README.md">Wordware</a> </td>
<td><a href="https://www.wordware.ai/">Wordware</a>は、誰でも自然言語だけでAIスタックを構築、反復、デプロイできるツールキットです。</td>
</tr>
<tr>
<td> <img src="https://framerusercontent.com/images/xRJ6vNo9mUYeVNxt0KITXCXEuSk.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/langgenius/dify/">Dify</a> </td>
<td> <a href="https://dify.ai/">Dify</a>は、アシスタント、ワークフロー、テキストジェネレーターなどのアプリケーションを作成するためのDeepSeekモデルをサポートするLLMアプリケーション開発プラットフォームです。 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/enricoros/big-AGI/refs/heads/v2-dev/public/favicon.ico" alt="Big-AGI" width="64" height="auto" /> </td>
<td> <a href="https://github.com/enricoros/big-AGI/blob/v2-dev/README.md">Big-AGI</a> </td>
<td><a href="https://big-agi.com/">Big-AGI</a>は、誰もが高度な人工知能にアクセスできるようにするための画期的なAIスイートです。</td>
</tr>
<tr>
<td> <img src="https://github.com/LiberSonora/LiberSonora/blob/main/assets/avatar.jpeg?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/LiberSonora/LiberSonora/blob/main/README_en.md">LiberSonora</a> </td>
<td> LiberSonoraは、「自由の声」を意味し、AIによって強化された強力なオープンソースのオーディオブックツールキットであり、インテリジェントな字幕抽出、AIタイトル生成、多言語翻訳などの機能を備え、GPUアクセラレーションとバッチオフライン処理をサポートしています。</td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/ripperhe/Bob/master/docs/_media/icon_128.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://bobtranslate.com/">Bob</a></td>
<td> <a href="https://bobtranslate.com/">Bob</a>は、任意のアプリで使用できるmacOSの翻訳およびOCRツールです。</td>
</tr>
<tr>
<td> <img src="https://agenticflow.ai/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://agenticflow.ai/">AgenticFlow</a> </td>
<td> <a href="https://agenticflow.ai/">AgenticFlow</a>は、マーケターがAIエージェントのためのエージェンティックAIワークフローを構築するためのノーコードプラットフォームであり、数百の毎日のアプリをツールとして使用します。</td>
</tr>
<tr>
<td> <img src="https://www.petercat.ai/images/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.petercat.ai">PeterCat</a> </td>
<td> 会話型Q&Aエージェントの構成システム、自ホスト型デプロイメントソリューション、および便利なオールインワンアプリケーションSDKを提供し、GitHubリポジトリのためのインテリジェントQ&Aボットをワンクリックで作成し、さまざまな公式ウェブサイトやプロジェクトに迅速に統合し、コミュニティのためのより効率的な技術サポートエコシステムを提供します。</td>
</tr>
</table>
### AI エージェントフレームワーク
<table>
<tr>
<td> <img src="https://panda.fans/_assets/favicons/apple-touch-icon.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/anda/README.md">Anda</a> </td>
<td>高度にコンポーザブルで自律的かつ永続的な記憶を持つAIエージェントネットワークを構築するために設計されたRustフレームワーク。</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/173022229" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/APRO-com">ATTPs</a> </td>
<td>エージェント間の信頼できる通信のための基本プロトコルフレームワークです。利用者は<a href="https://docs.apro.com/attps">ATTPs</a>のSDKを導入することで、エージェントの登録、検証可能なデータの送信、検証可能なデータの取得などの機能を利用することができます。</td>
</tr>
</table>
### RAG フレームワーク
<table>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/33142505/77093e84-9f7c-4716-9168-bac962fa1372" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/ragflow/README.md"> RAGFlow </a> </td>
<td> 深い文書理解に基づいたオープンソースのRAGRetrieval-Augmented Generationエンジン。RAGFlowは、あらゆる規模の企業や個人に対して、ユーザーのさまざまな複雑な形式のデータに対して信頼性のある質問応答と根拠のある引用を提供するための簡素化されたRAGワークフローを提供します。 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/pingcap/tidb.ai/main/frontend/app/public/nextra/icon-dark.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/autoflow/README.md"> Autoflow </a> </td>
<td> <a href="https://github.com/pingcap/autoflow">AutoFlow</a> は、GraphRAGに基づくオープンソースのナレッジベースツールであり、<a href="https://www.pingcap.com/ai?utm_source=tidb.ai&utm_medium=community">TiDB</a> Vector、LlamaIndex、DSPy の上に構築されています。Perplexity のような検索インターフェースを提供し、シンプルな JavaScript スニペットを埋め込むことで、AutoFlow の対話型検索ウィンドウを簡単にウェブサイトに統合できます。 </td>
</tr>
<tr>
<td> <img src="https://assets.zilliz.com/Zilliz_Logo_Mark_White_20230223_041013_86057436cc.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/zilliztech/deep-searcher"> DeepSearcher </a> </td>
<td> DeepSearcher は、強力な大規模言語モデルDeepSeek、OpenAI などとベクトルデータベースMilvus など)を組み合わせて、私有データに基づく検索、評価、推論を行い、高精度な回答と包括的なレポートを提供します。</td>
</tr>
</table>
### Solana フレームワーク
<table>
<tr>
<td> <img src="./docs/solana-agent-kit/assets/sendai-logo.png" alt="Icon" width="128" height="auto" /> </td>
<td> <a href="https://github.com/sendaifun/solana-agent-kit"> Solana Agent Kit </a> </td>
<td>AIエージェントをSolanaプロトコルに接続するためのオープンソースツールキット。DeepSeek LLMを使用する任意のエージェントが、60以上のSolanaアクションを自律的に実行できます。</td>
</tr>
</table>
### IM アプリケーションプラグイン
<table>
<tr>
<td> <img src="https://github.com/InternLM/HuixiangDou/releases/download/v0.1.0rc1/huixiangdou.jpg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/huixiangdou/README_cn.md">HuixiangDou<br/>(wechat,lark)</a> </td>
<td>個人のWeChatおよびFeishuでのドメイン知識アシスタントで、質問に答えることに焦点を当てています。</td>
</tr>
<tr>
<td> <img src="https://github.com/RockChinQ/QChatGPT/blob/master/res/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/RockChinQ/QChatGPT">QChatGPT<br/>QQ</a> </td>
<td> 高い安定性、プラグインサポート、リアルタイムネットワーキングを備えたQQチャットボット。 </td>
</tr>
<tr>
<td> <img src="https://nonebot.dev/logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/KomoriDev/nonebot-plugin-deepseek">NoneBot<br/>QQ, Lark, Discord, TG, etc.</a> </td>
<td> NoneBotフレームワークを基に、インテリジェントな会話と深い思考機能をサポートします。QQ / 飛書 / Discord / Telegram など多様なメッセージプラットフォームに対応しています </td>
</tr>
</table>
### ブラウザ拡張機能
<table>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/9d3f42b8-fcd0-47ab-8b06-1dd0554dd80e" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/immersive_translate/README.md"> Immersive Translate </a> </td>
<td> Immersive Translateは、バイリンガルのウェブページ翻訳プラグインです。 </td>
</tr>
<tr>
<td> <img src="https://lh3.googleusercontent.com/K9i0qJb8phasC5wWf5T68rhnfvX4swsE0hrhJP-WB3WV7MwE5KpMUIJvHKNHHRE6GKNIvIdTNSWoDMl_NggrmUsaw=s120" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/immersive_reading_guide/README.md"> Immersive Reading Guide </a> </td>
<td> サイドバーなし!!! 没入型のAIウェブ要約、質問をする... </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/8a301619-a3de-489b-81fd-69aaa7c1c561" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/chatgpt_box/README.md"> ChatGPT Box </a> </td>
<td> ChatGPT Boxは、ブラウザに統合されたChatGPTで、完全に無料です。 </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/c3d9d100-247a-41cc-97c1-10b01ed25e70" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/hcfy/README.md"> hcfy (划词翻译) </a> </td>
<td> hcfy (划词翻译)は、複数の翻訳サービスを統合するウェブブラウザ拡張機能です。 </td>
</tr>
<tr>
<td> <img src="https://static.eudic.net/web/trans/en_trans.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/Lulu Translate/README.md"> Lulu Translate </a> </td>
<td> このプラグインは、マウス選択翻訳、段落ごとの比較翻訳、およびPDF文書翻訳機能を提供します。DeepSeek AI、Bing、GPT、Googleなどのさまざまな翻訳エンジンを利用できます。 </td>
</tr>
<tr>
<td> <img src="https://github.com/Bistutu/FluentRead/raw/refs/heads/main/public/icon/192.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://fluent.thinkstu.com/"> FluentRead </a> </td>
<td> 誰もが母国語のような読書体験を持つことができる革新的なオープンソースのブラウザ翻訳プラグイン </td>
</tr>
<tr>
<td> <img src="https://github.com/oinzen/RSSFlow-doc/blob/main/docs/images/en/icon64.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://rssflow.oinchain.com"> RssFlow </a> </td>
<td>AIを活用したRSS要約と多次元フィードビューを備えたインテリジェントなRSSリーダーブラウザ拡張機能。コンテンツ理解を強化するためのDeepSeekモデル設定をサポートしています。</td>
</tr>
<tr>
<td> <img src="https://www.ncurator.com/_next/image?url=%2Ffavicon.ico&w=96&q=75" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.ncurator.com/"> Ncurator </a> </td>
<td> ナレッジベース AI Q&Aアシスタント AIがあなたの知識の整理と分析をお手伝いします</td>
</tr>
<tr>
<td> <img src="https://www.typral.com/_next/image?url=%2Ffavicon.ico&w=96&q=75" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.typral.com/"> Typral </a> </td>
<td>超高速AIライティングアシスタント - AIがあなたの日報、記事、テキストなどを素早く最適化します</td>
</tr>
<tr>
<td> <img src="https://static.trancy.org/assets/trancy_logo.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.trancy.org/"> Trancy </a> </td>
<td>イマーシブな二か国語対照翻訳、動画の二か国語字幕、文/単語の選択翻訳プラグイン</td>
</tr>
</table>
### VS Code 拡張機能
<table>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/e4d082de-6f64-44b9-beaa-0de55d70cfab" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/continue/README.md"> Continue </a> </td>
<td> Continueは、IDEのオープンソースの自動操縦です。 </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/cline/assets/favicon.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/cline/README.md"> Cline </a> </td>
<td> Clineは、CLIとエディタを使用できるAIアシスタントです。 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/Sitoi/ai-commit/refs/heads/main/images/logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/Sitoi/ai-commit/blob/main/README.md"> AI Commit </a> </td>
<td> VS Code で AI を使用して git commit message を生成するプラグイン。 </td>
</tr>
</table>
### neovim 拡張機能
<table>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/avante.nvim/README.md"> avante.nvim </a> </td>
<td> avante.nvimは、IDEのオープンソースの自動操縦です。 </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/llm.nvim/README.md"> llm.nvim </a> </td>
<td> NeovimでLLMと対話できる無料の大規模言語モデルLLMプラグイン。Deepseek、GPT、GLM、Kimi、またはローカルLLMollamaなどなど、任意のLLMをサポートします。 </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/codecompanion.nvim/README.md"> codecompanion.nvim </a> </td>
<td> Neovimでシームレスに統合されたAI駆動のコーディング。 </td>
</tr>
</table>
### JetBrains 拡張機能
<table>
<tr>
<td> <img src="https://plugins.jetbrains.com/files/21520/412905/icon/pluginIcon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://ide.unitmesh.cc/quick-start"> AutoDev </a> </td>
<td>AutoDevは、JetBrainのIDEでのオープンソースのAIコーディングアシスタントです。 </td>
</tr>
<tr>
<td> <img src="https://plugins.jetbrains.com/files/21410/561595/icon/pluginIcon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://plugins.jetbrains.com/plugin/21410-onegai-copilot"> Onegai Copilot </a> </td>
<td>Onegai Copilotは、JetBrainのIDEでのAIコーディングアシスタントです。 </td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/e4d082de-6f64-44b9-beaa-0de55d70cfab" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/continue/README.md"> Continue </a> </td>
<td> Continueは、IDEのオープンソースの自動操縦です。 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/a18792721831/studyplugin/535b9cab69da0f97b42dcaebb00bb0d4ed15c8a6/translate/src/main/resources/META-INF/pluginIcon.svg" alt="Icon" width="64" height="auto"/> </td>
<td> <a href="https://plugins.jetbrains.com/plugin/18336-chinese-english-translate">Chinese-English Translate</a> </td>
<td> JetBrainのIDEでの複数の翻訳サービス。 </td>
</tr>
<tr>
<td> <img src="https://plugins.jetbrains.com/files/24851/659002/icon/pluginIcon.svg" alt="Icon" width="64" height="auto"/> </td>
<td> <a href="https://plugins.jetbrains.com/plugin/24851-ai-git-commit">AI Git Commit</a> </td>
<td> このプラグインは、コードの変更に基づいてコミットメッセージを自動生成するためにAIを使用します。 </td>
</tr>
</table>
### AI コードエディタ
<table>
<tr>
<td> <img src="https://global.discourse-cdn.com/flex020/uploads/cursor1/original/2X/a/a4f78589d63edd61a2843306f8e11bad9590f0ca.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://www.cursor.com/"> Cursor </a> </td>
<td>AIコードエディタ</td>
</tr>
<tr>
<td> <img src="https://exafunction.github.io/public/images/windsurf/windsurf-app-icon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://codeium.com/windsurf"> WindSurf </a> </td>
<td>CodeiumによるVS CodeをベースにしたAIコードエディタ</td>
</tr>
</table>
### Emacs
<table>
<tr>
<td> <img src="https://upload.wikimedia.org/wikipedia/commons/0/08/EmacsIcon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/karthink/gptel"> gptel </a> </td>
<td>EmacsのためのシンプルなLLMクライアント</td>
</tr>
<tr>
<td> <img src="https://upload.wikimedia.org/wikipedia/commons/0/08/EmacsIcon.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/milanglacier/minuet-ai.el"> Minuet AI </a> </td>
<td>コードでインテリジェンスとダンス💃</td>
</tr>
</table>
### その他
<table>
<tr>
<td style="font-size: 64px">&#128032;</td>
<td> <a href="https://github.com/lunary-ai/abso/blob/main/README.md"> Abso </a></td>
<td> OpenAIフォーマットを使用するあらゆるLLMプロバイダと対話するためのTypeScript SDK.</td>
</tr>
<tr>
<td> <img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/c1e47b01-1766-4f7e-bfe6-ab3cb3991c30" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/tree/main/docs/siri_deepseek_shortcut"> siri_deepseek_shortcut </a> </td>
<td> Siri equipped with the DeepSeek API </td>
</tr>
<tr>
<td> <img src="https://github.com/n8n-io/n8n/blob/master/assets/n8n-logo.png?raw=true" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/rubickecho/n8n-deepseek"> n8n-nodes-deepseek </a> </td>
<td> An n8n community node to integrate the DeepSeek API directly into your workflows. </td>
</tr>
<tr>
<td> <img src="https://framerusercontent.com/images/8rF2JOaZ8l9AvM4H6ezliw44aI.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/BerriAI/litellm"> LiteLLM </a> </td>
<td> A Python SDK and proxy server (LLM gateway) to call 100+ LLM APIs in the OpenAI format. Also supports DeepSeek AI, with cost tracking. </td>
</tr>
<tr>
<td> <img src="https://i.postimg.cc/k5Z4YWjt/Screenshot-2025-01-23-at-6-08-01-PM.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/mem0ai/mem0"> Mem0 </a> </td>
<td> Mem0 adds an intelligent memory layer to AI assistants, enabling personalized interactions and continuous learning. </td>
</tr>
<tr>
<td> <img src="https://geneplore.com/img/geneplore_color_logo_circular.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://geneplore.com/bot"> Geneplore AI </a> </td>
<td> Geneplore AI runs one of the largest AI Discord bots, powered by Deepseek v3 and R1. </td>
</tr>
<tr>
<td> <img src="https://www.promptfoo.dev/img/logo-panda.svg" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/promptfoo/README.md"> promptfoo </a> </td>
<td> Test and evaluate LLM prompts, including DeepSeek models. Compare LLM providers, catch regressions, and evaluate responses. </td>
</tr>
</table>

docs/16x_prompt/README.md
# [16x Prompt](https://prompt.16x.engineer/)
AI Coding with Context Management.
16x Prompt helps developers manage source code context and craft prompts for complex coding tasks on existing codebases.
## UI
![image](assets/16x_prompt_ui.png)
## Integrate with DeepSeek API
1. Click on the model selection button at the bottom right
2. Click on "DeepSeek API" to automatically fill in API Endpoint
3. Enter model ID, for example `deepseek-chat` (for DeepSeek V3) or `deepseek-reasoner` (for DeepSeek R1)
4. Enter your API key
![image](assets/16x_prompt_integration.png)
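The same endpoint and model IDs can be exercised outside the app with a direct API call; a minimal sketch using only the Python standard library (assuming the documented OpenAI-compatible endpoint and a `DEEPSEEK_API_KEY` environment variable):

```python
import json
import os
import urllib.request

API_URL = "https://api.deepseek.com/chat/completions"

def build_payload(model: str, prompt: str) -> dict:
    # model is "deepseek-chat" (DeepSeek V3) or "deepseek-reasoner" (DeepSeek R1)
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(model: str, prompt: str) -> str:
    # Send one chat request and return the assistant's reply text.
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['DEEPSEEK_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```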

docs/ATTPs/README.md
# APRO-COM/ATTPs-framework
Foundation framework that enables advanced DeepSeek-based agents to perform interactions, data verification, and price queries with the [ATTPs Protocol](https://docs.apro.com/attps). It streamlines agent creation and verification processes and provides a flexible framework for building robust agent-based solutions.
For more details about ATTPs, see the [whitepaper](https://www.apro.com/attps.pdf).
## Overview
The ATTPs framework bridges agent-based logic with DeepSeek. It handles agent registration, data verification, and price queries, empowering both automated and user-driven workflows.
## Features
### Agent Operations
- **Agent Creation**: Deploy new agents with custom settings
- **Registration**: Register agents on-chain or via standardized processes
- **Multi-Signer Framework**: Supports threshold-based approval flows
### Data Verification
- **Chain Validation**: Verify data authenticity on-chain
- **Transaction Execution**: Handle verification logic with built-in security checks
- **Auto-Hashing**: Convert raw data to hashed formats when needed
- **Metadata Parsing**: Validate content type, encoding, and compression
### Price Queries
- **Live Price Data**: Fetch price information for various pairs
- **Format Validation**: Normalize user query inputs to standard trading-pair formats
- **APIs Integration**: Retrieve real-time or near-real-time pricing information
## Security Features
### Access Control
- **Private Key Management**: Safe usage of private keys for transaction signing
- **Environment Variables**: Secure injection of credentials
- **On-Chain Validation**: Leverage on-chain contract checks
### Verification
- **Input Validation**: Strict schema checks before on-chain operations
- **Transaction Receipts**: Provide verifiable transaction details
- **Error Handling**: Detailed error logs for quick debugging
## Performance Optimization
1. **Cache Management**
- Implement caching for frequent queries
- Monitor retrieval times and cache hits
2. **Network Efficiency**
- Batch requests where possible
- Validate response parsing to reduce overhead
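The two recommendations above can be combined in one helper; a hypothetical sketch (not part of the framework's API) of a TTL cache in front of a batched fetcher, with a hit counter for monitoring:

```python
import time
from typing import Callable, Dict, Iterable, List, Tuple

class TTLCache:
    """Cache query results for `ttl` seconds and count hits for monitoring."""

    def __init__(self, ttl: float):
        self.ttl = ttl
        self.hits = 0
        self._store: Dict[str, Tuple[float, object]] = {}

    def get_many(
        self,
        keys: Iterable[str],
        fetch_batch: Callable[[List[str]], Dict[str, object]],
    ) -> Dict[str, object]:
        now = time.monotonic()
        results, missing = {}, []
        for k in keys:
            entry = self._store.get(k)
            if entry and now - entry[0] < self.ttl:
                self.hits += 1
                results[k] = entry[1]
            else:
                missing.append(k)
        if missing:
            # One batched request for everything the cache could not serve.
            fetched = fetch_batch(missing)
            for k, v in fetched.items():
                self._store[k] = (now, v)
            results.update(fetched)
        return results
```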
## System Requirements
- Node.js 16.x or higher
- Sufficient network access to on-chain endpoints
- Basic configuration of environment variables
- Minimum 4GB RAM recommended
## Troubleshooting
1. **Invalid Agent Settings**
- Ensure signers and threshold are correct
- Validate agentHeader for proper UUIDs and numeric values
2. **Verification Failures**
- Check the input data formats
- Confirm environment variables are set
3. **Price Query Errors**
- Verify the trading pair format
- Check external API availability
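The first two checks above can be automated as a pre-flight validator; a hypothetical sketch whose field names follow the JS `AgentSettings` shape shown later in this document:

```python
import uuid

def check_agent_settings(settings: dict) -> list:
    """Return a list of problems found in agent settings before going on-chain."""
    problems = []
    signers = settings.get("signers", [])
    threshold = settings.get("threshold", 0)
    if not isinstance(threshold, int) or threshold < 1:
        problems.append("threshold must be a positive integer")
    elif threshold > len(signers):
        problems.append("threshold exceeds number of signers")
    header = settings.get("agentHeader", {})
    for field in ("messageId", "sourceAgentId"):
        try:
            uuid.UUID(str(header.get(field)))
        except ValueError:
            problems.append(f"{field} is not a valid UUID")
    return problems
```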
## Safety & Security
1. **Credential Management**
- Store private keys securely
- Do not commit secrets to version control
2. **Transaction Limits**
- Configure thresholds to mitigate abuse
- Log transaction attempts and failures
3. **Monitoring & Logging**
- Track unusual activity
- Maintain detailed audit logs
# Usage with JavaScript
## Installation
```bash
npm install ai-agent-sdk-js
```
## Configuration
Configure the plugin by setting environment variables or runtime settings:
- APRO_RPC_URL
- APRO_PROXY_ADDRESS
- APRO_PRIVATE_KEY
- APRO_CONVERTER_ADDRESS
- APRO_AUTO_HASH_DATA
## Usage with the JS SDK
To use the AI Agent SDK, import the library and create an instance of the `AgentSDK` class:
```typescript
import { AgentSDK } from 'ai-agent-sdk-js'
const agent = new AgentSDK({
rpcUrl: 'https://bsc-testnet-rpc.publicnode.com',
privateKey: '',
proxyAddress: '',
})
// if you want the SDK to hash the data automatically
const autoHashAgent = new AgentSDK({
rpcUrl: 'https://bsc-testnet-rpc.publicnode.com',
privateKey: '',
proxyAddress: '',
autoHashData: true,
converterAddress: '',
})
```
To create a new agent, call the `createAndRegisterAgent` method:
```typescript
import type { AgentSettings, TransactionOptions } from 'ai-agent-sdk-js'
import { randomUUID } from 'node:crypto'
import { parseUnits } from 'ethers'
// prepare the agent settings
const agentSettings: AgentSettings = {
signers: [],
threshold: 3,
converterAddress: '',
agentHeader: {
messageId: randomUUID(),
sourceAgentId: randomUUID(),
sourceAgentName: 'AI Agent SDK JS',
targetAgentId: '',
timestamp: Math.floor(Date.now() / 1000),
messageType: 0,
priority: 1,
ttl: 3600,
},
}
// prepare the transaction options
const nonce = await agent.getNextNonce()
const transactionOptions: TransactionOptions = {
nonce,
gasPrice: parseUnits('1', 'gwei'),
gasLimit: BigInt(2000000),
}
const tx = await agent.createAndRegisterAgent({ agentSettings, transactionOptions })
// or leave the transaction options empty; the SDK will use auto-generated values
// const tx = await agent.createAndRegisterAgent({ agentSettings })
```
The SDK also provides the tool to extract the new agent address from the transaction receipt:
```typescript
import { parseNewAgentAddress } from 'ai-agent-sdk-js'
const receipt = await tx.wait()
const agentAddress = parseNewAgentAddress(receipt)
```
To verify the data integrity, call the `verify` method:
```typescript
import type { MessagePayload } from 'ai-agent-sdk-js'
import { hexlify, keccak256, toUtf8Bytes } from 'ethers'
// prepare the payload
const data = hexlify(toUtf8Bytes('Hello World!'))
const dataHash = keccak256(data)
const payload: MessagePayload = {
data,
dataHash,
signatures: [
{
r: '',
s: '',
v: 1, // 1, 0, 27, 28 are allowed
},
// ...
],
metadata: {
contentType: '',
encoding: '',
compression: '',
},
}
const tx = await agent.verify({ payload, agent: '', digest: '' })
```
If the data is obtained from the APRO DATA pull service, you can use the auto-hash feature:
```typescript
import type { MessagePayload } from 'ai-agent-sdk-js'
const payload: MessagePayload = {
data: '0x...',
signatures: [
{
r: '',
s: '',
v: 1, // 1, 0, 27, 28 are allowed
},
// ...
],
metadata: {
contentType: '',
encoding: '',
compression: '',
},
}
// with autoHashData enabled, the SDK hashes the data before verification
const tx = await autoHashAgent.verify({ payload, agent: '', digest: '' })
```
For more examples, see the [test](https://github.com/APRO-com/ai-agent-sdk-js/tree/main/test) cases.
# Usage with Python
## Installation
```bash
$ pip3 install ai-agent-sdk
```
## Usage with Python SDK
### Initialize AgentSDK
```python
from ai_agent.agent import AgentSDK
AGENT_PROXY_ADDRESS = "0x07771A3026E60776deC8C1C61106FB9623521394"
NETWORK_RPC = "https://testnet-rpc.bitlayer.org"
agent = AgentSDK(endpoint_uri=NETWORK_RPC, proxy_address=AGENT_PROXY_ADDRESS)
```
To create a new agent, call the `create_and_register_agent` method:
```python
import time
from ai_agent.entities import (
AgentSettings,
AgentHeader,
MessageType,
Priority
)
from ai_agent.utils import (
generate_uuid_v4
)
AGENT_SETTINGS = AgentSettings(
signers=[
"0x4b1056f504f32c678227b5Ae812936249c40AfBF",
"0xB973476e0cF88a3693014b99f230CEB5A01ac686",
"0x6cF0803D049a4e8DC01da726A5a212BCB9FAC1a1",
"0x9D46daa26342e9E9e586A6AdCEDaD667f985567B",
"0x33AF673aBcE193E20Ee94D6fBEb30fEf0cA7015b",
"0x868D2dE4a0378450BC62A7596463b30Dc4e3897E",
"0xD4E157c36E7299bB40800e4aE7909DDcA8097f67",
"0xA3866A07ABEf3fD0643BD7e1c32600520F465ca8",
"0x62f642Ae0Ed7F12Bc40F2a9Bf82ccD0a3F3b7531"
],
threshold=2,
converter_address="0xaB303EF87774D9D259d1098E9aA4dD6c07F69240",
agent_header=AgentHeader(
version="1.0",
message_id="d4d0813f-ceb7-4ce1-8988-12899b26c4b6",
source_agent_id="da70f6b3-e580-470f-b88b-caa5369e7778",
source_agent_name="APRO Pull Mode Agent",
target_agent_id="",
timestamp=int(time.time()),
message_type=MessageType.Event,
priority=Priority.Low,
ttl=60 * 60
)
)
dynamic_setting = AGENT_SETTINGS
dynamic_setting.agent_header.source_agent_id = generate_uuid_v4()
dynamic_setting.agent_header.target_agent_id = generate_uuid_v4()
dynamic_setting.agent_header.message_id = generate_uuid_v4()
user_owner = agent.add_account("0x_user_private_key")
result = agent.create_and_register_agent(
    transmitter=user_owner,
nonce=None,
settings=AGENT_SETTINGS)
print("created agent:", result)
```
To verify the data integrity, call the `verify` method:
```python
from ai_agent.entities import (
AgentMessagePayload,
Proofs,
AgentMetadata,
)
AGENT_CONTRACT = "0xA1903361Ee8Ec35acC7c8951b4008dbE8D12C155"
AGENT_SETTING_DIGEST = "0x010038164dba6abffb84eb5cb538850d9bc5d8f815149a371069b3255fd177a4"
AGENT_PAYLOAD = AgentMessagePayload(
data="0x0006e706cf7ab41fa599311eb3de68be869198ce62aef1cd079475ca50e5b3f60000000000000000000000000000000000000000000000000000000002b1bf0e000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000e0000000000000000000000000000000000000000000000000000000000000022000000000000000000000000000000000000000000000000000000000000002a0000101000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001200003665949c883f9e0f6f002eac32e00bd59dfe6c34e92a91c37d6a8322d6489000000000000000000000000000000000000000000000000000000006762677d000000000000000000000000000000000000000000000000000000006762677d000000000000000000000000000000000000000000000000000003128629ec0800000000000000000000000000000000000000000000000004db732547630000000000000000000000000000000000000000000000000000000000006763b8fd0000000000000000000000000000000000000000000015f0f60671beb95cc0000000000000000000000000000000000000000000000015f083baa654a7b900000000000000000000000000000000000000000000000015f103ec7cb057ea80000000000000000000000000000000000000000000000000000000000000000003b64f7e72208147bb898e8b215d0997967bef0219263726c76995d8a19107d6ba5306a176474f9ccdb1bc5841f97e0592013e404e15b0de0839b81d0efb26179f222e0191269a8560ebd9096707d225bc606d61466b85d8568d7620a3b59a73e800000000000000000000000000000000000000000000000000000000000000037cae0f05c1bf8353eb5db27635f02b40a534d4192099de445764891198231c597a303cd15f302dafbb1263eb6e8e19cbacea985c66c6fed3231fd84a84ebe0276f69f481fe7808c339a04ceb905bb49980846c8ceb89a27b1c09713cb356f773",
data_hash="0x53d9f133f1265bd4391fcdf89b63424cbcfd316c8448f76cc515647267ac0a8e",
proofs=Proofs(
zk_proof="0x",
merkle_proof="0x",
signature_proof="0x000000000000000000000000000000000000000000000000000000000000006000000000000000000000000000000000000000000000000000000000000000e000000000000000000000000000000000000000000000000000000000000001600000000000000000000000000000000000000000000000000000000000000003b64f7e72208147bb898e8b215d0997967bef0219263726c76995d8a19107d6ba5306a176474f9ccdb1bc5841f97e0592013e404e15b0de0839b81d0efb26179f222e0191269a8560ebd9096707d225bc606d61466b85d8568d7620a3b59a73e800000000000000000000000000000000000000000000000000000000000000037cae0f05c1bf8353eb5db27635f02b40a534d4192099de445764891198231c597a303cd15f302dafbb1263eb6e8e19cbacea985c66c6fed3231fd84a84ebe0276f69f481fe7808c339a04ceb905bb49980846c8ceb89a27b1c09713cb356f7730000000000000000000000000000000000000000000000000000000000000003000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000001",
),
meta_data=AgentMetadata(
content_type="0x",
encoding="0x",
compression="0x"
)
)
user_owner = agent.add_account("0x_user_private_key")
result = agent.verify(
transmitter=user_owner,
nonce=None,
agent_contract=AGENT_CONTRACT,
settings_digest=AGENT_SETTING_DIGEST,
payload=AGENT_PAYLOAD
)
print("verify:", result)
```
For more examples, see the [test cases](https://github.com/APRO-com/ai-agent-sdk-python/tree/main/tests).
# Other SDKs
- Java: https://github.com/APRO-com/ai-agent-sdk-java
- Rust: https://github.com/APRO-com/ai-agent-sdk-rust
- Go: https://github.com/APRO-com/ai-agent-sdk-go
# Support
For issues or feature requests:
1. Check existing documentation
2. Submit a GitHub issue with relevant details
3. Include transaction logs and system info if applicable
# Contributing
We welcome pull requests! Refer to the project's CONTRIBUTING.md and open discussions to coordinate efforts.
# Credits
- [APRO](https://www.apro.com/) - Plugin sponsor and partner
- [ai-agent-sdk-js](https://github.com/APRO-com/ai-agent-sdk-js) - Underlying agent SDK
- [ethers.js](https://docs.ethers.io/) - Transaction and contract interaction
- Community contributors for feedback and testing
For more information about Apro plugin capabilities:
- [Apro Documentation](https://docs.apro.com/en)
# License
This plugin is part of the Eliza project. Refer to the main project repository for licensing details.

# [Geneplore AI](https://geneplore.com/bot)
## Geneplore AI is building the world's easiest way to use AI - Use 50+ models, all on Discord
Chat with the all-new Deepseek v3, GPT-4o, Claude 3 Opus, LLaMA 3, Gemini Pro, FLUX.1, and ChatGPT with **one bot**. Generate videos with Stable Diffusion Video, and images with the newest and most popular models available.
Don't like how the bot responds? Simply change the model in *seconds* and continue chatting like normal, without adding another bot to your server. No more fiddling with API keys and webhooks - every model is completely integrated into the bot.
**NEW:** Try the most powerful open AI model, Deepseek v3, for free with our bot. Simply type /chat and select Deepseek in the model list.
![image](https://github.com/user-attachments/assets/14db7e3c-c2c7-46d7-9fe1-5a5d1e3fc856)
Use the bot trusted by over 60,000 servers and hundreds of paying subscribers, without the hassle of multiple $20/month subscriptions and complicated programming.
https://geneplore.com
© 2025 Geneplore AI, All Rights Reserved.

docs/Ncurator/README.md
<img src="./assets/logo.png" width="64" height="auto" />
# [Ncurator](https://www.ncurator.com)
Knowledge Base AI Q&A Assistant - let AI help you organize and analyze knowledge
## UI
<img src="./assets/screenshot3.png" width="360" height="auto" />
## Integrate with Deepseek API
<img src="./assets/screenshot2.png" width="360" height="auto" />

<img src="./assets/logo.png" width="64" height="auto" />
# [Ncurator](https://www.ncurator.com)
知识库AI问答助手-让AI帮助你整理与分析知识
## UI
<img src="./assets/screenshot1.png" width="360" height="auto" />
## 配置 Deepseek API
<img src="./assets/screenshot2.png" width="360" height="auto" />

![image](assets/image-20250122162731-7wkftbw.png)
![image](https://b3log.org/images/brand/siyuan-128.png)
---

# README_cn
![image](assets/image-20250122162731-7wkftbw.png)
![image](https://b3log.org/images/brand/siyuan-128.png)
---

docs/Typral/README.md
<img src="https://www.typral.com/_next/image?url=%2Ffavicon.ico&w=96&q=75" width="64" height="auto" />
# [Typral](https://www.typral.com)
Fast AI writing assistant - let AI help you quickly improve articles, papers, text, and more...
## UI
<img src="./assets/screenshot1.png" width="360" height="auto" />
## Integrate with Deepseek API
<img src="./assets/screenshot2.png" width="360" height="auto" />

docs/Typral/README_cn.md
<img src="https://www.typral.com/_next/image?url=%2Ffavicon.ico&w=96&q=75" width="64" height="auto" />
# [Typral](https://www.typral.com)
超快的AI写作助手 - 让AI帮你快速优化日报,文章,文本等等...
## UI
<img src="./assets/screenshot1.png" width="360" height="auto" />
## 配置 Deepseek API
<img src="./assets/screenshot2.png" width="360" height="auto" />

docs/autoflow/README.md
# Autoflow
<a href="https://trendshift.io/repositories/12294" target="_blank"><img src="https://trendshift.io/api/badge/repositories/12294" alt="pingcap%2Fautoflow | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
[AutoFlow](https://github.com/pingcap/autoflow) is an open-source knowledge base tool based on GraphRAG (Graph-based Retrieval-Augmented Generation), built on [TiDB](https://www.pingcap.com/ai?utm_source=tidb.ai&utm_medium=community) Vector, LlamaIndex, and DSPy. It provides a Perplexity-like search interface and allows easy integration of AutoFlow's conversational search window into your website by embedding a simple JavaScript snippet.
## UI
1. **Perplexity-style Conversational Search page**: Our platform features an advanced built-in website crawler, designed to elevate your browsing experience. This crawler effortlessly navigates official and documentation sites, ensuring comprehensive coverage and streamlined search processes through sitemap URL scraping.
![Image](https://github.com/user-attachments/assets/50a4e5ce-8b93-446a-8ce7-11ed7844bd1e)
2. **Embeddable JavaScript Snippet**: Integrate our conversational search window effortlessly into your website by copying and embedding a simple JavaScript code snippet. This widget, typically placed at the bottom right corner of your site, facilitates instant responses to product-related queries.
![Image](https://github.com/user-attachments/assets/f0dc82db-c14d-4863-a242-c7da3a719568)
## Integrate with Deepseek API
- Click the `Models` tab, then `LLMs`, to open the LLM model management page.
- Click the `Create` button to create a new LLM model.
- Enter the data as shown below, then click the `Create LLM` button.
![Image](https://github.com/user-attachments/assets/875cac18-707b-465f-ac62-89ddb416f94d)

# Autoflow
<a href="https://trendshift.io/repositories/12294" target="_blank"><img src="https://trendshift.io/api/badge/repositories/12294" alt="pingcap%2Fautoflow | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
[AutoFlow](https://github.com/pingcap/autoflow) 是一个基于 GraphRAG基于图的检索增强生成的开源知识库工具构建于 [TiDB](https://www.pingcap.com/ai?utm_source=tidb.ai&utm_medium=community) Vector、LlamaIndex 和 DSPy 之上。它提供类似 Perplexity 的搜索界面,并允许通过嵌入简单的 JavaScript 代码片段,将 AutoFlow 的对话式搜索窗口轻松集成到您的网站中。
## UI 界面
1. **Perplexity 风格的对话式搜索页面**:我们的平台配备了高级内置网站爬虫,旨在提升您的浏览体验。该爬虫能够轻松抓取官方网站和文档站点,通过 sitemap 抓取,实现全面覆盖和高效搜索。
![Image](https://github.com/user-attachments/assets/50a4e5ce-8b93-446a-8ce7-11ed7844bd1e)
2. **可嵌入的 JavaScript 代码片段**:通过复制并嵌入一段简单的 JavaScript 代码,即可轻松将我们的对话式搜索窗口集成到您的网站中。此小部件通常放置在网站右下角,可即时回答与产品相关的查询。
![Image](https://github.com/user-attachments/assets/f0dc82db-c14d-4863-a242-c7da3a719568)
## 集成 Deepseek API
- 点击 `Models` 选项卡,然后进入 `LLMs` 以进入 LLM 模型管理页面。
- 点击 `Create` 按钮创建一个新的 LLM 模型。
- 按照下方示例输入数据,然后点击 `Create LLM` 按钮。
![Image](https://github.com/user-attachments/assets/875cac18-707b-465f-ac62-89ddb416f94d)


@ -25,16 +25,14 @@ return {
lazy = false,
version = false, -- set this if you want to always pull the latest change
opts = {
provider = "openai",
auto_suggestions_provider = "openai", -- Since auto-suggestions are a high-frequency operation and therefore expensive, it is recommended to specify an inexpensive provider or even a free provider: copilot
openai = {
endpoint = "https://api.deepseek.com/v1",
model = "deepseek-chat",
timeout = 30000, -- Timeout in milliseconds
temperature = 0,
max_tokens = 4096,
-- optional
api_key_name = "OPENAI_API_KEY", -- default OPENAI_API_KEY if not set
provider = "deepseek",
vendors = {
deepseek = {
__inherited_from = "openai",
api_key_name = "DEEPSEEK_API_KEY",
endpoint = "https://api.deepseek.com",
model = "deepseek-coder",
},
},
},
-- if you want to build from source then do `make BUILD_FROM_SOURCE=true`


@ -25,16 +25,14 @@ return {
lazy = false,
version = false, -- set this if you want to always pull the latest change
opts = {
provider = "openai",
auto_suggestions_provider = "openai", -- Since auto-suggestions are a high-frequency operation and therefore expensive, it is recommended to specify an inexpensive provider or even a free provider: copilot
openai = {
endpoint = "https://api.deepseek.com/v1",
model = "deepseek-chat",
timeout = 30000, -- Timeout in milliseconds
temperature = 0,
max_tokens = 4096,
-- optional
api_key_name = "OPENAI_API_KEY", -- default OPENAI_API_KEY if not set
provider = "deepseek",
vendors = {
deepseek = {
__inherited_from = "openai",
api_key_name = "DEEPSEEK_API_KEY",
endpoint = "https://api.deepseek.com",
model = "deepseek-coder",
},
},
},
-- if you want to build from source then do `make BUILD_FROM_SOURCE=true`


@ -34,9 +34,8 @@ return {
require("codecompanion").setup({
adapters = {
deepseek = function()
return require("codecompanion.adapters").extend("openai_compatible", {
return require("codecompanion.adapters").extend("deepseek", {
env = {
url = "https://api.deepseek.com",
api_key = "YOUR_API_KEY",
},
})
@ -71,9 +70,8 @@ later(function()
require("codecompanion").setup({
adapters = {
deepseek = function()
return require("codecompanion.adapters").extend("openai_compatible", {
return require("codecompanion.adapters").extend("deepseek", {
env = {
url = "https://api.deepseek.com",
api_key = "YOUR_API_KEY",
},
})


@ -34,9 +34,8 @@ return {
require("codecompanion").setup({
adapters = {
deepseek = function()
return require("codecompanion.adapters").extend("openai_compatible", {
return require("codecompanion.adapters").extend("deepseek", {
env = {
url = "https://api.deepseek.com",
api_key = "YOUR_API_KEY",
},
})
@ -71,9 +70,8 @@ later(function()
require("codecompanion").setup({
adapters = {
deepseek = function()
return require("codecompanion.adapters").extend("openai_compatible", {
return require("codecompanion.adapters").extend("deepseek", {
env = {
url = "https://api.deepseek.com",
api_key = "YOUR_API_KEY",
},
})

docs/codegate/README.md
# CodeGate: secure AI code generation
CodeGate is a **local gateway** that makes AI agents and coding assistants safer. It
ensures AI-generated recommendations adhere to best practices while safeguarding
your code's integrity and protecting your privacy. With CodeGate, you can
confidently leverage AI in your development workflow without sacrificing
security or productivity.
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/stacklok/codegate/main/static/diagram-dark.png">
<img alt="CodeGate dashboard" src="https://github.com/stacklok/codegate/raw/main/static/diagram-light.png" width="1100px" style="max-width: 100%;">
</picture>
---
## ✨ Why choose CodeGate?
AI coding assistants are powerful, but they can inadvertently introduce risks.
CodeGate protects your development process by:
- 🔒 Preventing accidental exposure of secrets and sensitive data
- 🛡️ Ensuring AI suggestions follow secure coding practices
- ⚠️ Blocking recommendations of known malicious or deprecated libraries
- 🔍 Providing real-time security analysis of AI suggestions
---
## 🚀 Quickstart with 🐋 Deepseek!
### Prerequisites
CodeGate is distributed as a Docker container. You need a container runtime like
Docker Desktop or Docker Engine. Podman and Podman Desktop are also supported.
CodeGate works on Windows, macOS, and Linux operating systems with x86_64 and
arm64 (ARM and Apple Silicon) CPU architectures.
These instructions assume the `docker` CLI is available. If you use Podman,
replace `docker` with `podman` in all commands.
### Installation
To start CodeGate, run this simple command (making sure to pass in the
deepseek.com URL as the `CODEGATE_PROVIDER_OPENAI_URL` environment variable):
```bash
docker run --name codegate -d -p 8989:8989 -p 9090:9090 -p 8990:8990 \
-e CODEGATE_PROVIDER_OPENAI_URL=https://api.deepseek.com \
--mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
--restart unless-stopped ghcr.io/stacklok/codegate:latest
```
That's it! CodeGate is now running locally.
### Using CodeGate and 🐋 Deepseek within Continue
To use Continue with CodeGate, open the Continue settings and add
the following configuration:
```json
{
"title": "Deepseek-r1",
"provider": "openai",
"model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-32B",
"apiKey": "YOUR_DEEPSEEK_API_KEY",
  "apiBase": "http://localhost:8989/openai"
}
```
Just use Continue as normal, and you no longer have to worry about security
or privacy concerns!
![continue-image](https://github.com/deepseek/awesome-deepseek-integration/blob/codegate/docs/codegate/assets/continue-screen.png)
### Using CodeGate and 🐋 Deepseek with Cline
To use Cline with CodeGate, open the Cline settings and add
the following configuration:
![cline-settings](https://github.com/deepseek/awesome-deepseek-integration/blob/codegate/docs/codegate/assets/cline-settings.png)
Just use Cline as normal, and you no longer have to worry about security
or privacy concerns!
![cline-image](https://github.com/deepseek/awesome-deepseek-integration/blob/codegate/docs/codegate/assets/cline-screen.png)
---
## 🖥️ Dashboard
CodeGate includes a web dashboard that provides:
- A view of **security risks** detected by CodeGate
- A **history of interactions** between your AI coding assistant and your LLM
<picture>
<source media="(prefers-color-scheme: dark)" srcset="./static/dashboard-dark.webp">
<img alt="CodeGate dashboard" src="./static/dashboard-light.webp" width="1200px" style="max-width: 100%;">
</picture>
### Accessing the dashboard
Open [http://localhost:9090](http://localhost:9090) in your web browser to
access the dashboard.
To learn more, visit the
[CodeGate Dashboard documentation](https://docs.codegate.ai/how-to/dashboard).
---
## 🔐 Features
### Secrets encryption
CodeGate helps you protect sensitive information from being accidentally exposed
to AI models and third-party AI provider systems by redacting detected secrets
from your prompts using encryption.
[Learn more](https://docs.codegate.ai/features/secrets-encryption)
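The detection half of this idea can be pictured in a few lines; a toy illustration only (CodeGate's actual detection and encryption are far more involved, and these patterns are illustrative assumptions):

```python
import re

# Toy patterns for common credential shapes; real detectors use many more.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),   # OpenAI-style API keys
    re.compile(r"AKIA[0-9A-Z]{16}"),      # AWS access key IDs
    re.compile(r"ghp_[A-Za-z0-9]{36}"),   # GitHub personal access tokens
]

def redact(prompt: str) -> str:
    """Replace detected secrets with a placeholder before the prompt leaves the machine."""
    for pattern in SECRET_PATTERNS:
        prompt = pattern.sub("<REDACTED>", prompt)
    return prompt
```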
### Dependency risk awareness
LLMs' knowledge cutoff dates are often months or even years in the past. They
might suggest outdated, vulnerable, or non-existent packages (hallucinations),
exposing you and your users to security risks.
CodeGate scans direct, transitive, and development dependencies in your package
definition files, installation scripts, and source code imports that you supply
as context to an LLM.
[Learn more](https://docs.codegate.ai/features/dependency-risk)
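The scanning step amounts to matching declared dependencies against advisory data; a simplified sketch for a `package.json` manifest (the advisory entries here are hypothetical, while the real feature consults live vulnerability databases):

```python
import json

# Hypothetical advisory data for illustration only.
KNOWN_BAD = {"event-stream": "compromised release history", "left-pad": "deprecated"}

def risky_dependencies(package_json: str) -> dict:
    """Return declared dependencies (including dev) that appear in the advisory list."""
    manifest = json.loads(package_json)
    declared = {}
    for section in ("dependencies", "devDependencies"):
        declared.update(manifest.get(section, {}))
    return {name: KNOWN_BAD[name] for name in declared if name in KNOWN_BAD}
```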
### Security reviews
CodeGate performs security-centric code reviews, identifying insecure patterns
or potential vulnerabilities to help you adopt more secure coding practices.
[Learn more](https://docs.codegate.ai/features/security-reviews)
---
## 🛡️ Privacy first
Unlike other tools, with CodeGate **your code never leaves your machine**.
CodeGate is built with privacy at its core:
- 🏠 **Everything stays local**
- 🚫 **No external data collection**
- 🔐 **No calling home or telemetry**
- 💪 **Complete control over your data**
---
## 🛠️ Development
Are you a developer looking to contribute? Dive into our technical resources:
- [Development guide](https://github.com/stacklok/codegate/blob/main/docs/development.md)
- [CLI commands and flags](https://github.com/stacklok/codegate/blob/main/docs/cli.md)
- [Configuration system](https://github.com/stacklok/codegate/blob/main/docs/configuration.md)
- [Logging system](https://github.com/stacklok/codegate/blob/main/docs/logging.md)
---
## 📜 License
CodeGate is licensed under the terms specified in the
[LICENSE file](https://github.com/stacklok/codegate/blob/main/LICENSE).

docs/codegate/README_cn.md
# CodeGate:安全的 AI 代码生成
CodeGate 是一个**本地代理**,可以让 AI 代理和编码助手更加安全。它确保 AI 生成的建议遵循最佳实践,同时保护您的代码完整性和隐私。使用 CodeGate,您可以在开发工作流程中自信地利用 AI,而不会牺牲安全性或生产力。
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/stacklok/codegate/main/static/diagram-dark.png">
<img alt="CodeGate dashboard" src="https://github.com/stacklok/codegate/raw/main/static/diagram-light.png" width="1100px" style="max-width: 100%;">
</picture>
---
## ✨ 为什么选择 CodeGate?
AI 编码助手功能强大,但可能会无意中带来风险。CodeGate 通过以下方式保护您的开发过程:
- 🔒 防止意外泄露机密和敏感数据
- 🛡️ 确保 AI 建议遵循安全编码实践
- ⚠️ 阻止推荐已知的恶意或已弃用的库
- 🔍 提供 AI 建议的实时安全分析
---
## 🚀 使用 🐋 Deepseek 快速开始!
### 前提条件
CodeGate 以 Docker 容器的形式分发。您需要一个容器运行时,如 Docker Desktop 或 Docker Engine。同时也支持 Podman 和 Podman Desktop。CodeGate 可在 Windows、macOS 和 Linux 操作系统上运行,支持 x86_64 和 arm64(ARM 和 Apple Silicon)CPU 架构。
以下说明基于 `docker` CLI 可用的前提。如果您使用 Podman,请在所有命令中将 `docker` 替换为 `podman`
### 安装
要启动 CodeGate,运行这个简单的命令(确保将 deepseek.com URL 作为 `CODEGATE_PROVIDER_OPENAI_URL` 环境变量传入):
```bash
docker run --name codegate -d -p 8989:8989 -p 9090:9090 -p 8990:8990 \
-e CODEGATE_PROVIDER_OPENAI_URL=https://api.deepseek.com \
--mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
--restart unless-stopped ghcr.io/stacklok/codegate:latest
```
就是这样!CodeGate 现在在本地运行了。
### 在 Continue 中使用 CodeGate 和 🐋 Deepseek
要在 Continue 中使用 CodeGate,打开 Continue 设置并添加以下配置:
```json
{
"title": "Deepseek-r1",
"provider": "openai",
"model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-32B",
"apiKey": "YOUR_DEEPSEEK_API_KEY",
  "apiBase": "http://localhost:8989/openai"
}
```
像往常一样使用 Continue,您不再需要担心安全或隐私问题!
![continue-image](https://github.com/deepseek/awesome-deepseek-integration/blob/codegate/docs/codegate/assets/continue-screen.png)
### 在 Cline 中使用 CodeGate 和 🐋 Deepseek
要在 Cline 中使用 CodeGate,打开 Cline 设置并添加以下配置:
![cline-settings](https://github.com/deepseek/awesome-deepseek-integration/blob/codegate/docs/codegate/assets/cline-settings.png)
像往常一样使用 Cline,您不再需要担心安全或隐私问题!
![cline-image](https://github.com/deepseek/awesome-deepseek-integration/blob/codegate/docs/codegate/assets/cline-screen.png)
---
## 🖥️ 仪表板
CodeGate 包含一个 Web 仪表板,提供:
- CodeGate 检测到的**安全风险**视图
- AI 编码助手与 LLM 之间的**交互历史**
<picture>
<source media="(prefers-color-scheme: dark)" srcset="./static/dashboard-dark.webp">
<img alt="CodeGate dashboard" src="./static/dashboard-light.webp" width="1200px" style="max-width: 100%;">
</picture>
### 访问仪表板
在您的网络浏览器中打开 [http://localhost:9090](http://localhost:9090) 以访问仪表板。
要了解更多信息,请访问 [CodeGate 仪表板文档](https://docs.codegate.ai/how-to/dashboard)。
---
## 🔐 功能
### 机密加密
CodeGate 通过使用加密对检测到的机密进行编辑,帮助您防止敏感信息意外暴露给 AI 模型和第三方 AI 提供商系统。
[了解更多](https://docs.codegate.ai/features/secrets-encryption)
### 依赖风险意识
LLM 的知识截止日期通常是几个月甚至几年前。它们可能会建议过时的、易受攻击的或不存在的包(幻觉),使您和您的用户面临安全风险。
CodeGate 扫描您作为上下文提供给 LLM 的包定义文件、安装脚本和源代码导入中的直接依赖、传递依赖和开发依赖。
[了解更多](https://docs.codegate.ai/features/dependency-risk)
### 安全审查
CodeGate 执行以安全为中心的代码审查,识别不安全的模式或潜在的漏洞,帮助您采用更安全的编码实践。
[了解更多](https://docs.codegate.ai/features/security-reviews)
---
## 🛡️ 隐私优先
与其他工具不同,使用 CodeGate **您的代码永远不会离开您的机器**。CodeGate 以隐私为核心构建:
- 🏠 **所有数据均本地存储**
- 🚫 **没有外部数据收集**
- 🔐 **没有回传或遥测**
- 💪 **完全控制您的数据**
---
## 🛠️ 开发
您是想要贡献的开发者吗?深入了解我们的技术资源:
- [开发指南](https://github.com/stacklok/codegate/blob/main/docs/development.md)
- [CLI 命令和标志](https://github.com/stacklok/codegate/blob/main/docs/cli.md)
- [配置系统](https://github.com/stacklok/codegate/blob/main/docs/configuration.md)
- [日志系统](https://github.com/stacklok/codegate/blob/main/docs/logging.md)
---
## 📜 许可证
CodeGate 根据 [LICENSE 文件](https://github.com/stacklok/codegate/blob/main/LICENSE) 中指定的条款获得许可。

Binary file not shown.

After

Width:  |  Height:  |  Size: 645 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 210 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 8.2 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 744 KiB

View file

@ -1,4 +1,4 @@
<img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/e4d082de-6f64-44b9-beaa-0de55d70cfab" width="64" height="auto" />
<img src="https://github.com/continuedev/continue/blob/main/docs/static/img/logo.png?raw=true" width="64" height="auto" />
# [Continue](https://continue.dev/)

View file

@ -1,4 +1,4 @@
<img src="https://github.com/deepseek-ai/awesome-deepseek-integration/assets/59196087/e4d082de-6f64-44b9-beaa-0de55d70cfab" width="64" height="auto" />
<img src="https://github.com/continuedev/continue/blob/main/docs/static/img/logo.png?raw=true" width="64" height="auto" />
# [Continue](https://continue.dev/)

30
docs/curator/README.md Normal file
View file

@ -0,0 +1,30 @@
![image](https://raw.githubusercontent.com/bespokelabsai/curator/main/docs/Bespoke-Labs-Logomark-Red-crop.png)
# [Curator](https://github.com/bespokelabsai/curator)
Curator is an open-source tool for curating large-scale datasets for post-training LLMs.
Curator was used to curate [Bespoke-Stratos-17k](https://huggingface.co/datasets/bespokelabs/Bespoke-Stratos-17k), a reasoning dataset to train a fully open reasoning model [Bespoke-Stratos](https://www.bespokelabs.ai/blog/bespoke-stratos-the-unreasonable-effectiveness-of-reasoning-distillation).
### Curator supports:
- Calling Deepseek API for scalable synthetic data curation
- Easy structured data extraction
- Caching and automatic recovery
- Dataset visualization
- Saving $$$ using batch mode
### Call Deepseek API with Curator easily:
![image](https://pbs.twimg.com/media/GiLHb-xasAAbs4m?format=jpg&name=4096x4096)
# Get Started here
- [Colab Example](https://colab.research.google.com/drive/1Z78ciwHIl_ytACzcrslNrZP2iwK05eIF?usp=sharing)
- [Github Repo](https://github.com/bespokelabsai/curator)
- [Documentation](https://docs.bespokelabs.ai/)
- [Discord](https://discord.com/invite/KqpXvpzVBS)
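Under the hood, these Deepseek calls go over DeepSeek's OpenAI-compatible chat-completions endpoint. As a rough, self-contained sketch of the request a curation job ultimately issues (the helper name `build_deepseek_request` and the prompt are illustrative, not Curator's API):

```python
import json
import os

# Build an OpenAI-compatible chat-completions request for the DeepSeek API.
# Endpoint and model name follow DeepSeek's public API docs; the prompt is
# illustrative only.
def build_deepseek_request(prompt, model="deepseek-chat"):
    return {
        "url": "https://api.deepseek.com/chat/completions",
        "headers": {
            "Content-Type": "application/json",
            # Read the key from the environment; never hard-code it.
            "Authorization": "Bearer " + os.environ.get("DEEPSEEK_API_KEY", ""),
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_deepseek_request("Write one QA pair about photosynthesis.")
print(req["url"])  # https://api.deepseek.com/chat/completions
```

An actual Curator run wraps requests like this with caching, retries, and batching, which is where the batch-mode savings come from.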

29
docs/curator/README_cn.md Normal file
View file

@ -0,0 +1,29 @@
![image](https://raw.githubusercontent.com/bespokelabsai/curator/main/docs/Bespoke-Labs-Logomark-Red-crop.png)
# [Curator](https://github.com/bespokelabsai/curator)
Curator 是一个开源工具,用于制作与管理可扩展的大规模数据集,以支持大型语言模型 (LLMs) 的后训练和结构化数据提取。
Curator 被用来制作 [Bespoke-Stratos-17k](https://huggingface.co/datasets/bespokelabs/Bespoke-Stratos-17k),这是一个用于训练完全开源的推理模型 [Bespoke-Stratos](https://www.bespokelabs.ai/blog/bespoke-stratos-the-unreasonable-effectiveness-of-reasoning-distillation) 的推理数据集。
### Curator 支持:
- 调用 Deepseek API 进行可扩展的合成数据管理
- 简便的结构化数据提取
- 缓存和自动恢复
- 数据集可视化
- 使用批处理模式节省费用
### 轻松使用 Curator 调用 Deepseek API
![image](https://pbs.twimg.com/media/GiLHb-xasAAbs4m?format=jpg&name=4096x4096)
# 从这里开始
- [Colab 示例](https://colab.research.google.com/drive/1Z78ciwHIl_ytACzcrslNrZP2iwK05eIF?usp=sharing)
- [Github 仓库](https://github.com/bespokelabsai/curator)
- [文档](https://docs.bespokelabs.ai/)
- [Discord](https://discord.com/invite/KqpXvpzVBS)

View file

@ -0,0 +1,182 @@
# Minuet AI
Minuet AI: Dance with Intelligence in Your Code 💃.
`Minuet-ai` brings the grace and harmony of a minuet to your coding process, just as dancers move through a minuet.
# Features
- AI-powered code completion with dual modes:
- Specialized prompts and various enhancements for chat-based LLMs on code completion tasks.
- Fill-in-the-middle (FIM) completion for compatible models (DeepSeek,
Codestral, Qwen, and others).
- Support for multiple AI providers (OpenAI, Claude, Gemini, Codestral, Ollama, and
OpenAI-compatible services).
- Customizable configuration options.
- Streaming support to enable completion delivery even with slower LLMs.
- Support for the `nvim-cmp`, `blink-cmp`, and `virtual text` frontends.
# Requirements
- Neovim 0.10+.
- [plenary.nvim](https://github.com/nvim-lua/plenary.nvim)
- optional: [nvim-cmp](https://github.com/hrsh7th/nvim-cmp)
- optional: [blink.cmp](https://github.com/Saghen/blink.cmp)
- An API key for at least one of the supported AI providers
# Installation
**Lazy.nvim**:
```lua
specs = {
{
'milanglacier/minuet-ai.nvim',
config = function()
require('minuet').setup {
-- Your configuration options here
}
end,
},
{ 'nvim-lua/plenary.nvim' },
-- optional, if you are using virtual-text frontend, nvim-cmp is not
-- required.
{ 'hrsh7th/nvim-cmp' },
-- optional, if you are using virtual-text frontend, blink is not required.
{ 'Saghen/blink.cmp' },
}
```
**Rocks.nvim**:
`Minuet` is available on luarocks.org. Simply run `Rocks install minuet-ai.nvim` to install it like any other luarocks package.
**Setting up with virtual text**:
```lua
require('minuet').setup {
virtualtext = {
auto_trigger_ft = {},
keymap = {
-- accept whole completion
accept = '<A-A>',
-- accept one line
accept_line = '<A-a>',
-- accept n lines (prompts for number)
-- e.g. "A-z 2 CR" will accept 2 lines
accept_n_lines = '<A-z>',
-- Cycle to prev completion item, or manually invoke completion
prev = '<A-[>',
-- Cycle to next completion item, or manually invoke completion
next = '<A-]>',
dismiss = '<A-e>',
},
},
}
```
**Setting up with nvim-cmp**:
<details>
```lua
require('cmp').setup {
sources = {
{
-- Include minuet as a source to enable autocompletion
{ name = 'minuet' },
-- and your other sources
}
},
performance = {
-- It is recommended to increase the timeout duration due to
-- the typically slower response speed of LLMs compared to
-- other completion sources. This is not needed when you only
-- need manual completion.
fetching_timeout = 2000,
},
}
-- If you wish to invoke completion manually, the following
-- configuration binds the `A-y` key to do so.
require('cmp').setup {
mapping = {
["<A-y>"] = require('minuet').make_cmp_map()
-- and your other keymappings
},
}
```
</details>
**Setting up with blink-cmp**:
<details>
```lua
require('blink-cmp').setup {
keymap = {
-- Manually invoke minuet completion.
['<A-y>'] = require('minuet').make_blink_map(),
},
sources = {
-- Enable minuet for autocomplete
default = { 'lsp', 'path', 'buffer', 'snippets', 'minuet' },
-- For manual completion only, remove 'minuet' from default
providers = {
minuet = {
name = 'minuet',
module = 'minuet.blink',
score_offset = 8, -- Gives minuet higher priority among suggestions
},
},
},
-- Recommended to avoid unnecessary request
completion = { trigger = { prefetch_on_insert = false } },
}
```
</details>
**LLM Provider Examples**:
**Deepseek**:
```lua
-- you can use deepseek with both openai_fim_compatible or openai_compatible provider
require('minuet').setup {
provider = 'openai_fim_compatible',
provider_options = {
openai_fim_compatible = {
api_key = 'DEEPSEEK_API_KEY',
name = 'deepseek',
optional = {
max_tokens = 256,
top_p = 0.9,
},
},
},
}
-- or
require('minuet').setup {
provider = 'openai_compatible',
provider_options = {
openai_compatible = {
end_point = 'https://api.deepseek.com/v1/chat/completions',
api_key = 'DEEPSEEK_API_KEY',
name = 'deepseek',
optional = {
max_tokens = 256,
top_p = 0.9,
},
},
},
}
```

View file

@ -0,0 +1,172 @@
# Minuet AI
Minuet AI在您的代码中翩翩起舞挥洒智能 💃。
`Minuet-ai` 将小步舞曲的优雅与和谐带入您的编码流程,正如舞者在小步舞曲中翩翩起舞。
# 特性
- 基于 AI 的代码补全,提供双重模式:
- 针对代码补全任务,为基于聊天的 LLMs 提供专门的提示和各种增强功能。
- 针对兼容的模型DeepSeek、Codestral、Qwen 等)提供中间填充 (FIM) 补全。
- 支持多种 AI 提供商OpenAI、Claude、Gemini、Codestral、Ollama 和兼容 OpenAI 的服务)。
- 可自定义配置选项。
- 支持流式传输,即使使用较慢的 LLMs 也能实现补全的交付。
- 支持 `nvim-cmp``blink-cmp``virtual text` 前端。
# 要求
- Neovim 0.10+。
- [plenary.nvim](https://github.com/nvim-lua/plenary.nvim)
- 可选: [nvim-cmp](https://github.com/hrsh7th/nvim-cmp)
- 可选: [blink.cmp](https://github.com/Saghen/blink.cmp)
- 至少一个受支持的 AI 提供商的 API 密钥
# 安装
**Lazy.nvim**
```lua
specs = {
{
'milanglacier/minuet-ai.nvim',
config = function()
require('minuet').setup {
-- 在此处配置您的选项
}
end,
},
{ 'nvim-lua/plenary.nvim' },
-- 可选,如果您使用 virtual-text 前端,则不需要 nvim-cmp。
{ 'hrsh7th/nvim-cmp' },
-- 可选,如果您使用 virtual-text 前端,则不需要 blink。
{ 'Saghen/blink.cmp' },
}
```
**Rocks.nvim**
`Minuet` 可在 luarocks.org 上获取。只需运行 `Rocks install minuet-ai.nvim` 即可像安装其他 luarocks 包一样安装它。
**使用 virtual text 进行设置:**
```lua
require('minuet').setup {
virtualtext = {
auto_trigger_ft = {},
keymap = {
-- 接受完整补全
accept = '<A-A>',
-- 接受一行
accept_line = '<A-a>',
-- 接受 n 行(提示输入数字)
-- 例如“A-z 2 CR”将接受 2 行
accept_n_lines = '<A-z>',
-- 切换到上一个补全项,或手动调用补全
prev = '<A-[>',
-- 切换到下一个补全项,或手动调用补全
next = '<A-]>',
dismiss = '<A-e>',
},
},
}
```
**使用 nvim-cmp 进行设置:**
<details>
```lua
require('cmp').setup {
sources = {
{
-- 包含 minuet 作为源以启用自动补全
{ name = 'minuet' },
-- 和您的其他来源
}
},
performance = {
-- 建议增加超时时间因为与其他补全来源相比LLMs 的响应速度通常较慢。如果您只需要手动补全,则不需要此设置。
fetching_timeout = 2000,
},
}
-- 如果你希望手动调用补全,
-- 以下配置将 `A-y` 键绑定到手动调用配置。
require('cmp').setup {
mapping = {
["<A-y>"] = require('minuet').make_cmp_map()
-- 和您的其他键映射
},
}
```
</details>
**使用 blink-cmp 进行设置:**
<details>
```lua
require('blink-cmp').setup {
keymap = {
-- 手动调用 minuet 补全。
['<A-y>'] = require('minuet').make_blink_map(),
},
sources = {
-- 启用 minuet 进行自动补全
default = { 'lsp', 'path', 'buffer', 'snippets', 'minuet' },
-- 仅对于手动补全,从默认值中删除 'minuet'
providers = {
minuet = {
name = 'minuet',
module = 'minuet.blink',
score_offset = 8, -- 在建议中赋予 minuet 更高的优先级
},
},
},
-- 建议避免不必要的请求
completion = { trigger = { prefetch_on_insert = false } },
}
```
</details>
**LLM 提供商示例:**
**Deepseek**
```lua
-- 你可以使用 openai_fim_compatible 或 openai_compatible 提供商来使用 deepseek
require('minuet').setup {
provider = 'openai_fim_compatible',
provider_options = {
openai_fim_compatible = {
api_key = 'DEEPSEEK_API_KEY',
name = 'deepseek',
optional = {
max_tokens = 256,
top_p = 0.9,
},
},
},
}
-- 或者
require('minuet').setup {
provider = 'openai_compatible',
provider_options = {
openai_compatible = {
end_point = 'https://api.deepseek.com/v1/chat/completions',
api_key = 'DEEPSEEK_API_KEY',
name = 'deepseek',
optional = {
max_tokens = 256,
top_p = 0.9,
},
},
},
}
```

View file

@ -0,0 +1,67 @@
# README
<img src="assets/sendai-logo.png" width="64" height="auto" alt="logo">
---
An open-source toolkit for connecting AI agents to Solana protocols. Now, any agent using any model can autonomously perform 60+ Solana actions:
## STEP 1
Apply for an API key on the [DeepSeek Open Platform](https://platform.deepseek.com/).
## STEP 2
Initialize the DeepSeek LLM
```typescript
import { ChatDeepSeek } from "@langchain/deepseek";
const deepseek = new ChatDeepSeek({
model: "deepseek-chat",
temperature: 0,
});
```
## STEP 3
Initialize the Solana Agent Kit with DeepSeek
```typescript
// Imports and memory setup added for completeness (package paths per each library's docs):
import { SolanaAgentKit, createSolanaTools } from "solana-agent-kit";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MemorySaver } from "@langchain/langgraph";
import { HumanMessage } from "@langchain/core/messages";

const memory = new MemorySaver();

const solanaAgent = new SolanaAgentKit(
process.env.SOLANA_PRIVATE_KEY!,
process.env.RPC_URL!,
{
OPENAI_API_KEY: process.env.OPENAI_API_KEY!,
HELIUS_API_KEY: process.env.HELIUS_API_KEY!,
PERPLEXITY_API_KEY: process.env.PERPLEXITY_API_KEY!,
},
);
const tools = createSolanaTools(solanaAgent);
const agent = createReactAgent({
llm: deepseek,
tools,
checkpointSaver: memory,
messageModifier: `
You are a helpful agent that can interact onchain using the Solana Agent Kit. You are
empowered to interact onchain using your tools. If you ever need funds, you can request them from the
faucet. If not, you can provide your wallet details and request funds from the user. If there is a 5XX
(internal) HTTP error code, ask the user to try again later. If someone asks you to do something you
can't do with your currently available tools, you must say so, and encourage them to implement it
themselves using the Solana Agent Kit, recommend they go to https://www.solanaagentkit.xyz for more information. Be
concise and helpful with your responses. Refrain from restating your tools' descriptions unless it is explicitly requested.
`,
});
// Provide a thread id for the MemorySaver checkpointer (langgraph convention).
const config = { configurable: { thread_id: "deepseek-demo" } };
const stream = await agent.stream(
{ messages: [new HumanMessage(userInput)] },
config,
);
```
More guides can be found in the [Solana Agent Kit](https://docs.solanaagentkit.xyz/v0/introduction)
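Independent of the TypeScript kit above, the ReAct loop such an agent runs can be sketched in a few lines of Python. This is purely illustrative pseudologic (the stub tool and `llm_step` are made up), not Solana Agent Kit's implementation:

```python
# Illustrative ReAct-style loop: the "LLM" picks a tool, the tool result is
# fed back as an observation, and the loop ends when the model answers directly.
def react_loop(question, tools, llm_step, max_steps=5):
    history = [("user", question)]
    for _ in range(max_steps):
        action = llm_step(history)  # ("tool", name, arg) or ("final", text)
        if action[0] == "final":
            return action[1]
        _, name, arg = action
        observation = tools[name](arg)
        history.append(("observation", observation))
    return "max steps reached"

# Stub "LLM" and tool for demonstration only.
tools = {"balance": lambda addr: f"{addr} holds 2 SOL"}

def llm_step(history):
    if history[-1][0] == "user":
        return ("tool", "balance", "demo-wallet")
    return ("final", history[-1][1])

print(react_loop("What is my balance?", tools, llm_step))  # demo-wallet holds 2 SOL
```

The real kit replaces the stubs with an LLM call and 60+ on-chain tools, but the control flow is the same.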

Binary file not shown.

After

Width:  |  Height:  |  Size: 26 KiB

31
docs/stranslate/README.md Normal file
View file

@ -0,0 +1,31 @@
<img src="./assets/stranslate.svg" width="64" height="auto" />
# [`STranslate`](https://stranslate.zggsong.com/)
STranslate is a translation and OCR tool that is ready to use on the go.
## Translation
Supports multiple translation languages and various translation methods such as input, text selection, screenshot, clipboard monitoring, and mouse text selection. It also allows displaying multiple service translation results simultaneously for easy comparison.
## OCR
Supports fully offline OCR for Chinese, English, Japanese, and Korean, based on PaddleOCR, with excellent performance and quick response. It supports screenshot, clipboard, and file OCR, as well as silent OCR. Additionally, it supports OCR services from WeChat, Baidu, Tencent, OpenAI, and Google.
## Services
Supports integration with over ten translation services including DeepSeek, OpenAI, Gemini, ChatGLM, Baidu, Microsoft, Tencent, Youdao, and Alibaba. It also offers free API options. Built-in services like Microsoft, Yandex, Google, and Kingsoft PowerWord are ready to use out of the box.
## Features
Supports back-translation, global TTS, writing (directly translating and replacing selected content), custom prompts, QR code recognition, external calls, and more.
## Main Interface
![main_ui](./assets/main.png)
## Configuration
![settings_1](./assets/settings_1.png)
![settings_2](./assets/settings_2.png)

View file

@ -0,0 +1,31 @@
<img src="./assets/stranslate.svg" width="64" height="auto" />
# [`STranslate`](https://stranslate.zggsong.com/)
STranslate 是一款即用即走的翻译、OCR工具
## 翻译
支持多种翻译语言,支持输入、划词、截图、监听剪贴板、监听鼠标划词等多种翻译方式,支持同时显示多个服务翻译结果,方便比较翻译结果
## OCR
支持中英日韩完全离线OCR基于 PaddleOCR效果优秀反应迅速支持截图、剪贴板、文件OCR支持静默OCR同时支持微信、百度、腾讯、OpenAI、Google等OCR
## 服务
支持DeepSeek、OpenAI、Gemini、ChatGLM、百度、微软、腾讯、有道、阿里等十多家翻译服务接入同时还提供免费API可供选择内置微软、Yandex、Google、金山词霸等内置服务可做到开箱即用
## 特色
支持回译、全局TTS、写作(选中后直接翻译替换内容)、自定义Prompt、二维码识别、外部调用等等功能
## 主界面
![main_ui](./assets/main.png)
## 配置
![settings_1](./assets/settings_1.png)
![settings_2](./assets/settings_2.png)

Binary file not shown.

After

Width:  |  Height:  |  Size: 133 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 204 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 50 KiB

View file

@ -0,0 +1,2 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg data-name="Layer 1" id="Layer_1" viewBox="0 0 32 32" xmlns="http://www.w3.org/2000/svg"><defs><style>.cls-1{fill:#ba63c6;}</style></defs><title/><path class="cls-1" d="M31.79,28.11l-7-14a2,2,0,0,0-3.58,0L18,20.56,15.3,18.73A17.13,17.13,0,0,0,19.91,9H22a2,2,0,0,0,0-4H14V3a2,2,0,0,0-4,0V5H2A2,2,0,0,0,2,9H15.86a13.09,13.09,0,0,1-3.79,7.28,13.09,13.09,0,0,1-3.19-4.95,2,2,0,1,0-3.77,1.34A17.1,17.1,0,0,0,8.9,18.75L3.84,22.37a2,2,0,0,0,2.33,3.25l5.93-4.24,4.08,2.79-2,3.93a2,2,0,0,0,3.58,1.79l.45-.89H23a2,2,0,0,0,0-4H20.24L23,19.47l5.21,10.42a2,2,0,0,0,3.58-1.79Z"/></svg>

After

Width:  |  Height:  |  Size: 614 B

View file

@ -0,0 +1,33 @@
# `SuperAgentX`
> 🤖 SuperAgentX: A lightweight autonomous true multi-agent framework with AGI capabilities.
**SuperAgentX Source Code**: [https://github.com/superagentxai/superagentx](https://github.com/superagentxai/superagentx)
**DeepSeek AI Agent Example**: [https://github.com/superagentxai/superagentx/blob/master/tests/llm/test_deepseek_client.py](https://github.com/superagentxai/superagentx/blob/master/tests/llm/test_deepseek_client.py)
**Documentation** : [https://docs.superagentx.ai/](https://docs.superagentx.ai/)
The SuperAgentX framework integrates DeepSeek as its LLM service provider, enhancing the multi-agent's reasoning and decision-making capabilities.
## 🤖 Introduction
`SuperAgentX` is an advanced agentic AI framework designed to accelerate the development of Artificial General Intelligence (AGI). It provides a powerful, modular, and flexible platform for building autonomous AI agents capable of executing complex tasks with minimal human intervention.
![SuperAgentX Diagram](https://raw.githubusercontent.com/superagentxai/superagentX/refs/heads/master/docs/images/architecture.png)
### ✨ Key Features
🚀 Open-Source Framework: A lightweight, open-source AI framework built for multi-agent applications with Artificial General Intelligence (AGI) capabilities.
🎯 Goal-Oriented Multi-Agents: Enables the creation of agents with retry mechanisms to achieve set goals. Communication between agents can be parallel, sequential, or hybrid.
🏖️ Easy Deployment: Offers WebSocket, RESTful API, and IO console interfaces for rapid setup of agent-based AI solutions.
♨️ Streamlined Architecture: Enterprise-ready scalable and pluggable architecture. No major dependencies; built independently!
📚 Contextual Memory: Uses SQL + Vector databases to store and retrieve user-specific context effectively.
🧠 Flexible LLM Configuration: Supports simple configuration options of various Gen AI models.
🤝🏻 Extendable Handlers: Allows integration with diverse APIs, databases, data warehouses, data lakes, IoT streams, and more, making them accessible for function-calling features.

Binary file not shown.

After

Width:  |  Height:  |  Size: 99 KiB

View file

@ -12,7 +12,19 @@ ToMemo is a phrasebook + clipboard history + keyboard iOS app with integrated AI
## Integrate with Deepseek API
Go to Settings-Extensions-AI Services-AI Providers to add the Deepseek API Key.
After adding, you can turn on the 「show in bottom tab」 in the AI service page, so that you can talk to Deepseek directly in the application.
- Go to "Settings-Extensions-AI Services-AI Providers", click "Add" in the top right corner, and select "DeepSeek" in the **Provider** field.
- Enter your API Key in the **API Key** field.
- Click the "Test" button to verify if the input is valid.
- Click "Load Models" to select the model you want to use
- Turn on "Enable" and click "Save"
![image](assets/Integrate.jpg)
![image](assets/app-provider.png)
## Use
- Go to "Settings-Extensions-AI Services"
- Click "AI Assistant" to enter the AI Assistant page
- Add an AI Assistant in the top right corner, you can select "Deepseek" in the models
- Start chatting with Deepseek
![image](assets/use-deepseek.png)

View file

@ -12,7 +12,19 @@ ToMemo 是一款短语合集 + 剪切板历史 + 键盘输出的 iOS 应用,
## Integrate with Deepseek API
进入设置-扩展-AI 服务-AI 供应商,即可添加 Deepseek API Key。
添加完成后,可以 AI 服务页面中开启底部 Tab 页,方便应用中直接与 Deepseek 对话。
- 进入「设置-扩展-AI 服务-AI 供应商」,点击右上角「添加」,在**供应商**中选择「DeepSeek」。
- 在**API Key**中输入你的 API Key。
- 点击「测试」按钮,测试填入是否可用。
- 点击「加载模型」,选择需要使用的模型
- 打开「启用」后,点击「保存」
![image](assets/Integrate.jpg)
![image](assets/app-provider.png)
## Use
- 进入「设置-扩展-AI 服务」,
- 点击「AI 助手」进入 AI 助手页面,
- 右上角添加 AI 助手,可以在模型中选择「深度求索」
- 开始和 Deepseek 聊天
![image](assets/use-deepseek.png)

Binary file not shown.

After

Width:  |  Height:  |  Size: 416 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 260 KiB

146
docs/yomo/README.md Normal file
View file

@ -0,0 +1,146 @@
# YoMo Framework - Deepseek Provider
YoMo is an open-source LLM function-calling framework for building geo-distributed AI agents. Built atop the QUIC transport protocol and a strongly-typed stateful serverless architecture, it makes your AI agents low-latency, reliable, secure, and easy to build.
## 🚀 Getting Started
Let's implement a function-calling serverless function, `sfn-get-ip-latency`:
### Step 1. Install CLI
```bash
curl -fsSL https://get.yomo.run | sh
```
### Step 2. Start the server
Prepare the configuration as `my-agent.yaml`
```yaml
name: ai-zipper
host: 0.0.0.0
port: 9000
auth:
type: token
token: SECRET_TOKEN
bridge:
ai:
server:
addr: 0.0.0.0:9000 ## Restful API endpoint
provider: deepseek ## LLM API Service we will use
providers:
deepseek:
api_key: <DEEPSEEK_API_KEY>
model: deepseek-reasoner
```
Start the server:
```sh
YOMO_LOG_LEVEL=debug yomo serve -c my-agent.yaml
```
### Step 3. Write the function
First, let's define what this function does and which parameters it requires; these will be combined into the prompt when invoking the LLM.
```golang
type Parameter struct {
Domain string `json:"domain" jsonschema:"description=Domain of the website,example=example.com"`
}
func Description() string {
	return `if the user asks for the IP or network latency of a domain, return the result for the given domain. Try your best to dissect user expressions to infer the right domain names`
}
func InputSchema() any {
return &Parameter{}
}
```
Create a Stateful Serverless Function to get the IP and Latency of a domain:
```golang
func Handler(ctx serverless.Context) {
var msg Parameter
ctx.ReadLLMArguments(&msg)
// get ip of the domain
ips, _ := net.LookupIP(msg.Domain)
// get ip[0] ping latency
pinger, _ := ping.NewPinger(ips[0].String())
pinger.Count = 3
pinger.Run()
stats := pinger.Statistics()
val := fmt.Sprintf("domain %s has ip %s with average latency %s", msg.Domain, ips[0], stats.AvgRtt)
ctx.WriteLLMResult(val)
}
```
Finally, let's run it
```bash
$ yomo run app.go
time=2025-01-29T21:43:30.583+08:00 level=INFO msg="connected to zipper" component=StreamFunction sfn_id=B0ttNSEKLSgMjXidB11K1 sfn_name=fn-get-ip-from-domain zipper_addr=localhost:9000
time=2025-01-29T21:43:30.584+08:00 level=INFO msg="register ai function success" component=StreamFunction sfn_id=B0ttNSEKLSgMjXidB11K1 sfn_name=fn-get-ip-from-domain zipper_addr=localhost:9000 name=fn-get-ip-from-domain tag=16
```
### Done, let's have a try
```sh
$ curl -i http://127.0.0.1:9000/v1/chat/completions -H "Content-Type: application/json" -d '{
"messages": [
{
"role": "system",
"content": "You are a test assistant."
},
{
"role": "user",
"content": "Compare website speed between Nike and Puma"
}
],
"stream": false
}'
HTTP/1.1 200 OK
Content-Length: 944
Connection: keep-alive
Content-Type: application/json
Date: Wed, 29 Jan 2025 13:30:14 GMT
Keep-Alive: timeout=4
Proxy-Connection: keep-alive
{
"Content": "Based on the data provided for the domains nike.com and puma.com which include IP addresses and average latencies, we can infer the following about their website speeds:
- Nike.com has an IP address of 13.225.183.84 with an average latency of 65.568333 milliseconds.
- Puma.com has an IP address of 151.101.194.132 with an average latency of 54.563666 milliseconds.
Comparing these latencies, Puma.com is faster than Nike.com as it has a lower average latency.
Please be aware, however, that website speed can be influenced by many factors beyond latency, such as server processing time, content size, and delivery networks among others. To get a more comprehensive understanding of website speed, you would need to consider additional metrics and possibly conduct real-time speed tests.",
"FinishReason": "stop"
}
```
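The bridge returns a plain JSON object, so any HTTP client can consume it. A minimal Python sketch using the `Content`/`FinishReason` fields shown above (the sample text is abridged from the response):

```python
import json

# Sample response in the shape returned by YoMo's /v1/chat/completions bridge
# (abridged from the output above).
raw = '{"Content": "Puma.com is faster than Nike.com.", "FinishReason": "stop"}'

resp = json.loads(raw)
if resp["FinishReason"] == "stop":
    # The model finished normally; Content holds the full answer.
    print(resp["Content"])  # Puma.com is faster than Nike.com.
```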
### Full Example Code
[Full LLM Function Calling Codes](https://github.com/yomorun/llm-function-calling-examples)
## 🎯 Focuses on Geo-distributed AI Inference Infra
It's no secret that today's users want instant AI inference; every AI
application is more powerful when it responds quickly. But currently, when we
talk about `distribution`, it usually means **distribution within a data center**. The AI
model is far away from its users all over the world.
If an application can be deployed anywhere, close to its end users, the
problem is solved. This is **Geo-distributed System Architecture**:
<img width="580" alt="yomo geo-distributed system" src="https://user-images.githubusercontent.com/65603/162367572-5a0417fa-e2b2-4d35-8c92-2c95d461706d.png">