shaoyingchen_mcp_client_llm.git

by shaoyingchen
Enables large language models to access any API type and demonstrates local application integration.

Local Application Integration for Large Language Models

Overview

This project provides a simple example of using the OpenAI API to build a local application that can access any API type. It supports large language models and demonstrates seamless local application integration.

Tools

  • Python: The project is entirely written in Python.
  • OpenAI API: Used to interact with large language models.
  • uv: The package manager used to install the project's dependencies (via `uv sync`).

Configuration

  1. Install Necessary Libraries:

     ```bash
     uv sync
     ```
  2. Modify .env File:
    Add your OpenAI API key, endpoint, and model details.
     ```env
     OPENAI_API_KEY=your_api_key
     OPENAI_ENDPOINT=your_endpoint
     OPENAI_MODEL=your_model
     ```
  3. Modify servers_config.json:
    Configure your server settings in the servers_config.json file.
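
The README does not show the contents of servers_config.json. As a rough sketch, assuming it follows the server-configuration layout common to MCP clients, it might look like the following (the server name, command, and arguments are placeholders, not values from this repository):

```json
{
  "mcpServers": {
    "example-server": {
      "command": "uvx",
      "args": ["example-mcp-server"],
      "env": {}
    }
  }
}
```

Each entry tells the client how to launch one MCP server process; consult the file shipped in the repository for the exact keys it expects.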

Usage

  1. Set Up Environment: Ensure all configurations are correctly set up as described in the Configuration section.
  2. Run the Application: Execute the Python script to start the local application.
     ```bash
     python src/main.py
     ```
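
At startup the script presumably reads the three settings from the .env example in the Configuration section. A minimal sketch of that step, assuming plain environment-variable lookup (the helper name `load_openai_config` is illustrative; the real src/main.py may differ):

```python
import os


def load_openai_config() -> dict:
    """Collect the OpenAI settings that the .env file is expected to provide.

    The variable names mirror the .env example above. Raises if any
    setting is absent, so misconfiguration fails fast at startup.
    """
    config = {
        "api_key": os.getenv("OPENAI_API_KEY"),
        "endpoint": os.getenv("OPENAI_ENDPOINT"),
        "model": os.getenv("OPENAI_MODEL"),
    }
    missing = [name for name, value in config.items() if not value]
    if missing:
        raise RuntimeError(f"Missing settings: {', '.join(missing)}")
    return config
```

A tool like python-dotenv could populate the environment from the .env file before this helper runs.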
