API Documentation

This LynixAI API documentation helps developers integrate the service easily, providing complete instructions for each feature and how to get the most out of it.

🧠 Overview

LynixAI is an intelligent chatbot API that enables developers to build responsive and contextual AI-based conversational experiences. Using a variety of advanced language models, LynixAI can understand and respond to user interactions with high accuracy.

Designed with flexibility in mind, LynixAI supports integration into a wide range of applications, from automated customer service to virtual assistants. Key features include natural language understanding, conversational context tracking, and the option to customize chatbot behavior to suit your business or product needs.

With comprehensive documentation and a developer-friendly interface, LynixAI is an efficient solution for building modern AI chatbots that are fast, secure, and easy to integrate.


✨ Features

  • Contextual Conversations with Per-User Memory

    Stores context and interaction history based on user_id, enabling a more personalized and consistent conversational experience.

  • Custom Prompts for AI Response Style

    Supports customizing the style and tone of AI responses through custom prompts, to suit your brand personality or application needs.

  • Image Input Support for Visual Analysis

    Accepts image input and is able to visually interpret the content sent.

  • Advanced Visual Understanding

    Recognizes and understands graphs, objects, and tables in images to provide relevant and in-depth analysis.

  • Fast Response with Latest Models

    Uses the latest language models optimized for speed and accuracy of responses across a variety of usage scenarios.

  • Unique Session System Based on user_id

    Manages conversations individually using unique user identifiers, ensuring session continuity and security.

  • Lightweight and Flexible Endpoint

    The API can be easily accessed from Postman, a frontend application, or any other platform thanks to the lightweight and efficient endpoint design.

  • Multi-Model Support

    Provides flexibility in choosing a language model according to performance needs, scalability, and task complexity.

  • Adaptive Multilingual Capabilities

    AI automatically adjusts the response language based on the language used by the user in the conversation.


🚀 Endpoint

LynixAI provides several API endpoints to support both text and image-based interactions. Here is a list of available endpoints and how to use them.

1. https://lynix-ai.vercel.app/api/chat — Text Conversation

  • Method: POST
  • Headers:
    x-api-key: Lynix-xxxxxxxx-xxxxxxxx-xxxxxxxx  
    Content-Type: application/json
    
  • Request
    {
      "user_id": "lynix-20250408-143322-abc123",     // Optional
      "prompt": "You are a friendly AI...",           // Optional
      "model": 1,                                     // Optional
      "question": "What is AI?"                       // Required
    }
    
  • Response
    {
      "success": true,
      "user_id": "lynix-20250408-143322-xxxxxx",
      "answer": "AI stands for Artificial Intelligence...",
      "notice": "Use this user_id for future conversations."
    }
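
  • Example (Python)

    The snippet below is a minimal sketch of calling this endpoint with the Python requests library, assuming the request and response shapes shown above; the API key and questions are placeholders.

    import requests

    BASE_URL = "https://lynix-ai.vercel.app"
    HEADERS = {
        "x-api-key": "Lynix-xxxxxxxx-xxxxxxxx-xxxxxxxx",  # placeholder key
        "Content-Type": "application/json",
    }

    # First turn: only "question" is required; the response returns a user_id.
    first = requests.post(
        f"{BASE_URL}/api/chat",
        headers=HEADERS,
        json={"question": "What is AI?"},
        timeout=30,
    )
    first.raise_for_status()
    data = first.json()
    print(data["answer"])

    # Follow-up turn: reuse the returned user_id so the conversation keeps its
    # context, and optionally set a custom prompt and model.
    follow_up = requests.post(
        f"{BASE_URL}/api/chat",
        headers=HEADERS,
        json={
            "user_id": data["user_id"],
            "prompt": "You are a friendly AI...",  # optional response style
            "model": 1,                            # optional model selection
            "question": "Give me a one-sentence example of AI in daily life.",
        },
        timeout=30,
    )
    follow_up.raise_for_status()
    print(follow_up.json()["answer"])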
    

2. https://lynix-ai.vercel.app/api/vision — Image Conversation

  • Method: POST
  • Headers:
    x-api-key: Lynix-xxxxxxxx-xxxxxxxx-xxxxxxxx  
    Content-Type: application/json
    
  • Request
    {
      "user_id": "lynix-20250408-143322-abc123",              // Optional
      "prompt": "You are a friendly AI...",                   // Optional
      "image": "https://example.com/image.jpg",               // Optional
      "question": "What is the content of this picture?"    // Required
    }
    
  • Response
    {
      "success": true,
      "user_id": "lynix-20250408-143322-xxxxxx",
      "answer": "The picture shows mountains...",
      "notice": "Use this user_id for future conversations."
    }
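
  • Example (Python)

    A minimal sketch of an image question using the Python requests library, assuming the request shape shown above; the API key and image URL are placeholders.

    import requests

    resp = requests.post(
        "https://lynix-ai.vercel.app/api/vision",
        headers={
            "x-api-key": "Lynix-xxxxxxxx-xxxxxxxx-xxxxxxxx",  # placeholder key
            "Content-Type": "application/json",
        },
        json={
            "image": "https://example.com/image.jpg",  # publicly reachable image URL
            "question": "What is the content of this picture?",
        },
        timeout=60,
    )
    resp.raise_for_status()
    body = resp.json()
    print(body["answer"])
    # Keep body["user_id"] to ask follow-up questions about the same image.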
    

3. https://lynix-ai.vercel.app/api/clear — Delete User Conversation Memory

  • Method: POST
  • Request
    {
      "user_id": "lynix-20250408-143322-xxxxxx"
    }
    
  • Response
    {
      "success": true,
      "answer": "Memory successfully cleared."
    }
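
  • Example (Python)

    A minimal sketch of clearing a user's memory, assuming the request body shown above; the user_id is a placeholder returned by an earlier /api/chat or /api/vision call.

    import requests

    resp = requests.post(
        "https://lynix-ai.vercel.app/api/clear",
        json={"user_id": "lynix-20250408-143322-xxxxxx"},  # placeholder user_id
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json()["answer"])  # "Memory successfully cleared."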
    

4. https://lynix-ai.vercel.app/api/model — List of Available Models

  • Method: GET
  • Response
    {
      "success": true,
      "models": [
        {
          "id": "1",
          "name": "Lynix Turbo",
          "description": "Fast and powerful for conversation, programming, and reasoning."
        }
        // More models
      ]
    }
    
  • Available Models
    ID | Model Name           | Endpoint    | Description
    1  | Lynix Turbo          | /api/chat   | Fast and efficient for conversation, coding, and logic.
    2  | Lynix Instruct Turbo | /api/chat   | Large model, suitable for complex and technical tasks.
    3  | Lynix Vision         | /api/vision | Multimodal model for image and text analysis.
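
  • Example (Python)

    A minimal sketch of listing the available models, assuming the response shape shown above.

    import requests

    resp = requests.get("https://lynix-ai.vercel.app/api/model", timeout=30)
    resp.raise_for_status()
    for model in resp.json()["models"]:
        print(f'{model["id"]}: {model["name"]} - {model["description"]}')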

🧪 Testing

  • Method: POST

  • URL:

    https://lynix-ai.vercel.app/api/chat
    
  • Headers:

    Key          | Value
    x-api-key    | Lynix-xxxxxxxx-xxxxxxxx-xxxxxxxx
    Content-Type | application/json
  • Body: select raw → JSON, then fill in:

    {
      "question": "What is AI?"
    }
    
  • Click Send and check the AI response.
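
  • Or run the same test from a script (a quick sketch in Python; the API key is a placeholder):

    import requests

    resp = requests.post(
        "https://lynix-ai.vercel.app/api/chat",
        headers={
            "x-api-key": "Lynix-xxxxxxxx-xxxxxxxx-xxxxxxxx",
            "Content-Type": "application/json",
        },
        json={"question": "What is AI?"},
        timeout=30,
    )
    # Fail loudly if the request was rejected or the API reported an error.
    assert resp.status_code == 200 and resp.json().get("success"), resp.text
    print(resp.json()["answer"])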


📊 Usage Limits

Package | Daily Limit        | Default Tone            | Status  | API Key
Free    | 100 requests/day   | Simple & Short          | Public  | lynix-ctgSueDfwx7WaE89h0yb68ivPDjaxFSj
Bronze  | 1,500 requests/day | Formal & Neutral        | Private | Subscribe
Silver  | 5,000 requests/day | Professional & Friendly | Private | Subscribe
Gold    | Unlimited (∞)      | Casual & Conversational | Private | Subscribe

❓ FAQ

➤ What happens if I exceed the daily limit?

If the daily limit is reached (on the Free, Bronze, or Silver plans), the API will automatically stop responding until the limit resets the next day.

➤ How do I choose the tone of the AI response?

The AI response tone is automatically adjusted based on the plan you are using:

  • Bronze: Formal, objective, like an official document.
  • Silver: Professional but friendly, suitable for customer-facing use.
  • Gold: Casual, flexible, and can use emojis or light humor.

➤ How do I get started with the API?

  1. Get an x-api-key from your account dashboard.
  2. Use the /api/chat or /api/vision endpoint as needed.
  3. Include the user_id parameter to track conversations per user.

➤ Does this API support image input?

Yes! Use the /api/vision endpoint to send an image and ask questions about its content.

➤ Can I clear the conversation memory?

Yes. Use the /api/clear endpoint with the user_id parameter to clear the history of a specific conversation context.

➤ Can I change the AI model used?

Yes. You can select a model via the model parameter in the /api/chat endpoint. Use the /api/model endpoint to see a list of available models.

➤ How do I upgrade my plan?

Please contact our team to upgrade from Bronze to Silver or Gold.
