Document Vector Database

Overview

The Document Vector Database Connected System enables Large Language Models (LLMs) to answer user-submitted questions based on Appian Knowledge Center documents. By uploading documents to this connected system, users can perform semantic searches to pinpoint the content most relevant to their questions. The connected system also provides Client APIs tailored for the AI Knowledge Assistant component, allowing the component to deliver AI-generated answers to user questions based on documents stored in the database, as well as to general questions.

Key Features & Functionality

  1. Upload Document - Uploads a document and stores it, along with its vector embedding, in the database.
  2. List Documents - Returns the list of documents uploaded to the database.
  3. Database Operations
    1. Delete Documents - Deletes documents that have been uploaded to the database.
    2. Sync Documents - Updates the existing documents in the database with the latest versions available in the Appian Knowledge Center.
    3. Change Database Password - Changes the database password.
  4. Query Documents - Retrieves the most relevant pieces of content from the documents for a given prompt.
  5. Generate Response - Searches the given documents and generates an OpenAI (ChatGPT) response for the given prompt (a usage sketch follows the note below).
  6. Client APIs for the AI Knowledge Assistant component, used for fetching document details, chat completions, document querying, and uploading new documents to the database.

Note: Download the AI Knowledge Assistant, a sophisticated chatbot designed to perform semantic searches across your documents and provide precise answers to your queries.
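
For example, a developer might wrap the Generate Response template in an integration object and call it from an expression. The sketch below is illustrative only: the rule name DVD_GenerateResponse, the constant DVD_POLICY_DOCUMENT, and the input names prompt, documents, and loggedInUser are assumptions; check the integration object generated in your environment for the actual inputs.

    /* Hypothetical sketch of calling an integration created from the
       "Generate Response" template. Rule, constant, and input names are
       assumptions; confirm them against your generated integration object. */
    a!localVariables(
      local!response: rule!DVD_GenerateResponse(
        prompt: "What is the refund policy for enterprise customers?",
        documents: {cons!DVD_POLICY_DOCUMENT},
        loggedInUser: loggedInUser() /* required for document security in v2.0.0+ */
      ),
      /* Integrations called from an expression return success, result, and error fields */
      if(
        local!response.success,
        local!response.result,
        local!response.error
      )
    )

Because queried documents are cached in memory (see the v3.0.0 notes below), the first call against a set of documents may be slower than subsequent calls.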

Release Notes
  • v3.1.2 Release Notes
    • Security patch updated.

  • v3.1.1 Release Notes
    1. Bug fixes.
    2. Security patch updated.

    IMPORTANT NOTE: If you are using this plugin in production, open a support case and ask to increase Heap Max for app server by 1GB. This will increase query performance and allow the plugin to handle a larger number of concurrent users.

  • v3.1.0 Release Notes
    1. Introduced a new integration, Grouping Integration, which groups text entries into specified categories.
    2. Enhanced usability: the embedding model and Database constant name in the Connected System configuration can now be set only when the Connected System is first created and cannot be edited afterward.

    IMPORTANT NOTE: If you are using this plugin in production, open a support case and ask to increase Heap Max for app server by 1GB. This will increase query performance and allow the plugin to handle a larger number of concurrent users.

  • v3.0.1 Release Notes
    1. Bug fixes
    2. Made all fields in the Connected System import customizable.
    IMPORTANT NOTE: If you are using this plugin in production, open a support case and ask to increase Heap Max for app server by 1GB. This will increase query performance and allow the plugin to handle a larger number of concurrent users.

  • v3.0.0 Release Notes
    • Scalability Enhancements: We have significantly enhanced the vector database's scalability so that multiple users can interact with the chatbot concurrently. Please note that query speeds may vary depending on document length, the number of documents queried, the number of concurrent users, and hardware configuration.
    • Query Performance: Query performance has increased substantially. Queried documents are now temporarily cached in memory, resulting in slight latency on the initial query followed by nearly instantaneous responses to subsequent questions. This update markedly improves the speed and efficiency of query processing.
    IMPORTANT NOTE: If you are using this plugin in production, open a support case and ask to increase Heap Max for app server by 1GB. This will increase query performance and allow the plugin to handle a larger number of concurrent users.

  • v2.0.1 Release Notes
    • Query Performance Optimization.
    • Bug Fixes and improvements.
  • There is no size limit to the embedded database.

  • The embedded vector database is stored as an Appian document in the Appian Knowledge Center.
  • If you are already using an older version of the Document Vector Database Connected System, complete the following steps to migrate to the newer version. This will be a one-time setup.

    • Once the latest version of the connected system is installed, create a new Vector Database Connected System object. The previous connected system object and integrations based on it will continue to work.
      • Important Note: While creating the connected system object, do not reuse the same Database Name; provide a different value.
    • Use the List Documents Integration (based on the older version of the connected system) to get the list of all documents uploaded to the previous vector database.
    • Create an integration based on the new connected system object using the Upload Document Integration template and upload all the documents that were in the previous database (a sketch of these two steps follows this list).
    • Create new integration objects for the new connected system. Older integration objects will continue to function off the previous connected system version.
    • If you are using the “AI Knowledge Assistant Component”, update the component to the latest version. This is required for it to work with the new version of the connected system. To make the yellow error under “securityKey” disappear, append the latest version to the component name (e.g., “AIKnowledgeAssistantField_v2”). After updating the component to the latest version, follow the Migration Guide provided in the component documentation.
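
    As a rough illustration of the list-and-re-upload steps above, the one-time migration could be scripted along these lines. The rule names DVD_ListDocuments_Old and DVD_UploadDocument_New and their inputs are hypothetical placeholders for the integration objects you create from the old and new connected systems, and the exact shape of the List Documents result depends on that integration's output.

        /* Hypothetical one-time migration sketch. Rule and input names are
           placeholders; adjust to match your integration objects. */
        a!localVariables(
          /* 1. List the documents stored in the previous vector database
                (integration built on the OLD connected system object). */
          local!oldDocs: rule!DVD_ListDocuments_Old(loggedInUser: loggedInUser()),
          /* 2. Re-upload each document through an integration built on the
                NEW connected system object. */
          a!forEach(
            items: local!oldDocs.result,
            expression: rule!DVD_UploadDocument_New(
              document: fv!item,
              loggedInUser: loggedInUser()
            )
          )
        )

    If the Upload Document integration is configured as one that modifies data, Appian only allows it to run from a saveInto, a Web API, or a process model, so this expression would typically be executed from one of those contexts rather than on interface load.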
  • v2.0.0 Release Notes
    1. Connected System Configuration Changes
      1. New Parameters: Added Model/Deployment ID and Max Tokens (specific to Azure OpenAI Service) to enable the new Generate Response Integration.
      2. Database Username Removal: Now, only the Appian Username is required.
    2. Integration and Functionality Enhancements
      1. New Integration - Generate Response: This integration mimics the AI Knowledge Assistant’s functionality by allowing documents to be queried based on a user’s question and sent to OpenAI to generate a response. A developer can now asynchronously call the ‘Generate Response’ integration, save its output, and pass it into the conversation parameter of the AI Assistant. This allows the AI Assistant to be automatically loaded with questions and answers when a user loads the interface (a sketch of this pattern follows this list).
      2. New Integration - Sync Documents: Updates the existing documents in the database with the latest version of the document available in the Appian Knowledge Center.
      3. Reduced Complexity: Chunk size and topK are now handled for the developer.
      4. New Parameter: Logged In User - loggedInUser() now required for all integrations to handle document security. Users will only be able to upload, delete, and view documents they have access to.
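
    To illustrate the pre-loading pattern described under “New Integration - Generate Response” above, an interface might look roughly like the sketch below. The rule name DVD_GenerateResponse, the constants, the component invocation style, and the shape of the conversation value are all assumptions; follow the AI Knowledge Assistant component documentation for the actual contract.

        /* Hypothetical sketch of seeding the AI Knowledge Assistant with a
           pre-generated question and answer. Names and the conversation
           value's shape are assumptions; see the component documentation. */
        a!localVariables(
          local!question: "Summarize the onboarding checklist.",
          local!answer: rule!DVD_GenerateResponse(
            prompt: local!question,
            documents: {cons!DVD_ONBOARDING_DOC},
            loggedInUser: loggedInUser()
          ),
          AIKnowledgeAssistantField_v2(
            securityKey: cons!DVD_SECURITY_KEY, /* environment-specific; set up per the component docs */
            /* Pass the saved question/answer pair into the conversation
               parameter so the chat loads pre-populated. */
            conversation: {
              a!map(role: "user", content: local!question),
              a!map(role: "assistant", content: local!answer.result)
            }
          )
        )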