This document gives a step-by-step breakdown of how prompts flow through the system:
- From frontend trigger to backend Python handlers
- Includes hooks, handlers, payload formats, and utility functions
- Ideal for developers working on prompt submission, socket handling, and agent-based AI routing
Core Functions
handleAIApiType(fileMetadata)
Determines which backend API handles the request based on file metadata.
Returns: String key (doc, custom_gpt, canvas) for backend routing
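A minimal sketch of how this routing could look; the metadata fields (isCanvasSelection, isCustomGpt, documentId) are assumptions, not the real schema:

```typescript
// Hypothetical metadata shape; the real fields may differ.
interface FileMetadata {
  isCanvasSelection?: boolean; // prompt targets a canvas selection (assumed field)
  isCustomGpt?: boolean;       // a custom GPT agent is selected (assumed field)
  documentId?: string;         // present for document-based chats (assumed field)
}

type AIApiType = 'doc' | 'custom_gpt' | 'canvas';

// Sketch: map file metadata to the string key used for backend routing.
function handleAIApiType(fileMetadata: FileMetadata): AIApiType {
  if (fileMetadata.isCanvasSelection) return 'canvas';
  if (fileMetadata.isCustomGpt) return 'custom_gpt';
  return 'doc';
}
```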
handleSubmitPrompt(refine = false)
Triggers the complete prompt submission lifecycle.
Behavior:
- First message: Creates new chat and adds user as member
- Always disables chat input until response received
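A hedged sketch of that lifecycle; helper names such as createChatWithMember and sendPromptToBackend are placeholders, not the real implementation:

```typescript
// Placeholder helpers standing in for the real app code (assumptions).
declare function isFirstMessage(): boolean;
declare function createChatWithMember(): Promise<void>;
declare function disableChatInput(): void;
declare function enableChatInput(): void;
declare function enterNewPrompt(): Promise<void>;
declare function sendPromptToBackend(opts: { refine: boolean }): Promise<void>;

async function handleSubmitPrompt(refine = false): Promise<void> {
  if (isFirstMessage()) {
    // First message: create a new chat and add the user as a member.
    await createChatWithMember();
  }
  // Lock the input until the response has come back.
  disableChatInput();
  try {
    await enterNewPrompt();                // record the message before submitting
    await sendPromptToBackend({ refine });
  } finally {
    enableChatInput();
  }
}
```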
enterNewPrompt()
Creates message entry in MongoDB before backend submission.
Usage: Must be called before sending to backend
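The ordering matters because the streamed response needs an existing MongoDB record to attach to. A minimal sketch, assuming a /api/messages route and illustrative field names:

```typescript
// Hypothetical: persist the message before the backend call. Route and
// field names are assumptions, not the real schema.
declare function getCurrentChatId(): string;
declare function getPromptText(): string;

async function enterNewPrompt(): Promise<string> {
  const res = await fetch('/api/messages', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      chatId: getCurrentChatId(),
      prompt: getPromptText(),
      createdAt: new Date().toISOString(),
    }),
  });
  const { id } = await res.json(); // id the backend submission references
  return id;
}
```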
Payload Structure
Standard payload format across all LLM APIs:
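A hedged reconstruction built from the Common Parameters section below; any field not listed there is an assumption:

```typescript
// Illustrative payload only; the real schema may differ.
interface LLMPayload {
  prompt: string;                              // user prompt text (assumed field)
  chatId: string;                              // target chat (assumed field)
  companyId: string;                           // company identifier
  provider: string;                            // LLM provider
  code: string;                                // model code
  custom_gpt_id?: string;                      // agent identifier, when an agent is selected
  agent_extra_info?: Record<string, unknown>;  // additional agent data
}

const payload: LLMPayload = {
  prompt: 'Summarize the attached document',
  chatId: 'chat_123',
  companyId: 'co_456',
  provider: 'openai',
  code: 'gpt-4o',
};
```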
Backend API Functions
General Chat Functions
getPerplexityResponse(payload)
- General LLM processing
getAINormatChatResponse(payload)
- Image uploads or no prompt selected
getAIDocResponse(payload)
- Document-based chats
setChatTitleByAI(params)
- Auto-generate chat titles
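A sketch of how the key returned by handleAIApiType() might map onto these calls; the dispatch table itself is an assumption:

```typescript
// Hypothetical dispatch from the routing key to the matching backend call.
declare function getAIDocResponse(payload: object): Promise<unknown>;
declare function getAICustomGPTResponse(payload: object): Promise<unknown>;
declare function chatCanvasAiResponse(payload: object): Promise<unknown>;

async function routePrompt(apiType: 'doc' | 'custom_gpt' | 'canvas', payload: object) {
  switch (apiType) {
    case 'doc':
      return getAIDocResponse(payload);        // document-based chats
    case 'custom_gpt':
      return getAICustomGPTResponse(payload);  // custom GPT agents
    case 'canvas':
      return chatCanvasAiResponse(payload);    // selection-based canvas responses
  }
}
```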
Agent Functions
getAICustomGPTResponse(payload)
- Custom GPT agents (uses agent.model_name)
getAIProAgentChatResponse(payload)
- QA and Web Proposal agents
getSalesCallResponse(payload)
- Sales call analysis
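A sketch of the point that getAICustomGPTResponse() takes its model from agent.model_name rather than the chat-level selection; the agent shape and field mapping are assumptions:

```typescript
// Illustrative only; the real agent schema may differ.
interface CustomGptAgent {
  _id: string;
  model_name: string;
}

declare function getAICustomGPTResponse(payload: object): Promise<unknown>;

async function askCustomGpt(agent: CustomGptAgent, prompt: string) {
  return getAICustomGPTResponse({
    prompt,
    custom_gpt_id: agent._id, // agent identifier
    code: agent.model_name,   // model taken from the agent (assumed mapping)
  });
}
```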
Specialized Functions
chatCanvasAiResponse(payload)
- Selection-based canvas responses
getSeoKeyWords(payload)
- SEO keyword generation
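A sketch of a selection-based canvas call using the canvas-specific parameters listed under Common Parameters; the wrapper itself is hypothetical:

```typescript
declare function chatCanvasAiResponse(payload: object): Promise<unknown>;

// Hypothetical wrapper: rewrite the selected character range of a message.
async function rewriteSelection(messageId: string, start: number, end: number, prompt: string) {
  return chatCanvasAiResponse({
    prompt,
    currentMessageId: messageId, // message identifier
    startIndex: start,           // selection start
    endIndex: end,               // selection end
  });
}
```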
React Hooks
Core Hooks
useConversation()
- Central conversation management, LLM streaming, prompt submission
useThunderBoltPopup()
- File uploads, agent selection, prompt selection
useMediaUpload()
- Media uploads for canvas responses
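A sketch of how a chat input component might combine these hooks; the returned fields (submitPrompt, isStreaming, openPopup) are assumptions about their APIs:

```tsx
import { useState } from 'react';

// Assumed return shapes; the real hooks may expose different fields.
declare function useConversation(): {
  submitPrompt: (text: string) => Promise<void>;
  isStreaming: boolean;
};
declare function useThunderBoltPopup(): { openPopup: () => void };

function ChatInput() {
  const [text, setText] = useState('');
  const { submitPrompt, isStreaming } = useConversation();
  const { openPopup } = useThunderBoltPopup();

  return (
    <form onSubmit={(e) => { e.preventDefault(); void submitPrompt(text); }}>
      <input value={text} onChange={(e) => setText(e.target.value)} disabled={isStreaming} />
      <button type="button" onClick={openPopup}>Files / Agents</button>
      <button type="submit" disabled={isStreaming}>Send</button>
    </form>
  );
}
```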
Data Hooks
useCustomGpt()
- Fetch available custom GPT agents
usePrompt()
- Retrieve saved prompts
useBrainDocs()
- Load uploaded documents
Utility Hooks
useIntersectionObserver()
- Infinite scroll for lists
useDebounce()
- Debounce user input
useServerAction()
- Server-side actions in Next.js
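A typical debounced-search use of the utility hook, as a sketch; the (value, delayMs) signature is assumed:

```tsx
import { useEffect, useState } from 'react';

// Assumed signature: returns the value once it has been stable for delayMs.
declare function useDebounce<T>(value: T, delayMs: number): T;

function PromptSearch() {
  const [query, setQuery] = useState('');
  const debouncedQuery = useDebounce(query, 300);

  useEffect(() => {
    if (debouncedQuery) {
      // Fire the search only after typing pauses.
      console.log('searching for', debouncedQuery);
    }
  }, [debouncedQuery]);

  return <input value={query} onChange={(e) => setQuery(e.target.value)} />;
}
```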
Utility Functions
State Management
handleModelSelectionUrl()
- Sync model state with URL
handleProAgentUrlState()
- Sync Pro Agent state with URL
handleNewChatClick()
- Start new chat session
UI Control
blockProAgentAction()
- Disable input during Pro Agent response
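The Pro Agent troubleshooting item below relies on the ordering sketched here: block input first, then start the agent response. Both signatures are assumed:

```typescript
declare function blockProAgentAction(): void;
declare function getAIProAgentChatResponse(payload: object): Promise<unknown>;

// Hypothetical wrapper showing the required ordering.
async function submitProAgentPrompt(payload: object) {
  blockProAgentAction();                      // disable input first
  return getAIProAgentChatResponse(payload);  // then start the agent response
}
```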
API Helpers
commonApi()
- Generic client-side API calls
serverApi()
- Server-specific API wrapper
ValidationError
- Display Yup validation errors
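A sketch of how a generic client-side wrapper like commonApi() is often shaped; the options object here is an assumption, not the real signature:

```typescript
// Illustrative wrapper only; the real commonApi() signature may differ.
interface CommonApiOptions {
  action: string;                              // endpoint or action name (assumed)
  method?: 'GET' | 'POST' | 'PUT' | 'DELETE';
  data?: unknown;                              // request body
}

async function commonApi<T>({ action, method = 'GET', data }: CommonApiOptions): Promise<T> {
  const res = await fetch(`/api/${action}`, {  // assumed base path
    method,
    headers: { 'Content-Type': 'application/json' },
    body: data !== undefined ? JSON.stringify(data) : undefined,
  });
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return res.json() as Promise<T>;
}
```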
UI Components
Lists and Modals
CommonList
- Combined private/shared document list
RenderModalList
- Model selection dropdown
Common Parameters
Standard Parameters
custom_gpt_id
- Agent identifier
companyId
- Company identifier
provider
- LLM provider
code
- Model code
Canvas-specific Parameters
currentMessageId
- Message identifier
startIndex
- Selection start
endIndex
- Selection end
Agent-specific Parameters
agent_extra_info
- Additional agent data
Troubleshooting
Prompt Not Reaching Backend
- Verify handleSubmitPrompt() is triggered
- Ensure enterNewPrompt() runs before submission
- Check handleAIApiType() returns the correct type
- Confirm the socket connection is active
Custom GPT Issues
- Verify getAICustomGPTResponse() uses agent.model_name
- Check the payload structure matches the expected schema
Pro Agent Input Issues
- Ensure blockProAgentAction() is called before getAIProAgentChatResponse()