Story 1.4: Fast Track Mode

Status: done

Story

As a Power User, I want to bypass the interview questions, so that I can generate a post immediately if I already have the insight.

Acceptance Criteria

  1. Fast Track Toggle/Action

    • Given a user is in the chat
    • When they toggle "Fast Track" or press a specific "Just Draft It" button
    • Then the system enters "Fast Track" mode
    • And the UI indicates that the next message will be used for drafting
  2. Skip Probing Phase

    • Given "Fast Track" is active
    • When the user sends their input
    • Then the Teacher Agent skips the probing questions
    • And the input is treated as the final "Insight"
    • And the system automatically transitions to the drafting phase (Epic 2)
  3. Immediate Drafting Trigger

    • Given the user sends a message in Fast Track mode
    • When the message is received
    • Then the system immediately triggers the Ghostwriter Agent (to be implemented in Epic 2)
    • OR (for MVP, if the Ghostwriter isn't ready) shows a "Drafting..." placeholder state
    • And the user receives feedback that drafting has started
  4. Exit Fast Track

    • Given the user is in Fast Track mode
    • When they toggle it off
    • Then normal Teacher Agent flow resumes (intent detection -> probing)

Tasks / Subtasks

  • Implement Fast Track State in ChatStore

    • Add isFastTrackEnabled boolean to store
    • Add action to toggle fast track mode
    • Add logic to addMessage to check for fast track
  • Update Chat Interface

    • Add "Fast Track" toggle or button to UI (e.g., in header or input area)
    • Add visual indicator when Fast Track is active (e.g., distinct input border or badge)
  • Implement Logic Bypass

    • Modify ChatStore addMessage to skip Teacher Agent if fast track is on
    • Trigger "Drafting" placeholder response (Ghostwriter to be implemented in Epic 2)
    • For now (pre-Epic 2): System responds with "Understood. I'll draft a post based on this insight immediately."
  • Test Fast Track Logic

    • Unit test: Fast track toggle in store
    • Integration test: Message flow skips Teacher Agent logic when fast track is on
    • Integration test: Normal flow resumes when toggled off

Dev Notes

Architecture Compliance (CRITICAL)

Logic Sandwich Pattern - DO NOT VIOLATE:

  • UI Components MUST NOT import src/lib/llm or src/services/llm-service.ts directly
  • All Fast Track logic MUST go through ChatService layer
  • ChatService then calls LLMService as needed
  • Components use Zustand store via atomic selectors only
  • Services return plain objects, not Dexie observables

State Management - Atomic Selectors Required:

// BAD - Causes unnecessary re-renders
const { isFastTrack, messages } = useChatStore();

// GOOD - Atomic selectors
const isFastTrack = useChatStore(s => s.isFastTrack);
const toggleFastTrack = useChatStore(s => s.toggleFastTrack);

Local-First Data Boundary:

  • Fast Track mode state should be persisted in IndexedDB for session recovery
  • If user is in Fast Track mode and closes/reopens app, mode should be restored
  • All messages sent in Fast Track mode stored in chatLogs table with metadata
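
A minimal persistence sketch, assuming a Dexie instance is exported as db from a module like @/lib/db and that the sessions table stores the isFastTrackMode field described under Data Schema Considerations below:

import { db } from '@/lib/db'; // assumed module path and export

// Persist the flag immediately on toggle; restore it when the app loads.
export async function saveFastTrackMode(sessionId: string, enabled: boolean): Promise<void> {
  await db.sessions.update(sessionId, { isFastTrackMode: enabled, updatedAt: Date.now() });
}

export async function restoreFastTrackMode(sessionId: string): Promise<boolean> {
  const session = await db.sessions.get(sessionId);
  return session?.isFastTrackMode ?? false;
}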

Architecture Implementation Details

State Management:

  • Add isFastTrackEnabled boolean to ChatStore (use consistent naming)
  • Add toggleFastTrack() action to ChatStore
  • Add logic to sendMessage to check for fast track state
  • Store Fast Track preference in session state for persistence
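
A minimal sketch of the store additions, following the planned names (the shipped store uses isFastTrack instead; see the Dev Agent Record):

import { create } from 'zustand';

// Only the Fast Track slice is shown; the real store also holds
// messages, currentIntent, isProcessing, etc.
interface FastTrackSlice {
  isFastTrackEnabled: boolean;
  toggleFastTrack: () => void;
}

export const useChatStore = create<FastTrackSlice>((set) => ({
  isFastTrackEnabled: false, // off by default
  toggleFastTrack: () =>
    set((state) => ({ isFastTrackEnabled: !state.isFastTrackEnabled })),
}));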

Logic Flow:

  • The ChatService.sendMessage() function is the gatekeeper
  • IF isFastTrack: Skip LLMService.getTeacherResponse(), trigger drafting
  • ELSE: Proceed as normal (intent detection -> Teacher response)
  • Transition to Epic 2: Since Epic 2 (Ghostwriter) is not done, use placeholder response
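
A sketch of the planned gatekeeper branch in ChatService.sendMessage() (the llmService export name and getTeacherResponse signature are assumptions; the shipped code performs this check in the store's addMessage and streams via getTeacherResponseStream, see the Dev Agent Record):

// src/services/chat-service.ts (planned routing, sketch)
import { llmService } from './llm-service'; // assumed export name

const FAST_TRACK_PLACEHOLDER =
  "Understood. I'll draft a post based on this insight immediately.";

export async function sendMessage(content: string, isFastTrack: boolean): Promise<string> {
  if (isFastTrack) {
    // Skip intent detection and probing; the input is treated as the final Insight.
    // Epic 2 replaces this placeholder with a real Ghostwriter call.
    return FAST_TRACK_PLACEHOLDER;
  }

  // Normal flow: intent detection -> Teacher response (Story 1.3).
  return llmService.getTeacherResponse(content);
}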

Files to Modify:

  • src/lib/store/chat-store.ts - Add isFastTrackEnabled, toggleFastTrack action
  • src/services/chat-service.ts - Add Fast Track routing logic
  • src/services/llm-service.ts - Add placeholder for Ghostwriter trigger
  • src/components/features/chat/ChatWindow.tsx - Add Fast Track toggle UI
  • src/components/features/chat/TypingIndicator.tsx - Update text for "Drafting..."

New Files to Create:

  • src/components/features/chat/FastTrackToggle.tsx - Toggle UI component
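
A minimal sketch of the planned component, using the atomic-selector pattern above (the icon library and className values are assumptions; the shipped build folded this toggle into ChatInput instead, see Implementation Variations below):

// src/components/features/chat/FastTrackToggle.tsx (planned; not shipped as a separate file)
'use client';

import { Zap } from 'lucide-react'; // assumed icon library
import { useChatStore } from '@/lib/store/chat-store';

export function FastTrackToggle() {
  // Atomic selectors only -- never destructure the whole store.
  const isFastTrack = useChatStore((s) => s.isFastTrack);
  const toggleFastTrack = useChatStore((s) => s.toggleFastTrack);

  return (
    <button
      type="button"
      onClick={toggleFastTrack}
      aria-pressed={isFastTrack}
      title="Skip straight to draft generation when you already know what you want to say"
      className={isFastTrack ? 'text-amber-500' : 'text-muted-foreground'}
    >
      <Zap className="h-4 w-4" />
      {isFastTrack && <span className="ml-1 text-xs">Fast Track Active</span>}
    </button>
  );
}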

UX Design Specifications

From UX Design Document:

  • Toggle Placement: Options include the header area (less intrusive), above the input field (more discoverable), or a special send-button mode
  • Visual Indicators: When active, show a "Fast Track Active" badge; change the send button icon (e.g., lightning bolt); show a different loading state ("Drafting..." vs "Teacher is typing...")
  • User Education: On first use, show a tooltip or modal explaining Fast Track; help text: "Skip straight to draft generation when you already know what you want to say"

Button Hierarchy:

  • Fast Track toggle should be Secondary (Outline/Ghost) style to not compete with primary Send button
  • Or use an icon-only button with clear tooltip (lightning bolt icon suggested)

Visual Feedback:

  • Keep it subtle but accessible
  • Distinct input border or badge when Fast Track is active
  • For MVP, a header toggle is easiest to implement and test

Testing Requirements

Unit Tests:

  • ChatStore: toggleFastTrack() toggles state correctly
  • ChatStore: isFastTrackEnabled defaults to false
  • ChatService: Routes to Fast Track path when isFastTrackEnabled is true
  • ChatService: Routes to normal path when isFastTrackEnabled is false
  • FastTrackToggle: Component renders and responds to clicks
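
A sketch of the store-level unit tests, assuming Vitest and the planned field name isFastTrackEnabled (the shipped store exposes isFastTrack; adjust accordingly):

import { describe, it, expect, beforeEach } from 'vitest'; // assumed test runner
import { useChatStore } from '@/lib/store/chat-store';

describe('Fast Track toggle', () => {
  beforeEach(() => {
    // Reset just the Fast Track slice between tests.
    useChatStore.setState({ isFastTrackEnabled: false });
  });

  it('defaults to false', () => {
    expect(useChatStore.getState().isFastTrackEnabled).toBe(false);
  });

  it('toggleFastTrack() flips the flag', () => {
    useChatStore.getState().toggleFastTrack();
    expect(useChatStore.getState().isFastTrackEnabled).toBe(true);
    useChatStore.getState().toggleFastTrack();
    expect(useChatStore.getState().isFastTrackEnabled).toBe(false);
  });
});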

Integration Tests:

  • Full flow: User toggles Fast Track -> sends message -> skips Teacher -> shows drafting state
  • Mode switch: User toggles Fast Track during active chat -> context preserved
  • Normal flow resumes when toggled off
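
And a sketch of the full-flow integration test, mocking the LLM service so the test fails if the Teacher path is ever reached (module paths, the llmService export name, and the addMessage signature are assumptions):

import { describe, it, expect, vi, beforeEach } from 'vitest';

// Mock the LLM layer; Fast Track must never reach it.
vi.mock('@/services/llm-service', () => ({
  llmService: { getTeacherResponseStream: vi.fn() },
}));

import { llmService } from '@/services/llm-service';
import { useChatStore } from '@/lib/store/chat-store';

describe('Fast Track full flow', () => {
  beforeEach(() => {
    useChatStore.setState({ messages: [], isFastTrackEnabled: false });
    vi.clearAllMocks();
  });

  it('skips the Teacher and shows the drafting placeholder', async () => {
    useChatStore.getState().toggleFastTrack();
    await useChatStore.getState().addMessage('My insight about hiring.');

    expect(llmService.getTeacherResponseStream).not.toHaveBeenCalled();
    const last = useChatStore.getState().messages.at(-1);
    expect(last?.content).toContain('draft a post based on this insight');
  });
});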

Edge Cases:

  • User toggles Fast Track mid-conversation: Preserve existing chat history
  • User sends message, then toggles Fast Track: No effect on sent messages
  • Fast Track active, but Ghostwriter (Epic 2) not implemented: Show helpful message
  • Offline mode in Fast Track: Queue for Ghostwriter when connection restored (Epic 3)

Previous Story Intelligence (from Story 1.3)

Patterns Established:

  • Logic Sandwich Pattern: UI -> Zustand -> Service -> LLM (strictly enforced)
  • Atomic Selectors: All state access uses useChatStore(s => s.field)
  • Typing Indicator Pattern: isTyping state shows "Teacher is typing..."
  • LLM Service Pattern: LLMService handles all LLM API calls with retry logic
  • Intent Detection: Teacher agent uses IntentDetector for classifying user input

Files from 1.3 (Reference):

  • src/lib/llm/intent-detector.ts - Intent classification (can be skipped in Fast Track)
  • src/lib/llm/prompt-engine.ts - Prompt generation (can be skipped in Fast Track)
  • src/app/api/llm/route.ts - Edge Function proxy (will be used by Ghostwriter in Epic 2)
  • src/services/llm-service.ts - LLM integration with retry/error handling
  • src/lib/store/chat-store.ts - Zustand store with currentIntent, isProcessing state

Learnings to Apply:

  • Fast Track mode state should follow the same pattern as currentIntent in ChatStore
  • Use isProcessing state for "Drafting..." indicator (reuse existing pattern)
  • Fast Track toggle should use Edge-safe state management (no server-side rendering issues)
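
For the indicator copy, a hypothetical simplification of TypingIndicator.tsx showing the isProcessing reuse (not its actual code):

import { useChatStore } from '@/lib/store/chat-store';

export function TypingIndicator() {
  const isProcessing = useChatStore((s) => s.isProcessing);
  const isFastTrack = useChatStore((s) => s.isFastTrack);

  if (!isProcessing) return null;

  // Same indicator, different copy per mode.
  return <p role="status">{isFastTrack ? 'Drafting...' : 'Teacher is typing...'}</p>;
}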

Testing Patterns:

  • Story 1.3 established 98 passing tests
  • Follow same test structure: unit tests for each service, integration tests for full flow
  • Use mocked LLM responses for Fast Track tests (Ghostwriter not yet implemented)

Data Schema Considerations

Dexie schema - chatLogs table may need new field:

interface ChatLog {
  id: string;
  role: 'user' | 'assistant' | 'system';
  content: string;
  timestamp: number;
  intent?: 'venting' | 'insight';  // From Story 1.3
  isFastTrackInput?: boolean;      // NEW: Mark Fast Track inputs
  sessionId: string;
}

Session state should include Fast Track preference:

interface SessionState {
  id: string;
  createdAt: number;
  updatedAt: number;
  isFastTrackMode: boolean;  // NEW: Persist Fast Track preference
  currentIntent?: 'venting' | 'insight';
}
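
Adding these fields also implies a Dexie version bump so existing rows get sane defaults; a sketch using the interfaces above (database name, class name, version number, and index choices are all assumptions):

import Dexie, { type Table } from 'dexie';

class AppDb extends Dexie {
  chatLogs!: Table<ChatLog, string>;
  sessions!: Table<SessionState, string>;

  constructor() {
    super('brachnha-insight'); // assumed database name
    // Unindexed fields (isFastTrackInput, isFastTrackMode) need no schema entry,
    // but bumping the version with an upgrade hook backfills defaults on old rows.
    this.version(3)
      .stores({
        chatLogs: 'id, sessionId, timestamp',
        sessions: 'id',
      })
      .upgrade(async (tx) => {
        await tx.table('chatLogs').toCollection().modify((log) => {
          log.isFastTrackInput ??= false;
        });
        await tx.table('sessions').toCollection().modify((session) => {
          session.isFastTrackMode ??= false;
        });
      });
  }
}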

Performance Requirements

NFR-01 Compliance (Chat Latency):

  • Fast Track mode should be FASTER than normal mode (skips Teacher call)
  • "Drafting..." indicator should appear within 1 second of message send
  • Mode toggle should be instant (no network calls required)

State Persistence:

  • Fast Track mode preference saved to IndexedDB immediately on toggle
  • Mode restoration on app load should be <500ms

References

Architecture Documents:

UX Design Specifications:

Epic Reference:

Previous Story:

Dev Agent Record

Agent Model Used

Claude Opus 4.5 (model ID: 'claude-opus-4-5-20251101')

Debug Log References

Session file: /tmp/claude/-home-maximilienmao-Projects-Test01/e9769bf5-0607-4356-a7cc-0b046e1f56f4/scratchpad

Completion Notes List

Implementation Summary: Story 1.4 (Fast Track Mode) was already implemented; this pass fixed a test compatibility issue between the streaming implementation and the test mocks.

Test Fix Applied (2026-01-22):

  • Fixed src/lib/store/chat-store.test.ts to properly mock getTeacherResponseStream instead of deprecated getTeacherResponse
  • All 101 tests now passing (1 test was failing before the fix)
  • Test now properly simulates streaming callbacks: onIntent, onToken, onComplete

Story Analysis Completed:

  • Extracted story requirements from Epic 1, Story 1.4
  • Analyzed previous story (1.3) for established patterns and learnings
  • Reviewed architecture for compliance requirements (Logic Sandwich, State Management, Local-First)
  • Reviewed UX specification for visual design and interaction patterns
  • Identified all files to create and modify

Key Technical Decisions:

  1. Fast Track mode state: Added isFastTrack to ChatStore with atomic selector pattern
  2. UI toggle: Integrated into ChatInput component with lightning bolt icon
  3. Service routing: Fast Track logic in ChatStore addMessage bypasses LLM streaming
  4. Ghostwriter trigger: Placeholder implementation for Epic 2 integration
  5. Visual indicators: Amber/gold theme when Fast Track is active
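
A sketch of decision 3 as shipped, with the bypass checked inside the store's addMessage before any streaming call; it builds on the minimal slice sketched earlier, and the streaming callback shape plus the llmService export name are assumptions based on the notes above:

import { create } from 'zustand';
import { llmService } from '@/services/llm-service'; // assumed export name

interface Message { role: 'user' | 'assistant'; content: string }

interface ChatState {
  messages: Message[];
  isFastTrack: boolean;
  isProcessing: boolean;
  toggleFastTrack: () => void;
  addMessage: (content: string) => Promise<void>;
}

export const useChatStore = create<ChatState>((set, get) => ({
  messages: [],
  isFastTrack: false,
  isProcessing: false,
  toggleFastTrack: () => set((s) => ({ isFastTrack: !s.isFastTrack })),

  addMessage: async (content) => {
    set((s) => ({ messages: [...s.messages, { role: 'user', content }] }));

    if (get().isFastTrack) {
      // Bypass: no intent detection, no streaming -- respond with the placeholder.
      set((s) => ({
        messages: [
          ...s.messages,
          { role: 'assistant', content: "Understood. I'll draft a post based on this insight immediately." },
        ],
      }));
      return;
    }

    // Normal path (Story 1.3): stream the Teacher response.
    set({ isProcessing: true, messages: [...get().messages, { role: 'assistant', content: '' }] });
    await llmService.getTeacherResponseStream(content, {
      onToken: (token: string) =>
        set((s) => {
          const messages = s.messages.slice();
          const last = messages[messages.length - 1];
          messages[messages.length - 1] = { ...last, content: last.content + token };
          return { messages };
        }),
      onComplete: () => set({ isProcessing: false }),
    });
  },
}));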

Dependencies:

  • No new dependencies required
  • Reuses existing Zustand, Dexie, LLM service infrastructure

Integration Points:

  • Connected to existing ChatStore state management
  • Fast Track bypass in ChatStore addMessage before LLM streaming call
  • Reuses isProcessing state for "Drafting..." indicator
  • Prepares for Epic 2 Ghostwriter integration

Implementation Notes:

  • Fast Track toggle integrated into ChatInput component (not separate FastTrackToggle as planned)
  • Visual feedback: amber/gold accent when active, lightning bolt icon
  • Placeholder response: "Understood. I'll draft a post based on this insight immediately."
  • All 101 tests passing including 3 new Fast Track integration tests

File List

Modified (implementation):

  • src/lib/store/chat-store.ts - Added isFastTrack, toggleFastTrack, Fast Track bypass logic
  • src/components/features/chat/ChatInput.tsx - Integrated Fast Track toggle button with visual indicators
  • src/components/features/chat/ChatWindow.tsx - Passes Fast Track state to ChatInput

Modified (tests):

  • src/lib/store/chat-store.test.ts - Fixed streaming mock compatibility

Created (tests):

  • src/integration/fast-track.test.ts - 3 integration tests for Fast Track mode

Implementation Variations from Plan:

  • Fast Track toggle integrated into ChatInput component (not separate FastTrackToggle.tsx as originally planned)
  • This simplification reduces component count while maintaining all functionality