Pieces 🌟 for Pieces.app

Building a Pieces Copilot TUI - Part 3: Advanced Features & Integration

Note: This tutorial is part of the Pieces CLI project. We welcome contributions! Feel free to open issues, submit PRs, or suggest improvements.

Introduction

Welcome to Part 3! In Part 2, we built the core UI components. Now we'll add the advanced features that turn them into a fully functional Pieces Copilot TUI.

What we built in Part 2:

  • ✅ Message widget (chat_message.py)
  • ✅ Chat panel (chat_panel.py)
  • ✅ Input widget (chat_input.py)
  • ✅ Basic view (chat_view.py - simple version)
  • ✅ Main application (app.py)

Step 1: Create the Conversations List

Let's build src/pieces_copilot_tui/chats_list.py - the sidebar that lists all conversations:

# src/pieces_copilot_tui/chats_list.py
"""Conversations list panel widget."""

from typing import Optional
from textual.widgets import Static, Button
from textual.containers import VerticalScroll, Vertical
from textual.message import Message

from pieces_os_client.wrapper import PiecesClient
from pieces_os_client.wrapper.basic_identifier.chat import BasicChat


class ChatSelected(Message):
    """Message emitted when a chat is selected."""

    def __init__(self, chat: BasicChat):
        super().__init__()
        self.chat = chat


class NewChatRequested(Message):
    """Message emitted when new chat is requested."""
    pass


class ChatItem(Static):
    """Widget representing a single chat in the list."""

    DEFAULT_CSS = """
    ChatItem {
        width: 100%;
        height: auto;
        padding: 1 2;
        margin-bottom: 1;
        background: $surface;
        border-left: solid $primary;
    }

    ChatItem:hover {
        background: $panel;
        border-left: solid $accent;
    }

    ChatItem.active {
        background: $primary 30%;
        border-left: thick $accent;
        text-style: bold;
    }

    ChatItem .chat-title {
        color: $text;
        text-style: bold;
    }

    ChatItem .chat-summary {
        color: $text-muted;
        text-style: italic;
    }
    """

    def __init__(self, chat: BasicChat, **kwargs):
        super().__init__(**kwargs)
        self.chat = chat

    def compose(self):
        """Compose the chat item."""
        title = self.chat.name or "Untitled"
        summary = self.chat.summary or ""
        if len(summary) > 50:
            summary = summary[:47] + "..."

        yield Static(f"💬 {title}", classes="chat-title")
        if summary:
            yield Static(summary, classes="chat-summary")

    def on_click(self) -> None:
        """Handle click event."""
        self.post_message(ChatSelected(self.chat))


class ChatsList(Vertical):
    """Panel to display list of conversations."""

    DEFAULT_CSS = """
    ChatsList {
        width: 25%;
        height: 100%;
        border: solid $primary;
        background: $background;
        padding: 1;
    }

    ChatsList:focus-within {
        border: solid $accent;
    }

    ChatsList Button {
        width: 100%;
        margin-bottom: 1;
    }

    ChatsList VerticalScroll {
        height: 1fr;
    }

    ChatsList .empty-state {
        text-align: center;
        color: $text-muted;
        text-style: italic;
        margin: 2;
    }
    """

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.border_title = "Conversations"
        self.active_chat: Optional[BasicChat] = None
        self._chat_items = {}

    def compose(self):
        """Compose the chats list panel."""
        yield Button("➕ New Chat", id="new-chat-btn")
        yield VerticalScroll(id="chats-container")

    def load_chats(self, pieces_client: PiecesClient):
        """Load chats from the API."""
        try:
            chats = pieces_client.copilot.chats()

            container = self.query_one("#chats-container", VerticalScroll)

            # Clear existing items
            container.remove_children()
            self._chat_items.clear()

            if not chats:
                container.mount(Static("No chats yet...", classes="empty-state"))
            else:
                for chat in chats:
                    chat_item = ChatItem(chat)
                    self._chat_items[chat.id] = chat_item
                    container.mount(chat_item)
        except (ConnectionError, AttributeError) as e:
            container = self.query_one("#chats-container", VerticalScroll)
            container.mount(Static(f"❌ Failed to load chats: {str(e)}", classes="empty-state"))

    def set_active_chat(self, chat: Optional[BasicChat]):
        """Set the active chat and update UI."""
        # Remove active class from all items
        for item in self._chat_items.values():
            item.remove_class("active")

        self.active_chat = chat

        # Add active class to the selected item
        if chat and chat.id in self._chat_items:
            self._chat_items[chat.id].add_class("active")

    def add_new_chat(self, chat: BasicChat):
        """Add a new chat to the list."""
        container = self.query_one("#chats-container", VerticalScroll)

        # Remove empty state if present
        try:
            empty_state = container.query_one(".empty-state")
            empty_state.remove()
        except LookupError:
            # No empty state widget present, which is fine
            pass

        # Add new chat at the top
        chat_item = ChatItem(chat)
        self._chat_items[chat.id] = chat_item

        # Mount at the beginning
        if container.children:
            container.mount(chat_item, before=container.children[0])
        else:
            container.mount(chat_item)

    def on_button_pressed(self, event: Button.Pressed) -> None:
        """Handle button press."""
        if event.button.id == "new-chat-btn":
            self.post_message(NewChatRequested())

Key features:

  • ChatItem - Individual conversation with title and summary
  • ChatSelected - Custom message when chat is clicked
  • NewChatRequested - Message when "New Chat" button is pressed
  • Active chat highlighting
  • Hover effects
  • Error handling for API failures

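Custom messages like ChatSelected work because Textual derives the handler method name from the message class name: CamelCase becomes snake_case with an `on_` prefix, so a parent widget handles `ChatSelected` by defining `on_chat_selected`. Here's a quick pure-Python sketch of that naming rule (the regex is an illustration of the convention, not Textual's exact implementation):

```python
import re

def handler_name(message_class_name: str) -> str:
    """Convert a message class name to Textual's handler method name."""
    # Insert an underscore before each capital letter (except the first), then lowercase
    snake = re.sub(r"(?<!^)(?=[A-Z])", "_", message_class_name).lower()
    return f"on_{snake}"

print(handler_name("ChatSelected"))      # on_chat_selected
print(handler_name("NewChatRequested"))  # on_new_chat_requested
```

This is why our CopilotView later defines `on_chat_selected` and `on_new_chat_requested` without any explicit registration step.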
Step 2: Create the Streaming Handler

Now src/pieces_copilot_tui/streaming_handler.py - connects Pieces OS streaming to our UI:

# src/pieces_copilot_tui/streaming_handler.py
"""Handler for streaming responses from Pieces OS."""

from typing import Callable, Optional

from pieces_os_client.wrapper import PiecesClient
from pieces_os_client.wrapper.basic_identifier.chat import BasicChat


class StreamingHandler:
    """Handles streaming responses from the Pieces Copilot."""

    def __init__(
        self,
        pieces_client: PiecesClient,
        on_thinking_started: Callable[[], None],
        on_stream_started: Callable[[str], None],
        on_stream_chunk: Callable[[str], None],
        on_stream_completed: Callable[[Optional[BasicChat]], None],
        on_stream_error: Callable[[str], None],
    ):
        """
        Initialize the streaming handler.

        Args:
            pieces_client: The Pieces client instance
            on_thinking_started: Callback when thinking starts
            on_stream_started: Callback when streaming starts (with initial text)
            on_stream_chunk: Callback for each chunk (with full accumulated text)
            on_stream_completed: Callback when streaming completes (with optional new chat)
            on_stream_error: Callback when an error occurs (with error message)
        """
        self.pieces_client = pieces_client
        self.on_thinking_started = on_thinking_started
        self.on_stream_started = on_stream_started
        self.on_stream_chunk = on_stream_chunk
        self.on_stream_completed = on_stream_completed
        self.on_stream_error = on_stream_error

        self._current_response = ""
        self._current_status: Optional[str] = None

        # Register callback
        if self.pieces_client.copilot.ask_stream_ws:
            self.pieces_client.copilot.ask_stream_ws.on_message_callback = (
                self._handle_stream_message
            )

    def ask_question(self, query: str) -> None:
        """
        Ask a question to the copilot.

        Args:
            query: The question to ask
        """
        self._current_response = ""
        self._current_status = None
        self.on_thinking_started()
        self.pieces_client.copilot.stream_question(query)

    def _handle_stream_message(self, response) -> None:
        """Handle streaming messages from the copilot."""
        try:
            status = response.status

            if status == "IN-PROGRESS":
                if response.question:
                    answers = response.question.answers.iterable
                    for answer in answers:
                        if answer.text:
                            if not self._current_response:
                                # First chunk - start streaming
                                self._current_response = answer.text
                                self.on_stream_started(self._current_response)
                            else:
                                # Subsequent chunks
                                self._current_response += answer.text
                                self.on_stream_chunk(self._current_response)

            elif status == "COMPLETED":
                # Streaming completed
                new_chat = None
                if response.conversation:
                    new_chat = BasicChat(response.conversation)
                    self.pieces_client.copilot.chat = new_chat

                self.on_stream_completed(new_chat)
                self._current_response = ""

            elif status in ["FAILED", "STOPPED", "CANCELED"]:
                # Handle error
                error_msg = getattr(response, "error_message", "Unknown error")
                self.on_stream_error(error_msg)
                self._current_response = ""

        except (AttributeError, ConnectionError, ValueError) as e:
            self.on_stream_error(str(e))
            self._current_response = ""

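The core of the handler is a small state machine: the first text chunk fires on_stream_started, later chunks fire on_stream_chunk with the full accumulated text, and a terminal status resets the buffer. You can see the accumulation behaviour in isolation with plain callbacks (no Pieces OS required; the sample chunks below are made up for illustration):

```python
class Accumulator:
    """Mimics StreamingHandler's chunk accumulation, minus the API."""

    def __init__(self, on_started, on_chunk):
        self.on_started = on_started
        self.on_chunk = on_chunk
        self._text = ""

    def feed(self, chunk: str) -> None:
        if not self._text:
            self._text = chunk
            self.on_started(self._text)   # first chunk: start streaming
        else:
            self._text += chunk
            self.on_chunk(self._text)     # later chunks: full accumulated text

events = []
acc = Accumulator(
    on_started=lambda t: events.append(("started", t)),
    on_chunk=lambda t: events.append(("chunk", t)),
)
for piece in ["Hello", ", ", "world"]:
    acc.feed(piece)

print(events[-1])  # ('chunk', 'Hello, world')
```

Passing the *accumulated* text to the UI (rather than just the new chunk) keeps the widget update idempotent: the streaming message is always replaced wholesale, so a dropped or re-ordered callback can't corrupt the display.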
Step 3: Create the Full View Orchestrator

Now the complete src/pieces_copilot_tui/chat_view.py, which replaces the simple version from Part 2:

# src/pieces_copilot_tui/chat_view.py
"""Main copilot view combining all widgets."""

from typing import Optional
from textual.screen import Screen
from textual.containers import Horizontal, Vertical
from textual.binding import Binding
from textual.widgets import Footer

from pieces_os_client.wrapper import PiecesClient
from pieces_os_client.wrapper.basic_identifier.chat import BasicChat

from .chat_panel import ChatPanel
from .chats_list import ChatsList, ChatSelected, NewChatRequested
from .chat_input import ChatInput, MessageSubmitted
from .streaming_handler import StreamingHandler


class CopilotView(Screen):
    """Main copilot view screen."""

    BINDINGS = [
        Binding("ctrl+n", "new_chat", "New Chat"),
        Binding("ctrl+r", "rename_chat", "Rename"),
        Binding("ctrl+d", "delete_chat", "Delete"),
        Binding("ctrl+l", "toggle_ltm", "Toggle LTM"),
        Binding("ctrl+q", "quit", "Quit"),
    ]

    DEFAULT_CSS = """
    CopilotView {
        layout: vertical;
    }

    CopilotView Horizontal {
        height: 1fr;
    }

    CopilotView Vertical.main-content {
        width: 75%;
        layout: vertical;
    }
    """

    def __init__(self, pieces_client: PiecesClient, **kwargs):
        super().__init__(**kwargs)
        self.pieces_client = pieces_client
        self.current_chat: Optional[BasicChat] = None
        self.chat_panel: Optional[ChatPanel] = None
        self.chats_list: Optional[ChatsList] = None
        self.chat_input: Optional[ChatInput] = None
        self._ltm_enabled = False
        self.streaming_handler: Optional[StreamingHandler] = None

    def compose(self):
        """Compose the view."""
        with Horizontal():
            # Left sidebar - conversations list
            self.chats_list = ChatsList()
            yield self.chats_list

            # Right side - chat panel and input
            with Vertical(classes="main-content"):
                self.chat_panel = ChatPanel()
                yield self.chat_panel

                self.chat_input = ChatInput()
                yield self.chat_input

        yield Footer()

    def on_mount(self) -> None:
        """Initialize the view."""
        # Load chats
        self.chats_list.load_chats(self.pieces_client)

        # Show welcome message
        self.chat_panel.show_welcome()

        # Check LTM status
        try:
            self._ltm_enabled = self.pieces_client.copilot.is_chat_ltm_enabled()
        except (ConnectionError, AttributeError):
            self._ltm_enabled = False

        # Setup streaming handler
        self.streaming_handler = StreamingHandler(
            pieces_client=self.pieces_client,
            on_thinking_started=self._on_thinking_started,
            on_stream_started=self._on_stream_started,
            on_stream_chunk=self._on_stream_chunk,
            on_stream_completed=self._on_stream_completed,
            on_stream_error=self._on_stream_error,
        )

    def on_chat_selected(self, message: ChatSelected) -> None:
        """Handle chat selection."""
        self.current_chat = message.chat
        self.pieces_client.copilot.chat = message.chat
        self.chats_list.set_active_chat(message.chat)

        # Load conversation
        self.chat_panel.clear_messages()
        self.chat_panel.border_title = f"Chat: {message.chat.name}"

        try:
            messages = message.chat.messages()
            for msg in messages:
                role = msg.role
                content = msg.raw_content
                self.chat_panel.add_message(role, content)
        except (ConnectionError, AttributeError, ValueError) as e:
            self.chat_panel.add_message("system", f"❌ Error loading messages: {str(e)}")

    def on_new_chat_requested(self, _: NewChatRequested) -> None:
        """Handle new chat request."""
        self.action_new_chat()

    def on_message_submitted(self, message: MessageSubmitted) -> None:
        """Handle user message submission."""
        if self.chat_panel.is_streaming_active():
            return

        # Add user message to chat panel
        self.chat_panel.add_message("user", message.text)

        # Send to copilot via streaming handler
        if self.streaming_handler:
            self.streaming_handler.ask_question(message.text)

    # Streaming handler callbacks
    def _on_thinking_started(self) -> None:
        """Called when copilot starts thinking."""
        self.call_from_thread(self.chat_panel.add_thinking_indicator)

    def _on_stream_started(self, initial_text: str) -> None:
        """Called when streaming starts with initial text."""
        self.call_from_thread(
            self.chat_panel.add_streaming_message,
            "assistant",
            initial_text
        )

    def _on_stream_chunk(self, full_text: str) -> None:
        """Called for each streaming chunk with accumulated text."""
        self.call_from_thread(
            self.chat_panel.update_streaming_message,
            full_text
        )

    def _on_stream_completed(self, new_chat: Optional[BasicChat]) -> None:
        """Called when streaming completes."""
        self.call_from_thread(self.chat_panel.finalize_streaming_message)

        # Update chat reference if new chat was created
        if new_chat:
            old_chat = self.current_chat
            if not old_chat or old_chat.id != new_chat.id:
                self.current_chat = new_chat
                self.call_from_thread(self.chats_list.add_new_chat, new_chat)
                self.call_from_thread(self.chats_list.set_active_chat, new_chat)
                self.call_from_thread(
                    setattr, self.chat_panel, "border_title", f"Chat: {new_chat.name}"
                )

    def _on_stream_error(self, error_msg: str) -> None:
        """Called when a streaming error occurs."""
        self.call_from_thread(
            self.chat_panel.add_message,
            "system",
            f"❌ Error: {error_msg}"
        )

    def action_new_chat(self):
        """Create a new chat."""
        self.current_chat = None
        self.pieces_client.copilot.chat = None
        self.chats_list.set_active_chat(None)
        self.chat_panel.clear_messages()
        self.chat_panel.border_title = "Chat: New Conversation"
        self.chat_panel.show_welcome()
        self.chat_input.focus()

    def action_rename_chat(self):
        """Rename the current chat."""
        if not self.current_chat:
            self.notify("No chat selected", severity="warning")
            return

        # For now, just show a notification - you can implement a dialog later
        self.notify("Rename chat - Not implemented yet", severity="information")

    def action_delete_chat(self):
        """Delete the current chat."""
        if not self.current_chat:
            self.notify("No chat selected", severity="warning")
            return

        try:
            chat_name = self.current_chat.name
            self.current_chat.delete()
            self.notify(f"Deleted chat: {chat_name}", severity="information")

            # Clear view and reload chats
            self.action_new_chat()
            self.chats_list.load_chats(self.pieces_client)

        except (ConnectionError, AttributeError) as e:
            self.notify(f"Error deleting chat: {str(e)}", severity="error")

    def action_toggle_ltm(self):
        """Toggle LTM (Long Term Memory)."""
        try:
            is_chat_ltm_enabled = self.pieces_client.copilot.is_chat_ltm_enabled()

            if is_chat_ltm_enabled:
                self.pieces_client.copilot.deactivate_ltm()
                self.notify("🧠 Chat LTM disabled", severity="information")
                self._ltm_enabled = False
            else:
                is_system_ltm_running = self.pieces_client.copilot.is_ltm_running()

                if is_system_ltm_running:
                    self.pieces_client.copilot.activate_ltm()
                    self.notify("🧠 Chat LTM enabled", severity="information")
                    self._ltm_enabled = True
                else:
                    self.notify("🧠 LTM system not running. Please enable it first.", severity="warning")

        except (ConnectionError, AttributeError) as e:
            self.notify(f"❌ Error toggling LTM: {str(e)}", severity="error")

🚨 Critical for Thread Safety

Always use call_from_thread() when updating UI from background threads. Direct UI updates from threads will cause crashes! The streaming handler runs in a background thread, so all UI updates must go through this method.
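Under the hood, call_from_thread() hands the callable over to the event loop's own thread instead of invoking it directly. A minimal pure-Python sketch of the same idea, using a queue drained by the "UI" thread (this illustrates the pattern, not Textual's implementation):

```python
import queue
import threading

ui_calls: queue.Queue = queue.Queue()

def call_from_thread(fn, *args):
    """Enqueue a call for the UI thread instead of running it here."""
    ui_calls.put((fn, args))

messages = []

def worker():
    # Background thread: never touches `messages` directly
    call_from_thread(messages.append, "chunk 1")
    call_from_thread(messages.append, "chunk 2")

t = threading.Thread(target=worker)
t.start()
t.join()

# The "UI" thread drains the queue and performs the updates itself
while not ui_calls.empty():
    fn, args = ui_calls.get()
    fn(*args)

print(messages)  # ['chunk 1', 'chunk 2']
```

Because every mutation happens on one thread, there is no lock to forget and no race on the widget tree, which is exactly the guarantee Textual's event loop gives you.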

Key features:

  • Split-pane layout (25% sidebar, 75% chat)
  • Handles all custom messages (ChatSelected, NewChatRequested, MessageSubmitted)
  • Thread-safe UI updates via call_from_thread()
  • Keyboard shortcuts (Ctrl+N, Ctrl+D, Ctrl+L, Ctrl+Q)
  • Conversation management
  • LTM toggle support

Step 4: Update the Main Application

Now update src/pieces_copilot_tui/app.py to use the full CopilotView:

# src/pieces_copilot_tui/app.py
"""Pieces Copilot TUI - A Terminal User Interface for Pieces OS."""

from textual.app import App

from pieces_os_client.wrapper import PiecesClient
from .chat_view import CopilotView


class PiecesCopilotTUI(App):
    """Pieces Copilot TUI with full features."""

    # CSS styles for the entire app
    DEFAULT_CSS = """
    Screen {
        background: $background;
        color: $text;
    }

    .error {
        color: $error;
        text-style: bold;
    }

    .success {
        color: $success;
        text-style: bold;
    }

    .warning {
        color: $warning;
        text-style: bold;
    }
    """

    def __init__(self, pieces_client: PiecesClient = None, **kwargs):
        super().__init__(**kwargs)
        self.pieces_client = pieces_client or PiecesClient()
        self.copilot_view = None

    def on_mount(self) -> None:
        """Initialize the application."""
        self.title = "Pieces Copilot TUI"

        # Initialize Pieces client
        if not self.pieces_client.connect_websocket():
            self.notify("Failed to connect to Pieces OS", severity="error")
            self.exit()
            return

        # Create and push the full copilot view
        self.copilot_view = CopilotView(pieces_client=self.pieces_client)
        self.push_screen(self.copilot_view)


def run_tui():
    """Run the Pieces Copilot TUI application."""
    # Initialize Pieces client
    pieces_client = PiecesClient()

    # Check if Pieces OS is running
    if not pieces_client.is_pieces_running():
        print("Error: Pieces OS is not running. Please start Pieces OS first.")
        return

    app = PiecesCopilotTUI(pieces_client=pieces_client)
    app.run()


if __name__ == "__main__":
    run_tui()

Changes from Part 2:

  • Import CopilotView instead of SimpleChatView
  • Added warning CSS class
  • Now using the full-featured view

Step 5: Run Your Complete TUI

The __main__.py file is already set up from Part 2. Run it the Pythonic way:

# Navigate to src directory
cd src

# Run the module
python -m pieces_copilot_tui

That's it! Simple, clean, and Pythonic. ✨

🎉 You now have a fully functional Pieces Copilot TUI!


Add Debug Logging

# In any widget method
self.log(f"Current state: {some_variable}")
self.log.info("Info message")
self.log.warning("Warning message")
self.log.error("Error message")

# To view these logs, run `textual console` in one terminal,
# then launch the app with `textual run --dev` in another so the
# log output appears in the console.

Common Issues

Problem: Streaming doesn't start

# Check WebSocket connection
if not client.copilot.ask_stream_ws:
    print("❌ WebSocket not connected!")
    client.connect_websocket()

Problem: UI not updating from streaming

# Make sure you're using call_from_thread
self.call_from_thread(self.chat_panel.add_message, "assistant", text)
# NOT: self.chat_panel.add_message("assistant", text)

Problem: Messages not loading

# Add error handling
try:
    messages = chat.messages()
except Exception as e:
    self.log.error(f"Failed to load messages: {e}")
