# Best Practices

Source: https://alexcode.ai/docs/chat/best-practices

Guidelines for effective code assistance and documentation usage in Alex Sidebar.

## Writing Effective Prompts

* Be specific about your goals
* State what you want to achieve
* Provide clear success criteria
* Share relevant code snippets
* Include error messages
* Mention project requirements

## Using Think First Mode

Think First Mode is best suited for:

* Complex architectural decisions
* Bug investigation
* Performance optimization
* Security considerations

It is usually unnecessary for:

* Quick syntax questions
* Simple code completions
* Documentation lookups
* Basic refactoring

### Tips for Think First Mode

* Allow extra time for the dual-model processing
* Provide detailed context for better analysis
* Use it for mission-critical code changes
* Consider disabling it during rapid prototyping phases

### Examples

❌ **Ineffective**: `My code isn't working. Can you help?`

✅ **Effective**: `I'm getting a 'Thread 1: Fatal error: Unexpectedly found nil' when trying to unwrap an optional UIImage in my custom UICollectionViewCell. Here's my cellForItemAt implementation...`

✅ **Feature Request**: `I need to implement a custom tab bar in SwiftUI that shows a circular progress indicator around the selected tab icon. The progress should be animated. Here's my current TabView implementation...`

## Managing Context

* Use @ Files to add specific files
  * Include related dependencies
  * Share configuration files
* Use @ Codebase for framework-level questions
  * Reference specific components
  * Share relevant modules

### Examples

❌ **Limited**: `How do I update this delegate method?`

✅ **Complete**: `I need to update this UITableViewDelegate method to handle custom swipe actions. Here's my current implementation (@Files TableViewController.swift) and the custom SwipeActionView (@Files Views/SwipeActionView.swift) I want to integrate.`

✅ **Framework**: `I'm building a custom networking layer (@Codebase Networking/*). Can you help me implement proper retry logic with exponential backoff?`

## Using Documentation

* Use @ Apple Docs for framework reference
  * Reference specific APIs
  * Include version information
* Use @ Apple Docs (Individual) for specific methods
  * Reference specific classes
  * Include parameter details

### Examples

❌ **Vague**: `How do I use Core Data?`

✅ **Specific**: `I need help implementing NSFetchedResultsController (@Apple Docs NSFetchedResultsController) with multiple sections based on dates. Here's my current Core Data model (@Files Model.xcdatamodeld)...`

✅ **API Reference**: `Can you explain how to use URLSession's (@Apple Docs URLSession) background download tasks with proper delegate handling (@Apple Docs URLSessionDownloadDelegate)?`

## Platform-Specific Best Practices

* Use Swift-specific terminology
* Reference Swift documentation
* Include your Swift version
* Reference UI frameworks
* Include the view hierarchy
* Share layout constraints

### Examples

❌ **Ambiguous**: `How do I create a button?`

✅ **SwiftUI**: `I'm using SwiftUI (iOS 16+) and need to create a custom button with a gradient background, dynamic shadow, and haptic feedback. Here's my current Button implementation...`

✅ **UIKit Integration**: `I need to embed this SwiftUI view (@Files CustomView.swift) into my existing UIKit navigation stack. Here's my current UIHostingController setup...`
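For reference, the sketch below shows the kind of setup such a prompt describes. The `CustomView` name comes from the example; everything else is an assumption about a typical UIHostingController embedding, not a prescribed implementation:

```swift
import SwiftUI
import UIKit

// Hypothetical SwiftUI view standing in for the CustomView.swift from the prompt.
struct CustomView: View {
    var body: some View {
        Text("Hello from SwiftUI")
            .padding()
    }
}

// Push the SwiftUI view onto an existing UIKit navigation stack
// by wrapping it in a UIHostingController.
func showCustomView(from navigationController: UINavigationController) {
    let hostingController = UIHostingController(rootView: CustomView())
    hostingController.title = "Custom View"
    navigationController.pushViewController(hostingController, animated: true)
}
```

Sharing a snippet like this alongside the prompt gives Alex the concrete integration point to work with.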
## Common Scenarios

* Share error messages
* Include stack traces
* Reference relevant code
* Explain your current structure
* Describe the desired outcome
* Share relevant files

### Examples

❌ **Unclear**: `The app crashes sometimes.`

✅ **Detailed**: `The app crashes when switching between tabs while a network request is in progress. Here's the crash log and relevant networking code (@Files NetworkManager.swift). The issue started after implementing async/await...`

✅ **Refactoring**: `I want to refactor this massive view controller (@Files ProfileViewController.swift) into smaller components using MVVM. Here's my planned architecture diagram...`

❌ **Too Much**: `Sharing entire project files for a simple UI fix`

✅ **Just Right**: `Continuing from our previous chat about the networking layer (see chat history), I need to add request caching. Here's the specific RequestCaching protocol I want to implement...`

✅ **Breaking Down**: `I need to migrate this UIKit project to SwiftUI. Let's break it down: 1. First, let's handle the navigation structure 2. Then, convert each view controller individually 3. Finally, implement the data flow with @StateObject and ObservableObject`

## Working with Build Errors

* Use it for automatic error resolution
* Alex handles the entire build-fix cycle using Xcode's build system
* No manual intervention is needed
* It continues until the build succeeds

### When to Use Automatic Build & Fix

It works well for:

* Multiple compilation errors
* Missing imports or protocols
* Type mismatches
* Initialization errors
* Access control issues

It is less suited to:

* Complex architectural decisions
* Business logic errors
* Performance optimizations
* Custom framework integration

### Examples

✅ **Automatic Fix**: `Click "Build & Fix Errors" when you see multiple red errors in Xcode. Alex will handle missing imports, protocol conformance, and type issues automatically one by one until the build succeeds.`

### Build Error Best Practices

1. **Let Alex Work**: Do not interrupt the build-fix loop unless necessary
2. **Review Changes**: Always review the final working code
3. **Save Progress**: Use checkpoints after successful builds
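As a hypothetical illustration of the kinds of errors listed above that the loop handles well, consider a missing protocol requirement, which has a single mechanical fix (all names here are invented for the example):

```swift
import Foundation

// Before: fails to compile with
// "Type 'TodoItem' does not conform to protocol 'Identifiable'"
//
//     struct TodoItem: Identifiable {
//         var title: String
//     }
//
// After the automatic fix, the missing requirement is filled in:
struct TodoItem: Identifiable {
    let id = UUID()   // added to satisfy Identifiable
    var title: String
}
```

Errors with one obvious fix like this are what the loop resolves on its own; architectural and business-logic problems still need your judgment.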
## Community Resources

Need more help? Join the [Discord community](https://discord.gg/T5zxfReEnd) for support and tips from other developers.

# Agents

Source: https://alexcode.ai/docs/chat/context/agents

AI agents specialized for development tasks.

Alex Sidebar's agents are specialized AI assistants, each trained for specific development tasks. Alex automatically selects the right agent for your workflow and handles your development tasks autonomously.

Agent mode is now always on! Claude Sonnet 4 is the recommended model for its superior code generation capabilities, but you can also use other supported models, including:

* Claude 3.5 Sonnet
* Gemini 2.5 Pro
* Gemini 2.5 Flash
* OpenAI o3
* OpenAI o4 Mini
* OpenAI GPT 4.1
* DeepSeek R1
* DeepSeek V3 (03.24)

Choose your preferred model from the model selector while keeping all agent capabilities.

## Quick Actions

* One-click automatic build, error detection, and fix application. Alex continuously rebuilds until your project compiles successfully.
* Let agents handle repetitive tasks while you focus on core development work.
* Switch between different AI models while maintaining agent capabilities and context.

## Understanding Agents

Agents work best by:

* Learning your project structure and patterns
* Maintaining contextual understanding across sessions
* **Automatically building and running your app after changes**
* **Detecting and fixing compilation errors in a continuous loop**
* **Taking screenshots for verification and debugging**

You can make agents more effective over time by providing more notes about your project and coding preferences. Learn more about project notes [here](/chat/context/memory).

## Getting Started

Press **Command + Shift + A** to toggle auto-apply for code changes. When enabled, code suggestions are automatically applied to your files.

Use voice or text to describe your project:

```
"This is an iOS app using SwiftUI and MVVM architecture. The main features include user authentication and data persistence."
```

The agent will:

* Analyze the project structure
* Study coding patterns
* Build contextual understanding
* Use regex search to find relevant code patterns

## Automatic Build & Run

Alex Sidebar's agent automatically builds and runs your project after making changes, creating a seamless development experience. While the agent handles most errors automatically, always review the final changes to ensure they align with your project's requirements.

## Best Practices

* Provide specific requirements and context for best results
* Break complex tasks into smaller, manageable steps
* Always review and test agent-suggested modifications
* Keep project notes updated for better agent performance

# Commands

Source: https://alexcode.ai/docs/chat/context/commands

Available commands and shortcuts in Alex Sidebar's chat interface.

## Available Commands

* **@ Files**: Access and reference files in your project directly from the chat interface.
* **@ Codebase**: Search and reference your entire codebase context during chat conversations.
* **@ Apple Docs**: Search and reference official Apple documentation without leaving the chat.
* **@ Apple Docs (Individual)**: Access specific Apple documentation entries for targeted reference.

## Using Commands

Type `@` or click the + button in the chat interface to see available commands. Use the search bar at the top to filter available commands and find what you need quickly. Commands help you efficiently access resources and context without leaving your chat workflow.

## Files Command

Access your project files directly within the chat interface. Browse, search, and reference specific files during your conversations.

## Codebase Command

Search through your entire codebase context, find specific implementations, and reference code snippets in your discussions.

## Apple Documentation Command

Search and browse through the complete Apple documentation library without switching contexts or leaving your chat.

## Individual Apple Documentation Command

Access and reference specific documentation entries, methods, or APIs for technical discussions.

# Project Memory

Source: https://alexcode.ai/docs/chat/context/memory

Remember context across conversations with Alex.

Project Memory enables Alex to remember context across conversations when you use phrases like "remember this" or "keep this in mind". This helps maintain continuity and provides more contextually relevant responses over time.
## Understanding Project Memory

When enabled, you can ask Alex to remember context across conversations by using phrases like:

* "Remember this"
* "Keep this in mind"
* "Remember that..."
* "Take note of..."

This allows Alex to:

* Maintain important context between chat sessions
* Remember project-specific requirements and patterns
* Provide more consistent and personalized responses
* Reference previously discussed solutions

## Managing Project Memory

### Accessing Memory Settings

To manage your project memory:

1. Open Settings (gear icon)
2. Navigate to **Tools & Features** → **Project Memory**
3. Toggle **Enable Memory** on/off using the switch

### Searching Memories

Once memories are saved, you can:

* Use the search bar to find specific memories
* View all stored memories in the list
* Delete individual memories as needed

### Creating Memories

Simply tell Alex what to remember during any conversation:

* "Remember that our app uses SwiftUI and MVVM architecture"
* "Keep in mind that we're targeting iOS 17+"
* "Note that all API calls should use async/await"

## Privacy and Security

All project memory data is:

* Stored locally on your device
* Never shared with external services
* Fully under your control

## Best Practices

### What to Remember

Use project memory for:

* **Project architecture**: "Remember we're using MVVM with Combine"
* **Coding standards**: "Keep in mind we use 2-space indentation"
* **API details**: "Remember our API base URL is api.example.com"
* **Team preferences**: "Note that we prefer guard statements over if-let"
* **Dependencies**: "Remember we're using Firebase for authentication"

### Memory Management Tips

1. **Be specific**: Clear, specific memories are more useful than vague ones
2. **Update regularly**: Remove outdated memories to keep context relevant
3. **Use search**: Quickly find memories using the search feature
4. **Review periodically**: Check your stored memories to ensure they're still accurate

Project Memory is especially useful for long-term projects where maintaining consistent context across multiple coding sessions is important.

# Image to Code

Source: https://alexcode.ai/docs/chat/input-modes/image-to-code

Transform designs into code by dragging images into Alex Sidebar.

The Image-to-Code feature allows you to quickly convert design mockups, screenshots, or UI elements into code. You can input images through multiple methods:

## Input Methods

Use **Command + Shift + 6** to:

* Take window screenshots
* Capture specific UI selections
* Perfect for quick UI component captures

Drag images directly into the chat interface from:

* Finder
* Browser
* Design tools

## Screenshot Tool (⌘ + ⇧ + 6)

The built-in screenshot tool provides a quick way to capture UI elements and convert them to code. You can customize its behavior in the settings to match your workflow.

### Capture Options

Capture entire windows with a single click. Perfect for:

* Full screen interfaces
* Complete view hierarchies
* Dialog boxes and alerts

Select specific areas to capture. Ideal for:

* Individual UI components
* Specific sections of an interface
* Custom-sized regions

### Settings Customization

![Screenshot settings showing capture options](https://mintlify.s3.us-west-1.amazonaws.com/alex/chat/images/screenshot-settings.png)

You can customize the screenshot tool in three ways:
#### 1. Quick Access Menu

When you click the screenshot button or use ⌘ + ⇧ + 6, a dropdown menu appears with three options:

* **Capture window**: Take a screenshot of the entire window
* **Capture selection**: Draw a selection box around the desired area
* **Attach file**: Choose a file from your system instead

#### 2. Default Behavior Settings

You can set your preferred default screenshot behavior in Settings:

1. Open Settings
2. Navigate to "Chat Settings"
3. Look for "Default Screenshot Behavior" under the chat options
4. Choose between:
   * Capture window (automatically capture the entire window)
   * Capture selection (start with the selection tool)
   * Show options (always show the dropdown menu)

#### 3. Customize Keyboard Shortcut

You can change the default ⌘ + ⇧ + 6 shortcut:

1. Open Settings
2. Go to "Chat Settings"
3. Find "Take Screenshot" in the list
4. Click on the current shortcut (⌘ + ⇧ + 6) to change it
5. Press your desired key combination

Even with a default behavior set, you can always access other capture modes through the dropdown menu or by using your configured keyboard shortcut.

### Best Practices for Screenshots

1. **Clean Captures**
   * Close unnecessary windows or tabs
   * Hide sensitive information
   * Ensure the UI is in its final state
2. **Component Focus**
   * Zoom in for small components
   * Include padding for context
   * Capture in the correct state (hover, active, etc.)

## Getting Started

Supported formats:

* PNG
* JPEG
* Screenshots directly from Xcode/Figma

For best results, ensure your images clearly show the UI elements you want to convert.

Choose your preferred method:

1. Use **Command + Shift + 6** to take a screenshot
2. Drag and drop images into the chat
3. Copy-paste images directly

An image-capable model will analyze the images and generate the corresponding code. You can:

* Copy the code directly
* Request modifications
* Ask for explanations of specific parts

## Best Practices

### Image Preparation

* Use high-resolution images for better accuracy
* Crop to include only relevant UI elements
* Ensure good contrast between elements
* Mention any specific styling details you want captured in your prompt

### Code Generation

* Start with simple components before complex layouts
* Review generated code for customization needs
* Use follow-up questions to refine the output
* Consider breaking complex UIs into smaller pieces

For complex designs, try generating code for individual components first, then combine them into the final layout.

## Example Workflows

### Basic UI Component

1. Take a screenshot of a button or card design
2. Drag the image into Alex Sidebar

```swift
// Generated code example
import SwiftUI

struct CustomButton: View {
    var body: some View {
        Button(action: {}) {
            Text("Get Started")
                .font(.headline)
                .foregroundColor(.white)
                .padding(.horizontal, 24)
                .padding(.vertical, 12)
                .background(Color.blue)
                .cornerRadius(8)
        }
    }
}
```

### Complex Layout

1. Identify the main components in your layout. You can:
   * Split your design into logical sections
   * Take screenshots of individual components
   * Prepare multiple images for different parts
2. You have two options:
   * Drag multiple component images at once to generate all parts simultaneously
   * Generate code for each major section individually
3. Ask Alex to help combine the components into a cohesive layout. You can:
   * Request adjustments to match the overall design
   * Fine-tune spacing and alignment
   * Add container views and navigation elements
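As a minimal sketch of that final composition step (assuming iOS 16+, the `CustomButton` generated above, and a hypothetical second component named `ProfileCard`), the individual pieces can be combined in a container view:

```swift
import SwiftUI

// Hypothetical second component, generated from a separate screenshot.
struct ProfileCard: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text("Jane Appleseed")
                .font(.title3.bold())
            Text("iOS Developer")
                .foregroundColor(.secondary)
        }
        .padding()
        .frame(maxWidth: .infinity, alignment: .leading)
        .background(Color.gray.opacity(0.15))
        .cornerRadius(12)
    }
}

// Container view combining the generated pieces into one screen.
struct ProfileScreen: View {
    var body: some View {
        NavigationStack {
            VStack(spacing: 24) {
                ProfileCard()
                CustomButton() // from the "Basic UI Component" workflow above
                Spacer()
            }
            .padding()
            .navigationTitle("Profile")
        }
    }
}
```

From here, follow-up prompts can fine-tune spacing, alignment, and navigation details to match the original design.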
## Troubleshooting

### Common Issues

* Ensure image quality is high
* Try cropping closer to the component
* Use screenshots rather than photos
* Provide additional context in your prompt
* Specify exact colors if known
* Include style guide references
* Ask for specific modifications
* Use follow-up questions for refinement
* Break complex layouts into sections
* Specify constraints explicitly
* Ask for alternative layout approaches
* Provide reference screenshots

## Tips for Better Results

1. **Start Simple**: Begin with basic components before attempting complex layouts
2. **Iterate**: Use follow-up questions to refine the generated code
3. **Combine Methods**: Use both image and text descriptions for better results
4. **Review Output**: Always review and test generated code in your project

Remember that Image-to-Code is a starting point. You can always ask Alex to modify the generated code to better match your needs.

# Voice Mode

Source: https://alexcode.ai/docs/chat/input-modes/voice-mode

Use voice commands to interact with Alex Sidebar more naturally.

## Overview

Voice Mode allows you to interact with Alex Sidebar using speech instead of typing, making code discussions more natural and efficient. This feature is particularly useful when you need to explain complex problems or want to reduce typing fatigue.

## Quick Actions

* Press **Command + Shift + V** to toggle voice recording on/off. Use this shortcut to quickly start or stop voice input without clicking.
* Click the microphone icon in the chat input area to start/stop recording.

## Using Voice Mode

To enable Voice Mode:

1. Open Settings (gear icon)
2. Navigate to Chat Settings
3. Enable Voice Mode

Voice Mode requires microphone permissions. You will be prompted to grant access when first enabling this feature.

Two ways to start recording:

* Press **Command + Shift + V**
* Click the microphone icon

Speak clearly into your microphone. Use any of these methods to stop:

* Press **Command + Shift + V** again
* Click the microphone icon
* Press the **Return/Enter** key
* Press the **Escape** key to cancel recording

Your speech will be automatically transcribed and inserted into the chat input box. Use the Escape key if you want to cancel the recording without transcribing the audio.

## Auto Mode

Enable "Auto Mode" in settings to automatically send transcribed messages:

1. Finish speaking and stop recording
2. Press Return/Enter
3. The message sends automatically and starts AI inference

## Best Practices

* Speak at a natural pace
* Enunciate clearly
* Avoid background noise
* Use technical terms carefully
* Use voice for longer explanations
* Review the transcription before sending
* Combine voice input with code selection

## Accessibility Benefits

Voice Mode makes Alex more accessible and easier to use for everyone.

* When you are coding for long periods, **Voice Mode** lets you take breaks from typing while staying productive.
* For developers with mobility challenges or strain injuries, voice input provides a comfortable way to interact with Alex. You can dictate code explanations and questions instead.
* Having voice as an additional input option means you can choose what works best for you in different situations. Sometimes speaking is simply more convenient than typing, and when you need to explain complex concepts or walk through detailed logic, saying it out loud often feels more natural and fluid than typing it all out.

# Overview

Source: https://alexcode.ai/docs/chat/overview

Learn how to use Alex Sidebar's chat for code assistance.

Alex Sidebar's chat runs alongside Xcode and enables code discussions and improvements as you work. By putting your questions into prompts, you can speed up your coding process and resolve issues more quickly.

## Quick Actions

* Click **Build & Fix Errors** in the main view to automatically build your project and fix all compilation errors in a continuous loop until the build succeeds.
* Select code in Xcode to start a **new chat**. The selected content is automatically added as a reference.
* Add selected code to your existing chat without opening a new window. Perfect for building context incrementally.
* Start a chat with the entire codebase as context. Useful for high-level questions about your project.
* Access documentation, add more files, and more using the @ menu in your chat.
* Copy-paste or drag images directly into the chat for design analysis and code generation. Perfect for UI discussions and visual debugging.

## Copy Request Button

The Copy Request button is perfect for using the most powerful models like o3 Pro or models with massive context windows like Gemini 2.5 Pro (1M+ tokens).