
Sukhpinder Singh for C# Programming

Mastering Collections in C#

Table of Contents

Chapter 1: Why Collections Matter in C# (Beyond Arrays)

Chapter 2: Lists: Your Go-To Dynamic Collection

Chapter 3: Dictionaries: Key-Value Powerhouses

Chapter 4: Queues: First In, First Out Processing

Chapter 5: Stacks: Last In, First Out Operations

Chapter 6: HashSet & SortedSet: Unique Collections

Chapter 7: LinkedList: When Order and Insertion Matter

Chapter 8: ObservableCollection & Modern .NET Collections

Chapter 9: LINQ with Collections: Querying Made Easy

Chapter 10: Best Practices, Performance, and Real-World Use Cases


Chapter 1: Why Collections Matter in C# (Beyond Arrays)

Picture this: you're building a social media feed that needs to handle thousands of posts, a gaming leaderboard that constantly updates, or a fintech application processing real-time transactions. Arrays might seem like the obvious choice, but they'll quickly become your bottleneck.

As developers, we often start with arrays because they're simple and familiar. However, real-world applications demand more flexibility, better performance, and cleaner code. Collections in C# provide the tools to build scalable, maintainable applications that can grow with your user base.

The Array Limitation Problem

Arrays in C# are fixed-size data structures. Once you declare an array with a specific length, that's it—no growing, no shrinking. Here's what this looks like in practice:

// Fixed size - this becomes problematic quickly
int[] userScores = new int[100]; // What if we get 101 users?

// Adding elements requires manual array manipulation
int[] expandedScores = new int[userScores.Length + 1];
Array.Copy(userScores, expandedScores, userScores.Length);
expandedScores[userScores.Length] = newScore; // Tedious and error-prone

This manual memory management becomes a nightmare when building dynamic applications like social media platforms or gaming systems where data constantly changes.

Collections: Built for Real-World Development

Collections solve the fundamental limitations of arrays by providing dynamic sizing, optimized operations, and specialized behaviors. They're designed for the messy, unpredictable nature of real-world software development.

// Dynamic and clean - this is what modern C# looks like
List<int> userScores = new List<int>();
userScores.Add(newScore); // Simple, safe, and efficient

Memory Management and Performance

Unlike arrays, collections handle memory management for you. When a List<T> needs more space, it automatically allocates a larger internal array and copies the existing elements across. This happens behind the scenes, using the capacity-doubling strategy explored in Chapter 2.

The key insight: collections trade small memory overhead for massive developer productivity gains. In a typical business application, the slight memory cost is negligible compared to the time saved and bugs avoided.

Real-World Scenarios Where Collections Shine

Gaming Development

// Managing dynamic game objects
List<Enemy> activeEnemies = new List<Enemy>();
Dictionary<string, PlayerStats> leaderboard = new Dictionary<string, PlayerStats>();
Queue<GameEvent> eventQueue = new Queue<GameEvent>(); // Process events in order

Social Media Applications

// Dynamic content feeds
List<Post> userFeed = new List<Post>();
HashSet<string> uniqueHashtags = new HashSet<string>(); // No duplicate tags
Dictionary<int, User> userCache = new Dictionary<int, User>(); // Fast user lookup

Financial Systems

// Transaction processing
Queue<Transaction> pendingTransactions = new Queue<Transaction>(); // FIFO processing
Stack<Transaction> undoStack = new Stack<Transaction>(); // Undo last operations
Dictionary<string, decimal> accountBalances = new Dictionary<string, decimal>(); // O(1) balance lookup

Good Practices vs Bad Practices

✅ Good Practice: Choose the Right Collection

// Use Dictionary for fast lookups by key
Dictionary<string, User> userCache = new Dictionary<string, User>();
var user = userCache["john123"]; // O(1) average case

❌ Bad Practice: Using Lists for Everything

// Inefficient - O(n) search every time
List<User> users = new List<User>();
var user = users.FirstOrDefault(u => u.Username == "john123");

✅ Good Practice: Initialize with Capacity When Known

// Prevents multiple internal array allocations
List<string> knownSizeList = new List<string>(1000);

❌ Bad Practice: Ignoring Performance Characteristics

// Inefficient for frequent insertions at the beginning
List<int> numbers = new List<int>();
numbers.Insert(0, newNumber); // O(n) operation - all elements shift

Theory Foundation: Big O Notation in Collections

Understanding performance characteristics helps you make informed decisions:

  • List: O(1) append, O(n) insert at beginning
  • Dictionary: O(1) average lookup, O(n) worst case
  • HashSet: O(1) average add/contains, O(n) worst case
  • Queue and Stack: O(1) for their primary operations

This isn't academic theory—it directly impacts user experience. A social media feed with O(n) user lookups will crawl as your user base grows, while O(1) dictionary lookups scale effortlessly to millions of users.
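
To see the gap concretely, here's a minimal sketch contrasting the two lookup styles; the User class and GetUsers() helper are hypothetical stand-ins for your own data:

// Hypothetical types for illustration
public class User
{
    public int Id { get; set; }
    public string Username { get; set; }
}

// O(n): scans the list element by element until a match is found
List<User> userList = GetUsers();
User slow = userList.FirstOrDefault(u => u.Id == 42);

// O(1) average: jumps straight to the bucket for key 42
Dictionary<int, User> userById = userList.ToDictionary(u => u.Id);
User fast = userById[42];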

Chapter Summary

Collections are the foundation of professional C# development because they solve real-world problems that arrays cannot. They provide dynamic sizing, optimized operations, and specialized behaviors that make complex applications maintainable and performant. Understanding when and why to use each collection type separates junior developers from senior engineers who build systems that scale.

The key insight: arrays are tools for simple, fixed-size data; collections are tools for building applications that handle the complexity and growth of real business requirements.

Practice Exercise

Scenario: You're building a customer service ticket system. Design the data structures you'd use for:

  1. Storing all tickets (need to add/remove frequently)
  2. Looking up tickets by ticket ID (must be fast)
  3. Processing tickets in order of arrival (first come, first served)
  4. Storing unique customer email addresses (no duplicates)

Write a brief explanation of which collection you'd choose for each requirement and why. Consider both the theoretical performance characteristics and real-world usage patterns.


Chapter 2: Lists: Your Go-To Dynamic Collection

Every successful application starts with managing changing data. Whether you're tracking user interactions in a mobile app, managing inventory in an e-commerce system, or building playlists in a music streaming service, List<T> is your reliable workhorse.

Lists bridge the gap between the simplicity of arrays and the dynamic requirements of modern software. They provide array-like syntax with the flexibility to grow and shrink as your data changes, making them the most commonly used collection in professional C# development.

Understanding List Fundamentals

A List<T> is essentially a dynamic array backed by an internal array that automatically resizes when needed. The generic <T> means it can store any type while maintaining type safety—no boxing, no casting, no runtime errors from type mismatches.

// Type-safe and efficient
List<string> userNames = new List<string>();
List<int> scores = new List<int>();
List<Customer> customers = new List<Customer>();

// Compile-time safety - this won't compile
// userNames.Add(42); // Error: Cannot convert int to string
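
For contrast, the pre-generic ArrayList (from System.Collections) stores everything as object, so value types get boxed on the way in and every read needs a cast, exactly the overhead List<T> eliminates:

// Legacy ArrayList - boxing and runtime casts
ArrayList legacyScores = new ArrayList();
legacyScores.Add(42);                 // the int is boxed into an object
int first = (int)legacyScores[0];     // explicit cast required to get it back
legacyScores.Add("not a number");     // compiles fine, fails later at the cast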

Dynamic Resizing: The Magic Behind the Scenes

When you add elements to a list, it doesn't allocate memory for each individual item. Instead, it uses a capacity doubling strategy:

List<int> numbers = new List<int>(); // Initial capacity: 0
numbers.Add(1); // Capacity becomes 4
numbers.Add(2);
numbers.Add(3);
numbers.Add(4);
numbers.Add(5); // Capacity doubles to 8

This exponential growth strategy ensures that adding elements is amortized O(1)—even though occasional resizing operations are O(n), they happen infrequently enough that average performance remains excellent.
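
You can watch the doubling happen through the Capacity property; a quick sketch:

List<int> growthDemo = new List<int>();
Console.WriteLine(growthDemo.Capacity); // 0 - no backing array yet

for (int i = 1; i <= 9; i++)
{
    growthDemo.Add(i);
    Console.WriteLine($"Count: {growthDemo.Count}, Capacity: {growthDemo.Capacity}");
}
// Capacity jumps 0 → 4 → 8 → 16 as items 1, 5, and 9 arrive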

Essential List Operations

Adding Elements

List<string> tasks = new List<string>();

// Add single items
tasks.Add("Review code");
tasks.Add("Deploy to staging");

// Add multiple items
tasks.AddRange(new[] { "Run tests", "Update documentation" });

// Insert at specific position
tasks.Insert(0, "Morning standup"); // Insert at beginning

Accessing and Modifying Elements

// Index-based access (just like arrays)
string firstTask = tasks[0];
tasks[1] = "Deploy to production"; // Modify existing item

// Safe access with bounds checking
if (tasks.Count > 2)
{
    string thirdTask = tasks[2];
}

Removing Elements

// Remove by value
tasks.Remove("Update documentation"); // Removes first occurrence

// Remove by index
tasks.RemoveAt(0); // Remove first item

// Remove multiple items
tasks.RemoveRange(1, 2); // Remove 2 items starting at index 1

// Clear everything
tasks.Clear();

Searching and Checking

// Check if item exists
bool hasDeployTask = tasks.Contains("Deploy to production");

// Find index of item
int index = tasks.IndexOf("Run tests"); // Returns -1 if not found

// Find items with conditions
string urgentTask = tasks.Find(task => task.Contains("urgent"));
List<string> deployTasks = tasks.FindAll(task => task.Contains("Deploy"));

Performance Characteristics in Practice

Understanding List performance helps you write efficient code:

  • Add/AddRange: O(1) amortized, O(n) when resizing
  • Index access: O(1) - just like arrays
  • Insert at beginning: O(n) - must shift all elements
  • Remove by index: O(n) - must shift elements after removal
  • Remove by value: O(n) - must search then shift
  • Contains/IndexOf: O(n) - linear search

// Efficient: Adding to end
List<LogEntry> logs = new List<LogEntry>();
logs.Add(newLogEntry); // O(1) - fast

// Inefficient: Frequent insertions at beginning
List<string> recentMessages = new List<string>();
recentMessages.Insert(0, newMessage); // O(n) - slow for large lists

Real-World Application Patterns

User Interface Lists

// Managing dynamic UI elements
List<MenuItem> navigationItems = new List<MenuItem>
{
    new MenuItem("Dashboard", "/dashboard"),
    new MenuItem("Reports", "/reports"),
    new MenuItem("Settings", "/settings")
};

// Add items based on user permissions
if (user.HasPermission("Admin"))
{
    navigationItems.Add(new MenuItem("Admin Panel", "/admin"));
}

Data Processing Pipelines

// Processing batches of data
List<CustomerOrder> todaysOrders = GetTodaysOrders();
List<CustomerOrder> priorityOrders = new List<CustomerOrder>();

foreach (var order in todaysOrders)
{
    if (order.IsPriority)
    {
        priorityOrders.Add(order);
    }
}

// Process priority orders first
ProcessOrders(priorityOrders);

Caching and Temporary Storage

// Temporary collection for API responses
List<WeatherData> weatherCache = new List<WeatherData>();

public async Task RefreshWeatherData()
{
    weatherCache.Clear();
    var newData = await weatherApiClient.GetForecast();
    weatherCache.AddRange(newData);
}

Good Practices vs Bad Practices

✅ Good Practice: Initialize with Known Capacity

// Prevents multiple internal reallocations
List<Customer> customers = new List<Customer>(expectedCustomerCount);

❌ Bad Practice: Ignoring Capacity for Large Collections

// Multiple expensive resize operations
List<DataPoint> largeDataset = new List<DataPoint>(); // Starts at capacity 0
// Adding 10,000 items triggers many resize operations

✅ Good Practice: Use Collection Initializer for Static Data

List<string> allowedFileTypes = new List<string> 
{ 
    ".pdf", ".docx", ".txt", ".jpg", ".png" 
};

❌ Bad Practice: Multiple Individual Add Calls for Known Data

List<string> fileTypes = new List<string>();
fileTypes.Add(".pdf");
fileTypes.Add(".docx");
fileTypes.Add(".txt");
// More verbose and slightly less efficient

✅ Good Practice: Consider Performance for Frequent Operations

// Use Queue<T> instead of List<T> for frequent beginning insertions
Queue<ProcessingTask> taskQueue = new Queue<ProcessingTask>();
taskQueue.Enqueue(newTask); // O(1) instead of List.Insert(0, item) which is O(n)

Working with Custom Objects

Lists truly shine when working with custom objects and business entities:

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
    public DateTime DateAdded { get; set; }
}

List<Product> inventory = new List<Product>();

// Adding products
inventory.Add(new Product 
{ 
    Id = 1, 
    Name = "Wireless Headphones", 
    Price = 99.99m, 
    DateAdded = DateTime.Now 
});

// Finding products with LINQ (covered in detail in Chapter 9)
var expensiveProducts = inventory.Where(p => p.Price > 100).ToList();
var recentProducts = inventory.Where(p => p.DateAdded > DateTime.Now.AddDays(-7)).ToList();

Thread Safety Considerations

Lists are not thread-safe by default. In multi-threaded applications, you need additional synchronization:

// Thread-safe wrapper (simple but can impact performance)
List<string> threadSafeList = new List<string>();
lock (threadSafeList)
{
    threadSafeList.Add("Thread-safe addition");
}

// Better: Use ConcurrentBag<T> for concurrent scenarios
// (covered in Chapter 10: Best Practices)

Chapter Summary

List<T> is the foundation of dynamic data management in C#. Its automatic resizing, type safety, and familiar array-like syntax make it the go-to choice for most scenarios involving changing collections of data. Understanding its performance characteristics—especially the O(1) append and O(n) insertion costs—helps you write efficient code that scales with your application's growth.

The key insight: Lists excel at append-heavy scenarios and provide the best balance of usability and performance for general-purpose dynamic collections.

Practice Exercise

Scenario: Build a simple task management system for a development team.

Create a TaskItem class (avoid the name Task, which collides with System.Threading.Tasks.Task) with properties: Id, Title, AssignedTo, Priority (1-5), and IsCompleted.

Write methods to:

  1. Add new tasks to the list
  2. Mark tasks as completed
  3. Find all high-priority tasks (priority 4-5)
  4. Remove completed tasks older than 30 days
  5. Get tasks assigned to a specific team member

Consider the performance implications of each operation and explain why List<T> is appropriate for this use case.


Chapter 3: Dictionaries: Key-Value Powerhouses

Imagine you're building a user authentication system, a product catalog, or a caching layer for a high-traffic web application. You need lightning-fast lookups by unique identifiers—user IDs, product SKUs, or cache keys. This is where Dictionary<TKey, TValue> becomes indispensable.

Dictionaries solve the fundamental problem of associative data storage. While lists are excellent for ordered collections, dictionaries excel when you need to quickly find information using a unique key rather than searching through every element.

Understanding Hash Tables and Dictionary Internals

A Dictionary<TKey, TValue> is C#'s implementation of a hash table. The "magic" happens through a process called hashing:

  1. Hash Function: Converts your key into a numeric hash code
  2. Bucket Assignment: Uses the hash code to determine where to store the value
  3. Collision Handling: Manages cases where different keys produce the same hash code

// Behind the scenes when you do this:
Dictionary<string, User> users = new Dictionary<string, User>();
users["john123"] = new User("John", "Doe");

// C# calculates: "john123".GetHashCode() → determines storage location
// Result: O(1) average lookup time
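
To make the bucket idea concrete, here's a toy sketch of the hash-to-bucket mapping; the real Dictionary<TKey, TValue> layers collision chains, load-factor management, and resizing on top of this:

int bucketCount = 16; // toy fixed-size table

// Step 1: hash the key
int hash = "john123".GetHashCode();

// Step 2: map the hash onto a bucket index (mask off the sign bit first)
int bucket = (hash & 0x7FFFFFFF) % bucketCount;
Console.WriteLine($"'john123' lands in bucket {bucket}");

// Step 3: if two keys share a bucket, Equals() decides which entry matches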

Essential Dictionary Operations

Creating and Adding Items

// Initialize empty dictionary
Dictionary<string, decimal> productPrices = new Dictionary<string, decimal>();

// Add items
productPrices["laptop"] = 999.99m;
productPrices["mouse"] = 29.99m;
productPrices["keyboard"] = 79.99m;

// Alternative: using Add method (throws exception if key exists)
productPrices.Add("monitor", 399.99m);

// Safe addition - check if key exists first
if (!productPrices.ContainsKey("webcam"))
{
    productPrices["webcam"] = 149.99m;
}

// Dictionary initializer syntax
var defaultSettings = new Dictionary<string, bool>
{
    ["EnableLogging"] = true,
    ["UseHttps"] = true,
    ["AllowGuestAccess"] = false
};

Accessing and Modifying Values

// Direct access (throws KeyNotFoundException if key doesn't exist)
decimal laptopPrice = productPrices["laptop"];

// Safe access with TryGetValue
if (productPrices.TryGetValue("tablet", out decimal tabletPrice))
{
    Console.WriteLine($"Tablet costs: {tabletPrice}");
}
else
{
    Console.WriteLine("Tablet not found");
}

// Modify existing values
productPrices["laptop"] = 899.99m; // Update price

Checking and Removing Items

// Check if key exists
bool hasLaptop = productPrices.ContainsKey("laptop");

// Check if value exists (less efficient - O(n))
bool hasExpensiveItem = productPrices.ContainsValue(999.99m);

// Remove items
bool removed = productPrices.Remove("mouse"); // Returns true if key existed

// Remove and retrieve the value in one call (modern .NET; not available on .NET Framework)
productPrices.Remove("laptop", out decimal oldLaptopPrice);

Iterating Through Dictionaries

// Iterate over key-value pairs
foreach (var kvp in productPrices)
{
    Console.WriteLine($"{kvp.Key}: ${kvp.Value}");
}

// Iterate over keys only
foreach (string productName in productPrices.Keys)
{
    Console.WriteLine($"Product: {productName}");
}

// Iterate over values only
foreach (decimal price in productPrices.Values)
{
    Console.WriteLine($"Price: ${price}");
}

Performance Characteristics in Real-World Context

Understanding dictionary performance is crucial for building scalable applications:

  • Get/Set by key: O(1) average case, O(n) worst case
  • Add: O(1) average case, O(n) when resizing
  • Remove: O(1) average case
  • ContainsKey: O(1) average case
  • ContainsValue: O(n) - must check all values

The "average case" performance assumes a good hash function and reasonable load factor. .NET's built-in hash functions for common types (string, int, etc.) are well-optimized for real-world use.

Real-World Application Patterns

User Management Systems

// Fast user lookups in authentication systems
Dictionary<string, User> userCache = new Dictionary<string, User>();

public User GetUser(string username)
{
    if (userCache.TryGetValue(username, out User cachedUser))
    {
        return cachedUser; // O(1) cache hit
    }

    // Cache miss - load from database
    var user = database.LoadUser(username);
    userCache[username] = user; // Cache for next time
    return user;
}

Configuration Management

// Application settings with fast lookups
Dictionary<string, string> appSettings = new Dictionary<string, string>
{
    ["DatabaseConnection"] = "Server=localhost;Database=MyApp",
    ["ApiTimeout"] = "30",
    ["MaxRetries"] = "3",
    ["EnableFeatureX"] = "true"
};

public T GetSetting<T>(string key, T defaultValue)
{
    if (appSettings.TryGetValue(key, out string value))
    {
        return (T)Convert.ChangeType(value, typeof(T));
    }
    return defaultValue;
}
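
A quick usage sketch of the helper above; the type argument is inferred from the default value:

int timeout = GetSetting("ApiTimeout", 30);           // "30" converts to 30
bool featureX = GetSetting("EnableFeatureX", false);  // "true" converts to true
string missing = GetSetting("NoSuchKey", "fallback"); // key absent - default returned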

Gaming and Leaderboards

// Player statistics with instant lookups
Dictionary<string, PlayerStats> playerStats = new Dictionary<string, PlayerStats>();

public void UpdatePlayerScore(string playerId, int score)
{
    if (playerStats.TryGetValue(playerId, out PlayerStats stats))
    {
        stats.TotalScore += score; // O(1) lookup and update
        stats.GamesPlayed++;
    }
    else
    {
        playerStats[playerId] = new PlayerStats { TotalScore = score, GamesPlayed = 1 };
    }
}

E-commerce Product Catalogs

// Product information with SKU lookups
Dictionary<string, Product> productCatalog = new Dictionary<string, Product>();

public Product GetProduct(string sku)
{
    return productCatalog.TryGetValue(sku, out Product product) ? product : null;
}

public void UpdateInventory(string sku, int quantity)
{
    if (productCatalog.TryGetValue(sku, out Product product))
    {
        product.QuantityInStock = quantity; // Instant update
    }
}

Good Practices vs Bad Practices

✅ Good Practice: Use TryGetValue for Safe Access

// Safe - no exceptions thrown
if (userCache.TryGetValue(userId, out User user))
{
    ProcessUser(user);
}

❌ Bad Practice: Direct Access Without Checking

// Dangerous - throws KeyNotFoundException if key doesn't exist
User user = userCache[userId]; // Could crash your application

✅ Good Practice: Choose Appropriate Key Types

// Good - strings and primitives have efficient hash functions
Dictionary<int, Customer> customerById = new Dictionary<int, Customer>();
Dictionary<string, Order> orderByNumber = new Dictionary<string, Order>();

❌ Bad Practice: Using Complex Objects as Keys Without Proper Hashing

// Problematic - needs custom GetHashCode() and Equals() implementation
public class CompositeKey 
{
    public string Part1 { get; set; }
    public int Part2 { get; set; }
    // Missing GetHashCode() and Equals() overrides
}

Dictionary<CompositeKey, string> badExample = new Dictionary<CompositeKey, string>();

✅ Good Practice: Initialize with Capacity for Known Size

// Prevents multiple resize operations for large datasets
Dictionary<string, ProductInfo> products = new Dictionary<string, ProductInfo>(10000);

❌ Bad Practice: Using Lists for Key-Based Lookups

// Inefficient O(n) search instead of O(1) dictionary lookup
List<User> users = new List<User>();
var user = users.FirstOrDefault(u => u.Id == targetId); // Slow!

Advanced Dictionary Features

Custom Key Types

When using custom objects as keys, implement GetHashCode() and Equals():

public class ProductKey : IEquatable<ProductKey>
{
    public string Category { get; set; }
    public string SubCategory { get; set; }

    public bool Equals(ProductKey other)
    {
        return other != null && 
               Category == other.Category && 
               SubCategory == other.SubCategory;
    }

    public override bool Equals(object obj) => Equals(obj as ProductKey);

    public override int GetHashCode()
    {
        return HashCode.Combine(Category, SubCategory); // .NET Core 2.1+
    }
}

Dictionary with Custom Comparer

// Case-insensitive string keys
Dictionary<string, User> users = new Dictionary<string, User>(StringComparer.OrdinalIgnoreCase);
users["JOHN"] = user1;
var sameUser = users["john"]; // Finds the same user

Thread Safety and Concurrent Access

Dictionaries are not thread-safe for modifications. For concurrent scenarios:

// Thread-safe alternative for concurrent access
ConcurrentDictionary<string, User> threadSafeUsers = new ConcurrentDictionary<string, User>();

// Atomic operations
threadSafeUsers.AddOrUpdate("john123", 
    newUser,                           // Add if doesn't exist
    (key, existingUser) => newUser);   // Update if exists

Common Pitfalls and How to Avoid Them

Hash Code Stability

// ❌ Bad: Mutable key objects can break dictionary
public class MutableKey
{
    public string Value { get; set; } // Changing this breaks hash table
}

// ✅ Good: Immutable key objects
public class ImmutableKey
{
    public string Value { get; }
    public ImmutableKey(string value) => Value = value;
}
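
Here's why mutation breaks things, in a contrived sketch where the key's hash code depends on its mutable Value. The dictionary records the hash at insertion time, so once the key mutates, lookups compute a different hash and miss the entry:

public class HashedKey
{
    public string Value { get; set; }

    public override int GetHashCode() => Value.GetHashCode();
    public override bool Equals(object obj) =>
        obj is HashedKey other && Value == other.Value;
}

var key = new HashedKey { Value = "original" };
var data = new Dictionary<HashedKey, string> { [key] = "payload" };

key.Value = "mutated"; // hash code changes while the entry sits in the old bucket

Console.WriteLine(data.ContainsKey(key)); // False - the entry is stranded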

Null Key Handling

// Dictionary<TKey, TValue> rejects null keys with an ArgumentNullException
// Check for null before using a value as a key
if (keyValue != null)
{
    dictionary[keyValue] = someValue;
}

Chapter Summary

Dictionary<TKey, TValue> provides O(1) average-case performance for key-based operations, making it essential for scenarios requiring fast lookups, caching, and associative data storage. Understanding hash table concepts and proper key design ensures optimal performance in real-world applications.

The key insight: Dictionaries transform linear O(n) searches into constant O(1) lookups, making them indispensable for scalable application architecture.

Practice Exercise

Scenario: Build a simple inventory management system for a warehouse.

Create an InventoryItem class with properties: SKU, Name, Quantity, Location, and LastUpdated.

Using dictionaries, implement:

  1. Fast SKU-based product lookup
  2. Location-based inventory grouping (multiple items per location)
  3. Low-stock alerts (items with quantity below threshold)
  4. Audit trail of the last 10 operations per SKU

Explain why dictionaries are more efficient than lists for this use case, and identify which operations benefit most from O(1) lookup performance.


Chapter 4: Queues: First In, First Out Processing

Think about the last time you waited in line at a coffee shop, submitted a print job, or uploaded files to a cloud service. These scenarios all follow the same principle: first come, first served. In software development, Queue<T> implements this FIFO (First In, First Out) pattern, making it essential for task processing, message handling, and any system where order matters.

Queues are the backbone of many critical systems—from web servers handling HTTP requests to operating systems managing process scheduling. Understanding when and how to use queues properly can mean the difference between a responsive application and one that frustrates users.

Understanding FIFO Principles

A queue works exactly like a line at the bank: the first person to arrive is the first person served. In programming terms:

  • Enqueue: Add an item to the back of the queue
  • Dequeue: Remove and return the item from the front of the queue
  • Peek: Look at the front item without removing it

Queue<string> customerService = new Queue<string>();

// Customers arrive (enqueue)
customerService.Enqueue("Alice");    // Queue: [Alice]
customerService.Enqueue("Bob");      // Queue: [Alice, Bob]  
customerService.Enqueue("Charlie");  // Queue: [Alice, Bob, Charlie]

// Serve customers (dequeue)
string nextCustomer = customerService.Dequeue(); // Returns "Alice"
// Queue is now: [Bob, Charlie]

Essential Queue Operations

Adding and Removing Items

Queue<ProcessingTask> taskQueue = new Queue<ProcessingTask>();

// Add items to the queue
taskQueue.Enqueue(new ProcessingTask("Send email"));
taskQueue.Enqueue(new ProcessingTask("Generate report"));
taskQueue.Enqueue(new ProcessingTask("Update database"));

// Process items in order
while (taskQueue.Count > 0)
{
    ProcessingTask currentTask = taskQueue.Dequeue();
    await currentTask.Execute();
    Console.WriteLine($"Completed: {currentTask.Description}");
}

Examining Queue Contents

// Check what's next without removing it
if (taskQueue.Count > 0)
{
    ProcessingTask nextTask = taskQueue.Peek();
    Console.WriteLine($"Next task: {nextTask.Description}");
}

// Check if queue is empty
bool hasWork = taskQueue.Count > 0;

// Check if specific item is in queue (O(n) operation)
bool hasEmailTask = taskQueue.Contains(emailTask);

Converting and Enumerating

// Convert to array (preserves order)
ProcessingTask[] allTasks = taskQueue.ToArray();

// Enumerate without modifying queue
foreach (ProcessingTask task in taskQueue)
{
    Console.WriteLine($"Queued: {task.Description}");
    // Items remain in queue after enumeration
}

Performance Characteristics

Queues are optimized for their primary operations:

  • Enqueue: O(1) - Adding to the back is constant time
  • Dequeue: O(1) - Removing from the front is constant time
  • Peek: O(1) - Looking at the front item
  • Contains: O(n) - Must search through all items
  • Count: O(1) - Maintained internally

This makes queues extremely efficient for their intended use cases but inappropriate when you need frequent random access or searching.
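
A rough way to feel the difference is draining 100,000 items from a Queue<T> versus a List<T> via RemoveAt(0). Exact timings vary by machine, but the gap is dramatic:

const int itemCount = 100_000;

// Queue<T>.Dequeue is O(1): nothing shifts
var drainQueue = new Queue<int>();
for (int i = 0; i < itemCount; i++) drainQueue.Enqueue(i);

var stopwatch = System.Diagnostics.Stopwatch.StartNew();
while (drainQueue.Count > 0) drainQueue.Dequeue();
Console.WriteLine($"Queue drain: {stopwatch.ElapsedMilliseconds} ms");

// List<T>.RemoveAt(0) is O(n): every removal shifts the remaining items left
var drainList = new List<int>();
for (int i = 0; i < itemCount; i++) drainList.Add(i);

stopwatch.Restart();
while (drainList.Count > 0) drainList.RemoveAt(0);
Console.WriteLine($"List drain: {stopwatch.ElapsedMilliseconds} ms");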

Real-World Application Patterns

Background Task Processing

public class BackgroundTaskProcessor
{
    private readonly Queue<IBackgroundTask> _taskQueue = new Queue<IBackgroundTask>();
    private readonly object _lock = new object();

    public void QueueTask(IBackgroundTask task)
    {
        lock (_lock)
        {
            _taskQueue.Enqueue(task);
        }
        Console.WriteLine($"Queued: {task.Name}");
    }

    public async Task ProcessTasks()
    {
        while (true)
        {
            IBackgroundTask nextTask = null;

            lock (_lock)
            {
                if (_taskQueue.Count > 0)
                {
                    nextTask = _taskQueue.Dequeue();
                }
            }

            if (nextTask != null)
            {
                await nextTask.Execute();
                Console.WriteLine($"Completed: {nextTask.Name}");
            }
            else
            {
                await Task.Delay(1000); // Wait for new tasks
            }
        }
    }
}

Web Request Processing

public class RequestProcessor
{
    private readonly Queue<HttpRequest> _requestQueue = new Queue<HttpRequest>();
    private const int MAX_QUEUE_SIZE = 1000;

    public bool TryQueueRequest(HttpRequest request)
    {
        if (_requestQueue.Count >= MAX_QUEUE_SIZE)
        {
            // Queue full - reject request or implement overflow handling
            return false;
        }

        _requestQueue.Enqueue(request);
        return true;
    }

    public async Task<HttpResponse> ProcessNextRequest()
    {
        if (_requestQueue.Count == 0)
        {
            return null; // No requests to process
        }

        HttpRequest request = _requestQueue.Dequeue();
        return await HandleRequest(request);
    }
}

Game Development: Event Processing

public class GameEventSystem
{
    private readonly Queue<GameEvent> _eventQueue = new Queue<GameEvent>();

    public void QueueEvent(GameEvent gameEvent)
    {
        _eventQueue.Enqueue(gameEvent);
    }

    public void ProcessEvents()
    {
        // Process all events that occurred this frame
        int eventsToProcess = _eventQueue.Count; // Snapshot count

        for (int i = 0; i < eventsToProcess; i++)
        {
            GameEvent currentEvent = _eventQueue.Dequeue();

            switch (currentEvent.Type)
            {
                case EventType.PlayerMoved:
                    HandlePlayerMovement(currentEvent);
                    break;
                case EventType.ItemCollected:
                    HandleItemCollection(currentEvent);
                    break;
                case EventType.EnemySpawned:
                    HandleEnemySpawn(currentEvent);
                    break;
            }
        }
    }
}

File Processing Pipeline

public class FileProcessingPipeline
{
    private readonly Queue<FileInfo> _processingQueue = new Queue<FileInfo>();

    public void AddFilesToQueue(string directoryPath)
    {
        DirectoryInfo directory = new DirectoryInfo(directoryPath);
        FileInfo[] files = directory.GetFiles("*.csv");

        foreach (FileInfo file in files)
        {
            _processingQueue.Enqueue(file);
        }

        Console.WriteLine($"Queued {files.Length} files for processing");
    }

    public async Task ProcessFiles()
    {
        while (_processingQueue.Count > 0)
        {
            FileInfo currentFile = _processingQueue.Dequeue();

            try
            {
                await ProcessFile(currentFile);
                Console.WriteLine($"Processed: {currentFile.Name}");
            }
            catch (Exception ex)
            {
                Console.WriteLine($"Error processing {currentFile.Name}: {ex.Message}");
                // Optionally re-queue failed files for retry
            }
        }
    }
}

Good Practices vs Bad Practices

✅ Good Practice: Use Queues for Sequential Processing

// Perfect use case - processing items in arrival order
Queue<EmailMessage> outboxQueue = new Queue<EmailMessage>();

public async Task SendQueuedEmailsAsync() // async is required for the await below
{
    while (outboxQueue.Count > 0)
    {
        EmailMessage email = outboxQueue.Dequeue();
        await emailService.SendAsync(email); // Process in order
    }
}

❌ Bad Practice: Using Queues for Random Access

// Inefficient - queues aren't designed for searching
Queue<Customer> customers = new Queue<Customer>();
var targetCustomer = customers.FirstOrDefault(c => c.Id == targetId); // O(n) operation
// Better: Use Dictionary<int, Customer> for ID-based lookups

✅ Good Practice: Implement Queue Size Limits

public class BoundedQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private readonly int _maxSize;

    public BoundedQueue(int maxSize) => _maxSize = maxSize;

    public bool TryEnqueue(T item)
    {
        if (_queue.Count >= _maxSize)
        {
            return false; // Queue full
        }

        _queue.Enqueue(item);
        return true;
    }
}

❌ Bad Practice: Unbounded Queues in Production

// Dangerous - could consume unlimited memory
Queue<LogEntry> logQueue = new Queue<LogEntry>();
// In high-traffic scenarios, this could cause OutOfMemoryException

✅ Good Practice: Thread-Safe Queue Operations

// For concurrent scenarios, use thread-safe alternatives
ConcurrentQueue<WorkItem> threadSafeQueue = new ConcurrentQueue<WorkItem>();

// Or synchronize access to regular queues
private readonly object _queueLock = new object();

public void SafeEnqueue(WorkItem item)
{
    lock (_queueLock)
    {
        _workQueue.Enqueue(item);
    }
}

Advanced Queue Scenarios

Priority Queue Implementation (Custom)

While Queue<T> processes items in arrival order, sometimes you need priority-based processing:

public class PriorityQueue<T> where T : IComparable<T>
{
    private readonly List<T> _items = new List<T>();

    public void Enqueue(T item)
    {
        _items.Add(item);
        _items.Sort(); // Simple implementation - could be optimized with heap
    }

    public T Dequeue()
    {
        if (_items.Count == 0)
            throw new InvalidOperationException("Queue is empty");

        T item = _items[0]; // Smallest CompareTo value = highest priority (min-ordered)
        _items.RemoveAt(0);
        return item;
    }

    public int Count => _items.Count;
}
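
If you're on .NET 6 or later, note that the framework now ships a heap-based System.Collections.Generic.PriorityQueue<TElement, TPriority>, which avoids re-sorting on every insert:

// Built-in since .NET 6 - the lowest priority value dequeues first
PriorityQueue<string, int> tickets = new PriorityQueue<string, int>();
tickets.Enqueue("Password reset", priority: 3);
tickets.Enqueue("Site down!", priority: 1);
tickets.Enqueue("Feature question", priority: 5);

Console.WriteLine(tickets.Dequeue()); // "Site down!" - priority 1 wins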

Queue with Retry Logic

public class RetryQueue<T>
{
    private readonly Queue<QueueItem<T>> _mainQueue = new Queue<QueueItem<T>>();
    private readonly Queue<QueueItem<T>> _retryQueue = new Queue<QueueItem<T>>();

    public void Enqueue(T item)
    {
        _mainQueue.Enqueue(new QueueItem<T>(item, maxRetries: 3));
    }

    public bool TryDequeue(out T item)
    {
        // First try retry queue (failed items get priority)
        if (_retryQueue.Count > 0)
        {
            var queueItem = _retryQueue.Dequeue();
            item = queueItem.Item;
            return true;
        }

        // Then try main queue
        if (_mainQueue.Count > 0)
        {
            var queueItem = _mainQueue.Dequeue();
            item = queueItem.Item;
            return true;
        }

        item = default(T);
        return false;
    }

    public void RequeueForRetry(T item, int remainingRetries)
    {
        if (remainingRetries > 0)
        {
            _retryQueue.Enqueue(new QueueItem<T>(item, remainingRetries - 1));
        }
        // Otherwise, item is discarded after max retries
    }
}

// Wrapper that carries retry metadata alongside each item
public class QueueItem<T>
{
    public T Item { get; }
    public int RemainingRetries { get; }

    public QueueItem(T item, int maxRetries)
    {
        Item = item;
        RemainingRetries = maxRetries;
    }
}

Thread Safety Considerations

Standard Queue<T> is not thread-safe. For concurrent scenarios:

// Option 1: Use ConcurrentQueue<T>
ConcurrentQueue<string> concurrentQueue = new ConcurrentQueue<string>();
concurrentQueue.Enqueue("thread-safe item");
if (concurrentQueue.TryDequeue(out string result))
{
    Console.WriteLine($"Dequeued: {result}");
}

// Option 2: Synchronize access with locks
private readonly Queue<string> _queue = new Queue<string>();
private readonly object _lock = new object();

public void ThreadSafeEnqueue(string item)
{
    lock (_lock)
    {
        _queue.Enqueue(item);
    }
}

public bool ThreadSafeTryDequeue(out string item)
{
    lock (_lock)
    {
        if (_queue.Count > 0)
        {
            item = _queue.Dequeue();
            return true;
        }
        item = null;
        return false;
    }
}

Chapter Summary

Queue<T> implements FIFO processing with O(1) performance for enqueue and dequeue operations, making it ideal for sequential task processing, request handling, and any scenario where order preservation is critical. The key is recognizing when order matters more than random access or searching capabilities.

The key insight: Queues excel in producer-consumer scenarios where fairness and order preservation are more important than flexible data access patterns.

Practice Exercise

Scenario: Build a customer support ticket system that processes tickets in the order they're received.

Create a SupportTicket class with properties: TicketId, CustomerName, Issue, Priority, CreatedAt, and Status.

Implement:

  1. A ticket queue that processes standard tickets in FIFO order
  2. An urgent ticket queue that gets priority over standard tickets
  3. A method to estimate wait time based on queue position
  4. Handling for tickets that fail processing (retry up to 3 times)
  5. Daily statistics: tickets processed, average processing time, tickets remaining

Explain why queues are more appropriate than lists for this system, and identify potential issues with unbounded queues in a high-traffic support system.


Chapter 5: Stacks: Last In, First Out Operations

Picture the last time you used your browser's back button, pressed Ctrl+Z to undo an edit, or debugged through nested method calls. Each of these scenarios follows a Last In, First Out (LIFO) pattern—the most recent action is the first one you reverse. Stack<T> in C# implements this fundamental pattern, making it essential for undo systems, expression evaluation, and managing hierarchical operations.

Stacks are everywhere in computing, often working behind the scenes. Understanding when to reach for a stack versus other collections can dramatically simplify complex problems, especially those involving nested operations or reversible actions.

Understanding LIFO Principles

A stack works like a stack of plates: you can only add or remove plates from the top. In programming terms:

  • Push: Add an item to the top of the stack
  • Pop: Remove and return the top item from the stack
  • Peek: Look at the top item without removing it

Stack<string> browserHistory = new Stack<string>();

// User navigates through pages (push)
browserHistory.Push("google.com");      // Stack: [google.com]
browserHistory.Push("stackoverflow.com"); // Stack: [google.com, stackoverflow.com]
browserHistory.Push("github.com");      // Stack: [google.com, stackoverflow.com, github.com]

// User clicks back button (pop)
string currentPage = browserHistory.Pop(); // Returns "github.com"
// Stack is now: [google.com, stackoverflow.com]

Essential Stack Operations

Adding and Removing Items

Stack<string> undoStack = new Stack<string>();

// Record actions as they happen
undoStack.Push("Typed 'Hello'");
undoStack.Push("Typed ' World'");
undoStack.Push("Changed font to bold");

// Undo actions in reverse order
while (undoStack.Count > 0)
{
    string lastAction = undoStack.Pop();
    Console.WriteLine($"Undoing: {lastAction}");
}
// Output:
// Undoing: Changed font to bold
// Undoing: Typed ' World'  
// Undoing: Typed 'Hello'

Examining Stack Contents

Stack<decimal> calculatorStack = new Stack<decimal>();
calculatorStack.Push(10);
calculatorStack.Push(20);
calculatorStack.Push(5);

// Look at top item without removing it
decimal topValue = calculatorStack.Peek(); // Returns 5
Console.WriteLine($"Top value: {topValue}"); // Stack unchanged

// Check if stack is empty
bool hasValues = calculatorStack.Count > 0;

// Search for specific value (O(n) operation)
bool containsTwenty = calculatorStack.Contains(20);

Converting and Enumerating

// Convert to array (top-to-bottom order)
decimal[] values = calculatorStack.ToArray(); // [5, 20, 10]

// Enumerate without modifying stack
foreach (decimal value in calculatorStack)
{
    Console.WriteLine(value); // Prints 5, then 20, then 10
    // Stack remains unchanged after enumeration
}

Performance Characteristics

Stacks excel at their core operations:

  • Push: O(1) - Adding to the top is constant time
  • Pop: O(1) - Removing from the top is constant time
  • Peek: O(1) - Looking at the top item
  • Contains: O(n) - Must search through all items
  • Count: O(1) - Maintained internally

This makes stacks extremely efficient for LIFO scenarios but inefficient when you need to access items in the middle or search frequently.

Real-World Application Patterns

Undo/Redo Functionality

public class TextEditor
{
    private readonly Stack<EditorAction> _undoStack = new Stack<EditorAction>();
    private readonly Stack<EditorAction> _redoStack = new Stack<EditorAction>();
    private string _content = "";

    public void ExecuteAction(EditorAction action)
    {
        // Save current state for undo
        _undoStack.Push(new EditorAction 
        { 
            Type = ActionType.RestoreContent, 
            PreviousContent = _content 
        });

        // Apply the action
        _content = action.Apply(_content);

        // Clear redo stack - new action invalidates redo history
        _redoStack.Clear();

        Console.WriteLine($"Executed: {action.Description}");
    }

    public void Undo()
    {
        if (_undoStack.Count == 0)
        {
            Console.WriteLine("Nothing to undo");
            return;
        }

        // Save current state for redo
        _redoStack.Push(new EditorAction 
        { 
            Type = ActionType.RestoreContent, 
            PreviousContent = _content 
        });

        // Restore previous state
        EditorAction undoAction = _undoStack.Pop();
        _content = undoAction.PreviousContent;

        Console.WriteLine($"Undid last action");
    }

    public void Redo()
    {
        if (_redoStack.Count == 0)
        {
            Console.WriteLine("Nothing to redo");
            return;
        }

        // Move action from redo to undo stack
        EditorAction redoAction = _redoStack.Pop();
        _undoStack.Push(new EditorAction 
        { 
            Type = ActionType.RestoreContent, 
            PreviousContent = _content 
        });

        // Apply the redo action
        _content = redoAction.PreviousContent;

        Console.WriteLine("Redid last undone action");
    }
}

Expression Evaluation (Calculator)

public class PostfixCalculator
{
    public decimal Evaluate(string[] tokens)
    {
        Stack<decimal> operandStack = new Stack<decimal>();

        foreach (string token in tokens)
        {
            if (IsNumber(token))
            {
                operandStack.Push(decimal.Parse(token));
            }
            else if (IsOperator(token))
            {
                // Pop operands (note the order for non-commutative operations)
                decimal right = operandStack.Pop();
                decimal left = operandStack.Pop();

                decimal result = token switch
                {
                    "+" => left + right,
                    "-" => left - right,
                    "*" => left * right,
                    "/" => left / right,
                    _ => throw new ArgumentException($"Unknown operator: {token}")
                };

                operandStack.Push(result);
            }
        }

        return operandStack.Pop(); // Final result
    }

    // Token classification helpers (implied by the usage above)
    private static bool IsNumber(string token) =>
        decimal.TryParse(token, out _);

    private static bool IsOperator(string token) =>
        token == "+" || token == "-" || token == "*" || token == "/";

    // Example: "3 4 + 2 *" evaluates to 14
    // 1. Push 3, Push 4
    // 2. Pop 4, Pop 3, Push 3+4=7
    // 3. Push 2
    // 4. Pop 2, Pop 7, Push 7*2=14
}

Function Call Stack Simulation

public class CallStackTracer
{
    private readonly Stack<MethodCall> _callStack = new Stack<MethodCall>();

    public void EnterMethod(string methodName, Dictionary<string, object> parameters = null)
    {
        var methodCall = new MethodCall
        {
            Name = methodName,
            Parameters = parameters ?? new Dictionary<string, object>(),
            StartTime = DateTime.Now
        };

        _callStack.Push(methodCall);
        Console.WriteLine($"→ Entering {methodName} (depth: {_callStack.Count})");
    }

    public void ExitMethod()
    {
        if (_callStack.Count == 0)
        {
            Console.WriteLine("Warning: ExitMethod called with empty call stack");
            return;
        }

        MethodCall completedCall = _callStack.Pop();
        TimeSpan duration = DateTime.Now - completedCall.StartTime;

        Console.WriteLine($"← Exiting {completedCall.Name} " +
                         $"(duration: {duration.TotalMilliseconds:F2}ms, " +
                         $"depth: {_callStack.Count})");
    }

    public void PrintCallStack()
    {
        Console.WriteLine("Current Call Stack:");
        foreach (MethodCall call in _callStack)
        {
            Console.WriteLine($"  {call.Name}");
        }
    }
}

// Supporting type used by the tracer above
public class MethodCall
{
    public string Name { get; set; }
    public Dictionary<string, object> Parameters { get; set; }
    public DateTime StartTime { get; set; }
}

// Usage in methods:
public void ProcessOrder(Order order)
{
    callTracer.EnterMethod("ProcessOrder", new Dictionary<string, object> { ["orderId"] = order.Id });

    try
    {
        ValidateOrder(order);
        CalculateTotal(order);
        SaveOrder(order);
    }
    finally
    {
        callTracer.ExitMethod();
    }
}

Bracket/Parentheses Matching

public class BracketValidator
{
    private static readonly Dictionary<char, char> BracketPairs = new Dictionary<char, char>
    {
        { ')', '(' },
        { '}', '{' },
        { ']', '[' }
    };

    public bool IsValid(string expression)
    {
        Stack<char> bracketStack = new Stack<char>();

        foreach (char character in expression)
        {
            if (IsOpeningBracket(character))
            {
                bracketStack.Push(character);
            }
            else if (IsClosingBracket(character))
            {
                if (bracketStack.Count == 0)
                {
                    return false; // Closing bracket without matching opening
                }

                char lastOpening = bracketStack.Pop();
                if (BracketPairs[character] != lastOpening)
                {
                    return false; // Mismatched bracket types
                }
            }
            // Ignore other characters
        }

        return bracketStack.Count == 0; // All brackets should be matched
    }

    // Examples:
    // IsValid("()[]{}") → true
    // IsValid("([{}])") → true  
    // IsValid("([)]") → false (crossed brackets)
    // IsValid("((())") → false (unmatched opening)
}

Navigation History (Web Browser)

public class BrowserNavigationManager
{
    private readonly Stack<string> _backStack = new Stack<string>();
    private readonly Stack<string> _forwardStack = new Stack<string>();
    private string _currentPage = "about:blank";

    public void NavigateTo(string url)
    {
        // Save current page to back stack
        if (!string.IsNullOrEmpty(_currentPage))
        {
            _backStack.Push(_currentPage);
        }

        _currentPage = url;

        // Clear forward stack - new navigation invalidates forward history
        _forwardStack.Clear();

        Console.WriteLine($"Navigated to: {url}");
    }

    public bool CanGoBack => _backStack.Count > 0;
    public bool CanGoForward => _forwardStack.Count > 0;

    public void GoBack()
    {
        if (!CanGoBack)
        {
            Console.WriteLine("Cannot go back - no previous pages");
            return;
        }

        // Move current page to forward stack
        _forwardStack.Push(_currentPage);

        // Get previous page from back stack
        _currentPage = _backStack.Pop();

        Console.WriteLine($"Went back to: {_currentPage}");
    }

    public void GoForward()
    {
        if (!CanGoForward)
        {
            Console.WriteLine("Cannot go forward - no forward history");
            return;
        }

        // Move current page to back stack
        _backStack.Push(_currentPage);

        // Get next page from forward stack
        _currentPage = _forwardStack.Pop();

        Console.WriteLine($"Went forward to: {_currentPage}");
    }
}

Good Practices vs Bad Practices

✅ Good Practice: Use Stacks for Naturally LIFO Operations

// Perfect use case - reversing operations
Stack<DatabaseTransaction> transactionStack = new Stack<DatabaseTransaction>();

public void BeginTransaction(DatabaseTransaction transaction)
{
    transactionStack.Push(transaction);
    transaction.Begin();
}

public void RollbackLastTransaction()
{
    if (transactionStack.Count > 0)
    {
        DatabaseTransaction lastTransaction = transactionStack.Pop();
        lastTransaction.Rollback(); // Most recent transaction rolled back first
    }
}

❌ Bad Practice: Using Stacks for Sequential Processing

// Inefficient - items are processed in reverse order
Stack<EmailMessage> emailQueue = new Stack<EmailMessage>();
emailQueue.Push(firstEmail);  // Will be processed LAST
emailQueue.Push(secondEmail); // Will be processed FIRST
// Use Queue<T> instead for FIFO processing

✅ Good Practice: Check for Empty Stack Before Operations

public T SafePop<T>(Stack<T> stack)
{
    if (stack.Count == 0)
    {
        throw new InvalidOperationException("Cannot pop from empty stack");
    }
    return stack.Pop();
}

// Or use TryPop pattern
public bool TryPop<T>(Stack<T> stack, out T item)
{
    if (stack.Count > 0)
    {
        item = stack.Pop();
        return true;
    }
    item = default(T);
    return false;
}

❌ Bad Practice: Assuming Stack Will Always Have Items

// Dangerous - will throw exception if stack is empty
Stack<string> history = new Stack<string>();
string lastAction = history.Pop(); // InvalidOperationException if empty

✅ Good Practice: Use Stacks for Hierarchical Processing

// Excellent for tree traversal or nested structures
public void TraverseDirectoryStructure(DirectoryInfo directory)
{
    Stack<DirectoryInfo> directories = new Stack<DirectoryInfo>();
    directories.Push(directory);

    while (directories.Count > 0)
    {
        DirectoryInfo current = directories.Pop();
        ProcessDirectory(current);

        // Add subdirectories to stack (will be processed depth-first)
        foreach (DirectoryInfo subdirectory in current.GetDirectories())
        {
            directories.Push(subdirectory);
        }
    }
}

Advanced Stack Scenarios

Stack with Size Limit

public class BoundedStack<T>
{
    private readonly Stack<T> _stack = new Stack<T>();
    private readonly int _maxSize;

    public BoundedStack(int maxSize) => _maxSize = maxSize;

    public bool TryPush(T item)
    {
        if (_stack.Count >= _maxSize)
        {
            return false; // Stack full
        }

        _stack.Push(item);
        return true;
    }

    public bool TryPop(out T item)
    {
        if (_stack.Count > 0)
        {
            item = _stack.Pop();
            return true;
        }
        item = default(T);
        return false;
    }

    public int Count => _stack.Count;
    public bool IsFull => _stack.Count >= _maxSize;
}

Thread-Safe Stack Operations

public class ThreadSafeStack<T>
{
    private readonly Stack<T> _stack = new Stack<T>();
    private readonly object _lock = new object();

    public void Push(T item)
    {
        lock (_lock)
        {
            _stack.Push(item);
        }
    }

    public bool TryPop(out T item)
    {
        lock (_lock)
        {
            if (_stack.Count > 0)
            {
                item = _stack.Pop();
                return true;
            }
            item = default(T);
            return false;
        }
    }

    public bool TryPeek(out T item)
    {
        lock (_lock)
        {
            if (_stack.Count > 0)
            {
                item = _stack.Peek();
                return true;
            }
            item = default(T);
            return false;
        }
    }

    public int Count
    {
        get
        {
            lock (_lock)
            {
                return _stack.Count;
            }
        }
    }
}

Chapter Summary

Stack<T> implements LIFO processing with O(1) performance for push, pop, and peek operations, making it essential for undo systems, expression evaluation, and any scenario involving nested or reversible operations. The key is recognizing when the "last in, first out" pattern matches your problem domain.

The key insight: Stacks excel when you need to reverse the order of operations or process nested structures, making complex hierarchical problems much simpler to solve.

Practice Exercise

Scenario: Build a simple arithmetic expression evaluator that can handle parentheses and basic operators (+, -, *, /).

Create a system that:

  1. Uses a stack to convert infix notation ("3 + 4 * 2") to postfix notation ("3 4 2 * +")
  2. Uses another stack to evaluate the postfix expression
  3. Handles nested parentheses correctly
  4. Provides meaningful error messages for invalid expressions
  5. Tracks the operation history for debugging purposes

For example:

  • Input: "2 * (3 + 4)" should evaluate to 14
  • Input: "((2 + 3) * 4" should report unmatched parentheses

Explain why stacks are ideal for this problem and how the LIFO principle helps manage the nested structure of mathematical expressions.


Chapter 6: HashSet & SortedSet: Unique Collections

Imagine you're building a social media platform tracking user interests, a gaming system preventing duplicate achievements, or an e-commerce site managing product categories. In each scenario, you need to ensure uniqueness while maintaining efficient lookups and operations. HashSet<T> and SortedSet<T> are C#'s implementations of mathematical sets, providing guaranteed uniqueness with different ordering and performance characteristics.

Sets solve the fundamental problem of duplicate prevention and set operations (union, intersection, difference) that are essential in data deduplication, relationship analysis, and mathematical computations in software systems.

Understanding Set Theory in Programming

Sets are mathematical collections where each element appears exactly once. In programming, this translates to:

  • Uniqueness: No duplicate values allowed
  • Efficient membership testing: Fast "contains" operations
  • Set operations: Union, intersection, difference, and subset testing
  • Unordered (HashSet) vs Ordered (SortedSet): Different performance trade-offs

HashSet<string> userInterests = new HashSet<string>();

// Adding items - duplicates are automatically ignored
userInterests.Add("programming");
userInterests.Add("gaming");
userInterests.Add("programming"); // Ignored - already exists

Console.WriteLine(userInterests.Count); // Output: 2 (not 3)

HashSet: Fast, Unordered Uniqueness

HashSet<T> uses a hash table implementation, giving O(1) average-case performance for its core operations.

Essential HashSet Operations

// Creating and adding items
HashSet<string> technologies = new HashSet<string>
{
    "C#", "JavaScript", "Python", "Go"
};

// Adding items (returns false if item already exists)
bool added = technologies.Add("Rust"); // true - new item
bool duplicate = technologies.Add("C#"); // false - already exists

// Membership testing
bool knowsCSharp = technologies.Contains("C#"); // O(1) average case

// Removing items
bool removed = technologies.Remove("Go"); // true if item existed

// Count and enumeration
Console.WriteLine($"Technologies: {technologies.Count}");
foreach (string tech in technologies)
{
    Console.WriteLine(tech); // Order is not guaranteed
}

Set Operations with HashSet

HashSet<string> backendSkills = new HashSet<string> { "C#", "Python", "SQL", "Docker" };
HashSet<string> frontendSkills = new HashSet<string> { "JavaScript", "React", "CSS", "HTML" };
HashSet<string> fullStackSkills = new HashSet<string> { "C#", "JavaScript", "React", "SQL" };

// Union - combine all unique items
HashSet<string> allSkills = new HashSet<string>(backendSkills);
allSkills.UnionWith(frontendSkills);
// Result: { "C#", "Python", "SQL", "Docker", "JavaScript", "React", "CSS", "HTML" }

// Intersection - find common items
HashSet<string> commonSkills = new HashSet<string>(backendSkills);
commonSkills.IntersectWith(fullStackSkills);
// Result: { "C#", "SQL" }

// Difference - items in first set but not in second
HashSet<string> backendOnly = new HashSet<string>(backendSkills);
backendOnly.ExceptWith(fullStackSkills);
// Result: { "Python", "Docker" }

// Symmetric difference - items in either set but not both
HashSet<string> uniqueSkills = new HashSet<string>(backendSkills);
uniqueSkills.SymmetricExceptWith(fullStackSkills);
// Result: { "Python", "Docker", "JavaScript", "React" }

Subset and Superset Operations

HashSet<string> webFrameworks = new HashSet<string> { "React", "Angular", "Vue" };
HashSet<string> jsFrameworks = new HashSet<string> { "React", "Angular", "Vue", "Express", "Node.js" };

// Subset testing
bool isSubset = webFrameworks.IsSubsetOf(jsFrameworks); // true
bool isProperSubset = webFrameworks.IsProperSubsetOf(jsFrameworks); // true

// Superset testing  
bool isSuperset = jsFrameworks.IsSupersetOf(webFrameworks); // true
bool isProperSuperset = jsFrameworks.IsProperSupersetOf(webFrameworks); // true

// Set equality
bool areEqual = webFrameworks.SetEquals(jsFrameworks); // false

// Overlap testing
bool hasOverlap = webFrameworks.Overlaps(jsFrameworks); // true

SortedSet: Ordered Uniqueness

SortedSet<T> maintains elements in sorted order using a self-balancing binary search tree (a red-black tree in .NET).

Essential SortedSet Operations

// Creating sorted sets
SortedSet<int> scores = new SortedSet<int> { 95, 87, 92, 78, 95, 89 };
// Automatically sorted and deduplicated: { 78, 87, 89, 92, 95 }

// Adding maintains sort order
scores.Add(85); // Inserted in correct position
// Set becomes: { 78, 85, 87, 89, 92, 95 }

// Range operations (not available in HashSet)
var topScores = scores.GetViewBetween(90, 100); // { 92, 95 }
var bottomScores = scores.GetViewBetween(70, 85); // { 78, 85 }

// Min and Max (O(log n))
int lowest = scores.Min;  // 78
int highest = scores.Max; // 95

// Ordered enumeration
foreach (int score in scores)
{
    Console.WriteLine(score); // Always prints in ascending order
}
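
One subtlety worth knowing: GetViewBetween returns a live view over the original set, not a copy, and writes through the view are restricted to its range. A small sketch:

SortedSet<int> values = new SortedSet<int> { 10, 20, 30 };
SortedSet<int> view = values.GetViewBetween(15, 35);

values.Add(25);                // Changes to the source are visible through the view
Console.WriteLine(view.Count); // 3 (20, 25, 30)

// view.Add(5); // Would throw ArgumentOutOfRangeException - outside the view's bounds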

Range and Navigation Operations

SortedSet<DateTime> eventDates = new SortedSet<DateTime>
{
    new DateTime(2024, 3, 15),
    new DateTime(2024, 1, 10),
    new DateTime(2024, 2, 20),
    new DateTime(2024, 4, 5)
};

// Find events in a date range
var q1Events = eventDates.GetViewBetween(
    new DateTime(2024, 1, 1), 
    new DateTime(2024, 3, 31)
);

// Reverse iteration
foreach (DateTime date in eventDates.Reverse())
{
    Console.WriteLine(date); // Newest to oldest
}

// Find first/last elements
DateTime firstEvent = eventDates.Min;
DateTime lastEvent = eventDates.Max;

Performance Characteristics Comparison

Operation       HashSet          SortedSet
Add             O(1) average     O(log n)
Remove          O(1) average     O(log n)
Contains        O(1) average     O(log n)
Enumeration     O(n), unordered  O(n), ordered
Min/Max         O(n)             O(log n)
Range queries   Not available    O(log n + k)

Real-World Application Patterns

User Permission Systems

public class UserPermissionManager
{
    private readonly Dictionary<int, HashSet<string>> _userPermissions = 
        new Dictionary<int, HashSet<string>>();

    public void GrantPermission(int userId, string permission)
    {
        if (!_userPermissions.ContainsKey(userId))
        {
            _userPermissions[userId] = new HashSet<string>();
        }

        _userPermissions[userId].Add(permission); // Automatically handles duplicates
    }

    public bool HasPermission(int userId, string permission)
    {
        return _userPermissions.TryGetValue(userId, out HashSet<string> permissions) &&
               permissions.Contains(permission); // O(1) lookup
    }

    public HashSet<string> GetCommonPermissions(int userId1, int userId2)
    {
        var user1Permissions = _userPermissions.GetValueOrDefault(userId1, new HashSet<string>());
        var user2Permissions = _userPermissions.GetValueOrDefault(userId2, new HashSet<string>());

        var common = new HashSet<string>(user1Permissions);
        common.IntersectWith(user2Permissions);
        return common;
    }

    public void InheritPermissions(int userId, int parentUserId)
    {
        if (!_userPermissions.ContainsKey(userId))
        {
            _userPermissions[userId] = new HashSet<string>();
        }

        if (_userPermissions.TryGetValue(parentUserId, out HashSet<string> parentPermissions))
        {
            _userPermissions[userId].UnionWith(parentPermissions); // Merge permissions
        }
    }
}

Gaming Achievement System

public class AchievementTracker
{
    private readonly Dictionary<string, HashSet<string>> _playerAchievements = 
        new Dictionary<string, HashSet<string>>();

    public bool UnlockAchievement(string playerId, string achievementId)
    {
        if (!_playerAchievements.ContainsKey(playerId))
        {
            _playerAchievements[playerId] = new HashSet<string>();
        }

        bool wasNewAchievement = _playerAchievements[playerId].Add(achievementId);

        if (wasNewAchievement)
        {
            Console.WriteLine($"🎉 {playerId} unlocked achievement: {achievementId}");
            CheckForMetaAchievements(playerId);
        }

        return wasNewAchievement;
    }

    public bool HasAchievement(string playerId, string achievementId)
    {
        return _playerAchievements.TryGetValue(playerId, out HashSet<string> achievements) &&
               achievements.Contains(achievementId);
    }

    public void CheckForMetaAchievements(string playerId)
    {
        if (!_playerAchievements.TryGetValue(playerId, out HashSet<string> achievements))
            return;

        // Meta-achievements based on achievement count
        if (achievements.Count >= 10 && !achievements.Contains("Achiever"))
        {
            UnlockAchievement(playerId, "Achiever");
        }

        if (achievements.Count >= 50 && !achievements.Contains("Completionist"))
        {
            UnlockAchievement(playerId, "Completionist");
        }

        // Check for achievement combinations
        HashSet<string> combatAchievements = new HashSet<string> 
        { 
            "FirstKill", "BossSlayer", "Warrior", "BattleMaster" 
        };

        if (combatAchievements.IsSubsetOf(achievements))
        {
            UnlockAchievement(playerId, "CombatExpert");
        }
    }
}

E-commerce Category Management

public class ProductCategoryManager
{
    private readonly Dictionary<string, HashSet<string>> _productCategories = 
        new Dictionary<string, HashSet<string>>();

    private readonly Dictionary<string, SortedSet<decimal>> _categoryPrices = 
        new Dictionary<string, SortedSet<decimal>>();

    public void AddProduct(string productId, HashSet<string> categories, decimal price)
    {
        _productCategories[productId] = new HashSet<string>(categories);

        // Track price ranges per category
        foreach (string category in categories)
        {
            if (!_categoryPrices.ContainsKey(category))
            {
                _categoryPrices[category] = new SortedSet<decimal>();
            }
            _categoryPrices[category].Add(price);
        }
    }

    public HashSet<string> FindSimilarProducts(string productId)
    {
        if (!_productCategories.TryGetValue(productId, out HashSet<string> categories))
        {
            return new HashSet<string>();
        }

        HashSet<string> similarProducts = new HashSet<string>();

        foreach (var kvp in _productCategories)
        {
            if (kvp.Key == productId) continue;

            // Products with overlapping categories are similar
            if (categories.Overlaps(kvp.Value))
            {
                similarProducts.Add(kvp.Key);
            }
        }

        return similarProducts;
    }

    public (decimal min, decimal max, decimal median) GetCategoryPriceStats(string category)
    {
        if (!_categoryPrices.TryGetValue(category, out SortedSet<decimal> prices) || 
            prices.Count == 0)
        {
            return (0, 0, 0);
        }

        decimal min = prices.Min;
        decimal max = prices.Max;

        // Median calculation using sorted order
        var priceArray = prices.ToArray();
        decimal median = priceArray.Length % 2 == 0
            ? (priceArray[priceArray.Length / 2 - 1] + priceArray[priceArray.Length / 2]) / 2
            : priceArray[priceArray.Length / 2];

        return (min, max, median);
    }

    public SortedSet<decimal> GetPricesInRange(string category, decimal minPrice, decimal maxPrice)
    {
        if (_categoryPrices.TryGetValue(category, out SortedSet<decimal> prices))
        {
            return prices.GetViewBetween(minPrice, maxPrice);
        }
        return new SortedSet<decimal>();
    }
}

Data Deduplication System

public class DataDeduplicationProcessor
{
    public List<T> RemoveDuplicates<T>(List<T> data)
    {
        // Simple deduplication while preserving first occurrence
        HashSet<T> seen = new HashSet<T>();
        List<T> result = new List<T>();

        foreach (T item in data)
        {
            if (seen.Add(item)) // Add returns false if item already exists
            {
                result.Add(item);
            }
        }

        return result;
    }

    public void ProcessLogFiles(List<string> logLines)
    {
        HashSet<string> uniqueErrors = new HashSet<string>();
        HashSet<string> uniqueUsers = new HashSet<string>();
        SortedSet<DateTime> errorTimes = new SortedSet<DateTime>();

        foreach (string line in logLines)
        {
            var logEntry = ParseLogLine(line); // ParseLogLine and its log-entry type are assumed to exist elsewhere

            if (logEntry.Level == "ERROR")
            {
                uniqueErrors.Add(logEntry.Message);
                errorTimes.Add(logEntry.Timestamp);
            }

            if (!string.IsNullOrEmpty(logEntry.UserId))
            {
                uniqueUsers.Add(logEntry.UserId);
            }
        }

        Console.WriteLine($"Unique errors: {uniqueErrors.Count}");
        Console.WriteLine($"Unique users: {uniqueUsers.Count}");
        Console.WriteLine($"Error time range: {errorTimes.Min} to {errorTimes.Max}");

        // Find errors in the last hour
        DateTime oneHourAgo = DateTime.Now.AddHours(-1);
        var recentErrors = errorTimes.GetViewBetween(oneHourAgo, DateTime.Now);
        Console.WriteLine($"Recent errors: {recentErrors.Count}");
    }
}
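
For one-off deduplication, LINQ's Distinct() gives the same result with less ceremony; it uses set semantics internally and, in practice, preserves first-occurrence order (though the documentation leaves ordering unspecified):

using System.Collections.Generic;
using System.Linq;

List<int> readings = new List<int> { 1, 2, 2, 3, 1 };
List<int> unique = readings.Distinct().ToList(); // { 1, 2, 3 }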

Good Practices vs Bad Practices

✅ Good Practice: Use Sets for Uniqueness Requirements

// Automatically handles duplicates
HashSet<string> subscribedEmails = new HashSet<string>();
subscribedEmails.Add("user@example.com"); // Added
subscribedEmails.Add("user@example.com"); // Ignored (duplicate)

❌ Bad Practice: Manual Duplicate Checking with Lists

// Inefficient and error-prone
List<string> emails = new List<string>();
if (!emails.Contains("user@example.com")) // O(n) operation
{
    emails.Add("user@example.com");
}

✅ Good Practice: Choose HashSet vs SortedSet Based on Requirements

// Use HashSet for fast lookups without ordering
HashSet<int> activeUserIds = new HashSet<int>(); // O(1) contains

// Use SortedSet when ordering matters
SortedSet<DateTime> scheduledTasks = new SortedSet<DateTime>(); // Chronological order

❌ Bad Practice: Ignoring Performance Characteristics

// Inefficient for frequent min/max operations
HashSet<int> scores = new HashSet<int>();
int maxScore = scores.Max(); // LINQ extension - O(n) scan every time

// Better: Use SortedSet for frequent min/max
SortedSet<int> sortedScores = new SortedSet<int>();
int maxSortedScore = sortedScores.Max; // Tree-backed property - O(log n)

✅ Good Practice: Leverage Set Operations for Complex Logic

// Clean, expressive code using set operations
HashSet<string> requiredSkills = new HashSet<string> { "C#", "SQL", "REST" };
HashSet<string> candidateSkills = new HashSet<string> { "C#", "JavaScript", "SQL" };

bool isQualified = requiredSkills.IsSubsetOf(candidateSkills);

❌ Bad Practice: Implementing Set Operations Manually

// Verbose and potentially buggy
List<string> required = new List<string> { "C#", "SQL", "REST" };
List<string> candidate = new List<string> { "C#", "JavaScript", "SQL" };

bool isQualified = required.All(skill => candidate.Contains(skill)); // Less efficient

Thread Safety Considerations

Neither HashSet<T> nor SortedSet<T> is thread-safe by default:

// Thread-safe wrapper
private readonly HashSet<string> _threadSafeSet = new HashSet<string>();
private readonly object _setLock = new object();

public bool ThreadSafeAdd(string item)
{
    lock (_setLock)
    {
        return _threadSafeSet.Add(item);
    }
}

// For high-concurrency scenarios, consider ConcurrentDictionary with dummy values
// or custom thread-safe implementations
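
Here's a minimal sketch of that ConcurrentDictionary approach, using byte as a throwaway value type to mimic a concurrent set:

using System.Collections.Concurrent;

ConcurrentDictionary<string, byte> concurrentSet = new ConcurrentDictionary<string, byte>();

bool added = concurrentSet.TryAdd("user@example.com", 0);      // true - behaves like Add
bool duplicate = concurrentSet.TryAdd("user@example.com", 0);  // false - already present
bool contains = concurrentSet.ContainsKey("user@example.com"); // O(1) membership test
bool removed = concurrentSet.TryRemove("user@example.com", out _);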

Custom Equality and Comparison

Custom HashSet with Complex Objects

public class Product : IEquatable<Product>
{
    public string SKU { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }

    // Define equality based on SKU only
    public bool Equals(Product other)
    {
        return other != null && SKU == other.SKU;
    }

    public override bool Equals(object obj) => Equals(obj as Product);

    public override int GetHashCode() => SKU?.GetHashCode() ?? 0;
}

// HashSet will use custom equality
HashSet<Product> uniqueProducts = new HashSet<Product>();
uniqueProducts.Add(new Product { SKU = "ABC123", Name = "Widget", Price = 10.00m });
uniqueProducts.Add(new Product { SKU = "ABC123", Name = "Different Name", Price = 15.00m }); 
// Second item ignored - same SKU
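
When you can't (or don't want to) modify the element type, both set types also accept comparers at construction. A sketch with an illustrative ProductSkuComparer:

// Equality comparer keeps the HashSet uniqueness logic outside the Product class
public class ProductSkuComparer : IEqualityComparer<Product>
{
    public bool Equals(Product x, Product y) => x?.SKU == y?.SKU;
    public int GetHashCode(Product obj) => obj.SKU?.GetHashCode() ?? 0;
}

HashSet<Product> bySku = new HashSet<Product>(new ProductSkuComparer());

// SortedSet takes an IComparer<T> to define its ordering
// Note: products with equal prices compare as duplicates under this ordering
SortedSet<Product> byPrice = new SortedSet<Product>(
    Comparer<Product>.Create((a, b) => a.Price.CompareTo(b.Price)));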

Chapter Summary

HashSet<T> and SortedSet<T> provide guaranteed uniqueness with different performance and ordering characteristics. HashSet excels at fast membership testing and set operations, while SortedSet adds ordering and range query capabilities at the cost of logarithmic operations. Understanding set theory and choosing the right implementation based on ordering and performance requirements is crucial for building efficient deduplication and relationship analysis systems.

The key insight: Sets eliminate the complexity of manual duplicate management while providing powerful mathematical operations that make complex data relationships simple to express and compute.

Practice Exercise

Scenario: Build a social media recommendation system that suggests new connections based on mutual friends and shared interests.

Create classes for User (with Id, Name, Interests) and implement:

  1. A friend recommendation system using set intersections to find users with mutual friends
  2. Interest-based recommendations using set overlaps
  3. A "degrees of separation" calculator using set operations
  4. Popular interests tracking across all users (using SortedSet for ranking)
  5. Friend suggestion filtering (exclude existing friends and blocked users)

Use both HashSet and SortedSet appropriately and explain:

  • Why sets are better than lists for this domain
  • When you chose HashSet vs SortedSet and why
  • How set operations simplify the recommendation logic
  • Performance implications of your design choices

Consider edge cases like users with no friends, empty interest lists, and circular relationships.


Chapter 7: LinkedList: When Order and Insertion Matter

Picture yourself building a music playlist where users frequently add songs at specific positions, a browser's navigation history that allows insertion of bookmarks, or a real-time chat system where messages need to be inserted and removed from different positions efficiently. While List<T> handles most dynamic collection scenarios, LinkedList<T> excels when you need frequent insertions and deletions at arbitrary positions without the costly element shifting that arrays require.

LinkedList represents a different approach to sequential data storage, trading array-like index access for superior insertion/deletion performance at any position. Understanding when this trade-off makes sense is key to building efficient systems that handle complex data manipulation patterns.

Understanding Doubly Linked Lists

A LinkedList<T> in C# is implemented as a doubly linked list, where each element (node) contains:

  • Value: The actual data
  • Next: Reference to the next node
  • Previous: Reference to the previous node

This structure allows traversal in both directions and enables O(1) insertion/deletion when you have a reference to a node.

// Conceptual structure:
// [prev|data|next] <-> [prev|data|next] <-> [prev|data|next]
//  Node A              Node B              Node C

LinkedList<string> playlist = new LinkedList<string>();
var songNode = playlist.AddLast("Song Title");
// songNode.Value = "Song Title"
// songNode.Next = reference to next song (or null)
// songNode.Previous = reference to previous song (or null)

Essential LinkedList Operations


Adding Elements

LinkedList<string> tasks = new LinkedList<string>();

// Add to ends - O(1)
LinkedListNode<string> firstTask = tasks.AddFirst("Morning standup");
LinkedListNode<string> lastTask = tasks.AddLast("End of day wrap-up");

// Add relative to existing nodes - O(1)
LinkedListNode<string> reviewTask = tasks.AddAfter(firstTask, "Code review");
LinkedListNode<string> planningTask = tasks.AddBefore(lastTask, "Sprint planning");

// Current order: [Morning standup] -> [Code review] -> [Sprint planning] -> [End of day wrap-up]

Removing Elements

// Remove specific nodes - O(1)
tasks.Remove(reviewTask); // Removes "Code review" node

// Remove by value - O(n) - must search first
bool removed = tasks.Remove("Sprint planning");

// Remove from ends - O(1)
tasks.RemoveFirst(); // Remove "Morning standup"  
tasks.RemoveLast();  // Remove "End of day wrap-up"

// Clear all elements
tasks.Clear();

Navigating and Accessing Elements

LinkedList<int> numbers = new LinkedList<int>(new[] { 1, 2, 3, 4, 5 });

// Access first and last nodes
LinkedListNode<int> first = numbers.First; // Node containing 1
LinkedListNode<int> last = numbers.Last;   // Node containing 5

// Navigate through nodes
LinkedListNode<int> current = numbers.First;
while (current != null)
{
    Console.WriteLine(current.Value);
    current = current.Next; // Move to next node
}

// Reverse navigation
current = numbers.Last;
while (current != null)
{
    Console.WriteLine(current.Value);
    current = current.Previous; // Move to previous node
}

Performance Characteristics

Understanding LinkedList performance helps you choose it appropriately:

  • Add/Remove at ends: O(1) - Direct access to first/last nodes
  • Add/Remove at known node: O(1) - No element shifting required
  • Add/Remove by value: O(n) - Must search for the node first
  • Access by index: O(n) - No random access, must traverse
  • Contains: O(n) - Must search through nodes
  • Count: O(1) - Maintained internally

Real-World Application Patterns

Music Playlist Manager

public class MusicPlaylist
{
    private readonly LinkedList<Song> _playlist = new LinkedList<Song>();
    private LinkedListNode<Song> _currentSong = null;

    public void AddSong(Song song)
    {
        _playlist.AddLast(song);
        _currentSong ??= _playlist.First; // Set first song as current if none set
    }

    public void InsertSongAfterCurrent(Song song)
    {
        if (_currentSong != null)
        {
            _playlist.AddAfter(_currentSong, song);
        }
        else
        {
            AddSong(song);
        }
    }

    public Song NextSong()
    {
        if (_currentSong?.Next != null)
        {
            _currentSong = _currentSong.Next;
            return _currentSong.Value;
        }

        // Loop to beginning if at end
        _currentSong = _playlist.First;
        return _currentSong?.Value;
    }

    public Song PreviousSong()
    {
        if (_currentSong?.Previous != null)
        {
            _currentSong = _currentSong.Previous;
            return _currentSong.Value;
        }

        // Loop to end if at beginning
        _currentSong = _playlist.Last;
        return _currentSong?.Value;
    }

    public bool RemoveCurrentSong()
    {
        if (_currentSong == null) return false;

        LinkedListNode<Song> nextCurrent = _currentSong.Next ?? _currentSong.Previous;
        _playlist.Remove(_currentSong);
        _currentSong = nextCurrent;

        return true;
    }

    public void MoveSongUp(Song song)
    {
        LinkedListNode<Song> node = FindNode(song);
        if (node?.Previous != null)
        {
            LinkedListNode<Song> previous = node.Previous; // Capture before Remove clears the node's links
            _playlist.Remove(node);
            _playlist.AddBefore(previous, song);
        }
    }

    private LinkedListNode<Song> FindNode(Song song)
    {
        LinkedListNode<Song> current = _playlist.First;
        while (current != null)
        {
            if (current.Value.Equals(song))
                return current;
            current = current.Next;
        }
        return null;
    }
}

Browser Navigation History

public class BrowserHistory
{
    private readonly LinkedList<string> _history = new LinkedList<string>();
    private LinkedListNode<string> _currentPage = null;
    private const int MAX_HISTORY_SIZE = 100;

    public void NavigateTo(string url)
    {
        // Remove any forward history when navigating to new page
        if (_currentPage != null)
        {
            LinkedListNode<string> nodeToRemove = _currentPage.Next;
            while (nodeToRemove != null)
            {
                LinkedListNode<string> next = nodeToRemove.Next;
                _history.Remove(nodeToRemove);
                nodeToRemove = next;
            }
        }

        // Add new page
        if (_currentPage != null)
        {
            _currentPage = _history.AddAfter(_currentPage, url);
        }
        else
        {
            _currentPage = _history.AddFirst(url);
        }

        // Maintain maximum history size
        while (_history.Count > MAX_HISTORY_SIZE)
        {
            _history.RemoveFirst();
        }

        Console.WriteLine($"Navigated to: {url}");
    }

    public string GoBack()
    {
        if (_currentPage?.Previous != null)
        {
            _currentPage = _currentPage.Previous;
            Console.WriteLine($"Went back to: {_currentPage.Value}");
            return _currentPage.Value;
        }

        Console.WriteLine("Cannot go back - at beginning of history");
        return _currentPage?.Value;
    }

    public string GoForward()
    {
        if (_currentPage?.Next != null)
        {
            _currentPage = _currentPage.Next;
            Console.WriteLine($"Went forward to: {_currentPage.Value}");
            return _currentPage.Value;
        }

        Console.WriteLine("Cannot go forward - at end of history");
        return _currentPage?.Value;
    }

    public bool CanGoBack => _currentPage?.Previous != null;
    public bool CanGoForward => _currentPage?.Next != null;
    public string CurrentUrl => _currentPage?.Value;

    public void InsertBookmark(string bookmarkUrl)
    {
        if (_currentPage != null)
        {
            _history.AddAfter(_currentPage, bookmarkUrl);
            Console.WriteLine($"Inserted bookmark: {bookmarkUrl}");
        }
    }
}

Real-Time Chat Message Buffer

public class ChatMessageBuffer
{
    private readonly LinkedList<ChatMessage> _messages = new LinkedList<ChatMessage>();
    private const int MAX_BUFFER_SIZE = 1000;
    private readonly object _lock = new object();

    public void AddMessage(ChatMessage message)
    {
        lock (_lock)
        {
            _messages.AddLast(message);

            // Remove old messages if buffer is full
            while (_messages.Count > MAX_BUFFER_SIZE)
            {
                _messages.RemoveFirst();
            }
        }
    }

    public void InsertSystemMessage(string content, LinkedListNode<ChatMessage> afterMessage)
    {
        lock (_lock)
        {
            var systemMessage = new ChatMessage
            {
                Content = content,
                Timestamp = DateTime.Now,
                IsSystemMessage = true
            };

            if (afterMessage != null && afterMessage.List == _messages) // Node must belong to this buffer's list
            {
                _messages.AddAfter(afterMessage, systemMessage);
            }
            else
            {
                _messages.AddLast(systemMessage);
            }
        }
    }

    public bool DeleteMessage(ChatMessage message)
    {
        lock (_lock)
        {
            LinkedListNode<ChatMessage> node = FindMessageNode(message);
            if (node != null)
            {
                _messages.Remove(node);
                return true;
            }
            return false;
        }
    }

    public List<ChatMessage> GetRecentMessages(int count)
    {
        lock (_lock)
        {
            List<ChatMessage> recent = new List<ChatMessage>();
            LinkedListNode<ChatMessage> current = _messages.Last;

            while (current != null && recent.Count < count)
            {
                recent.Insert(0, current.Value); // Insert at beginning for chronological order
                current = current.Previous;
            }

            return recent;
        }
    }

    public ChatMessage GetMessageContext(ChatMessage targetMessage, int contextBefore, int contextAfter)
    {
        lock (_lock)
        {
            LinkedListNode<ChatMessage> targetNode = FindMessageNode(targetMessage);
            if (targetNode == null) return null;

            List<ChatMessage> context = new List<ChatMessage>();

            // Get messages before
            LinkedListNode<ChatMessage> current = targetNode;
            for (int i = 0; i < contextBefore && current.Previous != null; i++)
            {
                current = current.Previous;
                context.Insert(0, current.Value);
            }

            // Add target message
            context.Add(targetMessage);

            // Get messages after
            current = targetNode;
            for (int i = 0; i < contextAfter && current.Next != null; i++)
            {
                current = current.Next;
                context.Add(current.Value);
            }

            return new ChatMessage
            {
                Content = string.Join("\n", context.Select(m => m.Content)),
                Timestamp = targetMessage.Timestamp,
                IsSystemMessage = true
            };
        }
    }

    private LinkedListNode<ChatMessage> FindMessageNode(ChatMessage message)
    {
        LinkedListNode<ChatMessage> current = _messages.First;
        while (current != null)
        {
            if (current.Value.Id == message.Id)
                return current;
            current = current.Next;
        }
        return null;
    }
}

Task Processing with Priority Insertions

public class PriorityTaskProcessor
{
    private readonly LinkedList<ProcessingTask> _taskQueue = new LinkedList<ProcessingTask>();
    private readonly object _lock = new object();

    public void AddTask(ProcessingTask task)
    {
        lock (_lock)
        {
            if (task.Priority == TaskPriority.Urgent)
            {
                InsertByPriority(task);
            }
            else
            {
                _taskQueue.AddLast(task); // Normal tasks go to end
            }
        }
    }

    private void InsertByPriority(ProcessingTask urgentTask)
    {
        // Insert urgent task before first non-urgent task
        LinkedListNode<ProcessingTask> current = _taskQueue.First;

        while (current != null)
        {
            if (current.Value.Priority != TaskPriority.Urgent)
            {
                _taskQueue.AddBefore(current, urgentTask);
                return;
            }
            current = current.Next;
        }

        // All tasks are urgent or queue is empty
        _taskQueue.AddLast(urgentTask);
    }

    public ProcessingTask GetNextTask()
    {
        lock (_lock)
        {
            if (_taskQueue.Count > 0)
            {
                var task = _taskQueue.First.Value;
                _taskQueue.RemoveFirst();
                return task;
            }
            return null;
        }
    }

    public bool CancelTask(ProcessingTask task)
    {
        lock (_lock)
        {
            LinkedListNode<ProcessingTask> node = FindTaskNode(task);
            if (node != null)
            {
                _taskQueue.Remove(node);
                return true;
            }
            return false;
        }
    }

    public void MoveCriticalTasksToFront(Predicate<ProcessingTask> criticalCondition)
    {
        lock (_lock)
        {
            List<ProcessingTask> criticalTasks = new List<ProcessingTask>();
            LinkedListNode<ProcessingTask> current = _taskQueue.First;

            // Find all critical tasks
            while (current != null)
            {
                LinkedListNode<ProcessingTask> next = current.Next;
                if (criticalCondition(current.Value))
                {
                    criticalTasks.Add(current.Value);
                    _taskQueue.Remove(current);
                }
                current = next;
            }

            // Add critical tasks to front
            for (int i = criticalTasks.Count - 1; i >= 0; i--)
            {
                _taskQueue.AddFirst(criticalTasks[i]);
            }
        }
    }

    private LinkedListNode<ProcessingTask> FindTaskNode(ProcessingTask task)
    {
        LinkedListNode<ProcessingTask> current = _taskQueue.First;
        while (current != null)
        {
            if (current.Value.Id == task.Id)
                return current;
            current = current.Next;
        }
        return null;
    }
}

Good Practices vs Bad Practices

✅ Good Practice: Use LinkedList for Frequent Middle Insertions

// Excellent use case - inserting items in the middle frequently
LinkedList<LogEntry> auditLog = new LinkedList<LogEntry>();
var currentEntry = auditLog.AddLast(new LogEntry("Operation started"));
auditLog.AddAfter(currentEntry, new LogEntry("Validation passed")); // O(1) insertion

❌ Bad Practice: Using LinkedList for Index-Based Access

// Inefficient - no random access in linked lists
LinkedList<string> items = new LinkedList<string>();
// This is O(n) operation - avoid!
string fifthItem = items.ElementAt(4); // Must traverse from beginning
// Use List<T> instead for index-based access

✅ Good Practice: Keep References to Nodes You'll Modify

// Efficient - store node references for O(1) operations
Dictionary<string, LinkedListNode<Task>> taskNodes = new Dictionary<string, LinkedListNode<Task>>();
var taskNode = taskQueue.AddLast(newTask);
taskNodes[newTask.Id] = taskNode; // Store reference

// Later - O(1) removal
if (taskNodes.TryGetValue(taskId, out var nodeToRemove))
{
    taskQueue.Remove(nodeToRemove);
    taskNodes.Remove(taskId);
}

❌ Bad Practice: Searching for Nodes Repeatedly

// Inefficient - O(n) search every time
LinkedList<Order> orders = new LinkedList<Order>();
var orderToUpdate = orders.FirstOrDefault(o => o.Id == targetId); // Linear search
orders.Remove(orderToUpdate); // Another O(n) operation internally

✅ Good Practice: Use LinkedList for LRU (Least Recently Used) Caches

// Perfect use case - frequent head/tail operations
public class LRUCache<TKey, TValue>
{
    private readonly Dictionary<TKey, LinkedListNode<CacheItem<TKey, TValue>>> _cache;
    private readonly LinkedList<CacheItem<TKey, TValue>> _lruList;
    private readonly int _capacity;

    public LRUCache(int capacity)
    {
        _capacity = capacity;
        _cache = new Dictionary<TKey, LinkedListNode<CacheItem<TKey, TValue>>>(capacity);
        _lruList = new LinkedList<CacheItem<TKey, TValue>>();
    }

    public TValue Get(TKey key)
    {
        if (_cache.TryGetValue(key, out var node))
        {
            // Move to front (most recently used)
            _lruList.Remove(node);
            _lruList.AddFirst(node);
            return node.Value.Value;
        }
        throw new KeyNotFoundException();
    }

    public void Put(TKey key, TValue value)
    {
        if (_cache.TryGetValue(key, out var existingNode))
        {
            // Update existing
            existingNode.Value.Value = value;
            _lruList.Remove(existingNode);
            _lruList.AddFirst(existingNode);
        }
        else
        {
            // Add new
            if (_cache.Count >= _capacity)
            {
                // Remove least recently used (last item)
                var lru = _lruList.Last;
                _lruList.RemoveLast();
                _cache.Remove(lru.Value.Key);
            }

            var newNode = _lruList.AddFirst(new CacheItem<TKey, TValue>(key, value));
            _cache[key] = newNode;
        }
    }
}

// Simple mutable holder for the key/value pairs stored in the LRU list
public class CacheItem<TKey, TValue>
{
    public TKey Key { get; }
    public TValue Value { get; set; }

    public CacheItem(TKey key, TValue value)
    {
        Key = key;
        Value = value;
    }
}
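
A quick usage sketch, assuming the constructor sketched above:

var cache = new LRUCache<string, int>(2);
cache.Put("a", 1);
cache.Put("b", 2);

int a = cache.Get("a"); // "a" becomes most recently used
cache.Put("c", 3);      // Evicts "b", the least recently used entry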

❌ Bad Practice: Using LinkedList When Order Doesn't Matter

// Unnecessary complexity if order doesn't matter
LinkedList<string> uniqueUsers = new LinkedList<string>();
// Better: Use HashSet<string> for uniqueness without ordering requirements

Thread Safety and Performance Considerations

LinkedList is not thread-safe by default. For concurrent scenarios:

public class ThreadSafeLinkedList<T>
{
    private readonly LinkedList<T> _list = new LinkedList<T>();
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    public LinkedListNode<T> AddFirst(T value)
    {
        _lock.EnterWriteLock();
        try
        {
            return _list.AddFirst(value);
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public bool Remove(T value)
    {
        _lock.EnterWriteLock();
        try
        {
            return _list.Remove(value);
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public bool Contains(T value)
    {
        _lock.EnterReadLock();
        try
        {
            return _list.Contains(value);
        }
        finally
        {
            _lock.ExitReadLock();
        }
    }
}

When to Choose LinkedList vs Other Collections

Choose LinkedList When:

  • Frequent insertions/deletions at arbitrary positions
  • You maintain references to nodes for O(1) operations
  • Sequential access patterns (not random access)
  • Implementing specialized data structures (LRU caches, undo systems)

Avoid LinkedList When:

  • You need index-based access frequently
  • Memory usage is a primary concern (higher per-element overhead)
  • Random access patterns are common
  • Simple append-only operations (List is simpler)

Memory and Performance Implications

// Memory comparison (approximate, 64-bit):
// List<T>: array-based - one reference (8 bytes) per element plus the backing array's overhead
// LinkedList<T>: node-based - each node stores list/next/previous references plus an object
//                header, roughly 40-50 bytes of overhead per element

// Performance patterns:
List<int> list = new List<int>();
LinkedList<int> linkedList = new LinkedList<int>();

// Adding 1 million items:
// List<T>.Add() - Average O(1), occasional O(n) for resize
// LinkedList<T>.AddLast() - Always O(1)

// Accessing middle element:
// List<T>[500000] - O(1) direct access  
// LinkedList<T>.ElementAt(500000) - O(n) traversal

Chapter Summary

LinkedList<T> excels in scenarios requiring frequent insertions and deletions at arbitrary positions, offering O(1) performance for node-based operations when you maintain node references. While it trades random access performance for insertion/deletion efficiency, it's invaluable for implementing sophisticated data structures like LRU caches, navigation histories, and real-time data processing systems where ordering and flexible insertion patterns matter more than index-based access.

The key insight: LinkedList shines when you need to frequently modify the structure of your collection rather than just accessing its contents, making it perfect for dynamic, order-sensitive data manipulation scenarios.

Practice Exercise

Scenario: Build a collaborative text editor that supports real-time editing with conflict resolution.

Create a system that:

  1. Represents text as a LinkedList of characters or text chunks
  2. Supports insertions at arbitrary positions (cursor movements)
  3. Implements an undo/redo system for text operations
  4. Handles multiple user cursors simultaneously
  5. Provides efficient text search and replace functionality
  6. Maintains a revision history with the ability to branch edits

Requirements:

  • Support operations like "insert text at position", "delete range", "move text block"
  • Track multiple user cursors and ensure they update correctly after edits
  • Implement collaborative editing where edits from different users are merged
  • Provide performance analysis comparing your LinkedList approach vs a simple string-based approach

Explain why LinkedList is or isn't the right choice for this scenario, and identify the trade-offs between LinkedList and other collection types for text editing operations.


Chapter 8: ObservableCollection & Modern .NET Collections

Modern applications aren't just about storing and manipulating data—they're about creating responsive, reactive user interfaces that automatically update when data changes. Picture building a real-time dashboard, a shopping cart that updates totals instantly, or a social media feed that adds new posts as they arrive. ObservableCollection<T> and modern .NET reactive collections provide the foundation for building these dynamic, event-driven applications.

This chapter explores how collections can participate in the broader application ecosystem through data binding, change notifications, and reactive programming patterns that make modern UI development seamless and maintainable.

Understanding ObservableCollection Fundamentals

ObservableCollection<T> is essentially a dynamic list (it derives from Collection<T>) that implements INotifyCollectionChanged and INotifyPropertyChanged. This means it automatically notifies subscribers when items are added, removed, or replaced, or when the collection is reset.

using System.Collections.ObjectModel;
using System.Collections.Specialized;

ObservableCollection<string> todoItems = new ObservableCollection<string>();

// Subscribe to collection changes
todoItems.CollectionChanged += (sender, e) =>
{
    switch (e.Action)
    {
        case NotifyCollectionChangedAction.Add:
            Console.WriteLine($"Added: {e.NewItems[0]}");
            break;
        case NotifyCollectionChangedAction.Remove:
            Console.WriteLine($"Removed: {e.OldItems[0]}");
            break;
        case NotifyCollectionChangedAction.Replace:
            Console.WriteLine($"Replaced: {e.OldItems[0]} with {e.NewItems[0]}");
            break;
        case NotifyCollectionChangedAction.Reset:
            Console.WriteLine("Collection was cleared");
            break;
    }
};

todoItems.Add("Buy groceries");    // Triggers Add notification
todoItems.Remove("Buy groceries"); // Triggers Remove notification
todoItems.Clear();                 // Triggers Reset notification
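
ObservableCollection also supports an in-place Move that raises a single Move notification instead of a Remove/Add pair:

ObservableCollection<string> steps = new ObservableCollection<string> { "a", "b", "c" };

steps.CollectionChanged += (s, e) =>
{
    if (e.Action == NotifyCollectionChangedAction.Move)
    {
        Console.WriteLine($"Moved '{e.NewItems[0]}' from {e.OldStartingIndex} to {e.NewStartingIndex}");
    }
};

steps.Move(0, 2); // Moved 'a' from 0 to 2; collection is now b, c, a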

Essential ObservableCollection Operations

Basic Collection Operations with Notifications

public class TodoListManager
{
    public ObservableCollection<TodoItem> TodoItems { get; } = new ObservableCollection<TodoItem>();
    public ObservableCollection<string> Categories { get; } = new ObservableCollection<string>();

    public TodoListManager()
    {
        // Monitor changes for automatic UI updates
        TodoItems.CollectionChanged += OnTodoItemsChanged;
        Categories.CollectionChanged += OnCategoriesChanged;
    }

    public void AddTodoItem(string title, string category)
    {
        var todoItem = new TodoItem
        {
            Id = Guid.NewGuid(),
            Title = title,
            Category = category,
            CreatedAt = DateTime.Now,
            IsCompleted = false
        };

        TodoItems.Add(todoItem); // Automatically notifies UI

        // Add category if it doesn't exist
        if (!Categories.Contains(category))
        {
            Categories.Add(category); // Also notifies UI
        }
    }

    public void MarkAsCompleted(Guid todoId)
    {
        var item = TodoItems.FirstOrDefault(t => t.Id == todoId);
        if (item != null)
        {
            item.IsCompleted = true; // If TodoItem implements INotifyPropertyChanged
            // The collection itself doesn't change, but the item does
        }
    }

    public void RemoveCompletedItems()
    {
        // Remove in reverse order to avoid index issues
        for (int i = TodoItems.Count - 1; i >= 0; i--)
        {
            if (TodoItems[i].IsCompleted)
            {
                TodoItems.RemoveAt(i); // Each removal triggers notification
            }
        }
    }

    private void OnTodoItemsChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        switch (e.Action)
        {
            case NotifyCollectionChangedAction.Add:
                Console.WriteLine($"Added {e.NewItems.Count} todo item(s)");
                UpdateStatistics();
                break;
            case NotifyCollectionChangedAction.Remove:
                Console.WriteLine($"Removed {e.OldItems.Count} todo item(s)");
                UpdateStatistics();
                break;
        }
    }

    private void OnCategoriesChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        Console.WriteLine($"Categories updated. Total: {Categories.Count}");
    }

    private void UpdateStatistics()
    {
        int total = TodoItems.Count;
        int completed = TodoItems.Count(t => t.IsCompleted);
        Console.WriteLine($"Progress: {completed}/{total} completed");
    }
}
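
The manager above relies on TodoItem raising its own PropertyChanged events for item-level updates like MarkAsCompleted. A minimal sketch of such a model (the property set is illustrative):

using System;
using System.ComponentModel;
using System.Runtime.CompilerServices;

public class TodoItem : INotifyPropertyChanged
{
    public Guid Id { get; set; }
    public string Title { get; set; }
    public string Category { get; set; }
    public DateTime CreatedAt { get; set; }

    private bool _isCompleted;
    public bool IsCompleted
    {
        get => _isCompleted;
        set
        {
            if (_isCompleted == value) return;
            _isCompleted = value;
            OnPropertyChanged(); // Item-level change, distinct from collection-level changes
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;
    protected void OnPropertyChanged([CallerMemberName] string propertyName = null)
    {
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
    }
}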

Real-World Application Patterns

WPF/XAML Data Binding

// ViewModel for WPF application
public class ProductCatalogViewModel : INotifyPropertyChanged
{
    public ObservableCollection<Product> Products { get; } = new ObservableCollection<Product>();
    public ObservableCollection<Product> FilteredProducts { get; } = new ObservableCollection<Product>();

    private string _searchTerm = "";
    public string SearchTerm
    {
        get => _searchTerm;
        set
        {
            _searchTerm = value;
            OnPropertyChanged();
            FilterProducts(); // Update filtered collection when search changes
        }
    }

    private decimal _maxPrice = 1000m;
    public decimal MaxPrice
    {
        get => _maxPrice;
        set
        {
            _maxPrice = value;
            OnPropertyChanged();
            FilterProducts();
        }
    }

    public ProductCatalogViewModel()
    {
        // When the main collection changes, update filtered collection
        Products.CollectionChanged += (s, e) => FilterProducts();
        LoadProducts();
    }

    private void FilterProducts()
    {
        var filtered = Products
            .Where(p => p.Name.Contains(SearchTerm, StringComparison.OrdinalIgnoreCase))
            .Where(p => p.Price <= MaxPrice)
            .OrderBy(p => p.Name);

        // Update ObservableCollection while minimizing change notifications
        FilteredProducts.Clear();
        foreach (var product in filtered)
        {
            FilteredProducts.Add(product);
        }
    }

    private async void LoadProducts()
    {
        // Simulate loading from an API; productService is assumed to be injected elsewhere
        var products = await productService.GetProductsAsync();

        foreach (var product in products)
        {
            Products.Add(product); // Each addition triggers UI update
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;
    protected void OnPropertyChanged([CallerMemberName] string propertyName = null)
    {
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
    }
}

// XAML binding (automatic UI updates)
/*
<ListBox ItemsSource="{Binding FilteredProducts}">
    <ListBox.ItemTemplate>
        <DataTemplate>
            <StackPanel>
                <TextBlock Text="{Binding Name}" FontWeight="Bold"/>
                <TextBlock Text="{Binding Price, StringFormat=C}"/>
            </StackPanel>
        </DataTemplate>
    </ListBox.ItemTemplate>
</ListBox>

<TextBox Text="{Binding SearchTerm, UpdateSourceTrigger=PropertyChanged}" 
         PlaceholderText="Search products..."/>
<Slider Value="{Binding MaxPrice}" Maximum="1000" Minimum="0"/>
*/

Real-Time Dashboard with Live Data

public class LiveDashboardManager
{
    public ObservableCollection<MetricData> RealtimeMetrics { get; } = new ObservableCollection<MetricData>();
    public ObservableCollection<AlertMessage> ActiveAlerts { get; } = new ObservableCollection<AlertMessage>();

    private readonly Timer _updateTimer;
    private readonly Queue<MetricData> _metricBuffer = new Queue<MetricData>();

    public LiveDashboardManager()
    {
        // Update dashboard every second
        _updateTimer = new Timer(UpdateDashboard, null, TimeSpan.Zero, TimeSpan.FromSeconds(1));

        // Keep only last 100 metrics for performance
        RealtimeMetrics.CollectionChanged += (s, e) =>
        {
            while (RealtimeMetrics.Count > 100)
            {
                RealtimeMetrics.RemoveAt(0);
            }
        };
    }

    private void UpdateDashboard(object state)
    {
        // Simulate receiving real-time data
        var newMetric = GenerateMetricData();

        // Add to UI thread (if needed for WPF/WinUI)
        Application.Current?.Dispatcher.BeginInvoke(() =>
        {
            RealtimeMetrics.Add(newMetric);

            // Check for alerts
            if (newMetric.Value > newMetric.ThresholdHigh)
            {
                ActiveAlerts.Insert(0, new AlertMessage
                {
                    Timestamp = DateTime.Now,
                    Level = AlertLevel.High,
                    Message = $"{newMetric.MetricName} exceeded threshold: {newMetric.Value:F2}"
                });
            }

            // Remove old alerts (keep last 10)
            while (ActiveAlerts.Count > 10)
            {
                ActiveAlerts.RemoveAt(ActiveAlerts.Count - 1);
            }
        });
    }

    public void ClearAlerts()
    {
        ActiveAlerts.Clear(); // Single notification for UI update
    }

    private MetricData GenerateMetricData()
    {
        var random = new Random();
        return new MetricData
        {
            MetricName = "CPU Usage",
            Value = random.NextDouble() * 100,
            ThresholdHigh = 80,
            Timestamp = DateTime.Now
        };
    }
}

Shopping Cart with Automatic Totals

public class ShoppingCartManager : INotifyPropertyChanged
{
    public ObservableCollection<CartItem> CartItems { get; } = new ObservableCollection<CartItem>();

    private decimal _subtotal;
    public decimal Subtotal
    {
        get => _subtotal;
        private set
        {
            _subtotal = value;
            OnPropertyChanged();
            OnPropertyChanged(nameof(Tax));
            OnPropertyChanged(nameof(Total));
        }
    }

    public decimal Tax => Subtotal * 0.08m; // 8% tax
    public decimal Total => Subtotal + Tax;

    public ShoppingCartManager()
    {
        // Recalculate totals whenever cart changes
        CartItems.CollectionChanged += OnCartItemsChanged;
    }

    public void AddItem(Product product, int quantity = 1)
    {
        var existingItem = CartItems.FirstOrDefault(item => item.ProductId == product.Id);

        if (existingItem != null)
        {
            // Update quantity of existing item
            existingItem.Quantity += quantity; // Triggers property change if CartItem implements INotifyPropertyChanged
            RecalculateTotals(); // Manual recalculation since collection didn't change
        }
        else
        {
            // Add new item
            var cartItem = new CartItem
            {
                ProductId = product.Id,
                ProductName = product.Name,
                UnitPrice = product.Price,
                Quantity = quantity
            };

            // If CartItem implements INotifyPropertyChanged, subscribe to its changes
            if (cartItem is INotifyPropertyChanged notifyItem)
            {
                notifyItem.PropertyChanged += OnCartItemPropertyChanged;
            }

            CartItems.Add(cartItem); // Triggers collection change
        }
    }

    public void RemoveItem(Guid productId)
    {
        var item = CartItems.FirstOrDefault(i => i.ProductId == productId);
        if (item != null)
        {
            CartItems.Remove(item); // Triggers collection change
        }
    }

    public void UpdateQuantity(Guid productId, int newQuantity)
    {
        var item = CartItems.FirstOrDefault(i => i.ProductId == productId);
        if (item != null)
        {
            if (newQuantity <= 0)
            {
                RemoveItem(productId);
            }
            else
            {
                item.Quantity = newQuantity; // Triggers property change
            }
        }
    }

    private void OnCartItemsChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        // Handle items being added or removed
        if (e.OldItems != null)
        {
            foreach (INotifyPropertyChanged item in e.OldItems)
            {
                item.PropertyChanged -= OnCartItemPropertyChanged;
            }
        }

        if (e.NewItems != null)
        {
            foreach (INotifyPropertyChanged item in e.NewItems)
            {
                item.PropertyChanged += OnCartItemPropertyChanged;
            }
        }

        RecalculateTotals();
    }

    private void OnCartItemPropertyChanged(object sender, PropertyChangedEventArgs e)
    {
        // Recalculate when individual item properties change (like quantity)
        RecalculateTotals();
    }

    private void RecalculateTotals()
    {
        Subtotal = CartItems.Sum(item => item.UnitPrice * item.Quantity);
    }

    public event PropertyChangedEventHandler PropertyChanged;
    protected void OnPropertyChanged([CallerMemberName] string propertyName = null)
    {
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
    }
}

Modern .NET Collections

ReadOnlyObservableCollection

public class SecureDataManager
{
    private readonly ObservableCollection<SensitiveData> _internalData = new ObservableCollection<SensitiveData>();

    // Expose read-only view to external consumers
    public ReadOnlyObservableCollection<SensitiveData> PublicData { get; }

    public SecureDataManager()
    {
        PublicData = new ReadOnlyObservableCollection<SensitiveData>(_internalData);

        // External code can subscribe to changes but can't modify the collection.
        // The CollectionChanged event is exposed through the INotifyCollectionChanged interface.
        ((INotifyCollectionChanged)PublicData).CollectionChanged += (s, e) =>
        {
            Console.WriteLine("External observer: Data collection changed");
        };
    }

    public void AddData(SensitiveData data)
    {
        if (ValidateData(data))
        {
            _internalData.Add(data); // Triggers change notification to PublicData observers
        }
    }

    private bool ValidateData(SensitiveData data)
    {
        // Internal validation logic
        return data != null && !string.IsNullOrEmpty(data.Id);
    }
}

Collection<T> as a Base for Custom Collections

public class ValidatedCollection<T> : Collection<T> where T : IValidatable
{
    public event EventHandler<ValidationEventArgs> ValidationFailed;

    protected override void InsertItem(int index, T item)
    {
        if (!ValidateItem(item))
        {
            ValidationFailed?.Invoke(this, new ValidationEventArgs(item, "Validation failed"));
            return; // Don't add invalid items
        }

        base.InsertItem(index, item);
    }

    protected override void SetItem(int index, T item)
    {
        if (!ValidateItem(item))
        {
            ValidationFailed?.Invoke(this, new ValidationEventArgs(item, "Validation failed"));
            return;
        }

        base.SetItem(index, item);
    }

    private bool ValidateItem(T item)
    {
        return item != null && item.IsValid();
    }
}

// Usage
ValidatedCollection<UserProfile> userProfiles = new ValidatedCollection<UserProfile>();
userProfiles.ValidationFailed += (s, e) =>
{
    Console.WriteLine($"Validation failed: {e.ErrorMessage}");
};

userProfiles.Add(new UserProfile { Name = "", Email = "invalid" }); // Triggers validation failure

Advanced ObservableCollection Patterns

Batch Operations with Suspend Notifications

public class BatchObservableCollection<T> : ObservableCollection<T>
{
    private bool _suppressNotifications = false;

    public void SuspendNotifications()
    {
        _suppressNotifications = true;
    }

    public void ResumeNotifications()
    {
        _suppressNotifications = false;
        OnCollectionChanged(new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Reset));
    }

    protected override void OnCollectionChanged(NotifyCollectionChangedEventArgs e)
    {
        if (!_suppressNotifications)
        {
            base.OnCollectionChanged(e);
        }
    }

    public void AddRange(IEnumerable<T> items)
    {
        SuspendNotifications();
        try
        {
            foreach (var item in items)
            {
                Add(item);
            }
        }
        finally
        {
            ResumeNotifications(); // Single notification for all additions
        }
    }
}

// Usage
BatchObservableCollection<Product> products = new BatchObservableCollection<Product>();

// Add 1000 items with only one UI update
products.AddRange(await productService.GetProductsBatchAsync());

Filtered ObservableCollection

public class FilteredObservableCollection<T> : ObservableCollection<T>
{
    private readonly ObservableCollection<T> _sourceCollection;
    private Predicate<T> _filter;

    public FilteredObservableCollection(ObservableCollection<T> sourceCollection)
    {
        _sourceCollection = sourceCollection;
        _sourceCollection.CollectionChanged += OnSourceCollectionChanged;
    }

    public void SetFilter(Predicate<T> filter)
    {
        _filter = filter;
        RefreshFilter();
    }

    private void RefreshFilter()
    {
        Clear();

        if (_filter == null)
        {
            foreach (var item in _sourceCollection)
            {
                Add(item);
            }
        }
        else
        {
            foreach (var item in _sourceCollection.Where(x => _filter(x)))
            {
                Add(item);
            }
        }
    }

    private void OnSourceCollectionChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        switch (e.Action)
        {
            case NotifyCollectionChangedAction.Add:
                foreach (T newItem in e.NewItems)
                {
                    if (_filter?.Invoke(newItem) != false)
                    {
                        Add(newItem);
                    }
                }
                break;
            case NotifyCollectionChangedAction.Remove:
                foreach (T oldItem in e.OldItems)
                {
                    Remove(oldItem);
                }
                break;
            case NotifyCollectionChangedAction.Reset:
                RefreshFilter();
                break;
        }
    }
}
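
// Usage (sketch): maintain a live filtered view over a source collection.
// A Product class with Name and Price properties is assumed from earlier examples.
ObservableCollection<Product> allProducts = new ObservableCollection<Product>();
var saleItems = new FilteredObservableCollection<Product>(allProducts);
saleItems.SetFilter(p => p.Price < 20m);

allProducts.Add(new Product { Name = "Sticker Pack", Price = 4.99m });  // Appears in saleItems
allProducts.Add(new Product { Name = "Headphones", Price = 89.99m });   // Filtered out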

Performance Considerations

ObservableCollection Performance Characteristics

// Performance comparison for UI binding scenarios:

// Standard List<T> - No change notifications
List<Item> standardList = new List<Item>();
standardList.Add(newItem); // O(1) but UI doesn't update

// ObservableCollection<T> - Change notifications
ObservableCollection<Item> observableCollection = new ObservableCollection<Item>();
observableCollection.Add(newItem); // O(1) + notification overhead

// Batch operations performance:
ObservableCollection<Item> items = new ObservableCollection<Item>();

// Inefficient - triggers 1000 notifications
for (int i = 0; i < 1000; i++)
{
    items.Add(new Item(i)); // Each triggers UI update
}

// Efficient - single notification
BatchObservableCollection<Item> batchItems = new BatchObservableCollection<Item>();
batchItems.AddRange(Enumerable.Range(0, 1000).Select(i => new Item(i)));

Memory and Event Handler Management

public class DataManagerWithCleanup : IDisposable
{
    public ObservableCollection<DataItem> Items { get; } = new ObservableCollection<DataItem>();

    public DataManagerWithCleanup()
    {
        Items.CollectionChanged += OnItemsChanged;
    }

    private void OnItemsChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        // Handle changes
        if (e.NewItems != null)
        {
            foreach (INotifyPropertyChanged item in e.NewItems.OfType<INotifyPropertyChanged>())
            {
                item.PropertyChanged += OnItemPropertyChanged; // Subscribe to item changes
            }
        }

        if (e.OldItems != null)
        {
            foreach (INotifyPropertyChanged item in e.OldItems.OfType<INotifyPropertyChanged>())
            {
                item.PropertyChanged -= OnItemPropertyChanged; // Unsubscribe to prevent memory leaks
            }
        }
    }

    private void OnItemPropertyChanged(object sender, PropertyChangedEventArgs e)
    {
        // Handle individual item property changes
    }

    public void Dispose()
    {
        // Clean up event handlers to prevent memory leaks
        Items.CollectionChanged -= OnItemsChanged;

        foreach (INotifyPropertyChanged item in Items.OfType<INotifyPropertyChanged>())
        {
            item.PropertyChanged -= OnItemPropertyChanged;
        }

        Items.Clear();
    }
}

Integration with Modern UI Frameworks

Blazor Integration

// Blazor component using ObservableCollection
@page "/todo"
@implements IDisposable

<h3>Todo List</h3>

<input @bind="newTodoTitle" @onkeypress="HandleKeyPress" placeholder="Add new todo..." />
<button @onclick="AddTodo">Add</button>

<ul>
    @foreach (var todo in todoManager.TodoItems)
    {
        <li>
            <input type="checkbox" @bind="todo.IsCompleted" />
            @todo.Title
            <button @onclick="() => RemoveTodo(todo)">Remove</button>
        </li>
    }
</ul>

<p>Completed: @todoManager.TodoItems.Count(t => t.IsCompleted) / @todoManager.TodoItems.Count</p>

@code {
    private TodoListManager todoManager = new TodoListManager();
    private string newTodoTitle = "";

    protected override void OnInitialized()
    {
        // Subscribe to collection changes for UI updates
        todoManager.TodoItems.CollectionChanged += OnTodoItemsChanged;
    }

    private void OnTodoItemsChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        InvokeAsync(StateHasChanged); // Trigger UI refresh
    }

    private void AddTodo()
    {
        if (!string.IsNullOrWhiteSpace(newTodoTitle))
        {
            todoManager.AddTodoItem(newTodoTitle, "General");
            newTodoTitle = "";
        }
    }

    private void RemoveTodo(TodoItem todo)
    {
        todoManager.TodoItems.Remove(todo);
    }

    private void HandleKeyPress(KeyboardEventArgs e)
    {
        if (e.Key == "Enter")
        {
            AddTodo();
        }
    }

    public void Dispose()
    {
        todoManager.TodoItems.CollectionChanged -= OnTodoItemsChanged;
    }
}

Good Practices vs Bad Practices

✅ Good Practice: Use ObservableCollection for UI Binding

// Automatically updates bound UI elements
public ObservableCollection<Customer> Customers { get; } = new ObservableCollection<Customer>();

// UI binding in XAML automatically reflects changes:
// <ListView ItemsSource="{Binding Customers}">

❌ Bad Practice: Using List for Data-Bound UI

// UI won't update when list changes
public List<Customer> Customers { get; } = new List<Customer>();
Customers.Add(newCustomer); // UI remains unchanged

✅ Good Practice: Batch Operations for Performance

// Suspend notifications during bulk operations
batchCollection.SuspendNotifications();
foreach (var item in bulkItems)
{
    batchCollection.Add(item);
}
batchCollection.ResumeNotifications(); // Single UI update

❌ Bad Practice: Individual Operations in Loops

// Triggers a UI update for each addition
foreach (var item in thousandItems) // e.g., a List<Item> with 1,000 elements
{
    observableCollection.Add(item); // 1000 UI updates!
}

✅ Good Practice: Proper Event Handler Cleanup

// Always unsubscribe to prevent memory leaks
public void Cleanup()
{
    Items.CollectionChanged -= OnItemsChanged;
    foreach (var item in Items.OfType<INotifyPropertyChanged>())
    {
        item.PropertyChanged -= OnItemPropertyChanged;
    }
}

Chapter Summary

ObservableCollection<T> and modern .NET reactive collections enable building responsive, data-driven applications by automatically notifying UI components and other subscribers when data changes. Understanding change notification patterns, performance implications of frequent updates, and proper event handler management is crucial for building scalable, memory-efficient applications that provide excellent user experiences.

The key insight: ObservableCollection transforms static data display into dynamic, reactive user interfaces, making complex UI synchronization scenarios simple and maintainable through automatic change propagation.

Practice Exercise

Scenario: Build a real-time inventory management system for a retail store.

Create a system that:

  1. Uses ObservableCollection to track products with automatic UI updates
  2. Implements low-stock alerts that trigger when inventory falls below thresholds
  3. Supports bulk inventory updates without overwhelming the UI
  4. Provides filtered views (by category, low stock, out of stock)
  5. Tracks inventory changes with automatic audit logging
  6. Integrates with a simulated POS system that decrements inventory in real-time

Requirements:

  • Create Product class with proper change notification
  • Implement efficient bulk operations for inventory imports
  • Design filtered collections that update automatically
  • Handle memory management and event cleanup properly
  • Simulate concurrent inventory updates from multiple sources

Explain your design choices regarding:

  • When to use ObservableCollection vs other collection types
  • How you handle performance with large datasets
  • Event handler management and memory leak prevention
  • UI responsiveness during bulk operations

Chapter 9: LINQ with Collections: Querying Made Easy

LINQ (Language Integrated Query) transforms how we work with collections, turning complex data manipulation tasks into readable, expressive code. Whether you're filtering user data, aggregating financial reports, or transforming API responses, LINQ provides a unified syntax for querying any collection type. Picture building a social media analytics dashboard, processing e-commerce orders, or analyzing game statistics—LINQ makes complex data operations feel natural and maintainable.

LINQ bridges the gap between imperative programming (telling the computer how to do something) and declarative programming (telling it what you want), making your intent clearer while often improving performance through optimization.
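
To see the difference concretely, here is the same task, collecting the names of premium customers, written first imperatively and then declaratively with LINQ (a minimal sketch; a Customer class with IsPremium and Name properties is assumed):

List<Customer> customers = GetCustomers();

// Imperative: spell out the loop, the temporary list, and the condition
List<string> premiumNames = new List<string>();
foreach (var customer in customers)
{
    if (customer.IsPremium)
    {
        premiumNames.Add(customer.Name);
    }
}

// Declarative: state what you want; LINQ handles the iteration
var premiumNamesLinq = customers
    .Where(c => c.IsPremium)
    .Select(c => c.Name)
    .ToList();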

Understanding LINQ Fundamentals

LINQ provides two syntax styles for querying collections:

  • Method Syntax: Uses extension methods with lambda expressions
  • Query Syntax: Uses SQL-like keywords (from, where, select)

Both compile to the same underlying code, so choose based on readability and team preferences.

List<Product> products = GetProducts();

// Method syntax
var expensiveProducts = products
    .Where(p => p.Price > 100)
    .OrderBy(p => p.Name)
    .Select(p => new { p.Name, p.Price })
    .ToList();

// Query syntax (equivalent)
var expensiveProductsQuery = (from p in products
                             where p.Price > 100
                             orderby p.Name
                             select new { p.Name, p.Price }).ToList();

Essential LINQ Operations

Filtering with Where

List<Order> orders = GetOrders();

// Simple filtering
var recentOrders = orders.Where(o => o.OrderDate > DateTime.Now.AddDays(-30));

// Complex filtering with multiple conditions
var highValueOrders = orders.Where(o => 
    o.TotalAmount > 1000 && 
    o.Status == OrderStatus.Completed && 
    o.Customer.IsPremium);

// Filtering with index - only considers the first 10 positions in the list
var expensiveAmongFirstTen = orders
    .Where((order, index) => order.TotalAmount > 500 && index < 10);

// Working with different collection types
HashSet<string> uniqueCategories = new HashSet<string> { "Electronics", "Books", "Clothing" };
var electronicCategories = uniqueCategories.Where(c => c.Contains("Electronic"));

Dictionary<string, decimal> productPrices = GetProductPrices();
var affordableProducts = productPrices.Where(kvp => kvp.Value < 50);

Projection with Select

List<Customer> customers = GetCustomers();

// Simple projection
var customerNames = customers.Select(c => c.Name);

// Complex projection with anonymous types
var customerSummary = customers.Select(c => new
{
    c.Id,
    FullName = $"{c.FirstName} {c.LastName}",
    OrderCount = c.Orders.Count,
    TotalSpent = c.Orders.Sum(o => o.TotalAmount)
});

// Index-based projection
var numberedCustomers = customers.Select((customer, index) => new
{
    Position = index + 1,
    Name = customer.Name,
    Email = customer.Email
});

// Flattening collections with SelectMany
var allOrderItems = customers.SelectMany(c => c.Orders.SelectMany(o => o.Items));

// Working with different collection types
Queue<ProcessingTask> taskQueue = GetTaskQueue();
var taskDescriptions = taskQueue.Select(t => t.Description);

Aggregation Operations

List<Sale> sales = GetSalesData();

// Basic aggregations
decimal totalRevenue = sales.Sum(s => s.Amount);
decimal averageSale = sales.Average(s => s.Amount);
decimal maxSale = sales.Max(s => s.Amount);
decimal minSale = sales.Min(s => s.Amount);
int saleCount = sales.Count();

// Conditional aggregations
decimal qtr1Revenue = sales
    .Where(s => s.Date.Month <= 3)
    .Sum(s => s.Amount);

int highValueSales = sales.Count(s => s.Amount > 10000);

// Custom aggregations with Aggregate
decimal totalWithTax = sales.Aggregate(0m, (total, sale) => total + (sale.Amount * 1.08m));

// Complex aggregations
var salesSummary = new
{
    TotalRevenue = sales.Sum(s => s.Amount),
    AverageOrderValue = sales.Average(s => s.Amount),
    TopSalesperson = sales
        .GroupBy(s => s.SalespersonId)
        .OrderByDescending(g => g.Sum(s => s.Amount))
        .First().Key
};

Grouping Operations

List<Transaction> transactions = GetTransactions();

// Simple grouping
var transactionsByCategory = transactions
    .GroupBy(t => t.Category)
    .Select(g => new
    {
        Category = g.Key,
        Count = g.Count(),
        TotalAmount = g.Sum(t => t.Amount)
    });

// Multi-level grouping
var salesByRegionAndMonth = sales
    .GroupBy(s => new { s.Region, Month = s.Date.Month })
    .Select(g => new
    {
        g.Key.Region,
        g.Key.Month,
        Sales = g.ToList(),
        Revenue = g.Sum(s => s.Amount)
    });

// Grouping with custom key selector
var customersByFirstLetter = customers
    .GroupBy(c => c.Name.Substring(0, 1).ToUpper())
    .OrderBy(g => g.Key);

// Using GroupBy with dictionaries
Dictionary<string, List<Product>> productsByCategory = products
    .GroupBy(p => p.Category)
    .ToDictionary(g => g.Key, g => g.ToList());

Sorting Operations

List<Employee> employees = GetEmployees();

// Single-field sorting
var employeesByName = employees.OrderBy(e => e.LastName);
var employeesBySalaryDesc = employees.OrderByDescending(e => e.Salary);

// Multi-field sorting
var employeesMultiSort = employees
    .OrderBy(e => e.Department)
    .ThenBy(e => e.LastName)
    .ThenByDescending(e => e.HireDate);

// Custom sorting with IComparer
var employeesByCustomLogic = employees.OrderBy(e => e, new CustomEmployeeComparer());

// Sorting different collection types
SortedSet<DateTime> eventDates = new SortedSet<DateTime>();
var chronologicalEvents = eventDates.OrderBy(d => d); // Already sorted, but can re-sort

ObservableCollection<Product> products = new ObservableCollection<Product>();
var sortedProducts = products.OrderBy(p => p.Price).ToList();
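
The CustomEmployeeComparer referenced above is not defined in this chapter; a minimal sketch implementing IComparer&lt;Employee&gt; might look like this (Department and Salary properties assumed):

public class CustomEmployeeComparer : IComparer<Employee>
{
    // Hypothetical ordering: department alphabetically, then salary descending
    public int Compare(Employee x, Employee y)
    {
        int byDepartment = string.Compare(x.Department, y.Department, StringComparison.Ordinal);
        return byDepartment != 0 ? byDepartment : y.Salary.CompareTo(x.Salary);
    }
}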

Advanced LINQ Patterns

Set Operations

List<string> list1 = new List<string> { "A", "B", "C", "D" };
List<string> list2 = new List<string> { "C", "D", "E", "F" };
HashSet<string> set1 = new HashSet<string> { "X", "Y", "Z" };

// Union - combine all unique elements
var union = list1.Union(list2); // [A, B, C, D, E, F]

// Intersect - common elements only
var intersection = list1.Intersect(list2); // [C, D]

// Except - elements in first but not second
var difference = list1.Except(list2); // [A, B]

// Working with different collection types
var combinedSets = set1.Union(list1.Take(2)); // Works across different collection types

Joining Operations

List<Customer> customers = GetCustomers();
List<Order> orders = GetOrders();

// Inner join
var customerOrders = customers.Join(
    orders,
    customer => customer.Id,      // Outer key selector
    order => order.CustomerId,    // Inner key selector
    (customer, order) => new      // Result selector
    {
        CustomerName = customer.Name,
        OrderId = order.Id,
        OrderTotal = order.TotalAmount
    });

// Group join (left outer join)
var customersWithOrders = customers.GroupJoin(
    orders,
    customer => customer.Id,
    order => order.CustomerId,
    (customer, customerOrders) => new
    {
        Customer = customer,
        Orders = customerOrders.ToList(),
        OrderCount = customerOrders.Count(),
        TotalSpent = customerOrders.Sum(o => o.TotalAmount)
    });

// Complex joins with multiple conditions
var detailedCustomerData = from c in customers
                          join o in orders on c.Id equals o.CustomerId into customerOrders
                          from co in customerOrders.DefaultIfEmpty() // Left join
                          where c.IsActive
                          select new
                          {
                              c.Name,
                              OrderId = co?.Id ?? 0,
                              OrderAmount = co?.TotalAmount ?? 0
                          };

Partitioning Operations

List<LogEntry> logs = GetLogEntries();

// Take and Skip
var recentLogs = logs
    .OrderByDescending(l => l.Timestamp)
    .Take(100); // Get most recent 100 entries

var pagedResults = logs
    .Skip(pageNumber * pageSize)
    .Take(pageSize); // Pagination

// Conditional partitioning (note: TakeWhile/SkipWhile stop at the first
// non-matching element, unlike Where, which scans the whole sequence)
var leadingCriticalErrors = logs.TakeWhile(l => l.Level == LogLevel.Critical);
var afterFirstInfo = logs.SkipWhile(l => l.Level != LogLevel.Info);

// Working with different collections
Queue<ProcessingTask> taskQueue = GetTaskQueue();
var nextBatch = taskQueue.Take(5); // Peeks at the next 5 tasks without dequeuing them

Stack<string> undoStack = GetUndoStack();
var recentActions = undoStack.Take(3); // Shows the last 3 actions without popping them

Real-World Application Patterns

E-commerce Analytics Dashboard

public class EcommerceAnalytics
{
    private readonly List<Order> _orders;
    private readonly List<Product> _products;
    private readonly List<Customer> _customers;

    public EcommerceAnalytics(List<Order> orders, List<Product> products, List<Customer> customers)
    {
        _orders = orders;
        _products = products;
        _customers = customers;
    }

    public SalesSummary GetSalesSummary(DateTime startDate, DateTime endDate)
    {
        var ordersInPeriod = _orders
            .Where(o => o.OrderDate >= startDate && o.OrderDate <= endDate)
            .ToList(); // Materialize once - this result is enumerated several times below

        return new SalesSummary
        {
            TotalOrders = ordersInPeriod.Count(),
            TotalRevenue = ordersInPeriod.Sum(o => o.TotalAmount),
            AverageOrderValue = ordersInPeriod.Average(o => o.TotalAmount),

            TopProducts = ordersInPeriod
                .SelectMany(o => o.Items)
                .GroupBy(i => i.ProductId)
                .Select(g => new ProductSales
                {
                    ProductName = _products.First(p => p.Id == g.Key).Name,
                    QuantitySold = g.Sum(i => i.Quantity),
                    Revenue = g.Sum(i => i.Quantity * i.UnitPrice)
                })
                .OrderByDescending(ps => ps.Revenue)
                .Take(10)
                .ToList(),

            CustomerSegments = _customers
                .GroupJoin(ordersInPeriod, 
                          c => c.Id, 
                          o => o.CustomerId, 
                          (customer, orders) => new
                          {
                              Customer = customer,
                              TotalSpent = orders.Sum(o => o.TotalAmount)
                          })
                .GroupBy(c => c.TotalSpent switch
                {
                    var x when x >= 10000 => "VIP",
                    var x when x >= 1000 => "Premium", 
                    var x when x > 0 => "Regular",
                    _ => "No Purchase"
                })
                .Select(g => new CustomerSegment
                {
                    Segment = g.Key,
                    Count = g.Count(),
                    TotalRevenue = g.Sum(c => c.TotalSpent)
                })
                .ToList()
        };
    }

    public List<TrendData> GetSalesTrend(int days)
    {
        return _orders
            .Where(o => o.OrderDate >= DateTime.Now.AddDays(-days))
            .GroupBy(o => o.OrderDate.Date)
            .Select(g => new TrendData
            {
                Date = g.Key,
                Orders = g.Count(),
                Revenue = g.Sum(o => o.TotalAmount)
            })
            .OrderBy(t => t.Date)
            .ToList();
    }

    public List<ProductRecommendation> GetProductRecommendations(int customerId)
    {
        var customerOrders = _orders
            .Where(o => o.CustomerId == customerId)
            .ToList(); // Materialize - enumerated again below for purchased product IDs
        var customerCategories = customerOrders
            .SelectMany(o => o.Items)
            .Select(i => _products.First(p => p.Id == i.ProductId).Category)
            .ToHashSet(); // HashSet gives O(1) Contains in the filter below

        // Find products in similar categories that customer hasn't bought
        var purchasedProductIds = customerOrders
            .SelectMany(o => o.Items)
            .Select(i => i.ProductId)
            .ToHashSet();

        return _products
            .Where(p => customerCategories.Contains(p.Category))
            .Where(p => !purchasedProductIds.Contains(p.Id))
            .GroupJoin(_orders.SelectMany(o => o.Items),
                      p => p.Id,
                      i => i.ProductId,
                      (product, items) => new ProductRecommendation
                      {
                          Product = product,
                          PopularityScore = items.Sum(i => i.Quantity),
                          AverageRating = items.Any() ? items.Average(i => i.Rating ?? 0) : 0
                      })
            .OrderByDescending(r => r.PopularityScore)
            .ThenByDescending(r => r.AverageRating)
            .Take(5)
            .ToList();
    }
}

Log Analysis System

public class LogAnalyzer
{
    public LogAnalysisResult AnalyzeLogs(IEnumerable<LogEntry> logs, TimeSpan timeWindow)
    {
        var cutoffTime = DateTime.Now.Subtract(timeWindow);
        var recentLogs = logs.Where(l => l.Timestamp >= cutoffTime).ToList(); // Materialize - enumerated several times below

        var errorSummary = recentLogs
            .Where(l => l.Level >= LogLevel.Warning)
            .GroupBy(l => new { l.Level, l.Source })
            .Select(g => new ErrorSummary
            {
                Level = g.Key.Level,
                Source = g.Key.Source,
                Count = g.Count(),
                FirstOccurrence = g.Min(l => l.Timestamp),
                LastOccurrence = g.Max(l => l.Timestamp),
                SampleMessages = g.Take(3).Select(l => l.Message).ToList()
            })
            .OrderByDescending(es => es.Count)
            .ToList();

        var performanceMetrics = recentLogs
            .Where(l => l.Duration.HasValue)
            .GroupBy(l => l.Operation)
            .Select(g => new PerformanceMetric
            {
                Operation = g.Key,
                Count = g.Count(),
                AverageDuration = g.Average(l => l.Duration.Value.TotalMilliseconds),
                MaxDuration = g.Max(l => l.Duration.Value.TotalMilliseconds),
                // Approximate P95: skip the slowest 5%; the next entry is the 95th percentile
                P95Duration = g.OrderByDescending(l => l.Duration.Value)
                              .Skip((int)(g.Count() * 0.05))
                              .First()
                              .Duration.Value.TotalMilliseconds
            })
            .OrderByDescending(pm => pm.AverageDuration)
            .ToList();

        return new LogAnalysisResult
        {
            TotalLogs = recentLogs.Count(),
            ErrorSummaries = errorSummary,
            PerformanceMetrics = performanceMetrics,
            TrendData = GetTrendData(recentLogs)
        };
    }

    private List<TrendDataPoint> GetTrendData(IEnumerable<LogEntry> logs)
    {
        return logs
            .GroupBy(l => new { Hour = l.Timestamp.Hour, l.Level })
            .Select(g => new TrendDataPoint
            {
                Time = g.Key.Hour,
                Level = g.Key.Level,
                Count = g.Count()
            })
            .OrderBy(t => t.Time)
            .ThenBy(t => t.Level)
            .ToList();
    }
}

Game Statistics Processor

public class GameStatsProcessor
{
    public PlayerStats CalculatePlayerStats(List<GameMatch> matches, int playerId)
    {
        var playerMatches = matches
            .Where(m => m.Players.Any(p => p.Id == playerId))
            .ToList(); // Materialize - the stats below enumerate this list repeatedly

        return new PlayerStats
        {
            PlayerId = playerId,
            TotalMatches = playerMatches.Count(),
            Wins = playerMatches.Count(m => m.WinnerId == playerId),
            Losses = playerMatches.Count(m => m.WinnerId != null && m.WinnerId != playerId),

            AverageScore = playerMatches
                .SelectMany(m => m.Players.Where(p => p.Id == playerId))
                .Average(p => p.Score),

            BestStreak = CalculateWinStreak(playerMatches.OrderBy(m => m.Timestamp), playerId),

            FavoriteMap = playerMatches
                .GroupBy(m => m.MapName)
                .OrderByDescending(g => g.Count())
                .First().Key,

            RecentPerformance = playerMatches
                .OrderByDescending(m => m.Timestamp)
                .Take(10)
                .SelectMany(m => m.Players.Where(p => p.Id == playerId))
                .Average(p => p.Score),

            Achievements = CalculateAchievements(playerMatches, playerId)
        };
    }

    public List<LeaderboardEntry> GenerateLeaderboard(List<GameMatch> matches)
    {
        return matches
            .SelectMany(m => m.Players)
            .GroupBy(p => p.Id)
            .Select(g => new LeaderboardEntry
            {
                PlayerId = g.Key,
                PlayerName = g.First().Name,
                TotalScore = g.Sum(p => p.Score),
                MatchesPlayed = g.Count(),
                AverageScore = g.Average(p => p.Score),
                WinRate = CalculateWinRate(matches, g.Key)
            })
            .OrderByDescending(l => l.TotalScore)
            .Take(100)
            .ToList();
    }

    private int CalculateWinStreak(IEnumerable<GameMatch> matches, int playerId)
    {
        int maxStreak = 0;
        int currentStreak = 0;

        foreach (var match in matches)
        {
            if (match.WinnerId == playerId)
            {
                currentStreak++;
                maxStreak = Math.Max(maxStreak, currentStreak);
            }
            else
            {
                currentStreak = 0;
            }
        }

        return maxStreak;
    }
}

Performance Considerations and Deferred Execution

Understanding Deferred Execution

List<Product> products = GetLargeProductList(); // 1 million products

// This is NOT executed immediately - it's deferred
var expensiveProducts = products.Where(p => p.Price > 1000);

Console.WriteLine("Query created, but not executed yet");

// Execution happens here when we enumerate
foreach (var product in expensiveProducts) // NOW the query executes
{
    Console.WriteLine(product.Name);
}

// Multiple enumerations = multiple executions
var count1 = expensiveProducts.Count(); // Executes the query
var count2 = expensiveProducts.Count(); // Executes the query AGAIN

// Force immediate execution with ToList(), ToArray(), etc.
var expensiveProductsList = products.Where(p => p.Price > 1000).ToList(); // Execute once
var count3 = expensiveProductsList.Count; // No query execution, just property access
var count4 = expensiveProductsList.Count; // Same - no execution

Performance Best Practices

List<Order> orders = GetLargeOrderList();

// ✅ Good: Filter early, reduce data processed
var recentHighValueOrders = orders
    .Where(o => o.OrderDate > DateTime.Now.AddDays(-30))  // Filter first
    .Where(o => o.TotalAmount > 1000)                     // Then filter more
    .Select(o => new { o.Id, o.TotalAmount })             // Project to smaller object
    .OrderByDescending(o => o.TotalAmount)
    .Take(10)                                             // Limit results
    .ToList();                                            // Execute once

// ❌ Bad: Process all data then filter
var badQuery = orders
    .Select(o => new ComplexOrderSummary(o))              // Heavy computation first
    .OrderByDescending(o => o.TotalAmount)                // Sort everything
    .Where(o => o.OrderDate > DateTime.Now.AddDays(-30))  // Filter after processing
    .ToList();

// ✅ Good: Use appropriate collection types
HashSet<int> validIds = new HashSet<int> { 1, 2, 3, 4, 5 };
var validOrders = orders.Where(o => validIds.Contains(o.Id)); // O(1) lookup

// ❌ Bad: Linear search in list
List<int> validIdsList = new List<int> { 1, 2, 3, 4, 5 };
var badValidOrders = orders.Where(o => validIdsList.Contains(o.Id)); // O(n) lookup

// ✅ Good: Materialize queries that will be reused
var activeCustomers = customers.Where(c => c.IsActive).ToList(); // Execute once
var activeCustomerCount = activeCustomers.Count;                 // No re-execution
var activeCustomerNames = activeCustomers.Select(c => c.Name);   // Work with materialized data

Working with Different Collection Types Efficiently

// Dictionary-based lookups are more efficient than joins for simple scenarios
Dictionary<int, Customer> customerLookup = customers.ToDictionary(c => c.Id);

// ✅ Efficient: Direct lookup
var ordersWithCustomers = orders
    .Where(o => customerLookup.ContainsKey(o.CustomerId))
    .Select(o => new
    {
        Order = o,
        Customer = customerLookup[o.CustomerId]
    });

// ❌ Less efficient: Join operation for simple lookup
var joinedOrders = orders.Join(customers,
    o => o.CustomerId,
    c => c.Id,
    (o, c) => new { Order = o, Customer = c });

// Working efficiently with different collection types
ObservableCollection<Product> observableProducts = GetObservableProducts();
Queue<ProcessingTask> taskQueue = GetTaskQueue();
Stack<string> operationHistory = GetOperationHistory();

// All support LINQ operations
var recentProducts = observableProducts.Where(p => p.DateAdded > DateTime.Now.AddDays(-7));
var urgentTasks = taskQueue.Where(t => t.Priority == TaskPriority.Urgent);
var recentOperations = operationHistory.Take(5);

Good Practices vs Bad Practices

✅ Good Practice: Chain Operations Efficiently

// Process in logical order: filter → transform → aggregate
var result = orders
    .Where(o => o.Status == OrderStatus.Completed)    // Filter first (reduces data)
    .Select(o => o.TotalAmount)                       // Transform to needed data only
    .Sum();                                           // Aggregate

❌ Bad Practice: Inefficient Operation Ordering

// Transform all data before filtering
var badResult = orders
    .Select(o => new { o.Id, o.TotalAmount, ComplexCalculation = ExpensiveOperation(o) })
    .Where(o => o.TotalAmount > 1000)  // Filter after expensive operations
    .Sum(o => o.TotalAmount);

✅ Good Practice: Use Appropriate Execution Methods

// Use ToList() when you need to enumerate multiple times
var expensiveQuery = products.Where(p => ExpensiveValidation(p)).ToList();
var count = expensiveQuery.Count;     // No re-execution
var first = expensiveQuery.First();   // No re-execution

// Use streaming for large datasets processed once
foreach (var item in products.Where(p => p.IsActive)) // Streaming - low memory
{
    ProcessItem(item);
}

❌ Bad Practice: Multiple Enumerations Without Materialization

var query = products.Where(p => ExpensiveValidation(p)); // Deferred execution
var count = query.Count();  // Executes expensive validation for all items
var first = query.First();  // Executes expensive validation AGAIN
var list = query.ToList(); // Executes expensive validation THIRD TIME

✅ Good Practice: Combine LINQ with Collection-Specific Features

// Use HashSet for efficient Contains operations
HashSet<string> categories = new HashSet<string> { "Electronics", "Books" };
var filteredProducts = products.Where(p => categories.Contains(p.Category));

// Use SortedSet when you need both uniqueness and ordering
SortedSet<decimal> pricePoints = new SortedSet<decimal>(products.Select(p => p.Price)); // Deduplicates and sorts in one step
var priceRange = pricePoints.GetViewBetween(10m, 100m);

Chapter Summary

LINQ transforms collection manipulation from imperative loops into declarative, readable queries that express intent clearly. Understanding deferred execution, performance characteristics, and how LINQ integrates with different collection types enables building efficient data processing pipelines that are both maintainable and performant. The key is choosing the right combination of LINQ operations and collection types for your specific use case.

The key insight: LINQ makes complex data transformations readable and maintainable, but understanding execution timing and performance characteristics is crucial for building efficient applications that scale with real-world data volumes.

Practice Exercise

Scenario: Build a comprehensive reporting system for a university student information system.

Using LINQ with various collection types, create a system that:

  1. Student Performance Analytics: Analyze grades across multiple semesters

    • Calculate GPA trends over time
    • Identify students at risk (declining performance)
    • Find top performers by major and semester
  2. Course Enrollment Optimization:

    • Determine optimal class sizes based on historical data
    • Identify prerequisite bottlenecks affecting graduation rates
    • Analyze course popularity trends
  3. Financial Analysis:

    • Track tuition payments and outstanding balances
    • Analyze scholarship distribution effectiveness
    • Generate financial aid recommendations
  4. Resource Planning:

    • Predict classroom and instructor needs
    • Optimize course scheduling to minimize conflicts
    • Analyze resource utilization patterns

Requirements:

  • Use different collection types appropriately (List, Dictionary, HashSet, etc.)
  • Implement efficient LINQ queries with proper performance considerations
  • Handle large datasets with streaming vs materialization strategies
  • Create reusable query components for different report types
  • Include error handling for missing or invalid data

Provide performance analysis comparing different approaches and explain:

  • When you chose method vs query syntax and why
  • How you optimized queries for large datasets
  • Your strategy for handling deferred vs immediate execution
  • Integration patterns with different collection types

Chapter 10: Best Practices, Performance, and Real-World Use Cases

After mastering individual collection types and LINQ operations, the final step is understanding how to architect scalable, maintainable systems that handle real-world complexity. This chapter synthesizes everything we've learned into practical patterns, performance optimization strategies, and architectural decisions that separate junior developers from senior engineers who build systems that scale.

Real-world applications rarely use collections in isolation—they combine multiple collection types, handle concurrent access, optimize for performance, and maintain clean abstractions. Understanding these patterns enables building robust systems that perform well under pressure.

Architectural Patterns and Collection Strategy

The Repository Pattern with Multiple Collection Types

public interface ICustomerRepository
{
    Task<Customer> GetByIdAsync(int id);
    Task<IEnumerable<Customer>> GetByRegionAsync(string region);
    Task<bool> ExistsAsync(int id);
    Task AddAsync(Customer customer);
    Task UpdateAsync(Customer customer);
    Task DeleteAsync(int id);
}

public class OptimizedCustomerRepository : ICustomerRepository
{
    // Different collection types for different access patterns
    private readonly Dictionary<int, Customer> _customerCache = new Dictionary<int, Customer>();
    private readonly Dictionary<string, HashSet<int>> _regionIndex = new Dictionary<string, HashSet<int>>();
    private readonly SortedSet<DateTime> _lastModifiedIndex = new SortedSet<DateTime>();
    private readonly ConcurrentDictionary<int, DateTime> _accessLog = new ConcurrentDictionary<int, DateTime>();

    private readonly ICustomerDataSource _dataSource;
    private readonly object _lockObject = new object();

    public async Task<Customer> GetByIdAsync(int id)
    {
        // Record access for analytics
        _accessLog[id] = DateTime.UtcNow;

        // Check cache first - O(1) lookup
        if (_customerCache.TryGetValue(id, out Customer cachedCustomer))
        {
            return cachedCustomer;
        }

        // Cache miss - load from data source
        Customer customer = await _dataSource.GetByIdAsync(id);
        if (customer != null)
        {
            lock (_lockObject)
            {
                CacheCustomer(customer);
            }
        }

        return customer;
    }

    public async Task<IEnumerable<Customer>> GetByRegionAsync(string region)
    {
        HashSet<int> customerIds;
        lock (_lockObject)
        {
            if (!_regionIndex.TryGetValue(region, out customerIds))
            {
                return Enumerable.Empty<Customer>();
            }
        }

        // Use LINQ with efficient collection operations
        var customers = customerIds
            .Select(id => _customerCache.TryGetValue(id, out Customer customer) ? customer : null)
            .Where(c => c != null)
            .ToList();

        // If we don't have all customers cached, load missing ones
        if (customers.Count < customerIds.Count)
        {
            var missingIds = customerIds.Except(customers.Select(c => c.Id));
            var missingCustomers = await _dataSource.GetByIdsAsync(missingIds);

            lock (_lockObject)
            {
                foreach (var customer in missingCustomers)
                {
                    CacheCustomer(customer);
                }
            }

            customers.AddRange(missingCustomers);
        }

        return customers;
    }

    public Task<bool> ExistsAsync(int id)
    {
        return Task.FromResult(_customerCache.ContainsKey(id)); // O(1) check, wrapped to match the async interface
    }

    private void CacheCustomer(Customer customer)
    {
        // Update main cache
        _customerCache[customer.Id] = customer;

        // Update region index
        if (!_regionIndex.ContainsKey(customer.Region))
        {
            _regionIndex[customer.Region] = new HashSet<int>();
        }
        _regionIndex[customer.Region].Add(customer.Id);

        // Update modification tracking
        _lastModifiedIndex.Add(customer.LastModified);

        // Implement cache eviction if needed
        if (_customerCache.Count > 10000)
        {
            EvictLeastRecentlyUsed();
        }
    }

    private void EvictLeastRecentlyUsed()
    {
        // Find least recently accessed customers
        var oldestAccess = _accessLog
            .OrderBy(kvp => kvp.Value)
            .Take(1000)
            .Select(kvp => kvp.Key)
            .ToList();

        foreach (int customerId in oldestAccess)
        {
            if (_customerCache.TryGetValue(customerId, out Customer customer))
            {
                // Remove from all indexes
                _customerCache.Remove(customerId);
                if (_regionIndex.TryGetValue(customer.Region, out var regionSet))
                {
                    regionSet.Remove(customerId); // The indexer would throw if the region key were missing
                }
                _accessLog.TryRemove(customerId, out _);
            }
        }
    }
}

Event-Driven Architecture with Collections

public class OrderProcessingSystem
{
    // Different collections for different processing stages
    private readonly ConcurrentQueue<Order> _incomingOrders = new ConcurrentQueue<Order>();
    private readonly Dictionary<OrderStatus, ConcurrentBag<Order>> _ordersByStatus = 
        new Dictionary<OrderStatus, ConcurrentBag<Order>>();
    private readonly ObservableCollection<OrderStatusUpdate> _statusUpdates = 
        new ObservableCollection<OrderStatusUpdate>();
    private readonly LinkedList<ProcessingStep> _processingPipeline = new LinkedList<ProcessingStep>();

    public event EventHandler<OrderStatusChangedEventArgs> OrderStatusChanged;

    public OrderProcessingSystem()
    {
        // Initialize status collections
        foreach (OrderStatus status in Enum.GetValues<OrderStatus>())
        {
            _ordersByStatus[status] = new ConcurrentBag<Order>();
        }

        // Set up processing pipeline
        SetupProcessingPipeline();

        // Monitor status updates for UI binding
        _statusUpdates.CollectionChanged += OnStatusUpdatesChanged;

        // Start processing threads
        StartProcessingWorkers();
    }

    public void SubmitOrder(Order order)
    {
        order.Status = OrderStatus.Pending;
        _incomingOrders.Enqueue(order);

        // Add to status-specific collection
        _ordersByStatus[OrderStatus.Pending].Add(order);

        // Trigger status update
        UpdateOrderStatus(order, OrderStatus.Submitted, OrderStatus.Pending);
    }

    private void SetupProcessingPipeline()
    {
        _processingPipeline.AddLast(new ProcessingStep
        {
            Name = "Validation",
            Handler = ValidateOrder,
            NextStatus = OrderStatus.Validated
        });

        _processingPipeline.AddLast(new ProcessingStep
        {
            Name = "Inventory Check",
            Handler = CheckInventory,
            NextStatus = OrderStatus.InventoryConfirmed
        });

        _processingPipeline.AddLast(new ProcessingStep
        {
            Name = "Payment Processing",
            Handler = ProcessPayment,
            NextStatus = OrderStatus.PaymentProcessed
        });

        _processingPipeline.AddLast(new ProcessingStep
        {
            Name = "Fulfillment",
            Handler = FulfillOrder,
            NextStatus = OrderStatus.Fulfilled
        });
    }

    private void StartProcessingWorkers()
    {
        // Start multiple worker threads for parallel processing
        for (int i = 0; i < Environment.ProcessorCount; i++)
        {
            Task.Run(ProcessOrdersWorker);
        }
    }

    private async Task ProcessOrdersWorker()
    {
        while (true)
        {
            if (_incomingOrders.TryDequeue(out Order order))
            {
                await ProcessOrderThroughPipeline(order);
            }
            else
            {
                await Task.Delay(100); // Wait for new orders
            }
        }
    }

    private async Task ProcessOrderThroughPipeline(Order order)
    {
        var currentStep = _processingPipeline.First;

        while (currentStep != null)
        {
            try
            {
                bool success = await currentStep.Value.Handler(order);

                if (success)
                {
                    var oldStatus = order.Status;
                    order.Status = currentStep.Value.NextStatus;

                    // Move order between status collections
                    _ordersByStatus[currentStep.Value.NextStatus].Add(order);

                    UpdateOrderStatus(order, oldStatus, order.Status);
                    currentStep = currentStep.Next;
                }
                else
                {
                    // Handle failure
                    HandleProcessingFailure(order, currentStep.Value.Name);
                    break;
                }
            }
            catch (Exception ex)
            {
                HandleProcessingError(order, currentStep.Value.Name, ex);
                break;
            }
        }
    }

    private void UpdateOrderStatus(Order order, OrderStatus oldStatus, OrderStatus newStatus)
    {
        var statusUpdate = new OrderStatusUpdate
        {
            OrderId = order.Id,
            OldStatus = oldStatus,
            NewStatus = newStatus,
            Timestamp = DateTime.UtcNow
        };

        // Thread-safe UI update (WPF dispatcher; InvokeAsync marshals onto the UI thread)
        Application.Current?.Dispatcher.InvokeAsync(() =>
        {
            _statusUpdates.Insert(0, statusUpdate); // Add to beginning

            // Keep only recent updates for performance
            while (_statusUpdates.Count > 1000)
            {
                _statusUpdates.RemoveAt(_statusUpdates.Count - 1);
            }
        });

        OrderStatusChanged?.Invoke(this, new OrderStatusChangedEventArgs(order, oldStatus, newStatus));
    }

    public OrderMetrics GetOrderMetrics()
    {
        return new OrderMetrics
        {
            TotalOrders = _ordersByStatus.Values.Sum(bag => bag.Count),
            OrdersByStatus = _ordersByStatus.ToDictionary(
                kvp => kvp.Key, 
                kvp => kvp.Value.Count),
            ProcessingRate = CalculateProcessingRate(),
            AverageProcessingTime = CalculateAverageProcessingTime()
        };
    }
}
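
The ProcessingStep type that drives the pipeline is not shown above; a minimal sketch consistent with how it is used (an async handler that reports success, plus the status to advance to) could be:

public class ProcessingStep
{
    public string Name { get; set; }
    public Func<Order, Task<bool>> Handler { get; set; } // Returns true when the step succeeds
    public OrderStatus NextStatus { get; set; }
}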

Performance Optimization Strategies

Memory-Efficient Collection Usage

public class MemoryOptimizedDataProcessor
{
    // Use appropriate initial capacities to prevent allocations
    private readonly Dictionary<string, CustomerData> _customerCache;
    private readonly List<Transaction> _transactionBuffer;

    public MemoryOptimizedDataProcessor(int expectedCustomers, int expectedTransactions)
    {
        // Pre-allocate with expected capacity to prevent resizing
        _customerCache = new Dictionary<string, CustomerData>(expectedCustomers);
        _transactionBuffer = new List<Transaction>(expectedTransactions);
    }

    public async Task<ProcessingResult> ProcessLargeDataset(IAsyncEnumerable<DataRecord> dataStream)
    {
        var result = new ProcessingResult();
        var batchProcessor = new BatchProcessor<DataRecord>(1000); // Process in batches

        await foreach (var record in dataStream)
        {
            batchProcessor.Add(record);

            if (batchProcessor.IsFull)
            {
                await ProcessBatch(batchProcessor.GetBatch(), result);
                batchProcessor.Clear(); // Reuse collection
            }
        }

        // Process remaining items
        if (batchProcessor.HasItems)
        {
            await ProcessBatch(batchProcessor.GetBatch(), result);
        }

        return result;
    }


    private async Task ProcessBatch(IReadOnlyList<DataRecord> batch, ProcessingResult result)
    {
        // Use parallel processing for CPU-intensive operations.
        // GetBatch returns a ReadOnlyCollection<T>, which implements IList<T>,
        // so the cast lets Partitioner use its load-balancing list overload.
        var partitioner = Partitioner.Create((IList<DataRecord>)batch, true);
        var processedItems = new ConcurrentBag<ProcessedData>();

        await Task.Run(() =>
        {
            Parallel.ForEach(partitioner, record =>
            {
                var processed = ProcessRecord(record);
                if (processed != null)
                {
                    processedItems.Add(processed);
                }
            });
        });

        // Aggregate results efficiently
        result.ProcessedCount += processedItems.Count;
        result.TotalValue += processedItems.Sum(item => item.Value);

        // Clear references for garbage collection
        processedItems.Clear();
    }

    // Efficient object pooling for high-frequency operations
    private readonly ConcurrentQueue<StringBuilder> _stringBuilderPool = new ConcurrentQueue<StringBuilder>();

    private StringBuilder GetStringBuilder()
    {
        if (_stringBuilderPool.TryDequeue(out StringBuilder sb))
        {
            sb.Clear(); // Reset for reuse
            return sb;
        }
        return new StringBuilder();
    }

    private void ReturnStringBuilder(StringBuilder sb)
    {
        if (sb.Capacity <= 8192) // Don't pool overly large instances
        {
            _stringBuilderPool.Enqueue(sb);
        }
    }
}

public class BatchProcessor<T>
{
    private readonly List<T> _batch;
    private readonly int _batchSize;

    public BatchProcessor(int batchSize)
    {
        _batchSize = batchSize;
        _batch = new List<T>(batchSize); // Pre-allocate exact capacity
    }

    public void Add(T item)
    {
        _batch.Add(item);
    }

    public bool IsFull => _batch.Count >= _batchSize;
    public bool HasItems => _batch.Count > 0;

    public IReadOnlyList<T> GetBatch()
    {
        return _batch.AsReadOnly(); // Avoid copying
    }

    public void Clear()
    {
        _batch.Clear(); // Reuse allocated capacity
    }
}

Thread-Safe Collection Patterns

public class HighThroughputDataProcessor
{
    // Different thread-safe collections for different scenarios
    private readonly ConcurrentDictionary<string, CustomerData> _customerCache = new ConcurrentDictionary<string, CustomerData>();
    private readonly ConcurrentQueue<ProcessingTask> _taskQueue = new ConcurrentQueue<ProcessingTask>();
    private readonly ConcurrentBag<ProcessingResult> _results = new ConcurrentBag<ProcessingResult>();

    // Custom thread-safe collections where needed
    private readonly ThreadSafeObservableCollection<StatusUpdate> _statusUpdates = new ThreadSafeObservableCollection<StatusUpdate>();

    // Reader-writer locks for collections that are read-heavy
    private readonly ReaderWriterLockSlim _configLock = new ReaderWriterLockSlim();
    private readonly Dictionary<string, ConfigValue> _configuration = new Dictionary<string, ConfigValue>();

    public async Task ProcessConcurrentRequests(IEnumerable<ProcessingRequest> requests)
    {
        // Use Parallel.ForEach for CPU-bound work
        var parallelOptions = new ParallelOptions
        {
            MaxDegreeOfParallelism = Environment.ProcessorCount,
            CancellationToken = CancellationToken.None
        };

        await Task.Run(() =>
        {
            Parallel.ForEach(requests, parallelOptions, request =>
            {
                ProcessRequest(request);
            });
        });
    }

    private void ProcessRequest(ProcessingRequest request)
    {
        // Thread-safe cache operations
        var customerData = _customerCache.GetOrAdd(request.CustomerId, id => LoadCustomerData(id));

        // Thread-safe configuration access (read-heavy scenario)
        ConfigValue config = GetConfiguration(request.ConfigKey);

        var result = new ProcessingResult
        {
            RequestId = request.Id,
            CustomerId = request.CustomerId,
            ProcessedAt = DateTime.UtcNow,
            Success = true
        };

        // Thread-safe result collection
        _results.Add(result);

        // Thread-safe status updates for UI
        _statusUpdates.Add(new StatusUpdate
        {
            Message = $"Processed request {request.Id}",
            Timestamp = DateTime.UtcNow
        });
    }

    private ConfigValue GetConfiguration(string key)
    {
        _configLock.EnterReadLock();
        try
        {
            return _configuration.TryGetValue(key, out ConfigValue value) ? value : null;
        }
        finally
        {
            _configLock.ExitReadLock();
        }
    }

    public void UpdateConfiguration(string key, ConfigValue value)
    {
        _configLock.EnterWriteLock();
        try
        {
            _configuration[key] = value;
        }
        finally
        {
            _configLock.ExitWriteLock();
        }
    }

    public ProcessingSummary GetProcessingSummary()
    {
        var allResults = _results.ToArray(); // Snapshot of current results

        return new ProcessingSummary
        {
            TotalProcessed = allResults.Length,
            SuccessCount = allResults.Count(r => r.Success),
            FailureCount = allResults.Count(r => !r.Success),
            // Approximation: measures each result's age since completion, not true duration
            AverageProcessingTime = allResults.Any() ?
                allResults.Average(r => (DateTime.UtcNow - r.ProcessedAt).TotalMilliseconds) : 0,
            CustomerDistribution = allResults
                .GroupBy(r => r.CustomerId)
                .ToDictionary(g => g.Key, g => g.Count())
        };
    }

    public void Dispose()
    {
        _configLock?.Dispose();
    }
}

public class ThreadSafeObservableCollection<T> : INotifyCollectionChanged, IDisposable
{
    private readonly ObservableCollection<T> _collection = new ObservableCollection<T>();
    private readonly object _lock = new object();
    private readonly SynchronizationContext _synchronizationContext;

    public ThreadSafeObservableCollection()
    {
        _synchronizationContext = SynchronizationContext.Current;
    }

    public event NotifyCollectionChangedEventHandler CollectionChanged;

    public void Add(T item)
    {
        lock (_lock)
        {
            _collection.Add(item);
        }

        // Notify on UI thread if available
        if (_synchronizationContext != null)
        {
            _synchronizationContext.Post(_ =>
            {
                CollectionChanged?.Invoke(this, new NotifyCollectionChangedEventArgs(
                    NotifyCollectionChangedAction.Add, item));
            }, null);
        }
        else
        {
            CollectionChanged?.Invoke(this, new NotifyCollectionChangedEventArgs(
                NotifyCollectionChangedAction.Add, item));
        }
    }

    public T[] ToArray()
    {
        lock (_lock)
        {
            return _collection.ToArray();
        }
    }

    public void Clear()
    {
        lock (_lock)
        {
            _collection.Clear();
        }

        if (_synchronizationContext != null)
        {
            _synchronizationContext.Post(_ =>
            {
                CollectionChanged?.Invoke(this, new NotifyCollectionChangedEventArgs(
                    NotifyCollectionChangedAction.Reset));
            }, null);
        }
        else
        {
            CollectionChanged?.Invoke(this, new NotifyCollectionChangedEventArgs(
                NotifyCollectionChangedAction.Reset));
        }
    }

    public void Dispose()
    {
        CollectionChanged = null;
    }
}

Production-Ready Collection Architectures

Microservices Data Synchronization

public class DistributedCacheManager
{
    private readonly IDistributedCache _distributedCache;
    private readonly IMemoryCache _localCache;

    // Local collections for different access patterns
    private readonly ConcurrentDictionary<string, CacheEntry> _fastLookup = new ConcurrentDictionary<string, CacheEntry>();
    private readonly ConcurrentQueue<CacheInvalidation> _invalidationQueue = new ConcurrentQueue<CacheInvalidation>();
    private readonly ObservableCollection<CacheStatistic> _cacheStats = new ObservableCollection<CacheStatistic>();

    public DistributedCacheManager(IDistributedCache distributedCache, IMemoryCache localCache)
    {
        _distributedCache = distributedCache;
        _localCache = localCache;

        // Start background tasks
        _ = Task.Run(ProcessInvalidations);
        _ = Task.Run(UpdateStatistics);
    }

    public async Task<T> GetAsync<T>(string key) where T : class
    {
        // L1 Cache: In-memory lookup (fastest)
        if (_fastLookup.TryGetValue(key, out CacheEntry fastEntry) && !fastEntry.IsExpired)
        {
            RecordCacheHit(CacheLevel.L1, key);
            return fastEntry.Value as T;
        }

        // L2 Cache: Local memory cache
        if (_localCache.TryGetValue(key, out T localValue))
        {
            // Update fast lookup
            _fastLookup[key] = new CacheEntry(localValue, DateTime.UtcNow.AddMinutes(5));
            RecordCacheHit(CacheLevel.L2, key);
            return localValue;
        }

        // L3 Cache: Distributed cache (slowest)
        byte[] distributedValue = await _distributedCache.GetAsync(key);
        if (distributedValue != null)
        {
            T deserializedValue = JsonSerializer.Deserialize<T>(distributedValue);

            // Populate all cache levels
            _localCache.Set(key, deserializedValue, TimeSpan.FromMinutes(10));
            _fastLookup[key] = new CacheEntry(deserializedValue, DateTime.UtcNow.AddMinutes(5));

            RecordCacheHit(CacheLevel.L3, key);
            return deserializedValue;
        }

        RecordCacheMiss(key);
        return null;
    }

    public async Task SetAsync<T>(string key, T value, TimeSpan expiration) where T : class
    {
        // Update all cache levels
        byte[] serializedValue = JsonSerializer.SerializeToUtf8Bytes(value);
        await _distributedCache.SetAsync(key, serializedValue, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = expiration
        });

        _localCache.Set(key, value, expiration);
        _fastLookup[key] = new CacheEntry(value, DateTime.UtcNow.Add(expiration));

        // Notify other instances to invalidate
        _invalidationQueue.Enqueue(new CacheInvalidation
        {
            Key = key,
            Timestamp = DateTime.UtcNow,
            Operation = CacheOperation.Set
        });
    }

    private async Task ProcessInvalidations()
    {
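        // Note: a production version of this loop would honor a CancellationToken
        // so the background task can shut down cleanly with the host.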
        while (true)
        {
            if (_invalidationQueue.TryDequeue(out CacheInvalidation invalidation))
            {
                await ProcessInvalidation(invalidation);
            }
            else
            {
                await Task.Delay(100);
            }
        }
    }

    private async Task UpdateStatistics()
    {
        var timer = new PeriodicTimer(TimeSpan.FromMinutes(1));

        while (await timer.WaitForNextTickAsync())
        {
            var stats = CalculateCacheStatistics();

            // Update observable collection for real-time monitoring
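            // Assumes a WPF host: Application.Current.Dispatcher marshals the update to the UI thread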
            Application.Current?.Dispatcher.BeginInvoke(() =>
            {
                _cacheStats.Insert(0, stats);

                // Keep only recent statistics
                while (_cacheStats.Count > 60) // Last hour of minute-by-minute stats
                {
                    _cacheStats.RemoveAt(_cacheStats.Count - 1);
                }
            });
        }
    }
}
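The CacheEntry helper referenced above isn't shown in the listing. A minimal sketch might look like this; the member names are assumptions inferred from how the manager uses it:

using System;

public sealed class CacheEntry
{
    public object Value { get; }
    public DateTime ExpiresAt { get; }

    public CacheEntry(object value, DateTime expiresAt)
    {
        Value = value;
        ExpiresAt = expiresAt;
    }

    // Checked on every L1 lookup before the cached value is returned
    public bool IsExpired => DateTime.UtcNow >= ExpiresAt;
}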

Event Sourcing with Collections

public class EventSourcingRepository<TAggregate> where TAggregate : IAggregateRoot, new()
{
    // Different collection types for different aspects of event sourcing
    private readonly List<IEvent> _uncommittedEvents = new List<IEvent>();
    private readonly Dictionary<Guid, List<IEvent>> _eventStreams = new Dictionary<Guid, List<IEvent>>();
    private readonly SortedDictionary<long, IEvent> _globalEventLog = new SortedDictionary<long, IEvent>();
    private readonly ConcurrentDictionary<Guid, TAggregate> _aggregateCache = new ConcurrentDictionary<Guid, TAggregate>();
    private readonly ConcurrentQueue<IEvent> _publishQueue = new ConcurrentQueue<IEvent>(); // Concurrent: drained from a background task

    // Snapshots for performance
    private readonly Dictionary<Guid, AggregateSnapshot> _snapshots = new Dictionary<Guid, AggregateSnapshot>();
    private const int SNAPSHOT_FREQUENCY = 100; // Snapshot every 100 events

    public async Task<TAggregate> GetByIdAsync(Guid id)
    {
        // Check cache first
        if (_aggregateCache.TryGetValue(id, out TAggregate cachedAggregate))
        {
            return cachedAggregate;
        }

        // Try to load from snapshot + recent events
        TAggregate aggregate = await LoadFromSnapshotAsync(id);
        if (aggregate == null)
        {
            aggregate = new TAggregate();
        }

        // Get events since snapshot
        var events = GetEventsForAggregate(id, aggregate.Version);

        // Replay events to rebuild aggregate state
        foreach (var @event in events.OrderBy(e => e.Version))
        {
            aggregate.ApplyEvent(@event);
        }

        // Cache the rebuilt aggregate
        _aggregateCache[id] = aggregate;
        return aggregate;
    }

    public async Task SaveAsync(TAggregate aggregate)
    {
        var events = aggregate.GetUncommittedEvents().ToList();
        if (!events.Any()) return;

        // Store events in multiple collections for different access patterns
        foreach (var @event in events)
        {
            // Add to aggregate's event stream
            if (!_eventStreams.ContainsKey(aggregate.Id))
            {
                _eventStreams[aggregate.Id] = new List<IEvent>();
            }
            _eventStreams[aggregate.Id].Add(@event);

            // Add to global event log with sequence number
            long sequenceNumber = GetNextSequenceNumber();
            @event.GlobalSequenceNumber = sequenceNumber;
            _globalEventLog[sequenceNumber] = @event;

            // Queue for publishing
            _publishQueue.Enqueue(@event);
            _uncommittedEvents.Add(@event);
        }

        // Update cache
        _aggregateCache[aggregate.Id] = aggregate;

        // Create snapshot if needed
        if (aggregate.Version % SNAPSHOT_FREQUENCY == 0)
        {
            await CreateSnapshotAsync(aggregate);
        }

        // Mark events as committed
        aggregate.MarkEventsAsCommitted();

        // Publish events asynchronously
        _ = Task.Run(PublishEvents);
    }

    private IEnumerable<IEvent> GetEventsForAggregate(Guid aggregateId, long fromVersion)
    {
        if (_eventStreams.TryGetValue(aggregateId, out List<IEvent> events))
        {
            return events.Where(e => e.Version > fromVersion);
        }
        return Enumerable.Empty<IEvent>();
    }

    public IEnumerable<IEvent> GetEventsSince(long sequenceNumber)
    {
        // Efficient range query using SortedDictionary
        return _globalEventLog
            .Where(kvp => kvp.Key > sequenceNumber)
            .Select(kvp => kvp.Value);
    }

    private async Task<TAggregate> LoadFromSnapshotAsync(Guid id)
    {
        if (_snapshots.TryGetValue(id, out AggregateSnapshot snapshot))
        {
            return await DeserializeSnapshotAsync<TAggregate>(snapshot);
        }
        return null;
    }

    private async Task CreateSnapshotAsync(TAggregate aggregate)
    {
        var snapshot = new AggregateSnapshot
        {
            AggregateId = aggregate.Id,
            Version = aggregate.Version,
            Data = await SerializeAggregateAsync(aggregate),
            CreatedAt = DateTime.UtcNow
        };

        _snapshots[aggregate.Id] = snapshot;
    }

    private async Task PublishEvents()
    {
        var eventsToPublish = new List<IEvent>();

        // Drain the publish queue; TryDequeue is safe even if SaveAsync enqueues concurrently
        while (_publishQueue.TryDequeue(out IEvent @event))
        {
            eventsToPublish.Add(@event);
        }

        // Publish in batches for efficiency
        await PublishEventBatch(eventsToPublish);
    }

    // Projection support
    public IEnumerable<IEvent> GetAllEventsForProjection()
    {
        return _globalEventLog.Values.OrderBy(e => e.GlobalSequenceNumber);
    }

    public Dictionary<string, int> GetEventStatistics()
    {
        return _globalEventLog.Values
            .GroupBy(e => e.GetType().Name)
            .ToDictionary(g => g.Key, g => g.Count());
    }
}
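The repository leans on IEvent and IAggregateRoot contracts that the listing assumes but never defines. A minimal sketch, with member names inferred from how the repository calls them:

using System;
using System.Collections.Generic;

public interface IEvent
{
    long Version { get; }
    long GlobalSequenceNumber { get; set; } // Assigned by the repository on save
}

public interface IAggregateRoot
{
    Guid Id { get; }
    long Version { get; }
    IEnumerable<IEvent> GetUncommittedEvents();
    void ApplyEvent(IEvent @event);
    void MarkEventsAsCommitted();
}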

Real-World Performance Scenarios

High-Frequency Trading System

public class HighFrequencyTradingEngine
{
    // Ultra-low latency collections
    private readonly Dictionary<string, OrderBook> _orderBooks = new Dictionary<string, OrderBook>(1000);
    private readonly ConcurrentQueue<TradeOrder> _incomingOrders = new ConcurrentQueue<TradeOrder>();
    private readonly RingBuffer<ExecutedTrade> _executedTrades = new RingBuffer<ExecutedTrade>(100000);

    // Price level collections optimized for frequent updates
    private readonly SortedDictionary<decimal, PriceLevel> _bidLevels = new SortedDictionary<decimal, PriceLevel>();
    private readonly SortedDictionary<decimal, PriceLevel> _askLevels = new SortedDictionary<decimal, PriceLevel>();

    // Market data collections (reuses the RingBuffer<T> defined below)
    private readonly RingBuffer<MarketTick> _marketData = new RingBuffer<MarketTick>(50000);
    private readonly Dictionary<string, MarketStatistics> _statistics = new Dictionary<string, MarketStatistics>();

    public void ProcessMarketData(MarketTick tick)
    {
        // Store in circular buffer for historical analysis
        _marketData.Add(tick);

        // Update order books with minimal allocations
        if (_orderBooks.TryGetValue(tick.Symbol, out OrderBook orderBook))
        {
            orderBook.UpdatePrices(tick.BidPrice, tick.AskPrice, tick.Timestamp);

            // Check for matching orders
            ProcessMatchingOrders(orderBook);
        }

        // Update real-time statistics
        UpdateMarketStatistics(tick);
    }

    public void SubmitOrder(TradeOrder order)
    {
        // Validate order using pre-allocated validation objects
        if (!ValidateOrder(order))
        {
            return;
        }

        // High-frequency scenario: process immediately vs queue
        if (_incomingOrders.Count < 1000) // Low load - process immediately
        {
            ProcessOrderDirect(order);
        }
        else // High load - queue for batch processing
        {
            _incomingOrders.Enqueue(order);
        }
    }

    private void ProcessOrderDirect(TradeOrder order)
    {
        if (!_orderBooks.TryGetValue(order.Symbol, out OrderBook orderBook))
        {
            orderBook = new OrderBook(order.Symbol);
            _orderBooks[order.Symbol] = orderBook;
        }

        var matches = orderBook.FindMatches(order);

        foreach (var match in matches)
        {
            var trade = new ExecutedTrade
            {
                OrderId = order.Id,
                MatchedOrderId = match.OrderId,
                Price = match.Price,
                Quantity = match.Quantity,
                Timestamp = DateTime.UtcNow,
                Symbol = order.Symbol
            };

            // Use ring buffer for ultra-fast storage
            _executedTrades.Add(trade);

            // Notify interested parties
            NotifyTradeExecution(trade);
        }
    }

    public MarketSummary GenerateMarketSummary(string symbol)
    {
        // Efficient data aggregation using appropriate collection methods
        var recentTicks = _marketData
            .Where(tick => tick.Symbol == symbol)
            .Where(tick => tick.Timestamp > DateTime.UtcNow.AddMinutes(-5))
            .ToList(); // Materialize once for multiple operations

        if (!recentTicks.Any())
            return null;

        var recentTrades = _executedTrades
            .Where(trade => trade.Symbol == symbol)
            .Where(trade => trade.Timestamp > DateTime.UtcNow.AddMinutes(-5))
            .ToList();

        return new MarketSummary
        {
            Symbol = symbol,
            LastPrice = recentTicks.Last().LastPrice,
            Volume = recentTrades.Sum(t => t.Quantity),
            High = recentTicks.Max(t => t.LastPrice),
            Low = recentTicks.Min(t => t.LastPrice),
            Open = recentTicks.First().LastPrice,

            // Use SortedDictionary for efficient price level access
            BidDepth = _orderBooks[symbol].BidLevels.Take(10).ToDictionary(kvp => kvp.Key, kvp => kvp.Value.Quantity),
            AskDepth = _orderBooks[symbol].AskLevels.Take(10).ToDictionary(kvp => kvp.Key, kvp => kvp.Value.Quantity),

            TradeCount = recentTrades.Count,
            AverageTradeSize = recentTrades.Any() ? recentTrades.Average(t => t.Quantity) : 0
        };
    }
}

// Specialized collections for ultra-low latency scenarios
public class RingBuffer<T>
{
    private readonly T[] _buffer;
    private readonly int _size;
    private int _head = 0;
    private int _tail = 0;
    private bool _isFull = false;

    public RingBuffer(int size)
    {
        _size = size;
        _buffer = new T[size];
    }

    public void Add(T item)
    {
        _buffer[_head] = item;

        if (_isFull)
        {
            _tail = (_tail + 1) % _size;
        }

        _head = (_head + 1) % _size;
        _isFull = _head == _tail;
    }

    public IEnumerable<T> Where(Func<T, bool> predicate)
    {
        if (_isFull)
        {
            // Buffer is full, check all elements
            for (int i = 0; i < _size; i++)
            {
                int index = (_tail + i) % _size;
                if (predicate(_buffer[index]))
                {
                    yield return _buffer[index];
                }
            }
        }
        else
        {
            // Buffer not full, check from tail to head
            for (int i = _tail; i != _head; i = (i + 1) % _size)
            {
                if (predicate(_buffer[i]))
                {
                    yield return _buffer[i];
                }
            }
        }
    }
}
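A short illustration of the overwrite behavior; this is a hypothetical console snippet, not trading-system code:

using System;

var buffer = new RingBuffer<int>(3);
for (int i = 1; i <= 5; i++)
{
    buffer.Add(i); // 1 and 2 are overwritten once the buffer wraps
}

foreach (var value in buffer.Where(v => v % 2 == 0))
{
    Console.WriteLine(value); // Prints 4 - the buffer now holds 3, 4, 5
}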

Large-Scale Data Analytics Pipeline

public class BigDataAnalyticsPipeline
{
    private readonly ConcurrentDictionary<string, MetricAccumulator> _metricAccumulators = new ConcurrentDictionary<string, MetricAccumulator>();
    private readonly PartitionedCollection<DataPoint> _dataPoints = new PartitionedCollection<DataPoint>(Environment.ProcessorCount);
    private readonly TimeBucketedCollection<AggregatedMetric> _timeSeriesData = new TimeBucketedCollection<AggregatedMetric>();
    private readonly DateTime _processingStartTime = DateTime.UtcNow; // Referenced when reporting ProcessingDuration

    public async Task<AnalyticsResult> ProcessLargeDataset(IAsyncEnumerable<RawDataRecord> dataStream)
    {
        var processingTasks = new List<Task>();
        var semaphore = new SemaphoreSlim(Environment.ProcessorCount);

        await foreach (var rawData in dataStream.ConfigureAwait(false))
        {
            await semaphore.WaitAsync();

            processingTasks.Add(Task.Run(async () =>
            {
                try
                {
                    await ProcessDataRecord(rawData);
                }
                finally
                {
                    semaphore.Release();
                }
            }));

            // Prevent unbounded task accumulation
            if (processingTasks.Count > 10000)
            {
                await Task.WhenAll(processingTasks.Take(5000));
                processingTasks.RemoveRange(0, 5000);
            }
        }

        // Wait for remaining tasks
        await Task.WhenAll(processingTasks);

        return GenerateAnalyticsResult();
    }

    private async Task ProcessDataRecord(RawDataRecord record)
    {
        // Transform raw data
        var dataPoints = ExtractDataPoints(record);

        // Partition data for parallel processing (Math.Abs guards against negative hash codes)
        var partitionKey = Math.Abs(record.TenantId.GetHashCode() % Environment.ProcessorCount);
        _dataPoints.AddToPartition(partitionKey, dataPoints);

        // Update running aggregations
        foreach (var point in dataPoints)
        {
            var accumulator = _metricAccumulators.GetOrAdd(point.MetricName, _ => new MetricAccumulator());
            accumulator.AddValue(point.Value, point.Timestamp);
        }

        // Store in time-bucketed collection for temporal queries
        var aggregatedMetric = new AggregatedMetric
        {
            MetricName = record.Type,
            Value = dataPoints.Average(dp => dp.Value),
            Timestamp = record.Timestamp,
            Count = dataPoints.Count
        };

        _timeSeriesData.Add(aggregatedMetric, record.Timestamp);
    }

    private AnalyticsResult GenerateAnalyticsResult()
    {
        // Parallel aggregation across partitions
        var partitionResults = _dataPoints.Partitions
            .AsParallel()
            .Select(partition => new PartitionResult
            {
                PartitionId = partition.Key,
                TotalPoints = partition.Value.Count,
                AverageValue = partition.Value.Any() ? partition.Value.Average(dp => dp.Value) : 0,
                UniqueMetrics = partition.Value.Select(dp => dp.MetricName).Distinct().Count()
            })
            .ToList();

        // Time-based analysis
        var timeSeriesAnalysis = _timeSeriesData.GetBucketsInRange(
            DateTime.UtcNow.AddHours(-24), 
            DateTime.UtcNow)
            .Select(bucket => new TimeSeriesPoint
            {
                Timestamp = bucket.Key,
                TotalEvents = bucket.Value.Count,
                AverageValue = bucket.Value.Average(m => m.Value)
            })
            .OrderBy(tsp => tsp.Timestamp)
            .ToList();

        // Top metrics analysis
        var topMetrics = _metricAccumulators
            .Select(kvp => new MetricSummary
            {
                Name = kvp.Key,
                TotalValue = kvp.Value.TotalValue,
                Count = kvp.Value.Count,
                Average = kvp.Value.Average,
                StandardDeviation = kvp.Value.StandardDeviation
            })
            .OrderByDescending(ms => ms.TotalValue)
            .Take(100)
            .ToList();

        return new AnalyticsResult
        {
            ProcessingDuration = DateTime.UtcNow.Subtract(_processingStartTime),
            TotalDataPoints = partitionResults.Sum(pr => pr.TotalPoints),
            PartitionResults = partitionResults,
            TimeSeriesAnalysis = timeSeriesAnalysis,
            TopMetrics = topMetrics,
            MemoryUsage = GC.GetTotalMemory(false)
        };
    }
}

public class PartitionedCollection<T>
{
    private readonly ConcurrentDictionary<int, ConcurrentBag<T>> _partitions;
    private readonly int _partitionCount;

    public PartitionedCollection(int partitionCount)
    {
        _partitionCount = partitionCount;
        _partitions = new ConcurrentDictionary<int, ConcurrentBag<T>>();

        for (int i = 0; i < partitionCount; i++)
        {
            _partitions[i] = new ConcurrentBag<T>();
        }
    }

    public void AddToPartition(int partitionKey, IEnumerable<T> items)
    {
        var partition = _partitions[partitionKey % _partitionCount];
        foreach (var item in items)
        {
            partition.Add(item);
        }
    }

    public IReadOnlyDictionary<int, ConcurrentBag<T>> Partitions => _partitions;

    public ParallelQuery<T> AsParallel()
    {
        return _partitions.Values
            .AsParallel()
            .SelectMany(partition => partition);
    }
}

public class TimeBucketedCollection<T>
{
    private readonly ConcurrentDictionary<DateTime, ConcurrentBag<T>> _timeBuckets = new ConcurrentDictionary<DateTime, ConcurrentBag<T>>();
    private readonly TimeSpan _bucketSize = TimeSpan.FromMinutes(1);

    public void Add(T item, DateTime timestamp)
    {
        var bucketKey = new DateTime((timestamp.Ticks / _bucketSize.Ticks) * _bucketSize.Ticks);
        var bucket = _timeBuckets.GetOrAdd(bucketKey, _ => new ConcurrentBag<T>());
        bucket.Add(item);
    }

    public IEnumerable<KeyValuePair<DateTime, ConcurrentBag<T>>> GetBucketsInRange(DateTime start, DateTime end)
    {
        return _timeBuckets
            .Where(kvp => kvp.Key >= start && kvp.Key <= end)
            .OrderBy(kvp => kvp.Key);
    }
}
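To see the one-minute bucketing in action, here's a small hypothetical usage example:

using System;

var buckets = new TimeBucketedCollection<string>();
buckets.Add("user-login", DateTime.UtcNow);
buckets.Add("user-login", DateTime.UtcNow.AddSeconds(30)); // Lands in the same minute bucket

foreach (var bucket in buckets.GetBucketsInRange(DateTime.UtcNow.AddMinutes(-5), DateTime.UtcNow))
{
    Console.WriteLine($"{bucket.Key:HH:mm} -> {bucket.Value.Count} events");
}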

Collection Selection Decision Matrix

Choosing the Right Collection Type

public static class CollectionSelector
{
    public static string RecommendCollection(CollectionRequirements requirements)
    {
        // Decision matrix based on usage patterns.
        // Thread-safety is checked first: a switch expression takes the first
        // matching arm, so these cases would be unreachable if they came after
        // the general-purpose patterns below.
        return requirements switch
        {
            // Thread-safe FIFO processing
            { RequiresThreadSafety: true, ProcessingOrder: ProcessingOrder.FIFO } 
                => "ConcurrentQueue<T>",

            // Thread-safe fast lookups by key
            { RequiresThreadSafety: true, NeedsFastLookup: true } 
                => "ConcurrentDictionary<TKey, TValue>",

            // Thread-safe storage that allows duplicates
            { RequiresThreadSafety: true, AllowsDuplicates: true } 
                => "ConcurrentBag<T>",

            // Fast lookups by key
            { NeedsFastLookup: true, KeyType: not null, AllowsDuplicates: false } 
                => "Dictionary<TKey, TValue>",

            // Unique items without ordering
            { AllowsDuplicates: false, NeedsOrdering: false } 
                => "HashSet<T>",

            // Unique items with ordering
            { AllowsDuplicates: false, NeedsOrdering: true } 
                => "SortedSet<T>",

            // FIFO processing
            { ProcessingOrder: ProcessingOrder.FIFO } 
                => "Queue<T>",

            // LIFO processing
            { ProcessingOrder: ProcessingOrder.LIFO } 
                => "Stack<T>",

            // Frequent middle insertions/deletions
            { FrequentMiddleModification: true } 
                => "LinkedList<T>",

            // UI data binding
            { NeedsChangeNotification: true } 
                => "ObservableCollection<T>",

            // Default: general-purpose dynamic collection
            _ => "List<T>"
        };
    }

    public static CollectionPerformanceProfile GetPerformanceProfile(string collectionType)
    {
        return collectionType switch
        {
            "List<T>" => new CollectionPerformanceProfile
            {
                IndexAccess = "O(1)",
                Add = "O(1) amortized",
                Insert = "O(n)",
                Remove = "O(n)",
                Contains = "O(n)",
                Memory = "Efficient",
                ThreadSafe = false
            },

            "Dictionary<TKey,TValue>" => new CollectionPerformanceProfile
            {
                IndexAccess = "O(1) average",
                Add = "O(1) average",
                Insert = "N/A",
                Remove = "O(1) average", 
                Contains = "O(1) average",
                Memory = "Good",
                ThreadSafe = false
            },

            "HashSet<T>" => new CollectionPerformanceProfile
            {
                IndexAccess = "N/A",
                Add = "O(1) average",
                Insert = "N/A",
                Remove = "O(1) average",
                Contains = "O(1) average", 
                Memory = "Good",
                ThreadSafe = false
            },

            "Queue<T>" => new CollectionPerformanceProfile
            {
                IndexAccess = "N/A",
                Add = "O(1) (Enqueue)",
                Insert = "N/A",
                Remove = "O(1) (Dequeue)",
                Contains = "O(n)",
                Memory = "Efficient",
                ThreadSafe = false
            },

            _ => throw new ArgumentException($"Unknown collection type: {collectionType}")
        };
    }
}

public record CollectionRequirements
{
    public bool NeedsFastLookup { get; init; }
    public bool AllowsDuplicates { get; init; } = true;
    public bool NeedsOrdering { get; init; }
    public bool FrequentMiddleModification { get; init; }
    public bool NeedsChangeNotification { get; init; }
    public bool RequiresThreadSafety { get; init; }
    public ProcessingOrder ProcessingOrder { get; init; } = ProcessingOrder.None;
    public Type KeyType { get; init; }
    public int ExpectedSize { get; init; }
    public AccessPattern PrimaryAccessPattern { get; init; }
}

public enum ProcessingOrder { None, FIFO, LIFO }
public enum AccessPattern { Sequential, Random, KeyBased }

public record CollectionPerformanceProfile
{
    public string IndexAccess { get; init; }
    public string Add { get; init; }
    public string Insert { get; init; }
    public string Remove { get; init; }
    public string Contains { get; init; }
    public string Memory { get; init; }
    public bool ThreadSafe { get; init; }
}
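Putting the selector to work: a single-threaded keyed lookup resolves to Dictionary<TKey, TValue>. This snippet is purely illustrative:

using System;

var requirements = new CollectionRequirements
{
    NeedsFastLookup = true,
    AllowsDuplicates = false,
    KeyType = typeof(string)
};

Console.WriteLine(CollectionSelector.RecommendCollection(requirements)); // Dictionary<TKey, TValue>

var profile = CollectionSelector.GetPerformanceProfile("Dictionary<TKey, TValue>");
Console.WriteLine(profile.Contains); // O(1) average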

Production Debugging and Monitoring

Collection Performance Monitoring

public class CollectionPerformanceMonitor
{
    private readonly ConcurrentDictionary<string, CollectionMetrics> _metrics = new ConcurrentDictionary<string, CollectionMetrics>();
    private readonly Timer _reportingTimer;

    public CollectionPerformanceMonitor()
    {
        _reportingTimer = new Timer(ReportMetrics, null, TimeSpan.FromMinutes(1), TimeSpan.FromMinutes(1));
    }

    public CollectionMonitorScope<T> MonitorCollection<T>(ICollection<T> collection, string name)
    {
        // Return the concrete scope so callers can invoke BeginOperation/EndOperation directly
        return new CollectionMonitorScope<T>(collection, name, this);
    }

    internal void RecordOperation(string collectionName, string operation, TimeSpan duration, int collectionSize)
    {
        var metrics = _metrics.GetOrAdd(collectionName, _ => new CollectionMetrics());

        lock (metrics)
        {
            metrics.OperationCounts[operation] = metrics.OperationCounts.GetValueOrDefault(operation, 0) + 1;
            metrics.OperationDurations[operation] = metrics.OperationDurations.GetValueOrDefault(operation, TimeSpan.Zero) + duration;
            metrics.MaxSize = Math.Max(metrics.MaxSize, collectionSize);
            metrics.LastUpdated = DateTime.UtcNow;
        }
    }

    private void ReportMetrics(object state)
    {
        foreach (var kvp in _metrics)
        {
            var metrics = kvp.Value;
            Console.WriteLine($"Collection: {kvp.Key}");
            Console.WriteLine($"  Max Size: {metrics.MaxSize}");

            foreach (var operation in metrics.OperationCounts)
            {
                var avgDuration = metrics.OperationDurations[operation.Key].TotalMilliseconds / operation.Value;
                Console.WriteLine($"  {operation.Key}: {operation.Value} ops, {avgDuration:F2}ms avg");
            }
            Console.WriteLine();
        }
    }
}

public class CollectionMonitorScope<T> : IDisposable
{
    private readonly ICollection<T> _collection;
    private readonly string _name;
    private readonly CollectionPerformanceMonitor _monitor;
    private readonly Stopwatch _stopwatch = new Stopwatch();

    public CollectionMonitorScope(ICollection<T> collection, string name, CollectionPerformanceMonitor monitor)
    {
        _collection = collection;
        _name = name;
        _monitor = monitor;
    }

    public void BeginOperation(string operationName)
    {
        _stopwatch.Restart();
    }

    public void EndOperation(string operationName)
    {
        _stopwatch.Stop();
        _monitor.RecordOperation(_name, operationName, _stopwatch.Elapsed, _collection.Count);
    }

    public void Dispose()
    {
        // Cleanup if needed
    }
}
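Because MonitorCollection returns the concrete scope, callers can time operations directly. A hypothetical hot-path sketch; the collection name and operation labels are arbitrary:

using System;
using System.Collections.Generic;

var monitor = new CollectionPerformanceMonitor();
var orders = new List<int>();

using (var scope = monitor.MonitorCollection(orders, "orders"))
{
    scope.BeginOperation("Add");
    orders.Add(42);
    scope.EndOperation("Add");
}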

Chapter Summary

Building production-ready applications requires more than knowing individual collection types. It means architecting systems that combine multiple collections strategically, handle concurrent access safely, and are tuned for real-world performance requirements. The patterns in this chapter show how senior engineers approach complex data-management challenges: choosing the right collection for each access pattern, implementing efficient caching strategies, and monitoring system performance.

Key architectural principles: separate concerns through specialized collections, optimize for your specific access patterns, implement proper thread safety, monitor performance in production, and choose simplicity over premature optimization while keeping the flexibility to tune when measurements demand it.

Conclusion

Mastering collections in C# transforms how you approach software architecture and problem-solving. From simple data storage to complex distributed systems, the right collection choices make the difference between code that merely works and code that excels under pressure.

Throughout this journey, we've explored not just the mechanics of each collection type, but the thinking process behind choosing and combining them effectively. The patterns and principles in this book prepare you to tackle real-world challenges with confidence, whether you're building user interfaces, processing big data, or architecting microservices.

Remember: Great software engineering isn't about memorizing APIs—it's about understanding trade-offs and making informed decisions that serve both your current requirements and future growth. Keep practicing, keep measuring, and keep learning.

The collections are your tools. The architecture is your craft. Now go build something amazing.
