Tuesday, 16 December 2025

C# Developer Common Mistakes That Hurt Performance (And How To Avoid Them)

 

Introduction

Performance problems in C# apps are rarely about a single “big bug”. They usually come from a collection of small, common patterns: convenient code that looks clean, but quietly wastes CPU, memory, threads, or connections. This article walks through typical mistakes and gives practical, low-friction alternatives you can apply in day‑to‑day coding.

1. Treating LINQ as “Free” (Hidden N² and Allocations)

1.1. Chaining LINQ Inside Hot Loops

Mistake: Using .Where().Select().First()/Any() repeatedly inside loops. For example:
- For each row, you call Where(...) and First(...) over the same list.
- For each item, you run Select(...).Any(...) over another list.

Impact: 
- Hidden O(n²) behavior when both sequences grow.
- Many temporary iterators and lists → extra allocations and GC pressure.

Better approach: 
- Precompute lookups once outside the loop: use a Dictionary<TKey, TValue> to map keys to values, or a HashSet<T> for “contains?” checks.
- Use a single-pass foreach over the source and do all necessary checks inside.

Heuristics:
- If you see LINQ in an inner loop, ask: “Can I precompute this?”
- Avoid patterns like .Where(...).Count(), .Where(...).Any(), or .Where(...).ToList() being run multiple times over the same input.

Example (bad vs better)

// BAD: LINQ inside the inner loop, repeated scans
foreach (var order in orders)
{
    var linesForOrder = orderLines
        .Where(l => l.OrderId == order.Id)
        .ToList();

    foreach (var line in linesForOrder)
    {
        var productName = products
            .Where(p => p.Id == line.ProductId)
            .Select(p => p.Name)
            .FirstOrDefault();

        Console.WriteLine($"{order.Id}: {productName}");
    }
}

// BETTER: precompute lookups once, then use O(1) access in loops
var linesByOrder = orderLines
    .GroupBy(l => l.OrderId)
    .ToDictionary(g => g.Key, g => g.ToList());

var productNameById = products
    .ToDictionary(p => p.Id, p => p.Name);

foreach (var order in orders)
{
    if (!linesByOrder.TryGetValue(order.Id, out var lines))
        continue;

    foreach (var line in lines)
    {
        if (!productNameById.TryGetValue(line.ProductId, out var productName))
            continue;

        Console.WriteLine($"{order.Id}: {productName}");
    }
}
1.2. Overusing ToList() and ToList().ForEach(...)

Mistake: 
- Calling .ToList() just to iterate.
- Using .ToList().ForEach(...) instead of a foreach.
- Turning arrays or List<T> into another List<T> unnecessarily.

Impact: 
- Extra memory allocations with no benefit.
- Double enumeration if the source is not already materialized.

Better approach:
- Use foreach directly on IEnumerable<T>.

- Only call .ToList() when you need:
 => A stable snapshot, or
 => Indexing / random access, or
 => To reuse the result multiple times.

Heuristics:
- If .ToList() is followed immediately by .ForEach, replace with foreach.
- If a sequence is enumerated only once, don’t force a list.

Example (bad vs better)

// BAD: needless ToList + ForEach
GetUsersFromDb()
    .ToList()
    .ForEach(u => Console.WriteLine(u.Email));

// BETTER: just iterate the sequence
foreach (var user in GetUsersFromDb())
{
    Console.WriteLine(user.Email);
}

2. “Half‑Async” Code: Mixing Async With .Result and .Wait()

2.1. Blocking on Async (.Result / .Wait())

Mistake:
- Calling SomeAsync(...).Result or SomeAsync(...).Wait() in:
 => Controllers
 => Middlewares
 => Repositories and services

Impact:
- Deadlocks when combined with synchronization contexts.
- Thread pool starvation: many threads block instead of doing work.
- Poor scalability under concurrent load.

Better approach:
- Make the entire call chain async all the way down:
 => Expose async methods: Task<T> FooAsync(...).
 => Always use await FooAsync(...).

- If you must bridge sync/async (e.g. in Main), keep it at the very edge of the application, and use ConfigureAwait(false) in library code to reduce deadlock risk.

Heuristics:
- In ASP.NET Core request handling, never use .Result or .Wait() on Task.
- If a method calls an async API, strongly consider making that method async too.

Example (bad vs better)

// BAD: blocking on async inside a web request
public IActionResult GetUser(int id)
{
    var user = _userService.GetUserAsync(id).Result; // can deadlock
    return Ok(user);
}

// BETTER: async all the way
public async Task<IActionResult> GetUser(int id)
{
    var user = await _userService.GetUserAsync(id);
    return Ok(user);
}
2.2. Wrapping I/O in Task.Run

Mistake:
- Using await Task.Run(() => /* DB or HTTP call */) just to make it “look async”.

Impact:
- Still blocks a thread per I/O operation; now it’s a thread-pool thread instead of the request thread.
- Minimal scalability benefit.

Better approach:
- Use true async I/O:
 => await connection.QueryAsync(...) instead of Query(...).
 => await httpClient.SendAsync(...) instead of Send(...).

- Let the framework manage threads; your methods should just await I/O tasks.

Heuristics:
- If the work is I/O-bound, use async APIs.
- Use Task.Run only for CPU-bound work you explicitly want to parallelize.

Example (bad vs better)

// BAD: wrapping HTTP I/O in Task.Run
public async Task<string> GetDataAsync(string url)
{
    return await Task.Run(() =>
    {
        // This still blocks a thread
        return _httpClient.GetStringAsync(url).Result;
    });
}

// BETTER: use proper async I/O
public async Task<string> GetDataAsync(string url)
{
    return await _httpClient.GetStringAsync(url);
}
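
For contrast, Task.Run is appropriate when the work really is CPU-bound and you want it off the calling thread. A minimal sketch (ParseLargeReport and ReportSummary are hypothetical names used only for illustration):

// OK: Task.Run for genuinely CPU-bound work (heavy parsing, number crunching),
// keeping the caller responsive while the computation runs on the thread pool.
public async Task<ReportSummary> SummarizeAsync(string rawReport)
{
    return await Task.Run(() => ParseLargeReport(rawReport));
}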

3. Inefficient HTTP Usage (HttpClient & Headers)

Mistake:
- Writing using var client = new HttpClient(); inside every method that calls an API.

Impact:
- Socket exhaustion from many short‑lived connections.
- No effective connection pooling.
- Increased latency and resource consumption.

Better approach:
- In ASP.NET Core, use IHttpClientFactory:
 => Register named or typed clients in DI.
 => Inject them into your services / repositories.
- Configure base address and default headers once at registration, not on every call.

Heuristics:
- In modern .NET apps, grep for new HttpClient() and replace with factory-based clients.
- Set per‑request headers on HttpRequestMessage, not by creating new clients.

Example (bad vs better)

// BAD: new HttpClient every request
public async Task<string> GetWeatherAsync(string city)
{
    using var client = new HttpClient();
    var url = $"https://api.example.com/weather?city={city}";
    return await client.GetStringAsync(url);
}

// BETTER: use IHttpClientFactory (ASP.NET Core)
public class WeatherService
{
    private readonly HttpClient _client;

    public WeatherService(IHttpClientFactory httpClientFactory)
    {
        _client = httpClientFactory.CreateClient("WeatherApi");
    }

    public Task<string> GetWeatherAsync(string city)
    {
        var url = $"weather?city={city}";
        return _client.GetStringAsync(url);
    }
}
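
For completeness, a minimal sketch of the registration side and of setting a per-request header, assuming ASP.NET Core's AddHttpClient in Program.cs (the base address and header names below are illustrative):

// Program.cs: register the named client once; the factory manages handler pooling.
builder.Services.AddHttpClient("WeatherApi", client =>
{
    client.BaseAddress = new Uri("https://api.example.com/");
    client.DefaultRequestHeaders.Add("Accept", "application/json");
});

// Per-request headers belong on HttpRequestMessage, not on a freshly created client.
var request = new HttpRequestMessage(HttpMethod.Get, "weather?city=Oslo");
request.Headers.Add("X-Correlation-Id", Guid.NewGuid().ToString());
var response = await _client.SendAsync(request);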

4. Heavy or Blocking Constructors / Initialization

Mistake:
- Calling .Result / .Wait() on async factory methods inside constructors.
- Performing network or DB calls as soon as a service is constructed.

Impact:
- Slower and fragile application startup.
- Hard‑to‑debug failures at DI resolution time.
- Potential deadlocks during startup.

Better approach:
- Keep constructors lightweight and side-effect free.
- Initialize expensive resources:
 => Lazily (via async methods or lazy wrappers), or
 => At startup via background services / startup tasks.
- For things like DB or Redis connections:
 => Prefer singleton ConnectionMultiplexer/connection factories initialized once and reused.

Heuristics:
- If a constructor can throw due to network/DB issues, redesign.
- Avoid any .Result / .Wait() in constructors of DI services.

Example (bad vs better)

// BAD: constructor blocks on async DB call
public class UserRepository
{
    private readonly string _connectionString;

    public UserRepository(IConfigService config)
    {
        // Async method, but forced sync here
        _connectionString = config.GetConnectionStringAsync().Result;
    }
}

// BETTER: keep ctor light, use async initialization in methods
public class UserRepository
{
    private readonly IConfigService _config;

    public UserRepository(IConfigService config)
    {
        _config = config;
    }

    private async Task<SqlConnection> CreateConnectionAsync()
    {
        var connString = await _config.GetConnectionStringAsync();
        return new SqlConnection(connString);
    }

    public async Task<User?> GetByIdAsync(int id)
    {
        await using var connection = await CreateConnectionAsync();
        // query DB...
        return null;
    }
}
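
For resources that should be created once and reused (like the Redis ConnectionMultiplexer mentioned above), an async-lazy wrapper keeps the constructor cheap while still connecting only once. A sketch, assuming StackExchange.Redis and a connection string stored under "Redis:ConnectionString":

// Constructor stays trivial; the expensive ConnectAsync runs once, on first use,
// and every caller awaits the same shared task.
public class RedisConnectionProvider
{
    private readonly Lazy<Task<ConnectionMultiplexer>> _connection;

    public RedisConnectionProvider(IConfiguration configuration)
    {
        _connection = new Lazy<Task<ConnectionMultiplexer>>(() =>
            ConnectionMultiplexer.ConnectAsync(configuration["Redis:ConnectionString"]!));
    }

    public Task<ConnectionMultiplexer> GetConnectionAsync() => _connection.Value;
}

Register it as a singleton so all consumers share the same connection.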

5. Repeated Scans Over Large Data Sets With LINQ

5.1. Re‑Filtering the Same Sequence Many Times

Mistake:
- For a given dataset, running similar Where(...) filters multiple times:
 => One for Count, one for Sum, one for Average, etc.
- Example: For each region or key in a geo dataset, scanning the full dataset again in nested loops.

Impact:
- O(m × n) behavior where you could do O(n).
- Wasted CPU and allocations as data volume grows.

Better approach:
- Group once, compute many things:
 => Use .GroupBy(keySelector) so you iterate the raw data once.
 => Inside each group, compute all required aggregates (Count, Sum, etc.).
- Or, pre-filter into a list:
 => var filtered = data.Where(...).ToList();
 => Then use filtered.Count, filtered.Sum(...), etc.

Heuristics:
- If you see the same Where predicate copy‑pasted several times, factor it out.
- If multiple stats use the same subset of data, filter once into a temporary collection.

Example (bad vs better)

// BAD: re-filtering the same sequence multiple times
var highValueOrdersCount = orders
    .Where(o => o.Total > 1000 && o.Status == OrderStatus.Completed)
    .Count();

var highValueOrdersTotal = orders
    .Where(o => o.Total > 1000 && o.Status == OrderStatus.Completed)
    .Sum(o => o.Total);

// BETTER: filter once, reuse
var highValueOrders = orders
    .Where(o => o.Total > 1000 && o.Status == OrderStatus.Completed)
    .ToList();

var highValueOrdersCount = highValueOrders.Count;
var highValueOrdersTotal = highValueOrders.Sum(o => o.Total);
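
When you need several aggregates per key, the same idea extends to grouping: iterate the raw data once and compute everything inside each group. A sketch following the order shape above (CustomerId is an assumed property added for illustration):

// Group once, then compute multiple aggregates per customer from that single pass.
var statsByCustomer = orders
    .Where(o => o.Status == OrderStatus.Completed)
    .GroupBy(o => o.CustomerId)
    .Select(g => new
    {
        CustomerId = g.Key,
        OrderCount = g.Count(),
        TotalSpent = g.Sum(o => o.Total)
    })
    .ToList();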

6. Unnecessary Allocations and Object Churn

6.1. Throwaway Collections and Strings

Mistake:
- Creating new List<>/Dictionary<>/string instances in tight loops where they could be reused.
- Rebuilding complex keys or query strings again and again.

Impact:
- High GC overhead and more frequent pauses.
- Increased memory footprint.

Better approach:
- Where it's safe, reuse existing collections by clearing them between iterations:
 => E.g. list.Clear(); then list.Add(...).
- Use StringBuilder in truly hot string-building paths.
- Centralize key-building logic and cache expensive computed strings when reused.

Heuristics:
- Profile hot paths and look at allocation-heavy methods.
- Pay close attention to code running “per row / per KPI / per request”.

Example (bad vs better)

// BAD: allocating a new list and string on every iteration
foreach (var order in orders)
{
    var lineItems = new List<string>();
    foreach (var line in order.Lines)
    {
        lineItems.Add($"{line.ProductCode}-{line.Quantity}");
    }

    var summary = string.Join(",", lineItems);
    Console.WriteLine(summary);
}

// BETTER: reuse collections and use StringBuilder for hot paths
var lineItems = new List<string>();
var sb = new StringBuilder();

foreach (var order in orders)
{
    lineItems.Clear();

    foreach (var line in order.Lines)
    {
        lineItems.Add($"{line.ProductCode}-{line.Quantity}");
    }

    sb.Clear();
    for (int i = 0; i < lineItems.Count; i++)
    {
        if (i > 0) sb.Append(',');
        sb.Append(lineItems[i]);
    }

    Console.WriteLine(sb.ToString());
}

7. JSON & Dynamic Data Overuse

7.1. Deep Nested JToken Traversals

Mistake:
- Multiple nested foreach loops calling SelectTokens(...).ToList() over the same JSON tree.
- Treating all JSON as dynamic with JToken even when schema is fairly stable.

Impact:
- Slow traversal on large JSON payloads.
- Lots of allocations for intermediate JToken lists.

Better approach:
- Cache results of SelectTokens if you reuse them.
- Flatten traversal where possible; walk the tree once and process everything you need.
- For stable schemas, use strongly-typed models:
 => JsonConvert.DeserializeObject<MyDto>(json) and work on C# objects.

Heuristics:
- If the JSON structure is known and stable, prefer POCOs over dynamic JToken.
- Treat calls to SelectTokens(...).ToList() as potential hotspots and minimize repeats.

Example (bad vs better)

// BAD: repeated SelectTokens and ToList over the same JSON
var root = JToken.Parse(json);

foreach (var user in root.SelectTokens("$.users[*]").ToList())
{
    var name = user.SelectToken("$.profile.name")?.ToString();
    var email = user.SelectToken("$.profile.email")?.ToString();
    Console.WriteLine($"{name} - {email}");
}

// BETTER: strongly-typed model or simpler traversal
public class UserProfile
{
    public string Name { get; set; } = string.Empty;
    public string Email { get; set; } = string.Empty;
}

public class UserDto
{
    public UserProfile Profile { get; set; } = new();
}

public class UsersRoot
{
    public List<UserDto> Users { get; set; } = new();
}

var usersRoot = JsonConvert.DeserializeObject<UsersRoot>(json);

foreach (var user in usersRoot?.Users ?? Enumerable.Empty<UserDto>())
{
    Console.WriteLine($"{user.Profile.Name} - {user.Profile.Email}");
}
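
If you do need to stay dynamic (for example, the schema varies), the first bullet above still applies: walk the tree once and avoid re-running SelectTokens over the same document. A small sketch, assuming Newtonsoft.Json as in the example above:

// Enumerate users a single time and read both fields per user via indexers,
// instead of issuing separate SelectTokens queries against the full tree.
var root = JToken.Parse(json);

foreach (var user in root.SelectTokens("$.users[*]"))
{
    var profile = user["profile"];
    var name = profile?["name"]?.ToString();
    var email = profile?["email"]?.ToString();
    Console.WriteLine($"{name} - {email}");
}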

8. Practical Checklists for Code Reviews

8.1. LINQ Checklist

Check:
- LINQ inside tight loops?
- Repeated Where/Select with identical predicates?
- Unnecessary .ToList() allocations?

Action:
- Precompute dictionaries/sets.
- Use foreach where it’s clearer and cheaper.
- Filter once, reuse the filtered result.

8.2. Async / Threading Checklist

Check:
- Any .Result, .Wait(), or GetAwaiter().GetResult()?
- Any Task.Run wrapping DB/HTTP/Redis calls?
- Async methods that never use await?

Action:
- Make methods async all the way.
- Use async I/O APIs directly.
- Remove Task.Run for I/O work.

8.3. I/O and HttpClient Checklist

Check:
- new HttpClient() in code (especially in web apps)?
- Connections opened in loops without pooling or reuse?


Action:
- Use IHttpClientFactory and proper connection pooling.
- Reuse clients and connections where appropriate.

Conclusion

Most performance wins in C# come from consistent small habits, not heroic rewrites:

  • Use LINQ thoughtfully; avoid hidden N² and unnecessary ToList().
  • Make async code truly async; never block on Task.
  • Reuse I/O resources like HttpClient, DB connections, and caches.
  • Avoid repeated work and needless allocations in hot paths.

If you bake these checks into your everyday coding and reviews, you’ll build C# services that are faster, more scalable, and easier to reason about — without giving up code clarity. 


Hope you enjoyed the article. Happy Programming.

Zero-Downtime CI/CD to Azure App Service with GitHub Actions and Slots

 This guide walks you through building a production-grade, zero-downtime CI/CD pipeline using GitHub Actions and Azure App Service deployment slots.

The problem we’re solving

Technologies and services used

High-level CI/CD architecture

CI/CD flow diagram (Mermaid)


The Core Workflow (Sanitized Example)

name: Prod - Web App Deployment

on:
  push:
    branches:
      - main
    paths:
      - 'UI/**'
  workflow_dispatch:
    inputs:
      slot:
        description: 'Web App slot to deploy to'
        required: false
        default: 'staging'
        type: choice
        options: [staging, production]
      runE2E:
        description: 'Run E2E tests after deployment'
        required: false
        default: 'true'
        type: choice
        options: ['true', 'false']
      runMobileE2E:
        description: 'Run Mobile E2E tests after deployment'
        required: false
        default: 'true'
        type: choice
        options: ['true', 'false']

env:
  SLOT_NAME: ${{ inputs.slot || 'staging' }}

jobs:
  build:
    runs-on: windows-latest
    environment: Production
    steps:
      - uses: actions/checkout@v2

      - uses: Azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      - uses: Azure/get-keyvault-secrets@v1
        id: keyVaultSecrets
        with:
          keyvault: ${{ secrets.KEYVAULT_NAME }}
          secrets: 'SECRET_A,SECRET_B,SECRET_C' # use your own secret names

      - name: build
        uses: ./.github/actions/build_web
        with:
          npmuserEmail: ${{ secrets.NPM_USER }}
          npmtoken: ${{ secrets.NPM_TOKEN }}
          # Example of mapping secrets/vars to your build-time env
          appKey: ${{ secrets.APP_KEY }}
          appPath: ${{ secrets.API_PATH }}
          skipTestRun: "false"

  deploy:
    runs-on: windows-latest
    needs: build
    environment:
      name: Production
      url: ${{ steps.deploy.outputs.webapp-url }}
    steps:
      - uses: actions/checkout@v2

      - uses: Azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      - name: deploy
        id: deploy
        uses: ./.github/actions/deploy_web
        with:
          appName: ${{ secrets.AZUREAPPSERVICE_WEBAPP_NAME }}
          slotName: ${{ env.SLOT_NAME }}
          resourceGroup: ${{ vars.AZURE_RESOURCE_GROUP || secrets.AZURE_RESOURCE_GROUP }}

  E2ETests:
    needs: deploy
    if: ${{ github.event_name != 'workflow_dispatch' || inputs.runE2E == 'true' }}
    uses: ./.github/workflows/reusable_e2e_ui_tests.yml
    with:
      product: ${{ vars.PRODUCT || 'Your Product Name' }}
      environment: Production
      useStagingUrl: true
    secrets: inherit

  E2EMobileTests:
    needs: E2ETests
    if: ${{ github.event_name != 'workflow_dispatch' || inputs.runMobileE2E == 'true' }}
    uses: ./.github/workflows/reusable_e2e_mobile_tests.yml
    with:
      product: ${{ vars.PRODUCT || 'Your Product Name' }}
      environment: Production
      useStagingUrl: true
    secrets: inherit

  SwapSlot:
    needs: E2ETests
    if: ${{
      (github.event_name != 'workflow_dispatch' || inputs.runE2E == 'true' || inputs.runMobileE2E == 'true') &&
      ( !(github.event_name == 'workflow_dispatch' && inputs.runE2E != 'true') || needs.E2ETests.result == 'success' ) &&
      ( !(github.event_name == 'workflow_dispatch' && inputs.runMobileE2E != 'true') || needs.E2EMobileTests.result == 'success' )
      }}
    uses: ./.github/workflows/reusable_slot_swap.yml
    with:
      environment: Production
      slot: ${{ inputs.slot || 'staging' }}
    secrets: inherit

Jobs and steps overview

Building low-cost real-time Chat + In‑App Notifications with Azure Web PubSub (and “no missed messages” persistence)

Real-time features often get postponed because WebSockets can feel “expensive” to build and operate.

Why Azure Web PubSub is “low cost” in practice

The core building block: “client access URL” issuance

Generic flow

Browser/App
  |
  | POST /realtime/clientAccessUrl (authenticated)
  | body: { userId, hubName, groupName }
  v
Backend (Function/API)
  |
  | validates identity + authorization
  | generates signed Web PubSub client URL (short TTL)
  v
Browser/App receives: { url }
  |
  v
WebPubSubClient(url).start(); joinGroup(groupName)

Example (pseudocode)

// client
async function connectToHub({ hubName, groupName, userId }) {
  const { url } = await fetchJson('/realtime/clientAccessUrl', {
    method: 'POST',
    headers: { Authorization: `Bearer ${accessToken}` },
    body: JSON.stringify({ hubName, groupName, userId })
  });

  const client = new WebPubSubClient(url);
  registerHandlers(client);
  await client.start();
  await client.joinGroup(groupName);
  return client;
}
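
On the backend, the issuing endpoint can be a thin wrapper around the Web PubSub service SDK. A hedged C# sketch, assuming the Azure.Messaging.WebPubSub package, a minimal-API host, and a connection string in configuration (authorization checks are elided; the record type is illustrative):

// Issues a short-lived client access URL scoped to one hub and group.
app.MapPost("/realtime/clientAccessUrl", (ClientUrlRequest request, IConfiguration config) =>
{
    var serviceClient = new WebPubSubServiceClient(
        config["WebPubSub:ConnectionString"]!,
        request.HubName);

    var uri = serviceClient.GetClientAccessUri(
        userId: request.UserId,
        roles: new[]
        {
            $"webpubsub.joinLeaveGroup.{request.GroupName}",
            $"webpubsub.sendToGroup.{request.GroupName}"
        },
        expiresAfter: TimeSpan.FromMinutes(15)); // short TTL, as in the flow above

    return Results.Ok(new { url = uri.ToString() });
});

public record ClientUrlRequest(string UserId, string HubName, string GroupName);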

Part 1 — Chat architecture (real-time + durable)

Key idea: persist first, then broadcast

Example send path (pseudocode)

async function sendMessage({ conversationId, text }) {
  // 1) persist
  const saved = await postJson('/chat/reply', { conversationId, message: text });
  // saved includes messageId, timestamp, sender, etc.

  // 2) broadcast (best-effort)
  await pubsubClient.sendToGroup(
    saved.conversationContext,           // groupName
    { conversationId, message: saved },  // payload
    'json'
  );
}

Conversation group model

Receiving messages

client.on('group-message', (e) => {
  const msg = e.message.data;
  appendToUI(msg);
});

Reconnects that don’t melt your system

async function reconnectWithBackoff(makeClient, attempts = 5) {
  for (let i = 1; i <= attempts; i++) {
    try {
      return await makeClient();
    } catch (e) {
      if (i === attempts) throw e;
      await sleep(backoff(i) + jitter());
    }
  }
}

Part 2 — Real-time in-app notifications using the same Web PubSub approach

Notification fanout patterns (choose one)

Example (pseudocode)

class NotificationService {
  listeners = [];

  async start({ userId }) {
    this.client = await connectToHub({
      hubName: 'notifications',
      groupName: `notifications:user:${userId}`,
      userId
    });

    this.client.on('group-message', (e) => {
      const notif = normalizeNotification(e.message.data);
      this.listeners.forEach(fn => fn(notif));
    });
  }
}

Hybrid model: API load + real-time updates (recommended)

Part 3 — The missing piece: “no missed messages” with SQL Server

Reliability goal

Best-practice backend: Store → Outbox → Publish

Client
  |
  | 1) POST /chat/reply
  v
API Service
  | 2) Begin DB transaction
  |    - insert message row (SQL Server)
  |    - insert outbox event row (same transaction)
  |    - commit
  v
Outbox Processor (background worker)
  | 3) Reads unprocessed outbox rows
  | 4) Publishes to Web PubSub group
  | 5) Marks outbox row processed (idempotent)
  v
Web PubSub -> connected clients
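
A hedged C# sketch of the processor side of this diagram (steps 3–5), assuming a BackgroundService host, the Azure.Messaging.WebPubSub SDK, and a hypothetical IOutboxStore abstraction over the outbox table:

// Polls the outbox and publishes events to the corresponding Web PubSub group.
// IOutboxStore, GroupName, and PayloadJson are illustrative names, not a real library API.
public class OutboxPublisher : BackgroundService
{
    private readonly IOutboxStore _outbox;            // reads/marks outbox rows in SQL Server
    private readonly WebPubSubServiceClient _pubsub;  // publishes to the hub

    public OutboxPublisher(IOutboxStore outbox, WebPubSubServiceClient pubsub)
    {
        _outbox = outbox;
        _pubsub = pubsub;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // 3) read a small batch of unprocessed events
            var events = await _outbox.GetUnprocessedAsync(50, stoppingToken);

            foreach (var evt in events)
            {
                // 4) publish the stored JSON payload to the conversation group
                await _pubsub.SendToGroupAsync(evt.GroupName, evt.PayloadJson, ContentType.ApplicationJson);

                // 5) mark the row processed only after a successful publish (safe to retry)
                await _outbox.MarkProcessedAsync(evt.Id, stoppingToken);
            }

            await Task.Delay(TimeSpan.FromSeconds(1), stoppingToken);
        }
    }
}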

Suggested SQL Server tables (minimum viable)

Idempotency and dedupe (do this even if you think you don’t need it)

Catch-up strategy (how clients avoid gaps)

Better ways / improvements over a basic implementation

1) Use ID-based group names (not free text)

2) Don’t broadcast from the request thread (use outbox + worker)

3) Add delivery semantics explicitly

4) Use push notifications for offline delivery

5) Handle attachments properly

6) Security hardening

Closing notes