Introduction
Performance problems in C# apps are rarely about a single “big bug”. They usually come from a collection of small, common patterns: convenient code that looks clean, but quietly wastes CPU, memory, threads, or connections. This article walks through typical mistakes and gives practical, low-friction alternatives you can apply in day‑to‑day coding.
1. Treating LINQ as “Free” (Hidden N² and Allocations)
1.1. Chaining LINQ Inside Hot Loops
Mistake: Using .Where().Select().First()/Any() repeatedly inside loops. For example:
- For each row, you call Where(...) and First(...) over the same list.
- For each item, you run Select(...).Any(...) over another list.
Impact:
- Hidden O(n²) behavior when both sequences grow.
- Many temporary iterators and lists → extra allocations and GC pressure.
Better approach:
- Precompute lookups once outside the loop:
=> Use Dictionary<TKey, TValue> to map from one key to another.
=> Use HashSet<T> for “contains?” checks.
- Use a single-pass foreach over the source and do all necessary checks inside.
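For the “contains?” case, a minimal sketch (the collection and property names here are illustrative, not taken from the example below):
// BETTER: build the set once, then each membership check is O(1)
var activeCustomerIds = new HashSet<int>(activeCustomers.Select(c => c.Id));
foreach (var order in orders)
{
    if (activeCustomerIds.Contains(order.CustomerId))
    {
        // process the order...
    }
}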
Heuristics:
- If you see LINQ in an inner loop, ask: “Can I precompute this?”
- Avoid patterns like .Where(...).Count(), .Where(...).Any(), or .Where(...).ToList() being run multiple times over the same input.
Example (bad vs better)
// BAD: LINQ inside the inner loop, repeated scans
foreach (var order in orders)
{
    var linesForOrder = orderLines
        .Where(l => l.OrderId == order.Id)
        .ToList();
    foreach (var line in linesForOrder)
    {
        var productName = products
            .Where(p => p.Id == line.ProductId)
            .Select(p => p.Name)
            .FirstOrDefault();
        Console.WriteLine($"{order.Id}: {productName}");
    }
}
// BETTER: precompute lookups once, then use O(1) access in loops
var linesByOrder = orderLines
    .GroupBy(l => l.OrderId)
    .ToDictionary(g => g.Key, g => g.ToList());
var productNameById = products
    .ToDictionary(p => p.Id, p => p.Name);
foreach (var order in orders)
{
    if (!linesByOrder.TryGetValue(order.Id, out var lines))
        continue;
    foreach (var line in lines)
    {
        if (!productNameById.TryGetValue(line.ProductId, out var productName))
            continue;
        Console.WriteLine($"{order.Id}: {productName}");
    }
}
1.2. Overusing ToList() and ToList().ForEach(...)
Mistake:
- Calling .ToList() just to iterate.
- Using .ToList().ForEach(...) instead of a foreach.
- Turning arrays or List<T> into another List<T> unnecessarily.
Impact:
- Extra memory allocations with no benefit.
- Double enumeration if the source is not already materialized.
Better approach:
- Use foreach directly on IEnumerable<T>.
- Only call .ToList() when you need:
=> A stable snapshot, or
=> Indexing / random access, or
=> To reuse the result multiple times.
Heuristics:
- If .ToList() is followed immediately by .ForEach, replace with foreach.
- If a sequence is enumerated only once, don’t force a list.
Example (bad vs better)
// BAD: needless ToList + ForEach
GetUsersFromDb()
    .ToList()
    .ForEach(u => Console.WriteLine(u.Email));
// BETTER: just iterate the sequence
foreach (var user in GetUsersFromDb())
{
    Console.WriteLine(user.Email);
}
2. “Half‑Async” Code: Mixing Async With .Result and .Wait()
2.1. Blocking on Async (.Result/.Wait())
Mistake:
- Calling SomeAsync(...).Result or SomeAsync(...).Wait() in:
=> Controllers
=> Middlewares
=> Repositories and services
Impact:
- Deadlocks when combined with synchronization contexts.
- Thread pool starvation: many threads block instead of doing work.
- Poor scalability under concurrent load.
Better approach:
- Make the entire call chain async all the way down:
=> Expose async methods: Task<T> FooAsync(...).
=> Always use await FooAsync(...).
- If you must bridge sync and async (e.g. in Main), keep that bridge at the very edge and use ConfigureAwait(false) carefully (see the sketch below).
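A minimal sketch of keeping the bridge at the outermost edge of a console app (RunAsync is a hypothetical application entry method; in modern .NET, prefer an async Main):
public static class Program
{
    // Preferred: let Main itself be async
    public static async Task Main(string[] args)
    {
        await RunAsync(args);
    }

    // Legacy-only alternative: bridge exactly once, here and nowhere else
    // public static void Main(string[] args) => RunAsync(args).GetAwaiter().GetResult();

    private static async Task RunAsync(string[] args)
    {
        // hypothetical async application logic
        await Task.Delay(10);
    }
}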
Heuristics:
- In ASP.NET Core request handling, never use .Result or .Wait() on Task.
- If a method calls an async API, strongly consider making that method async too.
Example (bad vs better)
// BAD: blocking on async inside a web request
public IActionResult GetUser(int id)
{
    var user = _userService.GetUserAsync(id).Result; // can deadlock
    return Ok(user);
}
// BETTER: async all the way
public async Task<IActionResult> GetUser(int id)
{
    var user = await _userService.GetUserAsync(id);
    return Ok(user);
}
2.2. Wrapping I/O in Task.Run
Mistake:
- Using await Task.Run(() => /* DB or HTTP call */) just to make it “look async”.
Impact:
- Still blocks a thread per I/O operation; now it’s a thread-pool thread instead of the request thread.
- Minimal scalability benefit.
Better approach:
- Use true async I/O:
=> await connection.QueryAsync(...) instead of Query(...).
=> await httpClient.SendAsync(...) instead of Send(...).
- Let the framework manage threads; your methods should just await I/O tasks.
Heuristics:
- If the work is I/O-bound, use async APIs.
- Use Task.Run only for CPU-bound work you explicitly want to parallelize.
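By contrast, when the work really is CPU-bound and you want it off the calling thread, Task.Run is the right tool. A rough sketch (ComputeReport is a hypothetical CPU-heavy method, not part of the examples below):
// OK: Task.Run for CPU-bound work, not for I/O
public async Task<Report> BuildReportAsync(ReportInput input)
{
    // pure computation inside the lambda, no blocking I/O
    return await Task.Run(() => ComputeReport(input));
}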
Example (bad vs better)
// BAD: wrapping HTTP I/O in Task.Run
public async Task<string> GetDataAsync(string url)
{
    return await Task.Run(() =>
    {
        // This still blocks a thread
        return _httpClient.GetStringAsync(url).Result;
    });
}
// BETTER: use proper async I/O
public async Task<string> GetDataAsync(string url)
{
    return await _httpClient.GetStringAsync(url);
}
3. Inefficient HTTP Usage (HttpClient & Headers)
Mistake:
- Writing using var client = new HttpClient(); inside every method that calls an API.
Impact:
- Socket exhaustion from many short‑lived connections.
- No effective connection pooling.
- Increased latency and resource consumption.
Better approach:
- In ASP.NET Core, use IHttpClientFactory:
=> Register named or typed clients in DI.
=> Inject them into your services / repositories.
- Configure base address and default headers once at registration, not on every call.
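A minimal registration sketch for the named client used in the example below, assuming the ASP.NET Core minimal hosting model (the base address and header values are placeholders):
// Program.cs: configure the "WeatherApi" client once
builder.Services.AddHttpClient("WeatherApi", client =>
{
    client.BaseAddress = new Uri("https://api.example.com/");
    client.DefaultRequestHeaders.Add("Accept", "application/json");
});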
Heuristics:
- In modern .NET apps, grep for new HttpClient() and replace with factory-based clients.
- Set per‑request headers on HttpRequestMessage, not by creating new clients.
Example (bad vs better)
// BAD: new HttpClient every request
public async Task<string> GetWeatherAsync(string city)
{
    using var client = new HttpClient();
    var url = $"https://api.example.com/weather?city={city}";
    return await client.GetStringAsync(url);
}
// BETTER: use IHttpClientFactory (ASP.NET Core)
public class WeatherService
{
    private readonly HttpClient _client;
    public WeatherService(IHttpClientFactory httpClientFactory)
    {
        _client = httpClientFactory.CreateClient("WeatherApi");
    }
    public Task<string> GetWeatherAsync(string city)
    {
        var url = $"weather?city={city}";
        return _client.GetStringAsync(url);
    }
}
4. Heavy or Blocking Constructors / Initialization
Mistake:
- Calling .Result / .Wait() on async factory methods inside constructors.
- Performing network or DB calls as soon as a service is constructed.
Impact:
- Slower and fragile application startup.
- Hard‑to‑debug failures at DI resolution time.
- Potential deadlocks during startup.
Better approach:
- Keep constructors lightweight and side-effect free.
- Initialize expensive resources:
=> Lazily (via async methods or lazy wrappers), or
=> At startup via background services / startup tasks.
- For things like DB or Redis connections:
=> Prefer singleton ConnectionMultiplexer/connection factories initialized once and reused.
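For example, with StackExchange.Redis the usual pattern is one multiplexer registered at startup, and expensive async initialization can be deferred with a lazy wrapper (the connection string and MyResource type are placeholders):
// One ConnectionMultiplexer for the whole app (StackExchange.Redis)
builder.Services.AddSingleton<IConnectionMultiplexer>(_ =>
    ConnectionMultiplexer.Connect("localhost:6379"));

// Deferred async initialization via a lazy wrapper (MyResource is hypothetical)
private readonly Lazy<Task<MyResource>> _resource =
    new(() => MyResource.CreateAsync());
// later: var resource = await _resource.Value;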
Heuristics:
- If a constructor can throw due to network/DB issues, redesign.
- Avoid any .Result / .Wait() in constructors of DI services.
Example (bad vs better)
// BAD: constructor blocks on async DB call
public class UserRepository
{
    private readonly string _connectionString;
    public UserRepository(IConfigService config)
    {
        // Async method, but forced sync here
        _connectionString = config.GetConnectionStringAsync().Result;
    }
}
// BETTER: keep ctor light, use async initialization in methods
public class UserRepository
{
    private readonly IConfigService _config;
    public UserRepository(IConfigService config)
    {
        _config = config;
    }
    private async Task<SqlConnection> CreateConnectionAsync()
    {
        var connString = await _config.GetConnectionStringAsync();
        return new SqlConnection(connString);
    }
    public async Task<User?> GetByIdAsync(int id)
    {
        await using var connection = await CreateConnectionAsync();
        // query DB...
        return null;
    }
}
5. Repeated Scans Over Large Data Sets With LINQ
5.1. Re‑Filtering the Same Sequence Many Times
Mistake:
- For a given dataset, running similar Where(...) filters multiple times:
=> One for Count, one for Sum, one for Average, etc.
- Example: For each region or key in a geo dataset, scanning the full dataset again in nested loops.
Impact:
- O(m × n) behavior where you could do O(n).
- Wasted CPU and allocations as data volume grows.
Better approach:
- Group once, compute many things:
=> Use .GroupBy(keySelector) so you iterate the raw data once.
=> Inside each group, compute all required aggregates (Count, Sum, etc.).
- Or, pre-filter into a list:
=> var filtered = data.Where(...).ToList();
=> Then use filtered.Count, filtered.Sum(...), etc.
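A sketch of the “group once, compute many aggregates” approach, assuming the orders also carry a CustomerId (not shown in the example below):
// One pass over orders, several aggregates per customer
var statsByCustomer = orders
    .GroupBy(o => o.CustomerId)
    .Select(g => new
    {
        CustomerId = g.Key,
        Count = g.Count(),
        Total = g.Sum(o => o.Total),
        Average = g.Average(o => o.Total)
    })
    .ToList();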
Heuristics:
- If you see the same Where predicate copy‑pasted several times, factor it out.
- If multiple stats use the same subset of data, filter once into a temporary collection.
Example (bad vs better)
// BAD: re-filtering the same sequence multiple times
var highValueOrdersCount = orders
    .Where(o => o.Total > 1000 && o.Status == OrderStatus.Completed)
    .Count();
var highValueOrdersTotal = orders
    .Where(o => o.Total > 1000 && o.Status == OrderStatus.Completed)
    .Sum(o => o.Total);
// BETTER: filter once, reuse
var highValueOrders = orders
    .Where(o => o.Total > 1000 && o.Status == OrderStatus.Completed)
    .ToList();
var highValueOrdersCount = highValueOrders.Count;
var highValueOrdersTotal = highValueOrders.Sum(o => o.Total);
6. Unnecessary Allocations and Object Churn
6.1. Throwaway Collections and Strings
Mistake:
- Creating new List<>/Dictionary<>/string instances in tight loops where they could be reused.
- Rebuilding complex keys or query strings again and again.
Impact:
- High GC overhead and more frequent pauses.
- Increased memory footprint.
Better approach:
- Reuse existing collections where safe by clearing them and reusing:
=> E.g. list.Clear(); then list.Add(...).
- Use StringBuilder in truly hot string-building paths.
- Centralize key-building logic and cache expensive computed strings when reused.
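When the same composite key is rebuilt over and over in a hot path, one helper plus a small cache keeps the cost down. A sketch with hypothetical names (caching only pays off when building the key is genuinely expensive):
// Build keys in one place; cache them when they are requested repeatedly
private static readonly ConcurrentDictionary<(string Region, int Year), string> KeyCache = new();

private static string GetReportKey(string region, int year) =>
    KeyCache.GetOrAdd((region, year), k => $"report:{k.Region}:{k.Year}");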
Heuristics:
- Profile hot paths and look at allocation-heavy methods.
- Pay close attention to code running “per row / per KPI / per request”.
Example (bad vs better)
// BAD: allocating a new list and string on every iteration
foreach (var order in orders)
{
    var lineItems = new List<string>();
    foreach (var line in order.Lines)
    {
        lineItems.Add($"{line.ProductCode}-{line.Quantity}");
    }
    var summary = string.Join(",", lineItems);
    Console.WriteLine(summary);
}
// BETTER: reuse collections and use StringBuilder for hot paths
var lineItems = new List<string>();
var sb = new StringBuilder();
foreach (var order in orders)
{
    lineItems.Clear();
    foreach (var line in order.Lines)
    {
        lineItems.Add($"{line.ProductCode}-{line.Quantity}");
    }
    sb.Clear();
    for (int i = 0; i < lineItems.Count; i++)
    {
        if (i > 0) sb.Append(',');
        sb.Append(lineItems[i]);
    }
    Console.WriteLine(sb.ToString());
}
7. JSON & Dynamic Data Overuse
7.1. Deep Nested JToken Traversals
Mistake:
- Multiple nested foreach loops calling SelectTokens(...).ToList() over the same JSON tree.
- Treating all JSON as dynamic with JToken even when schema is fairly stable.
Impact:
- Slow traversal on large JSON payloads.
- Lots of allocations for intermediate JToken lists.
Better approach:
- Cache results of SelectTokens if you reuse them.
- Flatten traversal where possible; walk the tree once and process everything you need.
- For stable schemas, use strongly-typed models:
=> JsonConvert.DeserializeObject<MyDto>(json) and work on C# objects.
Heuristics:
- If the JSON structure is known and stable, prefer POCOs over dynamic JToken.
- Treat calls to SelectTokens(...).ToList() as potential hotspots and minimize repeats.
Example (bad vs better)
// BAD: repeated SelectTokens and ToList over the same JSON
var root = JToken.Parse(json);
foreach (var user in root.SelectTokens("$.users[*]").ToList())
{
    var name = user.SelectToken("$.profile.name")?.ToString();
    var email = user.SelectToken("$.profile.email")?.ToString();
    Console.WriteLine($"{name} - {email}");
}
// BETTER: strongly-typed model or simpler traversal
public class UserProfile
{
    public string Name { get; set; } = string.Empty;
    public string Email { get; set; } = string.Empty;
}
public class UserDto
{
    public UserProfile Profile { get; set; } = new();
}
public class UsersRoot
{
    public List<UserDto> Users { get; set; } = new();
}
var usersRoot = JsonConvert.DeserializeObject<UsersRoot>(json);
foreach (var user in usersRoot?.Users ?? Enumerable.Empty<UserDto>())
{
    Console.WriteLine($"{user.Profile.Name} - {user.Profile.Email}");
}
8. Practical Checklists for Code Reviews
8.1. LINQ Checklist
Check:
- LINQ inside tight loops?
- Repeated Where/Select with identical predicates?
- Unnecessary .ToList() allocations?
Action:
- Precompute dictionaries/sets.
- Use foreach where it’s clearer and cheaper.
- Filter once, reuse the filtered result.
8.2. Async / Threading Checklist
Check:
- Any .Result, .Wait(), or GetAwaiter().GetResult()?
- Any Task.Run wrapping DB/HTTP/Redis calls?
- Async methods that never use await?
Action:
- Make methods async all the way.
- Use async I/O APIs directly.
- Remove Task.Run for I/O work.
8.3. I/O and HttpClient Checklist
Check:
- new HttpClient() in code (especially in web apps)?
- Connections opened in loops without pooling or reuse?
Action:
- Use IHttpClientFactory and proper connection pooling.
- Reuse clients and connections where appropriate.
Conclusion
Most performance wins in C# come from consistent small habits, not heroic rewrites:
- Use LINQ thoughtfully; avoid hidden N² and unnecessary ToList().
- Make async code truly async; never block on Task.
- Reuse I/O resources like HttpClient, DB connections, and caches.
- Avoid repeated work and needless allocations in hot paths.
If you bake these checks into your everyday coding and reviews, you’ll build C# services that are faster, more scalable, and easier to reason about — without giving up code clarity.
Hope you enjoyed the article. Happy Programming.
