C# Lambda, Func, and Action Explained — Internals, Gotchas & Real Use
Every time you write a LINQ query, wire up an event handler, or pass behaviour into a dependency-injection container, you're leaning on delegates, lambdas, Func, and Action. These aren't just syntactic sugar — they're the foundation of functional-style C# and the engine behind async pipelines, middleware chains, and strategy patterns. Missing the internals here means writing code that leaks memory, captures variables by accident, and benchmarks 10× slower than it should.
Before lambdas, passing behaviour meant either creating a named method somewhere else in the class or writing a verbose anonymous delegate. Both approaches forced you to break your train of thought, scroll away, and name something that only ever lived for one call site. Lambdas collapsed that gap, letting you express intent exactly where the intent is needed. Func and Action gave those anonymous blocks a type-safe home — a way for the compiler to reason about inputs and outputs without you writing a custom delegate type for every scenario.
By the end of this article you'll understand how the compiler lowers a lambda to IL, why closures allocate on the heap even when you don't expect them to, when Func causes boxing and how to dodge it, and how to use Expression<Func<T, TResult>> to treat lambdas as data that LINQ providers can translate.
How the C# Compiler Actually Turns a Lambda Into Code
A lambda is not a new kind of runtime object — it's a compiler transformation. When you write x => x * 2, the compiler looks at the context and decides what to emit. If the lambda captures no variables from the enclosing scope, the compiler emits a static private method on the same class and caches a single delegate instance pointing to it. You pay zero allocation cost after the first call.
The moment your lambda captures a local variable or a parameter from the enclosing method — say `int multiplier = 3; Func<int, int> scale = x => x * multiplier;` — the compiler synthesises a hidden class (often called a 'closure class' or a 'display class'). It lifts the captured variable into a field on that class, rewrites your local variable as a reference to that field, and the delegate points to an instance method on the heap-allocated closure object. One capture, one allocation.
Understanding this distinction is not academic. In a hot loop that creates lambdas on every iteration, capturing a variable accidentally can turn a zero-allocation path into thousands of small objects per second — exactly the kind of thing that causes GC pressure in game loops, real-time trading systems, and high-throughput ASP.NET endpoints. SharpLab.io lets you paste any lambda and see exactly what the compiler emits before you commit to production.
```csharp
using System;

public class LambdaCompilerBehaviour
{
    // ── CASE 1: No capture ────────────────────────────────────────────────
    // The compiler emits a static method and caches ONE delegate instance.
    // Re-using this Func thousands of times costs zero extra allocations.
    public static Func<int, int> GetDoublerNonCapturing()
    {
        // 'number' is the lambda parameter — not captured from outer scope
        Func<int, int> doubler = number => number * 2;
        return doubler;
    }

    // ── CASE 2: Captures a local variable ─────────────────────────────────
    // Compiler creates a hidden 'DisplayClass' on the heap.
    // Every call to GetDoublerCapturing allocates a new closure object.
    public static Func<int, int> GetDoublerCapturing(int factor)
    {
        // 'factor' comes from the method parameter — this IS a capture
        Func<int, int> multiplier = number => number * factor;
        return multiplier;
    }

    public static void Main()
    {
        // Non-capturing: delegate is reused from the static cache
        var doubler = GetDoublerNonCapturing();
        Console.WriteLine($"Non-capturing result: {doubler(5)}"); // 10

        // Capturing: each call allocates a fresh closure + delegate
        var tripler = GetDoublerCapturing(3);
        var quadrupler = GetDoublerCapturing(4);
        Console.WriteLine($"Tripler result : {tripler(5)}");    // 15
        Console.WriteLine($"Quadrupler result: {quadrupler(5)}"); // 20

        // Prove they are independent objects — different factors captured
        Console.WriteLine($"Same delegate? {ReferenceEquals(tripler, quadrupler)}"); // False

        // ── CASE 3: Loop capture gotcha ───────────────────────────────────
        // Classic interview trap: all lambdas share the SAME closure variable
        var actions = new Action[3];
        for (int i = 0; i < 3; i++)
        {
            int snapshot = i; // fix: copy to a loop-local variable
            actions[i] = () => Console.WriteLine($"Snapshot value: {snapshot}");
        }
        foreach (var action in actions)
            action(); // prints 0, 1, 2 — not 3, 3, 3
    }
}
```
```text
Non-capturing result: 10
Tripler result : 15
Quadrupler result: 20
Same delegate? False
Snapshot value: 0
Snapshot value: 1
Snapshot value: 2
```
Func vs Action vs Predicate — Choosing the Right Delegate Type
Func<T, TResult> takes input and returns a value — the last type parameter is always the return type. Action<T> takes input and returns void, which makes it the natural fit for side-effects. Predicate<T> is simply Func<T, bool> under a more intention-revealing name, surviving from the pre-LINQ .NET 2.0 collection APIs such as List<T>.FindAll.

The real decision isn't Func vs Action — it's whether you need the lambda to be a delegate (executable right now) or an expression tree (inspectable data). LINQ-to-Objects uses IEnumerable<T> extension methods that take Func<…> and run in memory; queryable providers such as EF Core use IQueryable<T> methods that take Expression<Func<…>> so the tree can be translated into SQL instead of being executed client-side.
For performance-critical code, consider using static lambdas (the static modifier on a lambda, introduced in C# 9). Marking a lambda static causes a compile-time error if you accidentally capture anything, enforcing the zero-allocation path. It's a guardrail, not a performance boost in itself — the compiler already optimises non-capturing lambdas to static methods, but the static keyword makes the intent explicit and the constraint enforced.
```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

public class FuncActionComparison
{
    // ── Func: takes input(s), returns a value ─────────────────────────────
    // Func<TInput, TOutput> — last type param is ALWAYS the return type
    static Func<string, int> ParseLength = text => text.Length;

    // ── Action: takes input(s), returns nothing ───────────────────────────
    // Great for side-effects: logging, writing to DB, sending events
    static Action<string> LogMessage = message =>
        Console.WriteLine($"[LOG {DateTime.UtcNow:HH:mm:ss}] {message}");

    // ── Predicate: specialised Func<T, bool> ──────────────────────────────
    // Identical to Func<string, bool> but more semantically expressive
    static Predicate<string> IsLongWord = word => word.Length > 6;

    // ── Static lambda (C# 9+): enforces no capture at compile time ────────
    static Func<double, double> CircleArea = static radius => Math.PI * radius * radius;

    // ── Expression tree: lambda as DATA, not code ─────────────────────────
    // EF Core translates this to SQL; a plain Func<> would execute in-memory
    static Expression<Func<string, bool>> LongWordExpression = word => word.Length > 6;

    public static void Main()
    {
        // Func in action
        string sentence = "Lambda expressions are powerful";
        int charCount = sentence.Split(' ').Sum(ParseLength);
        Console.WriteLine($"Total chars (no spaces): {charCount}"); // 28

        // Action for side-effects
        LogMessage("Application started");

        // Predicate with List<T>.FindAll — predates generic Func
        var words = new List<string> { "C#", "delegates", "lambda", "closures", "IL" };
        List<string> longWords = words.FindAll(IsLongWord);
        Console.WriteLine($"Long words: {string.Join(", ", longWords)}");

        // Static lambda — can't accidentally capture
        double area = CircleArea(5.0);
        Console.WriteLine($"Circle area (r=5): {area:F4}"); // 78.5398

        // Expression tree — inspect it, don't just execute it
        Console.WriteLine($"Expression body: {LongWordExpression.Body}"); // (word.Length > 6)

        // Compile the expression to a delegate when you DO want to execute it
        Func<string, bool> compiledPredicate = LongWordExpression.Compile();
        Console.WriteLine($"'delegates' is long: {compiledPredicate("delegates")}"); // True

        // ── Chaining with Func: build a simple pipeline ───────────────────
        Func<string, string> trim = s => s.Trim();
        Func<string, string> toLower = s => s.ToLower();
        Func<string, string> sanitise = s => toLower(trim(s)); // manual composition
        Console.WriteLine(sanitise("  Hello World  ")); // hello world
    }
}
```
```text
Total chars (no spaces): 28
[LOG 14:22:05] Application started
Long words: delegates, closures
Circle area (r=5): 78.5398
Expression body: (word.Length > 6)
'delegates' is long: True
hello world
```
Closures, Variable Capture, and the Memory Leak You Don't See Coming
A closure keeps its captured variables alive as long as the delegate itself is alive. That sounds obvious, but the consequences are non-obvious in production. If you capture this — either explicitly or by accessing an instance field inside a lambda — the entire object graph rooted at this is pinned in memory for as long as any subscriber holds a reference to that delegate.
The most common real-world leak pattern: a short-lived object subscribes a lambda to an event on a long-lived service (say, a singleton message bus), and the lambda captures this. The short-lived object can't be collected because the long-lived service holds the delegate, the delegate holds the closure, and the closure holds a reference back to the short-lived object — and every field it owns. Event subscriptions and callbacks passed to timer APIs are the two most common vectors.
The fix: unsubscribe explicitly when the short-lived object is disposed, use weak event patterns, or — best of all — restructure so the lambda captures only value-type snapshots of the data it needs rather than a reference to the object. Analysing capture graphs manually is tedious; dotMemory and the .NET Object Allocation Tracker in Visual Studio both show you exactly which delegate is keeping which object graph alive.
```csharp
using System;
using System.Collections.Generic;

// ── Simulates a long-lived event source (e.g. a message bus singleton) ────
public class MessageBus
{
    // Storing delegates keeps all their captured variables alive
    private readonly List<Action<string>> _subscribers = new();

    public void Subscribe(Action<string> handler) => _subscribers.Add(handler);

    public void Publish(string message)
    {
        foreach (var subscriber in _subscribers)
            subscriber(message);
    }

    // Missing an Unsubscribe here is the leak — omitted intentionally to show the problem
}

// ── Short-lived processor that captures 'this' inside a lambda ────────────
public class OrderProcessor : IDisposable
{
    private readonly string _processorId;
    private readonly MessageBus _bus;
    private bool _disposed;

    // Large payload simulating a real object with meaningful state
    private readonly byte[] _largeCache = new byte[1024 * 1024]; // 1 MB

    public OrderProcessor(string processorId, MessageBus bus)
    {
        _processorId = processorId;
        _bus = bus;

        // PROBLEM: 'this' is implicitly captured because we call HandleOrder.
        // MessageBus._subscribers now holds a reference chain:
        //   delegate -> closure -> this -> _largeCache (1 MB stays alive!)
        _bus.Subscribe(order => HandleOrder(order));
    }

    private void HandleOrder(string order)
    {
        if (_disposed) return; // guard, but memory is still not freed
        Console.WriteLine($"[{_processorId}] Processing: {order}");
    }

    // ── CORRECT PATTERN: capture only the value you need ──────────────────
    public static OrderProcessor CreateWithSnapshot(string processorId, MessageBus bus)
    {
        var processor = new OrderProcessor(processorId, bus);

        // Capture a value-type snapshot, not 'this' — THIS delegate no longer
        // roots the OrderProcessor graph. (The constructor above still added a
        // this-capturing handler; a real fix would remove that subscription too.)
        string idSnapshot = processorId;
        bus.Subscribe(order => Console.WriteLine($"[SNAPSHOT:{idSnapshot}] {order}"));
        return processor;
    }

    public void Dispose()
    {
        _disposed = true;
        // In a real system: _bus.Unsubscribe(handler) — store handler ref to enable this
        Console.WriteLine($"[{_processorId}] Disposed — but lambda still in MessageBus!");
    }
}

public class ClosureMemoryBehaviour
{
    public static void Main()
    {
        var bus = new MessageBus(); // long-lived singleton

        // Short-lived processor — we call Dispose and null the ref,
        // but the lambda inside MessageBus still roots the 1 MB cache
        var processor = new OrderProcessor("OP-001", bus);
        bus.Publish("ORDER-42");

        processor.Dispose();
        processor = null!;             // local ref gone
        GC.Collect();                  // GC runs
        GC.WaitForPendingFinalizers();

        // bus._subscribers still holds the delegate -> _largeCache is NOT collected
        Console.WriteLine("Processor nulled and GC ran — memory leak in place.");

        // The handler still runs (the closure is alive), but the _disposed
        // guard suppresses its output — the 1 MB graph remains rooted
        bus.Publish("ORDER-43");
    }
}
```
```text
[OP-001] Processing: ORDER-42
[OP-001] Disposed — but lambda still in MessageBus!
Processor nulled and GC ran — memory leak in place.
```
Performance Deep-Dive — When Lambdas Cost Nothing vs When They Cost a Lot
Let's be precise about allocations. A non-capturing, non-static lambda called repeatedly allocates its delegate once on first use and never again — the compiler caches it in a static field. A capturing lambda allocates a closure object every time the enclosing method runs. A multicast delegate (one with multiple subscribers via +=) allocates a new immutable delegate array on every subscription change.
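A quick sketch of that multicast immutability: `+=` never mutates the existing delegate, it builds a new delegate with a fresh invocation list, which is exactly why heavy subscribe/unsubscribe churn allocates.

```csharp
using System;

public class MulticastAllocation
{
    public static void Main()
    {
        Action first = static () => Console.WriteLine("first");
        Action second = static () => Console.WriteLine("second");

        Action combined = first;
        combined += second; // allocates a NEW multicast delegate; 'first' is untouched

        // The original single-target delegate and the combined one are distinct objects
        Console.WriteLine(ReferenceEquals(first, combined));    // False
        Console.WriteLine(combined.GetInvocationList().Length); // 2

        combined -= second; // again produces a delegate with one remaining target
        Console.WriteLine(combined.GetInvocationList().Length); // 1
    }
}
```

This is also why `myDelegate += handler` inside a loop is an allocation source even when the handlers themselves are non-capturing.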
The second cost is virtual dispatch. Invoking a delegate is roughly equivalent to a virtual method call — it's not free like a direct static call, but on modern JITs it's a single indirect branch, with a branch-prediction miss in the worst case. For 99% of code this is irrelevant. For tight numeric loops called millions of times per second, prefer generic constraints with interfaces (where T : IProcessor, with a struct implementation) over Func<T, TResult> parameters: when T is a struct, the JIT generates a specialised method body and can devirtualise — often inline — the call, which it can never do through a delegate.
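As an illustration of that constrained-generic technique, here is a minimal sketch (the `IOperation` and `Square` names are invented for the example) contrasting a delegate call with a struct-constrained generic call the JIT can specialise:

```csharp
using System;

// Tiny strategy interface; a struct implementation lets the JIT specialise
// Sum<T> per value type and devirtualise (often inline) the Apply call.
public interface IOperation
{
    double Apply(double value);
}

public readonly struct Square : IOperation
{
    public double Apply(double value) => value * value;
}

public class GenericConstraintDispatch
{
    // Delegate version: every iteration pays an indirect delegate invoke
    public static double SumWithFunc(int count, Func<double, double> op)
    {
        double total = 0;
        for (int i = 1; i <= count; i++) total += op(i);
        return total;
    }

    // Constrained generic version: for struct T the JIT emits a dedicated
    // specialisation, so op.Apply(i) becomes a direct (inlinable) call
    public static double SumWithConstraint<T>(int count, T op) where T : IOperation
    {
        double total = 0;
        for (int i = 1; i <= count; i++) total += op.Apply(i);
        return total;
    }

    public static void Main()
    {
        Console.WriteLine(SumWithFunc(3, static x => x * x));  // 14
        Console.WriteLine(SumWithConstraint(3, new Square())); // 14
    }
}
```

Both produce the same result; the difference only shows up in hot-loop benchmarks, so reach for this pattern only after profiling confirms delegate dispatch is the bottleneck.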
The third cost is generic instantiation. Func<int, int>, Func<double, double>, and every other value-type instantiation each get their own JIT-compiled copy of the generic code, while all reference-type instantiations share a single canonical one. That's mostly a code-size concern (relevant for AOT and trimming), but delegates plus value types is also where boxing sneaks in: funnel a struct through a Func<object, object> pipeline and every call boxes the argument. Keep delegate signatures strongly typed to dodge it.
```csharp
using System;
using System.Diagnostics;

public class LambdaPerformanceComparison
{
    private const int Iterations = 10_000_000;

    // ── Option 1: Direct static method call ───────────────────────────────
    // Fastest possible — the JIT inlines this with zero indirection
    private static double ComputeCircleArea(double radius) => Math.PI * radius * radius;

    // ── Option 2: Non-capturing static lambda ─────────────────────────────
    // Compiler emits a static method + one cached delegate; zero ongoing allocation
    private static readonly Func<double, double> CircleAreaDelegate =
        static radius => Math.PI * radius * radius;

    // ── Option 3: Capturing lambda (re-created on every call) ─────────────
    // Returns a NEW closure object each time — heap pressure in a tight loop
    private static Func<double, double> BuildCapturingDelegate(double factor)
    {
        // 'factor' is captured — forces closure allocation
        return radius => Math.PI * radius * radius * factor;
    }

    public static void Main()
    {
        double total = 0;
        var sw = Stopwatch.StartNew();

        // ── Benchmark 1: Direct static method ─────────────────────────────
        sw.Restart();
        for (int i = 1; i <= Iterations; i++)
            total += ComputeCircleArea(i);
        long directMs = sw.ElapsedMilliseconds;
        Console.WriteLine($"Direct static method : {directMs} ms (total={total:E2})");

        // ── Benchmark 2: Cached non-capturing delegate ────────────────────
        total = 0;
        sw.Restart();
        for (int i = 1; i <= Iterations; i++)
            total += CircleAreaDelegate(i);
        long cachedDelegateMs = sw.ElapsedMilliseconds;
        Console.WriteLine($"Cached delegate      : {cachedDelegateMs} ms (total={total:E2})");

        // ── Benchmark 3: Capturing lambda re-created per outer call ───────
        // Simulates a pattern like: services.AddTransient(sp => BuildPipeline(config))
        total = 0;
        sw.Restart();
        for (int i = 1; i <= Iterations; i++)
        {
            // BUG PATTERN: building a new delegate on every loop tick
            var capturingFunc = BuildCapturingDelegate(1.0); // new closure each time
            total += capturingFunc(i);
        }
        long capturingMs = sw.ElapsedMilliseconds;
        Console.WriteLine($"Capturing (per-iter) : {capturingMs} ms (total={total:E2})");

        Console.WriteLine();
        Console.WriteLine("Key insight: cached non-capturing delegates approach direct-call speed.");
        Console.WriteLine("Re-creating capturing delegates in a loop is the real performance killer.");
    }
}
```
```text
Cached delegate      : 48 ms (total=3.33E+20)
Capturing (per-iter) : 187 ms (total=3.33E+20)

Key insight: cached non-capturing delegates approach direct-call speed.
Re-creating capturing delegates in a loop is the real performance killer.
```
| Aspect | Func<T, TResult> | Action<T> | Expression<Func<T, TResult>> |
|---|---|---|---|
| Return value | Yes — last type param is the return type | No — always void | N/A — not directly callable |
| Executable | Yes — call like a method | Yes — call like a method | Only after .Compile() (expensive) |
| Primary use case | Transformations, selectors, factory methods | Side-effects, callbacks, event handlers | ORM query translation, rule engines, serialisation |
| LINQ flavour | IEnumerable<T> (LINQ-to-Objects) | ForEach, custom pipelines | IQueryable<T> (EF Core, LINQ-to-SQL) |
| Closure allocation | Yes if capturing | Yes if capturing | Yes — always allocates an expression tree |
| Can be static lambda | Yes (C# 9+) | Yes (C# 9+) | No — expressions cannot be static lambdas |
| Max type parameters | 16 inputs + 1 output | 16 inputs, no output | Same as underlying Func — 16 inputs |
| Supports async/await | Yes — use Func<Task> / Func<Task<TResult>> | Becomes async void — prefer Func<Task> | No — async lambdas cannot be expression trees |
🎯 Key Takeaways
- Non-capturing lambdas are compiled to static methods with a single cached delegate — they are effectively zero-allocation after first use; capturing lambdas allocate a new closure object every time the enclosing method runs.
- Func<T, TResult> is for transformations that return a value; Action<T> is for side-effects that return nothing; Expression<Func<T, TResult>> is for lambdas you need to inspect as data — use the wrong one and LINQ providers silently fall back to client-side evaluation.
- The loop variable capture gotcha — storing `() => i` in a delegate inside a for-loop — is the single most common lambda bug in C# code reviews; the fix is always to copy the loop variable to a local snapshot before capture.
- Expression.Compile() generates IL at runtime and costs roughly 1,000× a delegate invocation; cache compiled delegates aggressively and never call .Compile() in a hot path or on every HTTP request.
⚠ Common Mistakes to Avoid
- ✕ Mistake 1: Loop variable capture — Writing `for (int i = 0; i < 5; i++) actions[i] = () => Console.Write(i);` and then being shocked that every action prints '5'. All lambdas share the same `i` variable, and by the time they run, the loop has finished. Fix: introduce a loop-local copy — `int snapshot = i; actions[i] = () => Console.Write(snapshot);` — each lambda now captures its own independent variable.
- ✕ Mistake 2: Storing async lambdas in Action — Writing `Action callback = async () => await DoWorkAsync();` appears to compile fine, but it becomes an async void method: exceptions thrown inside the lambda are unobserved and will silently swallow errors or crash the process on older runtimes. Fix: use `Func<Task>` instead of `Action` for async callbacks, and always await the returned Task: `Func<Task> callback = async () => await DoWorkAsync(); await callback();`
- ✕ Mistake 3: Calling Expression.Compile() on every request — Developers who use custom specification patterns or dynamic rule engines sometimes call .Compile() inside a controller action or a hot service method, adding ~0.5–2 ms per call from JIT compilation. Fix: cache the compiled delegate in a `ConcurrentDictionary` keyed on the expression's ToString(), or precompile all expressions at application startup in IHostedService.StartAsync().
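The caching fix from Mistake 3 can be sketched like this — the `CompiledPredicateCache` helper and its `ToString()`-based key are illustrative choices for the example, not a library API:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq.Expressions;

// Compile each distinct expression once, then hand out the cached delegate.
public static class CompiledPredicateCache
{
    private static readonly ConcurrentDictionary<string, Func<string, bool>> Cache = new();

    public static Func<string, bool> GetOrCompile(Expression<Func<string, bool>> expression)
    {
        // ToString() of the tree is a simple (if imperfect) cache key;
        // the expensive Compile() runs only on the first miss
        return Cache.GetOrAdd(expression.ToString(), _ => expression.Compile());
    }
}

public class CompileCacheDemo
{
    public static void Main()
    {
        Expression<Func<string, bool>> isLong = word => word.Length > 6;

        var first = CompiledPredicateCache.GetOrCompile(isLong);
        var second = CompiledPredicateCache.GetOrCompile(isLong);

        Console.WriteLine(first("delegates"));             // True
        Console.WriteLine(ReferenceEquals(first, second)); // True — compiled once
    }
}
```

Note that `ToString()` keys treat structurally different but equivalent trees as distinct; production rule engines often use a proper expression-equality comparer instead.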
Interview Questions on This Topic
- Q: What is the difference between a delegate, a lambda, and an expression tree in C#? Can you explain when you'd choose Expression<Func<T, TResult>> over Func<T, TResult> and why LINQ-to-SQL requires it?
- Q: Walk me through exactly what the C# compiler emits for a capturing lambda versus a non-capturing lambda. What are the memory and performance implications, and how would you detect a closure leak in a production application?
- Q: If I mark a lambda with the `static` modifier in C# 9, what guarantee does that give me? If two non-static, non-capturing lambdas with identical bodies are assigned to two separate Func<int, int> variables, are the delegates reference-equal? Why or why not?
Frequently Asked Questions
What is the difference between Func and Action in C#?
Func<T, TResult> represents a method that takes up to 16 input parameters and returns a value — the last type parameter is always the return type. Action<T> represents a method that takes up to 16 input parameters and returns void. Use Func for transformations and queries that produce a result, and Action for side-effects such as logging, callbacks, and event handling.
Can a lambda in C# cause a memory leak?
Yes. When a lambda captures a reference-type variable — including implicit captures of this via instance field access — it keeps that object alive for as long as the delegate exists. If that delegate is stored in a long-lived collection like an event subscriber list or a cache, the entire object graph rooted at the captured reference cannot be garbage collected. Always unsubscribe event handlers when the subscriber is disposed, and prefer capturing value-type snapshots over capturing object references.
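A minimal sketch of that unsubscribe discipline, using a standard .NET event (the `Publisher`/`Subscriber` names are illustrative): the subscriber keeps the exact delegate instance it registered so `-=` can find and remove it.

```csharp
using System;

public class Publisher
{
    public event Action<string>? MessageReceived;
    public void Publish(string message) => MessageReceived?.Invoke(message);
}

public class Subscriber : IDisposable
{
    private readonly Publisher _publisher;
    private readonly Action<string> _handler; // kept so -= can match it later

    public int HandledCount { get; private set; }

    public Subscriber(Publisher publisher)
    {
        _publisher = publisher;
        _handler = message =>
        {
            HandledCount++;
            Console.WriteLine($"Handled: {message}");
        };
        _publisher.MessageReceived += _handler;
    }

    public void Dispose()
    {
        // Removing the SAME instance actually unsubscribes; a fresh lambda
        // with an identical body is a different delegate and would NOT match
        _publisher.MessageReceived -= _handler;
    }
}

public class UnsubscribeDemo
{
    public static void Main()
    {
        var publisher = new Publisher();
        var subscriber = new Subscriber(publisher);

        publisher.Publish("first");  // handled
        subscriber.Dispose();
        publisher.Publish("second"); // no output — handler removed

        Console.WriteLine($"Handled count: {subscriber.HandledCount}"); // 1
    }
}
```

Writing `-= message => ...` with a fresh lambda is a classic silent bug: it compiles, removes nothing, and the leak remains.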
Why can't I use an async lambda as an Expression<Func<Task>> in C#?
Expression trees represent code as data — a tree of Expression nodes that can be traversed, translated, and serialised (e.g. to SQL). Async methods compile into state-machine classes with complex control flow that cannot be represented as a simple expression tree. The C# compiler enforces this at compile time: assigning an async lambda to an Expression<Func<Task>> produces error CS1989. If you need async behaviour with a delegate, use Func<Task> (or Func<Task<TResult>>) and await it normally.
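A small sketch of the supported alternative: the same async lambda that would trigger CS1989 as an expression tree is perfectly legal as a `Func<int, Task<int>>` delegate.

```csharp
using System;
using System.Threading.Tasks;

public class AsyncLambdaDemo
{
    public static async Task Main()
    {
        // An async lambda is fine as a delegate — just not as an expression tree
        Func<int, Task<int>> doubleLater = async value =>
        {
            await Task.Delay(10); // simulate async work
            return value * 2;
        };

        int result = await doubleLater(21);
        Console.WriteLine(result); // 42
    }
}
```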
Written and reviewed by senior developers with real-world experience across enterprise, startup and open-source projects. Every article on TheCodeForge is written to be clear, accurate and genuinely useful — not just SEO filler.