Event Sourcing Series Part 1: The Honest Truth
September 5, 2025 · 7 min read
Event Sourcing, Architecture, DDD, .NET, Azure
This is Part 1 of a 5-part series on Event Sourcing and Saga Orchestration with Azure.
I'm going to be honest with you: I spent months implementing event sourcing on a project that didn't need it. The codebase became a nightmare. Debugging was painful. New developers took weeks to understand the flow. All for a glorified CRUD app.
That experience taught me something valuable: knowing when NOT to use a pattern is as important as knowing how to implement it.
What Event Sourcing Actually Is
Instead of storing the current state of an entity, you store every event that happened to it.
Traditional approach (state-based):
```
// Database stores current state
Order {
    Id: 123,
    Status: "Shipped",
    Total: 150.00,
    UpdatedAt: "2025-09-05"
}
```
Event sourcing approach:
```
// Database stores events
OrderCreated    { OrderId: 123, CustomerId: 456, CreatedAt: "2025-09-01" }
ItemAdded       { OrderId: 123, ProductId: 789, Price: 50.00 }
ItemAdded       { OrderId: 123, ProductId: 101, Price: 100.00 }
PaymentReceived { OrderId: 123, Amount: 150.00 }
OrderShipped    { OrderId: 123, TrackingNumber: "ABC123" }
```
To get current state, you replay all events. The event stream IS your source of truth.
When You Actually Need Event Sourcing
1. Audit Requirements Are Non-Negotiable
Financial services, healthcare, legal domains - anywhere regulators ask "show me exactly what happened and when."
```csharp
// With event sourcing, this query is trivial
var accountHistoryAt2pm = events
    .Where(e => e.Timestamp <= targetTime)
    .Aggregate(new AccountState(), (state, evt) => state.Apply(evt));
```
Wait, what's that .Aggregate() doing?
Aggregate is a LINQ method - the C# equivalent of reduce in JavaScript or fold in functional languages. It takes a collection and reduces it to a single value:
```csharp
// Simple example - summing numbers
var numbers = new[] { 1, 2, 3, 4 };
var sum = numbers.Aggregate(0, (total, num) => total + num);
// Result: 10
// Step by step: 0 → 0+1=1 → 1+2=3 → 3+3=6 → 6+4=10
```
In the event sourcing example:
- Seed value: `new AccountState()` - start with a blank account
- Accumulator: `(state, evt) => state.Apply(evt)` - for each event, apply it to the current state
The AccountState class handles how each event type changes state:
```csharp
// A record, so the with-expressions below can return updated copies
public record AccountState
{
    public decimal Balance { get; private set; }
    public string Status { get; private set; } = "New";

    public AccountState Apply(IAccountEvent evt)
    {
        return evt switch
        {
            AccountOpened e => this with { Status = "Active", Balance = e.InitialBalance },
            DepositMade e => this with { Balance = Balance + e.Amount },
            WithdrawalMade e => this with { Balance = Balance - e.Amount },
            _ => this
        };
    }
}
```
Events are typically stored in a table like this:
| Id | StreamId | EventType | Data | Version |
|---|---|---|---|---|
| 1 | acc-123 | AccountOpened | {"InitialBalance": 0} | 1 |
| 2 | acc-123 | DepositMade | {"Amount": 1000} | 2 |
| 3 | acc-123 | WithdrawalMade | {"Amount": 200} | 3 |
StreamId groups events for the same entity, and Version ensures ordering. In Azure, you'd typically use Cosmos DB or a library like Marten (PostgreSQL) or EventStoreDB.
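To make the `Version` column's role concrete, here's a minimal in-memory sketch of the append path with an optimistic-concurrency check. All the names here (`StoredEvent`, `InMemoryEventStore`, `Append`) are illustrative, not from any particular library:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative shape of one row in the events table above
public record StoredEvent(long Id, string StreamId, string EventType, string Data, int Version);

public class InMemoryEventStore
{
    private readonly List<StoredEvent> _events = new();
    private long _nextId = 1;

    // The expectedVersion check is the optimistic-concurrency role Version plays:
    // if someone else appended since we last read the stream, we refuse to write
    public void Append(string streamId, string eventType, string data, int expectedVersion)
    {
        var currentVersion = _events
            .Where(e => e.StreamId == streamId)
            .Select(e => e.Version)
            .DefaultIfEmpty(0)
            .Max();

        if (currentVersion != expectedVersion)
            throw new InvalidOperationException(
                $"Concurrency conflict: expected version {expectedVersion}, stream is at {currentVersion}");

        _events.Add(new StoredEvent(_nextId++, streamId, eventType, data, currentVersion + 1));
    }

    public IReadOnlyList<StoredEvent> ReadStream(string streamId) =>
        _events.Where(e => e.StreamId == streamId).OrderBy(e => e.Version).ToList();
}
```

Appending with a stale `expectedVersion` throws, which is how two writers racing on the same stream get detected instead of silently interleaving.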
2. Temporal Queries Are a Core Feature
"What was the customer's credit limit on March 15th?" "Show me the portfolio value at market close each day last month."
If you need to answer these questions regularly, event sourcing makes them easy. With traditional state storage, you'd need complex audit tables or change data capture.
3. Complex Domain Events Drive Business Logic
When events themselves trigger important business processes:
```
// Other systems react to these events
OrderShipped  → Send notification → Update inventory → Trigger loyalty points
PaymentFailed → Notify customer → Schedule retry → Alert fraud team
```
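Those reaction chains are usually wired through some form of publish/subscribe. A toy in-process sketch - a real system would use Service Bus or similar, and `EventBus` here is entirely hypothetical:

```csharp
using System;
using System.Collections.Generic;

public record OrderShipped(int OrderId, string TrackingNumber);

// Toy in-process dispatcher: every subscriber for an event type gets called on publish
public class EventBus
{
    private readonly Dictionary<Type, List<Action<object>>> _handlers = new();

    public void Subscribe<T>(Action<T> handler)
    {
        if (!_handlers.TryGetValue(typeof(T), out var list))
            _handlers[typeof(T)] = list = new List<Action<object>>();
        list.Add(evt => handler((T)evt));
    }

    public void Publish(object evt)
    {
        if (_handlers.TryGetValue(evt.GetType(), out var list))
            foreach (var handler in list) handler(evt);
    }
}
```

One `OrderShipped` publish then fans out to the notification, inventory, and loyalty handlers without the publisher knowing any of them exist.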
4. You Need to Replay History
Fixing bugs in projections, building new read models from historical data, or analyzing past behavior patterns.
When You DON'T Need Event Sourcing
Simple CRUD Applications
If your app is basically:
- Create record
- Read record
- Update fields
- Delete record
Don't use event sourcing. You're adding massive complexity for zero benefit.
When "Last Updated By" Is Enough
If a simple UpdatedAt and UpdatedBy column satisfies your audit needs, you don't need a full event store.
Small Teams Without Event Sourcing Experience
The learning curve is steep. If your team hasn't done this before and you're on a tight deadline, you'll regret it.
When You Can't Afford Eventual Consistency
Event sourcing typically means your read models are eventually consistent with writes. If you need strong consistency everywhere, this adds significant complexity.
The Complexity Costs Nobody Warns You About
1. Debugging Is Hard
Something's wrong with an order. With traditional storage, you query the order table. Done.
With event sourcing:
- Query the event store
- Replay events to reconstruct state
- Figure out which event caused the issue
- Trace back to what triggered that event
2. Schema Evolution Is Painful
Your OrderCreated event from 2023 has different fields than your 2025 version. Now you need:
```csharp
public class OrderCreatedV1 { public decimal Total { get; set; } }
public class OrderCreatedV2 { public Money Total { get; set; } } // Breaking change!

// Upcasters to transform old events
public OrderCreatedV2 Upcast(OrderCreatedV1 old) => new OrderCreatedV2
{
    Total = Money.FromDecimal(old.Total, "USD")
};
```
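One common arrangement is to run upcasters at deserialization time, so the rest of the codebase only ever sees the latest version. A sketch under that assumption - `EventUpcaster` and this `Money` type are illustrative:

```csharp
public record Money(decimal Amount, string Currency)
{
    public static Money FromDecimal(decimal amount, string currency) => new(amount, currency);
}

public class OrderCreatedV1 { public decimal Total { get; set; } }
public class OrderCreatedV2 { public Money? Total { get; set; } }

public static class EventUpcaster
{
    // Called once per event as it is read from the store; events already
    // on the latest version pass through untouched
    public static object UpcastToLatest(object evt) => evt switch
    {
        OrderCreatedV1 v1 => new OrderCreatedV2 { Total = Money.FromDecimal(v1.Total, "USD") },
        _ => evt
    };
}
```

Note the assumed `"USD"` default: upcasting often means inventing a value the old event never recorded, which is its own design decision.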
3. Event Store Size Grows Forever
Events are immutable. You never delete them. That order from 5 years ago? Still in your event store with all 47 events that happened to it.
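The standard mitigation is snapshotting: periodically persist the folded state along with the version of the last event it includes, then replay only the tail on load. A minimal sketch with illustrative types and a simplified `AccountState`:

```csharp
using System.Collections.Generic;
using System.Linq;

public interface IAccountEvent { }
public record DepositMade(decimal Amount) : IAccountEvent;

public record AccountState(decimal Balance = 0)
{
    public AccountState Apply(IAccountEvent evt) => evt switch
    {
        DepositMade e => this with { Balance = Balance + e.Amount },
        _ => this
    };
}

// Snapshot = folded state plus the version of the last event it includes
public record Snapshot(AccountState State, int Version);

public static class AccountLoader
{
    public static AccountState Load(
        Snapshot? snapshot,
        IReadOnlyList<(int Version, IAccountEvent Event)> stream)
    {
        var state = snapshot?.State ?? new AccountState();
        var fromVersion = snapshot?.Version ?? 0;

        // Replay only the events the snapshot hasn't already folded in
        return stream
            .Where(e => e.Version > fromVersion)
            .Aggregate(state, (s, e) => s.Apply(e.Event));
    }
}
```

The events still live forever; the snapshot only caps how many you replay per load.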
4. Projections Need Maintenance
Read models are built by projecting events. When projections break or need changes:
```csharp
// Oh no, we need to rebuild the entire OrderSummary projection
// That's 5 million orders × average 12 events = 60 million events to replay
```
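Rebuilds are survivable if the projection tracks a checkpoint, so an interrupted replay can resume instead of starting over. A sketch with hypothetical names:

```csharp
using System.Collections.Generic;

public record ItemAdded(int OrderId, int ProductId, decimal Price);

public class OrderSummaryProjection
{
    public Dictionary<int, decimal> TotalsByOrder { get; } = new();

    // Position of the last event applied; persisted alongside the read model
    // so an interrupted rebuild can pick up where it stopped
    public long Checkpoint { get; private set; }

    public void Apply(long position, object evt)
    {
        if (evt is ItemAdded e)
            TotalsByOrder[e.OrderId] = TotalsByOrder.GetValueOrDefault(e.OrderId) + e.Price;
        Checkpoint = position;
    }
}
```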
5. Testing Is More Complex
```csharp
// Traditional: assert state
Assert.Equal("Shipped", order.Status);

// Event sourcing: assert events were raised
Assert.Contains(events, e => e is OrderShipped);
Assert.Equal("ABC123", events.OfType<OrderShipped>().Single().TrackingNumber);
```
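The silver lining is that event-sourced tests fall into a tidy Given (past events) / When (command) / Then (emitted events) shape. A minimal sketch with illustrative types, not tied to any specific testing library:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public interface IEvent { }
public record OrderShipped(string TrackingNumber) : IEvent;

public record Order(string Status = "New")
{
    public Order Apply(IEvent evt) => evt switch
    {
        OrderShipped _ => this with { Status = "Shipped" },
        _ => this
    };

    // Command: decide which new events to emit, given current state
    public IEnumerable<IEvent> Ship(string trackingNumber)
    {
        if (Status == "Shipped") yield break;  // already shipped: emit nothing
        yield return new OrderShipped(trackingNumber);
    }
}

public static class Scenario
{
    // Given past events, When a command runs, return the events to Then-assert on
    public static IReadOnlyList<IEvent> Run(
        IEnumerable<IEvent> given, Func<Order, IEnumerable<IEvent>> when)
    {
        var state = given.Aggregate(new Order(), (s, e) => s.Apply(e));
        return when(state).ToList();
    }
}
```

The extra ceremony buys you tests that read like specifications of behavior, which is part of why teams put up with the rest.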
My Decision Framework
Ask yourself these questions:
| Question | If Yes | If No |
|---|---|---|
| Do regulators require complete audit trails? | Consider ES | Skip ES |
| Do you need temporal queries as a feature? | Consider ES | Skip ES |
| Is the domain genuinely complex with rich events? | Consider ES | Skip ES |
| Is your team experienced with ES? | Lower risk | Higher risk |
| Can you accept eventual consistency? | ES works | Reconsider |
| Is this a new/greenfield project? | Easier to adopt | Harder to retrofit |
If you answered "No" to the first three questions, you probably don't need event sourcing.
What I Recommend Instead
For most applications, these alternatives give you 80% of the benefits with 20% of the complexity:
Option 1: Audit Tables
```sql
CREATE TABLE OrderAudit (
    Id INT,
    OrderId INT,
    FieldChanged VARCHAR(100),
    OldValue VARCHAR(MAX),
    NewValue VARCHAR(MAX),
    ChangedBy VARCHAR(100),
    ChangedAt DATETIME
);
```
Option 2: Change Data Capture
SQL Server CDC captures changes automatically. Query the change tables when you need history.
Option 3: Soft Deletes + Temporal Tables
SQL Server temporal tables give you point-in-time queries without event sourcing complexity.
Coming Up Next
In Part 2, I'll show you how to actually implement event sourcing with Azure when you've determined you need it - using Cosmos DB as an event store, Azure Functions for projections, and Service Bus for event distribution.
But please, learn from my mistakes: make sure you actually need it first.
This is Part 1 of a 5-part series on Event Sourcing and Saga Orchestration:
- Part 1: The Honest Truth About Event Sourcing (You are here)
- Part 2: Event Sourcing with Azure - Building Blocks
- Part 3: Saga Orchestration - Distributed Transactions Done Right
- Part 4: Implementing a Saga Orchestrator with Azure Durable Functions
- Part 5: Putting It All Together - Interview-Ready Knowledge