React Performance: Memoization and Code Splitting
March 15, 2025 · 10 min read
React, Performance, JavaScript, Optimization, Interview Prep
Introduction
React applications can suffer from performance issues as they grow. This article covers essential optimization techniques: preventing unnecessary re-renders, implementing code splitting, and profiling to identify bottlenecks. The key rule: measure first, optimize second.
Understanding Re-renders
React re-renders a component when:
- Its state changes
- Its parent re-renders (even if the props it passes are unchanged)
- A context it consumes changes
Note that "props changed" is not an independent trigger: new props can only arrive because the parent re-rendered.
import { useState } from 'react';

function App() {
  const [count, setCount] = useState(0);
  const [items] = useState([{ id: 1, name: 'Item 1' }]);
  return (
    <div>
      <button onClick={() => setCount(c => c + 1)}>
        Count: {count}
      </button>
      {/* Re-renders when count changes, even though items didn't */}
      <ExpensiveList items={items} />
    </div>
  );
}
React.memo: Preventing Re-renders
React.memo skips re-rendering a component when its props are shallowly equal to the previous render's props. It guards against re-renders triggered by the parent; the component's own state and context updates still render as usual.
// Without memo - re-renders on every parent render
function ExpensiveList({ items }) {
  console.log('ExpensiveList rendered');
  return (
    <ul>
      {items.map(item => <li key={item.id}>{item.name}</li>)}
    </ul>
  );
}
// With memo - only re-renders when items change
const ExpensiveList = memo(function ExpensiveList({ items }) {
  console.log('ExpensiveList rendered');
  return (
    <ul>
      {items.map(item => <li key={item.id}>{item.name}</li>)}
    </ul>
  );
});
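By default, memo compares props with a shallow equality check, roughly like this plain-JS sketch (a simplified model, not React's actual source):

```javascript
// Approximation of React.memo's default prop comparison:
// two props objects are "equal" when they have the same keys
// and each value is Object.is-equal
function shallowEqual(prevProps, nextProps) {
  const prevKeys = Object.keys(prevProps);
  const nextKeys = Object.keys(nextProps);
  if (prevKeys.length !== nextKeys.length) return false;
  return prevKeys.every(key => Object.is(prevProps[key], nextProps[key]));
}
```

This is why an inline object or arrow-function prop defeats memo: the reference is new on every render, so the shallow check fails.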
Custom Comparison
const ProductCard = memo(
  function ProductCard({ product, onAddToCart }) {
    return (
      <div className="product-card">
        <h3>{product.name}</h3>
        <p>${product.price}</p>
        <button onClick={() => onAddToCart(product.id)}>Add</button>
      </div>
    );
  },
  (prevProps, nextProps) => {
    // Return true if props are equal (skip re-render)
    // Note: onAddToCart is deliberately ignored, so a changed
    // callback won't trigger a re-render - keep it stable in the parent
    return (
      prevProps.product.id === nextProps.product.id &&
      prevProps.product.name === nextProps.product.name &&
      prevProps.product.price === nextProps.product.price
    );
  }
);
useMemo: Memoizing Calculations
useMemo caches computed values, only recalculating when dependencies change.
import { useMemo } from 'react';

function Dashboard({ orders }) {
  // Without useMemo - recalculates every render
  const totals = orders.reduce(
    (acc, order) => ({
      revenue: acc.revenue + order.total,
      count: acc.count + 1,
    }),
    { revenue: 0, count: 0 }
  );
  // With useMemo - only recalculates when orders change
  const memoizedTotals = useMemo(() => {
    return orders.reduce(
      (acc, order) => ({
        revenue: acc.revenue + order.total,
        count: acc.count + 1,
      }),
      { revenue: 0, count: 0 }
    );
  }, [orders]);
  return <Summary totals={memoizedTotals} />;
}
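Conceptually, useMemo stores the last dependency array alongside the cached value and recomputes only when Object.is finds a changed entry. A simplified, framework-free model (not React's actual implementation, which stores this per hook per component instance):

```javascript
// Simplified model of useMemo's caching: keep the last deps and
// value; recompute only when some dependency changed (Object.is)
function createMemo() {
  let lastDeps = null;
  let lastValue;
  return function memoize(compute, deps) {
    const unchanged =
      lastDeps !== null &&
      deps.length === lastDeps.length &&
      deps.every((dep, i) => Object.is(dep, lastDeps[i]));
    if (!unchanged) {
      lastValue = compute();
      lastDeps = deps;
    }
    return lastValue;
  };
}
```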
Stable Object References
function FilteredList({ items, filter }) {
  // Without useMemo - new array reference every render
  const filtered = items.filter(item => item.category === filter);
  // With useMemo - stable reference when deps don't change
  const memoizedFiltered = useMemo(
    () => items.filter(item => item.category === filter),
    [items, filter]
  );
  // MemoizedChild won't re-render unnecessarily
  return <MemoizedChild items={memoizedFiltered} />;
}
useCallback: Memoizing Functions
useCallback returns a memoized function that only changes when dependencies change.
import { useState, useCallback } from 'react';

function ProductManager() {
  const [products, setProducts] = useState([]);
  // Without useCallback - new function every render
  // Breaks React.memo on children
  const handleDelete = (id) => {
    setProducts(prev => prev.filter(p => p.id !== id));
  };
  // With useCallback - stable function reference
  const handleDeleteMemo = useCallback((id) => {
    setProducts(prev => prev.filter(p => p.id !== id));
  }, []); // Empty deps: uses functional update
  return (
    <div>
      {products.map(product => (
        <ProductCard
          key={product.id}
          product={product}
          onDelete={handleDeleteMemo}
        />
      ))}
    </div>
  );
}
The Callback + Memo Pattern
import { memo, useState, useCallback } from 'react';

// Parent with stable callbacks
function Dashboard() {
  const [items, setItems] = useState([]);
  const [selectedId, setSelectedId] = useState(null);
  const handleSelect = useCallback((id) => {
    setSelectedId(id);
  }, []);
  const handleDelete = useCallback((id) => {
    setItems(prev => prev.filter(item => item.id !== id));
  }, []);
  return (
    <ItemList
      items={items}
      selectedId={selectedId}
      onSelect={handleSelect}
      onDelete={handleDelete}
    />
  );
}
// Memoized child
const ItemList = memo(function ItemList({
  items,
  selectedId,
  onSelect,
  onDelete,
}) {
  return (
    <ul>
      {items.map(item => (
        <Item
          key={item.id}
          item={item}
          isSelected={item.id === selectedId}
          onSelect={onSelect}
          onDelete={onDelete}
        />
      ))}
    </ul>
  );
});
Code Splitting with React.lazy
Split your bundle into smaller chunks that load on demand.
import { lazy, Suspense } from 'react';
import { Routes, Route } from 'react-router-dom';

// Before - all in main bundle
import Dashboard from './pages/Dashboard';
import Reports from './pages/Reports';
import Settings from './pages/Settings';
// After - lazy loaded
const Dashboard = lazy(() => import('./pages/Dashboard'));
const Reports = lazy(() => import('./pages/Reports'));
const Settings = lazy(() => import('./pages/Settings'));
function App() {
  return (
    <Suspense fallback={<LoadingSpinner />}>
      <Routes>
        <Route path="/" element={<Dashboard />} />
        <Route path="/reports" element={<Reports />} />
        <Route path="/settings" element={<Settings />} />
      </Routes>
    </Suspense>
  );
}
Component-Level Splitting
import { lazy, Suspense, useState } from 'react';

// Heavy charting library only loads when needed.
// Define the lazy component at module scope: creating it inside
// the component would produce a new component type on every render,
// remounting the chart (and losing its state) each time.
const NutritionChart = lazy(() => import('./components/NutritionChart'));

function ProductPage({ productId }) {
  const [showChart, setShowChart] = useState(false);
  return (
    <div>
      <ProductDetails id={productId} />
      <button onClick={() => setShowChart(true)}>
        Show Nutrition Breakdown
      </button>
      {showChart && (
        <Suspense fallback={<ChartSkeleton />}>
          <NutritionChart productId={productId} />
        </Suspense>
      )}
    </div>
  );
}
Virtualization for Long Lists
Render only visible items in long lists:
import { useRef } from 'react';
import { useVirtualizer } from '@tanstack/react-virtual';

function VirtualizedList({ items }) {
  const parentRef = useRef(null);
  const virtualizer = useVirtualizer({
    count: items.length,
    getScrollElement: () => parentRef.current,
    estimateSize: () => 80,
    overscan: 5,
  });
  return (
    <div ref={parentRef} style={{ height: '600px', overflow: 'auto' }}>
      <div
        style={{
          height: `${virtualizer.getTotalSize()}px`,
          position: 'relative',
        }}
      >
        {virtualizer.getVirtualItems().map(virtualRow => {
          const item = items[virtualRow.index];
          return (
            <div
              key={item.id}
              style={{
                position: 'absolute',
                top: virtualRow.start,
                height: virtualRow.size,
                width: '100%',
              }}
            >
              <ItemCard item={item} />
            </div>
          );
        })}
      </div>
    </div>
  );
}
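The windowing math the library handles can be sketched without React. Given the scroll position, compute which slice of items to mount (an illustrative helper, not part of @tanstack/react-virtual's API):

```javascript
// Compute the index range of items to render for a scroll
// position, with `overscan` extra rows above and below the viewport
function getVisibleRange(scrollTop, viewportHeight, itemHeight, itemCount, overscan = 5) {
  const firstVisible = Math.floor(scrollTop / itemHeight);
  const visibleCount = Math.ceil(viewportHeight / itemHeight);
  return {
    start: Math.max(0, firstVisible - overscan),
    end: Math.min(itemCount - 1, firstVisible + visibleCount + overscan),
  };
}
```

With 80px rows in a 600px viewport, only around 18 of 10,000 items are mounted at any time.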
Simpler Solutions Often Win
Building Glucoplate taught me that virtualization isn't always the answer for large lists. Debounced search, pagination, and smart caching often provide a better UX with less complexity. The key insight: don't render what users don't need. Consider the simpler solutions first - they're easier to maintain and debug.
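As an example of the simpler route, pagination is just a slice plus some metadata. A hypothetical helper sketch (paginate is not from any library mentioned here):

```javascript
// Return one page of results plus paging metadata
function paginate(items, page, pageSize) {
  const start = (page - 1) * pageSize;
  return {
    items: items.slice(start, start + pageSize),
    page,
    totalPages: Math.max(1, Math.ceil(items.length / pageSize)),
  };
}
```

No absolute positioning, no scroll math, and the unmounted pages cost nothing.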
React 18 Concurrent Features
React 18 introduced concurrent rendering, enabling smoother UIs by allowing React to interrupt renders for higher priority updates.
useTransition: Non-Blocking Updates
Mark state updates as non-urgent to keep the UI responsive:
import { useState, useTransition } from 'react';

function FilterableList({ items }) {
  const [query, setQuery] = useState('');
  const [filteredItems, setFilteredItems] = useState(items);
  const [isPending, startTransition] = useTransition();
  const handleSearch = (e) => {
    const value = e.target.value;
    setQuery(value); // Urgent: update input immediately
    // Non-urgent: filter can lag behind
    startTransition(() => {
      const filtered = items.filter(item =>
        item.name.toLowerCase().includes(value.toLowerCase())
      );
      setFilteredItems(filtered);
    });
  };
  return (
    <div>
      <input value={query} onChange={handleSearch} placeholder="Search..." />
      {isPending && <span className="loading">Filtering...</span>}
      <ul style={{ opacity: isPending ? 0.7 : 1 }}>
        {filteredItems.map(item => <li key={item.id}>{item.name}</li>)}
      </ul>
    </div>
  );
}
useDeferredValue: Deferred Rendering
Defer updating a value until the UI has time to render it:
import { useDeferredValue, memo } from 'react';

function SearchResults({ query }) {
  const deferredQuery = useDeferredValue(query);
  const isStale = query !== deferredQuery;
  return (
    <div style={{ opacity: isStale ? 0.7 : 1 }}>
      <SlowList query={deferredQuery} />
    </div>
  );
}
// Expensive component that benefits from deferred updates
const SlowList = memo(function SlowList({ query }) {
  // Imagine this does heavy filtering/rendering
  const items = heavyFilter(allItems, query);
  return <ul>{items.map(item => <li key={item.id}>{item.name}</li>)}</ul>;
});
useTransition vs useDeferredValue
- useTransition - You control when the update happens (wrap the setState call)
- useDeferredValue - React controls when the value updates (wrap the value)
Use useTransition when you own the state update. Use useDeferredValue when receiving a value as a prop.
Automatic Batching
React 18 automatically batches all state updates, even in async code:
// React 17: Two renders
setTimeout(() => {
  setCount(c => c + 1); // Render 1
  setFlag(f => !f); // Render 2
}, 1000);
// React 18: One render (automatic batching)
setTimeout(() => {
  setCount(c => c + 1);
  setFlag(f => !f); // Both batched into single render
}, 1000);
// Opt out if needed (rare)
import { flushSync } from 'react-dom';
flushSync(() => setCount(c => c + 1)); // Forces immediate render
setFlag(f => !f); // Separate render
When to Use useTransition
useTransition shines when filtering large datasets client-side. If Glucoplate filtered 8,000+ foods locally, we'd use it. But our architecture pushes filtering to the server with debounced API calls. The tradeoff: server-side filtering adds network latency but keeps the bundle small and works consistently. useTransition is perfect when you have the data locally and need to filter without blocking input.
Debouncing Input
Prevent excessive updates from rapid user input:
import { useState, useEffect } from 'react';
import { useQuery } from '@tanstack/react-query';

function useDebounce(value, delay) {
  const [debouncedValue, setDebouncedValue] = useState(value);
  useEffect(() => {
    const timer = setTimeout(() => {
      setDebouncedValue(value);
    }, delay);
    return () => clearTimeout(timer);
  }, [value, delay]);
  return debouncedValue;
}
// Usage
function Search() {
  const [query, setQuery] = useState('');
  const debouncedQuery = useDebounce(query, 300);
  // API call only triggers 300ms after user stops typing
  const { data } = useQuery({
    queryKey: ['search', debouncedQuery],
    queryFn: () => searchApi(debouncedQuery),
    enabled: debouncedQuery.length > 2,
  });
  return (
    <input
      value={query}
      onChange={(e) => setQuery(e.target.value)}
      placeholder="Search..."
    />
  );
}
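Detached from React, the hook above is the classic debounce pattern. A framework-agnostic sketch:

```javascript
// Classic debounce: delay calling fn until `delay` ms have passed
// without another call; each new call resets the timer
function debounce(fn, delay) {
  let timer = null;
  return function debounced(...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delay);
  };
}
```

The useDebounce hook implements the same timing with a state update as the delayed action, so the component re-renders once when the value settles.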
Profiling
Always profile before optimizing. Use the React DevTools Profiler tab for interactive inspection, or the Profiler component to log render timings programmatically:
import { Profiler } from 'react';

function onRenderCallback(
  id,
  phase,
  actualDuration,
  baseDuration,
  startTime,
  commitTime
) {
  console.log(`[${id}] ${phase} - ${actualDuration.toFixed(2)}ms`);
  if (actualDuration > 16) {
    console.warn(`Slow render in ${id}`);
  }
}
function App() {
  return (
    <Profiler id="Dashboard" onRender={onRenderCallback}>
      <Dashboard />
    </Profiler>
  );
}
Interview Questions
Q1: What's the difference between useMemo and useCallback?
Answer:
- useMemo memoizes a computed value (result of a function)
- useCallback memoizes a function definition
// useMemo - caches the RESULT
const total = useMemo(() => calculateTotal(items), [items]);
// useCallback - caches the FUNCTION
const handleClick = useCallback(() => doSomething(id), [id]);
// useCallback is essentially:
const handleClick = useMemo(() => () => doSomething(id), [id]);
Q2: When should you NOT use React.memo?
Answer: Avoid memo when:
- Component is cheap - Comparison cost exceeds render cost
- Props change frequently - No benefit from memoization
- Component rarely re-renders - Optimization for non-problem
- Props are inline objects - Always "new" reference
Q3: How does code splitting improve performance?
Answer:
- Faster initial load - Less JavaScript downloaded upfront
- Better caching - Unchanged chunks remain cached
- Reduced memory - Unused code never loaded
Trade-offs: Additional network requests, loading states needed.
Q4: How would you optimize a list with 10,000 items?
Answer: Combine strategies:
- Virtualization - Only render visible items
- Pagination - Load in chunks
- Memoize list items - React.memo
- Stable callbacks - useCallback
- Debounce filters - Prevent excessive updates
Common Mistakes
1. Premature Optimization
// Don't memoize everything "just in case"
const Label = memo(({ text }) => <span>{text}</span>);
// ^ Probably unnecessary for simple components
// Profile first, then optimize bottlenecks
2. Missing Dependencies
// ❌ Stale closure
const handleSubmit = useCallback(() => {
submitForm(formData);
}, []); // Missing formData!
// ✅ Include dependencies
const handleSubmit = useCallback(() => {
submitForm(formData);
}, [formData]);
3. Inline Objects Breaking Memo
// ❌ Memo is useless - options recreated every render
<MemoizedComponent options={{ sort: 'name' }} />
// ✅ Stable reference
const options = useMemo(() => ({ sort: 'name' }), []);
<MemoizedComponent options={options} />
Summary
- Profile first - Use React DevTools to identify actual bottlenecks
- React.memo prevents re-renders when props haven't changed
- useMemo caches expensive calculations and stable references
- useCallback provides stable function references
- Code splitting reduces initial bundle with lazy loading
- Virtualization essential for long lists
- Don't over-optimize - memoization has overhead
Next up: Testing React Applications with Jest and React Testing Library.
Part 8 of the React Developer Reference series.