
In 2021, we faced an interesting challenge: build a self-service portal where business users could create and customize their own React components without deploying code. The components needed to be dynamic, stored in a database, and rendered server-side for performance and SEO.
The scale: 12 million views per week. The solution had to be fast, secure, and infinitely cacheable.
Here’s how we solved it using the Acorn parser, component-based JSX storage, Next.js serverless rendering, and aggressive caching strategies.
The Problem
Traditional approaches to building customizable portals fall into two camps:
- Configuration-based builders - Limited flexibility, users can only tweak predefined options
- Full code editors - Security nightmare, requires code reviews and deployments
We needed something in between: give users the power of React components while maintaining security and instant deployment.
Additional constraints:
- 12M views/week (~200 requests/second average, 1000+ req/s peak)
- Cost-effective - Serverless functions at this scale get expensive fast
- Fast response times - <100ms p95 for page loads
- Zero downtime deployments - Business users updating components shouldn’t break anything
Architecture Overview
Our solution had three key components:
[Diagram: Data flow from database through parsing to server-side rendering]
1. Component Storage in Database
Instead of storing entire page templates, we stored reusable component definitions:
// Example component stored in PostgreSQL
{
id: 'user-dashboard-card',
name: 'User Dashboard Card',
jsx: `
function DashboardCard({ title, metrics, onClick }) {
return (
<div className="dashboard-card">
<h3>{title}</h3>
<div className="metrics">
{metrics.map(m => (
<div key={m.label} className="metric">
<span className="label">{m.label}</span>
<span className="value">{m.value}</span>
</div>
))}
</div>
<button onClick={onClick}>View Details</button>
</div>
)
}
`,
allowedProps: ['title', 'metrics', 'onClick'],
version: 3,
createdBy: 'user@example.com',
createdAt: '2021-04-10T10:30:00Z'
}

Key decisions:
- Store complete function components, not fragments
- Include metadata: allowed props, version, author
- Keep JSX human-readable for editing
- No external imports allowed (security)
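The article doesn’t show the persistence layer itself. Here is a minimal sketch of how a record like the one above could be stored and fetched with node-postgres, assuming a simple components table; the table name, columns, and the saveComponent helper are illustrative, while getComponent mirrors the helper the API route imports later:

// lib/db.js (sketch) - assumes a table roughly like:
//   components(id text, version int, name text, jsx text,
//              allowed_props jsonb, created_by text, created_at timestamptz)
import { Pool } from 'pg'

const pool = new Pool({ connectionString: process.env.DATABASE_URL })

// Fetch the latest version of a component definition
export async function getComponent(id) {
  const { rows } = await pool.query(
    'SELECT * FROM components WHERE id = $1 ORDER BY version DESC LIMIT 1',
    [id]
  )
  return rows[0] || null
}

// Insert a new version; a real system would guard the version bump with a
// transaction or a unique (id, version) constraint
export async function saveComponent({ id, name, jsx, allowedProps, createdBy }) {
  const { rows } = await pool.query(
    `INSERT INTO components (id, version, name, jsx, allowed_props, created_by, created_at)
     VALUES ($1, COALESCE((SELECT MAX(version) FROM components WHERE id = $1), 0) + 1,
             $2, $3, $4, $5, NOW())
     RETURNING *`,
    [id, name, jsx, JSON.stringify(allowedProps), createdBy]
  )
  return rows[0]
}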
2. Parsing with Acorn
We chose Acorn as our parser because:
- Fast, lightweight, battle-tested
- Produces standard ESTree AST
- Supports JSX via plugin
- No dependency on Babel’s heavier transform pipeline for validation (Babel only runs later, at compile time)
import * as acorn from 'acorn'
import jsx from 'acorn-jsx'
const JSXParser = acorn.Parser.extend(jsx())
function parseComponent(jsxString) {
try {
// Parse JSX to AST
const ast = JSXParser.parse(jsxString, {
  ecmaVersion: 2021,
  sourceType: 'module',
  locations: true, // populate node.loc so validation can report line numbers
})
return {
success: true,
ast,
}
} catch (error) {
return {
success: false,
error: error.message,
line: error.loc?.line,
column: error.loc?.column,
}
}
}

3. Component Validation
Before allowing a component to be saved, we validated it against security rules:
// walk is acorn-walk's simple walker; note that the base walker has to be
// extended with JSX node types (e.g. via an acorn-jsx-aware walker) before
// it can traverse JSX elements without throwing.
import { simple as walk } from 'acorn-walk'

function validateComponent(ast) {
const violations = []
// Walk the AST looking for dangerous patterns
walk(ast, {
ImportDeclaration(node) {
// No imports allowed - components must be self-contained
violations.push({
type: 'FORBIDDEN_IMPORT',
message: 'Import statements not allowed',
line: node.loc.start.line
})
},
CallExpression(node) {
const dangerousFunctions = [
'eval', 'Function', 'setTimeout',
'setInterval', 'fetch', 'XMLHttpRequest'
]
if (dangerousFunctions.includes(node.callee.name)) {
violations.push({
type: 'FORBIDDEN_FUNCTION',
message: `${node.callee.name} is not allowed`,
line: node.loc.start.line
})
}
},
MemberExpression(node) {
// Block access to window, document, etc.
const blockedGlobals = ['window', 'document', 'global', 'process']
if (blockedGlobals.includes(node.object.name)) {
violations.push({
type: 'FORBIDDEN_GLOBAL',
message: `Access to ${node.object.name} is not allowed`,
line: node.loc.start.line
})
}
}
})
return {
valid: violations.length === 0,
violations
}
}

Validation rules:
- No imports or requires
- No eval, Function constructor, or dynamic code execution
- No DOM access (window, document)
- No network calls (fetch, XMLHttpRequest)
- No timers (setTimeout, setInterval)
- Only whitelisted React hooks (see the sketch below)
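The hook rule is the one check not shown in the validator above. A minimal sketch of how it could be added to the same AST walk; the ALLOWED_HOOKS list and the checkHooks helper are illustrative, not lifted from the original system:

import { simple as walk } from 'acorn-walk' // same walker as validateComponent

const ALLOWED_HOOKS = ['useState', 'useMemo', 'useCallback']

function checkHooks(ast) {
  const violations = []
  walk(ast, {
    CallExpression(node) {
      // Direct calls like useState(...)
      const direct = node.callee.type === 'Identifier' ? node.callee.name : null
      // Namespaced calls like React.useState(...)
      const namespaced =
        node.callee.type === 'MemberExpression' && node.callee.object.name === 'React'
          ? node.callee.property.name
          : null
      const hook = direct || namespaced
      if (hook && /^use[A-Z]/.test(hook) && !ALLOWED_HOOKS.includes(hook)) {
        violations.push({
          type: 'FORBIDDEN_HOOK',
          message: `${hook} is not in the allowed hook list`,
          line: node.loc.start.line,
        })
      }
    },
  })
  return violations
}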
4. Component Registry
We maintained a registry of allowed external components that users could reference:
// Server-side component registry
const ALLOWED_COMPONENTS = {
// UI primitives
Button: require('./ui/Button'),
Input: require('./ui/Input'),
Card: require('./ui/Card'),
Modal: require('./ui/Modal'),
// Data display
Table: require('./data/Table'),
Chart: require('./data/Chart'),
// Layout
Grid: require('./layout/Grid'),
Flex: require('./layout/Flex'),
}
// Users could reference these in their JSX
const userComponent = `
function MyDashboard({ data }) {
return (
<Grid columns={2}>
<Card title="Sales">
<Chart data={data.sales} type="line" />
</Card>
<Card title="Users">
<Table data={data.users} />
</Card>
</Grid>
)
}
`

5. Runtime Compilation with Next.js
The magic happened in Next.js API routes (serverless functions):
// pages/api/render/[componentId].js
import React from 'react'
import ReactDOMServer from 'react-dom/server'
import { compileComponent } from '../../../lib/compiler'
import { getComponent } from '../../../lib/db'
import { ALLOWED_COMPONENTS } from '../../../lib/registry' // registry module shown above; path illustrative
export default async function handler(req, res) {
const { componentId } = req.query
const { props } = req.body
try {
// 1. Fetch component from database
const component = await getComponent(componentId)
if (!component) {
return res.status(404).json({ error: 'Component not found' })
}
// 2. Compile JSX to executable function
const CompiledComponent = compileComponent(
component.jsx,
ALLOWED_COMPONENTS
)
// 3. Render to HTML (server-side)
const html = ReactDOMServer.renderToString(
<CompiledComponent {...props} />
)
// 4. Return HTML + hydration data
res.status(200).json({
html,
props,
componentId,
})
} catch (error) {
res.status(500).json({
error: 'Render failed',
message: error.message
})
}
}

6. The Compilation Step
Converting JSX string to executable React component:
import React from 'react'
import { transform } from '@babel/standalone'
function compileComponent(jsxString, registry) {
// Transform JSX to plain JavaScript
const { code } = transform(jsxString, {
presets: ['react'],
filename: 'component.jsx',
})
// Create isolated scope with only allowed components
const scopedEval = new Function(
...Object.keys(registry),
'React',
`
${code}
// Extract the component function
const componentName = ${extractComponentName(jsxString)}
return eval(componentName)
`
)
// Execute with controlled scope
const Component = scopedEval(
...Object.values(registry),
React
)
return Component
}
function extractComponentName(jsxString) {
// Parse to find function declaration name
const match = jsxString.match(/function\s+(\w+)/)
return match ? `"${match[1]}"` : null
}

Security Considerations
This approach had several security layers:
- AST-based validation - Caught dangerous patterns before compilation
- No file system access - Components couldn’t read/write files
- Isolated scope - Only whitelisted components available
- Server-side rendering - User code never executed in browser
- Content Security Policy - Strict CSP headers on frontend
- Rate limiting - Prevent abuse of compilation endpoints (see the sketch after this list)
- Versioning - All component changes tracked, rollback available
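The rate-limiting layer isn’t shown in the article. A minimal per-IP sketch using the same lru-cache library the portal already relies on; the limits and helper name are illustrative:

// lib/rate-limit.js (sketch)
import LRU from 'lru-cache'

// One counter per IP, reset every minute
const rateWindows = new LRU({ max: 5000, maxAge: 60 * 1000 })

export function isRateLimited(ip, limit = 60) {
  const count = (rateWindows.get(ip) || 0) + 1
  rateWindows.set(ip, count)
  return count > limit
}

// Usage inside the render handler, before any compilation work:
// const ip = req.headers['x-forwarded-for'] || req.socket.remoteAddress
// if (isRateLimited(ip)) {
//   return res.status(429).json({ error: 'Too many requests' })
// }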
Performance Optimizations
Component Caching
import LRU from 'lru-cache'
import { createHash } from 'crypto'

const compiledCache = new LRU({
  max: 500, // Cache 500 compiled components
  maxAge: 1000 * 60 * 60, // 1 hour TTL
  updateAgeOnGet: true,
})

// Hash the JSX source so an edited component gets a fresh cache entry
function hashString(str) {
  return createHash('sha256').update(str).digest('hex')
}

function getOrCompileComponent(componentId, jsx) {
  const cacheKey = `${componentId}:${hashString(jsx)}`
  let compiled = compiledCache.get(cacheKey)
  if (!compiled) {
    compiled = compileComponent(jsx, ALLOWED_COMPONENTS)
    compiledCache.set(cacheKey, compiled)
  }
  return compiled
}

Incremental Static Regeneration
For pages using these components:
// pages/portal/[pageId].js
export async function getStaticProps({ params }) {
const page = await getPage(params.pageId)
const components = await getPageComponents(page.componentIds)
return {
props: { page, components },
revalidate: 60, // Regenerate every 60 seconds
}
}
export async function getStaticPaths() {
return {
paths: [],
fallback: 'blocking', // Generate on-demand
}
}

Caching Strategy at 12M Views/Week
At scale, caching wasn’t optional - it was the entire strategy. Here’s how we made it work:
Multi-Layer Cache Architecture
[Diagram: Cache layers protecting serverless functions from load]
Layer 1: CDN Edge Cache (95% hit rate)
// next.config.js
module.exports = {
async headers() {
return [
{
source: '/portal/:path*',
headers: [
{
key: 'Cache-Control',
// CDN caches for 1 hour, browser for 5 minutes
// Stale-while-revalidate keeps serving stale for 24h
value: 's-maxage=3600, max-age=300, stale-while-revalidate=86400',
},
{
key: 'CDN-Cache-Control',
// Cloudflare-specific: cache for 24 hours
value: 'max-age=86400',
},
],
},
]
},
}

Key decisions:
- Serve stale content during revalidation (zero downtime)
- Long CDN cache, short browser cache (flexibility)
- Cache-Control varies by route priority (sketched below)
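To illustrate that last point, here is a sketch of route-priority-based headers in next.config.js; the routes and TTLs are illustrative, not the production values:

// next.config.js (sketch of per-route cache priorities)
module.exports = {
  async headers() {
    return [
      {
        // High-traffic portal pages: long CDN cache, serve stale while revalidating
        source: '/portal/:path*',
        headers: [
          { key: 'Cache-Control', value: 's-maxage=3600, max-age=300, stale-while-revalidate=86400' },
        ],
      },
      {
        // Admin and preview routes: never cache at the edge
        source: '/admin/:path*',
        headers: [{ key: 'Cache-Control', value: 'no-store' }],
      },
    ]
  },
}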
Layer 2: ISR with Redis Backing
// lib/redis-cache.js
import Redis from 'ioredis'
const redis = new Redis(process.env.REDIS_URL)
export async function getCachedComponent(componentId, version) {
const cacheKey = `component:${componentId}:${version}`
// Try Redis first
const cached = await redis.get(cacheKey)
if (cached) {
return JSON.parse(cached)
}
// Fetch from database
const component = await db.components.findOne({
id: componentId,
version
})
// Cache for 1 hour
await redis.setex(
cacheKey,
3600,
JSON.stringify(component)
)
return component
}
// Invalidate cache when component updates
export async function invalidateComponent(componentId) {
const pattern = `component:${componentId}:*`
const keys = await redis.keys(pattern)
if (keys.length > 0) {
await redis.del(...keys)
}
// Also trigger ISR revalidation
await fetch(`/api/revalidate?component=${componentId}`, {
headers: { 'x-revalidate-token': process.env.REVALIDATE_TOKEN }
})
}

Layer 3: Compiled Component Cache
// lib/compilation-cache.js
import LRU from 'lru-cache'
import { createHash } from 'crypto'
// `redis`, `compileComponent`, and `ALLOWED_COMPONENTS` are the instances/modules shown earlier
// In-memory LRU for hot components
const hotCache = new LRU({
max: 100,
maxAge: 1000 * 60 * 60, // 1 hour
})
// Redis for distributed cache
export async function getCompiledComponent(jsx, componentId) {
const hash = createHash('sha256').update(jsx).digest('hex')
const cacheKey = `compiled:${componentId}:${hash}`
// Try hot cache first (microseconds)
let compiled = hotCache.get(cacheKey)
if (compiled) return compiled
// Try Redis (milliseconds)
const cached = await redis.get(cacheKey)
if (cached) {
compiled = deserializeFunction(cached)
hotCache.set(cacheKey, compiled)
return compiled
}
// Compile (expensive - seconds)
compiled = compileComponent(jsx, ALLOWED_COMPONENTS)
// Store in both caches
hotCache.set(cacheKey, compiled)
await redis.setex(
cacheKey,
3600 * 24, // 24 hours
serializeFunction(compiled)
)
return compiled
}
function serializeFunction(fn) {
  // Serialize the function source to a string for Redis storage
  return fn.toString()
}

function deserializeFunction(str) {
  // Reconstruct the function from its source. Caveat: toString() drops any
  // closed-over scope, so registry components have to be re-bound when the
  // function is rebuilt this way.
  return new Function('return ' + str)()
}

Cache Warming Strategy
To prevent cold starts and compilation spikes:
// lib/cache-warmer.js
import { getPopularComponents } from './analytics'
import { getCachedComponent } from './redis-cache'
import { getCompiledComponent } from './compilation-cache'
// Warm cache every 30 minutes
export async function warmCache() {
const popular = await getPopularComponents({
limit: 50,
timeframe: '24h'
})
await Promise.all(
popular.map(async ({ componentId, version }) => {
const component = await getCachedComponent(componentId, version)
// Pre-compile popular components
await getCompiledComponent(component.jsx, componentId)
})
)
console.log(`Warmed cache for ${popular.length} components`)
}
// Schedule via cron or scheduled function
// Vercel: vercel.json cron
// AWS: EventBridge rule

Handling Cache Invalidation
The hardest problem - when users update components:
// pages/api/admin/update-component.js
export default async function handler(req, res) {
const { componentId, jsx } = req.body
// 1. Parse and validate the new JSX
const parsed = parseComponent(jsx)
if (!parsed.success) {
  return res.status(400).json({ errors: [parsed.error] })
}
const validation = validateComponent(parsed.ast)
if (!validation.valid) {
  return res.status(400).json({ errors: validation.violations })
}
// 2. Save new version to database
const newVersion = await db.components.create({
id: componentId,
jsx,
version: Date.now(), // Use timestamp as version
})
// 3. Invalidate all caches
await Promise.all([
// Clear Redis
invalidateComponent(componentId),
// Clear compiled cache (DEL doesn't expand wildcards, so look the keys up first)
redis.keys(`compiled:${componentId}:*`).then(keys => keys.length ? redis.del(...keys) : null),
// Trigger ISR revalidation for all pages using this component
revalidatePagesWithComponent(componentId),
// Purge CDN cache (Cloudflare example)
purgeCloudflareCache([
`https://portal.example.com/*${componentId}*`
])
])
// 4. Warm cache with new version
await getCompiledComponent(jsx, componentId)
res.json({
success: true,
version: newVersion.version
})
}
async function revalidatePagesWithComponent(componentId) {
// Find all pages using this component
const pages = await db.pages.find({
componentIds: { $contains: componentId }
})
// Revalidate each page
await Promise.all(
pages.map(page =>
fetch(`/api/revalidate?page=${page.id}`, {
headers: { 'x-revalidate-token': process.env.REVALIDATE_TOKEN }
})
)
)
}

Monitoring Cache Performance
// lib/metrics.js
import { CloudWatch } from 'aws-sdk'
const cloudwatch = new CloudWatch()
export async function trackCacheHit(layer, hit) {
await cloudwatch.putMetricData({
Namespace: 'Portal/Cache',
MetricData: [
{
MetricName: 'HitRate',
Value: hit ? 1 : 0,
Unit: 'None',
Dimensions: [
{ Name: 'Layer', Value: layer }
]
}
]
}).promise()
}
// Usage in compilation cache
export async function getCompiledComponentWithMetrics(jsx, componentId) {
  const hash = createHash('sha256').update(jsx).digest('hex')
  const cacheKey = `compiled:${componentId}:${hash}`
  const cached = hotCache.get(cacheKey)
  trackCacheHit('HotCache', !!cached)
  if (cached) return cached
  // Continue with Redis, compilation, etc...
}

Cost Impact
Before aggressive caching:
- ~200 req/s × 100ms avg execution = 20 concurrent lambdas
- 12M requests/week × $0.20 per 1M requests = $2.40/week in Lambda invocations
- 12M × 100ms × $0.0000166667 per GB-second = $20/week in compute time
- Total: ~$1,000/month
After multi-layer caching:
- 95% CDN hit rate = only 600K requests hit origin
- 90% ISR hit rate = only 60K hit Lambda
- 60K requests/week × $0.20 per 1M = $0.012/week
- Total: ~$5/month (200x reduction; rough math sketched below)
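As a back-of-the-envelope check of the request funnel behind those numbers (the hit rates and 100ms execution time are the figures above; the 1 GB Lambda memory size is an assumption):

// Caching funnel: 12M views -> CDN misses -> Lambda invocations
const weeklyRequests = 12_000_000
const cdnMisses = weeklyRequests * (1 - 0.95)       // 600,000 reach the origin
const lambdaHits = cdnMisses * (1 - 0.9)            // 60,000 reach a Lambda

// Weekly Lambda cost at those volumes
const invocationCost = (lambdaHits / 1_000_000) * 0.20                    // ≈ $0.012/week
const computeCost = lambdaHits * 0.1 /* s */ * 1 /* GB */ * 0.0000166667  // ≈ $0.10/week

console.log({ cdnMisses, lambdaHits, invocationCost, computeCost })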
Cache Stats We Achieved
- CDN Layer: 95% hit rate
- ISR Layer: 90% hit rate
- Redis Layer: 99% hit rate
- Compilation: 99.9% cache hit (compilation ~1x/day per component)
- P95 Response Time: 45ms (from CDN)
- P99 Response Time: 120ms (includes ISR)
- Cold Start (1%): 800ms (full compilation path)

Real-World Example
Here’s a complete flow of creating a custom analytics dashboard:
- User creates component in admin UI:
function AnalyticsDashboard({ metrics, dateRange }) {
const [selectedMetric, setSelectedMetric] = React.useState(null)
return (
<div>
<Card>
<h2>Analytics: {dateRange}</h2>
<Grid columns={3}>
{metrics.map(metric => (
<Button
key={metric.id}
onClick={() => setSelectedMetric(metric)}
variant={selectedMetric?.id === metric.id ? 'primary' : 'default'}
>
<div className="metric-value">{metric.value}</div>
<div className="metric-label">{metric.label}</div>
</Button>
))}
</Grid>
</Card>
{selectedMetric && (
<Card>
<Chart
data={selectedMetric.history}
type="line"
height={300}
/>
</Card>
)}
</div>
)
}

- System validates and stores the component
- Component becomes available instantly at /portal/analytics-dashboard
- Next.js renders it server-side with real data from APIs
- React hydrates for interactivity on the client
Lessons Learned
What worked well:
- Component-based approach scaled better than page-based
- Multi-layer caching reduced costs by 200x - critical at scale
- Server-side rendering kept user code isolated and eliminated XSS vectors
- Acorn parser was fast and reliable for validation
- ISR + stale-while-revalidate gave us zero-downtime deployments
- Versioning saved us multiple times during rollbacks
- Cache warming eliminated cold starts for popular components
- Redis + LRU combo gave sub-10ms lookup times
What we’d do differently:
- Start with TypeScript - type checking would catch errors earlier
- Build a visual component library browser - users struggled with discovery
- Add automated testing for user components before production
- Implement blue/green deployments for component updates at the CDN level
- Better cache invalidation - we had some stale content issues initially
- GraphQL for component queries - REST was getting unwieldy
- Edge compute - would’ve been perfect for this use case (Vercel Edge wasn’t mature in 2021)
Alternatives Considered
We evaluated several other approaches:
- Full Babel pipeline in Node.js - Too slow and heavy for validation (we kept Babel only for the final compile step)
- VM2 sandboxing - Security concerns with VM escapes
- WebAssembly compilation - Too experimental in 2021
- Template strings - Not flexible enough
- AST transformation only - We still needed a runtime to execute the transformed code
Conclusion
Building a self-service portal with dynamic JSX components at 12M views/week taught us that:
- Parsing ≠ Security - You need validation at multiple layers
- Cache or die - At scale, caching isn’t an optimization, it’s the architecture
- Server-side rendering is your friend for untrusted code
- Component isolation is harder than it looks
- User experience matters - Fast validation feedback is critical
- Stale content > No content - stale-while-revalidate saved us
- Monitor everything - Cache hit rates, compilation times, invalidation patterns
The system served thousands of custom components and 12 million weekly views over two years with:
- Zero security incidents
- 45ms P95 response time (CDN)
- 99.9% uptime
- $5/month infrastructure cost (200x reduction from naive approach)
The key was treating user-created components as untrusted input at every stage, caching aggressively at every layer, and still providing a great developer experience.
If you’re building something similar today, consider tools like:
- Sandpack for in-browser sandboxing
- Mitosis for cross-framework components
- Qwik for resumable SSR
- Edge Functions for even faster rendering
The core principles remain: parse, validate, isolate, render, and cache aggressively.
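Tying those principles together, here is a compact sketch of the pipeline built from the helpers shown earlier; the 'latest' version argument and the minimal error handling are simplifications for illustration:

// Recap sketch: parse -> validate -> compile (cached) -> render.
// parseComponent, validateComponent, getCachedComponent and getCompiledComponent
// are the helpers defined earlier in this post.
import React from 'react'
import ReactDOMServer from 'react-dom/server'

export async function renderUserComponent(componentId, props) {
  const component = await getCachedComponent(componentId, 'latest') // version handling simplified

  const parsed = parseComponent(component.jsx) // Acorn + acorn-jsx
  if (!parsed.success) throw new Error(parsed.error)

  const { valid, violations } = validateComponent(parsed.ast) // AST security rules
  if (!valid) throw new Error(`Validation failed with ${violations.length} violation(s)`)

  const Component = await getCompiledComponent(component.jsx, componentId) // LRU + Redis cached
  return ReactDOMServer.renderToString(React.createElement(Component, props))
}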
Have you built a similar system? I’d love to hear about your approach and trade-offs. Find me on Threads or GitHub.