# Chapter 8: Serverless & Event-Driven Architecture
Serverless means you write business logic while Azure handles infrastructure and scaling. You pay only for execution time, not for idle capacity.
## Azure Functions
Azure Functions is a serverless compute service. Write a function triggered by an event; it runs, then stops. No servers to manage, scales to zero, billed per execution.
### Triggers & Bindings
A trigger defines what starts the function. Bindings declaratively connect inputs and outputs without boilerplate code.
| Trigger | Use Case |
|---|---|
| HttpTrigger | REST API endpoints, webhooks |
| TimerTrigger | Scheduled jobs (cron) |
| BlobTrigger | Process files on upload to Blob Storage |
| QueueTrigger | Process messages from Storage Queue or Service Bus |
| EventGridTrigger | Respond to Azure events |
| EventHubTrigger | Process IoT / streaming events |
| CosmosDBTrigger | React to Cosmos DB change feed |
### Creating a Function App

```bash
# Create a storage account for the Function App
az storage account create \
  --name myfuncappstorage \
  --resource-group myapp-rg \
  --location eastus \
  --sku Standard_LRS

# Create the Function App (Python, Consumption plan)
az functionapp create \
  --resource-group myapp-rg \
  --name myapp-functions \
  --storage-account myfuncappstorage \
  --consumption-plan-location eastus \
  --runtime python \
  --runtime-version 3.11 \
  --functions-version 4 \
  --os-type linux
```
### Writing Functions (Python)

Project structure (v2 programming model):

```
myapp-functions/
├── function_app.py       # all functions in one file (v2 model)
├── requirements.txt
├── host.json
└── local.settings.json   # local development config (never commit this)
```
```python
# function_app.py
import azure.functions as func
import logging
import json

app = func.FunctionApp()

# HTTP Trigger: REST endpoint
@app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)
def hello(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name") or "World"
    return func.HttpResponse(
        json.dumps({"message": f"Hello, {name}!"}),
        mimetype="application/json",
        status_code=200,
    )

# Timer Trigger: run every 5 minutes
@app.timer_trigger(schedule="0 */5 * * * *", arg_name="timer", run_on_startup=False)
def cleanup_job(timer: func.TimerRequest) -> None:
    logging.info("Running cleanup job")
    # ... cleanup logic

# Blob Trigger: process uploaded images
@app.blob_trigger(arg_name="blob", path="images/{name}", connection="AzureWebJobsStorage")
def process_image(blob: func.InputStream) -> None:
    logging.info(f"Processing blob: {blob.name}, size: {blob.length} bytes")
    # ... image processing logic

# Queue Trigger with output binding
@app.queue_trigger(arg_name="msg", queue_name="orders", connection="AzureWebJobsStorage")
@app.cosmos_db_output(
    arg_name="outputDoc",
    database_name="myappdb",
    container_name="processed-orders",
    connection="CosmosDBConnection",
)
def process_order(msg: func.QueueMessage, outputDoc: func.Out[func.Document]) -> None:
    order = json.loads(msg.get_body().decode("utf-8"))
    order["status"] = "processed"
    outputDoc.set(func.Document.from_dict(order))
```
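A project using the v2 model needs at least the Functions library in `requirements.txt` (add `azure-durable-functions` only if you use Durable Functions):

```
azure-functions
```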
### Local Development

```bash
# Install the Azure Functions Core Tools
npm install -g azure-functions-core-tools@4

# Create a new project
func init myapp-functions --python

# Start the local runtime
cd myapp-functions
func start

# Test the HTTP trigger (quote the URL so the shell ignores ? and &)
curl "http://localhost:7071/api/hello?name=Azure"
```
### Deploy Functions

```bash
# Deploy via the Core Tools (simplest)
func azure functionapp publish myapp-functions

# Or via Azure CLI with a zip package
cd myapp-functions
zip -r app.zip . --exclude ".git/*" --exclude "__pycache__/*"
az functionapp deployment source config-zip \
  --resource-group myapp-rg \
  --name myapp-functions \
  --src app.zip
```
### Hosting Plans
| Plan | Cold Start | Max Duration | Scale | Cost |
|---|---|---|---|---|
| Consumption | Yes (~1–3s) | 10 min | Auto (0–200 instances) | Per execution |
| Flex Consumption | Minimal | 60 min | Fast auto-scale | Per execution + standby |
| Premium | No (pre-warmed) | Unbounded (60 min guaranteed) | Auto | Per instance hour |
| Dedicated (App Service) | No | Unlimited | Manual or auto | Per App Service Plan |
## Durable Functions
Durable Functions add stateful workflow patterns (orchestrations) on top of Azure Functions:
```python
import azure.durable_functions as df

# Orchestrator: defines the workflow
def orchestrator_function(context: df.DurableOrchestrationContext):
    # Call activity functions in sequence
    result1 = yield context.call_activity("FetchData", {"id": "123"})
    result2 = yield context.call_activity("ProcessData", result1)
    result3 = yield context.call_activity("SaveResult", result2)

    # Or in parallel (fan-out/fan-in)
    tasks = [context.call_activity("ProcessItem", item) for item in result1]
    results = yield context.task_all(tasks)
    return results

main = df.Orchestrator.create(orchestrator_function)
Patterns: chaining, fan-out/fan-in, async HTTP, monitoring (polling), human interaction (wait for approval).
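Orchestrators are plain generator functions that the runtime drives and replays: each `yield` hands an activity request to the framework, which resumes the generator with the result. A stdlib-only sketch of that driving loop — the activities and the `run` driver here are hypothetical stand-ins, not the real Durable runtime:

```python
# Sketch: how a Durable-style runtime drives an orchestrator generator.

def orchestrator(context):
    items = yield ("FetchData", {"id": "123"})   # first activity call
    processed = yield ("ProcessData", items)     # chained on its result
    return processed

# Hypothetical activity implementations, keyed by name.
ACTIVITIES = {
    "FetchData": lambda payload: ["a", "b", "c"],
    "ProcessData": lambda items: [i.upper() for i in items],
}

def run(orchestrator_fn):
    """Drive the generator to completion, feeding back activity results."""
    gen = orchestrator_fn(context=None)
    try:
        request = next(gen)  # advance to the first yield
        while True:
            name, payload = request
            request = gen.send(ACTIVITIES[name](payload))
    except StopIteration as done:
        return done.value

print(run(orchestrator))  # ['A', 'B', 'C']
```

This is why orchestrator code must be deterministic: on replay the runtime re-runs the generator and feeds back recorded results.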
## Azure Logic Apps
Logic Apps is a low-code/no-code workflow automation service. Use it for integration scenarios connecting SaaS apps (Office 365, Salesforce, ServiceNow), Azure services, and HTTP endpoints without writing code.
Example workflow: "New order → send email + create ticket"

```
Trigger: HTTP Request (POST /order)
  │
  ├── Action: Insert row in Azure SQL
  ├── Action: Send email (Office 365 Outlook connector)
  └── Action: Create ticket (ServiceNow connector)
```
Logic Apps vs Azure Functions:
- Logic Apps: visual designer, 400+ connectors, low-code, per-action billing
- Azure Functions: code-first, any language, flexible, per-execution billing
Use Logic Apps for integration/orchestration between services. Use Functions for compute-heavy or custom logic.
## Azure Event Grid
Event Grid is a managed event-routing service. It routes events from sources to subscribers using a publish/subscribe model.
```
Event Sources   →   Event Grid   →   Event Handlers
────────────────────────────────────────────────────
Blob Storage    →     Topic      →   Azure Function
Azure SQL       →                →   Logic App
Custom app      →                →   Event Hub
ARM events      →                →   Webhook
```
```bash
# Create a custom topic
az eventgrid topic create \
  --resource-group myapp-rg \
  --name myapp-topic \
  --location eastus

# Subscribe a function to the topic
az eventgrid event-subscription create \
  --name myapp-subscription \
  --source-resource-id $(az eventgrid topic show --resource-group myapp-rg --name myapp-topic --query id -o tsv) \
  --endpoint $(az functionapp function show --resource-group myapp-rg --function-app myapp-functions --name EventGridHandler --query invokeUrlTemplate -o tsv)

# Publish an event (the topic endpoint already ends in /api/events)
TOPIC_ENDPOINT=$(az eventgrid topic show --resource-group myapp-rg --name myapp-topic --query endpoint -o tsv)
TOPIC_KEY=$(az eventgrid topic key list --resource-group myapp-rg --name myapp-topic --query key1 -o tsv)
curl -X POST "$TOPIC_ENDPOINT" \
  -H "aeg-sas-key: $TOPIC_KEY" \
  -H "Content-Type: application/json" \
  -d '[{
    "id": "1",
    "eventType": "order.created",
    "subject": "orders/12345",
    "eventTime": "2024-01-15T10:00:00Z",
    "data": {"orderId": "12345", "amount": 99.99},
    "dataVersion": "1.0"
  }]'
```
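The JSON body above follows the Event Grid event schema. A small helper to build that envelope in Python (stdlib only; `make_event` is a hypothetical name, not part of an Azure SDK):

```python
import json
import uuid
from datetime import datetime, timezone

def make_event(event_type: str, subject: str, data: dict) -> dict:
    """Build one event envelope in the Event Grid event schema."""
    return {
        "id": str(uuid.uuid4()),      # must be unique per event
        "eventType": event_type,      # e.g. "order.created"
        "subject": subject,           # path-like hint for filtering
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "data": data,                 # your payload
        "dataVersion": "1.0",
    }

# Event Grid expects a JSON array, even for a single event.
body = json.dumps([make_event("order.created", "orders/12345",
                              {"orderId": "12345", "amount": 99.99})])
```

Subscribers can filter on `eventType` and `subject`, so choosing stable, path-like subjects pays off later.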
Event Grid vs Service Bus vs Event Hub:
| Service | Model | Ordering | Scale | Use Case |
|---|---|---|---|---|
| Event Grid | Push (reactive) | No | Moderate | Azure resource events, webhook routing |
| Service Bus | Pull (queue/topic) | Yes (sessions) | High | Enterprise messaging, transactions |
| Event Hub | Pull (stream) | Per partition | Very High (millions/sec) | IoT telemetry, log ingestion, streaming |
## Azure Event Hub
Event Hub is a big-data streaming platform and event ingestion service. Think Kafka for Azure.
```bash
# Create an Event Hub namespace
az eventhubs namespace create \
  --resource-group myapp-rg \
  --name myapp-eventhub-ns \
  --location eastus \
  --sku Standard

# Create an Event Hub
az eventhubs eventhub create \
  --resource-group myapp-rg \
  --namespace-name myapp-eventhub-ns \
  --name telemetry \
  --partition-count 4 \
  --message-retention 7
```
Sending events (Python):
```python
from azure.eventhub import EventHubProducerClient, EventData
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
producer = EventHubProducerClient(
    fully_qualified_namespace="myapp-eventhub-ns.servicebus.windows.net",
    eventhub_name="telemetry",
    credential=credential,
)

with producer:
    batch = producer.create_batch()
    batch.add(EventData('{"deviceId": "sensor-1", "temperature": 22.5}'))
    batch.add(EventData('{"deviceId": "sensor-2", "temperature": 21.0}'))
    producer.send_batch(batch)
```
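Batches exist because the hub enforces a maximum batch size, and `batch.add` raises once a batch is full. The same budgeting logic in stdlib Python — the helper name and the 1 MB default are illustrative assumptions, not SDK behavior to rely on:

```python
import json

def batch_events(bodies: list[str], max_bytes: int = 1_048_576) -> list[list[str]]:
    """Greedily pack serialized events into batches under a byte budget,
    mirroring the size check that create_batch/add enforce."""
    batches: list[list[str]] = []
    current: list[str] = []
    size = 0
    for body in bodies:
        body_size = len(body.encode("utf-8"))
        if body_size > max_bytes:
            raise ValueError("single event exceeds the batch size limit")
        if size + body_size > max_bytes and current:
            batches.append(current)  # current batch is full; start a new one
            current, size = [], 0
        current.append(body)
        size += body_size
    if current:
        batches.append(current)
    return batches

readings = [json.dumps({"deviceId": f"sensor-{i}", "temperature": 20.0 + i})
            for i in range(100)]
batches = batch_events(readings, max_bytes=500)
```

In real code, loop on `producer.create_batch()` and send each batch when `add` raises `ValueError`.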
## Azure Service Bus
Service Bus is an enterprise message broker supporting queues (point-to-point) and topics/subscriptions (publish/subscribe).
```bash
# Create a Service Bus namespace
az servicebus namespace create \
  --resource-group myapp-rg \
  --name myapp-servicebus \
  --location eastus \
  --sku Standard

# Create a queue
az servicebus queue create \
  --resource-group myapp-rg \
  --namespace-name myapp-servicebus \
  --name orders

# Create a topic
az servicebus topic create \
  --resource-group myapp-rg \
  --namespace-name myapp-servicebus \
  --name events

# Create a subscription on the topic
az servicebus topic subscription create \
  --resource-group myapp-rg \
  --namespace-name myapp-servicebus \
  --topic-name events \
  --name email-notifications
```
Key Service Bus features:
- Dead-letter queue for unprocessable messages
- Message sessions for ordered processing
- Scheduled delivery
- Peek-lock (prevents concurrent processing of the same message)
- Transactions across multiple queues/topics
- Duplicate detection
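The lock, retry, and dead-letter features above compose into one rule: a failed message is abandoned (and redelivered) until its delivery count hits the limit, then it is dead-lettered. A stdlib sketch of that receive-loop decision — `settle` is a hypothetical helper; Service Bus applies this logic automatically (10 is its default MaxDeliveryCount):

```python
def settle(process, message, delivery_count: int, max_delivery_count: int = 10) -> str:
    """Decide how to settle a message: complete on success, abandon for
    retry, dead-letter once the delivery count reaches the limit."""
    try:
        process(message)
        return "complete"
    except Exception:
        if delivery_count >= max_delivery_count:
            return "dead-letter"  # parked for inspection, no more retries
        return "abandon"          # lock released, message redelivered

# A handler that always fails: early attempts retry, the last dead-letters.
always_fails = lambda msg: 1 / 0
print(settle(always_fails, {"orderId": "12345"}, delivery_count=1))      # abandon
print(settle(always_fails, {"orderId": "12345"}, delivery_count=10))     # dead-letter
print(settle(lambda msg: None, {"orderId": "12345"}, delivery_count=1))  # complete
```

Monitoring the dead-letter queue matters: messages land there silently and count against the entity's quota.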
## Architecture Patterns

### Event-Driven Microservices
```
API Gateway (APIM)
        │
        ▼
Order Service (Function)
        │
        ├── Publishes to Service Bus topic "order.created"
        │        │
        │        ├── Inventory Service (Function): updates stock
        │        ├── Notification Service (Function): sends email
        │        └── Analytics Service (Function): logs to Event Hub
        │
        └── Writes order to Cosmos DB
```
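The core of the pattern: one publish, and every subscription receives its own copy of the event. An in-memory stand-in for the Service Bus topic in the diagram (class and handler names are illustrative):

```python
class Topic:
    """In-memory stand-in for a Service Bus topic: publishing delivers a
    copy of the event to every subscriber independently."""
    def __init__(self, name: str):
        self.name = name
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def publish(self, event: dict) -> list:
        return [handler(event) for handler in self._handlers]

order_created = Topic("order.created")
order_created.subscribe(lambda e: f"inventory: -1 stock for {e['orderId']}")
order_created.subscribe(lambda e: f"notify: email for order {e['orderId']}")
order_created.subscribe(lambda e: f"analytics: logged {e['orderId']}")

results = order_created.publish({"orderId": "12345", "amount": 99.99})
```

The key decoupling: the Order Service never knows which downstream services exist; adding a subscriber changes nothing upstream.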
### Fan-Out Processing
```
Blob uploaded to Storage
        │
        ▼
Event Grid (publishes "blob.uploaded")
        │
        ├── Thumbnail generator (Function)
        ├── Virus scanner (Function)
        └── Metadata extractor (Function)
```
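Because Event Grid invokes each subscribed Function independently, the three handlers run in parallel, not in sequence. The same shape in-process with a thread pool (the handler functions are illustrative stubs):

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative stubs for the three subscribed Functions.
def make_thumbnail(blob_name: str) -> str: return f"thumb/{blob_name}"
def scan_for_viruses(blob_name: str) -> str: return f"clean:{blob_name}"
def extract_metadata(blob_name: str) -> str: return f"meta:{blob_name}"

HANDLERS = [make_thumbnail, scan_for_viruses, extract_metadata]

def on_blob_uploaded(blob_name: str) -> list[str]:
    """Fan the event out to every handler concurrently and collect results."""
    with ThreadPoolExecutor(max_workers=len(HANDLERS)) as pool:
        futures = [pool.submit(handler, blob_name) for handler in HANDLERS]
        return [f.result() for f in futures]

print(on_blob_uploaded("images/cat.png"))
# ['thumb/images/cat.png', 'clean:images/cat.png', 'meta:images/cat.png']
```

In the real pattern each handler also fails independently; one slow virus scan never blocks thumbnail generation.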
## Next Steps
Continue to 09-devops.md to learn about Azure DevOps, GitHub Actions, and container image management.