Smart Alerts
| Field | Value |
|---|---|
| Document ID | ASCEND-NOTIF-003 |
| Version | 2026.04 |
| Last Updated | April 2026 |
| Author | Ascend Engineering Team |
| Publisher | OW-KAI Technologies Inc. |
| Classification | Enterprise Client Documentation |
| Compliance | SOC 2 CC6.1/CC6.2, PCI-DSS 7.1/8.3, HIPAA 164.312, NIST 800-53 AC-2/SI-4 |
Reading Time: 8 minutes | Skill Level: Intermediate
Overview
Smart Alerts provide real-time monitoring and alerting based on customizable rules. Alert on risk score thresholds, action patterns, system health, and anomalies with automatic escalation.
tip
Start with conservative thresholds and tighten them gradually as you learn your environment's baseline. Overly aggressive thresholds cause alert fatigue, while overly permissive thresholds miss critical events.
Alert Engine Architecture
+---------------------------------------------------------------------+
|                         SMART ALERT ENGINE                          |
+---------------------------------------------------------------------+

┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Metrics   │     │    Rules    │     │  Channels   │
│   Stream    │────▶│   Engine    │────▶│  Dispatch   │
└─────────────┘     └─────────────┘     └─────────────┘
       │                   │                   │
       │             ┌─────┴─────┐             │
       ▼             ▼           ▼             ▼
 ┌──────────┐  ┌──────────┐ ┌──────────┐ ┌──────────┐
 │  System  │  │Threshold │ │ Anomaly  │ │ Webhook  │
 │ Metrics  │  │  Check   │ │Detection │ │  Slack   │
 │ Actions  │  │ Pattern  │ │          │ │  Teams   │
 └──────────┘  └──────────┘ └──────────┘ └──────────┘
Creating Alert Rules
Threshold-Based Alerts
curl -X POST "https://pilot.owkai.app/api/smart-rules" \
  -H "Authorization: Bearer <admin_jwt>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "High Risk Action Alert",
    "is_active": true,
    "rule_definition": {
      "condition_type": "threshold",
      "metric": "risk_score",
      "operator": ">=",
      "threshold": 80,
      "severity": "high"
    }
  }'
Pattern-Based Alerts
curl -X POST "https://pilot.owkai.app/api/smart-rules" \
  -H "Authorization: Bearer <admin_jwt>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Repeated Denials Alert",
    "is_active": true,
    "rule_definition": {
      "condition_type": "pattern",
      "pattern": {
        "event": "action.denied",
        "count": 5,
        "window_minutes": 10
      },
      "severity": "critical"
    }
  }'
System Health Alerts
curl -X POST "https://pilot.owkai.app/api/smart-rules" \
  -H "Authorization: Bearer <admin_jwt>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "System CPU Alert",
    "is_active": true,
    "rule_definition": {
      "condition_type": "threshold",
      "metric": "system.cpu_percent",
      "operator": ">",
      "threshold": 90,
      "severity": "high"
    }
  }'
Severity Levels
| Severity | Color | Use Case | Default Response |
|---|---|---|---|
| critical | Red | Security incidents, system down | Immediate escalation |
| high | Orange | High-risk actions, failures | 5 minute response |
| medium | Yellow | Unusual patterns | 15 minute response |
| low | Blue | Informational | Log only |
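The default responses above can be encoded as a small client-side lookup. This is an illustrative sketch for consumers of alert payloads, not part of the Ascend API:

```python
# Illustrative mapping of severity to the default response policy from
# the table above; the dict itself is an assumption, not a library API.
SEVERITY_POLICY = {
    "critical": {"color": "red",    "response": "immediate escalation"},
    "high":     {"color": "orange", "response": "5 minute response"},
    "medium":   {"color": "yellow", "response": "15 minute response"},
    "low":      {"color": "blue",   "response": "log only"},
}

def default_response(severity: str) -> str:
    """Return the default response for a severity, falling back to 'log only'."""
    return SEVERITY_POLICY.get(severity, SEVERITY_POLICY["low"])["response"]
```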
Alert Lifecycle
1. Alert Triggered
When a rule condition is met:
{
  "id": "alert_42_1702656000",
  "rule_id": 42,
  "rule_name": "High Risk Action Alert",
  "severity": "high",
  "message": "High Risk Action Alert: risk_score is 85 (threshold: >= 80)",
  "triggered_at": "2025-12-15T10:00:00Z",
  "status": "active",
  "metrics_snapshot": {
    "risk_score": 85,
    "action_type": "data_export",
    "agent_id": "export-agent"
  }
}
2. Get Active Alerts
curl "https://pilot.owkai.app/api/alerts/active" \
  -H "Authorization: Bearer <jwt_token>"
Response:
{
  "alerts": [
    {
      "id": "alert_42_1702656000",
      "rule_name": "High Risk Action Alert",
      "severity": "high",
      "message": "risk_score is 85 (threshold: >= 80)",
      "triggered_at": "2025-12-15T10:00:00Z",
      "status": "active"
    }
  ],
  "statistics": {
    "total_active": 3,
    "by_severity": {
      "critical": 0,
      "high": 1,
      "medium": 2,
      "low": 0
    }
  }
}
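For client-side dashboards, the per-severity counts can also be recomputed from the alerts list itself. A minimal sketch (the server already returns the same numbers in the statistics field):

```python
from collections import Counter

def summarize_alerts(alerts):
    """Count active alerts by severity, mirroring the 'by_severity'
    block in the /api/alerts/active response."""
    counts = Counter(a["severity"] for a in alerts)
    return {sev: counts.get(sev, 0)
            for sev in ("critical", "high", "medium", "low")}
```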
3. Acknowledge Alert
curl -X POST "https://pilot.owkai.app/api/alerts/alert_42_1702656000/acknowledge" \
  -H "Authorization: Bearer <jwt_token>" \
  -H "X-CSRF-Token: <csrf_token>"
4. Resolve Alert
curl -X POST "https://pilot.owkai.app/api/alerts/alert_42_1702656000/resolve" \
  -H "Authorization: Bearer <jwt_token>" \
  -H "X-CSRF-Token: <csrf_token>"
5. Escalate Alert
curl -X POST "https://pilot.owkai.app/api/alerts/alert_42_1702656000/escalate" \
  -H "Authorization: Bearer <jwt_token>" \
  -H "X-CSRF-Token: <csrf_token>"
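The three lifecycle calls differ only in the final path segment, so a small client helper can wrap them. A standard-library sketch; the endpoint paths and headers come from the curl examples above, while the helper names are illustrative:

```python
import urllib.request

BASE_URL = "https://pilot.owkai.app"
LIFECYCLE_ACTIONS = ("acknowledge", "resolve", "escalate")

def lifecycle_url(alert_id: str, action: str) -> str:
    """Build the URL for one of the alert lifecycle endpoints."""
    if action not in LIFECYCLE_ACTIONS:
        raise ValueError(f"unknown lifecycle action: {action!r}")
    return f"{BASE_URL}/api/alerts/{alert_id}/{action}"

def alert_action(alert_id, action, jwt_token, csrf_token):
    """POST to a lifecycle endpoint with the required auth and CSRF headers."""
    req = urllib.request.Request(
        lifecycle_url(alert_id, action),
        method="POST",
        headers={"Authorization": f"Bearer {jwt_token}",
                 "X-CSRF-Token": csrf_token},
    )
    return urllib.request.urlopen(req)
```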
Real-Time Alert Stream
WebSocket Connection
const ws = new WebSocket('wss://pilot.owkai.app/api/alerts/stream');

ws.onmessage = (event) => {
  const data = JSON.parse(event.data);
  if (data.type === 'initial_alerts') {
    console.log('Current active alerts:', data.data);
  } else if (data.type === 'alerts') {
    console.log('New alerts:', data.data);
    data.data.forEach(alert => {
      showNotification(alert);
    });
  }
};
Python WebSocket Client
import asyncio
import json

import websockets

async def alert_listener():
    uri = "wss://pilot.owkai.app/api/alerts/stream"
    headers = {"Authorization": "Bearer <jwt_token>"}
    # Note: websockets 14+ renamed extra_headers to additional_headers
    async with websockets.connect(uri, extra_headers=headers) as ws:
        async for message in ws:
            data = json.loads(message)
            if data['type'] == 'alerts':
                for alert in data['data']:
                    handle_alert(alert)

def handle_alert(alert):
    if alert['severity'] == 'critical':
        send_pager_notification(alert)
    elif alert['severity'] == 'high':
        send_slack_notification(alert)

asyncio.run(alert_listener())
Alert History
Query Historical Alerts
curl "https://pilot.owkai.app/api/alerts/history?days=30" \
  -H "Authorization: Bearer <jwt_token>"
Response:
{
  "history": [
    {
      "id": "123",
      "rule_name": "High Risk Alert",
      "severity": "high",
      "triggered_at": "2025-12-14T15:30:00Z",
      "resolved_at": "2025-12-14T16:00:00Z",
      "status": "resolved"
    }
  ],
  "total_count": 45,
  "date_range": {
    "start": "2025-11-15T00:00:00Z",
    "end": "2025-12-15T23:59:59Z"
  }
}
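History entries carry both triggered_at and resolved_at timestamps, so mean time to resolution can be computed client-side. A sketch over the response shape above; the function name is illustrative:

```python
from datetime import datetime

def mean_resolution_minutes(history):
    """Average minutes from triggered_at to resolved_at across resolved
    entries in a /api/alerts/history response."""
    durations = []
    for entry in history:
        if entry.get("status") != "resolved" or not entry.get("resolved_at"):
            continue
        # fromisoformat() in older Pythons needs an explicit UTC offset
        t0 = datetime.fromisoformat(entry["triggered_at"].replace("Z", "+00:00"))
        t1 = datetime.fromisoformat(entry["resolved_at"].replace("Z", "+00:00"))
        durations.append((t1 - t0).total_seconds() / 60)
    return sum(durations) / len(durations) if durations else 0.0
```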
Condition Types
Threshold Conditions
{
  "condition_type": "threshold",
  "metric": "risk_score",
  "operator": ">=",
  "threshold": 80
}
Supported Operators:
- > - Greater than
- < - Less than
- >= - Greater than or equal
- <= - Less than or equal
- == - Equal to
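In Python, these operator strings map directly onto the standard library's comparison functions. A sketch of how a threshold condition could be evaluated; the production engine runs server-side:

```python
import operator

# Map rule_definition operator strings to Python comparison functions.
OPERATORS = {
    ">": operator.gt,
    "<": operator.lt,
    ">=": operator.ge,
    "<=": operator.le,
    "==": operator.eq,
}

def threshold_met(value, op: str, threshold) -> bool:
    """Return True when `value <op> threshold` holds."""
    return OPERATORS[op](value, threshold)
```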
Anomaly Detection
{
  "condition_type": "anomaly",
  "metric": "actions_per_minute",
  "sensitivity": 0.8,
  "historical_average": 50
}
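One plausible reading of this configuration: flag values whose relative deviation from the historical average exceeds (1 - sensitivity). This interpretation is an assumption for illustration; the engine's actual anomaly model is server-side and may differ:

```python
def is_anomalous(value, historical_average, sensitivity):
    """Flag a value whose relative deviation from the historical average
    exceeds (1 - sensitivity). With sensitivity 0.8 and average 50,
    deviations above 20% (values outside 40..60) are flagged.
    Assumed semantics, not the engine's documented algorithm."""
    if historical_average == 0:
        return value != 0
    deviation = abs(value - historical_average) / historical_average
    return deviation > (1 - sensitivity)
```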
Pattern Matching
{
  "condition_type": "pattern",
  "pattern": {
    "sequence": ["login_failed", "login_failed", "login_failed"],
    "window_minutes": 5
  }
}
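Conceptually, a sequence pattern matches when the listed event types occur in order inside one time window. An illustrative sliding-window sketch; the real matching runs inside the alert engine:

```python
from datetime import datetime, timedelta

def sequence_matched(events, sequence, window_minutes):
    """Return True if `sequence` of event types occurs in order within
    `window_minutes`. `events` is a time-ordered list of
    (timestamp, event_type) pairs."""
    window = timedelta(minutes=window_minutes)
    for i, (start_ts, event_type) in enumerate(events):
        if event_type != sequence[0]:
            continue
        if len(sequence) == 1:
            return True
        idx = 1
        for ts, later_type in events[i + 1:]:
            if ts - start_ts > window:
                break  # window expired for this starting event
            if later_type == sequence[idx]:
                idx += 1
                if idx == len(sequence):
                    return True
    return False
```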
Available Metrics
System Metrics
| Metric | Description |
|---|---|
| system.cpu_percent | CPU utilization |
| system.memory_percent | Memory utilization |
| system.disk_percent | Disk utilization |
Action Metrics
| Metric | Description |
|---|---|
| risk_score | Action risk score (0-100) |
| actions_per_minute | Action submission rate |
| denial_rate | Percentage of denied actions |
Agent Metrics
| Metric | Description |
|---|---|
| agent.error_rate | Agent error percentage |
| agent.response_time | Average response time |
| agent.active_count | Active agent count |
Integration with Notification Channels
Configure Alert Routing
curl -X POST "https://pilot.owkai.app/api/notifications/channels" \
  -H "Authorization: Bearer <admin_jwt>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Critical Alerts - Slack",
    "channel_type": "slack",
    "webhook_url": "https://hooks.slack.com/...",
    "subscribed_events": [
      "alert.critical",
      "alert.high"
    ],
    "min_risk_score": 80
  }'
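Routing reduces to a simple predicate over a channel's subscribed_events and min_risk_score. A sketch of that filter, using the field names from the request above:

```python
def should_route(channel, event, risk_score):
    """Decide whether an alert event should be delivered to a channel,
    based on its subscribed_events list and min_risk_score floor."""
    return (event in channel.get("subscribed_events", [])
            and risk_score >= channel.get("min_risk_score", 0))
```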
Alert Notification Flow
        Alert Triggered
               │
               ▼
      ┌─────────────────┐
      │ Check Severity  │
      └────────┬────────┘
               │
          ┌────┴────┐
          │         │
          ▼         ▼
      Critical High/Medium
          │         │
          ▼         ▼
      PagerDuty   Slack
          │         │
          ▼         ▼
       On-Call   Channel
Best Practices
1. Set Meaningful Thresholds
{
  "name": "Actionable High Risk Alert",
  "rule_definition": {
    "condition_type": "threshold",
    "metric": "risk_score",
    "operator": ">=",
    "threshold": 85,
    "cooldown_minutes": 5
  }
}
2. Use Severity Appropriately
- Critical: Security incidents only
- High: Requires immediate attention
- Medium: Review within the hour
- Low: Informational
3. Configure Cooldowns
Prevent alert fatigue with cooldown periods:
{
  "rule_definition": {
    "cooldown_minutes": 15,
    "max_alerts_per_hour": 10
  }
}
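The combined effect of cooldown_minutes and max_alerts_per_hour can be sketched as a client-side throttle. An illustrative model of the suppression behavior, not the engine's implementation:

```python
from datetime import datetime, timedelta

class AlertThrottle:
    """Suppress repeat alerts during the cooldown period and enforce a
    per-hour cap, modeling cooldown_minutes / max_alerts_per_hour."""

    def __init__(self, cooldown_minutes=15, max_alerts_per_hour=10):
        self.cooldown = timedelta(minutes=cooldown_minutes)
        self.max_per_hour = max_alerts_per_hour
        self.fired = []  # timestamps of alerts that were allowed through

    def allow(self, now: datetime) -> bool:
        hour_ago = now - timedelta(hours=1)
        recent = [t for t in self.fired if t > hour_ago]
        if recent and now - recent[-1] < self.cooldown:
            return False  # still within the cooldown window
        if len(recent) >= self.max_per_hour:
            return False  # hourly cap reached
        self.fired = recent + [now]
        return True
```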
4. Test Alerts
# Trigger a test alert
curl -X POST "https://pilot.owkai.app/api/alerts/test" \
  -H "Authorization: Bearer <admin_jwt>" \
  -H "Content-Type: application/json" \
  -d '{
    "rule_id": 42,
    "test_metrics": {"risk_score": 90}
  }'
Next Steps
- Webhooks - External notifications
- Slack/Teams - Chat integrations
- PagerDuty - On-call alerting
Document Version: 2026.04 | Last Updated: April 2026