Add initial work from Codex
.gitignore (vendored, 9 lines changed)
```diff
@@ -219,6 +219,12 @@ build/Release
 node_modules/
 jspm_packages/
 
+# Frontend build/typecheck artifacts
+frontend/dist/
+frontend/*.tsbuildinfo
+frontend/vite.config.js
+frontend/vite.config.d.ts
+
 # Snowpack dependency directory (https://snowpack.dev/)
 web_modules/
 
@@ -312,3 +318,6 @@ dist
 .yarn/install-state.gz
 .pnp.*
 
+# This project uses Kubernetes + Grafana Alloy instead of local docker compose collector.
+docker-compose.yml
+otel-collector-config.yaml
```
LICENSE (26 lines changed)
```diff
@@ -1,18 +1,14 @@
-MIT License
+Copyright (C) 2026 Domagoj Andrić
 
-Copyright (c) 2026 domagoj
+This program is free software: you can redistribute it and/or modify
+it under the terms of the GNU Affero General Public License as
+published by the Free Software Foundation, either version 3 of the
+License, or (at your option) any later version.
 
-Permission is hereby granted, free of charge, to any person obtaining a copy of this software and
-associated documentation files (the "Software"), to deal in the Software without restriction, including
-without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the
-following conditions:
+This program is distributed in the hope that it will be useful,
+but WITHOUT ANY WARRANTY; without even the implied warranty of
+MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+GNU Affero General Public License for more details.
 
-The above copyright notice and this permission notice shall be included in all copies or substantial
-portions of the Software.
 
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT
-LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO
-EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
-IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-USE OR OTHER DEALINGS IN THE SOFTWARE.
+You should have received a copy of the GNU Affero General Public License
+along with this program. If not, see <https://www.gnu.org/licenses/>.
```
README.md (144 lines changed)
@@ -1,2 +1,144 @@
-# zavrsni-rad-otel-app
+# OTel BI Forecast App

An OpenTelemetry-instrumented BI platform with microservices, frontend OIDC login plus backend token validation, read-only MSSQL data warehouse access, and PostgreSQL persistence for writable app data.

## Architecture

- Frontend: React + TypeScript (`frontend/`)
- Backend microservices (`backend/microservices/`):
  - `api_gateway`: public API, frontend JWT validation, internal token minting, routing/audit forwarding
  - `bi_query`: read-only MSSQL warehouse queries
  - `analytics`: forecasting, rankings, recommendations
  - `persistence`: PostgreSQL writes/reads for app data
- Data sources:
  - MSSQL (`WorldWideImporters`, `AdventureWorks2022DWH`), read-only access only
  - PostgreSQL writable app store (`audit_logs`, `forecast_runs`, `ranking_runs`, `recommendation_runs`)
- Observability: OTLP/HTTP to Grafana Alloy (`/v1/traces`, `/v1/metrics`)

## Authentication Model

- Frontend uses OIDC Authorization Code + PKCE.
- `api_gateway` validates the frontend bearer JWT (`iss`, `aud`, signature, expiry, optional scopes) against the configured JWKS.
- `api_gateway` mints short-lived internal service tokens (`x-internal-service-token`) for service-to-service calls.
- Internal services (`analytics`, `bi_query`, `persistence`) require a valid internal token on non-health endpoints and enforce issuer/type checks.
- Combine with K8s network controls (ClusterIP, NetworkPolicy, mTLS/service mesh where available).
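
The internal service tokens are HS256-signed JWTs; the full implementation lives in `backend/app/core/security.py` (included later in this commit). Purely to illustrate the shared-secret idea, here is a minimal stdlib-only sketch; the claim names mirror the README, but the wire format is simplified and is not the service's actual JWT encoding:

```python
import base64
import hashlib
import hmac
import json
import time

# Assumed secret for illustration; in the deployment this is INTERNAL_SERVICE_SHARED_SECRET.
SECRET = b"replace-with-strong-random-secret-min-32-bytes"

def mint(subject: str, ttl: int = 120) -> str:
    # Serialize the claims and append an HMAC-SHA256 signature over them.
    claims = {"sub": subject, "iss": "api-gateway", "exp": int(time.time()) + ttl}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify(token: str) -> dict:
    # Recompute the signature with the shared secret; reject tampering and expiry.
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(body))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims
```

The real tokens additionally carry `aud`, `typ`, `nbf`, and `jti` claims, pin an allowed-issuer list, and are minted and verified with PyJWT.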

Frontend OIDC env:
- `VITE_OIDC_ENABLED=true`
- `VITE_OIDC_AUTHORITY=<issuer-base-url>`
- `VITE_OIDC_CLIENT_ID=<frontend-client-id>`
- `VITE_OIDC_REDIRECT_URI=<frontend-url>`
- `VITE_OIDC_POST_LOGOUT_REDIRECT_URI=<frontend-url>`
- `VITE_OIDC_SCOPE=openid profile email`

Backend security env:
- `REQUIRE_FRONTEND_AUTH=true`
- `FRONTEND_JWT_ISSUER_URL=<oidc-issuer>`
- `FRONTEND_JWT_JWKS_URL=<issuer-jwks-url>`
- `FRONTEND_JWT_AUDIENCE=<api-audience>`
- `FRONTEND_REQUIRED_SCOPES=<space-separated>`
- `INTERNAL_SERVICE_SHARED_SECRET=<strong-random-secret-at-least-32-bytes>`
- `INTERNAL_SERVICE_ALLOWED_ISSUERS=api-gateway`
- `MSSQL_TRUST_SERVER_CERTIFICATE=false` and `POSTGRES_SSLMODE=require` for production TLS validation

## Local Run (Microservices)

```bash
cd backend
python -m venv .venv
source .venv/bin/activate
pip install -e .
cp .env.example .env
```

Run services in separate terminals:

```bash
uvicorn microservices.persistence.main:app --host 0.0.0.0 --port 8103 --reload
uvicorn microservices.bi_query.main:app --host 0.0.0.0 --port 8101 --reload
uvicorn microservices.analytics.main:app --host 0.0.0.0 --port 8102 --reload
uvicorn microservices.api_gateway.main:app --host 0.0.0.0 --port 8000 --reload
```

Frontend:

```bash
cd frontend
npm install
cp .env.example .env
npm run dev
```

In `frontend/.env`, set:
- `VITE_API_BASE_URL=http://localhost:8000`
- `VITE_OTEL_COLLECTOR_ENDPOINT=http://alloy.monitoring.svc.cluster.local:4318`

The frontend sends `Authorization: Bearer <token>` from the active OIDC session.

## API Endpoints (via Gateway)

- `GET /api/health`
- `GET /api/telemetry/status`
- `GET /api/kpis`
- `GET /api/history?days_back=365`
- `GET /api/forecasts?days=30`
- `GET /api/rankings?top_n=10`
- `GET /api/recommendations`
- `GET /api/dashboard`
- `GET /api/storage/audit-logs?limit=50`
- `GET /api/storage/forecasts?limit=50`
- `GET /api/storage/rankings?limit=50`
- `GET /api/storage/recommendations?limit=50`
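
Every route above expects the bearer token from the OIDC session. A client request can be assembled like this (a sketch only; the host and port follow the local-run defaults, and `TOKEN` is a placeholder for a real access token):

```python
from urllib.parse import urlencode
from urllib.request import Request

BASE = "http://localhost:8000"
TOKEN = "eyJ..."  # placeholder for the OIDC access token

def gateway_request(path: str, **params) -> Request:
    # Build e.g. GET /api/forecasts?days=30 with the Authorization header attached.
    url = f"{BASE}{path}"
    if params:
        url += "?" + urlencode(params)
    return Request(url, headers={"Authorization": f"Bearer {TOKEN}"})

req = gateway_request("/api/forecasts", days=30)
# req.full_url == "http://localhost:8000/api/forecasts?days=30"
```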

## K8s Deployment

Example manifest:
- `k8s/microservices.yaml`

It includes:
- namespace, config map, secret
- deployments/services for `api-gateway`, `bi-query`, `analytics`, `persistence`
- Alloy endpoint wiring via `OTEL_COLLECTOR_ENDPOINT`
- frontend JWT validation config and internal token secret wiring
- hardened pod security defaults (`runAsNonRoot`, dropped capabilities, `seccompProfile: RuntimeDefault`, no auto-mounted service account token)

## Read-Only Guarantee

- MSSQL connections enforce `ApplicationIntent=ReadOnly`.
- The warehouse query layer accepts only `SELECT`/`WITH` statements.
- Write operations are isolated to PostgreSQL.
- Use a SQL Server account with `SELECT` grants only.
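
The `SELECT`/`WITH` allow-listing can be sketched roughly as follows (an illustration of the idea, not the service's exact implementation):

```python
ALLOWED_PREFIXES = ("select", "with")

def assert_read_only(sql: str) -> str:
    # Reject anything that is not a plain SELECT or a WITH ... SELECT query.
    statement = sql.strip().rstrip(";").strip()
    if not statement.lower().startswith(ALLOWED_PREFIXES):
        raise ValueError(f"Only SELECT/WITH queries are allowed, got: {statement[:30]!r}")
    if ";" in statement:
        raise ValueError("Multiple statements are not allowed.")
    return statement

assert_read_only("SELECT TOP 10 * FROM Sales.Orders")  # accepted
# assert_read_only("DELETE FROM Sales.Orders")         # raises ValueError
```

Combined with `ApplicationIntent=ReadOnly` and a `SELECT`-only database account, this gives three independent layers of write protection.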

## OTel Coverage

- Frontend:
  - W3C trace/baggage propagation
  - document-load, user-interaction, fetch, and XHR instrumentation
  - manual dashboard spans
- Backend services:
  - FastAPI request spans
  - HTTP client spans for service-to-service calls
  - SQLAlchemy spans (MSSQL and PostgreSQL)
  - manual analytics + persistence spans
  - audit/snapshot persistence telemetry

## Verification

1. Call `GET /api/telemetry/status` with a valid frontend bearer token.
2. Confirm the response has non-null `trace_id` and `span_id`.
3. Trigger `GET /api/dashboard`; then verify records in `GET /api/storage/audit-logs`.
4. In Grafana/Tempo, confirm the trace path includes:
   - `api-gateway` span
   - `analytics` span
   - `bi-query` MSSQL spans
   - `persistence` PostgreSQL spans
5. Call an internal service endpoint directly without `x-internal-service-token` and verify it returns `401`.

## Optional Tests

```bash
cd backend
source .venv/bin/activate
pip install -e .[dev]
pytest
```
backend/.env.example (new file, 59 lines)
```
APP_NAME=otel-bi-backend
APP_ENV=dev
LOG_LEVEL=INFO
API_HOST=0.0.0.0
API_PORT=8000

CORS_ORIGINS=http://localhost:5173

MSSQL_HOST=localhost
MSSQL_PORT=1433
MSSQL_USERNAME=readonly_user
MSSQL_PASSWORD=readonly_password
MSSQL_DRIVER=ODBC Driver 18 for SQL Server
MSSQL_TRUST_SERVER_CERTIFICATE=false

WWI_DATABASE=WorldWideImporters
AW_DATABASE=AdventureWorks2022DWH
# Optional direct URLs (override generated URLs):
# WWI_CONNECTION_STRING=mssql+pyodbc://user:pass@host:1433/WorldWideImporters?driver=ODBC+Driver+18+for+SQL+Server&ApplicationIntent=ReadOnly
# AW_CONNECTION_STRING=mssql+pyodbc://user:pass@host:1433/AdventureWorks2022DWH?driver=ODBC+Driver+18+for+SQL+Server&ApplicationIntent=ReadOnly

POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DATABASE=otel_bi_app
POSTGRES_USERNAME=otel_bi_app
POSTGRES_PASSWORD=otel_bi_app
POSTGRES_SSLMODE=require
# Optional direct URL:
# POSTGRES_CONNECTION_STRING=postgresql+psycopg://otel_bi_app:otel_bi_app@localhost:5432/otel_bi_app?sslmode=prefer
POSTGRES_REQUIRED=true

QUERY_SERVICE_URL=http://localhost:8101
ANALYTICS_SERVICE_URL=http://localhost:8102
PERSISTENCE_SERVICE_URL=http://localhost:8103
REQUEST_TIMEOUT_SECONDS=20
REQUIRE_FRONTEND_AUTH=true
FRONTEND_JWT_ISSUER_URL=https://<your-idp-domain>/realms/<your-realm>
FRONTEND_JWT_AUDIENCE=otel-bi-api
FRONTEND_JWT_JWKS_URL=https://<your-idp-domain>/realms/<your-realm>/protocol/openid-connect/certs
FRONTEND_JWT_ALGORITHM=RS256
FRONTEND_REQUIRED_SCOPES=openid profile email
FRONTEND_CLOCK_SKEW_SECONDS=30
INTERNAL_SERVICE_AUTH_ENABLED=true
INTERNAL_SERVICE_SHARED_SECRET=replace-with-strong-random-secret-min-32-bytes
INTERNAL_SERVICE_TOKEN_TTL_SECONDS=120
INTERNAL_SERVICE_TOKEN_AUDIENCE=bi-internal
INTERNAL_SERVICE_ALLOWED_ISSUERS=api-gateway
INTERNAL_TOKEN_CLOCK_SKEW_SECONDS=15

OTEL_SERVICE_NAME=otel-bi-backend
OTEL_SERVICE_NAMESPACE=final-thesis
OTEL_COLLECTOR_ENDPOINT=http://localhost:4318
# K8s + Alloy example:
# OTEL_COLLECTOR_ENDPOINT=http://alloy.monitoring.svc.cluster.local:4318
OTEL_EXPORT_TIMEOUT_MS=10000

FORECAST_HORIZON_DAYS=30
DEFAULT_HISTORY_DAYS=365
RANKING_DEFAULT_TOP_N=10
```
backend/app/__init__.py (new file, 1 line)
```python
"""Backend application package."""
```
backend/app/core/config.py (new file, 135 lines)
```python
from __future__ import annotations

from functools import lru_cache
from urllib.parse import quote_plus

from pydantic import Field
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        extra="ignore",
    )

    app_name: str = "otel-bi-backend"
    app_env: str = "dev"
    log_level: str = "INFO"

    api_host: str = "0.0.0.0"
    api_port: int = 8000

    cors_origins: str = "http://localhost:5173"
    request_timeout_seconds: float = 20.0

    mssql_host: str = "localhost"
    mssql_port: int = 1433
    mssql_username: str = "sa"
    mssql_password: str = "Password!123"
    mssql_driver: str = "ODBC Driver 18 for SQL Server"
    mssql_trust_server_certificate: bool = False

    wwi_database: str = "WorldWideImporters"
    aw_database: str = "AdventureWorks2022DWH"
    wwi_connection_string: str | None = None
    aw_connection_string: str | None = None

    postgres_host: str = "localhost"
    postgres_port: int = 5432
    postgres_database: str = "otel_bi_app"
    postgres_username: str = "otel_bi_app"
    postgres_password: str = "otel_bi_app"
    postgres_sslmode: str = "require"
    postgres_connection_string: str | None = None
    postgres_required: bool = True

    query_service_url: str = "http://localhost:8101"
    analytics_service_url: str = "http://localhost:8102"
    persistence_service_url: str = "http://localhost:8103"

    require_frontend_auth: bool = True
    frontend_jwt_issuer_url: str = ""
    frontend_jwt_audience: str = ""
    frontend_jwt_jwks_url: str | None = None
    frontend_jwt_algorithm: str = "RS256"
    frontend_required_scopes: str = ""
    frontend_clock_skew_seconds: int = Field(default=30, ge=0, le=300)

    internal_service_auth_enabled: bool = True
    internal_service_shared_secret: str = "change-me"
    internal_service_token_ttl_seconds: int = Field(default=120, ge=30, le=900)
    internal_service_token_audience: str = "bi-internal"
    internal_service_allowed_issuers: str = "api-gateway"
    internal_token_clock_skew_seconds: int = Field(default=15, ge=0, le=120)

    otel_service_name: str = "otel-bi-backend"
    otel_service_namespace: str = "final-thesis"
    otel_collector_endpoint: str = "http://localhost:4318"
    otel_export_timeout_ms: int = 10000

    forecast_horizon_days: int = Field(default=30, ge=7, le=180)
    default_history_days: int = Field(default=365, ge=30, le=1460)
    ranking_default_top_n: int = Field(default=10, ge=3, le=100)
    storage_default_limit: int = Field(default=50, ge=10, le=500)

    @property
    def cors_origins_list(self) -> list[str]:
        return [
            origin.strip() for origin in self.cors_origins.split(",") if origin.strip()
        ]

    @property
    def frontend_required_scopes_list(self) -> list[str]:
        return [
            scope.strip()
            for scope in self.frontend_required_scopes.split(" ")
            if scope.strip()
        ]

    @property
    def internal_service_allowed_issuers_list(self) -> list[str]:
        return [
            issuer.strip()
            for issuer in self.internal_service_allowed_issuers.split(",")
            if issuer.strip()
        ]

    def _build_mssql_connection_url(self, database: str) -> str:
        driver = quote_plus(self.mssql_driver)
        user = quote_plus(self.mssql_username)
        password = quote_plus(self.mssql_password)
        trust_cert = "yes" if self.mssql_trust_server_certificate else "no"
        return (
            f"mssql+pyodbc://{user}:{password}@{self.mssql_host}:{self.mssql_port}/{database}"
            f"?driver={driver}&TrustServerCertificate={trust_cert}&ApplicationIntent=ReadOnly"
        )

    @property
    def wwi_connection_url(self) -> str:
        return self.wwi_connection_string or self._build_mssql_connection_url(
            self.wwi_database
        )

    @property
    def aw_connection_url(self) -> str:
        return self.aw_connection_string or self._build_mssql_connection_url(
            self.aw_database
        )

    @property
    def postgres_connection_url(self) -> str:
        if self.postgres_connection_string:
            return self.postgres_connection_string

        user = quote_plus(self.postgres_username)
        password = quote_plus(self.postgres_password)
        return (
            f"postgresql+psycopg://{user}:{password}@{self.postgres_host}:{self.postgres_port}/"
            f"{self.postgres_database}?sslmode={self.postgres_sslmode}"
        )


@lru_cache
def get_settings() -> Settings:
    return Settings()


settings = get_settings()
```
backend/app/core/otel.py (new file, 103 lines)
```python
from __future__ import annotations

import logging
from dataclasses import dataclass
from typing import Any

from fastapi import FastAPI
from opentelemetry import metrics, trace
from opentelemetry.baggage.propagation import W3CBaggagePropagator
from opentelemetry.exporter.otlp.proto.http.metric_exporter import OTLPMetricExporter
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry.instrumentation.httpx import HTTPXClientInstrumentor
from opentelemetry.instrumentation.logging import LoggingInstrumentor
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor
from opentelemetry.propagate import set_global_textmap
from opentelemetry.propagators.composite import CompositePropagator
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.trace.propagation.tracecontext import TraceContextTextMapPropagator

try:
    from opentelemetry.instrumentation.system_metrics import SystemMetricsInstrumentor
except ImportError:  # pragma: no cover - defensive fallback for minimal envs
    SystemMetricsInstrumentor = None  # type: ignore[assignment]

from app.core.config import Settings

LOGGER = logging.getLogger(__name__)


@dataclass
class TelemetryProviders:
    tracer_provider: TracerProvider
    meter_provider: MeterProvider


def configure_otel(settings: Settings) -> TelemetryProviders:
    set_global_textmap(
        CompositePropagator([TraceContextTextMapPropagator(), W3CBaggagePropagator()])
    )
    resource = Resource.create(
        {
            "service.name": settings.otel_service_name,
            "service.namespace": settings.otel_service_namespace,
            "deployment.environment": settings.app_env,
        }
    )

    trace_exporter = OTLPSpanExporter(
        endpoint=f"{settings.otel_collector_endpoint}/v1/traces",
        timeout=settings.otel_export_timeout_ms / 1000,
    )
    tracer_provider = TracerProvider(resource=resource)
    tracer_provider.add_span_processor(BatchSpanProcessor(trace_exporter))
    trace.set_tracer_provider(tracer_provider)

    metric_reader = PeriodicExportingMetricReader(
        exporter=OTLPMetricExporter(
            endpoint=f"{settings.otel_collector_endpoint}/v1/metrics",
            timeout=settings.otel_export_timeout_ms / 1000,
        ),
        export_interval_millis=10000,
    )
    meter_provider = MeterProvider(resource=resource, metric_readers=[metric_reader])
    metrics.set_meter_provider(meter_provider)

    LoggingInstrumentor().instrument(set_logging_format=True)
    if SystemMetricsInstrumentor is not None:
        SystemMetricsInstrumentor().instrument()
    else:
        LOGGER.warning(
            "System metrics instrumentor not available, runtime host metrics disabled."
        )
    LOGGER.info("OpenTelemetry providers configured")
    return TelemetryProviders(
        tracer_provider=tracer_provider, meter_provider=meter_provider
    )


def instrument_fastapi(app: FastAPI) -> None:
    FastAPIInstrumentor.instrument_app(app)


def instrument_sqlalchemy_engines(engines: dict[str, Any]) -> None:
    for engine in engines.values():
        SQLAlchemyInstrumentor().instrument(engine=engine)


def instrument_httpx_clients() -> None:
    HTTPXClientInstrumentor().instrument()


def shutdown_otel(providers: TelemetryProviders) -> None:
    HTTPXClientInstrumentor().uninstrument()
    if SystemMetricsInstrumentor is not None:
        SystemMetricsInstrumentor().uninstrument()
    LoggingInstrumentor().uninstrument()
    providers.meter_provider.shutdown()
    providers.tracer_provider.shutdown()
```
backend/app/core/security.py (new file, 231 lines)
```python
from __future__ import annotations

from dataclasses import dataclass
from functools import lru_cache
from time import time
from uuid import uuid4

import jwt
from fastapi import Depends, Header, HTTPException, status
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
from jwt import InvalidTokenError, PyJWKClient

from app.core.config import settings

BEARER_SCHEME = HTTPBearer(auto_error=False)


@dataclass
class FrontendPrincipal:
    subject: str
    scopes: list[str]
    claims: dict
    token: str


@dataclass
class InternalPrincipal:
    subject: str
    scopes: list[str]
    claims: dict
    token: str


class FrontendJWTVerifier:
    @property
    def jwks_url(self) -> str:
        if not settings.frontend_jwt_jwks_url:
            raise HTTPException(
                status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                detail="FRONTEND_JWT_JWKS_URL is not configured.",
            )
        return settings.frontend_jwt_jwks_url

    @lru_cache(maxsize=1)
    def _jwks_client(self) -> PyJWKClient:
        return PyJWKClient(self.jwks_url)

    @staticmethod
    def _extract_scopes(claims: dict) -> list[str]:
        scope = claims.get("scope")
        if isinstance(scope, str):
            return [item for item in scope.split(" ") if item]
        scp = claims.get("scp")
        if isinstance(scp, list):
            return [str(item) for item in scp]
        return []

    def verify(self, token: str) -> FrontendPrincipal:
        if not settings.frontend_jwt_issuer_url:
            raise HTTPException(
                status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                detail="FRONTEND_JWT_ISSUER_URL is not configured.",
            )
        if not settings.frontend_jwt_audience:
            raise HTTPException(
                status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                detail="FRONTEND_JWT_AUDIENCE is not configured.",
            )

        try:
            signing_key = self._jwks_client().get_signing_key_from_jwt(token).key
            claims = jwt.decode(
                token,
                key=signing_key,
                algorithms=[settings.frontend_jwt_algorithm],
                audience=settings.frontend_jwt_audience,
                issuer=settings.frontend_jwt_issuer_url,
                leeway=settings.frontend_clock_skew_seconds,
            )
        except InvalidTokenError as exc:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Invalid frontend access token.",
            ) from exc

        subject = str(claims.get("sub") or "")
        if not subject:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Frontend token missing subject.",
            )

        scopes = self._extract_scopes(claims)
        required = settings.frontend_required_scopes_list
        missing = [scope for scope in required if scope not in scopes]
        if missing:
            raise HTTPException(
                status_code=status.HTTP_403_FORBIDDEN,
                detail=f"Missing required scope(s): {', '.join(missing)}",
            )
        return FrontendPrincipal(
            subject=subject, scopes=scopes, claims=claims, token=token
        )


class InternalTokenManager:
    token_type = "internal-service"

    @staticmethod
    def _assert_secret() -> str:
        secret = settings.internal_service_shared_secret
        if not secret or secret == "change-me":
            raise HTTPException(
                status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                detail="INTERNAL_SERVICE_SHARED_SECRET must be configured.",
            )
        if len(secret.encode("utf-8")) < 32:
            raise HTTPException(
                status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                detail=(
                    "INTERNAL_SERVICE_SHARED_SECRET must be at least 32 bytes for "
                    "HS256 token signing."
                ),
            )
        return secret

    def mint(
        self,
        *,
        subject: str,
        scopes: list[str],
        source_service: str,
    ) -> str:
        now = int(time())
        payload = {
            "sub": subject,
            "scope": " ".join(scopes),
            "iss": source_service,
            "aud": settings.internal_service_token_audience,
            "typ": self.token_type,
            "iat": now,
            "nbf": now,
            "exp": now + settings.internal_service_token_ttl_seconds,
            "jti": str(uuid4()),
        }
        return jwt.encode(payload, self._assert_secret(), algorithm="HS256")

    def verify(self, token: str) -> InternalPrincipal:
        try:
            claims = jwt.decode(
                token,
                self._assert_secret(),
                algorithms=["HS256"],
                audience=settings.internal_service_token_audience,
                options={
                    "require": ["sub", "iss", "aud", "exp", "iat", "nbf", "jti", "typ"]
                },
                leeway=settings.internal_token_clock_skew_seconds,
            )
        except InvalidTokenError as exc:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Invalid internal service token.",
            ) from exc

        subject = str(claims.get("sub") or "")
        if not subject:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Internal token missing subject.",
            )

        issuer = str(claims.get("iss") or "")
        if issuer not in settings.internal_service_allowed_issuers_list:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Internal token issuer is not allowed.",
            )

        token_type = str(claims.get("typ") or "")
        if token_type != self.token_type:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Internal token type is invalid.",
            )

        scope = claims.get("scope")
        scopes = [item for item in str(scope).split(" ") if item] if scope else []
        return InternalPrincipal(
            subject=subject, scopes=scopes, claims=claims, token=token
        )


@lru_cache(maxsize=1)
def get_frontend_verifier() -> FrontendJWTVerifier:
    return FrontendJWTVerifier()


@lru_cache(maxsize=1)
def get_internal_token_manager() -> InternalTokenManager:
    return InternalTokenManager()


def require_frontend_principal(
    credentials: HTTPAuthorizationCredentials | None = Depends(BEARER_SCHEME),
) -> FrontendPrincipal:
    if not settings.require_frontend_auth:
        return FrontendPrincipal(subject="anonymous", scopes=[], claims={}, token="")

    if credentials is None or credentials.scheme.lower() != "bearer":
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Missing bearer token.",
        )
    return get_frontend_verifier().verify(credentials.credentials)


def require_internal_principal(
    internal_token: str | None = Header(default=None, alias="x-internal-service-token"),
) -> InternalPrincipal:
    if not settings.internal_service_auth_enabled:
        return InternalPrincipal(
            subject="internal-unauth", scopes=[], claims={}, token=""
        )

    if not internal_token:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Missing x-internal-service-token header.",
        )
    return get_internal_token_manager().verify(internal_token)
```
backend/app/db/__init__.py (new file, 1 line)
```python
"""Database helpers for warehouse connections."""
```
backend/app/db/engine.py (new file, 34 lines)
```python
from __future__ import annotations

from sqlalchemy import create_engine, event
from sqlalchemy.engine import Engine

from app.core.config import settings


def _create_read_only_engine(connection_url: str) -> Engine:
    engine = create_engine(
        connection_url, pool_pre_ping=True, pool_recycle=3600, future=True
    )

    @event.listens_for(engine, "connect")
    def _on_connect(dbapi_connection, _connection_record) -> None:
        cursor = dbapi_connection.cursor()
        try:
            cursor.execute("SET TRANSACTION ISOLATION LEVEL READ COMMITTED;")
        finally:
            cursor.close()

    return engine


def create_warehouse_engines() -> dict[str, Engine]:
    return {
        "wwi": _create_read_only_engine(settings.wwi_connection_url),
        "aw": _create_read_only_engine(settings.aw_connection_url),
    }


def dispose_engines(engines: dict[str, Engine]) -> None:
    for engine in engines.values():
        engine.dispose()
```
backend/app/db/postgres.py (new file, 27 lines)
```python
from __future__ import annotations

from sqlalchemy import create_engine
from sqlalchemy.engine import Engine
from sqlalchemy.orm import Session, sessionmaker

from app.core.config import settings
from app.db.postgres_models import Base


def create_postgres_engine() -> Engine:
    return create_engine(
        settings.postgres_connection_url,
        pool_pre_ping=True,
        pool_recycle=3600,
        future=True,
    )


def initialize_postgres_schema(engine: Engine) -> None:
    Base.metadata.create_all(bind=engine)


def create_postgres_session_factory(engine: Engine) -> sessionmaker[Session]:
    return sessionmaker(
        bind=engine, autoflush=False, autocommit=False, expire_on_commit=False
    )
```
backend/app/db/postgres_models.py (new file, 86 lines)
```python
from __future__ import annotations

from datetime import datetime, timezone
from uuid import uuid4

from sqlalchemy import JSON, DateTime, Float, Integer, String, Text
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


def _utcnow() -> datetime:
    return datetime.now(timezone.utc)


class Base(DeclarativeBase):
    pass


class AuditLog(Base):
    __tablename__ = "audit_logs"

    id: Mapped[str] = mapped_column(
        String(36), primary_key=True, default=lambda: str(uuid4())
    )
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), default=_utcnow, index=True
    )
    method: Mapped[str] = mapped_column(String(12), index=True)
    path: Mapped[str] = mapped_column(String(300), index=True)
    query_string: Mapped[str] = mapped_column(String(1000), default="")
    status_code: Mapped[int] = mapped_column(Integer, index=True)
    duration_ms: Mapped[float] = mapped_column(Float)
    trace_id: Mapped[str | None] = mapped_column(String(32), nullable=True, index=True)
    span_id: Mapped[str | None] = mapped_column(String(16), nullable=True, index=True)
    client_ip: Mapped[str | None] = mapped_column(String(120), nullable=True)
    user_agent: Mapped[str | None] = mapped_column(Text, nullable=True)
    details: Mapped[dict] = mapped_column(JSON, default=dict)


class ForecastRun(Base):
    __tablename__ = "forecast_runs"

    id: Mapped[str] = mapped_column(
        String(36), primary_key=True, default=lambda: str(uuid4())
    )
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), default=_utcnow, index=True
    )
    horizon_days: Mapped[int] = mapped_column(Integer)
    point_count: Mapped[int] = mapped_column(Integer)
    trigger_source: Mapped[str] = mapped_column(String(64), index=True)
    trace_id: Mapped[str | None] = mapped_column(String(32), nullable=True, index=True)
    span_id: Mapped[str | None] = mapped_column(String(16), nullable=True, index=True)
    payload: Mapped[list[dict]] = mapped_column(JSON, default=list)


class RankingRun(Base):
    __tablename__ = "ranking_runs"

    id: Mapped[str] = mapped_column(
        String(36), primary_key=True, default=lambda: str(uuid4())
    )
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), default=_utcnow, index=True
    )
    top_n: Mapped[int] = mapped_column(Integer)
    item_count: Mapped[int] = mapped_column(Integer)
    trigger_source: Mapped[str] = mapped_column(String(64), index=True)
    trace_id: Mapped[str | None] = mapped_column(String(32), nullable=True, index=True)
    span_id: Mapped[str | None] = mapped_column(String(16), nullable=True, index=True)
```
|
||||
payload: Mapped[list[dict]] = mapped_column(JSON, default=list)
|
||||
|
||||
|
||||
class RecommendationRun(Base):
|
||||
__tablename__ = "recommendation_runs"
|
||||
|
||||
id: Mapped[str] = mapped_column(
|
||||
String(36), primary_key=True, default=lambda: str(uuid4())
|
||||
)
|
||||
created_at: Mapped[datetime] = mapped_column(
|
||||
DateTime(timezone=True), default=_utcnow, index=True
|
||||
)
|
||||
item_count: Mapped[int] = mapped_column(Integer)
|
||||
trigger_source: Mapped[str] = mapped_column(String(64), index=True)
|
||||
trace_id: Mapped[str | None] = mapped_column(String(32), nullable=True, index=True)
|
||||
span_id: Mapped[str | None] = mapped_column(String(16), nullable=True, index=True)
|
||||
payload: Mapped[list[dict]] = mapped_column(JSON, default=list)
|
||||
167
backend/app/db/queries.py
Normal file
@@ -0,0 +1,167 @@
from __future__ import annotations

AW_DAILY_SALES_QUERIES = [
    """
    SELECT
        CAST(d.FullDateAlternateKey AS date) AS sale_date,
        SUM(f.SalesAmount) AS revenue,
        SUM(f.TotalProductCost) AS cost,
        SUM(f.OrderQuantity) AS quantity,
        COUNT_BIG(*) AS orders
    FROM dbo.FactInternetSales AS f
    INNER JOIN dbo.DimDate AS d ON d.DateKey = f.OrderDateKey
    GROUP BY CAST(d.FullDateAlternateKey AS date)
    ORDER BY sale_date;
    """,
    """
    SELECT
        CAST(OrderDate AS date) AS sale_date,
        SUM(SalesAmount) AS revenue,
        SUM(TotalProductCost) AS cost,
        SUM(OrderQuantity) AS quantity,
        COUNT_BIG(*) AS orders
    FROM dbo.FactInternetSales
    GROUP BY CAST(OrderDate AS date)
    ORDER BY sale_date;
    """,
]

WWI_DAILY_SALES_QUERIES = [
    """
    SELECT
        CAST(i.InvoiceDate AS date) AS sale_date,
        SUM(il.ExtendedPrice) AS revenue,
        SUM(il.TaxAmount) AS cost,
        SUM(il.Quantity) AS quantity,
        COUNT_BIG(DISTINCT i.InvoiceID) AS orders
    FROM Sales.Invoices AS i
    INNER JOIN Sales.InvoiceLines AS il ON il.InvoiceID = i.InvoiceID
    GROUP BY CAST(i.InvoiceDate AS date)
    ORDER BY sale_date;
    """,
    """
    SELECT
        CAST(i.InvoiceDate AS date) AS sale_date,
        SUM(il.UnitPrice * il.Quantity) AS revenue,
        CAST(0 AS float) AS cost,
        SUM(il.Quantity) AS quantity,
        COUNT_BIG(DISTINCT i.InvoiceID) AS orders
    FROM Sales.Invoices AS i
    INNER JOIN Sales.InvoiceLines AS il ON il.InvoiceID = i.InvoiceID
    GROUP BY CAST(i.InvoiceDate AS date)
    ORDER BY sale_date;
    """,
]

AW_PRODUCT_PERFORMANCE_QUERIES = [
    """
    SELECT
        p.ProductAlternateKey AS product_id,
        p.EnglishProductName AS product_name,
        COALESCE(sc.EnglishProductSubcategoryName, 'Unknown') AS category_name,
        SUM(f.SalesAmount) AS revenue,
        SUM(f.TotalProductCost) AS cost,
        SUM(f.OrderQuantity) AS quantity,
        COUNT_BIG(*) AS orders
    FROM dbo.FactInternetSales AS f
    INNER JOIN dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
    LEFT JOIN dbo.DimProductSubcategory AS sc ON sc.ProductSubcategoryKey = p.ProductSubcategoryKey
    GROUP BY p.ProductAlternateKey, p.EnglishProductName, sc.EnglishProductSubcategoryName
    ORDER BY revenue DESC;
    """,
    """
    SELECT
        CAST(ProductKey AS nvarchar(100)) AS product_id,
        CAST(ProductKey AS nvarchar(100)) AS product_name,
        'Unknown' AS category_name,
        SUM(SalesAmount) AS revenue,
        SUM(TotalProductCost) AS cost,
        SUM(OrderQuantity) AS quantity,
        COUNT_BIG(*) AS orders
    FROM dbo.FactInternetSales
    GROUP BY ProductKey
    ORDER BY revenue DESC;
    """,
]

WWI_PRODUCT_PERFORMANCE_QUERIES = [
    """
    SELECT
        CAST(s.StockItemID AS nvarchar(100)) AS product_id,
        s.StockItemName AS product_name,
        COALESCE(cg.StockGroupName, 'Unknown') AS category_name,
        SUM(il.ExtendedPrice) AS revenue,
        SUM(il.TaxAmount) AS cost,
        SUM(il.Quantity) AS quantity,
        COUNT_BIG(*) AS orders
    FROM Sales.InvoiceLines AS il
    INNER JOIN Warehouse.StockItems AS s ON s.StockItemID = il.StockItemID
    LEFT JOIN Warehouse.StockItemStockGroups AS sig ON sig.StockItemID = s.StockItemID
    LEFT JOIN Warehouse.StockGroups AS cg ON cg.StockGroupID = sig.StockGroupID
    GROUP BY s.StockItemID, s.StockItemName, cg.StockGroupName
    ORDER BY revenue DESC;
    """,
    """
    SELECT
        CAST(il.StockItemID AS nvarchar(100)) AS product_id,
        CAST(il.StockItemID AS nvarchar(100)) AS product_name,
        'Unknown' AS category_name,
        SUM(il.UnitPrice * il.Quantity) AS revenue,
        CAST(0 AS float) AS cost,
        SUM(il.Quantity) AS quantity,
        COUNT_BIG(*) AS orders
    FROM Sales.InvoiceLines AS il
    GROUP BY il.StockItemID
    ORDER BY revenue DESC;
    """,
]

AW_CUSTOMER_QUERIES = [
    """
    SELECT
        CAST(c.CustomerAlternateKey AS nvarchar(100)) AS customer_id,
        c.FirstName + ' ' + c.LastName AS customer_name,
        SUM(f.SalesAmount) AS revenue,
        COUNT_BIG(*) AS orders
    FROM dbo.FactInternetSales AS f
    INNER JOIN dbo.DimCustomer AS c ON c.CustomerKey = f.CustomerKey
    GROUP BY c.CustomerAlternateKey, c.FirstName, c.LastName
    ORDER BY revenue DESC;
    """,
    """
    SELECT
        CAST(CustomerKey AS nvarchar(100)) AS customer_id,
        CAST(CustomerKey AS nvarchar(100)) AS customer_name,
        SUM(SalesAmount) AS revenue,
        COUNT_BIG(*) AS orders
    FROM dbo.FactInternetSales
    GROUP BY CustomerKey
    ORDER BY revenue DESC;
    """,
]

WWI_CUSTOMER_QUERIES = [
    """
    SELECT
        CAST(c.CustomerID AS nvarchar(100)) AS customer_id,
        c.CustomerName AS customer_name,
        SUM(il.ExtendedPrice) AS revenue,
        COUNT_BIG(DISTINCT i.InvoiceID) AS orders
    FROM Sales.Invoices AS i
    INNER JOIN Sales.InvoiceLines AS il ON il.InvoiceID = i.InvoiceID
    INNER JOIN Sales.Customers AS c ON c.CustomerID = i.CustomerID
    GROUP BY c.CustomerID, c.CustomerName
    ORDER BY revenue DESC;
    """,
    """
    SELECT
        CAST(i.CustomerID AS nvarchar(100)) AS customer_id,
        CAST(i.CustomerID AS nvarchar(100)) AS customer_name,
        SUM(il.UnitPrice * il.Quantity) AS revenue,
        COUNT_BIG(DISTINCT i.InvoiceID) AS orders
    FROM Sales.Invoices AS i
    INNER JOIN Sales.InvoiceLines AS il ON il.InvoiceID = i.InvoiceID
    GROUP BY i.CustomerID
    ORDER BY revenue DESC;
    """,
]
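Each constant above is an ordered list of fallback candidates: the first query assumes the richer schema, and the second is a simpler variant for when those tables or columns are absent. A minimal sketch of that fallback pattern (the `run_candidates` helper and `fake_execute` executor are hypothetical, not part of this commit):

```python
from collections.abc import Callable, Sequence


def run_candidates(candidates: Sequence[str], execute: Callable[[str], list]) -> list:
    """Try each SQL candidate in order; return the first successful result."""
    last_error: Exception | None = None
    for sql in candidates:
        try:
            return execute(sql)
        except RuntimeError as exc:  # stand-in for a database error type
            last_error = exc
    if last_error is not None:
        raise RuntimeError("All query candidates failed.") from last_error
    return []


# Toy executor: the schema-specific candidate fails, the fallback succeeds.
def fake_execute(sql: str) -> list:
    if "DimDate" in sql:
        raise RuntimeError("missing table")
    return [{"sale_date": "2024-01-01", "revenue": 10.0}]


rows = run_candidates(
    ["SELECT ... FROM dbo.DimDate", "SELECT ... FROM dbo.FactInternetSales"],
    fake_execute,
)
print(rows[0]["revenue"])  # 10.0
```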
1
backend/app/services/__init__.py
Normal file
@@ -0,0 +1 @@
"""Business logic services."""
373
backend/app/services/analytics_service.py
Normal file
@@ -0,0 +1,373 @@
from __future__ import annotations

from dataclasses import dataclass
from datetime import date, timedelta
from math import sqrt

import numpy as np
import pandas as pd
from opentelemetry import trace
from sklearn.linear_model import LinearRegression

from app.core.config import settings
from app.services.persistence_service import PersistenceService
from app.services.warehouse_service import ReadOnlyWarehouseClient


@dataclass
class DashboardSnapshot:
    kpis: dict
    history: list[dict]
    forecasts: list[dict]
    rankings: list[dict]
    recommendations: list[dict]


class AnalyticsService:
    def __init__(
        self,
        warehouse_client: ReadOnlyWarehouseClient,
        persistence_service: PersistenceService | None = None,
    ) -> None:
        self.warehouse_client = warehouse_client
        self.persistence_service = persistence_service
        self.tracer = trace.get_tracer(__name__)

    @staticmethod
    def _normalize_frame(df: pd.DataFrame, date_col: str = "sale_date") -> pd.DataFrame:
        normalized = df.copy()
        normalized[date_col] = pd.to_datetime(normalized[date_col], errors="coerce")
        for numeric in ("revenue", "cost", "quantity", "orders"):
            if numeric in normalized.columns:
                normalized[numeric] = pd.to_numeric(
                    normalized[numeric], errors="coerce"
                ).fillna(0.0)
        return normalized.dropna(subset=[date_col])

    def load_sales_history(self, days_back: int | None = None) -> pd.DataFrame:
        with self.tracer.start_as_current_span("analytics.load_sales_history"):
            daily_sales = self._normalize_frame(
                self.warehouse_client.fetch_daily_sales()
            )
            days = days_back or settings.default_history_days
            min_date = pd.Timestamp(date.today() - timedelta(days=days))
            filtered = daily_sales[daily_sales["sale_date"] >= min_date]
            return (
                filtered.groupby("sale_date", as_index=False)[
                    ["revenue", "cost", "quantity", "orders"]
                ]
                .sum()
                .sort_values("sale_date")
            )

    def get_kpis(self) -> dict:
        with self.tracer.start_as_current_span("analytics.kpis"):
            sales = self.load_sales_history(days_back=180)
            if sales.empty:
                return {
                    "total_revenue": 0.0,
                    "gross_margin_pct": 0.0,
                    "total_quantity": 0.0,
                    "avg_order_value": 0.0,
                    "records_in_window": 0,
                }

            total_revenue = float(sales["revenue"].sum())
            total_cost = float(sales["cost"].sum())
            total_orders = max(float(sales["orders"].sum()), 1.0)
            margin_pct = (
                ((total_revenue - total_cost) / total_revenue * 100)
                if total_revenue
                else 0.0
            )
            return {
                "total_revenue": round(total_revenue, 2),
                "gross_margin_pct": round(margin_pct, 2),
                "total_quantity": round(float(sales["quantity"].sum()), 2),
                "avg_order_value": round(total_revenue / total_orders, 2),
                "records_in_window": int(sales.shape[0]),
            }

    def get_history_points(self, days_back: int | None = None) -> list[dict]:
        with self.tracer.start_as_current_span("analytics.history_points"):
            sales = self.load_sales_history(days_back=days_back)
            if sales.empty:
                return []
            return [
                {
                    "date": pd.Timestamp(row["sale_date"]).date().isoformat(),
                    "revenue": round(float(row["revenue"]), 2),
                    "cost": round(float(row["cost"]), 2),
                    "quantity": round(float(row["quantity"]), 2),
                }
                for _, row in sales.iterrows()
            ]

    def get_forecast(
        self,
        horizon_days: int | None = None,
        *,
        trigger_source: str = "api.forecasts",
        persist: bool = True,
    ) -> list[dict]:
        with self.tracer.start_as_current_span("analytics.forecast"):
            horizon = horizon_days or settings.forecast_horizon_days
            sales = self.load_sales_history(days_back=720)
            if sales.empty:
                return []

            series = (
                sales.set_index("sale_date")["revenue"]
                .sort_index()
                .resample("D")
                .sum()
                .fillna(0.0)
            )
            y = series.values
            x = np.arange(len(y), dtype=float).reshape(-1, 1)
            model = LinearRegression()
            model.fit(x, y)
            baseline = model.predict(x)
            residual = y - baseline
            sigma = float(np.std(residual)) if len(residual) > 1 else 0.0

            weekday_baseline = series.groupby(series.index.weekday).mean()
            overall_mean = float(series.mean()) if len(series) else 0.0
            weekday_factor = (
                weekday_baseline / overall_mean
                if overall_mean > 0
                else pd.Series([1.0] * 7, index=range(7))
            )
            weekday_factor = weekday_factor.replace([np.inf, -np.inf], 1.0).fillna(1.0)

            future_x = np.arange(len(y), len(y) + horizon, dtype=float).reshape(-1, 1)
            raw_forecast = model.predict(future_x)

            predictions: list[dict] = []
            start_date = series.index.max().date()
            for idx, point in enumerate(raw_forecast, start=1):
                day = start_date + timedelta(days=idx)
                factor = (
                    float(weekday_factor.loc[day.weekday()])
                    if day.weekday() in weekday_factor.index
                    else 1.0
                )
                yhat = max(float(point) * factor, 0.0)
                ci = 1.96 * sigma * sqrt(1 + idx / max(len(y), 1))
                predictions.append(
                    {
                        "date": day.isoformat(),
                        "predicted_revenue": round(yhat, 2),
                        "lower_bound": round(max(yhat - ci, 0.0), 2),
                        "upper_bound": round(yhat + ci, 2),
                    }
                )

            if persist and self.persistence_service is not None:
                span_context = trace.get_current_span().get_span_context()
                trace_id = (
                    f"{span_context.trace_id:032x}" if span_context.is_valid else None
                )
                span_id = (
                    f"{span_context.span_id:016x}" if span_context.is_valid else None
                )
                self.persistence_service.record_forecast_run(
                    horizon_days=horizon,
                    payload=predictions,
                    trigger_source=trigger_source,
                    trace_id=trace_id,
                    span_id=span_id,
                )

            return predictions

    def get_rankings(
        self,
        top_n: int | None = None,
        *,
        trigger_source: str = "api.rankings",
        persist: bool = True,
    ) -> list[dict]:
        with self.tracer.start_as_current_span("analytics.rankings"):
            n = top_n or settings.ranking_default_top_n
            products = self.warehouse_client.fetch_product_performance().copy()
            if products.empty:
                return []

            products["revenue"] = pd.to_numeric(
                products["revenue"], errors="coerce"
            ).fillna(0.0)
            products["cost"] = pd.to_numeric(products["cost"], errors="coerce").fillna(
                0.0
            )
            products["quantity"] = pd.to_numeric(
                products["quantity"], errors="coerce"
            ).fillna(0.0)
            products["orders"] = pd.to_numeric(
                products["orders"], errors="coerce"
            ).fillna(0.0)

            grouped = (
                products.groupby(
                    ["product_id", "product_name", "category_name"], as_index=False
                )[["revenue", "cost", "quantity", "orders"]]
                .sum()
                .sort_values("revenue", ascending=False)
            )

            grouped["margin_pct"] = np.where(
                grouped["revenue"] > 0,
                ((grouped["revenue"] - grouped["cost"]) / grouped["revenue"]) * 100,
                0.0,
            )

            revenue_norm = grouped["revenue"] / max(
                float(grouped["revenue"].max()), 1.0
            )
            margin_norm = (grouped["margin_pct"] + 100) / 200
            velocity_norm = grouped["quantity"] / max(
                float(grouped["quantity"].max()), 1.0
            )
            grouped["score"] = (
                (0.55 * revenue_norm)
                + (0.30 * margin_norm.clip(0, 1))
                + (0.15 * velocity_norm)
            )
            ranked = (
                grouped.sort_values("score", ascending=False)
                .head(n)
                .reset_index(drop=True)
            )

            result = [
                {
                    "rank": int(idx + 1),
                    "product_id": str(row["product_id"]),
                    "product_name": str(row["product_name"]),
                    "category": str(row["category_name"]),
                    "revenue": round(float(row["revenue"]), 2),
                    "margin_pct": round(float(row["margin_pct"]), 2),
                    "score": round(float(row["score"]) * 100, 2),
                }
                for idx, row in ranked.iterrows()
            ]

            if persist and self.persistence_service is not None:
                span_context = trace.get_current_span().get_span_context()
                trace_id = (
                    f"{span_context.trace_id:032x}" if span_context.is_valid else None
                )
                span_id = (
                    f"{span_context.span_id:016x}" if span_context.is_valid else None
                )
                self.persistence_service.record_ranking_run(
                    top_n=n,
                    payload=result,
                    trigger_source=trigger_source,
                    trace_id=trace_id,
                    span_id=span_id,
                )

            return result

    def get_recommendations(
        self,
        rankings: list[dict] | None = None,
        *,
        trigger_source: str = "api.recommendations",
        persist: bool = True,
    ) -> list[dict]:
        with self.tracer.start_as_current_span("analytics.recommendations"):
            ranking_rows = (
                rankings
                if rankings is not None
                else self.get_rankings(
                    top_n=20, trigger_source=trigger_source, persist=persist
                )
            )
            customers = self.warehouse_client.fetch_customer_performance().copy()
            if customers.empty:
                customers = pd.DataFrame(columns=["customer_name", "revenue", "orders"])

            recommendations: list[dict] = []

            if ranking_rows:
                champion = ranking_rows[0]
                recommendations.append(
                    {
                        "title": "Double down on champion SKU",
                        "priority": "high",
                        "summary": (
                            f"Promote '{champion['product_name']}' with score {champion['score']:.2f} "
                            f"and margin {champion['margin_pct']:.2f}%."
                        ),
                    }
                )

                low_margin = next(
                    (row for row in ranking_rows if row["margin_pct"] < 10), None
                )
                if low_margin:
                    recommendations.append(
                        {
                            "title": "Review pricing for low-margin bestseller",
                            "priority": "medium",
                            "summary": (
                                f"'{low_margin['product_name']}' has strong rank but only "
                                f"{low_margin['margin_pct']:.2f}% margin."
                            ),
                        }
                    )

            if not customers.empty:
                customers["revenue"] = pd.to_numeric(
                    customers["revenue"], errors="coerce"
                ).fillna(0.0)
                customers["orders"] = pd.to_numeric(
                    customers["orders"], errors="coerce"
                ).fillna(0.0)
                customer = customers.sort_values("revenue", ascending=False).iloc[0]
                recommendations.append(
                    {
                        "title": "Protect top customer relationship",
                        "priority": "high",
                        "summary": (
                            f"Prioritize retention for '{customer['customer_name']}' with "
                            f"{float(customer['orders']):.0f} orders and {float(customer['revenue']):.2f} revenue."
                        ),
                    }
                )

            result = recommendations[:5]
            if persist and self.persistence_service is not None:
                span_context = trace.get_current_span().get_span_context()
                trace_id = (
                    f"{span_context.trace_id:032x}" if span_context.is_valid else None
                )
                span_id = (
                    f"{span_context.span_id:016x}" if span_context.is_valid else None
                )
                self.persistence_service.record_recommendation_run(
                    payload=result,
                    trigger_source=trigger_source,
                    trace_id=trace_id,
                    span_id=span_id,
                )
            return result

    def get_dashboard(self) -> DashboardSnapshot:
        with self.tracer.start_as_current_span("analytics.dashboard"):
            rankings = self.get_rankings(trigger_source="api.dashboard", persist=True)
            return DashboardSnapshot(
                kpis=self.get_kpis(),
                history=self.get_history_points(),
                forecasts=self.get_forecast(
                    trigger_source="api.dashboard", persist=True
                ),
                rankings=rankings,
                recommendations=self.get_recommendations(
                    rankings=rankings,
                    trigger_source="api.dashboard",
                    persist=True,
                ),
            )
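The forecast in `AnalyticsService.get_forecast` combines a fitted linear trend with a per-weekday seasonality multiplier. A stripped-down sketch of that idea on synthetic data (the variable names and the synthetic series are illustrative, not from the service):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic daily revenue: mild upward trend plus a weekday/weekend pattern.
idx = pd.date_range("2024-01-01", periods=90, freq="D")
rng = np.random.default_rng(0)
series = pd.Series(
    100 + 0.5 * np.arange(90) + 20 * (idx.weekday < 5) + rng.normal(0, 5, 90),
    index=idx,
)

# Fit the trend on the day index, as the service does.
x = np.arange(len(series), dtype=float).reshape(-1, 1)
model = LinearRegression().fit(x, series.values)

# Per-weekday multiplier relative to the overall mean.
weekday_factor = series.groupby(series.index.weekday).mean() / series.mean()

# Project 7 days ahead, then reshape each point by its weekday factor.
future_x = np.arange(len(series), len(series) + 7, dtype=float).reshape(-1, 1)
trend = model.predict(future_x)
forecast = [
    max(
        float(t)
        * float(weekday_factor.loc[(idx[-1] + pd.Timedelta(days=i + 1)).weekday()]),
        0.0,
    )
    for i, t in enumerate(trend)
]
print(len(forecast))  # 7
```

The service adds a residual-based confidence band (`1.96 * sigma * sqrt(...)`) on top of this point forecast; the sketch omits it for brevity.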
281
backend/app/services/persistence_service.py
Normal file
@@ -0,0 +1,281 @@
from __future__ import annotations

import logging
from time import perf_counter

from opentelemetry import metrics, trace
from sqlalchemy import desc, select
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import Session, sessionmaker

from app.db.postgres_models import AuditLog, ForecastRun, RankingRun, RecommendationRun

LOGGER = logging.getLogger(__name__)


class PersistenceService:
    def __init__(self, session_factory: sessionmaker[Session]) -> None:
        self.session_factory = session_factory
        self.tracer = trace.get_tracer(__name__)
        self.meter = metrics.get_meter(__name__)
        self.write_counter = self.meter.create_counter(
            name="postgres_persist_writes_total",
            description="Total writes to app persistence PostgreSQL",
        )
        self.write_latency = self.meter.create_histogram(
            name="postgres_persist_write_latency_ms",
            unit="ms",
            description="Latency of app persistence write operations",
        )

    @staticmethod
    def _to_audit_dict(row: AuditLog) -> dict:
        return {
            "id": row.id,
            "created_at": row.created_at.isoformat(),
            "method": row.method,
            "path": row.path,
            "query_string": row.query_string,
            "status_code": row.status_code,
            "duration_ms": row.duration_ms,
            "trace_id": row.trace_id,
            "span_id": row.span_id,
            "client_ip": row.client_ip,
            "user_agent": row.user_agent,
            "details": row.details,
        }

    @staticmethod
    def _to_forecast_dict(row: ForecastRun) -> dict:
        return {
            "id": row.id,
            "created_at": row.created_at.isoformat(),
            "horizon_days": row.horizon_days,
            "point_count": row.point_count,
            "trigger_source": row.trigger_source,
            "trace_id": row.trace_id,
            "span_id": row.span_id,
            "payload": row.payload,
        }

    @staticmethod
    def _to_ranking_dict(row: RankingRun) -> dict:
        return {
            "id": row.id,
            "created_at": row.created_at.isoformat(),
            "top_n": row.top_n,
            "item_count": row.item_count,
            "trigger_source": row.trigger_source,
            "trace_id": row.trace_id,
            "span_id": row.span_id,
            "payload": row.payload,
        }

    @staticmethod
    def _to_recommendation_dict(row: RecommendationRun) -> dict:
        return {
            "id": row.id,
            "created_at": row.created_at.isoformat(),
            "item_count": row.item_count,
            "trigger_source": row.trigger_source,
            "trace_id": row.trace_id,
            "span_id": row.span_id,
            "payload": row.payload,
        }

    def record_audit_log(
        self,
        *,
        method: str,
        path: str,
        query_string: str,
        status_code: int,
        duration_ms: float,
        trace_id: str | None,
        span_id: str | None,
        client_ip: str | None,
        user_agent: str | None,
        details: dict | None = None,
    ) -> None:
        started = perf_counter()
        with self.tracer.start_as_current_span("persist.audit_log"):
            try:
                with self.session_factory() as session:
                    session.add(
                        AuditLog(
                            method=method,
                            path=path,
                            query_string=query_string[:1000],
                            status_code=status_code,
                            duration_ms=duration_ms,
                            trace_id=trace_id,
                            span_id=span_id,
                            client_ip=client_ip,
                            user_agent=user_agent,
                            details=details or {},
                        )
                    )
                    session.commit()
                self.write_counter.add(
                    1, attributes={"entity": "audit", "status": "ok"}
                )
            except SQLAlchemyError as exc:
                LOGGER.exception("Failed to persist audit log: %s", exc)
                self.write_counter.add(
                    1, attributes={"entity": "audit", "status": "error"}
                )
            finally:
                self.write_latency.record(
                    (perf_counter() - started) * 1000,
                    attributes={"entity": "audit"},
                )

    def record_forecast_run(
        self,
        *,
        horizon_days: int,
        payload: list[dict],
        trigger_source: str,
        trace_id: str | None,
        span_id: str | None,
    ) -> None:
        started = perf_counter()
        with self.tracer.start_as_current_span("persist.forecast_run"):
            try:
                with self.session_factory() as session:
                    session.add(
                        ForecastRun(
                            horizon_days=horizon_days,
                            point_count=len(payload),
                            trigger_source=trigger_source,
                            trace_id=trace_id,
                            span_id=span_id,
                            payload=payload,
                        )
                    )
                    session.commit()
                self.write_counter.add(
                    1, attributes={"entity": "forecast", "status": "ok"}
                )
            except SQLAlchemyError as exc:
                LOGGER.exception("Failed to persist forecast run: %s", exc)
                self.write_counter.add(
                    1, attributes={"entity": "forecast", "status": "error"}
                )
            finally:
                self.write_latency.record(
                    (perf_counter() - started) * 1000,
                    attributes={"entity": "forecast"},
                )

    def record_ranking_run(
        self,
        *,
        top_n: int,
        payload: list[dict],
        trigger_source: str,
        trace_id: str | None,
        span_id: str | None,
    ) -> None:
        started = perf_counter()
        with self.tracer.start_as_current_span("persist.ranking_run"):
            try:
                with self.session_factory() as session:
                    session.add(
                        RankingRun(
                            top_n=top_n,
                            item_count=len(payload),
                            trigger_source=trigger_source,
                            trace_id=trace_id,
                            span_id=span_id,
                            payload=payload,
                        )
                    )
                    session.commit()
                self.write_counter.add(
                    1, attributes={"entity": "ranking", "status": "ok"}
                )
            except SQLAlchemyError as exc:
                LOGGER.exception("Failed to persist ranking run: %s", exc)
                self.write_counter.add(
                    1, attributes={"entity": "ranking", "status": "error"}
                )
            finally:
                self.write_latency.record(
                    (perf_counter() - started) * 1000,
                    attributes={"entity": "ranking"},
                )

    def record_recommendation_run(
        self,
        *,
        payload: list[dict],
        trigger_source: str,
        trace_id: str | None,
        span_id: str | None,
    ) -> None:
        started = perf_counter()
        with self.tracer.start_as_current_span("persist.recommendation_run"):
            try:
                with self.session_factory() as session:
                    session.add(
                        RecommendationRun(
                            item_count=len(payload),
                            trigger_source=trigger_source,
                            trace_id=trace_id,
                            span_id=span_id,
                            payload=payload,
                        )
                    )
                    session.commit()
                self.write_counter.add(
                    1, attributes={"entity": "recommendation", "status": "ok"}
                )
            except SQLAlchemyError as exc:
                LOGGER.exception("Failed to persist recommendation run: %s", exc)
                self.write_counter.add(
                    1, attributes={"entity": "recommendation", "status": "error"}
                )
            finally:
                self.write_latency.record(
                    (perf_counter() - started) * 1000,
                    attributes={"entity": "recommendation"},
                )

    def list_audit_logs(self, limit: int) -> list[dict]:
        with self.tracer.start_as_current_span("persist.list_audit_logs"):
            with self.session_factory() as session:
                rows = session.execute(
                    select(AuditLog).order_by(desc(AuditLog.created_at)).limit(limit)
                ).scalars()
                return [self._to_audit_dict(row) for row in rows]

    def list_forecast_runs(self, limit: int) -> list[dict]:
        with self.tracer.start_as_current_span("persist.list_forecast_runs"):
            with self.session_factory() as session:
                rows = session.execute(
                    select(ForecastRun)
                    .order_by(desc(ForecastRun.created_at))
                    .limit(limit)
                ).scalars()
                return [self._to_forecast_dict(row) for row in rows]

    def list_ranking_runs(self, limit: int) -> list[dict]:
        with self.tracer.start_as_current_span("persist.list_ranking_runs"):
            with self.session_factory() as session:
                rows = session.execute(
                    select(RankingRun)
                    .order_by(desc(RankingRun.created_at))
                    .limit(limit)
                ).scalars()
                return [self._to_ranking_dict(row) for row in rows]

    def list_recommendation_runs(self, limit: int) -> list[dict]:
        with self.tracer.start_as_current_span("persist.list_recommendation_runs"):
            with self.session_factory() as session:
                rows = session.execute(
                    select(RecommendationRun)
                    .order_by(desc(RecommendationRun.created_at))
                    .limit(limit)
                ).scalars()
                return [self._to_recommendation_dict(row) for row in rows]
101
backend/app/services/warehouse_service.py
Normal file
101
backend/app/services/warehouse_service.py
Normal file
@@ -0,0 +1,101 @@
from __future__ import annotations

import hashlib
import logging
from collections.abc import Sequence
from time import perf_counter

import pandas as pd
from opentelemetry import metrics, trace
from sqlalchemy import text
from sqlalchemy.engine import Engine
from sqlalchemy.exc import SQLAlchemyError

from app.db import queries

LOGGER = logging.getLogger(__name__)


class ReadOnlyWarehouseClient:
    def __init__(self, engines: dict[str, Engine]) -> None:
        self.engines = engines
        self.tracer = trace.get_tracer(__name__)
        self.meter = metrics.get_meter(__name__)
        self.query_counter = self.meter.create_counter(
            name="warehouse_queries_total",
            description="Total warehouse query executions",
        )
        self.query_latency = self.meter.create_histogram(
            name="warehouse_query_latency_ms",
            unit="ms",
            description="Warehouse query latency",
        )

    def _validate_read_only_query(self, sql: str) -> None:
        normalized = sql.strip().lower()
        if not (normalized.startswith("select") or normalized.startswith("with")):
            raise ValueError("Only read-only SELECT/CTE SQL statements are allowed.")

    def _run_query_list(
        self, source: str, sql_candidates: Sequence[str]
    ) -> pd.DataFrame:
        engine = self.engines[source]
        last_error: Exception | None = None

        for candidate in sql_candidates:
            self._validate_read_only_query(candidate)
            query_hash = hashlib.sha256(candidate.encode("utf-8")).hexdigest()[:12]
            with self.tracer.start_as_current_span("warehouse.query") as span:
                span.set_attribute("db.system", "mssql")
                span.set_attribute("db.source", source)
                span.set_attribute("db.query.hash", query_hash)
                started = perf_counter()
                try:
                    with engine.connect() as conn:
                        with self.tracer.start_as_current_span(
                            "warehouse.query.execute"
                        ):
                            df = pd.read_sql_query(sql=text(candidate), con=conn)
                    elapsed_ms = (perf_counter() - started) * 1000
                    self.query_latency.record(elapsed_ms, attributes={"source": source})
                    self.query_counter.add(
                        1, attributes={"source": source, "status": "ok"}
                    )
                    return df
                except SQLAlchemyError as exc:
                    last_error = exc
                    elapsed_ms = (perf_counter() - started) * 1000
                    self.query_latency.record(elapsed_ms, attributes={"source": source})
                    self.query_counter.add(
                        1, attributes={"source": source, "status": "error"}
                    )
                    LOGGER.warning(
                        "Query failed for %s with hash %s: %s", source, query_hash, exc
                    )

        if last_error is not None:
            raise RuntimeError(
                f"All query candidates failed for source '{source}'."
            ) from last_error
        return pd.DataFrame()

    def fetch_daily_sales(self) -> pd.DataFrame:
        aw = self._run_query_list("aw", queries.AW_DAILY_SALES_QUERIES)
        aw["source"] = "AdventureWorks2022DWH"
        wwi = self._run_query_list("wwi", queries.WWI_DAILY_SALES_QUERIES)
        wwi["source"] = "WorldWideImporters"
        return pd.concat([aw, wwi], ignore_index=True)

    def fetch_product_performance(self) -> pd.DataFrame:
        aw = self._run_query_list("aw", queries.AW_PRODUCT_PERFORMANCE_QUERIES)
        aw["source"] = "AdventureWorks2022DWH"
        wwi = self._run_query_list("wwi", queries.WWI_PRODUCT_PERFORMANCE_QUERIES)
        wwi["source"] = "WorldWideImporters"
        return pd.concat([aw, wwi], ignore_index=True)

    def fetch_customer_performance(self) -> pd.DataFrame:
        aw = self._run_query_list("aw", queries.AW_CUSTOMER_QUERIES)
        aw["source"] = "AdventureWorks2022DWH"
        wwi = self._run_query_list("wwi", queries.WWI_CUSTOMER_QUERIES)
        wwi["source"] = "WorldWideImporters"
        return pd.concat([aw, wwi], ignore_index=True)
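The candidate-fallback loop in `_run_query_list` above can be sketched in isolation. This is a hypothetical standalone helper (`run_with_fallback`, plain `Exception` standing in for `SQLAlchemyError`, a callable standing in for the engine), not the service's actual code:

```python
from collections.abc import Callable, Sequence


def run_with_fallback(
    candidates: Sequence[str], execute: Callable[[str], list[dict]]
) -> list[dict]:
    """Try each read-only SQL candidate in order; return the first success."""
    last_error: Exception | None = None
    for candidate in candidates:
        normalized = candidate.strip().lower()
        # Same guard as _validate_read_only_query: SELECT or CTE only.
        if not (normalized.startswith("select") or normalized.startswith("with")):
            raise ValueError("Only read-only SELECT/CTE SQL statements are allowed.")
        try:
            return execute(candidate)
        except Exception as exc:  # the service narrows this to SQLAlchemyError
            last_error = exc
    if last_error is not None:
        raise RuntimeError("All query candidates failed.") from last_error
    return []
```

Keeping only the last error (raised via `from last_error`) preserves the root cause in the traceback while earlier failures are logged and skipped.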
1
backend/microservices/__init__.py
Normal file
@@ -0,0 +1 @@
"""Microservices package for BI platform."""
1
backend/microservices/analytics/__init__.py
Normal file
@@ -0,0 +1 @@
"""Analytics and forecasting microservice."""
260
backend/microservices/analytics/main.py
Normal file
@@ -0,0 +1,260 @@
from __future__ import annotations

import logging
from contextlib import asynccontextmanager
from contextvars import ContextVar

import httpx
import pandas as pd
from fastapi import Depends, FastAPI, Query, Request, Response

from app.core.config import settings
from app.core.otel import (
    TelemetryProviders,
    configure_otel,
    instrument_fastapi,
    instrument_httpx_clients,
    shutdown_otel,
)
from app.core.security import InternalPrincipal, require_internal_principal
from app.services.analytics_service import AnalyticsService
from microservices.common.http import current_trace_headers, with_internal_service_token

logging.basicConfig(level=settings.log_level)
LOGGER = logging.getLogger(__name__)

FORWARD_HEADERS: ContextVar[dict[str, str]] = ContextVar("forward_headers", default={})


class QueryWarehouseClient:
    def __init__(self, client: httpx.Client, query_service_url: str) -> None:
        self.client = client
        self.query_service_url = query_service_url.rstrip("/")

    def _fetch(self, path: str) -> pd.DataFrame:
        response = self.client.get(
            f"{self.query_service_url}{path}",
            headers=FORWARD_HEADERS.get(),
            timeout=settings.request_timeout_seconds,
        )
        response.raise_for_status()
        return pd.DataFrame(response.json())

    def fetch_daily_sales(self) -> pd.DataFrame:
        return self._fetch("/internal/daily-sales")

    def fetch_product_performance(self) -> pd.DataFrame:
        return self._fetch("/internal/product-performance")

    def fetch_customer_performance(self) -> pd.DataFrame:
        return self._fetch("/internal/customer-performance")


class PersistenceProxy:
    def __init__(self, client: httpx.Client, persistence_service_url: str) -> None:
        self.client = client
        self.persistence_service_url = persistence_service_url.rstrip("/")

    def _post(self, path: str, payload: dict) -> None:
        response = self.client.post(
            f"{self.persistence_service_url}{path}",
            headers=FORWARD_HEADERS.get(),
            json=payload,
            timeout=settings.request_timeout_seconds,
        )
        response.raise_for_status()

    def record_forecast_run(
        self,
        *,
        horizon_days: int,
        payload: list[dict],
        trigger_source: str,
        trace_id: str | None,
        span_id: str | None,
    ) -> None:
        self._post(
            "/internal/forecast-runs",
            {
                "horizon_days": horizon_days,
                "payload": payload,
                "trigger_source": trigger_source,
                "trace_id": trace_id,
                "span_id": span_id,
            },
        )

    def record_ranking_run(
        self,
        *,
        top_n: int,
        payload: list[dict],
        trigger_source: str,
        trace_id: str | None,
        span_id: str | None,
    ) -> None:
        self._post(
            "/internal/ranking-runs",
            {
                "top_n": top_n,
                "payload": payload,
                "trigger_source": trigger_source,
                "trace_id": trace_id,
                "span_id": span_id,
            },
        )

    def record_recommendation_run(
        self,
        *,
        payload: list[dict],
        trigger_source: str,
        trace_id: str | None,
        span_id: str | None,
    ) -> None:
        self._post(
            "/internal/recommendation-runs",
            {
                "payload": payload,
                "trigger_source": trigger_source,
                "trace_id": trace_id,
                "span_id": span_id,
            },
        )


@asynccontextmanager
async def lifespan(app: FastAPI):
    telemetry: TelemetryProviders = configure_otel(settings)
    instrument_httpx_clients()

    http_client = httpx.Client()
    warehouse_client = QueryWarehouseClient(http_client, settings.query_service_url)
    persistence_proxy = PersistenceProxy(http_client, settings.persistence_service_url)
    app.state.http_client = http_client
    app.state.analytics = AnalyticsService(warehouse_client, persistence_proxy)
    LOGGER.info("Analytics service ready")
    yield
    http_client.close()
    shutdown_otel(telemetry)


app = FastAPI(title="analytics-service", version="0.1.0", lifespan=lifespan)
instrument_fastapi(app)


def _analytics() -> AnalyticsService:
    return app.state.analytics


def _with_request_headers(request: Request):
    headers = current_trace_headers()
    incoming_internal = request.headers.get("x-internal-service-token")
    if incoming_internal:
        headers = with_internal_service_token(headers, incoming_internal)
    token = FORWARD_HEADERS.set(headers)
    return token


@app.get("/internal/health")
def health(request: Request, response: Response) -> dict:
    token = _with_request_headers(request)
    try:
        response.headers.update(current_trace_headers())
        return {"status": "ok", "service": "analytics-service"}
    finally:
        FORWARD_HEADERS.reset(token)


@app.get("/internal/kpis")
def kpis(
    request: Request,
    response: Response,
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> dict:
    token = _with_request_headers(request)
    try:
        response.headers.update(current_trace_headers())
        return _analytics().get_kpis()
    finally:
        FORWARD_HEADERS.reset(token)


@app.get("/internal/history")
def history(
    request: Request,
    response: Response,
    days_back: int = Query(default=settings.default_history_days, ge=30, le=1460),
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> list[dict]:
    token = _with_request_headers(request)
    try:
        response.headers.update(current_trace_headers())
        return _analytics().get_history_points(days_back=days_back)
    finally:
        FORWARD_HEADERS.reset(token)


@app.get("/internal/forecasts")
def forecasts(
    request: Request,
    response: Response,
    days: int = Query(default=settings.forecast_horizon_days, ge=7, le=180),
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> list[dict]:
    token = _with_request_headers(request)
    try:
        response.headers.update(current_trace_headers())
        return _analytics().get_forecast(
            horizon_days=days, trigger_source="analytics.api.forecasts", persist=True
        )
    finally:
        FORWARD_HEADERS.reset(token)


@app.get("/internal/rankings")
def rankings(
    request: Request,
    response: Response,
    top_n: int = Query(default=settings.ranking_default_top_n, ge=3, le=100),
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> list[dict]:
    token = _with_request_headers(request)
    try:
        response.headers.update(current_trace_headers())
        return _analytics().get_rankings(
            top_n=top_n, trigger_source="analytics.api.rankings", persist=True
        )
    finally:
        FORWARD_HEADERS.reset(token)


@app.get("/internal/recommendations")
def recommendations(
    request: Request,
    response: Response,
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> list[dict]:
    token = _with_request_headers(request)
    try:
        response.headers.update(current_trace_headers())
        return _analytics().get_recommendations(
            trigger_source="analytics.api.recommendations", persist=True
        )
    finally:
        FORWARD_HEADERS.reset(token)


@app.get("/internal/dashboard")
def dashboard(
    request: Request,
    response: Response,
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> dict:
    token = _with_request_headers(request)
    try:
        response.headers.update(current_trace_headers())
        snapshot = _analytics().get_dashboard()
        return snapshot.__dict__
    finally:
        FORWARD_HEADERS.reset(token)
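Every analytics endpoint above wraps its body in the same set/try/finally/reset dance around `FORWARD_HEADERS`. A minimal standalone sketch of that `ContextVar` pattern (hypothetical `FORWARD`/`handle`/`downstream_call` names, no FastAPI):

```python
from contextvars import ContextVar

FORWARD: ContextVar[dict[str, str]] = ContextVar("forward", default={})


def downstream_call() -> dict[str, str]:
    # Stand-in for QueryWarehouseClient._fetch reading FORWARD_HEADERS.get():
    # the callee sees whatever headers the current request bound.
    return dict(FORWARD.get())


def handle(headers: dict[str, str]) -> dict[str, str]:
    # Bind per-request headers; the token lets us restore the previous
    # value even if the body raises, mirroring FORWARD_HEADERS.reset(token).
    token = FORWARD.set(headers)
    try:
        return downstream_call()
    finally:
        FORWARD.reset(token)
```

Using a `ContextVar` rather than a global keeps concurrent requests isolated, since each task sees its own binding.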
1
backend/microservices/api_gateway/__init__.py
Normal file
@@ -0,0 +1 @@
"""Public API gateway microservice."""
326
backend/microservices/api_gateway/main.py
Normal file
@@ -0,0 +1,326 @@
from __future__ import annotations

import logging
from contextlib import asynccontextmanager
from time import perf_counter

import httpx
from fastapi import Depends, FastAPI, HTTPException, Query, Request, Response
from fastapi.middleware.cors import CORSMiddleware

from app.core.config import settings
from app.core.otel import (
    TelemetryProviders,
    configure_otel,
    instrument_fastapi,
    instrument_httpx_clients,
    shutdown_otel,
)
from app.core.security import (
    FrontendPrincipal,
    get_internal_token_manager,
    require_frontend_principal,
)
from microservices.common.http import current_trace_headers, with_internal_service_token

logging.basicConfig(level=settings.log_level)
LOGGER = logging.getLogger(__name__)


def _raise_upstream(exc: httpx.HTTPStatusError) -> None:
    detail = exc.response.text
    raise HTTPException(status_code=exc.response.status_code, detail=detail) from exc


@asynccontextmanager
async def lifespan(app: FastAPI):
    telemetry: TelemetryProviders = configure_otel(settings)
    instrument_httpx_clients()
    app.state.http_client = httpx.Client()
    LOGGER.info("API gateway ready")
    yield
    app.state.http_client.close()
    shutdown_otel(telemetry)


app = FastAPI(title="api-gateway-service", version="0.1.0", lifespan=lifespan)
instrument_fastapi(app)
app.add_middleware(
    CORSMiddleware,
    allow_origins=settings.cors_origins_list,
    allow_credentials=True,
    allow_methods=["GET", "POST"],
    allow_headers=["*"],
    expose_headers=["x-trace-id", "x-span-id"],
)


@app.middleware("http")
async def security_headers(request: Request, call_next):
    response = await call_next(request)
    response.headers["X-Content-Type-Options"] = "nosniff"
    response.headers["X-Frame-Options"] = "DENY"
    response.headers["Referrer-Policy"] = "no-referrer"
    response.headers["Permissions-Policy"] = "camera=(), microphone=(), geolocation=()"
    response.headers["X-Permitted-Cross-Domain-Policies"] = "none"
    response.headers["Strict-Transport-Security"] = (
        "max-age=31536000; includeSubDomains"
    )
    response.headers["Cache-Control"] = "no-store"
    response.headers["Pragma"] = "no-cache"
    return response


def _client() -> httpx.Client:
    return app.state.http_client


def _upstream_headers(principal: FrontendPrincipal) -> dict[str, str]:
    token = get_internal_token_manager().mint(
        subject=principal.subject,
        scopes=principal.scopes,
        source_service="api-gateway",
    )
    return with_internal_service_token(current_trace_headers(), token)


def _get_json(url: str, principal: FrontendPrincipal) -> dict | list:
    try:
        response = _client().get(
            url,
            headers=_upstream_headers(principal),
            timeout=settings.request_timeout_seconds,
        )
        response.raise_for_status()
        return response.json()
    except httpx.HTTPStatusError as exc:
        _raise_upstream(exc)


def _audit_payload(
    request: Request, response: Response, started: float, principal: FrontendPrincipal
) -> dict:
    headers = current_trace_headers()
    return {
        "method": request.method,
        "path": request.url.path,
        "query_string": request.url.query,
        "status_code": response.status_code,
        "duration_ms": (perf_counter() - started) * 1000,
        "trace_id": headers.get("x-trace-id"),
        "span_id": headers.get("x-span-id"),
        "client_ip": request.client.host if request.client else None,
        "user_agent": request.headers.get("user-agent"),
        "details": {
            "subject": principal.subject,
            "scopes": principal.scopes,
        },
    }


def _persist_audit(
    request: Request, response: Response, started: float, principal: FrontendPrincipal
) -> None:
    if not request.url.path.startswith("/api/"):
        return
    try:
        _client().post(
            f"{settings.persistence_service_url.rstrip('/')}/internal/audit-logs",
            headers=_upstream_headers(principal),
            json=_audit_payload(request, response, started, principal),
            timeout=settings.request_timeout_seconds,
        ).raise_for_status()
    except httpx.HTTPError as exc:
        LOGGER.warning("Audit persistence failed: %s", exc)


@app.get("/api/health")
def health(response: Response) -> dict:
    response.headers.update(current_trace_headers())
    return {"status": "ok", "service": "api-gateway-service"}


@app.get("/api/telemetry/status")
def telemetry_status(
    request: Request,
    response: Response,
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> dict:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = {
        "status": "instrumented",
        "service_name": "api-gateway-service",
        "collector_endpoint": settings.otel_collector_endpoint,
        "trace_id": current_trace_headers().get("x-trace-id"),
        "span_id": current_trace_headers().get("x-span-id"),
        "trace_headers": ["traceparent", "tracestate", "baggage", "x-trace-id"],
        "subject": principal.subject,
    }
    _persist_audit(request, response, started, principal)
    return payload


@app.get("/api/kpis")
def kpis(
    request: Request,
    response: Response,
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> dict:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.analytics_service_url.rstrip('/')}/internal/kpis", principal
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/history")
def history(
    request: Request,
    response: Response,
    days_back: int = Query(default=settings.default_history_days, ge=30, le=1460),
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> list[dict]:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.analytics_service_url.rstrip('/')}/internal/history?days_back={days_back}",
        principal,
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/forecasts")
def forecasts(
    request: Request,
    response: Response,
    days: int = Query(default=settings.forecast_horizon_days, ge=7, le=180),
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> list[dict]:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.analytics_service_url.rstrip('/')}/internal/forecasts?days={days}",
        principal,
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/rankings")
def rankings(
    request: Request,
    response: Response,
    top_n: int = Query(default=settings.ranking_default_top_n, ge=3, le=100),
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> list[dict]:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.analytics_service_url.rstrip('/')}/internal/rankings?top_n={top_n}",
        principal,
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/recommendations")
def recommendations(
    request: Request,
    response: Response,
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> list[dict]:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.analytics_service_url.rstrip('/')}/internal/recommendations",
        principal,
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/dashboard")
def dashboard(
    request: Request,
    response: Response,
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> dict:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.analytics_service_url.rstrip('/')}/internal/dashboard", principal
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/storage/audit-logs")
def storage_audit_logs(
    request: Request,
    response: Response,
    limit: int = Query(default=settings.storage_default_limit, ge=1, le=500),
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> list[dict]:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.persistence_service_url.rstrip('/')}/internal/audit-logs?limit={limit}",
        principal,
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/storage/forecasts")
def storage_forecasts(
    request: Request,
    response: Response,
    limit: int = Query(default=settings.storage_default_limit, ge=1, le=500),
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> list[dict]:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.persistence_service_url.rstrip('/')}/internal/forecast-runs?limit={limit}",
        principal,
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/storage/rankings")
def storage_rankings(
    request: Request,
    response: Response,
    limit: int = Query(default=settings.storage_default_limit, ge=1, le=500),
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> list[dict]:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.persistence_service_url.rstrip('/')}/internal/ranking-runs?limit={limit}",
        principal,
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]


@app.get("/api/storage/recommendations")
def storage_recommendations(
    request: Request,
    response: Response,
    limit: int = Query(default=settings.storage_default_limit, ge=1, le=500),
    principal: FrontendPrincipal = Depends(require_frontend_principal),
) -> list[dict]:
    started = perf_counter()
    response.headers.update(current_trace_headers())
    payload = _get_json(
        f"{settings.persistence_service_url.rstrip('/')}/internal/recommendation-runs?limit={limit}",
        principal,
    )
    _persist_audit(request, response, started, principal)
    return payload  # type: ignore[return-value]
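The gateway's `_raise_upstream`/`_get_json` pair maps an upstream HTTP failure onto the gateway's own response while preserving the upstream status code. A framework-free sketch of that propagation pattern (hypothetical `UpstreamError` standing in for FastAPI's `HTTPException`, a `(status, body)` callable standing in for the httpx call):

```python
class UpstreamError(Exception):
    """Carries the upstream status so the gateway can mirror it, like HTTPException."""

    def __init__(self, status_code: int, detail: str) -> None:
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail


def proxy_json(fetch):
    # fetch() returns (status_code, body); on 4xx/5xx re-raise with the
    # upstream status preserved, mirroring _raise_upstream.
    status, body = fetch()
    if status >= 400:
        raise UpstreamError(status, body)
    return body
```

Re-raising with the upstream status (instead of a blanket 502) lets the frontend distinguish, say, an auth failure at the analytics service from the service being down.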
1
backend/microservices/bi_query/__init__.py
Normal file
@@ -0,0 +1 @@
"""Read-only MSSQL query microservice."""
85
backend/microservices/bi_query/main.py
Normal file
@@ -0,0 +1,85 @@
from __future__ import annotations

import logging
from contextlib import asynccontextmanager

import pandas as pd
from fastapi import Depends, FastAPI, Response

from app.core.config import settings
from app.core.otel import (
    TelemetryProviders,
    configure_otel,
    instrument_fastapi,
    instrument_sqlalchemy_engines,
    shutdown_otel,
)
from app.core.security import InternalPrincipal, require_internal_principal
from app.db.engine import create_warehouse_engines, dispose_engines
from app.services.warehouse_service import ReadOnlyWarehouseClient
from microservices.common.http import current_trace_headers

logging.basicConfig(level=settings.log_level)
LOGGER = logging.getLogger(__name__)


def _frame_to_rows(df: pd.DataFrame) -> list[dict]:
    rows: list[dict] = []
    for _, row in df.iterrows():
        payload: dict = {}
        for key, value in row.items():
            if hasattr(value, "isoformat"):
                payload[str(key)] = value.isoformat()
            else:
                payload[str(key)] = value
        rows.append(payload)
    return rows


@asynccontextmanager
async def lifespan(app: FastAPI):
    telemetry: TelemetryProviders = configure_otel(settings)
    engines = create_warehouse_engines()
    instrument_sqlalchemy_engines(engines)
    app.state.query_client = ReadOnlyWarehouseClient(engines)
    LOGGER.info("BI query service ready with read-only MSSQL engines")
    yield
    dispose_engines(engines)
    shutdown_otel(telemetry)


app = FastAPI(title="bi-query-service", version="0.1.0", lifespan=lifespan)
instrument_fastapi(app)


@app.get("/internal/health")
def health(response: Response) -> dict:
    response.headers.update(current_trace_headers())
    return {"status": "ok", "service": "bi-query-service"}


@app.get("/internal/daily-sales")
def daily_sales(
    response: Response, _auth: InternalPrincipal = Depends(require_internal_principal)
) -> list[dict]:
    response.headers.update(current_trace_headers())
    client: ReadOnlyWarehouseClient = app.state.query_client
    return _frame_to_rows(client.fetch_daily_sales())


@app.get("/internal/product-performance")
def product_performance(
    response: Response, _auth: InternalPrincipal = Depends(require_internal_principal)
) -> list[dict]:
    response.headers.update(current_trace_headers())
    client: ReadOnlyWarehouseClient = app.state.query_client
    return _frame_to_rows(client.fetch_product_performance())


@app.get("/internal/customer-performance")
def customer_performance(
    response: Response, _auth: InternalPrincipal = Depends(require_internal_principal)
) -> list[dict]:
    response.headers.update(current_trace_headers())
    client: ReadOnlyWarehouseClient = app.state.query_client
    return _frame_to_rows(client.fetch_customer_performance())
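`_frame_to_rows` above relies on duck typing: anything with an `isoformat()` method (dates, datetimes, pandas Timestamps) is serialized as a string. The same idea without pandas, as a hypothetical `jsonable_rows` helper over a plain list of dicts:

```python
from datetime import date, datetime


def jsonable_rows(rows: list[dict]) -> list[dict]:
    # Mirror _frame_to_rows: stringify keys, isoformat anything date-like,
    # pass every other value through unchanged.
    return [
        {
            str(key): value.isoformat() if hasattr(value, "isoformat") else value
            for key, value in row.items()
        }
        for row in rows
    ]
```

Checking for `isoformat` instead of `isinstance(value, (date, datetime))` is what lets the real service handle `pandas.Timestamp` without importing it.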
1
backend/microservices/common/__init__.py
Normal file
@@ -0,0 +1 @@
"""Shared helpers for microservices."""
19
backend/microservices/common/http.py
Normal file
@@ -0,0 +1,19 @@
from __future__ import annotations

from opentelemetry import trace


def current_trace_headers() -> dict[str, str]:
    span_context = trace.get_current_span().get_span_context()
    if not span_context.is_valid:
        return {}
    return {
        "x-trace-id": f"{span_context.trace_id:032x}",
        "x-span-id": f"{span_context.span_id:016x}",
    }


def with_internal_service_token(headers: dict[str, str], token: str) -> dict[str, str]:
    merged = dict(headers)
    merged["x-internal-service-token"] = token
    return merged
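`current_trace_headers` formats OpenTelemetry's integer ids as fixed-width lowercase hex, matching the W3C trace-context widths (128-bit trace id, 64-bit span id). The formatting in isolation, as a hypothetical helper taking the raw integers:

```python
def format_trace_headers(trace_id: int, span_id: int) -> dict[str, str]:
    # :032x / :016x zero-pad to 32 and 16 hex chars, the widths the
    # traceparent header and most tracing backends expect.
    return {
        "x-trace-id": f"{trace_id:032x}",
        "x-span-id": f"{span_id:016x}",
    }
```

The zero-padding matters: a backend correlating `x-trace-id` with `traceparent` will miss the match if the id is emitted as a short hex string.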
1
backend/microservices/persistence/__init__.py
Normal file
@@ -0,0 +1 @@
"""PostgreSQL persistence microservice."""
176
backend/microservices/persistence/main.py
Normal file
@@ -0,0 +1,176 @@
from __future__ import annotations

import logging
from contextlib import asynccontextmanager

from fastapi import Depends, FastAPI, Query, Response
from pydantic import BaseModel, Field

from app.core.config import settings
from app.core.otel import (
    TelemetryProviders,
    configure_otel,
    instrument_fastapi,
    instrument_sqlalchemy_engines,
    shutdown_otel,
)
from app.core.security import InternalPrincipal, require_internal_principal
from app.db.postgres import (
    create_postgres_engine,
    create_postgres_session_factory,
    initialize_postgres_schema,
)
from app.services.persistence_service import PersistenceService
from microservices.common.http import current_trace_headers

logging.basicConfig(level=settings.log_level)
LOGGER = logging.getLogger(__name__)


class AuditLogIn(BaseModel):
    method: str
    path: str
    query_string: str = ""
    status_code: int
    duration_ms: float
    trace_id: str | None = None
    span_id: str | None = None
    client_ip: str | None = None
    user_agent: str | None = None
    details: dict = Field(default_factory=dict)


class ForecastRunIn(BaseModel):
    horizon_days: int
    payload: list[dict]
    trigger_source: str
    trace_id: str | None = None
    span_id: str | None = None


class RankingRunIn(BaseModel):
    top_n: int
    payload: list[dict]
    trigger_source: str
    trace_id: str | None = None
    span_id: str | None = None


class RecommendationRunIn(BaseModel):
    payload: list[dict]
    trigger_source: str
    trace_id: str | None = None
    span_id: str | None = None


@asynccontextmanager
async def lifespan(app: FastAPI):
    telemetry: TelemetryProviders = configure_otel(settings)
    engine = create_postgres_engine()
    initialize_postgres_schema(engine)
    instrument_sqlalchemy_engines({"appdb": engine})
    app.state.persistence_service = PersistenceService(
        create_postgres_session_factory(engine)
    )
    LOGGER.info("Persistence service ready with PostgreSQL")
    yield
    engine.dispose()
    shutdown_otel(telemetry)


app = FastAPI(title="persistence-service", version="0.1.0", lifespan=lifespan)
instrument_fastapi(app)


def _service() -> PersistenceService:
    return app.state.persistence_service


@app.get("/internal/health")
def health(response: Response) -> dict:
    response.headers.update(current_trace_headers())
    return {"status": "ok", "service": "persistence-service"}


@app.post("/internal/audit-logs")
def create_audit_log(
    payload: AuditLogIn,
    response: Response,
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> dict:
    response.headers.update(current_trace_headers())
    _service().record_audit_log(**payload.model_dump())
    return {"status": "ok"}


@app.post("/internal/forecast-runs")
def create_forecast_run(
    payload: ForecastRunIn,
    response: Response,
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> dict:
    response.headers.update(current_trace_headers())
    _service().record_forecast_run(**payload.model_dump())
    return {"status": "ok"}


@app.post("/internal/ranking-runs")
def create_ranking_run(
    payload: RankingRunIn,
    response: Response,
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> dict:
    response.headers.update(current_trace_headers())
    _service().record_ranking_run(**payload.model_dump())
    return {"status": "ok"}


@app.post("/internal/recommendation-runs")
def create_recommendation_run(
    payload: RecommendationRunIn,
    response: Response,
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> dict:
    response.headers.update(current_trace_headers())
    _service().record_recommendation_run(**payload.model_dump())
    return {"status": "ok"}


@app.get("/internal/audit-logs")
def list_audit_logs(
    response: Response,
    limit: int = Query(default=settings.storage_default_limit, ge=1, le=500),
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> list[dict]:
    response.headers.update(current_trace_headers())
    return _service().list_audit_logs(limit=limit)


@app.get("/internal/forecast-runs")
def list_forecast_runs(
    response: Response,
    limit: int = Query(default=settings.storage_default_limit, ge=1, le=500),
    _auth: InternalPrincipal = Depends(require_internal_principal),
) -> list[dict]:
    response.headers.update(current_trace_headers())
    return _service().list_forecast_runs(limit=limit)


@app.get("/internal/ranking-runs")
|
||||
def list_ranking_runs(
|
||||
response: Response,
|
||||
limit: int = Query(default=settings.storage_default_limit, ge=1, le=500),
|
||||
_auth: InternalPrincipal = Depends(require_internal_principal),
|
||||
) -> list[dict]:
|
||||
response.headers.update(current_trace_headers())
|
||||
return _service().list_ranking_runs(limit=limit)
|
||||
|
||||
|
||||
@app.get("/internal/recommendation-runs")
|
||||
def list_recommendation_runs(
|
||||
response: Response,
|
||||
limit: int = Query(default=settings.storage_default_limit, ge=1, le=500),
|
||||
_auth: InternalPrincipal = Depends(require_internal_principal),
|
||||
) -> list[dict]:
|
||||
response.headers.update(current_trace_headers())
|
||||
return _service().list_recommendation_runs(limit=limit)
|
||||
46 backend/pyproject.toml Normal file
@@ -0,0 +1,46 @@
[project]
name = "otel-bi-backend"
version = "0.1.0"
description = "OpenTelemetry-instrumented BI and forecasting backend for MSSQL data warehouses"
requires-python = ">=3.11"
license = "AGPL-3.0-or-later"
authors = [{ name = "Domagoj Andrić" }]
dependencies = [
    "fastapi>=0.116.0",
    "uvicorn[standard]>=0.35.0",
    "pydantic>=2.11.0",
    "pydantic-settings>=2.10.0",
    "python-dotenv>=1.1.0",
    "httpx>=0.28.0",
    "pyjwt[crypto]>=2.10.0",
    "sqlalchemy>=2.0.40",
    "pyodbc>=5.2.0",
    "psycopg[binary]>=3.2.0",
    "pandas>=2.3.0",
    "numpy>=2.3.0",
    "scikit-learn>=1.7.0",
    "opentelemetry-api>=1.36.0",
    "opentelemetry-sdk>=1.36.0",
    "opentelemetry-exporter-otlp-proto-http>=1.36.0",
    "opentelemetry-instrumentation-fastapi>=0.57b0",
    "opentelemetry-instrumentation-httpx>=0.57b0",
    "opentelemetry-instrumentation-sqlalchemy>=0.57b0",
    "opentelemetry-instrumentation-logging>=0.57b0",
    "opentelemetry-instrumentation-system-metrics>=0.57b0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.4.0",
]

[build-system]
requires = ["setuptools>=68", "wheel"]
build-backend = "setuptools.build_meta"

[tool.setuptools.packages.find]
where = ["."]
include = ["app*", "microservices*"]

[tool.pytest.ini_options]
pythonpath = ["."]
79 backend/tests/test_analytics_service.py Normal file
@@ -0,0 +1,79 @@
from __future__ import annotations

from datetime import date, timedelta

import pandas as pd

from app.services.analytics_service import AnalyticsService


class StubWarehouseClient:
    def fetch_daily_sales(self) -> pd.DataFrame:
        today = date.today()
        rows = []
        for i in range(120):
            day = today - timedelta(days=120 - i)
            rows.append(
                {
                    "sale_date": day.isoformat(),
                    "revenue": 1000 + (i * 5),
                    "cost": 500 + (i * 2),
                    "quantity": 40 + i,
                    "orders": 5 + (i % 4),
                    "source": "stub",
                }
            )
        return pd.DataFrame(rows)

    def fetch_product_performance(self) -> pd.DataFrame:
        return pd.DataFrame(
            [
                {
                    "product_id": "A1",
                    "product_name": "Alpha",
                    "category_name": "CatA",
                    "revenue": 12000,
                    "cost": 6000,
                    "quantity": 400,
                    "orders": 150,
                    "source": "stub",
                },
                {
                    "product_id": "B1",
                    "product_name": "Beta",
                    "category_name": "CatB",
                    "revenue": 9000,
                    "cost": 8500,
                    "quantity": 300,
                    "orders": 110,
                    "source": "stub",
                },
            ]
        )

    def fetch_customer_performance(self) -> pd.DataFrame:
        return pd.DataFrame(
            [
                {
                    "customer_id": "C1",
                    "customer_name": "Contoso",
                    "revenue": 15000,
                    "orders": 80,
                    "source": "stub",
                }
            ]
        )


def test_forecast_has_expected_horizon() -> None:
    service = AnalyticsService(StubWarehouseClient())  # type: ignore[arg-type]
    forecast = service.get_forecast(horizon_days=15)
    assert len(forecast) == 15
    assert "predicted_revenue" in forecast[0]


def test_rankings_are_sorted() -> None:
    service = AnalyticsService(StubWarehouseClient())  # type: ignore[arg-type]
    rankings = service.get_rankings(top_n=2)
    assert len(rankings) == 2
    assert rankings[0]["score"] >= rankings[1]["score"]
65 backend/tests/test_security_tokens.py Normal file
@@ -0,0 +1,65 @@
from __future__ import annotations

import pytest
from fastapi import HTTPException

from app.core.config import settings
from app.core.security import InternalTokenManager, require_internal_principal


def test_internal_token_round_trip(monkeypatch: pytest.MonkeyPatch) -> None:
    monkeypatch.setattr(
        settings,
        "internal_service_shared_secret",
        "unit-test-shared-secret-key-at-least-32b",
    )
    monkeypatch.setattr(settings, "internal_service_token_audience", "bi-internal-test")
    monkeypatch.setattr(settings, "internal_service_allowed_issuers", "api-gateway")
    monkeypatch.setattr(settings, "internal_token_clock_skew_seconds", 0)

    manager = InternalTokenManager()
    token = manager.mint(
        subject="user-123",
        scopes=["openid", "profile"],
        source_service="api-gateway",
    )

    principal = manager.verify(token)
    assert principal.subject == "user-123"
    assert principal.claims["iss"] == "api-gateway"
    assert principal.claims["typ"] == "internal-service"


def test_internal_token_rejects_untrusted_issuer(
    monkeypatch: pytest.MonkeyPatch,
) -> None:
    monkeypatch.setattr(
        settings,
        "internal_service_shared_secret",
        "unit-test-shared-secret-key-at-least-32b",
    )
    monkeypatch.setattr(settings, "internal_service_token_audience", "bi-internal-test")
    monkeypatch.setattr(settings, "internal_service_allowed_issuers", "api-gateway")
    monkeypatch.setattr(settings, "internal_token_clock_skew_seconds", 0)

    manager = InternalTokenManager()
    token = manager.mint(
        subject="user-123",
        scopes=["openid"],
        source_service="analytics",
    )

    with pytest.raises(HTTPException) as exc:
        manager.verify(token)
    assert exc.value.status_code == 401
    assert exc.value.detail == "Internal token issuer is not allowed."


def test_require_internal_principal_rejects_missing_token(
    monkeypatch: pytest.MonkeyPatch,
) -> None:
    monkeypatch.setattr(settings, "internal_service_auth_enabled", True)
    with pytest.raises(HTTPException) as exc:
        require_internal_principal(None)
    assert exc.value.status_code == 401
    assert exc.value.detail == "Missing x-internal-service-token header."
13 frontend/.env.example Normal file
@@ -0,0 +1,13 @@
VITE_API_BASE_URL=http://localhost:8000
VITE_OTEL_COLLECTOR_ENDPOINT=http://localhost:4318
# K8s + Alloy example:
# VITE_OTEL_COLLECTOR_ENDPOINT=http://alloy.monitoring.svc.cluster.local:4318
VITE_OTEL_SERVICE_NAME=otel-bi-frontend
VITE_OTEL_SERVICE_NAMESPACE=final-thesis

VITE_OIDC_ENABLED=true
VITE_OIDC_AUTHORITY=https://<your-idp-domain>/realms/<your-realm>
VITE_OIDC_CLIENT_ID=otel-bi-frontend
VITE_OIDC_REDIRECT_URI=http://localhost:5173
VITE_OIDC_POST_LOGOUT_REDIRECT_URI=http://localhost:5173
VITE_OIDC_SCOPE=openid profile email
12 frontend/index.html Normal file
@@ -0,0 +1,12 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>OTel BI Command Center</title>
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/src/main.tsx"></script>
  </body>
</html>
2749 frontend/package-lock.json generated Normal file
File diff suppressed because it is too large.
40 frontend/package.json Normal file
@@ -0,0 +1,40 @@
{
  "name": "otel-bi-frontend",
  "version": "0.1.0",
  "private": true,
  "license": "AGPL-3.0-or-later",
  "author": "Domagoj Andrić",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc -b && vite build",
    "preview": "vite preview"
  },
  "dependencies": {
    "@opentelemetry/api": "^1.9.0",
    "@opentelemetry/context-zone-peer-dep": "^2.2.0",
    "@opentelemetry/core": "^2.2.0",
    "@opentelemetry/exporter-trace-otlp-http": "^0.213.0",
    "@opentelemetry/instrumentation-document-load": "^0.58.0",
    "@opentelemetry/instrumentation": "^0.213.0",
    "@opentelemetry/instrumentation-fetch": "^0.213.0",
    "@opentelemetry/instrumentation-user-interaction": "^0.57.0",
    "@opentelemetry/instrumentation-xml-http-request": "^0.213.0",
    "@opentelemetry/resources": "^2.2.0",
    "@opentelemetry/sdk-trace-base": "^2.2.0",
    "@opentelemetry/sdk-trace-web": "^2.2.0",
    "@tanstack/react-query": "^5.90.2",
    "oidc-client-ts": "^3.1.0",
    "react": "^19.1.1",
    "react-dom": "^19.1.1",
    "recharts": "^3.2.1",
    "zone.js": "^0.15.1"
  },
  "devDependencies": {
    "@types/react": "^19.1.10",
    "@types/react-dom": "^19.1.7",
    "@vitejs/plugin-react": "^5.0.0",
    "typescript": "~5.9.2",
    "vite": "^7.1.4"
  }
}
363 frontend/src/App.tsx Normal file
@@ -0,0 +1,363 @@
import { trace, SpanStatusCode } from "@opentelemetry/api";
import { useQuery } from "@tanstack/react-query";
import { startTransition, useDeferredValue } from "react";
import {
  Area,
  AreaChart,
  CartesianGrid,
  Line,
  LineChart,
  ResponsiveContainer,
  Tooltip,
  XAxis,
  YAxis,
} from "recharts";

import { getDashboard } from "./api/client";
import { useAuth } from "./auth/AuthContext";

const money = new Intl.NumberFormat("en-US", {
  style: "currency",
  currency: "USD",
  maximumFractionDigits: 0,
});

const tracer = trace.getTracer("bi-frontend-ui");

function formatCompactDate(value: string): string {
  return new Date(value).toLocaleDateString("en-US", {
    month: "short",
    day: "numeric",
  });
}

function formatTooltipMoney(
  value: string | number | readonly (string | number)[] | undefined,
): string {
  const raw = Array.isArray(value) ? Number(value[0]) : Number(value);
  return money.format(Number.isFinite(raw) ? raw : 0);
}

function formatTooltipNumber(
  value: string | number | readonly (string | number)[] | undefined,
): string {
  const raw = Array.isArray(value) ? Number(value[0]) : Number(value);
  return Number.isFinite(raw) ? raw.toFixed(2) : "0.00";
}

export default function App() {
  const auth = useAuth();
  const dashboardQuery = useQuery({
    queryKey: ["dashboard"],
    queryFn: getDashboard,
    staleTime: 30_000,
    refetchInterval: 120_000,
    enabled: auth.authenticated || !auth.enabled,
  });

  const deferredRankings = useDeferredValue(
    dashboardQuery.data?.rankings ?? [],
  );

  const chartHistory =
    dashboardQuery.data?.history.slice(-120).map((point) => ({
      date: point.date,
      actual: point.revenue,
      forecast: null as number | null,
      lower: null as number | null,
      upper: null as number | null,
    })) ?? [];
  const chartForecast =
    dashboardQuery.data?.forecasts.slice(0, 45).map((point) => ({
      date: point.date,
      actual: null as number | null,
      forecast: point.predicted_revenue,
      lower: point.lower_bound,
      upper: point.upper_bound,
    })) ?? [];
  const trendData = [...chartHistory, ...chartForecast];

  const refreshData = () => {
    tracer.startActiveSpan("frontend.refresh_click", async (span) => {
      try {
        startTransition(() => {
          void dashboardQuery.refetch();
        });
        span.setStatus({ code: SpanStatusCode.OK });
      } catch (error) {
        span.recordException(error as Error);
        span.setStatus({
          code: SpanStatusCode.ERROR,
          message: "Failed to refresh dashboard data.",
        });
      } finally {
        span.end();
      }
    });
  };

  if (auth.loading) {
    return <div className="loading-shell">Initializing OIDC session...</div>;
  }

  if (auth.error) {
    return (
      <div className="loading-shell">
        Authentication setup error.
        <br />
        {auth.error}
      </div>
    );
  }

  if (auth.enabled && !auth.authenticated) {
    return (
      <div className="loading-shell">
        Authentication required.
        <br />
        <button
          className="refresh-button"
          onClick={() => void auth.login()}
          type="button"
        >
          Sign In with OIDC
        </button>
      </div>
    );
  }

  if (dashboardQuery.isLoading) {
    return (
      <div className="loading-shell">
        Loading telemetry-enabled BI dashboard...
      </div>
    );
  }

  if (dashboardQuery.error || !dashboardQuery.data) {
    return (
      <div className="loading-shell">
        Dashboard could not load.
        <br />
        {(dashboardQuery.error as Error | undefined)?.message ??
          "No response from backend."}
      </div>
    );
  }

  const { kpis, recommendations, telemetry } = dashboardQuery.data;
  const topScore = deferredRankings[0]?.score ?? 0;

  return (
    <main className="app-shell">
      <div className="radial-glow" />
      <header className="dashboard-header">
        <div>
          <p className="eyebrow">Business Intelligence Command Center</p>
          <h1>Warehouse Forecasting and Ranking Dashboard</h1>
          <p className="subtitle">
            Data sources: <strong>WorldWideImporters</strong> +{" "}
            <strong>AdventureWorks2022DWH</strong> (read-only) with
            OpenTelemetry traces from browser to SQL.
          </p>
          <p className="trace-id">
            Last backend trace:{" "}
            <code>{telemetry.backendTraceId ?? "missing-trace-id-header"}</code>
          </p>
        </div>
        <div className="auth-actions">
          <p className="subtitle">
            User: <strong>{auth.subject ?? "unknown"}</strong>
          </p>
          <div className="header-actions">
            <button
              className="refresh-button"
              onClick={refreshData}
              type="button"
            >
              Refresh
            </button>
            {auth.enabled ? (
              <button
                className="logout-button"
                onClick={() => void auth.logout()}
                type="button"
              >
                Sign Out
              </button>
            ) : null}
          </div>
        </div>
      </header>

      <section className="kpi-grid">
        <article className="kpi-card">
          <p>Total Revenue</p>
          <h2>{money.format(kpis.total_revenue)}</h2>
        </article>
        <article className="kpi-card">
          <p>Gross Margin</p>
          <h2>{kpis.gross_margin_pct.toFixed(2)}%</h2>
        </article>
        <article className="kpi-card">
          <p>Avg Order Value</p>
          <h2>{money.format(kpis.avg_order_value)}</h2>
        </article>
        <article className="kpi-card">
          <p>Total Quantity</p>
          <h2>
            {kpis.total_quantity.toLocaleString("en-US", {
              maximumFractionDigits: 0,
            })}
          </h2>
        </article>
      </section>

      <section className="panel-grid">
        <article className="panel wide">
          <div className="panel-title-row">
            <h3>Revenue Trend + Forecast</h3>
            <span>{trendData.length} points</span>
          </div>
          <div className="chart-wrap">
            <ResponsiveContainer width="100%" height={320}>
              <LineChart data={trendData}>
                <CartesianGrid
                  strokeDasharray="4 4"
                  stroke="rgba(255,255,255,0.08)"
                />
                <XAxis
                  dataKey="date"
                  tickFormatter={formatCompactDate}
                  stroke="rgba(255,255,255,0.65)"
                />
                <YAxis
                  tickFormatter={(value) => money.format(value)}
                  stroke="rgba(255,255,255,0.65)"
                />
                <Tooltip
                  labelFormatter={(label) =>
                    new Date(label).toLocaleDateString("en-US")
                  }
                  formatter={formatTooltipMoney}
                />
                <Area
                  type="monotone"
                  dataKey="upper"
                  stroke="none"
                  fill="rgba(90, 201, 255, 0.1)"
                />
                <Area
                  type="monotone"
                  dataKey="lower"
                  stroke="none"
                  fill="rgba(15, 20, 31, 0.9)"
                />
                <Line
                  type="monotone"
                  dataKey="actual"
                  stroke="#f9de70"
                  strokeWidth={2.5}
                  dot={false}
                />
                <Line
                  type="monotone"
                  dataKey="forecast"
                  stroke="#57d4ff"
                  strokeWidth={2.5}
                  strokeDasharray="8 5"
                  dot={false}
                />
              </LineChart>
            </ResponsiveContainer>
          </div>
        </article>

        <article className="panel">
          <div className="panel-title-row">
            <h3>Top Product Score</h3>
            <span>Weighted ranking index</span>
          </div>
          <div className="score-wrap">
            <ResponsiveContainer width="100%" height={240}>
              <AreaChart
                data={[
                  { label: "baseline", value: 0 },
                  { label: "current", value: topScore },
                ]}
              >
                <CartesianGrid
                  strokeDasharray="3 3"
                  stroke="rgba(255,255,255,0.08)"
                />
                <XAxis dataKey="label" stroke="rgba(255,255,255,0.65)" />
                <YAxis stroke="rgba(255,255,255,0.65)" />
                <Tooltip formatter={formatTooltipNumber} />
                <Area
                  type="monotone"
                  dataKey="value"
                  stroke="#8ef2c7"
                  fill="rgba(142, 242, 199, 0.28)"
                />
              </AreaChart>
            </ResponsiveContainer>
            <p className="score-caption">
              Current leader score <strong>{topScore.toFixed(2)}</strong> / 100
            </p>
          </div>
        </article>

        <article className="panel wide">
          <div className="panel-title-row">
            <h3>Product Rankings</h3>
            <span>Top {deferredRankings.length}</span>
          </div>
          <div className="table-wrap">
            <table>
              <thead>
                <tr>
                  <th>Rank</th>
                  <th>Product</th>
                  <th>Category</th>
                  <th>Revenue</th>
                  <th>Margin</th>
                  <th>Score</th>
                </tr>
              </thead>
              <tbody>
                {deferredRankings.map((item) => (
                  <tr key={`${item.rank}-${item.product_id}`}>
                    <td>{item.rank}</td>
                    <td>{item.product_name}</td>
                    <td>{item.category}</td>
                    <td>{money.format(item.revenue)}</td>
                    <td>{item.margin_pct.toFixed(2)}%</td>
                    <td>{item.score.toFixed(2)}</td>
                  </tr>
                ))}
              </tbody>
            </table>
          </div>
        </article>

        <article className="panel">
          <div className="panel-title-row">
            <h3>Recommendations</h3>
            <span>Action queue</span>
          </div>
          <ul className="recommendations-list">
            {recommendations.map((item, index) => (
              <li key={`${item.title}-${index}`}>
                <span className={`priority ${item.priority}`}>
                  {item.priority}
                </span>
                <h4>{item.title}</h4>
                <p>{item.summary}</p>
              </li>
            ))}
          </ul>
        </article>
      </section>
    </main>
  );
}
53 frontend/src/api/client.ts Normal file
@@ -0,0 +1,53 @@
import { SpanStatusCode, trace } from "@opentelemetry/api";

import { currentAccessToken } from "../auth/oidc";
import type { DashboardPayload, DashboardResponse } from "./types";

const API_BASE_URL =
  import.meta.env.VITE_API_BASE_URL ?? "http://localhost:8000";
const tracer = trace.getTracer("bi-frontend-api");

async function parseJson<T>(response: Response): Promise<T> {
  if (!response.ok) {
    const body = await response.text();
    throw new Error(`HTTP ${response.status}: ${body}`);
  }
  return (await response.json()) as T;
}

export async function getDashboard(): Promise<DashboardPayload> {
  return tracer.startActiveSpan("frontend.api.dashboard", async (span) => {
    try {
      const token = currentAccessToken();
      const response = await fetch(`${API_BASE_URL}/api/dashboard`, {
        method: "GET",
        headers: {
          Accept: "application/json",
          ...(token ? { Authorization: `Bearer ${token}` } : {}),
        },
      });
      const data = await parseJson<DashboardResponse>(response);
      const backendTraceId = response.headers.get("x-trace-id");
      const backendSpanId = response.headers.get("x-span-id");
      span.setAttribute("dashboard.kpis", Object.keys(data.kpis).length);
      span.setAttribute("backend.trace_id_present", backendTraceId !== null);
      span.setStatus({ code: SpanStatusCode.OK });
      return {
        ...data,
        telemetry: {
          backendTraceId,
          backendSpanId,
        },
      };
    } catch (error) {
      span.recordException(error as Error);
      span.setStatus({
        code: SpanStatusCode.ERROR,
        message: "dashboard request failed",
      });
      throw error;
    } finally {
      span.end();
    }
  });
}
52 frontend/src/api/types.ts Normal file
@@ -0,0 +1,52 @@
export type KPI = {
  total_revenue: number;
  gross_margin_pct: number;
  total_quantity: number;
  avg_order_value: number;
  records_in_window: number;
};

export type HistoryPoint = {
  date: string;
  revenue: number;
  cost: number;
  quantity: number;
};

export type ForecastPoint = {
  date: string;
  predicted_revenue: number;
  lower_bound: number;
  upper_bound: number;
};

export type RankingItem = {
  rank: number;
  product_id: string;
  product_name: string;
  category: string;
  revenue: number;
  margin_pct: number;
  score: number;
};

export type Recommendation = {
  title: string;
  priority: string;
  summary: string;
};

export type DashboardResponse = {
  kpis: KPI;
  history: HistoryPoint[];
  forecasts: ForecastPoint[];
  rankings: RankingItem[];
  recommendations: Recommendation[];
};

export type DashboardPayload = DashboardResponse & {
  telemetry: {
    backendTraceId: string | null;
    backendSpanId: string | null;
  };
};
90 frontend/src/auth/AuthContext.tsx Normal file
@@ -0,0 +1,90 @@
import {
  createContext,
  useContext,
  useEffect,
  useState,
  type ReactNode,
} from "react";

import {
  currentUser,
  initializeOIDC,
  isOIDCEnabled,
  login,
  logout,
  oidcConfigError,
} from "./oidc";

type AuthState = {
  loading: boolean;
  authenticated: boolean;
  enabled: boolean;
  subject: string | null;
  error: string | null;
  login: () => Promise<void>;
  logout: () => Promise<void>;
};

const AuthContext = createContext<AuthState>({
  loading: true,
  authenticated: false,
  enabled: true,
  subject: null,
  error: null,
  login,
  logout,
});

export function AuthProvider({ children }: { children: ReactNode }) {
  const [loading, setLoading] = useState(true);
  const [authenticated, setAuthenticated] = useState(false);
  const [subject, setSubject] = useState<string | null>(null);
  const [error, setError] = useState<string | null>(null);
  const enabled = isOIDCEnabled();

  useEffect(() => {
    const bootstrap = async () => {
      try {
        const configIssue = oidcConfigError();
        if (configIssue) {
          setError(configIssue);
          setAuthenticated(false);
          return;
        }

        await initializeOIDC();
        const user = currentUser();
        const isAuthed = !!user && !user.expired;
        setAuthenticated(isAuthed);
        setSubject((user?.profile?.sub as string | undefined) ?? null);
      } catch (err) {
        setError((err as Error).message);
        setAuthenticated(false);
      } finally {
        setLoading(false);
      }
    };

    void bootstrap();
  }, []);

  return (
    <AuthContext.Provider
      value={{
        loading,
        authenticated,
        enabled,
        subject,
        error,
        login,
        logout,
      }}
    >
      {children}
    </AuthContext.Provider>
  );
}

export function useAuth(): AuthState {
  return useContext(AuthContext);
}
105 frontend/src/auth/oidc.ts Normal file
@@ -0,0 +1,105 @@
import { UserManager, type User, WebStorageStateStore } from "oidc-client-ts";

type OIDCConfig = {
  enabled: boolean;
  authority: string;
  clientId: string;
  redirectUri: string;
  postLogoutRedirectUri: string;
  scope: string;
};

let cachedUser: User | null = null;

function config(): OIDCConfig {
  const enabled = (import.meta.env.VITE_OIDC_ENABLED ?? "true") !== "false";
  return {
    enabled,
    authority: import.meta.env.VITE_OIDC_AUTHORITY ?? "",
    clientId: import.meta.env.VITE_OIDC_CLIENT_ID ?? "",
    redirectUri:
      import.meta.env.VITE_OIDC_REDIRECT_URI ?? window.location.origin,
    postLogoutRedirectUri:
      import.meta.env.VITE_OIDC_POST_LOGOUT_REDIRECT_URI ??
      window.location.origin,
    scope: import.meta.env.VITE_OIDC_SCOPE ?? "openid profile email",
  };
}

export function isOIDCEnabled(): boolean {
  return config().enabled;
}

export function oidcConfigError(): string | null {
  const cfg = config();
  if (!cfg.enabled) return null;
  if (!cfg.authority) return "VITE_OIDC_AUTHORITY is not set.";
  if (!cfg.clientId) return "VITE_OIDC_CLIENT_ID is not set.";
  return null;
}

function manager(): UserManager {
  const cfg = config();
  return new UserManager({
    authority: cfg.authority,
    client_id: cfg.clientId,
    redirect_uri: cfg.redirectUri,
    post_logout_redirect_uri: cfg.postLogoutRedirectUri,
    response_type: "code",
    scope: cfg.scope,
    userStore: new WebStorageStateStore({ store: window.sessionStorage }),
    monitorSession: true,
    automaticSilentRenew: false,
  });
}

function hasSigninParams(): boolean {
  const params = new URLSearchParams(window.location.search);
  return params.has("code") && params.has("state");
}

export async function initializeOIDC(): Promise<User | null> {
  if (!isOIDCEnabled()) {
    cachedUser = null;
    return null;
  }

  if (oidcConfigError()) {
    cachedUser = null;
    return null;
  }

  const userManager = manager();
  if (hasSigninParams()) {
    cachedUser = await userManager.signinRedirectCallback();
    window.history.replaceState({}, document.title, window.location.pathname);
    return cachedUser;
  }

  cachedUser = await userManager.getUser();
  return cachedUser;
}

export function currentUser(): User | null {
  return cachedUser;
}

export function currentAccessToken(): string | null {
  if (cachedUser?.access_token && !cachedUser.expired)
    return cachedUser.access_token;
  return null;
}

export async function login(): Promise<void> {
  if (!isOIDCEnabled()) return;
  const userManager = manager();
  await userManager.signinRedirect({
    state: { returnTo: window.location.pathname + window.location.search },
  });
}

export async function logout(): Promise<void> {
  if (!isOIDCEnabled()) return;
  const userManager = manager();
  await userManager.signoutRedirect();
}
31
frontend/src/main.tsx
Normal file
31
frontend/src/main.tsx
Normal file
@@ -0,0 +1,31 @@
import "zone.js";
import "./styles.css";

import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { StrictMode } from "react";
import { createRoot } from "react-dom/client";

import App from "./App";
import { AuthProvider } from "./auth/AuthContext";
import { setupTelemetry } from "./telemetry";

setupTelemetry();

const queryClient = new QueryClient({
  defaultOptions: {
    queries: {
      retry: 1,
      refetchOnWindowFocus: false,
    },
  },
});

createRoot(document.getElementById("root")!).render(
  <StrictMode>
    <QueryClientProvider client={queryClient}>
      <AuthProvider>
        <App />
      </AuthProvider>
    </QueryClientProvider>
  </StrictMode>,
);
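`retry: 1` in the QueryClient defaults means a failed query gets one more attempt before the error surfaces to the UI. A minimal sketch of that retry semantics; `fetchWithRetry` is illustrative, not the library's implementation:

```typescript
// Sketch of `retry: 1` behavior: one initial attempt plus `retry` re-attempts.
async function fetchWithRetry<T>(fn: () => Promise<T>, retry: number): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retry; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
    }
  }
  // All attempts failed: surface the last error to the caller.
  throw lastError;
}

// A query that fails once, then succeeds: with retry = 1 the caller sees success.
let calls = 0;
fetchWithRetry(async () => {
  calls += 1;
  if (calls === 1) throw new Error("transient failure");
  return "ok";
}, 1).then((value) => console.log(value, calls)); // "ok" 2
```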
325 frontend/src/styles.css Normal file
@@ -0,0 +1,325 @@
:root {
  font-family: "Space Grotesk", "Segoe UI", sans-serif;
  line-height: 1.5;
  font-weight: 400;
  color: #f3f7ff;
  background: #0a1019;

  --bg-primary: #0a1019;
  --bg-secondary: #101d2e;
  --bg-panel: rgba(16, 28, 44, 0.72);
  --border: rgba(186, 212, 255, 0.22);
  --accent-a: #f9de70;
  --accent-b: #57d4ff;
  --accent-c: #8ef2c7;
  --text-muted: rgba(233, 244, 255, 0.7);
  --shadow: 0 20px 55px rgba(3, 8, 16, 0.45);
}

* {
  box-sizing: border-box;
}

body {
  margin: 0;
  min-height: 100vh;
  background:
    radial-gradient(
      circle at 0% 0%,
      rgba(122, 82, 242, 0.2),
      transparent 30%
    ),
    radial-gradient(
      circle at 100% 10%,
      rgba(87, 212, 255, 0.18),
      transparent 30%
    ),
    linear-gradient(150deg, var(--bg-primary), var(--bg-secondary));
}

.app-shell {
  width: min(1200px, 100% - 2rem);
  margin: 1.5rem auto 3rem;
  position: relative;
}

.radial-glow {
  position: fixed;
  width: 48vw;
  height: 48vw;
  max-width: 540px;
  max-height: 540px;
  border-radius: 50%;
  background: radial-gradient(
    circle,
    rgba(87, 212, 255, 0.16),
    transparent 65%
  );
  top: -12rem;
  right: -10rem;
  pointer-events: none;
  z-index: 0;
}

.dashboard-header,
.kpi-grid,
.panel-grid {
  position: relative;
  z-index: 1;
}

.dashboard-header {
  display: flex;
  justify-content: space-between;
  gap: 1rem;
  align-items: flex-start;
  margin-bottom: 1rem;
}

.auth-actions {
  display: grid;
  gap: 0.5rem;
  justify-items: end;
}

.header-actions {
  display: flex;
  gap: 0.5rem;
}

.dashboard-header h1 {
  margin: 0.2rem 0 0.5rem;
  font-size: clamp(1.6rem, 2.2vw, 2.4rem);
  letter-spacing: -0.04em;
}

.eyebrow {
  margin: 0;
  color: var(--accent-b);
  text-transform: uppercase;
  letter-spacing: 0.14em;
  font-size: 0.74rem;
  font-weight: 600;
}

.subtitle {
  margin: 0;
  color: var(--text-muted);
  max-width: 74ch;
}

.trace-id {
  margin: 0.5rem 0 0;
  color: var(--text-muted);
  font-size: 0.8rem;
}

.trace-id code {
  color: var(--accent-c);
}

.refresh-button {
  background: linear-gradient(125deg, var(--accent-b), #7be8ff);
  color: #04111b;
  border: 0;
  font-weight: 700;
  padding: 0.72rem 1rem;
  border-radius: 0.8rem;
  box-shadow: var(--shadow);
  cursor: pointer;
}

.logout-button {
  background: rgba(248, 159, 159, 0.2);
  color: #ffd6d6;
  border: 1px solid rgba(255, 184, 184, 0.45);
  font-weight: 700;
  padding: 0.72rem 1rem;
  border-radius: 0.8rem;
  box-shadow: var(--shadow);
  cursor: pointer;
}

.kpi-grid {
  display: grid;
  grid-template-columns: repeat(4, minmax(0, 1fr));
  gap: 0.9rem;
  margin-bottom: 0.9rem;
}

.kpi-card {
  background: var(--bg-panel);
  border: 1px solid var(--border);
  border-radius: 1rem;
  padding: 0.95rem 1rem;
  box-shadow: var(--shadow);
  backdrop-filter: blur(8px);
}

.kpi-card p {
  margin: 0;
  color: var(--text-muted);
  font-size: 0.82rem;
  text-transform: uppercase;
  letter-spacing: 0.08em;
}

.kpi-card h2 {
  margin: 0.5rem 0 0;
  font-size: clamp(1.1rem, 1.7vw, 1.6rem);
}

.panel-grid {
  display: grid;
  grid-template-columns: 2fr 1fr;
  gap: 0.9rem;
}

.panel {
  background: var(--bg-panel);
  border: 1px solid var(--border);
  border-radius: 1rem;
  box-shadow: var(--shadow);
  backdrop-filter: blur(8px);
  padding: 0.9rem;
}

.panel.wide {
  grid-column: span 2;
}

.panel-title-row {
  display: flex;
  justify-content: space-between;
  align-items: baseline;
  gap: 1rem;
}

.panel-title-row h3 {
  margin: 0;
}

.panel-title-row span {
  color: var(--text-muted);
  font-size: 0.85rem;
}

.chart-wrap,
.score-wrap {
  margin-top: 0.8rem;
}

.score-caption {
  margin-top: 0.4rem;
  color: var(--text-muted);
}

.table-wrap {
  margin-top: 0.7rem;
  max-height: 350px;
  overflow: auto;
}

table {
  width: 100%;
  border-collapse: collapse;
}

th,
td {
  text-align: left;
  padding: 0.6rem 0.45rem;
  border-bottom: 1px solid rgba(216, 232, 255, 0.09);
  white-space: nowrap;
}

th {
  color: var(--text-muted);
  font-size: 0.77rem;
  letter-spacing: 0.08em;
  text-transform: uppercase;
}

.recommendations-list {
  margin: 0.8rem 0 0;
  padding: 0;
  list-style: none;
  display: grid;
  gap: 0.75rem;
}

.recommendations-list li {
  border: 1px solid rgba(190, 210, 245, 0.14);
  background: rgba(12, 20, 31, 0.7);
  border-radius: 0.8rem;
  padding: 0.75rem;
}

.recommendations-list h4 {
  margin: 0.5rem 0 0.3rem;
}

.recommendations-list p {
  margin: 0;
  color: var(--text-muted);
}

.priority {
  display: inline-flex;
  align-items: center;
  border-radius: 999px;
  padding: 0.2rem 0.55rem;
  font-size: 0.72rem;
  text-transform: uppercase;
  font-weight: 700;
  letter-spacing: 0.08em;
}

.priority.high {
  background: rgba(255, 112, 112, 0.17);
  color: #ffb6b6;
}

.priority.medium {
  background: rgba(255, 205, 112, 0.18);
  color: #ffe3ae;
}

.priority.low {
  background: rgba(142, 242, 199, 0.18);
  color: #b9ffd8;
}

.loading-shell {
  color: #d6e7ff;
  min-height: 100vh;
  display: grid;
  place-items: center;
  text-align: center;
  padding: 1rem;
  font-size: 1.04rem;
}

@media (max-width: 980px) {
  .kpi-grid {
    grid-template-columns: repeat(2, minmax(0, 1fr));
  }

  .panel-grid {
    grid-template-columns: 1fr;
  }

  .panel.wide {
    grid-column: auto;
  }
}

@media (max-width: 640px) {
  .dashboard-header {
    flex-direction: column;
  }

  .kpi-grid {
    grid-template-columns: 1fr;
  }
}
77 frontend/src/telemetry.ts Normal file
@@ -0,0 +1,77 @@
import { propagation } from "@opentelemetry/api";
import { resourceFromAttributes } from "@opentelemetry/resources";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { WebTracerProvider } from "@opentelemetry/sdk-trace-web";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import {
  CompositePropagator,
  W3CBaggagePropagator,
  W3CTraceContextPropagator,
} from "@opentelemetry/core";
import { DocumentLoadInstrumentation } from "@opentelemetry/instrumentation-document-load";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { FetchInstrumentation } from "@opentelemetry/instrumentation-fetch";
import { UserInteractionInstrumentation } from "@opentelemetry/instrumentation-user-interaction";
import { XMLHttpRequestInstrumentation } from "@opentelemetry/instrumentation-xml-http-request";
import { ZoneContextManager } from "@opentelemetry/context-zone-peer-dep";

let initialized = false;

function escapeRegExp(value: string): string {
  return value.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

export function setupTelemetry(): void {
  if (initialized) return;
  initialized = true;

  const endpoint =
    import.meta.env.VITE_OTEL_COLLECTOR_ENDPOINT ?? "http://localhost:4318";
  const serviceName =
    import.meta.env.VITE_OTEL_SERVICE_NAME ?? "otel-bi-frontend";
  const serviceNamespace =
    import.meta.env.VITE_OTEL_SERVICE_NAMESPACE ?? "final-thesis";
  const apiBaseUrl =
    import.meta.env.VITE_API_BASE_URL ?? "http://localhost:8000";

  propagation.setGlobalPropagator(
    new CompositePropagator({
      propagators: [
        new W3CTraceContextPropagator(),
        new W3CBaggagePropagator(),
      ],
    }),
  );

  const provider = new WebTracerProvider({
    resource: resourceFromAttributes({
      "service.name": serviceName,
      "service.namespace": serviceNamespace,
      "deployment.environment": import.meta.env.MODE,
    }),
    spanProcessors: [
      new BatchSpanProcessor(
        new OTLPTraceExporter({
          url: `${endpoint}/v1/traces`,
        }),
      ),
    ],
  });

  provider.register({
    contextManager: new ZoneContextManager(),
  });

  registerInstrumentations({
    instrumentations: [
      new DocumentLoadInstrumentation(),
      new FetchInstrumentation({
        propagateTraceHeaderCorsUrls: [
          new RegExp(`^${escapeRegExp(apiBaseUrl)}`),
        ],
      }),
      new XMLHttpRequestInstrumentation(),
      new UserInteractionInstrumentation(),
    ],
  });
}
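The `escapeRegExp` helper matters for `propagateTraceHeaderCorsUrls`: the API base URL contains regex metacharacters (most commonly `.` in the host), so it must be escaped before being embedded in a `RegExp`, or unrelated origins could match the pattern and receive `traceparent` headers. A small sketch of the failure mode, using an assumed base URL:

```typescript
// Same helper as in telemetry.ts: escape regex metacharacters in a literal string.
function escapeRegExp(value: string): string {
  return value.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

const apiBaseUrl = "https://api.example.com"; // assumed value for illustration

const safe = new RegExp(`^${escapeRegExp(apiBaseUrl)}`);
const naive = new RegExp(`^${apiBaseUrl}`); // unescaped "." matches any character

console.log(safe.test("https://api.example.com/v1/metrics")); // true
console.log(safe.test("https://apiXexampleYcom/evil"));       // false
console.log(naive.test("https://apiXexampleYcom/evil"));      // true: the bug escaping prevents
```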
15 frontend/tsconfig.app.json Normal file
@@ -0,0 +1,15 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "lib": ["ES2022", "DOM", "DOM.Iterable"],
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "jsx": "react-jsx",
    "strict": true,
    "skipLibCheck": true,
    "resolveJsonModule": true,
    "noEmit": true,
    "types": ["vite/client"]
  },
  "include": ["src"]
}
7 frontend/tsconfig.json Normal file
@@ -0,0 +1,7 @@
{
  "files": [],
  "references": [
    { "path": "./tsconfig.app.json" },
    { "path": "./tsconfig.node.json" }
  ]
}
10 frontend/tsconfig.node.json Normal file
@@ -0,0 +1,10 @@
{
  "compilerOptions": {
    "composite": true,
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "types": ["node"],
    "skipLibCheck": true
  },
  "include": ["vite.config.ts"]
}
10 frontend/vite.config.ts Normal file
@@ -0,0 +1,10 @@
import react from "@vitejs/plugin-react";
import { defineConfig } from "vite";

export default defineConfig({
  plugins: [react()],
  server: {
    host: "0.0.0.0",
    port: 5173,
  },
});
277 k8s/microservices.yaml Normal file
@@ -0,0 +1,277 @@
apiVersion: v1
kind: Namespace
metadata:
  name: bi-platform
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: bi-platform-config
  namespace: bi-platform
data:
  APP_ENV: "prod"
  LOG_LEVEL: "INFO"
  CORS_ORIGINS: "https://bi.example.com"
  REQUIRE_FRONTEND_AUTH: "true"
  FRONTEND_JWT_ISSUER_URL: "https://idp.example.com/realms/bi"
  FRONTEND_JWT_JWKS_URL: "https://idp.example.com/realms/bi/protocol/openid-connect/certs"
  FRONTEND_JWT_AUDIENCE: "otel-bi-api"
  FRONTEND_JWT_ALGORITHM: "RS256"
  FRONTEND_REQUIRED_SCOPES: "openid profile email"
  FRONTEND_CLOCK_SKEW_SECONDS: "30"
  INTERNAL_SERVICE_AUTH_ENABLED: "true"
  INTERNAL_SERVICE_TOKEN_TTL_SECONDS: "120"
  INTERNAL_SERVICE_TOKEN_AUDIENCE: "bi-internal"
  INTERNAL_SERVICE_ALLOWED_ISSUERS: "api-gateway"
  INTERNAL_TOKEN_CLOCK_SKEW_SECONDS: "15"
  QUERY_SERVICE_URL: "http://bi-query.bi-platform.svc.cluster.local:8000"
  ANALYTICS_SERVICE_URL: "http://analytics.bi-platform.svc.cluster.local:8000"
  PERSISTENCE_SERVICE_URL: "http://persistence.bi-platform.svc.cluster.local:8000"
  OTEL_COLLECTOR_ENDPOINT: "http://alloy.monitoring.svc.cluster.local:4318"
---
apiVersion: v1
kind: Secret
metadata:
  name: bi-platform-secrets
  namespace: bi-platform
type: Opaque
stringData:
  MSSQL_HOST: "mssql.dw.svc.cluster.local"
  MSSQL_PORT: "1433"
  MSSQL_USERNAME: "readonly_user"
  MSSQL_PASSWORD: "readonly_password"
  POSTGRES_HOST: "postgres.app.svc.cluster.local"
  POSTGRES_PORT: "5432"
  POSTGRES_DATABASE: "otel_bi_app"
  POSTGRES_USERNAME: "otel_bi_app"
  POSTGRES_PASSWORD: "otel_bi_app"
  POSTGRES_REQUIRED: "true"
  INTERNAL_SERVICE_SHARED_SECRET: "replace-with-strong-random-secret-min-32-bytes"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: api-gateway
  namespace: bi-platform
spec:
  replicas: 2
  selector:
    matchLabels:
      app: api-gateway
  template:
    metadata:
      labels:
        app: api-gateway
    spec:
      automountServiceAccountToken: false
      containers:
        - name: api-gateway
          image: ghcr.io/your-org/otel-bi-backend:latest
          imagePullPolicy: IfNotPresent
          command:
            [
              "uvicorn",
              "microservices.api_gateway.main:app",
              "--host",
              "0.0.0.0",
              "--port",
              "8000",
            ]
          envFrom:
            - configMapRef:
                name: bi-platform-config
            - secretRef:
                name: bi-platform-secrets
          securityContext:
            allowPrivilegeEscalation: false
            capabilities:
              drop: ["ALL"]
            runAsNonRoot: true
            runAsUser: 10001
            seccompProfile:
              type: RuntimeDefault
          ports:
            - containerPort: 8000
---
apiVersion: v1
kind: Service
metadata:
  name: api-gateway
  namespace: bi-platform
spec:
  selector:
    app: api-gateway
  ports:
    - port: 8000
      targetPort: 8000
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: bi-query
  namespace: bi-platform
spec:
  replicas: 2
  selector:
    matchLabels:
      app: bi-query
  template:
    metadata:
      labels:
        app: bi-query
    spec:
      automountServiceAccountToken: false
      containers:
        - name: bi-query
          image: ghcr.io/your-org/otel-bi-backend:latest
          imagePullPolicy: IfNotPresent
          command:
            [
              "uvicorn",
              "microservices.bi_query.main:app",
              "--host",
              "0.0.0.0",
              "--port",
              "8000",
            ]
          envFrom:
            - configMapRef:
                name: bi-platform-config
            - secretRef:
                name: bi-platform-secrets
          securityContext:
            allowPrivilegeEscalation: false
            capabilities:
              drop: ["ALL"]
            runAsNonRoot: true
            runAsUser: 10001
            seccompProfile:
              type: RuntimeDefault
          ports:
            - containerPort: 8000
---
apiVersion: v1
kind: Service
metadata:
  name: bi-query
  namespace: bi-platform
spec:
  selector:
    app: bi-query
  ports:
    - port: 8000
      targetPort: 8000
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: analytics
  namespace: bi-platform
spec:
  replicas: 2
  selector:
    matchLabels:
      app: analytics
  template:
    metadata:
      labels:
        app: analytics
    spec:
      automountServiceAccountToken: false
      containers:
        - name: analytics
          image: ghcr.io/your-org/otel-bi-backend:latest
          imagePullPolicy: IfNotPresent
          command:
            [
              "uvicorn",
              "microservices.analytics.main:app",
              "--host",
              "0.0.0.0",
              "--port",
              "8000",
            ]
          envFrom:
            - configMapRef:
                name: bi-platform-config
            - secretRef:
                name: bi-platform-secrets
          securityContext:
            allowPrivilegeEscalation: false
            capabilities:
              drop: ["ALL"]
            runAsNonRoot: true
            runAsUser: 10001
            seccompProfile:
              type: RuntimeDefault
          ports:
            - containerPort: 8000
---
apiVersion: v1
kind: Service
metadata:
  name: analytics
  namespace: bi-platform
spec:
  selector:
    app: analytics
  ports:
    - port: 8000
      targetPort: 8000
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: persistence
  namespace: bi-platform
spec:
  replicas: 2
  selector:
    matchLabels:
      app: persistence
  template:
    metadata:
      labels:
        app: persistence
    spec:
      automountServiceAccountToken: false
      containers:
        - name: persistence
          image: ghcr.io/your-org/otel-bi-backend:latest
          imagePullPolicy: IfNotPresent
          command:
            [
              "uvicorn",
              "microservices.persistence.main:app",
              "--host",
              "0.0.0.0",
              "--port",
              "8000",
            ]
          envFrom:
            - configMapRef:
                name: bi-platform-config
            - secretRef:
                name: bi-platform-secrets
          securityContext:
            allowPrivilegeEscalation: false
            capabilities:
              drop: ["ALL"]
            runAsNonRoot: true
            runAsUser: 10001
            seccompProfile:
              type: RuntimeDefault
          ports:
            - containerPort: 8000
---
apiVersion: v1
kind: Service
metadata:
  name: persistence
  namespace: bi-platform
spec:
  selector:
    app: persistence
  ports:
    - port: 8000
      targetPort: 8000
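The `INTERNAL_SERVICE_*` settings (shared secret, 120 s TTL, `bi-internal` audience, `api-gateway` as allowed issuer, 15 s clock skew) imply short-lived signed tokens on service-to-service calls. A hedged TypeScript sketch of one way such a token could work; the backend here is Python, and the token format, field names, and functions below are illustrative assumptions, not the repo's implementation:

```typescript
// Hypothetical HMAC-signed internal service token, sized to the config:
// issuer must be in INTERNAL_SERVICE_ALLOWED_ISSUERS, audience matches
// INTERNAL_SERVICE_TOKEN_AUDIENCE, exp = now + TTL, verified with skew.
import { createHmac, timingSafeEqual } from "node:crypto";

interface InternalClaims {
  iss: string; // e.g. "api-gateway"
  aud: string; // e.g. "bi-internal"
  exp: number; // Unix seconds
}

function sign(claims: InternalClaims, secret: string): string {
  const payload = Buffer.from(JSON.stringify(claims)).toString("base64url");
  const mac = createHmac("sha256", secret).update(payload).digest("base64url");
  return `${payload}.${mac}`;
}

function verify(
  token: string,
  secret: string,
  nowSeconds: number,
  skew = 15,
): InternalClaims | null {
  const [payload, mac] = token.split(".");
  if (!payload || !mac) return null;
  const expected = createHmac("sha256", secret).update(payload).digest("base64url");
  const a = Buffer.from(mac);
  const b = Buffer.from(expected);
  // Constant-time comparison; length check first since timingSafeEqual requires it.
  if (a.length !== b.length || !timingSafeEqual(a, b)) return null;
  const claims = JSON.parse(Buffer.from(payload, "base64url").toString()) as InternalClaims;
  if (claims.exp + skew < nowSeconds) return null; // expired beyond allowed skew
  return claims;
}
```

With a 120 s TTL a stolen token is only briefly useful, and the skew allowance keeps slightly desynchronized pods from rejecting fresh tokens.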