Getting Started
Table of Contents
- Introduction
- Project Structure
- Core Components
- Architecture Overview
- Detailed Component Analysis
- Dependency Analysis
- Performance Considerations
- Troubleshooting Guide
- Conclusion
- Appendices
Introduction
This guide helps you set up a local development environment for the BI Analysis Platform and get productive quickly. It covers prerequisites, step-by-step installation, running services locally, quick-start examples, and troubleshooting tips. The platform integrates Go microservices built with Kratos, a Next.js frontend, a Python-based AI assistant, and containerized deployment via Docker and Kubernetes.
Project Structure
The repository is a monorepo containing multiple services and shared libraries:
- Frontend applications: ui-web (tenant console), ui-web-admin (operations), ui-web-tech, ui-web-top
- Backend services: bi-analysis (core analytics), bi-basic, bi-tenant, bi-sys, bi-cron, bi-server, bi-notify, bi-plan-taoxi
- Shared libraries: bi-common (components like Nacos, logging, gRPC/HTTP servers, Kafka), bi-proto (IDL)
- AI assistant: bi-chat (Python FastAPI with Agentscope)
- Infrastructure: Dockerfiles, Kubernetes manifests, scripts
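An illustrative monorepo layout based on the components above (directory placement of the infrastructure files is an assumption; check the repository root for the actual structure):

```
.
├── ui-web/            # tenant console (Next.js)
├── ui-web-admin/      # operations console
├── ui-web-tech/
├── ui-web-top/
├── bi-analysis/       # core analytics service (Kratos)
├── bi-basic/  bi-tenant/  bi-sys/  bi-cron/  bi-server/  bi-notify/  bi-plan-taoxi/
├── bi-common/         # shared Go libraries (Nacos, logging, gRPC/HTTP servers, Kafka)
├── bi-proto/          # IDL definitions
├── bi-chat/           # Python FastAPI AI assistant (Agentscope)
└── deploy/            # Dockerfiles, Kubernetes manifests, scripts (location assumed)
```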
Core Components
- bi-analysis: Core analytics service built with Kratos, integrating bi-common, OpenAPI generation, and Nacos configuration.
- bi-common: Shared infrastructure components (Nacos, apitypes, logger, gormx, redisx, kafkax).
- ui-web: Tenant console frontend powered by Next.js.
- bi-chat: Python-based AI assistant service (FastAPI) supporting conversational analytics.
Key capabilities:
- Unified API response and error handling via apitypes.
- Dynamic configuration via Nacos with environment-specific YAML files.
- Containerized builds and Kubernetes deployments.
Architecture Overview
High-level runtime architecture for local development:
- Frontend (ui-web) communicates with backend services (bi-analysis, bi-tenant, bi-server).
- bi-analysis loads configuration from Nacos and exposes HTTP/gRPC endpoints.
- bi-chat runs as a separate Python service with vector DB (Milvus), session storage (Redis), and metadata store (PostgreSQL).
- Docker images encapsulate build-time dependencies and runtime binaries; Kubernetes manifests define deployments and ports.
Detailed Component Analysis
Local Setup Prerequisites
- Go 1.25+ (required by bi-analysis and bi-common)
- Node.js and npm (for ui-web)
- Python 3.11+ (for bi-chat)
- Docker (for containerized builds and local testing)
- Optional: kubectl and a local Kubernetes cluster (kind or minikube)
These requirements align with the project’s technology stack and Dockerfiles.
Step-by-Step Installation
1) Prepare the Environment
- Install Go, Node.js, Python, and Docker using your OS package manager.
- Verify versions: go version, node -v, python3 --version.
2) Clone and Initialize Submodules
- Clone the repository and initialize submodules if applicable.
- Navigate to the repository root.
3) Run Frontend (ui-web)
- Change to the ui-web directory.
- Install dependencies and start the dev server.
- Access http://localhost:3000 in your browser.
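A typical command sequence for the step above (the script names follow standard Next.js conventions and are assumptions; check ui-web/package.json for the actual scripts):

```shell
cd ui-web
npm install     # install dependencies from package.json
npm run dev     # start the Next.js dev server on http://localhost:3000
```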
4) Configure Nacos for bi-analysis
- Create two Nacos Data IDs:
- bi-common.yaml (generic configuration)
- bi-analysis.yaml (project-specific overrides)
- Set namespace and group as configured in the local application YAML.
- Confirm connectivity to the Nacos endpoint defined in the local config.
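For reference, the local application YAML might point at Nacos like this (the keys and structure below are illustrative assumptions; match them to the fields the bi-common Nacos component actually reads):

```yaml
# configs/config-dev.yaml -- illustrative; align keys with bi-common's Nacos client
nacos:
  server_addr: 127.0.0.1:8848    # Nacos endpoint
  namespace_id: local-dev        # namespace holding the Data IDs
  group: DEFAULT_GROUP
  data_ids:
    - bi-common.yaml             # generic configuration
    - bi-analysis.yaml           # project-specific overrides
```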
5) Build and Run bi-analysis Locally
- From bi-analysis:
- Initialize dependencies and build the binary.
- Run with the desired environment flag (dev/test/prod).
- Ports exposed:
- HTTP: 10002
- gRPC: 20002
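The steps above can be sketched as the following commands (the binary path and the -env flag name are assumptions; make generate is referenced by the troubleshooting section as the wire-code step):

```shell
cd bi-analysis
make generate                      # produce wire_gen.go and other generated code
go mod tidy                        # resolve module dependencies
go build -o bin/server ./cmd/...   # build the server binary (output path assumed)
./bin/server -env dev              # run with the dev environment flag (flag name assumed)
```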
6) Run AI Assistant (bi-chat) Locally
- Install Python dependencies.
- Start the FastAPI service.
- The AI assistant listens on port 8000.
7) Optional: Build and Run with Docker
- The bi-analysis Dockerfile builds a static Linux binary and copies the configuration files and server binary into a minimal Alpine image.
- The bi-chat Dockerfile installs Python dependencies and runs the FastAPI app module.
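The bi-analysis image can be sketched as a two-stage build like this (base image tags and paths are assumptions; the actual Dockerfile in the repository is authoritative):

```dockerfile
# Stage 1: build a static Linux binary
FROM golang:1.25 AS builder
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 GOOS=linux go build -o /out/server ./cmd/...

# Stage 2: minimal Alpine runtime image
FROM alpine:3.20
COPY --from=builder /out/server /app/server
COPY configs/ /app/configs/
EXPOSE 10002 20002
ENTRYPOINT ["/app/server"]
```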
8) Optional: Deploy to Kubernetes
- bi-analysis deployment defines image, environment variables, ports, and resource requests/limits.
- Ensure Nacos credentials and namespace are configured via ConfigMaps and Secrets.
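An illustrative Deployment fragment tying these pieces together (image name, resource values, and ConfigMap/Secret names are assumptions):

```yaml
# illustrative Deployment fragment for bi-analysis; names and values are assumptions
apiVersion: apps/v1
kind: Deployment
metadata:
  name: bi-analysis
spec:
  replicas: 1
  selector:
    matchLabels: { app: bi-analysis }
  template:
    metadata:
      labels: { app: bi-analysis }
    spec:
      containers:
        - name: bi-analysis
          image: registry.example.com/bi-analysis:latest
          ports:
            - containerPort: 10002   # HTTP
            - containerPort: 20002   # gRPC
          envFrom:
            - configMapRef: { name: bi-analysis-config }   # Nacos address, namespace
            - secretRef: { name: bi-analysis-secrets }     # Nacos credentials
          resources:
            requests: { cpu: 250m, memory: 256Mi }
            limits: { cpu: "1", memory: 512Mi }
```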
Quick Start Examples
Access the Tenant Console
- Start ui-web and open http://localhost:3000.
- Log in and navigate to dashboards and analytics.
Perform a Basic Analytics Query
- Call the bi-analysis HTTP endpoint on port 10002.
- Use the unified response format: { code, message, data }.
- For paginated lists, expect data.total and data.list.
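A minimal client-side sketch of unwrapping the unified response (field names follow the format above; the assumption that code 0 means success is illustrative):

```python
def parse_response(body: dict) -> dict:
    """Unwrap the unified {code, message, data} envelope, raising on errors."""
    if body.get("code") != 0:  # assumption: 0 means success
        raise RuntimeError(f"API error {body['code']}: {body.get('message')}")
    return body["data"]

# Example envelope as a paginated list endpoint on port 10002 might return it.
sample = {
    "code": 0,
    "message": "ok",
    "data": {"total": 42, "list": [{"id": 1}, {"id": 2}]},
}

data = parse_response(sample)
print(data["total"], len(data["list"]))  # → 42 2
```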
Interact with the AI Assistant
- Send requests to the bi-chat service on port 8000.
- The assistant supports conversational analytics and RAG over stored knowledge.
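A sketch of building such a request (the /chat path and the session_id/question fields are hypothetical; consult the FastAPI routes in bi-chat for the real contract):

```python
import json

BASE_URL = "http://localhost:8000"  # bi-chat local port

def build_chat_request(session_id: str, question: str) -> tuple[str, bytes]:
    """Return the URL and JSON body for a conversational-analytics query."""
    url = f"{BASE_URL}/chat"  # hypothetical route name
    body = json.dumps({"session_id": session_id, "question": question}).encode()
    return url, body

url, body = build_chat_request("demo", "Total revenue last month?")
print(url)  # → http://localhost:8000/chat
```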
Dependency Analysis
bi-analysis depends on bi-common and bi-proto, and integrates Kratos for transport and configuration via Nacos.
Performance Considerations
- Use production-like resource requests/limits in Kubernetes deployments.
- Enable OpenAPI generation and keep docs synchronized for efficient API iteration.
- Prefer environment-specific configs (dev/test/prod) to avoid misconfiguration overhead.
Troubleshooting Guide
Common setup issues and resolutions:
- Nacos connectivity failures
- Verify the Nacos address and credentials in the local application YAML.
- Ensure the Data IDs (bi-common.yaml, bi-analysis.yaml) exist in the configured namespace/group.
- Port conflicts
- bi-analysis exposes 10002/20002; ui-web runs on 3000; bi-chat on 8000.
- Adjust ports or stop conflicting services.
- Missing wire_gen.go
- Ensure make generate is run to produce wire code before building.
- Python dependencies
- Rebuild the bi-chat image after updating requirements.txt.
- Docker build failures
- Confirm GOPROXY/GONOPROXY settings and network access to private registries if required.
Conclusion
You now have the essentials to run the BI Analysis Platform locally: frontend, core analytics service, tenant services, and the AI assistant. Use the unified API response format, Nacos configuration, and containerized builds to iterate quickly. Expand into Kubernetes for production-style environments and leverage the shared bi-common components for consistent behavior across services.
Appendices
Appendix A: Environment Variables Reference
- Nacos
- NACOS_SERVER_ADDR, NACOS_CLIENT_NAMESPACE_ID
- GORMX (database)
- GORMX_HOST, GORMX_DATABASE
- Redisx
- REDISX_ADDRS, REDISX_PASSWORD, REDISX_MODE
- Logging
- Aliyun SLS keys if used
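These variables can be collected in an env file for local runs (all values below are placeholders, not defaults from the project):

```shell
# .env.local -- placeholder values for local development
NACOS_SERVER_ADDR=127.0.0.1:8848
NACOS_CLIENT_NAMESPACE_ID=local-dev
GORMX_HOST=127.0.0.1:3306
GORMX_DATABASE=bi_analysis
REDISX_ADDRS=127.0.0.1:6379
REDISX_PASSWORD=changeme
REDISX_MODE=single
```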
Appendix B: API Response Format
- All HTTP responses use a uniform structure with code, message, and data.
- Pagination responses include total, list, page, page_size, and total_pages.
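An illustrative paginated response using the fields above (all values are examples):

```json
{
  "code": 0,
  "message": "ok",
  "data": {
    "total": 125,
    "list": [{ "id": 1 }, { "id": 2 }],
    "page": 1,
    "page_size": 20,
    "total_pages": 7
  }
}
```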