Databridge - Data Pipeline System

Databridge is a flexible data pipeline system for processing and transferring data between various sources and destinations. It is designed to run on Kubernetes and supports multiple data processing pipelines.

Features

  • DBF to PostgreSQL: Import data from DBF files to PostgreSQL
  • CSV Export: Export data from PostgreSQL to CSV files
  • Kubernetes Native: Designed to run as Kubernetes Jobs
  • ZFS Storage: Supports ZFS persistent storage
  • Parameterized Pipelines: Flexible configuration via environment variables (see the sketch below)
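
Each pipeline run is submitted as a Kubernetes Job whose container is configured through environment variables. The manifest below is only a minimal sketch of that pattern; the image name, environment variable names, and claim name are illustrative assumptions, not the project's actual values.

    apiVersion: batch/v1
    kind: Job
    metadata:
      name: dbf-import                      # hypothetical job name
    spec:
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: databridge
              image: databridge:latest      # assumed image name
              env:
                - name: PIPELINE            # assumed variable: which pipeline to run
                  value: dbf-to-postgres
                - name: PG_DSN              # assumed variable: target database connection
                  value: postgresql://user:password@postgres:5432/databridge
              volumeMounts:
                - name: data
                  mountPath: /data
          volumes:
            - name: data
              persistentVolumeClaim:
                claimName: databridge-data  # assumed PVC created by k8s/pvc.yaml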

Getting Started

Prerequisites

  • Kubernetes cluster
  • ZFS storage provisioner
  • PostgreSQL database

Installation

  1. Deploy Storage Infrastructure:
    kubectl apply -f k8s/pv.yaml
    kubectl apply -f k8s/pvc.yaml
    kubectl apply -f k8s/rbac.yaml
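
  The claim created in this step is what the pipeline Jobs mount for their working data. As a rough illustration only, a claim bound to a ZFS provisioner can look like the following; the claim name, storage class name, and size are assumptions, not the actual contents of k8s/pvc.yaml.

    apiVersion: v1
    kind: PersistentVolumeClaim
    metadata:
      name: databridge-data            # assumed claim name
    spec:
      accessModes:
        - ReadWriteOnce
      storageClassName: zfs            # assumed ZFS storage class name
      resources:
        requests:
          storage: 10Gi                # assumed size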