Senior Data Engineer, Virtual Insurance

AIFT Group
Apply Now

Job Description

[Job Overview]
We are looking for an experienced Senior Data Engineer to join our engineering team and play a key role in building and scaling our enterprise data platform. You will design, develop, and maintain high-quality data warehouses and data-driven applications that power analytics, reconciliation, and business decision-making across the organization.
This role requires strong expertise in modern data architectures, pipeline engineering, and data quality management. The ideal candidate combines hands-on technical capability with a deep commitment to reliability, scalability, and governance in a regulated environment.

[Responsibilities]
· Data operations: own day-to-day operations of data platforms and pipelines, including capacity, stability, upgrades, deployments, and recovery drills, to sustain high availability and low latency.
· Data collection: design/manage multi-source ingestion (exchanges, internal and external systems), protocol parsing, and robust retry mechanisms.
· Develop rule-based and statistical data quality checks (completeness, uniqueness, time alignment, anomaly detection, error handling).
· Implement automated remediation, reconciliation workflows, and historical backfilling.
· Establish monitoring and alerting frameworks to ensure trusted, production-grade datasets.
· End-to-end pipelines: plan and maintain scalable ETL/ELT, including scheduling, caching, partitioning, modeling, schema evolution, and lineage, to support both batch and real-time streaming.
· Enforce data access controls, encryption, auditing, and classification to comply with internal policies and external regulatory requirements (including PII management).
· Apply Infrastructure-as-Code, data versioning, data tests, and CI/CD to improve predictability and reduce manual risk.
· Contribute to GenAI- and LLM-powered data applications embedded in enterprise analytics, reconciliation, and internal productivity use cases.
· Partner with analytics and product teams to operationalize AI-driven data solutions.

Qualifications

[Requirements]
· Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field.
· 5+ years of experience in data engineering, data platform architecture, or AI/ML engineering.
· Strong experience with modern cloud data platforms (e.g., Snowflake, Databricks, BigQuery, Redshift).
· Hands-on experience building BI data foundations and supporting GenAI / LLM architectures.
· Proficiency in SQL and workflow orchestration tools (e.g., Airflow), streaming platforms (e.g., Kafka), and pipeline design best practices.
· Solid understanding of data warehouse development lifecycles and dimensional modeling concepts.
· Familiarity with GitLab and CI/CD pipelines.
· Strong debugging, performance tuning, and problem-solving skills.
· Working knowledge of data governance, lineage, privacy, and security frameworks.

Remote Work

Partially remote interviews
Partially remote work

Employee Benefits

Statutory Benefits

Labor insurance, National Health Insurance, annual leave, labor pension plan, marriage leave

Other Benefits

Work well, rest well

  • Annual leave from day one: 15 days in the first year (prorated by start date)
  • 5 days of fully paid sick leave per year; female employees also receive 3 days of fully paid menstrual leave

Grow together, keep improving

  • Subsidies for conferences and external training (full-time employees)
  • Certification subsidies (full-time employees)
  • Study groups on diverse topics such as frontend, backend, SRE, and blockchain (open to all staff)

Work hard, live fully

  • Health checkup subsidy (full-time employees)
  • Club subsidies: sports clubs, board games, video games, and the "what are we doing this week" club
  • Regularly restocked snack and drink cabinets, an espresso machine, and a sparkling water machine
  • Comfortable open-plan office, a 5-minute walk from MRT Taipei 101 Station
  • Flexible working hours and hybrid remote work

Salary Range

Negotiable (regular monthly salary of NT$40,000 or above)