Job Responsibilities

  • Design, build, and maintain data pipelines as the primary responsibility
  • Ensure and sustain data quality
  • Maintain, expand and improve our ETL processes and computational infrastructure
  • Lead initiatives related to building, maintaining, and orchestrating all components of the data platform
  • Work with data from various sources (internal and external), processing and transforming it for other teams

Requirements

Required Qualifications:

  • Degree in Computer Science or related technical field
  • Experience with data pipeline and workflow management tools
  • Familiarity with data monitoring systems
  • Familiarity with cloud platform services (GCP, AWS)
  • Hands-on experience with container orchestration using Kubernetes and Docker

Preferred Qualifications:

  • Passion for DataOps
  • Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
  • Experience configuring and executing automated tests with Continuous Integration and Continuous Delivery (CI/CD) tools
  • Experience with data streaming platforms (Spark, Kafka, etc.)
  • Experience designing highly scalable and available applications

Benefits

。 A snack cabinet you can raid anytime, and a fridge stocked with your favorite drinks
。 Annual leave better than the Labor Standards Act: 9 days prorated in your first year, then 10 / 14 / 15 days, because life outside work matters too
。 One day off during your birthday month, plus a birthday bonus!
。 A regular monthly Happy Hour (escape rooms, laser tag, cycling... and more fun activities waiting for you!)
。 One flexible work-from-home day every two weeks, because we know engineers need it!
。 We believe our people are our best investment, so we provide training subsidies and encourage everyone to keep learning new skills!
。 New plans and new ideas come up every day here; with ability and determination, you will grow quickly along with the company!
。 The most passionate, warm, and responsible group of colleagues you could ask for 🧡

Salary Range

NT$ 50,000 - 70,000 (monthly)