GYTPOL, a first-of-its-kind platform that automates and error-proofs configuration security for connected devices, is looking to hire a Senior Data Engineer to serve as our go-to authority on data architecture and data engineering.
As our first hire in this role, the selected candidate will bring a passion for learning cutting-edge technologies and will be tasked with building a production-ready, large-scale data pipeline.
Responsibilities:
- Maintain ETL data pipelines
- Construct SQL queries through an ORM, manually optimizing them when needed
- Handle upgrades and data migrations
- Maintain Airflow
- Troubleshoot issues in production
- Plan and execute architecture changes
- Note: this role does not include BI responsibilities
Requirements:
- 5+ years building data pipelines
- Fluency in Python
- Experience in writing raw SQL queries and optimizing ORM-emitted queries
- Familiarity with the data engineering tech stack: Airflow, Spark, Kafka, SQS, RDS, dbt, etc.
- Experience working with/in cloud environments
- Comfortable owning projects end to end — taking initiatives from idea to production
- Prior hands-on experience building large-scale data pipelines
- Experience working with containers and Kubernetes
- A self-starter with excellent problem-solving skills
- Ability to work both independently and collaboratively as a highly motivated team player
- Comfortable working in a remote or hybrid capacity
- Strong written English communication skills, suited to asynchronous collaboration