Work Arrangement: 3 days onsite
We are seeking an experienced Cloud Data Engineer with expertise in Kafka, Snowflake, and MongoDB to design and manage robust, scalable data pipelines in a cloud environment. The role involves building CI/CD pipelines that automate and streamline data workflows while ensuring data integrity and security across test and production environments.
Responsibilities:
- Build and maintain data pipelines for Kafka streaming, Snowflake warehousing, and MongoDB storage.
- Develop and scale CI/CD processes, automating data delivery and quality control.
- Establish and govern standards and best practices for CI/CD and data engineering systems.
- Integrate security controls into the pipeline, including static and dynamic code analysis and container image scanning.
- Collaborate with development and QA teams to ensure quality data solutions.
- Work closely with IT security to align with security and compliance standards.
- Ensure system resiliency and support business continuity planning (BCP) scenarios.
Key Technologies:
- Data Stack: Kafka, Snowflake, MongoDB
- CI/CD Tools: GitHub, Azure DevOps, Harness
- Cloud & Automation: AWS (EC2, Fargate, MSK), HashiCorp Terraform and Vault
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years in a cloud data engineering role with a focus on Kafka, Snowflake, and MongoDB.
- Strong experience in CI/CD and cloud security practices.
- Preferred certifications: AWS Certified Solutions Architect or similar.
This is an opportunity to drive innovation in cloud data engineering within a collaborative and technology-forward environment.