Company Overview
We are a technology-driven firm specializing in the development and implementation of quantitative trading strategies. Leveraging our proprietary platform, we provide data-driven insights to institutional investors and commercial hedgers across futures and foreign exchange markets. Our focus on innovation and data analysis allows us to deliver cutting-edge solutions that complement traditional investment processes.
Please note that NO SPONSORSHIP is available for this position.
This is a fully on-site position.
Position Overview
As a Sr. Data Engineer, you will be responsible for building and optimizing data processing pipelines, managing cloud infrastructure, and designing system architecture. This position offers an exciting opportunity to work at the forefront of cloud technologies and big data analytics in the financial services industry.
Responsibilities
- Design, develop, and deploy cloud-based data solutions using platforms such as AWS, Azure, or GCP.
- Build and optimize data pipelines to extract, transform, and load (ETL) large volumes of structured and unstructured data from various sources.
- Deploy and maintain workflow orchestration tools such as Apache Airflow (a minimal pipeline sketch follows this list).
- Deploy and maintain containerized applications using tools such as Docker and Kubernetes.
- Design, implement, and maintain data storage solutions, ensuring scalability, performance, and reliability.
- Automate and orchestrate cloud infrastructure using Infrastructure-as-Code (IaC) tools like Terraform.
- Collaborate with cross-functional teams to understand data requirements, architect solutions, and deliver actionable insights to stakeholders.
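To give a flavor of the pipeline and orchestration work described above, here is a minimal sketch of a daily ETL workflow expressed as an Airflow DAG. This is a hypothetical illustration, not code from our platform: the DAG id, task names, and stub functions are assumptions, and it targets Airflow 2.4+.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw records from a source system (stub for illustration)."""
    pass


def transform():
    """Clean and reshape the extracted records (stub for illustration)."""
    pass


def load():
    """Write the transformed records to the warehouse (stub for illustration)."""
    pass


# `schedule` requires Airflow 2.4+; older 2.x versions use `schedule_interval`.
with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the three stages in sequence: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```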
Requirements
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Cloud Engineer, Data Engineer, or similar role, with a strong focus on building and managing cloud-based data solutions.
- Proficiency in cloud platforms such as AWS, Azure, or GCP, with hands-on experience deploying and managing cloud services, including compute, storage, networking, and security.
- Proficiency in Python, ideally with experience building data pipelines and working with distributed computing frameworks like Apache Spark (see the sketch after this list).
- Strong understanding of database technologies (relational and NoSQL), data modeling, and SQL query optimization.
- Experience with containerization and orchestration technologies such as Docker, Kubernetes, or ECS.
- Familiarity with DevOps practices, CI/CD pipelines, and version control systems.
- Excellent problem-solving skills, attention to detail, and ability to work effectively in a fast-paced, collaborative environment.
- Strong communication skills, with the ability to effectively articulate technical concepts to both technical and non-technical stakeholders.
- Ability to drive accountability, initiative, and continuous improvement for individual and team performance.
- Demonstrated leadership skills, elevating team members through mentoring and knowledge sharing.
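As a concrete example of the Python and Spark proficiency described above, here is a minimal PySpark sketch that rolls raw trade records up into daily notional volume per symbol. The bucket paths, column names, and schema are hypothetical assumptions for illustration, not details of our actual systems.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-trade-rollup").getOrCreate()

# Source path and columns (executed_at, symbol, price, quantity) are assumed.
trades = spark.read.parquet("s3://example-bucket/raw/trades/")

# Aggregate notional volume (price * quantity) per symbol per trading day.
daily = (
    trades
    .withColumn("trade_date", F.to_date("executed_at"))
    .groupBy("trade_date", "symbol")
    .agg(F.sum(F.col("price") * F.col("quantity")).alias("notional"))
)

# Partition output by date so downstream readers can prune efficiently.
daily.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-bucket/curated/daily_notional/"
)
```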
Benefits
- Competitive salary and benefits package.
- Generous PTO and flexibility.
- Opportunity for professional growth and development in a dynamic, fast-paced industry.
- Exposure to cutting-edge technologies and methodologies in quantitative finance.
- Collaborative work environment with a focus on teamwork, innovation, and continuous improvement.