Devexperts works with respected financial institutions, delivering products and tailor-made solutions for retail and brokerage houses, exchanges, and buy-side firms. The company focuses on trading platforms and brokerage automation, complex software development projects, market data products, and IT consulting services.
Job Description
We are looking for a Senior Data Engineer with a Java / Scala / Python background to join the Research & Development Team.
Our current project is for a Top-5 US retail broker (by number of users). The project covers trading experience, financial reporting, and risk management.
You will join a cross-functional team that excels at taking features from zero to production.
We expect the Senior Data Engineer to:
1. Build Data Pipelines:
Design, develop, and maintain robust data pipelines using Java within AWS infrastructure,
Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark,
Utilise Airflow for efficient workflow orchestration in complex data processing tasks,
Ensure fast, interactive querying capabilities using Presto.
2. Manage Infrastructure:
Containerise applications using Docker for streamlined deployment and scaling,
Orchestrate and manage containers effectively with Kubernetes in production environments,
Implement infrastructure as code using Terraform for provisioning and managing AWS resources.
3. Collaborate and Communicate:
Collaborate with cross-functional teams to understand data requirements and architect scalable solutions aligned with business goals,
Ensure data quality and reliability through robust testing methodologies and monitoring solutions,
Stay updated with emerging technologies and industry trends to continuously enhance the data engineering ecosystem.
Qualifications
Must-have skills:
1. Education and Experience:
Bachelor's degree in Computer Science, Engineering, or related field,
Minimum of 5 years of hands-on experience in Java / Scala / Python development, with an emphasis on object-oriented principles.
2. Technical Proficiency:
Proficiency in Apache Spark or PySpark for large-scale data processing,
Experience with Airflow for workflow orchestration in production environments,
Familiarity with Docker for containerisation and Kubernetes for container orchestration,
Experience managing AWS services such as S3, EMR, Glue, Athena, and Redshift,
Strong background in SQL and relational databases.
3. Communication Skills:
Excellent English language communication skills, both verbal and written,
Ability to collaborate effectively with technical and non-technical stakeholders.
Nice-to-have skills:
Experience with streaming platforms such as Kafka for real-time data processing,
Knowledge of Terraform for infrastructure as code implementation in AWS environments.
Life in Devexperts
We will only achieve our mission if we live our culture. We start with becoming learners in all things—having a growth mindset. Then we apply that mindset to learning about our customers, being diverse and inclusive, working together as one, and—ultimately—making a difference in the world.