Company Description
Devexperts works with respected financial institutions, delivering products and tailor-made solutions for retail and brokerage houses, exchanges, and buy-side firms. The company focuses on trading platforms and brokerage automation, complex software development projects, market data products, and IT consulting services.
Job Description
We are looking for a Senior Data Engineer with a Java / Scala / Python background to join the Research & Development Team.
Our current project is for a top-5 US retail broker (by number of users). The project covers the trading experience, financial reporting, and risk management.
You will join a cross-functional team that excels at taking features from zero to production.
We expect the Senior Data Engineer to:
1. Develop Data Pipelines:
Design, develop, and maintain robust data pipelines using Java within AWS infrastructure,
Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark (a brief sketch of such a step follows this list),
Utilise Airflow for efficient workflow orchestration in complex data processing tasks,
Ensure fast and interactive querying capabilities through the use of Presto.
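For context on the day-to-day work these points imply, here is a minimal, hypothetical sketch of a pipeline step in PySpark. The bucket paths, column names, and app name are illustrative assumptions, not details of the actual project:

```python
# Hypothetical step: normalise raw trade events from S3 and publish them
# as date-partitioned Parquet. All paths and column names are made up.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("normalise-trades").getOrCreate()

raw = spark.read.json("s3://example-raw/trades/")           # assumed input location

cleaned = (
    raw.dropDuplicates(["trade_id"])                        # assumed unique key
       .withColumn("trade_date", F.to_date("executed_at"))  # assumed timestamp column
       .filter(F.col("quantity") > 0)                       # drop empty fills
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("trade_date")
        .parquet("s3://example-curated/trades/"))           # assumed output location

spark.stop()
```

A job like this would typically be scheduled through Airflow, per the orchestration point above. A minimal DAG might look like the following; the DAG id, schedule, and script location are again assumptions:

```python
# Hypothetical Airflow DAG that submits the PySpark job above once a day.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_trade_normalise",  # illustrative id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    SparkSubmitOperator(
        task_id="normalise_trades",
        application="s3://example-artifacts/normalise_trades.py",  # assumed script path
        conn_id="spark_default",
    )
```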
2. Manage Infrastructure:
Containerise applications using Docker for streamlined deployment and scaling (see the illustration after this list),
Orchestrate and manage containers effectively with Kubernetes in production environments,
Implement infrastructure as code using Terraform for provisioning and managing AWS resources.
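Container and infrastructure duties of this kind are normally expressed declaratively, in Dockerfiles, Kubernetes manifests, and Terraform HCL, rather than in application code. Purely as a Python-flavoured illustration of the build-and-run cycle referenced above, the official docker SDK can script the same steps; the image tag and build path are assumptions:

```python
# Illustration only: driving a Docker build-and-run cycle from Python via
# the official docker SDK (pip install docker). In practice this role
# would use Dockerfiles, kubectl, and Terraform directly.
import docker

client = docker.from_env()

# Build an image from a Dockerfile in the current directory (assumed tag).
image, build_logs = client.images.build(path=".", tag="data-pipeline:latest")

# Run the container detached, standing in for a managed deployment.
container = client.containers.run("data-pipeline:latest", detach=True)
print(container.id)
```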
3. Collaborate and Communicate:
Collaborate with cross-functional teams to understand data requirements and architect scalable solutions aligned with business goals,
Ensure data quality and reliability through robust testing methodologies and monitoring solutions,
Stay updated with emerging technologies and industry trends to continuously enhance the data engineering ecosystem.
Qualifications
Must-have skills:
1. Education and Experience:
Bachelor's degree in Computer Science, Engineering, or related field,
Minimum 5 years of hands-on experience in Java / Scala / Python development, emphasising object-oriented principles.
2. Technical Proficiency:
Proficiency in Apache Spark or PySpark for large-scale data processing,
Experience with Airflow for workflow orchestration in production environments,
Familiarity with Docker for containerisation and Kubernetes for container orchestration,
Experience managing AWS services such as S3, EMR, Glue, Athena, and Redshift,
Strong background in SQL and relational databases.
3. Communication Skills:
Excellent English language communication skills, both verbal and written,
Ability to collaborate effectively with technical and non-technical stakeholders.
Nice-to-have skills:
Experience with streaming platforms such as Kafka for real-time data processing,
Knowledge of Terraform for infrastructure as code implementation in AWS environments.