Overview
Expleo is a trusted partner for your innovation journey. As a global engineering, technology and consulting service provider, we are ideally positioned to help you achieve your ambitions and future-proof your business. With a smart blend of bold thinking and reliable execution, we're able to fast-track innovation through each step of your value chain.
We are strategically positioned to build value, with a global footprint across 30 countries. We are as global and local as you need us to be, with best-in-class pan-European technology centres and unique best-shoring capabilities.
We leverage a network of high value-adding affiliates in consulting and industrial excellence, and leading partners across multiple sectors to provide you with the most comprehensive services and solutions in an ever-changing environment.
Responsibilities
Design, build, and maintain scalable data platforms;
Collect, process, and analyze large and complex data sets from various sources;
Develop and implement data processing workflows using frameworks such as Apache Spark and Apache Beam;
Collaborate with cross-functional teams to ensure data accuracy and integrity;
Ensure data security and privacy through proper implementation of access controls and data encryption;
Extract data from various sources, including databases, file systems, and APIs;
Monitor system performance and optimize for high availability and scalability.
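For illustration only (this sketch is not part of the role description, and all names in it are hypothetical): the extract-transform-load cycle running through the responsibilities above can be sketched in plain Python as three small stages.

```python
import csv
import io

# Hypothetical raw extract: order records as CSV text. In practice this
# would come from a database, file system, or API, as listed above.
RAW = """order_id,region,amount
1,EMEA,120.50
2,APAC,80.00
3,EMEA,45.25
"""

def extract(source: str) -> list[dict]:
    """Parse CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> dict[str, float]:
    """Aggregate order amounts per region."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict[str, float]) -> list[str]:
    """Render the aggregate as lines ready to write to a sink."""
    return [f"{region},{amount:.2f}" for region, amount in sorted(totals.items())]

result = load(transform(extract(RAW)))
```

In a production setting the same extract/transform/load shape would typically be expressed as a Spark or Beam pipeline rather than in-memory Python.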
Essential Skills
At least 5 years' experience in a similar role;
Experience with the Azure cloud platform and its services;
Proficiency in programming languages such as Python, Java or Scala;
Good knowledge of Big Data tools (Spark, Kafka, Hadoop, Hive);
Knowledge of data integration and ETL tools (e.g. Talend or Apache-ecosystem tooling);
Strong SQL skills;
Familiarity with Azure Data Factory;
Good knowledge of English.