Celfocus is a European high-tech system integrator, providing professional services focused on creating business value through Analytics and Cognitive solutions – addressing Telecommunications, Energy & Utilities, Financial Services and other markets' strategic opportunities.
Serving Clients in 25+ countries, Celfocus delivers solutions such as accelerating digital network transformation in Autonomous Networks, elevating and monetising business services in B2B2x ecosystems, and providing highly relevant customer experiences through Hyper-personalisation solutions.
Make an impact by working for sectors where technology is the enabler, everything is ground-breaking and there's a constant need to be innovative.
Be part of a team that combines business knowledge, technological edge and design expertise.
Our different backgrounds and know-how are key in developing solutions and experiences for digital clients.
Face challenges and learn other ways of thinking and seeing the world - there's always room for your energy and creativity.
About the role

Analytics Ops (Analytics/Data Operations) is a collaborative approach that combines data engineering, data quality, and DevOps practices to streamline and automate the entire data lifecycle.
The mission of a DataOps professional is to establish efficient processes and frameworks for data ingestion, integration, transformation, and delivery.
They focus on optimizing data workflows, ensuring data quality and governance, and facilitating cross-functional collaboration between data teams and other stakeholders.
As part of your job, you will:
- Set up and maintain the infrastructure necessary for data analytics, including servers, databases, and cloud services;
- Monitor the performance of the analytics infrastructure and tools, identifying and resolving issues as they arise;
- Automate processes wherever possible, including data processing, onboarding, and pipeline orchestration;
- Create and maintain documentation on the analytics infrastructure and tools, including policies, procedures, and best practices.
What are we looking for?
- Proficiency in scripting and automation using tools like Python, Bash, or PowerShell;
- Knowledge of version control systems (e.g., Git) and CI/CD (Continuous Integration/Continuous Deployment) practices;
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes);
- Familiarity with data pipeline and workflow management tools (e.g., Apache Airflow, Luigi);
- Understanding of data quality management and data governance principles;
- Knowledge of cloud platforms and services for data operations (e.g., AWS Glue, Azure Data Factory).
Personal traits:
- Ability to adapt to different contexts, teams and Clients;
- Teamwork skills, but also a sense of autonomy;
- Motivation for international projects, including willingness to travel;
- Willingness to collaborate with other players;
- Strong communication skills.
We want people who like to roll up their sleeves and open their minds.
Believe this is you?
Come join the Team!