Data Engineer - Databricks

Job Details

Job Title: Data Engineer - Databricks
Location: Lisboa
Work Regime: Hybrid

Step into a dynamic ecosystem where the future of business is created and lived every day. Be part of it! At LUZA Group, passion, perseverance, and the drive to excel define our path to success.
Founded in 2006, we are a Portuguese multinational with over 1,200 talented professionals and a turnover surpassing €30M. With a strong presence in key markets—Portugal, Spain, Morocco, Brazil, Mexico, the United States, and China—we deliver innovative solutions in engineering, IT, design, consulting, Industry 4.0, training, and recruitment. Our people and their talent power everything we do.
This is a moment of growth and opportunity. The future belongs to visionary minds. Will you join us?

About the Position

As a Data Engineer, you will be responsible for understanding business and technological challenges, developing data pipelines that tackle those challenges, and ensuring their smooth deployment.
You will also be responsible for applying standard industry and company good practices, and for the application and evolution of our various patterns.
Responsibilities

Project Understanding and Communication
- Understand problems from a user perspective and communicate clearly to pin down the issue.
- Ensure you clearly understand the architecture provided by the Data Architect.
- Communicate with the Data Architect and your peers about the technical solution you're developing, and with the Project Manager in charge of the project you're working on.

Development
- Write and communicate on new or updated interface contracts.
- Apply a strong understanding of data warehousing concepts, data lakes, ETL/ELT processes, and data modeling.
- Develop data pipelines based on the defined architecture.
- Ensure standard good practices are applied.
- Deploy the requested infrastructure, particularly using Terraform.
- Perform peer reviews and ask your peers to review your code when merging a new version of the codebase.

Testing
- Define tests with your Project Manager, based on the functional and technical requirements of the pipeline you're developing.
- Perform those tests and communicate regularly on the results.
- Regularly summarize the results of your tests in a dedicated document.

Deployments
- Present the completed development to the Data Architect in charge of the architecture, and to the Lead DataOps, through our Deployment Reviews.
- Track and communicate any potential errors throughout the period of active monitoring that follows a deployment.
- Ensure diligent application of the deployment process, logging, and monitoring strategy.
Requirements
- Proficiency with PySpark and Spark SQL for data processing.
- Experience with Databricks using Unity Catalog.
- Knowledge of Delta Live Tables (DLT) for automated ETL and workflow orchestration in Databricks.
- Familiarity with Azure Data Lake Storage.
- Experience with orchestration tools (e.g., Apache Airflow or similar) for building and scheduling ETL/ELT pipelines.
- Knowledge of data partitioning and data lifecycle management on cloud-based storage.
- Familiarity with implementing data security and data privacy practices in a cloud environment.
- Terraform: at least one year of experience with Terraform, plus knowledge of GitOps good practices.

Additional Knowledge and Experience That Are a Plus
- Databricks Asset Bundles
- Kubernetes
- Apache Kafka
- Vault

Personal Traits
- Ability to adapt to different contexts, teams, and stakeholders
- Proactive ownership of the projects delivered by your team
- A continuous eye for improving existing processes
- Clear communication and collaboration skills
- Excellent analytical and problem-solving ability
- Demonstrated ability to manage your stress in an operational environment
- Demonstrated ability to quickly understand the business and technical requirements of a data pipeline to be developed, and to challenge potential misconceptions
- Experience working on a Data Platform at industrial scale

#VisionaryFuture - Build the future, join our living ecosystem!
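To illustrate the Terraform and Azure Data Lake Storage experience the role calls for, here is a minimal sketch of provisioning an ADLS Gen2 storage account with the azurerm provider. All names (storage account, resource group, region) are hypothetical placeholders, not details from this posting.

```hcl
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

provider "azurerm" {
  features {}
}

# A storage account with hierarchical namespace enabled, which is what
# makes it Azure Data Lake Storage Gen2. Names below are hypothetical.
resource "azurerm_storage_account" "datalake" {
  name                     = "examplelakestore" # hypothetical
  resource_group_name      = "example-rg"       # hypothetical
  location                 = "westeurope"
  account_tier             = "Standard"
  account_replication_type = "LRS"
  is_hns_enabled           = true # enables the ADLS Gen2 hierarchical namespace
}
```

In a GitOps workflow, a change like this would be reviewed in a pull request and applied by a pipeline running `terraform plan` and `terraform apply`, rather than from a developer's machine.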


Nominal Salary: To be agreed

Source: Grabsjobs_Co

