FARFETCH exists for the love of fashion. Our mission is to be the global platform for luxury fashion, connecting creators, curators and consumers.
We're a positive platform for good, bringing together an incredible creative community made up of our people, our partners and our customers. This community is at the heart of our business success. We welcome differences, empower individuality and celebrate diverse skills and perspectives, creating an inclusive environment for everyone. We are FARFETCH for All.
TECHNOLOGY We're on a mission to build the technology that powers the global platform for luxury fashion. We operate a modular end-to-end technology platform purpose-built to connect the luxury fashion ecosystem worldwide, addressing complex challenges and enjoying it. We're empowered to break traditions and revolutionise, with the freedom and autonomy to make a difference for our customers all over the world.
PORTO Our Porto office is located in Portugal's vibrant second city, known for its history and its creative yet cosy environment. From Account Management to Technology and Product, whatever your skills are, you'll find your fit here.
THE ROLE We're a Data team that improves data reliability and quality by combining raw information from different sources and developing architectures that enable data extraction and transformation for data modelling. Our aim is to provide an unrivalled customer experience and promote FARFETCH's growth. You will join the Data Engineering team, helping to maintain and improve our data architecture and tools.
WHAT YOU'LL DO
- Contribute to the hiring and training of engineers within the managed team
- Ensure the managed team collaborates with other engineering teams (inside and outside the domain) and adheres to the defined global engineering best practices
- Design and build scalable and reliable data pipelines (ETLs) for our data platform
- Constantly evolve the data models and schema design of our Data Warehouse to support self-service needs
- Work cross-functionally with various teams, creating solutions that handle large volumes of data
- Work with the team to set and maintain standards and development practices
- Be a keen advocate of quality and continuous improvement

WHO YOU ARE
- An experienced professional with a solid technical background in building and maintaining data pipelines using a custom or commercial ETL tool (e.g. SSIS, Talend, Informatica; Airflow is a plus)
- Experienced in working in a Data Warehouse environment with varied forms of data infrastructure, including relational databases, Hadoop and column stores
- An expert in creating and evolving dimensional data models and schema designs to improve the accessibility of data and provide intuitive analytics
- Familiar with cloud environments (e.g. AWS, GCP, Azure)
- An expert in SQL
- Proficient in one of the following programming languages: C#, Java, Python
- Experienced (2+ years) in working with a BI reporting tool (e.g. Tableau, QlikView, Power BI, Looker)
- A proficient practitioner of continuous delivery principles: version control, unit and automated tests
- Fluent in English, both written and spoken
- A strong analytical and problem-solving thinker, able to work in a fast-moving operational environment, enthusiastic and with a positive attitude

EQUAL OPPORTUNITIES STATEMENT
FARFETCH is an equal opportunities employer, ensuring that all applicants are treated equally and fairly throughout our recruitment process.
We are determined that no applicant experiences discrimination on the basis of sex, race, ethnicity, religion or belief, disability, age, gender identity, ancestry, sexual orientation, veteran status, marriage and civil partnership, pregnancy and maternity, or any other basis prohibited by applicable law. We continue to build our consciously inclusive culture as part of our Positively FARFETCH strategy throughout our business, partnerships and communities.