About Luma Financial Technologies
Founded in 2018, Luma Financial Technologies ("Luma") has pioneered a cutting-edge fintech software platform that has been adopted by broker/dealer firms, RIA offices, and private banks around the world. By using Luma, institutional and retail investors have a fully customizable, independent, buy-side technology platform that helps financial teams more efficiently learn about, research, purchase, and manage alternative investments as well as annuities. Luma gives these users the ability to oversee the full, end-to-end process lifecycle by offering a suite of solutions. These include education resources and training materials; creation and pricing of custom structured products; electronic order entry; and post-trade management. By prioritizing transparency and ease of use, Luma is a multi-issuer, multi-wholesaler, and multi-product option that advisors can utilize to best meet their clients' specific portfolio needs. Headquartered in Cincinnati, OH, Luma also has offices in New York, NY, Zurich, Switzerland, and Miami, FL. For more information, please visit Luma's website.
This will be a hybrid role out of Lisbon, Portugal
About the role
This role focuses on cross-platform data reliability and on accurate analytics and reporting, which requires:
Standardization of core data across modules, like product, quote, and order details.
ETL pipelines to Snowflake to build a golden source of truth.
Maintain data consistency between modules.
Data governance to ensure compliance and controls.
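The standardization step above can be sketched in Python. This is an illustrative example only: the module names, field mappings, and record shapes below are hypothetical placeholders, not Luma's actual schemas.

```python
# Hypothetical per-module field mappings onto one canonical schema.
# Real product/quote/order schemas would differ; this shows the pattern only.
FIELD_MAP = {
    "product": {"prod_id": "id", "prod_name": "name", "issue_dt": "as_of"},
    "quote":   {"quote_id": "id", "instrument": "name", "quoted_at": "as_of"},
    "order":   {"order_id": "id", "security": "name", "trade_date": "as_of"},
}

def standardize(record: dict, module: str) -> dict:
    """Map a module-specific record onto the shared canonical schema."""
    mapping = FIELD_MAP[module]
    out = {canonical: record[source] for source, canonical in mapping.items()}
    out["source_module"] = module  # keep provenance for lineage/governance checks
    return out

# Differently shaped module records converge on one schema,
# ready for a downstream ETL load into a warehouse such as Snowflake.
rows = [
    standardize({"prod_id": "P1", "prod_name": "Note A", "issue_dt": "2024-01-02"}, "product"),
    standardize({"quote_id": "Q9", "instrument": "Note A", "quoted_at": "2024-01-03"}, "quote"),
]
assert all(set(r) == {"id", "name", "as_of", "source_module"} for r in rows)
```

In practice the canonical records would feed a staging table loaded into Snowflake, which then serves as the golden source of truth described above.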
What you'll do
Ensuring Data Quality and Governance: Establishing policies and procedures for data quality, data management, and data governance. This includes setting standards for data entry, storage, and retrieval to ensure data consistency and accuracy.
Documentation: Creating and maintaining documentation for data architecture, data models, and other relevant processes to ensure that the data systems can be understood and maintained by other team members.
Collaboration with Stakeholders: Working closely with business leaders, data scientists, and other stakeholders to understand their data needs and ensure that data architecture aligns with business goals.
Performance Tuning: Monitoring and optimizing the performance of data systems. This includes identifying performance issues, developing strategies to address them, and implementing those strategies.
Designing Data Architecture: Creating and managing the structure of the organization's data systems. This includes defining how data is stored, consumed, integrated, and managed across different systems and applications.
Qualifications
Skills needed:
Understands data structures and different types of databases.
Can collaborate with different teams to standardize the data presented to users, understanding difficult challenges and building solutions.
Scripting to create ETL pipelines (Python).
Experience with Snowflake, including identifying platform features we should be using.
Requirements:
A bachelor's degree in a related field, such as computer science, information technology, or data science.
Proven experience (8+ years) in designing and implementing data architectures, with a focus on data lakes.