Senior Full-Stack Data Engineer (Fashion)
Data & Analytics
Senior
Porto

Who we are

We are Metyis, a forward-thinking, global company that develops and delivers solutions around Big Data, Digital Commerce, Marketing and Design, and provides Advisory services. We have offices in 15 locations with a talent pool of 1000+ employees and more than 50 nationalities, dedicated to creating long-lasting impact and growth for our business partners and clients.

Together with HUGO BOSS, our esteemed business partner, we have embarked on a joint venture and created the HUGO BOSS Digital Campus, dedicated to increasing the company’s data analytics, eCommerce and technology capabilities and to boosting digital sales. HUGO BOSS Digital Campus employees will help build a state-of-the-art data architecture and infrastructure, deliver advanced business analytics, and develop and enhance HUGO BOSS’ eCommerce platform and services.

This collaborative environment will provide the capabilities required for HUGO BOSS to maximise data usage and support its growth trajectory towards becoming the leading premium tech-driven fashion platform worldwide.

What we offer

  • Develop your professional career working with one of the major brands in the fashion industry

  • Opportunity to accelerate the pace of digitalization & eCommerce growth through advanced technology, business intelligence, and analytics

  • Driving high-impact insights that enhance decision-making across the entire organization

  • Driving brand equity and digital sales through enhanced digital experiences

  • Regular interaction with senior business and eCommerce leaders to drive their business toward impactful change

  • Become part of a fast-growing international and diverse team


What you will do

  • Work, as part of an IT-Analytics team of data engineers, on a company-wide batch & streaming data processing framework that is treated as a software application in its own right

  • Design, develop, test, and deploy software & data pipelines using various batch & streaming technologies, and automate & monitor CI/CD/CT pipelines for an Azure-based Data Platform

  • Optimize existing code for performance, reliability, and scalability

  • Debug and troubleshoot issues and provide technical support as needed

  • Follow best practices and standards for coding, documentation, testing, and security

  • Mentor and provide technical guidance to other data professionals, fostering a culture of knowledge sharing and continuous learning

  • Be proactive and willing to work closely with Data Science, DevOps/MLOps, Cloud Infrastructure, and IT-Security colleagues

  • Research and evaluate new technologies and trends to improve existing software or create new solutions


What you will bring

  • At least 5 years of professional software development experience using languages and frameworks such as Python, SQL, Scala, Rust, or Spark

  • Strong knowledge of cloud technologies such as Microsoft Azure or GCP

  • Strong knowledge of all-in-one analytics platforms such as Databricks or Microsoft Fabric

  • Strong knowledge of data pipeline orchestration platforms such as Azure Data Factory or Control-M

  • Experience in using data warehouse and data lake solutions such as SAP BW and Azure Data Lake

  • Experience with test-driven development as well as with testing & data quality frameworks such as pytest or Great Expectations in Python

  • Experience with SQL & NoSQL databases such as Azure SQL or MongoDB

  • Experience with RESTful APIs and microservice architecture

  • Experience with Git version control

  • Strong knowledge of software design patterns & principles, functional programming, and object-oriented programming

  • Strong ability to write clean, maintainable and scalable code

  • Strong knowledge of Azure Event Hubs as well as Apache Kafka and its ecosystem

  • Strong knowledge of stream processing frameworks such as Confluent KSQL or Spark Structured Streaming

  • Strong knowledge of containerisation and orchestration technologies such as Docker or Kubernetes

  • Experience with data serialization formats such as JSON or YAML

  • Experience with infrastructure-as-code tools such as Terraform

  • Experience with DevOps tools such as Azure DevOps or GitHub

  • Experience with agile methodologies such as Scrum or Kanban (e.g. with tools such as Azure DevOps or Jira/Confluence)

  • Academic degree in computer science, software engineering, machine learning engineering, or a related field

  • Excellent communication, collaboration, and problem-solving skills

In a changing world, diversity and inclusion are core values for team well-being and performance. At Metyis, we want to welcome and retain all talents, regardless of gender, age, origin or sexual orientation, and irrespective of whether or not they are living with a disability, as each of them has their own experience and identity.

Apply to this position