Data Engineer with streaming focus (Fashion)
Data & Analytics

Who we are

We are Metyis, a forward-thinking global company that develops and delivers solutions around Big Data, Digital Commerce, Marketing and Design, and provides Advisory services. We have offices in 15 locations with a talent pool of 1,000+ employees and more than 50 nationalities, dedicated to creating long-lasting impact and growth for our business partners and clients.

Together with HUGO BOSS, our esteemed business partner, we have embarked on a joint venture and created the HUGO BOSS Digital Campus, dedicated to increasing the company’s data analytics, eCommerce and technology capabilities and boosting digital sales. The HUGO BOSS Digital Campus employees will help create a state-of-the-art data architecture and infrastructure, deliver advanced business analytics, and develop and enhance HUGO BOSS’ eCommerce platform and services.

This collaborative environment will provide the capabilities required for HUGO BOSS to maximise data usage and support its growth trajectory towards becoming the leading premium tech-driven fashion platform worldwide.

What we offer 

  • Develop your professional career working with one of the major brands in the fashion industry.
  • Interact with senior stakeholders at our clients on a regular basis to drive their business towards impactful change.
  • Work with our business departments to develop solutions for operational and management information needs in the areas of Business Intelligence, Reporting, Planning, Data Warehousing, and Advanced Analytics.
  • Become part of a fast-growing international and diverse team.

What you will do

  • Design, develop, test, and deploy data pipelines using various streaming technologies as part of the data engineering team.
  • Work on a company-wide streaming data processing framework that can be viewed as a software application.
  • Optimize existing code for performance, reliability, and scalability.
  • Debug and troubleshoot issues and provide technical support as needed.
  • Follow best practices and standards for coding, documentation, testing, and security.
  • Mentor and provide technical guidance to other data professionals, fostering a culture of knowledge sharing and continuous learning.
  • Be proactive and willing to work closely with Data Science, DevOps/MLOps, Cloud Infrastructure, and IT-Security colleagues.
  • Research and evaluate new technologies and trends to improve existing software or create new solutions.

What you’ll bring

  • Academic degree in computer science, software engineering, machine learning engineering, or a related field.
  • At least 3 years of professional software development experience with languages and frameworks such as Python, SQL, Scala, Rust, or Spark.
  • Strong knowledge of Apache Kafka and its ecosystem, such as Kafka Connect or REST Proxy.
  • Strong knowledge of stream processing frameworks such as Confluent KSQL, Kafka Streams, Faust, Spark Structured Streaming, Apache Flink, or Samza.
  • Strong knowledge of Azure Event Hubs.
  • Strong knowledge of Databricks Delta Live Tables.
  • Strong knowledge of data formats, schemas, and serialization techniques, such as JSON or Avro.
  • Strong knowledge of software design patterns & principles, functional programming, and object-oriented programming.
  • Strong ability to write clean, maintainable, and scalable code.
  • Knowledge of best practices in data streaming, such as data quality, latency, reliability, and security.
  • Experience with SQL and NoSQL databases such as Azure SQL or MongoDB.
  • Experience with cloud platforms such as Microsoft Azure or GCP.
  • Experience in data engineering or analytics is preferred.
Apply to this position