Data Engineer





We see it as our daily mission to make a significant contribution to greater safety and efficiency on roads and railways. That applies to both passenger and goods transport - all over the world.

Making mobility safe on roads and railways - that is the mission driving us each and every day at Knorr-Bremse. It has been that way for over 110 years. Today the Knorr-Bremse Group, based in Munich, is the world's leading manufacturer of braking systems and a leading supplier of safety-critical sub-systems for rail and commercial vehicles. As an innovator in our fields, we advance developments in mobility and transport technologies.



Daily tasks:

  • Transform business needs into technical solutions and data models
  • Design and build modern data pipelines, data streams, and data service Application Programming Interfaces (APIs)
  • Use proven methods to solve business problems with Azure Data and Analytics services, combined with building data pipelines, data streams, and system integrations
  • Implement ETL routines using Azure Data Factory to load transformed data into the Data Warehouse
  • Transform data into business-usable data by applying data modelling techniques using DBT
  • Design and implement Data Lake lifecycle processes that support effective data provisioning capabilities to feed analytical requirements, ingest IoT data sources, and support Machine Learning requirements
  • Work closely with BI report creators, empowering them to map reports to standardized data models underpinned by the Data Warehouse solution

Your daily toolstack:

  • Snowflake; DBT; Azure Data Factory; GitHub

Expected skill set:

  • Academic degree in computer science or a similar field
  • Very good English for daily use
  • 3 years of experience with business data warehouse and analytics/BI architectures
  • Experience in analytical development leveraging Azure analytic components end-to-end
  • Very strong SQL scripting; experience with data models, Analysis Services, and Big Data
  • Experience with Java, Scala, Python, or R
  • Solid knowledge of Azure analytic components such as Data Lake/Blob Storage, Azure Data Factory, Azure Databricks, Azure SQL, Azure SQL Data Warehouse, DBT, Azure Analysis Services, and Power BI Services
  • Ability to work independently as a self-starter
  • Ability to review a complex problem and present their manager with a summary of options to be considered

What you can expect from us:

  • A job opportunity with a long-term perspective in a stable, international company
  • A modern working environment in the center of Liberec
  • A full-time position with a permanent contract
  • Work-from-home options, 5 weeks of holiday, meal vouchers, a cafeteria benefit system, a pension insurance contribution, and more

Please apply in ENGLISH only! Feel free to contact us for further information.


Please note that only candidates who are eligible to work in the Czech Republic (EU citizenship, permanent residency, or a valid work permit) can be considered for this role.

We do not offer visa sponsorship.


Through a blend of engineering excellence, sustainable development, and social responsibility, our employees help to drive progress at more than 100 sites in 30 countries. We offer you an exciting role with plenty of variety in an international environment, as part of an attractive package that extends from flexible working hour models and professional and personal development opportunities all the way to sports and healthcare programs.


Does this sound like you? Then join us! We look forward to receiving your online application!