Experience:
Hands-on Big Data experience with common open source components (S3, Hive, Spark, Trino, MinIO, Kubernetes, Kafka)
Experience in stakeholder management in heterogeneous business/technology organizations
Experience in banking or financial services, including handling sensitive data across regions
Experience in large data migration projects involving on-premises data lakes
Hands-on experience integrating Data Science Workbench platforms (e.g., KNIME, Cloudera, Dataiku)
Track record in Agile project management and methods (e.g., Scrum, SAFe)
Skills:
Knowledge of reference architectures, especially concerning integrated, data-driven landscapes and solutions
Expert SQL skills, preferably in mixed environments (i.e., classic DWH and distributed systems)
Hands-on automation and troubleshooting experience in Python, using Jupyter Notebooks or common IDEs
Data preparation for reporting/analytics and visualization tools (e.g., Tableau, Power BI, or Python-based tools)
Applying a data quality framework within the architecture
Good knowledge of German is beneficial, excellent command of English is essential
Higher education (e.g., a degree from a university of applied sciences ("Fachhochschule"), ideally in business informatics ("Wirtschaftsinformatik"))