Skills
· Hands-on working knowledge of large-scale data solutions (for example: data lakes, Delta Lake, data meshes, data lakehouses, data platforms, data streaming solutions…)
· In-depth knowledge of and experience with one or more large-scale distributed technologies, including but not limited to: S3/Parquet, Kafka, Kubernetes, Spark
· Expert in Python and Java, or another language such as Scala or R; Linux/Unix shell scripting, Jinja templates, Puppet scripts, firewall rule configuration
· VM setup and scaling, Kubernetes scaling (pods), managing Docker images with Harbor, pushing images through CI/CD pipelines
· Good knowledge of German is beneficial; excellent command of English is essential
· Knowledge of the financial sector and its products
· Higher education (e.g. a degree from a “Fachhochschule” / university of applied sciences, in a field such as “Wirtschaftsinformatik” / business informatics)