Job Description

At Blue Yonder, we employ state-of-the-art machine learning algorithms to generate hundreds of millions of microdecisions for large retail enterprises every day. Full automation allows us to shape supply chains and set selling prices so that customers never face empty shelves, fewer groceries go to waste, and fashion stores no longer end up with heaps of surplus clothes at the end of the season.

Our retail products run on the Blue Yonder platform, which provides all the services necessary to deliver the best decisions to our customers. Our platform defines standard APIs for customer payload data exchange and offers the computational punch for our data scientists, scaling from a single CPU to hundreds of cores and terabytes of RAM within seconds. All of this happens for all customers on the same shared infrastructure, hosted on Microsoft Azure. As one of only a few companies worldwide, we have solved the tough problem of bringing data science to production while keeping costs and operational overhead under control.

Now part of the JDA family, we continue to relentlessly drive innovation in pursuit of our vision of the autonomous supply chain. Along the way, we have solved, and will continue to solve, plenty of interesting and challenging problems, and we would love to have you on board for that journey.

To support our teams in Karlsruhe and Hamburg, we are looking for a Data Engineer (m/f).
Your tasks

- Work together with our machine learning experts, engineers and project managers to advance our data science products
- Design and develop data-intensive distributed systems in Python, with a focus on data engineering
- Use technologies such as Pandas and Dask, as well as modern distributed SQL query engines like Presto or Apache Impala, in the cloud
- Support the onboarding of new customers with your expertise in organizational and data-intensive tasks
- Work in an agile team with an emphasis on quality, testability and automation

Your experience

- In-depth knowledge of Python, Pandas and their open-source ecosystem
- Expertise in working with SQL
- Interest in data handling with modern tools such as Apache Airflow, Apache Parquet, Dask and Presto
- Knowledge of techniques for data modeling, storage and access
- Passion for software craftsmanship and interest in modern methods such as Kanban, TDD and pair programming

Are you thrilled about the opportunity, but afraid to apply because you don't check all the boxes above? Please don't be shy. We will do our best to hire for potential rather than for your current qualifications.

We are passionate about pushing artificial intelligence forward and building scalable solutions. Joining our team means working with our people to realize creative ideas and seeing how they perform in the resulting solutions. Mutual respect, friendly relationships and results-oriented work are important parts of our environment. We strive to be a family-friendly business and are open to modern working structures. We enjoy delivering the highest-quality work and like clean coding just as much as our social barbecues, contributing to open-source projects, discussing new technologies and spending quality time with our families at home.

We believe that the individuality and diversity of our employees make a decisive contribution to our success. Our hiring process is based on qualifications and career profile.
We welcome applications from all motivated candidates, regardless of age, gender, disability, sexual orientation, origin or religion. Want to join our team? Please send your application by email to email@example.com.