Uwe Korn is a Lead Software/Data Engineer at the data science company QuantCo. His expertise is in building scalable architectures for machine learning services and the teams & culture around them. Nowadays he focuses on the data engineering infrastructure needed to provide the building blocks for bringing machine learning models into production. As part of his work to provide efficient data interchange, he became a core committer to the Apache Parquet, Apache Arrow, and conda-forge projects.
Current Positions
ML & Data Engineering — QuantCo Inc
since April 2019
- Build up a team of data / software / machine learning engineers and establish a software engineering culture in the whole company.
- Enable Machine Learning Engineers / Data Scientists to work more effectively with their stack by improving their work environment and building tools to ease their workflows.
- Support building data pipelines and a general data engineering infrastructure.
- Ensure smooth deployments through a well-set-up packaging infrastructure; this led to heavy involvement in conda & conda-forge.
(Senior) Data Scientist — Blue Yonder
November 2014 - March 2019
- Design, planning, implementation, and operation of data pipelines using Apache Parquet and Dask/distributed.
- Shaping and planning the software architecture roadmap for the Data Engineering stack, and the subsequent project planning together with the team to implement the roadmap elements.
- Participation in open source development and regular speaking at tech conferences, especially in the PyData and Apache Arrow/Parquet communities.
- Data analysis, adoption of the machine learning model, and/or communication with the customer during project and concept phases.
- Organisation of recruiting activities, including company presentations at job fairs.
- Talent development of fellow developers by giving feedback and outlining education and career opportunities; active shaping of Data Engineering as a role.
- In-house consulting to improve technical collaboration across locations.
- Review of open source usage in the company, including licensing.
Core Member — conda-forge
since November 2019
Member of the core team running conda-forge, the community-led distribution for conda packages.
Committer & Member of the PMC — Apache Arrow
since October 2016
Maintenance of the project; community building and code contributions around Parquet integration, packaging setup, and Java interoperability.
Committer & Member of the PMC — Apache Parquet
since September 2016
Building the initial write path to make complete Parquet round trips possible in C++ and Python; maintenance and Apache Arrow integration.
Previous Experience
Undergraduate Research Assistant — Karlsruhe Institute of Technology
September 2011 - August 2013
Algorithm implementation and performance tuning in Scala, C++, and SQL; quality testing and user experiments with Python and Node.js in the research area of Outlier/Graph Mining.
Tutor / Undergraduate Teaching Assistant — Karlsruhe Institute of Technology
April 2011 - July 2011
Tutoring students and correction of weekly assignments for the lecture “Algorithmen 1” (“Algorithms 1”).
October 2010 - February 2011
Tutoring students and correction of weekly assignments for the lecture “Grundbegriffe der Informatik” (“Basic Notions of Computer Science”).
Part-time employee — Fraunhofer ITWM
November 2004 - September 2009
Tasks included the full range of dealing with data, starting with simple data entry; technical improvement of the data entry platform; adjusting code for data preprocessing; and helping to build classifiers that were then deployed into a production environment. Furthermore, I participated in writing image processing software on the CPU and, with the first versions of CUDA, on the GPU. This included experience with the whole software lifecycle, from initial proofs of concept to production-grade libraries and the setup of a matching CI system with performance tests.
Internship — DFKI (German Research Center for Artificial Intelligence)
August 2007
Intern at the Department for Knowledge Management.
Education
MSc Advanced Computing (Machine Learning, Data Mining and High Performance Computing) — University of Bristol
Graduated with Distinction.
Master's thesis: Distributed calculation of similarity measures for very large graphs
Courses included: Uncertainty Modelling for Intelligent Systems, Statistical Pattern Recognition, Learning in Autonomous Systems, Computational Genomics and Bioinformatics Algorithms, Artificial Intelligence and Logic Programming, and Cloud Computing.
BSc Computer Science — Karlsruhe Institute of Technology
Graduated with a final grade of 1.0.
Bachelor's thesis: Parameter-free Outlier-aware Clustering on Attributed Graphs (published as a research paper: Efficient Algorithms for a Robust Modularity-Driven Clustering of Attributed Graphs)
Courses included: Linear Algebra, Analysis, Algorithms & Data Structures, Operating Systems, Markov Chains, Cognitive Systems, Probability Theory, Programming Paradigms, Theoretical Foundations of Computer Science, Data Mining Paradigms and Methods for Complex Datasets, and Algorithms for Planar Graphs.
Voluntary Activities
Hans Dickmann Kolleg (HaDiKo)
2010 - 2013
Member of the house parliament and of the team organising the bar and beverage replenishment; member and spokesperson of the self-organised network team/ISP “HaDiNet”; part of the developer team that built network management software in Python (Django, LDAP, …) that managed finances, contracts, printer accounts, and automated network routing for the 1,000 inhabitants of the dormitory.
Katholische junge Gemeinde Speyer (KjG)
2006 - 2013
Member of the board at diocesan level and part of the leadership team at local and regional level; supervisor and organiser of youth camps and weekly groups; took care of (financial) accounting and the web presence / mail server of the whole organisation.