We are seeking an experienced Low-Level Performance Software Engineer with a focus on high-performance computing (HPC). This role requires expertise in optimizing software performance across all layers, from low-level compiler development to high-level frameworks like Apache Spark. The ideal candidate will have strong skills in C++, Java, and x86 assembly language, along with a deep understanding of CPU memory allocation and big data technologies.
Key Responsibilities:
- Develop and optimize software for HPC environments.
- Work on the full software stack, from compiler optimizations to high-level performance tuning in Spark.
- Implement efficient memory management and CPU resource allocation strategies.
- Write performance-critical code using x86 assembly and C++.
- Collaborate with data engineering teams to ensure integration of big data platforms with low-latency software.
- Troubleshoot and debug performance issues across software and hardware.
- Implement and optimize data processing algorithms in Java and C++.
- Develop code generation for custom DSLs targeting SQL operations.
- Establish benchmarking standards for performance assessments.
- Drive greenfield development at the intersection of big data and compilers.
Key Requirements:
- 5+ years of software development experience.
- 2+ years of experience with C++ (focus on performance optimization).
- 1+ year of experience with Java in a performance-sensitive environment.
- Strong understanding of memory allocation, CPU cache utilization, and multi-threading.
- Proficiency in x86 assembly language for performance tuning.
- Experience with compilers and hardware optimization techniques.
- Solid knowledge of data structures and algorithms.
- English proficiency at upper-intermediate level or higher.
Nice to have:
- Experience with Spark or other big data technologies (Hadoop, Flink, etc.).
This is a unique opportunity for a skilled engineer passionate about HPC and software optimization to work on cutting-edge projects at the intersection of big data and compilers.