|JuliaCortRecruiting: Search Results|
|Job Title:||Compiler Engineer|
|Perm or Contract:||Permanent|
|Job Description:|| Neural Nets Compiler Engineer:|
Develop a deep learning compiler stack that interfaces with frameworks such as Caffe2/PyTorch, TensorFlow, etc., and converts neural nets (CNNs/RNNs) into internal representations suitable for optimization.
Develop new optimization techniques and algorithms to efficiently map CNNs onto a wide range of Tensilica Xtensa processors and specialized hardware.
Implement state of the art code generation (source-to-source as well as binary)
Develop supporting data compression techniques, quantization algorithms, tensor sparsity enhancements, network pruning, etc.
Devise multiprocessor/multicore partitioning and scheduling strategies
Develop complex programs to validate the functionality and performance of the CNN application programming kit
Help in authoring and reviewing product documentation
Assist the Tensilica application engineering team in supporting customers of the product (some amount of direct customer interaction may be required).
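The quantization work mentioned above typically means mapping floating-point weights and activations to low-precision integers. As an illustration only (not this team's actual implementation), a minimal symmetric per-tensor int8 quantizer can be sketched in a few lines; function names and the simple per-tensor scheme are assumptions for the example:

```python
def quantize_int8(values):
    """Symmetric per-tensor int8 quantization: map floats onto [-127, 127].

    Illustrative sketch only; production quantizers also handle per-channel
    scales, zero points, and calibration over representative data.
    """
    max_abs = max(abs(v) for v in values)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the int8 codes."""
    return [qi * scale for qi in q]

# Round-trip a tiny weight vector; per-element error is bounded by scale/2.
weights = [-0.5, 0.0, 0.25, 1.0]
q, s = quantize_int8(weights)
recovered = dequantize(q, s)
```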
Required and desired qualifications:
3-5+ years of experience working on a production compiler.
A solid grasp of advanced compiler construction, target-independent optimizations and analyses, and code-generation fundamentals is a must.
Expertise in software development, test, debug and release required.
Strong C++ is a must; Python is mandatory but less critical.
Knowledge of and experience with LLVM compiler stack is very desirable (other state-of-the-art compilers qualify too).
Experience in the high- to intermediate-level optimization space (loop optimization, polyhedral models, IR construction/translation/lowering techniques) is a big plus.
Prior work with CNNs and familiarity with deep learning frameworks (TensorFlow, Caffe/Caffe2, etc.) is a strong plus.
Familiarity with state-of-the-art deep learning compilation approaches (XLA, Glow, etc.) is a huge plus.
Familiarity with various deep learning networks and their applications (Classification/Segmentation/Object Detection/RNNs) is a plus.
Knowledge of neural net exchange formats (ONNX, NNEF) is a bonus.
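The IR construction/lowering experience asked for above refers to turning a high-level program representation into a simpler form a backend can schedule and emit code for. As a hedged, toy-scale sketch (the tuple-based AST and naming scheme are invented for illustration, not any Tensilica or LLVM API), lowering an expression tree into three-address IR looks like this:

```python
import itertools

def lower(node, code, counter):
    """Recursively lower an AST node into three-address instructions.

    A node is either a leaf name (str) or a tuple (op, lhs, rhs).
    Each non-leaf node becomes one instruction writing a fresh temporary.
    """
    if isinstance(node, str):       # leaf: a named input value
        return node
    op, lhs, rhs = node
    a = lower(lhs, code, counter)   # operands are lowered first,
    b = lower(rhs, code, counter)   # giving post-order instruction emission
    tmp = f"t{next(counter)}"
    code.append((tmp, op, a, b))    # three-address form: dest, op, src1, src2
    return tmp

def lower_expr(ast):
    """Lower a whole expression; return (result name, instruction list)."""
    code, counter = [], itertools.count()
    result = lower(ast, code, counter)
    return result, code

# Lower (x + y) * z into two three-address instructions.
res, code = lower_expr(("mul", ("add", "x", "y"), "z"))
# code == [("t0", "add", "x", "y"), ("t1", "mul", "t0", "z")]
```

Real compiler IRs add types, basic blocks, and SSA form, but the core construct-and-lower step follows this shape.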