Algorithm Engineer (Multi-Sensor Fusion & SLAM)

Location: United States of America
Contact name: Sam Jacobs

Contact email: sam@deepabacus.com
Job ref: 45
Published: about 1 month ago

Company

  • Global disruptor in the autonomy space, raising the industry standard for drones

  • Products have received a multitude of prestigious awards

  • Growing rapidly, with the opportunity to gain meaningful equity in the lead-up to the IPO phase

What you'll do

  • Develop and implement SLAM algorithms for the tightly coupled fusion of multi-source information.

  • Design advanced multi-sensor fusion SLAM algorithms and systems for autonomous drones. Sensor modalities include camera, IMU, radar, lidar, GPS, prior maps, and more.

  • Research and evaluate different candidate algorithms to determine the best option based on computational requirements of the system, accuracy of the results, and applicability of the algorithm to the specific use case.

  • Document all feature designs.

  • Implement these algorithms in C++ or other systems programming languages and deploy them onto different robotic drone platforms.

  • Perform rigorous testing to ensure implemented code meets all functional requirements.

What you'll need to succeed

Must have: 

  • Master's degree or above and 3+ years of experience in SLAM or multi-sensor fusion algorithms.

  • More than 2 years of experience developing localization and mapping algorithms; visual SLAM, VIO, or lidar localization and mapping experience is preferred.

  • Familiar with one or more feature extraction algorithms (e.g., SIFT, ORB), experienced with mapping algorithms, and well versed in multi-view camera geometry.

  • Familiar with the basic theory of state estimation, with a deep understanding of commonly used nonlinear optimization methods (such as Gauss-Newton and Levenberg-Marquardt) and filters (such as the Extended Kalman Filter), and able to design suitable algorithms for different scenarios and requirements.

  • Have a solid foundation in probability theory, linear algebra, and spatial geometry, and an in-depth understanding of matrix factorization algorithms, linear system solvers, Lie groups/algebras, and transformations between spatial coordinate systems.

  • Familiar with optimization libraries such as Ceres and g2o, used from C++ and/or Python (a minimal example follows this list).

  • Familiar with classic calibration algorithms, camera models, and tools such as Zhang's method and Kalibr.
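
The nonlinear-optimization and library requirements above (Gauss-Newton/Levenberg-Marquardt, Ceres, g2o) amount to formulating and solving nonlinear least-squares problems. The snippet below is a minimal, hypothetical sketch using the public Ceres Solver API to minimize a single-parameter residual; the cost functor and values are illustrative placeholders, not part of this role's actual codebase.

    // Minimal Ceres sketch: solve min_x 0.5 * (10 - x)^2 with the same
    // Gauss-Newton / Levenberg-Marquardt machinery used in SLAM back-ends.
    #include <ceres/ceres.h>
    #include <iostream>

    struct CostFunctor {
      template <typename T>
      bool operator()(const T* const x, T* residual) const {
        residual[0] = T(10.0) - x[0];  // residual drives x toward 10
        return true;
      }
    };

    int main() {
      double x = 0.5;  // initial estimate
      ceres::Problem problem;
      problem.AddResidualBlock(
          new ceres::AutoDiffCostFunction<CostFunctor, 1, 1>(new CostFunctor),
          nullptr, &x);

      ceres::Solver::Options options;
      ceres::Solver::Summary summary;
      ceres::Solve(options, &problem, &summary);
      std::cout << "x converged to " << x << std::endl;  // expect ~10
      return 0;
    }

In a real SLAM back-end the parameters would be poses and landmarks and each residual a reprojection or IMU pre-integration factor, but the structure of the solve is the same.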

Nice to have:

  • Familiar with commonly used visual SLAM/VIO algorithms, such as MSCKF, ORB-SLAM, etc.

  • Experience with multi-sensor fusion algorithms combining inertial navigation, GPS, vision, lidar, and other sensors (a minimal filtering sketch follows this list).
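
As a loose illustration of the sensor-fusion experience described in the last bullet, the sketch below runs one predict/update cycle of a textbook linear Kalman filter (assuming Eigen is available), fusing a noisy GPS-style position measurement into a constant-velocity state. All matrices, noise values, and names are assumptions made for illustration only, not part of this role's actual stack.

    // One Kalman filter predict/update step for a [position, velocity] state.
    #include <Eigen/Dense>
    #include <iostream>

    int main() {
      Eigen::Vector2d x(0.0, 1.0);                      // initial state estimate
      Eigen::Matrix2d P = Eigen::Matrix2d::Identity();  // state covariance

      const double dt = 0.1;
      Eigen::Matrix2d F;                                // constant-velocity transition
      F << 1.0, dt,
           0.0, 1.0;
      Eigen::Matrix2d Q = 0.01 * Eigen::Matrix2d::Identity();  // process noise

      Eigen::RowVector2d H;                             // measure position only
      H << 1.0, 0.0;
      const double R = 0.5;                             // GPS-like measurement noise variance
      const double z = 0.12;                            // one noisy position measurement

      // Predict.
      x = F * x;
      P = F * P * F.transpose() + Q;

      // Update.
      const double y = z - (H * x).eval()(0);                       // innovation
      const double S = (H * P * H.transpose()).eval()(0) + R;       // innovation covariance
      Eigen::Vector2d K = P * H.transpose() / S;                    // Kalman gain
      x = x + K * y;
      P = (Eigen::Matrix2d::Identity() - K * H) * P;

      std::cout << "fused position: " << x(0) << ", velocity: " << x(1) << std::endl;
      return 0;
    }

A production fusion stack would extend this to nonlinear models (EKF/MSCKF style) with IMU propagation and multiple measurement sources, but the predict/update pattern is the same.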