Autonomy Software Engineer
About UCR
UCR (Under Control Robotics) builds multipurpose robots to support human workers in the world's toughest jobs—turning dangerous work from a necessity into a choice. Our work demands reliability, robustness, and readiness for the unexpected—on time, every time. We're assembling a mission-driven team focused on delivering real impact in heavy industry, from construction and mining to energy. If you're driven to build rugged, reliable products that solve real-world problems, we'd love to talk.
Position Overview
At UCR, building robots is a team sport. As a Robotics Autonomy Engineer, you'll take ownership and lead the development of autonomy systems that power our multipurpose robots across diverse and unstructured environments. You'll design, implement, and optimize cutting-edge localization, mapping, navigation, and SLAM systems—including advanced techniques such as 3D Gaussian Splatting—that enable our robots to perceive, understand, and act in the real world with confidence.
Responsibilities
Develop and maintain real-time mapping, localization, and navigation software for mobile robotic systems
Build scalable SLAM pipelines using a mix of sensors, including LiDAR, cameras, and IMUs
Implement 3D scene representations using cutting-edge techniques such as 3D Gaussian Splatting, NeRFs, and other neural or volumetric methods
Integrate localization and mapping modules with motion planning and control systems
Deploy robust autonomy stacks to on-board compute platforms and validate them in both simulation and real-world testing
Analyze and tune performance of perception and SLAM systems in challenging environments
Collaborate with mechanical, electrical, and software engineers to develop co-designed autonomy solutions
Write clean, modular, production-quality code with thorough documentation and testing
Operate and support robots during field testing and customer deployment
Requirements
4+ years of experience working in robotics, autonomy, or a closely related field
Strong foundation in SLAM, probabilistic localization, 3D reconstruction, and navigation algorithms
Deep experience with C++ and Python, especially in real-time robotics or embedded systems
Experience building and deploying autonomy stacks using frameworks such as ROS or ROS2
Proven ability to develop algorithms for sensor fusion and state estimation (e.g., EKF, UKF, particle filters)
Hands-on experience with real robot systems—ground, legged, or aerial platforms
Familiarity with 3D mapping techniques including voxel grids, mesh reconstruction, and Gaussian Splatting
Demonstrated rapid growth and technical ownership on complex autonomy projects
Ability to prioritize and execute tasks in a fast-paced, dynamic environment
Excellent communication and collaboration skills across disciplines
Nice to Have
Experience with GPU-accelerated vision or perception pipelines (CUDA, TensorRT)
Exposure to deep learning-based SLAM, view synthesis, or scene understanding techniques
Experience with multi-robot SLAM, loop closure, or graph optimization frameworks
Contributions to open-source robotics or perception libraries
Comfort debugging hardware/software integration in field settings
Experience with autonomy in unstructured or GPS-denied environments
Strong understanding of simulation frameworks (e.g., Gazebo, Isaac Sim, Unity Robotics)
To apply, submit your resume here or email people@ucr.bot. To improve your chances of being selected for an interview, we encourage you to include a public portfolio of your most representative work, highlighting your individual contributions and public demonstrations of autonomy or SLAM systems.
- Location: San Francisco, CA, United States
- Salary: $200,000 - $250,000
- Category: IT & Technology