Technical Lead Manager - Perception, Self-Driving Systems
About the Role
Applied Intuition builds the software infrastructure for autonomous vehicles across passenger cars, trucking, mining, and defense. Our Self-Driving Systems (SDS) team develops production-grade autonomy stacks deployed on real vehicles across multiple continents, from highway trucking in Japan to urban ADAS in the United States and Europe.
We are looking for a Technical Lead Manager to own the perception model at the core of our autonomy stack. This is a single combined model: shared backbone, multi-task heads, serving every SDS program from the same codebase. The same model runs on a passenger car in Los Angeles, a truck in rural Japan, and an offroad vehicle in the Philippines. Different sensor configurations, different road geometries, different weather distributions, one model. You will lead the team that trains, evaluates, and ships this model, and you will be hands-on in the architecture and training decisions that drive its performance.
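To make the "shared backbone, multi-task heads" pattern concrete, here is a minimal sketch of that model shape. All dimensions, layer choices, and head names below are illustrative assumptions, not Applied Intuition's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

class SharedBackbone:
    """Toy feature extractor standing in for a real image backbone."""
    def __init__(self, in_dim, feat_dim):
        self.w = rng.standard_normal((in_dim, feat_dim)) * 0.01

    def __call__(self, x):
        return np.maximum(x @ self.w, 0.0)  # linear layer + ReLU

class TaskHead:
    """Lightweight per-task head reading the shared features."""
    def __init__(self, feat_dim, out_dim):
        self.w = rng.standard_normal((feat_dim, out_dim)) * 0.01

    def __call__(self, feats):
        return feats @ self.w

backbone = SharedBackbone(in_dim=512, feat_dim=128)
heads = {
    "detection": TaskHead(128, 7),   # e.g. box parameters
    "lane_seg": TaskHead(128, 2),    # e.g. lane / not-lane logits
    "free_space": TaskHead(128, 1),  # e.g. drivable-space logit
}

x = rng.standard_normal((4, 512))   # a batch of 4 toy inputs
feats = backbone(x)                 # backbone runs once per input...
outputs = {name: head(feats) for name, head in heads.items()}  # ...and every head shares its features
```

The design point is that the expensive backbone computation is amortized across tasks, and adding a new vertical or task means adding a head, not forking the model.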
Responsibilities
- Own the perception model end-to-end: architecture, training, evaluation, and deployment. The core challenge is building a model that generalizes across geographies, road types, sensor setups, and environmental conditions without per-vertical forks.
- Drive a camera-first perception strategy. The goal is to progressively reduce dependencies on HD maps and lidar. How to get there is part of the job.
- Lead training and iteration cycles hands-on. You will be in the data, the eval dashboards, and the failure analysis. When perception regresses in a new geography or road type, you own understanding why and fixing it.
- Own model performance across the full deployment surface: highway, urban, residential, ramps, complex intersections, poor weather, hilly terrain. You care about on-vehicle driving outcomes, not just offline metrics.
- Manage the model lifecycle from training through quantization and deployment on embedded compute, including device-specific optimizations. Close the gap between what the model does offboard and what it does on the vehicle.
- Work directly with OEM customer programs to understand sensor configurations, target ODDs, and performance requirements. Translate these into model architecture and data strategy.
- Recruit, develop, and technically lead a team of perception engineers. Set high technical standards and create a culture of rigorous experimentation and measurement.
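The quantization step in the lifecycle above can be illustrated with a textbook post-training scheme: symmetric per-tensor int8 quantization. This is a generic recipe for intuition, not the team's actual toolchain:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: returns (q, scale)."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from int8 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# Rounding bounds the per-weight round-trip error by half a quantization step.
max_err = np.max(np.abs(w - w_hat))
```

The gap this sketch glosses over is exactly the one the role calls out: a model can look fine offboard after quantization yet behave differently on the vehicle, where per-channel scales, activation calibration, and device-specific kernels all matter.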
Requirements
- 5+ years in ML/deep learning for perception or 3D scene understanding. Deep hands-on experience training and deploying vision models at scale.
- 2+ years managing or technically leading a perception team, with the ability to both set direction and contribute to architecture and training decisions directly.
- Experience building production perception systems, especially camera-only or camera-first solutions.
- Track record of deploying perception models to embedded hardware under real-time latency and compute constraints, including device-specific optimizations.
- Strong software engineering in Python and C++, comfortable across the stack from training code to onboard inference integration.
- Experience scaling perception models across multiple geographies, sensor setups, or vehicle platforms.
Nice to Have
- Deep familiarity with transformer-based architectures for 3D perception, BEV representations, multi-task learning, and dense prediction.
- Familiarity with occupancy-based scene representations, sparse query-based architectures, or temporal aggregation approaches.
- Experience reducing or removing map dependencies in perception systems.
- Background in autolabel pipelines, data quality monitoring, or data flywheel design for perception.
- Experience with closed-loop simulation for perception model evaluation (neural sim, log sim, scenario-based testing).
- Experience at an AV company that has shipped perception to production.
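The BEV and occupancy representations mentioned above share a simple core idea: project scene geometry into a top-down grid around the ego vehicle. A toy sketch, with grid extent and resolution chosen arbitrarily for illustration:

```python
import numpy as np

def points_to_bev_occupancy(points, x_range=(-50.0, 50.0),
                            y_range=(-50.0, 50.0), cell=0.5):
    """Scatter (N, 3) ego-frame points into a 2D bird's-eye-view occupancy grid."""
    nx = int((x_range[1] - x_range[0]) / cell)
    ny = int((y_range[1] - y_range[0]) / cell)
    grid = np.zeros((ny, nx), dtype=bool)

    # Convert metric x/y coordinates to integer cell indices.
    ix = ((points[:, 0] - x_range[0]) / cell).astype(int)
    iy = ((points[:, 1] - y_range[0]) / cell).astype(int)

    # Drop points that fall outside the grid extent.
    keep = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)
    grid[iy[keep], ix[keep]] = True
    return grid

pts = np.array([
    [0.0, 0.0, 1.2],    # at the ego origin
    [10.0, -5.0, 0.3],  # ahead and to the right
    [999.0, 0.0, 0.0],  # out of range, discarded
])
bev = points_to_bev_occupancy(pts)
```

Production BEV models replace this hand-written scatter with learned projections from camera features, but the output space, a fixed metric grid around the ego vehicle, is the same, which is what makes it a convenient shared representation for multi-task heads.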
Compensation
Compensation at Applied Intuition for eligible roles includes base salary, equity, and benefits. Base salary is a single component of the total compensation package, which may also include equity in the form of options and/or restricted stock units, comprehensive health, dental, vision, life and disability insurance coverage, 401k retirement benefits with employer match, learning and wellness stipends, and paid time off. Note that benefits are subject to change and may vary based on jurisdiction of employment.
Applied Intuition pay ranges reflect the minimum and maximum intended target base salary for new hire salaries for the position. The actual base salary offered to a successful candidate will additionally be influenced by a variety of factors including experience, credentials & certifications, educational attainment, skill level requirements, interview performance, and the level and scope of the position.
Please reference the job posting’s subtitle for where this position will be located. For pay transparency purposes, the base salary range for this full-time position in the location listed is: $231,900 - $298,100 USD annually.
Listing Details
- Posted: May 11, 2026