Entering students should have the following knowledge:
Advanced knowledge of an object-oriented programming language, preferably C++
Intermediate Linear Algebra
Basic Linux command-line skills
As a Sensor Fusion Engineer, you’ll be equipped to bring value to a wide array of industries and be eligible for many roles.
Your opportunities might include roles such as:
Sensor Fusion Engineer
Automated Vehicle Engineer
Self-Driving Car Engineer
Object Tracking Engineer
System Integration Engineer
If you’re interested in learning how to work with lidar, radar, and camera data and fuse them together, this program is right for you.
Sensors and sensor data are used in applications ranging from cell phones to robots and self-driving cars, giving you a wide range of fields you could enter or advance a career in after this program.
In this Nanodegree program, you will work with camera images, radar signatures, and lidar point clouds to detect and track vehicles and pedestrians. By graduation, you will have an impressive portfolio of projects to demonstrate your skills to employers.
Learn to detect obstacles in lidar point clouds through clustering and segmentation, apply thresholds and filters to radar data in order to accurately track objects, and augment your perception by projecting camera images into three dimensions and fusing these projections with other sensor data. Combine this sensor data with Kalman filters to perceive the world around a vehicle and track objects over time.
- What They Need: Advanced C++
- What They Will Learn/Use: Lidar, Radar, the Udacity Simulator
• Lidar Obstacle Detection – Detect other cars on the road using raw lidar data from Udacity’s real self-driving car, Carla! Implement custom RANSAC and Euclidean clustering algorithms.
• Radar Obstacle Detection – Calibrate, threshold, and filter real radar data to detect obstacles.
• Camera and Lidar Fusion – Detect and track objects from the benchmark KITTI dataset. Classify those objects and project them into three dimensions. Fuse those projections together with lidar data to create 3D objects to track over time.
• Unscented Kalman Filters – Code an Unscented Kalman Filter in C++ to track highly non-linear pedestrian and bicycle motion.
- Duration 4 months
- Skill level All levels
- Assessments: Yes