Sensor fusion algorithms improve the quality of position, orientation, and pose estimates by combining the outputs of multiple sensors. By fusing data from multiple sensors, the strengths of each sensor modality can compensate for the shortcomings of the others. In addition, systems that perform sensing and perception must track a large number of objects to maintain complete situational awareness.
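To illustrate the core idea, here is a minimal sketch of fusing two noisy measurements of the same quantity using inverse-variance weighting (equivalent to a one-step, one-dimensional Kalman update). The sensor names and noise values are illustrative assumptions, not content from the webinar:

```python
def fuse(z1, var1, z2, var2):
    """Combine two measurements of the same quantity.

    Each measurement is weighted by the other's noise variance, so the
    less noisy sensor dominates; the fused variance is smaller than
    either input variance.
    """
    w1 = var2 / (var1 + var2)            # weight toward the less noisy sensor
    w2 = var1 / (var1 + var2)
    fused = w1 * z1 + w2 * z2
    fused_var = (var1 * var2) / (var1 + var2)
    return fused, fused_var

# Hypothetical example: a slow but unbiased position fix (variance 4.0)
# fused with a faster, tighter estimate (variance 1.0) of the same 1-D position.
pos, var = fuse(10.2, 4.0, 9.8, 1.0)
print(round(pos, 2), round(var, 2))  # → 9.88 0.8
```

The fused estimate lands closer to the lower-variance measurement, and the combined variance (0.8) is below both inputs, which is the sense in which fusion compensates for individual sensor weaknesses.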
In this webinar, you will learn about algorithms and tools to design, simulate, and analyze systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. The webinar will include several reference examples that provide a starting point for airborne, ground-based, shipborne, and underwater sensing systems.
Please allow approximately 45 minutes for the presentation and Q&A session. We will be recording this webinar, so if you can't make the live broadcast, register and we will send you a link to watch it on demand.