Cloud-based Simultaneous Localization and Mapping using RGB cameras
Simultaneous Localization and Mapping (SLAM) algorithms are essential for Augmented Reality (AR) applications. Specifically, Visual SLAM can localize a user based on images from a single RGB camera, which makes it useful for mobile AR. However, SLAM algorithms are resource intensive and not well suited to mobile devices. An alternative is cloud-based visual SLAM, where the camera data is sent to high-powered servers, and the calculated SLAM parameters are sent back to the mobile devices for efficient real-time AR. This also enables further intelligent processing on the server side, such as collaborative AR experiences with multiple users and object-level scene understanding.
Camera data streaming and SLAM processing on the server side have already been implemented. The intern will continue the project and will: 1. develop a prototype mobile app that receives algorithm-specific data from the server to localize the user (position and orientation in world coordinates); 2. create an AR prototype on the mobile device using the received data; 3. investigate collaborative AR possibilities with the prototype; 4. investigate object-level scene understanding with the prototype.
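As a rough illustration of step 1, the sketch below parses a pose update received from the server into a position and orientation. The JSON message format, field names, and the `parse_pose` helper are all hypothetical assumptions for illustration; the actual algorithm-specific data the server sends may look quite different.

```python
import json
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple      # (x, y, z) in world coordinates
    orientation: tuple   # unit quaternion (qx, qy, qz, qw)

def parse_pose(message: str) -> Pose:
    """Decode a JSON pose message (hypothetical format) from the server."""
    data = json.loads(message)
    return Pose(tuple(data["position"]), tuple(data["orientation"]))

# Example message in the assumed format
msg = '{"position": [1.0, 0.5, 2.0], "orientation": [0.0, 0.0, 0.0, 1.0]}'
pose = parse_pose(msg)
```

On the mobile side, each such pose would drive the AR rendering loop, with the heavy SLAM computation staying on the server.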
Students should have strong programming experience (a solid understanding of the material covered in COE428, COE328, and COE318; additional programming experience in C++ is preferred). Third-year (or higher) students will be given preference.
Naimul Mefraz Khan : Cloud-based Simultaneous Localization and Mapping using RGB cameras | Thursday March 29th 2018 10:22 AM