Projects

AR Office application

TYPE Mobile application
TECHNOLOGIES Android, ARToolkit, DLib, GoogleVision, MATLAB/Octave, PostgreSQL+cube, Projective and single-view geometry, Spring Boot
AREAS OF EXPERTISE Augmented reality, computer vision, neural networks
TEAM 1 developer

The project is a client-server app for AR visualization, indoor navigation and face recognition in our company's office. The application enables ISS Art guests to identify an employee by pointing the device (a mobile phone) at their face and then view additional information about them.

The second function provides details of the company's portfolio: the user directs smart glasses (Epson Moverio BT-300 augmented reality glasses) at the items of the "glory alley", the icons of completed projects. Users get information about our projects by pointing the camera at special markers bearing project logos. When the camera locks onto a logo, a colored image of the project name appears. Tapping the logo shows the project's details: name, duration, main technologies, etc.

Challenges
  1. When the app was used with see-through devices like the Epson glasses, the user saw the real object, not an image from a camera. This made it impossible to use the projection matrix provided by ARToolKit.
  2. The sensor data contained noise, so we could not use it for long-term position tracking.
  3. The magnetometer produced data with an offset and a different scale on each axis, which made the readings depend on device orientation. If a user walked through the same place twice in opposite directions, the magnetometer produced different values.
Solutions
  1. We used a calibration procedure to obtain the eye-to-camera relation. Ten points with known coordinates gave us a so-called intrinsic matrix of the eye in the camera coordinate system; this is done separately for the left and right eye.
  2. We used magnetic field values to build a map of our office. As a user walks through it, we track magnetic field changes to reconstruct the path the user takes.
  3. We used a custom calibration method to obtain correction coefficients for the magnetometer offset and per-axis scale.
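Calibrating from a set of points with known coordinates, as in Solution 1, is typically done with a Direct Linear Transform (DLT). The sketch below is an illustrative reconstruction under that assumption, not the project's actual code, and the function names are hypothetical:

```python
import numpy as np

def estimate_projection_matrix(world_pts, image_pts):
    """Estimate a 3x4 projection matrix from 3D-2D point
    correspondences with the Direct Linear Transform (DLT).
    Needs at least 6 points; ten known points, as used here,
    give an overdetermined, more robust fit."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)
    # The least-squares solution of A p = 0 (up to scale) is the right
    # singular vector associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)

def project(P, point3d):
    """Project a 3D point with P and return pixel coordinates."""
    x = P @ np.append(np.asarray(point3d, dtype=float), 1.0)
    return x[:2] / x[2]
```

Running such a procedure once per eye yields the two matrices needed to render a correctly aligned stereo overlay.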
Results
  1. With see-through devices, users can see augmented images perfectly aligned with real objects in stereo view.
  2. We can determine the user position with an accuracy of 1.5-2 m.
  3. Stable, direction-independent values of the magnetic field have been obtained after calibration.
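The direction-independent readings in Result 3 follow from an offset-and-scale correction like the one in Solution 3. A common min/max approach is sketched below; this is a plausible illustration, not the project's actual method, and the function names are hypothetical:

```python
import numpy as np

def calibrate_magnetometer(samples):
    """Estimate a per-axis offset and scale from raw magnetometer
    samples (N x 3) collected while rotating the device through all
    orientations. The offset is the midpoint of the min/max readings
    on each axis; the scale normalizes every axis range to the
    average range."""
    samples = np.asarray(samples, dtype=float)
    lo, hi = samples.min(axis=0), samples.max(axis=0)
    offset = (hi + lo) / 2.0
    half_range = (hi - lo) / 2.0
    scale = half_range.mean() / half_range
    return offset, scale

def correct(raw, offset, scale):
    """Apply the correction coefficients to a raw reading."""
    return (np.asarray(raw, dtype=float) - offset) * scale
```

After correction, the measured field magnitude no longer depends on the direction the user walks, which is what makes a magnetic map of the office reusable for path tracking.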
Application for dynamic compaction control
System for annotating X-ray images and generating pathologies