Demonstration of a Stereo Visual Odometry Algorithm

I’m pleased to share another demonstration video of our stereo visual odometry algorithm, primarily developed by my student Mohamed Aladem, who is wrapping up his master’s at the University of Michigan – Dearborn. Near-term goals for our lab using this framework are navigating mobile robots (namely an autonomous snowplow for the ION Autonomous Snowplow Competition – see the previous post), navigating a multi-copter, and exploring solutions for automotive driver assistance systems and future autonomous vehicles.

Publications:

  • Mohamed Aladem, Samir Rawashdeh, Nathir Rawashdeh, “Evaluation of a Stereo Visual Odometry Algorithm for Road Vehicle Navigation,” SAE World Congress, April 2017, Detroit, MI
  • Samir Rawashdeh, Mohamed Aladem, “Toward Autonomous Stereo-Vision Control of Micro Aerial Vehicles,” Proceedings of the IEEE National Aerospace and Electronics Conference, July 2016, Dayton, OH
  • Journal article pending.

 

2017 Autonomous Snowplow Competition

At the 2017 ION Autonomous Snowplow Competition, UM-Dearborn’s Yeti took second place ($4,000) out of 8 competing teams, and team Zenith won first place in the new Cooperative Snowplow challenge ($700).
 
Yeti and Zenith are primarily developed by students from the Intelligent Systems Club (ISC), advised by Prof. Rawashdeh. Yeti uses a LIDAR for localization and obstacle avoidance, while Zenith is based on stereo vision.
 
Some photos and videos can be found on the ISC’s Twitter page.

Stereo Visual Odometry

We have some exciting results. Below is a brief demonstration of our recent success with visual odometry. Using a stereo camera pair, we track the camera’s motion and pose over time, while also computing depth (a stereo disparity map) and generating a point cloud.
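For readers curious what those two building blocks look like in practice, here is a minimal Python/OpenCV sketch; it is only an illustration, not our actual pipeline, and the image file names and the reprojection matrix Q are placeholders:

import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # rectified left image (placeholder file)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # rectified right image (placeholder file)

# Semi-global block matching; parameters are illustrative, not tuned values.
matcher = cv2.StereoSGBM_create(minDisparity=0,
                                numDisparities=64,   # must be divisible by 16
                                blockSize=9)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # OpenCV returns fixed-point x16

# Q is the 4x4 reprojection matrix from stereo rectification (identity placeholder here).
Q = np.eye(4, dtype=np.float32)
points = cv2.reprojectImageTo3D(disparity, Q)   # H x W x 3 point cloud
mask = disparity > 0                            # keep only pixels with a valid disparity
cloud = points[mask]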

Publications, currently in preparation and expected this summer, will discuss our approach.

3rd Place at IGVC 2016 (Intelligent Ground Vehicle Competition)

I am happy to share that we had a good run at the Intelligent Ground Vehicle Competition this weekend, which took place on the Oakland University campus.

The vehicle uses a LIDAR system for obstacle avoidance, GPS for navigation, and real-time image processing for lane detection.
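As a rough illustration of the lane detection step, the following Python/OpenCV sketch uses the common Canny-edges-plus-Hough-transform baseline; it is not the team’s actual code, and the frame file name and thresholds are placeholders:

import cv2
import numpy as np

frame = cv2.imread("frame.png")                      # placeholder camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blur = cv2.GaussianBlur(gray, (5, 5), 0)             # suppress noise before edge detection
edges = cv2.Canny(blur, 50, 150)

# Probabilistic Hough transform returns candidate line segments.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)  # overlay detected segments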

Of the more than 30 participating teams, the Dearborn team came in third overall, posting the 3rd-fastest run on the basic navigation course and tying on performance on the advanced navigation course but with a longer time, which earned 2nd place there. Prizes totaled $3,000 and a trophy. Below are some photos and a short video of what the advanced course is like.

The team of primarily undergraduate students from the Intelligent Systems Club did very well and deserves thanks and congratulations. The team included Michael Bowyer, Erik Aitken, Saad Pandit, Cristian Adam, Matthew Abraham, Siddharth Mahimkar, Emmanuel Obi, Brendan Ferracciolo, Angelo Bertani, and others from the club.

 

Presenting at IEEE Southeast Michigan 2015 Fall Conference

We are excited to be presenting our research on Obstacle Avoidance for Drones at the IEEE SEM 2015 Fall Conference, 5-6PM, Nov 17. The talk is titled

“Obstacle Detect, Sense, and Avoid for Unmanned Aerial Systems”

Abstract:

Drones, or Unmanned Aerial Systems (UAS), are expected to be adopted for a wide range of commercial applications and to become an aspect of everyday life. The Federal Aviation Administration (FAA) regulates airspace access for unmanned systems and has put forward a roadmap for UAS adoption for commercial use. It is expected that vehicles flying outside line-of-sight be capable of sensing and avoiding other aircraft and obstacles. Whether the UAS is autonomous or remotely piloted, drones are expected to become capable of safe flight without depending on vulnerable communication links. Therefore, sensor technologies and real-time processing and control approaches are required on board unmanned aircraft to provide situational awareness without depending on remote operation or inter-aircraft communication. This talk overviews some research activities at the University of Michigan Dearborn that address these challenges. We are developing a stereo-vision system for obstacle detection on aerial vehicles. Using stereo video (3D video), a depth map can be generated and used to detect approaching objects that need to be avoided. We are also developing a visual navigation approach to enable drones to navigate in GPS-denied environments, such as between buildings or indoors. In addition, a virtual “bumper” system is being developed to override commands given by an inexperienced pilot in the case of an impending crash. Such a system could help prevent incidents such as the drone crash at the last US Open Tennis Championships.
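To make the virtual “bumper” idea concrete, here is a hypothetical Python sketch that flags an impending collision when a large patch of a metric depth map falls inside a safety distance; the depth source, threshold, and patch-size values are illustrative assumptions, not the system we are building:

import numpy as np

def bumper_triggered(depth_m, min_range_m=1.5, min_pixels=500):
    """Return True if enough valid depth pixels are closer than min_range_m (meters)."""
    valid = np.isfinite(depth_m) & (depth_m > 0)      # ignore invalid or zero depths
    too_close = valid & (depth_m < min_range_m)
    return int(too_close.sum()) >= min_pixels

# Example: a synthetic 480x640 depth map with an object about 1 m away in the center.
depth = np.full((480, 640), 10.0)
depth[200:280, 280:360] = 1.0
if bumper_triggered(depth):
    print("override pilot command: obstacle inside safety bubble")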

IEEE SEM Fall Conf 2015 – Slide Screenshot

Conference Time and Venue:

Tuesday evening, November 17, 2015, from 4:00 PM to 9:00 PM

University of Michigan – Dearborn
Fairlane Center – North Building
19000 Hubbard Drive, Dearborn,
Michigan 48126

More information on the conference agenda can be found on the main page and in this flyer.

 

Motion Capture using Inertial Sensors

We are developing wearable motion capture devices using inertial measurement units (IMUs). Our focus is on shoulder health, specifically monitoring stresses and providing feedback to the user in order to help prevent injury. Another application is in physical therapy guidance. The following two videos illustrate how the device can capture the arm’s orientation as a function of time.

The subject is performing four repetitions of an external rotation with the arm elevated to 90 degrees. The second video shows the three-axis accelerometer and gyroscope measurements, along with a replay of the arm’s orientation (shown as the device’s three orthogonal axes).
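For a rough sense of how such orientation traces can be produced from the raw measurements, below is a minimal complementary-filter sketch in Python that estimates a single elevation angle from accelerometer and gyroscope samples; the sample format and filter gain are illustrative assumptions rather than our device’s firmware:

import math

def complementary_filter(samples, dt=0.01, alpha=0.98):
    """samples: iterable of (ax, ay, az, gx, gy, gz); returns pitch angles in degrees.

    Gyro rates are assumed in deg/s and accelerations in g. The integrated gyro
    term tracks fast motion; the accelerometer tilt estimate corrects slow drift.
    """
    pitch = 0.0
    angles = []
    for ax, ay, az, gx, gy, gz in samples:
        accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        pitch = alpha * (pitch + gy * dt) + (1.0 - alpha) * accel_pitch
        angles.append(pitch)
    return angles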

Presenting at CubeSat Workshop, SmallSat 2015, Logan, UT

I will be presenting at the CubeSat Workshop on August 8th, part of the annual Conference on Small Satellites held on the campus of Utah State University in Logan, UT. The presentation is about our design of a miniature star camera for small satellites, and CubeSats in particular. The goal is to build a camera and develop image processing algorithms to implement a star imager at the scale of modern smartphone cameras, which will enable the use of an array of cameras on a CubeSat.
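Centroiding bright star spots is a standard first step for star imagers, so as a rough illustration (not necessarily our camera’s algorithm) here is a short Python/OpenCV sketch; the image file and threshold are placeholders:

import cv2
import numpy as np

img = cv2.imread("starfield.png", cv2.IMREAD_GRAYSCALE)     # placeholder star-field image
_, binary = cv2.threshold(img, 60, 255, cv2.THRESH_BINARY)  # illustrative threshold

# Connected components give one label per star spot; refine each centroid by
# weighting with the raw pixel intensities for a sub-pixel estimate.
n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
stars = []
for i in range(1, n):                       # label 0 is the background
    ys, xs = np.nonzero(labels == i)
    weights = img[ys, xs].astype(np.float64)
    cx = float((xs * weights).sum() / weights.sum())
    cy = float((ys * weights).sum() / weights.sum())
    stars.append((cx, cy))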

The Workshop and Conference schedules can be found here.

Distributed Star Imaging for CubeSats

3rd Place at Autonomous Aerial Vehicle Competition

The University of Michigan Dearborn team placed 3rd at the Autonomous Aerial Vehicle Competition (AAVC) held on April 28, 2015, in Dayton, OH, winning $2,600 out of eight entries. The competition challenges university teams to develop drones that can fly and navigate indoors to locate and image a target. The vehicle was designed by engineering undergraduate students from the Intelligent Systems Club and professor Samir Rawashdeh’s research group. The students built and tested the vehicle (a quad-copter), developed algorithms for autonomous navigation, and developed image processing algorithms to detect the target.


Real-Time Motion Capture Using Wearable Inertial Sensors

A senior design project I advised on developing real-time motion capture using wearable inertial sensors recently concluded. The work was done by a group of four undergraduate students: Jingwei Luo, Yi Wang, Yongxu Yao, and Jun Yu. The following two videos highlight their results.

The wearable units consist of an STM32L-series ARM Cortex-M3 controller and a single-chip Inertial Measurement Unit (IMU), which contains a set of accelerometers, gyroscopes, and magnetic field sensors along three orthogonal axes.
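As an illustration of what can be computed from those sensor axes, here is a generic tilt-compensated heading sketch in Python; it is a textbook formulation rather than the students’ firmware, and the axis convention is an assumption:

import math

def heading_deg(ax, ay, az, mx, my, mz):
    """Return an approximate yaw angle in degrees from magnetic north, compensating for tilt.

    Axis signs follow a common NED-style convention; the exact formula depends
    on how the IMU is mounted on the arm.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Project the measured magnetic field onto the horizontal plane.
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.degrees(math.atan2(yh, xh)) % 360.0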