Motion Capture using Inertial Sensors

We are developing wearable motion capture devices using inertial measurement units (IMUs). Our focus is on shoulder health, specifically monitoring stresses and providing feedback to the user to help prevent injury. Another application is in physical therapy guidance. The following two videos illustrate how the device can capture the arm’s orientation as a function of time.

In the first video, the subject performs four repetitions of an external rotation with the arm elevated to 90 degrees. The second video shows the three-axis accelerometer and gyroscope measurements, along with a replay of the arm’s orientation (shown as the device’s three orthogonal axes).

Presenting at CubeSat Workshop, SmallSat 2015, Logan, UT

I will be presenting at the CubeSat Workshop on August 8th, part of the annual conference on Small Satellites on the campus of Utah State University in Logan, UT. The presentation is about our design of a miniature star camera for small satellites, and CubeSats in particular. The goal is to build a camera and develop image processing algorithms to implement a star imager at the scale of modern smartphone cameras, which will enable the use of an array of cameras on a CubeSat.

The Workshop and Conference schedules can be found here.

Distributed Star Imaging for CubeSats

3rd Place at Autonomous Aerial Vehicle Competition

The University of Michigan Dearborn team placed 3rd and won $2,600, out of eight entries, at the Autonomous Aerial Vehicle Competition (AAVC) held on April 28, 2015, in Dayton, OH. The competition challenges university teams to develop drones that can fly and navigate indoors to locate and image a target. The vehicle was designed by engineering undergraduate students, consisting of members of the Intelligent Systems Club and Professor Samir Rawashdeh’s research group. The students built and tested the vehicle (a quad-copter), developed algorithms for autonomous navigation, and developed image processing algorithms to detect the target.


Real-Time Motion Capture Using Wearable Inertial Sensors

A senior design project I advised on real-time motion capture using wearable inertial sensors recently concluded. The work was done by a group of four undergraduate students: Jingwei Luo, Yi Wang, Yongxu Yao, and Jun Yu. The following two videos highlight their results.

The wearable units consist of an STM32L-series ARM Cortex-M3 microcontroller and a single-chip Inertial Measurement Unit (IMU), which contains a set of accelerometers, gyroscopes, and magnetic field sensors along three orthogonal axes.
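A common way to combine these sensors into an orientation estimate is a complementary filter: integrate the gyroscope (accurate short-term, but drifts) and correct it with the gravity direction from the accelerometer (noisy, but drift-free). The sketch below illustrates the idea for a single pitch angle; it is a minimal illustration of the technique, not the students' actual implementation.

```python
import math

def complementary_filter(pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Fuse gyro and accelerometer readings into one pitch estimate (radians).

    pitch      -- previous pitch estimate
    gyro_rate  -- angular rate about the pitch axis (rad/s)
    accel_y/z  -- accelerometer components (any consistent unit)
    alpha      -- weight on the gyro integral vs. the accel angle
    """
    gyro_pitch = pitch + gyro_rate * dt          # integrate the gyro (drifts over time)
    accel_pitch = math.atan2(accel_y, accel_z)   # gravity direction (noisy, drift-free)
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

A full three-axis implementation would apply the same high-pass/low-pass split to a quaternion or rotation matrix, with the magnetometer correcting heading drift.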


Projects in Embedded Systems Design Course (Winter 2015)

ECE473 Embedded Systems Design for Winter 2015 has concluded recently. In the course, we work with ARM Cortex-M4F microcontrollers, mainly using the TI Tiva C Launchpad. The following are some of the final projects student teams have designed and implemented (in about 3.5 weeks).

Real-Time Vehicle Data Display via Bluetooth Communication with OBD-II Port

Digital Laser Harp Musical Synthesizer and Remote DJ Sound Bar Controller

Embedded Encryption Module

Swipe Gesture Maze

CubeSat Star Imaging

We have received an award from the NASA Michigan Space Grant Consortium to develop a star imaging approach for CubeSats using an array of miniature cameras.

Attitude determination for small spacecraft in the 1-5 kilogram range is one of the major technological challenges limiting their utility for a variety of missions. In prior work, we have developed a visual approach for attitude propagation. By tracking the motion of stars in a camera’s field of view, the rotation of the spacecraft can be found in three degrees of freedom. We refer to the approach as a stellar gyroscope. The proposed work builds on the prior success and findings to pursue a promising new topology. Essentially, we will miniaturize the sensor nodes and lenses to design a camera in the size range of modern smartphone cameras capable of star imaging while utilizing the stellar gyroscope algorithm’s noise tolerance in post-processing. This will allow small spacecraft to incorporate up to one camera on each side if needed, with one centralized image processing subsystem.
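The core geometric step in this kind of approach, finding a rotation from matched star directions in two frames, can be sketched with the standard Kabsch/SVD fit. This is a generic illustration of that step, not the stellar gyroscope algorithm itself, which additionally handles sensor noise and false star matches.

```python
import numpy as np

def rotation_from_star_pairs(v_prev, v_curr):
    """Estimate the rotation R such that v_curr[i] ~= R @ v_prev[i],
    given matched unit vectors toward the same stars in two successive
    frames (one star per row). Standard Kabsch/SVD fit."""
    H = v_prev.T @ v_curr                      # 3x3 cross-covariance of the matches
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection solution
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T
```

Chaining these frame-to-frame rotations propagates the spacecraft's attitude in three degrees of freedom without a rate gyro.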

MSGC Star Imaging


Obstacle "Sense and Avoid" using Stereo Vision for Unmanned Aerial Systems

We have received a small award to pursue Stereo Vision on board small unmanned aircraft (aka drones, quad-copters, multi-copters). Obstacle sense and avoid (SAA) on board aerial vehicles is a key technology that needs to be addressed in order to meet the FAA safety requirements for future integration of drones into the civil airspace. Visual approaches, such as stereo vision, can play an important role in meeting these requirements.
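The basic principle behind stereo depth perception is that a nearby obstacle shifts more between the left and right images than a distant one. For a rectified camera pair, depth follows directly from that shift (the disparity). The sketch below states the textbook relation; the numbers are illustrative, not our camera's calibration.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a matched point from a rectified stereo pair:
    Z = f * B / d, with f the focal length in pixels, B the camera
    baseline in meters, and d the disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or invalid match")
    return focal_px * baseline_m / disparity_px

# Example: f = 700 px, 10 cm baseline, 35 px disparity -> obstacle at 2 m
```

In practice a dense disparity map is computed by block matching along epipolar lines, and the resulting depth map feeds the avoidance planner.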

We will be working with the 3D Robotics X8 copter and a set of embedded cameras. Students interested in being involved, please contact me.

Illustration of depth perception using stereo cameras
3D Robotics X8 copter

Projects in Embedded Systems Design Course (Fall 2014)

ECE473 Embedded Systems Design for Fall 2014 has concluded recently. In the course, we work with ARM Cortex-M4F microcontrollers, mainly using the TI Tiva C Launchpad. The following are some of the final projects student teams have designed and implemented (in about 3 weeks).

Motorcycle Tilt Meter and Warning System
Entails interfacing a 3-axis accelerometer to measure tilt and a circular LED array, and designing a wheel RPM sensor to measure motorcycle velocity. Software and modeling challenges include identifying the safe tilt angle as a function of velocity.

Project by: Jason Learst and Alfred Kishek
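The velocity-dependent part of the modeling can be grounded in the textbook steady-turn relation tan(θ) = v²/(g·r): for a given turn radius, higher speed demands a larger lean angle. The sketch below illustrates that relation; it is a simplified physics model, not the team's actual warning logic.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def equilibrium_lean_angle(speed_mps, turn_radius_m):
    """Lean angle (degrees) for a steady turn: tan(theta) = v^2 / (g * r).
    This is the angle at which gravity balances the centripetal demand
    of the turn; leaning much past it risks losing traction."""
    return math.degrees(math.atan2(speed_mps ** 2, G * turn_radius_m))
```

A warning system would compare the accelerometer-measured tilt against this speed-dependent threshold (minus a safety margin) and light the LED array accordingly.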

2048 Game
Implementing the popular 2048 game on the embedded hardware. Entails interfacing a graphics LCD display, a matrix keypad, and developing the game engine.

Project by: Chuanzhi Yi and Hongsheng Wang

Location Reporting and Recording for High Altitude Balloons
Entails interfacing a GPS receiver, an SD Card, and a GSM modem. File System implementation is a major software component.

Project by: Gabriel Church, Nadeem Kizi, Felipe Marliere, Michael Azar

KySat-2 secondary model selected for deployment from International Space Station

NASA has awarded payload flight opportunities for research and technology development onboard the International Space Station to academic institutions across the U.S. Among the selected projects is the launch and deployment of the KySat-2 secondary model from the ISS. The primary model KySat-2 was launched by NASA on November 19, 2013 out of Wallops Island, VA.

The research experiment includes several systems and experiments designed by Dr. Rawashdeh at the Space Systems Laboratory at the University of Kentucky. Specifically, a flight test of the Stellar Gyroscope concept, and analysis of attitude dynamics using the SNAP simulation tool. The Rawashdeh group at UM-Dearborn will continue to support the mission, mainly in analyzing the experiment data. More on the KySat-2 Mission can be found here.

The awards are through NASA’s Experimental Program to Stimulate Competitive Research (EPSCoR). The official press release can be found here, with abstracts here.

Abbreviated abstract:

The primary mission is to test a new method of attitude determination for small spacecraft called the stellar gyroscope, which estimates attitude changes by analyzing the relative motion of stars between successive image frames, lowering the computational and power requirements necessary to propagate attitude changes. Launch from the ISS will allow characterization of the stellar gyroscope hardware and verification of the sensor's sensitivity for star imaging, as well as of the image processing required on-orbit. Additionally, ejection from the ISS altitude will allow analysis of the ejection dynamics of the spacecraft using the Smart Nanosatellite Attitude Propagator (SNAP) tool to characterize atmospheric drag for Low Earth Orbit (LEO) CubeSats.