Smart Wheelchair

Goal: Implement an Unscented Kalman Filter (UKF) algorithm for accurate estimation of Caster Wheel Orientations (CWOs) and pose of a robotic wheelchair

Project advisors – Prof. Brenna Argall, Dr. Jarvis Schultz

Project is based in the assistive & rehabilitation robotics laboratory (argallab) located within the Rehabilitation Institute of Chicago (RIC)

Project Objectives:

  1. To study existing code structure and implement a wall-following behavior
  2. To design & simulate a 3D model of new wheelchair in ROS Gazebo and Rviz
  3. To research and implement a model in order to estimate wheelchair’s CWOs
  4. To implement an UKF algorithm for accurate estimation of CWOs and the pose of the wheelchair

Project Website:

Documentation and code are available on my GitHub repository – http://github.com/patilnabhi/nuric_wheelchair_model_02

1. Wall-following behavior

The first task involved understanding the existing code structure, which was evaluated by implementing a simple low-level behavior

  • My implementation – getting the robotic wheelchair (in simulation) to follow a wall using laser-scan data. A simple algorithm to achieve this is summarized below –
    1. Check whether the wheelchair is near a “wall” by testing whether the laser-scan range values lie below a threshold (if the values are below 3.0 m over a wide spread of scan angles, a wall is assumed)
    2. Determine on which side of the wheelchair the wall lies
    3. Given the laser-scan data, adjust the controls so that the wheelchair aligns parallel to the wall and follows it, as demonstrated in the video below
  • The code was developed in both C++ and Python and can be accessed here
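The three steps above can be sketched as follows. The function names, the side-detection heuristic, and the proportional gain are illustrative assumptions for this write-up, not the repository's actual code:

```python
# Minimal sketch of the wall-following logic: detect a wall from laser-scan
# ranges, then steer to stay parallel to it at a target distance.
WALL_THRESHOLD = 3.0  # m; a wall is assumed if enough ranges fall below this

def detect_wall(ranges):
    """Return 'right', 'left', or None depending on which side has a wall.
    Assumes the scan sweeps right-to-left, so the first half of the ranges
    covers the right side (an assumption about the sensor mounting)."""
    n = len(ranges)
    right, left = ranges[: n // 2], ranges[n // 2 :]
    if sum(r < WALL_THRESHOLD for r in right) > n // 4:
        return 'right'
    if sum(r < WALL_THRESHOLD for r in left) > n // 4:
        return 'left'
    return None

def follow_wall_cmd(ranges, target_dist=1.0, gain=0.5):
    """Compute (linear, angular) velocity commands that keep the wheelchair
    roughly target_dist metres from the wall and parallel to it."""
    side = detect_wall(ranges)
    if side is None:
        return 0.2, 0.0  # no wall detected: creep forward
    n = len(ranges)
    d = min(ranges[: n // 2]) if side == 'right' else min(ranges[n // 2 :])
    error = d - target_dist
    # steer toward/away from the wall in proportion to the distance error
    angular = gain * error if side == 'right' else -gain * error
    return 0.3, angular
```

In a ROS node, `ranges` would come from the `LaserScan` message and the returned pair would populate a `Twist` command.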

2. 3D model of new wheelchair

The second task consisted of designing a 3D model of the new electric wheelchair and integrating it with the existing code structure in ROS Gazebo and Rviz

  • The wheelchair used in this project is an electric wheelchair from Permobil and consists of the following main parts:
    1. Two front wheels (motorized/driven wheels – a differential drive system)
    2. Two rear caster wheels that rotate passively due to the wheelchair’s dynamics
    3. The seat and base of the wheelchair
    4. A Kinect / depth camera mounted just above the seat
    5. A laser-scanner mounted in the front-bottom of the wheelchair
  • The development of a simulated model was essential for evaluating the next two tasks relating to CWOs
  • SimLab Composer software is used to export the SolidWorks .SLDPRT and .SLDASM files to the .dae (COLLADA) format, which is supported by the ROS URDF parser
  • MeshLab software is used to determine the moments of inertia and center of gravity parameters of the wheelchair


3. Kinematic model for CWOs estimation

The third task involved estimating the CWOs of the wheelchair

  • The two rear caster wheels rotate passively as the wheelchair is driven by the two front motorized wheels
  • A kinematic model is implemented to estimate the CWOs given the input commands: linear velocity \dot{Y} and angular velocity \dot{\phi}


Fig. 1: (left) Top view of the wheelchair; (right)  Kinematic model (ODEs) to determine CWOs
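The actual ODEs are those in Fig. 1; as a rough illustration of how such a model is integrated over time, the sketch below uses a generic rolling-constraint caster ODE (derived from the no-side-slip condition at the wheel contact). The ODE form, the pivot position (lx, ly), and the trail d are assumptions for illustration, not the equations from the figure:

```python
import math

def caster_rate(theta, v, omega, lx, ly, d):
    """Time derivative of caster orientation theta (rad, chassis frame).
    (lx, ly): caster pivot position in the chassis frame; d: caster trail.
    Follows from requiring zero sideways velocity at the wheel contact."""
    # velocity of the caster pivot expressed in the chassis frame
    vx = v - omega * ly
    vy = omega * lx
    return (-vx * math.sin(theta) + vy * math.cos(theta)) / d - omega

def simulate(theta0, v, omega, lx=-0.5, ly=0.3, d=0.05, dt=0.01, steps=1000):
    """Euler-integrate the caster ODE for steps * dt seconds,
    returning the final orientation wrapped to [-pi, pi]."""
    theta = theta0
    for _ in range(steps):
        theta += caster_rate(theta, v, omega, lx, ly, d) * dt
    return math.atan2(math.sin(theta), math.cos(theta))
```

Driving straight ahead (omega = 0), a trailing caster started at an arbitrary angle should settle to point along the direction of travel (theta = 0), which matches the passive behavior described above.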

  • The model is tested in simulation and the estimated CWOs are compared with the actual CWOs (extracted from the joint_states topic)
  • The following video demonstrates the simulation results –

4. UKF algorithm & dynamic model of wheelchair

  • The final task involved implementing a UKF to better estimate the CWOs when the initial state is unknown
  • A dynamic motion model is chosen to represent the relation between CWOs and pose of the wheelchair robot. This model is shown below –


Fig. 2: Dynamic model of wheelchair; ‘F’ represents the friction forces (with forces in ‘w’ direction assumed to be zero); ‘N’ refers to the total normal force acting on the wheelchair

  • The UKF algorithm implementation consists of 4 steps, as outlined below –
  1. Initialize:
    • Initialize state and controls for the wheelchair (mean and covariance)
  2. Predict:
    • Generate sigma points using Julier’s Scaled Sigma Point algorithm
    • Pass each sigma point through the dynamic motion model to form a new prior
    • Determine mean and covariance of new prior through unscented transform
  3. Update:
    • Get odometry data (measurement of pose of wheelchair)
    • Convert the sigma points of the prior into expected measurements (the points corresponding to the pose of the wheelchair – x, y and \theta – are chosen)
    • Compute mean and covariance of converted sigma points through unscented transform
    • Compute residual and Kalman gain
    • Determine new estimate for the state with new covariance
  4. Loop:
    • Repeat steps 2 & 3 as the wheelchair moves around and gathers new measurement data
  • The algorithm is tested in simulation and the following plots are produced (demonstrated in the video below) –
    1. CWOs – actual, estimated & UKF-estimated;
    2. Pose (x, y and \theta ) – actual, estimated & UKF-estimated;
    3. Error between actual & UKF-estimated data
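The four steps above can be condensed into the sketch below, using Julier's sigma points and the unscented transform. For brevity the state here is only the pose (x, y, \theta) with a simple unicycle motion model standing in for the full dynamic model (which additionally carries the two CWOs); the function names and noise values are illustrative assumptions:

```python
import numpy as np

def julier_sigma_points(mean, cov, kappa=1.0):
    """Julier's sigma points: 2n+1 points with weights summing to 1."""
    n = len(mean)
    sigmas = np.zeros((2 * n + 1, n))
    W = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    W[0] = kappa / (n + kappa)
    sigmas[0] = mean
    U = np.linalg.cholesky((n + kappa) * cov)  # columns are the offsets
    for i in range(n):
        sigmas[i + 1] = mean + U[:, i]
        sigmas[n + i + 1] = mean - U[:, i]
    return sigmas, W

def unscented_transform(sigmas, W, noise_cov):
    """Weighted mean and covariance of transformed sigma points."""
    mean = W @ sigmas
    diff = sigmas - mean
    return mean, diff.T @ (diff * W[:, None]) + noise_cov

def fx(state, u, dt):
    """Stand-in unicycle motion model; u = (v, w)."""
    x, y, th = state
    v, w = u
    return np.array([x + v * np.cos(th) * dt,
                     y + v * np.sin(th) * dt,
                     th + w * dt])

def hx(state):
    """Measurement model: odometry gives the pose directly."""
    return state

def ukf_step(mean, cov, u, z, dt, Q, R, kappa=1.0):
    """One predict + update cycle (angle wrapping omitted for brevity)."""
    # Predict: push sigma points through the motion model
    sigmas, W = julier_sigma_points(mean, cov, kappa)
    sigmas_f = np.array([fx(s, u, dt) for s in sigmas])
    mean_p, cov_p = unscented_transform(sigmas_f, W, Q)
    # Update: convert to expected measurements, fuse the odometry z
    sigmas_h = np.array([hx(s) for s in sigmas_f])
    z_mean, S = unscented_transform(sigmas_h, W, R)
    Pxz = (sigmas_f - mean_p).T @ ((sigmas_h - z_mean) * W[:, None])
    K = Pxz @ np.linalg.inv(S)               # Kalman gain
    mean_new = mean_p + K @ (z - z_mean)     # residual correction
    cov_new = cov_p - K @ S @ K.T
    return mean_new, cov_new
```

Step 4 of the algorithm is then just calling `ukf_step` in a loop as new control inputs and odometry measurements arrive.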


Future work:

  • Integrate the UKF model with the overall code, test with actual wheelchair and analyze results
  • Integrate measurement data from the Kinect/depth camera and LiDAR in the ‘update’ step of the UKF algorithm

References:

  1. Kalman and Bayesian Filters in Python

  2. Analysis of Driving Backward in an Electric-Powered Wheelchair, Dan Ding, Rory A. Cooper, Songfeng Guo and Thomas A. Corfman (2004)

  3. A New Dynamic Model of the Wheelchair Propulsion on Straight and Curvilinear Level-ground Paths, Felix Chenier, Pascal Bigras, Rachid Aissaoui (2014)

  4. A Caster Wheel Controller For Differential Drive Wheelchairs, Bernd Gersdorf, Shi Hui

  5. Kinematic Modeling of Mobile Robots by Transfer Method of Augmented Generalized Coordinates, Wheekuk Kim, Byung-Ju Yi, Dong Jin Lim (2004)

  6. Mobile Robot Kinematics

  7. Dynamics equations of a mobile robot provided with caster wheel, Stefan Staicu (2009)
