Launching Erle-Rover simulation

Get the real vehicle at our store!



REMINDER: configure your environment before starting this tutorial.


source ~/simulation/ros_catkin_ws/devel/setup.bash
cd ~/simulation/ardupilot/APMrover2
../Tools/autotest/sim_vehicle.sh -j 4 -f Gazebo
# once MAVProxy has launched completely, load the parameters
param load /[path_to_your_home_directory]/simulation/ardupilot/Tools/Frame_params/3DR_Rover.param
# NOTE: replace [path_to_your_home_directory] with the actual path to your home directory.
# Example: param load /home/john/simulation/ardupilot/Tools/Frame_params/3DR_Rover.param

In another terminal:

source ~/simulation/ros_catkin_ws/devel/setup.bash
roslaunch ardupilot_sitl_gazebo_plugin rover_spawn.launch
Figure 1 - Erle-Rover model in Gazebo simulator

Make the rover move forward. In the first terminal execute:

# in the MAVProxy prompt:
mode MANUAL
param set SYSID_MYGCS 255
rc 3 1900

Or backwards:

# in the MAVProxy prompt:
rc 3 1200

What we are doing here is overriding the 3rd RC channel, which corresponds to the throttle. Values range from 1100 to 1900, with 1500 being neutral (stopped): values above 1500 make the rover move forward, and values below 1500 make it move backwards. The same principle applies to the yaw, which is on the 1st RC channel: values above 1500 make it turn right, and values below 1500 make it turn left. For instance:

# in the MAVProxy prompt:
rc 1 1400
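
The same overrides can also be sent programmatically. The following Python sketch uses pymavlink; it assumes the MAVLink stream is reachable on udp:127.0.0.1:14550 (the default MAVProxy output of sim_vehicle.sh), so adjust the address if your setup differs. SYSID_MYGCS was set to 255 above, which matches pymavlink's default source system, so the overrides will be accepted.

#!/usr/bin/env python
# Sketch: drive the simulated rover with RC overrides through pymavlink.
# Assumes a MAVLink stream on udp:127.0.0.1:14550 -- adjust if needed.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udp:127.0.0.1:14550')
master.wait_heartbeat()  # block until the autopilot is heard

def rc_override(steering, throttle):
    # Channel 1 = steering, channel 3 = throttle.
    # A value of 0 releases that channel back to the RC radio.
    master.mav.rc_channels_override_send(
        master.target_system, master.target_component,
        steering, 0, throttle, 0, 0, 0, 0, 0)

rc_override(1500, 1900)   # straight ahead, forward
time.sleep(3)
rc_override(1400, 1600)   # gentle left turn, slower
time.sleep(3)
rc_override(1500, 1500)   # neutral: stop
# Overrides may time out on the autopilot side, so resend them
# periodically if you want sustained driving.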

To add a front-facing camera to the Gazebo model, add the following lines to rover.urdf and relaunch the simulation. It's easy to modify the width and height of the image with the parameters res_x and res_y, the name of the topic with ros_topic, or the update rate with update_rate.

  <!-- Front facing camera -->
  <xacro:include filename="$(find ardupilot_sitl_gazebo_plugin)/urdf/sensors/generic_camera.urdf.xacro" />
  <xacro:generic_camera
    name="${namespace}/front"
    parent="chassis"
    ros_topic="image_front_raw"
    cam_info_topic="camera_front_info"
    update_rate="60"
    res_x="640"
    res_y="480"
    image_format="R8G8B8"
    hfov="110"
    framename="${namespace}_frontcam">
    <origin xyz="0.8 0 1.2" rpy="0 0 0"/>
  </xacro:generic_camera>  
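
With name set to ${namespace}/front and ros_topic set to image_front_raw, the camera images end up on the topic /rover/front/image_front_raw in this simulation (the namespace is rover), which is the topic used by the script below.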

The following Python script shows how to visualize the image:

#!/usr/bin/env python
# Minimal viewer: subscribe to the front camera topic and show the frames with OpenCV.
import sys
import rospy
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError

class image_converter:

  def __init__(self):
    # CvBridge converts between ROS Image messages and OpenCV images
    self.bridge = CvBridge()
    self.image_sub = rospy.Subscriber("/rover/front/image_front_raw", Image, self.callback)

  def callback(self, data):
    try:
      cv_image = self.bridge.imgmsg_to_cv2(data, "bgr8")
    except CvBridgeError as e:
      print(e)
      return

    cv2.imshow("Image window", cv_image)
    cv2.waitKey(3)

def main(args):
  rospy.init_node('image_converter', anonymous=True)
  ic = image_converter()
  try:
    rospy.spin()
  except KeyboardInterrupt:
    print("Shutting down")
  cv2.destroyAllWindows()

if __name__ == '__main__':
  main(sys.argv)
Figure 2 - Frontal image
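
To try it, save the script (the file name is arbitrary, e.g. show_front_image.py), make sure the workspace is sourced and the simulation is running, and launch it with python show_front_image.py; a window with the camera stream should appear, as in Figure 2.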

To add a 2D laser to the Gazebo model, add the following lines to rover.urdf and relaunch the simulation. It's easy to modify the maximum and minimum range with max_range and min_range, change the update rate with update_rate (10 Hz in this case), the horizontal and vertical field of view with field_of_view_horizontal and field_of_view_vertical, the number of horizontal and vertical rays with ray_count_horizontal and ray_count_vertical, or the position relative to the rover. With 542 rays spread over 270 degrees, the angular resolution is roughly 0.5 degrees.

  <xacro:include filename="$(find ardupilot_sitl_gazebo_plugin)/urdf/sensors/lidar_sensor.urdf.xacro" />
  <xacro:lidar_sensor
    name="sonar2"
    parent="chassis"
    ros_topic="sonar_front"
    update_rate="10"
    min_range="0.06"
    max_range="20.0"
    field_of_view_horizontal="${270*M_PI/180}"
    field_of_view_vertical="${1*M_PI/180}"
    ray_count_horizontal="542"
    ray_count_vertical="1"
    sensor_mesh="lidar_lite_v2_withRay/meshes/lidar_lite_v2_withRay.dae">
    <origin xyz="0.5 0 0.8" rpy="0 0 0"/>
  </xacro:lidar_sensor>
Figure 3 - Laser 2D
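
After relaunching the simulation you can list the available topics with rostopic list to find the one published by the laser. The script below subscribes to /scan, so adjust the subscriber if your laser publishes under a different name.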

The following Python script shows how to visualize the shape of the laser scan:

#!/usr/bin/env python
import rospy

import cv2
import numpy as np
import math

from sensor_msgs.msg import LaserScan

def callback(data):
    # Draw each beam as a line from the centre of a 500x500 canvas
    frame = np.zeros((500, 500, 3), np.uint8)
    angle = data.angle_max
    Vx = 250  # canvas centre (x)
    Vy = 250  # canvas centre (y)
    for r in data.ranges:
        if math.isinf(r):
            r = data.range_max
        # Scale ranges (metres) by 10 pixels/metre, rotated -90 degrees so the front points up
        x = math.trunc((r * 10) * math.cos(angle + (-90 * math.pi / 180)))
        y = math.trunc((r * 10) * math.sin(angle + (-90 * math.pi / 180)))
        cv2.line(frame, (Vx, Vy), (x + Vx, y + Vy), (255, 0, 0), 1)
        angle = angle - data.angle_increment

    cv2.imshow('frame', frame)
    cv2.waitKey(1)

def laser_listener():
    rospy.init_node('laser_listener', anonymous=True)

    rospy.Subscriber("/scan", LaserScan,callback)
    rospy.spin()

if __name__ == '__main__':
    laser_listener()
Figure 4 - Laser shape
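
As before, save the script and run it with python while the simulation is running; a window showing the outline traced by the laser beams should appear, as in Figure 4.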

To add a Kinect camera to the Gazebo model, add the following lines to rover.urdf and relaunch the simulation.

  <xacro:include filename="$(find ardupilot_sitl_gazebo_plugin)/urdf/sensors/kinect.urdf.xacro" />  
  <xacro:sensor_kinect
    parent="chassis">
    <origin xyz="1.6 0 0.55" rpy="0 0 0"/>
  </xacro:sensor_kinect>
Figure 5 - Kinect sensor in Gazebo simulator

Before visualizing the output of the sensor, add a couple of objects in front of the rover.

Figure 6 - Adding objects in Gazebo simulator

Now run rviz (install it by running sudo apt-get install ros-indigo-rviz if you haven't already):

rosrun rviz rviz

Once rviz has been launched, add a new display (Ctrl+N) and, inside the By topic tab, select PointCloud2 under /camera/depth/points, then press OK.

Figure 7 - Adding Kinect output to rviz

You should see something similar to this:

Figure 8 - Kinect output in Rviz
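
Besides rviz, the point cloud can also be inspected programmatically. The following Python sketch subscribes to the same topic selected above and prints how many valid points each message contains (the node name is just illustrative):

#!/usr/bin/env python
# Sketch: count the valid points in each Kinect point cloud message.
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

def callback(cloud):
    # read_points yields (x, y, z) tuples; skip_nans drops invalid returns
    points = list(pc2.read_points(cloud, field_names=("x", "y", "z"), skip_nans=True))
    rospy.loginfo("point cloud with %d valid points", len(points))

def cloud_listener():
    rospy.init_node('cloud_listener', anonymous=True)
    rospy.Subscriber("/camera/depth/points", PointCloud2, callback)
    rospy.spin()

if __name__ == '__main__':
    cloud_listener()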

If, besides simulation, you're interested in translating these behaviors to real scenarios, you can purchase the real vehicle at our store.