Range Sensor

Range sensors are robot sensors that detect obstacles and determine the distance to them. Basic range sensors (infrared, ultrasonic, laser) produce a single measurement in a given direction at a time. By rotating the sensor, it is possible to get measurements on a plane around the sensor.

The Hokuyo sensor equipping the Tbot is a typical rotating lidar (light detection and ranging) sensor. The goal here is to integrate an almost-360° obstacle detection to generate safe robot movements.

More complex lidars permit 3D measurements (i.e. in several planes at a time).

Get Scan Data

Well, let’s visualize the laser scan in rviz2. For that, verify that the user has the right to read data from the device. When the laser sensor is connected, a device file named ttyACM0 appears in the Linux /dev directory. Verify the owner of the file:

ls -l /dev | egrep ttyACM

Normally /dev/ttyACM0 is owned by user root and group dialout with crw-rw---- permissions, meaning that the owner and all members of the group can read (r) and write (w), while all other users have no access to the resource. Verify that bot is a member of the group dialout:

cat /etc/group | egrep dialout 

Cool. Let’s run a driver to convert device I/O into ROS messages. The ROS driver for the Hokuyo laser range sensor is provided by the urg_node package:

ros2 run urg_node urg_node_driver --ros-args -p serial_port:=/dev/ttyACM0

Specifying the serial_port requires activating ROS arguments with --ros-args and passing the value with the -p parameter flag. All of this is specific to ROS parameters.

From that point, data are streamed on the /scan topic. You can check it with ros2 topic list and ros2 topic echo scan.

Now you can visualize it in rviz2. Start rviz2 in a terminal, add a LaserScan display and configure it on the /scan topic. Nothing appears, and that is normal: the rviz2 global option is configured on the map frame, and nothing sets the position of the laser sensor in the map. The laser-scan frame is named laser. Enter this frame into the global options, and set the laser-scan point size to 0.1 for a better display.

Stop everything.

Perform the same exercise to visualize the simulated LaserScan from the Gazebo simulator:

ros2 launch tbot_sim challenge-1.launch.py

Warning: the scan topic and/or the laser-scan frame can have different names.

A first node logging the scan

First, we will initialize a scan_echo node.

Edit a new file scan_echo.py with a very simple code:

def main():
    print('Move move move !')

if __name__ == '__main__' :
    main()

Test your scan_echo node:

python3 scan_echo.py

A first version could look like this:

#!/usr/bin/env python3
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan

rosNode= None

def scan_callback(scanMsg):
    global rosNode
    rosNode.get_logger().info( f"scan:\n{scanMsg}" )

rclpy.init()
rosNode= Node('scan_interpreter')
rosNode.create_subscription( LaserScan, 'scan', scan_callback, 10)

try :
    while True :
        rclpy.spin_once( rosNode )
except KeyboardInterrupt :
    pass

rosNode.destroy_node()
rclpy.shutdown()

Test your scan_echo.py node, both with the Hokuyo laser and in simulation.

Modify the logger to print only the information in the header and the number of ranges.

From LaserScan to PointCloud

LaserScan provides both the recorded beam distances (ranges) and the meta-information permitting the conversion of these distances into points in a regular Cartesian frame (i.e. the angle between beams).

In Python, the conversion would look like this:

import math   # to be placed at the top of the file

obstacles= []
angle= scanMsg.angle_min
for aDistance in scanMsg.ranges :
    if 0.1 < aDistance < 5.0 :
        aPoint= [
            math.cos(angle) * aDistance,
            math.sin(angle) * aDistance
        ]
        obstacles.append(aPoint)
    angle+= scanMsg.angle_increment

The exercise consists in modifying the scan callback function to generate the point-cloud list. To log a sample of the point cloud:

sample= [ [ round(p[0], 2), round(p[1], 2) ] for p in obstacles[10:20] ]
rosNode.get_logger().info( f"obs({len(obstacles)}) ...{sample}..." )
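The conversion above can be wrapped in a small pure function (the name ranges_to_points and the distance bounds as parameters are illustrative choices), which makes the callback logic easy to test outside ROS:

```python
import math

def ranges_to_points(angle_min, angle_increment, ranges,
                     min_dist=0.1, max_dist=5.0):
    """Convert polar laser ranges into 2D Cartesian [x, y] points,
    keeping only distances strictly within (min_dist, max_dist)."""
    points = []
    angle = angle_min
    for aDistance in ranges:
        if min_dist < aDistance < max_dist:
            points.append([math.cos(angle) * aDistance,
                           math.sin(angle) * aDistance])
        angle += angle_increment
    return points
```

In the callback, obstacles becomes ranges_to_points(scanMsg.angle_min, scanMsg.angle_increment, scanMsg.ranges).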

Finally, it is possible to publish this result in a PointCloud message and to visualize it in rviz2, superimposed on the LaserScan.

PointCloud is based on geometry_msgs Point32 with float coordinates. Creating a Point32 requires an explicit cast.

aPoint= Point32()
aPoint.x= float( math.cos(angle) * aDistance )
aPoint.y= float( math.sin(angle) * aDistance )
aPoint.z= 0.0
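Putting it together, here is a sketch of filling a PointCloud from the obstacle list. To keep the example runnable without ROS, minimal stand-in classes replace the real sensor_msgs.msg.PointCloud and geometry_msgs.msg.Point32 (in the node, import the real message classes instead); the helper name obstacles_to_cloud is illustrative:

```python
# Stand-ins for the real ROS 2 message classes (geometry_msgs.msg.Point32
# and sensor_msgs.msg.PointCloud); in the node, import the real ones.
class Point32:
    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self.z = 0.0

class PointCloud:
    def __init__(self):
        self.points = []

def obstacles_to_cloud(obstacles):
    """Fill a PointCloud from a list of [x, y] obstacle points."""
    cloud = PointCloud()
    for x, y in obstacles:
        aPoint = Point32()
        aPoint.x = float(x)
        aPoint.y = float(y)
        aPoint.z = 0.0
        cloud.points.append(aPoint)
    return cloud
```

In the node, the resulting message still needs its header (frame_id and stamp) set before being published on a topic created with create_publisher.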
