Challenge 1
The goal of the challenge is to demonstrate the capability of a robot to move in a cluttered environment, while making it possible to see what the robot sees.
Expected
The robot is positioned somewhere in a closed area (i.e. an area bounded by obstacles).
The robot moves continuously in this area while avoiding the obstacles (i.e. the area limits and some obstacles randomly placed in the area).
The sensor data (scan, vision) and potentially other information (considered obstacles, frames, ...) are visible in rviz.
The robot trajectory should allow the robot to explore the overall area. In other words, the robot will go everywhere (i.e. the probability that the robot reaches any given reachable position in the area tends to 1 as time tends to infinity).
A first approach can be to develop a ricochet robot that changes its direction randomly each time an obstacle prevents it from moving forward.
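For illustration, here is a minimal sketch of such a ricochet behaviour with rclpy. The `/scan` and `/cmd_vel` topic names, the 0.4 m safety distance and the speeds are assumptions to adapt to the tbot, and the laser scan is assumed to be centred on the robot's forward axis.

```python
import random

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist


class Ricochet(Node):
    """Drive forward; turn in a random direction when an obstacle is ahead."""

    def __init__(self):
        super().__init__('ricochet')
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)
        self.turn_side = 0.0  # 0.0 while cruising, otherwise -1.0 or +1.0

    def on_scan(self, scan: LaserScan):
        # Keep the beams roughly in front of the robot (central third of the
        # scan, assuming it is centred on the forward axis) and drop
        # out-of-range readings.
        n = len(scan.ranges)
        front = [r for r in scan.ranges[n // 3:2 * n // 3]
                 if scan.range_min < r < scan.range_max]
        blocked = bool(front) and min(front) < 0.4  # 0.4 m safety distance

        cmd = Twist()
        if blocked:
            if self.turn_side == 0.0:
                self.turn_side = random.choice([-1.0, 1.0])  # pick a side once
            cmd.angular.z = self.turn_side  # rotate in place until clear
        else:
            self.turn_side = 0.0
            cmd.linear.x = 0.2  # cruise speed in m/s
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(Ricochet())


if __name__ == '__main__':
    main()
```

Since the robot only ever rotates in place when blocked, it cannot collide while turning; the random choice of side is what makes the long-run coverage of the area possible.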
Instructions
Each group commits the minimal required files in a specific `grp_'machine'` ROS2 package inside their git repository.
Release: Monday afternoon of week-3 (Monday 15-01-2024)
The required files:
- At the root of the repository, a `README.md` file in markdown syntax introducing the project.
- A directory `grp_'machine'` matching the ROS2 package where the elements will be found (`machine` matches the name of the machine embedded on the robot).
- Inside the `grp_'machine'` package, a launch file `simulation_launch.yaml` starting the appropriate nodes for demonstrating in the simulation.
- Then, a launch file `tbot_launch.yaml` starting the appropriate nodes for demonstrating with a tbot.
- Finally, a launch file `visualize.launch.py` starting the appropriate nodes, including rviz2, to visualize (the idea is that we could use two computers on the same domain ID: one to control the robot, one for visualization). A possible sketch is given after this list.

In simulation, we will work with the configuration set in `challenge-1.launch.py`.
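For instance, `visualize.launch.py` could be reduced to starting rviz2 with a saved configuration. This is a minimal sketch only: the `challenge1.rviz` file, its `config/` location and the `grp_machine` package name are assumptions to adapt to your own package.

```python
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    # Assumed location of a saved RViz configuration inside the package.
    rviz_config = os.path.join(
        get_package_share_directory('grp_machine'),  # real grp_'machine' name here
        'config', 'challenge1.rviz')
    return LaunchDescription([
        Node(
            package='rviz2',
            executable='rviz2',
            arguments=['-d', rviz_config],
        ),
    ])
```

For the two-computer setup, both machines simply need the same `ROS_DOMAIN_ID` environment variable exported before launching, so that the visualization computer sees the robot's topics.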
Criteria
Minimal:
- The group follows the instructions (i.e. the repository is presented as expected).
- The robot behavior is safe (no collision with any obstacle).
- rviz2 is started and well configured.
- The robot moves everywhere in its environment.
- A `String` message is sent on a `detection` topic each time a bottle is in front of the robot (see the sketch after this list).

Optional:
- An operator can take control of the robot at any time.
- The robot movement is fluid (no stops) and fast, and the robot moves smoothly even in small areas.
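As a reference for the detection criterion, here is a minimal sketch of the publishing side with rclpy. The `String` type and the `detection` topic come from the criterion above; the `bottle_in_front()` helper and the 10 Hz polling rate are hypothetical placeholders for the group's own vision code.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class BottleAnnouncer(Node):
    """Publish a String on the detection topic whenever a bottle is in front."""

    def __init__(self):
        super().__init__('bottle_announcer')
        self.pub = self.create_publisher(String, 'detection', 10)
        self.create_timer(0.1, self.check)  # poll the detector at 10 Hz

    def bottle_in_front(self) -> bool:
        # Hypothetical placeholder: replace with the real camera-based test.
        return False

    def check(self):
        if self.bottle_in_front():
            msg = String()
            msg.data = 'bottle detected'
            self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(BottleAnnouncer())


if __name__ == '__main__':
    main()
```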
Evaluation protocol (for evaluators...)
Here is the evaluation protocol. It is highly recommended to run through it yourself before the submission...
1. Clone the group's repository.
2. Take a look at what is inside the repository and read the `README.md` file (normally it states that the project depends on `mb6-tbot`; make sure that the `mb6-tbot` project is already installed alongside).
3. Build it: `colcon build` and `source` from the workspace directory.
4. Launch the simulation demonstration: `ros2 launch challenge1 simulation.launch.py` and appreciate the solution.
5. Stop everything.
6. Connect the computer to the robot, the Hokuyo and the camera.
7. Launch the Turtlebot demonstration, and appreciate the solution.
8. Take a look at the code, starting from the launch files.