Commit e23c26f5 authored by holzheim

Updated Mobile Robot Task

parent 75bce431
Showing with 482 additions and 221 deletions
00_GettingStarted/docker/i2r/catkin_ws/src/image.png (25.7 KiB, added)
image: /root/catkin_ws/src/map_test.pgm
resolution: 0.050000
origin: [-10.000000, -10.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
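The two thresholds in this map metadata determine how `map_server` classifies each PGM pixel when loading the map. A minimal, ROS-free sketch of that rule (assuming `negate: 0`, following the map_server conventions; cell values use the `nav_msgs/OccupancyGrid` convention of 100 occupied, 0 free, -1 unknown):

```python
def cell_value(pixel, occupied_thresh=0.65, free_thresh=0.196, negate=0):
    # With negate=0, darker pixels mean higher occupancy probability.
    p = pixel / 255.0 if negate else (255 - pixel) / 255.0
    if p > occupied_thresh:
        return 100   # occupied
    if p < free_thresh:
        return 0     # free
    return -1        # unknown

print(cell_value(0))    # black pixel -> 100 (occupied)
print(cell_value(254))  # near-white -> 0 (free)
print(cell_value(128))  # grey -> -1 (unknown)
```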
@@ -161,6 +161,7 @@ include_directories(
# in contrast to setup.py, you can choose the destination
catkin_install_python(PROGRAMS
scripts/simple_auto_drive.py
scripts/solution_auto_drive.py
DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
)
<launch>
<!-- launching the mapping node-->
<include file="$(find turtlebot3_slam)/launch/turtlebot3_slam.launch">
<arg name="open_rviz" default="false"/>
</include>
<!-- opening rviz with custom config -->
<node pkg="rviz" type="rviz" name="rviz" required="true" args="-d $(find task_3)/rviz_config/mapping_config.rviz"/>
</launch>
<launch>
<include file="$(find turtlebot3_gazebo)/launch/turtlebot3_world.launch">
<arg name="x_pos" default="-1.0"/>
<arg name="y_pos" default="2.0"/>
</include>
</launch>
Panels:
- Class: rviz/Displays
Help Height: 78
Name: Displays
Property Tree Widget:
Expanded:
- /TF1/Frames1
- /Map1
- /Odometry1
- /Odometry1/Shape1
Splitter Ratio: 0.594406008720398
Tree Height: 681
- Class: rviz/Selection
Name: Selection
- Class: rviz/Tool Properties
Expanded:
- /2D Pose Estimate1
- /2D Nav Goal1
- /Publish Point1
Name: Tool Properties
Splitter Ratio: 0.5886790156364441
- Class: rviz/Views
Expanded:
- /Current View1
Name: Views
Splitter Ratio: 0.5
- Class: rviz/Time
Name: Time
SyncMode: 0
SyncSource: LaserScan
Preferences:
PromptSaveOnExit: true
Toolbars:
toolButtonStyle: 2
Visualization Manager:
Class: ""
Displays:
- Alpha: 0.5
Cell Size: 1
Class: rviz/Grid
Color: 160; 160; 164
Enabled: true
Line Style:
Line Width: 0.029999999329447746
Value: Lines
Name: Grid
Normal Cell Count: 0
Offset:
X: 0
Y: 0
Z: 0
Plane: XY
Plane Cell Count: 100
Reference Frame: map
Value: true
- Alpha: 1
Class: rviz/RobotModel
Collision Enabled: false
Enabled: true
Links:
All Links Enabled: true
Expand Joint Details: false
Expand Link Details: false
Expand Tree: false
Link Tree Style: Links in Alphabetic Order
base_footprint:
Alpha: 1
Show Axes: false
Show Trail: false
base_link:
Alpha: 1
Show Axes: false
Show Trail: false
Value: true
base_scan:
Alpha: 1
Show Axes: false
Show Trail: false
Value: true
caster_back_link:
Alpha: 1
Show Axes: false
Show Trail: false
Value: true
imu_link:
Alpha: 1
Show Axes: false
Show Trail: false
wheel_left_link:
Alpha: 1
Show Axes: false
Show Trail: false
Value: true
wheel_right_link:
Alpha: 1
Show Axes: false
Show Trail: false
Value: true
Name: RobotModel
Robot Description: robot_description
TF Prefix: ""
Update Interval: 0
Value: true
Visual Enabled: true
- Class: rviz/TF
Enabled: false
Filter (blacklist): ""
Filter (whitelist): ""
Frame Timeout: 15
Frames:
All Enabled: true
Marker Alpha: 1
Marker Scale: 1
Name: TF
Show Arrows: true
Show Axes: true
Show Names: true
Tree:
{}
Update Interval: 0
Value: false
- Alpha: 1
Autocompute Intensity Bounds: true
Autocompute Value Bounds:
Max Value: 10
Min Value: -10
Value: true
Axis: Z
Channel Name: intensity
Class: rviz/LaserScan
Color: 0; 255; 0
Color Transformer: FlatColor
Decay Time: 0
Enabled: true
Invert Rainbow: false
Max Color: 255; 255; 255
Min Color: 0; 0; 0
Name: LaserScan
Position Transformer: XYZ
Queue Size: 10
Selectable: true
Size (Pixels): 3
Size (m): 0.030000001192092896
Style: Flat Squares
Topic: /scan
Unreliable: false
Use Fixed Frame: true
Use rainbow: true
Value: true
- Class: rviz/Image
Enabled: false
Image Topic: /raspicam_node/image
Max Value: 1
Median window: 5
Min Value: 0
Name: Image
Normalize Range: true
Queue Size: 2
Transport Hint: compressed
Unreliable: false
Value: false
- Alpha: 0.699999988079071
Class: rviz/Map
Color Scheme: map
Draw Behind: false
Enabled: true
Name: Map
Topic: /map
Unreliable: false
Use Timestamp: false
Value: true
- Angle Tolerance: 0.10000000149011612
Class: rviz/Odometry
Covariance:
Orientation:
Alpha: 0.5
Color: 255; 255; 127
Color Style: Unique
Frame: Local
Offset: 1
Scale: 1
Value: true
Position:
Alpha: 0.30000001192092896
Color: 204; 51; 204
Scale: 1
Value: true
Value: false
Enabled: true
Keep: 1000
Name: Odometry
Position Tolerance: 0.10000000149011612
Queue Size: 10
Shape:
Alpha: 1
Axes Length: 1
Axes Radius: 0.10000000149011612
Color: 255; 25; 0
Head Length: 0.05000000074505806
Head Radius: 0.05000000074505806
Shaft Length: 0.10000000149011612
Shaft Radius: 0.05000000074505806
Value: Arrow
Topic: /odom
Unreliable: false
Value: true
Enabled: true
Global Options:
Background Color: 48; 48; 48
Default Light: true
Fixed Frame: map
Frame Rate: 30
Name: root
Tools:
- Class: rviz/Interact
Hide Inactive Objects: true
- Class: rviz/MoveCamera
- Class: rviz/Select
- Class: rviz/FocusCamera
- Class: rviz/Measure
- Class: rviz/SetInitialPose
Theta std deviation: 0.2617993950843811
Topic: /initialpose
X std deviation: 0.5
Y std deviation: 0.5
- Class: rviz/SetGoal
Topic: /move_base_simple/goal
- Class: rviz/PublishPoint
Single click: true
Topic: /clicked_point
Value: true
Views:
Current:
Angle: -1.5707963705062866
Class: rviz/TopDownOrtho
Enable Stereo Rendering:
Stereo Eye Separation: 0.05999999865889549
Stereo Focal Distance: 1
Swap Stereo Eyes: false
Value: false
Invert Z Axis: false
Name: Current View
Near Clip Distance: 0.009999999776482582
Scale: 131.43995666503906
Target Frame: map
X: 0
Y: 0
Saved: ~
Window Geometry:
Displays:
collapsed: false
Height: 978
Hide Left Dock: false
Hide Right Dock: true
Image:
collapsed: false
QMainWindow State: 000000ff00000000fd00000004000000000000016a00000334fc0200000009fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073010000003d00000334000000c900fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261fb0000000a0049006d006100670065000000020f000001900000001600ffffff000000010000010f0000035cfc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a0056006900650077007300000000430000035c000000a400fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000004ff0000003efc0100000002fb0000000800540069006d00650100000000000004ff000003bc00fffffffb0000000800540069006d006501000000000000045000000000000000000000038f0000033400000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000
Selection:
collapsed: false
Time:
collapsed: false
Tool Properties:
collapsed: false
Views:
collapsed: true
Width: 1279
X: 72
Y: 27
@@ -8,23 +8,27 @@ from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

def callback(dt):
    print('-------------------------------------------')
    print('Range data at 0 deg: {}'.format(dt.ranges[0]))
    print('Range data at 15 deg: {}'.format(dt.ranges[15]))
    print('Range data at 345 deg: {}'.format(dt.ranges[345]))
    print('-------------------------------------------')

    ## YOUR TASK
    # Update the logic below to move the robot close to the wall and do wall following.
    thr1 = 0.8  # Laser scan range threshold
    thr2 = 0.8

    # Check if there are obstacles in front and 15 degrees to the left and
    # right (try changing the angle values as well as the thresholds).
    if dt.ranges[0] > thr1 and dt.ranges[15] > thr2 and dt.ranges[345] > thr2:
        move.linear.x = 0.5   # go forward (linear velocity)
        move.angular.z = 0.0  # do not rotate (angular velocity)
    else:
        move.linear.x = 0.0   # stop
        move.angular.z = 0.5  # rotate counter-clockwise
        if dt.ranges[0] > thr1 and dt.ranges[15] > thr2 and dt.ranges[345] > thr2:
            move.linear.x = 0.5
            move.angular.z = 0.0

    pub.publish(move)  # publish the move object
#!/usr/bin/env python
# Task 3
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist
import math

def turn_dir(dir):
    move.linear.x = 0.0
    move.angular.z = dir * 0.1
    pub.publish(move)

def drive_straight():
    move.linear.x = 0.15
    move.angular.z = 0.0
    pub.publish(move)

def callback(dt):
    print('-------------------------------------------')
    print('Range data at 45 deg: {}'.format(dt.ranges[45]))
    print('Range data at 90 deg: {}'.format(dt.ranges[90]))
    print('-------------------------------------------')

    dist_left = dt.ranges[90]   # distance to the wall on the left
    dist_diag = dt.ranges[25]   # reading 25 degrees to the left of straight ahead
    low_dist_thresh = 0.45
    high_dist_thresh = 0.65

    if dist_left < low_dist_thresh and dist_diag < low_dist_thresh * math.cos(math.radians(25)):
        turn_dir(-1)    # too close to the wall: turn away
    elif dist_left > high_dist_thresh and dist_diag > high_dist_thresh * math.cos(math.radians(25)):
        turn_dir(+1)    # too far from the wall: turn towards it
    else:
        drive_straight()

move = Twist()  # Creates a Twist message type object
rospy.init_node('obstacle_avoidance_node')  # Initializes a node
pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1000)  # Publisher object which will publish "Twist" type messages
                                                           # on the "/cmd_vel" Topic; "queue_size" is the size of the
                                                           # outgoing message queue used for asynchronous publishing
sub = rospy.Subscriber("/scan", LaserScan, callback)  # Subscriber object which will listen for "LaserScan" type messages
                                                      # from the "/scan" Topic and call the "callback" function
                                                      # each time it reads something from the Topic
rospy.spin()  # Loops infinitely until someone stops the program execution
# Mobile robot
## Launching in Simulation
- First install the package *turtlebot3_simulations* in your catkin workspace and build it.
@@ -13,10 +12,18 @@
git clone -b melodic-devel https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
```
- Installing dependencies for the `turtlebot3` package
```bash
rosdep update
```
```bash
cd ~/catkin_ws && rosdep install --rosdistro $ROS_DISTRO --ignore-src --from-paths src
```
- Make sure that you have the [`task_3`](../00_GettingStarted/docker/i2r/catkin_ws/src/task_3/) package also inside the `catkin_ws/src` directory. Once the `task_3` package is in the workspace, run the following commands.
```bash
catkin build
```
@@ -44,29 +51,39 @@
```bash
ls launch
```
- Now we will launch the default world along with the **turtlebot** model using the <span id="robot_spawn_command">*robot_spawn_command*</span>
```bash
roslaunch task_3 spawn_robot.launch
```
This command will bring up the world along with the robot in it.
![world](images/turtlebot_world.png)
## Launching a SLAM node
Now we will **create a 2D map** of this world. We will later use this 2D map to navigate the robot from point A to point B.
- First, we will launch the SLAM node to start mapping. Here, we are using the [gmapping package](http://wiki.ros.org/gmapping), which is one of the most commonly used 2D mapping packages in industry. Launch the *mapping.launch* file from a new terminal.
```bash
roslaunch task_3 mapping.launch
```
Successful execution of the launch file **will start the mapping process**. You will see an RVIZ window open as shown below.
![mapping_start](images/mapping_start.png)
- Now, you can control the robot manually and drive it through the world to map it. Run the following command in a new terminal session, follow the instructions on the terminal, and use the various keys to move the robot around.
```bash
roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
```
- Now that you have familiarized yourself with the mapping process, let us use a ROS node to do **autonomous mapping**, where the robot will randomly move through the world and map it. You may have to make the script *simple_auto_drive.py* executable by following the instructions below.
```bash
cd ~/catkin_ws/src/task_3/scripts
```
```bash
chmod +x simple_auto_drive.py
```
```bash
rosrun task_3 simple_auto_drive.py
```
You can stop the script by pressing `Ctrl+C` in the terminal where the node is running.
- The [simple_auto_drive.py](../00_GettingStarted/docker/i2r/catkin_ws/src/task_3/scripts/simple_auto_drive.py) script subscribes to the laser scan data from the robot and uses some simple logic to turn the robot away from walls or obstacles near it. As you run the code you will see a **red line** appearing along where the robot moved. **This indicates the path that the robot took**. If you keep the code running long enough you will see that it is **completely random**.
<video width="640" height="480" controls>
<source src="images/random-mapping.mp4" type="video/mp4">
</video>
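The indices used in these scripts (`ranges[0]`, `ranges[15]`, `ranges[345]`, `ranges[90]`) are beam angles in degrees, counter-clockwise from straight ahead. A small, hypothetical helper (not part of the provided scripts) shows the mapping, assuming the TurtleBot3's 360-beam scan with `angle_min = 0` and a 1-degree increment:

```python
import math

def angle_to_index(angle_deg, angle_min=0.0,
                   angle_increment=math.radians(1.0), num_ranges=360):
    # Convert a bearing to the nearest LaserScan.ranges index,
    # wrapping negative bearings (e.g. -15 deg -> index 345).
    idx = round((math.radians(angle_deg) - angle_min) / angle_increment)
    return idx % num_ranges

print(angle_to_index(15))    # 15  (15 deg to the left of forward)
print(angle_to_index(-15))   # 345 (15 deg to the right of forward)
print(angle_to_index(90))    # 90  (directly to the left)
```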
## Task 4.1
- Running the [simple_auto_drive.py](../00_GettingStarted/docker/i2r/catkin_ws/src/task_3/scripts/simple_auto_drive.py) code moves the robot **in a random manner**. This works if the area is wide open and you don't have any closed structures like rooms to explore.
- In this task we will write code to **move the robot in a more structured manner.** A simple way to do that would be to **follow the wall closest to you.**
### Your Task
- Your task is to **write simple logic** to make the robot follow the wall. To keep it simple and get you started, we will spawn the robot close to a wall on its **left side.**
- You have to extend the [simple_auto_drive.py](../00_GettingStarted/docker/i2r/catkin_ws/src/task_3/scripts/simple_auto_drive.py) code and add more conditions so that the robot moves at a constant distance from the wall.
- You are free to choose how far the robot should stay from the wall. The **only condition for this task is that the robot should move close to the outside wall. It SHOULD NOT move in between the white pillars in the center of the building.**
- Here is a simple pseudo code that will help you get started.
```bash
GET distance to the wall on the left.
IF too close to the wall:
    Turn away from the wall
IF too far away from the wall:
    Turn towards the wall
IF within a threshold:
    Move straight
```
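One way to think about this pseudocode is as a small, ROS-free decision function; the threshold values below are illustrative assumptions, not prescribed:

```python
def wall_follow_action(dist_left, low_thresh=0.45, high_thresh=0.65):
    # Decide what to do based only on the distance to the left wall.
    if dist_left < low_thresh:
        return "turn_right"    # too close: turn away from the wall
    if dist_left > high_thresh:
        return "turn_left"     # too far: turn towards the wall
    return "straight"          # within the band: keep driving

print(wall_follow_action(0.3))   # turn_right
print(wall_follow_action(1.2))   # turn_left
print(wall_follow_action(0.5))   # straight
```

In the real node, each returned action would be translated into a `Twist` message with the appropriate `linear.x` and `angular.z` values.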
- Remember that **we do not expect a perfect wall-following robot that always moves parallel to the wall.** The aim of this task is to get you started on how to think about a given problem, come up with a solution, and transfer this solution into Python code.
- We have provided an example output below. You can clearly see that **the robot doesn't always move parallel to the wall.** Sometimes it might even be far away from the wall. However, it still satisfies the task requirement that **it should not move between the white pillars**. Also, remember that **we are not expecting you to reproduce the output shown in the example.** Just make sure that you satisfy the requirement of the task.
<video width="640" height="480" controls>
<source src="images/mapping.mp4" type="video/mp4">
</video>
### Note:
Initially, to test whether your wall-following logic works, you just need to use the [robot_spawn_command](#robot_spawn_command) and then run the code that you extended using the `rosrun` command.
## Launching a Navigation node
Now that you have created a map of the world you can use this 2D map to localize and navigate inside the world.
- You can use the [robot_spawn_command](#robot_spawn_command) to spawn the robot inside the world.
- The following command will initialize the [amcl](http://wiki.ros.org/amcl) node, which is used for localizing the robot on a given map, and the [move_base](http://wiki.ros.org/move_base) node, which is responsible for planning the path to a given goal location. These two packages are among the most widely used in research and industry.
```bash
roslaunch turtlebot3_navigation turtlebot3_navigation.launch
```
- Now you can initialize the robot's pose on the map using the `2D Pose Estimate` tool that you will find in the top bar of the RVIZ window. You can also refer to a guide [here](Estimate Initial Pose) that will help you do this.
- Once the robot's pose is initialized on the map, you can use the `2D Nav Goal` tool to provide a navigation goal for the robot to reach.
## Task 4.2
- In this task we will explore various navigation parameters and see how each of them affects the outcome of the robot's navigation. To keep it simple, **we will explore some of the most important and frequently tuned parameters.**
- To complete this task you will have to change the values of the parameters mentioned below and present your results either as screenshots or a screen recording, whichever you find more comfortable.
- We can use the [dynamic reconfigure](https://automaticaddison.com/ros-parameters-and-the-dynamic_reconfigure-package-in-ros-noetic/#:~:text=How%20to%20Update%20Parameters%20Using%20the%20dynamic_reconfigure%20Package) option available in ROS to change the parameters at runtime and see how they affect the planning algorithm. You can use the following command to launch the dynamic reconfigure window.
```bash
rosrun rqt_reconfigure rqt_reconfigure
```
![rqt_window](images/rqt_reconfig.png)
- In this window you can see all the parameters that you can change for the `move_base` node. To view the parameters available for the planner, click on the `DWAPlannerROS` name on the left side and then you will see all the available parameters.
![planner_params](images/planner_params.png)
- For example, the `max_vel_trans` parameter caps the robot's translational speed while moving. In other words, the robot will **not exceed this velocity limit.** You can verify this by looking at the `/cmd_vel` topic: the published velocity will not exceed the limit that is set.
![max_vel](images/max_vel.png)
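Conceptually, such a limit acts as a hard clamp on the commanded speed. A tiny, ROS-free sketch (the `0.22` m/s default is an assumption based on the TurtleBot3 Burger's specification):

```python
def clamp_speed(cmd, min_vel=0.0, max_vel=0.22):
    # Saturate a commanded velocity to the [min_vel, max_vel] interval,
    # the behaviour you can observe on the /cmd_vel topic.
    return max(min_vel, min(cmd, max_vel))

print(clamp_speed(0.5))   # 0.22 -> the planner never exceeds the limit
print(clamp_speed(0.1))   # 0.1  -> commands below the limit pass through
```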
- Now, it is your turn to change the following parameters and observe the impact each has on the planner and the robot's motion. **Remember: for each of the following parameters you have to set at least `3` different values and provide your observations on how the parameter affects the planner and the robot's motion.** You can also show a screen recording and explain the impact of the parameter.
1. `max_vel_trans` & `min_vel_trans`
2. `max_vel_x` & `min_vel_x`
3. `max_vel_theta` & `min_vel_theta`
4. `xy_goal_tolerance`
5. `yaw_goal_tolerance`
6. `inflation_radius` & `cost_scaling_factor` in global_costmap
7. `inflation_radius` & `cost_scaling_factor` in local_costmap
- You can see the effect of the parameters `1`, `2` & `3` by looking at the `/cmd_vel` topic.
- For parameters `4` & `5` you can observe the effect visually.
- To change costmap parameters mentioned in `6` & `7`, you can click on the `inflation_layer` option under the `global_costmap` & `local_costmap` options on the left.
![costmap_param](images/costmap_parms.png)
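For intuition, the inflation layer assigns costs that decay exponentially with distance from an obstacle (per the `costmap_2d` documentation, cost falls off as `252 * exp(-cost_scaling_factor * (d - inscribed_radius))` beyond the robot's inscribed radius). A ROS-free sketch, with an assumed inscribed radius for illustration:

```python
import math

def inflation_cost(distance, cost_scaling_factor=10.0,
                   inscribed_radius=0.105, inflation_radius=0.3):
    if distance <= inscribed_radius:
        return 253                     # inscribed: definitely in collision
    if distance > inflation_radius:
        return 0                       # outside the inflation band
    # Exponential decay between the inscribed and inflation radii.
    return int(252 * math.exp(-cost_scaling_factor * (distance - inscribed_radius)))

# A larger cost_scaling_factor makes the cost drop off faster, so the
# planner is willing to pass closer to obstacles; a larger inflation_radius
# widens the band the planner tries to avoid.
print(inflation_cost(0.05))                        # 253
print(inflation_cost(0.2) > inflation_cost(0.25))  # True
```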
Here are some online resources that will help you understand and tune these parameters.
1. [This ROS Wiki](https://wiki.ros.org/navigation/Tutorials/RobotSetup#Global_Configuration:~:text=Global%20Configuration%20(global_costmap)) will help you understand about global and local costmaps.
2. A more detailed tuning guide can be found [here](https://emanual.robotis.com/docs/en/platform/turtlebot3/navigation/#tuning-guide).
04_MobileRobot/images/costmap_parms.png (69 KiB, added)
04_MobileRobot/images/mapping_start.png (121 KiB, added)
04_MobileRobot/images/max_vel.png (148 KiB, added)
04_MobileRobot/images/planner_params.png (113 KiB, added)
04_MobileRobot/images/rqt_reconfig.png (41.2 KiB, added)
04_MobileRobot/images/turtlebot_world.png (41 KiB, added)
