diff --git a/04_MobileRobot/README_2.md b/04_MobileRobot/README_2.md
new file mode 100644
index 0000000000000000000000000000000000000000..01198755af7b07e73779871aca7af676a2276fe8
--- /dev/null
+++ b/04_MobileRobot/README_2.md
@@ -0,0 +1,191 @@
# Mobile robot


## Launching in Simulation

- First, install the *turtlebot3_simulations* package in your catkin workspace and build it.

    ```bash
    cd ~/catkin_ws/src/
    ```

    ```bash
    git clone -b melodic-devel https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
    ```

- Install the dependencies for the `turtlebot3` packages:

    ```bash
    rosdep update
    ```

    ```bash
    cd ~/catkin_ws && rosdep install --rosdistro $ROS_DISTRO --ignore-src --from-paths src
    ```

- Make sure that the [`task_3`](../00_GettingStarted/docker/i2r/catkin_ws/src/task_3/) package is also inside the `catkin_ws/src` directory. Once the `task_3` package is in the workspace, run the following commands.

    ```bash
    catkin build
    ```

    ```bash
    source ~/.bashrc
    ```

- Export the name of the TurtleBot model we will use to the environment variable *TURTLEBOT3_MODEL* with the following command. Make sure to source the *.bashrc* file afterwards.

    ```bash
    echo "export TURTLEBOT3_MODEL=burger" >> ~/.bashrc
    ```

    ```bash
    source ~/.bashrc
    ```

- Now launch one of the predefined environments for `Gazebo` from the freshly installed **turtlebot3_gazebo** package. All the launch files with the different environments can be listed in the terminal with the following commands.

    ```bash
    roscd turtlebot3_gazebo
    ```

    ```bash
    ls launch
    ```

- Now we will launch the default world along with the **turtlebot** model using the <span id="robot_spawn_command">*robot_spawn_command*</span>

    ```bash
    roslaunch task_3 spawn_robot.launch
    ```

    This command will bring up the world along with the robot in it.

## Launching a SLAM node

Now we will **create a 2D map** of this world. We will later use this 2D map to navigate the robot from point A to point B.

- First, we will launch the SLAM node to start mapping. Here, we are using the [gmapping package](http://wiki.ros.org/gmapping), one of the most commonly used 2D mapping packages in the industry.

    ```bash
    roslaunch task_3 mapping.launch
    ```

    Successful execution of the launch file **will start the mapping process**, and an RViz window will open.

- Now you can control the robot manually and drive it through the world to map it. Run the following command, follow the instructions in the terminal, and use the various keys to move the robot around.

    ```bash
    roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
    ```

- Now that you have familiarized yourself with the mapping process, let us use a ROS node to do **autonomous mapping**, where the robot moves randomly through the world and maps it. You may have to make the script *simple_auto_drive.py* executable by following the instructions below.

    ```bash
    roscd task_3/scripts
    ```

    ```bash
    chmod +x simple_auto_drive.py
    ```

    ```bash
    rosrun task_3 simple_auto_drive.py
    ```

    You can stop the script by pressing `Ctrl+C` in the terminal where this node is running.

- The [simple_auto_drive.py](../00_GettingStarted/docker/i2r/catkin_ws/src/task_3/scripts/simple_auto_drive.py) script subscribes to the laser scan data from the robot and uses some simple logic to turn the robot away from nearby walls or obstacles. As you run the code you will see a **red line** appearing along where the robot moved. **This indicates the path that the robot took**. If you keep running the code long enough, you will see that it is **completely random**.
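For orientation, here is a minimal sketch of that subscribe-and-react pattern. It is not the actual contents of *simple_auto_drive.py*; the topic names `/scan` and `/cmd_vel` are the TurtleBot3 defaults, and the thresholds and speeds below are arbitrary assumptions.

```python
#!/usr/bin/env python
# Minimal sketch of the subscribe-and-react pattern used by simple_auto_drive.py.
# Assumptions: the robot publishes sensor_msgs/LaserScan on /scan and listens
# for geometry_msgs/Twist on /cmd_vel (the TurtleBot3 defaults); the real
# script in task_3 may use different thresholds and turning behaviour.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

cmd_pub = None

def scan_callback(scan):
    cmd = Twist()
    # Look at the readings roughly straight ahead (+/- 15 readings around index 0).
    front = [r for r in scan.ranges[:15] + scan.ranges[-15:]
             if scan.range_min < r < scan.range_max]
    if front and min(front) < 0.5:
        # Obstacle close in front: rotate in place.
        cmd.angular.z = 0.5
    else:
        # Path is clear: drive straight ahead.
        cmd.linear.x = 0.15
    cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('simple_auto_drive_sketch')
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/scan', LaserScan, scan_callback)
    rospy.spin()
```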
<video width="640" height="480" controls>
  <source src="images/random-mapping.mp4" type="video/mp4">
</video>

## Task 4.1

- Running the [simple_auto_drive.py](../00_GettingStarted/docker/i2r/catkin_ws/src/task_3/scripts/simple_auto_drive.py) code moves the robot **in a random manner**. This works if the area is wide open and there are no closed structures, such as rooms, to explore.

- In this task we will write code to **move the robot in a more structured manner.** A simple way to do that is to **follow the wall closest to you.**

### Your Task

- Your task is to **write a simple logic** to make the robot follow the wall. To keep it simple and get you started, we will spawn the robot close to a wall on the **left side of the robot.**

- You have to extend the [simple_auto_drive.py](../00_GettingStarted/docker/i2r/catkin_ws/src/task_3/scripts/simple_auto_drive.py) code and add more conditions so that the robot keeps a constant distance from the wall.

- You are free to choose how far the robot should stay from the wall. The **only condition for this task is that the robot should move close to the outside wall. It SHOULD NOT move into the white pillars in the center of the building.**

- Here is some simple pseudo code that will help you get started (a minimal Python translation is sketched at the end of this section).

```text
GET distance to the wall on the left.

IF too close to the wall:
    Turn away from the wall

IF too far away from the wall:
    Turn towards the wall

IF within the threshold:
    Move straight
```

- Remember that **we do not expect a perfect wall-following robot that always moves parallel to the wall.** The aim of this task is to get you started on how to think about a given problem, come up with a solution to it, and translate that solution into Python code.

- We have provided an example output below. You can clearly see that **the robot doesn't always move parallel to the wall.** Sometimes it might even be far away from the wall. However, it still satisfies the task requirement that **it should not move between the white pillars**. Also, remember that **we are not expecting you to reproduce the output shown in the example.** Just make sure that you satisfy the requirement of the task.

<video width="640" height="480" controls>
  <source src="images/mapping.mp4" type="video/mp4">
</video>

### Note:
Initially, to test whether your wall-following logic works, you just need to use the [robot_spawn_command](#robot_spawn_command) and then run the code that you extended using the `rosrun` command.
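If you are unsure how to begin, the following sketch shows one minimal way to turn the pseudo code above into Python. It is deliberately incomplete and not a solution to the task: the scan index used for the left-hand reading, the target distance, and the speeds are assumptions you will have to tune, and corners are not handled at all.

```python
#!/usr/bin/env python
# Sketch of the wall-following pseudo code, NOT a complete solution.
# Assumptions: /scan and /cmd_vel are the TurtleBot3 default topics, and
# scan.ranges[90] roughly corresponds to the reading 90 degrees to the left.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

TARGET_DIST = 0.5   # desired distance to the wall on the left [m] (tune this)
TOLERANCE = 0.1     # acceptable band around the target distance [m]

cmd_pub = None

def scan_callback(scan):
    cmd = Twist()
    cmd.linear.x = 0.15                 # keep moving forward
    left = scan.ranges[90]              # distance to the wall on the left (assumed index)
    if left < TARGET_DIST - TOLERANCE:
        cmd.angular.z = -0.3            # too close: turn away from the wall
    elif left > TARGET_DIST + TOLERANCE:
        cmd.angular.z = 0.3             # too far: turn towards the wall
    else:
        cmd.angular.z = 0.0             # within the band: go straight
    cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('wall_follow_sketch')
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/scan', LaserScan, scan_callback)
    rospy.spin()
```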
## Launching a Navigation node

Now that you have created a map of the world, you can use this 2D map to localize and navigate inside the world.

- You can use the [robot_spawn_command](#robot_spawn_command) to spawn the robot inside the world.

- The following command will initialize the [amcl](http://wiki.ros.org/amcl) node, which localizes the robot inside a given map, and the [move_base](http://wiki.ros.org/move_base) node, which plans the path to a given goal location. These two packages are among the most widely used in research and industry.

```bash
roslaunch turtlebot3_navigation turtlebot3_navigation.launch
```

- Now you can initialize the robot's pose on the map using the `2D Pose Estimate` tool that you will find in the top bar of the RViz window. You can also refer to the *Estimate Initial Pose* section of the [TurtleBot3 navigation guide](https://emanual.robotis.com/docs/en/platform/turtlebot3/navigation/), which will help you do this.

- Once the robot's pose is initialized on the map, you can use the `2D Nav Goal` tool to provide a navigation goal for the robot to reach.

## Task 4.2

- In this task we will explore the various navigation parameters and see how each of them affects the outcome of the robot's navigation. To keep it simple, **we will explore some of the most important and most frequently tuned parameters.**

- To complete this task you will have to change the values of the parameters mentioned below and present your results either as screenshots or as a screen recording, whichever you find more convenient.

- We can use the [dynamic reconfigure](https://automaticaddison.com/ros-parameters-and-the-dynamic_reconfigure-package-in-ros-noetic/#:~:text=How%20to%20Update%20Parameters%20Using%20the%20dynamic_reconfigure%20Package) option available in ROS to change the parameters at runtime and see how they affect the planning algorithm. You can use the following command to launch the dynamic reconfigure window.

```bash
rosrun rqt_reconfigure rqt_reconfigure
```

- In this window you can see all the parameters that you can change for the `move_base` node. To view the parameters available for the planner, click on the `DWAPlannerROS` entry on the left side; you will then see all the available parameters.

- For example, the `max_vel_trans` parameter sets the highest translational speed the robot is allowed to command; increasing it lets the robot move faster, and the robot will **not exceed this velocity limit.** You can verify this by looking at the `/cmd_vel` topic: the commanded velocity will not exceed the limit that is set.

- Now it is your turn to change the following parameters and observe the impact each of them has on the planner and the robot's motion. **Remember: for each of the following parameters you have to set at least `3` different values and report your observations on how the parameter affects the planner and the robot's motion.** You can also provide a screen recording and explain the impact of the parameter.
    1. `max_vel_trans` & `min_vel_trans`
    2. `max_vel_x` & `min_vel_x`
    3. `max_vel_theta` & `min_vel_theta`
    4. `xy_goal_tolerance`
    5. `yaw_goal_tolerance`
    6. `inflation_radius` & `cost_scaling_factor` in global_costmap
    7. `inflation_radius` & `cost_scaling_factor` in local_costmap

- You can see the effect of parameters `1`, `2` & `3` by looking at the `/cmd_vel` topic.

- Parameters `4` & `5` (the goal tolerances) are easier to observe visually.

- To change the costmap parameters mentioned in `6` & `7`, click on the `inflation_layer` option under the `global_costmap` & `local_costmap` entries on the left.


Here are some online resources that will help you understand and tune these parameters.

1. [This ROS Wiki page](https://wiki.ros.org/navigation/Tutorials/RobotSetup#Global_Configuration:~:text=Global%20Configuration%20(global_costmap)) will help you understand global and local costmaps.
2. More tuning guidance can be found [here](https://emanual.robotis.com/docs/en/platform/turtlebot3/navigation/#tuning-guide).
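If you prefer to script your parameter changes instead of clicking through the `rqt_reconfigure` window, the `dynamic_reconfigure` Python client can set the same values. The sketch below assumes the DWA planner's reconfigure server lives at `/move_base/DWAPlannerROS`; use whatever name appears in your `rqt_reconfigure` window.

```python
#!/usr/bin/env python
# Sketch: change a DWA planner parameter programmatically instead of through
# the rqt_reconfigure GUI. Assumes move_base is running and that its planner
# exposes a dynamic_reconfigure server at /move_base/DWAPlannerROS; adjust the
# server name and parameter to match your setup.
import rospy
import dynamic_reconfigure.client

if __name__ == '__main__':
    rospy.init_node('set_planner_params')
    client = dynamic_reconfigure.client.Client('/move_base/DWAPlannerROS', timeout=10)
    # Example: cap the translational velocity at 0.15 m/s, then watch /cmd_vel.
    config = client.update_configuration({'max_vel_trans': 0.15})
    rospy.loginfo('max_vel_trans is now %s', config['max_vel_trans'])
```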