# Development and deployment of a CNN component using EMADL2CPP
## Prerequisites
* Linux. Ubuntu Linux 16.04 and 18.04 were used during testing.
* Deep learning backend:
    * MXNet
        * training - the generated code is Python. Required are Python 2.7 or higher and the Python packages `h5py` and `mxnet` (for training on CPU) or e.g. `mxnet-cu75` for CUDA 7.5 (for training on GPU; select the concrete package according to your CUDA version). Follow the [official instructions on the MXNet site](https://mxnet.incubator.apache.org/install/index.html?platform=Linux&language=Python&processor=CPU); a minimal install is sketched below.
        * prediction - the generated code is C++. Install MXNet using the [official instructions on the MXNet site](https://mxnet.incubator.apache.org) for C++.
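For example, a CPU-only training setup could be installed as follows (a minimal sketch; choose the `mxnet` package variant that matches your hardware and CUDA version):
```
pip install h5py mxnet        # training on CPU
# pip install mxnet-cu75      # instead, for training on a GPU with CUDA 7.5
```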
### HowTo
1. Define an EMADL component containing the architecture of a neural network and save it in a `.emadl` file. For more information on the architecture language, please refer to the [CNNArchLang project](https://git.rwth-aachen.de/monticore/EmbeddedMontiArc/languages/CNNArchLang).
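A minimal sketch of such a component (the port shapes and layer choices below are illustrative, not the full VGG16 network):
```
// illustrative sketch only - port shapes and layers are placeholders
component VGG16{
    ports in Z(0:255)^{3, 224, 224} image,
          out Q(0:1)^{1000} predictions;

    implementation CNN {
        image ->
        Convolution(kernel=(3,3), channels=64) ->
        Relu() ->
        Pooling(pool_type="max", kernel=(2,2), stride=(2,2)) ->
        FullyConnected(units=1000) ->
        Softmax() ->
        predictions;
    }
}
```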
2. Define the training configuration for this network and save it in a `.cnnt` file, e.g. (the hyperparameter names and values below are illustrative):
```
configuration VGG16{
    // illustrative hyperparameters
    num_epoch:10
    batch_size:64
    optimizer:adam{
        learning_rate:0.01
    }
}
```
3. Generate GPL code for the specified deep learning backend using the jar package of an EMADL2CPP generator. The generator receives the following command line parameters:
* `-m` path to directory with EMADL models
* `-r` name of the root model
* `-o` output path
* `-b` backend
Assume both the architecture definition `VGG16.emadl` and the corresponding training configuration `VGG16.cnnt` are located in the folder `models`, and the target code should be generated into the `target` folder using the MXNet backend. An example command is then:
```java -jar embedded-montiarc-emadl-generator-0.2.4-SNAPSHOT-jar-with-dependencies.jar -m models -r VGG16 -o target -b MXNET```
You can find the EMADL2CPP jar here [link]
4. When the target code is generated, the corresponding trainer file (e.g. `CNNTrainer_<root_model_name>.py` in the case of MXNet) can be executed.
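For the example above, this could look as follows (a sketch; it assumes the generated files were placed directly in the `target` folder):
```
cd target
python CNNTrainer_VGG16.py
```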
# Development and deployment of an application for TORCS
## Prerequisites
1. Linux. Ubuntu Linux 16.04 and 18.04 were used during testing.
2. ROS, a Java runtime environment, GCC or Clang, and Armadillo - install them using your Linux distribution's package tools, e.g. apt on Ubuntu:
```apt-get install ros-base-dev clang openjdk-8-jre libarmadillo-dev```
3. MXNet - install using the [official instructions on the MXNet site](https://mxnet.incubator.apache.org/) for C++
4. TORCS (see below)
### TORCS Installation
1. Download the customized TORCS distribution from the [DeepDriving site](http://deepdriving.cs.princeton.edu/)
2. Unpack the downloaded archive and navigate to the `DeepDriving/torcs-1.3.6` directory
3. Compile and install by running `./configure --prefix=/opt/torcs && make -j && make install && make datainstall`
4. Remove the original TORCS tracks and copy the customized tracks:
```rm -rf /opt/torcs/share/games/torcs/tracks/* && cp -rf ../modified_tracks/* /opt/torcs/share/games/torcs/tracks/```
5. Start TORCS by running `/opt/torcs/bin/torcs`
Further installation help can be found in the Readme file provided with the DeepDriving distribution.
### TORCS Setup
1. Run TORCS
2. Configure a race:
    1. Select Race -> Quick Race -> Configure Race
    2. Select one of the maps with the `chenyi-` prefix and click Accept
    3. Remove all drivers from the Selected section on the left by selecting each driver and clicking (De)Select
    4. Select the driver `chenyi` on the right side and add it by clicking (De)Select
    5. Add other drivers with the `chenyi-` prefix if needed
    6. Click Accept -> Accept -> New Race
3. Use keys `1-9` and `M` to hide all widgets on the screen
4. Use the `F2` key to switch between camera modes and select a mode in which the car and its parts are not visible
5. Use the `PgUp`/`PgDown` keys to switch between cars and select `chenyi`, the car that does not drive on its own