# airshower
Authors: Jonas Glombitza, David Walz
Git: https://git.rwth-aachen.de/DavidWalz/airshower.git
arXiv: https://arxiv.org/pdf/1708.00647.pdf (A Deep Learning-based Reconstruction of Cosmic Ray-induced Air Showers)

Requirements: Keras, TensorFlow, seaborn, NumPy, and matplotlib

Training a deep neural network (DNN) for air shower reconstruction involves three steps:
## 1. Simulate data

Use the script `./sim.py` (shown below) to simulate showers: ~100,000 proton showers in 10 packages, i.e. 10,000 air showers per package.
Of these 100,000 showers, 60,000 are used for training and 40,000 for evaluation.
The data are saved to `./RawData/showers*`.
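Each package is a compressed `.npz` file whose keys match those written by `sim.py` (shown below). Here is a minimal sketch for inspecting one package, assuming the default mass number (A = 1), the first package, and that the files sit in `./RawData/` as described above:

```python
import numpy as np

# Load one simulated package (file name pattern taken from sim.py below;
# the ./RawData/ location is assumed from the description above).
data = np.load('./RawData/showers-A1-0.npz')

# List the stored arrays and their shapes, e.g. logE, mass, Xmax,
# showercore, showeraxis, showermax, time and signal.
for key in data.files:
    print(key, data[key].shape)
```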
## 2. Preprocess the simulated data

Use the script `PreProcessing.py`.
The script loads all showers into RAM and preprocesses the data (around 12 GB of RAM are needed!).
The preprocessed data are stored in `./Data_preprocessed/`.
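The actual preprocessing is defined in `PreProcessing.py`; the sketch below only illustrates its first stage, loading every raw package into RAM and merging the per-package arrays, which is why the memory requirement is large. The output file name and the omission of any normalization are assumptions for illustration only.

```python
import glob
import os

import numpy as np

# Illustrative sketch only -- not the actual PreProcessing.py.
# Load every simulated package into RAM (this is why ~12 GB are needed)
# and concatenate the per-package arrays into one array per key.
files = sorted(glob.glob('./RawData/showers-*.npz'))
packages = [np.load(f) for f in files]

merged = {key: np.concatenate([p[key] for p in packages])
          for key in packages[0].files}

# The real script additionally prepares/normalizes the inputs before saving;
# here only the merged arrays are stored (hypothetical output file name).
os.makedirs('./Data_preprocessed', exist_ok=True)
np.savez_compressed('./Data_preprocessed/showers_merged.npz', **merged)
```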
## 3. Train a DNN on the preprocessed data

Run one of the scripts in `./training/`.
After training, the script evaluates the trained model and plots the results (default output path: `./training/`; change it via `log_dir`).

Some useful parameters (see the illustrative sketch after this list for how they are typically used):
- `max_steps = 15` - number of training iterations (runtime)
- `lr = 0.001` - learning rate
- `log_dir = "."` - path for plots
- `nbatch = 132` - size of each mini-batch
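As an illustration of how these parameters typically enter a Keras training run, here is a minimal sketch. The tiny fully connected model and the randomly generated placeholder data are assumptions for illustration only; they are not the networks from `./training/` or from the paper.

```python
import os

import numpy as np
from tensorflow import keras

# Hyperparameters as listed above.
max_steps = 15      # number of training iterations (epochs here)
lr = 0.001          # learning rate
log_dir = "."       # where plots / model files are written
nbatch = 132        # mini-batch size

# Placeholder data -- in the real scripts this comes from ./Data_preprocessed/.
x_train = np.random.rand(1000, 81)   # assumed flattened per-station inputs
y_train = np.random.rand(1000, 1)    # e.g. one regression target per shower

# Placeholder model -- NOT the architecture from the paper.
model = keras.Sequential([
    keras.layers.Dense(64, activation='relu', input_shape=(x_train.shape[1],)),
    keras.layers.Dense(1)
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr), loss='mse')

model.fit(x_train, y_train, batch_size=nbatch, epochs=max_steps)

# Save the trained model next to the plots in log_dir.
model.save(os.path.join(log_dir, 'model.h5'))
```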
There are 5 DNNs for the 3 reconstruction tasks:

Angular reconstruction
- time input only
- all inputs

Energy reconstruction
- total signal input only
- all inputs

Xmax reconstruction
- all inputs

For more details on the network architectures, see the paper: https://arxiv.org/pdf/1708.00647.pdf
sim.py:

import argparse

import numpy as np

from airshower import shower, utils, plotting

parser = argparse.ArgumentParser()
parser.add_argument("--mass", default=1, type=int, help='mass number')
parser.add_argument("--nfiles", default=1, type=int, help='number of files to simulate')
parser.add_argument("--nevents", default=1000, type=int, help='number of events per file')
args = parser.parse_args()

print('Simulating')
print('A', args.mass)
print('nfiles', args.nfiles)
print('nevents', args.nevents)

# Station positions of the simulated detector array (offset layout).
v_stations = utils.station_coordinates(9, layout='offset')

for i in range(args.nfiles):
    # Draw log-energies uniformly in [18.5, 20) and use a fixed mass number.
    logE = 18.5 + 1.5 * np.random.rand(args.nevents)
    mass = args.mass * np.ones(args.nevents)

    # Simulate the events and save one compressed package per file.
    data = shower.rand_events(logE, mass, v_stations)
    np.savez_compressed(
        'showers-A%i-%i.npz' % (args.mass, i),
        detector=data['detector'],
        logE=data['logE'],
        mass=data['mass'],
        Xmax=data['Xmax'],
        showercore=data['showercore'],
        showeraxis=data['showeraxis'],
        showermax=data['showermax'],
        time=data['time'],
        signal=data['signal'])
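For a quick interactive test, the same functions can also be called directly instead of going through the command line; the calls below mirror `sim.py`, only the small event count is chosen for illustration.

```python
import numpy as np
from airshower import shower, utils

# Generate a small in-memory test set (10 proton events, A = 1),
# using the same calls as sim.py above.
v_stations = utils.station_coordinates(9, layout='offset')
logE = 18.5 + 1.5 * np.random.rand(10)
mass = np.ones(10)
data = shower.rand_events(logE, mass, v_stations)

print(data['signal'].shape, data['time'].shape)
```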