Commit 79093d42 authored by Jan Habscheid

v1

parent 12e3f1cd
1 merge request: !2 Project3
@@ -11,4 +11,4 @@ venv
 .DS_Store
 # Ignore LyX temporary files
 *.lyx~
 *.lyx#
\ No newline at end of file
@@ -107,35 +107,50 @@ name "sec:Discussion-and-Conclusion"
 \end_layout
 \begin_layout Standard
-How to improve the identification of
+This work presented an accurate algorithm to quantify the uncertainty in
 \begin_inset Formula $k(x)$
 \end_inset
+in a scalar conservation law of the form
+\begin_inset Formula $u_{t}+\left(k(x)f(u)\right)_{x}=0$
+\end_inset
 .
 \end_layout
-\begin_layout Itemize
-Increase number of iterations
+\begin_layout Standard
+The major impact of the resistance function on the solution, and the importance of predicting it accurately, were emphasized.
+An optimal choice of the different hyperparameters was discussed, along with their influence on the solution.
 \end_layout
-\begin_layout Itemize
-Run
-\begin_inset Formula $n$
-\end_inset
-MCMC iterations with different random starts and take best one
+\begin_layout Standard
+Increasing the observations from one dataset to two datasets showed a positive effect on the prediction of the resistance function.
+\end_layout
+\begin_layout Standard
+For future work it remains interesting to find a more efficient method, as the MCMC algorithm requires many iterations and is computationally expensive.
+Furthermore, several efforts can be made to improve the identification of the resistance function.
 \end_layout
-\begin_layout Itemize
-Increase number of true data
+\begin_layout Enumerate
+Increase the number of MCMC iterations and adapt the control parameter
+\begin_inset Formula $\beta$
+\end_inset
+when the convergence does not improve over a certain number of steps.
 \end_layout
-\begin_layout Itemize
-Find best hyperparameter
+\begin_layout Enumerate
+Run several independent MCMC iterations with different random starts and take the one with the smallest loss.
 \end_layout
-\begin_layout Itemize
-Use even more true solutions
+\begin_layout Enumerate
+Increase the number of true datasets, as going from one to two datasets showed promising results.
 \end_layout
 \end_body
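The hunk above closes with three future-work items. As a rough illustration of items 1 and 2 (adapting the control parameter beta when convergence stalls, and running several independent MCMC chains from different random starts and keeping the one with the smallest loss), a minimal Python sketch follows; the log_posterior interface, the halving rule for beta, and all tuning values are assumptions made for illustration, not the project's actual code.

import numpy as np

def run_chain(log_posterior, n_params, n_iter=20000, beta=0.05,
              patience=2000, seed=0):
    """One random-walk Metropolis chain.  The step size beta is halved
    whenever the best loss has not improved for `patience` iterations
    (an assumed reading of future-work item 1)."""
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal(n_params)            # random start
    lp = log_posterior(theta)
    best_theta, best_loss, stalled = theta.copy(), -lp, 0
    for _ in range(n_iter):
        proposal = theta + beta * rng.standard_normal(n_params)
        lp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis acceptance
            theta, lp = proposal, lp_prop
        if -lp < best_loss:
            best_theta, best_loss, stalled = theta.copy(), -lp, 0
        else:
            stalled += 1
        if stalled >= patience:                      # convergence stalled:
            beta *= 0.5                              # take smaller steps
            stalled = 0
    return best_theta, best_loss

def best_of_restarts(log_posterior, n_params, n_restarts=5):
    """Future-work item 2: independent chains with different random starts;
    keep the one with the smallest loss."""
    runs = [run_chain(log_posterior, n_params, seed=s) for s in range(n_restarts)]
    return min(runs, key=lambda r: r[1])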
@@ -742,10 +742,6 @@ If the acceptance rate is above
 \end_inset
-\end_layout
-\end_deeper
-\begin_layout Enumerate
 \begin_inset Note Note
 status open
@@ -754,11 +750,7 @@ The different observations are weighted differently to ensure that the resistanc
 \begin_inset Formula $k(x)$
 \end_inset
-is accurately calculated on the whole domain
-\end_layout
-\begin_layout Enumerate
-Consider the weighting vector
+is accurately calculated on the whole domain Consider the weighting vector
 \begin_inset Formula $w=[w_{1},w_{2},w_{3},w_{4}]$
 \end_inset
@@ -770,7 +762,6 @@ Consider the weighting vector
 \end_layout
-\begin_deeper
 \begin_layout Enumerate
 After each iteration update the weights
 \end_layout
@@ -788,13 +779,13 @@ Calculate
 \end_layout
-\end_deeper
 \end_deeper
 \end_inset
 \end_layout
+\end_deeper
 \begin_layout Standard
 The randomly generated parameter
 \begin_inset Formula $\Theta^{(0)}$
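The commented-out note in the hunks above describes weighting the observations with a vector w = [w_1, w_2, w_3, w_4] that is updated after each iteration so that k(x) is constrained accurately on the whole domain; the actual update rule is elided ("Calculate ..."). A minimal sketch of one possible weighted misfit is given below, assuming an error-proportional re-weighting that is not taken from the report; the function names are illustrative.

import numpy as np

def weighted_loss(predictions, observations, w):
    """Weighted sum of per-dataset mean squared errors."""
    errors = np.array([np.mean((p - o) ** 2)
                       for p, o in zip(predictions, observations)])
    return float(np.dot(w, errors)), errors

def update_weights(errors, eps=1e-12):
    """Assumed rule: weight each dataset by its current error, so the datasets
    that are fitted worst pull harder on k(x) in the next iteration."""
    w = errors + eps
    return w / w.sum()

# Usage sketch inside one MCMC iteration (names are illustrative):
# loss, errors = weighted_loss(simulate(theta), observations, w)
# w = update_weights(errors)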