Unverified Commit 99bdbead authored by Thomas Michael Timmermanns, committed by GitHub

Changed pooling and shape syntax. (#17)

* Syntax change.
Changed MaxPooling and AveragePooling to Pooling with type argument.
Added method GlobalPooling and removed 'global' argument of Pooling.
Changed special argument 'If' to '?'.

* Changed IO Shapes.
The architecture can now handle other data formats (until now: NHWC only).
parent 5f2575c7
......@@ -25,26 +25,26 @@ There are more compact versions of the same architecture but we will get to that
All predefined methods are listed at the end of this document.
```
architecture Alexnet_alt(img_height=224, img_width=224, img_channels=3, classes=10){
def input Z(0:255)^{img_height, img_width, img_channels} image
def output Q(0:1)^{classes} predictions
def input Z(0:255)^{H:img_height, W:img_width, C:img_channels} image
def output Q(0:1)^{C:classes} predictions
image ->
Convolution(kernel=(11,11), channels=96, stride=(4,4), padding="no_loss") ->
Lrn(nsize=5, alpha=0.0001, beta=0.75) ->
MaxPooling(kernel=(3,3), stride=(2,2), padding="no_loss") ->
Pooling(type="max", kernel=(3,3), stride=(2,2), padding="no_loss") ->
Relu() ->
Split(n=2) ->
(
[0] ->
Convolution(kernel=(5,5), channels=128) ->
Lrn(nsize=5, alpha=0.0001, beta=0.75) ->
MaxPooling(kernel=(3,3), stride=(2,2), padding="no_loss") ->
Pooling(type="max", kernel=(3,3), stride=(2,2), padding="no_loss") ->
Relu()
|
[1] ->
Convolution(kernel=(5,5), channels=128) ->
Lrn(nsize=5, alpha=0.0001, beta=0.75) ->
MaxPooling(kernel=(3,3), stride=(2,2), padding="no_loss") ->
Pooling(type="max", kernel=(3,3), stride=(2,2), padding="no_loss") ->
Relu()
) ->
Concatenate() ->
......@@ -56,14 +56,14 @@ architecture Alexnet_alt(img_height=224, img_width=224, img_channels=3, classes=
Convolution(kernel=(3,3), channels=192) ->
Relu() ->
Convolution(kernel=(3,3), channels=128) ->
MaxPooling(kernel=(3,3), stride=(2,2), padding="no_loss") ->
Pooling(type="max", kernel=(3,3), stride=(2,2), padding="no_loss") ->
Relu()
|
[1] ->
Convolution(kernel=(3,3), channels=192) ->
Relu() ->
Convolution(kernel=(3,3), channels=128) ->
MaxPooling(kernel=(3,3), stride=(2,2), padding="no_loss") ->
Pooling(type="max", kernel=(3,3), stride=(2,2), padding="no_loss") ->
Relu()
) ->
Concatenate() ->
......@@ -100,17 +100,18 @@ An architecture in CNNArch can have multiple inputs and outputs.
Multiple inputs (or outputs) of the same form can be initialized as arrays.
Assuming `h` and `w` are architecture parameters, the following is a valid example:
```
def input Z(0:255)^{h,w,3} image[2]
def input Q(-oo:+oo)^{10} additionalData
def output Q(0:1)^{3} predictions
def input Z(0:255)^{H:h, W:w, C:3} image[2]
def input Q(-oo:+oo)^{C:10} additionalData
def output Q(0:1)^{C:3} predictions
```
The first line defines the input *image* as an array of two RGB (or BGR) images with a resolution of `h` x `w`.
The part `Z(0:255)`, which corresponds to the type definition in EmbeddedMontiArc, restricts the values to integers between 0 and 255.
The following line `{h=200,w=300,c=3}` declares the shape of the input.
The shape denotes the dimensionality in form of height, width and depth(number of channels).
The part `{H:h, W:w, C:3}` declares the shape of the input.
The shape denotes the dimensionality in form of height `H`, width `W` and depth `C` (number of channels).
Here, the height is initialized as `h`, the width as `w` and the number of channels is 3.
The second line defines another input with 10 dimensions and arbitrary rational values.
The last line defines an output as the probability of 3 classes.
The second line defines another input with one dimension of size 10 and arbitrary rational values.
The last line defines a one-dimensional output of size 3 with rational values between 0 and 1 (the probabilities of 3 classes).
The order of `H`, `W` and `C` determines the data format (e.g. NHWC or NCHW) of the input or output of the network.
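For example, a channels-first (NCHW) variant of the image input could be declared as follows (a sketch that only reorders the dimension arguments from the example above):
```
def input Z(0:255)^{C:3, H:h, W:w} image[2]
```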
If an input or output is an array, it can be used in the architecture in two different ways.
Either a single element is accessed or the array is used as a whole.
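A sketch of element access (the layer choices are illustrative, and it assumes a single element such as `image[0]` can start a stream just like a scalar input):
```
image[0] ->
Convolution(kernel=(3,3), channels=32) ->
Relu()
```
Writing `image ->` instead would feed the array as a whole into the following layer.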
......@@ -131,7 +132,7 @@ The following is an example of multiple method declarations.
def conv(filter, channels, stride=1, act=true){
Convolution(kernel=(filter,filter), channels=channels, stride=(stride,stride)) ->
BatchNorm() ->
Relu(If=act)
Relu(?=act)
}
def skip(channels, stride){
Convolution(kernel=(1,1), channels=channels, stride=(stride,stride)) ->
......@@ -142,25 +143,25 @@ The following is an example of multiple method declarations.
conv(filter=3, channels=channels, stride=stride) ->
conv(filter=3, channels=channels, act=false)
|
skip(channels=channels, stride=stride, If=(stride!=1))
skip(channels=channels, stride=stride, ?=(stride!=1))
) ->
Add() ->
Relu()
}
```
The method `resLayer` in this example corresponds to a building block of a Residual Network.
The `If` argument is a special argument which is explained in the next section.
The `?` argument is a special argument which is explained in the next section.
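As a usage sketch, a small residual stack could then be written as follows (assuming `stride` has a default value of 1 in the truncated declaration of `resLayer`, which is not visible in this excerpt):
```
resLayer(channels=64, stride=2) ->
resLayer(channels=64) ->
resLayer(channels=64)
```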
## Special Arguments
There exist special structural arguments which can be used in each method.
These are `->`, `|` and `If`. `->` and `|` can only be positive integers and `If` can only be a boolean.
The argument `If` does not nothing if it is true and removes the layer completely if it is false.
These are `->`, `|` and `?`. `->` and `|` can only be positive integers and `?` can only be a boolean.
The argument `?` does nothing if it is true and removes the layer completely if it is false.
The other two arguments create a repetition of the method.
We will show their effect with examples.
Assuming `a` is a method without required arguments,
then `a(-> = 3)->` is equal to `a()->a()->a()->`,
`a(| = 3)->` is equal to `(a() | a() | a())->` and
`a(-> = 3, | = 2->)` is equal to `(a()->a()->a() | a()->a()->a())->`.
`a(-> = 3, | = 2)->` is equal to `(a()->a()->a() | a()->a()->a())->`.
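The conditional argument `?` composes with constructed methods in the same way. A minimal sketch (the method name `block` and its parameter `use_dropout` are illustrative, not part of the language):
```
def block(channels, use_dropout=false){
    Convolution(kernel=(3,3), channels=channels) ->
    Relu() ->
    Dropout(p=0.5, ?=use_dropout)
}
```
Calling `block(channels=32)` omits the dropout layer entirely; `block(channels=32, use_dropout=true)` keeps it.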
## Argument Sequences
It is also possible to create a repetition of a method in another way through the use of argument sequences.
......@@ -197,13 +198,13 @@ The comparison operators do not work reliably for the comparison of tuple (they
This version of Alexnet, which uses method construction, argument sequences and special arguments, is identical to the one in the section Basic Structure.
```
architecture Alexnet_alt2(img_height=224, img_width=224, img_channels=3, classes=10){
def input Z(0:255)^{img_height, img_width, img_channels} image
def output Q(0:1)^{classes} predictions
def input Z(0:255)^{H:img_height, W:img_width, C:img_channels} image
def output Q(0:1)^{C:classes} predictions
def conv(filter, channels, convStride=1, poolStride=1, hasLrn=false, convPadding="same"){
Convolution(kernel=(filter,filter), channels=channels, stride=(convStride,convStride), padding=convPadding) ->
Lrn(nsize=5, alpha=0.0001, beta=0.75, If=hasLrn) ->
MaxPooling(kernel=(3,3), stride=(poolStride,poolStride), padding="no_loss", If=(poolStride != 1)) ->
Lrn(nsize=5, alpha=0.0001, beta=0.75, ?=hasLrn) ->
Pooling(type="max", kernel=(3,3), stride=(poolStride,poolStride), padding="no_loss", ?=(poolStride != 1)) ->
Relu()
}
def split1(i){
......@@ -256,7 +257,7 @@ All predefined methods start with a capital letter and all constructed methods h
* **kernel** (integer tuple > 0, required): convolution kernel size: (height, width).
* **channels** (integer > 0, required): number of convolution filters and number of output channels.
* **stride** (integer tuple > 0, optional, default=(1,1)): convolution stride: (height, width).
* **padding** (String, optional, default="same"): One of "valid", "same" or "no_loss". "valid" means no padding. "same" results in padding the input such that the output has the same length as the original input divided by the stride (rounded up). "no_loss" results in minimal padding such that each input is used by at least one filter (identical to "valid" if *stride* equals 1).
* **padding** ({"valid", "same", "no_loss"}, optional, default="same"): One of "valid", "same" or "no_loss". "valid" means no padding. "same" results in padding the input such that the output has the same length as the original input divided by the stride (rounded up). "no_loss" results in minimal padding such that each input is used by at least one filter (identical to "valid" if *stride* equals 1).
* **no_bias** (boolean, optional, default=false): Whether to disable the bias parameter.
* **Softmax()**
......@@ -285,24 +286,21 @@ All predefined methods start with a capital letter and all constructed methods h
Applies dropout operation to input array during training.
* **p** (1 >= float >= 0, optional, default=0.5): Fraction of the input that gets dropped out during training time.
* **Pooling(type, kernel, stride=(1,1), padding="same")**
* **MaxPooling(kernel, stride=(1,1), padding="same", global=false)**
Performs max pooling on the input.
* **kernel** (integer tuple > 0, required): convolution kernel size: (height, width). Is not required if *global* is true.
Performs pooling on the input.
* **type** ({"avg", "max"}, required): Pooling type to be applied.
* **kernel** (integer tuple > 0, required): pooling kernel size: (height, width).
* **stride** (integer tuple > 0, optional, default=(1,1)): pooling stride: (height, width).
* **padding** (String, optional, default="same"): One of "valid", "same" or "no_loss". "valid" means no padding. "same" results in padding the input such that the output has the same length as the original input divided by the stride (rounded up). "no_loss" results in minimal padding such that each input is used by at least one filter (identical to "valid" if *stride* equals 1).
* **global** (boolean, optional, default=false): Ignore kernel, stride and padding, do global pooling based on current input feature map.
* **padding** ({"valid", "same", "no_loss"}, optional, default="same"): One of "valid", "same" or "no_loss". "valid" means no padding. "same" results in padding the input such that the output has the same length as the original input divided by the stride (rounded up). "no_loss" results in minimal padding such that each input is used by at least one filter (identical to "valid" if *stride* equals 1).
* **AveragePooling(kernel, stride=(1,1), padding="same", global=false)**
* **GlobalPooling(type)**
Performs average pooling on the input.
* **kernel** (integer tuple > 0, required): convolution kernel size: (height, width). Is not required if *global* is true.
* **stride** (integer tuple > 0, optional, default=(1,1)): convolution stride: (height, width).
* **padding** (String, optional, default="same"): One of "valid", "same" or "no_loss". "valid" means no padding. "same" results in padding the input such that the output has the same length as the original input divided by the stride (rounded up). "no_loss" results in minimal padding such that each input is used by at least one filter (identical to "valid" if *stride* equals 1).
* **global** (boolean, optional, default=false): Ignore kernel, stride and padding, do global pooling based on current input feature map.
Performs global pooling on the input.
* **type** ({"avg", "max"}, required): Pooling type to be applied.
* **Lrn(nsize, knorm=2, alpha=0.0001, beta=0.75)**
......
......@@ -23,7 +23,11 @@ grammar CNNArch extends de.monticore.lang.math.Math {
ArchType implements Type = ElementType "^" Shape;
Shape = "{" dimensions:(ArchSimpleExpression || ",")+ "}";
Shape = "{" dimensions:(DimensionArgument || ",")* "}";
DimensionArgument = (name:"H" ":" height:ArchSimpleExpression
| name:"W" ":" width:ArchSimpleExpression
| name:"C" ":" channels:ArchSimpleExpression);
ArchitectureParameter implements Variable = Name& ("=" default:ArchSimpleExpression)?;
......@@ -46,7 +50,8 @@ grammar CNNArch extends de.monticore.lang.math.Math {
ArchParameterArgument implements ArchArgument = Name "=" rhs:ArchExpression;
ArchSpecialArgument implements ArchArgument = (serial:"->" | parallel:"|") "=" rhs:ArchExpression;
ArchSpecialArgument implements ArchArgument = (serial:"->" | parallel:"|" | conditional:"?") "="
rhs:ArchExpression;
ast ArchSpecialArgument = method public String getName(){return "";};
ParallelLayer implements ArchitectureElement = "(" groups:ArchBody "|" groups:(ArchBody || "|")+ ")";
......
......@@ -27,8 +27,8 @@ public class ASTArchSpecialArgument extends ASTArchSpecialArgumentTOP {
public ASTArchSpecialArgument() {
}
public ASTArchSpecialArgument(ASTArchExpression rhs, String serial, String parallel) {
super(rhs, serial, parallel);
public ASTArchSpecialArgument(ASTArchExpression rhs, String serial, String parallel, String conditional) {
super(rhs, serial, parallel, conditional);
}
@Override
......@@ -39,7 +39,12 @@ public class ASTArchSpecialArgument extends ASTArchSpecialArgumentTOP {
else if (getSerial().isPresent()) {
return AllPredefinedVariables.FOR_NAME;
}
return null;
else if (getConditional().isPresent()){
return AllPredefinedVariables.IF_NAME;
}
else {
throw new IllegalStateException();
}
}
}
......@@ -26,7 +26,6 @@ public class CNNArchPostResolveCocos {
return new CNNArchCoCoChecker()
.addCoCo(new CheckLayerInputs())
.addCoCo(new CheckIOAccessAndIOMissing())
.addCoCo(new CheckIOType())
.addCoCo(new CheckIOShape());
}
......
......@@ -31,7 +31,7 @@ public class CheckArgument implements CNNArchASTArchArgumentCoCo {
public void check(ASTArchArgument node) {
ArgumentSymbol argument = (ArgumentSymbol) node.getSymbol().get();
if (argument.getParameter() == null){
Log.error("0"+ ErrorCodes.UNKNOWN_ARGUMENT_CODE + " Unknown Argument. " +
Log.error("0"+ ErrorCodes.UNKNOWN_ARGUMENT + " Unknown Argument. " +
"Parameter with name '" + node.getName() + "' does not exist."
, node.get_SourcePositionStart());
}
......
......@@ -34,7 +34,7 @@ public class CheckIOName implements CNNArchASTIODeclarationCoCo {
@Override
public void check(ASTIODeclaration node) {
if (ioNames.contains(node.getName())){
Log.error("0" + ErrorCodes.DUPLICATED_NAME_CODE + " Duplicated IO name. " +
Log.error("0" + ErrorCodes.DUPLICATED_NAME + " Duplicated IO name. " +
"The name '" + node.getName() + "' is already used."
, node.get_SourcePositionStart());
}
......
......@@ -20,36 +20,59 @@
*/
package de.monticore.lang.monticar.cnnarch._cocos;
import de.monticore.lang.monticar.cnnarch._ast.ASTIODeclaration;
import de.monticore.lang.monticar.cnnarch._ast.ASTDimensionArgument;
import de.monticore.lang.monticar.cnnarch._ast.ASTShape;
import de.monticore.lang.monticar.cnnarch._symboltable.ArchSimpleExpressionSymbol;
import de.monticore.lang.monticar.cnnarch._symboltable.IODeclarationSymbol;
import de.monticore.lang.monticar.cnnarch._symboltable.ShapeSymbol;
import de.monticore.lang.monticar.cnnarch.helper.ErrorCodes;
import de.se_rwth.commons.logging.Log;
import java.util.Optional;
public class CheckIOShape implements CNNArchASTIODeclarationCoCo {
public class CheckIOShape implements CNNArchASTShapeCoCo {
@Override
public void check(ASTIODeclaration node) {
int shapeSize = node.getType().getShape().getDimensions().size();
if (shapeSize != 1 && shapeSize != 3){
Log.error("0" + ErrorCodes.INVALID_IO_SHAPE + " Invalid shape. " +
"IO Shape has to be either {height, width, channels} or {channels}."
, node.getType().getShape().get_SourcePositionStart());
}
else {
IODeclarationSymbol ioDeclaration = (IODeclarationSymbol) node.getSymbol().get();
for (ArchSimpleExpressionSymbol dimension : ioDeclaration.getShape().getDimensionSymbols()){
Optional<Integer> value = dimension.getIntValue();
if (!value.isPresent() || value.get() <= 0){
Log.error("0" + ErrorCodes.INVALID_IO_SHAPE + " Invalid shape. " +
"The dimension can only be defined by a positive integer."
, dimension.getSourcePosition());
public void check(ASTShape node) {
boolean hasHeight = false;
boolean hasWidth = false;
boolean hasChannels = false;
for (ASTDimensionArgument dimensionArg : node.getDimensions()){
if (dimensionArg.getWidth().isPresent()){
if (hasWidth){
repetitionError(dimensionArg);
}
hasWidth = true;
}
else if (dimensionArg.getHeight().isPresent()){
if (hasHeight){
repetitionError(dimensionArg);
}
hasHeight = true;
}
else {
if (hasChannels){
repetitionError(dimensionArg);
}
hasChannels = true;
}
}
ShapeSymbol shape = (ShapeSymbol) node.getSymbol().get();
for (ArchSimpleExpressionSymbol dimension : shape.getDimensionSymbols()){
Optional<Integer> value = dimension.getIntValue();
if (!value.isPresent() || value.get() <= 0){
Log.error("0" + ErrorCodes.INVALID_IO_SHAPE + " Invalid shape. " +
"The dimensions can only be defined by a positive integer."
, dimension.getSourcePosition());
}
}
}
private void repetitionError(ASTDimensionArgument node){
Log.error("0" + ErrorCodes.INVALID_IO_SHAPE + " Invalid shape. " +
"The dimension '" + node.getName().get() + "' was defined multiple times. "
, node.get_SourcePositionStart());
}
}
/**
*
* ******************************************************************************
* MontiCAR Modeling Family, www.se-rwth.de
* Copyright (c) 2017, Software Engineering Group at RWTH Aachen,
* All rights reserved.
*
* This project is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 3.0 of the License, or (at your option) any later version.
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public
* License along with this project. If not, see <http://www.gnu.org/licenses/>.
* *******************************************************************************
*/
package de.monticore.lang.monticar.cnnarch._cocos;
import de.monticore.lang.monticar.cnnarch._ast.ASTArchitecture;
public class CheckIOType implements CNNArchASTArchitectureCoCo {
@Override
public void check(ASTArchitecture node) {
//todo:
}
}
......@@ -26,7 +26,6 @@ import de.monticore.lang.monticar.cnnarch._symboltable.MethodDeclarationSymbol;
import de.monticore.lang.monticar.cnnarch._symboltable.MethodLayerSymbol;
import de.monticore.lang.monticar.cnnarch._symboltable.VariableSymbol;
import de.monticore.lang.monticar.cnnarch.helper.ErrorCodes;
import de.monticore.lang.monticar.cnnarch.predefined.AllPredefinedMethods;
import de.se_rwth.commons.logging.Log;
import java.util.HashSet;
......@@ -40,7 +39,7 @@ public class CheckMethodLayer implements CNNArchASTMethodLayerCoCo{
for (ASTArchArgument argument : node.getArguments()){
String name = argument.getName();
if (nameSet.contains(name)){
Log.error("0" + ErrorCodes.DUPLICATED_ARG_CODE + " Duplicated name: " + name +
Log.error("0" + ErrorCodes.DUPLICATED_ARG + " Duplicated name: " + name +
". Multiple values assigned to the same argument."
, argument.get_SourcePositionStart());
}
......@@ -51,7 +50,7 @@ public class CheckMethodLayer implements CNNArchASTMethodLayerCoCo{
MethodDeclarationSymbol method = ((MethodLayerSymbol) node.getSymbol().get()).getMethod();
if (method == null){
Log.error("0" + ErrorCodes.UNKNOWN_METHOD_CODE + " Unknown method error. " +
Log.error("0" + ErrorCodes.UNKNOWN_METHOD + " Unknown method error. " +
"Method with name '" + node.getName() + "' does not exist"
, node.get_SourcePositionStart());
}
......@@ -64,13 +63,10 @@ public class CheckMethodLayer implements CNNArchASTMethodLayerCoCo{
}
for (ASTArchArgument argument : node.getArguments()){
requiredArguments.remove(argument.getName());
if (argument.getName().equals(AllPredefinedMethods.GLOBAL_NAME)){
requiredArguments.remove(AllPredefinedMethods.KERNEL_NAME);
}
}
for (String missingArgumentName : requiredArguments){
Log.error("0"+ErrorCodes.MISSING_ARGUMENT_CODE + " Missing argument. " +
Log.error("0"+ErrorCodes.MISSING_ARGUMENT + " Missing argument. " +
"The argument '" + missingArgumentName + "' is required."
, node.get_SourcePositionStart());
}
......
......@@ -21,10 +21,6 @@
package de.monticore.lang.monticar.cnnarch._cocos;
import de.monticore.lang.monticar.cnnarch._ast.ASTMethodDeclaration;
import de.monticore.lang.monticar.cnnarch._symboltable.CompositeLayerSymbol;
import de.monticore.lang.monticar.cnnarch._symboltable.LayerSymbol;
import de.monticore.lang.monticar.cnnarch._symboltable.MethodDeclarationSymbol;
import de.monticore.lang.monticar.cnnarch._symboltable.MethodLayerSymbol;
import de.monticore.lang.monticar.cnnarch.helper.ErrorCodes;
import de.se_rwth.commons.logging.Log;
......@@ -39,13 +35,13 @@ public class CheckMethodName implements CNNArchASTMethodDeclarationCoCo {
public void check(ASTMethodDeclaration node) {
String name = node.getName();
if (name.isEmpty() || !Character.isLowerCase(name.codePointAt(0))){
Log.error("0" + ErrorCodes.ILLEGAL_NAME_CODE + " Illegal name: " + name +
Log.error("0" + ErrorCodes.ILLEGAL_NAME + " Illegal name: " + name +
". All new variable and method names have to start with a lowercase letter. "
, node.get_SourcePositionStart());
}
if (methodNames.contains(name)){
Log.error("0" + ErrorCodes.DUPLICATED_NAME_CODE + " Duplicated method name. " +
Log.error("0" + ErrorCodes.DUPLICATED_NAME + " Duplicated method name. " +
"The name '" + name + "' is already used."
, node.get_SourcePositionStart());
}
......
......@@ -55,7 +55,7 @@ public class CheckMethodRecursion implements CNNArchASTMethodDeclarationCoCo {
if (method != null && !method.isPredefined() && !seenMethods.contains(method)) {
seenMethods.add(method);
if (startingMethod == method) {
Log.error("0" + ErrorCodes.RECURSION_ERROR_CODE + " Recursion is not allowed. " +
Log.error("0" + ErrorCodes.RECURSION_ERROR + " Recursion is not allowed. " +
"The method '" + startingMethod.getName() + "' creates a recursive cycle."
, startingMethod.getSourcePosition());
done = true;
......
......@@ -43,7 +43,7 @@ public class CheckUnknownIO implements CNNArchASTIOLayerCoCo {
}
if (ioDeclaration == null){
Log.error("0" + ErrorCodes.UNKNOWN_IO_CODE + " Unknown input or output name. " +
Log.error("0" + ErrorCodes.UNKNOWN_IO + " Unknown input or output name. " +
"The input or output '" + node.getName() + "' does not exist"
, node.get_SourcePositionStart());
}
......
......@@ -45,17 +45,17 @@ public class CheckVariableName implements CNNArchASTVariableCoCo {
private void checkForIllegalNames(ASTVariable node){
String name = node.getName();
if (name.isEmpty() || !Character.isLowerCase(name.codePointAt(0))){
Log.error("0" + ErrorCodes.ILLEGAL_NAME_CODE + " Illegal name: " + name +
Log.error("0" + ErrorCodes.ILLEGAL_NAME + " Illegal name: " + name +
". All new variable and method names have to start with a lowercase letter. "
, node.get_SourcePositionStart());
}
else if (name.equals(AllPredefinedVariables.TRUE_NAME) || name.equals(AllPredefinedVariables.FALSE_NAME)){
Log.error("0" + ErrorCodes.ILLEGAL_NAME_CODE + " Illegal name: " + name +
Log.error("0" + ErrorCodes.ILLEGAL_NAME + " Illegal name: " + name +
". No variable can be named 'true' or 'false'"
, node.get_SourcePositionStart());
}
else if (name.equals(AllPredefinedVariables.IF_NAME.toLowerCase())){
Log.error("0" + ErrorCodes.ILLEGAL_NAME_CODE + " Illegal name: " + name +
Log.error("0" + ErrorCodes.ILLEGAL_NAME + " Illegal name: " + name +
". No variable can be named 'if'"
, node.get_SourcePositionStart());
}
......@@ -80,7 +80,7 @@ public class CheckVariableName implements CNNArchASTVariableCoCo {
}
private void duplicationError(ASTVariable node){
Log.error("0" + ErrorCodes.DUPLICATED_NAME_CODE + " Duplicated variable name. " +
Log.error("0" + ErrorCodes.DUPLICATED_NAME + " Duplicated variable name. " +
"The name '" + node.getName() + "' is already used."
, node.get_SourcePositionStart());
}
......
......@@ -120,7 +120,7 @@ public class ArchitectureSymbol extends ArchitectureSymbolTOP {
public void checkParameters(){
for (VariableSymbol parameter : getParameters()){
if (!parameter.hasExpression()){
Log.error("0" + ErrorCodes.MISSING_VAR_VALUE_CODE + " Missing architecture argument. " +
Log.error("0" + ErrorCodes.MISSING_VAR_VALUE + " Missing architecture argument. " +
"The parameter '" + parameter.getName() + "' has no value.");
}
}
......
......@@ -198,21 +198,28 @@ public class CNNArchSymbolTableCreator extends de.monticore.symboltable.CommonSy
@Override
public void endVisit(ASTShape node) {
ShapeSymbol sym = (ShapeSymbol) node.getSymbol().get();
if (node.getDimensions().size() == 1){
ArchSimpleExpressionSymbol channels = (ArchSimpleExpressionSymbol) node.getDimensions().get(0).getSymbol().get();
sym.setChannels(channels);
}
else if (node.getDimensions().size() == 3){
ArchSimpleExpressionSymbol height = (ArchSimpleExpressionSymbol) node.getDimensions().get(ShapeSymbol.HEIGHT_INDEX - 1).getSymbol().get();
ArchSimpleExpressionSymbol width = (ArchSimpleExpressionSymbol) node.getDimensions().get(ShapeSymbol.WIDTH_INDEX - 1).getSymbol().get();
ArchSimpleExpressionSymbol channels = (ArchSimpleExpressionSymbol) node.getDimensions().get(ShapeSymbol.CHANNEL_INDEX - 1).getSymbol().get();
sym.setHeight(height);
sym.setWidth(width);
sym.setChannels(channels);
}
else {
//do nothing; will be checked in coco
List<ArchSimpleExpressionSymbol> dimensionList = new ArrayList<>(3);
for (int i = 0; i < node.getDimensions().size(); i++){
ASTDimensionArgument dimensionArg = node.getDimensions().get(i);
if (dimensionArg.getHeight().isPresent()){
sym.setHeightIndex(i);
ArchSimpleExpressionSymbol exp = (ArchSimpleExpressionSymbol) dimensionArg.getHeight().get().getSymbol().get();
dimensionList.add(exp);
}
else if (dimensionArg.getWidth().isPresent()){
sym.setWidthIndex(i);
ArchSimpleExpressionSymbol exp = (ArchSimpleExpressionSymbol) dimensionArg.getWidth().get().getSymbol().get();
dimensionList.add(exp);
}
else {
sym.setChannelIndex(i);
ArchSimpleExpressionSymbol exp = (ArchSimpleExpressionSymbol) dimensionArg.getChannels().get().getSymbol().get();
dimensionList.add(exp);
}
}
sym.setDimensionSymbols(dimensionList);
addToScopeAndLinkWithNode(sym, node);
}
......
......@@ -28,7 +28,7 @@ import de.se_rwth.commons.logging.Log;
import java.util.List;
import java.util.Optional;
import static de.monticore.lang.monticar.cnnarch.helper.ErrorCodes.ILLEGAL_ASSIGNMENT_CODE;
import static de.monticore.lang.monticar.cnnarch.helper.ErrorCodes.ILLEGAL_ASSIGNMENT;
public enum Constraints {
NUMBER {
......@@ -163,7 +163,28 @@ public enum Constraints {
@Override
protected String msgString() {
return AllPredefinedMethods.PADDING_VALID + " or " + AllPredefinedMethods.PADDING_SAME;
return AllPredefinedMethods.PADDING_VALID + ", "
+ AllPredefinedMethods.PADDING_SAME + " or "
+ AllPredefinedMethods.PADDING_NO_LOSS;
}
},
POOL_TYPE {
@Override
public boolean isValid(ArchSimpleExpressionSymbol exp) {
Optional<String> optString = exp.getStringValue();