The output from the convolutional layer is usually passed through a ReLU activation function to introduce non-linearity into the model. ReLU takes the feature map and replaces each of its negative values with zero.

Knowing the complexity of the model

In order to assess the complexity of the model,
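To make the ReLU step described above concrete, here is a minimal Python sketch (assuming NumPy is available; the feature-map values are made up purely for illustration):

    import numpy as np

    # Hypothetical 3x3 feature map produced by a convolutional layer
    # (values chosen only to show the effect of ReLU).
    feature_map = np.array([[ 1.5, -0.3,  2.0],
                            [-1.2,  0.0,  0.7],
                            [ 0.4, -2.1,  3.3]])

    # ReLU: replace every negative value with zero, element-wise.
    activated = np.maximum(feature_map, 0)

    print(activated)
    # [[1.5 0.  2. ]
    #  [0.  0.  0.7]
    #  [0.4 0.  3.3]]

Positive values pass through unchanged, while every negative entry in the feature map is set to zero, which is exactly the non-linearity the text describes.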