Is it possible to use different activations in an MLP?

Hi,

I want to know whether it is possible to use different activation functions in different layers of an MLP, and if not, why not?

Also, is it possible to use different activation functions within the same layer of an MLP, and if not, why not?

Lastly, what does "gradient-based algorithm" mean, in simple terms?

Thanks
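The first two questions can be illustrated with a minimal sketch. This is not from the thread itself; it is a hypothetical NumPy forward pass (Python used here purely for illustration, since the thread contains no code) showing one activation per layer, and, within a single layer, different activations on different units. All layer sizes and weights below are made-up examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical layer sizes: 3 inputs -> 4 hidden -> 4 hidden -> 1 output.
W1 = rng.standard_normal((4, 3)); b1 = np.zeros(4)
W2 = rng.standard_normal((4, 4)); b2 = np.zeros(4)
W3 = rng.standard_normal((1, 4)); b3 = np.zeros(1)

def mixed(x):
    # Different activations within the SAME layer:
    # the first half of the units use tanh, the second half use ReLU.
    half = x.shape[0] // 2
    return np.concatenate([tanh(x[:half]), relu(x[half:])])

def forward(x):
    h1 = tanh(W1 @ x + b1)        # layer 1: tanh everywhere
    h2 = mixed(W2 @ h1 + b2)      # layer 2: mixed activations per unit
    return sigmoid(W3 @ h2 + b3)  # output layer: sigmoid

y = forward(np.array([0.5, -1.0, 2.0]))
print(y)  # a scalar in (0, 1), since the output activation is a sigmoid
```

Nothing in backpropagation requires the activations to match: each activation only needs to be (almost everywhere) differentiable so the chain rule can pass gradients through it. That is also the answer to the third question: a gradient-based algorithm is one that updates the weights by following the derivative of the loss with respect to those weights (as gradient descent does).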

Answers (0)

This question is closed.

Release: R2016a

Asked: 26 Jul 2019

Closed: 20 Aug 2021
