Finally, these distinctions are symptomatic of deeper differences: tanh and ReLU/sigmoid RNNs discover distinct circuit solutions to a context-dependent decision-making task. These differences in circuitry become critical when RNNs are exposed to novel stimuli outside the training range.
24.10.2025 19:14
We further show that RNNs with different activation functions exhibit distinct dynamics, as characterized by the configuration of their fixed points and trajectory endpoints, with tanh RNNs consistently diverging from ReLU and sigmoid ones.
The choice of activation function in an RNN is often assumed to have minimal effect on its trajectories. We analyzed ReLU, sigmoid, and tanh RNNs on diverse tasks and found differences in their neural trajectories and individual neuron responses, challenging this assumption.
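The fixed-point analysis mentioned in the thread can be sketched numerically: for a vanilla RNN update h ← φ(Wh + b), repeatedly iterating the map converges to a fixed point when the map is contractive, and the resulting fixed points already differ by activation (a ReLU fixed point is entrywise nonnegative, while a tanh one need not be). A minimal sketch with small random placeholder weights, not the paper's trained networks:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
# Small weight scale keeps the update map contractive, so plain
# fixed-point iteration converges (placeholder weights, not trained ones).
W = rng.normal(scale=0.3 / np.sqrt(N), size=(N, N))
b = rng.normal(scale=0.1, size=N)

def relu(x):
    return np.maximum(x, 0.0)

def find_fixed_point(phi, h0, iters=2000):
    """Iterate h <- phi(W h + b); for a contractive map this converges
    to a fixed point h* with phi(W h* + b) = h*."""
    h = h0.copy()
    for _ in range(iters):
        h = phi(W @ h + b)
    return h

h0 = rng.normal(size=N)
fp_tanh = find_fixed_point(np.tanh, h0)
fp_relu = find_fixed_point(relu, h0)

# The residual "speed" ||phi(W h* + b) - h*|| should be ~0 at a fixed point.
speed_tanh = np.linalg.norm(np.tanh(W @ fp_tanh + b) - fp_tanh)
speed_relu = np.linalg.norm(relu(W @ fp_relu + b) - fp_relu)
print(speed_tanh, speed_relu)

# Activation-dependent geometry: the ReLU fixed point lies in the
# nonnegative orthant, the tanh fixed point can have negative entries.
print((fp_relu >= 0).all())
```

For trained networks with richer dynamics (multiple or unstable fixed points), the standard approach instead minimizes the speed q(h) = ½‖φ(Wh + b) − h‖² with an optimizer, but the contractive case above illustrates the idea.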