Neural Network Heatmap
Network Settings
Randomize
# hidden layers
Input domain: [-6,6] × [-6,6] / [-10,10] × [-10,10] / [-20,20] × [-20,20]
Hidden activation: tanh / ReLU / Leaky ReLU / ELU / Linear
Output activation is always sigmoid, so palette bins map to (0,1).
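The setup above can be sketched as a small, randomly initialized MLP: two inputs (x, y), a configurable number of hidden layers with the chosen activation, and a sigmoid output so every value lands in (0,1). This is a minimal illustration under assumed layer sizes and helper names (`make_mlp`, `forward`, width 8), not the page's actual implementation.

```python
import math
import random

def make_mlp(n_hidden_layers, width=8, seed=0):
    """Random weights for an MLP mapping (x, y) -> scalar. Sizes are assumptions."""
    rng = random.Random(seed)
    sizes = [2] + [width] * n_hidden_layers + [1]
    layers = []
    for n_in, n_out in zip(sizes, sizes[1:]):
        W = [[rng.gauss(0, 1) for _ in range(n_in)] for _ in range(n_out)]
        b = [rng.gauss(0, 1) for _ in range(n_out)]
        layers.append((W, b))
    return layers

def forward(layers, x, y, hidden=math.tanh):
    """Evaluate the network at one input point of the domain."""
    v = [x, y]
    for i, (W, b) in enumerate(layers):
        v = [sum(w * a for w, a in zip(row, v)) + bi for row, bi in zip(W, b)]
        if i < len(layers) - 1:          # hidden activation on all but the last layer
            v = [hidden(a) for a in v]
    # Sigmoid output: every value is squashed into (0, 1).
    return 1.0 / (1.0 + math.exp(-v[0]))
```

"Randomize" would then correspond to re-drawing the weights with a fresh seed, and the heatmap to evaluating `forward` over a grid of points in the selected input domain.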
Heatmap Settings
Sampling resolution
Palette (exact, discrete)
Paste JSON array of hex colors
Apply palette
Reset to default
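Because the palette is exact and discrete, mapping a sigmoid output in (0,1) to a color is a simple binning step: split (0,1) into as many equal bins as there are palette entries and index into the array. A minimal sketch, with a stand-in palette rather than the page's default:

```python
def color_for(value, palette):
    """Pick one discrete palette entry for a value in (0, 1)."""
    # Bin index: floor(value * len(palette)), clamped to the last bin
    # so a value of exactly 1.0 cannot index out of range.
    idx = min(int(value * len(palette)), len(palette) - 1)
    return palette[idx]

# Example hex colors, as would be pasted via the JSON-array field.
palette = ["#440154", "#31688e", "#35b779", "#fde725"]
color_for(0.10, palette)  # first bin
color_for(0.99, palette)  # last bin
```

With N palette entries, each bin covers an interval of width 1/N, which is why the discrete bands in the heatmap are level sets of the network output.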