machine learning - Concerns related to MATLAB neural network toolbox


I have some concerns related to the use of nntool in the MATLAB toolbox. I followed some links on simple linear neural network weight training, but my training results do not match theirs and I can't understand why. I have found that nntool by default normalizes inputs to the range [-1, 1]. I am a bit concerned because I created a neural network with a tansig activation in the first layer and a logsig activation in the output layer. I manually normalized the outputs in my data to the range [0, 1] and fed them to nntool. My question is whether nntool further normalizes them to the range [-1, 1]. If it does, that seems incorrect, because the output of logsig cannot be in the range [-1, 1].
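For reference, here is a minimal command-line sketch of the setup described above, using feedforwardnet (the programmatic counterpart of the nntool GUI). The input matrix x and target vector t are hypothetical placeholders, and the processFcns defaults are shown as I understand them from the toolbox documentation, so treat this as an assumption rather than a definitive reproduction:

x = rand(3, 100);                          % hypothetical inputs: 3 features, 100 samples
t = rand(1, 100);                          % hypothetical targets, already scaled to [0, 1]
net = feedforwardnet(10);                  % one hidden layer with 10 neurons
net.layers{1}.transferFcn = 'tansig';      % hidden layer activation
net.layers{2}.transferFcn = 'logsig';      % output layer activation
net.inputs{1}.processFcns                  % typically {'removeconstantrows','mapminmax'} -> inputs squashed to [-1, 1]
net.outputs{2}.processFcns                 % the same default is applied to the targets
net.outputs{2}.processFcns = {'removeconstantrows'};  % keep targets exactly as supplied in [0, 1]
net = train(net, x, t);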

Any suggestions?

I am not a MATLAB user, but if you don't want to use normalization and it is forced on both input and output, then denormalize the output. I assume it is a simple linear normalization (squashing to the [-1, 1] interval), so if you want the output in the [0, 1] interval you can apply f(x) = (x+1)/2, which linearly maps [-1, 1] to [0, 1]. Neural networks are scale sensitive (as scale is correlated with non-tunable parameters such as the slopes of the activation functions), so internal normalization has its advantages. This should work, as long as the denormalization is applied after training.
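A tiny illustration of the suggested remapping, assuming the values you read out really do lie in [-1, 1]:

yRaw = [-1 -0.5 0 0.5 1];    % values assumed to lie in [-1, 1]
y01  = (yRaw + 1) / 2;       % f(x) = (x+1)/2, gives [0 0.25 0.5 0.75 1]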

If it only normalizes the input, you should not be concerned: it won't cause problems with the activation functions used (in fact, as stated before, it should actually help).
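If you want to see what that internal input scaling amounts to, mapminmax (the toolbox function that, to my knowledge, implements the default [-1, 1] normalization) can be applied and reversed explicitly; the data here are made up:

xIn      = [0 2 5 10; 1 3 6 9];           % two input features (rows), four samples
[xn, ps] = mapminmax(xIn);                % each row of xn now lies in [-1, 1]
xBack    = mapminmax('reverse', xn, ps);  % recovers the original xIn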

Update

As the question has been posted on Cross Validated with more details, I have answered there with a more precise solution.

