R neuralNet: “non-conformable arguments”

Argh! I keep getting the following error when attempting to compute with my neural network, and I can’t figure out what the problem is. Below I’ll provide example data and the formatting of my matrices, and then I’ll show you the code I’m attempting to run. matrix.train1 is used for training the network: > matrix.train1 (Intercept) survived pclass … Read more
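In R, “non-conformable arguments” means the dimensions of two matrices don’t line up for the operation (typically the inner dimensions in a matrix product). As a rough sketch of the same failure mode outside R, here is a minimal Python matrix multiply that rejects mismatched shapes (the function and error message are illustrative, not from any library):

```python
def matmul(a, b):
    """Multiply matrices given as nested lists; raise if inner dimensions differ."""
    if len(a[0]) != len(b):
        # Same situation R reports as "non-conformable arguments"
        raise ValueError(
            f"non-conformable: {len(a)}x{len(a[0])} %*% {len(b)}x{len(b[0])}")
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

ok = matmul([[1, 2]], [[3], [4]])   # 1x2 times 2x1: conformable, gives [[11]]
try:
    matmul([[1, 2]], [[3, 4]])      # 1x2 times 1x2: inner dims 2 vs 1, fails
    msg = ""
except ValueError as e:
    msg = str(e)
```

With neuralnet specifically, this usually means the columns of the matrix fed to compute() don’t match the covariates the network was trained on, so checking dim() of both matrices is the first step.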

Backward function in PyTorch

Please read the documentation on backward() carefully to understand it better. By default, PyTorch expects backward() to be called on the last output of the network, the loss. The loss function always outputs a scalar, and therefore the gradients of the scalar loss w.r.t. all other variables/parameters are well defined (using the chain rule). Thus, by default, backward() is called on a scalar … Read more
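To see why a scalar loss makes the gradient well defined, here is a dependency-free sketch: the chain-rule gradient of a scalar loss L(w) = (w·x − y)², computed by hand and checked against a finite difference. This is what backward() automates for you; the toy loss and numbers are chosen for illustration only.

```python
# Fixed data point for the toy loss
X, Y = 2.0, 3.0

def loss(w):
    """Scalar loss L(w) = (w*X - Y)^2 for one data point."""
    return (w * X - Y) ** 2

def grad(w):
    """Chain rule by hand: dL/dw = 2 * (w*X - Y) * X."""
    return 2.0 * (w * X - Y) * X

w = 0.5
h = 1e-6
numeric = (loss(w + h) - loss(w - h)) / (2 * h)  # central finite difference
analytic = grad(w)                               # what backward() would store in w.grad
```

The two values agree to numerical precision, which is the scalar-output guarantee the answer refers to: with one scalar at the top, every parameter gets a single well-defined derivative.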

What is cross-entropy?

Cross-entropy is commonly used to quantify the difference between two probability distributions. In the context of machine learning, it is a measure of error for categorical multi-class classification problems. Usually the “true” distribution (the one that your machine learning algorithm is trying to match) is expressed in terms of a one-hot distribution. For example, suppose … Read more
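The definition above can be made concrete in a few lines: cross-entropy is H(p, q) = −Σᵢ pᵢ log qᵢ, and with a one-hot “true” distribution it collapses to the negative log-probability the model assigns to the correct class. A minimal sketch (the epsilon guard is a common convention, not part of the definition):

```python
import math

def cross_entropy(p_true, q_pred, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i); eps guards against log(0)."""
    return -sum(p * math.log(q + eps) for p, q in zip(p_true, q_pred))

# One-hot "true" distribution for a 3-class problem; class 1 is correct.
p = [0.0, 1.0, 0.0]
q = [0.1, 0.7, 0.2]          # model's predicted probabilities
loss = cross_entropy(p, q)   # reduces to -log(0.7), about 0.357
```

Because only the correct class has pᵢ = 1, a confident correct prediction (q near 1) drives the loss toward 0, while a confident wrong one blows it up.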

What is the use of train_on_batch() in keras?

For this question there’s a simple answer from Keras’s primary author: with fit_generator, you can use a generator for the validation data as well. In general, I would recommend using fit_generator, but using train_on_batch works fine too. These methods exist only for the sake of convenience in different use cases; there is no “correct” method. train_on_batch allows you to expressly update … Read more
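The point of train_on_batch is explicit control: you run exactly one gradient update per call, inside a loop you write yourself. Here is a schematic of that loop; Keras is not imported here, and ToyModel is a stand-in that merely mimics the method name and return convention, purely for illustration:

```python
class ToyModel:
    """Stand-in for a Keras model: counts updates instead of training."""
    def __init__(self):
        self.updates = 0
        self.last_loss = None

    def train_on_batch(self, x, y):
        # A real Keras model would run one gradient step here and return the loss.
        self.updates += 1
        self.last_loss = sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / len(x)
        return self.last_loss

def batches(seq, size):
    """Yield consecutive fixed-size slices of a sequence."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

model = ToyModel()
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 0.9, 2.1, 2.9]
for x_batch, y_batch in zip(batches(xs, 2), batches(ys, 2)):
    loss = model.train_on_batch(x_batch, y_batch)  # one explicit update per batch
```

With fit or fit_generator, Keras owns this loop; with train_on_batch, you do, which is useful when batch construction or the update schedule needs custom logic.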