That is an interesting and somewhat philosophical question. As it is normally conceptualized, a neural network with a softmax output computes a deterministic mapping from inputs to a categorical distribution, and the predicted class is taken to be the most probable one (the argmax). All that is to say, it is trained to classify each input as one out of many possibilities. In the case of your number example, we would want it to output either 1 or 2, but not 1.5, since 1.5 doesn't make sense in the context of the task.
Now, I think what you are getting at is having the neural network learn a probability distribution, where the uncertainty of the input is taken into account. In that case it would still output either 1 or 2, but it might output 1 half the time and 2 the other half. This is related to a class of models called stochastic neural networks, or Bayesian neural networks, which maintain probability distributions over outputs rather than point estimates. If you are interested in this topic, I highly recommend the work of Yarin Gal: http://www.jmlr.org/proceedings/papers/v48/gal16.pdf
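To make the distinction concrete, here is a minimal NumPy sketch. The logits are made up for illustration: with equal scores for the two classes, the softmax assigns probability 0.5 to each, so an argmax readout is deterministic while sampling from the categorical distribution yields 1 about half the time and 2 the other half.

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

# Hypothetical logits for the classes "1" and "2": equal scores,
# so the softmax puts probability 0.5 on each class.
logits = np.array([2.0, 2.0])
probs = softmax(logits)  # array([0.5, 0.5])

# Deterministic readout: argmax always picks the same class for
# the same input (ties break toward the lower index).
deterministic_class = np.argmax(probs)

# Stochastic readout: sample a class from the categorical
# distribution instead of taking the argmax.
rng = np.random.default_rng(0)
samples = rng.choice([1, 2], size=10_000, p=probs)
print(probs)                  # [0.5 0.5]
print(np.mean(samples == 1))  # close to 0.5
```

Note that this only adds randomness at the output; the stochastic and Bayesian networks mentioned above go further by treating the network's weights (or activations) as random, so the predicted probabilities themselves carry uncertainty.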