scepticus Posted February 24, 2013 (Author)

Thanks, but I meant this one (plus all subsequent ones...) Try this: This principle states that, when choosing probabilities on a discrete hypothesis space, subject to constraints on the probabilities (e.g. a certain expectation value is specified), you should distribute the probability as uniformly as possible by the criterion of Shannon entropy. I presume you know what Shannon entropy, priors and posteriors are, given your educational background?
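[To make "constraints on the probabilities (e.g. a certain expectation value is specified)" concrete, here is a minimal numeric sketch of Jaynes' classic die example; it is illustrative code, not from the thread, and all the names in it are my own. Given only that a six-sided die's long-run mean is 4.5, the maximum entropy distribution has the exponential form p_i ∝ exp(-λi), with λ chosen so the mean constraint is met.]

```python
import math

# Maximum entropy over outcomes 1..6, subject to a specified mean of 4.5.
# The MaxEnt solution is p_i proportional to exp(-lam * i); the mean is a
# strictly decreasing function of lam, so we can find lam by bisection.

outcomes = [1, 2, 3, 4, 5, 6]
target_mean = 4.5

def mean_for(lam):
    """Mean of the distribution p_i ∝ exp(-lam * i)."""
    weights = [math.exp(-lam * i) for i in outcomes]
    z = sum(weights)
    return sum(i * w for i, w in zip(outcomes, weights)) / z

# Bisect: at lam = -5 the mean is near 6, at lam = +5 it is near 1.
lo, hi = -5.0, 5.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) > target_mean:
        lo = mid   # mean too high: need a larger lam
    else:
        hi = mid
lam = (lo + hi) / 2

weights = [math.exp(-lam * i) for i in outcomes]
z = sum(weights)
p = [w / z for w in weights]  # the maximum entropy distribution
```

Because the specified mean (4.5) exceeds the uniform mean (3.5), λ comes out negative and the probabilities tilt toward the high faces; with no constraint at all, the same machinery would return the uniform distribution.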
erat_forte Posted February 24, 2013

Sorry, no! It was a long time ago and I wasn't paying attention! Anyway, we were talking a minute ago about how to address people with other backgrounds!
scepticus Posted February 24, 2013 (Author)

OK, fair dos. Must think about how to set this out plainly. That said, did my explanation regarding finance and hedging help at all?
erat_forte Posted February 26, 2013

I think so, a little!
scepticus Posted February 28, 2013 (Author)

Actually I think this bit from Wikipedia says it nicely. See if you agree:

Consider a discrete probability distribution among m mutually exclusive propositions. The most informative distribution would occur when one of the propositions was known to be true. In that case, the information entropy would be equal to zero. The least informative distribution would occur when there is no reason to favor any one of the propositions over the others. In that case, the only reasonable probability distribution would be uniform, and then the information entropy would be equal to its maximum possible value, log m.

The information entropy can therefore be seen as a numerical measure which describes how uninformative a particular probability distribution is, ranging from zero (completely informative) to log m (completely uninformative). By choosing to use the distribution with the maximum entropy allowed by our information, the argument goes, we are choosing the most uninformative distribution possible. To choose a distribution with lower entropy would be to assume information we do not possess; to choose one with a higher entropy would violate the constraints of the information we do possess. Thus the maximum entropy distribution is the only reasonable distribution.
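[The zero-to-log-m range described in that passage can be checked numerically. The sketch below is illustrative only (the function and variable names are my own): it computes the Shannon entropy of a "one proposition known to be true" distribution and of the uniform distribution over m = 6 propositions.]

```python
import math

def shannon_entropy(ps):
    """Shannon entropy in nats; terms with p = 0 contribute nothing."""
    return -sum(p * math.log(p) for p in ps if p > 0)

m = 6
certain = [1.0] + [0.0] * (m - 1)  # one proposition known to be true
uniform = [1.0 / m] * m            # no reason to favor any proposition

h_certain = shannon_entropy(certain)  # completely informative: entropy 0
h_uniform = shannon_entropy(uniform)  # completely uninformative: entropy log m
```

Any other distribution over the six propositions lands strictly between these two extremes, which is exactly the "numerical measure of uninformativeness" the quoted passage describes.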
erat_forte Posted March 2, 2013

Thanks, yes, this is hard going but we are getting there!
scepticus Posted March 2, 2013 (Author)

Good, so that's MaxEnt done. My original email that set off this recent digression also mentioned the maximum entropy production principle. Did that bit make sense? Go back and check...
erat_forte Posted March 2, 2013

OK!
Guest_FaFa!_* Posted April 24, 2013

Thread resurrection! I was interested in following this thread, but you lost me about 3 pages ago. You often talk about religion, or the need for a new religion, on the forum in somewhat disparaging terms. It is a shame most people cannot keep up with you, but people have different ways of processing information, and I think you need a few timely analogies and images (like your description of ZIRP being like a rainforest the other day) to flesh out what you are saying. You may baulk at simplifying things that much, but I think it is necessary.
erat_forte Posted April 24, 2013

I agree Fafa, I loved this thread but it made my brain clog up!