Concrete examples of randomness in math vs. probability theory

  • #1
gentzen
A recent question about interpretations of probability nicely clarified the role of the Kolmogorov axioms:
Dale said:
As long as any given interpretation of probability is consistent with the Kolmogorov axioms then they will be equivalent in the sense that you can use the same proofs and theorems.
kered rettop said:
I should have known better than to ask on a maths forum. The question arose because a certain interpretation of quantum mechanics has been criticised for the way it uses probability. I haven't the faintest idea whether the model is consistent with the Kolmogorov axioms or even how to go about finding out. Thanks for trying though.
[... some excursions into QM, negative probabilities, and quasiprobability distributions ...]
kered rettop said:
So, would it be correct to say that axiomatic probability theory cannot be invoked to distinguish between different interpretations (except in the sense of questioning whether a particular interpretation actually does talk about probabilities at all)?
Conclusion: the Kolmogorov axioms formalize the concept of probability. They achieve this by completely separating it from concepts like randomness, uncertainty, propensity, subjective 'degree of belief', ...

But randomness remains an important concept, even in mathematics (and computer science) itself. Take for example the challenge to prove or disprove whether randomized algorithms are more powerful than deterministic algorithms. It is a typical example of the challenges randomness presents for current proof techniques. And because randomized algorithms are sometimes also (wrongly?) called probabilistic algorithms, it is also a typical example of the "relevance and challenge" of distinguishing between randomness and probability.
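A standard toy illustration of that contrast (my own sketch, not taken from any reference): in an array of which half the entries are 1, any deterministic probing order can be forced to inspect more than half the cells, while uniformly random probing finds a 1 after two probes in expectation, and never returns a wrong answer.

Python:
import random

def find_one_randomized(a):
    # Las Vegas style: the answer is always correct; only the number
    # of probes is random (expected 2 when half the entries are 1).
    n = len(a)
    while True:
        i = random.randrange(n)
        if a[i] == 1:
            return i

def find_one_deterministic(a):
    # Any fixed probing order can be defeated by an adversary who
    # places the 1s exactly in the cells that are probed last.
    for i, x in enumerate(a):
        if x == 1:
            return i

a = [0] * 500 + [1] * 500         # adversarial input for a left-to-right scan
print(find_one_deterministic(a))  # index 500, found only on the 501st probe
print(find_one_randomized(a))     # some index holding a 1, ~2 probes in expectation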

There are also examples where randomness was successfully conquered by current proof techniques:
Wikipedia said:
the Rado graph, Erdős–Rényi graph, or random graph is a countably infinite graph that can be constructed (with probability one) by choosing independently at random for each pair of its vertices whether to connect the vertices by an edge.
Wikipedia said:
a Chaitin constant (Chaitin omega number) or halting probability is a real number that, informally speaking, represents the probability that a randomly constructed program will halt.
Both examples clearly manage to go beyond probability into actual randomness. Are they of the same type, or do they use different concepts of randomness? What are other examples where randomness was successfully conquered by current proof techniques? Are they of the same type(s) as my two examples?
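For the first example, here is a minimal sketch (my own, using the standard BIT-predicate construction) of what "conquered" can mean: the graph that the random construction yields with probability one can also be built deterministically, by connecting i < j exactly when the i-th binary digit of j is 1, and its defining extension property can then be checked without any randomness.

Python:
def adjacent(i, j):
    # Deterministic BIT-predicate construction of the Rado graph on the
    # natural numbers: i < j are connected iff bit i of j is 1.
    i, j = min(i, j), max(i, j)
    return ((j >> i) & 1) == 1

def extension_witness(U, V):
    # Extension property: for disjoint finite sets U, V of vertices,
    # produce a vertex adjacent to everything in U and nothing in V.
    # The random construction has this property with probability one;
    # here a witness is exhibited deterministically.
    m = max(U | V) + 1                     # a bit position above U and V
    z = (1 << m) + sum(1 << u for u in U)  # set the bits of U, leave those of V clear
    assert all(adjacent(z, u) for u in U)
    assert not any(adjacent(z, v) for v in V)
    return z

print(extension_witness({0, 2, 5}, {1, 3}))  # 101 = 0b1100101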
 
  • #2
gentzen said:
A recent question about interpretations of probability nicely clarified the role of the Kolmogorov axioms:
My interpretation of that thread was that the Kolmogorov axioms perfectly define the term "probability" that is used in the vast majority of problems. Using the same word for QM applications where the "probability" can be negative or even complex is unfortunate.
gentzen said:
Conclusion: the Kolmogorov axioms formalize the concept of probability. They achieve this by completely separating it from concepts like randomness, uncertainty, propensity, subjective 'degree of belief', ...
IMO, the Kolmogorov axioms work perfectly for the vast majority of problems regarding "randomness, uncertainty, propensity, subjective 'degree of belief', ...". If QM uses the same word for other uses, that is unfortunate.
gentzen said:
But randomness remains an important concept, even in mathematics (and computer science) itself. Take for example the challenge to prove or disprove whether randomized algorithms are more powerful than deterministic algorithms.
That is a very vague question. Do you have a reference that is more concrete?
gentzen said:
It is a typical example of the challenges randomness presents for current proof techniques. And because randomized algorithms are sometimes also (wrongly?) called probabilistic algorithms, it is also a typical example of the "relevance and challenge" of distinguishing between randomness and probability.
This is all very vague to me. I am not sure that there is any rigorous interpretation of this statement. Do you have a reference?
gentzen said:
There are also examples where randomness was successfully conquered by current proof techniques:
More accurately, there are billions of examples. Real world problems of "probability" as defined by the Kolmogorov axioms occur and are solved every day.
gentzen said:
Both examples clearly manage to go beyond probability into actual randomness.
? I don't understand this. Can you clarify?
gentzen said:
Are they of the same type, or do they use different concepts of randomness? What are other examples where randomness was successfully conquered by current proof techniques? Are they of the same type(s) as my two examples?
After reading the prior thread and this, I still have no clue why you are saying that the concepts of randomness and probability are so different. Is this the age-old problem of a real number being randomly selected even though that exact number has 0 probability?
 
  • #3
FactChecker said:
My interpretation of that thread was that the Kolmogorov axioms perfectly define the term "probability" that is used in the vast majority of problems.
FactChecker said:
IMO, the Kolmogorov axioms work perfectly for the vast majority of problems regarding "randomness, uncertainty, propensity, subjective 'degree of belief', ...".
Thanks for being upfront about this. So you already oppose the most basic premise of my question: that probability and randomness are not equivalent concepts.
FactChecker said:
Using the same word for QM applications where the "probability" can be negative or even complex is unfortunate.
FactChecker said:
If QM uses the same word for other uses, that is unfortunate.
No, that is not the case, even though the initial exchange between Dale and kered rettop could give that impression. This is why I posted in that thread to clarify where the claim that QM requires negative probabilities comes from, and what it means. But I guess you are simply not much into QM, which is fine.
To make this perfectly clear: The probabilities computed by the Born rule in QM are normal probabilities and satisfy the Kolmogorov axioms. You could take the spectrum of the self-adjoint operator representing the measurement as ##\Omega##, and the Borel σ-algebra as the σ-algebra on ##\Omega##.
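In the simplest finite-dimensional case this is immediate (a sketch, with ##A = \sum_i a_i P_i## the spectral decomposition of the measured observable and ##|\psi\rangle## a normalized state):
$$\Pr(a_i) = \langle \psi | P_i | \psi \rangle \;\ge\; 0, \qquad \sum_i \Pr(a_i) = \Big\langle \psi \Big|\, \sum_i P_i \,\Big| \psi \Big\rangle = \langle \psi | \psi \rangle = 1,$$
so the Born probabilities are nonnegative and sum to 1, exactly as the Kolmogorov axioms demand.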

gentzen said:
But randomness remains an important concept, even in mathematics (and computer science) itself. Take for example the challenge to prove or disprove whether randomized algorithms are more powerful than deterministic algorithms.
FactChecker said:
That is a very vague question. Do you have a reference that is more concrete?
For example, the Wikipedia article on P vs ZPP (zero-error probabilistic polynomial time) contains the following statement:
Wikipedia said:
The class P is contained in ZPP, and some computer scientists have conjectured that P = ZPP, i.e., every Las Vegas algorithm has a deterministic polynomial-time equivalent.
The general theme under which such questions are analyzed and attacked is called derandomization.
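To make the objects concrete, here is a sketch of the textbook Las Vegas algorithm, randomized quicksort: the output is always correct, and only the running time is a random variable.

Python:
import random

def quicksort(a):
    # Las Vegas: always returns the correctly sorted list. The random
    # pivot choice only affects the running time, which is O(n log n)
    # in expectation for *every* input. P = ZPP would mean that every
    # such algorithm has a deterministic polynomial-time counterpart.
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    return (quicksort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quicksort([x for x in a if x > pivot]))

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]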

gentzen said:
It is a typical example of the challenges randomness presents for current proof techniques. And because randomized algorithms are sometimes also (wrongly?) called probabilistic algorithms, it is also a typical example of the "relevance and challenge" of distinguishing between randomness and probability.
FactChecker said:
This is all very vague to me. I am not sure that there is any rigorous interpretation of this statement. Do you have a reference?
You sound very antagonistic to me. Yes, I could find references where randomized algorithms are (wrongly?) called probabilistic algorithms. I won't find a reference talking about the "relevance and challenge" of distinguishing between randomness and probability. I tried to make my point here that probability and randomness are not equivalent concepts, but that people still mix them up. (And I got the impression that you believe they are even justified in mixing them. What should I say?)


FactChecker said:
More accurately, there are billions of examples. Real world problems of "probability" as defined by the Kolmogorov axioms occur and are solved every day.
So you are saying again that you don't see any difference between randomness and probability.

FactChecker said:
? I don't understand this. Can you clarify?
Indeed, you don't understand. Let me ask you: is this the first time you have seen those two examples? For example, the linked article on Chaitin's Ω number contains a section "Algorithmic randomness":
Wikipedia said:
A real number is random if the binary sequence representing the real number is an algorithmically random sequence. Calude, Hertling, Khoussainov, and Wang showed[7] that a recursively enumerable real number is an algorithmically random sequence if and only if it is a Chaitin's Ω number.
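For concreteness, the definition behind that quote: for a prefix-free universal machine ##F##, the halting probability is
$$\Omega_F \;=\; \sum_{p \,\in\, \mathrm{dom}(F)} 2^{-|p|},$$
the sum over all halting programs ##p## of length ##|p|##. The Kraft inequality guarantees ##0 < \Omega_F < 1##, so it really is a probability.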

FactChecker said:
After reading the prior thread and this, I still have no clue why you are saying that the concepts of randomness and probability are so different.
Because they are different concepts. Maybe it is easier for you to see that randomness and subjective 'degree of belief' are different concepts. And then notice that probabilities can be used to quantify subjective 'degree of belief'.
FactChecker said:
Is this the age-old problem of a real number being randomly selected even though that exact number has 0 probability?
No, it is not. For this problem, I would actually side with you that probability theory (based on the Kolmogorov axioms) gets you as close to a good answer as you can reasonably expect. The entropy is infinite in this case. But you can approximate with event subalgebras of finite entropy, and study the limit process.
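A minimal illustration of that limit process, for a uniformly random point of ##[0,1)##: the dyadic partition into ##2^n## intervals generates a finite subalgebra whose entropy is
$$H_n = -\sum_{k=0}^{2^n-1} 2^{-n} \log_2 2^{-n} = n \;\to\; \infty \quad (n \to \infty),$$
finite at every stage but unbounded in the limit.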
 
  • #4
I'm not an expert on this, but here are some thoughts.

The Kolmogorov axioms define probability in the way that Euclid's axioms define plane geometry: they represent a mathematical idealisation that may or may not exist or be approximated in reality. We use plane geometry all the time in physics, but an ideal line or circle does not exist. And, spacetime is ultimately not Euclidean in any case.

Historically, probability suffered from the misconception that it should somehow be definable as a real concept, rather than as a mathematical idealisation. This is somewhat analogous to the misconception that Euclid's fifth postulate must follow from the others. In any case, it is my understanding that Kolmogorov ended the debate by axiomatising probability as a purely mathematical concept. And this was generally accepted as the only way forward.

Now, of course, we apply probability theory to the real world in the same way that we apply geometry. An idealised coin lands heads or tails with two possible outcomes, each with measure one half. But one can argue that no coin is totally fair, and that each coin (and/or the tossing process) must have some bias, or deviate in some other way from the idealised measure space.

There was a good example recently in the homework forum, where the student was expected to apply some naive probability theory, whereas there was much scope for questioning the assumptions involved. This, IMO, is at the heart of applying probability theory to a real-world situation: to what extent can you trust the assumptions that model your scenario as a Kolmogorov measure space?

Randomness is more of a statistical concept. True randomness is equivalent to a probability space with a finite number of outcomes of equal measure. Tests for randomness, however, go beyond probability theory, and open up a whole new framework.
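As a sketch of the simplest such test, the frequency ("monobit") test, which also shows why these tests open up a new framework: a sequence can pass while being completely deterministic.

Python:
from collections import Counter

def chi_squared_uniform(bits):
    # Compare the observed 0/1 counts against the idealised fair-coin
    # prediction of n/2 each. A large statistic is evidence of bias;
    # a small one is NOT a certificate of randomness.
    counts = Counter(bits)
    expected = len(bits) / 2
    return sum((counts[b] - expected) ** 2 / expected for b in (0, 1))

print(chi_squared_uniform([0, 1] * 500))           # 0.0: passes, yet fully deterministic
print(chi_squared_uniform([1] * 900 + [0] * 100))  # 640.0: clearly biased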

Moreover, statistics has two mainstream approaches, called frequentist and Bayesian. There is unfortunately often confusion that this split applies to basic probability theory (it does not). Bayesian statistics generally has an extra layer of flexibility, although at the cost of conjuring probabilities that are arguably meaningless. For example, researchers in cosmology use Bayesian techniques to test various theories. The frequentist approach would generally be simply unable to produce an answer to these questions.

In terms of QM, the assumption is that nature is inherently probabilistic. But you still have experimental uncertainty, which means you are never dealing with a perfect scenario. You can model the outcome of experiments using the Kolmogorov axioms (the intermediate complex probability amplitudes are part of the QM theory, but ultimately, once you have the final amplitudes, you can convert these to probabilities).

PS: IMO, you can model experimental outcomes in MWI as a Kolmogorov measure space. I can't see anything to prevent it. The measured outcomes are represented by the branches of the wavefunction, rather than by the experimental outcomes directly, as would be the case in a collapse interpretation.
 
  • #5
gentzen said:
But I guess you are simply not much into QM, which is fine.
Yes. My knowledge of QM is close to zero. I am comfortable with the Kolmogorov axioms and the use of that type of probability/randomness, but know nothing about a QM interpretation of those concepts. I will leave this discussion to others.
 
  • #6
FactChecker said:
Yes. My knowledge of QM is close to zero. I am comfortable with the Kolmogorov axioms and the use of that type of probability/randomness, but know nothing about a QM interpretation of those concepts. I will leave this discussion to others.
You could try this, as an excellent introduction to QM and complex probability amplitudes.

https://www.scottaaronson.com/democritus/lec9.html
 
  • #7
PeroK said:
You could try this, as an excellent introduction to QM and complex probability amplitudes.

https://www.scottaaronson.com/democritus/lec9.html
Very good! From there, I get the following quote: "that nature is described not by probabilities (which are always nonnegative), but by numbers called amplitudes that can be positive, negative, or even complex."
The lecture has a lot that I do not understand, but it seems to consistently use the terms "amplitude" and "probability" as that quote implies.

I feel much better about that.
 
  • #8
FactChecker said:
but it seems to consistently use the terms "amplitude" and "probability" as that quote implies.

I feel much better about that.
So you are happy that physicists were decent enough to use the term "probability" only with its correct meaning (especially "Well, the probabilities had better be nonnegative, and they'd better sum to 1."), and introduced new terms for their "certain generalization of probability theory to allow minus signs". I agree, it would have been awful if they had mixed up everything.
 
  • #9
This thread, IMO, clarifies the position re the MWI. All you need to do to claim that the MWI can simulate probabilities is find a mapping from measurement events to a Kolmogorov measure space. Different measurement events in different branches of the wavefunction satisfy the required conditions. There is nothing in the Kolmogorov axioms to distinguish between "true" probabilities and something that only looks like probabilities. The fact that a certain event has a probability of one with a different mapping is irrelevant. That does not prevent a different mapping of events to a different measure space.
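A sketch of one such mapping (my reading of the above, not a standard construction): if the post-measurement state is ##|\psi\rangle = \sum_i c_i\, |\mathrm{branch}_i\rangle##, take the branches as the sample space and the Born weights as the measure,
$$\Omega = \{\mathrm{branch}_i\}, \qquad P(\{\mathrm{branch}_i\}) = |c_i|^2, \qquad \sum_i |c_i|^2 = 1,$$
which satisfies the Kolmogorov axioms whether or not one regards the branch weights as "true" probabilities.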
 
  • #10
PeroK said:
This thread, IMO, clarifies the position re the MWI. All you need to do to claim that the MWI can simulate probabilities is find a mapping from measurement events to a Kolmogorov measure space. Different measurement events in different branches of the wavefunction satisfy the required conditions. There is nothing in the Kolmogorov axioms to distinguish between "true" probabilities and something that only looks like probabilities. The fact that a certain event has a probability of one with a different mapping is irrelevant. That does not prevent a different mapping of events to a different measure space.
In fact, my interpretation of "probability" is dependent on the amount of knowledge a person has when making a guess. Bayes' theorem allows one to adjust probabilities when more information is received after the fact. A coin toss might have 50/50 odds to one person but different odds to a person who knows the exact forces on the coin and the air currents.
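A minimal sketch of that updating, using a Beta prior for the coin's unknown bias (the specific prior and numbers are only an illustration):

Python:
def beta_update(alpha, beta, heads, tails):
    # Conjugate Bayesian update: a Beta(alpha, beta) prior on the coin's
    # bias becomes Beta(alpha + heads, beta + tails) after the data.
    # Observers with different information hold different, but equally
    # Kolmogorov-consistent, probability assignments.
    return alpha + heads, beta + tails

a, b = beta_update(1, 1, 7, 3)              # uniform prior, then 7 heads / 3 tails
print("posterior mean bias:", a / (a + b))  # 0.666..., no longer 50/50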
That being said, there is a good question in the OP about whether true randomness is the same concept. I think that QM makes that an important distinction. The link in @PeroK's post #6 is very clear about the terminology to use, and I feel much better about that. I think that it also addresses the deeper issue that the OP is concerned about. I will need to give it some more serious reading.
 
  • #11
FactChecker said:
In fact, my interpretation of "probability" is dependent on the amount of knowledge a person has when making a guess. ...
That being said, there is a good question in the OP about whether true randomness is the same concept. I think that QM makes that an important distinction.
  • For the case of "Chaitin's Ω number," which I understand best among my examples, randomness is indeed dependent on the "knowledge of certain agents," namely "algorithms implementing halting computations".
  • For the example of P vs ZPP (i.e. about derandomization of randomized algorithms), "knowledge of certain agents" is also important, this time of "polynomial time algorithms". But this time the capability of "polynomial time algorithms" to generate randomness is also important. They must at least be able to confuse other "polynomial time algorithms," such that those cannot distinguish the output from "true" randomness. But there might be more going on, and I don't know which part is still too challenging for current proof techniques.
  • For the Rado graph, it looks to me like somehow a specific instance of the zero-one laws (from probability theory) got derandomized. But I could be wrong.

I don't think that "true" randomness is relevant to my question; "relative" randomness should be enough. I asked this question because the other question had triggered me to google the Rado graph, whose relation to probability theory (based on the Kolmogorov axioms) is unclear to me. And the question of the relation between current proof techniques and "very slow progress when randomness gets involved" also interests me, even though I don't expect to get answers to that anytime soon, neither here nor elsewhere.
 
  • #12
gentzen said:
[...] Both examples clearly manage to go beyond probability into actual randomness. Are they of the same type, or do they use different concepts of randomness? What are other examples where randomness was successfully conquered by current proof techniques? Are they of the same type(s) as my two examples?
A probability space is nothing more than a measure space with total measure 1. So in some sense, probability is nothing more than the study of measurable spaces, measures, and measurable functions.
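For reference, the axioms on a measurable space ##(\Omega, \mathcal{F})## are just
$$P(A) \ge 0, \qquad P(\Omega) = 1, \qquad P\Big(\bigcup_{n=1}^{\infty} A_n\Big) = \sum_{n=1}^{\infty} P(A_n) \quad \text{for pairwise disjoint } A_n .$$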

When you start talking about justifications for assigning probabilities to things you are entering the realm of philosophy. Many different reasons could motivate such a decision but they could all be reasonably called probabilities.

For instance, in QM, one can know the entire state of a system but not know the state of a subsystem exactly. Zurek argues this is a different kind of probability that is unique to QM.

https://terrytao.wordpress.com/2015/09/29/275a-notes-0-foundations-of-probability-theory/
 
