Having laid the groundwork in graph theory, we now transition to the realm of probability theory, a mathematical framework for dealing with uncertainty and randomness. Probability theory provides the language and tools to analyze random phenomena, quantify likelihoods, and make informed decisions in the face of incomplete information. This chapter introduces the fundamental concepts and notations of probability theory, setting the stage for its application in the design and analysis of randomized algorithms in later chapters.

Fundamental Definitions and Notations (2.1)

At the heart of probability theory lies the concept of a probability space, the mathematical construct that formalizes the notion of a random experiment. A probability space consists of three components:

  • Sample Space ($\Omega$): The sample space $\Omega$ is the set of all possible outcomes of a random experiment. It is the universe of all possible results. For example, if we toss a coin, the sample space is $\Omega = \{\text{heads}, \text{tails}\}$. If we roll a six-sided die, $\Omega = \{1, 2, 3, 4, 5, 6\}$.

  • Elementary Events: Each element $\omega \in \Omega$ is called an elementary event or a sample point. It represents a single, indivisible outcome of the experiment.

  • Probability Function ($\Pr$): A probability function $\Pr$ assigns a probability $\Pr[\omega]$ to each elementary event $\omega \in \Omega$. These probabilities must satisfy two fundamental axioms:

    1. Non-negativity and Boundedness: For every elementary event $\omega \in \Omega$, the probability must be non-negative and at most 1: $0 \le \Pr[\omega] \le 1$.
    2. Normalization: The sum of the probabilities of all elementary events in the sample space must equal 1: $\sum_{\omega \in \Omega} \Pr[\omega] = 1$.

A probability space is thus a triple $(\Omega, \mathcal{F}, \Pr)$, where $\Omega$ is the sample space, $\mathcal{F}$ is a sigma-algebra of subsets of $\Omega$ (the events), and $\Pr$ is a probability measure defined on $\mathcal{F}$. For discrete probability spaces, $\mathcal{F}$ is typically taken to be the power set of $\Omega$, i.e., $\mathcal{F} = 2^{\Omega}$.
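As a concrete illustration, here is a minimal Python sketch of such a discrete probability space, assuming a fair six-sided die; the names `Omega` and `Pr` and the use of exact fractions are illustrative choices, not notation prescribed by the text.

```python
from fractions import Fraction

# Discrete probability space for a fair six-sided die (illustrative sketch).
Omega = {1, 2, 3, 4, 5, 6}                       # sample space: all outcomes
Pr = {omega: Fraction(1, 6) for omega in Omega}  # uniform probability function

# Axiom 1 (non-negativity and boundedness): 0 <= Pr[omega] <= 1 for every omega.
assert all(0 <= p <= 1 for p in Pr.values())

# Axiom 2 (normalization): the elementary probabilities sum to exactly 1.
assert sum(Pr.values()) == 1
```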

Events and Probability of Events

An event $A \subseteq \Omega$ is a subset of the sample space. It represents a collection of possible outcomes. For example, if we roll a die, the event “getting an even number” is $A = \{2, 4, 6\}$.

The probability of an event $A$, denoted $\Pr[A]$, is the sum of the probabilities of all elementary events in $A$:

$$\Pr[A] = \sum_{\omega \in A} \Pr[\omega]$$

Complementary Events

For any event $A$, the complementary event $\bar{A}$ (or $A^c$) is the set of all outcomes in $\Omega$ that are not in $A$. It is defined as $\bar{A} = \Omega \setminus A$. Complementary events are essential for relating the probability of an event to the probability of its non-occurrence.
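Continuing the die example, the following sketch (with a hypothetical helper `prob`) computes the probability of an event by summing the probabilities of its outcomes and checks the complement.

```python
from fractions import Fraction

# Fair six-sided die: uniform discrete probability space (illustrative sketch).
Omega = {1, 2, 3, 4, 5, 6}
Pr = {omega: Fraction(1, 6) for omega in Omega}

def prob(event):
    """Probability of an event: sum of the probabilities of its outcomes."""
    return sum(Pr[omega] for omega in event)

A = {2, 4, 6}         # event "the die shows an even number"
A_bar = Omega - A     # complementary event: Omega \ A = {1, 3, 5}

print(prob(A))        # 1/2
print(prob(A_bar))    # 1/2
assert prob(A) + prob(A_bar) == 1
```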

Basic Properties of Probability (Lemma 2.2)

From the fundamental definition of a probability space, several elementary but crucial properties of probability follow directly:

Lemma 2.2: For events $A, B \subseteq \Omega$ in a probability space $(\Omega, \mathcal{F}, \Pr)$:

  1. Probability of Impossible and Certain Events: The probability of the empty set (impossible event) is 0, and the probability of the entire sample space (certain event) is 1: $\Pr[\emptyset] = 0$ and $\Pr[\Omega] = 1$.

  2. Probability Range: The probability of any event $A$ is between 0 and 1, inclusive: $0 \le \Pr[A] \le 1$.

  3. Probability of Complement: The probability of the complement of an event $A$ is 1 minus the probability of $A$: $\Pr[\bar{A}] = 1 - \Pr[A]$.

  4. Monotonicity: If event $A$ is a subset of event $B$ ($A \subseteq B$), then the probability of $A$ is less than or equal to the probability of $B$: $\Pr[A] \le \Pr[B]$.

These properties are fundamental for manipulating and reasoning about probabilities. They provide basic tools for calculating probabilities and establishing relationships between different events.
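For example, the complement rule (Property 3) follows in one line from the normalization axiom, because $A$ and $\bar{A}$ partition $\Omega$:

$$1 = \sum_{\omega \in \Omega} \Pr[\omega] = \sum_{\omega \in A} \Pr[\omega] + \sum_{\omega \in \bar{A}} \Pr[\omega] = \Pr[A] + \Pr[\bar{A}], \qquad \text{hence } \Pr[\bar{A}] = 1 - \Pr[A].$$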

Addition Rules for Probabilities (Theorem 2.3)

Addition Rule for Disjoint Events (Theorem 2.3)

A key rule for calculating probabilities is the Addition Rule, which applies to disjoint (or mutually exclusive) events. Two events $A$ and $B$ are disjoint if they have no outcomes in common, i.e., $A \cap B = \emptyset$.

Theorem 2.3 (Addition Rule): If events $A_1, A_2, \ldots, A_n$ are pairwise disjoint (i.e., $A_i \cap A_j = \emptyset$ for all $i \neq j$), then the probability of their union is the sum of their individual probabilities:

$$\Pr\left[\bigcup_{i=1}^{n} A_i\right] = \sum_{i=1}^{n} \Pr[A_i]$$

This rule extends to countably infinite sequences of disjoint events as well.

The Addition Rule is essential for calculating the probability of complex events by decomposing them into unions of simpler, disjoint events.
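For instance, when rolling a fair die, the events $A = \{1, 2\}$ and $B = \{5, 6\}$ are disjoint, so the Addition Rule gives

$$\Pr[A \cup B] = \Pr[A] + \Pr[B] = \tfrac{1}{3} + \tfrac{1}{3} = \tfrac{2}{3}.$$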

Inclusion-Exclusion Principle for Non-Disjoint Events (Theorem 2.5)

When events are not disjoint, the simple Addition Rule does not directly apply because we would be double-counting outcomes that belong to multiple events. The Inclusion-Exclusion Principle (Theorem 2.5) provides a way to calculate the probability of the union of events, even when they are not disjoint. It systematically adds and subtracts probabilities of intersections to correct for overcounting.

Theorem 2.5 (Inclusion-Exclusion Principle): For events $A_1, A_2, \ldots, A_n$:

$$\Pr\left[\bigcup_{i=1}^{n} A_i\right] = \sum_{i=1}^{n} \Pr[A_i] - \sum_{1 \le i < j \le n} \Pr[A_i \cap A_j] + \sum_{1 \le i < j < k \le n} \Pr[A_i \cap A_j \cap A_k] - \cdots + (-1)^{n+1} \Pr\left[\bigcap_{i=1}^{n} A_i\right]$$

In expanded form, for $n = 2$ and $n = 3$, the principle becomes:

  • For $n = 2$: $\Pr[A \cup B] = \Pr[A] + \Pr[B] - \Pr[A \cap B]$

  • For $n = 3$: $\Pr[A \cup B \cup C] = \Pr[A] + \Pr[B] + \Pr[C] - \Pr[A \cap B] - \Pr[A \cap C] - \Pr[B \cap C] + \Pr[A \cap B \cap C]$

The Inclusion-Exclusion Principle is a powerful tool for calculating probabilities of unions of events, especially when dealing with dependencies and overlaps between events.
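As a concrete case of the $n = 2$ formula, roll two fair dice and let $A$ be the event “the first die shows a 6” and $B$ the event “the second die shows a 6”. Then $\Pr[A] = \Pr[B] = \tfrac{1}{6}$ and $\Pr[A \cap B] = \tfrac{1}{36}$, so

$$\Pr[A \cup B] = \tfrac{1}{6} + \tfrac{1}{6} - \tfrac{1}{36} = \tfrac{11}{36}.$$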

Boole’s Inequality (Union Bound) (Corollary 2.6)

A useful upper bound for the probability of the union of events, even if they are not disjoint, is given by Boole’s Inequality or the Union Bound (Corollary 2.6). It states that the probability of the union of events is at most the sum of their individual probabilities.

Corollary 2.6 (Boole’s Inequality/Union Bound): For events $A_1, A_2, \ldots, A_n$:

$$\Pr\left[\bigcup_{i=1}^{n} A_i\right] \le \sum_{i=1}^{n} \Pr[A_i]$$

This inequality is often used when a precise calculation of the union probability is difficult, but we need a quick upper estimate. It is particularly valuable when dealing with a large number of events, as it provides a simple and easily computable bound.
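In the two-dice example above, the Union Bound yields $\Pr[A \cup B] \le \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{1}{3}$, only slightly weaker than the exact value $\tfrac{11}{36}$, yet obtained without knowing $\Pr[A \cap B]$.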

Laplace Principle: Assigning Probabilities

In many practical situations, we need to assign probabilities to elementary events. The Laplace Principle provides a fundamental guideline for assigning probabilities when we lack specific information favoring one outcome over another.

Laplace Principle: In the absence of any reason to believe otherwise, we assume that all elementary events in the sample space are equally likely.

If we apply the Laplace Principle to a finite sample space $\Omega$ with $|\Omega|$ equally likely elementary events, then the probability of each elementary event $\omega$ is:

$$\Pr[\omega] = \frac{1}{|\Omega|}$$

For an event $A \subseteq \Omega$, the probability of $A$ is then given by:

$$\Pr[A] = \frac{|A|}{|\Omega|}$$

This principle is the cornerstone of classical probability and is widely used in situations where outcomes are considered to be symmetric or equally probable a priori. It is important to note that the Laplace Principle is a principle of indifference; it applies when we have no information to discriminate between outcomes. In situations where we have reason to believe that outcomes are not equally likely, other methods for assigning probabilities are needed, often based on empirical data or subjective assessments.
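As an illustration of the Laplace Principle, the following Python sketch (assuming two fair dice; the names are hypothetical) computes the probability that the dice sum to 7 by counting favourable outcomes.

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered pairs of results of two fair dice (36 outcomes).
Omega = list(product(range(1, 7), repeat=2))

# Laplace Principle: each elementary event has probability 1 / |Omega|.
p = Fraction(1, len(Omega))

# Event A: "the two dice sum to 7".
A = [(d1, d2) for (d1, d2) in Omega if d1 + d2 == 7]

print(len(A) * p)   # |A| / |Omega| = 6/36 = 1/6
```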

Describing Probability Spaces and Events

When working with probability spaces, especially in complex scenarios, formally describing the sample space and events can become cumbersome. In practice, we often use informal descriptions and rely on conventions to define probability spaces and events.

For example, when considering card games, instead of explicitly listing all possible hands in the sample space, we might simply refer to events like “Player A has four Aces.” While less formal, this approach is often sufficient for practical purposes, provided that the underlying probability space and the meaning of events are clearly understood. However, when precision is paramount, or when ambiguities arise, it is crucial to revert to a rigorous definition of the sample space and events.

In summary, this section has laid the foundation for probability theory by introducing fundamental definitions, notations, and principles. These concepts will be essential as we delve into more advanced topics and apply probability theory to the analysis of algorithms and randomized processes.
