Add second year
- #[[ST2001 - Statistics in Data Science I]]
- **Previous Topic:** [[Sampling]]
- **Next Topic:** [[Random Variables]]
- **Relevant Slides:** 
-
- Probability provides the *framework* for the study & application of statistics.
- # What are Probabilities?
- Take, for example, a 6-sided die about to be tossed for the first time.
- **Classical:** There are 6 possible outcomes which, by symmetry, are each equally likely to occur.
- **Frequentist:** Empirical evidence shows that similar dice thrown in the past have landed on each side about equally often.
- **Subjective:** The degree of individual belief in the occurrence of an event can be influenced by classical or frequentist arguments.
- Subjective probabilities are also influenced by other considerations when symmetry arguments don't apply & repeated trials are not possible.
-
- # Probability
- The probability of an event $A$ is the number of (equally likely & disjoint) outcomes in the event divided by the total number of (equally likely & disjoint) possible outcomes.
- $$P(A) = \frac{\text{\# of outcomes in } A}{\text{\# of possible outcomes}}$$
- $$(0 \leq P(A) \leq 1)$$
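- For example (a worked illustration, not from the slides), for a fair 6-sided die with $A$ = "roll an even number":
- $$P(A) = \frac{\#\{2, 4, 6\}}{\#\{1, 2, 3, 4, 5, 6\}} = \frac{3}{6} = \frac{1}{2}$$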
- ## Sample Spaces
- What is a **sample space**? #card
card-last-interval:: 4
card-repeats:: 1
card-ease-factor:: 2.6
card-next-schedule:: 2022-10-03T08:36:59.497Z
card-last-reviewed:: 2022-09-29T08:36:59.498Z
card-last-score:: 5
- The set of all possible outcomes of a random experiment is called the **sample space**, $S$.
- $S$ is **discrete** if it consists of a finite or countably infinite set of outcomes.
- $S$ is **continuous** if it contains an interval of real numbers.
- $$P(S) = 1$$
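- For example (own illustration): one toss of the die has the discrete sample space $S = \{1, 2, 3, 4, 5, 6\}$, while measuring the time until a component fails has a continuous sample space such as $S = \{t \in \mathbb{R} : t \geq 0\}$.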
- ## Events
- What is an **event**? #card
card-last-interval:: 4
card-repeats:: 2
card-ease-factor:: 2.46
card-next-schedule:: 2022-10-10T09:40:11.548Z
card-last-reviewed:: 2022-10-06T09:40:11.549Z
card-last-score:: 3
- An **event** is a specific collection of sample points / possible outcomes.
- An event is denoted by $E$ or by capital letters, $A$, $B$, etc.
- What is a **Simple Event**? #card
card-last-interval:: -1
card-repeats:: 1
card-ease-factor:: 2.5
card-next-schedule:: 2022-10-04T23:00:00.000Z
card-last-reviewed:: 2022-10-04T12:11:11.353Z
card-last-score:: 1
- A **Simple Event** is a collection of only **one** sample point / possible outcome.
- What is a **Compound Event**? #card
card-last-interval:: 3.84
card-repeats:: 1
card-ease-factor:: 2.6
card-next-schedule:: 2022-10-04T04:30:13.004Z
card-last-reviewed:: 2022-09-30T08:30:13.004Z
card-last-score:: 5
- A **Compound Event** is a collection of **more than one** sample point / possible outcome.
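- For example (own illustration): when rolling the die, $\{6\}$ is a simple event, while "roll an even number" $= \{2, 4, 6\}$ is a compound event.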
- ## Permutations
- A **permutation** is an arrangement of objects.
- It can also be an arrangement of $r$ objects chosen from $n$ distinct objects where replacement in the selection is not allowed.
- The symbol $P^n_r$ represents the number of permutations of $r$ objects selected from $n$ objects.
- The calculation is given by the formula:
- $$P^n_r = \frac{n!}{(n-r)!}$$
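- For example (own illustration), the number of ways to arrange 2 objects chosen from 5 distinct objects is:
- $$P^5_2 = \frac{5!}{(5-2)!} = \frac{120}{6} = 20$$
- A minimal sketch (own example, not taken from the slides) that checks such counts using Python's standard library; note that `math.perm` requires Python 3.8+:
```python
from math import factorial, perm

n, r = 5, 2

# Direct use of the permutation formula n! / (n - r)!
by_formula = factorial(n) // factorial(n - r)

# math.perm computes the same quantity directly
by_library = perm(n, r)

print(by_formula, by_library)  # both print 20
```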
- ## Joint Events (and / or)
- Probabilities of **joint events** can often be determined from the probabilities of the individual events that comprise them.
- Joint events are generated by applying basic set operations to individual events, specifically:
- **Complement** of event $A$ is $\bar{A} =$ all outcomes *not* in $A$.
- **Union** of events $A \cup B$; $A$ **or** $B$ or both.
- **Intersection** of events $A$ **and** $B$ -> $A \cap B$.
- **Disjoint** events cannot occur together -> $A \cap B = \emptyset$.
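- For example (own illustration), with $A = \{2, 4, 6\}$ (an even roll) and $B = \{4, 5, 6\}$ (a roll of at least 4): $\bar{A} = \{1, 3, 5\}$, $A \cup B = \{2, 4, 5, 6\}$, $A \cap B = \{4, 6\}$, and $A$ and $\bar{A}$ are disjoint.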
-
- ## Probability of a Union ($A$ **or** $B$) #card
card-last-interval:: 1.62
card-repeats:: 2
card-ease-factor:: 2.36
card-next-schedule:: 2022-10-06T02:32:21.829Z
card-last-reviewed:: 2022-10-04T12:32:21.830Z
card-last-score:: 3
- For any two events $A$ and $B$, the probability of the union is given by:
- $$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$
- For two **disjoint** (also called **mutually exclusive**) events $A$ and $B$, the probability that one *or* the other occurs is simply the sum of the probabilities of the two events.
- $$P(A \cup B) = P(A) + P(B)$$
- If $P(A \cup B)$ comes out greater than 1, then you know you have made a mistake and that the events were not mutually exclusive -> there is an intersection.
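- For example (own illustration), with $A = \{2, 4, 6\}$ and $B = \{4, 5, 6\}$ as above:
- $$P(A \cup B) = \frac{3}{6} + \frac{3}{6} - \frac{2}{6} = \frac{4}{6} = \frac{2}{3}$$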
- ## Intersections ($A$ **and** $B$)
card-last-interval:: 3.51
card-repeats:: 2
card-ease-factor:: 2.6
card-next-schedule:: 2022-10-08T00:26:58.336Z
card-last-reviewed:: 2022-10-04T12:26:58.337Z
card-last-score:: 5
- #### Multiplication Rule for Independent Events #card
card-last-interval:: 3.84
card-repeats:: 1
card-ease-factor:: 2.6
card-next-schedule:: 2022-10-04T04:37:30.498Z
card-last-reviewed:: 2022-09-30T08:37:30.499Z
card-last-score:: 5
- For two **independent** events $A$ and $B$, the probability that *both* $A$ **and** $B$ occur is the product of the probabilities of the two events.
- $$P(A \cap B) = P(A) \times P(B)$$
- If two events are **independent**, that means that one event has no impact on the probability of occurrence of the other event.
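- For example (own illustration), when rolling two fair dice independently, the probability that *both* show a 6 is:
- $$P(\text{both sixes}) = \frac{1}{6} \times \frac{1}{6} = \frac{1}{36}$$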
- ## Conditional Probability #card
card-last-interval:: -1
card-repeats:: 1
card-ease-factor:: 2.5
card-next-schedule:: 2022-10-04T23:00:00.000Z
card-last-reviewed:: 2022-10-04T12:31:44.517Z
card-last-score:: 1
- $P(B | A)$ is the probability of event $B$ occurring, given that event $A$ has already occurred.
- The **conditional probability** of $B$ given $A$, denoted by $P(B | A)$, is defined by:
- $$P(B|A) = \frac{P(A \cap B)}{P(A)} \text{, provided } P(A) > 0$$
- **Note:** $P(A)$ cannot equal 0, since we know that $A$ *has* occurred.
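- For example (own illustration), given that a roll is even ($A = \{2, 4, 6\}$), the probability that it is a 6 ($B = \{6\}$) is:
- $$P(B|A) = \frac{P(A \cap B)}{P(A)} = \frac{1/6}{1/2} = \frac{1}{3}$$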
- ### General Multiplication Rule for Dependent Events
- The conditional probability can be rewritten to further generalise the multiplication rule:
- $$P(A \cap B) = P(A) \cdot P(B|A)$$
- $$P(B \cap A) = P(B) \cdot P(A|B)$$
- Since $P(A \cap B) = P(B \cap A)$, this implies:
- $$P(A) \cdot P(B | A) = P(B) \cdot P(A |B)$$
- These results mean that $P(A | B)$ can be calculated once we know $P(A)$, $P(B)$, and $P(B | A)$.
- #### Bayes' Theorem #card
card-last-interval:: -1
card-repeats:: 1
card-ease-factor:: 2.5
card-next-schedule:: 2022-10-04T23:00:00.000Z
card-last-reviewed:: 2022-10-04T09:45:39.727Z
card-last-score:: 1
- **Bayes' Theorem** states that:
- $$P(A | B) = \frac{P(B | A) \cdot P(A)}{P(B)} \text{ for } P(B)>0$$
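- For example (own illustration), reversing the conditional example above: with $A = \{6\}$ and $B = \{2, 4, 6\}$, we know $P(B|A) = 1$, $P(A) = \frac{1}{6}$, and $P(B) = \frac{1}{2}$, so:
- $$P(A | B) = \frac{1 \cdot \frac{1}{6}}{\frac{1}{2}} = \frac{1}{3}$$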
- ## Independence
- Two events $A$ and $B$ are independent if and only if:
- $$P(A \cap B) = P(A)\cdot P(B)$$
- Therefore, to obtain the probability that two independent events will both occur, we simply find the product of their individual probabilities.
- Equivalently, two events $A$ and $B$ are independent if and only if:
- $$P(B | A) = P(B) \text{ or } P(A|B) = P(A)$$
- assuming the existence of the conditional probabilities.
- Otherwise, $A$ and $B$ are **dependent**.
- If, in an experiment, the events $A$ and $B$ can both occur, then:
- $$P(A \cap B) = P(A)P(B|A) \text{, provided } P(A) > 0$$
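- A minimal sketch (own example, not from the slides) that checks this definition by enumerating the sample space of one fair die, with $A$ = "even roll" and $B = \{1, 2, 3, 4\}$:
```python
from fractions import Fraction

# Sample space for one fair die; every outcome is equally likely.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}      # even roll
B = {1, 2, 3, 4}   # roll of at most 4

def prob(event):
    """Classical probability: favourable outcomes / possible outcomes."""
    return Fraction(len(event & S), len(S))

p_a, p_b = prob(A), prob(B)
p_both = prob(A & B)        # P(A intersect B) by direct counting

print(p_both, p_a * p_b)    # 1/3 and 1/3
print(p_both == p_a * p_b)  # True -> A and B are independent here
```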