My report scores on exams are quite low, mainly because my English writing is weak. To practice English writing, some of my blog posts (the simpler ones) will be written in English. I hope you will read them in your spare time and point out any mistakes.
Abstract
The main content is the review of cumulative distribution functions(CDFs), probability mass functions(PMFs) and probability density functions(PDFs).
Axioms of Probability
- P(A)≥0, for all A∈F, where F is the set of events (the event space)
- P(Ω)=1, Ω represents a sample space
- If A1,A2,... are disjoint events, then P(∪iAi)=∑iP(Ai)
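The three axioms can be checked concretely on a finite sample space. Below is a minimal sketch of my own (not from the lecture notes), using a fair six-sided die where each outcome has probability 1/6:

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}
p = {w: Fraction(1, 6) for w in omega}

def prob(event):
    """P(A) = sum of the probabilities of the outcomes in A."""
    return sum(p[w] for w in event)

# Axiom 1: non-negativity, P(A) >= 0
assert all(prob({w}) >= 0 for w in omega)
# Axiom 2: P(Omega) = 1
assert prob(omega) == 1
# Axiom 3: additivity for the disjoint events {1,2} and {3,4}
assert prob({1, 2} | {3, 4}) == prob({1, 2}) + prob({3, 4})
```

Using exact fractions rather than floats keeps the equalities exact.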
Conditional Probability and Independence
The conditional probability of any event A given B is defined as
P(A|B)=P(A∩B)/P(B), provided P(B)>0.
Two events A and B are independent if P(A∩B)=P(A)P(B), or equivalently P(A|B)=P(A).
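The definition can be illustrated with a small counting example of my own (the specific events are hypothetical, not from the notes): roll two fair dice, let A be "the sum is 8" and B be "the first die is even".

```python
from fractions import Fraction
from itertools import product

# 36 equally likely outcomes (first die, second die).
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) by counting favorable outcomes over |Omega|."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] + w[1] == 8   # sum is 8
B = lambda w: w[0] % 2 == 0      # first die is even

# P(A|B) = P(A and B) / P(B)
p_A_and_B = prob(lambda w: A(w) and B(w))  # (2,6), (4,4), (6,2) -> 3/36
p_B = prob(B)                              # 18/36
p_A_given_B = p_A_and_B / p_B              # (1/12) / (1/2) = 1/6
```

Conditioning on B restricts the sample space to the 18 outcomes where B holds, and P(A|B) is the fraction of those in which A also holds.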
Random Variables
We denote random variables using upper case letters X(ω) or simply X. We denote the value that a random variable may take on using lower case letters, e.g. x.
Cumulative Distribution Functions
A cumulative distribution function is a function FX:ℝ→[0,1] which specifies a probability measure as FX(x)=P(X≤x).
Properties:
- 0≤FX(x)≤1
- limx→−∞FX(x)=0
- limx→∞FX(x)=1
- x≤y⇒FX(x)≤FX(y)
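These properties can be checked numerically. The sketch below is my own (assuming a standard normal distribution, which is not discussed above); its CDF can be written with the error function as FX(x)=0.5·(1+erf(x/√2)):

```python
import math

def normal_cdf(x):
    """CDF of the standard normal, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# limits: F_X(x) -> 0 as x -> -inf, and F_X(x) -> 1 as x -> +inf
assert normal_cdf(-10.0) < 1e-12
assert normal_cdf(10.0) > 1.0 - 1e-12
# range: 0 <= F_X(x) <= 1
xs = [i / 10 for i in range(-50, 51)]
assert all(0.0 <= normal_cdf(x) <= 1.0 for x in xs)
# monotonicity: x <= y implies F_X(x) <= F_X(y)
assert all(normal_cdf(a) <= normal_cdf(b) for a, b in zip(xs, xs[1:]))
```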
Probability Mass Functions
When X is a discrete random variable, a simpler way to represent the probability of a random variable is to directly specify the probability of each value that the random variable can assume. A probability mass function is a function pX:Ω→ℝ such that pX(x)=P(X=x).
Properties:
- 0≤pX(x)≤1
- ∑x∈Val(X)pX(x)=1
- ∑x∈ApX(x)=P(X∈A)
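As a concrete check of these properties, here is a small sketch of my own using the binomial distribution (my choice of example, not from the notes), whose PMF is pX(k)=C(n,k)·p^k·(1−p)^(n−k):

```python
import math

def binomial_pmf(k, n, p):
    """PMF of X ~ Binomial(n, p): probability of exactly k successes."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 10, 0.3
support = range(n + 1)  # Val(X) = {0, 1, ..., n}

# each value lies in [0, 1]
assert all(0.0 <= binomial_pmf(k, n, p) <= 1.0 for k in support)
# the PMF sums to 1 over Val(X)
assert abs(sum(binomial_pmf(k, n, p) for k in support) - 1.0) < 1e-12
# P(X in A) by summing the PMF over A, e.g. A = {0, 1, 2}
p_A = sum(binomial_pmf(k, n, p) for k in [0, 1, 2])
```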
Probability Density Functions
For some continuous random variables, the CDF FX(x) is differentiable everywhere. The probability density function is the derivative of the CDF: fX(x)=dFX(x)/dx.
Properties:
- fX(x)≥0
- ∫∞−∞fX(x)dx=1
- ∫x∈AfX(x)dx=P(X∈A)
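The relationship between the PDF and the CDF can be verified numerically. Below is a rough sketch of my own (again assuming the standard normal, with a simple trapezoidal integrator in place of a proper quadrature library): integrating fX over A=[a,b] should give P(X∈A)=FX(b)−FX(a).

```python
import math

def normal_pdf(x):
    """PDF of the standard normal: exp(-x^2/2) / sqrt(2*pi)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def normal_cdf(x):
    """CDF of the standard normal, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def integrate(f, a, b, n=100_000):
    """Trapezoidal-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

# total mass is approximately 1 (tails truncated at +/- 10)
assert abs(integrate(normal_pdf, -10.0, 10.0) - 1.0) < 1e-6
# P(X in [-1, 1]) computed from the PDF matches F_X(1) - F_X(-1)
p_interval = integrate(normal_pdf, -1.0, 1.0)
assert abs(p_interval - (normal_cdf(1.0) - normal_cdf(-1.0))) < 1e-6
```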
Reference: http://cs229.stanford.edu/section/cs229-prob.pdf