PROBABILITY
Made By: Prankit Mishra (141CC00007)
Submitted To: Mr. Nishant Mathur (Faculty)
Contents  Basic Definition  Use Of Probability  Interpretations  Where It came from?  History  Theory  Applications  Important Terms of Probability  Experiment, Outcome, Event & Joint Probability  Independent Event  Mutually Exclusive Event  Conditional Probability
Basic Definition
• Probability is the measure of the likelihood that an event will occur. Probability is quantified as a number between 0 and 1, where 0 indicates impossibility and 1 indicates certainty. The higher the probability of an event, the more certain we are that the event will occur.
• For an event A, the probability of A, written P(A), is

  P(A) = (number of ways the event can occur) / (total number of possible outcomes)
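As a minimal sketch of this classical definition, the Python snippet below counts favourable outcomes against the whole sample space. The fair six-sided die and the event "roll an even number" are assumptions chosen purely for illustration; they are not from the original slide.

```python
from fractions import Fraction

# Illustrative assumption: one roll of a fair six-sided die.
outcomes = list(range(1, 7))                     # the sample space {1, ..., 6}
event_A = [x for x in outcomes if x % 2 == 0]    # event A: "roll an even number"

# Classical definition: favourable outcomes / total outcomes.
p_A = Fraction(len(event_A), len(outcomes))
print(p_A)  # 1/2
```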
Use of Probability
• Concepts of probability have been given an axiomatic mathematical formalization in probability theory, which is used widely in such areas of study as mathematics, statistics, finance, gambling, science (in particular physics), artificial intelligence/machine learning, computer science, game theory, and philosophy.
• Probability theory is also used to describe the underlying mechanics and regularities of complex systems.
Interpretations
• When dealing with experiments that are random and well-defined in a purely theoretical setting (like tossing a fair coin), probabilities can be numerically described by the number of desired outcomes divided by the total number of all outcomes.
• When it comes to practical application, however, there are two major competing categories of probability interpretations, whose adherents hold different views about the fundamental nature of probability:
  1. Objectivists assign numbers to describe some objective or physical state of affairs. This interpretation considers probability to be the relative frequency "in the long run" of outcomes.
  2. Subjectivists assign numbers per subjective probability, i.e., as a degree of belief. The degree of belief has been interpreted as "the price at which you would buy or sell a bet that pays 1 unit of utility if E, 0 if not E."
Where It Came From?
• The word probability derives from the Latin probabilitas, which can also mean "probity", a measure of the authority of a witness in a legal case in Europe, often correlated with the witness's nobility.
• In a sense, this differs greatly from the modern meaning of probability, which, in contrast, is a measure of the weight of empirical evidence, arrived at from inductive reasoning and statistical inference.
History  The scientific study of probability is a modern development of mathematics. Gambling shows that there has been an interest in quantifying the ideas of probability for millennia, but exact mathematical descriptions arose much later.  According to Richard Jeffrey, "Before the middle of the seventeenth century, the term 'probable' (Latin probabilis) meant approvable, and was applied in that sense, univocally, to opinion and to action. A probable action or opinion was one such as sensible people would undertake or hold, in the circumstances.“  The theory of errors may be traced back to Roger Cotes's Opera Miscellanea (posthumous, 1722), but a memoir prepared by Thomas Simpson in 1755 (printed 1756) first applied the theory to the discussion of errors of observation.
Cont……  Daniel Bernoulli (1778) introduced the principle of the maximum product of the probabilities of a system of concurrent errors.  Adrien-Marie Legendre (1805) developed the method of least squares, and introduced it in his Nouvelles méthodes pour la détermination des orbites des comètes (New Methods for Determining the Orbits of Comets).  In the nineteenth century authors on the general theory included Laplace, Sylvestre Lacroix (1816), Littrow (1833), Adolphe Quetelet (1853), Richard Dedekind (1860), Helmert (1872), Hermann Laurent (1873), Liagre, Didion, and Karl Pearson. Augustus De Morgan and George Boole improved the exposition of the theory.  Andrey Markov introduced the notion of Markov chains (1906), which played an important role in stochastic processes theory and its applications. The modern theory of probability based on the measure theory was developed by Andrey Kolmogorov (1931).
Theory  Like other theories, the theory of probability is a representation of probabilistic concepts in formal terms—that is, in terms that can be considered separately from their meaning.  There have been at least two successful attempts to formalize probability, namely the Kolmogorov formulation and the Cox formulation. In Kolmogorov's formulation, sets are interpreted as events and probability itself as a measure on a class of sets.  In Cox's theorem, probability is taken as a primitive and the emphasis is on constructing a consistent assignment of probability values to propositions.  In both cases, the laws of probability are the same, except for technical details.
Applications  Probability theory is applied in everyday life in risk assessment and in trade on financial markets.  Governments apply probabilistic methods in environmental regulation, where it is called pathway analysis.  In addition to financial assessment, probability can be used to analyze trends in biology (e.g. disease spread) as well as ecology (e.g. biological Punnett squares).  Another significant application of probability theory in everyday life is reliability. Many consumer products, such as automobiles and consumer electronics, use reliability theory in product design to reduce the probability of failure. Failure probability may influence a manufacturer's decisions on a product's warranty.
IMPORTANT TERMS OF PROBABILITY
Experiment, Outcome, Event & Joint Probability
• An experiment is a situation involving chance or probability that leads to results called outcomes.
• An outcome is the result of a single trial of an experiment.
• An event is one or more outcomes of an experiment.
• If two events A and B occur on a single performance of an experiment, this is called the intersection or joint probability of A and B, denoted P(A∩B).
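The short Python sketch below makes these terms concrete. The experiment (one roll of a fair six-sided die) and the two events are illustrative assumptions, not taken from the slide; the joint probability is just the share of outcomes that lie in both events.

```python
from fractions import Fraction

# Illustrative experiment: one roll of a fair six-sided die.
outcomes = list(range(1, 7))                     # every possible outcome of a single trial

event_A = {x for x in outcomes if x % 2 == 0}    # event A: "the roll is even"
event_B = {x for x in outcomes if x > 3}         # event B: "the roll is greater than 3"

# Joint probability P(A ∩ B): outcomes that belong to both events.
p_joint = Fraction(len(event_A & event_B), len(outcomes))
print(p_joint)  # 1/3, since A ∩ B = {4, 6}
```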
Independent Events
• Two events, A and B, are independent if the fact that A occurs does not affect the probability of B occurring. Some examples of independent events are:
  • Landing on heads after tossing a coin AND rolling a 5 on a single 6-sided die.
  • Choosing a marble from a jar AND landing on heads after tossing a coin.
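As a minimal check of the first example, the sketch below enumerates the coin-and-die sample space and verifies the product rule P(A∩B) = P(A)·P(B), which is the usual numerical test for independence (the rule itself is not stated on the slide).

```python
from fractions import Fraction
from itertools import product

# Illustrative experiment: toss a fair coin AND roll a fair six-sided die.
sample_space = list(product(["H", "T"], range(1, 7)))      # 12 equally likely outcomes

event_A = {s for s in sample_space if s[0] == "H"}         # A: the coin lands on heads
event_B = {s for s in sample_space if s[1] == 5}           # B: the die shows a 5

p_A = Fraction(len(event_A), len(sample_space))            # 1/2
p_B = Fraction(len(event_B), len(sample_space))            # 1/6
p_AB = Fraction(len(event_A & event_B), len(sample_space)) # 1/12

print(p_AB == p_A * p_B)  # True: A and B are independent
```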
Mutually Exclusive Events
• If either event A or event B occurs on a single performance of an experiment, this is called the union of the events A and B, denoted P(A∪B).
• If A and B are mutually exclusive (they cannot both occur), then
  P(A or B) = P(A∪B) = P(A) + P(B)
• For example, the chance of rolling a 1 or 2 on a six-sided die is
  P(1 or 2) = P(1) + P(2) = 1/6 + 1/6 = 1/3
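The die example above can be checked by direct enumeration, as in the sketch below (the specific events {1} and {2} are the slide's example, carried over as an assumption about a fair die); for events that are not mutually exclusive, the sum would overcount and P(A∩B) would need to be subtracted.

```python
from fractions import Fraction

outcomes = list(range(1, 7))     # fair six-sided die (illustrative assumption)

event_A = {1}                    # A: roll a 1
event_B = {2}                    # B: roll a 2 (A and B cannot both happen)

p_union = Fraction(len(event_A | event_B), len(outcomes))
p_sum = Fraction(len(event_A), len(outcomes)) + Fraction(len(event_B), len(outcomes))

print(p_union == p_sum == Fraction(1, 3))  # True: P(A ∪ B) = P(A) + P(B) for disjoint events
```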
Conditional Probability
• Conditional probability is the probability of some event A, given the occurrence of some other event B. Conditional probability is written P(A|B) and is read "the probability of A, given B". It is defined by

  P(A|B) = P(A∩B) / P(B)
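A small worked example of this definition is sketched below; the events chosen (A = "roll a 2", B = "roll is even" on a fair six-sided die) are illustrative assumptions, not from the original slide.

```python
from fractions import Fraction

outcomes = list(range(1, 7))                     # fair six-sided die (illustrative assumption)

event_A = {2}                                    # A: roll a 2
event_B = {x for x in outcomes if x % 2 == 0}    # B: the roll is even

p_B = Fraction(len(event_B), len(outcomes))              # P(B)   = 1/2
p_AB = Fraction(len(event_A & event_B), len(outcomes))   # P(A∩B) = 1/6

p_A_given_B = p_AB / p_B                                 # definition of conditional probability
print(p_A_given_B)  # 1/3: given the roll is even, a 2 is one of three possibilities
```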
THANK-YOU