PROBABILITY PROJECT ON MARKOV CHAINS


1780 Research Project

The project is simple: pick something you want to learn about that is mathematically oriented and hopefully at least tangentially related to probability theory, teach yourself about it, and then present your learnings to me! To give you a sense of what I'll be looking for, here are some ideas of projects that I'd be interested in seeing. You are welcome to choose any of these topics for yourself, or come up with your own:

• An explanation of Markov chains along with some applications and examples. (Recommended if you are interested in connecting probability theory to linear algebra, if you've taken that.)
• An axiomatic derivation of the formula for entropy. (Recommended to those looking for a legit and concrete mathematical challenge.)
• A presentation of a probability distribution (or distributions) that we won't be covering in this class, along with some applications, examples, and relationships to other distributions.
• A presentation and mathematical analysis of a probabilistic algorithm.
• A solution to an interesting problem in physics using probability theory.
• An introduction to some other mathematical theory that you wanted to learn about (pending my approval).
• Anything else at all which shows me that you put an honest attempt into learning something that you are genuinely interested in!

My life generally consists of me waking up, going somewhere around town, and teaching myself whatever math I'm interested in all day until my brain feels like it's melting. My hope with this project is that you'll experience what it's like to spend a couple of days in my shoes and have some fun doing it. I would like to encourage everyone to pick something which seems challenging to them. Pick something that seems overly lofty and out of your current reach, and see how far you can get! You might surprise yourself. You probably won't fully learn the topic you picked if you do this, but you'll end up learning a ton of other stuff along the way!
Just present that stuff to me. As long as I see that you made an honest attempt to learn something, that's what I'm looking for: primarily effort, and secondarily success.

On Thursday, March 21, you will turn in to me a short paragraph explaining what you plan to do for your project. You will receive this paragraph back the following week, and written with it will be one of three things:

• Green check mark – You're good to go. No issues, I like your idea. Go forth, my child.
• Yellow box – I have concerns about your topic choice, but I am okay with your general idea. You can start doing research on it, but you should also come talk to me ASAP.
• Red X – There are problems. In the case of an X, there will be written details about how to proceed. A red X does not necessarily mean you must pick another topic entirely. It means you absolutely need to come talk to me or address the written details before starting research.

This project is designed to be done individually. Depending on the situation, I might allow pairs of people to work together on a project. If two of you are committed to working together and turning in a single project, then I will expect the quality of what is turned in to reflect the efforts of two people, and your grade will reflect that.

Expectations: First and foremost, I am expecting to see some real mathematical theory somewhere in your project. Do not simply watch a few Numberphile videos and assume that is enough research to write your project. I am expecting worked examples/proofs/derivations, formal definitions, and mathematical rigor. For example, if you are presenting a probabilistic algorithm, you need to prove to me that it actually works using the tools we've developed! Any sources that you heavily rely on should be cited in some capacity. I am not picky about where those sources come from. YouTube and Wikipedia are fine, but I would strongly encourage you to have at least one source external to both of those.
If you are having trouble finding something which is approachable to you, come talk to me and I can help.

Since this is a math project, I cannot reasonably expect the entire thing to be typed. Ideally, what you will turn in is a combination of typed explanation and neatly written mathematics on paper, stapled to the typed report. If you'd like to write everything down on paper you can, but I will expect it to be at the same level of readability as my notes on expectation, which are available to look at on the front page. You should use those as a quality reference for anything you write down. On the flip side, if you are interested in turning in something which is purely electronic, the standard markup language for mathematical texts is called LaTeX. The online browser-based IDE called Overleaf is wonderfully convenient, and the language itself is quite easy to learn. This is by no means a requirement, however.

So how do I impose a length requirement if the format is so loose? Lucky for you, I've essentially done my own version of this project already. On Canvas, you can find my own project, BPP and the Chernoff Bound (which we'll go over as a class once we finish talking about the Binomial distribution). This can be seen as the minimal amount of material which would receive a good grade. It's got some exposition, some definitions, a dash of rigorous math, an illustrative example, and some genuine interest involved. As long as I can see all of these things, or at least a solid attempt at all of them, you'll receive a good grade.

That's all for now. The project will be due near the end of the semester; I'm not sure of an exact date yet. It is worth a full test grade. Truly exceptional efforts might even be worth more than a test grade! If you are worried about your grade in this class, the best way to show me that you care would be to go above and beyond on this. I guarantee you that will be taken into account when assigning final grades.
Answered same day, Apr 25, 2021


Rajeswari answered on Apr 27 2021
Entropy of probability
Introduction:
Although the uncertainty of a probability distribution had long been described in terms of entropy and information, the precise functional relationship between the two was a challenging question for researchers in statistics. Many theories were developed relating probability to entropy. Conventional treatments established properties such as additivity and extensivity, which are central to the main information theory, due to Shannon. Among all of these, the most widely accepted and used is Shannon's, in which the entropy of a distribution is the negative of the sum, over all outcomes, of each outcome's probability multiplied by the logarithm of that probability:

H(X) = -Σ p(x) log p(x)

Later generations of researchers asked whether theories other than Shannon's could be developed to explain the relation between probability and entropy.
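Shannon's definition is easy to check numerically. Below is a minimal Python sketch (the function name `shannon_entropy` is my own, not from any particular library), computing the entropy of a fair die and of a heavily loaded one:

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum(p * log(p)) over outcomes, with the convention 0 * log 0 = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair die: six equally likely outcomes, entropy log2(6) ≈ 2.585 bits.
fair = shannon_entropy([1 / 6] * 6)

# A loaded die concentrated on one face carries much less uncertainty.
loaded = shannon_entropy([0.95] + [0.01] * 5)

print(fair, loaded)
```

As expected, the uniform (fair) distribution yields the larger entropy; in fact it maximizes entropy among all distributions on six outcomes.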
In physics, entropy is central to the branch of thermodynamics. It has also been applied to the behavior of gases under compression, where it underpinned many discoveries.
Meaning of entropy:
As per the dictionary, entropy means lack of order, lack of predictability, a state of uncertainty, or a gradual decline into disorder.
For example, we can say that the future price of steel is not predictable because entropy reigns supreme.
Entropy of probability:
We have already learnt that probability is the likelihood, or chance, of an event occurring. From past events or experiments we count the number of favourable outcomes out of a certain total number of outcomes, and express the probability as the ratio of favourable outcomes to total outcomes.
Equivalently, probability can be expressed as the number of successes divided by the total number of trials.
But note that even if we found, say, that in 1000 trials the expected number of heads is 500, this does not guarantee that if you toss a coin 10 times you will get 5 heads. Probability means that as the number of trials increases, the ratio of the number of heads to the total number of trials tends to a limit of 0.5.
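This stabilisation of the relative frequency can be seen in a quick simulation. A sketch in Python (the helper name `heads_frequency` is mine):

```python
import random

def heads_frequency(n_tosses, seed=0):
    """Relative frequency of heads in n_tosses fair-coin tosses."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# The frequency need not be exactly 0.5 for small n,
# but it settles near 0.5 as n grows.
for n in (10, 1_000, 100_000):
    print(n, heads_frequency(n))
```

Running this shows the small-sample frequencies wandering while the large-sample frequency hugs 0.5, which is exactly the limiting behaviour described above.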
Also, probability always lies between 0 and 1, where 0 is the extreme case of an impossible event and 1 is the extreme case of a certain event.
Entropy here refers to the uncertainty of the probability distribution. As mentioned above, for a fair die the number of 6's expected is 1/6 of the total number of trials.
But this is not certain, and there is uncertainty involved. This uncertainty, or lack of certainty, is what we call entropy. Note that the two extremes, probability 0 and probability 1, both have entropy 0.
Let us consider the entropy function H(p) for a simple random experiment with two outcomes: success with probability p and failure with probability 1 - p. When p is 0 or 1, i.e. for an impossible or a certain event, the entropy equals 0, because there is no uncertainty about the outcome. But for every probability in the interval (0, 1) there is a positive entropy. This binary entropy function is

H(p) = -p log2(p) - (1 - p) log2(1 - p),

and, measured in bits (base-2 logarithms), it also lies between 0 and 1.
Unlike the cumulative probability, which increases and reaches 1, the binary entropy function is bell-shaped: it increases from H(0) = 0, reaches its maximum value 1 at p = 0.5, and then declines back to H(1) = 0. The curve is concave and symmetric about p = 0.5, so it resembles a downward-opening parabola with vertex at (0.5, 1) and axis of symmetry x = 0.5, though it is not actually a parabola.
The closer the probability is to 0.5, the greater the entropy, or uncertainty, will be.
When the probability is near 0 or 1, the entropy will be small.
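The shape just described can be checked numerically. A short Python sketch (the function name `binary_entropy` is my own), evaluating H(p) at a few points and comparing it with the parabola 4p(1 - p), which shares the same zeros and peak:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))                       # maximum uncertainty: 1 bit
print(binary_entropy(0.1), binary_entropy(0.9))  # symmetric about p = 0.5
# The parabola 4p(1-p) has the same zeros and the same peak (0.5, 1),
# but the entropy curve lies above it everywhere else on (0, 1).
print(binary_entropy(0.25), 4 * 0.25 * (1 - 0.25))
```

The last comparison makes the "not actually a parabola" point concrete: at p = 0.25 the entropy exceeds the parabola's value, so the entropy curve is flatter near its peak and steeper near the endpoints.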
How is probability related to entropy?
Entropy is a...