{
"cells": [
{
"cell_type": "markdown",
"id": "3adcb07b-fc30-4e99-8810-1010095adae8",
"metadata": {},
"source": [
"# Master 2 Computer Science \n",
"# RL Course M2 AI\n",
"\n",
"## Mathematical Reminders"
]
},
{
"cell_type": "markdown",
"id": "8e2fe953-7feb-4013-8958-8129df9e693a",
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"### 1. Probability Theory Reminders"
]
},
{
"cell_type": "markdown",
"id": "dc75defc-2fec-48d7-b834-0a4cf3c2e874",
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"#### Exercise 1.1. (Discrete Random Variables):\n",
"\n",
"*Task:* Simulate rolling a fair die 10,000 times and estimate the probability of each outcome.\n",
"\n",
"*Python Tip:* Use numpy.random.choice()
to simulate the die rolls and collections.Counter
to count outcomes.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "68102f9b-0f26-4356-b8b7-6a0c651093ec",
"metadata": {},
"outputs": [],
"source": []
},
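{
"cell_type": "markdown",
"id": "ex11-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* one way to tackle Exercise 1.1; the seed and variable names below are illustrative choices, not part of the exercise statement."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex11-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"from collections import Counter\n",
"\n",
"np.random.seed(0)                        # illustrative seed for reproducibility\n",
"n_rolls = 10_000\n",
"faces = np.arange(1, 7)\n",
"\n",
"rolls = np.random.choice(faces, size=n_rolls)   # fair die: uniform over 1..6\n",
"counts = Counter(rolls)\n",
"\n",
"for face in faces:\n",
"    print(face, counts[face] / n_rolls)         # each estimate should be close to 1/6\n"
]
},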
{
"cell_type": "markdown",
"id": "2033f83e-355d-4a6e-9914-f32fc6ccef81",
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"#### Exercise 1.2 (Continuous Random Variables):\n",
"\n",
"*Task:* Generate random samples from a normal distribution with mean 0 and variance 1, then plot the histogram and overlay the theoretical probability density function.\n",
"\n",
"*Python Tip:* Use numpy.random.normal()
for sample generation and matplotlib
for plotting."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "507dd126-d10c-4ac6-8e58-7d0eed1e788d",
"metadata": {},
"outputs": [],
"source": []
},
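{
"cell_type": "markdown",
"id": "ex12-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* a minimal sketch for Exercise 1.2 that sticks to `numpy.random.normal()` and `matplotlib`; the standard normal density is computed by hand with NumPy, and the sample size and bin count are illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex12-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"\n",
"np.random.seed(0)\n",
"samples = np.random.normal(loc=0.0, scale=1.0, size=10_000)\n",
"\n",
"plt.hist(samples, bins=50, density=True, alpha=0.6, label='samples')\n",
"\n",
"x = np.linspace(-4, 4, 400)\n",
"pdf = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)   # standard normal density\n",
"plt.plot(x, pdf, 'r-', label='N(0, 1) density')\n",
"plt.legend()\n",
"plt.show()\n"
]
},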
{
"cell_type": "markdown",
"id": "2cf65791-aa33-4ec5-8992-b09be15f6313",
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"#### Exercise 1.3 (Expectation Calculation):\n",
"\n",
"*Task:* Simulate 100,000 samples of a random variable that represents the outcome of a coin toss (0 for heads, 1 for tails) with \n",
"$p=0.6$. Calculate the expected value.\n",
"\n",
"*Python Tip:* Use numpy.random.binomial()
and numpy.mean()
.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3ed3c330-d04c-4520-9aec-08323bcd42c8",
"metadata": {},
"outputs": [],
"source": []
},
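{
"cell_type": "markdown",
"id": "ex13-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 1.3, tails is coded as 1 so that the expected value equals $p = 0.6$; the seed is an illustrative choice."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex13-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"np.random.seed(0)\n",
"p = 0.6                     # probability of tails (coded as 1)\n",
"n_samples = 100_000\n",
"\n",
"tosses = np.random.binomial(n=1, p=p, size=n_samples)   # Bernoulli(p) samples\n",
"\n",
"print('empirical mean  :', np.mean(tosses))   # should be close to p\n",
"print('theoretical mean:', p)\n"
]
},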
{
"cell_type": "markdown",
"id": "1635a5a7-be76-450a-a9d1-2e3d330c9790",
"metadata": {},
"source": [
"#### Exercise 1.4 (Variance and Standard Deviation):\n",
"\n",
"*Task:* Generate 10,000 samples of a variable following an exponential distribution with rate parameter \n",
"$\\lambda=0.5$. Calculate the variance and standard deviation.\n",
"\n",
"*Python Tip:* Use numpy.random.exponential()
and numpy.var()
."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d4bb3164-adac-4a70-8f20-75f9be544b3f",
"metadata": {},
"outputs": [],
"source": []
},
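{
"cell_type": "markdown",
"id": "ex14-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 1.4, note the parameterization: `numpy.random.exponential()` expects the scale $1/\\lambda$, so $\\lambda = 0.5$ corresponds to `scale=2.0`; the seed is illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex14-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"np.random.seed(0)\n",
"lam = 0.5                          # rate parameter lambda\n",
"samples = np.random.exponential(scale=1.0 / lam, size=10_000)\n",
"\n",
"print('sample variance:', np.var(samples))   # theoretical value: 1 / lambda**2 = 4\n",
"print('sample std dev :', np.std(samples))   # theoretical value: 1 / lambda    = 2\n"
]
},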
{
"cell_type": "markdown",
"id": "46c0c50c-0b0a-4f23-80fb-23d25da604e2",
"metadata": {},
"source": [
"#### Exercise 1.5 (Law of Large Numbers):\n",
"\n",
"*Task:* Simulate rolling a biased die (probability of getting 6 is 0.3) and compute the empirical mean of outcomes for increasing sample sizes. Show that the empirical mean converges to the expected value as the sample size increases.\n",
"\n",
"*Python Tip:* Use numpy.random.choice()
to generate samples and matplotlib
to plot the empirical mean against sample size.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "dc677bff-f072-4da4-9034-8649541509d1",
"metadata": {},
"outputs": [],
"source": []
},
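{
"cell_type": "markdown",
"id": "ex15-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 1.5, under the assumption that the five non-6 faces are equally likely (0.14 each), the expected value is 3.9; the maximum sample size and seed are illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex15-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"\n",
"np.random.seed(0)\n",
"faces = np.arange(1, 7)\n",
"probs = np.array([0.14, 0.14, 0.14, 0.14, 0.14, 0.30])   # assumed biased die\n",
"expected_value = np.dot(faces, probs)                     # 3.9 under this assumption\n",
"\n",
"n_max = 100_000\n",
"rolls = np.random.choice(faces, size=n_max, p=probs)\n",
"running_mean = np.cumsum(rolls) / np.arange(1, n_max + 1)  # empirical mean vs. sample size\n",
"\n",
"plt.plot(running_mean, label='empirical mean')\n",
"plt.axhline(expected_value, color='r', linestyle='--', label='expected value')\n",
"plt.xscale('log')\n",
"plt.xlabel('sample size')\n",
"plt.legend()\n",
"plt.show()\n"
]
},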
{
"cell_type": "markdown",
"id": "e8195f63-d076-4db0-b565-9c4b53cf7f9e",
"metadata": {},
"source": [
"#### Exercise 1.6 (Central Limit Theorem):\n",
"\n",
"*Task:* Simulate the sum of 100 independent uniform random variables and plot the resulting histogram. Show that the distribution approaches a normal distribution.\n",
"\n",
"*Python Tip:* Use numpy.random.uniform()
and scipy.stats.norm.pdf()
to overlay the normal distribution.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3a5f7da3-9f4d-43d0-b45d-ac78ccebff4f",
"metadata": {},
"outputs": [],
"source": []
},
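{
"cell_type": "markdown",
"id": "ex16-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 1.6, the 100-term sum is repeated 10,000 times (an illustrative choice) so there is a distribution to histogram; the overlaid normal uses the matching mean $50$ and variance $100/12$ via `scipy.stats.norm.pdf()`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex16-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"from scipy.stats import norm\n",
"\n",
"np.random.seed(0)\n",
"n_terms = 100\n",
"n_repeats = 10_000                 # number of independent sums\n",
"\n",
"sums = np.random.uniform(0, 1, size=(n_repeats, n_terms)).sum(axis=1)\n",
"\n",
"mu = n_terms * 0.5                 # mean of the sum of U(0, 1) variables\n",
"sigma = np.sqrt(n_terms / 12.0)    # std dev of the sum (Var of U(0, 1) is 1/12)\n",
"\n",
"plt.hist(sums, bins=50, density=True, alpha=0.6, label='simulated sums')\n",
"x = np.linspace(sums.min(), sums.max(), 400)\n",
"plt.plot(x, norm.pdf(x, loc=mu, scale=sigma), 'r-', label='normal approximation')\n",
"plt.legend()\n",
"plt.show()\n"
]
},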
{
"cell_type": "markdown",
"id": "dcf84fe6-cafd-47e7-baf1-2208d9651aa0",
"metadata": {},
"source": [
"### 2. Statistical Inference"
]
},
{
"cell_type": "markdown",
"id": "aae7175a-1c02-4c54-ad81-7e046be1b538",
"metadata": {},
"source": [
"#### Exercise 2.1. (Point Estimation - Mean):\n",
"\n",
"*Task:* Generate samples from a normal distribution with unknown mean and known variance. Use the sample mean as an estimator and compare it to the true mean.\n",
"\n",
"*Python Tip:* Use numpy.mean()
for sample mean calculation."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "22a36b55-6ad8-41fc-b640-571921d16210",
"metadata": {},
"outputs": [],
"source": []
},
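{
"cell_type": "markdown",
"id": "ex21-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 2.1, the 'unknown' mean is fixed below only to generate data; the estimator itself uses just the sample. The sample size, mean, and variance are illustrative choices."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex21-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"np.random.seed(0)\n",
"true_mean = 3.7        # used only to simulate the data; treated as unknown by the estimator\n",
"known_std = 2.0\n",
"\n",
"samples = np.random.normal(loc=true_mean, scale=known_std, size=1_000)\n",
"estimate = np.mean(samples)\n",
"\n",
"print('sample mean   :', estimate)\n",
"print('true mean     :', true_mean)\n",
"print('absolute error:', abs(estimate - true_mean))\n"
]
},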
{
"cell_type": "markdown",
"id": "f2c6c9fb-7d01-4471-bd33-bf8419a55725",
"metadata": {},
"source": [
"#### Exercise 2.2 (Confidence Interval Estimation):\n",
"\n",
"*Task:* Generate data from a normal distribution and calculate the 95% confidence interval for the mean using the standard error.\n",
"\n",
"*Python Tip:* Use scipy.stats.norm.interval()
or manually compute using sample mean and standard deviation."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "115e12a3-6a38-4547-b1b2-7201a82730ad",
"metadata": {},
"outputs": [],
"source": []
},
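{
"cell_type": "markdown",
"id": "ex22-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 2.2, this uses the normal-based interval from the tip; the data-generating parameters and sample size are illustrative, and a t-based interval would be the more cautious choice for small samples."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex22-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"from scipy import stats\n",
"\n",
"np.random.seed(0)\n",
"data = np.random.normal(loc=5.0, scale=2.0, size=200)\n",
"\n",
"mean = np.mean(data)\n",
"sem = np.std(data, ddof=1) / np.sqrt(len(data))   # standard error of the mean\n",
"\n",
"low, high = stats.norm.interval(0.95, loc=mean, scale=sem)\n",
"print('95% confidence interval for the mean:', (low, high))\n"
]
},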
{
"cell_type": "markdown",
"id": "0e0f359e-bd14-452e-8491-e4eae5777715",
"metadata": {},
"source": [
"#### Exercise 2.3 (Hypothesis Testing - One Sample T-test):\n",
"\n",
"*Task:* Simulate a dataset representing the heights of individuals. Test if the sample mean is significantly different from a hypothesized mean of 170 cm.\n",
"\n",
"*Python Tip:* Use scipy.stats.ttest_1samp()
."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "57bfd8b6-24bf-4379-987c-e02acfd76fc1",
"metadata": {},
"outputs": [],
"source": []
},
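{
"cell_type": "markdown",
"id": "ex23-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 2.3, the simulated heights (mean 172 cm, std 8 cm, 60 individuals) are illustrative; the test itself only needs the sample and the hypothesized mean of 170 cm."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex23-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"from scipy import stats\n",
"\n",
"np.random.seed(0)\n",
"heights = np.random.normal(loc=172.0, scale=8.0, size=60)   # simulated heights in cm\n",
"\n",
"t_stat, p_value = stats.ttest_1samp(heights, popmean=170.0)\n",
"\n",
"print('t statistic:', t_stat)\n",
"print('p-value    :', p_value)\n",
"if p_value < 0.05:\n",
"    print('reject H0: the mean differs from 170 cm at the 5% level')\n",
"else:\n",
"    print('fail to reject H0 at the 5% level')\n"
]
},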
{
"cell_type": "markdown",
"id": "b3317016-7d5b-4870-adb0-6fd68292f02e",
"metadata": {},
"source": [
"#### Exercise 2.4 (Non-Parametric Test - Wilcoxon Test):\n",
"\n",
"*Task:* Simulate two sets of data representing different treatments and test if they come from the same distribution without assuming normality.\n",
"\n",
"*Python Tip:* Use scipy.stats.wilcoxon()
."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1f3a379d-c933-4e9e-a34a-b4ddcf4cd0a2",
"metadata": {},
"outputs": [],
"source": []
},
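{
"cell_type": "markdown",
"id": "ex24-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 2.4, paired measurements are simulated (the setting the signed-rank test expects); the distributions and shift used to generate the data are illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex24-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"from scipy import stats\n",
"\n",
"np.random.seed(0)\n",
"n = 30\n",
"treatment_a = np.random.exponential(scale=1.0, size=n)            # baseline measurements\n",
"treatment_b = treatment_a + np.random.normal(0.3, 0.5, size=n)    # same subjects, shifted effect\n",
"\n",
"stat, p_value = stats.wilcoxon(treatment_a, treatment_b)   # signed-rank test on paired samples\n",
"\n",
"print('statistic:', stat)\n",
"print('p-value  :', p_value)\n"
]
},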
{
"cell_type": "markdown",
"id": "57f67b39-bd4e-45a2-89f9-6ad1e4ef9972",
"metadata": {},
"source": [
"### 3. Optimization and Differential Calculus"
]
},
{
"cell_type": "markdown",
"id": "f828f49e-a398-4174-9b3c-50e6b14300d5",
"metadata": {},
"source": [
"#### Exercise 3.1 (Gradient Descent - Finding Minima):\n",
"\n",
"*Task:* Implement gradient descent to find the minimum of the function \n",
"$$ f(x) = (x - 2)^2 + 3$$\n",
"Plot the function and the path taken by the gradient descent algorithm.\n",
"\n",
"*Python Tip:* Use iterative updates of $x$ with the gradient formula and matplotlib
for visualization."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "425fecfb-454b-4c2b-b924-68a3dd828bdc",
"metadata": {},
"outputs": [],
"source": []
},
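{
"cell_type": "markdown",
"id": "ex31-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 3.1, the gradient of $f(x) = (x - 2)^2 + 3$ is $2(x - 2)$; the starting point, learning rate, and iteration count below are illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex31-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"\n",
"def f(x):\n",
"    return (x - 2) ** 2 + 3\n",
"\n",
"def grad_f(x):\n",
"    return 2 * (x - 2)\n",
"\n",
"x = -4.0          # starting point\n",
"lr = 0.1          # learning rate\n",
"path = [x]\n",
"for _ in range(50):\n",
"    x = x - lr * grad_f(x)\n",
"    path.append(x)\n",
"path = np.array(path)\n",
"\n",
"xs = np.linspace(-5, 9, 400)\n",
"plt.plot(xs, f(xs), label='f(x)')\n",
"plt.plot(path, f(path), 'ro-', markersize=3, label='gradient descent path')\n",
"plt.legend()\n",
"plt.show()\n",
"\n",
"print('approximate minimizer:', path[-1])   # should approach x = 2\n"
]
},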
{
"cell_type": "markdown",
"id": "364ee76d-f69f-4210-bc8b-a8d4de8292b3",
"metadata": {},
"source": [
"#### Exercise 3.2 (Newton’s Method for Optimization):\n",
"\n",
"*Task:* Use Newton’s method to find the root of the function \n",
"$$ f(x) = x^3 - 2x + 1$$\n",
"Compare the convergence speed with gradient descent.\n",
"\n",
"*Python Tip:* Implement the update rule using the first and second derivatives."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "83777abd-eab2-4be4-ab41-1f7a1a1bec6a",
"metadata": {},
"outputs": [],
"source": []
},
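{
"cell_type": "markdown",
"id": "ex32-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 3.2, Newton's root-finding update is $x \\leftarrow x - f(x)/f'(x)$; for the comparison, gradient descent is applied to minimizing $f(x)^2$ (an assumption made here, since plain gradient descent is not a root-finder). Starting points, learning rate, and tolerances are illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex32-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"def f(x):\n",
"    return x ** 3 - 2 * x + 1\n",
"\n",
"def df(x):\n",
"    return 3 * x ** 2 - 2\n",
"\n",
"# Newton's method for a root of f\n",
"x = 2.0\n",
"for i in range(50):\n",
"    step = f(x) / df(x)\n",
"    x = x - step\n",
"    if abs(step) < 1e-10:\n",
"        break\n",
"print('Newton root:', x, 'found in', i + 1, 'iterations')\n",
"\n",
"# Gradient descent on g(x) = f(x)**2, whose minima include the roots of f\n",
"x = 2.0\n",
"lr = 0.005\n",
"for j in range(50_000):\n",
"    x = x - lr * 2 * f(x) * df(x)\n",
"    if abs(f(x)) < 1e-10:\n",
"        break\n",
"print('gradient descent root:', x, 'found in', j + 1, 'iterations')\n"
]
},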
{
"cell_type": "markdown",
"id": "a3967f62-e397-4e60-b365-59b7643cc4da",
"metadata": {},
"source": [
"#### Exercise 3.3 (Numerical Derivatives):\n",
"\n",
"Task: Approximate the derivative of the function \n",
"$$ f(x) = \\sin(x)$$ \n",
"at $x=\\pi/4$ using finite differences and compare with the analytical derivative.\n",
"\n",
"*Python Tip:* Use numpy.diff()
or manual calculations for finite differences."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e8841135-0017-4f8c-8b1d-ec2334559905",
"metadata": {},
"outputs": [],
"source": []
},
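{
"cell_type": "markdown",
"id": "ex33-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 3.3, forward and central finite differences with a hand-picked step $h$ (an illustrative choice) are compared against the analytical derivative $\\cos(\\pi/4)$."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex33-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"x0 = np.pi / 4\n",
"h = 1e-5                                                   # finite-difference step\n",
"\n",
"forward = (np.sin(x0 + h) - np.sin(x0)) / h                # forward difference\n",
"central = (np.sin(x0 + h) - np.sin(x0 - h)) / (2 * h)      # central difference\n",
"exact = np.cos(x0)                                         # analytical derivative\n",
"\n",
"print('forward difference:', forward, 'error:', abs(forward - exact))\n",
"print('central difference:', central, 'error:', abs(central - exact))\n",
"print('analytical value  :', exact)\n"
]
},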
{
"cell_type": "markdown",
"id": "77eb8c46-936b-4f89-accc-6666e917934f",
"metadata": {},
"source": [
"#### Exercise 3.4 (Jacobian Calculation with Autograd):\n",
"\n",
"*Task:* Compute the Jacobian matrix of the function \n",
"$$f(x,y)=\\left(x^2 +y, x−y^2\\right)$$ at a given point using Python's autograd package.\n",
"\n",
"*Python Tip:* Use autograd
to automatically differentiate."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e1e3d6ca-546b-4c3d-957f-2afee6246d3f",
"metadata": {},
"outputs": [],
"source": []
},
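{
"cell_type": "markdown",
"id": "ex34-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 3.4, this assumes the `autograd` package is installed (`pip install autograd`) and uses `autograd.jacobian` with the function written over a single 2-vector argument; the evaluation point is illustrative. For comparison, the analytical Jacobian has rows $(2x, 1)$ and $(1, -2y)$."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex34-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"# assumes the autograd package is installed: pip install autograd\n",
"import autograd.numpy as anp\n",
"from autograd import jacobian\n",
"\n",
"def f(v):\n",
"    x, y = v[0], v[1]\n",
"    return anp.array([x ** 2 + y, x - y ** 2])\n",
"\n",
"point = anp.array([1.0, 2.0])      # illustrative evaluation point\n",
"J = jacobian(f)(point)\n",
"\n",
"print(J)    # analytical Jacobian at (1, 2): [[2, 1], [1, -4]]\n"
]
},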
{
"cell_type": "markdown",
"id": "b60e0865-a594-4bad-8f5a-963066224f33",
"metadata": {},
"source": [
"### 4. Introduction to Fixed Point"
]
},
{
"cell_type": "markdown",
"id": "4edcbe5d-b46e-4b15-a7c8-06afa707961c",
"metadata": {},
"source": [
"#### Exercise 4.1 (Fixed Point Iteration - Numerical Approximation)\n",
"\n",
"*Task:* Implement a numerical method to find the fixed point of the function $f(x)=cos(x)$. \n",
"A fixed point is a value of x such that $f(x)=x$. Use an iterative approach starting from an \n",
"initial guess and observe the convergence behavior.\n",
"\n",
"Steps:\n",
"\n",
"1. Define the function $f(x)=cos(x)$.\n",
"2. Implement an iterative method that starts from an initial guess $x_0$ and repeatedly applies the function \n",
"$f$.\n",
"3. Set a convergence criterion: stop iterating when the change between iterations is smaller than a tolerance value\n",
" (e.g., $10^{-5}$).\n",
"4. Experiment with different initial guesses and analyze how the convergence is affected.\n",
" \n",
"*Python Tip:* Use a loop to perform the iterative process and check the difference between consecutive iterations to determine convergence."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d33d0b01-5ca9-47bf-af36-c1437c0ba72d",
"metadata": {},
"outputs": [],
"source": []
}
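,
{
"cell_type": "markdown",
"id": "ex41-sketch-note",
"metadata": {},
"source": [
"*Possible approach (sketch):* for Exercise 4.1, the helper function name below is a hypothetical choice for illustration; the tolerance, iteration cap, and starting points are illustrative, and the iteration should settle near 0.739, the fixed point of $\\cos$."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ex41-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"def fixed_point_iteration(f, x0, tol=1e-5, max_iter=1000):\n",
"    # repeatedly apply f until successive iterates differ by less than tol\n",
"    x = x0\n",
"    for i in range(max_iter):\n",
"        x_new = f(x)\n",
"        if abs(x_new - x) < tol:\n",
"            return x_new, i + 1\n",
"        x = x_new\n",
"    return x, max_iter\n",
"\n",
"for x0 in [0.0, 1.0, 5.0, -3.0]:    # a few different initial guesses\n",
"    fp, n_iter = fixed_point_iteration(np.cos, x0)\n",
"    print('x0 =', x0, '-> fixed point ~', fp, 'after', n_iter, 'iterations')\n"
]
}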
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.5"
}
},
"nbformat": 4,
"nbformat_minor": 5
}