Fix Small Typo #2

Open · wants to merge 2 commits into base: main
3 changes: 1 addition & 2 deletions .gitignore
@@ -1,4 +1,3 @@
-problems
 _build
-*solutions
+*solutions*
 *ai
7 changes: 7 additions & 0 deletions _update.sh
@@ -0,0 +1,7 @@
lecture_file=lecture_files/Lecture$1.md
git add $lecture_file
git commit -m "Adding lecture notes for Lecture $1"
git push
rm -rf _build
jupyter-book build .
ghp-import -n -p -f _build/html
2 changes: 1 addition & 1 deletion lecture_files/Lecture5.md
@@ -229,7 +229,7 @@ referred to as the [**Gibbs entropy**](https://en.wikipedia.org/wiki/Entropy_(st
 
 $$S = -k_B \sum_j p_j \ln p_j$$
 
-You will shows this expression's equivalence to the Boltzmann definition
+You will show this expression's equivalence to the Boltzmann definition
 of the entropy on [Problem Set 2](../../problems/ps_2/problem_set_2). We now have the expressions:
 
 $$\begin{aligned}
Binary file added lecture_files/figs/fig_17_1-01.png
Binary file added lecture_files/figs/fig_17_2-01.png
Binary file added lecture_files/figs/fig_17_3-01.png
Binary file added lecture_files/figs/fig_17_4-01.png
Binary file added lecture_files/figs/fig_17_5-01.png
Binary file added lecture_files/figs/fig_17_6-01.png
Binary file added lecture_files/figs/fig_17_7-01.png
Binary file added lecture_files/figs/fig_17_8-01.png
222 changes: 222 additions & 0 deletions problems/ps_1/problem_set_1.md
@@ -0,0 +1,222 @@
# Problem Set 1 (Due Wednesday, September 20, 2023)

## Question 1: Polymer dimensions

Ideal polymer chains are often described as undergoing *random walks* on
a lattice. Like ideal gases, the monomers in an ideal polymer chain do
not interact with each other; that is, the segments of the chain do not
exclude volume and can overlap. Consider a **one-dimensional** ideal
polymer chain composed of $N$ independent segments.

One end of the chain is placed at the origin. A single chain conformation can then be
generated by iteratively placing a single segment of the chain a
distance $b$ in either the positive or negative $x$ dimension from the
current chain end; *i.e.*, the chain elongates by taking "steps" along
the one-dimensional coordinate. The end-to-end distance of the chain,
$x$, is the distance between the origin and the end of the last segment
placed.

![image](pset_1_random_walk_fig.png)
Figure 1 shows an example for $N = 8$, in which the end-to-end distance is measured after all 8 steps are taken.
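
As a concrete illustration of this stepping procedure, here is a minimal sketch in Python (the variable names and the choice $b = 1$ are illustrative, not part of the problem):

```python
# Minimal sketch: generate one conformation of a 1D ideal chain by taking
# N steps of size b in a random direction, as described above.
import random

N = 8      # number of segments
b = 1.0    # segment size

x = 0.0    # the chain end starts at the origin
for _ in range(N):
    x += random.choice((-b, b))   # equal likelihood of stepping +/- b

print(f"End-to-end distance after {N} steps: x = {x:+.1f}")
```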

```{admonition} **(a)**

For a one-dimensional ideal chain with $N$ segments and segment
size $b$, calculate the probability, $p(x)$, that the end-to-end
distance of the polymer is equal to $x$. Assume that there is an **equal
likelihood** of taking a step in either the positive or negative
direction for each chain segment.

<details>
<summary>Hints</summary>
First compute the probability that we take n steps to the right.
</details>

```
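
One way to organize the counting (a sketch, not a full solution): if $n$ of the $N$ steps are taken in the positive direction, then

$$x = (2n - N)b, \qquad p(x) = \binom{N}{n} \left(\frac{1}{2}\right)^N,$$

since there are $\binom{N}{n}$ step sequences with exactly $n$ positive steps out of $2^N$ equally likely sequences.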


```{admonition} **(b)**

For the chain described in part **a**, using Stirling's
approximation, show that in the large
$N$ limit the probability distribution $p(x)$ can be approximated by:

$$p(x) = C \exp \left ( -\frac{x^2}{2Nb^2}\right )$$

where $C=\frac{1}{\sqrt{2\pi Nb^2}}$ is a normalization constant
to enforce that $\int_{-\infty}^{\infty} p(x) dx = 1$. Make sure
to show how you obtain $C$.

<details>
<summary>Hints</summary>
Define the quantity a = x/Nb, where a << 1 in the large N limit,
and write a Taylor series expansion for ln(1-a) when appropriate.

</details>

```
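
A quick numerical check of this limit (a sketch, assuming $b = 1$; not required for the problem). Note that allowed $x$ values are spaced $2b$ apart, so each discrete binomial mass corresponds to $2b \, p(x)$ in the continuous approximation:

```python
# Compare the exact binomial distribution to the Gaussian approximation.
import math

N, b = 100, 1.0
C = 1.0 / math.sqrt(2 * math.pi * N * b**2)

for n_right in (50, 55, 60, 70):           # number of steps taken to the right
    x = (2 * n_right - N) * b              # resulting end-to-end distance
    exact = math.comb(N, n_right) / 2**N   # exact binomial probability mass
    approx = 2 * b * C * math.exp(-x**2 / (2 * N * b**2))
    print(f"x = {x:+6.1f}   exact = {exact:.3e}   Gaussian = {approx:.3e}")
```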


```{admonition} **(c)**

Show that the entropy of an ideal one-dimensional chain in the
large $N$ limit is given by:

$$S(N, x) = -\frac{k_B x^2}{2Nb^2} + S(N) + k_B \ln C$$

where $C$ is our normalizing constant from above, and $S(N)$ is the $x$-independent entropy term.

```
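
A sketch of where the structure of this expression comes from: since every conformation is equally likely, $p(x) = \Omega(N, x)/\Omega_{\textrm{tot}}(N)$, so the Boltzmann entropy splits as

$$S(N, x) = k_B \ln \Omega(N, x) = k_B \ln p(x) + k_B \ln \Omega_{\textrm{tot}}(N),$$

and substituting the Gaussian form of $p(x)$ from part **b** produces the three terms above, with $S(N) = k_B \ln \Omega_{\textrm{tot}}(N)$.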

## Question 2: Magnetization of a paramagnet

Consider a system of $N$ distinguishable, non-interacting atoms with
magnetic dipole moments (spins) in a magnetic field $H$ at constant
temperature. Each spin $s_i$ has a magnetic moment $\mu$ and can be in
one of two states: parallel to the field ($s_i = 1$) or anti-parallel to
the field ($s_i = -1$). The energy of each microstate is due only to
interactions between the spins and the magnetic field, and is given by:

$$E = -\sum_{i=1}^N s_i \mu H$$

The magnetization of the material is defined as:

$$\begin{aligned}
M = \sum_{i=1}^N s_i \mu = -\frac{E}{H}
\end{aligned}$$

This model is commonly used to describe paramagnetic materials. In this
problem, we will derive an expression for the ensemble-average
magnetization of a paramagnet in a magnetic field.


```{admonition} **(a)**

Assuming that the paramagnet has a fixed energy $E$, write an
expression for the entropy as a function of the number of spins aligned
with the field, $N_+$, and the total number of spins, $N$.


<details>
<summary>Hints</summary>
Use the [Boltzmann formulation](https://en.m.wikipedia.org/wiki/Boltzmann's_entropy_formula) for entropy.

</details>
```
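
For orientation (a sketch of the setup): with $N_+$ spins aligned and $N_- = N - N_+$ anti-aligned, the fixed energy is $E = -(N_+ - N_-)\mu H = -(2N_+ - N)\mu H$, and the number of microstates at that energy is the number of ways to choose which $N_+$ of the $N$ distinguishable spins are aligned:

$$\Omega(N_+, N) = \binom{N}{N_+}$$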

```{admonition} **(b)**

Show that the expression for $N_+$ as a function of the number of
spins $N$, the magnetic field strength $H$, the magnetic moment $\mu$,
and the temperature $T$ (or as a function of $\beta \equiv 1/k_B T$) is


$$N_+ = \frac{N}{1+\exp(-2H\beta\mu)}$$


<details>
<summary>Hints</summary>
Procedurally, this is almost identical to our previous lecture.

</details>
```
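
A brute-force sanity check of this expression (a sketch, not part of the problem): for a small $N$, enumerate all $2^N$ microstates, weight each by its Boltzmann factor, and compare the average number of aligned spins to the formula. The parameter values below are arbitrary:

```python
# Enumerate all 2^N microstates of a small paramagnet and compute <N_+>.
import math
from itertools import product

N, mu, H, beta = 10, 1.0, 0.7, 1.3   # arbitrary test values

Z = 0.0
n_plus_avg = 0.0
for spins in product((1, -1), repeat=N):
    E = -sum(spins) * mu * H          # microstate energy
    w = math.exp(-beta * E)           # Boltzmann weight
    Z += w
    n_plus_avg += w * spins.count(1)  # aligned spins in this microstate
n_plus_avg /= Z

formula = N / (1 + math.exp(-2 * H * beta * mu))
print(f"enumeration: {n_plus_avg:.6f}   formula: {formula:.6f}")
```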



```{admonition} **(c)**
Show that the magnetization of a paramagnet is given by:

$$\begin{aligned}
M = N\mu \tanh (\beta \mu H)
\end{aligned}$$

```
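
One route (a sketch): counting aligned and anti-aligned spins in the definition of $M$ gives

$$M = \mu(N_+ - N_-) = \mu(2N_+ - N),$$

and substituting $N_+$ from part **b** and simplifying the resulting ratio of exponentials yields the $\tanh$ form.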

## Question 3: Mixing entropy

Consider two different ideal fluids containing $N_1$ and $N_2$
molecules, respectively, for a total of $N$ molecules. All molecules
exclude the same molar volume and are assumed to interact weakly with
each other and with themselves, such that the potential energy of the
system is negligible. The two fluids are initially completely demixed
due to an impermeable partition; the partition is then removed and the
fluids are allowed to mix at constant volume as illustrated in Figure 2.
Assume also that the molecules of each fluid are **indistinguishable**
from the molecules of the same fluid.

![image](pset_1_mixing_entropy_fig.png)

```{admonition} **(a)**

Assume that the two fluids occupy a fictitious lattice that
spans the available volume. Each molecule occupies a single lattice site
and all lattice sites are occupied; thus, there are $N_1$ lattice
sites occupied by molecule 1 and $N_2$ lattice sites occupied by
molecule 2. Using this approximation, calculate $\Omega(N_1, N_2)$,
which is defined as the number of microstates for the mixture of fluids.


<details>
<summary>Hints</summary>
Identical arrangements of the fluids are indistinguishable.

</details>

```
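
As a sanity check on the counting: since permuting molecules of the same fluid does not produce a new microstate, the count should take the binomial form

$$\Omega(N_1, N_2) = \frac{N!}{N_1! \, N_2!}$$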

```{admonition} **(b)**

Derive an expression for the entropy change associated with
mixing the two ideal fluids in terms of the mole fractions,
$x_1 = N_1/N$ and $x_2 = N_2/N$, of the two components. That is,
determine an expression for:

$$\Delta S_{\textrm{mix}} = S_{\textrm{mixed}} - S_{\textrm{demixed}}$$

<details>
<summary>Hints</summary>
How many distinguishable configurations are there for totally demixed systems?

</details>

```
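
For reference, applying Stirling's approximation to the count from part **a** (with $S_{\textrm{demixed}} = 0$, since the fully demixed system has a single distinguishable configuration) should reduce your answer to the familiar form

$$\Delta S_{\textrm{mix}} = -N k_B (x_1 \ln x_1 + x_2 \ln x_2)$$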

```{admonition} **(c)**

Is the lattice model assumption reasonable for molecules
occupying a continuous set of positions (rather than discrete points)?
Why or why not?

```

## Question 4: Stirling's approximation


```{admonition} Python Exercise

Write a Python program that calculates the percent error of Stirling's
approximation as a function of $N$, where Stirling's approximation is
defined as:

$$\ln N! \approx N \ln N - N$$

Include with your solution a copy of your Python code (including
comments as necessary), a plot of $\ln N!$ and Stirling's approximation
as a function of $N$ up to $N=100$, and a plot of the error of
Stirling's approximation as a function of $N$ up to $N=100$. Your grade
for this problem will be based in part on the readability of your code
and plots in addition to the accuracy of the solution.
```

**Note: this problem is intended to provide you with practice in Python
programming prior to the assignment of the simulation project. If you
need resources for learning to code in Python, see [here](https://sts.doit.wisc.edu/).**
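
A minimal sketch of one possible approach (assuming NumPy and matplotlib are available; `math.lgamma` computes $\ln \Gamma(N+1) = \ln N!$ exactly without overflow; plotting choices are up to you):

```python
# Sketch: percent error of Stirling's approximation, ln N! ~ N ln N - N.
import math
import numpy as np
import matplotlib.pyplot as plt

N = np.arange(2, 101)                                 # start at 2 so ln N! > 0
ln_fact = np.array([math.lgamma(n + 1) for n in N])   # exact ln N!
stirling = N * np.log(N) - N                          # Stirling's approximation
percent_error = 100 * (ln_fact - stirling) / ln_fact  # relative error in percent

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(N, ln_fact, label=r"$\ln N!$")
ax1.plot(N, stirling, "--", label=r"$N \ln N - N$")
ax1.set_xlabel(r"$N$")
ax1.legend()
ax2.plot(N, percent_error)
ax2.set_xlabel(r"$N$")
ax2.set_ylabel("percent error")
fig.tight_layout()
plt.show()
```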

The two requested plots are provided below.

![Comparison of Stirling's approximation vs. $\ln N!$ as a function of
$N$.](pset_1_plot_stirling.png)

![Relative error of Stirling's approximation vs. $N$, which approaches
zero as $N$ exceeds
$\approx 50$.](pset_1_plot_stirling_error.png)
Binary file added problems/ps_1/pset_1_mixing_entropy_fig.png
Binary file added problems/ps_1/pset_1_plot_stirling.png
Binary file added problems/ps_1/pset_1_plot_stirling_error.png
Binary file added problems/ps_1/pset_1_random_walk_fig.png