KMC and MD Overview
- 16:49: A Poisson distribution is typically used to calculate the probability of observing a given number of each species.
- 16:49: The Poisson distribution is parameterized by the average number of each species.
- 17:53: When beginning a new KMC trajectory, sample from the Poisson distribution to obtain a new initial condition for the counts of the various species.
- 17:53: This way, you sample over both the initial conditions and the events that follow.
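A minimal sketch of this initialization step; the species names and mean counts below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical mean (expected) counts for each species in the system.
mean_counts = {"A": 50.0, "B": 12.5, "C": 0.8}

def sample_initial_counts(means, rng):
    """Draw an independent Poisson sample for each species' initial count."""
    return {species: rng.poisson(lam) for species, lam in means.items()}

# Each new KMC trajectory starts from a fresh draw, so the ensemble of
# trajectories samples over initial conditions as well as over the
# subsequent stochastic events.
initial_state = sample_initial_counts(mean_counts, rng)
```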
KMC Optimization - Handling Time Scales
- 18:14: KMC problems with large numbers of states are computationally demanding, so techniques are needed to speed them up.
- 18:14: One approach is to identify the fast processes, such as diffusion, and treat them specially.
- 18:14: If fast processes (e.g., diffusion) are not essential to the desired product (e.g., reaction), they can be treated differently.
- 18:14: Alternatives include assuming infinitely rapid diffusion or slowing down diffusion artificially to match slower processes.
- 18:14: Slowing down rapid processes artificially can significantly speed up calculations.
- 21:55: Simulating rapid diffusion at its true timescale makes trajectories expensive and restricts the sampling that can be achieved.
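As a rough illustration of the rescaling idea, here is a minimal Gillespie-style KMC step with a hypothetical fast diffusion rate slowed down artificially. The rate constants and the slowdown factor are made up; in practice the validity of any rescaling should be checked by confirming that results are insensitive to the factor chosen:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical rate constants (per unit time): diffusion is ~1e6 times
# faster than reaction, so it would dominate the event list and force
# tiny time steps.
rates = {"diffusion": 1.0e6, "reaction": 1.0}

# Artificially slow the fast process. This is only justified if diffusion
# remains fast enough, relative to reaction, that the quantities of
# interest are unchanged.
SLOWDOWN = 1.0e-3
effective = dict(rates)
effective["diffusion"] *= SLOWDOWN

def gillespie_step(rates, rng):
    """One step of the Gillespie direct method: draw a waiting time and an event."""
    total = sum(rates.values())
    dt = rng.exponential(1.0 / total)  # exponentially distributed waiting time
    names = list(rates)
    probs = np.array([rates[n] for n in names]) / total
    event = names[rng.choice(len(names), p=probs)]
    return dt, event

dt, event = gillespie_step(effective, rng)
```

With the slowdown applied, reaction events are selected far more often per diffusion event, so fewer total steps are spent resolving diffusion.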
KMC Optimization - Handling Low Probability Events and Sampling Error
- 20:17: Extremely low probability events may never be sampled.
- 20:17: If a process occurs on a much slower timescale than the simulation can achieve, it may be excluded from the calculation.
- 20:17: Eliminating very slow processes reduces the number of states that must be tracked.
- 20:17: Sufficient sampling requires knowledge of the margin of error, which is greater for rare events.
- 21:55: A fast reaction will accumulate adequate sampling statistics, but a slow reaction may not occur often enough to yield reliable conclusions.
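The 1/sqrt(N) scaling of the relative error can be seen in a small numerical sketch; the event probabilities here are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def estimate_with_error(p_true, n_trials, rng):
    """Estimate an event probability from counts; return (estimate, relative error)."""
    counts = rng.binomial(n_trials, p_true)
    p_hat = counts / n_trials
    # Binomial standard error of the estimated probability.
    std_err = np.sqrt(p_hat * (1.0 - p_hat) / n_trials)
    rel_err = std_err / p_hat if counts > 0 else float("inf")
    return p_hat, rel_err

n = 100_000
common = estimate_with_error(0.3, n, rng)   # frequent event: small relative error
rare = estimate_with_error(1e-4, n, rng)    # rare event: large relative error,
                                            # or never sampled at all
```

The rare event is observed only a handful of times in the same number of trials, so its relative error is orders of magnitude larger than that of the common event.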
Molecular Dynamics
- 22:28: Molecular Dynamics (MD) solves equations of motion for atoms or clusters, usually applying Newton's equations.
- 22:28: Interactions are described by force fields, which are fit to experimental and/or quantum chemistry results.
- 22:28: MD is usually classical, but quantum mechanical effects can be incorporated.
MD Algorithms and Techniques
- 23:43: The velocity Verlet algorithm is popular because it conserves energy well over many steps.
- 23:43: Thermostats simulate contact with a thermal bath by periodically adjusting velocities.
- 23:43: Thermostats are useful in chemical kinetics simulations for observing rare, high-energy events.
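A minimal velocity Verlet integrator for a 1-D harmonic oscillator (with m = k = 1, chosen because the exact energy is known) illustrates the long-time energy behavior; this is a sketch, not a production MD code:

```python
import numpy as np

def force(x):
    return -x  # F = -kx with k = 1

def velocity_verlet(x, v, dt, n_steps):
    """Integrate one particle with velocity Verlet; return the (x, v) trajectory."""
    f = force(x)
    traj = []
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * f * dt**2   # position update from current force
        f_new = force(x)
        v = v + 0.5 * (f + f_new) * dt     # velocity update averages old and new force
        f = f_new
        traj.append((x, v))
    return traj

traj = velocity_verlet(x=1.0, v=0.0, dt=0.01, n_steps=10_000)
energies = [0.5 * v**2 + 0.5 * x**2 for x, v in traj]
drift = max(energies) - min(energies)  # stays tiny over many oscillation periods
```

A simple thermostat could be layered on top of such an integrator by periodically rescaling the velocities toward a target temperature; that step is omitted here.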
MD vs. Metropolis Monte Carlo
- 25:00: MD may serve as an alternative to Metropolis Monte Carlo for computing multi-dimensional integrals.
- 25:00: MD's timescale is determined by actual physical movements, eliminating the need for a step size parameter.
- 25:00: MD is time-accurate, requiring extremely small delta t.
- 25:00: If the objective is a steady-state property, Metropolis Monte Carlo may be preferable.
MD Use Cases and Limitations (Time Scales)
- 26:50: MD is well-suited for calculating time-dependent properties on picosecond to nanosecond timescales.
- 26:50: Examples include energy transfer mechanisms on picosecond timescales.
- 26:50: A very small delta t restricts overall simulation time, often to nanoseconds.
- 26:50: MD is not feasible for processes that relax to equilibrium on longer timescales.
- 26:50: MD excels at sampling dynamics within a given conformation at nanosecond timescales.
MD Initial Conditions
- 28:30: MD faces an initial-condition problem: the atoms must be placed somewhere to begin with.
- 28:30: Sampling over varying molecular arrangements is worthwhile.
- 28:30: MD cannot track slowly evolving conformational changes alone.
- 28:30: A separate sampling technique may be needed to initialize the system.
Comparing Methods (MC, KMC, MD)
- 29:16: Metropolis Monte Carlo, Kinetic Monte Carlo, and Molecular Dynamics are tools for different problems.
- 29:16: No single method is best suited for every problem; each covers a different regime.
Selecting a Tool
- 30:00: It is important to select the tool that suits the particular problem you are trying to solve.
Q&A
- 30:20: The rest of the class time is designated for questions.
Choosing P and F in Monte Carlo, Marginal Integrals
- 30:36: A question is raised about how to choose the probability distribution P, or the weighting factor W, when evaluating an integral of G(x).
- 30:36: The approach is to rewrite the integrand G(x) as P(x) * F(x).
- 30:36: Using a uniform distribution for P is possible but inefficient as it samples low-probability regions.
- 30:36: The goal is to find a P*F factorization that works best for sampling.
- 30:36: "Working best" means making F as constant as possible and P as sharply peaked as possible.
- 30:36: Balancing a sharp P with a flat F is a trade-off.
- 30:36: Default options include using the Boltzmann factor in statistical mechanics problems or the given weighting factor W in Bayesian analysis.
- 30:36: You typically have a joint probability density or weighting factor W(theta1, theta2, ...), but only need to consider a subset of the parameters.
- 30:36: You can perform a "marginal integral" by summing out the degrees of freedom you don't care about.
- 30:36: Marginalizing out degrees of freedom is a convenient trick applicable to both Bayesian and Boltzmann problems.
- 30:36: What you do specifically depends on what system properties you are interested in.
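As a concrete sketch of the P*F factorization, consider estimating I = ∫ exp(-x²/2) x² dx. The integrand is chosen only because the exact answer is known (I = sqrt(2π), since E[x²] = 1 under a standard normal); P is the standard normal density, which is easy to sample, and F(x) = sqrt(2π) x² is the remaining factor:

```python
import numpy as np

rng = np.random.default_rng(3)

# Factor the integrand as P(x) * F(x): P = standard normal density,
# F(x) = sqrt(2*pi) * x^2. The estimate is the sample mean of F over
# draws from P.
n = 200_000
samples = rng.standard_normal(n)
F = np.sqrt(2.0 * np.pi) * samples**2
estimate = F.mean()
std_err = F.std(ddof=1) / np.sqrt(n)

exact = np.sqrt(2.0 * np.pi)  # since E[x^2] = 1 under N(0, 1)
```

A sharper P concentrated where x² is large would flatten F further and reduce the variance, which is the trade-off described above.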
Course Logistics
- 34:00: Final review session schedule discussion (Wednesday evening or Friday morning vote).
- 34:00: There are times scheduled today for review sessions, one of which is on Kinetic Monte Carlo.
- 34:00: The solution to the previously graded homework will be made available soon.
Closing Remarks
34:45: Best of luck with studying and the exam.