Training Neural Networks with Genetic Algorithms

In this blog post I present the findings of an independent analytical and computational study of using genetic algorithms to train neural networks. It was my final project for an Introduction to Cognitive Science course that I took at The University of Texas at Austin under Dr. David Beaver. My motivation comes from the fact …

Continue reading Training Neural Networks with Genetic Algorithms

[Figure: Lyapunov exponent of the logistic map]

Calculating the Lyapunov Exponent of a Time Series (with python code)

(In a later post I discuss a cleaner way to calculate the Lyapunov exponent for maps, and for the logistic map in particular, along with Mathematica code.) I found this method during my Master's, while recreating the results of an interesting paper on how some standard tests for chaos fail to distinguish chaos from stochasticity (Stochastic neural network …

Continue reading Calculating the Lyapunov Exponent of a Time Series (with python code)
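The post's method targets a general time series, but for the logistic map itself the exponent can be estimated directly by averaging log|f′(x)| along an orbit. A minimal Python sketch of that idea (the function name and parameter choices below are mine, not from the post):

```python
import math

def logistic_lyapunov(r, n_iter=100_000, n_discard=1_000, x0=0.1):
    """Estimate the Lyapunov exponent of the logistic map
    x -> r*x*(1-x) by averaging log|f'(x)| = log|r*(1 - 2x)|
    along a single orbit, after discarding a transient."""
    x = x0
    for _ in range(n_discard):   # let the orbit settle onto the attractor
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n_iter
```

At r = 4 the map is fully chaotic and the exact exponent is ln 2 ≈ 0.693, so the estimate should land near that; inside a periodic window (e.g. r = 3.2) the estimate comes out negative.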

[Figure: Target distribution in Gnuplot]

R code for multivariate random-walk Metropolis sampling

I couldn't find a simple R implementation of random-walk Metropolis sampling (the symmetric-proposal special case of Metropolis–Hastings sampling) from a multivariate target distribution in arbitrary dimensions, so I wrote one. This is also my first piece of R code. It requires the package MASS in order to sample from the multivariate normal proposal distribution using the mvrnorm function. …

Continue reading R code for multivariate random-walk Metropolis sampling
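The algorithm the post describes is easy to sketch outside R as well. Here is a hedged Python version, with NumPy's multivariate normal playing the role of MASS's mvrnorm; the function name rw_metropolis and the 2-D Gaussian example are my own, not the post's code:

```python
import numpy as np

def rw_metropolis(log_target, x0, cov, n_samples, rng=None):
    """Random-walk Metropolis: propose x' ~ N(x, cov) and accept with
    probability min(1, target(x')/target(x)); the proposal is symmetric,
    so the Hastings correction drops out."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    logp = log_target(x)
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        prop = rng.multivariate_normal(x, cov)
        logp_prop = log_target(prop)
        if np.log(rng.random()) < logp_prop - logp:   # accept/reject in log space
            x, logp = prop, logp_prop
        samples[i] = x                                 # on rejection, repeat x
    return samples

# Example: sample a 2-D standard normal (log density up to a constant)
samples = rw_metropolis(lambda x: -0.5 * x @ x, np.zeros(2),
                        0.5 * np.eye(2), 20_000)
```

Note that a rejected proposal still records the current point; dropping rejected iterations would bias the sample.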

Partners meet halfway: a simple correlation study of an undergrad lab class

Last semester I taught two sections of an Introductory Physics (Electrodynamics) for Engineers lab course at the University of Texas at Austin. On the first day, most of my students did not know each other. They came in and sat at tables of two, and this pretty much became their permanent …

Continue reading Partners meet halfway: a simple correlation study of an undergrad lab class
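The correlation a study like this rests on is the plain Pearson coefficient between partners' scores. A minimal sketch (the score lists below are made up for illustration and are not data from the study):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical lab scores for one partner and the other
a = [78, 85, 62, 90, 70]
b = [74, 88, 65, 85, 72]
```

A value near +1 would mean partners' scores rise and fall together, which is the kind of effect the post's title hints at.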