- **Introduction into GR (no math)**
- **Differential calculus**
- **Index notation**
- **Special theory of relativity**
- **Some Riemannian geometry**

1. **International Winter School on Gravity and Light 2015 (Basic/Medium Difficulty)**

This is a collection of 28 video lectures by Frederic P. Schuller from the WE-Heraeus International Winter School. It's a thorough introduction to the basic prerequisites of GR. The topics covered include topology, tangent spaces, fields, parallel transport, metric manifolds, integrals on manifolds, optical geometry, cosmology, gravitational waves and others. The tutorial problems and videos are available **here**.

2. **General Relativity by Leonard Susskind (2012) (Medium Difficulty)**

This highly recommended course on General Relativity by Leonard Susskind includes 10 lectures. The subjects covered include the equivalence principle, curved geometry, space-time geometry, an introduction to tensors and all the other good stuff you need for the basics. There's also a similar course by Susskind from **2009**, but I personally found the newer course more informative and easier to understand.

3. **Einstein’s General Relativity and Gravitation (Medium Difficulty)**

Here we have a course by Prof Herbert W. Hamber on the basics of GR. It contains 24 lectures that go deep into the field equations, their derivations and possible solutions. Three introductory lectures on special relativity are included as well.

4. **Introduction to General Theory of Relativity — Coursera (Advanced)**

This course from Coursera offers a thorough introduction to the main principles of general relativity. The syllabus includes general covariance, the Einstein-Hilbert action, the Schwarzschild solution, Penrose diagrams, tests of GR, the Kerr solution, gravitational waves and cosmology. Like other Coursera courses, it includes both weekly video lectures and assignments. In addition, students get online access to a discussion forum.

5. **Relativity — Perimeter Institute (Advanced)**

In this series of lectures Prof Michael Duff goes deep into the mathematical framework of general relativity. More advanced topics, such as modifications of gravity, scalar field theories, Kaluza-Klein theory, quantum gravity and other areas are discussed. The link above gives access to the lectures in different video formats as well as in written form.

**Computational Physics by Mark Newman**

This in-depth introduction to the field of computational physics explains the fundamental techniques that every physicist should know. Techniques such as finite difference methods, numerical quadrature, and the fast Fourier transform are of great importance in nearly every branch of physics. Computational Physics by Newman gives a detailed introduction to these techniques in Python along with clear examples. The text starts with an in-depth introduction to the basic principles of Python and then heads on into various numerical methods used for solving differential equations. In addition, Fourier transforms and Markov chain Monte Carlo processes are explored as well.
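To give a flavour of the techniques Newman covers, here is a minimal sketch of numerical quadrature in Python — in the spirit of, but not taken from, the book — applying the trapezoidal rule to a simple integral:

```python
import numpy as np

def trapezoid(f, a, b, n=1000):
    """Approximate the integral of f over [a, b] with the trapezoidal rule."""
    x = np.linspace(a, b, n + 1)   # n subintervals -> n + 1 sample points
    y = f(x)
    h = (b - a) / n                # width of each subinterval
    return h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

# Integrate sin(x) from 0 to pi; the exact answer is 2.
result = trapezoid(np.sin, 0.0, np.pi)
```

The error of the trapezoidal rule shrinks quadratically with the step size, which is the kind of convergence analysis the book walks through in detail.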

**Computational Physics by Giordano and Nakanishi**

Computational Physics by Giordano and Nakanishi is another often-cited text that covers a wide range of topics in computational physics. The physical situations explored in the text include projectile motion, the movement of the planets in the Solar System, pendulum motion and chaos, problems in statistical physics and others. Additional interdisciplinary topics include neural networks and the brain, real neurons and action potentials, and even cellular automata. The examples are written in BASIC, a language renowned for being easy to pick up.

**Computational Physics by Thijssen**

This entry is a more advanced take on the core topics in computational physics. The book covers many different topics such as Monte Carlo and molecular dynamics, various electronic structure methodologies, methods for solving partial differential equations, and lattice gauge theory. Newly added topics in this updated edition include finite element methods and lattice Boltzmann simulation, density functional theory, quantum molecular dynamics and diagonalisation of one-dimensional quantum systems. Recommended for those with a solid foundation in physics and programming.

**A First Course in Computational Physics by DeVries and Hasbun**

Intended for physics and engineering students who have completed introductory physics courses, this text covers different types of computational problems using MATLAB. Topics such as root finding, Newton-Cotes integration, and ordinary differential equations are included and presented in the context of physics problems. A decent understanding of MATLAB programming is required.

**Introduction to Numerical Programming by Beu**

This often-cited text aims to make numerical programming more accessible to a wider audience of scientists and engineers. Through practical examples in Python and C/C++, a wide variety of relevant topics are introduced, including function evaluation, solving algebraic and transcendental equations, systems of linear algebraic equations, ordinary differential equations, and eigenvalue problems. Furthermore, Markov chain Monte Carlo methods are employed to solve a variety of physics problems. The text requires only a basic understanding of programming in Python or C/C++, as an introductory section covers the needed topics.


**What? How? When? Why?**

**Machine learning** is a highly interdisciplinary subfield of computer science, closely related to artificial intelligence, that deals with computational learning and pattern recognition. To put it simply, it's all about enabling computer software to learn without being explicitly programmed. Mathematically, this corresponds to taking in data and discovering a function that best relates the inputs to the outputs via some algorithm. In terms of programming, it corresponds to training an algorithm on available data so that it can respond correctly to unseen data.
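As a toy illustration of "discovering a function that relates inputs to outputs", here is a sketch of my own (the data is made up, and the "learning" is just a least-squares line fit with NumPy):

```python
import numpy as np

# Made-up training data: inputs x and noisy outputs y generated from y = 2x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, size=x.size)

# "Learning" here means finding the line that best relates inputs to outputs.
slope, intercept = np.polyfit(x, y, 1)

# The learned function can now respond to inputs it has never seen.
def predict(x_new):
    return slope * x_new + intercept
```

The fitted slope and intercept recover the underlying rule from the data alone, which is the whole point: nobody hard-coded "multiply by 2 and add 1" into the program.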

**Applications**

Given the inherent links between machine learning, data science, statistics and AI, it's no wonder that the applications are countless. Today machine learning is used in almost all parts of modern life, from Netflix suggestions and self-driving Google cars to neuroscience and astrophysics research, just to name a few. This is essentially because any field that produces or uses large quantities of data needs clever ways of dealing with it.

**Most Common Techniques**

Very roughly, most machine learning techniques can be split into **supervised** and **unsupervised** learning. In supervised learning, our data has labels and we want the program to find a relation between the variables. A simple example of supervised learning would be determining the relation between the price of a house and its size. Once the data is collected and the algorithm is run, we can use the learned hypothesis function to predict the price of a previously unseen house from its size. In unsupervised learning, on the other hand, we feed a bunch of unlabeled data to our algorithm, which then aims to find clusters and patterns in the data in order to classify it. A simple example would be grouping customers by buying behavior, or grouping galaxies into galaxy clusters.
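Both flavours can be sketched in a few lines; the house sizes, prices and spending figures below are invented for illustration:

```python
import numpy as np

# --- Supervised: house price vs. size (made-up labeled data) ---
sizes  = np.array([50, 70, 90, 110, 130], dtype=float)     # m^2
prices = np.array([150, 200, 255, 300, 360], dtype=float)  # thousands of $

slope, intercept = np.polyfit(sizes, prices, 1)  # learned hypothesis function
predicted = slope * 100 + intercept              # price estimate for an unseen 100 m^2 house

# --- Unsupervised: group 1-D "spending" data into two clusters (toy k-means) ---
spend = np.array([1.0, 1.2, 0.9, 8.0, 8.5, 7.9])
centroids = np.array([spend.min(), spend.max()])
for _ in range(10):
    # Assign each point to its nearest centroid, then move centroids to cluster means.
    labels = np.abs(spend[:, None] - centroids[None, :]).argmin(axis=1)
    centroids = np.array([spend[labels == k].mean() for k in (0, 1)])
```

Note the difference: the supervised step needed the prices (labels), while the clustering step discovered the two spending groups from the raw numbers alone.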

**Commonly Used Algorithms**

So, just to give you an overall idea of what can be accomplished with machine learning techniques, here are some commonly used algorithms.

1. Artificial Neural Networks

Artificial neural networks are algorithms that mimic biological neural networks, such as the central nervous system and the brain. An artificial neural network (ANN) is essentially a multi-layered collection of *nodes* connected to each other via pre-defined rules. The first layer corresponds to the input data, whereas the inner, so-called hidden layers transform the input all the way to the output layer, which gives the prediction. Typical tasks achievable with ANNs are handwriting recognition, regression analysis, automated driving, robotics and, of course, computational neuroscience.
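The layered picture described above can be sketched in a few lines of NumPy; the weights here are random, so this network is untrained and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

def layer(x, w, b):
    """One layer: a linear combination of inputs followed by a nonlinearity."""
    return np.tanh(w @ x + b)

# A tiny 3-input, 4-hidden, 2-output network with random (untrained) weights.
w1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
w2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

x = np.array([0.5, -0.1, 0.8])   # input layer: the data
hidden = layer(x, w1, b1)        # hidden layer transforms the input
output = layer(hidden, w2, b2)   # output layer: the prediction
```

Training would consist of adjusting `w1, b1, w2, b2` so that the outputs match known labels — typically by gradient descent on a loss function — but the forward pass above is the essential structure.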

2. Decision Trees

A decision tree, which shares some similarities with ANNs, is basically a tree-like model that maps out possible outcomes. Decision trees are among the easiest models to understand intuitively and can be combined with other decision techniques, which makes them one of the more popular algorithms.
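At its simplest, a decision tree is just a set of nested conditions; this hand-written toy tree (the scenario and thresholds are invented for illustration) decides whether to play tennis:

```python
def will_play_tennis(outlook, humidity, wind):
    """A hand-written decision tree: each branch maps a condition to an outcome."""
    if outlook == "sunny":
        return humidity <= 70   # sunny: play only if it is not too humid
    elif outlook == "overcast":
        return True             # overcast: always play
    else:                       # rainy
        return wind < 20        # rainy: play only if the wind is mild
```

In practice the splits are not written by hand: tree-learning algorithms choose the variable and threshold at each node automatically from labeled data, but the resulting model reads exactly like the function above — which is why trees are so easy to interpret.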

3. Genetic Algorithms

In the 1950s, Alan Turing famously proposed a learning machine that would parallel the principles of evolution. The computational realization of this idea is the genetic algorithm. Genetic algorithms solve optimization problems by mimicking the process of natural selection, employing biologically inspired operators such as mutation, crossover and selection. For a good introduction, check out this **article**.
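A minimal genetic algorithm fits in a few lines of pure Python. This illustrative sketch maximizes a simple function, using averaging of two parents as crossover and Gaussian noise as mutation:

```python
import random

random.seed(1)

def fitness(x):
    return -(x - 3.0) ** 2   # maximal at x = 3

# Start from a random population of candidate solutions.
pop = [random.uniform(-10, 10) for _ in range(30)]

for generation in range(100):
    # Selection: keep the fittest half of the population.
    pop.sort(key=fitness, reverse=True)
    parents = pop[:15]
    # Crossover + mutation: children average two parents, plus a little noise.
    children = [
        (random.choice(parents) + random.choice(parents)) / 2 + random.gauss(0, 0.1)
        for _ in range(15)
    ]
    pop = parents + children

best = max(pop, key=fitness)   # converges near x = 3
```

Real applications use the same loop but encode candidate solutions more richly (bit strings, parameter vectors, even program trees) and tune the selection pressure and mutation rate to the problem.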

**Online Courses on Machine Learning**

Ok, so it all sounds cool, but how does one actually learn these algorithms? Here are some great free online resources to get you started. The courses are listed roughly in terms of increasing complexity.

**The basics:**

