Consolidation
Introduction
We have introduced a lot of material in the first six weeks. This week, we will review that content and reinforce some of the skills you have learned during the class.
Syntax Summary
Nothing new this week
Worksheet
Let's start by reviewing the course syllabus (available in the module catalogue).
Aims of the module
- To give instruction on and provide practice in the use of computers to develop basic data handling skills required by physical scientists (e.g. graphing and data analysis skills).
- To develop the basic numerical computing skills required by physical scientists.
- To develop proficiency in problem solving by working through a series of examples which relate to the experimental and taught physics modules.
Learning outcomes
The student will be able to:
- Enter or read in data from files (in standard formats); plot and export graphs suitable for inclusion in a report.
- Perform statistical analysis of data of the form y=mx+c; best fit and error determinations.
- Use and manipulate simple mathematical functions in Python.
- Express mathematical integration and differentiation problems such that they may be solved numerically.
- Use numerical techniques with Python to solve a number of mathematical and physical problems.
Outline Syllabus
- Introduction: Data analysis and numerical techniques via computing with Python.
- Graphing skills: Reading in and entering x-y data, data plotting, fitting with straight line (including error determination).
- Random numbers: Generation. Application to the generation of distributions (including plotting histograms).
- Using mathematical functions: Use of basic mathematical functions (sine, exp etc.); introduction to more advanced functions.
- Simple numerical integration and differentiation: Numerical techniques and the importance of step size. These will be practiced (and tested) mainly by application to basic mathematical functions.
- Application of Numerical Methods: Approaches to solving physics problems, e.g. modelling the simple pendulum, projectile motion through a viscous medium, analysing real experimental data.
In the first six weeks, we have covered many of these topics. Indeed, it's only the numerical integration and differentiation that we have not yet looked at -- that's the subject of weeks 8 to 10. For this week, we have four questions that will use many of the skills that have been introduced so far in the class.
Something Fun - Cloud Computing!
So far, you have been executing code on your own laptop or PC, but as mentioned at the start of this module, cloud computing is becoming the standard across a wide range of major science collaborations, as well as in industry.
In preparation for the Year 2 and Year 3 data analysis modules, which will also be based on cloud computing, you are being introduced to the concept here.
Although you are not required to do so, you may try executing the code for this week's questions on the Google Colab servers. The image below demonstrates how easy it is to start a new Jupyter Notebook. Any code you create will be saved in your own space on Google Drive rather than on your laptop.
Note that Google Colab requires a Google Account. If you would rather not create one, that is perfectly fine: you can answer this week's work in the usual way.

Exercises
1. Earthquake data analysis [1¼ marks]
The file earthquakes.txt (only available on LC) contains details of all recorded earthquakes with a magnitude above 4 since 1973.
- Read in the data and extract the year and magnitude of each earthquake.
- Plot a histogram (on a log scale) of the number of earthquakes vs magnitude. Format the figure and save it.
- Looking at the histogram, by approximately what factor does the number of earthquakes decrease with a unit increase in magnitude?
- Does this relationship hold for the entire range of data? If not, do you have any ideas why not?
- How many earthquakes of magnitude 7 or more have occurred from 2000 onwards?
Note: There is a text header at the top of the file that you will need to look at to know what's in each column and then skip when you read in the data.
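The workflow above can be sketched as follows. Since the real earthquakes.txt file (and its column layout) is only available on LC, this sketch uses synthetic year and magnitude arrays as a stand-in; with the real file you would instead load the columns with something like np.loadtxt, using its skiprows parameter to skip the text header.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs anywhere
import matplotlib.pyplot as plt

# Synthetic stand-in for the real data: with the actual file you would
# check the text header for the column layout and do something like
#   data = np.loadtxt("earthquakes.txt", skiprows=<header lines>)
rng = np.random.default_rng(42)
years = rng.integers(1973, 2024, size=5000)
magnitudes = 4 + rng.exponential(scale=0.5, size=5000)

# Histogram of the number of earthquakes vs magnitude, log scale on y
fig, ax = plt.subplots()
ax.hist(magnitudes, bins=20, log=True)
ax.set_xlabel("Magnitude")
ax.set_ylabel("Number of earthquakes")
fig.savefig("earthquake_histogram.png")

# Counting with a boolean mask: magnitude 7 or more, from 2000 onwards
n_large_recent = np.sum((magnitudes >= 7) & (years >= 2000))
print(n_large_recent)
```

Combining boolean masks with `&` is the idiomatic NumPy way to answer the "how many since 2000" question without writing any loops.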
2. Thermal Expansion [1¼ marks]
The length of a metal rod is measured at a set of different temperatures. Here is the data:
| Temperature (°C) | Length (m) |
| --- | --- |
| 0  | 1.12347 |
| 5  | 1.12354 |
| 10 | 1.12364 |
| 15 | 1.12370 |
| 20 | 1.12372 |
| 25 | 1.12388 |
| 30 | 1.12394 |
| 35 | 1.12402 |
| 40 | 1.12410 |
- Plot the data
- Fit a line through the data and calculate the slope and error on the slope.
- Format the figure and save it.
- The thermal expansion coefficient is defined as αL = (1/L0)(dL/dT), where L0 is the length at 0 °C. Calculate the thermal expansion coefficient of the bar (with uncertainties).
- Given your results, what material do you conclude the bar is made of? There is a list of thermal expansion coefficients available here.
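One possible approach is sketched below, using the worksheet's data with np.polyfit: passing cov=True returns the covariance matrix of the fitted parameters, and the square root of its diagonal gives the uncertainty on the slope. Taking L0 as the fitted intercept (the length at 0 °C) is an assumption of this sketch.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

# Data from the worksheet table
T = np.array([0, 5, 10, 15, 20, 25, 30, 35, 40], dtype=float)  # degrees C
L = np.array([1.12347, 1.12354, 1.12364, 1.12370, 1.12372,
              1.12388, 1.12394, 1.12402, 1.12410])              # metres

# Straight-line fit; cov=True also returns the parameter covariance
# matrix, whose diagonal holds the variance of each fitted parameter
coeffs, cov = np.polyfit(T, L, 1, cov=True)
slope, intercept = coeffs
slope_err = np.sqrt(cov[0, 0])

# Thermal expansion coefficient alpha = (1/L0) dL/dT,
# taking L0 as the fitted length at 0 degrees C
L0 = intercept
alpha = slope / L0
alpha_err = slope_err / L0  # neglecting the (much smaller) error on L0

fig, ax = plt.subplots()
ax.plot(T, L, "o", label="data")
ax.plot(T, np.polyval(coeffs, T), "-", label="fit")
ax.set_xlabel("Temperature (°C)")
ax.set_ylabel("Length (m)")
ax.legend()
fig.savefig("thermal_expansion.png")

print(f"slope = {slope:.3e} +/- {slope_err:.3e} m/°C")
print(f"alpha = {alpha:.3e} +/- {alpha_err:.3e} per °C")
```

Comparing the resulting value of alpha against a table of thermal expansion coefficients lets you answer the final part of the question.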
3. Vectors & Taylor expansion [1¼ marks]
i. Given two vectors x = (1, 2, 3) and y = (5, -3, 7):
- Calculate the angle (in degrees) between them
- Calculate the unit normal to the plane defined by x and y
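Part i. can be sketched with NumPy's linear-algebra helpers: the angle follows from the dot-product definition cos(θ) = x·y / (|x||y|), and the unit normal is the normalised cross product.

```python
import numpy as np

x = np.array([1, 2, 3])
y = np.array([5, -3, 7])

# Angle between the vectors from cos(theta) = x.y / (|x||y|)
cos_theta = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
theta_deg = np.degrees(np.arccos(cos_theta))

# Unit normal to the plane containing x and y: normalised cross product
n = np.cross(x, y)
n_hat = n / np.linalg.norm(n)

print(theta_deg)
print(n_hat)
```

A quick sanity check is that n_hat should be perpendicular to both x and y, i.e. both dot products should come out as (numerically) zero.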
ii. The first few terms of the Taylor expansion of e^x are: e^x = 1 + x + x^2/2! + x^3/3! + ...
- Make a plot of e^x between -2 and 2.
- Add lines showing the Taylor expansion at 1st, 2nd and 3rd order in x.
- Format the figure and save it.
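Part ii. can be sketched by evaluating the partial sums of the series on an array of x values. The loop below builds the order-n approximation 1 + x + ... + x^n/n! directly from the terms given in the question.

```python
import math
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

x = np.linspace(-2, 2, 200)

fig, ax = plt.subplots()
ax.plot(x, np.exp(x), "k-", label="exp(x)")

# Partial sums of the Taylor series up to order n: sum of x^k / k!
for order in (1, 2, 3):
    approx = sum(x**k / math.factorial(k) for k in range(order + 1))
    ax.plot(x, approx, "--", label=f"order {order}")

ax.set_xlabel("x")
ax.set_ylabel("y")
ax.legend()
fig.savefig("taylor_expansion.png")
```

Because x is a NumPy array, each partial sum is computed for all 200 points at once, so the three approximation curves cost only a few vectorised operations each.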
4. Image analysis [1¼ marks]
The file “NGC1097_image.txt” contains data taken by the Herschel Space Observatory.
- Load the data.
- Make a 2-d plot of it with contourf().
- Set the contours to go from the data minimum to 10% of the data maximum, to make the galaxy more visible. [Hint: use the levels parameter of contourf() to provide an array of contour levels.]
- Make a 1-d plot showing vertical and horizontal slices that pass through the centre of the image.
Note: We haven't told you the units of the data, or the x and y scales, so don't worry too much about labelling the axes.
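The contour-level trick can be sketched as below. Since NGC1097_image.txt is not reproduced here, this sketch uses a synthetic 2-d Gaussian "image" as a stand-in; with the real file you would load the array with np.loadtxt instead. Note that when the top contour level is below the data maximum, passing extend="max" to contourf() fills the brighter region rather than leaving it blank.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

# Synthetic stand-in for the real image; with the actual file you would
# use data = np.loadtxt("NGC1097_image.txt")
yy, xx = np.mgrid[0:100, 0:100]
data = np.exp(-((xx - 50) ** 2 + (yy - 50) ** 2) / 200.0)

# Contour levels from the data minimum up to 10% of the data maximum,
# so faint structure is not swamped by the bright centre
levels = np.linspace(data.min(), 0.1 * data.max(), 20)
fig, ax = plt.subplots()
ax.contourf(data, levels=levels, extend="max")
fig.savefig("image_contours.png")

# 1-d slices through the centre row and centre column of the image
rows, cols = data.shape
fig2, ax2 = plt.subplots()
ax2.plot(data[rows // 2, :], label="horizontal slice")
ax2.plot(data[:, cols // 2], label="vertical slice")
ax2.legend()
fig2.savefig("image_slices.png")
```

Indexing a whole row with data[rows // 2, :] (and a column with data[:, cols // 2]) is the standard NumPy way to extract the 1-d slices through the image centre.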