
Department of Computer Science, University of Leicester

C07201 Individual Project

Tool for calibration and analysis of data from Total internal reflection fluorescence microscope (TIRFM)

Maximilian Friedersdorff, mf195@student.le.ac.uk, mf195

Preliminary Report

supervised by
Dr. Nir Piterman
Dr. Manuela Bujorianu

July 1, 2016

DECLARATION

All sentences or passages quoted in this report, or computer code of any form whatsoever used and/or submitted at
any stages, which are taken from other people’s work have been specially acknowledged by clear citation of the
source, specifying author, work, date and page(s). Any part of my own written work, or software coding, which is
substantially based upon other people’s work, is duly accompanied by clear citation of the source, specifying author,
work, date and page(s). I understand that failure to do this amounts to plagiarism and will be considered grounds
for failure in this module and the degree examination as a whole.
Name: Maximilian Friedersdorff
Date: July 1, 2016

1 Motivation

The department of Molecular and Cell Biology operates two TIRFMs. At a high level, these microscopes take images of fluorescing molecules at multiple wavelengths. This is done by splitting the beam using a prism or grating and detecting at different wavelengths either simultaneously on different parts of the detector, or sequentially on the same area of the detector. It is important to note that some, but not all, molecules emit at multiple wavelengths simultaneously. Due to diffraction, the otherwise point light sources appear on the detector as 2D Gaussian distributions (to a good approximation).

The principal challenge lies in locating the centre of each Gaussian to sub-pixel accuracy and in identifying sources at different wavelengths that correspond to an emission from the same molecule. The observed molecules are at separations approaching the resolution limit of the microscope. Identifying coincident sources is difficult primarily because the microscope introduces chromatic aberration, which is assumed to be a superposition of the systematic aberration introduced by the lens and a random component due to thermal expansion and perhaps bumping or vibration of the microscope. This requires frequent and accurate calibration.

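To make the localisation step concrete, the following is a minimal sketch of a sub-pixel centre fit in Matlab using fminsearch; the function name, parameter layout and initial guess are illustrative choices of mine, not taken from the existing application.

    % Minimal sketch: estimate a source centre to sub-pixel accuracy by fitting
    % a 2D Gaussian plus constant background to a small region of interest (roi).
    % p = [amplitude, x0, y0, sigma, background] is an illustrative parametrisation.
    function [x0, y0] = fitGaussianCentre(roi)
        roi = double(roi);                                 % detector data may be integer
        [X, Y] = meshgrid(1:size(roi, 2), 1:size(roi, 1));
        model = @(p) p(1) * exp(-((X - p(2)).^2 + (Y - p(3)).^2) / (2 * p(4)^2)) + p(5);
        cost  = @(p) sum(sum((model(p) - roi).^2));        % sum of squared residuals

        % Crude initial guess: brightest pixel as the centre, width of 1.5 pixels.
        [~, idx] = max(roi(:));
        [yMax, xMax] = ind2sub(size(roi), idx);
        p0 = [max(roi(:)) - min(roi(:)), xMax, yMax, 1.5, min(roi(:))];

        p  = fminsearch(cost, p0);
        x0 = p(2);
        y0 = p(3);
    end

With real data, the noise model and the choice of fitting routine would need to be revisited; this sketch only illustrates the shape of the problem.
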
The data taken by these microscopes is in fact a series of roughly 100 images. These are taken in relatively quick succession, so drift is not a major problem. The image series is used to create light curves for each interesting source, further analysis of which is also of interest. The power emitted (photons per second) is quantized and decays over time. It is of interest to the department to identify these transitions in intensity. However, superimposed Gaussian noise makes this a challenge.

The department already uses an application, written in Matlab by a former Ph.D. student, to automate some of these tasks (and more).

2 Requirements

2.1 GUI code organisation

Presently the code controlling the GUI is a mess. The entry point into the programme (Auswerter) calls CreateUI, which in turn creates the UI elements. The callbacks to some of these functions are locally declared anonymous functions, some are function handles to externally defined functions, and yet others are passed as strings to be evaluated (this is terrible). When a subroutine is invoked by a UI interaction, no attempt is made to pass data to it as parameters; instead the subroutine typically requests the required data from a global store using the ReadAppData routine.

Not all of these things occur directly because of the way the GUI was written. However, tidying up the GUI will also clean this up if done sensibly. I propose that the existing GUI code is restructured and rewritten as necessary to organise it into some internally consistent pattern. Whatever this pattern is, it should also allow for the elimination of the global variables. This should be done not to improve functionality, but to make future development work easier; a minimal sketch of one possible convention follows the list below.

• Explore and decide on a GUI design pattern to apply consistently.

• Implement a small test project with that design pattern to see whether it is feasible to use in Matlab.

• Initially implement additions to the GUI with this pattern.

• Restructure the existing GUI to fit into this pattern.

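As one concrete illustration of what such a pattern could look like, the sketch below keeps all shared state in a single struct attached to the figure via guidata, so that callbacks fetch and update it explicitly rather than going through a global store. The names (exampleUI, onDetect, appData.sources) are placeholders, not identifiers from the existing application.

    % Minimal sketch: shared state lives in one struct stored with guidata,
    % and every callback retrieves and writes it back explicitly.
    function exampleUI()
        fig = figure('Name', 'Example');
        uicontrol(fig, 'Style', 'pushbutton', 'String', 'Detect', ...
                  'Callback', @onDetect);
        appData.sources = [];          % all shared state in a single struct
        guidata(fig, appData);         % attach it to the figure
    end

    function onDetect(src, ~)
        appData = guidata(src);        % fetch the state, no global store
        appData.sources = [appData.sources; rand(1, 2)];   % stand-in for real detection
        guidata(src, appData);         % write the updated state back
    end

Whether guidata, nested functions or a handle class is the right mechanism is exactly the kind of decision the first two bullet points are meant to settle.
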
2.2 Calibration

As previously mentioned, the microscopes suffer from significant and fluctuating chromatic aberration. It is therefore sensible to have a method of performing calibration quickly and (semi-)automatically. Ideally the user should only have to load calibration data and press a button. There should also be a facility to catalogue and load previous calibrations, which would allow working with old data without having to recalibrate.

Presently, the calibration is largely done by hand in an Excel sheet.

• Find and parametrize the transformation that describes the chromatic aberration.

• Experiment with different approaches (algorithms) to finding the best parameters. This is an optimization problem (a sketch follows the list below).

• Build a GUI, or extend the existing GUI, in order to manage calibration. This should have options to load from or save to file, as well as to edit the calibration data manually.

• Implement the calibration process.

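As a sketch of how the parameter search could be set up as an optimisation problem, the code below fits an illustrative model (an isotropic scaling about a centre, followed by a shift) to matched source coordinates from the two channels by minimising the sum of squared residuals with fminsearch. The real form of the transformation will follow the calibration spreadsheet and Weinmeister's thesis, so the model and initial guess here are assumptions.

    % Sketch only: ptsA and ptsB are N-by-2 matched source coordinates from the
    % two wavelength channels; p = [scale, cx, cy, dx, dy] describes a scaling
    % about (cx, cy) followed by a shift (dx, dy).
    function p = fitCalibration(ptsA, ptsB)
        p0 = [1, 256, 256, 0, 0];                          % identity-like initial guess
        p  = fminsearch(@(q) cost(q, ptsA, ptsB), p0);     % minimise squared residuals
    end

    function c = cost(p, ptsA, ptsB)
        c = sum(sum((applyTransform(p, ptsA) - ptsB).^2));
    end

    function out = applyTransform(p, pts)
        centred = bsxfun(@minus, pts, [p(2), p(3)]);       % move origin to scaling centre
        out = bsxfun(@plus, centred * p(1), [p(2) + p(4), p(3) + p(5)]);
    end

Different choices of optimiser and of error metric are exactly the "different approaches (algorithms)" referred to in the list above.
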
2.3 Colocalization

At the moment, checking for colocalization is done by checking the distance between two sources (at different wavelengths) after accounting for chromatic aberration. This is not as good as it can be.

When the parameters of the transformation are varied, the transformed coordinates will trace a path in the plane. Colocalized sources should ideally be precisely superimposed (the separation should be much smaller than the resolution of the microscope). Any deviation from that ideal is due to an imperfect calibration. Since the main component of the calibration is a scaling, it is expected that a pair of colocalized sources should lie on a straight line through the centre of scaling (or deviate from it only slightly). This should be taken into account when detecting colocalization.

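A sketch of what a geometry-aware check could look like is given below: a pair is accepted only if the residual separation after applying the calibration is small and the two raw positions are nearly collinear with the assumed centre of scaling. The parameter vector p follows the illustrative calibration sketch in Section 2.2, and the thresholds are placeholders rather than values from the application.

    % Sketch only: ptA and ptB are 1-by-2 coordinates of a candidate pair in the
    % two channels; p = [scale, cx, cy, dx, dy] as in the calibration sketch.
    function tf = isColocalized(ptA, ptB, p, maxDist, maxAngle)
        centre = [p(2), p(3)];
        mapped = (ptA - centre) * p(1) + centre + [p(4), p(5)];   % apply calibration
        close  = norm(mapped - ptB) <= maxDist;

        % Angle between the radial directions from the centre of scaling.
        u = (ptA - centre) / norm(ptA - centre);
        v = (ptB - centre) / norm(ptB - centre);
        radial = acos(min(1, max(-1, u * v'))) <= maxAngle;

        tf = close && radial;
    end

In practice maxDist would be tied to the localisation accuracy and maxAngle to the expected spread of the calibration; choosing these is part of the work described above.
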
2.4 Analysis of Light Curves

The department wishes to automate some analysis of the light curves produced. This has not been discussed with them in great detail yet, and so I will not go into further detail here.

2.5 Potential processing time improvements

Some processes take a long time to complete. This includes the detection of sources, as well as loading data into the software. It is unknown at this time whether this is due to an inefficient choice of algorithm, a poor implementation, or simply because this is as good as it gets. This should be investigated. If the poor performance is due to one of the first two, it should be improved. This may include rewriting some of the performance-intensive tasks in native code, or simply optimizing existing Matlab code.

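One way to start this investigation, before deciding between algorithmic changes, vectorisation and native (MEX) code, is to run the suspect routines under the Matlab profiler. In the sketch below, detectSources and exampleStack are placeholders for a real slow routine and a realistic input.

    profile on
    detectSources(exampleStack);   % placeholder call to a slow routine
    profile viewer                 % per-function and per-line timing report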

2.6 Testing

Currently no attempt is made to automatically test the code. While it would be ideal to write tests for the entire application, this is unrealistic. A good compromise would be to write tests for all new code, as well as for any code that is modified or in which bugs are discovered.
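
As an example of the kind of test this could mean in practice, the sketch below exercises the illustrative fitGaussianCentre routine from the sketch in Section 1 on a synthetic image with a known centre, using only a plain assert so that it also runs on older Matlab releases without a test framework.

    % Sketch only: synthetic, noise-free Gaussian with a known sub-pixel centre.
    function testFitGaussianCentre()
        [X, Y] = meshgrid(1:15, 1:15);
        roi = exp(-((X - 7.3).^2 + (Y - 8.6).^2) / (2 * 1.5^2));
        [x0, y0] = fitGaussianCentre(roi);
        assert(abs(x0 - 7.3) < 0.05 && abs(y0 - 8.6) < 0.05, ...
               'Centre not recovered to sub-pixel accuracy');
    end

Tests of this shape can be collected and run as a batch, and migrated to the matlab.unittest framework if the target Matlab version allows it.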

3 Technical Specifications

The existing application is written in Matlab 2012. It was written to run on the relatively standard university desktop machines running Windows 7. These typically have a low-end desktop-class processor (usually an i3). Development work is also carried out on a similarly equipped (though not identical) desktop computer running GNU/Linux.

4 Timetable

I question the usefulness of a detailed timetable at this point. A fair amount of this work will happen concurrently rather than sequentially. For instance, restructuring the GUI code will likely happen alongside adding additional features. On top of that, the order in which features are implemented will depend to a large extent on the priorities of the Dept. of Molecular and Cell Biology. I can, however, estimate the time required for some of these tasks.

• GUI code organisation — 2 Weeks

• Calibration — 2 Weeks

• Colocalization — 1 Week

• Analysis of Light Curves — 3 Weeks (this will be associated with some research)

• Processing time improvements — 4 Weeks (this will be highly variable depending on what is found)

• Testing — this will be ongoing

The tasks listed above will be carried out in roughly the order given.

5 Reading

Aside from the Matlab documentation, the main source consulted so far is Robert Weinmeister's Ph.D. thesis, Development of Single-Molecule Methods for RNA Splicing (2014). Most of it is irrelevant to this project; however, he does briefly touch on the form of the transformation that describes the chromatic aberration. This was used as a reference when looking at the spreadsheet used for calibration.

I suspect more will be added to this list as the project wears on.
