
Journal part 2

Reading week and Week 7-

For the rest of week 6, after the mid-week presentations ended, we went over the finer details of the project that needed to be done moving forward, and set goals based on splitting the team in two: NetLogo development and statistical analysis. Ravneet and Jiayu worked further on the development of the model over the reading week. The main advancement, suggested by Monica, was a radius assigned to the shop as the proximity within which entering agents can commit a crime. Agents were also assigned colors (red and yellow) based on whether or not they commit a crime.
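The mechanic described above can be sketched roughly as follows. All names and numbers here are hypothetical placeholders, not taken from the group's actual NetLogo code: an agent inside the shop's radius commits a crime with some dummy probability and is colored accordingly.

```python
import math
import random

SHOP = (0.0, 0.0)   # hypothetical shop location
RADIUS = 5.0        # proximity within which a crime can occur
P_CRIME = 0.3       # dummy probability of committing a crime

def step_agent(pos, rng=random):
    """Return 'red' if the agent commits a crime inside the shop radius,
    'yellow' otherwise (mirroring the red/yellow coloring in the model)."""
    if math.dist(pos, SHOP) <= RADIUS and rng.random() < P_CRIME:
        return "red"     # committed a crime
    return "yellow"      # did not commit a crime

random.seed(1)
colors = [step_agent((random.uniform(-10, 10), random.uniform(-10, 10)))
          for _ in range(100)]
print(colors.count("red"), "red,", colors.count("yellow"), "yellow")
```

In the NetLogo model the same check would run each tick per agent; this sketch just collapses it to one step per agent.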

The progress Alan and I made for the week largely consisted of going through the papers on risk-factor analysis of crime datasets. We found no papers discussing probabilities directly, but rather likelihoods based on socioeconomic factors, region of crime, and other demographic factors. To make further progress, we outlined the need for crime probabilities, which we found would have to be drawn randomly from a distribution supported by evidence across the papers. The other probability concerns the likelihood of a crime being observed by the camera. It took a large portion of the two weeks just to understand how to set up a methodology for the already scarce data. There were also papers using data from different cities, so their setups differed in the distributional assumptions made about the data. I therefore narrowed the scope of the project to the City of Toronto, with an emphasis on a paper that investigated risk analysis for robberies in Toronto. For the rest of the week, I went over the literature in depth and sorted out the most important papers for the research.

Week 8-

Opened up the Overleaf account. For NetLogo, Ravneet and Jiayu worked on refining elements such as representing the field of vision and the shop vicinity with smaller dots, configuring the cameras, and adding dynamic options to adjust in the model. Furthermore, dummy probabilities of committing a crime were assigned to agents. I was personally looking for feedback from our meeting with Monica on how to proceed with the model. Based on that feedback, we outlined how to integrate probabilities derived from crime likelihoods based on risk-factor odds ratios. I am not sure any of us fully grasped what exactly needed to be done there, so we emailed her for written feedback. The parts I did understand were: use the crime proportions from the dataset as targets, check whether the simulated and observed proportions converge to the same values, and extract the true distribution of the data through MCMC; and apply calibration measures via trial and error and capture the transformed data. Another major suggestion concerned the tick size: the current model used a 10-tick cycle to represent a day, which needed to change to 1 tick representing a week. With each tick covering more time, the model reaches its conclusion faster and supports larger inferences from the dataset. We also developed an outline of what needed to be completed over the remaining weeks, so that our goals were clear with 3 weeks left. Moreover, in the Sunday meeting, I explained to the group the specifics of the Toronto robbery crime dataset we would be working with.
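The MCMC calibration idea from the meeting can be sketched in miniature. This is an assumed illustration, not the group's actual code: every number below (target proportion, agent count, step sizes) is a made-up placeholder. The per-agent crime probability is treated as an unknown parameter, and a Metropolis-style random walk keeps values whose simulated crime proportion sits close to the observed proportion.

```python
import math
import random

# Illustrative placeholders, not the project's values.
TARGET_PROP = 0.05   # observed crime proportion the model should reproduce
N_AGENTS = 2_000     # agents per simulated run

def simulate(p, rng):
    """One model run: proportion of agents committing a crime at rate p."""
    return sum(rng.random() < p for _ in range(N_AGENTS)) / N_AGENTS

def log_score(p, rng):
    """Higher when the simulated proportion is closer to the target."""
    if not 0.0 < p < 1.0:
        return float("-inf")
    return -abs(simulate(p, rng) - TARGET_PROP) * 1e4

rng = random.Random(0)
p, trace = 0.10, []
for _ in range(800):
    proposal = p + rng.gauss(0.0, 0.01)            # random-walk proposal
    delta = log_score(proposal, rng) - log_score(p, rng)
    if delta >= 0 or rng.random() < math.exp(delta):
        p = proposal                               # accept the move
    trace.append(p)

estimate = sum(trace[200:]) / len(trace[200:])     # drop burn-in, average
print(f"calibrated crime probability ~ {estimate:.3f}")
```

The trial-and-error calibration mentioned above amounts to the same loop done by hand: propose a probability, run the model, keep it if the simulated proportion moves closer to the data.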

Week 9-

The week started with Ravneet and Jiayu getting help from Monica on the time ticks for the model, which could then be calibrated. This week was especially useful in terms of finding trends and analyzing the data so that we can have probabilities in the following week. I found some trends that helped characterize the dataset of crimes reported in Toronto. We found the mean crime-reported rate to be 0.06% over 10 years of data, along with some observations on how much crime is reported over time. Having this probability also helped Ravneet and Jiayu make further advancements to the model, where the randomness of committing a crime can now be assigned; hence one of the two probabilities is conditionally sorted out for implementation in the model. In terms of the model itself, they worked out how to plot the chart of crimes taking place. We are still waiting for feedback from Monica on what precisely needs to be done to calibrate the model; the Wednesday meeting was cancelled, so the problem remains unresolved. We also met with the mentors the following day to get feedback, building on the email communication set up mainly by Ravneet. Bridgette helped us first break down the model and the specific paper we were discussing. The main suggestion from the meeting was to set up a dummy variable assigned at random and have the environment capture data based on it. Another suggestion was that the probabilities should be investigated for the population as a whole instead of for individuals. I then had another call with Ravneet to break down the specifics of the NetLogo model.
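The kind of summary rate mentioned above can be computed as follows. The counts and population here are made-up placeholders, not the actual Toronto figures from the dataset:

```python
# Hypothetical yearly reported-robbery counts and a fixed population figure;
# the real analysis used the Toronto robbery dataset.
yearly_reports = [3600, 3500, 3400, 3300, 3200, 3100, 3000, 2900, 2800, 2700]
population = 2_800_000

# Per-year reported-crime rate, then the mean rate over the 10 years.
rates = [count / population for count in yearly_reports]
mean_rate = sum(rates) / len(rates)
print(f"{mean_rate:.4%}")  # mean reported-crime rate as a percentage
```

A rate computed this way is what gets plugged into the model as the baseline probability that an agent's crime is reported.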

Week 10-

In terms of individual progress, the stumbling block remained the crime probability to be assigned. When we met with Monica, we concluded how to proceed: the likelihood need not be just a random number, but a number within a confidence interval obtained from the calibration step through MCMC and trial-and-error techniques. A little progress was made over the next two days leading up to Wednesday, when we presented again. The scope of work now lies in finding this interval and integrating it into the model. Moreover, the behaviour environment is now designed to produce and record the output. Next week, after the probabilities are assigned, I will undertake the analysis of the data and we will begin writing the paper (not a lot was done this week as it was only half a week, but we remain hopeful of reaching our targets).
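The "number within a confidence interval" idea can be illustrated with a small sketch. The draws below are synthetic stand-ins for calibration output; in the project they would come from the MCMC / trial-and-error step, and the numbers are placeholders:

```python
import random

# Hypothetical draws of the crime probability from a calibration run.
rng = random.Random(42)
samples = sorted(rng.gauss(0.0006, 0.0001) for _ in range(1000))

# Empirical 95% interval: the 2.5th and 97.5th percentiles of the draws.
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(f"95% interval for p: ({lo:.5f}, {hi:.5f})")

# A model run would then draw its crime probability from inside this
# interval rather than fixing a single value.
p_run = rng.uniform(lo, hi)
```

Sampling the probability per run, rather than hard-coding it, is what lets the model reflect the calibration uncertainty.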
