
Computational descriptors definitions

- Vertical skills – (Sloboda, 1996; Rodger, 2010)

* Music gestural analysis: (Wanderley, 2000)

1. Spatial analysis: the amount of space between events in each
gesture, (Wanderley, 2000).
2. Cinematic analysis: the number of events per second,
(Wanderley, 2000).
3. Frequential analysis: the variation of the range of
effect used, (Wanderley, 2000).
4. Error: whenever the mouse is clicked outside the instrument's
circle (boolean).
5. Local error: the error of each improvisation, where the error
of each gesture is weighted by its duration.
6. Technical skills: the product of the IQR of the spatial
analysis, the cinematic analysis and the frequential
analysis, minus the local error.

- Horizontal skills – (Sloboda, 1996; Rodger, 2010)

* Computational creativity analysis: (Maher, 2006, 2010, 2012;
Comajuncosas, 2015)

7. Gesture descriptors: (Hartman et al., 2005; Comajuncosas, 2015)
• duration: segment duration in seconds
• IOI: inter-onset interval between segments
• angle: mean position in radians
• angle range: inter-quartile range related to spatial extent
• radius: mean position related to coordinates
• radius range: inter-quartile range related to spatial
extent
• speed: mean angular velocity
• speed range: inter-quartile range for velocity
8. Novelty: the euclidean distance to the closest centroid of a
k-means based clustering, (Maher, 2006, 2010, 2012).
9. Surprise: the distance of a new event, within each gesture,
to the resulting linear regression of the (maximum five)
previous events, (Maher, 2010; Comajuncosas, 2015).
10. Value: in this specific conceptual space, each improvisation
is evaluated according to its coherence, fluency and error
correction value, (Maher, 2012; Gibbs, 2011).
11. Coherence: one minus the distance from every event within
each gesture to the centroid of the cluster computed by
adding the events one by one. It is analysed over the
effect, freq, freqn, teta and diameter descriptors, (Gibbs,
2011).
12. Fluency: one minus the number of boolean clicks equal to
zero in a period of ten seconds, starting after the first
click, (Gibbs, 2011).
13. Error correction: one minus the current local error value
divided by the previous local error. When the local error of
the improvisation is zero, the error correction takes its
maximum value (one).

Thesis (pseudo)code display

- Setting up -

1. Ask on the terminal for the number of experiments (number of
.csv data files) to work with.
2. Read from the data files the saved descriptors (time, click,
x_coord, y_coord, outer, effect, inner, diameter, freqn, freq
and teta).
3. Segment the descriptors by events; that is to say, group the
data in segments separated when clicking stops.
4. Calculate new descriptors, such as radius and
angular_velocity.
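
As a sketch, steps 3 and 4 above could be implemented as follows; the data layout (one row per sample, with the click boolean marking when the mouse is pressed) is assumed from the descriptor list in step 2:

```python
import numpy as np

def segment_by_click(click):
    """Group consecutive samples with click == 1 into gesture segments.

    Returns a list of index arrays, one per segment; a segment ends
    whenever clicking stops (step 3).
    """
    click = np.asarray(click, dtype=bool)
    segments, start = [], None
    for i, c in enumerate(click):
        if c and start is None:
            start = i                          # a new gesture begins
        elif not c and start is not None:
            segments.append(np.arange(start, i))
            start = None                       # clicking stopped
    if start is not None:                      # trailing open segment
        segments.append(np.arange(start, len(click)))
    return segments

def polar_descriptors(x, y, time):
    """Derive radius and angular_velocity (step 4) from x/y coordinates."""
    theta = np.arctan2(y, x) % (2 * np.pi)     # angle in [0, 2*pi]
    radius = np.hypot(x, y)
    angular_velocity = np.gradient(np.unwrap(theta), time)
    return radius, angular_velocity
```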

- Vertical skills -

5. Compute the spatial_analysis by adding the euclidean distance
between each event and the previous one.
6. Compute the cinematic_analysis dividing the number of events
by the total duration in seconds.
7. Compute the frequential_analysis as the IQR of the effect
descriptor.
8. Compute the local_error by grouping the consecutive values
where an error is detected and averaging by its duration in
seconds.
9. Compute the total_skills by subtracting the local_error from
spatial_analysis*cinematic_analysis*frequential_analysis.
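
A minimal sketch of steps 5 to 9; variable names mirror the pseudocode, and the grouping in step 8 is simplified here to a duration-weighted mean of the per-gesture error booleans:

```python
import numpy as np

def spatial_analysis(x, y):
    # Step 5: sum of euclidean distances between consecutive events.
    return float(np.sum(np.hypot(np.diff(x), np.diff(y))))

def cinematic_analysis(n_events, total_duration):
    # Step 6: number of events per second.
    return n_events / total_duration

def frequential_analysis(effect):
    # Step 7: inter-quartile range of the effect descriptor.
    q75, q25 = np.percentile(effect, [75, 25])
    return float(q75 - q25)

def local_error(error_flags, durations):
    # Step 8, simplified: duration-weighted mean of the error booleans.
    return float(np.average(error_flags, weights=durations))

def total_skills(spatial, cinematic, frequential, local_err):
    # Step 9: product of the three analyses minus the local error.
    return spatial * cinematic * frequential - local_err
```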

- Horizontal skills -

* Low-level descriptors extraction

10. For each segment, calculate the gesture descriptors:
• duration = seg_time[i][-1] - seg_time[i][0]
• IOI = seg_time[i+1][0] - seg_time[i][0]
• angle = mean angle in the [0, 2pi] interval
• angle range: angle's IQR
• radius = mean radius
• radius range = radius' IQR
• speed = mean angular velocity
• speed range = velocity's IQR

11. Normalize the previous gesture descriptors.
12. Create the gestures vector:
gestures = [duration_n, ioi_n, angle_n, radius_n, srange_teta_n,
srange_r_n, speed_n, speed_var_n]
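
Steps 10 to 12 could be sketched as below; the min-max normalization scheme is an assumption, since the text does not specify one:

```python
import numpy as np

def iqr(a):
    q75, q25 = np.percentile(a, [75, 25])
    return float(q75 - q25)

def gesture_descriptors(seg_time, seg_theta, seg_radius, seg_speed, next_onset):
    # Step 10: one descriptor dict per segment.
    return {
        "duration": seg_time[-1] - seg_time[0],
        "ioi": next_onset - seg_time[0],           # inter-onset interval
        "angle": float(np.mean(seg_theta)),        # mean angle in [0, 2*pi]
        "srange_teta": iqr(seg_theta),             # angle range
        "radius": float(np.mean(seg_radius)),
        "srange_r": iqr(seg_radius),               # radius range
        "speed": float(np.mean(seg_speed)),        # mean angular velocity
        "speed_var": iqr(seg_speed),               # speed range
    }

def normalize(columns):
    # Step 11: min-max normalize each descriptor column to [0, 1].
    columns = np.asarray(columns, dtype=float)
    lo, hi = columns.min(axis=0), columns.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)         # guard constant columns
    return (columns - lo) / span
```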

* Clustering

13. Feed the scikit-learn model with the gestures vector.
14. Select the maximum number of clusters so that n_samples
>= n_clusters.
15. Compute the elbow method and select the optimal number of
clusters for each segment.
16. Define a main_descriptor by performing a Principal
Component Analysis (PCA) on the gestures vector, selecting
two attributes.
17. Perform a kmeans clustering fitting with the
main_descriptor, adding the gestures one by one, clustering
each whenever a new gesture is added.
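
A hedged sketch of steps 13 to 17 with scikit-learn; reading the elbow as the number of clusters after the largest drop in inertia is one simple interpretation, and the `n_init`/`random_state` settings are assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def elbow_n_clusters(X, max_k):
    # Step 14: never ask for more clusters than samples.
    max_k = min(max_k, len(X))
    inertias = [KMeans(n_clusters=k, n_init=10, random_state=0)
                .fit(X).inertia_ for k in range(1, max_k + 1)]
    drops = np.diff(inertias)
    # Step 15: pick the k just after the largest inertia drop.
    return int(np.argmin(drops)) + 2 if len(drops) else 1

def incremental_kmeans(gestures, max_k=8):
    # Step 16: a two-component PCA defines the main_descriptor.
    main_descriptor = PCA(n_components=2).fit_transform(gestures)
    models = []
    # Step 17: add the gestures one by one, re-clustering each time.
    for i in range(2, len(main_descriptor) + 1):
        X = main_descriptor[:i]
        k = elbow_n_clusters(X, max_k)
        models.append(KMeans(n_clusters=k, n_init=10, random_state=0).fit(X))
    return models
```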

* Novelty measurement

18. Calculate the novelty value of each gesture by computing its
euclidean distance to the cluster centroid.
19. Calculate the improvisation's novelty value as the sum of
all novelty values divided by the number of gestures.
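
Steps 18 and 19 reduce to distances against the fitted centroids; a numpy-only sketch, assuming the gestures and centroids live in the same (PCA) space:

```python
import numpy as np

def novelty_per_gesture(X, centroids):
    # Step 18: euclidean distance to the closest cluster centroid.
    X = np.asarray(X, dtype=float)
    C = np.asarray(centroids, dtype=float)
    d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
    return d.min(axis=1)

def improvisation_novelty(X, centroids):
    # Step 19: sum of novelty values over the number of gestures.
    return float(novelty_per_gesture(X, centroids).mean())
```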

* Surprise measurement

20. Compute the scipy.stats linear regression of a pair of events
from the effect, freq, freqn and duration descriptors.
21. Compute the Confidence Interval (CI) of the linear
regression.
22. Calculate the distance from the new event to the linear
regression and check if it is less than the CI.
23. Select the previously calculated distance as the surprise
value; if the event falls within the CI, the surprise is
zero.
24. Normalize the values obtained for each descriptor, and
select the surprise values as the maximum ones.
25. Compute the improvisation's surprise value as the maximum of
the surprise values calculated from the effect, freq, freqn
and duration descriptors.
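
Steps 20 to 25 could be sketched for one descriptor stream as below; using two standard deviations of the regression residuals as the confidence band is an assumption, since the text only names "the Confidence Interval of the linear regression":

```python
import numpy as np
from scipy import stats

def surprise_series(values, window=5):
    values = np.asarray(values, dtype=float)
    out = np.zeros(len(values))
    for i in range(2, len(values)):
        prev = values[max(0, i - window):i]        # at most five previous events
        x = np.arange(len(prev))
        reg = stats.linregress(x, prev)            # step 20
        predicted = reg.intercept + reg.slope * len(prev)
        resid = prev - (reg.intercept + reg.slope * x)
        ci = 2.0 * resid.std()                     # step 21 (assumed band)
        dist = abs(values[i] - predicted)          # step 22
        out[i] = dist if dist > ci else 0.0        # step 23
    return out

def improvisation_surprise(streams):
    # Steps 24-25: normalize each descriptor's surprise values and
    # take the overall maximum.
    normalized = [np.asarray(s, float) / max(np.max(s), 1e-12) for s in streams]
    return float(np.max(normalized))
```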

* Coherence measurement

26. Cluster the values within the descriptors: effect, freq,
freqn, teta and diameter, adding each event one by one.
27. Calculate coherence values as 1 minus the distance to the
cluster's centroid.
28. Normalize the values of coherence of each descriptor.
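
For one descriptor, steps 26 to 28 might look as follows; reducing the incremental "cluster" to a running centroid (the mean of the events seen so far) is one simple reading of adding events one by one:

```python
import numpy as np

def coherence_series(values):
    values = np.asarray(values, dtype=float)
    out = np.empty(len(values))
    centroid = values[0]
    for i, v in enumerate(values):
        out[i] = 1.0 - abs(v - centroid)    # step 27: 1 minus the distance
        centroid = np.mean(values[:i + 1])  # step 26: add the new event
    return out

def normalize_coherence(out):
    # Step 28: rescale each descriptor's coherence to [0, 1].
    lo, hi = out.min(), out.max()
    return (out - lo) / (hi - lo) if hi > lo else np.ones_like(out)
```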

* Fluency measurement

29. Calculate the number of click=0 in an interval of 10 seconds
from each event, and take 1 minus that value as fluency.
30. Take the mean of the values of fluency as the
total_fluency value.
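
A sketch of steps 29 and 30; reading "the number of click=0" as the fraction of non-clicking samples in the window, so fluency stays in [0, 1], is an assumption:

```python
import numpy as np

def fluency_series(time, click, window=10.0):
    time = np.asarray(time, dtype=float)
    click = np.asarray(click)
    out = []
    for t in time:
        mask = (time >= t) & (time < t + window)   # 10 s from each event
        silent = np.mean(click[mask] == 0)         # fraction with click = 0
        out.append(1.0 - silent)                   # step 29
    return np.array(out)

def total_fluency(time, click, window=10.0):
    # Step 30: mean of the per-event fluency values.
    return float(fluency_series(time, click, window).mean())
```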

* Value measurement

31. Create a new gesture vector:
gestures_value=[coher_effect,coher_freq,coher_freqn,
coher_teta,coher_diameter,total_fluency]
32. Fit a kmeans cluster with the PCA of the gestures_value
vector with the number of clusters calculated, again, with
the elbow method.
33. Calculate a preliminary value quantity by calculating the
distance from each gesture to the centroid and taking the
minimum one.
34. Compute the error_correction value as 1 minus the current
local error value divided by the previous local error, giving
the value of error_correction=1 when the local error of the
improvisation is zero.
35. Calculate the total_value as the sum of the value numbers
divided by the number of gestures, weighted by the
error_correction.
36. Compute the final creativity as:
total_creativity = total_surprise*total_novelty*total_value
37. Save the final creativity and tech_skills values for
further analysis.
38. Plot the requested graphs.
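
The value and creativity computations of steps 33 to 36 can be sketched as follows; the clustering of step 32 is abbreviated here to precomputed centroids, and reading "weighted by the error_correction" as a multiplication is an assumption:

```python
import numpy as np

def error_correction(local_err, prev_local_err):
    # Step 34: 1 minus current/previous local error; 1 when the
    # improvisation's local error is zero.
    if local_err == 0:
        return 1.0
    return 1.0 - local_err / prev_local_err

def total_value(gestures_value, centroids, err_corr):
    # Step 33: distance from each gesture to the closest centroid.
    X = np.asarray(gestures_value, dtype=float)
    C = np.asarray(centroids, dtype=float)
    d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
    values = d.min(axis=1)
    # Step 35: sum over the number of gestures, weighted by the
    # error correction (assumed multiplicative).
    return float(values.sum() / len(X)) * err_corr

def total_creativity(total_surprise, total_novelty, tot_value):
    # Step 36: total_creativity = total_surprise*total_novelty*total_value
    return total_surprise * total_novelty * tot_value
```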
