


Detection of Learners' Affective State Based on Mouse Movements

Georgios Tsoulouhas, Dimitrios Georgiou, and Alexandros Karakos
Abstract—In this paper we introduce a new method for detecting the emotional state of a student who attends a lesson online. More specifically, we are concerned with the detection of boredom, which can be caused by the presentation of a course through a distance-learning environment. The detection method is based on information obtained from the movements of the user's computer mouse. We suggest some metrics which are derived from this information and may be related to the emotional state of the user. Based on these metrics, we use data mining methods to classify the results. In order to test the efficiency of the method, we carried out an experiment with 136 students in a particular course which consisted of 7 different learning objects. The students were asked to attend the lesson and to express their emotional condition periodically (whether they felt bored or not) for each learning object separately. The collected data, after being processed, were fed to the well-known classification algorithm C4.5, which classifies the values of the metrics according to the user's emotional state.

Index Terms—Distance learning, Mining methods and algorithms, Adaptive hypermedia

Nowadays it is generally accepted that the use of computers and information technology can greatly improve the learning process. Adaptive Educational Hypermedia Systems (AEHS) are being developed more and more [1], trying to help the student-user obtain the knowledge needed in the most efficient way. Studies have shown [2],[3] that the emotional state of the user during the attendance of a course plays a significant role in the effectiveness of the learning process. For example, if some presentation of a lesson causes a sense of boredom in the student, then he won't be in a condition to attend the course and the time passes by ineffectively. Other such emotions are confusion, frustration and fatigue [4]. Consequently, the ability of the system to detect the student's condition is very significant, as it has the potential to adjust properly and provide stimuli to the student so as to change his disposition and stimulate his interest again. In this work we propose a method for the detection of the sense of boredom based on the hand motions of the student and, by extension, the mouse movements. Previous works [5],[6],[7] attempt to detect the emotional state by the use of several devices which monitor the student: cameras, movement detectors and other special computer peripherals (pointing devices). The disadvantage of the above is that such devices aren't available to every user who uses a computer at home in order to attend a distance-learning course. On the other hand, the mouse is a widespread device and its observation can be done relatively easily.


2.1. The Relationship between Affect and Learning

One important way for AEHS to adapt to their individual learners is related to their ability to show empathy. Being empathetic implies that such systems are able to recognize users' mental states and understand the implications of those states. Recent research in neuroscience, education, and psychology has shown that emotions play an important role in learning. People often separate emotion and reason, believing that emotions are an obstacle to rational decision making or reasoning, but recent work has shown that the cognitive process of an individual is strongly dependent on his emotions, which can drastically influence performance [8]. Numerous students sitting an examination have faced stress and anxiety and, in consequence, a memory blank: a situation in which they are unable to retrieve any information or make any deductions [9]. A number of psychological sources influence how individuals want and intend to learn. Individuals process their emotions and interact with other elements of their information-processing system. Emotional processing is a complex task, not easy for a learner to accomplish. Well-skilled tutors may apply several techniques in addition to their ability to communicate emotionally. They are then able to regulate the learning procedure in order to increase its efficiency, avoiding the impact of the boredom effect or other affective states. Various results published recently [4] focus on certain emotions such as fear, anger, happiness, sadness, disgust, and surprise. A number of papers deal with theories of emotions and their impact on learning processing [3]. Rob Reilly and Barry Kort [4] suggest a model of a learning cycle which integrates affect. Fig. 1 suggests six possible emotion axes that may arise in the course of learning. Fig. 2 interweaves the emotion axes shown in Fig. 1 with the cognitive dynamics of the learning process. The question of learners' affective state detection in AEHS is crucial, as such systems should take into account the learner's loss of interest as he/she stands in front of a screen. Moreover, as far as such systems offer personalized learning services, there arises the need to have the ability to sense and perceive affective states.

Fig. 1. Emotion sets possibly relevant to learning.

Fig. 2. Four-quadrant model relating phases of learning to the emotions in Fig. 1.

G. Tsoulouhas, D. Georgiou, and A. Karakos are with the Democritus University of Thrace, Xanthi, GR 67100.

2.2 Monitoring Learners' Behavior

Different assignments and methods have been proposed for the observation and analysis of the behavior of a user as he interacts with a computer [5],[6]. Some of these have also been applied in e-learning environments [7],[10],[11],[12],[13]. For this purpose, several special devices capable of monitoring the motions of the user are employed. For example, cameras are used to record facial expressions, and then, with special pattern-matching techniques, a correspondence is established between a facial expression and some emotion. Other devices used are eye trackers, microphones, special chairs which record the position of the body, special mice with pressure and touch sensors, and thermal cameras, movement and distance sensors. The use of these devices is effective, but there is a problem: they have some cost and are difficult to provide on every computer in an e-learning environment. So there is a need to find ways to monitor the emotional condition of a student via devices which are widespread and of low cost, such as the keyboard and the mouse, two devices present on every computer.

2.3 Mouse Movements and User Behavior

The motions of the user's hand, and by extension the movements of the computer mouse, have a direct relation to the psychological and emotional condition of the user. More specifically, the way the mouse is moved (trajectory, speed, intervals of immobility, direction) can reveal the condition of the user-student. Studies have shown that while the hand moves to perform some task, the brain is continuously fed with data and influences the trajectory of the hand accordingly (a motor response, such as the trajectory of a reaching arm movement, is continuously updated by perceptual-cognitive processing over time) [14],[15],[16]. For example, when the hand intends to grasp an object and this object moves, the trajectory it forms will change during motion, trying to follow the moving object. This shows that the movement of the hand has dynamics which depend directly on the condition of the brain when it gave the initial order to the hand to catch the object [17],[18],[19]. Even more specifically, research has shown that by following the motions of the mouse we can draw conclusions on how the human brain processes the words it hears or reads on the screen, how it processes different images seen on the screen, and which psychological functions take place [20],[21],[22],[23],[24]. One more field in which the way the user moves the mouse is studied is biometrics. It has been proven that, based on mouse dynamics, a user can be identified so as to be granted access to a system or network [25],[26],[27]. Furthermore, the sex of the user can be identified [28]. Finally, studies have shown that the movements of the mouse reflect the gaze position of the user [29],[30],[31] and, by extension, provide information on his intentions (for example: which could be the next search phrase on a web search page? [32],[33]).


In order to show that the movements of the mouse can be used to detect the boredom effect, we collected data from an experiment with 136 students. For this experiment the students had to attend a 45-minute session of a concrete course consisting of different learning objects. The subject of the course was Computer Programming Techniques and it consisted of 7 different learning objects. During the session the movements of the mouse were recorded, along with the intervals of immobility of the mouse. If for some concrete time interval the mouse stopped moving, the students were asked to answer the question "Are you bored?" with a single Yes or No. The time span was determined by a constant "b" which we named the boredom threshold. We separated the 136 students into 4 smaller groups; for each of these 4 groups the parameter "b" had a different value: 10 sec, 20 sec, 30 sec and 40 sec. Our purpose was to see whether the movements of the mouse and the intervals of its immobility are related to the boredom effect. The experiment was done on computers with the same hardware and on screens with the same resolution for each student.
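The prompting rule just described (the question is shown at the first movement after an immobility interval of at least b seconds, i.e. after b+x seconds) can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the class and method names are our own.

```javascript
// Sketch of the prompt-scheduling rule: if the mouse has been immobile
// for at least `b` seconds (the boredom threshold), the question is
// shown at the *next* movement, i.e. after b+x seconds.
class BoredomPrompter {
  constructor(bSeconds) {
    this.b = bSeconds * 1000;   // threshold in msec
    this.lastMove = null;       // time of the previous mouse movement
  }
  // Called with the timestamp (msec) of each mouse movement.
  // Returns true when the "Are you bored?" dialog should pop up.
  onMove(t) {
    const pauseLongEnough =
      this.lastMove !== null && t - this.lastMove >= this.b;
    this.lastMove = t;
    return pauseLongEnough;
  }
}
```

For b = 10, a movement at t = 16 s following a movement at t = 5 s (an 11-second pause) triggers the dialog, while movements 0.1 s apart do not.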

3.1 Data mining: gain knowledge on mouse movements data

Taking into consideration that the monitoring of the mouse is continuous (our application, for example, records the position of the mouse every 8 msec; the sampling rate is constrained by the processor's clock speed and other factors related to the processor's architecture), for a session of a few minutes the volume of the data is very large. For the processing of the final raw data we use the software Weka [34], which is a collection of machine learning algorithms for data mining tasks. Weka contains tools for data pre-processing, classification, regression, clustering, association rules, and visualization. The purpose of the tests we performed on the data with the aid of Weka was to find patterns within the data which correlate the mouse movements and the intervals of its immobility with the sense of boredom which did or did not appear in the users. For this purpose we used methods of data classification. To be more specific, we used the algorithm C4.5, a statistical classifier that constructs decision trees [35]. We chose this algorithm because it performs very well on data with discrete or continuous attributes and on data with no noise [36]. Our data contain discrete attributes with unordered nominal values and continuous attributes whose values are numeric. Also, before being fed to C4.5, the data are filtered so as to reduce the noise to a minimum. Weka's implementation of C4.5 is known as J48. Finally, to get an idea of the values of the proposed metrics, we performed a clustering using the SimpleKMeans algorithm [37]. The clustering produced 2 clusters, one with metrics from bored users and one with metrics from non-bored users.
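For readers unfamiliar with C4.5: it grows the tree by repeatedly choosing the attribute test that maximizes the gain ratio (information gain normalized by the split's own entropy). A minimal sketch of that criterion for our binary bored/non-bored label is shown below; this is illustrative only, not Weka's J48 code.

```javascript
// Shannon entropy of a list of class labels.
function entropy(labels) {
  const counts = {};
  for (const l of labels) counts[l] = (counts[l] || 0) + 1;
  let h = 0;
  for (const k in counts) {
    const p = counts[k] / labels.length;
    h -= p * Math.log2(p);
  }
  return h;
}

// Gain ratio of splitting `rows` (objects with a .label field) into
// the given groups: information gain divided by split information.
function gainRatio(rows, groups) {
  const h = entropy(rows.map(r => r.label));
  let cond = 0, split = 0;
  for (const g of groups) {
    const w = g.length / rows.length;
    if (w === 0) continue;
    cond += w * entropy(g.map(r => r.label));
    split -= w * Math.log2(w);
  }
  const gain = h - cond;
  return split === 0 ? 0 : gain / split;
}
```

A split that separates bored from non-bored instances perfectly has gain ratio 1; a split that leaves both groups mixed in the original proportions has gain ratio 0.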

3.2 Architecture Design

The application we designed runs in any HTML browser. The course which the participating students are asked to attend consists of 7 different learning objects which lie on an HTML web page. This page is separated into 7 different frames, one for each learning object. As the student follows the learning objects one by one, the application records the movements of the mouse. We made use of a simple architecture to obtain data on mouse movement that includes a transparent interface between the user and the test application. This interface asynchronously collects and sends data, on the basis of user-invoked mouse movements, during the user's interaction. We principally used the Ajax (Asynchronous JavaScript and XML) technique [38] to implement the asynchronous behavior and the PHP language [39] to collect, pre-process and store the data in a MySQL database [40]. Fig. 3 shows the model of the architecture.

Fig. 3. The Ajax model of communication between the client and the server [39].

Generally, the monitoring of the mouse pointer movements on the page can be done with code written in JavaScript [41]. JavaScript provides an event for every action of the mouse: onmouseover, onmousemove, onmouseup, onmousedown, onmouseout. So, with the use of the onmousemove event (Fig. 4), we can notice when there is some movement of the mouse. This event fires every time the mouse is moved, even by one pixel. We wrote a function which is called every time the onmousemove event occurs. This function records the time when the movement took place and the coordinates of the mouse pointer, and stores them in an array. Also, for as long as the mouse moves (continuous movement), its position is recorded at a sampling rate of 8 msec and is likewise stored in the above array. The data of the array are sent to the server periodically (for example, every 5 sec). Every time the data are sent, the array is emptied. In this way we avoid the accumulation of great quantities of data on the user's computer, something that could slow down the system.

Fig. 4. Use of the onmousemove JavaScript event.

The collected data are stored, as previously mentioned, in a MySQL database. The database contains, in separate relational tables, the following data (as seen in the database schema in Fig. 5): the data of the users who use the application; the learning objects of the lesson; the movements of the mouse (movement times and x, y coordinates), connected with the corresponding learning object and the corresponding user; and the answers of the users to the question "Are you bored?", connected with the corresponding last movement of the mouse. Every time the user keeps the mouse immobile for a time period equal to or greater than the value of parameter "b", then the next time the mouse moves (that is, after b+x sec) a pop-up dialog is activated which asks the user to answer the question "Are you bored?" with a single "Yes" or a single "No". This means that the pop-up dialog isn't activated every b seconds, but every b+x seconds, with x obviously not being constant. We do this so as not to bother the user during the pauses of the mouse and not to influence the duration of the pause.

Fig. 5. Database schema.

3.3 Collected Data

The data collected in the database as the mouse moves consist of 2 timestamps, the coordinates x, y and the direction of the movement. The 1st timestamp (start_time) shows the time of the previous sampling, while the 2nd timestamp (end_time) shows the time of the current sampling. Thus, when we have continuous movement the difference between the 2 timestamps is 8 msec, while when we have a pause the difference shows the duration of the pause; a difference of less than 500 ms indicates continuous movement (no inactivity). The coordinates x, y correspond to the position of the pointer at the 2nd timestamp. Based on the coordinates we can calculate the distance covered by the mouse between 2 samplings according to the formula

d = sqrt((x2 - x1)^2 + (y2 - y1)^2)    (1)

where (x1, y1) and (x2, y2) are the coordinates of the points at which the pointer was found at the 2 samplings. Also from the coordinates we can estimate the direction of the movement. The direction of the movement, as suggested by A. A. E. Ahmed and I. Traore [25], is a number from 1 to 8. As shown in Fig. 6, each of the eight directions covers a set of mouse movements performed within a 45-degree area. For instance, direction number 1 represents all actions performed with angles between 0 degrees and 45 degrees, whereas direction number 2 covers all actions performed between 45 degrees and 90 degrees. In Table 1 we see an example of the raw data recorded in the database; each row contains the data corresponding to a specific position of the mouse pointer. Besides the timestamps, the coordinates and the directions, the table shows the id of the session-user to whom the concrete record corresponds, as well as the learning object id, which corresponds to the learning object upon which the mouse was moving. Concerning the learning objects, what interests us is the type description of each learning object. For our application, as previously mentioned, we used 7 different learning objects to which, depending on their content, we gave the following descriptions (Learning Object Types - LOT):

Fig. 6. Mouse movement directions. For instance, direction number 1 represents all actions performed with angles between 0 degrees and 45 degrees.
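The record-and-flush scheme of Section 3.2 (store each onmousemove sample in an array, ship the array to the server every few seconds, then empty it) can be sketched as follows. The buffering logic is kept pure so it can be tested; the browser wiring appears in comments. Names are illustrative, not the actual implementation.

```javascript
// Client-side buffer for mouse samples, flushed periodically so data
// never accumulates on the user's machine.
class SampleBuffer {
  constructor(send) {
    this.send = send;    // e.g. an Ajax POST to the PHP collector
    this.samples = [];
  }
  record(t, x, y) {
    this.samples.push({ t, x, y });
  }
  flush() {
    if (this.samples.length === 0) return;
    this.send(this.samples);
    this.samples = [];   // empty the array after each transmission
  }
}

// Browser wiring (sketch):
//   const buf = new SampleBuffer(data => postToServer(data));
//   document.onmousemove = e => buf.record(Date.now(), e.pageX, e.pageY);
//   setInterval(() => buf.flush(), 5000);   // send every 5 sec
```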



1. medium text with images (mti)
2. short text (st)
3. short text with images (sti)
4. long text with images (lti)
5. video (vd)
6. multiple choice questions (mcq)
7. exercise (exr)

From the 136 participants we collected about 1,800,000 records of raw data.
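For each pair of consecutive samples, the distance of formula (1) and the direction number of Fig. 6 can be computed as below. This is a sketch; in particular, the handling of screen coordinates (where y grows downward) is our assumption, not stated in the text.

```javascript
// Distance between two consecutive samplings, per formula (1).
function distance(x1, y1, x2, y2) {
  return Math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2);
}

// Direction number 1..8 following Fig. 6: each direction covers a
// 45-degree sector, with direction 1 spanning 0 to 45 degrees.
function direction(x1, y1, x2, y2) {
  // Screen y grows downward; negate dy to get mathematical angles.
  let angle = Math.atan2(-(y2 - y1), x2 - x1) * 180 / Math.PI;
  if (angle < 0) angle += 360;            // map to [0, 360)
  return Math.floor(angle / 45) % 8 + 1;  // sector index -> 1..8
}
```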

4.1. Table of Proposed Metrics

From the above data we can easily extract different metrics which are directly related to the movement of the mouse, such as its speed. Also very useful are the metrics related to the immobility of the mouse, along with the metrics estimated on the basis of the moment at which the user was asked to answer the question "Are you bored?". The metrics we suggest appear in Table 2. The metrics containing the expression "Before Asked" concern the last 60 seconds before the question "Are you bored?".

TABLE 2
PROPOSED METRICS

Metric                                                     Units
Learning Object Type (LOT)                                 (discrete)
Total Average Movement Speed (TMS)                         Pixels/sec
Latest Average Speed - Before Asked (LMS)                  Pixels/sec
Mouse Inactivity Occurrences Before Asked (MIN)            (numerical)
Average Duration of Mouse Inactivity Before Asked (DMIN)   sec
Horizontal Movements to Total Movements Ratio (HRZ)        %
Vertical Movements to Total Movements Ratio (VRT)          %
Diagonal Movements to Total Movements Ratio (DGNL)         %
Average Movement Speed per Movement Direction (MDA)        Pixels/sec

In the following, using diagrams, we will show the relation between these metrics and the appearance of the boredom effect in the users-learners.

4.2. Learning Object Type

The type of every learning object (LOT) plays a significant role in the appearance of boredom and, generally, in the affective state of a student [42],[43]. For this reason many AEHS give special significance to the way the different types of learning objects are combined to compose a course [44]. So, depending on the type of the learning object, boredom appears more or less frequently, according to the histogram of Fig. 7. For example, for the learning objects with a short text, with a short text and pictures, or with video (st, sti, vd) we have a small percentage of boredom, while for the learning objects with a longer text (mti, lti) the percentages are quite high.

Fig. 7. Percentage of users who claimed boredom per learning object type. The difference between different learning objects is obvious.
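Given the per-sample records of Section 3.3, the speed and inactivity metrics of Table 2 can be computed roughly as follows, using the paper's conventions ("Before Asked" = last 60 seconds, inactivity = a gap of at least 500 msec). The function and field names are ours, a sketch rather than the actual implementation.

```javascript
// rows: [{start, end, x, y}] ordered in time; askTime: the moment
// (msec) at which the user was asked "Are you bored?".
function metrics(rows, askTime) {
  const WINDOW = 60 * 1000;    // "Before Asked" = last 60 seconds
  const PAUSE = 500;           // gaps >= 500 msec count as inactivity
  let totalDist = 0, totalTime = 0;        // for TMS
  let lateDist = 0, lateTime = 0;          // for LMS
  let pauses = 0, pauseTime = 0;           // for MIN / DMIN
  for (let i = 1; i < rows.length; i++) {
    const r = rows[i], prev = rows[i - 1];
    const dt = r.end - r.start;
    const d = Math.hypot(r.x - prev.x, r.y - prev.y);  // formula (1)
    const late = r.end > askTime - WINDOW && r.end <= askTime;
    if (dt >= PAUSE) {                     // a pause, not a movement
      if (late) { pauses++; pauseTime += dt; }
      continue;
    }
    totalDist += d; totalTime += dt;
    if (late) { lateDist += d; lateTime += dt; }
  }
  return {
    TMS: totalTime ? 1000 * totalDist / totalTime : 0,  // pixels/sec
    LMS: lateTime ? 1000 * lateDist / lateTime : 0,     // pixels/sec
    MIN: pauses,
    DMIN: pauses ? pauseTime / pauses / 1000 : 0,       // seconds
  };
}
```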

4.3. Total Average Movement Speed and Latest Average Movement Speed

The speed of the mouse movement is a metric which can help in the detection of boredom. The Total Average Movement Speed (TMS) corresponds to the average speed of the movement over the whole duration of monitoring the learning object, while the Latest Average Speed (LMS) is the average speed of the movement in the last 60 seconds before the user is asked if bored. We noticed that TMS does not differ much from LMS for users who did not report that they were bored. On the other hand, for users who reported boredom the two values differ significantly. In the diagram of Fig. 8 we see this difference between the two speeds for 4 different users: 2 who reported boredom (user 1 and user 2) and 2 who reported no boredom (user 3 and user 4). In the classification process we use the ratio TMS/LMS, since it shows the proportion between these values.

Fig. 8. Users 1 and 2 claimed boredom. There is a significant difference between TMS and LMS for users 1 and 2, but not for users 3 and 4.

4.4. Mouse Inactivity

The number of occurrences of mouse inactivity, as well as the duration of inactivity, can provide us with indications of the condition of the user. The metric Mouse Inactivity Occurrences Before Asked (MIN) corresponds to the number of mouse pauses in the last 60 seconds before the user is asked if bored. The metric Average Duration of Mouse Inactivity Before Asked (DMIN) is the average time that the mouse is stationary. We observed that the values of both MIN and DMIN were noticeably higher for users who reported boredom than for those who did not. In the diagram of Fig. 9 we see the averages of the metrics MIN and DMIN for all the users who answered "Yes" to the question "Are you bored?" and all those who answered "No".

Fig. 9. Average values of the MIN and DMIN metrics, for bored and non-bored users.

Fig. 10. How distributions of eye fixation time and clickthrough relate to the distribution of mouse hovering time, for regions such as text regions or image regions [45].

Fig. 11. Distribution of the 8 mouse movement directions from all 136 participants.

4.5. Direction of Movement

The movement of the mouse on a page usually follows the movement of the user's eyes [30]. Users move the mouse cursor according to where the eye is focused, and many people use the mouse as a marker when they are reading a text [45]. Fig. 10 shows the relative distributions of the user's attention across selected regions of the page, comparing the proportion of mouse data points in each region to the equivalent proportions of total eye fixation duration. The mouse pointer moves mostly in the horizontal direction (directions 3 and 7 according to Fig. 6), across or below the text that the user is currently reading. Vertical directions are also commonly used, at the right side of the text. Fig. 11 shows the distribution of the mouse movement directions from all 136 participants. As we can see, directions 3 and 7 (horizontal directions) have the highest values, followed by directions 1 and 5 (vertical directions). In our experiment we noticed some differentiation with respect to the diagram of Fig. 11 concerning the users who reported boredom. To be more specific, an increase of the vertical and diagonal movements relative to the horizontal movements was observed. Fig. 12 shows the histograms of movement directions from 6 different users. Users 4, 5 and 6 in Fig. 12b claimed boredom, and users 1, 2 and 3 in Fig. 12a did not claim boredom. For the users of Fig. 12b we have an increase of directions 1 and 5 relative to directions 3 and 7. So, for the metrics HRZ, VRT and DGNL, we can say they are connected with the user's behavior.
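Following the text's grouping (directions 3 and 7 treated as horizontal, 1 and 5 as vertical, the remaining four sectors as diagonal), the ratios HRZ, VRT and DGNL can be obtained from a histogram of direction counts. A sketch with illustrative names:

```javascript
// Ratios of Table 2 from a histogram of the 8 movement directions.
// hist[d] = number of movements recorded with direction d (1..8).
function directionRatios(hist) {
  const total = Object.values(hist).reduce((a, b) => a + b, 0);
  const sum = ds => ds.reduce((a, d) => a + (hist[d] || 0), 0);
  return {
    HRZ: 100 * sum([3, 7]) / total,      // percent horizontal
    VRT: 100 * sum([1, 5]) / total,      // percent vertical
    DGNL: 100 * sum([2, 4, 6, 8]) / total,  // percent diagonal
  };
}
```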

4.6. Average Movement Speed per Movement Direction

The analysis of the data of the experiment, as well as of the metrics HRZ, VRT and DGNL, showed us that, apart from the direction of the movement, there is also a differentiation in the speed of the movement depending on the direction. So we computed the Average Movement Speed per Movement Direction (MDA) and noticed that the users who reported boredom had greater speed in the horizontal movements shortly before they reported boredom, in relation to the other users. The metric MDA is in essence an array of 8 numbers, one for each of the 8 directions of movement. Fig. 13 shows that differentiation for each mouse movement direction.
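The 8-element MDA array can be computed by accumulating distance and time per direction. A sketch, with our own field names (`dir` 1..8, `d` in pixels, `dt` in msec):

```javascript
// MDA: average movement speed (pixels/sec) per direction, as an
// array of 8 values indexed by direction - 1.
function mda(moves) {
  const dist = new Array(8).fill(0), time = new Array(8).fill(0);
  for (const m of moves) {
    dist[m.dir - 1] += m.d;
    time[m.dir - 1] += m.dt;
  }
  return dist.map((d, i) => (time[i] ? 1000 * d / time[i] : 0));
}
```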




Fig. 12. (a) Histograms of directions of movement from 3 users who didn't claim boredom. (b) Histograms of directions of movement from 3 users who claimed boredom. The increase of the values of directions 1 and 5 is obvious.

Fig. 13. Distribution of the MDA for each mouse movement direction. The values are the average values of MDA from all 136 participants.

From the raw data we calculate the values of the metrics. For the group with b=10 we had 725 records, for b=20 we had 430 records, for b=30 we had 318 records and for b=40 we had 196 records. A sample of the values of the metrics appears in Table 3; each row contains some values of the proposed metrics from 2 different users (the user with session id = 102 and the user with session id = 103). For example, user 102 declares boredom on the learning object lti (bored=yes). We imported the data (for each group separately) into the Weka software and applied the J48 classifier. The properties of J48 were: confidenceFactor: 0.9, minNumObj: 1, numFolds: 3. In order to evaluate the results of the classifier, we separated the data into 2 sets in the proportion 60:40. We used the small set to train the classifier and the big set to evaluate it. For all 4 groups the classifier managed to classify correctly more than 90% of the data. To be more specific, the results for each group are:

1. For b=10:
   Correctly Classified Instances: 423 (97.2414 %)
   Incorrectly Classified Instances: 12 (2.7586 %)
   Kappa statistic: 0.9266
   Mean absolute error: 0.0436
   Root mean squared error: 0.1647
   Relative absolute error: 11.6371 %
   Root relative squared error: 37.3886 %
   Total Number of Instances: 435 (60% of 725)
   Confusion Matrix:
     a    b    <-- classified as
     103  11   | a = yes (bored)
     1    320  | b = no (non-bored)

2. For b=20:
   Correctly Classified Instances: 246 (95.3488 %)
   Incorrectly Classified Instances: 12 (4.6512 %)
   Kappa statistic: 0.873
   Mean absolute error: 0.0717
   Root mean squared error: 0.2101
   Relative absolute error: 19.1182 %
   Root relative squared error: 47.6024 %
   Total Number of Instances: 258 (60% of 430)
   Confusion Matrix:
     a    b    <-- classified as
     56   12   | a = yes (bored)
     0    190  | b = no (non-bored)

3. For b=30:
   Correctly Classified Instances: 178 (93.1937 %)
   Incorrectly Classified Instances: 13 (6.8063 %)



   Kappa statistic: 0.8033
   Mean absolute error: 0.0752
   Root mean squared error: 0.2596
   Relative absolute error: 21.5768 %
   Root relative squared error: 62.1531 %
   Total Number of Instances: 191 (60% of 318)
   Confusion Matrix:
     a b <-- classified as
     629 | a = yes (bored)
     1246 | b = no (non-bored)

4. For b=40:
   Correctly Classified Instances: 104 (88.1356 %)
   Incorrectly Classified Instances: 14 (11.8644 %)
   Kappa statistic: 0.6449
   Mean absolute error: 0.1186
   Root mean squared error: 0.3444
   Relative absolute error: 30.5677 %
   Root relative squared error: 81.6108 %
   Total Number of Instances: 118 (60% of 196)

Fig. 14 and Fig. 15 show an example of one of the classifier trees which were created: Fig. 14 shows a visual representation and Fig. 15 a text representation of the tree. The classifier gives concrete values of the metrics from which to conclude whether or not the user displays boredom. An example of a condition which leads to boredom, according to Fig. 14 and Fig. 15, is the following:

IF MDA2 <= 16.96 AND MIN <= 7 AND VRT > 14.93 AND MDA7 <= 8.91 THEN bored = yes

Fig. 15. Text representation of the tree.

In order to get a better view of the values of the metrics which can help us detect the emotion of boredom in the student-user, we performed a clustering of the data. The algorithm we used for the clustering is SimpleKMeans [37]. The properties of SimpleKMeans were: Euclidean distance, with 2 clusters, since our goal was to separate the instances into 2 clusters, one for bored users and one for non-bored users. Fig. 16 shows the results of the clustering and Fig. 17 shows a visualization of the results. Except for the metric LOT, whose values are discrete and cannot be numerical, the values of the other metrics which arise from the clustering agree almost completely with the findings mentioned in Section 4. For example, the value of the metric TMS/LMS for the 1st cluster (bored=yes) is 56.78%, which shows a decrease of TMS with regard to LMS, while for the 2nd cluster (bored=no) it is 78%, which indicates that TMS and LMS do not have a big difference. If, in the properties of SimpleKMeans, we change the number of produced clusters from 2 to some greater number (for example 7), then we will be able to create clusters for each value of the metric LOT.
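The branch of the tree quoted above reads directly as a predicate over the metric values, with MDA2 and MDA7 being the MDA entries for directions 2 and 7:

```javascript
// One branch of the J48 tree of Fig. 15, transcribed as a predicate.
// m holds the metric values for one user/learning-object instance.
function boredByExampleBranch(m) {
  return m.MDA2 <= 16.96 && m.MIN <= 7 && m.VRT > 14.93 && m.MDA7 <= 8.91;
}
```

Note that this is only one path of the full tree: an instance for which it returns false may still be classified as bored by another branch.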


In this work we suggested a methodology for detecting the appearance of boredom in students-users of an e-learning environment. We suggested a set of metrics which can help us, depending on their values, to detect whether a certain student shows boredom. We showed that, on a particular group of users and with specific computer hardware, the C4.5 classifier can classify the metrics relevant to boredom with a success rate above 90%. The advantage of this methodology is that, without the use of special devices, the behavior and affective state of the learner can be monitored by the analysis of the mouse movements alone. As future work, we plan to automate the whole procedure; that is, we are developing a plug-in tool to automate the data pre-processing and classification steps. This tool, according to the classification results, will be able to predict the emotional state of the learner and then inform the AEHS to redesign the flow of the lesson if necessary.

Fig. 14. Visual representation of the tree.

Fig. 16. The results of K-means clustering.


ACKNOWLEDGMENT

The authors wish to thank the students of Democritus University of Thrace who were kind enough to participate in the experiments of this work.


REFERENCES

[1] S. Chatzisavva, G. Tsoulouhas, A. Georgiadou and A. Karakos, "A dynamic environment for distance learning," 2nd International Conference on Computer Supported Education, pp. 398-401, 2010.
[2] M. Spering, D. Wagener and J. Funke, "The role of emotions in complex problem solving," Cognition and Emotion, vol. 19, 2005.
[3] B. Kort, R. Reilly and R. W. Picard, "An affective model of interplay between emotions and learning: Reengineering educational pedagogy - building a learning companion," Proceedings of the IEEE International Conference on Advanced Learning Technologies, pp. 43-46, 2001.
[4] R. Reilly and B. Kort, "The Science Behind The Art of Teaching Science: Emotional State and Learning," Proceedings of Society for Information Technology & Teacher Education International Conference, pp. 3021-3026, 2004.
[5] R. W. Picard, "Toward computers that recognize and respond to user emotion," IBM Systems Journal, vol. 39, no. 3.4, pp. 705-719, 2000.
[6] M. Pantic and L. J. M. Rothkrantz, "Toward an Affect-Sensitive Multimodal Human-Computer Interaction," Proceedings of the IEEE, vol. 91, no. 9, pp. 1370-1390, Sept. 2003.
[7] S. D'Mello, T. Jackson, S. Craig, B. Morgan, P. Chipman, H. White, N. Person, B. Kort, R. Kaliouby, R. Picard and A. Graesser, "AutoTutor detects and responds to learners' affective and cognitive states," Workshop on Emotional and Cognitive Issues in ITS, held in conjunction with the Ninth International Conference on Intelligent Tutoring Systems, pp. 31-43, 2008.
[8] A. Damasio, Descartes' Error: Emotion, Reason and the Human Brain, New York: Penguin Books, pp. 165-201, 2005.
[9] C. Frasson and P. Chalfoun, "Managing Learners' Affective States in Intelligent Tutoring Systems," Advances in Intelligent Tutoring Systems, vol. 308, pp. 339-358, 2010.
[10] S. Asteriadis, P. Tzouveli, K. Karpouzis and S. Kollias, "Estimation of behavioral user state based on eye gaze and head pose - application in an e-learning environment," Multimedia Tools and Applications, vol. 41, no. 3, pp. 469-493, 2008.
[11] T. Dragon, I. Arroyo, B. P. Woolf, W. Burleson, R. Kaliouby and H. Eydgahi, "Viewing Student Affect and Learning through Classroom Observation and Physical Sensors," Lecture Notes in Computer Science, vol. 5091, pp. 29-39, 2008.
[12] C. Conati, R. Chabbal and H. Maclaren, "A study on using biometric sensors for monitoring user emotions in educational games," Workshop on Assessing and Adapting to User Attitudes and Affect: Why, When, and How?, User Modeling, 2003.
[13] S. D'Mello, R. W. Picard and A. Graesser, "Toward an Affect-Sensitive AutoTutor," IEEE Intelligent Systems, vol. 22, no. 4, pp. 53-61, 2007.
[14] R. Abrams and D. Balota, "Mental chronometry: Beyond reaction time," Psychological Science, vol. 2, pp. 153-157, 1991.
[15] J. I. Gold and M. N. Shadlen, "Neural computations that underlie decisions about sensory stimuli," Trends in Cognitive Sciences, vol. 5, pp. 10-16, 2001.
[16] J. H. Song and K. Nakayama, "Target selection in visual search as revealed by movement trajectories," Vision Research, vol. 48, no. 7, pp. 853-861, 2008.
[17] M. A. Goodale, D. Pelisson and C. Prablanc, "Large adjustments in visually guided reaching do not depend on vision of the hand or perception of target displacement," Nature, vol. 320, pp. 748-750, 1986.
[18] M. Finkbeiner, J. H. Song, K. Nakayama and A. Caramazza, "Engaging the motor system with masked orthographic primes: A kinematic analysis," Visual Cognition, vol. 16, pp. 11-22, 2008.
[19] T. Schmidt, "The finger in flight: Real-time motor control by visually masked color stimuli," Psychological Science, vol. 13, pp. 112-117, 2002.
[20] M. J. Spivey, M. Grosjean and G. Knoblich, "Continuous attraction toward phonological competitors," Proceedings of the National Academy of Sciences, vol. 102, pp. 10393-10398, 2005.
[21] R. Dale, C. Kehoe and M. J. Spivey, "Graded motor responses in the time course of categorizing atypical exemplars," Memory & Cognition, vol. 35, pp. 15-28, 2007.
[22] T. A. Farmer, S. E. Anderson and M. J. Spivey, "Gradiency and visual context in syntactic garden-paths," Journal of Memory & Language, vol. 57, pp. 570-595, 2007.
[23] J. B. Freeman, N. Ambady, N. O. Rule and K. L. Johnson, "Will a category cue attract you? Motor output reveals dynamic competition


across person construal, Journal of Experimental Psychology: General, vol. 137, pp. 673-690, 2008. J. B. Freeman, N. Ambady, Motions of the hand expose the partial and parallel activation of stereotypes, Psychological Science, vol. 20, pp. 1183-1188, 2009. A. A. E. Ahmed, I. Traore, A New Biometric Technology Based on Mouse Dynamics, Dependable and Secure Computing, vol. 4, no 3, pp. 165-179, 2007. S.Benson, A.Thomson, A Behavioral Biometric Approach Based on Standardized Resolution in Mouse Dynamics, International Journal of Computer Science and Network Security, vol. 9, no 4, pp. 370-377, 2009. A. A. E. Ahmed, I. Traore, Detecting Computer Intrusions Using Behavioral Biometrics, PST, 2005. J.B. Freeman and N. Ambady, Motions of the hand expose the partial and parallel activation of stereotypes, Psychological Science, vol. 20, pp. 11831188, 2009. Q. Guo, E. Agichtein, Towards Predicting Web Searcher Gaze Position from Mouse Movements, 28th international conference extended abstracts on Human factors in computing systems, 2010. K. Rodden and X. Fu, Exploring how mouse movements relate to eye movements on Web search results pages, Web InformationSeeking and Interaction (WISI) Workshop at the 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 29-32, 2007. M. C. hen, J. R. Anderson, M. H. Sohn, What can a mouse cursor tell us more? Correlation of eye/mouse movements on web browsing, CHI EA '01 extended abstracts on Human factors in computing systems, 2001. Q. Guo, E. Agichtein, Exploring mouse movements for inferring query intent, 31st annual international ACM SIGIR conference on Research and development in information retrieval, pp. 707-708, 2008. A.L. Cox and M.M. Silva, The role of mouse movements in interactive search, 28th Annual Meeting of the Cognitive Science Society, pp. 1156-1161, 2006. E. Frank, M. Hall, G. Holmes, R. Kirkby, B. Pfahringer, I. H. Witten, L. 
Trigg, Weka-A Machine Learning Workbench for Data Mining, Data Mining and Knowledge Discovery Handbook, pp. 1269-1277, 2010. J. R. Quinlan , C4.5: programs for machine learning, San Mateo: Morgan Kaufmann Publishers, 1993. J. R. Quinlan, Improved Use of Continuous Attributes in C4.5, Journal of Artificial Intelligence Research, vol 4, pp. 77-90, 1996. T. Kanungo, D. M. Mount, N. S. Netanyahu, C. Piatko, R. Silverman, A. Y. Wu , The analysis of a simple k-means clustering algorithm, sixteenth annual symposium on Computational geometry, pp. 100-109, 2000. J. J. Garrett, Ajax: A New Approach to Web Applications, 2005. S. Nash, Learning Objects, Learning Object Repositories, and Learning Theory: Preliminary Best Practices for Online Courses, Interdisciplinary Journal of Knowledge and Learning Objects, vol. 1, pp. 217-228, 2005. M. Martinez, Designing learning objects to personalize learning, The Instructional Use of Learning Objects, pp.151173, 2002. O. Conlan, D. Dagger and V. Wade, Towards a standards-based approach to e-Learning personalization using reusable learning objects, World Conference on E-Learning, E-Learn 2002, pp. 210217, 2002. F. Mueller, A. Lockerd, Cheese: Tracking Mouse Movement Activity on Websites, a Tool for User Modeling, CHI '01 extended abstracts on Human factors in computing, pp. 279-280, 2001.

Georgios Tsoulouhas was born in 1979. He received his BSc from the Electrical and Computer Engineering Department, Polytechnic School, Democritus University of Thrace (DUTh), Greece, where he is currently a PhD candidate. As a PhD candidate he served as a teaching assistant (2007-2011) for programming courses (FORTRAN, C, and Internet programming). He develops software for both Windows and UNIX environments in C++, Perl, Visual Basic, .NET, PHP, and whatever else comes in handy. Research interests: data mining, intelligent tutoring systems, metadata, software agents, and fuzzy systems.

Dimitrios Georgiou was born in 1948. He received his BSc from the Mathematics Department of the Aristotle University of Thessaloniki (AUTh), his PhD from DUTh, and completed postdoctoral work at UC Berkeley. He was a Visiting Scholar at UC Davis (1980-1982) and a Visiting Professor at URI (1989-1992), and has been an Associate Professor at the School of Engineering, DUTh, since 1991. Besides teaching several undergraduate and postgraduate courses, he also teaches vocational courses for the Hellenic Power Corporation, the Center for Productivity and Development, the Training School for High School Teachers, the Hellenic Air Force, and others. He has published six textbooks. Research interests: qualitative behaviour of solutions of ODEs, difference equations, and PDEs; numerical methods for boundary value problems; intelligent tutoring systems; and computer networks. His research papers have appeared in several scientific journals and conferences. He is a member of the IEEE Computer Society, AACE, AMS, ECMI, and other societies, and a referee for the Journal of Mathematical Analysis and its Applications and other journals in mathematics and educational technology.

Alexandros Karakos received the degree of Mathematician from the Department of Mathematics of the Aristotle University of Thessaloniki, Greece, and the Maitrise d'Informatique from the Universite Pierre et Marie Curie, Paris, where he also completed his PhD. He is an Assistant Professor at the Department of Electrical and Computer Engineering, Democritus University of Thrace, Greece. His research interests are in the areas of learning systems, data analysis, and programming languages.