
METHODOLOGY

This chapter presents the theoretical and conceptual framework of the study, the operational definition of terms, the research design, the population and sampling, the instrumentation, the methods of data gathering, and the methods of data analysis.

Instructional technology integration is imperative in helping maximize the efficiency and effectiveness of teachers and schools. The underlying assumption is that the use of available technology helps students succeed on the PARCC Assessment. Of particular interest here are the types of instructional technology teachers use in a given school, the amount of time dedicated to using these technologies, and their relationship to the previous year's PARCC mathematics scores of current fifth grade students.

Theoretical and Conceptual Framework

This study will be anchored on the Unified Theory of Acceptance and Use of Technology (UTAUT) by Venkatesh, Morris, Davis, and Davis (2003). Technology integration in the classroom has become one of the most widely discussed topics since technology was first introduced into classrooms. In 2003, Venkatesh, Morris, Davis, and Davis created the Unified Theory of Acceptance and Use of Technology, which integrated elements from eight information technology acceptance models. Gender, age, experience, and voluntariness of use were added to the model and hypothesized to moderate the effect of four constructs, namely performance expectancy, effort expectancy, social influence, and facilitating conditions, on intention to use and usage behavior. Behavioral intention is seen as a critical predictor of technology use (Venkatesh, Morris, Davis & Davis, 2003). Self-efficacy and anxiety were determined by Venkatesh, Morris, Davis and Davis (2003) to be indirect determinants and were therefore excluded from the model. The UTAUT thus contains four core determinants of information technology use, namely performance expectancy, effort expectancy, social influence, and facilitating conditions, which are significantly linked to moderators of the key relationships such as socio-demographic characteristics.

UTAUT has four abbreviated components: PE for performance expectancy, EE for effort expectancy, SI for social influence, and FC for facilitating conditions. Performance expectancy refers to the extent to which an individual believes that using an instructional technology system will help students attain benefits in learning performance (Teo & Noyes, 2011). According to PE, users will find technology useful because it enables them to accomplish learning activities more quickly and effectively. It can be considered a synthesis of the variables extrinsic motivation, relative advantage, and result expectancies. Effort expectancy, or EE, is defined as the degree of ease associated with the use of instructional technology (Teo & Noyes, 2011). Based on the UTAUT, use of technology among educational users will depend on whether or not the technology is easy to use, and the influence of EE on behavioral intention will be moderated by gender and age such that the effect will be stronger for women, particularly older women (Venkatesh, Morris, Davis & Davis, 2003). It can be considered a synthesis of variables such as complexity and ease of use. Social influence, or SI, is defined as the degree to which a person perceives how important it is that "other people" believe he or she should use a technology. It can be considered a synthesis of variables such as subjective norms, planned behavior, and individual image. Finally, facilitating conditions, or FC, are defined as the extent to which users believe that the necessary infrastructure to support the use of technology in an organization or institution exists. These may include resource and technology factors concerning compatibility issues that have an impact on usage (Teo & Noyes, 2011). FC also includes the necessary training for technology users and the organizational and technical support available to them. It can be considered a synthesis of variables such as perceived behavioral control, facilitating conditions, and job fit.

In this study, the Unified Theory of Acceptance and Use of Technology (UTAUT) by Venkatesh, Morris, Davis, and Davis (2003) will serve as the lens through which the conceptual paradigm of the study is addressed. Through the four core components of this theory, the independent and dependent variables of the study will be addressed substantively.

In the context of performance expectancy, the socio-demographic variables of the student and teacher respondents, namely age, sex, educational attainment, parents' occupation, inclination to technology, area of specialization, and personal technology equipment, will be examined in terms of how behavioral intentions manifest in the classroom setting. Effort expectancy can be examined through how learners experience the conditions of the learning process and how teachers utilize instructional technology to assist student learning and academic achievement. From this perspective, social influence plays a vital role in the first two core components, since how productive a particular learner or teacher is perceived to be can influence others to be similarly productive. Assessing which instructional technology teachers utilize, and the amount of time teachers and students spend discussing content and building mastery of concepts, offers new knowledge that can help learners and teachers enhance instruction and scholastic achievement. Finally, facilitating conditions can be used as a lens to examine how instructional technology is used in content delivery. Because capability issues in the use of instructional technology fall within its scope, it can be used to evaluate the amount of time a particular learner spends on mastery as well as the amount of time a teacher spends evaluating learners' outputs. Through the Unified Theory of Acceptance and Use of Technology (Venkatesh, Morris, Davis & Davis, 2003), all the independent and dependent variables will be addressed clearly and logically.

Figure 1 below depicts the relationships between the independent and dependent variables and the extent to which instructional technology would benefit end users such as learners and teachers.


Figure 1. The research paradigm showing the relationship of the independent and dependent variables.

Independent Variables:

Socio-Demographic Profile of Students: age, sex, ethnicity, parents' highest educational attainment, father's occupation, mother's occupation, and inclination to technology.

Socio-Demographic Profile of Teachers: age, sex, ethnicity, school district, current work responsibility, highest educational attainment, years of teaching, area of specialization, and personal instructional technology equipment.

Instructional Technology Integration: instructional technology utilized (type of technology utilized, amount of time spent preparing instructional technology, technology website/software used, amount of time spent in using technology website/software, level of skills in using technology, and related instructional technology training attended); content delivery using instructional technology (amount of time utilized for content delivery, amount of time utilized by students for content mastery, and amount of time utilized for assessment of mastery); and behavioral intentions (performance expectancy, effort expectancy, social influence, and facilitating conditions).

Dependent Variable:

Performance in Mathematics thru PARCC Assessment in Spring 2018: Level 1 (Did Not Meet Expectations), Level 2 (Partially Met Expectations), Level 3 (Approached Expectations), Level 4 (Met Expectations), and Level 5 (Exceeded Expectations).

Operational Definition of Terms

To provide clarity and understanding to the readers as they use this study, the following terms are defined.
Socio-demographic Profile of Students refers to the characteristics of the student respondents such as age, sex, race or ethnicity, and educational attainment of parents.

Age refers to the number of years from the respondent's birth to his or her most recent birthday.

Sex refers to the biological classification of respondents as male or female.

Ethnicity refers to a respondent's belonging to a specific social group or race. The social groups in this study include Asian, Hispanic, American, Latino, Australian, and Canadian.

Parents' Educational Attainment refers to the highest educational attainment completed by the respondents' parents, categorized as elementary, high school, and college.

Parents' Highest Educational Attainment refers to the highest educational degree that the father and mother have attained or completed.

Father's Occupation refers to the father's profession or job to earn a living.

Mother's Occupation refers to the mother's profession or job to earn a living.

Inclination to Technology refers to respondents' natural tendency or urge to act or feel in a particular way in any context of technology-related activities.

Socio-demographic Profile of Teachers refers to the teacher respondents' characteristics such as age, sex, ethnicity, school district, current work responsibility, highest educational attainment, years of teaching, area of specialization, and personal instructional technology equipment.

School District refers to the geographical location of the school or venue where the study has been conducted.

Current Work Responsibility refers to the current occupational position of the teacher respondents, such as department head, assistant head, adviser, department secretary, office staff, program head, department coordinator, or regular faculty member.

Highest Educational Attainment refers to an individual's highest educational degree attained or completed. In this study, teachers' highest educational attainment is categorized as bachelor's degree, master's units, master's degree, doctoral units, or doctoral degree.

Years of Teaching refers to the total number of years of teaching experience of the teacher respondents, categorized as 1-5 years, 6-10 years, 11-15 years, 16-20 years, and 21 years and above.

Area of Specialization refers to the specific discipline or area in which the teacher respondent is an expert or concentrates. Teachers' areas of specialization include Mathematics, Science, and English.

Personal Instructional Technology Equipment refers to teacher respondents’ personal

equipment used in the classroom for teaching mathematics.

Instructional Technology Integration refers to the process of utilizing various technology tools to teach or enhance learning. This includes hardware (desktop computers, Chromebooks, tablets, and SmartBoards/interactive whiteboards), computer software and programs, and mathematics internet websites.

Type of Technology Utilized refers to the application of scientific knowledge for a particular purpose, especially in the teaching and learning process. In this study, it covers instructional technology such as software and equipment, including cellular phones, mobile devices, tablets, laptops, routers, PowerPoint, printers, video clips, DVDs, LCD projectors, televisions, and the like.

Amount of Time Spent Preparing Instructional Technology pertains to the number of hours the teacher respondents spend preparing instructional technology materials.

Technology Website/Software Used refers to the website or software the student and teacher respondents utilize in the teaching and learning process in mathematics.

Amount of Time Spent in Using Technology Website/Software refers to the number of hours the respondents spend on a particular website or software.

Level of Skills in Using Technology refers to the ability of respondents to use technology, categorized as highly proficient, proficient, moderately proficient, partially proficient, or not proficient.

Related Instructional Technology Training Attended refers to seminars related to instructional technology that the teacher respondents have attended.

Content Delivery Using Instructional Technology is defined as how the teacher respondents use instructional technology in delivering content or topics on a daily basis.

Amount of Time Utilized for Content Delivery refers to the number of minutes teacher respondents use in presenting a topic with instructional technology.

Amount of Time Utilized by Students for Content Mastery refers to the number of minutes student respondents spend on a particular activity for mastery of the topic.

Amount of Time Utilized for Assessment of Mastery pertains to the time teacher respondents allocate to assess students' mastery using technology.

Behavioral Intentions refers to the respondents' perceived likelihood or subjective probability that they will engage in a specific instructional technology activity or behavior.

Performance Expectancy is defined as a respondent's expectation that his or her use of technology will yield a pleasant or beneficial outcome.

Effort Expectancy refers to a respondent's initiative or determined attempt to get involved with technology for teaching and learning.

Social Influence means a change in a respondent's behavior that another person causes, intentionally or unintentionally, as a result of the way the changed person perceives himself or herself in relation to the influencer, in this case with respect to instructional technology.

Facilitating Conditions refers to specific technology-related and social influences that have a significant effect on respondents, particularly the extent to which an individual respondent perceives that support exists for acting on a behavioral intention.

Performance in Mathematics thru PARCC Assessment refers to the PARCC mathematics scores of fifth grade students from the previous year (SY 2017-2018), measuring level of proficiency from Level 1 to Level 5. PARCC refers to the Partnership for Assessment of Readiness for College and Careers, a group of states working together to develop a set of assessments that measure whether students are on track to be successful in college and careers. PARCC Assessment refers to the test utilized by member states, which is based on the Common Core State Standards (curriculum).


State Assessments refer to the various tests administered by every school district according to state and federal mandates. These assessments are an important source of data on students' progress and proficiency relative to the state's educational standards.

PARCC Scores refers to the following levels: Level 1 means "Did not yet meet expectations," represented by a down arrow, with a score of 650-699; Level 2 means "Partially met expectations," represented by a down arrow, with a score of 700-724; Level 3 means "Approached expectations," which is the minimum students need to achieve to pass the PARCC, represented by a bidirectional arrow, with a score of 725-750; Level 4 means "Met expectations," wherein students who performed at Level 4 and above have demonstrated readiness for the next grade level/course and, eventually, college and careers, represented by an up arrow, with a score of 751-809; and Level 5 means "Exceeded expectations," represented by an up arrow, with a score of 810-850.
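The score bands above can be summarized in a small lookup. The sketch below is only an illustration in Python; the band boundaries follow the ranges stated in this chapter (with Level 1 taken as 650-699 and Level 4 as 751-809, as in the Instrumentation section), and the function name is a placeholder rather than anything from the PARCC materials.

```python
# Illustrative mapping from a PARCC scale score to its performance level,
# using the score bands described in this chapter (overall scale 650-850).
def parcc_level(score: int) -> str:
    bands = [
        (650, 699, "Level 1: Did Not Yet Meet Expectations"),
        (700, 724, "Level 2: Partially Met Expectations"),
        (725, 750, "Level 3: Approached Expectations"),
        (751, 809, "Level 4: Met Expectations"),
        (810, 850, "Level 5: Exceeded Expectations"),
    ]
    for low, high, label in bands:
        if low <= score <= high:
            return label
    raise ValueError(f"Score {score} is outside the PARCC scale of 650-850")

print(parcc_level(738))  # -> Level 3: Approached Expectations
```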

Locale and Time of the Study

The study was conducted at Bladensburg Elementary School, an elementary school located at 4915 Annapolis Road, Bladensburg, MD 20710, during the academic year Fall 2018-Spring 2019. The school is under the jurisdiction of Prince George's County Public Schools in the state of Maryland, USA. In this study, the Spring 2018 PARCC scores of current fifth grade students were selected because this group of students had already taken the PARCC assessment for the last two consecutive years. An archival study of math scores from the previous school year, trends in technology use, and trends in usage of different websites will be conducted to establish whether there is a relationship between technology use, time spent on technology, and the math scores of these students.
Figure 2. Map of Maryland, USA, showing the location where the study will be conducted (source: Google Maps).
Research Design

The study utilized two types of research designs, namely the descriptive and correlational research designs. Descriptive research design is defined by Bernard (2006) as a fact-finding process with logical interpretation. Facts obtained may be accurate expressions of frequencies, central tendency, and deviations. On the other hand, Wiersma and Jurs (2005) defined correlational design as a method of research involving the measurement of two or more relevant variables and an assessment of the relationship between or among those variables, without drawing conclusions about the causal relationships among the measured variables. The correlational design will be used to establish the relationship between the type of technology used, the time spent on technology, and the math scores of students in the PARCC assessment. In particular, the correlation will indicate whether a positive correlation exists, a negative correlation exists, or the relationship is non-linear.


In this study, both the quantitative descriptive and correlational designs were utilized to identify, analyze, and describe instructional technology integration in relation to academic performance in mathematics, through the PARCC State Assessment scores, among 5th grade students at Bladensburg Elementary School in Prince George's County Public Schools.

Respondents of the Study

The researcher identified two groups of participants for the study via total enumeration sampling: 92 fifth grade students in general education classrooms on the diploma track and 8 mathematics and non-mathematics major teachers at Bladensburg Elementary School in Prince George's County Public Schools. Total enumeration sampling, as defined by Bernard (2006), is a sampling technique in which the sample is selected based on the characteristics of a population and the objectives of the study; it is also known as total population sampling.

Since the setting of this study was Bladensburg Elementary School in Prince George's County Public Schools, the teacher and student respondents were purposively chosen based on their actual experience with instructional technology integration in the classroom and content mastery in the learning process. The table below shows the number of teacher and student respondents.

Table 1: Population Distribution of Teacher and Student Respondents

Section Name      Total Population      Total Samples
Section 1                23                   23
Section 2                23                   23
Section 3                23                   23
Section 4                23                   23
Teachers                  8                    8
TOTAL                   100                  100

N (teacher respondents) = 8
N (student respondents) = 92

Instrumentation

The research instrument included two types of survey questionnaires: an instrument for teacher respondents and an instrument for student respondents.

The first instrument, for teacher respondents, consists of three parts:

Part I consists of a checklist describing the socio-demographic profile of teachers, covering age, gender, ethnicity, school district, current work responsibility, highest educational attainment, years of teaching, area of specialization, and personal instructional technology equipment.

Part II includes teachers' perceptions of three subcomponents of instructional technology integration: the use of instructional technology, content delivery using instructional technology, and behavioral intentions. The use of instructional technology covers the type of technology used, the amount of time spent preparing the instructional technology, the technology website or software used, the amount of time spent using the website, the level of skills in using technology, and related instructional technology training attended. Content delivery using instructional technology includes the amount of time utilized for content delivery, the amount of time used by students for content mastery, and the amount of time used by teachers for assessment using technology. Behavioral intentions involve performance expectancy, effort expectancy, social influence, and facilitating conditions. A five-point Likert scale was used: 5 (strongly agree), 4 (agree), 3 (partially agree), 2 (disagree), and 1 (strongly disagree).
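As a minimal illustration of how these five response options could be coded numerically for analysis, the sketch below assigns the values listed above; the example responses and variable names are placeholders, not items from the actual questionnaire.

```python
# Numeric coding of the five-point Likert scale used in the instruments.
LIKERT_CODES = {
    "strongly agree": 5,
    "agree": 4,
    "partially agree": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

# Hypothetical responses to one survey item, converted to their numeric codes.
responses = ["agree", "strongly agree", "partially agree", "disagree"]
coded = [LIKERT_CODES[r] for r in responses]
print(coded)  # [4, 5, 3, 2]
```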

Part III covers the performance of 5th grade students in mathematics through their PARCC Assessment scores. Data from academic year Fall 2017 and Spring 2018 will be tabulated and described according to five levels: Level 1 (Did Not Meet Expectations) with a score range of 650-699, Level 2 (Partially Met Expectations) with a score range of 700-724, Level 3 (Approached Expectations) with a score range of 725-750, Level 4 (Met Expectations) with a score range of 751-809, and Level 5 (Exceeded Expectations) with a score range of 810-850. These numerical scores are standard values used by the PARCC Assessment throughout the United States of America. This part of the instrument will be filled out by the teacher respondents with each individual student's score.

The second instrument, for student respondents, consists of two main parts:

Part I consists of a checklist describing the socio-demographic profile of the student respondents, covering characteristics such as age, sex, ethnicity, parents' highest educational attainment, father's occupation, mother's occupation, and inclination to technology.

Part II includes students’ perceptions on Instructional Technology Utilization in

the classroom such as type of technology utilized for learning mathematics, amount of

time spent in using technology website/software for learning mathematics, technology

website/software used for learning mathematics, level of skills in using technology, and

amount of time utilized by students for content mastery. As with the teacher respondents, the same five-point Likert scale was used: 5 (strongly agree), 4 (agree), 3 (partially agree), 2 (disagree), and 1 (strongly disagree).

Reliability and Validity of Instrument

All parts of the survey questionnaire were designed and developed by the researcher in accordance with the research paradigm. The researcher consulted three experts, namely a mathematics teacher, a statistician, and a research professor, to validate the scope of the questionnaires. Several revisions were made to capture the overall purpose of the study.

To determine the reliability of the instrument, pilot testing was conducted with 5 mathematics teachers who used instructional technology in teaching mathematics and 10 fifth grade students in a neighboring elementary school in the county. The results were analyzed using the Statistical Package for the Social Sciences (SPSS) and yielded a Cronbach's coefficient alpha of 0.812 for internal consistency, suggesting that the instrument was reliable.
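Although the reliability analysis was run in SPSS, Cronbach's alpha can be reproduced with a short script. The following is a minimal Python sketch, assuming the pilot responses are arranged with one row per respondent and one column per Likert item; the file name and column layout are assumptions for illustration, not the actual pilot data.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    items = items.dropna()
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# pilot = pd.read_csv("pilot_responses.csv")  # hypothetical pilot-test file
# print(round(cronbach_alpha(pilot), 3))      # the study reports alpha = 0.812
```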

Methods of Data Gathering

The first step was securing permission from Prince George's County Public Schools (PGCPS) because the research involved human subjects. The researcher completed the registration through the online webpage. After the approval, the researcher started gathering data from teachers and students.


The initial stage of this research was to identify benchmark scores of students using archival data from school year 2017-2018, with the Fall 2017 assessment administration serving as the baseline. The Spring 2018 assessment administration data were then obtained through proper collaboration with the school registrar.

The second stage of the study was to administer the first research instrument to the mathematics teachers of the school to determine the instructional technology used for mathematics instruction, the websites and software the teachers use in teaching the content, the amount of time the instructional technology is used for delivery of content, the amount of time it is used for student practice and mastery, and the amount of time it is used for assessment of mastery of content.

Since archival data were utilized, the researcher worked directly with the school's Testing Coordinator. A letter requesting access to the previous school year's testing results was sent to the School Principal for approval. Once approval was granted, the data needed for the study were collected from the Testing Coordinator.

Then, the second instrument was given to the 5th grade students to gather data on how they utilized instructional technology in learning mathematics. The instrument was administered over one week, and follow-ups were made to clarify uncertain answers in some parts.

Methods of Data Analysis

For the data analysis, the following statistical tools will be used in this study:
Frequency counts, percentage distribution, mean, and standard deviation will be utilized for the socio-demographic characteristics of the student and teacher respondents as well as for the scaled checklist of this study. These include their perceptions of instructional technology integration and students' utilization of technology for content mastery.
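A minimal sketch of these descriptive statistics, using a small invented data set and placeholder column names rather than the study's actual data, might look like this:

```python
import pandas as pd

# Hypothetical respondent records; column names are placeholders for illustration.
df = pd.DataFrame({
    "sex": ["Male", "Female", "Female", "Male", "Female"],
    "parcc_score": [712, 745, 688, 760, 731],
})

# Frequency counts and percentage distribution for a categorical variable.
frequency = df["sex"].value_counts()
percentage = df["sex"].value_counts(normalize=True) * 100

# Mean and standard deviation for a scaled or continuous variable.
mean_score = df["parcc_score"].mean()
sd_score = df["parcc_score"].std(ddof=1)

print(frequency, percentage, round(mean_score, 2), round(sd_score, 2), sep="\n")
```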

To determine the significant relationship between the student and teacher respondents' socio-demographic characteristics and the students' performance in mathematics in the PARCC assessment test, as well as instructional technology utilization in terms of content delivery and behavioral intentions, the Pearson Product Moment Correlation will be used.
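For illustration only, the Pearson Product Moment Correlation could be computed as sketched below; the paired values are invented for the example and are not the study's data.

```python
from scipy.stats import pearsonr

# Hypothetical paired observations: weekly minutes of instructional technology
# use and the corresponding PARCC mathematics scale scores.
minutes_on_technology = [60, 90, 45, 120, 75, 150, 30, 100]
parcc_math_scores = [712, 745, 688, 760, 731, 789, 675, 741]

r, p_value = pearsonr(minutes_on_technology, parcc_math_scores)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")  # the sign of r gives the direction
```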

Multiple linear regression will also be used to determine whether the socio-demographic profile variables could be predictors of mathematics scores in the PARCC Assessment test. Similarly, the same method of analysis will be utilized to determine whether instructional technology, content delivery using instructional technology, and behavioral intentions in using instructional technology could be predictors of mathematics scores in the PARCC Assessment test.
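A minimal sketch of such a multiple linear regression, again with invented values and placeholder predictors rather than the study's variables, might look like this:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical predictors; the actual model would use the study's socio-demographic
# and instructional technology variables as regressors.
data = pd.DataFrame({
    "age": [10, 11, 10, 11, 10, 11, 10, 11],
    "minutes_on_technology": [60, 90, 45, 120, 75, 150, 30, 100],
    "parcc_score": [712, 745, 688, 760, 731, 789, 675, 741],
})

X = sm.add_constant(data[["age", "minutes_on_technology"]])  # add intercept term
model = sm.OLS(data["parcc_score"], X).fit()
print(model.summary())  # coefficients indicate which predictors relate to scores
```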


LITERATURE CITED

Akbar, F. (2013). What affects students’ acceptance and use of technology? Dietrich
College Honors Theses. Carnegie Mellon University
Allsopp, D. H., McHatton, P., & Farmer, J. L. (2010). Technology, mathematics ps/rti,
and students with ld: What do we know, what have we tried, and what can we do
to improve outcomes now and in the future. Learning Disability Quarterly, 33(4),
273-288.

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human
Decision Processes, 50, 179–211.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory.
Englewood Cliffs, NJ: Prentice Hall.

Balanskat, A., Blamire, R., & Kefala, S. (2006). The ICT Impact Report – A review of
ICT impact on schools in Europe. Retrieved July 16, 2008.

Bates, T., & Epper, R. (2001). Teaching faculty how to use technology: Best practices
from leading institutions. Greenwood Publishing Group.

Belz, J. A. (2003). Linguistic graduation and the development of intercultural
competence in Internet-mediated foreign language learning. Unpublished
manuscript, The Pennsylvania State University.

Bernard, R. H. (2006). Research Method in Anthropology. New York, NY: Altamira


Press.

Bingimlas, K. A. (2009). Barriers to the successful integration of ICT in teaching and


learning environments: A review of the literature. Eurasia Journal of
Mathematics, Science & Technology Education, 5(3), 235-245.

Birch, A. and Irvine, V. (2010). “Preservice teachers’ acceptance of ICT integration in


the classroom: applying the UTAUT model”, Educational Media International,
46(4) 295-315.

Bransford, J., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain,
mind, experience, and school (2nd ed.). Washington, D.C.: National Academy
Press.
Brown, S. A., & Venkatesh, V. (2005). Model of adoption of technology in households: A
baseline model test and extension incorporating household life cycle. MIS
Quarterly, 399-426.

Bukhart, P. (2011). Effects of Instructional technology on the Mathematics Achievement
of Eighth Grade Students. Retrieved from
https://digitalcommons.odu.edu/cgi/viewcontent.cgi?article=1020&context=ots_masters_projects

Carter, A., Cotton, S. R., Gibson, P., O’Neal, L. J., Simoni, Z., Stringer, K., & Watkins,
L. S. (2013). Integrating Computing Across the Curriculum: Incorporating
Technology. Transforming K-12 classrooms with digital technology, 165.

Chen, J., Belkada, S., & Okamoto, T. (2004). How a web-based course facilitates
acquisition of English for academic purposes. Language learning & technology,
8(2), 33-49.

Choy, B. H. (2014). Teachers' productive mathematical noticing during lesson


preparation. In C. Nicol, P. Liljedahl, S. Oesterle & D. Allan (Eds.), Proceedings
of the joint meeting of PME 38 and PME-NA, 36 (2) 297-304.

Compeau, D. R., and Higgins, C. A. (2015). Computer self-efficacy: Development of a


measure and initial test. MIS quarterly, 189-211.

Davies, S. (2003). Content based instruction in EFL contexts. The Internet TESL Journal,
9(2), 24-28.

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of
information technology. MIS Quarterly, 13(3), 319–340.

Dawes, L. (2001). What stops teachers using new technology? In M. Leask (Ed.), Issues
in Teaching using ICT (pp. 61-79). London: Routledge.

Delen, E., & Bulut, O. (2011). The Relationship Between Students' Exposure to Technology
and their Achievement in Science and Math. Turkish Online Journal of
Educational Technology.

Evans, B. R. (2011). Content Knowledge, Attitudes and Self-Efficacy in the Mathematics
New York City Teaching Fellows (NYCTF) Program. School Science and
Mathematics, 111(5), 225-235.

Hidayati, K., & Budiyono, S. (2018). Development and Validation of Student's
Responsibility Scale on Mathematics Learning Using Subject Scaling
Model. International Journal of Instruction, 11(4), 499-512.

Hill, H., & Ball, D. L. (2014). Learning mathematics for teaching: Results from
California’s mathematics professional development institutes. Journal for
Research in Mathematics Education, 35(5), 330–351.

Huang, X., Craig, S. D., Xie, J., Graesser, A. C., & Okwumabua, T. (2018). The
Relationship between Gender, Ethnicity, and Technology on the Impact of
Mathematics Achievement in an After-School Program. Society for Research on
Educational Effectiveness, 2(1), 34-45.

Hobri, D., & Hossain, A. (2018). The Implementation of Learning Together in Improving
Students' Mathematics Performance. International Journal of Instruction, 11(2),
483-496.

Israel, O. N. (2016). Effects of Mathematics Innovation and Technology on Students'
Performance in Open and Distance Learning. Research in Pedagogy, 6(1), 66-75.

Kuppalapalle, V. and Tammy, M. (2016). Integration of Digital Technology and


Innovative Strategies for Learning and Teaching Large Classes: A Calculus Case
Study. International Journal of Research in Education and Science, 2(2), 379-
395.

Kim, S., & Chang, M. (2010). Computer Games for the Math Achievement of Diverse
Students. Journal of Educational Technology & Society: Innovations in Designing
Mobile Learning Applications, 13(3), 224-232.

Larson, M. “A Developmental Approach to Preparing Students for Standardized or State


Tests”. Retrieved from
https://www.eduplace.com/state/pdf/author/larson1_hmm05.pdf

Lim, C. and Khine, M. (2006) Managing Teachers' Barriers to ICT Integration in


Singapore Schools. Journal of Technology and Teacher Education, 14 (1) 97-
125.

Leung, A., & Bolite-Frant, A. (2015). Designing mathematics tasks: The role of tools. In
A. Watson & M. Ohtani (Eds.), Task design in mathematics education, New ICMI
Study Series (pp. 191-225). New York: Springer.

Mardiana, H. (2018). Lecturer's Attitude towards Advance Technology and Its Impact to
the Learning Process: Case Study in Tangerang City Campuses. Journal of
Educational Science and Technology, 4(1), 12-25.

Meagher, M. (2012). Students' relationship to technology and conceptions of mathematics
while learning in a computer algebra system environment. International Journal
for Technology in Mathematics Education, 19(1), 3-16.

Means, B. (2010). Technology and Education Change: Focus on Student Learning.
Journal of Research on Technology Education, 42(3), 285–307.

Moein, M., Lin, L., Luchies, C., Patterson, M., & Darban, B. (2018). Enhancing
Teaching-Learning Effectiveness by Creating Online Interactive Instructional
Modules for Fundamental Concepts of Physics and Mathematics. Education
Science Journal, 1(1), 1-14.

Mokgwathi, M., Graham, M., & Fraser, W. (2010). The Relationship between Grade 9
Teacher's and Learner's Perceptions and Attitudes with Their Mathematics
Achievement. International Journal of Instruction, 12(1), 841-850.

Murphy, D. (2016). A Literature Review: The Effect of Implementing Technology in a


High School Mathematics Classroom. International Journal of Research in
Education and Science, 2(2), 294-299.

Mustafina, A. (2016). Teachers' Attitudes toward Technology Integration in a
Kazakhstani Secondary School. International Journal of Research in Education
and Science, 2(2), 322-332.

Nicholas, H., & Ng, W. (2012). Factors influencing the uptake of a mechatronics
curriculum initiative in five Australian secondary schools. International Journal of
Technology and Design Education, 22(1), 65-90.
doi:http://dx.doi.org/10.1007/s10798-010-9138-0

Oginni, N. I. (2015). Effects of mathematics innovation and technology on students'
performance in open and distance learning. International Journal of Research in
Education and Science, 2(4), 60-75.

Ouyang, J., & Stanley, N. (2014). Theories and Research in Educational Technology and
Distance Learning Instruction through Blackboard. Universal Journal of
Educational Research, 2(2), 161-172. DOI: 10.13189/ujer.2014.020208

Oyebolu, S., & Olusiji, O. (2013). The Impact of Information and Communication
Technology (ICT) on Vocational and Technical Students'
Learning. Journal of Education and Practice, 4(7), 23-39.

Parkay, F. W., Anctil, E. J., & Hass, G. (2014). Curriculum leadership: Readings for
developing quality educational programs (10th ed.). Boston, MA: Allyn & Bacon
Ratnayake, I., & Oates, G. (2016). Supporting Teachers Developing Mathematical
Tasks With Digital Technology. In White, B., Chinnappan, M., & Trenholm, S.
(Eds.), Opening up mathematics education research (Proceedings of the 39th
annual conference of the Mathematics Education Research Group of Australasia),
pp. 543–551. Adelaide: MERGA.

Remmele, B., & Holthaus, M. (2013). De-Gendering in the Use of E-Learning.
International Review of Research in Open and Distance Learning, 14(3), 27-42.

Rosas, C., & Campbell, L. (2010). Who's teaching math to our most needy students? A
descriptive study. Teacher Education and Special Education, 33(2), 102-113.

Schacter, John (1999). The Impact of Education Technology on Student Achievement:


What the Most Current Research Has To Say. Retrieved from
https://eric.ed.gov/?id=ED430537

Tella, A. (2017). Teacher variables as predictors of academic achievement of primary
school pupils' mathematics. International Electronic Journal of Elementary
Education, 1(1), 16-33.

Haydon, T., Hawkins, R., Denune, H., Kimener, L., McCoy, D., & Basham, J. (2012). A
Comparison of iPads and Worksheets on Math Skills of High School Students
with Emotional Disturbance. Retrieved from
https://journals.sagepub.com/doi/abs/10.1177/019874291203700404

Umugiraneza, O., Bansilal, S., & North, D. (2018). Exploring Teachers' Use of Technology
in Teaching and Learning Mathematics in Kwazulu-Natal Schools.
Pythagoras, 39(1), 342-350.

Vogel-Walcutt, J. J., Gebrin, J. B., & Nicholas, D. (2010). Animated versus Static Images
of Team Processes to Affect Knowledge Acquisition and Learning Efficiency.
MERLOT Journal of Online Learning and Teaching, 6(1), 162-173.

Wenglinsky, H. (1998). Does It Compute? The Relationship between Educational
Technology and Student Achievement in Mathematics.
Retrieved from https://eric.ed.gov/?id=ED425191 on November 3, 2018.

Wiersma, W., & Jurs, S. (2005). Research Method in Education: An introduction. Boston,
MA: Chestnuts Hills Enterprises, Inc.

Wozney, L., Venkatesh, V., & Abrami, P. (2006). Implementing Computer Technologies:
Teachers' Perceptions and Practices. Journal of Technology and Teacher
Education, 14(1), 173-207.
Wenglinsky, H. (2005). Technology and Achievement: The Bottom Line. Learning in the
Digital Age, 63, 29-32. Retrieved from
https://imoberg.com/files/Technology_and_Achievement_--The_Bottom_Line_Wenglinsky_H._.pdf

Yu-Liang, T. (2011). Introducing new technology to teachers: A pilot
evaluation. International Journal of Technology in Teaching & Learning, 7(2),
136-151.
