
Running Head: EVALUATION OF INATURALIST LEARNING MODULE

Evaluation of iNaturalist Learning Module

Deborah Ceryes and Joan Miller

California State University, Monterey Bay

June 12, 2018

IST 622 Assessment and Evaluation

Dr. Bude Su

Contents

Introduction
Methodology
    Prototype
    Learners
    Tryout Process
    Tryout Conditions
Results
    Entry Conditions
    Outcomes
    Data Analysis
    Recommendations
Summary
References
Appendix A
Appendix B
Appendix C
Appendix D
Appendix E

Introduction

The California Academy of Sciences (CAS) coordinates citizen science initiatives to

bolster public awareness of, and involvement in, protecting biodiversity. These initiatives

support a global movement through which scientists and non-scientists alike make observations

of flora and fauna, collecting data to help answer some of our planet’s most pressing questions.

As part of these initiatives, the Academy conducts several bioblitz events each year: gatherings of scientists, citizen scientists, land managers, teachers, and families working together to find and identify as many different species as possible, all in one day. Due to the popularity of the events

and limited staff resources, the CAS needs a streamlined way to train volunteer coordinators to

plan and run the events in their local communities (Miller, 2018).

To resolve this training issue, an asynchronous e-learning course is being designed and

implemented to train volunteer coordinators to successfully run citizen scientist events, such as

Snapshot Cal Coast. This study analyzes one out of four modules within the asynchronous

training. Module #2: “iNaturalist: what it is and how to use it” encompasses the following

learning objectives (Miller, 2018):

1) Create an account on the iNaturalist website.

2) Download the iNaturalist app correctly to a smartphone device.

3) Upload an observation (i.e. a digital photo) to the iNaturalist network using the

app or website.

4) Demonstrate knowledge of tide pool etiquette.



5) Photograph species for proper identification.

The purpose of the study is to determine the module's effectiveness in achieving the learning objectives and to evaluate the module's usability. Learner reactions to the module provide feedback for design improvements, including whether learners would recommend the module to others.

Methodology

Prototype

“Working with iNaturalist” is a web-based learning module that uses a variety of multimedia components to engage learners across a range of demographic groups. Activities and assessments allow the learner to interact with the module directly. The module allows learners to

progress at their own pace and to review the content of the module as necessary. At one point in

the module, users select a specific media element to view. Learners are provided: 1) information

on the iNaturalist website and app, 2) an opportunity to create an iNaturalist account, 3) video

demonstrations (or text) on how to upload an observation, 4) instruction on how to perform

observations with minimal impact on the flora and fauna, and 5) interactivity using “drag and

drop” and multiple-choice options to confirm their knowledge of the quality of photo

observations. Feedback is provided to learners during the quiz portion of the module using on-

screen text as well as an overall summary of their quiz results.

Learners

Since citizen scientists are usually self-motivated volunteers in the community with an interest in interacting with nature, specific learner characteristics (e.g.

education, technical proficiency, familiarity with CAS, etc.) are expected to vary. Participants

were seven learners selected via “convenience sampling” (i.e. colleagues, friends and family

members of the evaluators). A learner survey collected information specific to education level,

background with environmental organizations or marine sciences, familiarity with various topics

in the module, as well as self-assessment of technical proficiency. Results from the learner survey (Appendix A) indicate that the majority of learners:

● vary in educational background from middle school through college

● do not have professional experience with an environmental organization or an educational background in marine sciences (i.e., they are volunteers or amateurs)

● are familiar with the California Academy of Sciences (100%) and somewhat familiar with “citizen scientist” (57.1%)

● are much less familiar with “bioblitz,” “Snapshot Cal Coast,” and “iNaturalist”

● rated themselves as “excellent” to “good” on the technical tasks related to using iNaturalist.

Tryout Process

Via email and in person, learners received expectations and instructions to complete the

learner survey (Appendix A), pre-test (Appendix B), iNaturalist module, post-test (Appendix B),

and a usability checklist (Appendix C). The learner survey obtained demographic and technical

ability data specific to each learner. The pre- and post-tests included the same questions to measure whether learning occurred relative to the module's overall learning objectives.

The usability checklist gathered feedback from the learner on the navigation, ease of use,

likeability and overall usefulness of the module to support the learning objectives.

In addition, the evaluator used an observation checklist (Appendix D) to document the

learner experience and conducted post-test interviews with the learners to clarify any noted

observations. Learners scheduled time with the evaluator at their mutual convenience over the

course of one weekend. Survey and test instruments were created in Google forms and provided

to the user via email links. Paper versions were available if needed. The allotted time to

complete the tryout was under one hour.

Tryout Conditions

Five learners were observed using the module via desktop or laptop computers in their

home or work settings. Two additional learners were observed via screen sharing on a Zoom

conference call, using their laptops at home. Minimum technical specifications include: PC with

video and audio capabilities, internet connection, and a web browser. Within the module

narrative, participants were encouraged, but not required, to utilize their smartphones to

download the iNaturalist app and try making an original observation on their own.

Results

Entry Conditions

Originally the intention was to test the “Working with iNaturalist” module on actual CAS

volunteer coordinators. Due to time restrictions on the part of the CAS Citizen Science team and

their coordinators (who are preparing for this year's Snapshot Cal Coast event happening this week), the most feasible choice was to utilize convenience sampling to select participants. CAS bioblitz volunteer

coordinators are typically a diverse mix of parks interpreters, docent coordinators, professors and

educators, activists, students, advocacy organization leaders, and coordinators for other

organizations. A convenience sampling of seven individuals from a friends and family pool is

not an exact match to this population, as the CAS coordinators often have some knowledge of

citizen science, experience with observing flora and fauna, and occasionally, more formal

science education. However, our sample population did offer a diverse group in terms of

education and technical proficiencies, which aligns with CAS volunteer coordinators and

volunteers (a possible secondary audience for the module) that do not possess the previously

mentioned CAS bioblitz experience.

Instruction

In general, learners followed the path of instruction as intended. However, at one point

they were offered a choice about which lesson they wanted to view, and at other points some

learners accidentally skipped a lesson due to navigation errors. Because of this, none of the

participants viewed every item in the module, which clearly impacted the post-test results as the

test questions included content across the entire module.

Another large gap between intended and observed instruction involved the learners’ use

of their personal smartphone devices with the iNaturalist app. The module used in this study

presents a video and a screencast of how to upload an observation via smartphone. Though the

learners were prompted to download the free iNaturalist app and encouraged to go outside to

record an observation, only two participants downloaded the app, and none of the participants

used the app to make an actual observation. Additionally, none of the participants made an

original observation via the iNaturalist website using their computer, though they all participated

in a simulated observation upload.

Outcomes

Observations, pre/post-test results, and the usability survey provide multiple facets of information about the learning module. Observation revealed navigation challenges for a majority of learners when too many components were offered at once: some users skipped portions of the module or required

intervention to find their way back through the module. The drag and drop quiz labels presented

challenges to learners that wanted to change the placement of their answers.



In addition, learners’ comments while under observation clearly indicate that bias existed

among the learners and observers. Although observers instructed learners to complete the

module per their own preference and no “right way” exists for completing the module, family

members felt compelled to complete some portions of the module to support the project for the

observer. Some degree of bias was expected based on the relationship of the learners to the

observers. Understanding when or how bias impacts the learners’ experience should always be

considered to ensure high quality instructional design.

Deeper review of the frequently missed questions on the pre/post-test indicated some

learning gains occurred in questions related to the quality of an observation photo. However,

two questions about the purpose of and how to use iNaturalist demonstrated variability in

learning. Further analysis of the learners’ results on these questions suggests that question wording and scoring may need to be addressed when developing future assessments for the module. The usability survey provided several important perspectives about

the learners’ experiences with the module. All seven learners “agree” or “strongly agree” on the

following:

● Narration adds to the learning experience

● Graphics and video are helpful

● Lesson instructions are easy to follow

● I would recommend this module to others interested in how to be an iNaturalist

Citizen Scientist.

Learners’ responses varied more widely on other aspects of the module, such as: 1) engaging

design, 2) sound effects, and 3) the on-screen agent.



Data Analysis

The pre and post-test data were analyzed to determine the module’s effect on learning.

Each of the ten questions was assigned one point, and each learner's scores on the two tests were totaled and paired. The directional null hypothesis states that the learners' pre-test mean is greater than or equal to the post-test mean (H0: Mean1 >= Mean2). The research hypothesis states that the learners' pre-test mean is less than the post-test mean (H1: Mean1 < Mean2). The post-test mean score (M = 6.86, SD = 0.90) was higher than the pre-test mean score (M = 5.29, SD = 1.38) (Appendix E). Further analysis of

the data using a t-test for paired samples (Table 1) shows statistical evidence that the learning

module was effective in achieving the learning objectives. A comparison of the absolute value

of the t-statistic (2.42) to the one-tailed critical value (1.94) supports rejection of the null hypothesis. The one-tailed p-value of 0.026 is less than the level of significance (α = .05), which also

supports rejection of the null hypothesis.

Table 1
t-Test: Paired Two Sample for Means
                              Pre-test      Post-test
Mean                          5.2857        6.8571
Variance                      1.9048        0.8095
Observations                  7             7
Pearson Correlation           -0.0959
Hypothesized Mean Difference  0
df                            6
t Stat                        -2.4197
P(T<=t) one-tail              0.0259
t Critical one-tail           1.9432
P(T<=t) two-tail              0.0519
t Critical two-tail           2.4469
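
For readers who wish to reproduce this kind of analysis, the short sketch below shows how a paired, one-tailed t-test of the form reported in Table 1 can be computed in Python with NumPy and SciPy. The score arrays are placeholders for illustration only, not the actual learner data reported above.

import numpy as np
from scipy import stats

# Placeholder scores for illustration only (0-10 scale); not the actual learner data.
pre  = np.array([5, 4, 6, 6, 7, 5, 3])
post = np.array([6, 7, 7, 6, 8, 7, 6])

d = post - pre                                    # paired differences
n = len(d)
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(n))  # paired-samples t statistic
p_one_tail = stats.t.sf(t_stat, df=n - 1)         # P(T >= t); reject H0 if p < .05

# Equivalent one-liner in SciPy >= 1.6: stats.ttest_rel(post, pre, alternative='greater')
print(f"t = {t_stat:.2f}, one-tailed p = {p_one_tail:.3f}")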

Recommendations

The results from the survey and test instruments provide evidence to support

recommendations to both the module designer and CAS. Within a reasonable timeframe after

completion of the module, CAS staff should follow up with the learners and examine their

iNaturalist accounts to confirm that they uploaded original observations. Demonstrating an

understanding of both web and smartphone observation abilities is crucial for volunteer

coordinators who will be teaching these skills to their recruited volunteers and provides another

mechanism to measure learning gains through a transfer of knowledge.

The following recommendations are for the module designer:

1. Both user feedback and observation clearly documented significant navigation issues

in the module. The designer should utilize much clearer signaling of what action to

perform next, and what buttons to push to advance to the next phase of the lesson.

Consider removing the built-in navigation bar at the bottom entirely, and instead

provide obvious buttons for the users to click on at the appropriate time. Most users

did not see that there was a table of contents (TOC) even though it was labeled at the

beginning of the lesson. Consider removing the TOC, keeping it on screen the entire time, or increasing signaling at the start of the module that the TOC exists and can be used. Also, when incorporating video and simulation media, clearly

identify the play button that connects with that media, as distinguished from the

controls that operate the module lesson.

2. Interviews with learners revealed that some thought the module to be complicated and

long, while others found sections to be too easy or trivial. A redesign of the module to

present each learning objective, along with accompanying activities and assessments,

in separate segments or “sub-modules” would allow learners with varying skill sets to

choose their own objectives, which adheres to Malcolm Knowles’ Adult Learning

Theory (Pappas, 2017). This would also likely increase post-test assessment scores

because it would ensure that learners are only assessed on lessons they choose to

interact with. Learners who are completely new to CAS and the concept of bioblitzes

would be encouraged to review all of the segments in order.

3. Pre-test/Post-test questions should be designed and measured more carefully to

accurately reflect learning gains. Specifically, two questions appear in the

“Frequently Missed Questions” review from both the pre and post-tests (Appendix

B). Those multiple-choice questions allowed users to “choose all that apply”, yet the

entire problem was worth only one point. Several users had learning gains between the pre- and post-tests that were not measured because each individual correct choice was not scored (a partial-credit scoring sketch follows this list). Additionally, on these same questions, some learners did not see the directive “choose all that apply” and subsequently chose only one answer. It may be advisable to make it clearer within the body of the question that more than one answer is correct.

4. To reduce bias and strengthen the evaluation process, implement true random

sampling instead of convenience sampling by recruiting learners who do not

personally know the evaluators.
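
As a concrete illustration of recommendation 3, the hypothetical sketch below shows one way partial credit could be computed for a “choose all that apply” item. The function name and option counts are assumptions for illustration, not part of the module's current scoring.

def score_multi_select(selected, correct, total_options):
    """Award a fraction of one point for each correct keep/skip decision on an option."""
    selected, correct = set(selected), set(correct)
    right = sum((opt in selected) == (opt in correct) for opt in range(total_options))
    return right / total_options   # value between 0.0 and 1.0 instead of all-or-nothing

# Example: the learner marks options 0 and 2; the answer key is options 0, 2, and 3 of 5.
print(score_multi_select({0, 2}, {0, 2, 3}, 5))   # 0.8 rather than 0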

Summary

The evaluation of the iNaturalist learning module included several components to support

an organized approach for gathering data and feedback from learners. Specific instruments were

developed to survey learners about the usability of the module as well as to assess the module’s

learning objectives. Although the sample learner group does not align with all characteristics of

the target audience, several relevant recommendations can be considered for future improvement

of the module, especially around navigation and organization of the media presented in the

module.

Data analysis indicates statistically significant learning gains occurred, especially with

questions related to photo observations. All learners agreed that they would recommend the

module to others interested in becoming an iNaturalist citizen scientist (Appendix C). Overall,

the evaluation process provided important feedback to analyze multiple facets of designing,

implementing and delivering instruction via the iNaturalist learning module. As additional

learners engage with future iterations of the module, further assessment of the primary learning outcomes will determine whether the module's learning effectiveness is sustained.

References

Miller, J. (2018). Training Citizen Science Coordinators for Bioblitz Events (MIST Capstone Proposal). California State University, Monterey Bay.

Pappas, C. (2017, December 21). The Adult Learning Theory - Andragogy - of Malcolm Knowles. Retrieved from https://elearningindustry.com/the-adult-learning-theory-andragogy-of-malcolm-knowles

Appendix A
Learner Survey

Participants received the learner survey via email and completed the survey prior to engaging

with the learning module. Additional responses include:




Appendix B

Working with iNaturalist Pre-test

Participants received the pre-test and post-test via email. The test instruments included the same

questions. Participants completed the pre-test prior to engaging with the learning module and the post-test immediately after viewing the module.

Pre-test insights:
Average: 5.29 / 10 points
Range: 3 - 7 points
Frequently missed questions:
● What do you think the iNaturalist website & app is for based on the name? (Choose all that apply)
● Which of the following statements about iNaturalist are true? (Choose all that apply)
● This photo is not the best choice for uploading to iNaturalist because: (Choose all that apply)
● What could you do to make this shot even better for species identification? (Choose all that apply)

Post-test insights:
Average: 6.86 / 10 points
Range: 6 - 8 points
Frequently missed questions:
● What do you think the iNaturalist website & app is for based on the name? (Choose all that apply)
● Which of the following statements about iNaturalist are true? (Choose all that apply)

Appendix C

Usability Checklist

Participants received the usability checklist via email and completed the survey immediately

after completing the learning module post-test. Specific responses include:

What did you like most?


● the pictures; the idea; being able to find out something
● The helpful interface that told me EXACTLY where to click. The in-depth videos were also helpful.
● I liked how the Narrator did a step by step procedure for showing how to take an observation and how to share.
● The fact that it lets non-scientific trained people learn more about our world
● Nature photos
● The thoughtful presentation of data being fun & interesting. Good learning experience.
We see so many nature programs these days - which is great - but I feel like with this character being sort of obviously not on scene, that I am more involved with her. I know I'm not just watching a documentary or entertainment/education video. It's clear to me that I am to interact with her & what I am watching. Loved the slimy frog/Prince humor - well timed. Loved the girl on a beach concept for a story line & also loved the doggie.

What did you like least?


● kinda complicated
● The random animal noises
● What I liked the least is that when the module was paused and the interaction popped up on the screen I didn't know what to do. On the screen with the snake, when the word popped up I did not know that I was supposed to click on one of them. Another thing I liked the least was when the module would pause and the sound effects (the waves) I was confused and didn't know that I was supposed to unpause the module.
● Moving from one segment to the next was sometimes confusing.
● Data entry

● I thought that the girl "guide" was a great feature, her clothing was annoying (but I'm a pro photographer) & much more engaging than an on location live narrator.
What would you change?
● make text larger
● no need for multiple species just confuses
● I would probably get rid of the random animal noises that happen in between the pauses, and the final quiz interface could probably be fixed up a bit (it was hard to select a different answer when I realized my mistake)
● The things I listed for what I liked least.
● More audio on how to move from segment to segment
● Sometimes wasn't sure when to click screen or when to push play
● Egret background sound was irritating & I thought it was computer feedback initially. I very much liked the other sound effects. On "How to take a photo" screen the narration of the do's & don't kept dropping out. The asterisk that refers down to nothing during the quizzes was confusing.

Appendix D

Observation Checklist

Observers utilized the observation checklist to document notes, issues and errors. Observers

completed the checklist while the learner engaged with the various instruments and module. In

addition, responses to follow-up questions were documented on the observation checklist.

Columns: Module Section | Navigation Notes | Other Notes | Observee Notes | Re-design Thoughts

Module sections:
● Title Screen
● Introduction (Lana)
● Register
● Get the App (2 phones screen)
● What’s an Observation
● Smartphone or Web Choice (snake screen)
● Smartphone Video (CAS)
● Web Upload Video (CAS)
● Try-it Web Upload
● iPhone Demo (Joan)
● Tidepool Etiquette & Take good photos (note: these are just things you watch, no navigation or interaction)
● Photo guidelines
● Drag & Drop quiz
● Matching quiz & Score sheet


Observation General Notes


Columns: Item | Yes | No | N/A | Comments/Notes (include errors and non-critical errors)

Items:
● Learner navigates the module easily.
● Learner clicks the appropriate buttons when prompted.
● Learner completes the assessments without any issues.
● Learner encounters a problem and can resolve on their own.
● Learner encounters a problem and requires intervention.

Optional User Tasks (please note web or phone app use, if applicable):
A. Did they download the app to a phone?
B. Did they sign up for an iNat account?
C. Did they make and upload an original observation?

Environmental conditions (circle or document as applicable):
a. Devices: Desktop / Laptop / Smartphone / Other device:
b. Operating System: Windows / Mac / iOS / Android / Other:
c. Audio: Headphones / Speakers (internal or external)
d. Setting: Public / Private

NOTES ON GOOGLE FORMS USED (for future form improvement):


Learner Survey:
Pre-Test:
Post-Test:
Usability Survey:

Appendix E

Descriptive Statistics

                              Pre-test      Post-test
Mean                          5.2857        6.8571
Standard Error                0.5216        0.3401
Median                        6             7
Mode                          6             6
Standard Deviation            1.3801        0.8997
Sample Variance               1.9048        0.8095
Kurtosis                      -0.3255       -1.8166
Skewness                      -0.7065       0.3530
Range                         4             2
Minimum                       3             6
Maximum                       7             8
Sum                           37            48
Count                         7             7
Confidence Level (95.0%)      1.2764        0.8321
