
The Participatory Web: A medium for human-computer collaborative design of watershed management solutions
Meghna Babbar-Sebens, Ph.D., Oregon State University
3rd CUAHSI Conference on HydroInformatics, July 17, 2015, Tuscaloosa, AL

Outline

1. Acknowledgements
2. Introduction
   - Need for participatory design in watershed planning
   - Human-centered design on the Participatory Web
3. Research Goals
4. Methodology
5. Results: Lessons from experiments with students and stakeholders
   1. User learning and engagement
   2. User models for detecting revealed preferences
   3. Interactive search algorithms
6. Conclusions

1. ACKNOWLEDGEMENTS

- Collaborators and students: Snehasis Mukhopadhyay, E. Jane Luzar, Kristen Macuga, Edna Loehman, Adriana D. Piemonti, Vidya B. Singh, Jon Eynon, Jill Hoffman, and the various student participants
- Funding agencies (NSF Award IDs ESE: 1014693/1332385; NOAA/CPO/SARP Award ID NA14OAR4310253; ISDA Award ID A337-9-PSC-002)
- Indiana University's FutureSystems (formerly FutureGrid) (NSF Grant #0910812)
- Stakeholder participants and partners

2. INTRODUCTION

- Restoration of the hydrologic cycle in degraded watersheds needs a network of distributed conservation and storage practices, such as wetlands, filter strips, grassed waterways, and no-till.
- Stakeholder engagement is even more critical now: acceptable choices will depend on the local site and stakeholder conditions and preferences.

[Figure: examples of conservation practices. Image source: http://cdn.phys.org/]
References: Hey et al., 2004; Mitsch & Day Jr., 2006; Heisel, 2009

Participatory Methods for Engaging Stakeholders

- Integrated Watershed & Water Resources Management (Schramm, 1980; Viessman, 1996; Hooper, 2007)
- Shared Vision Modeling and Planning (Hamlet et al., 1996; Palmer et al., 1995)
- Subjectivity and preferences

Participatory Design of Alternatives

- Crowd-sourcing the design process to the community
- Bottom-up and democratic problem-solving process
- Community building and empowerment
- Emergence of a new human-based computing paradigm, where human-computer interactions play an important role in design algorithms

References: von Ahn, 2009; Fraternali et al., 2012

Opportunities on the Participatory Web 2.0 (and the Web 3.0 in the future)

- The Internet provides opportunities for coordinating the efforts of billions of humans.
- The combined power of humans and computers can now be harnessed to solve complex problems!
- Issues arise when humans are included in the loop (Ipeirotis and Paritosh, 2011):
  - Human factors
  - Quality of human contributions
  - Market, ethical, and legal aspects

The Participatory Optimization Concept
(Interactive Optimization / Human-Guided Search / Human-Centered Optimization)

Non-interactive optimization algorithm approach:
[Figure: an initial feed of designs enters the optimization algorithm's operations; new designs undergo numerical evaluation, and the loop converges to optimal solutions.]

Interactive optimization algorithm approach (Babbar-Sebens et al., 2015):
[Figure: an initial feed of designs enters the optimization algorithm (IGAMII). New designs undergo both numerical evaluation and subjective evaluation. Through a web interface, the user views designs/alternatives and returns feedback: information gathering and decision making, plus Likert-scale user ratings and usability data. The feedback trains online user preference models, and the loop converges to optimal solutions.]
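The interactive loop can be sketched as a toy genetic algorithm in which each candidate design is scored both by a numerical objective and by a stand-in for the user's Likert rating. IGAMII's actual operators and WRESTORE's objectives are not reproduced here, so `numerical_evaluation`, `user_rating`, and `one_generation` below are illustrative assumptions, not the published algorithm.

```python
import random

def numerical_evaluation(design):
    # Toy numerical objective: reward designs with fewer practices (lower cost).
    return -sum(design)

def user_rating(design):
    # Stand-in for the subjective Likert-scale rating (1-5); this toy
    # "user" simply prefers sparse designs.
    return 5 - min(4, sum(design))

def one_generation(population, elite=4):
    # Rank by numerical score, breaking ties with the simulated rating,
    # then refill the population with crossover plus point mutation.
    scored = sorted(
        population,
        key=lambda d: (numerical_evaluation(d), user_rating(d)),
        reverse=True,
    )
    parents = scored[:elite]
    children = []
    while len(children) < len(population) - elite:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(a))
        child = a[:cut] + b[cut:]
        i = random.randrange(len(child))
        child[i] = 1 - child[i]          # flip one bit
        children.append(child)
    return parents + children

random.seed(0)
population = [[random.randint(0, 1) for _ in range(10)] for _ in range(12)]
for _ in range(5):
    population = one_generation(population)
```

In the real system the `user_rating` call is replaced first by a human's rating through the web interface, and later by the online user preference model trained on that feedback.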

3. RESEARCH GOALS

Represent stakeholders' qualitative/unquantified knowledge (wisdom) within the numerical optimization process through interaction.

Participatory optimization algorithms and tools:
1. How do users engage with and learn from such participatory optimization tools?
2. Can we model users' nonstationary and noisy feedback provided through the interfaces to detect underlying revealed preferences?
3. How does the human-centered search process affect the design of watershed plans?

4. METHODOLOGY

WRESTORE (http://wrestore.iupui.edu)

[Figure: WRESTORE's client-server architecture. Thin clients access the web interface over HTTP; arrows indicate data flow and workflow.]

Participatory Optimization: Introspection Loop to Support Users' Cognitive Learning

[Figure: the workflow alternates introspection sessions (I_m, human user) with participatory optimization / human-guided search sessions (HSn_m, human or simulated user), over m loops.]

where
n = number of iterations in the optimization algorithm
m = number of introspection sessions

Participatory optimization algorithm: IGAMII (Babbar-Sebens et al., 2015)

Study Site

- Restoring upland storage in Eagle Creek Watershed (HUC 11), Central Indiana.
- Conservation practices modeled using the Soil and Water Assessment Tool (Neitsch et al., 2005) for a 2005-2008 simulation period.
- Impact of practices estimated by comparing economic costs, peak flows, sediments, and nitrates with a baseline scenario that does not contain any new practices.

5. RESULTS

- 20 participants (students and stakeholders); 2 loops
- Decision variables: cover crops (0/1) and filter strips (0-5 m) in 108 sub-basins
- Quantitative objectives (numerical evaluation): economic costs, peak flow reduction, nitrate reduction, and sediment reduction at the watershed scale
- Qualitative objective (subjective evaluation): user rating of alternatives based on users' subjective assessment of the suitability of practices in their local sub-basins
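These decision variables suggest a simple encoding for one design alternative. The class and field names below are hypothetical, not WRESTORE's actual schema; only the shapes (108 sub-basins, 0/1 cover crops, 0-5 m filter strips) come from the slides.

```python
from dataclasses import dataclass
import random

N_SUBBASINS = 108  # sub-basins in the Eagle Creek study

@dataclass
class Design:
    cover_crop: list      # one 0/1 decision per sub-basin
    filter_strip_m: list  # one width in [0, 5] metres per sub-basin

def random_design(rng=random):
    # Seed an initial feed of designs with random decisions.
    return Design(
        cover_crop=[rng.randint(0, 1) for _ in range(N_SUBBASINS)],
        filter_strip_m=[round(rng.uniform(0, 5), 1) for _ in range(N_SUBBASINS)],
    )

random.seed(0)
design = random_design()
```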

Q1. How do users engage with and learn from such participatory optimization tools?
- By tracking usability metrics (Tullis and Albert, 2013)

[Figure: response times across I_m and HSn_m sessions, where n = 1 to 6 and m = 1 to 3.]

a) Overall response times
- Stakeholders exhibited more variability from De Jong's power learning curve in HSn_m sessions.
- Possible explanations: small sample size, statistical detection of outliers, or stakeholders having a different learning-model structure.
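De Jong's power learning curve models the time for the n-th session as T_n = T1 · n^(−b). A minimal sketch of such a fit, via least squares in log-log space, is below; the session times are illustrative placeholders, not the study's measurements.

```python
import numpy as np

# Fit De Jong's power learning curve T_n = T1 * n**(-b) by least squares
# in log-log space. The times below are made-up example data.
sessions = np.arange(1, 7)                               # HS sessions 1..6
times = np.array([30.0, 24.1, 21.0, 19.2, 18.0, 17.1])   # minutes (made up)

slope, intercept = np.polyfit(np.log(sessions), np.log(times), 1)
t1, b = np.exp(intercept), -slope                         # T1 and learning rate
predicted = t1 * sessions ** (-b)
residuals = times - predicted   # large residuals = deviation from the curve
```

Participants whose residuals stay large across sessions, as reported for some stakeholders, deviate from the fitted power curve.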

Q1 (continued)

[Figure: mean percentages of clicking events and of response times, split between information-gathering and decision-making areas, for students vs. stakeholders across the I1, HS_1, I2, HS_2, and I3 sessions.]

b) Information gathering versus decision making
- Stakeholders spent 16% more time on information gathering than students.
- Stakeholders clicked 19% more in information-gathering areas than students.

Q1 (continued)

[Figure: number of clicks vs. time (min) per session per participant, for decision-making and information-gathering areas, with fitted trendlines (e.g., y = 18.359x^0.9401, R² = 0.6727; y = 28.126x^0.5678, R² = 0.6363); sessions grouped by positive, negative, or no trend in averages of self-reported confidence levels.]

c) Response time vs. number of clicks per session per participant, grouped by trends in averages of self-reported confidence levels
- When users spend more time and make more mouse clicks in information-gathering areas, their confidence levels are likely to increase over time.

Q2. Can we model users' nonstationary and noisy feedback provided through the interfaces to detect underlying revealed preferences?

- Simulated users (avatars) enable extensive search, with each individual user's qualitative criteria represented via user models (online user preference models).
- Which machine-learning algorithm to use? User modeling in environmental problems is still in its infancy!
- Lessons can be learned from HCI, intelligent interfaces, adaptive interfaces, cognitive engineering, intelligent information retrieval, intelligent tutoring, expert systems, etc.
- Challenges:
  - High number of input parameters
  - Varying size of input parameters
  - Limited feedback data
  - Skewed training data
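As one minimal sketch of the preference-modeling idea (the deck's actual learners, including the deep-learning model that performed best, are not specified here), a ridge regression can map simple design features to a Likert rating and be refit each time new feedback arrives. The features, "user," and data below are all synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_ridge(X, y, lam=1e-2):
    # Closed-form ridge solution: w = (X'X + lam*I)^(-1) X'y.
    # The regularizer helps with the limited, skewed feedback data.
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

# Synthetic "user": likes cover crops, dislikes wide filter strips.
# Features: [intercept, fraction of sub-basins with cover crops,
#            mean filter-strip width / 5].
features = rng.uniform(0, 1, size=(30, 2))
X = np.hstack([np.ones((30, 1)), features])
y = np.clip(1 + 4 * features[:, 0] - 2 * features[:, 1], 1, 5)  # Likert 1-5

w = fit_ridge(X, y)
predicted = X @ w   # in practice, refit as each new rating arrives
```

A model like this, once trained, can stand in for the human as a simulated user (avatar) during the extensive-search sessions.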

Q2 (continued)

[Figure: error in predicted user rating for stakeholder SDMs across the m loops. X axis: SDMs at the end of introspection sessions — 1: I1 data (Epoch 1); 2: HS_1 & I2 data (Epoch 2); 3: HS_2 & I3 data (Epoch 3); 4: Epoch 1 & 2 data; 5: Epoch 1, 2, & 3 data.]

- Deep learning tended to perform much better than any other machine-learning algorithm we used.

Q3. How does the user-guided search affect the design of watershed plans?

[Figure: objective-space scatter plots for Participant 1 and Participant 2, plotting peak flow reduction (PFR, cms) and nitrate reduction (NR, kg/watershed) against cost ($/watershed).]

- In objective space, differences and similarities can be detected in the acceptable or unacceptable alternatives found by the users.

Q3 (continued)

"I like it" alternatives:
[Figure: for cover crops, average probability across participants per sub-basin; for filter strips, mode of width and standard deviation across participants per sub-basin.]

In decision space:
- Cover crops: across all participants, 45% of sub-basins have low variability in the probability of cover crops (standard deviation < 0.13).
- Filter widths: across all participants, 69% of sub-basins have low variability in modes (< 1.2 m).
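A summary of this kind can be reproduced on synthetic data as follows. The thresholds (0.13 and 1.2 m) follow the slide, but the arrays and shapes are assumptions, and plain standard deviations stand in for the slide's variability-of-modes statistic for filter widths.

```python
import numpy as np

rng = np.random.default_rng(2)
n_participants, n_subbasins = 20, 108

# Synthetic per-participant, per-sub-basin summaries of "I like it" designs.
cover_prob = rng.uniform(0, 1, size=(n_participants, n_subbasins))
filter_width = rng.uniform(0, 5, size=(n_participants, n_subbasins))  # metres

# Variability of each decision across participants, per sub-basin.
cover_sd = cover_prob.std(axis=0)
width_sd = filter_width.std(axis=0)

# Fractions of sub-basins with low variability (slide thresholds).
frac_low_cover = float(np.mean(cover_sd < 0.13))
frac_low_width = float(np.mean(width_sd < 1.2))
```

Low-variability sub-basins are the ones where participants converged on similar practice choices despite their differing subjective ratings.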

6. CONCLUSIONS

- Interface usability is important!
  - Participants can have the same view for different reasons.
  - Cues are interpreted differently when they occur.
  - Participants can have different views based on the same information (different interpretations).
- User modeling needs to be adaptive to a human's own learning process.
- Putting stakeholders in the loop enables identification of design features in alternatives that are satisficing from the human's perspective.

6. CONCLUSIONS (continued)

- Paradigm shift: optimization based on human-computer collaboration via interaction.
  - The Participatory Web provides an enabling framework.
  - Knowledge and preferences of the community are included.
  - Supports the learning process of decision makers.
- Need for collaborations across multiple disciplines.
- Foundation laid for further investigation of interactive and participatory design methodologies.
- How will cognitive machines solve complex participatory design problems for watershed groups in the future?

Questions?

"… mathematical formulas alone do not produce consistently relevant results. Human intelligence is still a very important part of the process."
(Jimmy Wales, Founder of Wikipedia. Source: Businessweek.com, 12/2006)