
Transcript of Balancing Continuous Discovery and Delivery with Jeff Patton
Jeff Patton: Maybe by way of self-introduction: I don't know if it's true for a lot of people, but my career happened by accident. I'm not sure I did it on purpose. I'm a UI designer who is an art school dropout, and my designs sucked less than others did. But I learned the hard way that software everybody loves, software that looks good, isn't the same thing as software that is good, once I delivered it to actual customers and users. I learned the hard way that, because I have some pride in this stuff and I care about it and I want it to be good, I need to spend a lot more time face-to-face with customers and users. And even in spite of that, I learned that I'm still going to be wrong an awful lot of the time.

The other thing that I learned early on... well, I got pulled into this Agile stuff very early on. In 2000 I got recruited out of my company into a startup, and there was a process there called Extreme Programming; the startup was going to be successful because they were doing that. Now, in 2000 there was no "Agile" yet. In 2001 the term Agile was coined, as an umbrella for things like Extreme Programming.

I was there; I was the guy with the answers, I knew what to do. I'd learned during my first ten years of software development that I didn't know what to build, that I could only learn what to build by spending time with customers and users. But I also learned that no matter what I tell team members, no matter how vividly I tell a story, it is better if I bring them out: better if they see for themselves, better if they feel for themselves. I've spent the last decade and a half now trying to put sensible practices like these back into Agile development. In 2000, it was messy. These days, the good news is that I don't have to spend a lot of time ranting about how to do it or whether we should do it; there are a lot of people doing it really well. The bad news is that it is still freaking messy. And that's what we've got to talk about.

I'm talking about product discovery, and I want to differentiate that from design, or at least from the design process I learned. During the 90s, I did what seemed natural. In 2000 I started being more disciplined about what design meant, and I learned what I might call traditional design. In traditional design, I might start by framing things: basically understanding the business and who the users were. I might follow that with some amount of research, and from the research I'd understand what people's problems were. I might bring that research back and synthesize it, and I might make sense of that synthesis with things like a persona. Then I might start to move from a deep understanding of the problem to ideas, solution ideas. Having those, I might build a prototype, get out there and test that prototype, and then, hey, we build this thing.

So that's what I learned: there are two kinds of research with customers that I do, and there's a fair amount of work that we do inside the house.

Now, I don't know about you, but this never worked for me. Everybody always asks, and I always hear people asking at conferences, how many users do I need to talk to? I hear lots of answers, and I don't know what the right one is: four or five, more than you'll ever be given time to talk to, or a lot more than you have time to talk to.

And how long should this process take? Longer than you're given. You've never got enough time. For me, I spent most of the 2000s feeling guilty, because I knew the right way to do things, and as a product person I just didn't have time to do them. There wasn't time.

So you try to skip all this stuff, skip what you think the right thing is, learn the hard way, and after skipping the wrong things enough times, you end up with what I'm going to call a contemporary design process. It works a little bit like this. It starts with the same kind of framing: you have to know what kinds of problems you're solving and who they're for, and there's some kind of demand to change the product. And then it continues with this thing that we've been doing pretty well for a long time, and that's guessing.

Now, the guessing for me starts with guessing about what the problems are, and it moves from there to guessing about what the solutions are. There are two parts to the guess that we're building on. Now, "guessing" is sort of an incomplete description. If I've been working in a product company for a long time, I've actually been talking to my customers a lot; I've built a lot of software, and I've spent a lot of time building deep understanding and empathy for people. So I've got this weird situation. The way I was taught, there was data, gathered through research, and that was good; and then there was the stuff I just didn't know, which was the opposite: there I was either dealing with guesses or making stuff up. So I was either guessing or I had data.

What I learned over time is that there's a bit of a continuum here. If data is intentional and gathered rigorously, and guessing comes from zero experience with people, then a lot of the time we've got information, built from lots of experience and anecdotal stories, that falls somewhere in the middle. And I've learned that what we understand about a problem is a mixture of guesses, facts, and assumptions, and even the guesses are educated guesses. So they're not quite as bad as I thought, and the solutions we get from them aren't so bad either.

Now, the right word for these guesses these days is assumptions, and there are lots of assumptions. One of the things I've learned, through trial and error, is that I'm likely wrong. So I might build a list of all the reasons why I'm likely wrong. I'm going to call this a list of assumptions. Really it's my list of assumptions and guesses and just outright questions, stuff I didn't know. And at the top, I want the riskiest ones. It's at that point that I've got a decision to make. For me, this is a bit of a Dirty Harry decision; this is the "do you feel lucky?" question. If you feel lucky, go ahead and build it and ship it, ship it at scale. But if you're not so sure, you've got to decide what you need to learn: what you need to learn to test your guesses and assumptions, and to really test whether those solutions are the right things to do.
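If it helps to make the idea concrete, that ranked assumption list is easy to sketch in code. This is purely illustrative; the fields, the example assumptions, and the impact-times-uncertainty scoring are my own assumptions, not anything from the talk.

```python
from dataclasses import dataclass

@dataclass
class Assumption:
    text: str
    impact_if_wrong: int   # 1 (minor) .. 5 (kills the product)
    uncertainty: int       # 1 (near-certain) .. 5 (pure guess)

    @property
    def risk(self) -> int:
        # One common heuristic: risk grows with both impact and uncertainty.
        return self.impact_if_wrong * self.uncertainty

# Hypothetical assumptions for some product idea.
assumptions = [
    Assumption("Users will enter data daily", 4, 5),
    Assumption("The export format matters", 2, 2),
    Assumption("Admins approve every change", 5, 3),
]

# Riskiest assumptions first: these are the ones to test before building.
for a in sorted(assumptions, key=lambda a: a.risk, reverse=True):
    print(a.risk, a.text)
```

The ordering, not the numbers, is the point: the list tells you which guess to test first.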

Once we've figured out what it is we need to learn, we can figure out how to test it, or, as some of you know where I'm going, how to measure it. That may be watching people work; it may be putting a prototype in front of them. Once I figure out how I'm going to measure, I can figure out what I'm going to build. Some of you recognize that: build, measure, learn. Once I understand the riskiest assumptions, I can work my way through them with that loop. And once I've learned something, I come back with data; part of that learning is data, and it's that data that lets me update or change my assumptions.
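That build-measure-learn loop can be written down as a tiny control loop: keep testing the riskiest assumption, fold what you learn back into the list, and stop when you feel lucky enough. Everything here, the function names, the scores, and the confidence rule, is a hypothetical sketch, not the talk's own material.

```python
def build_measure_learn(assumptions, run_experiment, confident, max_cycles=10):
    """Cycle on the riskiest assumption until we're confident enough (or stop).

    assumptions:    dict of assumption text -> current risk score
    run_experiment: tests one assumption, returns its new risk score
    confident:      decides whether the remaining risk is acceptable
    """
    for _ in range(max_cycles):
        if confident(assumptions):
            return True   # feel lucky enough: go build it for real
        riskiest = max(assumptions, key=assumptions.get)
        assumptions[riskiest] = run_experiment(riskiest)
    return False          # still too risky: maybe a no-go

# Toy run: each experiment halves the risk of the assumption it tests.
risks = {"users want daily entry": 20, "export format matters": 4}
ok = build_measure_learn(
    risks,
    run_experiment=lambda a: risks[a] // 2,
    confident=lambda r: max(r.values()) < 5,
)
```

The loop illustrates the same point Jeff makes: each trip around changes the assumptions, which is why you can't plan the trips far in advance.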

What feels like contemporary design, or what we might talk about as Lean UX, is for me a process of building better guesses. These are my guesses, both about the problem and about the solution. And when it comes time for me to measure this thing and I'm out there working with customers and users, there's a mixture of things I have to do: there's the generative research I was taught, understanding how people work today, and there's the evaluative stuff that lets me critique what I've built. I've learned over time that every time I go out, I have to do a mixture of both, where part of the time I'm talking with users about the way they work today, and the other part of the time I'm talking with them about what I might build.

I was told never to do that, because if I go into an interview or talk to customers with a solution in mind, I'm going to inject a lot of bias, a lot of bias into the way I interview. And that's true; you will. But if I'm in a product company, I make one product edition after another. I've talked to users for a very long time. I'm not a design agency; these are people I've talked to a lot, and the bias has always been there. Recognizing that, we get nice mantras coming out of the lean community, like "design like you're right, test like you're wrong." When I'm thinking through these solutions, I want confidence that everything I have is true, and I do the best job I can with design. But when it comes down to testing stuff, I'm going to put my skeptic's hat on and work really hard, work really hard, to prove that I'm wrong. It takes a little discipline to do this. I was always taught to avoid leading questions, and now I use leading questions; or rather, the way I do leading questions these days is by using misleading questions. If I think my solution is good for a certain reason, or I think users have a certain problem, I'll use leading questions to try to get them to say that they don't. So when I talk about a lean process, I'm thinking of this: "think, make, check" comes out of the Lean UX community, and "build, measure, learn" comes out of the Lean Startup community, which was founded more by developers, which is why it starts with build; the UX community's version starts with thinking. This is the work that I'm doing.

Now, there's a term that's been socialized for a little while called dual-track development. Has anybody heard this term before? Just a few people. So maybe it's best I don't even talk about it. There's a reason it came into being; actually, for the people who have heard of it, it's because people started talking about it, and one of the people who helped socialize the term is a guy named Marty Cagan. Marty Cagan is a good name to pay attention to in the product world. He's written a product book, and he came out of digital product management: he led product management at eBay for a number of years, and he came out of product management at HP before that. He's been an advisor to a lot of companies. One day, talking to Marty, I was trying to explain how we do this stuff in Agile, and I put a model up on the screen, and he said, well, what do you call this? I said, well, I don't know, this is just the way that everybody does it. The model was in a paper from a lady named Desirée Sy. When I first met Desirée, she was at a company called Alias, which was later bought by Autodesk. As a UX person, she was describing the way she had to work on Agile products. Find the paper and read it, and you'll see the way we work today described, and it still holds up super well, even though that paper goes all the way back to 2007 and describes the way she was working when I first met her, in 2003 or 2004, something like that. The stuff she describes is the stuff we're still figuring out now.

This is a model she drew, where there might be a little bit of a boot-up: taking what I believe is true about my assumptions and my problems and coming up with a little bit of a solution, and then one track that's taking on development and one track that's taking on interaction design, the more detailed design, with lots of back and forth and hand-offs. From the paper, I've pulled out a little section that says these things are depicted as dual tracks, but she was trying to make the point that this isn't two separate teams, and the tracks aren't functioning independently; there's lots and lots of communication going back and forth between people.

So, about dual track: if you've been working in an Agile development environment for a while, you're aware of how that works. We've got these short cycles that focus a lot on velocity, but what we mean by that is development velocity. They focus on predictability, and for the team to be predictable, that will often push you to have your design super refined before the team starts. Or you get these other little pushes saying you're not supposed to design until the sprint starts, and that works against predictability. And we talk a lot about quality; we want everything built and tested, but there's pressure to finish things within the short cycle, and that makes it super tough.

So there are two kinds of quality here. There's the stuff that's a little more objective, the stuff that lets you test whether it matches its design or has bugs. But if we're talking about quality of usability or quality of aesthetics, that's not often thought about here.

Agile development focuses on those types of things. But the work I was talking about is discovery work, and when we're doing discovery work, well, it's not so predictable. We're trying to validate our ideas. We're not trying to build fast; we're trying to learn fast. We're focused on learning velocity. And we can't predict well; it's hard to describe, hard to lay out what we're going to do day after day or week after week, because if I go around one of these build-measure-learn loops, what I learn will change what I believe, will change my assumptions, and will change what I do next. If I can get around this loop in a few days, or even a few hours, then what I do on one day determines what I do on the next day. I can't plan this stuff two weeks ahead. The best I can do is time-box it, say how long we're going to spend on it. And every time around this loop is where I get a chance to make a decision, a chance to assess the risk, to ask if I feel lucky, and to decide how much more I'm going to invest, or not invest, in this thing.

These two kinds of work have two different ways of working, and the problem is that they have to work side by side.

When we're talking about dual-tracking, what we're talking about is the idea that... sorry, that's driving me crazy.

We pull those ideas or opportunities in, and the first cycle around may have us framing a problem and a solution, assessing risk, and figuring out what the first thing we can test is, or whether we even need a first test. The first cycle around may take a little bit of time. And by a little bit of time, I really do mean a little bit: a little bit compared to the phases of work I used to do. A little bit may be measured in days, not weeks, because I don't need strong evidence to figure out what my biggest risk is, and I can test pretty quickly. Then I move into these fairly short cycles of testing. At some point in time I can decide: I feel strongly about this thing, it's ready to build. And then I can produce what in Agile development gets called the product backlog. The product backlog now contains things that I'm confident in building, or confident that I should release. We might call that a release backlog.

At that point in time, I can move into these different cycles that focus on predictability and quality. In Agile development, those are typically two weeks long, and they feel kind of long and ponderous compared to the quick stuff you're doing up here.

The annoying thing is that while you were doing this stuff up here, the team was building stuff, and you've still got to pay attention to what they're building. And once you get done with this piece of discovery, well, you're going to have to pull in the next piece and do more cycles. Now, this looks messy, and it is messy. In talking with people like Desirée, she'll draw the model and explain the back and forth, and by the time she's done, people's heads are spinning, and they'll say, that looks awful. And she keeps trying to say, it's not so bad, it's not so bad. It is bad. It is bad.

What I've learned is that we do need to focus on both discovery and delivery, and that we can't do this stuff in phases. Phases are not going to keep up with the rate of demand, the rate of change, in a digital product world. So it is continuous discovery and delivery that we're focusing on.

Now, before coming here, I wanted to give some advice and a bunch of tips, and I wanted to use a bunch of examples, but I decided at the last minute to back off. Honestly, it has been a while since I've been a product manager and designer; it has been since the late 2000s. So I want to tell stories about the people I'm working with now, people who are making this work, who are doing a better job at this, who are really embracing this stuff. A lot of the language has changed, too. When I use words like Lean UX and Lean Startup, those are contemporary words; it wasn't until around 2011 that Lean Startup was a common thing, and it was a few years after that that we got Lean UX.

This person's name is Belinda. Belinda works at a company called Hightower in New York; they do real estate property management stuff. They're a fairly small company, somewhere in the fifties or sixties of people. What's interesting to me: I met the founder in an innovation lab and he's super strong, and I've been working with different product managers there for a while. What's interesting about Belinda is that she's young; she's been working in product software development for five or six years, and she only knows a world where we work in short cycles. And it's not really fair to show a picture of just Belinda, because she only knows a world where she works closely with a UX designer and a lead engineer. The quality of the relationship she has with those people is critical to their success. She knows she doesn't make decisions and hand them down. If there's one thing to pay attention to in Agile development, it's that the crap about the product owner making decisions, or the product manager working alone, isn't true.

I mentioned this guy named Marty Cagan. Again, he came out of HP. He went from HP to Netscape, and when browsers started being included in operating systems and that whole market exploded, he went to what was then a newish startup called eBay. He was the third product manager hired at eBay and helped eBay grow big. Marty is an expert on digital product management. Digital products don't behave the same way as traditional products, where we design them, package them, ship them, and get only one event to get it right, with not very long to correct problems.

By digital, I guess I don't mean completely digital; I mean software, mostly. A lot of traditional products enjoy some amount of growth and maturity and eventually decline, and we might replace them with another product or another model. But for digital products, we want to keep them alive forever.

If I asked you when the last big release of Netflix was, or of eBay, the question doesn't quite make sense. They make lots of continuous changes; they add things and pull things back. Digital product management works a bit differently. If you were a product manager working with Marty, he would tell you: you're responsible for the success of this product. And product success, or what the right product is, is an intersection. If we draw a diagram, and you'll see this is slightly different from the one that gets drawn a lot, the right product is an intersection of what is valuable, and by valuable we mean valuable to our organization: it is worth paying for, it is worth investing in, we will get a return on our investment as a product company. Then he puts usable here. I see variations of these models with desirable here, but look: we can come up with cool feature ideas, but if people can't incorporate them into the way they work, they're no good. For Marty, when we're talking about things from a user or customer perspective, it's the intersection of valuable and usable that makes things desirable.

For things to be usable, we need to understand users, how they work, and what they'll incorporate into their work practice. And then, because any idiot can come up with super cool ideas that we can't afford to build, these ideas need to be feasible to build in the time that you've got.

Now, to hit the center of this, you need to be an expert on your organization and your industry, its vision, its strategy, its customers, its users. You need to be good at understanding how customers and users work. You need to be good at thinking of solutions that will help them. And you need to understand the technology you're building on; in fact, you should understand it well enough to know how to code, to understand the code you've got, and to understand what's possible from a code perspective. That's a tall order for a single person, and the reason Marty frames things the way he does is that it would be nuts to expect all that from a single person. Marty pushes hard on this partnership of a product manager, a user experience person, and a lead engineer.

Marty refers to this group of people, and looks for a product manager to lead this group, as a product team. I'm going to show pictures in a minute of a friend of mine who is a product manager at Atlassian. How many people use Atlassian products? They make Confluence, things like that. The first time I walked around Atlassian, he was giving me a tour, and he pointed to a group of desks and said, this is where the triad sits. What he was talking about was this core product team. They used the jargon "triad", and that surprised me. It didn't surprise me because it was the first time I'd heard the term; it surprised me because it was the second time I'd heard the term, and I had heard it just weeks before at a company in South Carolina, where the person in charge of product there used the term to talk about her core team.

I was curious: Atlassian is based in Sydney, Australia, so I was curious how the term got to the southern hemisphere so quickly. He said he had been using it for years, and he said his director of product got it.

Now, I've been to companies like Spotify, and they use the term "trio", because "triad" sounds too much like the Chinese mafia. And there's a company I worked with in Seattle that has a list of team members showing the teams, and next to multiple team members would be this abbreviation, "3". What starts getting confusing is when the team of three contains two people, or four people, or sometimes five people; it refers to the three concerns, not always to three people. And finally, we used the term nucleus. But if you watch the show Silicon Valley, you know you can't use that anymore.

The thing is, for as long as there's been product management, at least for strong digital product managers working with commercial products, we've known that this partnership between product, design, and engineering is important. One of the super annoying things about Agile development is that the partnership kind of got beat out of the way the process gets represented. But again, people like Belinda don't know any other way to work. Belinda sent me this picture yesterday because I said I wanted an informal picture of her and her team. Rebecca, the lady in the middle, takes kung fu lessons, and Belinda said, I've got this picture handy of us taking kung fu together. Now, that's her team, and her team is small enough, and I've talked with them a lot over phone and Skype, that they're all aware of this discovery stuff, because for half of the team, or at least for the core, it's a full-time focus.

Now, I drew that dual-track thing for you, and I wanted to make it scary. But now I want to make it a little scarier. I have to draw something that I don't quite know how to draw; I'm going to refer to it as juggling. One of the things I've noticed about Belinda, and about multiple people at Hightower, is that they're super good at juggling.

By the way, when I saw that picture: those are instructions on how to juggle three balls. I can't tell what's going on in it. And I'll tell you, I'm about to draw a diagram that I can't follow either.

So I'm going to keep iterating on this drawing until I have something obvious, but not today.

I mentioned that Belinda is pulling down ideas or opportunities. She may start with an idea, or, like at most other companies, the ideas often come as mandates from sales and marketing, because they've got demand from particular customers: if we're going to get these gigs, we need these features or capabilities. And Belinda goes into a phase of unpacking and understanding that idea, a phase of building small prototypes, building a few things, and through lots of iteration she'll finally get to a point where she can make a strong go decision. And go means that we should build it.

What's interesting is that Belinda will often make no-go decisions. Those are tough. By no-go, she means: no, we should not build this, and here's why; we should build something else, or nothing at all. Saying no to what gets presented as business requirements, saying no, that's something they do at Hightower. Now, go doesn't mean she's done and ready to build. Go may have come as a result of doing just enough user research to be confident, enough prototyping and design to be confident. She hasn't designed every nook and cranny of the thing; she's designed it at the level necessary to validate that this is the right thing to build, and there's a lot we don't have to validate. But now her team needs to dig in a little bit deeper. Terry is the UX person there, and Terry needs to focus on doing more detailed design.

They've got to do more detailed design to see how things look and how things work, and Terry is going to do all of the things that still need to be validated or tested. One of the things she's learned is that there are things that just aren't that risky: where usability is the concern, they can test with internal people, beat out the usability issues pretty well, and even correct them after the fact.

So Terry starts designing in short design cycles. One of the last conversations I had with him was him talking about how many cycles of designing and testing he had gone through that week, for usability; the big risk is already out. But at some point in time he's done. Now, the annoying thing here is that while he's doing this design, at some point he'll start dropping things into those slower delivery cycles. He's feeding the team with details, and details come in late, while they're working on the development. And he may be doing that a lot. He may be doing that concurrently.

At some point in time, this thing is really done, and they'll ship it, and now the next kind of work starts. The thing is out there, people are using it, and there's a kind of longer cycle that's focused around measurement. Measurement means two things. For Belinda, as a product person, she's always known: look, we're making assumptions about what people need and want, and at the end of the day, it's whether they actually use it that matters a lot.

Don't look at this too closely, because I just snapped it off the screen while we were talking over the phone, but that's one of her dashboards, showing the usage of a particular feature. She's measuring at the individual feature level and comparing it to other features in the product. In her head, she's got a target for what usage should be, set before she shipped it, and she's measuring whether the feature is actually hitting that. And because Belinda knows that data will give you facts, that data will tell you what happened and not why (a friend of mine, Chi Miller, talks about data being exhaust, not fuel), measurement for Belinda doesn't just mean looking at metrics. Measurement always also means face-to-face conversation with customers.
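Belinda's per-feature check, comparing actual adoption against the target she set before shipping, can be sketched like this. The feature names, targets, and numbers are invented for illustration; they are not Hightower's.

```python
# Hypothetical per-feature usage check: names and numbers are invented.
targets = {          # fraction of weekly active users we hoped would use it
    "lease_export": 0.30,
    "stacking_plan": 0.50,
}
observed = {         # what the analytics actually show
    "lease_export": 0.12,
    "stacking_plan": 0.55,
}

def usage_report(targets, observed):
    """Return (feature, gap) pairs, sorted worst shortfall first."""
    gaps = {f: observed[f] - targets[f] for f in targets}
    return sorted(gaps.items(), key=lambda kv: kv[1])

for feature, gap in usage_report(targets, observed):
    status = "needs a why-conversation" if gap < 0 else "on target"
    print(f"{feature}: {gap:+.2f} ({status})")
```

The "needs a why-conversation" branch is the key bit: the metric only flags the shortfall, and the why still comes from talking to customers.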

This stuff may go on a little while, until she's really confident the thing is done and she can leave it. There's a concept used in Agile development, a silly phrase Agile users use sometimes: done done. Has anybody heard the term done done, or does it get used in your organization? Yeah, a few. Done done means it is done: the first done means built, the code is written; the second done means it has been tested. Belinda is using done, done, done. The last done means the feature should stay in the product, that it's really good. And that takes a long time; that's the super annoying part, it takes a long time after we've shipped it.

All of these black lines, for me, are discovery work, and there's a lot of this stuff going on. And just like I told you before, when I first drew that dual-track thing, this is where the juggling comes in.

When something switches over to being ready for detailed design, Belinda picks up the next opportunity and starts to do early validation on that. At some point in time that might pass over too, and this edge may always be wrong: Terry, who is doing design, might be doing design on multiple things at the same time. And that sucks. And because features are different sizes, a feature may drop into the product at a different time, and we get this nasty overlap where a feature will drop while another is still in its stages of measurement. So at any given time, Belinda is trying to juggle three features, and she's gotten pretty good at it, good enough that she doesn't recognize it; other people will say this is super hard, and she doesn't know any other way. She always has a feature that is new, that they're doing the more formative, should-we-build-it discovery on; a feature that is in design, or "on deck" as they might call it; and a feature that is shipped and being measured. Now, this cycle is long enough as it is, and it actually takes discipline to work on just three. As companies start to do this dual-tracking stuff, they start to see the lag between the should-we-build-it kind of discovery and the design sort of work, and things start to stack up or overlap, because there's always pressure to start one more thing. Belinda spends a lot of her time saying no, we're not going to do one more thing. She spends a lot of time pushing back against working on more than one more thing. The overlap is always there. So I'm not going to draw this model very often.
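Belinda's three-at-a-time juggling, one feature in should-we-build-it discovery, one in detailed design, and one shipped and being measured, behaves like a pipeline with a work-in-progress limit of one feature per stage. The sketch below is my own illustration of that discipline; the stage names and the simple advance rule are assumptions, not the talk's model.

```python
STAGES = ["discovery", "design", "measuring"]  # one feature per stage, max

def advance(pipeline, new_feature=None):
    """Shift each feature one stage along; features leave 'measuring' as done.

    pipeline: dict of stage -> feature name (or None)
    Returns (updated pipeline, feature that finished measuring, or None).
    """
    finished = pipeline["measuring"]
    updated = {
        "measuring": pipeline["design"],
        "design": pipeline["discovery"],
        "discovery": new_feature,   # everything else gets a "no" for now
    }
    return updated, finished

pipeline = {"discovery": "A", "design": "B", "measuring": "C"}
pipeline, done = advance(pipeline, new_feature="D")
print(done)        # C leaves the board as done-done-done
print(pipeline)    # {'measuring': 'B', 'design': 'A', 'discovery': 'D'}
```

The WIP limit is the whole trick: a feature can only enter discovery when one leaves measuring, which is exactly the "no, we're not going to do one more thing" that Belinda keeps saying.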

Now, I wanted to go on with Belinda's story. I wanted to pull up a picture of the website for Hightower so I could show you it's a real company, that they are really out there. And I pulled up the website, and I saw this big guy who looks like a stereotypical real estate property manager, and I saw multiple people there. And then I looked closer and said, wait a minute, I know those people. I've been talking to Belinda for a long time, and she talks about these people all the time. These people did not come from Shutterstock; they're not Getty Images. They're real people that she actually works with, people that she knows face to face. Because she builds enterprise-class products, Belinda does a couple of things that might seem counterintuitive. Let me give you one more that kind of helps with all of this. When Belinda ships and starts to measure whether things are working, she still isn't confident that the thing is right, that she knows. So Belinda does this thing where she ships products or features or releases that she knows don't work. Now, by "she knows won't work", we don't mean that there are bugs in it, but she knows it's incomplete, it's not perfect.

Now, I would ask you: if you had something you knew was not good enough, who would you ship it to? Would you ship it to your best friends or your worst enemies? The trick here is to ship it to your best friends. And those people in those pictures, after I saw them, I know them as Belinda's best friends, because they're what Belinda refers to as her CDPs, her customer development partners.

Customer development partners fit a special profile. Some of you may have seen a chart like this before. This is the crossing-the-chasm curve, where we look at the number of users adopting a product over time. In the middle we've got this big hump that is the majority — the early majority and the late majority. But over here, to the left, we've got these people that are early adopters. Early adopters are the people that will try a product first. If you talk about a cool new feature and a customer asks you, "that sounds great — who else is using it?", they are not an early adopter. Early adopters use it before other people use it. Early adopters will use rough, unfinished things, and that's where the chasm comes in — some things get killed because they just don't make it across the chasm to the majority of the market. They die quickly. I know this because my wife buys a lot of crap on Kickstarter. My garage is full of this stuff. Yeah, my wife is an early adopter. If I look at this broad base of all possible users, your customer development partner comes out of the early adopters. This is a behavioral characteristic. When I show pictures of these people, these are the people that want to be involved with validating this product. And releasing to the early adopters isn't like a beta release, where we think it's really good and we're going to release it to a small group and make sure it's okay before we release it to everybody else. No, this isn't just a last stop before general release. We don't ship crap to them and call it done — we ship stuff we're not sure of, and we expect them to adopt it, actually use it, and give us feedback. We expect them to talk to us every single week, we expect them to take multiple releases, and we're going to keep iterating until it's right.

So Belinda is pretty strong at that stuff. There's a phrase that comes from people who do this — who release to small subsets — and it is "nail it before you scale it."

She'll do that. And that's what's different for me, what I learned over time: when I learned traditional design, I believed that if I did this stuff right, I would nail it every time. And I've got to tell you, it has taken me 20-plus years to realize I'm not going to, and we don't. What we're failing to predict isn't whether we can design it or build it well. What we're failing to predict is whether people will actually use it the way we think. It is their behavior that is not so predictable.

Now, let me see if I can summarize here. So Belinda is a strong juggler. She's always going to be working with a few different opportunities at a time, but trying to constrain that. She's super strong at understanding tactical metrics. I see a lot of organizations where data analytics is another group; for Belinda and her team, analytics is a core competency that's baked in. I just realized there's one drawing that I need to draw here — that point about scaling prototypes, one of the things Belinda is super good at. For her, prototype doesn't mean what you might think it means.

So there is this quote I love from this guy, Bill Buxton. When we talk about prototypes, we might have this continuum that starts with paper, moves to something that's working, maybe moves to something that's built out of code. But I remember years ago hearing this from Bill in a talk: the distinction between high fidelity and low fidelity is stupid. There's just right and wrong.

And Belinda is making choices about a prototype. The choice is a balance between her confidence that she's right and the cost and time it takes to produce these things. Now, as with all of these pseudo-charts I draw, the way this one works is there's a line going up the center — but don't picture it as a straight line so much as a narrow ledge you're walking across. At any given time, if my confidence is low, I want my spend to be low; I want to spend time just doing interviews. I might move up to doing a paper prototype. And at the very top, we've got production software.

The challenge for a lot of organizations is finding the space in between — maybe something like a test in between. There's a zone in the middle here where the rate of my investment in a prototype makes sense. But look, if I drift out of this zone on one side, I spend a lot of money building stuff that I'm not confident in — that's the stupid and expensive zone. On the other side, well, it's stupid also, but it's the stupid and just-wasting-time zone: if, as a design group or a product group, my only way of doing things is interviews or paper prototypes but I'm super confident, then it's worth upping the fidelity. Everybody knows there's a limit to what you can learn from paper — oftentimes you can learn more than you think — but I've seen people's behavior change strikingly once you get something that has, well, what my friend Marty refers to as a live data prototype.

Something where there is real data behind it, even though it isn't fully production-ready. When Belinda ships to a CDP, it's farther up here, but she knows that her CDP pool is only six people. It doesn't need to scale quite as much; the rigor is slightly lower because she's not shipping it to hundreds of people.

Scaling prototypes based on her confidence level is something that she's good at, and as a rule, Belinda is not going to ship something generally into the product until she's sure she's nailed it.

Now, let's see if I can hit the last couple. We've got a lot of basics down here that make this stuff work, and I want to pull this back together. This guy's name is Sherif. I've known him for a lot of years, and he works at Atlassian. Now, they make Jira and Confluence and a lot of tools that are used by Agile teams, and it is always both insightful and troubling to see how they do not use their tools the way anybody else does. When I walk around, their walls are absolutely covered with the evidence of their product work. And I don't have a cool slide for this, but I asked Sherif — I called him last week in preparation for this — and I lifted a few quotes from him. For most people working with Agile development — actually, let me hit his quotes: "Nine years ago I was prioritizing every single bug in the backlog and spent a lot of time in Jira, but look, that just didn't scale. I don't drag and drop stories up and down the backlog anymore; the engineers do that. For some of the teams I work with, I haven't seen a Jira board. The teams figure out how to break stories down. The teams work with acceptance criteria. And all of this discovery work — all of the work we do to plan interviews, do interviews, create prototypes, things like that — I don't create Jira tickets or stories for that anymore."

If you're working in a product team — you're a product manager, you're working with other product managers, or you're a UX person working closely with a product manager — for some product managers, their life is wrangling bits of things in Jira. And what's interesting is, well, that's not the best use of their time, and you probably feel it.

One cool thing about Atlassian is they're not a slave to their process. Now, I'm not going to play this — I don't want to listen to the whole thing, and I didn't check sound before getting here — but you can find short videos online where they describe how they do product management. I've only given you about 30 seconds of it, but you can see, in the way they describe product management, it's the team that reports, the team that directs. One powerful thing that Sherif said is that they always used to involve developers in interviews, but he said, look, now it's a rule: I just don't do customer interviews without a developer in the room. They need empathy. They need insight. And that makes a big difference.

Look, when I first learned to use personas, I learned that personas were built out of rigorous research and data, and it took a competent team to create a good persona. Well, on the other hand, they always work from the perspective of personas. These quick, lightweight persona hypotheses — things built out of facts and assumptions — are built together with teams. They have a lot of standard, well-researched personas, but for every feature, they will pick up one of the standard personas and say: let's look at this feature from this persona's perspective. Let's talk about their needs relative to the feature we're talking about, their challenges, and how this feature needs to be for these people. These kinds of discussions are key discussions if you need empathy, and they're fueled not by team members who don't know anything, but by team members who have been talking with their customers and users.

A lot of people here have heard the term design studio or sketchboarding. This idea of involving a whole team in sketching solutions is super common there. Every team member, almost every single week, is expected to participate in these activities where, given a problem, we sketch, test, and talk about possible solutions. That is a picture of Sherif, and there are the results of a design sketching session.

Now, if you're working in Agile development, some of you may be wincing, because the idea of telling your team they need to spend an hour or two a week on that kind of stuff is not going to fly in your environment. At Atlassian, it is expected. It is desirable.

This is a team at a standup meeting at Atlassian. And what's interesting about their standup meetings is they do not take place in front of a task board. I first took this picture a few years ago, when I saw a few teams doing this, and in talking to Sherif last week, he said most teams that he works with do this now. The standup meeting is in front of the user journey for the feature they're working on, and it isn't "what ticket did you work on yesterday and what ticket am I going to work on today." It's "I'm working on this part of the journey here and these are the challenges I've got." They keep their eye on what's going on in the product the whole time. And everywhere, every team has their own board showing data. As I was walking around this one — on that board are tests, things they have released into production to a subset of their market, because they're going to nail it before they scale it. They let those things run until they're confident in the data they're getting. But what I learned from Sherif is that discovery work is a whole-team responsibility. In fact, one of the things that drove me nuts is he wouldn't call it discovery work. He didn't even think of it as a different class of work. It was just the work — sort of like when you go to China, they don't call their food Chinese food, they just call it food. They don't call it discovery work, they just call it work. They don't hand-wring about velocity: if we're learning fast enough and then building fast enough, we're moving fast enough — those things go hand in hand.

And the work it takes — well, that backlog isn't for tracking discovery work. That backlog is for tracking development. We track development once we've decided we know what we're going to build, and the best people to do that are, well, the people building it.

I'll draw one last picture — I want to leave just a few minutes for questions. One quick picture of this guy Archie. I'll talk about what he's doing and we'll knit this together. Archie runs UX — when I first met him he ran UX for a team, and now he works at CarMax. One of the things I get from Archie is, well, if you go to Atlassian, they're stuffing everything into Confluence, but when you walk around CarMax, everything is just straight out on the wall. We were joking a while back, and now it has caught on as their term for this: Archie builds crime boards. All of us have seen detective shows — I've watched a lot of them — there's always a bunch of suspects, things come together, and it all stays on the wall. And never, not once, have I seen the detectives say, quick, we've got to get this stuff into PowerPoint. This stuff is visible and it helps you move fast. I see a lot of these boards, and it can be hard to know what's going on there. But I pulled this out: he has standardized the way they debrief from interviews, made it simple so that any of the team members who sit in on interviews can do it quickly. These two-by-twos for debriefing are on chart paper so they can look across them. That's one round of interviews that took place in one day — one, two, three, four, five interviews that day. He keeps these on the walls because it's common for stakeholders to stop by and ask about what they're currently working on, and they need to be ready to point and talk through their crime board.

I won't explain what a hypothesis statement is — look it up in the Lean UX stuff — but these hypothesis statements frame what they believe is true, what they believe they're going to build, and what they believe the results of doing it will be.

Again, because they're juggling too, they've got metrics from experiments that are currently running, and those are kept updated. Their personas are similar to the persona style we see in Lean UX today, but because they know they're built out of a mixture of hypotheses and assumptions, you'll see lots of crossed-out stuff and sticky notes on them — every round of discovery they do ends up updating or changing them. These are not the beautiful personas I used to make. And then interview results. I see various crime boards around the shop. One of their tricks, as you can see there, is personas on the left-hand side, with the rest of the space for flip-chart paper — they'll photograph things and print them out at eight and a half by eleven so they can get more on the board.

What Archie has done is — they really do use this stuff, but they have made it their own. So let's pull this together. They worry about visibility. You hear the terms visibility and transparency a lot in Agile development, but, well, they need to extend to the discovery work. A discovery task board is common and visible: if we know what opportunity we're working on, the list of stuff we need to do — interviews and prototypes and design sketching sessions — goes on a task board, and we track what we're doing and what's done. Those things are visible, and that kind of board is a big thing there, whereas at Atlassian you see stuff in Confluence. It's interesting — a lot of it is in Confluence, and a lot of it is sprayed all over the wall. But what changes a lot about Scrum are the ceremonies. The big three are the sprint planning session, the standup, and the review. Now, in every one of these, we've always talked about development. For Archie, and for people doing this, they recognize that it's not just about development. Every standup and every planning meeting starts with a discussion about discovery. The goal of that discussion isn't to plan the details of it; it's to talk about what we're trying to learn, and, well, to set a time box — because we need to budget time for the whole team to participate, and we need to pull that time out of the development budget, because it's going to lower velocity. Every standup meeting talks about development and discovery, but one of the things they — and a lot of people — have learned is to separate these two subjects, or put them back-to-back. We talk about development first, because there are more people doing that, and we talk about discovery second. They're two different time horizons; those tracks are different, the way of working is different, participation is different. And then at the reviews, the rule is to talk about discovery first — what we learned and how it's going to affect development. Look, if you've got stakeholders or others in your company, one of the things I've learned from people doing this is, first off, stakeholders generally like listening to discussions about discovery, and a lot of the time, the only thing that will trump their opinions is evidence from the real world — what your customers and users say. The other thing I'd say is that after we've spent a lot of time building it shouldn't be the only time we get customer feedback.

So my key point out of Archie is: look, we've got to keep this discovery work visible. And don't be slavish to the way Agile development is supposed to work. It is your job to make Agile development work for you.

Now, for the designers in the room, this is a quote I heard from Leah Buley. We were talking here in Boston years ago, and she said this over a beer — I don't know if it appears in her book. But she said, look, design isn't a product that designers produce. It's a process that designers facilitate. She was hinting at and describing, at the time — she was at Adaptive Path and then Forrester, and I think she's just left there — they've been driving towards a process where design is a facilitated activity. But this is not a democracy, so the complementary quote is Leisa's. Leisa Reichelt's blog, Disambiguity, is filled with lots of great articles, and she's done a lot of cool stuff over the last several years, first with the UK government, and I saw her in Australia this summer, where she's working with the Australian government. So this is not design by committee. Design by community is not the same as design by committee. Everybody is involved, but at the end of the day, you fall back on that core team, that strong partnership between design and senior engineering and product leadership.

At the end of the day, all of this juggling stuff is difficult. And the only way to make it less difficult is to not take it all on yourself — it is to pull everybody together to do this. It's to recognize that good product design, and the work we do to validate whether we're building the right thing — that discovery work — isn't stuff we do and then hand off to other people. It's the team's responsibility. And if you've been a designer in the past, you also know, and probably do know, that the design isn't done after you ship — what you've shipped needs to be validated too. A lot of the best learning comes after that.

I have left about 45 seconds for questions, so at least we can answer a couple. Thanks for listening to this. We've only got one mic out there. [ Applause ]

Audience Member: Hi, Jeff. Amazing talk. My name is Dana. I'm way here in the back. For the last two years I've been working with the United States Digital Service, mostly in U.S. Citizenship and Immigration Services.

Jeff: Then you have probably heard about Leisa and what she did.

Audience Member: Oh, yeah, yeah. Yeah.

Jeff: Okay. Good.

Audience Member: We're great friends.

Jeff: Yeah. Good.

Audience Member: But I just wanted to thank you profusely for giving this talk, because I now know that I am not completely insane.

Jeff: No.

[ Laughter ]

Audience Member: For how design works in Agile and large enterprises — and if I am crazy, a whole lot of other people are insane too. So thank you. This description was amazing, and I love your message about making Agile work for you. Make design work in Agile. Bring everybody in. So thank you.

Jeff: No question in there — thank you. But yeah, no, you're not crazy, or we're all crazy together. I wish Atlassian would come clean about how they actually do things. What's annoying is that they build tools that so many people use, but they don't use them the way other people do.

[ Laughter ]

So any other questions out there?

Audience Member: Hi. I work for a services company based in New Zealand, and what I'm interested in — and this is probably a larger question you don't have time to answer now, but if you have any initial thoughts — a lot of what you spoke about to do with the Agile process was in an enterprise company. Do you have any thoughts on how we would adapt that process for a services-based company, where we're working on isolated projects and multiple different products at a time?

Jeff: So by services-based company, you mean you're working with clients that are doing this stuff?

Audience Member: Yes.

Jeff: Look — I've worked with a lot of organizations over time trying to move to this sort of process, and they're recognizing that the old way was to do the work, do the ta-da moment, and then the goal was to sell the client: you've done a good job, they pay, you win. But if you become more of a partner, that does change the way you work. Maybe the one piece — there's more to talk about, but something that has stuck — is to take on more of a producer's posture. Imagine you and your friends have a cool band, and you sound great and you could get a record out [speaking away from microphone] — anyway, if you want to get something produced out there, but you don't know how the music industry works, you don't know how recording works, you don't know what it would take to get you to market, you need a partner to take those ideas and make them great and make them successful. And that partnership — well, if there's a continuum that describes your partnership, with waiter on one end and doctor on the other, you should act like a doctor. You're not an order taker.

Your job is to do no harm — to help make them successful. It is your job to help educate them on how this process works, just like a good producer would educate you on how the music industry works. So it is a partnership, opening up the discovery process to co-create and work with them. It really screws up revenue models, though — you've got to figure that part out.

All right. This thing is flashing at me, so it means I need to shut up.

All right. I'm around if you have questions.
