Speckled Computing is an adventurous research program, which offers radical new ideas
for information technology in the future. This will be realized by minute, autonomous
semiconductor specks – each speck will encapsulate capabilities for sensing,
programmable computation, and wireless networking. Computing with networks of
specks (known as specknets) will enable linkages between the material and digital world
with a finer degree of spatial resolution than hitherto possible, which will be a
fundamental enabler for truly ubiquitous computing. The Speckled Computing Research
Consortium slices across traditional academic disciplines and organizational boundaries
to bring together physicists, electronic engineers and computer scientists from five
Scottish universities to provide an integrated technological push towards the realization
of specks and specknets. This paper outlines the vision underlying the project, and gives
an overview of some of the considerable research challenges.
Context
For Eileen, it’s the Personal Inquiry project – a large collaboration with Nottingham on
inquiry learning in science; a little over halfway through a three-year project.
Hardware is all reasonably off-the-shelf equipment for scientific data capture. Literature-
grounded method of supporting the inquiry process by involving young people in
empirical work; technology is a way of enabling them to work through an actual cycle of
focused investigation, rather than a simulation. Exemplar topics: microclimates, urban
heat islands.
A lot of previous work to support inquiry learning is about modelling phenomena and
processes, often using simulations.
Student feedback says they appreciate real data collection. Project is not tackling issues
of modelling and immediacy of feedback.
Speckled Computing
Internet has 1 billion hosts today. IPv6 will support >35 trillion separate subnets, and
each one in turn can connect millions of devices. Potential capacity to name/connect
every grain of sand. Smart objects – smart meaning objects know something about their
environment, and location-aware – not necessarily absolute, but relative: who are my
neighbours.
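The “>35 trillion subnets” figure can be sanity-checked with back-of-the-envelope arithmetic. This is a sketch: which prefix split the speaker had in mind is an assumption here – a 45-bit prefix is used because 2^45 happens to match the figure quoted.

```python
# Back-of-the-envelope IPv6 arithmetic (illustrative assumptions only).
# IPv6 addresses are 128 bits wide; a 45-bit subnet prefix is assumed here
# because 2^45 matches the ">35 trillion subnets" figure in the notes.
subnets = 2 ** 45
hosts_per_subnet = 2 ** 64   # interface IDs in a standard /64 subnet

print(f"subnets:          {subnets:,}")           # 35,184,372,088,832 (~35 trillion)
print(f"hosts per subnet: {hosts_per_subnet:,}")  # ~1.8e19
```

Each /64 subnet can in fact address vastly more than “millions” of devices, so the claim in the notes is, if anything, conservative.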
Specks: miniature programmable devices which can sense, compute and network
wirelessly. Autonomous, rechargeable, energy scavenging (e.g. photovoltaic cells tuned
to internal lighting – focus on built environment). Specks non-static and unreliable –
design protocols for expected failure and intermittent connectivity.
Family of devices – an 8-bit (medium) client, which can connect up to four sensors, with a 32-bit (large)
microserver first, miniaturising to give the 8-bit 5mm-cube client. Freespace optics as comms
– useful when devices are stationary. Would love to put sensors in e.g. the Jennie Lee
Labs – because they’re static, can have line-of-sight. Very small, low-power lasers. When
on people, need radio – but that’s wasteful of energy because you radiate in all directions
rather than directionally.
Also: Energy Neutral (EN) platform – capturing energy from photovoltaic cells.
Current motion capture methods: 1. Studio based with cameras, many cameras, reflective
markers attached to person; grab info from 6-8 cameras, stitch together to get 3D view –
computationally/memory-intensive post-processing. Not real time unless very high-end.
Expensive – £30k per hour(?). Occlusion is a problem when capturing multiple subjects –
need more cameras, which in turn means more post-processing.
2. Motion-capture suits. Wired suits, lycra, with a bulky base station/backpack which
routes the sensor data to a high-end machine to do the processing (like Gollum).
3. Joint angle sensors. Bulky exoskeleton, cumbersome, hinders movement – not widely
used.
So want: fully wireless, real-time and interactive, easy to use, ‘banalise the technology’,
democratise its usage. Parallel with desktop publishing.
Orient Motion Capture system – currently sensors are about 30mm, need to miniaturise.
(Video using Motion Builder for capture at www.specknet.org)
Can use real avatars: telepresence; bipedal robots operating in a harsh environment – use
entire body as interface. Also in games. Unobtrusive participation in simulations
combining real and virtual players – ‘serious games’.
Applications – lots – Digital media (motion capture, games, sports); Health – with NHS
Lothian.
Showed avatar control to Linden Labs (Second Life). Not keen because would flood their
network.
Edinburgh Science Festival 2006 – learning in informal settings. Put sensors on break
dancers (8–10 year olds), give them ideas about physics, e.g. angular momentum,
centripetal forces and so on, based on their breakdances. Competition – who can spin on
their head fastest. Not saying you’re teaching – surreptitiously getting them to do things.
Golf swing analysis – challenging, limited bandwidth, 2-3 hour tour round club. Data
coming in to mobile phone. Modelled as double pendulum – arms are one pendulum,
connected to club which is the other. Equation of motion for double pendulum using
Newton’s Laws. Get visual feedback of swinging club in the plane – angles between parts
of the arm and so on. Applied sports science unit with biomechanics people helping
interpret.
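The double-pendulum model mentioned above can be sketched numerically. This is the generic textbook point-mass double pendulum with a simple explicit-Euler integrator, not the project’s actual model; the masses, lengths and starting pose are made-up values standing in for arms and club.

```python
import math

def double_pendulum_step(th1, th2, w1, w2, dt,
                         m1=7.0, m2=0.4, L1=0.7, L2=1.0, g=9.81):
    """One explicit-Euler step of the textbook point-mass double pendulum.
    m1/L1 stand in for the arms, m2/L2 for the club (made-up values)."""
    d = th1 - th2
    den = 2*m1 + m2 - m2*math.cos(2*d)          # always positive: no div-by-zero
    a1 = (-g*(2*m1 + m2)*math.sin(th1)
          - m2*g*math.sin(th1 - 2*th2)
          - 2*math.sin(d)*m2*(w2*w2*L2 + w1*w1*L1*math.cos(d))) / (L1*den)
    a2 = (2*math.sin(d)*(w1*w1*L1*(m1 + m2)
          + g*(m1 + m2)*math.cos(th1)
          + w2*w2*L2*m2*math.cos(d))) / (L2*den)
    return th1 + w1*dt, th2 + w2*dt, w1 + a1*dt, w2 + a2*dt

# Release the "arms + club" system from rest in a raised backswing position
# and integrate half a second of the downswing.
th1, th2, w1, w2 = math.radians(120), math.radians(150), 0.0, 0.0
for _ in range(500):                            # 0.5 s at dt = 1 ms
    th1, th2, w1, w2 = double_pendulum_step(th1, th2, w1, w2, 0.001)
```

A quick sanity check on the equations: with both angles and angular velocities at zero, the system sits at equilibrium and a step changes nothing.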
Interacting with robots – Trying to program behaviour, especially standing on one leg,
walking etc, is done with heuristics by an army of programmers over weeks. Can we capture
human motion, analyse it, run it on a simulator with a physics engine, then select candidates
and run them on a real robot? (Extend the life of the robot by being selective in which gaits to use!)
Get training data from human, segment into phases. Fantastic videos of arm swinging,
standing on one leg, sit-ups: and a great walk by a robot, with no human intervention in
the learning algorithm.
“You need to demonstrate before anyone will start adopting these things” – very true.
Health scenarios – Need to validate data. Breathing rate during ventilation – breathing
rate validated all the time. Can capture coughs, overall activity – e.g. go to sleep, turn
right/left etc. Prosthetic limb adjustment – done by eye at the moment; with their capture
data, can make it much closer to the normal/optimal setup. One example – couldn’t do it
for climbing up a slope, but can now.
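As a toy illustration of extracting a breathing rate from a wearable signal – not the Orient system’s actual pipeline; the signal here is synthetic and the method a simple mean-crossing count:

```python
import math

def breaths_per_minute(signal, fs):
    """Estimate respiratory rate by counting upward crossings of the mean
    in a slowly varying chest-orientation signal (illustrative method only)."""
    mean = sum(signal) / len(signal)
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a < mean <= b)
    duration_min = len(signal) / fs / 60.0
    return crossings / duration_min

# One minute of synthetic chest tilt at 256 samples/s (the capture rate
# mentioned in these notes), oscillating at 0.25 Hz = 15 breaths per minute.
fs = 256
chest = [math.sin(2 * math.pi * 0.25 * (i / fs) + 0.3) for i in range(fs * 60)]
print(round(breaths_per_minute(chest, fs)))     # → 15
```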
Exists to evangelise! Encourage people to experiment with the technology. About fifteen
applications projects; very keen – due to funding! – to see the technology applied and
making a difference.
Example: projectile motion. Take a soft ball with Orient device inside. Instrument
thrower with three devices. Thrower throws ball, can detect instant when ball leaves the
hand, so only acceleration due to gravity thereafter. Expect an arc defined by good old
equation of motion. Study in inquiry learning: try using tangible interface to support
learning the laws of projectile motion. Masters student had a first attempt at this.
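A minimal sketch of the physics behind this exercise: detecting release (an accelerometer in free fall reads near zero) and predicting the subsequent arc. The function names, threshold and sampling rate are illustrative, not from the project.

```python
import math

G = 9.81   # gravitational acceleration, m/s^2

def detect_release(accel_magnitudes, fs, threshold=1.0):
    """Return the time at which the sensed acceleration magnitude first
    drops near zero -- an accelerometer in free fall reads ~0 m/s^2.
    The threshold and sampling rate fs are illustrative choices."""
    for i, a in enumerate(accel_magnitudes):
        if a < threshold:
            return i / fs
    return None

def trajectory(v0, angle_deg, t):
    """Ball position t seconds after release, gravity only (no drag)."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    return vx * t, vy * t - 0.5 * G * t * t
```

For a 10 m/s throw at 45°, `trajectory` traces the familiar parabola, with height returning to zero after 2·v0·sin(45°)/g ≈ 1.44 s.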
(The research question here – for me and people like us – is what can you do if motion
capture is cheap, easy and near-ubiquitous? Exciting!)
Questions
Don’t detect physical location, but can infer it. Treat human body as articulated system of
rods. Marker system requires precise placing of markers on parts of the body – here can
be anywhere. Camera-based gives you position information, but have to infer orientation
and acceleration.
Capture at 256 samples a second – doable because processing is done on the devices themselves, so can be
done in real time. Base station 33g, sensors 13g. Sensors can talk to each other, but here
they all talk to base station.
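The “articulated system of rods” idea can be sketched as forward kinematics: chain each segment’s orientation and length to infer joint positions, with no absolute position ever measured. The 2-D simplification and the segment lengths are assumptions for illustration.

```python
import math

def limb_positions(segment_angles, segment_lengths, origin=(0.0, 0.0)):
    """Infer 2-D joint positions from per-segment orientations, treating a
    limb as a chain of rigid rods (a simplified, planar version of what an
    orientation-only capture system must do). Angles are absolute
    orientations in radians; lengths in metres."""
    positions = [origin]
    x, y = origin
    for theta, length in zip(segment_angles, segment_lengths):
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        positions.append((x, y))
    return positions

# Upper arm horizontal, forearm raised 90 degrees (hypothetical lengths).
joints = limb_positions([0.0, math.pi / 2], [0.30, 0.25])
# joints ≈ [(0, 0), (0.30, 0.0), (0.30, 0.25)]
```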
Feedback not just visual, but audio – a tone – good e.g. in physiotherapy or golf swing.
Give audio feedback on how close it is.
Visual feedback on phone for golf – haven’t done any evaluation. They demonstrate they
can do it, then work with end users to evaluate it. They work on the speck inside,
improving, miniaturising. Applications are collaborations.
iPhone would’ve been a good bet; have worked on WindowsCE and don’t much enjoy it.
Using those in experiments with NHS Lothian. You need a load of software, it’s messy.
Happy to work with people to do stuff on more phones, but that’s not their zone.
Delighted to work with people to port it – can give you the hooks etc.
Dance also – tango dancers. Can get metrics about e.g. coupling of motion between
leader and follower.
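One simple way such a coupling metric could be computed is a lagged correlation between the two dancers’ signals – an illustrative sketch, not the metric the project actually uses:

```python
def lagged_correlation(leader, follower, lag):
    """Pearson correlation between the leader's signal and the follower's
    signal delayed by `lag` samples -- a crude measure of how tightly the
    follower tracks the leader (illustrative, not the project's metric)."""
    a = leader[:len(leader) - lag] if lag else leader
    b = follower[lag:]
    n = len(b)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va ** 0.5 * vb ** 0.5)
```

A follower who simply repeats the leader’s motion three samples later scores 1.0 at lag 3; sweeping the lag reveals both the strength and the delay of the coupling.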
Separate centre for applications, several students, is geared up to use stable versions of
their platforms, very open to collaboration.
Wii only gives you acceleration; here get the biomechanics of it.
Speckled Computing offers a radically new concept in information technology that has
the potential to revolutionise the way we communicate and exchange information.
Specks will be minute (around 1 mm³) semiconductor grains that can sense and compute
locally and communicate wirelessly. Each speck will be autonomous, with its own
captive, renewable energy source. Thousands of specks, scattered or sprayed on the
person or surfaces, will collaborate as programmable computational networks called
Specknets.
Computing with Specknets will enable linkages between the material and digital worlds
with a finer degree of spatial and temporal resolution than hitherto possible; this will be
both fundamental and enabling to the goal of truly ubiquitous computing.
Speckled Computing offers a radically new concept in information technology that has
the potential to revolutionise the way we communicate and exchange information. Specks
-- minute, autonomous, semiconductor grains that can sense and compute locally and
communicate wirelessly -- can be sprayed into the atmosphere, onto surfaces or onto
people, and will collaborate as programmable computational networks called SpeckNets
which will pave the way to the goal of truly ubiquitous computing. Such is the vision of
the Speckled Computing Project -- however, although the technology to build such
devices is advancing at a rapid rate, the software that will enable such networks to self-
organise and function lags somewhat behind. In this paper, we present a framework for a
self-organising SpeckNet based on Cohen's model of the immune system. We further
suggest that the application of immune inspired technologies to the rapidly growing field
of pervasive computation in general, offers a distinctive niche for immune-inspired
computing which cannot be filled by any other known technology to date.
Speckled Computing offers a radically new concept in information technology that has
the potential to revolutionise the way we communicate and exchange information.
Specks are small semiconductor devices which can sense and compute locally and
communicate wirelessly. Each speck is autonomous, with its own captive, renewable
energy source. Thousands of specks, scattered or sprayed on the person or surfaces, can
collaborate as programmable computational networks called Specknets.
The Speckled Computing group actively works with organisations to deploy innovative
new applications in areas such as:
Health
Industrial
Digital Media
For more information please visit the Speckled Computing Group website:
http://www.specknet.org/
Abstract
Speckled computing (Arvind, D.K. and Wong, K.J., Proc. IEEE Int. Symp. on Consumer
Electronics, p.219-23, 2004) is an emerging technology in which data is sensed in minute
(eventually one cubic millimetre) semiconductor grains called specks. Information is
extracted in situ from each speck and is exchanged and processed in a collaborative
fashion in a wireless network of thousands of specks, called a specknet. Specks are not
assumed to be static, and therefore estimating and maintaining the logical location of the
mobile specks in a network is essential for a number of speckled computing and sensor
network applications. A novel lightweight distributed algorithm is introduced for this
purpose and simulation results are presented to determine the goodness of the algorithm
for different parameters. The algorithm was also successfully ported to a hardware
prototype of the speck called the ProSpeckz. The problems and issues of porting the
algorithm onto such a resource-constrained hardware platform are discussed. Finally, the
paper concludes with plans to improve the algorithm.
"Speckled" Computing
We talk about nanofactories, but first the world will be changed by Fab Labs. One day
we'll have nanobots, but on our way to that goal we'll have "speckled computing."
Speckled computing - some of the most advanced computing technology in the world -
is currently being researched and developed by a group of Scottish experts.
The individual appliances, or 'specks', will form networks that can be programmed like
ordinary computers.
Spraying them directly onto a person creates the ability to carry out different tests at the
same time, for example muscle movement and pulse rate. This allows a complete picture
of the patient's condition to be built up quickly.
Arvind said: "This is the new class of computing: devices which can sense and process
the data they receive. They also have a radio so they can network and there's a battery in
there as well, so they are entirely self-powered.
UPDATE: In the comments Phil asks, "What happens when you're done with them...do
they just wash off?"
I would guess that now, at the dawn of "speckled computing", researchers will attempt
to gather these little matchhead-sized computers back up for use later.
But as these devices enter mass production and get smaller and cheaper... yeah, you'd
wear them for the day and lose them the next time you shower.
The cool thing is the promise of what these devices (even though they are less than
nanobots or utility fog) could do. Instant communications via cell and Internet access.
Virtual reality or enhanced reality. Health monitoring. The list goes on.
Speckled Computing offers a radically new concept in information technology that has
the potential to revolutionise the way we communicate and exchange information. This
will be realised by minute autonomous specks, each of which encapsulates sensing,
programmable computation and wireless networking. Computing with minute specks will
enable linkages between the material and digital worlds with a finer degree of spatial and
temporal resolution than hitherto possible, which will be fundamental to truly ubiquitous
computing. The research consortium brings together worldclass expertise in computer
science, electronic engineering and optoelectronics, which will conduct collaborative
research in the core areas and, importantly, at the interfaces between them. While this
provides an integrated technology push, industrial designers and educationalists
will collaborate in exemplar projects to demonstrate the impact of this research.
Abstract: D.K. Arvind from the School of Informatics, University of Edinburgh and E.
Scanlon from IET, Open University will lead a discussion on speckled computing and
potential applications in education. Eileen Scanlon will discuss the experiences of the
Personal Inquiry project, the Inquiry Learning cycle and potential applications in the area
of modelling. D.K. Arvind will discuss the concepts of speckled computing and the range
of work being carried out under the banner of the Research Consortium in Speckled
Computing, based at Edinburgh.
Specknets link the digital world of computers to the physical world of sensory data. A
network of Orient specks on the person, for example, is capable of tracking the
orientation of the limbs, and this information can be stored, manipulated and accessed
remotely over the internet. Computing with specknets, or Speckled Computing, affords
new modes of unencumbered interaction with the digital world, in which the physical
environment is the primary site of interaction.
The Orient specks have been used for fully wireless, full-body 3-D motion capture of
human motion in a variety of applications:
• real-time manipulation of characters in digital animation and virtual worlds
• interaction with physical robots and teaching them behaviours such as walking
• biomechanical analysis of the golf swing, cricket strokes, Tango and break dancing
• health-care (physiotherapy, gait analysis, non-invasive wireless respiratory rate
monitoring)
• analysis of throwing and balancing skills in pre-school children
• exploratory learning of concepts in physics such as projectile motion.
Abstract:
Specknets link the digital world of computers to the physical world of sensory data. A
network of wearable Orient specks, for example, is capable of tracking the orientation of
the body parts, or the position of the person in the environment, and this information can
be stored, manipulated and accessed remotely over the internet.
The tutorial will cover three aspects of Speckled Computing:
• Speck platforms, including the hardware and firmware
• Application of the on-body Orient wireless motion capture system in the healthcare,
animation and sports sectors
• The science underpinning the interpretation of sensor data in these applications.
The attendees will have access to the Orient motion capture system and will get hands-on
experience in interactive animation of virtual characters and in the real-time control of a
bipedal robot performing tasks such as walking, standing on one leg and waving its arms.
Agenda:
• Scanning of the application spaces in healthcare, digital media and sports sectors for
wearable wireless motion capture systems
• Detailed case studies of deployment in clinical gait analysis, 3-D animation,
biomechanics of golf swings
• Unsupervised learning algorithms for the classification of motion data
• Hands-on experience in 3-D animation and in programming behaviour of a bipedal
robot