
Introduction

There is something about handwritten letters: their tangibility, the ceremony of opening them, the way you can file one away and know exactly where it is, or keep it close because it means something.

Writing is a skill most people take for granted. From the invention of the typewriter to practically any computer today, essays, letters, and scripts can be typed easily in the font of the user's liking.

Now technology is being deployed to replicate the human touch, as a growing number of consumers turn to pen-wielding robots that can mimic the loops and patterns of the human hand. This restores some of the intimacy and personalization of documents such as letters and holiday cards.

Problem statement

It is unfortunate that some people cannot engage in letter writing due to disabilities they were either born with or acquired during their lifetime. This project aims to design and fabricate a robotic pen that accepts voice input and converts it to written text on paper. With pre-loaded fonts chosen by the user, one can write in different customized fonts. To cater specifically to those who acquired disabilities later in life, the device will be able to scan their handwriting and add it as an optional writing font. This project aims to enable people with disabilities of the arm, spine, or nervous system to engage in the personal art of letter writing.

Justification

Personalized letters add an intimate touch and humanize the writer. They show commitment and represent who we are. They can also be a great way to market oneself or one's business. However, not everybody is able to write down a note or letter, due to various disabilities.

Working principle (adapted from the literature)

A robot testbed for writing Chinese and Japanese calligraphy characters is presented. Single strokes of
the calligraphy characters are represented in a database and initialized with a scanned reference image
and a manually chosen initial drawing spline. A learning procedure uses visual feedback to analyze each
new iteration of the drawn stroke and updates the drawing spline such that every subsequent drawn
stroke becomes more similar to the reference image. The learning procedure can be performed either in
simulation, using a simple brush model to create simulated images of the strokes, or with a real robot
arm equipped with a calligraphy brush and a camera that captures images of the drawn strokes. Results
from both simulations and experiments with the robot arm are presented.
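The learning procedure described above can be illustrated with a much-simplified sketch: represent a stroke as a few spline control points, render it with a toy round-brush model, and keep any random perturbation of the control points that makes the rendered stroke more similar to the reference image. The brush model, the pixel-wise error metric, and the random hill-climbing update used here are assumptions for illustration, not the actual visual-feedback method of the cited testbed.

```python
import numpy as np

def render_stroke(control_points, size=32, radius=1.5):
    """Toy brush model: sample densely along the polyline through the
    control points and stamp a disc of the given radius at each sample."""
    img = np.zeros((size, size))
    pts = np.asarray(control_points, dtype=float)
    dense = []
    for a, b in zip(pts[:-1], pts[1:]):           # interpolate each segment
        for t in np.linspace(0.0, 1.0, 20):
            dense.append((1 - t) * a + t * b)
    yy, xx = np.mgrid[0:size, 0:size]
    for cx, cy in dense:
        img[(xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2] = 1.0
    return img

def learn_stroke(reference, control_points, iterations=100, step=1.0, seed=0):
    """Random hill climbing: perturb the control points and keep the
    perturbation only if the rendered stroke matches the reference better
    (pixel-wise squared error), so the error never increases."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(control_points, dtype=float)
    best_err = np.sum((render_stroke(pts) - reference) ** 2)
    for _ in range(iterations):
        candidate = pts + rng.normal(scale=step, size=pts.shape)
        err = np.sum((render_stroke(candidate) - reference) ** 2)
        if err < best_err:
            pts, best_err = candidate, err
    return pts, best_err

# toy reference: a diagonal stroke; start from a deliberately wrong spline
reference = render_stroke([(4, 4), (27, 27)])
initial = [(4, 27), (27, 4)]
learned, err = learn_stroke(reference, initial)
initial_err = np.sum((render_stroke(initial) - reference) ** 2)
print(err <= initial_err)
```

In the real system the simulated brush image would be replaced by a camera image of the physically drawn stroke, but the accept-if-better loop is the same idea.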

Structural requirements

Should be lightweight to enable it to move seamlessly

Functional requirements

Should be able to duplicate someone's handwriting as closely as possible

Should be able to accept voice input accurately from the user

Should be able to give audio output to the user (to confirm it recorded the correct message)

The main parts of our project

Robotic arm

Calligraphy pen

Scanner

Voice recorder

Translating software
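Assuming these parts communicate in a simple pipeline (voice recorder → translating software → audio confirmation → robotic arm), the control flow might be sketched as below. All function names and behaviors here are hypothetical placeholders, not an actual driver for any of the components.

```python
def transcribe(audio):
    """Hypothetical speech-to-text step (stand-in for the translating
    software); here it just normalizes a text stand-in for audio."""
    return audio.strip().lower()

def confirm(text):
    """Hypothetical audio-output step: echo the message back so the
    user can confirm it was recorded correctly."""
    print(f"Did you say: {text!r}?")
    return True

def write_text(text, font="user_handwriting"):
    """Hypothetical robotic-arm driver: emit one pen command per
    non-space character, in the user's chosen font."""
    return [f"draw {ch} in {font}" for ch in text if not ch.isspace()]

message = transcribe("  Dear Ada  ")
if confirm(message):
    commands = write_text(message)
    print(commands[0])  # → 'draw d in user_handwriting'
```

The scanner component would feed the font database that `write_text` draws from, which is why the handwriting font appears only as a parameter here.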
