
BLAISE PASCAL MAGAZINE

ALL ABOUT DELPHI AND DELPHI PRISM (.NET), LAZARUS & PASCAL AND RELATED LANGUAGES

DB Artisan, ER/Studio, Rapid SQL, DB Optimizer, DB Performance Center, InterBase, DB Change Manager

DATABASE:
WORLDS IN COLLISION?
Mini Course SQL by Miguel van de Laar
Delphi JSON Viewer by Paweł Głowacki
Is Lazarus ready for writing commercial applications? by Zeljan
NexusDB exceptionally good, a real surprise... by Erwin Mouthaan
Great News: The TMS DataModeler by Bruno Fierens
Introduction to Delphi Database Development: Part 1 by Cary Jensen
First Look at Advantage Database Server 10 by Cary Jensen
Real-time data collection by Anton Vogelaar
About Object oriented databases by Detlef Overbeek
FastReport, what's up? by Detlef Overbeek
Using ADS with Delphi Prism and ASP.NET by Bob Swart
A datawarehouse example using kbmMW by Kim Madsen
Multiplatform Windows CE by Joost van der Sluis
Five Considerations for Choosing an Effective SQL Development Tool by Scott Walz

September 2010

BLAISE PASCAL MAGAZINE 11
ALL ABOUT DELPHI AND DELPHI PRISM (.NET), LAZARUS & PASCAL AND RELATED LANGUAGES
CONTENTS

Volume 11, ISSN 1876-0589

Articles

Mini Course SQL


by Miguel van de Laar page 10
Is Lazarus ready for creating commercial applications?
by Zeljan page 13
Delphi JSON Viewer
by Paweł Głowacki page 16
NexusDB exceptionally good, a real surprise...
by Erwin Mouthaan page 22
The TMS DataModeler
by Bruno Fierens page 27
Introduction to Delphi Database Development: Part 1
by Cary Jensen page 31
First Look at Advantage Database Server 10
by Cary Jensen page 35
Real-time data collection
by Anton Vogelaar page 39
Object oriented databases
by Detlef Overbeek page 45
FastReport, what's up?
by Marco Roessen, Rob van den Bogert and Detlef Overbeek page 52
Using ADS with Delphi Prism and ASP.NET
by Bob Swart page 59
A datawarehouse example using kbmMW
by Kim Madsen page 66
Multiplatform Windows CE
by Joost van der Sluis page 73
Five Considerations for Choosing an Effective SQL
Development Tool
by Scott Walz page 81

Columns
Editorial page 4
Book reviews by Frans Doove page 5

Advertisers
Advantage Databases page 3
Barnsten page 83
Cary Jensen
Book Advantage Database Server page 12
Components 4 Developers page 84

Database Workbench Pro page 11


FastReport Version 5 page 51
LAZARUS USB Stick page 44
LAZARUS: Invitation for the ILS (International Lazarus Symposium)
Nexus page 21
Marco Cantù page 37
Order printed item page 38
Vogelaar Electronics
The Delphi/Lazarus Controller
and Development Board page 42

Editor in chief
Detlef D. Overbeek, Netherlands
Tel.: +31 (0)30 68.76.981 / Mobile: +31 (0)6 21.23.62.68
News and Press Releases
email only to editor@blaisepascal.eu
Authors
B Peter Bijlsma,
C Marco Cantù,
D David Dirkse, Frans Doove,
G Primož Gabrijelčič,
H Fikret Hasović,
N Jeremy North,
O Tim Opsteeg,
P Herman Peeren,
R Michael Rozlog,
S Henk Schreij, Rik Smit, Bob Swart,

Editors
Rob van den Bogert, W. (Wim) van Ingen Schenau,
Miguel van de Laar, M.J. (Marco) Roessen.
Correctors
Howard Page-Clark, James D. Duff
Translations
M. L. E. J.M. (Miguel) van de Laar,
Kenneth Cox (Official Translator)
Copyright See the notice at the bottom of this page.
Trademarks All trademarks used are acknowledged as the
property of their respective owners.
Caveat Whilst we endeavour to ensure that what is published
in the magazine is correct, we cannot accept responsibility for
any errors or omissions. If you notice something which may be
incorrect, please contact the Editor and we will publish a correction
where relevant.

Subscriptions (prices have changed)


1: Printed version: subscription € 60.- (including code, programs and printed magazine, 4 issues per year including postage).
2: Non-printed subscription € 35.- (including code, programs and downloadable magazine)
Subscriptions can be taken out online at
www.blaisepascal.eu
or by written order, or by sending an email to
office@blaisepascal.eu
Subscriptions can start at any date. All issues published in the
calendar year of the subscription will be sent as well.
Subscriptions run per calendar year. Subscriptions
will not be prolonged without notice. Receipt of payment will be
sent by email. Invoices will be sent with the March issue.
Subscription can be paid by sending the payment to:
ABN AMRO Bank Account no. 44 19 60 863
or by credit card: Paypal or TakeTwo
Foundation for Supporting the Pascal Programming Language
(Stichting Ondersteuning Programmeertaal Pascal)
IBAN: NL82 ABNA 0441960863
BIC ABNANL2A VAT no.: 81 42 54 147
(Stichting Programmeertaal Pascal)
Subscription department
Edelstenenbaan 21, 3402 XA IJsselstein, The Netherlands
Tel.: + 31 (0) 30 68.76.981 / Mobile: + 31 (0) 6 21.23.62.68
office@blaisepascal.eu

Copyright notice
All material published in Blaise Pascal is copyright SOPP Stichting Ondersteuning Programmeertaal Pascal unless otherwise noted and may not be copied, distributed or
republished without written permission. Authors agree that code associated with their articles will be made available to subscribers after publication by placing it on the
website of the PGG for download, and that articles and code will be placed on distributable data storage media. Use of program listings by subscribers for research and
study purposes is allowed, but not for commercial purposes. Commercial use of program listings and code is prohibited without the written permission of the author.


Editorial by Detlef Overbeek


Finally it's ready, bigger than ever, with more and more information about databases.
But this topic is never-ending.
We finally had to decide this is the maximum size.
We had planned for sixty pages but it turned out to become eighty-four pages.
And still we could of course not cover the whole subject - only a good part of it.
So, reason enough to continue the stories of some
subjects in our regular columns. We aim to give you
the information we think you want, so let us know.
We are proud to have some very fresh news. TMS has
specially tried to combine the publishing of their new
DataModeler with this issue.
Since they are such well-known producers of components we have very high expectations for it. We will review it of course in the next regular issue in November.
For beginners there is an article about the basics of SQL.
Paweł Głowacki has written a very interesting article about a JSON viewer. Much to learn. Having read that you will want to dive into the NoSQL databases in the next issue.
Nexus Databases came up with great news and some things I had never heard of before: the Nexus memory manager is even better than the original one in Delphi.
Sybase has released their latest product update and Cary Jensen is going to take us into the world of databases from clear, simple beginnings.
Because we are very curious about Object Oriented Databases and their current deployment, we tried to find out some fundamental things about them and were very surprised when we heard that they are alive and kicking. We had heard otherwise.
Because green is very sexy at the moment, Anton describes for us how to combine real-time electronic temperature measurement with local data storage. It seemed to us a subject everybody would like to know about - you might even want to use it in your own house.


FastReport is as good as ever and its developers are working very hard to publish the new version, No. 5.
In the next regular issue we will show how to create reports using version 5.
Bob Swart has written about the .NET aspects of a database for us.
We are going to give .NET some more attention, as we promised.
Kim Madsen - we all know him from his kbmMW suite - gives us his first article, on data warehousing.
Very interesting to see what the possibilities are.
And last but not least Lazarus is here with something very special:
you could now write your own mobile phone mileage tracker application, or do anything with a database on your phone - maybe even using your GPS.
Let the phone tell you where the kids are or how long the grass is. Or maybe even greener...
With the help of the Lazarus USB Stick (because it's already on the stick), or a Lazarus version and some database of course...
So now - if you're not a subscriber yet - you can see what we do for our readers.
We hope of course this issue will persuade you to subscribe, so we have a little extra for you:
if you take out a new subscription we will offer you a discount of € 5,00 per subscription.
And because we have some special offers for our subscribers, you will find these extra offers on page 26 from Nexus, on page 72 from components4developers, and on page 27 from TMS software.
We hope you will have a lot of fun with this issue.
Your editor



Book Reviews by Frans Doove

M. van Canneyt, M. Gärtner, S. Heinig, F. Monteiro de Carvalho, I. Ouedraogo
Lazarus Complete Guide
Working with the IDE and class library
Publisher: ProPascal Foundation - Netherlands (St. Propas)
ISBN: 978-94-90968-02-1
This is a preview of the contents of the book that will be published in English in December 2010.
Length: about 800 pages
Price: 50 Euros
The book contains 12 chapters and a large index of the main concepts. It is written by a group of specialist authors, each contributing one or more chapters according to his special expertise.
Chapter 1, "The architecture of Lazarus", introduces basic concepts.
Chapter 2 deals with the installation of Lazarus.
Chapter 3 covers the integrated development environment (IDE). This long chapter (118 pages!) is a detailed and elaborate explanation of the potential of the many tools included in the IDE. The IDE is not discussed in overview; rather the chapter explains what the possibilities are and what you can arrange with it, covering each possibility in turn. After this elaborate discussion of the technical aspects, the software description starts.
Chapter 4 handles Projects. Unfortunately too much reader knowledge is assumed when some topics are introduced. The chapter begins by mentioning "GUI Applications" but these are not explained until later. The explanation, when it comes, is exhaustive.
Chapter 5, entitled "Target Platforms", documents the recompiling of Lazarus programs for Windows, Linux/Unix and Mac OS X.

Chapter 6 is a lengthy (170 pages) discussion of the Lazarus class libraries (the LCL).
Chapter 7 discusses how to port Delphi components to Lazarus.
Chapter 8 deals with files and devices.
Chapter 9 covers programming with graphics, and Chapter 10 is about Processes and Threads.
In Chapter 11 network programming is discussed, and Chapter 12 covers programming for databases and the data-specific tools included with Lazarus. By Chapter 8 the reader will have learned enough to be able to handle the somewhat more difficult and specialised subjects that are addressed. Though the book is written by several authors, the impression is that they were all writing in the same spirit. The explanation of the programs and software is crystal clear and instructive: after reading this book, there are very few questions left unanswered. There are many screenshots and lots of code examples.
Summary
A beautiful and very instructive book,
its content is complete and appropriate for its intended
audience.
It offers clear logical explanations of all you need to know
about Lazarus.
This is a must-have book for English-speaking programmers who want to start using Lazarus.

The book will be available in mid December 2010.
You can order it directly from our web shop.
If you pre-order the LAZARUS COMPLETE GUIDE you can also order a discounted Lazarus USB stick for only € 15.00.


Book Reviews (continuation 1)


Lazarus: Complete Guide, chapter listing
Chapter 1 - The Architecture of Lazarus
by Mattias Gärtner
The text editor
CodeTools
Quick navigation in the code
Source code completion
Basics of the Pascal language
Basic types
Unit structure and functionality
Object-oriented programming in Lazarus
Arrays in Free Pascal
Compiler directives
Projects
Components
Packages
C libraries in Free Pascal
Free Pascal libraries in C
Installing new components into the IDE
Using units in a number of Projects
Virtual Units
Units for other Platforms
Searching Packages
Directories and search paths
Dependencies and inheritance
Compiling
Source documentation
Unicode
Chapter 2 - Installing Lazarus
by Jörg Braun, Swen Heinig and Felipe Monteiro de Carvalho
Downloading from the subversion repository
The installation of TortoiseSVN
Subversion program package
SVN repositories for Lazarus and FPC
Basic use of the Subversion command line
Checking out with TortoiseSVN
Installation on Windows
Installation on Linux
Installation on FreeBSD
Installation on Mac OS X

Chapter 3 - The IDE
by Swen Heinig
The main Lazarus menu
Databases
Edit
Searching
View
Project
Compiler directives
Start
Package
Tools
Environment
Windows
Help
Object Inspector
Source editor
Source code completion
Message Composer
Debugger
Recompiling the IDE

Chapter 4 - Projects
by Felipe Monteiro de Carvalho
GUI applications
Console applications
DLLs and Shared Objects
Dynamic Libraries
Libraries in Mac OS X
Control panel applets for Windows
CGI Applications
CGI Programs in Pascal
CGI with Powtils
Unit Testing
Packages
Installing Components
Registering components
Property editors
Component editors
Services and Daemons

Partial Preview of chapter 4: PROJECTS
by Felipe Monteiro de Carvalho
Lazarus is an Integrated Development Environment geared towards the development of Pascal applications. While its greatest strength is in the development of GUI applications, Lazarus can be used to develop all kinds of projects, including web applications and even programs without a user interface.

Image 4.1: The dialog File -> New
The Lazarus menu selection File -> New displays a dialog with a list of possible project types which Lazarus can create. The section headed Projects shows how many different kinds of projects can be developed with Lazarus, and it should be noted that the list is not exhaustive. As a general software development tool, any kind of programming project can be developed with Lazarus. The list present in this dialog displays only the project types for which Lazarus has built-in templates, and external packages can extend this list.
After selecting the menu item New in the File menu, a dialog appears showing a number of possible projects and modules to choose from. The contents of this dialog change from version to version and also when a new package is installed which uses the so-called Tools API to install new entries in this dialog. Using this interface other IDE dialogs like Environment and Project Options can also be altered. Even without extra packages, the standard Lazarus dialog already contains a large number of possibilities to choose from, and those standard options are explained in Table 4.1 (which is split across two pages here).

Table 4.1: The standard options for creating new modules and projects in Lazarus (Part 1 of 2)


Book Reviews (continuation 2)

Partial Preview of chapter 4: PROJECTS


To ensure that texts can be used in any language, using a GUI library with Unicode support is strongly recommended. The table below compares various GUI libraries in existence today which can be utilized to write Pascal applications. Note that only Unicode-enabled libraries are shown. Free Pascal applications can be written using all the libraries shown (except the Delphi VCL), but only with the Lazarus LCL and with the KOL-CE library is it possible to use the Lazarus form designer, object inspector and standard components. Also note that additional components installed on the component palette can only be used with LCL applications. This book will obviously only cover working with the Lazarus Component Library (LCL).

Table 4.2: GUI libraries and Pascal as of September 2010


To start developing a new GUI application based on the LCL, choose the menu File -> New and select the option Application in the dialog which appears. A new main form will be created with the name Form1, and it will be opened ready to be edited using the form designer.

Table 4.1: The standard options for creating new modules and projects in Lazarus (Part 2 of 2)

In this chapter I will explain only the most important types of project which you can develop with Lazarus. For most of them a specific template can be selected in the menu File -> New, but for some a generic template is used.

Image 4.3: Starting a new GUI application in the dialog File -> New

Image 4.2: The dialog Project -> New Project offers the same project creation options as the dialog File -> New
4.1 GUI APPLICATIONS
Ever since its commercial introduction in the 1980s, the graphical user
interface (GUI) has quickly grown in popularity, and today only
technically savvy users work with console interfaces. The secret in
writing a cross-platform GUI is simply using a cross-platform GUI
library which ensures the portability of the displayed images and text.

As soon as this option is selected, all the necessary files for this kind of project are created and the form designer as well as the code editor windows will be opened. More windows can be added to the project by choosing the menu File -> New Form, and new Pascal source code units can be added with the menu File -> New Unit. The same action can also be performed by opening the dialog File -> New and selecting the corresponding module, or by clicking on the appropriate button in the Lazarus toolbar in the left corner of the main Lazarus window.
While Lazarus offers project management and very advanced code completion for all kinds of projects, even console ones, it is for GUI applications that it really distinguishes itself from other IDEs. The code in the unit is automatically updated for each new component dropped on a form, as well as when you change a component's name, so that the code is automatically synchronized with the GUI editor. By double clicking an event, the code editor is shown with the cursor placed in the procedure which handles it, and a suitable procedure is added to the code if none is assigned to the event. Even the end corresponding to a typed-in begin is automatically added, although excessive automatic coding from the IDE can get annoying, so it can be disabled via the Options menu.

Book Reviews (continuation 3)


Chapter 5 - Target platforms
by Felipe Monteiro de Carvalho
Operating System specifics
The API of Windows 32/64
The special case of Windows CE
Linux, FreeBSD and other Unix platforms
The APIs of Mac OS X
32 Bit and 64 Bit
Configuration files
Resource files

Chapter 6 - The class libraries
The RTL (FPC Run Time Library)
Loading and saving using streaming
Component naming
The FCL (Free Component Library)
The LCL (Lazarus Component Library)
The Application class
Screen windows
Working with TForm
The properties of TForm
Special windows
The window environment
Controls in a window
Layout and Program design
Actions
Drag and Drop
The elements of the component palette
The Standard Tab
The Additional Tab
The Common Controls Tab
The Dialogs Tab
The Misc Tab
The Data Controls Tab
The Data Access Tab
The System Tab
The SynEdit Tab

Chapter 7 - Porting Delphi components
by Michael Van Canneyt and Mattias Gärtner
The architecture of Lazarus components
Operating System independent layer
Operating System dependent layer
Component modelling
Porting in real life
From components to Lazarus Package
The component palette

Chapter 8 - Files and devices
by Felipe Monteiro de Carvalho and Jörg Braun
File dialogs in Lazarus
Working with files
Searching in directories
Communicating with devices
The parallel port
Serial communication
The printer

Chapter 9 - Graphics programming
by Felipe Monteiro de Carvalho
The drawing canvas
Colours
TPen
TBrush
Fonts
Important graphical routines
Graphical components
Varieties of graphic objects
TGraphic
TRasterImage
Bitmaps
TJpegImage
Icons

Chapter 10 - Processes and Threads
by Michael Van Canneyt
Processes
Threads

Chapter 11 - Network Programming
by Inoussa Ouedraogo
TCP/IP programming
The Client program
The Server program
Web services
Programming Servers
Programming Clients
Message Logging
Object pooling
Service extensions

Chapter 12 - Database Access
by Michael Van Canneyt
Architectural overview
Database access
Choosing Databases
The Database Desktop: an additional tool
Classes for Database access
The Dataset
The DataModule
Data-aware controls
TDataset descendants
The Database Desktop
The Data Dictionary
Exporting Data
Generating code
Crash course SQL
Reports
Creating reports
The Report-Designer

Index
Index of graphics
Index of tables


Book Reviews (continuation 4)


Christian Bleske: Morfik AppsBuilder
Web applications created in Pascal and Basic
2010 Computer & Literaturverlag, Böblingen, Germany
ISBN 978 3 936546 54 5
Paperback, 398 pages, including CD.
Price: € 50,00
The book is written in German.
Morfik is an Australian software company; its product is named in the book title. You can find a huge amount of information about Morfik on the internet. Here is a brief summary:

Code completion, as well as MorfikDoc popup help is available when


you are coding. You can navigate through your code using hyperlinks.
The code editor has full information about your code at all times and
can help you navigate to where a class, method, type or variable is
declared.
Morfik supports debugging browser and server side code from within Morfik, by stepping over and through your high-level source code as well as the automatically generated JavaScript code.
You can add breakpoints to pause execution anywhere in the browser or server side code. You can view the current value of a variable by hovering the mouse pointer over it, track the flow of execution in your code, and more.
Morfik allows developers to both consume and publish Web Services
through easy to use Wizards. This offers an extremely easy path to the
world of Web Services and Services Oriented Architecture (SOA).
All server side components of Morfik Applications (XApps) are
inherently SOA compliant servers.
Morfik Packages and Widgets allow you to create fully functional solutions and advanced controls that can plug into any Morfik project.
A rich set of pre-built Packages is also available from Morfik.
You can create dynamic data-driven PDF reports that are naturally
suitable for printing and distribution. The reports design process is
identical to designing forms and it utilizes all the properties that you
have come to expect from word processing.

Morfik redefines web development by combining graphical design and


visual programming in a single environment, dramatically reducing the
time required to build modern web applications.
It gives you the power of a full featured SQL relational database, but
with a user-friendly visual interface. You can use the visual designer to
make designing tables and queries for your internal relational database
simple and easy.
Morfik allows you to write your web-based application using any combination of the supported languages (C#, Basic or Pascal) for both browser and server sides, using the Morfik Compiler's rigorous enforcement of referential integrity to produce scalable Ajax applications.
The compiler implements automatic intelligent optimization as well as
automatic obfuscation/compression of the final JavaScript.
It is a complete application hosting and deployment platform,
automating the process of deployment as a part of the development
environment itself. Deployment is handled via a deployment Server and
a deployment Client which is built into the environment.
Web solutions created with Morfik can be readily accessed by search
engines. Application content can be easily published (with a clean URL)
which gives your web application an advantage with search engines,
bookmarking and devices that do not support JavaScript. End users can
also browse through Morfik Ajax applications with JavaScript turned
off and they have a very similar browsing experience.
Morfik automatically generates all the necessary browser code from your selected language. The generated code is industry-standards compliant and therefore compatible with all major browsers.
Morfik extends the power of user-defined themes beyond Pages and Layouts to theme controls themselves.
All Morfik controls support themes directly and give the user unprecedented ease in customizing the look and feel of Morfik applications and websites without interfering with the application logic.
It provides a complete set of wizards to assist you with the creation of
new project elements such as Tables, Queries, Forms, Reports, Web
Methods and Modules.
Wizards also help you with deploying your application, linking or
importing external data sources, consuming Web Services, converting
projects and more.

Conclusion:
This is a very interesting web development tool, and well worth your while
to try out. You can find the latest Morfik version included in our
database_special.iso available from the Blaise website.
If you want the database_special.iso on a DVD, you will have to order it from the online Blaise shop.




Mini Course SQL by Miguel van de Laar


SQL (Structured Query Language) is an ANSI/ISO standard language for relational 'database management systems' (DBMS). It is a standardized language which can be used to manage data in a relational database, through methods such as query and update. SQL can be used with virtually all modern relational database products. SQL is a fourth generation language (4GL) because it is not imperative but declarative. This means that you use SQL to express the result you desire, without prescribing how the desired result will be obtained (as you would do explicitly in languages such as Pascal, C or Java).

SQL - Structured Query Language
SQL is a language which enables querying or updating relational databases such as IBM DB2, Firebird, Ingres, Microsoft Access, Microsoft SQL Server, MySQL, Oracle, PostgreSQL and SQLite. A relational database is a database system which stores a variety of relations between its data. These relations take the form of, for instance, fields, records, tables, and links to other fields, tables, records and databases.
Although SQL is formally a standard language, the implementations of SQL in the various commercially available databases can differ considerably. This is because vendors have often developed their own extensions to standard SQL. To meet the ANSI standard (ANSI is the American National Standards Institute, an organisation which manages a number of American standards), however, at least the basic commands (as discussed in this article) must be implemented. Other commands (for example to construct databases and tables or to manage users) are entirely dependent upon the specific database or vendor.
This article deals with the main basic SQL commands, which are: SELECT, INSERT, UPDATE, and DELETE.
These are the SQL commands most commonly used in program code. The other SQL commands are most commonly used in database maintenance. Most databases offer GUI tools for such maintenance tasks, and these tools employ these other commands. For further information about these additional SQL commands you will need to consult your database manual or make use of the appropriate internet forums. Commands in SQL are called statements.

SELECT
The SQL statement which is most often used is probably SELECT. After all, this is the statement used to question a database, and that is the reason why SQL is called a query language. The SELECT statement typically looks like this:

SELECT * FROM Customer

This SQL command requests a list of all elements in the table Customer. The name of the table being accessed follows the codeword FROM. The codeword FROM can, in fact, be followed not just by one table name, but by a list of all tables from which data will be retrieved.
SQL syntax provides so-called 'clauses' (particular codewords) which form the most important phrases within an SQL statement. The SELECT statement, for instance, can contain the following clauses: SELECT, FROM, WHERE, GROUP BY, and ORDER BY. In the SELECT statement only the SELECT and FROM clauses are mandatory; the other clauses are optional.
Here is a more extensive SELECT statement:

SELECT C.LastName, C.FirstName, I.TotalAmount AS Amount
FROM Customer AS C, Invoice AS I
WHERE NOT Paid
AND I.DueDate < Now()
AND C.Id = I.CustomerId
ORDER BY C.LastName

The SQL statement above asks for the customer names and the amounts due on all invoices which are not yet paid, but where the due date has passed. This example contains a SELECT, FROM, WHERE, and ORDER BY clause.
In this example the SELECT clause selects three fields: LastName, FirstName and TotalAmount. TotalAmount gets an alias here: Amount. And so in the result of the SELECT statement the last field will be named Amount, not TotalAmount. This can be useful when you need the result directly in a component where the names of the fields are used as column names.
The ORDER BY-clause indicates how the result must be sorted. If the clause is not present, the order will be arbitrary. You can also designate several fields here, separated by a comma. The default sort order is ascending, but by using the codeword DESC the sorting will be descending. E.g. ORDER BY Amount DESC places the highest TotalAmount at the top.
The FROM clause specifies that the table Customer can be accessed with the alias C and the table Invoice with alias I. The tables receive an alias here as a shorthand notation you can use when the tables have to be distinguished. When the table Customer contains the field LastName and table Invoice does not, you can use LastName in a SELECT clause unambiguously. But when both tables have a field with the same name, for instance CreationDate, the database doesn't know which field is meant. Then you must indicate whether you mean the Customer or the Invoice table, by using either one of the aliases C.CreationDate or I.CreationDate.
When there are several tables in the FROM-clause, this will result in a so-called Cartesian product of those tables. If there is no WHERE-clause, the number of records in the result is equal to the product of all records of all tables (for instance for 3 tables with 5 records each the result would contain 5 x 5 x 5 = 125 records). The records in the result then contain all possible combinations of the records in those tables. As we are only interested in combinations of Customer and Invoice in this example, where customer and invoice belong together, the condition C.Id = I.CustomerId is included. In principle, a WHERE-clause should contain such a coupling condition for each table after the first one.
The WHERE-clause indicates when records must be included in the result. For this purpose you use boolean statements to specify the conditions for a record's inclusion. In the case of the SELECT statement above, a record is included in the result when an invoice belonging to a certain customer is not yet paid whilst the due date has been passed. The function Now() used here returns the current date and time.
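To make the effect of the coupling condition concrete, here is a small sketch using only the Customer and Invoice tables from the example above. The first query produces the full Cartesian product; the second keeps only the pairs that belong together:

-- every customer paired with every invoice (Cartesian product):
SELECT C.LastName, I.TotalAmount
FROM Customer AS C, Invoice AS I

-- only matching pairs, thanks to the coupling condition:
SELECT C.LastName, I.TotalAmount
FROM Customer AS C, Invoice AS I
WHERE C.Id = I.CustomerId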


Mini Course SQL (continuation)


INSERT
To add a record to a table you can use an INSERT statement, as shown below:

INSERT INTO Customer (FirstName, LastName, Address, HouseNumber, City)
VALUES ("Henk", "Janssen", "Stationsstraat", 4, "Utrecht")

The INSERT INTO-clause contains the name of the table into which the record must be inserted and (between parentheses) the fields which will be filled with values. Fields which are not mentioned will get a default value (usually NULL, i.e. empty). Don't include auto-incremented fields (or Id fields) in the list of fields, since they will get new values inserted automatically.
The VALUES clause which follows INSERT INTO contains the values for the fields, in the same sequence as given in the INSERT INTO-clause. Depending upon the database you must enter textual values either with quotation marks ( " ) or apostrophes ( ' ). Note that the number of fields must be equal to the number of values.
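As a sketch of the default-value behaviour just described (the names "Ann" and "de Vries" are just illustrative), an INSERT that omits some fields leaves them to the database defaults:

INSERT INTO Customer (FirstName, LastName)
VALUES ("Ann", "de Vries")
-- Address, HouseNumber and City receive their default values (usually NULL)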
UPDATE
The UPDATE statement is meant to change values of records, as shown below:

UPDATE Customer
SET Address = "Steenweg", HouseNumber = 12,
    City = "Naarden"
WHERE Id = 12

The UPDATE statement above contains all clauses which may be


present in an UPDATE statement: the mandatory UPDATE- and
SET-clauses and the optional WHERE-clause. As you will notice,
the UPDATE statement has a slightly different layout than the
INSERT statement. The UPDATE-clause contains only the name
of the table in which one or more records must be changed. The
SET-clause then has in a comma-separated list, the fields which
must be changed, each consisting of the name of the field, an
equals sign ( = ) and the value which must replace the old value.
The WHERE-clause functions the same as in the SELECT
statement: the parameter of the WHERE-clause is a boolean
statement. The records for which this Boolean is True will be
changed. Take care: if no WHERE-clause is given, ALL records
will be changed!
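To illustrate that warning with a minimal sketch (table and field names as in the earlier examples): this statement has no WHERE-clause, so it silently rewrites the City field of every record in Customer:

UPDATE Customer
SET City = "Utrecht"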


DELETE
Our survey of basic SQL commands is completed by DELETE.
When designing a new database it is important to know the difference between logically and physically deleting records from a table. When you delete records only logically, they remain physically present in the database, but marked with a 'removed' status.
Take care that logically deleted records don't show up in queries (simply add to all SELECT statements a condition to exclude removed or inactive records). Logical removal is particularly recommended in situations where inexperienced users could delete enormous amounts of data or if logging takes place.
If you choose to physically delete a record, the database will no longer hold the deleted data anywhere.
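A minimal sketch of the logical-deletion pattern described above, assuming the Customer table has been given a boolean column (hypothetically named Removed here):

-- 'delete' logically by flagging the record instead of removing it:
UPDATE Customer
SET Removed = True
WHERE Id = 12

-- and exclude flagged records in every SELECT:
SELECT * FROM Customer
WHERE NOT Removed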
In SQL this is done by the DELETE statement:

DELETE FROM Customer
WHERE FirstName = "Peter"
AND LastName = "Halsema"

The DELETE statement has only two clauses: the mandatory DELETE FROM-clause and an optional WHERE-clause. DELETE FROM is followed by the name of the table from which one or more records are to be deleted. The operation of the WHERE-clause is the same as for the SELECT and UPDATE statements. Take care: if no WHERE-clause is given, you'll end up with an empty table!

To conclude
You can do much more using the basic SQL commands than what I have been able to cover here in an introductory article. If there is sufficient interest, I will examine more complex SQL statements in a subsequent article. If there are questions about SQL don't hesitate to email me (mvdlaar@gmail.com).

Is Lazarus ready for writing commercial applications ? by Zeljan


Editorial note:
This article illustrates the enormous progress Lazarus has made because of the dedicated work of the core developer team: it shows that nowadays Lazarus is in many ways on a par with Delphi. In several key areas it is way ahead of Delphi (64 bit compiled executables, multi-platform and multi-OS, allows development on mobile and embedded systems).
Of course it is not the same as Delphi, but it is very cheap - see the price of the Lazarus USB stick in the advertisement. So it is affordable for anyone, and you can get started immediately with no installation required. You can create any application you ever wanted, and the number of components available grows steadily.
So the answer is YES. Perfect for commercial applications!

I started my job in 2001 at a Croatian company called Holobit. At that time Holobit was a pretty small company with a few dozen customers.
My primary task was to make the company's current C++ business applications run on Linux and Windows by using Delphi and Kylix and Borland's CLX technology. After years of C/C++ coding on Linux, OOP looked very simple and well organized. Two months later I concluded that Borland had great products, and that coding time is much shorter than it was with C/C++ (gtk+, qt) using the vi editor. Anyway, within 3 months our business applications were converted to CLX and the company started selling for Linux & Win32. All was done with Kylix 2 and Delphi 6 (later upgraded to K3 & D7).
Creating native Linux apps was a good decision, so the number of customers started growing rapidly. Our customers were happy with the ability to choose between Linux and Windows client apps for desktop PCs, because it saves some money and creates a better and more secure environment.
The second conversion issue that looked rather complicated - at that time - was databases. When I started to convert our applications, all of them used FoxPro, and I was very disappointed with it, because I had already used PostgreSQL on Linux. So your guess that we moved all of our applications to PostgreSQL is quite correct.
At that time I didn't know third party components like Zeos etc., so I wrote my own PostgreSQL driver and used it for several years. Later when I found Zeos - a nice surprise - I immediately started to use that. So it went on until 2004, when there were rumors that Kylix was dead; no news from Borland, just silence... Yes, it was dead: shame on you Borland, not because you put Kylix into its grave, but because you cheated your customers.
For years after that we kept fighting with the Borland products (in the meantime Kylix could not run on any distribution based on glibc higher than 2.4.x), until I saw that someone had started a Qt widgetset in the Lazarus project, and that guy was Felipe. Thanks also go to Den Jean for the Qt C bindings, because without C bindings we could not have a Qt widgetset inside Lazarus.
I had looked into Lazarus just a few times before, but I was not attracted previously, because it supported only the Gtk1 widgetset, which looked awful compared to the Qt2 used by Kylix. So now I was motivated to download Lazarus trunk and find out how it worked with Qt (I had already tried Gtk before).

Well, as I mentioned already, work on the Qt widgetset had only just started; the result needed a lot of improvements, and much of it did not work.
After a quick scan of Lazarus principles, the Lazarus component
library (LCL) and widgetset connections to the LCL I began to
contribute to the Lazarus project with a primary goal to get Qt
widgetset up and running.
My first patches then were sent to Felipe. He argued about my coding
standards (hey, hey), but I fixed that and changed my coding standards to
the Lazarus coding standard.
Anyway, after a year or so the Qt widgetset became useable - in the
meantime Lazarus developers granted me svn write access - so no
more need to wait for Felipe and others to commit my patches.
At the same time business problems arose with Kylix & Delphi programming, and the company management thought about moving the complete codebase to Java or .NET.
When the management decided this change must be made quickly I
objected.

Is Lazarus ready for commercial applications? (Continuation 1)


I was not very happy with that. Not because of the applications, but because of all the third party components used in our applications (ZeosLib, FastReports, TMS grids, VirtualTrees etc). I said that we would need a lot of time and resources to move our code to Java or .NET, and the result of that operation was not reassuring.
I was disturbed by these business decisions (and already had it in mind to change jobs), so one day I asked the chief if they could agree to let me spend some time developing code using Lazarus, and in the next few months I showed some of our apps running on Qt4.
So the struggle started...
I had started a race against time: I had to fix the Qt LCL and convert one of our applications to the LCL (only a small one). That wasn't an easy task, since the Qt LCL was still not finished and a lot of things were not working.
Zeos for Lazarus already existed, but for this simple application I had to have FastReports and TMS grids. So I had three months to make Qt under Lazarus useable and to convert FastReports and TMS grids (both CLX licensed)...
After hundreds of hours of coding, the
day of reckoning came. I had to show my
work at the end of February 2008. I made
a presentation on Linux, 32 bit Windows
and on Mac OSX and the company
management was pleased and satisfied
with it.
Of course there were still bugs and
features not yet implemented, but they
appreciated my main argument. If we
moved to Lazarus we would be able to
work on other (even more) supported
platforms, and also because Lazarus is an
open source project, we would not be at
the mercy of decisions made by other
companies (such as Borland) that had
harmed us in the past.
That became the happiest day in the last
few years of my working life. I was given
the budget and time required to move our
applications to Lazarus.
Now I had a reasonable time (15 months)
to improve Lazarus and re-write our
applications for Lazarus (and deal with
other everyday tasks).
During 2008/2009 I converted all the
third party components and all of our
applications to FPC/Lazarus, and
therefore also contributed a lot of patches
to the Lazarus project.
The goal was reached: Lazarus is better now than Kylix 3, and we have deployed LCL applications at more than 3,500 customer sites.
User impressions were positive, since our apps look native on all platforms. A few dozen Mac OS X users were also happy, since we gave them native apps for the first time (previously they used Parallels + a Linux VM).
WOW, what a glorious day. We just did not
need the Borland products anymore.


Is Lazarus ready for commercial applications ? (Continuation 2)

Now our complete range of software is developed using FPC/Lazarus and uses the PostgreSQL RDBMS:
1. HoloERP - ERP system with > 400 modules (forms)
2. Cafeman - Caffe bars & restaurants backoffice and POS system
3. TSuS - small shops backoffice & POS system
4. Cinema - software for cinemas (reservations, tickets etc)
5. ArhStudio - architects documentation database.
All of these applications use the following 3rd party components:
ZeosLib
FastReports (ported CLX)
TMS Grids (ported CLX; we also licensed the newest VCL version and ported that too)
TMS Planner (ported CLX, later VCL)
FlexCell (licensed LCL - yes, there's an LCL version)
Our custom components


Conclusion:
Lazarus is ready for commercial usage, especially for people with legacy Kylix 3 / Delphi 7 codebases.
My personal opinion is that Lazarus Qt is much better than K3/D7 at this time (0.9.29 trunk), and developers will be happy with its new 0.9.30 version.
Why?
It is the only OOP RAD tool which supports so many platforms.
Constantly developed by volunteers, it does not depend on commercial decisions, so you can avoid bankruptcy etc.
It costs almost nothing except energy and time.
If it doesn't fit your needs, you can change it and contribute.
If there's a bug, you can fix it and contribute the fix, or at least you can open an issue in the Lazarus Mantis issue tracker.


Delphi JSON Viewer by Paweł Głowacki


starter / expert - Delphi 2010

JSON has become the X in Ajax. It is now the preferred data format for Ajax applications. The most common way to use JSON is with XMLHttpRequest. Once a response text is obtained, it can quickly be converted into a JavaScript data structure and processed by an application. JSON's syntax is significantly simpler than XML's, so parsing is more efficient. JSON doesn't have namespaces. Every object is a namespace: its set of keys is independent of all other objects, even exclusive of nesting. JSON uses context to avoid ambiguity, just as programming languages do. JSON has no validator. Being well-formed and valid is not the same as being correct and relevant: ultimately, every application is responsible for validating its inputs.

JSON support has been introduced in Delphi 2010 as a part of the DBExpress database driver architecture, but of course JSON support is not limited to just database applications. JSON is similar to XML as both are text-based data interchange formats. Delphi 6 introduced the TXMLDocument component and the XML Data Binding Wizard to make it easier to work with XML documents in code. In this paper I'm presenting a similar component for JSON called TJSONDocument. The next step was to implement a TJSONTreeView component for displaying the content of a TJSONDocument in VCL Forms applications. Using these components a simple JSON viewer application has been created and is described here.
In the XML world there are two categories of parsers: DOM and SAX. A DOM (Document Object Model) parser reads XML strings and builds an in-memory representation of them which applications can traverse and update. On the other hand, SAX is a streaming interface: applications receive information from XML documents in a continuous stream, with no backtracking or navigation allowed. Towards the end of this article I describe the TJSONParser component that was created as an experimental TJSONDocument descendant that provides SAX-like event-based processing for JSON [1].
Below is a fragment of sample JSON text, based on the Sample Konfabulator Widget from [5], used in the later part of this article.

{
"widget": {
"debug": "on",
"window": {
"title": "Sample Konfabulator Widget",
"name": "main_window",
"width": 500,
"height": 500
},
"misc": ["hello", 23, false]
}
}

JSON is relatively new: it was first described by Douglas Crockford in July 2006 in his IETF Request for Comments "The application/json Media Type for JavaScript Object Notation" [2]. In many respects JSON is similar to XML, as both are text-based data interchange formats widely used across the Web. While XML has now become a whole family of related standards - including XML Namespaces, XML Schema, XSL, XPath and others - JSON defines only a small set of formatting rules for the portable representation of structured data. The key strength of JSON is its simplicity. Douglas Crockford describes JSON structure in his paper presented at the XML 2006 Conference in Boston, "JSON: The Fat-Free Alternative to XML" [3]: the types represented in JSON are strings, numbers, booleans, objects, arrays, and null. JSON syntax is nicely expressed in railroad diagrams.

Why JSON?
JSON only has three simple types - strings, numbers and booleans - and two complex types - arrays and objects. A string is a sequence of zero or more characters wrapped in quotes with backslash escapement, the same notation used in most programming languages.
A number can be represented as integer, real, or floating point. JSON does not support octal or hex. It does not have values for NaN or Infinity. Numbers are not quoted.
A JSON object is an unordered collection of key/value pairs. The keys are strings and the values are any of the JSON types. A colon separates the keys from the values, and a comma separates the pairs. The whole thing is wrapped in curly braces. A JSON array is an ordered collection of values separated by commas and enclosed in square brackets.
The character encoding of JSON text is always Unicode. UTF-8 is the only encoding that makes sense on the wire, but UTF-16 and UTF-32 are also permitted. JSON has no version number. No revisions to the JSON grammar are anticipated.

Delphi and DBXJSON.pas
Many programming languages have built-in support for JSON or libraries to work with JSON. These JSON bindings for different programming languages are listed on the JSON homepage [4], including three open source Delphi libraries. Since Delphi 2010 JSON support is part of the VCL library, as implemented in the DBXJSON.pas unit.
In order to visualize the Delphi classes responsible for JSON support, I added the DBXJSON unit directly to a little test Delphi application, clicked on the Model Support tab in the Project Manager and got the following UML class diagram. Some of the classes from the DBXJSON.pas unit not related directly to JSON support are not shown here.


In Delphi 2007 the DBX database driver framework architecture was re-engineered in pure Delphi code, and this introduced a number of interesting features including extensible command types. In the release that followed - Delphi 2009 - the DBX architecture was extended, and the DataSnap framework for building client/server and multi-tier applications was re-engineered as an extension of the new DBX architecture.

Delphi JSON Viewer (continuation 1)


One of the most interesting and powerful capabilities introduced through DataSnap support in the release following Delphi 2009 - Delphi 2010 - were lightweight callbacks passed to DataSnap server methods. These allow the server application to call back into the client. The DBXJSON.pas unit defines the abstract base class for callback objects, which contains an Execute method that accepts and returns parameters of TJSONValue type. This enables the developer to pass data parameters to and from the server; these can be arbitrarily complex data structures encoded as JSON. Here is the declaration of this class:
TDBXCallback = class abstract
public
function Execute(const Arg: TJSONValue): TJSONValue;
virtual; abstract;
// other members stripped out
end;
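As an illustrative sketch (not from the article) of how such a callback might be implemented, assuming the Delphi 2010 DBXJSON unit; TEchoCallback is a hypothetical name:

type
  TEchoCallback = class(TDBXCallback)
  public
    function Execute(const Arg: TJSONValue): TJSONValue; override;
  end;

function TEchoCallback.Execute(const Arg: TJSONValue): TJSONValue;
begin
  // A real callback would inspect Arg (an arbitrarily complex JSON value
  // sent by the server) and build a meaningful reply; here we simply
  // acknowledge the call with a JSON true value.
  Result := TJSONTrue.Create;
end;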

The DBXJSON unit also contains functionality to parse JSON text into a graph of TJSONValue descendants and to generate JSON text from the graph of objects in memory. In the diagram the "TJSONAncestor.Owned" property (a boolean value) has been expanded, to underline the fact that all JSON descendants have the Owned property that controls the lifetime of JSON objects in memory.
The TJSONObject class contains a static method ParseJSONValue that effectively implements the JSON parser functionality. It accepts a parameter with JSON text and returns a TJSONValue reference to the root of the graph of TJSONAncestor descendants.
It is also possible to generate JSON text from the in-memory tree of JSON objects by calling the overloaded ToString method on any of the TJSONAncestor descendants.
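As a small round-trip sketch under the same assumptions (Delphi 2010 with DBXJSON; the JSON literal is only an example), parsing text into a value graph and regenerating text with ToString might look like this:

program JsonRoundTrip;
{$APPTYPE CONSOLE}
uses
  SysUtils, DBXJSON;
var
  v: TJSONValue;
begin
  // Parse JSON text into a graph of TJSONAncestor descendants.
  v := TJSONObject.ParseJSONValue(BytesOf('{"debug":"on"}'), 0);
  try
    if v <> nil then
      Writeln(v.ToString); // regenerates the JSON text
  finally
    v.Free;
  end;
end.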
The TJSONDocument component
Before Delphi 2010 introduced the DBXJSON unit, I was trying to
implement JSON parsing functionality manually by coding JSON
railroad diagrams.
With the DBXJSON implementation in place there is little point in reinventing the wheel; however there is still no design-time support for JSON.
Everything has to be done in code. Hence the idea of creating a
minimal VCL component wrapper for JSON parser implementation
provided by a TJSONObject.ParseJSONValue class method that
accepts JSON text and returns the object tree representing the
corresponding JSON document structure in memory.
The TJSONDocument component has been implemented inside a
unit named "jsondoc" to mirror the "xmldoc" name of the unit
containing the implementation of TXMLDocument class.
Below is the declaration of the TJSONDocument VCL component:
unit jsondoc;
//
type
TJSONDocument = class(TComponent)
private
FRootValue: TJSONValue;
FJsonText: string;
FOnChange: TNotifyEvent;
procedure SetJsonText(const Value: string);
procedure SetRootValue(const Value: TJSONValue);
protected
procedure FreeRootValue;
procedure DoOnChange; virtual;
public
class function IsSimpleJsonValue(v: TJSONValue):
boolean; inline;
class function UnQuote(s: string): string; inline;
class function StripNonJson(s: string): string; inline;
constructor Create(AOwner: TComponent); override;
destructor Destroy; override;
function ProcessJsonText: boolean;
function IsActive: boolean;
function EstimatedByteSize: integer;
property RootValue: TJSONValue read FRootValue write
SetRootValue;
published
property JsonText: string read FJsonText write
SetJsonText;
property OnChange: TNotifyEvent read FOnChange write
FOnChange;
end;


The full source code of this component and all other source code
described in this paper can be downloaded from [1]. See the
References section at the end of this article.
The TJSONDocument class contains a published JsonText: string
property that can be used to assign JSON text for parsing and a
RootValue: TJSONValue public property that can be used to assign a
TJSONValue reference and generate JSON text.
Assigning to either of these properties causes the other property to be
updated and the OnChange event is fired every time the JSON
stored inside the component is changed.
In this way it is possible for other components of an application to be
notified and refreshed. In this sense the TJSONDocument
component can be used as a JSON parser and generator as described in
the original JSON RFC [2].
The public IsActive: boolean property returns true if the TJSONDocument component contains valid JSON, or false if it is empty.
function TJSONDocument.IsActive: boolean;
begin
Result := RootValue <> nil;
end;
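A minimal usage sketch of the component exactly as declared above (the JSON literal is only an example):

var
  Doc: TJSONDocument;
begin
  Doc := TJSONDocument.Create(nil);
  try
    // Assigning JsonText triggers ProcessJsonText as a side effect.
    Doc.JsonText := '{"widget":{"debug":"on"}}';
    if Doc.IsActive then
      Writeln(Doc.RootValue.ToString); // regenerate JSON from the object graph
  finally
    Doc.Free;
  end;
end;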

The TJSONObject.ParseJSONValue: TJSONValue method is


sensitive to the contents of the JSON text passed for parsing. If the
string provided does not contain valid JSON text or it contains JSON
text with additional whitespace characters, then it always returns a nil
TJSONValue reference. The class function StripNonJson is used to
remove from JSON text any non JSON characters and is implemented
as follows using the TCharacter class from the VCL Character unit.
class function TJSONDocument.StripNonJson(s: string):
string;
var ch: char; inString: boolean;
begin
Result := '';
inString := false;
for ch in s do
begin
if ch = '"' then
inString := not inString;
if TCharacter.IsWhiteSpace(ch) and not inString then
continue;
Result := Result + ch;
end;
end;

The process of JSON parsing is implemented in the ProcessJsonText method, which is called as a side-effect of assigning to the published JsonText: string property.
procedure TJSONDocument.SetJsonText(const Value: string);
begin
if FJsonText <> Value then
begin
FreeRootValue;
FJsonText := Value;
if FJsonText <> '' then ProcessJsonText
end;
end;
function TJSONDocument.ProcessJsonText: boolean;
var s: string;
begin
FreeRootValue;
s := StripNonJson(JsonText);
FRootValue := TJSONObject.ParseJSONValue(BytesOf(s),0);
Result := IsActive;
DoOnChange;
end;

The TJSONDocument was designed to be as minimal as possible.


For convenience it also surfaces the EstimatedByteSize: integer
method provided by the underlying DBXJSON implementation.
This is how the TJSONDocument component looks at design-time
inside the Delphi 2010 Object Inspector.


Delphi JSON Viewer (continuation 2)


The TJSONTreeView component
I have always wanted to implement a JSON viewer in Delphi. The
TJSONDocument component is non-visual, so I needed a separate
visual component that would provide a graphical tree-representation of
JSON. This component should have a JSONDocument published
property to connect both components at design-time.
How should the JSON data be visualized? Would a simple Delphi TTreeView component suffice, or should I do some fancy painting in code? Perhaps I should use a TVirtualTreeView component to have a tree with multiple columns?
These are all good questions, so I had to do a little googling around for inspiration. There are both simple and complex JSON viewers available on the Internet. Some of them are standalone applications, like the one coded in .NET and available at http://jsonviewer.codeplex.com/. Other viewers are embedded in web pages like http://www.jsonviewer.com/ or http://jsonviewer.stack.hu/. The one that I liked the most was implemented in Java and is available as a part of the Apache Pivot project [6] for Rich Internet Applications: http://pivot.apache.org/demos/json-viewer.html.
The one cosmetic thing that I do not like about it is that it sorts JSON properties alphabetically and does not preserve the original ordering of object pairs.
Below is a screenshot from the Apache Pivot web page; this is my desired TreeView-based JSON viewer functionality.
In addition to the original viewer feature set I have also added the possibility to display child counts next to every non-empty JSON object or array, and an estimated byte size of a given JSON tree node. The main functionality of this component is implemented in its public LoadJson procedure, which populates the tree view based on the content of the connected TJSONDocument component.

Here we go
I decided to create my JSON tree view component as a descendant of the
Delphi VCL TTreeView component. A good Delphi programming practice
would be to derive it from TCustomTreeView instead, to be able to
decide which inherited protected members of the class should be
declared as published. In my case I want the end user to have access
to the whole TTreeView component functionality at design-time, so I do
not need to hide any inherited properties.
unit jsontreeview;

type
  TJSONTreeView = class(TTreeView)
    // private fields and property setters omitted in this listing;
    // the read/write specifiers below are assumed
  public
    procedure LoadJson;
  published
    property JSONDocument: TJSONDocument
      read FJSONDocument write SetJSONDocument;
    property VisibleChildrenCounts: Boolean
      read FVisibleChildrenCounts write SetVisibleChildrenCounts;
    property VisibleByteSizes: Boolean
      read FVisibleByteSizes write SetVisibleByteSizes;
  end;


procedure TJSONTreeView.LoadJson;
var
  v: TJSONValue;
  currNode: TTreeNode;
  i, aCount: integer;
  s: string;
begin
  ClearAll;
  if (JSONDocument <> nil) and JSONDocument.IsActive then
  begin
    v := JSONDocument.RootValue;
    Items.Clear;
    if TJSONDocument.IsSimpleJsonValue(v) then
      Items.AddChild(nil, TJSONDocument.UnQuote(v.Value))
    else if v is TJSONObject then
    begin
      aCount := TJSONObject(v).Size;
      s := '{}';
      if VisibleChildrenCounts then
        s := s + ' (' + IntToStr(aCount) + ')';
      if VisibleByteSizes then
        s := s + ' (size: ' + IntToStr(v.EstimatedByteSize) + ' bytes)';
      currNode := Items.AddChild(nil, s);
      for i := 0 to aCount - 1 do
        ProcessPair(currNode, TJSONObject(v), i)
    end
    else if v is TJSONArray then
    begin
      aCount := TJSONArray(v).Size;
      s := '[]';
      if VisibleChildrenCounts then
        s := s + ' (' + IntToStr(aCount) + ')';
      if VisibleByteSizes then
        s := s + ' (size: ' + IntToStr(v.EstimatedByteSize) + ' bytes)';
      currNode := Items.AddChild(nil, s);
      for i := 0 to aCount - 1 do
        ProcessElement(currNode, TJSONArray(v), i)
    end
    else
      raise EUnknownJsonValueDescendant.Create;
    FullExpand;
  end;
end;
procedure TJSONTreeView.ProcessPair(currNode: TTreeNode;
  obj: TJSONObject; aIndex: integer);
var
  p: TJSONPair;
  s: string;
  n: TTreeNode;
  i, aCount: integer;
begin
  p := obj.Get(aIndex);
  s := TJSONDocument.UnQuote(p.JsonString.ToString) + ' : ';
  if TJSONDocument.IsSimpleJsonValue(p.JsonValue) then
  begin
    Items.AddChild(currNode, s + p.JsonValue.ToString);
    exit;
  end;
  if p.JsonValue is TJSONObject then
  begin
    aCount := TJSONObject(p.JsonValue).Size;
    s := s + ' {}';
    if VisibleChildrenCounts then
      s := s + ' (' + IntToStr(aCount) + ')';
    if VisibleByteSizes then
      s := s + ' (size: ' + IntToStr(p.EstimatedByteSize) + ' bytes)';
    n := Items.AddChild(currNode, s);
    for i := 0 to aCount - 1 do
      ProcessPair(n, TJSONObject(p.JsonValue), i);
  end
  else if p.JsonValue is TJSONArray then
  begin
    aCount := TJSONArray(p.JsonValue).Size;
    s := s + ' []';
    if VisibleChildrenCounts then
      s := s + ' (' + IntToStr(aCount) + ')';
    if VisibleByteSizes then
      s := s + ' (size: ' + IntToStr(p.EstimatedByteSize) + ' bytes)';
    n := Items.AddChild(currNode, s);
    for i := 0 to aCount - 1 do
      ProcessElement(n, TJSONArray(p.JsonValue), i);
  end
  else
    raise EUnknownJsonValueDescendant.Create;
end;



procedure TJSONTreeView.ProcessElement(currNode: TTreeNode;
  arr: TJSONArray; aIndex: integer);
var
  v: TJSONValue;
  s: string;
  n: TTreeNode;
  i, aCount: integer;
begin
  v := arr.Get(aIndex);
  s := '[' + IntToStr(aIndex) + '] ';
  if TJSONDocument.IsSimpleJsonValue(v) then
  begin
    Items.AddChild(currNode, s + v.ToString);
    exit;
  end;
  if v is TJSONObject then
  begin
    aCount := TJSONObject(v).Size;
    s := s + ' {}';
    if VisibleChildrenCounts then
      s := s + ' (' + IntToStr(aCount) + ')';
    if VisibleByteSizes then
      s := s + ' (size: ' + IntToStr(v.EstimatedByteSize) + ' bytes)';
    n := Items.AddChild(currNode, s);
    for i := 0 to aCount - 1 do
      ProcessPair(n, TJSONObject(v), i);
  end
  else if v is TJSONArray then
  begin
    aCount := TJSONArray(v).Size;
    s := s + ' []';
    if VisibleChildrenCounts then
      s := s + ' (' + IntToStr(aCount) + ')';
    if VisibleByteSizes then
      s := s + ' (size: ' + IntToStr(v.EstimatedByteSize) + ' bytes)';
    n := Items.AddChild(currNode, s);
    for i := 0 to aCount - 1 do
      ProcessElement(n, TJSONArray(v), i);
  end
  else
    raise EUnknownJsonValueDescendant.Create;
end;

JSON Standalone Viewer
In the next step I used the TJSONDocument and TJSONTreeView components
to implement a simple JSON Viewer application. The functionality is
minimal. You can clear the current contents of the JSON viewer using
the Clear button, or you can copy JSON text to the clipboard and paste
it into the viewer window using the Paste button. There is also a
popup menu to control whether children counts and node byte sizes are
displayed. The application icon was created using IcoFX
(http://icofx.ro/) directly from the JSON logo downloaded from the
JSON home page.
Below is a screenshot of the Delphi JSON Viewer at runtime. Just copy
some JSON text to the clipboard and paste it.

TJSONParser Component
In a sense the TJSONDocument component can be considered an
implementation of a Document Object Model for JSON. But what about SAX
for JSON? SAX, or Simple API for XML, presents a completely different
approach to document parsing. Instead of building an in-memory
representation of the document, it just goes through it and fires
events for every syntactical element encountered [7]. It is up to the
application to process the events it is interested in, for example to
find something inside a large document.
Based on the TJSONDocument I have implemented an experimental
TJSONParser component that implements a SAX processing model for JSON.
A bullet-proof SAX parser for JSON would have to be implemented from
scratch, directly parsing JSON text and firing the relevant events; in
my case it sits on top of the in-memory representation of JSON.
The jsonparser unit contains the following enumerated type that lists
the different token types that can be found in a JSON text:
type
  TJSONTokenKind = (jsNumber, jsString, jsTrue, jsFalse, jsNull,
    jsObjectStart, jsObjectEnd, jsArrayStart, jsArrayEnd,
    jsPairStart, jsPairEnd);

There is also a declaration of a TJSONTokenEvent that is fired when a
JSON token is encountered:

type
  TJSONTokenEvent = procedure(ATokenKind: TJSONTokenKind;
    AContent: string) of object;

The "TJSONParser" class is derived from "TJSONDocument" and


declared as follows:
type
TJSONParser = class(TJSONDocument)
private
FOnToken: TJSONTokenEvent;
FTokenList: TJSONTokenList;
procedure DoOnAddToTokenListEvent(ATokenKind:
TJSONTokenKind; AContent: string);
procedure DoOnFireTokenEvent(ATokenKind:
TJSONTokenKind; AContent: string);
public
constructor Create(AOwner: TComponent); override;
destructor Destroy; override;
procedure FireTokenEvents;
procedure BuildTokenList;
procedure DoProcess(val: TJSONValue; aTokenProc:
TJSONTokenProc);
property TokenList: TJSONTokenList read FTokenList;
published
property OnToken: TJSONTokenEvent read FOnToken write
FOnToken;
end;

The TJSONParser component can do two things. If you call the
FireTokenEvents public method, it will traverse the underlying JSON
document and fire an OnToken event for every token it encounters. It
is also possible to build a token list in memory that can be accessed
via the TokenList property. This could be useful if we would like to
implement a JSON viewer using the Virtual Tree View component, which
requires fast access to the underlying data structure.
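As a quick illustration, here is a minimal sketch of hooking up the
OnToken event; the form, its memo and the JSON literal are
assumptions, not part of the component sources:

procedure TForm1.TokenHandler(ATokenKind: TJSONTokenKind; AContent: string);
begin
  // one line per syntactical element encountered during traversal
  Memo1.Lines.Add(Format('%d: %s', [Ord(ATokenKind), AContent]));
end;

procedure TForm1.ParseClick(Sender: TObject);
var
  Parser: TJSONParser;
begin
  Parser := TJSONParser.Create(nil);
  try
    Parser.OnToken := TokenHandler;
    Parser.JsonText := '{"name":"Blaise","issue":13}'; // parse via the inherited property
    Parser.FireTokenEvents;                            // fires OnToken for every token
  finally
    Parser.Free;
  end;
end;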
What was interesting during the implementation of these two methods
was the fact that the underlying document traversal algorithm was the
same for both firing the events and building the list. In order to
avoid code duplication, I decided to parameterize the traversal
algorithm using anonymous methods.
The following anonymous method signature was defined in the jsonparser
unit:

type
  TJSONTokenProc = reference to procedure(ATokenKind: TJSONTokenKind;
    AContent: string);

The signature of this method matches both the DoOnAddToTokenListEvent
and the DoOnFireTokenEvent private methods in the declaration of the
TJSONParser class. The actual document traversal algorithm has been
implemented inside a DoProcess method that is called from both the
FireTokenEvents and the BuildTokenList methods in the following way:

procedure TJSONParser.BuildTokenList;
begin
  if RootValue <> nil then
    DoProcess(RootValue, DoOnAddToTokenListEvent);
end;

procedure TJSONParser.FireTokenEvents;
begin
  if RootValue <> nil then
    DoProcess(RootValue, DoOnFireTokenEvent);
end;

procedure TJSONParser.DoOnFireTokenEvent(ATokenKind: TJSONTokenKind;
  AContent: string);
begin
  if Assigned(FOnToken) then
    FOnToken(ATokenKind, AContent);
end;

procedure TJSONParser.DoOnAddToTokenListEvent(ATokenKind: TJSONTokenKind;
  AContent: string);
begin
  FTokenList.Add(ATokenKind, AContent);
end;

In this way we have both functionalities implemented without code
duplication inside the recursive DoProcess method:
procedure TJSONParser.DoProcess(val: TJSONValue;
  aTokenProc: TJSONTokenProc);
var
  i: integer;
begin
  if val is TJSONNumber then
    aTokenProc(jsNumber, TJSONNumber(val).Value)
  else if val is TJSONString then
    aTokenProc(jsString, TJSONString(val).Value)
  else if val is TJSONTrue then
    aTokenProc(jsTrue, 'true')
  else if val is TJSONFalse then
    aTokenProc(jsFalse, 'false')
  else if val is TJSONNull then
    aTokenProc(jsNull, 'null')
  else if val is TJSONArray then
  begin
    aTokenProc(jsArrayStart, '');
    with val as TJSONArray do
      for i := 0 to Size - 1 do
        DoProcess(Get(i), aTokenProc);
    aTokenProc(jsArrayEnd, '');
  end
  else if val is TJSONObject then
  begin
    aTokenProc(jsObjectStart, '');
    with val as TJSONObject do
      for i := 0 to Size - 1 do
      begin
        aTokenProc(jsPairStart, Get(i).JsonString.ToString);
        DoProcess(Get(i).JsonValue, aTokenProc);
        aTokenProc(jsPairEnd, '');
      end;
    aTokenProc(jsObjectEnd, '');
  end
  else
    raise EUnknownJsonValueDescendant.Create;
end;

The TJSONParser component can be used as a starting point for
arbitrary JSON processing at the lowest level of actual JSON text
tokens. Delphi anonymous methods are really cool!

Summary
JSON is currently probably the most important data interchange format
in use. Its simplicity makes it easy to process, and information
encoded with JSON is typically smaller than the same information
encoded in XML. Over the years XML has become a whole family of
specifications, and it is not a trivial task to implement a fully
compliant XML parser from scratch.
Delphi 6 was the first commercial IDE on the market to introduce
support for XML SOAP web services. Delphi 6 also introduced the
TXMLDocument component and the XML Data Binding Wizard to make it
easier to work with XML.
JSON so far lacks something equivalent to an XML Schema (which
abstracts a cross-platform representation of XML metadata). However, a
JSON equivalent is slowly emerging. On the JSON home page you can find
a reference to a draft version of the IETF RFC "A JSON Media Type for
Describing the Structure and Meaning of JSON Documents" [8]. This is
still pending feedback, but in the future it could be a starting point
for implementing a Data Binding Wizard for JSON.
In this article I have described a JSON Viewer application implemented
with Embarcadero Delphi 2010. The source code that accompanies this
paper is organized in the form of two packages for Delphi components
(one runtime and one design-time) and the djsonview Delphi VCL Forms
application that implements the Delphi JSON Viewer.

References
1. Source code for this article: http://cc.embarcadero.com/item/27788
2. JSON RFC: http://www.ietf.org/rfc/rfc4627.txt
3. JSON: The Fat-Free Alternative to XML: http://www.json.org/fatfree.html
4. JSON Home Page: http://www.json.org
5. JSON Examples: http://www.json.org/example.html
6. Apache Pivot JSON Viewer: http://pivot.apache.org/demos/json-viewer.html
7. Simple API for XML Home Page: http://www.megginson.com/downloads/SAX/
8. Draft JSON Schema: http://tools.ietf.org/html/draft-zyp-json-schema-02
9. Embarcadero Delphi Home Page: http://www.embarcadero.com/products/delphi

About the author:
Paweł Głowacki is Embarcadero Technologies' European Technical Lead
for Delphi, RAD Studio and All-Access technologies. Previously, Paweł
spent over 7 years working as a senior consultant and trainer for
Delphi within Borland Education Services and CodeGear. As well as
working with Embarcadero customers across the region, he also
represents Embarcadero internationally as a conference and seminar
speaker.
For more information check out Paweł's technical blog at
http://blogs.embarcadero.com/pawelglowacki



NexusDB exceptionally good, a real surprise... by Erwin Mouthaan


Nexus Database Systems is an Australian-based company. The website
states that NexusDB is the best database for Delphi. This is a strong
statement and very much worth investigating.
History
Nexus Database Systems was founded in 2003, but its origins date back
much earlier. For years, TurboPower delivered state-of-the-art Delphi
components. One of them was FlashFiler: a very popular and extremely
fast file-based database engine. NexusDB continues building on the
legacy of this successful predecessor. By the way, the complete
library of TurboPower Delphi components is free to use: TurboPower
made it open source and published the source code. Taking a look at it
is very much recommended. I am almost sure that these components have
been described in past issues of Blaise.
NexusDB is used by a great number of companies, such as Honda, Shell,
IBM, Motorola and Burger King.

After installation a number of shortcuts appear on the desktop. One
shortcut is for the so-called Enterprise Manager, the program that
supervises the different databases. The two other shortcuts are for
help information. As shown below, I was walking on clouds....

Figure 2: Shortcuts on the desktop

Properties
NexusDB supports everything that can be expected of a modern database:
triggers, transactions, views and stored procedures. The performance
of NexusDB is high. Nexus has a specially designed memory manager
which performs best on multi-core computers; this makes NexusDB's
performance on such machines superior to that of other databases. Use
of the Nexus memory manager is optional: the standard Delphi memory
manager may be replaced by it. (You add the
nxReplacementMemoryManager unit in the first place of the project's
uses list, and that's all there is to it; any beginner should manage
it.)
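In a project file that looks roughly as follows; a minimal sketch in
which only the nxReplacementMemoryManager unit name comes from Nexus,
while the program, unit and form names are illustrative:

program MyNexusApp;

uses
  nxReplacementMemoryManager, // must come first so every allocation uses the Nexus MM
  Forms,
  MainUnit in 'MainUnit.pas' {MainForm};

begin
  Application.Initialize;
  Application.CreateForm(TMainForm, MainForm);
  Application.Run;
end.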
Another advantage is the simple installation procedure of the server.
The installation program is about 5 MB in size and installation
proceeds very quickly. No separate database administrator is needed.
NexusDB may be applied in different ways. Of course, NexusDB may be
used in a Client/Server architecture. Another way is to compile the
database engine into the project. In doing so, your Delphi program is
shipped as a single executable file, without the worries of
installation procedures and configuration of a separate NexusDB
Server. This embedded version of NexusDB is even free to use: download
it free of charge from the website! Besides that, a so-called hybrid
architecture is also possible, where a combination of Client/Server
and embedded mode is used.
Installation
First I will focus on the embedded version of NexusDB. The latest
version of NexusDB is 3.05 and it was released at about the same
moment as the latest Delphi XE release. The free embedded version of
NexusDB can be downloaded from
http://www.nexusdb.com/support/index.php?q=FreeEmbedded.

Figure 3: Tool Palette of the NexusDB
It is not possible to supply a full description of all these
components in this introductory article, but they are all explained in
detail in the accompanying user manual. It is a big advantage that
data-access components are included. For example, look at the
TnxDatabase, TnxTable, TnxQuery and TnxStoredProc components. These
components enable the use of standard data-aware controls in Delphi to
implement database applications in the way we are used to.
The installation runs fast and trouble-free.
Management Tool
In addition to all the components, a so-called Enterprise Manager is
included. The Enterprise Manager is a program to manage Nexus
databases: think of the construction of tables, execution of SQL
scripts, definition of indexes, etc. All examples in the user manual
use the so-called Northwind database. After installation, the SQL
script to generate this database may be found at:
C:\Program Files\NexusDB3\Sample Databases\Northwind.sql.

Figure 1: Starting the install procedure


Using the Enterprise Manager we execute the script to build the
Northwind database. These steps are described in detail in the user
manual; see NexusDB Manual V3 | Delphi Guide | Code Examples &
Fragments | Creating the Northwind sample database.


Figure 4: The Enterprise Manager


At this point my hard disk has a folder C:\data\Northwind with a number of files. For each table there is a so-called .nx1 file. In the
next release, NexusDB v4, due later this year, the database will be housed in a single file.

Figure 5: Windows Explorer



An embedded example
The use of an embedded database frees the user from the need to
install a separate database server. The embedded version of NexusDB
does not even require a DLL: the database engine is linked directly
into the application. In this way, the application may be distributed
on CD-ROM and started directly, with no more worries about the
installation of some different library on the user's PC.
If multiple users need access to the database, the switch to the full
client/server architecture can be made at any time; it is a matter of
adding some components. Anders Pedersen wrote a good article in Blaise
67 about the Firebird Embedded Server. It is worthwhile to read
Anders' article about embedded databases in general.
The user manual has a clear description of how to develop an embedded
database application in Delphi. The accompanying examples may be found
at
C:\Program Files\NexusDB3\Examples\Delphi\Manual\
SimpleEmbedded\Project1.dpr.

The datamodule of this project consists of 5 components:

Figure 6: Datamodule sample embedded


The TnxDatabase component has an AliasPath property which must specify
the directory of the database. In this case it is:
C:\data\Northwind

Figure 8: Object Inspector of the TnxTable


On the project's MainForm we notice the usual components such as the
DBGrid and DataSource. The DataSet property of the DataSource refers
to the nxTable1 component on the datamodule. Next, if we set the
ActiveDesignTime property of nxTable1 to True, the customers table
data appears in the grid, just as we are used to.

Figure 9: MainForm sample embedded

On the end-user's computer this application, as stated before, needs
no separate database server. The database engine is linked directly
into the exe file of the application. Not a single DLL is required.
The end-user only needs the executable of the application itself
besides his database files, in this case C:\data\Northwind.

Figure 7: Object Inspector of the TnxDatabase

The TnxTable component in the datamodule refers to the Customers
table. The TableName property should reflect that.
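The same wiring can be done in code at runtime. A minimal sketch using
only the properties named in this article (the component names match
the figures; everything else is an assumption):

// assumes the datamodule from the manual's embedded example
nxDatabase1.AliasPath := 'C:\data\Northwind'; // directory that holds the .nx1 files
nxTable1.TableName := 'Customers';
nxTable1.Active := True; // at runtime; ActiveDesignTime plays this role in the IDE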


Lazarus and Free Pascal
The Nexus team is working hard on a NexusDB version for Lazarus and
Free Pascal. Below is a preliminary screenshot of the design-time
support in Lazarus. Later this year NexusDB v4 will be released, and
the intention is to officially support Lazarus and Free Pascal shortly
after that time.

Figure 10: Lazarus designtime support


Conclusion
NexusDB is a complete database with all possible options. It is fast,
installs without pain, and the user manual contains many examples. The
data-access components included simplify the use of the database. Also
included is a program to manage the databases: this Enterprise Manager
is particularly user friendly.
In this article I have mainly focused on the Free Embedded Version of
NexusDB. Using the clearly written user manual, it is fairly simple to
design an embedded database application in Delphi. Besides that,
NexusDB offers many advanced options such as in-memory tables, dynamic
SQL and even a remoting framework, but at this time I have not yet
mastered all the options.

About the author:


Erwin Mouthaan
was born in Utrecht, the Netherlands, in 1966. He
studied applied mathematics at the University of
Twente where he learned programming and the
Pascal programming language. He worked for
various research institutes, is currently self
employed and very happy to use Delphi a lot.

However, the website discusses them all, together with many examples.
I recommend everybody looking for a database to have a good look at
NexusDB.
Take a look at the advertisement from Nexus on page 2: there is an
offer of a 20% reduction!

Nexus Database Systems - Support Policy changes


Our current support scheme (which was introduced shortly after the
release of NexusDB V2.07 back in April 2008) is based on yearly
product-based support, which gives access to newly released minor
upgrades of the product and provides general support according to the
level (bronze, silver, gold or platinum). This scheme specifically
excludes updates to major new releases, for which an extra upgrade
charge is applicable.
With a major and exciting new RAD Studio version just released and
NexusDB V4 being worked on, we feel that this licensing scheme is
bringing a lot of customers into a difficult situation: should they
renew their support now and have to pay for a V4 upgrade again? Or
should they wait for V4 and stick with what they have already?
Since having happy customers is our main goal, we decided to do
something about it and change the licensing to include updates to
major versions. This means that with active product support customers
will have access to any new release of the product, no matter if it's
a minor or major upgrade.
What does that mean for your product support?
- Your support will still be product based.
- You will get access to all product updates released (minor AND
  major) within the time your product support is active.
- You are eligible for product support according to your level
  (bronze, silver, gold or platinum) within the time your product
  support is active.
- After your support expires you will get a grace period of 1 month to
  renew your product support at the renewal price. The 1 year period
  starts with the date of expiry.
- If you let the grace period lapse, you can renew your product
  support at the normal upgrade price. The 1 year period starts with
  the date of purchase.
- We will send you 3 reminder messages: 1 month before expiry, at
  expiry, and 2 weeks after expiry.
What happens if your product support is expired?
- You will still have access to all binaries and installers released
  before your support expired.
- You can still create and distribute your own products with the
  versions that you have access to (we do not revoke usage and
  distribution rights).
- You will NOT get access to product updates released after your
  support expired.
- You can renew your product support for 1 year after the 1 month
  grace period for the normal upgrade price. The 1 year period starts
  with the date of purchase.
- If your product support is expired right now, we give you an extra
  extended grace period until October 15th 2010 to renew your support
  at the renewal price. Thereafter the 1 month grace period applies.

How much does it cost?
There is NO change to the current prices for renewals due to this
change of policy. This means that compared to the current scheme you
will get significantly more for your money.
Some products currently have no specific upgrade pricing, which means
that the full license price is applicable for major upgrades (e.g.
NexusDB Embedded SRC). To keep the pricing consistent we will provide
discounted upgrade prices for these products and will add them to the
price list once they come into effect after October 15th 2010. Please
refer to http://www.nexusdb.com/support/index.php?q=pricing for an
up-to-date full list of all prices.

Are there other benefits than upgrades to new major versions?
Yes, there is at least one more major benefit: you will get new
functionality earlier. The scheme of paid major upgrades dictated for
us (Nexus Database Systems) which new functionality was released in
minor upgrades and which we held back for major upgrades. This led to
situations where functionality was held back until the next major
upgrade even though it was essentially ready for release. Since new
features are no longer a driving force for a major upgrade, we can now
roll out new features as they become ready. This has the immediate
benefit for you, our customer, that you don't have to wait for feature
X until features Y and Z are finished.
Another effect is that we can concentrate on working on the current
version only, instead of having to work on a new version and provide
support for one or two older major releases. This has the immediately
obvious benefit that all effort and energy is channeled into making
the product better, faster.
Another, not so immediately visible, benefit is the lower
administrative effort necessary, from branches in version control,
over mailings, to pre- and after-sales support. It all will get much
easier to manage.
In short, we might lose some money generated by paid-for major
upgrades, but we are sure that this change will allow us to make our
products better faster and to serve our customers in the best possible
way, which in turn will sell us more licenses.

Product                  New License   Renewal within or     Upgrade / Renewal
                                       before grace period   after grace period
NexusDB Developer SRC    AUD 750       AUD 500               AUD 650 (new)
NexusDB Developer DCU    AUD 500       AUD 300               AUD 400 (new)
NexusDB Embedded SRC     AUD 350       AUD 200               AUD 300 (new)
NexusDB ADO Provider     AUD 400       AUD 300               AUD 350 (new)
NexusDB ODBC Driver      AUD 400       AUD 300               AUD 350 (new)
NexusDB PHP Connector    AUD 400       AUD 300               AUD 350 (new)
Nexus Portal Pro         AUD 1250      AUD 790               AUD 950
Nexus Portal Std         AUD 700       AUD 520               AUD 625

The TMS DataModeler
by Bruno Fierens

Level: starter to expert | Delphi 2010

Introduction
Data modeling is a mandatory requirement throughout the lifetime of a
database system: from the initial design of the software, when the
first structure is created and modeled, to later system updates, when
the database structure is modified. There are also situations where
manipulation of the database structure may be a complicated task, such
as when there is a need to take over a production system and work on
an undocumented database, or to convert a database from one DBMS to
another. There are several tools related to data modeling on the
market: some DBMS-specific, others generic; some useful for specific
and isolated tasks, others offering a multitude of features (and
usually not very cheap). TMS Data Modeler is a tool that provides
nothing but the essential features for creating and maintaining a
database: it integrates database design, modeling, creation and
maintenance into a single environment, with a simple and intuitive
user interface to manipulate databases efficiently. This article
briefly describes the main features of TMS Data Modeler, demonstrating
how it may be used to create a project and maintain an existing
database.
Data Modeler main features
TMS Data Modeler is a generic tool that allows data modeling
independently of the DBMS used, in an easy-to-use interface. The
application allows you to start modeling a database from scratch, as
well as import the structure of an existing database (reverse
engineering). It allows you to generate scripts to create the full
database, or to upgrade an existing database with an update script,
through its version control system.
In addition, it provides features for conversion from one DBMS to
another, consistency checking and visualization of entity-relationship
diagrams, among others. Data Modeler currently supports several
database management systems: Absolute Database, Firebird 2, MS SQL
Server 2000/2005/2008, MySQL 5.1, NexusDB V3 and Oracle 10g.

Creating a project in TMS Data Modeler
There are two ways to start a project in Data Modeler: creating a new
project from scratch or importing the data dictionary from an existing
database.

Figure 1: Starting

Choosing the "New Project" option, just select the target database and
an empty project will be created. By default Data Modeler provides a
diagram named "Main Diagram". Tables and relationships between them
can be created visually through the diagram. All objects in the
database (apart from tables we can have procedures, views, etc.) can
be accessed, created and edited through the Project Explorer, located
on the left of the screen.
With the "Import from Database" option you need to configure the
connection to the database whose structure will be imported. After
importing the structure, Data Modeler will hold all the database
objects: tables, relationships, triggers, procedures, views, etc. All
objects are listed in the Project Explorer on the left, in their
respective categories. For an overview of the imported structure, it
is possible to open the "Main Diagram" and select "Add all tables"
from the context menu.

Figure 2: Adding all tables




Versioning database
To use the version control features of TMS Data Modeler (comparison
and identification of changes, script generation for upgrade) it is
necessary to archive versions so they are kept in history. By default any
new project starts at version 1, with the status "under construction". In
this example, after importing the structure of an existing database, we
will archive the first version. Once this version is archived, its status is
changed to "closed", and a new version (2) is created with the status
"under construction". All changes from now on will be part of the
second version.

Creating and editing database objects


The Project Explorer on the left provides access to viewing and editing
interfaces to all existing objects, as well as creating new ones through
the context menu. Data Modeler's interface allows multiple objects to
be opened simultaneously, organized into tabs, which allows easy
navigation among them. In this example we will create a new table
named "attachments", and later we will relate it to the existing table
"projects". We will also create a new field "filesize" in the table "blobs".
Data Modeler allows you to create different diagrams by dragging the
desired tables from Project Explorer to the diagram area; related tables
are automatically linked. As a result, it is possible to see different sets
of tables, separated by module or system context.
To relate the two tables visually, through the diagram, insert a new
diagram to make viewing easier, and drag the "projects" and
"attachments" tables from Project Explorer to diagram area. Selecting
the "Relationship" button on the toolbar, just click on the parent table
and drag the mouse to the child table. This will display a window for
inserting a new relationship, in which we can set its name, keys and
other options. After confirmation, the relationship between the tables is
created and displayed immediately in the diagram. Note that a field
"ProjectID" was automatically created in the "attachments" table,
related to the primary key in the "projects" table.

Figure 3: Versioning

Figure 4: Editing database objects



Figure 5: Editing relationship


Version upgrade
After the changes are made in the project, version 2 differs from
version 1, which was archived right after importing the structure from
the database. Using Data Modeler's version comparison tool, we can
view the structure of each version side by side, with their
differences highlighted (created, removed or changed objects), and the
creation script of selected objects. On the same screen we can select
the changes from which to generate a script to update the database.

Figure 6: Clicking on "Generate", we have our script ready to update the database from version 1 to version 2, containing all alterations.



Conclusion
In this article only the main features of TMS Data Modeler have been
described, with examples of how to create a project and maintain a
database using this tool. TMS Data Modeler offers many other features
for modeling and maintaining databases, and comes with a complete
reference manual and online help. Visit our website for details on
further application features and benefits.
For more information and a free trial download of
TMS Data Modeler, visit:
http://www.tmssoftware.com/site/tmsdm.asp

The TMS Data Modeler Library is available for free at:
http://www.tmssoftware.com/site/tmsdmlib.asp

Figure 7: Auto increment


Reading the structure of a Data Modeler project from your application
TMS Software offers a free library, the Data Modeler Library (DMLib),
which allows access to the structure of a database stored in a TMS
Data Modeler project from any application in Delphi or C++Builder. It
is a collection of read-only classes containing clear methods and
properties for obtaining information about all objects in the data
dictionary. Here is a small example of how to use DMLib to get the
fields and their data types from a specific table in the data
dictionary:
program DMread;

{$APPTYPE CONSOLE} // console application, so Writeln output is visible

uses
  SysUtils, uAppMetaData, uGDAO;

var
  amd: TAppMetaData;
  table: TGDAOTable;
  field: TGDAOField;
  i: integer;
begin
  amd := TAppMetaData.LoadFromFile('C:\tmssoftware\dmlib\jedivcs.dgp');
  try
    table := amd.DataDictionary.TableByName('attachments');
    if table <> nil then
    begin
      for i := 0 to table.Fields.Count - 1 do
      begin
        field := table.Fields[i];
        Writeln(Format('Field %d: %s [%s]',
          [i + 1, field.FieldName, field.DataType.Name]));
      end;
    end
    else
      Writeln('Table not found.');
  finally
    amd.Free;
  end;
end.
Output:
Field 1: idattachment [Int (identity)]
Field 2: description [VarChar]
Field 3: filename [VarChar]
Field 4: createdon [Datetime]
Field 5: content [Image]
Field 6: PROJECTID [Int]

About the author
Bruno Fierens started doing several small projects in the mid-eighties
in GWBasic and soon after discovered Turbo Pascal v3.0 and got hooked
on its fast compilation, clean language and procedural coding
techniques. Bruno followed the Turbo Pascal releases and learned
object-oriented programming when it was added to the Pascal language
by Borland. With Turbo Pascal for Windows and Resource Workshop, he
took his first steps in Windows programming, building several products
for the local market.
TMS software became a Borland Technology Partner in 1998 and the team
grew to 4 persons in the main office in Belgium, with developers in
Brazil, Uruguay, India and Pakistan doing specific component
development. TMS software now oversees a huge portfolio of Delphi
components and looks forward to strengthening this product offering in
the future. With Delphi 2010, Embarcadero now offers a very rich and
powerful environment for creating fast and solid Windows applications
using the latest technologies in Windows 7, such as touch.
Bruno said he will watch the announced cross-platform development
tools from Embarcadero closely, and TMS software is hopeful this will
bring exciting new opportunities for Embarcadero, Delphi and our
components. We live indeed again in very interesting times for
passionate Delphi developers.

Special offer for Blaise Pascal Magazine subscribers:
20% discount on TMS Data Modeler: € 76,00 (standard price € 95,00).
Offer valid until the end of November 2010.
Coupon code: DM-BLAISE, to be used on the online order form:
https://secure.element5.com/register.html?prognr=300398512&languageid=1
(just click on the url)


Introduction to Delphi Database Development: Part 1
by Cary Jensen


Level: starter to expert | Delphi

This is the first article in an extended series taking a look at
database application development in Delphi. In this installment,
Delphi database expert Cary Jensen takes a look at databases in
general, and provides a general introduction to Delphi's support for
database applications.
Here's a trivia question: what was the name of Delphi during its
original beta test (prior to its February 1995 release)? The answer is
Delphi (but you might have already known that). But why did the
development team pick such an odd name for their ground-breaking,
component-based advancement of their Pascal compiler and IDE
(integrated development environment)? The answer is related to
databases.
Delphi is the name of both the city and the temple in Greece where
people would travel to speak to the Oracle (a reference to the ORACLE
Database Server, if that wasn't obvious). And although Delphi can work
with just about any database you can think of, the point is that it
was designed from the start to be a great environment for developing
database applications.
While database development has not always been considered the most
glamorous area of software development (that's other people talking, not
me, I think database development is quite glamorous, frankly), a strong
argument can be made that it is the most important, with respect to
how it affects our daily lives. It is nearly impossible to go a day without
having some interaction that involves a database.
Whether you are making a purchase at the market, checking into a
hotel, catching an airline flight, or withdrawing money from an
automatic teller machine, data needs to be collected, and in most cases,
used to ensure that your experience is positive or to make it better in
the future.
And that's where our job comes in. As database developers, we are
responsible for understanding where the data comes from, where it
goes, and how it needs to be used. And that understanding helps us to
create software that helps people be more productive and makes their
lives better.
From the Beginning
To begin with, I'm a real believer in foundations. I feel that the
more you know about the fundamentals of the tools that you use, the
better you'll be able to use them. Unfortunately, when we're talking
about a mature product like Delphi, there are fewer resources
available now than there were in the early years of Delphi.
And this poses a problem, in part because there is new growth in the
Delphi market. Not only is there a need for new Delphi developers to
help support existing projects, but Delphi has emerged as one of the
leading development tools for native Windows development, and that
particular area of development is not going away. And that's where
this series comes in.
Though I wear many hats, in my heart I am a database developer, and
have been deploying multi-user database applications since the 1980s,
and doing so in Delphi since it shipped. Therefore, I am going to take
this opportunity to start from the beginning. Before the beginning,
actually, if you think about it. I am going to start with a brief
overview of databases. Towards the end of this article I will discuss
the basics of Delphi's database-related classes. In future articles in
this series I will go into detail about Delphi's many database-related
tools, including ClientDataSets, multi-tier development with DataSnap,
Cloud computing, and more. But for now, we'll begin at the beginning.


What is a Database?
A database is a mechanism for storing and retrieving data. That's all,
really. In the simplest of worlds, a text document can be a database.
XML is a text format, and many do use it as a database.
A database application, on the other hand, is much more. Most database
applications assist in the collection of data and the manipulation of
that data, and turn that data into information (reports, charts,
actions, and so forth). These applications, however, do require a
database, but I'm starting to get ahead of myself here.
In most cases, the data of a database is structured, which is to say
that it is organized. In this regard, text documents often fall short.
As a result, database developers often rely on something else. For the
purpose of brevity, I am going to oversimplify this and say that most
Delphi developers rely on three types of databases: custom file
structures, local file system databases, and remote database servers.
Yes, there are others, but in some respects they are variations (or
even combinations) of one or more of these.
Custom File Structures
A database based on a custom file structure can make use of either
simple text or binary files. In most cases, these files are highly
organized. For example, each individual piece of data may be separated
from other pieces of data by a particular character, or separator. An
example of such a file is a comma separated values (CSV) file, which
is a common text format.
It's not necessary for data to be separated by characters.
Specifically, if you know that each piece of data takes up four bytes
in the file, you can retrieve the individual data values by parsing
the file, snipping off four bytes at a time. You would also store this
data in the file in the same way, writing each value as a four-byte
chunk.
With Delphi, some developers create files that contain a series of
record structures, where by record structures I literally mean Delphi
record types. These files are called typed files, since they are files
of a Delphi type: records, in this instance. (For a nice introduction
to using typed files, see Zarko Gajic's article at
http://delphi.about.com/od/fileio/a/fileof_delphi.htm.)
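As a minimal sketch of the typed-file technique (the record layout and
file name are purely illustrative):

type
  TCustomer = record
    ID: Integer;
    Name: string[40]; // a short string keeps the record flat and fixed-size
  end;

var
  F: file of TCustomer; // a typed file: a file of Delphi records
  C: TCustomer;
begin
  AssignFile(F, 'customers.dat');
  Rewrite(F);               // create (or overwrite) the file
  C.ID := 1;
  C.Name := 'Alice';
  Write(F, C);              // write one fixed-size record
  Reset(F);                 // reopen for reading
  Read(F, C);               // read the record back
  CloseFile(F);
end;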
And these are just a few of the options for working with custom file
structures. One of the advantages of custom file structures is that
you can usually read and write them very fast. In addition, your
Delphi applications that use these files normally rely on nothing more
than Delphi's internal file I/O (input/output) capabilities. By
contrast, most of the other database approaches rely on external
files, such as client DLLs. Figure 1 shows a simple diagram that
depicts the interaction between a Delphi application and custom data
structures.

Figure 1: A simple custom file database

There is a downside, however. To begin with, these types of database
are almost always single user, meaning that only one application or
person can work with the data files at a time. Sure, you could devise
a mechanism by which two or more applications (or people) can share
this data at the same time, but that would be extremely complicated,
and is almost never worth the effort (since good multiuser solutions
already exist). Second, these types of databases are typically
proprietary, which means that your applications, and only your
applications, can read the data. While you may find this desirable, it
is nonetheless a limitation.

Introduction to Delphi Database Development: Part 1 (continuation1)

Local File System Databases

Figure 2: A file server database

While file server databases offer a variety of benefits over custom file
structures, they have their limits, especially when compared with
feature-rich remote database servers.
I'll focus on just two limits, and these are related to network bandwidth
and database stability.
When two or more clients need to share data from a file server
database, that data must be placed in a network location accessible to all
clients.
When those clients need to read the data, the data must be transferred
across the network. When many clients are reading and writing data,
this can mean that a large amount of data is moving around on the
network.
Actually, it's worse than it sounds.
For example, if your client application is searching for a particular piece
of data, such as the information about a specific person, some data
about all of the people needs to be transferred across the network so
that the client application can read through each person's data,
searching for the one of interest.
In other words, if data about a million people is stored in your file
server database, and your client is searching for one particular person, it
is likely that some data about all one million people will be transferred
across the network, and that is only for one client application (I say some
data, since most databases make use of indexes. I'll discuss indexes in more detail
in the next article in this series).
Consider what happens when twelve different client applications, on
twelve different workstations on the network are each searching for one
person from the database.
I think you get the picture.
As far as stability goes, file server databases lack centralized control of
the data, and, as a result, are prone to data corruption.
In a file server database, each and every client application can read and
write the data stored in the shared files on the network.
All it takes is for one of these client applications to have a problem
during a write operation (such as being unplugged from the network) and the
database can become corrupt.
This potential for corruption increases in direct proportion to the
number of client applications writing to the database.
Even if there is only one client writing to the database, an error during
a write operation can render the underlying database unusable. (And this
is why backing up your data is so very important. You cannot predict when a
problem like this will be encountered.)
Remote Database Servers
A remote database server is an application that manages your database.
When you write a client application that involves a remote database
server, your client application does not read or write data directly
from files. Instead, it makes all of its requests for data through the
remote database server. This general architecture is referred to as
client/server architecture.
This distribution of responsibilities produces three primary benefits.
First of all, it distributes the processing of data across several
machines (for example, the workstation on which the client is running
and the server on which the remote database server is running). As a
result, overall processing power is increased, since much of the data
manipulation is handled by the server, which has been optimized for
these types of things, while your client application takes primary
responsibility for displaying the user interface. Figure 3 contains a
diagram that represents the typical client/server architecture
involving Delphi applications.

Figure 3: A client/server database


The second benefit is a dramatic reduction in network traffic. Consider
the previous example where a client application needs to locate a single
person in a file of a million people. In a client/server scenario, the
client requests the single person from the database server, which
searches the data files, returning only the one located person (if found)
across the network to the client. That's almost a million to one
reduction in network traffic.
Finally, the client/server architecture is profoundly more stable than
file server databases. This is because the remote database server can
perform any requested data writes in a highly controlled fashion,
namely by using transactions. For example, after the remote database
server receives a properly formed request for a write operation from
the client, the server begins a transaction. (And if the client's
network connection fails during the write request, the request will
not be well-formed, and therefore will be ignored by the server.)
The transaction on the server normally involves the server making a
note of what it wants to write, attempting to write the data, and then
erasing the note (I'm really simplifying this here, but you get the
picture). If something goes wrong during the server's write operation,
the server uses its notes to either complete the write request (if
possible) or to restore the data to its original state. As a result,
it is very hard to corrupt data that is managed by a remote database
server.
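On the client side, the same idea surfaces through the transaction
methods of the connection component. A minimal sketch using the BDE
classes discussed later in this article (the component names and SQL
are illustrative):

// Database1: TDatabase, Query1: TQuery
Database1.StartTransaction;
try
  Query1.SQL.Text := 'UPDATE People SET Phone = :Phone WHERE ID = :ID';
  Query1.ParamByName('Phone').AsString := '555-1234';
  Query1.ParamByName('ID').AsInteger := 42;
  Query1.ExecSQL;
  Database1.Commit;   // make the change permanent
except
  Database1.Rollback; // undo everything done since StartTransaction
  raise;
end;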
Delphi and Databases
If you want to use custom file structures in your database
applications, this is something that you do manually. Specifically,
you write the procedures for creating and populating your data
structures (which might be record types), create the procedures for
reading and writing your data, and handle all things in between,
including displaying your data structures in your user interface,
detecting a user's changes to that data, and populating your data
structures from the user interface for the purpose of writing back to
the custom files. Sure, you can encapsulate all of this in custom
components, but the bottom line is that you are responsible for every
aspect of the process.
If you are using one of the supported file server databases or remote
database servers, it's a completely different story.


Delphi includes a large number of components that can read and write
data from a wide variety of industry-standard databases, as well as a
rich variety of data-aware controls that can both display this data
and accept user input. Finally, Delphi includes a special component
that acts as the glue, or go-between, permitting these classes of
components to interact smoothly.
These components can be divided into three categories: TDatasets,
data-aware controls, and the TDataSource class. Each of these is
introduced in the following sections.
Delphi's TDataset Components
Delphi's TDataset components are designed to communicate with an
underlying database through a library of routines that understand the
particular database. Some of these libraries are based on industry
standards, such as ODBC (Open Database Connectivity), a standard for
data access that has been part of the Windows operating system since
the 1990s. Others are designed to work with a particular database,
such as InterBase.
These components are referred to as TDatasets, since most of the
components in these component sets descend from the TDataset abstract
class of Delphi's VCL (visual component library). Consider the
original TDataset classes, which are associated with the BDE. The
primary classes from this group that you use in your applications to
work with tables, execute queries, and execute stored procedures are
named TTable, TQuery, and TStoredProc, respectively. Each of these
classes descends from TDataset (not directly, as there are several
intervening classes, such as TDBDataset, between these classes and
TDataset).

With respect to TTable, TQuery, and TStoredProc, these classes do not
work alone. There are additional classes that, though they do not
actually descend from TDataset, are nonetheless considered to be
TDataset classes. The TDatabase class, for example, defines the
connection to a database, which with a file server database simply
refers to the directory in which Paradox or dBase files are stored,
and in the case of a remote database server refers to a server and its
associated database. What is important is that these classes work
together to provide the services necessary to work with a database.
For example, a TDatabase can be used to hold the username and password
needed to access an encrypted Paradox database. Once the TDatabase
establishes a connection, a TTable can be used to open a particular
Paradox table, or a TQuery can be used to execute a query against one
or more of the tables in the Paradox database. In order for these
TTables and TQueries to successfully perform their roles, they need to
be associated with a TDatabase. This is done through the Database
property of the TTable and TQuery classes.
Interestingly enough, this is not always obvious. Specifically, a
TTable can refer to a BDE alias, a definition stored in the BDE
configuration, to reference the location of the data files. The "not
obvious" part is that if you open this TTable, and the TTable sees
that it is not associated with a TDatabase, it creates one on-the-fly,
in the background, and this TDatabase is used to establish a database
connection.
Every set of TDataset descendant classes mimics this basic structure,
though there are subtle differences between the different groups of
TDatasets. Specifically, there are always some classes designed to
execute queries, execute stored procedures, or open a SELECT * query
on a table (basically what the TTable represents). In addition, there
is always one or more components that represent the connection to the
database. In the BDE world, this is the TDatabase (and it works with
another component, called TSession), but most of the TDataset
descendant classes include something with a name similar to
TConnection, such as TSQLConnection, TIBConnection, and
TAdsConnection. There are other TDataset classes as well, and they
play their appropriate roles, but those are not relevant to the
current discussion.
While Delphi 1 included only one set of TDataset descendants, Delphi
2010 includes no less than 6 sets (I'll go into these in some detail
in a future article in this series). In addition, a number of database
vendors provide their own implementations of TDataset descendants. And
through these, you can access data in an amazing range of databases,
including Paradox, dBase, MS Access, MS SQL Server, ORACLE, DB2,
MySQL, SQL Anywhere, Sybase ASE, InterBase, Firebird, BlackfishSQL,
Advantage Database Server, and many more.
Delphi's Data-Aware Controls
While the various TDataset classes give you access to a remarkable
collection of databases, it's Delphi's data-aware controls that make
it simple to build user interfaces that permit your users to interact
with the underlying data. Some of these classes, such as the
DBNavigator and DBGrid, point to an entire TDataset, while others,
such as TDBEdit and TDBMemo, refer to a single field of a TDataset
(I'll go into more detail about fields and datasets in the next
article in this series, but for now suffice it to say that some
data-aware controls refer to collections of data, while others refer
to single data points). In many cases, these controls not only display
data, but permit your users to change the data. For example, a DBGrid
can be used to display most types of textual data from a database, as
well as edit this data (so long as the underlying TDataset permits
editing). An example of a DBGrid displaying data from a database is
shown in Figure 4.

Figure 4: Data displayed in a DBGrid


Other controls, such as the DBImage and DBLabel, display data but do
not permit you to modify it.
Finally, the DBNavigator control doesn't display data at all, but instead
provides you with a convenient means of moving forward and
backwards in your database.
(The DbNavigator supports additional operations, such as placing a TDataset in
the edit mode, posting changes, canceling changes, to name a few, but only if the
underlying TDataset allows these operations.)
Before continuing to the final component that makes up Delphi's basic
support for the development of database applications, the TDataSource,
a comment about data-aware controls is appropriate. Some Delphi
developers do not approve of the use of Delphi's data-aware controls,
preferring instead to control the many aspects of the user interface,
as far as database data goes, manually. I don't want to go into the
detailed arguments on both sides of this position here. Instead, I
want to simply say that Delphi's data-aware controls are easy to use,
and provide many of the capabilities that you want in most of your
basic user interfaces. What is important, though, is that if you want
something more than the data-aware controls offer, you are free to use
a wide variety of alternative techniques.

The TDataSource
The final component in this triad is not a set of components, but
rather a single component: TDataSource. The TDataSource plays two
primary roles. The first, and most obvious, is as an intermediary
between TDatasets and data aware controls. In fact, data aware controls
get their data awareness from a TDataSource.
Here is how it works. A TDataSource wires up to a specific TDataset
using its Dataset property, which is of the type TDataset, meaning that
it can point to any of the wide range of available TDataset descendants.
Data aware controls, in turn, point to a TDataSource through their
DataSource property. For those data aware controls that must
specifically refer to a given field of a TDataset, they do so through their
DataField property.
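As a sketch of this wiring in code (assuming a form that already holds Table1, DataSource1 and DBEdit1, and the BDE demo alias DBDEMOS):

procedure TForm1.FormCreate(Sender: TObject);
begin
  Table1.DatabaseName := 'DBDEMOS';       // a BDE alias, assumed to exist
  Table1.TableName    := 'customer.db';
  DataSource1.DataSet := Table1;          // the TDataSource points at the TDataset
  DBEdit1.DataSource  := DataSource1;     // the control points at the TDataSource
  DBEdit1.DataField   := 'Company';       // and at one field of the dataset
  Table1.Open;
end;

Exactly the same connections can, of course, be made at design time in the Object Inspector.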
When the TDataset that the DataSource is pointing to changes state,
such as becoming active, or navigating to a new record (a new set of field
data), the TDataSource informs the associated data aware controls, and
they typically react by changing their visual appearance.
For example, if the TDataset becomes active, and is pointing to data (is
not pointing to an empty dataset), the data aware controls will display the
data contained in the TDataset. Likewise, if the TDataset is updated,
displaying a new record (a different set of fields) the data aware controls
react by repainting themselves, now displaying this new data.
Importantly, this interaction between the TDataSource and data aware
controls goes both ways.
For example, if you try to enter data into a DBGrid, and the underlying
TDataset is not in edit mode, the TDataSource will ask the TDataset to
place itself in edit mode.

If the TDataset complies, the DBGrid will accept your keystrokes.


On the other hand, if the underlying TDataset is readonly, it will reject
the request to enter the edit mode, and the DBGrid will not be permitted
to accept the data entry.
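A simple way to watch these state transitions is the TDataSource's OnStateChange event; the status bar in this sketch is illustrative:

procedure TForm1.DataSource1StateChange(Sender: TObject);
begin
  case DataSource1.State of                 // mirrors the underlying TDataset
    dsInactive : StatusBar1.SimpleText := 'Closed';
    dsBrowse   : StatusBar1.SimpleText := 'Browsing';
    dsEdit     : StatusBar1.SimpleText := 'Editing the current record';
    dsInsert   : StatusBar1.SimpleText := 'Inserting a new record';
  end;
end;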
Figure 5 depicts the relationship between the BDE TDatasets, the
TDataSource, and data aware controls.
Figure 5: The relationship between BDE TDatasets, TDataSource, and data aware
controls (Session (default) and Database (default) feed the Datasets - Table, Query,
StoredProcedure - which connect through the DataSource to the data aware controls)
The second role played by TDataSources is to act as an intermediary
between two or more TDatasets. In many respects, this is similar to the
role it plays when interacting with data aware controls, but different in
that no user interface elements are involved. I'm not going to say any
more about this second role at this time, but I wanted to at least
acknowledge that it exists.
Summary
In this first of an extended series of articles on Delphi database
development, I have started at the beginning. This article began with a
general look at databases, with a brief discussion of the most common
types of databases supported by Delphi. It continued with a general
overview of Delphi's data-related components.
In the next article in this series, I will discuss the various types of data
structures that databases contain, including tables, indexes, views, and
stored procedures. I will then show you how you use Delphi's data-related components to work with these types of data.

The Lazarus Complete Guide will be available in mid December 2010.
You can order it at our web shop directly.
If you pre-order the LAZARUS COMPLETE GUIDE
you will have a Lazarus USB stick for only € 15.00.

LAZARUS: COMPLETE GUIDE
Graphical User Interface programs for Windows 32 and 64, Windows CE,
Mac OS, Unix and Linux.
By M. van Canneyt, M. Gärtner, S. Heinig,
F. Monteiro de Carvalho, I. Ouedraogo.


About the Author


Cary Jensen
is Chief Technology Officer of Jensen Data Systems, a
consulting, training, development, and documentation
and help system company. Since 1988 he has built and
deployed database applications in a wide range of
industries. In addition, Cary is the best selling author of
more than 20 books on software development, and
winner of the 2002 and 2003 Delphi Informant Reader's
Choice Award for Best Training. A frequent speaker at
conferences, workshops, and seminars throughout much
of the world, he is widely regarded for his self-effacing
humor and practical approaches to complex issues. Cary
has a Ph.D. from Rice University in Human Factors
Psychology, specializing in human-computer interaction.


First Look at Advantage Database Server 10 by Cary Jensen


With the release of Advantage 10, Sybase continues the tradition
of consistent improvements to this high-performance,
low-maintenance database server. In addition to a rich array of
additional features and feature enhancements, this release also
includes a large number of internal optimizations that will
improve the performance of most Advantage client applications
simply by upgrading the server to Advantage 10.
These improvements add significant value to the already
impressive collection of features that make Advantage Database
Server a perfect match for small to medium size database
applications.
Overview
The Advantage Database Server (ADS) is a high-performance, low
maintenance database server by Sybase. ADS supports an impressive set
of features often found in high-end databases. These features, in
combination with its ease of installation and nearly maintenance free
operation, make it a favorite database for vertical market applications.
With each new release of Advantage, the Sybase team has consistently
created added value by introducing new and enhanced features, as well
as improving the already impressive performance of the server. This
tradition continues with this release of Advantage 10.
This paper is designed to provide you with a brief overview of the
Advantage Database Server, followed by a look at the new and
enhanced features found in Advantage 10. If you are already familiar
with Advantage, you may want to go directly to the section New
Features and Enhancements in Advantage 10.
Overview of Advantage Database Server
The Advantage Database Server provides a unique set of features that
make it an ideal database for small to medium size applications. The
major features that make Advantage special are described in the
following sections.
High Performance
To put it simply, Advantage is fast. Much of its speed comes from its
architecture, which is based on ISAM (indexed sequential access
method) technology. ISAM makes extensive use of indexes to provide
high-speed table searches, filters, and table joins. Unlike other ISAM
technologies, such as dBase and Clipper, Advantage Database Server is
a transaction processing, remote database server. As a result, it provides
application developers with a reliable, distributed solution for managing
data using client/server technology.
Low Maintenance
The Advantage Database Server installs in minutes, and rarely needs
attention after that. Indeed, unlike high-end database servers, most
Advantage installations do not have a database administrator. This
makes Advantage an ideal server for vertical market applications where
the server may be installed in many facilities that do not have their own
IT department.
Navigational and Set Based Orientation
While Advantage is based on the navigational ISAM architecture, it also
supports industry-standard SQL (structured query language), with most
of the SQL operations optimized for lightning-fast execution. As a
result, Advantage is one of the rare remote database servers to support
both the navigational model of data access as well as set-based SQL,
giving you a wealth of options for presenting and managing your data.
Advanced Feature Set
The Advantage Database Server sports an impressive collection of
features often only found in high-end database servers. These include
security provided by users and groups, table encryption, and support
for encrypted client/server communication. Additional high-end
features supported by Advantage include stored procedures, SQL
PSMs (persistent stored modules), views, user-defined functions, table- and field-level constraints, referential integrity, online
backup, triggers, notifications, and replication.

Scalable
Advantage comes in two basic flavors: the Advantage Database Server
(ADS) and the Advantage Local Server (ALS). ALS is a free, file-server
based technology whose API (application programming interface) is
identical to ADS. ALS permits developers to deploy their Advantage
applications royalty free to clients who do not need the stability and
power of a separate database server. Importantly, as the needs of those
applications deployed with ALS grow over time, those applications can
be almost effortlessly scaled to client/server technology, in many cases
simply by deploying ADS. So long as the client applications are
designed correctly, those applications will begin using ADS the next
time they execute.
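"Designed correctly" here mostly means letting the connection component decide which server to use. A sketch with the Advantage Delphi components (the property and constant names below follow those components as best I recall, and the data path is illustrative):

AdsConnection1.ConnectPath    := '\\server\share\data';
AdsConnection1.AdsServerTypes := [stADS_REMOTE, stADS_LOCAL];  // prefer ADS, fall back to ALS
AdsConnection1.IsConnected    := True;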
New Features and Enhancements in
Advantage 10
Rather than reciting a laundry list of updates, I have organized the
enhancements into the following sections:
Major performance improvements, enhanced notifications, additions to
Advantage SQL, nested transactions, Unicode support, additional 64-bit clients, added design-time support for Delphi, and side-by-side
installation. For a detailed listing of all of the updates found in
Advantage 10, see the white paper at the following URL:
http://www.sybase.com/files/White_Papers/Advantage_WhatsNewADS10_WP.pdf

Major Performance Improvements


The Advantage Database Server has always been recognized for its
superior performance, being able to handle very large amounts of data
with blinding speed. That makes it all the more remarkable that one of
the most enticing aspects of upgrading to Advantage 10 involves
performance. Specifically, the performance of database operations in
client applications will improve simply by upgrading the server to
Advantage 10. In some cases, these performance gains will be
significant. Many of the internal systems that contribute to Advantage's
already impressive performance were evaluated by Advantage's R&D
engineers. Where possible, improved algorithms were introduced,
caching was implemented or enhanced, and resources were pooled.
These changes resulted in more efficient indexes, improved transaction
handling, and more intelligent management of resources such as threads,
record locks, and file writes. The effects of these improvements range
from nice to stunning. During Advantage 10's Beta cycle, one of the
Beta testers reported the results of his performance tests on some of
his larger queries involving, in some cases, millions of records. He
found that some Advantage 10 queries executed 40 percent faster than
the same queries in Advantage 9.
In other cases, the Advantage 10 queries were dramatically faster (one
query that ran in 2.7 seconds in Advantage 9 took about 1 millisecond
in Advantage 10). The R&D team has found similar improvements
during testing. But SQL queries are not the only area of Advantage to
benefit from these internal improvements. Operations that rely on
Advantage's support for navigational operations have also improved. In
fact, the Help files for Advantage 10 list no less than 20 specific
improvements or optimizations introduced in Advantage 10. And these
updates affect everything from cascading referential integrity updates to
record insertion, from memo file header updates to table creation, from
low-level index operations to worker thread management. Simply put,
the performance enhancements introduced in Advantage 10 alone
make a solid business case for upgrading from an earlier version of
Advantage.
Enhanced Notifications
Notifications are a feature originally introduced in Advantage 9, and
they provide you with a mechanism by which Advantage can notify
interested client applications that some change has occurred on the
server. For example, a client application can subscribe to a notification
in order to be informed when the contents of a specific table have
changed. The client application can then use this information to update
the end user's view of that data.
A small change to notifications in Advantage 10 has resulted in a very
significant improvement in their utility:
COMPONENTS
DEVELOPERS

Page 35

First Look at Advantage Database Server 10 (continuation 1)


Advantage 10 notifications now support a data packet. This data packet,
in the form of a memo field, permits you to include any data you like
along with the notification.
This data may include the record ID of the record that was affected in
the table of interest, the connection ID of the user who made the
change, the type of change, or any other data you like. This data
permits you to implement advanced features in your notification-subscribing clients.
For example, you can now distinguish between changes made by your
client application's user and those made by other users. This
information can be used to automatically update a user's view of data
when someone else has made changes, ignoring those changes posted
by that user.
Additions to Advantage SQL
There are many updates and additions to Advantage's support for the
structured query language (SQL). Of these, my favorite update is the
new ability to use a stored procedure in the FROM clause of a SELECT
query.
If you have a stored procedure that returns a result set, you can treat
that result set like a table in a SQL SELECT statement, permitting you to
select specific fields or expressions from the result set, link the stored
procedure result to other tables (or other stored procedure result sets), and
define WHERE clause conditions to select just those records in which you
are interested. You can even use the predefined Advantage system
stored procedures in the FROM clause.
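As a sketch of how this might look from Delphi (the stored procedure and its columns are made up for illustration):

AdsQuery1.SQL.Text :=
  'SELECT t.CustomerID, t.Total ' +                      // pick fields from the result set
  'FROM (EXECUTE PROCEDURE GetYearlyTotals(2010)) t ' +  // treat the procedure as a table
  'WHERE t.Total > 1000';                                // and restrict it like one
AdsQuery1.Open;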
Another enhancement is the ability to use Boolean expressions in your
SQL statements. For example, if you have a table named CUSTOMER in
which a Boolean (logical) field named Active appears, the following
query will select all records where the Active field contains True.
SELECT * FROM CUSTOMER WHERE Active;

In previous versions of Advantage, you would have to form your query
like the following:
SELECT * FROM CUSTOMER WHERE Active = True;

Also, TOP queries now support a START AT clause, which permits you to
select a specific number of records beginning from some position in
the result set other than the top. For example, the following query will
return records 11 through 15 from the CUSTOMER table, ordered by last
name.
SELECT TOP 5 START AT 11 * FROM CUSTOMER ORDER BY LastName;
A collection of bitwise SQL operators have also been introduced. These
include AND, OR, and XOR, as well as >> (right-shift) and << (left-shift).

There is also a new SQL scalar function: ISOWEEK, which returns the
ISO 8601 week number for a given date (it is also a new expression
engine function). And some of the SQL scalar functions that were
previously not expression engine functions are now. These include DAY,
DAYOFYEAR, DAYNAME, and MONTHNAME, to name a few. These are in
addition to CHAR2HEX and HEX2CHAR, which are newly added expression
engine functions. Support in the expression engine means indexes can
now be created using these functions, which in turn allows the
Advantage query engine to fully optimize
restrictions that use these scalars.
Finally, there are a number of new system stored procedures and
system variables. The following are just a few of the new system stored
procedures available in Advantage 10:
sp_SetRequestPriority, sp_GetForeignKeyColumns, and
sp_IgnoreTableTransactions.

As far as system variables go, among the new variables are:
::conn.OperationCount (the number of operations performed on this connection),
::stmt.TrigEventType (the event type of the executing trigger),
::stmt.TrigType (the type of trigger executing), and
::conn.TransactionCount (the current nesting depth of nested transactions).

New Table Features
Several interesting new table-specific features have been introduced in
Advantage 10. Several of these are related to transactions and table
caching. Let's consider table caching first. To begin with, so long as
memory resources permit, temporary tables are now kept entirely in
cache. As a result, operations that rely on temporary tables are usually
very fast.
There is also a new table property called Table Caching. Most tables are
created with Table Caching set to None. These tables are not cached,
and any changes to these tables are written to the underlying file
immediately. When Table Caching is set to either Read or Write, the
corresponding table is kept in cache while it is open, making its data
highly available. These settings are normally used for data that is largely
static, and which can be reconstructed if the table becomes corrupt.
Specifically, tables held in cache are not written to disk except when the
table is closed.
As a result, changes to their data will be lost if Advantage unexpectedly
shuts down without being able to persist those tables' contents (for
instance, if there is a sudden failure of your server's power supply). However, this
functionality can be very useful for static data (zip codes, part numbers, and
so forth).
The transaction free tables feature is also a table property, called Trans
Free Table. When set to True, the associated table does not participate
in transactions.
There are two implications of a table not participating in an active
transaction. First, changes made to a Trans Free Table during a
transaction are not rolled back even if the transaction itself is rolled
back. Second, changes to data in a Trans Free Table are not isolated
during a transaction, being immediately visible to all other client
applications, even though the transaction has not yet been committed.
Just like when a table's Table Caching property is set to Read or Write,
Trans Free Table is set to True only for special tables in most
applications. For example, you may use a table to log a user's actions in
an application. In those cases, you may want to log that a user tried to
perform some task, even though the action may fail and the user's
changes may be rolled back.
Similarly, you may have a table used for generating unique key field
values. This table may have a single record and single field that holds an
integer value. A client needing a key would lock this table, read the key,
increment the integer, and then release the lock.
With such a table, the incremented key needs to be visible to all client
applications, even if individual clients increment the key from within a
transaction. If such a table were not a Trans Free Table, other clients
would not be able to access the incremented key until the transaction
was committed, rendering the table useless for its intended purpose.
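That pattern could be sketched as follows (KEYTABLE, its NEXTKEY field, and the use of the Advantage TDataSet descendant's LockTable/UnlockTable calls are illustrative assumptions):

function NextKey(KeyTable: TAdsTable): Integer;
begin
  KeyTable.LockTable;                // serialize access across all clients
  try
    Result := KeyTable.FieldByName('NEXTKEY').AsInteger;
    KeyTable.Edit;
    KeyTable.FieldByName('NEXTKEY').AsInteger := Result + 1;
    KeyTable.Post;                   // a Trans Free Table makes this visible at once
  finally
    KeyTable.UnlockTable;            // always release the lock
  end;
end;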
Unicode support
Although Unicode support is arguably a table feature, its significance
warrants separate consideration.
In short, Advantage 10 introduces three new field types. These types,
nchar, nvarchar, and nmemo, are UTF-16 Unicode field types.
The nchar type is a fixed length Unicode string field and nvarchar is a
variable length Unicode string field. The data for these two field types
are stored entirely in the table file. The nmemo field, by comparison, is
a variable length Unicode field that is stored in the memo file. Together,
these three fields provide you with a number of options for storing
Unicode symbols and characters in Advantage tables.
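A sketch of a table definition using all three (the table and field names are made up for illustration):

AdsQuery1.SQL.Text :=
  'CREATE TABLE Messages ( ' +
  '  Id      Integer, ' +
  '  Subject NChar(40), ' +      // fixed-length Unicode, stored in the table file
  '  Sender  NVarChar(60), ' +   // variable-length Unicode, stored in the table file
  '  Body    NMemo )';           // variable-length Unicode, stored in the memo file
AdsQuery1.ExecSQL;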

Nested Transactions
Speaking of nested transactions, Advantage 10 now supports them. In
previous versions of Advantage, code executing in an active transaction
could not attempt to start a transaction without raising an exception.
This is no longer the case. As a result, if you write a stored procedure
whose operations should be performed in a transaction, you can safely
call BEGIN TRANSACTION, even if that stored procedure is called by code
where a transaction is already active.
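From a Delphi client this could be sketched as follows (the connection method names follow the Advantage Delphi components; the two posting routines are illustrative):

AdsConnection1.BeginTransaction;
try
  PostOrderHeader;              // may itself begin and commit a nested transaction
  PostOrderDetails;
  AdsConnection1.Commit;        // only the outermost commit makes the work permanent
except
  AdsConnection1.Rollback;      // undoes the whole outer transaction
  raise;
end;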


More 64-bit Clients
Advantage 9 introduced 64-bit versions of both the Windows and
Linux Advantage servers, as well as 64-bit clients for the Advantage
Client Engine (ACE) and the Advantage .NET Data Provider.
A number of additional 64-bit clients have been added in Advantage
10, including 64-bit versions of the OLE DB Provider, the ODBC
driver, as well as the Linux PHP driver.
The Advantage Data Provider for .NET has also been enhanced to use
the appropriate 64-bit or 32-bit drivers, depending on the OS on which
your managed code is executing. In addition, 64-bit versions of the
Advantage Local Server (ALS) and Advantage backup utility have been
introduced in Advantage 10.
Added Design-Time Support in Delphi
The SQL Utility, a comprehensive SQL editor and debugger, is now
exposed as a property editor directly within the Delphi IDE (integrated
development environment). To use the SQL Utility within Delphi,
select the ellipsis button on the SQL property in Delphi's Object
Inspector when an AdsQuery component is selected.
Using the SQL Utility, you can check the syntax of your SQL, execute
it, and even set breakpoints and debug your SQL scripts. Once you are
satisfied with your SQL, click the Save button on the SQL Utility
toolbar (or press Ctrl-S) to save your SQL to the SQL property of the
AdsQuery.
The Advantage Delphi Components also include a new component,
TAdsEvent. This component, which you can use to subscribe to and
handle notifications, allows you to easily configure and manage the
handling of asynchronous events.

Side-By-Side Installations
With Advantage 10, it is now possible to run two or more instances of
the Advantage server on the same physical server, even different
versions of Advantage. For example, it is now possible to run
Advantage 9 and Advantage 10 on the same server. This feature is
particularly useful for vertical market developers whose applications
need to support more than one version of the Advantage server.
Conclusion
With the release of Advantage 10, Sybase has once again
confirmed its commitment to this unique and valuable database
server.
In addition to a number of useful additions and enhancements,
Advantage 10 also includes a wide range of performance
improvements that will improve the performance of most client
applications merely by installing this updated server.
Most developers, however, will also want to update their client
applications to benefit from the many enhancements found in
Advantage 10.
From support for Unicode to greatly improved notifications, from
updated SQL syntax to enhanced table features, Advantage 10 has
something for everybody.

BOOKS FROM MARCO CANTÙ COMBINED INTO ONE PDF

A single PDF bundling together 3 ebooks:


Delphi 2007 Handbook, Delphi 2009 Handbook, and Delphi 2010 Handbook. The material has not been edited,
it is simply the merging of the three original ebooks. The collection of books covers all of the new features of
Delphi 2010 since Delphi 7, covering the IDE, the Delphi language, Unicode support, Windows development,
new VCL components, database access, DataSnap, and much more. In total the combined book has almost
1,000 pages of Delphi-related content, in a single easy-to-search PDF file.
http://sites.fastspring.com/wintechitalia/product/delphihandbookscollection

You can order this issue as a printed item.
If you take out a new subscription we will offer you a discount of € 5.00
per subscription. We have some special offers for our subscribers:
you will find these extra offers on page 26 from Nexus,
on page 72 from components4developers,
and on page 27 from TMS software.

Real-time data collection with the Delphi Development Board
by Anton Vogelaar
starter - expert; Delphi 3 and above


In engineering science you often need to make physical data
simultaneously available to more than one application. To simplify
the programming of the following example, which illustrates how to
do this, an Interbase / Firebird database is used, as it provides
marshalling, atomicity, isolation and durability. This article
describes a modular approach from sensing temperature data to
storing the data in a database. It will also serve to document how
to write a program for use with the DelphiDevBoard (see page 42).

Figure 1: The application (the Delphi / Lazarus application dta_collector.exe on the
PC reaches the board through a USB driver and Lib485a.dll, and the database
through fbclient.dll)


The signal route from sensor to data storage takes this pathway:
1) Sensor
2) Analogue to digital converter
3) Transmitter
4) Data collection application
5) RDBMS (relational database management system)
Figure 2: The organisation of the system (temperature sensor → driver → I/O server,
firing 8 times per second and writing the TInputs record in RAM → M485A server
→ RS232 → PC)
1. Sensor.



In this example a temperature is to be measured with a resolution of at
least 0.1 degree Celsius and recorded together with a date and
time stamp. The sensor DS18B20 from Maxim was chosen, as it
has a measuring range of -55 to +125 degrees Celsius with a resolution
of 1/16 degree. This satisfies the required specification.
2. Analogue to digital converter.
The DS18B20 has a built-in 12-bit A/D converter.
The digital data ranges from %1111 1100 1001 0000 = $FC90 = -880,
equivalent to -55.0 degrees Celsius, through
%0000 0000 0000 0000 = $0000 = 0 for zero degrees, up to
%0000 0111 1101 0000 = $07D0 = 2000, equivalent to +125 degrees Celsius.
That is, the digital value from the converter is 16 times the Celsius
temperature.
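Note that the value is two's complement: read as an unsigned 16-bit number, $FC90 is 64656, so the PC side must reinterpret the reading as a signed 16-bit integer before dividing by 16. A minimal sketch with the value above:

Procedure ShowConversion;
Var Raw  : Word;
    Temp : Double;
Begin
  Raw  := StrToInt ('$FC90');    (* 16-bit value from the converter *)
  Temp := SmallInt (Raw) / 16;   (* reinterpret as signed: -880 / 16 = -55.0 *)
End;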
3. Transmitter.
The DelphiDevBoard (see page 42) is used as the transmitter. The
standard firmware can be used to implement the transmitter. It
contains these components:
- A driver for the temperature sensor. It receives the digital data from
the sensor and provides the data as an integer.
- I/O Server. This background application fires 8 times per second,
calling the driver and writing the received integer into a Pascal record
in memory: Var Inputs : TInputs; the location where it is stored
is Inputs.Temp.
- A M485aServer, running in the background, stands by to provide
memory data on incoming requests through the RS232 port. The
open protocol used is M485a.

4. Data collection application.
To collect the data a PC is used running Windows and a Delphi
application. As most modern computers don't have RS232 ports, a USB
dongle is used to connect to the transmitter. To ease the interfacing
with the transmitter, the DelphiDevBoard comes with the lib485a.dll
implementing the M485a protocol. The task of this application is to get
the temperature, add a date and time stamp and store it in a database,
for which we use Interbase/Firebird. This modular Delphi project,
dta_collector.dpr, contains four layers, i.e.
- Presentation layer
- Business layer
- Communication layer
- Persistence layer
Figure 3: The dpr file explained (UGUI.pas = presentation layer; UDomain.pas =
business and communication layers; UDB.pas = persistence layer; the application
links to lib485a.dll and fbclient.dll)

Presentation layer.
The source code of the GUI can be read in UGUI.pas and the
screenshot shows how the controls are positioned. The controlling of
this application is handled by a controller instance of TContr defined
in the business layer.
The controller is instantiated in the OnCreate method of the main
form (named GUI), and released in its OnDestroy method.
In the top-left corner is a SpeedButton with a red / green glyph
indicating the on/off line status.
The OnClick event calls the GoOnline and GoOffLine private methods
of the main form (GUI).
These methods enable and disable the visibility of a panel showing the
measured data and call the Start and Stop methods of the controller.
This class also provides the public method Refresh to set the visual
controls with numbers as obtained from lower layers.

Figure 4: The presentation layer and its components



Unit UGUI;
(* ======= Interface ================================== *)
Interface
Uses Windows, Messages, SysUtils, Variants, Classes,
     Graphics, Controls, Forms, Dialogs, StdCtrls, ExtCtrls,
     Buttons, ToolWin, ComCtrls, UDomain;

Type TGUI = Class (TForm)
       TBar    : TToolBar;
       BtnGo   : TSpeedButton;
       PnlMain : TPanel;
       LbTemp  : TLabel;
       LbPotm  : TLabel;
       ShTot   : TShape;
       ShTemp  : TShape;
       ShPotm  : TShape;
       Label1  : TLabel;
       Label2  : TLabel;
       Label3  : TLabel;
       LbLog   : TLabel;
       Shape1  : TShape;
       Procedure BtnGoClick (Sender : TObject);
       Procedure FormHide   (Sender : TObject);
       Procedure FormShow   (Sender : TObject);
     Private
       Contr : TContr;
       Procedure GoOnLine;
       Procedure GoOffLine;
     Public
       Procedure Refresh (STemp, SPotm, SLog : String);
     End;

Var GUI : TGUI;

(* ======= Implementation ============================= *)
Implementation
{$R *.dfm}
(* ======= Private ===================================== *)
Procedure TGUI.GoOnLine;
Begin
  Try
    Contr.Start;
    PnlMain.Visible := True;
  Except
    GoOffLine; Raise
  End;
End;

Procedure TGUI.GoOffLine;
Begin
  BtnGo.Down      := False;
  PnlMain.Visible := False;
  Contr.Stop;
End;
(* ======= Public ====================================== *)
Procedure TGUI.Refresh (STemp, SPotm, SLog : String);
Begin
  LbTemp.Caption := STemp;
  LbPotm.Caption := SPotm;
  LbLog.Caption  := SLog;
End;
(* ======= Form ======================================== *)
Procedure TGUI.FormShow (Sender : TObject);
Begin
  Contr := TContr.Create;
End;

Procedure TGUI.FormHide (Sender : TObject);
Begin
  GoOffLine;
  Contr.Free;
End;

Procedure TGUI.BtnGoClick (Sender : TObject);
Begin
  If BtnGo.Down Then GoOnLine Else GoOffLine;
End;
(* ======= End ========================================= *)
End.

Unit UDomain;
(* ======= Interface ================================== *)
Interface
Uses ExtCtrls, SysUtils, UDB;

Type TContr = Class
     Private
       Timer : TTimer;
       DB    : TDB;
       NLog  : Integer;
       Procedure DoTimer (Sender : TObject);
     Public
       Constructor Create;
       Destructor  Destroy; Override;
       Procedure   Start;
       Procedure   Stop;
     End;

(* ======= Implementation ============================= *)
Implementation
Uses UGUI;

Type TStr80 = String [80];

(* ======= DLL procedures ============================== *)
Procedure M485a_Open    (Port : Integer);           External 'lib485a.dll';
Procedure M485a_Close;                              External 'lib485a.dll';
Procedure M485a_ProdID  (Var ProdID   : TStr80);    External 'lib485a.dll';
Procedure M485a_ProdIDF (Var ProdID   : TStr80);    External 'lib485a.dll';
Procedure M485a_Vars    (Var MainVars : TStr80);    External 'lib485a.dll';
Procedure M485a_RdRam   (N, MAddr : Word; Var Buf); External 'lib485a.dll';
Procedure M485a_WrRam   (N, MAddr : Word; Var Buf); External 'lib485a.dll';
Procedure M485a_RdEe    (N, MAddr : Word; Var Buf); External 'lib485a.dll';
Procedure M485a_WrEe    (N, MAddr : Word; Var Buf); External 'lib485a.dll';

(* ======= Public ====================================== *)
Constructor TContr.Create;
Begin
  Inherited;
  DB := TDB.Create;
  DB.Open ('192.168.0.16:/db/test.fdb');
End;

Destructor TContr.Destroy;
Begin
  DB.Close;
  FreeAndNil (DB);
  Inherited;
End;

Procedure TContr.Start;
Begin
  M485a_Open (1);
  Timer          := TTimer.Create (Nil);
  Timer.Interval := 1000;
  Timer.OnTimer  := DoTimer;
  Timer.Enabled  := True;
End;

Procedure TContr.Stop;
Begin
  If Timer = Nil Then Exit;
  Timer.Enabled := False; FreeAndNil (Timer);
  M485a_Close;
End;
(* ======= Timer ======================================= *)
Procedure TContr.DoTimer (Sender : TObject);
Var S            : TStr80;
    STemp, SPotm : String;
    ITemp, IPotm : Integer;
Begin
  If Not Timer.Enabled Then Exit;
  M485a_Vars (S);
  STemp := Copy (S, 3, 4);          SPotm := Copy (S, 7, 4);
  ITemp := StrToInt ('$' + STemp);  IPotm := StrToInt ('$' + SPotm);
  STemp := Format ('%.1f C', [ITemp / 16]);
  SPotm := Format ('%.1f %%', [IPotm / 10.23]);
  Inc (NLog);
  GUI.Refresh (STemp, SPotm, IntToStr (NLog));
  DB.Save (ITemp, IPotm);
End;
(* ======= End ========================================= *)
End.


Business and Communication layer.
Both business and communication functionality is implemented in the
unit UDomain. As Windows is an event-based operating system, the
controller class contains a timer instance firing the OnTimer event
every second. This event is linked to the DoTimer method. In
TContr.DoTimer the procedure M485a_Vars (S) is called.
This procedure resides in the lib485a DLL and returns a string
representation of the Inputs record in hexadecimal format.
The temperature is four hexadecimal characters long, from position 3.
ITemp := StrToInt ('$' + Copy (S, 3, 4)); returns the
temperature as an integer in multiples of 1/16 degree Celsius, where the
'$' character forces StrToInt to treat the string as
hexadecimal characters. This value is passed to the GUI as a string
created by Format ('%.1f C', [ITemp / 16]) when the method GUI.Refresh is
called. This value is also to be stored in the database. Since all database
actions are encapsulated in class TDB it is sufficient to call
DB.Save (ITemp, IPotm). Instantiating and releasing the DB object is handled by the
controller's Create and Destroy methods.
Persistence layer.
This layer contains all the functionality required to save the measured
data in an Interbase / Firebird database. Instances of TIbDatabase,
TIbTransaction and TIbSQL are used, since TIbSQL is a lightweight
communication class. Objects of these classes are instantiated and
released by the TDB.Open and TDB.Close methods. The actual storage
functionality is implemented in the TDB.Save method and embedded in a
transaction.
Unit UDB;
(* ======= Interface ================================== *)
Interface
Uses ExtCtrls, SysUtils, IbDatabase, IbSQL, Classes;

Type TDB = Class
     Private
       IbDb  : TIbDatabase;
       IbTr  : TIbTransaction;
       IbSQL : TIbSQL;
     Public
       Procedure Open (DbName : String);
       Procedure Close;
       Procedure Save (ITemp, IPotm : Integer);
     End;

(* ======= Implementation ============================= *)
Implementation
(* ======= Public ====================================== *)
Procedure TDB.Open (DbName : String);
Begin
  IbDb  := TIBDatabase.Create (Nil);
  IbTr  := TIBTransaction.Create (Nil);
  IbSQL := TIBSQL.Create (Nil);
  IbTr.DefaultDatabase := IbDb;
  With IbDb Do
  Begin
    Params.Add ('user_name=SYSDBA');
    Params.Add ('password=masterkey');
    DatabaseName       := DbName;
    LoginPrompt        := False;
    SQLDialect         := 3;
    DefaultTransaction := IbTr;
    Open;
  End;
  With IbSQL Do
  Begin
    Database := IBDb; Transaction := IBTr;
  End;
End;

Procedure TDB.Close;
Begin
  IbDb.Close;
  FreeAndNil (IbSQL); FreeAndNil (IbTr); FreeAndNil (IbDb);
End;

Procedure TDB.Save (ITemp, IPotm : Integer);
Begin
  IbTr.StartTransaction;
  IbSQL.SQL.Text := Format ('insert into LOG (TEMP, POTM) values (%s, %s)',
                            [IntToStr (ITemp), IntToStr (IPotm)]);
  IbSQL.ExecQuery;
  IbTr.Commit;
End;
(* ======= End ========================================= *)
End.


5. Relational Database Management System (RDBMS).
Installing Interbase / Firebird provides the database engine, not the
database to be used. The database can be created using the console
application isql.
To automate this process, and to be able to recreate the database when
errors are found or when installing on a different machine, all
statements are included in the script file create_db.sql.
This script is executed by the command: isql -q -i create_db.sql.
The script contains comments, the creation of the database, the creation
of tables, the creation of triggers and generators to implement
auto-increment fields, and sample data to populate the tables.

The SQL Script: create_db.sql

/* Script creating a firebird database table.
   Copyright by Vogelaar Electronics, Bunschoten Netherlands.
   Rev. 0.10  2010-09-17  Initial release
   Usage: # ln -s /opt/firebird/bin/isql /usr/local/sbin/fbisql
          # cd /path of script/
          # fbisql -q -i create_db.sql */
/* ==================== connect ==================== */
set sql dialect 3;
create database "localhost:/db/test.fdb"
  user 'SYSDBA' password 'masterkey';
/* ==================== tables ===================== */
create table LOG (
  ID    integer   not null primary key,
  STAMP timestamp not null,
  TEMP  integer,
  POTM  integer,
  REM   varchar (20));
/* ============ generators and triggers ============ */
create generator AI_LOG;
set generator AI_LOG to 0;
set term ^ ;
create trigger TR_BILOG for LOG before insert as
begin
  if (NEW.ID    is null) then NEW.ID    = gen_id (AI_LOG, 1);
  if (NEW.STAMP is null) then NEW.STAMP = CURRENT_TIMESTAMP;
end ^
set term ; ^
/* =================== populate ==================== */
insert into LOG values (null, null, 100, 200, 'none');
insert into LOG values (null, null, 150, 250, null);
commit;

Marshalling
In computer science, marshalling (similar to serialization) is the process of
transforming the memory representation of an object to a data format suitable for
storage or transmission. It is typically used when data must be moved between
different parts of a computer program or from one program to another.
Marshalling is a process that is used to communicate with remote objects via a
serialized object. It simplifies complex communication, using
custom/complex objects to communicate instead of primitives.
The opposite, or reverse, of marshalling is called unmarshalling (or demarshalling,
similar to deserialization). - wiki
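Tied back to this article: the firmware does exactly this when the Inputs record is turned into the hexadecimal string that M485a_Vars returns. A sketch of the idea (the record layout is inferred from the Copy (S, 3, 4) and Copy (S, 7, 4) offsets used in UDomain, and is illustrative):

Type TInputs = Record
       Status : Byte;      (* first two hex characters            *)
       Temp   : SmallInt;  (* characters 3..6, 16 x degrees C     *)
       Potm   : SmallInt;  (* characters 7..10, the potentiometer *)
     End;

Function MarshalInputs (Const I : TInputs) : String;
Begin
  Result := IntToHex (I.Status, 2) +
            IntToHex (Word (I.Temp), 4) +
            IntToHex (Word (I.Potm), 4);
End;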
Atomicity

In database systems, atomicity (or atomicness) is one of the ACID transaction


properties. In an atomic transaction, a series of database operations either all occur,
or nothing occurs. A guarantee of atomicity prevents updates to the database
occurring only partially, which can cause greater problems than rejecting the whole
series outright.
The etymology of the phrase originates in the Classical Greek concept of a
fundamental and indivisible component; see atom.
An example of atomicity is ordering an airline ticket where two actions are
required: payment, and a seat reservation. The potential passenger must either:
1. both pay for and reserve a seat; OR
2. neither pay for nor reserve a seat.
The booking system does not consider it acceptable for a customer to pay for a ticket
without securing the seat, nor to reserve the seat without payment succeeding. - wiki


The DelphiController and DelphiDevBoard were designed to help
students, Pascal programmers and electronic engineers
understand how to program microcontrollers and embedded
systems. This is achieved by providing hardware (either
pre-assembled or as a DIY kit of components), course material,
templates, a Pascal-compatible cross-compiler, and the use of a
standard IDE for development and debugging (Delphi, Lazarus or
FreePascal).

(Advertisement)
The Delphi/Lazarus Controller and the Delphi/Lazarus Development Board
starter - expert; Delphi 3 and higher (Win32)

[Board diagram of the VE09206 DelphiController: M324D40 DelphiCPU with
regulated 3.3 VDC power supply and 7.37 MHz X-tal oscillator]
Information about sales etc.: www.blaisepascal.eu

Hardware.
M324D40 DelphiCPU
A programmed AtMega324 40 pin dual in line controller chip measuring
50 x 17 x 4 mm. This chip contains all basic computer parts like :
- 32 kBytes Flash memory for program storage
- 2 kByte RAM for variables
- 32 general purpose registers
- 1 kByte of EeRom for non - volatile storage
- an AVR CPU
- a hardware multiplier for fast mathematics
- a clock oscillator able to run up to 20 MHz.
The chip also contains a set of input and output peripherals like :
- JTAG interface for real-time debugging
- 3 timer counters with interrupts for a system clock and delays
- 6 PWM channels which can be used for motor speed and direction
control
- 8 analog to digital converters 10 Bit with 1x, 10x and 200x amplifiers
for interfacing with analog sensors, voltages and potentiometers
- I2C interface for expanding the number of peripherals
- Watchdog timer for automatic reset when a failure is detected
- Analog comparator for accurate level detection
- 32 programmable digital in- and output lines for lamps, switches,
LCD, TCP/IP etc.
4 KByte of the 32 KByte flash is reserved and preprogrammed with:
- OS : an operating system implementing a system clock, starting of
servers and user applications
- BIOS : I/O drivers for EeRom and standard hardware
- Hardware test : standard hardware test application
- M485A server : monitor for RAM and EeRom including a Flash
programmer for user applications
- I/O server : synchronized I/O with RAM records


VE08201 DelphiStamp
This is a miniature (52 x 20 x 20 mm) plug-in version of the
DelphiController as described above, with additional resources:
- Flash memory = 128 kByte
- RAM memory = 4 kByte
- EeRom memory = 4 kByte
- X-tal oscillator = 11.06 MHz

The VE09205 DelphiDeveloperBoard


This 90 x 80 x 18 mm printed circuit board contains the VE09206
DelphiController plus additional sensors and actuators, enabling you
to make a quick start in writing applications. For the provided I/O the
BIOS contains the required drivers. Templates are provided for
simulation in the Delphi IDE, making the R&D cycle much quicker.
On board sensors are:
- a potentiometer providing an angle measurement
- two push buttons
- a temperature sensor with a resolution of 0.06 degree Celsius
On board actuators are:
- 10 segment LED bar
- 2 digit 7 segment LED display
Communication:
- RS232 or RS485 communication port
- 10 pin header with 6 universal I/O wires to control user hardware
(in future: TCP/IP networking will be possible through this header)

The VE09206 DelphiController
This is a high quality printed circuit board with dimensions of
67 x 51 x 13 mm which contains, besides the above described DelphiCPU:
- a 5 to 9 Volt DC regulated power supply with screwed terminals
and polarity protection, providing the 3.3 Volt board voltage
- a 7.37 MHz crystal oscillator
- a LED as activity indicator
- a reset switch
- 3 enable / disable switches for servers and user application
- JTAG connector for real-time debugging
- a RS232 port for monitoring and uploading user applications
- a 40 pin socket to connect the controller with the application hardware



Documentation.
The tutorial package contains:
- software user's manual including install instructions
- hardware user's manual
- description of hardware test method
- several detailed project descriptions
- compiler manual
- data sheets of used components
- electronic circuit diagrams
- source listing of used drivers
Cross-compiler.
The cross-compiler accepts Pascal code as used in Delphi and Lazarus
(no classes or objects) and converts this to AVR object code which the
DelphiController will run after uploading. The cross-compiler has a
built-in assembler which can be used for drivers and time-critical
sections of the user application.
IDE.
The use of an IDE (Integrated Development Environment) greatly
enhances the development process by providing features such as code
completion, help, file management etc. Suitable IDEs are provided by
Delphi, Lazarus and FreePascal. While using the IDE and suitable
templates the controller algorithm can be run in simulation mode in
which the provided GUI units and templates simulate sensors and
actuators. After debugging, the unmodified source code can be used for
the cross compiler.
Templates.
To help you learn quickly how to program this hardware, templates
are provided in units for typical unvarying code sections
found in simple projects. By supplying templates and library units the
controller's behaviour can be simulated in your chosen IDE. Templates
are provided for three types of application programs, i.e.
1) Interface application
2) GUI application and
3) Stand alone application.



Interface application.
In this mode the PC is the master and executes the control algorithm
while the DelphiController acts as an interface slave and follows the
commands coming from the PC through its RS232 port. When the PC
stops the DelphiController will also stop.
In the interface mode the DelphiController contains in RAM an input
and output record. Each field in these records reflects the state of the
sensors and actuators. The IOServer running in the background
synchronizes these records with the physical hardware through the
BIOS drivers. The running M485A monitor server will accept read and
write commands on the input and output records through the RS232
port. To make communication between the PC and the
DelphiController easy a communication DLL, templates and sample
applications are provided.


GUI application.
In this mode the DelphiController is the master and executes the
control algorithm while the PC retrieves information from the RS232
port to update a GUI or to send commands to the controller. When the
PC stops the DelphiController will continue. In the GUI mode the
DelphiController contains in RAM an input and output record. Each
field in these records reflects the state of the sensors and actuators. The
IOServer running in the background synchronizes these records with
the physical hardware through the BIOS drivers. The running M485A
monitor server will accept read and write commands on the input and
output records through the RS232 port. To make communication
between the PC and the DelphiController easy a communication DLL,
templates and sample applications are provided.
Stand alone application.
In this mode the full speed of the DelphiController can be utilized as
no background servers are required. Optionally the M485A monitor
server can be started for remote control through the RS232 port.

Information about sales etc.: www.blaisepascal.eu
About the author
Anton J. Vogelaar
is an electronic measurement and control engineer. In
1974 he completed his electronic engineering study at the
HTS in Utrecht. In 1982 he completed a course in
engineering science at the University of Durham, UK. Since
1972 he has been director of Vogelaar Electronics
Netherlands, which specialises in digital control
equipment (air gauging, climate control, industrial control,
bio-reactor control etc.). He also writes software which
provides the control, GUI and persistence aspects required
by the hardware his firm produces. The programming
languages he uses include Assembler (AVR, PIC and
8751), Pascal, Delphi and Java on Windows and Linux
(embedded) platforms. He has more than ten years'
experience as a part-time teacher in advanced technical
colleges and gives lectures in Pascal/Delphi. His other
interests are: rowing, steam engines, study, technology
and grandchildren.

COME AND JOIN THE FIRST INTERNATIONAL LAZARUS SYMPOSIUM
IN THE NETHERLANDS: UTRECHT / DE MEERN - SATURDAY 14TH OF MAY 2011
INFORMATION: www.blaisepascal.eu

LAZARUS ON USB STICK 4 GB - CONNECT AND START PROGRAMMING
Lazarus Help functionality is all pre-installed - no internet connection is
required. The USB stick includes a wealth of documentation in Adobe PDF
format (25 megabytes of Adobe PDF instruction files).
Only for subscribers: € 25.00 plus € 7.50 postage costs;
non-subscribers: € 35.00 plus € 7.50 postage costs.
Ordering at the Blaise Pascal Shop: www.blaisepascal.eu

Object-Oriented Databases, quietly conquering the world...
by Detlef Overbeek
starter - expert; Delphi 2010

To begin with I think it is necessary to give you an overview of
what a database actually is. So first of all we will explain the
most relevant models, to be able to dive deeper and finally end up
with a real surprise: object-oriented databases aren't dead, as I
thought they were (because of all the negative reactions I had during the
writing of this article about the subject). On the contrary: they are thriving,
and I found some very large companies using them. What makes
it extra exciting: I found one object database that works with
Delphi and/or Pascal, and you can find some of the Pascal code
on the .iso image of the DVD available at the Blaise website. (If
you want to, you could just order the DVD as well.) So let's try to understand
the various database models, and start with an overview of the
database models.
The Hierarchical Model
A hierarchical data model is a data model in which the data is organized into a
tree-like structure. The structure allows repeating information using parent/child
relationships: each parent can have many children but each child only has one
parent. All attributes of a specific record are listed under an entity type.
In a database, an entity type is the equivalent of a table; each individual record is
represented as a row and an attribute as a column. Entity types are related to each
other using 1: N mapping, also known as one-to-many relationships.
The most recognized and used hierarchical databases are IMS developed by IBM
and Windows Registry by Microsoft. - wikipedia
(The explanations that follow are taken from the website at
http://www.unixspace.com/context/databases.html.
The author is Alexander Lashenko, Toronto, Canada.)

The hierarchical data model organizes data in a tree structure.
There is a hierarchy of parent and child data segments.
This structure implies that a record can have repeating information,
generally in the child data segments.
Data is stored in a series of records, which have a set of field values
attached to them. The hierarchy collects all the instances of a specific
record together as a record type.
These record types are the equivalent of tables in the relational model,
with the individual records being the equivalent of rows.
To create links between these record types, the hierarchical model uses
Parent Child Relationships.
There is a 1:N mapping between record types. This is done by using
trees, like set theory used in the relational model, "borrowed" from
maths.
Figure 1: a hierarchical model (a tree of record types: Pavement Improvement at the
root, with Reconstruction, Maintenance and Rehabilitation below it, and Routine,
Corrective and Preventive below Maintenance)

Network Model
The network model is a database model conceived as a flexible way of representing
objects and their relationships. Its distinguishing feature is that the schema, viewed as
a graph in which object types are nodes and relationship types are arcs, is not
restricted to being a hierarchy or lattice.
The network model's original inventor was Charles Bachman. - wikipedia

An owner record type can also be a member or owner in another set.


The data model is a simple network, and link and intersection record
types (called junction records by IDMS) may exist, as well as sets between
them .
Thus, the complete network of relationships is represented by several
pairwise sets; in each set some (one) record type is owner (at the tail of the
network arrow) and one or more record types are members (at the head of
the relationship arrow). Usually, a set defines a 1:M relationship, although
1:1 is permitted.
The CODASYL network model is based on mathematical set theory.
http://en.wikipedia.org/wiki/CODASYL
Preventive Maintenace

Flexible Pavement

Rigid Pavement

Spall Rpair

Joint Seal

Sillicone Sealant

Crack Seal

Patching

Asphalt Sealant

Figure 2: a network model


Relational Model
(RDBMS - relational database management system) A database based
on the relational model developed by E.F. Codd. A relational database
allows the definition of data structures, storage and retrieval operations
and integrity constraints. In such a database the data and relations
between them are organised in tables. A table is a collection of records
and each record in a table contains the same fields.
Properties of Relational Tables:
* Values Are Atomic
* Each Row is Unique
* Column Values Are of the Same Kind
* The Sequence of Columns is Insignificant
* The Sequence of Rows is Insignificant
* Each Column Has a Unique Name
Certain fields may be designated as keys, which means that searches for
specific values of that field will use indexing to speed them up. Where
fields in two different tables take values from the same set, a join
operation can be performed to select related records in the two tables
by matching values in those fields.
Often, but not always, the fields will have the same name in both tables.
For example, an "orders" table might contain (customer-ID, productcode) pairs and a "products" table might contain (product-code, price)
pairs. To calculate a given customer's bill you would sum the prices of
all products ordered by that customer by joining on the product-code
fields of the two tables. This can be extended to joining multiple tables
on multiple fields. Because these relationships are only specified at
retrieval time, relational databases are classed as dynamic database
management systems. The RELATIONAL database model is based on
Relational Algebra.
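In the style of the persistence layer shown earlier in this issue, that bill calculation could be sketched like this (the table and field names follow the example above; customer 42 is illustrative):

IbSQL.SQL.Text :=
  'select sum (P.PRICE) from ORDERS O ' +
  'join PRODUCTS P on P.PRODUCT_CODE = O.PRODUCT_CODE ' +
  'where O.CUSTOMER_ID = 42';
IbSQL.ExecQuery;
(* IbSQL.Fields [0] now holds the customer's total *)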



Figure 3: a relational model




The Object/Relational Model
An object-relational database (ORD), or object-relational database management
system (ORDBMS), is a database management system (DBMS) similar to a
relational database, but with an object-oriented database model:
objects, classes and inheritance are directly supported in database schemas and in the
query language.
In addition, it supports extension of the data model with custom data-types and
methods. An object-relational database can be said to provide a middle ground
between relational databases and object-oriented databases (OODBMS).
In object-relational databases, the approach is essentially that of relational
databases: the data resides in the database and is manipulated collectively with
queries in a query language.
At the other extreme are OODBMSes in which the database is essentially a
persistent object store for software written in an object-oriented programming
language, with a programming API for storing and retrieving objects, and little or no
specific support for querying. - wikipedia

The conditions under which OOP prevails over alternative techniques


(and vice-versa) often remain unstated by either party, however, making
rational discussion of the topic difficult, and often leading to heated
debates over the matter.
Roots

Object-oriented programming has roots that can be traced back to the


1960s. As hardware and software became increasingly complex,
manageability often became a concern.
Researchers studied ways to maintain software quality and developed
object-oriented programming in part to address common problems by
strongly emphasizing discrete, reusable units of programming logic.
The technology focuses on data rather than processes, with programs
composed of self-sufficient modules ("classes"), each instance of which
("objects") contains all the information needed to manipulate its own
data structure ("members").
This is in contrast to the modular programming that had been
dominant for many years, which focused on the function of a module
rather than specifically on the data, but equally provided for code reuse
and self-sufficient reusable units of programming logic, enabling
collaboration through the use of linked modules (subroutines).
This more conventional approach, which still persists, tends to consider
data and behavior separately.
A collection of interacting objects
An object-oriented program may thus be viewed as a collection of interacting objects, as opposed to the conventional model, in which a program is seen as a list of tasks (subroutines) to perform.
In OOP, each object is capable of receiving messages, processing data,
and sending messages to other objects.
Each object can be viewed as an independent 'machine' with a distinct
role or responsibility. The actions (or "methods") on these objects are
closely associated with the object. For example, OOP data structures tend to 'carry their own operators around with them' (or at least "inherit" them from a similar object or class). In the conventional model, the data and the operations on the data don't have a tight, formal association.

Figure 4: example of an Object-Oriented Database Model.

Object/relational database management systems add new object storage capabilities to the relational systems at the core of modern information systems. These new facilities integrate management of traditional field data, complex objects such as time-series and geospatial data, and diverse binary media such as audio, video, images, and applets. By encapsulating methods with data structures, an ORDBMS server can execute complex analytical and data manipulation operations to search and transform multimedia and other complex objects.
As an evolutionary technology, the object/relational (OR) approach has
inherited the robust transaction- and performance-management
features of its relational ancestor and the flexibility of its object-oriented cousin. Database designers can work with familiar tabular
structures and data definition languages (DDLs) while assimilating new
object-management possibilities. Query and procedural languages and
call interfaces in ORDBMSs are familiar: SQL3, vendor procedural
languages, and ODBC, JDBC, and proprietary call interfaces are all
extensions of RDBMS languages and interfaces. And the leading
vendors are, of course, quite well known: IBM, Informix, and Oracle.

Figure 5: interaction of an Object-Oriented Database Model (relational data; SQL data definition and manipulation statements such as create, insert and select; complex data types; operators for complex data types; object identifiers; inheritance; polymorphism).

The Object-Oriented Model
Object-oriented programming (OOP) is a programming paradigm that uses "objects" (data structures consisting of data fields and methods, together with their interactions) to design applications and computer programs. Programming techniques may include features such as data abstraction, encapsulation, modularity, polymorphism, and inheritance. Many modern programming languages now support OOP. - wikipedia
An object is a discrete bundle of functions and procedures, often relating to a particular real-world concept such as a bank account holder or hockey player. Other pieces of software can access the object only by calling those functions and procedures it has exposed to outsiders.
A large number of software engineers agree that isolating objects in this way makes their software easier to manage and keep track of. However, a significant number of engineers feel the reverse may be true: that software becomes more complex to maintain and document, or even to engineer from the start. The conditions under which OOP prevails over alternative techniques (and vice versa) often remain unstated by either party, however, making rational discussion of the topic difficult and often leading to heated debates over the matter.

Object DBMSs add database functionality to object programming languages. They bring much more than persistent storage of programming language objects. Object DBMSs extend the semantics of the C++, Smalltalk and Java object programming languages to provide full-featured database programming capability, while retaining native language compatibility. A major benefit of this approach is the unification of application and database development into a seamless data model and language environment. As a result, applications require less code, use more natural data modelling, and codebases are easier to maintain. Object developers can write complete database applications with a modest amount of additional effort.
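Returning to the notion of an object as a discrete bundle: in Delphi terms it might look like the following sketch (the class and its members are invented for illustration); the balance can only be changed through the methods the object exposes, never written directly.

type
  TAccountHolder = class
  private
    FName: string;
    FBalance: Currency; // hidden: only reachable through the methods below
  public
    constructor Create(const AName: string);
    procedure Deposit(Amount: Currency);
    procedure Withdraw(Amount: Currency);
    property Name: string read FName;
    property Balance: Currency read FBalance; // read-only from outside
  end;

constructor TAccountHolder.Create(const AName: string);
begin
  inherited Create;
  FName := AName;
  FBalance := 0;
end;

procedure TAccountHolder.Deposit(Amount: Currency);
begin
  if Amount > 0 then
    FBalance := FBalance + Amount;
end;

procedure TAccountHolder.Withdraw(Amount: Currency);
begin
  if (Amount > 0) and (Amount <= FBalance) then
    FBalance := FBalance - Amount;
end;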


According to Rao (1994), "The object-oriented database (OODB) paradigm is the combination of object-oriented programming language (OOPL) systems and persistent systems. The power of the OODB comes from the seamless treatment of both persistent data, as found in databases, and transient data, as found in executing programs."
In contrast to a relational DBMS where a complex data structure must
be flattened out to fit into tables or joined together from those tables to
form the in-memory structure, object DBMSs have no performance
overhead to store or retrieve a web or hierarchy of interrelated objects.

This one-to-one mapping of object programming language objects to
database objects has two benefits over other storage approaches:
it provides higher performance management of objects, and it enables
better management of the complex inter-relationships between objects.
This makes object DBMSs better suited to support applications such as
financial portfolio risk analysis systems, telecommunications service
applications, world wide web document structures, design and
manufacturing systems, and hospital patient record systems, which have
complex relationships between data.
Figure 6: an overview of the history. 1st generation: hierarchical and network data models; 2nd generation: relational databases (SQL 2, 1992); 3rd generation: object-oriented and object-relational databases (SQL 3, 1999), combining database ideas with object-oriented ideas.


Semi-structured Model
The semi-structured model is a database model.
In this model, there is no separation between the data and the schema, and the
amount of structure used depends on the purpose.
The advantages of this model are the following:
* It can represent the information of data sources which cannot be
constrained by a schema.
* It provides a flexible format for data exchange between different
types of databases.
* It can be helpful to view structured data as semi-structured
(for browsing purposes).
* The schema can easily be changed.
* The data transfer format may be portable.

Semi-structured data has recently emerged as an important topic of study for a variety of reasons. First, there are data sources such as the Web, which we would like to treat as databases but which cannot be constrained by a schema. Second, it may be desirable to have an extremely flexible format for data exchange between disparate databases. Third, even when dealing with structured data, it may be helpful to view it as semi-structured for the purposes of browsing.

In the semi-structured data model, the information that is normally associated with a schema is contained within the data itself, which is therefore sometimes called "self-describing". In such a database there is no clear separation between the data and the schema, and the degree to which it is structured depends on the application. In some forms of semi-structured data there is no separate schema; in others a schema exists but places only loose constraints on the data. Semi-structured data is naturally modelled in terms of graphs containing labels that give semantics to the underlying structure. Such databases subsume the modelling power of recent extensions of flat relational databases: nested databases, which allow the nesting (or encapsulation) of entities, and object databases, which in addition allow cyclic references between objects.

The primary trade-off being made in using a semi-structured database model is that queries cannot be made as efficiently as in a more constrained structure, such as the relational model. Typically the records in a semi-structured database are stored with unique IDs that are referenced with pointers to their location on disk. This makes navigational or path-based queries quite efficient, but searches over many records (as is typical in SQL) are less efficient, because the engine has to seek around the disk following pointers. The Object Exchange Model (OEM) is one standard for expressing semi-structured data; another is XML. - wikipedia
Associative Model
The associative model of data is an alternative data model for database systems.
Other data models, such as the relational model and the object data model, are
record-based.
These models involve encompassing attributes about a thing, such as a car, in a
record structure. Such attributes might be registration, colour, make, model, etc.
In the associative model, everything which has discrete independent existence is
modelled as an entity, and relationships between them are modelled as associations.
The granularity at which data is represented is similar to schemes presented by Chen
(Entity-relationship model); Bracchi, Paolini and Pelagatti (Binary Relations); and
Senko (The Entity Set Model).
A number of claims made about the model by Simon Williams, in his book The
Associative Model of Data, distinguish the associative model from more traditional
models. - wikipedia
The associative model divides the real-world things about which data is
to be recorded into two sorts:
Entities are things that have discrete, independent existence.
An entity's existence does not depend on anything else.
Associations are relationships whose existence depends on one or more
other things, such that if any of those things ceases to exist, then the
association itself ceases to exist or becomes meaningless.
An associative database comprises two data structures:
1. A set of items, each of which has a unique identifier,
a name and a type.
2. A set of links, each of which has a unique identifier, together with
the unique identifiers of three other things that represent the
source, verb and target of a fact that is recorded about the source in
the database.
Each of the three things identified by the source, verb and target
may be either a link or an item.

Figure 7: associative tables
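The two structures translate readily into code. In this Pascal sketch (all identifiers and the example fact are invented) a first link records a fact about an item, and a second link records a fact about the first link:

type
  TItem = record
    ID: Integer;
    Name: string;
    Kind: string; // the item's "type"; renamed because type is a reserved word
  end;

  // A link: a unique identifier plus the identifiers of source, verb
  // and target; each of the three may be an item or another link.
  TLink = record
    ID: Integer;
    SourceID, VerbID, TargetID: Integer;
  end;

const
  Flight:    TItem = (ID: 1; Name: 'Flight BA1234'; Kind: 'Flight');
  ArrivedAt: TItem = (ID: 2; Name: 'arrived at'; Kind: 'Verb');
  Airport:   TItem = (ID: 3; Name: 'Heathrow'; Kind: 'Airport');
  OnDate:    TItem = (ID: 4; Name: 'on'; Kind: 'Verb');
  TheDay:    TItem = (ID: 5; Name: '12 August 2010'; Kind: 'Date');

const
  // "Flight BA1234 arrived at Heathrow"
  Fact1: TLink = (ID: 100; SourceID: 1; VerbID: 2; TargetID: 3);
  // "(Flight BA1234 arrived at Heathrow) on 12 August 2010":
  // here the source (100) identifies a link, not an item.
  Fact2: TLink = (ID: 101; SourceID: 100; VerbID: 4; TargetID: 5);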



Entity-Attribute-Value (EAV) data model
Entity-attribute-value model (EAV) is a data model to describe entities where the
number of attributes (properties, parameters) that could be used to describe them is
potentially vast, but the number that will actually apply to a given entity is relatively
modest. In mathematics, this model is known as a sparse matrix. EAV is also
known as object-attribute-value model and open schema. - wikipedia
The best way to understand the rationale of EAV design is to
understand row modeling (of which EAV is a generalized form).
Consider a supermarket database that must manage thousands of
products and brands, many of which have a transitory existence.
Here, it is intuitively obvious that product names should not be hardcoded as names of columns in tables.
Instead, one stores product descriptions in a Products table:
purchases/sales of individual items are recorded in other tables as
separate rows with a product ID referencing this table.
Conceptually an EAV design involves a single table with three columns,
an entity (such as an olfactory receptor ID), an attribute (such as species, which
is actually a pointer into the metadata table) and a value for the attribute (e.g.,
rat).
In EAV design, one row stores a single fact.
In a conventional table that has one column per attribute, by
contrast, one row stores a set of facts.
EAV design is appropriate when the number of parameters that
potentially apply to an entity is vastly more than those that actually
apply to an individual entity.
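The contrast is easy to show in SQL. In this sketch (table and column names are invented, written here as Delphi string constants) one conventional row becomes two EAV rows, one per fact; attributes that don't apply to an entity simply have no row at all:

const
  // Conventional design: one column per attribute, one row per entity.
  CreateConventional =
    'CREATE TABLE receptor (' +
    ' receptor_id INTEGER PRIMARY KEY,' +
    ' species VARCHAR(40),' +
    ' tissue VARCHAR(40))';

  // EAV design: one row per (entity, attribute, value) fact.
  CreateEAV =
    'CREATE TABLE receptor_eav (' +
    ' entity_id INTEGER,' +
    ' attribute VARCHAR(40),' + // in practice a key into a metadata table
    ' value VARCHAR(80))';

  InsertFacts: array[0..1] of string = (
    'INSERT INTO receptor_eav VALUES (17, ''species'', ''rat'')',
    'INSERT INTO receptor_eav VALUES (17, ''tissue'', ''olfactory'')');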
For more information see: The EAV/CR Model of Data:
http://ycmi.med.yale.edu/nadkarni/eav_cr_contents.htm

Something special: The Context Model


The context data model combines features of all the above models. It can be considered as a collection of object-oriented, network and semi-structured models, or as some kind of object database. In other words, this is a flexible model: you can use any type of database structure, depending on the task. Such a data model has been implemented in the ConteXt DBMS.
The fundamental unit of information storage of ConteXt is a CLASS.
A class contains METHODS and describes an OBJECT.
The Object contains FIELDS and PROPERTY.
The field may be composite, in which case the field contains SubFields, etc. The Property is a set of fields that belongs to a particular Object (similar to an AVL database). In other words, fields are the permanent part of an Object, whereas Property is its variable part.
The header of a class contains the definition of the internal structure
of the Object, which includes the description of each field, such as its
type, length, attributes and name.
A Context data model has a set of predefined types as well as user
defined types.
The predefined types include not only character strings, texts and digits
but also pointers (references) and aggregate types (structures).

Like a NETWORK database, apart from the fields directly containing the information, a context database has fields storing the place where information can be found, i.e. a POINTER (link, reference) which can point to an Object in this or another Class. Because the main unit addressed in a context database is an Object, the pointer addresses an Object rather than a field of that Object. Pointers are either STATIC or DYNAMIC.
All pointers that belong to a particular static pointer type point to the same Class (albeit, possibly, to different Objects). In this case, the Class name is an integral part of that pointer type. A dynamic pointer type describes pointers that may refer to different Classes. The Class, which may be linked through a pointer, can reside on the same or any other computer on the local area network. There is no hierarchy between Classes, and a pointer can link to any Class, including its own.
In contrast to pure object-oriented databases, a context database is not so tightly coupled to the programming language and doesn't support methods directly. Instead, method invocation is partially supported through the concept of VIRTUAL fields.
A VIRTUAL field is like a regular field: it can be read or written into. However, this field is not physically stored in the database, and it does not have a type described in the schema. A read operation on a virtual field is intercepted by the DBMS, which invokes a method associated with the field, and the result produced by that method is returned. If no method is defined for the virtual field, the field will be blank. A METHOD is a subroutine written in C++ by an application programmer. Similarly, a write operation on a virtual field invokes an appropriate method, which can change the value of the field. The current value of virtual fields is maintained by a run-time process; it is not preserved between sessions. In object-oriented terms, virtual fields represent just two public methods: reading and writing. Experience shows, however, that this is often enough in practical applications. From the DBMS point of view, virtual fields provide a transparent interface to such methods via an application written by an application programmer.
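Delphi developers will recognize the pattern: a virtual field behaves much like a property whose reads and writes go through methods instead of storage. A sketch with invented names:

uses SysUtils; // for CurrentYear

type
  TPerson = class
  private
    FBirthYear: Integer; // the only value physically stored
    function GetAge: Integer;               // the "read method"
    procedure SetAge(const Value: Integer); // the "write method"
  public
    // Age looks like a regular field, but every read and write is
    // intercepted; no Age value is ever stored.
    property Age: Integer read GetAge write SetAge;
  end;

function TPerson.GetAge: Integer;
begin
  Result := CurrentYear - FBirthYear;
end;

procedure TPerson.SetAge(const Value: Integer);
begin
  FBirthYear := CurrentYear - Value;
end;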
A context database that has no composite or pointer fields and no Property is essentially RELATIONAL. With static composite and pointer fields, a context database becomes OBJECT-ORIENTED. If the context database has only Property, it is an ENTITY-ATTRIBUTE-VALUE database. With dynamic composite fields, a context database becomes what is now known as a SEMI-STRUCTURED database. And if the database has all the available types... it is a ConteXt database!
For more information see: Concepts of the ConteXt database
The current version of the ConteXt DBMS can be downloaded from the UnixSpace Download Center:
http://www.unixspace.com/download/index.html

It is very instructive to try this out.

A context model comprises three main data types: REGULAR, VIRTUAL and REFERENCE. A regular (local) field can be ATOMIC
or COMPOSITE. The atomic field has no inner structure. In contrast,
a composite field may have a complex structure, and its type is
described in the header of Class.
The composite fields are divided into STATIC and DYNAMIC.
The type of a static composite field is stored in the header and is
permanent. A description of the type of a dynamic composite field is
stored within the Object and can vary from Object to Object.

During this summer I interviewed many people on the subject of Object-Oriented Databases. One of the most interesting and informative suppliers I found was InterSystems, a firm that operates worldwide with an office in Belgium and a subsidiary in Amsterdam. They have a great OODB called CACHÉ. They were very helpful and I had a long talk with one of their specialists. I actually learned quite a lot about OODBs and the current real-world market for them. Object-Oriented Databases have found a very secure niche in database land. One of the things I learned was that InterSystems has added SQL to the OODB, so offering all normal database functionality. I was told about their special client:


The Belgian Police.


A tragic crime in 1996, and its aftermath, resulted in sweeping reforms to Belgium's police forces. New laws enacted in 1998 called for the consolidation of Belgium's 196 independent municipal police zones (using a patchwork of information systems), the federal police (with its own information systems), and the Criminal Investigation Department, into a single integrated force with two levels: federal and local police.




The law also called for the federal police to create a single computing hardware and software architecture that all the forces would share. For the core data management and application infrastructure, the federal police chose Caché.
Eddy Muylaert, information technology director at the federal police, is overseeing development of a new suite of Caché-based applications. These include Transit, Investigation Management, and Money Transfer modules, with application access through a secure Web portal. Caché also supports applications used for day-to-day police department operations such as file management, human resources, logistics, payroll, telephone directory, and other functions. "When all of these applications are fully deployed, they will be used around the clock, seven days a week, by our 45,000 agents from nearly 24,000 PCs," said Muylaert. "Our combination of Caché on Intel-based hardware running Red Hat Linux is absolutely reliable and fast."
The Investigation Management module gives investigators a broad overview of the elements of a case, bringing together information about the events, people, transportation, places, and other details in various formats, and supports cross-referencing and analysis on the data. "This application highlights how Caché's innovative object-oriented nature enabled us to naturally model how we work in the database," said Muylaert.
"Caché was not a conventional choice for us, in the sense that big-name relational database technology is conventional," Muylaert continues. "But Caché's performance and reliability, and minimal hardware requirements, were key for us, as was the ability to easily migrate data, metadata, and stored procedures from the previous police forces' Sybase and Informix databases into Caché."
Caché can be used in combination with Delphi: you need the ODBC driver. That's all. You can download a trial version from their website at http://download.intersystems.com/download/register.csp
Technicalities
Caché objects are powered by Caché's post-relational database, the ideal match for high-performance applications developed with Web and object-oriented technologies. Using Caché object technology, programmers can create meaningful data structures that represent the real world and streamline the development process.
To speed application development, Caché object technology supports the concepts of inheritance, encapsulation, and polymorphism.
Inheritance is the ability to derive one class of objects from another. The new class (a subclass) will always have an "is a" relationship to the superclass. For example, a dog "is a" mammal, so the "Dog" class can inherit all the properties and methods of the "Mammal" class, as well as containing properties and methods that are unique to dogs. Multiple inheritance means a subclass can be derived from more than one superclass. A dog "is a" mammal, and "is a" pet, so the "Dog" class can inherit the attributes of both the "Mammal" class and the "Pet" class.
Encapsulation means that, as far as the application is concerned,
classes can be viewed as a sort of "black box". No matter how
complex, a class has a finite number of properties and methods. Once
a class is defined, the application doesn't need to know its internal
workings. The application deals only with the properties and methods
of the class.

This black box approach yields two important benefits:
A) Classes are modular. Programmers can improve the internal workings of classes without affecting the rest of the application at all.
B) Classes are interoperable. Classes can be shared between applications, because the interface (the properties and methods) remains constant.
Polymorphism refers to the fact that methods used in multiple classes
can share a common interface, even if the underlying implementation
is different. For example, say an application uses several different
classes: Letter, Mailing Label, and ID Badge, all of which contain a
method called PrintAddress. The application doesn't need to contain
special instructions about formatting an address for each kind of
object.
It merely includes a command that says something like
"DO PrintAddress(objectID)". Polymorphism ensures that each object
carries out the instruction in a manner appropriate for the class to
which that object belongs.
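In Delphi the same mechanism is a virtual method. A sketch with invented class names (Caché ObjectScript syntax differs, but the principle is the one just described):

type
  TPrintable = class
    procedure PrintAddress; virtual; abstract;
  end;

  TLetter = class(TPrintable)
    procedure PrintAddress; override;
  end;

  TMailingLabel = class(TPrintable)
    procedure PrintAddress; override;
  end;

procedure TLetter.PrintAddress;
begin
  WriteLn('Formal address block for a letter');
end;

procedure TMailingLabel.PrintAddress;
begin
  WriteLn('Compact address for a mailing label');
end;

// The caller needs no per-class instructions:
procedure PrintAll(const Objects: array of TPrintable);
var
  I: Integer;
begin
  for I := Low(Objects) to High(Objects) do
    Objects[I].PrintAddress; // each object formats itself appropriately
end;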
The Caché Objects Model is like the Real World
Caché object technology attempts to describe the way that humans actually think about and use data. It does this by bundling together data and the code that controls how the data is used. In object parlance, the various pieces of data contained in a class are called "properties" and the sections of code that describe how the data behaves are called "methods".
Caché object technology also promotes a naturalistic view of data by not restricting properties to simple, computer-centric data types. Classes may contain other classes, or references to other classes, which makes it easy to build useful and meaningful data models.
Figure 10: Subclasses can inherit attributes from one or more superclasses (a Dog class with multiple inheritance from Mammal and Pet).




Figure 11: even though Customer contains a large amount of information, an application can treat it as a single entity - an object. The annotations in the diagram:
* Name: data can be of a simple, system-defined type such as a string, integer, etc.
* SSN: data can be a programmer-defined, advanced data type (ADT); for example, SSN can be defined as a nine-digit number that matches the pattern NNN-NN-NNNN. ADTs are themselves a kind of object.
* Address: objects can be embedded within classes. In this example, Address is an object that contains the properties Street and City.
* Account Rep: Account Rep is a fairly complex object that exists independently from Customer. In this example, Customer includes a reference to the appropriate Account Rep.
* Invoice: references can be made to more than one instance of a class, thus creating a collection. A collection can be thought of as a one-to-many relationship. Caché also supports other types of relationships.

Creating Objects
Classes are rapidly created and edited with the Caché Studio. The Studio is an integrated development environment (IDE) where developers can perform all of their application development tasks. For data modelling, this includes specifying properties, coding and debugging object methods, and defining specialized data types. The support for advanced object concepts (simple and multiple inheritance, embedded objects, references to objects, collections, relationships, and polymorphism) makes the Studio a powerful and productive environment for modelling data and business processes.

Integrating Objects With Other Technologies
By virtue of Caché's Unified Data Architecture, all its classes are automatically accessible as relational tables via ODBC and JDBC. And by taking advantage of inheritance, Caché classes can easily be adapted for use with XML and object-oriented technologies.

Importing/Exporting Data Models
The Studio includes a wizard for the easy creation of Caché classes, but there are several other ways to import and export class definitions to and from the Studio.
The Caché RoseLink allows classes defined using Rational Software's popular Rose object modelling tool to be imported into Caché. Similarly, class definitions can be exported to Rose for use within Rose's modelling environment.
Caché can also create objects from relational DDL files. The resulting object classes will be very simple: their properties will be single-valued, system-defined data types that correspond to the relational table's fields, and their only methods will be those persistence methods required to move the data to and from the disk. However, thanks to Caché's Unified Data Architecture, even these simple classes are immediately available for use with object programming languages, and they may be used as building blocks to create more complex data models.
XML provides another way to transport class definitions from one application to another. Class definitions can be exported and imported as XML documents.

Caché Server Pages
A class designated as a Caché Server Page automatically inherits all the necessary Web session management methods, plus the "OnPage()" method where developers can code the page content.

XML
Inheriting properties and methods from the %XML.Adaptor class (provided by InterSystems) enables a class to import and export XML data. Caché will automatically determine the mapping between Caché objects and XML documents, or developers may create custom maps.

COM
A single command within the Caché Studio projects Caché classes as COM classes for use with tools such as Visual Basic, Delphi, and any software compatible with the COM interface. Caché also includes a COM Gateway, which allows COM objects to be used by Caché applications.

C++
Similarly, a single command can create C++ projections of Caché classes.

Java
One command can project Caché classes as Java classes. Caché also provides a class library that allows Java programmers to access Caché objects in the Caché database.

EJBs
EJB projections can also be created with one click from within Caché Studio. Caché allows developers to take advantage of the speed of Bean-Managed Persistence, without having to do lots of tedious coding to map between Java classes and relational tables. Caché supports BEA's WebLogic application server.

Scripting Languages
Methods in Caché objects are coded using either (or both) Caché ObjectScript or Caché Basic. Both languages allow developers to use all of Caché's data access modes (Objects, SQL, and Multidimensional) within the same routine.


Fastreport - A report generator for Delphi and .NET

by Marco Roessen, Rob van den Bogert and Detlef Overbeek
starter - expert
DELPHI 5 and above / Win32 and .NET

If data has to be presented in a clear manner, people frequently use a report. A report presents the data in a predefined format. This can be done using a table, a chart, a graphic or plain text. The data can come from several sources, such as a database, the result of a (complex) calculation, data typed in earlier by a user and stored in a file, or even the output of measuring equipment.
It is possible to program the report layout entirely from code.
The programmer calculates and codes the exact position of each report
element. This is however not a flexible solution.
Imagine an end-user wants to have an extra graphic or table added
somewhere in the report, or even the same data in another order. Then
the programmer has to recalculate and recode the entire report layout,
check and discuss this new layout with the user, etc.
It would be much easier if the end-user could make (small) changes to
the report layout by himself, even without having any programming
skills.
A solution that provides this kind of flexibility is to use a report generator. A report generator generates a report from the report blueprint and the data. The resulting report can be viewed, printed or even exported to a format you need (figure 1).

Figure 1: Internals of a report generator


There are a couple of commercially available report generators for
Delphi. The most well-known are:
Report Builder, Crystal Report, Rave Reports, Quick Report and Fast
Reports.
We ourselves were searching for a report generator to be used in a
couple of existing and future applications. These applications are (or
will be) written in Delphi 7, 2006 and C#. These applications use the
same databases (MS-SQL) and several other data sources (XML, CSV and EDF/EDF+).
A couple of requirements we had for choosing a report generator were:
* Suitable for both Delphi and .NET
* Integrated end-user layout editor
* Export to different formats (e.g. PDF, JPG, TIFF) with
preview
* Source available to adapt/extend functionality and being
independent of the supplier
* Possibility to have web enabled reports in the future
* Not too expensive (well, that's business as usual in healthcare)
With these requirements in mind we looked at the report generators and listed the strong and weak points of each:


Report Builder:
+ Fully integrated with the Delphi IDE
+ Visually create a report layout
+ 21 different components to present the data
+ Speed
+ Runtime Pascal Environment (RAP). Object Pascal with event
handling integrated to create complex reports
+ Good documentation
+ Source code is included in all editions
- RAP is only available with the Enterprise and Server editions
- Relatively expensive. The version with the end-user layout editor integrated (Professional edition) costs $495.
- No .NET version available
Crystal report:
+ Has a lot of features
+ Can display data from many different databases
- Integration with Delphi can be a problem
- Separate licenses needed for commercial applications
- Expensive, license fees start at 479,-

Rave reports:
+ Free. Rave BE (Bundled Edition) is installed automatically when installing Delphi 7, 8, 2005, 2006, 2007, 2009, 2010 and XE.
+ Fully integrated with the Delphi IDE
- People say that there are a lot of problems with the Delphi 2009
edition.
- Very hard to make contact or get support.
- Website hasn't been updated for a long time and there is no
information about the versions they sell.
- No .NET version available
Quick Report:
+ Quick Report (version 5.05) is available for Delphi 5/6/7/2005/2006/2007/2009/2010/XE (Win32 mode);
5.05 for Delphi XE 32 is now available for download.
They are working on the C++ Builder XE version.
+ Fully integrated with the Delphi IDE
+/- Pro version 345,-
+ Upgrade price is 25% of a new license.
+ end-user report editor and PDF export (Pro version)
- It seems that Quick Report contains a couple of nasty bugs
- No .NET version

Fastreport:
+ Fully integrated with the Delphi IDE
+ Improved engine:
improved shift mechanism / duplicate combining / new aggregates
improved cross object / changes in XML format (write collections in XML)
improved report inheritance / hierarchy / watermarks / object fill
improved linear barcodes / improved interactive reports
OnMouseEnter/OnMouseLeave events
detailed reports / multi-tab preview for detailed reports
+ Well documented
+ Create a layout visually from the Delphi IDE or Visual Studio
+ A lot of different components to present the data
+ New objects: new 2D barcodes (DataMatrix and PDF417);
Table object / Cellular text / Zip Code
+ A lot of export filters:
PDF, RTF, XLS, XML, HTML, JPG, BMP, GIF, TIFF, TXT, CSV,
Open Document Format (you can even build your own export filters!)
New exports: BIFF XLS / PPTX / XLSX / DOCX
+ Built-in script engine for PascalScript, C++Script, BasicScript and
JScript with debugger (Win32 version). The .NET version currently
uses C# and VB.NET for scripting.
+ Versions available for Delphi 4 to XE and .NET (integrated with
Visual Studio: Delphi Prism, C#, VB.NET, etc.)
+ Built-in end-user report editor, starting at the Standard edition,
without any extra license fees
+ Source available (starting at the Professional edition)
+ Web reports (Enterprise edition)
+ Licenses from $79 (Basic) to $349 (Enterprise)



Conclusion
From the above enumeration it becomes clear that for us Fastreport would be the choice.
Fastreport comes in two different flavours:
* Fastreport 4 VCL. This version is based on the VCL and is therefore a Delphi Win32 version.
* Fastreport CLX. This version is based on the CLX library (Delphi and Kylix). We don't have any experience with this version.

Fastreport is a report generator based on bands. For example there are Report Title, Page Header and Page Footer bands. By using bands that can resize with their contents, we are not dependent on the amount of data that will be displayed; the report generator will move the other bands up or down. You can display data from a database source in many ways: with column names, details, groups, summaries, etc. There are six different types of bands for displaying database data.

Fastreport in practice
In the meantime we have added reporting capabilities to a couple of our applications. Below you will find an enumeration of a couple of features we have used or will be using in the future.
You can download a trial version of Fastreport from the Fastreport website (http://www.fast-report.com/en/download/fast-report4-download.html), so you can explore the possibilities of their products. The Fastreport trial has two limitations: only 5 pages of the report can be printed or exported, and a nag message is displayed if the report has a script. On this download page you can also download a demo application and the full documentation; this is very handy when you try it for the first time.
After installing Fastreport 4 VCL there will be (depending on the version you installed) 2, 3 or 4 new groups of components in the Tool Palette window (figure 2). The ones most used are in the Fastreport 4.0 group. Below you will find a short guide to get started quickly.
The first steps
Creating a report starts by adding a TfrxReport component to your form. Double click the frxReport1 icon. This will start the layout editor. By default there are 3 tabs in the editor: the 'Code' and 'Data' pages and the first page of your report (figure 3).

Figure 3: Report editor pages


As a small exercise we will display data from a demo database. To create a report of data from a database we have to add the database to the 'Data' page. We can do that by dragging the database component onto the 'Data' page and double-clicking its icon. This will start the Connection Wizard. You can also connect to a database manually by setting the component's properties.
For this exercise we will use a BDETable component and set the DatabaseName property to DBDEMOS (a Delphi demo database) and the TableName property to clients.dbf. The database is now ready for use.
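The same connection can also be made in code; a short sketch, assuming a TTable named Table1 has been dropped on the form:

procedure TForm1.FormCreate(Sender: TObject);
begin
  Table1.DatabaseName := 'DBDEMOS';  // the Delphi demo database alias
  Table1.TableName := 'clients.dbf';
  Table1.Open;                       // the dataset is now ready for the report
end;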

Figure 2: Delphi Tool Box with Fastreport components



Figure 4: Adding a Master Data band




Activate the tab 'Page1' and add a 'Master Data' band by clicking the 'add band' icon on the left-hand side of the editor (figure 4). The editor will ask which dataset to use: select our BDETable. Now drag the database fields you want to report from the right-hand column, the Data Tree, into the just added Master Data band. It is already possible to view the results: just click the preview button. This is reporting the Delphi way!
Variables
You can not only report data from databases, but it is also possible to
report data from other data sources. This can be achieved by adding
variables to the report layout from inside the editor. To add variables
click the menu item Report and then Variables. The dialog that
appears will display the already available variables and has a button to
add new variables.
It is also possible to add variables from code (listing 1).
You can add a variable to the report by dragging it from the list with
available variables to your report page 'Page1'. The easiest way to add a
variable to your report is to add it to a 'Master' band with no database
attached.

Listing 1: Adding report variables from code
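The code behind Listing 1 can be as short as the following sketch, assuming the standard TfrxReport.Variables access (the variable names here are just examples):

uses Variants; // for Null

procedure TForm1.PrepareReportVariables;
begin
  // A name with a leading space creates a category in the variable list.
  frxReport1.Variables[' Measurements'] := Null;
  // String values are stored as expressions, hence QuotedStr.
  frxReport1.Variables['PatientName'] := QuotedStr('J. Jansen');
  frxReport1.Variables['TotalSeconds'] := 5400;
end;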


You can add a 'Master' band without a database to your report by adding the 'Master' band and, when the 'Select DataSet' dialog appears, selecting '[unassigned]'. Do not forget to set the number of records to 1. If the number of records is set to 0, Fastreport will not display the band, in effect hiding it.
The way data is presented in the report, for example 2 decimals or dd-MMM-yyyy as a date format, can be set by double clicking the 'Text Object', selecting the 'Format' tab and setting the desired display format.
Creating a dynamic report with code
You can not only show static data from databases and variables. It is also possible to add pieces of code to your report. You can for instance add code to calculate a new value from the other report variables, e.g. convert a number of seconds to a string containing the equivalent hours and minutes. Fastreport VCL uses Pascal (script) code. Fastreport .NET uses C# or VB.NET (the current .NET version 1.1 can only use these two languages, even when you are using Delphi Prism!). You can imagine that you can create very sophisticated reports using script code.
Every component you add to the report can have event handlers attached to it. For instance, there is an 'OnBeforePrint' event which will be called just before the component is rendered to the internal report canvas. You can use these event handlers to do some special processing. For example: you want to hide a 'Memo' component if a report variable has a predefined value. In the 'OnBeforePrint' handler you can check this value and optionally hide the Memo by setting its 'Visible' property to False.
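Inside the report itself such a handler is written in the script language. A PascalScript sketch (the memo and variable names are invented):

// On the report's 'Code' page: hide Memo1 when the report
// variable ShowRemarks is 0.
procedure Memo1OnBeforePrint(Sender: TfrxComponent);
begin
  Memo1.Visible := <ShowRemarks> <> 0;
end;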
Adding your own functions
With Fastreport you can report simple variables easily, but it is also possible to create complex expressions using one or more of the available mathematical, string and other functions. Even if the function you require is not available, you can create and implement your own custom function. To use a custom function you have to implement the algorithm in Delphi and register it with Fastreport. We ourselves have variables that are expressed in seconds, but the report user asked to show the values as hours and minutes. Creating the function in Delphi was rather straightforward, as was registering it with Fastreport (listing 2):

uses StrUtils;

procedure TForm1.FormCreate(Sender: TObject);
begin
  frxReport1.Script.AddMethod(
    'function SecToHHMM(Seconds: Extended): String',
    CallFrxMethod, 'Custom');
end;

function TForm1.SecToTime(Seconds: Extended): TTime;
begin
  Result := Seconds / (24 * 3600);
end;

function TForm1.CallFrxMethod(Instance: TObject;
  ClassType: TClass; const MethodName: String;
  var Params: Variant): Variant;
begin
  if MethodName = 'SECTOHHMM' then
  begin
    if Params[0] >= 0 then
    begin
      Result := FormatDateTime('hh:nn', SecToTime(Params[0]));
      Result := AnsiReplaceText(Result, ':', 'h') + 'm';
    end
    else
      Result := '--h--m';
  end;
  // Other methods
end;

Listing 2: Adding your own function


procedure TForm1.Button1Click(Sender: TObject);
begin
  frxReport1.DesignReport;
end;

Listing 3: Activating the designer from code

Figure 5: The newly added function
You can find your newly added function in the tab 'Functions'. You can now use this function by dragging the variable you want to be shown as hours and minutes to your report. Double click the component (a property editor will pop up) and modify the variable name to the following expression:
[SecToHHMM(<VariabeleNaam>)]
This procedure for adding your own functions is only available in Fastreport VCL. Fastreport .NET uses a different, even simpler approach. First create an assembly with the functions you will need and add this assembly to the 'Assembly' property of the report. You will be able to use your own functions right away. You can also use the already available functions defined by the assemblies of the .NET framework.




Extendibility
Fastreport VCL does not have a PNG export filter by default, so we wrote this export filter ourselves. We had purchased a Fastreport Professional licence; the Professional and Enterprise versions come with the source code included. We checked the source code to see how the export filters are implemented, and implemented the PNG export filter ourselves in less than an hour. This is a good example of how easily Fastreport can be extended.

Figure 6: Inheritance
Report inheritance
A lot of companies have a default report layout with their name, address, bank account, logo, etc. Adding these same elements over and over again when you create a new report layout is a boring and error-prone task. The creators of Fastreport have found a solution for this problem: 'report inheritance'. You create the base report with the company info once and use this report as a base for all 'descendant' reports. You can set the base report in the 'Report Settings' dialog (figure 6). If you have to change, for instance, the address or bank account of the company, you only have to change the base report. All descendant reports will be changed automatically as well.
You can use all elements of the base report from your descendant
report and change properties (e.g. font, size, color) without changing
the base report.
There are however some restrictions concerning inheriting reports:
* The base report cannot contain any code. Fastreport will not warn you, but the inherited report will disregard all code from the base report.
* You cannot inherit from an inherited report.
* You cannot use the same component names in both the base and inherited reports. So if you don't give your components unique names, there is a possibility that if you add new components to your base report, the inherited report will have components with the same names and you won't be able to use your inherited report.

Creating or modifying a report layout from code
It is also possible to modify, remove or add report elements, or change their properties, from inside your application. Listing 4 shows (part of) the code to add a 'Page Footer' to a report. This way of modifying components is equivalent to the way it can be done for a Delphi form.
procedure TPolymanAnalysisReport.AddDefaultPageFooter(APage: TfrxReportPage);
var
  Memo: TfrxMemoView;
  PageFooter: TfrxPageFooter;
  S: String;
begin
  PageFooter := TfrxPageFooter.Create(APage);
  PageFooter.Name := 'ProgramAndReportInformation';
  Memo := TfrxMemoView.Create(PageFooter);
  Memo.Name := 'MemoProgramAndReportInfo';
  Memo.Font.Size := 8;
  Memo.StretchMode := smActualHeight;

Listing 4: excerpt of code to add a Page Footer

Converting Fastreport VCL reports to Fastreport .NET
The default file extension of VCL reports is 'fr3'; .NET reports use the 'frx' extension by default. Converting reports from VCL to .NET is possible by adding a unit to your Delphi VCL application: 'frxSaveFRX'. The application's embedded editor will now have an extra type for saving your report: the frx format. If you open the 'Save' dialog you will find this format in the file type dropdown box. When you have saved your report as a frx report, you can open it using the Fastreport .NET report designer. Converting the report is not completely automatic:
- If you have used single quotes in your expressions, you have to
replace all of them by double quotes.
- Of course the script code will not be translated to C# or
VB.NET code. However, the script code is added to the 'Code'
page as comments, so you can see which functions and event
handlers you have to code yourself.
- Charts are not converted automatically. There is a logical
explanation: Fastreport VCL uses TeeChart, Fastreport .NET uses
Microsoft Chart Controls. Before we can use charts, we have to
install the components first. Both components are free and are
quite alike.
- Some functions are no longer available. One function we used a lot, the 'IIF' function, has gone. This function returns, depending on a Boolean value, one of two expressions. However, implementing this function yourself is rather straightforward (listing 5).
using System;
using System.Collections;
using System.Collections.Generic;
using System.ComponentModel;
using System.Windows.Forms;
using System.Drawing;
using System.Data;
using FastReport;
using FastReport.Data;
using FastReport.Dialog;
using FastReport.Barcode;
using FastReport.Table;
using FastReport.Utils;

namespace FastReport
{
  public class ReportScript
  {
    public string IIF(bool condition,
      string trueValue, string falseValue)
    {
      return condition ? trueValue : falseValue;
    }
  }
}

Listing 5: Implementing the IIF function using C#


An overview of reports

Previews of the yet to come version 5


Fastreport VCL and Fastreport .NET
The creators of Fastreport have succeeded in creating a report generator that can be used in both Win32 and .NET applications. The VCL and .NET versions have a lot in common: the same kind of editor, the same kind of components, etc. But there are also a lot of differences. These differences are logical: the two frameworks on which the report generators are based, the VCL and the .NET framework, are also quite different. The .NET version has had a complete makeover: new classes and a new, modern editor look. We have used the VCL version for some time now, but we have to read the manual now and then to find out how things work in the .NET version.

More information about the different report generators

Lazarus
LazReport is not yet compatible with FastReport, because LazReport is based on FreeReport, a very old version (FastReport 2.3). For example, current FastReport versions use an XML file format, whereas that second version used a binary format. In contact with Michael Philippenko of Fastreport, he told us that as soon as the special trial component for all purposes has been developed (the Lazarus team and Blaise Pascal Magazine are working on that), they will consider building a version for Lazarus.
Conclusion
This article shows some of the possibilities of Fastreport, not all. We will build example applications and use them for publication in the next issue. Up till now we have not run into problems we could not solve; sometimes we had to consult the manual or search the support forum to find the solution. We strongly recommend Fastreport for its great quality, its relatively low price, its enormous number of features and its expandability.


Using ADS with Delphi Prism and ASP.NET by Bob Swart


Advantage Database Server (ADS) is an ideal replacement for old and deprecated Borland
Database Engine (BDE) local tables, as I've demonstrated in a previous article for Blaise
Magazine. However, ADS is much more than that, and can be used for any database task you
have in mind. Stand-alone (local), but also client/server, multi-tier or as internet (web)
database.
In this article, I will demonstrate the use of ADS as a web database for an ASP.NET web project
(written using Delphi Prism XE) that implements a registration form for events and seminars.
Advantage Database Server v10
First of all, in this article I will use ADS version 10, although you can also use version 9 if you want (I'm not using
any of the special new features). To design the database, I use the Advantage Data Architect application, which is
free and can be downloaded from the Advantage website, and is written in Delphi (full source code included). The
Data Architect is a nice project to learn how to work with and manipulate ADS tables.
For the example of this article, I need to store the name and address information from people who want to attend
an event or seminar. The usual fields are: Company, Name (or FirstName + LastName), Address, Postal Code, City,
Country, E-mail address, Phone number, and optionally one or two fields related to the event (like which version of
Delphi you are currently using). The example registration form that I want to implement today, is for the Advantage
Database System Training Day on November 3rd in Utrecht, The Netherlands. For this registration, we'd like to
know which version of ADS (if any) the visitor is currently using, so we can adjust the sessions when needed.
In short, using the Advantage Data Architect, I have created a new Data Dictionary called Events and a new table
with the following layout:

Figure 1.
Note the ID field, which is the primary key of type autoinc. This table is saved as Registration.adt, and can now be
used by the application we'll make with Delphi Prism. But before we continue, we must make sure that apart from
the Advantage Database Server, we've also installed the Advantage .NET Data Provider (so we can use ADO.NET
and ASP.NET with declarative data binding to connect to the Advantage database).
Delphi Prism XE
Delphi Prism XE is the most recent edition of Delphi Prism at the time of writing (people with a subscription
received no less than two major updates in the last year: first from Delphi Prism 2010 to Delphi Prism 2011, and
then a few weeks ago Delphi Prism XE). It can run in both Visual Studio 2008 and 2010, but for this article I'm
using Delphi Prism XE in Visual Studio 2010, together with ADS v10 as mentioned before.
Using Delphi Prism XE, we can create ASP.NET Web Projects, with File | New Project, using the dialog from the
following screenshot:



Figure 2.
For the purpose of this demo, I'll give the project the name EventRegistration. The ASP.NET project will consist of
one page Default.aspx, where we should start by placing a FormView control from the Data category of the
Toolbox. The FormView has a number of tasks, including one to Choose the Data Source. Since there is no Data
Source on the page yet, we should select the <New data source> option instead:

Figure 3.
This will produce the Visual Studio Data Source Configuration Wizard, where we can specify where the application
will get its data from.
In our case, that's from a SQL Database, so click on
the SQL Database item. This will automatically
generate a default ID for the data source
(SqlDataSource1) and place this ID in the textbox
so we can modify it if needed. Click on OK to go to
the next page of the wizard.
In the second page, we can choose the data connection: either from a list of existing connections, or by clicking on the New Connection button.
If you click on the New Connection button, a dialog will pop up in which we can choose the data source. Here, we can select the type of data source
from a list that contains Advantage Database Server
(if you've installed the Advantage .NET Data
Provider), as well as for example DataSnap,
InterBase, and several Microsoft drivers.

Figure 4.



Figure 5.
If you click on the Continue button, a new dialog follows where we can specify the specific details to connect to the ADS Data Dictionary. Unless you've specified a username and password to access the Data Dictionary, this usually only means that we have to specify the location of the .add file.

Figure 6.
Click on Test Connection to ensure that we can connect to the Data Dictionary. Click on OK if everything works, and back in the Configure Data Source wizard, we can click on OK to get to the next page, where the option is offered to save the connection in the web.config file. This is handy, since it means we can modify the connection string without having to recompile the application.


Figure 7.
The next page allows us to build the way we want to retrieve the data from the database. We can use a SQL
statement or stored procedure, or use the dialog to select a table (there is only one: Registration) and specify the
fields that we want to use in a SELECT statement for example. Note that by default the wizard will check the *
for all fields, but I prefer to explicitly check the fields I need instead.

Figure 8.




Optionally, we can add an ORDER BY clause to the SELECT statement, to sort the results based on the LastName field
for example. The resulting SELECT statement will select all fields for the Registration table, and since we don't use a
WHERE clause, we will see all records. Which is ideal when I want to have a list of all people who registered for the
event, but not if I want to enter a new registration. For that, I need an INSERT statement. We can automatically produce
one, by clicking on the Advanced button in the wizard's dialog.
In the Advanced SQL Generation Options, we can specify that we want to generate INSERT, UPDATE and DELETE
statements:

Figure 9.
This will ensure that we can use the FormView in INSERT mode to enter new registrations.
After we close the dialog, we can click on OK to get to the last page of the wizard, and close that one as well.
The result is that we now have a FormView with a newly configured SqlDataSource component that connects to the
Registrations table from the Event Data Dictionary.
ASP.NET Page
We can now configure the ASP.NET page, and especially the FormView to show itself in INSERT mode only. This can
be done by selecting the FormView, and in the Properties Inspector making sure DefaultMode is set to Insert, with the
following result:

Figure 10.
Now we can give the application a test run, by selecting Debug | Start Without Debugging, which will start the ASP.NET
Development Server as well as the default browser, showing the registration application in action:



Figure 11.
Obviously, there are some issues with this page. First of all, the ID field is of type
autoinc, so that shouldn't be part of the input screen. And second, after we click on
the Insert hyperlink, the page doesn't jump to a Thank you! page (something we didn't implement yet), but gives an error message instead:

Figure 12.
This problem is caused by the fact that the declarative data binding in the generated
.aspx file is using positional parameters in the INSERT statements, but named
parameters in the list of parameters that follows it. In detail, the InsertCommand is
specified as follows:
InsertCommand = "INSERT INTO [Registration] ([FirstName], [LastName],
[Address], [Postcode], [City], [Country], [Company], [Email], [Phone], [ADS],
[ID]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"

While the InsertParameters include the names, as follows:


<InsertParameters>
  <asp:Parameter Name="FirstName" Type="String" />
  <asp:Parameter Name="LastName" Type="String" />
  <asp:Parameter Name="Address" Type="String" />
  <asp:Parameter Name="Postcode" Type="String" />
  <asp:Parameter Name="City" Type="String" />
  <asp:Parameter Name="Country" Type="String" />
  <asp:Parameter Name="Company" Type="String" />
  <asp:Parameter Name="Email" Type="String" />
  <asp:Parameter Name="Phone" Type="String" />
  <asp:Parameter Name="ADS" Type="String" />
  <asp:Parameter Name="ID" Type="Int32" />
</InsertParameters>

We should change each ? in the INSERT statement to a :fieldname item, turning it into the following (also removing the ID field):

InsertCommand = "INSERT INTO [Registration] ([FirstName], [LastName], [Address], [Postcode], [City], [Country],
  [Company], [Email], [Phone], [ADS])
  VALUES (:FirstName, :LastName, :Address, :Postcode, :City, :Country, :Company, :Email, :Phone, :ADS)"



Finally, we should ensure that the ID parameter is also removed from the visible fields on screen (the Bind(ID)), since we won't be using it anymore.
With these fixes in place, we can click on the Insert link to enter a registration in the database, with the result that we get a new empty page again, ready for the next registration. This is fine if more people are standing in line behind the same machine, but it would be nicer (I think) to show a little Thank you! message as well.

We can implement this, and catch INSERT errors, in the Inserted event of the SqlDataSource. In the designer, select the SqlDataSource and then double-click on the Inserted event. Here, we can check if there were any INSERT errors (I leave it up to the reader to display them in a user friendly way), or display the "Thank you!" message using a call to Response.Write.

method _Default.SqlDataSource1_Inserted(sender: System.Object;
  e: System.Web.UI.WebControls.SqlDataSourceStatusEventArgs);
begin
  if Assigned(e.Exception) then
  begin
    // handle error?
  end
  else
  begin
    Response.Write('<h1>Thank you for your registration!</h1>');
  end;
end;
And that's it. Of course, the final registration page will need to look much better when deployed on the internet (including the use of validators to ensure certain fields are required), but the skeleton is in place. If you want to know more about the actual ADS Training Day in The Netherlands we want to use it for, and attend the session on November 3rd where I'll build this application live (including the validators), then check out http://www.eBob42.com/ADS. Thanks in advance!
Bob Swart

Bob Swart Training & Consultancy
Delphi and Advantage Database Server Reseller
www.eBob42.com

Special offer from Bob Swart

The Delphi XE Development Essentials courseware manual contains over 200 pages on the most recent version of Delphi, to get you up to speed with the latest developments and new features. Topics include:
- IDE Enhancements (including Code Formatter, SubVersion integration, Beyond Compare and debugger enhancements),
- Language Enhancements (covering the new features since Delphi 7, so also very useful for readers who come from older editions of Delphi),
- RTL and VCL Enhancements (including Windows 7 support, Direct2D examples, Ribbon controls, Touch, as well as COM Enhancements),
- a big section on Unicode and migration from ANSI to Unicode (plus an example on Globalization and Localization where we'll translate an application from English to Dutch and Chinese),
- Modelling Support (now also available in the Professional Edition),
- Unit Testing, and finally a section on third-party controls that ship with Delphi XE, including CodeSite, AQTime, SubVersion Integration, and more.
This courseware manual is offered free of charge to all developers who purchase Delphi directly from me (see http://www.bobswart.nl for more details).
Note that Database development, DataSnap, XML & SOAP, and IntraWeb details are not covered in this manual. These topics will be covered in their own courseware manual publications available later in 2010. See http://www.eBob42.com/courseware for details.


A datawarehouse example using kbmMW by Kim Madsen


Level: starter to expert. Requires Delphi 7 and above.

What is a datawarehouse? It's a large collection of data from various sources, organized in a way that makes it possible to search efficiently in order to extract the required data.
In other words, it requires data to be organized in a way that makes it easy to access and search. True datawarehouse centres typically specify the minimum requirements data must meet before it is acceptable to their system. The data must follow certain rules to be easily searchable, which means that in some situations data must be reorganized to meet this requirement before it can be inserted into the datawarehouse.
However, situations exist where data, for various reasons, can't be reorganized and must be offered to a datawarehouse as is. For example where the data is used as evidence in a criminal case, or where the sheer volume of data in each package makes it impractical to reorganise.
This article describes how such a scenario has been handled using kbmMW, allowing users to search many terabytes of data (the total volume of data is constantly increasing) stored in thousands of separate databases (also constantly increasing).
The situation is that a company receives large (gigabyte sized) databases in SQLite format every day from various places. The data, amongst other things, contains hundreds of thousands of records with detailed information about how a technical instrument is operating, and in addition contains a lot of frame by frame video footage too.
Each technical device may (typically will) produce many databases over time, each containing the results of a single production run.
The company requires that no changes can be made to the structure of their database files. And they are not interested in duplicating information into a different database system, for various reasons, one of them being security.

Previously, they were used to picking one database file at a time and analysing it remotely over the net using a tool such as Matlab. However, the performance of such operations was terrible, due to the enormous amount of data passing over the net. That led to people trying to duplicate data into other local databases that were easier to handle, but that wasn't advisable for several reasons:
- Security: Confidential raw data was no longer contained as they would like it to be.
- Security: They allowed experts to update some sections of the database. However, if duplicate versions of the data existed, then there was no guarantee that the updates actually altered the original database. Manual updating proved unreliable here.
- Performance: Manually (or semi-automatically) copying data from many database files to local storage was time consuming, space consuming and error-prone.
Thus a better solution was sought. And that's where kbmMW really shines.
As is probably well known, kbmMW is a middleware which provides the glue between a server application (or in kbmMW terms, an application server) and one or more clients. It allows the clients to request various operations on the application server, and expect responses for those requests.
One of the many features provided by kbmMW is the ability to act as a middleman between a database and a client. By putting kbmMW in between the client and database, you gain several advantages, including:
- Significantly reducing the network traffic required to access the database.
- Significantly increasing the security level around the database.
- Significantly increasing the responsiveness of the client.
Another feature of kbmMW Enterprise Edition is its ability to communicate completely in async mode via its transport framework called WIB (Wide Information Bus).
When operating with the WIB all clients and servers are just nodes. The servers are nodes which publish certain information and subscribe for requests, while the clients typically are nodes publishing requests and subscribing for responses.
kbmMW v. 3.50.01 Enterprise Edition, which is currently in beta 2 and about to be released any day now, contains features that allow a server to provide incremental responses to a client's request for data from a database. It's not a simple fetch-on-demand scenario we are talking about, where the client asks for more and more information. It is a server side push of data records (virtual or real) that are relevant for the ongoing search, whenever data is available.
Further, we also need some sort of connection management for accessing the thousands of databases. Preferably one where we can cache the connections to each database for later reuse.
Hence we would like to end up with a client that simply makes a request for data. For example asking for temperature data for all productions from a specific technical instrument. That data can then be analyzed in various ways afterwards, which is beyond the scope of this article. (In the specific company case, a scripted, vector-oriented integrated development environment was developed for them, making it easy to analyze massive amounts of data using methods similar to Matlab's, and under full control play synchronized video from the frames in the databases.)
The rest of this article shows how to build such an application server and client using kbmMW.

The application server


We will first build a simple standard kbmMW application server with a built-in query service for accessing databases. Afterwards we will modify it bit by bit to give us the functionality we need as described above.
We start out with a new empty VCL forms application, which means the application server will run as a regular application. In real life scenarios, you would create an application server as a service, which is illustrated in one of the many samples available in the kbmMW/Downloads/Samples section of the Components4Developers home page. Since creating Windows service applications is beyond the scope of this article, I just take a shortcut and create a simple Windows application that operates as a server.
We add the following:
- TkbmMWServer component. Name it Server.
- TkbmMWTCPIPIndyMessagingServerTransport component; name it Transport. Set its property Server to point to Server, and its ClusterID to 'Demo'.
- Two message queue components for the in- and outbound messages for the transport. Name one of them qIn and the other qOut. Set the property Transport.InboundMessageQueue to point at qIn, and Transport.OutboundMessageQueue to point at qOut.
- We also add two buttons, Listen and Don't listen, to activate and deactivate the server.
The form should look something like this:

Figure 1:

Let's add some code to the Listen and Don't Listen buttons:
procedure TForm1.Button1Click(Sender: TObject);
begin
  Transport.Subscriptions.Clear;
  // Subscribe for requests to this server.
  Transport.Subscriptions.Subscribe(
    kbmMWGenerateRequestSubscriptionSubject(Transport.ClusterID));
  // Subscribe for subscriptions/unsubscriptions to this server.
  Transport.Subscriptions.Subscribe('SUB.'+Transport.ClusterID+'.>');
  Transport.Subscriptions.Subscribe('USB.'+Transport.ClusterID+'.>');
  // Activate server and transport.
  Server.Active:=true;
end;

procedure TForm1.Button2Click(Sender: TObject);
begin
  Server.Active:=false;
end;

With messaging transports, correct subscriptions are crucial, otherwise the node (in this case the application server) will not receive the correct messages. As we will have a messaging based client later on, we need to tell the server that it should accept three types of messages: Requests (REQ), Subscriptions (SUB) and Unsubscriptions (USB). The latter are not really required in this demo, but it's good practice to pair up SUB and USB subscriptions. In all cases the application server subscribes only for requests etc. for the cluster called Demo (set in the Transport's ClusterID property).
Then let's tell the Transport that it should allow connections from anywhere. Open its Bindings property and at designtime add a new binding (Ip4).

Figure 2:
Set the port number to 3000 and the mask to 0.0.0.0.

Figure 3:
Then we add the query service via the service wizard.

Figure 4:
Locate the Components4Developers Wizards and click the kbmMW Service Wizard.

Figure 5:
The first page of the wizard is shown. Select the Query service/kbmMW_1.0 service type and click Next.

Figure 6:
Now we will be given the option to choose what type of database we would like to access. In this example we choose to use SQLite, which is supported by all kbmMW Editions. Then click Next.

Figure 7:
We will then be given the opportunity to name the service, and optionally give it a version. Versioning a service can be smart if, at some point, the service interface changes and we want to support both older and newer clients. Let's name the service DATAWAREHOUSE and keep the version empty. Then click Next.


Figure 8: Now click through all the remaining wizard pages, and click the OK button on the last page.

Figure 9: This generates a new datamodule for us, with a couple of components on it:
The data module will be used by clients making database requests to the application server. Because we want to access a large number of SQLite databases, we have not defined a SQLite connection pool on the application server's main form (we deselected that option in the wizard), although in most applications that would be an expected step. Instead we will make our own list of known SQLite databases, from which we will select one or more as needed.
The next step is to register this service with the application server (Server component) on the main form. The main form's OnCreate event is a suitable place to do that, as it only needs to be registered once.
Figure 10:



implementation

uses Unit2;

{$R *.dfm}

procedure TForm1.Button1Click(Sender: TObject);
begin
  Server.Active:=true;
end;

procedure TForm1.Button2Click(Sender: TObject);
begin
  Server.Active:=false;
end;

procedure TForm1.FormCreate(Sender: TObject);
begin
  Server.RegisterService(TkbmMWQueryService2,false);
end;

end.

For simplicity in this example, we design the application server to know about databases with filenames DB1.db, DB2.db, DB3.db .. DBn.db. The client will tell us which database number(s) we are to search in, at the appropriate time. This sample assumes that there is an identically structured table called 'DATA' in each database.
Each database file needs its own TkbmMWSQLiteConnectionPool, to
take care of connections to the database, caching of resultsets and
metadata and more. Thus when the client requires access to a specific
database, we can choose to create a TkbmMWSQLiteConnectionPool
on-the-fly, connect it to the relevant database and execute the client
query request. Or we can choose a better performing method, where
we keep a list of all previously accessed databases, and thus keep the
databases open and ready for use. A pool of connection pools. We'll
show a simplistic way to do that now.
We define a thread safe hash table that will contain the connection pools of all previously accessed databases. The unit kbmMWGlobal contains lots of nice containers and other goodies, so we add it to the uses clause:

uses
  Windows, Messages, SysUtils, Variants, Classes, Graphics,
  Controls, Forms, Dialogs, kbmMWCustomMessagingTransport,
  kbmMWCustomTransport, kbmMWServer,
  kbmMWCustomServerMessagingTransport,
  StdCtrls, kbmMWGlobal;

Then we make a couple of methods to fetch database connection pools from our hash list (our pool of connection pools), and automatically create new connection pools when new databases are accessed. We first add the unit kbmMWSQLite to the uses clause of the unit Unit1.pas. Then we add this method:

function TForm1.GetConnectionPool(ADatabaseName: string): TkbmMWSQLiteConnectionPool;
begin
  DBs.BeginWrite;
  try
    Result:=TkbmMWSQLiteConnectionPool(DBs.GetObject(ADatabaseName));
    if Result = nil then
    begin
      Result:=TkbmMWSQLiteConnectionPool.Create(nil);
      try
        Result.Database:='yourdbdirectory\'+ADatabaseName+'.db';
        Result.Active:=true;
        DBs.AddManagedObject(ADatabaseName,Result);
      except
        FreeAndNil(Result);
        raise;
      end;
    end;
  finally
    DBs.EndWrite;
  end;
end;

The method first ensures that all access to the contents of DBs is protected, so only one thread at a time can access it. Then we look up a connection pool based on the database name. If none is found, we create a new one and add it 'managed' to the DBs storage. If the database the client is requesting doesn't exist at all, this method will throw an exception. Other ways to indicate the issue to the client could also be coded.
Then we define a so-called virtual dataset on the query service. Right now we have a TkbmMWSQLiteQuery component there, and that will be used for the query against a specific database. However, we would like to interact with the client in such a way that we don't just send the complete result (containing the combined matching records for all client specified databases) in one go to the client. Instead we would like to send incremental resultsets. In this sample we choose to send all matching records from one database to the client as one incremental resultset.
On the queryservice datamodule (unit2.pas) we put a TkbmMWMDQuery and a TkbmMWMDConnectionPool.

Then we define a field in the TForm1 class, which is to hold the connection pools for the databases we have accessed, and thus allow us to reuse the connection pools later on without having to reopen the database files:

    procedure Button1Click(Sender: TObject);
    procedure Button2Click(Sender: TObject);
    procedure FormCreate(Sender: TObject);
  private
    { Private declarations }
  public
    { Public declarations }
    DBs: TkbmMWThreadHashStringList;
  end;

We add the code required to create the instance upon form creation and destroy it at form destruction. We specify that objects that are 'managed' by the hashlist are also deleted automatically by the hashlist when entries are deleted from the list or the list itself is freed.

procedure TForm1.FormCreate(Sender: TObject);
begin
  DBs:=TkbmMWThreadHashStringList.Create(100);
  DBs.FreeObjectsOnDestroy:=true;
  Server.RegisterService(TkbmMWQueryService2,false);
end;

procedure TForm1.FormDestroy(Sender: TObject);
begin
  FreeAndNil(DBs);
end;


Figure 11:
We rename the kbmMWMDQuery1 to 'DATA', set its Published property to true and its connection pool to point to the kbmMWMDConnectionPool1 component. That way clients can request data from this particular component. The virtual memory dataset (thus the MD acronym) provides some interesting events for us, namely the OnPerformFieldDefs and OnPerformQuery events.
OnPerformFieldDefs allows us to easily define field definitions at runtime, on-the-fly. We could also define them at designtime for this demo, because we know the structure of the SQLite table DATA doesn't change for the different databases served by this application server. However, we'll define the definitions in code. It's also possible to define parameters in the same method, but since we need a way for the client query to provide information about the database name and some search criteria, we will define parameters for that at designtime.



procedure TkbmMWQueryService2.DATAPerformFieldDefs(Sender: TObject; AParamsOnly: Boolean);
var
  ds: TkbmMWMDQuery;
begin
  ds:=TkbmMWMDQuery(Sender);
  if not AParamsOnly then
  begin
    // Define the fields that should be returned to the client.
    ds.FieldDefs.Clear;
    ds.FieldDefs.Add('ID',ftString,20,false);
    ds.FieldDefs.Add('X',ftFloat,0,false);
    ds.FieldDefs.Add('Y',ftFloat,0,false);
  end;
end;

Then we add the parameters at designtime via the Params property of the DATA (TkbmMWMDQuery) component.

Figure 12:
Three parameters have been created:
1. Database, ftString, ptInput, Size 100
2. ConditionLow, ftFloat, ptInput
3. ConditionHigh, ftFloat, ptInput
The purpose of these parameters is to allow the client to indicate the database number, and some conditions we may choose to use for the selection from a database. By convention we define that Database may contain one or more comma-separated numbers indicating the database numbers for which a result is requested.
Then we define the SQL statement which will query a single database. That can be done at runtime or designtime. Since the SQL is the same regardless of which database is queried, we define it at designtime.

Figure 13:
The parameters must then be configured in the Params property. We have 2 parameters to define: conditionLow and conditionHigh. They need to be of DataType ftFloat and ParamType ptInput.

Figure 14:
Finally we need to add some code to do the actual search and return the incremental data. This code should be put in the OnPerformQuery event. The first part of the PerformQuery event extracts the parameter values provided by the client:

procedure TkbmMWQueryService2.DATAPerformQuery(Sender: TObject;
  var ACanCache, ACallerMustFree: Boolean; var ADataset: TDataSet);
var
  i: integer;
  sDBID: string;
  ds: TkbmMWMDQuery;
  sDatabase: string;
  slDatabase: TStringList;
  conditionLow, conditionHigh: double;
  cp: TkbmMWSQLiteConnectionPool;
  bFirst: boolean;
  mt: TkbmMemTable;
begin
  ds:=TkbmMWMDQuery(Sender);
  // Get parameter values.
  sDatabase:=ds.ParamByName['Database'].AsString;
  conditionLow:=ds.ParamByName['conditionLow'].AsFloat;
  conditionHigh:=ds.ParamByName['conditionHigh'].AsFloat;

Then comes a loop that runs for each database that the client has asked the application server to query. For each database, a connection pool is requested and a native TkbmMWSQLite query is executed. The resulting dataset (if any) is then sent asynchronously to the client via the SendPartialResultDataset method. The method needs to know if it's the first partial resultset, or an intermediate one.
The last partial resultset MUST be sent by returning from the eventhandler with a dataset that it can send to the client. That final dataset will be marked as either the complete dataset (if SendPartialResultDataset was never called) or the final partial dataset (if SendPartialResultDataset has been called at least once). kbmMW keeps track of that internally.


The purpose of that is twofold:
1) The server doesn't need to send complete field definitions for intermediate or final partial datasets.
2) The client will know if further partial datasets are to come, or if all have been received.



  // Extract list of database id's to search.
  slDatabase:=TStringList.Create;
  try
    slDatabase.CommaText:=sDatabase;
    // Loop for all database ids.
    bFirst:=true;
    for i:=0 to slDatabase.Count-1 do
    begin
      sDBID:=slDatabase.Strings[i];
      // Lets ignore any exceptions related to accessing a single database.
      try
        // Find or allocate a connection pool for access to the database.
        cp:=Form1.GetConnectionPool(sDBID);
        // Setup the SQLite query component.
        kbmMWSQLiteQuery1.ConnectionPool:=cp;
        try
          kbmMWSQLiteQuery1.ParamByName['conditionLow'].AsFloat:=conditionLow;
          kbmMWSQLiteQuery1.ParamByName['conditionHigh'].AsFloat:=conditionHigh;
          kbmMWSQLiteQuery1.Open;
          // Now we should have data that we can return to the client.
          // Its important that the last partial result is sent by
          // kbmMWs internals, and thus not by us using SendPartialResultDataset.
          // Since we actually dont know if there is data in the last database
          // requested, we will let the final package be empty.
          if kbmMWSQLiteQuery1.RecordCount>0 then
          begin
            SendPartialResultDataset(Form1.Transport, kbmMWSQLiteQuery1,
              KBMMW_MESSAGEPRIORITY_NORMAL, bFirst);
            bFirst:=false;
          end;
        finally
          kbmMWSQLiteQuery1.Close;
          kbmMWSQLiteQuery1.ConnectionPool:=nil;
        end;
      except
        // Do nothing.
      end;
    end;
  finally
    slDatabase.Free;
  end;
  // If no data collected, complain.
  if bFirst then
    raise Exception.Create('No matching slides were found.');
  // Now prepare a final (empty) dataset package.
  mt:=TkbmMemTable.Create(nil);
  mt.CreateTableAs(ds,[mtcpoStructure]);
  mt.Open;
  ADataset:=mt;
  ACallerMustFree:=true;
end;

The final part of the event closes the native query, detaches the connection pool from it, and generates a final (empty) dataset to return. Because we don't know at any point in the loop whether we are in fact processing the last dataset (database), we return this empty dataset to indicate that it is the last one (we may encounter an empty or nonexistent dataset anywhere in the loop). We also tell the system that it is responsible for getting rid of our temporary TkbmMemTable when it's done with it.

Finally, we need to tell the query service that clients are allowed to access its published query components. That is done via the AllowClientNamedStatement property on the query service data module.
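A minimal sketch of that setting done in code instead of in the Object Inspector, assuming (as the name suggests) that AllowClientNamedStatement is a Boolean property of the generated service datamodule:

procedure TkbmMWQueryService2.DataModuleCreate(Sender: TObject);
begin
  // Allow clients to open published named statements/components such as DATA.
  AllowClientNamedStatement:=true;
end;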

The client
Now all that's left is to build a client that can talk to the application server. We start out by creating a new VCL Forms application for the client. We add several components:
- TkbmMWTCPIPIndyClientMessagingTransport
- 2 x TkbmMWMemoryMessageQueue
- TkbmMWClientConnectionPool
- TkbmMWClientQuery
- TkbmMWBinaryStreamFormat
- and a datasource, a data-aware grid and a couple of buttons.

Transport.InboundMessageQueue is set to point to qIn and Transport.OutboundMessageQueue is set to point to qOut. The TkbmMWClientConnectionPool.Transport property is set to point at Transport, and kbmMWClientQuery1.ConnectionPool is set to point to the kbmMWClientConnectionPool component.
Further, kbmMWClientQuery1.TransportStreamFormat is set to point to kbmMWBinaryStreamFormat1 (matching a similar setting on the query service in the application server). The DB grid is hooked up to the datasource that is hooked up to kbmMWClientQuery1, and the Transport.ClusterID property is set to 'Demo'.
The QueryService property of the kbmMWClientQuery1 component is set to the name of the service on the server that we would like to provide data to our client query. That is 'DATAWAREHOUSE', hence we set the property to that string. As we didn't define a service version on the DATAWAREHOUSE service, we should clear out the QueryServiceVersion property.
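The same wiring can of course be done in code rather than in the Object Inspector; a small sketch using the two property names just mentioned:

// Runtime equivalent of the designtime settings above.
kbmMWClientQuery1.QueryService:='DATAWAREHOUSE';
kbmMWClientQuery1.QueryServiceVersion:='';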

Figure 16: Let's put some code in the connect/disconnect buttons:


procedure TForm1.btnConnectClick(Sender: TObject);
begin
  kbmMWClientConnectionPool1.Active:=true;
end;

procedure TForm1.btnDisconnectClick(Sender: TObject);
begin
  kbmMWClientConnectionPool1.Active:=false;
  kbmMWClientConnectionPool1.KillConnections;
end;

We don't in fact connect immediately; instead we tell the connection pool, which maintains connections from the client to the application server, that it can connect when it needs to. Similarly, we tell the connection pool that it should kill its pool of connections when the user clicks the Disconnect button.

Figure 15:



Then we write some code for the button that makes the request to the server:

procedure TForm1.QueryClick(Sender: TObject);
begin
  // We want to access the published query named DATA on the server.
  kbmMWClientQuery1.Query.Text:='@DATA';
  // Lets get the parameters from it, so we can fill them out with
  // relevant query information.
  kbmMWClientQuery1.FetchDefinitions;
  // Setup the parameter values for the query.
  // Search all databases identified with the values 3 and 4 and 5.
  // Search for values in the range 10-20.
  kbmMWClientQuery1.ParamByName['Database'].AsString:='3,4,5';
  kbmMWClientQuery1.ParamByName['ConditionLow'].AsInteger:=10;
  kbmMWClientQuery1.ParamByName['ConditionHigh'].AsInteger:=20;
  kbmMWClientQuery1.AsyncOpen;
  // Now the server will go through all the specified databases,
  // search for the relevant criteria, and give us a message
  // for each non empty resultset, via the Transports OnAsyncResponse
  // event.
end;

And finally we need to write some code to handle the asynchronous responses, of which there will be a minimum of 1 and potentially many for each click on the Query button:

procedure TForm1.TransportAsyncResponse(Sender: TObject;
  const TransportStream: IkbmMWCustomResponseTransportStream;
  const RequestID: Integer; const Result: Variant;
  UserStream: TkbmMWMemoryStream);
var
  ps: TkbmMWDatasetPartialState;
begin
  // Is this a response to a request made via the query component?
  // Compare LastRequestID from the query component with the
  // RequestID of the incoming response message.
  if kbmMWClientQuery1.ActiveClient.LastRequestID = RequestID then
  begin
    ps:=kbmMWClientQuery1.SetQueryResult(Result,UserStream);
    case ps of
      mwdpsAll: ;     // We got all in one response message.
                      // No more messages are coming for this request.
      mwdpsInitial: ; // We got the initial response message, and thus
                      // the dataset has now been defined with
                      // structure (fields) and some data.
      mwdpsData: ;    // We got an intermediate response message.
                      // It was appended to the already existing data.
      mwdpsFinal: ;   // We got the final response message for this
                      // request. Now the dataset is complete.
    end;
  end;
end;

Provided we have the correct SQLite databases available, and provided we have configured the kbmMWSQLiteConnectionPool correctly on the application server, it's now possible to make asynchronous queries with developer-controlled incremental on-the-fly transfer of partial resultsets to the client.

To summarise what we have demonstrated here:
* How to create a basic server and client using kbmMW Enterprise Edition
* How to work with virtual memory data sets
* How to work with SQLite databases
* How to work with the Wide Information Bus (WIB) messaging framework
* How to work with developer-controlled incremental resultsets

ONLY FOR SUBSCRIBERS
SPECIAL OFFER: WHEN ORDERING A NEW LICENSE OF KBMMW PROFESSIONAL OR ENTERPRISE, INCLUDING COMPETITIVE UPGRADES: 40% DISCOUNT
THE COUPON CODE TO REFER TO IS: KBMMWBLAISEPASCAL2010.
THIS OFFER IS ONLY VALID FROM OCTOBER 10, 2010 UNTIL NOVEMBER 10, 2010


Multiplatform Windows CE by Joost van der Sluis


Level: starter to expert. Requires Lazarus (USB Stick).

The Lazarus USB stick is already prepared to be used with this project. If you want to use your own Lazarus version you will have to prepare it and do the installation yourself.
Using Lazarus you can build programs which can be compiled for different platforms. One of those platforms is Windows Mobile, also known as PocketPC or Windows CE. This article explains how you can write a simple Windows Mobile application which uses GPS and the embedded database SQLite. I also cover how you can debug the application and how you can test the application with the Windows Mobile Emulator from Microsoft.
Multiplatform
Lazarus' main advantage is its support for multiple platforms. Free Pascal, the compiler which is used by Lazarus, can make executables for several platforms. Unlike .Net, Free Pascal does not make something like an intermediate executable which has to run on some framework.
But what exactly are the differences between platforms? For Lazarus three aspects are important: the processor, the operating system and the widgetset used. How does this relate to mobile phones using Windows Mobile?
The processor:
Mobile phones can use different types of processors. Most phones use ARM processors, but there are a lot of different types. And ARM processors can also be used with different settings, which can be important. But luckily Windows Mobile phones always use the same kind of ARM processor, using the same (relevant) settings.
The operating system:
Lazarus calls Windows Mobile 'WinCE', the name of an older version of this operating system. At least versions 5 and 6 are supported.
The widgetset:
A 'widgetset' is a collection of graphic controls which can be used to build programs.

On Windows most programs use the default Windows widgetset, but it is also possible to build programs using the QT widgetset, for example. On Linux QT and GTK are the most popular widgetsets, and on OS X there are Carbon and Cocoa. In this case we want to use Windows Mobile and its default widgetset called 'WinCE'. (Note that in this case the name of the widgetset is the same as the name of the operating system. We must be clear about the distinction between widgetset and operating system.)
Since it is almost impossible to write the program on the mobile phone itself, we will use cross-compilation. That means that we use a Windows PC to create the binaries for WinCE. For that you need a cross-compiler and some extra utilities. On the Blaise Pascal Magazine Lazarus USB-stick these are pre-installed. If you are using your own installation of Lazarus, you have to download and install the cross-compiler for the ARM/WinCE add-in. (Lazarus-0.9.28.2-fpc-2.2.4-cross-arm-wince-win32.exe)
Start Building
Now we can start building 'Hello World' for Windows Mobile. First we create a PC version, then a version for a telephone. Start Lazarus and start a new program. Place a button in the upper left corner and place the following code in the 'onclick' event:

MessageDlg('Hello','Hello World',mtInformation,[mbOK],0);

Save the program, then compile and run it to test it on the PC. If this all works we can configure Lazarus to create a WinCE application. From the 'Project' menu choose 'Project Options...' then 'Compiler Options'. In this screen we select which widgetset to use, in this screen called the 'LCL Widget Type'. Note that versions of Lazarus later than the USB-stick version have a combobox dropdown where you can select the LCL Widget Type. Select 'WinCE' and activate the 'Code' tree node ('Code generation' node in later Lazarus versions). Set the 'Target OS' to WinCE and the processor (Target CPU family) to ARM.
Click on 'OK' to save the changes and then recompile the program. If everything goes successfully you now have a hello-world application for WinCE. You could try to start the application, but that will fail. A program compiled for an ARM processor will not work on an Intel Pentium processor. To test your program you'll have to copy it to a Windows Mobile Phone and run it there.

Figure 1:



Windows Mobile Emulator
It is very cumbersome, of course, to copy the application to the phone every time you want to test a compiled code change. Instead, use the Windows Mobile Emulator for quick testing.
This program from Microsoft simulates a mobile phone with an ARM processor, on which you can run multiple images with several operating systems. This emulator and the images can be downloaded from the Microsoft website. Search for 'Windows Mobile Emulator Images' and select the 'Professional' version of Windows Mobile, or else there will be no touchscreen support.

Figure 2:
After the installation there is a new option in the start menu, 'Windows Mobile 6 SDK', with the sub-item 'Standalone Emulator Images', which contains a list of several images with different versions of Windows Mobile. Choose one to run. You will see a telephone on which Windows is starting up.

Figure 3:
In File > Configure it is possible to set a 'shared folder'. Set it to the location where you stored the Lazarus project. This shared folder is now available on the simulated phone as an extra storage card.

Figure 4:
On the phone select 'programs' in the start menu and run the File Explorer. Now select the 'storage card' in the upper left corner. Now you can see all the files from the Lazarus project and the hello-world executable file. Click on the program and your phone will tell you 'hello'.



Remote debugging
So now we are able to change the program, compile it and show the result in the emulator. But at times you will also want to debug the application. Lazarus uses the debugger 'gdb' internally. To be able to debug WinCE applications a special version of gdb is necessary, which can be downloaded from the Free Pascal website (ftp://ftp.freepascal.org/fpc/contrib/cross/). Download the file gdb-6.4-win32-arm-wince.zip and unzip the files to a new folder. Now you have to configure Lazarus so that it uses this version of gdb to debug applications. This has to be done in the IDE options (Environment > Options), on the debugger screen. Set the debugger path to the path+filename (gdb.exe) of the gdb version you just extracted.
Gdb uses ActiveSync (Windows XP) or the 'Windows Mobile Device Center' (Windows Vista or Windows 7) for remote debugging. If you don't have one of these tools installed you will have to do this next. If things don't work immediately after installation, reboot the PC.
After the reboot, restart the Windows Mobile image. Now you need the 'Device Emulator Manager', which you can find in the 'Windows Mobile 6 SDK\tools' section of the Windows start menu. When the Device Emulator Manager is started, you can see a GUID (a random string) which represents the mobile telephone which is running in the emulator. Right-click on the GUID and choose 'cradle'. This simulates the connection of the telephone to your computer. Now start ActiveSync or the 'Windows Mobile Device Center' and check that it connects to the simulated mobile phone. (If the connection isn't made, select 'DMA' in the connection settings of the Windows Mobile Device Center and restart the Device Center.)

Figure 5:
If there is a connection between ActiveSync/Device Center and the emulated mobile phone, we can start debugging. Return to Lazarus and place a breakpoint on the line on which the messagebox is opened. Start the program using the remote debugger (F9). Without the remote debugger selected, this would result in the error message that the application is not suitable for Windows.
But now the application is started on the phone. You need some patience, though; it needs some time. When the program is running, click on the button and Lazarus will pause the program on the breakpoint. With F9 the program continues, behaving just as you expect while debugging applications.
It's important though to know exactly what happens when you debug the application remotely. On the phone a folder called '\gdb' is created. Then the program which has to be run on the phone is copied to this folder and started. Then the debugger connects to this running application. Note however that when a file with the same name already exists on the phone, the application is not transferred. This means that if you change the program, re-compile it and run it again, the 'old' version of the program is still used on the phone. So you have to remove the application from the folder '\gdb\' before you can debug the new version.
One of the reasons that debugging is so slow is that copying the file to the phone takes so long. It's possible to speed up this process by excluding the debug information from the executable while linking the application, and placing this information in a separate file. This leads to a smaller executable, so it takes less time to copy it. You can find the option for putting debug information in a separate file in the compiler options, on the link tab. (Use external gdb debug symbols file (-Xg).)

GPS
Until now we were only busy with configuring all kinds of things. Now let's begin with writing applications. We want to create an application which determines the location of the phone using GPS and stores this information in a local database. Reading the GPS can be done by connecting directly to the COM port to which the GPS is connected, but Windows CE also has a library (gpsapi.dll) which can be used to get data from the GPS. (This library is part of WinCE since version 5.) It's this 'GPS Intermediate Driver' that we're using in this article. Therefore we first have to make the structures and functions from this dll accessible in our program. Add this definition to the application code:

const
  gps_version_1         = 1;
  gps_max_satellites    = 12;
  gps_max_prefix_name   = 16;
  gps_max_friendly_name = 64;

type
  Tgps_fix_quality   = (gps_fix_quality_unknown,
                        gps_fix_quality_gps,
                        gps_fix_quality_dgps);
  Tgps_fix_selection = (gps_fix_selection_unknown,
                        gps_fix_selection_auto,
                        gps_fix_selection_manual);
  Tgps_fix_type      = (gps_fix_unknown,
                        gps_fix_2D,
                        gps_fix_3D);

  TGPS_Position = record
    dwVersion: DWord;
    dwSize: DWord;
    dwValidFields: DWord;
    dwFlags: DWord;
    stUTCTime: Windows.SYSTEMTIME;
    dblLatitude: double;
    dblLongitude: double;
    flSpeed: cfloat;
    flHeading: cfloat;
    dblMagneticVariation: double;
    flAltitudeWRTSeaLevel: cfloat;
    flAltitudeWRTEllipsoid: cfloat;
    FixQuality: Tgps_fix_quality;
    FixType: Tgps_fix_type;
    SelectionType: Tgps_fix_selection;
    flPositionDilutionOfPrecision: cfloat;
    flHorizontalDilutionOfPrecision: cfloat;
    flVerticalDilutionOfPrecision: cfloat;
    dwSatelliteCount: DWORD;
    rgdwSatellitesUsedPRNs:
      array[0..gps_max_satellites - 1] of cdouble;
    dwSatellitesInView: DWORD;
    rgdwSatellitesInViewPRNs:
      array[0..gps_max_satellites - 1] of cdouble;
    rgdwSatellitesInViewElevation:
      array[0..gps_max_satellites - 1] of cdouble;
    rgdwSatellitesInViewAzimuth:
      array[0..gps_max_satellites - 1] of cdouble;
    rgdwSatellitesInViewSignalToNoiseRatio:
      array[0..gps_max_satellites - 1] of cdouble;
    Fillup: array[0..287] of byte;
  end;



  TGPS_Device = record
    dwVersion: DWORD;
    dwSize: DWORD;
    dwServiceState: DWORD;
    dwDeviceState: DWORD;
    ftLastDataReceived: Windows.FILETIME;
    szGPSDriverPrefix: array[0..gps_max_prefix_name - 1] of WChar;
    szGPSMultiplexPrefix: array[0..gps_max_prefix_name - 1] of WChar;
    szGPSFriendlyName: array[0..gps_max_friendly_name - 1] of WChar;
    Fillup: array[0..114] of byte;
  end;

The constants are for some regular values. TGPS_Position is a record in which a location is stored. TGPS_Device contains information about the GPS device. The Fillup array in TGPS_Position and TGPS_Device is there to work around a bug in several versions of Windows CE; this is explained later. As you can see, the FILETIME and SYSTEMTIME types are used. These types are specific to Windows, so you have to add the Windows unit to the uses clause. This immediately results in an application that won't compile for non-Windows operating systems. Not a problem for our Windows-specific application. Furthermore, the ctypes unit has to be added. This unit adds several types which are used in the C programming language, in which the GPS API has been written. The GPSAPI.DLL has four functions to communicate with the GPS. To be able to use these functions in a Pascal program, they have to be defined first:

{$IFDEF WINCE}
function GPSOpenDevice(hNewLocationData,
  hDeviceStateChange: PtrInt; szDeviceName: PWideChar;
  dwFlags: DWord): PtrInt; cdecl; external 'gpsapi.dll'
  name 'GPSOpenDevice';
function GPSCloseDevice(hGPSDevice: PtrInt): DWord;
  cdecl; external 'gpsapi.dll' name 'GPSCloseDevice';
function GPSGetPosition(hGPSDevice: PtrInt;
  var pGPSPosition: TGPS_Position;
  dwMaximumAge, dwFlags: DWord): DWord; cdecl;
  external 'gpsapi.dll' name 'GPSGetPosition';
function GPSGetDeviceState(var pGPSDevice: TGPS_Device): DWord;
  cdecl; external 'gpsapi.dll' name 'GPSGetDeviceState';
{$ENDIF WINCE}

The definitions are bracketed between $IFDEF statements to ensure that this code is only compiled when compiling for WinCE. Free Pascal uses these defines to write code which differs between operating systems. There are different defines for all operating systems, processors and widgetsets. Here we use the define to make sure that the code still compiles on a system on which GPSAPI.DLL is not available. This way we can also use this program on a regular PC, although in that case the GPS won't work, obviously.
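The article only uses GPSOpenDevice, GPSGetPosition and GPSCloseDevice. For completeness, here is a small sketch of the fourth function, GPSGetDeviceState, using the declarations above (the procedure name and the message shown are my own invention):

procedure ShowGPSDeviceState;
var
  dev: TGPS_Device;
begin
{$IFDEF WINCE}
  FillByte(dev, sizeof(dev), 0);
  dev.dwVersion := gps_version_1;
  dev.dwSize := sizeof(dev);
  // Ask the GPS Intermediate Driver for its current state.
  if GPSGetDeviceState(dev) = ERROR_SUCCESS then
    ShowMessage(WideCharToString(PWideChar(@dev.szGPSFriendlyName[0])));
{$ENDIF WINCE}
end;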
Now we want to see the GPS position, instead of 'Hello World'. Add the private function ConnectGPS to the form:

function TForm1.ConnectGPS: Boolean;
begin
{$IFDEF WINCE}
  result := false;
  fgpshandle := GPSOpenDevice(0, 0, nil, 0);
  if fgpshandle = 0 then
  begin
    MessageDlg('Error', 'Activating the GPS failed.',
      mtError, [mbOK], 0);
    exit;
  end;
  result := true;
{$ENDIF WINCE}
end;
With GPSOpenDevice a connection is made with the GPS, and when necessary the GPS is turned on. The fgpshandle has to be added to the form as a private variable of type PtrInt. This handle is used to read data from the GPS and to close it down. The error message is self-explanatory.
Now we can replace the code which shows 'Hello World' with something more useful:
procedure TForm1.Button1Click(Sender: TObject);
var
  pGPSPosition: TGPS_Position;
  res: DWORD;
begin
{$IFDEF WINCE}
  if (fgpshandle<>0) or ConnectGPS then
  begin
    FillByte(pGPSPosition, sizeof(pGPSPosition), 0);
    pGPSPosition.dwVersion := gps_version_1;
    pGPSPosition.dwSize := 376; // replace with 344 if you get error code 87 (see below)
    res := GPSGetPosition(fgpshandle, pGPSPosition, 10000, 0);
    if res <> ERROR_SUCCESS then
    begin
      MessageDlg('Error',
        Format('Reading the GPS failed. Error code %d', [res]),
        mtError, [mbOK], 0);
      exit;
    end
    else
      MessageDlg('Success',
        Format('Longitude: %g, Latitude: %g. Satellites: %d:%d.',
          [pGPSPosition.dblLongitude, pGPSPosition.dblLatitude,
           pGPSPosition.dwSatelliteCount,
           pGPSPosition.dwSatellitesInView]),
        mtInformation, [mbOK], 0);
  end;
{$ENDIF WINCE}
end;

The code above first checks if an fgpshandle is available, and if not, the program tries to create a connection with the GPS. If this is successful, the pGPSPosition record is initialized: it is cleared completely and then the version number and size of the record are set. As you can see, the size of the record is given explicitly. In principle this is wrong; it should be 'sizeof(pGPSPosition)' instead of 376. The problem is that Windows CE 5 and 6 use a different size for pGPSPosition. The idea was that WinCE 6 would be backwards compatible with the size used by WinCE 5, but there's a bug in some versions of WinCE that breaks this compatibility. That's why a constant size is given here. If the given size is incorrect, the program will raise an error with error code 87. Should this happen, replace '376' by '344' and try again. Later on a better solution will be discussed.
After the pGPSPosition record is initialized, GPSGetPosition is called with four parameters: firstly the fgpshandle, secondly the pGPSPosition record, and thirdly the maximum number of milliseconds old that the answer may be. That's because the GPS sends information to the phone continuously; it could be that a location was sent to the phone a second ago. If this age parameter is larger than 1000 (1 second), then the call will immediately return the value sent one second ago. The fourth parameter is always zero.
After a successful call to GPSGetPosition a messagebox is shown with the GPS coordinates of the current location. Further, you can see how many satellites were used to obtain the current location (more satellites means a more accurate result) and the total number of satellites the GPS 'sees'.
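To make the age parameter concrete, here is a one-line illustrative fragment (same names as in Button1Click above) that accepts a cached fix of up to one second old, and only waits for fresh data if none that recent is available:

// Accept any fix the driver received within the last 1000 ms.
res := GPSGetPosition(fgpshandle, pGPSPosition, 1000, 0);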




Now we can test the program. Compile and start the application in the emulator and press the button. If you receive an error message with error code 87, change the size of the pGPSPosition record. If everything went ok, you'll see that you are on the equator, precisely below Greenwich (0,0), and that 0 satellites are found. Don't worry, this is completely normal, since there is no GPS device in your PC or the emulator. But if this works, copy the application to a real mobile phone and try again. Again it will tell you that you are at position 0,0, but still there is no reason to panic.
If the GPS has just been turned on, it takes some time until the right location is found. When everything is ok, the second number, indicating the number of satellites, is higher than zero. Click the messagebox away and open it again a few times. You'll see the number of satellites increasing, and finally it will also show a position. (Walking outside the building will also help.)
To be able to test GPS software inside the emulator, Microsoft developed a utility called 'FakeGPS'. It is part of the Windows Mobile SDK, but to install this SDK you need to have Visual Studio installed. However, it is also possible to open the Windows Mobile SDK in a program like '7zip' and to extract the file 'fakegps.cab'. Copy this file to the emulated phone, which can be done by placing it into the same location as the application you have just developed. After all, we have set this location to be visible in the emulator as 'storage card'. Click on 'fakegps.cab' on the phone to install it. Then browse to the programs folder on the phone and start the 'fakegps' utility. Turn on the fakegps and then you can test the Lazarus program on the emulator. You'll see that it will return some fake coordinates, so you are able to test things.
Now a real application
Now that we can read GPS coordinates, we can make a real program. Doing this we have to realize that we are developing for mobile phones with small screen resolutions which differ for each model. The form must always be started maximised, so set Form.WindowState to wsMaximized. Place two TLabels on the form with the names 'lMessage' and 'lMessage1'. For both components set Align to alTop and Alignment to taCenter. Set the font style to fsBold and give them suitable captions, for example 'Lazarus WinCE demonstration' and 'GPS demo'. Set BorderSpacing.Around to 5. It might also be a good idea to increase the default font size: set the Font.Size property of the form to 10, which will then set the default font size of all the form's controls to 10 (unless a font size has explicitly been given to a control).
Also place two buttons on the form and call them 'bStop' and 'bStart', giving them each an appropriate caption. Place them on the bottom-left of the form as you can see in Figure 6, and set the akBottom and akLeft anchor properties to true and the others to false. Now drag the border of the form to change the form size and check that everything scales nicely with the form. This is important because it is quite possible that when you run the program on your phone, you won't see anything to begin with, because all controls are positioned outside the screen. The last step is to place a TTimer on the form. To check that you did everything ok you can run the program on your phone or emulator.

Figure 6:
Now we add a new private function to obtain the GPS position. The code is as follows:

function TForm1.GetGPSPosition(var GPSPosition: TGPS_Position): dword;
var
  res: DWORD;
  dtime: TDateTime;
  ttime: Windows.SYSTEMTIME;
begin
{$IFDEF WINCE}
  FillByte(GPSPosition, sizeof(GPSPosition), 0);
  GPSPosition.dwVersion := gps_version_1;
  if fgpsposition_size = 0 then
  begin
    GPSPosition.dwSize := 376;
    res := GPSGetPosition(fgpshandle, GPSPosition, 500000, 0);
    if res = ERROR_INVALID_PARAMETER then
    begin
      GPSPosition.dwSize := 344;
      res := GPSGetPosition(fgpshandle, GPSPosition, 500000, 0);
    end;
  end
  else
  begin
    GPSPosition.dwSize := fgpsposition_size;
    res := GPSGetPosition(fgpshandle, GPSPosition, 500000, 0);
  end;
  if res = ERROR_SUCCESS then
  begin
    // Remember the record size that worked, for later calls.
    fgpsposition_size := GPSPosition.dwSize;
    if (GPSPosition.dwSatelliteCount > 0) then
    begin
      lMessage.Caption := Format('Last position:(%d)',
        [GPSPosition.dwSatelliteCount]);
      lMessage1.Caption := Format('%g %g',
        [GPSPosition.dblLatitude, GPSPosition.dblLongitude]);
    end
    else
      lMessage.Caption := Format('Wait for signal. %D satellites.',
        [GPSPosition.dwSatellitesInView]);
  end
  else
    lMessage.Caption := Format('Error while reading GPS, error code %D.', [res]);
  result := res;
{$ENDIF}
end;

This reads and displays the GPS coordinates on the screen. If there are no coordinates available yet, a message is shown alerting you to the absence of a signal. What is new is that the program tries to set up a connection with dwSize:=376 and, if this doesn't succeed, with dwSize:=344. The size which works is then stored, so it can be used later; for that, add fgpsposition_size to the form as a private DWord field (just like fgpshandle). Note that for this to work it is important that the size of the actual record is at least as large as the value given here.
The TGPS_Position record as we defined it earlier is for Windows Mobile version 5 and normally has a size which is too small for Windows Mobile 6. But because version 6 can't handle the format of version 5 in all cases, the size of the record is stretched by adding the unused array of bytes (TGPS_Position.Fillup). This way it will work on all versions.
Now we have to hook up the events for the buttons and timer:

Figure 7:


procedure TForm1.bStartClick(Sender: TObject);
var
  res: DWord;
  pGPS_Position: TGPS_Position;
begin
  if not ConnectGPS then Exit;
  res := GetGPSPosition(pGPS_Position, False);
  if res = ERROR_SUCCESS then
  begin
    bStart.Enabled := false;
    bStop.Enabled := True;
    Timer1.Enabled := true;
  end;
end;

Figure 8:

procedure TForm1.bStopClick(Sender: TObject);
begin
  Timer1.Enabled := false;
{$IFDEF WINCE}
  GPSCloseDevice(fgpshandle);
{$ENDIF WINCE}
  bStart.Enabled := true;
  bStop.Enabled := false;
end;

procedure TForm1.Timer1Timer(Sender: TObject);
var
  res: DWord;
  pGPSPosition: TGPS_Position;
begin
  res := GetGPSPosition(pGPSPosition, true);
end;

Now, on pressing the Start button, the program tries to connect to the GPS and, if it succeeds, the timer is activated. The timer makes sure that after each interval (Timer1.Interval) the GPS is checked to see if there is any new information. This information is shown on the screen. Using the Stop button, the timer is deactivated and the connection with the GPS is closed.
Storage of a walk in the park
Suppose we want to walk in the park and save our position every 5 seconds. We want to use a database for that, but not one for which we have to install a complete database server. And it should work on Windows CE. That's a perfect match for SQLite (www.sqlite.org).
If you want to work with SQLite, the only thing you have to do is place the sqlite3.dll in the same folder as your executable, or, if you want to make it system-wide, in c:\windows\system32. Let's try to get this working on the PC first. This way the {$IFDEF WINCE} defines are also used for something useful.
First download sqlite3.dll and place it in the system32 folder. We also want to access the database from within Lazarus, so placing it in the project folder is not enough. (It is possible, of course, but in that case the dll should also be copied to the location of the Lazarus executable.)
Now place a TSQLite3Connection, TSQLTransaction and a TSQLQuery from the sqldb tab on the form. Connect the TSQLTransaction.Connection and TSQLQuery.Connection to the TSQLite3Connection. Set the TSQLite3Connection.Transaction to the TSQLTransaction. From the 'Data Access' tab add a TDataSource and from the 'Data Controls' tab a TDBGrid. Connect the TDataSource.Dataset property to the TSQLQuery and set the Datasource property of the grid to the TDataSource. Finally give the TSQLQuery the following query (SQL): 'select * from coordinates;'.
Now choose a DatabaseName for the TSQLite3Connection. The database name is nothing more than a filename in which the data is stored. In my case that's 'h:\src\pgg-wince\gpsdata.sdb'. You can check if sqlite is installed and configured ok by setting the 'Connected' property to 'true'. Because there is no extra tool available to create the table we need, we create the table in code. Add the following private procedure to your application.
This code first checks if the query is already active. If not active, a connection to the database is made. Then it checks if a table with the name 'coordinates' already exists. If this table does not exist, it is created with the fields 'LocalTime', 'GPSTime', 'Longitude' and 'Latitude'.


procedure TForm1.InitialiseDB;
var
  sl: TStringList;
  i: integer;
begin
  if SQLQuery1.Active then Exit;
  SQLite3Connection1.Open;
  sl := TStringList.Create;
  try
    SQLite3Connection1.GetTableNames(sl);
    if not sl.Find('coordinates', i) then
    begin
      SQLite3Connection1.ExecuteDirect(
        'create table coordinates(LocalTime datetime, ' +
        'GPSTime datetime, longitude real, latitude real);');
      SQLTransaction1.CommitRetaining;
    end;
  finally
    sl.Free;
  end;
  SQLQuery1.Open;
end;

The transaction is committed so that everything is saved; after that you
will be able to open the query. Add a call to this method in the form's
OnCreate event, so that the table is always created (if necessary) and
opened at the start of the program.
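The OnCreate handler itself stays trivial; a minimal sketch, assuming the
form is called TForm1 as in the rest of this article:

procedure TForm1.FormCreate(Sender: TObject);
begin
  // Create the coordinates table if needed and open the query at startup.
  InitialiseDB;
end;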
To test and create this simple database and table, we first have to
compile the application for a PC. Go to the compiler options and set the
widget type to win32/win64.
The generated code should target an i386 processor, and win32 must be
chosen as the operating system. Start the program in the debugger and
you'll see that this fails.
That is because the debugger is still configured to debug applications
for ARM processors, remotely on a mobile phone.
Go to 'Environment' -> 'IDE Options' -> 'Debugger'
and set the debugger path to the 'normal' gdb executable
(lazarus\mingw\bin\gdb.exe).
Start the application again and then close it. Now check whether a
database file with a size greater than zero bytes has been created.
If so, you can set the Active property of the TSQLQuery on the form to
True; the fields added to the table then become visible in the grid.
To let the grid scale with the form size on the phone, set
BorderSpacing.Around to 5 and BorderSpacing.Bottom to 40.
Set ReadOnly to True, remove dgEditing and dgIndicator from Options,
and set AutoFillColumns to True.
This is not all, though. To show the columns properly on a small screen,
the properties of each column have to be set manually. Double-click the
grid and click the 'Add' button three times, so that three columns are
added. Select the first column, set its FieldName to the 'LocalTime'
field and its DisplayFormat to 'hh:mm:ss', so that only the time is
displayed, not the date.
You can also give the column a suitable title.
Configure the other two columns so that 'latitude' and 'longitude' are
shown, with '#0.#######' as DisplayFormat.
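The same grid configuration can also be done at run time. The following
is a hypothetical helper, not part of the article's project files; it
simply mirrors the designer steps just described:

procedure TForm1.ConfigureGrid; // hypothetical helper
var
  col: TColumn;
begin
  DBGrid1.BorderSpacing.Around := 5;
  DBGrid1.BorderSpacing.Bottom := 40;
  DBGrid1.ReadOnly := True;
  DBGrid1.AutoFillColumns := True;
  DBGrid1.Options := DBGrid1.Options - [dgEditing, dgIndicator];
  // Time column: show only the time part of LocalTime.
  col := DBGrid1.Columns.Add;
  col.FieldName := 'LocalTime';
  col.DisplayFormat := 'hh:mm:ss';
  col.Title.Caption := 'Time';
  // Coordinate columns: seven decimals, as in the designer setup.
  col := DBGrid1.Columns.Add;
  col.FieldName := 'latitude';
  col.DisplayFormat := '#0.#######';
  col.Title.Caption := 'Lat';
  col := DBGrid1.Columns.Add;
  col.FieldName := 'longitude';
  col.DisplayFormat := '#0.#######';
  col.Title.Caption := 'Lon';
end;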
With the controls configured, we still have to take care of saving the
positions to the database.


We do this by adding a StoreDB parameter to the GetGPSPosition
function, shown in Figure 9 below.

Besides the new parameter, the dtime and ttime variables are new, as is
the code in which a new record is added to the database. The time at
which the coordinates were measured by the GPS is also stored in the
database; to do so, that time has to be converted to a TDateTime.
When StoreDB is False nothing is stored in the database, and if the time
from the GPS is invalid it is not stored either.
Now we only need to supply a value for the StoreDB argument at the two
places where GetGPSPosition is called: in the Start button's event
handler the value should be False, while in the timer StoreDB has to be
True.
After these changes, set the compiler options so that Lazarus generates
a Windows CE/ARM application again.
The TSQLite3Connection DatabaseName also has to be adapted, since the
folder in the filename used so far probably doesn't exist on the mobile
device.
The easiest way is to set the database name to a simple filename without
any path, for example 'gpsdata.sdb'.
Unlike what we are used to in Windows on a PC, the file is then stored
not in the 'current' folder but in the 'My Device' folder, the root
folder of the device: Windows CE has no such thing as a current
directory.
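One way to keep a single code base for both targets is to choose the
database name with the same {$IFDEF WINCE} conditional used elsewhere in
this article. A minimal sketch; the PC path is simply the example path
used earlier:

// Pick a platform-appropriate database file name.
{$IFDEF WINCE}
// No path: on Windows CE the file ends up in the root ('My Device') folder.
SQLite3Connection1.DatabaseName := 'gpsdata.sdb';
{$ELSE}
SQLite3Connection1.DatabaseName := 'h:\src\pgg-wince\gpsdata.sdb';
{$ENDIF}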
Change the database name, compile the application and run it on the
emulator or mobile phone.
This immediately leads to an error message, because sqlite3.dll isn't
present on the phone (yet).
Solving that isn't as easy as it looks.

Figure 9:

function TForm1.GetGPSPosition(var GPSPosition: TGPS_Position;
  StoreDB: boolean): DWord;
var
  res: DWORD;
  dtime: TDateTime;
  ttime: Windows.SystemTime;
  dist: double;
begin
  {$IFDEF WINCE}
  ...
  lMessage.Caption := Format('Last position: (%d)',
    [GPSPosition.dwSatelliteCount]);
  lMessage1.Caption := Format('%g %g',
    [GPSPosition.dblLatitude, GPSPosition.dblLongitude]);
  if StoreDB then
  begin
    SQLQuery1.First;
    SQLQuery1.Insert;
    SQLQuery1.FieldByName('localtime').AsDateTime := Now;
    // A year of 0 means the GPS has no valid time (yet);
    // otherwise convert the GPS time to a TDateTime and store it.
    if GPSPosition.stUTCTime.Year <> 0 then
    begin
      ttime := GPSPosition.stUTCTime;
      dtime := ComposeDateTime(
        EncodeDate(ttime.Year, ttime.Month, ttime.Day),
        EncodeTime(ttime.Hour, ttime.Minute,
          ttime.Second, ttime.Millisecond));
      SQLQuery1.FieldByName('gpstime').AsDateTime := dtime;
    end;
    SQLQuery1.FieldByName('latitude').AsFloat :=
      GPSPosition.dblLatitude;
    SQLQuery1.FieldByName('longitude').AsFloat :=
      GPSPosition.dblLongitude;
    SQLQuery1.Post;
  end
  else
    lMessage.Caption := Format('Waiting for connection. %d satellites.',
      [GPSPosition.dwSatellitesInView]);
  ...

The sqlite3.dll we used earlier is built for an Intel Pentium processor on


a Windows operating system.
We first have to download a sqlite3.dll version which is built for
ARM/WinCE. The easiest way to do this is downloading this file from
the Free Pascal ftp site (ftp://ftp.freepascal.org/fpc/contrib/
file: arm-wince-sqlite322.zip).
Place the (extracted) file in the same location as your program and
start the program again. Note: if the program is running in the
debugger, its location is the folder '\gdb'.
When the program is running, click 'Start' and wait until the GPS has a
suitable signal. You can now see how the current location is shown in
the grid and stored in the database.
However, if you restart the application you will notice that the
coordinates table is empty again.
To save the data, add a 'Save' button in the lower-right corner of the
form. Set Anchors.Right and Anchors.Bottom to true (and the others to
false) and add the following code to its OnClick event:
procedure TForm1.bOpslaanClick(Sender: TObject);
begin
  // Write the buffered inserts to the SQLite file and commit.
  SQLQuery1.ApplyUpdates(0);
  SQLTransaction1.CommitRetaining;
end;

ApplyUpdates takes all the changes in the dataset's update buffer and
writes them to the table; CommitRetaining then commits the transaction
while keeping it open for further updates.
So now the application is finished, at least for this article, although
there is a lot that could still be added: for example, a feature that
measures the distance you have travelled, or a service that records how
often and where you go by car. All the basic components are here, and
the program can be extended in any way you like.
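As a starting point for the distance idea, the great-circle (haversine)
distance between two stored coordinates could be computed as follows.
This is a hypothetical helper, not part of the article's project; it
needs the Math unit for DegToRad and ArcSin:

uses Math;

function DistanceMeters(Lat1, Lon1, Lat2, Lon2: double): double;
const
  EarthRadius = 6371000.0; // mean earth radius in metres
var
  dLat, dLon, a: double;
begin
  // Haversine formula: numerically stable for small distances.
  dLat := DegToRad(Lat2 - Lat1);
  dLon := DegToRad(Lon2 - Lon1);
  a := Sqr(Sin(dLat / 2)) +
       Cos(DegToRad(Lat1)) * Cos(DegToRad(Lat2)) * Sqr(Sin(dLon / 2));
  Result := 2 * EarthRadius * ArcSin(Sqrt(a));
end;

Summing this over consecutive records in the coordinates table would
give the total distance travelled.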
Conclusion
It turns out that developing for Windows Mobile is not that difficult;
configuring all the necessary tools is the biggest part of the work.
You also need to get a few details right, such as using the correct
version of sqlite3.dll on your phone and the correct debugger version.
Once all of that is set up, it's child's play.

About the author:
Joost van der Sluis is a member of the Free Pascal core team. He is
responsible for the DB components, of which he wrote large parts. He is
one of the founders of CNOC, the company that offers commercial support
for Free Pascal and Lazarus, and he started the
www.lazarussupport.com website. CNOC promotes the use of Lazarus in
commercial environments and guides conversions from Delphi to Lazarus.

Professional help and support

The new www.Lazarussupport.com website has been set up to promote the
use of Lazarus and Free Pascal and to make these tools accessible to a
larger audience. The goal is to provide all the information that new
Lazarus users need. It also offers commercial support, delivered by
highly skilled professionals, for those who want to use Lazarus
commercially. Contact them whenever you need to:
http://www.lazarussupport.com/


Five Considerations for Choosing an Effective SQL Development Tool

Cross-Platform Database Development
By Scott Walz

Database administrators (DBAs) and developers are busy, and that means
they need the right tools to stay productive. Most work in
cross-platform database environments, where they must handle complex,
heterogeneous databases, multiple platforms, multiple versions of
platforms and numerous instances. Without the right tools, managing the
data that is flooding into their enterprises at an alarmingly fast rate
is impossible.

In fact, the typical DBA manages a terabyte of data or more and up to
35 database instances at any one time. He must also meet the needs of a
team of developers who are all requesting changes and updates
constantly. Despite this overwhelming workload, DBAs must deliver
improved performance and availability to prove their value to the
organization.

The challenges of cross-platform database development and management
can lead to significantly increased costs, as IT departments rush to
hire new developers to support unfamiliar platforms. What DBAs need is
enhanced manageability, change management and automation, and the right
tools to simplify, streamline and reduce the complexity of day-to-day
tasks. Without such tools, it will be impossible to handle the
increasing pressure of new business requirements and limited resources,
and, ultimately, performance and quality will suffer.

Fortunately, comprehensive toolsets are now available to alleviate some
of these pain points. SQL development tools can be particularly helpful
in cutting costs and saving time in cross-platform environments. The
following five considerations can help you choose the right tool for
the job.

1. Does the tool provide a rich user interface?
Database development and management tools must fit how you work. Tools
that offer a single user interface, no matter what the target platform,
won't require you to familiarize yourself with a new UI, and you can
start being productive immediately. This is particularly important when
bringing on new talent to take over or assist with existing projects.
Additionally, a single UI can cut down on ramp-up times as well as
training costs.

2. Does the tool have a comprehensive tools menu?
During SQL development, you may need to use numerous tools and data
repositories. A tool that can link you directly to other tools will
help streamline development. Look for solutions that offer
comprehensive menus and let you access other important resources with a
mouse click. It's also helpful if the tool lets you interface through
the menu with database file searches and scripting features, or view
visual differences between files and objects. Quick links to other
software, such as editing tools, are also desirable.

3. How deep is the tool's knowledge of other platforms?
DBAs and developers often work with outdated, inflexible tools and have
to spend a lot of time sifting through online documentation for bits of
information that will help them use the tools. This is a waste of time.
New tools are out there that offer efficient wizards that automatically
convert queries into the appropriate format for the target platform.
For example, you can quickly create tables even if you don't know the
right data fields and types. The simplicity can be enhanced with a
single user interface that is familiar and intuitive.

4. Does the tool enable code organization and the application of
standards for DB development?
To work quickly, organization is extremely important. Look for a tool
that enables you to organize and categorize data sources by platform
for easy retrieval. Some tools have customizable Bookmark features that
link you automatically to the resources you need or use most often.
Others offer filters that help to remove the noise and enable you to
focus on only relevant information. Some tools also enable fast project
creation in a new platform through reverse engineering; they can
extract procedures from one platform and apply them to the new project,
and even check in changes automatically to your chosen version control
solution. Such automation makes adapting to new environments much
easier and eliminates human error.

5. Can the tool streamline and improve coding?
As a DBA or developer working in cross-platform environments, you spend
a lot of time learning the constructs of a new procedural language and
understanding the relationships between data structures and how to form
queries. Tools that help you understand relationships between foreign
and primary keys are tremendous time-savers. Some tools even enable you
to drag and drop relationships into SQL windows by defining the meaning
of the relationships in the metadata. By bringing this information into
the query builder automatically, you can quickly construct effective
queries, even in unfamiliar environments. Such automation lets you
worry less about query construction so you can focus on writing good
code. Look for tools that also provide validation and error checking.
Debugging features can save you considerable time and ensure that bad
code never makes it past the tuning stage.

When choosing the right tool to assist with cross-platform database
development, it's essential to know your options and choose a tool that
addresses the key considerations. Being familiar with the features
available to streamline and automate the more difficult and tedious
tasks involved in working with different platforms will make life a lot
easier, and help you be more productive, more efficient and more
valuable to your organization.

Five Tips You Should Know to Tune SQL Like an Expert

Are you, or someone you know, suffering from SQL CTD (compulsive
tuning disorder)? It's a tragic situation that has plagued developers
for years. First identified by an Oracle DBA, CTD makes sufferers tune
blindly without a methodology, burning time and energy chasing phantom
SQL performance issues. The good news is that developers now have new
best practices and tuning tools to get relief from CTD and start tuning
like an expert. Keep reading to learn about tuning techniques that help
you tune faster and with greater confidence, knowing the code you pass
to the Quality Assurance engineer and DBA is more fully optimized.
Know the Risks of CTD
With DBAs becoming increasingly focused on meeting production service
levels, developers find themselves either writing or inheriting the SQL
code leveraged by the applications they are building, and therefore
owning the performance of their applications from the application layer
through the database layer. If performance issues work their way past
unit testing, they will typically be found again during load testing in
QA and slow down the QA process.
If these performance issues manage to reach production, they can wreak
havoc on the very service levels the DBA is trying to meet, causing
system slowdowns or even outages, which are the most dangerous type of
performance problem for any business.
And if it is determined through a process of elimination that the SQL
code is responsible for the performance bottleneck(s), the developer is
often left searching for a needle in a haystack, tuning SQL randomly by
shoveling it into a tuning engine that must sift through all of the SQL
to suggest ways to speed up any and all code, no matter its duration or
frequency. This is a common symptom of CTD, and it results in building
a haystack around the issue and wasting inordinate amounts of time and
effort.

Profile the Database
The symptom of chasing phantom issues can easily be remedied by
profiling the database first. This practice can be applied during unit
testing in development, during load testing in QA, and in production.
It captures snapshots of meaningful statistics and data, represented
graphically, to easily and immediately pinpoint performance issues. It
also allows a collaborative workflow between production DBAs, QA
engineers and developers, who can share profile snapshots and conduct a
very focused and effective troubleshooting process.
Discover the Bottleneck
Once a profile has been captured, it is easy to pinpoint performance
issues via a graphical representation of the data, broken down into
three dimensions: SQL statements, events, and sessions
(programs/users). A red line represents the number of available CPUs on
the target database, and any spikes that break above that line grab
your attention. You can crop out and drill down into the dimensions and
statistics for each spike to determine the root cause and take action.
This is a very useful productivity feature that helps avoid
troubleshooting irrelevant issues.
Find Worst-Performing SQL
After you have identified the cause of the bottleneck, the
worst-performing SQL, events, and sessions contributing to the
performance issues are automatically sorted to the top. You can click
on a SQL statement to see a graphical explain plan that presents the
execution path the database-specific optimizer takes to run the code.
You can also see session details about the specific statement,
including execution statistics. All of this information is recorded
during the profiling session; by profiling and discovering the
bottleneck first, it becomes easier and faster to find the
worst-performing SQL statement contributing to the bottleneck.
Tune SQL
Best practices in SQL tuning indicate that the best way to speed up
application and database performance is to tune the worst-performing
SQL first, but those who suffer from CTD and don't follow the preceding
three steps are unable to identify the worst SQL statements. Instead
they find themselves shoveling any and all SQL into a tuner, chasing
phantom issues and causing massive inefficiencies and slow turnarounds
on performance problems.
Once you are able to pinpoint the worst-performing SQL, you can select
that SQL to be run through the tuner. The tuner will verify that the
database-specific optimizer is taking the fastest execution path, that
the SQL is written effectively, that indexes are being leveraged, that
missing indexes are created, and that the underlying schema is defined
effectively for maximum performance.
Stress Test to Validate Performance Gains
Once the SQL is tuned, it is important to measure and validate
performance gains by stress testing the original, un-tuned SQL code and
the newly tuned SQL code side by side, while running a profiling
session and capturing the resulting snapshot. By simulating a number of
parallel sessions (users) and a number of executions over some duration
of time, you can ensure that the SQL you have tuned will in fact stand
up to QA load testing and production stress levels.
In Summary
It sounds simple enough, but many developers are still suffering from
CTD. That's a shame, because help is available. By following a painless
five-step regime and using the right tools, any developer can learn to
quickly profile and pinpoint the worst-performing SQL statements,
streamline their SQL tuning process and validate their work before
passing the code back to QA or the DBA for final testing. Everyone will
think it was tuned by an expert.
About the author:

Scott Walz
has more than 15 years of experience in database
development and serves as senior director of product
management for Embarcadero Technologies. In this
position, Scott oversees the direction of the company's
database product family, while focusing on database
development and administration products. Prior to
joining Embarcadero four years ago, Scott served as
development lead for Louisville Gas & Electric. He holds
a bachelor's degree in computer information systems
from Western Kentucky University.

The Lazarus Complete Guide will be available in mid-December 2010.
You can order it directly from our web shop.
If you pre-order the LAZARUS COMPLETE GUIDE,
you can have a Lazarus USB stick for only 15.00.

LAZARUS: COMPLETE GUIDE

Graphical User Interface programs for Windows 32 and 64, Windows CE,
Mac OS, Unix and Linux.

By M. van Canneyt, M. Gärtner, S. Heinig,
F. Monteiro de Carvalho, I. Ouedraogo.


Barnsten is Embarcadero's Technology Centre in the Benelux.
Barnsten became CodeGear's representative in 2007, when CodeGear closed
down its offices in the Benelux. The main products it represented were
Delphi, Delphi for PHP, RAD Studio, C++Builder, JBuilder and InterBase.
The CodeGear division was taken over by Embarcadero in 2008.
Embarcadero was at that time the publisher of powerful database
development tools such as ER/Studio, DBArtisan, DB Optimizer, DB Change
Manager, Rapid SQL, etc.
Last year Barnsten merged with Embarcadero's former database tool
partner and now supports all the Embarcadero tools.
As most applications support databases, this is a great fit!
You can now find all the tools you need for multi-platform development
at one company. Barnsten employees have over 16 years of experience
with the Embarcadero tools.
Visit the Barnsten website to learn more about us, the tools, special
offers and local events!
Contact us for a special offer on the database development tools.
This offer is for Blaise Pascal readers only!
Benelux customers can call us now at +31 23 542 22 27 or send a mail to
info@barnsten.com
