DATABASE:
WORLDS IN COLLISION?
Mini Course SQL by Miguel van de Laar
Delphi JSON Viewer by Paweł Głowacki
Is Lazarus ready for writing commercial applications? by Zeljan
NexusDB: exceptionally good, a real surprise... by Erwin Mouthaan
Great News: The TMS DataModeler by Bruno Fierens
Introduction to Delphi Database Development: Part 1 by Cary Jensen
First Look at Advantage Database Server 10 by Cary Jensen
Real-time data collection by Anton Vogelaar
About Object oriented databases by Detlef Overbeek
FastReport, what's up? by Detlef Overbeek
Using ADS with Delphi Prism and ASP.NET by Bob Swart
A datawarehouse example using kbmMW by Kim Madsen
Multiplatform Windows CE by Joost van der Sluis
Five Considerations for Choosing an Effective SQL Development Tool
by Scott Walz
September 2010
BLAISE PASCAL MAGAZINE 11
ALL ABOUT DELPHI AND DELPHI PRISM (.Net), LAZARUS & PASCAL AND RELATED LANGUAGES
CONTENTS
Articles
Columns
Editorial page 4
Book reviews by Frans Doove page 5
Advertisers
Advantage Databases page 3
Barnsten page 83
Cary Jensen
Book Advantage Database Server page 12
Components 4 Developers page 84
Editor in chief
Detlef D. Overbeek, Netherlands
Tel.: +31 (0)30 68.76.981 / Mobile: +31 (0)6 21.23.62.68
News and Press Releases
email only to editor@blaisepascal.eu
Authors
B Peter Bijlsma,
C Marco Cantù,
D David Dirkse, Frans Doove,
G Primož Gabrijelčič,
H Fikret Hasović,
N Jeremy North,
O Tim Opsteeg,
P Herman Peeren,
R Michael Rozlog,
S Henk Schreij, Rik Smit, Bob Swart,
Editors
Rob van den Bogert, W. (Wim) van Ingen Schenau,
Miguel van de Laar, M.J. (Marco) Roessen.
Correctors
Howard Page-Clark, James D. Duff
Translations
M. L. E. J.M. (Miguel) van de Laar,
Kenneth Cox (Official Translator)
Copyright See the notice at the bottom of this page.
Trademarks All trademarks used are acknowledged as the
property of their respective owners.
Caveat Whilst we endeavour to ensure that what is published
in the magazine is correct, we cannot accept responsibility for
any errors or omissions. If you notice something which may be
incorrect, please contact the Editor and we will publish a correction
where relevant.
Copyright notice
All material published in Blaise Pascal is copyright SOPP Stichting Ondersteuning Programmeertaal Pascal unless otherwise noted and may not be copied, distributed or
republished without written permission. Authors agree that code associated with their articles will be made available to subscribers after publication by placing it on the
website of the PGG for download, and that articles and code will be placed on distributable data storage media. Use of program listings by subscribers for research and
study purposes is allowed, but not for commercial purposes. Commercial use of program listings and code is prohibited without the written permission of the author.
Page 2
COMPONENTS
DEVELOPERS
Page 4
M. van Canneyt,
M. Gärtner, S. Heinig,
F. Monteiro de Carvalho,
I. Ouedraogo.
Chapter 6
is a lengthy (170 pages) discussion of
Lazarus class libraries (the LCL)
Chapter 7
discusses how to port Delphi components to Lazarus.
Chapter 8
covers databases, data handling and extra tools.
Chapter 9
covers programming with graphics and
Chapter 10
is about Processes and Threads.
In Chapter 11
network programming is discussed and Chapter 12 covers
programming for databases and the data-specific tools included
with Lazarus. By Chapter 8 the reader will have learned enough
to be able to handle the somewhat more difficult and specialised
subjects that are addressed. Though the book is written by
several authors, the impression is that they were all writing in the
same spirit. The explanation of the programs and software is
crystal clear and instructive: after reading this book, there are
very few questions left unanswered. There are many screenshots
and lots of code examples.
Summary
A beautiful and very instructive book,
its content is complete and appropriate for its intended
audience.
It offers clear logical explanations of all you need to know
about Lazarus.
This is a must-have book for English speaking
programmers who want to start using Lazarus.
COMPONENTS
DEVELOPERS
Page 5
by Swen Heinig
Table 4.1
The standard options for creating new modules and projects in Lazarus (Part 1 of 2)
Page 6
COMPONENTS
DEVELOPERS
Table 4.1 :
The standard options for creating new modules and projects in Lazarus (Part 2 of 2)
In this chapter I will explain only the most important types of project
which you can develop with Lazarus. For most of them a specific
template can be selected in the menu File → New, but for some a
generic template is used.
Image 4.3: Starting a new GUI application in the dialog File → New
Image 4.2: The dialog Project - New Project offers the same project creation
options as the dialog File → New
4.1 GUI APPLICATIONS
Ever since its commercial introduction in the 1980s, the graphical user
interface (GUI) has quickly grown in popularity, and today only
technically savvy users work with console interfaces. The secret in
writing a cross-platform GUI is simply using a cross-platform GUI
library which ensures the portability of the displayed images and text.
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
As soon as this option is selected all the necessary files for this kind of
project are created and the form designer as well as the code editor
windows will be opened. More windows can be added to the project by
choosing the menu File → New Form, and new Pascal source code
units can be added with the menu File → New Unit. The same action
can also be performed by opening the dialog File → New and selecting
the corresponding module or by clicking on the appropriate button in
the Lazarus toolbar in the left corner of the main Lazarus window.
While Lazarus offers project management and very advanced code
completion for all kinds of projects, even console ones, it is for GUI
applications that it really distinguishes itself from other IDEs. The code
in the unit is automatically updated for each new component dropped
on a form, as well as when you change a component's name, so that the
code is automatically synchronized with the GUI editor. By double
clicking an event the code editor is shown with the cursor placed in the
procedure which handles it, and a suitable procedure is added to the
code if none is assigned to the event. Even the end corresponding to
a typed in begin is automatically added, although excessive automatic
coding from the IDE can get annoying so it can be disabled via the
Options menu.
Page 7
by Inoussa Ouedraogo
TCP/IP programming
The Client program
The Server program
Web services
Programming Servers
Programming Clients
Message Logging
Object pooling
Service extensions
Chapter 12 - Database Access
by Michael Van Canneyt
Architectural overview
Database access
Choosing Databases
The Database Desktop: an additional tool
Classes for Database access
The Dataset
The DataModule
Data-aware controls
TDataset descendants
The Database Desktop
The Data Dictionary
Exporting Data
Generating code
Crash course SQL
Reports
Creating reports
The Report-Designer
Index
Index of graphics
Index of tables
Processes
Threads
Page 8
Conclusion:
This is a very interesting web development tool, and well worth your while
to try out. You can find the latest Morfik version included in our
database_special.iso available from the Blaise website.
If you want the database_special.iso on DVD, you will have to order it from the
online Blaise shop.
I.CustomerId
ORDER BY C.LastName
The SQL statement above asks for the customer names and the amounts
due on all invoices which are not yet paid, but whose due date has
passed. This example contains a SELECT, FROM, WHERE, and
ORDER BY clause.
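A complete statement of the kind described here, with assumed table and column names (Customer, Invoice, Amount, Paid and DueDate are illustrative, not taken from the original listing), might look like this:

```sql
-- Customer names and amounts due on unpaid invoices whose due date has passed
-- (table and column names are assumed for illustration)
SELECT C.FirstName, C.LastName, I.Amount
FROM Customer C, Invoice I
WHERE C.Id = I.CustomerId
  AND I.Paid = 0
  AND I.DueDate < CURRENT_DATE
ORDER BY C.LastName
```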
SQL
These are the SQL commands most commonly used in program code.
The other SQL commands are most commonly used in database
maintenance. Most databases offer GUI tools for such maintenance
tasks, and these tools employ these other commands. For further
information about these additional SQL commands you will need to
consult your database manual or make use of the appropriate internet
forums. Commands in SQL are called statements.
SELECT
The SQL statement which is most often used is probably SELECT.
After all, this is the statement used to question a database, which is
why SQL is called a query language.
The SELECT statement typically looks like this:
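As a minimal sketch (the table and column names below are placeholders, reusing names that appear in later examples), the general form is:

```sql
SELECT FirstName, LastName   -- which columns to return
FROM Customer                -- which table to read
WHERE City = 'Naarden'       -- which rows to include
ORDER BY LastName            -- how to sort the result
```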
Page 10
COMPONENTS
DEVELOPERS
(FirstName,
LastName,
To conclude
You can do much more using the basic SQL commands than what I have
been able to cover here in an introductory article. If there is sufficient
interest, I will examine more complex SQL statements in a subsequent
article. If there are questions about SQL don't hesitate to email me
(mvdlaar@gmail.com).
The INSERT INTO-clause contains the name of the table into
which the record must be inserted and (between parentheses) the
fields which will be filled with values. Fields which are not
mentioned will get a default value (usually NULL, i.e. empty). Don't
include auto-incremented fields (or Id fields) in the list of fields,
since they will get new values inserted automatically.
The VALUES clause which follows INSERT INTO contains the
values for the fields, in the same sequence as given in the
INSERT INTO-clause. Dependent upon the database you must
enter textual values either with quotation marks ( ) or
apostrophes ( ' ). Note that the number of fields must be equal
to the number of values.
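Putting these rules together, and reusing the Customer table from the UPDATE and DELETE examples, a sketch of a complete INSERT statement might be (the City column is assumed for illustration):

```sql
-- Id is auto-incremented, so it is not listed;
-- unlisted fields receive their default value (usually NULL)
INSERT INTO Customer
  (FirstName, LastName, City)
VALUES
  ("Peter", "Halsema", "Naarden")
```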
UPDATE
The UPDATE statement is meant to change values of records, as
shown below:
UPDATE Customer
SET Address = "Steenweg",
    HouseNumber = 12,
    City = "Naarden"
WHERE Id = 12
DELETE
Our survey of basic SQL commands is complete with DELETE.
When designing a new database it is important to know the
difference between logically or physically deleting records from a
table. When you delete records only logically, they remain
physically present in the database, but marked with a 'removed'
status.
Take care that logically deleted records don't show up in queries
(simply add to all SELECT statements a condition to exclude
removed or inactive records). Logical removal is particularly
recommended in situations where inexperienced users could delete
enormous amounts of data, or if logging takes place.
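A logical delete and a matching filtered SELECT can be sketched as follows (the Removed status column is an assumed name, not taken from the article):

```sql
-- Mark the record as removed instead of physically deleting it
UPDATE Customer
SET Removed = 1
WHERE Id = 12

-- Every SELECT statement then excludes logically removed records
SELECT FirstName, LastName
FROM Customer
WHERE Removed = 0
```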
If you choose to physically delete a record, the database will no
longer hold the deleted data anywhere.
In SQL this is done by the DELETE statement:
DELETE FROM Customer
WHERE FirstName = "Peter"
  AND LastName = "Halsema"
Page 11
At that time I didn't know third party components like Zeos etc..., so I
wrote my own Postgresql driver and used it for several years.
Later, when I found Zeos - a nice surprise - I immediately started to use
it. So it went on until 2004, when there were rumors that Kylix was dead;
no news from Borland, just silence... Yes, it was dead: shame on you
Borland, not because you put Kylix in its grave, but because you cheated
your customers.
For years, then, we were fighting with Borland products. (In the
meantime Kylix could not run on any distribution based on a glibc
higher than 2.4.X.) Then I saw that someone had started a Qt widgetset
in the Lazarus project, and that guy was Felipe. Thanks also to Den Jean
for the Qt C bindings, because without C bindings we could not have a
Qt widgetset inside Lazarus.
I had looked into Lazarus just a few times before, but I was not
attracted to it previously, because it supported only the Gtk1 widgetset,
which looked awful compared to the Qt2 used by Kylix. So now I was
motivated to download the Lazarus trunk and find out how it worked
with Qt (I had already tried Gtk before).
Page 13
Page 14
Conclusion:
Lazarus is ready for commercial usage especially for people with
legacy Kylix3 / Delphi7 codebases.
My personal opinion is that Lazarus Qt is much better than
K3/D7 at this time (0.9.29 trunk), and developers will be happy
with its new 0.9.30 version.
Why?
The only OOP RAD which supports so many platforms.
Constantly developed by volunteers, it does not depend on
commercial decisions, so you can avoid bankruptcy etc.
Costs almost nothing except energy and time.
If it doesn't fit your needs, you can change it and contribute.
If there's a bug you can fix it and contribute the fix, or at least you
can open an issue in the Lazarus Mantis issue tracker.
Page 15
JSON support has been introduced in Delphi 2010 as a part of the
DBExpress database driver architecture, but of course JSON
support is not limited to just database applications. JSON is
similar to XML as both are text-based data interchange formats.
Delphi 6 introduced the TXMLDocument component and the
XML Data Binding Wizard to make it easier to work with XML
documents in code.
In this paper I'm presenting a similar component for JSON called
TJSONDocument. The next step was to implement a TJSONTreeView
component for displaying the content of a TJSONDocument
in VCL Forms applications.
Using these components a simple JSON Viewer application has
been created and is described here. In the XML world there are two
categories of parsers: DOM and SAX.
A DOM (Document Object Model) parser reads XML strings and
builds an in-memory representation of them which applications
can traverse and update.
On the other hand SAX is a streaming interface - applications
receive information from XML documents in a continuous
stream, with no backtracking or navigation allowed.
Towards the end of this article I describe the TJSONParser
component that was created as an experimental TJSONDocument
descendant that provides SAX-like event-based processing for
JSON [1].
JSON has become the X in Ajax. It is now the preferred data format
for Ajax applications. The most common way to use JSON is with
XMLHttpRequest. Once a response text is obtained, it can quickly be
converted into a JavaScript data structure and processed by an
application.
JSON's syntax is significantly simpler than XML, so parsing is
more efficient.
JSON doesn't have namespaces. Every object is a namespace: its set
of keys is independent of all other objects, even exclusive of nesting.
JSON uses context to avoid ambiguity, just as programming
languages do.
JSON has no validator. Being well-formed and valid is not the same
as being correct and relevant. Ultimately, every application is
responsible for validating its inputs.
Below is a fragment of sample JSON text, based on the Sample
Konfabulator Widget from [5], used in the later part of this article.
{
"widget": {
"debug": "on",
"window": {
"title": "Sample Konfabulator Widget",
"name": "main_window",
"width": 500,
"height": 500
},
"misc": ["hello", 23, false]
}
}
Why JSON?
JSON has only three simple types (strings, numbers and Booleans)
and two complex types (arrays and objects). A string is a sequence of
zero or more characters wrapped in quotes with backslash escapement,
the same notation used in most programming languages.
A number can be represented as integer, real, or floating point. JSON
does not support octal or hex. It does not have values for NaN or
Infinity. Numbers are not quoted.
A JSON object is an unordered collection of key/value pairs. The keys
are strings and the values are any of the JSON types. A colon separates
the keys from the values, and a comma separates the pairs. The whole
thing is wrapped in curly braces. A JSON array is an ordered collection
of values separated by commas and enclosed in square brackets.
The character encoding of JSON text is always Unicode. UTF-8 is the
only encoding that makes sense on the wire, but UTF-16 and UTF-32
are also permitted. JSON has no version number.
No revisions to the JSON grammar are anticipated.
Page 16
The DBXJSON unit also contains functionality to parse JSON text into
the graph of TJSONValue descendants and to generate JSON text from
the graph of objects in memory. All TJSONAncestor descendants have
the boolean Owned property, which controls the lifetime of JSON
objects in memory.
The TJSONObject class contains a static method ParseJSONValue
that effectively implements JSON parser functionality.
It accepts a string parameter with JSON text and returns a
TJSONValue reference to the root of the graph of TJSONAncestor descendants.
It is also possible to generate JSON text from the in-memory tree of
JSON objects calling the overloaded "ToString" method on any of
TJSONAncestor descendants.
TJSONDocument Component
Before Delphi 2010 introduced the DBXJSON unit, I was trying to
implement JSON parsing functionality manually by coding JSON
railroad diagrams.
With the DBXJSON implementation in place there is little point in
reinventing the wheel; however there is still no design-time support for
JSON.
Everything has to be done in code. Hence the idea of creating a
minimal VCL component wrapper for JSON parser implementation
provided by a TJSONObject.ParseJSONValue class method that
accepts JSON text and returns the object tree representing the
corresponding JSON document structure in memory.
The TJSONDocument component has been implemented inside a
unit named "jsondoc" to mirror the "xmldoc" name of the unit
containing the implementation of TXMLDocument class.
Below is the declaration of the TJSONDocument VCL component:
unit jsondoc;
//
type
TJSONDocument = class(TComponent)
private
FRootValue: TJSONValue;
FJsonText: string;
FOnChange: TNotifyEvent;
procedure SetJsonText(const Value: string);
procedure SetRootValue(const Value: TJSONValue);
protected
procedure FreeRootValue;
procedure DoOnChange; virtual;
public
class function IsSimpleJsonValue(v: TJSONValue):
boolean; inline;
class function UnQuote(s: string): string; inline;
class function StripNonJson(s: string): string; inline;
constructor Create(AOwner: TComponent); override;
destructor Destroy; override;
function ProcessJsonText: boolean;
function IsActive: boolean;
function EstimatedByteSize: integer;
property RootValue: TJSONValue read FRootValue write
SetRootValue;
published
property JsonText: string read FJsonText write
SetJsonText;
property OnChange: TNotifyEvent read FOnChange write
FOnChange;
end;
The full source code of this component and all other source code
described in this paper can be downloaded from [1]. See the
References section at the end of this article.
The TJSONDocument class contains a published JsonText: string
property that can be used to assign JSON text for parsing and a
RootValue: TJSONValue public property that can be used to assign a
TJSONValue reference and generate JSON text.
Assigning to either of these properties causes the other property to be
updated and the OnChange event is fired every time the JSON
stored inside the component is changed.
In this way it is possible for other components of an application to be
notified and refreshed. In this sense the TJSONDocument
component can be used as a JSON parser and generator as described in
the original JSON RFC [2].
The public IsActive: boolean property returns true if the
TJSONDocument component contains valid JSON, or false if it is empty.
function TJSONDocument.IsActive: boolean;
begin
Result := RootValue <> nil;
end;
Page 17
Here we go
I have decided to create my JSON tree view component as a
descendant of the Delphi VCL TTreeView component. A good
Delphi programming practice would be to derive it from
TCustomTreeView instead to be able to decide which inherited
protected members of a class should be declared as published. In
my case I want the end user to have access to the whole of the TTreeView
component's functionality at design-time, so I do not need to hide any
inherited properties.
unit jsontreeview;
type
TJSONTreeView = class(TTreeView)
public
procedure LoadJson;
published
property JSONDocument: TJSONDocument //
property VisibleChildrenCounts: Boolean //
property VisibleByteSizes: Boolean //
end;
Page 18
procedure TJSONTreeView.LoadJson;
var v: TJSONValue; currNode: TTreeNode; i, aCount: integer;
s: string;
begin
ClearAll;
if (JSONDocument <> nil) and JSONDocument.IsActive then
begin
v := JSONDocument.RootValue;
Items.Clear;
if TJSONDocument.IsSimpleJsonValue(v) then
Items.AddChild(nil, TJSONDocument.UnQuote(v.Value))
else if v is TJSONObject then
begin
aCount := TJSONObject(v).Size;
s := '{}';
if VisibleChildrenCounts then
s := s + ' (' + IntToStr(aCount) + ')';
if VisibleByteSizes then
s := s + ' (size: ' + IntToStr(v.EstimatedByteSize)
+ ' bytes)';
currNode := Items.AddChild(nil, s);
for i := 0 to aCount - 1 do
ProcessPair(currNode, TJSONObject(v), i)
end
else if v is TJSONArray then
begin
aCount := TJSONArray(v).Size;
s := '[]';
if VisibleChildrenCounts then
s := s + ' (' + IntToStr(aCount) + ')';
if VisibleByteSizes then
s := s + ' (size: ' + IntToStr(v.EstimatedByteSize)
+ ' bytes)';
currNode := Items.AddChild(nil, s);
for i := 0 to aCount - 1 do
ProcessElement(currNode, TJSONArray(v), i)
end
else
raise EUnknownJsonValueDescendant.Create;
FullExpand;
end;
end;
procedure TJSONTreeView.ProcessPair(currNode: TTreeNode;
obj: TJSONObject; aIndex: integer);
var p: TJSONPair; s: string; n: TTreeNode; i, aCount: integer;
begin
p := obj.Get(aIndex);
s := TJSONDocument.UnQuote(p.JsonString.ToString) + ' : ';
if TJSONDocument.IsSimpleJsonValue(p.JsonValue) then
begin
Items.AddChild(currNode, s + p.JsonValue.ToString);
exit;
end;
if p.JsonValue is TJSONObject then
begin
aCount := TJSONObject(p.JsonValue).Size;
s := s + ' {}';
if VisibleChildrenCounts then
s := s + ' (' + IntToStr(aCount) + ')';
if VisibleByteSizes then
s := s + ' (size: ' + IntToStr(p.EstimatedByteSize) +
' bytes)';
n := Items.AddChild(currNode, s);
for i := 0 to aCount - 1 do
ProcessPair(n, TJSONObject(p.JsonValue), i);
end
else if p.JsonValue is TJSONArray then
begin
aCount := TJSONArray(p.JsonValue).Size;
s := s + ' []';
if VisibleChildrenCounts then
s := s + ' (' + IntToStr(aCount) + ')';
if VisibleByteSizes then
s := s + ' (size: ' + IntToStr(p.EstimatedByteSize) +
' bytes)';
n := Items.AddChild(currNode, s);
for i := 0 to aCount - 1 do
ProcessElement(n, TJSONArray(p.JsonValue), i);
end
else
raise EUnknownJsonValueDescendant.Create;
end;
TJsonParser Component
In a sense the TJSONDocument component can be considered the
implementation of a Document Object Model for JSON. But what
about SAX for JSON? SAX or Simple API for XML presents a
completely different approach to document parsing. Instead of building
an in-memory representation of the document, it just goes through it
and fires events for every syntactical element encountered [7]. It is up to
the application to process the events it is interested in, for example to
find something inside a large document.
Based on the TJSONDocument I have implemented an experimental
TJSONParser component that implements a SAX processing model
for JSON. A bullet-proof SAX parser for JSON should be
implemented from scratch and directly parse JSON text and fire
relevant events. In my case it sits on top of the in-memory
representation of JSON.
The jsonparser unit contains the following enumerated type that lists
different token types that can be found in a JSON text:
type
TJSONTokenKind = (jsNumber, jsString, jsTrue, jsFalse,
jsNull, jsObjectStart, jsObjectEnd, jsArrayStart,
jsArrayEnd, jsPairStart, jsPairEnd);
Page 19
Summary
JSON is currently probably the most important data interchange
format in use. Its simplicity makes it easy to process, and information
encoded in JSON is typically smaller than the same information in XML.
Over the years XML has become a whole family of specifications and it
is not a trivial task to implement a fully compliant XML parser from
scratch.
Delphi 6 was the first commercial IDE on the market to introduce
support for XML SOAP web services. Delphi 6 also introduced the
TXMLDocument component and XML Data Binding Wizard to make
it easier to work with XML.
JSON so far lacks something equivalent to an XML Schema (which
abstracts a cross-platform representation of XML metadata). However
a JSON equivalent is slowly emerging.
On the JSON home page you can find a reference to a draft version of
IETF RFC "A JSON Media Type for Describing the Structure and
Meaning of JSON Documents" [8]. This is still pending feedback but
in future could be a starting point for implementing a Data Binding
Wizard for JSON.
In this article I have described a JSON Viewer application implemented
with Embarcadero Delphi 2010. The source code that accompanies this
paper is organized in the form of two packages for Delphi components
(one runtime and one design-time) and djsonview: a Delphi
VCL Forms application that implements the Delphi JSON Viewer.
References
1. Source code for this article
http://cc.embarcadero.com/item/27788
2. JSON RFC
http://www.ietf.org/rfc/rfc4627.txt
http://www.json.org/fatfree.html
5. JSON Examples
http://www.json.org/example.html
http://pivot.apache.org/demos/json-viewer.html
http://www.megginson.com/downloads/SAX/
2010-11-30
Properties
NexusDB supports everything that can be expected of a modern
database: triggers, transactions, views and stored procedures. The
performance of NexusDB is high. Nexus has a specially designed
memory manager which performs best on multi-core computers. This
makes NexusDB's performance on such computers superior to other
databases. For Nexus Database Systems this Nexus memory manager is
an option. The standard memory manager of Delphi may be replaced by
the Nexus memory manager. ("You add the nxReplacementMemoryManager
unit to the first place in the project uses list, and that's all there is to using our MM;
any beginner should manage it.")
Another advantage is the simple installation procedure of the server. The
installation program is about 5 MB in size and installation proceeds very
quickly. No separate database administrator is needed.
NexusDB may be applied in different ways. Of course, NexusDB may
be used in a Client/Server architecture. Another way is to compile
the database engine into the project. In doing so, your Delphi program is
shipped as a single executable file without the worries of installation
procedures and configuration of a separate NexusDB Server.
This embedded version of NexusDB is even free to use. Download it
free of charge from the website! Besides that, a so-called hybrid
architecture is also possible, where a combination of Client/Server and
embedded mode is used.
Installation
First I will focus on the embedded version of NexusDB. The latest
version of NexusDB is 3.05 and it was released at about the same time
as the latest Delphi XE release. The free embedded version of NexusDB
can be downloaded from
http://www.nexusdb.com/support/index.php?q=FreeEmbedded.
Figure 3: Tool Palette of the NexusDB
It is not possible to supply a full description of all these components in
this introductory article. But they are all explained in detail in the
accompanying user manual. It is a big advantage that data-access
components are included. For example, look at the TnxDatabase,
TnxTable, TnxQuery and TnxStoredProc components. These
components enable the use of standard data-aware controls in Delphi
to implement database applications in the way we are used to.
The installation runs fast and trouble free.
Management Tool
Besides all the components, a so-called Enterprise Manager is included.
The Enterprise Manager is a program to manage Nexus databases. Think of
the construction of tables, execution of SQL
scripts, definition of indexes etc. All examples in the user manual use
the so called Northwind database. After installation, the SQL script to
generate this database may be found at :
C:\Program Files\NexusDB3\Sample Databases\Northwind.sql.
Page 22
Page 23
Page 24
Page 25
You will still have access to all binaries and installers released before
your support expired.
You can still create and distribute your own products with the
versions that you have access to
(we do not revoke usage and distribution rights).
You will NOT get access to product updates released after your
support expired.
You can renew your product support for 1 year after the 1 month
grace period for the normal upgrade price. The 1 year period starts
with the date of purchase. If your product support is expired right
now, we give you an extra extended grace period until October 15th
2010 to renew your support at the renewal price. Thereafter the
1 month grace period applies.
Page 26
Product                  New License   Renewal within or before grace period   Upgrade / Renewal after grace period
NexusDB Developer SRC    AUD 750       AUD 500                                 AUD 650
NexusDB Developer DCU    AUD 500       AUD 300                                 AUD 400
NexusDB Embedded SRC     AUD 350       AUD 200                                 AUD 300
NexusDB ADO Provider     AUD 400       AUD 300                                 AUD 350
NexusDB ODBC Driver      AUD 400       AUD 300                                 AUD 350
NexusDB PHP Connector    AUD 400       AUD 300                                 AUD 350
Nexus Portal Pro         AUD 1250      AUD 790                                 AUD 950
Nexus Portal Std         AUD 700       AUD 520                                 AUD 625
(new)
(new)
(new)
(new)
(new)
(new)
Introduction
Data modeling is a mandatory requirement throughout the
lifetime of a database system: from the initial design of the
software, when the first structure is created and modeled, to later
on, in system updates, when database structure is modified.
There are also situations where manipulation of a database
structure may be a complicated task, such as when there is a need
to take over a production system and work on an undocumented
database, or convert a database from one DBMS to another. There
are several tools related to data modeling on the market: some
DBMS-specific, others generic; some useful for specific and
isolated tasks, others offering a multitude of features (and usually
not very cheap). TMS Data Modeler is a tool that provides
just the essential features for creating and maintaining a
database: it integrates database design, modeling, creation and
maintenance into a single environment, in a simple and intuitive
user interface to manipulate databases efficiently. This article
briefly describes the main features of TMS Data Modeler,
demonstrating how it may be used to create a project and
maintain an existing database.
Data Modeler main features
TMS Data Modeler is a generic tool that allows data modeling independently of the DBMS used, in an easy-to-use interface. The application
allows you to start modeling a database from scratch, as well as import
the structure of an existing database (reverse engineering). It allows you
to generate scripts to create the full database, or upgrade an existing
database with an update script, through its version control system.
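The two kinds of generated script can be sketched as follows (table and column names are assumed; the exact DDL depends on the target DBMS):

```sql
-- Creation script: builds the full database for a new installation
CREATE TABLE Customer (
  Id        INTEGER NOT NULL PRIMARY KEY,
  FirstName VARCHAR(50),
  LastName  VARCHAR(50)
);

-- Update script: upgrades an existing version 1 database to version 2
ALTER TABLE Customer ADD City VARCHAR(50);
```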
In addition, it provides features for conversion from one DBMS to
another, consistency check and visualization of entity-relationship
diagrams, among others. Data Modeler supports several database
management systems, currently: Absolute Database, Firebird 2, MS SQL
Server 2000/2005/2008, MySQL 5.1, NexusDB V3 and Oracle 10g.
by Bruno Fierens
Creating a project in TMS Data Modeler
There are two ways to start a project in Data Modeler: creating a new
project from scratch or importing data dictionary from an existing
database.
Figure 1: Starting
Choosing the "New Project" option, just select the target database and
an empty project will be created. By default Data Modeler provides a
diagram named "Main Diagram". Tables and relationships between
them can be created visually through the diagram. All objects in the
database (apart from tables we can have procedures, views, etc.) can be
accessed, created and edited through the Project Explorer, located on
the left of the screen.
With the "Import from Database" option you need to configure the
connection to the database whose structure will be imported. After
importing the structure, Data Modeler will hold all the database objects:
tables, relationships, triggers, procedures, views, etc. All objects are
listed in the Project Explorer on the left, in their respective categories.
For an overview of the imported structure, you can open the "Main
Diagram" and select "Add all tables" from the context menu.
Figure 3: Versioning
Figure 6: Clicking on "Generate", we have our script ready to update the database from version 1 to version 2, containing all alterations.
Introduction to Delphi Database Development: Part 1
by Cary Jensen
What is a Database?
A database is a mechanism for storing and retrieving data. That's
all, really. In the simplest of worlds, a text document can be a
database. XML is a text format, and many do use it as a database.
A database application, on the other hand, is much more. Most
database applications assist in the collection of data and the manipulation
of that data, and turn that data into information (reports, charts, actions,
and so forth). These applications, however, do require a database, but I'm
starting to get ahead of myself here.
In most cases, the data of a database is structured, which is to say that
it is organized. In this regard, text documents often fall short.
As a result, database developers often rely on something else.
For the sake of brevity, I am going to oversimplify this and say
that most Delphi developers rely on three types of databases: custom
file structures, local file system databases, and remote database servers.
Yes, there are others, but in some respects they are variations (or even
combinations) of one or more of these.
Custom File Structures
A database based on a custom file structure can make use of either
simple text or binary files.
In most cases, these files are highly organized. For example, each
individual piece of data may be separated from other pieces of data by
a particular character, or separator. An example of such a file is a
comma separated values (CSV) file, which is a common text format.
It's not necessary for data to be separated by characters.
Specifically, if you know that each piece of data takes up four bytes in
the file, you can retrieve the individual data values by parsing the file,
snipping off four bytes at a time. You would also store this data in the
file in the same way, writing each value as a four byte chunk.
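As a minimal sketch of this fixed-width approach (the file name is illustrative), an untyped file opened with a one-byte record size lets you snip off four bytes at a time with BlockRead:

```pascal
Program FixedWidthDemo;
Var
  F : File;          // untyped file
  V : LongInt;       // each value occupies exactly four bytes
  N : Integer;       // number of bytes actually read
Begin
  AssignFile (F, 'values.dat');
  Reset (F, 1);                      // open with a record size of 1 byte
  BlockRead (F, V, SizeOf (V), N);   // snip off four bytes
  While N = SizeOf (V) Do
  Begin
    WriteLn (V);                     // process the value
    BlockRead (F, V, SizeOf (V), N);
  End;
  CloseFile (F);
End.
```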
With Delphi, some developers create files that contain a series of
record structures, where by record structures I literally mean Delphi
record types. These files are called typed files, since they are files of a
Delphi type: records, in this instance.
(For a nice introduction to using typed files, see Zarko Gajic's article at
http://delphi.about.com/od/fileio/a/fileof_delphi.htm.)
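As a minimal sketch (the file name and record layout are mine, purely for illustration), a typed file of Delphi records can be written and read back like this:

```pascal
Program TypedFileDemo;
Type
  TPerson = Record
    Name : String[30];
    Age  : Integer;
  End;
Var
  F : File of TPerson;   // a typed file: literally a file of TPerson records
  P : TPerson;
Begin
  AssignFile (F, 'people.dat');
  Rewrite (F);                     // create (or overwrite) the file
  P.Name := 'Alice'; P.Age := 30;
  Write (F, P);                    // writes one fixed-size record
  CloseFile (F);

  Reset (F);                       // reopen for reading
  Read (F, P);                     // reads exactly one record back
  WriteLn (P.Name, ' is ', P.Age);
  CloseFile (F);
End.
```

Because every record has the same size, the runtime can seek directly to record n without scanning the file, which is what makes this approach so fast.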
And these are just a few of the options for working with custom file
structures. One of the advantages of custom file structures is that you
can usually read and write them very fast. In addition, your Delphi
applications that use these files normally rely on nothing more than
Delphi's internal file IO (input/output) capabilities. By contrast, most
of the other database approaches rely on external files, such as client
DLLs. Figure 1 shows a simple diagram that depicts the interaction
between a Delphi application and custom data structures.
In a client/server architecture the processing of data is distributed
across several machines (for example, the workstation on which the client is
running and the server on which the remote database server is running).
As a result, overall processing power is increased, since much of the data
manipulation is handled by the server, which has been optimized for
these types of things, while your client application takes primary
responsibility for displaying the user interface. Figure 3 contains a
diagram that represents the typical client/server architecture involving
Delphi applications.
Local File System Databases
While file server databases offer a variety of benefits over custom file
structures, they have their limits, especially when compared with
feature-rich remote database servers.
I'll focus on just two limits, and these are related to network bandwidth
and database stability.
When two or more clients need to share data from a file server
database, that data must be placed in a network location accessible to all
clients.
When those clients need to read the data, the data must be transferred
across the network. When many clients are reading and writing data,
this can mean that a large amount of data is moving around on the
network.
Actually, it's worse than it sounds.
For example, if your client application is searching for a particular piece
of data, such as the information about a specific person, some data
about all of the people needs to be transferred across the network so
that the client application can read through each person's data,
searching for the one of interest.
In other words, if data about a million people is stored in your file
server database, and your client is searching for one particular person, it
is likely that some data about all one million people will be transferred
across the network, and that is only for one client application (I say some
data, since most databases make use of indexes. I'll discuss indexes in more detail
in the next article in this series).
Consider what happens when twelve different client applications, on
twelve different workstations on the network, are each searching for one
person from the database.
I think you get the picture.
As far as stability goes, file server databases lack centralized control of
the data, and, as a result, are prone to data corruption.
In a file server database, each and every client application can read and
write the data stored in the shared files on the network.
All it takes is for one of these client applications to have a problem
during a write operation (such as being unplugged from the network) and the
database can become corrupt.
This potential for corruption increases in direct proportion to the
number of client applications writing to the database.
Even if there is only one client writing to the database, an error during
a write operation can render the underlying database unusable. (And this
is why backing up your data is so very important. You cannot predict when a
problem like this will be encountered.)
Remote Database Servers
A remote database server is an application that manages your database.
When you write a client application that involves a remote database
server, your client application does not read or write data directly from
files.
Instead, it makes all of its requests for data through the remote
database server. This general architecture is referred to as client/server
architecture.
This distribution of responsibilities produces three primary benefits.
First of all, it distributes the processing of data across several machines.
First Look at Advantage Database Server 10
Scalable
Advantage comes in two basic flavors: the Advantage Database Server
(ADS) and the Advantage Local Server (ALS). ALS is a free, file-server
based technology whose API (application programming interface) is
identical to ADS. ALS permits developers to deploy their Advantage
applications royalty free to clients who do not need the stability and
power of a separate database server. Importantly, as the needs of those
applications deployed with ALS grow over time, those applications can
be almost effortlessly scaled to client/server technology, in many cases
simply by deploying ADS. So long as the client applications are
designed correctly, those applications will begin using ADS the next
time they execute.
New Features and Enhancements in
Advantage 10
Rather than reciting a laundry list of updates, I have organized the
enhancements into the following sections:
Major performance improvements, enhanced notifications, additions to
Advantage SQL, nested transactions, Unicode support, additional 64-bit
clients, added design-time support for Delphi, and side-by-side
installation. For a detailed listing of all of the updates found in
Advantage 10, see the white paper at the following URL:
http://www.sybase.com/files/White_Papers/Advantage_WhatsNewADS10_WP.pdf
Also, TOP queries now support a START AT clause, which permits you to
select a specific number of records beginning from some position in
the result set other than the top. For example, the following query will
return records 11 through 15 from the CUSTOMER table, ordered by last
name.
SELECT TOP 5 START AT 11 * FROM CUSTOMER ORDER BY LastName;
A collection of bitwise SQL operators has also been introduced. These
include AND, OR, and XOR, as well as >> (right-shift) and << (left-shift).
There is also a new SQL scalar function: ISOWEEK, which returns the
ISO 8601 week number for a given date (it is also a new expression
engine function). And some of the SQL scalar functions that were
previously not expression engine functions now are. These include DAY,
DAYOFYEAR, DAYNAME, and MONTHNAME, to name a few. These are in
addition to CHAR2HEX and HEX2CHAR, which are newly added expression
engine functions. Support in the expression engine means indexes can
now be created using these functions, which in turn allows the
Advantage query engine to fully optimize
restrictions that use these scalars.
Finally, there are a number of new system stored procedures and
system variables. The following are just a few of the new system stored
procedures available in Advantage 10:
sp_SetRequestPriority, sp_GetForeignKeyColumns, and
sp_IgnoreTableTransactions.
Nested Transactions
Speaking of nested transactions, Advantage 10 now supports them. In
previous versions of Advantage, code executing in an active transaction
could not attempt to start a transaction without raising an exception.
Side-By-Side Installations
With Advantage 10, it is now possible to run two or more instances of
the Advantage server on the same physical server, even different
versions of Advantage. For example, it is now possible to run
Advantage 9 and Advantage 10 on the same server. This feature is
particularly useful for vertical market developers whose applications
need to support more than one version of the Advantage server.
Conclusion
With the release of Advantage 10, Sybase has once again
confirmed its commitment to this unique and valuable database
server.
In addition to a number of useful additions and enhancements,
Advantage 10 also includes a wide range of performance
improvements that will improve the performance of most client
applications merely by installing this updated server.
Most developers, however, will also want to update their client
applications to benefit from the many enhancements found in
Advantage 10.
From support for Unicode to greatly improved notifications, from
updated SQL syntax to enhanced table features, Advantage 10 has
something for everybody.
Real-time data collection
starter
by Anton Vogelaar
Figure: system architecture. A temperature sensor supplies TInputs
records eight times per second to the I/O server and the M485A server;
the PC connects through RS232/USB drivers. On the PC, dta_collector.exe
uses fbclient.dll, while dta_controller.dpr is split into a presentation
layer (UGUI.pas), a business + communication layer (UDmain.pas,
lib485a.dll) and a persistent layer (UDB.pas, fbclient.dll).
Figure 3:
The dpr file explained
Presentation layer.
The source code of the GUI can be read in UGUI.pas, and the
screenshot shows how the controls are positioned. Control of
this application is handled by a controller instance of TContr, defined
in the business layer. The controller is instantiated in the OnCreate
method of the main form (named GUI) and released in its OnDestroy
method.
In the top-left corner is a SpeedButton with a red/green glyph
indicating the online/offline status. Its OnClick event calls the
GoOnline and GoOffLine private methods of the main form (GUI).
These methods enable and disable the visibility of a panel showing the
measured data and call the Start and Stop methods of the controller.
The form also provides the public method Refresh to fill the visual
controls with numbers obtained from the lower layers.
Figure 4:
The presentation layer and its components
Real-time data collection (continuation 1)
Unit UGUI;
(* ======= Interface ===== *)
Interface
Uses Windows, Messages, SysUtils, Variants, Classes,
     Graphics, Controls, Forms, Dialogs, StdCtrls, ExtCtrls,
     Buttons, ToolWin, ComCtrls, UDomain;
Type TGUI = Class (TForm)
       TBar    : TToolBar;
       BtnGo   : TSpeedButton;
       PnlMain : TPanel;
       LbTemp  : TLabel;
       LbPotm  : TLabel;
       ShTot   : TShape;
       ShTemp  : TShape;
       ShPotm  : TShape;
       Label1  : TLabel;
       Label2  : TLabel;
       Label3  : TLabel;
       LbLog   : TLabel;
       Shape1  : TShape;
       Procedure BtnGoClick (Sender : TObject);
       Procedure FormHide   (Sender : TObject);
       Procedure FormShow   (Sender : TObject);
     Private
       Contr : TContr;
       Procedure GoOnLine;
       Procedure GoOffLine;
     Public
       Procedure Refresh (STemp, SPotm, SLog : String);
     End;
Var
  GUI : TGUI;

Unit UDomain;
Interface
Uses
  ExtCtrls, UDB;
Type
  TContr = Class
    Private
      Timer : TTimer;
      DB    : TDB;
      NLog  : Integer;
      Procedure DoTimer (Sender : TObject);
    Public
      Constructor Create;
      Destructor Destroy; Override;
      Procedure Start;
      Procedure Stop;
    End;
(* =========== Implementation ============== *)
Implementation
Uses
  UGUI;
Real-time data collection (continuation 2)
Business and Communication layer.
Both business and communication functionality is implemented in the
unit UDomain. As Windows is an event-based operating system, the
controller class contains a timer instance firing the OnTimer event
every second. This event is linked to the DoTimer method. In
TContr.DoTimer the procedure M485a_Vars (S) is called.
This procedure resides in the lib485a DLL and returns a string
representation of the Inputs record in hexadecimal format.
The temperature is four hexadecimal characters long, starting at position 3.
ITemp := StrToInt ('$' + Copy (S, 3, 4)); returns the
temperature as an integer in multiples of 1/16 degree Celsius, where the
'$' character forces StrToInt to treat S as a string of
hexadecimal characters. This value is passed to the GUI as a string
created by Format ('%.1f C', [ITemp / 16]) when the method GUI.Refresh is
called. The value is also stored in the database. Since all database
actions are encapsulated in the class TDB, it is sufficient to call DB.Save
(ITemp). Instantiating and releasing the DB object is handled by the
controller's Create and Destroy methods.
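Putting these fragments together, TContr.DoTimer might look like the following sketch (the exact signature of M485a_Vars and the handling of the potentiometer value are assumptions on my part, not taken from the article):

```pascal
Procedure TContr.DoTimer (Sender : TObject);
Var
  S     : String;
  ITemp : Integer;
Begin
  M485a_Vars (S);                            // DLL call: hex image of the Inputs record (assumed signature)
  ITemp := StrToInt ('$' + Copy (S, 3, 4));  // four hex characters from position 3
  GUI.Refresh (Format ('%.1f C', [ITemp / 16]), '', '');  // value is in 1/16 degree units
  DB.Save (ITemp, 0);                        // persist; potentiometer parsing omitted here
End;
```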
Persistence layer.
This layer contains all the functionality required to save the measured
data in an InterBase / Firebird database. Instances of TIbDatabase,
TIbTransaction and TIbSql are used, TIbSql being a lightweight
communication class. Objects of these classes are instantiated in the
TDB.Open method and released in TDB.Close. The actual storage
functionality is implemented in the TDB.Save method and embedded
in a transaction.
Unit UDB;
Interface
Uses
  ExtCtrls, SysUtils, IbDatabase, IbSQL, Classes;
Type
  TDB = Class
    Private
      IbDb  : TIbDatabase;
      IbTr  : TIbTransaction;
      IbSQL : TIbSQL;
    Public
      Procedure Open (DbName : String);
      Procedure Close;
      Procedure Save (ITemp, IPotm : Integer);
    End;
(* =========== Implementation ============== *)
Implementation
(* =========== Public ====================== *)
Procedure TDB.Open (DbName : String);
Begin
  IbDb  := TIBDatabase.Create (Nil);
  IbTr  := TIBTransaction.Create (Nil);
  IbSQL := TIBSQL.Create (Nil);
  IbTr.DefaultDatabase := IbDb;
  With IbDb Do
  Begin
    Params.Add ('user_name=SYSDBA');
    Params.Add ('password=masterkey');
    DatabaseName       := DbName;
    LoginPrompt        := False;
    SQLDialect         := 3;
    DefaultTransaction := IbTr;
    Open;
  End;
  With IbSQL Do
  Begin
    Database := IBDb; Transaction := IBTr;
  End;
End;

Procedure TDB.Close;
Begin
  IbDb.Close;
  FreeAndNil (IbSQL); FreeAndNil (IbTr); FreeAndNil (IbDb);
End;

Procedure TDB.Save (ITemp, IPotm : Integer);
Begin
  IbTr.StartTransaction;
  IbSQL.SQL.Text := Format ('insert into LOG (TEMP, POTM) values (%s, %s)',
                            [IntToStr (ITemp), IntToStr (IPotm)]);
  IbSQL.ExecQuery;
  IbTr.Commit;
End;
(* ============ End ======================== *)
End.
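Using the unit from the controller is then straightforward. A sketch (the database path and sample values are illustrative):

```pascal
Var
  DB : TDB;
Begin
  DB := TDB.Create;
  DB.Open ('localhost:C:\data\log.fdb');  // illustrative connection string
  DB.Save (512, 300);                     // e.g. 512/16 = 32.0 degrees, plus a raw potentiometer value
  DB.Close;
  DB.Free;
End;
```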
Hardware.
M324D40 DelphiCPU
A programmed AtMega324 40-pin dual-in-line controller chip measuring
50 x 17 x 4 mm. This chip contains all the basic computer parts, such as:
- 32 kBytes Flash memory for program storage
- 2 kByte RAM for variables
- 32 general purpose registers
- 1 kByte of EeRom for non-volatile storage
- an AVR CPU
- a hardware multiplier for fast mathematics
- a clock oscillator able to run at up to 20 MHz.
The chip also contains a set of input and output peripherals, such as:
- JTAG interface for real-time debugging
- 3 timer counters with interrupts for a system clock and delays
- 6 PWM channels which can be used for motor speed and direction
control
- 8 analog-to-digital converters (10-bit) with 1x, 10x and 200x amplifiers
for interfacing with analog sensors, voltages and potentiometers
- I2C interface for expanding the number of peripherals
- Watchdog timer for automatic reset when a failure is detected
- Analog comparator for accurate level detection
- 32 programmable digital in- and output lines for lamps, switches,
LCD, TCP/IP etc.
4 kByte of the 32 kByte flash is reserved and preprogrammed with:
- OS: an operating system implementing a system clock and the starting
of servers and user applications
- BIOS: I/O drivers for EeRom and standard hardware
- Hardware test: standard hardware test application
- M485A server: monitor for RAM and EeRom, including a Flash
programmer for user applications
- I/O server: synchronized I/O with RAM records
VE08201 DelphiStamp
This is a miniature (52 x 20 x 20 mm) plug-in version of the
DelphiController described above, with additional resources:
- Flash memory = 128 kByte
- RAM memory = 4 kByte
- EeRom memory = 4 kByte
- crystal oscillator = 11.06 MHz
Documentation.
The tutorial package contains:
- software user's manual including install instructions
- hardware user's manual
- description of hardware test method
- several detailed project descriptions
- compiler manual
- data sheets of used components
- electronic circuit diagrams
- source listing of used drivers
Cross-compiler.
The cross-compiler accepts Pascal code as used in Delphi and Lazarus
(no classes or objects) and converts it to AVR object code, which the
DelphiController will run after uploading. The cross-compiler has a
built-in assembler which can be used for drivers and time-critical
sections of the user application.
IDE.
The use of an IDE (Integrated Development Environment) greatly
enhances the development process by providing features such as code
completion, help, file management etc. Suitable IDEs are provided by
Delphi, Lazarus and FreePascal. Using the IDE and suitable
templates, the controller algorithm can be run in simulation mode, in
which the provided GUI units and templates simulate sensors and
actuators. After debugging, the unmodified source code can be used
with the cross-compiler.
Templates.
To help you learn quickly how to program this hardware, code
templates are provided in units for the typical unvarying code sections
found in simple projects. By supplying templates and library units, the
controller's behaviour can be simulated in your chosen IDE. Templates
are provided for three types of application programs, i.e.
1) Interface application
2) GUI application and
3) Stand-alone application.
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
Interface application.
In this mode the PC is the master and executes the control algorithm
while the DelphiController acts as an interface slave and follows the
commands coming from the PC through its RS232 port. When the PC
stops, the DelphiController will also stop.
In the interface mode the DelphiController contains in RAM an input
and output record. Each field in these records reflects the state of the
sensors and actuators. The IOServer running in the background
synchronizes these records with the physical hardware through the
BIOS drivers. The running M485A monitor server will accept read and
write commands on the input and output records through the RS232
port. To make communication between the PC and the
DelphiController easy, a communication DLL, templates and sample
applications are provided.
GUI application.
In this mode the DelphiController is the master and executes the
control algorithm while the PC retrieves information from the RS232
port to update a GUI or to send commands to the controller. When the
PC stops, the DelphiController will continue. In the GUI mode the
DelphiController contains in RAM an input and output record. Each
field in these records reflects the state of the sensors and actuators. The
IOServer running in the background synchronizes these records with
the physical hardware through the BIOS drivers. The running M485A
monitor server will accept read and write commands on the input and
output records through the RS232 port. To make communication
between the PC and the DelphiController easy, a communication DLL,
templates and sample applications are provided.
Information about sales etc.: www.blaisepascal.eu
About the author
Anton J. Vogelaar is an electronic measurement and control engineer. In
1974 he completed his electronic engineering study at the HTS in
Utrecht. In 1982 he completed a course in engineering science at the
University of Durham, UK. Since 1972 he has been director of Vogelaar
Electronics Netherlands, which specialises in digital control equipment
(air gauging, climate control, industrial control, bio-reactor control etc.).
He also writes software which provides the control, GUI and persistence
aspects required by the hardware his firm produces. The programming
languages he uses include Assembler (AVR, PIC and 8751), Pascal,
Delphi and Java on Windows and Linux (embedded) platforms. He has
more than ten years' experience as a part-time teacher in advanced
technical colleges and gives lectures in Pascal/Delphi. His other
interests are: rowing, steam engines, study, technology and
grandchildren.
Object-Oriented Databases
starter
Figure: a pavement-improvement schema. Pavement Improvement
branches into Maintenance (Routine, Corrective and Preventive),
Rehabilitation and Reconstruction.
Network Model
The network model is a database model conceived as a flexible way of representing
objects and their relationships. Its distinguishing feature is that the schema, viewed as
a graph in which object types are nodes and relationship types are arcs, is not
restricted to being a hierarchy or lattice.
The network model's original inventor was Charles Bachman. - wikipedia
The figure's lower branches cover Flexible Pavement (Crack Seal,
Patching, Asphalt Sealant) and Rigid Pavement (Spall Repair, Joint Seal,
Silicone Sealant).
The popularity of the network data model coincided with the popularity
of the hierarchical data model. Some data is more naturally modelled
with more than one parent per child.
So, the network model permitted the modeling of many-to-many
relationships in data. In 1971, the Conference on Data Systems
Languages (CODASYL) formally defined the network model.
The basic data modelling construct in the network model is the set. A
set consists of an owner record type, a set name, and a member record
type. A member record type can have that role in more than one set,
hence the multiparent concept is supported.
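As an illustrative sketch (the type and field names are mine, not CODASYL syntax), the multiparent idea can be pictured with Pascal records, where one member record participates in two chains, each owned by a different record type:

```pascal
Type
  PEmployee   = ^TEmployee;
  TEmployee   = Record
    Name       : String[20];
    NextInDept : PEmployee;   // chain for a "department owns employee" set
    NextInProj : PEmployee;   // chain for a "project owns employee" set
  End;                        // two chains: one member, two owners (parents)
  TDepartment = Record
    Name          : String[20];
    FirstEmployee : PEmployee; // the owner record points at its first member
  End;
  TProject    = Record
    Name          : String[20];
    FirstEmployee : PEmployee;
  End;
```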
Detlef Overbeek
Figure: the evolution of database ideas. The first generation
(hierarchical and network models) led to the second generation of
relational databases and SQL (SQL 2, 1992); relational ideas then
merged with object orientation and complex data types into SQL 3
(1999).
First, there are data sources such as the Web, which we would like to
treat as databases but which cannot be constrained by a schema.
Second, it may be desirable to have an extremely flexible format for
data exchange between disparate databases.
Third, even when dealing with structured data, it may be helpful to
view it as semi-structured for the purposes of browsing.
Associative Model
The associative model of data is an alternative data model for database systems.
Other data models, such as the relational model and the object data model, are
record-based.
These models involve encompassing attributes about a thing, such as a car, in a
record structure. Such attributes might be registration, colour, make, model, etc.
In the associative model, everything which has discrete independent existence is
modelled as an entity, and relationships between them are modelled as associations.
The granularity at which data is represented is similar to schemes presented by Chen
(Entity-relationship model); Bracchi, Paolini and Pelagatti (Binary Relations); and
Senko (The Entity Set Model).
A number of claims made about the model by Simon Williams, in his book The
Associative Model of Data, distinguish the associative model from more traditional
models. - wikipedia
The associative model divides the real-world things about which data is
to be recorded into two sorts:
Entities are things that have discrete, independent existence.
An entity's existence does not depend on anything else.
Associations are relationships whose existence depends on one or more
other things, such that if any of those things ceases to exist, then the
association itself ceases to exist or becomes meaningless.
An associative database comprises two data structures:
1. A set of items, each of which has a unique identifier,
a name and a type.
2. A set of links, each of which has a unique identifier, together with
the unique identifiers of three other things, that represent the
source, verb and target of a fact that is recorded about the source in
the database.
Each of the three things identified by the source, verb and target
may be either a link or an item.
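These two structures can be sketched as Pascal records (the field names are illustrative, not part of the model's definition):

```pascal
Type
  TItem = Record
    Id   : Integer;     // unique identifier
    Name : String[40];
    Kind : String[20];  // the item's type
  End;
  TLink = Record
    Id     : Integer;   // links have identifiers too, so a link can
    Source : Integer;   // itself be the source or target of another link
    Verb   : Integer;   // each of these three refers to an item or a link
    Target : Integer;
  End;
```

Because Source, Verb and Target each hold the identifier of either an item or another link, a fact about a fact ("Alice bought a car ... in 2009") is just a link whose source is another link.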
The primary trade-off being made in using a semi-structured database model is that
queries cannot be made as efficiently as in a more constrained structure, such as in
the relational model. Typically the records in a semi-structured database are stored
with unique IDs that are referenced with pointers to their location on disk. This
makes navigational or path-based queries quite efficient, but for doing searches over
many records (as is typical in SQL), it is not as efficient because it has to seek
around the disk following pointers.
The Object Exchange Model (OEM) is one standard to express semi-structured
data, another way is XML. - wikipedia
In the semi-structured data model, the information that is normally
associated with a schema is contained within the data, which is
sometimes called "self-describing". In such a database there is no clear
separation between the data and the schema, and the degree to which it
is structured depends on the application.
All pointers that belong to a particular static pointer type point to the
same Class (albeit, possibly, to different Objects).
In this case, the Class name is an integral part of that pointer type.
A dynamic pointer type describes pointers that may refer to different
Classes. The Class, which may be linked through a pointer, can reside
on the same or any other computer on the local area network.
There is no hierarchy between Classes and the pointer can link to any
Class, including its own.
In contrast to pure object-oriented databases, context databases are not
so tightly coupled to the programming language and do not support
methods directly. Instead, method invocation is partially supported
through the concept of VIRTUAL fields.
A VIRTUAL field is like a regular field: it can be read or written.
However, this field is not physically stored in the database, and it
does not have a type described in the schema. A read operation on a
virtual field is intercepted by the DBMS, which invokes a method
associated with the field, and the result produced by that method is
returned. If no method is defined for the virtual field, the field will be
blank. The method is a subroutine written in C++ by an
application programmer. Similarly, a write operation on a virtual field
invokes an appropriate method, which can change the value of the
field. The current value of virtual fields is maintained by a run-time
process; it is not preserved between sessions. In object-oriented terms,
virtual fields represent just two public methods: reading and writing.
Experience shows, however, that this is often enough in practical
applications. From the DBMS point of view, virtual fields provide a
transparent interface to such methods via an application written by the
application programmer.
A context database that has no composite or pointer fields and no
Properties is essentially RELATIONAL. With static composite and
pointer fields, a context database becomes OBJECT-ORIENTED. If the
context database has only Properties, it is an ENTITY-ATTRIBUTE-VALUE
database. With dynamic composite fields, a context database becomes
what is now known as a SEMI-STRUCTURED database. And if the
database has all the available types, it is a ConteXt database!
For more information see: Concepts of the ConteXt database.
The current version of the ConteXt DBMS can be downloaded from the
UnixSpace Download Center:
http://www.unixspace.com/download/index.html
Figure 10: Subclasses can inherit attributes from one or more superclasses
Figure: object modelling in Caché.
Account Rep: Account Rep is a fairly complex object that exists
independently from Customer. In this example, Customer includes a
reference to the appropriate Account Rep.
Address: objects can be embedded within classes. In this example,
Address is an object that contains the properties Street and City.
Invoice: references can be made to more than one instance of a class,
thus creating a collection. A collection can be thought of as a
one-to-many relationship. Caché also supports other types of
relationships.
Creating Objects
Classes are rapidly created and edited with the Cach Studio.
The Studio is an integrated development environment (IDE) where
developers can perform all of their application development tasks.
For data modelling, this includes specifying properties, coding and
debugging object methods, and defining specialized data types.
The support for advanced object concepts, simple and multiple
inheritance, embedded objects, references to objects, collections,
relationships, and polymorphism - make the Studio a powerful and
productive environment for modelling data and business processes.
The Studio includes a wizard for the easy creation of Cach classes, but
there are several other ways to input and export class definitions to and
from the Studio.
The Cach RoseLink allows classes defined using Rational Software's
popular Rose object modeling tool to be imported into Cach. Similarly,
class definitions can be exported to Rose for use within Rose's
modelling environment.
Cach can also create objects from relational DDL files. The resulting
object classes will be very simple: their properties will be single-valued
system-defined data types that correspond to the relational table's fields,
and their only methods will be those persistence methods required to
move the data to and from the disk.
However, thanks to Caché's Unified Data Architecture, even these
simple classes are immediately available for use with object
programming languages, and they may be used as building blocks to
create more complex data models.
XML provides another way to transport class definitions from one
application to another. Class definitions can be exported/imported as
XML documents.
One command can project Caché classes as Java classes. Caché also
provides a class library that allows Java programmers to access Caché
objects in the Caché database.
EJBs
EJB projections can also be created with one click from within Caché
Studio. Caché allows developers to take advantage of the speed of
Bean-Managed Persistence, without having to do lots of tedious coding
to map between Java classes and relational tables.
Caché supports BEA's WebLogic application server.
Scripting Languages
Methods in Caché objects are coded using either (or both) Caché
ObjectScript or Caché Basic. Both languages allow developers to use all
of Caché's data access modes - Objects, SQL, and Multidimensional -
within the same routine.
Report Builder:
+ Fully integrated with the Delphi IDE
+ Visually create a report layout
+ 21 different components to present the data
+ Speed
+ Runtime Pascal Environment (RAP). Object Pascal with event
handling integrated to create complex reports
+ Good documentation
+ Source code is included in all editions
- RAP is only available with the Enterprise and Server editions
- Relatively expensive. The version with the end-user layout editor
integrated (Professional edition) costs $495.
- No .NET version available
Crystal Reports:
+ Has a lot of features
+ Can display data from many different databases
- Integration with Delphi can be a problem
- Separate licenses needed for commercial applications
- Expensive, license fees start at €479
Rave reports:
+ Free. Rave BE (Bundled Edition) is installed automatically when
installing Delphi 7, 8, 2005, 2006, 2007, 2009, 2010 and XE.
+ Fully integrated with the Delphi IDE
- Many users report problems with the Delphi 2009 edition.
- Very hard to make contact or get support.
- Website hasn't been updated for a long time and there is no
information about the versions they sell.
- No .NET version available
Quick Report:
+ Quick Report (version 5.05) is available for Delphi
5/6/7/2005/2006/2007/2009/2010/XE (Win32 mode).
Version 5.05 for Delphi XE is now available for download,
and a C++ Builder XE version is in the works.
+ Fully integrated with the Delphi IDE
+/- Pro version €345; the upgrade price is 25% of a new license.
+ End-user report editor and PDF export (Pro version)
- Quick Report appears to contain a couple of nasty bugs
- No .NET version
Fastreport:
+ Fully integrated with the Delphi IDE
+ Improved engine:
improved shift mechanism / duplicated combining / new aggregates;
improved cross object / changes in XML format (write collections in XML);
improved report inheritance / hierarchy / watermarks / object fill;
improved linear barcodes / improved interactive reports;
OnMouseEnter / OnMouseLeave events;
detailed reports / multi-tab preview for detailed reports
+ Well documented
+ Create a layout visually from the Delphi IDE or Visual Studio
+ A lot of different components to present the data
+ New objects: 2D barcodes (DataMatrix and PDF417);
Table object / cellular text / zip code
+ A lot of export filters:
PDF, RTF, XLS, XML, HTML, JPG, BMP, GIF, TIFF, TXT, CSV,
Open Document Format (you can even build your own export filters!)
New exports: BIFF XLS / PPTX / XLSX / DOCX
+ Built-in script engine for PascalScript, C++Script, BasicScript,
JScript with debugger (Win32 version). The .NET version currently
uses C# and VB.NET for scripting.
+ Versions available for Delphi 4 through XE and .NET (integrated with
Visual Studio: Delphi Prism, C#, VB.NET, etc.)
+ Built-in end-user report editor, starting with the Standard edition,
without any extra license fees
+ Source available (starting at Professional Edition)
+ Web reports (Enterprise edition)
+ Licenses from $79 (Basic) to $349 (Enterprise)
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
Fastreport in practice
In the meantime we have added reporting capabilities to a couple of
our applications. Below you will find a list of some of the features
we have used or will be using in the future.
You can download a trial version of Fastreport from the Fastreports
website (http://www.fast-report.com/en/download/fast-report4-download.html),
so you can explore the possibilities of their products.
The Fastreport trial has two limitations: only 5 pages of the report can
be printed or exported, and a nag message is displayed if the report has
a script. On this download page you can also download a demo
application and the full documentation; this is very handy if you want
to try it for the first time.
After installing Fastreport 4 VCL there will be (depending on the
version you installed) 2, 3 or 4 new groups of components in the Tool
Palette window (figure 2). The ones most used are in the Fastreport 4.0
group. Below you will find a short guide to get started quickly.
The first steps
Creating a report starts by adding a TfrxReport component to your form.
Double click the frxReport1 icon. This will start the layout editor. By
default there are 3 tabs in the editor: the 'Code' and 'Data' pages and
the first page of your report (figure 3).
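Once a layout has been designed and saved, showing it at runtime takes only a couple of calls. A minimal sketch (the file name Invoice.fr3 and the button handler are just examples; frxReport1 is the component added above):

```pascal
uses frxClass; // FastReport 4 VCL core unit

procedure TForm1.ButtonPreviewClick(Sender: TObject);
begin
  // Load a layout saved earlier from the designer...
  frxReport1.LoadFromFile('Invoice.fr3');
  // ...then prepare it and open the preview window.
  frxReport1.ShowReport;
end;
```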
uses StrUtils;

procedure TForm1.FormCreate(Sender: TObject);
begin
  frxReport1.Script.AddMethod(
    'function SecToHHMM(Seconds: Extended): String',
    CallFrxMethod, 'Custom');
end;

function TForm1.SecToTime(Seconds: Extended): TTime;
begin
  result := Seconds / (24 * 3600);
end;

function TForm1.CallFrxMethod(Instance: TObject;
  ClassType: TClass; const MethodName: String;
  var Params: Variant): Variant;
begin
  if (MethodName = 'SECTOHHMM') then
  begin
    if (Params[0] >= 0) then
    begin
      result := FormatDateTime('hh:nn', SecToTime(Params[0]));
      result := AnsiReplaceText(result, ':', 'h') + 'm';
    end
    else
      result := '--h--m';
  end;
  // Other methods
end;
Figure 5:
The newly added function
You can find your newly added function in the 'Functions' tab.
You can now use this function by dragging the variable you want
to be shown as hours and minutes to your report. Double click
the component; a property editor will pop up. Modify the
variable name to the following expression:
[SecToHHMM(<VariableName>)]
This procedure of adding your own functions is only available in
Fastreport VCL. Fastreport .NET uses a different, even simpler,
approach. First create an assembly with the functions you will
need and add this assembly to the 'Assembly' property of the
report. You will be able to use your own functions right away. You
can also use the functions already defined by the assemblies of
the .NET framework.
Figure 6: Inheritance
Report inheritance
A lot of companies have a default report layout with their name,
address, bank account, logo, etc. Adding these same elements over and
over again when you create a new report layout is a boring and
error-prone task.
The creators of Fastreport have found a solution for this problem:
'report inheritance'. You create the base report with the company info
once and use this report as a base for all 'descendant' reports. You can
set the base report in the 'Report Settings' dialog (figure 6).
If you have to change, for instance, the address or bank account of the
company, you only have to change the base report.
All descendant reports will also be changed automatically.
You can use all elements of the base report from your descendant
report and change properties (e.g. font, size, color) without changing
the base report.
There are however some restrictions on inheriting reports:
- The base report cannot contain any code. Fastreport will not
warn you, but the inherited report will disregard all code
from the base report.
- You cannot inherit from an inherited report.
- You cannot use the same component names in both the base
and inherited reports. If you don't give your components unique
names, adding new components to your base report may result in the
inherited report having components with the same names, leaving you
unable to use your inherited report.
using System;
using System.Collections;
using System.Collections.Generic;
using System.ComponentModel;
using System.Windows.Forms;
using System.Drawing;
using System.Data;
using FastReport;
using FastReport.Data;
using FastReport.Dialog;
using FastReport.Barcode;
using FastReport.Table;
using FastReport.Utils;

namespace FastReport
{
  public class ReportScript
  {
    public string IIF(bool condition,
      string trueValue, string falseValue)
    {
      return condition ? trueValue : falseValue;
    }
  }
}
An overview of reports
Fastreport .NET
Fastreport VCL
The creators of Fastreport have succeeded in creating a report
generator that can be used in both Win32 and .NET applications. The
VCL and .NET versions have a lot in common: the same kind of editor,
the same kind of components, etc.
But there are also a lot of differences. These differences are logical:
the two frameworks on which the report generators are based, the VCL
and the .NET framework, are themselves quite different.
The .NET version has had a complete makeover: new classes and a new,
modern editor look. We have used the VCL version for some time now,
but we still have to read the manual now and then to find out how
things work in the .NET version.
Lazarus
LazReport is not compatible with FastReport yet, because LazReport is
based on FreeReport (a very old version: FastReport 2.3).
For example, current FastReport files use an XML format, whereas that
second version used a binary format.
In contact with Michael Philippenko of Fastreport, he told us that as
soon as the special trial component for all purposes has been developed
(the Lazarus team and Blaise Pascal Magazine are working on that), they
will consider building a version for Lazarus.
Conclusion
This article shows some of the possibilities of Fastreport, not all.
We will build example applications and use them for publication
in the next issue. Up till now we haven't run into problems we
could not solve; sometimes we had to consult the manual or search
the support forum to find the solution. We strongly recommend
Fastreport for its great quality, its relatively low price, its
enormous number of features and its expandability.
FastReport VCL 5
COMING SOON
Figure 1.
Note the ID field, which is the primary key of type autoinc. This table is saved as Registration.adt, and can now be
used by the application we'll make with Delphi Prism. But before we continue, we must make sure that apart from
the Advantage Database Server, we've also installed the Advantage .NET Data Provider (so we can use ADO.NET
and ASP.NET with declarative data binding to connect to the Advantage database).
Delphi Prism XE
Delphi Prism XE is the most recent edition of Delphi Prism at the time of writing (people with a subscription
received no less than two major updates in the last year: first from Delphi Prism 2010 to Delphi Prism 2011, and
then a few weeks ago Delphi Prism XE). It can run in both Visual Studio 2008 and 2010, but for this article I'm
using Delphi Prism XE in Visual Studio 2010, together with ADS v10 as mentioned before.
Using Delphi Prism XE, we can create ASP.NET Web Projects, with File | New Project, using the dialog from the
following screenshot:
Figure 2.
For the purpose of this demo, I'll give the project the name
EventRegistration. The ASP.NET project will consist of one page,
Default.aspx, where we should start by placing a FormView control from
the Data category of the Toolbox. The FormView has a number of tasks,
including one to Choose the Data Source. Since there is no Data Source
on the page yet, we should select the <New data source> option instead:
Figure 3.
This will produce the Visual Studio Data Source Configuration Wizard, where we can specify where the application
will get its data from.
In our case, that's from a SQL Database, so click on
the SQL Database item. This will automatically
generate a default ID for the data source
(SqlDataSource1) and place this ID in the textbox
so we can modify it if needed. Click on OK to go to
the next page of the wizard.
In the second page, we can choose the data
connection, either from a list of existing
connections, or by clicking on the New Connection
button.
If you click on the New Connection button, a
dialog will pop up in which we can choose the data
source. Here, we can select the type of data source
from a list that contains Advantage Database Server
(if you've installed the Advantage .NET Data
Provider), as well as, for example, DataSnap,
InterBase, and several Microsoft drivers.
Figure 4.
Figure 5.
If you click on the Continue button, a new dialog follows where we can
specify the details needed to connect to the ADS Data Dictionary.
Unless you've specified a username and password to access the Data
Dictionary, this usually only means that we have to specify the
location of the .add file.
Figure 6.
Click on Test Connection to ensure that we can connect to the Data
Dictionary. Click on OK if everything works, and back in the Configure
Data Source wizard, we can click on OK to get to the next page, where
the option is offered to save the connection in the web.config file.
This is handy, since it means we can modify the connection string
without having to recompile the application.
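The saved entry ends up in the connectionStrings section of web.config and looks roughly like this (the connection name and .add path are examples; Advantage.Data.Provider is the provider name registered by the Advantage .NET Data Provider):

```xml
<connectionStrings>
  <!-- Example only: point the Data Source at your own .add file -->
  <add name="EventConnectionString"
       connectionString="Data Source=C:\Data\Event.add; ServerType=REMOTE"
       providerName="Advantage.Data.Provider" />
</connectionStrings>
```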
Figure 7.
The next page allows us to build the way we want to retrieve the data from the database. We can use a SQL
statement or stored procedure, or use the dialog to select a table (there is only one: Registration) and specify the
fields that we want to use in a SELECT statement for example. Note that by default the wizard will check the *
for all fields, but I prefer to explicitly check the fields I need instead.
Figure 8.
Figure 9.
This will ensure that we can use the FormView in INSERT mode to enter new registrations.
After we close the dialog, we can click on OK to get to the last page of the wizard, and close that one as well.
The result is that we now have a FormView with a newly configured
SqlDataSource component that connects to the Registration table from
the Event Data Dictionary.
ASP.NET Page
We can now configure the ASP.NET page, and especially the FormView to show itself in INSERT mode only. This can
be done by selecting the FormView, and in the Properties Inspector making sure DefaultMode is set to Insert, with the
following result:
Figure 10.
Now we can give the application a test run, by selecting Debug | Start Without Debugging, which will start the ASP.NET
Development Server as well as the default browser, showing the registration application in action:
Figure 11.
Obviously, there are some issues with this page. First of all, the ID
field is of type autoinc, so it shouldn't be part of the input screen.
And second, after we click on the Insert hyperlink, the page doesn't
jump to a Thank you! page (something we didn't implement yet), but
gives an error message instead:
Figure 12.
This problem is caused by the fact that the declarative data binding in the generated
.aspx file is using positional parameters in the INSERT statements, but named
parameters in the list of parameters that follows it. In detail, the InsertCommand is
specified as follows:
InsertCommand = "INSERT INTO [Registration] ([FirstName], [LastName],
[Address], [Postcode], [City], [Country], [Company], [Email], [Phone], [ADS],
[ID]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"
<asp:Parameter Name="FirstName" Type="String" />
<asp:Parameter Name="LastName" Type="String" />
<asp:Parameter Name="Address" Type="String" />
<asp:Parameter Name="Postcode" Type="String" />
<asp:Parameter Name="City" Type="String" />
<asp:Parameter Name="Country" Type="String" />
<asp:Parameter Name="Company" Type="String" />
<asp:Parameter Name="Email" Type="String" />
<asp:Parameter Name="Phone" Type="String" />
<asp:Parameter Name="ADS" Type="String" />
<asp:Parameter Name="ID" Type="Int32" />
We should change the ? in the INSERT statement to :fieldname items,
turning it into the following (also removing the ID field):
InsertCommand = "INSERT INTO [Registration] ([FirstName], [LastName],
[Address], [Postcode], [City], [Country], [Company], [Email], [Phone],
[ADS]) VALUES (:FirstName, :LastName, :Address, :Postcode, :City,
:Country, :Company, :Email, :Phone, :ADS)"
method _Default.FormView1_ItemInserted(sender: Object;
  e: FormViewInsertedEventArgs); // generated handler; your name may differ
begin
  if Assigned(e.Exception) then
  begin
    // handle error?
  end
  else
  begin
    Response.Write('<h1>Thank you for your registration!</h1>');
  end;
end;
Figure 1:
Figure 5:
The first page of the wizard is shown. Select the Query
service/kbmMW_1.0 service type and click Next:
Figure 2:
Set the port number to 3000 and the mask to 0.0.0.0.
Figure 6:
Now we will be given the option to choose what type of database we
would like to access. In this example we choose to use SQLite which is
supported by all kbmMW Editions. Then click Next.
Figure 3:
Then we add the query service via the service wizard:
Figure 4:
Locate the Components4Developers Wizards and click the kbmMW
Service Wizard:
Figure 7:
We will then be given the opportunity to name the service, and
optionally give it a version. Versioning a service can be useful if, at
some point, the service interface changes, and we want to support both
older and newer clients. Let's name the service DATAWAREHOUSE and
keep the version empty. Then click Next.
Figure 8: Now click through all the remaining wizard pages, and click the OK button on the last page.
Figure 9: This generates a new datamodule for us, with a couple of components on it:
The data module will be used by clients making database requests
to the application server. Because we want to access a large
number of SQLite databases, we have not defined a SQLite
connection pool (which in most applications would be an expected step)
on the application server's main form (we deselected that option
in the wizard). Instead we will make our own list of known
SQLite databases, from which we will select one or more as
needed.
The next step is to register this service with the application server
(Server component) on the main form. The main form's OnCreate
event is a suitable place to do that, as it only needs to be
registered once.
Figure 10:
The method first ensures that all access to the contents of DBs is
protected so only one thread at a time can access it. Then we look up a
connection pool based on the database name. If none has been found,
we create a new one, and add it 'managed' to the DBs storage. If the
database the client is requesting actually doesn't exist at all, this method
will throw an exception. Other ways to indicate the issue to the client
could also be coded.
Then we define a so-called virtual dataset on the query service. Right
now we have a TkbmMWSQLiteQuery component there, and that will
be used for the query against a specific database. However we would
like to interact with the client in such a way that we don't just send
the complete result (containing the combined matching records for all
client-specified databases) in one go to the client. Instead we would
like to send incremental resultsets. In this sample we choose to send
all matching records from one database to the client as one
incremental resultset.
On the queryservice datamodule (unit2.pas) we put a
TkbmMWMDQuery and a TkbmMWMDConnectionPool.
We add the code required to create the instance upon form creation and
destroy it at form destruction. We specify that objects 'managed' by
the hashlist are also deleted automatically by the hashlist when
entries are deleted from the list or the list itself is freed.
procedure TForm1.FormCreate(Sender: TObject);
begin
  DBs := TkbmMWThreadHashStringList.Create(100);
  DBs.FreeObjectsOnDestroy := true;
  Server.RegisterService(TkbmMWQueryService2, false);
end;

procedure TForm1.FormDestroy(Sender: TObject);
begin
  FreeAndNil(DBs);
end;
Figure 11:
We rename the kbmMWMDQuery1 to 'DATA', and set its published
property to true and its connection pool to point to the
kbmMWMDConnectionPool1 component. That way clients can
request data from this particular component. The virtual memory
dataset (hence the MD acronym) provides some interesting events for
us, namely the OnPerformFieldDefs and OnPerformQuery events.
The OnPerformFieldDefs event allows us to define field definitions at
runtime, on the fly. We could also have defined them at designtime for
this demo, because we know the structure of the SQLite table DATA
doesn't change for the different databases served by this application
server. However, we'll define the definitions in code. It's also
possible to define parameters in the same method, but since we need a
way for the client query to provide information about the database
name and some search criteria, we will define parameters for that at
designtime.
Figure 14:
Finally we need to add some code to do the actual search and return
the incremental data. This code should be put in the OnPerformQuery
event.
The first part of the PerformQuery event extracts the parameter
values provided by the client.
Figure 12:
Three parameters have been created:
1. Database, ftString, ptInput, Size 100
2. ConditionLow, ftFloat, ptInput
3. ConditionHigh, ftFloat, ptInput
The purpose of these parameters is to allow the client to indicate the
database number, and some conditions we may choose to use for the
selection from a database. By convention we define that Database may
contain one or more comma-separated numbers indicating the database
numbers for which a result is requested.
Then we define the SQL statement which will query a single database.
That can be done at runtime or designtime. Since the SQL is the same
regardless which database is queried, we define it at designtime.
procedure TkbmMWQueryService2.DATAPerformQuery(
  Sender: TObject; var ACanCache,
  ACallerMustFree: Boolean; var ADataset: TDataSet);
var
  i: integer;
  sDBID: string;
  ds: TkbmMWMDQuery;
  sDatabase: string;
  slDatabase: TStringList;
  conditionLow, conditionHigh: double;
  cp: TkbmMWSQLiteConnectionPool;
  bFirst: boolean;
  mt: TkbmMemTable;
begin
  ds := TkbmMWMDQuery(Sender);
  // Get parameter values.
  sDatabase := ds.ParamByName['Database'].AsString;
  conditionLow := ds.ParamByName['conditionLow'].AsFloat;
  conditionHigh := ds.ParamByName['conditionHigh'].AsFloat;
The final part of the event closes the native query, detaches the
connection pool from it, and generates a final (empty) dataset to
return. Because we don't know whether we are in fact processing the
last dataset (database) at any point in the loop, we send this empty
dataset to indicate that it is the last one (we may encounter an empty
or non-existent dataset anywhere in the loop). We also tell the system
that it is responsible for getting rid of our temporary TkbmMemTable
when it's done with it.
        bFirst := false;
      end;
    finally
      kbmMWSQLiteQuery1.Close;
      kbmMWSQLiteQuery1.ConnectionPool := nil;
    end;
    except
      // Do nothing.
    end;
  end;
  finally
    slDatabase.Free;
  end;

  // If no data collected, complain.
  if bFirst then
    raise Exception.Create('No matching slides were found.');

  // Now prepare a final (empty) dataset package.
  mt := TkbmMemTable.Create(nil);
  mt.CreateTableAs(ds, [mtcpoStructure]);
  mt.Open;
  ADataset := mt;
  ACallerMustFree := true;
end;
Finally we need to tell the query service that clients are allowed to
access its published query components. That is done via the
AllowClientNamedStatement property on the query service data module.
The client
Now all that's left is to build a client that can talk to the application
server. We start out by creating a new VCL Forms application for the
client. We add several components:
- TkbmMWTCPIPIndyClientMessagingTransport
- 2 x TkbmMWMemoryMessageQueue
- TkbmMWClientConnectionPool
- TkbmMWClientQuery
- TkbmMWBinaryStreamFormat
And a datasource, a data-aware grid and a couple of buttons.
Figure 15:
// Lets get the parameters from it, so we can fill them out with
// relevant query information.
kbmMWClientQuery1.FetchDefinitions;
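After fetching the definitions, filling out the parameters and opening the query could look like this (a sketch; the component name comes from Figure 15, the parameter names from the service definition earlier, and the values are only examples):

```pascal
// Ask the DATA virtual dataset on the server for databases 1-3,
// restricted to the condition range 0.5 .. 1.5.
kbmMWClientQuery1.ParamByName['Database'].AsString := '1,2,3';
kbmMWClientQuery1.ParamByName['ConditionLow'].AsFloat := 0.5;
kbmMWClientQuery1.ParamByName['ConditionHigh'].AsFloat := 1.5;
// Opening the query fetches the incremental resultsets; the grid
// attached via the datasource then shows the combined result.
kbmMWClientQuery1.Open;
```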
SPECIAL OFFER:
40% DISCOUNT WHEN ORDERING A NEW LICENSE OF
KBMMW PROFESSIONAL OR ENTERPRISE,
INCLUDING COMPETITIVE UPGRADES.
THE COUPON CODE TO REFER TO IS: KBMMWBLAISEPASCAL2010.
THIS OFFER IS ONLY VALID FROM OCTOBER 10, 2010 UNTIL NOVEMBER 10, 2010.
Save the program, then compile and run it to test it on the PC. If
this all works we can configure Lazarus to create a WinCE application.
From the 'Project' menu choose 'Project Options...', then 'Compiler
Options'. In this screen we select which widgetset to use, here called
the 'LCL Widget Type'. Note that versions of Lazarus later than the
USB-stick version have a dropdown combobox where you can select the
LCL Widget Type. Select 'WinCE' and activate the 'Code' tree node
('Code generation' node in later Lazarus versions). Set the 'Target
OS' to WinCE and the processor (Target CPU family) to ARM.
Click on 'OK' to save the changes and then recompile the program. If
everything goes successfully you now have a hello-world application
for WinCE. You could try to start the application, but that will
fail: a program compiled for an ARM processor will not work on an
Intel Pentium processor. To test your program you'll have to copy it
to a Windows Mobile phone and run it there.
Figure 1:
Figure 3:
After the installation there is a new option in the start menu,
'Windows Mobile 6 SDK', with the sub-item 'Standalone Emulator
Images', which contains a list of several images with different
versions of Windows Mobile. Choose one to run. You will see a
telephone on which Windows is starting up. In File > Configure it is
possible to set a 'shared folder'. Set it to the location where you
stored the Lazarus project. This shared folder is now available on
the simulated phone as an extra storage card. On the phone select
'Programs' in the start menu and run the File Explorer. Now select
the 'Storage Card' in the upper left corner. Now you can see all the
files from the Lazarus project and the hello-world executable. Click
on the program and your phone will tell you 'hello'.
Figure 2:
Figure 4:
GPS
Until now we have only been busy configuring all kinds of things. Now
let's begin writing applications. We want to create an application
which determines the location of the phone using GPS and stores this
information in a local database. Reading the GPS can be done by
connecting directly to the COM port to which the GPS is attached, but
Windows CE also has a library (gpsapi.dll) which can be used to get
data from the GPS (this library has been part of WinCE since version
5). It's this 'GPS Intermediate Driver' that we're using in this
article. Therefore we first have to make the structures and functions
from this DLL accessible in our program. Add this definition to the
application code:
const
  gps_version_1         = 1;
  gps_max_satellites    = 12;
  gps_max_prefix_name   = 16;
  gps_max_friendly_name = 64;

type
  Tgps_fix_quality   = (gps_fix_quality_unknown,
                        gps_fix_quality_gps,
                        gps_fix_quality_dgps);
  Tgps_fix_selection = (gps_fix_selection_unknown,
                        gps_fix_selection_auto,
                        gps_fix_selection_manual);
  Tgps_fix_type      = (gps_fix_unknown,
                        gps_fix_2D,
                        gps_fix_3D);

  TGPS_Position = record
    dwVersion: DWord;
    dwSize: DWord;
    dwValidFields: DWord;
    dwFlags: DWord;
    stUTCTime: Windows.SYSTEMTIME;
    dblLatitude: double;
    dblLongitude: double;
    flSpeed: cfloat;
    flHeading: cfloat;
    dblMagneticVariation: double;
    flAltitudeWRTSeaLevel: cfloat;
    flAltitudeWRTEllipsoid: cfloat;
Figure 5:
If there is a connection between ActiveSync/Device Center and the
emulated mobile phone, we can start debugging. Return to Lazarus and
place a breakpoint on the line on which the messagebox is opened.
Start the program using the remote debugger (F9). Without the remote
debugger selected, this would result in the error message that the
application is not suitable for Windows.
But now the application is started on the phone. You need some
patience, though. It needs some time. When the program is running,
click on the button and Lazarus will pause the program on the
breakpoint. With F9 the program continues, behaving just as you expect
while debugging applications.
It's important though to know exactly what happens when you debug the
application remotely. On the phone a folder called '\gdb' is created.
Then the program which has to be run on the phone is copied to this
folder and started. Then the debugger connects to this running
application. Note however that when a file with the same name already
exists on the phone, the application is not transferred. This means
that if you change the program, re-compile it and run it again, the
'old' version of the program is still used on the phone. So you have
to remove the application from the folder '\gdb\' before you can debug
the new version.
One of the reasons that debugging is so slow is that copying the file
to the phone takes so long. It's possible to speed up this process by
excluding the debug information from the executable while linking the
application, and placing this information in a separate file. This
leads to a smaller executable, so it takes less time to copy it. You
can find the option for putting debug information in a separate file
in the compiler options, on the Link tab (Use external gdb debug
symbols file (-Xg)).
    FixQuality: Tgps_fix_quality;
    FixType: Tgps_fix_type;
    SelectionType: Tgps_fix_selection;
    flPositionDilutionOfPrecision: cfloat;
    flHorizontalDilutionOfPrecision: cfloat;
    flVerticalDilutionOfPrecision: cfloat;
    dwSatelliteCount: DWORD;
    rgdwSatellitesUsedPRNs:
      array[0..gps_max_satellites - 1] of cdouble;
    dwSatellitesInView: DWORD;
    rgdwSatellitesInViewPRNs:
      array[0..gps_max_satellites - 1] of cdouble;
    rgdwSatellitesInViewElevation:
      array[0..gps_max_satellites - 1] of cdouble;
    rgdwSatellitesInViewAzimuth:
      array[0..gps_max_satellites - 1] of cdouble;
    rgdwSatellitesInViewSignalToNoiseRatio:
      array[0..gps_max_satellites - 1] of cdouble;
    Fillup: array[0..287] of byte;
  end;
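Besides the record, the gpsapi.dll entry points themselves must be declared. A minimal sketch of the imports used in the listings that follow (signatures follow the Windows CE GPS Intermediate Driver API; the calling-convention specifier may need adjusting for your target):

```pascal
// External declarations for the GPS Intermediate Driver (gpsapi.dll).
function GPSOpenDevice(hNewLocationData, hDeviceStateChange: THandle;
  szDeviceName: PWideChar; dwFlags: DWord): THandle;
  external 'gpsapi.dll' name 'GPSOpenDevice';
function GPSCloseDevice(hGPSDevice: THandle): DWord;
  external 'gpsapi.dll' name 'GPSCloseDevice';
function GPSGetPosition(hGPSDevice: THandle;
  var pGPSPosition: TGPS_Position;
  dwMaximumAge, dwFlags: DWord): DWord;
  external 'gpsapi.dll' name 'GPSGetPosition';
```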
The code above first checks whether a fgpshandle is available, and if
not, the program tries to create a connection with the GPS. If this is
successful the pGPSPosition record is initialized: it is cleared
completely and then the version number and size of the record are set.
As you can see, the size of the record is given explicitly. In
principle this is wrong: it should be 'sizeof(pGPSPosition)' instead
of 376. The problem is that Windows CE 5 and 6 use a different size
for the pGPSPosition. The idea was that WinCE 6 would be backwards
compatible with the size of WinCE 5, but there's a bug in some
versions of WinCE that breaks this compatibility. That's why a
constant size is used here. If the given size is incorrect, the
program will raise an error with error code 87. Should this happen,
replace '376' by '344' and try again. Later on a better solution will
be discussed.
After the pGPSPosition record is initialized, GPSGetPosition is called
with four parameters: firstly the fgpshandle, secondly the
pGPSPosition record, and thirdly the maximum age, in milliseconds,
that the answer may have. That's because the GPS sends information to
the phone continuously; it could be that a location was sent to the
phone a second ago. If this age parameter is larger than 1000 (1
second), then the call will immediately return the value sent one
second ago. The fourth parameter is always zero.
After a successful call to GPSGetPosition a messagebox is shown with
the GPS coordinates of the current location. Furthermore you can see
how many satellites were used to obtain the current location (more
satellites means a more accurate result) and the total number of
satellites the GPS 'sees'.
{$ENDIF WINCE}
end;
This reads and displays the GPS coordinates on the screen. If no
coordinates are available yet, a message is shown alerting you to the
absence of a signal. What is new is that the program tries to set up a
connection with dwSize:=376, and if this doesn't succeed, with
dwSize:=344. The size that works is then stored so it can be used
later. Note that for this to work, the size of the actual record must
be at least as large as the value given here.
The TGPS_Position record as we defined it earlier is for Windows Mobile
version 5 and normally has a size that is too small for Windows Mobile
6. But because version 6 can't handle the version 5 format in all
cases, the record is stretched by adding an unused array of bytes
(TGPS_Position.Fillup). This way it works on all versions.
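The fallback described above can be sketched like this (hedged: the private field fGPSSize, used to remember the working size, is a hypothetical name, not taken from the original listing; res is the DWORD result variable from the earlier code):

```pascal
// Sketch of the size fallback; fGPSSize caches whichever
// record size this WinCE build accepts.
if fGPSSize = 0 then
  fGPSSize := 376;                 // try the larger (WinCE 6) layout first
pGPSPosition.dwSize := fGPSSize;
res := GPSGetPosition(fgpshandle, @pGPSPosition, 1000, 0);
if res = 87 then                   // wrong size for this WinCE version
begin
  fGPSSize := 344;                 // fall back to the smaller layout
  pGPSPosition.dwSize := fGPSSize;
  res := GPSGetPosition(fgpshandle, @pGPSPosition, 1000, 0);
end;
```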
Now we have to hook up the events for the buttons and timer:
Figure 6: Last Position
Now we add a new private function to obtain the GPS-position. The
code is as follows:
Figure 7:
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
Figure 8:
Now on pressing the Start button, the program tries to connect to the
GPS and if it succeeds, the timer is activated.
The timer makes sure that after each interval (Timer1.Interval) the GPS
is checked for new information, which is then shown on the screen.
Using the Stop button, the timer is deactivated and the connection
with the GPS is closed.
Storage of a walk in the park
Suppose we take a walk in the park and want to save our position every
5 seconds. We want to use a database for that, but not one that
requires installing a complete database server. And it should work on
Windows CE. That's a perfect match for SQLite (www.sqlite.org).
If you want to work with SQLite, the only thing you have to do is place
sqlite3.dll in the same folder as your executable, or, if you want to
make it available system-wide, in c:\windows\system32. Let's try to get
this working on the PC first.
This way the {$IFDEF WINCE} defines are also put to good use.
First download sqlite3.dll and place it in the system32 folder.
We also want to access the database from within Lazarus, so placing it
in the project-folder is not enough. (It is possible, of course. But in that case
the dll should also be copied to the location of the Lazarus executable.)
Now place a TSQLite3Connection, a TSQLTransaction and a TSQLQuery from
the SQLdb tab on the form.
Connect the TSQLTransaction.Connection and TSQLQuery.Connection
properties to the TSQLite3Connection, and set the
TSQLite3Connection.Transaction property to the TSQLTransaction.
From the 'Data Access' tab add a TDataSource and from the 'Data
Controls' a TDBGrid.
Connect the TDatasource.Dataset property to the TSQLQuery and set
the Datasource property of the grid to the TDatasource.
Finally give the TSQLQuery the following query (SQL): 'select
* from coordinates;'.
Now choose a DatabaseName for the TSQLite3Connection.
The database name is nothing more than the name of the file in which
the data is stored. In my case that's 'h:\src\pgg-wince\gpsdata.sdb'.
You can check whether SQLite is installed and configured correctly by
setting the 'Connected' property to 'true'. Because there is no
external tool available to create the table we need, we create it in code.
Add the following private procedure to your application.
This code first checks whether the query is already active. If not, a
connection to the database is made. Then it checks whether a table
named 'coordinates' already exists. If it does not, it is created with
the fields 'LocalTime', 'GPSTime', 'Longitude' and 'Latitude'.
procedure TForm1.InitialiseDB;
var
  sl: TStringList;
begin
  if SQLQuery1.Active then Exit;
  SQLite3Connection1.Open;
  sl := TStringList.Create;
  try
    SQLite3Connection1.GetTableNames(sl);
    // IndexOf instead of Find: Find only works on sorted lists
    if sl.IndexOf('coordinates') < 0 then
    begin
      SQLite3Connection1.ExecuteDirect(
        'create table coordinates(LocalTime datetime, ' +
        'GPSTime datetime, longitude real, latitude real);');
      SQLTransaction1.CommitRetaining;
    end;
  finally
    sl.Free;
  end;
  SQLQuery1.Open;
end;
The transaction will be committed to save it all. Now you will be able to
open the query. Add a call to this function in the OnCreate event of the
form, to ensure the table will always be opened at the start of the
program.
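Hooking this up is a one-liner (a sketch; it assumes the InitialiseDB procedure shown above and the default OnCreate handler name):

```pascal
procedure TForm1.FormCreate(Sender: TObject);
begin
  InitialiseDB; // ensure the table exists and the query is open at startup
end;
```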
To create and test this simple database and table, we have to compile
the application for a PC. Go to the compiler options and set the
widget type to win32/win64.
The code to be generated should be for an i386 processor, and you have
to choose win32 for the operating system. Start the program in the
debugger and you'll see that this fails.
This is because the debugger is configured to debug applications for
ARM processors, remotely on a mobile phone.
Go to 'Environment' -> 'IDE Options' -> 'Debugger'
and set the debugger path to the 'normal' gdb executable
(lazarus\mingw\bin\gdb.exe).
Start the application again and then close it. Now check whether a
database file with a size greater than zero bytes has been created.
If so, you can set the Active property of the TSQLQuery on the form to
true; the fields added to the table then become visible in the grid.
To make the grid scale with the form size on the phone, set
BorderSpacing.Around to 5 and BorderSpacing.Bottom to 40.
Set ReadOnly to true, set Options.dgEditing and Options.dgIndicator to
false, and set AutoFillColumns to true.
This is not all, though. To show the columns properly on a small screen,
the properties of each column have to be set manually. Double-click the
grid and click the 'Add' button three times so that three columns are
added. Select the first column, set its 'FieldName' to the 'LocalTime'
field and its 'DisplayFormat' to 'hh:mm:ss', so that only the time is
displayed, not the date.
You can also give the column a suitable title.
Configure the other two columns so that the 'latitude' and 'longitude'
are shown, with '#0.#######' as DisplayFormat.
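The same column setup can also be done at runtime; a sketch (hedged: the article configures this in the Object Inspector, and the TColumn.DisplayFormat property is assumed to be available in your Lazarus version):

```pascal
// Runtime equivalent of the designer steps described above (sketch).
with DBGrid1.Columns.Add do
begin
  FieldName := 'LocalTime';
  DisplayFormat := 'hh:mm:ss'; // time only, no date
  Title.Caption := 'Time';
end;
with DBGrid1.Columns.Add do
begin
  FieldName := 'Latitude';
  DisplayFormat := '#0.#######';
end;
with DBGrid1.Columns.Add do
begin
  FieldName := 'Longitude';
  DisplayFormat := '#0.#######';
end;
```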
With that, the controls are configured. Now we have to take care of
saving the positions into the database.
Besides the new parameter, the dtime and ttime variables are new, as is
the code in which a new record is added to the database. The time at
which the coordinates were measured by the GPS is also stored in the
database; this date therefore has to be converted to a TDateTime.
When StoreDB is false nothing is stored in the database, and if the
time from the GPS is invalid, it is not stored either.
Now we only need to add a value for the StoreDB parameter at the two
places where the GetGPSPosition function is called.
In the start-button event handler the value should be 'false'. In the
timer StoreDB has to be 'true'.
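In code this comes down to the following (a sketch; the handler names and the FGPSPosition field are hypothetical, assumed from the earlier figures):

```pascal
procedure TForm1.btnStartClick(Sender: TObject);
begin
  GetGPSPosition(FGPSPosition, False); // connect only, store nothing yet
  Timer1.Enabled := True;
end;

procedure TForm1.Timer1Timer(Sender: TObject);
begin
  GetGPSPosition(FGPSPosition, True);  // store each fix in the database
end;
```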
After these changes, set the compiler options so that Lazarus generates
a Windows CE/ARM application again.
Furthermore, the TSQLite3Connection DatabaseName has to be adapted,
since the folder in the file name used so far probably doesn't exist on
the mobile device.
The easiest way is to set the database name to a simple file name
without any path, for example 'gpsdata.sdb'.
But unlike what we are used to in Windows running on a PC, the file is
then not stored in the 'current' folder but in the 'My Device' folder,
the first/root folder of the device.
This is because Windows CE doesn't have the concept of a 'current
folder'.
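For example (sketch):

```pascal
// A bare file name ends up in the device root ('\'), because
// Windows CE has no current-folder concept.
SQLite3Connection1.DatabaseName := 'gpsdata.sdb'; // stored as \gpsdata.sdb
```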
Change the databasename, compile the application and run it on the
emulator or mobile phone.
This will immediately lead to an error message because sqlite3.dll isn't
present on the phone (yet).
This isn't as easy as it looks.
Figure 9:
function TForm1.GetGPSPosition(var GPSPosition: TGPS_Position;
  StoreDB: boolean): dword;
var
  res: DWORD;
  dtime: TDateTime;
  ttime: Windows.SYSTEMTIME;
  dist: double;
begin
{$IFDEF WINCE}
  ...
    lMessage.Caption := Format('Last position: (%d)',
      [GPSPosition.dwSatelliteCount]);
    lMessage1.Caption := Format('%g %g',
      [GPSPosition.dblLatitude, GPSPosition.dblLongitude]);
    if StoreDB then
    begin
      SQLQuery1.First;
      SQLQuery1.Insert;
      SQLQuery1.FieldByName('localtime').AsDateTime := Now;
      if GPSPosition.stUTCTime.Year <> 0 then
      begin
        ttime := GPSPosition.stUTCTime;
        dtime := ComposeDateTime(
          EncodeDate(ttime.Year, ttime.Month, ttime.Day),
          EncodeTime(ttime.Hour, ttime.Minute,
            ttime.Second, ttime.Millisecond));
        SQLQuery1.FieldByName('gpstime').AsDateTime := dtime;
      end;
      SQLQuery1.FieldByName('latitude').AsFloat :=
        GPSPosition.dblLatitude;
      SQLQuery1.FieldByName('longitude').AsFloat :=
        GPSPosition.dblLongitude;
      SQLQuery1.Post;
    end;
  end
  else
    lMessage.Caption := Format('Waiting for connection. %d satellites.',
      [GPSPosition.dwSatellitesInView]);
  ....
....
Five Considerations for Choosing an Effective SQL Development Tool
By Scott Walz
select that SQL to be run through the tuner. The tuner will verify that
the database-specific optimizer is taking the fastest execution path,
that the SQL is written effectively, that existing indexes are being
leveraged and missing indexes are created, and that the underlying
schema is defined effectively for maximum performance.
Stress Test to Validate Performance Gains
Once the SQL is tuned, it is important to measure and validate
performance gains by stress testing the original, un-tuned SQL code
and the newly tuned SQL code side by side while running a profiling
session and capturing the resulting snapshot. By simulating a number
of parallel sessions (users) and a number of executions over some
duration of time, you can ensure that the SQL you have tuned will in
fact stand up to QA load testing and production stress levels.
In Summary
It sounds simple enough, but so many developers are still suffering
from CTD. It's a shame because help is available. By following a
painless five-step regime and using the right tools, any developer can
learn to quickly profile and pinpoint the worst-performing SQL
statements, streamline their SQL tuning process and validate their work
before passing the code back to QA or the DBA for final testing.
Everyone will think it was tuned by an expert.
About the author:
Scott Walz
has more than 15 years of experience in database
development and serves as senior director of product
management for Embarcadero Technologies. In this
position, Scott oversees the direction of the company's
database product family, while focusing on database
development and administration products. Prior to
joining Embarcadero four years ago, Scott served as
development lead for Louisville Gas & Electric. He holds
a bachelor's degree in computer information systems
from Western Kentucky University.