
Springer Aerospace Technology

For further volumes:
http://www.springer.com/series/8613
Jens Eickhoff
Editor

A Combined Data and Power Management Infrastructure for Small Satellites

Editor
Jens Eickhoff
Institut für Raumfahrtsysteme
Universität Stuttgart
Stuttgart
Germany

ISSN 1869-1730 ISSN 1869-1749 (electronic)


ISBN 978-3-642-35556-1 ISBN 978-3-642-35557-8 (eBook)
DOI 10.1007/978-3-642-35557-8
Springer Heidelberg New York Dordrecht London

Library of Congress Control Number: 2013936520

© Springer-Verlag Berlin Heidelberg 2013


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of
the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations,
recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or
information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar
methodology now known or hereafter developed. Exempted from this legal reservation are brief
excerpts in connection with reviews or scholarly analysis or material supplied specifically for the
purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the
work. Duplication of this publication or parts thereof is permitted only under the provisions of
the Copyright Law of the Publisher’s location, in its current version, and permission for use must
always be obtained from Springer. Permissions for use may be obtained through RightsLink at the
Copyright Clearance Center. Violations are liable to prosecution under the respective Copyright Law.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this
publication does not imply, even in the absence of a specific statement, that such names are exempt
from the relevant protective laws and regulations and therefore free for general use.
While the advice and information in this book are believed to be true and accurate at the date of
publication, neither the authors nor the editors nor the publisher can accept any legal responsibility for
any errors or omissions that may be made. The publisher makes no warranty, express or implied, with
respect to the material contained herein.

Cover picture original by Sabine Leib, EADS Cassidian

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)


Foreword

Innovation is the key to success in technical fields and thus cannot be underestimated in space engineering domains working at the cutting edge of feasibility. Therefore, agile industry is continuously supporting innovative developments extending technical limits or abandoning classical design paths. It is particularly difficult to implement such innovative approaches straightforwardly in commercial satellite projects, in agency-funded productive systems like operational Earth observation missions, or in navigation constellation programs.
The ideal platforms for "innovation verification" missions are governmental or
academic technology demonstrator programs. These demonstrator programs allow
for rapid prototyping of new capabilities and satellite architectures while keeping
capital expenditures as low as possible.
Technology development partnerships between industry and academic institutions are much advocated and often cited. But for partnering in leading edge areas like the Combined Data and Power Management Infrastructure (CDPI) described in this book, such academic/industry partnerships are extremely rare, since they require substantial know-how on both the industry and the university side.
However, once established, such cooperations provide a competitive edge for both the university and the industry partners. For industry the advantage is twofold: it allows the qualification of new space technology, and in parallel students and Ph.D. candidates are educated to a knowledge level far above the normal standard for graduates.
For Astrium the University of Stuttgart has become a strategic technology
partner over the past decade due to the extensive small satellite programs that have
been established at their premises. These programs, together with the state-of-the-art facilities at the Institute of Space Systems, hosted at the "Raumfahrtzentrum Baden-Württemberg", go far beyond typical university CubeSat programs. The FLP satellite, for example, will qualify industry-relevant flight hardware on board a university satellite for the first time in the German context.
For this reason Astrium has invested significantly in the development of this
SmallSat program and in particular in this CDPI, both through direct sponsoring as
well as through provision of manpower. The other consortium partners, Aeroflex,


4Links, Aeroflex Gaisler, Vectronic Aerospace, and HEMA Kabeltechnik, which are all close partners of Astrium, have also invested significant effort in the development of their subsystems and have provided their SW/HW contributions on a university-compatible cost basis.
For the German Aerospace Center this development is a key example of a successful industry/academic cooperation, and moreover a transcontinental one. The technical product was developed in a 3.5-year timeframe, which is comparable with industry practice. The overall project concept, with an experienced industry expert as team leader, industry suppliers for the subsystems, and Ph.D. students from the research side complementing the team for CDPI product testing and integration into the satellite, proved to be a successful strategy.
This book is unique in that it presents in great detail a successful model for industry-university collaboration in the development of small satellites that aspire to cutting-edge operational performance. We envision the beginnings of a worldwide trend for collaborative small satellite development, with centers of excellence in Europe, North America, and Asia, and with other nations also joining in the not too distant future.

December 2012

Evert Dudok
Director Astrium GmbH, Munich, Germany

Prof. Dr. rer. nat. Hans-Peter Röser
Institute of Space Systems, Stuttgart, Germany

Prof. Dr.-Ing. Johann-Dietrich Wörner
German Aerospace Center DLR, Cologne, Germany

Prof. Dr.-Ing. Olivier L. de Weck
Department of Aeronautics and Astronautics, MIT, Cambridge, USA

Preface

The Combined Data and Power Management Infrastructure (CDPI) described in this book is a functional merging of a satellite Onboard Computer and a Power Control Unit. It was developed in the frame of the Small Satellites Program at the University of Stuttgart, Germany.
In 2009, the need became evident for a suitable Onboard Computer design for
the small satellite FLP (also known as "Flying Laptop"). It had to meet the
stringent performance, mass, volume, and power consumption constraints imposed
by the 130 kg FLP satellite, with its full-featured ACS, diverse payloads, and
complete CCSDS telecommand and telemetry standard compliance.
Thus one of my first tasks as senior engineer was to identify a design for an
onboard computer which was required to be
• space proof w.r.t. radiation hardness and electromagnetic space conditions,
• compact enough for the target satellite platform (a cube of 60 × 70 × 80 cm),
• powerful enough to run a realtime operating system,
• suitable to support professional CCSDS-based satellite operations,
• and limited in power consumption.
Consumer electronic components for the onboard computer were out of scope for this project, considering the need for robustness against the space environment. Classic space industry OBC devices were not appropriate with regard to geometry and mass, much less cost considerations. The only realistic option was to find consortium partners from the space industry with expert knowledge in diverse fields who could supply the institute with Onboard Computer components—altogether forming a complete, redundant OBC. Thus the task was to define a
completely modular, "LEGO-like" design which allowed the subcontracting of
entire functional OBC boards to consortium partners. The overview of the
achieved design solution is described in Chap. 1.
The requirements on the OBC system, especially with respect to potential
hardware failure robustness and the handling of different types of external analog
and digital interfaces, led to a functional merging between OBC and the satellite’s
Power Control and Distribution Unit (PCDU), resulting in a very innovative
design—the so-called CDPI.


At the end of the flight unit's development the consortium decided to provide a single consistent documentation of the developed CDPI infrastructure. The technical overview should be available to other university students as a sort of mix between technical brochure and user guide. This book might also be of interest for future university or industry partners who intend to order rebuilds or adaptations of the CDPI infrastructure, or even the entire satellite bus, from Stuttgart for their missions.

December 2012
Prof. Dr.-Ing. Jens Eickhoff


Acknowledgments

This system development of a completely new and ultracompact Onboard Computer, plus the functional merging of the data and power management FDIR functions of both the Onboard Computer (OBC) and the Power Control and Distribution Unit (PCDU), would not have been possible without support from many sides, to which we are indebted.
First of all, in the name of the entire institute, I would like to thank Evert Dudok, Director Astrium GmbH, Germany, for taking over the funding of the FM Processor-Boards and for their donation to the institute as in-kind sponsoring at the inauguration of the "Raumfahrtzentrum Baden-Württemberg" in 2011. Within Astrium I am also very much obliged to my site director in Friedrichshafen, Eckard Settelmeyer, and to my Section and Department Heads, Jörg Flemmig and Volker Debus, for supporting this coaching activity in addition to my tasks at Astrium.
Secondly, as overall system designer I am very much indebted to Hans-Peter Röser, the IRS institute director, who gave me plenty of latitude to conceptualize the overall system architecture, to find the consortium partners and negotiate with them, and to elaborate and finally verify the design at unit and system level. Last but not least, it was he who organized the funding of all institute-procured components.
Special thanks go to the industry partners who joined this project: Aeroflex Colorado Springs, Inc., CO, USA; Aeroflex Gaisler AB, Sweden; 4Links Ltd., UK; Vectronic Aerospace GmbH and HEMA Kabeltechnik GmbH & Co. KG, Germany. It is absolutely non-standard that a university project is supported so intensively by industrial global players.
In 2009, I had just taken over the role of the engineering coach of the FLP
satellite and was standing empty-handed, lacking a convincing OBC solution for
the satellite. My first idea was to upgrade one of the Aeroflex Gaisler development
boards. I contacted Jiri Gaisler at the DASIA conference in 2009 and herewith I
express my gratitude to him for handing over my problem to Aeroflex Inc. USA.
They were just developing a Single Board Computer based on the LEON3FT chip
UT699. From this initial contact, the fruitful cooperation between the university
and Aeroflex evolved.


Concerning the OBC development the university team is indebted to Aeroflex and their German distributor Protec GmbH, Munich, for guiding us through the formalisms of procuring the CPU boards as ITAR products. Aeroflex helped us to fill out all relevant forms and explained the "dos and don'ts" concerning shipment and publishing rules under ITAR. The Processor-Boards were provided under Technical Assistance Agreement TA-6151-10. The author's team is also indebted to the Aeroflex management, and particularly to Tony Jordan, for granting their chief developer Sam Stratton the permission to participate in this book and for reviewing the Aeroflex chapter concerning ITAR compliance.
For the publishing of this book the author’s team thanks Mrs. Sterritt-Brunner
and Dr. Christoph Baumann and the Springer publishing team for their assistance.
Last but not least, as project manager of the CDPI, I do not want to forget to express my gratitude and respect to the highly motivated FLP team of Ph.D. candidates and students, and in particular to those who became co-authors of this book.

December 2012
Prof. Dr.-Ing. Jens Eickhoff


Donation for Life

In 2011, during the CDPI development, the 10-year-old daughter of one of the authors was diagnosed with a breakdown of blood cell production in the bone marrow—a variant of blood cancer. Luckily she could be saved and recovered by means of a bone marrow transplant. Due to this experience the authors decided to donate the royalties of this book to the German and international bone marrow donor database:

DKMS Deutsche Knochenmarkspenderdatei gemeinnützige Gesellschaft mbH
Phone: +49-(0)7071-9430
German Website: https://www.dkms.de/
US Website: http://www.dkmsamericas.org/

Contents

1 The System Design Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1


1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 The Onboard Computer Concept Baseline . . . . . . . . . . . . . . 2
1.3 The PCDU as Analog RIU. . . . . . . . . . . . . . . . . . . . . . . . . 5
1.4 Common System Reconfiguration Controller . . . . . . . . . . . . 6
1.4.1 Component Functions During Failure Handling . . . 7
1.4.1.1 Failure Type 1: Failures Identifiable
by the Running OBSW . . . . . . . . . . . . . . 7
1.4.1.2 Failure Type 2: Crash of the OBSW
and Auto-Reconfiguration . . . . . . . . . . . . 8
1.4.1.3 Failure Type 3: Crash of OBSW
and Reconfiguration from Ground . . . . . . 8
1.4.1.4 Failure Type 4: Power Bus Undervoltage
Failures . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.4.2 Innovation: A Combined-Controller for all FDIR
Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.4.3 Failure Management with the
Combined-Controller . . . . . . . . . . . . . . . . . . . . . . 11
1.4.4 Advantages of the Combined-Controller
Approach . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.5 Completeness of System Architecture . . . . . . . . . . . . . . . . . 12
1.6 Component Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
1.6.1 Processor-Boards. . . . . . . . . . . . . . . . . . . . . . . . . 14
1.6.2 CCSDS Decoder/Encoder. . . . . . . . . . . . . . . . . . . 17
1.6.3 I/O and CCSDS-Board. . . . . . . . . . . . . . . . . . . . . 21
1.6.4 OBC Power-Boards . . . . . . . . . . . . . . . . . . . . . . . 22
1.6.5 The PCDU with Enhanced Functionality . . . . . . . . 23
1.7 Testing the CDPI in an Integrated Environment . . . . . . . . . . 23
1.8 The Flight Model Assembly. . . . . . . . . . . . . . . . . . . . . . . . 24
1.9 Conceptual Outlook for Future Missions . . . . . . . . . . . 25

2 The OBC Processor-Boards . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27


2.1 The Processor-Boards as ITAR Item . . . . . . . . . . . . . . . . . . 27
2.2 The Processor-Board: A Single Board Computer . . . . . . . . . 28


2.3 Technical Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29


2.3.1 The OBC Microprocessor Device . . . . . . . . . . . . . 29
2.3.2 The OBC Memory Configuration . . . . . . . . . . . . . 30
2.3.3 The OBC FPGA for Miscellaneous Functions. . . . . 31
2.4 The OBC Processor-Board Functional Overview . . . . . . . . . 32
2.5 The OBC Processor-Board Memory Interface. . . . . . . . . . . . 33
2.6 The OBC Processor-Board SpaceWire Interface . . . . . . . . . . 36
2.7 Miscellaneous Functions . . . . . . . . . . . . . . . . . . . . . . . . . . 37
2.7.1 NVMEM Chip Enable . . . . . . . . . . . . . . . . . . . . . 38
2.7.2 SRAM Chip Enable. . . . . . . . . . . . . . . . . . . . . . . 38
2.7.3 Pulse Per Second Interface . . . . . . . . . . . . . . . . . . 38
2.7.4 Watchdog Signal and LEON3FT Reset . . . . . . . . . 39
2.7.5 RS422 Interface . . . . . . . . . . . . . . . . . . . . . . . . . 39
2.7.6 Resets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
2.7.7 Clock Interfaces . . . . . . . . . . . . . . . . . . . . . . . . . 40
2.7.8 DSU/Ethernet Interface Card (DEI) . . . . . . . . . . . . 40
2.8 Power Specifications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
2.9 Mechanical Design and Dimensions . . . . . . . . . . . . . . . . . . 41

3 The I/O-Boards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
3.1 Common Design for I/O and CCSDS-Boards . . . . . . . . . . . . 43
3.2 The I/O-Board as Remote Interface Unit . . . . . . . . . . . . . . . 44
3.3 The I/O-Board as OBC Mass Memory Unit . . . . . . . . . . . . . 46
3.4 I/O-Board Hot Redundant Operation Mode . . . . . . . . . . . . . 46
3.5 I/O-Board RMAP Interface . . . . . . . . . . . . . . . . . . . . . . . . 47
3.5.1 Board Identification for I/O-Boards . . . . . . . . . . . . 47
3.5.2 I/O Board Interface RMAP Addresses . . . . . . . . . . 50
3.5.3 Returned RMAP Status Values . . . . . . . . . . . . . . . 50
3.6 I/O Circuits, Grounding and Terminations . . . . . . . . . . . . . . 51
3.7 I/O-Board Interface Access Protocols . . . . . . . . . . . . . . . . . 54
3.8 I/O-Board Connectors and Pin Assignments . . . . . . . . . . . . . 56
3.8.1 Connectors-A and C (OBC internal) . . . . . . . . . . . 56
3.8.2 Connector-B (OBC internal) . . . . . . . . . . . . . . . . . 56
3.8.3 Connectors-D and E (OBC external) . . . . . . . . . . . 57
3.9 I/O and CCSDS-Board Radiation Characteristic . . . . . . . . . . 58
3.10 I/O and CCSDS-Board Temperature Limits . . . . . . . . . . . . . 58
3.11 4Links Development Partner . . . . . . . . . . . . . . . . . . . . . . . 58

4 The CCSDS Decoder/Encoder Boards . . . . . . . . . . . . . . . . . . . . . 59


4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
4.2 Architectural Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
4.2.1 Interfaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
4.2.2 Command Link Control Word Coupling . . . . . . . . 65
4.2.3 Clock and Reset . . . . . . . . . . . . . . . . . . . . . . . . . 65

4.2.4 Performance . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
4.2.5 Telemetry Encoder . . . . . . . . . . . . . . . . . . . . . . . 65
4.2.5.1 Telemetry Encoder Specification . . . . . . . 66
4.2.5.2 Virtual Channels 0, 1, 2 and 3 . . . . . . . . . 67
4.2.5.3 Virtual Channel 7. . . . . . . . . . . . . . . . . . 68
4.2.6 Telecommand Decoder . . . . . . . . . . . . . . . . . . . . 68
4.2.6.1 Telecommand Decoder Specification . . . . 68
4.2.6.2 Software Virtual Channel . . . . . . . . . . . . 69
4.2.6.3 Hardware Virtual Channel . . . . . . . . . . . . 69
4.2.7 SpaceWire Link Interfaces . . . . . . . . . . . . . . . . . . 70
4.2.8 On-Chip Memory . . . . . . . . . . . . . . . . . . . . . . . . 70
4.2.9 Signal Overview . . . . . . . . . . . . . . . . . . . . . . . . . 70
4.3 Telemetry Encoder . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
4.3.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
4.3.2 Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
4.3.2.1 Data Link Protocol Sub-layer. . . . . . . . . . 73
4.3.2.2 Synchronization and Channel Coding
Sub-Layer . . . . . . . . . . . . . . . . . . . . . . . 73
4.3.2.3 Physical Layer . . . . . . . . . . . . . . . . . . . . 73
4.3.3 Data Link Protocol Sub-Layer . . . . . . . . . . . . . . . 73
4.3.3.1 Physical Channel . . . . . . . . . . . . . . . . . . 73
4.3.3.2 Virtual Channel Frame Service . . . . . . . . 74
4.3.3.3 Virtual Channel Generation:
Virtual Channels 0, 1, 2 and 3 . . . . . . . . . 74
4.3.3.4 Virtual Channel Generation:
Idle Frames—Virtual Channel 7. . . . . . . . 74
4.3.3.5 Virtual Channel Multiplexing . . . . . . . . . 74
4.3.3.6 Master Channel Generation . . . . . . . . . . . 75
4.3.3.7 All Frame Generation . . . . . . . . . . . . . . . 76
4.3.4 Synchronization and Channel Coding Sub-Layer. . . 76
4.3.4.1 Attached Synchronization Marker. . . . . . . 76
4.3.4.2 Reed-Solomon Encoder. . . . . . . . . . . . . . 76
4.3.4.3 Pseudo-Randomizer . . . . . . . . . . . . . . . . 76
4.3.4.4 Convolutional Encoder . . . . . . . . . . . . . . 76
4.3.5 Physical Layer . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.3.5.1 Non-Return-to-Zero Level Encoder . . . . . 77
4.3.5.2 Clock Divider . . . . . . . . . . . . . . . . . . . . 77
4.3.6 Connectivity . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.3.7 Operation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
4.3.7.1 Descriptor Setup. . . . . . . . . . . . . . . . . . . 78
4.3.7.2 Starting Transmissions . . . . . . . . . . . . . . 79
4.3.7.3 Descriptor Handling After Transmission . . 80
4.3.8 Registers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
4.3.9 Signal Definitions and Reset Values . . . . . . . . . . . 82

4.3.10 TM Encoder: Virtual Channel Generation . . . . . . . 82


4.3.11 TM Encoder: Descriptor . . . . . . . . . . . . . . . . . . . 82
4.3.12 TM Encoder: Virtual Channel Generation Function
Input Interface . . . . . . . . . . . . . . . . . . . . . . . . . . 83
4.4 TC Decoder: Software Commands . . . . . . . . . . . . . . . . . . . 84
4.4.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
4.4.1.1 Concept. . . . . . . . . . . . . . . . . . . . . . . . . 85
4.4.1.2 Functions and Options . . . . . . . . . . . . . . 85
4.4.2 Data Formats . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
4.4.3 Coding Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
4.4.3.1 Synchronization and Selection
of Input Channel . . . . . . . . . . . . . . . . . . 87
4.4.3.2 Codeblock Decoding. . . . . . . . . . . . . . . . 87
4.4.3.3 De-Randomizer . . . . . . . . . . . . . . . . . . . 87
4.4.3.4 Non-Return-to-Zero: Mark. . . . . . . . . . . . 88
4.4.3.5 Design Specifics. . . . . . . . . . . . . . . . . . . 88
4.4.3.6 Direct Memory Access . . . . . . . . . . . . . . 89
4.4.4 Transmission . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
4.4.4.1 Data Formatting . . . . . . . . . . . . . . . . . . . 91
4.4.4.2 CLTU Decoder State Diagram . . . . . . . . . 91
4.4.4.3 Nominal . . . . . . . . . . . . . . . . . . . . . . . . 92
4.4.4.4 CASE 1 . . . . . . . . . . . . . . . . . . . . . . . . 92
4.4.4.5 CASE 2 . . . . . . . . . . . . . . . . . . . . . . . . 92
4.4.4.6 Abandoned . . . . . . . . . . . . . . . . . . . . . . 93
4.4.5 Relationship Between Buffers and FIFOs. . . . . . . . 93
4.4.6 Command Link Control Word Interface. . . . . . . . . 93
4.4.7 Configuration Interface (AMBA AHB slave) . . . . . 94
4.4.8 Interrupts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
4.4.9 Registers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
4.4.10 Signal Definitions and Reset Values . . . . . . . . . . . 95
4.5 TC Decoder: Hardware Commands . . . . . . . . . . . . . . . . . . . 96
4.5.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
4.5.2 Operation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
4.5.2.1 All Frames Reception . . . . . . . . . . . . . . . 96
4.5.2.2 Master Channel Demultiplexing . . . . . . . . 97
4.5.2.3 Virtual Channel Demultiplexing . . . . . . . . 97
4.5.2.4 Virtual Channel Reception . . . . . . . . . . . 97
4.5.2.5 Virtual Channel Segment Extraction . . . . . 98
4.5.2.6 Virtual Channel Packet Extraction . . . . . . 98
4.5.2.7 UART Interfaces . . . . . . . . . . . . . . . . . . 99
4.5.3 Telecommand Transfer Frame Format: Hardware
Commands . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
4.5.4 Signal Definitions and Reset Values . . . . . . . . . . . 100

4.6 SpaceWire Interface with RMAP Target . . . . . . . . . . . . . . . 100


4.7 JTAG Debug Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
4.8 Diverse Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
4.9 CCSDS Processor Spacecraft Specific Configuration. . . . . . . 102

5 The OBC Power-Boards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103


5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
5.2 Power Conversion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
5.2.1 The DC/DC Converters . . . . . . . . . . . . . . . . . . . . 105
5.2.2 Start-Up Characterization of OBC Power
Consumers . . . . . . . . . . . . . . . . . . . . . . . 108
5.2.3 Start-Up Behavior of the Power Supply Board . . . . . . 111
5.2.4 Connection of Power Supply Board
and OBC Power Consumers . . . . . . . . . . . . . . . . . 112
5.3 Clock Strobe Signals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
5.4 Heaters and Thermal Sensors . . . . . . . . . . . . . . . . . . . . . . . 114
5.5 OBC Service Interface and JTAG Interface . . . . . . . . . . . . . 115
5.6 Connector Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . 116

6 The OBC Internal Harness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119


6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
6.1.1 Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
6.1.2 Challenges . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
6.1.3 Realization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
6.2 Harness Design. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
6.2.1 Harness Engineering . . . . . . . . . . . . . . . . . . . . . . 123
6.2.2 SpaceWire Harness . . . . . . . . . . . . . . . . . . . . . . . 125
6.2.3 OBC Power Harness . . . . . . . . . . . . . . . . . . . . . . 127
6.3 Verification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
6.4 Quality and Manufacturing Documentation . . . . . . . . . . . . . 131

7 OBC Mechanical and Thermal Design . . . . . . . . . . . . . . . . . . . . 133


7.1 Mechanical and Thermal Requirements . . . . . . . . . . . . . . . . 133
7.2 Mechanical Design of the OBC . . . . . . . . . . . . . . . . . . . . . 135
7.2.1 OBC Structure Concept . . . . . . . . . . . . . . . . . . . . 135
7.2.2 Mechanical Dimensioning
and Concept Validation . . . . . . . . . . . . . . . . . . . . 138
7.3 Thermal Design of OBC . . . . . . . . . . . . . . . . . . . . . . . . . . 140
7.3.1 Thermal Model . . . . . . . . . . . . . . . . . . . . . . . . . . 141
7.3.2 Thermal Calculation Results. . . . . . . . . . . . . . . . . 144
7.3.3 OBC Internal Heaters . . . . . . . . . . . . . . . . . . . . . 146
7.4 OBC Housing Material Properties. . . . . . . . . . . . . . . . . . . . 149

8 The Power Control and Distribution Unit . . . . . . . . . . . . . . . . . . 151


8.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
8.2 The PCDU in a Typical Power Supply Subsystem . . . . . . . . 152
8.3 PCDU Unit Design Overview. . . . . . . . . . . . . . . . . . . . . . . 153
8.3.1 PCDU Interfaces . . . . . . . . . . . . . . . . . . . . . . . . . 154
8.3.2 PCDU Command Concept . . . . . . . . . . . . . . . . . . 155
8.4 Boot-Up Sequence of the PCDU and PCDU Modes . . . . . . . 156
8.5 Power Control and Distribution Functions . . . . . . . . . . . . . . 157
8.6 PCDU Specific Functions in the CDPI Architecture . . . . . . . 160
8.6.1 Analog Data Handling Concept . . . . . . . . . . . . . . 160
8.6.2 Reconfiguration Logic for the OBC. . . . . . . . . . . . 161
8.6.3 Reconfiguration Functionality for the Spacecraft. . . 164
8.7 Diverse PCDU Functions . . . . . . . . . . . . . . . . . . . . . . . . . . 166
8.7.1 Launcher Separation Detection . . . . . . . . . . . . . . . 167
8.7.2 Control and Monitoring of Solar Panel
Deployment . . . . . . . . . . . . . . . . . . . . . . . . 167
8.7.3 Control of the Payload Data Transmission
Subsystem Power . . . . . . . . . . . . . . . . . . . . . 168
8.7.4 History Log Function . . . . . . . . . . . . . . . . . . . 168
8.7.5 Time Synchronization Between Internal
Controllers . . . . . . . . . . . . . . . . . . . . . . . . 168
8.7.6 Overvoltage Protection. . . . . . . . . . . . . . . . . . . . . 168
8.7.7 Measurement of Test-String Characteristics . . . . . . 169
8.8 PCDU Environmental Qualification Characteristics. . . . . . . . 169
8.8.1 Thermal-Vacuum Limits . . . . . . . . . . . . . . . . . . . 169
8.8.2 Radiation Limits . . . . . . . . . . . . . . . . . . . . . . . . . 169
8.8.3 Vibration Limits . . . . . . . . . . . . . . . . . . . . . . . . . 170
8.9 List of Connectors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
8.10 PCDU Commands Overview . . . . . . . . . . . . . . . . . . . . . . . 172

9 CDPI System Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173


9.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
9.2 Test Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
9.2.1 Test Conditions. . . . . . . . . . . . . . . . . . . . . . . . . . 175
9.2.2 Involved Personnel . . . . . . . . . . . . . . . . . . . . . . . 176
9.2.3 Test Program Simplifications . . . . . . . . . . . . . . . . 176
9.3 Test Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
9.3.1 PCDU Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
9.3.2 Processor-Board Tests . . . . . . . . . . . . . . . . . . . . . 179
9.3.3 Power-Board Tests . . . . . . . . . . . . . . . . . . . . . . . 179
9.3.4 CCSDS and I/O-Board Tests . . . . . . . . . . . . . . . . 180
9.3.5 OBC Subsystem Tests . . . . . . . . . . . . . . . . . . . . . 181
9.3.6 CDPI Reconfiguration Tests . . . . . . . . . . . . . . . . . 182

9.4 EM Testbench Infrastructure . . . . . . . . . . . . . . . . . . 183
9.5 EM Test Execution and Results . . . . . . . . . . . . . . . . . 185
9.5.1 STB Tests Stage 1: Connecting OBC
Processor-Board and S/C-Simulator . . . . . . . . . . . 186
9.5.2 STB Tests Stage 2: Connecting OBC
Processor-Board and CCSDS-Board . . . . . . . . . . . . 186
9.5.3 STB Tests Stage 3: Entire Command
Chain Bridging the RF Link . . . . . . . . . . . . . . . 187
9.5.4 STB Tests Stage 4: Verify High Priority
Commanding . . . . . . . . . . . . . . . . . . . . . . . . 189
9.5.5 STB Tests Stage 5: Commanding Equipment
Unit Hardware . . . . . . . . . . . . . . . . . . . . . . 190
9.5.6 STB Tests Stage 6: Performance Tests
with Attitude Control Software . . . . . . . . . . . . . 191
9.6 FM Test Execution and Results . . . . . . . . . . . . . . . . . 193

10 The Research Target Satellite . . . . . . . . . . . . . . . . . . . . . . . . . . . 195


10.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
10.2 Orbit and Operational Modes . . . . . . . . . . . . . . . . . . . . . . . 196
10.3 Mechanical Design and Launcher Interface . . . . . . . . . . . . . 196
10.4 Technology and Payloads . . . . . . . . . . . . . . . . . . . . . . . . . 199
10.5 Satellite Attitude Control System . . . . . . . . . . . . . . . . . . . . 200
10.5.1 Sensors and Actuators . . . . . . . . . . . . . . . . . . . . . 200
10.5.2 Control Modes . . . . . . . . . . . . . . . . . . . . . . . . . . 201
10.6 Satellite Communication Links . . . . . . . . . . . . . . . . . . . . . . 202
10.7 Satellite Electrical Architecture and Block Diagram . . . . . . . 204
10.8 EMC Launcher and Primary Payload Compliance. . . . . . . . . 204

11 CDPI Assembly Annexes and Data Sheets . . . . . . . . . . . . . . . . . . 205


11.1 Processor-Board DSU/Ethernet Interface Card . . . . . . . . . . . 205
11.2 CCSDS Conventions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
11.2.1 CCSDS Field Definition . . . . . . . . . . . . . . . . . . . 208
11.2.2 Galois Field . . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
11.2.3 Telemetry Transfer Frame Format. . . . . . . . . . . . . 209
11.2.4 Reed-Solomon Encoder Data Format . . . . . . . . . . . 210
11.2.5 Attached Synchronization Marker . . . . . . . . . . . . . 211
11.2.6 Telecommand Transfer Frame Format . . . . . . . . . . 211
11.2.7 Command Link Control Word . . . . . . . . . . . . . . . 212
11.2.8 Space Packet . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
11.2.9 Asynchronous Bit Serial Data Format . . . . . . . . . . 212
11.2.10 SpaceWire Remote Memory Access Protocol . . . . . 213
11.2.11 Command Link Control Word Interface. . . . . . . . . 213
11.2.12 Waveform Formats . . . . . . . . . . . . . . . . . . . . . . . 213

11.3 Selected TM Encoder Registers . . . . . . . . . . . . . . . . . . . . . 214


11.4 TM Encoder: Virtual Channel Generation Registers . . . . . . . 217
11.5 Selected TC Decoder Registers. . . . . . . . . . . . . . . . . . . . . . 218
11.6 OBC Unit CAD Drawing. . . . . . . . . . . . . . . . . . . . . . . . . . 226
11.7 OBC Unit I/O-Board Connector Pin Allocation . . . . . . . . . . 227
11.8 OBC Unit CCSDS-Board Connector Pin Allocation . . . . . . . 231
11.9 OBC Power-Board Connectors Pin Allocation . . . . . . . . . . . 232
11.10 PCDU Unit CAD Drawing. . . . . . . . . . . . . . . . . . . . . . . . . 234
11.11 PCDU Unit Connector Pin Allocations . . . . . . . . . . . . . . . . 235
11.12 PCDU Switch and Fuse Allocation to Spacecraft
Equipment . . . . . . . . . . . . . . . . . . . . . . . . . 236

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 239

Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 245
Abbreviations

General Abbreviations
a.m. Above mentioned
cf. Confer
i.e. Id est (that is)
w.r.t. With respect to

Technical Abbreviations
ADC Analog-to-Digital Converter
AES Advanced Encryption Standard
AIT Assembly, Integration, and Test
ASIC Application-Specific Integrated Circuit
ASM Attached Synchronization Marker
AWG American Wire Gauge
BBM Breadboard Model
BCH Bose–Chaudhuri–Hocquenghem
BCR Battery Charge Regulator
BoL Begin of Life
CAD Computer-Aided Design
CADU Channel Access Data Unit
CC Combined-Controller
CCSDS Consultative Committee for Space Data Systems
CD Clock Divider
CDPI Combined Data and Power Management Infrastructure
CE Convolutional Encoder
CE Circuit Enable/Chip Enable
CL Coding Layer
CLCW Command Link Control Word
CLTU Command Link Transfer Unit
CMOS Complementary Metal Oxide Semiconductor
CPDU Command Pulse Decoding Unit


CPU Central Processing Unit


DEI DSU/Ethernet Interface
DMA Direct Memory Access
DoD Depth of Discharge
DOM De-Orbiting Mechanism
DRAM Dynamic Random Access Memory
DSU Debug Support Unit
ECSS European Cooperation on Space Standardization
EDAC Error Detection and Correction
EEPROM Electrically Erasable PROM
EM Engineering Model
EMC Electromagnetic Compatibility
EMI Electromagnetic Interference
EPPL European Preferred Parts List
ESA European Space Agency
ESD Electrostatic Discharge
FDIR Failure Detection, Isolation, and Recovery
FIFO First-In-First-Out
FM Flight Model
FOG Fiberoptic Gyro
FPGA Field Programmable Gate Array
FRAM Ferroelectric Random Access Memory
GPIO General Purpose Input/Output
GPS Global Positioning System
GSE Ground Support Equipment
HF High Frequency
HK Housekeeping
HPC High Priority Command
HW Hardware
I/O Input/Output
IADC United Nations Inter-Agency Space Debris Coordination Committee
IC Integrated Circuit
IF Interface
IIC Inter-Integrated Circuit Bus
IRS Institut für Raumfahrtsysteme, Institute of Space Systems, University
of Stuttgart, Germany
ITAR International Traffic in Arms Regulations
ITU International Telecommunication Union
JTAG Joint Test Actions Group
LCL Latching Current Limiter
LEOP Launch and Early Orbit Phase
LISN Line Impedance Stabilization Network
LTDN Local Time of Descending Node
LVDS Low Voltage Differential Signaling
MAP-ID Multiplexer Access Point Identifier

MCM Multi-Chip Module


MCS Mission Control System
MGM Magnetometer
MGT Magnetotorquer
MRAM Magnetoresistive Random Access Memory
MSI Medium-Scale Integration
MTQ Magnetotorquer
NASA National Aeronautics and Space Administration
NRZ-L Non-Return-to-Zero Level
NRZ-M Non-Return-to-Zero Mark
NVRAM Non-Volatile RAM
OBC Onboard Computer
OBSW Onboard Software
OS Operating System
PCB Printed Circuit Board
PCDU Power Control and Distribution Unit
PFM Protoflight Model
PLOC Payload Controller ("Payload Onboard Computer")
POR Power On Reset
PPS Pulse Per Second
PROM Programmable Read-Only Memory
PSR Pseudo Randomizer
PSU Power Supply Unit
PUS ESA Packet Utilization Standard
QA Quality Assurance
RAM Random Access Memory
RF Radio Frequency
RF-SCOE Radio Frequency Special Check-Out Equipment
RISC Reduced Instruction Set Computer
RIU Remote I/O Unit (of an OBC)
ROM Read Only Memory
RTOS Realtime Operating System
RTS Realtime Simulator
RWL Reaction Wheel
S/C Spacecraft
SA Solar Array
SBC Single Board Computer
SCID Spacecraft Identifier
SCOE Special Checkout Equipment
SDRAM Synchronous Dynamic Random Access Memory
SEL Single Event Latch-up
SEU Single-Event Upset
SIF Service Interface
SMD Surface Mounted Device
SoC System on Chip

SoC Battery State of Charge


SPARC Scalable Processor Architecture
SpW SpaceWire
SRAM Static Random Access Memory
SSO Sun-Synchronous Orbit
SSRAM Synchronous Static Random Access Memory
STR Star Tracker
SW Software
TAA Technical Assistance Agreement
TC Telecommand
TCC Telecommand Channel Layer
TF Transfer Frame
TID Total Ionizing Dose
TM Telemetry
TMR Triple Module Redundancy
TTL Transistor–Transistor Logic
UART Universal Asynchronous Receiver Transmitter
USB Universal Serial Bus
VC Virtual Channel
VCID Virtual Channel Identifier
WS Workstation
Chapter 1
The System Design Concept

Jens Eickhoff

1.1 Introduction

The Combined Data and Power Management Infrastructure (CDPI) presented in this book represents a functional merging of a classical satellite Onboard Computer (OBC) and a Power Control and Distribution Unit (PCDU). Before focusing on the system's technical details, some overview of the development background shall be given to allow estimating the overall system's function, performance and usability for satellite missions.
The background for the CDPI development was the SmallSat program of the University of Stuttgart. It foresees the development of multiple space missions, the first in the queue being an Earth observation micro satellite of 130 kg mass with diverse payloads. More details on the mission and the satellite design can be found in Chap. 10. During the satellite development, the first problem was to find an adequate Onboard Computer for control of the overall spacecraft system.

J. Eickhoff (✉)
Astrium GmbH—Satellites, Friedrichshafen, Germany
e-mail: jens.eickhoff@astrium.eads.net

J. Eickhoff (ed.), A Combined Data and Power Management Infrastructure,
Springer Aerospace Technology, DOI: 10.1007/978-3-642-35557-8_1,
© Springer-Verlag Berlin Heidelberg 2013

After handover of the SmallSat technical responsibility to the author as an industry representative with sufficient satellite project experience, it was decided to step away from any formerly envisaged FPGA-based OBC architectures, since such a system never would have been suitable for running a proper operating system or for allowing the coding of a trustworthy Failure Detection, Isolation and Recovery (FDIR) layer in the Onboard Software (OBSW).
Thus a concept for a modular OBC was developed, which during design led to the mentioned functional integration of OBC and PCDU in a combined architecture. More details are given in the subsequent sections of this chapter. The essential aspect for the development was that neither OBC nor PCDU were available as flight-proven units, as is usually the case for agency and industry missions; they had to be developed in parallel to the SmallSat project itself. This was a great challenge for the entire team concerning technical design correctness and schedule constraints.
The reasons why the SmallSat group did not simply take over solutions from other university satellites, or directly use potentially sponsored industry solutions for OBC and/or PCDU, were:
• Industry OBCs were simply too large, too heavy and too power consuming to fit into the SmallSat design at all.
• The SmallSat should be commandable via the industry standard CCSDS/PUS protocol, which ruled out most university project OBCs.
• The OBC on board should reflect state-of-the-art communication data protocols—such as SpaceWire—and a professional Real Time Operating System (RTOS) as baseline for the onboard software. This in addition required a sufficiently powerful radiation-hard onboard processor and again ruled out all low-cost approaches.
Falling back on the author's professional network, a consortium of top-ranked industry partners could be set up for the project. All of these contributors had a certain interest in sponsoring and supporting a brand new design solution for a compact, lightweight, low-power Combined Data and Power Management Infrastructure (CDPI).
An overview of the system and some development aspects is given in this chapter. The subsequent main chapters are devoted to the individual subcomponents, their features and functions. These chapters were written by the developers of the individual industry team partners.

1.2 The Onboard Computer Concept Baseline

A number of design "paradigms" have formed the baseline of the overall architecture development. Starting with the OBC, these are the following (Fig. 1.1):
• The overall OBC should be implemented as a sort of "stack" of individual Printed Circuit Boards (PCB), each of them mounted in a coated aluminum frame. All the PCBs should be of single Eurocard size.

Fig. 1.1 Housing concept based on stacked frames

• All I/O into and out of the OBC to/from other spacecraft units should be routed via connectors on the individual PCBs' top side. All interfaces routing signals between the individual PCBs should be routed via front connectors.
These two concepts led to the overall box design which is depicted in Fig. 1.2. The inter-board cabling is encapsulated in the OBC's front compartment, which is closed by a front cover.

Fig. 1.2 Final OBC frame stack—CAD model. © IRS, University of Stuttgart

Normally such inter-PCB routing is performed via an OBC backplane, but at the time of OBC component requirement definition and project kickoff with the individual OBC component manufacturers, the front interfaces were not yet frozen at all, particularly not things like the Processor-Board's JTAG Interface and the Service Interface (SIF). The concept of such a cabling-based inter-board connection allowed a completely individualized design and production of the OBC's different PCB types.
• The OBC concept of a stack of boards in closed frames significantly eased the problem of potential EMC cross influences between the boards. This was an important aspect since the university, which later assembled the overall OBC from the delivered subcomponents, had no sufficient EMC expertise available in case problems had to be solved. The front compartment, closed with a dedicated cover, in addition shields the OBC's high speed data interfaces from external EMC influences.
• The OBC's thermal concept was based on the approach that the OBC is mounted with its bottom side on a radiator plate in the satellite and conducts its waste heat to this radiator. This however results in the OBC being potentially too cold in the non-operational state. Heater mats placed on the separation walls between every second OBC frame assure that operational temperature is kept. The heaters are controlled by bi-metal thermostats in the OBC front compartment.
• The OBC was designed for single redundancy of all subunits. Therefore each
board type is available in two instances and all data interfaces between the
boards are cross coupled.
• The first OBC board type is the power supply board, which provides the 3.3 V
power supply to the individual PCBs and which itself is coupled to the 24 V
satellite power supply from the PCDU. The OBC Power-Board in addition
performs the routing of some OBC Processor-Board interfaces to the outside of
the box to the satellite skin connector, namely the SIF, the JTAG-I/F, the line-in
of the 1 Hz GPS PPS strobe and the line-out of the star tracker PPS strobe.
• The second board type obviously is the Processor-Board, a so-called Single
Board Computer (SBC), since it comprises not only the CPU, but also non-
volatile memory for bootloader, operating system and OBSW boot image, as
well as RAM for running the OBSW. Packet stores for OBSW housekeeping
data, for the S/C state vector and the S/C configuration vector are available on
the next board type, the so-called I/O-Boards.
• The I/O-Boards are coupled to the Processor-Boards via SpaceWire interfaces, and their main function is to bridge to all types of lower level digital interfaces of S/C onboard equipment. So all onboard units providing interface types like UART, IIC or others are coupled to the I/O-Boards, and the Processor-Board can access their data via its SpaceWire interface.
The I/O-Board design is based on a radiation tolerant FPGA with an according
IP Core which is accessed via SpaceWire from the Processor-Board and on
according driver ICs for each type of digital I/O interface to connected space-
craft equipment.

Since this approach requires some buffers and memory chips on the board anyway, these boards are designed, via additional memory and accordingly enhanced IP Core functions, to also handle the storage of OBSW housekeeping data, the S/C state vector and the S/C configuration vector.
• The fourth type of boards are the CCSDS protocol decoder/encoder boards
which also are coupled to the Processor-Boards via SpaceWire interfaces and on
the other side are interfacing the spacecraft’s transceiver units. These boards
perform the low-level decoding of the uplinked telecommands and the low level
encoding of the telemetry to be downlinked during ground station visibility.
The approach for these boards was to use the identical PCB and FPGA as for the
I/O-Boards, and to just equip the CCSDS-Boards with only a limited number of
external interfaces, namely those to the transceivers, to the PCDU for High
Priority Commands (HPC) and those for cross coupling of the Command Link
Control Word (CLCW) interfaces (see later in Sect. 4.2.2). Obviously these
CCSDS-Boards comprise a different loaded IP Core since they perform a
completely different task.
The processing IP Cores in the FPGA on the CCSDS-Board and the software
libraries which are available in the RTOS tailoring and which are running on the
Processor-Board were designed in a common architecture.
When looking at the above mentioned design paradigms, a reader familiar with classic OBC design will immediately identify some elementary OBC functions missing in the board types described so far. These functions have been implemented through a sort of functional merging of OBC and PCDU, and they will be explained in the following sections.

1.3 The PCDU as Analog RIU

Already at the beginning of the OBC development, a PCDU from Vectronic Aerospace, Berlin, was preselected for satellite power bus management. This family of PCDUs is equipped with relatively powerful microcontrollers, with analog current and voltage measurement functions, and with according functions for TM reporting to the OBC. More details can be found in Chap. 8.
Thus, as a first part of classic OBC functions, the tasks of
• analog current measurement,
• voltage measurement and
• parameter analog-to-digital conversion
for the analog power lines and also for all analog sensor interfaces—e.g. for thermistor readings—are implemented as PCDU functions in this OBC/PCDU co-design. In the CDPI the PCDU thus takes over these tasks from what would be a Remote Interface Unit (RIU) in a classic OBC design.

1.4 Common System Reconfiguration Controller

The system reconfiguration functionality of the overall satellite with respect to Failure Detection, Isolation and Recovery (FDIR) in emergency cases is based on a concept invented by the author, which is covered by an Astrium Satellites patent [48]. For a proper understanding of the concept, some explanations shall be given before describing the implementation.
OBC and PCDU are the two key components for control and safety manage-
ment onboard a satellite. All controllers and electronics inside of the OBC and
PCDU are implemented at least with single redundancy (cited below as version A
and B in the translations and figures taken from [48]). Furthermore, they are cross
coupled. Inside an OBC, for example, the processor module CPU A can thus be operated with the housekeeping TM mass memory A or B. The standard setting is usually to operate all units on the A side at launch and to use the all-B configuration for the safe-mode switchover. In the exemplary sketch in Fig. 1.3 only the most important OBC elements are cited; for example, there exist diverse types of memory in real OBCs (PROM, RAM, Safeguard Memory etc.). Internal clock, clock strobe distribution units and the like are also not depicted, but they underlie the same redundancy switchover concepts which are explained below.

Onboard
Reconfi - Memory A Computer
guration
Unit A
Decoder CPU A
Encoder Data Data Bus A
Board A Bus
Trans- Ctrl A
ceiver Data Data Bus B
Decoder Bus
Encoder Ctrl B
Board B CPU B SC SC SC
Reconfi - Equipm. 1 Equipm. 2 Equipm. 3
guration
Unit B Memory B

Pwr Supply A Pwr Supply B

PCDU
Battery Ctrl A

Pwr Pwr Bus A


Electronics LCLs/FCLs
Pwr Bus B

Solar PCDU
Array Ctrl B Power Control & Distribution Unit

Fig. 1.3 Conventional infrastructure for data management (OBC) and power management
(PCDU).  Astrium—see [48]

For taking a spacecraft payload into operation—say "Spacecraft Equipment 1" in Fig. 1.3—the satellite receives commands from ground via its transceiver; these are decoded by the decoder/encoder unit or board and are then routed as command packets to the OBSW running on the CPU. For payload activation the OBSW first commands the active PCDU controller (A or B) via data bus to supply the payload equipment with power—which means closing the according Latching Current Limiter (LCL) on the currently active power bus. Thereafter the payload itself is commandable by the OBC via the active data bus. Telemetry from the payload is delivered by the OBC to the active encoder for encoding and routing to ground via the transceiver.
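
The two-step activation sequence can be sketched as follows in hypothetical OBSW code: the PCDU is first commanded to close the equipment's LCL, and only then is the powered unit commanded over the data bus. All identifiers and command bytes are illustrative assumptions, not the flight interface.

    #include <stdint.h>

    extern int pcdu_close_lcl(uint8_t lcl_id);   /* power on via PCDU */
    extern int bus_send(uint8_t unit_addr,
                        const uint8_t *cmd, uint16_t len);

    int activate_payload(uint8_t lcl_id, uint8_t unit_addr)
    {
        /* Step 1: close the equipment's LCL on the active power bus. */
        if (pcdu_close_lcl(lcl_id) != 0)
            return -1;

        /* Step 2: the powered payload is now commandable via the bus. */
        static const uint8_t GO_OPERATIONAL[] = { 0xA5, 0x01 }; /* assumed */
        return bus_send(unit_addr, GO_OPERATIONAL, sizeof GO_OPERATIONAL);
    }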

1.4.1 Component Functions During Failure Handling

The understanding of the roles of decoder/encoder modules, CPUs, Reconfiguration Units, PCDU controllers etc. for failure handling is essential, and four types of errors shall be assessed which represent the most typical, most frequently occurring ones. In any case of reconfiguration of the Data Handling and Operations
(DH/OPS) subsystem or of the power subsystem, essential control functions of the
satellite are at least blocked or non-operational for a limited time which will
degrade the spacecraft attitude and power control. So besides pure failure management in the form of a switchover to a redundant unit, the FDIR system must assure shutdown of non-essential power loads (payloads, thermal consumers) and must
assure bringing back the satellite into a safe-mode configuration. The overall
process of such FDIR management is a combination of functions implemented in
hardware and functions coded in OBSW and thus goes partly beyond the scope of
this book. Here only those functions to be implemented in real OBC hardware
shall be explained. For further reading please refer to [10].

1.4.1.1 Failure Type 1: Failures Identifiable by the Running OBSW

Examples of this type of hardware failure are a bus controller which sporadically does not respond, requiring retries by the OBSW, or the case where the OBSW receives mass memory EDAC bit error messages. In such a case
the OBSW can send so-called High Priority Commands (HPC) to the active
Reconfiguration Unit for DH/Ops and Power FDIR which then triggers the
reconfiguration to the redundant OBC bus controller—or in this case the entire
I/O-Board. In other error cases the Reconfiguration Unit may be forced to switch
over to the redundant Processor-Board. In the latter case the OBSW has to be
rebooted on the target Processor-Board.
To sum up, the triggering of the reconfiguration process is initiated in these cases by the OBSW; the diverse reconfiguration steps, with redundant subunit power-up, defective-unit power-down and potential intermediate steps for data shift-over, are carried out by the functions implemented in the Reconfiguration Unit.

1.4.1.2 Failure Type 2: Crash of the OBSW and Auto-Reconfiguration

Failure type 2 covers all those errors which coincide with a crash of the OBSW. In such a case a Reconfiguration Unit outside the Processor-Board first has to detect the OBSW failure and then has to perform the reconfiguration.
In the simple case of an OBSW crash due to a bug or an IC Single-Event Upset (SEU), the OBSW has to be restarted by the Reconfiguration Unit. The failure detection is usually performed by means of the processor cyclically sending watchdog signals to the Reconfiguration Unit, with the latter starting FDIR actions in case the watchdog signal exceeds a timeout.
The reconfiguration activities range from a simple processor soft reset to a complete Processor-Board and/or I/O-Board redundancy switchover, depending on the symptoms detected by the Reconfiguration Unit. The number and type of watchdogs and timers obviously have to be designed appropriately for the Reconfiguration Unit to be able to distinguish between different error cases.
Such an automated reconfiguration with a broken OBSW is only possible for a limited set of failure cases. The resolution of the complex ones, and moreover the root cause analysis, requires intervention from ground, which leads to the next failure class described below.
In summary it can be stated that the Processor-Board shall feature watchdog lines to the Reconfiguration Unit for detection of OBSW crashes or I/O-Board signal routing failures, which then can be used for OBSW restart or hardware reconfiguration—the latter only in very limited cases.
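
The watchdog supervision logic can be summarized by the following sketch. In the real design this function resides in reconfiguration hardware, not in software; the C form, the timeout value and the escalation threshold are purely illustrative assumptions.

    #include <stdint.h>

    #define WDG_TIMEOUT_MS  2000u  /* assumed OBSW watchdog timeout  */
    #define MAX_SOFT_RESETS 3u     /* assumed escalation threshold   */

    extern uint32_t time_ms(void);               /* assumed time source    */
    extern void cpu_soft_reset(void);            /* restart OBSW, same CPU */
    extern void switch_to_redundant_board(void); /* power-cycle to B side  */

    static uint32_t last_kick_ms;  /* refreshed by each OBSW watchdog pulse */
    static uint32_t soft_resets;

    void on_watchdog_pulse(void) { last_kick_ms = time_ms(); }

    void supervise_obsw(void)
    {
        if (time_ms() - last_kick_ms < WDG_TIMEOUT_MS)
            return;                          /* OBSW alive, nothing to do */
        if (soft_resets++ < MAX_SOFT_RESETS)
            cpu_soft_reset();                /* try an OBSW restart first */
        else
            switch_to_redundant_board();     /* escalate to redundancy    */
        last_kick_ms = time_ms();            /* restart the timeout window */
    }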

1.4.1.3 Failure Type 3: Crash of OBSW and Reconfiguration from Ground

If SW crashes and auto-reconfiguration or OBSW reboot initiation are not sufficient to overcome the problem, the reconfiguration has to be initiated and performed from ground. This usually starts with the initiation of power shutdowns
for unnecessary loads and continues with deactivation of suspected defective units
and activation of hopefully still healthy ones.
All ground commands for these switch-overs have to bypass the OBSW, as it is non-operational. Therefore a special class of HPCs is used (so-called class 1 HPCs or HPC1 commands) which can be identified by the OBC's TC decoder modules and which are not routed to the OBSW on the Processor-Board like normal commands. Instead these HPCs are directly forwarded to a specific subunit
in the Reconfiguration Unit—the so-called Command Pulse Decoding Unit
(CPDU). For more details please refer to [10]. The CPDU can decode such HPCs
and can send pulse commands to the Power Control and Distribution Unit (PCDU)

to trigger LCL relay on/off switching, according to what the targeted unit’s status
shall be.
This allows ground to activate or deactivate onboard equipment completely independently of any OBSW, to overcome severe failure situations.
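
The routing decision in the TC decoder can be sketched as follows. That hardware commands arrive on a dedicated virtual channel is described in Chap. 4; the concrete VC numbers and function names below are assumptions for illustration only.

    #include <stdint.h>

    #define VC_SOFTWARE 0u /* assumed VC for normal TCs, routed to the OBSW */
    #define VC_HARDWARE 1u /* assumed VC for HPC1 frames, decoded by CPDU   */

    extern void deliver_to_obsw(const uint8_t *frame, uint16_t len);
    extern void cpdu_issue_pulse_cmd(const uint8_t *frame, uint16_t len);

    /* HPC1 frames bypass the (possibly crashed) OBSW entirely: the CPDU
       converts them directly into pulse commands for the PCDU LCLs. */
    void route_tc_frame(uint8_t vcid, const uint8_t *frame, uint16_t len)
    {
        if (vcid == VC_HARDWARE)
            cpdu_issue_pulse_cmd(frame, len);
        else
            deliver_to_obsw(frame, len);
    }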

1.4.1.4 Failure Type 4: Power Bus Undervoltage Failures

The final main type of failures are those leading to a power shortage on board. In such a case—independent of whatever the root cause was—the S/C equipment is
shut off in several steps. The first units being disabled are payloads and payload
data mass memory units. In further sequence then—if the OBSW apparently is
unable to manage the problem—the platform units including OBC are shut down.
And in case even these measures do not overcome the problem, the PCDU finally
deactivates itself. For these cases a PCDU is equipped with an auto-activation
function for its controller as soon as the S/C power bus again supplies sufficient
voltage—e.g. due to the satellite returning from orbit eclipse phase to sun phase.
With increasing power availability the PCDU subsequently activates further S/C
platform units, first unit being the OBC, respectively in Safe Mode the redundant
OBC (depending on the latest settings in the Reconfiguration Unit). The latter then
activates platform AOCS units to achieve a stable SC safe-mode attitude acqui-
sition and potential ground contact through activation of the transceivers.
In summary, in these cases reboot and reconfiguration are initiated by the PCDU controller, and again PCDU controller and Reconfiguration Unit together manage the recovery.

1.4.2 Innovation: A Combined-Controller for all FDIR Functions

In a classic unit architecture three controllers on board implement the functionality, namely the OBC's Reconfiguration Unit [10], as its subunit the OBC's Command Pulse Decoding Unit (CPDU) [10], as well as the PCDU's internal controller IC. These elements are typically implemented without being based on Onboard Software, either as pure electronic circuits or as FPGA or ASIC implementations.
This implies that the S/C flies three controllers which need to be highly reliable and thus require corresponding design and test effort. The basic idea behind the patent [48] is to implement only one FDIR controller for all data handling and power FDIR reconfiguration and emergency functions—the patent terms it a "Combined-Controller". In such a case it is of no relevance whether the Combined-Controller physically is mounted inside the OBC housing or the PCDU housing or even in a common housing of both (Fig. 1.4).
Fig. 1.4 Combined Data and Power Management Infrastructure. © Astrium, see [48]

The FLP target satellite from the University of Stuttgart is the first one to fly such a CDPI architecture with a Combined-Controller. In this first implementation the Combined-Controller physically resides in the PCDU housing and is an implementation of the standard Vectronic Aerospace PCDU controller with enhanced firmware functions defined by the Stuttgart FLP team and implemented by Vectronic Aerospace. More details on the PCDU and its controller functions can be found in Chap. 8.
In normal operational cases the OBSW still can command the "PCDU" controller—i.e. the Combined-Controller—to switch spacecraft equipment power lines on/off. To avoid overloading the above figure, the routing of these links has been omitted (in comparison to Fig. 1.3). The same applies to the omitted data bus lines to the diverse spacecraft equipment which were still depicted in Fig. 1.3.
The reconfiguration now is triggered by the Combined-Controller either via soft-reset command lines to the individual OBC subunits (e.g. CPU) or via power resets, i.e. via power-down of the defective unit and power-up of the component's redundancy.
As an example, the Combined-Controller (residing in the PCDU in the current implementation example) can separately power each OBC decoder/encoder board, each Processor-Board with CPU, NVRAM, RAM and clock module, and finally also each OBC I/O-Board. The apparently "additional" cabling needed between CC and these elements—compared to a classic architecture—was hidden inside the Reconfiguration Unit of the OBC in the design of Fig. 1.3 and thus represents no real additional effort.
1.4.3 Failure Management with the Combined-Controller

1. In failure case 1 the running OBSW triggers the reconfiguration functions implemented in the Combined-Controller (CC) via the CPU-CC links, for example a reconfiguration of an OBC decoder/encoder unit.
2. In failure case 2 the auto-reconfiguration is unchanged. It simply is performed through the Combined-Controller, for example induced through watchdog signal timeouts etc. Such watchdog or alert signal lines from the CPU now have to be routed to the Combined-Controller.
3. In failure case 3, where reconfiguration has to be performed via High Priority Commands from ground, the corresponding reconfiguration command routing goes from the decoders to the CC for reconfiguration command execution in the CC.
4. In failure case 4 (power failures, power bus undervoltage) the approach of satellite power-down and eventual reboot is controlled as in the classic architecture. Just the functions formerly implemented inside the PCDU controller now are located in the CC.
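As a purely illustrative summary of this list, the four failure classes can be thought of as a dispatch inside the Combined-Controller. All names below are invented for the sketch; the actual flight firmware is the Vectronic Aerospace implementation described in Chap. 8:

/* Sketch only: the four failure cases of Sect. 1.4.1 as a CC dispatch. */
typedef enum {
    FAIL_UNIT_DEFECT_OBSW_ALIVE,  /* case 1: OBSW requests reconfiguration */
    FAIL_OBSW_CRASH,              /* case 2: watchdog timeout detected      */
    FAIL_OBSW_CRASH_GROUND,       /* case 3: HPC1 commands from ground      */
    FAIL_POWER_UNDERVOLTAGE       /* case 4: PCDU power bus FDIR            */
} failure_case_t;

void cc_handle_failure(failure_case_t f)
{
    switch (f) {
    case FAIL_UNIT_DEFECT_OBSW_ALIVE:
        /* execute the reconfiguration commanded via the CPU-CC link */
        break;
    case FAIL_OBSW_CRASH:
        /* soft reset the CPU or power up the redundant board */
        break;
    case FAIL_OBSW_CRASH_GROUND:
        /* execute HPC1 pulse commands received from the decoders */
        break;
    case FAIL_POWER_UNDERVOLTAGE:
        /* staged load shedding, self-deactivation, later auto-boot */
        break;
    }
}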

1.4.4 Advantages of the Combined-Controller Approach

The advantages of the CDPI design with one Combined-Controller instead of the trio of the OBC's Reconfiguration Unit, the Command Pulse Decoding Unit and the PCDU's internal controller are as follows:
1. Only one single critical controller IC (FPGA or ASIC) has to be designed—respectively a single chip firmware has to be implemented.
2. Intensive tests have to be performed for only one single controller chip.
3. Complex tests for proper interaction of a classic OBC Reconfiguration Unit with the PCDU controllers for handling of the diverse failure cases are significantly simplified. Some specific cases become completely obsolete.
4. For the OBSW design, compared to the classical architecture, only minimal adaptations are necessary for triggering system reconfigurations—so no additional effort is implied in this field either.
5. The OBC CCSDS decoder/encoder board architecture also is not touched by the new concept implementation—except for its I/O cabling.
6. The analog CPDU as subunit of the OBC's Reconfiguration Unit, which in the conventional architecture serves to submit the analog pulse commands to the PCDU controller, is completely obsolete in the new concept. The corresponding class 1 High Priority Commands (HPC) can be directly submitted to the Combined-Controller from the OBC CCSDS decoder via a normal digital link. The corresponding analog LCL control electronics inside the OBC also is obsolete.
These simplifications significantly reduce the manufacturing and verification effort and the resulting cost for the tandem of OBC and PCDU, which makes the architecture particularly interesting for low-cost missions. Astrium granted free use of the patent to the IRS for the FLP satellite mission.

1.5 Completeness of System Architecture

With the presented overall system design approach for the Combined Data and Power Management Infrastructure (CDPI), all necessary flight relevant functions (see also [10]) are covered—although not all are allocated to the classic boards or components:
• OBC power supply -> on OBC Power-Boards
• OBC processor and internal data bus -> on OBC Processor-Boards
• OBC memory—non-volatile for boot loader, operating system and OBSW—and volatile as work memory -> on OBC Processor-Boards
• OBC packet stores for Housekeeping Telemetry (HK TM), spacecraft state vector and spacecraft configuration vector -> on I/O-Board
• OBC digital RIU function—coupling of all non-SpaceWire digital equipment I/O to the OBC -> on I/O-Board
• Interface to the spacecraft transceivers decoding TCs and encoding TM -> on CCSDS-Boards
• OBC analog RIU functions—coupling all analog interface control and analog parameter measurements to the OBC -> implemented in PCDU and commanded/controlled from OBC via OBC-PCDU UART interfaces
• OBC reconfiguration functions -> implemented in the Combined-Controller. In this implementation realized by the PCDU controller with enhanced firmware.
• The OBC HPC interface functionality to implicitly implement the functions of an OBC's CPDU -> implemented in the Combined-Controller firmware and accessible through additional UART interfaces via the OBC CCSDS-Boards
• Bus power management functions -> implemented in the PCDU Combined-Controller
• Equipment power supply switching and overvoltage/overcurrent protection -> implemented in the PCDU Combined-Controller
• Power bus undervoltage FDIR functions (DNEL functions) -> implemented in the PCDU Combined-Controller.
An overview figure of the CDPI with all external I/Os and the interlinks between the OBC and PCDU parts is given in Fig. 1.5. Box-internal cross connects, like between Processor-Boards and I/O-Boards, are not included so as not to overcomplicate the figure here. The same applies to cross couplings of redundant elements within OBC and PCDU respectively. All these are treated in more detail in Chap. 6.
Fig. 1.5 The combined data and power management infrastructure


The Combined Data and Power Management Infrastructure (CDPI) conceptualized for the IRS FLP satellite was designed to follow exactly these paradigms as explained in Sects. 1.2–1.4. The selection process for baseline units for the individual CDPI components, like the Processor-Board, is further described in the next section.

1.6 Component Selection

1.6.1 Processor-Boards

At the time of concrete development of the FLP OBC and PCDU the initial problem was to find a supplier providing a suitable CPU board for the OBC. The initial idea was to base the development on one of the diverse available CPU test boards from Aeroflex Gaisler AB and implement the necessary modifications—since these test boards were not designed to be flight hardware (Fig. 1.6).

Fig. 1.6 Aeroflex Gaisler test board. © Aeroflex Gaisler AB

By coincidence the author met Jiri Gaisler, the founder of former Gaisler Research—today Aeroflex Gaisler AB—at the Data Systems in Aerospace conference in May 2009 in Istanbul, and Jiri Gaisler was aware that Aeroflex Colorado Springs had just started the development of a Single Board Computer (SBC) based on their LEON3FT processor UT699.
The LEON3FT (cf. [49] and [50]) architecture includes the following peripheral
blocks (please also refer to Fig. 1.7):
Fig. 1.7 UT699 LEON3FT SOC block diagram. © Aeroflex Gaisler AB

• LEON3FT SPARC V8 integer unit with 8 kByte instruction and 8 kByte of data
cache
• IEEE-754 floating point unit
• 8/16/32-bit memory controller with EDAC for external PROM and SRAM
• 32-bit SDRAM controller with EDAC for external SDRAM
• 16-bit general purpose I/O port (GPIO)
• Timer/watchdog unit
• Interrupt controller
• Debug support unit with UART and JTAG debug interface
• 4 SpaceWire links with RMAP
• Up to two CAN controllers
• Ethernet
• cPCI interface.
What stands out here is that the system has the processor itself, the interface controllers for CAN bus, Ethernet, cPCI and especially SpaceWire all implemented on the same chip. Furthermore, the chip provides a debug interface. All these additional on-chip elements are connected to the CPU core via an internal AMBA bus which was originally developed by ARM Ltd. for the ARM processor family.
A block diagram of the overall Processor-Board design intended at that time is depicted in Fig. 1.8. This system was designed to include:
• The processor
• SRAM memory for OBSW computation
• Non-volatile RAM (NVRAM), which could be used like an EEPROM for storing the initial boot OBSW image
• SpaceWire interfaces to couple the other foreseen OBC board types to the SBC
• RS422 interfaces
• A clock module.

Fig. 1.8 UT699 SBC draft block diagram. © Aeroflex Inc.

All this functionality was intended to be mounted on a Eurocard-size Printed Circuit Board (PCB).
The university satellite project was presented to Aeroflex. A team from Aeroflex and their German distributor Protec GmbH, Munich, visited the university, and Aeroflex finally decided to support the university project and act as Processor-Board supplier.
The Processor-Board development itself was then handled like a standard commercial project with requirements review, design reviews, manufacturing review, test review and shipment review—both for an initial EM (see Fig. 1.9) as well as for the two later FMs (see Fig. 2.1).
The university team is indebted to Aeroflex and their German distributor Protec GmbH, Munich, for guiding us through the formalisms of the procurement of the CPU boards, which are ITAR products.
Compared to the initial design diagram of Aeroflex cited in Fig. 1.8, the Processor-Board design was modified later during the overall project to have 4 SpaceWire interfaces which are absolutely identical w.r.t. their access from the OBSW, to provide a Service Interface (SIF), an OBSW load and debug interface, an external Pulse Per Second (PPS) signal input as well as
Fig. 1.9 OBC Processor-Board with JTAG and OBSW load adapter—EM—mounted in test rack [7, 87]. © IRS, University of Stuttgart

a Processor-Board generated PPS output. All details on the final design "as implemented" can be taken from the Processor-Board description in Chap. 2 of this book and from Fig. 2.5.
Since Aeroflex Colorado Springs was not permitted to deliver a bootloader or software/RTOS for the Processor-Board due to ITAR, the solution was the partnership with Aeroflex Gaisler AB, Sweden, applying their RTEMS SPARC tailoring running on the LEON3FT.

1.6.2 CCSDS Decoder/Encoder

On the side of satellite ground command/control the university uses a SCOS system licensed from ESOC and thus applies the CCSDS spacecraft TC/TM control standards [23]. Therefore a TC/TM decoder/encoder functionality according to these standards was necessary. Being already in contact with Aeroflex Gaisler, it was the obvious choice to utilize Aeroflex Gaisler's CCSDS TC/TM processing infrastructure for the decoder/encoder boards (cf. [62]). For the students developing the satellite this avoided the huge effort of programming such functionalities for these complex frame management, compression and bit-stream error correction functions (see [10]), which definitely would have exceeded the team's resources.
• The CCSDS Telecommand Decoder implements in hardware the synchronization and channel coding sub-layers and part of the physical layer. The higher layers are implemented in software as libraries, integrated into the Aeroflex Gaisler SPARC V8 tailoring of the RTEMS realtime operating system. This software implementation of the higher layers allows for implementation flexibility and accommodation of future standard enhancements. The hardware decoded command outputs and pulses do not require software and can therefore be used for critical operations. The CCSDS telecommand decoder provides the entire functionality of reassembling TCs from NRZ-L CLTU level coming from the receiver side up to Telecommand Frame content. It identifies hardware commands to be routed to the PCDU's Combined-Controller. The special implementation selected here is not based on marking HPCs by MAP-ID, but on dedicated Virtual Channels (see the routing sketch after this list). In the presented CDPI implementation the hardware commands are routed to the PCDU via an RS422 interface link and the CPDU functionality is implicitly covered by the Combined-Controller of the PCDU.
• The CCSDS/ECSS Telemetry Encoder implements—entirely in hardware—the protocol sub-layer, the synchronization & channel coding sub-layer (e.g. Reed-Solomon and convolutional encoding), and part of the physical layer. Also here the higher layers are implemented in software as libraries, integrated into the Aeroflex Gaisler SPARC V8 tailoring of the RTEMS realtime operating system. Thus the CCSDS IP Core provides TM wrapping from Virtual Channel level into CADUs handed over to the transmitter for downlink.
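The virtual channel based HPC routing mentioned in the first bullet can be illustrated with a minimal sketch. The VC assignment follows the FLP definition in Fig. 1.11 (VC0/VC2 for software TCs to OBC core N/R, VC1/VC3 for HPC1 commands); the helper functions are hypothetical placeholders:

#include <stdint.h>
#include <stdbool.h>

extern void forward_to_pcdu_uart(const uint8_t *frame, uint32_t len); /* hypothetical */
extern void deliver_to_obsw(const uint8_t *frame, uint32_t len);      /* hypothetical */

/* VC1 and VC3 carry HPC1 commands according to Fig. 1.11. */
static bool is_hpc1_channel(uint8_t vcid)
{
    return (vcid == 1u) || (vcid == 3u);
}

/* Frames on the HPC1 channels bypass the OBSW entirely and leave the
 * CCSDS-Board on the RS422 link towards the PCDU's Combined-Controller;
 * all other frames are handed to the OBSW on the Processor-Board. */
void route_tc_frame(uint8_t vcid, const uint8_t *frame, uint32_t len)
{
    if (is_hpc1_channel(vcid))
        forward_to_pcdu_uart(frame, len);
    else
        deliver_to_obsw(frame, len);
}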
The above mentioned "in hardware" functions of the decoder and encoder are implemented in an IP Core to be loaded into the FPGA of the corresponding decoder/encoder board of the OBC—or CCSDS-Board for short. The corresponding software libraries also are available in the RTEMS from Aeroflex Gaisler AB. More details on the entire CCSDS decoder/encoder architecture including explanations of the in-software and in-hardware implemented parts can be taken from Chap. 4. This also comprises the SpaceWire RMAP protocol interfacing to the decoder/encoder. A block diagram of the entire architecture comprising both "in-software" and "in-hardware" layers is shown there in Fig. 4.1.
In the overall CDPI EM testbed assembly a Breadboard Model in the form of a standard Aeroflex Gaisler/Pender-Electronics test board was used to verify the IP Core functions in cooperation with the spacecraft transceiver bypass on the input side and Processor-Board and PCDU Combined-Controller respectively on the output side (Fig. 1.10). The TC and TM packet definitions of the FLP satellite are depicted in the following Figs. 1.11 and 1.12.

Fig. 1.10 CCSDS Decoder-/Encoder Board—Breadboard Assembly. © Aeroflex Gaisler AB
Fig. 1.11 Telecommand packet definition. © IRS, University of Stuttgart, template © Astrium
Legend for Figs. 1.11 and 1.12: (=<value>) ::= fixed numerical value; next line from bottom: number of bits in field; bottom line: number of bytes in field.
Fig. 1.12 Telemetry packet definition. © IRS, University of Stuttgart, template © Astrium
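Both packet layouts share the standard 6-byte CCSDS source packet header shown in Figs. 1.11 and 1.12: version (3 bit, fixed to 000), type (1 bit), data field header flag (1 bit), APID (11 bit), sequence flags (2 bit, 11 for unsegmented packets), sequence count (14 bit) and the 16 bit packet length, which by CCSDS convention holds the packet data field length minus one. A minimal sketch of packing this header:

#include <stdint.h>

/* Pack the 6-byte CCSDS source packet header (cf. Figs. 1.11/1.12). */
void ccsds_pack_primary_header(uint8_t h[6], uint8_t type, uint8_t dfh,
                               uint16_t apid, uint16_t seq_count,
                               uint16_t data_field_len)
{
    uint16_t pkt_id  = (uint16_t)(((type & 1u) << 12) |   /* version = 000 */
                                  ((dfh  & 1u) << 11) |
                                  (apid & 0x7FFu));
    uint16_t pkt_seq = (uint16_t)((3u << 14) |            /* flags = 11    */
                                  (seq_count & 0x3FFFu));
    uint16_t length  = (uint16_t)(data_field_len - 1u);   /* CCSDS rule    */

    h[0] = (uint8_t)(pkt_id >> 8);   h[1] = (uint8_t)pkt_id;
    h[2] = (uint8_t)(pkt_seq >> 8);  h[3] = (uint8_t)pkt_seq;
    h[4] = (uint8_t)(length >> 8);   h[5] = (uint8_t)length;
}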
1.6.3 I/O and CCSDS-Board

The bridge between the Processor-Boards and the spacecraft platform and payload equipment is realized via intermediate I/O-Boards (Fig. 1.13). These I/O-Boards mimic the digital interface function of a Remote Interface Unit (RIU) in a commercial S/C. The I/O-Boards are connected to the Processor-Boards via a SpaceWire connection running the RMAP protocol. Two redundant I/O-Boards are available and are cross coupled to the two redundant Processor-Boards. For the development of the I/O-Boards the university selected 4Links Ltd., UK, as partner due to their extensive experience with SpaceWire equipment and software. Another reason for selecting 4Links was the excellent experience of the author with 4Links SpaceWire test equipment on ground.

Fig. 1.13 I/O-Board—EM. © 4Links Ltd

The I/O-Boards' SpaceWire functionality and the data routing between SpaceWire and the low level I/O and bus interfaces have been implemented as an IP Core on a Microsemi/Actel A3PE3000L FPGA. Applying Triple Module Redundancy (TMR) for all gates on the FPGA makes this part of the OBC infrastructure sufficiently radiation tolerant, which is an outstanding criterion of the FLP satellite compared to other university projects.
The idea for the CCSDS-Boards was to use the identical design as for the I/O-Boards, just with fewer interfaces:
• The coupling of the CCSDS-Boards to the Processor-Boards was planned to be done similarly via SpaceWire links.
• The interface on the CCSDS-Board's transceiver side would be RS422.
• The interface for HPCs to the Combined-Controller of the PCDU for commands bypassing the OBSW also was designed as RS422 type.
• And finally the above mentioned CCSDS IP Core was intended to be loaded on the same type of FPGA as for the I/O-Boards.

With this approach the CCSDS-Boards could be implemented without any additional effort, just by use of the same housing, FPGA, PCB, connectors and the like—as a "not fully equipped" I/O-Board with a different IP Core.

1.6.4 OBC Power-Boards

For the power supply boards—which are described in more detail in Chap. 5—it was decided right from the start to manufacture these at the IRS by the electrical engineering team. These Power-Boards mainly have the task of converting the satellite power bus voltage of 24 V down to the voltages required by the different OBC data handling boards—which is 3.3 V nominal (Fig. 1.14).

Fig. 1.14 Power-Board—EM. © IRS, University of Stuttgart

During the design evolution the Power-Boards were in addition adapted to route some of the OBC internal signals, which are available e.g. on the Processor-Board's front connectors, to the top of the overall housing assembly—please refer to Figs. 1.1 and 1.2. Furthermore, logic circuitry was mounted to route the active pulse signal from the multiple redundant GPS receivers as a single output line towards the Processor-Boards.
1.6.5 The PCDU with Enhanced Functionality

As already cited in Sect. 1.3, the PCDU supplier for the satellite mission was already selected at the start of the overall CDPI architecture design. However, the cited functions for
• overall analog equipment command and control,
• the OBC reconfiguration and
• the HPC interface functionality to implicitly implement the functions of an OBC's CPDU
were specified to the supplier during the engineering process in accordance with the CDPI concept as explained in Sect. 1.4. All the PCDU functional upgrades from a standard PCDU controller to the Combined-Controller were implemented by Vectronic Aerospace into the PCDU firmware.
Details on these functions and features, on redundancy handling, cross cou-
plings etc. all can be found in Chap. 8. Already in the overall EM Satellite Test
Bed (STB) these functions were available and the corresponding PCDU EM is
shown in Fig. 1.15.

Fig. 1.15 Power Control and Distribution Unit—EM. © IRS, University of Stuttgart/Vectronic Aerospace

1.7 Testing the CDPI in an Integrated Environment

The entire CDPI electronics subsequently was tested on an EM Satellite Test Bed (STB) and later on a satellite FlatSat setup. Since the units became available step by step—partly as EM, partly as breadboard—and the overall STB setup was non-redundant, these units were mounted into a 19" rack together with power supply equipment, debugging equipment and a spacecraft simulator. Together with ground command equipment this STB infrastructure formed an adequate verification setup.
Figure 1.16 provides an initial impression of the overall infrastructure (please also refer to [4, 87]). The applied tests, the available test infrastructure etc. are covered in more detail in Chap. 9.

Fig. 1.16 Satellite test bed for OBC EM/EBB integration tests. © IRS, University of Stuttgart

1.8 The Flight Model Assembly

All Flight Model components of the CDPI, namely the OBC boards and the PCDU, have been manufactured under cleanroom conditions. The same applies to the assembly of the OBC from the individual frames and the integration of OBC and PCDU into the overall CDPI. These steps have been performed by the IRS in the cleanroom facilities of the "Raumfahrtzentrum Baden-Württemberg" in Stuttgart. Figure 1.17 depicts the CDPI assembly Flight Model of the FLP satellite in the IRS cleanroom. The setup here consists of the two separate housings of OBC and PCDU, ready for connecting the cabling of:
• OBC Telemetry output and Telecommand input lines to/from the satellite transceivers or their bypass
• Control lines from OBC to PCDU for normal control of power functions and for control of the PCDU's analog RIU functions
• OBC watchdog lines to the PCDU Combined-Controller
• HPC lines for PCDU commanding from ground via the CCSDS decoders.
The input power to the PCDU is provided here by a laboratory supply, which for flight is replaced by the solar array and the battery. The onboard battery is not yet mounted in this setup. For the tests of this assembly on unit and system level please refer to Chap. 9.
Fig. 1.17 CDPI FM units forming the start of the satellite FlatSat testbench. © IRS, University of Stuttgart

1.9 Conceptual Outlook for Future Missions

Comparing the implemented OBC/PCDU architecture with the overall satellite mission, the mission objectives and the satellite complexity—please also refer to Chap. 10—makes the solution seem a bit oversized at first impression. For the sake of controlling a simple university micro satellite, the OBC design, communication protocols etc. could have been much simpler, as e.g. applied in the DLR BIRD or ESA SSETI satellites.
The target for this infrastructure design however was to reach a technology level which also is of interest to industry for future mission applications, and this also was the reason for industry partners engaging so much in this project. This not only applies to the industry partners directly providing components for the OBC and PCDU, but also to those industry and agency partners providing test equipment or simulators at pure material cost or even as donations.
The developed CDPI design—for which the following dedicated chapters present the individual board details—can be easily adapted to other future low-cost missions1 concerning diverse aspects:
• The PCDU with its modular design can be adapted flexibly to other missions with different numbers of LCL and FCL outputs.
• The OBC and the PCDU also could be integrated in a single common housing which avoids the external OBC/PCDU channel wiring. This was not applied for the current mission since in the FLP platform OBC and PCDU reside geometrically in different satellite compartments with a shear wall in between.

1 "Low-cost" in the sense of a low-cost platform versus industry or agency project price ranges.

• The OBC design can be extended with external SpaceWire interfaces by adding
SpaceWire router boards, one for the nominal and one for the redundant OBC
side.
• The OBC design with its inter-board cabling and the external interfaces can be adapted to a classical PCB/backplane design if required. However, looking back to the individual, parallel board engineering processes and the decoupling of design and production cycles during the prototype development, the frame based approach with an inter-board harness showed some significant benefit concerning late design freeze for the boards.
• The overall OBC architecture also is flexible enough to exchange the CPU board for future multi-core LEON chip implementations such as the GR712 LEON3 dual-core SBC which is under development at Aeroflex Gaisler AB.
• Optionally the CDPI infrastructure can also be enhanced with an integrated
payload data management unit for future missions.
The engineering team is highly interested in applying the CDPI system concept in further missions as well.
Chapter 2
The OBC Processor-Boards

Sam Stratton and Dave Stevenson

2.1 The Processor-Boards as ITAR Item

The Processor-Board, as a radiation hard component and a product of a United States company, is classified as an element falling under the ITAR (International Traffic in Arms Regulations). The boards produced for the University of Stuttgart are provided under the TAA (Technical Assistance Agreement) TA-6151-10. The contributions in this chapter from the side of Aeroflex Inc. have been verified by the Aeroflex legal department to be in agreement with the ITAR rules and the aforementioned TAA.

S. Stratton (✉) · D. Stevenson
Aeroflex Colorado Springs Inc, Colorado Springs, CO, USA
e-mail: sam.stratton@aeroflex.com
D. Stevenson
e-mail: dave.stevenson@aeroflex.com


2.2 The Processor-Board: A Single Board Computer

The OBC Processor-Board was developed as an SBC (Single Board Computer), which also can be used for other purposes than controlling a satellite. Potential other applications in space could be as instrument controller for satellites or space station instruments and many other applications.
A Single Board Computer, or SBC for short, is a common term used for any computer that incorporates all of the essential components that a processor needs to perform its intended function. The SBC includes all of the memory resources, I/O and communication interfaces necessary to operate. The board is in fact a self-contained computer and functions as such without the need of additional circuit boards (Fig. 2.1).

Fig. 2.1 OBC Processor-Board flight model in the mounting frame. © Aeroflex Inc./IRS, Uni Stuttgart

Typical home computers do not implement an SBC as their Processor-Board. They will in most cases have a processor on a motherboard with additional cards connected to the motherboard for functions such as memory control and graphics processing. All computers have some basic requirements that are necessary for the computer to operate—requirements such as memory resources, interfaces including a keyboard and a monitor, as well as communication interfaces such as Ethernet and USB.
Some of the first computer cards developed in the 1970s were of the SBC type. These were the early days of processor technology, so it was easier to incorporate all of the necessary resources and means of communicating with the processor on a single circuit board. As processor, memory and interface technology progressed it became necessary to offload some of the functions to different cards. Functions such as Ethernet, memory controllers and graphics processing were typically implemented on separate circuit cards often called 'daughter' cards. Today, there are virtually no single board computers in the home PC market.
Though not typically used in home PCs, SBC computers are very common in satellite applications as well as in many industrial and aerospace computer systems. They are a much more specialized type of computer and, in the case of satellite and military applications, are designed for more extreme environments.
In general, most SBCs used for satellite or automation purposes will implement specialized operating systems called RTOS (Real Time Operating Systems). One of the main concepts behind an RTOS is the idea commonly referred to as "determinacy". Determinacy is the ability of a computer system to calculate with great accuracy the response time of any given process. Most satellite implementations of processors require the tasks performed by the processor to be very deterministic.
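A short sketch may make the determinacy idea concrete. Using the RTEMS classic API (the RTOS applied on this SBC, cf. Sect. 1.6.1), a rate-monotonic period turns a task into a strictly cyclic activity whose deadline overruns are reported explicitly. Treat the exact calls as an assumption against your RTEMS release (rtems_clock_get_ticks_per_second() exists in newer versions):

#include <rtems.h>

/* A 100 ms cyclic task: rtems_rate_monotonic_period() blocks until
 * the next period boundary and returns RTEMS_TIMEOUT if the previous
 * deadline was missed, allowing a deterministic reaction. */
rtems_task control_cycle_task(rtems_task_argument arg)
{
    rtems_id period_id;
    (void)arg;

    rtems_rate_monotonic_create(rtems_build_name('C', 'Y', 'C', 'L'),
                                &period_id);
    for (;;) {
        rtems_status_code sc = rtems_rate_monotonic_period(
            period_id, rtems_clock_get_ticks_per_second() / 10);
        if (sc == RTEMS_TIMEOUT) {
            /* deadline overrun: hook for FDIR / health reporting */
        }
        /* ... perform the cyclic OBSW processing here ... */
    }
}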

2.3 Technical Overview

The OBC Processor-Board is a Single Board Computer (SBC) that is designed to perform the main processor function within the OBC system. The board is designed and implemented based on requirements passed down from the University of Stuttgart to Aeroflex Colorado Springs. The requirements were then analyzed by Aeroflex and a design approach was arrived at in consultation with the University of Stuttgart. The following subsections describe the flight version of the SBC and not the EM version, due to the fact that there are some significant differences (Fig. 2.2).

Fig. 2.2 Processor-Board engineering model and flight model. © Aeroflex Inc.

2.3.1 The OBC Microprocessor Device

The processor on an SBC is the heart of the board and is chosen based on the performance requirements of the program. In this case the term 'performance' is used in a much broader context than one would usually think. For a satellite, or other high reliability application, performance is not just a reference to processing power but can and usually does include environmental and mechanical performance as well. The environmental performance capabilities are important because satellite OBCs will typically be subjected to much higher extremes of temperature and mechanical stress than a typical home PC processor chip. Additional parameters that are considered when choosing a processor for a satellite application are power consumption as well as radiation performance. All of these requirements are important and are considered very carefully before a processor is chosen for any satellite program.
For the FLP satellite’s OBC Processor-Board, the Aeroflex UT699 LEON3FT
processor was chosen (Fig. 2.3). This processor is highly suited for satellite
applications based on all of the above criteria. The LEON3FT is a 32 bit
SPARCTM V8 device with many interfaces suitable for multiple types of imple-
mentations. For the FLP satellite program, it was decided at the program level to
use SpaceWire as the primary interface between all of the boards in the OBC unit.
SpaceWire is a high speed serial bus uniquely suited for inter-system communi-
cation and control of a satellite. The bus is essentially a bit shipping protocol with
the data being defined by the user for their own specific needs. For a full
description of the SpaceWire protocol please refer to [11] and [12].

Fig. 2.3 LEON3FT in CQFP package. © Aeroflex Inc.

2.3.2 The OBC Memory Configuration

All SBCs require memory in order to process data and perform tasks. Two different types of memory fulfill different functions inside an embedded computer:
Non-volatile Memory:
The first type of memory is known as non-volatile and is named as such due to the fact that when the power is removed from the board the device retains the data stored inside its cells. Non-volatile memory is essential because it typically contains the operating system the processor will use when it boots on power-up. In a home PC the non-volatile memory is fulfilled by the boot EEPROM and the hard disk drive, though in recent years more and more solid state devices take the place of a hard disk. Since there are no hard disk drives that are qualified to fly in space, the non-volatile memory in a satellite needs to be of the solid state variety. Some types of non-volatile memory include Flash memory, EEPROM, FRAM, and MRAM. These devices will retain data when not powered.

Volatile Memory:
The second type of memory on an SBC is referred to as volatile memory. Volatile
memory will not retain data when the power is removed and therefore can only be
used by the processor when it is powered up and performing its intended function.
Common varieties of volatile memory include SRAM, Synchronous SRAM,
Dynamic RAM or DRAM, and Synchronous Dynamic RAM or SDRAM. These
are all examples of volatile memory. For satellite applications, the most common
form of volatile memory is Static Random Access Memory (SRAM).

2.3.3 The OBC FPGA for Miscellaneous Functions

During the course of designing a processor system, requirements that are not easily fulfilled using a microprocessor will often require implementation in discrete logic or, most likely, a Field Programmable Gate Array (FPGA). The FLP program has some OBC Processor-Board requirements that are not easily implemented in software on the LEON3FT. So the design team decided to implement these functions on a radiation tolerant FPGA (Fig. 2.4).

Fig. 2.4 UT6325 RadTol Eclipse FPGA. © Aeroflex Inc.

The FPGA is typically a better choice over discrete logic because an FPGA will
usually take up less space on the board and will also most likely use less power.
The process of choosing an FPGA for a satellite system is similar to the process of
choosing a microprocessor. Electrical, temperature as well as mechanical and
radiation performance need to be considered prior to making a choice of FPGA.
For the FLP satellite program the Aeroflex UT6325 in the CGA484 package was
chosen for its radiation performance as well as for its relatively small footprint and
ease of implementation. The device is well suited for housekeeping functions and
other tasks that would be very difficult if not impossible to implement using
discrete logic devices. The device is readily available and has good flight history.
2.4 The OBC Processor-Board Functional Overview

The OBC Processor-Board is a UT699 LEON3FT based 3U PCB card with 8 MB of on-board SRAM and 4 MB of on-board non-volatile FRAM memory. The primary method of communication on the OBC Processor-Board is through the four SpaceWire ports that are connected directly to the LEON3FT. The SpaceWire ports give access to and from the OBC Processor-Board and the other peripheral OBC boards in the system. Additional interfaces include an Ethernet and an RS422 interface, which are both used for ground testing and debug of the system. There is also a LEON3FT Debug Support Unit (DSU) that can be accessed using the front panel connector.
The on chip LEON3FT EDAC function has been implemented on the SBC for both the volatile and the non-volatile memory spaces. The following sections discuss in more detail all of the interfaces on the SBC.
A top level functional diagram of the OBC flight board is shown in Fig. 2.5, which provides a graphical view of the overall design of the OBC Processor-Board.

Fig. 2.5 OBC Processor-Board block diagram (flight model). © Aeroflex Inc.

One important aspect, covered in more detail in the following sections, is the fact that the FPGA handles the CE signals to both the volatile and non-volatile memories.
2.5 The OBC Processor-Board Memory Interface

All microprocessors require memory to perform their desired function. The OBC Processor-Board memory interface was implemented based on requirements passed down from the University of Stuttgart to the designers at Aeroflex. Any issues or changes to the board during the design process were resolved in consultation with the University.
The amount of on board memory contained on any Processor-Board is a very important element in the performance of the processor. This fact is true of home PCs as well as SBCs and follows the common understanding that more is better. As mentioned previously, the processor requires two types of memory—volatile and non-volatile.
• Non-volatile memory: This type of memory will retain data even when the power to the board has been turned off. It is suitable for storing the processor operating system. This type of memory is similar in function to the boot EEPROM and hard disk used in home computers. It typically is slower than most types of volatile memory, which however is acceptable since it is accessed only during boot-up of the computer.
• Volatile memory: This type of memory does not retain data when the power to the device is turned off. It is suitable for use by the processor when running its intended tasks (Fig. 2.6).

Fig. 2.6 Aeroflex stacked SRAM on Flight Unit. © Aeroflex Inc.

The amount of each type of memory on the board is dependent on the functional
requirements of the board being designed. The non-volatile memory devices need
to be dense enough to hold the entire RTOS image as well as any boot code that
the user desires.

Non-volatile Memory Resources:
For the FLP satellite program it was decided to use FRAM devices for the flight configuration. On power-up of the OBC board the LEON3FT will load the RTOS image from the FRAM devices to the SRAM for flight operation. The FRAM devices also have a 'Sleep Bit' that allows the user to set the device into a low power mode once the image has been loaded into SRAM. The interface on the SBC was
designed to use both of the ROM Select (ROMS) signals on the LEON3FT. ROMS[0] begins at address 0x00000000 and ROMS[1] begins at 0x10000000. Each bank of non-volatile memory provides 2 MB of FRAM to the LEON3FT.

Non-volatile Interface to LEON3FT:
The non-volatile interface to the LEON3FT is fairly straightforward with the exception of the Chip Enable signals to the devices. The timing of these signals at the 33 MHz system clock used for the LEON3FT is not compatible with the FRAM devices under worst case conditions. It became necessary to use the on-board FPGA to force these signals to meet the worst case timing of the FRAM device.

Fig. 2.7 LEON3FT NV memory interface. © Aeroflex Inc.

Figure 2.7 shows the top level interface from the LEON3FT to the FRAM devices. The ROMS signal from the LEON3FT to the FPGA is essentially the Chip Enable signal from the LEON3FT. The FPGA manipulates the timing of the ROMS signal to create the CE signals to the FRAM devices. The manipulation is such that the CE signals will meet the timing of the FRAMs; the system impact is the use of three wait states when interfacing to the non-volatile memory space by the LEON3FT. Since these memories are not read from frequently, the impact on the processor performance is almost negligible.
• LEON3FT Access to NVMEM:
The LEON3FT processor has two Chip Enables for the ROM memory area on the device. The non-volatile memory devices are mapped into the ROMS[0] and the ROMS[1] space of the LEON3FT. The ROMS signals are Chip Enables for the non-volatile memories. The result is that there are two banks of 2 MB each of non-volatile memory on the FM SBC.
• NVMEM Wait States:
A minimum of three wait states needs to be set in the LEON3FT Memory Configuration 1 (mcfg1) register when NVMEM accesses are performed. This ensures the timing of the LEON3FT interface using the FPGA to control the CE signals to the FRAM devices. The three wait states should be set for both read and write. Refer to the LEON3FT Functional Manual [50] for a detailed description of the mcfg registers in the LEON3FT. Note that on power-up the default wait states for the PROM memory area are set to the maximum of 30.
• NVMEM Sleep Bit Function:
Each of the FRAM devices on the OBC Processor-Board has a 'Sleep Bit' that is used to put the device into a low power mode. Once the program code has been read by the LEON3FT and stored into SRAM, it is recommended that the user sets these bits low when the devices are not being accessed. The sleep bits are connected to GPIO signals on the LEON3FT and they have pull-ups connected to them because the LEON3FT defaults all GPIO signals to inputs. Having the pull-ups ensures the LEON3FT will have access to the non-volatile memory space after power-up or after the LEON has been reset by the FPGA.
• Sleep Bit Implementation:
The two Sleep Bits are implemented on the FM SBC using LEON3FT GPIO signals. Each one controls one bank of 2 MB of non-volatile memory. Table 2.1 explains these signals and identifies their default condition (a code sketch follows after this list).

Table 2.1 NVSLEEP GPIO assignments

Signal name   LEON3FT I/O   GPIO assignment   Reset value   Description
NVSLEEP0      O             GPIO12            High          2 MB Bank 0 Sleep bit
NVSLEEP1      O             GPIO13            High          2 MB Bank 1 Sleep bit

• NVMEM EDAC Implementation:
The LEON3FT on chip EDAC has been enabled on the FM SBC. Refer to the LEON3FT Functional Manual for a full description of the EDAC for the PROM memory area.
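The sleep bit handling from the list above can be sketched as follows. The direction register address is the one quoted in Sect. 2.7.5 (0x80000908); the output register one word below it is an assumption based on the usual GRGPIO register layout and should be verified against [50]:

#include <stdint.h>

#define GPIO_OUT_REG (*(volatile uint32_t *)0x80000904u) /* assumed         */
#define GPIO_DIR_REG (*(volatile uint32_t *)0x80000908u) /* per Sect. 2.7.5 */

#define NVSLEEP0 (1u << 12)  /* GPIO12: 2 MB bank 0 sleep bit */
#define NVSLEEP1 (1u << 13)  /* GPIO13: 2 MB bank 1 sleep bit */

/* After the RTOS image has been copied to SRAM, drive both sleep
 * bits low to put the FRAM banks into low-power mode. The external
 * pull-ups keep the lines high (devices awake) while the GPIOs are
 * still inputs after reset. */
void fram_enter_sleep(void)
{
    GPIO_DIR_REG |= NVSLEEP0 | NVSLEEP1;     /* GPIO12/13 as outputs */
    GPIO_OUT_REG &= ~(NVSLEEP0 | NVSLEEP1);  /* low = sleep mode     */
}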

Volatile Memory Resources:
The SBC contains 8 MB of on-board SRAM. When the LEON3FT is powered up or is reset, it will load the program code stored in the non-volatile memory space into the SRAM memory space. All of the processing for the OBC is done with code running from the SRAM memory space. The SRAM chosen for the OBC Processor-Board is an Aeroflex stacked SRAM device with four die inside a single package. A single RAM Select (RAMS) signal from the LEON3FT is routed to the FPGA and subsequently four enables are generated from the FPGA to the SRAM device (Fig. 2.8).

Fig. 2.8 Aeroflex 8 MB stacked SRAM with on chip EDAC bits. © Aeroflex Inc.
• LEON3FT Access to SRAM:
There are four Chip Enables on the LEON3FT. The OBC uses one of these signals along with upper address bits to control the four Chip Enable signals to the SRAM device. The 32 bit data bus on the LEON3FT is connected to the lower 32 bits of data on the SRAM device and the Error Detection and Correction (EDAC) check bits are connected to the upper 7 bits of data on the SRAM. Figure 2.9 shows a simplified version of this interface. The SRAM interface does not need any wait states set in the LEON3FT memory controller.

Fig. 2.9 LEON3FT SRAM interface. © Aeroflex Inc.

• LEON3FT SRAM EDAC:
One of the primary requirements for the OBC Processor-Board was to have all memories protected with EDAC. The OBC Processor-Board utilizes the LEON3FT on chip EDAC for the SRAM memory space. The check bits on the LEON3FT are connected to the SRAM data bits [39:32]. Memory controller registers manage the function of the EDAC as well as its enabling on the SBC. The reader is referred to the LEON3FT Functional Manual [50] for a description of how the EDAC functions are enabled for the SRAM space.

2.6 The OBC Processor-Board SpaceWire Interface

SpaceWire is a point to point serial bus that supports full duplex communication at a data rate of up to 200 Mbs. The protocol uses a simple 'token' based system to manage data to and from each end point. Each token character tells the receiver of the token that the transmitter has 8 bytes of data space available in its receive buffer. Therefore, if a SpaceWire node has data to send, it will send 8 bytes of data for each token it receives. The resulting function is simply as follows: as long as each side has data to send, and the received data is taken out of the receive buffer, the system keeps running. For the FLP satellite program, the SBC implements all four of the dedicated SpaceWire ports on the LEON3FT microprocessor operating at 10 Mbs (Fig. 2.10).
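The token mechanism just described amounts to a small credit counter per link direction. The following toy model (all names invented for illustration) shows the bookkeeping: each received flow control token grants credit for eight more data characters:

#include <stdint.h>
#include <stdbool.h>

typedef struct {
    uint32_t tx_credit;  /* data characters we may still transmit */
} spw_link_t;

/* Each token received announces 8 bytes of free receive buffer. */
void spw_on_token_received(spw_link_t *l)
{
    l->tx_credit += 8u;
}

/* Returns false when we must wait for the next token. */
bool spw_try_send_char(spw_link_t *l)
{
    if (l->tx_credit == 0u)
        return false;
    l->tx_credit--;
    /* ... emit one data character on the link here ... */
    return true;
}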
Fig. 2.10 LVDS quad driver footprint on SBC layout. © Aeroflex Inc.

SpaceWire Ports Management:
All four SpaceWire ports on the LEON3FT are enabled for operation using the LEON3FT SpaceWire registers. These registers control a number of important characteristics of the SpaceWire ports, such as Port Enable, initialization data rate, disconnect time and timeout duration, to name a few. The registers are discussed in detail in the LEON3FT User's Guide [50] and the reader is encouraged to refer to this document for a detailed discussion of the LEON3FT SpaceWire registers.

SpaceWire Clock and Data Rate:
The SBC implements the SpaceWire clock input to the LEON3FT using a 10 MHz oscillator. The registers that set the transmit data rate are set such that the SpaceWire ports operate at 10 Mbs.

2.7 Miscellaneous Functions

The OBC Processor-Board has a number of miscellaneous functions that are not suitable for the LEON3FT microprocessor. These functions have been designed into the UT6325 FPGA and are discussed in the following sections. FPGAs are uniquely suited for Processor-Board utility functions, and the UT6325 is used for them by the OBC Processor-Board designers (Fig. 2.11).

Fig. 2.11 UT6325 in CQFP package. © Aeroflex Inc.
2.7.1 NVMEM Chip Enable

As stated previously, the FM22L16 FRAM devices chosen for the OBC Processor-Board have timing that is incompatible with the timing of the LEON3FT. Therefore, certain signals required to control the memory need to be managed by the on-board FPGA. The signals in this instance are the FRAM Chip Enables. The internal logic of the FPGA ensures the proper timing over worst case flight conditions.

2.7.2 SRAM Chip Enable

The LEON3FT is not designed to interface directly with SRAM devices that implement a stacked die configuration. The UT8ER2M39 SRAM devices used have four die layers inside one package. The on-board FPGA uses one of the Chip Enables from the LEON3FT along with upper address bits to generate the four Chip Enables to the SRAM devices.

2.7.3 Pulse Per Second Interface

A very good example of a utility function suited for an FPGA is the Pulse Per Second (PPS) requirement. The signal is used to sync the star tracker interface of the FLP satellite to the LEON3FT. The signal is generated by the on-board FPGA and provided on the 44 pin D-sub connector. Generating this type of signal using MSI devices would require four or five separate chips and would also most likely take up more board space than the FPGA.
The one second pulse is shown in Fig. 2.12 and this scope plot was taken from
the OBC FM unit. The timing is measured from the rising edge to the next rising
edge. The timing parameters for the signal are shown in Table 2.2. The signal is also
routed to one of the GPIOs on the LEON3FT. That way, the signal can be monitored
by the LEON3FT if the user desires. The GPIO used for this input is GPIO 10.

Fig. 2.12 Processor-Board oscilloscope plot of the PPS signal. © Aeroflex Inc.
Table 2.2 Pulse accuracy depending on operating temperature

Temperature range (°C)   Minimum pulse frequency (Hz)   Maximum pulse frequency (Hz)
-55 to +125              0.9999950                      1.0000500

One Second Pulse Reset:
There is a signal connected to LEON3FT GPIO 15 and routed to the FPGA that is used to reset the one second pulse. The minimum pulse width in order to reliably reset the one second pulse is two system clock cycles.
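Driving that reset line from software could look as follows. The output register address again assumes the GRGPIO layout discussed in the sleep bit sketch above and is not a verified map; since two consecutive register writes on the 33 MHz bus last far longer than two system clock cycles, the minimum pulse width is easily met:

#include <stdint.h>

#define GPIO_OUT_REG (*(volatile uint32_t *)0x80000904u) /* assumed         */
#define GPIO_DIR_REG (*(volatile uint32_t *)0x80000908u) /* per Sect. 2.7.5 */
#define PPS_RESET    (1u << 15)  /* GPIO15: one second pulse reset */

void pps_reset_pulse(void)
{
    GPIO_DIR_REG |=  PPS_RESET;  /* GPIO15 as output                  */
    GPIO_OUT_REG |=  PPS_RESET;  /* assert the reset towards the FPGA */
    GPIO_OUT_REG &= ~PPS_RESET;  /* release again                     */
}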

2.7.4 Watchdog Signal and LEON3FT Reset

The LEON3FT has a watchdog trip signal on chip and the OBC processor routes that signal to the FPGA. When enabled, the logic inside the FPGA will use the watchdog trip signal to reset the LEON3FT.

Watchdog Trip Enable:
The LEON3FT GPIO14 has been set up to enable the reset of the LEON3FT when a watchdog trip occurs. If GPIO14 is set 'High' by the LEON3FT and there is a LEON3FT watchdog trip, the LEON3FT will be reset by the FPGA. The default upon power-up of the Watchdog Trip Enable is disabled (Low).
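Enabling this behaviour is a single GPIO operation; the same assumed GRGPIO register layout as in the sketches above applies:

#include <stdint.h>

#define GPIO_OUT_REG (*(volatile uint32_t *)0x80000904u) /* assumed         */
#define GPIO_DIR_REG (*(volatile uint32_t *)0x80000908u) /* per Sect. 2.7.5 */
#define WD_TRIP_EN   (1u << 14)  /* GPIO14, disabled (low) after power-up */

/* Setting GPIO14 high arms the FPGA to reset the LEON3FT when the
 * on-chip watchdog trips. */
void watchdog_reset_enable(void)
{
    GPIO_DIR_REG |= WD_TRIP_EN;  /* GPIO14 as output */
    GPIO_OUT_REG |= WD_TRIP_EN;  /* high = enabled   */
}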

2.7.5 RS422 Interface

The RS422 interface is implemented using a combination of the inputs on the P5 connector, the LEON3FT GPIO signals and the LEON3FT on chip UART. See also Fig. 2.14.

Unused Transmit and Receive Signal Handling:
If the user does not wish to use the RS422TXEN and RS422RXEN input signals, the following settings must be applied: in the GPIO port direction register at address 0x80000908, bits [5] and [6] have to be set to 'High'. This will set GPIO5 and GPIO6 to outputs.
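As a one-liner in C, the setting described above (register address and bits taken directly from the text):

#include <stdint.h>

#define GPIO_DIR_REG (*(volatile uint32_t *)0x80000908u)

/* Park the unused RS422TXEN/RS422RXEN handling: set bits [5] and [6]
 * high so that GPIO5 and GPIO6 become outputs. */
void rs422_park_unused_enables(void)
{
    GPIO_DIR_REG |= (1u << 5) | (1u << 6);
}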

2.7.6 Resets

All digital circuit boards need to be reset at the very least on power up. The OBC
processor is no exception. There is a Power On Reset circuit that will hold the
LEON3FT in reset until the input power is stable (Fig. 2.13).
Fig. 2.13 Processor-Board reset lines. © Aeroflex Inc.

POR Reset:
The Power On Reset (POR) on the OBC Processor-Board is approximately 250 ms long. The LEON3FT and the FPGA will be reset at power-up. The FPGA is only reset at power-up and is designed to control the LEON3FT reset.

LEON3FT External Reset:
The external reset signal is provided through the 44 pin connector and is used for an external reset of the LEON3FT. The implementation of this signal should be such that when the user wants to reset the LEON3FT, this signal is pulled down to 'Low'. When the reset is terminated, the signal should be left floating. On the SBC, the FPGA performs the reset to the LEON3FT.

2.7.7 Clock Interfaces

LEON3FT System Clock:
The SBC implements the System Clock input to the LEON3FT using a 33 MHz oscillator.

SpaceWire Clock:
The SpaceWire clock, as already cited, is 10 MHz.

2.7.8 DSU/Ethernet Interface Card (DEI)

For debug and test of the OBC Processor-Boards—both EM and FM models—a small PCB card was designed by Aeroflex Colorado Springs to interface to the LEON3FT DSU and Ethernet ports respectively. Details on this card can be found in Sect. 11.1.
The Ethernet interface on the LEON3FT is implemented by routing the signals
from the LEON3FT to the 44 pin D-sub connector. During debug the user will use
the DSU/Ethernet Interface card to implement the Ethernet connection.
The LEON3FT microprocessor on the OBC Processor-Board also has an
on-chip Debug Support Unit (DSU); these signals are likewise routed on the SBC
to the 44 pin D-sub connector. When accessing the DSU, the user will again use
the DSU/Ethernet Interface card.

2.8 Power Specifications

The SBC has a 44 pin D-sub connector which is used for the 3.3 V ± 5 % input
power. In addition there are five MDM 9 pin connectors, four of which are for
SpaceWire interfaces and one for RS422. The SBC consumes no more than 5 W
at full throughput.

2.9 Mechanical Design and Dimensions

The SBC is based on a 3U cPCI Printed Circuit Board (PCB). The dimensions are
100 mm by 160 mm. Refer to Fig. 2.14 for a conceptual drawing of the board and
the connector placement.
Fig. 2.14 Processor-Board mechanical concept. © Aeroflex Inc.


Chapter 3
The I/O-Boards

Barry M. Cook, Paul Walker and Roger Peel

3.1 Common Design for I/O and CCSDS-Boards

The approach for these two OBC components is a common design for the OBC
I/O-Boards and the CCSDS decoder/encoder boards (or CCSDS-Boards for short).
Both of these board types are available with single redundancy inside the OBC and
are connected to the two OBC Processor-Boards via cross coupled SpaceWire
links running RMAP protocol. A key characteristic is that although implementing

B. M. Cook (&)  P. Walker  R. Peel


4Links Ltd., Milton Keynes, UK
e-mail: barry@4links.co.uk
P. Walker
e-mail: paul@4links.co.uk
R. Peel
e-mail: roger@4links.co.uk


completely different functionalities, both boards are based on the same 3U printed
circuit board, FPGA chip and I/O-driver IC design.
The I/O-Boards are designed by 4Links Limited, including both the printed
circuit board and the functionality implemented in the FPGA IP Core. All the
I/O-Board IP—for SpaceWire, RMAP and the I/O interfaces—is from 4Links. The
SpaceWire core is derived from the design used in 4Links test equipment for
SpaceWire, which uses synchronous clock and data recovery—greatly simplifying
the design and eliminating the problems of asynchronous design. The simplified
IP Core's ease of use is attested by a recent customer who praised 4Links'
support when, in practice, negligible support was required (Fig. 3.1).

Fig. 3.1 I/O-Board FPGA block diagram – Actel ProASIC3E A3PE3000 (industrial temperature
version). © 4Links Ltd. (Blocks shown: SpaceWire CODECs with RMAP target, LVDS buffers,
bus I/F, IIC interface, UARTs, digital I/O, RAM, MRAM, fibreoptic gyro I/F, O/C buffers,
RS422/485 buffers and digital line buffers)

The CCSDS-Board reuses the I/O-Board PCB with a reduced interface
instrumentation and without mass memory chips. It only needs connections to the
spacecraft's transceivers and to the decoding Combined-Controller in the PCDU for
HPC1 type high priority commands. The corresponding CCSDS functionality is
implemented by Aeroflex Gaisler AB, Göteborg, in an IP Core derived from the
Aeroflex Gaisler GR-TMTC-0002.
Both board types are based on a Microsemi/Actel A3PE3000L PQ208 FPGA.
This common board concept saves considerable hardware development cost,
which is essential for a low-cost project like a University satellite. Both boards' IP
Cores feature Triple Module Redundancy (TMR), a technique for achieving
higher immunity against Single-Event Upsets (SEU).

3.2 The I/O-Board as Remote Interface Unit

The I/O-Board provides connections between the currently active Processor-Board
and the operational sensors and actuators, which means it takes the role of a Remote
Interface Unit (RIU) in a conventional spacecraft architecture. In this sense it routes
the onboard TM and TC signals directly to the spacecraft's GPS receivers, star
trackers, fiberoptic gyros, reaction wheels, magnetotorquer electronics, magnetometer
electronics, telemetry tracking and control equipment, payload controller, payload
data downlink system and the power control and distribution unit. The reader inter-
ested in the details of the FLP target satellite's architecture is referred to Chap. 10 and
specifically to Fig. 10.5. In total, an I/O-Board provides 62 external interfaces:
• 2 SpaceWire links at 10 Mb/s,
• 22 UART interfaces, mostly full duplex, at rates from 9600 to 115 200 baud,
• 1 bus at 2 MHz for the fiberoptic gyro,
• 2 IIC buses at 100 kHz and
• 35 digital I/O lines.
Data transfer between the I/O-Board and Onboard Computer occurs within a
schedule of 1.25 ms time slots. The most demanding transfer, 1200 bytes in a
single slot, amounts to not quite 1 MB/s (1200 B / 1.25 ms = 960 kB/s). This is
achieved with a SpaceWire raw line rate of 10 Mb/s. The FPGA can thus be run
with a 10 MHz clock, reducing dynamic power consumption and also reducing the
probability that an upset in the combinatorial logic will be retained.
Transfers between the I/O-Board and attached devices are slower and several
transactions can occur simultaneously—this is far simpler and more easily verified
as concurrent hardware in FPGA than as pseudo-concurrent software on a pro-
cessor. The connection between the OBC and I/O-Board will be kept reasonably
busy with transfers at 10 Mb/s, making SpaceWire with its LVDS interface a good
choice. Other interfaces run at a much lower data rate and source terminated
RS422/485 lines are used for most connections so that power is consumed only
during edge transitions, giving a very low average power consumption whilst also
providing a wide voltage swing to provide good noise immunity.
SpaceWire CODECS conform to the current ECSS standard [11] and the
associated RMAP protocol [13, 14] was chosen to structure transfers. Although a
Remote Memory Access Protocol (RMAP) is an obvious choice for accesses to
memory—MRAM that can be used as both normal SRAM and NVRAM—it is less
clearly suited to streaming data such as found in UARTs.
External (non-memory) transfers are made to look like memory transfers with a
simple additional protocol between RMAP and the device. Details differ, depending
on the exact interface requirement (UART, IIC, etc.) but all follow the same overall
structure. Data to be sent from the I/O-Board (to an instrument or actuator) is first
written to a memory-mapped buffer where it is held until confirmed to be error-free.
It is transferred from the buffer to the external interface, taking account of those
interfaces that require a more complex action than a simple data-copy—for
example an IIC read which uses a write/read sequence of sending an address
followed by inputting data. Received data is stored in a buffer until read, as a
standard memory read, by a later transaction initiated by the OBC Processor-Board.
Each external interface is given a unique RMAP memory address.

3.3 The I/O-Board as OBC Mass Memory Unit

Furthermore, each I/O-Board is equipped with mass memory for
• satellite housekeeping telemetry,
• timeline storage,
• storage of specific spacecraft status and health status information [10],
• as well as for uploaded timelines.
The individual use, partitioning etc. of the memory is mission specific and under
the responsibility of the satellite Onboard Software (OBSW). With respect to its
dimensioning, the memory is not targeted for the storage of science data telemetry
from satellite payloads; in the FLP target satellite mission a dedicated Mass Memory
Unit (MMU), managed by the payload controller, is responsible for this task.
To avoid losing the stored information in case of a power outage of the OBC—
in particular the housekeeping information and the spacecraft status informa-
tion—the I/O-Board memory is realized as non-volatile mass memory.
The non-volatile memory is Magnetic RAM (MRAM), which has the benefit
that it can be written and read like static random access memory (SRAM), at
similar speed. MRAM is claimed to have unlimited endurance (number of read or
write cycles) and to retain non-volatile data for at least 20 years. The storage
elements are claimed to be immune to single-event upsets. Everspin MR4A16B
devices were selected, each with 1,048,576 words of 16 bits, used on the
I/O-Board as 2 MBytes (8-bit Bytes) per chip.
Ten chips are used to provide 20 MBytes of overall non-volatile memory per
I/O-Board, organised as two regions: telemetry data (16 MB) and status data
(4 MB). Access is fully random, down to individual Bytes.
The mass memory read/write accesses are also performed using the RMAP
protocol. More information is provided later in Sect. 3.5.
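
For orientation, the two memory regions can be captured as constants, as in the sketch below; the extended-address values anticipate Table 3.1 in Sect. 3.5.

```c
#include <stdint.h>

/* I/O-Board MRAM regions as seen through RMAP (values anticipate
 * Table 3.1): the 'extended address' selects the region and the
 * address wraps around within it. */
#define IO_EXT_ADDR_TM    0x02u                 /* telemetry memory    */
#define IO_TM_SIZE        (16u * 1024u * 1024u) /* 16 MB               */
#define IO_EXT_ADDR_STATE 0x03u                 /* state-vector memory */
#define IO_STATE_SIZE     (4u * 1024u * 1024u)  /* 4 MB                */

/* Because addresses wrap within a region, a circular housekeeping
 * store can simply advance its write pointer modulo the region size. */
static inline uint32_t tm_wrap(uint32_t byte_addr)
{
    return byte_addr % IO_TM_SIZE;
}
```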

3.4 I/O-Board Hot Redundant Operation Mode

In normal operational cases one OBC I/O-Board is active, controlled by the active
Processor-Board. As described, the OBC controls all the satellite’s onboard
equipment through the driver interfaces of the I/O-Board and it stores all the
relevant status and housekeeping data and timeline processing status in the
I/O-Board memory.
However, in case an interface driver—e.g. a UART—to an onboard equipment
becomes defective, whether in its hardware, its cabling or maybe even on the
controlled equipment's side, it might be required to switch over to the redundant
I/O-Board. In such a case the now-operational board should also contain the
current housekeeping and status data as well as the recorded history. Therefore the
functionality exists to reconfigure to the healthy I/O-Board while keeping the
defective one

switched on with deactivated interface buffer chips. This permits the history of the
recorded telemetry on the defective I/O-Board to be copied over to the healthy one.
The buffer chips are deactivated by removing their power, in which case their
outputs become high-impedance.
The I/O-Board’s interface buffers are not powered until instructed by writing
values at address 098100 (the value 255 powers the buffers, 0 removes power, any
other value is invalid). The first logic value controls connector-D buffers and the
second value controls connector-E buffers.
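
A sketch of composing such a command is given below; the rmap_write() helper and its signature are hypothetical placeholders for whatever the OBSW SpaceWire/RMAP driver stack provides, while the logical and extended addresses follow Sects. 3.5.1 and 3.5.2.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical RMAP write helper as it might be provided by the OBSW
 * SpaceWire driver stack; it is not part of the I/O-Board itself. */
extern int rmap_write(uint8_t logical_addr, uint8_t ext_addr,
                      uint32_t addr, const uint8_t *data, size_t len);

#define IO_LOGICAL_NOMINAL 0x21u   /* nominal board, see Sect. 3.5.1    */
#define IO_EXT_ADDR_IF     0x01u   /* interface address space, Tab. 3.1 */
#define IO_ADDR_BUF_POWER  0x8100u /* buffer power control              */

/* Power the interface buffer groups: byte 0 controls the connector-D
 * buffers, byte 1 the connector-E buffers (255 = on, 0 = off). */
static int io_buffers_power(int d_on, int e_on)
{
    uint8_t v[2];
    v[0] = d_on ? 255u : 0u;
    v[1] = e_on ? 255u : 0u;
    return rmap_write(IO_LOGICAL_NOMINAL, IO_EXT_ADDR_IF,
                      IO_ADDR_BUF_POWER, v, sizeof v);
}
```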

3.5 I/O-Board RMAP Interface

SpaceWire can be used with no protocol or with one or more of a variety of
protocols. As the I/O-Board has a substantial memory that is written and read by
the OBC, the SpaceWire Remote Memory Access Protocol (RMAP), standardized
by ECSS-E-ST-50-52C [14], was considered suitable.

3.5.1 Board Identification for I/O-Boards

The developed I/O-Board type is called SIU B-012-PPFLPIO.


The board identity (Nominal or Redundant) seen via SpaceWire from the OBC
core is set by grounding selected pins on connector-E, allowing the board ID to be
set by the location in the wiring harness.
• The Nominal board is selected by grounding¹ the TMS pin 55 and leaving the
TDI pin 80 open or connected to pin 81 (J_VJTAG), see Fig. 11.7.
The board will then respond to requests sent to logical address 0x21.
• The Redundant board is selected by grounding¹ the TDI pin 80 and leaving the
TMS pin 55 open or connected to pin 81 (J_VJTAG). The board will then
respond to requests sent to logical address 0x22.

Other combinations of TMS and TDI (both open/J_VJTAG or both grounded)
are invalid and will result in the board not responding to SpaceWire commands.
Writes or reads to undefined addresses will result in the RMAP status code 10
(command not possible) being returned.

¹ If the JTAG function is required (for example, to load new firmware) the "grounded" pin may
be connected via a 1 kΩ resistor to ground and the programmer can then be wired directly to the
connector pin. The JTAG programmer should be disconnected when it is not being used to ensure
correct board operation.

Table 3.1 I/O-Board RMAP addresses

Extended address  Address                   Function                        Notes
0x03              0x00000000 to 0x003FFFFF  4 MB MRAM State-vector memory   Addresses wrap around within this address space
0x02              0x00000000 to 0x00FFFFFF  16 MB MRAM Telemetry memory     Addresses wrap around within this address space
0x01              0x8100                    Logic-out                       Controls power to the buffers
0x01              0x8110                    UART                            9600 b/s, Connector D
0x01              0x8111                    UART                            9600 b/s, Connector D
0x01              0x8112                    UART                            9600 b/s, Connector D
0x01              0x8113                    UART                            9600 b/s, Connector D
0x01              0x8120                    IIC                             100 kHz, Connector D
0x01              0x8121                    IIC                             100 kHz, Connector E
0x01              0x8130                    UART                            57 600 b/s, Connector D
0x01              0x8131                    UART                            57 600 b/s, Connector E
0x01              0x8140                    UART                            57 600 b/s, Connector E
0x01              0x8141                    UART                            57 600 b/s, Connector E
0x01              0x8142                    UART                            57 600 b/s, Connector E
0x01              0x8143                    Logic-out                       3 signals, Connector E
0x01              0x8144                    Logic-in                        3 signals, Connector E
0x01              0x8150                    UART                            115 200 b/s, Connector E
0x01              0x8151                    Logic-out                       1 signal, Connector E
0x01              0x8160                    Fiberoptic Gyro                 Connector D
0x01              0x8170                    UART                            115 200 b/s, Connector D
0x01              0x8171                    UART                            115 200 b/s, Connector E
0x01              0x8172                    UART                            115 200 b/s, Connector D
0x01              0x8173                    UART                            115 200 b/s, Connector E
0x01              0x8174                    Logic-out                       3 signals, Connector D
0x01              0x8175                    Logic-out                       3 signals, Connector E
0x01              0x8176                    Logic-in                        6 signals, Connector D
0x01              0x8177                    Logic-in                        6 signals, Connector E
0x01              0x8180                    UART                            115 200 b/s, Connector D
0x01              0x8190                    UART                            115 200 b/s, Connector D
0x01              0x8191                    UART                            115 200 b/s, Connector E
0x01              0x8192                    UART                            115 200 b/s, Connector D
0x01              0x8193                    UART                            115 200 b/s, Connector E
0x01              0x81A0                    UART                            115 200 b/s, Connector D
0x01              0x81A1                    UART                            115 200 b/s, Connector E
0x01              0x81B1                    UART (spare)                    9600 b/s, Connector D
0x01              0x81B2                    UART (spare)                    9600 b/s, Connector D
0x01              0x81B3                    UART (spare)                    115 200 b/s, Connector E
0x01              0x81B4                    UART (spare)                    115 200 b/s, Connector E
0x01              0x81B5                    UART (spare, input only)        115 200 b/s, Connector E

3.5.2 I/O Board Interface RMAP Addresses

Overall, the board presents itself as a large, sparse memory map accessed through
the RMAP protocol. Table 3.1 lists the mapping from RMAP memory space to
real memory and interfaces, showing how the 'extended address' field is used to
select major blocks and the 'address' field for the detailed selection.
Writes to the state vector memory and telemetry memory are unbuffered:
verification before writing is not possible. At least some of the data will be written
to memory even if an error occurs—the RMAP Verify Bit will be ignored. If a
write acknowledgement is requested and an error occurs, a specific error code will
be returned (unless the header is corrupted, in which case there will be no reply
and no data will be written to memory).
Writes to the UART, IIC, FOG and Logic-out interfaces are fully buffered and
data may be verified before it is written. Data will not be transferred if the RMAP
Verify Bit is set and an error occurs. Some data may be transferred if the RMAP
Verify Bit is not set, even if an error occurs.

3.5.3 Returned RMAP Status Values

See Table 3.2.

Table 3.2 I/O-Board RMAP return status values

Status value in a response packet  Meaning                 Notes
0                                  Completed OK            Memory: Data in memory is correct; UART/Logic-out/IIC/Fiberoptic Gyro: Data is being sent to the interface
2                                  Unused type code        An invalid RMAP command was received
3                                  Invalid key             Must be 0x00
4                                  Invalid data CRC        Memory: Data was written to memory; UART/Logic-out/IIC/Fiberoptic Gyro: Data was discarded, no data was sent to the interface
5                                  Unexpected EOP          Memory: Data was written to memory; UART/Logic-out/IIC/Fiberoptic Gyro: Data was discarded, no data was sent to the interface
6                                  Too much data           Memory: Data was written to memory; UART/Logic-out/IIC/Fiberoptic Gyro: Data was discarded, no data was sent to the interface
7                                  Unexpected EEP          Memory: Data was written to memory; UART/Logic-out/IIC/Fiberoptic Gyro: Data was discarded, no data was sent to the interface
9                                  Verify buffer overrun   Memory: Data was written to memory; UART/Logic-out/IIC/Fiberoptic Gyro: Data was discarded, no data was sent to the interface. This code is returned if the amount of data sent exceeds the transmit buffer size
10                                 Command not possible    All data was discarded
12                                 Invalid target address  Must be 0x21 or 0x22

3.6 I/O Circuits, Grounding and Terminations

This section depicts the I/O circuitry diagrams for the diverse types of interface,
both for nominally grounded interfaces and for specific interfaces which, due to
the equipment connected in the target satellite, require I/O groups with isolated
ground on the board.
The standard serial differential interface type (Fig. 3.2) is applied for all RS485
and RS422 connections in the spacecraft. For the target platform this covers the
connections of the OBC with the PCDU, the magnetometer control electronics, the
transceiver housekeeping interfaces, the reaction wheels and the fiberoptic gyros.
A low-pass filter is applied at the output terminals to reduce possible Electro-
magnetic Interference (EMI) to other signals/systems in the spacecraft. At a
maximum data rate of 2 Mb/s for any signal (and most at 100 kb/s or less) the
buffer output rise and fall times are far faster than required. Fast edges do not aid
signal integrity in this application (inputs all have Schmitt triggers) but merely
enhance spurious emissions—hence the filtering (Fig. 3.2).

Fig. 3.2 Example for a standard serial differential interface. © 4Links Ltd.

Differential signals, such as those of the standard serial differential interface,
complete the circuit with their own signal return path—the other wire of the
differential pair. Single-ended signals need a return path, possibly common to a
number of signals to the same place; but, in order to avoid grounding problems,
this return must be isolated from the power ground. The ADuM14xx and
ADuM54xx series isolators were chosen as they also provide an isolated power
supply for the connector-side buffers.
chosen as they also provide an isolated power supply for connector-side buffers.
Some onboard equipment on the small satellite only provides IIC connections,
such as the magnetotorquer control electronics. To avoid ground loops inside the
spacecraft between the controlled equipment and the OBC these IICs are imple-
mented as isolated groups (see Fig. 3.3). Isolation is achieved by means of mag-
neto-couplers.

Fig. 3.3 Example for an isolated IIC group. © 4Links Ltd.

Another example of an isolated group interface is given in Fig. 3.4; it serves
units requiring onboard SW patch functions. The example depicted here is
from the GPS receiver.
Fig. 3.4 Isolated group for equipment with SW patch function—here GPS. © 4Links Ltd.
(Schematic based on ADuM5402/ADuM1402 isolators and an AM26LV31 line driver)

Finally, there may be connected S/C units with a number of different
command/control interfaces which all must be electrically isolated, since they are
grounded physically on the equipment unit side. For such units entire isolated I/F
groups are provided on the I/O-Board. The example depicted in Fig. 3.5 is used for
a complex payload controller.

Fig. 3.5 Isolated I/F group for a complex payload controller. © 4Links Ltd. (Schematic based
on ADuM5402/ADuM1402 isolators, SN65HVD30 transceivers and an AM26LV31 line driver)

The I/O-Board debug/JTAG Interface finally completes the set of external OBC
interfaces. The electrical grounding and termination diagrams are provided in
Fig. 3.6.

Fig. 3.6 I/O-Board JTAG Interface. © 4Links Ltd.

3.7 I/O-Board Interface Access Protocols

The board’s interface types were already mentioned in the previous section. The
protocols for interface access from the OBC Processor-Board are listed here.

Memory:
Beyond the RMAP protocol, there is no additional protocol for data written to or
read from the State Vector and Telemetry memories. All data bytes in a command
or reply are copied exactly to, or read from, absolute memory addresses.

Handling of the data in the memory—namely spacecraft configuration data,
housekeeping telemetry etc.—has to be performed exclusively by the spacecraft
Onboard Software.

UART:
There is also no additional protocol for data written to/read from a UART. All data
Bytes in a command packet written to the UART are forwarded through the UART
without addition or deletion or change of any Byte. Similarly, all data Bytes
received by the UART are read from the UART’s buffer without addition, deletion
or change of any Byte.

Logic-out:
The logic-out and logic-in interfaces use a simple protocol within the RMAP
packet.
Each logic-out address allows one or more signals to be accessed. Each signal may
be set low, set high, or left unchanged. The data is a sequence of bytes, one for
each signal, in the order shown in the interface table above (Table 3.1). A data
value of 0 will set the output signal low, a value of 255 will set the output signal
high, and any other value will leave the signal unchanged.
If fewer data bytes than signals are sent, Logic-out signals that are not sent data
will not change.
If more data bytes than signals are sent, excess bytes will be ignored and no
error will be reported.

Logic-in:
Values are returned as 0 (signal wire low) or 255 (signal wire high).
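
The conventions can be summarized in a short sketch (the three-signal example is arbitrary):

```c
#include <stdint.h>

/* Logic-out byte values per the protocol above. */
#define SIG_LOW  0u   /* drive the signal low                */
#define SIG_HIGH 255u /* drive the signal high               */
#define SIG_KEEP 1u   /* any other value leaves it unchanged */

/* Example payload for a 3-signal logic-out address: set signal 0 high,
 * leave signal 1 unchanged, set signal 2 low. Sending fewer bytes than
 * signals leaves the trailing signals unchanged; excess bytes are
 * ignored. */
static const uint8_t logic_out_payload[3] = { SIG_HIGH, SIG_KEEP, SIG_LOW };

/* Logic-in replies carry one byte per signal: 0 = wire low, 255 = high. */
static int logic_in_is_high(uint8_t value)
{
    return value == 255u;
}
```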

IIC:
The IIC interfaces use a simple protocol within the RMAP packet.
Command: Commands consist of an address byte (with the least significant bit set
to 0), one byte indicating how many bytes are to be written (may be zero), one byte
indicating how many bytes are to be read (may be zero) and the bytes (if any) to be
written:

<address (lsb clear)> <bytes to write> <bytes to read> [<write bytes>]

Reply: the bytes read (if any).
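
A sketch of assembling such a command payload follows; representing the IIC device address as a 7-bit value shifted left is an assumption made here for illustration:

```c
#include <stdint.h>
#include <string.h>

/* Build an IIC command payload:
 *   <address (lsb clear)> <bytes to write> <bytes to read> [<write bytes>]
 * 'buf' must hold at least 3 + wlen bytes; the total length is returned. */
static int iic_build_cmd(uint8_t *buf, uint8_t addr7,
                         const uint8_t *wdata, uint8_t wlen, uint8_t rlen)
{
    buf[0] = (uint8_t)(addr7 << 1); /* 7-bit device address, LSB = 0 */
    buf[1] = wlen;                  /* bytes to write (may be zero)  */
    buf[2] = rlen;                  /* bytes to read (may be zero)   */
    if (wlen > 0u)
        memcpy(&buf[3], wdata, wlen);
    return 3 + wlen;
}
```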

Fiberoptic Gyro interface:


The interface to the fiberoptic gyro (FOG) is as specified in the I/O-Board ICD:
Command: Of the 32 bytes in the command, all unused bytes (31 of the 32) must
be set to zero. The implementation will send non-zero bytes to the bus.
Reply: 32 bytes.

3.8 I/O-Board Connectors and Pin Assignments

The connector placement on the I/O-Boards is as depicted in Fig. 3.7. The CCSDS-
Board schematic is similar, comprising just connectors A, B, C and E.

Fig. 3.7 I/O-Board EM. © 4Links Ltd. (Connectors shown: A – SpaceWire, B – power,
C – SpaceWire, D – I/O, E – I/O, respectively E – CCSDS on the CCSDS-Board)

3.8.1 Connectors-A and C (OBC internal)

The connectors A and C for SpaceWire links to the OBC Processor-Boards are
following SpaceWire standard and are implemented as Micro-D High Density
connectors.

Connector-A: SpaceWire—Nominal
This SpaceWire link will be used by default if it succeeds in establishing a con-
nection with the OBC Processor Board.

Connector-C: SpaceWire—Redundant
This redundant SpaceWire link will be used if the nominal link fails to establish a
connection with the OBC Processor-Board. If the nominal SpaceWire link suc-
ceeds in establishing a connection with the OBC Processor-Board, the redundant
link is disabled.

3.8.2 Connector-B (OBC internal)

The power supply of the I/O-Board is provided by the connected OBC Power-
Board via connector B, a Sub-D High Density connector, 15-way. The pin
assignment is shown in Table 3.3:

Table 3.3 Power connector pinout I/O/CCSDS board

Pin  Voltage to I/O/CCSDS board, heater wires and non-connected (NC) pins
1 NC
2 NC
3 GND
4 GND
5 Heater2
6 NC
7 NC
8 GND
9 GND
10 Heater1
11 +3V3
12 +3V3
13 GND
14 Heater4
15 Heater3

3.8.3 Connectors-D and E (OBC external)

Connectors D and E (or J5/6 and J11/12 on OBC unit level—see Fig. 11.4) pro-
vide the signal I/O between OBC and connected spacecraft equipment, connector
E in addition carries the I/O-Board JTAG Interface. To avoid connection errors
during harness mounting
• connector D is a Micro-miniature D-socket (female), 100-way and
• connector E is a Micro-miniature D-plug (male), 100-way with pins protected
by the connector body.

The signal I/O connections comprise both the standard grounded groups and the
isolated I/F groups. The standard ground pins are all equivalent. Within each
isolated group the ground pins are equivalent per connector.
• Logic inputs can accept 3.3 V CMOS, 3.3 V TTL and 5 V TTL signals. Logic
outputs meet the requirements of 3.3 V CMOS, 3.3 V TTL and 5 V TTL
signals.
• The SW_NVRET (connector E) logic output has a series diode limiting the
output to being a voltage source at 3 V.
• The connector data pins behave as high-impedance wires when the buffers
(or the whole board) are not powered—these buffers must be powered before
normal I/O operation is possible.
• JTAG pins on connector E connect to/from the FPGA via permanently powered
buffers to provide ESD protection.
• JTAG TRST has a pull-down resistor to hold JTAG reset unless it is actively
taken high—the programmer may need to be configured to drive this pin.

The connector pin assignments for connectors D and E are depicted in the
annex Sects. 11.7 (I/O-Board) and 11.8 (CCSDS-Board).

3.9 I/O and CCSDS-Board Radiation Characteristic

Both the I/O-Boards and the CCSDS-Boards use a Microsemi/Actel ProASIC3
FPGA as processing unit, an A3PE3000L in a PQ208 package. This chip provides
the following radiation immunity characteristics:
• A Total Ionizing Dose (TID) tolerance up to 15 krad (Si)
• A Single Event Latch-up immunity (SEL) to LET_TH > 6 MeV cm²/mg (before
TMR)
• An immunity to Single-Event Upsets (SEU) to LET_TH > 96 MeV cm²/mg.

3.10 I/O and CCSDS-Board Temperature Limits

The operating temperature range of both the I/O and CCSDS-Board is from −40
to +85 °C.
Note that these limits apply to the temperature of the components/silicon;
allowance must be made for the thermal conductivities from those components to
the chassis.
The storage temperature range is −55 to +105 °C.

3.11 4Links Development Partner

4Links gratefully acknowledges the support from Microsemi/Actel in providing


access to FPGA design software.
Chapter 4
The CCSDS Decoder/Encoder Boards

Sandi Habinc


4.1 Introduction

Traditionally the implementation of Telemetry Encoders and Telecommand


Decoders for space has been made in hardware, at least for the last two decades.
This was also the approach that was envisaged when these OBC boards had been
conceptualized. But with the availability of more processing power (e.g. the
LEON3FT 32-bit fault-tolerant SPARC™ V8 processor), more of the encoding
and decoding tasks can be moved to software, allowing flexibility for adapting the
system to on-going standardization efforts. The European return of software-based
decoders in space was made in late 2009 with the launch of a European technology
demonstrator satellite (PROBA-II).

S. Habinc (&)
Aeroflex Gaisler AB, Göteborg, Sweden
e-mail: sandi@gaisler.com


The approach followed here in this CDPI architecture is that part of the CCSDS
decoding/encoding is performed in FPGA hardware on the CCSDS decoder/
encoder board, and part of the task is done in software using libraries provided by
Aeroflex Gaisler AB together with the RTEMS real-time operating system for the
Aeroflex Processor-Boards. The processing IP Cores in the FPGA on the CCSDS-
Board and the software libraries, which are available in the RTEMS SPARC
tailoring and run on the LEON3FT Processor-Board, are designed in a common
architecture by Aeroflex Gaisler AB.
The CCSDS Decoder/Encoder boards are based on the same Microsemi/Actel
RT ProASIC3 FPGA as the I/O-Boards and are also manufactured by 4Links Ltd.
Since the CCSDS-Board only uses the SpaceWire interfaces to the Processor-
Boards, the CLTU and CADU NRZ-L lines and the HPC UART interface to the
PCDU, it carries a reduced interface driver IC population on the PCB compared to
the I/O-Board. The same applies to the memory devices on the PCB. The board
hardware even shares the PCB layout with the I/O-Boards, as well as the SpaceWire
interface LVDS transceiver design for the Processor-Board interfaces. With respect
to electronic circuitry it is a "not fully populated I/O-Board". The connector pinout
of the CCSDS-Boards can be taken from the annex Sect. 11.8.
The IP Cores loaded in the FPGA are provided by Aeroflex Gaisler AB. The
product name for this specific implementation is GR-TMTC-0004. The overall
architecture is based on IP Cores from the GRLIB VHDL IP Core library. All
functions implemented in the FPGA are based on Triple Module Redundancy
(TMR) technique to assure sufficiently high robustness under space environmental
radiation conditions. The GR-TMTC-0004 is also available for other devices of the
Microsemi/Actel ProAsic3 series in other package types and for diverse speed
grades.
This main chapter consists of extracts from the Aeroflex Gaisler CCSDS
TM/TC and SpaceWire FPGA Data Sheet and User’s Manual [62], which is the
detailed documentation of the GR-TMTC-0004 product. The tailoring is done
according to what the overall OBC user needs to know for
• installation of the GR-TMTC-0004 FPGA programming file on the CCSDS
boards and for
• applying the proper TC/TM link settings for ground equipment to connect to the
OBC.
Since this chapter provides all the necessary details on the TM encoding, TC
decoding and High Priority TC handling functionality, it exceeds the size of the
other, purely hardware oriented main chapters.

The TM/TC FPGA device features the following functions:


• CCSDS/ECSS compliant Telemetry encoder:
– Multiple Virtual Channels implemented in hardware, via two SpaceWire
RMAP I/Fs
– Reed-Solomon and Convolutional Encoding in hardware

• CCSDS/ECSS compliant Telecommand decoder:


– Multiple Virtual Channels implemented in software, via a SpaceWire RMAP
I/F
– One Virtual Channel implemented in hardware, with pulse commands
• Start sequence search and BCH decoding in hardware.
The mix between hardware and software implementation caters for a safe and
sound system while providing the flexibility to support upcoming standards.
The novelty of this TM/TC FPGA design is that the communication between
the telemetry and telecommand system and Onboard Computer, as well as pay-
load, is done by means of the Remote Memory Access Protocol (RMAP) over
SpaceWire links. Via RMAP read and write commands the device status can be
observed and it can be controlled in a safe (by means of verified-write commands)
and standardized way (ECSS standard).
For telemetry, the Space Packet is carried in the Data field of an RMAP write
command. The RMAP protocol provides additional protection of the Space Packet
by means of the 8-bit Data CRC field, which can be used to discard any packets
that have been received with errors. (The Space Packets can themselves include a
16-bit CRC as optional Packet Error Control, but this would require the checking
of the Space Packet which is not in line with a layered protocol approach).
The routing is done by means of the addressing capability of the RMAP write
command; the address can be used for example to distinguish different Virtual
Channels on a telemetry downlink. For telecommands, complete Transfer Frames
can be moved between the FPGA device and Onboard Computer.
The RMAP target implementation in the FPGA requires no local processor,
simplifying the design and releasing logic resources. The CCSDS/ECSS telemetry
and telecommand software stacks are handled over RMAP.
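
As an illustration of this pattern, a minimal sketch of a telemetry submission call is given below; the rmap_write() helper and the addressing constants are hypothetical placeholders, and the actual logical address and per-Virtual-Channel addresses must be taken from [62]:

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical RMAP write helper from the OBSW SpaceWire driver stack. */
extern int rmap_write(uint8_t logical_addr, uint8_t ext_addr,
                      uint32_t addr, const uint8_t *data, size_t len);

/* Placeholder addressing only: it is the RMAP write address that
 * selects the telemetry Virtual Channel; the real logical address and
 * per-channel addresses must be taken from the memory map in [62]. */
#define TMTC_LOGICAL_ADDR 0xFEu                  /* placeholder */
#define TMTC_VC_ADDR(vc)  ((uint32_t)(vc) << 16) /* placeholder */

/* Submit one complete CCSDS Space Packet to Virtual Channel vc (0..3).
 * The 8-bit RMAP data CRC protects the packet in transit; a corrupted
 * write is discarded and shows up in the RMAP acknowledgement. */
static int tm_send_space_packet(unsigned vc, const uint8_t *pkt, size_t len)
{
    return rmap_write(TMTC_LOGICAL_ADDR, 0u, TMTC_VC_ADDR(vc), pkt, len);
}
```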

4.2 Architectural Overview

Figure 4.1 shows a simple block diagram of the device. Although this block
diagram shows the functional structure, its external interfaces shown here directly
correspond to the hardware interfaces of the CCSDS-Board:
• On the left side the two SpaceWire interfaces to the processor boards can be
seen which physically are implemented on the front side of the CCSDS Board,
as for the I/O Board.
• The same applies to the power supply connector. Power supply is not depicted in
this logical diagram.
• Furthermore, on the left side of this figure the High Priority Command (HPC) of
the telecommand decoder is shown. It is cited with ‘‘TC UART’’ here since it is
implemented as an RS422 type interface.

Fig. 4.1 CCSDS decoder/encoder block diagram. © Aeroflex Gaisler AB

• On the right side of the figure the CLTU Telecommand input interface and the
CADU Telemetry interface are shown (both also of type RS422 in hardware).
Some of the interfaces (e.g. the TC input) are physically split into multiple signals.
For the explanations and the signal overview please refer to the later Sect. 4.2.9
and Fig. 4.2. Note that all IP Cores with AMBA AHB master interfaces also have
APB slave interfaces for configuration and status monitoring, although not shown
in the block diagram. The Telemetry and Telecommand specification comprises
the following elements:
• CCSDS compliant Telemetry encoder:
– Input:
4 Virtual Channels + 1 Virtual Channel for Idle Frames
Input access via SpaceWire link
CCSDS Space Packet data (or any custom data block)

CLCW
Input via SpaceWire link
CLCW internally from hardware commands
CLCW externally from two dedicated asynchronous bit serial inputs
– Output:
CADU/encoded CADU
NRZ-L encoding
Pseudo-Randomization
Reed-Solomon and/or Convolutional encoding
Bit synchronous output: clock and data
• CCSDS compliant Telecommand decoder (software commands):
– Layers in hardware:
Coding layer
– Input:
Auto adaptable bit rate
Bit synchronous input: clock, qualifier and data
– Output:
Output access via SpaceWire link
CLTU (Telecommand Transfer Frame and Filler Data)
CLCW internally connected to Telemetry encoder
CLCW on dedicated asynchronous bit serial output
• CCSDS compliant Telecommand decoder (hardware commands):
– Layers in hardware:
Coding layer
Transfer layer (BD frames only)
CLCW internally connected to Telemetry encoder
– Input:
Auto adaptable bit rate
Bit synchronous input: clock, qualifier and data
Telecommand Frame with Segment
– Output:
Redundant UART
CLCW on dedicated asynchronous bit serial output.

Fig. 4.2 Signal overview. © Aeroflex Gaisler AB

4.2.1 Interfaces

The following interfaces are provided:


• Telemetry
– Telemetry transmitter clock input
– CLCW externally from two dedicated asynchronous bit serial inputs
– Physical layer output:
Two sets of bit synchronous output: clock and data
• Telecommand
– Physical layer input:
Two sets of bit synchronous input: data, qualifier (bit lock), clock and RF
status
– Hardware commands:
Redundant UART output
– CLCW on dedicated asynchronous bit serial output (hardware commands)
– CLCW on dedicated asynchronous bit serial output (software commands)

• System level
– System clock and reset
– SpaceWire link with RMAP support for telemetry and telecommand.

4.2.2 Command Link Control Word Coupling

As a special feature the CCSDS Decoders/Encoders provide an external routing of
the Command Link Control Word (CLCW) coupling (also as RS422) between the
two redundant boards, so that if one transmitter of the spacecraft fails, the received
commands' CLCWs can still be downlinked via the redundant transmitter chain.
This significantly simplifies command chain FDIR.

4.2.3 Clock and Reset

The system clock is taken directly from a separate external input. The telemetry
transmitter clock is derived from the system clock. The SpaceWire transmitter
clock is derived from system clock. The device is reset with a single external reset
input that need not be synchronous with the system clock input.

4.2.4 Performance

Telemetry downlink rate is programmable up to at least 2 Mbit/s, based on a
10 MHz system clock. A telecommand uplink rate up to at least 100 kbit/s is
supported.
A SpaceWire link rate up to at least 10 Mbit/s is supported, based on a 10 MHz
system clock. A system clock frequency up to at least 10 MHz is supported, based
on a 10 MHz input clock.

4.2.5 Telemetry Encoder

The CCSDS Telemetry Encoder implements the Data Link Layer, covering the
Protocol Sub-layer and the Synchronization and Coding Sub-layer and part of the
Physical Layer of the packet telemetry encoder protocol.
The Telemetry Encoder comprises several encoders and modulators imple-
menting the Consultative Committee for Space Data Systems (CCSDS) recom-
mendations, European Cooperation on Space Standardization (ECSS) and the

European Space Agency (ESA) Procedures, Standards and Specifications (PSS) for
telemetry and channel coding.
The Telemetry Encoder implements four Virtual Channels accessible via
SpaceWire links. The Virtual Channels accept CCSDS Space Packet data [27] as
input via the SpaceWire RMAP protocol. An additional Virtual Channel is
implemented for Idle Frames only.
In the target satellite project the "Virtual Channel Generation Function Input
Interface" of the encoder was used, as described in depth in Sect. 4.3.12. This
simplifies VC handling for the software designers and only requires the proper use
of the registers for submission of CCSDS Space Packets to the encoder and of the
according activation and control registers. These are described further in Sect. 4.3.

4.2.5.1 Telemetry Encoder Specification

The following Data Link Protocol Sub-layer [26] functionality (CCSDS-132.0) is
not implemented in hardware:
• Packet Processing
• Virtual Channel Frame Service (see also Virtual Channel 0, 1, 2 and 3)
• Master Channel Frame Service (only single Spacecraft Identifier supported)
• Master Channel Multiplexing (only single Spacecraft Identifier supported).

The following Data Link Protocol Sub-layer functionality (CCSDS-132.0) is
implemented in hardware:
• Virtual Channel Generation (for Virtual Channels 0, 1, 2 and 3)
• Virtual Channel Generation (for Idle Frame generation only, e.g. Virtual
Channel 7)
• Virtual Channel Multiplexing
• Master Channel Generation
• All Frame Generation
• Multiplexing of four CLCW sources, of which two external via asynchronous bit
serial interfaces.

This Synchronization and Channel Coding Sub-Layer [24] functionality (CCSDS-
131.0) is implemented in hardware:
• Attached Synchronization Marker
• Reed-Solomon coding
• Pseudo-Randomizer
• Convolutional coding.

This Physical Layer [21] functionality (ECSS-E-ST-50-05C) is implemented in


hardware:

• Non-Return-to-Zero Level modulation (NRZ-L).

The Telemetry Encoder fixed configuration is as follows:


• fixed Transfer Frame format, version 00b, Packet Telemetry
• fixed Transfer Frame length of 1115 octets
• common Master Channel Frame Counter for all Virtual Channels
• fixed nominal Attached Synchronization Marker usage
• fixed 2 kB telemetry transmit FIFO
• fixed 8 kB on-chip EDAC protected RAM memory per Virtual Channel 0, 1, 2
and 3.

The Telemetry Encoder programmability is as follows:


• telemetry Spacecraft Identifier
• telemetry OCF/CLCW enable
• telemetry No RF Available and No Bit Lock bits in CLCW overwriting from
input pins
• telemetry Reed-Solomon enable
(E = 16 coding, interleave depth 5, 160 check symbols)
• telemetry Pseudo Randomization enable
• telemetry Convolutional Encoder enable and rate
• telemetry transfer rate.

The Telemetry Encoder does not implement the following:


• no Frame Error Control Field (FECF)/CRC
• no Advanced Orbiting Systems (AOS) support (also no Insert Zone (AOS) and
no Frame Header Error Control (FHEC))
• no Transfer Frame Secondary Header (also no Extended Virtual Channel Frame
Counter)
• no Turbo Encoding
• no Split-Phase Level modulation
• no Sub-carrier modulation.

4.2.5.2 Virtual Channels 0, 1, 2 and 3

Virtual Channels 0, 1, 2 and 3 are implemented in hardware without any software


support being required. Data are input via SpaceWire RMAP commands. See
Sects. 4.3.12 and 4.6 for details.
The following Data Link-Protocol Sub-layer [26] functionality (CCSDS-132.0)
is implemented:
• Virtual Channel Generation
– Transfer Frame Primary Header insertion
– Transfer Frame Data Field insertion

– First Header Pointer (FHP) handling and insertion


– Buffering of two complete Transfer Frames per Virtual Channel
– CCSDS Space Packet [27] data (CCSDS-133.0) input (or user-defined
data-blocks).

4.2.5.3 Virtual Channel 7

Idle Frames are generated on a separate Virtual Channel, using identifier 7. See
Sect. 4.3.3.4.

4.2.6 Telecommand Decoder

The CCSDS Telecommand Decoder implements part of the Data Link Layer,
covering the Protocol Sub-layer and the Synchronization and Coding Sub-layer
and part of the Physical Layer of the packet telecommand decoder protocol.
The Telecommand Decoder supports decoding of higher protocol layers in
software, being accessible via a SpaceWire link. It also supports decoding in
hardware for hardware commands (see Sect. 4.5), for which CLCW is produced to
on-chip Telemetry Encoder.

4.2.6.1 Telecommand Decoder Specification

The following Data Link—Synchronization and Channel Coding Sub-Layer [34]


functionality (CCSDS-231.0) is implemented in hardware:
• Pseudo-De-randomization
• BCH codeblock decoding
• Start Sequence Search.

The following Physical Layer [21] functionality (ECSS-E-ST-50-05C) is


implemented in hardware:
• Non-Return-to-Zero Level de-modulation (NRZ-L).

The telecommand decoder fixed configuration is as follows:


• fixed telecommand decoder support for CCSDS/ECSS functionality, not ESA
PSS

The telecommand decoder provides the following fixed configuration values:


• telecommand (hardware commands) Spacecraft Identifier
• telecommand (hardware commands) Virtual Channel Identifier (with bit 1 taken
from input pin)

• telecommand Pseudo-De-randomization disabled


• telecommand NRZ-L enabled
• telecommand RF available indicator, positive polarity
• telecommand active signal (bit lock), positive polarity
• telecommand bit clock active, rising edge.

The Telecommand Decoder has multiple separate serial input streams from
transponders etc., comprising serial data, clock, channel active indicator (bit lock)
and RF carrier available. The input bit rate is auto-adaptable.

4.2.6.2 Software Virtual Channel

The interface between the Telecommand Decoder hardware and software is a


SpaceWire link with RMAP protocol. The CLCW produced by the software is
input to the Telemetry Encoder via the Telecommand Decoder CLCW Registers
(CLCWRn), see Sect. 4.4.9 for details, using the SpaceWire link with RMAP
protocol, with the same information being output on an asynchronous bit serial
output suitable for cross-strapping.
The higher protocol levels are implemented in software. These software tele-
commands are stored in memory and can be accessed via the SpaceWire interfaces.
The software implementation of the higher layers of the telecommand decoder
allows for implementation flexibility and accommodation of future standard
enhancements. See Sects. 4.4 and 4.6 for details.

4.2.6.3 Hardware Virtual Channel

A separate Virtual Channel for hardware commands is implemented in hardware,


without the need of software support. The hardware commands are output on two
serial UART ports. The hardware commands are carried as Space Packets inside
the Segment Data Field of a Transfer Frame Data Field, and the Transfer Frame
includes the Frame Error Control Field (FECF/CRC).
The Application Layer functionality is not implemented in hardware.
The Space Packet Protocol layer [27] functionality (CCSDS-133.0) is not
implemented in hardware. This Data Link Protocol Sub-Layer [35] functionality
(CCSDS-232.0) is implemented in hardware:
• Virtual Channel Packet Extraction
• Virtual Channel Segment Extraction
• Virtual Channel Reception:
– Support for Command Link Control Word (CLCW)
• Virtual Channel Demultiplexing
• Master Channel Demultiplexing

• All Frames Reception:


– Frame Delimiting and Fill Removal Procedure; and
– Frame Validation Check Procedure, in this order.

The CLCW is automatically transferred to the on-chip Telemetry Encoder, with


the same information being output on an asynchronous bit serial output suitable for
cross-strapping.
The hardware telecommands are implemented entirely in hardware and do not
require any software and therefore can be used for critical operations. See Sect. 4.5
for details.

4.2.7 SpaceWire Link Interfaces

The SpaceWire links provide an interface between the on-chip bus and a Space-
Wire network. They implement the SpaceWire standard [12] with the protocol
identification extension [13]. The Memory Access Protocol (RMAP) command
handler implements the ECSS standard [14].

4.2.8 On-Chip Memory

Two times 16 kB of on-chip volatile memory are provided in the FPGA for
temporary storage of 7 Telemetry Transfer Frames for each of the Telemetry
Virtual Channels 0 through 3, together with a dedicated hard-coded descriptor
memory containing 7 descriptors for each channel. An additional 8 kB of on-chip
volatile memory is provided for telecommand data. All memory is protected by
EDAC. Neither automatic scrubbing nor an error counter is implemented.

4.2.9 Signal Overview

The signal overview of the telemetry encoder and telecommand decoder is shown
in Fig. 4.2.
The functional signals are shown in Table 4.1. Note that index 0 is MSB for
TM/TC signals.
Further details on the applied IP Cores, interrupts, the memory map and signals
can be taken from the reference document [62].

Table 4.1 External signals


Name Usage Direction Polarity Reset
clk System clock (also SpaceWire and Telemetry transmit clock) In Rising –
resetn System reset In Low –
tck JTAG clock In – –
tms JTAG TMS In High –
tdi JTAG TDI In High –
tdo JTAG TDO Out High –
caduclk[0:1] Telemetry CADU serial bit clock output Out – Low
caduout[0:1] Telemetry CADU serial bit data output Out – Low
id[0:1] Identifier (id[1] used with Telecommand hardware In – –
command Virtual Channel Identifier and SpaceWire
Node Address)
tcrfa[0:1] Telecommand CLTU RF available indicator In – –
tcactive[0:1] Telecommand CLTU input active indicator (bit lock) In – –
tcclk[0:1] Telecommand CLTU serial bit clock input In – –
tcdata[0:1] Telecommand CLTU serial bit data input In – –
tcuart[0:1] Telecommand (hardware command) UART output Out – High
clcwin[0:1] Telemetry CLCW asynchronous bit serial inputs In – –
clcwout[0:1] Telecommand CLCW asynchronous bit serial outputs Out – High
spw rxd[0:1] Data input In High –
spw rxs[0:1] Strobe input In High –
spw txd[0:1] Data output Out High Low
spw txs[0:1] Strobe output Out High Low

4.3 Telemetry Encoder

4.3.1 Overview

The CCSDS/ECSS/PSS Telemetry Encoder implements part of the Data Link


Layer, covering the Protocol Sub-layer and the Frame Synchronization and Coding
Sub-layer and part of the Physical Layer of the packet telemetry encoder protocol.
The operation of the Telemetry Encoder is highly programmable by means of
control registers. The Telemetry Encoder comprises several encoders and modu-
lators implementing the Consultative Committee for Space Data Systems
(CCSDS) recommendations, European Cooperation on Space Standardization
(ECSS) and the European Space Agency (ESA) Procedures, Standards and
Specifications (PSS) for telemetry and channel coding. The encoder comprises the
following:
• Packet Telemetry Encoder (TM)
• Reed-Solomon Encoder
• Pseudo-Randomizer (PSR)

Fig. 4.3 Telemetry Encoder block diagram. © Aeroflex Gaisler AB

• Non-Return-to-Zero Level encoder (NRZ-L)


• Convolutional Encoder (CE)
• Clock Divider (CD).

Note that the SpaceWire input interface is described separately. The SpaceWire
interfaces and corresponding Virtual Channel Generation function and buffer
memories are not shown in the block diagram below, as is the case for the CLCW
multiplexing function (Fig. 4.3).

4.3.2 Layers

The relationship between Packet Telemetry standard and the Open Systems
Interconnection (OSI) reference model is such that the OSI Data Link Layer
corresponds to two separate layers, namely the Data Link Protocol Sub-layer and
Synchronization and Channel Coding Sub-Layer.

4.3.2.1 Data Link Protocol Sub-layer

The following functionality is not implemented in the core:


• Packet Processing
• Virtual Channel Frame Service (see also Virtual Channel 0, 1, 2 and 3)
• Master Channel Frame Service (only single Spacecraft Identifier supported)
• Master Channel Multiplexing (only single Spacecraft Identifier supported).

The following functionality is implemented in the core:


• Virtual Channel Generation (for Virtual Channels 0, 1, 2 and 3)
• Virtual Channel Generation (for Idle Frame generation only, e.g. Virtual
Channel 7)
• Master Channel Generation (for all frames)
• All Frame Generation (for all frames).

4.3.2.2 Synchronization and Channel Coding Sub-Layer

The following functionality is implemented in the core:


• Attached Synchronization Marker
• Reed-Solomon coding
• Pseudo-Randomizer
• Convolutional coding.

4.3.2.3 Physical Layer

The following functionality is implemented in the core:


• Non-Return-to-Zero Level modulation.

4.3.3 Data Link Protocol Sub-Layer

4.3.3.1 Physical Channel

The configuration of a Physical Channel covers the following parameters:


• Transfer Frame Length is fixed to 1115 octets
• Transfer Frame Version Number is fixed to 0, i.e. Packet Telemetry.

4.3.3.2 Virtual Channel Frame Service

The Virtual Channel Frame Service is not implemented, except as a support for
Virtual Channels 0, 1, 2, and 3.

4.3.3.3 Virtual Channel Generation: Virtual Channels 0, 1, 2 and 3

There is a Virtual Channel Generation function for each of Virtual Channels 0, 1, 2
and 3. The channels each have an on-chip memory buffer to store seven complete
Transfer Frames. Each Virtual Channel Generation function receives data from the
SpaceWire interface and stores them in the EDAC-protected on-chip memory
buffer (see Sect. 4.3.12).
The function supports:
• Transfer Frame Primary Header insertion
• Transfer Frame Data Field insertion (with support for different lengths due to
OCF)
• First Header Pointer (FHP) handling and insertion.

The function keeps track of the number of octets received and of the packet
boundaries in order to calculate the First Header Pointer (FHP). The data are
stored in pre-allocated slots in the buffer memory comprising complete Transfer
Frames. The module fully supports the FHP generation and does not require any
alignment of the packets with the Transfer Frame Data Field boundary. The buffer
memory space allocated to each Virtual Channel is treated as a circular buffer. The
function communicates with the Virtual Channel Frame Service by means of the
on-chip buffer memory.
The data input format can be CCSDS Space Packet [27] or any user-defined
data-block (see Sect. 4.3.12).
The Virtual Channel Generation function for Virtual Channels 0, 1, 2 and 3 is
enabled through the GRTM DMA External VC Control register. The transfer is
done automatically via the Virtual Channel Frame Service (i.e. DMA function).
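
The FHP handling can be pictured with the small software model below; it is illustrative only, since the real bookkeeping is done in the FPGA hardware. The all-ones FHP value denoting that no packet header starts in the frame follows the CCSDS convention:

```c
#include <stdint.h>

#define FHP_NO_PACKET 0x7FFu /* all ones: no packet header starts here */

/* Software model of the hardware's FHP bookkeeping: when a new Transfer
 * Frame Data Field is opened, 'pending_octets' octets of the previous
 * packet still have to be placed first; the first new packet header,
 * if any, then starts right after them. */
static uint16_t fhp_for_frame(uint32_t pending_octets, uint32_t data_field_len)
{
    if (pending_octets >= data_field_len)
        return FHP_NO_PACKET;        /* frame holds only a continuation */
    return (uint16_t)pending_octets; /* octet offset of first header    */
}
```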

4.3.3.4 Virtual Channel Generation: Idle Frames—Virtual Channel 7

The Virtual Channel Generation function is used to generate the Virtual Channel
Counter for Idle Frames as described below.

4.3.3.5 Virtual Channel Multiplexing

The Virtual Channel Multiplexing Function is used to multiplex Transfer Frames


of different Virtual Channels of a Master Channel. Virtual Channel Multiplexing

in the core is performed between two sources: Virtual Channel Generation func-
tion (Virtual Channels 0, 1, 2 and 3) and Idle Frames (Virtual Channel 7). Note
that multiplexing between different Virtual Channels is assumed to be done as part
of the Virtual Channel Frame Service outside the core, i.e. in hardware for Virtual
Channels 0, 1, 2 and 3. The Idle Frame generation is described hereafter.
Bandwidth allocation between Virtual Channels 0, 1, 2 and 3 is done in hard-
ware and is equal between these channels, see Sect. 4.3.11 and [62]. Bandwidth is
allocated to VC7 only when no other VC has anything to send. If one VC
has no data to send, then the next one can send.
Idle Frame generation can be enabled and disabled by means of a register. The
Spacecraft ID to be used for Idle Frames is programmable by means of a register.
The Virtual Channel ID to be used for Idle Frames is programmable by means of a
register, e.g. Virtual Channel 7.
Master Channel Counter generation for Idle Frames can be enabled and disabled
by means of a register. Note that it is also possible to generate the Master Channel
Counter field as part of the Master Channel Generation function described in the
next section. When Master Channel Counter generation is enabled for Idle Frames,
then the generation in the Master Channel Generation function is bypassed.

4.3.3.6 Master Channel Generation

The Master Channel Counter is generated for all frames on the master channel.
The Operational Control Field (OCF) is generated from a 32-bit input, via the
Command Link Control Word (CLCW) input of the Telecommand Decoder—
Software Commands (see Sect. 4.4.9) or the internal Telecommand Decoder—
Hardware Commands. This is done for all frames on the master channel (MC OCF).
The transmit order repeats every four Transfer Frames and is as follows:
• CLCW from the internal software commands register (Telecommand Decoder
CLCW Register 1 (CLCWR1), see Sect. 4.4.9 for details) is transmitted in
Transfer Frames with Transfer Frame Master Channel Counter value ending
with bits 0b00.
• CLCW from the internal hardware commands is transmitted in Transfer Frames
with Transfer Frame Master Channel Counter value ending with bits 0b01.
• CLCW from the external asynchronous bit serial interface input clcwin[0] is
transmitted in Transfer Frames with Transfer Frame Master Channel Counter
value ending with bits 0b10.
• CLCW from the external asynchronous bit serial interface input clcwin[1] is
transmitted in Transfer Frames with Transfer Frame Master Channel Counter
value ending with bits 0b11.

Note that the above order depends on the state of the static input pin id. If id is
logical zero, then the above scheme is applied, else the two first entries are
swapped with the two last entries.

Note that bits 16 (No RF Available) and 17 (No Bit Lock) of the CLCW and the
project-specific OCF are taken from information carried on discrete inputs tcrfa[]
and tcactive[].
• The Master Channel Frame Service is not implemented.
• The Master Channel Multiplexing Function is not implemented.

4.3.3.7 All Frame Generation

The All Frame Generation functionality operates on all Transfer Frames of a
Physical Channel.

4.3.4 Synchronization and Channel Coding Sub-Layer

4.3.4.1 Attached Synchronization Marker

The 32-bit Attached Synchronization Marker is placed in front of each Transfer
Frame as per [25] and [19].

4.3.4.2 Reed-Solomon Encoder

The CCSDS recommendation [25] and ECSS standard [19] specify Reed-Solomon
codes, among them the (255, 223) code. The ESA PSS standard [40] specifies only
the (255, 223) code. Although the definition style differs between the documents,
the (255, 223) code is the same in all three documents. The definition used in this
document is based on the PSS standard [40]. Details shall be taken from [62].

4.3.4.3 Pseudo-Randomizer

The Pseudo-Randomizer (PSR) generates a bit sequence according to [25] and [19]
which is xor-ed with the data output of preceding encoders. This function allows
the required bit transition density to be obtained on a channel in order to permit the
receiver on ground to maintain bit synchronization. The implementation details are
described in [62].
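
For illustration, the randomizer can be sketched as a small linear feedback
shift register (a minimal sketch following the CCSDS definition—polynomial
x^8 + x^7 + x^5 + x^3 + 1, seeded with all ones—and not the core's actual
implementation, which is described in [62]):

  #include <stdint.h>
  #include <stddef.h>

  /* Xor the CCSDS pseudo-random sequence onto a bit stream held MSB-first
   * in buf. The generated sequence starts 0xFF 0x48 0x0E 0xC0 ... */
  void psr_apply(uint8_t *buf, size_t nbits)
  {
      uint8_t s = 0xFF;                                /* all-ones seed    */
      for (size_t i = 0; i < nbits; i++) {
          uint8_t out = (s >> 7) & 1;                  /* oldest state bit */
          buf[i / 8] ^= (uint8_t)(out << (7 - (i % 8)));
          uint8_t fb = ((s >> 0) ^ (s >> 2) ^ (s >> 4) ^ (s >> 7)) & 1;
          s = (uint8_t)((s << 1) | fb);                /* shift in feedback */
      }
  }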

4.3.4.4 Convolutional Encoder

The Convolutional Encoder (CE) implements the basic convolutional encoding
scheme. The ESA PSS standard [40] specifies a basic convolutional code without
puncturing. This basic convolutional code is also specified in the CCSDS recom-
mendation [25] and ECSS standard [19], which in addition specifies a punctured
convolutional code. For details of the implementation please again refer to [62].
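
As a sketch, the unpunctured rate-1/2, constraint-length-7 code with generator
polynomials 171/133 (octal) and inverted second symbol can be written as below.
Bit-ordering conventions differ between descriptions, so this is an illustration
rather than the core's implementation:

  /* Parity of a 7-bit value. */
  static unsigned parity7(unsigned v)
  {
      v ^= v >> 4; v ^= v >> 2; v ^= v >> 1;
      return v & 1;
  }

  /* Encode one input bit into two output symbols; *state holds the six
   * previous input bits. */
  void ce_encode_bit(unsigned bit, unsigned *state, unsigned sym[2])
  {
      unsigned reg = ((bit & 1) << 6) | (*state & 0x3F);  /* 7-bit window */
      sym[0] = parity7(reg & 0171);       /* G1 = 171 octal               */
      sym[1] = parity7(reg & 0133) ^ 1;   /* G2 = 133 octal, inverted     */
      *state = reg >> 1;                  /* advance window by one bit    */
  }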

4.3.5 Physical Layer

4.3.5.1 Non-Return-to-Zero Level Encoder

The Non-Return-to-Zero Level encoder (NRZ-L) differentially encodes a bit
stream from preceding encoders according to [21]. The waveform is shown in
Fig. 4.4. Both data and the Attached Synchronization Marker (ASM) are affected
by the coding. When the encoder is not enabled, the bit stream is by default non-
return-to-zero level encoded.

Fig. 4.4 NRZ-L waveform

4.3.5.2 Clock Divider

The clock divider (CD) provides clock enable signals for the telemetry and
channel encoding chain. The clock enable signals are used for controlling the bit
rates of the different encoders and modulators. The source for the bit rate frequency
is the system clock input. The system clock input can be divided by up to 2^15.
The divider can be configured during operation to divide the system clock fre-
quency from 1/1 to 1/2^15. The bit rate frequency is based on the output frequency
of the last encoder in a coding chain, except for the sub-carrier modulator. No
actual clock division is performed, since clock enable signals are used. No clock
multiplexing is performed in the core. Details for the clock divider settings are
contained in [62].
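
For illustration, the divider value for a desired bit rate could be computed as
in the following sketch (a hypothetical helper; the actual register layout is
given in [62]):

  #include <stdint.h>

  /* Pick the clock divider for a wanted bit rate; the core divides the
   * system clock by 1 .. 2^15 via clock-enable signals. */
  uint32_t cd_divider(uint32_t sysclk_hz, uint32_t bitrate_hz)
  {
      uint32_t div = sysclk_hz / bitrate_hz;  /* e.g. 10 MHz / 1 Mbit/s = 10 */
      if (div < 1)
          div = 1;
      if (div > (1u << 15))
          div = 1u << 15;                     /* clamp to supported range    */
      return div;
  }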

4.3.6 Connectivity

The output from the Packet Telemetry encoder can be connected to the following
post-processing stages:

• Reed-Solomon encoder
• Pseudo-Randomizer
• Non-Return-to-Zero encoder
• Convolutional encoder.

The processing modules can be arranged into a chain with a certain variability of
the post-processor sequencing. Possible I/O chain connections between these pro-
cessors are explained in [62].

4.3.7 Operation

The Telemetry Encoder DMA interface provides a means for the user to insert
Transfer Frames in the Packet Telemetry and AOS Encoder. Depending on which
functions are enabled in the encoder, the various fields of the Transfer Frame are
overwritten by the encoder. It is also possible to bypass some of these functions for
each Transfer Frame by means of the control bits in the descriptor associated with
each Transfer Frame. The DMA interface allows the implementation of Virtual
Channel Frame Service and Master Channel Frame Service, or a mixture of both,
depending on what functions are enabled or bypassed.

4.3.7.1 Descriptor Setup

The transmitter DMA interface is used for transmitting Transfer Frames on the
downlink. The transmission is done using descriptors located in memory.
A single descriptor is shown in Tables 4.2 and 4.3. The number of bytes to be
sent is set globally for all Transfer Frames in the length field of the GRTM DMA
length register. The address field of the descriptor should point to the start of
the Transfer Frame. The address must be word-aligned. If the Interrupt Enable (IE)
bit is set, an interrupt will be generated when the Transfer Frame has been sent

Table 4.2 GRTM transmit descriptor word 0 (address offset 0x0)

31:16 15 14 13:10 9 8 7 6 5 4 3 2 1 0

RESERVED UE TS RESERVED VCE MCB FSHB OCFB FHECB IZB FECFB IE WR EN


31: 16 RESERVED
15 Underrun error (UE)—underrun occurred while transmitting frame (status bit only)
14 Time strobe (TS)—generate a time strobe for this frame
13: 10 RESERVED
9 Virtual channel counter enable (VCE)—enable virtual channel counter generation
(using the Idle Frame virtual channel counter)
8 Master channel counter bypass (MCB)—bypass master channel counter
generation (TM only)
7 Frame secondary header bypass (FSHB)—bypass frame secondary header
generation (TM only)
6 Operational control field bypass (OCFB)—bypass operational control field
generation
5 Frame header error control bypass (FHECB)—bypass frame header error
control generation (AOS)
4 Insert zone bypass (IZB)—bypass insert zone generation (AOS)
3 Frame error control field bypass (FECFB)—bypass frame error control
field generation
2 Interrupt enable (IE)—an interrupt will be generated when the frame from this
descriptor has been sent provided that the transmitter interrupt enable bit in
the control register is set. The interrupt is generated regardless of whether the
frame was transmitted successfully or terminated with an error
1 Wrap (WR)—set to one to make the descriptor pointer wrap to zero after this
descriptor has been used. If this bit is not set the pointer will increment by 8.
The pointer automatically wraps to zero when the 1 kB boundary of the
descriptor table is reached
0 Enable (EN)—set to one to enable the descriptor. Should always be set last
of all the descriptor fields

Table 4.3 GRTM transmit descriptor word 1 (address offset 0x4)

31:2 1:0
ADDRESS RESERVED
31: 2 Address (ADDRESS)—pointer to the buffer area from where the packet
data will be loaded
1: 0 RESERVED

(this requires that the transmitter interrupt enable bit in the control register is also
set). The interrupt will be generated regardless of whether the Transfer Frame was
transmitted successfully or not. The wrap (WR) bit is also a control bit that should
be set before transmission and it will be explained later in this section.
To enable a descriptor the enable (EN) bit should be set and after this is done,
the descriptor should not be touched until the enable bit has been cleared by the
core.
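
As an illustration, a descriptor could be populated as in the following sketch
(the word layout follows Tables 4.2 and 4.3; the struct and helper themselves
are hypothetical, not taken from a driver):

  #include <stdint.h>

  #define GRTM_DESC_EN (1u << 0)     /* enable descriptor (set last)  */
  #define GRTM_DESC_WR (1u << 1)     /* wrap pointer back to zero     */
  #define GRTM_DESC_IE (1u << 2)     /* interrupt when frame is sent  */

  struct grtm_desc {
      volatile uint32_t ctrl;        /* word 0: control/status bits   */
      volatile uint32_t addr;        /* word 1: frame buffer address  */
  };

  void grtm_desc_setup(struct grtm_desc *d, const void *frame, int irq)
  {
      d->addr = (uint32_t)(uintptr_t)frame;  /* must be word-aligned  */
      /* frame length is set globally in the GRTM DMA length register */
      d->ctrl = (irq ? GRTM_DESC_IE : 0) | GRTM_DESC_EN;  /* EN last  */
  }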

4.3.7.2 Starting Transmissions

Enabling a descriptor is not enough to start a transmission. A pointer to the
memory area holding the descriptors must first be set in the core. This is done in
the transmitter descriptor pointer register. The address must be aligned to a 1 kB
boundary. Bits 31 to 10 hold the base address of the descriptor area while bits 9 to 3
form a pointer to an individual descriptor. The first descriptor should be located at
the base address and when it has been used by the core, the pointer field is
incremented by 8 to point at the next descriptor. The pointer will automatically
wrap back to zero when the next 1 kB boundary has been reached (the descriptor at
address offset 0x3F8 has been used). The WR bit in the descriptors can be set to
make the pointer wrap back to zero before the 1 kB boundary.
The pointer field has also been made writable for maximum flexibility but care
should be taken when writing to the descriptor pointer register. It should never be
touched when a transmission is active.
The final step to activate the transmission is to set the transmit enable bit in the
DMA control register. This tells the core that there are more active descriptors in
the descriptor table. This bit should always be set when new descriptors are
enabled, even if transmissions are already active. The descriptors must always be
enabled before the transmit enable bit is set.
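
A minimal start-up sequence could then look as follows (register offsets per
Table 4.4; the bit position of the transmit enable flag is an assumption made
for illustration):

  #include <stdint.h>

  #define GRTM_DMA_CTRL    0x00u          /* GRTM DMA control register    */
  #define GRTM_DMA_PTR     0x0Cu          /* GRTM DMA descriptor pointer  */
  #define GRTM_DMA_CTRL_TE (1u << 0)      /* transmit enable (assumed)    */

  void grtm_start_tx(volatile uint32_t *grtm, uint32_t desc_table_addr)
  {
      /* the descriptor table must be aligned to a 1 kB boundary:
       * bits 31:10 base address, bits 9:3 next descriptor index */
      grtm[GRTM_DMA_PTR / 4]  = desc_table_addr & ~0x3FFu;
      grtm[GRTM_DMA_CTRL / 4] = GRTM_DMA_CTRL_TE;  /* descriptors active */
  }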

4.3.7.3 Descriptor Handling After Transmission

When a transmission of a frame has finished, status is written to the first word in
the corresponding descriptor. The Underrun Error bit is set if the FIFO became
empty before the frame was completely transmitted. The other bits in the first
descriptor word are set to zero after transmission while the second word is left
untouched. The enable bit should be used as the indicator when a descriptor can be
used again, which is when it has been cleared by the core.
There are multiple bits in the DMA status register that hold transmission status.
The Transmitter Interrupt (TI) bit is set each time a DMA transmission of a
Transfer Frame ended successfully. The Transmitter Error (TE) bit is set each time
a DMA transmission of a Transfer Frame ended with an Underrun Error. For either
event, an interrupt is generated for Transfer Frames for which the Interrupt Enable
(IE) was set in the descriptor (Virtual Channels 0 through 2 only). The interrupt is
maskable with the Interrupt Enable (IE) bit in the control register.
The Transmitter AMBA error (TA) bit is set when an AMBA AHB error was
encountered either when reading a descriptor or when reading Transfer Frame
data. Any active transmissions were aborted and the DMA channel was disabled.
This can be a result of a DMA access caused by any Virtual Channel. It is
recommended that the Telemetry Encoder is reset after an AMBA AHB error. The
interrupt is maskable with the Interrupt Enable (IE) bit in the control register.
The Transfer Frame Sent (TFS) bit is set whenever a Transfer Frame has been
sent, independently of whether it was sent via the DMA interface or generated by
the core. The interrupt is maskable with the Transfer Frame Interrupt Enable
(TFIE) bit in the control register. Any Virtual Channel causes this interrupt.
The Transfer Frame Failure (TFF) bit is set whenever a Transfer Frame has
failed for other reasons, such as when Idle Frame generation is not enabled and no
user Transfer Frame is ready for transmission, independently of whether it was to
be sent via the DMA interface or generated by the core. The interrupt is maskable
with the Transfer Frame Interrupt Enable (TFIE) bit in the control register.

The Transfer Frame Ongoing (TFO) bit is set when DMA transfers are enabled,
and is not cleared until all DMA induced Transfer Frames have been transmitted
after DMA transfers are disabled.
The External Transmitter Interrupt (XTI) bit is set each time a DMA trans-
mission of a Transfer Frame ended successfully (unused here). The External
Transmitter Error (XTE) bit is set each time a DMA transmission of a Transfer
Frame ended with an underrun error (for Virtual Channels 0 through 3 only).

4.3.8 Registers

The core is programmed through registers mapped into APB address space
(Table 4.4).

Table 4.4 GRTM registers


APB address offset Register
0x00 GRTM DMA control register
0x04 GRTM DMA status register
0x08 GRTM DMA length register
0x0C GRTM DMA descriptor pointer register
0x10 GRTM DMA configuration register
0x14 GRTM DMA revision register
0x20 GRTM DMA external VC control and status register
0x2C GRTM DMA external VC descriptor pointer register
0x80 GRTM control register
0x84 GRTM status register (unused)
0x88 GRTM configuration register
0x90 GRTM physical layer register
0x94 GRTM coding sub-layer register
0x98 GRTM attached synchronization marker (unused)
0xA0 GRTM all frames generation register
0xA4 GRTM master frame generation register
0xA8 GRTM idle frame generation register
0xC0 GRTM FSH/insert zone register 0 (unused)
0xC4 GRTM FSH/insert zone register 1 (unused)
0xC8 GRTM FSH/insert zone register 2 (unused)
0xCC GRTM FSH/insert zone register 3 (unused)
0xD0 GRTM operational control field register (unused)

In the annex Sect. 11.3 only a selection of registers is described in detail—those
which were essential for accessing the TM Encoder by the OBSW of the Uni-
versity satellite. Details for the complete register set again can be taken from [62].

4.3.9 Signal Definitions and Reset Values

The signals and their reset values are described in Table 4.5. The key ones are
''RF Available'' and ''Bit Lock'', which ensure that transmission does not start
before the RF ground link is actually established.

Table 4.5 Signal definitions and reset values


Signal name Type Function Active Reset value
tcrfa[] Input, async RF available – –
tcactive[] Input, async Bit lock – –
caduout[] Output Serial bit data, output at caduclk edge (selectable) – –
caduclk[] Output Serial bit data clock Rising Logical 0
clcwin[] Input CLCW data input – –

4.3.10 TM Encoder: Virtual Channel Generation

The CCSDS/ECSS/PSS Telemetry Encoder Virtual Channel Generation function
implements:
• Transfer Frame Primary Header insertion
• Transfer Frame Data Field insertion (with support for different lengths due to
OCF and FECF)
• First Header Pointer (FHP) handling and insertion.

The function keeps track of the number of octets received and the packet
boundaries in order to calculate the First Header Pointer (FHP). The data are
stored in pre-allocated slots in the buffer memory comprising complete Transfer
Frames. The module fully supports the FHP generation and does not require any
alignment of the packets with the Transfer Frame Data Field boundary.
The data input format can be CCSDS Space Packet [27] or any user-defined
data-block. Data is input via a separate Virtual Channel Generation function input
interface.
The function communicates with the Telemetry Encoder Virtual Channel
Frame Service by means of a buffer memory space. The buffer memory space
allocated to the Virtual Channel is treated as a circular buffer. The buffer memory
space is accessed by means of an AMBA AHB master interface.
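
The FHP bookkeeping can be pictured with a small sketch (hypothetical state
variables; 0x7FF is the CCSDS value for ''no packet header starts in this
frame''):

  #include <stdint.h>

  struct fhp_state {
      int fill;         /* octets already placed in the current frame */
      int first_hdr;    /* offset of the first packet start, or -1    */
  };

  /* Call when a new packet begins while filling the frame data field. */
  void fhp_on_packet_start(struct fhp_state *s)
  {
      if (s->first_hdr < 0)
          s->first_hdr = s->fill;     /* remember first header offset  */
  }

  /* First Header Pointer value for the finished frame. */
  uint16_t fhp_value(const struct fhp_state *s)
  {
      return (s->first_hdr < 0) ? 0x7FF : (uint16_t)s->first_hdr;
  }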

4.3.11 TM Encoder: Descriptor

The CCSDS/ECSS/PSS Telemetry Encoder Descriptor implements an automatic
descriptor handler for external Telemetry Virtual Channels implemented in
hardware (Telemetry Encoder Virtual Channel Generation function), not requiring
software support. Details can be taken from [62].

4.3.12 TM Encoder: Virtual Channel Generation Function Input Interface

The Telemetry Encoder Virtual Channel Generation function input interface
implements an interface towards the automatic Virtual Channel Generation
function of the Telemetry Encoder (also called external Virtual Channels). Space
Packets or any other user-defined data block can be input. It is the essential
interface for the user for TM generation.
Data is transferred to the Virtual Channel Generation function by writing to the
AMBA AHB slave interface, located in the AHB I/O area. Writing is only possible
when the packet valid delimiter is asserted, else the access results in an AMBA
access error. It is possible to transfer one, two or four bytes at a time, following the
AMBA big-endian convention regarding send order. The last written data can be
read back via the AMBA AHB slave interface. Data are output as octets to the
Virtual Channel Generation function.
In case the data from a previous write access has not been fully transferred
over the interface, a new write access will result in an AMBA retry response. The
progress of the interface can be monitored via the AMBA APB slave interface. An
interrupt is generated when the data from the last write access has been transferred.
An interrupt is also generated when the ready for input packet indicator is asserted.
The core incorporates status and monitoring functions accessible via the AMBA
APB slave interface. This includes:
• Busy and ready signaling from Virtual Channel Generation function
• Interrupts on ready for new word, or ready for new packet (size 2048 octets).

Two interrupts are implemented by the interface:

Index Name Description


0 NOT BUSY Ready for new data (word, half-word or byte)
1 READY Ready for new packet

The control registers for this function can be found in annex Sect. 11.4.
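
A write loop of this kind could look as follows (illustrative only: the data
area address is a placeholder, a real driver would also honor the busy/ready
status and AMBA retry responses described above, and the byte packing assumes
the big-endian send order):

  #include <stdint.h>

  void vcg_write_packet(volatile uint32_t *vcg_data_area,
                        const uint8_t *pkt, unsigned len)
  {
      unsigned i;
      /* one, two or four bytes per access; word writes shown here */
      for (i = 0; i + 4 <= len; i += 4) {
          uint32_t w = ((uint32_t)pkt[i]     << 24) |
                       ((uint32_t)pkt[i + 1] << 16) |
                       ((uint32_t)pkt[i + 2] <<  8) |
                        (uint32_t)pkt[i + 3];
          *vcg_data_area = w;
      }
      for (; i < len; i++)                  /* trailing bytes singly */
          *(volatile uint8_t *)vcg_data_area = pkt[i];
  }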

4.4 TC Decoder: Software Commands

4.4.1 Overview

The Telecommand Decoder (GRTC) is compliant with the Packet Telecommand
protocol and specification defined by [43] and [44]. The decoder is also compatible
with the CCSDS recommendations stated in [30], [31], [32] and [33]. The Tele-
command Decoder (GRTC) only implements the Coding Layer (CL) (Fig. 4.5).
In the Coding Layer (CL), the telecommand decoder receives bit streams on
multiple channel inputs. The streams are assumed to have been generated in
accordance with the Physical Layer specifications. In the Coding Layer, the
decoder searches all input streams simultaneously until a start sequence is
detected. Only one of the channel inputs is selected for further reception. The
selected stream is bit-error corrected and the resulting corrected information is
passed to the user. The corrected information received in the CL is transferred by
means of Direct Memory Access (DMA) to the on-board processor.

Fig. 4.5 Telecommand Decoder block diagram. © Aeroflex Gaisler AB

The Command Link Control Word (CLCW) and the Frame Analysis Report
(FAR) can be read and written as registers via the AMBA AHB bus. Parts of the
two registers are generated by the Coding Layer (CL). The CLCW is automatically
transmitted to the Telemetry Encoder (TM) for transmission to the ground.
Note that most parts of the CLCW and FAR are not produced by the Tele-
command Decoder (GRTC) hardware part. This is instead done by the software
part of the decoder.

4.4.1.1 Concept

A telecommand decoder in this concept is mainly implemented by software in the
on-board processor (Fig. 4.6). The supporting hardware in the GRTC core
implements the Coding Layer, which includes synchronization pattern detection,
channel selection, codeblock decoding, Direct Memory Access (DMA) capability
and buffering of corrected codeblocks. The hardware also provides a register via
which the Command Link Control Word (CLCW) is made available to a
Telemetry Encoder. The CLCW is to be generated by the software.

Fig. 4.6 Conceptual block diagram. © Aeroflex Gaisler AB

The GRTC has been split into several clock domains to facilitate higher bit rates
and partitioning. The two resulting sub-cores have been named the Telecommand
Channel Layer (TCC) and the Telecommand Interface (TCI). Note that the TCI is
also called AHB2TCI. A complete CCSDS packet telecommand decoder can be
realized at software level according to the latest available standards, starting
from the Transfer Layer.

4.4.1.2 Functions and Options

The Telecommand Decoder (GRTC) only implements the Coding Layer of the
Packet Telecommand Decoder standard [43]. All other layers are to be imple-
mented in software, e.g. the Authentication Unit (AU). The Command Pulse
Distribution Unit (CPDU) is not implemented. As explained in Chap. 1, for the
CDPI architecture a Command Pulse Distribution Unit is not needed.
The following functions of the GRTC are programmable by means of registers:
• Pseudo De-Randomisation
• Non-Return-to-Zero-Mark decoding.

The following functions of the GRTC are pin configurable:


• Polarity of RF Available and Bit Lock inputs
• Edge selection for input channel clock.

The pin configurable settings have been applied accordingly by 4Links Ltd. as
the CCSDS-Board hardware designer.

4.4.2 Data Formats

Figure 4.7 shows the telecommand input protocol waveform.

Fig. 4.7 Telecommand input protocol. © Aeroflex Gaisler AB

4.4.3 Coding Layer

The Coding Layer (CL) synchronizes the incoming bit stream and provides an
error correction capability for the Command Link Transmission Unit (CLTU). The
Coding Layer receives a dirty bit stream together with control information on
whether the physical channel is active or inactive for the multiple input channels.
The bit stream is assumed to be NRZ-L encoded, as the standards specify for
the Physical Layer. As an option, it can also be NRZ-M encoded. There are no
assumptions made regarding the periodicity or continuity of the input clock signal
while an input channel is inactive. The most significant bit (Bit 0 according to
[43]) is received first.
Searching for the Start Sequence, the Coding Layer finds the beginning of a
CLTU and decodes the subsequent codeblocks. As long as no errors are detected, or
errors are detected and corrected, the Coding Layer passes clean blocks of data to
the Transfer Layer which is implemented in software. When a codeblock with an
uncorrectable error is encountered, it is considered as the Tail Sequence, its con-
tents are discarded and the Coding Layer returns to the Start Sequence search mode.

The Coding Layer also provides status information for the FAR, and it is
possible to enable an optional de-randomizer according to [30].

4.4.3.1 Synchronization and Selection of Input Channel

Synchronization is performed by means of a bit-by-bit search for a Start Sequence
on the channel inputs. The detection of the Start Sequence is tolerant to a single bit
error anywhere in the Start Sequence pattern. The Coding Layer searches both for
the specified pattern as well as the inverted pattern. When an inverted Start
Sequence pattern is detected, the subsequent bit-stream is inverted until the
detection of the Tail Sequence.
The detection is accomplished by a simultaneous search on all active channels.
The first input channel where the Start Sequence is found is selected for the CLTU
decoding. The selection mechanism is restarted on any of the following events:
• The input channel active signal is de-asserted, or
• a Tail Sequence is detected, or
• a codeblock rejection is detected, or
• an abandoned CLTU is detected, or the clock time-out expires.

As a protection mechanism in case of input failure, a clock time-out is provided
for all selection modes. The clock time-out expires when no edge on the bit clock
input of the selected input channel in decode mode has been detected for a
specified period.
When the clock time-out has expired, the input channel in question is ignored
(i.e. considered inactive) until its active signal is de-asserted (configurable with
gTimeoutMask=1). [Not implemented]

4.4.3.2 Codeblock Decoding

The received codeblocks are decoded using the standard (63, 56) modified BCH code.
Any single bit error in a received codeblock is corrected. A codeblock is rejected as a
Tail Sequence if more than one bit error is detected. Information regarding Count of
Single Error Corrections and Count of Accepted Codeblocks is provided to the FAR.
Information regarding Selected Channel Input is provided via a register.

4.4.3.3 De-Randomizer

In order to maintain bit synchronization with the received telecommand signal, the
incoming signal must have a minimum bit transition density. If a sufficient bit
transition density is not ensured for the channel by other methods, the randomizer
is required. Its use is optional otherwise. The presence or absence of randomization
is fixed for a physical channel and is managed (i.e. its presence or absence is not

signaled but must be known a priori by the spacecraft and ground system). A
random sequence is exclusively OR-ed with the input data to increase the fre-
quency of bit transitions. On the receiving end, the same random sequence is
exclusively OR-ed with the decoded data, restoring the original data form. At the
receiving end, the de-randomisation is applied to the successfully decoded data.
The de-randomizer remains in the ‘‘all-ones’’ state until the Start Sequence has
been detected. The pattern is exclusively OR-ed, bit by bit, to the successfully
decoded data (after the Error Control Bits have been removed). The de-randomizer
is reset to the ‘‘all-ones’’ state following a failure of the decoder to successfully
decode a codeblock or other loss of input channel.

4.4.3.4 Non-Return-to-Zero: Mark

An optional Non-Return-to-Zero-Mark decoder can be enabled by means of a
register.

4.4.3.5 Design Specifics

The coding layer supports 1 to 8 channel inputs, although PSS requires at least 4.
A codeblock is fixed to 56 information bits (as per CCSDS/ECSS).
The CCSDS/ECSS (1024 octets) or PSS (256 octets) standard maximum frame
lengths are supported, being programmable via bit PSS in the GCR register. The
former allows more than 37 codeblocks to be received.
The Frame Analysis Report (FAR) interface supports 8 bit CAC field, as well as
the 6 bit CAC field specified in ESA PSS-04-151. When the PSS bit is cleared to
‘0’, the two most significant bits of the CAC will spill over into the ‘‘LEGAL/
ILLEGAL’’ FRAME QUALIFIER field in the FAR. These bits will however be
all-zero when PSS compatible frame lengths are received or the PSS bit is set to
‘1’. The saturation is done at 6 bits when PSS bit is set to ‘1’ and at 8 bits when
PSS bit is cleared to ‘0’.
The Pseudo-Randomizer decoder is included (as per CCSDS/ECSS), its usage
being input signal programmable.
The Physical Layer input can be NRZ-L or NRZ-M modulated, allowing for
polarity ambiguity. NRZ-L/M selection is programmable. This is an extension to
ECSS: Non-Return to Zero-Mark decoder added, with its internal state reset to
zero when channel is deactivated.
Note: If the input clock disappears, it will also affect the codeblock acquired imme-
diately before the codeblock just being decoded (accepted by ESA PSS-04-151).
In state S1, all active inputs are searched for the start sequence; there is no
priority search, only round robin search. The search for the start sequence is
sequential over all inputs: maximum input frequency = system frequency / (gIn + 2).
The ESA PSS-04-151 [44] specified CASE-1 and CASE-2 actions are imple-
mented according to the aforementioned specification, not leading to aborted frames.

Extended E2 handling is implemented:


• E2b Channel Deactivation—selected input becomes inactive in S3
• E2c Channel Deactivation—too many codeblocks received in S3
• E2d Channel Deactivation—selected input is timed-out in S3 (design choice
being: S3 => S1, abandoned frame).

4.4.3.6 Direct Memory Access

This interface provides Direct Memory Access (DMA) capability between the
AMBA bus and the Coding Layer. The DMA operation is programmed via an
AHB slave interface. This interface technique is used by the OBSW of the
Stuttgart University FLP satellite platform.
The DMA interface is an element in a communication concept that contains
several levels of buffering. The first level is performed in the Coding Layer where
a complete codeblock is received and kept until it can be corrected and sent to the
next level of the decoding chain. This is done by inserting each correct information
octet of the codeblock in an on-chip local First-In-First-Out (FIFO) memory which
is used for providing improved burst capabilities. The data is then transferred from
the FIFO to a system level ring buffer in the user memory (i.e. on-chip memory
located in the FPGA) which is accessed by means of DMA.
The following storage elements can thus be found in this design:
• The shift and hold registers in the Coding Layer
• The local FIFO (parallel; 32-bit; 4 words deep)
• The system ring buffer (on-chip FPGA memory; 32-bit; 1 to 256 kB deep).

4.4.4 Transmission

The serial data is received and shifted in a shift register in the Coding Layer when
the reception is enabled. After correction, the information content of the shift
register is put into a hold register.
When space is available in the peripheral FIFO, the content of the hold register
is transferred to the FIFO. The FIFO is of 32-bit width and the byte must thus be
placed on the next free byte location in the word.
When the FIFO is 50 % full, a request is issued to transfer the available data
towards the system level buffer.
If the system level ring buffer is not full, the data is transported from the FIFO
via the AHB master interface towards the main processor and stored in e.g. SRAM.
If no space is available in the system level ring buffer, the data is held in the FIFO.
When the GRTC keeps receiving data, the FIFO will fill up; when it is completely
full and the hold and shift registers are also full, a receiver overrun interrupt
will be generated (IRQ RX OVERRUN). All new incoming data is rejected until
space is available in the peripheral FIFO.

When the receiving data stream is stopped (e.g. when a complete data block is
received), and some bytes are still in the peripheral FIFO, then these bytes will be
transmitted to the system level ring buffer automatically. Received bytes in the
shift and hold register are always directly transferred to the peripheral FIFO.
The FIFO is automatically emptied when a CLTU is either ready or has been
abandoned. The reason for the latter can be a codeblock error, a time-out etc., as
described in the CLTU decoding state diagram.
The operational state machine is shown in Fig. 4.8.

Fig. 4.8 Direct memory access. © Aeroflex Gaisler AB



Legend:
rx_w_ptr Write pointer
rx_r_ptr Read pointer
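
Software consumption of the ring buffer can be sketched as follows (word-
indexed pointers for simplicity; in the real core the RRP/RWP registers of
Table 4.9 hold the pointers):

  #include <stdint.h>

  /* Drain available words from the ring: the core advances the write
   * pointer as data arrives, software advances the read pointer. */
  uint32_t grtc_read_words(volatile uint32_t *rx_r_ptr,
                           volatile uint32_t *rx_w_ptr,
                           const uint32_t *ring, uint32_t ring_words,
                           uint32_t *dst, uint32_t max_words)
  {
      uint32_t n = 0, r = *rx_r_ptr, w = *rx_w_ptr;
      while (r != w && n < max_words) {
          dst[n++] = ring[r];
          r = (r + 1) % ring_words;         /* circular increment      */
      }
      *rx_r_ptr = r;                        /* hand slots back to core */
      return n;
  }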

4.4.4.1 Data Formatting

When in the decode state, each candidate codeblock is decoded in single error
correction mode as described hereafter.

4.4.4.2 CLTU Decoder State Diagram

Fig. 4.9 CLTU decoder states and transitions. © Aeroflex Gaisler AB

Figure 4.9 shows the decoder states and transitions. Note that the diagram has
been improved with explicit handling of the different E2 possibilities listed below.
State Definition:
S1 Inactive
S2 Search
S3 Decode
Event Definition:
E1 Channel Activation
E2a Channel Deactivation—all inputs are inactive
E2b Channel Deactivation—selected becomes inactive (CB = 0 => frame
abandoned)
E2c Channel Deactivation—too many codeblocks received (all => frames
abandoned)
E2d Channel Deactivation—selected is timed-out (all => frames abandoned)
E3 Start Sequence Found
E4 Codeblock Rejection (CB = 0 => frame abandoned)

4.4.4.3 Nominal

A: When the first ''Candidate Codeblock'' (i.e. ''Candidate Codeblock'' 0, which
follows Event 3 (E3): START SEQUENCE FOUND) is found to be error free, or if
it contained an error which has been corrected, its information octets are trans-
ferred to the remote ring buffer as shown in Table 4.6. At the same time, a ‘‘Start
of Candidate Frame’’ flag is written to bit 0 or 16, indicating the beginning of a
transfer of a block of octets that make up a ‘‘Candidate Frame’’. There are two
cases that are handled differently as described in the next sections.

Table 4.6 Data format

Address      Bit(31..24)         Bit(23..16)  Bit(15..8)          Bit(7..0)
0x40000000   Information octet0  0x01         Information octet1  0x00
0x40000004   Information octet2  0x00         Information octet3  0x00
0x40000008   Information octet4  0x00         End of frame        0x02
0x400000xx   Information octet6  0x01         Information octet7  0x00
0x400000xx   Information octet8  0x00         Abandoned frame     0x03

Legend: Bit [17:16] or [1:0]:
''00'' = continuing octet
''01'' = Start of Candidate Frame
''10'' = End of Candidate Frame
''11'' = Candidate Frame Abandoned
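
Unpacking one such 32-bit word could be sketched like this (layout per
Table 4.6 and its legend; the helper is hypothetical):

  #include <stdint.h>

  enum cand_flag { CONTINUING = 0, START = 1, END = 2, ABANDONED = 3 };

  /* Extract the two information octets and their delimiter flags. */
  void grtc_unpack(uint32_t word, uint8_t octet[2], enum cand_flag flag[2])
  {
      octet[0] = (word >> 24) & 0xFF;                  /* bits 31:24 */
      flag[0]  = (enum cand_flag)((word >> 16) & 0x3); /* bits 17:16 */
      octet[1] = (word >> 8) & 0xFF;                   /* bits 15:8  */
      flag[1]  = (enum cand_flag)(word & 0x3);         /* bits 1:0   */
  }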

4.4.4.4 CASE 1

When an Event 4—(E4): CODEBLOCK REJECTION—occurs for any of the 37
possible ''Candidate Codeblocks'' that can follow codeblock 0 (possibly the tail
sequence), the decoder returns to the SEARCH state (S2), with the following
actions:
• The codeblock is abandoned (erased)
• No information octets are transferred to the remote ring buffer
• An ‘‘End of Candidate Frame’’ flag is written, indicating the end of the transfer
of a block of octets that make up a ‘‘Candidate Frame’’.

4.4.4.5 CASE 2

When an Event 2—(E2): CHANNEL DEACTIVATION—occurs which affects
any of the 37 possible ''Candidate Codeblocks'' that can follow codeblock 0, the
decoder returns to the INACTIVE state (S1), with the following actions:

• The codeblock is abandoned (erased)
• No information octets are transferred to the remote ring buffer
• An ‘‘End of Candidate Frame’’ flag is written, indicating the end of the transfer
of a block of octets that make up a ‘‘Candidate Frame’’.

4.4.4.6 Abandoned

• B: When an Event 4 (E4), or an Event 2 (E2), occurs which affects the first
candidate codeblock 0, the CLTU shall be abandoned. No candidate frame
octets have been transferred.
• C: If and when more than 37 codeblocks have been accepted in one CLTU, the
decoder returns to the SEARCH state (S2). The CLTU is effectively aborted and
this will be reported to the software by writing the ''Candidate Frame
Abandoned'' flag to bit 1 or 17, indicating to the software to erase the ''Can-
didate Frame''.

4.4.5 Relationship Between Buffers and FIFOs

Details on the relationship between buffers and FIFOs, buffer full condition han-
dling etc. can again be found in [55].

4.4.6 Command Link Control Word Interface

The Command Link Control Word (CLCW) is inserted in the Telemetry Transfer
Frame by the Telemetry Encoder (TM) when the Operational Control Field (OCF)
is present. The CLCW is created by the software part of the telecommand decoder.
The telecommand decoder hardware provides two registers for this purpose which
can be accessed via the AMBA AHB bus.
Note that bits 16 (No RF Available) and 17 (No Bit Lock) of the CLCW cannot
be written by software. The information carried in these bits is based on
discrete inputs.
The CLCW Register 1 (CLCWR1) is internally connected to the Telemetry
Encoder.
The CLCW Register 2 (CLCWR2) is connected to the external clcwout[0]
signal. A Packet Asynchronous (PA) interface is used for the transmission of
the CLCW from the telecommand decoder. The protocol is fixed to 115200 baud,

1 start bit, 8 data bits, 1 stop, with a BREAK command for message delimiting
(sending 13 bits of logical zero). The CLCWs are automatically transferred over
the PA interface after reset, on each write access to the CLCW register and on each
change of bits 16 (No RF Available) and 17 (No Bit Lock) (Table 4.7).

Table 4.7 CLCW transmission protocol


Byte number  CLCWR register bits  CLCW contents
First        [31:24]              Control Word Type, CLCW Version Number,
                                  Status Field, COP In Effect
Second       [23:16]              Virtual Channel Identifier, Reserved Field
Third        [15:8]               No RF Available, No Bit Lock, Lock Out, Wait,
                                  Retransmit, Farm B Counter, Report Type
Fourth       [7:0]                Report Value
Fifth        N/A                  [RS232 Break Command]
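
The byte order of Table 4.7 can be sketched as follows (field widths per the
CCSDS CLCW definition; the struct and helper are hypothetical):

  #include <stdint.h>

  struct clcw {
      uint8_t type, version, status, cop;    /* 1, 2, 3 and 2 bits     */
      uint8_t vcid;                          /* 6 bits                 */
      uint8_t no_rf, no_bitlock, lockout,
              wait, retransmit;              /* 1 bit each             */
      uint8_t farmb, report_type;            /* 2 and 1 bits           */
      uint8_t report_value;                  /* 8 bits                 */
  };

  /* Pack the CLCW into the four transmitted bytes (the fifth ''byte''
   * is the RS232 break command, not data). */
  void clcw_pack(const struct clcw *c, uint8_t b[4])
  {
      b[0] = (uint8_t)((c->type << 7) | (c->version << 5) |
                       (c->status << 2) | c->cop);
      b[1] = (uint8_t)(c->vcid << 2);        /* reserved field = 00    */
      b[2] = (uint8_t)((c->no_rf << 7) | (c->no_bitlock << 6) |
                       (c->lockout << 5) | (c->wait << 4) |
                       (c->retransmit << 3) | (c->farmb << 1) |
                       c->report_type);
      b[3] = c->report_value;
  }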

For the cross strapping of the CLCW routing between the redundant CCSDS-
Boards please refer to Sect. 4.2.2.

4.4.7 Configuration Interface (AMBA AHB slave)

Details on the TC Decoder configuration interface also go beyond the scope of this
book and shall be taken from [55].

4.4.8 Interrupts

The core generates the interrupts defined in Table 4.8.

Table 4.8 Interrupts


Interrupt offset Interrupt name Description
1st RFA RF available changed
2nd BLO Bit lock changed
3rd FAR FAR available
4th CR CLTU ready/aborted
5th RBF Output buffer full
6th OV Input data overrun
7th CS CLTU stored

4.4.9 Registers

The core is programmed through registers mapped into AHB I/O address space
(Table 4.9).

Table 4.9 GRTC registers


AHB address offset Register
0x00 Global reset register (GRR)
0x04 Global control register (GCR)
0x0C Spacecraft identifier register (SIR)
0x10 Frame acceptance report register (FAR)
0x14 CLCW register 1 (CLCWR1) (internal)
0x18 CLCW register 2 (CLCWR2) (external)
0x1C Physical interface register (PHIR)
0x20 Control register (COR)
0x24 Status register (STR)
0x28 Address space register (ASR)
0x2C Receive read pointer register (RRP)
0x30 Receive write pointer register (RWP)
0x60 Pending interrupt masked status register (PIMSR)
0x64 Pending interrupt masked register (PIMR)
0x68 Pending interrupt status register (PISR)
0x6C Pending interrupt register (PIR)
0x70 Interrupt mask register (IMR)
0x74 Pending interrupt clear register (PICR)

Also for the TC Decoder only the most important registers are treated here, as far
as they are used by the OBSW of the FLP satellite target platform. The register
descriptions can be found in the annex Sect. 11.5.

4.4.10 Signal Definitions and Reset Values

The signals and their reset values are described in Table 4.10.

Table 4.10 Signal definitions and reset values


Signal name Type Function Active Reset value
tcrfa[0:1] Input, async RF available for CLCW – –
tcactive[0:1] Input, async Active – –
tcclk[0:1] Input, async Bit clock – –
tcdata[0:1] Input, async Data – –
clcwout[0] Output CLCW output data 2 – Logical 1

4.5 TC Decoder: Hardware Commands

4.5.1 Overview

The Telecommand Decoder—hardware commands—or High Priority Com-
mands (HPC)—provides access to an output port via telecommands.
The decoder implements the following layers:
• Data Link—Protocol Sub-Layer:
– Virtual Channel Packet Extraction
– Virtual Channel Segment Extraction
– Virtual Channel Reception:
• Support for Command Link Control Word (CLCW)
– Virtual Channel Demultiplexing
– Master Channel Demultiplexing
– All Frames Reception
• Data Link—Synchronization and Channel Coding Sub-Layer:
– Pseudo-De-randomization
– BCH codeblock decoding
– Start Sequence Search
• Physical Layer:
– Non-Return-to-Zero Level de-modulation (NRZ-L).

The Channel Coding Sub-Layer and the Physical Layer are shared with the
Telecommand Decoder—Software Commands, and are therefore not repeated here.

4.5.2 Operation

In the Application Layer and the Data Link—Protocol Sub-Layer, the information
octets from the Channel Coding Sub-Layer are decoded as described in the fol-
lowing subsections.

4.5.2.1 All Frames Reception

The All Frames Reception function performs two procedures:
• Frame Delimiting and Fill Removal Procedure; and
• Frame Validation Check Procedure, in this order.

The Frame Delimiting and Fill Removal Procedure is used to reconstitute
Transfer Frames from the data stream provided by the Channel Coding Sub-Layer
and to remove any Fill Data transferred from the Channel Coding Sub-Layer. The
Frame Length field is checked to correspond to the received data. The number of
information octets is checked to match the frame length.
The Fill Data is checked to match the 0x55 pattern when Pseudo-De-ran-
domization is not enabled.
The Frame Validation Checks procedure performs the following checks:
• Version Number is checked to be 0.
• Bypass Flag is checked to be 1.
• Control Command Flag is checked to be 0.
• Reserved Spare bits are checked to be 0.
• Spacecraft Identifier is compared with a fixed value, see Table 4.15.
• Virtual Channel Identifier is compared with a fixed value for bits 0 to 3, see
Table 4.15, bit 4 with the value of the id[1] input pin, and bit 5 with the fixed value 1.
• Frame Length field is checked to match the received frame and CLTU, maxi-
mum 64 octets.
• Frame Sequence Number is checked to be a fixed value of 0.
• The Frame Error Control Field is checked to match the recomputed CRC value.
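
A minimal sketch of these checks on a received frame buffer could look as
follows (field positions per Table 4.12; the CRC comparison for the Frame
Error Control Field is omitted, and the ''length minus one'' encoding of the
Frame Length field follows the CCSDS convention):

  #include <stdint.h>
  #include <stdbool.h>

  bool tc_frame_valid(const uint8_t *f, unsigned len,
                      uint16_t scid, uint8_t vcid)
  {
      uint8_t  version  = (f[0] >> 6) & 0x3;          /* must be 00b */
      uint8_t  bypass   = (f[0] >> 5) & 0x1;          /* must be 1   */
      uint8_t  ctrl_cmd = (f[0] >> 4) & 0x1;          /* must be 0   */
      uint8_t  spare    = (f[0] >> 2) & 0x3;          /* must be 00b */
      uint16_t id       = ((f[0] & 0x3) << 8) | f[1]; /* S/C Id      */
      uint8_t  vc       = (f[2] >> 2) & 0x3F;         /* VC Id       */
      uint16_t flen     = ((f[2] & 0x3) << 8) | f[3]; /* length - 1  */
      uint8_t  seq      = f[4];                       /* must be 0   */

      return version == 0 && bypass == 1 && ctrl_cmd == 0 && spare == 0 &&
             id == scid && vc == vcid && flen + 1u == len && len <= 64 &&
             seq == 0;
  }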

4.5.2.2 Master Channel Demultiplexing

The Master Channel Demultiplexing is performed implicitly during the All Frames
Reception procedure described above.

4.5.2.3 Virtual Channel Demultiplexing

The Virtual Channel Demultiplexing is performed implicitly during the All Frames
Reception procedure described above.

4.5.2.4 Virtual Channel Reception

The Virtual Channel Reception supports Command Link Control Word (CLCW)
generation and transfer to the Telemetry Encoder, according to the following field
description.
• Control Word Type field is 0.
• CLCW Version Number field is 0.
• Status Field is 0.
• COP in Effect field is 1.
• Virtual Channel Identification is taken from the pin-configurable input value.
• Reserved Spare field is 0.
• No RF Available Flag is 0, but is overwritten by the Telemetry Encoder.
• No Bit Lock Flag is 0, but is overwritten by the Telemetry Encoder.
• Lockout Flag is 1.
• Wait Flag is 0.
• Retransmit Flag is 0.
• FARM-B Counter is taken from the two least significant bits of a reception
counter.
• Reserved Spare field is 0.
• Report Value field is 0.

Note that the CLCW is not generated unless the Segment and Packet extraction
is also successful and the Space Packet has been sent out via the UART
interfaces.
The CLCW transmission protocol is fixed to 115200 baud, 1 start bit, 8 data
bits, 1 stop, with a BREAK command for message delimiting (sending 13 bits of
logical zero).

4.5.2.5 Virtual Channel Segment Extraction

The decoder implements the Segmentation Sublayer and extracts the Segment
from the Frame Data Unit on the Virtual Channel, received from the Virtual
Channel Reception function. It supports blocking, but neither segmentation nor
packet assembly control. It only supports one Virtual Channel.
The Segment Header is checked to have the following fixed values:
• Sequence Flags are checked to be 11b
• MAP Identifier is compared with a fixed value, see Table 4.15.

The Segment Data field may be between 1 and 56 octets in length.

4.5.2.6 Virtual Channel Packet Extraction

The Virtual Channel Packet Extraction function extracts the Space Packet from the
Segment Data Field received from the Virtual Channel Segment Extraction
function. The aggregated length of all Space Packet(s) in one Segment Data Field
may be at most 56 octets. The contents of the Space Packet(s) are not checked.
The Space Packet(s) are sent to the UART interface for output.

4.5.2.7 UART Interfaces

The Space Packet(s) received from the Virtual Channel Packet Extraction function
are sent out via the redundant UART outputs. For each correctly received Transfer
Frame, a synchronization pattern containing two bytes, 0xFF followed by 0x55,
is first sent out serially, followed by the Space Packet(s).
The CLCW transmission protocol is fixed to 115200 baud, 1 start bit, 8 data bits,
and 1 stop bit. After the Space Packet(s) have been sent, the CLCW is updated.

4.5.3 Telecommand Transfer Frame Format: Hardware Commands

The Telecommand Transfer Frame for hardware commands has the following
structure (Tables 4.11, 4.12, 4.13).

Table 4.11 Telecommand transfer frame format

Transfer Frame:
Transfer Frame Primary Header (5 octets)
Transfer Frame Data Field—Segment:
  Segment Header (1 octet)
  Segment Data Field: one or more Space Packets (1 to 56 octets)
Frame Error Control Field (FECF) (2 octets)

Table 4.12 Telecommand transfer frame primary header format

Transfer Frame Primary Header:
Field                  Value       Bits    Width
Version                00b         0:1     2 bits
Bypass Flag            1b          2       1 bit
Control Command Flag   0b          3       1 bit
Reserved Spare         00b         4:5     2 bits
S/C Id                 FIXED       6:15    10 bits
Virtual Channel Id     FIXED/PIN   16:21   6 bits
Frame Length           9 to 64     22:31   10 bits
Frame Sequence Number  00000000b   32:39   8 bits
(Octets 0–1: Version through S/C Id; octets 2–3: Virtual Channel Id and
Frame Length; octet 4: Frame Sequence Number)

Table 4.13 Segment format

Segment:
Segment Header (1 octet):
  Sequence Flags       11b         40:41   2 bits
  MAP Identifier       FIXED       42:47   6 bits
Segment Data Field (1 to 56 octets): Space Packet(s), variable length

4.5.4 Signal Definitions and Reset Values

The signals and their reset values are described in Table 4.14.

Table 4.14 Signal definitions and reset values


Signal name Type Function Active Reset value
tcuart[0:1] Output Hardware command UART outputs Logical 1 Logical 0
id[1] Input, static Telecommand virtual channel identifier, bit 1 – –
clcwout[1] Output CLCW output data – Logical 1

4.6 SpaceWire Interface with RMAP Target

The developed CCSDS-Board type is called SIU B-012-PPFLPCCSDS.


The board identity (Nominal or Redundant), as seen via SpaceWire from the OBC
Processor-Board, is set by grounding selected pins on connector E, allowing the
board ID to be set by location in the wiring harness.
• The TDI pin grounding (80 on connector E—see Fig. 11.9) defines the board ID
to be nominal or redundant.
• TDI pin floating results in the board responding to requests sent to logical
address 0x28 (nominal CCSDS-Board) and
• TDI pin grounded results in the board responding to requests sent to logical
address 0x29 (redundant CCSDS-Board).
Writes/Reads to undefined addresses will result in the RMAP status code 10
(command not possible) being returned.
The SpaceWire interface functionality on board the CCSDS Decoder/Encoder
IP Core works out of the box without additional settings with the core running on
the 4Links CCSDS-Board hardware and with the Aeroflex Processor-Board fea-
turing the SpaceWire interfaces in hardware on the UT699 LEON3FT chip,
together with the Aeroflex Gaisler RTEMS operating system. This IP Core
SpaceWire interface is described in detail in [55] and is therefore skipped here.

4.7 JTAG Debug Interface

The JTAG Interface provides access to the on-chip AMBA AHB bus through JTAG.
This interface is made available to the user by the 4Links CCSDS-Board but is no
longer accessible externally on the closed OBC unit. On the standard user side it is
relevant only for loading the IP Core onto the 4Links board hardware.

The JTAG debug interface implements a simple protocol which translates JTAG
instructions to AMBA AHB transfers. Details on its operation can be taken from
[55] (see also Fig. 4.10).

Fig. 4.10 JTAG debug link block diagram. © Aeroflex Gaisler AB

4.8 Diverse Features

The CCSDS Decoder/Encoder IP Core also features the triple module redundancy
technique in its implementation.
Furthermore, the memory implementation on the FPGA chip includes EDAC
functionality (for technical details see [55]).
For the CCSDS Decoder/Encoder IP Core status registers the reader is also
kindly referred to [55].
The interrupt controller in the CCSDS IP Core is also described in [55]. For
standard use of the Decoder/Encoder it is not needed except in special debugging
cases.
The IP Core furthermore features a general purpose I/O port (see [55]). Since
this feature is not made available to the user by the 4Links CCSDS-Board hard-
ware, it is not described here any further.
The clock generator implements internal clock generation and buffering. The
minimum clock period, and the resulting maximum clock frequency, is dependent
on the manufacturing lot for the Actel parts and expected radiation levels. In the
CDPI implementation described here the clock period is 100 ns.
The reset generator implements input reset signal synchronization with glitch
filtering and generates the internal reset signal. The input reset signal can be
asynchronous. The resetn input is re-synchronized internally. The signals do not
have to meet any setup or hold requirements.

4.9 CCSDS Processor Spacecraft Specific Configuration

For each individual spacecraft some fixed parameters of the design must be
configured as shown in Table 4.15. This task is performed by Aeroflex Gaisler
according to the customer’s specification.

Table 4.15 Configuration at ordering

Parameter                    Value range        Used       Description
SCID                         10 bit             0x25D      TC Hardware spacecraft identifier
VCID                         6 bit              0x01/0x03  TC Hardware virtual channel identifier.
                                                           Note: only the 4 most significant bits 0:3
                                                           are set by configuration. Bit 4 is set by
                                                           means of the id[1] input pin. Bit 5 is
                                                           always 1
MAPID                        6 bit              0x00       TC hardware MAP identifier
RF available, polarity       0, 1               1          TC RF available indicator input
TC active, polarity          0, 1               1          TC active indicator input (bit lock)
TC sampling edge             Rising, falling    Rising     TC clock input
TC pseudo-de-randomization   Enabled, disabled  Disabled   TC pseudo-de-randomization
TC NRZ-M demodulation        Enabled, disabled  Disabled   TC NRZ-M demodulation
CLKDIV                       Integer            0          System clock division to get a SpaceWire
                                                           transmit frequency of 10 Mbps
SpaceWire node address       8 bit              0x28/0x29  SpaceWire node address.
(Destination logical                                       Note: only the 7 most significant bits 7:1
address, DLA)                                              are set by configuration. Bit 0 is set by
                                                           means of the id input pin
Chapter 5
The OBC Power-Boards

Rouven Witt and Manfred Hartling

5.1 Introduction

The FLP PCDU supplies an unregulated power bus with voltage levels between
19 V and 25 V as explained later in Chap. 8. The three data handling boards of the
OBC, namely the Processor-Board, I/O-Board and CCSDS-Board, require a steady
voltage of approx. 3.3 V. Thus, the main task for the power conversion of the OBC
Power Supply Board—or OBC Power-Board for short—is to establish a steady
conversion of the unregulated FLP main power bus to the required 3.3 V, and
within the range as specified for every board by their manufacturers.


Fig. 5.1 Connections of the OBC Power-Board. © IRS, University of Stuttgart

Besides provision of regulated power to the OBC data handling boards, the
OBC Power-Boards fulfill a second task, which is the signal transmission and
conversion for the FLP pulse signals. These are available for clock synchroniza-
tion of the OBC with the GPS and star tracker system. The GPS, if powered,
provides a Pulse Per Second (PPS) signal which is transmitted to the OBC
accompanied by a time packet. The OBC can synchronize its own PPS signal to the
one provided by the GPS. Furthermore, the OBC Processor-Board is able to submit
a PPS and a time packet to the star tracker system. Combining both, the STR and
GPS systems will work on a clock strobe base as common as possible, which
significantly improves packet communication stability between GPS and OBC and
between STR and OBC, respectively.
Finally, there are interfaces that need to be routed out of or into the OBC
housing. These are led via the OBC Power-Boards as well. All interfaces along
with all other boards are depicted in Fig. 5.1. From the main task, the power
supply lines for the data handling boards are displayed in red color. In purple, the
pulse signals are shown. The two blue shades depict the power lines for the OBC
heaters, circuit 1 and circuit 2, or nominal (N) and redundant (R), respectively.
They are routed to a bi-metal thermostat switch and from there on to the corre-
sponding heaters placed on the backside of every second frame. More details about
heater placement can be taken from Figs. 5.11, 5.12 and 6.10. The green shades
show the Service Interface and the JTAG Interface lines to the OBC Processor-
Board. These are used to access the Processor-Boards after the assembly of the
complete OBC, when the connector on the Processor-Board can no longer be
directly accessed. The connectors in Fig. 5.1 are the same as in Fig. 1.2.
It can be identified here that the OBC Power-Boards do not provide cross-
coupling with the data handling boards. The data handling boards and their parts
are significantly more complex than those of the Power-Boards, which implies that
the data handling boards are more prone to hardware failure. This permitted
desisting from power line cross-coupling and simplified the electric design.
Through the Power-Board redundancy the overall OBC design is still single-
failure tolerant in accordance with the mission requirements.

5.2 Power Conversion

The power conversion on the OBC Power-Board has to provide a steady 3.3 V
voltage output for all three data handling boards. Since the main power bus is
unregulated, an exact steady supply voltage is not achievable. On the other hand,
the provision of a voltage exceeding the 3.3 V might cause damage to the sensitive
parts on the data handling boards. Therefore, the manufacturers specify a range of
voltage which is permitted in order for their boards to work properly. These ranges
are summarized in Table 5.1.

Table 5.1 Requirements to the OBC Power-Boards derived from input characteristics of the data
handling boards

Device                 Permitted input voltage range    Power consumption  Bleeder resistors (power
                                                        (min … max)        consumption at 3.3 V)
Processor-Board        3.3 V ± 5 % (3.135 … 3.465 V)    2.5 … 4.75 W       50 Ω (0.22 W)
(Chap. 2)
I/O-Board (Chap. 3)    3.3 V ± 9 % (3.0 V … 3.6 V)      0.53 … 1.5 W       50 Ω (0.22 W)
CCSDS-Board (Chap. 4)  3.3 V ± 9 % (3.0 V … 3.6 V)      0.165 … 1.0 W      33 Ω (0.33 W)

As a second criterion for the selection of converters, a minimum and a maximum
load of the data handling boards have to be taken into consideration. These loads
were estimated by the manufacturers as well and provided before designing the
OBC Power-Boards. They can also be taken from Table 5.1.
After the converters had been selected, tests to prove compatibility between
OBC Power-Boards and data handling boards were performed. The converters have
been selected considering only the steady state loads provided by the manufac-
turers. However, since there are non-ohmic resistances on the boards, board
impedance effects during power-up need to be taken into account. The current that
is drawn at the power-up of a data handling board connected to the OBC Power-
Board differs from the steady state current. In case there is a longer phase when the
power-up current is too low, the danger of an over-voltage from the converters to
the power consumers appears. Therefore, a characterization of the start-up behavior
of the data handling boards (see Sect. 5.2.2) as well as of the OBC Power-Boards
(Sect. 5.2.3) was performed before operating them in connection (Sect. 5.2.4).

5.2.1 The DC/DC Converters

For the power conversion, DC/DC converters from Gaia Converter (hereafter
simply referred to as Gaia) have been selected. Gaia offers a series of single output

converters for a load of 4 W and 10 W which are specially designed for appli-
cation in space, namely the MGDS04 and MGDS10 families. They can be ordered
with varying input and output voltages and have an operating temperature range of
-40 °C up to +85 °C which is compliant with the operating temperature
requirements of the OBC. As stated by Gaia, the converters are characterized for
heavy ions and a total ionizing dose of 20 krad, provide an undervoltage lockout
and a permanent output current limitation. Furthermore, they comply with ESA
standard PSS-01-301 [71].
By specifying the input voltage as 9–36 VDC (ordering code 'H') and the output
voltage as 3.3 V (ordering code 'B'), the selected converters for the three OBC
boards are:
• MGDS10-HB for the Processor-Board
• MGDS04-HB for the I/O-Board
• MGDS04-HB for the CCSDS-Board.

Please note that due to the estimated maximum load of the Processor-Board, the
MGDS10 converter with 10 W maximum nominal load was selected for the OBC.

Fig. 5.2 Typical power converter regulation characteristic

One disadvantage of these power converters, however, is that they require a
certain load to be drawn in order to provide the desired output voltage. To demonstrate the
effect, the output characteristics of an exemplary Gaia converter, namely the
MGDS04JC with an input voltage of 16–40 V and 5 V output voltage, is shown in
Fig. 5.2.¹ It can be identified that for lower output currents the output voltage
exceeds the nominal 5 V significantly. Only for a consumed current of 600 mA or
more, the dedicated output voltage of 5 V can be reached in the example of this

¹ The 5 V converter example is depicted here since the supplier does not provide an according
diagram for the selected converter models of the MGDS series.

converter. For the complete MGDS converter series this effect has to be consid-
ered. As for all OBC boards a certain input voltage range is prescribed in
Table 5.1, it had to be guaranteed that these requirements are met by the OBC
Power-Board output.
As a result, additional tests had to be performed to exactly characterize the
power-up behavior of each OBC data handling board on the one hand, and the
exact behavior of the converter output on the other hand. The results of a load test
of the ordered converters are provided in Table 5.2. The relevant voltage limits are
marked in bold.
The MGDS10-HB stays below the required 3.465 V at a drawn current of
0.59 A, which marks a power value of 3.465 V * 0.59 A = 2.04 W. It is spec-
ified in Table 5.1 that the Processor-Board always exceeds a power consumption of
2.5 W, so no further protective action had to be taken into account.
An analogous examination for the I/O and CCSDS-Board power supply leads to
the following results:
The MGDS04-HB stays below the required 3.6 V at a drawn current of 0.075 A
due to its significantly lower specified power level. This corresponds to a power of
3.6 V × 0.075 A = 0.27 W, which is below the specified minimum power con-
sumption of the I/O-Board, yet exceeds that of the CCSDS-Board by 0.105 W.
Consequently, a bleeder resistor as constant power consumer had to be implemented
between the output of the converter and the CCSDS-Board. Its maximum value
results from (3.6 V)² / 0.105 W = 123.43 Ω; therewith a resistor with a value below
123.43 Ω had to be implemented.
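The bleeder resistor sizing reduces to a one-line formula. The following Python
sketch (an illustrative helper of ours, not project code) reproduces the worked
numbers from the text:

```python
# Hedged sketch of the bleeder-resistor sizing for the CCSDS-Board supply
# line; the numbers follow the worked example above, the names are ours.

def max_bleeder_resistance(v_max: float, p_shortfall: float) -> float:
    """Largest bleeder resistor that still absorbs the missing minimum load.

    v_max       -- highest output voltage tolerated by the consumer (V)
    p_shortfall -- converter minimum load minus board consumption (W)
    """
    return v_max ** 2 / p_shortfall          # R = U^2 / P

V_MAX = 3.6                # V, upper supply limit (Table 5.2)
P_MIN_LOAD = 3.6 * 0.075   # W, MGDS04-HB load at which 3.6 V is not exceeded
P_SHORTFALL = 0.105        # W, by which the CCSDS-Board falls short of it

print(f"converter minimum load : {P_MIN_LOAD:.2f} W")       # 0.27 W
r_max = max_bleeder_resistance(V_MAX, P_SHORTFALL)
print(f"max. bleeder resistance: {r_max:.2f} Ohm")           # 123.43 Ohm
```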

Table 5.2 Load characterization of MGDS10-HB and MGDS04-HB

MGDS10-HB                           MGDS04-HB
Voltage level (V)   Current (A)     Voltage level (V)   Current (A)
3.6                 0.104           3.6                 0.075
3.5                 0.33            3.59                0.079
3.465               0.59            3.497               0.157
3.45                0.74            3.4                 0.4
3.4                 1.32            3.38                0.5

However, as a safety measure, a set of resistors is included in all three power
lines (Processor, I/O and CCSDS-Board supply) to guarantee consumption of the
minimum load. The resulting values for the bleeder resistors are also provided in
Table 5.1. As an example, the circuitry of the CCSDS power line is depicted in
Fig. 5.3, including the output line leading to the CCSDS-Board, beginning at the
MGDS04-HB DC/DC converter. For clarity, all parts left of/upstream of the
converter are hidden.
As the Power-Board in this design meets the steady state requirements given by
the manufacturers of all three OBC data handling board types, an Engineering
Model was built. Diverse tests combining this OBC Power-Board EM with the
other OBC board types have been run, as explained in the subsequent sections.

Fig. 5.3 Bleeder resistors in CCSDS-Board power conversion line. © IRS, University of Stuttgart

5.2.2 Start-Up Characterization of OBC Power Consumers

The previous section covered the proper Power-Board design concerning steady
operation power consumption. Another important aspect is the behavior with
respect to consumer inrush current characteristics. Therefore, tests were performed
to characterize the supply voltage behavior of the Power-Board on the lines of the
different types of OBC data handling boards within the first few milliseconds after
the power-up of the OBC Processor, I/O and CCSDS-Board.
Expected was an inrush current peak when applying the power, due to the
capacitances on the OBC data handling boards, and, either
• after the peak, the steady state is reached directly, or
• after the peak the current decreases for a short period below the steady state
current and ramps up again.

The second case holds the danger that the converter delivers a higher output
voltage level during that current decrease period, which might be hazardous for the
connected OBC board.
The setup used to record the current drawn by the data handling boards at
start-up is depicted in Fig. 5.4. A TTI Power Supply Unit (PSU) was used as power
source, configured for 3.3 V output.

Fig. 5.4 Electrical set-up for testing power-up of OBC boards. © IRS, University of Stuttgart

Between PSU and OBC board there is a low-resistive shunt at which the current
can be registered by means of an oscilloscope. A differential probe was used as
input sensor for the oscilloscope. The shunt resistors were chosen with as low a
resistance as possible, so that not too much voltage is lost across the resistor, yet
high enough that the probe can still detect the voltage properly. The resulting
shunt resistor values are provided in Table 5.3. Since the OBC I/O-Board behaves
very similarly to the CCSDS-Board at power-up, due to the delayed activation of
the transmission interfaces on both boards, the same shunt and estimated power
consumption were used for the tests of both boards. Due to the higher power
consumption of the OBC Processor-Board, a smaller resistor was used to minimize
the shunt voltage drop.

Table 5.3 Determination of shunt resistors

Board            Resistor    Steady state power   Resulting current   Voltage loss over
                 value (Ω)   consumption (W)      I = P/U (A)         resistor (V)
                             (estimated)
I/O / CCSDS      0.39        1                    0.3                 0.117
Processor-Board  0.1         3.5                  1.1                 0.11
(compare)        (0.39)      (3.5)                (1.1)               (0.43)
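The shunt selection can be reproduced with a few lines of Python. The sketch
below is purely illustrative; it mirrors Table 5.3, including its rounding of the
current to one decimal before the drop is computed:

```python
# Re-computation of Table 5.3 (an illustrative helper, not project code):
# the steady-state current follows from I = P/U at the 3.3 V PSU setting,
# and the shunt drop from V = I*R.

SUPPLY_V = 3.3  # V, PSU output voltage

def shunt_drop(power_w: float, shunt_ohm: float) -> tuple:
    current = round(power_w / SUPPLY_V, 1)  # I = P/U, rounded as in Table 5.3
    return current, current * shunt_ohm     # voltage lost across the shunt

for board, p_est, r in [("I/O / CCSDS", 1.0, 0.39),
                        ("Processor-Board", 3.5, 0.10),
                        ("Processor (compare)", 3.5, 0.39)]:
    i, v = shunt_drop(p_est, r)
    print(f"{board:22s} I = {i:3.1f} A, drop = {v:5.3f} V")
```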

Conducting the test resulted in the following diagrams. In Fig. 5.5 the start-up
of the CCSDS-Board is depicted. The appearing peaks are marked.

Fig. 5.5 Current at power-up—CCSDS-Board. © IRS, University of Stuttgart

In Fig. 5.6 the power-up is shown for the I/O-Board without its transmission
interfaces activated. The diagrams do not differ significantly; both show several
peaks and a slow re-drop of the inrush current down to the steady state value,
which implies that there is no danger of over-voltage at start-up.

Fig. 5.6 Current at power-up—I/O-Board. © IRS, University of Stuttgart

In Fig. 5.7 the start-up behavior of the OBC Processor-Board is depicted. It can
be identified that in this case there is a potentially hazardous phase after the main
peak, during which the inrush current drops significantly below the steady state
current. A dedicated examination of the converter itself had to be performed in
order to judge whether changes to the electrical PCB design of the Power Supply
Board would be necessary. This verification is covered in Sect. 5.2.3.

Fig. 5.7 Current at power-up—Processor-Board. © IRS, University of Stuttgart

5.2.3 Start-Up Behavior of the Power Supply Board

After the inrush behavior of the OBC Processor, I/O and CCSDS-Board had been
characterized in the previous tests, and since the Processor-Board had been
identified as problematic due to its power-up current behavior, dedicated tests of
the OBC data handling boards in combination with the Power-Board became
necessary to assure overall design adequacy.
The following tests were conducted to verify the overall power supply charac-
teristics of the Gaia converters on the Power-Boards with connected load. The
electrical set-up for this test can be taken from Fig. 5.8. The significant difference
from the previous test is the utilization of a Line Impedance Stabilization Network
(LISN).

Fig. 5.8 Test setup for characterization of Power-Board start-up behavior. © IRS, University of
Stuttgart

Due to the low 3.3 V voltage range applied, such a device was not necessary
during the previous board inrush behavior tests. In the current test the LISN is used
to provide a representative 22 V, as supplied by the satellite PCDU, instantaneously
at power line activation. Using only a PSU might otherwise have caused an
unrealistically delayed build-up of the voltage. Basically, the LISN is a capacitor in
parallel to the power line which provides the 22 V level at switch-on.
The resistors R1, R2 and R3 were selected according to the actual steady state
currents taken from the previous test diagrams, marked in magenta. The values of
the resistors are 22, 44.6 and 50 Ω for Processor, I/O and CCSDS-Board
respectively. Since this test is only essential for the OBC Processor-Board line, only
the result of that test is discussed in particular. In Fig. 5.9 the behavior of the
MGDS10-HB converter is shown, with the mentioned 22 Ω load as consumer. It can
be seen that despite the low power consumed temporarily at the beginning, the
voltage continuously increases to the nominal 3.3 V (please refer to Fig. 5.7).
Furthermore, despite the usual variations that appear within such processes, the
voltage never exceeds the 3.465 V limit specified by the supplier in Table 5.2.

Fig. 5.9 Behavior of Power-Board at start-up—Processor-Board line. © IRS, University of
Stuttgart

After completion of these behavioral measurements for the OBC Power-Board
and the OBC data handling boards, the verification of their design compatibility
was completed and the boards were connected for the first time in the CDPI
development program.

5.2.4 Connection of Power Supply Board and OBC Power Consumers

The test set-up for the connection of the Power-Board and the connected OBC
data handling boards can be taken from Fig. 5.10. The oscilloscope was kept in the
loop during the test to observe the behavior of the DC/DC output lines and to allow
fast cancellation of the test in case any unexpected over-voltage was observed.
The boards proved to work together flawlessly, and the EMs are meanwhile being
operated permanently in the STB in the configuration depicted in Fig. 5.10. The
FM boards are integrated into the OBC flight model.

Fig. 5.10 Setup for final board connection test. © IRS, University of Stuttgart

5.3 Clock Strobe Signals

As already explained, the OBC Processor-Boards provide the functionality to
receive a PPS strobe signal from an external clock reference such as a GPS or
Galileo receiver. The Processor-Boards also allow synchronization of the OBC to
this external PPS clock strobe on request, as far as supported by OBSW and RTOS.
Furthermore, the OBC provides a PPS strobe generated by its Processor-Board.
This PPS has to be available on an OBC external connector for syncing onboard
equipment to the OBC. In the FLP target spacecraft mission this functionality is
required for the star trackers.
Since both PPS interfaces are located on OBC internal connectors of the Pro-
cessor-Boards (please refer to Figs. 1.2 and 2.1), the signals require routing from/
to external OBC connectors. The PPS signals on the OBC external connectors, as
generated by the GPS or required by the star trackers, are transmitted as differ-
ential signals, whereas inside the OBC, on Processor-Board level, they are handled
as single-ended 5 V PPS lines. Therefore, besides the pure line routing from
internal to external connector, a conversion of the signal type was needed. Since
the OBC Power-Board design and development was in the hands of the IRS, it was
decided to implement this signal routing and conversion feature on the OBC
Power-Boards.
A further particularity of the FLP mission is that there are three redundant GPS
receivers for experimental reasons. All three are able to provide a pulse signal to
the OBC. During nominal operation, only one GPS receiver will be active and
consequently only one signal will be provided. In case one receiver becomes
defective the pulse signal needs to be taken from a redundant receiver. But there is
only one input available on OBC Processor-Board side. This means that all three
GPS PPS input signals into the Power-Boards need to be logically merged to one
single PPS going to the Processor-Board. This can be achieved easily by differ-
ential driver ICs featuring high impedance when deactivated, as long as there is
only one single receiver in use.
However, during one particular FLP target mission experiment all three GPS
receivers are foreseen to be working in parallel. To avoid the PPS signals from all
three receivers interfering in an uncontrolled manner, a priority circuitry has been
installed on the Power-Boards that permits only one of the three pulse signals to be
routed to the Processor-Board. If the nominal pulse signal is available, the other two
signals are blocked. A signal type conversion from differential to single-ended is
performed at Power-Board interface level by means of a standard RS422 driver IC.
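The priority behavior described above can be summarized in a small behavioral
model. The following Python sketch is purely illustrative and uses names of our
own choosing; the actual implementation is the hardware circuitry of driver ICs
described in the text:

```python
# Behavioural sketch (ours, not from the design files) of the Power-Board
# priority circuitry: of up to three GPS PPS inputs, only the highest-
# priority active signal is routed to the Processor-Board.

from typing import Optional

def select_pps(nominal: bool, redundant_1: bool, redundant_2: bool) -> Optional[str]:
    """Return the name of the PPS signal passed on to the Processor-Board.

    Each argument states whether the respective receiver currently drives
    a pulse on its input line into the Power-Board.
    """
    if nominal:          # nominal pulse present: the other two are blocked
        return "GPS nominal"
    if redundant_1:      # first redundancy only if the nominal is absent
        return "GPS redundant 1"
    if redundant_2:
        return "GPS redundant 2"
    return None          # no receiver active: no PPS to the Processor-Board

# All three receivers active during the parallel-operation experiment:
assert select_pps(True, True, True) == "GPS nominal"
# Nominal receiver failed, first redundant receiver takes over:
assert select_pps(False, True, False) == "GPS redundant 1"
```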
Please note that only the PPS signals are routed via the Power-Board. Packet
communication between GPS/Galileo receiver and OBC is handled via the
I/O-Boards.
For the conversion of the pulse signal from Processor-Board to star tracker, a
straightforward conversion from single-ended to differential is performed using a
standard RS422 driver chip. The signal can be sent from both Processor-Boards, of
which only one will be working at any time.

Please note that packet communication between star tracker and OBC is han-
dled via I/O-Boards as well.
The power for the driver and gate chips is taken from the Processor-Board
power line, since the conversion is only required as long as the corresponding
Processor-Board is active. A dedicated converter provides the voltage for the
corresponding chips.

5.4 Heaters and Thermal Sensors

Inside the OBC housing there are two heater circuits that can be activated if the
temperature drops below the minimum operational limit. Please also refer to the
figures and explanations in Sect. 7.3. The two circuits are redundant and each
comprises four heaters that are mounted on the backside of every second frame, as
depicted in Figs. 5.11, 7.15 and 7.16. For the detection of the internal temperature,
one bi-metal thermostat switch is included in each circuit. When the temperature
drops below the minimum operational limit of -40 °C, the switches close and the
heaters are activated—provided that the PCDU power lines are switched ON.

Fig. 5.11 OBC internal heater positioning. © IRS, University of Stuttgart

Dedicated temperature sensors are glued onto the OBC housing and
connected to the PCDU to feed back actual housing temperature data to the
thermal control loop. No dedicated wiring is necessary for these inside the housing
on the OBC Power-Board.
The schematic of the heater and temperature sensor wiring is provided in
Fig. 5.12. On the PCDU there are two fuses and switches which are used for both
the OBC and the TTC system. One switch activates the nominal heaters, the
other activates the redundant side. From the heater power output of the PCDU the
wire is routed to the bi-metal thermostat. After that the wire is split and leads to the
four heaters. The return lines are brought together at a star point on the Power-
Board, from which the rail leads back to the return pin of the PCDU. The same
principle is applied for both the nominal and the redundant heater circuit.
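The resulting control behavior corresponds to a simple hysteresis switch, as the
following behavioral sketch illustrates. It is our own illustration; the switch
temperatures are placeholders in the range discussed for the procured thermostats
in Sect. 7.3.3:

```python
# Minimal behavioural model of one bi-metal thermostat heater circuit:
# the switch closes below its lower threshold and re-opens above its upper
# threshold, so no micro-controller is involved in the control loop.
# Threshold values are illustrative (cf. Sect. 7.3.3), not part data.

class BimetalThermostat:
    def __init__(self, t_close=-40.0, t_open=-27.4):
        self.t_close, self.t_open = t_close, t_open   # deg C
        self.closed = False

    def update(self, temperature: float) -> bool:
        if temperature <= self.t_close:
            self.closed = True     # heaters powered (if PCDU line is ON)
        elif temperature >= self.t_open:
            self.closed = False    # heater circuit interrupted
        return self.closed         # between thresholds: state is held

thermostat = BimetalThermostat()
for t in (-20, -35, -41, -38, -30, -27, -25):
    state = "ON" if thermostat.update(t) else "OFF"
    print(f"{t:6.1f} degC -> heaters {state}")
```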

Fig. 5.12 OBC heater circuits wiring. © IRS, University of Stuttgart

5.5 OBC Service Interface and JTAG Interface

As already cited and depicted in Fig. 5.1, the OBC Power-Boards also provide the
routing of the OBC Processor-Boards' serial Service Interface (SIF) and the JTAG
debug interface. The reason for routing these signals from the OBC internal con-
nectors of the Processor-Boards to the OBC housing's external connectors is the
same as for the PPS signals discussed earlier: after OBC unit assembly completion,
with the housing closed, the Processor-Board interfaces are no longer directly
accessible. SIF and JTAG interface are essential for the upload of software versions
after OBC integration into the satellite and for OBSW debugging. For the FLP
target satellite both interfaces will be routed to the spacecraft's external skin
connector.

5.6 Connector Configuration

All connectors on the Power-Board are Sub-D High Density connectors and have
either 15 or 26 pins. Power lines and data lines are routed over different con-
nectors. Connectors of the same size on the same side of the board are of
different gender. All connectors are attached with flying wires soldered onto the
board, since fixed connectors might break at the soldering points under vibrational
load. Figure 5.13 provides a schematic of the Power-Board on which the different
connectors can be identified. The naming of the connectors is board specific and
differs from the naming convention of the overall OBC unit as depicted in annex
Sect. 11.6. A short description of the connectors is provided below. The relevant
pinouts of the OBC unit's external connectors on the Power-Board—(J2, J3 and
J4), or (J1/J7, J2/J8, J3/J9) according to annex Sect. 11.6—are included in annex
Sect. 11.9.

Fig. 5.13 OBC Power-Board connector configuration. © IRS, University of Stuttgart

From connector J0 (Sub-D HD 26 male) the power supply lines lead to one
OBC Processor-Board, one I/O-Board and one CCSDS-Board; the other Power-
Board supplies the redundant set. The power supply lines for the heaters are also
provided via this connector. Finally, the forwarding of the pulse signals as LVTTL
signals is established via this connector.
J1 is the data connector leading the data lines for JTAG and SIF into the OBC
housing. There is no necessity to route these lines over the Power-Board's elec-
tronics, so they are wired directly from connector J1 via the PCB to connector J2—
see below. The connector is a female Sub-D HD 15 connector.
In Table 11.45 the pins of the data connector J2 on the long side of the board
are provided. This connector is the counterpart to data connector J1 on the short
board side and is thus also of type Sub-D HD 15 (female). The cable mounted to
this connector leads to the skin connector of the FLP target satellite, which is
mounted on the satellite's primary structure. The data connectors J2 of both
Power-Boards are routed to one connector on the spacecraft skin of the FLP target
satellite.
In Table 11.46 the pins of the power connector J3 on the long side of the board
are listed. This connector is also of type Sub-D HD 15 (male). It receives the
power lines for the individual OBC boards connected internally and for the
powered heaters. All lines to this connector originate from the PCDU.
In comparison to classical OBC designs, here the ''OBC'' receives a separate
power line for each individual board, since the reconfiguration unit—the Common-
Controller in the PCDU—is thereby able to power each board individually
for nominal operation and to perform shutdowns, power-cycling and redundancy
activations in FDIR cases. In a classical OBC this type of reconfiguration line is
implemented inside the OBC housing; here they become visible due to the fact
that this CDPI prototype is physically realized as two boxes.
In Table 11.47 the pins of the PPS connector J4 are listed. The OBC receives
PPS signal lines from the GPS receivers and provides a PPS line as output, e.g. for
the star trackers in the FLP target satellite. The technical details of these PPS lines
were already covered in Sect. 5.3.
Chapter 6
The OBC Internal Harness

Artur Eberle, Michael Wiest and Rouven Witt

6.1 Introduction

As explained in Chap. 1 the OBC components have all been designed from scratch
and were developed largely in parallel. A significant number of interface details
were not available at project start, which led to the decision to build the OBC as a
protoflight model based on an inter-board harness instead of a backplane.

A. Eberle (✉) · M. Wiest
HEMA Kabeltechnik GmbH & Co. KG, Salem, Germany
e-mail: artur.eberle@hema-kabeltechnik.de
M. Wiest
e-mail: michael.wiest@hema-kabeltechnik.de
R. Witt
Institute of Space Systems, University of Stuttgart, Stuttgart, Germany
e-mail: witt@irs.uni-stuttgart.de


Figure 6.1 provides an overview of the main OBC internal lines for the SpaceWire
board-to-board connections, the power supply lines and the heater supply lines.
Please also refer to Fig. 5.1, which only depicts the Power-Board relevant
interfaces but in addition breaks down the ''Data Connections'' cited in Fig. 6.1
into pulse lines, Service Interface and JTAG Interface lines. Figure 6.2 in addition
shows the individual power and data connection routing through the Power-Boards
for one of the redundant branches from Fig. 5.1. Thus it can be identified that the
OBC internal harness can be split into two elements,
• the SpaceWire subharness and
• the Power-Board subharness including
– power lines,
– heater lines,
– pulse lines,
– Service Interface lines, and
– JTAG debug interface lines.
The overall OBC internal harness was implemented by HEMA Kabeltechnik
GmbH & Co. KG, a professional harness supplier for both space checkout
equipment on ground and for space harness applications. The harness was
assembled at HEMA under clean room conditions by applying a geometric
connector mockup of the assembled OBC frame stack (Fig. 6.3).

6.1.1 Requirements
The electric cabling input for the harness was handed over to HEMA in electronic
form by the IRS. Further requirements were the types of connectors to be used,
cleanliness during production and geometric constraints:
• Connectors: Sub-D connectors and Micro-D connectors
• Available space for harness routing (go/no-go areas)
• Cleanliness for space application: clean room of class 100 000/ISO 8
• Final integration for the customer at IRS.

6.1.2 Challenges
The following challenges had to be considered during harness design and
manufacturing:
• Cable routing was difficult in the small OBC front compartment area with a
dimension of approximately 250 × 100 × 35 mm.
• Small harness bending radii resulted from the dimension limits.
• Insufficient space was available for cable splicing.
• Insufficient space was available for standard SpaceWire cabling: there was not
enough space for the minimum bending radii of standard SpaceWire cables and
no space for connector backshell mounting.

Fig. 6.1 Inter-board cabling schematic. © IRS, University of Stuttgart



Fig. 6.2 Debug-, pulse signal and thermostat line routing. © IRS, University of Stuttgart

• The stability of the harness routing had to be guaranteed also under vibration
loads.
• Mounting areas had to be foreseen for fixing the harness bundles to tie-bases
glued to the OBC cassette frames. This latter part of the integration into the OBC
frame assembly was foreseen to be performed by IRS.
• The harness was required to be disconnectable, which meant that access to all
connector screws and mounting screws had to be guaranteed by design.
Fig. 6.3 OBC internal harness in manufacturing mockup. © HEMA/IRS

These constraints resulted in the need for a manufacturing mockup to assure the
exact positioning of the connectors and to achieve proper harness bundle routing
and line lengths.

6.1.3 Realization

Manufacturing was performed at HEMA Kabeltechnik as a certified manufacturer
according to the following standards:
• ECSS-Q-ST-70-08C
Manual soldering of high-reliability electrical connections [45]
• ECSS-Q-ST-70-26C
Crimping of high-reliability electrical connections [46].

Cable design requirements for SpaceWire were followed according to
ECSS-E-50-12C (see [12]).

6.2 Harness Design

This section describes the engineering process for a space application harness.

6.2.1 Harness Engineering

Each harness line starts at a connector and is plugged to its corresponding target
connector on the respective OBC board. Table 6.1 illustrates the OBC boards with
their connectors.

Table 6.1 Connector list

OBC board            Connector name   Connector type   Corresponding      Connector name
                                                       connector type
Nominal PWR 0 PWR 0 J0 DAMA-26P DAMA-26S PWR 0 P0
PWR 0 J1 DEMA-15S DEMA-15P PWR 0 P1
CCSDS 0 CCSDS 0 J0 MDM-9S MDM-9P CCSDS 0 P0
CCSDS 0 J1 DEMA-15P DEMA-15S CCSDS 0 P1
CCSDS 0 J2 MDM-9S MDM-9P CCSDS 0 P2
Processor board core N Core N J0 MDM-9S MDM-9P Core N P0
Core N J1 MDM-9S MDM-9P Core N P1
Core N J2 MDM-9S MDM-9P Core N P2
Core N J3 MDM-9S MDM-9P Core N P3
Core N J4 DBMA-44P DBMA-44P Core N P4
Core N J5 MDM-9S MDM-9P Core N P5
I/O N I/O N J0 MDM-9S MDM-9P I/O N P0
I/O N J1 DEMA-15P DEMA-15P I/O N P1
I/O N J2 MDM-9S MDM-9P I/O N P2

Redundant PWR 1 PWR 1 J0 DAMA-26P DAMA-26S PWR 1 P0


PWR 1 J1 DEMA-15S DEMA-15P PWR 1 P1
CCSDS 1 CCSDS 1 J0 MDM-9S MDM-9P CCSDS 1 P0
CCSDS 1 J1 DEMA-15P DEMA-15S CCSDS 1 P1
CCSDS 1 J2 MDM-9S MDM-9P CCSDS 1 P2
Processor board Core R Core R J0 MDM-9S MDM-9P Core R P0
Core R J1 MDM-9S MDM-9P Core R P1
Core R J2 MDM-9S MDM-9P Core R P2
Core R J3 MDM-9S MDM-9P Core R P3
Core R J4 DBMA-44P DBMA-44P Core R P4
Core R J5 MDM-9S MDM-9P Core R P5
I/O R I/O R J0 MDM-9S MDM-9P I/O R P0
I/O R J1 DEMA-15P DEMA-15P I/O R P1
I/O R J2 MDM-9S MDM-9P I/O R P2
Legend: Core = OBC Processor-Board, I/O = OBC I/O-Board, CCSDS = OBC CCSDS-Board
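The P/S suffixes of the type designators in Table 6.1 encode the connector gender,
which permits a simple plausibility check of the list. The following sketch is our
own illustration, not part of the harness documentation:

```python
# Illustrative consistency check (our own helper): each harness-side plug
# should be the gender complement of its board-side connector, following
# the P (pin) / S (socket) suffix coding of the types in Table 6.1.

SAMPLE_ROWS = [
    # (connector name, board-side type, harness-side type)
    ("PWR 0 J0",   "DAMA-26P", "DAMA-26S"),
    ("PWR 0 J1",   "DEMA-15S", "DEMA-15P"),
    ("CCSDS 0 J0", "MDM-9S",   "MDM-9P"),
    ("Core N J1",  "MDM-9S",   "MDM-9P"),
]

def genders_mate(board_type: str, harness_type: str) -> bool:
    """True if the two type codes end in complementary gender letters."""
    return {board_type[-1], harness_type[-1]} == {"P", "S"}

for name, j_type, p_type in SAMPLE_ROWS:
    status = "ok" if genders_mate(j_type, p_type) else "gender mismatch"
    print(f"{name:12s} {j_type:9s} <-> {p_type:9s} : {status}")
```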

The detailed pin allocation list is included in the product documentation from
HEMA [72]. The harness bundle definitions were the next step after the freeze of
the pin allocation. Usually the signal lines of one interface (UART, RS422, HPC,
Status, etc.) are twisted into one cable. The cables are then combined into a bundle
(Fig. 6.4).
The routing of the harness shall disturb neither the transmitted signal itself
(reflections/damping) nor other signals (EMC). For this reason the power harness
and the signal harness are usually routed separately. HEMA decided to separate the
SpaceWire harness and the power/signal harness. As illustrated in Fig. 6.5, the
orange lines represent the power bundle and the green ones the SpaceWire bundle.

Fig. 6.4 Harness power bundle. © HEMA/IRS

Fig. 6.5 OBC harness schematic routing. © HEMA/IRS

6.2.2 SpaceWire Harness

SpaceWire is a high speed field bus interface standard for space equipment inter-
communication, standardized by a consortium of multiple space agencies. The
specification can be found in the literature (see for example [11, 12]). The
SpaceWire interfaces in the OBC accordingly consist of point-to-point, bidirectional
data links. Two differential signal pairs in each direction make a total of eight
signal wires with an additional screen wire (Fig. 6.6).

Fig. 6.6 Micro-D SpaceWire connector. © HEMA Kabeltechnik GmbH & Co. KG

The SpaceWire connector is required to provide eight signal contacts plus a
screen termination contact. A nine-pin micro-miniature D-type is specified as the
standard connector for SpaceWire. All OBC internal SpaceWire connections use
the same connector type and pin allocation in accordance with [15].
The OBC inter-board SpaceWire harness consists of eight SpaceWire bundles.
The cable type follows AWG 26, as a non-impedance-controlled harness. One side
is implemented as a solder connector to reduce reflection points (such as a splice).
Figure 6.7 provides some impressions of the stepwise harness implementation.

Fig. 6.7 SpaceWire harness stepwise implementation. © HEMA Kabeltechnik GmbH & Co. KG

The completed OBC internal SpaceWire harness is depicted in Fig. 6.8.

Fig. 6.8 Complete OBC internal SpaceWire harness. © HEMA/IRS

6.2.3 OBC Power Harness

The OBC internal power harness includes the routing of the power supply lines
from the OBC Power-Boards to the consumers, the routing of the OBC internal
heater lines from the Power-Board via thermostats to the boards equipped with
heaters, and finally the routing of pulse lines and debug lines. Figure 6.9 provides
an impression of the power harness.

Fig. 6.9 OBC internal power harness. © HEMA/IRS

As already explained, the OBC is equipped with internal heater lines controlled
by thermostats. These are included in the power harness and are positioned on the
corresponding OBC frames (Fig. 6.10).

Fig. 6.10 Thermostat positions. © IRS, University of Stuttgart

The thermostats were pre-integrated into the power harness by HEMA. The
allocation of the thermostats in the OBC is:
• Position 1: Thermostat 1 (5702BJ3628639 0642 007)
• Position 2: Thermostat 2 (5702BJ3628639 0642 011).

6.3 Verification

The OBC Harness was tested against potential failures and to verify proper
manufacturing quality and correct interconnections. The following tests were
performed:

• Contact Retention Test (only for crimped contacts)
• Resistance/Continuity Test
• Insulation Test.
These tests were intended to detect failures such as:
• Contacts not inserted
• Incorrect interconnections between source and target
• Incorrect interconnection of shield groups
• Short circuits
• Insulation defects.
Figure 6.11 shows the principal harness test configuration:

Fig. 6.11 Harness test setup. © HEMA Kabeltechnik GmbH & Co. KG

Test Conditions
The tests were performed under the same clean room conditions and following the
same handling procedures as during harness manufacturing. The tests had to avoid
overstressing the harness. To limit the mate/de-mate count of the connectors, the
use of test adapters was mandatory. The test conductor was not allowed to be the
same person as the harness assembler.

Retention Test
The contacts of the connectors were tested with the contact retention tool to verify
full contact insertion and correct retention forces. This test was only applied to
contacts crimped to the wire (Fig. 6.12).

Fig. 6.12 Retention test. © HEMA Kabeltechnik GmbH & Co. KG



Resistance Test (Milliohm-Test)
A resistance test was performed to verify the contact quality between harness parts
(contacts, overall shield) which should ideally form a perfect connection. Since a
perfect connection cannot be verified directly, the quality of the connection must
be verified by two means: the first test is a bonding test and the second is a
continuity test (Fig. 6.13).

Fig. 6.13 Resistance test: (a) contact-to-contact resistance test (continuity), (b) contact-to-shield
resistance test (bonding). © HEMA Kabeltechnik GmbH & Co. KG

The bonding value that was used reflects only the transitions which are elec-
trically relevant to the system. An acceptable value stays far below 1 Ω; in most
applications the value must be far below 20 mΩ, which was also the reference
limit for this harness.
In the continuity test the harness was checked for the correct point-to-point pin
allocation, and the resistance value of each line was recorded. This allowed
comparing values between multiple cables in the harness and analyzing them
based on length and diameter: cables with the same length and diameter should
also have the same resistance value.
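The comparison logic of the continuity test can be illustrated as follows; the wire
gauge, line length, measured values and acceptance band in this sketch are
assumptions of ours, not the actual test limits:

```python
# Sketch of the continuity comparison (assumptions: AWG 22 copper wire and
# a 0.30 m routing length, chosen for illustration; measured values are
# made-up examples). Expected loop resistance follows R = 2*rho*L/A.

RHO_CU = 1.72e-8         # Ohm*m, copper resistivity at room temperature
AWG22_AREA = 0.326e-6    # m^2, cross-section of an AWG 22 conductor

def loop_resistance(length_m: float, area_m2: float = AWG22_AREA) -> float:
    """Expected go-and-return resistance of one line pair."""
    return 2.0 * RHO_CU * length_m / area_m2

expected = loop_resistance(0.30)                                # ~32 mOhm
measured = {"line A": 0.031, "line B": 0.032, "line C": 0.045}  # Ohm

for name, r in measured.items():
    deviation = abs(r - expected) / expected
    verdict = "ok" if deviation < 0.10 else "re-check"
    print(f"{name}: {r*1e3:5.1f} mOhm (deviation {deviation:5.1%}) -> {verdict}")
```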

Insulation Test
This test was performed to verify the proper electrical insulation between two
parts, such as contacts to each other or to the housing. The applied test conditions
were 500 VDC at a test current of 1 A (Fig. 6.14).

Fig. 6.14 Insulation test: (a) contact-to-contact insulation test, (b) contact-to-shield insulation
test. © HEMA Kabeltechnik GmbH & Co. KG



6.4 Quality and Manufacturing Documentation

The documentation of a harness for space applications is combined in a so-called
Manufacturing Data Package (MDP), defined by the customer. For the FLP target
satellite mission it consists of the following sections:
• Test list (connection tests performed and corresponding results)
• Test procedure (how the tests were performed)
• History record (every step from kitting to packing, recorded by person and
date)
• Connector mate/de-mate list (how often each connector was mated/de-mated)
• Configuration item list (material list sorted by bundle)
• Declared materials list (the sum of all materials that were used).
Chapter 7
OBC Mechanical and Thermal Design

Michael Lengowski and Fabian Steinmetz

© Vilnis – Fotolia.com

7.1 Mechanical and Thermal Requirements

The mechanical structure and the thermal system of the OBC were designed by
the IRS. With this approach it was possible to find a configuration of the OBC
housing and electronics that is as compact as possible and well adapted to the FLP
target spacecraft. The conceptual design of the mechanical and thermal architecture
was conducted on the basis of the following requirements:
Mechanical:
M01: The mechanical structure shall cover Processor-Board, I/O-Board, CCSDS-
Board and the Power-Board as well as their redundancies.

M. Lengowski (✉) · F. Steinmetz
Institute of Space Systems, University of Stuttgart, Stuttgart, Germany
e-mail: lengowski@irs.uni-stuttgart.de
F. Steinmetz
e-mail: steinmetz@irs.uni-stuttgart.de


M02: The maximum envelope of the OBC shall not exceed 220 × 300 × 140 mm³
(see Fig. 7.1) and the mass shall not exceed 6 kg.

Fig. 7.1 Envelope of OBC in the FLP satellite. © IRS, University of Stuttgart

M03: The OBC shall be dimensioned to withstand a quasi static load of 100 g in
all axes. The first eigenfrequency of the OBC shall be higher than 130 Hz.
M04: The mechanical structure shall provide a rigid connection between the elec-
tronic boards as well as a firm attachment to the satellite.
M05: All circuit boards as well as the internal connection harness shall be sealed
from each other with regard to HF interferences.
M06: All circuit boards shall be separately testable and safely embedded for
handling.
M07: The mechanical structure shall be designed for possible removal of all
PCBs.
M08: The used components shall withstand the orbit environment with respect to
thermal, EMC and radiation conditions.
Thermal:
T01: The OBC shall feature an operating temperature range of -40 to 80 °C.
T02: The thermal connection of the circuit boards to the structure of the OBC
housing shall be designed to prevent high temperature spots.
T03: The temperature of the OBC shall be measured with two thermal sensors.
T04: Redundant survival heaters shall be installed to prevent the OBC from
being switched on below its cold temperature limit. Furthermore, they shall
be controlled without telecommand.

7.2 Mechanical Design of the OBC

7.2.1 OBC Structure Concept

The given mechanical requirements led to a compact configuration with modu-
larity for testing each circuit board on its own. In order to meet the required
envelope, the boards are of 3U Eurocard type (100 mm × 160 mm) and are all
oriented in vertical direction—see Fig. 1.2. This design furthermore results in a
balanced thermal coupling of all boards to the satellite radiator baseplate, leading
to a homogeneous temperature profile over the entire OBC housing. The modular
configuration is achieved by enclosing each board in a single cassette frame
(Fig. 7.2).
Cassette configurations of electronic components are a common design concept
for electronic units with a large number of circuit boards. A cassette configuration
offers the possibility of separating each board for tests and assures safe and
accurate handling. The cassette frames are designed to seal the mounted circuit
boards from HF influences from outside and from each other. This is achieved by
using two rectangular contact surfaces at every connecting edge between the
frames and the outer plates. Additionally, screws spaced every 20 mm along the
contact surfaces generate compression forces, creating a very thin gap between the
connecting surfaces. Thus HF signals cannot pass the non-straight and thin gap at
the edges, and the cassette is sealed in every direction.
In order to connect the PCB to the cassette, additional tie rods are implemented
in most commercial electronics, creating a stiff stack. This standard solution with
four rods requires a large envelope volume. For very compact micro-satellites the
flight harness connection envelope does not allow this solution for the OBC. In the
selected design every single cassette is mounted to the base plate of the satellite
structure by means of three M5 screws to achieve the required stack stability.
Therewith this base plate takes over the function of the lower tie rods. To permit
mounting and multiple dis-/re-mounting of the entire OBC from the satellite's base
plate, helicoils are used for all M5 screws. The upper tie rods of a conventional
cassette design are replaced by the form-locking cassette interconnection, which
prevents movements of the cassettes relative to each other. The design is
strengthened by the numerous M2 screws. In order to achieve a plane mounting
area at the frame/baseplate contact surface, small countersunk screws were selected.

Fig. 7.2 Power-Board frame as example for the design principle. © IRS, University of Stuttgart
The OBC boards connect to two different harnesses. The first one is the harness
to the satellite, providing the power for the OBC and the interfaces to the spacecraft
components. This harness directly starts at the CCSDS, the I/O and the Power-
Boards. Due to the required pin number on these OBC external connectors, the
PCB long side is used for these connections, whereas the short side of each PCB
provides the internal connectors (see Figs. 1.2, 2.1, 3.7, 7.3 and 7.4).

Fig. 7.3 Cassette separation for I/O-Board frame. © IRS, University of Stuttgart

Fig. 7.4 Cassette separation for Processor-Board frame. © IRS, University of Stuttgart

The significantly larger number of external harness lines connecting to the
I/O-Board requires two special connectors featuring a higher integration density
than standard D-Sub connectors. Special 100-pin Micro-D Axon connectors are
used for these interfaces—see connectors D and E in Fig. 3.7.
The OBC internal harness (see Chap. 6) interlinks the OBC boards with each
other. It is required that the internal harness is shielded against HF influences from
the circuit boards, that it does not radiate any HF towards the circuit boards, and
that it is shielded against HF influences from the OBC's outer environment.
Therefore the frames are designed with upper and lower overlapping ''noses'' to
create an additional front compartment in the OBC housing when all are assembled
together—see Figs. 1.2, 7.3 and 7.4. The edges of this compartment are manu-
factured in the same way as those of the circuit board compartments, with two
rectangular contact surfaces.
In order to provide the possibility of replacing the circuit boards for maintenance,
the frames are designed in two parts. In the case of the CCSDS, I/O and Power-
Board, the cassette is separated into a frame part and a cover plate for the external
connectors. This design allows dismounting the circuit board from the frame after
dismounting the frame cover plate. To assure high-frequency signal shielding,
contacting overlapping edges are used here as well. For the CPU cassette, which
has no external connectors, the cassette rear plane has to be removed for accessing
the internal connectors. The configuration of the cassette assemblies is depicted in
Figs. 7.3 and 7.4.
All frames are supplied with cut-outs in the rear plane and the outer surfaces for
mass reduction. The top and side cut-outs are applied from the outside and can be
milled. The rear plane cut-outs had to be made from the inside of the frame in
order to produce an elevated contact surface for thermal coupling between PCB
and frame; because of the undercut, these cut-outs are manufactured by eroding. In
order to increase the eigenfrequency of the circuit boards themselves within a
frame, an additional mounting point was foreseen in the center of each board.
Furthermore, two venting holes are foreseen in each frame for faster evacuation
of the cassette. To prevent a potential HF leak, these venting holes are realized with
a very small diameter of 1.5 mm and lead around a rectangular corner. The
corner is realized by a hole from the inside and a hole from the outside meeting
each other at a right angle. All remaining open surfaces of the OBC housing are
closed by three integrally manufactured plates—see the front cover (removed in
Fig. 1.2), the cover of the rightmost frame and the small left front compartment
cover in Fig. 1.2. The general mechanical properties of the OBC are provided in
Table 7.1. Figure 7.5 shows the closed assembly of the OBC housing.

Table 7.1 Characteristics of OBC

Characteristic              Property
Mass                        4.98 kg
Volume                      267 × 217 × 121 mm³
Material                    EN AW-6082 (AlMgSi1)
Moments of inertia (kg m²)  Lxx = 0.095, Lyy = 0.137, Lzz = 0.181,
                            Lxy = 0.065, Lxz = -0.039, Lyz = -0.03
First eigenfrequency (FEM)  174 Hz
Quasi-static design load    100 g

Fig. 7.5 OBC assembly with closed front cover. © IRS, University of Stuttgart

7.2.2 Mechanical Dimensioning and Concept Validation

The design and dimensioning of the OBC housing was done using the CAD
software CATIA V5 R20 from Dassault Systèmes and NX I-deas 6.1 from Siemens
PLM Software. CATIA is a multi-platform CAD/CAM/CAE program and is used
as the principal mechanical design software in the Stuttgart Small Satellites
Program. The CAD/FEM software NX I-deas is used to assist in the mechanical
dimensioning. Due to differences in the software tools and their implementation,
two different models were created:
• In CATIA a 3D model was created to be used for fitting and collision analyses
as well as to detail the manufacturing process.
• The FEM model, on the other hand, consists of 2D elements for shell meshing of
the simulation.

This meshing type was selected in order to reduce computing time and to
increase the accuracy of the simulation. The use of 3D-elements would have
generated a higher number of elements than desirable for simulating a structure
with such small wall thicknesses.
The OBC frames and their circuit boards are both modeled in the FEM simu-
lation. All shell elements are defined with their corresponding shell thicknesses
taken from the CAD data. To simplify the modeling process, the radii of the cut-
outs are not included in the simulation. The electrical components of the circuit
boards are modeled as non-structural masses on the boards; a quasi-homogeneous
distribution of the components over the PCBs is assumed. The connection of the
boards to the frame is represented by one-dimensional rigid elements at all seven
screw locations. Such a rigid element is a connection between two nodes that
keeps the distance and angle between them constant (Fig. 7.6).

Fig. 7.6 Quasi-static simulation of OBC (data in N/mm²). © IRS, University of Stuttgart

The OBC model is fixed by 24 restraints as boundary conditions, each without
any translational or rotational degree of freedom. For these restraints the nodes at
the screwing points were selected. The dimensioning was done with quasi-static
and modal analyses. In the quasi-static simulation an acceleration load of 100 g was
applied to the model. In order to simulate the quasi-static loading from different
directions, three load cases were defined to model acceleration in the x, y and z
direction of the OBC. Through these simulations the structural stresses were deter-
mined, which must stay below the permitted material characteristic values.
The modal analysis calculates the first eigenfrequencies of the OBC. These
frequencies are required to be above 130 Hz in order to have enough margin to the
first resonance frequency of the FLP target satellite induced by the launcher. The
design of the OBC in the CAD model could be optimized with the results from
these simulations. The results are depicted in Table 7.2.

Table 7.2 Loads, deformations and first eigenfrequency from OBC FEM simulations

Simulation                                  Results                 Approval value
Quasi-static load of 100 g in x direction   39.0 N/mm²; 0.042 mm    135 N/mm²
Quasi-static load of 100 g in y direction   38.9 N/mm²; 0.311 mm    135 N/mm²
Quasi-static load of 100 g in z direction   47.4 N/mm²; 0.155 mm    135 N/mm²
First eigenfrequency of modal analysis      174.6 Hz                130 Hz
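The pass/fail evaluation of Table 7.2 amounts to simple margin-of-safety book-
keeping, as the following sketch illustrates (our own post-processing example with
the values from Table 7.2, not part of the NX I-deas tool chain):

```python
# Margin of safety MoS = allowable/applied - 1 against the 135 N/mm^2
# approval value, plus the eigenfrequency check against 130 Hz. Numbers
# are taken from Table 7.2 above.

ALLOWABLE_STRESS = 135.0   # N/mm^2, approval value
FREQ_REQUIREMENT = 130.0   # Hz, required minimum first eigenfrequency

cases = {"100 g in x": 39.0, "100 g in y": 38.9, "100 g in z": 47.4}

for case, stress in cases.items():
    mos = ALLOWABLE_STRESS / stress - 1.0
    print(f"{case}: applied {stress:5.1f} N/mm^2, MoS = {mos:+.2f}")

first_eigenfrequency = 174.6   # Hz, from the modal analysis
print("eigenfrequency ok:", first_eigenfrequency > FREQ_REQUIREMENT)
```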

The OBC assembly is vibration tested when mounted into the target satellite
platform. The capability of the applied cassette frame concept was demonstrated
with other components of the FLP target satellite featuring PCBs of the same size
and, hence, the same frame sizes. The OBC cassettes also correlate with respect to
wall thicknesses, screw mountings and interconnection between cassettes. The
loads applied to these units are included in Tables 7.3 and 7.4. The random and the
sine vibration tests were conducted in each axis.

Table 7.3 Random vibration test loads

Frequency (Hz)   Qualification level PSD (g²/Hz)
20               0.017
110              0.017
250              0.3
1,000            0.3
2,000            0.077
gRMS             19.86
Duration         3 min/axis

Table 7.4 Sine vibration test loads

                    Frequency range (Hz)   Qualification level
Longitudinal axis   5–21                   12.5 mm (0 to peak)
                    21–100                 11 g
Lateral axis        5–16.7                 12.5 mm (0 to peak)
                    16.7–100               7 g
Sweep rate          2 Oct/min
Number of sweeps    One up-sweep

7.3 Thermal Design of OBC

The design of an electronic box for a very compact satellite has to consider some
particularities with respect to thermal balancing. In contrast to electronics in large
satellites, such a SmallSat unit has no large compartment to radiate heat into,
where waste heat could be absorbed and further conducted/radiated away.
Therefore the CDPI units OBC and PCDU are thermally designed to be mounted
on a radiator plate to which waste heat can be conducted and which radiates it into
space on the spacecraft's outer side. In the case of the FLP target satellite this
radiator at the same time forms the structural baseplate with the launch adapter
ring—see also Sect. 10. For proper cooling the CDPI units are additionally coated
on their outer surface with a thermal paint featuring a high emissivity. The inside
of the OBC unit frames is also painted black to prevent hot spots on the electronic
boards and to allow thermally highly emissive chips to radiate their waste heat to
the board's frame.
As explained in the previous sections on the OBC Power-Boards and the
internal OBC harness, the OBC is equipped with compartment heaters on every
second PCB frame for keeping the OBC boards above the minimum operating
temperature. The heater control is performed by simple thermostats, so that it also
works in case of an OBSW failure. The positioning of the heaters and switches has
already been shown in Figs. 5.11 and 6.10 in the previous chapters. A consequence
of the OBC cooling design for the nominal operational mode is the fact that the
OBC's temperature can drop below -40 °C after a severe OBC failure out of
ground contact, where the reconfiguration needs multiple attempts and lasts several
orbits, if no heater power is supplied from the PCDU. Furthermore, OBC power-up
cannot take place before thermal preconditioning of the OBC through powering
the heaters by the PCDU and before verification of the OBC unit temperature. This
task is taken over by the CDPI FDIR functionality embedded in the Combined-
Controller inside the PCDU unit.

7.3.1 Thermal Model

To identify the thermal behavior of the OBC housing, a lumped parameter model
was established with the software ESATAN-TMS (see [73]). The model is shown
in Fig. 7.7, depicting each separate frame in a different color. For comparison, the
CAD model of the OBC is presented in Fig. 7.8.

Fig. 7.7 Thermal model of the OBC. Cyan: power frame, blue: CCSDS frame, red: Processor-
Board frame, yellow: IO frame, green: harness cover. © IRS, University of Stuttgart

Fig. 7.8 CAD model of the OBC with the IO frame highlighted. © IRS, University of Stuttgart

Each part of the OBC is modeled separately and is connected via user-defined
conductive couplings. Solid part elements were merged together in order to cor-
rectly represent the heat conductivity within each integrally machined part.
Figure 7.9 shows a frame in CAD compared to the ESATAN model in
Fig. 7.10. The mesh of the PCB was designed to correctly represent the area of the
contacting surfaces between PCB and frame. This contact substantially influences
the conducted heat flux and therefore the temperature of the PCB. For the contact
conductivity between nodes a value of 300 W/(m²K) was assumed. This value
represents an average value for screw connections [74]. These values were con-
firmed by thermal-vacuum tests (Fig. 7.11).
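In a lumped parameter model such a contact turns into a conductive coupling
G = h·A between the two nodes. The following minimal sketch illustrates the
conversion; the contact ledge dimensions are assumptions for illustration only:

```python
# Back-of-the-envelope sketch (illustrative values) of how the assumed
# 300 W/(m^2 K) contact conductance becomes a node-to-node conductive
# coupling in the lumped parameter model: G = h * A_contact.

H_CONTACT = 300.0   # W/(m^2 K), screwed-contact value from [74]

def coupling(contact_area_m2: float, h: float = H_CONTACT) -> float:
    """Conductive coupling between two nodes in W/K."""
    return h * contact_area_m2

# e.g. an assumed 160 mm x 5 mm elevated contact ledge between PCB and frame:
area = 0.160 * 0.005
print(f"G = {coupling(area):.2f} W/K per ledge")   # 0.24 W/K
```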

Fig. 7.9 CAD representation of an OBC frame. © IRS, University of Stuttgart

Fig. 7.10 Thermal network model of a frame (yellow) and a PCB (green). © IRS, University of
Stuttgart

Fig. 7.11 Contact resistance of the OBC boards. © IRS, University of Stuttgart

For analysis purposes the heat dissipation over the PCBs was assumed to be
equally distributed over each board, independent of the board type. Figure 7.12
shows the heat dissipating parts of the mesh. The power dissipated in each PCB
can be taken from Table 7.5.

Fig. 7.12 Heat dissipating surfaces on a PCB (purple). © IRS, University of Stuttgart

Table 7.5 Thermal power dissipation

PCB board                     Power (W)
Processor-Board               4.75
I/O-Board                     1.5
CCSDS-Board (hot redundant)   2 × 1.0
Power-Board                   0.95 + 0.3 + 2 × 0.2 = 1.65

Every PCB is redundant in the OBC unit. Processor-Boards and I/O-Boards are
normally operated in cold redundancy except for certain FDIR and software patch
cases. The CCSDS-Boards are permanently operated in hot redundancy. The
material parameters applied for the OBC thermal model can be taken from
Tables 7.10 and 7.11.

7.3.2 Thermal Calculation Results

The thermal model was also used to verify the heat conduction to the radiator plate
under different thermal environmental conditions. The applied conditions are
given in Table 7.6.

Table 7.6 Thermal simulation boundary conditions

                                 Hot case   Cold case
Satellite interior temperature   50 °C      0 °C
Orbit height                     500 km     650 km
Satellite pointing               Inertial   Spin stabilized
Satellite mode                   Idle       Safe

The results of the simulation runs are shown in Figs. 7.13 and 7.14. From these
the conclusion can be drawn that the heat transport within the OBC is sufficient to
conduct both its own dissipated power and the dissipation from the satellite interior
onto the OBC towards the radiator. The PCBs are sufficiently coupled to the
mounting frames.

Fig. 7.13 Temperature chart—hot case. © IRS, University of Stuttgart

Fig. 7.14 Temperature chart—cold case. © IRS, University of Stuttgart



7.3.3 OBC Internal Heaters

It was already discussed that the OBC temperature can drop below the minimum
operational temperature in case of a longer deactivation—e.g. in case of a satellite
OBSW failure with resulting satellite tumbling and a longer power outage in
eclipse. The heaters to warm up the OBC before activating any board were also
mentioned. In the following paragraphs more details on these heaters, their
dimensioning and the selected parts are provided. For the positioning of these
heaters on every second OBC frame please refer to Figs. 7.15 and 7.16.

Fig. 7.15 Model of the heaters on a frame (purple). © IRS, University of Stuttgart

Fig. 7.16 CAD drawing of the heaters on a frame (yellow). © IRS, University of Stuttgart

The heaters are realized as heater mats, glued onto the cassette frame floors.
They conduct heat through the frame bottom into the cassette onto which they are
mounted, and they radiate heat into the neighboring frame, which is stacked to it
with its open side oriented towards the heaters.

An analysis with the thermal model showed that an electrical power above
40 W becomes inefficient for PCB heat-up, as more and more of the generated heat
is directly conducted to the radiator of the satellite rather than reaching the PCB
(see Fig. 7.17). As a result, the resistance of the heaters was selected to lead to a
maximum dissipation of 40 W in total. Variations may result from the unregulated
power bus voltage.

Fig. 7.17 Simulation of heaters with different heater power. © IRS, University of Stuttgart

Inside the OBC, a set of four nominal heaters is switched in parallel, and
another chain of four heaters for the same compartments represents the redundant
chain. So in each second compartment where heaters are placed, there is one
heater from the nominal and one from the redundant chain. In a worst case sce-
nario the heaters are activated in eclipse, when the solar panels do not supply any
power to the satellite bus. For that case the design voltage for the heaters is the
lowest possible battery voltage of 22 V. The resistance of the heaters was thus
chosen to provide approximately 5 W per heater in this scenario (please refer to
Fig. 7.18).
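The dimensioning can be reproduced from the per-heater dissipation quoted in
Table 7.8. The heater resistance itself is not stated in the text; in the following
illustrative sketch it is back-computed from the 5.88 W per heater quoted at the
minimum battery voltage of 22 V (P = U²/R):

```python
# Sketch reproducing the heater dimensioning numbers of Table 7.8 /
# Fig. 7.18; the resistance value is our back-computation, not design data.

V_MIN, P_AT_VMIN = 22.0, 5.88      # V, W (Table 7.8, minimum battery voltage)
R_HEATER = V_MIN ** 2 / P_AT_VMIN  # ~82.3 Ohm per heater mat (derived)

def heater_power(bus_voltage: float, r: float = R_HEATER) -> float:
    """Dissipation of one heater mat at the given bus voltage, P = U^2/R."""
    return bus_voltage ** 2 / r

for v in (22.0, 25.0):
    p = heater_power(v)
    # 22 V -> 5.88 W per heater, 47.0 W total; 25 V -> 7.59 W, 60.8 W total
    print(f"{v:4.1f} V -> {p:4.2f} W per heater, {8 * p:5.1f} W for all eight")
```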

Fig. 7.18 Heat dissipation per heater over battery voltage (blue), manufacturing tolerance
(black), minimum battery voltage (red). © IRS, University of Stuttgart

The heaters are procured from Minco Inc., as these models are suitable for
vacuum environment according to NASA standard [57]. The type of heater selected
for the OBC is model HK5591 with aluminum backing and pressure-sensitive
adhesive, suitable for a temperature range from -73 to 150 °C.
The heaters are activated by bimetallic thermostats as long as power is supplied
to the heater line, so that no micro-controller needs to be active for their control in
an FDIR state. The thermostats are manufactured and tested by the company
COMEPA according to the ESA ESCC 3702 & 3702/001 standard for bimetallic
switches [76]. These thermostats are listed in the European Preferred Parts List
(EPPL) for space components [77].
The definition of the optimum thermostat switch temperatures was again
achieved by using the thermal lumped parameter model. The temperatures at the
thermostat positions and the PCB temperatures were analyzed by means of
simulated cool-down and warm-up scenarios. For the precise assessment of the
necessary lower heater activation limit Tr, transient cool-down simulations have
been performed with the OBC thermal model. One scenario started from the OBC's
upper operational temperature limit, one from a cold case with moderate temper-
atures. The model assumed all OBC boards themselves to be in ''off'' state, i.e. not
dissipating any heat. The results show that when reaching the minimum operational
temperature of -40 °C there are almost no temperature gradients left in the
network. Therefore the activation temperature of the thermostats can be set directly
to the minimum operational temperature of the PCBs, which is -40 °C.
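The principle of such a transient cool-down analysis can be illustrated with a
deliberately tiny two-node network integrated with an explicit Euler scheme. All
numbers in the following sketch are assumptions of ours, not the flight model
values from the ESATAN-TMS analysis:

```python
# Minimal transient cool-down sketch (two lumped nodes, explicit Euler),
# in the spirit of the analysis described above. All parameters assumed.

C_FRAME, C_PCB = 400.0, 150.0   # J/K, assumed heat capacities
G_CONTACT = 1.5                 # W/K, PCB-to-frame coupling (assumed)
G_SINK = 0.8                    # W/K, frame-to-radiator coupling (assumed)
T_SINK = -60.0                  # deg C, cold-case sink temperature (assumed)

t_frame, t_pcb, dt = 20.0, 20.0, 1.0    # start warm, 1 s time steps
for step in range(30_000):
    q_pcb_frame = G_CONTACT * (t_pcb - t_frame)    # heat flow PCB -> frame
    q_frame_sink = G_SINK * (t_frame - T_SINK)     # heat flow frame -> sink
    t_pcb -= q_pcb_frame / C_PCB * dt
    t_frame += (q_pcb_frame - q_frame_sink) / C_FRAME * dt
    if t_pcb <= -40.0:
        print(f"PCB reaches -40 degC after {step} s; "
              f"PCB-frame gradient {t_pcb - t_frame:+.2f} K")
        break
```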
The switch temperatures of the procured OBC FM thermostats were charac-
terized in a thermal vacuum chamber at the IRS premises. The test results are in
accordance with the supplier data and a measurement tolerance of ±1 K for the
upper switching temperature Tf. The measured values of the lower switch tem-
perature Tr exceed this tolerance. But the overall system performance is still valid
with these devices, since they will activate heating even before the critical
temperature of -40 °C is reached (Table 7.7).

Table 7.7 Thermostat switch temperatures—specification and measured data

                     Tf (upper switch temperature)          Tr (lower switch temperature)
                     Manufacturer spec.   Measured value    Manufacturer spec.   Measured value
Thermostat 1 (°C)    -27.43               -27.31            -41.74               -39.45
Thermostat 2 (°C)    -25.62               -24.92            -39.51               -37.66
By means of the thermal model it was analyzed how long it takes during heat-up until the upper thermostat switch temperature is reached, considering the thermostat positions in the OBC housing front compartment and analyzing different heater configurations and varying power supply voltages. The results of these simulations, as well as an exemplary temperature chart of a heat-up simulation, are condensed in Table 7.8 and Fig. 7.19.
Table 7.8 Heat-up duration and consumed energy

                                          Battery 22 V → 5.88 W per heater              Battery 25 V → 7.60 W per heater
Heater lines                              1&2 → 47.1 W   1 → 23.52 W   2 → 23.52 W      1&2 → 60.8 W   1 → 30.4 W   2 → 30.4 W
Time to reach -40 °C at PCB (s)           1,680          4,080         4,080            1,320          2,760        2,760
Time to reach -27.43 / -25.62 °C
  at thermostat (s)                       4,020          16,800        16,800           3,240          8,940        8,940
Resulting heating period (s)              2,340          12,720        12,720           1,920          6,180        6,180
Required additional energy (Wh)           26.0           141.3         141.3            21.3           68.7         68.7

Fig. 7.19 OBC warm-up, all heaters active, battery at 25 V. © IRS, University of Stuttgart
7.4 OBC Housing Material Properties

See Tables 7.9, 7.10, and 7.11.
Table 7.9 OBC mechanical model—material properties

Property         Al EN AW 6082 T651
Material         Metal
E (N/mm²)        70,000
G (N/mm²)        26,700
ν (–)            0.34
ρ (kg/m³)        2,700
α (10⁻⁶/K)       23.4
k [W/(m K)]      170
c [J/(kg K)]     900
Rp0.2 (N/mm²)    240
Table 7.10 Thermal model—bulk properties

Bulk material        ρ (kg/m³)   c (J/kg K)   kp (W/m K)   kn (W/m K)
Aluminum (Al 6082)   2,700.00    900.00       150.00       150.00
Copper               8,920.00    385.00       394.00       394.00
FR4                  1,500.00    1,800.00     0.30         0.30
PCB                  1,662.00    1,634.00     8.90         0.31
Kapton               1,420.00    1,090.00     0.37         0.37
Kapton heater        3,770.00    606.21       5.21         0.55
Table 7.11 Thermal model—optical properties of applied materials

Surface material        εIR    αS
Aeroglaze Z307          0.89   0.97
Second surface mirror   0.75   0.09
FR4                     0.91   –
Kapton                  0.62   0.39
Chapter 8
The Power Control and Distribution Unit

N.N. and Alexander N. Uryu

8.1 Introduction
A Power Control and Distribution Unit (PCDU) traditionally performs the power regulation, control and distribution tasks in a satellite system. Furthermore, the PCDU is responsible for monitoring and protecting the satellite power bus. The PCDU is thus, together with the OBC, one of the key components on board the satellite. Some specific functionalities were implemented into the PCDU design in order to facilitate the overall Combined Data and Power Management Infrastructure (CDPI). This chapter describes both the specific PCDU functionality which enables this CDPI concept and the standard PCDU functions.
The FLP satellite's Power Control and Distribution Unit was developed in cooperation with an experienced industrial partner.
N.N.
Vectronic Aerospace GmbH, Berlin, Germany
A. N. Uryu (&)
Institute of Space Systems, University of Stuttgart, Stuttgart, Germany
e-mail: uryu@irs.uni-stuttgart.de

The expertise and compliance with Quality Assurance (QA) in an industrial organization contribute significantly to the quality standard of the unit. This industrial partner for the PCDU is Vectronic Aerospace GmbH in Berlin. Vectronic Aerospace can look back on many years of experience in manufacturing PCDUs and other satellite components for small satellite projects.
The specification for the PCDU functionality described in this chapter was composed accounting for the CDPI system and the FLP target satellite's mission requirements. Based on this specification, Vectronic Aerospace manufactured an electrical EM and an FM unit, both of which were electrically verified at Vectronic Aerospace premises and functionally verified at IRS premises.
This chapter covers the following topics:
• The target satellite’s Power Supply Subsystem
• The overall PCDU design
• Power regulation and control concept
• Analog data handling concept
• Reconfiguration functionality for the OBC
• Reconfiguration functionality for the entire satellite system
• Diverse functions
• Operational constraints/limits
• Unit interfaces.

8.2 The PCDU in a Typical Power Supply Subsystem

Like most satellites orbiting Earth, the FLP follows the standard implementation approach for the primary power source and energy storage device:
Primary power source: photovoltaic solar cells
Secondary energy storage: battery cells
The FLP features three solar panels in total, implemented in two different configurations: a side panel type which is deployed after separation from the upper stage of the launcher, and a center panel type which is body-mounted (please also refer to Chap. 10). The GAGET1-ID/160-8040 [78] solar cells from AZUR Space Solar Power are applied as primary energy source, together with lithium iron phosphate secondary battery cells for energy storage from A123 Systems [79]. The center solar panel includes a test string with experimental solar cells with an efficiency of 27.8 % (BOL, 28 °C, AM0) from AZUR Space Solar Power [80], which shall be qualified for space use during the FLP mission. Table 8.1 gives a short overview of the FLP target satellite's power sources. Please consult the respective data sheets for detailed technical information.
Table 8.1 Overview of the satellite's solar cell and battery key characteristics

Solar cells
Identification                               GAGET1-ID/160-8040
Base material                                GaInP2/GaAs/Ge on Ge substrate
Efficiency at BOL, 28 °C, AM0                25.3 %
Maximum power output in FLP configuration    Approximately 270 W

Secondary battery
Type                                         Lithium iron phosphate
Identification                               ANR26650M1-B
Total capacity for the configuration, BOL    35 Ah

8.3 PCDU Unit Design Overview

The PCDU features two SH7045 32-bit high-speed single-chip microcontrollers [84] from RENESAS Electronics for the operational tasks. These controllers have already been applied successfully in multiple space missions by Vectronic Aerospace.
The PCDU is composed of five stacked PCBs which are assembled with a cover plate on top. Two screw threads at the connector side allow for the fastening of ground straps for a proper electrical connection to the structure. All PCBs are connected to chassis ground.
All connectors are of Sub-D standard or Sub-D High Density type, as both connector types are relatively inexpensive and can be processed easily. Figure 8.1 shows the engineering model of the PCDU, which was used to conduct the functional verification at the IRS in Stuttgart. Table 8.2 shows the main electrical and mechanical characteristics of the unit. A CAD drawing of the PCDU can be found in Sect. 11.10.

Fig. 8.1 Engineering model of the PCDU. © Vectronic Aerospace GmbH
Table 8.2 PCDU characteristics

Parameter                                              Min    Typical   Max    Unit
Supply current at 25 V auxiliary supply (standby)      90     100       120    mA
Power consumption at 25 V auxiliary supply (standby)   –      2.5       –      W
Bus voltage                                            18.5   –         25.5   V
Reset recovery time                                    –      10        20     s
Mass                                                   –      4.14      –      kg
Dimensions:
  Height                                               –      117.4     –      mm
  Width                                                –      220       –      mm
  Depth                                                –      160       –      mm
Steady-state power consumption of the unit lies below 5 W. By design, heat-emitting parts like fuses, switches or the CPUs are placed by Vectronic on PCBs near the baseplate, which is connected to the structure for thermal conductance reasons. The remaining surface sections are anodized in a matt black color to increase the thermal balancing by radiation. A PCB-internal heating for the CPU PCBs facilitates a fast warm-up to -20 °C in order to prevent damage of electronic parts due to thermal stress from high temperature gradients. Moreover, the PCDU is qualified down to a lower limit of -40 °C for operational use to increase the availability of the PCDU and thus the satellite system reliability. The thermal conditions are monitored by five temperature sensors inside the PCDU.
According to FLP design regulations the PCDU is designed single-failure tolerant. This means that a specific functionality is covered by a redundant unit or functional path in case the nominal unit fails. The FM unit was also subjected to environmental tests, such as vibration and thermal-vacuum tests, to facilitate a safe launch and reliable operations in orbit. Additionally, the PCDU is designed to fulfill its tasks reliably under the influence of the expected space radiation for the projected mission lifetime of two years. According to [81], 1–10 krad are to be expected per year.

8.3.1 PCDU Interfaces

The PCDU is equipped with a number of interfaces for connecting digital and analog equipment, plus the serial interconnection to the OBC. Furthermore, the PCDU provides electrical interfaces for power generation, storage and distribution. In addition, interfaces are implemented for satellite operations, system monitoring and for all tasks of OBC monitoring and reconfiguration in the frame of the overall CDPI architecture. The listing provided below comprises all interfaces that are implemented for FLP use; see also Table 8.12 in Sect. 8.9 and Tables 11.48–11.50 in the annex for the connector affiliation. Figure 11.10 depicts the PCDU dimensions.
• Interface to solar arrays
• Interface to batteries
• Power supply for all components
• Battery overcharge protection
• Interface to the launcher
• RS422 communication interfaces to the OBC Processor-Board through I/O-Board
• RS422 communication interface for High Priority Commands
• Interface for temperature sensors
• Interface for sun sensors
• Interface for solar panel deployment detection
• Interface to experimental solar test string.

8.3.2 PCDU Command Concept

In general, Common Commands are available to control the fuses, switches and PCDU modes, and to request sensor data as well as PCDU status. All Common Commands and the according TM data are transmitted at a baud rate of 115200 through a full-duplex RS422 interface. The transmission protocol for commanding consists of a mandatory part of 8 bytes and an optional data part, see Table 8.3.

Table 8.3 Protocol structure of Common Commands

Byte no   0     1     2     3      4   5   6     7     8…
Meaning   LEN1  LEN0  CMDC  CMDID  P1  P0  CRCH  CRCL  CC Data

with
Byte    Explanation
LENx    Length of the data block following the command, big endian; the initial 8 bytes of a command are always mandatory
CMDC    Command count, incremented for every sent command, for acknowledgment identification
CMDID   Command ID, identification of a command group
Px      Command parameter
CRCH    16-bit CRC-CCITT (bytes 0–5), higher byte
CRCL    16-bit CRC-CCITT (bytes 0–5), lower byte
Data    Optional block of data, max. 65536 bytes plus 2 bytes of data CRC
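To make the framing concrete, the following minimal C sketch assembles the mandatory 8-byte part of a Common Command according to Table 8.3. It is an illustrative sketch, not flight code: the CRC-CCITT routine assumes the common polynomial 0x1021 with a seed of 0xFFFF, which has to be verified against the PCDU ICD [85].

#include <stddef.h>
#include <stdint.h>

/* CRC-CCITT, polynomial 0x1021, MSB first.
 * The seed 0xFFFF is an assumption; verify against the PCDU ICD [85]. */
uint16_t crc_ccitt(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;
    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)data[i] << 8;
        for (int bit = 0; bit < 8; bit++)
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;
}

/* Assemble the mandatory 8 bytes of a Common Command (Table 8.3);
 * 'datalen' is the length of the optional data block that follows. */
void build_common_command(uint8_t frame[8], uint16_t datalen,
                          uint8_t cmdc, uint8_t cmdid,
                          uint8_t p1, uint8_t p0)
{
    frame[0] = (uint8_t)(datalen >> 8);       /* LEN1 (big endian) */
    frame[1] = (uint8_t)(datalen & 0xFF);     /* LEN0 */
    frame[2] = cmdc;                          /* CMDC: command count */
    frame[3] = cmdid;                         /* CMDID: command group */
    frame[4] = p1;                            /* P1: command parameter */
    frame[5] = p0;                            /* P0: command parameter */
    const uint16_t crc = crc_ccitt(frame, 6); /* CRC over bytes 0-5 */
    frame[6] = (uint8_t)(crc >> 8);           /* CRCH */
    frame[7] = (uint8_t)(crc & 0xFF);         /* CRCL */
}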

Every Common Command is acknowledged by a command return, also referred to as echo, informing about the reception and execution status. The command return can also contain Telemetry (TM) for previously sent commands. The protocol structure of the command reply is composed as shown in Table 8.4.
Table 8.4 Basic command reply structure

Byte no   0       1       2     3        4     5     6     7     8…
Meaning   LEN1-E  LEN0-E  B2-E  CMDID-E  P1-E  P0-E  CRCH  CRCL  Rx Data

Byte      Explanation
LENx-E    Length of the data block following the echo, big endian; the initial 8 bytes are always mandatory
B2-E      Byte 2: 0x00
CMDID-E   CMDID; Command ID of the received command
P1-E      Execution of received command: 0xF0 → yes; 0x0F → no
P0-E      CMDC, Command Count of the received command
CRCH      16-bit CRC-CCITT (echo bytes 0–5), higher byte
CRCL      16-bit CRC-CCITT (echo bytes 0–5), lower byte
Rx Data   Optional block of data, max. 65536 bytes plus 2 bytes of data CRC
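On the commanding side, a correspondingly minimal plausibility check of such an echo could look as follows; again a sketch derived from Table 8.4 (reusing crc_ccitt() from the previous listing), not the actual OBSW implementation.

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

extern uint16_t crc_ccitt(const uint8_t *data, size_t len); /* see previous sketch */

/* Validate an 8-byte command echo (Table 8.4) against the command count
 * and command ID of the previously sent Common Command. Returns true if
 * the echo is well-formed and reports successful execution. */
bool echo_ok(const uint8_t echo[8], uint8_t sent_cmdc, uint8_t sent_cmdid)
{
    const uint16_t crc = crc_ccitt(echo, 6);   /* CRC over echo bytes 0-5 */
    if (echo[6] != (uint8_t)(crc >> 8) || echo[7] != (uint8_t)(crc & 0xFF))
        return false;                          /* corrupted echo */
    if (echo[2] != 0x00)                       /* B2-E must be 0x00 */
        return false;
    if (echo[3] != sent_cmdid || echo[5] != sent_cmdc)
        return false;                          /* echo of another command */
    return echo[4] == 0xF0;                    /* P1-E: 0xF0 = executed */
}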

8.4 Boot-Up Sequence of the PCDU and PCDU Modes

A safe and reliable step-by-step boot-up sequence of the PCDU, and thus of the entire satellite system, is implemented to facilitate the completion of the first stable satellite mode, the System Safe Mode. The boot-up procedure includes specific prerequisites before the OBC boards are powered by the PCDU and assume control of the satellite. Thereby, the following actions are performed to prevent damage to critical satellite units (a sketch of this sequence follows the list):
1. The PCDU-internal heaters warm up the unit to its operational temperature limit.
2. The power level of the batteries is checked to ensure that the entire boot-up procedure up to the System Safe Mode can be completed.
3. The temperature level of the OBC unit and the TT&C transceivers is checked. If the temperature is below the operational limit, the PCDU activates the power switches for the redundant heater design of both units. These heaters include thermistors to facilitate the heating up to the specified operating temperature. Alternatively, a timer condition is implemented which is set according to the results of thermal simulations. As soon as the timer condition is met, the PCDU continues the boot-up process.
4. The last step concludes the boot-up procedure to the System Safe Mode. From here, the OBC can command every other defined operations mode of the satellite.
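The sequential logic above can be condensed into a simple state flow. The following C sketch is purely illustrative; all function names are hypothetical placeholders for PCDU-internal checks and actions, not actual firmware symbols.

#include <stdbool.h>

/* Hypothetical PCDU-internal checks and actions; names are placeholders. */
extern bool pcdu_at_operational_temperature(void);
extern bool battery_level_sufficient(void);
extern bool obc_ttc_above_operational_limit(void);
extern bool warmup_timer_expired(void);
extern void enable_obc_ttc_heater_switches(void);
extern void power_obc_boards(void);

/* Step-by-step boot-up towards System Safe Mode (Sect. 8.4).
 * Each step must complete before the next one starts. */
void pcdu_boot_to_safe_mode(void)
{
    /* 1. Warm the PCDU itself up to its operational temperature limit. */
    while (!pcdu_at_operational_temperature())
        ; /* internal heaters active */

    /* 2. Continue only if the batteries can carry the complete boot-up. */
    while (!battery_level_sufficient())
        ; /* wait while the batteries are being charged */

    /* 3. Heat OBC and TT&C transceivers if they are too cold; a timer
     *    condition (set from thermal simulations) bounds the wait. */
    if (!obc_ttc_above_operational_limit()) {
        enable_obc_ttc_heater_switches();
        while (!obc_ttc_above_operational_limit() && !warmup_timer_expired())
            ;
    }

    /* 4. Power the OBC boards; the OBC assumes control of the satellite
     *    and can command every other defined operations mode. */
    power_obc_boards();
}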
8.5 Power Control and Distribution Functions

The main task of the PCDU is the distribution and regulation of the electric power on board the satellite. The power handling design is specified to safeguard the power supply of the satellite bus as far as possible. Furthermore, specific protection features are implemented in order to prevent damage to the on-board components or to the batteries, which are essential for accomplishing the mission objectives. Figure 8.2 shows the circuitry of the PCDU and its connections to the satellite bus.
Each solar panel is connected to only one battery string by a Battery Charge Regulator (BCR) in order to prevent a single point failure. If all power strings were interconnected, the complete power supply would be disabled, for example, if a non-insulated cable accidentally contacted the structure. The FLP target satellite configuration represents a non-DET system [82] with an unregulated power bus whose voltage varies between 18.5 V and 25.5 V. The BCR is located in the direct energy path to protect the satellite bus from excessive voltage or current transients. Each BCR is adjusted to an upper voltage limit of 25.5 V, which corresponds to the end-of-charge voltage of each battery string.
The three independent power strings are combined before the Main Switch of the PCDU, but secured with diodes to prevent current flow from one string into another. In case a battery string or solar panel is broken or short-circuited, the energy of the other two strings can still be used to operate the S/C. Strings 0 and 1 represent the energy paths of the side solar panels, whereas string 2 represents the path of the middle solar panel and the solar test string. The solar test string is used for the generation of electrical energy by default.
The distribution of power to the consumer loads is controlled by the application of a fuse and switch system. The PCDU deactivates the power supply by the respective Latching Current Limiters (LCLs) as soon as an over-current is measured. Due to volume and cost reasons some power outputs are combined at one fuse. However, critical on-board components such as the OBC boards and the TC receivers are implemented as single loads on a fuse. For reliability reasons, and due to the combined allocation of multiple loads to one fuse, additional switches are used to regulate the power supply of single loads. High-power consuming components are equipped with two switches in series in order to protect the satellite bus: should a switch break during the mission lifetime, the second serially connected switch can be opened to deactivate the respective component if necessary. The LCL fuses can be reactivated after an over-current event, so the connected consumers are not lost for the mission. A complete list of the component affiliations to fuses and switches can be found in Table 11.51 in the annex.
In addition to the given fuse-switch control and protection system for the on-board loads, there are two bi-stable relays. Each of these bi-stable relays is dedicated to a battery survival heater. The relays are implemented in order to
Fig. 8.2 Circuitry and power connections of the PCDU. © IRS, University of Stuttgart
safeguard the heating of the battery compartment, even if the satellite is deactivated due to the under-voltage protection feature. Since the batteries are very sensitive with regard to their storage temperature conditions, this measure was implemented to protect the energy storage devices from damage.
Figure 8.3 shows the connections between the PCDU and one battery string. Charge and discharge of the battery are managed by the power interface (IF). By default, the switch is closed in order to allow the charging of the battery string. If the charge process shall be interrupted, the switch can be opened; the energy path with the diode still allows energy extraction from the battery.
Since the PCDU only monitors the voltage level of a complete battery string, single cells are not inherently protected from overcharging. If the voltages of single, serially connected cells diverge too much, single cells could be overcharged before the combined charge limit of 25.5 V is reached. In order to prevent overcharging of a single cell, an electrical circuitry is applied at the battery side which monitors the respective cell voltages. The PCDU features the reception interfaces for dedicated signals sent by this monitoring circuitry. As soon as the PCDU receives the interrupt signal, battery charging is stopped by opening the respective switch in the energy path for a specified time. In case of a fault event, the PCDU can be commanded to ignore the interrupt signal.
Each battery string is equipped with two temperature sensors for thermal monitoring. In case the temperature limits for a stable energy output are violated, the charging is disabled to prevent long-term damage of the cells.

Fig. 8.3 Connection of the PCDU and a battery string: battery string with balancing and monitoring circuitry and two temperature sensors (TS), connected via diode/switch power interfaces (IF) and a data interface to the PCDU; the balancing circuitry is not available for the battery string at the middle solar panel. © IRS, University of Stuttgart
The battery’s State of Charge (SoC), is usually calculated by the Onboard


Software (OBSW), in the OBC on the basis of the received PCDU TM. The OBC
passes the calculated value to the PCDU for mode control. Calculation of the SoC
is independently performed by the PCDU, since this information is required during
the start-up sequence when the OBC is still deactivated.
The available under-voltage protection feature preserves the batteries if the voltage level falls below 18.5 V; the measurement point is located at the main switch of the PCDU. Exhaustion of the batteries must be prevented, as the batteries could otherwise be damaged or even destroyed. When the under-voltage protection becomes active, the PCDU and hence the complete satellite is deactivated. Therefore, this lower limit is to be avoided by the OBSW operations management of the satellite. In the under-voltage case the satellite system is set into a low-power Safe Mode, in which the available power only charges the batteries, so that survival is assured with a sufficient margin before the minimum voltage level is reached. In order to avoid a meta-stable state in which the satellite repeatedly turns on and off, the PCDU reactivation threshold is specified at a higher voltage level of 21.5 V.
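This cut-off/reactivation pair forms a simple hysteresis. A minimal sketch of the logic, with the threshold values from the text and otherwise illustrative names:

#include <stdbool.h>

#define UV_CUTOFF_V   18.5f  /* under-voltage protection threshold */
#define UV_RECOVER_V  21.5f  /* reactivation threshold (hysteresis) */

/* Hysteretic under-voltage logic: once tripped, the PCDU stays off until
 * the bus has recovered well above the cut-off level, which avoids the
 * meta-stable on/off toggling described above. */
bool pcdu_enabled(float bus_voltage, bool currently_enabled)
{
    if (currently_enabled)
        return bus_voltage > UV_CUTOFF_V;   /* trip below 18.5 V */
    else
        return bus_voltage >= UV_RECOVER_V; /* re-enable only above 21.5 V */
}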
A further implication of the combined loads at one fuse is a software current monitoring feature based on a Current State Table. The Current State Table contains reference values of the allowed current levels for all on-board components. Additionally, the PCDU records which components are powered at a respective fuse with the help of the configuration list. As soon as an LCL determines a current flow exceeding the referenced value in the Current State Table, the PCDU deactivates the respective LCL to avoid damage to the connected components. This monitoring functionality is performed at a repetition rate higher than 20 Hz (more than twice the main control loop frequency) to increase the protection potential.
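Conceptually, the check can be sketched as follows; the table structure, sizes and helper names are invented for illustration and do not reflect the PCDU-internal implementation:

#include <stdint.h>

#define NUM_FUSES 16  /* illustrative count, not the real FLP number */

typedef struct {
    /* Reference limit per fuse, derived from the Current State Table
     * reference values of the components currently powered at that fuse
     * (tracked via the configuration list). */
    float allowed_current[NUM_FUSES];
} current_state_table_t;

extern float read_fuse_current(int fuse); /* placeholder ADC readout */
extern void  deactivate_lcl(int fuse);    /* placeholder LCL control */

/* Executed at > 20 Hz, i.e. more than twice the 10 Hz main control loop. */
void monitor_fuse_currents(const current_state_table_t *cst)
{
    for (int fuse = 0; fuse < NUM_FUSES; fuse++) {
        if (read_fuse_current(fuse) > cst->allowed_current[fuse])
            deactivate_lcl(fuse);  /* protect the connected components */
    }
}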

8.6 PCDU Specific Functions in the CDPI Architecture

8.6.1 Analog Data Handling Concept

One of the special characteristics which exceeds the scope of duties of a common PCDU is the different approach for on-board data reception. Usually, the collection of all data is conducted by a separate unit in an industrial satellite, sometimes referred to as Remote Interface Unit [10].
For the FLP, digital and analog data interfaces are separated in the command chain. Making use of synergies, the PCDU contains all analog on-board IFs. Since the PCDU contains Analog-to-Digital Converters (ADCs) for the measurement of voltages and currents anyway, this was a reasonable design decision. Most of the digital IFs to the satellite units are comprised by the I/O-Board. Dividing the two interface types and assigning them to two distinct components reduces the complexity of each of the respective units. Each interface unit can thus be developed as fast as possible, dependent only on the definition status of its respective IFs. Moreover, the required qualification effort is split, and the qualification time can be minimized as both units may be tested in parallel.
According to this interface control concept, the PCDU collects all analog sensor data on board the satellite. Some digital sensor data which is required for the PCDU tasks is collected as well. The sensor data shown in Table 8.5 is collected by the PCDU.

Table 8.5 Overview of sensor data collected by the PCDU

Data information                                                   Quantity of functional connections   Data type
Temperature sensors (resistance)                                   32                                   Analog
Sun sensors (current)                                              16                                   Analog
Solar panel and battery condition (voltage/current)                3 (solar panels) / 3 (batteries)     Analog
Characterization of an experimental test string (voltage/current)  1                                    Analog
Reed sensors for deployment status of the two solar wing panels    4                                    Digital
Separation detection from the upper stage of the launcher          1                                    Digital
Input for monitoring signals for battery overcharge protection     3                                    Digital

This sensor data is not processed inside the PCDU. The analog sensor data is converted to digital data by the ADCs, and all sensor data is then forwarded to the OBC. The handling is conducted in the OBC, utilizing its processing power; there the relevant data is also distributed to the respective subsystem control modules.

8.6.2 Reconfiguration Logic for the OBC

As explained in Sect. 1.4 and as published in [4], the Combined-Controller in this Combined Data and Power Management Infrastructure (CDPI) also serves as Reconfiguration Unit for the OBC. In this context the PCDU controller takes over the role of the CDPI Combined-Controller. The diverse boards of the OBC and their functions were described earlier in Sect. 1.2. There is always one operational instance of the OBC Processor-Boards and I/O-Boards required in order to facilitate a working command chain on board the satellite. Since both CCSDS-Boards are permanently powered, the hot-redundant board is instantly available in case of a malfunction. This hot-redundant operations concept is not applied to the OBC Processor-Boards.
However, it is essential to guarantee the detection and reconfiguration of any malfunctioning OBC board. Industrial satellites usually feature an independent OBC-internal Reconfiguration Unit that permanently monitors the operation of the
OBC components. In case an OBC component is no longer working properly, this independent Reconfiguration Unit IC—in most cases a radiation-hardened ASIC—switches over to the corresponding redundant component. As explained in Chap. 1, the key idea of the Combined Data and Power Management Infrastructure is to use the processor of an intelligent PCDU to take over the task of the classic Reconfiguration Unit and thus to save the development, qualification and manufacturing of one entire OBC ASIC.
The first essential step in OBC FDIR is failure detection: in the FLP concept, the OBC requests important housekeeping data from the PCDU at a regular rate of 10 Hz by a Common Command. This data is recorded by the PCDU within its control loop, which is adjusted to match the 10 Hz polling cycle of the OBC. All requested data is submitted to the OBC accordingly. Among the periodically accumulated data are:
• Status of batteries: voltage/current/battery depth of discharge or state of charge calculated by the PCDU
• Status of fuses: on/off
• Status of switches: on/off
• Status of panels: voltage/current
• Temperature sensor data
• Sun sensor data.

In case the PCDU is not polled cyclically as specified, a failure of the OBC system is assumed by its reconfiguration function. Figure 8.4 shows the four OBC boards that are involved in the command chain of Common Commands from the OBC to the PCDU. These are the nominal and the redundant OBC Processor-Board—or "OBC Core" for short—as well as the nominal and redundant I/O-Board. Figure 8.4 also depicts the connections between the OBC's CCSDS-Boards and the PCDU for High Priority Commanding, which is explained in detail in the following section.
The affiliations '0' and '1' indicate that the respective units are operated in a hot-redundant mode. In contrast, the boards which are operated in a cold-redundant mode are affiliated Nominal (N) and Redundant (R).
Considering a single OBC failure leading to a watchdog timeout or TM request timeout on PCDU side, it is initially impossible for the CDPI Combined-Controller to identify which of the OBC boards is defective—defective in this context meaning either electrically defective or simply non-operational due to a crashed OBSW. Therefore, a specific reconfiguration procedure for the OBC is performed to restore the command chain operability by switching through the available instances. After each switching step a delay time is foreseen—e.g. to allow the redundant Processor-Board to boot up entirely—and the PCDU verifies whether telemetry polling is resumed by the OBC. This default hold time can be adapted by command. If polling is resumed, the reconfiguration is considered successful and the further sequence is aborted. If no polling occurs yet, the next switchover step is performed (the complete sequence is sketched in code after the list below):
Fig. 8.4 Interface communication between the OBC and the PCDU: Common Commands from the OBC Cores N/R via the I/O-Boards N/R to the PCDU interfaces 0/1, High Priority Commands from the CCSDS-Boards 0/1 directly to the PCDU CPUs 0/1 (SpaceWire inside the OBC, RS422 towards the PCDU). © IRS, University of Stuttgart

• Turn the switches off for both I/O-Boards and on again for the nominal I/O-Board
• Turn the switches off for both OBC Processor-Boards and on again for the nominal OBC Processor-Board
• Turn the switch off for the nominal I/O-Board and activate the switch for the redundant I/O-Board
• Turn the switch off for the nominal OBC Processor-Board and activate the switch for the redundant OBC Processor-Board
• Turn the switch off for the redundant I/O-Board and activate the switch for the nominal I/O-Board.
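A compact sketch of this switch-through loop follows; board identifiers and helper functions are illustrative stand-ins for the PCDU-internal implementation:

#include <stdbool.h>

typedef enum { OBC_N, OBC_R, IO_N, IO_R } board_t;

extern void switch_board(board_t b, bool on);  /* placeholder switch control */
extern void wait_hold_time(void);              /* default delay, adaptable by command */
extern bool obc_polling_resumed(void);         /* cyclic TM requests seen again? */
extern void enter_mini_operations_mode(void);  /* Safe Mode at system level */

static bool chain_recovered(void)
{
    wait_hold_time();              /* e.g. let a Processor-Board boot up */
    return obc_polling_resumed();  /* success aborts the further sequence */
}

void reconfigure_obc(void)
{
    do {
        /* 1: both I/O-Boards off, nominal one back on */
        switch_board(IO_N, false); switch_board(IO_R, false);
        switch_board(IO_N, true);
        if (chain_recovered()) break;
        /* 2: both Processor-Boards off, nominal one back on */
        switch_board(OBC_N, false); switch_board(OBC_R, false);
        switch_board(OBC_N, true);
        if (chain_recovered()) break;
        /* 3: nominal I/O-Board off, redundant one on */
        switch_board(IO_N, false); switch_board(IO_R, true);
        if (chain_recovered()) break;
        /* 4: nominal Processor-Board off, redundant one on */
        switch_board(OBC_N, false); switch_board(OBC_R, true);
        if (chain_recovered()) break;
        /* 5: redundant I/O-Board off, nominal one on */
        switch_board(IO_R, false); switch_board(IO_N, true);
        if (chain_recovered()) break;
    } while (0);
    enter_mini_operations_mode();  /* autonomous transition to Safe Mode */
}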

As soon as a working command chain configuration is found, the PCDU autonomously enters Mini Operations Mode, which corresponds to the Safe Mode at system level. By implementing the autonomous reconfiguration procedure, the down time of the satellite system is minimized. This is essential, since an uncontrolled satellite puts the mission at risk: without OBC operation the satellite loses its controlled attitude for power generation, and its thermal balance can be compromised. Additionally, there is only a limited number of ground stations available for academic projects, and they are not staffed all the time. Days could pass before someone notices the fault and reinstates the system, which drastically increases the risk of losing the mission.
The implementation of the above-described concept for the OBC reconfiguration is only reasonable because the PCDU itself is equipped with an internal watchdog circuit which facilitates the autonomous switching between the PCDU-internal, redundant controllers. Thus, the monitoring and switching tasks can be performed without significant delays, safeguarding a minimum downtime of the satellite system.
Figure 8.5 shows the watchdog functionality for the autonomous switching of the PCDU-internal controllers. Both the nominal and the redundant controller are operated in a hot-redundant concept with a master and a slave unit on separate electric circuits. The master unit performs all actions, whereas the slave monitors the master.

Fig. 8.5 Functional design of the switch logic for the PCDU internal CPUs: Controllers N and R exchange master switch signals via the switch logic, with a confirmation signal expected during each processing cycle. © IRS, University of Stuttgart

The master CPU sends a confirmation signal during each processing cycle in order to permanently confirm its operability. If this condition is no longer met, the switch logic commands the switchover of the master functionality to the slave unit.

8.6.3 Reconfiguration Functionality for the Spacecraft

The PCDU incorporates a further functionality which is essential for FDIR operations at S/C system level. In order to establish a reliable concept for spacecraft operations and system commandability, so-called High Priority Commands (HPCs) are applied in case the spacecraft OBSW has crashed and an automatic reconfiguration has failed. The concept of HPCs is explained in [10]. Such commands are applied in case the nominal command chain units, such as OBC Processor-Boards and I/O-Boards, or the OBC's OBSW are not operational. In standard architectures the HPCs are submitted from ground to a dedicated Command Pulse Decoding Unit (CPDU) of the OBC, which then commands the PCDU relay switching via dedicated pulse command lines. In the integrated CDPI architecture, HPCs also bypass the OBSW command chain but do not need a CPDU, as they are transmitted directly from the CCSDS-Boards to the PCDU (see Sect. 1.4 and [4]).
The following explanations of the CCSDS protocol are limited to the parts that are necessary to understand the HPC transmission and forwarding. Please consult the CCSDS standards [23] for further information. Figure 1.11 in Sect. 1.6.2 shows an example of the composition of an uplinked TC packet that is transmitted from ground to the satellite. After the decoding of the so-called Command Link Transmission Unit (CLTU), the TC Transfer Frame contains the commands from ground. Only the Frame Header and the TC Segment must be considered to understand the HPC forwarding. The Virtual Channel (VC) in the Frame Header indicates which command chain unit receives the data. The following four VCs are available for the FLP target spacecraft:
• VC0: nominal commands to OBC Processor-Board N
• VC1: HPC1 to CCSDS-Board N
• VC2: nominal commands to OBC Processor-Board R
• VC3: HPC1 to CCSDS-Board R.

Whereas nominal commands are assigned to the VCs '0' and '2', HPCs are allocated to the VCs '1' and '3'. An HPC that is commanded by ground is referred to as High Priority Command Level 1 (HPC1). HPCs Level 2 are processed by the OBC S/W. The TC Segment contains the Multiplexer Access Point Identifier (MAP-ID). A MAP-ID equal to '0' states that the TC Segment contains HPC1s and that the contained commands are directly forwarded from the CCSDS-Board to the PCDU. All MAP-IDs unequal to '0' imply PUS packets, which are transmitted to the OBC Processor-Board for further processing by the OBC S/W.
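Condensed into code, the forwarding decision reads as follows; this is an illustrative sketch only (function names invented), since the actual decoding resides in the CCSDS-Board logic:

#include <stddef.h>
#include <stdint.h>

extern void forward_to_pcdu(const uint8_t *seg, size_t len);            /* HPC1 path */
extern void forward_to_processor_board(const uint8_t *seg, size_t len); /* PUS path */

/* Route a decoded TC Segment by its MAP-ID (Sect. 8.6.3):
 * MAP-ID 0        -> HPC1s, bypassing the OBSW, directly to the PCDU;
 * MAP-ID non-zero -> PUS packets for the OBC Processor-Board.
 * The Virtual Channel has already selected the receiving board:
 * VC0/VC2 -> Processor-Board N/R, VC1/VC3 -> HPC1 via CCSDS-Board N/R. */
void route_tc_segment(uint8_t map_id, const uint8_t *segment, size_t len)
{
    if (map_id == 0)
        forward_to_pcdu(segment, len);
    else
        forward_to_processor_board(segment, len);
}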
As described in Sect. 1.4.2, industrial satellites may feature a so-called Command Pulse Decoding Unit (CPDU) on board to receive HPCs. This unit routes the commands to the respective units. An HPC consists of 2 bytes: the first 8 bits contain the channel selection, the second 8 bits the pulse length definition. It is thus possible to command 256 units (channel selection) through 256 commands (pulse length definition) by utilizing HPCs. For the FLP, the CPDU is integrated into the PCDU as the only unit commanded by HPCs. Thus, 65536 different commands may be implemented to reconfigure the satellite system by switching LCLs and component switches. The PCDU features a nominal and a redundant RS422 communication interface for the reception of HPCs from the OBC CCSDS-Boards (see Fig. 8.4) at a baud rate of 115200. All HPC packets are implemented with a 2-byte header that serves as an identifier for the following HPC frame. The composition of the HPC header is shown in Table 8.6.

Table 8.6 Header composition of an HPC frame

Byte number      1         2
Bit composition  11111111  01010101
An HPC frame can contain up to four High Priority Commands. Every command starts with a TC Source Packet Header (TSPH), which is completely dismissed. Each of the HPCs consists of the 6-byte TSPH, 2 command bytes plus 2 checksum bytes. HPCs can be used to activate or deactivate a single on-board component or a specific set of components covering a specific safety aspect. By virtue of their importance, HPCs are processed immediately after reception at the PCDU, with priority over Common Commands. The most important HPCs are:
• Activate or deactivate the on-board heaters
• Deactivate all non-essential loads for the Safe Mode to save energy
• Reconfigure the command chain.

The structure of a single HPC is shown in Table 8.7.

Table 8.7 Basic HPC structure

Byte no.   0–5    6         7         8     9
Meaning    TSPH   CMD ID-H  CMD ID-L  CRCH  CRCL

with
Byte       Explanation
Header     0xFF55
TSPH       TC Source Packet Header (0x00)
CMD ID-H   Activate: 0x0F; deactivate: 0xF0
CMD ID-L   HPC No.
CRCH       16-bit CRC (bytes 0–7), higher byte
CRCL       16-bit CRC (bytes 0–7), lower byte

Table 8.8 shows an example of an HPC command sequence, which may contain up to four single HPCs. Table 8.9 gives an overview of all implemented HPCs for the FLP.

Table 8.8 HPC frame composition

HPC sequence structure (up to 4 commands)
Byte no.   0–1     2–11   12–21   22–31   32–41
Meaning    Header  HPC1   HPC2    HPC3    HPC4
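Putting Tables 8.6–8.8 together, a frame carrying a single HPC could be assembled as in the sketch below. The 16-bit checksum is assumed here to be the same CRC-CCITT as for Common Commands (see the sketch in Sect. 8.3.2), which must be checked against [85]; HPC number 145, "activate all heater switches", is taken from Table 8.9.

#include <stddef.h>
#include <stdint.h>
#include <string.h>

extern uint16_t crc_ccitt(const uint8_t *data, size_t len); /* see Sect. 8.3.2 sketch */

/* Build one 10-byte HPC (Table 8.7): 6-byte TSPH (0x00), CMD ID-H/L, CRC. */
static void build_hpc(uint8_t hpc[10], int activate, uint8_t hpc_number)
{
    memset(hpc, 0x00, 6);                   /* TSPH, dismissed by the PCDU */
    hpc[6] = activate ? 0x0F : 0xF0;        /* CMD ID-H */
    hpc[7] = hpc_number;                    /* CMD ID-L: HPC No. (Table 8.9) */
    const uint16_t crc = crc_ccitt(hpc, 8); /* 16-bit CRC over bytes 0-7 (assumed CCITT) */
    hpc[8] = (uint8_t)(crc >> 8);           /* CRCH */
    hpc[9] = (uint8_t)(crc & 0xFF);         /* CRCL */
}

/* Frame with the 2-byte header and one HPC (Tables 8.6 and 8.8);
 * up to three further HPCs could be appended in the same way. */
static size_t build_hpc_frame(uint8_t frame[12])
{
    frame[0] = 0xFF;               /* header byte 1: 11111111 */
    frame[1] = 0x55;               /* header byte 2: 01010101 */
    build_hpc(&frame[2], 1, 145);  /* HPC 145: activate all heater switches */
    return 12;                     /* number of bytes to transmit */
}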

8.7 Diverse PCDU Functions

The PCDU furthermore features a number of functionalities which are partly standard for PCDUs, partly result from the PCDU's role in the overall CDPI architecture, and partly are implemented as specifics of the FLP target satellite. These functions are explained only very briefly here. For further information please refer to [83, 85].
Table 8.9 HPC commands

HPC number   Action
1–77         Activate switch 0–76 (turn single components on)
78–144       Deactivate switch 0–76 (turn single components off)
145          Activate all heater switches
146          Activate nominal core component switches (OBC N, I/O board N)
147          Activate redundant core component switches (OBC R, I/O board R)
148          Activate first core component cross-coupling switches (OBC N, I/O board R)
149          Activate second core component cross-coupling switches (OBC R, I/O board N)
150          Deactivate nominal core component switches (OBC N, I/O board N)
151          Deactivate redundant core component switches (OBC R, I/O board R)
152          Deactivate first core component cross-coupling switches (OBC N, I/O board R)
153          Deactivate second core component cross-coupling switches (OBC R, I/O board N)
154          Deactivate all payload switches
155          Deactivate all switches except for 'Safe Mode' components and 'Survival heaters'
156          Deactivate all heater switches

8.7.1 Launcher Separation Detection

Since the PCDU takes over some functions of a classical OBC Remote Interface Unit (RIU) in the CDPI architecture, it features an arming switch for the detection of spacecraft separation from the launcher by the opening of the according circuits. This prerequisite, together with a sufficient level of solar array input power, is required to start up PCDU operations.

8.7.2 Control and Monitoring of Solar Panel Deployment

The control and monitoring of the solar panel deployment procedure by the PCDU is based on two implemented deployment timers (timer 0 and timer 1) and on an activation flag. The control is performed if the activation flag is enabled (default setting).
After timer 0 becomes active (a set time after launcher separation), the PCDU activates the fuses and the switches of the heaters for the deployment mechanism and checks its status. As soon as the deployment mechanism signals a successful deployment of the solar panels, the PCDU switches off the heaters and disables the activation flag. If the deployment mechanism does not signal a successful deployment of the solar panels and the timeout value is exceeded, the PCDU will switch off the heaters without disabling the deployment process.
A total of five attempts will be made (with a wait interval in between) to release the solar panels by switching the deployment device heaters, as sketched below. After five unsuccessful attempts the autosequence is finally deactivated in order to save power, and FDIR from ground has to take over.
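The retry logic might be modeled as in the following sketch; the function names and the use of timer 1 as per-attempt timeout are illustrative assumptions, not the actual PCDU firmware:

#include <stdbool.h>

#define MAX_DEPLOY_ATTEMPTS 5

extern void heaters_on(void);            /* deployment mechanism heaters */
extern void heaters_off(void);
extern bool deployment_confirmed(void);  /* reed sensor status */
extern bool timeout_expired(void);       /* per-attempt timeout (assumed: timer 1) */
extern void wait_interval(void);
extern void disable_activation_flag(void);

/* Autonomous deployment sequence (Sect. 8.7.2), started once timer 0
 * becomes active after launcher separation. */
void deploy_solar_panels(void)
{
    for (int attempt = 0; attempt < MAX_DEPLOY_ATTEMPTS; attempt++) {
        heaters_on();
        while (!deployment_confirmed() && !timeout_expired())
            ; /* wait for the reed sensors or the timeout */
        heaters_off();
        if (deployment_confirmed()) {
            disable_activation_flag();  /* success: sequence finished */
            return;
        }
        wait_interval();                /* wait before the next attempt */
    }
    disable_activation_flag();          /* save power; ground FDIR takes over */
}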

8.7.3 Control of the Payload Data Transmission Subsystem Power
The power switches for the Data Downlink System for payload data transmission are deactivated after a certain duration. This feature is implemented to restrict the data downlink to the specified access times. Thus, the transmission of data over specific regions of the Earth is avoided, in accordance with the International Telecommunication Union (ITU) regulations.

8.7.4 History Log Function

The PCDU software includes a history log functionality for commands, events and the configuration of active components. The history log functionality is introduced in order to establish a means to check on actions inside the unit in case of operational issues. Each of the recorded values is identified by a dedicated ID and a time stamp.

8.7.5 Time Synchronization Between Internal Controllers

The PCDU features a time synchronization mechanism between the currently operating PCDU controller and the redundant one. The synchronization occurs every 5 min through an emulated UART interface between both controllers.

8.7.6 Overvoltage Protection

In addition to the under-voltage protection feature for the batteries, the PCDU features an overvoltage protection for itself. The PCDU is switched off automatically via its main switch as soon as a bus voltage greater than 28.5 V is detected. This case may apply during tests on ground, when the PCDU is powered through the auxiliary power input.
8.7.7 Measurement of Test-String Characteristics

The PCDU features a measurement circuitry based on a DAC for recording the characteristic curve of the test string on the satellite's middle solar array. The measurement is initiated by command. The PCDU sets the current flow through a shunt resistor and records the values of the current and of the associated voltage.
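As an illustration of such a commanded sweep, the sketch below steps the DAC-controlled load current and samples voltage/current pairs; the sweep resolution, DAC range and function names are invented for illustration:

#include <stdint.h>

#define N_POINTS 32  /* illustrative sweep resolution */

extern void  set_dac_current(uint16_t code);  /* sets load current via the shunt */
extern float read_string_current(void);       /* placeholder ADC readout */
extern float read_string_voltage(void);

/* Record the I-V characteristic of the solar test string (Sect. 8.7.7)
 * by stepping the DAC-controlled current and sampling U and I. */
void measure_test_string(float voltage[N_POINTS], float current[N_POINTS])
{
    for (int i = 0; i < N_POINTS; i++) {
        set_dac_current((uint16_t)(i * (65535 / (N_POINTS - 1))));
        current[i] = read_string_current();
        voltage[i] = read_string_voltage();
    }
    set_dac_current(0);  /* return the string to normal power generation */
}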

8.8 PCDU Environmental Qualification Characteristics

The PCDU is environmentally qualified with regard to the launcher requirements and the operational requirements in orbit. The load limits were implemented according to the ECSS-E-10-03A standard [47] where possible. However, full compliance with the ECSS standard (e.g. testbench setup, intermediate testing, …) was not achieved due to financial reasons.

8.8.1 Thermal-Vacuum Limits

The PCDU was tested under varying thermal and vacuum conditions. The temperature profile of the thermal testing of the unit is shown in Fig. 8.6. Usually, the lower non-operational limit is specified below the lower operational temperature limit. Here, the lower operational temperature was adapted to -40 °C in order to increase the operational reliability of the unit and thus of the overall satellite system.
Operating temperature range of the PCDU: -40 °C to +70 °C
Non-operational temperature range: -40 °C to +80 °C
Number of thermal cycles: 5

The vacuum tests were conducted at a maximum pressure level lower than 1 × 10⁻⁵, where the operability of the unit and of the PCB-internal heaters was confirmed.

8.8.2 Radiation Limits

The PCDU withstands a radiation dose of at least 20 krad (total dose) without significant degradation.
Fig. 8.6 Test profile for thermal testing of the PCDU: electrical tests at ambient temperature (Ta), followed by thermal cycling between the non-operational/operational high and low temperature limits (Tnh/Toh, Tnl/Tol) with 4 h dwell times per plateau. © Vectronic Aerospace, IRS

8.8.3 Vibration Limits

The PCDU survives the vibration loads shown in Tables 8.10 and 8.11 and performs without any degradation afterwards.

Table 8.10 PCDU sine vibration limits

Axis               Frequency (Hz)   Level
Longitudinal axis  4–10.8           15 mm (0 to peak)
                   10.8–100         11 g
Lateral axis       2–8.5            15 mm (0 to peak)
                   8.5–100          7 g

Sweep rate: 2 octaves per minute
Number of sweeps: one up-sweep
Table 8.11 PCDU random vibration limits

Axis            Frequency (Hz)   Level (g²/Hz)
All three axes  20               0.017
                110              0.017
                250              0.3
                1,000            0.3
                2,000            0.077

Overall level: 19.86 g RMS
Duration: 3 min per axis

8.9 List of Connectors

The table below provides an overview of all connectors of the PCDU unit, including a keyword description of the connector use. Detailed pin allocations are obviously driven by the onboard equipment of the individual mission—in this case the FLP satellite—and are therefore not provided here.
Connector pin allocations for the CDPI inter-unit cabling between OBC and PCDU for standard commanding and HPCs are included in the annex in Tables 11.49 and 11.50 (Table 8.12).

Table 8.12 List of connectors

Name   Type                Description
J1     SUB-D 25, male      Solar panel 0, battery 0
J2     SUB-D 25, male      Solar panel 1, battery 1
J3     SUB-D 25, male      Solar panel 2, battery 2, EGSE power input, solar test string
J4     SUB-HD 62, female   S/C equipment power supplies I (including power lines for CCSDS-Boards 0 and 1)
J5     SUB-HD 62, female   S/C equipment power supplies II (including power lines for OBC Processor-Boards N + R and I/O-Boards N + R)
J6     SUB-HD 62, female   S/C equipment power supplies III
J7     SUB-D 25, female    S/C equipment power supplies IV
J8     SUB-HD 62, male     Temperature sensor inputs I
J9     SUB-HD 62, male     Temperature sensor inputs II
J10    SUB-D 25, female    Deployment sensor inputs; sun sensor inputs I
J11    SUB-D 9, male       Communication interface for Common Commanding and High Priority Commanding I
J12    SUB-D 9, male       Communication interface for Common Commanding and High Priority Commanding II
J13    SUB-D 25, female    Sun sensor inputs II
8.10 PCDU Commands Overview

The PCDU provides a large number of commands for controlling all the described functions and for requesting the according telemetry for the processing activities by the OBSW. For details on the commands and telemetry messages, and for the full lists with their detailed syntax and arguments, the reader is kindly referred to the PCDU ICD from Vectronic Aerospace [85]. Below, a brief overview is given to depict the sophisticated features of the PCDU:
• Power control commands
  – Status request commands (e.g. currents and voltages of components, solar panels, batteries)
  – Control commands for all component LCLs, switches and relays
  – Adjustment commands for the over-current monitoring by the PCDU software
  – Adjustment commands for the charge regulation of the batteries.
• Commands for satellite operations
  – Adjustment and status request commands for solar panel deployment
  – Status request commands for thermistor temperatures
  – Status request commands for sun sensors
  – Control and status request commands for the solar panel test string measurement
  – Adjustment commands for the boot-up procedure and its prerequisites.
• Reconfiguration activities and FDIR commands
  – Adjustment and status request commands for the reconfiguration process of OBC units
  – Control and status request commands for the internal PCDU controllers
  – Request commands for the History Log.
• Diverse commands
  – PCDU reset
  – Status request commands for software version info.
Chapter 9
CDPI System Testing

Michael Fritz, Nico Bucher and Rouven Witt

9.1 Introduction

Both the modularity and the novelty of the presented CDPI infrastructure require extensive system testing. These tests cover the fields of
• hardware/software compatibility,
• internal communication between the Onboard Computer elements and
• OBC communication with external equipment and infrastructure.

M. Fritz (&) · N. Bucher · R. Witt
Institute of Space Systems, University of Stuttgart, Stuttgart, Germany
e-mail: fritz@irs.uni-stuttgart.de
N. Bucher
e-mail: bucher@irs.uni-stuttgart.de
R. Witt
e-mail: witt@irs.uni-stuttgart.de


Finally, the complete communication chain has to be proven, covering
• commanding the OBC unit from a ground control system,
• controlling satellite equipment units connected to the OBC and
• retrieving telemetry back through the entire chain.
Failure Detection, Isolation and Recovery (FDIR) test cases come on top. This chapter provides a brief overview of the entire test program of the CDPI, covering tests of the OBC components, the OBC subsystem and the PCDU subsystem up to the system tests of the entire CDPI infrastructure. A subset of the HW/SW integration tests is described in more detail in [87].
The early tests were performed with Breadboard (BBM) and Engineering Models (EM). The OBC Processor-Board Engineering Model is depicted in Fig. 1.8. BBMs and EMs do not yet have full FM functionality. Thus, additional FM tests are usually necessary later, and tests previously performed for BBMs and EMs need to be re-run on the FM CDPI units. This BBM/EM approach however reduces the effort for test debugging on FM hardware—which is available later in the program—and by this means it reduced the overall test program duration for the entire campaign. Figure 1.17 depicts both parts of the CDPI—the OBC Flight Model and the PCDU FM.

9.2 Test Scope

The complexity of the tests required a consistent concept. This concept had to cover the following tasks:
• Configuration tests of all OBC modules. This comprises read and write operations from/to memory, IP core updates as well as successful OBSW uploads and runs.
• Communication tests between the OBC modules. The elements (boards and functional subgroups such as the I/O-Board memory) must interact as specified. EMs were to be exchanged step by step by FMs. After each replacement, the tests needed to be re-run. Modules to be tested in interaction were:
  – CCSDS-Board and OBC Processor-Board
  – OBC Processor-Board and I/O-Board, covering: equipment interface read/write access by the OBC Processor-Board via the I/O-Board drivers, and housekeeping data management on the I/O-Board
  – OBC Processor-Board, I/O-Board and PCDU, covering: PCDU cmd./ctrl. by the OBC, and OBC reconfiguration by the PCDU
  – CCSDS-Board and PCDU
• System chain tests were necessary for controlling the CDPI from the Mission Control System ESA SCOS-2000 [88] via CCSDS protocol (see [23–28]), as foreseen for the target satellite, through the frontend equipment to the CCSDS-Board (Fig. 9.1).
• System tests for the onboard chains between OBC and S/C equipment were necessary, e.g. for PCDU, star tracker and reaction wheels. Such tests served to prove that the specification requirements were implemented correctly both on OBC and on equipment side and in the OBSW.
• Communication tests with simulated S/C equipment for Onboard Software verification. Thus, operational scenarios could be tested applying closed-loop scenarios without sensor stimulation.

Fig. 9.1 ESA mission control system SCOS-2000. © IRS, University of Stuttgart

9.2.1 Test Conditions

The test conditions for functional tests are driven by
• cleanliness,
• temperature and
• humidity.
For EMs the tests were performed under ambient conditions in an air-conditioned laboratory. For FMs, all tests concerning boards or integrated units were performed exclusively in a temperature and humidity controlled class 100,000/ISO 8 cleanroom at IRS premises.
9.2.2 Involved Personnel

This aspect is of key relevance for this particular university project, since it had to cope with significant personnel fluctuation over the development period—particularly in the domain of Assembly, Integration and Tests (AIT) of the CDPI. For all projects it is necessary to distinguish between:
• Engineering staff—designing the foreseen tests up to a specification level, and
• AIT staff—debugging the test flows, executing them and documenting the results.
The areas of responsibility have to be clearly defined. This also applies to a university project. However, some simplifications are necessary at university level due to limited manpower and the requirement for lean system engineering. At university, almost the entire team consists of students and PhD candidates, which implies the high personnel fluctuation in the project. Students support the project for less than 1 year, PhD candidates for 3–4 years. To be as efficient as possible, it is favorable to perform a limited, clearly defined subset of the test campaign with the same set of people, and the next functional subset with the successor team. Team fluctuation in such a program implies the necessity to organize proper know-how transfer and system test status handover from one sub-team to the next, with sufficient handover time. Furthermore, a proper tracking of personnel to tests and of lessons learned is required. Such tasks are preferably assigned to PhD candidates due to their longer availability in the team.
Since the OBC Processor-Board is an item subject to ITAR, there were test tasks which could only be performed by authorized personnel.

9.2.3 Test Program Simplifications

As already mentioned, there were simplifications necessary in the test program, resulting from university constraints like
• limited manpower,
• limited experience and
• budget.
These simplifications affected the entire functional verification program, particularly the hardware and software integration tests—also due to the fact that the satellite OBSW development program was running with a minimum staff. Thus, it had to be elaborated where simplifications were feasible. This was a balancing between risk and schedule/cost efficiency. The most important simplification identified in this program was a substantial reduction of unit tests at IRS: except for the IRS-made OBC Power-Boards and the PCDU functional behavior, the commercially manufactured units had already undergone a test suite at supplier level. This reduced unit test program also implied skipping some tests on system level. All mentioned considerations directly lead to a test plan, which is presented in condensed form in the following section.

9.3 Test Plan

This section provides a condensed overview of the most important elements of the test plan which served as baseline for the qualification test campaign of the CDPI system. In order to keep an overview of the tests to be performed, to track the test execution status during the campaign and to track the test success status, so-called functional verification matrices were used. These FV matrices address the following questions:
• Which tests have to be performed?—Test scope
• Which testbench is to be used?—Test setup and environment
• Which HW model is to be used in the test (BBM, EM or FM)?—Item under test
Exact documentation of the used software version as well as of the TM/TC database was essential in order to reproduce the tests—especially if they had to be run on multiple testbenches or with multiple hardware models (EM + FM). Furthermore, the test plan had to consider the test concept. It organized the order of tests, described simplifications and served as documentation both for internal and external purposes. Furthermore, lessons learned were part of this document in order to accumulate experience and knowledge.
The next step in the test program was to develop test procedures for each test identified in the test plan. Then the tests had to be executed and test reports were to be filled out for documentation. To simplify the overall procedure in the frame of the university program, combined test procedure/report documents were established:
• First the test procedure part was implemented, containing sections on test setup, required equipment/instrumentation, test conditions, personnel, etc., and finally at the end a step-by-step procedure to be executed, with entry fields for expected and as-received results. The latter column initially remains empty.
• Then the procedure was reviewed and formally signed off by the senior engineer of the program.
• Thereafter the test was executed—potential procedure variations were directly added to the document, and the as-is result was entered into the according column together with an OK/NOK information to track success/failure. Each test with FM HW was executed by at least 2 persons to guarantee the four-eyes principle.
The general component functional tests—except for OBC reconfiguration—unit performance tests and all EM unit tests were executed on a dedicated Satellite Test Bed, which included a ground station frontend for the TC/TM traffic to/from the OBC on one side and a simulation of the target satellite on the back side. This testbench (see Fig. 1.16) includes BBM/EM units of the OBC and PCDU. It does not provide redundancy for the OBC units. The development of this testbed is described in more detail in Sect. 9.5.
The tests requiring OBC redundancy and certain reconfiguration functionality of the PCDU were performed exclusively in the FM FlatSat setup of the project (see Figs. 1.17 and 9.12), since some prerequisite features were not yet available in the PCDU EM.
Functional verification matrices in the tables presented on the following pages provide a brief overview of all tests and show which kind of tests were performed in which particular setup.
Test types:
Q = qualification by the unit supplier
BB = Breadboard Model test at IRS
EM = EM qualification test at IRS
FM = FM qualification test at IRS
Test setups:
• Test at supplier premises: supplier-specific test infrastructure
• Satellite Test Bed (STB) in configurations 1 and 2, as described in Sect. 9.5
• FlatSat, as described in Sect. 9.5
• Test configuration for Thermal Vacuum (TV) chamber
• Test configuration for vibration tests on the shaker

9.3.1 PCDU Tests

The Power Control and Distribution Unit was completely qualified on supplier side. To prove compatibility with the CDPI system, an extensive series of communication tests with the OBC was performed at IRS (Table 9.1). Since the PCDU is an essential element in the overall concept of the CDPI system, it was also significantly involved in the reconfiguration tests for the entire system—see Table 9.7.

Table 9.1 PCDU FV matrix

PCDU                                         Supplier   STB conf.1   STB conf.2   FlatSat   Shaker   TV chamber
Electrical qualification tests               Q          –            –            –         –        –
Shaker tests                                 Q          –            –            –         –        –
Thermal vacuum tests                         Q          –            –            –         –        –
Preliminary software tests                   Q          –            –            –         –        –
Initial power-up                             Q          –            EM           FM        –        –
Communication tests with OBC:                –          –            EM           FM        –        –
  cmd./ctrl. interfaces and HPC interfaces
9.3.2 Processor-Board Tests

The FM Processor-Boards were tested during the FlatSat campaign, whilst the EM was used to run test software with the connected satellite simulation environment, the STB. The FM boards were operated exclusively under cleanroom conditions for the FlatSat tests (Table 9.2). It is envisaged to use the Satellite Test Bed with the Processor-Board EM later for system simulations on ground during Phase E of the mission, as well as for pretests of OBSW patches before uplinking them to the spacecraft in orbit.

Table 9.2 Processor-Board FV matrix


Processor-Boards                      Supplier   STB conf. 1   STB conf. 2   FlatSat   Shaker   TV chamber
Initial power-up – EM EM FM – –
Characterization of electrical parameters: – EM EM FM – –
Power consumption, inrush current at
power-up, etc
Test boot process from FRAM – EM EM FM – –
Test reboot process from SRAM – – EM FM – –
Test PPS output – – EM FM – –
Test PPS Input – – EM FM – –
Test FPU functionality – EM EM FM – –
Test debug I/O functionality – EM EM FM – –
Test of SpaceWire ports 1–4 – EM EM FM – –

Thermal qualification Q – – – – –

Mechanical qualification: – – – – FM –
FM qualification in the frame of the overall
satellite vibration and shock test campaign

9.3.3 Power-Board Tests

Tests for the Power-Boards were focused on verifying the requirements to be met
by each Power-Board. The most critical phase regarding voltage control is during
the startup process of the connected OBC data handling boards. More details on
this topic are provided in Sect. 5.2. Further tests were conducted to verify signal
conversion of the GPS PPS signals and STR PPS and their routing to an external
OBC connector (please see Chap. 5). Dedicated unit shaker tests were performed only with the EM in order to avoid damaging flight hardware (Table 9.3).

Table 9.3 Power-Board FV matrix


Power-Boards                          Supplier   STB conf. 1   STB conf. 2   FlatSat   Shaker   TV chamber
Test supply voltage for Processor-Board – – EM FM – –
Test supply voltage for I/O-Board – – EM FM – –
Test supply voltage for CCSDS-Board – – EM FM – –
Test priority circuit function – – EM FM – –
Test power for OBC heaters – – EM FM – –
Test conversion of PPS signal – – EM FM – –
Test signal forwarding for debug output – – EM FM – –

Shaker test – – – – EM –

Thermal Qualification: – – – – – FM
FM qualification in the frame of the
overall satellite TV test campaign

9.3.4 CCSDS and I/O-Board Tests

The PCBs for the I/O-Boards and CCSDS-Boards are based on the same design, and the hardware of both board types is manufactured by 4Links Ltd. Therefore, basic hardware qualification for both boards was performed on the supplier side, except for environmental tests. This environmental qualification was performed by IRS in the frame of the overall mechanical and thermal spacecraft test campaign (Tables 9.4, 9.5).

Table 9.4 CCSDS-Board FV matrix


CCSDS-Boards                          Supplier   STB conf. 1   STB conf. 2   FlatSat   Shaker   TV chamber
Test IP core upload Q – EM – – –
Test SpaceWire interface Q BB EM FM – –
Test RMAP access to memory Q BB EM FM – –
Test HPC forwarding – BB EM FM – –
Test telecommand decoding – BB EM FM – –
Test telemetry encoding – BB EM FM – –

Mechanical qualification: – – – – – FM
FM qualification in the frame
of the overall satellite vibration
and shock test campaign

Thermal qualification: – – – – – FM
FM qualification in the frame
of the overall satellite TV test
campaign

Table 9.5 I/O-Board FV matrix


I/O-Boards                            Supplier   STB conf. 1   STB conf. 2   FlatSat   Shaker   TV chamber
Test IP core upload Q BB EM – – –
Test SpaceWire interface Q BB EM FM – –
Test RMAP and RMAP error codes Q BB EM FM – –
Check RMAP access to NVRAM Q – EM FM – –
Check RMAP access to SRAM Q – EM FM – –
Check RMAP access to transmission Q BB EM FM – –
buffer
Test RS422 interface Q BB EM FM – –
Test LVDS interface Q – EM FM – –
Test IIC interface Q – EM FM – –
Test FOG interface Q – EM FM – –
Test logical interface Q – EM FM – –

Mechanical qualification: – – – – – FM
FM qualification in the frame
of the overall satellite vibration
and shock test campaign

Thermal qualification: – – – – – FM
FM qualification in the frame of the
overall satellite TV test campaign

For preliminary tests Breadboard Models were used while assembling the
Satellite Test Bed infrastructure. The BBM for the CCSDS-Board was provided by
Aeroflex Gaisler. 4Links provided the BBM for the I/O-Board in the form of a
partially populated PCB. Later in the project a full I/O-Board EM was supplied by
4Links.

9.3.5 OBC Subsystem Tests

The OBC subsystem was assembled from the various delivered boards plus the OBC internal harness in the cleanroom at IRS premises. Therefore, it also had to be electrically and functionally qualified by the IRS team. This qualification was performed within the FlatSat test campaign. Board and unit thermal qualification could also be performed in-house, while vibration and shock tests were performed only externally, in the frame of the shaker tests of the completely integrated satellite, which reduced stress on the CDPI components (Table 9.6).

Table 9.6 OBC subsystem tests FV matrix


OBC subsystem                         Supplier   STB conf. 1   STB conf. 2   FlatSat   Shaker   TV chamber
Electrical verification – – EM FM – –
Test SpaceWire/RMAP access to – – EM FM – –
I/O-Board
Test SpaceWire RMAP access to – – EM FM – –
CCSDS-Board
Test access to debug interface – – EM FM – –
Test OBC service interface – – EM FM – –
Test PPS synchronization – – – FM – –
Test spacecraft units electrical integration – – – FM – –
with OBC

9.3.6 CDPI Reconfiguration Tests

The CDPI reconfiguration test series was targeted at proving the redundancy management of the CDPI system. The reconfiguration functionality is controlled by the CDPI Common-Controller, which in this implementation is the processor of the PCDU. The test series included basic switching between single nominal and redundant boards, both by externally commanded HPCs and by automatic routines of the PCDU. The final test was an artificially created, critical outage of one Processor-Board. To pass the test, the system had to recover automatically to an operational configuration within specified parameters (Table 9.7).

Table 9.7 CDPI reconfiguration FV matrix


CDPI reconfiguration                  Supplier   STB conf. 1   STB conf. 2   FlatSat   Shaker   TV chamber
Test switching nom./red. I/O-Board – – – FM – –
Test switching nom./red. Processor-Board – – – FM – –
Test HPC forwarding and execution – – – FM – –
Test automatic reconfiguration by PCDU – – – FM – –
Complete CDPI recovery test – – – FM – –

For the overall test campaign a distinction between Engineering Model tests (EM tests) and Flight Model tests (FM tests) had to be made. Both groups of tests involved different hardware and had completely different objectives.
The objective of the EM tests was to functionally verify the system itself. This covered interface tests and the first assembly of all EM parts, where complexity was subsequently added until one non-redundant OBC EM was complete and all interfaces were tested and working as desired. This EM of the OBC is powered by an external power supply, not via Power-Boards, and therefore does not represent the complete system as it is used within the satellite. Such limitations were the reason why only functional tests could be performed on the STB; all tests regarding power supply and redundancy were left out. After the tests had passed, the EM was used to conduct simulation tests with the real-time simulator, i.e. attitude control test scenarios, and it will also be used during the mission to verify software updates.
The focus of the FM tests was on reliability aspects and CDPI redundancy management. Compared to the EM tests, the group of tests for the FM followed a different path. The main objective was to qualify the FM system for space. This means the entire system, including the Power-Boards, had to be assembled and tested thoroughly under clean-room conditions. In this setup, tests like OBC reconfiguration between nominal and redundant Processor-Board and I/O-Board were performed interactively with the PCDU. Nevertheless, the FM OBC had to undergo interface tests similar to those conducted for the EM, but for the FM additional safety aspects came into play.

9.4 EM Testbench Infrastructure

The Satellite Test Bed described in Sect. 9.2 is schematically depicted in Figs. 9.2 and 9.3. The first line in Fig. 9.3 represents the command/control infrastructure, which is representative of what will later be used in the satellite ground station. All parts of the OBC are depicted in the second line. The third line shows a simulation environment for OBSW verification.
One part of the command/control infrastructure is the ESA Mission Control System SCOS-2000, shown in the middle of the top line. It supports telecommand/telemetry data handling, visualization and packet processing and is applied as a standard tool in ESA and DLR satellite project ground infrastructures. In order to convert TC and TM packets into formats suitable for lossless transmission, a Telemetry and Telecommand Frontend is implemented which performs the transformation of TC packets to Command Link Transmission Units (CLTU) and, vice versa, of Channel Access Data Units (CADU) back to TM packets for the ground.

Fig. 9.2 STB with complete OBC in the loop. © Jens Eickhoff [9]
Fig. 9.3 Testbench schematic. © IRS, University of Stuttgart [87]

The procedure execution engine MOIS from RHEA Group is integrated to execute sequences of commands, so-called Flight Procedures. The Onboard Computer is connected to this environment via cables which bypass the radio transmission link. Fictitious transmission errors can be artificially injected into the TM/TC-Frontend in order to test the onboard forward error correction.
Figure 1.16 in Chap. 1 ‘‘The System Design Concept’’ depicts the real setup.
The screens on the left belong to the command/control infrastructure. Onboard
Computer and simulator are located in the rack on the right side. On-board
equipment can be added onto the grounded table on the very right of the STB setup.
The testbench needed to comply with several requirements:
• Correct grounding is essential to prevent both ground loops and unintended
electrostatic discharging. It was necessary to carefully design a grounding
concept for all mounted units which was verified by measurements.
9 CDPI System Testing 185

• The power supply for the testbench electronic components in the rack had to be independent of primary power fluctuations, comprising both blackouts and overvoltages. An uninterruptible power supply was integrated to act as a buffer between primary network power and the consumers.
• Due to the OBC Processor-Boards falling under ITAR regulations, access to OBC components had to be restricted to authorized personnel. However, full setup configuration for all non-ITAR elements of the testbench was still required, as was quick access for installation and de-installation of both OBC EM components and equipment.
All requirements together led to a rack-based design solution. Based on a UPS and a remote power switch, the power supply fulfills all requirements. The rack itself provides a star point for correct grounding. The rack can be locked. As the rack frontend consists only of one plug for mains power supply and one Ethernet connector, it is simple to move. In order to keep the temperature at an acceptable level, a temperature-controlled ventilation is installed at the top side of the rack.

9.5 EM Test Execution and Results

The testbed in Fig. 1.16 for the performed EM tests is, in its full deployment, called a Satellite Test Bed or STB (see also [9]). In an initial configuration, CDPI component breadboard versions were tested (Fig. 9.4). In the upgraded setup 2, the EM hardware was tested (still non-redundant). Full CDPI redundancy and reconfiguration testing was performed later on the FLP satellite FlatSat setup (see Sect. 9.6).

Fig. 9.4 EFM testbed with S/C equipment hardware connected to OBC. © Jens Eickhoff [9]

Both the I/O-Boards and the CCSDS-Boards of the OBC are FPGA based. The IP core of each board was updated several times during the test program in order to install new or corrected functionality. The FPGAs are programmable via a JTAG interface. The JTAG pins are located on the 100-pin Micro-D connectors E of each board, as described in Sect. 3.8 and in the corresponding annex tables.

9.5.1 STB Tests Stage 1: Connecting OBC Processor-Board and S/C-Simulator

For stage 1 the connection between the OBC Processor-Board EM and the Realtime Simulator (RTS) was established. For this purpose a SpaceWire router manufactured by 4Links was used. The RTS uses a standard Ethernet connection for communication with the 4Links SpaceWire router. On the other side, the SpaceWire router was connected to the same SpaceWire port of the OBC Processor-Board that the I/O-Board would normally use. This is why either the RTS or the real I/O-Board can be used, but not both in parallel. The RTS holds a simulated Front-End which acts as a replacement for the actual I/O-Board (Fig. 9.5).

Fig. 9.5 Connecting Processor-Board and S/C simulator. © IRS, University of Stuttgart

9.5.2 STB Tests Stage 2: Connecting OBC Processor-Board and CCSDS-Board

After connecting the RTS to the OBC Processor-Board it was possible to conduct simulation tests focusing on the device handling part of the OBSW. However, to reasonably control and monitor the OBSW, telecommanding and telemetry handling are required. Therefore, as the next step, the CCSDS-Board was taken into operation (Fig. 9.6).
Fig. 9.6 Connecting Processor-Board and CCSDS-Board. © IRS, University of Stuttgart

The preliminary OBSW used for this stage had to configure CCSDS-Board parameters such as link encoding and decoding settings, the Spacecraft Identifier, symbol rates and clock dividers. TC reception buffers were also configured by this OBSW release.

9.5.3 STB Tests Stage 3: Entire Command Chain Bridging the RF Link

The test infrastructure concept was to conduct all tests with a command/control infrastructure that will also be used during the mission in flight. The only hardware that cannot be used within a laboratory environment is the Radio Frequency (RF) equipment. To bridge the RF link in the test setup, a bypass is used. The uplink input for the bypass line is the clock-driven synchronous signal that will later be the input for the baseband up- and down-converters. This signal corresponds to the line signal from the onboard receiver to the OBC's CCSDS-Boards. Similarly, the downlink signal on the bypass line is the encoded output of the CCSDS-Boards, which would normally be routed to the spacecraft's transmitters. In the bridge it is directly connected to the telemetry decoding frontend of the ground infrastructure. The connection is physically realized by an RS422 interface.
This chain test served for:
• Testing High Priority Command reception and routing (RS422)
• Testing command reception using OBSW
• Testing telemetry generation by the OBSW
• Testing TM Idle Frame generation and reception on ground

Figure 9.7 shows all involved components and CCSDS layer conversions [24]. The ground control system, i.e. SCOS-2000, operates on the packetization layer. These packets are forwarded to and returned from the TM/TC-Frontend via TCP/IP using the facility LAN. The TM/TC-Frontend can be operated either standalone or in combination with radio frequency up- and down-converters as an RF-SCOE for RF tests. For this stage of the EM tests only the TM/TC-Frontend was used. It handles the conversions between the packetization layer and the coding layer. On the onboard side these conversions are partly done by the CCSDS-Board and partly by the OBSW.

Fig. 9.7 TM/TC chain overview. © IRS, University of Stuttgart

Since the bypass line is a synchronous connection, the correct interpretation of the clock signal is mandatory for this link. If the clock signal on the transmitting side is valid on the rising edge, it has to be read out by the receiver on the rising edge, too. Otherwise the interpretation breaks down and data corruption will occur. Both interfaces as well as the cable bridge have proven to be compatible. TC frames that were sent to the CCSDS-Board were stored with full integrity in the TC buffer. From this point onwards it is the responsibility of the OBSW to manage this buffer and to forward the packets to their designated PUS terminal.
To test the TM link direction, generic TM frames were created by the OBSW and forwarded to the CCSDS-Board's memory. The TM stream conversion is part of the functionality of the CCSDS-Board. The packets were then extracted by the TM/TC-Frontend and forwarded to the Mission Control System, where they can be prepared for display (Fig. 9.8).

Fig. 9.8 Entire command chain bridging the RF link. © IRS, University of Stuttgart

The OBSW is responsible for configuring the CCSDS-Board. It can configure e.g. the TM transmission rate and other mission-specific parameters needed for the generation of TM frames as well as for the correct interpretation of TC frames. The corresponding details can be found in Chap. 4.

9.5.4 STB Tests Stage 4: Verify High Priority Commanding

High Priority Commands (HPCs) are an essential means of the safety concept of the spacecraft system design. With HPC type 1 commands, certain emergency shutdown and reconfiguration functions can be commanded from ground without any OBSW running on board the spacecraft. Therefore the CCSDS-Board functionalities which directly forward HPC1 packets to the CDPI Common-Controller in the PCDU were tested. The command concept in this implementation foresees the identification of HPCs by Virtual Channels (see Fig. 1.11) and not by MAP-ID (see also [10]).
Received HPCs are submitted directly from the CCSDS-Board to the PCDU. This is why an EM of the PCDU was required for this test. The PCDU EM was placed in the hardware test frame right next to the STB rack (see Fig. 1.16). It was directly connected to the HPC UART ports of the CCSDS-Board (Fig. 9.9).

Fig. 9.9 Verification of high priority commanding. © IRS, University of Stuttgart

9.5.5 STB Tests Stage 5: Commanding Equipment Unit Hardware

For commanding actual hardware devices, the preliminary version of the OBSW needed to provide at least rudimentary device handling capability. Once this requirement was met, the I/O-Board could be integrated into the STB, followed by a series of tests that first verified the proper SpaceWire address mapping of each device. Furthermore, these tests served to check whether the I/O-Board input and output signals were interpreted and generated correctly. In the STB each device interface was then tested by connecting an engineering model of the corresponding spacecraft equipment hardware to the respective I/O-Board interface.
With the possibility of connecting actual EM devices to the I/O-Board, the assembly of the STB was completed. The OBC EM could now be operated from the Mission Control System while the OBC was connected either to the RTS for simulation of scenarios or to real EM hardware for interface tests. In summary, the following types of tests were performed:

• Test of high priority commanding with PCDU
• Test of regular commanding with PCDU
• Test of each UART interface
• Test of IIC bus communication with MGT EM electronics
• Test of communication with FOG EMs

The final STB setup is depicted in Fig. 9.10. This setup is referred to as "STB Configuration 2" in the test matrices in Sect. 9.3 and will remain unchanged until the end of the mission.

Fig. 9.10 Commanding simulated S/C equipment and hardware in the loop. © IRS, University of Stuttgart

9.5.6 STB Tests Stage 6: Performance Tests with Attitude Control Software

A further type of test concerns some early performance tests which were run to evaluate the CPU load induced by the Onboard Software in realistic load cases, i.e. cases more complex than small channel access tests, boot or debug test cases. The tests were run on the STB with the EM OBC connected to the spacecraft simulator running attitude control scenarios for the FLP target satellite. The Attitude Control System (ACS) software implements preliminary control functions which were designed in Simulink, converted to C++ and integrated into the OBSW framework.
All tests showed good results, with only minor control precision errors due to the early development stage of the OBSW. The attitude control system provides a good example of the numeric load tasks which the system has to cope with during operation. The ACS controller software represents one application module in the OBSW framework and accesses diverse equipment handlers for all the ACS sensors and actuators. In addition to pure data handling functions like TC handling and TM generation in the OBSW, the ACS also performs significant floating-point calculations and thus complements the scope of exercised processor functionality of the LEON3FT, namely the extensive use of the processor's floating point unit.
After successful debugging of the ACS test scenario, a performance analysis was conducted in order to estimate the margins of available processing time. For this test, the CPU time of the ACS task was measured. A design guideline for the overall system was to keep the CPU time for the ACS below 33 % of the overall CPU load. Figure 9.11 shows a CPU load result summary from multiple test scenarios with the ACS subsystem controlling the satellite in its different operational modes (x-axis). As can be seen, even normal target pointing mode consumes only slightly more than 20 % of CPU load thanks to the high CPU performance of the LEON3FT. Only with an additional numeric filter activated is the limit reached in the worst case. The blue bars for device handling represent the CPU load for access to ACS sensor and actuator equipment. Since I/O access timings are fixed in a common polling sequence table over all operational modes of the software, this load is not influenced by the complexity of the ACS computations.

Fig. 9.11 Performance test results example. © IRS, University of Stuttgart



9.6 FM Test Execution and Results

All FM components have to be operated and stored under clean room conditions.
To test these components, a so-called FlatSat assembly was built up in the IRS
clean room. This assembly in its full deployment consists of all FM components of
the satellite. For the FlatSat assembly, all satellite components are electrically
connected together on a table to check the proper operability of the complete
satellite system. It serves as a final interface test environment before equipment is
integrated into the structure of the satellite. The FlatSat starts out with the CDPI
system units, the OBC and the PCDU plus a power supply replacing the Solar
Panels, and expands until the complete S/C system is assembled.
Figure 9.12 provides an outline of the FlatSat assembly of the FLP project. For a better overview, the figure illustrates only the subset of the overall equipment assembly that is relevant for the CDPI testing, including the functional connections of the main technical units. The Command and Data Handling subsystem and the Power Subsystem are shown with their distinct subunits. All remaining subsystems that are composed of multiple component boxes are illustrated in an abstracted view. An additional simplification is achieved by illustrating only one side of the redundant component sets and consequently leaving out the cross-strappings.

Fig. 9.12 Functional overview of the FlatSat assembly. © IRS, University of Stuttgart

The EGSE that is necessary for the control of the simulator and the FlatSat
assembly incorporates control applications for:
• Flight procedure execution which again was realized by a second instance of
MOIS supplied by RHEA Group

• Mission control realized again via SCOS-2000 provided by ESA/ESOC
• Simulating the transmission between ground and satellite by a Radio Frequency Special Check-Out Equipment (RF-SCOE) from Satellite Services B.V.
• Uploading and patching the OBC S/W via an OBC configuration PC.
So, except for additionally providing RF-link test functionality, the FlatSat EGSE could be kept almost identical to the command and control infrastructure used for the STB. This was a strategic decision to avoid running EM and FM tests with different infrastructures. By this means a maximum similarity of test setups and procedures between the testbenches could be achieved and, as a consequence, also a maximum comparability of the test runs and results. The used RF-SCOE consists of two parts: a TM/TC-Frontend as in the EM STB covers the transmission coding and decoding functionality, and it is complemented by a modulation/demodulation unit which prepares commands for transmission and receives telemetry packets.
The performed system tests with the completely assembled OBC and the CDPI reconfiguration tests can be taken from the test matrices presented in Sect. 9.3.
Chapter 10
The Research Target Satellite

Hans-Peter Röser and FLP Team

10.1 Introduction

The FLP is the first in a series of planned satellites implemented in the frame of the SmallSat Program of the IRS at the University of Stuttgart, Germany. It is being developed and built primarily by PhD and graduate students, funded by the major industry partner Astrium (Astrium Satellites GmbH and its subsidiary TESAT Spacecom), by space agency and research facilities (German Aerospace Center, DLR) and by other contributing universities. The project is financed by the federal state of Baden-Württemberg, the university and the industry partners. The current development status is assembly Phase D, with a functional test program on the FlatSat testbench and the qualification of a structural/thermal model of the flight structure. The flight hardware units have been built at IRS or have been procured, and unit as well as system and OBSW tests are still ongoing.


A main project goal for the industry partners is to qualify electronic components for space, in particular the elements of the innovative, integrated OBC/PCDU infrastructure CDPI. For the university the goal is to establish the expertise and infrastructure for development, integration, test and operations of satellites at the IRS. The project is also used to improve the education of students by providing the opportunity for hands-on experience within a challenging space project. Once in orbit, the satellite shall be used to demonstrate new technologies and to perform Earth observation.
The FLP is three-axis stabilized and features target pointing capabilities. The
total pointing error during one pass is less than 150 arcsec and the pointing
knowledge is better than 7 arcsec. To achieve these values, state of the art star
trackers, magnetometers and fiberoptic gyros as well as GPS receivers are used to
measure the attitude. Reaction wheels and magnetotorquers are used as actuators.
As an asset, the star trackers can also be used in camera mode to search for both
Inner Earth Asteroids and Near Earth Asteroids. The FLP is not equipped with any
means for propulsion or orbit control (Table 10.1).

Table 10.1 Satellite characteristics

Dimensions        60 × 70 × 80 cm³
Mass              130 kg
Launch type       Piggy-back (secondary payload)
Desired orbit     Circular, polar
Orbit altitude    500–800 km
Attitude control  3-axis stabilized
Communications    S-Band
Solar panels      3 (2 deployable)

10.2 Orbit and Operational Modes

The satellite has been designed for a circular Sun-Synchronous Orbit (SSO) with a Local Time of Descending Node (LTDN) between 9:30 and 11:00. As the operational lifetime of the satellite is targeted to be two years and the satellite should not stay in orbit for more than 25 years after end of life (considering the European Code of Conduct on Space Debris Mitigation), the desired orbital altitude is between 500 and 650 km. For de-orbiting after operational use the satellite is equipped with an experimental De-Orbiting Mechanism (DOM) from Tohoku University, Japan.

10.3 Mechanical Design and Launcher Interface

The FLP is a cuboid with two deployable solar panels. It has an estimated total
mass of less than 130 kg. The exact mass cannot be provided before completion of
the flight harness manufacturing.

Fig. 10.1 FLP in-orbit simulation and mechanical configuration. © IRS, University of Stuttgart

Figure 10.1 shows the satellite configuration with deployed solar panels. Its main dimensions during launch with undeployed solar panels are 600 by 702 by 866 mm, as depicted in Fig. 10.2. An adapter ring to the launcher adapter is installed on the satellite. The depicted variant was designed to be compliant with the PSLV piggy-back separation adapter. For alternative launch vehicles this can still be adapted accordingly. It should be noted that the deployable De-Orbiting Mechanism (DOM) is located inside the launch adapter ring.
The structure of the FLP is designed to be a hybrid structure. The lower part is
made of integral aluminum parts and the upper part, where the optical payloads are
installed, consists of carbon-fiber reinforced sandwich structures which provide a
more stable alignment of the cameras due to their low thermal expansion. The
Thermal Control System (TCS) of the satellite consists of several temperature
sensors and heaters inside the satellite as well as Multi Layer Insulation and
radiators on the outside. No active cooling system is used.
Fig. 10.2 FLP satellite launch configuration (dimensions in mm; the De-Orbiting Mechanism plunges into the launch adapter). © IRS, University of Stuttgart



10.4 Technology and Payloads

As mentioned above, the purpose of the FLP is to demonstrate new technologies. The main asset—although it is not a payload—concerns the Combined Data and Power Management Infrastructure (CDPI), namely the OBC based on a LEON3FT processor, SpaceWire-driven I/O-Boards which serve as a communication interface for the Processor-Boards, full CCSDS/PUS protocol based TC/TM via CCSDS-Decoder/Encoder-Boards, and the OBC reconfiguration performed by the CDPI Common-Controller in the PCDU. The international partner consortium for the architecture consists of Astrium GmbH Satellites (Germany), Aeroflex Colorado Springs Inc. (USA), Aeroflex Gaisler AB (Sweden), 4Links Ltd. (UK), HEMA Kabeltechnik GmbH & Co. KG (Germany) and Vectronic Aerospace GmbH (Germany).
On instrument side the FLP is equipped with an Optical Infrared Link System
(OSIRIS) to demonstrate high speed downlink capabilities using an optical ter-
minal which was developed by DLR. Furthermore, a new reconfigurable FPGA
architecture is implemented as payload controller, the Payload On-Board Com-
puter (PLOC). Besides that, three GPS sensors shall be used to precisely determine
the position and the attitude of the satellite (GENIUS experiment).
The main payload is the Multi-spectral Imaging Camera System (MICS), which
consists of three separate cameras, using filters for green, red and near-infrared
spectral ranges. It is used for multi-angle and multi-spectral imaging of the Earth
with a ground sample distance of approximately 25 m and a swath width of
roughly 25 km. An interesting application of the MICS is to determine the Bidi-
rectional Reflectance Distribution Function (BRDF) of certain features of the
Earth’s surface.
The MICS is additionally used in a ship observation experiment. For this
experiment the satellite is equipped with an Automatic Identification System (AIS)
to receive ship transponder signals for navigation and ship traffic control. The AIS-
Receiver is developed and built at the Institute of Space Systems of DLR Bremen.
By mapping the AIS information over the optical data of the MICS the satellite
based observation of ship traffic shall be studied and improved.
Another camera with a wider field of view is used to get a better overview of
the region observed by the MICS. This Panoramic Camera (PAMCAM) is based
on a commercial off-the-shelf product and has a swath width of approximately
200 km. The large amount of image data is stored and handled in the PLOC and
can be transmitted to the ground station using a High Frequency Amateur (HAM)
radio S-Band downlink system featuring a custom designed directional antenna.
The last payload implemented in the Flying Laptop is the De-Orbiting Mechanism (DOM), included to meet the 25-year maximum orbital lifetime proposed by the United Nations Inter-Agency Space Debris Coordination Committee (IADC) in order to reduce the buildup of orbital debris. At the end of the satellite lifetime the DOM releases a large-area foil shield which generates a higher interaction with the residual atmosphere in orbit, resulting in a negative ΔV and thus a faster reentry of the satellite.

10.5 Satellite Attitude Control System

The Attitude Control System (ACS) of the FLP satellite and its algorithms are fully
defined, developed and were tested as described in [89, 90] in a Matlab/Simulink
environment. The ACS key features are:
• Damping of rotational rates after its separation from the launch-vehicle or in
case of emergency.
• A safe-mode in which power supply is ensured by utilizing reliable equipment
only.
• Coarse attitude determination by using reliable sensors.
• The capability of pointing the satellite to any given target with an absolute
pointing accuracy of 150 arcsec.
• A Kalman Filter for increased accuracy of rate and attitude measurements.
• Propagation of orbit and magnetic field models.
• State estimation and rating of sensor data.

10.5.1 Sensors and Actuators

Table 10.2 shows all sensors of the FLP satellite that are used for the attitude control system: two redundant magnetometer systems (MGM), four fiberoptic gyros (FOG), eight sun sensors (SUS) distributed all over the satellite, a GPS system with three receivers and antennas, and a high precision star tracker (STR) system with two camera units.

Table 10.2 Sensors overview

                     MGM                 FOG                 SUS                GPS                       STR
Output               Magnetic field      Rotational rate     Solar cell         Position (3 × 1),         Inertial quaternion
                     vector (3 × 1)      (scalar)            current (scalar)   velocity (3 × 1)          (4 × 1)
Sensors              2                   4                   6                  3                         2
Unit                 (T)                 (°/s)               (A)                (m), (m/s)                (–)
Resolution/accuracy  5 nT (LSB)          2·10⁻⁶ °/s (LSB)    50 mA              10 m, 0.1 m/s             5 arcsec
Control rate         1.5 Hz/3 Hz/6 Hz    10 Hz               10 Hz              1 Hz                      5 Hz
Connection type      RS422               FOG IF (PCDU)       RS422              RS422                     RS422
Manufacturer         ZARM                NG-LITEF            Vectronic          DLR                       DTU
                                                             Aerospace

Table 10.3 gives the key parameters of the FLP satellite’s actuator system
which is composed of three magnetotorquers (MGT) and four reaction wheels
(RWL).

Table 10.3 Actuators overview

                 MGT                           RWL
Input            Magnetic dipole moment        Rotational rate
Quantity         1 unit (3 redundant coils)    4 reaction wheels
Unit             (Am²)                         (Nm)
Control rate     10 Hz                         10 Hz
Connection type  IIC                           RS422
Manufacturer     ZARM                          Rockwell Collins

10.5.2 Control Modes

The FLP satellite control system can be operated in six different modes (see Figs. 10.3 and 10.4).

Fig. 10.3 FLP mode transitions. © IRS, University of Stuttgart

Fig. 10.4 Satellite control modes. © IRS, University of Stuttgart



De-tumble and Safe Mode
After separation from the launcher the FLP satellite will be spinning with rotational rates |ωB| of up to 10 deg/s. For that situation and in case of an emergency, the de-tumble mode is designed to reduce the rotational rate to a threshold of |ωB| = 0.4 deg/s by using only magnetotorquers as actuators.
The satellite will then be switched into safe mode, where it orients the negative z-axis (cf. Fig. 10.1) to the sun and builds up a spin of |ωB| = 2 deg/s around the z-axis to stabilize this orientation. This mode is intended to provide a safe state in which power supply is secured and only a minimum of reliable sensors and actuators is used.
The safe and de-tumble modes are the fallback modes of the satellite. They have to bring it to a safe state and have to be designed to be very robust in order to prolong the satellite's survival time in case of an onboard failure.
Idle Mode
The idle mode is designed for recharging the batteries during normal operation, when the FLP satellite is idle with respect to ground commanding or between scheduled observations of the loaded mission timeline. In this mode the satellite's 3-axis control with the according sensors and actuators is operational.
Operational Pointing Modes
To perform any payload operation task the FLP satellite has three different pointing modes. The first one is designed to align the satellite's coordinate system with the inertial coordinate system, so this mode is called Inertial Pointing Mode. For this mode no information about the current position is required, since the inertial reference vector does not depend on it. In contrast, the other two pointing modes, Nadir-Pointing and Target-Pointing, do need satellite position information, since the direction to nadir (the Earth's center) or to any given target depends on the satellite's current position. So target pointing and nadir pointing are essentially the same with respect to the attitude control laws, except for a fixed target t_nadir = (0, 0, 0)^T in Nadir-Pointing Mode.
Figure 10.3 shows all possible satellite mode transitions, the commanded ones being depicted with bold lines, while automatic transitions (in most cases induced by detected onboard problems) are represented by dotted lines.

10.6 Satellite Communication Links

For the communication with the ground stations, the TT&C system uses omnidirectional antennas to receive telecommands and transmit telemetry in commercial S-Band. All TM/TC packets are encoded in standard protocols according to the Consultative Committee for Space Data Systems (CCSDS). Payload data downlink is also handled in the S-Band range, on an amateur radio frequency with a separate onboard transmitter.
Fig. 10.5 FLP Satellite electrical block diagram. © IRS, University of Stuttgart

10.7 Satellite Electrical Architecture and Block Diagram

The satellite is powered by three solar panels equipped with triple junction GaInP2/
GaAs/Ge solar cells. On the center panel, there is a further string of more advanced
triple junction cells, of which the on-orbit performance shall be verified.
The maximum achievable power is estimated to be about 230 W. The power
system is controlled by a Power Control and Distribution Unit (PCDU), which
includes the Battery Charge Regulators and provides an unregulated bus voltage of
19–25 V to the instruments. A battery assembled from off-the-shelf Lithium-Iron-
Phosphate cells is used to provide electrical power during eclipse periods
(Fig. 10.5).
To deploy the solar panels after launch, no pyrotechnics are used. Instead, a
melting wire mechanism has been developed by IRS. For this mechanism, the bolts
retaining the solar panels during launch are attached to the satellite using a split
pod, which is held together by the melting wire. After the separation from the
upper stage of the launcher, the wire is melted by heat resistors and the panels are
deployed.
The De-Orbiting Mechanism is stowed within the launch adapter ring until its
deployment at the end of the satellite mission. It is a flat square sail and is
deployed using a non-explosive bi-metal switch.

10.8 EMC Launcher and Primary Payload Compliance

For the launch supplier it is essential that the satellite is designed to be launched in off-mode and will boot only after the launcher separation straps are cut. The Launch and Early Orbit Phase (LEOP) autosequence will then be performed automatically. Thus, during the launch the satellite will not emit any RF spectrum.
Chapter 11
CDPI Assembly Annexes and Data Sheets

11.1 Processor-Board DSU/Ethernet Interface Card

Overview
For debug and test of the OBC Processor-Boards—both EM and FM models—a
small PCB card was designed by Aeroflex Colorado Springs to interface to the
LEON3FT DSU and Ethernet ports, respectively. The DEI card is also valuable for engineers to verify that the Processor-Boards have arrived unharmed at end-user premises.
The DEI card is used for essentially three purposes:
• Resetting the SBC:
During software development, code running on the processor is typically not
mature and therefore crashes can and do quite often occur. The external reset is
valuable and allows the user to reset the processor without cycling power.
This feature was extensively used during OBSW testing with the OBC EM on
the Satellite Test Bed. Please refer to Figs. 1.16 and 9.3.


• DSU Interface:
The LEON3FT has a Debug Support Unit that gives the user the ability to
modify registers, load programs and generally interface to the LEON3FT using
simple commands running out of a DOS shell.
This functionality has been used with both EM Processor-Board in STB as well
as with the complete OBC FM in the FlatSat environment. Please refer to Figs.
1.16 and 9.12.
• Ethernet Interface:
The single ended Ethernet signals are routed to the 44 pin connector on the SBC
and these signals are connected to an Ethernet Phy device on the DEI. The
Ethernet connector on the DEI is then used to connect to the Ethernet port on the
LEON3FT of the SBC.
DEI Connectors
There are four connectors on the DEI card. Type and function are as follows
(please also refer to Fig. 11.1):
• 44 Pin D-Sub Female:
Used to interface to the SBC and contains all of the signals to power, reset and
interface to Ethernet and the DSU.
• Ethernet Connector:
Interface to any 10/100 Mb Ethernet network.
• DSU Molex:
Standard 14 pin connector used to interface to the LEON3FT DSU using a
Xilinx JTAG Interface pod.
• Nine pin D-Sub Female Connector:
3.3 V Power and Ground input from any bench power supply.
Block Diagram and Dimensions
Figure 11.1 shows the general layout of the DEI card as well as the dimensions.
The card is designed to connect to the SBC with or without a cable.

Fig. 11.1 DSU/ethernet


interface card.  Aeroflex
Ethernet
Inc.
Connector
44 Pin D-Sub Connector

DSU Molex
63.56 mm
Connector
External Reset
Switch

9 Pin D-Sub
Connector
76.27 mm
11 CDPI Assembly Annexes and Data Sheets 207

LEON3FT Ethernet Interface
A standard Ethernet Phy device is connected from the Ethernet connector on the DSU/Ethernet Interface card to the Ethernet signals on the 44-pin D-Sub.

LEON3FT DSU Interface
The signals on the 14-pin Molex connector on the DSU/Ethernet Interface card are connected directly to the 44-pin D-Sub connector. This allows the user to use the Xilinx Platform USB to connect to the LEON3FT DSU unit.

DSU/Ethernet Interface Jumper Settings
See Table 11.1.

Table 11.1 DSU/Ethernet interface card jumper settings

Jumper  Title                      Description
J4      Leon DSU EN                Leon DSU enable jumper. Connect jumper to enable the active high LEON3FT DSU enable signal
J5      JTAG reset                 Active low JTAG/DSU reset to the LEON3FT. Connect jumper to assert reset
J6      DSU break                  Active high DSU break signal. Attach jumper to enable the break
J8      Ethernet clock tri-state   Active low connection to the 25 MHz Ethernet oscillator. Attach jumper to tri-state the oscillator output
J10     DSU mode test point        Active high signal indicates if the DSU has been enabled
J11     Ethernet error test point  This signal goes high when an error is detected on the Ethernet Phy device

Power Connector
The DEI card gets power through a 9 pin D-sub connector. These are readily
available through many electronics suppliers (Table 11.2).

Table 11.2 9-pin D-Sub on DSU/Ethernet interface card: pin assignments

Pin number  I/O  Pin assignment
1           I    GND
2           I    GND
3           I    3.3 V
4           I    3.3 V
5           I    3.3 V
6           I    GND
7           I    GND
8           I    3.3 V
9           I    3.3 V

DSU and Ethernet Implementation
When implementing the DSU and Ethernet interface on the SBC, the 44-pin D-Sub female connector on the DSU/Ethernet Interface card has to be connected to the 44-pin D-Sub male connector on the OBC Processor-Board as shown in Fig. 11.2. Note that a cable may be used if a greater distance is required between the DEI card and the SBC. A Xilinx Platform USB programmer has to be used to interface to the DSU. Such devices are readily available and come with a ribbon cable that connects to the Molex 14-pin connector on the DEI card. Please refer to the LEON3FT documentation for implementing the DSU using the Xilinx Platform USB.

3.3V +/-5%
Input

DSU/Ethernet
Interface Card
9 Pin D-Sub Stuttgart SBC EM
Connector
44 Pin D-Sub Connector

DSU Molex
Connector

Ethernet
Connector

Ethernet

USB
Personal
Computer

Fig. 11.2 Using the DSU/ethernet interface card.  Aeroflex Inc.

11.2 CCSDS Conventions

11.2.1 CCSDS Field Definition

The subsequent sections present the conventions according to the Consultative Committee for Space Data Systems (CCSDS) recommendations, applying to all relevant structures of the CCSDS-Board decoder/encoder firmware and the OBC software:
11 CDPI Assembly Annexes and Data Sheets 209

• The most significant bit of an array is located to the left, carrying index number
zero, and is transmitted first.
• An octet comprises eight bits.

General conventions, applying to signals and interfaces:
• Signal names are in mixed case.
• An upper case '_N' suffix in the name indicates that the signal is active low (Table 11.3).

Table 11.3 CCSDS n-bit field definition

CCSDS n-bit field: bit 0 (most significant) | bits 1 to n-2 | bit n-1 (least significant)

11.2.2 Galois Field

Conventions according to the Consultative Committee for Space Data Systems (CCSDS) recommendations, applying to all Galois Field GF(2⁸) symbols:
• A Galois Field GF(2⁸) symbol comprises eight bits.
• The least significant bit of a symbol is located to the left, carrying index number zero, and is transmitted first (Table 11.4).

Table 11.4 Galois field GF(2⁸) symbol definition

Galois Field GF(2⁸) symbol: bit 0 (least significant) | bits 1 to 6 | bit 7 (most significant)

11.2.3 Telemetry Transfer Frame Format

The Telemetry Transfer Frame specified in [19, 26] is composed of a Primary Header, a Secondary Header, a Data Field and a Trailer with the following structures (Tables 11.5, 11.6, 11.7, 11.8, and 11.9).

Table 11.5 Telemetry transfer frame format

Transfer Frame (up to 2048 octets):
  Transfer Frame Header: Primary (6 octets), Secondary (optional, variable)
  Transfer Frame Data Field: Packet | Packet | ... (variable)
  Transfer Frame Trailer: OCF / FECF (optional, 0/2/4/6 octets)

Table 11.6 Telemetry transfer frame primary header format

Transfer Frame Primary Header:
  Frame Identification (2 octets): Version (2 bits, 0:1), S/C Id (10 bits, 2:11), VC Id (3 bits, 12:14), OCF Flag (1 bit, 15)
  Master Channel Frame Count (1 octet, 8 bits)
  Virtual Channel Frame Count (1 octet, 8 bits)
  Frame Data Field Status (2 octets, 16 bits)

Table 11.7 Part of telemetry transfer frame primary header format

Frame Data Field Status (2 octets):
  Secondary Header Flag (1 bit, 0), Sync Flag (1 bit, 1), Packet Order Flag (1 bit, 2), Segment Length Id (2 bits, 3:4), First Header Pointer (11 bits, 5:15)

Table 11.8 Telemetry transfer frame secondary header format

Transfer Frame Secondary Header (optional):
  Secondary Header Identification (1 octet): Secondary Header Version (2 bits, 0:1), Secondary Header Length (6 bits, 2:7)
  Secondary Header Data Field: custom data (up to 63 octets)

Table 11.9 Telemetry transfer frame trailer format

Transfer Frame Trailer (optional):
  Operational Control Field (optional, 0/4 octets), Frame Error Control Field (optional, 0/2 octets)

11.2.4 Reed-Solomon Encoder Data Format

The applicable standards [17, 25] specify a Reed-Solomon E = 16 (255, 223) code
resulting in the frame lengths and code block sizes listed in Table 11.10.

Table 11.10 Reed-solomon E = 16 code blocks with attached synchronization marker


Interleave Attached synchronization Transfer frame Reed solomon check
depth marker (octets) (octets) symbols (octets)
1 4 223 32
2 446 64
3 669 96
4 892 128
5 1115 160
8 1784 256

The applicable standard [25] also specifies a Reed-Solomon E = 8 (255, 239) code resulting in the frame lengths and codeblock sizes listed in Table 11.11.

Table 11.11 Reed-solomon E = 8 codeblocks with attached synchronization marker


Interleave Attached synchronization Transfer frame Reed-solomon check
depth marker (octets) (octets) symbols (octets)
1 4 239 16
2 478 32
3 717 48
4 956 64
5 1195 80
8 1912 128

11.2.5 Attached Synchronization Marker

The Attached Synchronization Marker pattern depends on the encoding scheme in use, as specified in [17, 25] and shown in Table 11.12.

Table 11.12 Attached synchronization marker hexadecimal pattern


Mode Hexadecimal stream (left to right)
Nominal 1ACFFC1Dh

11.2.6 Telecommand Transfer Frame Format

The Telecommand Transfer Frame specified in [20, 35] is composed of a Primary Header, a Data Field and a Trailer (Tables 11.13, 11.14 and 11.15).

Table 11.13 Telecommand transfer frame format

Transfer Frame (up to 1024 octets):
  Transfer Frame Primary Header (5 octets)
  Transfer Frame Data Field: Segment Header (optional), Packet | Packet | ... (variable)
  Frame Error Control Field: FECF (optional, 2 octets)

Table 11.14 Telecommand transfer frame primary header format

Transfer Frame Primary Header (5 octets):
  Version (2 bits, 0:1), Bypass Flag (1 bit, 2), Control Command Flag (1 bit, 3), Reserved Spare (2 bits, 4:5), S/C Id (10 bits, 6:15), Virtual Channel Id (6 bits, 16:21), Frame Length (10 bits, 22:31), Frame Sequence Number (8 bits, 32:39)

Table 11.15 Telecommand transfer frame segment header format

Segment Header (optional, 1 octet):
  Sequence Flags (2 bits, 40:41), Multiplexer Access Point (MAP) Id (6 bits, 42:47)

11.2.7 Command Link Control Word

The Command Link Control Word (CLCW) can be transmitted as part of the Operational Control Field (OCF) in a Transfer Frame Trailer. The CLCW is specified in [20, 35] and is listed in Table 11.16.

Table 11.16 Command link control word

Command Link Control Word (32 bits):
  Control Word Type (1 bit, 0), Version Number (2 bits, 1:2), Status Field (3 bits, 3:5), COP in Effect (2 bits, 6:7), Virtual Channel Identifier (6 bits, 8:13), Reserved Spare (2 bits, 14:15),
  No RF Available (1 bit, 16), No Bit Lock (1 bit, 17), Lock Out (1 bit, 18), Wait (1 bit, 19), Retransmit (1 bit, 20), FARM B Counter (2 bits, 21:22), Reserved Spare (1 bit, 23), Report Value (8 bits, 24:31)

11.2.8 Space Packet

The Space Packet is defined in the CCSDS recommendations [27, 28] and is listed in Table 11.17.

Table 11.17 CCSDS space packet format

Space Packet:
  Primary Header (6 octets): Packet Version Number (3 bits, 0:2); Packet Identification: Type (1 bit, 3), Secondary Header Flag (1 bit, 4), Application Process Id (11 bits, 5:15); Packet Sequence Control: Sequence Flags (2 bits, 16:17), Sequence Count (14 bits, 18:31); Packet Data Length (16 bits, 32:47)
  Packet Data Field: Packet Secondary Header (optional, variable), User Data (variable), Packet Error Control (optional, variable)

11.2.9 Asynchronous Bit Serial Data Format

The asynchronous bit serial interface complies with the data format defined in EIA-232. It also complies with the data format and waveform shown in Table 11.18 and Fig. 11.3. The interface is independent of the transmitted data contents. Positive logic is considered for the data bits. The number of stop bits can optionally be either one or two. The parity bit can optionally be included.

Table 11.18 Asynchronous bit serial data format

Asynchronous bit serial format: Start | D0 (lsb, transmitted first) | D1 | D2 | D3 | D4 | D5 | D6 | D7 (msb, last) | Parity | Stop | Stop
General data format (i = {0, n}): D0 = bit 8*i+7 (last), D1 = 8*i+6, D2 = 8*i+5, D3 = 8*i+4, D4 = 8*i+3, D5 = 8*i+2, D6 = 8*i+1, D7 = 8*i (first)

11.2.10 SpaceWire Remote Memory Access Protocol

A general definition of RMAP commands is specified in [14]. For Telemetry Virtual Channels 0 through 3, a complete CCSDS Space Packet [27, 28] is carried inside an RMAP write command [14], which in turn is carried inside a SpaceWire packet [12], as shown in Table 11.19.

Table 11.19 CCSDS space packet, in RMAP write command, in SpaceWire packet

SpaceWire Packet: Destination Address | Cargo | EOP
RMAP Write Command (the cargo): Target SpaceWire Address (optional, variable) | Target Logical Address (1 byte) | Protocol Identifier (1 byte) | Instruction (1 byte) | Key (1 byte) | Reply Address (optional, variable) | Initiator Logical Address (1 byte) | Transaction Identifier (2 bytes) | Extended Address (1 byte) | Address (4 bytes) | Data Length (3 bytes) | Header CRC (1 byte) | Data = CCSDS Space Packet (variable) | Data CRC (1 byte) | EOP (token)

11.2.11 Command Link Control Word Interface

See Table 11.20.

Table 11.20 CLCW transmission protocol

Byte number  CLCW register bits  CLCW contents
First        [31:24]             Control Word Type, CLCW Version Number, Status Field, COP In Effect
Second       [23:16]             Virtual Channel ID, Reserved Field
Third        [15:8]              No RF Available, No Bit Lock, Lock Out, Wait, Retransmit, Farm B Counter, Report Type
Fourth       [7:0]               Report Value
Fifth        N/A                 [RS232 Break Command]

11.2.12 Waveform Formats

The design receives and generates the waveform formats shown in Fig. 11.3.

Fig. 11.3 Telecommand input protocol/waveform. © Aeroflex Gaisler AB

11.3 Selected TM Encoder Registers

See Tables 11.21, 11.22, 11.23, 11.24, 11.25, 11.26, 11.27, and 11.28.

Table 11.21 GRTM DMA external VC control and status register

Bits: 31:6 RESERVED | 5 XTFO | 4:3 RESERVED | 2 XTI | 1 XTE | 0 XEN

31:6 RESERVED
5 External Transfer Frame Ongoing (XTFO)—telemetry frames via DMA transfer for external VC (Virtual Channels 3 through 6) are ongoing (read-only)
4:3 RESERVED
2 External Transmitter Interrupt (XTI)—DMA interrupt for external VC, cleared by writing a logical 1 (unused)
1 External Transmitter Error (XTE)—DMA transmitter underrun for external VC (Virtual Channels 3 through 6), cleared by writing a logical 1
0 External Enable (XEN)—enable DMA transfers for external VC (Virtual Channels 3 through 6) (note that the descriptor table is checked continuously until this bit is cleared)

Table 11.22 GRTM DMA external VC descriptor pointer register

Bit layout: 31:10 BASE | 9:3 INDEX | 2:0 "000"

31:10 Descriptor base (BASE)—base address of descriptor table (Virtual Channels 3 through 6)
9:3 Descriptor index (INDEX)—index of active descriptor in descriptor table
2:0 Reserved—fixed to "000"
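In the OBSW, starting external VC transfers thus amounts to programming the descriptor table base and setting XEN. A minimal sketch, assuming a placeholder APB base address and register offsets (only the bit and field positions are taken from Tables 11.21 and 11.22):

#include <stdint.h>

#define GRTM_EXTVC_BASE 0x80000A00u  /* placeholder, board specific */
#define EXTVC_CTRL (*(volatile uint32_t *)(GRTM_EXTVC_BASE + 0x0u))
#define EXTVC_DPTR (*(volatile uint32_t *)(GRTM_EXTVC_BASE + 0x4u))

#define XEN (1u << 0)   /* enable DMA transfers       */
#define XTE (1u << 1)   /* transmitter underrun, w1c  */

/* Start DMA for the external virtual channels. The descriptor table
 * must be 1 KiB aligned, since BASE occupies bits 31:10 only. */
static void extvc_start(uint32_t descriptor_table)
{
    EXTVC_DPTR = descriptor_table & ~0x3FFu;  /* BASE field, INDEX = 0    */
    EXTVC_CTRL = XTE;                         /* clear a pending underrun */
    EXTVC_CTRL = XEN;                         /* table is polled until
                                                 XEN is cleared again     */
}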

Table 11.23 GRTM control register

Bit layout: 31:1 RESERVED | 0 TE

31:1 RESERVED
0 Transmitter Enable (TE)—enables the telemetry transmitter (to be set only after the
complete configuration of the telemetry transmitter, including the LENGTH field in the
GRTM DMA length register)
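The bracketed note translates into a strict ordering in the OBSW: TE must be the last bit written during transmitter initialization. A minimal sketch (the register address is a placeholder):

#include <stdint.h>

#define GRTM_CTRL (*(volatile uint32_t *)0x80000B00u)  /* placeholder */

static void grtm_transmitter_enable(void)
{
    /* ...GRTM DMA length register, physical layer register and coding
       sub-layer register must already be fully configured here... */
    GRTM_CTRL |= (1u << 0);   /* TE: enable the telemetry transmitter */
}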

Table 11.24 GRTM configuration register (read-only)

Bit layout: 31:23 RESERVED | 22 OCFB | 21 CIF | 20 AOS | 19 FHEC | 18 IZ | 17 MCG | 16 FSH | 15 IDLE | 14 EVC | 13 OCF | 12 FECF | 11 AASM | 10:9 RS | 8:6 RSDEPTH | 5 TE | 4 PSR | 3 NRZ | 2 CE | 1 SP | 0 SC

31:23 RESERVED
22 Operational Control Field Bypass (OCFB)—CLCW implemented externally, no OCF
register
21 Encryption/Cipher Interface (CIF)—interface between protocol and channel coding sub-
layers
20 Advanced Orbiting Systems (AOS)—AOS Transfer Frame generation implemented
19 Frame Header Error Control (FHEC)—frame header error control implemented, only if
AOS also set
18 Insert Zone (IZ)—insert zone implemented, only if AOS also set
17 Master Channel Generation (MCG)—master channel counter generation implemented
16 Frame Secondary Header (FSH)—frame secondary header implemented
15 Idle Frame Generation (IDLE)—idle frame generation implemented
14 Extended VC Cntr (EVC)—extended virtual channel counter implemented (ECSS)
13 Operational Control Field (OCF)—CLCW implemented internally, OCF register
12 Frame Error Control Field (FECF)—Transfer Frame CRC implemented
11 Alternative ASM (AASM)—alternative attached synchronization marker implemented
10:9 Reed-Solomon (RS)—Reed-Solomon encoder implemented, ‘‘01’’ E = 16, ‘‘10’’ E = 8,
‘‘11’’ E = 16 & 8
8:6 Reed-Solomon Depth (RSDEPTH)—Reed-Solomon interleave depth -1 implemented
5 Turbo Encoder (TE)—turbo encoder implemented (reserved)
4 Pseudo-Randomizer (PSR)—Pseudo-Randomizer implemented
3 Non-Return-to-Zero (NRZ)—non-return-to-zero—mark encoding implemented
2 Convolutional Encoding (CE)—convolutional encoding implemented
1 Split-Phase Level (SP)—split-phase level modulation implemented
0 Sub Carrier (SC)—sub carrier modulation implemented
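Since the register is read-only, the OBSW can use it at boot time to verify that the synthesized core provides the expected options. An illustrative sketch (only the bit positions come from Table 11.24; the register pointer has to be supplied by the board support package):

#include <stdint.h>
#include <stdio.h>

static void grtm_print_config(volatile const uint32_t *cfg_reg)
{
    uint32_t v = *cfg_reg;
    printf("Reed-Solomon encoder:   %s\n", ((v >> 9) & 0x3u) ? "present" : "absent");
    printf("RS interleave depth:    %u\n", (unsigned)(((v >> 6) & 0x7u) + 1u));
    printf("Pseudo-Randomizer:      %s\n", ((v >> 4) & 1u) ? "present" : "absent");
    printf("Convolutional encoder:  %s\n", ((v >> 2) & 1u) ? "present" : "absent");
    printf("Sub carrier modulation: %s\n", (v & 1u) ? "present" : "absent");
}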

Table 11.25 GRTM physical layer register

Bit layout: 31 SF | 30:16 SYMBOLRATE | 15 SCF | 14:0 SUBRATE

31 Symbol Fall (SF)—symbol clock has a falling edge at start of symbol bit
30:16 Symbol Rate (SYMBOLRATE)—symbol rate division factor minus 1
15 Sub Carrier Fall (SCF)—sub carrier output starts with a falling edge for logical 1
14:0 Sub Carrier Rate (SUBRATE)—sub carrier division factor minus 1

Table 11.26 GRTM coding sub-layer register

Bit layout: 31:20 RESERVED | 19 CIF | 18:17 CSEL | 16 AASM | 15 RS | 14:12 RSDEPTH | 11 RS8 | 10:8 RESERVED | 7 PSR | 6 NRZ | 5 CE | 4:2 CERATE | 1 SP | 0 SC

31:20 RESERVED
19 Encryption/Cipher Interface (CIF)—enable external encryption/cipher interface between
sub-layers
18:17 Clock Selection (CSEL)—selection of external telemetry clock source (application
specific)
16 Alternative ASM (AASM)—alternative attached synchronization marker enable. When
enabled the value from the GRTM Attached Synchronization Marker register is used, else
the standardized ASM value 0x1ACFFC1D is used
15 Reed-Solomon (RS)—Reed-Solomon encoder enable
14:12 Reed-Solomon Depth (RSDEPTH)—Reed-Solomon interleave depth -1
11 Reed-Solomon Rate (RS8)—‘0’ E = 16, ‘1’ E = 8
10:8 RESERVED
7 Pseudo-Randomizer (PSR)—Pseudo-Randomizer enable
6 Non-Return-to-Zero (NRZ)—non-return-to-zero—mark encoding enable
5 Convolutional Encoding (CE)—convolutional encoding enable
4:2 Convolutional Encoding Rate (CERATE):
"00-" rate 1/2, no puncturing
"01-" rate 1/2, punctured
"100" rate 2/3, punctured
"101" rate 3/4, punctured
"110" rate 5/6, punctured
"111" rate 7/8, punctured
1 Split-Phase Level (SP)—split-phase level modulation enable
0 Sub Carrier (SC)—sub carrier modulation enable
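As an illustration of how the fields combine, the sketch below composes a register value for a typical CCSDS downlink chain. The chosen settings (Reed-Solomon with E = 16 and interleave depth 5, rate-1/2 convolutional coding without puncturing) are an example, not the FLP flight configuration:

#include <stdint.h>

static uint32_t grtm_coding_value(void)
{
    uint32_t v = 0;
    v |= 1u << 15;          /* RS: Reed-Solomon encoder enable    */
    v |= (5u - 1u) << 12;   /* RSDEPTH: interleave depth minus 1  */
                            /* RS8 (bit 11) left 0: E = 16        */
    v |= 1u << 5;           /* CE: convolutional encoding enable  */
                            /* CERATE (bits 4:2) = "00-":
                               rate 1/2, no puncturing            */
    return v;
}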

Table 11.27 GRTM attached synchronization marker register

Bit layout: 31:0 ASM

31:0 Attached Synchronization Marker (ASM)—pattern for the alternative ASM (bit 31 =
MSB, sent first; bit 0 = LSB, sent last). The reset value is the standardized alternative
ASM value 0x352EF853

Table 11.28 GRTM idle frame generation register

Bit layout: 31:24 IDLEMCFC | 23:22 RESERVED | 21 IDLE | 20 OCF | 19 EVC | 18 FSH | 17 VCC | 16 MC | 15:10 VCID | 9:0 SCID
31:24 Idle Master Channel Frame Counter (IDLEMCFC)—diagnostic read out (read only, TM
only)
23:22 RESERVED
21 Idle Frames (IDLE)—enable idle frame generation
20 Operation Control Field (OCF)—enable OCF for idle frames
19 Extended Virtual Channel Counter (EVC)—enable extended virtual channel counter
generation for idle frames (TM only, ECSS)
18 Frame Secondary Header (FSH)—enable FSH for idle frames (TM only)
17 Virtual Channel Counter Cycle (VCC)—enable virtual channel counter cycle generation
for idle frames (AOS only)
16 Master Channel (MC)—enable separate master channel counter generation for idle frames
(TM only)
15:10 Virtual Channel Identifier (VCID)—virtual channel identifier for idle frames
9:0 Spacecraft Identifier (SCID)—spacecraft identifier for idle frames

11.4 TM Encoder: Virtual Channel Generation Registers

The Virtual Channel Generation function core is programmed through registers
mapped into APB address space (Table 11.29).

Table 11.29 GRTM PAHB registers


APB address offset Register
16#004# Status register
16#008# Control register

Status Register (R)


See Table 11.30.

Table 11.30 Status register


Bit layout: 31:2 — | 1 BUSY | 0 READY
1: BUSY Not ready for new input, busy with octet
0: READY Ready for new packet of maximum size
All bits are cleared to 0 at reset
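Polling this register before feeding the next packet is the minimal flow control towards the core; a sketch:

#include <stdint.h>

/* Busy-wait until READY is set and BUSY is clear (Table 11.30). The
 * register pointer is passed in, as the APB address is board specific. */
static void vcgen_wait_ready(volatile const uint32_t *status_reg)
{
    while ((*status_reg & 0x3u) != 0x1u) {
        /* BUSY set or READY clear: keep polling */
    }
}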

Control Register (R/W)


See Table 11.31.

Table 11.31 Control register


Bit layout: 31:10 — | 9 BUSYEN | 8 READYEN | 7:3 — | 2 VALID | 1 RST | 0 EN


9: BUSYEN Enable not-busy interrupt when 1
8: READYEN Enable ready for packet interrupt when 1
2: VALID Packet valid delimiter, packet valid when 1, in-between packets when
0 (read-only)
1: RST Reset complete core when 1
0: EN Enable interface when 1
All bits are cleared to 0 at reset. Note that RST is read back as 0

AHB I/O Area


Data to be transferred to the Virtual Channel Generation function is written to the
AMBA AHB slave interface, which implements an AHB I/O area; see [64] for details.
Note that the address is not decoded by the core: address decoding is done only
by the AMBA AHB controller, for which the I/O area location and size are fixed. One,
two or four bytes can be transferred at a time, following the AMBA big-endian
convention regarding send order. The last written data can be read back via
the AMBA AHB slave interface. Data are output as octets on the Virtual Channel
Generation interface (Tables 11.32 and 11.33).

Table 11.32 AHB I/O area—data word definition


31 24 23 16 15 8 7 0

DATA [31:24] DATA [23:16] DATA [15:8] DATA [7:0]

Table 11.33 AHB I/O area—send order

Transfer size | Address offset | DATA[31:24] | DATA[23:16] | DATA[15:8] | DATA[7:0] | Comment
Word | 0 | First | Second | Third | Last | Four bytes sent
Halfword | 0 | First | Last | – | – | Two bytes sent
Halfword | 2 | – | – | First | Last | Two bytes sent
Byte | 0 | First | – | – | – | One byte sent
Byte | 1 | – | First | – | – | One byte sent
Byte | 2 | – | – | First | – | One byte sent
Byte | 3 | – | – | – | First | One byte sent
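The send order of Table 11.33 implies that word writes have to be composed big-endian. A sketch of a packet feed routine, assuming a 32-bit aligned I/O area whose address is taken from the AHB controller configuration:

#include <stdint.h>

static void vcgen_send_packet(volatile uint8_t *io_area,
                              const uint8_t *pkt, uint32_t len)
{
    uint32_t i = 0;
    for (; i + 4u <= len; i += 4u) {           /* full words first */
        uint32_t w = ((uint32_t)pkt[i]      << 24) |
                     ((uint32_t)pkt[i + 1u] << 16) |
                     ((uint32_t)pkt[i + 2u] <<  8) |
                      (uint32_t)pkt[i + 3u];
        *(volatile uint32_t *)io_area = w;     /* four octets sent */
    }
    for (; i < len; i++)                       /* 1 to 3 trailing octets */
        io_area[i & 3u] = pkt[i];              /* one octet per byte write */
}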

11.5 Selected TC Decoder Registers


See Tables 11.34 to 11.44.

Table 11.34 Global reset register


Bit layout: 31:24 SEB | 23:1 RESERVED | 0 SRST

31:24 SEB (Security Byte):
Write: ‘0x55’ = the write will have effect (the register will be updated)
Any other value = the write will have no effect on the register
Read: All zero
23:1 RESERVED
Write: Don’t care
Read: All zero
0 System reset (SRST): [1]
Write: ‘1’ = initiate reset, ‘0’ = do nothing
Read: ‘1’ = unsuccessful reset, ‘0’ = successful reset
Note: The Coding Layer is not reset
Power-up default: 0x00000000
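All SEB-protected registers of the TC decoder follow the same write pattern; a minimal sketch (register addresses are board specific and therefore passed as pointers):

#include <stdint.h>

/* Write a register that is guarded by the Security Byte: the access
 * only takes effect when bits 31:24 contain 0x55. */
static void tc_protected_write(volatile uint32_t *reg, uint32_t value)
{
    *reg = (0x55u << 24) | (value & 0x00FFFFFFu);
}

/* Trigger a global system reset via SRST (Table 11.34) and report
 * success: the bit reads back 0 when the reset succeeded. */
static int tc_global_reset(volatile uint32_t *grr)
{
    tc_protected_write(grr, 0x1u);
    return (*grr & 0x1u) == 0u;
}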

Table 11.35 Global control register (GCR)


Bit layout: 31:24 SEB | 23:13 RESERVED | 12 PSS | 11 NRZM | 10 PSR | 9:0 RESERVED

31:24 SEB (Security Byte):
Write: ‘0x55’ = the write will have effect (the register will be updated)
Any other value = the write will have no effect on the register
Read: All zero
23:13 RESERVED
Write: Don’t care
Read: All zero
12 PSS (ESA/PSS enable) [11]
Write/Read: ‘0’ = disable, ‘1’ = enable [read-only]
11 NRZM (Non-Return-to-Zero Mark Decoder enable)
Write/Read: ‘0’ = disable, ‘1’ = enable [read-only]
10 PSR (Pseudo-De-Randomizer enable)
Write/Read: ‘0’ = disable, ‘1’ = enable [read-only]
9:0 RESERVED
Write: Don’t care
Read: All zero
Power-up default: 0x00001000. The actual default value depends on the tcmark and tcpseudo inputs

The following register sets the spacecraft ID for telecommands. It depends on
the IP core configuration at ordering, as explained in Table 4.15.

Table 11.36 Spacecraft identifier register (SIR)[7]


Bit layout: 31:10 RESERVED | 9:0 SCID
31:10 RESERVED
Write: Don’t care
Read: All zero
9:0 SCID (Spacecraft Identifier)
Write: Don’t care
Read: Bit[9] = MSB, Bit[0] = LSB
Power-up default: Depends on SCID input configuration

Table 11.37 Frame acceptance report register (FAR)[7]


Bit layout: 31 SSD | 30:25 RESERVED | 24:19 CAC | 18:16 CSEC | 15:14 RESERVED | 13:11 SCI | 10:0 RESERVED
31 SSD (Status of Survey Data) (see [44])
Write: Don’t care
Read: Automatically cleared to 0 when any other field is updated by the coding layer
Automatically set to 1 upon a read
30:25 RESERVED
Write: Don’t care
Read: All zero
24:19 CAC (Count of Accept Codeblocks) (see [44])
Write: Don’t care.
Read: Information obtained from coding layer[2]
18:16 CSEC (Count of Single Error Corrections) (see [44])
Write: Don’t care
Read: Information obtained from coding layer
15:14 RESERVED
Write: Don’t care
Read: All zero
13:11 SCI (Selected Channel Input) (see [44])
Write: Don’t care
Read: Information obtained from coding layer
10:0 RESERVED
Write: Don’t care
Read: All zero
Power-up default: 0x00003800

Table 11.38 CLCW register (CLCWRx)—see [43]


Bit layout: 31 CWTY | 30:29 VNUM | 28:26 STAF | 25:24 CIE | 23:18 VCI | 17:16 RESERVED | 15 NRFA | 14 NBLO | 13 LOUT | 12 WAIT | 11 RTMI | 10:9 FBCO | 8 RTYPE | 7:0 RVAL
31 CWTY (Control Word Type)
30:29 VNUM (CLCW Version Number)
28:26 STAF (Status Fields)
25:24 CIE (COP In Effect)
23:18 VCI (Virtual Channel Identifier)
17:16 Reserved (PSS/ECSS requires ‘‘00’’)
15 NRFA (No RF Available)
Write: Don’t care
Read: Based on discrete inputs
14 NBLO (No Bit Lock)
Write: Don’t care
Read: Based on discrete inputs
13 LOUT (Lock Out)
12 WAIT (Wait)
11 RTMI (Retransmit)
10:9 FBCO (FARM-B Counter)
8 RTYPE (Report Type)
7:0 RVAL (Report Value)
Power-up default: 0x00000000

The following register provides the OBSW with the RF-Available flag and the
Bit-Lock flag as coming from the receivers.

Table 11.39 Physical interface register (PHIR)[7]


Bit layout: 31:16 RESERVED | 15:8 RFA | 7:0 BLO
31:16 RESERVED
Write: Don’t care
Read: All zero
15:8 RFA (RF Available)[3]
Only implemented inputs are taken into account. All other bits are zero.
Write: Don’t care
Read: Bit[8] = input 0, Bit[15] = input 7
7:0 BLO (Bit Lock)[3]
Only implemented inputs are taken into account. All other bits are zero
Write: Don’t care
Read: Bit[0] = input 0, Bit[7] = input 7
Power-up default: Depends on inputs
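Extracting the flags for a given receiver input is a simple shift; a sketch (function name illustrative; only inputs 0 through 3 exist on this design, see note [3] below):

#include <stdint.h>

static void tc_receiver_status(uint32_t phir, unsigned input,
                               int *rf_available, int *bit_lock)
{
    *rf_available = (int)((phir >> (8u + input)) & 0x1u);  /* RFA, bits 15:8 */
    *bit_lock     = (int)((phir >> input)        & 0x1u);  /* BLO, bits 7:0  */
}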

Table 11.40 Control register (COR)


Bit layout: 31:24 SEB | 23:10 RESERVED | 9 CRST | 8:1 RESERVED | 0 RE


31:24 SEB (Security Byte):
Write: ‘0x55’ = the write will have effect (the register will be updated). Any other
value = the write will have no effect on the register
Read: All zero
23:10 RESERVED
Write: Don’t care
Read: All zero
9 CRST (Channel reset)[4]
Write: ‘1’ = initiate channel reset, ‘0’ = do nothing
Read: ‘1’ = unsuccessful reset, ‘0’ = successful reset
Note: The Coding Layer is not reset
8:1 RESERVED
Write: Don’t care
Read: All zero
0 RE (Receiver Enable)
The input from the Coding Layer receiver is masked when the RE bit is disabled
Read/Write: ‘0’ = disabled, ‘1’ = enabled
Power-up default: 0x00000000

The following register is essential for the OBSW to detect overrun and FIFO
full errors:

Table 11.41 Status register (STR)[7]


Bit layout: 31:11 RESERVED | 10 RBF | 9:8 RESERVED | 7 RFF | 6:5 RESERVED | 4 OV | 3:1 RESERVED | 0 CR


31:11 RESERVED
Write: Don’t care
Read: All zero
10 RBF (RX BUFFER Full)
Write: Don’t care
Read: ‘0’ = Buffer not full, ‘1’ = Buffer full (this bit is set if the buffer has less than 1/8
of free space)
9:8 RESERVED
Write: Don’t care
Read: All zero
7 RFF (RX FIFO Full)
Write: Don’t care
Read: ‘0’ = FIFO not full, ‘1’ = FIFO full
6:5 RESERVED
Write: Don’t care
Read: All zero
4 OV (Overrun)[5]
Write: Don’t care
Read: ‘0’ = nominal, ‘1’ = data lost
3:1 RESERVED
Write: Don’t care
Read: All zero
0 CR (CLTU Ready)[5]
There is a worst-case delay from the CR bit being asserted until the data has actually
been transferred from the receiver FIFO to the ring buffer; this depends on the PCI load etc.
Write: Don’t care
Read: ‘1’ = new CLTU in ring buffer. ‘0’ = no new CLTU in ring buffer
Power-up default: 0x00000000
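Since CR and OV are sticky and cleared by the read itself (see note [5] below), the OBSW has to act on every flag of one read before reading again. A sketch of such a polling routine (bit masks per Table 11.41, handler bodies left empty):

#include <stdint.h>

#define STR_CR  (1u << 0)    /* new CLTU in ring buffer  */
#define STR_OV  (1u << 4)    /* overrun: data was lost   */
#define STR_RFF (1u << 7)    /* receiver FIFO full       */
#define STR_RBF (1u << 10)   /* ring buffer almost full  */

static void tc_poll_status(volatile const uint32_t *str_reg)
{
    uint32_t str = *str_reg;           /* this read clears CR and OV */
    if (str & STR_OV)  { /* report telecommand loss to FDIR       */ }
    if (str & STR_CR)  { /* process CLTU data between RRP and RWP */ }
    if (str & STR_RFF) { /* uplink faster than CLTU processing    */ }
    if (str & STR_RBF) { /* drain the ring buffer urgently        */ }
}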

The following register must be initialized by the OBSW with a memory address
representing the start address of the TC buffer:

Table 11.42 Address space register (ASR)[8]


Bit layout: 31:10 BUFST | 9:8 RESERVED | 7:0 RXLEN
31:10 BUFST (Buffer Start Address)
22-bit address pointer
This pointer contains the start address of the allocated buffer space for this channel.
Register has to be initialized by software before DMA capability can be enabled
9:8 RESERVED
Write: Don’t care
Read: All zero
7:0 RXLEN (RX buffer length)
Number of 1 kB-blocks reserved for the RX buffer
(Min. 1 kByte = 0x00, Max. 256 kByte = 0xFF)
Power-up default: 0x00000000

The following read and write pointers have to be reinitialized at each TC access:

Table 11.43 Receive read pointer register (RRP)[6, 9, 10]

Bit layout: 31:24 RxRdPtrUpper | 23:0 RxRdPtrLower
31:24 10-bit upper address pointer
Write: Don’t care
Read: This pointer = ASR[31..24]
23:0 24-bit lower address pointer
This pointer contains the current RX read address. This register is to be incremented with
the actual amount of bytes read
Power-up default: 0x00000000

Table 11.44 Receive write pointer register (RWP)[6, 9]

Bit layout: 31:24 RxWrPtrUpper | 23:0 RxWrPtrLower
31:24 10-bit upper address pointer
Write: Don’t care
Read: This pointer = ASR[31..24]
23:0 24-bit lower address pointer
This pointer contains the current RX write address. This register is incremented with the
actual amount of bytes written
Power-up default: 0x00000000
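The amount of data available then follows from the pointer difference, with wrap-around over the configured buffer length; a sketch of the pointer arithmetic (the 16 MByte boundary rule of note [6] below still applies):

#include <stdint.h>

/* Bytes currently available between read and write pointer of the TC
 * ring buffer (Tables 11.43 and 11.44). buf_len is the RX buffer size
 * in bytes as configured via ASR.RXLEN. */
static uint32_t tc_bytes_available(uint32_t rrp, uint32_t rwp,
                                   uint32_t buf_len)
{
    uint32_t rd = rrp & 0x00FFFFFFu;   /* RxRdPtrLower */
    uint32_t wr = rwp & 0x00FFFFFFu;   /* RxWrPtrLower */
    if (wr >= rd)
        return wr - rd;                /* contiguous region     */
    return buf_len - (rd - wr);        /* write pointer wrapped */
}

After reading n bytes the software advances RRP by exactly n, as required by Table 11.43.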

Legend:
[1]. The global system reset caused by the SRST bit in the GRR register results in
the following actions:
• Initiated by writing a ‘1’; reads back ‘0’ when the reset was successful.
• No need to write a ‘0’ to remove the reset.
• Unconditional: nothing needs to be checked or disabled beforehand for this
reset function to execute correctly.
• May of course lead to corruption of data in transit from/to the reset core.
• Resets the complete core (all logic, buffers and register values).
• Behaviour is similar to a power-up.
Note that the above actions require that the HRESET signal is fed back
inverted to HRESETn, and the CRESET signal is fed back inverted to
CRESETn.
• The Coding Layer is not reset.

[2]. The FAR register supports the CCSDS/ECSS standard frame lengths
(1024 octets), requiring an 8 bit CAC field instead of the 6 bits specified for
PSS. The two most significant bits of the CAC will thus spill over into the
‘‘LEGAL/ILLEGAL’’ FRAME QUALIFIER field, Bit [26:25]. This is only
the case when the PSS bit is set to ‘0’.
[3]. Only inputs 0 through 3 are implemented.
[4]. The channel reset caused by the CRST bit in the COR register results in the
following actions:
• Initiated by writing a ‘1’; reads back ‘0’ when the reset was successful.
• No need to write a ‘0’ to remove the reset.
• All other bits in the COR are ignored when the CRST bit is set during a
write; their value has no impact on the register value after the reset.
• Unconditional: nothing needs to be checked or disabled beforehand for this
reset function to execute correctly.
• May of course lead to corruption of data in transit from/to the reset channel.
• Resets the complete channel (all logic, buffers and register values)
• Except for the ASR register of that channel, which retains its value.
• All read and write pointers are automatically re-initialized and point to the
start of the ASR address.
• All registers of the channel (except the ones described above) get their
power-up value.
• This reset shall not cause any spurious interrupts.
Note that the above actions require that the CRESET signal is fed back
inverted to CRESETn.
• The Coding Layer is not reset.
[5]. These bits are sticky: they remain set until the register is read and are
cleared automatically by reading the register.
[6]. The value of the pointers depends on the content of the corresponding
Address Space Register (ASR).
During a system reset, a channel reset or a change of the ASR register, the
pointers are recalculated based on the values in the ASR register.
The software has to take care (when programming the ASR register) that the
pointers never have to cross a 16 MByte boundary, because this would cause
an overflow of the 24-bit pointers.
It is not possible to write an out-of-range value to the RRP register; such an
access will be ignored with an HERROR.

[7]. An AMBA AHB ERROR response is generated if a write access is attempted
to a register without any writable bits.
[8]. The channel reset caused by a write to the ASR register results in the
following actions:
• Initiated by writing an updated value into the ASR register.
• Unconditional: nothing needs to be checked or disabled beforehand for this
reset function to execute correctly.
• May of course lead to corruption of data in transit from/to the reset channel.
• Resets the complete channel (all logic and buffers) but not all register values,
only the following:
• COR register: the TE and RE bits get their power-up value; the other bits
retain their value.
• STR register: all bits get their power-up value.
• All other registers retain their values.
• Updates the ASR register of that channel with the written value.
• All read and write pointers are automatically re-initialized and point to the
start of the ASR address.
• This reset shall not cause any spurious interrupts.
• The Coding Layer is not reset.
[9]. During a channel reset the register is temporarily unavailable and HRETRY
response is generated if accessed.
[10]. It is not possible to write an out of range value to the RRP register. Such
access will be ignored without an error.
[11]. The PSS bit usage is only supported if the gPSS generic is set on the TCC
module; otherwise the bit is fixed to 0.
A number of interrupt registers of the TC decoder give complete freedom to the
software by providing means to mask, clear and force interrupts and to read the
interrupt status. Details on these can be taken from [55].

11.6 OBC Unit CAD Drawing

See Fig. 11.4.

[Fig. 11.4 shows the dimensioned drawing of the OBC housing in top, rear and bottom view (scale 1:2, format A3, general tolerances per ISO 2768-mK, mounting via 24x M6 Helicoils; drawn 06.11.2012, Lengowski, Institut für Raumfahrtsysteme). Key data from the drawing:
Material: EN AW-6082 (AlMgSi1)
Surface: aluminium, chromated; all non-connected aluminium faces coated with Aeroglaze Z307
Mass: 5 kg
Moments of inertia [kgm²]: Lxx = 0.095, Lyy = 0.137, Lzz = 0.181, Lxy = -0.065, Lxz = -0.039, Lyz = -0.03
Connector legend:
J1: Sub-D HD 15 male
J2: Sub-D HD 15 female
J3: Sub-D 25 male
J4: Micro-D 100 male
J5: Micro-D 100 female
J6: Micro-D 100 male
J7: Sub-D HD 15 male
J8: Sub-D HD 15 female
J9: Sub-D 25 male
J10: Micro-D 100 male
J11: Micro-D 100 female
J12: Micro-D 100 male]

Fig. 11.4 OBC housing CAD drawing. © IRS, University of Stuttgart



11.7 OBC Unit I/O-Board Connector Pin Allocation

I/O-Board Connector D:
• I/O-Board internal nomenclature D
• External instantiations J5/J11 according to annex Sect. 11.6
Generic Pin Allocations (IF Type)
See Fig. 11.5.

Fig. 11.5 I/O-Board connector D—generic pin allocations. © 4Links Ltd.



Pin Allocation to Target Satellite Equipment Interfaces


See Fig. 11.6.

Fig. 11.6 I/O-Board connector D—target satellite specific interfaces. © 4Links Ltd.

I/O-Board Connector E
• I/O-Board internal nomenclature E
• External instantiations J6/J12 according to annex Sect. 11.6
Generic Pin Allocations (IF Type)
See Fig. 11.7.

Fig. 11.7 I/O-Board connector E—generic pin allocations. © 4Links Ltd.



Pin Allocation to Target Satellite Equipment Interfaces


See Fig. 11.8.

Fig. 11.8 I/O-Board connector E—target satellite specific interfaces. © 4Links Ltd.

11.8 OBC Unit CCSDS-Board Connector Pin Allocation

Transceiver Interfaces and Cross Coupling


• CCSDS-Board internal nomenclature E (Fig. 11.9).
• External instantiations J4/J10 according to annex Sect. 11.6.

Fig. 11.9 CCSDS-Board connector E—pin allocations. © 4Links Ltd.



11.9 OBC Power-Board Connectors Pin Allocation

The connector/pin allocations of the OBC Power-Board's external connectors are
provided in Tables 11.45–11.47:
Power-Board Data Connector
• Power-Board internal nomenclature J2
• External instantiations J1/J7 according to annex Sect. 11.6 (Table 11.45).

Table 11.45 J2—Data connector on long board side (female)

Pin | To Pwr-Board | From Pwr-Board | Voltage/signal | Name
1 | Processor-Board | N/R | Data | DSUTMS
2 | Processor-Board | N/R | Data | DSUTCK
3 | Processor-Board | N/R | Data | DSUTDI
4 | Processor-Board | N/R | Data | DSUTDO
5 | Processor-Board | N/R | Data | EMDC (DSU)
6 | Processor-Board | N/R | Data | DSUACT
7 | Processor-Board | N/R | Data | DSUBRK
8 | Processor-Board | N/R | Data | DSUEN
9 | Processor-Board | N/R | Data | DSURSTN
10 | Processor-Board | N/R | Data (RS422) | data+
11 | Processor-Board | N/R | Data (RS422) | data-
13 | Processor-Board | N/R | 3.3 V | Ext. power
15 | Processor-Board | N/R | GND | Ext. ground

Table 11.46 J3—power connector on long board side (female)

Pin | To Pwr-Board | From Pwr-Board | Voltage/signal | Usage | Name
1 | x | | 20–25 V | Core | Pwr24V-processor
2 | x | | 20–25 V | IO | Pwr24V-IO
3 | x | | 20–25 V | CCSDS | Pwr24V-CCSDS
6 | x | | Return | Core | Pwr24V-Ret-processor
11 | x | | Return | IO | Pwr24V-Ret-IO
12 | x | | Return | CCSDS | Pwr24V-Ret-CCSDS
4 | x | | 20–25 V | Heater | Pwr24V-heater
13 | x | | Return | | Pwr24V-Ret-heater
9 | x | | GND | Shield |
15 | x | | GND | Shield |

Table 11.47 J4—PPS signal connector on long board side (male)

Pin | To PWR-Board | From PWR-Board | Voltage/signal | Usage
1 | x | | PPS+ | STR
10 | x | | PPS- | STR
19 | x | | GND | STR
2 | x | | PPS+ 0 | GPS0
11 | x | | PPS- 0 | GPS0
20 | x | | GND | GPS0
4 | x | | PPS+ 1 | GPS1
13 | x | | PPS- 1 | GPS1
22 | x | | GND | GPS1
6 | x | | PPS+ 2 | GPS2
15 | x | | PPS- 2 | GPS2
24 | x | | GND | GPS2
3 | x | | PPS+ 0 | GPS0
12 | x | | PPS- 0 | GPS0
21 | x | | GND | GPS0
5 | x | | PPS+ 1 | GPS1
14 | x | | PPS- 1 | GPS1
23 | x | | GND | GPS1
7 | x | | PPS+ 2 | GPS2
16 | x | | PPS- 2 | GPS2
25 | x | | GND | GPS2

The lines routed via external connector J1 (see Fig. 11.2) connect to OBC
Processor-Board N. The lines routed via external connector J7 connect to OBC
Processor-Board R.
Power-Board Power Connector
• Internal nomenclature J3
• External instantiations J2/J8 according to annex Sect. 11.6 (Table 11.46).
The lines routed via external connector J2 connect to OBC Boards N. The lines
routed via external connector J8 connect to OBC Boards R.
Power-Board PPS Signal Connector
• Internal nomenclature J4
• External instantiations J3/J9 according to annex Sect. 11.6.
The lines routed via external connector J3 connect to OBC Processor-Board N.
The lines routed via external connector J9 connect to OBC Processor-Board R.

11.10 PCDU Unit CAD Drawing

See Fig. 11.10.

Fig. 11.10 PCDU housing CAD drawing. © Vectronic Aerospace GmbH



11.11 PCDU Unit Connector Pin Allocations

The connector pin assignments of the PCDU are highly mission specific, driven
by the spacecraft design. Therefore only an overview of which PCDU connector
routes which type of signals is provided here. Only for the connectors J11 and J12
are the detailed assignments given below, since they route

• the HPC interface lines from the OBC CCSDS-Boards to the PCDU and
• the PCDU command/control interfaces between the OBC I/O-Boards and the PCDU,

and thus are the key interfaces between the OBC and the PCDU in the frame of the
overall CDPI architecture (Tables 11.48, 11.49, and 11.50).

Table 11.48 List of connectors


Name Type Description
J1 SUB-D 25, male Solar panel 0, battery 0,
J2 SUB-D 25, male Solar panel 1, battery 1,
J3 SUB-D 25, male Solar panel 2, battery 2, EGSE power input, Solar test string
J4 SUB-HD 62, female S/C equipment power supplies I
J5 SUB-HD 62, female S/C equipment power supplies II
J6 SUB-HD 62, female S/C equipment power supplies III
J7 SUB-D 25, female S/C equipment power supplies IV
J8 SUB-HD 62, male Temperature sensor inputs I
J9 SUB-HD 62, male Temperature sensor inputs II
J10 SUB-D 25, female Deployment sensor inputs; sun sensor inputs I
J11 SUB-D 9, male Communication interface for Common Commanding and
High Priority Commanding I
J12 SUB-D 9, male Communication interface for Common Commanding and
High Priority Commanding II
J13 SUB-D 25, female Sun sensor inputs II

Table 11.49 Pin assignment of J11 (X13), Sub-D 9, male


Pin Name Signal description
1 GND Signal ground
2 Rx1A RS422 HPC interface from OBC CCSDS-Board 0
3 Rx1B RS422 HPC interface from OBC CCSDS-Board 0
4 Tx1A RS422 HPC interface from OBC CCSDS-Board 0
5 Tx1B RS422 HPC interface from OBC CCSDS-Board 0
6 Rx0A RS422 command interface from/to OBC I/O-Board N
7 Rx0B RS422 command interface from/to OBC I/O-Board N
8 Tx0A RS422 command interface from/to OBC I/O-Board N
9 Tx0B RS422 command interface from/to OBC I/O-Board N

Table 11.50 Pin assignment of J12 (X12), Sub-D 9, male


Pin Name Signal description
1 GND Signal ground
2 Rx1A RS422 HPC interface from OBC CCSDS-Board 1
3 Rx1B RS422 HPC interface from OBC CCSDS-Board 1
4 Tx1A RS422 HPC interface from OBC CCSDS-Board 1
5 Tx1B RS422 HPC interface from OBC CCSDS-Board 1
6 Rx0A RS422 command interface from/to OBC I/O-Board R
7 Rx0B RS422 command interface from/to OBC I/O-Board R
8 Tx0A RS422 command interface from/to OBC I/O-Board R
9 Tx0B RS422 command interface from/to OBC I/O-Board R

11.12 PCDU Switch and Fuse Allocation to Spacecraft Equipment

The design of the PCDU includes a total of 27 fuses and 77 power switches plus 2
special switches for high-power consuming loads. Table 11.51 provides an
overview of the assignment of the fuses and switches to the equipment of the FLP
target spacecraft.

Table 11.51 Switch and fuse register description

No. of fuse (SW) | No. of switch (SW) | Component
00 | 00, 01 | OBC Processor-Board N
01 | 02, 03 | OBC Processor-Board R
02 | 04, 05 | OBC I/O-Board N
03 | 06, 07 | OBC I/O-Board R
04 | 08, 09 | OBC CCSDS-Board N
05 | 10, 11 | OBC CCSDS-Board R
06 | 12 | TC receiver 0
07 | 13 | TC receiver 1
08 | 14 | Camera payload: MICS channel green
08 | 15 | Payload controller power channel N
08 | 16, 17 | Payload data transmitter N
09 | 18 | Camera payload: MICS channel red
09 | 19 | Payload controller power channel R
09 | 20, 21 | Payload data transmitter R
10 | 22 | Camera payload: MICS channel near infrared
10 | 23, 24 | TM transmitter N
10 | 25 | STR N
11 | 26 | RWL 3
11 | 27 | STR R
11 | 28 | FOG 3
12 | 29 | FOG 0
12 | 30 | RWL 0
13 | 31 | FOG 1
13 | 32 | RWL 1
14 | 33 | FOG 2
14 | 34 | RWL 2
15 | 35 | MGM 0
16 | 36 | MGM 1
17 | 37 | Camera payload: PAMCAM
17 | 38 | MGT unit R
17 | 39 | GPS electronics 1
18 | 40 | GPS electronics 0
18 | 41 | MGT unit N
18 | 42 | GPS electronics 2
19 | 43, 44 | Laser payload: Osiris channel 2
19 | 45, 46 | TM transmitter R
20 | 47, 48 | Laser payload: Osiris channel 1
20 | 49, 50 | Data transmission power amplifier R
20 | 51 | Survival heater OBC R + TT&C R (with thermostat)
21 | 52 | Survival heater OBC N + TT&C N (with thermostat)
21 | 53, 54 | Data transmission power amplifier N
22 | 55, 56 | Satellite payload compartment heater N
22 | 57, 58 | Satellite core compartment heater N
22 | 59, 60 | Satellite service compartment heater N
23 | 61, 62 | Satellite payload compartment heater R
23 | 63, 64 | Satellite core compartment heater R
23 | 65, 66 | Satellite service compartment heater R
24 | 67, 68 | SA retaining mechanism N
25 | 69, 70 | SA retaining mechanism R
26 | 71, 72 | De-orbiting mechanism
26 | 73, 74 | Payload: AIS antenna
26 | 75, 76 | Payload: AIS receiver
References


References on the CDPI System Concept

1. Leon3FT Processor: http://www.aeroflex.com/ams/pagesproduct/prods-hirel-leon.cfm


2. Stratton, Sam: Fault Tolerant LEON Processing, Devices and Circuit Cards MAPLD 2009 -
Greenbelt, Maryland, August 31 - September 3, 2009
3. Aeroflex Gaisler Leon3FT IP Core: http://www.gaisler.com/cms/index.php?option=com_content&task=view&id=13&Itemid=53
4. Eickhoff, Jens; Stratton, Sam; Butz, Pius; Cook, Barry; Walker, Paul; Uryu, Alexander;
Lengowski, Michael; Röser, Hans-Peter: Flight Model of the FLP Satellite OBC and
Reconfiguration Unit Data Systems in Aerospace, DASIA 2012 Conference, 14–16 May,
2012, Dubrovnik, Croatia
5. Habinc, Sandi A.; Cook, Barry; Walker, Paul; Eickhoff, Jens; Witt, Rouven; Röser, Hans-
Peter: Using FPGAs and a LEON3FT Processor to Build a ‘‘Flying Laptop’’, ReSpace/
MAPLD 2011 Conference 2011, 22–25 August 2011, Albuquerque, New Mexico, USA
6. Uryu, Alexander N.; Fritz, Michael; Eickhoff, Jens; Röser, Hans-Peter: Cost and Time
Efficient Functional Verification in Hardware and Software 28th ISTS (International
Symposium on Space Technology and Science), 05–20 June, 2011, Okinawa, Japan


7. Eickhoff, Jens; Cook, Barry; Walker, Paul; Habinc, Sandi A.; Witt, Rouven; Röser, Hans-
Peter: Common board design for the OBC I/O unit and the OBC CCSDS unit of the Stuttgart
University Satellite ‘‘Flying Laptop’’ Data Systems in Aerospace, DASIA 2011 Conference,
17–20 May, 2011, San Anton, Malta
8. Eickhoff, Jens; Stevenson, Dave; Habinc, Sandi; Röser Hans-Peter:University Satellite
featuring latest OBC Core & Payload Data Processing Technologies, Data Systems in
Aerospace, DASIA 2010 Conference, Budapest, Hungary, June, 2010

References on S/C Engineering

9. Eickhoff, Jens: Simulating Spacecraft Systems, Springer, 2009, ISBN: 978-3-642-01275-4


10. Eickhoff, Jens: Onboard Computers, Onboard Software and Satellite Operations - An
Introduction, Springer, 2011, ISBN 978-3-642-25169-6

References on General Engineering Standards

11. http://www.spacewire.esa.int/content/Home/HomeIntro.php
12. ECSS-E-ST-50-12C (31 July 2008) SpaceWire - Links, nodes, routers and networks
13. ECSS-E-ST-50-51C (5 February 2010) SpaceWire protocol Identification
14. ECSS-E-ST-50-52C (5 February 2010) SpaceWire - Remote memory access protocol
15. ECSS-E-50-12C SpaceWire cabling
16. ECSS-E-ST-50C Communications
17. ECSS-E-ST-50-01C Space data links - Telemetry synchronization and channel coding
18. ECSS-E-ST-50-02C Ranging and Doppler tracking
19. ECSS-E-ST-50-03C Space data links - Telemetry Transfer Frame protocol
20. ECSS-E-ST-50-04C Space data links - Telecommand protocols, synchronization and channel
coding
21. ECSS-E-ST-50-05C Radio frequency and modulation
22. ECSS-E-70-41A Ground systems and operations - Telemetry and telecommand packet
utilization
23. Consultative Committee for Space Data Systems: CCSDS Recommended Standards, Blue
Books, available online at http://public.ccsds.org/publications/BlueBooks.aspx
24. CCSDS 130.0-G-2 CCSDS layer conversions
25. CCSDS-131.0-B-1 TM Synchronization and Channel Coding
26. CCSDS-132.0-B-1 TM Space Data Link Protocol
27. CCSDS 133.0-B-1 Space Packet Protocol
28. CCSDS-133.0-B-1-C1 Encapsulation Service Technical Corrigendum 1
29. CCSDS-135.0-B-3 Space Link Identifiers
30. CCSDS-201.0 Telecommand - Part 1 - Channel Service, CCSDS 201.0-B-3, June 2000
31. CCSDS-202.0 Telecommand - Part 2 - Data Routing Service, CCSDS 202.0-B-3, June 2001
32. CCSDS-202.1 Telecommand - Part 2.1 - Command Operation Procedures, CCSDS 202.1-B-
2, June 2001
33. CCSDS-203.0 Telecommand - Part 3 - Data Management Service, CCSDS 203.0-B-2, June
2001

34. CCSDS-231.0-B-2 TC Synchronization and Channel Coding


35. CCSDS 232.0-B-2 TC Space Data Link Protocol
36. CCSDS-232.1-B-2 Communications Operation Procedure-1
37. CCSDS-401.0-B Radio Frequency and Modulation Systems
38. CCSDS-732.0-B-2 AOS Space Data Link Protocol
39. ESA PSS-04-0: Space data communications
40. ESA PSS-04-103: Telemetry channel coding standard, Issue 1
41. ESA PSS-04-105: Radio frequency and modulation standard
42. ESA PSS-04-106: Packet telemetry standard, Issue 1
43. ESA PSS-04-107: Packet Telecommand Standard, Issue 2
44. ESA PSS-04-151: Telecommand Decoder Standard, Issue 1
45. ECSS-Q-ST-70-08C Manual soldering of high-reliability electrical connections
46. ECSS-Q-ST-70-26C Crimping of high-reliability electrical connections
47. ECSS-E-10-03A Space Engineering - Testing, February 2002
48. Astrium Patent Affiliation P700377-DE-NP: Multifunktionaler Kontroller für einen Satelliten
Deutsches Patent- und Markenamt DPMA 10 2012 009 513.9

References on the Processor-Boards

49. UT699 LEON3 Datasheet: http://www.aeroflex.com/ams/pagesproduct/datasheets/leon/


ut699LEON3datasheet.pdf
50. UT699 LEON3FT Functional Manual: http://www.aeroflex.com/ams/pagesproduct/
datasheets/leon/UT699LEON3UserManual.pdf
51. UT8R2M39 80Megabit SRAM MCM: http://www.aeroflex.com/ams/pagesproduct/
datasheets/UT8ER1M39SRAMMCM.pdf
52. FM22L16 4Mbit F-RAM Memory: http://www.ramtron.com/files/datasheets/FM22L16_
ds.pdf
53. UT6325 RadTol Eclipse FPGA: http://www.aeroflex.com/ams/pagesproduct/datasheets/
RadTolEclipseFPGA.pdf
54. UT54LVDS031LVE 3.3-VOLT QUAD LVDS Driver: http://www.aeroflex.com/ams/
pagesproduct/datasheets/LVDSDriver3v.pdf
55. UT54LVDS032LVE 3.3-VOLT QUAD LVDS Receiver: http://www.aeroflex.com/ams/
pagesproduct/datasheets/LVDSReceiver3v.pdf
56. DS16F95QMLRS-422 Transceiver: http://www.national.com/ds/DS/DS16F95QML.pdf

References on Low Level Software and Operating System

57. OAR Corporation: http://www.rtems.com


58. Aeroflex Gaisler RTEMS: http://www.aeroflex.com/gaisler

References on the I/O-Boards

59. 4Links Ltd.: SpaceWire Interface Unit for Interfacing to Avionics, Payloads, and TM/TC
units, User Manual for FLP IO, FM SIU B-012-PPFLPIO
60. Everspin MRAM Brochure: http://everspin.com/PDF/MSG-14349_MRAM_Sales_Bro.pdf
61. Everspin MR4A16b Data Sheet: http://everspin.com/PDF/EST_MR4A16B_prod.pdf

References on the CCSDS TM/TC Encoder / Decoder

62. Aeroflex Gaisler AB: CCSDS TM / TC and SpaceWire FPGA Data Sheet and User’s Manual
GR-TMTC-0004 July 2012, Version 1.2
63. 4Links Ltd.: SpaceWire Interface Unit for Interfacing to Avionics, Payloads, and TM/TC
units, User Manual for FLP CCSDS, FM SIU B-012-PPFLPCCSDS
64. GRLIB IP Library User’s Manual, Aeroflex Gaisler http://www.aeroflex.com/gaisler
65. GRLIB IP Core User’s Manual, Aeroflex Gaisler http://www.aeroflex.com/gaisler
66. Spacecraft Data Handling IP Core User’s Manual, Aeroflex Gaisler http://www.aeroflex.
com/gaisler
67. AMBA Specification, Rev 2.0, ARM IHI 0011A, Issue A, ARM Limited
68. Radiation-Tolerant ProASIC3 Low Power Space-Flight Flash FPGAs, 51700107-1/11.09,
Revision 1, November 2009, Actel Corp
69. ProASIC3L Low Power Flash FPGAs, 51700100-9/2.09, February 2009, Actel Corp
70. ProASIC3E Flash Family FPGAs, 51700098-9/8.09, August 2009, Actel Corp

References on the OBC Power Boards

71. ESA PSS-01-301: Derating Requirements applicable to Electronic, Electric and


Electromechanical (EEE) Components for ESA Space Systems

References on the OBC Internal Harness

72. Manufacturing Data Package for IRS OBC internal harness HEMA Kabeltechnik GmbH &
Co. KG, 2012

References on the OBC Mechanical / Thermal Design

73. Schuh: Konstruktion und Analyse eines Struktur-Thermal Modells des Onboard-Computers
für den Kleinsatelliten Flying Laptop, Study Thesis, IRS, 2011
74. Ley: Handbuch der Raumfahrttechnik Hanser Verlag, 2008
75. http://www.mincoglobal.de/uploadedFiles/Products/Thermofoil_Heaters/Kapton_Heaters/hs202b-
hk.pdf

76. ESCC Detail Specification No. 3702/001 SWITCHES, THERMOSTATIC, BIMETALLIC,


SPST, OPENING CONTACT
77. https://escies.org/epplcomponent/show?id=3976

References on the Power Control and Distribution Unit

78. Gaget1-ID/160-8040 data sheet: RWE Space Solar Power GmbH, Gaget1-ID/160-8040 Data
Sheet, HNR 0002160-00, 2007
79. Battery data sheet:: A123 Systems, Inc.: Nanophosphate High Power Lithium Ion Cell
ANR26650M1B, MD100113-01, 2011
80. Test String data sheet: RWE Space Solar Power GmbH, RWE3G-ID2*/150-8040 Data Sheet,
2005
81. NASA radiation: PD-ED-1258: Space radiation Effects on Electronic Components in Low-
Earth Orbit, April 1996, NASA - Johnson Space Center (JSC)
82. Wertz, J. R.; Larson, W. J.: Space Mission Analysis and Design, 3rd ed., Microcosm Press,
1999, ISBN 978-1881883104
83. Uryu, A. N.: Development of a Multifunctional Power Supply System and an Adapted
Qualification Approach for a University Small Satellite, Dissertation, University of Stuttgart,
Stuttgart, Germany, Institute of Space Systems, 2012
84. PCDU Microcontroller data sheet: RENESAS Electronics, Renesas 32-Bit RISC
Microcomputer, SuperH RISC engine Family/SH7040 Series, hardware manual, Issue 6.0,
2003
85. VECTRONIC Aerospace GmbH: Interface Control Document & Operation Manual for
Power Control and Distribution Unit Type VPCDU-1, Project IRS-FLP TD-VAS-PCDU-
FLP-ICD16.doc Issue 6, 12.12.2011

References on System Tests

86. Brandt, Alexander; Kossev, Ivan; Falke, Albert; Eickhoff, Jens;Röser Hans-Peter:
Preliminary System Simulation Environment of the University Micro-Satellite Flying
Laptop, 6th IAA Symposium on Small Satellites for Earth Observation, German Aerospace
Center (DLR), 23–26 April 2007, Berlin, Germany
87. Fritz, Michael: Hardware und Software Kompatibilitätstests für den Bordrechner eines
Kleinsatelliten PhD thesis, Institute of Space Systems, 2012
88. http://www.egos.esa.int/portal/egos-web/products/MCS/SCOS2000/

References on the Target Satellite

89. Grillmayer, Georg: An FPGA based Attitude Control System for the Micro-Satellite Flying
Laptop.PhD thesis, Institute of Space Systems, 2008
90. Zeile, Oliver: Entwicklung einer Simulationsumgebung und robuster Algorithmen für das
Lage- und Orbitkontrollsystem der Kleinsatelliten Flying Laptop und PERSEUS. PhD thesis,
Institute of Space Systems, 2012
Index

A
A3PE3000L FPGA, 21, 44, 58
ADuM14xx Isolator, 51
ADuM54xx Isolator, 51
AMBA bus, 15
Analog data handling, 152
Analog RIU, 160
Analog-to-Digital Converter, 160
ARM, 15
ASIC, 9
Assembly, Integration and Tests, 176
Attitude Control System, 192
Authentication Unit, 85
Autonomous reconfiguration, 163

B
Backplane, 4, 119
Bandwidth allocation, 75
Base plate, 136
Baseplate, 135, 140
Battery, 152
Battery Charge Regulator, 157
Battery power, 156
Battery status, 162
Battery survival heater, 157
BCR, 157
Bi-stable relay, 157
Bleeder resistor, 107
Board identity, 47
Boot-up sequence, 156
Breadboard Model, 18, 174, 180
Buffer chips, 47

C
CAD software, 138
CAN bus, 15
CATIA, 138
CCSDS, 5, 65, 71, 208
CCSDS protocol, 175
Channel Acquisition Data Unit (CADU), 18, 60, 62, 183
Chip Enable, 34, 36, 38
CLCW, 63, 68, 70, 75, 84, 85, 93, 97, 98, 212
Cleanroom, 175
Clock, 6, 45
Clock divider, 77
Clock signal, 188
Clock strobe, 104
CLTU, 17, 60, 62, 86, 165, 183
CMOS, 57
Codeblock decoding, 87
Codeblock rejection, 87
Coding Layer, 87
Cold redundancy, 144, 162
Combined Data and Power Management Infrastructure, 1, 2, 14, 60, 86, 151, 161, 182, 199
Combined-Controller, 9, 11, 18, 22, 44, 141, 161, 162
Command Link Control Word, 75, 84, 85, 93, 97, 212
Command Link Transmission Unit, 86, 165, 183
Command Pulse Decoding Unit, 8, 9, 85, 164, 165
Communication tests, 175
Compact PCI, 15
Conductive coupling, 142, 144
Connectors, 56
Consultative Committee for Space Data Systems, 65, 71, 208
Control loop, 160, 162
Convolutional encoding, 60, 63, 76
CPDU, 86, 164, 165
CPU load, 191

D
Data bus, 7
Data Handling and Operations, 7
Data Link Protocol, 66, 69
DC/DC converter, 105
Debug Support Unit, 32, 41, 206
Deformation, 139
Deployment autosequence, 168
Deployment timer, 167
Depth of discharge, 162
De-randomizer, 87
Digital I/O, 45
Digital RIU, 160
Direct Memory Access, 84, 85, 89
Downlink rate, 65
DRAM, 31
DSU, 32
DSU/Ethernet Interface, 41

E
ECSS, 65, 71
EDAC, 7, 32, 35, 36, 70
EEPROM, 31
Eigenfrequency, 139
Electromagnetic Interference, 51
EMC, 124, 134
Engineering Model, 174, 182
EPPL, 148
Error Detection and Correction, 36
ESA Procedures, Standards and Specifications, 66, 71
ESATAN-TMS, 141
Ethernet, 15, 28, 32, 40, 206
European Code of Conduct on Space Debris Mitigation, 196
European Cooperation on Space Standardization, 65, 71
European Preferred Parts List, 148
External reset, 40

F
Failure Detection, Isolation and Recovery, 2, 6, 174
Failure tolerance, 104
FDIR, 65, 164, 174
FEM model, 138
FEM simulation, 138
Field Programmable Gate Array, 31
FIFO, 89
Flash-memory, 31
FlatSat, 179, 181, 185, 193
Flight model, 182
Flight Model, 24
Flight Procedure, 184
FM22L16 FRAM, 38
FOG, 50
FPGA, 4, 9, 18, 31, 44, 45, 57, 60
FRAM, 31–34
Frame Error Control Field, 69
Functional verification matrix, 178
Fuse, 154, 157, 162, 167, 236

G
GPIO, 35, 38, 39
GPS, 104, 113
Ground loop, 51
Grounding, 184
GR-TMTC-0004 IP-core, 60

H
Hardware command, 63, 64, 69, 96, 99
Heater circuits, 114
Heaters, 127, 140, 146
Heat-up duration, 149
HF influences, 135, 137
High Priority Command, 5, 7, 11, 18, 22, 44, 60, 61, 96, 155, 162, 164, 165, 187, 189
History log function, 168
Hot redundancy, 144, 162, 164
Housekeeping data, 5
Housekeeping telemetry, 46
HPC command sequence, 166
HPC command structure, 166
HPC frame header, 165

I
Idle Frame, 62, 66, 75, 187
IIC, 4, 45, 50, 51, 55
Inrush current, 110
Insulation test, 129
Inter-board harness, 119
International Telecommunication Union, 168
International Traffic in Arms Regulations, 27, 176, 185
Isolated group, 51, 52, 57

J
JTAG Interface, 4, 53, 57, 100, 104, 115

L
Latching Current Limiter, 7, 157, 160, 165, 172
Launch and Early Orbit Phase, 204
LEON3FT, 59
Line Impedance Stabilization Network, 111
Local Time of Descending Node, 196
Logic-in, 55
Logic-out, 50, 55
Lumped parameter model, 141, 148
LVDS, 45

M
Magneto-coupler, 51
Manufacturing Data Package, 131
Mass, 137
Mass memory, 7
Master Channel Counter, 75
Mechanical properties, 137
Memory, 33, 44, 45, 54
Memory controller, 36
MGDS04-HB power converter, 106, 107
MGDS10-HB power converter, 106
Microcontroller, 5
Mini Operations Mode, 163
Mockup, 123
Modal analysis, 139
Moments of inertia, 137
MR4A16B MRAM, 46
MRAM, 31, 45, 46
Multiplexer Access Point Identifier, 18, 165, 189

N
Noise immunity, 45
Nominal heaters, 147
Non-interruptable power supply, 185
Non-Return-to-Zero Level, 68, 77
Non-Return-to-Zero-Mark, 88
Non-volatile memory, 4, 30, 33, 34, 46
NRZ-L, 60, 68, 69, 77, 86
NRZ-L encoding, 63
NRZ-M, 86
NVRAM, 45
Nx I-deas, 138

O
OBC CCSDS-Board, 12
OBC heaters, 104
OBC housing, 104, 114, 148
OBC I/O-Board, 10, 12, 46, 109, 235
OBC Power-Board, 103, 104, 127, 140, 176, 232
OBC Processor-Board, 4, 12, 28, 43, 54, 100, 104, 113, 155, 162–165, 185, 186, 205
OBC temperature, 156
OBSW, 186
OBSW device handling, 186
Onboard Computer, 1
Onboard Software, 2, 46, 175
Operating temperature, 58, 106, 134, 140
OSI reference model, 72
Overcharge, 159
Overvoltage protection, 168

P
Packet Telecommand protocol, 84
PCB internal heating, 154
PCDU commands, 172
PCDU Common Command, 155, 162
PCDU controller, 164, 168
PCDU controller redundancy, 164
PCDU internal heater, 156
PCDU non-operational temperature, 169
PCDU operating temperature, 169
Physical Layer, 66
Port Enable, 37
Power bus, 5, 7, 9, 103, 105, 151, 157
Power Control and Distribution Unit, 1, 12, 23, 24, 44, 60, 103, 115, 140, 141, 151, 174, 176, 178, 182, 193, 199, 204
Power harness, 127
Power On Reset, 40
Power On Reset circuit, 39
Power supply, 185
Power Supply Unit, 108
Power switch, 156, 236
PPS, 38, 113
Printed Circuit Board, 41, 153
Priority circuitry, 113
ProASIC3 FPGA, 60
PROM, 6, 34, 35
Protocol Sub-layer, 68
Pseudo-De-randomization, 69
Pseudo-Randomization, 63, 76
Pulse Per Second, 16, 38, 104
Pulse signal, 104

Q
Quasi-static analysis, 139
Quasi-static design load, 137

R
Radiation tolerance, 4, 31, 169
Radiator, 140
RAM, 6
Random vibration, 140
Real Time Operating System, 2, 29, 33
Realtime Simulator, 186
Reconfiguration, 8, 11, 152, 161, 163, 177, 182
Reconfiguration Unit, 7–9, 161
Redundancy, 6, 23, 161, 182, 183
Redundant heaters, 147
Reed-Solomon encoding, 60, 63, 76, 210
Reflection, 124
Reliability, 183
Remote Interface Unit, 5, 21, 44, 160, 167
Remote Memory Access Protocol, 18, 43, 45–47, 60, 61, 70, 213
Resistance test, 130
Retention test, 129
RF available indicator, 69
RMAP addresses, 50
RMAP memory address, 45
RMAP packet, 55
RMAP Verify Bit, 50
RS422, 18, 32, 39, 41, 45, 51, 61, 65, 155, 165
RS485, 45, 51
RTEMS, 17, 18, 60

S
Safeguard Memory, 6
Safe Mode, 9, 156, 160, 163
Safety, 183
Satellite resonance frequency, 139
Satellite Test Bed, 23, 177–179, 183, 185
Schmitt trigger, 51
SCOS, 175
SDRAM, 31
Segment, 98
Segment Data Field, 69
Senior engineer, 177
Sensor data, 161
Separation detection, 167
Serial differential interface, 51
Service Interface, 4, 104, 115
SH7045 microcontroller, 153
Shaker, 178
Shunt resistor, 109
Sine vibration, 140
Single Board Computer, 4, 14, 28, 29
Single Event Latch-up, 58
Single-Event Upset, 8, 44, 58
Skin connector, 4, 115, 117
Sleep Bit, 35
Software-based decoder, 59
Solar panel, 152, 157, 162
Solar panel deployment, 167
Space Packet, 61, 62, 69, 74, 98, 212
Space Packet Protocol layer, 69
Spacecraft Identifier, 68, 187
Spacecraft status, 46
SpaceWire, 2, 4, 15, 18, 21, 30, 32, 36, 37, 41, 43, 44, 45, 47, 60–62, 66, 100, 213
SpaceWire Clock, 40
SpaceWire harness, 127
SpaceWire link, 56
SpaceWire packet, 213
SpaceWire port, 37
SpaceWire Router, 186
SPARC, 30, 59
SRAM, 31–33, 35, 36, 38, 45, 46
Stacked die, 38
Standard grounded group, 57
Star tracker, 104, 113
Start Sequence, 87
State of Charge, 160
Static Random Access Memory, 31
Storage temperature range, 58
Sun-Synchronous Orbit, 196
Switch, 154, 157, 162, 167, 236
Synchronization, 87
Synchronization and Channel Coding, 66
Synchronization and Coding Sub-layer, 68
Synchronous clock and data recovery, 44
System chain tests, 175
System clock, 40
System tests, 175

T
Technical Assistance Agreement, 27
Telecommand, 5
Telecommand active signal, 69
Telecommand bit clock active, 69
Telecommand Decoder, 17, 59, 69, 84, 85
Telecommand Frame, 18
Telecommand input interface, 62
Telecommand Source Packet Header, 166
Telecommand Transfer Frame, 63, 99, 165, 211
Telemetry, 5, 155
Telemetry downlink, 61
Telemetry Encoder, 18, 59, 65, 67, 69, 71, 84, 93
Telemetry Encoder Descriptor, 82
Telemetry interface, 62
Telemetry Transfer Frame, 70, 209
Temperature sensor, 115, 154, 162
Test execution, 177
Test plan, 177
Test procedure, 177
Thermal preconditioning, 141
Thermal-vacuum chamber, 178
Thermal-vacuum test, 154
Thermostat, 128, 148
Thermostat switch, 104, 114
Timeline, 46
Total Ionizing Dose, 58
Transceiver temperature, 156
Transfer Frame, 61, 69
Transfer Frame Data Field, 69
Transistor-Transistor Logic, 57
Triple Module Redundancy, 21, 44, 60

U
UART, 4, 45, 50, 55, 60, 63, 98, 99
Under-voltage protection, 159, 160
Unregulated power bus, 157
UT6325 RadTol Eclipse FPGA, 31, 37
UT699 LEON3FT, 14, 30, 32
UT8ER2M39 SRAM, 38

V
Venting-holes, 137
Vibration loads, 170
Vibration test, 140, 154
Virtual Channel, 18, 60–62, 66, 67, 69, 74, 83, 98, 165, 189
Virtual Channel Generation, 74, 82, 217, 218
Virtual Channel Identifier, 68
Virtual Channel Multiplexing, 74
Virtual Channel Packet Extraction, 99
Virtual Channel Segment Extraction, 98
Volatile memory, 31, 33, 35, 70
Volume, 137

W
Wait states, 34
Watchdog, 8, 11, 39, 162, 164
Write acknowledge, 50

X
Xilinx Platform USB Programmer, 208
