5.1 reconfigScanDisk
NOTE: Normally, this test is not run in the field. Applications/Install reconfig will internally/automatically run this to set up RAID. This program sets up RAID (and then tests the RAID setup).
On the DARC, each action appears on screen, and OK should appear next to the action once it has completed successfully.
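For reference, a minimal invocation might look like the following (the exact path and any required arguments are assumptions; consult the release documentation for actual usage):
{ctuser@darc} reconfigScanDisk
The attached-device listing below is in the format produced by cat /proc/scsi/scsi, the same command shown later in this document: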
Attached devices:
Host: scsi0 Channel: 00 Id: 00 Lun: 00
Vendor: MITSUMI Model: CD-ROM SR244W Rev: T01A
Type: CD-ROM ANSI SCSI revision: 02
Host: scsi1 Channel: 00 Id: 00 Lun: 00
Vendor: SEAGATE Model: ST336753LW Rev: 0005
Type: Direct-Access ANSI SCSI revision: 03
Host: scsi1 Channel: 00 Id: 01 Lun: 00
Vendor: SEAGATE Model: ST336753LW Rev: 0005
Type: Direct-Access ANSI SCSI revision: 03
- OR -
NOTE: These two tests take around 10 minutes each. You will not see any output while they are running.
Example Output:
71687372+0 records in
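Output of the form 'N+0 records in' is printed by the dd utility, so its appearance indicates that a raw read of the array completed (an inference from the output format; the actual test commands are not shown here).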
6 Other
6.1 sg_map -i
This test applies to emulated SCSI devices like the DARC CD-ROM.
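A typical invocation on the DARC is shown below, with illustrative output based on the device listing in Section 5.1 (the /dev/sg and /dev/scd node names are assumptions):
{ctuser@darc} sg_map -i
/dev/sg0 /dev/scd0 MITSUMI CD-ROM SR244W T01A
/dev/sg1 /dev/sda SEAGATE ST336753LW 0005
/dev/sg2 /dev/sdb SEAGATE ST336753LW 0005
The RAID personality listing that follows is the content of /proc/mdstat; it can be displayed with the standard Linux md status interface:
{ctuser@darc} cat /proc/mdstat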
Personalities : [raid0]
md0 : active raid0 sdb1[1] sda1[0]
      71681792 blocks 4k chunks
The device is named 'md0'. 'Active' means the device is available.
1 Overview
The DARC/DARC2 Node mounts the Disk Array. The -c (create) command wipes out all Scan
(Patient) Data and reconfigures the Scan Disk. The -c and -a tests must be run with Application
Software down. The -q (query) test may be run with Application Software up or down.
Additional commands to check whether the RAID is mounted and the disks are present are provided in Section 4.
NOTE: For the All-In-One (AIO) Console, there is no DARC node present. The CDIP card and scan data disks are mounted in the Host computer. All commands run on the DARC node can still be run on the Host computer; it is not necessary to run “rsh darc” for the AIO console.
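On consoles that do have a DARC node, connect to the node before running the commands in this procedure (the prompt strings are illustrative, following the conventions used elsewhere in this document):
{ctuser@hostname} rsh darc
[ctuser@darc ~]$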
2 Determine Software
Confirm the Host Application Software version is 06MW03.4 or greater. If the software is less than 06MW03.4, do NOT perform this procedure.
1. Open a Unix Shell and type the following to verify Applications software version information.
Examples are provided. The examples are not actual output.
{ctuser@hostname} swhwinfo
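For illustration only (the actual swhwinfo output format varies by release, and this line is an assumption), the Applications software field should report 06MW03.4 or greater:
Applications Software: 06MW03.4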
3 Procedure
The Disk Array is configured with 2 disks (internal to the SDDA or the DARC2). As required, perform the create, assemble, or query commands. Visually verify that the entire output of the command script is successful; the final output line ‘gre-raid: success’ alone does not mean the 2 Disk Array RAID is good. Shut down Application Software if it is up. Open a Unix Shell and type the following:
{ctuser@hostname} cleanMon
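cleanMon brings the Application Software down (inferred from the surrounding procedure); wait for it to complete before running the gre-raid commands below.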
The sudo gre-raid -c (create) script erases all data on the Scan Disk. Patient information WILL be lost. Make certain to back up all patient data prior to performing this test.
With Application Software down, open a Unix Shell and type the following:
{ctuser@darc} sudo gre-raid -c
Are you sure you want to create a new disk array (yes/no)? yes
/dev/md0: stopping...done
/dev/sda: testing...drive_spec...52.0MB/sec...done
/dev/sdb: testing...drive_spec...51.9MB/sec...done
/dev/sda: partitioning...done
/dev/sdb: partitioning...done
/dev/md0: RAID-0 active with (2) drives, 256k chunk size, and 69GB capacity
/raw_data: testing...170.1MB/sec...done
gre-raid: success
[ctuser@darc ~]$
Verify the create partitions diagnostic passes and all output looks good.
Perform the assemble and query commands if the create has been successful.
With Application Software down, open a Unix Shell and type the following:
{ctuser@darc} sudo gre-raid -a
/dev/md0: stopping...done
/dev/sda: testing...drive_spec...52.4MB/sec...done
/dev/sdb: testing...drive_spec...52.4MB/sec...done
/dev/md0: starting...done
/dev/md0: RAID-0 active with (2) drives, 256k chunk size, and 69GB capacity
/raw_data: mounting /dev/md0 filesystem...done
/raw_data: testing...176.1MB/sec...done
gre-raid: success
[ctuser@darc ~]$
With Application Software up or down, open a Unix Shell and type the following:
NOTE: A query can be performed at any time with Application Software up or down. The preferred method is to test with Application Software down. The Operator Console cannot be scanning or manipulating Scan Data during the testing.
{ctuser@darc} sudo gre-raid -q
/dev/sda: testing...drive_spec...52.7MB/sec...done
/dev/sdb: testing...drive_spec...52.4MB/sec...done
/raw_data: testing...158.3MB/sec...done
gre-raid: success
[ctuser@darc ~]$
{ctuser@darc} exit
{ctuser@hostname}
{ctuser@darc} df /raw_data
{ctuser@darc} df
Example Output:
[ctuser@darc ~]$ df
Filesystem 1K-blocks Used Available Use% Mounted on
/dev/hda2 4182720 2768432 1414288 67% /
/dev/hda1 101086 4939 90928 6% /boot
none 972692 0 972692 0% /dev/shm
/dev/hda7 30580776 147212 30433564 1% /usr/g
/dev/md0 71550208 288 71549920 1% /raw_data
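The /dev/md0 row mounted on /raw_data confirms the Disk Array RAID is assembled and mounted.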
[ctuser@darc ~]$ cat /proc/scsi/scsi
Attached devices:
Host: scsi0 Channel: 00 Id: 00 Lun: 00