Basic User Manual
DOC024.98.93004
ORBISPHERE Model 3650 ATEX
12/2019, Edition 5
Basic User Manual
Allgemeines Benutzerhandbuch
Manuale di base per l'utente
Manuel d'utilisation de base
Manual básico do utilizador
Basisgebruikershandleiding
Начальное руководство пользователя
Table of Contents
English..............................................................................................................................3
Deutsch.......................................................................................................................... 23
Italiano............................................................................................................................ 45
Français......................................................................................................................... 67
Português...................................................................................................................... 89
Nederlands................................................................................................................. 111
Русский........................................................................................................................133
Section 1 Table of contents
Specifications on page 3
Expanded manual version on page 4
General information on page 4
Installation on page 7
User interface on page 12
Options setup on page 17
Calibration on page 19
Maintenance on page 21
Troubleshooting on page 21
Section 2 Specifications
Specifications are subject to change without notice.
2.1 3650EX Instrument
Power supply: Model 32960 non-rechargeable lithium battery
Power autonomy: 60 hours of continuous use
Signal drift: < 0.5% of reading between services
Serial output (RS232): Baud rate: 9600; Stop bits: 1; Start bits: 0; Parity: None
Temperature compensation range: -5 to 60 °C
Instrument operating limits: 0 to 45 °C
Dimensions (H x W x D): 150 mm x 115 mm x 220 mm
Weight: 2.4 kg
Enclosure protection: IP 65/NEMA 4
EMC standards: EN 61326
EXPROOF standards: EN 60079-0; EN 60079-11; LCIE 03 ATEX 6003 X; II 1 G, EX ia IIC T4 Ga
ISO certification: ISO 9001/EN 29001
2.2 29122 Interface box
Power supply: 120 VAC 50/60 Hz (Model 29122.A); 230 VAC 50/60 Hz (Model 29122.B)
Power consumption: 11 VA (Model 29122.A); 7 VA (Model 29122.B)
Fuse: Max current 250 mA (Model 29122.A); Max current 100 mA (Model 29122.B)
Instrument operating limits: 0 to 45 °C
Dimensions (H x W x D): 70 mm x 140 mm x 190 mm
Weight: 0.65 kg
Enclosure protection: IP 20
Enclosure materials: ABS FR (V0)
EMC Directive: EN 61326-1
LVD Directive: EN 61010-1
2.3 Analyzer gas and display options
Instrument model: Gas measured; Display units; Maximum display resolution
3650EX/111: Oxygen; ppm/ppb (liquid); 1 ppb
3650EX/112: Oxygen; %/ppm (gaseous); 1 ppm
3650EX/113: Oxygen; ppm (liquid) and % (gaseous); 0.001 ppm and 0.001%
3650EX/114: Oxygen; kPa/Pa (gaseous); 1 Pa
3650EX/115: Oxygen; bar/mbar (gaseous); 1 mbar
3650EX/211: Hydrogen; ppm/ppb (liquid); 0.01 ppb
3650EX/212: Hydrogen; %/ppm (gaseous); 0.01 ppm
Section 3 Expanded manual version
For additional information, refer to the expanded version of this manual, which is available on the
manufacturer's website.
Section 4 General information
In no event will the manufacturer be liable for direct, indirect, special, incidental or consequential
damages resulting from any defect or omission in this manual. The manufacturer reserves the right to
make changes in this manual and the products it describes at any time, without notice or obligation.
Revised editions are found on the manufacturer’s website.
4.1 Safety information
N O T I C E
The manufacturer is not responsible for any damages due to misapplication or misuse of this product including,
without limitation, direct, incidental and consequential damages, and disclaims such damages to the full extent
permitted under applicable law. The user is solely responsible to identify critical application risks and install
appropriate mechanisms to protect processes during a possible equipment malfunction.
Please read this entire manual before unpacking, setting up or operating this equipment. Pay
attention to all danger and caution statements. Failure to do so could result in serious injury to the
operator or damage to the equipment.
Make sure that the protection provided by this equipment is not impaired. Do not use or install this
equipment in any manner other than that specified in this manual.
4.2 Use of hazard information
D A N G E R
Indicates a potentially or imminently hazardous situation which, if not avoided, will result in death or serious injury.
W A R N I N G
Indicates a potentially or imminently hazardous situation which, if not avoided, could result in death or serious
injury.
C A U T I O N
Indicates a potentially hazardous situation that may result in minor or moderate injury.
N O T I C E
Indicates a situation which, if not avoided, may cause damage to the instrument. Information that requires special
emphasis.
4.3 Interface box (model 29122)
W A R N I N G
Explosion hazard. Only use the Interface Box 29122 in the safe area and never in the explosive area.
W A R N I N G
The interface box should only be connected to an earthed power supply socket.
W A R N I N G
In accordance with safety standards, it must be possible to disconnect the external power supply of the interface
box in its immediate vicinity.
W A R N I N G
Any maintenance of the interface box should be performed exclusively by personnel specialized and authorized to
work on electrical equipment, in accordance with relevant local regulations.
W A R N I N G
Disconnect the interface box from the power supply before carrying out any maintenance (including changing
fuses).
W A R N I N G
Electrical danger and fire hazard. Only use the supplied power cable. Only qualified experts may perform the
tasks detailed in the installation section of this manual, while adhering to all locally valid safety regulations.
W A R N I N G
Removable power cables must not be replaced with inadequately dimensioned power cables.
4.4 Precautionary labels
Read all labels and tags attached to the instrument. Personal injury or damage to the instrument
could occur if not observed. A symbol on the instrument is referenced in the manual with a
precautionary statement.
This is the safety alert symbol. Obey all safety messages that follow this symbol to avoid potential
injury. If on the instrument, refer to the instruction manual for operation or safety information.
This symbol indicates that a risk of electrical shock and/or electrocution exists.
This symbol indicates the presence of devices sensitive to electrostatic discharge (ESD) and indicates that care must be taken to prevent damage to the equipment.
This symbol, when noted on a product, indicates that the instrument is connected to alternating current.
Electrical equipment marked with this symbol may not be disposed of in European domestic or
public disposal systems. Return old or end-of-life equipment to the manufacturer for disposal at no
charge to the user.
Products marked with this symbol contain toxic or hazardous substances or elements. The number inside the symbol indicates the environmental protection use period in years.
Products marked with this symbol must only be used in the safe area and never in the explosive area.
4.5 Intrinsically safe conformity
Orbisphere series 3650Ex analyzers for gas measurement have been certified as Intrinsically Safe
by:
LCIE (Laboratoire Central des Industries Electriques), 33 av. Division Leclerc, Fontenay aux
Roses 92260, France.
Note: LCIE is notified body number 0081 in accordance with the European ATEX Directive.
LCIE certifies that the electrical apparatus has been found to comply with the essential Health and
Safety Requirements: EN 60079-0, EN 60079-11.
These instruments are certified II 1 G EX ia IIC T4 Ga under EU-Type Examination Certificate number LCIE 03 ATEX 6003 X.
Table 1 ATEX Certification
II 1 G: ATEX marking. Surface apparatus with permanent explosive gas presence.
Ex: Explosion-proof apparatus built to the universal standards below.
ia: Type of protection. The highest category, based on a safety factor of 1.5 on two faults. No combination of two faults in the 3650Ex can produce a spark, or heating, causing ignition of an explosive atmosphere.
IIC: Gas group. Corresponds to the most flammable gases, including hydrogen.
T4: Temperature category. Maximum surface temperature of 135 °C.
Ga: Equipment protection level.
4.5.1 Specific conditions of use
Use only a non-rechargeable cell of type SAFT LS 26500. Cell replacement is allowed in the explosive area.
The use of the external power supply input is not allowed in the hazardous area. The external link shall be equipped with protection for voltage limitation when used in the non-hazardous area.
The equipment enclosure contains more than 15% aluminum. It must be mounted in such a manner
as to eliminate any risk of sparks caused by friction or impact.
The user shall implement the necessary actions to avoid any electrostatic discharge hazards on accessible metallic and non-metallic parts of the enclosure.
4.5.2 Send the Orbisphere 3650EX for repair
Important information
There are special regulations for the transport of the Orbisphere 3650EX portable oxygen instrument.
The instrument contains a special lithium battery which, based on current transport regulations by ADR, IMDG or IATA¹, is classified as a dangerous good for all types of transport and is subject to special dangerous goods regulations.
To send the instrument for repair or maintenance, make sure to remove the lithium battery from
the device. Do not send the battery. To remove the battery, refer to Batteries on page 10.
Remove the battery to prevent dangerous goods violations within the transport chain.
Section 5 Installation
W A R N I N G
Electrical danger and fire hazard. Only use the supplied power cable. Only qualified experts may perform the
tasks detailed in the installation section of this manual, while adhering to all locally valid safety regulations.
W A R N I N G
This instrument is powered by a special non-rechargeable Exproof lithium battery (model 32960).
W A R N I N G
The battery may be installed or changed in the hazardous area.
W A R N I N G
Do not short circuit the battery.
W A R N I N G
The instrument can be connected to a PC via the Interface Box (model 29122) only in a safe area.
¹ ADR is the European Agreement concerning the International Carriage of Dangerous Goods by Road. IMDG is the International Maritime Dangerous Goods Code. IATA is the International Air Transport Association.
Figure 1 3650Ex Portable analyzer
1 Flow chamber
2 EC sensor
3 RS232 port
4 Battery cap
5 Barometric pressure sensor relief valve
The series 3650Ex Intrinsically Safe Portable Analyzer is a self-contained instrument configured to
make oxygen or hydrogen gas concentration measurements with Electrochemical (EC) Sensors in a
hazardous area, in either liquid or gaseous samples.
The instrument is a portable unit and should be located conveniently close to the sample being analyzed.
5.1 Sensor installation
The electrochemical (EC) sensor connects to the instrument base through a 10-pin LEMO connector.
A locking nut holds the sensor in place. Generally, the sensor is shipped already installed in the
instrument. If this is not the case, for full installation instructions, please refer to the Sensor Manual
provided with your instrument.
Sensor signal: LEMO-10 pin
Guard ring electrode: Pin 1
Not used: Pin 2
Temperature measurement: Pin 3
Counter electrode: Pin 4
Not used: Pin 5
Temperature measurement: Pin 6
Not used: Pin 7
Not used: Pin 8
Working electrode: Pin 9
Not used: Pin 10
5.2 Flow chamber installation
Figure 2 3650Ex (rear view) with 32007E flow chamber
1 Inlet
2 Barometric pressure sensor relief valve switch
3 Sensor
4 RS-232 socket and cap
5 Battery cap
6 Outlet
7 Guide pins
8 Reducing ring
9 Retractable carrying handle
10 Sensor collar
11 Sensor locking nut
The model 32007E flow chamber draws the liquid or gaseous sample past the EC sensor. It attaches
to the sensor with a threaded collar and is then sealed to the sensor with two O-rings.
The flow chamber's centrally located inlet and eccentrically located outlet use either ¼-inch or 6-mm diameter transparent plastic tubing. Connect by compression fittings to the sample source and to the drain, respectively.
You may also have received a model 32051 sample tube adapter to attach the flow chamber inlet
tubing to the sampling point.
5.3 Sample tube adapter (optional)
A model 32051A sample tube adapter can be attached to the flow chamber's inlet tubing. This
adapter, in turn, attaches to 6 mm or ¼ inch stainless steel or flexible tubing using rubber gasket
model 32813 (or, for 8 mm tubing, rubber gasket model 32814).
The tightening ring provides a compression fitting to the sample tube.
Figure 3 Sample tube adapter
1 Inlet tubing
2 Cover
3 Check valve
4 Rubber gasket (2 included)
5 Tightening ring
5.4 WinLog97 PC program installation
Install the WinLog97 program onto the PC by inserting the accompanying CD into your PC and
running the SetUp program. Simply follow the on-screen instructions. When finished, a new Windows
Program Group labeled Orbisphere is created containing the software and help files.
5.5 Connections
5.5.1 Batteries
The instrument is designed to work on battery power.
W A R N I N G
Only the model 32960 non-rechargeable Exproof lithium battery can be used with this instrument.
Install the Exproof lithium battery by first unscrewing the instrument's battery cap located on the right
side of the instrument (see Figure 1 on page 21) with a coin or flat screwdriver. Then insert the
battery pack lengthwise, positive end first, and replace the cap.
The power autonomy of the instrument is about 60 hours with a new battery. If battery power drops low, a LO BAT warning appears in the top-left corner of the instrument's LCD.
5.5.2 3650Ex Instrument - PC connection
W A R N I N G
Explosion hazard. Only use the Interface Box 29122 in the safe area and never in the explosive area.
The interface box operates from 115 VAC or 230 VAC power. Make sure that the power is correct
before connecting to a power supply. A green power LED is illuminated when the box is plugged into
the power source.
Figure 4 Instrument to PC connection
1 Model 3650EX instrument
2 RS-232 LEMO-6
3 Model 29122 interface box
4 Model 32511 cable
5 Model 32538 cable
The model 29122 interface box must be used to connect the 3650EX instrument to a personal
computer (PC) as illustrated in Figure 4. This unit converts TTL digital signals from the instrument to
RS-232 compatible signals.
W A R N I N G
This connection should be made exclusively within the safe area.
Two cables are supplied with the model 29122 interface box:
A model 32511 cable for connection between the 3650EX instrument LEMO-6 connector and the
interface box.
A model 32538 cable for connection between the interface box and the PC.
The connections should be made as indicated in Figure 4.
Note: When the Interface Box is connected to the 3650Ex instrument, this also acts as a power supply to the
instrument and disconnects the internal battery power source.
Pseudo RS232 signal: LEMO-6 pin
Transmitted data (TTL-TXD): Pin 1
Received data (TTL-RXD): Pin 2
Not used: Pin 3
Not used: Pin 4
External voltage input (used with the model 29122 interface only, in the safe area): Pin 5
Ground: Pin 6
Interface box to instrument (9-pin female connector) / Interface box to PC (9-pin male connector):
Pin 2: TTL transmitted data (TXD) / RS-232 transmitted data (TXD)
Pin 3: TTL received data (RXD) / RS-232 received data (RXD)
Pin 5: Power supply (V+) / Ground
Pin 9: Ground / Not used
Other pins: not used
Note: If you use an adapter for the connection to the PC, make sure it is designed for this purpose and, thus, has
all nine pins accessible. Some 25-to-9 pin adapters are supplied for specific use, such as a mouse, and these may
have only certain pins available.
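On the PC side, the serial parameters documented in the Specifications section (9600 baud, 1 stop bit, no parity) are what a terminal or logging program must match. The sketch below collects them in one place; the pyserial call shown in the comment and the 8-data-bit setting are assumptions, not values taken from this manual.

```python
# Sketch: PC-side serial settings matching the 3650EX RS-232 output as
# documented in the Specifications section (9600 baud, 1 stop bit, no parity).

def orbisphere_port_settings():
    """Return the RS-232 parameters documented for the instrument."""
    return {
        "baudrate": 9600,
        "stopbits": 1,
        "parity": "N",   # None
        "bytesize": 8,   # assumption: 8 data bits (not stated in the manual)
    }

# Hypothetical usage with pyserial (only ever in the safe area, through the
# model 29122 interface box):
#
#   import serial
#   port = serial.Serial("COM1", **orbisphere_port_settings())
```

Any serial library configured with these same parameters should be able to receive the converted RS-232 signals from the interface box.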
5.6 Installation checklist
5.6.1 Instrument clock setting
If you use the instrument to store measurements for downloading to a PC, you should verify the date
and time settings of the instrument's internal clock, as described in Clock settings on page 22.
5.6.2 Electrochemical sensor
Before making any measurements, for GA2800 ATEX sensors perform the preparation procedure in
the GA2x00 Sensor Manual. For other ATEX sensors perform the sensor service procedure as
described in the EC Sensor Manual.
5.6.3 Flow chamber
The model 32007E flow chamber's inlet and outlet should be free of any obstructions. It is mounted
by means of the sensor's collar, as shown in Figure 2 on page 9. A guide pin on the flow chamber
surface prevents twisting during operation.
When switching from liquid to gaseous samples, ensure that the sensor membrane is dry. Ensure
that the gas from the exit tube of the flow chamber (in gaseous mode) is released at atmospheric
pressure and that it is constant.
5.7 Storage when not used
At the end of the workday clean the outside of the instrument and interface box (if used) with a damp
cloth. Run clean water through the flow chamber to prevent passageways from clogging. You may
wish to repeat sensor preconditioning (see Preconditioning sensors on page 13) prior to the next
use.
If you do not expect to use your sensor for more than a few months, you should clean the sensor as instructed in the Sensor Manual and then store it dry, without electrolyte, and with the calibration cap in place for protection.
Section 6 User interface
6.1 Keypad and function keys
The front panel of the instrument has a three-digit liquid crystal display (LCD). The LCD includes a
right-side marker to distinguish between gas concentration and temperature display. This marker
also indicates the measurement display units (ppm, ppb, %, etc.) depending on the instrument
model. To the LCD's right is a label showing the measurement units configured at the factory for your
application.
In addition to the controls indicated on the front panel, there is also a pressure relief valve switch on
top of the instrument to enable atmospheric pressure equilibrium for sensor calibration, or for
measurements in gaseous samples in % units. The panel keyboard has the following push-button
controls:
Power switch: turns instrument power on or off. The instrument starts in measurement mode.
MEAS button: places the instrument in measurement mode.
Calibration button: calibrates the analyzer against a reference sample. This button can be locked out from the WinLog97 PC program.
STO button: stores a measurement value into memory.
Backlight button: backlights the LCD for approximately three minutes.
Up/Down Arrow buttons: toggle between gas concentration and temperature measurement displays in measurement mode, increase or decrease the storage number during storage or memory view, or set a calibration value during calibration.
To start the analyzer, press the keyboard POWER switch (located bottom left of the keyboard). When
you turn power on, the instrument displays its model number briefly, and then starts in measurement
mode. You can access other instrument functions by pushing one of these keys while turning power
on:
Sensor calibration - see Sensor calibration on page 20
Start automatic data acquisition - see Automatic data acquisition on page 14
Start memory storage view - see Viewing stored measurements on page 14
Display program identification information
6.2 Taking measurements
Once the system is calibrated, you should be able to begin taking measurements. Connect the top-mounted inlet to accept your sample; typically this is accomplished by connecting to a sampling valve. The sample flow can be regulated by adjusting the knurled stainless steel knob on top of the flow chamber.
Minimum flow rates, measurement limits and response times for the various available membranes
are given in the accompanying Sensor Manual.
To switch between gas measurement and temperature measurement, press the Up/Down Arrow
buttons.
To backlight the LCD for approximately three minutes, press the Backlight button.
Note: For measurements of gaseous samples in % units, you must open the barometric pressure sensor relief
valve switch on the top of the instrument from time to time, to allow the pressure inside the instrument to equilibrate
to the barometric pressure.
6.3 Preconditioning sensors
You can expect a more rapid and accurate first result if you precondition the sensor before you take
readings. You may want to precondition prior to each series of measurements, depending on
frequency of use.
To precondition, connect the flow chamber to a convenient sampling source at or below expected O2 levels. If measurements take place in carbonated samples, precondition with carbonated water.
Open the sampling valve on the flow chamber, just enough for a trickle. Then, switch on the
instrument and watch the LCD. You will see the values drop.
If you are measuring in the 0.1 ppm range, then only a short time is required for the display to fall to
this level, whereas ppb level measurements may require the sensor to remain exposed to the sample
for half an hour or more. You will establish your own requirement with experience.
Once the LCD displays the expected level of O2, close off the sample. The sensor is now preconditioned and ready to use.
6.4 Storing measurements in the instrument
The instrument will store up to 500 gas measurement values, labeled by numbers 0 through 499,
along with the current date and time of each measurement. You have the choice of acquiring this
information manually or automatically, as described below.
Before storing measurements, you should verify the date and time settings of the instrument's
internal clock, as described in Clock settings on page 22.
6.4.1 Automatic data acquisition
Note: When the instrument is used to automatically store measurement data, all buttons except the POWER key are disabled. If enough time elapses to store all 500 values, the instrument returns to normal measurement mode and the buttons are re-enabled.
Before starting automatic measurement storage, first select the sampling rate desired using the
Sampling Rate menu of the WinLog97 program (see Automatic data acquisition - setting sampling
intervals on page 18).
1. Switch the instrument OFF (by pressing the POWER key).
2. Then hold down the STO button while switching the instrument back ON. The LCD displays the
message Sto for about one second.
3. Normal gas concentration measurements are displayed for about two minutes.
4. After two minutes the instrument displays the sample number (starting at 000), then the gas
concentration measurement value followed by [- - -] to indicate the measurement is being stored.
5. This storage sequence repeats automatically, at the rate specified by the WinLog97 program
Sampling Rate menu. Values are stored sequentially in sample numbers 000 through 499.
Note: If you have not cleared previously stored values, the storage sequence automatically overwrites the older
values, as they are stored.
6. To end automatic storage, switch the instrument OFF (by pressing the POWER key) while it is in
normal measurement mode and not while it is in the process of automatically storing data.
7. Switching ON again without holding down the STO button returns the instrument to measurement
mode.
Note: If you accidentally interrupt the automatic data storage by switching off the instrument while it is in the
process of storing a value, and you then attempt to download the stored values by the WinLog97 program, you
will get a Windows Checksum Error message, and you will not be able to view the measurement data. If this
happens, then go back to the instrument and manually log one more value (as described in Manual data
acquisition on page 14). You can then download your original set of values to your PC.
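The storage behavior described above (500 sequential sample numbers, 000 through 499, which wrap around and overwrite the oldest values once memory is full) is that of a ring buffer. A minimal sketch of that behavior, purely illustrative and not instrument firmware:

```python
# Minimal model of the instrument's 500-slot measurement memory as the
# manual describes it: values are stored sequentially in sample numbers
# 000-499, and once slot 499 is filled the sequence wraps around and
# overwrites the oldest stored values.

MEMORY_SLOTS = 500

class MeasurementMemory:
    def __init__(self):
        self.slots = [None] * MEMORY_SLOTS
        self.next_slot = 0

    def store(self, value):
        """Store a value in the next sample number, wrapping at 499."""
        slot = self.next_slot
        self.slots[slot] = value
        self.next_slot = (slot + 1) % MEMORY_SLOTS
        return slot  # the sample number the value was stored under

memory = MeasurementMemory()
first = memory.store(8.25)        # stored as sample 000
for value in range(MEMORY_SLOTS):
    memory.store(value)           # fills 001-499, then overwrites 000
```

After the loop, sample 000 holds the newest value rather than the first one stored, which is exactly the overwrite behavior the note above warns about.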
6.4.2 Manual data acquisition
Note: You cannot store measurement data manually if the instrument has already been set up to store the data
automatically.
1. For the first measurement you wish to store, press the STO button once to display a sample
number. The default sample number is 000 (for first time access), or the last used memory
position where data was stored, incremented by a value of 1.
2. You can increase or decrease this number by pressing the Up/Down Arrow buttons within three
seconds.
3. Should you decide at this point not to store this particular measurement, just wait five seconds and the display returns to measurement mode. You may also exit this routine by pressing the MEAS button.
4. Press STO a second time, within five seconds of the first. The instrument then displays a brief clearing [- - -] message, followed by the gas concentration measurement value for about three seconds.
5. The [- - -] message is displayed as this measurement value is stored.
6. Repeat the above steps to store additional measurements.
If you stored the first value as sample 001, the instrument automatically increases the next
storage location, and labels it sample 002. You can increase or decrease this number by pressing
the Up/Down Arrow buttons. If you label a sample number the same as a previously stored
measurement value, the new measurement value overwrites the previously stored value.
6.4.3 Viewing stored measurements
1. Switch the instrument OFF (by pressing the POWER key).
2. Hold down the Up Arrow button while switching the instrument back ON. The LCD displays a
sample location number.
3. Scroll through the numbered sample locations of all the stored values using the Up Arrow and
Down Arrow buttons.
4. To view the actual gas concentration measurement value at a particular sample number, press
the STO button. The LCD now displays the stored value for that sample number.
5. Press STO a second time to return to the next numbered location display, to continue scrolling or
view another stored value.
6. To return to the measurement mode, switch the instrument OFF and then back ON again without
holding down any additional buttons.
6.5 Storing and accessing measurements from the PC
If you have made measurements and stored them in the instrument, you should be ready to bring
them into the WinLog97 program for viewing, copying, saving and printing. See also Options setup
on page 17 for additional information on the WinLog97 program.
6.5.1 Downloading stored values
To download the stored results from the instrument to the PC, choose the DownLoad data command
from the Logger menu.
The DownLoad window presents a display of the stored measurements from the instrument. The
window displays five columns of data:
Sample (sequence number of the sample)
Gas (concentration of the measured gas)
Date (date of the measurement)
Time (time of the measurement)
Sample Description
The descriptions can be modified for your applications using the procedures described below.
6.5.2 Altering the sampling point descriptions
For help in identifying the locations of various sampling points that are stored by the instrument, you
may choose the Sampling Point Description command from the Logger menu to bring up the
dialog box.
The sampling point positions 0 through 499 (identified as Text 0, Text 1, etc.) can be described however you wish. Double-click on a particular position (or click Modify), then type a description in the box. Choose OK when finished entering a description.
When you Close this box, your modifications will be saved, and will appear in the Sample
Description column for the next downloaded list. These descriptions can be modified again later as
your requirements change.
6.5.3 Copying values
To copy the results to the Windows Clipboard, so that the data can be pasted into a spreadsheet,
word processor or other Windows program that accepts tabular text information, choose the
Clipboard command from the Export menu.
6.5.4 Saving values
To save this list of measurements as a text (.txt) file, capable of being recalled by the
WinLog97 program or imported as a file into other Windows programs, choose the Save As
command from the File menu. A dialog box appears, with a space to fill in with an eight-letter name.
(The program automatically attaches a .txt suffix to these files.) If you have saved previous files, a
grayed-out list of these names appears as well. Typical to Windows programs, Directories and
Drives boxes can be used to locate other places to save. You may also type the drive and directory
yourself when saving the file.
6.5.5 Printing values
To place this list of measurements into a tabular format and send it to the Windows printer, choose
the Print command from the File menu. The program asks you to enter Title and Author
information. Note that the Date is fixed by your operating system. The resulting printed list will
include this information on each page.
6.5.6 Clearing stored values
To clear all the values stored in the instrument via the WinLog97 program, choose the Clear Data
command from the Logger menu. Since this action will clear the storage memory of the instrument, a
warning appears first. Choose OK to bring up the next dialog box to confirm the clear action. Choose
Clear to start the memory clear operation. A message, Reset should be completed appears in this
box when the task is finished.
Note: You can accomplish the same thing passively, by simply allowing the analyzer to overwrite a set of stored
values with new ones.
6.6 Monitoring measurements in real-time
You may wish to analyze a particular sampling point via the WinLog97 program's Monitoring menu.
To use this Monitoring chart, the instrument must be connected to your PC. Choose Monitoring from
the WinLog97 menu to bring up a chart display.
The Monitoring chart shows the gas concentration (in blue), temperature (in red), and pressure (in
green) as the sample is being measured by the instrument. The chart is updated directly from
instrument measurements, at a rate determined by the time scale set in the TIMEBASE box at the
lower right corner of the chart.
Click the TIMEBASE up/down pointers to change the time scale of the divisions of the chart. Each
division mark along the baseline (1, 2, ...10) can be made to represent from 30 seconds to 2½ hours,
providing from 5 minutes to 25 hours of continuously displayed samples. The chart updating rate is
determined by the time scale selected.
Timebase: Updating rate*; Maximum samples (10 divisions)
30 seconds/division: 5 seconds/sample; 60
1 minute/division: 5 seconds/sample; 120
10 minutes/division: 5 seconds/sample; 1,200
30 minutes/division: 9 seconds/sample; 2,000
1 hour/division: 18 seconds/sample; 2,000
2.5 hours/division: 45 seconds/sample; 2,000
*The chart's updating rate is independent of the acquisition rate (see Automatic data acquisition - setting sampling intervals on page 18).
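The maximum-samples figures follow directly from the timebase and updating rate: 10 chart divisions at a given number of seconds per division, divided by the seconds per sample, capped at 2,000. The 2,000-sample cap is inferred from the table rows rather than stated explicitly in the manual.

```python
# Reconstructing the "Maximum samples" column of the timebase table:
# 10 divisions of chart at N seconds per division, one sample every R
# seconds, capped at an apparent 2,000-sample limit (the cap is an
# inference from the table, not a stated specification).

CHART_DIVISIONS = 10
SAMPLE_CAP = 2000

def max_samples(seconds_per_division, seconds_per_sample):
    total_seconds = CHART_DIVISIONS * seconds_per_division
    return min(total_seconds // seconds_per_sample, SAMPLE_CAP)

# Reproducing each table row:
assert max_samples(30, 5) == 60        # 30 s/div, 5 s/sample
assert max_samples(60, 5) == 120       # 1 min/div
assert max_samples(600, 5) == 1200     # 10 min/div
assert max_samples(1800, 9) == 2000    # 30 min/div
assert max_samples(3600, 18) == 2000   # 1 hr/div
assert max_samples(9000, 45) == 2000   # 2.5 hr/div
```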
Click on the Continuous box, in the lower right corner, to enable or disable continuous charting. When this box is checked, the chart scrolls continuously after reaching the 10th division, and the oldest samples are lost off the left of the chart. When Continuous is not checked, the chart stops displaying new results after reaching the 10th division, and all subsequent measurements are lost.
Click the up/down pointers for each measurement variable (GAS, TEMPERATURE and
PRESSURE) at the right of the chart to change the scaling of that value on the chart. The display of
each measurement variable may be turned on or off by choosing the appropriate On or Off switch at
the right of the chart.
If your measurements do not chart properly, try using a higher or lower value scale or time base than
the one displayed. Adjust these scale factors before starting the monitoring operation. A running
display of latest sample Gas, Temperature and Pressure is also shown in the bottom-right corner of
the chart.
Use the buttons at the bottom of the chart to control real-time monitoring. Choose Go to clear the
chart and start real-time monitoring display, Stop to stop real-time monitoring and Copy to copy the
data from the chart as text information to the Windows Clipboard. This information can be pasted
from the clipboard into any Windows application, such as a spreadsheet or word processor. Finally,
choose Close to close the Monitoring window.
16
English
Section 7 Options setup
The WinLog97 program is an integral part of the analyzer. Running under Microsoft Windows®, it
permits you to list and analyze up to 500 stored measurement values. The program also includes a
special monitoring feature, which lets your computer act as a chart recorder, and enables a hardware
test to ensure that the system is in good working order.
7.1 Main menu basics
When you start the program, it displays the Main Menu, which automatically maximizes on opening.
File serves typical Windows file management needs.
The Logger menu lets you download measurement values from the instrument, make
modifications to the sample list that can be used to identify sampling point locations, or clear the
instrument's stored values.
Export places your information into the Windows Clipboard, so that it can be pasted directly into
other Windows programs. This is especially useful when working with spreadsheet programs, but
the information can be pasted into word processing programs as well.
The Monitoring menu creates a running chart of real-time measurements (see Monitoring
measurements in real-time on page 16). These can also be saved to the Windows Clipboard.
The Configuration menu lets you see how your system has been configured for your application.
You may change the PC's COM port, the sensor membrane, automatic data acquisition rate, or the
sensor calibration mode. You may also lock out the instrument's CAL button, or for calibration
using a span gas, you may enter the span gas percentage.
The Troubleshooting menu includes a series of tests, permits the setting of the clock, and
enables a barometric pressure calibration routine.
The Help menu gives access to the Help file and identifies the version of the WinLog97 program.
7.2 Instrument - PC connection
For the hardware connection of the instrument to a PC, see 3650Ex Instrument - PC connection
on page 10. The Configuration, Serial port menu lets you choose one of four serial communication
ports.
Usually, COM1 is used to connect to a mouse, so try COM2 first. You may find that a separate SetUp
program supplied with your PC is necessary to activate this port. Click on OK to activate the selected
port. If the port you have selected here is adequate, the WinLog97 program will return to the main
menu. Otherwise, you will see an RS232 ERRORS message advising you to select another port.
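WinLog97's own dialog handles the retry, but the suggested order of attempts (COM2 first, then the remaining ports after an RS232 ERRORS message) can be expressed as a simple helper. This is illustrative only and not part of the WinLog97 program:

```python
# Suggested order of attempts: COM2 first (COM1 is often taken by a mouse).
PORTS = ["COM2", "COM1", "COM3", "COM4"]

def next_port(failed):
    """Return the next port to try after `failed`, or None when all
    four serial ports have been exhausted."""
    i = PORTS.index(failed)
    return PORTS[i + 1] if i + 1 < len(PORTS) else None
```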
7.3 Reviewing instrument configuration
To verify that the analyzer is set up as expected, choose the Configuration, Configuration view command.
You may change a number of these settings using the WinLog97 program. The modifiable settings,
and information relating to these settings are listed in the next section, Configuring the instrument.
However, should you see any unexpected items listed on your screen which you are unable to
correct, please contact your Hach Lange representative.
7.4 Configuring the instrument
The analyzer can be readily configured for your application using the following commands in the
Configuration menu. The instrument must be connected to your PC in order to change its
configuration.
7.4.1 Automatic data acquisition - setting sampling intervals
The instrument can perform as a standalone data acquisition device, automatically recording gas
measurements with the date and time, and storing up to 500 of these values. Choosing the
WinLog97 program's Configuration, Sampling Rate menu lets you select time intervals (acquisition
rate) for this storage capability.
Use the slide bar to view and select a sampling rate, from 15 seconds to 1 hour. The selected rate is
shown in the Acquisition rate window. Click OK to save this rate. Once your choice is made, the
instrument can be used independently of the WinLog97 program for data acquisition, as described in
Automatic data acquisition on page 14.
Note: The Acquisition rate set via this menu is independent from the monitoring chart updating rate described in
Monitoring measurements in real-time on page 16. The sampling rate menu applies only to automatic data
acquisition, while the chart updating rate is used only for displaying real-time results via the monitoring chart.
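Because the store holds at most 500 values, the chosen interval fixes how long the instrument can log unattended before the memory fills. A quick calculation (Python, illustrative):

```python
def logging_hours(interval_seconds, capacity=500):
    """Hours of autonomous logging before the 500-value store fills up."""
    return capacity * interval_seconds / 3600

# 15 s interval -> about 2 hours of logging
# 1 h interval  -> 500 hours (roughly three weeks)
```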
7.4.2 Membrane selection
You may find it necessary to use a different type of membrane for different applications. Naturally,
with any membrane change, you will need to re-calibrate (see Sensor calibration on page 20). You
should also consider the changes in required flow rates and response times, which are specified in
the accompanying Sensor Manual.
To re-configure the analyzer, choose Configuration, Membrane to bring up the box which reveals
the membrane models available. Choose OK when the desired membrane is selected.
7.4.3 Selecting type of calibration
7.4.3.1 Select from the PC
You can use the Configuration, Calibration mode command to select how the sensor is to be
calibrated.
Note: Only the calibration modes applicable to your sensor can be selected. Others will be grayed out.
Choose In Air to allow calibration of the sensor in air. Choose In Line to calibrate the sensor directly
in the sampling line, in a sample of known gas content. In instrument models that measure gaseous
samples, you may select In a Span Gas to calibrate. Choose OK when the desired mode is selected.
7.4.3.2 Select from the instrument
1. Switch the instrument power OFF.
2. Switch the instrument on by holding down the CAL button, then pressing the POWER button while still holding the CAL button.
3. The instrument display shows SPA for span gas calibration, LI for in line calibration, or Air for in air calibration.
4. Use the Up/Down Arrow buttons to change the mode to your choice.
5. Press the STO button to set the selected mode. The instrument displays Sto for a few seconds, then returns to measurement mode.
7.4.4 Locking out the instrument's CAL button
You can use the Configuration, Calibration Key Status menu to prevent an accidental sensor re-
calibration from the instrument keyboard.
Choose Disabled to lock out the keyboard CAL button. To unlock this capability, choose Enabled.
Choose OK when the desired mode is selected.
7.4.5 Sensor calibration range checking
When calibration is performed for In Air and In a Span Gas calibration modes, the sensor
measurement current is compared to an ideal current for the selected membrane to determine
whether or not to complete the calibration. You can use the Configuration, Calibration Range
Checking menu to enable or disable sensor calibration range checking in the instrument.
Choose Disabled to calibrate without checking the value of the measurement current, within a range
of 0% to 999% of the ideal current.
When set to Enabled, the measurement current at calibration must be between 25% and 175% of the ideal current; if the value is outside these limits, the calibration fails and Err is displayed on the instrument LCD. Choose OK when the desired mode is selected.
Note: It is recommended to leave range checking enabled. In special measurement situations it may be necessary
to disable range checking. However, contact a Hach Lange representative for further details before disabling this
feature.
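The check described above amounts to a single comparison of the measured current against the membrane's ideal current. The sketch below (Python, illustrative; the limits are the ones stated in the text) mirrors the enabled and disabled behavior:

```python
def calibration_ok(measured, ideal, range_checking=True):
    """Accept a calibration if the measured current falls within limits
    of the ideal current: 25-175% when checking is enabled,
    0-999% when it is disabled."""
    pct = 100.0 * measured / ideal
    low, high = (25, 175) if range_checking else (0, 999)
    return low <= pct <= high   # False -> the instrument displays Err

# calibration_ok(1.8, 1.0) -> False (180% of the ideal current)
```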
7.4.6 Entering a span gas value
When calibrating the sensor in a span gas, use the Configuration, Span Gas menu to enter the
concentration of the gas to be measured in the span gas. Enter the percentage of measurement gas
in the span gas (e.g. 10.00%), then choose OK.
7.4.7 Dual use (model 3650/113 only)
7.4.7.1 Change from the PC
Use the Configuration, Dual Use menu to change the measurement phase (either dissolved or
gaseous) for the model 3650/113 dual-use analyzer. Choose ppm (dissolved) to set the instrument
for dissolved measurement in liquids, or % (gaseous) to set the instrument to gas phase
measurement.
7.4.7.2 Change from the instrument
1. Switch the instrument power OFF.
2. Switch the instrument on by holding down the Down Arrow button, then pressing the POWER button while still holding the Down Arrow button.
3. The instrument first displays USE, followed by either dIS for dissolved phase measurement in liquids or gAS for gaseous phase measurement.
4. Use the Up/Down Arrow buttons to change the mode to your choice.
5. Press the STO button to set the selected mode. The instrument displays Sto for a few seconds, then returns to measurement mode.
Section 8 Calibration
8.1 Pressure calibration
Since the instrument is sealed against moisture, you must open the barometric pressure sensor relief
valve switch on top of the instrument to permit the instrument to achieve atmospheric pressure
equilibrium, and take an accurate barometric pressure reading. This must be done with every
calibration. To open the relief valve, depress the switch, hold it down for five seconds, then release.
If you have access to an accurate barometer, you may wish to calibrate the instrument's internal
barometric pressure sensor. This is done using the PC WinLog97 program. Choose
Troubleshooting, Pressure Calibration and an informational message will appear to warn you that
the instrument’s current pressure calibration will be lost.
Choose OK to continue. The calibration procedure then displays a Pressure Calibration dialog box.
The Measured Pressure value shows the current instrument pressure reading.
Enter the current atmospheric pressure, in mbars, in the Calibration Pressure entry box. Choose
Calibrate to direct the instrument to read and display the Measured Pressure using this calibration
value. Choose Quit when you are satisfied with the pressure calibration to return to normal
operation.
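Internally this is a single-point calibration. Hach does not document the exact correction, but a plausible model is a constant offset between the entered barometer value and the instrument's reading at calibration time (an assumption, sketched in Python):

```python
def corrected_pressure(raw_mbar, measured_at_cal, reference_mbar):
    """Assumed single-point offset model -- not documented by Hach.
    Readings shift by the difference between the barometer reference
    entered in the Calibration Pressure box and the instrument's
    Measured Pressure at the moment of calibration."""
    return raw_mbar + (reference_mbar - measured_at_cal)

# If the instrument read 1008 mbar when the barometer said 1013 mbar,
# later raw readings are shifted up by 5 mbar.
```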
8.2 Calibration range checking
When calibrating the sensor in air or in a span gas, as detailed in Sensor calibration on page 20,
the new calibration current should be between 25% and 175% of the ideal current (which is stored in
the instrument's non-volatile memory for each membrane). If not, the instrument displays Err on its
LCD and the system will not calibrate. When the system will not calibrate, it is likely that a sensor
service will be required.
Note: This calibration range checking can be enabled or disabled using the WinLog97 program (see Sensor
calibration range checking on page 18). It is recommended to leave this checking feature enabled. In special
measurement situations it may be necessary to disable range checking. This will allow calibration between 0% and
999% of the ideal current. Contact a Hach Lange representative before disabling this range checking feature.
8.3 Sensor calibration
When delivered, the sensor is pre-calibrated. However, it should be re-calibrated on site, when being
used for the first time, and always after a membrane change. If you have just replaced the
membrane, allow at least half an hour for the membrane to settle before attempting to calibrate.
If you want to verify the accuracy of the calibration, place the analyzer back in measurement mode
and compare your displayed gas concentration against the value in the appropriate tables found in
the accompanying Calibration Tables booklet.
Your calibration is stored internally and is valid for the life of the sensor's membrane, thus it is not
necessary to repeat the calibration procedure until after the next membrane change.
The sensor can be calibrated either by using a span gas, directly in line in a liquid sample, or in air.
The mode of calibration may be selected using the WinLog97 program or directly from the instrument
(see Selecting type of calibration on page 18 for more details).
8.3.1 Calibration in span gas
The span gas calibration procedure may be used if you have a supply of gas with a known
concentration of O2 (in % units). To perform this type of calibration, the instrument must be set for
calibration In a Span Gas. The WinLog97 program also must be used in this procedure.
Switch on the instrument, if necessary, and wait a minute or so for the displayed measurement to
settle. Then expose the sensor to a span gas sample with a known oxygen concentration. Enter the
percentage of oxygen in the span gas using the WinLog97 program (see Entering a span gas value
on page 19).
1. Press the CAL button. Remember, this button may have been locked out to prevent an accidental
reset (see Locking out the instrument's CAL button on page 18 for details).
2. A brief clearing [- - -] message appears.
3. Press CAL again within a 3 second period.
4. The ratio of the measured current to the ideal current is displayed as a percentage.
5. When the reading is stable, press CAL again.
If the new calibration current is between 25% and 175% of the ideal current, the instrument displays
CAL and returns to the measurement mode. Calibration is now complete, the sensor is
calibrated, and you can proceed with your measurements. If the new calibration current is not
within this range, the instrument displays Err and returns to measurement mode. The system will
not calibrate, and it is likely that a sensor service will be required.
8.3.2 Calibration in line
The in line calibration procedure can be used to calibrate the sensor directly in line, against a liquid
sample with a known dissolved oxygen concentration. To perform this type of calibration, the
instrument must be set for calibration In line. Switch on the instrument, if necessary, and wait a
minute or so for the displayed measurement to settle. Expose the sensor to a liquid sample with a
known gas concentration.
1. Press the CAL button. Remember, this button may have been locked out to prevent an accidental
reset (see Locking out the instrument's CAL button on page 18 for details).
2. A brief clearing [- - -] message appears.
3. Press CAL again within a 3 second period.
4. A measurement will flash on the LCD, showing the oxygen concentration of the calibration
sample.
5. Assuming you know the gas content to be a certain value, you can adjust the displayed value with
the Up/Down Arrow keys.
