AIAG Year 2000

Test Procedures




NOTICE

This "Year 2000 Test Procedures" document contains test cases from several sources which may be of assistance in the planning of test activities for your systems, equipment, products and services. It is provided solely for general informational purposes, and not as advice or specific recommendations on how you should conduct an assessment, testing or conversion program for your particular systems, equipment, products or services. Neither the Year 2000 AIAG Information Center nor its agents or consultants makes any representation or warranty regarding the test procedures and other information contained in this document, or the information or results which may be generated by your use of the test procedures, processes, and other information provided in this document.

YOU ARE SOLELY RESPONSIBLE FOR THE TESTING AND ASSESSMENT OF YOUR SYSTEMS, EQUIPMENT, PRODUCTS AND SERVICES, AND FOR ANY INFORMATION WHICH YOU MAY SUPPLY TO THE YEAR 2000 AIAG INFORMATION CENTER, OR TO ANY OTHER PARTY, REGARDING THE "YEAR 2000" COMPLIANCE STATUS OF YOUR SYSTEMS, EQUIPMENT, PRODUCTS AND SERVICES.


 

 

Table of Contents

1. Introduction *
1.1 Executive Summary *
1.2 Year 2000 Information Overview *
1.3 Control System Component Type *
1.3.1 HMI *
1.3.2 Weld Controller *
1.3.3 Instrumentation *
1.3.4 Specialized System *
1.3.5 Motion Control *
1.3.6 PLC *
1.3.7 Information Co-Processor *
1.3.8 Robot *
1.3.9 CNC *
1.3.10 Vision System *
1.3.11 BIOS *
1.3.12 PC/Micro/Mini *
1.3.13 Software *
1.3.14 Software Custom *
1.3.15 Integrator *
1.4 Glossary *
1.5 References *
1.6 Testing Tools *
2. Test Planning *
2.1 What to Test *
2.1.1 Inventory *
2.1.2 Resource Availability *
2.2 Which Tests Apply *
2.3 Component Summary Documentation *
2.4 Combined Component Testing *
2.5 System *
2.6 Cluster Testing *
2.6.1 Cluster Definitions *
2.6.2 Cluster Tests *
2.7 Cross Cluster Testing *
2.8 Procedures *
2.8.1 Pre-Test *
2.8.2 Testing *
2.8.3 Post-Test *
2.9 Documentation *
2.9.1 Test Setup *
2.9.2 Test Procedures *
2.9.3 Test Results *
2.10 Resources *
2.10.1 Skills *
2.10.2 Equipment *
2.10.3 Tools *
3. Test Procedures *
3.1 Critical Date Values for Year 2000 Testing *
3.2 Rollover, Reboot, Day of Week Tests *
3.2.1 Rollover - 1999 to 2000 - Power on *
3.2.2 Day of Week *
3.2.3 Reboot - Date retention *
3.2.4 Rollover - 1999 to 2000 - Power Off *
3.3 Manual Date Set Test *
3.3.1 Date Set - 1 Jan 2000 *
3.3.2 Date Set - Date retention *
3.3.3 Date Set - 29 Feb. 2000 *
3.4 Leap Year Test *
3.4.1 Leap Year - Rollover 2/28 - Power On *
3.4.2 Leap Year - Reboot 2/29 *
3.5 Date Window Tests *
3.5.1 Date Window Test - Below Limit *
3.5.2 Date Window Test - Above Limit *
3.5.3 Date Window Test - Change Limit *
3.6 Other Date Representation Tests *
3.6.1 DOY - 29 February 2000 *
3.6.2 DOY - 31 December 2000 *
3.6.3 DOY - Invalid Dates *
3.7 Arithmetic Date Tests *
3.7.1 Days in 2000 *
3.7.2 Days across 1999/2000 Boundary *
3.7.3 Days across leap year *
3.8 Upload / Download Tests *
3.8.1 Upload *
3.8.2 Download *
3.9 Special Value Test *
3.10 File or Directory Creation Test *
3.10.1 File - Creation 2000 *
3.10.3 File - Replacement 2000-2000 *
3.11 Audit Log Test *
3.11.1 Audit Log *
3.12 Report Tests *
3.12.1 Report - Query *
3.12.2 Report - Sort *
3.12.3 Report - Merge *
3.12.4 Report - Search *
3.13 Log file purge Test *
3.14 Timer Test *
3.15 Input Data Test *
3.16 Output Data Test *
3.17 Activation/Deactivation Tests *
3.17.1 Valid access *
3.17.2 Expired access *
3.18 Display Data Tests *
3.18.1 Display Data Test *
3.19 Indirect Date Usage Tests *
4. Appendix *
4.1 Test Issues Checklist *
4.1.1 General Integrity *
4.1.2 Date Integrity *
4.1.3 Explicit First 2 Digits of Year *
4.1.4 Implicit First 2 Digits of Year *
4.2 Year 2000 Testing Report Forms *
4.2.1 Test Report Purpose *
4.2.2 General Test Report Instructions *
4.2.3 Year 2000 Component Test Report Instructions *
4.2.4 Year 2000 Component Test Report Form *
4.2.5 Year 2000 Combined Component Test Report Purpose *
4.2.6 Year 2000 Combined Component Test Report Instructions *
4.2.7 Year 2000 Combined Component Test Report Form *
4.2.8 Year 2000 System Test Report Purpose *
4.2.9 Year 2000 System Test Report Instructions *
4.2.10 Year 2000 System Test Report Form *
4.3 Year 2000 Plant Notebook *
4.4 Sample Plant (or Site) Test Plan *
4.4.1 Body Shop *
4.5 Sample A Manufacturing System Test Plan *
4.5.1 Manufacturing System Name : *
4.6 Sample B Manufacturing System Test Plan *
4.6.1 Manufacturing System Name : *

 

1. Introduction

The objective of the Year 2000 Test Procedures document is to assist in planning and performing Year 2000 test activities for manufacturing systems, equipment, products and services.

This document has been developed from several sources and pilot experiences. The test cases included are based on industry-wide information and actual Year 2000 problems encountered during several system evaluations. These test procedures can be used to test commercial off-the-shelf components and systems assembled from those components.

1.1 Executive Summary

This document contains information about Year 2000 testing in several sections and an appendix. The first section contains an overview of the document, a Year 2000 glossary, and references. A diagram illustrates the relationships between the Year 2000 inventory and test documents. The second section contains guidelines for planning testing activities. The third section contains test procedures for components and manufacturing systems. The fourth section addresses the process for Year 2000 compliance. The appendix contains a testing issues checklist, sample test plans, and forms for reporting test results.

Testing systems will require judgment and knowledge of the system under test. Careful documentation of all testing will be necessary due to the large number of systems each engineer is responsible for maintaining. When problems are discovered, careful documentation will allow the vendor of the system or other teams to re-create the problem for analysis and/or corrective action as required.

The inventory of components used in the construction of manufacturing & facilities control systems must follow standard naming conventions for the purpose of sharing information about test results supplied by vendors, other facilities, and other sources. As evidenced by the scope of this document, there is enormous opportunity to duplicate effort if standards are not followed.

Please take the time to understand the entire document. It is important to recognize that judgment must be exercised to decide which tests apply to a specific component or system. Time spent planning will be paid back many times, since experience has shown that much time is wasted if the test procedures are not organized in the most efficient order for actually performing the tests.

An alternative to testing is to review the computer code and look for problems manually or with another computer program. This is a popular method in the COBOL environment, and many tools exist specifically for the purpose of finding Year 2000 problems in COBOL programs. This method can be adapted to the controls environment, and your test team may want to investigate tools and processes for bench testing or code review in the controls environment. Code for a specific PLC can be examined using cross reference listings to see if the year function or register is being used. Other PLCs may have specific configuration blocks which must be set up for the date to be used. Where these methods prove out, the test effort can be reduced to simply bench testing the PLC code. Some vendors are currently developing programs to scan the logic for their more popular products; as information is collected this document will be updated.

 

 

1.2 Year 2000 Information Overview

The following diagram illustrates the relationship of the inventory and testing information.

 

1.3 Control System Component Type

The following sections describe the categories or component types of computer based devices typically used in manufacturing & facilities systems. This information summarizes the equipment which may be affected by Year 2000 problems and matches the inventory process. As the complexity of a control system increases, the probability also increases that a date related problem exists. For the purpose of defining test requirements, the microprocessor based devices and control systems encountered in the manufacturing environment are grouped by the following component types:

This list is ordered generally by increasing level of complexity of the control system class and the level of testing required. It is not possible to test a complex system 100%, but a reasonable effort can be made to identify Year 2000 problems with a limited number of test case scenarios. (See section 2.2 Which Tests Apply)

1.3.1 HMI

The Human Machine Interface (HMI) component type includes devices dedicated to display or input data. Increasingly, intelligent HMIs are built using PCs and should be tested as PCs; HMIs that do not include PCs should be tested as HMIs. (See section 1.3.12 PC/Micro/Mini)
Examples include:

  1. Allen-Bradley PanelView 550, 900, 1200
  2. Cutler-Hammer IDT PanelMate
  3. Unipo

1.3.2 Weld Controller

The weld controller component type includes any controls related to welding: weld timers, weld controllers, and stud welders.
Examples include:

  1. Pertron Paragon G
  2. Bosch PS2081

1.3.3 Instrumentation

The instrumentation component type includes data loggers, remote instrument transmitters, and analytical instruments in quality laboratories.
Examples include:

  1. Promess EPR 12006
  2. Leeds & Northrup Speedomax

 

1.3.4 Specialized System

The specialized system component type includes control systems which are designed and manufactured to solve a specific problem. Controls are single purpose, allowing configuration but no programming. The number of Year 2000 problems for this class is expected to be low. However, there have been problems with time stamped printouts failing to function correctly on weigh scales, and with fire alarm systems which no longer report fires after the year 2000.
Examples include:

  1. Wheel Alignment System
  2. Head Light Aim System
  3. Tire Balancing Equipment
  4. Single Loop Controllers
  5. HVAC controllers
  6. Weigh scales

1.3.5 Motion Control

The motion control component type includes AC drives, DC drives, servo systems, and single axis controllers.
Examples include:

  1. Allen-Bradley IMC110, 1336 AC Drive
  2. GE Fanuc D3B, DC300
  3. Modicon Quantum Sercos Controller

1.3.6 PLC

The Programmable Logic Controller (PLC) component type (known in Europe as step process controllers, or SPCs) is normally programmed in ladder logic and must have a custom application program. Most PLC controllers are Year 2000 compliant, but each controller contains a custom application program which may or may not handle date information correctly. During the inventory, PLCs should be surveyed for information co-processors.
Examples include:

  1. Allen-Bradley PLC-5/40C
  2. Modicon 984
  3. Bosch CL500

1.3.7 Information Co-Processor

The information co-processor component type contains intelligent sub-systems or control modules used in PLCs or other modular systems which are designed for expansion.
Examples include:

  1. CPU modules (Bosch ZS500)
  2. Basic modules (Allen-Bradley 1771-DB)
  3. ASCII modules (Allen-Bradley 1771-DA)
  4. Real-time clock modules (Allen-Bradley 1771-DC)
  5. Co-processor modules (Allen-Bradley 1771-DMC)
  6. Communications modules (Bosch DB500, Bosch RM500)

1.3.8 Robot

The robot component type includes the complex motion controller which is optimized to translate tool path data into the required motion on non-orthogonal joints; some models include a PC in the control for user interface and networking. The inventory should include the robot controller (not the mechanical arm) model and version.
Examples include:

  1. ABB S3
  2. ABB S4/PC
  3. Fanuc RH, RJ, RJ-2

1.3.9 CNC

The Computer Numerical Controls (CNC) component type includes motion controls which process a part program to define part geometry and machine motion. Most modern CNCs include a 3.5" floppy drive with an associated PC compatible file system. Fault logging functions which time stamp the occurrence of machine faults are standard features. Many CNCs are used in an environment where part programs are downloaded for every part which is processed. Typical CNCs contain PLC style logic used to "marry" the control to a specific machine configuration, a part-program interpreter, real-time motion control, and even graphical part-programming support functions.
Examples include:

  1. GE Fanuc SERIES 15-M, SERIES 18-TB
  2. Allen-Bradley 8600, Series 9/260
  3. Okuma OSP 5020M/L

1.3.10 Vision System

The vision system component type includes dedicated vision systems which may be assembled from other components. If the other components are identifiable by vendor, model, and version, list the individual items also.
Examples include:

  1. Perceptron 1500
  2. Allen-Bradley VIM2

1.3.11 BIOS

The Basic Input Output System (BIOS) component type is included to emphasize the source of several Year 2000 related problems. It may be possible to resolve some Year 2000 problems by upgrading the BIOS. The BIOS of a PC can be identified using special PC programs or during the boot sequence on older, slower PCs. Some operating systems can display the BIOS vendor, model, and version.
Examples include:

  1. Award Modular 4.40
  2. Phoenix 8D 486 V1.10
  3. AMI 3.03

1.3.12 PC/Micro/Mini

The PC/Micro/Mini component type includes all computer systems from micro-computers to mini-computers. During the inventory process the PC BIOS, operating system, and other software should be listed, where known, as individual components. This is by far the most likely type of system to have Year 2000 related problems. The combination of PC hardware real-time clock (RTC), BIOS, and operating system is frequently present in the manufacturing environment. Recently, PCs are increasingly used with PLC application programs, and mini-computers are used as well as PCs. Documenting a control system that uses a combination of hardware and software with different versions and implementation tools may be of little use unless the information can be used and shared to identify known problems. The effect of Year 2000 problems within unique computer-based controls must be fully documented and tested where possible to ensure any Year 2000 problem is minimized.
Examples include:

  1. DEC VAX 11/780, PDP-11/44
  2. HP 9000/G40, HP1000, HP3000
  3. IBM PC/AT, Compaq Deskpro450
  4. Texas Micro D486

 

1.3.13 Software

The software component type includes operating systems, software packages, and software tools used to support manufacturing and facilities systems. Typical software packages are database, graphics, and specialized data collection packages. Software tools include compilers, assemblers, code management systems, ladder logic programming tools, and PLC status monitoring programs. All packages are also potential problem areas, including runtime libraries used to build an application. Some of the known problems for software packages include: locking up completely at the Year 2000 rollover, failing to write historical data after Year 2000, and incorrect sorting of alarm or history data after the year 2000.
Examples include:

  1. Wonderware In-Touch 5.0b
  2. Bosch MADAP
  3. Microsoft Windows NT3.51
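
The incorrect sorting of alarm or history data noted above is easy to reproduce. A minimal sketch (the record contents are hypothetical, not taken from any listed package):

```python
# Hypothetical alarm records keyed by a 2-digit-year time stamp (YY-MM-DD).
alarms_2digit = ["99-12-31 FAULT A", "00-01-01 FAULT B"]

# Lexicographic sorting places year "00" (2000) before "99" (1999),
# so the 2000 record appears first even though it occurred later:
assert sorted(alarms_2digit) == ["00-01-01 FAULT B", "99-12-31 FAULT A"]

# The same records keyed with 4-digit years sort chronologically:
alarms_4digit = ["1999-12-31 FAULT A", "2000-01-01 FAULT B"]
assert sorted(alarms_4digit) == ["1999-12-31 FAULT A", "2000-01-01 FAULT B"]
```

The same failure mode applies to any query, merge, or purge operation that orders records by a 2-digit-year field.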

1.3.14 Software Custom

The software custom component type includes any custom software used to support manufacturing and facilities systems. Custom software programs include ladder logic programs, robot control applications in KAREL, and computer programs written using any programming language.
Examples include:

  1. Ladder Logic Programs
  2. Computer Programs

Custom software written for a specific system exists in PLCs, Robot Controllers, CNCs, and computers. By definition these custom applications present a risk since they are normally unique and therefore fall into the unknown compliance category. To determine the requirement for testing each application, consider the following factors:

1.3.15 Integrator

The integrator component type is included to capture the vendor or engineer of the system as opposed to the manufacturer of the other components. The integrator is typically the one who would be contacted first in the case of system rather than component problems.

 

1.4 Glossary

The following list of words may have special meaning in the context of the Year 2000 project or this document.

WORD or Phrase

Definition

General integrity

No value for current date will cause interruptions in normal operation.

Date integrity

All manipulations of calendar-related data (dates, duration, days of week, etc.) will produce desired results for all valid date values within the application domain.

Explicit first 2 digits of year

Date elements in interfaces and data storage permit specifying the first 2 digits of the year to eliminate date ambiguity.

Implicit first 2 digits of year

For any date element represented without the first 2 digits, the correct first 2 digits is unambiguous for all manipulations involving that element.

Extended semantics

In general, specific values for a date field are reserved for special interpretation. The most common example is interpreting "99" in a 2-digit year field as an indefinite end date, i.e., "does not expire." Another is embedding a date value in a non-date data element.

Calendar errors

Errors typically include failing to treat 2000 as a leap year and converting incorrectly between date representations.
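
The leap year rule can be stated as a short check; the year 2000 is the case most frequently handled incorrectly. A sketch (not from the AIAG document):

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year is a leap year, except centesimal
    years, which are leap years only when divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 2000 IS a leap year (divisible by 400); 1900 was not.
assert is_leap_year(2000)
assert not is_leap_year(1900)
assert is_leap_year(1996) and not is_leap_year(1999)
```

Software that applies only the every-4th-year and centesimal-exception rules, omitting the divisible-by-400 rule, incorrectly treats 2000 as a non-leap year.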

Date overflow

Many software products represent dates internally as a base date/time plus an offset in days, seconds, or microseconds since that base date/time. Hardware integers holding the offset value can overflow past the maximum corresponding date, an event which may lead to undefined behaviors.
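
As an illustration (a signed 32-bit second count from a 1970 base is one common instance of this scheme, not a product described in this document), the maximum offset corresponds to a date in January 2038:

```python
from datetime import datetime, timedelta

base = datetime(1970, 1, 1)          # assumed base date/time
max_offset = 2**31 - 1               # largest signed 32-bit integer

# The last representable moment; one second later the hardware
# integer wraps negative, producing an undefined (pre-1970) date.
overflow_moment = base + timedelta(seconds=max_offset)
assert (overflow_moment.year, overflow_moment.month, overflow_moment.day) == (2038, 1, 19)
```
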

Inconsistent semantics

At an interface between systems, the software on each side makes assumptions about the semantics of the data passed. Software on both sides must make the same first-2-digits-of-year assumptions about 2-digit years.

First 2 digits of year ambiguity

This is the most common problem. Software represents dates with a 1- or 2-digit year. When software does not recognize that dates are not all in the 19xx range, the results are undesirable.
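
One common repair is a fixed window that assigns the first 2 digits based on a pivot. A sketch (the pivot value 50 is an arbitrary assumption for illustration, not a recommendation of this document):

```python
def expand_year(yy: int, pivot: int = 50) -> int:
    """Expand a 2-digit year with a fixed window: values below the
    pivot are taken as 20xx, values at or above it as 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

assert expand_year(99) == 1999   # interpreted as 1999, not 2099
assert expand_year(0) == 2000    # "00" no longer collapses to 1900
assert expand_year(49) == 2049
assert expand_year(50) == 1950
```

Both sides of an interface must use the same window, which is why the pivot must be documented (see Inconsistent semantics above).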

Gregorian Calendar

Revision of the Julian calendar in 1582 by Pope Gregory XIII, adopted by the US and Great Britain in 1752. It added the rule that centesimal leap years must be divisible by 400 and suppressed 10 or 11 days at adoption.

Julian Calendar

Introduced in Rome in 46 B.C., it established a 12 month year of 365 days with every 4th year having 366 days.

Julian Date

Julian Date (JD) is the number of days since noon on January 1, 4713 BC, plus the fractional part of a day for the time of day past noon.

Modified Julian Date

Modified Julian Date (MJD) is the Julian Date minus 2,400,000.5 which reduced the size of the number of days to 5 digits and shifted the beginning of the time of day to Midnight. For 1997, the MJD is 50448 + DOY (day of year).
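
The MJD arithmetic above can be checked with a short sketch using Python's proleptic Gregorian day count (an implementation convenience, not part of the definition):

```python
from datetime import date

# MJD 0 fell on 1858-11-17, so MJD = a date's ordinal day number
# minus the ordinal of that epoch.
MJD_EPOCH = date(1858, 11, 17).toordinal()

def mjd(d: date) -> int:
    return d.toordinal() - MJD_EPOCH

# For 1997, MJD = 50448 + DOY (day of year), as stated above.
assert mjd(date(1997, 1, 1)) == 50448 + 1
assert mjd(date(1997, 12, 31)) == 50448 + 365
assert mjd(date(2000, 1, 1)) == 51544
```
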

Real-time Clock

A battery operated clock which keeps time when the system is powered off.

Virtual Clock

A software based clock used in some operating systems to maintain the time and date as an operating system service.

Compliance

Year 2000 compliance means that neither performance nor functionality is affected by dates prior to, during, and after the year 2000. Compliance will be demonstrated when the criteria of general, date, and century integrity are satisfied.

Year 2000 Ready

There will be no impact on production nor product quality due to Year 2000 date issues, but compliance is NOT required.

Unit

A Unit is the minimum recognizable level to which equipment containing a date/time function or processor can be broken down. The Inventory will be composed of multiple Units.

Component

Any stand-alone, computer-based, commercial off-the-shelf device or software package that has only one date/time function that can affect test results. The smallest unit of testing for the Year 2000 project.

Combined Component

Any stand-alone, computer-based, commercial off-the-shelf device or software package that has two or more date/time functions that can affect test results.

System

A system is composed of multiple combined components and/or components that form part of a manufacturing process, i.e. Welding Cell, Production Cell, etc.

Cluster

A cluster is composed of multiple systems that constitute a complete manufacturing process, e.g. Body Framing, Paint shop, Hood assembly.

Cross Cluster Testing

The highest level of integration testing which is organized around a particular function or across functions. An example is a product order through product delivery.

Test plan

A test plan is a documented set of test cases and test scripts.

Test case

A test case is a documented test procedure with specific input data and expected test results. Example - rollover Dec 31, 1999

Test procedure

A test procedure is a step by step description of the test to be performed. Example - rollover

1.5 References

The following list of references is pertinent to the Year 2000 project.

1. ANSI X3.30 - Formatting Date Data

2. FIPS-4-1 (Revised 1996-03-25) - Federal or DoD procurements

3. ASC X12 EDI draft standard for trial use, ISO 9735, UN/EDIFACT - Electronic commerce (EDI)

4. The Modified Julian Date (MJD) has been officially recognized by the International Astronomical Union (IAU) and by the Consultative Committee for Radio (CCIR), the advisory committee to the International Telecommunications Union (ITU). The pertinent document is CCIR Recommendation 457-1, Use of the Modified Julian Date by the Standard Frequency and Time-Signal Services, contained in the CCIR "Green Book" Volume VII.

5. The Almanac for Computers also provides information on JD and MJD.

6. Additional, extensive documentation regarding the Julian Date (JD) is contained in the Explanatory Supplement to the Astronomical Ephemeris and Nautical Almanac, and in the yearbooks themselves, now called The Astronomical Almanac.

7. Testing Very Big Systems; David Marks, McGraw Hill, 1992, ISBN 0-07-040433-X

8. Software Testing; Marc Roper, 1994, ISBN 0-07-707466-1

9. Testing Computer Software; Kaner, Tab, 1988, ISBN 0-8306-9563-X

10. Software Testing in the Real World: Improving the Process; Edward Kit, Addison-Wesley, 1995, ISBN 0-201-87756-2

11. Software Testing: A Craftsman's Approach; Paul C. Jorgensen, 1995, ISBN 0-8493-7345-X

12. IEEE Standard for Software Verification and Validation Plans (ANSI/IEEE Standard 1012-1986)

13. IEEE Standard 829-1983 for Software Test Documentation (updated 1991)

1.6 Testing Tools

Certain testing tools may be useful for the Year 2000 project. Be aware that during the pilot testing, one of the tools for PC compatibles returned results which seemed inconsistent with manually performed tests. The 2000.exe test tool was examining the real-time clock, which is not accessible through the user interface, and was testing features not normally accessible from the user interface. The readme file included with the tool provided an expanded explanation of the results. Several test tools exist to test PC BIOS and real-time clock functions. Also, companies may be developing tools for scanning PLC programs for potential Year 2000 problems.

2. Test Planning

This section of the document provides guidelines for writing specific test plans. Example test plans for manufacturing plants and manufacturing systems are included in the appendix. The resources required to conduct the tests are also described. Testing shall be conducted on large systems using the divide-and-conquer method. Unit testing addresses the smallest division for which tests are reasonable; the number of test cases is smaller and the likelihood of finding all the errors is increased. Once a unit has been tested, integration testing begins by assembling larger systems from tested units. At this level the number of test cases may increase and the likelihood of finding all the Year 2000 errors decreases. Balance and judgment are required of the test designer to select the system boundaries and appropriate test cases. Manufacturing systems which are unique may share many common components and similar architectures. Two systems using identical hardware in a similar architecture may have different applications and use date related functions in completely different ways, requiring separate testing.
Testing Levels:

  1. Component testing, testing a single component controller or application in isolation
  2. Combined Component testing, testing an assembly of standard components and custom application programs or configurations
  3. System testing, testing a collection of Combined Components and Components which are assembled into a complete manufacturing or facilities control system. An example is a welding cell with PLCs, Robots, HMIs, PCs, and weld controllers.
  4. Cluster testing, testing a collection of Systems and Common Systems and/or Business Systems that constitute a complete manufacturing process. An example is a test of the paint shop systems taken as a whole, including the interfaces to any common systems.
  5. Cross Cluster testing, the highest level of integration testing where the test cases are organized around specific functions, or across functional boundaries. An example is the entire process of automobile order through automobile delivery.

2.1 What to Test

Prioritize test scheduling based on the inventory of controls and computers to identify the most suspect devices or systems. Suspect devices are those with unknown or undocumented testing results. Devices that are part of a system that shares date information, or that uses date information for product marking in the manufacturing process (encoded date information), should be considered suspect. Unique systems which have been locally built and programmed must be tested individually, while test results for commercial off-the-shelf hardware or software can be shared industry wide.

2.1.1 Inventory

Using the plant inventory and available equipment compliance status information, eliminate components with known status and select candidate manufacturing systems to test. The inventory of controls and computers must be recorded using the standard naming conventions to minimize redundant efforts.

2.1.2 Resource Availability

Evaluate the required resources prior to scheduling the tests:

 

2.2 Which Tests Apply

The following test procedures are intended to provide a baseline for Year 2000 testing to ensure consistent and repeatable results. These tests apply to standard control components and software as well as custom applications. The matrix shown below of component types and tests contains a recommended set of tests to perform on a particular manufacturing system or standard component. Again, these are only recommendations; the tester should review these tests and determine which apply to the specific component or system.

The matrix columns are the component types: HMI, Vision System, Instrumentation, Motion Control, Weld Controller, Specialized System, PLC, Information Co-Processor, Robot, CNC, PC/Micro/Mini, BIOS, Software, and Software Custom. The matrix rows are the test procedures, with each row marking the component types to which the test applies:

3.2 - Rollover, Reboot, Day of Week Tests
3.3 - Manual Date Set Test
3.4 - Leap Year Test
3.5 - Date Window Tests
3.6 - Other Date Representation Tests
3.7 - Arithmetic Date Tests
3.8 - Upload / Download Tests
3.9 - Special Value Test
3.10 - File or Directory Creation Test
3.11 - Audit Log Test
3.12 - Report Tests
3.13 - Log file purge Test
3.14 - Timer Test
3.15 - Input Data Test
3.16 - Output Data Test
3.17 - Activation/Deactivation Tests
3.18 - Display Data Tests
3.19 - Indirect Date Usage Tests

2.3 Component Summary Documentation

For each component selected for testing, document the following:

2.4 Combined Component Testing

The following issues are intended to help approach testing a collection of components connected to each other and possibly external systems:

When testing combined components which include several components or layers, identify all the interfaces to the time/date functions and test each as appropriate. For example, all PC clones have a hardware Real Time Clock (RTC) which is the source of the time/date on system reboot or power on. The PC BIOS has a set of programs which allow access to the RTC, which may also present Year 2000 problems. If an application software package uses the RTC and fails some of the tests, a special program may be used to test the RTC functions to isolate problems in the PC/BIOS component from problems in the application software package (see the test tools section). The RTC is tested for rollover and reboot operation. PCs have a CMOS configuration program which allows hardware configuration and time/date setup and should be tested for manual date entry. The time/date may also be set using the DOS "date" and "time" commands, and MS Windows allows the time/date to be set using the "control panel". Software application packages may not only provide system time/date setting functions, but display the time and date, create files, open files, and delete files from within the application. UNIX systems may provide alternative commands to set the time/date. For example, the "askdate" function would accept any year from 00-69 but set the year to 1970. On the same system the date function ("date 01012000") accepted the year 2000 and functioned correctly.

2.5 System

A system is composed of multiple combined components and/or components that form part of a manufacturing process, i.e. Welding Cell, Production Cell, etc.

2.6 Cluster Testing

Cluster testing is a term used to describe integration testing when multiple manufacturing systems and common systems are tested as a group. Each cluster test becomes a project of its own to coordinate the planning, preparation, and execution. Configuration of the systems which make up the cluster must be well managed, using configuration management tools where available. This must be coordinated with compatible release dates and versions as Year 2000 changes are implemented to correct date related problems.

2.6.1 Cluster Definitions

A cluster is a collection of systems which have been selected for systems integration testing. Each interface between systems must be documented to understand the expected behavior before, during, and after testing. Some interfaces will be internal to the cluster; others may require simulated inputs or outputs for testing. Interface documentation should include a list of messages documenting the format of any date information passed and, where ambiguous (2-digit), how the 1999/2000 rollover is to be handled on both sides of the interface, in either direction. If the interface will be changed, when and how it will change should be documented. If there is a master clock, where it is set and who its customers or dependent clocks are should be documented as well.
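
The interface message list described above can be made checkable in software. A minimal sketch (the message field names and formats are hypothetical, not taken from any plant interface list):

```python
from datetime import datetime

# Hypothetical interface list: each date-bearing field declares its
# documented format string.
interface_fields = {
    "ship_date":  "%Y%m%d",   # explicit 4-digit year: unambiguous
    "build_date": "%y%m%d",   # 2-digit year: rollover handling must be documented
}

def check_field(name: str, value: str) -> str:
    """Confirm a field value matches its documented format and report
    whether its year representation is ambiguous (2-digit)."""
    fmt = interface_fields[name]
    datetime.strptime(value, fmt)            # raises ValueError if malformed
    return "ambiguous" if "%y" in fmt else "explicit"

assert check_field("ship_date", "20000101") == "explicit"
assert check_field("build_date", "000101") == "ambiguous"
```

Every field reported as "ambiguous" is a candidate for the rollover-handling documentation described above.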

2.6.2 Cluster Tests

The following issues must be considered when planning cluster tests:

2.7 Cross Cluster Testing

Cross cluster testing is based on having all of the systems which support a particular function tested as a group. Examples are processing a car order from a dealer through to delivery of the car, or hiring a new employee after the year 2000.

2.8 Procedures

This section describes several actions to consider before, during, and after testing. These general guidelines apply to all testing. Modify the test parameters as necessary for the specific system under test. Be sure to record the setup for the test. It is best to conduct testing on a non-production system, but this is not always possible. When using production systems, make sure that adequate backup procedures are followed prior to testing and that the backup media is readily available for reinstallation. If the time window for conducting the test is limited, include time for reinstalling the system when determining the total required time for testing.
Caution:

  1. Some software with date checking (for example, Windows 95 betas) may reset and refuse access after the date is set into the future. Other issues include expiration of passwords, user IDs, or product licenses.
  2. All tests assume that the battery used to backup CMOS settings, memory, and/or applications software is operational.

2.8.1 Pre-Test

Prior to conducting tests on a manufacturing system:

2.8.2 Testing

For each manufacturing system selected for testing:

2.8.3 Post-Test

After the tests are completed:

 

2.9 Documentation

Documentation during the Year 2000 project is extremely important, since new information will become available as the number of devices with unknown status approaches zero. The time frame may span several years, and information will be lost if not committed to paper or electronic form. People change jobs during the project. The tests performed must be documented and repeatable to provide a basis for sharing information across the many locations participating in this project.

2.9.1 Test Setup

The test setup documentation should include where appropriate:

2.9.2 Test Procedures

The test procedure documentation should include where appropriate:

2.9.3 Test Results

The test result documentation should include where appropriate:

2.10 Resources

For each manufacturing system several different skills, equipment, and tools may be required. Understanding the resources that are required for a Year 2000 project, and matching these requirements to resource availability, is critical for successful execution of the project.

2.10.1 Skills

List specific resources required to perform the testing. Different skill levels may include:

Consult the test procedures to select someone capable of initiating, performing and recording the tests in a repeatable manner.

2.10.2 Equipment

List specific equipment required to conduct systems testing. Test parts may be required for manufacturing systems which cannot be cycled without a part present. External systems may require scheduling for availability when performing download tests, or tests of system interfaces.

2.10.3 Tools

The tests should be conducted with suitable programming devices and relevant test equipment.

 

3. Test Procedures

Several test scenarios have been developed as a result of problems identified with the year 2000. This limited set of tests cannot prove a Component/System to be Year 2000 compliant, but using them will help identify several frequently observed problems. These test procedures are written as general instructions. Specific knowledge of the systems or components under test is also required in order to apply these test cases.

The following test procedures are the result of problems identified with manufacturing control systems. A brief description of each test is provided for guidance in conducting tests on custom control systems and other systems with unknown status.

The following test procedures provide step by step instructions for performing each test. The results should be recorded step by step as the test is performed to ensure accurate records of the test are documented on the "Year 2000 Test Report" form. The test results should be retained locally and made available for distribution to other locations as well.

3.1 Critical Date Values for Year 2000 Testing

The following dates should be tested for proper operation:

  1. 0000-00-00 Special Value
  2. 1998-12-31 Rollover, Reboot
  3. 1999-01-01 Special Value
  4. 1999-09-09 Special Value
  5. 1999-12-31 Special Value, Rollover, Reboot
  6. 2000-01-01 Day of Week, Day of Year
  7. 2000-02-28 Rollover, Reboot
  8. 2000-02-29 Rollover, Reboot, Day of Week
  9. 2000-03-01 Day of Week
  10. 2000-12-31 Rollover, Reboot, Day of Week, Day of Year
  11. 2001-01-01 Day of Week, Day of Year
  12. 2027-12-31 Rollover, Reboot, Day of Week, Day of Year

3.2 Rollover, Reboot, Day of Week Tests

The rollover test checks for proper handling of the date transition from 1999 to 2000 without manual intervention. Actual tests have produced several different examples of incorrect handling of this transition. Many systems use 2-digit dates, and the result may be a rollover to year 100; sometimes the 19 prefix is assumed and the result is the year 19100. For other, unknown reasons, the years 2001, 2028, and non-printable characters have been observed. The effects of incorrect date calculations may include negative numbers.

The reboot test checks for correct date & time storage during power cycles of the system. The system may function correctly when the time is set ahead, but revert to another time and date when the power is cycled. Many PC’s revert to 1980 or 1984 when rebooted after the year 2000.

The day of the week may be incorrectly calculated. Systems should display the day of the week for January 1, 2000 as Saturday; a display of Monday may mean the system has actually computed the day for January 1, 1900.
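
The failure modes above can be illustrated with a short sketch. Python is used here purely for illustration; the affected systems were typically written in C or assembler, where the standard library's struct tm carries the year as years since 1900:

```python
import datetime

# C's struct tm stores tm_year as years since 1900; in 2000 it holds 100.
tm_year = 2000 - 1900

print(tm_year)              # 100   -- displayed directly: the "year 100" failure
print("19" + str(tm_year))  # 19100 -- naive "19" prefix: the classic 19100 failure
print(1900 + tm_year)       # 2000  -- correct handling

# Day-of-week check from the text: 1 Jan 2000 is a Saturday,
# while 1 Jan 1900 was a Monday.
print(datetime.date(2000, 1, 1).strftime("%A"))   # Saturday
print(datetime.date(1900, 1, 1).strftime("%A"))   # Monday
```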

3.2.1 Rollover - 1999 to 2000 - Power on

Test:
Set the date to 31 Dec. 1999.
Set the time to 23:59 (11:59 p.m.).
Observe the system date after midnight (00:00).
Expected Result:
The system clock advances into the year 2000 and continues normally.

3.2.2 Day of Week

Test:
The clock is set to 1 Jan 2000.
Observe the system day of week display.
Expected Result:
The system displays the day of week as Saturday.
(1 Jan 1900 was a Monday)

3.2.3 Reboot - Date retention

Test:
Set the date to 1 Jan 2000.
Power down the system.
Power up the system.
Observe the system date
Expected Result:
The system clock still displays the year 2000 and operates normally.

Note: Many personal computers reset themselves to 04 January 1980, or some other past date, whenever they reboot, if the CMOS real time clock says the year is 00.

3.2.4 Rollover - 1999 to 2000 - Power Off

The procedure specifies 10 minutes before midnight, but a shorter interval may be appropriate. Be sure you can shut down the system before the rollover occurs.
Test:
Set the date to 31 Dec. 1999.
Set the time to 23:50 (11:50 p.m.).
Power down the system before it can roll over to year 2000
Wait until after midnight with the power off.
Power up the system.
Observe the system date
Expected Result:
The system clock advances into the year 2000 and operates normally.

3.3 Manual Date Set Test

This test checks for correct date & time entry to initialize the system clock. The "set system date" function may operate incorrectly when the time is set ahead, may not allow entry over a certain date range, or may revert to another time and date when set. Some PC’s revert to a default date (1980 or 1984) when set to a date in the year 2000. Some systems have multiple date set functions; on a PC the date may be set using the CMOS Setup program at power on, using the DOS date function, or using a Windows clock or control panel interface. The tests in this section should be executed on all date set functions for the system.

The date set function may also be accessed through an application programming interface (API). If the equipment has a battery backed up clock, the date set test may include removing both battery power and external power to completely initialize the system clock, then attempting to set the date to 1 Jan 2000. Exercise caution to document all system configurations before attempting this test, because the configuration may be lost upon removal of the battery.

3.3.1 Date Set - 1 Jan 2000

Test:
Set the date to 1 Jan 2000.
Observe the system date
Expected Result:
The date should be Saturday, 1 Jan 2000.

3.3.2 Date Set - Date retention

Check to ensure that the date set function sets the real-time clock, not just the system’s virtual clock.
Test:
With the date still in the year 2000, power down the system.
Power up the system.
Observe the system date
Expected Result:
The system clock still displays the year 2000 and operates normally.

Note: Some PC’s which fail the Reboot - Date retention test will pass the manual Date retention test. Manually setting the date after rollover is therefore a possible fix for those PC’s.

 

3.3.3 Date Set - 29 Feb. 2000

Test:
Set the date to 29 Feb. 2000.
Observe the system date.
Expected Result:
The date should be Tuesday 29 Feb. 2000.

3.4 Leap Year Test

The leap year test checks the logic which calculates valid dates for leap years. One published example of a leap-year failure involved 66 industrial controllers in a steel mill, all of which locked up when the date calculation for leap year 1996 occurred. A 2-digit year representation also presents a possible divide-by-zero problem. The year 2000 leap year calculation is more complex because multiple exceptions apply to the calculation, leading to greater opportunity for error. The following are leap year considerations:
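
The full Gregorian rule (every fourth year is a leap year, except centuries, except every fourth century) can be written compactly; a minimal reference sketch:

```python
def is_leap(year):
    """Gregorian leap-year rule: divisible by 4, except centuries,
    except centuries divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 2000 IS a leap year (divisible by 400); 1900 was NOT (century, not /400).
print(is_leap(2000))   # True
print(is_leap(1900))   # False
print(is_leap(1996))   # True
print(is_leap(1999))   # False
```

A common failure mode is code that implements only the "divisible by 4, except centuries" portion of the rule, which wrongly treats 2000 as a non-leap year.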

3.4.1 Leap Year - Rollover 2/28 - Power On

Test:
Set the date to Monday 28 Feb. 2000.
Set the time to 23:59 (11:59 p.m.).
Observe the system date after midnight
Expected Result:
The date should be Tuesday 29 Feb. 2000.

3.4.2 Leap Year - Reboot 2/29

Test:
Set the date to 29 Feb. 2000.
Power down the system.
Power up the system.
Observe the system date
Expected Result:
The date should be Tuesday 29 Feb. 2000.

3.4.3 Leap Year - Rollover 2/29 - Power On

Test:
Set the date to 29 Feb. 2000.
Set the time to 23:59 (11:59 p.m.).
Observe the system date after midnight (00:00).
Expected Result:
The date should be Wednesday 1 March 2000.

3.5 Date Window Tests

Windowing date systems assume the first 2 digits of a 4-digit year to be 20 for 2-digit values below the switch value and 19 for values above the switch value. An example switch value of 50 provides a range of 1951 to 2049. If the 2-digit year is greater than 50, the year is assumed to be 19xx: 84 is greater than the switch value, so the year is 1984. If the 2-digit year is less than 50, the year is assumed to be 20xx: 34 is less than the switch value, so the year is 2034. When two integrated systems share date information in this format, be sure to test the interface at the boundary conditions. Is the behavior specified when the year equals the switch value? Do both sides of an interface window the same way?
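
The windowing rule can be expressed as a small function. The pivot value 50 follows the example in the text; everything else here is an illustrative assumption, and the behavior at the pivot itself is exactly the ambiguity the boundary tests must resolve:

```python
PIVOT = 50   # example switch value from the text; site/interface specific

def expand_year(yy):
    """Expand a 2-digit year using a fixed-pivot window (range 1951-2049).

    yy above the pivot -> 19xx (e.g. 84 -> 1984)
    yy below the pivot -> 20xx (e.g. 34 -> 2034)
    yy equal to the pivot is the ambiguous boundary case; this sketch
    arbitrarily maps it to 19xx -- verify what the real interface does.
    """
    if not 0 <= yy <= 99:
        raise ValueError("expected a 2-digit year")
    return 1900 + yy if yy >= PIVOT else 2000 + yy

# Boundary conditions worth exercising on both sides of an interface:
print(expand_year(51), expand_year(49))   # 1951 2049
```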

Systems using date windowing should consider testing:

3.5.1 Date Window Test - Below Limit

Test:
Observe the configured switch value.
Change the current date to one year below the switch value.
Observe a 4-digit date.
Expected Result:
The date assumes 20xx.

3.5.2 Date Window Test - Above Limit

Test:
Observe the configured switch value.
Change the current date to one year above the switch value.
Observe a 4-digit date
Expected Result:
The date assumes 19xx.

3.5.3 Date Window Test - Change Limit

Test:
Change the configurable switch value to 2004.
Observe the configured switch value.
Expected Result:
Limit has been changed to 2004

(Repeat the above and below limit tests to confirm the limit has changed.)

3.6 Other Date Representation Tests

The following date formats occur often enough that a brief description is included to stimulate thinking about possible date related functions and interface definitions:

The following DOY dates should be checked:

3.6.1 DOY - 29 February 2000

Test:
Set the date to 29 February 2000.
Observe the DOY date by function call or system display.
Expected Result:
29 February 2000 should be 00060 or 2000060.

3.6.2 DOY - 31 December 2000

Test:
Set the date to 31 December 2000.
Observe the date by function call or system display.
Expected Result:
31 December 2000 should be 00366 or 2000366.
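
A known-good library can serve as the oracle for these DOY values; a sketch using Python's datetime (the library choice is an assumption of this illustration, not the system under test):

```python
from datetime import date

def day_of_year(d):
    # tm_yday in the standard time tuple is the 1-based day of year.
    return d.timetuple().tm_yday

print(day_of_year(date(2000, 2, 29)))   # 60  -> displayed as 00060 or 2000060
print(day_of_year(date(2000, 12, 31)))  # 366 -> displayed as 00366 or 2000366
```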

3.6.3 DOY - Invalid Dates

Test:
Attempt to set the DOY date to 2000000.
Observe the DOY date function call return value.
Attempt to set the DOY date to 98367.
Observe the DOY date function call return value.
Expected Result:
Error code or message as documented by vendor.

3.7 Arithmetic Date Tests

If dates are used in any calculations, test for correct operation. The following list is intended to help identify functions which should be checked:

3.7.1 Days in 2000

Test:
Create a period calculation using 1-Jan-2000 as the start date and 31-Dec-2000 as the end date.
Expected Result:
The year 2000 has 366 days.

3.7.2 Days across 1999/2000 Boundary

Test:
Create a period calculation using 1-Dec-1999 as the start date and 31-Jan-2000 as the end date.
Expected Result:
The period ( (31 January 2000)- (1 December 1999) ) has 61 days.

3.7.3 Days across leap year

Test:
Create a period calculation using 1-Feb-2000 as the begin date and 1-Mar-2000 as the end date.
Expected Result:
The month of February has 29 days.
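
The expected results for the three period calculations above can be cross-checked against a known-good date library used as a reference oracle; a sketch in Python:

```python
from datetime import date

# Inclusive day count for a whole calendar year: difference plus one.
days_in_2000 = (date(2000, 12, 31) - date(2000, 1, 1)).days + 1
print(days_in_2000)   # 366 -- the year 2000 has 366 days

# Period across the 1999/2000 boundary.
print((date(2000, 1, 31) - date(1999, 12, 1)).days)   # 61

# February 2000 has 29 days.
print((date(2000, 3, 1) - date(2000, 2, 1)).days)     # 29
```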

3.8 Upload / Download Tests

The upload and download tests check for logic which prevents new files from replacing old files when the date comparison uses only 2 digits. The comparison reverses when the dates cross the year 2000 boundary: test whether a file created in (19)99 is considered older than a file created in (20)00. Systems have failed to download new programs because the new programs were assumed to be older than the current program on the system.

Old File Date    New File Date    2-digit difference    4-digit difference
July 4, 1998     July 4, 1999     +1                    +1
July 4, 1999     July 4, 2000     -99                   +1
July 4, 2000     July 4, 2001     +1                    +1
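
The table's failure mode, assuming the year difference is computed on the last two digits only, can be sketched as:

```python
def year_diff_2digit(old_year, new_year):
    """Buggy comparison: only the last two digits of each year are kept."""
    return (new_year % 100) - (old_year % 100)

def year_diff_4digit(old_year, new_year):
    """Correct comparison on full 4-digit years."""
    return new_year - old_year

print(year_diff_2digit(1998, 1999))   # 1   : correct, by luck
print(year_diff_2digit(1999, 2000))   # -99 : the 2000 file looks 99 years OLDER
print(year_diff_4digit(1999, 2000))   # 1   : correct with 4-digit years
```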

3.8.1 Upload

Test:
Set the date of the control system to January 11, 2000.
Attempt to upload the test file.
Expected Result:
Verify that the new file was uploaded.

 

3.8.2 Download

Preparation:
Download the existing test file with a date prior to January 1, 2000.
Create a new version of the test file to download with the file date January 5, 2000.
Test:
Set the date of the control system under test to January 10, 2000.
Attempt to download the January 5, 2000 test file.
Expected Result:
Verify that the new file was downloaded.

3.9 Special Value Test

The special value test checks for usage of values in date fields for special purposes that are not dates. An example is special handling of the date September 9, 1999, which may be used as a special code for software license expiration dates, never-expires codes, and/or errors. Systems integrated with higher level systems should be subjected to special value tests. Special values considered should include the date values 9-9-99, 0-0-00, and x-x-9999. This test applies to applications which create records containing the current date as a time or date field, such as database applications or systems which maintain historical data.

Test:
Set the current date to a special value (e.g. 9-9-99, 9-9-1999, 0-0-00, 0-0-0000).
Observe the number of records in a test file at the start of the test.
Using the application under test, create a new record that contains the current date.
Expected Result:
Observe that the application was able to create the test record.
Observe that the test record is included in displays or reports as applicable.
Observe that the end of file continues to function correctly (e.g. number of records correct?).
Observe that the test record can be deleted from the system.
Examples of failure include:
Not terminating an expired software license;
Failing to age backup tapes for recycling as scratch tapes;
End-of-file markers which use the date field with a special value function incorrectly.

3.10 File or Directory Creation Test

These tests check for observed problems with file or directory creation when the file name is based on an incorrect year date. These tests apply to any system which can store information collected from the manufacturing process, or which allows for editing and creating files. Errors can occur when the system attempts to create data files after the year 2000 and the date rollover is incorrect. Systems which create file names based on the time and date have been observed to lock up when the file name contained non-printable or illegal characters. Another possibility exists when the file already exists and is being updated. Most user interfaces prompt for verification: "Do you wish to replace file foo.dat 12/30/99 with file foo.dat 01/03/00?" Will the newest file replace the older file in both the prompt and the actual replacement? Is the file identifier "FILE00" assumed to be older than "FILE99"?

 

3.10.1 File - Creation 2000

Test:
Set the date of the system under test to a date beyond January 1, 2000.
Create an event or choose a time such that the system will attempt to create a file.
Expected Result:
Verify that the new file was created.
Verify that time stamped information is valid inside any history or log files.
Verify that any history or log files can be used by the application or system.
Examples:
Systems which store historical data that create files using a name based on the date.

3.10.2 File - Replacement 1999-2000

Test:
Create an old test file in 199X with data identifiable for 199X.
Set the date of the system under test to a date beyond January 1, 2000.
Create a new test file with the same name and new data.
Expected Result:
Observe the prompt for the correct order of replacement, and replace old file with new file.
Verify that the old file was replaced with the new file.
Verify that the file contains the new data.
Examples:
Systems which prompt for confirmation during file updates, such as file managers.

3.10.3 File - Replacement 2000-2000

Test:
Create an old test file in 2000 with data identifiable for 2000.
Set the date of the system under test to a date beyond the creation date of the old file.
Create a new test file with the same name and new data.
Expected Result:
Observe the prompt for the correct order, replace old with new file
Verify that the old file was replaced with the new file.
Verify that the file contains the new data.
Examples:
Systems which prompt for confirmation during file updates, such as file managers.

3.11 Audit Log Test

This test checks for problems with audit logging systems. This test applies to systems which include the capability to audit user activity or network transactions.

3.11.1 Audit Log

Test:
Set the date of the system under test to a date beyond January 1, 2000.
Create an event or choose a time such that the system will attempt to create a file.
Expected Result:
Verify that the new file was created.
Verify that time stamped information is valid inside the audit log file.
Examples:
Systems which incorporate a DBMS with roll back capability.
Operating systems which record usage by account.

3.12 Report Tests

The report tests pertain to the following functions: sorting and retrieving, sorting and merging, searching, and indexing, on either disk files or database tables. Report tests apply to manufacturing systems which display data sorted by time and date; examples include alarm fault reports and production reports. The tests should include creating faults or alarms after the year 2000 and observing the alarm display pages or reports for correct ordering of the alarm data. Failure examples include monitoring packages which place new alarms at the end of the list after 2000. One test is to query for all items from now until 12-31-1999 and observe the results, then query for all items from now until 1-2-2000 and observe the results. Some systems fail this test by returning no records on the second query.
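
One common cause of the empty second query is date comparison on 2-digit YYMMDD strings, where every 2000 date sorts before every 1999 date. A hypothetical sketch (the record format and values are illustrative assumptions):

```python
# Records stamped with 2-digit YYMMDD strings.
records = ["991230", "991231", "000102", "000301"]

now = "991229"

# Query 1: now until 12-31-1999 -- the 1999 records are found.
q1 = [r for r in records if now <= r <= "991231"]

# Query 2: now until 1-2-2000 -- lexicographically "000102" < now,
# so the range is empty and NO records are returned.
q2 = [r for r in records if now <= r <= "000102"]

print(q1)   # ['991230', '991231']
print(q2)   # []
```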

3.12.1 Report - Query

Test:
Set the date of the control system under test to a date beyond January 10, 2000.
Create new data by forcing a fault or some system event which will create test records.
Set the date of the control system under test to a date beyond March 1, 2000.
Create a new report containing the Year 2000 data by choosing four time spans:
a) November 15, 1999 to December 31, 1999 (1999 data);
b) November 15, 1999 to March 1, 2000 (all data);
c) January 1, 2000 to March 1, 2000 (2000 data);
d) February 1, 2000 to March 1, 2000 (no data).
Expected Result:
Verify that the report with all data, b), contained all the data in the report.
Verify that the data was ordered correctly in the report.
Verify that the report with no data, d), executed correctly and no data was printed in the report.

3.12.2 Report - Sort

Test:
Set the date of the control system under test to a date beyond January 1, 2000.
Create new data by forcing a fault or some system event which will create test records.
Create a new report containing the Year 2000 data by choosing a valid time span.
Expected Result:
Verify that the new data was ordered correctly in the report.

3.12.3 Report - Merge

Test:
Set the date of the control system under test to a date beyond January 1, 2000.
Create new data by forcing a fault or some system event which will create test records.
Create a new report containing the Year 2000 data by merging new data.
Expected Result:
Verify that the new data was merged correctly in the report.

3.12.4 Report - Search

Test:
Query or Search for an existing record created in the year 1999 with the current time in the year 1999.
Query or Search for an existing record created in the year 1999 with the current time in the year 2000.
Query or Search for an existing record created in the year 2000 with the current time in the year 2000.
Expected Result:
Verify that all records are found as expected.

3.13 Log file purge Test

The log file purge test applies to manufacturing systems which periodically purge old data to maintain file system space by deleting the oldest data. The identified problem occurs after the year 2000, when files with a lower year value than other files (e.g. 1998 is less than 1999) are removed. Does the system consider data with a year date of 2000 to be less than 1999? If the comparison considers only 2-digit years it will, and newer data can be purged.
Test:
Verify that log data backups are available.
Set the date of the system under test to a date beyond January 10, 2000.
Create new data for the log file or rename some existing data.
Attempt to purge data from the system older than 7 days.
Expected Result:
Verify that only data from before the purge date was removed.
Examples:
VAX/VMS RMS purge command
Database products which support a purge function
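
The purge failure can be reproduced with a sketch that sorts log files by a 2-digit year stamp (the file names here are hypothetical):

```python
# Hypothetical daily log files stamped YYMMDD.
logs = ["981229", "991230", "991231", "000101", "000102"]

# Buggy purge: sort by the 2-digit stamp and drop the "oldest" entries.
oldest_first = sorted(logs)
purged = oldest_first[:2]   # delete the two "oldest" files

print(purged)   # ['000101', '000102'] -- the NEWEST data is deleted
```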

3.14 Timer Test

This test verifies the creation and operation of event timers in systems or software which provide these capabilities.
Test:
Set the date of the control system under test prior to 2000.
Create new timer to wake up or alarm to trigger at 10:01 AM, January 3, 2000
Set the date of the control system under test to January 2, 2000.
Create new timer to wake up or alarm to trigger at 10:02 AM, January 3, 2000
Set the date of the control system under test to January 3, 2000.
Set the time to 10:00 AM.
Wait for the alarms to trigger
Expected Result:
Verify that the alarm or timer created before 2000 operates correctly.
Verify that the alarm or timer created after 2000 operates correctly.
Examples:
UNIX cron scheduling software;
SCADA package timer functions;
HVAC Controls for starting and stopping ventilation or cooling equipment.

3.15 Input Data Test

The input data test applies to manufacturing systems which read date information from labels or other manufacturing control systems.
Test:
Set the date of the control system under test to January 2, 2000.
Create input labels or simulate input from other systems with a date beyond 1-1-2000 (or 1-1-00).
Attempt to read the input data.
Expected Result:
Verify that the system correctly reads the input data.

3.16 Output Data Test

The output data test applies to manufacturing systems which write date information to labels or other manufacturing control systems. Systems which print results to a printer or operator display have been found to lock up or fail to display data after the year 2000, because an incorrect year rollover can produce invalid characters in the output.
Test:
Set the date of the control system under test to a date beyond January 1, 2000.
Attempt to output data.
Expected Result:
Verify that the data was output correctly.
Examples:
Printed labels or product markings;
Transfer data to other systems.

3.17 Activation/Deactivation Tests

This test applies to manufacturing systems which contain passwords, accounts, or complex software license systems which contain expiration functions.

3.17.1 Valid access

Test:
Check that the expiration date extends past January 1, 2000
Set the date of the system under test to a date beyond January 1, 2000.
Attempt to execute the software licensed, or use the affected password, account, etc.
Expected Result:
Verify that the software executes properly after January 1, 2000.

 

3.17.2 Expired access

Test:
Check the expiration date.
Set the date of the system under test to a date after the expiration date in the year 2000 or beyond.
Attempt to execute the software licensed, or use the affected password, account, etc.
Expected Result:
Verify that the software does not execute after the expiration date.

3.18 Display Data Tests

The display data test applies to manufacturing systems which display date information on several different pages. The test must include moving the date ahead to the year 2000 and observing every screen which the controller contains. The non-compliance can range from partial display to complete control system lockup. Many industrial controllers have unique software for each display screen, and may behave differently on any screen. Examples include CNC controllers which may have one set of display pages for tool management, another set for fault annunciation, and another for the file system, all of which may have been written by different people or teams over the product development life cycle.

3.18.1 Display Data Test

Test:
Create a list of all the date fields on all the display screens.
Set the date of the system under test to a date beyond January 1, 2000.
Create new files or fault records.
Attempt to display all date fields on all display screens for file dates or fault time stamps.
Expected Result:
Verify that each date field displays correctly.
Examples:
A software application supports 4-digit dates in the 20xx range using a DBMS but only passes 2-digit
years to the DBMS which defaults to 19xx and stores the wrong date.

3.19 Indirect Date Usage Tests

These tests apply to systems which use date information in an indirect manner. The following list is intended to stimulate questions about a system which could use the date in functions that do not require date information, but may have been implemented using a date function.
Test:
Identifying functions which use the date indirectly may be very difficult.
Expected Result:
Where identified, verify correct operation in the year 2000.
Examples:
Encryption/Decryption algorithms;
Random Number generators;
Communications protocols;
Firmware.

 

4. Appendix

4.1 Test Issues Checklist

The following sections provide a list of questions which can be used to review a test plan for completeness.

4.1.1 General Integrity

1. System date can be set to high-risk dates:
1999-12-31, 2000-01-01, 2000-02-29
2. Re-initialize from cold start on high-risk dates:
1999-12-31, 2000-01-01, 2000-02-29
3. System date rolls over correctly to/from high-risk dates:
1999-01-01, 2000-01-01, 2000-02-29, 2000-03-01
4. Does the programming language provide a function to obtain the system date on the host or through a time service?
5. Does this function return the correct system date value for high-risk dates
(1999-12-31, 2000-01-01, 2000-02-29)?
6. Does this function return the correct value for system date after the system date rolls over on high- risk dates
(1999-01-01, 2000-01-01, 2000-02-29, 2000-03-01)?
7. Are there third-party products embedded in this application? Are all these products Year 2000 compliant?
8. Does the application code ignore values for explicit first 2 digits of year in the system date at any point in the program logic?

4.1.2 Date Integrity

9. Does the programming language support a data type for date values in the range 1900-01-01 to 2050-12-31?
10. Does the application make a leap-year calculation? Do these calculations treat 2000 as a leap year and 1900 as a non-leap year?
11. Does the date arithmetic correctly calculate duration (differences) between dates, add dates and duration, compute date of week?
12. Does the application convert date values from one representation to another (e.g. YMD to Julian to base-and-offset internal)? Does software correctly convert between date representations according to the Gregorian calendar?
13. Does the application compare dates in any of its branching logic or calculation of Boolean values? Do all these comparisons produce correct results for all combinations of values with the expected ranges for dates?
14. Does the application include searching, sorting, merging, or indexing on internal tables, linked lists, or other data structures based on date variables? Do these operations perform correctly for all possible values for dates in the key variables? Does a key index which includes a date field produce correct sequence across dates in 19xx and 20xx?
15. Does the application represent dates in any variable as an offset from a base date/time? What is the maximum value for a date for this representation? What is the minimum value for a date for this representation (usually the base date)? Does the expected range of values for each variable using this date representation fall within these extremes?
16. Does the application assign date values from one variable to another? Are the first 2 digits of the value truncated during any assignment? Is the value in the target variable eventually used in a date manipulation which requires the explicit 4-digit value for correct results?
17. Does the application use language features which map a data address to more than one variable (such as REDEFINE in COBOL or COMMON in FORTRAN)? In all aliases for the same data space, does any variable ignore or truncate a value for explicit first 2 digits in the date value? Is the truncated value for date eventually used in a manipulation which assumes that all values for date share the same first 2 digits?
18. Are constants for date values (including day, month, or year) used in any manipulation? Is the date constant intrinsic to the functional requirements or a special value used in a "date" data type for convenience?
19. Does the application store and retrieve dates accurately for values in the range 1900-01-01 through 2050-12-31?
20. Does the application use sort/merge utilities to order file contents on date fields or use indexed file structures keyed on date fields? Is this order correct for all values of dates in the range 1900-01-01 through 2050-12-31?
21. Does the application rely on primary or alternate indices on a structured database for search, insert, update, or delete functions in which any key contains a date field? Will the index order be correct for all values for date in the range 1900-01-01 through 2050-12-31?
22. Are all date variables initialized to some convention for null value?

4.1.3 Explicit First 2 Digits of Year

23. Does the application use a language, toolkit, and/or application generator which permits explicit first 2 digits in the date data types? If so, are values for first 2 digits in variables of these types supplied from external input or derived within the software logic?
24. Does the application use a DBMS or other layered (or horizontal) software product for data persistence to store and retrieve date variables? If so, can these products support explicit values for first 2 digits in any date variable stored and retrieved?
25. Does the application have external interfaces (I/O, APIs, external subprogram calls, IPCs, library routines, HMI) which contain a date variable with explicit first 2 digits of year? Does the software ignore, truncate, or write over the first 2 digits of a value in any such variable as it flows through the program logic to any other external interface? In any such flow, could any logic alter the value for the first 2 digits of a year in any manner inconsistent with generalized manipulations based on the Gregorian calendar?
26. Do all representations of date with explicit first 2 digits both internal to the application and in all interfaces satisfy the criteria for date compliance?

4.1.4 Implicit First 2 Digits of Year

27. Does the application use a language, toolkit, and/or application generator (including GUI builders) which permits date representation without an explicit first 2 digits in the date data types? If so, are the first 2 digits derived for any manipulations, for passing a date value across any interface, or for permanent storage? If so, is the value for the first 2 digits correct for all possible values of date that each such variable can hold?
28. Does the application use constant values for date or portions of date (i.e., day, month, or year)? If so, for any constant which is a full date value or a value for year, are the first 2 digits explicit in the value? Do all manipulations using each constant value, directly or indirectly (that is, carried via variables to other operations in the program logic), produce the correct results for all possible values of such date variables?
29. Does the application use any application-program interface (API), such as in-line SQL or IMS DML, which passes date variables? If so, for any date value supplied across this interface, does the receiving software provide a default or derived value of the first 2 digits of date? Are the rules for derivation on both sides of the interface consistent with each other for all possible values for a date in the respective fields?
30. Does the application support a user interface containing date fields without the explicit first 2 digits of that date? Are the first 2 digits of a date in each field unambiguous to a user for all possible values of that date in each such field?
31. Do all the date representations, both internal to the application and in all interfaces, satisfy the criteria for date compliance?

4.2 Year 2000 Testing Report Forms

The following sections explain the purpose and use of the Year 2000 Test Report and Year 2000 System Test Report Forms. By recording your test results in these standardized formats, testing data can be shared throughout your organization.

4.2.1 Test Report Purpose

The purpose of the Year 2000 Test Report Form is to capture the results of Year 2000 testing on components shared throughout the corporation and to record the test results of unique systems. When testing combined components, use the Combined Component Test Report; for manufacturing systems, use the System Test Report.

 

4.2.2 General Test Report Instructions

Result - Pass, Fail, Not Applicable
Effect - I (Inconvenient), S (Severe) or C (Catastrophic)
Severity - failure rating on a scale of 1 to 10, where:

  1. - MILD - misspelled words
  2. - MODERATE - misleading or redundant information
  3. - ANNOYING - truncated names
  4. - DISTURBING - some transactions not processed
  5. - SERIOUS - lose a transaction
  6. - VERY SERIOUS - incorrect transaction execution
  7. - EXTREME - frequent very serious errors
  8. - INTOLERABLE - database corruption
  9. - CATASTROPHIC - system shutdown
  10. - INFECTIOUS - shutdown spreads to other systems
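The 1-10 severity scale above can be expressed as a lookup table so that report tooling can validate entries; this is a sketch, not part of the AIAG forms themselves:

```python
# Severity scale from section 4.2.2 as a lookup table.
SEVERITY = {
    1: "MILD", 2: "MODERATE", 3: "ANNOYING", 4: "DISTURBING",
    5: "SERIOUS", 6: "VERY SERIOUS", 7: "EXTREME",
    8: "INTOLERABLE", 9: "CATASTROPHIC", 10: "INFECTIOUS",
}

def severity_label(rating: int) -> str:
    # Reject out-of-range entries rather than guessing.
    if rating not in SEVERITY:
        raise ValueError("severity rating must be 1-10")
    return SEVERITY[rating]
```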

4.2.3 Year 2000 Component Test Report Instructions

4.2.3.1 Header

Component Name - name of component as used in the inventory master naming guide.
Date - Date that the test was completed.
Location - Location of equipment. (e.g. Line identifier)
Manpower requirements - The quantity of persons required for this test.
Total Time - Total net time in man-hours to do all of the applicable tests.
Plant/Site Name - Name by which the plant or site is commonly known.
Contact Name - Name of the qualified person performing the test.
Vendor - Name of the manufacturer or vendor of the hardware/software component.
Model - Model name that is used in the inventory process.
Version - Name/Number of the version of the component.

4.2.3.2 Tests
Result - Enter P (Pass), F (Fail) or N/A (Not Applicable). Pass if the expected results are observed. Fail if abnormal results occur. N/A if the test is not applicable. If the test fails, complete the comments field for that test to document the issues according to the instructions below.
Result Comments - If the result was Pass or N/A, comments are optional. If the result was Fail, comments are required to describe the failure.
Effect - Code describing the effect of a failure. If the Result was Pass, enter either D (Date) or N (No date) in the Effect field, depending on whether a date function was found. If the result was Fail, enter I (Inconvenient), S (Severe) or C (Catastrophic) in the Effect field. If the result was Not Applicable, leave the field blank.
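The Result/Effect coding rules above can be captured in a small checker that report tooling might use to catch invalid combinations (a hypothetical helper, not part of the AIAG forms):

```python
# Valid Effect codes for each Result code, per section 4.2.3.2.
VALID_EFFECTS = {
    "P":   {"D", "N"},       # Pass: Date function found / No date function
    "F":   {"I", "S", "C"},  # Fail: Inconvenient / Severe / Catastrophic
    "N/A": {""},             # Not Applicable: Effect left blank
}

def effect_ok(result: str, effect: str) -> bool:
    # Unknown Result codes match nothing.
    return effect in VALID_EFFECTS.get(result, set())

assert effect_ok("P", "D")
assert effect_ok("F", "C")
assert effect_ok("N/A", "")
assert not effect_ok("P", "I")   # Pass may not carry a failure code
```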

 

4.2.4 Year 2000 Component Test Report Form

Plant/Site Name: _______________   Contact Name: _______________
Vendor: _______________   Model: _______________   Version: _______________
Component Name: _______________
Date: ___________   Location: ___________   Time: ________   Manpower: ________

Section  Test Name                              Results  Effect  Result Comments
-------  -------------------------------------  -------  ------  ---------------
3.2      Rollover, Reboot, Day of Week Tests
3.2.1    Rollover - 1999 to 2000 - Power On
3.2.2    Day of Week
3.2.3    Reboot - Date Retention
3.2.4    Rollover - 1999 to 2000 - Power Off
3.3      Manual Date Set Test
3.3.1    Date Set - 1 Jan 2000
3.3.2    Date Set - Date Retention
3.3.3    Date Set - 29 Feb 2000
3.4      Leap Year Test
3.4.1    Leap Year - Rollover 2/28
3.4.2    Leap Year - Reboot 2/29
3.4.3    Leap Year - Rollover 2/29
3.5      Date Window Tests
3.5.1    Date Window Test - Below Limit
3.5.2    Date Window Test - Above Limit
3.5.3    Date Window Test - Change Limit
3.6      Other Date Representation Tests
3.6.1    DOY - 29 February 2000
3.6.2    DOY - 31 December 2000
3.6.3    DOY - Invalid Dates
3.7      Arithmetic Date Tests
3.7.1    Days in 2000
3.7.2    Days across 1999/2000 Boundary
3.7.3    Days across Leap Year
3.8      Upload / Download Tests
3.8.1    Upload
3.8.2    Download
3.9      Special Value Test
3.10     File or Directory Creation Test
3.10.1   File - Creation 2000
3.10.2   File - Replacement 1999-2000
3.10.3   File - Replacement 2000-2000
3.11     Audit Log Test
3.12     Report Tests
3.12.1   Report
3.12.2   Report - Sort
3.12.3   Report - Merge
3.12.4   Report - Search
3.13     Log File Purge Test
3.14     Timer Test
3.15     Input Data Test
3.16     Output Data Test
3.17     Activation/Deactivation Tests
3.17.1   Valid Access
3.17.2   Expired Access
3.18     Display Data Tests
3.19     Indirect Date Usage Tests

4.2.5 Year 2000 Combined Component Test Report Purpose

The purpose of the Year 2000 Combined Component Test Report Form is to capture and record the results of Year 2000 testing on combined components.

4.2.6 Year 2000 Combined Component Test Report Instructions

Plant/Site Name - Name by which the plant or site is commonly known.
Contact Name - Name of the qualified person performing the test.
Date - Date that the test was completed.
Location - Location of equipment. (e.g. Line identifier)
Area - Name of the particular manufacturing area within the site. Use real names not acronyms.
Combined Component Name - unique name of the combined component.
Overall Result (1-10) - severity of failure as documented in section 4.2.2.
Overall Result Comments - any comments regarding overall failures.
Manpower requirements - The quantity of persons required for this test.
Total Time - Total net time in man-hours to do all of the applicable tests.

4.2.6.1 Subheader

Identify each component active in the test:
Vendor - Name of the manufacturer or vendor of the hardware/software component.
Model - Model name that is used in the inventory process.
Version - Name/Number of the version of the hardware/software component.
Component Type - one of the standard component types.

4.2.6.2 Tests
Result - Enter P (Pass), F (Fail) or N/A (Not Applicable). Pass if the expected results are observed. Fail if abnormal results occur. N/A if the test is not applicable. If the test fails, complete the comments field for that test to document the issues according to the instructions below.
Result Comments - If the result was Pass or N/A, comments are optional. If the result was Fail, comments are required to describe the failure.
Effect - Code describing the effect of a failure. If the Result was Pass, enter either D (Date) or N (No date) in the Effect field, depending on whether a date function was found. If the result was Fail, enter I (Inconvenient), S (Severe) or C (Catastrophic) in the Effect field. If the result was Not Applicable, leave the field blank.

 

4.2.7 Year 2000 Combined Component Test Report Form

Plant/Site Name: _______________   Contact Name: _______________
Date: ___________   Location: ___________   Area: _______________
Combined Component Name: _______________
Overall Result (1-10): ____   Manpower: ________   Time: ________
Overall Result Comments: ______________________________________________

Vendor            Model             Version           Component Type
_____________     _____________     _____________     _____________
_____________     _____________     _____________     _____________
_____________     _____________     _____________     _____________

Section  Test Name                              Results  Effect  Result Comments
-------  -------------------------------------  -------  ------  ---------------
3.2      Rollover, Reboot, Day of Week Tests
3.2.1    Rollover - 1999 to 2000 - Power On
3.2.2    Day of Week
3.2.3    Reboot - Date Retention
3.2.4    Rollover - 1999 to 2000 - Power Off
3.3      Manual Date Set Test
3.3.1    Date Set - 1 Jan 2000
3.3.2    Date Set - Date Retention
3.3.3    Date Set - 29 Feb 2000
3.4      Leap Year Test
3.4.1    Leap Year - Rollover 2/28
3.4.2    Leap Year - Reboot 2/29
3.4.3    Leap Year - Rollover 2/29
3.6.1    DOY - 29 February 2000
3.6.2    DOY - 31 December 2000
3.6.3    DOY - Invalid Dates
3.7      Arithmetic Date Tests
3.7.1    Days in 2000
3.7.2    Days across 1999/2000 Boundary
3.7.3    Days across Leap Year
3.8      Upload / Download Tests
3.8.1    Upload
3.8.2    Download
3.9      Special Value Test
3.10     File or Directory Creation Test
3.10.1   File - Creation 2000
3.10.2   File - Replacement 1999-2000
3.10.3   File - Replacement 2000-2000
3.11     Audit Log Test
3.12     Report Tests
3.12.1   Report
3.12.2   Report - Sort
3.12.3   Report - Merge
3.12.4   Report - Search
3.13     Log File Purge Test
3.14     Timer Test
3.15     Input Data Test
3.16     Output Data Test
3.17     Activation/Deactivation Tests
3.17.1   Valid Access
3.17.2   Expired Access
3.18     Display Data Tests
3.19     Indirect Date Usage Tests

4.2.8 Year 2000 System Test Report Purpose

The purpose of the Year 2000 System Test Report Form is to capture and record the results of Year 2000 testing on manufacturing systems.

4.2.9 Year 2000 System Test Report Instructions

4.2.9.1 Header

System Name - Unique name of manufacturing system used in the inventory.
Date - Date that the test was completed.
Location - Location of equipment. (e.g. Line identifier)
Manpower requirements - The quantity of persons required for this test.
Total Time - Total net time in man-hours to do all of the applicable tests.
Plant/Site Name - Name by which the plant or site is commonly known.
Area - Name of the particular manufacturing area within the site. Use real names not acronyms.
Contact Name - Name of the qualified person performing the test.
Overall Result (1-10) - severity of failure as documented in section 4.2.2.
Overall Result Comments - any comments regarding overall failures.

4.2.9.2 Subheader

List each component and combined component active in the test.
Item - identifier to link failures to components
Vendor - Name of the manufacturer or vendor of the hardware/software component.
Model - Model name that is used in the inventory process.
Version - Name/Number of the version of the hardware/software component.

4.2.9.3 Failures Observed

List only tests which exhibit a failure during system testing:
Item - item number of component which failed
Section - Document section of test which fails.
Test Name - name of test which fails.
Result - Enter P (Pass), F (Fail) or N/A (Not Applicable). Pass if the expected results are observed. Fail if abnormal results occur. N/A if the test is not applicable. If the test fails, complete the comments field for that test to document the issues according to the instructions below.
Result Comments - If the result was Pass or N/A, comments are optional. If the result was Fail, comments are required to describe the failure.
Effect - Code describing the effect of a failure. If the Result was Pass, enter either D (Date) or N (No date) in the Effect field, depending on whether a date function was found. If the result was Fail, enter I (Inconvenient), S (Severe) or C (Catastrophic) in the Effect field. If the result was Not Applicable, leave the field blank.

 

4.2.10 Year 2000 System Test Report Form

Plant/Site: _______________   Contact Name: _______________
Date: ___________   Location: ___________   Area: _______________
System Name: _______________
Result: ____   Result Comments: ______________________________________________
Manpower: ________   Time: ________

Item  Vendor          Model           Version        Item  Vendor          Model           Version
  1                                                    26
  2                                                    27
  3                                                    28
  4                                                    29
  5                                                    30
  6                                                    31
  7                                                    32
  8                                                    33
  9                                                    34
 10                                                    35
 11                                                    36
 12                                                    37
 13                                                    38
 14                                                    39
 15                                                    40
 16                                                    41
 17                                                    42
 18                                                    43
 19                                                    44
 20                                                    45
 21                                                    46
 22                                                    47
 23                                                    48
 24                                                    49
 25                                                    50

Failures Observed

Item  Section  Test Name                     Results  Effect  Result Comments
----  -------  ----------------------------  -------  ------  ---------------

Fix or Workaround Comments:

4.3 Year 2000 Plant Notebook

It is recommended that a notebook containing the following sections be maintained:

4.4 Sample Plant (or Site) Test Plan

The sample plant is an assembly plant. In the body shop area, several similar manufacturing systems are used to produce the body subassemblies. One manufacturing system of each distinct technology combination should be included in the test plan. For example, one robotic welding cell may employ GE Fanuc robot controls with a Modicon PLC and Square D welding controllers, while another robotic welding cell may employ Kuka robot controls with a Bosch PLC and British Federal welding controllers.

4.4.1 Body Shop

4.4.1.1 Responsible Engineer:
4.4.1.2 Floor Pan Assembly Cell
Scheduled Test Date:
Completed Test Date:
Test Plan Filename:
4.4.1.3 Right Rear Quarter Assembly Cell
Scheduled Test Date:
Completed Test Date:
Test Plan Filename:

4.5 Sample A Manufacturing System Test Plan

4.5.1 Manufacturing System Name:

4.5.1.1 Manufacturing System Inventory
Filename:
Verified correct by:
Date inputs:
Date outputs:
Clocks:
Date displays:
History Data:

4.5.1.2 Manufacturing System Pre-Test
System Tested for normal operation by:
System reboots from cold start with no faults:

4.5.2 Component A (Weld Controller) Procedures
4.5.2.1 Component A Pre-Test
System Backup Filename and location:
4.5.2.2 Component A Testing
Rollover Test
Date Set Test
Reboot Test
Leap Year Test
Download Test
4.5.2.3 Component A Post Test
Restore Original Configuration

4.6 Sample B Manufacturing System Test Plan

4.6.1 Manufacturing System Name:

4.6.1.1 Manufacturing System Inventory
Filename:
Verified correct by:
Date inputs:
Date outputs:
Clocks:
Date displays:
History Data:
4.6.1.2 Manufacturing System Pre-Test
System Tested for normal operation by:
System reboots from cold start with no faults:
System Backup Filename and location:
Component A
Component B
Component C
Application C
Dry Cycle functioned Normally:
4.6.1.3 Manufacturing System Testing
Rollover Test
Component A
Component B
Component C
Application C
Date Set Test
Component A
Component B
Component C
Application C
Reboot Test
Component A
Component B
Component C
Application C
Leap Year Test
Component A
Component B
Component C
Application C
Download Test
Component A
Component B
Component C
Application C
4.6.1.4 Manufacturing System Post-Test
Restore Original Configuration
Component A
Component B
Component C
Application C
System Tested for normal operation by:
System reboots from cold start with no faults:

Dry Cycle operation normal