Development and Testing: Making Sure Our ICT System Actually Works! (Syllabus Section 7.3)

Hello future ICT experts! Welcome to the most important stage of the Systems Life Cycle: Development and Testing. Think of system development like baking a complicated cake. Analysis is choosing the ingredients, Design is writing the recipe, and Development is actually baking it.

But before you serve that cake (implementation), you need to TASTE it (testing)! Is it cooked through? Does it taste right? Does it fall apart? In ICT, testing is where we find and fix all the errors (bugs) before the system is used by real people. Let's dive into how we make our ICT systems strong, reliable, and error-free.

1. The Essential Need for Testing

Why do we bother testing? The syllabus is clear on this point: the system must be tested before implementation.

Why Testing is Non-Negotiable:

  • Catching Errors (Bugs): Almost every program of any size contains faults. Testing finds these faults before they cause serious problems for the users or the organisation.
  • Ensuring Requirements are Met: Testing confirms that the new system does exactly what the client originally asked for during the Analysis stage.
  • System Reliability: A well-tested system is less likely to crash or produce incorrect data when it goes live.
  • User Confidence: If the system works perfectly from day one, users will trust it immediately.

Did you know? The 1996 failure of the Ariane 5 rocket, which destroyed hardware worth hundreds of millions of dollars, was traced back to a small piece of reused code that was never tested with the extreme values it would meet in flight. Always test!


Quick Review: The Goal of Testing

The main goal is simple: ensure the actual outcomes produced by the system match the expected outcomes defined in the design phase. If they don't match, we fix the fault!


2. Testing Strategies and Scope

You can't test everything at once. We use different Test Strategies depending on what stage of development we are in.

Testing Each Module (Unit Testing)

This strategy involves testing each small part or "module" of the system in isolation (by itself).

  • What is a module? It could be one screen, one calculation routine, or one specific function (like the "login" feature).
  • Advantage: It is easier to find the exact location of a bug because you are only testing a small piece of code.

Analogy: If you are building a LEGO castle, unit testing is making sure each individual wall segment is correctly assembled before joining them all together.
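Unit testing is easy to sketch in code. The example below (Python, using a hypothetical `is_valid_age` routine to stand in for one small module) tests that single routine in complete isolation:

```python
import unittest

# One "module": a single routine tested by itself.
# is_valid_age is an illustrative example, not a function from the syllabus.
def is_valid_age(age):
    """Accept ages from 16 to 60 inclusive; reject everything else."""
    return isinstance(age, int) and 16 <= age <= 60

class TestAgeModule(unittest.TestCase):
    def test_accepts_valid_age(self):
        self.assertTrue(is_valid_age(35))

    def test_accepts_boundaries(self):
        self.assertTrue(is_valid_age(16))
        self.assertTrue(is_valid_age(60))

    def test_rejects_out_of_range(self):
        self.assertFalse(is_valid_age(5))
        self.assertFalse(is_valid_age(75))

# Run just this module's tests (no other part of the system is involved).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAgeModule)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because only one small routine is under test, any failure reported here can only come from that routine, which is exactly the advantage described above.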

Testing the Whole System (Integration Testing)

Once all the modules work individually, we test how they work together.

  • This confirms that data flows correctly from one module to the next (e.g., the data input form correctly passes the user's details to the database structure).
  • This is a more comprehensive test, ensuring all functions work as one cohesive unit.
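The data-flow idea can be sketched in Python. Both "modules" below (a form-capture function and a tiny in-memory database class, both invented for illustration) would already have passed their own unit tests; the integration test checks that a record produced by one arrives intact in the other:

```python
# Two hypothetical modules that each passed their own unit tests.

def capture_form(name, age):
    """Input module: package the user's details as a record."""
    return {"name": name.strip(), "age": int(age)}

class FakeDatabase:
    """Storage module: keeps records keyed by name."""
    def __init__(self):
        self.rows = {}

    def insert(self, record):
        self.rows[record["name"]] = record

# Integration test: does data flow correctly from the form to the database?
db = FakeDatabase()
db.insert(capture_form("  Amina ", "35"))
print(db.rows["Amina"])   # the form's output reached the database intact
```

A unit test of either module alone could not catch, say, the form producing a field name the database does not expect; only testing them together reveals that.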

Key Takeaway: First, test the parts (modules), then test the finished product (whole system).

3. Designing the Test Plan

A Test Plan is a formal document that details every single test that will be run. It ensures testing is structured, complete, and provides clear steps for remedial action (fixing the problem).

Components of a Good Test Plan:

For every test case, you must define four key elements:

  1. Test Data: The data (inputs) that you will feed into the system.
  2. Expected Outcomes: What the system should produce. This is calculated manually or predicted beforehand.
  3. Actual Outcomes: What the system actually produces when the test data is input.
  4. Remedial Action Following Testing: The action taken if the Actual Outcome does not match the Expected Outcome (e.g., "Adjust validation routine" or "Check file structure definition").

Memory Aid: Think of a test plan as a T.E.A.R. sheet: Test Data, Expected Outcome, Actual Outcome, Remedial Action.
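The four elements map naturally onto a small data structure. This sketch (Python, with a hypothetical `is_valid_age` routine as the system under test) records each test's data, expected outcome, and remedial action, then compares expected against actual:

```python
# A hypothetical validation routine under test (age must be 16 to 60).
def is_valid_age(age):
    return 16 <= age <= 60

# Each row of the test plan: Test data, Expected outcome,
# and the Remedial action to take if the actual outcome differs.
test_plan = [
    {"data": 35, "expected": True,  "remedial": "Adjust validation routine"},
    {"data": 16, "expected": True,  "remedial": "Check boundary comparison"},
    {"data": 75, "expected": False, "remedial": "Adjust range check limits"},
]

for row in test_plan:
    actual = is_valid_age(row["data"])   # Actual outcome
    status = "PASS" if actual == row["expected"] else f"FAIL -> {row['remedial']}"
    print(row["data"], status)
```

Note that the expected outcomes were written down before running the code, exactly as a real test plan requires.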

4. Types of Test Data: Finding the Limits

To properly check a system, especially its validation routines, we must use different types of data.

4.1 Normal Data

Characteristics: Data that is valid, expected, and within the specified acceptable limits (range).

Use: To check that the system processes standard, correct input accurately.

Example: If a form asks for age between 16 and 60, Normal Data would be 35.

4.2 Extreme Data (Boundary Data)

Characteristics: Data that is valid but is exactly at the upper or lower limits (boundaries) of the acceptable range.

Use: To ensure the system correctly handles the boundaries. Programmers often make "off-by-one" errors (using < instead of <=). Extreme data catches this.

Example: If the age range is 16 to 60, Extreme Data would be 16 and 60.
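Here is the off-by-one trap in miniature. The buggy check below uses `<` where `<=` was intended; normal data hides the bug, but the extreme values expose it at once:

```python
# A buggy range check: the programmer wrote < instead of <=,
# so the valid boundary values 16 and 60 are wrongly rejected.
def buggy_check(age):
    return 16 < age < 60

# Normal data passes and hides the bug...
print(buggy_check(35))   # True - looks fine
# ...but extreme (boundary) data exposes it:
print(buggy_check(16))   # False - should be True!
print(buggy_check(60))   # False - should be True!
```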

4.3 Abnormal Data (Invalid Data)

Characteristics: Data that is invalid and should be rejected by the system's validation checks. This includes data outside the range, wrong data types, or incorrect formats.

Use: To test the robustness of the system's validation routines and error messages. We test whether the system prevents bad data from being entered.

Examples:

  • Out of range: Age 5 or Age 75.
  • Wrong type: Entering the word "Thirty-Five" into a numeric age field.
  • Too short/long: Entering only two digits for a phone number that requires ten.

Common Mistake Alert!

Don't confuse Extreme Data (which is valid and accepted) with data just outside the boundary (which is abnormal and rejected).
If the range is 10 to 20:
• Normal: 15
• Extreme: 10, 20
• Abnormal: 9, 21, "cat"
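The 10-to-20 example above can be checked directly in code. This sketch assumes a correct validation routine (`in_range`, an illustrative name) and feeds it all three kinds of test data:

```python
def in_range(value):
    """Range check for the 10-to-20 example: type first, then limits."""
    return isinstance(value, int) and 10 <= value <= 20

normal   = [15]
extreme  = [10, 20]            # valid: exactly on the boundaries
abnormal = [9, 21, "cat"]      # invalid: all must be rejected

assert all(in_range(v) for v in normal + extreme)
assert not any(in_range(v) for v in abnormal)
print("All three kinds of test data behaved as expected")
```

The type check comes before the range comparison so that abnormal data like "cat" is rejected cleanly instead of crashing the routine.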


4.4 The Use of Live Data

Towards the end of the testing phase, especially if migrating from an old system, you might use Live Data.

Definition: This is real data that has been used and processed by the existing (old) system.

Use: It provides a highly realistic test environment because it mimics the volume, variety, and complexity of the actual data the system will encounter when fully operational.

Caution: When using live data, you must ensure that the new system is run in a way that doesn't affect the running of the existing system or corrupt the real, valuable data. This often happens during Parallel Running (a type of implementation, covered in the next section).

5. Testing Specific System Components

Your test plan must ensure that you specifically test the components you designed in the previous stage:

Testing Data Structures and File Structures

This ensures the database or files are correctly set up. You must check:

  • Are the field lengths appropriate (e.g., is the name field long enough)?
  • Are the data types correct (e.g., is "Price" set as a currency/decimal field)?
  • Are the Primary and Foreign Keys working to maintain relationships between tables?
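These checks can be automated. The sketch below uses Python's built-in sqlite3 module with an in-memory database; the table and field names are invented for illustration. It verifies that a foreign key relationship actually rejects an orphan record:

```python
import sqlite3

# In-memory database so the sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # SQLite needs this switched on

conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("""CREATE TABLE orders (
                  order_id INTEGER PRIMARY KEY,
                  customer_id INTEGER NOT NULL,
                  FOREIGN KEY (customer_id) REFERENCES customer (id))""")

conn.execute("INSERT INTO customer (id, name) VALUES (1, 'Amina')")
conn.execute("INSERT INTO orders (order_id, customer_id) VALUES (10, 1)")  # valid link

# Testing the relationship: an order for a customer who does not exist
# should be rejected by the foreign key constraint.
try:
    conn.execute("INSERT INTO orders (order_id, customer_id) VALUES (11, 99)")
    print("FAIL: bad foreign key was accepted")
except sqlite3.IntegrityError:
    print("PASS: foreign key correctly rejected the orphan record")
```

Similar scripted checks can confirm data types and NOT NULL (presence) rules on each field.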

Testing Input and Output Formats

This is about checking the user-facing parts of the system.

  • Input Formats: Check that data capture forms are easy to use and contain all necessary fields.
  • Output Formats: Check that reports, screen layouts, and invoices are printed or displayed correctly, show the right information, and are well-aligned.

Testing Validation Routines

As covered in the Test Data section, you must systematically check that all validation routines work perfectly, including:

  • Range Check: Does it ensure data is within limits?
  • Type Check: Does it ensure data is the correct format (e.g., number only)?
  • Presence Check: Does it stop the user from leaving a mandatory field blank?
  • Check Digit/Sum: Does it detect accidental data entry errors?
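All four routines can be sketched in a few lines of Python. The function names are illustrative, and the check digit uses the weighted modulo-11 scheme familiar from ISBN-10 numbers (one of several schemes in use):

```python
def range_check(value, low, high):
    """Is the value within the acceptable limits (inclusive)?"""
    return low <= value <= high

def type_check(text):
    """Does the field contain digits only?"""
    return text.isdigit()

def presence_check(text):
    """Has a mandatory field actually been filled in?"""
    return text.strip() != ""

def check_digit_ok(code):
    """Weighted modulo-11 check (ISBN-10 style, digit-only codes):
    the weighted sum of all digits must be divisible by 11."""
    digits = [int(c) for c in code]
    weights = range(len(digits), 0, -1)
    return sum(d * w for d, w in zip(digits, weights)) % 11 == 0

print(range_check(35, 16, 60))        # True
print(type_check("Thirty-Five"))      # False - wrong type rejected
print(presence_check("   "))          # False - blank mandatory field rejected
print(check_digit_ok("0306406152"))   # True  - a valid ISBN-10
print(check_digit_ok("0306406125"))   # False - swapped digits are detected
```

Notice how the check digit catches a transposition error (the last two digits swapped), which is exactly the kind of accidental data-entry mistake it exists to detect.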

Key Takeaway: Development and testing go hand-in-hand. You develop a piece, then test it thoroughly using normal, extreme, and abnormal data. A structured test plan is the road map to a successful, bug-free launch!