Friday, 13 November 2009

Manual Intervention in an Automated Test

Supporting Files
- nFocusMessageBox.zip

You may be wondering why you would want to add manual intervention to an automated test run. I used to think the same, but I have since seen the benefits of the approach. Before we start, it is important to understand that manual intervention in an automated test run is normally a bad idea; for one thing, the tests will no longer run unattended.

First let’s explore where this approach is useful:

  • Manual verification: You might have to verify objects that cannot be mapped or are difficult to test, e.g. Flash or Silverlight.
  • External processing: In order to automate the end-to-end process, a batch process might have to be executed. If you can use Axe to execute the batch process, great.
  • CAPTCHA: A manual step lets you get past a CAPTCHA, which is designed specifically to defeat automation.

I used manual verification successfully on a project a few years ago to verify the look and feel of a website, and I currently use external processing to test an ETL process.

For those of you who do not know what an ETL process is, ETL stands for Extract, Transform and Load. Testers do something very similar all the time when setting up test data.
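To make the three phases concrete, here is a tiny, hypothetical illustration in C#. The data and the transformation are made up for the example; a real ETL process runs against source and target databases, not in-memory collections.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class EtlSketch
{
    static void Main()
    {
        // Extract: pull raw records from a stand-in source system
        var source = new List<string> { " alice ", "BOB", " carol" };

        // Transform: clean and normalise each record
        var transformed = source
            .Select(s => s.Trim().ToLowerInvariant())
            .ToList();

        // Load: write the records into a stand-in target system, keyed by row number
        var target = new Dictionary<int, string>();
        for (int i = 0; i < transformed.Count; i++)
            target[i] = transformed[i];

        // Print the loaded records in key order so the output is deterministic
        Console.WriteLine(string.Join(",", target.OrderBy(kv => kv.Key).Select(kv => kv.Value)));
    }
}
```

The same extract/clean/load shape is what the test steps below exercise end to end, only with a real batch process in the middle.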

The high level testing steps are as follows:

1. Stop the ETL process and clear down the system

2. Load all data into the source system using Axe

3. Start the ETL process

4. Verify the data in the target system using Axe

5. Stop the ETL process

6. Edit the source data

7. Restart the ETL process

8. Verify the data in the target system using Axe

9. Stop the ETL process

10. Delete the source test data

11. Restart the ETL process

12. Verify that the data has been marked as deleted in the target system

I know that some of you are asking why we cannot use Axe to stop and start the ETL process. The answer is that we can, but this is the first implementation of the ETL process, and keeping those steps manual helped while we ironed out the bugs. Changing the manual steps to automated steps using Axe is really easy and can be done later.

Enough rambling; let's get to how we implement the functionality. I am going to assume that you have a project library file and an ActionMap. As always, you can download a working zip file from here and unzip it to your C:\ drive.

We are going to implement parts of the .NET MessageBox class, specifically a standard dialog with an OK button, and a dialog with Yes and No buttons. The latter can be used for manual verification.

1. Add a reference to System.Windows.Forms in the __testInit section of the project ActionMap: using System.Windows.Forms;

2. Add the following method to your project library file:

public static int YesNoMessageBox(AxeMainAPI axe, string message, string title, DialogResult expected)
{
    // Show a blocking Yes/No dialog and wait for the tester's answer
    DialogResult actual = MessageBox.Show(message, title, MessageBoxButtons.YesNo);

    // Axe's convention: 0 = pass, 1 = fail
    int result = (actual == expected) ? 0 : 1;

    // Report the expected and actual answers back to Axe
    return axe.StepValidate(expected.ToString(), actual.ToString(), result);
}
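The OK-only dialog mentioned earlier can be written along the same lines. The method below is a sketch of that counterpart, not code taken from the download: it simply blocks the test run until the tester clicks OK, then reports a pass through the same StepValidate call (0 meaning pass, as above). The method name OkMessageBox is my assumption; check the zip for the actual name.

```csharp
// Sketch (assumed, not from the download): pause the automated run until
// the tester has completed the manual step, then record a pass.
public static int OkMessageBox(AxeMainAPI axe, string message, string title)
{
    // Show returns DialogResult.OK once the single button is clicked
    DialogResult actual = MessageBox.Show(message, title, MessageBoxButtons.OK);

    // There is nothing to compare, so expected == actual and the result is 0 (pass)
    return axe.StepValidate(DialogResult.OK.ToString(), actual.ToString(), 0);
}
```

This is the variant you would call before steps such as "Stop the ETL process", where the dialog's only job is to hold the run until the manual work is done.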

3. Hook up the MessageBox class and custom method to the ActionMap as follows:

Table 1

4. Add the following to the ObjectMap:

Table 2

5. Next step, create a Subtest page for the mapped objects.

As you can see from the Actions, the MsgBoxOk class is overloaded so that you can optionally give your dialog box a title, e.g. set(MessageBox title). The MessageBox text is read from the data column of the test step.

The Expected parameter for the MsgBoxYesNo Action can be one of the following:

  • DialogResult.Yes, e.g. set(nFocus MessageBox Example, DialogResult.Yes)
  • DialogResult.No, e.g. set(nFocus MessageBox Example, DialogResult.No)

As in the MsgBoxOk class, the MessageBox text is read from the data column.

6. The last step is to download the example and try it out.

I hope that you found this post interesting and can see the potential of manual intervention in an automated test.

Marc Maurhofer
