Efficiently scripting change-resilient tests
Abstract
In industrial practice, test cases often start out as steps described in natural language and intended to be executed by a human. Since tests are executed repeatedly, they go through an automation process, in which they are converted to automated test scripts (or programs) that perform the test steps mechanically. Conventional test-automation techniques can be time-consuming, require specialized skills, and produce fragile scripts. To address these limitations, we present a tool, called ATA, for automating the test-automation task. Using a novel combination of natural-language processing, backtracking exploration, and learning, ATA can significantly improve tester productivity in automating manual tests. ATA also produces change-resilient scripts, which automatically adapt themselves in the presence of certain common types of user-interface changes.