- •contents
- •preface
- •acknowledgments
- •about this book
- •Special features
- •Best practices
- •Design patterns in action
- •Software directory
- •Roadmap
- •Part 1: JUnit distilled
- •Part 2: Testing strategies
- •Part 3: Testing components
- •Code
- •References
- •Author online
- •about the authors
- •about the title
- •about the cover illustration
- •JUnit jumpstart
- •1.1 Proving it works
- •1.2 Starting from scratch
- •1.3 Understanding unit testing frameworks
- •1.4 Setting up JUnit
- •1.5 Testing with JUnit
- •1.6 Summary
- •2.1 Exploring core JUnit
- •2.2 Launching tests with test runners
- •2.2.1 Selecting a test runner
- •2.2.2 Defining your own test runner
- •2.3 Composing tests with TestSuite
- •2.3.1 Running the automatic suite
- •2.3.2 Rolling your own test suite
- •2.4 Collecting parameters with TestResult
- •2.5 Observing results with TestListener
- •2.6 Working with TestCase
- •2.6.1 Managing resources with a fixture
- •2.6.2 Creating unit test methods
- •2.7 Stepping through TestCalculator
- •2.7.1 Creating a TestSuite
- •2.7.2 Creating a TestResult
- •2.7.3 Executing the test methods
- •2.7.4 Reviewing the full JUnit life cycle
- •2.8 Summary
- •3.1 Introducing the controller component
- •3.1.1 Designing the interfaces
- •3.1.2 Implementing the base classes
- •3.2 Let’s test it!
- •3.2.1 Testing the DefaultController
- •3.2.2 Adding a handler
- •3.2.3 Processing a request
- •3.2.4 Improving testProcessRequest
- •3.3 Testing exception-handling
- •3.3.1 Simulating exceptional conditions
- •3.3.2 Testing for exceptions
- •3.4 Setting up a project for testing
- •3.5 Summary
- •4.1 The need for unit tests
- •4.1.1 Allowing greater test coverage
- •4.1.2 Enabling teamwork
- •4.1.3 Preventing regression and limiting debugging
- •4.1.4 Enabling refactoring
- •4.1.5 Improving implementation design
- •4.1.6 Serving as developer documentation
- •4.1.7 Having fun
- •4.2 Different kinds of tests
- •4.2.1 The four flavors of software tests
- •4.2.2 The three flavors of unit tests
- •4.3 Determining how good tests are
- •4.3.1 Measuring test coverage
- •4.3.2 Generating test coverage reports
- •4.3.3 Testing interactions
- •4.4 Test-Driven Development
- •4.4.1 Tweaking the cycle
- •4.5 Testing in the development cycle
- •4.6 Summary
- •5.1 A day in the life
- •5.2 Running tests from Ant
- •5.2.1 Ant, indispensable Ant
- •5.2.2 Ant targets, projects, properties, and tasks
- •5.2.3 The javac task
- •5.2.4 The JUnit task
- •5.2.5 Putting Ant to the task
- •5.2.6 Pretty printing with JUnitReport
- •5.2.7 Automatically finding the tests to run
- •5.3 Running tests from Maven
- •5.3.2 Configuring Maven for a project
- •5.3.3 Executing JUnit tests with Maven
- •5.3.4 Handling dependent jars with Maven
- •5.4 Running tests from Eclipse
- •5.4.1 Creating an Eclipse project
- •5.4.2 Running JUnit tests in Eclipse
- •5.5 Summary
- •6.1 Introducing stubs
- •6.2 Practicing on an HTTP connection sample
- •6.2.1 Choosing a stubbing solution
- •6.2.2 Using Jetty as an embedded server
- •6.3 Stubbing the web server’s resources
- •6.3.1 Setting up the first stub test
- •6.3.2 Testing for failure conditions
- •6.3.3 Reviewing the first stub test
- •6.4 Stubbing the connection
- •6.4.1 Producing a custom URL protocol handler
- •6.4.2 Creating a JDK HttpURLConnection stub
- •6.4.3 Running the test
- •6.5 Summary
- •7.1 Introducing mock objects
- •7.2 Mock tasting: a simple example
- •7.3 Using mock objects as a refactoring technique
- •7.3.1 Easy refactoring
- •7.3.2 Allowing more flexible code
- •7.4 Practicing on an HTTP connection sample
- •7.4.1 Defining the mock object
- •7.4.2 Testing a sample method
- •7.4.3 Try #1: easy method refactoring technique
- •7.4.4 Try #2: refactoring by using a class factory
- •7.5 Using mocks as Trojan horses
- •7.6 Deciding when to use mock objects
- •7.7 Summary
- •8.1 The problem with unit-testing components
- •8.2 Testing components using mock objects
- •8.2.1 Testing the servlet sample using EasyMock
- •8.2.2 Pros and cons of using mock objects to test components
- •8.3 What are integration unit tests?
- •8.4 Introducing Cactus
- •8.5 Testing components using Cactus
- •8.5.1 Running Cactus tests
- •8.5.2 Executing the tests using Cactus/Jetty integration
- •8.6 How Cactus works
- •8.6.2 Stepping through a test
- •8.7 Summary
- •9.1 Presenting the Administration application
- •9.2 Writing servlet tests with Cactus
- •9.2.1 Designing the first test
- •9.2.2 Using Maven to run Cactus tests
- •9.2.3 Finishing the Cactus servlet tests
- •9.3 Testing servlets with mock objects
- •9.3.1 Writing a test using DynaMocks and DynaBeans
- •9.3.2 Finishing the DynaMock tests
- •9.4 Writing filter tests with Cactus
- •9.4.1 Testing the filter with a SELECT query
- •9.4.2 Testing the filter for other query types
- •9.4.3 Running the Cactus filter tests with Maven
- •9.5 When to use Cactus, and when to use mock objects
- •9.6 Summary
- •10.1 Revisiting the Administration application
- •10.2 What is JSP unit testing?
- •10.3 Unit-testing a JSP in isolation with Cactus
- •10.3.1 Executing a JSP with SQL results data
- •10.3.2 Writing the Cactus test
- •10.3.3 Executing Cactus JSP tests with Maven
- •10.4 Unit-testing taglibs with Cactus
- •10.4.1 Defining a custom tag
- •10.4.2 Testing the custom tag
- •10.5 Unit-testing taglibs with mock objects
- •10.5.1 Introducing MockMaker and installing its Eclipse plugin
- •10.5.2 Using MockMaker to generate mocks from classes
- •10.6 When to use mock objects and when to use Cactus
- •10.7 Summary
- •Unit-testing database applications
- •11.1 Introduction to unit-testing databases
- •11.2 Testing business logic in isolation from the database
- •11.2.1 Implementing a database access layer interface
- •11.2.2 Setting up a mock database interface layer
- •11.2.3 Mocking the database interface layer
- •11.3 Testing persistence code in isolation from the database
- •11.3.1 Testing the execute method
- •11.3.2 Using expectations to verify state
- •11.4 Writing database integration unit tests
- •11.4.1 Filling the requirements for database integration tests
- •11.4.2 Presetting database data
- •11.5 Running the Cactus test using Ant
- •11.5.1 Reviewing the project structure
- •11.5.2 Introducing the Cactus/Ant integration module
- •11.5.3 Creating the Ant build file step by step
- •11.5.4 Executing the Cactus tests
- •11.6 Tuning for build performance
- •11.6.2 Grouping tests in functional test suites
- •11.7.1 Choosing an approach
- •11.7.2 Applying continuous integration
- •11.8 Summary
- •Unit-testing EJBs
- •12.1 Defining a sample EJB application
- •12.2 Using a façade strategy
- •12.3 Unit-testing JNDI code using mock objects
- •12.4 Unit-testing session beans
- •12.4.1 Using the factory method strategy
- •12.4.2 Using the factory class strategy
- •12.4.3 Using the mock JNDI implementation strategy
- •12.5 Using mock objects to test message-driven beans
- •12.6 Using mock objects to test entity beans
- •12.7 Choosing the right mock-objects strategy
- •12.8 Using integration unit tests
- •12.9 Using JUnit and remote calls
- •12.9.1 Requirements for using JUnit directly
- •12.9.2 Packaging the Petstore application in an ear file
- •12.9.3 Performing automatic deployment and execution of tests
- •12.9.4 Writing a remote JUnit test for PetstoreEJB
- •12.9.5 Fixing JNDI names
- •12.9.6 Running the tests
- •12.10 Using Cactus
- •12.10.1 Writing an EJB unit test with Cactus
- •12.10.2 Project directory structure
- •12.10.3 Packaging the Cactus tests
- •12.10.4 Executing the Cactus tests
- •12.11 Summary
- •A.1 Getting the source code
- •A.2 Source code overview
- •A.3 External libraries
- •A.4 Jar versions
- •A.5 Directory structure conventions
- •B.1 Installing Eclipse
- •B.2 Setting up Eclipse projects from the sources
- •B.3 Running JUnit tests from Eclipse
- •B.4 Running Ant scripts from Eclipse
- •B.5 Running Cactus tests from Eclipse
- •references
- •index
CHAPTER 9
Unit-testing servlets and filters
Listing 9.4 Implementation of getCommand that makes the tests pass
package junitbook.servlets;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;

public class AdminServlet extends HttpServlet
{
    public static final String COMMAND_PARAM = "command";

    public String getCommand(HttpServletRequest request)
        throws ServletException
    {
        String command = request.getParameter(COMMAND_PARAM);
        if (command == null)
        {
            throw new ServletException("Missing parameter ["
                + COMMAND_PARAM + "]");
        }
        return command;
    }
}
Running the tests again by typing maven cactus:test yields the result shown in figure 9.7.
Figure 9.7 Execution of successful Cactus tests using Maven
9.2.3 Finishing the Cactus servlet tests
At the beginning of section 9.2, we mentioned that you need to write three methods: getCommand, executeCommand, and callView. You implemented getCommand in listing 9.4. The executeCommand method is responsible for obtaining data from the database. We’ll defer this implementation until chapter 11, “Unit-testing database applications.”
That leaves the callView method, along with the servlet doGet method, which ties everything together by calling your different methods. One way of designing the application is to store the result of the executeCommand method in the HTTP servlet request. The request is passed to the JSP by the callView method (via servlet forward). The JSP can then access the data to display by getting it from the request (possibly using a useBean tag). This is a typical MVC Model 2 pattern used by many applications and frameworks.
Design patterns in action: MVC Model 2 pattern
MVC stands for Model View Controller. This pattern is used to separate the core business logic layer (the Model), the presentation layer (the View), and the presentation logic (the Controller), usually in web applications. In a typical MVC Model 2 design, the Controller is implemented as a servlet and handles all incoming HTTP requests. It’s in charge of calling the core business logic services and dynamically choosing the right view (often implemented as a JSP). The Jakarta Struts framework (http://jakarta.apache.org/struts/) is a popular implementation of this pattern.
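The flow the sidebar describes can be sketched in plain Java, without the servlet API. This is a toy illustration only, not code from the book’s sample application; the Controller, View, and ReportView names are invented for the sketch. The controller calls the business logic, stores the result under a known key in the model, and hands the model to a view it selects.

```java
import java.util.HashMap;
import java.util.Map;

// Toy sketch of the MVC Model 2 flow: controller runs business
// logic, fills a model, then delegates rendering to a view.
interface View {
    String render(Map<String, Object> model);
}

class ReportView implements View {
    public String render(Map<String, Object> model) {
        return "report: " + model.get("result");
    }
}

class Controller {
    // Stands in for the servlet's executeCommand + callView pair
    public String handle(String command) {
        Map<String, Object> model = new HashMap<>();
        model.put("result", "data for " + command); // business logic
        View view = new ReportView();               // choose the view
        return view.render(model);                  // "forward" to it
    }
}

public class Model2Sketch {
    public static void main(String[] args) {
        System.out.println(new Controller().handle("SELECT"));
    }
}
```

In the real servlet version, the "forward to the view" step is a RequestDispatcher forward to a JSP rather than a direct method call.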
You still need to define what objects executeCommand will return. The BeanUtils package in the Jakarta Commons (http://jakarta.apache.org/commons/beanutils/) includes a DynaBean class that can expose public properties, like a regular JavaBean, but you don’t need to hard-code getters and setters. In a Java class, you access one of the dyna-properties using a map-like accessor:
DynaBean employee = ...
String firstName = (String) employee.get("firstName");
employee.set("firstName", "vincent");
The BeanUtils framework is nice for the current use case because you’ll retrieve arbitrary data from the database. You can construct dynamic JavaBeans (or dyna beans) that you’ll use to hold database data. The actual mapping of a database to dyna beans is covered in chapter 11.
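To make the idea concrete without pulling in Commons BeanUtils, here is a minimal, hypothetical map-backed bean that mimics the get/set accessors shown above. SimpleDynaBean is an invented name for this sketch; the real DynaBean API is richer (typed properties, DynaClass metadata, and so on).

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the dyna-bean idea: properties live in a map,
// so no getters or setters need to be hard-coded per property.
class SimpleDynaBean {
    private final Map<String, Object> props = new HashMap<>();

    public Object get(String name) {
        return props.get(name);
    }

    public void set(String name, Object value) {
        props.put(name, value);
    }
}

public class DynaBeanSketch {
    public static void main(String[] args) {
        SimpleDynaBean employee = new SimpleDynaBean();
        employee.set("firstName", "vincent");
        System.out.println(employee.get("firstName"));
    }
}
```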
Testing the callView method
There’s enough in place now that you can write the tests for callView, as shown in listing 9.5.
Listing 9.5 Unit tests for callView
package junitbook.servlets;
[...]
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

import org.apache.commons.beanutils.BasicDynaClass;
import org.apache.commons.beanutils.DynaBean;
import org.apache.commons.beanutils.DynaProperty;

public class TestAdminServlet extends ServletTestCase
{
    [...]
    private Collection createCommandResult() throws Exception
    {
        List results = new ArrayList();

        // Create the objects to be returned by executeCommand
        DynaProperty[] props = new DynaProperty[] {
            new DynaProperty("id", String.class),
            new DynaProperty("responsetime", Long.class)
        };
        BasicDynaClass dynaClass =
            new BasicDynaClass("requesttime", null, props);

        DynaBean request1 = dynaClass.newInstance();
        request1.set("id", "12345");
        request1.set("responsetime", new Long(500));
        results.add(request1);

        DynaBean request2 = dynaClass.newInstance();
        request2.set("id", "56789");
        request2.set("responsetime", new Long(430));
        results.add(request2);

        return results;
    }

    public void testCallView() throws Exception
    {
        AdminServlet servlet = new AdminServlet();

        // Set the result of the execution of the command in the
        // HTTP request so that the JSP page can get the data to display
        request.setAttribute("result", createCommandResult());

        servlet.callView(request);
    }
}
To make the test easier to read, you create a createCommandResult private method. This utility method creates arbitrary DynaBean objects, like those that will be returned by executeCommand. In testCallView, you place the dyna beans in the HTTP request where the JSP can find them.
There is nothing you can verify in testCallView, so you don’t perform any asserts there. The call to callView forwards to a JSP. However, Cactus supports asserting the result of the execution of a JSP page. So, you can use Cactus to verify that the JSP will be able to display the data that you created in createCommandResult. Because this would be JSP testing, we’ll show how it works in chapter 10 (“Unit-testing JSPs and taglibs”).
Listing 9.6 shows the simplest code that makes the testCallView test pass.
Listing 9.6 Implementation of callView that makes the tests pass
package junitbook.servlets;
[...]
public class AdminServlet extends HttpServlet
{
    [...]
    public void callView(HttpServletRequest request)
    {
    }
}
You don’t have a test yet for the returned result, so not returning anything is enough. That will change once you test the JSP.
Testing the doGet method
Let’s design the unit test for the AdminServlet doGet method. To begin, you need to verify that the test results are put in the servlet request as an attribute. Here’s how you can do that:
Collection results = (Collection) request.getAttribute("result");
assertNotNull("Failed to get execution results from the request", results);
assertEquals(2, results.size());
This code leads to storing the command execution result in doGet. But where do you get the result? Ultimately, from the execution of executeCommand—but it isn’t implemented yet. The typical solution to this kind of deadlock is to have an executeCommand that does nothing in AdminServlet. Then, in your test, you can implement executeCommand to return whatever you want:
AdminServlet servlet = new AdminServlet()
{
    public Collection executeCommand(String command)
        throws Exception
    {
        return createCommandResult();
    }
};
You can now store the result of the test execution in doGet:
public void doGet(HttpServletRequest request,
    HttpServletResponse response) throws ServletException
{
    try
    {
        Collection results = executeCommand(getCommand(request));
        request.setAttribute("result", results);
    }
    catch (Exception e)
    {
        throw new ServletException("Failed to execute command", e);
    }
}
Notice that you need the catch block because the Servlet specification says doGet can throw only a ServletException. Because executeCommand can throw any exception, you need to wrap it in a ServletException.
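This wrap-and-rethrow idiom can be shown in isolation with plain Java. The sketch below is illustrative only, not the servlet code itself: FetchException is a made-up stand-in for ServletException, and executeCommand here simply fails so the wrapping path runs.

```java
// Sketch of the wrap-and-rethrow idiom used in doGet: a method whose
// signature allows only one checked exception type catches anything
// thrown by lower layers and wraps it, preserving the cause.
class FetchException extends Exception {
    FetchException(String message, Throwable cause) {
        super(message, cause);
    }
}

public class WrapSketch {
    // Lower layer: may throw anything (here it always fails)
    static Object executeCommand(String command) throws Exception {
        throw new IllegalStateException("database unavailable");
    }

    // Upper layer: only FetchException may escape
    static Object fetch(String command) throws FetchException {
        try {
            return executeCommand(command);
        } catch (Exception e) {
            throw new FetchException("Failed to execute command", e);
        }
    }

    public static void main(String[] args) {
        try {
            fetch("SELECT...");
        } catch (FetchException e) {
            // The original failure is still reachable via getCause()
            System.out.println(e.getMessage() + " / "
                + e.getCause().getMessage());
        }
    }
}
```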
If you run this code, you’ll find that you have forgotten to set the command to execute in the HTTP request as a parameter. You need a beginDoGet method, such as this:
public void beginDoGet(WebRequest request)
{
request.addParameter("command", "SELECT...");
}
The completed unit test is shown in listing 9.7.
Listing 9.7 Unit test for doGet
package junitbook.servlets;
[...]
public class TestAdminServlet extends ServletTestCase
{
    [...]
    public void beginDoGet(WebRequest request)
    {
        request.addParameter("command", "SELECT...");
    }

    public void testDoGet() throws Exception
    {
        AdminServlet servlet = new AdminServlet()
        {
            public Collection executeCommand(String command)
                throws Exception
            {
                return createCommandResult();
            }
        };

        servlet.doGet(request, response);

        // Verify that the result of executing the command has been
        // stored in the HTTP request as an attribute that will be
        // passed to the JSP page.
        Collection results =
            (Collection) request.getAttribute("result");
        assertNotNull("Failed to get execution results from the "
            + "request", results);
        assertEquals(2, results.size());
    }
}
The doGet code is shown in listing 9.8.
Listing 9.8 Implementation of doGet that makes the tests pass
package junitbook.servlets;
[...]
public class AdminServlet extends HttpServlet
{
    [...]
    // Throws an exception if called; not implemented yet
    public Collection executeCommand(String command)
        throws Exception
    {
        throw new RuntimeException("not implemented");
    }

    public void doGet(HttpServletRequest request,
        HttpServletResponse response) throws ServletException
    {
        try
        {
            Collection results = executeCommand(getCommand(request));
            request.setAttribute("result", results);
        }
        catch (Exception e)
        {