Thursday, June 28, 2012

Load-Testing ASP.NET MVC (Part 2 - JMeter introduction)

Although ApacheBench provides some simple and useful metrics, it's a bit limited. In this post I'm going to talk about a much more powerful load-testing tool: JMeter (also from the Apache Software Foundation).

I'll start by replicating the experiment from my previous ApacheBench post, but using JMeter, and gradually adding more complexity to it.

Let's get started:
  • Download JMeter, extract the archive somewhere on your disk and run the <folder>/bin/jmeter.bat file (Java is required)
JMeter is not a particularly intuitive application. When it starts you get this home screen.

You have a Test Plan node and a Workbench. The Workbench works like a sandbox: nothing you place there is implicitly saved with your tests. According to the JMeter manual, the Workbench is:

"(..) a place to temporarily store test elements while not in use, for copy/paste purposes, or any other purpose you desire"

Yeah, a little bit weird. Anyway, the Test Plan is where you place the test itself. If you right-click it you might be a little overwhelmed by all the available options, but the JMeter online documentation does a really nice job of explaining each of them.

Step 1: Simple HTTP Get Requests

Now, a basic JMeter test typically has a Thread Group, a Sampler and a Listener. In the context of our first test this means:
  • Thread Group: configures the number of users and the number of executions, similar to the parameters we tweaked in ApacheBench
  • Sampler: the HTTP Request configuration
  • Listener: handles the request results and produces result metrics.
So, without further ado, let's create all these items:

  • Test Plan > Add > Threads (Users) > Thread Group

  • Let's create 10 users (threads), each performing 10 requests (the loop count).

So, we'll have a total of 100 requests. The ramp-up period is very useful in more realistic testing scenarios: it's the time JMeter takes to start all the threads (users). In this case, as we're trying to mimic the behavior of the ApacheBench test from my previous post, I've set it to 0 (zero) so that all threads start at once.
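For reference, when you save the plan these settings end up in the .jmx file as a ThreadGroup element. This is a trimmed, hand-written sketch of what that looks like (the exact attributes and properties vary between JMeter versions), not an exact dump:

```xml
<ThreadGroup testclass="ThreadGroup" testname="Thread Group">
  <!-- 10 users -->
  <stringProp name="ThreadGroup.num_threads">10</stringProp>
  <!-- ramp-up of 0: start all threads immediately -->
  <stringProp name="ThreadGroup.ramp_time">0</stringProp>
  <!-- 10 iterations per user = 100 requests total -->
  <elementProp name="ThreadGroup.main_controller" elementType="LoopController">
    <stringProp name="LoopController.loops">10</stringProp>
  </elementProp>
</ThreadGroup>
```

Normally you'd never edit this by hand, but it's handy to know where the GUI values end up when you diff or version-control your test plans.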

  • Thread Group > Add > Sampler > HTTP Request
  • Fill the request information. Obviously the server name, port number and path may vary.

  • Thread Group > Add > Listener > Summary Report
  • Launch the Test
  • The Summary Report shows basic information regarding our test.

As you can see, 100 samples were produced with an average time of 516 ms. There are LOTS of other result listeners, allowing you to show information about each request, export to CSV, draw graphs, etc.
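As a quick sanity check on those numbers: with 10 threads, each firing its next request as soon as the previous one finishes (no think time), an average response time of 516 ms implies a throughput of roughly 19 requests per second. A back-of-the-envelope Python calculation:

```python
threads = 10            # concurrent users in the Thread Group
avg_response_s = 0.516  # 516 ms average, from the Summary Report

# With zero think time, concurrency = throughput x response time (Little's law),
# so expected throughput = threads / average response time
throughput = threads / avg_response_s
print(f"~{throughput:.1f} requests/second")  # ~19.4
```

If the throughput JMeter reports is far below this, something other than the server's response time (e.g. the load generator itself) is the bottleneck.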

Step 2: HTTP Get Requests with dynamic parameter and measuring success

We'll take the previous example and send a dynamic sequential parameter, generated by a JMeter counter. The server will then return a success or an error message depending on whether the number is odd or even. So, we'll add to the existing example:

  • Counter
  • Tree Listener
  • Response Assertion

First of all, we'll have to change the MVC project.

The action implementation will be:
public async Task<ViewResult> Index(int id)
{
    // simulate 500 ms of server-side work
    await Task.Delay(500);

    this.ViewBag.Id = id;
    this.ViewBag.Success = id % 2 == 0;

    return View();
}
And the view:

@{
    ViewBag.Title = "Index";
}

<p>@(ViewBag.Success ? "Success" : "Error")</p>

<p>Item ID: @ViewBag.Id</p>

So, opening the page should show a different result depending on whether the parameter is odd or even:

Regarding JMeter:

  • Thread Group > Add > Config Element > Counter

  • Fill in the counter information. Basically, it'll increment its value by one on each iteration and store the result in a variable called COUNTER.

  • Let's change our HTTP request to use this counter value. In JMeter, variables are referenced with the ${VARIABLE} notation. Thus, in our case, the HTTP request path should become: /Home/Index/${COUNTER}
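In rough Python terms (just an illustration of the behavior, not anything JMeter actually runs), each sampler execution does something like this:

```python
from itertools import count

# Rough analogue of the Counter config element: start at 1, increment by 1
counter = count(start=1)

def next_request_path() -> str:
    # JMeter substitutes ${COUNTER} with the current counter value on each sample
    return f"/Home/Index/{next(counter)}"

print(next_request_path())  # /Home/Index/1
print(next_request_path())  # /Home/Index/2
```

Note that the real Counter element also lets you track the value per thread or share it across all threads; the sketch above corresponds to the shared case.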

Tree Listener

This is my favorite result listener during development as it shows each request, including the full response.
  • Thread Group > Add > Listener > View Results Tree

  • Rearrange the items using drag & drop so that they look like this:

  • Launch the Test
Now, besides having the Summary Report as before, we can also see each request in the "View Results Tree" item. There we can confirm that each request is sent with a different ID (from the counter) and that the response message differs between odd and even IDs.

Response Assertion

Now, sometimes we need to measure the success of a request based on the response we got. In this case, we'll consider a request successful if the response contains the text "Success".
  • Thread Group > Add > Assertions > Response Assertion

  • Configure the assertion

  • Now run the test. Half of the requests will be flagged as errors (the ones with odd IDs)
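With the "Response Text" field and the "Contains" rule, the assertion essentially boils down to checking the response body for the pattern. A minimal Python sketch of what it does (illustrative only):

```python
def passes_assertion(response_body: str) -> bool:
    # "Response Text" field + "Contains" rule with pattern "Success":
    # pass if the pattern occurs anywhere in the body
    return "Success" in response_body

print(passes_assertion("<p>Success</p>"))  # True  -> even ID, counted as success
print(passes_assertion("<p>Error</p>"))    # False -> odd ID, counted as an error
```

Samples that fail the assertion show up as errors in the listeners, exactly as if the HTTP request itself had failed.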

This response assertion thing is more important than it may seem at first glance. Follow my advice: always have a Response Assertion in place to make sure your results aren't tainted by requests that "succeeded" with the wrong content.

Now, this post is getting a bit long, so I'll wrap things up in a third post where I'm going to talk about some additional things:
  • Using Variables and other misc stuff
  • Submitting HTTP POST (also, using the HTTP Proxy)
  • Handling authentication 
  • Handling the ASP.NET MVC anti-forgery token
  • Loading data from a CSV file


  1. Hi Pedro,

    I think the 50% errors you got are because you ran your load test on localhost and used localhost as the target server. Regarding assertions: it's good to check the response, but it reduces performance, and it's recommended not to use assertions when you tune a JMeter script to run against production.

  2. The 50% errors were done on purpose. I generate half of the pages with the "Error" message to be used by the assertion.

    Also, although the assertion does impact performance, I disagree that you should never use it against production. It serves as an extra degree of confidence in your results, which can be invaluable sometimes.