Under the Hood: BlueCat’s Second Generation Hardware Performance Testing

One of the great things about working at BlueCat is knowing that what we do is foundational to every network connection in an organization. Having been here for several years, I’ve seen many of our customers grow from medium to large to massive. As a key part of the Engineering team, I hear stories of how our customers rely on us for everything from IP-connected clouds to the doors and security cameras that keep them secure – it’s amazing.
No matter what size the organization is, scalability and performance are critical. For large organizations with complex networks this is obvious, but smaller organizations need performance and scalability headroom as well. All the data we see shows how quickly organizations’ device and network connectivity needs are growing, so a smaller appliance that can still deliver high performance can significantly lower the total cost of ownership of the solution.
Part of my role at BlueCat is to build out a world-class testing infrastructure to ensure that our solution exceeds our customers’ demanding expectations. To do this we use a variety of tools, from the industry-recognized Ixia load testing tool, to web UI testing, to custom approaches we’ve built internally to stress some of the unique technology areas within our solution.
However, the best tools are worthless without the right data and the right test direction. As a customer-focused company, we primarily use real customer data in our tests. We are fortunate that many of our large customers have allowed us to use images of their databases to push our solution in the many different ways seen in real production installs.
This is an area of the business that I’m very passionate about and will be writing more about going forward. However, with our latest appliance platforms just announced with some of the best performance in the market, I wanted to start by talking about how we test performance.
DHCP Performance
Under normal operating circumstances, the majority of DHCP requests are lease renewals, which are much less intensive than new lease requests. In most environments, where the lease time for a network is greater than 4 hours, an outage will be shorter than the lease time, so the vast majority of devices will simply renew. The only time you’ll see a 100% new-lease situation is when a complete outage lasts longer than the lease time, or when a large number of devices are turned on simultaneously.
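To put a rough number on that renewal load, here is a quick back-of-the-envelope sketch in Python. It assumes the 30,000-client, 24-hour-lease environment described below, plus the conventional DHCP renewal timer of half the lease time (T1); the figures are illustrative only, not a benchmark result.

```python
# Back-of-the-envelope: steady-state DHCP load when clients only renew.
# Assumes the 30,000-client /16 and 24-hour lease from the test
# environment below, and the standard renewal timer T1 = lease / 2.
clients = 30_000
lease_seconds = 24 * 60 * 60      # 24-hour lease
t1 = lease_seconds / 2            # each client renews halfway through its lease

renewals_per_second = clients / t1
print(f"~{renewals_per_second:.2f} renewals/sec at steady state")  # ~0.69/sec
```

Steady-state renewal traffic is tiny, so the interesting question for sizing is how quickly the server absorbs bursts that include new leases, which is what the scenario below is designed to stress.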
With this in mind, our testing scenario measures the performance of our DHCP server with a 50/50 ratio of new and returning clients, with our firewall turned on, and with a ping check performed on each DHCP lease. Given BlueCat’s options for highly available and redundant DHCP, this methodology makes more sense than one in which all clients are granted new leases (which would only happen after a prolonged outage that our HA makes highly unlikely). We feel this test represents a conservative view of our performance.
Specifically, our DHCP performance setup includes:
- The Ixia load testing tool to provide virtual client load for all our tests
- Multiple test runs to establish consistent results and ensure accuracy
- A DHCP environment set up with:
  - 30,000 clients in a /16 network
  - Lease time of 24 hours
  - Discover timeout of 4 seconds, with 3 retries allowed
Figure 1: The DHCP Testing Environment
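For readers who want a feel for what that client mix looks like on the wire, the following is a minimal, illustrative scapy sketch of the two client behaviours in the 50/50 ratio: a new client broadcasting a DISCOVER and a returning client unicasting a renewal REQUEST. It is not our Ixia test harness, and the interface, MAC, and IP values are placeholders.

```python
# Illustrative sketch of the two DHCP client behaviours in the 50/50 mix.
# Interface, MAC addresses, and IP addresses are placeholders.
from scapy.all import Ether, IP, UDP, BOOTP, DHCP, sendp

IFACE = "eth0"                  # placeholder test interface
SERVER_IP = "10.0.0.2"          # placeholder DHCP server address
SERVER_MAC = "00:11:22:33:44:55"  # placeholder DHCP server MAC


def new_client_discover(mac: str):
    """A brand-new client broadcasts a DHCPDISCOVER to find a server."""
    return (
        Ether(src=mac, dst="ff:ff:ff:ff:ff:ff")
        / IP(src="0.0.0.0", dst="255.255.255.255")
        / UDP(sport=68, dport=67)
        / BOOTP(chaddr=bytes.fromhex(mac.replace(":", "")))
        / DHCP(options=[("message-type", "discover"), "end"])
    )


def returning_client_renew(mac: str, client_ip: str):
    """A returning client in the RENEWING state unicasts a DHCPREQUEST."""
    return (
        Ether(src=mac, dst=SERVER_MAC)
        / IP(src=client_ip, dst=SERVER_IP)
        / UDP(sport=68, dport=67)
        / BOOTP(chaddr=bytes.fromhex(mac.replace(":", "")), ciaddr=client_ip)
        / DHCP(options=[("message-type", "request"), "end"])
    )


if __name__ == "__main__":
    # One packet of each type; the real test drives this mix at scale via Ixia.
    sendp(new_client_discover("02:00:00:00:00:01"), iface=IFACE, verbose=False)
    sendp(returning_client_renew("02:00:00:00:00:02", "10.0.0.50"),
          iface=IFACE, verbose=False)
```

In the actual tests, Ixia generates this traffic at a rate and scale a script like this could never reach, while the ping check and firewall add further per-lease work on the server side.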
DNS Performance
When testing DNS we measure authoritative DNS, as this is typically slower than recursive DNS. The results of our DNS performance testing are below.
The test environment again leverages the Ixia load testing tool. For DNS performance, BlueCat uses:
- 200 unique clients (MAC addresses)
- A DHCP range of 32k
- A single DNS zone that houses 10,000 resource records ranging from testhost-00001.example.com to testhost-10000.example.com
Figure 2: The DNS Testing Environment
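As a small illustration of the query pattern against those testhost records (not the Ixia harness, and with a placeholder server address), a single-threaded dnspython loop such as the one below can sanity-check the zone, although it cannot generate appliance-scale load:

```python
# Illustrative sanity check of the authoritative DNS setup: walk the
# testhost-00001 .. testhost-10000 A records and report a rough query rate.
# The server address is a placeholder; real load generation uses Ixia.
import time

import dns.message
import dns.query

SERVER = "192.0.2.10"   # placeholder address of the DNS server under test


def measure_qps(count: int = 1000) -> float:
    """Send `count` A queries for sequential testhost names, return queries/sec."""
    start = time.perf_counter()
    for i in range(1, count + 1):
        query = dns.message.make_query(f"testhost-{i:05d}.example.com", "A")
        dns.query.udp(query, SERVER, timeout=2)
    return count / (time.perf_counter() - start)


if __name__ == "__main__":
    print(f"~{measure_qps():.0f} queries/sec (single-threaded Python)")
```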
I’ll be following up with more posts on how we test other elements of our system, but I hope this quick overview of our approach is helpful. We’re always pushing ourselves to do our jobs better, and that means working more and more closely with our customers. I’d very much like to speak to more of you in the coming months and welcome your feedback.