June 15, 2017

Inside the QA Automation Lab

Bottle Rocket’s Quality Assurance team is on an exciting automation journey. We’ve written the code, trained our manual testers, and are now focusing efforts on the place where it will all be put together. We’ve put a lot of hard work into our automation process and we wanted an automation lab that was just as awesome.

Our lab is an open space that is accessible to all. Shelves containing a collection of devices covering every major type, size, and operating system line the walls. Each device has its own stand, and the shelves are backlit to maximize visibility. USB hubs supplying up to two amps per port sit off to the side, allowing devices to charge and sync at the same time. Each hub is connected to an iMac acting as a server, and we anticipate being able to run tests on 32 devices at once per server.

Many of our large, multi-release projects now have automation scripts and can be continuously monitored. Any time a new build is created, a test plan can be triggered to download the build and run a smoke test on selected devices. This eases the pain of running the same tests on build after build, not to mention those server testing sessions in the wee hours of the morning.
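To give a feel for what that kind of trigger involves, here is a minimal sketch, not our actual pipeline: the build feed URL, device serials, and the smoke-test command are placeholders for illustration.

```python
"""Rough sketch of a build-triggered smoke test. The build feed, device
serials, and test command are placeholders, not a real configuration."""
import json
import subprocess
import urllib.request

BUILD_FEED = "https://builds.example.com/latest.json"  # hypothetical CI endpoint
DEVICES = ["emulator-5554", "R58M123ABC"]               # adb serials of selected lab devices

def latest_build() -> dict:
    # Assumes the CI server exposes something like {"version": "...", "apk_url": "..."}.
    with urllib.request.urlopen(BUILD_FEED) as resp:
        return json.load(resp)

def run_smoke_test(build: dict) -> None:
    apk_path, _ = urllib.request.urlretrieve(build["apk_url"], "latest.apk")
    for serial in DEVICES:
        # Install the new build on each selected device, then kick off the
        # smoke suite (the pytest marker and --device option are placeholders).
        subprocess.run(["adb", "-s", serial, "install", "-r", apk_path], check=True)
        subprocess.run(["pytest", "-m", "smoke", f"--device={serial}"], check=True)

if __name__ == "__main__":
    run_smoke_test(latest_build())
```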

Reports are generated and the QA analysts on the project are notified. Testers get reports from these runs, complete with screenshots of each test outcome. The reports are divided into sections, such as pages of the app, and the results are clear-cut, with color-coded pass/fail icons. Each individual test passes or fails based on criteria detailed in the test cases. If a tester can't tell exactly what happened from the screenshots alone, they can swing by the automation lab to run the test again and watch it in person.
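As a rough illustration of how run results can roll up into that kind of report, here is a toy report builder; the data shapes, file names, and styling are invented for the example and are not our actual report format.

```python
"""Toy sketch of grouping results by app page with color-coded pass/fail
markers and screenshot links. Purely illustrative."""
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class TestResult:
    page: str          # section of the app, e.g. "Login" or "Checkout"
    name: str
    passed: bool
    screenshot: str    # path to the screenshot captured for this test

def build_report(results: list[TestResult]) -> str:
    by_page = defaultdict(list)
    for r in results:
        by_page[r.page].append(r)

    rows = []
    for page, page_results in by_page.items():
        rows.append(f"<h2>{page}</h2>")
        for r in page_results:
            color = "green" if r.passed else "red"
            label = "PASS" if r.passed else "FAIL"
            rows.append(
                f'<p style="color:{color}">{label} – {r.name} '
                f'(<a href="{r.screenshot}">screenshot</a>)</p>'
            )
    return "<html><body>" + "\n".join(rows) + "</body></html>"

# Example usage with made-up results:
print(build_report([
    TestResult("Login", "valid credentials", True, "shots/login_ok.png"),
    TestResult("Login", "locked account message", False, "shots/login_locked.png"),
]))
```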

It has been a journey because it wasn't always easy; plenty of issues had to be worked through along the way. Physical devices were a big one. They're the most valuable for testing because they are the closest we can get to real-world use of an app; plenty of issues can be missed on a simulator, like anything related to performance, battery level, or hardware components. The more device variety we can cover, the better. But working with this volume of devices has its hardships. Some required enhancements to our setup: high-performance devices (namely tablets) need a lot of power just to maintain a charge, hence the beefy USB hubs mentioned earlier. The less powerful hubs we used initially simply could not keep up. Different devices can also have different settings menu navigation, meaning our test cases need to be general enough to apply to different hierarchies. Some devices even disconnect after a period of time.
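As one hedged example of keeping settings navigation generic, a test can scroll until an entry with the expected label appears rather than hard-coding a vendor-specific menu path. The sketch below assumes an already-open Appium (UiAutomator2) session passed in as `driver`, and the labels are only examples.

```python
"""Sketch of label-based settings navigation that tolerates different menu
hierarchies. Assumes an existing Appium session; labels are examples."""
from appium.webdriver.common.appiumby import AppiumBy

def open_settings_entry(driver, label: str):
    # UiScrollable scrolls the settings list until the labeled row is on
    # screen, regardless of where a given vendor placed it in the hierarchy.
    selector = (
        'new UiScrollable(new UiSelector().scrollable(true))'
        f'.scrollIntoView(new UiSelector().text("{label}"))'
    )
    entry = driver.find_element(AppiumBy.ANDROID_UIAUTOMATOR, selector)
    entry.click()
    return entry

# e.g. open_settings_entry(driver, "Wi-Fi") on one device,
#      open_settings_entry(driver, "Connections") first on another.
```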

Our automation engineers tackled the disconnect problem with a program they created called Vadr. Vadr is the interface that allows testers to access the lab and its devices remotely. It shows every device and its connection status (so we know when those finicky devices have dropped off) and lets us choose which test or test plan to run on each device. This makes it easy for any QA tester to take advantage of our automation tools.
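Vadr itself is internal, but the kind of status check such a tool depends on can be sketched: poll `adb devices` and flag anything that has dropped off or gone into an unhealthy state. The expected-device list below is a placeholder.

```python
"""Minimal sketch of a device connection check, not Vadr itself."""
import subprocess

def connected_android_devices() -> dict[str, str]:
    # `adb devices` prints a header line followed by "<serial>\t<state>" rows.
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True, check=True)
    devices = {}
    for line in out.stdout.splitlines()[1:]:
        if "\t" in line:
            serial, state = line.split("\t", 1)
            devices[serial] = state   # "device", "offline", "unauthorized", ...
    return devices

expected = {"R58M123ABC", "emulator-5554"}   # placeholder lab inventory
status = connected_android_devices()
for serial in expected:
    print(serial, status.get(serial, "disconnected"))
```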

Our goals for an automation lab were accessibility, visibility, and efficiency. We ended up with that and more. As the physical space was created and evolved, our process and understanding of automation testing grew and solidified with it.

March 7, 2017

Understand Mobile and Web Testing with Bottle Rocket QA

Businesses need stable apps if their mobile experiences are going to make an impact on users. That's why Bottle Rocket clients receive one or more quality assurance (QA) analysts who participate in design and development. Our mobile experiences go through several stages of testing across all required devices. Here's some insight into one aspect of that process.

With the widespread adoption of mobile applications, the industry is implementing new kinds of testing. QA testers used to be concerned primarily with web apps, and even the most seasoned testers familiar with web apps will need to look at mobile apps with a fresh set of eyes. The two kinds of application can look and behave similarly, but there is more to consider when testing mobile apps. Core principles apply to both, however. It's worth mentioning that there are many other kinds of app testing out there, but we'll focus on these two for now.

Before we start with the similarities, a quick note for those unfamiliar with the concept: a web app is a client-server application in which the client (the user interface) runs in a web browser. People use them all the time: webmail, online retail, wikis, instant messaging services, and lots of others. Basically, any activity within a website that goes beyond navigation and reading involves a web application. Just a few things to consider when testing:

  • Internal hardware
  • External hardware
  • Operating system
  • Browser compatibility
  • Network connectivity

Hardware and operating system combinations are something to consider for both web apps and mobile apps. There are many brands and models of desktop computers and mobile devices, and operating systems constantly release new iterations; this is not unique to either platform. Each update needs to be tested thoroughly with an app. Network connectivity is also a concern for both: as wireless connections become more common in homes, web app users won't necessarily be tied to a wired connection. Outages and connection loss can happen, and the behavior of both types of apps needs to be verified.
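One common way to keep those combinations explicit in a test suite is to parametrize each test over the supported pairs; the pairs and the test body below are placeholders, just to show the shape.

```python
"""Sketch of an explicit browser/OS coverage matrix; values are placeholders."""
import pytest

SUPPORTED = [
    ("Chrome", "Windows 11"),
    ("Safari", "macOS 14"),
    ("Firefox", "Ubuntu 22.04"),
]

@pytest.mark.parametrize("browser,os_name", SUPPORTED)
def test_home_page_loads(browser, os_name):
    # Placeholder body: a real suite would start a remote browser session for
    # the given combination and drive the app. The parametrize call is the
    # point here, keeping the coverage matrix explicit and reviewable.
    assert browser and os_name
```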

Mobile app testing involves the above concerns and then some:

  • Internal hardware
  • External hardware
  • Operating system
  • Browser compatibility
  • Network connectivity
  • Syncing multiple devices
  • Mobile network operators
  • Voice commands
  • Gestures
  • Device interruptions

The list could go on, but one of the key differences is syncing multiple devices. Traditionally, web apps need an internet connection to work at all, but some mobile apps let the user work offline and sync that data once the device is connected again. And now that smartphones and tablets are so common (and many people have at least one of each), connectivity and seamless syncing are important. If you carry a tablet with a to-do list app installed and you leave it at home, what do you do? Access the app on your phone. Any changes you make on your phone carry over to the original list on your tablet. There are many test cases around this topic. Are all devices supported? Do problems arise on some operating systems and not others? Can I sync between iOS and Android devices? Will changes made off Wi-Fi be saved when a connection is restored? How long does it take for a change to sync to another device?
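Those sync questions translate naturally into automated checks. Below is a toy model of the offline-edit scenario (queue changes while disconnected, flush on reconnect) with one such test; a real suite would drive the actual app rather than this stand-in class.

```python
"""Toy model of offline edits syncing on reconnect, plus one test for it."""

class ToDoClient:
    """Stand-in for a device's to-do app: edits made offline are queued and
    pushed to the shared server list when connectivity returns."""
    def __init__(self, server: list[str]):
        self.server = server
        self.online = True
        self.pending: list[str] = []

    def add(self, item: str) -> None:
        (self.server if self.online else self.pending).append(item)

    def set_online(self, online: bool) -> None:
        self.online = online
        if online and self.pending:
            self.server.extend(self.pending)   # sync queued edits
            self.pending.clear()

def test_offline_edits_sync_on_reconnect():
    shared_list = []                       # the "server" copy of the list
    phone = ToDoClient(shared_list)
    tablet = ToDoClient(shared_list)

    phone.set_online(False)
    phone.add("buy milk")                  # edit made off Wi-Fi
    assert "buy milk" not in shared_list   # not visible to the tablet yet

    phone.set_online(True)                 # connection restored
    assert "buy milk" in tablet.server     # change carried over to the tablet

test_offline_edits_sync_on_reconnect()
```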

These are just a few examples to consider. Mobile-savvy testers who can write mobile-specific test cases add tremendous value for clients seeking the stable mobile experiences users prefer.
