Is That What You intended()?

May 12, 2016

In the wake of the launch of our new content curation app Curate, the AWeber mobile team has been spending some time turning the soil of our older codebases. The Android team in particular now has three app projects and a common library to maintain – and in each of those, we’ve been looking to reduce the footprint of our Robolectric tests due to test stability issues.

This is a story of one of those refactors and it begins as most developer tales do, with a simple question…

Have you ever wanted to match 1 intents, but only actually matched 0 intents? (If you’ve stared at Espresso’s “Wanted to match 1 intents. Actually matched 0 intents.” assertion failure, you know the feeling.)


You’ve got your rule set up, you’ve made sure that your Intents are being created, and your app is doing the thing it should be doing. So what’s wrong? If you’re like me, the problem is your expectations.

I was tasked with refactoring legacy Robolectric tests and replacing them with newer Espresso framework implementations. Long-term fans of the Engineering Blog may remember that our history with Robolectric hasn’t been spotless. We weren’t checking for anything bizarre, and we were using a standard library – the Intents package provided by the Espresso team. At the time I thought to myself, “We check for Intent usage in our other apps without a problem; this’ll take me maybe an hour.” My team lead checked in a day later, and somewhere nearby you could hear that famous line about assumptions.

SnipeHuntTest.java contains no breakpoints

When you’re driving down the road and you hear a loud popping noise, generally you’ll pull over to the side of the road, step out of your vehicle, look down at the shredded tire and lament to yourself, “Oh darn, looks like my tire’s flat.” In my case, I decided the first thing to do would be to pop the hood and start trying to figure out why there was some odd, grimy build-up on the engine.

Our applications use an Activity with Theme.NoDisplay to perform some routing functions, so we can enter the application from different points based on a variety of conditions. This lets us use code compartmentalization and the natural Android lifecycle to keep our Activity stack nice and tidy while still having tight control over what should be launching and when.
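For illustration, a routing Activity like that is typically declared in the manifest with the framework’s Theme.NoDisplay so it never draws UI. The activity name and intent filters below are hypothetical placeholders, not our actual code:

```xml
<!-- Hypothetical manifest entry: an invisible Activity that inspects the
     incoming Intent in onCreate(), forwards to the right screen, and
     immediately calls finish(). -->
<activity
    android:name=".RouterActivity"
    android:theme="@android:style/Theme.NoDisplay">
    <intent-filter>
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
        <data android:scheme="https" />
    </intent-filter>
</activity>
```

One real constraint worth knowing: as of Android 6.0, an Activity using Theme.NoDisplay must call finish() before onResume() completes, or the system throws an exception.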

During my debugging process, I became focused on the relationship between Theme.NoDisplay, Theme.Transparent, and the Android lifecycle. I was convinced that there was some kind of timing issue preventing my breakpoints from hitting in an expected fashion. Though I did learn some interesting things there regarding IdlingResource usage and how Espresso waits (or, more accurately, doesn’t wait) for events, I wasn’t any closer to figuring out what exactly was happening.


So, after figuring out that I was in the weeds – and that I still needed to fix those tests and close out the cards piling up in the backlog – I refocused inside the debugger and started killing assumptions.

As I stepped through the Espresso code, I finally put together that the Intent tracking flow was proceeding as normal, but the callbacks that were supposed to fire and store that metadata had never been registered. So what was causing my issue? What insane conflux of parameters cost me and my team our precious velocity?

In hindsight, the cause is plain to see. Let’s take a look at my real source of consternation, IntentsTestRule:
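Here is roughly what the relevant part of IntentsTestRule looked like – excerpted and trimmed from the Espresso source, so the exact code in your version may differ:

```java
// Excerpted (and trimmed) from Espresso's IntentsTestRule; exact code
// varies by version. Note where Intents.init() is called.
public class IntentsTestRule<T extends Activity> extends ActivityTestRule<T> {

    @Override
    protected void afterActivityLaunched() {
        // Recording starts only AFTER the Activity under test is up, so any
        // Intents involved in getting it launched are never captured.
        Intents.init();
        super.afterActivityLaunched();
    }

    @Override
    protected void afterActivityFinished() {
        super.afterActivityFinished();
        Intents.release();
    }
}
```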

Turns out, this was not actually doing what I thought it said it was doing. When you want to track Intents in your tests, you use the Espresso Intents package – specifically, Intents.init() and Intents.release(). These two methods start up and tear down the tracking code that lets you see which Intents were fired during your tests. The devil in the details of IntentsTestRule: it calls Intents.init() in afterActivityLaunched(), only once the Activity under test is already running.

Oops. Maybe it would help if Intents.init() were called before the Activity launched?


Of course, all of our other tests using IntentsTestRule worked as they should. In all of those scenarios, we were testing Intents that were created by a running activity. This was a new set of tests specifically launching an Activity from scratch via Intent. I had blinded myself and didn’t fact check to make sure that what I thought was happening was actually happening.
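The whole failure mode boils down to an ordering problem, independent of Android. Here’s a minimal plain-Java model – all names are hypothetical stand-ins, not Espresso APIs – showing why a recorder installed after the launch misses the launch-time traffic:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model of Espresso's Intent recorder: init()/release() stand in
// for Intents.init()/Intents.release(); fire() stands in for sending an Intent.
public class IntentRecorderModel {
    private final List<String> recorded = new ArrayList<>();
    private boolean tracking = false;

    public void init() { tracking = true; }                 // start recording
    public void release() { tracking = false; recorded.clear(); } // stop + reset

    public void fire(String intent) {
        // Intents are only captured while the recorder is up.
        if (tracking) {
            recorded.add(intent);
        }
    }

    public int matched() { return recorded.size(); }

    public static void main(String[] args) {
        IntentRecorderModel espresso = new IntentRecorderModel();

        // IntentsTestRule's order: the Activity launches first, THEN
        // afterActivityLaunched() calls init() -- launch-time Intents are missed.
        espresso.fire("LAUNCH_ROUTER_ACTIVITY");
        espresso.init();
        System.out.println(espresso.matched()); // 0 -- "actually matched 0 intents"
        espresso.release();

        // Manual order: init() before launch -- the Intent is recorded.
        espresso.init();
        espresso.fire("LAUNCH_ROUTER_ACTIVITY");
        System.out.println(espresso.matched()); // 1
        espresso.release();
    }
}
```

The model makes the fix obvious: nothing about the recorder is broken; it is simply brought up too late for tests whose interesting Intents happen during launch.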

Now my path to a solution was clear, and I had a couple of easy options: I could switch back to a plain ActivityTestRule and manually call Intents.init() before the tests execute, or I could subclass IntentsTestRule and override the behavior to initialize earlier. I chose the former, simply because the task in question only covered two tests, and that route had a lower impact on the rest of the test suites. Easy peasy. My tests were passing and everyone was happy.
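A sketch of the option we chose – a plain ActivityTestRule with manual init/release. The Activity names and the assertion are hypothetical placeholders (and the imports use today’s androidx coordinates; in 2016 these lived under android.support.test), but ActivityTestRule, launchActivity(), Intents.init()/release(), and intended() are the real APIs:

```java
import android.content.Intent;
import androidx.test.espresso.intent.Intents;
import androidx.test.rule.ActivityTestRule;
import org.junit.After;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;

import static androidx.test.espresso.intent.Intents.intended;
import static androidx.test.espresso.intent.matcher.IntentMatchers.hasComponent;

public class RouterActivityTest {

    // launchActivity = false: don't auto-launch, so we can start recording first.
    @Rule
    public ActivityTestRule<RouterActivity> rule =
            new ActivityTestRule<>(RouterActivity.class,
                    /* initialTouchMode */ true,
                    /* launchActivity */ false);

    @Before
    public void setUp() {
        Intents.init(); // recording is up BEFORE the Activity exists
    }

    @After
    public void tearDown() {
        Intents.release();
    }

    @Test
    public void routerForwardsToMainScreen() {
        rule.launchActivity(new Intent());
        // The router fires its forwarding Intent during onCreate(), which is
        // captured because init() ran first. MainActivity is hypothetical.
        intended(hasComponent(MainActivity.class.getName()));
    }
}
```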

Where to go from there? I looked at the Git blame for IntentsTestRule and found the code was added to afterActivityLaunched() by Stephan Linzner, a Google employee and Android developer. I would love to speak with him concerning why this was chosen rather than beforeActivityLaunched() and what specific side effects they were hoping to avoid. (I’ve reached out and will certainly provide an update – and maybe a pull request – if that conversation happens.) I’m interested in seeing what a bit of refactoring could do for our other tests that rely on launching by Intent, and whether or not calling Intents.init() sooner will introduce any issues for the rest of our post-launch Intent tests.

All Tests Passed

What have I learned from this experience?

  • If the small task you’re working on goes pear-shaped, go straight to the source, even if the documentation makes you think you shouldn’t need to. A little peek into the source code would’ve saved me a significant amount of time.
  • Expectations should be challenged, especially when you haven’t written the code.
  • Time-box all the things. If you’re reasonably certain that something should take you an hour and you’re still staring at it a day later, stop, take a step back, and ask yourself why you aren’t where you should be. Sometimes the answer you get is worthy of a :picard_face_palm:.

What have you learned from your own testing experiences?

  • What are some of your testing sticky points that led you astray?
  • How did you find out the root causes and what solutions did you develop?

Thanks for joining us on another part of our Behind the App journey. If you have a question or something to say, please leave it in the comments section!

Note: This is a post in the AWeber Engineering Blog series "Behind the App," which reveals important pieces of the behind-the-scenes story of our latest mobile app, Curate. You can find the other posts in the series here.

It's available for download on both the Apple App Store and the Google Play Store.