Bob Ricca


Case Study: Sending Email in AWeber

In this case study I cover the thinking and process behind redesigning how scheduling and sending email works for AWeber customers.

Objectives & Goals

  • Simplify the workflow for sending an email with minimal development resources
  • Remove the confusion of our email Queue system
  • Better clarify whether an email is in draft, scheduled, or sent status
  • Place the "Create A New Broadcast" button in the top right-hand corner as part of a bigger initiative to standardize button placement
  • Modernize the design

Evaluating Success

  • All UI changes were subjected to multiple rounds of remote usability testing using our internal beta pool
  • Subjects who participated received a $50 Amazon Gift Card as a thank you for their time
  • As a result of our UX findings, we implemented small tweaks to the UI to further refine the workflow

Analyzing The Old

Before we dig in, let's take a look at the old design for comparison's sake.

Confusing Terminology

Customers, specifically Americans, found the idea of "Queueing" a message very confusing.

Queue simply isn't a commonly used word in our vocabulary.

Using the word "Queue" in the interface was a choice our engineering team made years ago, partly as a well-intentioned gesture to convey how our back-end system actually processes 100k+ customers sending email simultaneously.

Misleading Labels + Bugs = Lack of Confidence for Customers

"Oh no! Is this message already sending?!"
- a panicked customer during our benchmark usability testing

We found that customers would initially get caught up on whether a message was currently scheduled or not.

Even worse, some directly faulted themselves for "not getting it"... and honestly, who can blame them?

At first glance, the label "Pending Broadcasts" can be misleading. It implies that the messages in this table are scheduled and waiting to send, when in reality it's a mix of draft messages and scheduled messages.

Sending email to the past

In the table above, "Send Date" was populated with false information (the date created) until the user actively scheduled the email for a different date.

To see just how much more confusing this could get, assume for example's sake that today is December 1st, 2010.

Now, there's a message I'd like to send that I created one month prior (November 1st, 2010).

When I queue the email to send today (December 1st, 2010), this is the dialog box I'm greeted with.

Because the "Send Date" was populated with the "Date Created"... it appeared as though customers could actually email the past :).
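
To make the flaw concrete, here's a minimal sketch of the display logic at fault. The field names are assumptions for illustration, not AWeber's actual code:

    from datetime import datetime
    from typing import Optional

    def old_send_date_label(send_date: Optional[datetime], created_at: datetime) -> str:
        # Old behavior: with no send date set, silently fall back to the
        # creation date, so a November 1st draft appears to "send" in the past.
        return (send_date or created_at).strftime("%B %d, %Y")

    def new_send_date_label(send_date: Optional[datetime]) -> str:
        # Redesigned behavior: only show a date once one actually exists.
        return send_date.strftime("%B %d, %Y") if send_date else "Draft"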

Now let's look at the Sent messages

A few parts of this design either didn't make sense or were just plain noisy.

Showing that a message was sent to All Subscribers was redundant, considering most messages are sent to an entire list; the exception tends to be targeting specific segments of subscribers.

Differentiating between message types (HTML vs. Text/HTML) was confusing because the labels weren't very intuitive, and we find that most people who prefer a specific message type tend to stick with it. Furthermore, our new editor made this feature obsolete, because we now automatically generate a plain-text version of every email sent to subscribers.
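
As a rough sketch of how that plain-text fallback can work (this is illustrative, not AWeber's actual implementation), the markup can simply be stripped from the HTML body:

    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        # Collects the text nodes of an HTML document, dropping the tags.
        def __init__(self):
            super().__init__()
            self.parts = []

        def handle_data(self, data):
            self.parts.append(data)

    def plain_text_version(html_body: str) -> str:
        extractor = TextExtractor()
        extractor.feed(html_body)
        return "".join(extractor.parts).strip()

    print(plain_text_version("<p>Hi <strong>there</strong>, thanks for subscribing!</p>"))
    # Hi there, thanks for subscribing!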

The mystery caused by raw statistics

Last but not least, let's look at the performance statistics.

Using raw open rate data leads to weird unusable statistics (e.g. 416% of people opened your message).

"How can my open rate be over 100%?"
- a baffled customer during testing

Customers came to a few different conclusions as to why their open rate could be over 100%.

Some thought it had to do with subscribers forwarding messages to friends. Others thought maybe they were somehow using the system wrong.

What is the real answer?

When using raw statistics, we count a single person opening an email multiple times.

We've found that if a subscriber uses a program like Outlook, loading the program defaults to opening the last email received. If they check their email once every hour during an 8-hour workday, that's the equivalent of 8 opens.
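
A quick worked example shows how the math plays out (the numbers are made up for illustration, not real AWeber data):

    # One subscriber checking Outlook hourly can register 8 opens alone,
    # so counting every open event inflates the rate past 100%.
    sends = 100          # subscribers the broadcast was delivered to
    raw_opens = 416      # every open event, repeat opens included
    unique_openers = 62  # distinct subscribers who opened at least once

    raw_open_rate = 100 * raw_opens / sends          # 416.0, the baffling number
    unique_open_rate = 100 * unique_openers / sends  # 62.0, what we show instead

    print(f"raw: {raw_open_rate:.0f}%, unique: {unique_open_rate:.0f}%")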

The Redesign

Brainstorming / Analysis

As with any project, before you seek a solution it's imperative that you do the groundwork to make sure you fully understand the problem you're setting out to solve.

It's also important to realize the ripple effect from making changes of this magnitude.

Teams I've consulted with for analysis include members from Customer Solutions, Education, Development, Data Management, Design and Marketing.

Wireframing

Depending on the nature of the project I'm working on, sometimes I'll draw a variety of potential solutions on a whiteboard before using software such as Balsamiq or Mockflow.

Once I'm satisfied with the wireframe, I'll digitize the solution and create a page that documents the project on Confluence. Over time I've found this to be the most effective way to communicate with large groups of people.

The wireframe below addresses the issues with the original design:

  • Messages are now broken up into three distinct sections — Drafts, Scheduled, and Sent (see the sketch after this list)
  • The queue system has been simplified to two options — "Send Now" and "Schedule"
  • Anything involving manipulating a message has been grouped under the subject line (Edit, Delete, Copy, Send a Test)
  • Attachments only show when applicable
  • Instead of redundantly saying each message is being sent to All Subscribers, we now only show when a message is being sent to a specific subset of subscribers
  • Sent messages use unique, percentage-based statistics instead of raw statistics for easier comparison
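
Conceptually, the first two bullets boil down to deriving one unambiguous status per message rather than lumping drafts and scheduled messages into a single "Pending" bucket. A minimal sketch of that logic, with assumed field names:

    from datetime import datetime
    from enum import Enum
    from typing import Optional

    class BroadcastStatus(Enum):
        DRAFT = "Draft"
        SCHEDULED = "Scheduled"
        SENT = "Sent"

    def status_of(sent_at: Optional[datetime], scheduled_for: Optional[datetime]) -> BroadcastStatus:
        # Each message lands in exactly one bucket; no "Pending" catch-all.
        if sent_at is not None:
            return BroadcastStatus.SENT
        if scheduled_for is not None:
            return BroadcastStatus.SCHEDULED
        return BroadcastStatus.DRAFT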

Comping / Coding

After wireframing the design and going through rounds of stakeholder sign off, I usually work with a developer to implement what I've come up with.

For this particular project I also did the high-fidelity design work and coded the CSS & HTML in a new Git branch. This code was then handed off to a developer to build out the functionality based on all the outlined use case scenarios.

The Outcome

Customers Noticed :)

I love and cherish the fact that I can positively impact real people who depend on our product.

After every customer facing project I keep my eye on Twitter to see what people are saying.

The responses were pretty awesome:

[Embedded customer tweets]

90-Day Retrospective of AWeber's Reskin

I should have been smarter about using branches in Git

Due to the size of this project, and considering it was a universal design refresh, we were forced to release it as part of a larger project.

Moving forward, I've learned to avoid this by being stricter about minimizing and managing my changesets. Not only does this make deployments safer and lower-risk for the company, it also lends itself better to iterative improvement.

We set the bar for team immersion when making customer facing changes

This was the first project at AWeber where we truly did everything in our power to immerse the entire company in the changes.

Throughout the design process I was constantly meeting with members from multiple departments, gathering feedback, digging through usage statistics, sharing my findings in mini presentations, etc.

In the final weeks, I put together a presentation that spoke to each change and the research supporting it, and personally gave it to every member of our Customer Solutions team over a period of three days. I'll never forget the moment two of our guys high-fived when I announced that we were getting rid of our Queue system. (...yeah, we're nerds!)

We also made sure the Customer Solutions team had multiple weeks to test out the new design via our beta pool and provide feedback through an internal tool.

I also put together a PDF that further outlined all the design decisions, showing the old design in comparison to the new design.

The way our team came together to release these changes was nothing short of amazing and I was lucky to be part of it.

We didn't have a centralized spec for this project while in development

As we all know in software development, nothing is as easy as it initially seems. This project really would have benefitted from a proper spec for each edge case scenario we encountered.

We also ran into some issues when moving from our virtual test environment to using live production data that probably could have been avoided with more planning.

User research can sometimes be misleading

As part of this project, we removed raw statistics in favor of percentage-based stats because they were easier for comparison purposes. During both our benchmarking sessions and our usability sessions with the new design, there wasn't the slightest pushback.

Since this release, however, we have iterated on the design, and it now includes a toggle that allows customers to switch between raw and percentage-based statistics.

We have since implemented this new design
