In October, we released beta.USA.gov and beta.USA.gov en Español to the public. These websites are where we are building the next version of USA.gov. We want the future federal front door to be more human-centered, actionable, interactive, and accessible. But how do we know if the choices we have made are accomplishing these goals?
To measure the effectiveness of our past choices and to guide future ones, we're evaluating beta.USA.gov in a variety of ways.
We present the website to a variety of possible users and interview them one at a time as they use it. We ask them to perform common and important tasks and observe whether the website makes those tasks easy or hard. We run separate tests for different parts of the website so we can get in-depth information about each one. We even test designs and mockups of web pages so we can catch mistakes before writing code. Because beta.USA.gov is a different website from our main site, USA.gov, we perform similar tests on each so we can compare the results. Find out more about how we adapted our user testing practices to measure the success of the beta site in our blog post: 5 Changes the UX Team Made for Beta.
Coordinating Analytics and Outreach Strategies
To compare beta.USA.gov and USA.gov, our analytics team works in coordination with our outreach team. Currently, most visitors arrive at USA.gov from a search engine with a task in mind. That is not the case with the beta site. We invite visitors to check out beta.USA.gov from USA.gov and through targeted social media and email campaigns. By strategically planning how we invite users to visit the beta site, we can monitor spikes in traffic to certain pages and closely review the feedback from those users, which helps us make decisions about future functionality on the site. Future blog posts will go into more detail about how the USAGov team markets beta.USA.gov.
We invite feedback from visitors at different points in their experience on both beta.USA.gov and USA.gov. We ask visitors whether a page was helpful, whether they found the information they needed, and more. These questions help us identify the groups of pages where we need to focus our efforts. Visitors also tell us about their experiences in their own words. We read these comments to identify improvements and prioritize feature development so we can make beta.USA.gov more helpful.
We want to make sure that our visitors are as successful as possible. One of the ways we do that is by connecting them to the best possible place to accomplish their goal with the least amount of effort. We use custom website analytics to anonymously compare how the users of beta.USA.gov and USA.gov accomplish similar tasks. For example, what percentage of each site's visitors find and click a link to another government agency? How many choose to share the information with others on social media or email?
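The comparison described above can be sketched in a few lines of code. This is only an illustration: the event records, field names, and schema below are hypothetical and are not the team's actual analytics data. It shows one simple way to compute, per site, what share of distinct visitors performed a given action, such as clicking a link to another government agency.

```python
from collections import defaultdict

# Hypothetical anonymized event records; real analytics data
# would come from the sites' measurement tooling, not a list literal.
events = [
    {"site": "usa.gov", "visitor": "a1", "action": "agency_link_click"},
    {"site": "usa.gov", "visitor": "a2", "action": "page_view"},
    {"site": "beta.usa.gov", "visitor": "b1", "action": "agency_link_click"},
    {"site": "beta.usa.gov", "visitor": "b2", "action": "agency_link_click"},
]

def action_rate(events, action="agency_link_click"):
    """Share of distinct visitors on each site who performed the action."""
    visitors = defaultdict(set)   # site -> all distinct visitors
    actors = defaultdict(set)     # site -> visitors who performed the action
    for e in events:
        visitors[e["site"]].add(e["visitor"])
        if e["action"] == action:
            actors[e["site"]].add(e["visitor"])
    return {site: len(actors[site]) / len(visitors[site]) for site in visitors}

rates = action_rate(events)
# With the sample data above: {'usa.gov': 0.5, 'beta.usa.gov': 1.0}
```

Counting distinct visitors rather than raw clicks keeps one very active user from skewing the comparison, which matters when the two sites draw very different traffic, as the post notes.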
Because beta.USA.gov is not finished, some types of pages are still to come. Specifically, our beta site includes interactive tools that the current USA.gov website does not. To account for these differences between the beta site and USA.gov, we are carefully designing how we collect and report data. Our anonymous usage dashboards pinpoint certain key outcomes on both websites. This lets us take into account the different ways we need to measure success for the various features we develop.
Using these strategies across our websites, we hope our data will be informative and allow us to create the best version of USA.gov that we can.
About the Building Beta Series
The “Building Beta” series highlights our approach to creating beta.USA.gov and beta.USA.gov en Español. Follow along to learn how the USAGov team worked to create the new iteration of our bilingual sites. For more information about the beta websites, check out this post.