This list was adapted from my talk at SUGCON EU in April 2018.
Testing elements of your website is a tried and true way to search for improvements in performance and user experience. Most companies focus narrowly on calls to action, but with a modern web content management system or an external testing technology, a much broader range of tests is possible. These examples also try to expand testing into “exploratory” areas as opposed to focusing on “refinement” of existing ideas. By experimenting with different solutions to problems or challenges, you gain the potential for an even better result than the status quo.
1. Mobile vs Desktop Layout
Most companies code their desktop sites to be responsive, but many of those responsive sites use the same content in the same order. This should not be your default assumption, and the entire layout in mobile should be tested extensively. There are many ways clever front-end developers can change a responsive page experience, so talk to them and get ideas. I guarantee they will come up with good ones.
This is just another chapter in the eternal struggle to make content show up across a variety of form factors and keep it effective. Maybe you need to really rethink what goes on your mobile pages. Experiment with stripping out images, copy, graphic elements, and links, and add back in simple infographics, quotes and soundbites, video, or other immersive and easy to see options.
Other areas to explore: Are you designing for people over 50? Are you testing your designs on phones set up for accessibility? You may be shocked at how much the experience can get distorted.
2. Regular Navigation vs Super Navigation
The pendulum swings back and forth on regular navigation and super navigation.
Here we see the Sitecore Habitat demo site, with an extensive navigation showcasing content within the nav itself. Testing ideas like this might be a bit extreme, but there is a wide range of potential here to explore options producing unexpected results - getting at the “exploratory” opportunities we talked about at the beginning.
Remember that evaluation should take place based on the overall aggregated results, not just the performance of a specific page. Tests like navigation will likely change a range of different metrics, but the end result – conversions or revenue per visit – is what matters. (Unless, of course, you have visibility across visits, in which case it should be conversions or revenue per unique visitor.)
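To make the distinction between the two metrics concrete, here is a minimal sketch of computing revenue per visit versus revenue per unique visitor from a list of visit records. The record shape (`visitorId`, `revenue`) is a hypothetical example, not a real analytics API.

```javascript
// Sketch: revenue per visit vs. revenue per unique visitor.
// The visit record shape here is an illustrative assumption.
function revenueMetrics(visits) {
  const totalRevenue = visits.reduce((sum, v) => sum + v.revenue, 0);
  const uniqueVisitors = new Set(visits.map(v => v.visitorId)).size;
  return {
    perVisit: totalRevenue / visits.length,
    perVisitor: totalRevenue / uniqueVisitors,
  };
}

// Visitor "a" converts only on a second visit; the two metrics tell
// different stories about the same navigation test.
const m = revenueMetrics([
  { visitorId: 'a', revenue: 0 },
  { visitorId: 'a', revenue: 30 },
  { visitorId: 'b', revenue: 30 },
]);
// m.perVisit === 20, m.perVisitor === 30
```

A navigation change that adds an extra low-value visit per person can depress revenue per visit while leaving revenue per unique visitor unchanged, which is why cross-visit visibility matters.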
3. Search Results
Search can be incredibly important in some situations. In both B2B and B2C scenarios, increased conversion on search will be an important source of incremental revenue. In an earlier post I used the idea of visitors to a plastics company website, where a single visitor can be worth millions of dollars. When considered this way, a more robust or performant search page can have a large financial impact.
This is an area where we recommend lots of design testing and tweaking, since none of us are as good as Google at search. How results are presented, how much related content is shown, the null-result experience - testing these can be as involved as testing a whole site.
Contrast the basic results here from the Sitecore Habitat demo and a showcase of Coveo’s capabilities. The range is large.
4. Cookie Acceptance Language and Design
With the advent of GDPR, virtually every commercial website now has a cookie acceptance policy. While there is not yet a worldwide standard, most include either a single button acceptance or the ability to select which types of cookies are permitted.
The language around this acceptance often goes through legal review, but now that the initial challenge of meeting GDPR requirements has passed, most companies should revisit this language and test alternate versions that are more visitor-friendly and convey the same information. We’ve started to see some cheeky or snarky versions of this language, which, while entertaining, are probably not an option for most companies. Closely related is the design of the language, buttons, and checkboxes. These could follow traditional testing techniques, with agreement clicks as the measure of success.
5. Number of Form Fields
Thankfully most marketing automation solutions let us rework forms quickly and effectively. The first thing to test, which has been an area of focus since the beginning of the web, is the number of fields.
Lots of research over the years has said to reduce forms to the minimum number of fields possible. But that is easier said than done. What might make sense is to get to the minimum, then test adding one field at a time. Getting this dialed in won’t be easy, because there are so many other things you can test (the next few coming up), but the number of fields is a good place to start.
One very important point to remember, however, is that form data can be attached to the visitor profile and used in future personalization or offers. So the analysis of form completion value needs to be enhanced with the value generated by the additional field data. For example, if you can gather title along with company and location, your personalization can be highly specific and focused on the typical persona of that role, industry, and geography. But say your tests tell you completion rates drop when you ask for all three data points. Which two should you keep? When calculating the value created by a form completion, you must include this evaluation. (And if there is another way you can get this data, such as through inferences based on pages viewed, combine the techniques for the best result.)
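The trade-off described above can be sketched as a simple expected-value calculation: a variant's worth is its completion rate times the value of a completion, where extra fields lower completions but add data value. All the numbers below are illustrative assumptions, not benchmarks.

```javascript
// Sketch: valuing a form variant. Extra fields add personalization data
// value but may lower completion rate. Numbers are illustrative only.
function expectedFormValue(completionRate, leadValue, fieldDataValue) {
  return completionRate * (leadValue + fieldDataValue);
}

// Two fields: higher completion, less data to personalize with.
const twoFields = expectedFormValue(0.30, 100, 10);   // 33
// Three fields: completion drops, but title data improves targeting.
const threeFields = expectedFormValue(0.25, 100, 40); // 35
// Here the third field wins despite the lower completion rate.
```

The point is that the variant with the highest raw completion rate is not automatically the winner once downstream data value is priced in.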
As an aside, testing the opt-in language most companies now add to forms would be another aspect of forms to test, like the cookie acceptance language in #4.
6. Form Placement on Page
Where does a form go? At the bottom? Forms at the bottom probably perform terribly for most companies, so testing somewhere else makes total sense.
You should try everything here. Top, second row, side, middle, bottom, you name it.
Here we look at a B2B site with a multi-step form, with second row right placement. This seems like a good place for it, but why not in the hero? Maybe across the whole width of the screen? Maybe more steps and fewer choices to fit it into a different location on level 2 and level 3 pages? All of these would be candidates for testing.
Sidebar: Tracking Scroll Depth
When investigating form performance and setting up testing hypotheses, it can be very useful to understand scroll depth. Setting this up in Google Analytics is not difficult but takes a little work. It will track various depths down the page (such as Hedgehog’s standard of 10%, 25%, 50%, 75%, and 90%).
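As a minimal sketch of that setup, the snippet below fires an analytics event the first time a visitor scrolls past each threshold. The `gtag` event name and parameters are assumptions to adapt to your own analytics configuration; the threshold logic itself is a pure helper.

```javascript
// Sketch: fire an analytics event once per scroll-depth threshold.
// The gtag event name/parameters are assumptions; adapt to your setup.
const THRESHOLDS = [10, 25, 50, 75, 90];

// Pure helper: which thresholds does this scroll percentage newly reach?
function reachedThresholds(percent, alreadyFired) {
  return THRESHOLDS.filter(t => percent >= t && !alreadyFired.has(t));
}

if (typeof window !== 'undefined') {
  const fired = new Set();
  window.addEventListener('scroll', () => {
    const doc = document.documentElement;
    const percent =
      (100 * (window.scrollY + window.innerHeight)) / doc.scrollHeight;
    for (const t of reachedThresholds(percent, fired)) {
      fired.add(t);
      // Assumes gtag.js is already loaded on the page.
      gtag('event', 'scroll_depth', { percent_scrolled: t });
    }
  }, { passive: true });
}
```

Tag managers can do the same thing without custom code, but the logic is the same: each threshold fires once per page view.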
A typical result is fewer than 20% of visitors make it even halfway down a page. If your form is at the bottom, but only 10% of visitors scroll that far, you’ve made it very difficult to succeed.
I asked the audience at SUGCON how many tracked this and only two people out of more than 100 raised their hands. Which means the vast majority of companies attending, some very sophisticated, were missing a very important data point about their form conversion. If a page with a form converts at 2%, but only 10% of visitors make it to the form, your form is really converting at 20%. So testing farther up the page to expose the form to more visitors would be an important test.
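The arithmetic in that example is worth making explicit: the form's real conversion rate is the page conversion rate divided by the share of visitors who actually reach the form.

```javascript
// Sketch: the "real" conversion rate of a form is page conversions
// divided by the share of visitors who scroll far enough to see it.
function effectiveFormRate(pageConversionRate, reachRate) {
  return pageConversionRate / reachRate;
}

// 2% page conversion, but only 10% of visitors reach the form:
effectiveFormRate(0.02, 0.10); // ≈ 0.2 — 20% of those who see it convert
```

Moving a form that already converts well among those who see it higher up the page is usually a cheaper win than redesigning the form itself.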
7. Stepped vs Long Forms
Another way to approach forms is using multiple steps. This may not convert as well as a single long form (but you should test that!), but it allows the collection of data with each step. This can be key for personalization and intermediate goals, and has the potential to drive more value per visit (a Sitecore capability that can be replicated with difficulty in other CMS and marketing automation solutions) even if outcome conversion is lower.
But stepped forms may work better overall. Test them.
8. Form Field Guidance
Even something as simple as how you provide guidance to filling out forms can have an impact.
Do you use dummy data? Do you use underlines (left example) or boxes (right example) for fields? How do you delineate required fields? If validation fails, how do you communicate it? If you have not looked at validation errors and guidance recently, there is almost certainly room for improvement, as current UX standards change often in this area.
9. Campaign Inclusion Rules
This is a classic area to test in direct/database marketing and makes sense in the context of web-based marketing automation. The rule sets available to a marketer depend greatly on the martech stack and its integrations. With Sitecore, for example, rule sets can use behavioral or attribute data, including in-session data. This flexibility is a natural for testing, particularly if you don’t have analysts who can build true response models, or you don’t have enough data.
Don’t forget to add a campaign count exclusion so your best customers or frequent visitors are not included in everything.
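A minimal sketch of such an inclusion check combines a set of rule predicates with a campaign-count exclusion. The visitor shape, rule style, and cap of three are all hypothetical choices for illustration; real rule engines (Sitecore's included) are far richer.

```javascript
// Sketch: campaign inclusion = rule predicates + a campaign-count cap.
// Visitor shape, rules, and the cap of 3 are hypothetical examples.
function shouldInclude(visitor, campaign, maxActiveCampaigns = 3) {
  // Exclusion first: frequent visitors aren't included in everything.
  if (visitor.activeCampaigns.length >= maxActiveCampaigns) return false;
  return campaign.rules.every(rule => rule(visitor));
}

// Example rules mixing in-session behavior and a profile attribute.
const golfCampaign = {
  rules: [
    v => v.pagesViewed.includes('/golf'),
    v => v.region === 'EU',
  ],
};

shouldInclude(
  { activeCampaigns: ['a'], pagesViewed: ['/golf', '/spa'], region: 'EU' },
  golfCampaign
); // true
```

Testing then amounts to swapping in different `rules` arrays and comparing downstream conversion between the resulting audiences.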
As you get more advanced, adding predictive models to visitor profiles to help decide which campaigns to include them in will be standard operating procedure. (Realistically, segmentation and modeling can fill dozens of textbooks and are well beyond the scope of this discussion, but don’t let that scare you. A few different rule sets will produce very different results and start providing insights to help with improvement.)
10. Delay Between Messages
As we move into marketing automation (greatly improved and a central part of Sitecore 9), there are many, many things to test. But we’ll start with a basic – how long should I delay before sending a follow-up?
Let’s say you have a form completion and send an immediate acknowledgement via email. When should you send the second follow-up? One day later? Two days? Maybe seven?
These are all good options, but without testing them out you really won’t know which is most effective. Like segmentation in #9, there are great depths of analytic options for determining the best approach here, but a few basic tests will start to improve the insights at your disposal.
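One practical detail in a delay test is keeping each contact in the same arm across sends. A minimal sketch, assuming a toy hash and made-up delay arms of 1, 2, and 7 days:

```javascript
// Sketch: deterministically assign each contact to a follow-up delay
// arm so the same person always gets the same treatment.
// The hash is a toy; use a proper hash in production.
const DELAY_ARMS = [1, 2, 7]; // days between messages (hypothetical arms)

function delayForContact(contactId, arms = DELAY_ARMS) {
  let hash = 0;
  for (const ch of contactId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return arms[hash % arms.length];
}

// The assignment is stable across sends:
delayForContact('user@example.com') === delayForContact('user@example.com'); // true
```

Comparing open, click, or conversion rates per arm after enough sends tells you which delay to standardize on.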
11. Test Message Content
Just as important to campaigns is testing the content of our messages. All the classic testing items are in play here including images, layout, copy, call to action, channel, and so on.
Basic tests could cover tone (happy and pleasant, or punchy and direct?), offer (hard-hitting or soft?), and length. If you are using email, subject lines, time of day, day of week, and a host of other items also come into play. Email marketers have done these kinds of tests for decades.
But we see the marketing automation and web content world tends to test messages less often or rigorously than email marketers do, so consider this a reminder to keep it high on your list.
12. Dynamic vs Static Landing Pages
Separate static landing pages have been the standard for SEO for a long time. Would dynamic pages work as well? Test it! Part of what you need to evaluate is whether the faster setup time is offset by the lack of search engine visibility (crawlers only see the default version of a page).
For example, a home page might have surfing info (top image) but someone who came to the site via a tagged campaign for golf might see the bottom hero instead.
This increase in relevancy will virtually always perform better (as this example did) but at a cost of search visibility. As long as the versioning is based on campaigns (that is, the test is based on earned or paid traffic and not organic) then the loss of search visibility should not matter to the overall results.
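The campaign-based versioning described above can be sketched as a lookup on the campaign query parameter, with organic and untagged traffic falling back to the static default. The parameter name `utm_campaign` is the common tagging convention; the variant names are made up for illustration.

```javascript
// Sketch: choose a hero variant from the utm_campaign query parameter,
// defaulting for organic/untagged traffic so crawlers see the static
// version. Variant names are hypothetical examples.
const HERO_VARIANTS = { golf: 'golf-hero', surf: 'surf-hero' };

function heroVariant(url, fallback = 'surf-hero') {
  const campaign = new URL(url).searchParams.get('utm_campaign') || '';
  const key = Object.keys(HERO_VARIANTS).find(k => campaign.startsWith(k));
  return key ? HERO_VARIANTS[key] : fallback;
}

heroVariant('https://example.com/?utm_campaign=golf-spring'); // 'golf-hero'
heroVariant('https://example.com/');                          // 'surf-hero'
```

Because the variant only changes for tagged paid or earned traffic, the default page the search engines index stays stable.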
13. Landing Page Value Propositions
Another classic testing scenario is the value proposition on a landing page.
In this hotel example, there are many separate CTAs. The total value generated by the page may be higher due to the multiple pathways available, but testing different offer options (and possibly fewer options) would be worthwhile.
There are, of course, many other tests you can run to identify areas of your site that could be more productive - I discuss several of those in this previous post. If you'd like to discuss other ways to begin optimizing your website, feel free to reach out to me on Twitter or to Hedgehog directly.