Category Archives: Experimentation

This is not a test: Google Optimize now free — for everyone

Businesses often have one big question for us: How can they better understand their website visitors and deliver more relevant, engaging experiences?

To help businesses test and take action, last spring we launched our enterprise-class A/B testing and personalization product, Google Optimize 360. We saw great demand, so we made it more accessible with a free beta version last fall — and that response also exceeded our expectations, with over 250,000 users requesting an Optimize account.

Today we're very excited to announce that both Optimize and Optimize 360 are now out of beta. And Optimize is now immediately available to everyone — for free. This is not a test: You can start using it today.

Easy to implement 

A recent survey showed 45% of small and medium businesses don’t optimize their websites through A/B testing.1 The two most common reasons given were a "lack of employee resources" and "lack of knowledge to get started."

If you're part of that 45%, Optimize is a great choice for you. It has many of the same features as Optimize 360, and it's just right for small and medium-sized businesses that need powerful testing but don't have the budget or team resources for an enterprise-level solution. Early users have been happy with how easy Optimize is to set up and use. In fact, it's built right on top of Analytics, so if you're already an Analytics user you'll add just a single line of code to get Optimize up and running. With just a few clicks more, you can start using your Analytics data to design experiments and improve the online experience for your users.

Easy to use

Worried about having to hire someone to run A/B tests on your site, or frustrated about not knowing how to do it yourself? Don't be. The Optimize visual editor allows for WYSIWYG (what-you-see-is-what-you-get) editing so you can change just about anything on your site with a drag and a drop. And more advanced users will enjoy the ability to edit raw HTML or add JavaScript or CSS rules directly in the editor.


Powerful targeting capabilities within Optimize allow you to serve the right experiences to just the right set of users. And you have flexible URL targeting capabilities to create simple or complex rules for the pages where you want your experiment to run. To find out if a targeting rule you've set will apply to a specific URL on your site, use the new Optimize URL tester. Just enter a URL and the tester will immediately tell you if that page is a match for your targeting rule.
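To make the idea concrete, here is a minimal sketch of how URL-targeting rules like these can work. The rule names (`equals`, `contains`, `regex`) and the `url_matches` helper are assumptions for illustration, not Optimize's actual options or API:

```python
import re

def url_matches(rule_type, pattern, url):
    """Return True if `url` satisfies a hypothetical targeting rule."""
    if rule_type == "equals":
        return url == pattern
    if rule_type == "contains":
        return pattern in url
    if rule_type == "regex":
        return re.search(pattern, url) is not None
    raise ValueError(f"unknown rule type: {rule_type}")

# A tiny "URL tester": check whether a candidate page matches a rule.
print(url_matches("contains", "/checkout", "https://shop.example.com/checkout/step1"))
```

An exact-match rule is the safest starting point; a `contains` or regex rule lets one experiment cover a whole family of pages, such as every step of a checkout flow.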

Easy to understand

Optimize calculates results based on your existing Analytics metrics and objectives using advanced Bayesian methods, so the reporting shows you exactly what you need to know to make better and faster decisions.
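As a rough illustration of the Bayesian approach (a minimal sketch of the general technique, not Optimize's actual model): treat each variant's conversion rate as a Beta posterior and estimate the probability that the variant beats the baseline by sampling. All the numbers below are made up for the example.

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, samples=100_000, seed=42):
    """Monte Carlo estimate of P(rate_B > rate_A), using uniform
    Beta(1, 1) priors over each variant's conversion rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if b > a:
            wins += 1
    return wins / samples

# Variant B converted 120/1000 visitors vs. baseline A at 100/1000.
p = prob_b_beats_a(100, 1000, 120, 1000)
print(f"Probability B beats A: {p:.0%}")
```

A "probability to beat baseline" readout like this is easier to act on than a p-value: it answers the question decision-makers actually ask.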


We’ve also upgraded the improvement overview to help you quickly see how an experiment affects the metrics you care about most, whether that means purchases, pageviews, session lengths, or whatever else you’re tracking in Analytics.

Easy to try 

Leading businesses are building a culture of growth that embraces the use of data and testing to improve the customer experience every day. We’re delighted to offer Optimize to everyone to help deliver better user experiences across the board.

As of today, Optimize is available in over 180 countries. (A special note for our European users: We’ve added a new data processing amendment to the Google Optimize Terms of Service that you may review in the UI and accept if you wish.) And we're not done yet: Keep an eye out for more improvements and announcements in the future.

What are you waiting for? Try it right now!

Happy Optimizing!

1Google Surveys, "Website Optimization Challenges for SMBs," Base: 506 Small/Medium Business Owners and Managers, Google Surveys Audience Panel, U.S., March 2017.


Reference: Google Analytics Blog - This is not a test: Google Optimize now free — for everyone.


Why Your Testing and Optimization Team Needs a Data Storyteller

If a test happens on your website and nobody hears about it, does it make a sound?

Not to get too philosophical, but that's one of the big challenges of building a culture of growth and optimization: getting the word out. That's why a data storyteller is one of the key members of any testing team.

In fact, “communication and data storytelling” was noted as a critical skill for the person who leads testing and optimization efforts, according to a survey of marketing leaders who conduct tests and online experiments.1 Rounding out the top three must-have skills were leadership and, of course, analytics.



A data storyteller is part numbers-cruncher, part internal marketer, and part ace correspondent from the testing trenches. He or she is someone who can take the sheer data of testing — the stacks of numbers, the fractional wins and losses, the stream of daily choices — and turn it into a narrative that will excite the team, the office, and (especially) the C-suite.

Storytelling doesn't just mean bragging about successes. It can also mean sharing failures and other less-than-optimal outcomes. The point is not just to highlight wins: it's to reinforce a culture of growth, to generate interest in experimentation, and to explain why testing is so good for the company.

"Our test success rate is about 10%," says Jesse Nichols, Head of Growth for Nest. "We learn something from all our tests, but only one in 10 results in some kind of meaningful improvement." That means that a big part of the data storyteller's job is to keep people interested in testing and show them the value.

Watch our on-demand webinar “Test with success — even when you fail” to hear more testing and optimization tips.


If you're the data storyteller for your team, here are three points to remember:
  • Take the long view.  Gaining support for testing is like rolling a rock up a hill: slow going at first, but once you cross the summit the momentum will take over fast. It takes time, so lay the groundwork with lots of short reports. Don't wait to make formal presentations: Look for chances to drop your message into weekly wrap-ups and other group forums. In short, don’t be afraid to over-communicate. 
  • Be specific. It's better to present one great number than 10 so-so ones. Think mosaic rather than mural: Look for specific stories that can represent your larger efforts and broader plans. 
  • Keep your eye on the bottom line. In the end, that's what it's all about. You may be thrilled that a call-to-action change from "see more" to "learn more" increased clicks by 0.03%, but what will really get the CMO and other executives interested is moving the profit needle. As a litmus test, ask yourself, “So what?” If your story doesn’t clearly answer that question in terms the audience cares about, consider giving it a rewrite. 

And remember that it won't always be small victories. "The things you're so sure are going to work are the ones that go nowhere," says Jesse. "Then you do a throwaway test and it makes the company an extra $500,000." That's a story that everyone will want to hear.


Download our eBook How to Build a Culture of Growth to learn more best practices on testing and optimization.


1Source: Google Surveys, U.S., "Marketing Growth and Optimization," Base: 251 marketing executives who conduct A/B tests or online experiments, Oct. 2016.


Reference: Google Analytics Blog - Why Your Testing and Optimization Team Needs a Data Storyteller.


Lessons Learned: Testing and Optimization Tales from the Field

Max van der Heijden is a user experience and conversion specialist at Google who works with companies across Europe, the Middle East, and Africa. Max shares his thoughts about how companies can build a culture of growth and experimentation.


How many times have you launched new features or page designs on your website without testing first?

In an ideal world, companies should test everything before rolling out site changes. But some websites have too little traffic to generate credible results from experiments, and some bugs should just be fixed if they prevent users from achieving their goal. At the very least, analyze your analytics data and use qualitative methods such as user tests and surveys to validate any improvement ideas you have before implementing. If you have the traffic volume: Test!

I’m part of a team at Google that works with advertisers to identify opportunities for improving website user experiences through experiments and testing roadmaps. When our team of UX specialists begins consulting with a new company, the first three things I tell them are:

  1. The possibilities for improvement are enormous. Even if an experiment increases your conversion rate by “just 5%,” you can calculate the positive effect on your revenue.
  2. What works for one may not work for all. No matter how many times we have seen recommendations or “best practices” work on other — maybe even similar — websites, that does not mean it will work for your users or your business.
  3. Expect failures — and learn from them. Testing takes time, and it's hard to know which tests will pay off. Embrace failures and the lessons learned from them.
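Point 1 is easy to make concrete. Here is the back-of-the-envelope arithmetic for a hypothetical shop (all figures are assumptions for the example):

```python
# Illustrative numbers only: what a "just 5%" conversion-rate lift
# means in revenue terms for a hypothetical online shop.
monthly_visitors = 200_000
conversion_rate = 0.02      # 2% of visitors convert
avg_order_value = 50.0      # revenue per conversion

baseline = monthly_visitors * conversion_rate * avg_order_value
lifted = monthly_visitors * (conversion_rate * 1.05) * avg_order_value

print(round(baseline, 2))           # 200000.0 baseline monthly revenue
print(round(lifted - baseline, 2))  # 10000.0 extra revenue per month
```

A 5% relative lift on a 2% conversion rate looks tiny in a report, yet it adds the same percentage straight to the top line, which is why framing tests this way resonates with decision-makers.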

Making the switch from “get-it-live” mode to a test-and-learn mindset takes time and effort. Leading companies are building a culture of growth: one where people focus on using data and testing to optimize the customer experience day by day. Below are some of the key lessons learned as we work with teams embracing this growth mindset.

Get top-level support

When we first talk with customers, we insist a decision-maker attend our meetings. If there's no support from the top, all of our testing ideas could end up on the shelf collecting dust. Obviously, the marketing executive or CEO won’t have an a-ha moment if you frame testing as a way to improve conversions. The trick is to show how testing impacts a business goal, such as revenue or, better yet, profit. Then the decision-maker will have an ohhh moment: As in, “Ohhh, I knew this was important, but I didn’t think about how a small change could have such a big impact on our bottom line.”

Top-level support will help you get the resources you need and unlock the potential of people already working on experiments. Typically we see one or two people who start doing the optimization work. They are usually mid-level designers or data analysts who have an affinity for conversion rate optimization, but they are often working in a silo.

On the other end of the spectrum, we see companies that have fully bought into the power of experimentation. Multiple customers even have a group of product managers who work on projects with a group of specialists, including a data scientist, copywriter, designer, and even a design psychologist.

Tip: Look for these three types of people to jumpstart a culture of growth in your organization.

Prioritize, prioritize, prioritize

You can't test every idea at once. And prioritization should not be a guessing game.

When we surveyed a group of our EMEA advertisers at a conversion rate optimization event, 38% of the respondents said they use their gut or instinct to prioritize, while 14% allow the HiPPO (highest paid person’s opinion) to call the shots.1 Instead, try using a framework that takes into account past lessons learned and resource requirements.

Map test ideas in a speed-versus-impact grid, and prioritize experiments that are quick to launch and likely to have the biggest impact. Keeping track of all prior test results is another way to ensure past learnings come into play when faced with a HiPPO.

Tip: Start with ideas that will be simple to test and look like they could have the biggest potential impact.
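One lightweight way to apply this prioritization is to score each idea on speed and impact and rank by the product. The ideas and scores below are made up for illustration; any consistent scale works:

```python
# Hypothetical test ideas scored 1-5 on speed (how quickly the test
# can launch) and impact (estimated effect on the business goal).
ideas = [
    {"name": "New checkout CTA copy",  "speed": 5, "impact": 4},
    {"name": "Full homepage redesign", "speed": 1, "impact": 5},
    {"name": "Footer link reorder",    "speed": 5, "impact": 1},
]

# Rank by combined score so quick, high-impact ideas float to the top.
ranked = sorted(ideas, key=lambda i: i["speed"] * i["impact"], reverse=True)
for idea in ranked:
    print(f'{idea["name"]}: {idea["speed"] * idea["impact"]}')
```

Even a crude score like this beats gut feel or the HiPPO: it forces the team to state its assumptions, and the scores can be revisited as past test results come in.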


Turn fairweather fans into engaged experimenters

Over time, as you share testing insights and achieve a few wins, more people will jump on board and you’ll need to train people on a repeatable testing framework.

Testing is part of a cycle: What does the data tell you? Did the experiment succeed or fail for every user, or just for a specific segment of users? Analyze your test results, especially from failed experiments, and use those insights to improve the customer experience across your touchpoints. And then conduct another test.

Just as important: How do you keep people excited and engaged in the process? Try using a shared document to invite everyone to submit their improvement suggestions for your website or app. You can even add gamification to this by keeping score of the most impactful ideas. Or, have people guess which test variation will win before you run the test. When you share the results, recognize or reward people who correctly predicted the winner.

Tip: Three ways to get your team engaged with testing and optimization

Feel good about failures

By its very nature, experimentation involves a lot of failure. A typical website might have 10 or 100 or even 1,000 things to test, but it might be that only a small percentage of those tests lead to significant, positive results. Of course, if that one winner leads to a 5% or 10% improvement in conversions, the impact on revenue can be enormous.

When we surveyed EMEA advertisers at one of our events, we found that companies running one to two tests a month had a 51% success rate. But for respondents who said they ran more than 21 tests a month, the success rate decreased to 17%.2

In the beginning, it’s easier to spot areas for improvement and “low-hanging fruit.” The more experiments you run, the more you’ll be focusing on smaller and smaller things. Then, the more you test, the less “successful” you will be. "Our test success rate is about 10%," says Jesse Nichols, Head of Growth at Nest. "But we learn something from all our tests."

Download the guide How to Build a Culture of Growth to learn more about best practices for testing and optimization.

1-2 Source: Conversions@Google 2016 - State of CRO event attendee survey, 145 respondents, EMEA, September 2016.


Reference: Google Analytics Blog - Lessons Learned: Testing and Optimization Tales from the Field.


‘All Killer, No Filler’: The Next Web finds the right message with Google Optimize 360

In a world where consumer behavior can shift on a dime, marketers constantly ask themselves: How can we be more useful to our customers? With all the data businesses collect, the challenge becomes tuning out the noise to focus on insights your team can act on.

Today’s most successful businesses have turned to a new approach: building a culture of growth and optimization. This is where everyone in an organization is using data to test and learn as a means to improve the customer experience every day.

The Next Web, a technology-media company and online publisher, has embraced this testing culture and turned to Google Optimize 360 to help them find just the right message to drive readers to their conference website.

The Next Web Case Study 


The Next Web’s conferences bring tech leaders, entrepreneurs, and marketers together to innovate, share, and look ahead. The first TNW conference was created in 2006 by Patrick de Laive and Boris Veldhuijzen van Zanten, when they couldn’t find the kind of event they needed to showcase their own startup.

That first event drew a respectable 280 attendees, but the founders knew they needed a better way to promote future TNW conferences. That’s when they launched thenextweb.com, a tech news and culture website that today attracts 8 million users a month. The Next Web’s two annual conferences in New York City and Amsterdam now draw over 20,000 attendees.

The Next Web’s marketing team uses promotional messages within articles on thenextweb.com to drive potential attendees to the conference website and sell tickets. To find out which combination of messages works best, they used Google Optimize 360, an integrated part of the Google Analytics 360 Suite.


"We want more people to read content on thenextweb.com as a first step," says Martijn Scheijbeler, who leads the marketing team's efforts. "If we can convince them to become a loyal user, we can try to interest them in different opportunities. In the end, we’d like them to join us at one of our events to experience what The Next Web is really about." 

With one of its conferences coming up, The Next Web's marketing team wanted to compare different headlines and descriptions to see which combination would drive more readers to its conference page. Using Optimize 360, The Next Web team ran a multivariate experiment to discover the combinations that worked best.


For The Next Web, the results were clear: The "All Killer, No Filler" headline with the "This one's different, trust us" description was the winner. During the experiment it performed 26.5% better than the original headline and description, with a 100% probability to beat baseline.

Today The Next Web team tests and optimizes its conference messages day by day. Better messaging means more traffic to The Next Web conference site, and that means more attendees. It also gives the marketing team extra wins like higher awareness and more newsletter signups.

“Optimize 360 and Analytics 360 make testing easy for us,” Martijn says. “They give us much better insights into how many clicks we’re getting for each message. We’re reaching more people who want to come to our conferences, and those better results are going right to our bottom line.”


For more, read the full case study with The Next Web.



Reference: Google Analytics Blog - ‘All Killer, No Filler’: The Next Web finds the right message with Google Optimize 360.


What does a good website test look like? The essential elements of testing

"Test! Test! Test!" We've all heard this advice for building a better website. Testing is the heart of creating a culture of growth ― a culture where everyone on your team is ready to gather and act on data to make the customer experience better day by day.

But how do you run a good test? Is it just a matter of finding something you're not sure about and switching it around, like changing a blue "Buy now" button for a red one? It depends: Did you decide to test that button based on analytics, or was it a wild guess?

Assuming the former, a good test also means that even if it fails, you've still learned something. A bad test may leave your website performing worse than before, and it's worse still if you don't carry those learnings forward.

The key to running good tests is to establish a testing framework that fits your company.

Join us for a live webinar on Thursday, March 9, as Krista Seiden, Google Analytics Advocate, and Jesse Nichols, Head of Growth at Nest, share a six-step framework for testing and building better websites.

Frameworks vary from business to business, but most include three key ideas:

Start with an insight and a hypothesis.
A random "I wonder what would happen if …" is not a great start for a successful test. A better way to start is by reviewing your data. Look for things that stand out: things that are working unusually well or unusually badly.

Once you have an insight in hand, develop a hypothesis about it: Why is that element performing so well (or so badly)? What is the experience of users as they encounter it? If it's good, how might you replicate it elsewhere? If it's bad, how might you improve it? This hypothesis is the starting point of your test.

For example, if you notice that your mobile conversion rate is lower than on desktop, you might run tests to help you improve the mobile shopping or checkout experience. The team at The Motley Fool found that email campaigns were successfully driving visitors to the newsletter order page, but those visitors weren't converting. That led them to experiment with ways to streamline the user experience.

Come up with a lot of small ideas.
Think about all the ways you could test your hypothesis. Be small-c creative: You don't have to re-invent the call-to-action button, for instance, but you should be willing to test some new ideas that are bold or unusual. Switching your call-to-action text from "Sign up now" to "Sign up today" may be worth testing, but experimenting with "Give us a try" may give you a broader perspective.

When in doubt, keep it simple. It's better to start with lots of small incremental tests, not a few massive changes. You'll be surprised how much difference one small tweak can make. (Get inspiration for your experiments here.)

Go for simple and powerful.
You can't test every idea at once. So start with the hypotheses that will be easy to test and make the biggest potential impact. It may take less time and fewer resources to start by testing one CTA button to show incremental improvement in conversion rates. Or, you may consider taking more time to test a new page design.

It may help to think in terms of a speed-versus-impact grid. You don't want quiet turtles (slow to test, little impact); you're looking for noisy rabbits (quick to test, big potential impact).


The best place to begin a rabbit hunt is close to the end of your user flow. "Start testing near the conversion point if you can," says Jesse Nichols, Head of Growth at Nest. “The further you go from the conversion point, the harder it gets to have a test that really rocks — where the ripple effect can carry all the way through to impact the conversion rate.”

Stick with it
A final key: Test in a regular and repeatable way. Establish an approach and use it every time, so you can make apples-to-apples comparisons of results and learn as you go.

A clear and sturdy framework like this will go a long way toward making your team comfortable with testing — and keeping them on the right track as they do.

Download the eBook How to Build a Culture of Growth to learn more about best practices for testing and optimization.


Reference: Google Analytics Blog - What does a good website test look like? The essential elements of testing.


Why Building a Culture of Optimization Improves the Customer Experience

How can we be more useful to our customers today?

That's the simple question that drives any marketing organization focused on testing, improvement, and growth.

But answering the question is not always so simple in our data-rich world. The old challenge of gathering enough data has been replaced by a new one: gleaning insights from the mountains of data we’ve collected — and taking action.

In response to this flood of data, many of today's most successful businesses have turned to a new approach: building what's called a culture of growth and optimization.

This growth-minded culture is one where everyone is ready to:
  • Test everything 
  • Value data over opinion 
  • Keep testing and learning, even from failures 
Most companies have a few people who are optimizers by nature, interest, or experience. Some may even have a “growth team.” But what really moves the dial is when everyone in the company is on board and embraces the importance of testing, measuring, and improving the customer experience across all touchpoints.

"We refuse to believe that our customers’ experiences should be limited by our resources." - Andrew Duffle, Director of Analytics, APMEX

Why should marketers care? Because they'll be leading the revolution: 86% of CMOs and senior marketing executives believe they will own the end-to-end customer experience by 2020, according to a recent survey from the Economist Intelligence Unit.1 And a culture of growth and optimization offers an excellent path to major gains in those experiences.

As testing and optimization proves itself, it tends to generate higher-level investments of support, talent, and resources. The payoff arrives in the form of more visitors, more sales, happier customers and a healthier bottom line.

If you're curious about building a culture of optimization in your marketing organization, register for our Nov. 10 webinar, Get Better Every Day: Build a Marketing Culture of Testing and Optimization.

This webinar will cover:
  • The critical elements of a culture of optimization 
  • Tips for building that culture in your own company 
  • A case study discussion with Andrew Duffle, Director of Analytics at APMEX, a retailer that boosted revenues with continuous testing and optimization 

This kind of culture doesn't happen by command, but it is simple to start building.

We look forward to sharing tips on how you can get started. Happy optimizing!


  1. The Economist Intelligence Unit, "The Path to 2020: Marketers Seize the Customer Experience." Survey and a series of in-depth interviews with senior executives. Survey base: 499 CMOs and senior marketing executives, global, 2016.


Reference: Google Analytics Blog - Why Building a Culture of Optimization Improves the Customer Experience.