

Date: Thursday, 28 Aug 2014 13:15


As we look towards the end of this year and the beginning of 2015, consider how a training in Google Analytics, Google AdWords, or Google Tag Manager may help your career! Choose from seven different cities in the first quarter, ranging from Boston to San Francisco, with stops in Chicago and Denver along the way.

With trainings in cities around the country, we hope you can find a location that is easy to travel to and fun to explore!

Whether you’re just starting out in a new field or looking to gain a deeper understanding of the tools you’re currently using, we have a class for you.

Learn how to better collect and analyze your data with our Google Analytics series, futureproof your website with the flexible Google Tag Manager, or drive qualified traffic to your site through paid search with our Google AdWords trainings.

Choose an option below to learn more about the specific topics we cover and decide which trainings would be right for you!

Google Analytics | Google AdWords | Google Tag Manager

Check out our list of upcoming cities for a training near you! View the schedule here.

Where to Next?

Should we add a city? Tell us where we should go next in the survey below!

Author: "Jon Meck" Tags: "Google Analytics, Google Tag Manager, Pa..."
Date: Tuesday, 26 Aug 2014 13:03


In 2013, LunaMetrics hosted its first free SEO training, designed for local students and recent graduates and partnering with local non-profit organizations. The event was so successful for all who attended that LunaMetrics will offer the free training again this year, over the weekend of October 18-19.

The students who were chosen to participate in last year’s program left with knowledge of SEO best practices and experience optimizing a website for search engine traffic. These employable skills and experiences could be added to their résumés to help kickstart their careers in essentially any field.

LaToya Johnson, then a Carlow University MBA student, participated in the 2013 training. She gave this advice, “…I would encourage other students to take advantage; not only will you gain knowledge, great networking opportunities, and a certificate. You may also discover a passion that you didn’t know that you possessed.”

I also attended this training in 2013, and I too possessed the undiscovered passion that LaToya Johnson spoke of. I had an interest in social media marketing but very little knowledge of search engine optimization, so I eagerly applied and was accepted to attend the training.

That training, led by Andrew Garberson (@Garberson), was unlike any college class or public internet marketing seminar I had ever attended. Andrew’s ability to convey the complexity of SEO to the entire room seemed effortless. Instead of being lectured, we discussed the topics together in a workshop style environment.

College students are rarely offered the opportunity to make a real impact, but this training was an exception. Instead of hypothetical scenarios, we worked with real organizations, real websites, and real challenges!

I made connections with many organizations during the training and followed up with them shortly afterward. I was offered and accepted a digital marketing internship with a non-profit organization. After its conclusion, I applied for an SEO internship with none other than LunaMetrics!

Fast-forward one year, and today I’m a bona fide member of the LunaMetrics Search Department working in SEO, SEM, and Google Analytics. This year, I’m returning the favor by conducting the training that I attended just over a year ago. This year’s training will be held on October 18-19, 2014.

If you’re a local student or a local non-profit who could use some SEO help, read through the descriptions here and consider applying!



LunaMetrics is a company that cares deeply for the Pittsburgh region & its non-profit organizations. This summer, a wacky company-wide scavenger hunt raised over $400 to benefit a local non-profit that was chosen by the winning team.

The Search Department recently began to implement policies that enable its members to offer consultation to their favorite charities. This training strengthens the local non-profits and helps ensure the future of a technologically informed Pittsburgh region.

Author: "Chris Vella" Tags: "Search Engine Optimization, Trainings"
Date: Monday, 25 Aug 2014 15:16


While Google AdWords is a terrific platform for getting your advertising message in front of the right audience, it can take years to master. That’s why we offer our Google AdWords Training courses. The sessions are a terrific opportunity to get your questions answered and learn everything you need to know to maximize your ad spend and generate revenue for your business.

Whether you are brand new to pay-per-click advertising or a seasoned pro, you will learn about strategies and settings to help you maximize your account. Every training is unique, as attendees work in different industries and have different business models. We really try to speak to specific examples in attendees’ accounts and industries.

Fortunately, our trainers have years of experience managing AdWords accounts for a wide variety of business types and actively work on accounts in addition to providing training, so you can be sure the recommendations you receive are time-tested.

However, some questions come up during every training session, and rightly so, as PPC advertising isn’t cut and dried. As I look forward to my next training (two weeks away in Los Angeles!), I thought it might be helpful to review some common questions.

1. Is AdWords Right For My Business?

AdWords can be an effective use of your marketing budget for all types of businesses. Do you work in a long lead cycle and need new sign-ups or registrations? Do you provide a service to a specific geographic area? Does your company sell industry-specific enterprise software? Are you a local pizzeria?

All of these business models could benefit from AdWords’ ability to lower your cost-per-lead and drive qualified traffic to your website.

How it works:

2. How Do I Choose The Right Keywords?

The entire AdWords Search PPC system is designed around displaying your ad when a user searches for a specific term. For example, we might want the ad for our fictional theme park below to trigger and show when someone searches for the words Florida theme parks. When the ad is clicked on, our account will be billed a per-click cost.


In order to find search volume for keywords like “Florida theme parks” and other related terms, we do keyword research! We start by brainstorming various phrases and words that relate to our business with the Marketing team. Then we can run those keywords through a variety of programs to find out how much search interest exists around those terms, and roughly how much we may have to pay if those terms trigger an ad click.

AdWords Keyword Planner, SpyFu and Ubersuggest are a few of the most popular tools to get you started.

3. Seriously Though, What Are These Match Types About?

Just having the keywords isn’t enough. You can control their exposure to searches using match types. These are additional keyword settings that narrow or broaden the search terms triggering your ad. Match types are critical in setting up a profitable account.

Broad Match, Phrase Match, and Exact Match all control how your ad will display based on the user’s search.

Broad Match is the shotgun approach that reaches the largest possible audience (even related search terms).

Exact Match lasers in on only the keyword you specify.

Negative Keywords allow you to specify terms that you do not want triggering your ad, like the name of a competitor or an acronym that may have many meanings.
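To make the distinctions concrete, here is a loose JavaScript sketch of how the three match types narrow a query’s eligibility. This is a simplified model for illustration only, not Google’s actual matching logic:

```javascript
// Simplified model of AdWords match types (illustration only).
function matchesKeyword(searchQuery, keyword, matchType) {
  var query = searchQuery.toLowerCase().trim();
  var kw = keyword.toLowerCase().trim();
  if (matchType === 'exact') {
    // Exact Match: the query must be the keyword itself
    return query === kw;
  }
  if (matchType === 'phrase') {
    // Phrase Match: the keyword must appear as a contiguous phrase
    return (' ' + query + ' ').indexOf(' ' + kw + ' ') !== -1;
  }
  // Broad Match (very loosely modeled here): every keyword word
  // must appear somewhere in the query
  return kw.split(/\s+/).every(function (word) {
    return query.indexOf(word) !== -1;
  });
}

console.log(matchesKeyword("buy women's hats", "women's hats", 'phrase')); // true
console.log(matchesKeyword("hats for women", "women's hats", 'exact'));    // false
```

Note how the same query can qualify under a looser match type while failing a stricter one; that asymmetry is why match types are critical to a profitable account.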

This chart gives you a quick idea of how match types affect an advertiser who wants to show ads when users search for women’s hats:


Google provides a short & sweet walk-through about match types:

4. Did I Structure My Account Correctly?

A well-structured AdWords account gives you better control of your budget, makes management easier, and provides maximum targeting opportunities for your keywords. But let’s back up. AdWords accounts contain Campaigns > Ad Groups > Ads & Keywords.

Once you’ve completed keyword research, organize your keywords into tightly themed groups; these will become your Campaigns (example: “AdWords Training”). Each Campaign contains Ad Groups that are smaller breakdowns (like “AdWords Seminars”, “AdWords Workshops”, “AdWords Training Company”). Then each of those Ad Groups will contain Ads and Keywords.

It will look something like this:

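In data terms, the Campaigns > Ad Groups > Ads & Keywords nesting might be sketched like this (the campaign and ad group names are illustrative, borrowed from the examples above, not from a real account):

```javascript
// Hypothetical account layout; names are examples, not a real account.
var account = {
  campaigns: [{
    name: 'AdWords Training',
    adGroups: [
      { name: 'AdWords Seminars',
        keywords: ['adwords seminar', 'adwords seminars near me'],
        ads: ['Hands-On AdWords Seminars'] },
      { name: 'AdWords Workshops',
        keywords: ['adwords workshop'],
        ads: ['Full-Day AdWords Workshops'] }
    ]
  }]
};

// Walk the hierarchy: every Ad Group lives inside exactly one Campaign,
// and holds its own Ads and Keywords.
account.campaigns.forEach(function (campaign) {
  campaign.adGroups.forEach(function (group) {
    console.log(campaign.name + ' > ' + group.name +
      ' (' + group.keywords.length + ' keywords)');
  });
});
```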

It’s considered a best practice to break your Campaigns and Ad Groups down into themed groups based on your business model. Unfortunately, this isn’t usually a right or wrong situation. It needs to make sense to your particular business!

Need help with this? Look at your website navigation! Below we can see that I may want to have a Campaign called “Training” and a separate Ad Group for each of the training options:


5. Should I Target Mobile?

If you are using the Search network in Google AdWords, you ARE targeting mobile by default. But there are many factors involved in deciding whether or not you really should be targeting mobile users with paid ads.

Ask yourself the following questions:

  • Is your website mobile-friendly?
  • Can a user complete a sign-up or transaction easily on a mobile device?
  • Are these users worth more or less to you than a desktop user?

If so, then go for it! There are even mobile-specific sitelinks that you can use, including ones that display a click-to-call button on mobile phones viewing your ad!


Very commonly, advertisers will want to opt out of mobile targeting for business reasons. There’s no option to check in AdWords to prevent ads from displaying on mobile, but if you use a Device Bid Adjustment set to -100%, you are effectively not bidding on mobile ad placements.


These commonly asked questions are just some of the topics we review in our AdWords 101 course. In that first day, we focus on building your account and your message. AdWords 201 takes a deep dive into account settings for granular fine-tuning and maximum profitability.

Do you have any questions you’d like an AdWords expert to answer? There’s a comment box below.

Author: "Michael Bartholow" Tags: "Paid Search, Trainings"
Date: Friday, 22 Aug 2014 13:41


Spurred on by the Edward Snowden revelations, Google has begun taking security more seriously. After the revelations came out, Google quickly secured and patched their own weaknesses. Now they are pushing to encrypt all internet activity, incentivizing websites to use SSL certificates by giving them a boost in rankings.

During a Google I/O presentation this year called HTTPS Everywhere, speakers Ilya Grigorik and Pierre Far made it clear that this move is not just about encrypting the data passed between server and browser, but also about protecting users from having the metadata surrounding those requests collected.

Though the metadata collected by visiting a single unencrypted website is benign, aggregated across many requests it can pose a serious security risk to the user. Thus, by incentivizing HTTPS, Google has begun to eliminate instances on the web where users could be vulnerable to having information unknowingly collected about them.

I will give you the SparkNotes version of the HTTPS Everywhere presentation, but even that will warrant a TL;DR stamp. My hope is that this outline and the resource links contained within it give you a hub you can use when evaluating and implementing HTTPS on your site.

What is Internet Security?

When Google talks about securing websites and users with HTTPS, they are really talking about three things: Authentication, Data Integrity, and Encryption.

Authentication involves making sure the site you are visiting is who they say they are.

Data Integrity revolves around protecting the data from being modified while in transit.

Encryption, probably what you first think of, is about making data unreadable if someone does get ahold of it.

Together this trio works to prevent passive attackers from listening in on user activity, prohibits attackers from tampering with data while in transit, and inhibits attackers from impersonating the destination site. In the past implementing this level of protection was met with some resistance because of the cost and latency added to running a website.

Ilya and Pierre acknowledge this in their HTTPS presentation and present us with a process of implementing HTTPS in a way that reduces the cost and latency traditionally associated with adding a TLS layer to your site. Their recommendations come straight from Google’s own HTTPS implementations which saw double digit page load improvements over their HTTP counterparts.

There are two checklists: first is the System Admin Checklist; second is the Webmaster Checklist. The System Admin Checklist should be followed in order.

It’s important to note that as of right now HTTPS as a ranking factor is only affecting 1% of sites in search results. So making this change is not urgent. I assume if you are selling things online then you already have HTTPS set up on your site, in which case you’re ahead of the curve.

If you already have HTTPS, I would suggest you use the Qualys SSL Tool to evaluate the level of security your certificate is offering you, talk to your developers about how they’re leveraging keep-alives and session resumption, and ask them if SPDY is implemented on your server. Note: SPDY is currently only available on Apache servers.

Webmasters, consistency is very important when moving a site to HTTPS. Any link or 301 that could direct a user to an HTTP version of the site opens a hole in the security you invested so much time setting up. So take protocol relative URLs seriously and monitor webmaster tools following implementation.

System Admin Checklist: Configuring the Transport Layer Security (TLS) and making it fast.

1. Get a 2048-bit TLS certificate

  • Presentation Start: 10:03
  • Must be 2048-bit, sites using 1024-bit certificates should upgrade
  • You must decide between a single-domain (example.com), multi-domain (example.com, cdn.example.com, example.us), or wildcard (*.example.com) certificate
  • Cost depends on use case
    1. Non-commercial: Free certificates for non-commercial use from StartSSL
    2. Open-Source Project: Free certificate from GlobalSign
    3. Commercial multi-domain certificates for $30+

2. Configure TLS on your server

3. Verify your configuration

  • Presentation Start: 13:20
  • Use Qualys SSL Labs SSL Report Tool to test that your server has been configured correctly

4. Monitor Performance

  • Presentation Start: 14:42
  • Two steps: Asymmetric and symmetric cryptography
  • Asymmetric
    1. Optimize keep-alives
    2. Utilize session resumption
    3. These two work together to eliminate the need for a full authentication handshake which reduces your CPU usage.

5. Tune your server configuration

6. Enable SPDY HTTP2

Webmaster Checklist: Making HTTPS search engine friendly

1. Update site content to request HTTPS resources

  • Fix hardcoded URLs by implementing protocol relative URLs
  • “Protocol relative URLs” just means that your links adopt the protocol of the page they are on, so they automatically switch to https when moving the site from the development server to the live site. Protocol relative URLs also prevent security gaps from cropping up by making sure all links on the site point to the https version of a page. They save both developers and SEOs a lot of headaches.
  • These links should be implemented across the entire site, including resource links. If resources like JavaScript or CSS files have an http link on an https page, the browser will not load those resources.
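For instance, dropping the scheme from a resource link makes it protocol relative (example.com stands in for your domain):

```html
<!-- Hard-coded: always fetched over http, even from an https page -->
<script src="http://example.com/js/app.js"></script>

<!-- Protocol relative: fetched over whichever protocol the page used -->
<script src="//example.com/js/app.js"></script>
```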

2. Set up redirects from http to https, add HSTS, set up rel=canonical, and robots.txt

  • Make sure you’re reducing or eliminating redirect chains; they make latency much worse for mobile users.
  • Eliminate redirects by using HTTP Strict Transport Security (HSTS)
    1. Presentation Start: 26:39
    2. Using HSTS, the browser remembers that it should automatically request HTTPS resources for this site and its subdomains.
    3. Here’s the HSTS Documentation by Mozilla
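HSTS itself is just a response header the server sends on https pages; a typical value looks like this (the one-year max-age and the includeSubDomains directive are common choices, not requirements):

```
Strict-Transport-Security: max-age=31536000; includeSubDomains
```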
  • Rel=canonical
    1. Make sure https pages contain self-referencing canonicals as a means of reinforcing the signal sent to Google with the redirect
    2. Canonical URLs should be hard coded as opposed to protocol relative
    3. Here is the Google Documentation about using canonical URLs
  • Robots.txt
    1. Make sure you are not blocking the http version of the website with robots.txt
    2. If you block the http version of the site then search engines will be unable to crawl the 301s redirecting users from http to https
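On Apache, for example, the http-to-https redirect from step 2 can be handled with mod_rewrite. A minimal sketch (your existing rewrite rules may need to be reconciled with it):

```apache
# 301-redirect every HTTP request to the same URL over HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```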

3. Verify robots.txt, rel=canonical, and 301 redirects are correct

  • Verify all variants. This includes http & https for all www, non-www, and mobile sites.
  • Check the index status of each http and https site. After launching https, HTTP should drop to zero and HTTPS should spike to HTTP’s previous level. If not, then you have probably missed some redirects or hard-coded URLs, or have accidentally blocked something in robots.txt
  • Check Crawl Errors. Check the webmaster tools crawl error report for additional monitoring of the site move. Google recommends you check out this documentation about site moves to help smooth out the process.
Author: "Sean McQuaide" Tags: "Industry News, Search Engine Optimizatio..."
Date: Wednesday, 20 Aug 2014 18:46


Here’s a quick tutorial on how to use Excel to analyze the keywords that have more than one of your site’s pages ranking in Google organic search results.

Your site may have plenty of keywords that have more than one landing page ranking for a variety of reasons. For example, when someone googles “Google Analytics Training”, there are many different LunaMetrics pages that might display, based largely on where the user is located.

Let’s look at how we can break these out and analyze them further.

Step 1: Export all your Google keywords with the associated landing pages.

This can be done in a few seconds.

1. First, open Google Webmaster Tools and go to the landing pages tab of the Search Queries report.

2. Add the parameter &grid.s=100000 to the URL. This displays all landing pages by changing from the default of 25 to up to 100,000 pages.


3. Next, use the Search Queries exporter bookmarklet by Noah Haibach to export ALL the queries by landing page. It can take a minute or two if you have a high-traffic site.


Step 2: Isolate the keywords with more than one landing page.

Now we’ll just pull our desired list of keywords in Excel. Note: I use Excel 2010.

1. Import text file into Excel.


2. Make a table.

In Excel, go to Insert > Table.


3. Use conditional formatting to highlight duplicate keywords.

Highlight the keyword column. Then go to Home > Conditional Formatting > Highlight Cells Rules > Duplicate Values. Now any keyword that appears more than once is highlighted.

Excel table with duplicate keywords highlighted

4. Filter the table to show only cells of color you used for highlighting.

Go to the little arrow button in B1; then Filter by Color.

All pages ranking for a keyword

All done! 

Step 3: Analyze.

Easy enough to pull that list right? Now let’s talk about what to do with it.

Too much data!?

First, if your list seems overwhelming and non-useful at first, I feel you. My list for lunametrics.com had over 2,300 keywords with more than one landing page. Deeply analyzing all of these rows isn’t very valuable. Additionally, manipulating tables like this can be resource intensive if you have a lot of data; it’s possible that Excel could freeze here.

So, you might want to cut out unimportant data. One way is to exclude the lowest trafficked landing pages before you perform step 1.

Alternatively, you can skim off the fat after you export but before you make the table. I like to Advanced Sort by clicks and then by impressions, and then I delete rows at the bottom. I reduced the 2,300 rows for lunametrics.com to about 500 by deleting rows with 0 clicks and fewer than 10 impressions. In the last screenshot, I did this, then I alpha-sorted by keyword.

Figure out why your keywords might have more than one page ranking.

Understanding why you have multiple pages ranking can help you understand your search engine visibility better, and help you identify opportunities for improvement. Below are a few common reasons this could happen:

  • Localization – Sometimes a search query made by a nearby user will return one page (for example, your homepage or store page shows in a local 7-pack), whereas the same query from a more distant user will return a different page. This technique is one of the best ways I know to grab a nice list of relevant keywords impacted by location.
  • Rankings changes – Perhaps one page was ranking consistently for a search query, and now another one is.
  • Multiple pages ranking simultaneously – if you’re really dominating the rankings.
  • Sitelinks or other extra links (CAUTION) – The data I often see in GWT on Sitelinks, breadcrumb links, event listing links, or other rich snippet links is often not what I’d expect, and I’m not sure what is going on. For example, I always see Sitelinks when I Google “lunametrics”; however, the # of impressions listed by GWT for the home page for the query “lunametrics” is several times greater than any other landing page for that query. As another example, the breadcrumb links to a category page within the Google listings for a client’s product pages do not appear to show up either.
  • To that last point, be wary of jumping to conclusions with GWT search query data. It’s probably better to use this analysis for creating general hypotheses.

Sample analysis questions:

  • What queries are impacted by localization? Should I change up how we’re doing geo-targeting and local SEO?
  • Which page do I want to rank for keyword X?
  • Why is page x ranking for these keywords? Should I shift the targets on page x?
  • Do I have keyword cannibalism? (See #4 in 11 Keyword Targeting Mistakes for a definition.)
  • Are there any pages getting substantial traffic from unexpected queries? Why?
  • Are there landing page optimization opportunities to give said unexpected traffic a better experience (and improve conversion rates)?


Happy analyzing. Let us know in the comments what pearls of insight you’ve found by analyzing the landing pages of keywords.

Author: "Reid Bandremer" Tags: "Miscellaneous"
Date: Tuesday, 19 Aug 2014 14:15


As analysts and marketers, we always want to track positive performance metrics and conversions in Google Analytics. However, tracking errors is also important to monitor the health of your site and keep track of signals indicating a negative user experience.

Accessing this data gives us a better idea of what’s causing users to get lost and wander into the dark, unattached voids of your domain. Knowing where these problem spots are makes it easier to fix internal links or set redirects.

I’ll show you different ways to view where people are hitting these error pages and where they are coming from, either through your existing setup or by using Google Tag Manager to fire events or virtual pageviews.

404 – User Not Impressed

As a web user, there’s a good chance that you’ve been acquainted with the ominous “404 Page Not Found” error. They come in many flavors, sometimes with illustrations, sometimes with a site search to help find what you were looking for and sometimes it is just a white page with unfeeling black, bold letters.

Some common causes of 404 errors include:

  • A manually mis-typed URL path
  • Third-party sites linking to nonexistent or removed pages
  • Old links from social media platforms (scroll down your Facebook page to 2006, some links you shared might not be valid anymore!)
  • Errors with internal links

No matter how cute or whimsical the page is, a ‘page not found’ error disrupts the user experience. These 404 pages are not actual pages on a site – they are a result of a status code response on the server side and can be thought of as an alert rather than a fixed page. So how can you know if visitors to your site are experiencing these errors?

Easy, Existing Options

In Google Analytics, you may already see the page path that the user attempted to access. If it is an old article that doesn’t exist anymore, the page may show up as something like /2003/04/12/article-title/. Seeing that single pageview in your reports may be your only indication that the page doesn’t exist. Possibly, the page title will give you some indication as well.

If you’re able to determine that a page is an error, either by the page path or title, you could set up 404 errors as goals in Google Analytics. This has been common practice in the past because of the ability to see the funnel a user took before getting to the destination page.

However, goals are best suited for key performance indicators, or KPIs, of your site. Also, keep in mind that non-Premium accounts are limited to 20 goals per view.

Another approach to track 404 response pages is to take advantage of Google’s Webmaster Tools. It can show you what is linking to the missing pages and graph the volume of errors over time. Some limitations are that the errors logged are from the Googlebot crawler (not necessarily viewed by users), you can’t see how it affects users’ overall sessions, and you can’t include it in your analytics reporting.

Using Google Tag Manager

If you don’t have this information in Google Analytics or Google Webmaster Tools, we can track 404 errors as events or virtual pageviews with Google Tag Manager. These examples assume basic pageview tracking is set up and the code for the Tag Manager container is on the 404 page template.

Page Title or Page Header

Sometimes, the easiest solution is to look for something consistent on the page that identifies it as an error.

Typically, an error page template will have a page title that does not change. For example, Google’s 404 pages have the title, ‘Error 404 (Not Found)!!1.’

To target the title of this page, we would first create a custom JavaScript macro to represent the page title. The page title can be captured by using “document.title”, which is supported in all major browsers.
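The macro body is just an anonymous function that returns the title, following Tag Manager’s custom JavaScript macro convention:

```javascript
function() {
  // Tag Manager evaluates this custom JavaScript macro and uses
  // the returned value wherever {{page title}} is referenced
  return document.title;
}
```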


Then, our firing rule for the 404 response event tag would simply be {{page title}} equals “Error 404 (Not Found)!!1.”


One thing to keep in mind when making this rule is that the field is case-sensitive. Also, if the page title starts with something like “Page Not Found” and dynamically adds the path that the user tried to access (“Page Not Found – /notapage”), using “starts with” or a regular expression would be the alternative to “equals.”

This method can be used for an even more complicated scenario where the page title is just the URL or path that the user attempted to access – “/notapage.” This isn’t an ideal situation, but in this instance we could target the header element instead of the page title. For a custom {{header}} macro, you could use jQuery…

function() {
  var header = $('h1:first').text();
  return header;
}

…or JavaScript.

function() {
  var header = document.getElementsByTagName('h1')[0].innerText;
  return header;
}

The firing rule would then be {{header}} equals “Not Found.”

Fire an Event

Now that we’ve identified the 404 error page, we can fire an event to let us know more information about the error. The benefit here is that it gives us an easy-to-view metric inside of Google Analytics, as well as the ability to include any information we like.

Below, the label {{referrer}} is a custom macro that can be made easily in Tag Manager by selecting “HTTP Referrer” as the macro type.



Now we know A) that a user encountered an error and B) how they reached the page. We can fix our site with a redirect and, if possible, remove the offending link to our outdated page.

Adding Code to the Page

One method we use frequently is to send a Google Analytics Event from Google Tag Manager using an “event” that is added to the data layer. When Tag Manager sees this event on the data layer, it triggers a rule, which we attach to the GA Event Tag.

You will likely need cooperation from a developer or the department in charge of maintaining the website since this part involves adding lines of code onto the 404 page template.

The data layer is an object that goes before the container snippet and simply makes variables and events available to Google Analytics and Tag Manager.

Below is the code that we would use for a 404 event:

dataLayer = [{
'category': '404 Response',
'action': document.location.pathname, // the path the user tried to reach
'label': document.referrer,
'event': '404error'
}];

We can use Data Layer Variable macros to pull the event category, action, and label from the data layer.


Next, we would create a firing rule of {{event}} equals ’404error’.


We can use this rule to cause a Google Analytics Event tag to fire, much like the one displayed earlier. Now anytime there is a pageview hit on a 404 response page, an event is triggered and sent to Google Analytics.

Using Virtual Pageviews

Another way to track those lost souls wandering your site is to use virtual pageviews with Google Tag Manager. One benefit to tracking hits this way is the ability to look at the data in the content drilldown under the Behavior reports, or to use this virtual pageview in a Goal Funnel like previously mentioned.

To accomplish this, create a pageview tag instead of an event tag. Under “More Settings” set your document path to begin with 404/ so it will be apparent when analyzing the page data later in Google Analytics.


For the firing rule, you can use any of the rules and methods we mentioned earlier either by using the data layer or targeting the page title or header.

Just make sure to re-use the rule as a blocking rule on your default pageview tag so that your data isn’t polluted by duplicate hits.


Set an alert

Finally, it’s a good practice to set up alerts for errors as well (in the Admin tab under “Custom Alerts”). For the “Alert Conditions”, you will either use the Event Category pictured below, or Page with the condition of “starts with” and a value of “404.” This will send you automated emails when pageviews on error pages go above whatever threshold you specify.


This data will not only allow you to keep track of 404 errors, but it will also give you the ability to analyze why they are happening with the referring URLs, and the page path to implement redirects. Redirecting lost users to content will improve the user’s visit and potentially create a converting visitor who may have left at the sight of an error!

Author: "Samantha Barnes" Tags: "Analytics, Google Analytics, Google Tag ..."
Date: Tuesday, 12 Aug 2014 14:00


Raise your hand if you’ve heard a co-worker say “Ugh, I’ve gotta jump on a call”! Most people don’t look forward to phone calls with clients. There’s the inherent fear that you’re not prepared (It’s hard to imagine the audience in their underpants when you’re only calling one person across the country), or that you don’t have the right report or solution lined up.

If you work in the Search & Analytics fields like we do at our office, it’s quite possible that you have not and will not meet certain clients face-to-face due to distance, so building rapport can be a challenge. You just don’t get to shoot the breeze on the phone like you might during an on-site visit or lunch with your client.

In fact, relationship building is my favorite part of working with clients. Helping them succeed and meet their objectives helps me succeed and meet mine, so I invest in good client relationships wherever I can.

If you don’t share my excitement over client calls, I’ve assembled the following presentation to help you ease any fears when preparing for and executing your next client call.

Not showing? View here

Here are the tips:

1. Frame Each and Every Call before Dialing

Why is this call taking place? Is it a kickoff call with a new client? Are you reviewing a monthly report? Before the call, write down your objective, the information you must convey, and anything you need to ask of your client.

This will ensure you won’t forget any key points. Even spending 5 minutes before a call to gather yourself and your notes can make all the difference!

2. Use Inclusive Language

Everyone likes to feel like they are part of the team. Use pronouns like “we” and “our” to create a group feeling.

I consider myself part of my clients’ teams, and you should too because, if you’re providing a service or consultation, you are!


3. Limit your Buzzwords & Local-isms

Remember your audience when delving into your mental bank of acronyms, abbreviations and business buzzwords, as these can lead to confusion. This can be difficult. I work in SEO and could probably compose an entire sentence of nothing but acronyms, but it may be hard for a client to understand.

In journalism classes they often suggest you “write for your Grandma”, and while I won’t go quite that far, I do recommend that you gauge your client’s technical level and speak to that.

Regarding local-isms or regional turns of phrase, use your judgment. There are certain phrases you may say in casual conversation that just aren’t universal across regions. Take it from me, I live and work in Pittsburgh. If you didn’t know, we have our own language.

It doesn’t translate well on a call with clients in California. Noting any home-spun phrasings you rely on and eliminating them from your client phone calls will lead to more clarity.

4. Be realistic when making promises

Don’t promise more than you can deliver; deliver more than you promised. Part of your client call will be managing client expectations. Whether you are talking about campaign performance, deliverables or contracts, be careful what you say might happen. Sure, the numbers could explode and make everyone rich, but what is the likeliest scenario?

Similarly, don’t promise that you can deliver work 2 weeks before a deadline if it will require your team working around the clock. Your team won’t take kindly to that, and the client, who would have been perfectly fine with it by the normal deadline, now has been let down.

5. Empathize!

Just as everyone wants to be part of a team, everyone wants to feel heard.

Early in my career I was fortunate enough to work closely with a sales executive in the media world who was dynamite at his job. He didn’t talk fast, and he never got mad. He listened to his customers and clients, asked about their problems, then asked how he could help remove barriers to a fix. No hard sell required.

Not only was this incredibly effective, it wasn’t even a “method”, it was just being nice and helpful to people you work with. What a simple thing to do!

6. Getting Back on Track

After you’ve heard about the dwindling marketing budget and your client’s woes from booking a flight for vacation, you do have to loop the conversation back to its original point (See Step 1.).

Lead the call as best you can and be honest and straightforward about what you need. Is it time to review our reports? Do you need a file or access to something? Now is the time to get the conversation going again.

7. Tell a Compelling Story

I LOVE this video from Stanford University Professor of Marketing Jennifer L. Aaker I found via thinkwithgoogle.com. Everyone working with data in any capacity should be required to watch this.

Aaker suggests that stories make data easier to understand and remember, and I couldn’t agree more. Watch it, I’ll wait.

All done? Good, let’s continue.

Every call with your client should tell a story. It won’t always be the big story about how you’re doubling revenue or single-handedly innovating their business, but you must be able to attach a why to anything you are recommending or reporting on.

It’s not enough to simply say that the bounce rate on the client’s website is increasing. Why? What does it mean? How might we fix it? What’s the deeper story here? Many times it’s up to you to parse an explanation from the available data and help the client understand it. In fact, that’s your job.

8. Take a Stand and be an Expert

This is often forgotten during a busy workday, but the client is paying to talk to you! Isn’t that great? Make sure you are delivering value by being prepared, authoritative and firm. If you are offering a strong recommendation for an action, defend it. Explain why it’s considered a best practice or what could go wrong if you don’t do it.

Take ownership, and don’t waver with “umms”, “maybes” and filler words. There are times to be flexible and open-minded, but when you’re sure of a solution or recommendation, be firm. That might just be what the client needs to take back to their team to get the ball rolling.

You’re all set. Good luck.

How do you build relationships over the phone? I’d love to hear your tricks! Leave a note below.

Author: "Michael Bartholow" Tags: "Miscellaneous"
Date: Thursday, 07 Aug 2014 12:45


On July 30th, 2014, Google Analytics announced a new feature to automatically exclude bots and spiders from your data. In the view level of the admin area, you now have the option to check a box labeled “Exclude traffic from known bots and spiders”.

Most of the posts I’ve read on the topic simply mirror the announcement, without really talking about why you would want to check the box. Maybe a more interesting question is why you would NOT want to. Still, most people will ultimately want to check this box. I’ll tell you why, but also how to test it beforehand.

The Spider-Man Transformer is neither a bot, nor a spider, nor a valid representation of my childhood.


What are Bots and Spiders?

The first thing to understand is just what a Bot or Spider is. They are basically automated computer programs, not people, that are hitting your website. They do it for various reasons.

Sometimes it’s a search engine looking to list your content on their site. Sometimes it’s a program looking to see if your blog has new content so they can let someone know in their news reader. Sometimes it’s a service that YOU have hired to make sure that your server is up, that it’s loading speed is normal, etc.

Some of the more basic bots don’t run code on your site, like the JavaScript that Google Analytics requires, so you don’t see them in your traffic reports. You will, however, see them in your server logs if you have access to those. Many web hosts charge by the hit on the server, based on those server logs.

Some sites, like LunaMetrics, get tons of these hits that we only have evidence of on the server log level. All you people who have automated services pinging our servers looking for new posts every few seconds are essentially costing us money. It’s ok, we don’t mind. It doesn’t screw up our analytics data.

The Problem With Smart Bots


A stereotypical bot spike.

The problems start when you learn that some of these computer programs that are running automatically CAN run the Google Analytics code and WILL show up as a hit in Google Analytics. Sometimes a site will barely get touched by these “smart bots” and you won’t give them a second thought, as they won’t be visiting your site enough to have it skew your insights. Other times you’ll get wild and insane spikes in your data, which you’ll have to deal with.

Dealing with these bots can be a big problem. In the past, you’ve often had to hunt them down, identifying them by the browser they report, the number of pages they hit, and other behavioral patterns. Once you figured this out, you could filter out many of them going forward, but they would still remain in your historical data and affect sampling.


This is not a human generated traffic pattern.

Because they remain in your historical data, you’ll be forced to use segments to get rid of them when you look at your property, which will often cause sampling for large sites. Even worse, your total sessions are inflated by these bots, so even if you filter them out, you’ll trigger sampling faster in the interface, and it will sample at a much lower, less accurate sample rate right out of the gate.

Even when you know about the bots AND deal with them, they can still make your life miserable if you’re an analyst.


Goals and Conversions can be affected by a spike in segmented traffic.

The problems don’t even stop there, and it’s not just about traffic. For instance, some bots can even log into your site and pretend to be a specific audience segment. Many of these are ones in services YOU pay for, like Webmetrics.

If you don’t filter out these “super smart bots”, they’ll really mess up your data: you might be looking at a specific audience segment and see a wild swing in traffic or Ecommerce conversion rate. Or worse, the bots ramp up slowly, and you never get a clear indication that something odd happened.

The New Bot and Spider Filtering Feature

Which brings us back to Google Analytics’ new offering. This feature automatically filters all spiders and bots on the IAB/ABC International Spiders & Bots List from your data, a list that is continuously updated as people find new ones. Membership to see this list generally costs from $4,000 to $14,000 a year, but by checking the little box on your view (and on every view where you want them filtered), you get to utilize the list for free. A number of bots and spiders may slip through the list, but usually not for long, and hopefully not long enough to affect your data.

You won’t be able to SEE the actual list they’re using, but you can exclude visits from bots on their list from showing up in your Analytics.


So great, right? Check that box? Not so fast.

Best Practices For Implementing New Filters

I am all for checking the box, but this is a great chance to talk about and implement best practices:

Step 1: Make sure you have an unfiltered view in your property that has zero filters and where you leave this box unchecked. This way, if there IS some sort of error, you’ll have your data in a pure state to go back to, look at, and compare against.

Step 2: Don’t implement it immediately in your main view. We’ve heard reports from people having some problems, or even their Ecommerce being affected. I don’t know how accurate any of these complaints are, but it’s always good practice to put a new filter in a test view first. Create a new test view that mirrors your main one in every other respect, and then check the box.

Let it run for a week or two, and see what sort of difference you have. Investigate major differences and clarify internally what monitoring systems you’re paying for.

Step 3: If you’re happy with the new bot and spider exclusion filter based on this test, then go ahead and implement it in the main view.

I don’t know if this is going to solve all our smart bot and spider problems, but it’s a great start. As someone who recently had to manually exclude hundreds of different IP addresses from a view for a client, I can attest to a single checkbox being a humongous time saver.

So follow the best practices, and hopefully enjoy your cleaner data.

Author: "Sayf Sharif" Tags: "Analytics, Google Analytics"
Date: Tuesday, 05 Aug 2014 15:46


No company shapes the online marketing industry, and all of our careers, like Google. Regardless of whether you use the company’s products, your customers do, and that leaves you no choice but to become a Google expert.

This post outlines 20 things that every marketer should know about Google. Some are huge (and somewhat unimaginable) dollar figures. Others are market share percentages. The one thing they all have in common: you need to know them.

If we missed any important facts, please let us know in the comments.

Search & Mobile Statistics

Google controls 67.6 percent of the US search engine market, well ahead of silver medalist, Bing, which has less than 20 percent.

Google projects that 90 percent of its revenue this year will come from digital advertising. About 20 percent of those earnings are from the Google Display Network.

Google will make $44 billion in ad revenue this year by owning 31.45 percent of the global digital ad market, 20-some points ahead of the runner-up, Facebook.

Last year, nearly $9 billion of its ad revenue came from mobile. That number is expected to increase with mobile adoption.

The Android operating system holds roughly 80 percent of the global smartphone market with about 1.6 billion units around the globe in 2014. Android’s 76 million users in the US make up about 50 percent of the market (compared to Apple’s 40 percent).

The industries with the highest cost-per-click (CPC) on AdWords are insurance, banking, and legal.

Studies suggest that the top organic and paid spots in the Google search results get about twice as many clicks as the second spots.

Google Apps & Tools Facts

There are more than 500 million Gmail accounts with 1 billion Android installations.

IBM and Microsoft have long been the leaders of business email. Surveys found that only one of the Fortune 50 uses Gmail. Look outside of big business, however, and the story changes: in 2014, 60 percent of mid-sized businesses and 92 percent of startups used Gmail. As the next corporate generation matures, expect Gmail to grow with it.

YouTube has 1 billion unique users each month who consume 6 billion hours of content.

Chrome just broke 20 percent of the browser market, putting it in second place, but still well behind Internet Explorer.

Google Analytics is used by somewhere between 10 million and 25 million websites worldwide and various surveys suggest over 50 percent of business websites (both large and small) use Google Analytics.

Final Thing Marketers Should Know

Definition: to google

To google is a verb recognized by Merriam-Webster (and most people on the North American continent). As a verb, and not a proper noun, googling is not capitalized. Google, the company, is capitalized.

There were only 20 slots, so lots of Google facts and statistics did not make the cut. Are there any that you think should be on the list? Please share in the comments.

Author: "Andrew Garberson" Tags: "Paid Search, Search Engine Optimization"
Date: Monday, 04 Aug 2014 12:45


It’s now easier than ever to track and compare performance across articles and blog posts. While Google Analytics shows you pageviews and other key metrics, frequent content comparisons are made difficult by shifting time frames.

How can I compare a blog post that was published this month vs. a blog post that was posted last month? Sure, we can run two different reports, pull it into Excel and start crunching the numbers, but there’s gotta be a better way!

Enter Cohort Analysis. You may have heard this term thrown around before, usually in relation to users on your site and when they first became users. The idea is to group users or sessions into common groups, like everyone who first visited in January, or first-month visitors. Avinash and Justin Cutroni both love cohorts, so obviously we should, too!

In this case, we’re going to use Google Tag Manager to put content into cohorts so we can analyze how they performed in similar time frames. We’ll pass these into Google Analytics as Custom Dimensions so they’re available for analysis. It’s actually much easier than it sounds!

Step One: Find the Published/Posted Date

Like the title says, this post is really geared towards things that get published on a certain date. I originally started with blog posts and articles in mind, but this could apply to anything that is published on a schedule; for instance, a deal site with new deals going up each day.

We need to find a way to get that publish date into Google Tag Manager. Here are three options, in order of my preference.

1. Data Layer
2. URL Structure
3. Element on Page

Data Layer

If you’re using Google Tag Manager, you likely love the data layer. Put all of your important information into one great place so GTM can snag that data and use it. Add this to your site if it’s not currently there, or ask your developer to help out. For us, it was as easy as adding the following PHP code to our Header template in WordPress.

'postedDate' : '<?php echo get_the_date('Y/m/d g:i:s A T');?>'

The ultimate goal is to get the full date with time information on the data layer. If you have more questions about this step, check out this article that explains more about how the data layer works.
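For illustration, here is roughly what the rendered output might look like in the page source once a snippet like the PHP above runs (the date value is made up):

```javascript
// Illustrative only: the server-side template fills in the actual
// publish date; this is an example of the rendered result.
var dataLayer = dataLayer || [];
dataLayer.push({
  'postedDate': '2014/08/04 12:45:00 PM EDT'
});

// A Data Layer Variable macro for "postedDate" would then return:
console.log(dataLayer[0].postedDate); // "2014/08/04 12:45:00 PM EDT"
```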

If you get this to the data layer, setting up the Data Layer Variable macro is pretty easy.


URL Structure

If the data layer step is going to take too long or just isn’t technically possible, it’s time to start getting creative. Where else can you find the publish date? Check out our blog URL structure up in the address bar. For us, we actually have the date available! There’s no time available, but it’s better than nothing!

Our URLs look like this:


We can use a Custom Javascript Macro to extract the date from the URL Path like the examples below.


function () {
    var url = {{url path}};
    var arr = url.match(/^\/blog\/(\d{4}\/\d{2}\/\d{2})\/.+/);
    if (arr) {
        return arr[1];
    } else {
        return null;
    }
}


Element on the Page

Lastly – I’ll mention using an element on the page. See if the date appears somewhere on the page itself that you can grab. Right-click on the date and check whether it’s wrapped in a span or other HTML tag with a unique ID.


If you’re so fortunate to have this available to you, this DOM Element macro is pretty easy to set up as well!


Alternately, try viewing your source code and doing a CTRL+F for variations of your publish date. It may appear in a hidden field or somewhere else on the page in a uniquely identified tag that you can use.

Note of Caution: There’s a reason this is my least preferred method. If you use a DOM Element, there’s a good chance it might not be available when the Pageview Tag fires. Use the highest DOM element on the page that you can, but if that’s not working reliably, you may have to alter the Rule for your Pageview Tag to wait until gtm.dom. Any time you delay your Pageview Tag, you may lose a few pageviews, so keep this in mind!
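If you go this route with a Custom JavaScript macro instead of the DOM Element macro, the logic might look like the hedged sketch below; the element id "publish-date" is an assumption about your markup, and the stand-in document object is only there to demonstrate the behavior:

```javascript
// Hypothetical macro logic: read a publish date from an element.
// In GTM this would be an anonymous function using the real document.
function readPublishDate(doc) {
  var el = doc.getElementById('publish-date');
  return el ? (el.textContent || el.innerText) : null;
}

// Stand-in "document" just to show what the macro would return.
var fakeDoc = {
  getElementById: function (id) {
    return id === 'publish-date' ? { textContent: '2014/08/04' } : null;
  }
};
console.log(readPublishDate(fakeDoc)); // "2014/08/04"
```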

Step Two: Calculating Days/Weeks/Months Since Posted

Now that we have the date that the blog/article was posted, we can quickly calculate numbers of days/weeks and… months? Perhaps.

Again, I’ll reemphasize that it’s best to have the time the content was published AND the time zone! If your visitor is coming from a different time zone, we want to accurately count how long the content has actually been on the site.

Set Up Your Custom JavaScript Macros

Now that we have the publish date, let’s grab today’s date and take the difference.


daysSincePosted – I’m going to round up here, so the first 24 hours will count as Day 1, and so on.

function () {
    var postDate = new Date({{postedDate}});
    var currDate = new Date();
    var daysSincePost = Math.ceil((currDate.getTime() - postDate.getTime()) / 1000 / 60 / 60 / 24);
    if (daysSincePost) {
        return daysSincePost;
    } else {
        return null;
    }
}

weeksSincePosted – We’ll just take days and divide by 7. Again, we’ll start in Week 1.

function () {
    var postDate = new Date({{postedDate}});
    var currDate = new Date();
    var daysSincePost = Math.ceil((currDate.getTime() - postDate.getTime()) / 1000 / 60 / 60 / 24);
    var weeksSincePost = Math.ceil(daysSincePost / 7);
    if (weeksSincePost) {
        return weeksSincePost;
    } else {
        return null;
    }
}

monthsSincePosted – Months are really the toughest thing to do. Some months have 31 days, some have 30, February just hates consistency. If we’re talking about time passed, then months just don’t work well. My advice here is to just go with buckets of 30 days. bucketsOf30DaysSincePost doesn’t have quite the same ring though, so call it months and add an asterisk to your reports.

function () {
    var postDate = new Date({{postedDate}});
    var currDate = new Date();
    var daysSincePost = Math.ceil((currDate.getTime() - postDate.getTime()) / 1000 / 60 / 60 / 24);
    var monthsSincePost = Math.ceil(daysSincePost / 30);
    if (monthsSincePost) {
        return monthsSincePost;
    } else {
        return null;
    }
}

Step Three: Passing this Information In

Now that you have your Macros up and running, it’s time to pass these in as Custom Dimensions (Universal Analytics only). I created my Custom Dimensions in Property Settings and then added them onto the Pageview Tag under Custom Dimensions.



You’ll notice I also passed in the posted date. This is mostly for flexibility, just in case we need it for something else down the line!

Step Four: Custom Reports

Now that you have the info in Google Analytics, you can create all kinds of custom reports. Two simple custom reports, set up like the examples below, use a longer time span but only include data from an article’s first week or month.


Or, after enough time has passed, it will be easy to export the full list and pivot it into a triangle chart with blog title down the left side and week or month across the top.



Note of caution: Because these are Dimensions and not Metrics, we won’t be able to do anything inside the Google Analytics interface that resembles greater-than or less-than selections. If you wanted everything before 60 days, you could use a regular expression like ^[1-5]?[0-9]$, or export the data into another program to crunch.
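A quick sanity check of that regular expression:

```javascript
// ^[1-5]?[0-9]$ matches one- or two-digit day counts below 60.
var firstSixtyDays = /^[1-5]?[0-9]$/;

console.log(firstSixtyDays.test('1'));   // true  -- day 1
console.log(firstSixtyDays.test('59'));  // true  -- last matching day
console.log(firstSixtyDays.test('60'));  // false -- excluded
console.log(firstSixtyDays.test('100')); // false -- excluded
```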

Author: "Jon Meck" Tags: "Analytics, Google Analytics, Google Tag ..."
Date: Wednesday, 30 Jul 2014 17:57


Segments are one of the most powerful features of Google Analytics, and they are often useful for zeroing in on the sets of users who are most valuable to us.

One way of looking at potentially valuable users is to look at the frequency with which they visit the website. Let’s look at a couple of ways to do that in GA.

Dimension: Count of Sessions

The dimension Count of Sessions has been around forever in GA, and it’s the one you’ll find in the Frequency & Recency report.

Screen Shot 2014-07-30 at 8.12.46 AM

Google Analytics keeps track of how many times a user has visited your website, and Count of Sessions is that count for each individual user. The count is incremented each time the user visits. So, for a new user, the count is 1 (i.e., it is the first visit to the site). The next time, the count is 2 (it’s the second visit).

Note that the Count of Sessions dimension is not based on the time period of the reporting. That is, it doesn’t say “this is the second visit during the last 30 days” (for example); it simply says “this is the user’s second visit,” where the first could have occurred prior to the 30-day range displayed. (Of course, you’ll only see sessions that occurred during the specified time period, just like always.)
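As an illustration of those semantics (a mental model only, not GA internals), Count of Sessions behaves like a lifetime counter per user:

```javascript
// Toy model: each visit increments the user's lifetime session count,
// independent of any report date range.
var counts = {};
function recordVisit(userId) {
  counts[userId] = (counts[userId] || 0) + 1;
  return counts[userId];
}

console.log(recordVisit('userA')); // 1 -- first visit ever
console.log(recordVisit('userA')); // 2 -- second visit, even months later
console.log(recordVisit('userB')); // 1 -- a different user's first visit
```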

So Count of Sessions is useful in defining segments if you want to be able to show your most loyal users — say, users who have visited the site more than 10 times in their lifetime. You can find Count of Sessions as an option in the advanced Condition settings in creating a segment.

Count of Sessions under Advanced Condition setting

Metric: Sessions

What if instead of saying “this is the user’s third visit ever”, I want to say “show me users who have visited three times in the last 30 days“?

Up until the recent changes with user-based segments in Google Analytics, you couldn’t do this. But now, there’s an easy option to get such a segment with the Sessions metric (you’ll find it in the Behavior section in the segment settings).

Sessions Metric in Advanced Segments

If you specify Sessions ≥ 3 and look at a specific time period in a report, what you’ll see are only the users who had at least three sessions during that period. This is often more what we are looking for when we want to define segments around frequency, because the whole history isn’t important to us; we just want to know how frequently the user has been to the site in a recent period.

Other helpful options

There are a few other options that may be useful in combination with the above suggestions for defining segments based on frequency or loyalty.

The good old standby dimension User Type tells whether a user is new or returning. (Fair warning: although GA updated its terminology from Visitor to User in most places, the contents of the User Type dimension have the values “New Visitor” or “Returning Visitor”.)

There are also a couple of date-based options for segmentation that may also be useful in tailoring frequency segments. First, just below the Sessions metric in the Behavior section is Days Since Last Session, which is just a count of the number of days that have passed since the user’s last session. (Note that a first session will give “0” as the number of days, as will multiple sessions within the same day.)

Days Since Last Session in Advanced Segments

Second, we can know not just whether a user is new, but how new. The new option Date of First Session lets you specify a date range for the first session by that user.

Date of First Session in Advanced Segments

Via the GA API

The API actually gives you even finer-grained control over how metrics are calculated in defining the segments. You can specify the scope over which a metric is calculated (at the user, session, or hit level) for the purposes of creating the segment. See the documentation for all the gory details.
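As a sketch of what that looks like in practice, a user-scoped segment for the Core Reporting API is expressed through the `segment` query parameter; treat the exact scope keyword and dimension name below as assumptions to verify against the current segment documentation:

```javascript
// Hypothetical example: assemble a segment parameter for "users with
// more than 10 lifetime sessions". Verify the scope/condition syntax
// against the current GA API docs before using.
var scope = 'users';                  // user-level scope
var condition = 'ga:sessionCount>10'; // assumed dimension name
var segmentParam = scope + '::condition::' + condition;

console.log(segmentParam); // "users::condition::ga:sessionCount>10"
```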

How it Works

Google Analytics relies on cookies to make all of this possible. If a user clears their cookies or there is an implementation problem, you’ll see a user show up as a New User when, in reality, it may not be their first time visiting your site. Keep this in mind as you look at the frequency reports. How often people clear their cookies will depend on your site content and the audience it attracts.


Segmenting by frequency and loyalty measurements can be especially useful for content-oriented sites (who are the users who are back most often to read more stuff) as well as ecommerce (they’re back again shopping). How have you combined these options in Google Analytics to get at segments of your most loyal users?

Author: "Jonathan Weber" Tags: "Google Analytics"
Date: Monday, 28 Jul 2014 13:17


In this blog post, I evaluate several of the numerous (and potentially overwhelming) options for processing and reporting on Google Analytics data. The default Google Analytics web interface is great for quick ad hoc data exploration, but limited for deeper analysis and the development of automated reports.

Whether we’re mining for hidden trends or trying to report on hard-to-extract dimensions, there are a number of third-party tools out there that can help ease the burden.

In the first half of this article, I explain the difference between the two types of Google Analytics data: what’s available from the standard interface and what’s available through the BigQuery export.

The second half of this article is an evaluation of three different solutions for processing, visualizing, and reporting on Google Analytics/GA BigQuery data. I evaluate these three solutions (ShufflePoint, Tableau, and R) based on objective features and my subjective scoring of performance.

I only evaluate three data processing solutions in this article. Think I missed a good one? Let me know! We all have different backgrounds in data analysis tools, and I would love for this conversation to continue in the comments section.

1. Google Analytics Data vs. BigQuery Export for Google Analytics

Google’s Page on Google Analytics BigQuery Export

There are two types of Google Analytics data. The first is the standard type that is available through the interface and the API, which we’ll refer to as “summary data”. The second, available only to Premium users through the BigQuery Export for Google Analytics feature, is hit-level/session-level data. We refer to this as “granular data”.

“You can export your session and hit data from a Google Analytics Premium account into Google BigQuery, so you can run queries using a SQL-like language.”

You can access all of your Analytics Data. Great, but what does that actually mean? Well, if you’re a frequent reader of the LunaMetrics blog, you probably already know the answer!

From colleague Jonathan Weber’s post on BigQuery:

“Because this is session-level data, we can get at the kind of effects you can usually only get at with Advanced Segments in GA (including the new user segments). For example, finding all the visits by users who have viewed Product A. And using the power of BigQuery’s processing engine — voila, no sampling, no matter how big the data set you start with.”

If you’re not a frequent user of Google Analytics or don’t have much experience with the Google Analytics API, you still might be wondering why exactly the session-level and hit-level data export is a big deal.

Comparing the Two

A good way to understand standard Google Analytics data is to think of an Excel Pivot table. A pivot table is a data processing tool that provides a summary of the individual data points, sorted by a set of dimensions.

Say we have 100 sessions on our website, and we want to look at pageviews based on the browser and state (region) of those sessions. The limitation of standard Google Analytics data (through the interface and standard data export) is that we can only view the completed pivot table. That is, we can view the summary (average or sum) numbers for metrics like pageviews per session, session duration, etc., but we cannot view the actual data that was used to create the table.

The standard reports from Google Analytics look like completed Pivot Tables

Through the BigQuery data export, we’re actually able to pull back the curtain and take a look at each of the rows of data that were summarized by the pivot table above. We get each line of data, and we can then crunch it, pivot it, and process it any way that we desire.
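To make the distinction concrete, here is a toy sketch (illustrative only, not BigQuery code, with made-up session rows) of granular data being rolled up into a pivot-table-style summary:

```javascript
// Granular data: one row per session, like the BigQuery export.
var sessions = [
  { browser: 'Chrome',  region: 'Texas',    pageviews: 4 },
  { browser: 'Chrome',  region: 'Texas',    pageviews: 2 },
  { browser: 'Firefox', region: 'Oklahoma', pageviews: 3 }
];

// Summary data: the standard interface only shows you this rollup.
var summary = {};
sessions.forEach(function (s) {
  var key = s.browser + ' / ' + s.region;
  summary[key] = (summary[key] || 0) + s.pageviews;
});

console.log(summary);
// { 'Chrome / Texas': 6, 'Firefox / Oklahoma': 3 }
```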


When is Standard Google Analytics Data Sufficient?

Standard Google Analytics data is sufficient for most basic analyses:

  • Did bounce rate increase for Internet Explorer users since the homepage upgrade?
  • Are users from Oklahoma converting at a lower rate than users from Texas? Should ad spend be shifted accordingly?
  • Are women or men consuming more Pageviews (and ad content)? Which age group is increasing fastest among users who made purchases?

Standard Google Analytics data is not sufficient under two conditions:

a. Sampling is triggered by the number of sessions or bucketing is caused by too many rows

  • Sampling occurs when the number of sessions in the queried timeframe exceeds 250K/500K at the property, not view, level
  • Sampling occurs when the number of sessions in the flow visualization report exceeds 100K
  • Bucketing occurs in the GA interface when there are more than 75,000 rows in a standard report (for a given day)
  • Bucketing occurs for data export when there are more than 10,000 rows over the requested timeframe

You can read Google’s documentation for more information on Sampling and Bucketed data, which appears in the reports as (other).
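As a rough rule of thumb, the conditions above can be encoded in a small check. The function and field names here are ours, and the applicable session limit (250K or 500K) is passed in rather than assumed:

```javascript
// Rough check for whether a GA query is likely to be sampled or
// bucketed, encoding the thresholds described above.
function checkLimits({ sessionsInTimeframe, rowsPerDay, exportRows, sessionLimit }) {
  return {
    sampled: sessionsInTimeframe > sessionLimit,  // property-level limit
    bucketedInterface: rowsPerDay > 75000,        // shows up as (other)
    bucketedExport: exportRows > 10000,           // over requested timeframe
  };
}

console.log(checkLimits({
  sessionsInTimeframe: 300000,
  rowsPerDay: 80000,
  exportRows: 5000,
  sessionLimit: 250000,
}));
```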

b. More-complex analysis is required

  • Is there a significant difference in revenue per session between the sessions (visits) from campaign A and campaign B? (Requires standard deviation of revenue per session in each campaign.)
  • What is the likelihood that a user (visitor) purchases products A and B together (market basket analysis)? Is this association statistically significant?
  • What is the likelihood that a visitor will make another purchase after purchasing product A? Which product has the highest value for this likelihood? (This determines the best product on which to offer a promotional discount to new customers.)
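With session-level rows in hand, the first question above reduces to a simple computation. A minimal sketch using hypothetical revenue-per-session values:

```javascript
// Mean and standard deviation of revenue per session, per campaign,
// computed from session-level rows (hypothetical data).
function stats(values) {
  const n = values.length;
  const mean = values.reduce((a, b) => a + b, 0) / n;
  const variance = values.reduce((a, b) => a + (b - mean) ** 2, 0) / n;
  return { mean, stdDev: Math.sqrt(variance) };
}

const revenueByCampaign = {
  A: [0, 25, 0, 40, 10],
  B: [5, 5, 10, 0, 5],
};

for (const [campaign, revenues] of Object.entries(revenueByCampaign)) {
  const { mean, stdDev } = stats(revenues);
  console.log(campaign, mean.toFixed(2), stdDev.toFixed(2));
}
```

With means and standard deviations per campaign, a significance test (e.g. a t-test) is only a few more lines.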

Now we can see in more detail why Google Analytics BigQuery Export data has capabilities beyond the standard summary data available in the Google Analytics interface and through the standard data export functionality.

In my last post on Google Analytics Data Mining with BigQuery and R I provide further explanation of the types of analyses possible with hit-level and session-level data. (Bonus: if you’re using these tools already, there’s an R script there that will create a pretty cool Ecommerce report from your Google Analytics BigQuery Export data!)

2. Processing Solutions for Google Analytics Data

Now for the second half of this article: which data sources and data processing interfaces are right for my Google Analytics needs?

We have already covered Google Analytics and the Google Analytics Export for BigQuery as the data sources.

A few of the data processing interfaces that we are familiar with here at LunaMetrics are:


ShufflePoint is a data processing and report automation tool that works especially well for bringing data directly into Excel. It also has capabilities for PowerPoint and custom web dashboards. It currently aggregates data from twelve data sources, including Google Analytics, Google AdWords, Google BigQuery, YouTube.com, and Salesforce.com, and this list is always growing.

We especially love ShufflePoint because the customer service is incredible. The analytics platforms it supports are always changing, and ShufflePoint is extremely quick and proactive in keeping up with those changes. They will even add support for platforms you use that are not currently covered.


Tableau provides not only data processing and aggregation, but also extensive data visualizations. Further, the learning curve for Tableau is less steep than that of other solutions. Along with the power of the visualizations, this ease of use is Tableau’s biggest strength. It feels like working with Excel Pivot Charts on steroids.

The professional version of Tableau connects to some analytics platforms, including Google Analytics, Google BigQuery, and Salesforce.com. The full list is available here: http://www.tableausoftware.com/products/techspecs. Tableau is more focused on ERP systems and Big Data sources. It does not list many of the digital analytics platforms supported by ShufflePoint.

R (using RStudio)

R is a free programming language and environment designed for statistical computing and graphics. Thanks to the work of independent developers, there are R packages to access both Google Analytics and the Google Analytics BigQuery Export. R has the steepest learning curve of the three data processing and reporting solutions, but it is also the most powerful.

Although R is free, you definitely want to use RStudio for any reporting. RStudio follows a freemium model, with pricing plans for enterprise solutions. The same company also develops Shiny, a web reporting platform that lets you generate HTML and CSS reports using only the R programming language.

Assessment of Data Processing Solutions

The table below is my evaluation of these three solutions (ShufflePoint, Tableau, and R) based on objective features and my subjective scoring of performance.

Download the PDF

If you have Google Analytics Premium already or are considering getting it and using BigQuery, you should read through Jonathan’s post on BigQuery (as I mentioned above). He provides more detailed explanations on BigQuery, namely:

  • Who is eligible for Google Analytics Export for BigQuery
  • How you specifically access the data
  • How you set up BigQuery
  • Cost of running BigQuery

And for even more specifics on the use cases of BigQuery, refer to one of the first posts on BigQuery, by my colleague Dorcas Alexander.

Author: "Noah Haibach" Tags: "Analytics, Google Analytics, bigquery, d..."
Date: Wednesday, 23 Jul 2014 13:45


A holistic industry transformation was the tone at MozCon this year and Erica McGillivray and team did a fantastic job getting speakers that supported this theme. Those chosen for the conference are experts in their fields, pushing conventional wisdom and challenging us with new ways to tackle old problems. Each spoke on different topics, but to the same point.

MozCon started with a presentation from our fearless SEO leader, the Wizard of Moz himself, Rand Fishkin. Rand started off the conference by reflecting on the past year in search and framing his vision for the future. He highlighted 5 big trends from the past year.

Rand’s Takeaways

1. We may be on the verge of regulation. Several things point to this being very close to happening. First, the EU is implementing the “Right To Be Forgotten,” which states that someone who wants information about them taken out of the index must apply to Google, which will then weigh whether it is in the public interest for that information to remain.

Secondly, the FTC has already released disclosure guidelines for digital advertising. The U.S. came very close to legislation in May, which may have slipped by most marketers. One reason cited for why this did not happen is that Google has become the second largest lobbying spender in the U.S. “While Google has the clout, money, & lobbyists to influence the government, the search marketing field is not nearly so well armed or organized.”

2. “Inbound Marketing” as a term is losing to “Content Marketing.” Most in the industry know the difference between the two – inbound marketing relies on earning attention rather than interrupting, while content marketing is about producing content to earn customers – but those outside our industry use them interchangeably. Last year Rand predicted that Inbound Marketing would win this race. Not so much.

3. Google’s penalties have taken a toll on spam, but they have hurt many businesses too. Google used to tell us to let them worry about spam while we focused on the customer experience. Now we are required to stay vigilant for spam that points back to our sites, even if we weren’t the ones who created it.

4. We are nearing the end of SEO as a job title. This point was supported by LinkedIn job post data where only 17K jobs contained SEO in the job title compared to 512K job postings that listed SEO as part of the job description.

5. Google is shortening the searcher’s journey. On the surface this appears to hurt publishers, but in reality it may be more complicated. Google needs to create and feed search addiction; instantaneous answers allow for more searches per searcher per day. While instant answers mean fewer clicks per query, more people searching could mean a bigger pie for the industry.

To combat this trend, Rand sees only two logical strategies:

  1. Diversify your traffic channels.
  2. Become more important to Google’s searchers than Google is to your traffic.

Beyond SEO

Of the 5 trends Rand outlined, I found the fourth to be the most compelling.

We are nearing the end of SEO as a job title.

We are to become more analytical, more strategic, and more forward thinking than we have ever been. We will need to be our clients’ doctor during algorithm updates, their adviser when they’re developing a marketing strategy, and a philosopher when looking to the future.

For years we have been able to hide behind the curtain and waft smoke at clients as their sites ranked for top keywords. It was magic. But as Google continues to restrict the ways in which sites are able to market themselves, and as the knowledge graph grows beyond common queries, we as marketers will need to adapt by diversifying our skills beyond SEO.

We will need to use data, along with our in-depth understanding of how search engines work, to tell a story about our clients’ customers and use it to drive real value to our clients’ bottom lines. As Wil Reynolds put it, we need to focus on “Real Customer Shit”.

After the Click – Output vs Outcome

When was the last time we cared about what happens after someone clicks on our listing? What if we cared more about the number of conversions we get from clicks and viewed rankings as a secondary metric? Wil Reynolds’ point was this: what got us from A to B will not get us from B to C.

If we could grow a client’s business without 10 blog posts this month, is that a fail? No! So why do we continue to focus on output when we should be focused on outcomes? It’s our job to challenge the way clients think about SEO so that we can achieve real results in search.

Broken Brand Promises

Wil’s “after the click” pep-talk was supported by Kerry Bodine two days earlier when she explained the customer journey and how it affects consumer experience both on and offline. Kerry threw out some great statistics to support good customer experiences.

  • 81% of customers are willing to pay more for a better customer experience.
  • 70% have stopped buying goods or services from a company after experiencing poor customer service.
  • 64% have made future purchases from a company’s competitors after poor customer service.

Kerry’s presentation was titled “Broken Brand Promises” and it was more than just making sure customers are getting to your shopping cart happily. It was about coming to terms with who your brand REALLY is, not how you want to be perceived. Then once you’ve figured out who you are, you need to set expectations with your client by aligning your marketing goals with the areas in your customers’ journey that make them happy.

Her best example was the kitten-for-your-birthday example. If your parents tell you that they’re getting you a kitten for your birthday (ignoring that your parents are terrible at surprises), you will be expecting a live kitten. If instead they get you a stuffed kitten, they have unnecessarily set you up for disappointment.


As a LunaMetrician, Wil’s and Kerry’s presentations were music to my ears. We have an industry-leading analytics department that helps us spot after-the-click problems. But we shouldn’t stop at identification. We need to take the data and do something with it.

Tell the Story

Speaker Marshall Simmonds spoke passionately about the industry’s need to take our data back from Google and use it to create a story clients can understand. Use everything at your disposal to paint the most reliable picture possible. Go beyond Google Analytics and Webmaster Tools and start analyzing log files.

Then take all that information and write the story. Use that story to drive conversations about conversion optimization and the customer journey.

Finally, take action. That action could be setting up an A/B test (see Kyle Rush’s presentation PDF), testing a new content idea (see Dr. Pete’s presentation PDF), or improving engagement (see Richard Millington’s presentation PDF). You can see all speakers and presentations on the MozCon website.

Whatever it is, take the first step, analyze, and adjust. Use the insights from success and failure to develop your story and affect further change.

Author: "Sean McQuaide" Tags: "Industry News, Search Engine Optimizatio..."
Date: Tuesday, 22 Jul 2014 12:52


A great new feature, Tag Firing Priority, was rolled out inside of Google Tag Manager around July 1, along with the updated and redesigned debug mode. It is a seemingly small feature, located under the ‘Advanced Settings’ in the Tag (see below).


It’s an exciting update not only because of the application of setting priority, but also because it proves the direction Tag Manager has been heading – toward giving marketers and analysts more comprehensive control over the Tags they load on their site. Without any extra coding on the site, users can now control the firing priority of their Tags within Google Tag Manager’s interface.

Priority affects Tags that have the same firing Rule and is especially relevant for sites that have many Tags and third-party scripts like DoubleClick, Bounce Exchange, and search conversions that fire when the page loads. Tags marked with a higher priority are fired first, followed by lower priority Tags.

The elements on a page are not loaded simultaneously (painfully obvious whenever you experience a slow internet connection). A browser typically ‘reads’ the code of a page the way we do – from top to bottom. That’s why style and script references go in the header, to be loaded first. The Google Tag Manager container also goes toward the top of the page, directly after the opening body tag.

Tags loaded from Google Tag Manager are loaded asynchronously. This is a good thing, because a slow-loading Tag will not block other Tags from firing. It also means that if Google Analytics or Tag Manager has trouble loading for some reason, it won’t prevent the rest of your site from loading. This is true for setting priority as well – changing the priority of a Tag will not prevent other Tags from firing, even if one of a higher priority does not fire at all.

Testing Tag Firing Priority

So, why is this necessary? How drastically can prioritizing Tags really affect firing order and how can we find out? An experiment!

For this test, we created ten separate Tags with identical firing Rules and edited the priority to compare firing times.

Before setting it up, we thought about how we’d need to look at the data in Google Analytics. To see how long it took from the first Tag to the last, we needed to include the time each was fired. We also needed to be able to group hits together by visitor, so we took advantage of the GA cookie ID.

One way to capture this unique cookie ID is to create a custom JavaScript Macro. I tweaked this code to handle the new Universal Analytics GA cookie to suit our purposes.
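A sketch of the idea, assuming the standard _ga cookie format (GA1.2.&lt;random&gt;.&lt;timestamp&gt;); the function is adapted here to take the cookie string as a parameter so it can run outside the browser, whereas in a GTM Macro it would read document.cookie:

```javascript
// Custom JavaScript Macro sketch: extract the Universal Analytics
// client ID from the _ga cookie (e.g. "GA1.2.1234567890.1405300000").
// Parsing of the _ga format is our assumption, for illustration.
function getGaClientId(cookieString) {
  const match = cookieString.match(/(?:^|;\s*)_ga=([^;]+)/);
  if (!match) return undefined;
  // Drop the "GA1.2." version/domain prefix; keep the client ID.
  return match[1].split('.').slice(2).join('.');
}

console.log(getGaClientId('_ga=GA1.2.1234567890.1405300000'));
// → "1234567890.1405300000"
```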


For recording the firing time, a useful function is right in the example for the custom JavaScript Macro type. For our purposes, I didn’t format the value and instead kept it in milliseconds. The Universal Analytics cookie also has a timestamp, but that is actually the time the cookie was created and will not return the current time.
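The timestamp Macro amounts to a one-liner; a sketch of the approach (unformatted milliseconds, as described):

```javascript
// Custom JavaScript Macro sketch: return the current time in
// milliseconds since the epoch, left unformatted.
function getTimestamp() {
  return new Date().getTime();
}

console.log(getTimestamp());
```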


Next, I created 10 separate Tags in Google Tag Manager, named “Event – Priority Test Tag – [01-10].” These are Universal Analytics event Tag types and all will have the firing Rule “All Pages” so that they will fire as soon as the container loads. Note that this firing Rule will not wait for all the elements of the page to load, so there may be page elements, other Tags, and other on-page scripts loading while our Tags are firing.

As with any event that automatically fires on page load, I made sure to set Non-Interaction Hit to “True” so our bounce rate wasn’t affected. We also created a Rule to block even-numbered cookies so that the events will fire about half the time.
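The blocking Rule’s logic can be sketched like this (our reconstruction of the approach; in GTM this would be a Custom JavaScript Macro whose value feeds a blocking Rule):

```javascript
// Sample roughly half of users by the parity of the GA client ID's
// random component (client ID format like "1234567890.1405300000").
function isEvenCookie(clientId) {
  const randomPart = parseInt(clientId.split('.')[0], 10);
  return randomPart % 2 === 0;
}

console.log(isEvenCookie('1234567890.1405300000')); // → true
```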

For the event parameters, the Tag number [1-10] was used as the action and the Macros we created were used as the label.


For the Tag Firing Priority, we set up the Tags below with 1 and 2 having the highest priority, 9 and 10 the lowest, and 3-8 sharing a priority of 0.


The Results

So what did our grand experiment tell us?

As expected, the time between the Tags firing was very small. The mean difference between the first and the last was about 50 milliseconds.

The priority feature worked as expected – the first two Tags were always the first to fire and the last two were almost always the last. Here’s an example of what the data looks like in Google Analytics.


Priority actually mattered… a little. While each site is different, we did have a very small percentage of users where we saw a drop-off happening. They were only served the first Tags and presumably left the page before the lower-priority Tags could fire.

The surprising outlier was over 10 seconds between the first Tag and the last. Outliers aside, the high end of the time difference was closer to 1 or 2 seconds. So while this may seem small, keep in mind this graph from KISSmetrics about how page load time affects bounce rate, and the difference a single second can make.

How can I use this in my implementation?

When working on a Google Tag Manager implementation, we recommend the following best practice: when there are multiple Tags firing, give Analytics Tags higher numbers in firing priority (over remarketing, for example). We’re not just saying this because we are Analytics consultants and trainers!

While the chance is very small, it is possible that some transaction or conversion data might be missed if a user leaves the page before the Tag is fired. So why not play it safe and make sure Tags related to Analytics and Ecommerce data are given the highest priority? You want to catch every hit so you know you can trust your data when you’re reporting on the KPIs of your site.

Author: "Samantha Barnes" Tags: "Google Tag Manager"
Date: Thursday, 17 Jul 2014 12:45


Though often overlooked, internal site search’s importance shouldn’t be underestimated. Recently, as I was exploring our company’s website, I noticed that our internal search results weren’t as helpful as I had anticipated.

I conducted a search on our site for “google analytics”, a term very significant to us at LunaMetrics. I was shocked to see that all the top listed results were blog posts.

While blogging is important to us, it’s also important for our visitors to know that we offer trainings around the country and Google Analytics services to clients. All the relevant content we had created through our blog was coming back and actually overpowering our other results, hardly an ideal situation.

We, as marketers, do a lot to get people to our site. From search engine marketing to analysis of internal analytics, we make it a top priority to ensure our website is extremely visible across all channels of the internet. Why then does it seem that we tend to slack when it comes to internal search results of our own site?

Not being able to quickly see our Google Analytics trainings after my query was a definite problem. If you’re in a similar position, here’s how I set out to address it.

Let’s Break Optimization into Two Steps:

  1. Checking to see if you have a problem
  2. Addressing your problem areas

Checking To See If YOU Have A Problem

The first step in checking to see if you have an internal search optimization problem is to create a list of terms you would like to test. The easiest way to do this is to export the list of site search terms from Google Analytics.

Once you have a list of site search terms take the most popular (we chose to do the top 25) and determine what your top results are. You can give this project to an intern (bad idea) or do it programmatically like we did!

After reading Michael’s recent blog post and talking over the problem with LunaMetrics’ Jon Meck, we came up with a solution. Using SEO Tools for Excel, we created a site search scraper. This Excel document uses the SEO Tools XPathOnURL function to crawl pages for specific content.

We used it to diagnose issues with our internal site search by crawling our internal search results page for the first result that appeared and storing it in the Excel file.

Internal site search varies depending on your website, so you will have to look through your own results in order to figure out where to pull your search results from. For us, we found that the first search result was always within an H3 tag. This is what we wanted to concentrate on. Once the file loads and the site scrape is complete, you will have a list of search results.
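The extraction step can be sketched outside Excel as well; a minimal, regex-based illustration of pulling the first H3’s text out of a results page’s HTML (a real crawler should use a proper HTML parser):

```javascript
// Grab the text of the first <h3> from a page's HTML.
// Regex-based, for illustration only.
function firstH3(html) {
  const match = html.match(/<h3[^>]*>([\s\S]*?)<\/h3>/i);
  return match ? match[1].replace(/<[^>]+>/g, '').trim() : null;
}

const page = '<div><h3><a href="/blog/ga-post">A Blog Post</a></h3></div>';
console.log(firstH3(page)); // → "A Blog Post"
```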

The next step is the important one. From the list of search results you need to identify problem areas, or areas where you think the top search result should actually be something else. As I mentioned before, our first problem area was “google analytics” which wasn’t returning results for either Services or Trainings. Once you have identified the problem areas, you are going to address them.

Addressing Your Problem Areas

There are many ways to address content problems with internal search. Depending on what you use for site search, you could dynamically change your search results so they better encompass the problem terms, or you could look for a new site search that would allow you to do so.

If you don’t have that option, you can try the method I tested, which involves using Google Tag Manager to fire suggestions every time one of these “problem terms” is queried, containing the information that you deem more relevant to your company.

I favor the GTM route because it still works even as you add more content to your site. If you are always adding new content, you never know when something you just added will become more relevant than the result you had previously optimized.

Here’s the basic gist of how we’ll address the situation. A person arrives on the search results page, we’ll check to see if they searched for a keyword that we’re optimizing for, and if so, we’ll use GTM to insert a suggestion.

Before I discuss the GTM implementation, you should look at your problem terms and group anything similar together, as these keywords will become the terms that trigger your Tag Manager suggestion to fire.

Once grouped, you need to generate a new message for each group that better promotes the more relevant content. For us, we included a brief description and HTML links to more relevant content. Here is what it looks like inside of the suggestion wrapper:

Site Search Suggestion from GTM

We will be using a Lookup Table Macro inside of GTM, so we’re also limited to just 255 characters for this new message.

How to Implement

1. Create a Macro to pull the site search term from the URL. If your site search term doesn’t show up in the URL with query parameters, you may need to find a creative solution here. For us, it was as simple as using a URL macro and entering the site search parameter, “s”.
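In plain JavaScript, the URL Macro’s job looks roughly like this (illustration only; GTM’s built-in URL macro handles it for you):

```javascript
// Pull a named query parameter (e.g. the site search term "s")
// out of a URL string.
function getQueryParam(url, name) {
  const match = url.match(new RegExp('[?&]' + name + '=([^&#]*)'));
  return match ? decodeURIComponent(match[1].replace(/\+/g, ' ')) : null;
}

console.log(getQueryParam('http://www.example.com/?s=google+analytics', 's'));
// → "google analytics"
```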


2. Create a Lookup Table Macro to match the search query term to the suggestion text. Check out this link to automate the Lookup Table Macro.
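Conceptually, the Lookup Table Macro is just a key-to-value map; a sketch with hypothetical terms and messages:

```javascript
// Search term in, suggestion message out (terms, messages, and URLs
// here are hypothetical). Unmatched terms return undefined, which
// the firing Rule uses to skip the Tag.
const suggestions = {
  'google analytics': 'See our <a href="/services/">GA services</a> and <a href="/training/">trainings</a>.',
  'training': 'Browse upcoming <a href="/training/">trainings</a> near you.',
};

function lookup(term) {
  return suggestions[term.toLowerCase()];
}

console.log(lookup('Google Analytics'));
```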


3. Create a Rule for when the page is equal to the Search Results page and where the Lookup Table has returned a value. This way it will only fire when someone searches for a term that we listed in the Lookup Table.


4. Create a Custom HTML Tag for the shell that goes around the search term suggestion. I modified the Expand Message example from this blog post about inserting ads on your site through Tag Manager.

5. Insert the Lookup Table Macro into the Custom HTML Tag to populate the message inside of the suggestion box.
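A stripped-down sketch of steps 4 and 5 together, building the suggestion markup around the Lookup Table Macro’s message (the markup and class names here are ours):

```javascript
// Wrap the Lookup Table Macro's message in the suggestion shell.
function buildSuggestionBox(message) {
  return '<div class="search-suggestion">' + message + '</div>';
}

// In the real Custom HTML Tag this string would be inserted above
// the results, e.g. via insertAdjacentHTML on the results container.
console.log(buildSuggestionBox('Looking for our <a href="/services/">Google Analytics services</a>?'));
```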


6. Set the firing rule to be the Search Results rule we just created.

7. Lastly, you’ll want to add some sort of tracking to this, to see if your optimization was worthwhile. For us, we used a class on the links and a Link Click Listener to fire a Google Analytics Event every time someone clicked on our suggested links.

Make sure you test and debug as much as needed to get this box to look right, then Publish to your site!

After successful implementation of this Google Tag Manager Suggestion Box, here is what it looks like now when I search for Google Analytics.


Good luck with your own optimization and let us know what you are doing to stay on top of your internal site search in the comments below!

Author: "Jonathon Stephens" Tags: "Google Tag Manager"
Date: Wednesday, 16 Jul 2014 16:03

Have you ever tried to use the “plot rows” feature in Google Analytics and it literally falls flat?

It happens because you can’t keep the chart from graphing the metric total. That thick blue line across the top of your chart flattens everything else. It keeps the size of the chart static, rendering it useless.
Drink me potion and key on table next to miniature door

Wouldn’t it be great if you could graph only the rows you want and the chart would dynamically resize?

Here’s the key to turning those flat, plotted rows into dynamic data visualizations: motion charts.

I used to think that motion charts were all flash and no substance, and then I found out they were more than a bunch of colorful, moving bubbles. Motion charts deliver insight “in motion”, and they plot rows better than “plot rows” can.

Resizing data is as easy as Alice’s “Drink Me” potion. Read on to find out how it works.

The Problem: Flat Data

In case you’ve never seen this before, let’s look at an example. Suppose you want a visual comparison of transactions for the first two weeks of July, when you targeted users from four states. You go to the Audience > Geo > Location report and drill into the row for United States.

Here’s what you get if you tick the check boxes next to each state and then click “plot rows”: four very flat lines.

Four flat lines of data in a chart where the total metric line controls the size

The blue line (total transactions) controls the size of the chart and cannot be removed.

The Solution: Plot Rows with Motion Charts

With a few simple steps, you can change the chart to show exactly what you want.

Step 1: Decide how many rows to show.

How to show more rows in Google Analytics

Change “show rows” so you can see everything you might want to compare. If the states you want are already visible in the top 10 rows, you can skip this step.

But if you need states past the top 10, or you need to compare other states, too, go ahead and show more rows now.

Step 2: Click the motion charts icon.

The motion charts icon has three bubbles and is just above the chart area, on the far right.

Step 3: Select the line graph tab.

Change from bubbles to lines by choosing the tab with the line graph in the top right corner of the chart.

Plot rows with motion charts in Google Analytics

Step 4: Change opacity settings (wrench) to 0%.

Click the wrench just below the chart, on the far right, and drag the slider all the way to 0% opacity. When you start selecting states, all the other states will appear to vanish, leaving you with just the data you want.

Step 5: Select your metric on the left side of chart (sideways).

In this example, you want to change the chart metric to Transactions. The chart metric is the one that appears sideways on the left side of the chart.

Step 6: Click states to compare.

Ready for the good stuff? Start clicking the states you want to compare. What a difference! The trend lines, and several daily spikes, are now clear.

Step 7: Curiouser and curiouser!

What more will you discover, now that the basic setup is done? Try these:

  • Hover for details or to focus on one trend line
  • Switch back and forth quickly by checking and unchecking rows
  • Motion charts are not limited to 6 lines (plotted rows are)
  • Magnify lower trends by switching from linear to log scale!

Campaigns, keywords, landing pages, products, articles… what would you like to visualize? Have you used motion charts like this before? Do you have tips to share? Let me know in the comments.


Author: "Dorcas Alexander" Tags: "Analytics, Google Analytics"
Date: Monday, 14 Jul 2014 16:53


You know what’s been grinding my gears lately? No matter how long I’ve been in the search field, or what happens out there in the industry, some myths continue to persist. Wishful thinking? Lack of education? I say both.

Let’s clear up some common misconceptions with the help of some industry experts from Google+. If you’re an average web user, Google+ probably doesn’t have a place in your life. However, I’ve found it to be a thriving locale for search industry discussion! Add one of these experts to your circles and join the conversation today.

Myth 1: Social Media EQUALS Higher Rankings.

In reality, social media LEADS TO higher rankings. There is a big difference between correlation and causation – a difference that study after study demonstrates and that Google confirms.

I urge people to think about the process instead:

More shares -> more eyeballs -> more link opportunities -> more links and digital authority -> higher rankings.

But racking up tweets in the name of SEO is not going to take you far until search engines find a consistent way to monitor social signals.  Eric Enge of Stone Temple Consulting has some thoughts:

Myth 2: SEO is JUST Links and Words

This myth is a product of SEO evolution because it was not long ago that this industry was built on links and words. Authority from links and keyword research are still essential components, but Google looks to established brands more than ever, and that trend promises to continue.

Want to see what I mean? Search for “masters of public health.” The top 10 pages do not have the most links and probably performed zero keyword research. But do you recognize any names? Harvard, Johns Hopkins, NYU. Investing in your brand will help you more today and tomorrow than building cheap links.

Building your business has never been more in fashion. Rand Fishkin of Moz weighs in on this idea in the video below. SEO is no longer a few simple strategies.

Myth 3: Technical SEO is Dead

Google and Bing have invested heavily in their Webmaster Tools services to make technical SEO easier for every site owner, but that does not give you permission to forget about redirects, canonical tags and indexation. One-third of my sales calls are with companies after a redesign goes wrong and search traffic flatlines.

It is still important for every webmaster to understand the basics of technical SEO, if only as an insurance policy. Dan Shure of Evolving SEO shared one of our pieces on the great SEO uses of Google Webmaster Tools. Make sure to follow him!

Myth 4: Yeah I Did All of That SEO Once, I’m Done Now

In reality, you’re never finished. This is a world of curve-balls and shifting best practices, because nothing stays still. Take Google Authorship photos, for example: one moment the pros were telling clients to add them for authority and higher organic click-through rates; the next, the photos were gone and strategies were shifting.

See Luna’s own Andrew Garberson’s take on Google removing Author photos from SERPs:

Myth 5: Paying for AdWords GUARANTEES Higher Rankings

This is an oldie but a goodie. There certainly ARE benefits to using Paid Search for data and coverage during your organic campaign (the Matched Search Query report and the Paid/Organic report leap to mind), but this myth definitely doesn’t hold water.

Here Barry Schwartz shares a video of Matt Cutts (Google’s Webspam Lead) covering his favorite myths.

Myth 6: Your Measurement Strategy Will Never Change

We are creatures of routine, and that can be to our detriment in the search game. Monitor, analyze, repeat. But then issues like (not provided) happen. Lindsey Wassle shared a great new feature in Moz to help with that. Your measurement strategy should reflect the best data you have access to, and it should be fluid.


Add me on Google+ and let me know what myths and misconceptions you debunk for businesses.

Author: "Michael Bartholow" Tags: "Search Engine Optimization"
Date: Thursday, 10 Jul 2014 14:40

A fellow LunaMetrician recently returned from SMX Advanced and said it was refreshing to hear how much user experience (UX) and conversion rate optimization (CRO) were included in the SEO conversation this year.

The days of simply ranking for a high-volume keyword or getting visitors to the site have been eclipsed by metrics that more closely resemble offline business objectives. Now SEOs think in terms of sales leads and keep a close eye on landing page bounce rates, conversion rates and direct impact to the bottom line.

But before diving into the world of A/B and multivariate testing, it’s crucial to know where you stand. This 7-minute UX audit for landing pages should be the first step.


Which pages should I test?

Landing pages are (in a simple sense) places on your website dedicated to welcoming a visitor and efficiently completing a goal, or converting. If your website is designed to capture sales leads, these might be service pages, case studies or other informational pages that allow people to contact you to continue the sales process.

The UX Audit

This is the UX audit. Open your landing page(s) in another tab or window and answer the following seven questions. Remember: objectivity is key.

1. Digestible Text

Too much text and someone won’t even start the first line. Too little and they won’t receive enough information to complete the goal. Let’s see where your landing pages fall on that spectrum.

How long does it take to read the text aloud?
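If reading aloud isn’t practical, you can approximate the answer from word count. A minimal sketch, assuming a spoken pace of roughly 150 words per minute (an assumed average, not a figure from this audit):

```python
# Estimate how long landing page copy takes to read aloud.
# The 150 words-per-minute pace is an assumed average speaking speed.

def read_aloud_seconds(text: str, words_per_minute: int = 150) -> float:
    """Return an estimated read-aloud time for `text`, in seconds."""
    word_count = len(text.split())
    return word_count * 60 / words_per_minute

# Hypothetical landing page copy, repeated to simulate a longer page.
copy = "Our service helps teams collect, audit and act on analytics data. " * 10
print(round(read_aloud_seconds(copy), 1))  # 44.0
```

Anything past a minute or two of read-aloud time is a hint that the page may be asking too much of a first-time visitor.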

2. Strong Calls to Action

Call-to-action statements are short phrases near the conversion button or link that tell visitors what to do. “Learn more today!” might sound cheesy, until you see how many more people do click to learn more.

Which phrase most closely resembles your call-to-action button?

3. Above-the-Fold

Harvard recently redesigned its online courses page after UX testing revealed that only a small fraction of visitors clicked on links below the fold. And that’s Harvard, where students would kill for an opportunity to take an Ivy League course. Chances are your clients are not beating down the door in quite the same way.

Without scrolling, are you able to: (1) determine what is the product or service, (2) understand why it is valuable and (3) see how to begin the conversion process?

4. Easy on the Eyes

Landing page graphics provide a welcoming first impression and help visitors decide whether or not to read the accompanying text. The right image can make all of the difference, whereas the wrong one, well, you get the idea.

Which statement most closely describes your landing page image?

5. Accessible

Not every industry or company or product needs to position itself for mobile traffic, but conversion rates from mobile devices are rising in many sectors, so it is at least worth asking the question: Are my landing pages mobile-friendly?

With smartphone in hand, navigate to your landing page. Do you have to pinch or pull or slide to read the copy and complete the goal?

6. Discoverable

The site: search operator is a quick indicator of whether search engines can crawl and index pages on your website. Copy your URL and paste it into Google with “site:” immediately before it. For example: site:http://www.domain.com/folder/page

Does the page appear in Google’s results?

* Note: PPC-only landing pages are sometimes designed not to appear in the search results.
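A related spot-check is whether your robots.txt even allows the page to be crawled; a blocked page can’t rank. A minimal sketch using Python’s standard-library robots.txt parser, with hypothetical rules and paths rather than any real site’s:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, parsed locally (no network request).
rules = [
    "User-agent: *",
    "Disallow: /private/",
]
rp = RobotFileParser()
rp.parse(rules)

# A disallowed path will never be crawled, so it cannot appear in results.
print(rp.can_fetch("*", "http://www.domain.com/folder/page"))   # True
print(rp.can_fetch("*", "http://www.domain.com/private/page"))  # False
```

Keep in mind that a crawlable page can still be kept out of the index by a noindex tag or a canonical pointing elsewhere, so this check complements the site: operator rather than replacing it.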

7. Page Speed

Page load speed is critical for conversions, and not just for e-commerce sites like Amazon. Study after study finds that improving page speed positively impacts conversion rate.

Test landing page speed here. What are the results?
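If you want a quick number from the command line first, curl can report rough request timings. This measures only the server response, not the full browser render, and the URL below is a placeholder:

```shell
# Rough single-request timing with curl; browser rendering time is not included.
# Replace the placeholder URL with your landing page.
curl -o /dev/null -s -w 'DNS: %{time_namelookup}s  TTFB: %{time_starttransfer}s  Total: %{time_total}s\n' \
  https://www.example.com/
```

A slow time-to-first-byte (TTFB) points at the server or backend, while a fast TTFB with a slow page in the browser points at heavy assets or scripts.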

Your Total:

Scoring Your Results

12 points is a perfect landing page score: >9 is good, 6-9 is weak, and <6 means, well, it’s time to dedicate some attention to landing pages in need.
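Those bands can be sketched as a small function; the band labels are the ones used above, and the function itself is just an illustration:

```python
def landing_page_verdict(score: int) -> str:
    """Map a 0-12 audit score to the verdict bands described above."""
    if not 0 <= score <= 12:
        raise ValueError("score must be between 0 and 12")
    if score > 9:
        return "good"
    if score >= 6:
        return "weak"
    return "needs attention"

print(landing_page_verdict(12))  # good
print(landing_page_verdict(7))   # weak
```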

Other Landing Page Resources