
This interview features the chief marketing officer and product manager of Compete, Inc. They discuss how Compete competes with Alexa and Comscore, site metrics, and SEO practices.

  1. Question: What exactly does Compete do?

    Answer: Stephen DiMarco: We have a diverse sample of 2,000,000 U.S. Internet users that have given us permission to analyze the web pages they visit and ask them questions via surveys. Web analytics really means analyzing what consumers do across the entire web, not just what they do within a particular site, and marketers can use this information across the entire company, not just for online media planning or site design decisions.

    A great example is the work we’re doing in the truck segment for auto marketers; this segment is an all-out battle between Ford, GM, and Toyota, with nearly a billion dollars of advertising spent each year. We’re using online behavior to predict the number of people shopping for each truck in the category and to understand how campaigns sway shoppers throughout the month. This helps auto marketers gauge how cost-effective their advertising is, and guides whether they need to up dealer incentives or increase advertising to hit their monthly sales target.

  2. Question: What did your investors say when you started giving away your data for free?

    Answer: Stephen DiMarco: It was kind of like “Twelve Angry Men” meets “The 40-Year-Old Virgin”—lots of heated debates wrapped up in a great fear of the unknown. Quite rationally, our board felt we were giving away a really valuable asset, and no amount of provocative quotes from John Battelle or The Cluetrain Manifesto could assuage them.

    Seriously, we did it for three reasons:

    1. We believe that if we’re collecting information from consumers, we should make that information available back to them and be transparent about how we’re using it.

    2. We believe that making the data easy to access through Compete.com or our API will increase the pace of innovation in web measurement and online marketing.

    3. We wanted to do something remarkable and game-changing, and felt this data is too valuable to bottle up inside $50,000-per-year licenses.

  3. Question: Do your stats include Macintosh users and Firefox users?

    Answer: Stephen DiMarco: Yes, Compete’s sample includes both Macintosh and Firefox users. We’re data junkies and know the importance of having a sample that represents the internet at large. To that end, our sample is created from more than ten different data sources, ranging from our own opt-in panels to data that we license from ISP and ASP partners. We designed our software to be compatible with Macintosh and Firefox, so they’re accurately represented in our data set. And the data that comes in through our ISP and ASP partnerships also includes Linux, Macintosh, Firefox, and Safari users.

  4. Question: How are your results different from Alexa and Comscore?

    Answer: Stephen DiMarco: We like to say that more is better—and by measuring 2,000,000 U.S. consumers each month, we’re substantially bigger than Comscore. Our larger sample gives us more reporting depth, and we feel our results are more accurate because we measure one million websites compared to the 15,000 that Comscore Media Metrix measures. There is incredible value in being able to accurately measure the “torso” of the web—sites in between the head and the tail. We’re better at this than Comscore because our sample is so much larger, and we see things that their data just doesn’t pick up.

    We also have an accuracy advantage because the diversity of our data sources helps us identify and eliminate biases that show up from time to time. Our multi-source approach is a big point of differentiation—no one else in the market can do it—transforming more than ten different data streams into a common format and then performing statistical projections across 2,000,000 people on a nightly basis is no small feat.

    Alexa is a storied internet brand, but unfortunately a big part of the story is how bad its web traffic estimates are. Unlike Alexa, we go through a rigorous panel selection and normalization process that involves an independent RDD survey, demographic scaling, and extensive QA from our data operations team. Alexa, on the other hand, has a single source of data—its toolbar—so it’s very susceptible to bias. In fact, we often get emails from companies offering to increase our Alexa ranking by downloading tons of toolbars and then visiting our site!
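    Compete hasn’t published the details of its normalization process, so as a purely illustrative sketch of what “demographic scaling” can mean, here is a minimal post-stratification example in Python. Panelists in under-represented demographic cells get weights above 1 and over-represented cells get weights below 1, so that weighted panel totals match an external reference such as an RDD survey. All cell names and numbers below are invented.

    ```python
    # Hypothetical sketch of demographic scaling (post-stratification).
    # Reference shares would come from something like an RDD phone survey;
    # these numbers are made up for illustration.
    reference_share = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}

    # A raw online panel, skewed young (also made up).
    panel = ["18-34"] * 500 + ["35-54"] * 350 + ["55+"] * 150

    panel_share = {cell: panel.count(cell) / len(panel)
                   for cell in reference_share}

    # Each panelist's weight = reference share / panel share of their cell.
    weights = {cell: reference_share[cell] / panel_share[cell]
               for cell in reference_share}

    for cell, w in weights.items():
        print(f"{cell}: panel {panel_share[cell]:.0%}, "
              f"reference {reference_share[cell]:.0%}, weight {w:.2f}")
    ```

    With these inputs, the 55+ cell gets a weight of 2.0 (15% of the panel but 30% of the population), which is the basic mechanism for correcting the kind of single-source bias described above.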

  5. Question: Then should we all remove our Alexa bookmarks and replace them with Compete?

    Answer: Jay Meattle (product manager): Not yet—I keep both bookmarked. As long as you know what you’re looking at and keep the accuracy caveats in mind, Alexa can still be a tool for quick analyses and for evaluating international users. However inaccurate it might be, it’s still another data point. That said, I don’t recommend making multi-million dollar decisions based on Alexa!

  6. Question: Is SEO black magic and bull shiitake, or can one increase traffic with a few changes to headers, keywords, etc.?

    Answer: Jay Meattle: Every website owner that wants more traffic—who doesn’t?—should have an SEO strategy. A few tactical tweaks can indeed increase traffic from search engines dramatically.

    At Compete, we have experienced the power of SEO first hand. There was a 9x increase in the volume of referrals from Google to Compete.com within a month of our SEO-related changes going live, and today search engine referrals are at around 21x pre-SEO levels.

  7. Question: There’s often a 10x difference between what my server logs and Google Analytics say my traffic is. What accounts for this?

    Answer: The short answer is that when it comes to web analytics, there are many ways of measuring the same thing. There are two main approaches for compiling local web analytics data. The first method, server log analysis, reads the log files in which the web server records all its transactions. The second method, page tagging (the approach Google Analytics uses), embeds JavaScript code on each page to notify a third-party server when a page is rendered by a web browser.

    Search engine spiders, bots, etc., generally cannot execute JavaScript, and hence are not counted by Google Analytics and similar tools. Log files, on the other hand, include all traffic to your servers, including spiders and bots that show up as traffic but do not represent actual human activity. This “non-human” traffic generally accounts for the difference between the two.
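    To make the difference concrete, here is a minimal Python sketch (not any particular vendor’s method) that splits hits in a combined-format access log into “human” and bot counts by inspecting the user-agent string. The bot list and sample log lines are illustrative only; real analytics tools use far more elaborate filters.

    ```python
    import re

    # Crude bot detector: matches a few well-known crawler user-agents.
    BOT_PATTERN = re.compile(r"googlebot|slurp|msnbot|spider|crawl|bot",
                             re.IGNORECASE)

    def classify(log_lines):
        """Count human vs. bot requests in combined-format log lines."""
        human = bot = 0
        for line in log_lines:
            # The user-agent is the last quoted field in the combined format.
            quoted = re.findall(r'"([^"]*)"', line)
            user_agent = quoted[-1] if quoted else ""
            if BOT_PATTERN.search(user_agent):
                bot += 1
            else:
                human += 1
        return human, bot

    sample = [
        '1.2.3.4 - - [10/Oct/2007:13:55:36 -0400] "GET / HTTP/1.1" 200 2326 '
        '"-" "Mozilla/5.0 (Windows; U; Windows NT 5.1) Firefox/2.0"',
        '66.249.66.1 - - [10/Oct/2007:13:55:40 -0400] "GET /a HTTP/1.1" 200 '
        '512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    ]
    print(classify(sample))  # -> (1, 1): the log counted two hits, but a
                             # tag-based tool would have reported only one.
    ```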

  8. Question: Then when people ask, do I give the log answer or the Google Analytics answer?

    Answer: Jay Meattle: Depending on who is using the data, there is value in both numbers. For example, an SEO expert will probably want to know what your server logs have to say to determine how often Google is crawling your site. Your investors, on the other hand, most likely don’t care about that metric. They’re more interested in how effective you were in driving real living people to your site. People who have money to spend! Any good JavaScript-based tracking tool like Google Analytics, IndexTools, Clicktracks, Omniture, etc. is better at providing you with this metric, since they generally don’t count non-human traffic like bots and spiders.

    Another good option is to use metrics from a third party like Compete. This allows the consumer of this kind of information to easily compare how your site stacks up against other sites. Quoting an unbiased, verifiable third party also helps lend credibility to your growth story. Bigger sites such as Yahoo, Google, MSN, etc. generally take this route.

  9. Question: Everyone else is lying. Do I lie too, or look less successful?

    Answer: Stephen DiMarco: Call me Catholic, but I have never had much success lying about web traffic, so I can’t even bluff over email. Therefore, I am a big proponent of providing as transparent a view into real measures as possible; but as you pointed out, it’s hard to know what the truth is when three different web analytics methodologies say three different things about your web traffic.

    When it comes down to it, the best thing you can do is select the option that you feel most accurately represents the most important aspects of your business, and then be transparent about the strengths and weaknesses of your decision. It’s liberating, and it becomes a real competitive advantage when you begin growing. Misleading the market is like building a house of cards, and it’s really hard to perpetuate a lie when it comes to supporting clients, partners, employees, and investors. The truth wants to be free.

    This is a real growth opportunity for the web measurement industry. The fact that both Nielsen//NetRatings and Comscore were cornered into MRC audits reinforces this point. I mean, with so much more money poised to flow into online advertising, the last thing marketers, agencies, and publishers needed was doubt and confusion about how best to allocate it.

    This is one of the big reasons why we launched Compete.com; we wanted to provide free access to what our data says about traffic to the top one million sites. And it’s worked—this month 500,000 people will use Compete.com to find information on the sites they care about. We’re making our data and the web transparent at the same time. Anyone can see our data and compare it to their own local analytics reports, and then let us know how we’re doing.

  10. Question: What are the most common mistakes that companies make that yield sub-optimal traffic?

    Answer: Jay Meattle: The most common mistake companies make is not paying enough attention to SEO when they’re developing new websites. As a result, companies end up not structurally optimizing their sites for search engines. This can be an expensive mistake: initially your site will not get all the traffic it could be getting from search engines, and it can be resource-intensive to make major structural changes to your site later on. My advice to everyone is to start thinking about SEO from day zero—well before the website is up and running.

  11. Question: Then what can I do to increase traffic at Truemors?

    Answer: Jay Meattle: Here are three suggestions:

    1. You need to be more proactive about telling search engines that your content exists. You can do this by creating a comprehensive Sitemap. A Sitemap is an XML file that lists a site’s URLs along with metadata such as how often each page changes, how important it is, and when it was last updated, so that search engines can crawl the site more intelligently. The current sitemap for Truemors lists twenty-four URLs. (A sketch of generating such a file follows this list.)

      In addition to the twenty-four URLs, I recommend listing the URL for each and every story on Truemors (13,000+). This should significantly boost the number of Truemors pages indexed by Google, etc., and in turn increase traffic. You can learn more about Sitemaps by signing onto Google’s webmaster tools or at Sitemaps.org.

    2. Customize your meta tags and titles for each page, and clean up the URLs to make them search engine crawler friendly. For example, search engines prefer descriptive URLs like http://truemors.com/fisherman-caught versus http://truemors.com/?p=13208. Cleaner URLs, titles, etc., are also more SEO-friendly, user-friendly, and “clickable.” It’s a win-win all around. (The sketch after this list includes a simple slug generator for exactly this.)

    3. Use Compete’s Search Analytics to identify keywords that drive quality traffic to competing sites. Once you have this list, tailor Truemors content so that you also start appearing in natural search results for those keywords, and/or start buying those keywords as sponsored results.
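    Neither Truemors’ code nor its actual sitemap is shown in the interview, so here is a hedged Python sketch tying points 1 and 2 together: it turns story titles into clean, descriptive URL slugs and writes them into a Sitemaps.org-style XML file. The stories, URLs, and change frequencies are invented for illustration.

    ```python
    import re
    from xml.sax.saxutils import escape

    def slugify(title):
        # "Fisherman Caught!" -> "fisherman-caught": lowercase, then collapse
        # every run of non-alphanumeric characters into a single hyphen.
        return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

    # Hypothetical stories; a real site would pull these from its database.
    stories = [
        ("Fisherman Caught", "2007-07-01"),
        ("Another Rumor Surfaces", "2007-07-02"),
    ]

    entries = []
    for title, last_modified in stories:
        url = f"http://truemors.com/{slugify(title)}"
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{last_modified}</lastmod>\n"
            "    <changefreq>daily</changefreq>\n"
            "  </url>"
        )

    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries) + "\n</urlset>"
    )

    with open("sitemap.xml", "w") as f:
        f.write(sitemap)
    ```

    Submitting the resulting sitemap.xml through Google’s webmaster tools (as suggested above) would tell the crawler about all 13,000+ story URLs instead of just the twenty-four it sees today.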

  12. Question: In two years what will be the top five social networking sites (in order from largest to smallest)?

    Answer: Jay Meattle: I don’t think the top five is going to look drastically different in two years. Our data tells us that it generally takes 24+ months from launch for a site to reach significant traffic levels—arbitrarily defined as 10 million+ unique visitors per month—thus I don’t expect any net new players to enter the list. Companies can of course buy themselves a spot in the top five, and it wouldn’t surprise me if Microsoft—missing from the list below—does just that.

    1. Facebook

    2. MySpace

    3. Yahoo/Flickr/del.icio.us/Upcoming/Answers/Mail, etc.

    4. Google/YouTube

    5. Digg

    Stephen DiMarco: I agree with Jay that the big five today will most likely be the big five tomorrow. Last year we surveyed online socialites—people using social networking sites—and learned some interesting tidbits; one was that people said they have the “social capacity” to be actively engaged in four social networking sites—anything beyond that and they become socially saturated. So it’s going to be hard to displace the current leaders.

    The one opportunity I do see is for companies or existing social organizations to bring social networking and media principles to their existing member bases. So while you’ll find an AARP group in Flickr or Facebook, you might also see AARP.com morph into a mash-up of an information, commerce, and membership site with a strong dose of social networking tools. Likewise, you might see some companies with large customer bases—American Express, Comcast, Walmart—follow Dell’s lead and try some innovative community-based initiatives again.