Ten Questions with Compete


This interview features the chief marketing officer and product manager of Compete, Inc. They discuss how Compete competes with Alexa and Comscore, site metrics, and SEO practices.

  1. Question: What exactly does Compete do?

    Answer: Stephen DiMarco: We have a diverse sample of 2,000,000 U.S. Internet users who have given us permission to analyze the web pages they visit and to ask them questions via surveys. Web analytics really means analyzing what consumers do across the entire web, not just what they do within a particular site, and marketers can use this information across the entire company, not just for online media planning or site design decisions.

    A great example is the work we’re doing in the truck segment for auto marketers; this segment is an all-out battle between Ford, GM, and Toyota with nearly a billion dollars of advertising spent each year. We’re using online behavior to predict the number of people shopping for each truck in the category and to understand how campaigns are swaying shoppers throughout the month. This helps auto marketers map how cost-effective their advertising is, and guides whether they need to up dealer incentives or increase advertising to hit their monthly sales target.

  2. Question: What did your investors say when you started giving away your data for free?

    Answer: Stephen DiMarco: It was kind of like “Twelve Angry Men” meets “The 40-Year-Old Virgin”—lots of heated debates wrapped up in a great fear of the unknown. Quite rationally, our board felt we were giving away a really valuable asset, and no amount of provocative quotes from John Battelle or The Cluetrain Manifesto could assuage them.

    Seriously, we did it for three reasons:

    1. We believe that if we’re collecting information from consumers, we should make the information available back to them and be transparent about how we’re using it.

    2. We believe that making the data easy to access through Compete.com or our API will increase the pace of innovation in web measurement and online marketing.

    3. We wanted to do something remarkable and game-changing, and felt this data is too valuable to bottle up inside $50,000-per-year licenses.

  3. Question: Do your stats include Macintosh users and Firefox users?

    Answer: Stephen DiMarco: Yes, Compete’s sample includes both Macintosh and Firefox users. We’re data junkies and know the importance of having a sample that represents the internet at large. To that end, our sample is created from more than ten different data sources, ranging from our own opt-in panels to data that we license from ISP and ASP partners. We designed our software to be compatible with Macintosh and Firefox, so they’re accurately represented in our data set. And the data that comes in through our ISP and ASP partnerships also includes Linux, Macintosh, Firefox, and Safari users.

  4. Question: How are your results different from Alexa and Comscore?

    Answer: Stephen DiMarco: We like to say that more is better—and by measuring 2,000,000 U.S. consumers each month, we’re substantially bigger than Comscore. Our larger sample gives us more reporting depth, and we feel our results are more accurate because we measure one million websites compared to the 15,000 that Comscore Media Metrix measures. There is incredible value in being able to accurately measure the “torso” of the web—sites in between the head and the tail. We’re better at this than Comscore because our sample is so much larger and we see things that their data just doesn’t pick up.

    We also have an accuracy advantage because the diversity of our data sources helps us identify and eliminate biases that show up from time to time. Our multi-source approach is a big point of differentiation—no one else in the market can do it—transforming more than ten different data streams into a common format and then performing statistical projections across 2,000,000 people on a nightly basis is no small feat.

    Alexa is a storied internet brand, but unfortunately a big part of the story is how bad its web traffic estimates are. Unlike Alexa, we go through a rigorous panel selection and normalization process that involves an independent RDD survey, demographic scaling, and extensive QA from our data operations team. Alexa, on the other hand, has a single source of data—its toolbar—so it’s very susceptible to bias. In fact, we often get emails from companies offering to increase our Alexa ranking by downloading tons of toolbars and then visiting our site!

  5. Question: Then should we all remove our Alexa bookmarks and replace them with Compete?

    Answer: Jay Meattle (product manager): Not yet—I keep both bookmarked. As long as you know what you’re looking at and keep the accuracy caveats in mind, Alexa can still be a tool for quick analyses and for evaluating international users. However inaccurate it might be, it’s still another data point. That said, I don’t recommend making multi-million dollar decisions based on Alexa!

  6. Question: Is SEO black magic and bull shiitake or can one increase traffic with a few changes to headers, keywords, etc?

    Answer: Jay Meattle: Every website owner that wants more traffic—who doesn’t?—should have an SEO strategy. A few tactical tweaks can indeed increase traffic from search engines dramatically.

    At Compete, we have experienced the power of SEO first hand. There was a 9x increase in the volume of referrals from Google to Compete.com within a month of our SEO-related changes going live, and today search engine referrals are at around 21x pre-SEO levels.

  7. Question: There’s often a 10x difference between what my server logs say my traffic is and what Google Analytics says. What accounts for this?

    Answer: The short answer is that when it comes to web analytics there are many ways of measuring the same thing. There are two main approaches for compiling local web analytics data. The first method, server log analysis, reads the log files in which the web server records all its transactions. The second method, page tagging, used by tools like Google Analytics, relies on JavaScript code on each page to notify a third-party server when a page is rendered by a web browser.

    Search engine spiders, bots, etc. generally cannot execute JavaScript, and hence are not counted by Google Analytics and similar tools. Log files, on the other hand, include all traffic to your servers, including spiders and bots that appear as traffic but do not represent actual human activity. This “non-human” traffic generally accounts for the difference between the two.
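    The distinction above can be sketched in a few lines of Python: a single pass over an Apache-style access log that separates human hits from crawler hits by user-agent string. The log pattern, the bot markers, and the sample lines below are illustrative assumptions, not how Google Analytics or any particular log analyzer actually works.

```python
import re

# Minimal pattern for the Apache "combined" log format:
# IP, identd, user, [timestamp], "request", status, bytes, "referrer", "user agent"
LOG_PATTERN = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "([^"]*)" \d{3} \S+ "[^"]*" "([^"]*)"$'
)

# A few user-agent substrings that identify common crawlers (illustrative, not exhaustive).
BOT_MARKERS = ("googlebot", "bingbot", "slurp", "spider", "crawler", "bot")

def split_traffic(log_lines):
    """Partition log lines into (human_hits, bot_hits) by user-agent string."""
    humans, bots = [], []
    for line in log_lines:
        match = LOG_PATTERN.match(line)
        if not match:
            continue  # skip malformed lines
        user_agent = match.group(3).lower()
        if any(marker in user_agent for marker in BOT_MARKERS):
            bots.append(line)
        else:
            humans.append(line)
    return humans, bots

sample = [
    '66.249.66.1 - - [29/Oct/2007:12:00:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '192.0.2.7 - - [29/Oct/2007:12:00:05 +0000] "GET / HTTP/1.1" 200 5120 "http://example.com" "Mozilla/5.0 (Windows NT 5.1)"',
]
humans, bots = split_traffic(sample)
print(len(humans), len(bots))  # prints "1 1": one human hit, one bot hit
```

    In practice, bot detection also uses known IP ranges and robots.txt fetch patterns, but even a crude user-agent check like this accounts for most of the log-versus-tag gap.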

  8. Question: Then when people ask, do I give the log answer or the Google Analytics answer?

    Answer: Jay Meattle: Depending on who is using the data, there is value in both numbers. For example, an SEO expert will probably want to know what your server logs have to say in order to determine how often Google is crawling your site. Your investors, on the other hand, most likely don’t care about that metric. They’re more interested in how effective you were at driving real living people to your site. People with money to spend! Any good JavaScript-based tracking tool like Google Analytics, IndexTools, Clicktracks, Omniture, etc. is better at providing you with this metric since they generally don’t count non-human traffic like bots and spiders.

    Another good option is to use metrics from a third party like Compete. This allows the consumer of this kind of information to easily compare how your site stacks up against other sites. Quoting an unbiased, verifiable third party also helps lend credibility to your growth story. Bigger sites such as Yahoo, Google, and MSN generally take this route.

  9. Question: Everyone else is lying, do I lie too or look less successful?

    Answer: Stephen DiMarco: Call me Catholic, but I have never had much success lying about web traffic, so I can’t even bluff over email. Therefore, I am a big proponent of providing as transparent a view into real measures as possible, but as you pointed out, it’s hard to know what the truth is when three different web analytics methodologies say three different things about your web traffic.

    When it comes down to it, the best thing you can do is select the option that you feel most accurately represents the most important aspects of your business, and then be transparent about the strengths and weaknesses of your decision. It’s liberating, and it becomes a real competitive advantage when you begin growing. Misleading the market is like building a house of cards and it’s really hard to perpetuate a lie when it comes to supporting clients, partners, employees, and investors. The truth wants to be free.

    This is a real growth opportunity for the web measurement industry. The fact that both Nielsen//NetRatings and Comscore were cornered into MRC audits reinforces this point. I mean, with so much more money poised to flow into online advertising the last thing marketers, agencies and publishers needed was doubt and confusion about how best to allocate it.

    This is one of the big reasons why we launched Compete.com; we wanted to provide free access to what our data says about traffic to the top one million sites. And it’s worked—this month 500,000 people will use Compete.com to find information on the sites they care about. We’re making our data and the web transparent at the same time. Anyone can see our data and compare it to their own local analytics reports, and then let us know how we’re doing.

  10. Question: What are the most common mistakes that companies make that yield sub-optimal traffic?

    Answer: Jay Meattle: The most common mistake companies make is not paying enough attention to SEO when they’re developing new websites. As a result, companies end up not optimizing their sites structurally for search engines. This can be an expensive mistake: your site will not initially get all the traffic that it could be getting from search engines, and it can be resource intensive to make major structural changes to your site later on. My advice to everyone is to start thinking about SEO from day zero, well before the website is up and running.

  11. Question: Then what can I do to increase traffic at Truemors?

    Answer: Jay Meattle: Here are three suggestions:

    1. You need to be more pro-active about telling search engines that your content exists. You can do this by creating a comprehensive Sitemap. A Sitemap is an XML file that lists URLs for a site along with metadata such as how often each page changes, how important it is, and when it was last updated, so that search engines can more intelligently crawl the site. The current sitemap for Truemors lists twenty-four URLs.

      In addition to the twenty-four URLs, I recommend listing the URL for each and every story on Truemors (13,000+). This should significantly boost the number of Truemors pages indexed by Google and other search engines, and in turn increase traffic. You can learn more about Sitemaps by signing into Google’s webmaster tools or at Sitemaps.org.

    2. Customize your meta tags and titles for each page, and clean up the URLs to make them search-engine-crawler friendly. For example, search engines prefer descriptive URLs like http://truemors.com/fisherman-caught versus http://truemors.com/?p=13208. Cleaner URLs, titles, etc. are also more SEO friendly, user friendly, and “clickable.” It’s a win-win all around.

    3. Use Compete’s Search Analytics to identify keywords that drive quality traffic to competing sites. Once you have this list, tailor Truemors content so that you also start appearing in natural search results for those keywords, and/or start buying those keywords as sponsored results.
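    The Sitemap in suggestion 1 can be generated programmatically. Here is a minimal sketch using only Python’s standard library; the URLs, dates, and change frequencies are hypothetical placeholders standing in for the 13,000+ Truemors story pages, not real entries.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a Sitemap XML tree with one <url> entry per page."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc            # the page's address
        ET.SubElement(url, "lastmod").text = lastmod    # last-modified date
        ET.SubElement(url, "changefreq").text = changefreq  # how often it changes
    return ET.ElementTree(urlset)

# Hypothetical entries: the home page plus one story page.
pages = [
    ("http://truemors.com/", "2007-10-29", "hourly"),
    ("http://truemors.com/fisherman-caught", "2007-10-28", "monthly"),
]
tree = build_sitemap(pages)
ET.dump(tree.getroot())  # writes the XML to stdout for inspection
```

    Calling tree.write("sitemap.xml", encoding="utf-8", xml_declaration=True) saves the file, which you could then submit through Google’s webmaster tools as described above.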

  12. Question: In two years what will be the top five social networking sites (in order from largest to smallest)?

    Answer: Jay Meattle: I don’t think the top five is going to look drastically different in two years. Our data tells us that it generally takes 24+ months from launch for a site to reach significant traffic levels—arbitrarily defined as 10 million+ unique visitors per month—thus I don’t expect any net new players to enter the list. Companies can of course buy themselves a spot in the top five, and it wouldn’t surprise me if Microsoft — missing from the list below — does just that.

    1. Facebook

    2. MySpace

    3. Yahoo/Flickr/del.icio.us/Upcoming/Answers/Mail, etc.

    4. Google/YouTube

    5. Digg

    Stephen DiMarco: I agree with Jay that the big five today will most likely be the big five tomorrow. Last year we surveyed online socialites—people using social networking sites—and learned some interesting tidbits; one was that people said they have the “social capacity” to be actively engaged in four social networking sites—anything beyond that and they become socially saturated. So it’s going to be hard to displace the current leaders.

    The one opportunity I do see is for companies or existing social organizations to bring social network and media principles to their existing member bases. So while you’ll find an AARP group in Flickr or Facebook, you might also see AARP.com morph into a mash-up of an information, commerce and membership site with a strong dose of social networking tools. Likewise, you might see some companies with large customer bases—American Express, Comcast, Walmart— follow Dell’s lead and try some innovative community-based initiatives again.

By Guy Kawasaki | October 29th, 2007 | Categories: Uncategorized | 17 Comments

About the Author:

Guy Kawasaki is the chief evangelist of Canva, an online graphic design tool. Formerly, he was an advisor to the Motorola business unit of Google and chief evangelist of Apple. He is also the author of The Art of Social Media, The Art of the Start, APE: Author, Publisher, Entrepreneur, Enchantment, and nine other books. Kawasaki has a BA from Stanford University and an MBA from UCLA as well as an honorary doctorate from Babson College.


  1. Chris Anthony October 29, 2007 at 4:52 pm - Reply

    Guy, I haven’t yet read this Ten Questions piece because I’m too distracted by the latest change to your RSS feed. Providing readers with a content summary strikes me as a cheap trick to drive traffic to your site (and to make your site appear in access logs more often as a referrer) rather than a way to increase the site’s actual value.
    Right now, as I see it, you have three classes of users: users who were going to come to your site anyway so they could leave comments (did you previously have a direct link to the comments section of your posts in your RSS feeds?), users who are going to come to your site now because they’re forced to in order to read what you’re writing, and users who are going to stop reading what you’re writing because they’re expecting a level of service from your RSS feed that it’s no longer providing. That doesn’t exactly seem like the optimal situation to me; you’re not winning any readers by changing formats, and you’re potentially alienating quite a few.
    At the very least, please give us a reason why you changed the feed. Like I said above, from where I’m sitting it looks like a cheap trick, and I prefer to think better of you.

  2. Jason Gollan October 29, 2007 at 5:12 pm - Reply

    Hi Guy,
    Thanks for the interview with DiMarco. I love this quote, “Alexa is a storied internet brand,” which is only ever so slightly smarmy, don’tcha think? Compete’s been around since at least 2000, if memory serves, and has a history and track record itself in part because of the efforts of the thirteen of us who came over from Alexa Research, as a team, to help take it out of the incubator and into the real world. Not that that mattered much once the MBAs from Bain & Co. got their hands on it. Then again, were it not for those guys, I wouldn’t be in graduate school right now. Seems like Compete’s doing better with the launch of compete.com, which is a great idea. Let the mashups begin! Access to APIs = Cool stuff. Thanks again for the interview.

  3. Firenza DiLoren October 29, 2007 at 5:27 pm - Reply

    Please return to full content in RSS feeds. I’m fine with ads, just don’t make me come to your website. It disrupts my whole news consumption pattern. If this continues, I need to unsubscribe.

  4. Firenza DiLoren October 29, 2007 at 5:44 pm - Reply

    To clarify: I’m fine with the whole HTML of this page being RSS syndicated. Really the whole shebang with ads and tie-ins left and right. While it may sound odd that a whole extra click really disrupts things, you need to imagine that I speed-read RSS content keeping one finger on “n” (for next article) to quickly scan whether a news item interests me before jumping to the next one. Your partial paragraph totally disrupts this pattern, as I need to press a different key, have another window open, quick-scan, and close that window before being able to continue. Given that I have X minutes a day allocated for news consumption, any decrease in efficiency sucks.

  5. Dave N October 29, 2007 at 7:56 pm - Reply

    +1 on Chris Anthony’s comment; I couldn’t have said it better. Please consider switching back or at least explaining the switch.

  6. Mike October 29, 2007 at 8:52 pm - Reply

    The only reason I came to this page is to say I will unsubscribe if the feeds are not restored. I’m glad to see other people are talking about it as well. Please return to full feeds.

  7. Marc Duchesne October 30, 2007 at 6:15 am - Reply

    To all but a few: I sincerely don’t understand your rage about the RSS stuff. Set up an account on Netvibes, and you’re done. Come on, the post title and the first line should be enough to raise your interest or not. Especially with someone like Guy, who knows how to grab attention at first sight.

  8. fullfeedss October 30, 2007 at 6:27 am - Reply

    I don’t know how you’ll contact me, but please do so when you get back full feeds.
    In the meantime, I’m out. Good luck on reaching the top 10, but your teasers just don’t make me click for more.

  9. Anthony Kuhn October 30, 2007 at 3:28 pm - Reply

    Wow! Your change in RSS feed content is generating plenty of froth; mostly negative it would appear. I understand that Compete Inc. is trying to capitalize on web traffic data to generate revenue, but I miss the halcyon days of the Internet when a simple metatag with relevant keywords could generate traffic without the need for a doctorate degree in SEO machinations. Must it always be about money and bottom-line? I cross-posted on your piece to http://blog.innovators-network.org The Innovators Network is a non-profit dedicated to bringing technology to startups, small businesses, non-profits, venture capitalists and intellectual property experts. Please visit us and help grow our community!
    Best wishes for continued success,
    Anthony Kuhn
    Innovators Network

  10. Innovators Network October 30, 2007 at 3:32 pm - Reply

    Compete To Increase Web Site Traffic

    Guy Kawasaki interviews the team from Compete, a web traffic analytics business, and learns three ways to increase traffics to his startup website, Truemors. First, create a Sitemap, which is an XML file that lists URL’s for a site along

  11. Small Business Essentials October 31, 2007 at 12:17 am - Reply

    Transparency and Analytics

    As I was reading the post Ten Questions with Compete on Guy Kawasakis blog, I was reminded of a recent conversation I had on StartupNation with a designer named Paula about what kind of site information is proprietary, and if website traffic dat…

  12. Kibrika November 2, 2007 at 6:20 am - Reply

    About the change in URL from a meaningless index to meaningful text: wouldn’t it slow down the server looking up that particular article? Or is the slowdown so negligible that it would still be worth it?

  13. CarolG November 2, 2007 at 6:14 pm - Reply

    When I was young in the Northwoods of Wisconsin, I only had arcade games, a touring bike, and the Sears catalog. A blog was an electronic dream, RSS whiners.
    Man up.

  14. ۞ Search Engines WEB ۞ November 10, 2007 at 12:44 am - Reply

    One question that should have been asked is:
    How are the permissions attained from the 2 million users?
    The answer could determine how representative they are of U.S. Web users as a whole.

  15. BizTechTalk November 12, 2007 at 8:26 am - Reply

    Attention! We actually do have your attention!

    Quick (it’s all relative) post on this one – for about a year now, I’ve been tracking a Boston-area company called Compete, and have been lucky enough to have some great conversations, lunches and dinner with a solid handful or

  16. John Somerton November 14, 2007 at 7:02 am - Reply

    The Dangerous Skew of Toolbar Generated Web Stats

    Relying on toolbar generated metrics such as Alexa to make important decisions can be very dangerous if you don’t ask yourself the single most important question of any data collection tool: What is the source of the data? Yes it’s

  17. jimmy November 14, 2007 at 1:34 pm - Reply

    ” it wouldn’t surprise me if Microsoft — missing from the list below — does just that.”
    And that they did..
