Tuesday, August 14, 2012

SEO Expert Quiz by SEOmoz

The SEO Expert Quiz has 75 action-packed questions and takes 30 minutes to complete. You have nothing to lose and a lot of prestige to gain. Let the games begin! Take the quiz at SEOmoz.
  • #1 Which of the following is the least important area in which to include your keyword(s)?

    Your Answer: Meta Keywords
    Correct Answer: Meta Keywords
    The meta keywords tag is least important among these because search engines do not consider it in ranking calculations and it's never seen by visitors or searchers (unlike the meta description tag, which displays beneath listings in the SERPs).
  • #2 Which of the following would be the best choice of URL structure (for both search engines and humans)?

    Your Answer: www.wildlifeonline.com/animals/crocodile
    Correct Answer: www.wildlifeonline.com/animals/crocodile
    The best choice would be www.wildlifeonline.com/animals/crocodile - it provides the most semantic information, the best description of the content on the page and contains no parameters or subdomains that could cause issues at the engines. For more on URL structuring, see this post on SEOmoz.
  • #3 When linking to external websites, a good strategy to move up in the rankings is to use the keywords you're attempting to rank for on that page as the anchor text of the external-pointing links. For example, if you were attempting to rank a page for the phrase "hulk smash," you would want to use that phrase, "hulk smash," as the anchor text of a link pointing to a web page on another domain.

    Your Answer: False
    Correct Answer: False
    The biggest problem with linking out to other websites with your targeted keyword phrases in the anchor text is that it creates additional competition for your page in the search results, as you give relevance through anchor text and link juice to a competing page on a competing site. Thus, FALSE is the correct answer.
  • #4 Which of the following is the best way to maximize the frequency with which your site/page is crawled by the search engines?

    Your Answer: Frequently add new content
    Correct Answer: Frequently add new content
    Adding new content on a regular basis is the only one of the methods listed that will promote more frequent spidering and indexing. Directives like crawl-delay have never been shown to be effective (and aren't even supported by many of the major engines). The other "partially" correct answer would be to turn up crawl frequency inside Webmaster Central at Google, but this only works if Google wants to crawl your site more actively and is restricted from doing so.
  • #5 Which of the following is a legitimate technique to improve rankings & traffic from search engines?

    Your Answer: Re-writing title tags on your pages to reflect high search volume, relevant keywords
    Correct Answer: Re-writing title tags on your pages to reflect high search volume, relevant keywords
    Of the choices, only the option to change title tags to reflect better keywords is a legitimate and effective SEO technique.
  • #6 Danny Sullivan is best known (in the field of web search) as:

    Your Answer: A journalist and pundit who covers the field of web search
    Correct Answer: A journalist and pundit who covers the field of web search
    Although there's an answer we'd love to choose :), the correct answer is that Danny's a journalist and pundit on web search who currently operates the SearchEngineLand blog and runs the SearchMarketingExpo event series.
  • #7 Which of the following is the WORST criterion for estimating the value of a link to your page/site?

    Your Answer: The ranking of the linking page for its targeted keywords
    Correct Answer: The popularity of the domain on which the page is hosted according to Alexa
    Since Alexa data is typically less useful than monkeys throwing darts at a laptop, it's the obvious choice for worst metric. The others can all contribute at least some valuable insight into the value a link might pass.
  • #8 How can Meta Description tags help with the practice of search engine optimization?

    Your Answer: They serve as the copy that will entice searchers to click on your listing
    Correct Answer: They serve as the copy that will entice searchers to click on your listing
    The correct answer is that they serve as the copy in the SERPs and are thus valuable for influencing click-through rates.
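    For illustration, a minimal sketch of what such a tag looks like in a page's head section (the wildlifeonline.com domain from question #2 is reused here as a hypothetical stand-in):

        <head>
          <title>Crocodiles | Wildlife Online</title>
          <!-- Not a ranking factor, but often shown as the snippet in the
               SERPs, so it directly influences click-through rate -->
          <meta name="description" content="Facts, photos and habitat information for crocodiles from the Wildlife Online animal guide.">
        </head>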
  • #9 Which of the following content types is most easily crawled by the major web search engines (Google, Yahoo!, MSN/Live & Ask.com)?

    Your Answer: XHTML
    Correct Answer: XHTML
    XHTML is the obvious choice as the other file types all create problems for search engine spiders.
  • #10 Which of the following sources is considered to be the best for acquiring competitive link data?

    Your Answer: Yahoo!
    Correct Answer: Yahoo!
    Since Yahoo! is the only engine still providing in-depth, comprehensive link data for both sites and pages, it's the obvious choice. Link commands have been disabled at MSN, throttled at Google, never existed at Ask.com and provide only a tiny subset of data at Alexa.
  • #11 Which of the following site architecture issues MOST impedes the ability of search engine spiders to crawl a site?

    Your Answer: Pages that require form submission to reach database content
    Correct Answer: Pages that require form submission to reach database content
    Since search engines will assume a site is crawlable if it has no robots.txt file, doesn't have any crawl-specific issues with paid links, can read iFrames perfectly well and is able to spider and index plenty of pages with multiple URL parameters, the correct answer is clear. Pages that require form submission effectively block spiders, as automated bots will not complete form submissions to attempt to discover web content.
  • #12 What is the generally accepted difference between SEO and SEM?

    Your Answer: SEO focuses on organic/natural search rankings, SEM encompasses all aspects of search marketing
    Correct Answer: SEO focuses on organic/natural search rankings, SEM encompasses all aspects of search marketing
    SEO - Search Engine Optimization - refers to the practice of ranking pages in the organic results at the search engines. SEM - Search Engine Marketing - refers to all practices that leverage search engines for traffic, branding, advertising & marketing.
  • #13 Which of these is NOT generally considered to be a highly important factor for ranking for a particular search term?

    Your Answer: Temporal relevance - the number and quality of links pointing to a page over a given time span
    Correct Answer: HTML Validation (according to W3C standards) of a page
    As this document would indicate, W3C validation is clearly the odd man out in this bunch.
  • #14 When creating a "flat architecture" for a site, you attempt to minimize what?

    Your Answer: The number of links a search engine must follow to reach content pages
    Correct Answer: The number of links a search engine must follow to reach content pages
    Flat site architecture refers to the link structure of the site, and thus, the only answer is "the number of links a search engine must follow to reach content pages."
  • #15 In the search marketing industry, what is traditionally represented by this graph?

    Your Answer: The "long tail" theory of keyword demand
    Correct Answer: The "long tail" theory of keyword demand
    The graph shown represents the long tail concept, which is most frequently applied to keyword demand in the search marketing world. The theory is explained in detail here.
  • #16 Which of the following is NOT a "best practice" for creating high quality title tags?

    Your Answer: Include an exhaustive list of keywords
    Correct Answer: Include an exhaustive list of keywords
    Since all the rest are very good ideas for title tag optimization (see this post for more), the outlier is to include an exhaustive list of keywords. Title tags are meant to describe the content on the page and to target 1-2 keyword phrases in the search engines, and thus, it would be terribly unwise to stuff many terms/phrases into the tag.
  • #17 Which of the following character limits is the best choice to use when limiting the length of title tags (assuming you want those tags to fully display in the search results at the major engines)?

    Your Answer: 65
    Correct Answer: 65
    As Google & Yahoo! both display between 62-68 characters (there appears to be some variance depending on both the country of origin of the search and the exact query), and MSN/Live hovers between 65-69, the best answer is... 65!
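    For example, a title tag comfortably under the 65-character mark (the page and wording are hypothetical):

        <!-- Roughly 50 characters including spaces, so it should display
             in full at Google, Yahoo! and MSN/Live per the ranges above -->
        <title>Crocodile Facts, Habitat & Photos | Wildlife Online</title>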
  • #18 PageRank is so named because it was created by Larry Page, not because it ranks pages.

    Your Answer: TRUE
    Correct Answer: TRUE
    As you can read on Google's fun facts page, PageRank was named for its co-creator, Larry.
  • #19 A page on your site that serves as a "sitemap," linking to other pages on your domain in an organized, list format, is important because...

    Your Answer: It may help search engine crawlers to easily access many pages on your site
    Correct Answer: It may help search engine crawlers to easily access many pages on your site
    As none of the others are remotely true, the only correct answer is that a sitemap page may help search engine crawlers easily access many pages on your site, particularly if your link structure is otherwise problematic.
  • #20 Which of the following search engines patented the concept of "TrustRank" as a methodology for ranking web sites & pages?

    Your Answer: Yahoo!
    Correct Answer: Yahoo!
    The correct answer comes via the patent guru himself, Bill Slawski, who notes:
    The citation that I’ve seen most commonly pointed at regarding trustrank is this paper - Combating Web Spam with TrustRank (pdf).
    The authors listed on that paper are the named inventors on this Yahoo patent application:
    1 Link-based spam detection (20060095416)
    The remaining four describe an expansion of the trustrank process, referred to as dual trustrank, which adds elements of the social graph to the use of trustrank.
    2 Using community annotations as anchortext (20060294085)
    3 Realtime indexing and search in large, rapidly changing document collections (20060294086)
    4 Trust propagation through both explicit and implicit social networks (20060294134)
    5 Search engine with augmented relevance ranking by community participation (20070112761)
  • #21 Why are absolute (http://www.mysite.com/my-category) URLs better than relative ("/my-category") URLs for on-page internal linking?

    Your Answer: They provide more keyword context for search engines judging internal links
    Correct Answer: When scraped and copied on other domains, they provide a link back to the website
    None of the other answers makes sense. Scrapers often copy pages without changing the links inside them, and those copies will thus link back to your site, helping to reduce duplicate content issues and potentially providing some link value as well.
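    A quick sketch of the difference, again using the hypothetical wildlifeonline.com domain:

        <!-- Relative link: if this page is scraped and republished on another
             domain, the link resolves against the scraper's domain and is lost -->
        <a href="/animals/crocodile">Crocodiles</a>

        <!-- Absolute link: even when copied verbatim to another domain,
             it still points back to the original site -->
        <a href="http://www.wildlifeonline.com/animals/crocodile">Crocodiles</a>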
  • #22 How can you avoid the duplicate content problems that often accompany temporal pagination issues (where content moves down a page and from page to page, as is often seen in lists of articles, multi-page articles and blogs)?

    Your Answer: Link to paginated pages with rel="nofollow" in the link tag
    Correct Answer: Add a meta robots tag with "noindex, follow" to the paginated pages
    The only method listed in the answers that's effective is to use "noindex, follow" on the paginated, non-canonical pages.
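    What that tag looks like in practice (a minimal sketch for a hypothetical paginated archive):

        <!-- Placed in the <head> of page 2, 3, etc. of the archive: keeps the
             paginated page out of the index while still letting spiders follow
             its links and pass value to the articles it lists -->
        <meta name="robots" content="noindex, follow">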
  • #23 If you update your site's URL structure to create new versions of your pages, what should you do with the old URLs?

    Your Answer: 301 redirect them to the new URLs
    Correct Answer: 301 redirect them to the new URLs
    The correct move is to 301 the pages so they pass link juice and visitors to the new, proper locations.
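    How the redirect is set up depends on the server; on Apache, for instance, a .htaccess rule along these lines would work (the URLs are hypothetical):

        # Permanently redirect an old URL to its new home, passing
        # link juice and visitors along with it
        Redirect 301 /old-category/crocodile.php http://www.wildlifeonline.com/animals/crocodile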
  • #24 When you have multiple pages targeting the same keywords on a domain, which of the following is the best way to avoid keyword cannibalization?

    Your Answer: Place links on all the secondary pages back to the page you most want ranking for the term/phrase using the primary keywords as the anchor text of those links
    Correct Answer: Place links on all the secondary pages back to the page you most want ranking for the term/phrase using the primary keywords as the anchor text of those links
    As this blog post explains, it's best to "place links on all the secondary pages back to the page you most want ranking for the term/phrase using the primary keywords as the anchor text of those links."
  • #25 The de-facto version of a page located on the primary URL you want associated with the content is known as:

    Your Answer: Canonical Version
    Correct Answer: Canonical Version
    The only answer that is generally accepted in the search community is "canonical version."
  • #26 Which domain extensions are more often associated with greater trust and authority in the search engines?

    Your Answer: .edu, .mil and .gov
    Correct Answer: .edu, .mil and .gov
    Although the search engines themselves have said there are no specific algorithmic elements that make domains from .gov, .edu and .mil more trustworthy or authoritative, these sites, due to the restriction of the TLD licensing, certainly have an association with more trust in webmasters' eyes (and, very often, the search results).
  • #27 High quality links to a site's homepage will help to increase the ranking ability of deeper pages on the same domain.

    Your Answer: TRUE
    Correct Answer: TRUE
    The answer is "TRUE" as the properties of PageRank, domain trust, authority and many other search ranking factors will cause internal pages on a well-linked-to domain to rank more highly.
  • #28 The practice of showing one version of content on a URL to search engines, and another, different version to human visitors of the same URL is known as?

    Your Answer: Cloaking
    Correct Answer: Cloaking
    As WebmasterWorld notes, this practice is called cloaking.
  • #29 Which HTTP server response code indicates a file that no longer exists? (File Not Found)

    Your Answer: 404
    Correct Answer: 404
    The W3C standards for HTTP status codes tell us that 404 is the correct answer.
  • #30 Spammy sites or blogs begin linking to your site. What effect is this likely to have on your search engine rankings?

    Your Answer: No effect - the search engines discount all spammy sites from passing link value, but do not penalize sites for receiving these links
    Correct Answer: A very slight positive effect is most likely, as search engines are not perfectly able to discount the link value of all spammy sites
    The correct answer is that a very slight positive effect is most likely. This is because search engines do NOT want to penalize for the acquisition of spammy links, as this would simply encourage sites to point low quality links at their competition in order to knock them out of the results. The slight positive effect is typical because not all engines are 100% perfect at removing the link value from spam.
  • #31 A link from a PageRank "3" page (according to the Google toolbar) hosted on a very strong, trusted domain can be more valuable than a link from a PageRank "4" page hosted on a weaker domain.

    Your Answer: TRUE
    Correct Answer: TRUE
    Since toolbar PageRank is far from an overwhelmingly strong factor influencing search rankings at Google these days, the answer is definitely "TRUE."
  • #32 What's the largest page size that Google's spider will crawl?

    Your Answer: No set limit exists - Google may crawl very large pages if it believes them to be worthwhile
    Correct Answer: No set limit exists - Google may crawl very large pages if it believes them to be worthwhile
    As evidenced by the many very large (500K+) pages in Google's index, there is no set limit, and the search engine may spider unusually large documents if it feels the effort is warranted (particularly if many important links point to a page).
  • #33 Is it generally considered acceptable to have the same content resolve on both www and non-www URLs of a website?

    Your Answer: No, this may cause negative indexing/ranking issues
    Correct Answer: No, this may cause negative indexing/ranking issues
    This is generally considered a bad idea, and may have negative effects if the search engines do not properly count links to both versions (the most common issue) or even view the two as duplicate, competing content (unlikely, though possible).
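    One common fix, assuming an Apache server with mod_rewrite, is to 301 one hostname to the other so that links and indexing consolidate on a single canonical version (the domain is hypothetical):

        RewriteEngine On
        # Send all non-www requests to the www version with a 301
        RewriteCond %{HTTP_HOST} ^wildlifeonline\.com$ [NC]
        RewriteRule ^(.*)$ http://www.wildlifeonline.com/$1 [R=301,L]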
  • #34 Which HTTP server response code indicates a page that has been temporarily relocated and links to the old location will not pass influence to the new location?

    Your Answer: 302
    Correct Answer: 302
    The W3C standards for HTTP status codes tell us that 302 is the correct answer.
  • #35 Which of these is least likely to have difficulty ranking for its targeted terms/phrases in Google?

    Your Answer: A new domain that has received significant buzz and attention in the online and offline media, along with tens of thousands of natural links
    Correct Answer: A new domain that has received significant buzz and attention in the online and offline media, along with tens of thousands of natural links
    This is a tough question, and the answer is even somewhat debatable. However, as phrased, the MOST correct answer is almost certainly "A new domain that has received significant buzz and attention in the online and offline media, along with tens of thousands of natural links," as each of the other situations has many examples of sites having a very difficult time ranking well.
  • #36 What is the advantage of putting all of your important keywords in the Meta Keywords tag?

    Your Answer: It increases relevance to Yahoo! and MSN/Live, although Google & Ask ignore it
    Correct Answer: There is no specific advantage for search engines
    The answer is that no advantage is conferred upon sites who include their terms in the meta keywords tag. For more on the subject, read Danny Sullivan's excellent post.
  • #37 Which of the following link building tactics do search engines tacitly endorse?

    Your Answer: Link building via general, webmaster-focused directories
    Correct Answer: Viral content creation & promotion
    As representatives from each of the major engines have acknowledged publicly, viral content creation and promotion is viewed as a legitimate and preferred tactic for link acquisition.
  • #38 Which HTTP server response code indicates a page that has been permanently relocated and all links to the old page will pass their influence to the new page location?

    Your Answer: 301
    Correct Answer: 301
    The W3C standards for HTTP status codes tell us that 301 is the correct answer.
  • #39 Which of the following factors is considered when search engines assign value to links?

    Your Answer: The use of a proper link title in the HTML tag of the link, matching the anchor text of the link
    Correct Answer: The date/time the link was created and the temporal relationship between that link's appearance and other time-sensitive criteria
    The only one of these that search engines would consider (and have mentioned in patent applications like this one) is the temporal data.
  • #40 There is no apparent search engine rankings benefit to having a keyword-matched domain name (eg www.example.com for keyword "example").

    Your Answer: TRUE
    Correct Answer: FALSE
    This is "FALSE," as many examples of keyword-targeted domains have been shown to have a phenomenal amount of ranking success in the engines, despite other factors not being nearly as strong as the competition.
  • #41 If you want a page to pass value through its links, but stay out of the search engines' indices, which of the following tags should you place in the header?

    Your Answer: Use meta robots="noindex, follow"
    Correct Answer: Use meta robots="noindex, follow"
    As Google tells us here, the proper format would be to use meta robots="noindex, follow".
  • #42 Which of these factors is LEAST likely to decrease the value of a link?

    Your Answer: The linked-to and linking sites are both hosted on the same IP address
    Correct Answer: The linked-to domain has a link somewhere that points at the linking domain (each domain links to pages on the other's site)
    The right answer is "a link from the domain being linked to, pointing at the linking site, already exists (each domain links to pages on the other's site)." Despite the fact that these links are technically "reciprocal," they don't fit any pattern of penalization for such links (such as being listed on link-list style pages). The search engines are least likely to devalue these because of all the natural patterns in which such linking occurs (blogrolls, news sites, forums, hobbyists, schools, etc.)
  • #43 Which of the following is a requirement for getting in the Google Local listings?

    Your Answer: A telephone number with a prefix matching the claimed location
    Correct Answer: A physical mail address in your claimed location
    The only one that's a must-have is the physical mailing address.
  • #44 Which of the following engines offers paid inclusion services for their main web index (not advertising):

    Your Answer: Ask
    Correct Answer: Yahoo!
    Currently, only Yahoo! offers paid inclusion through their search submit program.
  • #45 When is it advisable to leave the meta description off of a page?

    Your Answer: When the page is targeted to so many keywords that writing a meta description might hurt the click-through rate from the search results
    Correct Answer: When a large amount of pages exist and the options are between using a single meta description for all of the pages vs. leaving them with none at all
    The correct answer is "When a large amount of pages exist and the options are between using a single meta description for all of the pages vs. leaving them with none at all." Duplicate meta description tags aren't the worst thing in the world, but they're certainly not providing any value and may have downsides from a duplicate content perspective (particularly if page content is very similar). Besides that, the other answers simply don't make sense :)
  • #46 A domain will not be hurt by having a penalized site or page 301'd to it.

    Your Answer: FALSE
    Correct Answer: TRUE
    This is "TRUE," and has been tested by many a black hat. The danger here is that, once again, crafty spammers could use this technique to hurt their competitors if the search engines did penalize the receiving domain.
  • #47 Which of the following strategies is the best way to lift a page out of Google's supplemental index?

    Your Answer: Link to it internally from strong pages
    Correct Answer: Link to it internally from strong pages
    As "supplemental" has been defined by engineers at Google as being a page with very little PageRank, the best way to lift it out, from the options given, is to link to it internally from strong pages.
  • #48 Which of the following is NOT speculated to be a contributing factor in achieving "breakout" site results in Google?

    [Image: a sample of "breakout" site results for the query "Comedy Central" at Google]

    Your Answer: Having an active AdWords campaign
    Correct Answer: Having an active AdWords campaign
    The only one that doesn't fit is the use of an AdWords campaign, which Google has said has no impact on organic listings.
  • #49 Which of the following is the best method to ensure that a page does not get crawled or indexed by a search engine?

    Your Answer: Restrict the page using robots.txt
    Correct Answer: Restrict the page using robots.txt
    The clear best method above, and the one prescribed by the engines, is to use the robots.txt file to restrict access.
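    A minimal robots.txt sketch (the paths are hypothetical); the file sits at the root of the domain:

        # Applies to all crawlers
        User-agent: *
        # Keep these out of the engines' indices
        Disallow: /private-page.html
        Disallow: /admin/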
  • #50 If you want to rank for a country specific TLD/Top-Level-Domain extension (such as Yahoo.jp or Google.ca) which of the following is NOT important?

    Your Answer: Registering a domain extension that matches the targeted country (i.e. .nl for the Netherlands or .cn for China)
    Correct Answer: Linking out only to other sites with the targeted TLD extension
    Linking out only to other sites with the targeted TLD extension is certainly not a requirement nor a suggested method for inclusion into a country-specific search engine's results. See this recent video for more.
  • #51 Which of the following CANNOT get you penalized at the major search engines?

    Your Answer: Using "nofollow" internally on your site to control the flow of link juice
    Correct Answer: Using "nofollow" internally on your site to control the flow of link juice
    As Matt Cutts has noted recently, using "nofollow" to sculpt the flow of link juice is perfectly acceptable.
  • #52 Which of the following directories had its ability to pass link value removed?

    Your Answer: www.bluefind.org - The Bluefind Web Directory
    Correct Answer: www.bluefind.org - The Bluefind Web Directory
    Only BlueFind suffered this penalty - having had its ability to pass link value removed by Google, ostensibly for "selling PageRank."
  • #53 Which of the following is an acceptable way to show HTML text to search engines while creating a graphical image to display to users?  

    Your Answer: CSS layers - show the text on a layer underneath the image atop
    Correct Answer: CSS image replacement - create a rule in the CSS file that replaces the text with an image based on a given class
    The only method that's approved by search engines is to use CSS image replacement with the exact copy in both the image and the HTML text.
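    One common variant of the technique, sketched below (class name, image and dimensions are made up), keeps the same words in the HTML that appear in the image:

        <style>
          /* Swap the text for the logo image; the HTML text is moved
             off-screen rather than hidden from the engines */
          .logo {
            background: url(logo.png) no-repeat;
            width: 300px;
            height: 80px;
            text-indent: -9999px;
            overflow: hidden;
          }
        </style>
        <h1 class="logo">Wildlife Online</h1>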
  • #54 For high-volume search phrases, the Search Engines usually will not differentiate between singular and plural versions of a term (eg "cell phone" vs. "cell phones" or "bird feeder" vs. "bird feeders").

    Your Answer: FALSE
    Correct Answer: FALSE
    As we can see from searches on the various phrases - cell phone vs. cell phones and bird feeder vs. bird feeders - this is FALSE. There are clear differentiations.
  • #55 If your site is ranked in the #1 organic position for a given query, advertising in the top paid position for that search result will generally not produce an additional volume of search traffic.

    Your Answer: FALSE
    Correct Answer: FALSE
    Research from several sources, including this eye-tracking research report from MarketingSherpa, indicates that the correct answer is FALSE. You get more traffic and click-throughs with both the top paid and organic results than either individually.
  • #56 What's likely to happen if multiple accounts on a single IP address vote up a story at Digg in a short time period?

    Your Answer: Your accounts will be suspended
    Correct Answer: Your accounts will be suspended
    The most likely result, particularly if this is done multiple times, is to have the accounts suspended.
  • #57 Let's assume that you're running SEO for an auction website with many listings, sorted by categories and subcategories. To achieve the maximum search engine traffic benefit, what should you do with individual product/auction pages after the auction has expired and the product is no longer available?

    Your Answer: 301 redirect them to the most appropriate category page associated with the product
    Correct Answer: 301 redirect them to the most appropriate category page associated with the product
    The "best" answer of the choices given is to 301 redirect the pages to the most appropriate category page associated with the product - this ensures that link value won't be lost, and visitors who come to the old page will get the best user experience as well.
  • #58 Which factor is most likely to decrease the ranking value of a link?

    Your Answer: Comes from a page with many reciprocal and paid links
    Correct Answer: Comes from a page with many reciprocal and paid links
    All of the answers can provide significant link value except "comes from a page with many reciprocal and paid links," which is very likely to have a strong negative effect on the value of the link.
  • #59 Which of the following search engine and country combinations does not represent the most popular search engine in that country?

    Your Answer: Korea / Naver
    Correct Answer: Japan / Yahoo
    All of the above are correct, except Japan, where Google appears to now have a dominant search market share, despite Yahoo! getting more web traffic and visits. See also this piece from Multilingual-Search.com.
  • #60 Where do search engines consider content inside an iFrame to be located?

    Your Answer: Search engines cannot spider content in iFrames
    Correct Answer: On the source page the iFrame pulls from
    Engines judge iframe content the same way browsers do, and consider them to be part of the source page the iFrame pulls from (not the URL displaying the iFrame content).
  • #61 If the company you buy links from gets "busted" (discovered and penalized) by a search engine, the links you have from them will:

    Your Answer: Stop passing link value
    Correct Answer: Stop passing link value
    Since search engines don't want to give webmasters the ability to knock their competitors out with paid links, they will simply devalue the links they discover to be part of paid networks, such that they no longer pass value.
  • #62 Which of these queries would not have an "Instant Answer" or "Onebox Result" on Google?

    Your Answer: Best Chinese Restaurant in San Francisco
    Correct Answer: Best Chinese Restaurant in San Francisco
    Not surprisingly, the only correct answer is "Best Chinese Restaurant in San Francisco."
  • #63 Which major search engine serves advertising listings (paid search results) from the PPC program of one of the other major engines?

    Your Answer: Ask.com
    Correct Answer: Ask.com
    Ask.com is the only major engine that shows ad results from another engine - specifically, Google.
  • #64 Duplicate content is primarily an off-site issue, created through content licensing deals and copyright violations of scraped and re-published content, rather than a site-internal problem.

    Your Answer: FALSE
    Correct Answer: FALSE
    The answer is FALSE, as on-site duplicate content issues can be serious and cause plenty of problems in the search engines.
  • #65 Links from 'noindex, follow' pages are treated exactly the same as links from default ('index, follow') pages.

    Your Answer: TRUE
    Correct Answer: TRUE
    This is TRUE - according to Matt Cutts in a comment here, links on pages with "noindex, follow" are treated exactly the same as links from default ("index, follow") pages.
  • #66 Which metric is NOT used by the major search engines to measure relevance or popularity in their ranking algorithms?

    Your Answer: Keyword usage in the URL
    Correct Answer: Keyword density in text on the page
    Keyword density is the outlier here. Dr. Garcia explains why search engines don't use the metric here.
  • #67 If they have the same content, the Search Engines will consider example.com/avocado and example.com/avocado/ to be the same page.

    Your Answer: FALSE
    Correct Answer: TRUE
    The answer is TRUE, as engines don't consider the trailing slash to create a different page (examples here and here).
  • #68 Which Search Engines currently allow the 'nocontent' attribute?

    Your Answer: MSN
    Correct Answer: Yahoo!
    To date, only Yahoo! has implemented the nocontent parameter.
  • #69 In which of the following countries does Ask.com have the most significant percentage of search engine market share?

    Your Answer: United States
    Correct Answer: United States
    Surprisingly, the answer is the US, where Ask.com has an estimated 5% market share.
  • #70 For search engine rankings & traffic in Google & Yahoo!, it is generally better to have many, small, single topic focused sites with links spread out between them than one, large, inclusive site with all the links pointing to that single domain.

    Your Answer: FALSE
    Correct Answer: FALSE
    This is FALSE, primarily because the search engines' current algorithms place a great deal of weight on large, trusted domains rather than on small, niche sites.
  • #71 The 4 major search engines - Google, Yahoo!, MSN/Live and Ask serve what approximate percentage of all searches performed in the US?

    Your Answer: ~95%
    Correct Answer: ~95%
    According to nearly every study reported (including ComScore's), the four major networks, when AOL is included (serving Google results), provide ~95% of all searches in the US.
  • #72 The linkfromdomain operator displays what information and is available at which search engine(s)?

    Your Answer: Data on who is linking to a given website - available at Google & Yahoo!
    Correct Answer: Data on what websites are linked-to from a given domain - available at MSN/Live only
    As can be seen here, Microsoft/Live is the only engine to provide the command and it shows what pages are linked-to by a given domain.
  • #73 Which of the following social media websites is the least popular (as measured by active users & visitors)?

    Your Answer: Newsvine
    Correct Answer: Newsvine
    Newsvine is the smallest of the above, both in terms of traffic and users.
  • #74 Which of the following pieces of information is NOT available from current keyword research sources?

    Your Answer: Cost per click paid by PPC advertisers
    Correct Answer: Cost per click paid by PPC advertisers
    Since all of the current search engines have blind bid systems, the cost-per-click paid by advertisers is currently unavailable anywhere.
  • #75 The use of AJAX presents what common problem for search engines and websites?

    Your Answer: Web pages with AJAX frequently take too long to load, causing crawlers to abandon them
    Correct Answer: It creates multiple pages with unique content without enabling new, spiderable, linkable URLs
    The largest problem for search engines is that AJAX frequently "creates multiple pages with unique content without enabling new, spiderable, linkable URLs."
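    A common mitigation of that era is progressive enhancement: give every AJAX-loaded state a real, spiderable URL and let script intercept the click. A sketch only (loadTab is a hypothetical function):

        <!-- Engines follow the href; script-enabled browsers load the
             content in-page via AJAX instead -->
        <a href="/animals/crocodile/diet"
           onclick="loadTab('/animals/crocodile/diet'); return false;">Diet</a>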

Saturday, June 16, 2012

Google Analytics Quiz: Accounts & Profiles

Question 1 Which of the following would be an important reason for setting up a new Google Analytics account?

A. You need to track multiple websites.
B. You need different people to administer Google Analytics for different websites.
C. You need to separate mobile traffic data from desktop traffic data.
D. You need to import cost data from more than one Google AdWords account.


Answer Explanation: Answer B represents the best reason for creating a new account. Administrators have access to all profiles for all properties in an account. If an administrator for one website’s profiles should not have administrator access to the profiles for another website that you are tracking, you must create a separate Google Analytics account and assign admin roles accordingly within each account.

Answers A and C would not warrant a separate account. You can track multiple website properties within the same account, and you could set up profiles filtered to include or exclude mobile traffic within the same account.

Answer D would not require a separate account either. As a recently added capability, you can import cost data into a single Google Analytics account from more than one Google AdWords account.






Question 2 Without an associated AdWords account, what is the monthly pageview limit for a website in Google Analytics?


A. 1 million
B. 5 million
C. 10 million
D. 50 million


Answer Explanation: 10 million is the monthly pageview limit for a website in Google Analytics if you don’t have a Google AdWords account (on which you spend a daily minimum of US$1) associated with your Google Analytics login.



Question 3 You have admin access to a Google Analytics account. Which of the following conditions is required for you to provide one of your business associates with read-only access to the account? 


A. Your business associate must have a gmail address.


B. Your business associate must have a Google account.


C. Your business associate must already have admin access to another Google Analytics account.


D. You must grant your business associate read-only access on a profile-by-profile basis.




Answer Explanation: Answer B is correct; answer A is incorrect. For a Google Analytics administrator to grant either administrator or read-only access to another user, that user only needs a Google account, which may be associated with either a gmail email address or a non-gmail email address.


Answer D is also required for providing read-only access. As the Users tab appears under each profile name on the admin pages, creating a read-only user allows that user access to that profile only; the process would basically have to be repeated for each profile that the user would need to access. (Conversely, anyone you designate with the Account Administrator access type will be able to view, edit, and delete all profiles in your account.)


Answer C is incorrect. A user does not need access to any other Google Analytics account for you to add that user with either admin or read-only access.





Monday, August 22, 2011

Quiz For On Site



Question 1  You have a book selling site. If you have a keyword density of 7% for the keyword “book”, 6% for “reading” and 3% for “bestseller”, how does the following metatag relate to your keyword density:

a. The metatag is completely useless.
b. The metatag might mislead the search engines that we offer mainly American literature.
c. I would have never had such high keyword density, if it were not for the stuffing of this tag.
d. This metatag helps to point out that some of our bestsellers are the novels of Grisham and other American authors.
e. It might help to get some additional traffic without having to optimize the site for additional keywords like “Grisham” and the different genres.

Explanation: e. is correct. The keywords in the “Keywords” metatag could give a slight boost for minor keywords like authors and genres, but in any case this tag is not as important as the on-page keyword density and the anchor text of backlinks. On the other hand, if the keywords in this tag had nothing in common with the major keywords - for instance, if you decided to list the names of the most popular novels on your site - this could confuse the spiders, and you'd better avoid it.
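A hypothetical reconstruction of the kind of tag the question describes (the quiz's actual metatag is not reproduced above):

    <meta name="keywords" content="books, bestsellers, John Grisham, American literature, thrillers, novels">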


Question 2 What are “supplemental results”?

a. Paid listings that appear for a particular keyword.
b. Listings that do not have all the keywords in them.
c. Filtered results that are similar to the ones already shown.
d. Additional search results that will be served on your request.

Explanation: c. is correct. Supplemental results are filtered by search engines in order not to spam users with similar pages that exist on one site (or on several sites). That is why, even though supplemental results are retrieved together with the main results, they are viewable only if you choose to look at them.



Question 3 An often underestimated approach to attracting traffic is having an onsite search box. Which of the following statements related to onsite search boxes are true?

a. When you have an onsite search box, your visitors will get more search results than they would by using Google and the other search engines.
b. Using the “site:” operator in search engines makes onsite search boxes obsolete.
c. An onsite search box allows searching for pages that are inaccessible to unregistered users (i.e. search engine spiders).
d. If you have an onsite search box and see what users are searching for, you will be able to notice new keywords that are of interest to your audience.

Explanation: a., c. and d. are true because these are all advantages of onsite search boxes. b. is false because search engines generally do not index every single page of a site, even pages that are not disallowed in robots.txt and are not password-protected. A good onsite search box is really valuable, but occasionally onsite search boxes use such imprecise algorithms that the “site:” operator in Google gives much more reliable results.



Question 4 You have enough quality inbound links with the keywords in the anchor text but still you do not rank well in Google, even for your site name. What will you do? Check no more than 3 answers.

a. Try with variations of the anchor text.
b. Add more inbound links.
c. See if the targets on your site have not been damaged (i.e. renamed, or deleted, or moved) by mistake.
d. Check if you have more outbound than inbound links.
e. If inbound links don't help, add more keywords in other places.

Explanation: b. and e. are wrong. If you don't lack quality inbound links, adding some more will hardly help. It is true that quality backlinks are always valuable, but since you already have enough of them, something else must be wrong. e. is also wrong because you already have enough keywords in the anchor text, where they weigh most, so adding more keywords elsewhere will hardly help much (unless you don't have a single keyword anywhere else, of course, but this is unlikely if you are a good SEO expert). a., c. and d. are correct. a. is recommended by many experts as the solution when you are stuck with lower ratings because the anchor text pointing to you from different places is suspiciously the same. c. is one of the trickiest to guess because when you perform a backlink check, you see that you have many backlinks but get no hint that the target might not be in place. d. is correct because even if you have tons of quality inbound links, having many more outbound links (or being a link farm) can outweigh the inbound links.



Question 5 If you have a site that is targeted at a particular country only, which of the following are recommendable to do in order to rank well in country specific search results? Check all that apply.

a. Use the appropriate language attribute in the HTML code of the page.
b. The site should be hosted in the same country, so that its IP falls in the range of IPs that are specific for this country.
c. Have the site written in the language of the country.
d. Submit the site to local search engines.
e. Use geotargeting.

Explanation: Only e. is wrong. Geotargeting is used mostly with advertisements. It can be done with sites as well, but not for the purpose of ranking high with search engines. All the rest are correct, though none of them is mandatory. Additionally, search engines use their own algorithms to determine the language of a particular site. Some SEO experts say that no efforts on your side are necessary for ranking well in country-specific search results, but this is hardly so. If you want to see how your site qualifies in terms of IP, there are tools available to check which country an IP address belongs to.



Question 6 Which of the following ways can be used to hide files from search engines? Check all that apply.

a. Put a robots.txt file and write in it not to index particular pages.
b. Put the sensitive files in a separate directory.
c. Hide them with dynamic URLs.
d. Protect them with a password.
e. Put them on a separate server.

Explanation: a., d. and e. are the correct answers and e. is the best option because if the files are not physically accessible, they will not be indexed. Protecting them with a password is also good because search engines will not break the password to get inside the files but still there is a chance that the names of the files can appear in a directory listing. Most search engines do not violate your wishes in the robots.txt file, but this file was not meant to be a 100% secure way to avoid indexing, so for very sensitive information, use it with caution. Simply putting the files in a separate directory or “hiding” them with dynamic URLs has nothing to do with protection from being indexed.
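For the password-protection option, a sketch of HTTP Basic Auth on an Apache server (the .htpasswd path is hypothetical); spiders cannot authenticate, so the files behind it stay out of the index:

    # .htaccess placed in the directory to protect
    AuthType Basic
    AuthName "Restricted area"
    AuthUserFile /home/user/.htpasswd
    Require valid-user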



Question 7 In terms of Web marketing, what does the abbreviation PPC stand for?

a. Points Per Click
b. Pay Per Click
c. Paid Placement Cost
d. Paid Points Calculator

Explanation: b. is correct. Pay Per Click measures how much online advertisers must pay each time their advertisement is clicked on.


Question 8 If you were a SEO for a technology site, what would you do to build backlinks for it? Check all that apply.

a. Post in technology blogs.
b. Submit articles to technology ezines.
c. Buy 10,000 links on a link exchange site.
d. Exchange links with other reputable technology sites.
e. Provide free syndicated content to partner-sites.

Explanation: Of the practices listed above, the only one to avoid is c. The other four are quite legitimate and recommendable ways of building backlinks. There are other recommendable practices for building backlinks as well, but a., b., d. and e. are among the best.


Question 9 What are the advantages of submitting sites to search directories? Check all that apply.

a. Submitting to search directories increases your rating with search engines.
b. By submitting to a search directory, you get a backlink to your site.
c. By submitting to a search directory your site gets certified.
d. When your site is listed in search directories, this increases the chances that search engines will index it sooner, compared to when it is not listed.
e. Submitting to search directories is a good Web marketing initiative.

Explanation: Correct are b., d. and e., because submitting a site to search directories is a good Web marketing initiative that increases the chances of having your site indexed, and when your site is listed in a search directory, its URL is listed as well, so you actually get a valuable backlink. a. and c. are wrong because the sole fact of submitting your site to search directories neither gets you a higher ranking nor gets your site certified.



Question 10 How important are metatags today?

a. Not important at all.
b. They have only minor importance, which can be neglected.
c. Somehow important.
d. Very important, almost as much as keyword density.
e. They are the single most important SEO tool.

Explanation: The correct answer is c. Metatags are certainly not as important as keyword density but they still cannot be neglected completely.



For more explanation of these questions & On Page SEO Knowledge, Go through this article: http://googleseostrategy.blogspot.com/p/seo-hints-and-tips-how-to-optimize-web.html


Thursday, August 11, 2011

Quiz Related To Unethical SEO Techniques

Question 1 How important are Meta Tags today?

a. Not important at all.
b. They have only minor importance, which can be neglected.
c. Somehow important.
d. Very important, almost as much as keyword density.
e. They are the single most important SEO tool.

Explanation: The correct answer is c. Metatags are certainly not as important as keyword density but they still cannot be neglected completely.



Question 2 Which of the following techniques is unethical and can be a reason for banning?

a. Building backlinks.
b. Rewriting the titles to include the target keywords.
c. Creating a doorway page instead of a home page.
d. Rewriting dynamic URLs into static.

Explanation: The correct answer is c. because doorway pages are aimed at misleading search engines, while all the rest (a., b. and d.) are normal SEO practices.



Question 3 How long is the period of keeping sites sandboxed?

a. 5 days.
b. 4 weeks.
c. 3 months.
d. 1 year.
e. Not defined.

Explanation: e. is correct because there is no defined upper or lower time frame for keeping sites sandboxed. While you cannot control the period of being sandboxed, you can take steps to minimize the damage of sandboxing.



Question 4 What is the function of the Robots.txt file?

a. It shows how the page will be seen by robots.
b. It prevents robots from spidering the site.
c. It tells robots to come to the site.
d. It provides instructions to robots which pages and directories not to index.

Explanation: Correct is d. because robots.txt neither displays pages, nor invites robots to come. Also, robots.txt is not mandatory for search engines and that is why it cannot prevent them from spidering the site.



Question 5  When do you apply for reinclusion in a search engine's index?

a. When you have made changes to your site.
b. When you have changed your hosting provider and the IP address of your site.
c. After you have been banned from the search engine for black hat practices and you have corrected your wrongdoings.
d. When you are not happy with your current ratings.

Explanation: Correct is c. In all other cases, submitting a reinclusion request is either unnecessary (as in a. and b.), or even harmful (as in d.) because it can be regarded as spam. On the other hand, if you have been banned for black hat practices, it makes no sense to submit the site for reinclusion, if you have not corrected what was wrong. More about reinclusion can be found in the Reinclusion in Google article.



Question 6 You have enough quality inbound links with the keywords in the anchor text but still you do not rank well in Google, even for your site name. What will you do? Check no more than 3 answers.

a. Try with variations of the anchor text.
b. Add more inbound links.
c. See if the targets on your site have not been damaged (i.e. renamed, or deleted, or moved) by mistake.
d. Check if you have more outbound than inbound links.
e. If inbound links don't help, add more keywords in other places.

Explanation: b. and e. are wrong. If you don't lack quality inbound links, adding some more will hardly help. It is true that quality backlinks are always valuable, but since you already have enough of them, something else must be wrong. e. is also wrong because you already have enough keywords in the anchor text, where they weigh most, so adding more keywords elsewhere will hardly help much (unless you don't have a single keyword anywhere else, of course, but this is unlikely if you are a good SEO expert). a., c. and d. are correct. a. is recommended by many experts as the solution when you are stuck with lower ratings because the anchor text pointing to you from different places is suspiciously the same. c. is one of the trickiest to guess because when you perform a backlink check, you see that you have many backlinks but get no hint that the target might not be in place. d. is correct because even if you have tons of quality inbound links, having many more outbound links (or being a link farm) can outweigh the inbound links.



Question 7 Which of the following linking practices can be considered as unethical cross-linking? Check all that apply.

a. Cross-linking between sites that are owned by the same company.
b. When there are many links between sites with the same IP.
c. The sites that link to each other have similar content, though there are minor differences in the text.
d. Creating many links and using the rel="nofollow" attribute on them.
e. Each page on one site links to a page on the other site and the anchor text is the same.

Explanation: a. and d. are wrong. The fact that the linked sites have the same owner is not a reason to consider the linking unethical. d. is one way to ask search engines not to follow a link; it is useful when you have many organic links to the same page that are necessary for human visitors and you want to avoid being labeled a link farm. b. If the sites have the same IP, this most likely means they are hosted on the same server and are related to each other, which is a hint of black hat SEO practices. c. This is an obvious example of a lazy webmaster who has set up another site with the sole purpose of using it to link to the first one. e. looks very suspicious, though in some cases it could be the actual structure of a site; generally, avoid it because it will most likely get you into trouble.



Question 8 Google is gradually increasing the use of semantic analysis for determining search relevancy. Which of the following sentences about semantic analysis are true? Check all that apply.

a. Semantic analysis is a replacement of keywords.
b. Semantic analysis uses not only keywords but synonyms as well in order to determine relevancy.
c. If you have high keyword density (7-8% or more percent) but semantically the keywords measure very low (let's say, under 0.5%) you will not be listed in the top results.
d. If your site is semantically rich, backlinks become less important.

Explanation: a. and d. are wrong. Although Google has been using semantic analysis as part of its algorithm for many years, it has recently begun to increase its weight in the algorithm for calculating relevancy, and keywords have by no means become obsolete. Actually, semantic analysis relies on synonyms and words related to the search string in order to determine the relevance of a particular page, and since a keyword is a direct match of the search string, it can't be neglected. d. is also wrong because semantic analysis is currently still not the primary way of determining relevancy, so backlinks are still very important. b. is correct – it is close to the definition of semantic analysis. c. is also correct because the lack of synonyms and related words generally suggests that the text was not written with humans in mind but is aimed at search engines.



Question 9 If you have a site that is targeted at a particular country only, which of the following are recommendable to do in order to rank well in country specific search results? Check all that apply.

a. Use the appropriate language attribute in the HTML code of the page.
b. The site should be hosted in the same country, so that its IP falls in the range of IPs that are specific for this country.
c. Have the site written in the language of the country.
d. Submit the site to local search engines.
e. Use geotargeting.

Explanation: Only e. is wrong. Geotargeting is used mostly with advertisements. It can be done with sites as well, but not for the purpose of ranking high with search engines. All the rest are correct, though none of them is mandatory. Additionally, search engines use their own algorithms to determine the language of a particular site. Some SEO experts say that no efforts on your side are necessary for ranking well in country-specific search results, but this is hardly so. If you want to see how your site qualifies in terms of IP, there are tools available to check which country an IP address belongs to.



Question 10 You see that only 10% of the pages on your site are indexed by Google. What could be the reason? Check all that apply.

a. You have a robots.txt file and you have disallowed most of your site to be indexed.
b. You have many broken links.
c. Your keywords density is not good.
d. You are banned from Google.
e. You have many password protected sections.

Explanation: The correct answers are a., b. and e. If it was not your intention to exclude 90% of your site from being indexed, then you need to check whether you have made a mistake in listing the forbidden directories. Broken links cannot be followed, and therefore, if the pages they lead to are not accessible from an alternative link, there is no way for spiders to index those pages. The same applies to password-protected pages – if the spider can't reach them, there is no way to index them. c. is wrong because keyword density is related to search relevance, not to site indexing. d. is also wrong because if you were banned from Google, your site wouldn't have been indexed at all.

To Know More About:  Google Penalty, Robots.txt and On Page SEO Tips
