The SEO Expert Quiz has 75 action-packed questions and takes 30 minutes to complete. You have nothing to lose and a lot of prestige to gain. Let the games begin! Score: 100%. Take the quiz at SEOmoz.
#1 Which of the following is the least important area in which to include your keyword(s)?
Your Answer: Meta Keywords
Correct Answer: Meta Keywords
The meta keywords tag is least important among these because search engines do not consider it in ranking calculations and it's never seen by visitors or searchers (unlike the meta description tag, which displays beneath listings in the SERPs).
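For reference, the two tags discussed above sit in a page's head section; the values here are purely illustrative:

```html
<head>
  <!-- Ignored by the engines for ranking purposes -->
  <meta name="keywords" content="crocodiles, wildlife, reptiles">
  <!-- Shown beneath your listing in the SERPs, so write it for searchers -->
  <meta name="description" content="Photos, facts and habitat information for crocodiles.">
</head>
```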
#2 Which of the following would be the best choice of URL structure (for both search engines and humans)?
Your Answer: www.wildlifeonline.com/animals/crocodile
Correct Answer: www.wildlifeonline.com/animals/crocodile
The best choice would be www.wildlifeonline.com/animals/crocodile - it provides the most semantic information, the best description of the content on the page and contains no parameters or subdomains that could cause issues at the engines. For more on URL structuring, see this post on SEOmoz.
#3 When linking to external websites, a good strategy to move up in the rankings is to use the keywords you're attempting to rank for on that page as the anchor text of the external-pointing links. For example, if you were attempting to rank a page for the phrase "hulk smash" you would want to use that phrase, "hulk smash" as the anchor text of a link pointing to a web page on another domain.
Your Answer: False
Correct Answer: False
The biggest problem with linking out to other websites with your targeted keyword phrases in the anchor text is that it creates additional competition for your page in the search results, as you give relevance through anchor text and link juice to a competing page on a competing site. Thus, FALSE is the correct answer.
#4 Which of the following is the best way to maximize the frequency with which your site/page is crawled by the search engines?
Your Answer: Frequently add new content
Correct Answer: Frequently add new content
Adding new content on a regular basis is the only one of the methods listed that will promote more frequent spidering and indexing. Directives like crawl-delay have never been shown to be effective (and aren't even supported by many of the major engines). The other "partially" correct answer would be to turn up crawl frequency inside Webmaster Central at Google, but this only works if Google wants to crawl your site more actively and is restricted from doing so.
#5 Which of the following is a legitimate technique to improve rankings & traffic from search engines?
Your Answer: Re-writing title tags on your pages to reflect high search volume, relevant keywords
Correct Answer: Re-writing title tags on your pages to reflect high search volume, relevant keywords
Of the choices, only the option to change title tags to reflect better keywords is a legitimate and effective SEO technique.
#6 Danny Sullivan is best known (in the field of web search) as:
Your Answer: A journalist and pundit who covers the field of web search
Correct Answer: A journalist and pundit who covers the field of web search
Although there's an answer we'd love to choose :), the correct answer is that Danny's a journalist and pundit on web search who currently operates the SearchEngineLand blog and runs the SearchMarketingExpo event series.
#7 Which of the following is the WORST criterion for estimating the value of a link to your page/site?
Your Answer: The ranking of the linking page for its targeted keywords
Correct Answer: The popularity of the domain on which the page is hosted according to Alexa
Since Alexa data is typically less useful than monkeys throwing darts at a laptop, that's the obvious choice for worst metric. The others can all contribute at least some valuable insight into the value a link might pass.
#8 How can Meta Description tags help with the practice of search engine optimization?
Your Answer: They serve as the copy that will entice searchers to click on your listing
Correct Answer: They serve as the copy that will entice searchers to click on your listing
The correct answer is that they serve as the copy in the SERPs and are thus valuable for influencing click-through rates.
#9 Which of the following content types is most easily crawled by the major web search engines (Google, Yahoo!, MSN/Live & Ask.com)?
Your Answer: XHTML
Correct Answer: XHTML
XHTML is the obvious choice as the other file types all create problems for search engine spiders.
#10 Which of the following sources is considered to be the best for acquiring competitive link data?
Your Answer: Yahoo!
Correct Answer: Yahoo!
Since Yahoo! is the only engine still providing in-depth, comprehensive link data for both sites and pages, it's the obvious choice. Link commands have been disabled at MSN, throttled at Google, never existed at Ask.com and provide only a tiny subset of data at Alexa.
#11 Which of the following site architecture issues MOST impedes the ability of search engine spiders to crawl a site?
Your Answer: Pages that require form submission to reach database content
Correct Answer: Pages that require form submission to reach database content
Since search engines will assume a site is crawlable if it has no robots.txt file, don't have any crawl-specific issues with paid links, can read iFrames perfectly well and are able to spider and index plenty of pages with multiple URL parameters, the correct answer is clear. Pages that require form submission effectively block spiders, as automated bots will not complete form submissions to attempt to discover web content.
#12 What is the generally accepted difference between SEO and SEM?
Your Answer: SEO focuses on organic/natural search rankings, SEM encompasses all aspects of search marketing
Correct Answer: SEO focuses on organic/natural search rankings, SEM encompasses all aspects of search marketing
SEO - Search Engine Optimization - refers to the practice of ranking pages in the organic results at the search engines. SEM - Search Engine Marketing - refers to all practices that leverage search engines for traffic, branding, advertising & marketing.
#13 Which of these is NOT generally considered to be a highly important factor for ranking for a particular search term?
Your Answer: Temporal relevance - the number and quality of links pointing to a page over a given time span
Correct Answer: HTML Validation (according to W3C standards) of a page
As this document would indicate, W3C validation is clearly the odd man out in this bunch.
#14 When creating a "flat architecture" for a site, you attempt to minimize what?
Your Answer: The number of links a search engine must follow to reach content pages
Correct Answer: The number of links a search engine must follow to reach content pages
Flat site architecture refers to the link structure of the site, and thus, the only answer is "the number of links a search engine must follow to reach content pages."
#15 In the search marketing industry, what is traditionally represented by this graph?
Your Answer: The "long tail" theory of keyword demand
Correct Answer: The "long tail" theory of keyword demand
The graph shown represents the long tail concept, which is most frequently applied to keyword demand in the search marketing world. The theory is explained in detail here.
#16 Which of the following is NOT a "best practice" for creating high quality title tags?
Your Answer: Include an exhaustive list of keywords
Correct Answer: Include an exhaustive list of keywords
Since all the rest are very good ideas for title tag optimization (see this post for more), the outlier is to include an exhaustive list of keywords. Title tags are meant to describe the content on the page and to target 1-2 keyword phrases in the search engines, and thus, it would be terribly unwise to stuff many terms/phrases into the tag.
#17 Which of the following character limits is the best choice to use when limiting the length of title tags (assuming you want those tags to fully display in the search results at the major engines)?
Your Answer: 65
Correct Answer: 65
As Google & Yahoo! both display between 62-68 characters (there appears to be some variance depending on both the country of origin of the search and the exact query), and MSN/Live hovers between 65-69, the best answer is... 65!
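As a rough illustration, a title tag that targets one or two phrases and stays comfortably under the ~65-character display limit (the wording here is made up, borrowing the wildlife example from question #2):

```html
<!-- Comfortably under the ~65-character display limit -->
<title>Crocodile Facts & Photos | Wildlife Online</title>
```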
#18 PageRank is so named because it was created by Larry Page, not because it ranks pages.
Your Answer: TRUE
Correct Answer: TRUE
As you can read on Google's fun facts page, PageRank was named for its co-creator, Larry.
#19 A page on your site that serves as a "sitemap," linking to other pages on your domain in an organized, list format, is important because...
Your Answer: It may help search engine crawlers to easily access many pages on your site
Correct Answer: It may help search engine crawlers to easily access many pages on your site
As none of the others are remotely true, the only correct answer is that a sitemap page may help search engine crawlers easily access many pages on your site, particularly if your link structure is otherwise problematic.
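A sitemap page of this kind is nothing more exotic than an organized list of internal links; a minimal sketch (the URLs reuse the hypothetical wildlife site from question #2):

```html
<ul>
  <li><a href="/animals/crocodile">Crocodile</a></li>
  <li><a href="/animals/alligator">Alligator</a></li>
  <li><a href="/animals/gharial">Gharial</a></li>
</ul>
```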
#20 Which of the following search engines patented the concept of "TrustRank" as a methodology for ranking web sites & pages?
Your Answer: Yahoo!
Correct Answer: Yahoo!
The correct answer comes via the patent guru himself, Bill Slawski, who notes that the citation most commonly pointed at regarding TrustRank is the paper Combating Web Spam with TrustRank (pdf). The authors listed on that paper are the named inventors on this Yahoo! patent application:

1. Link-based spam detection (20060095416)

The remaining four describe an expansion of the TrustRank process, referred to as dual TrustRank, which adds elements of the social graph to the use of TrustRank:

2. Using community annotations as anchortext (20060294085)
3. Realtime indexing and search in large, rapidly changing document collections (20060294086)
4. Trust propagation through both explicit and implicit social networks (20060294134)
5. Search engine with augmented relevance ranking by community participation (20070112761)
#21 Why are absolute (http://www.mysite.com/my-category) URLs better than relative ("/my-category") URLs for on-page internal linking?
Your Answer: They provide more keyword context for search engines judging internal links
Correct Answer: When scraped and copied on other domains, they provide a link back to the website
None of the answers makes sense except the one referring to scrapers, who often copy pages without changing links and will thus link back to your site, helping to reduce duplicate content issues and potentially providing some link value as well.
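The difference in practice, as illustrative markup: if a scraper copies a page's HTML verbatim, the absolute link still points back at your domain, while the relative one resolves against whatever domain the copy is hosted on:

```html
<!-- Absolute: survives scraping intact, still links to your site -->
<a href="http://www.mysite.com/my-category">My Category</a>

<!-- Relative: resolves to the scraper's domain when copied -->
<a href="/my-category">My Category</a>
```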
#22 How can you avoid the duplicate content problems that often accompany temporal pagination issues (where content moves down a page and from page to page, as is often seen in lists of articles, multi-page articles and blogs)?
Your Answer: Link to paginated pages with rel="nofollow" in the link tag
Correct Answer: Add a meta robots tag with "noindex, follow" to the paginated pages
The only method listed in the answers that's effective is to use "noindex, follow" on the paginated, non-canonical pages.
#23 If you update your site's URL structure to create new versions of your pages, what should you do with the old URLs?
Your Answer: 301 redirect them to the new URLs
Correct Answer: 301 redirect them to the new URLs
The correct move is to 301 the pages so they pass link juice and visitors to the new, proper locations.
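On an Apache server (an assumption - the mechanics vary by platform), a 301 can be set up with a single line in .htaccess; the paths and domain below are hypothetical:

```apache
# Permanently redirect the old URL to its new location
Redirect 301 /old-category/crocodile http://www.example.com/animals/crocodile
```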
#24 When you have multiple pages targeting the same keywords on a domain, which of the following is the best way to avoid keyword cannibalization?
Your Answer: Place links on all the secondary pages back to the page you most want ranking for the term/phrase using the primary keywords as the anchor text of those links
Correct Answer: Place links on all the secondary pages back to the page you most want ranking for the term/phrase using the primary keywords as the anchor text of those links
As this blog post explains, it's best to "place links on all the secondary pages back to the page you most want ranking for the term/phrase using the primary keywords as the anchor text of those links."
#25 The de-facto version of a page located on the primary URL you want associated with the content is known as:
Your Answer: Canonical Version
Correct Answer: Canonical Version
The only answer that is generally accepted in the search community is "canonical version."
#26 Which domain extensions are more often associated with greater trust and authority in the search engines?
Your Answer: .edu, .mil and .gov
Correct Answer: .edu, .mil and .gov
Although the search engines themselves have said there are no specific algorithmic elements that make domains from .gov, .edu and .mil more trustworthy or authoritative, these sites, due to the restriction of the TLD licensing, certainly have an association with more trust in webmasters' eyes (and, very often, the search results).
#27 High quality links to a site's homepage will help to increase the ranking ability of deeper pages on the same domain.
Your Answer: TRUE
Correct Answer: TRUE
The answer is "TRUE" as the properties of PageRank, domain trust, authority and many other search ranking factors will cause internal pages on a well-linked-to domain to rank more highly.
#28 The practice of showing one version of content on a URL to search engines, and another, different version to human visitors of the same URL is known as?
Your Answer: Cloaking
Correct Answer: Cloaking
As WebmasterWorld notes, this practice is called cloaking.
#29 Which HTTP server response code indicates a file that no longer exists? (File Not Found)
Your Answer: 404
Correct Answer: 404
The W3C standards for HTTP status codes tell us that 404 is the correct answer.
#30 Spammy sites or blogs begin linking to your site. What effect is this likely to have on your search engine rankings?
Your Answer: No effect - the search engines discount all spammy sites from passing link value, but do not penalize sites for receiving these links
Correct Answer: A very slight positive effect is most likely, as search engines are not perfectly able to discount the link value of all spammy sites
The correct answer is that a very slight positive effect is most likely. This is because search engines do NOT want to penalize for the acquisition of spammy links, as this would simply encourage sites to point low quality links at their competition in order to knock them out of the results. The slight positive effect is typical because not all engines are 100% perfect at removing the link value from spam.
#31 A link from a PageRank "3" page (according to the Google toolbar) hosted on a very strong, trusted domain can be more valuable than a link from a PageRank "4" page hosted on a weaker domain.
Your Answer: TRUE
Correct Answer: TRUE
Since toolbar PageRank is no longer an overwhelmingly strong factor influencing search rankings at Google these days, the answer is definitely "TRUE."
#32 What's the largest page size that Google's spider will crawl?
Your Answer: No set limit exists - Google may crawl very large pages if it believes them to be worthwhile
Correct Answer: No set limit exists - Google may crawl very large pages if it believes them to be worthwhile
As evidenced by the many very large (100K+) pages in Google's index, there is no set limit, and the search engine may spider unusually large documents if it feels the effort is warranted (particularly if many important links point to a page).
#33 Is it generally considered acceptable to have the same content resolve on both www and non-www URLs of a website?
Your Answer: No, this may cause negative indexing/ranking issues
Correct Answer: No, this may cause negative indexing/ranking issues
This is generally considered a bad idea, and may have negative effects if the search engines do not properly count links to both versions (the most common issue) or even view the two as duplicate, competing content (unlikely, though possible).
#34 Which HTTP server response code indicates a page that has been temporarily relocated and links to the old location will not pass influence to the new location?
Your Answer: 302
Correct Answer: 302
The W3C standards for HTTP status codes tell us that 302 is the correct answer.
#35 Which of these is least likely to have difficulty ranking for its targeted terms/phrases in Google?
Your Answer: A new domain that has received significant buzz and attention in the online and offline media, along with tens of thousands of natural links
Correct Answer: A new domain that has received significant buzz and attention in the online and offline media, along with tens of thousands of natural links
This is a tough question, and the answer is even somewhat debatable. However, as phrased, the MOST correct answer is almost certainly "a new domain that has received significant buzz and attention in the online and offline media, along with tens of thousands of natural links," as each of the other situations has many examples of sites having a very difficult time ranking well.
#36 What is the advantage of putting all of your important keywords in the Meta Keywords tag?
Your Answer: It increases relevance to Yahoo! and MSN/Live, although Google & Ask ignore it
Correct Answer: There is no specific advantage for search engines
The answer is that no advantage is conferred upon sites who include their terms in the meta keywords tag. For more on the subject, read Danny Sullivan's excellent post.
#37 Which of the following link building tactics do search engines tacitly endorse?
Your Answer: Link building via general, webmaster-focused directories
Correct Answer: Viral content creation & promotion
As representatives from each of the major engines have acknowledged publicly, viral content creation and promotion is viewed as a legitimate and preferred tactic for link acquisition.
#38 Which HTTP server response code indicates a page that has been permanently relocated and all links to the old page will pass their influence to the new page location?
Your Answer: 301
Correct Answer: 301
The W3C standards for HTTP status codes tell us that 301 is the correct answer.
#39 Which of the following factors is considered when search engines assign value to links?
Your Answer: The use of a proper link title in the HTML tag of the link, matching the anchor text of the link
Correct Answer: The date/time the link was created and the temporal relationship between that link's appearance and other time-sensitive criteria
The only one of these that search engines would consider (and have mentioned in patent applications like this one) is the temporal data.
#40 There is no apparent search engine rankings benefit to having a keyword-matched domain name (eg www.example.com for keyword "example").
Your Answer: TRUE
Correct Answer: FALSE
This is "FALSE," as many examples of keyword-targeted domains have been shown to have a phenomenal amount of ranking success in the engines, despite other factors not being nearly as strong as the competition.
#41 If you want a page to pass value through its links, but stay out of the search engines' indices, which of the following tags should you place in the header?
Your Answer: Use meta robots="noindex, follow"
Correct Answer: Use meta robots="noindex, follow"
As Google tells us here, the proper format would be to use meta robots="noindex, follow".
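Written out as actual HTML, that directive is a meta tag placed in the page's head:

```html
<!-- Keep this page out of the index, but let spiders follow
     (and pass value through) the links on it -->
<meta name="robots" content="noindex, follow">
```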
#42 Which of these factors is LEAST likely to decrease the value of a link?
Your Answer: The linked-to and linking sites are both hosted on the same IP address
Correct Answer: The linked-to domain has a link somewhere that points at the linking domain (each domain links to pages on the other's site)
The right answer is "the linked-to domain has a link somewhere that points at the linking domain (each domain links to pages on the other's site)." This is because despite the fact that these links are technically "reciprocal," they don't fit any pattern of penalization for such links (such as being listed on link-list-style pages). The search engines are least likely to devalue these because of all the natural patterns in which such linking occurs (blogrolls, news sites, forums, hobbyists, schools, etc.)
#43 Which of the following is a requirement for getting in the Google Local listings?
Your Answer: A telephone number with a prefix matching the claimed location
Correct Answer: A physical mail address in your claimed location
The only one that's a must-have is the physical mailing address.
#44 Which of the following engines offers paid inclusion services for their main web index (not advertising):
Your Answer: Ask
Correct Answer: Yahoo!
Currently, only Yahoo! offers paid inclusion through their search submit program.
#45 When is it advisable to leave the meta description off of a page?
Your Answer: When the page is targeted to so many keywords that writing a meta description might hurt the click-through rate from the search results
Correct Answer: When a large amount of pages exist and the options are between using a single meta description for all of the pages vs. leaving them with none at all
The correct answer is "when a large amount of pages exist and the options are between using a single meta description for all of the pages vs. leaving them with none at all." Duplicate meta description tags aren't the worst thing in the world, but they're certainly not providing any value and may have downsides from a duplicate content perspective (particularly if page content is very similar). Besides that, the other answers simply don't make sense :)
#46 A domain will not be hurt by having a penalized site or page 301'd to it.
Your Answer: FALSE
Correct Answer: TRUE
This is "TRUE," and has been tested by many a black hat. The danger here is that, once again, crafty spammers could use this technique to hurt their competitors if the search engines did penalize the receiving domain.
#47 Which of the following strategies is the best way to lift a page out of Google's supplemental index?
Your Answer: Link to it internally from strong pages
Correct Answer: Link to it internally from strong pages
As "supplemental" has been defined by engineers at Google as being a page with very little PageRank, the best way to lift it out, from the options given, is to link to it internally from strong pages.
#48 Which of the following is NOT speculated to be a contributing factor in achieving "breakout" site results in Google?
A sample of "breakout" results for the query, Comedy Central, at Google
Your Answer: Having an active AdWords campaign
Correct Answer: Having an active AdWords campaign
The only one that doesn't fit is the use of an AdWords campaign, which Google has said has no impact on organic listings.
#49 Which of the following is the best method to ensure that a page does not get crawled or indexed by a search engine?
Your Answer: Restrict the page using robots.txt
Correct Answer: Restrict the page using robots.txt
The clear best method above, and the one prescribed by the engines, is to use the robots.txt file to restrict access.
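A minimal robots.txt entry that blocks all crawlers from a single page (the path is hypothetical; the file lives at the root of the domain):

```text
User-agent: *
Disallow: /private-page.html
```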
#50 If you want to rank for a country specific TLD/Top-Level-Domain extension (such as Yahoo.jp or Google.ca) which of the following is NOT important?
Your Answer: Registering a domain extension that matches the targeted country (i.e. .nl for the Netherlands or .cn for China)
Correct Answer: Linking out only to other sites with the targeted TLD extension
Linking out only to other sites with the targeted TLD extension is certainly not a requirement nor a suggested method for inclusion into a country-specific search engine's results. See this recent video for more.
#51 Which of the following CANNOT get you penalized at the major search engines?
Your Answer: Using "nofollow" internally on your site to control the flow of link juice
Correct Answer: Using "nofollow" internally on your site to control the flow of link juice
As Matt Cutts has noted recently, using "nofollow" to sculpt the flow of link juice is perfectly acceptable.
#52 Which of the following directories had its ability to pass link value removed?
Your Answer: www.bluefind.org - The Bluefind Web Directory
Correct Answer: www.bluefind.org - The Bluefind Web Directory
Only Bluefind suffered this penalty - having had its ability to pass link value removed by Google, ostensibly for "selling PageRank."
#53 Which of the following is an acceptable way to show HTML text to search engines while creating a graphical image to display to users?
Your Answer: CSS layers - show the text on a layer underneath the image atop
Correct Answer: CSS image replacement - create a rule in the CSS file that replaces the text with an image based on a given class
The only method that's approved by search engines is to use CSS image replacement with the exact copy in both the image and the HTML text.
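A common image-replacement pattern along these lines (a sketch - the class name and image path are made up) keeps the same words in both the HTML text and the image it displays:

```css
/* The h1 contains the same words shown in the logo image */
h1.logo {
  background: url(/images/logo.png) no-repeat;
  width: 300px;
  height: 80px;
  text-indent: -9999px; /* push the HTML text off-screen */
  overflow: hidden;
}
```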
#54 For high-volume search phrases, the Search Engines usually will not differentiate between singular and plural versions of a term (eg "cell phone" vs. "cell phones" or "bird feeder" vs. "bird feeders").
Your Answer: FALSE
Correct Answer: FALSE
As we can see from searches on the various phrases - cell phone vs. cell phones and bird feeder vs. bird feeders - this is FALSE. There are clear differentiations.
#55 If your site is ranked in the #1 organic position for a given query, advertising in the top paid position for that search result will generally not produce an additional volume of search traffic.
Your Answer: FALSE
Correct Answer: FALSE
Research from several sources, including this eye-tracking research report from MarketingSherpa, indicates that the correct answer is FALSE. You get more traffic and click-throughs with both the top paid and organic results than either individually.
#56 What's likely to happen if multiple accounts on a single IP address vote up a story at Digg in a short time period?
Your Answer: Your accounts will be suspended
Correct Answer: Your accounts will be suspended
The most likely result, particularly if this is done multiple times, is to have the accounts suspended.
#57 Let's assume that you're running SEO for an auction website with many listings, sorted by categories and subcategories. To achieve the maximum search engine traffic benefit, what should you do with individual product/auction pages after the auction has expired and the product is no longer available?
Your Answer: 301 redirect them to the most appropriate category page associated with the product
Correct Answer: 301 redirect them to the most appropriate category page associated with the product
The "best" answer of the choices given is to 301 redirect the pages to the most appropriate category page associated with the product - this ensures that link value won't be lost, and visitors who come to the old page will get the best user experience as well.
#58 Which factor is most likely to decrease the ranking value of a link?
Your Answer: Comes from a page with many reciprocal and paid links
Correct Answer: Comes from a page with many reciprocal and paid links
All of the answers can provide significant link value except "comes from a page with many reciprocal and paid links," which is very likely to have a strong negative effect on the value of the link.
#59 Which of the following search engine and country combination does not represent the most popular search engine in that country?
Your Answer: Korea / Naver
Correct Answer: Japan / Yahoo
All of the pairs are correct except Japan, where Google appears to now have a dominant search market share, despite Yahoo! getting more web traffic and visits. See also this piece from Multilingual-Search.com.
#60 Where do search engines consider content inside an iFrame to be located?
Your Answer: Search engines cannot spider content in iFrames
Correct Answer: On the source page the iFrame pulls from
Engines judge iFrame content the same way browsers do, and consider it to be part of the source page the iFrame pulls from (not the URL displaying the iFrame content).
#61 If the company you buy links from gets "busted" (discovered and penalized) by a search engine, the links you have from them will:
Your Answer: Stop passing link value
Correct Answer: Stop passing link value
Since search engines don't want to give webmasters the ability to knock their competitors out with paid links, they will simply devalue the links they discover to be part of paid networks, such that they no longer pass value.
#62 Which of these queries would not have an "Instant Answer" or "Onebox Result" on Google?
Your Answer: Best Chinese Restaurant in San Francisco
Correct Answer: Best Chinese Restaurant in San Francisco
Not surprisingly, the only correct answer is "Best Chinese Restaurant in San Francisco."
#63 Which major search engine serves advertising listings (paid search results) from the PPC program of one of the other major engines?
Your Answer: Ask.com
Correct Answer: Ask.com
Ask.com is the only major engine that shows ad results from another engine - specifically, Google.
#64 Duplicate content is primarily an off-site issue, created through content licensing deals and copyright violations of scraped and re-published content, rather than a site-internal problem.
Your Answer: FALSE
Correct Answer: FALSE
The answer is FALSE, as on-site duplicate content issues can be serious and cause plenty of problems in the search engines.
#65 Links from 'noindex, follow' pages are treated exactly the same as links from default ('index, follow') pages.
Your Answer: TRUE
Correct Answer: TRUE
This is TRUE - according to Matt Cutts in a comment here, links on pages with "noindex, follow" are treated exactly the same as links from default ("index, follow") pages.
#66 Which metric is NOT used by the major search engines to measure relevance or popularity in their ranking algorithms?
Your Answer: Keyword usage in the URL
Correct Answer: Keyword density in text on the page
Keyword density is the outlier here. Dr. Garcia explains why search engines don't use the metric here.
#67 If they have the same content, the Search Engines will consider example.com/avocado and example.com/avocado/ to be the same page.
Your Answer: FALSE
Correct Answer: TRUE
When the content is identical, the engines typically canonicalize the trailing-slash and non-trailing-slash versions of a URL to a single page.
#68 Which Search Engines currently allow the 'nocontent' attribute?
Your Answer: MSN
Correct Answer: Yahoo!
To date, only Yahoo! has implemented the nocontent parameter.
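Yahoo!'s implementation takes the form of a class value (robots-nocontent) rather than a standalone attribute; a sketch of how boilerplate sections get marked:

```html
<!-- Yahoo!'s crawler will skip this block when judging page content -->
<div class="robots-nocontent">
  Navigation, legal text or other boilerplate the engine should ignore.
</div>
```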
#69 In which of the following countries does Ask.com have the most significant percentage of search engine market share?
Your Answer: United States
Correct Answer: United States
Surprisingly, the answer is the US, where Ask.com has an estimated 5% market share.
#70 For search engine rankings & traffic in Google & Yahoo!, it is generally better to have many, small, single topic focused sites with links spread out between them than one, large, inclusive site with all the links pointing to that single domain.
Your Answer: FALSE
Correct Answer: FALSE
This is FALSE, primarily because the search engines' current algorithms place a great deal of weight on large, trusted domains rather than small, niche sites.
#71 The 4 major search engines - Google, Yahoo!, MSN/Live and Ask serve what approximate percentage of all searches performed in the US?
Your Answer: ~95%
Correct Answer: ~95%
According to nearly every study reported (including ComScore's), the four major networks, when AOL is included (serving Google results), provide ~95% of all searches in the US.
#72 The linkfromdomain operator displays what information and is available at which search engine(s)?
Your Answer: Data on who is linking to a given website - available at Google & Yahoo!
Correct Answer: Data on what websites are linked-to from a given domain - available at MSN/Live only
As can be seen here, Microsoft/Live is the only engine to provide the command, and it shows what pages are linked-to by a given domain.
#73 Which of the following social media websites is the least popular (as measured by active users & visitors)?
Your Answer: Newsvine
Correct Answer: Newsvine
Newsvine is the smallest of the above, both in terms of traffic and users.
#74 Which of the following pieces of information is NOT available from current keyword research sources?
Your Answer: Cost per click paid by PPC advertisers
Correct Answer: Cost per click paid by PPC advertisers
Since all of the current search engines have blind bid systems, the cost-per-click paid by advertisers is currently unavailable anywhere.
#75 The use of AJAX presents what common problem for search engines and websites?
Your Answer: Web pages with AJAX frequently take too long to load, causing crawlers to abandon them
Correct Answer: It creates multiple pages with unique content without enabling new, spiderable, linkable URLs
The largest problem for search engines is that AJAX frequently "creates multiple pages with unique content without enabling new, spiderable, linkable URLs."