Search engines aim to build an index of all web pages using crawlers. These bots follow links (href and src attributes) and index the files they encounter. When you enter a search query, the engine retrieves the most relevant pages from its index and sorts them based on a number of ranking factors.
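As a minimal sketch, this is the kind of markup a crawler follows (the URLs are placeholders):

```html
<!-- Crawlers discover new URLs through href attributes... -->
<a href="https://example.com/blog/">Our blog</a>

<!-- ...and fetch embedded files through src attributes -->
<img src="https://example.com/images/logo.png" alt="Site logo">
```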
White-hat SEO refers to techniques that search engines recommend as part of good design; they produce long-term results. Black-hat SEO involves techniques that search engines disapprove of; these can get your site penalized or ultimately banned from the index.
Several techniques are considered black-hat, such as keyword stuffing, cloaking, hidden text and links, link farms and paid links.
There are more than 200 factors that are used to determine the relevancy of your pages. We don’t know all of them, but research has identified several important elements, including your link profile, the quality of your content and your domain authority.
This is probably one of the most common SEO-related questions. There are several causes that might prevent your site from showing up in the search results.
First of all, you should perform a site: search in Google (e.g. site:92west.com) to see if it has been indexed yet. If you see a list of pages, your site has been crawled and indexed by Google. If no results are returned, your site has not yet been indexed, or it may have been removed from the index for violating Google’s Webmaster Guidelines.
You can prevent a page from being indexed by Google with the robots.txt file or the robots meta tag. If you want to remove a page that has already been indexed, you can do so via Webmaster Tools > Optimization > Remove URLs.
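As a sketch, both approaches look like this (the /private/ path is a placeholder):

```
# robots.txt – ask all crawlers to stay out of a directory
User-agent: *
Disallow: /private/
```

```html
<!-- robots meta tag – ask search engines not to index this particular page -->
<meta name="robots" content="noindex">
```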
No. Search engines cannot crawl the content of password-protected pages, hence they cannot be indexed.
Google can index HTTPS pages (PayPal’s homepage, for example, is indexed as an HTTPS version).
Warning: this can sometimes cause duplicate content if both the HTTP and HTTPS versions are indexed.
Short answer: no.
Why not? Because the amount of traffic a page receives is not a ranking factor. Yes, high-traffic websites usually correlate with high rankings, but the high ranking is not a direct consequence of the traffic. Search engines don’t know how much traffic a website gets: they know how much traffic they sent to it, but not how much direct or referral traffic it receives, so they have incomplete data. And if you’re using Google Analytics on your site, Google can’t use that data due to privacy concerns.
So traffic doesn’t affect your ranking; your link profile, content, domain authority and so on do. Better rankings will in turn increase the number of visitors you get.
Sometimes you might see different search results than someone else. This is due to search personalization, which is based on your search history and is tracked via your Google+ profile or cookies. For this reason, SERPs can sometimes look slightly different from person to person.
Don’t worry though, search personalization can be disabled.
It’s impossible to appear in the search results if search engines don’t know your site exists, which is why a lot of people ask how they can get their new site crawled and indexed. There are several things you can do: submit your site via Google Webmaster Tools, create and submit an XML sitemap (a minimal sketch follows below), and earn links from sites that are already indexed.
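This is what a minimal XML sitemap might look like, assuming a single page at a placeholder domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled; date is a placeholder -->
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
</urlset>
```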
This is a rather tough question to answer. Improvements can be seen after a few days, weeks or even months. There are several factors to consider, such as the previous state of the website, the amount and type of changes, crawl rate, keyword competitiveness and so on. SEO is usually a long-term strategy that can take several months.
Yes, but only marginally.
Country-specific domain extensions (.de, .fr, .it...) can help you rank for local searches in that country. There is no ranking benefit to choosing one generic domain extension (.com, .net, .org...) over another.
Exact-match domains are, as the name suggests, domain names that match a certain keyword or phrase. For example, if you want to rank for the term ‘rare baseball cards’, you could buy the domain rarebaseballcards.com (an exact-match domain).
This type of domain used to be really powerful. However, in September 2012 Google released an algorithm update that reduced the rankings of low-quality exact-match domains in search results. I suggest choosing a short domain name that is easy to remember.
The most important aspect of a site migration is URL redirection. Use a 301 redirect to send visitors from the old pages to the new ones. You can notify Google of a site migration via Webmaster Tools (Configuration > Change of address).
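As a sketch, assuming an Apache server with mod_rewrite enabled and a placeholder newdomain.com, a site-wide 301 redirect could look like this:

```apache
# .htaccess on the old site – permanently redirect every URL to the new domain
RewriteEngine On
RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]
```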
When you write naturally, you automatically use both forms in your text. Remember that it’s important to provide your readers with quality content. Text that has been over-optimized for a single keyword (without variations, synonyms, etc.) will not be pleasant to read. Visitors come first, search engines second!
Long-tail traffic comes from very specific keywords. So instead of ‘photoshop tutorial’ (a short-tail keyword), you get traffic from ‘photoshop tutorial for photo manipulation’ (long tail). Long-tail keywords have less competition than short-tail keywords.
The meta keywords tag has become redundant, so you don’t have to add it. The meta description tag, on the other hand, is still useful to convince people to visit a page.
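For reference, the tag in question looks like this (the content text is just an example):

```html
<!-- The meta description often becomes the snippet shown beneath a result -->
<meta name="description" content="A short, compelling summary of the page that entices searchers to click.">
```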
Search engines have trouble reading certain content types (images, videos, Flash...). A search engine friendly design is a website that can easily be crawled and indexed.
Search engines prefer longer pieces of content (usually 300+ words), but that doesn’t mean shorter articles can’t perform well. It’s better to focus on the quality of your content than on its length.
There are three ways to optimize an image: give it a descriptive filename, add a descriptive alt attribute, and surround it with relevant text (a caption, for example). A short sketch follows below.
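Borrowing the ‘rare baseball cards’ example from earlier, an optimized image might look like this:

```html
<!-- Descriptive filename plus alt text tell search engines what the image shows -->
<img src="rare-baseball-cards.jpg" alt="Rare baseball cards from the 1950s">
```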
Duplicate content is exactly what you think it is: two pages with identical content. Google rarely shows both pages in the search results, so duplicate content can be problematic.
Duplicate content can be created intentionally (for example the print version of a page) or accidentally (because of URL parameters, inconsistent linking, etc.).
For example:
Absolute URL: http://website.com/blog/page.html
Relative URL: /blog/page.html
Relative URLs exclude the root domain of the site. When it comes to SEO, it’s best to use absolute URLs.
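In markup, using the example URLs above, the difference looks like this:

```html
<!-- Absolute URL: includes the protocol and root domain -->
<a href="http://website.com/blog/page.html">Read the post</a>

<!-- Relative URL: resolved against the domain of the current page -->
<a href="/blog/page.html">Read the post</a>
```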
It’s best to use hyphens (-). Using other punctuation marks, such as underscores (_) and plus signs (+) might confuse search engines. Using no separators at all should also be avoided.
It’s best to use subfolders, as this facilitates the correct indexation of your pages by search engines. If you have several unrelated products, you can use subdomains.
Google has come a long way when it comes to handling dynamic URLs. Most of the time it will interpret the various parameters correctly. However, you can help Google with this task via Google Webmaster Tools.
No, reciprocal links aren’t bad. However, one-way incoming links seem to be more valuable.
A link farm is a massive network of sites with a single purpose: linking to other websites. These types of links are considered to be a form of spam.
Nofollow is a link attribute that tells search engines not to pass any PageRank to the linked page. This means that a nofollow link will not improve your ranking.
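In markup (with a placeholder URL), a nofollow link looks like this:

```html
<!-- rel="nofollow" asks search engines not to pass PageRank through this link -->
<a href="http://example.com/" rel="nofollow">A link that passes no PageRank</a>
```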
The practice of buying links to improve a page’s ranking goes against Google’s Webmaster Guidelines. This is no surprise, because you’re basically manipulating Google’s algorithm, which is why the tactic is considered black-hat. The worst-case scenario is a penalty for your site. In reality it’s pretty hard for Google to discover paid links, but it’s best to be careful nevertheless.
In theory, competitors can harm your website by building a high volume of low-quality links to it, or links from malicious websites. Google developed a tool to counter these practices: the Disavow Tool, which allows you to tell Google which links to ‘ignore’.
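The tool accepts a plain text file; a minimal sketch with placeholder domains looks like this:

```
# Lines starting with # are comments
# Disavow every link from an entire domain
domain:spammy-link-farm.example
# Disavow a single linking page
http://malicious.example/bad-links.html
```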
Same answer as above: low-quality links pointed at a site in order to damage its rankings. Because these links are built to have a detrimental effect, the practice has earned the name ‘negative SEO’.