Google Caffeine Algorithm

Last month, Google announced the completion of Caffeine, its new system for indexing the web. Under the old system, Google rebuilt its index every few weeks. Google Caffeine instead analyzes the web in small portions and updates the index continuously, on a daily basis.


"Caffeine provides 50 percent fresher results for web searches than our last index." (Matt Cutts, in a talk about the Google Caffeine algorithm)


The main difference now is that how a crawled site ranks is based on the keywords it actually contains, to make sure searchers get the best results. Before, it was possible for a website about bananas to rank in search results for "mango". Such a rank can still appear temporarily until the site is fully indexed, but if the keyword tags on the site don't match the text on the page, the temporary rank will disappear.

Read more...

What is Yahoo Answers Posting?



Yahoo! Answers is a community question-and-answer (Q&A) site launched by Yahoo! on July 5, 2005. It allows users to submit questions for others to answer and to respond to questions raised by other users.

Ask:- Asking is a snap. Ask a question about any subject that matters to you, so that other people can give answers.

Answer:- Have an answer to a question? Share what you know and brighten someone's day.

Discover:- Browse answered questions and take advantage of the wealth of ideas and experiences that people have shared.


Get website traffic through Yahoo Answers. Yahoo Answers is free, and anyone can post a question and expect an answer within minutes. All information displayed on Yahoo Answers can be accessed by anyone and can also be republished in the format you want.

I knew exactly what he was doing wrong, so I asked him a simple question: "When you leave an answer on the site and leave your link as a resource, do you give them a reason to click on your link?"

First, spend some time reading the categories related to your site and see what kinds of questions are asked. What you are looking for are the questions people ask again and again. If people keep asking the same question, it means there is a need out there that is not being met elsewhere.

Next, write a short report on the subject that directly answers the question everybody keeps asking. When your answer itself is useful, people can see the value of your links: you'll get a lot more traffic if you actually offer them a reason to click through to your site.


Read more...

Targeted Search Engines

Targeted search engines, sometimes called topical search engines, are the most specific of them all. These search engines are very limited in scope, often focused on a single general topic such as medicine, a branch of science, travel, sports, or education.

Examples of targeted search engines include Citysearch, Yahoo! Travel, and music search sites. As with other kinds of search engines, ranking criteria vary from one engine to another.

When reviewing targeted search engines for SEO, note that many of them are much more specific than primary or secondary search engines. Seek out the targeted search engines that are relevant to your subject (such as education, pets, sports, or places).

Search Engine Market Share

[Chart: search engine market share ten years ago]

[Chart: search engine market share in 2010]

Read more...

What is Robots.txt?

Robots.txt is the file used to tell robots and crawlers what not to crawl on your site. The Robots Exclusion Protocol is a method that allows website administrators to indicate to visiting robots which parts of the site should not be visited. When a robot visits a website, it first checks for a robots.txt file in the site's root directory.

If it finds this file, it scans the contents to see which documents (files) it may retrieve. You can customize the robots.txt file to apply only to specific robots, and to deny access to specific folders or files. Here is a sample robots.txt that prevents all robots from visiting the entire site:

# Tells Scanning Robots Where They Are and Are Not Welcome
# User-agent:    can also specify by name; "*" is for everyone
# Disallow:    if this matches first part of requested path,
#            forget it

User-agent: *    # applies to all robots
Disallow: /      # disallow indexing of all pages

A record begins with one or more User-agent lines, stating which robots the record applies to, followed by Disallow (and Allow) directives for those robots. To decide whether access to a URL is allowed, a robot must try to match the URL path against the Allow and Disallow lines in the order they appear in the file. The first match found is used. If no match is found, the default assumption is that the URL is allowed.
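The matching behavior described above can be checked with Python's standard-library robots.txt parser; the crawler name and host below are only placeholders:

```python
from urllib.robotparser import RobotFileParser

# The sample rules above, fed directly to the parser.
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# With "Disallow: /", every URL is off-limits to every robot.
print(rp.can_fetch("MyCrawler", "http://www.example.com/index.html"))  # False
print(rp.can_fetch("MyCrawler", "http://www.example.com/"))            # False
```

Changing the rule to `Disallow:` (empty path) would allow everything, matching the default-allow behavior described above.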

Spiders and Robots Exclusion

Web robots are programs that automatically traverse the Web's hypertext structure by retrieving a document and then recursively retrieving all the documents it references. This page explains how you can control what these robots do while visiting your site.


Read more...

On Site SEO Factors

Some other on-page SEO factors in this SEO basics tutorial are the following:
1. Domain Name Extension

.gov sites seem to be given the highest status
.edu sites seem to be given a high status
.org sites seem to be given a high status
.com sites encompass most of the spam/crud sites, resulting in the need for the highest scrutiny/action by Google.
Perhaps one would do well with the new .info Top Level Domain (TLD).

2. File size


Try not to exceed a 100K page size (however, some pages, like this one, require larger files).
Smaller files, under 40K, are better (and lots of them).

3. Hyphens in the URL


The hyphen is the preferred method of providing a space where there can be no real space.
One or two hyphens = excellent for separating keywords (e.g. SEO-basics, seo-basics).
Four or more = bad; the URL starts to look like spam.
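As a sketch of the hyphen guideline, the following Python helper turns a page title into a hyphenated URL slug and caps the hyphen count; `slugify` and `max_hyphens` are illustrative names, not part of any SEO tool:

```python
import re

def slugify(title, max_hyphens=3):
    """Turn a page title into a hyphenated URL slug, keeping the
    hyphen count low enough that the URL doesn't look like spam."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    if slug.count("-") > max_hyphens:
        # Keep only the first few words rather than a spammy-looking chain.
        slug = "-".join(slug.split("-")[:max_hyphens + 1])
    return slug

print(slugify("SEO Basics"))                            # seo-basics
print(slugify("Free SEO Tutorials for SEO Beginners"))  # free-seo-tutorials-for
```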

4. Page Freshness

Google now updates parts of its index daily, so page freshness matters to the search engines.
Newer is best, for news, retail, or auction sites!
If your site covers daily events, remember to post updates for new events every day; otherwise the search engines may demote the site. Daily updates of new information are largely what earn the rank.

Google likes fresh pages. So do I.
Freshness: the amount of change in content
New pages: the ratio of old pages to new pages

5. Fresh Links

Google also calculates the freshness of links. Links surrounded by fresh, regularly updated content carry more weight, so link freshness plays an important role in pushing a website toward the top. We should therefore keep the links pointing to our important pages fresh.

6. Site Age

According to a Google patent, old is best: old is gold.
The age of a site plays an important role in its search engine ranking: the longer a site has been public, the more trust it earns. An older site has a better chance of ranking first in the search engines.

7. URL Length


Keep URLs short: stay well under the 2,000 characters allowed by IE; less than 100 characters is good, and less is better.

8. Site Size

Google likes big sites. Larger sites are presumed to be better funded, better organized, better constructed, and therefore better sites. Google loves large sites for various reasons, not all of them positive: this preference led to the emergence of machine-generated spam sites of 10,000 pages, built purely for size. Google has since cracked down and dumped millions of such pages from its index.


Read more...

What is Purpose and Importance of SEO?

  • Search Engine Optimization helps site owners meet their various online marketing objectives such as generating leads, sales or simply building awareness.
  • The main purpose of search engine optimization is to increase the traffic generated by a website. Websites are designed to be viewed by users and search engines can help achieve this goal.
  • Search Engine power should not be underestimated. It is one of the building blocks of the foundation of the Internet.
  • Search engine optimization aims to achieve the goal of getting more visitors to a website by helping it get higher rankings in the search engines. This simply means that search engine optimization's goal is to make a website appear on the first pages, if not the first page of a search done through the search engine.
  • A survey showed that 90% of all Internet users employ search engines to aid them in their Internet-related activities. Google, the dominant player in the search engine industry, generates 70% of all search-related Internet activity.

Functionality of SEO in Seo Basics Tutorial.

Variables that have been thought to affect ranks positively in the past.

• Keywords in the domain name
• Bolding keywords, e.g. (<b>wooden boats</b>)
• Using keywords in heading tags, e.g. (<H4>Electricalebook</H4>)
• Keywords closest to the top of the page.
• Keywords in the description tag.
• Keywords in the keywords tag.
• Keywords in the names of linked pages and in the linked words, e.g. (<a href="seobasicstutorial.blogspot.com"> Seo Basics Tutorial</a>)
• Keywords in alt tags.
• Keywords as names of images, e.g. (<img src="seobasicstutorial.gif" alt="seobasicstutorial">).
• Getting listings in Pay-Per-Click search engines like Google Adwords or Overture.


Read more...

White Hat SEO Techniques

In SEO Basics Tutorial, white hat SEO refers to those practices generally regarded as ethical. White hat SEO can also refer to a consultant or specialist who uses ethical practices to aid website owners in the design and maintenance of their web pages. White hat techniques conform to all search engine guidelines and never deceive (cheat). White hat SEO gives importance to users and prefers creating valuable content for them. White hat may take a long time to get a site indexed, but the results are long-lasting.

White Hat SEO Aims to make it easier for the search engines to tell when your site's content is relevant to a query. In most cases we do this by making a site genuinely (sincerely) more useful to its human visitors, so that they will recommend it to others by linking it from their own websites. The most important White Hat technique is simply to have a website worth recommending.



Use Original Related Content
The more original content (written directly for your website) you have on your website, the better. There are many reasons for this; one of the most direct is that more content means potentially more keywords, and more keywords mean better rankings.

Consistently Add New Content
Adding new content to your website is not only preferred by search engines but by users as well. (Blogs perform extremely well because of this.) Making minor updates to previous content is not considered as adding new content.

Big Keyword Text Size
Having keywords stand out from the rest of the text on a page places more importance on them, making them more valuable than keywords buried in the body text. The best practice is to use keywords within your headings (<H1>, <H2>, <H3>), as these are typically larger than the rest of the text on the page.

Stylize and Emphasize Keywords
Using effects such as bold, italics, and underlining to highlight keywords will also place more importance on them than the rest of the text on the page. However, it is important to do so within limits. Bolding an entire paragraph will have the opposite, potentially negative, effect.

Keep Content Updated
The newer the content on your website the better. We live in a fast paced environment and things change quickly, thus making some content outdated fairly quickly. Keep your content updated and fresh at all times.

Limit Content Length
Generally speaking users scan content online rather than read it. Producing three pages on one given subject rather than one long page will not only achieve better rankings but it will be easier to maintain user interest as well.

Use One Domain
Putting all of your content under one roof will help build better rankings, as it will not confuse search engines and visitors navigating from page to page. Moving visitors from one domain to another will set off a red flag.

Keep Your Code Clean
Table-based layouts and designs are bulky and a thing of the past. HTML and CSS layouts are short and clean, allowing search engines to get to your content quickly. Keep your code clean and well structured to allow search engines easier access.

Stay Away From Unlawful Content
Search engines will not put up with any unlawful use of copyrighted content nor will they promote any website that condones illegal behavior. Stay far away from any unlawful activities and do not raise any suspicions amongst search engines.

Do Not Hide Content
One sure way to get blocked from a search engine is to hide content. This is a black hat method of SEO accomplished by coloring text the same color as the background of a website so that only a search engine spider can find it.

Do Not Practice Any Cloaking
Cloaking, serving up different content to search engines than to actual users, is banned by search engines and is a sure-fire way to be blacklisted immediately.

Do Not Use Doorway Pages
Doorway pages were once popular however, search engines have caught on and know when you are pretending to be one thing when you are not. Stay away from doorway pages and take down any that you might currently have set up.

Read more...

What is Header Optimization?


Header Tags:

HTML headings are defined with the h1 to h6 tags. Don't confuse them with the <head> section of a document: the header tags discussed here belong in the BODY section of your webpage. Header tags mark up sections of body text and describe the overall content and meaning of the page. They play a key role in the on-page optimization process because they reflect what the whole page is about. When writing an H1 header tag, use primary keywords that reflect what the page contains.

Header tags come in six levels of HTML tags:

H1 Header Tag

Of these HTML header tags, the H1 tag is used as the main heading of a webpage, and search engines give more priority to the main header tag (H1) than to the other header tags.

H2 Header Tag

The H2 header tag is generally used for subheadings in HTML pages. H2 is the second most important heading in the page and receives more weight from search engines than the remaining header tags (H3, H4, H5, and H6). Using header tags appropriately also lets you increase the keyword density of the page.


Header Tags SEO Basics Tips:
  • If you're working in an HTML editing program such as Dreamweaver, you can use H1 header tags to highlight the primary keywords.
  • Mention your main keyword toward the top left of the web page.
  • The title/heading of the page should be placed in an H1 tag, and its length should be limited to 60-80 characters. The standard usage of the H1 tag is up to three times on the home page and once on each inner page.
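Putting the tips above together, a page from this tutorial might structure its headings like this (the headings and keywords are illustrative):

```html
<body>
  <!-- H1: the main heading, carrying the primary keyword -->
  <h1>SEO Basics Tutorial</h1>

  <!-- H2: subheadings for the major sections -->
  <h2>On Page Optimization</h2>
  <p>...</p>

  <h2>Off Page Optimization</h2>
  <!-- H3 and below for finer subdivisions -->
  <h3>Link Building</h3>
  <p>...</p>
</body>
```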


Read more...

How Do Search Engines Work?

In Seo Basics Tutorial, search engines work on a crawler-based model (both Google and Yahoo fall into this category). Each search engine has its own automated program, called a "spider" or "web crawler", that crawls the web. The main purpose of the spider is to crawl web pages, read and collect the content, and follow the links (internal and external). The spider then deposits the collected information into the search engine's database, called the index.

When searchers enter a query in the search box of a search engine, the search engine’s job is to find the most relevant results to the query by matching the search query to the information in its index.

What makes or breaks a search engine is how well it answers your question when you perform a search. The higher your page scores on these factors (yes, some factors are more important than others), the higher your page will appear in the search engine result pages.

Search engines use a concept known as keywords and phrases (as found in your page titles, headers, meta tags, and body text) to determine whether a particular URL is relevant to what a user is searching for. This is logical, since that is what the user sees; but webmasters, in their zeal to rank higher, have often overemphasized keywords, so today search engines place a higher importance on keywords included in anchor text links (links of the form <a href="http://www.yoururl.com">Anchor text</a>, where the anchor text is where your keywords go). Either way, the proper use of keywords on your site has much to do with your search engine rankings and the resultant traffic.

A keyword can be one word or a combination of several words which best describes what your web page is about or has to offer. This page, for instance, targets the terms SEO and keywords.

Given that there are over 25 million active websites as of this writing, it is unlikely that any of your web pages can be accurately described by a unique keyword or phrase. So when a search engine needs to decide which of the billions of pages in its index may be relevant to your particular search, it uses a variety of algorithms, primarily based on analysis of the word content of the sites it has indexed.
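The crawl, index, and match cycle described in this section can be sketched in a few lines of Python; the corpus and URLs are made up for illustration:

```python
from collections import defaultdict

# A toy "crawled" corpus: URL -> page text.
pages = {
    "example.com/boats": "wooden boats and boat building",
    "example.com/seo":   "seo basics tutorial for beginners",
    "example.com/paint": "painting wooden furniture",
}

# Indexing: every word points at the set of URLs that contain it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# Matching a query = intersecting the URL sets of its words.
def search(query):
    sets = [index[word] for word in query.lower().split()]
    return set.intersection(*sets) if sets else set()

print(search("wooden boats"))  # {'example.com/boats'}
```

Real engines add ranking on top of this matching step, which is where the keyword factors discussed above come in.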



Read more...

What are Meta Tags?


An element of HTML coding on a website that is used by search engines to index it. Most meta tags are included within the 'header' code of a website, and the most important are the title, description, and keywords tags. Rules used by different search engines govern how such tags are used, how many characters they should contain, and how they should be formatted. e.g.

<html>
<head>
<title>SEO Basics free SEO Tutorials for SEO Beginners</title>
<meta name="Keywords" content="Seo Basics, Seo Tutorials, Seo beginners">
<meta name="Description" content="SEO Basics free seo tutorials for seo beginners as How to optimize rank of website over the search engine by On page Optimization & Off page Optimization.">
<meta name="Robots" content="Index, Follow">
</head>
</html>

  •  What is Description Meta Tag?

An HTML tag that gives a general description of the contents of the page. This description is not displayed on the page itself, but is largely intended to help the search engines index the page correctly.

The DESCRIPTION Meta tag describes the web page to the search engines.

1) They read and index the text in the tag.

2) Some search engines grab the text from the DESCRIPTION tag and place it under the text from the TITLE tag, so searchers can read your description.

3) Sometimes search engines may not use the DESCRIPTION you provide. Instead, they may build a description from keyword-bearing body text on the page, or use the description from a standard web directory.

4) Some smaller search engines use the DESCRIPTION tag in their results.


Following are some rules about the DESCRIPTION tag:

1) The DESCRIPTION Meta tag is very important so you should use it in your site.

2) Place your DESCRIPTION tag immediately below the TITLE tags.

3) Create a nice keyworded description of up to 250 characters, including spaces.

4) Duplicate your important keywords once in the description but not more than that.
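The DESCRIPTION rules above can be checked mechanically. This sketch applies only the 250-character limit and the repeat-once rule; `check_description` is an illustrative name, not part of any SEO tool:

```python
def check_description(desc, keyword):
    """Apply the DESCRIPTION rules above; returns a list of problems."""
    problems = []
    if len(desc) > 250:  # rule 3: up to 250 characters, spaces included
        problems.append("description longer than 250 characters")
    if desc.lower().count(keyword.lower()) > 2:  # rule 4: repeat once, not more
        problems.append("keyword '%s' appears more than twice" % keyword)
    return problems

desc = ("SEO Basics: free SEO tutorials for SEO beginners covering "
        "on page and off page optimization.")
print(check_description(desc, "seo"))  # flags the third repetition of "seo"
```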


  •  What is Keyword Meta Tag?

This was important many years past, but this Meta tag is not so important these days. Some search engines may use it, but many don’t.

Following are some rules about the KEYWORDS tag:


1) Limit the tag to 300 characters, including spaces.

2) Separate keywords with either a comma (and no space) or a space (and no comma); don't use both a comma and a space together.

3) Make sure that most of the keywords in the tag are also in the body text.

4) Don’t use a lot of repetition.

5) Don’t use the same KEYWORDS tag in all your pages.

 
  • Using Other Meta tags
There are many other Meta tags. Not all of them are important, but some are useful, such as the following:


        <meta name="revisit-after" content="7 days">                                                                     
        <meta name="robots" content="index, follow">
            Or
        <meta name="robots" content="all">
-------------------------------------------------------------------
        <meta name="robots" content="noindex, nofollow">



Read more...

Search Engines Spamming Techniques


In SEO Basics Tutorial, the following are search engine spamming techniques that should be avoided.

Use of Invisible text:

Using invisible text is an extremely common search spamming practice in which a spammer uses a similar color for fonts as well as the background. Invisible text is used to stuff pages with keywords that are visible to search engines, but invisible to the viewers. All major search engines today can identify this kind of spamming easily and can penalize the site.

Stuffing keywords:

Keyword stuffing can be done in many ways. One most common way of doing this is by using invisible text. Other methods involve using keywords in very small fonts at the bottom of pages, using keywords in hidden tags (like no frames tag, alt tags, hidden value tags, option tags etc.) or stuffing the main content with repetitive keywords. Keyword stuffing is a trick that most search engines today are able to sniff out.
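Keyword density, the quantity stuffers abuse, is straightforward to compute; this sketch is illustrative and is not any search engine's actual spam filter:

```python
def keyword_density(text, keyword):
    """Share of the words in `text` that are `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A stuffed snippet: 4 of its 9 words are the same keyword.
stuffed = "cheap phones cheap phones buy cheap phones cheap deals"
print(round(keyword_density(stuffed, "cheap"), 2))  # 0.44
```

An abnormally high density like this is exactly the signal that lets engines sniff out stuffed pages.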

Link Spamming:

Link Spamming is the process of spamming search engines by getting thousands of inbound links from link farms, forums, blogs, free to all pages, unrelated websites or even by registering hundreds of domains and getting links from all of them generating a link empire. To put it in simple words link spamming is the process of getting inbound links through unethical practices solely for the purpose of ranking higher in search engines.

Most search engines give high level of importance to in-bound links and consider them as an indication that the site is credible. Participating in free for all and linking farms can get any site thousands of in-bound links, which can make them look important in the eyes of the search engines when they actually aren't. Most search engines today have come up with strict measures to deal with such kind of spamming and can even ban an involved website completely from their search listings.

Cloaking:

Simply put, cloaking is any process that involves presenting search engines with one set of information and visitors with another. The copy presented to the search engines is highly optimized, so the engine may rank the page higher than its real content deserves.

Creating Doorway Pages:

Doorway pages are pages that are highly optimized for search engines. They are similar to junk pages and contain nothing but keywords and irrelevant content. When a visitor lands on such a page, he is either automatically redirected or asked to click on a JavaScript link.

Not all doorway pages have to look this way. There are ways to provide good, related information, making the doorway page informative instead of a page that contains nothing but junk. Only when one tries to take shortcuts does one resort to creating junk pages instead of offering informational content.

Page redirects:

Often people create spam filled Web pages intended for the eyes of search engines only. When someone visits those pages, they are redirected to the real page by META refresh tags, CGI, Java, JavaScript, or server side techniques. There are legitimate reasons for cloaking and similar techniques, but don't use them unless you know exactly what you are doing.

Duplicate Content:

It's possible that content duplication is the single biggest SEO problem faced by websites today. Whether the duplication is intentional or not, presenting the same content to search engines under multiple URLs can cause a site to be ranked poorly or penalized. In some cases, it prevents indexing entirely.


Read more...

What is Forum Posting?

In Seo Basics Tutorial, forums are online communities. An interesting dimension of forums is that you can find one whose audience corresponds to the demographic profile you are looking for. Participating in a forum helps build your company's reputation by demonstrating your skills to forum members and making a positive impression with your capabilities.

A forum has a hierarchical, tree-like structure: a forum may contain a number of subforums, each of which may host a wide range of topics. Within a forum topic, each new discussion started is called a thread, and anyone who wants to can reply to it.


Guidelines to forum posting :
  • Post in topic specific forums – these are the hang out of most experts.
  • Ensure that the targeted forums are thematically relevant to the client’s site.
  • By principle we ensure that same messages are never posted on different forums.
  • It is our policy to always post the right information.
  • Why should you use our services?
  • To draw relevant traffic by posting in forums, leading to higher rankings in major search engines.
  • Our techniques assist in optimization of your forum revenues, with assured augmented Return on Investment.
  • Our team comprises professional and experienced writers who produce quality search engine optimized content.
  • To prevent duplication we regularly test the quality of our writers by probing into their style of writing and grammar usage.


Read more...

What is Social Bookmarking?

• Social bookmarking is a method for Internet users to store, organize, search and manage bookmarks of web pages on the Internet.

• These sites have millions of hits per day and help you not only get real traffic, but also to rank highly on all SE’s.

• Instant Traffic!

In a social bookmarking system, users save links to websites that they want to remember and/or share. These bookmarks are usually public, but they can also be saved privately, shared only with certain individuals or groups, shared only inside certain networks, or some other combination of public and private. People can usually browse these bookmarks chronologically, by category or tag, or via a search engine.

Advantages:-

A social bookmarking system has several advantages over traditional resource location and automated classification software such as search engines. All tag-based classification of Internet resources (such as websites) is done by human beings who understand the content of the resource, as opposed to software algorithms that attempt to determine its meaning. Moreover, people tend to find and bookmark web pages that have not yet been noticed or indexed by web crawlers. In addition, a social bookmarking system can rank a resource based on how frequently users have bookmarked it, which may be a more useful signal for end users than systems that rank resources by the number of external links pointing to them.
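Ranking resources by bookmark frequency, as described above, can be sketched like this (the users and URLs are made up for illustration):

```python
from collections import Counter

# Each entry is one (user, url) bookmark event.
bookmarks = [
    ("alice", "example.com/seo"),
    ("bob",   "example.com/seo"),
    ("carol", "example.com/seo"),
    ("bob",   "example.com/recipes"),
]

# Rank resources by how many times they were bookmarked.
counts = Counter(url for _, url in bookmarks)
for url, n in counts.most_common():
    print(url, n)
```

The most-bookmarked URL surfaces first, which is the human-curated signal the paragraph above contrasts with link-counting.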

For users, social bookmarking can be a useful way to access a consolidated set of bookmarks from different computers, to organize large numbers of bookmarks, and to share favorites with contacts. Libraries have found social bookmarking useful as a simple way to provide lists of informative links to patrons.



Read more...