Search Engine Optimization

SEO is the process of improving the visibility of a website or a web page in a search engine's unpaid ("natural," "organic," or "algorithmic") search results.

Search Engines

Search engines remain popular—and users are more satisfied than ever with the quality of search results—but many are anxious about the collection of personal information by search engines and other websites.

Find The Right Company

Over 90 percent of all web surfers use search engines to find what they are looking for. If visitors are not finding your site, you may need search engine optimization (SEO) – whether you hire a company or do it yourself.

SEO LATEST UPDATES

Thursday, June 28, 2012

Competitive Analysis:

The main components of an SEO competitive analysis are:
  1. Top Keywords
  2. Site Structure 
  3. Linking Initiatives / Authority
  4. Social Presence
1. Top Keywords:
SEMRush is helpful for identifying “top keywords.” This tool saves loads of time by helping you determine the valuable traffic that you, or your competitors, are getting across many keywords. It estimates each keyword's value based upon the cost per click you would pay if you were to buy this traffic via AdWords, and shows a total estimated value by keyword, as well as a running estimated total value for all organic search traffic.

The folks at SEMRush have also been very good about launching their program for several countries (including the U.S., UK, Russia, France, Brazil, and Australia, to name a few), and I want to thank them for that. Nothing will help sell SEO to the C-level folks more than seeing the dollar value of organic search traffic for top competitors. I typically use this valuation as a method of determining the “real” competitors for our SEO efforts.

Once you know some of the valuable keywords worth targeting, you can put those on your “hit list” and continue on with the analysis (SEMRush is exportable to CSV and Excel).

After you’ve identified those competitors that are the most relevant to your business and driving the most amount of valued organic traffic, it’s time to begin the process of peeling back the reasons why their website may have a stronger presence in the search engines than yours.
 
2. Site Structure:

Some of the same competitive analysis items from three years ago remain worth consideration today. Namely, you want to check how many pages these websites have indexed in Google and Bing (I would have mentioned that you should use Yahoo Site Explorer, but, sadly, that will be going away soon).

Simply use the site: operator (site:www.example.com) and check indexation. Though these are estimates (and some might say “bad estimates”), they can give you a general sense of the depth of a competitor’s website and pages of content. Nowadays, I’m beginning to use Screaming Frog for this.

For today’s SEO, you will want to check out the indexing of other digital assets to find areas of opportunity:

    Check indexing of images and videos.
    Look for the competitors' Facebook/Twitter profiles, and identify what their “winning content” is (video, shopping feeds, local/map listings, and news/PR).
    Check whether competitors are operating a blog (and how they have this set up, whether it’s on a separate domain, subdomain or sub-directory).
    Look at the competitor’s titles, H1s, meta descriptions, meta keywords (all of which can be pulled using Screaming Frog).
    Look at keyword density on ranking pages.
    Evaluate URL structures, to see if “top competitors” are pushing content as close as they can to the root (www.example.com/page-name-here) or whether they’re letting the website’s structure determine the file depth of a given page (www.example.com/category/page-name-here).
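The on-page items in the checklist above (titles, meta descriptions, H1s) are what a tool like Screaming Frog pulls for you in bulk. As a rough illustration of what that extraction involves, here is a minimal sketch using Python's standard-library HTML parser on an invented sample page (the page content is made up for the example):

```python
from html.parser import HTMLParser

class OnPageTagParser(HTMLParser):
    """Collects the <title>, meta description, and <h1> text from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self.meta_description = ""
        self._current = None  # tag whose text we are currently capturing

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

sample = """<html><head><title>Blue Widgets - Example Co</title>
<meta name="description" content="Buy blue widgets online."></head>
<body><h1>Blue Widgets</h1></body></html>"""

parser = OnPageTagParser()
parser.feed(sample)
print(parser.title)             # Blue Widgets - Example Co
print(parser.meta_description)  # Buy blue widgets online.
print(parser.h1)                # Blue Widgets
```

Run this over each competitor URL's HTML and you have the raw material for the comparison.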

3. Linking Initiatives / Authority:

Back in 2008, I might have wanted to find out who has been most successful at gaining the most keyword-rich, anchor text backlinks pointing to specific pages. Linking has certainly changed a lot in a relatively short period of time.

Nowadays, when you’re analyzing competitors that are doing well in the SERPs, analyzing anchor text usage is of interest, but also notice the mix of branded links versus keyword-rich links. Look at domain and page authority, and the number of unique linking domains (how many different websites are linking to the competitor’s site).

Analyze the diversity of the links. See if links are coming from press releases, article syndication, social bookmarking, widgets, or what else they may be doing to be successful. While several tools track links, I tend to gravitate to OpenSiteExplorer and Majestic SEO.

4. Social Presence:

To me, the foundation of a social presence is a blog. See how the competitor has set up their blog, the categorization of the blog (what are they writing about; do you need to develop an editorial calendar of your own?), and see how they are developing community for their social content.

Is your competition developing strictly textual content, or are they involved in infographics and/or video?

See whether these competitors are engaged with Facebook, Twitter, and YouTube. Check how many Facebook Page likes, homepage likes, Facebook comments, Twitter followers and tweets, and video views each competitor shows, to get a sense of how engaged they are in social and whether they use these channels synergistically with their SEO efforts.

Suffice it to say, an SEO competitive analysis today is a bit different than you might have pulled together just a few short years ago. And frankly, we’ve only scratched the surface.

Hopefully this article was helpful for those of you who are trying to determine the best way to identify “best practices,” so that you have a good sense as to what you need to do to be successful with your own SEO efforts.

Keyword Analysis

            Keyword research is one of the most important, valuable, and high return activities in the search marketing field. Ranking for the "right" keywords can make or break your website. Through the detective work of puzzling out your market's keyword demand, you not only learn which terms and phrases to target with SEO, but also learn more about your customers as a whole.


HOW TO JUDGE THE VALUE OF A KEYWORD?
 
How much is a keyword worth to your website? If you own an online shoe store, do you make more sales from visitors searching for "brown shoes" or "black boots?" The keywords visitors type into search engines are often available to webmasters, and keyword research tools allow us to find this information. However, those tools cannot show us directly how valuable it is to receive traffic from those searches. To understand the value of a keyword, we need to understand our own websites, make some hypotheses, test, and repeat - the classic web marketing formula.

Companies often choose the wrong keywords because they pick keywords based on how they — rather than their customers — think about or relate to their products or services. Sound SEO strategy involves performing a thorough keyword analysis — a check to determine which keywords people use to search the Web. Without this component of SEO, you will just be spinning your wheels trying to gain traction in a highly competitive marketplace.

Search for the term/phrase in the major engines:
Understanding which websites already rank for your keyword gives you valuable insight into the competition, and also how hard it will be to rank for the given term. Are there search advertisements running along the top and right-hand side of the organic results? Typically, many search ads mean a high-value keyword, and multiple search ads above the organic results often mean a highly lucrative and directly conversion-prone keyword.


Understanding the Long Tail of Keyword Demand:
  Going back to our online shoe store example, it would be great to rank #1 for the keyword "shoes" - or would it?
It's wonderful to deal with keywords that have 5,000 searches a day, or even 500 searches a day, but in reality, these "popular" search terms actually make up less than 30% of the searches performed on the web. The remaining 70% lie in what's called the "long tail" of search. The long tail contains hundreds of millions of unique searches that might be conducted a few times in any given day, but, when taken together, they comprise the majority of the world's demand for information through search engines.
Another lesson search marketers have learned is that long tail keywords often convert better, because they catch people later in the buying/conversion cycle. A person searching for "shoes" is probably browsing, and not ready to buy. On the other hand, someone searching for "best price on Air Jordan size 12" practically has their wallet out!
Understanding the search demand curve is critical. To the right we've included a sample keyword demand curve, illustrating the small number of queries sending larger amounts of traffic alongside the volume of less-searched terms and phrases that bring the bulk of our search referrals.
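The head-versus-tail arithmetic above can be sketched with toy numbers. The keyword names and volumes below are invented for illustration, not real search data; they just show how a huge number of rarely-searched terms outweighs a few popular ones:

```python
# Toy illustration of the search demand curve: a few "head" terms each get
# many searches, but the vast number of rare "tail" terms adds up to more.
head_terms = {"shoes": 5000, "boots": 3000, "sandals": 2000}      # popular queries
tail_terms = {f"long tail query {i}": 5 for i in range(5000)}     # rare queries

head_volume = sum(head_terms.values())   # 10,000 searches/day
tail_volume = sum(tail_terms.values())   # 25,000 searches/day

print(tail_volume > head_volume)  # True: the tail carries the bulk of demand
print(round(tail_volume / (head_volume + tail_volume), 2))  # roughly the 70% cited above
```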

Saturday, June 16, 2012

Website Content Optimization

We perform the website content optimization process to ensure your website is search engine friendly and fully optimized to obtain proper search engine relevancy scoring and first page placement. Our process includes keyword or key phrase discovery on a per-page basis, content and page structure implementation, keyword or key phrase density implementation, internal linking and more!

There are aspects of the search engine optimization process that gain and lose importance. Website content optimization is no exception to this. Through the many algorithm changes that take place each year, the weight given to the content on your pages rises and falls.

The goal for search engine optimization or marketing implementation is to build and optimize a website that will rank well on the major search engines and, more difficult and far more important, hold those rankings through changes in the search engine algorithms.

While there are many characteristics of your website content that are in the algorithmic calculations, there are a few that consistently hold relatively high priority and thus will be the focus of this article. These are:
  1. Heading Tags
  2. Special Text (bold, colored, etc.)
  3. Inline Text Links
  4. Keyword Density
Website Content Optimization & Heading Tags

The heading tag is code used to indicate to the visitor and to the search engines the topic of your page and/or its subsections. You have six predefined heading tags to work with, ranging from <H1> to <H6>.

By default these tags appear larger than standard text in a browser and are bold. These aspects can be adjusted using the font tags or by using Cascading Style Sheets (CSS).

Website Content Optimization & Special Text

"Special text" (as it is used here) is any content on your page that is set to stand out from the rest. This includes bold, underlined, colored, highlighted, resized and italic text. Such text is given more weight than standard content, and rightfully so. Bold text, for example, is generally used to define sub-headings (see above), or to pull content out on a page to ensure the visitor reads it. The same can be said for the other "special text" definitions.

Search engines have thus been programmed to read this text as more important than the rest of the content and will give it increased relevancy weight for scoring. Used well, special text serves two purposes: the first is to draw the eye to these words and further reinforce your brand; the second is to add ranking weight to the phrases you emphasize. It can do both effectively.

Website Content Optimization & Inline Text Links

Inline text links are links placed directly within the body text of your content.

Like special text this serves two purposes. The first is to give the reader a quick and easy way to find the information you are referring to. The second purpose of this technique is to give added weight to this phrase for the page on which the link is located and also to give weight to the target page.

Website Content Optimization & Keyword Density
Keyword density is the percentage of your total content that is made up of your targeted keywords or key phrases. There is much debate in forums, SEO chat rooms and the like as to what the "optimal" keyword density might be. Estimates seem to range from 3% to 10%.
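As a sketch of how keyword density is typically computed (the words belonging to occurrences of your phrase, divided by the total words on the page), here is a small Python function. The sample copy is invented, and deliberately over-optimized so the result lands well above the 3% to 10% range discussed here:

```python
import re

def keyword_density(text, phrase):
    """Percentage of total words that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words) if words else 0.0

copy = ("Brown shoes for every occasion. Our brown shoes are handmade, "
        "and every pair of brown shoes ships free.")
# Well above the commonly cited 3-10% range - a sign of over-optimization.
print(round(keyword_density(copy, "brown shoes"), 1))  # 33.3
```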

With this in mind there are three points that you should consider:

    You do not work for Google or Yahoo! or any of the other major search engines (and if you do, you're not the target audience of this article). You will never know 100% what this "magic number" is.
    Even if you did know what the optimal keyword density was today, would you still know it after the next update? Like other aspects of the search engine algorithm, optimal keyword densities change. You will be chasing smoke if you try to constantly have the optimal density and chances are you will hinder your efforts more than help by constantly changing the densities of your site.
    The optimal keyword density for one search engine is not the same as it is for another. Chasing the density of one may very well ruin your efforts on another.

Website Content Optimization Warnings

In an effort to increase keyword densities, unethical webmasters will often use tactics such as hidden text, extremely small font sizes, and other tricks that hide text from a live visitor while still providing it to the search engines. Take this advice: write quality content, word it well, and pay close attention to your phrasing, and you will do well. Use unethical tactics and your website may rank well in the short term, but once one of your competitors realizes what you're doing you will be reported, and your website may very well be penalized. Additionally, if a visitor realizes that you're simply "tricking" the search engines, they may very well decide that you are not the type of company they want to deal with!

What are meta tags?



Meta tags are HTML codes inserted into the head section of a web page, after the title tag. They take a variety of forms and serve a variety of purposes, but in the context of search engine optimization, when people refer to meta tags they usually mean the meta description tag and the meta keywords tag.
The meta description tag and the meta keywords tag were proposed so that webmasters would have a consistent method for providing meta document data to user agents, such as search engines. Unfortunately, so many unscrupulous webmasters have abused these tags that search engines have had to de-emphasize their importance.

What are meta descriptions?

Though meta description tags are not a major factor search engines consider when ranking sites, they should not be left off the page. Both the meta keywords tag and the meta description tag contribute to your search engine ranking, and the meta description tag influences the likelihood that a person who sees your listing on the search engine results page will actually click through and visit your site.

The meta description tag is intended to be a brief and concise summary of your page's content. Think of the Yahoo! directory: you see your site title followed by a brief description of your site or business. The meta description tag is designed to provide just such a brief description of your site, which can be used by search engines or directories. The meta description tag takes the following form:

<meta name="description" content="Brief description of the contents of your page.">

When you write a meta description tag, you should limit it to 170 characters, or 200 at most. Pick a style and be consistent throughout your pages, writing a unique description for each page of your site. The Open Directory (DMOZ) also publishes a detailed guide to writing descriptions in its style. The key is that you want your description to adhere to W3C standards and be relevant to the content of the page. Again, it is intended to provide a brief summary of the contents of the page.
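A quick way to keep every description inside that character budget is a simple length check. This sketch assumes the 170-character guideline above; the sample description is invented:

```python
def check_meta_description(description, max_chars=170):
    """Flag descriptions that exceed the suggested character budget."""
    length = len(description)
    return {"length": length, "ok": length <= max_chars}

desc = "Handmade brown shoes and black boots, shipped free across the US."
print(check_meta_description(desc))  # {'length': 65, 'ok': True}
```

Running this over every page's description during a site audit catches the over-long ones before the search engines truncate them.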

What are meta keywords?


A meta keywords tag is supposed to be a brief and concise list of the most important themes of your page. Like the meta description tag, it is not a major ranking factor, but it should not be left off the page. The meta keywords tag takes the following form:

<meta name="keywords" content="keywords, keyword, keyword phrase, etc.">

When you write a meta keywords list, start by scanning the copy on your page. Make a list of the most important terms you see on the page, then read through the list and pick the 10 or 15 terms that most accurately describe the content. If you can't narrow your keyword list down to 10-15 keywords, then the content on your page may be rambling too far afield. Because of the hyper-competitiveness of the current search engine placement landscape, pages need to be very focused on one or two specific keyword phrases in order to have a chance at a top ten placement. For example, a page about northern Michigan apples and central Florida oranges doesn't have much of a chance to win for either "northern Michigan apples" or "central Florida oranges." To have any chance to win, you need one page about northern Michigan apples and another about central Florida oranges.
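The scan-the-copy step can be roughed out in code: count term frequencies, drop common stopwords, and keep the most frequent terms. The stopword list and sample copy below are minimal placeholders, not a complete solution:

```python
from collections import Counter
import re

# A tiny stopword list for illustration; a real one would be much longer.
STOPWORDS = {"the", "a", "and", "for", "of", "to", "in", "on", "is", "are"}

def top_terms(page_copy, limit=15):
    """Return the most frequent non-stopword terms on the page."""
    words = re.findall(r"[a-z']+", page_copy.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [term for term, _ in counts.most_common(limit)]

copy = ("Northern Michigan apples are crisp. Our Michigan apples ship "
        "fresh from the orchard, and apples make great gifts.")
print(top_terms(copy, 5))
```

If the top of that list does not match what you want the page to rank for, the copy itself needs work before the meta keywords do.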

Another example: If your page is a list of exercise or fitness tips, and on the page you list tips for things to do before, during, and after a workout, then you need to think to yourself, "what 10 or 15 words or phrases is this page MOST about?" Just because your page mentions dieting in the text doesn't mean that the page is about dieting. If you want to win for dieting, then create a page about dieting. The ultimate example of a page which is focused and ready for search engine optimization is a page from an encyclopedia. Each page is brief, focused, and has just one theme.






Friday, June 15, 2012

On-page SEO Techniques

What is a Title Tag?

A title tag is a piece of HTML code that describes a specific web page's content and is matched against the keyword queries people type into search engines. Title tags are a very important guide for all search engines in determining what is in the content of a specific web page. Creating a relevant title tag is one of the most important variables in achieving high search engine positioning.

View your Title Tag

You can view the title tag in the source code of a web page or at the top of your browser window. To view it through the source code, simply click View in your browser's menu, then click Source and open the window to full view. You will find the title tag in the head section of the source code, along with your meta description tag and meta keyword tag.

Here is an example of how a title tag is placed within the website coding:

<head><title>Brief, Keyword-Rich Description of the Page - Company Name</title></head>

What should your Title Tag contain?

Consider your title tag a representation of the core keywords for your most important services and offerings. Depending on what product or service you provide, make sure your core keywords stand out in the title of your index page before anything else. If you are promoting several products and services, always remember to utilize the title tag on each individual URL for its specific keyword(s) and keyword phrases.

You can also brand your company's name through the title tag. By including your company name you can build recognition for your company around the specific products and services you offer; with enough popularity, the name of your company can become a regular search term within the search engines.

How long should your Title Tag be?

Given the importance of a title tag when optimizing a site, it is important to consider the number of characters included within this tag. In the opinion of Keyword Performance, title tags should span no less than six words and no more than twelve, which results in a range of roughly fifty to eighty characters (including spaces, hyphens, commas, etc.). This allows the search engine to utilize the information effectively. You do not want to overdo the keyword density by cramming in too much information that a search engine may pass over.
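Those word and character guidelines are easy to turn into a quick check. This sketch hard-codes the six-to-twelve-word, fifty-to-eighty-character range suggested above; the sample title is invented:

```python
def check_title_tag(title):
    """Check a title against the ~6-12 word / 50-80 character guideline."""
    words = len(title.split())
    chars = len(title)  # spaces, hyphens, and commas all count
    return {
        "words": words,
        "chars": chars,
        "ok": 6 <= words <= 12 and 50 <= chars <= 80,
    }

print(check_title_tag("Handmade Brown Shoes and Black Boots - Free US Shipping"))
```

A one-word title like "Shoes" fails the check, while a descriptive ten-word title passes.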

Making a title tag long is not necessarily a bad thing, but if you offer multiple products or services, do not feel you have to cram all the keywords into one tag for fear of something being missed by a search engine. You have other web pages built into your site with the specifics, and you can utilize the title tag for that specific offering on that particular page. Each search engine reads only so far through a title tag anyway, as you can see when you pull up a search. For the average person using a search engine, too many phrases can be confusing in the way the search result is displayed. Remember: building a strong title tag will increase your visibility on multiple search engines.

Keyword Performance hopes you have found this information helpful in building an effective search engine optimization marketing strategy. Feel free to utilize any other information within our website to better understand and maximize your search engine performance.

SEO Basics

What is SEO?

SEO stands for “search engine optimization.” SEO is a technique which helps search engines find and rank your site higher than the millions of other sites in response to a search query. SEO thus helps you get traffic from search engines.

                                                              OR

SEO stands for “search engine optimization.” It is the process of getting traffic from the “free,” “organic,” or “natural” listings on search engines.



On-Page Optimization Techniques:

1. Title Optimization
2. Meta Tags Optimization
3. Important HTML Tags
4. Keyword Optimization & Synonyms
5. Link Optimization
6. Image Optimization

Thursday, June 14, 2012

Alexa Pagerank

Alexa Page-rank Checker instantly checks the Alexa page rank of your website with our free online tool. Simply enter your website URL into the URL field and press the Get Page-rank button. The page rank will be shown along with a traffic graph plotting the last six months of traffic to your website.

The Alexa page rank is calculated from data collected from surfers who have the Alexa toolbar installed in their browser. It is essentially a measure of your popularity relative to other sites: because your rank depends on both the traffic your website receives and the traffic all other websites receive, you can drop (or rise) in the rankings while getting exactly the same amount of traffic every day. Feel free to check the Alexa page rank of as many web pages as you wish with our free online checker.

What is Google Analytics?

Google Analytics is a free statistics tracking and analysis service that allows website administrators to analyze traffic flow on a website. Although most hosting solutions come with similar software, it may not be as easily understood or navigated as Google Analytics. One of the more impressive features of Google Analytics is the ability to flag certain pages as “goals.” This is especially beneficial to e-commerce websites, but regardless of a site's purpose, Google Analytics can help make it better.
Whenever someone visits a web site, a variety of information is stored on the server in the form of access logs. These are simple text files, formatted in a specific way so that programs can analyze the stored information and present it in a form that is more easily understood. The data includes the visitor's IP address, the amount of data transferred, the URL visited, the time and date of the visit, and the visitor's operating system and browser. The most common measure of whether a web site is doing well is the number of visitors. That is certainly valuable information, but it tells us nothing about what visitors do once they arrive. Standard statistics software will guess at how much time people spend viewing a specific page, but even this is not as valuable as the information collated by Google Analytics.
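As an illustration of how such an access-log line is parsed, here is a sketch for the common "combined" log format; the IP address, URL, referer, and user agent below are invented sample data:

```python
import re

# Combined Log Format: IP, timestamp, request, status, bytes, referer, user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" (?P<status>\d+) (?P<bytes>\d+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('203.0.113.7 - - [13/Jun/2012:10:02:41 +0000] '
        '"GET /seo-basics.html HTTP/1.1" 200 5123 '
        '"http://www.google.com/search?q=seo" "Mozilla/5.0"')

hit = LOG_PATTERN.match(line).groupdict()
print(hit["ip"], hit["url"], hit["status"])  # 203.0.113.7 /seo-basics.html 200
```

Analytics software essentially does this for every line, then aggregates the results into the reports you see.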

Wednesday, June 13, 2012

Google's Sandbox

Google Sandbox will be a new term for many bloggers. Search engines are very strict about the websites they show in their SERPs, and if you use black hat techniques your website could be penalised; here's how to find out if your website has been.
If your website has seen a recent drop in traffic or rankings, it may have been penalised by Google; most website owners don't realise they are doing something wrong until they have been penalised. As an ethical Manchester SEO company we are very careful to ensure your website does not get penalised, but sometimes sites already carry penalties or are blacklisted when we start working on them.


The Google Sandbox Effect is a theory used to explain why newly-registered domains, or domains with frequent ownership changes, rank poorly in Google Search Engine Results Pages (SERPs). In other words, new websites are put into a “sandbox,” or holding area, and have their search ratings held back until they can prove themselves worthy of ranking.

Once Google deems a website to be of quality and importance the website will be removed from the Sandbox and can potentially show up high in Google Search Engine Results Pages. Webmasters can do numerous things to improve their website with Google, but time really is the key in getting out of the Sandbox. Sandbox believers say it can take anywhere from 6 months to a year and sometimes longer before Google will promote a website out of the Sandbox.

Because Google does not acknowledge the Sandbox and its existence has never been clearly proven, the Sandbox Effect is just a theory. Even so, the majority of webmasters believe in it. The Sandbox is thought to have come about in 2004, when changes to Google’s algorithm kept new websites from the top of Google Search Engine Results Pages (SERPs).

It may seem that the Sandbox is unfair to newly launched websites, but Google created the Sandbox with good reasons. Google was trying to discourage spam sites from quickly reaching the top of Search Results, getting dropped off Google, creating a new site and repeatedly showing up high in Search Results. In the past, companies would put up a promotional website for a short period of time and once the promotion was over the website was gone. These promotional websites would still show up high in Google Search Engine Results even after they were removed, causing many broken links and unhappy Google searchers.

Even with the Sandbox Effect it is still possible for newly launched websites to make it to the top of Google Search Engine Results Pages (SERPs). If Google deems a new website worthy, it can appear in search results immediately, but it can still take up to 6 months for the website to rank to its fullest potential. There are many methods web designers use to try to avoid the Sandbox, some of which are discussed below. But because of the uncertainty involved, even if all known algorithm variables are followed, there is still no way to guarantee a new website will not be put in the Sandbox.

How can you tell if you are in Google’s Sandbox?
Now that you know what the Sandbox is, it is important to learn how to tell whether a site is in it. Before determining whether a website has been put in the Sandbox, you need to find out if the entire site was dropped from Google or if it has just been ranked down for certain keyword phrases.

Start by doing a search in Google for your site. To do this, type “site:www.yourdomain.com” in the Google search bar (replace “yourdomain.com” with your site's URL). If Google does not return any results, then most likely your entire website has been dropped by Google. If Google shows your website in the results and your website recently dropped down in keyword rankings, then you are still indexed by Google and it is likely that you were placed in the Sandbox.

New websites start out at the bottom with a zero page rank, but with time a site can start ranking up. Does your website have a page rank but still not show up in Google Search Engine Results Pages (SERPs)? If so, your website is more than likely in the Sandbox. Oftentimes a new website will gain a page rank after a short period, rank high in Google search results, and then suddenly disappear from them. When this happens it is likely that the site was put into Google’s Sandbox, and many people are left asking why. There is no one answer as to why websites get put into the Sandbox, but below is a list of things thought to affect a website’s placement with Google.

Tuesday, June 12, 2012

What is Robots.txt?

      It is great when search engines frequently visit your site and index your content, but often there are cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index those pages (although in this case the only sure way to keep sensitive data unindexed is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and javascript from indexing, you also need a way to tell spiders to keep away from these items.
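The mechanism for telling spiders to keep away is the robots.txt file itself. As a sketch, Python's standard-library robots.txt parser shows how a compliant crawler would interpret such rules; the disallowed paths below are hypothetical examples matching the print-version and private-data cases above:

```python
import urllib.robotparser

# A sample robots.txt that keeps crawlers away from print-friendly
# duplicates and private pages.
robots_txt = """\
User-agent: *
Disallow: /print/
Disallow: /private/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "/print/article-1.html"))  # False - excluded
print(parser.can_fetch("*", "/article-1.html"))        # True  - crawlable
```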

What is a Spider?

            A web spider, also referred to as a search engine crawler, is a program that automatically traverses online content and produces copies of the pages it visits. Search engines use these copies when indexing the downloaded pages, which speeds the process of searching for specific pages. A crawler can index millions of pages daily, though because search engines use different algorithms, their search results vary as well.
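At its core, a crawler repeatedly downloads a page, stores a copy, and harvests the links to follow next. That link-harvesting step can be sketched with Python's standard-library HTML parser; the page snippet and its URLs are invented for the example:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """The core of a crawler: harvest the links to follow from one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = ('<p><a href="/seo-basics.html">SEO Basics</a> and '
        '<a href="/meta-tags.html">meta tags</a></p>')
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/seo-basics.html', '/meta-tags.html']
```

A real crawler would queue these links, fetch each one (respecting robots.txt), and repeat.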

What is PageRank?

     
PageRank is a ranking system that was the original foundation of the famous search engine Google. When search engines were first developed, they ranked all websites equally and returned results based only on the content and meta tags the pages contained. The PageRank system revolutionized search engine rankings by including one key factor: a site's authority.
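PageRank itself is an iterative calculation: each page's score flows to the pages it links to, split evenly among its outbound links, with a damping factor mixed in. A toy power-iteration sketch over an invented three-page link graph (page names and links are made up for illustration):

```python
# Toy power-iteration sketch of PageRank over a 3-page link graph.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores

for _ in range(50):  # iterate until the ranks settle
    new_rank = {}
    for page in pages:
        # Sum of rank flowing in from every page that links here.
        inbound = sum(rank[src] / len(outs)
                      for src, outs in links.items() if page in outs)
        new_rank[page] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

print({p: round(r, 3) for p, r in rank.items()})
```

In this tiny graph "home" ends up with the highest score because the most link weight flows into it, which is exactly the authority signal the prose describes.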