Saturday, October 29, 2011

Content marketing: How SEO and thought leadership go hand in hand


When I ask marketers about their goals for their sites, the answers vary by industry. Marketers with retail sites often want SEO to become more accessible and drive sales; technology solution providers want visibility so they can use their sites as lead generation tools; others, such as marketers at environmental agencies, primarily want to be positioned as online resource centers that win subscribers or customers. The goals vary, but the solution remains the same: provide fresh, informative, keyword-rich content that engages your audience.

For businesses that want SEO, I consistently tell them they need search-optimized, quality content. For those that want thought leadership, I tell them (you guessed it) the same thing. The current online marketing climate requires marketers who want SEO results to publish niche content, and marketers who want to be known for authoritative resources to make sure that content is optimized to stand out in search and social channels across an increasingly cluttered web.

What I hear from my clients is that this is much easier said than done. When it comes to the hurdles they face, you name it, I’ve heard it. Marketers struggle to write fresh and unique content, to optimize every piece of content that goes live on their sites and – often – to make their SEO/thought leadership/content strategy sustainable.
   
These hurdles must be tackled; failure to do so ends in your competitors’ favor.  Whether you work it out in-house or decide it’s time to call in the experts and work with agencies for content marketing (the subject of a recent Brafton blog), steps must be taken to overcome your content issues. Otherwise, competing businesses will outrank you and win over your prospects with compelling information that your site and social pages lack.

If you’re still not sure content marketing is the solution to your SEO and thought leadership woes, let’s explore how it addresses each of these needs. By rewarding sites with fresh and relevant content, Google has made the requirements fairly black and white, most visibly through the update better known as Panda.

PANDA AND THE IMPORTANCE OF CONTENT FOR SEO

In January of this year, Google reaffirmed the importance of timely, fresh and relevant content. Google’s Panda, which is up to version 2.5, penalizes sites with duplicate content or content that isn’t relevant to their audiences. After the first Panda update, SEO experts reported that sites lacking content that balanced SEO with quality information saw rankings and keyword referrals plummet. Conversely, sites with fresh, exclusive and niche content saw gains in indexed pages, organic traffic and – often – rankings. One of the winners we explored was a niche ecommerce site that beat out shallow SEO pages from larger sites by offering high-quality, industry-specific information in an SEO-friendly format. The content was centered on its products (“drywall paint,” in particular).

Using your keywords organically in the content builds up your site’s keyword density. This attracts more visits from search engine spiders and (when done right) can lead to better overall rankings in SERPs. Google is on to keyword stuffing, however, so SEO best practices that improve users’ experience of your site are a must.
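
As a rough illustration (this is not any search engine’s actual formula), keyword density is simply the share of a page’s words that match the keyword. A minimal Python sketch, using hypothetical page text:

```python
from collections import Counter
import re

def keyword_density(text, keyword):
    """Share of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    counts = Counter(words)
    return counts[keyword.lower()] / len(words)

# Hypothetical page copy built around the "drywall paint" example above.
page = ("Drywall paint covers drywall seams. "
        "Choosing drywall paint starts with the drywall surface itself.")
print(round(keyword_density(page, "drywall"), 2))  # prints 0.29
```

The point of the sketch is the sanity check, not the number: if the density looks unnaturally high, you are probably keyword stuffing.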

Here’s an example: a business in the B2B cloud computing sector kept steady organic traffic levels through the first killer Panda update – in fact, it claimed some non-paid search traffic gains toward the end of February after Panda rolled out. The site publishes multiple search-friendly industry news articles each day. Its SEO viability is clear, as it gained keyword search referrals throughout the launch of Panda. The number of organic referring keywords to the site in February represented a 14 percent increase over the number of referring keywords it had in January.

Brafton has reported other cases where fresh content helped sites gain (or regain) organic search traffic post-Panda. But marketers shouldn’t forget that even pre-Panda, major search engines have been working to promote quality content in search rankings. 

Bing is equally adamant about giving search visibility to high-quality results, though Google’s Panda undeniably places a premium on top-notch content – and not just any content, but content that is clearly marked via SEO as being relevant to searchers’ specific queries. Brafton has reported that being a niche authority is the No. 1 factor correlating to top results in Google News, and Google’s Matt Cutts has suggested that brands that want to see SEO results should focus on becoming authorities in their niches… which brings us to our next point.

If you’re going about your content marketing strategy effectively, you’re not only driving more traffic and separating yourself from your competition in terms of visibility, you’re also establishing yourself as a thought leader.

EFFECTIVE CONTENT MARKETING AND THOUGHT LEADERSHIP

Before we can understand the relationship between content marketing and thought leadership, we need to understand the concept of being a thought leader. This is a person or entity recognized as a credible authority on a niche topic. Thought leadership requires businesses to communicate with and engage current and potential customers. While you can demonstrate thought leadership with your charm and quick (verbal) answers when you’re at an in-person event or on the phone, the way to do it online is by (frequently) publishing useful content. This can range from insightful blog posts to how-to videos (supported by on-page text for SEO); from info-rich white papers to short, breaking news stories.

There are many benefits to being recognized as a thought leader by online audiences – from improved SEO to brand recognition and repeat traffic. When your company is a thought leader, visitors will likely stay on your site for longer periods of time. You can drive them to other web pages through internal linking, which increases the interaction metrics (and further contributes to SEO success). You might be worried that integrating SEO into your thought leadership marketing will detract from the information’s value, but in reality, good content marketing should include keywords and on-site optimization that attracts (rather than alienates) your audience. It makes it easy for people to find the resources you’re providing.

Google principal engineer Matt Cutts recently addressed this issue when he answered the question “Is SEO spam?” in a Webmaster Help YouTube video. He explained that useful information positioned with the right keywords is good for search engines and users.

Another way search-optimized content can be leveraged for thought leadership is distribution across social platforms. Sharing useful content helps your business gain loyal Facebook fans or get added to Twitter Lists. Traffic driven from social media is frequently engaged and relevant; it will stay on the site longer, come back more frequently and convert.

The same B2B business referenced above shares its news headlines to its Twitter feed. In the past two months, Twitter has been a leading referral site for the business, driving highly relevant, valuable traffic. Twitter visitors tend to stay on the site 61 percent longer than the average visitor, and they view 154 percent more pages than average. (Did I mention the site has a solid internal linking strategy?) Plus, Google Analytics indicates that the share of these Twitter visitors who are new to the site is lower than the site’s overall share of new visitors, meaning Twitter drives more repeat traffic.

Social users will repost your articles and drive your brand marketing in social media channels – but you have to make sure it’s easy for them to discover the content in the increasingly search-friendly social spaces. (Twitter and Facebook are continuously evolving new search capabilities that make it easy for users to find categorized content on the sites). Plus, positive social sharing metrics can make your content more visible to logged-in searchers, which means you should be adding social sharing buttons to your content pages to create advocates who will sing praises of your thought leadership and help boost SEO.

In summary, content is king. While this is a phrase all marketing professionals have likely heard, the fact of the matter is that more than half of the prospects I talk to in a marketing role don’t have a clue why content marketing is crucial to a business’s online success. They want to gain more organic traffic and/or be thought leaders, yet have no content marketing in place to position themselves for success. Content is necessary for these ends, and compelling content that generates demand for your products/services/expertise can transform a visitor into a customer.




Friday, October 28, 2011

Choosing the Best SEO Link Building Services Company in India

If you are looking to increase the visibility of your website, you are probably searching for a link building services company in India to meet your needs. If your target customer base exists only in India, an India-based SEO company may understand your business better than anyone else: when an SEO company has a physical presence where you live, it is much easier for you to judge its quality, and you can expect quick communication. If you only need link building services, however, it matters less whom you hire, since the campaign can be designed to target your niche market from anywhere in the world. Here is a list of points to watch for when choosing a link building services company.

Does the company guarantee #1 ranking?

Realistically speaking, if any company does that, you should not take its candidacy seriously, since no one can guarantee a #1 ranking, especially with search engine algorithms changing constantly. Ask the company whether it ranks #1 for “link building” itself. If not, how can it guarantee its clients the #1 position?

References from friends

It is good to take references from friends who have had their websites optimised. Company websites, and even the correspondence of the company you approach for services, can mislead you in order to win your business, but your friends or partners will give you honest advice. Ask people around you whether they know of, or can recommend, any link building firm they have already worked with. If a recommendation comes through, it is usually best to go with that company, since its skills and results have already been tested.

Online feedback can help you find the best services

Look for online reviews of the company you are considering for your SEO link building services. You can search SEO and link building forums, or review sites like mouthshut.com, to see what feedback clients and customers have given about the company you want to run your campaign with. If the company has a negative reputation, you will find it on forums or blogs; but finding nothing wrong on these forums does not by itself prove the company is capable. Research as much as you can before deciding on one.

Ask the company for work samples

Once you have shortlisted a few companies, ask each candidate to provide samples of the work they do to optimise websites, along with examples of websites they have already optimized. If an SEO link building company does not hesitate to provide this information and has already optimized a website in your niche, that is a great advantage: its professionals will already know most of what matters about your industry, which saves you the time and effort of teaching them your business processes.

About the Author

Manish Singh is an SEO expert in India, currently working as a search engine marketing executive in the internet marketing field with an Indian SEO services firm. He researches current internet marketing strategies, social media marketing and SEM services, and brings strong analytical skills and a troubleshooting attitude.

Monday, October 24, 2011

SEO algorithms – Which SEO algorithm works best?


If you were taking an English language test today, or a mathematical test, and you were asked to define “algorithm”, what definition would you provide? Do any of the following match your idea of what an “algorithm” is?

1.    A process for completing tasks.
2.    The means by which the end is reached.
3.    A problem for which there is no resolution.
4.    A method for solving problems.
5.    A method for defining methods.

A lot of people find the SEO Theory blog through search engine referrals for variations on “SEO algorithm”, “SEO algorithms”, “search engine algorithm”, etc. The funny thing about those referrals is that I haven’t actually written about SEO algorithms. I wrote an SEO algorithm roundup article last year (some of the advice in that article is now outdated, by the way), but what I called “SEO algorithms” in that post were not really SEO algorithms.

Search engine algorithms are complex things. One does not simply detail a search engine algorithm in a single blog post. But one can recap (or attempt to recap) the basic steps in the search indexing process. A fair number of SEOs have done this, some even using pictures. None of them have really done an adequate job. Nor am I likely to do an adequate job.

Search engines don’t have much to work with when they are indexing billions of pages. They get just a few hundred pieces of information to pick from. If you have ever designed an inventory management system, you’ll immediately see the advantages you have over a search engine. If you have never designed an inventory management system, you may appreciate the comparison with a little explanation.

Let’s say you operate a warehouse for automobile parts. On average, I would say such a warehouse has to stock around 100,000 individually identified parts. Each part comes with one or more unique identification strings or tags: the manufacturers provide their own model numbers and serial numbers, shippers and distributors may provide their own tracking IDs, and retailers (you, the guy with the warehouse) usually assign their own identification strings for internal tracking.

That one paragraph provides you with more detailed information about any given manufactured item intended for use in an automobile than any search engine knows about Web pages. If search engines could know that every Web page was tagged with one or more unique identifiers other people had provided, that would make life so much easier for them. But as it is, anyone who has struggled with canonical URL issues knows that search engines can easily confuse one page with many.

In order to index and arrange billions of pages, search engines have to make up their own unique identifiers and manage those identifiers without the benefit of making sanity checks against other people’s identifiers. But the average inventory management system has more advantages over search engines than that.

Knowing that an auto parts warehouse needs to stock about 100,000 different types of parts, we can design our facilities, software, and procedures around 100,000 unique types of parts. A search engine has absolutely no idea of how many pages it will eventually be asked to index. Your resources have to be allocated very conservatively if you are dealing with an open-ended inventory rather than with a limited inventory.

An auto parts warehouse can track customer purchasing habits over time and find out which parts are most likely to be in high demand. A search engine can track queries and clicks, but because search engines see 20-25% new queries every month, they never really know which pages will be in high demand, or for how long. The typical auto parts warehouse doesn’t see 20-25% new parts requests every month.

Predictability influences how you manage and organize data. Unpredictability also influences how you manage and organize data.

So think of how you might organize an endless supply of new Web pages as you find them AND how you might respond to an endless stream of new requests for information that your constantly growing (or changing) inventory of Web pages may or may not satisfy. In today’s world of search, the major search engines rely on two major factors more than anything else: content and links.

Content is a fuzzy concept. Does content include the meta data that accompanies many Web pages? Does content include descriptive text that accompanies links (such as the descriptions we provide in directory listings)?

Links may seem more straightforward than content, but the answers we provide ourselves for the content questions may make links more complex. After all, if we don’t associate all the text around a link with the destination page, should it be associated with the link? Have you ever thought about a search engine simply looking at a link for itself rather than for the relationship it creates between two documents?

A search engine can collect a lot of information about a link and some search engines may indeed be doing that. They may use that information to determine whether the link should be trusted, whether it should be given extra weight, or whether it should be followed (crawled). A search engine can record how it handles what it finds on the destination page and associate that finding with the link (or, perhaps more likely, with the linking page).

Ultimately, the search engine is trying to solve two problems: first, how to manage an ever-growing inventory of Web pages of unpredictable quantity, quality, and design; second, how to respond to a continuous stream of requests for information that it may or may not have seen before.

In mathematics, one algorithm can be used to solve more than one problem but the problems have to all belong to the same group (or class) of problems. They have to share similar characteristics. For example, you could use the same algorithm to find out how fast two trains are traveling if you are given their relative speeds and directions AND to find out how fast a bullet is traveling toward a moving object if you are provided with similar information. But you would have to use a completely different algorithm to determine what the volume of a sphere is.

Managing data and searching data require different processes. Hence, every search engine requires at least two algorithms. When you speak of a search engine’s algorithm, therefore, you’re thinking of a mega-algorithm that incorporates many smaller algorithms. Your task as a search optimizer becomes more complex if you address that mega-algorithm rather than focus on each real algorithm separately.
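
To make that two-algorithm split concrete, here is a toy Python sketch (with made-up page data) of the data-management side, building an inverted index, and the search side, resolving a query against it. Real search engines are vastly more complex, but the division of labor is the same:

```python
from collections import defaultdict
import re

def tokenize(text):
    return re.findall(r"[a-z0-9]+", text.lower())

def build_index(pages):
    """Indexing algorithm: map each term to the set of pages containing it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for term in tokenize(text):
            index[term].add(url)
    return index

def query(index, q):
    """Query-resolution algorithm: pages containing every query term."""
    terms = tokenize(q)
    if not terms:
        return set()
    results = set(index.get(terms[0], set()))
    for term in terms[1:]:
        results &= index.get(term, set())
    return results

# Hypothetical two-page "web".
pages = {
    "/rings": "rings with stones and settings",
    "/paint": "drywall paint guide",
}
index = build_index(pages)
print(query(index, "drywall paint"))  # prints {'/paint'}
```

Optimizing for the first algorithm means getting your terms into the index at all; optimizing for the second means being a strong match when those terms appear in a query.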
And that brings us to SEO algorithms. An SEO algorithm is the process by which you optimize content for search. Optimization doesn’t mean get the best possible ranking. In our SEO glossary here on SEO Theory you’ll find this definition for search engine optimization: “The art of designing or modifying Web pages to rank well in search engines.”

That is the most broad and comprehensive definition possible. I will occasionally clarify the definition by adding that we want converting traffic, but sometimes you optimize for something else that you hope to achieve through search. Spammers and SEOs alike prefer to optimize their link profiles (although the rules for link profile optimization have never been articulated, so basically no one knows what they are optimizing for).

In search engine optimization, you can rely on one algorithm to address the two types of search engine algorithms or you can rely on several algorithms. Most SEOs seem to prefer the several algorithm approach but let’s look at the one algorithm approach first.

Your optimization problem can be described this way: how do you get a page indexed so that it is used to respond to as many queries as possible?

Our goal is to achieve maximum optimization, such as ranking a single page for 100 SEO questions (technically, I did not address 100 questions — I got tired somewhere in the 80s or 90s, I think). Maximum optimization is an ideal state in which a page ranks well (not necessarily first) for every query to which it is relevant. I don’t think that is humanly possible, at least not with the current level of SEO theory we have available.

Your algorithm needs to be simple but it can be self-referring. That is, it can invoke itself. We don’t usually speak in terms of “invoking an algorithm for SEO” but that is essentially what we do. Maximum optimization requires that a page be strongly relevant to as many queries as its indexable words are relevant to. To achieve maximum optimization, you have to repeat and emphasize every word in every possible combination in as many ways as possible.

You could create a huge page that attempts to tackle everything or you could look at how you construct your text, how you emphasize it, and how you repeat terms to determine a pattern that ensures every word (or nearly every word) is used optimally. Hence, you may find yourself emphasizing your emphasis, repeating your repetitions, and reorganizing your word patterns into more complex patterns.

We used to call that last part power keyword optimization, where you construct complex keyword expressions that can be broken down into less complex keyword expressions. This method was proposed for the keywords meta tag in the late 1990s. We can extend the method to the indexable copy of the page and call it power content optimization. So, instead of using “keyword1 keyword2” you use “keyword3 keyword1 keyword2 keyword4” and optimize for several variations.
There is a little more to it but let me move on.
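
The idea behind a “power” phrase is that a longer keyword expression contains many shorter contiguous sub-phrases, each a potential query in its own right. A small Python helper makes this countable (the example phrase is hypothetical, not from the original post):

```python
def subphrases(phrase):
    """All contiguous word sequences a longer keyword phrase contains."""
    words = phrase.split()
    n = len(words)
    return [" ".join(words[i:j])
            for i in range(n)          # start position
            for j in range(i + 1, n + 1)]  # end position (exclusive)

# A 4-word phrase yields 4+3+2+1 = 10 sub-phrases,
# from "cheap" all the way up to the full phrase.
print(subphrases("cheap silver stone rings"))
```

Each of those sub-phrases is a variation the page can be relevant to, which is exactly what power content optimization tries to exploit.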

Most SEO Experts and eTailers are not interested in maximum optimization. The algorithms one might employ for maximum optimization are more theoretical toys than anything else, as most people are looking for a return on investment. But many people are very interested in what we could call extended optimization, where you design your content to rank well for many queries (but nothing like “all relevant queries”).

For example, let’s say you have a jewelry Web site and you have a category page that lists 20 different types of jewelry (perhaps they are all rings with stones). Although you want those individual ring-with-stone pages to rank well for their most specific queries, would it not be great to have the category rank alongside them? Sure it would. That’s extended optimization. Of course, not every search engine prefers to show category pages if it can serve up the detail pages.

Your algorithm is defined in terms of what you do on the page, what you do around the page, and what you do to the page. “On the page” is self-evident to anyone who is familiar with the basic concepts of SEO page design. What you do around the page is a little less familiar because most people don’t think in terms of “managing sibling relationships” but rather they focus on “theming a Web site”. You don’t need (or want) to theme a Web site, but you do want to cross-promote your most valuable content for a specific query. Put your best foot forward.

What you do to a page usually occurs as link building, but you can do other things to a page (such as embed it in a frame, embed it in an iframe, block it, replicate it across multiple ambiguous URLs, etc.). That is, most people focus on link building rather than on piggybacking content, although there are optimizers out there who have piggy-backed plenty of my content.

A well-designed Web site should address the types of search engine algorithms (indexing and query resolution) adequately in most cases. However, if you’re the kind of person who wants to walk around the mountain rather than quickly fly over it, you can do what most of your fellow SEOs have been doing for years.

You can devise a links-to-get-indexed algorithm and a links-to-get-ranked algorithm. Remember that link building is the least efficient, least effective means of optimizing for search. It’s the most time-consuming and resource-hogging approach to search engine optimization. Therefore, everyone does it simply because it looks like it’s the right way to do things. After all, the right way has to be harder than other ways, right?

So how do we separate our algorithms for search engine optimization through link building? Because there are links where you control the anchor text and links where you cannot. If you cannot control the anchor text, all you can use the link for (with respect to search engine optimization — there are clearly other valuable uses) is to get crawled and indexed.

Some people invest a great deal of time in building links with anchor text they cannot control. These types of links “look natural”, “confer trust”, “reflect editorial opinion”, and (my favorite) “are SEO friendly”.

Some people just go for the throat and grab every link they can get with the anchor text they want. These types of links usually “look spammy”, rarely confer PageRank (or trust), bypass editorial opinion, and (my favorite) “are SEO friendly”.

Anything that is SEO friendly must be good, right?

You probably divide your link building time between asking for links and creating links. However, many SEOs are now chasing the dream of creating long-lasting link bait (something that rarely happens). Link bait provides you with “natural” links whose anchor text you cannot control. Link bait will get you crawled and will probably help you rank for expressions you never imagined, but it isn’t helping you optimize for both types of search algorithm.

Good link bait should statistically attract more links with targeted anchor text than not. Great link bait creates a brand, but that’s another story.

If you divide your resources between creating link bait and building links, you’re not optimizing your content. Link bait can be optimized after the fact, but most link bait I have looked at is not optimized. It’s designed to attract links, not to rank well in search results. A well optimized page should rank well for a non-competitive query. A highly optimized page should not require many links to rank well even in competitive queries.

So if you’re not thinking in terms of “SEO algorithms” then you’re not looking at how you allocate your resources. You’re not looking at how you solve the problems of getting your content indexed and getting it to rank well.

Simply being indexed doesn’t guarantee a good ranking. Of course, simply ranking well doesn’t guarantee click-throughs and conversions but that leads to a problem that doesn’t have anything to do with the search engine algorithms.

In search engine optimization there is no right way to optimize. Every query resolves the question of “which optimization methodology works best” only for itself. You cannot use one query to prove a point about optimization with another query. Your SEO algorithm therefore has to be immensely flexible but it also has to be replaceable.

That is, to do this right, you have to know more than one way to optimize. You have to be prepared to tackle your problems from different angles every time because sometimes the old tricks won’t do the job and sometimes the new tricks won’t do the job.

An algorithm is a method for solving problems. There is no universal algorithm in search engine optimization, although the SEO Method applies to all of them: experiment, evaluate, and adjust.

The Next Google Update


I would venture to guess that one of the most common questions on any webmaster forum involves someone asking, “When is the next Google update?” Although they are probably asking about visible PageRank updates in the Google toolbar, the answer involves a bit more than that.

Visible PageRank is what you see in your Google toolbar. At this point, that has been updating an average of every three months or so.


HOWEVER, real PageRank is continually updated and continually factored into how the search results are determined. By the time you see a change in the toolbar, any effects of that change have long since been included in the search results.

“Google updates its index data, including backlinks and PageRank, continually and continuously. We only export new backlinks, PageRank, or directory data every three months or so though. (We started doing that last year when too many SEOs were suffering from ‘B.O.’, short for backlink obsession.) When new backlinks/PageRank appear, we’ve already factored that into our rankings quite a while ago.” – “What Is An Update”, by Google engineer Matt Cutts
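
PageRank itself is a well-documented iterative computation over the link graph. A minimal power-iteration sketch in Python (a toy three-page link graph, not Google’s production implementation) shows why it can be recomputed continuously as new links are found:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal power-iteration PageRank over a dict of page -> outlinks."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page shares its damped rank among its outlinks.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: spread its rank across all pages.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# Hypothetical link graph: /c is linked from both /a and /b.
links = {
    "/a": ["/b", "/c"],
    "/b": ["/c"],
    "/c": ["/a"],
}
ranks = pagerank(links)
```

Every time a new link is discovered, the iteration simply runs against the updated graph, which is why the internal values move long before any quarterly toolbar export.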
For a few years now, people have been confused about what the cache date on a page actually indicates. It used to be that if there were no changes to the page since the last cache date, Google would often update the cached version only once or twice a month. If there had been changes, people would see the cached page updated much sooner.

Google has now changed the cache date to reveal the last date the page was accessed by Googlebot. (Updated September 6, 2006)

“We’ve recently changed the date we show for the cached page to reflect when Googlebot last accessed it (whether the page had changed or not). This should make it easier for you to determine the most recent date Googlebot visited the page.” – Google Webmaster
How often a page is crawled is largely determined by the number of links out there bringing the robots back to your website.

Search engine spiders crawl the web on a continual basis by following links. The more links pointing to your site, the more often your pages will be crawled.

This means that websites with very few links pointing to them will find it takes longer for Google to discover and index their new pages.

Websites with a large number of links pointing to them will likely see their new and updated pages added to the index quite rapidly.
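
The crawling behavior described above amounts to a breadth-first traversal of the link graph: well-linked pages are reached early and often, while pages with no inbound links are reached late or never. A toy Python illustration (the link graph is made up):

```python
from collections import deque

def crawl(link_graph, seeds):
    """Breadth-first crawl: follow links outward from the seed pages."""
    seen = set(seeds)
    queue = deque(seeds)
    order = []
    while queue:
        page = queue.popleft()
        order.append(page)  # "fetch" the page
        for target in link_graph.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return order

# Hypothetical site: /new-post is only reachable via /popular,
# and a page with no inbound links would never be found at all.
links = {
    "/home": ["/popular", "/about"],
    "/popular": ["/new-post"],
    "/about": [],
    "/new-post": [],
}
print(crawl(links, ["/home"]))  # prints ['/home', '/popular', '/about', '/new-post']
```

A page nobody links to simply never enters the queue, which is exactly why poorly linked sites wait longer to see new pages indexed.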

The search engine result pages update continually. As Google finds new information, it is added to the index. The goal of the search engines is to display the results in the exact order of relevance to the search query. The more relevant your page is to the search query, the higher your page should show in the results.

Since the information going into that determination is continually changing, so are the results you see in search engine results. When you combine all that with occasional changes in how Google factors page relevance (algorithm) to the search query, you end up with results that are continually fluctuating.

In the end, all of this should mean very little to you as a site owner. The more time you spend focused on what Google is up to, the less time you are spending on building a website filled with quality content for your visitors.

If you follow the basic guidelines and develop a quality site that invites incoming links naturally, you can focus on your results over time, rather than day to day search engine fluctuations.

Tips for effective social media optimization


Social networking sites have transformed the way we communicate and connect. By now most internet-savvy companies are already using (and reaping the benefits of) SMO, or Social Media Optimization. For those who are still unaware of this rising global phenomenon, here is a heads-up on what SMO entails.

SMO is the process of creating multiple points of access or visibility by leveraging the traffic of existing social networking sites for brand or service promotion, which in turn improves the bottom line. In simple terms, SMO is the use of sites like Facebook and Twitter to create a buzz about your company. It is a process that requires not only innovation but also continuous adaptation.

7 Tips for SMO


Used effectively, SMO can have a tremendous impact on the traffic you generate. Here are some handy tips for using SMO to your advantage:

•    First and foremost, set up fan pages or company groups on sites like Facebook, LinkedIn and Twitter
•    Inform, engage and provide information that people can actually use through these platforms
•    Set up accounts on image sharing sites like Photobucket for image sharing and optimal tagging
•    Create accounts on social bookmarking sites like Reddit for sharing additional information
•    Maintain blogs on popular blogging sites
•    Innovate and communicate to retain and grow your membership on social channels, and finally
•    BE ACTIVE! Regular updates on all social sites are imperative

Whether you want to be part of this social bandwagon is no longer really a choice! The digitization of the world has brought with it newer ways of connecting with people and changed the way we market and promote.

-Adapt if you want to move ahead.
-Stay fresh if you want to stay viable.
-Use SMO effectively if you want to succeed in this digital world!

Talk About Your Blogs Online To Get More Traffic


Beginning to blog? Wondering how to get more readers to read your blog? There are numerous ways to build traffic on your blog and get more people to read it. You can use several online applications to increase views on your blog, and social networking sites are among the best tools for the job. In this era of tweeting and Facebooking, spreading the word about your blog has become far easier than in earlier times, when no such mediums were available. By using these mediums extensively, more people will come through and visit your blog.
A blog takes some time to attract readers, and many bloggers stop blogging after the initial discouragement of having no audience. Do not stop blogging for lack of an early response. Instead of getting discouraged by not getting comments, use social networking to popularise your posts. Tell your friends about what you have written, share links to your articles on your wall, and ask friends and family to leave a comment or two. This way, everyone in your online network becomes aware of the blog, and its visibility increases substantially.
You can also add your blog friends to your lists on social networking sites like Facebook, Orkut and Twitter. This way you can keep up with fellow bloggers and their latest updates, and stay current in the blogosphere. It is one of the best ways to build your visibility among a large audience. Take full advantage of these sites and you are bound to get noticed. But remember, this doesn't happen overnight, so you must be patient: the response builds up over time.
Several online communities and groups also have active forums where you can share your latest blog posts. The response on these sites is good, making this one of the smartest ways to increase your readership. One of the easiest things you can do is share the link to your blog in your profile information, so everyone on your friends lists across the social networking sites can see what you have posted and updated lately. Google Mail status messages are another way to popularise your blog, as is sending the link to your friends as a scrap on Orkut. Likewise, you can put your blog's link in your email signature, so that every recipient of your mail learns about your blog.
To conclude, blogging is good. It is really good. But for your blogging to be successful, you must be calm and patient, because the whole process is slow. Believe you me, it is also very effective.

Thursday, October 20, 2011

20 Minute SEO List

Here is a checklist of the factors that affect your rankings with Google, Bing, Yahoo! and the other search engines. The list contains positive, negative and neutral factors because all of them exist. Most of the factors in the checklist apply mainly to Google and partially to Bing, Yahoo! and all the other search engines of lesser importance. If you need more information on particular sections of the checklist, you may want to read our SEO tutorial, which gives more detailed explanations of Keywords, Links, Metatags, Visual Extras, etc.




Keywords
1
Keywords in <title> tag
This is one of the most important places to have a keyword, because what is written inside the <title> tag shows in search results as your page title. The title tag must be short (6 or 7 words at most), and the keyword must be near the beginning.
+3
2
Keywords in URL
Keywords in URLs help a lot - e.g. - http://domainname.com/seo-services.html, where “SEO services” is the keyword phrase you attempt to rank well for. But if you don't have the keywords in other parts of the document, don't rely on having them in the URL.
+3
3
Keyword density in document text
Another very important factor to check. 3-7% for major keywords is best, 1-2% for minor ones. A keyword density of over 10% is suspicious and looks more like keyword stuffing than naturally written text.
+3
4
Keywords in anchor text
Also very important, especially for the anchor text of inbound links, because if you have the keyword in the anchor text in a link from another site, this is regarded as getting a vote from this site not only about your site in general, but about the keyword in particular.
+3
5
Keywords in headings (<H1>, <H2>, etc. tags)
One more place where keywords count a lot. But make sure your page has actual text about the particular keyword.
+3
6
Keywords in the beginning of a document
Also counts, though not as much as anchor text, title tag or headings. However, keep in mind that the beginning of a document does not necessarily mean the first paragraph. For instance, if you use tables, the first paragraph of text might be in the second half of the table.
+2
7
Keywords in <alt> tags
Spiders don't read images but they do read their textual descriptions in the <alt> tag, so if you have images on your page, fill in the <alt> tag with some keywords about them.
+2
8
Keywords in metatags
Less and less important, especially for Google. Yahoo! and Bing still rely on them, so if you are optimizing for Yahoo! or Bing, fill these tags properly. In any case, filling these tags properly will not hurt, so do it.
+1
9
Keyword proximity
Keyword proximity measures how close in the text the keywords are. It is best if they are immediately one after the other (e.g. “dog food”), with no other words between them. For instance, if you have “dog” in the first paragraph and “food” in the third paragraph, this also counts but not as much as having the phrase “dog food” without any other words in between. Keyword proximity is applicable for keyword phrases that consist of 2 or more words.
+1
10
Keyword phrases
In addition to keywords, you can optimize for keyword phrases that consist of several words, e.g. “SEO services”. It is best when the keyword phrases you optimize for are popular ones, so you can get a lot of exact matches of the search string, but sometimes it makes sense to optimize for 2 or 3 separate keywords (“SEO” and “services”) rather than for one phrase that might only occasionally get an exact match.
+1
11
Secondary keywords
Optimizing for secondary keywords can be a gold mine, because while everybody else is optimizing for the most popular keywords, there will be less competition (and probably more hits) for pages optimized for the minor ones. For instance, “real estate new jersey” might get a thousand times fewer hits than “real estate” alone, but if you are operating in New Jersey, you will get less traffic that is considerably better targeted.
+1
12
Keyword stemming
For English this is not much of a factor, because words that stem from the same root (e.g. dog, dogs, doggy, etc.) are considered related: if you have “dog” on your page, you will get hits for “dogs” and “doggy” as well. For other languages, keyword stemming can be an issue, because different words stemming from the same root may be treated as unrelated, and you might need to optimize for all of them.
+1
13
Synonyms
Optimizing for synonyms of the target keywords, in addition to the main keywords, is good for sites in English, for which search engines are smart enough to consider synonyms when ranking sites. For many other languages, synonyms are not taken into account when calculating rankings and relevancy.
+1
14
Keyword Mistypes
Spelling errors are very frequent, and if you know that your target keywords have popular misspellings or alternative spellings (e.g. Christmas and Xmas), you might be tempted to optimize for them. Yes, this might get you some more traffic, but having spelling mistakes on your site does not make a good impression, so you'd better not do it, or do it only in the metatags.
0
15
Keyword dilution
When you are optimizing for an excessive amount of keywords, especially unrelated ones, this will affect the performance of all your keywords and even the major ones will be lost (diluted) in the text.
-2
16
Keyword stuffing
Any artificially inflated keyword density (10% and over) is keyword stuffing and you risk getting banned from search engines.
-3
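The density and proximity checks above (items 3 and 9) can be automated with a short script. This is only a rough sketch in Python: the 3-7% and 10% thresholds come from the checklist itself, while the tokenizing regex and the example text are my own illustration.

```python
import re

def keyword_density(text, keyword):
    """Share of the page's words taken up by the keyword phrase, in percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)  # each hit covers n words of the text

def min_keyword_gap(text, w1, w2):
    """Smallest number of intervening words between w1 and w2 (0 = adjacent)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    pos1 = [i for i, w in enumerate(words) if w == w1]
    pos2 = [i for i, w in enumerate(words) if w == w2]
    if not pos1 or not pos2:
        return None
    return min(abs(i - j) for i in pos1 for j in pos2) - 1

text = ("Dog food should be chosen with care. Good dog food keeps "
        "your dog healthy, and healthy dogs are happy dogs.")
print(keyword_density(text, "dog food"))     # 20.0: 2 hits x 2 words / 20 words
print(min_keyword_gap(text, "dog", "food"))  # 0: the words appear side by side
```

By the checklist's own numbers, 20% on a real page would be keyword stuffing territory; the toy text is dense only because it is two sentences long.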

Links - internal, inbound, outbound
17
Anchor text of inbound links
As discussed in the Keywords section, this is one of the most important factors for good rankings. It is best if you have a keyword in the anchor text but even if you don't, it is still OK.
+3
18
Origin of inbound links
Besides the anchor text, it is important if the site that links to you is a reputable one or not. Generally sites with greater Google PR are considered reputable.
+3
19
Links from similar sites
Having links from similar sites is very, very useful. It indicates that the competition is voting for you and you are popular within your topical community.
+3
20
Links from .edu and .gov sites
These links are precious, because .edu and .gov sites are more reputable than .com, .biz, .info, etc. domains. Additionally, such links are hard to obtain.
+3
21
Number of backlinks
Generally, the more, the better. But the reputation of the sites that link to you matters more than their number. Their anchor text is also important: is there a keyword in it, how old are the links, and so on.
+3
22
Anchor text of internal links
This also matters, though not as much as the anchor text of inbound links.
+2
23
Around-the-anchor text
The text that is immediately before and after the anchor text also matters because it further indicates the relevance of the link – i.e. if the link is artificial or it naturally flows in the text.
+2
24
Age of inbound links
The older, the better. Getting many new links in a short time suggests buying them.
+2
25
Links from directories
Great, though it strongly depends on which directories. Being listed in DMOZ, Yahoo Directory and similar directories is a great boost for your ranking but having tons of links from PR0 directories is useless and it can even be regarded as link spamming, if you have hundreds or thousands of such links.
+2
26
Number of outgoing links on the page that links to you
The fewer, the better for you because this way your link looks more important.
+1
27
Named anchors
Named anchors (the target place of internal links) are useful for internal navigation but are also useful for SEO because you stress additionally that a particular page, paragraph or text is important. In the code, named anchors look like this: <A href= “#dogs”>Read about dogs</A> and “#dogs” is the named anchor.
+1
28
IP address of inbound link
Google denies that they discriminate against links that come from the same IP address or C class of addresses, so for Google the IP address can be considered neutral to the weight of inbound links. However, Bing and Yahoo! may discard links from the same IPs or IP classes, so it is always better to get links from different IPs.
+1
29
Inbound links from link farms and other suspicious sites
This does not affect you in any way, provided the links are not reciprocal. The idea is that it is beyond your control what a link farm links to, so you are not penalized when such sites link to you, because it is not your fault. In any case, you'd better stay away from link farms and similar suspicious sites.
0
30
Many outgoing links
Google does not like pages that consist mainly of links, so you'd better keep them under 100 per page. Having many outgoing links does not get you any ranking benefit and could even make your situation worse.
-1
31
Excessive linking, link spamming
It is bad for your rankings when you have many links to/from the same sites (even if it is not a cross-linking scheme or links to bad neighbors), because it suggests link buying or at least spamming. At best, only some of the links are taken into account for SEO rankings.
-1
32
Outbound links to link farms and other suspicious sites
Unlike inbound links from link farms and other suspicious sites, outbound links to bad neighbors can drown you. You need to check the status of the sites you link to periodically, because sometimes good sites become bad neighbors and vice versa.
-3
33
Cross-linking
Cross-linking occurs when site A links to site B, site B links to site C and site C links back to site A. This is the simplest example but more complex schemes are possible. Cross-linking looks like disguised reciprocal link trading and is penalized.
-3
34
Single pixel links
When you have a link that is a pixel or so wide, it is invisible to humans, so nobody will click on it, and it is obvious that the link is an attempt to manipulate search engines.
-3
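A quick way to audit the outgoing-link count from item 30 is to parse the page and count its anchors. A minimal sketch using Python's standard-library HTMLParser; the sample page and the exact 100-link cutoff (taken from the item above) are only for illustration:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collects every <a href> on a page so the total can be
    checked against the roughly-100-links-per-page guideline."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = ('<p><a href="/seo-services.html">SEO services</a> and '
        '<a href="http://example.com/">a partner site</a>.</p>')
counter = LinkCounter()
counter.feed(page)
print(len(counter.links))        # number of outgoing links found
print(len(counter.links) > 100)  # flag pages that consist mainly of links
```

The same parser could be extended to separate internal from external links by checking whether the href starts with your own domain.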

Metatags
35
<Description> metatag
Metatags are becoming less and less important, but if there are metatags that still matter, these are the <description> and <keywords> ones. Use the <Description> metatag to write the description of your site. Besides the fact that metatags still rock on Bing and Yahoo!, the <Description> metatag has one more advantage: it sometimes pops up as the description of your site in search results.
+1
36
<Keywords> metatag
The <Keywords> metatag also matters, though as all metatags it gets almost no attention from Google and some attention from Bing and Yahoo! Keep the metatag reasonably long – 10 to 20 keywords at most. Don't stuff the <Keywords> tag with keywords that you don't have on the page, this is bad for your rankings.
+1
37
<Language> metatag
If your site is language-specific, don't leave this tag empty. Search engines have more sophisticated ways of determining the language of a page than relying on the <language> metatag, but they still consider it.
+1
38
<Refresh> metatag
The <Refresh> metatag is one way to redirect visitors from your site to another. Only use it if you have recently migrated your site to a new domain and need to redirect visitors temporarily. When used for a long time, the <refresh> metatag is regarded as an unethical practice, and this can hurt your rankings. In any case, a 301 redirect is much better.
-1
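To check a page against the title and metatag guidelines above, you can pull the <title> and <meta name="description"> out of the document head and measure them. A minimal Python sketch using only the standard library; the sample head is made up, and the 6-7 word title limit comes from item 1 of the Keywords section:

```python
from html.parser import HTMLParser

class HeadInspector(HTMLParser):
    """Extracts the <title> text and the description metatag content."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html_head = ('<head><title>Affordable SEO Services India</title>'
             '<meta name="description" '
             'content="Quality SEO and link building services."></head>')
inspector = HeadInspector()
inspector.feed(html_head)
print(inspector.title)                    # the page title as searchers see it
print(len(inspector.title.split()) <= 7)  # True: within the 6-7 word limit
```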

Content
39
Unique content
Having more content (relevant content, which is different from the content on other sites both in wording and topics) is a real boost for your site's rankings.
+3
40
Frequency of content change
Frequent changes are favored. It is great when you constantly add new content but it is not so great when you only make small updates to existing content.
+3
41
Keywords font size
When a keyword in the document text is in a larger font size than the rest of the on-page text, it is more noticeable and therefore treated as more important than the rest of the text. The same applies to headings (<h1>, <h2>, etc.), which are generally in a larger font size than the body text.
+2
42
Keywords formatting
Bold and italic are another way to emphasize important words and phrases. However, use bold, italic and larger font sizes within reason because otherwise you might achieve just the opposite effect.
+2
43
Age of document
Recent documents (or at least regularly updated ones) are favored.
+2
44
File size
Generally, long pages are not favored; you can often achieve better rankings with 3 short pages rather than 1 long page on a given topic, so split long pages into multiple smaller ones.
+1
45
Content separation
From a marketing point of view, content separation (based on IP, browser type, etc.) might be great, but for SEO it is bad, because when you have one URL with differing content, search engines get confused about what the actual content of the page is.
-2
46
Poor coding and design
Search engines say they do not want poorly designed and coded sites. Hardly any sites are banned for messy code or ugly images, but when the design or coding of a site is poor enough, the site might not be indexable at all; in this sense, poor code and design can harm you a lot.
-2
47
Illegal Content
Using other people's copyrighted content without their permission or using content that promotes legal violations can get you kicked out of search engines.
-3
48
Invisible text
This is a black hat SEO practice, and when spiders discover that you have text meant specially for them but not for humans, don't be surprised by the penalty.
-3
49
Cloaking
Cloaking is another banned technique. It partially involves content separation, because spiders see one page (highly optimized, of course), while everybody else is presented with another version of the same page.
-3
50
Doorway pages
Creating pages that aim to trick spiders into believing your site is highly relevant when it is not is another way to get kicked out of search engines.
-3
51
Duplicate content
When you have the same content on several pages on the site, this will not make your site look larger because the duplicate content penalty kicks in. To a lesser degree duplicate content applies to pages that reside on other sites but obviously these cases are not always banned – i.e. article directories or mirror sites do exist and prosper.
-3
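A rough way to spot the duplicate content problem from item 51 before search engines do is to compare pages for similarity. This sketch uses Python's difflib; the 0.9 threshold is an arbitrary illustration, not a number any search engine publishes, and the sample pages are made up:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Ratio between 0 and 1; values near 1 suggest near-duplicate pages."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

page_a = "We provide quality SEO link building services for your site."
page_b = "We provide quality SEO link building services for your website."
page_c = "Social media optimization creates buzz on Facebook and Twitter."

print(similarity(page_a, page_b) > 0.9)  # True: near-duplicates
print(similarity(page_a, page_c) > 0.9)  # False: distinct content
```

For real sites you would compare the extracted body text of whole pages, not single sentences, but the idea is the same.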

Visual Extras and SEO
52
JavaScript
If used wisely, JavaScript will not hurt. But if your main content is displayed through JavaScript, it becomes more difficult for spiders to follow, and if the JavaScript code is a mess that spiders can't follow, this will definitely hurt your rankings.
0
53
Images in text
Having a text-only site is boring, but having many images and no text is an SEO sin. Always provide a meaningful description of an image in the <alt> tag, but don't stuff it with keywords or irrelevant information.
0
54
Podcasts and videos
Podcasts and videos are becoming more and more popular, but as with all non-textual goodies, search engines can't read them. If you don't provide a transcript of the podcast or video, it is as if the content were not there at all, because it will not be indexed by search engines.
0
55
Images instead of text links
Using images instead of text links is bad, especially when you don't fill in the <alt> tag. But even if you fill in the <alt> tag, it is not the same as having a bold, underlined, 16-pt. link, so use images for navigation only if this is really vital for the graphic layout of your site.
-1
56
Frames
Frames are very, very bad for SEO. Avoid using them unless really necessary.
-2
57
Flash
Spiders don't index the content of Flash movies, so if you use Flash on your site, don't forget to give it an alternative textual description.
-2
58
A Flash home page
Fortunately this epidemic seems to have come to an end. Having a Flash home page (and sometimes whole sections of your site) with no HTML version is SEO suicide.
-3

Domains, URLs, Web Mastery
59
Keyword-rich URLs and filenames
A very important factor, especially for Yahoo! and Bing.
+3
60
Site Accessibility
Another fundamental issue that is often neglected. If the site (or separate pages) is inaccessible because of broken links, 404 errors, password-protected areas or other similar reasons, then the site simply can't be indexed.
+3
61
Sitemap
It is great to have a complete and up-to-date sitemap, spiders love it, no matter if it is a plain old HTML sitemap or the special Google sitemap format.
+2
62
Site size
Spiders love large sites, so generally it is the bigger, the better. However, big sites can become user-unfriendly and difficult to navigate, so sometimes it makes sense to separate a big site into a couple of smaller ones. On the other hand, hardly any sites are penalized for having 10,000+ pages, so don't split your site into pieces only because it is getting larger and larger.
+2
63
Site age
Similarly to wine, older sites are respected more. The idea is that an old, established site is more trustworthy (it has been around and is here to stay) than a new site that has just popped up and might soon disappear.
+2
64
Site theme
It is not only keywords in URLs and on page that matter. The site theme is even more important for good ranking because when the site fits into one theme, this boosts the rankings of all its pages that are related to this theme.
+2
65
File Location on Site
File location is important and files that are located in the root directory or near it tend to rank better than files that are buried 5 or more levels below.
+1
66
Domains versus subdomains, separate domains
Having a separate domain is better – i.e. instead of having blablabla.blogspot.com, register a separate blablabla.com domain.
+1
67
Top-level domains (TLDs)
Not all TLDs are equal. There are TLDs that are better than others. For instance, the most popular TLD – .com – is much better than .ws, .biz, or .info domains but (all equal) nothing beats an old .edu or .org domain.
+1
68
Hyphens in URLs
Hyphens between the words in an URL increase readability and help with SEO rankings. This applies both to hyphens in domain names and in the rest of the URL.
+1
69
URL length
Generally doesn't matter, but a very long URL starts to look spammy, so avoid having more than 10 words in the URL (3 or 4 for the domain name itself and 6 or 7 for the rest of the address is acceptable).
0
70
IP address
Could matter only for shared hosting or when a site is hosted with a free hosting provider, when the IP or the whole C-class of IP addresses is blacklisted due to spamming or other illegal practices.
0
71
Adsense will boost your ranking
Adsense is not related in any way to SEO ranking. Google will definitely not give you a ranking bonus because of hosting Adsense ads. Adsense might boost your income but this has nothing to do with your search rankings.
0
72
Adwords will boost your ranking
Similarly to Adsense, Adwords has nothing to do with your search rankings. Adwords will bring more traffic to your site, but this will not affect your rankings in any way whatsoever.
0
73
Hosting downtime
Hosting downtime is directly related to accessibility because if a site is frequently down, it can't be indexed. But in practice this is a factor only if your hosting provider is really unreliable and has less than 97-98% uptime.
-1
74
Dynamic URLs
Spiders prefer static URLs, though you will see many dynamic pages on top positions. Long dynamic URLs (over 100 characters) are really bad and in any case you'd better use a tool to rewrite dynamic URLs in something more human- and SEO-friendly.
-1
75
Session IDs
This is even worse than dynamic URLs. Don't use session IDs for information that you'd like to be indexed by spiders.
-2
76
Bans in robots.txt
If indexing of a considerable portion of the site is banned, this is likely to affect the nonbanned part as well because spiders will come less frequently to a “noindex” site.
-2
77
Redirects (301 and 302)
When not applied properly, redirects can hurt a lot: the target page might not open, or worse, the redirect can be regarded as a black hat technique when the visitor is immediately taken to a different page.
-3
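Several of the URL factors above (length, dynamic parameters, session IDs) lend themselves to a simple automated check. A hedged Python sketch: the 10-word limit comes from item 69, while the session-ID patterns are just common examples and the bare `sid` pattern can false-positive on ordinary words.

```python
import re
from urllib.parse import urlparse

def url_warnings(url):
    """Flags the URL problems named in the checklist: excessive length,
    dynamic parameters, and session IDs."""
    warnings = []
    parsed = urlparse(url)
    # Split domain and path into "words" the way a reader would.
    words = re.findall(r"[a-z0-9]+", (parsed.netloc + parsed.path).lower())
    if len(words) > 10:
        warnings.append("more than 10 words in the URL")
    if parsed.query:
        warnings.append("dynamic parameters in the URL")
    if re.search(r"(sessionid|phpsessid)", url.lower()):
        warnings.append("session ID in the URL")
    return warnings

print(url_warnings("http://domainname.com/seo-services.html"))
print(url_warnings("http://domainname.com/page.php?PHPSESSID=abc123"))
```

The first URL passes cleanly; the second is flagged for both dynamic parameters and a session ID, the two problems items 74 and 75 warn about.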
