Thursday, August 21, 2008

Advantages and Disadvantages of Viral Marketing

In today's competitive business world, every business owner looks for different methods to promote their business globally. One of the most widely available, familiar and successful marketing strategies is viral marketing. The concept behind viral marketing is word-of-mouth, i.e. using influencers to make peer-to-peer product recommendations. The term viral marketing was coined by venture capitalist Steve Jurvetson in 1997 to describe the referral-marketing program run by Hotmail, one of the first free e-mail services. Several outsourcing companies around the world provide viral marketing services. Viral marketing also has some disadvantages.
The advantages of viral marketing are high credibility, low costs, great reach, high efficiency and the opportunity for continuous promotion adjustments.
The main reasons for the wide popularity of viral marketing are:
1.Socializing and networking have brought people much closer together, so relatives and friends are easily accessible over the net.
2.Viral marketing is one of the cost-free methods of promoting a business.
3.The time and resources are easily available. In this type of marketing, one person contacts their friends or relatives, who in turn contact more and more people, and the chain goes on. It generates revenue from advertisement.
In spite of the probable risks, viral marketing has the ability to draw the greatest potential audience at a convincingly low cost, raising the reach of your business.
The main disadvantages of the viral marketing are:
1.Association with unknown groups - The strength of viral marketing depends on the transfer of messages from person to person. During this process, the message may reach someone you would rather not be associated with.
2.Spam threats - If done badly, viral marketing can lead to significant spam issues.
3.Purely financial-based offers - avoid making offers that are merely financial in nature.
4.Brand dilution

(source-www.outsourcestrategies.com)

Guidelines For Yahoo Optimization

The important guidelines listed below will help you get your business website indexed and ranked in Yahoo's top ten results for your desired keywords, and generate increased targeted traffic to your website.
1.Yahoo prefers Original and unique content with a genuine value for the reader
2.Pages that are designed primarily for human viewing are ranked higher
3.Use hyperlinks to help people find interesting and related content
4.Use Meta tags, i.e. title & description, that accurately describe the contents of your Web page
5.Do not stuff your pages with duplicate or repeated content; Yahoo may penalize you for spamming
6.Don't use doorway pages
7.Provide a user friendly navigation
8.Don't use methods that artificially inflate search engine ranking
9.Don't use text in the same color as the background, text that is too small or not easily read, or text placed in an area of the web page not visible to users.
10.Don't use Cloaking technique
11.Don't participate in link exchanges with "link farms"
12.Don't stuff your pages with excessive or off-topic keywords
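Item 4 above (accurate title and description tags) might look like this in practice; the site name and wording are hypothetical examples:

```html
<head>
  <title>Dog Training Guides | Example Pet Site</title>
  <meta name="description"
        content="Step-by-step dog training guides covering obedience,
                 house training and puppy socialisation.">
  <meta name="keywords" content="dog training, obedience, puppy tips">
</head>
```

The description should read like a one-sentence summary of that specific page, not a generic slogan repeated across the site.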
(source-www.nddw.com)

Guidelines For Google Optimization

The guidelines listed below will help you get your site indexed and ranked by Google, which strongly recommends implementing them. Unethical practices may lead to your site being completely removed from the Google rankings or penalized heavily by reduction or removal of PageRank. Once your site is penalized by Google, it may no longer show up in Google rankings for a long time.
1.Make a site with a clear navigation, hierarchy and text links. Every page should be reachable from at least one static text link.
2.Create a site map for your users with links that point to all parts of your site.
3.Create a useful, information-rich site with pages that clearly and accurately describe the content of your services.
4.Include in your content the words users would type to find your pages.
5.Always try to use text instead of images to display important keywords, content or links.
6.Make sure that your TITLE and ALT tags are relevant, descriptive & accurate.
7.Ensure that there are no broken links and site HTML is correct.
8.Try not to use "?" character in dynamic pages.
9.Keep the links on your pages to less than 100.
10.Get other relevant sites to link to yours.
11.Submit your site to Google at http://www.google.com/addurl.html.
12.Submit an XML sitemap file at Google Webmaster Tools. Google uses the Sitemap file to learn about the structure and coverage of your site.
13.Submit your site to relevant directories such as the Open Directory Project and Yahoo Directory, as well as to other industry-specific expert sites.
14.Try to avoid JavaScript, cookies, session IDs, frames, DHTML, or Flash as search engine spiders may have trouble crawling your site.
15.Upload the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled.
16.Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
17.Make pages for users, not for search engines.
18.Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as "cloaking."
19.Avoid unethical tricks intended to improve search engine rankings.
20.Never participate in link schemes designed to increase your site's ranking or PageRank.
21.Don't link to web spammers or "bad neighborhoods" on the web, as your own ranking and PR may be affected adversely by those links.
22.Always submit your site manually to directories and search engines.
23.Don't use hidden text or hidden links on your site.
24.Don't employ cloaking or sneaky redirects.
25.Never send automated queries to Google.
26.Don't create multiple pages, subdomains, or domains with substantially duplicate content; this may get your site blacklisted in Google.
27.Don't create pages that install viruses or trojans.
28.Avoid "doorway" pages with little or no original content.
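Items 15 and 16 above boil down to a few lines in a robots.txt file; a sketch (the directory names are hypothetical examples):

```
User-agent: *
Disallow: /search/
Disallow: /tag-pages/
```

With this in place, crawlers skip the auto-generated search-result and tag pages while still indexing the rest of the site.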
(source-www.nddw.com)

Social Network Marketing

Social Media Networking Marketing Services
Social network marketing services aim to help companies promote their brand and market their products & services through online social networking media. Social network marketing is a growing trend and a powerful online medium which is effectively used as a search engine marketing and optimization tool.
Salient Features of Social Network Marketing
It spreads brand awareness in the social network websites.
Increases targeted traffic from social networking portals to your website.
It adds to search engine optimization & Internet marketing strategies.
Increases page rank, link popularity and one way links.
Social Media Networking & Optimization Services Includes
Blog Marketing
Blogging refers to actively creating your own blogs and sharing your views on related industry blogs by posting comments, in order to build your identity and awareness of your product or services.
Forum Marketing
Forum marketing is the art of sharing or gaining knowledge of a subject by becoming part of the most active forums in your industry, which creates a presence for your brand in the community and increases targeted traffic or leads to your website.
Article writing and submissions
Writing articles on the subject of your services and submitting them to related portals is a very effective way to spread the word about your services and create brand value for your company. We help our clients by writing and submitting well-researched articles, thus increasing their Google page rank and the targeted traffic to their website.
Issue of Press Releases
Writing and submitting effective, informative and creative press releases is a very powerful tool in SEO. New Delhi Digital Works helps you create press releases and submit them online on top news portals.
(source-www.nddw.com)

Viral marketing: Tips And Techniques

The most vital aspect of this recent marketing technique, which every businessman should be aware of, is that it is totally free. Viral marketing can bring thousands of clients to a business website when it is implemented with the right idea, approach and planning.
Viral Marketing through social networking websites
A social networking website is a network of people and friends who use the internet to communicate, socialise and share information with each other through one website. Popular websites of this nature are MySpace, Facebook, Bebo and others. Such websites are used as viral marketing channels: people are offered incentives to pass on marketing material. The incentive usually proves to be interesting or entertaining, and is generally not related to the business involved.
Effective messaging through video
Viral marketing may seem ineffective, but it becomes effective when applied on these networking websites. If a business creates a video that is interesting and entertaining, viewers will be compelled to pass it on to their friends to share it. By adding your website link or marketing message to that video as a footer, and adding your URL to the video description, your business details are passed along whenever the video is shared. This is what makes viral marketing on social networking websites so powerful for businesses of any size.
Also put emphasis on content
The other important principle of successful viral marketing is not to flood the entire media with your material. The media, journalists and members of the public will value your contribution much more if every piece has strong content and a good angle. If your content is not worthwhile, distributing as many viral releases or PRs as you can will only result in people devaluing your presence.
Make an effective profile
Websites like MySpace offer profiles that can be used to some extent. Creating a profile for your business allows you to add people all over the world as friends; they can see your profile, and you can deliver any marketing message you choose to present. To make effective use of this method, an interesting and unique angle is required. It is good to create an interesting, make-believe character as the profile owner and offer attention-grabbing incentives to visit your website.
Benefits of viral marketing
The great benefit of viral marketing on social networks, and elsewhere, is that it is 100 percent free and provides good results. Good marketing agencies also apply public relations and marketing knowledge to find the right angle for your material, which results in more publicity. So, if the budget permits, it is highly recommended to consider professional management for your viral campaigns.
(source-www.nddw.com)

VIRAL MARKETING

Viral marketing and viral advertising refer to marketing techniques that use pre-existing social networks to produce increases in brand awareness or to achieve other marketing objectives (such as product sales) through self-replicating viral processes, analogous to the spread of pathological and computer viruses. It can be word-of-mouth delivered or enhanced by the network effects of the Internet.
1. Viral marketing is a marketing phenomenon that facilitates and encourages people to pass along a marketing message voluntarily.
2. Viral promotions may take the form of video clips, interactive Flash games, advergames, ebooks, brandable software, images, or even text messages. The basic form of viral marketing is not infinitely sustainable.
3. Viral marketing is based on natural human behavior: it is claimed that a customer tells an average of three people about a product or service he/she likes, and eleven people about a product or service he/she did not like.
The goal of marketers interested in creating successful viral marketing programs is to identify individuals with high Social Networking Potential (SNP) and create viral messages that appeal to this segment of the population and have a high probability of being passed along.
The term "viral marketing" is also sometimes used pejoratively to refer to stealth marketing campaigns - the use of varied kinds of astroturfing, both online and offline, to create the impression of spontaneous word-of-mouth enthusiasm.
(source-http://en.wikipedia.org)

Wednesday, August 13, 2008

COOL WAYS TO ATTRACT TRAFFIC TO YOUR WEBSITE

In order to attract traffic to your website, you have to first ask yourself, what is in it for your visitor. There must be a pull factor - a bait on your website to attract traffic to visit your website.
Generally speaking, the bait must be something that people want. It must be something that people are looking for and it must be something interesting to them.
1. Provide free subscription to your e-zine or a free ebook.
It is very common nowadays to give away a free e-zine or free ebook. However, some beginners make the mistake of giving away a free ebook that is not related to their niche. For example, people who visit your dog-training website are looking for information on dog training and might not be interested in an ebook on "How to lose weight". Bottom line: give an ebook or e-zine related to your niche.
Almost everyone is publishing an e-zine nowadays, so it's important to give something extra with the free subscription. You could offer a free gift or advertising when people subscribe.
You could include a link back to your website in your ebook and allow other people to give it away. If you don't want to take the time to write one, you could ask other writers for permission to use their articles.
2. Provide your visitors with free content.
Your content will be more attractive to your visitors if it's up-to-date or original. You could also offer people the option to reprint the content in their e-zine or web site.
People are surfing the internet for information and if your website contain information related to their area of interest, they are going to love your sites. So make sure that the information on your site is of relevant and good quality content.
Nowadays, blogging has become very popular as it is the best way to provide free content and keep it up to date. By writing a blog, you can build a community around your blog readers and create a trusting relationship to keep them coming back to your blog for fresh content.
3. Hold free online classes or teleseminars.
You could provide free video training on your website, or hold teleseminars or a free 7-day ecourse. The idea of "live" information will definitely entice people to visit your web site, and you will become known as an expert on the topic. This is a very good way to brand your name and get lots of targeted traffic to your website. Many expert marketers use this method to brand themselves and sell their products through these free online classes. You can also use free online classes as a viral tool to attract mass traffic.
4. Provide free software.
Software is a great gift as it is often perceived as of higher value than an ebook or newsletter.
It could be freeware, shareware, demos etc. You could even turn part of your site into a free software directory. The free software should be related to your niche and able to assist your visitors in some way. For example, if your website is about weight loss, you can include a weight loss calculator on your website and allow your visitors free use or free download of the software.
If you know how to create software, you can include your links or banners within the software and allow other people to give it away. It is another great way to create viral marketing for your website.
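As a sketch of the kind of simple giveaway tool the weight-loss example describes, here is a body-mass-index calculator using the standard BMI formula (the function name and sample values are illustrative):

```python
def bmi(weight_kg, height_m):
    """Body-mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

# A 70 kg person who is 1.75 m tall:
print(round(bmi(70, 1.75), 1))  # → 22.9
```

Wrapped in a small web form, a utility like this gives visitors a reason to return, and the page hosting it can carry your branding and links.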
(source-www.promotionworld.com)

SEM TECHNIQUES

It is often said by many search engine optimisation experts that content is king. Others proclaim that a text linking strategy is king. The simple fact is, if you are to be successful in your site promotion efforts, all aspects of Search Engine Marketing (SEM) are essential to the mix. SEM consists of all the elements required to successfully promote a website, the objective being to improve a website's ranking on Google, Yahoo, MSN and other major search engines. These elements may be classified as:
1) On-page optimisation (all the techniques involved in manipulating page content).
2) Off-page optimisation (all the techniques involved in website promotion).
A good way to think of it is: on-page optimisation is used to promote your product, service or information, whereas off-page optimisation is used to promote your website.
On Page Optimisation
Site Readiness - W3C Compliance: Search engine robots do not like broken HTML code. Your rankings will suffer if your pages do not conform to World Wide Web Consortium (W3C) standards. A free online HTML validation service is available at: http://validator.w3.org/. Search engines like Google now check CSS code, so this must be validated also. A tool for this is available at: http://jigsaw.w3.org/css-validator/.
Site Readiness - Broken Links: It should be obvious, but is often overlooked, that broken links are a problem in terms of search engine optimisation. Clearly, if the search engine robot cannot finish spidering your site due to broken links, then some pages will not be indexed. A tool for checking broken links can be found at: http://validator.w3.org/checklink
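The first step of any broken-link check is collecting a page's links; a minimal sketch using Python's standard library (the HTML snippet is a made-up example). Each collected href would then be requested to see whether it returns a 404:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag seen in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed('<p><a href="/about.html">About</a> <a href="/old-page.html">Old</a></p>')
print(parser.links)  # → ['/about.html', '/old-page.html']
```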
Site Readiness - Keyword Research: Analyzing your niche market, or business sector, allows you to create an appropriate set of targeted keywords (the most obvious ones are not necessarily the best). A free keyword tool can be downloaded from: http://www.goodkeywords.com/.
Page Optimisation - Copywriting: Once you have decided on the appropriate keywords/key phrases for each page, you need to work these keywords into the visible text on each page to be optimised. Remember that you are writing for users, not search engines, and the text must look natural.
Page Optimisation - HTML Tag Optimisation: All the appropriate HTML tags on each page should be optimised for best effect. These include:
1) Title tag.
2) Meta keyword tag.
3) Meta description tag.
4) Header tags.
Also, don't forget the image alt tags, and very importantly the link text in your anchor tags.
Page Optimisation - Build Content: Keep adding new content to your website. If necessary publish articles and RSS feeds. Google loves new content.
Off Page Optimisation
Once you have finished the on-page optimisation, your attention needs to be focused on off-page optimisation.
The essential components of off-page optimisation are:
1) Link building - begin a reciprocal and one way linking strategy.
2) Directory listings - get listed in human edited directories like Dmoz.
3) Article writing - a superb way to generate one way links.
4) Blog contributions - yet another opportunity for one way links.
5) Press releases - announce your presence to the world. More one way links.
Conclusion: All of the above are essential to a successful site promotion campaign. If you wish to improve your Google ranking (the other search engines will follow), do not neglect one area in favour of another. Do it correctly. Do it all. Then watch your rankings soar.
(source-www.promotionworld.com)

Thursday, August 7, 2008

Web Hosting And SEO

NOWHERE to be found on Search Engines ?
If you have submitted your site again and again to search engines but are still unable to get it indexed, it MAY be your web hosting provider who is responsible. If a domain sharing your server and IP address is penalized by a search engine on account of spamming, then your website may also be banned or penalized. This can happen with virtual shared IP hosting.
Another situation may arise if your website resides on a server containing illegal adult content; if that site is on a search engine's black list, your website may be banned as well. While you should never assume that your site is blacklisted, you should take precautions. Even if your site is dismissed from the rankings temporarily, you can probably clear up the mistake by contacting the search engine.
To be safe, it is best to host with a reputed web hosting provider. Your web hosting provider should be up 24/7. For proper website service it must be up 100% of the time, so that users and search engines aren't faced with a blank page or a 404 error. Search engines have no specific schedules for crawling, so your website must be up at all times in order to maintain your search engine ranking.
Reputed Web Hosting Provider
The reputation of the web host should be good with search engines. Although there is no clear way to identify them, web hosts that are very popular can generally be trusted. Some of the popular SEO-friendly hosting providers are:
Directi
Interland
1and1
Enom
Sites hosted by free web hosting providers do not usually rank well in search engines for competitive keywords.
Conclusion
You may benefit from sticking with the most popular hosting service providers, even if they are a little more expensive, rather than going for the best deal from a host with no reputation. It is also preferable to have an individual IP address so that the risk of index removal is minimized (usually hosting providers provide dedicated IP addresses only with the purchase of SSL).
(source-http://www.webconfs.com)

Wednesday, August 6, 2008

Robots.txt

It is great when search engines frequently visit your site and index your content but often there are cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling, otherwise you risk being imposed a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will also prefer that search engines do not index these pages (although in this case the only sure way for not indexing sensitive data is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and javascript from indexing, you also need a way to tell spiders to keep away from these items.
One way to tell search engines which files and folders on your Web site to avoid is with the use of the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines about your wishes is to use a robots.txt file.
What Is Robots.txt?
Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall, or a kind of password protection); putting up a robots.txt file is something like putting a note "Please, do not enter" on an unlocked door - you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.
The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.
The concept and structure of robots.txt has been developed more than a decade ago and if you are interested to learn more about it, visit http://www.robotstxt.org/ or you can go straight to the Standard for Robot Exclusion because in this article we will deal only with the most important aspects of a robots.txt file. Next we will continue with the structure a robots.txt file.
Structure of a Robots.txt File
The structure of a robots.txt is pretty simple (and barely flexible) – it is an endless list of user agents and disallowed files and directories. Basically, the syntax is as follows:
User-agent:
Disallow:
"User-agent" names the search engines' crawlers, and "Disallow:" lists the files and directories to be excluded from indexing. In addition to "User-agent:" and "Disallow:" entries, you can include comment lines - just put the # sign at the beginning of the line:
# All user agents are disallowed to see the /temp directory.
User-agent: *
Disallow: /temp/
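Python's standard library can parse exactly this kind of file, which makes it easy to check what a rule actually blocks (the host name is a placeholder):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /temp/",
])

# The /temp/ directory is blocked for every crawler; everything else is allowed.
print(rp.can_fetch("AnyBot", "http://example.com/temp/draft.html"))  # → False
print(rp.can_fetch("AnyBot", "http://example.com/index.html"))       # → True
```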
The Traps of a Robots.txt File
When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start, if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos are misspelled user-agents, directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find but in some cases validation tools help.
The more serious problem is with logical errors. For instance:
User-agent: *
Disallow: /temp/
User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/
The above example is from a robots.txt that allows all agents to access everything on the site except the /temp directory. Up to here it is fine but later on there is another record that specifies more restrictive terms for Googlebot. When Googlebot starts reading robots.txt, it will see that all user agents (including Googlebot itself) are allowed to all folders except /temp/. This is enough for Googlebot to know, so it will not read the file to the end and will index everything except /temp/ - including /images/ and /cgi-bin/, which you think you have told it not to touch. You see, the structure of a robots.txt file is simple but still serious mistakes can be made easily.
Tools to Generate and Validate a Robots.txt File
Having in mind the simple syntax of a robots.txt file, you can always read it yourself to see if everything is OK, but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtm These tools report common mistakes like missing slashes or colons which, if not detected, compromise your efforts. For instance, if you have typed:
User agent: *
Disallow: /temp/
this is wrong because there is no hyphen between "User" and "agent", so the syntax is incorrect.
In those cases when you have a complex robots.txt file - i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude - writing the file manually can be a real pain. But do not worry: there are tools that will generate the file for you. What is more, there are visual tools that allow you to point and select which files and folders are to be excluded. Even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much of a help unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is more than nothing.
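The kind of check these validators perform can be sketched in a few lines of Python. The field names follow the robots exclusion standard; the checker itself is a simplified illustration, not a full validator:

```python
import re

# Non-blank, non-comment lines must look like "Field: value"
# with a recognised field name.
KNOWN_FIELDS = {"user-agent", "disallow", "allow", "sitemap"}

def check_robots(text):
    errors = []
    for number, line in enumerate(text.splitlines(), 1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue
        match = re.match(r"([A-Za-z-]+)\s*:", stripped)
        if not match or match.group(1).lower() not in KNOWN_FIELDS:
            errors.append((number, stripped))
    return errors

# "User agent" (space instead of a hyphen) is flagged; the valid line passes.
print(check_robots("User agent: *\nDisallow: /temp/"))  # → [(1, 'User agent: *')]
```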

Tuesday, August 5, 2008

Importance Of Sitemaps

There are many SEO tips and tricks that help in optimizing a site but one of those, the importance of which is sometimes underestimated is sitemaps. Sitemaps, as the name implies, are just a map of your site - i.e. on one single page you show the structure of your site, its sections, the links between them, etc. Sitemaps make navigating your site easier and having an updated sitemap on your site is good both for your users and for search engines. Sitemaps are an important way of communication with search engines. While in robots.txt you tell search engines which parts of your site to exclude from indexing, in your site map you tell search engines where you'd like them to go.
Sitemaps are not a novelty. They have always been part of best Web design practices but with the adoption of sitemaps by search engines, now they become even more important. However, it is necessary to make a clarification that if you are interested in sitemaps mainly from a SEO point of view, you can't go on with the conventional sitemap only (though currently Yahoo! and MSN still keep to the standard html format). For instance, Google Sitemaps uses a special (XML) format that is different from the ordinary html sitemap for human visitors.
One might ask why two sitemaps are necessary. The answer is obvious - one is for humans, the other is for spiders (for now mainly Googlebot but it is reasonable to expect that other crawlers will join the club shortly). In that relation it is necessary to clarify that having two sitemaps is not regarded as duplicate content. In 'Introduction to Sitemaps', Google explicitly states that using a sitemap will never lead to penalty for your site.
Why Use a Sitemap

Using sitemaps has many benefits, not only easier navigation and better visibility by search engines. Sitemaps offer the opportunity to inform search engines immediately about any changes on your site. Of course, you cannot expect that search engines will rush right away to index your changed pages but certainly the changes will be indexed faster, compared to when you don't have a sitemap.
Also, when you have a sitemap and submit it to the search engines, you rely less on external links that will bring search engines to your site. Sitemaps can even help with messy internal links - for instance if you by accident have broken internal links or orphaned pages that cannot be reached in other way (though there is no doubt that it is much better to fix your errors than rely on a sitemap).
If your site is new, or if you have a significant number of new (or recently updated pages), then using a sitemap can be vital to your success. Although you can still go without a sitemap, it is likely that soon sitemaps will become the standard way of submitting a site to search engines. Though it is certain that spiders will continue to index the Web and sitemaps will not make the standard crawling procedures obsolete, it is logical to say that the importance of sitemaps will continue to increase.
Sitemaps also help in classifying your site content, though search engines are by no means obliged to classify a page as belonging to a particular category or as matching a particular keyword only because you have told them so.
Having in mind that the sitemap programs of major search engines (and especially Google) are still in beta, using a sitemap might not generate huge advantages right away but as search engines improve their sitemap indexing algorithms, it is expected that more and more sites will be indexed fast via sitemaps.
Generating and Submitting the Sitemap
The steps you need to perform in order to have a sitemap for your site are simple. First, you need to generate it, then you upload it to your site, and finally you notify Google about it.
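Before generating one, it helps to know what the output looks like. A minimal sitemap file in Google's XML format (the URL and values are placeholders; only the loc element is required for each entry):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-08-05</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

One url block is listed per page; lastmod, changefreq and priority are optional hints to the crawler.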
Depending on your technical skills, there are two ways to generate a sitemap - to download and install a sitemap generator or to use an online sitemap generation tool. The first is more difficult but you have more control over the output. You can download the Google sitemap generator and after you download the package, follow the installation and configuration instructions in it. This generator is a Python script, so your Web server must have Python 2.2 or later installed, in order to run it.
The second way to generate a sitemap is easier. There are many free online tools that can do the job for you. For instance, have a look at this collection of Third-party Sitemap tools. Although Google says explicitly that it has neither tested, nor verified them, this list will be useful because it includes links to online generators, downloadable sitemap generators, sitemap plugins for popular content-management systems, etc., so you will be able to find exactly what you need.
After you have created the sitemap, you need to upload it to your site (if it is not already there) and notify Google about its existence. Notifying Google includes adding the site to your Google Sitemaps account, so if you do not have an account with Google, it is high time to open one. Another detail that is useful to know in advance is that in order to add the sitemap to your account, you need to verify that you are the legitimate owner of the site.
Currently Yahoo! and MSN do not support sitemaps, or at least not in the XML format used by Google. Yahoo! allows webmasters to submit "a text file with a list of URLs" (which can actually be a stripped-down version of a sitemap), while MSN does not offer even that, though there are rumors that it indexes sitemaps when they are available onsite. Most likely this situation will change in the near future and both Yahoo! and MSN will catch up with Google, because user-submitted sitemaps are simply too powerful an SEO tool to be ignored.
(source-http://www.webconfs.com)

Sitemap

A site map (or sitemap) is a representation of the architecture of a web site. It can be either a document in any form used as a planning tool for web design, or a web page that lists the pages on a web site, typically organized in hierarchical fashion. This helps visitors and search engine bots find pages on the site.
While some developers argue that site index is a more appropriate term, web visitors are used to seeing both terms and generally treat them as one and the same. However, a site index is often used to mean an A-Z index that provides access to particular content, while a site map provides a general top-down view of the overall site contents.
(source-http://en.wikipedia.org)

Web Directories And Specialized Search Engines

SEO experts spend most of their time optimizing for Google and occasionally for one or two other search engines. There is nothing wrong with that, and it is quite logical, bearing in mind that topping Google is the lion's share of Web popularity. But very often, no matter what you do, topping Google does not happen. Or sometimes the price you need to pay (not literally, but in terms of effort and time) to top Google and stay there is too high. We should also mention the ultimate SEO nightmare - being banned from Google, when you simply can't use Google (at least not until you are readmitted to the club) and, whether you like it or not, you need to have a look at possible alternatives.
What are Google Alternatives
The first alternative to Google is obvious - optimize for the other major search engines, if you have not done so already. Yahoo! and MSN (to a lesser degree) can bring you enough visitors, though sometimes it is virtually impossible to optimize for all three of them at the same time because of the differences in their algorithms. You could also optimize your site for (or at least submit to) some of the other search engines (Lycos, Excite, Netscape, etc.), but having in mind that together they hardly account for more than 3-5% of Web search traffic, do not expect much.
Another alternative is to submit to search directories (also known as Web directories) and specialized search engines. Search directories might sound so pre-Google but submitting to the right directories might prove better than optimizing for MSN, for example. Specialized search engines and portals have the advantage that the audience they attract consists of people who are interested in a particular topic and if this is your topic, you can get to your target audience directly. It is true that specialized search engines will not bring you as many visitors, as if you were topping Google but the quality of these visitors is extremely high.
Naming all Google alternatives would be a long list and it is outside the scope of this article but just to be a little more precise about what alternatives exist, we cannot skip SEO instruments like posting to blogs and forums or paid advertisements.
Web Directories
What is a Web Directory?
Web directories (or as they are better known – search directories) existed before the search engines, especially Google, became popular. As the name implies, web directories are directories where different resources are gathered. Similarly to desktop directories, where you gather files in a directory based on some criterion, Web directories are just enormous collections of links to sites, arranged in different categories. The sites in a Web directory are listed in some order (most often alphabetic but it is not necessarily so) and users browse through them.
Although many Web directories offer search functionality of some kind (otherwise it would be impossible to browse thousands of pages for, let's say, Computers), search directories are fundamentally different from search engines in two ways - most directories are edited by humans, and URLs are not gathered automatically by spiders but submitted by site owners. The main advantage of Web directories is that no matter how clever spiders become, when there is a human to view and check the pages, there is less chance that pages will be classified in the wrong categories. The disadvantage of human editing is that the lists in web directories are sometimes outdated, if no human was available to do the editing and checking for some time (but this is not that bad, because search engines also deliver pages that do not exist anymore), and that sometimes you might have to wait half a year before being included in a search directory.
The second difference – no spiders – means that you must go and submit your URL to the search directory, rather than sit and wait for the spider to come to your site. Fortunately, this is done only once for each directory, so it is not that bad.
Once you are included in a particular directory, in most cases you can stay there as long as you wish to and wait for people (and search engines) to find you. The fact that a link to your site appears in a respectable Web directory is good because first, it is a backlink and second, you increase your visibility for spiders, which in turn raises your chance to be indexed by them.
Examples of Web Directories
There are hundreds and thousands of search directories but undoubtedly the most popular one is DMOZ. It is a general purpose search directory and it accepts links to all kinds of sites. Other popular general-purpose search directories are Google Directory and Yahoo! Directory. The Best of the Web is one of the oldest Web directories and it still keeps to high standards in selecting sites.
Besides general-purpose Web directories, there are incredibly many topical ones. For instance, The Environment Directory lists links to environmental sites only, while The Radio Directory lists thousands of radio stations worldwide, arranged by country, format, etc. There are also many local and national Web directories, which accept links to sites about a particular region or country only and which can be great if your site is targeted at a local or national audience. It is not possible even to list the topics covered by specialized search directories, because the list would get incredibly long. Using Google and specialized search resources like The Search Engines Directory, you can find on your own many directories that are related to your area of interest.
Specialized Search Engines
What is a Specialized Search Engine?
Specialized search engines are one more tool to include in your SEO arsenal. Unlike general-purpose search engines, specialized search engines index pages for particular topics only and very often there are many pages that cannot be found in general-purpose search engines but only in specialized ones. Some of the specialized search engines are huge sites that actually host the resources they link to, or used to be search directories but have evolved to include links not only to sites that were submitted to them. There are many specialized search engines for every imaginable topic and it is always wise to be aware of the specialized search engines for your niche. The examples in the next section are by no means a full list of specialized search engines but are aimed to give you the idea of what is available. If you search harder on the Web, you will find many more resources.
Examples of Specialized Search Engines
Specialized search engines are probably not as numerous as Web directories, but there is certainly no shortage of them either, especially if one counts as a specialized search engine every site with a password-protected database accessible only from within the site. As with Web directories, a list of specialized search engines would be really, really long (and constantly changing), so instead here are some links to lists of search engines: Pandia Powersearch, Webquest, Virtual Search Engines, the already mentioned The Search Engines Directory, etc. What these lists have in common is that they offer a selection of specialized search engines arranged by topic, so they are a good starting point for the hunt for specialized search engines.
(source-http://www.webconfs.com)

Choosing an SEO Company

After you have been dealing with SEO on your own for some time, you may discover that no matter how hard you try, your site does not rank well, or that your site ranks well but optimizing it for search engines takes all your time and all your other tasks lag behind. If this is the case with you, maybe it is better to consider hiring an SEO company to do the work for you. With so many SEO companies out there, you can't complain that you have no choice. Or is it just the opposite - so many companies, but few reliable ones?
It is stretching the truth to say that there are no reliable SEO companies. Yes, there might be many scam SEO companies, but if you know what to look for when selecting an SEO company, the risk of hiring fraudsters is reduced. It is much better if you yourself have substantial knowledge of SEO and can easily decide whether they are promising you the stars in the sky or their goals are realistic, but even if you are not quite familiar with SEO practices, here is a list with some points to watch for when choosing an SEO company:
Do they promise to guarantee #1 ranking? If they do, you have a serious reason to doubt their competencies. As the Google SEO selection tips say, no one can guarantee a #1 ranking in Google. This is true even for not so competitive words.
Get recommendation from friends, business partners, etc. Word of mouth is very important for the credibility of a company. For instance, we do not perform SEO services but despite that we constantly receive e-mails asking for SEO services. We always direct these inquiries to Blackwood Productions because we have worked with this company for a long time and we know that they are competent and reliable.
Ask in forums. There are many reputable webmaster forums, so if you can't find somebody who can recommend an SEO company right away, consider asking in webmaster forums. However, beware that not all forum posters are honest people, so take their opinions (no matter whether positive or negative) with a grain of salt. Forums are not as reliable a source of information as in-person contact.
Google the company name. If the company is a known fraudster, chances are that you will find a lot of information about it on the Web. However, lack of negative publicity does not mean automatically that the company is great, nor do some subjective negative opinions mean that the company is a scammer.
Ask for examples of sites they have optimized. Happy customers are the best form of promotion, so feel free to ask your potential SEO company about sites they have optimized and references from clients. If you get a rejection because of confidentiality reasons, this must ring a bell about the credibility of the SEO company - former customers are not supposed to be a secret.
Check the PR of their own site. If they can't optimize their site well enough to get a good PR (over 4-5), they are not worth hiring.
Ask them what keywords their site ranks for. Similarly to the page rank factor, if they don't rank well for the keywords of their choice, they are hardly as professional as they are pretending to be.
Do they use automated submissions? If they do, stay away from them. Automated submissions can get you banned from search engines.
Do they use any black hat SEO tricks? You need to know in advance what black hat SEO is in order to judge them, so getting familiar with the most important black hat SEO tricks is worthwhile before you start cross-examining them.
Where do they collect backlinks from? Backlinks are very, very important for SEO success but if they come from link farms and other similar sites, this can cause a lot of trouble. So, make sure the SEO firm collects links from reputable sites only.
Get some personal impressions, if possible. Gut instinct and impressions from meetings are also a way to judge a company, though sometimes it is not difficult to get misled, so use this approach with caution.
High price does not guarantee high quality. If you are willing to pay more, this does not mean that you will get more. Just because a firm charges more DOES NOT make them better SEOs. There are many reasons for high prices, and high quality is only one of them. For instance, the company might work inefficiently, and this, not the quality of their work, is the reason for their ridiculously high costs.
Cheap is more expensive. This is also true. If you think you can pay peanuts for a professional SEO campaign, then you need to think again. Professional SEO companies offer realistic prices.
Use tricky questions. Using tricky questions is a double-edged sword, especially if you are not an expert, but there are several easy ones that can help you. For instance, you might ask them how many search engines they will automatically submit your site to. If they are scammers, they will try to impress you with big numbers, but in this case the best answer would be "no automatic submissions". Another tricky question is to ask whether they will place you in the top 10 for some competitive keywords of your choice. The trap here is that it is they, not you, who choose the words that are best for your site. It is not very probable that they will pick exactly the words you suggest, so if they tell you to just give them the words and they will push you to the top, tell them "Goodbye".
Do they offer subscription services? SEO is a constant process, and if you want to rank well and stay there, continuous effort is necessary. Because of this, it is better to select a company that includes post-optimization maintenance than one that pushes your site to the top and then leaves you in the wild on your own.
(source-www.webconfs.com)

How To Get Traffic From Social Bookmarking Sites

Sites like digg.com, reddit.com, stumbleupon.com, etc. can bring you a LOT of traffic. How about getting 20,000 or more visitors a day when your listing hits the front page? Getting to the front page of these sites is not as difficult as it seems. I have been successful with Digg and del.icio.us (and less so with Reddit, though the same steps should apply to it as well) multiple times and have compiled a list of steps that have helped me succeed:
1.Pay attention to your Headlines
Many great articles go unnoticed on social bookmarking sites because their headline is not catchy enough. Your headline is the first (and very often the only) thing users will see of your article, so if you don't make the effort to provide a catchy headline, your chances of getting to the front page are small. Here are some examples to start with:
Original headline: The Two Types of Cognition
Modified headline: Learn to Understand Your Own Intelligence
Original headline: Neat way to organize and find anything in your purse instantly!
Modified headline: How to Instantly Find Anything in Your Purse
Here is a good blog post that should help you with your headlines.
2.Write a meaningful & short description
The headline is very important to draw attention, but if you want to keep that attention, a meaningful description is vital. The description should be slightly provocative, because this draws more attention, but still, never use lies and false facts to provoke interest. For instance, if you write "This article will reveal to you the 10 sure ways to deal with stress once and forever and live like a king from now on.", visitors will hardly think that your story is true and fact-based. You also might be tempted to use a long tell-it-all paragraph to describe your great masterpiece, but have in mind that many users will not bother to read anything over 100-150 characters. Additionally, some of the social bookmarking sites limit descriptions, so you'd better think in advance about how to describe your article as briefly as possible.
3.Have a great first paragraph
This is a rule that is always true, but for successful social bookmarking it is even more important. If you have successfully passed Level 1 (headline) and Level 2 (description) in the Catch the User's Attention game, don't let a bad first paragraph make them leave your site.
4.Content is king
However, the first paragraph is not everything. Going further along the chain of drawing (and retaining) users' attention, we reach the Content is King Level. If your articles are just trash, bookmarking them is useless. You might cheat users once but don't count on repetitive visits. What is more, you can get your site banned from social bookmarking sites, when you persistently post junk.
5.Make it easy for others to vote / bookmark your site
It is best when other people, not you, bookmark your site, so you must do your best to make it easy for them. You can put a bookmarking button at the end of the article, so that if users like your content, they can easily post it. If you are using a CMS, check if there is an extension that allows you to add Digg, del.icio.us, and other buttons, but if you are using static HTML, you can always go to the social bookmarking site and copy the code that will add their button to your pages.
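For static HTML pages, such a button often boils down to a simple pre-filled link. The sketch below builds one in Python; the del.icio.us submission-URL format shown here is an assumption for illustration, so copy the exact button code from the bookmarking site itself rather than relying on it:

```python
# Sketch: build a "bookmark this" link to append to each article page.
# The del.icio.us submit-URL format is assumed for illustration - always
# check the bookmarking site's own "link to us" instructions.
import urllib.parse

def bookmark_link(page_url, title):
    """Return an HTML anchor that pre-fills a del.icio.us submission form."""
    query = urllib.parse.urlencode({'url': page_url, 'title': title})
    return '<a href="http://del.icio.us/post?%s">Bookmark on del.icio.us</a>' % query

print(bookmark_link('http://www.example.com/article.html', 'My Great Article'))
```

The point of URL-encoding the query string is that titles with spaces or punctuation still produce a valid link.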
6.Know when to submit
The time when you submit can be crucial for your attempts to get to the front page. On most social bookmarking sites you have only 24 hours to get to the front page and stay there. So, if you post when most users (and especially your supporters) are still sleeping, you are wasting valuable time. By the time they get up, you might have gone to the tenth page. You'd better try it for yourself and see if it works for you but generally posting earlier than 10 a.m. US Central Time is not good. Many people say that they get more traffic around 3 p.m. US Central Time. Also, workdays are generally better in terms of traffic but the downside is that you have more competitors for the front page than on weekends.
7.Submit to the right category
Sometimes a site might not work for you because there is no right category for you. Or because you don't submit to the right category – technology, health, whatever – but to categories like General, Miscellaneous, etc. where all unclassified stuff goes. And since these categories fill very fast, your chance to get noticed decreases.
8.Build a top-profile
Not all users are equal on social bookmarking sites. If you are an old and respected user who has posted tons of interesting stuff, this increases the probability that what you submit will get noticed. Posting links to interesting articles on other sites is vital for building a top-profile. Additionally, it is suspicious, when your profile has links to only one site. Many social bookmarking sites frown when users submit their own content because this feels like self-promotion.
9.Cooperate with other social bookmarkers
The lone wolf is a suicidal strategy on sites like StumbleUpon, Digg, and Netscape. Many stories make it to the front page not only because they are great but because they are backed up by a network of friends. If in the first hours after your submission you get at least 15 votes from your friends and supporters, it is more likely that other users will vote for you too. 50 votes can get you to the top page of Digg.
10.Submit in English
Linguistic diversity is great, but the majority of users are from English-speaking countries and they don't understand exotic languages. So, for most social bookmarking sites, submitting anything in a language other than English is not advisable. The languages at a particular disadvantage are Chinese, Arabic, Slavic languages, and all the others that use a non-Latin alphabet. German, Spanish, and French are more understandable, but still they are not English. If you really must submit your story (i.e. because you need the backlink), include an English translation at least of the title. But the best way to proceed with non-English stories is to post them where they belong.
11.Never submit old news
Submitting old news will not help you become a respected user. Yesterday's news is history. But if you still need to submit old stuff, consider feature articles, how-tos and similar pieces that stay up-to-date for a long time.
12.Check your facts
You must be flattered that users read your postings, but you will hardly be flattered when users prove that you haven't got the facts right. In addition to sarcastic comments, you might also receive negative votes for your story, so if you want to avoid this, check your facts - or your readers will do it for you.
13.Check your spelling
Some sites do not allow you to edit your posts later, so if you misspell the title, the URL, or a keyword, it will stay that way forever.
14.Not all topics do well
But sometimes even great content and submitting to the right category do not push you to the top. One possible reason could be that your stories are about unpopular topics. Many sites have topics that their users love and topics that don't sell that well. For instance, Apple sells well on Digg and The War in Iraq on Netscape. Negative stories - about George Bush, Microsoft, evil multinational companies, corruption and crime also have a chance to make it to the front page. You can't know these things in advance but some research on how many stories tagged with keywords like yours have made the front page in the last year or so can give you a clue.
15.Have Related Articles / Popular Articles
Traffic gurus joke that traffic from social bookmarking sites is like an invasion – the crowds pour in and in a day or two they are gone. Unfortunately this is true – after your listing rolls from the front page (provided that you reached the front page), the drop in traffic is considerable. Besides, many users come just following the link to your article, have a look at it and then they are gone. One of the ways to keep them longer on your site is to have links to Related Articles / Popular Articles or something similar that can draw their attention to other stuff on the site and make them read more than one article.
16.RSS feeds, newsletter subscriptions, affiliate marketing
RSS feeds, newsletter subscriptions, affiliate marketing are all areas in which the traffic from social bookmarking sites can help you a lot. Many people who come to your site and like it, will subscribe to RSS feeds and/or your newsletter. So, you need to put these in visible places and then you will be astonished at the number of new subscriptions you got on the day when you were on the front page of a major social bookmarking site.
17.Do not use automated submitters
After some time of active social bookmarking, you will discover that you are spending hours on end posting links. Yes, this is a lot of time and using automated submitters might look like the solution but it isn't. Automated submitters often have malware in them or are used for stealing passwords, so unless you don't care about the fate of your profile and don't mind being banned, automated submitters are not the way to go.
18.Respond to comments on your stories
Social bookmarking sites are not newsgroups, but interesting articles can trigger a pretty heated discussion with hundreds of comments. If your article gets comments, you should be proud. Always respond to comments on your stories and, even better, post comments on other stories you find interesting. This is a way to make friends and to build a top-profile.
19.Prepare your server for the expected traffic
This is hardly a point of minor importance, but we take it for granted that you are hosting your site on a reliable server that does not crash twice a day. Have in mind that your presence on the front page of a major social bookmarking site can drive a lot of traffic to you, which can cause your server to crash - literally! I remember one of the times I was on the front page of Digg, I kept restarting Apache on my dedicated server because it was unable to cope with the massive traffic. I have many tools on my site, and when the visitors tried them, this loaded the server additionally. Well, for an articles site getting so much traffic is not so devastating, but if you are hosting on a so-so server, you'd better migrate your site to a machine that can handle a lot of simultaneous hits. Also, check if your monthly traffic allowance is enough to handle 200-500,000 or even more visitors. It is very amateurish to attract a lot of visitors and not be able to serve them because your server crashed or you have exceeded your bandwidth!
20.The snowball effect
But despite the differences in the tastes of the different social bookmarking communities, there are striking similarities. You will soon discover that if a post is popular on one of the major sites, this usually drives it up on the other big and smaller sites. Usually it is Digg posts that become popular on StumbleUpon and Reddit, but there are many other examples. To use this fact to your best advantage, you may want to concentrate your efforts on getting to the front page of the major players only, and bet on the snowball effect to drive you to the top on other sites. An additional benefit of the snowball effect is that if your posting is interesting and people start blogging about it, you can get tons of backlinks from their blogs. This happened to me, and the result was that my PR jumped to 6 on the next update.
(source-http://www.webconfs.com)

Some Ugly Aspects Of SEO

1.Dependent on search engines
It is true that in any career there are many things that are outside of your control, but for SEO this is rule number one. Search engines frequently change their algorithms and, what is worse, these changes are not made public, so even the greatest SEO gurus admit that they make a lot of educated guesses about how things work. It is very discouraging to get everything perfect and then learn that due to a change in the algorithm your site has dropped 100 positions. But the worst part is that you need to communicate this to clients, who are not happy with their sinking ratings.
2.No fixed rules
Probably this will change over time, but for now the rule is that there are no rules - or at least no written ones. You can work very hard, follow everything that looks like a rule, and still success does not come. Currently you can't even take a search engine to court for the damage done to your business, because search engines are not obliged to rank highly the sites that have made efforts to get optimized.
3.Rapid changes in rankings
But even if you somehow manage to get to the top for a particular keyword, keeping the position requires constant efforts. Well, many other businesses are like that, so this is hardly a reason to complain – except when an angry customer starts shouting at you that this week their ratings are sinking and of course this is all your fault.
4.SEO requires Patience
The SEO professional and customers both need to understand that SEO takes constant effort and time. It could take months to move ahead in the ratings, or to build tens of links. Additionally, if you stop optimizing for some time, most likely you will experience a considerable drop in ratings. You need lots of motivation and patience not to give up when things are not going your way.
5.Black hat SEO
Black hat SEO is probably one of the biggest concerns for the would-be SEO practitioner. Fraud and unfair competition are present in any industry and those who are good and ethical suffer from this but black hat SEO is still pretty widespread. It is true that search engines penalize black hat practices but still black hat SEO is a major concern for the industry.
(source-http://www.webconfs.com)

Some Good Reasons To Choose SEO as Your Career

1.High demand for SEO services
Once SEO was not a separate profession - Web masters performed some basic SEO for the sites they managed and that was all. But as sites began to grow and make money, it became more reasonable to hire a dedicated SEO specialist than to have the Web master do it. The demand for good SEO experts is high and is constantly on the rise.
2.A LOT of people have made a successful SEO career
There are many living proofs that SEO is a viable business. The list is too long to be quoted here but some of the names include Rob from Blackwood Productions, Jill Wahlen from High Rankings, Rand Fishkin from SEO Moz and many others.
3.Search Engine Optimizers make Good Money!
SEO is a profession that can be practiced while working for a company or as a solo practitioner. There are many job boards like Dice and Craigslist that publish SEO job advertisements. It is worth noting that the compensation for SEO employees is equal to or even higher than that of developers, designers and marketers. Salaries over $80K per annum are not an exception for SEO jobs. As a solo SEO practitioner you can make even more money. Almost all freelance sites have sections for SEO services, and offers for $50 an hour or more are quite common. If you are still not confident that you can work on your own, you can take an SEO job, learn a bit and then start your own company. If you already feel confident that you know a lot about SEO, you can take this quiz and see how you score. Well, don't get depressed if you didn't pass - here is a great checklist that will teach you a lot, even if you are already familiar with SEO.
4.Only Web-Designing MAY NOT be enough
Many companies offer turn-key solutions that include Web design, Web development AND SEO optimization. In fact, many clients expect that when they hire somebody to make their site, the site will be SEO friendly, so if you are good both as a designer and as an SEO expert, you will be a truly valuable professional. On the other hand, many other companies deal with SEO only, because they feel that this way they can concentrate their efforts on their major strength - SEO - so you can consider this possibility as well.
5.Logical step ahead if you come from marketing or advertising
The Web has changed the way companies do business, so to some extent today's marketers and advertisers need to have at least some SEO knowledge if they want to be successful. SEO is also a great career for linguists.
6.Lots of Learning
For somebody who comes from design, development or web administration, SEO might not look technical enough, and you might feel that moving to SEO is a downgrade. Don't worry so much - you can learn a LOT from SEO, so if you are a talented techie, you are not downgrading but actually upgrading your skills package.
7.SEO is already recognized as a career
Finally, if you need some more proof that SEO is a great career, have a look at the available courses and exams for SEO practitioners. Well, they might not be a Cisco certification, but they still help to institutionalize the SEO profession.
(source-http://www.webconfs.com)

Monday, August 4, 2008

History of Page Rank

PageRank was developed at Stanford University by Larry Page (hence the name Page-Rank) and later Sergey Brin as part of a research project about a new kind of search engine. The project started in 1995 and led to a functional prototype, named Google, in 1998. Shortly after, Page and Brin founded Google Inc., the company behind the Google search engine. While just one of many factors which determine the ranking of Google search results, PageRank continues to provide the basis for all of Google's web search tools.
PageRank is based on citation analysis that was developed in the 1950s by Eugene Garfield at the University of Pennsylvania, and Google's founders cite Garfield's work in their original paper. By following links from one page to another, virtual communities of webpages are found. Web link analysis was first developed by Jon Kleinberg and his team while working on the CLEVER project at IBM's Almaden Research Center.
(source-http://en.wikipedia.org/wiki/PageRank)

About Page Rank

PageRank is a link analysis algorithm that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is also called the PageRank of E and denoted by PR(E).
PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important".
In other words, a PageRank results from a "ballot" among all the other pages on the World Wide Web about how important a page is. A hyperlink to a page counts as a vote of support. The PageRank of a page is defined recursively and depends on the number and PageRank metric of all pages that link to it ("incoming links"). A page that is linked to by many pages with high PageRank receives a high rank itself. If there are no links to a web page there is no support for that page.
Google assigns a numeric weighting from 0 to 10 to each webpage on the Internet; this PageRank denotes a site's importance in the eyes of Google.
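The recursive "voting" idea described above can be made concrete with a short sketch. This is a minimal, illustrative Python implementation, not Google's actual code: the page names, the damping factor of 0.85 and the fixed iteration count are assumptions chosen for the example.

```python
# Minimal PageRank sketch (illustrative only). Each page's rank is split
# among the pages it links to; ranks are refined iteratively until they
# settle, matching the recursive definition described above.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}              # start with equal rank
    for _ in range(iterations):
        new_pr = {p: (1 - d) / n for p in pages}  # baseline "random surfer" share
        for page, outgoing in links.items():
            if outgoing:
                share = pr[page] / len(outgoing)  # this page's vote, split evenly
                for target in outgoing:
                    new_pr[target] += d * share
            else:
                for p in pages:                   # dangling page: spread rank evenly
                    new_pr[p] += d * pr[page] / n
        pr = new_pr
    return pr

# Hypothetical three-page web: A links to B and C, B links to C, C links to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

In this toy graph, page C ends up with the highest rank: it receives links from both A and B, and votes from higher-ranked pages weigh more, exactly as the passage above describes.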