How to Get Listed on DMOZ | Tips to Improve Article Directories | Linkadage's Take on Google's New Search Engine Patent
How does one get listed on DMOZ, the Open Directory Project? DMOZ is an established web directory that lists and links to websites across the World Wide Web. The name comes from its original domain, directory.mozilla.org; the directory is owned by Netscape and maintained by a community of volunteer editors. The DMOZ ODP is also called the largest human-edited directory on the web.
A similar list was launched by AT&T and was called the AnyWho directory. AnyWho is a telephone directory that can be found online and is linked with the company's Yellow Pages. The AnyWho directory is a searchable telephone number and address database, while DMOZ is a curated listing of sites by category. Both directories are searchable, but that is where the similarities end. Getting listed in the two directories requires very different guidelines and requirements, because they are two entirely different directories.
Getting listed in the AnyWho directory is as simple as having telephone service with AT&T. Telephone number listings are drawn from the company's records and are updated every three months in the online AnyWho directory. While AnyWho lists the phone numbers of registered individuals and businesses, there are options for people who do not want their number shown in the AnyWho directory. Logging in on the AnyWho site and making use of its privacy opt-out process will help those who want their phone number to remain unlisted. There is also an option for people to return to the listing if they wish, by using the same privacy process to reverse the removal.
Getting listed in the ODP, or DMOZ, is an entirely different process from the listing procedure for the AnyWho directory. Website owners must submit their website details to the DMOZ editors for review and possible inclusion in the list. While the AnyWho directory takes its information from the company's own database, the DMOZ ODP gets the information for its site listings from site owners and other sources. Publication and inclusion in the list are subject to the volunteer editors' review and approval. Often, suggesting a site for inclusion and getting accepted can take a long time, depending on the category of your site and the backlog of suggestions the editors may have.
The basic procedure for suggesting a site for inclusion involves a few steps. First, check whether the site qualifies for listing and is not already listed in the ODP. Next, decide which category best describes or fits the website you are suggesting. After that, fill out the suggestion form on the DMOZ site under the category you believe best describes the site. Approval for inclusion depends on the editors' judgment, and also on whether the category currently has active volunteer editors available to process suggestions.
Article directories have been a great source of backlinks and visibility. I am writing this article because I have been running an article directory myself for the last few years, and I feel that this section of the web needs a lot of improvement. The following tips will help not only article directory owners but also article submitters.
1) Get indexed on Technorati: get your directory indexed and claimed at Technorati. That way, people submitting their articles will gain Technorati votes and authority.
2) Do you accept articles with links inside the body? This can be a risk, as Google might treat them as paid posts and penalize your article directory. Even if it is not your fault, since the articles are free and no paid links are involved, you still don't want a penalty from a flawed pass of Google's algorithm.
3) Is your site properly crawled and indexed? Check your last 50, 100, or 200 articles and see whether they are even indexed by Google, and when Googlebot last visited your site. This problem mainly arises because article directories don't add unique content, so search engines find nothing useful on these sites and stop visiting frequently. I suggest article directory owners buy unique, high-quality articles from time to time so that their sites get properly indexed.
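One way to check when Googlebot last came by is to scan your web server's access log for its user agent. A minimal sketch, assuming an Apache/Nginx "combined" log format (the sample log lines are invented for illustration):

```python
import re
from datetime import datetime

GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)
# Combined-format lines carry a timestamp like [10/Oct/2008:13:55:36 +0000]
TIMESTAMP = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})")

def last_googlebot_visit(lines):
    """Return the datetime of the most recent Googlebot hit, or None."""
    latest = None
    for line in lines:
        if GOOGLEBOT.search(line):
            m = TIMESTAMP.search(line)
            if m:
                ts = datetime.strptime(m.group(1), "%d/%b/%Y:%H:%M:%S")
                if latest is None or ts > latest:
                    latest = ts
    return latest
```

If the most recent visit is weeks old, that is a strong hint the crawler has lost interest in the site.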
4) Do you accept every article that is submitted? I have seen many submitters use article-spinning software and submit those articles to directories; this is why I say you should at least read a few lines of each article before approving it.
5) Research keywords for the articles submitted. Then rewrite, or at least edit, the title of each article around a good, low-competition keyword. That way you will have some occasional traffic coming from every article.
6) Get your article directory some quality backlinks; try social bookmarking if you want free ones. Your site will gain authority and your overall traffic will increase as you start beating your competitors on competitive keywords.
7) Keep adding content yourself; it's not a bad idea. Do some research for low-competition, frequently searched topics and write some short articles of your own. Your site gains unique content, and your writing ability develops, which will help you in the long run.
Promotion is the key. If you have a good site with good content but it is not promoted, it won't get traffic. Do social bookmarking, directory submissions, press release submission and distribution, forum signatures, blog posts, and so on; do it slowly and continuously. If you have a budget for promotion, things will become a lot easier; otherwise you will have to do the hard work.
Has Google thrown the cyber world a curveball? Let's fill in some blanks and connect a few dots regarding the recently filed patent application for Google's latest search engine algorithm, Search Engine 125. For those unfamiliar with the inner workings of search engines, each search engine uses its own unique formula for determining that all-important ranking for each web site. Remember, users who query a search engine rarely look beyond the first page, so if you want to increase visitor traffic, step one is to develop your website in a way that matches the major search engines' ranking algorithms. You need to find out what the search engines like and make sure you feed it to them.
Now, over the years, the formulae used by search engines to rank a site have grown more complex. Pre-2000, search engines didn’t do much more than count keywords on a site. The more times the words ‘limburger cheese’ appeared on the site, the higher the site’s limburger cheese search engine ranking position (SERP). Of course, the key then became to develop SEO text with limburger cheese mentioned in every header, twice in subheads and at least once in every paragraph. Hardly compelling reading, except for the most avid of limburger cheese fans.
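That early keyword-counting approach can be sketched in a few lines. This is a simplified illustration of the idea, not any engine's actual formula:

```python
import re

def keyword_density(text, phrase):
    """Occurrences of `phrase` per 100 words of `text` (a toy metric)."""
    lowered = text.lower()
    words = re.findall(r"[a-z']+", lowered)
    hits = len(re.findall(re.escape(phrase.lower()), lowered))
    return 100.0 * hits / max(len(words), 1)
```

Stuffing the phrase into every header and paragraph inflates exactly this number, which is why the metric was so easy to game.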
So, the Google, Yahoo, and MSN search engines moved to improve the quality of their SERPs, to provide users with helpful, expert information. Changes were made to the keyword algorithms (the weighting formulae), awarding more points for things like the quality of inbound and outbound links to and from a site. This meant that quality links from a relevant 'authority' site, a highly prized designation, would move your site up in the SERPs.
Well, on March 31, 2005, Google applied for a patent on its latest search algorithm. For those who have no fear of their brains exploding from buzzword overload, do a search on "Patent Application 0050071741" to read the entire patent. The patent application describes "a method for scoring a document comprising: identifying the document; obtaining one or more types of history (sic) data associated with the document; and generating a score for the document based on the one or more types of historical data."
Apparently (or not), Google has determined that historical data associated with each site is an essential ingredient in developing the highest quality search results for users who query. And just what kind of historical data are we talking about here? Well, things like:
* the site’s inception date (more likely the date the Search Engine noticed you)
* how frequently documents are added to and removed from the site
* how often sites change over time
* number of visitors over time
* number of repeat visitors
* number of times your site is bookmarked
* how often keyword density is changed
* the rate at which the site’s anchor text is revised
* inbound/outbound links – how long they have been in place, and how many are high-trust (quality) links
The list goes on and on. Factors associated with your domain include: how long your site has been registered, whether the domain has expired (ghost sites), and whether the domain is stable – as in not moving from one physical address to another.
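As an illustration of what "generating a score based on historical data" could look like, a score might be a weighted sum over such signals. The signal names and weights below are invented for the sketch; the patent application lists candidate signals but does not publish any formula:

```python
# Hypothetical weights -- everything numeric here is made up for illustration.
WEIGHTS = {
    "domain_age_years": 2.0,
    "steady_link_growth": 3.0,
    "repeat_visitor_ratio": 1.5,
    "bookmark_count": 0.5,
}

def historical_score(signals):
    """Weighted sum of whichever known historical signals are present."""
    return sum(WEIGHTS[k] * v for k, v in signals.items() if k in WEIGHTS)
```

Unknown signals are simply ignored, which mirrors the patent's "one or more types of historical data" wording: a document need not have every signal to be scored.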
Links remain a key component of Search Engine 125. Links have to be relevant to your site. Links to your site increase in “SERP Power” as they age. Link growth should be slow and steady. A sudden influx of inbound links – especially links that have no relationship to the content of your site – is a surefire way to drop in the SERPs. Google gives such sites a much lower score.
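A sudden influx of inbound links is easy to spot from a time series of new-link counts. A minimal sketch of such a spike detector, with a threshold factor invented for illustration:

```python
def link_spikes(weekly_new_links, factor=3.0):
    """Return indices of weeks where new inbound links exceed `factor`
    times the average of all preceding weeks (a toy spike detector)."""
    flagged = []
    for i in range(1, len(weekly_new_links)):
        baseline = sum(weekly_new_links[:i]) / i
        if baseline > 0 and weekly_new_links[i] > factor * baseline:
            flagged.append(i)
    return flagged
```

Slow, steady growth never trips the threshold; a week with triple the historical average does.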
How about data on your visitor traffic? How will Search Engine 125 weigh that? Number of visitors, growth in visitor rates, spikes in visitor rates, the length of each visitor’s stay, number of bookmarks to and favorite rankings of your site – all enter into Google’s new Search Engine algo according to the patent application.
Another weighting factor is search behavior. The number of searches using a given query word or phrase, a sudden increase or decrease in click-through rates, an exceedingly large number of quick click-throughs (which might indicate 'stale' content) – again, all factors that Google believes will increase the quality of its search results.
Other factors are also listed as part of the patent application. A site with frequent ups and downs in traffic will lose points for untrustworthiness (even if your site sells only seasonal items!). Keyword volatility, focus change and other variables will also be employed in Google’s never-ending quest to quantify the quality of each site its Search Engine delivers to users based on their queries.
So, okay, where’s the mystery? The intrigue? The disinformation? The e-commerce community is abuzz with speculation – speculation that Google’s well-publicized patent is nothing more than a plant to throw off the competition, disinformation intended to keep the competition and SEOs off balance. So why the speculation? Well, even a quick scan of the patent application reveals large areas of gray, vagaries and downright inconsistencies within Google’s proposed ranking criteria. For example, sites are penalized for changing content often (untrustworthy) and rewarded for the frequent addition of new content (freshness). A paradox, you say? Or all part of Google’s master plan to feint right while going left.
The object, in the end, is quality search results. That's what Google, Yahoo and the other popular search engines want – that perfect equation, the ideal formula that will provide high quality search results. And for site owners and designers who, in fact, do keep their sites fresh, who have quality links useful to visitors, who deliver the information the user is looking for – there's no reason for concern. However, the owners of link farms, keyword-dense sites and cyber garbage dumps should sit up and take notice. In the end, quality search engines will inevitably improve the quality of content available on the Internet.