
12 January 2016


robots.txt: What is it?


Search engines visit websites, blogs and other online portals to scan and then store or cache their information, which is then used to rank them by relevancy. On a large portal, the whole site may be indexed according to the search engine's criteria, and most search engines use the indexed pages collectively to rank the site. This means pages that are unimportant, or that hold information not meant for the engines, get indexed too.
To tell robots which pages to index and which to ignore, a common protocol called robots.txt is used. A plain-text file is uploaded to the root directory of the server along with the pages; this is the robots.txt file, which robots visit first and then index the pages accordingly. Remember that some robots, especially those with malicious intent, may not honour this protocol, but all the popular search companies adhere to this public standard.
In its simplest form, the file pairs a User-agent line (naming which robots a rule applies to; * means all of them) with one or more Disallow lines (naming the paths to block). For example, the following blocks every robot from the entire site:
User-agent: *
Disallow: /
More examples are given below:
To block the entire site, use a forward slash.

Disallow: /


To block a directory and everything in it, follow the directory name with a forward slash.

Disallow: /junk-directory/

To block a page, list the page.

Disallow: /private_file.html

To remove a specific image from Google Images, add the following:

User-agent: Googlebot-Image
Disallow: /images/dogs.jpg

To remove all images on your site from Google Images:

User-agent: Googlebot-Image
Disallow: /

To block files of a specific file type (for example, .gif), use the following:

User-agent: Googlebot
Disallow: /*.gif$
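
Putting these directives together, a complete robots.txt might look something like this (the directory, file and domain names below are only illustrative, and the Sitemap line is an optional extension recognised by the major engines):

# Rules for all robots
User-agent: *
Disallow: /junk-directory/
Disallow: /private_file.html

# Rules for Google's image crawler only
User-agent: Googlebot-Image
Disallow: /images/

Sitemap: http://www.example.com/sitemap.xml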

More information about the standard can be found at robotstxt.org.
These instructions can also be given in the robots meta tag, which should be present in every page you want to control, or via the X-Robots-Tag HTTP header, which the server sends with its responses and so can cover all pages. Take care that the directives are correct and that no important pages are barred; this also applies to CMS portals.
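
For reference, the page-level tag and the header-level equivalent look roughly like this (noindex, nofollow is used here purely as an illustrative directive):

<meta name="robots" content="noindex, nofollow">

X-Robots-Tag: noindex, nofollow

The meta tag goes inside the <head> of each page, while X-Robots-Tag is sent as an HTTP response header, which also lets you apply directives to non-HTML files such as PDFs.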

Regularly check the directives so that no wrong or outdated rules creep in over time. There are a number of tools that can help you keep things in line.

24 December 2015


Trick to remove the numbers in Blogger post URLs



When you first publish a post, Blogger assigns a permanent web address (aka a URL or a permalink) to the post. I've previously explained how you can control the words used in this hyperlink.

A common question from people who are researching SEO for their blog is "how do I get rid of the numbers in the post-URL?".

Unfortunately the answer is not as straightforward as most people hope.


Numbers near the start of Blogger URLs

As described in setting the content of your post's permalink, the URL given to posts published in Blogger shows the year and month of the original publication date for the post. I think this is because Blogger was originally set up as an on-line diary, with a lot of the features organised around the post-date.

Today, there are ways of giving your blog a home page, showing your posts in pages, and changing the order of the posts, which let your blog be a lot more than a date-ordered web-log.

Some blogging software (eg WordPress) lets you choose the structure of the URLs which are used, eg leaving the date out totally, or putting it after the words.

However Blogger does not currently have any way to remove the date-part of the post URLs. And I could be wrong, but my best guess is that this will not change anytime soon.

So what options are available to remove the year and month numbers?

If you just don't want people to know the correct month and year of the post, then you can change the date before you publish the post for the first time. Maybe make it something nonsensical (eg 1/1/1990). (However, do remember that your RSS feed will show the actual date of publication, not the assigned date.)

If you have some content where any month-and-year is particularly irrelevant, put it into a Page instead of a Post, because Page URLs don't contain a date. But remember that you need to give users a way to get to these Pages, and that they are not sent out in your RSS feed, so subscribers won't see the content.

The third - and least attractive - option is to accept that this is how Blogger works, and that you need to live with it or switch to another blogging tool.


Numbers near the end of Blogger URLs

Blogger puts digits at the end of post-URLs in order to make sure that each post ever published has a unique address.

Notice that I said "ever published": if you publish a post, then delete it, and then publish a second post with the same year, month and either title or customized-URL-words, then the second post's URL will have some digits put on the end, to stop it being the same as the first one.

Once a post is published, you cannot remove the digits and keep the same words and month/year. The only way to avoid them is to make sure that your post-URLs are unique. So if you publish a post and notice that it has digits on the end of the URL, one option is to delete that post and replace it with one which has a different publication date or customized-URL-words (don't forget to copy the post contents before you delete it!). Or you could just set it back to draft status, and then publish it again with different, and this time unique, customized-URL-words.

For example, if you publish and find that you get
www.all-about-cats.com/2012-07/vegetarian-cat-food-recipes01.html
you may want to delete the post, and republish the content in a post with a different date like
www.all-about-cats.com/2012-06/vegetarian-cat-food-recipes.html


22 December 2015


25 Worst SEO Techniques in 2015


Think you've mastered SEO best practices for business? You may be surprised to learn that search engine optimization is a fluid concept, and what put you at the top of search rankings just a year ago may now be considered among the worst SEO techniques employed.

Creating content and code specifically for Google can come back on you like a boomerang. After all, Google wants to deliver content that is valuable to people, not to search engines. We've been browsing the results of Moz's latest survey of search engine marketers, and some of the data was quite surprising. The survey includes feedback from 150 marketing professionals on over 90 factors to provide vast amounts of information on what's hot and what's not in the world of SEO.

Below, we've listed and summarized some of the worst SEO techniques of 2015 - factors that correlated negatively with a higher ranking. It's important to note that correlation does not equal causation; these factors have a negative correlation with higher ranks, but that does not automatically mean they are the cause. For example, the negative correlation of having a large number of <a href> tags may not be because of the tags themselves, but because the page is simply too long.

We'll follow up soon with a post on best practices for search engine optimization.  But, for now, here's a summary of the year's worst SEO techniques - are you guilty? The factors are ranked by negative correlation, from most damaging, to least.



1. Ranking on Similarweb.com (-.24)

This seems to indicate that Google actually penalizes larger sites that have more traffic, rather than burying small sites.

2. URL Length in Characters (-.11)

Long URLs are often auto-generated with unimportant content and stuffed with keywords.

3. Server Response Time (-.10)

Cheap hosting and poor programming can be a fatal problem if your site takes too long for users to access.

4. An Unusually Small Number of Pages Linking to the Domain (-.09)

Sites that are not often linked to by other pages suggest to Google that spammers may be trying to promote the domain.

5. Hyphens in URL (-.09)

Speculating, I would suggest that Google thinks this URL is stuffed with keywords designed to appeal to search engines, not people.

6. Bounce Rate (-.08)

A normal bounce rate for a site is around 50%; anything much higher is a red flag. If the majority of viewers leave after viewing only one page, how good can the site be?

7. Untrustworthy Site Link Profile (-.07)

A lack of outbound links to authoritative sites, high bounce rates, and the absence of privacy and policy pages or active social network accounts can make Google deem a site untrustworthy.

8. Lack of Inbound Links (-.07)

A large site with no inbound links suggests it has a lot of content that users don't think has value.

9. Number of HTML Pages in Target Link (-.07)

Long pages with lots of keywords embedded in links make the site look suspicious.

10. Long Domain Name (Including Subdomains) (-.07)


As in number 2 above, long domains are often filled with keywords for search engines to index, rather than content that people are looking for.

11. Minimal Branded Anchor Text (-.06)

Google loves short links with brand names in them.

12. Use of Privacy Service (-.06)

If the domain owner is using a privacy service, it raises a red flag and suggests that the owner is hiding behind the registrar for nefarious purposes.

13. LDA Similarity Between Body Text and Query (-.05)

Don't bother attempting to stuff loads of exact key phrases into your text - it will only backfire. Focus on employing keywords in a human-readable format.

14. Top-level Domain is .edu, .biz, .net, .info, .com (-.04 to -.01)

.org has a positive correlation of .06, meaning all other top-level domains suffer by comparison.

15. Subdomain is on a TLD with Spam Links (-.04)

If your parent site is known to be the source of spam content, your subdomain will suffer as well.

16. Too Many Pages With Invalid Responses (-.03)

If crawlers only receive valid responses from a small number of pages, it can hurt your ranking. Clean up all bad links on your site.

17. Google AdSense Slots (-.03)

It would seem that selling advertising space on your site can hurt your ranking in search results.

18. Numbers in the Domain Name (-.03)

Ouch! I wish someone had told me this 19 years ago when I registered Web2Market.com!

19. Number of Characters in Title (-.03)

Overly long title tags can hurt your rankings - keep titles short and succinct.

20. SearchActionSchema.org Markup (-.02)

This indicates the page is used to search your site, and Google doesn't want to index the results of a search embedded in your site.
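
For context, this factor refers to schema.org SearchAction (sitelinks search box) markup, which typically looks something like the snippet below; the domain and search URL here are hypothetical:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>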

21. Low Internal Links (-.02)

Linking between the pages on your site can help you perform better in search result rankings.

22. High External Links (-.02)

Excessive external linking makes your site look like a directory or link factory.

23. Links to Facebook and Twitter or Open Graph Markups (-.01)

We're down to the really small stuff at this point, but gratuitous linking to social networks doesn't help you.

24. Ratio of Followed to No-Followed Links Outside of Normal Range (-.01)

If the number of followed versus no-followed links from domains to a subdomain is outside the average, it indicates the subdomain's content holds little value.

25. Keyword Matches in H1 Tag (-.01)


Don't fill your H1 tags with keywords - this was once common practice, but Google apparently doesn't appreciate it any longer.