SEO Advice, Duplicate Content
There is a lot of SEO advice out there on the internet.
I think you have to look at who is giving the advice, then examine the advice itself and test whether it is true for you: when you try it, does it work? Look for proof that it really works that way.
For example, for a long time a lot of people were saying that dynamic pages would not get listed by the search engines. All you had to do was type a few searches into Google and you could find hundreds of indexed dynamic pages, so to me it was pretty clear that was just not true.
Now there is some talk about dashes in the domain name being bad.
The major problem with saying that dashes work one way or the other is that there are a lot of variables. I have about 20 sites with dashes in the URL, and every one of them has been treated differently by Google.
The best was not sandboxed at all, while one was sandboxed for over a year…
I just read that Google may be sandboxing sites that have concentrated niche content, but not sites with more general content.
That is a possibility I haven't noticed myself, but I will look more closely for it in the future, since I have noticed that some sites get sandboxed while others do not.
Duplicate content is another SEO issue that a lot of people are talking about.
Most of my sites are currently being penalized for duplicate content, so I know a little about how real this issue is.
Google said in their patent that they take many fingerprints of each page; if the fingerprints of one of your pages match another page in their index, then you may get banned.
There is no reason to believe they would put that in their patent if it were not true.
So the idea that you can buy some private label content, change a few words or rewrite a couple of sentences, and bypass the duplicate content filter is just not the case.
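To see why small edits don't fool this kind of check, here is a minimal sketch of shingle-based near-duplicate detection in Python. Google's actual fingerprinting method is not public, so the shingle size, the Jaccard comparison, and the sample sentences are only my own illustration of the general idea, not their algorithm.

```
# Rough sketch of shingle-based near-duplicate detection (illustration only;
# not Google's actual fingerprinting method).

def shingles(text, size=4):
    # Break the text into overlapping word n-grams ("shingles").
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b):
    # Jaccard overlap between the two shingle sets.
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = ("Private label articles are sold to many buyers, so the same "
            "paragraphs end up on dozens of different sites.")
lightly_edited = ("Private label articles are sold to numerous buyers, so the "
                  "same paragraphs end up on dozens of separate sites.")
unrelated = ("A completely different article about gardening shares almost "
             "no overlapping phrases with either one.")

# Swapping a couple of words still leaves a large share of matching shingles,
# while an unrelated page shares essentially none.
print(similarity(original, lightly_edited))  # substantial overlap
print(similarity(original, unrelated))       # zero overlap
```

Even this toy version shows the problem: most of the shingles in a lightly reworded article are still identical to the source, which is exactly the kind of signal a fingerprint comparison would catch.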
I have a feeling that the more duplicate content shows up, the more aggressively the search engines are going to deal with it, meaning more filtered pages for your site.