How to disallow all WordPress posts in robots.txt?
Add parameters to Yoast SEO plugin schema in functions.php [closed]
Over 500,000(!) unwanted archive pages, major duplicate content problem
When I look at your error message, I get the impression that you’re running WordPress off an MS SQL database rather than a MySQL database. That’s fine, but it makes things a bit tricky from a support perspective because the two database platforms are, in fact, different. For example, the error message you’re getting is … Read more
If the blog can actually be accessed at http://blog.website.com, you’ll need to edit your siteurl and home settings on the Options screen within WordPress. This will set your blog’s default home to http://blog.website.com instead of http://website.com/blog. There are more specific instructions in the Codex …
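As an aside, the same change can be hard-coded rather than made on the Options screen; a minimal sketch, using WordPress’s WP_HOME and WP_SITEURL constants and the hostname from the answer above:

    // In wp-config.php, above the "That's all, stop editing!" line.
    // These constants override the home and siteurl options stored
    // in the database.
    define( 'WP_HOME',    'http://blog.website.com' );
    define( 'WP_SITEURL', 'http://blog.website.com' );

Constants defined this way take precedence over whatever is saved on the Options screen, which is handy when that screen itself is unreachable.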
Writing a robots.txt file is an easy process. Follow these simple steps: open Notepad, Microsoft Word, or any text editor and save the file as ‘robots’, all lowercase, making sure to choose .txt as the file type extension (in Word, choose ‘Plain Text’). Next, add the following two lines of text to your file: User-agent: … Read more
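Tying this back to the question at the top, here is a minimal sketch of a robots.txt that disallows posts, assuming a hypothetical permalink structure where all posts live under a /blog/ prefix (adjust the path to match your own permalinks):

    User-agent: *      # applies to all crawlers
    Disallow: /blog/   # blocks everything under the hypothetical /blog/ prefix

Save the file at the root of the site (e.g. http://website.com/robots.txt), since crawlers only look for it there.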
Yes, you can definitely run it with the two URLs on the same box. The only reason I’ve ever seen against doing this is to avoid a duplicate content penalty from search engines. I’ve never done this on a Windows box, only Linux; however, assuming you can get both URLs to reach the … Read more
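As a rough sketch of how both URLs might reach the same site on a Linux box running Apache (the hostnames are placeholders, and this assumes both names already resolve to the server in DNS):

    <VirtualHost *:80>
        # ServerAlias lets the second hostname hit the same document root,
        # so a single install answers for both URLs.
        ServerName  www.example.com
        ServerAlias example.org
        DocumentRoot /var/www/site
    </VirtualHost>

A 301 redirect from one hostname to the other would sidestep the duplicate content concern mentioned above.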
I’ve spoken to the SEOmoz guys about this: it’s neither an error nor an “issue”, and they’re making the appropriate changes to their system to reflect that.
I haven’t tried these yet, but I was looking at exactly the same thing the other day: http://magstags.com/wordpress-seo/remove-date-meta-descriptions-wordpress-seo/ and http://seo-ho.blogspot.com/2011/02/how-to-remove-date-from-meta.html
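For what it’s worth, a hedged sketch of the functions.php approach those posts describe, assuming the Yoast SEO plugin’s wpseo_metadesc filter; the date pattern below is hypothetical, so match it to what actually appears in your snippets:

    // In your theme's functions.php.
    // Strips a leading "May 12, 2011 - " style date from the meta
    // description before the Yoast SEO plugin outputs it.
    function mytheme_strip_date_from_metadesc( $description ) {
        return preg_replace( '/^\w+ \d{1,2}, \d{4}\s*-\s*/', '', $description );
    }
    add_filter( 'wpseo_metadesc', 'mytheme_strip_date_from_metadesc' );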
You won’t get blacklisted. A small amount of descriptive duplicate content on the same URL tends to be a non-factor, since it is very common. Duplicate content across multiple URLs, under different parent URLs, or in large amounts can dilute your rankings, hence the use of canonical URLs. P.S. This question has nothing to do with WordPress.
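For reference, the canonical-URL mechanism mentioned above is just a link element in the page head; a sketch with a placeholder URL:

    <!-- Tells search engines which URL is the primary version when the
         same content is reachable at several URLs. WordPress emits this
         automatically on single posts and pages. -->
    <link rel="canonical" href="http://example.com/original-page/" />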