Killer SEO Audit Tips from an Expert
SEO audits are something every SEO professional both looks forward to and dreads. They are dreaded because they often mean hours and hours of work. However, there is also a very real sense of accomplishment that comes from finding out exactly why a website isn’t performing as well as it should and resolving the issue.
Most of us have our own audit method, and for the most part we stick to it, never adding to our checklists. This may save time, but it also leaves SEO professionals and their clients at a disadvantage. Search technology is a constantly changing field, and things that were once commonplace (like verifying Google Authorship) aren’t so important now. It’s also true that the needs of different clients should lead you to tailor your checklist for the best results.
The point here is that some things are often overlooked in SEO audits, even by experienced professionals. The following are a few of the more common problems that can lead to inconclusive audits, and that you might want to add to your audit checklist the next time around.
SEO Audit Tips
If a site has two home pages, you have two issues to contend with: duplication and split PageRank. This is something you’ll find most often in older sites that feature a separate “home” link, and it’s more common than you’d think. As far as duplicate content goes, you probably already look for it on other sites, but if you aren’t also checking within the site you’re auditing, you should be, right down to duplicate title and meta tags.
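A quick way to surface second-home-page duplicates is to scan the paths your crawler found for the usual homepage variants. This is a minimal sketch; `HOME_VARIANTS` and `duplicate_home_paths` are illustrative names, not from any particular SEO tool, and the variant list is an assumption you should extend for your own stack:

```python
# Illustrative list of paths that commonly duplicate the root page "/".
# Extend this for the CMS or server setup you are auditing (assumption).
HOME_VARIANTS = {"/index.html", "/index.htm", "/index.php", "/home", "/default.aspx"}

def duplicate_home_paths(crawled_paths):
    """Return crawled paths that look like duplicates of the homepage."""
    return sorted(p for p in crawled_paths
                  if p.lower().rstrip("/") in HOME_VARIANTS)

# Example: a crawl that found both "/" and two homepage variants.
print(duplicate_home_paths(["/", "/about", "/index.html", "/Home"]))
```

In a real audit you would also confirm each flagged variant returns a 200 of its own rather than redirecting to `/`; only the ones that serve a live duplicate page split PageRank.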
Status codes are something which is often overlooked. When you do an audit, do you check to make sure that 404 pages actually return a 404 status? How about redirects? Are they status code 301, and are any of them chained through multiple hops? There are plenty of free, easy-to-use tools out there which can give you a site’s status codes and redirect paths at a glance, so there’s no excuse to skip this check.
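You can also script this check yourself. The sketch below walks a redirect chain and records every hop; `fetch` is an injected callable (`url -> (status_code, location_or_None)`) so the logic is testable offline, but in practice you would wrap something like `requests.head(url, allow_redirects=False)`. The function names are my own, not a standard API:

```python
def redirect_chain(url, fetch, max_hops=10):
    """Follow Location headers, returning each (url, status) hop.

    `fetch` is a callable url -> (status_code, location_or_None),
    injected so the chain-walking logic can run without a network.
    """
    chain = []
    for _ in range(max_hops):
        status, location = fetch(url)
        chain.append((url, status))
        if status in (301, 302, 303, 307, 308) and location:
            url = location  # keep following the redirect
        else:
            break
    return chain

# Offline demo: http -> https redirect, then a final 200.
hops = {"http://example.com/": (301, "https://example.com/"),
        "https://example.com/": (200, None)}
print(redirect_chain("http://example.com/", lambda u: hops[u]))
```

A chain longer than two entries means multiple redirects, and any hop that is a 302 where you expected a 301 is worth flagging.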
Orphaned pages are something else to watch out for. These are usually pages that get lost in the shuffle of a site redesign and end up with no internal links pointing to them. You can find them by comparing the sitemap against a crawl of the site, and it’s a good idea to do this regularly rather than duplicating effort recreating those pages later, or taking a hit from search engine algorithms for having orphaned pages.
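The sitemap comparison described above is a straightforward set difference: every URL the sitemap declares that your crawler never reached via an internal link is an orphan candidate. A minimal sketch, with illustrative function names:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Extract every <loc> URL from a sitemap XML document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def orphaned_pages(sitemap_xml, internally_linked_urls):
    """Sitemap URLs that no internal link points to."""
    return sitemap_urls(sitemap_xml) - set(internally_linked_urls)
```

For example, if the sitemap lists `/a` and `/b` but a crawl only discovered a link to `/a`, then `/b` comes back as an orphan candidate. Candidates still need a manual look: some (like standalone landing pages) may be unlinked on purpose.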
Everyone knows that pages can be blocked from crawling by using the robots.txt file. At the same time, these pages can still show up in search results, and Google for one won’t hide them if it deems them relevant to a search query. If you want to remove an already-indexed page from the SERPs, use the noindex meta tag or the X-Robots-Tag HTTP header instead.
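When auditing, it helps to verify that a page you expect to be de-indexed actually carries a noindex directive in one of its two possible locations: the `<meta name="robots">` tag in the HTML, or the `X-Robots-Tag` response header. A minimal sketch, assuming you already fetched the page body and headers yourself:

```python
from html.parser import HTMLParser

class _RobotsMeta(HTMLParser):
    """Collect the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and (d.get("name") or "").lower() == "robots":
            self.directives.append(d.get("content") or "")

def is_noindexed(html, x_robots_header=""):
    """True if the page asks search engines not to index it,
    via either the meta robots tag or the X-Robots-Tag header."""
    parser = _RobotsMeta()
    parser.feed(html)
    combined = ",".join(parser.directives + [x_robots_header]).lower()
    return "noindex" in combined
```

One caveat worth remembering: if robots.txt blocks the page, crawlers may never see the noindex directive at all, so unblock the page while you wait for it to drop out of the index.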