Over the coming months I am looking to document everything there is to know (or, more likely, everything I can find out…) about SEO. This post is about Website Quality Indicators – what does the industry believe proves to a search engine that your website is an authority and can be trusted?
*I am more than happy to be corrected, and I will edit this as I go along!
What quality indicators might a search engine use to judge a website? (Props to Stuntdubl)
- Being hosted on a dedicated IP
- Outbound links (these might be the biggest, IMHO – not only to put your site in its topical neighbourhood, but also just a plain old GOOD neighbourhood)
- Doctype and language metadata in your header (see the first snippet after the list)
- Invalid code, but linking to the W3C validator (“we tried!”)
- The existence of access keys (accessibility best practice; see the second snippet after the list)
- A ‘skip navigation’ link (accessibility best practice; see the third snippet after the list)
- Long domain registration period
- Consistent link acquisition over time
- Low link rot
- Few broken links
- User repeat visits
- Users adding your site, or notes about it, in personalized search results
- Visitor duration – though this one is often disputed by SEOs who like to rationalize their links pages as “resources”
- A high proportion of users bookmarking the page
- Steady SERP position
- CTR of links
- DMOZ listing (yes, I hate this too, but I think a lot of us have seen enough decent evidence to at least speculate that it is a quality indicator)
- Privacy Policy
- Contact page with physical address and phone number
- Submit to local listings (assists with the above)
- Extended period domain registration
- Get a half dozen trusted links (if I told you which ones, they’d no longer be trusted)
- Legit WHOIS data that matches other records
- Dedicated IP*
- Adding a FEW trusted outbound authority links – Wikipedia, industry associations, etc.
- Valid code (or close to it, anyhow) – you can argue this all day, but on a massive scale better code = higher quality
- Fast server response time*
- No 404s
- Extremely limited downtime*
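On the doctype and language point, a minimal, valid HTML5 head might look like the sketch below. The lang value, charset, and title are placeholders; the point is simply that a crawler can identify the document type and language without guessing.

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- charset, lang value, and title are placeholders -->
  <meta charset="utf-8">
  <title>Example page</title>
</head>
<body>…</body>
</html>
```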
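For access keys, the standard HTML mechanism is the accesskey attribute. The assignments below are only an illustration (they loosely follow one old numbered-access-key convention); any consistent, documented scheme should serve the same accessibility purpose.

```html
<!-- Illustrative only: accesskey values are a site-wide convention,
     not fixed standards; these follow one common numbering scheme -->
<a href="/" accesskey="1">Home</a>
<a href="/sitemap" accesskey="3">Sitemap</a>
<a href="/search" accesskey="4">Search</a>
```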
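And for the ‘skip navigation’ item, the usual pattern is a link placed as the very first focusable element in the body, jumping past the navigation to the main content. The id and class names here are placeholders:

```html
<body>
  <!-- First focusable element on the page; id and class are placeholders -->
  <a class="skip-link" href="#main-content">Skip navigation</a>
  <nav><!-- site navigation --></nav>
  <main id="main-content">
    <!-- page content starts here -->
  </main>
</body>
```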