As this visualisation shows, search engines discover billions of pages across the web by following hyperlinks. To do this, search engines use their own crawlers, also known as search engine bots or spiders. Once a new page has been discovered, it can be rendered and indexed. Pages are then scored on various authority and relevance metrics to ensure the most relevant results are retrieved for a given search query.
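The "discovery via hyperlinks" step can be sketched in a few lines. This is a minimal illustration, not how any real search engine is implemented: it only extracts links from already-fetched HTML (a real crawler would also fetch, render and respect robots.txt). The example URLs are made up.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags, resolved against the page URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative URLs against the page's own URL
                    self.links.append(urljoin(self.base_url, value))

def extract_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

html_doc = '<a href="/about">About</a> <a href="https://example.org/blog">Blog</a>'
print(extract_links("https://example.com/", html_doc))
# A breadth-first crawl would push these discovered URLs onto a queue,
# deduplicate them, and fetch each one in turn.
```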
Search engines run some of the most sophisticated data mining operations and algorithms of any technology. They continually collect data to understand an individual's behaviour and make the results they display more user-centric. It's crucial to understand the intent behind queries by dissecting the language used, and to entice users to click on your web page with breadcrumbs, clear navigation, fast site speed and a seamless user experience.
Be a lot more human during your auditing process. It's always encouraged to manually audit your site: interact with and navigate it just as Googlebot and users would, and get familiar with the site as a whole. Look at the overall site architecture, the high-value and low-value pages, and how the internal linking works, and get a feel for it. Then try to identify the technologies the website is built on; for example, use the Chrome extension Wappalyzer to quickly see what a website is built with. Finally, decide which tools will be best for crawling the site.
Continue to troubleshoot the site across accessibility, crawlability, indexability and the other areas of the site, and always make it bespoke to every single audit.
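One crawlability check is quick to automate: whether a given bot is allowed to fetch a URL under the site's robots.txt rules. A minimal sketch using Python's standard library, with an illustrative (made-up) robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt rules, purely for illustration.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# can_fetch() answers: may this user agent crawl this URL?
print(rp.can_fetch("Googlebot", "https://example.com/blog/"))      # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

In practice you would point `RobotFileParser` at the live `/robots.txt` of the site being audited and test the URLs your crawl discovered.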
It’s also really important to understand who you are actually delivering your audit to, as this will help you structure your recommendations. If it’s being delivered straight to a developer, the recommendations should be incredibly concise and actionable. However, if it’s being delivered to other stakeholders, who may not come from a technical background or have a real understanding of SEO, include the reasoning behind each recommendation: for example, the return on investment, or the KPIs the recommendation is looking to tackle.
Don’t worry about what others are doing. Try not to be fazed by those around you who may have more experience or have been in the industry a lot longer than you. Technical SEO is very much a learning curve, and skill comes with all the analysis you complete. Georgie taught herself a lot of technical SEO; it was not learnt at university. Whether you’re working in an agency, in-house or freelance, try to audit as many sites as possible, whether that means using a crawling tool, auditing manually, or a combination of both. This will give you exposure to a range of issues across a range of websites that you can look out for in the future, and you can also lean on those around you.
Technical SEO is a discipline; it can’t be learnt overnight. It’s ever-evolving, and knowledge comes with time, so keep at it and keep up to date with industry news. Twitter is great for that. Take the time to really empower those around you, as diverse thinking breeds creativity in all areas of life, technical SEO included. Reach out and explore the incredible communities out there, such as Women in Tech SEO.