Basic website optimization creates a solid foundation for OnPage optimization; it consists of measures that affect the entire website. These help search engines, and especially search engine bots, to find content more quickly and to evaluate it more accurately.
Website structure and URL structure
For an optimal website and URL structure, it is determined at which sub-levels individual sub-pages should be placed. Too deep a hierarchy can cause search engines to crawl the content more irregularly and to notice new publications or changes to existing sub-pages only later. It is also decided, for example, which main menu items or category areas should contain which content.
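As an illustration, here is a minimal sketch that flags URLs buried too deep in the hierarchy. The URLs and the threshold of three path levels are invented examples; the "right" depth always depends on the individual site:

```python
from urllib.parse import urlparse

MAX_DEPTH = 3  # example threshold, not a fixed rule

def url_depth(url: str) -> int:
    """Count the path segments of a URL, e.g. /shop/shoes/sneakers -> 3."""
    path = urlparse(url).path
    return len([seg for seg in path.split("/") if seg])

urls = [
    "https://www.example.com/shop/shoes/",
    "https://www.example.com/shop/shoes/sneakers/air-runner-2000/",
]

for url in urls:
    depth = url_depth(url)
    if depth > MAX_DEPTH:
        print(f"too deep ({depth} levels): {url}")
```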
At the same time, the format of individual URLs is reviewed and, where necessary, optimized: URLs should be user-friendly and easy to type, and meaningful enough that users can tell from the URL what a sub-page contains. Depending on the use case, it is also recommended to place the main keyword in the URL. Considerations like these go into structuring the URLs.
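How such user-friendly URLs might be produced can be sketched with a simple slug function. The exact rules (separator, handling of special characters) are assumptions here, chosen only for illustration:

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a page title into a readable, easy-to-type URL slug."""
    # Transliterate accented characters to their ASCII base form.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    text = text.lower()
    # Replace every run of non-alphanumeric characters with a single hyphen.
    text = re.sub(r"[^a-z0-9]+", "-", text).strip("-")
    return text

# The main keyword ends up at the front of the URL.
print(slugify("OnPage Optimization: The Complete Guide (2024)"))
# -> onpage-optimization-the-complete-guide-2024
```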
Sitemap creation
Sitemap creation usually happens in a similar way and in parallel with the creation of the website structure. For example, one or more XML sitemaps are created that list all available URLs in a machine-readable format. Such an XML sitemap can be submitted to search engines to significantly accelerate the indexing of sub-pages and to inform search engines about updated content sooner. Newly published sub-pages and revised content then become findable for users in search engines much faster.
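A minimal sketch of what generating such an XML sitemap could look like, assuming the list of URLs and modification dates comes from the site's own database (the URLs below are placeholders):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical input: URL plus date of last modification.
pages = [
    ("https://www.example.com/", "2024-05-01"),
    ("https://www.example.com/blog/onpage-optimization/", "2024-05-12"),
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = loc
    SubElement(url, "lastmod").text = lastmod

with open("sitemap.xml", "wb") as f:
    f.write(b'<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write(tostring(urlset))
```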
Depending on the use case, it can also be useful to provide an HTML variant of the XML sitemap. On very large websites, such a page helps search engine bots find all available content more quickly.
Crawlability for search engines
Across the entire website, search engine bots can also be steered in various ways. This is particularly advantageous for very large sites, since important content can be presented to the bots more intelligently: changes to category pages or other important sub-pages are then indexed more quickly. This so-called crawlability is important because search engines set a kind of upper limit for crawling (that is, for examining sub-pages): the so-called crawl budget.
Making the most of the crawl budget is therefore important so that the more important content stays up to date in search engines, while less important content can be re-checked less frequently.
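One common lever for steering bots is the robots.txt file. A sketch using only Python's standard library to check whether a given URL may be crawled; the domain, rules, and URLs are illustrative assumptions:

```python
from urllib.robotparser import RobotFileParser

# Excluding low-value areas (e.g. internal search results) in robots.txt
# leaves more of the crawl budget for the important pages.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for url in [
    "https://www.example.com/blog/onpage-optimization/",
    "https://www.example.com/search?q=shoes",
]:
    print(url, "->", "crawlable" if rp.can_fetch("Googlebot", url) else "blocked")
```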
Indexing management
Especially for new sub-pages or a website relaunch, it is important to get new content into the search engine index promptly. On large sites in particular, this also includes deciding which content should be checked and indexed, when, and at what intervals. In search engine optimization, this area is called indexing management; together with the crawl budget, it is essential for keeping the site up to date in search engines such as Google and Bing.
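In practice, indexing management is often implemented via the robots meta tag in each page's head. A minimal sketch of the decision logic; the rule of which page types stay out of the index is a hypothetical example:

```python
# Hypothetical rule set: which page types should not end up in the index?
NOINDEX_TYPES = {"internal_search", "filter_page", "print_version"}

def robots_meta_tag(page_type: str) -> str:
    """Return the robots meta tag to embed in the page's <head>."""
    if page_type in NOINDEX_TYPES:
        # Keep the page out of the index, but let bots follow its links.
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_tag("filter_page"))   # noindex, follow
print(robots_meta_tag("blog_article"))  # index, follow
```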
Redirecting content that is no longer accessible
If sub-pages are deleted or moved within the website, requests to the previous URL trigger a 404 “Not found” error. That previous URL, however, usually held good positions for certain search terms or was linked from other websites. If such URLs are not redirected, the website loses this strength in search engines.
To prevent this, search engine optimizers set up redirects (in technical jargon: 301 redirects) that point to a suitable new URL. Users who call up the old URL then no longer receive a 404 error message; instead, the page behind the stored redirect opens. And search engines can transfer the “strength” of the old URL to the redirect target, as long as it covers the same topics and content.
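A minimal sketch of such a 301 redirect with Python's built-in HTTP server. The old-to-new URL mapping is an invented example; real sites would typically configure this in the web server itself:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of deleted or moved URLs to their new destinations.
REDIRECTS = {
    "/old-seo-guide/": "/blog/onpage-optimization/",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 tells search engines the move is permanent, so the old
            # URL's "strength" can be transferred to the new one.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404, "Not Found")

HTTPServer(("", 8000), RedirectHandler).serve_forever()
```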
Missing redirects are among the most common and most damaging mistakes in website relaunches and a frequent cause of ranking losses. Redirects should always be set up and kept up to date.
Pagespeed optimization for fast loading times
During OnPage optimization, good search engine optimizers also keep an eye on the load time of each individual sub-page. So-called pagespeed optimization is about delivering a fast page with the smallest possible amount of data, especially to mobile users. Pagespeed has meanwhile become a ranking factor in its own right and matters most for mobile users.
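A rough sketch of how the two figures named here, load time and transferred data volume, could be measured per sub-page. This is a simple timing with the standard library, not a full pagespeed audit, and the URL is a placeholder:

```python
import time
import urllib.request

def measure(url: str) -> None:
    """Print response time and body size for a single URL."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    elapsed = time.perf_counter() - start
    print(f"{url}: {elapsed:.2f}s, {len(body) / 1024:.0f} KiB")

measure("https://www.example.com/")
```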
Whether search engine optimizers can carry out pagespeed optimization themselves depends on several factors. Usually the aim is not for them to implement it personally, but to hand the findings over to the website's front-end developers.
Mobile-friendly optimizations
For mobile users in particular, it is essential to receive a user-friendly version of a website. If a website is not “responsive”, the search engine optimizer will point this out. Mobile-friendliness is in fact a ranking factor for mobile searches.
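Whether a page at least declares a mobile viewport can be checked automatically. A sketch using only the standard library's HTML parser; the viewport meta tag is a common, though not sufficient, signal of a responsive page, and the sample HTML is invented:

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Detect a <meta name="viewport" ...> tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name") == "viewport":
            self.has_viewport = True

html = ('<html><head><meta name="viewport" '
        'content="width=device-width, initial-scale=1"></head></html>')
checker = ViewportChecker()
checker.feed(html)
print("viewport meta tag found" if checker.has_viewport else "no viewport meta tag")
```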
In the same way, the search engine optimizer can also assess which content is actually needed on mobile devices and which can be left out. Professional optimization can even extend to responsive content, where shortened versions of texts are delivered to mobile users without reducing the information content.
Duplicate content avoidance
If the same content can be found on several sub-pages, this is known as duplicate content. Duplicate content is a problem for search engines because they cannot tell which of the pages should rank. At the same time, every duplicate page dilutes the internal link strength.
For OnPage optimization, it is essential that each piece of content exists on only one sub-page and that no duplicate content can arise. This is exactly what is checked and enforced during an OnPage optimization.
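A simple way to find exact duplicates is to hash the main content of each sub-page and group identical hashes. A minimal sketch, assuming the extracted text per URL is already available; the URLs and texts are invented examples:

```python
import hashlib
from collections import defaultdict

# Hypothetical extracted main content per URL.
pages = {
    "https://www.example.com/shoes/": "Our sneaker collection ...",
    "https://www.example.com/shoes/?sort=price": "Our sneaker collection ...",
    "https://www.example.com/boots/": "Sturdy boots for winter ...",
}

groups = defaultdict(list)
for url, text in pages.items():
    digest = hashlib.sha256(text.encode()).hexdigest()
    groups[digest].append(url)

for urls in groups.values():
    if len(urls) > 1:
        # Identical content under several URLs: candidates for a
        # canonical tag or a redirect.
        print("duplicate content:", urls)
```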
Duplicate content is usually one of the main reasons why a website or individual sub-pages do not rank well in search engines, and thus one of the greatest optimization levers in OnPage optimization.