Make Your Site More Visible – Conduct an Effective SEO Audit
Search engines such as Google analyze dozens of parameters of your site. They do this to assess how "useful" the information is and to determine each page's rank. Did you know that a typical user rarely even opens the second page of search results? That is why it is important to be in the top 10, and to get there the site has to be optimized. You also need high-quality content that gives your visitors real value. With the tips in this article, a first-pass audit takes about 15 minutes. Below we explain which aspects deserve your attention.
Scanning
A quick audit is useful, but a comprehensive one is invaluable. The first step in any site audit is to run a full crawl of the site. Make sure your crawling tool is configured to follow every kind of link (JavaScript, image, video, Flash, etc.). The duration of the scan depends on the size of your site and can take anywhere from a minute to several hours.
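If you do not have a dedicated crawler at hand, a few lines of Python can give a rough first picture. The sketch below is only an illustration of the idea, not a full audit tool: it assumes the third-party requests library (pip install requests), a made-up start URL, and a small page limit, and it simply walks internal links breadth-first while recording each page's response code.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests  # third-party: pip install requests

START_URL = "https://example.com/"  # hypothetical start page
MAX_PAGES = 50                      # keep the sketch small and polite


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=MAX_PAGES):
    host = urlparse(start_url).netloc
    seen, queue, statuses = {start_url}, deque([start_url]), {}
    while queue and len(statuses) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            statuses[url] = f"error: {exc}"
            continue
        statuses[url] = resp.status_code
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # don't try to parse images, PDFs, etc.
        parser = LinkExtractor()
        parser.feed(resp.text)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            # Stay on the same host and skip pages we have already queued.
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return statuses


if __name__ == "__main__":
    for page, code in crawl(START_URL).items():
        print(code, page)
```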
Check for site indexing issues
Once the crawl is finished, review everything that affects a search spider's ability to crawl the site. The most common things to check are listed below (a quick self-check is sketched after this list):
- Does robots.txt allow all the necessary pages to be crawled?
- What server response codes do your pages return?
- Are the meta name="robots" and link rel="canonical" tags filled in correctly?
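As a rough self-check of these three points, the following sketch asks robots.txt whether each page may be crawled, fetches it to read the response code, and looks for meta robots and rel="canonical" tags. It assumes the requests library and purely hypothetical example.com URLs; the tag extraction is deliberately naive.

```python
import re
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser

import requests  # third-party: pip install requests

SITE = "https://example.com"                   # hypothetical site
PAGES = ["/", "/blog/", "/category/widgets/"]  # hypothetical pages to check

robots = RobotFileParser()
robots.set_url(urljoin(SITE, "/robots.txt"))
robots.read()

for path in PAGES:
    url = urljoin(SITE, path)
    allowed = robots.can_fetch("*", url)  # does robots.txt allow crawling?
    resp = requests.get(url, timeout=10)
    # Deliberately naive tag extraction; a real audit would use an HTML parser.
    meta_robots = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, re.I)
    canonical = re.findall(r'<link[^>]+rel=["\']canonical["\'][^>]*>', resp.text, re.I)
    print(f"{url}: allowed={allowed}, status={resp.status_code}, "
          f"meta robots={meta_robots or 'none'}, canonical={canonical or 'none'}")
```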
Check Sitemap.xml
Check that your sitemap is up to date: it should contain every page you are promoting. If you use a plugin to create and manage the sitemap automatically, make sure it can both generate and update the XML map. When building a sitemap, two rules matter (a cross-check is sketched after this list):
- Every new page should automatically appear in sitemap.xml;
- If a page is blocked in robots.txt or by a meta tag, it should not appear in the sitemap.
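A minimal cross-check of those two rules, assuming the requests library and a hypothetical sitemap location: it pulls sitemap.xml, reads every <loc> entry, and flags URLs that are disallowed in robots.txt or do not return a 200.

```python
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

import requests  # third-party: pip install requests

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical location
ROBOTS_URL = "https://example.com/robots.txt"
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"  # standard sitemap namespace

robots = RobotFileParser()
robots.set_url(ROBOTS_URL)
robots.read()

sitemap = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in sitemap.iter(f"{NS}loc"):
    url = loc.text.strip()
    if not robots.can_fetch("*", url):
        # Rule 2: a URL disallowed in robots.txt should not be listed in the sitemap.
        print("Blocked by robots.txt but listed in the sitemap:", url)
    elif requests.head(url, timeout=10, allow_redirects=True).status_code != 200:
        print("Listed in the sitemap but not returning 200:", url)
```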
Check robots.txt
A page can be blocked from indexing for several reasons: it may contain a "noindex" meta tag in its header, or robots.txt may close it from indexing. To scan the site I use the Netpeak Spider utility, which reports both cases during a crawl. Check whether any important pages are blocked and, conversely, whether pages that should be closed from indexing really are. The robots.txt file is very important for proper internal optimization: with the Disallow and Allow directives you control which sections of the site search bots may crawl.
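If you prefer to double-check a few critical pages by hand, the sketch below (assuming the requests library and hypothetical URLs) fetches each page and reports whether a meta robots tag with a noindex directive is present.

```python
from html.parser import HTMLParser

import requests  # third-party: pip install requests

URLS = [  # hypothetical pages to inspect
    "https://example.com/",
    "https://example.com/private/report/",
]


class MetaRobotsFinder(HTMLParser):
    """Remembers the content of every <meta name="robots"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append(attrs.get("content") or "")


for url in URLS:
    finder = MetaRobotsFinder()
    finder.feed(requests.get(url, timeout=10).text)
    if any("noindex" in d.lower() for d in finder.directives):
        print("noindex found on:", url)
    else:
        print("no noindex meta tag on:", url)
```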
Check the site for errors
Errors in server response codes are easy to find: open Google Search Console and go to Crawl > Crawl Errors. Fix every 4XX or 5XX error you find. Correct internal links that point to invalid pages, and set up 301 redirects from removed pages that still receive external links you cannot take down. Also make sure the site has a proper 404 error page that sends visitors on to the important pages of the site.
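One quick way to confirm the error page behaves correctly is to request a path that certainly does not exist and look at the status code: a helpful 404 page should still be served with a 404 status, not a 200. This is a soft-404 spot check rather than anything from the tools above; a minimal sketch, assuming the requests library and a made-up domain:

```python
import uuid

import requests  # third-party: pip install requests

SITE = "https://example.com"  # hypothetical site

# Request a path that is virtually guaranteed not to exist.
missing_url = f"{SITE}/{uuid.uuid4().hex}"
resp = requests.get(missing_url, timeout=10)

if resp.status_code == 404:
    print("Good: missing pages return a real 404.")
elif resp.status_code == 200:
    print("Soft 404: the error page is served with status 200 - fix this.")
else:
    print("Unexpected status for a missing page:", resp.status_code)
```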
Re-Indexing Request
Whenever you make significant changes, you can ask search engines to crawl the affected pages again; this also lets you verify that the main mistakes have been eliminated. To submit a re-indexing request to Google, open Google Search Console, go to Scan > View as Googlebot, paste the URL of the updated page, and click the "Scan" button. Checklist:
- Correct all errors on the site.
- Check the robots.txt file.
- Ensure that all important pages can be indexed.
- Correct pages with incorrect response codes.
- Submit updated pages for re-indexing.
Avoid redirects
Redirects that are not configured correctly when a page is moved or its URL changes are a potential SEO problem. Here are some of the most common issues (see the redirect check sketched after this list):
- Replace temporary redirects (302) with 301 on all pages that will never come back. A 302 redirect is temporary: it tells search engines that you are redirecting visitors for now but the page will return soon. As a result, a 302 does not pass the weight of the missing page to the new one the way a 301 does. A 302 is appropriate while you test a new page, but if the old page is gone for good, configure a 301 redirect from it.
- Check rel="canonical". If the same content appears on several pages, specify which page is the main one by setting it as the canonical URL. Each page must have at most one rel="canonical" tag.
- Fix HTML and CSS markup errors. Errors in the code can significantly slow down your site. Regular validation checks help you detect and fix most of them, so find and correct as many markup errors as you can.
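To see which kind of redirect a retired URL actually returns, you can request it without following redirects and inspect the status code and Location header. The sketch below assumes the requests library and hypothetical old URLs:

```python
import requests  # third-party: pip install requests

OLD_URLS = [  # hypothetical retired URLs
    "https://example.com/old-product/",
    "https://example.com/2019/summer-sale/",
]

for url in OLD_URLS:
    # allow_redirects=False shows the redirect itself instead of its target page.
    resp = requests.get(url, timeout=10, allow_redirects=False)
    target = resp.headers.get("Location", "-")
    if resp.status_code == 301:
        print(f"OK   301 {url} -> {target}")
    elif resp.status_code in (302, 303, 307):
        print(f"WARN {resp.status_code} {url} -> {target} (temporary; use 301 if the move is permanent)")
    else:
        print(f"INFO {resp.status_code} {url} (no redirect in place)")
```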
Make Perfect URLs
One of the easiest ways to recover lost external links is to find and fix external links that point to pages that no longer exist on your site. Use a crawling tool, or the "External links" report in Google Webmasters, to find the missing pages that external links point to, then either configure 301 redirects from them to relevant pages or ask the webmaster of the linking resource to update the link to the corresponding, current page.
- Get rid of redundant outbound links. A page with 100 or more outbound links can look like spam to search engines, so keep the number of outbound external links as low as possible.
- Get rid of unnecessary redirects. Wherever possible, remove redirects so visitors reach the destination page directly. Redirects not only increase page load time, they also signal to search engines that the site has problems.
- Get rid of dynamic URLs. Dynamically generated URLs are hard to read and tell users nothing about where they are on the site, so replace them with readable, static ones where you can.
- Reduce the maximum URL length to 40 characters. Recent observations suggest that URLs of up to 35-40 characters dominate the top of the search results. Longer URLs do get ranked as well, but in practice it is better to keep them short.
- Define your primary domain. Decide exactly which address your site should answer on, with or without www. Check that the site works at only one address and that the other one returns a 301 redirect to it. For example, if your main domain is http://site.com and a visitor enters http://www.site.com, they should automatically be redirected to the version without www. If the site is available both with and without www, use the MOZ toolbar to see which version has the most external links, keep only that one, and set up a 301 redirect from the other. Also make sure the site is reachable over only one protocol, http:// or https://, and handle the trailing slash the same way: pick one form, with "/" or without, and redirect the other (a quick check is sketched below).
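A quick way to verify the primary-domain setup is to request each address variant without following redirects and see which one answers with 200 and which ones return a 301. The sketch below assumes the requests library and reuses the article's site.com placeholder:

```python
import requests  # third-party: pip install requests

# The article's site.com placeholder; all four variants should resolve to one canonical address.
VARIANTS = [
    "http://site.com/",
    "http://www.site.com/",
    "https://site.com/",
    "https://www.site.com/",
]

for url in VARIANTS:
    resp = requests.get(url, timeout=10, allow_redirects=False)
    if resp.status_code in (301, 308):
        print(f"{url} -> {resp.status_code} -> {resp.headers.get('Location')}")
    elif resp.status_code == 200:
        print(f"{url} answers with 200 (this should be the one canonical version)")
    else:
        print(f"{url} returned {resp.status_code}")
```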
Optimize Meta Tags
Correcting meta tags does not take much time, but it can noticeably improve your positions in search and raise your pages' CTR. Don't overdo it, though: the primary goal of the meta description is to tell visitors (and search spiders) what the page is about; add keywords where they fit naturally. A small checker is sketched after this list.
- Get rid of empty or overly long Titles and Descriptions. Make sure all your meta tags stay within the recommended character counts (50-60 characters for the Title; 150-160 for the Description).
- Avoid duplicate Titles and Descriptions. Many sites, online stores in particular, suffer from duplicate content and meta descriptions. Duplicates can confuse search engines to the point that none of the affected pages ranks, so keep the content on the site as unique as possible.
- Optimize the heading tags (H1-H6). The H1 tag is an important page-level ranking factor: use it to target the specific keywords users are searching for. Avoid repetition in the other heading tags (H2, H3, etc.) and use them to extend the page's semantics with secondary keywords where appropriate. Also make sure heading tags are not used purely for page layout. Open the page code and check that there is exactly one H1 per page and that it contains the main keyword.
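As a rough checker for these points, the sketch below (assuming the requests library and hypothetical URLs) extracts the title, meta description, and H1 count from each page and flags empty or overlong tags and missing or repeated H1s:

```python
from html.parser import HTMLParser

import requests  # third-party: pip install requests

URLS = ["https://example.com/", "https://example.com/blog/"]  # hypothetical pages


class MetaAudit(HTMLParser):
    """Collects the <title> text, the meta description and the number of <h1> tags."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


for url in URLS:
    audit = MetaAudit()
    audit.feed(requests.get(url, timeout=10).text)
    issues = []
    if not audit.title or len(audit.title) > 60:
        issues.append(f"title is {len(audit.title)} chars (roughly 50-60 recommended)")
    if not audit.description or len(audit.description) > 160:
        issues.append(f"description is {len(audit.description)} chars (roughly 150-160 recommended)")
    if audit.h1_count != 1:
        issues.append(f"{audit.h1_count} H1 tags (there should be exactly one)")
    print(url, "->", "; ".join(issues) or "looks fine")
```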
Content optimization
High-quality content has long been a decisive ranking factor, so make sure every page has enough content and that the content is unique and useful. Find pages with thin content: I do this in WebSite Auditor by sorting all pages by the "Word Count" column. Check that every page has at least 1-2 paragraphs of text (a minimum of about 250 words), and while you are at it, review the formatting and readability of the text.
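If you want a quick word-count pass without a dedicated tool, the sketch below (assuming the requests library and hypothetical URLs) strips the markup very crudely and flags pages under roughly 250 words; treat the numbers as an approximation, since navigation and footer text are counted too.

```python
import re

import requests  # third-party: pip install requests

URLS = ["https://example.com/", "https://example.com/services/"]  # hypothetical pages
MIN_WORDS = 250

for url in URLS:
    html = requests.get(url, timeout=10).text
    # Very rough: drop scripts/styles and all tags, then count whitespace-separated words.
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    words = len(text.split())
    print(f"{url}: {words} words{' - THIN CONTENT' if words < MIN_WORDS else ''}")
```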
Image Optimization
Review all the images on the site (a small checker is sketched after this list). You need to:
- Make sure there are no broken links to images;
- Optimize image sizes for how they are displayed (if the site shows an image at 500×200, do not upload a 3000×1800 picture);
- Check that each image has unique and informative alt and title attributes.
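A small script can cover all three checks at once. The sketch below assumes the requests library and a hypothetical page: it collects every <img> tag, verifies that the image URL responds, and reports missing alt or title attributes.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

import requests  # third-party: pip install requests

PAGE = "https://example.com/"  # hypothetical page to inspect


class ImageCollector(HTMLParser):
    """Collects the src, alt and title attributes of every <img> tag."""

    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            self.images.append((attrs.get("src") or "",
                                attrs.get("alt") or "",
                                attrs.get("title") or ""))


collector = ImageCollector()
collector.feed(requests.get(PAGE, timeout=10).text)

for src, alt, title in collector.images:
    url = urljoin(PAGE, src)
    # HEAD is enough to see whether the image link is broken.
    status = requests.head(url, timeout=10, allow_redirects=True).status_code
    problems = []
    if status >= 400:
        problems.append(f"broken link ({status})")
    if not alt:
        problems.append("missing alt text")
    if not title:
        problems.append("missing title attribute")
    print(url, "->", "; ".join(problems) or "ok")
```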