Tue Sep 27 2022 · 11,921 words
This comprehensive technical SEO guide aims to cover every aspect of technical SEO optimization, helping you hit the maximum possible page speed score, rank as highly as possible in search, and improve user experience and conversions.
Technical SEO optimization is the process of optimizing at the browser and server level to improve page performance in search. Technical SEO can be broken down into some key areas with hundreds of different tasks.
This technical SEO tutorial can be broken down into 3 key areas of SEO technical optimization as well as touching on some other topics like user experience and security. It includes basic topics like meta tags and more advanced concepts like server optimization.
Map out and break down all the required work into small tasks, estimate the time and the impact, and start with the quick wins. If working on a very large website, start by optimizing key pages. Achieving the maximum possible page speed score, especially on large complex websites, can take weeks or months and can be a huge investment. The best SEO strategy is careful planning and an awareness of SEO technical optimization priorities. This SEO guide will help you define those tasks and priorities.
At a minimum, use the following tools to analyze your page's technical SEO. They should spot about 90% of issues and are all free.
When looking to optimize page speed as part of technical SEO optimization you can look at things from two perspectives: the browser and the server.
There can be some overlap between these areas on some optimization tasks. There can also be more demanding optimization requirements on one of these areas depending on the tech stack.
The SEO browser optimization part of this technical SEO guide revolves around adding the correct configuration and getting the page rendered in the browser as quickly as possible.
Code optimizations can be a time-consuming part of technical SEO and browser optimization - the larger the code base, the more refactoring you will have to do.
The W3C markup validator is one of the handiest tools for checking code quality. It can identify all sorts of issues such as incorrect tag nesting and wrong / missing HTML attributes. It can ensure you have standards compliant code. Some issues that are identified by this tool may not be fixable and can be skipped but everything that can be fixed should be fixed.
On static websites more nesting means more effort is required to render the page. In terms of development it can also lead to a bloated and hard to follow code base. In front end frameworks extra nesting can impact performance as the framework has to keep track of all DOM nodes and in turn more nesting means more resources used. Always use the minimum level of nesting where possible.
Anything that changes a page layout after the initial render can cause a layout shift and a re-render. A root cause of this issue can be using JavaScript to style elements instead of CSS. Dynamically adding HTML attributes with JavaScript is one example. Where possible always use CSS over JavaScript to style elements and change page layout. A layout shift is visible as a jump on the page and is detrimental to both user experience and page speed.
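As a sketch, the same visual result can come from JavaScript (causing a shift after the initial render) or plain CSS (applied on first paint) - the class names here are hypothetical:

```html
<!-- Avoid: styling with JavaScript after render forces a layout shift -->
<script>
  document.querySelector('.banner').style.height = '200px';
</script>

<!-- Prefer: declare the layout in CSS so it applies on first paint -->
<style>
  .banner { height: 200px; }
</style>
```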
Meta tags contain a lot of search engine information such as the page title and description, indexing rules, and URL, which makes them a key part of technical SEO. Optimizing meta tags is generally quite easy to do. You only need to make sure the required tags are present and have the correct values.
Preconnecting to the domains that host your assets saves the initial domain lookup time and provides a boost to page speed. Audit page assets and create preconnect link tags for the asset domains.
<head>
<link rel="preconnect" href="https://www.google-analytics.com">
</head>
Preloading fonts fetches the font stylesheet right away instead of waiting for the DOM to be constructed and provides a good page speed boost. Create a preload tag for each required font stylesheet.
<head>
<link rel="preload" as="style" href="https://fonts.googleapis.com/css2?family=Lato:wght@300;400;700&display=swap">
</head>
Always use a meta charset tag with the UTF-8 character set to specify the character encoding of your HTML page.
<head>
<meta charset="UTF-8">
</head>
Setting a viewport tag is a requirement on any modern website and ensures your page displays and scales correctly across devices.
<head>
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
</head>
A favicon is a .ico image used in the browser tab. It's standard practice to always include favicon meta tags.
<head>
<link rel="icon" href="https://php.fyi/favicon.ico" type="image/x-icon">
<link rel="shortcut icon" href="https://php.fyi/favicon.ico" type="image/x-icon">
</head>
There are free tools you can use online to generate a favicon .ico image from another image type like a JPEG or PNG.
Touch icons are used across different devices when displaying links to your website. An example is if you save a link to your website on your mobile device home screen - a touch icon will display here. Different device types such as Android or iOS can require different icon sizes. There are numerous tools you can use to generate a range of different touch icon sizes based off a single image.
<head>
<link rel="icon" type="image/png" sizes="32x32" href="https://php.fyi/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="96x96" href="https://php.fyi/favicon-96x96.png">
<link rel="apple-touch-icon" sizes="57x57" href="https://php.fyi/apple-touch-icon-57x57.png">
<link rel="apple-touch-icon" sizes="60x60" href="https://php.fyi/apple-touch-icon-60x60.png">
</head>
You can use a manifest.json file to configure how your web app displays on a mobile device.
{
"short_name": "PHP.FYI",
"name": "PHP.FYI",
"background_color": "#00708e",
"theme_color": "#00708e",
"display": "standalone",
"orientation": "portrait",
"icons": [
{
"src": "/apple-touch-icon-57x57.png",
"type": "image/png",
"sizes": "57x57"
},
{
"src": "/apple-touch-icon-60x60.png",
"type": "image/png",
"sizes": "60x60"
}
],
"start_url": "/?utm_source=homescreen"
}
Bloated CSS can have an impact on your page speed and in turn your overall technical SEO and browser optimization. Optimizing CSS consists of using a good structure around your CSS files and folders and using modern build tools.
Utility first CSS frameworks contain grid systems and commonly used utility classes and can help you reduce the size of your CSS.
Ensure all application and third party stylesheets are loaded via HTTPS.
Make sure whitespace etc. is removed in production environments from both application and third party CSS files. Bundlers will do all the heavy lifting around minifying CSS.
On large websites it can be easy to have a bloated CSS file loading a huge amount of unused CSS. To ensure the maximum CSS performance possible:
Add a rel="preload" attribute to any stylesheet tags to preload the file and apply it as quickly as possible
<head>
<link href="/css/app.css" rel="preload" as="style">
</head>
The CSS required to render the page content above the fold (critical CSS) should be inside a style tag in the head as early as possible and before any JavaScript. Inlining it means the page styles can be loaded as early as possible. The rest of the page styles below the fold should be inside an external CSS file.
<head>
<style type="text/css">
/* critical CSS */
</style>
<link href="/css/app.css" rel="stylesheet"/>
</head>
There should be one inline style tag containing the critical CSS and one stylesheet containing the CSS for the below the fold content. This is the optimum number and results in the fastest possible loading of CSS.
Style tags inside the body can cause layout shifts and should be avoided. CSS should only be present in the head of a page either in an inline style tag or an external CSS file.
<body>
<style type="text/css">
/* avoid style tags in the body */
</style>
</body>
If using a front end framework scope all CSS by default. This prevents the styles being global (resulting in unused CSS) and loads them only when the component is present on the page.
<style lang="scss" scoped>
/* CSS */
</style>
CSS sprites are a grouping of images as a single image and CSS is used to show only a single image within the group. CSS sprites should not be used for images within page content but rather UI icons and background images. CSS sprites can greatly reduce network requests as you could group 10 images into a sprite and reduce the number of network requests by 9.
Ensure all application and third party images are loaded via HTTPS.
Use PNGs or SVGs for graphics and JPEGs for photos. Using the wrong format (e.g. PNG for photos) results in a significantly larger file size which can run into megabytes.
Only load the minimum number of images and other assets needed in the above the fold content and lazy load the rest. Add a link rel="preload" tag for each image in the above the fold content to ensure it's loaded as early as possible.
<head>
<link rel="preload" href="/image.svg" as="image">
</head>
Images below the fold should only load when the image scrolls into view. Some browsers support the loading attribute on img tags which will defer the loading of the image until it scrolls into view.
<img src="/image.svg" loading="lazy">
You can also use the JavaScript IntersectionObserver API to listen to scroll and viewport events and control lazy loading.
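A minimal sketch of this pattern, assuming images store their real source in a hypothetical data-src attribute:

```html
<img data-src="/image.jpg" alt="">
<script>
  // Observe each placeholder image and swap in the real source
  // only when it scrolls into the viewport
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        observer.unobserve(entry.target);
      }
    });
  });
  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
</script>
```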
A height and width should always be set on images, even if the image is stretched or shrunk, as this lets browsers reserve the correct space before the image loads and start rendering faster.
<img src="/image.svg" height="20" width="20" alt="">
Newer image formats such as WebP support better compression and other features that older formats do not.
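One common way to adopt WebP safely is the picture element, which lets older browsers fall back to a JPEG - the file paths here are placeholders:

```html
<picture>
  <!-- Served where WebP is supported -->
  <source srcset="/image.webp" type="image/webp">
  <!-- Fallback for older browsers -->
  <img src="/image.jpg" alt="Example image">
</picture>
```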
Use a srcset attribute on images and create images for different screen sizes.
<img srcset="image-480w.jpg 480w,
image-800w.jpg 800w"
sizes="(max-width: 600px) 480px, 800px"
src="image-800w.jpg">
Find the maximum width and height that images scale to and make sure the actual image doesn't exceed these dimensions. This ensures the smallest file size possible. Take into account retina sizing, as images may need to be 2x the size.
Compress images to reduce file size where possible, ideally with no visible loss of quality. Experiment with finding the best balance between size and quality.
Using a CDN (Content delivery network) provides enhanced caching capabilities, faster delivery, and loading of more assets in parallel.
If using a CMS system optimize images before they are uploaded unless the CMS supports resizing images on the fly.
JavaScript technical SEO optimization is one of the most difficult and time consuming, but most beneficial parts of technical SEO optimization. This JavaScript SEO guide covers all areas from the quick and simple to the time consuming and complex.
Use a bundler like webpack or Vite. Bundlers have a lot of benefits such as minifying JavaScript code, removing whitespace, dead code elimination through tree shaking, transpiling code to work across multiple browsers, and converting it to execute in the most efficient way possible.
Frameworks have a bundler built in and have a number of technical SEO optimizations built in such as minification, code-splitting, and optimizing delivery of assets.
Ensure all application and third party JavaScript files are loaded via HTTPS.
While probably not a page ranking factor, JavaScript errors can lead to issues that slow page rendering at best or break the page layout at worst. Most JavaScript warnings can be ignored, but actual errors should be fixed.
One of the major page speed issues can be the execution of a large amount of JavaScript before it's needed. Defer as much JavaScript as possible so it's executed after the page is rendered.
<script src="app.js" defer></script>
Modern JavaScript bundlers and frameworks process JavaScript into an application file, a vendor file which contains third party library code, and page files. If you don't use a bundler, follow the same pattern.
<script src="app.js"></script>
<script src="vendor.js"></script>
<script src="home.js"></script>
One thing that can bloat the size of your JavaScript is loading JavaScript code on pages that don't need it. Modern frameworks and bundlers usually create separate JavaScript files for each page - home.js, contact.js etc. - which contain JavaScript code that only executes on those pages. For maximum performance you should follow this pattern. Only the required JavaScript for a page should load on that page. Remove any redundant / unused code and packages.
Pretty much all advertisements whether Google AdSense or similar execute using JavaScript. Running Ads above the fold when the page loads is a surefire way of slowing down your page and killing user experience. Try to push Ads further down the page and defer their code execution until after the initial page load. Page speed requirements on mobile are much more stringent than on desktop, and loading Ads will almost certainly knock a lot of your page speed score.
If sending requests to a Node server on page load or at other times in the page lifecycle, try to use Promise.all to group requests and execute them in parallel. This avoids waiting for a previous request to complete before the next one starts. This method provides huge benefits both in terms of speed and user experience, especially when dealing with a large number of requests.
const response = await Promise.all([
...
])
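A fuller runnable sketch of the pattern, using simulated requests in place of real network calls:

```javascript
// Simulated request helpers standing in for real fetch calls to your API
const fetchUser = () => new Promise((resolve) => setTimeout(() => resolve({ id: 1 }), 50));
const fetchPosts = () => new Promise((resolve) => setTimeout(() => resolve([{ title: 'SEO' }]), 50));

async function loadPageData() {
  // Both requests start immediately and run in parallel,
  // so the total wait is roughly the slowest request,
  // not the sum of all of them
  const [user, posts] = await Promise.all([fetchUser(), fetchPosts()]);
  return { user, posts };
}
```

On a real page fetchUser and fetchPosts would be fetch calls to your endpoints; the destructured results arrive in the same order as the array passed to Promise.all.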
The faster the response comes from the server - the faster the page can start rendering in the browser. The server optimization SEO part of this SEO guide revolves around optimizing the execution of your application code, database queries, caching, server configuration, response and asset delivery, and interactions with third party systems. It's a huge part of technical SEO.
Modern back end frameworks can take care of templating, routing, Server-Side Rendering (SSR) and a host of other features.
In SSR (server side rendering), JavaScript is executed server side by Node rendering the page and the response is sent to the browser. If using a modern front end framework you should use SSR or SSG instead of SPA (single page applications) mode which renders the page in the browser greatly impacting page speed.
Think of SSG (static site generation) as SSR (server-side rendering) done ahead of time (AOT). You pre-render your pages and save them on the server. These rendered pages are then sent as the response to the browser. SSG is the fastest way to render pages as you essentially cut out the SSR rendering step, but it has limitations and scenarios you must consider. If you have elements on the page that change based on state - such as a user profile icon that shows in a menu bar when logged in - you must consider how to handle this.
Whatever server is used, correct configuration is required to deliver compressed responses and assets, correct content types, and correct headers. Correct server configuration heavily influences response time.
Use HTTP/2 for your website and assets as the HTTP/2 protocol supports parallel loading of assets via multiplexing and other powerful features. Most modern servers today use HTTP/2 by default.
Use HTTPS for the loading of pages, assets, third party libraries, and any other requests. Secure pages have been the standard for a long time now. Pages that are not using HTTPS will be heavily penalized in search and can open up the possibility of security issues.
Using cache busting URLs forces the browser to request the latest version of a file and is required if you use browser caching. Cache busting URLs use a version number or hash in the filename which changes when the file contents change. The browser sees this as a new file and requests it. Modern front-end frameworks and build tools have cache busting built in. Cache busting can be applied to any asset. Remember to optimize asset URLs as well as page URLs.
/js/app.js // changes won't be reflected if the file is cached
/js/app.v2.js // increment version number
/js/app.a57d5ua.js // random hash
If you host assets on a different domain you can reduce the size and increase the speed of HTTP requests as cookies will not be included.
If you are using the www version of your domain you can host assets on a sub domain to achieve the same result.
www.php.fyi // website domain
img.php.fyi // images domain
Any sort of code that's returned from the server should be the minimum size possible. Ensure all whitespace is removed from HTML pages, unused code is removed, and things like comments only show in development environments and are stripped out in production.
Responses that come from the server can be compressed, which greatly reduces their size. GZIP is a widely supported encoding that can easily be applied to an Apache server using a .htaccess rule. Compression can and should be applied to all types of assets including JavaScript, CSS, and images.
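As a sketch, a GZIP rule on Apache might look like the following, assuming mod_deflate is enabled:

```apache
# Enable GZIP compression for text based assets (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript image/svg+xml
</IfModule>
```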
Implement multiple caching layers in your application and server to speed up your response time.
Setting a TTL (time to live) on assets ensures they are cached in the browser and only requested on subsequent page views when they change. This includes HTML, CSS, Images, and JavaScript and other asset types. How you configure caching depends on the tech stack used for your website. If using a CDN, most CDNs have their own mechanism for setting TTL on assets through a UI.
Caching application code can dramatically speed up application boot up time. It takes a while to load the required configuration files and code for any application and there is always a performance hit for this which slows the server response time. Frameworks have been developed to store application code in memory so there is no need to load it initially. Some frameworks like Laravel have packages like Octane which wrap around high performance networking frameworks like Swoole.
If using a database and / or other data sources, its highly beneficial to cache as much of this data as possible and provide a mechanism to invalidate the cache when the data is updated. A database cache can return results much faster than hitting a database directly.
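A minimal cache-aside sketch of this idea, with an in-memory Map standing in for a real cache store and a stubbed database call:

```javascript
// In-memory stand-in for a real cache store such as Redis
const cache = new Map();

// Stand-in for a real database call
const dbFindUser = async (id) => ({ id, name: 'Andrew' });

async function getUser(id) {
  const key = `user:${id}`;
  // Serve from the cache when possible, otherwise hit the
  // database and store the result for subsequent requests
  if (cache.has(key)) return cache.get(key);
  const user = await dbFindUser(id);
  cache.set(key, user);
  return user;
}

// Invalidate the cached entry when the underlying data changes
function invalidateUser(id) {
  cache.delete(`user:${id}`);
}
```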
Check all page headers in the browser network tab and look for any incorrect headers and / or values.
Adding a Vary: Accept-Encoding header helps caches store and serve the correct version of a resource to each client, and stops compressed versions of a web page being served to browsers that do not support compression (and vice versa).
Vary: Accept-Encoding
The Cache-Control max-age directive specifies the number of seconds an asset should be cached in the browser. For static assets such as images, 30 days is a solid value to set. Not caching assets can have a large performance impact as uncached assets can greatly inflate the page size on subsequent requests.
Cache-Control: max-age=2592000
The Expires header, like the max-age directive, specifies how long before an asset is considered stale and should be downloaded again. Rules can be set in .htaccess to set expiry across different asset types.
Expires: Wed, 21 Oct 2015 07:28:00 GMT
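For example, expiry rules on Apache could be set per asset type like this, assuming mod_expires is enabled:

```apache
# Cache static assets for 30 days (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 30 days"
  ExpiresByType text/css "access plus 30 days"
  ExpiresByType application/javascript "access plus 30 days"
</IfModule>
```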
A Keep-Alive header specifies how long to keep the server connection open and for how many requests. This header is ignored in HTTP/2 and can actually cause issues in some browsers, but it's worth investigating whether you can use it.
Keep-Alive: timeout=5, max=1000
Adding a content encoding header tells the client if any compression has been applied to the requested resources and allows the client to decode them. A common form of encoding is GZIP as an example.
Content-Encoding: deflate, gzip
Setting a referrer policy header allows you to control how much referrer information is shared across requests. While not strictly a ranking factor it can be a handy tool and enhance security in certain situations.
Referrer-Policy: no-referrer
While strictly not a ranking factor, a content security policy header can be used to enhance the security of your website by only allowing whitelisted scripts to execute and certain domains to load assets such as CSS, JavaScript and images.
Content-Security-Policy: default-src 'self'; img-src *; media-src media.php.fyi; script-src 'nonce-2726c7f26c'
When a nonce token is set in the server's Content-Security-Policy header, scripts that wish to execute on the client side must carry the same nonce token as an attribute or they will fail to execute.
<script nonce="2726c7f26c">
// ... //
</script>
Most back end applications will load a variety of start up files and configuration. These files could include middleware, services, themes, or many other types of files. Anything that can be deferred during boot up should be deferred. It's also worth checking that no third party requests, such as authentication, happen during boot up that could be deferred. You can strip out all application business logic, leaving only the core boot up code, and test the timing. Modern apps should have extremely fast boot up times (around 50ms).
Code bottlenecks can generally be a byproduct of poor coding standards brought on by a lot of different people working across a code base with varying standards and skill levels. Test large complicated blocks of code in isolation both across the front and back end. Also look for any pieces of code which may delay execution such as setTimeout and execute these in a non blocking way.
If there are third party API requests that delay code execution during the initial page load, such as requests to fetch user data, make sure these requests complete as quickly as possible. Requests that happen after page load, such as a payment request, should also be optimized as this can enhance user experience. Use monitoring tools like Sentry to monitor API requests.
Database optimization is one area where heavy focus and regular monitoring is required. It can be easy for large numbers of expensive database queries to pile up and impact server response times as new features are added to apps.
Database queries that scan a whole table, SELECT * queries that return all table columns, or queries that do a lot of JOINs can be inefficient and slow. To optimize database queries, only the required data should be returned from the database. This means specifying the exact columns you need within queries. You can use tools like GraphQL to accomplish this or write a custom database abstraction layer. Adding indexes to columns is another way to speed up database queries. Follow the rule: only return the data you need, and return it in the most efficient way possible.
The N+1 issue is something that can result in hundreds of extra database queries and extremely slow performance on large applications. When selecting a list of table records and a list of related table records, instead of running 2 queries (one for the primary list, and one for the relation), a separate query is run for each entry in the primary list to get the related record.
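The difference can be sketched with in-memory arrays standing in for database tables:

```javascript
// In-memory stand-ins for database tables
const posts = [
  { id: 1, authorId: 10 },
  { id: 2, authorId: 20 },
];
const authors = [
  { id: 10, name: 'Andrew' },
  { id: 20, name: 'Jane' },
];

// N+1 pattern: one author lookup per post
// (N extra queries on a real database)
const slow = posts.map((post) => ({
  ...post,
  author: authors.find((a) => a.id === post.authorId),
}));

// Batched pattern: fetch all related authors in one go, then join
// in memory (equivalent to WHERE id IN (10, 20) on a real database)
const authorsById = new Map(authors.map((a) => [a.id, a]));
const fast = posts.map((post) => ({
  ...post,
  author: authorsById.get(post.authorId),
}));
```

Most ORMs call the batched pattern eager loading; the point is that the number of queries stays constant as the primary list grows.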
Optimize the crawling of your pages by returning correct response codes and utilizing available crawler resources efficiently.
A 301 redirect should be implemented from the www version of the domain to the non-www version (or vice versa). This stops two versions of your site being indexed.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\.php\.fyi [NC]
RewriteRule (.*) https://php.fyi/$1 [L,R=301]
A 301 redirect should be implemented from the trailing slash version of the page to the non trailing slash version (or vice versa). This stops duplicate versions of pages from being indexed.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^ %1 [L,R=301]
You can do a search on Google to check if testing domains are indexed. Use the site: operator followed by a domain to see what is indexed for that domain.
site:staging.php.fyi
Pages must return the correct status codes. A page that should be indexed in search should return a 200 response code. Pages that don't exist should return a 404. If you return a 404 page with a 200 response code, a soft 404 can occur.
Optimize page linking and site structure to utilize crawler resources in the most efficient way possible.
If you have pages that display content based on a dynamic URL slug, make sure to validate the slug is correct. If the slug is not correct return a 404 response otherwise a soft 404 error can occur.
/articles/seo // valid slug
/articles/seoooooo // invalid slug - return 404
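A minimal sketch of this check - the slug list and function name are hypothetical, and in a real app the lookup would hit a database:

```javascript
// Known article slugs; in a real app this would be a database lookup
const articleSlugs = new Set(['seo', 'technical-seo']);

// Return the correct status code for a requested slug so invalid
// URLs produce a real 404 rather than a soft 404
function articleStatus(slug) {
  return articleSlugs.has(slug) ? 200 : 404;
}
```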
Any time URL parameters are used for pagination, filtering etc., the parameter values must be validated or they can cause huge duplicate content issues.
Perform checks on any pages using parameters and return a 404, or update the robots meta tag to noindex,nofollow, when a condition such as an unrecognized parameter or an invalid parameter value is met:
Alongside a robots meta tag, you can configure a robots.txt file to block crawling on specific areas of your site. You can block folders and files from being crawled. Be aware that if pages are already indexed on Google, blocking them in the robots file can prevent them from being de-indexed for a period, as crawlers can no longer see the pages.
User-Agent: *
Allow: /
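For example, to block crawling of a folder and a file (the paths here are hypothetical):

```txt
User-Agent: *
Disallow: /admin/
Disallow: /internal-report.html
```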
If you have a very large page with a lot of links you can choose not to include certain menus in the page HTML until the menu is visible. This can reduce the initial page HTML size and reduce the number of links.
<div v-if>...</div> // does not exist in HTML
<div v-show>...</div> // exists in HTML
Tools like Google Search Console or crawler software can be used to find dead links. It can be easy over time for links to break, and if broken links appear on high traffic and high ranking pages it can lead to a large loss of traffic. Audit links regularly and ensure there are no broken links across your site.
Orphaned pages are pages that are not linked to within your site. Linking can give a good signal to search engines about the importance of pages. Pages that are heavily linked, especially from high performing pages, will be crawled regularly and should perform well in search. The only pages that can / should be orphaned are pages you don't want indexed, such as time limited pages set up for specific marketing campaigns.
To keep page rank within your own site and concentrate crawler resources on specific links you can use the nofollow attribute on links you don't want crawled. nofollow on links can be useful when you are linking to a lot of external sites that you don't want to pass page rank to.
<a href="https://php.fyi/" rel="nofollow">PHP.FYI</a>
Choose what you link to carefully both internally and externally. As sites grow under-linking can make content harder to discover while over-linking can distort the importance of pages and waste crawler resources.
Content should always be pushed as close to the top of the page as possible. If you have large navigation menus consisting of a lot of links, you can push these below the page content. Using modern CSS like flex you can flip layouts, displaying navigation menus placed at the bottom of the HTML at the top of the page visually. This trick can be used across different pieces of content to control layout and crawling.
<div class="content">...</div> // displays last
<div class="menu">...</div> // displays first
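A sketch of the flex technique - the menu sits after the content in the HTML, so crawlers see the content first, but the menu displays first visually:

```html
<style>
  /* Content comes first in the HTML for crawlers,
     but the menu is pulled to the top visually */
  .page { display: flex; flex-direction: column; }
  .menu { order: -1; }
</style>
<div class="page">
  <div class="content">...</div>
  <div class="menu">...</div>
</div>
```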
Always use buttons to perform UI actions and use a link tag for any sort of page navigation.
<button>Open</button>
<a href="/">Home</a>
It might be tempting to try to hide assets from search engines to improve your page speed score but manipulating content between users and search engines can land you with severe penalties.
If using an infinite scroll or similar for pagination, make the paginated pages accessible directly from a URL e.g /articles?page=2
A canonical tag should always be used as it prevents a lot of issues around duplicate indexing of a page. In cases where the same page is served from 2 different URLs, a canonical tag can indicate the original / correct version.
An example of the same page being served from different URLs is https://www.php.fyi/ versus https://php.fyi/.
In this case the canonical can point to the preferred non www https version of the page.
<link rel="canonical" href="https://php.fyi/">
Always use a robots meta tag. Default all pages to noindex,nofollow and whitelist pages you want indexed. This will stop the majority of issues around duplicate content.
<meta name="robots" content="index,follow">
For any paginated set of pages always use pagination tags to signal the next and previous page in the set. Include the paginated list in your sitemap also to get these pages crawled effectively.
<link rel="prev" href="https://php.fyi/articles" />
<link rel="next" href="https://php.fyi/articles?page=2" />
You only ever want unique pages indexed in search engines. Make sure to validate URL parameters and update the meta robots tag when dealing with invalid values that create duplicate pages.
<meta name="robots" content="index,follow"> // ?page=2
<meta name="robots" content="noindex,nofollow"> // ?page=invalid
You can get your entire website re-indexed every 3 to 6 months by manually submitting your sitemap. If you time it correctly you can fix every indexing issue on your site and have search engines start to re-index all your content right away.
Include every unique page in your sitemap including paginated sets.
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
<url>
<loc>https://php.fyi/</loc>
</url>
<url>
<loc>https://php.fyi/articles?page=2</loc>
</url>
</urlset>
Any indication you can give search engines about the structure of your site is a good thing. Ideally set the priority on sitemap pages to reflect the nested structure of your site and / or assign more priority to more important pages.
<url>
<loc>https://php.fyi/</loc>
<priority>1.00</priority>
</url>
Include images in your sitemap to get visibility and extra traffic from image search.
<url>
<loc>https://php.fyi/</loc>
<image:image>
<image:loc>https://php.fyi/image.jpg</image:loc>
</image:image>
</url>
You can specify different language versions of a page in the sitemap. Adding language entries is only required if you have multiple language versions of a page.
<url>
<loc>https://php.fyi</loc>
<xhtml:link rel="alternate" hreflang="es" href="https://php.fyi/es" />
</url>
You can also add alternate meta tags to specify the different versions but using the sitemap is probably a better option as you provide a complete mapping upfront to search engines.
<link rel="alternate" hreflang="en" href="https://php.fyi" />
<link rel="alternate" hreflang="es" href="https://php.fyi/es" />
To provide data to search engines about the makeup of your content, use Schema.org structured data. There are a number of different schema formats you can use on your pages, such as Microdata, RDFa, and JSON-LD, and schema types to cover almost any data type.
There are a number of page specific schema types for contact pages etc. At a minimum try to include the generic WebPage schema and tag your main h1 tag.
<body itemscope itemtype="https://schema.org/WebPage">
<h1 itemprop="name">PHP.FYI</h1>
</body>
Article page schema should be used on any blog article type pages to enhance display in search results.
<script type="application/ld+json">
{
"@context":"https://schema.org",
"@type":"Article",
"headline":"The ultimate SEO (Search Engine Optimization) Checklist",
"image":["https://php.fyi/img/articles/seo-checklist/summary.jpg"],
"datePublished":"2022-05-16T12:00:00",
"dateModified":"2022-06-07T17:23:55",
"author":"Andrew Mc Cormack"
}
</script>
Include a breadcrumb element with schema attributes on all pages except the home page to improve user experience and display your Google search page result with a breadcrumb.
<div itemscope="" itemtype="https://schema.org/BreadcrumbList">
<span itemprop="itemListElement" itemscope="" itemtype="https://schema.org/ListItem">
<a href="/" itemprop="item">
<span itemprop="name">PHP.FYI</span>
</a>
<meta itemprop="position" content="1">
</span>
<span itemprop="itemListElement" itemscope="" itemtype="https://schema.org/ListItem">
<a href="/articles" itemprop="item">
<span itemprop="name">Articles</span>
</a>
<meta itemprop="position" content="2">
</span>
</div>
Open Graph meta tags allow sites such as Facebook to display your content in a rich way, such as a card with a preview image.
<meta property="og:title" content="An In-Depth Guide to Technical SEO">
<meta property="og:description" content="Learn the key principles behind technical SEO such as page speed, indexing, and content to completely optimize your website SEO.">
<meta property="og:image" content="https://php.fyi/img/articles/technical-seo/summary.jpg">
<meta property="og:site_name" content="php.fyi">
<meta property="og:url" content="https://php.fyi/articles/technical-seo">
Twitter card meta tags allow your website content to be shared easily and displayed in a rich format on Twitter.
<meta name="twitter:title" content="An In-Depth Guide to Technical SEO">
<meta name="twitter:description" content="Learn the key principles behind technical SEO such as page speed, indexing, and content to completely optimize your website SEO.">
<meta name="twitter:image" content="https://php.fyi/img/articles/technical-seo/summary.jpg">
<meta name="twitter:card" content="summary">
<meta name="twitter:site" content="@phpfyi">
<meta name="twitter:creator" content="@phpfyi">
To optimize relevance you must create unique, informative content that strikes a balance between effective messaging and keyword optimization. This part of the SEO guide should help you strike that balance.
A highly optimized meta title and description can lead to a very high click-through rate in the SERP. Create keyword-rich meta and pay attention to length so it's not truncated in search results.
A meta title should sum up your page in a few words and needs to be short, concise, and include the specific core keyword(s) associated with your page. The pipe (|) separator can be used to break up content. You don't need to put your website name in the title tag - this will be auto-appended on Google.
<title>Software Engineer | London - PHP.FYI</title>
A meta description should sum up your page in a couple of sentences and although it provides more area for content than a meta title it should also be short, concise, and communicate exactly what the page is about. The extra area here gives opportunity to include long tail variations of keywords to match against.
<meta name="description" content="I am a web software engineer based in London with 10 years experience specializing in Vue & Laravel full stack website development.">
If you follow a strategy to de-index any duplicate content and index only unique pages then each page meta title should be unique. Paginated pages can append the page number to make the meta unique.
<title>... | Page 2</title>
An h1 tag is always required on a page and should follow a very similar format to your meta title in that it should contain the core page keywords.
<h1 itemprop="name">Software Engineering</h1>
A well structured web page should consist of sections with structured content and headings. h2 and h3 sub headings should be present on pages in most scenarios and in sequential order if possible.
Think about keywords and page structure during the wireframe and design phase.
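For example, a sequential heading outline for a page might look like the following (the heading text here is purely illustrative):

```html
<h1>Technical SEO</h1>
<h2>Page Speed</h2>
<h3>Image Optimization</h3>
<h3>Code Minification</h3>
<h2>Indexing</h2>
```

Each heading level steps down by at most one, so assistive technologies and search engines can infer the page outline.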
Optimize page URLs and slugs, and do the same for images. Both page and image URLs should follow a tree structure and incorporate relevant keywords. Note that shorter URLs tend to rank better in search, so take the time to refine and shorten URLs where possible as part of your technical SEO strategy.
/seo/indexing
/img/seo/indexing
The text within links should be keyword rich and reflect the content of the pages they link to.
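For example, descriptive anchor text beats generic phrases like "click here" (the URL and link text below are illustrative):

```html
<!-- Weak: generic anchor text tells search engines nothing -->
<a href="/seo/indexing">Click here</a>

<!-- Better: keyword rich and descriptive -->
<a href="/seo/indexing">SEO indexing guide</a>
```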
Indexing images effectively can provide a traffic source to your site from Image search as well as communicate the content of the page to search engines. Write image alt text as you would a meta description - short, but also clear and concise.
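For example (the file path and alt text here are illustrative):

```html
<img src="/img/seo/indexing/sitemap.jpg" alt="XML sitemap structure diagram for SEO indexing">
```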
The first 100 or so words in a page have more weight than the content following. For this reason when creating this content think about creating a story - set the who, what, when, where and how, and consider how you will integrate your core keywords.
Use Google suggest and other online tools, look at competitor websites, or use a thesaurus to look for long-tail variations of keywords. Include as many variations of single and long tail keywords as possible to partial match, exact match, and phrase match against.
Creating keyword rich blocks of content such as articles requires more planning and considerations around keyword density than headings and meta. Try to get as many variations of keywords in these blocks as possible to increase relevance while taking care not to spam your page and make content unreadable. Use a tool such as the find feature in browsers to search and highlight keywords, see the density, and adjust as necessary.
Avoid duplicate content and copying content from other sources. If you need to use content from other sources take care to reword it in a unique way that fits with the language on your site.
Websites continuously evolve. Every iteration of a site or page should be an iterative improvement over the previous version learning what has and has not worked. Re-design pages and content every few months or years and keep a steady stream of new content weekly at minimum.
The last part of this technical SEO tutorial touches on some areas outside of your control but can have a major impact on your technical SEO optimization strategy.
There are some ranking factors outside of your control that you should pay careful attention to.
Search engines like Google periodically release updates to their ranking algorithm which may impact websites differently.
Generally the longer a website exists the more authority it gains. Authority is gained through a mix of factors including age, ranking, back links, and engagement.
Building a solid network of backlinks can increase traffic to your site while building authority and increasing page ranking in search. Great content can propagate across the web fairly quickly, and backlinks can be gained quickly through organic means. Audit your backlinks monthly and disavow links from spam sites.
Periodically check your overall site security and set up automated alerts. Some security issues will appear in Google Search Console, or you can use another open source or paid solution to monitor your security. Ideally, penetration (pen) testing should be performed periodically.
Creating compelling meta and ranking highly in organic search will increase click-through rate in the SERP and signal to search engines that your content is optimized for search. A good click-through rate in paid search results can also reduce cost.
Search for plagiarized copies of your content that have been indexed on search engines at regular intervals. You can pull a long string of text from a page and wrap it in double quotes to perform a phrase match. If that exact piece of content is indexed, it should show in the results. As a security consideration, also check for cloned copies of your website when going through the results.
"Search for a chunk of text with quotes around it"
User engagement is key on any site. Pages that deliver a great user experience will rank higher in search, as they create higher engagement at the page and site level.
Focus on creating an accessible experience across devices that caters for assistive technologies.
Any button that sends a request to the back end, such as a contact form submit button, should have a mechanism in place to prevent double clicking. This usually involves adding a disabled state to the button after it's clicked the first time and removing the disabled state when the request completes, which prevents duplicate requests and duplicate form submissions. This matters most on payment forms, where allowing double clicks can result in a user being charged twice. For a great user experience, pair the disabled state with a loading indicator.
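One way to sketch this guard in plain JavaScript (createSubmitGuard is an illustrative helper, not a library API):

```javascript
// Wrap an async submit handler so repeat clicks are ignored while a
// request is in flight; the guard resets once the request completes.
function createSubmitGuard(handler) {
    let pending = false;
    return async function guarded(...args) {
        if (pending) return; // ignore duplicate clicks
        pending = true;
        try {
            return await handler(...args);
        } finally {
            pending = false; // re-enable once the request settles
        }
    };
}
```

In the browser you would pair this with toggling the button's disabled attribute inside the handler so the user also gets visual feedback.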
Avoid colour combinations that strain users' eyes and follow the W3C's WCAG guidelines, which recommend a contrast ratio of at least 4.5:1 between text and background colours.
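The 4.5:1 figure comes from WCAG's contrast-ratio formula, which can be sketched as follows (colours given as [r, g, b] arrays of 0-255 sRGB channels):

```javascript
// WCAG 2.x relative luminance for an sRGB colour (channels 0-255).
function relativeLuminance([r, g, b]) {
    const linear = (c) => {
        const s = c / 255;
        return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
    };
    return 0.2126 * linear(r) + 0.7152 * linear(g) + 0.0722 * linear(b);
}

// Contrast ratio between two colours: (lighter + 0.05) / (darker + 0.05).
function contrastRatio(a, b) {
    const l1 = relativeLuminance(a);
    const l2 = relativeLuminance(b);
    return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}
```

Black text on a white background gives the maximum ratio of 21:1; aim for at least 4.5:1 for body text.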
Using fonts that are sized correctly and easy to read is key especially on sites that provide large blocks of content like articles. Use responsive font sizes for optimum sizing across devices.
@media all and (min-width: 728px) {
body { font-size: 16px; }
}
@media all and (min-width: 1200px) {
body { font-size: 18px; }
}
When creating web pages, follow a solid design process using wireframes and take user experience into account in every decision. The size and position of touch / clickable elements should be considered carefully so they are not difficult to interact with. If multiple touch elements are placed too close together there is a chance a user will tap the wrong element, or both at once. A mobile-first design approach alleviates some of these problems.
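As a rough sketch, tap targets can be given a comfortable minimum size and spacing in CSS (the selector name and exact values below are illustrative; common accessibility guidance suggests targets of roughly 48x48 CSS pixels):

```css
/* Illustrative minimum tap-target sizing */
.touch-target {
    min-width: 48px;
    min-height: 48px;
    /* Spacing reduces accidental taps on adjacent elements */
    margin: 8px;
}
```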