In this post we are going to optimize a website hosted in IIS8 by implementing basic SEO rules, improving the user experience and the site's search ranking at the same time.
We will optimize our IIS application following Google’s Search Engine Optimization Starter Guide.
Let’s step into it. First of all, a basic overview of IIS8 and SEO.
IIS8: Microsoft’s Internet Information Services (IIS) is an extensible web server that can host multiple web applications/sites.
IIS supports HTTP, HTTPS, FTP, FTPS, SMTP and NNTP.
Other web servers include:
- Apache,
- NGINX (pronounced “engine X”),
- Novell’s NetWare server,
- Google Web Server (GWS) and
- IBM’s family of Domino servers.
SEO: SEO is the process of improving the visibility of a website in search engine results. Learn more from Google.
Table of Contents:
- Improving Site Structure
  - Sitemap
- Dealing with Crawlers
  - robots.txt
  - BOT accessibility (Libwww-perl Access)
- Server Security
  - Server Signature (Server: Microsoft-IIS/8.0)
- Canonicalization
  - URL Canonicalization
  - IP Canonicalization
- Optimizing Content
  - Page Cache (Server Side Caching)
  - Expires Tag
  - HTML Compression/GZIP
Let’s get into the details of those topics with a basic sample application hosted in IIS. First of all we need to configure IIS to host our website. We will edit the Windows hosts file to map a hostname to the local IP and port.
To modify the hosts file, browse to: c:\Windows\System32\Drivers\etc\hosts
Open the file with Notepad run as administrator, make the changes, and save, replacing the original hosts file.
Now open a browser, type localseo.com in the URL bar and hit enter; our local site will load.
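The hosts entry for the steps above might look like this (the hostname localseo.com comes from the post; the 127.0.0.1 address assumes the site is bound to the local machine):

```
# c:\Windows\System32\Drivers\etc\hosts
# Maps the test hostname to the local machine.
127.0.0.1    localseo.com
```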
1) Improving Site Structure
1.1) Sitemap
Applications need to be well structured around the root/home page; breadcrumbs help users navigate back to the root/home page. A sitemap improves crawling of the site, and we can also use a sitemap XML file to help web crawlers (Google or other search engines) list the pages of the site.
Below is a sample sitemap XML format; upload the XML file (sitemap.xml) to the root folder.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://localseo.com</loc></url>
  <url><loc>http://localseo.com/Home</loc></url>
  <url><loc>http://localseo.com/About</loc></url>
  <url><loc>http://localseo.com/Contact</loc></url>
</urlset>
In IIS, I have placed the sitemap.xml file in the root folder.
2) Dealing with Crawlers
2.1) Web Robots
Web robots, also known as spiders or crawlers, are programs that visit and index site content automatically. A robots.txt file gives instructions about the site to web robots.
User-agent: *
Disallow: /
The “User-agent: *” means this section applies to all robots. The “Disallow: /” tells the robot that it should not visit any pages on the site. Learn more about robots.txt.
2.2) Libwww-perl Access BOT
We can restrict BOT access using URL Rewrite rules in web.config. We need to install IIS URL Rewrite 2.0: click Web Platform Installer in the IIS Management section.
A list of components will appear; search for “Rewrite” and hit enter.
The search results will show URL Rewrite 2.0; install it now.
After installation, the URL Rewrite icon will appear; double-click on it.
Let’s create a new rule to restrict the Libwww-perl BOT from accessing the website,
then configure the rule to block the BOT.
Or simply paste the code below into web.config, replacing the pattern with your own.
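A sketch of such a rule, assuming the inbound rule blocks any request whose User-Agent contains “libwww-perl” (the rule name and 403 response are illustrative choices):

```xml
<!-- web.config: deny requests from the libwww-perl user agent. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="BlockLibwwwPerl" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <add input="{HTTP_USER_AGENT}" pattern="libwww-perl" />
        </conditions>
        <action type="CustomResponse" statusCode="403"
                statusReason="Forbidden"
                statusDescription="Bot access denied" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Replace the pattern with any other bot’s user-agent string to block it the same way.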
3) Server Security
3.1) Server Signature
From a security standpoint it is important to turn the server signature off; by default it is on. Here we will rewrite the server signature with an outbound rule in IIS.
Again double-click the URL Rewrite icon and add the server variable as in the image below.
Or simply paste the code below into web.config, replacing the value with your own.
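A sketch of the outbound rule, assuming the RESPONSE_SERVER server variable is used to overwrite the Server header (the replacement value “MyServer” is an illustrative placeholder):

```xml
<!-- web.config: rewrite the Server response header so the
     IIS version (Server: Microsoft-IIS/8.0) is not advertised. -->
<system.webServer>
  <rewrite>
    <outboundRules>
      <rule name="RewriteServerSignature">
        <match serverVariable="RESPONSE_SERVER" pattern=".+" />
        <action type="Rewrite" value="MyServer" />
      </rule>
    </outboundRules>
  </rewrite>
</system.webServer>
```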
4) Canonicalization
4.1) URL Canonicalization
Make sure the bindings in IIS are correct for the host name. Now create a new inbound rule for canonicalization, covering both the IP address and the URL with www.
Or simply paste the code below into web.config, replacing the pattern and redirect with your own.
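A sketch of a canonicalization rule, assuming requests to the bare domain or the server IP should be 301-redirected to the canonical www host (the hostname is taken from the post’s localseo.com example):

```xml
<!-- web.config: redirect any non-canonical host (bare domain or IP)
     to http://www.localseo.com with a permanent (301) redirect. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="CanonicalHostName" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^www\.localseo\.com$" negate="true" />
        </conditions>
        <action type="Redirect" url="http://www.localseo.com/{R:1}"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```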
5) Optimizing Content
For better performance it is good to cache the static content. IIS controls this through the cacheControlMode attribute of the clientCache element, which takes four values. The default is NoControl.
- NoControl: does not add a Cache-Control or Expires header to the response.
- DisableCache: adds a Cache-Control: no-cache header to the response.
- UseMaxAge: adds a Cache-Control: max-age=<seconds> header to the response, based on the value specified in the cacheControlMaxAge attribute.
- UseExpires: adds an Expires: <date> header to the response, based on the date specified in the httpExpires attribute.
5.1) Page Cache (Server Side Caching)
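A minimal sketch of server-side output caching in web.config, assuming .html pages should be cached until their content changes (the extension and policies are illustrative choices):

```xml
<!-- web.config: server-side (and kernel-mode) output caching
     for static .html pages. -->
<system.webServer>
  <caching>
    <profiles>
      <add extension=".html" policy="CacheUntilChange"
           kernelCachePolicy="CacheUntilChange" />
    </profiles>
  </caching>
</system.webServer>
```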
5.2) Expires Tag
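A sketch of the Expires configuration using the clientCache element described above, assuming a max-age of seven days is acceptable for the static content (the duration is an illustrative value):

```xml
<!-- web.config: send Cache-Control: max-age for static content;
     7.00:00:00 means seven days. -->
<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge"
                 cacheControlMaxAge="7.00:00:00" />
  </staticContent>
</system.webServer>
```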
5.3) HTML Compression/GZIP
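A sketch of enabling gzip compression in web.config; dynamic compression additionally assumes the Dynamic Content Compression feature is installed on the server:

```xml
<!-- web.config: enable gzip compression for static and
     dynamic responses. -->
<system.webServer>
  <urlCompression doStaticCompression="true"
                  doDynamicCompression="true" />
</system.webServer>
```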
Finally, the complete web.config file.
Hope this helps 🙂