HTTP vs HTTPS: Why It Matters
Here is everything you need to know about the difference between HTTP and HTTPS, and why it matters.
Google announced, “Beginning in October 2017, Chrome (version 62) will show the ‘NOT SECURE’ warning in two additional situations: when users enter data on an HTTP page, and on all HTTP pages visited in Incognito mode.”
Luckily, the solution is simple: switch to HTTPS.
Difference Between HTTP And HTTPS
HTTP (Hypertext Transfer Protocol) and HTTPS (Hypertext Transfer Protocol Secure) are both application protocols for distributed, collaborative, hypermedia information systems, and they are the foundation of data communication on the World Wide Web. HTTP and HTTPS are virtually the same except for the added security HTTPS provides by using the Transport Layer Security (TLS) protocol. TLS creates the main difference between the two: HTTP uses an insecure connection, while HTTPS uses an encrypted connection and requires a certificate.
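The certificate requirement is visible even at the client side. As a minimal sketch, Python's standard-library `ssl` module builds a default TLS context that already enforces the two things plain HTTP lacks: a valid certificate and a matching hostname.

```python
import ssl

# HTTPS layers TLS on top of HTTP. A default client-side TLS context
# in Python's standard library insists on a valid certificate and on
# the hostname matching the name in that certificate.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate is mandatory
print(ctx.check_hostname)                    # name on cert must match the host
```

Both checks are on by default; an HTTP connection performs neither.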
TLS adds three levels of security:
- Encryption: converts information into code to prevent unauthorized access. In other words, it keeps hackers from seeing private content and conversations on the site.
- Data Integrity: prevents undetected corruption and/or modification of data during transfer.
- Authentication: proves the site is being used for its intended purpose.
Best Practices For Switching To HTTPS
Get Security Certificates
Certificates are issued by a certificate authority (CA) and verify that the site actually belongs to your organization.
Tip: Use a reliable CA that will also provide technical help.
- Network Solutions
Note: There are more providers, but these are among the most widely used and trusted.
Types of Certificates:
- Single domain certificate
- Multi-domain certificate
- Wildcard certificate for a secure origin with many dynamic subdomains
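The scope of a wildcard certificate can be sketched with a small label-matching function. This is a simplified illustration (real certificate validation follows RFC 6125); the domain names are placeholders.

```python
def wildcard_covers(pattern: str, hostname: str) -> bool:
    """Check whether a wildcard certificate name covers a hostname.

    A wildcard such as *.example.com matches exactly one label, so it
    covers shop.example.com but not example.com itself and not the
    deeper a.shop.example.com. (Simplified; real validation is stricter.)
    """
    p_labels = pattern.lower().split(".")
    h_labels = hostname.lower().split(".")
    if len(p_labels) != len(h_labels):
        return False
    for p, h in zip(p_labels, h_labels):
        if p != "*" and p != h:
            return False
    return True

print(wildcard_covers("*.example.com", "shop.example.com"))    # True
print(wildcard_covers("*.example.com", "example.com"))         # False
print(wildcard_covers("*.example.com", "a.shop.example.com"))  # False
```

This is why a wildcard certificate suits one secure origin with many dynamic subdomains, but not arbitrarily nested ones.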
Redirect users and search engines to the HTTPS page
Using server-side 301 HTTP redirects will make the transition from HTTP to HTTPS as seamless as possible.
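A server-side 301 can be sketched with Python's standard-library `http.server`. The handler below answers every plain-HTTP request with a permanent redirect to the same path on the HTTPS site; the hostname is a placeholder for illustration.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToHTTPS(BaseHTTPRequestHandler):
    """Answer every plain-HTTP GET with a 301 to the HTTPS version
    of the same path (www.example.com is an illustrative hostname)."""

    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "https://www.example.com" + self.path)
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind an ephemeral local port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), RedirectToHTTPS)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does not follow redirects, so we can inspect the 301.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/page")
resp = conn.getresponse()
print(resp.status)                 # 301
print(resp.getheader("Location"))  # https://www.example.com/page
server.shutdown()
```

Browsers and Googlebot both treat the 301 as a permanent move, which is what carries your existing signals over to the HTTPS URLs.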
Ensure Google can still index and crawl your pages
Although Google will index HTTPS pages automatically, there are some measures to take to make sure that your pages can be crawled and indexed by Google:
- Don't block pages with robots.txt files
- Don't include noindex meta tags
- Test Googlebot access with Fetch as Google
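The robots.txt point in the list above can be checked locally with the standard-library `urllib.robotparser`; the rules and URL below are illustrative.

```python
from urllib import robotparser

# A robots.txt that blocks crawling of the whole site -- the mistake
# to avoid when migrating to HTTPS.
blocking_rules = """
User-agent: *
Disallow: /
"""
rp = robotparser.RobotFileParser()
rp.parse(blocking_rules.splitlines())
print(rp.can_fetch("Googlebot", "https://www.example.com/page"))  # False

# An open robots.txt lets Googlebot crawl the HTTPS pages.
open_rules = """
User-agent: *
Disallow:
"""
rp2 = robotparser.RobotFileParser()
rp2.parse(open_rules.splitlines())
print(rp2.can_fetch("Googlebot", "https://www.example.com/page"))  # True
```

Running a check like this against your live robots.txt before and after the switch catches accidental blocking early.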
Support HSTS (HTTP Strict Transport Security)
This tells the browser to request HTTPS pages automatically and tells Google to show secure web addresses in the search results.
Suggested steps for enabling HSTS on a web server that supports it, keeping the rollback strategy simple:
- First, serve HTTPS pages without HSTS.
- When you begin sending HSTS headers, use a short max-age (the Chrome preload list ultimately requires at least 18 weeks). Then monitor traffic from users and other clients, as well as the performance of dependents (e.g., ads).
- Slowly increase the max-age.
- Continue monitoring traffic to make sure it doesn't negatively affect users and search engines. If it doesn't, request to be added to the Chrome HSTS preload list (optional).
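The ramp-up above comes down to one response header. As a minimal sketch using Python's standard-library `http.server` (in production the header must be sent over HTTPS; plain HTTP is used here only to keep the example self-contained, and the max-age value is illustrative):

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Start with a short max-age while monitoring traffic, then raise it
# in later deployments. 18 weeks = 10886400 seconds.
HSTS_MAX_AGE = 10886400

class HSTSHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Strict-Transport-Security",
                         f"max-age={HSTS_MAX_AGE}")
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), HSTSHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
print(resp.getheader("Strict-Transport-Security"))  # max-age=10886400
server.shutdown()
```

Rolling back is then just a matter of lowering max-age (or setting it to 0), which is why the gradual ramp-up keeps the strategy reversible.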
Use HSTS preloading (Optional)
Extra security is added by supporting HSTS preloading. Enable it by specifying both the includeSubDomains directive and the preload directive in the HSTS header.
Subdomain matching works on whole labels: if you set this directive for www.mysitename.com, then a.www.mysitename.com is included but a.mysitename.com is not. Matching only applies when the host ends with the entire policy domain; there is no partial matching.
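That matching rule can be sketched as a small function (a simplified illustration of the logic, not a browser's implementation):

```python
def hsts_covers(policy_host: str, request_host: str) -> bool:
    """Return True if an HSTS policy with includeSubDomains set on
    policy_host applies to request_host.

    Matching works on whole labels from the right: the request host
    must equal the policy host, or have it as a suffix at a dot
    boundary. There is no partial matching.
    """
    policy_host = policy_host.lower()
    request_host = request_host.lower()
    return (request_host == policy_host
            or request_host.endswith("." + policy_host))

print(hsts_covers("www.mysitename.com", "a.www.mysitename.com"))  # True
print(hsts_covers("www.mysitename.com", "a.mysitename.com"))      # False
```

To cover a.mysitename.com as well, the policy would have to be set on mysitename.com itself.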
Avoid These Common Mistakes
- Expired certificates
- Certificates registered to the wrong website name
- Missing Server Name Indication (SNI) support
- Crawling issues from blocking pages with robots.txt files
- Indexing issues from using the noindex meta tag
- Using old protocol versions, leaving your site vulnerable. Update regularly!
- Mixed security elements (serving some elements over HTTP on an HTTPS page)
- Different content on the HTTP and HTTPS versions of the site
- HTTP status code errors on the HTTPS site
Google Prioritizing HTTPS
Earlier this year, Google asked for help identifying and indexing HTTPS URLs. Fortunately, Google has announced that Googlebot will automatically index HTTPS URLs even when there is an HTTP version of the same page. This makes the migration from HTTP to HTTPS much easier. Google is also giving a slight ranking boost to HTTPS URLs to encourage the shift to HTTPS. To ensure Google can index your pages, keep the following in mind (some of these have been covered above, but they are listed here in case you missed them):
- Site must not contain insecure dependencies
- Never block crawling with robots.txt
- Do not redirect users to or through an HTTP page
- Do not include a rel="canonical" link to the HTTP page
- Do not include a noindex robots meta tag
- Do not have on-host outlinks to HTTP URLs
- List the HTTPS URL in your sitemap
- Do not list the HTTP version of the URL
- Server must have a valid TLS certificate
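Two of the page-level blockers in this checklist can be caught with a simple scan using the standard-library `html.parser`; the page below is a deliberately broken illustration.

```python
from html.parser import HTMLParser

class MigrationChecker(HTMLParser):
    """Scan an HTML page for two of the indexing blockers listed above:
    a noindex robots meta tag and a rel=canonical pointing at HTTP."""

    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "meta" and a.get("name", "").lower() == "robots"
                and "noindex" in a.get("content", "").lower()):
            self.problems.append("noindex meta tag")
        if (tag == "link" and a.get("rel", "").lower() == "canonical"
                and a.get("href", "").startswith("http://")):
            self.problems.append("canonical points at HTTP")

# An example page that makes both mistakes.
page = """<html><head>
<meta name="robots" content="noindex, nofollow">
<link rel="canonical" href="http://www.example.com/page">
</head><body></body></html>"""

checker = MigrationChecker()
checker.feed(page)
print(checker.problems)  # ['noindex meta tag', 'canonical points at HTTP']
```

Running a check like this across your HTTPS pages after migration catches leftover HTTP references before Googlebot does.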