For those who aren’t familiar with Squarespace, it is a website creation tool designed to make building beautiful-looking websites very simple for people without any special design skills or experience. Many agencies and SEO professionals have critiqued how effectively SEO can be done on this CMS (Content Management System).
Here is my take after spending years working on these sites. This is current as of the end of 2016.
When certain bots crawl the site, two URLs are found for each page.
www.domain.com/about (no trailing /) is seen as one URL and www.domain.com/about/ (with a trailing /) is seen as another. This crawl environment can make a bot treat these two URLs as duplicate content. In fact, they are not two pages: they are one page, served by one file on the server. Some bots still see this as duplicate content.
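To make the duplicate-URL problem concrete, here is a minimal sketch of a hypothetical helper that collapses the two variants into a single key by always appending the trailing slash (the form Squarespace canonicalizes to, as discussed below). The function name and approach are illustrative, not part of any Squarespace tooling.

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url: str) -> str:
    """Collapse trailing-slash variants of a URL into one canonical key.

    Hypothetical helper: treats /about and /about/ as the same page by
    always appending the trailing slash.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    if not path.endswith("/"):
        path += "/"
    return urlunsplit((scheme, netloc, path, query, fragment))

# Both variants resolve to the same key:
# normalize_url("http://www.domain.com/about")
# normalize_url("http://www.domain.com/about/")
```

A crawler-side dedupe like this is what a bot *could* do; the point of the article is that we should not rely on it, and should instead make the site itself consistent.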
The sites appear to canonicalize to the trailing "/". We have found that Squarespace adds a trailing / to any links in the navigation, but if you add a link in the body of content, it will not add a trailing /. The canonical tags across the entire site use the trailing /, even though on-page links do not.
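To verify which form a page canonicalizes to, you can pull the rel="canonical" href out of each page's head. This is a minimal sketch using Python's standard-library HTML parser; the class and function names are my own, assumed for illustration.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Finds the rel="canonical" href in a page's HTML, if present."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # <link rel="canonical" href="..."> lives in <head>
        if tag == "link":
            attr_map = dict(attrs)
            if attr_map.get("rel") == "canonical":
                self.canonical = attr_map.get("href")

def find_canonical(html: str):
    """Return the canonical URL declared in the given HTML, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical
```

Running this across a crawl export lets you confirm that every canonical tag ends with the trailing /, and flag any page whose on-page links disagree with it.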
Basically, Squarespace is telling the on-page crawl one thing, the sitemap another, and the navigation another. It’s confusing the crawl. We need to ensure that the sitemap, on-page non-nav links, and all 301 redirects point to the same trailing (or non-trailing) URL, applied absolutely universally. We need to rebuild the sitemap with the same URLs that the canonical tags and on-page non-nav links reference.
In addition, we need to make sure the Google Analytics filter is set up to match whatever the sitemap, on-page non-nav links, canonical refs, and 301 redirects point to. See Step 5 here: http://www.trevorayers.com/top-5-google-analytics-filters/
- Step One: Clean all internal links to match the default canonicalization; the system gives nav links and page canonical tags the trailing "/".
- Step Two: Re-crawl with Screaming Frog, look for duplicate items, and address them. Once all duplicates are gone, build the sitemap.
- Step Three: Upload and re-submit sitemap to Google.
- Step Four: Build correct filters in GA.
- Step Five: Monitor the crawl, and audit in 3 months.
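The sitemap check in Step Two can be sketched as a small audit script. This is a minimal sketch, assuming the site canonicalizes to the trailing-slash form; the function name is hypothetical.

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace per the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_missing_trailing_slash(sitemap_xml: str) -> list:
    """Return sitemap <loc> URLs that do not end with a trailing slash.

    Every URL in the sitemap should match the trailing-slash form used
    by the canonical tags and nav links; anything flagged here needs fixing.
    """
    root = ET.fromstring(sitemap_xml)
    flagged = []
    for loc in root.iter(SITEMAP_NS + "loc"):
        url = (loc.text or "").strip()
        if url and not url.endswith("/"):
            flagged.append(url)
    return flagged
```

Run this against sitemap.xml before re-submitting to Google in Step Three; an empty result means the sitemap agrees with the canonicals.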
Currently, no blog or eCommerce pages have editable metadata. We need to hard-code it, which takes developer hours. http://developers.squarespace.com/
Plug-ins are overridden by the theme. We need to do extra manual code editing to make them work. http://developers.squarespace.com/
Further, webmasters, site owners, and SEOs are unable to add content blocks on category product pages (eCommerce).
The content blocks that you can insert cannot be resized or placed side by side. They are not changeable, and this makes it hard to create a number of things relevant to the content, calls-to-action, and structured markup.
On some themes, the meta descriptions can show up on the page itself, but not on all themes.
Hosting is not in an SEO’s control, so we are limited in what we can do with site caching, file management, and server resources: page loads, server errors, .htaccess, etc.
Here is a nice infographic of the differences:
Summary and Suggestions:
That said, Squarespace makes it easy to build sites at scale but hard to maintain organic SEO. There are workarounds, but they are time-consuming and require extra resources in the form of web developers. If you want your site to be considered a professional, crawlable site… I suggest ditching Squarespace and going to WordPress.
Any published information from this document must be credited to the author and the company: Kevin Spidel, Director of Digital Strategy – V Digital Services, www.vdigitalservices.com