
Search Engine Optimization (SEO) is the process of ensuring that prospective members are able to find your Space via a search engine.


Since the platform is designed to be used by non-technical people, the vast majority of Search Engine Optimization is handled for you automatically.

To hide your Space and all of its content from search engine results, you can make your Space uncrawlable.


Metadata

Metadata is invisible information contained within your website that external sites can use to quickly understand its contents. For instance, if you paste a link to your Event into a social media post, the social media site may be able to determine the title and description of that Event automatically.


Title Tags

Title tags are created from the title of the current content and the Space Name. For instance, the About Page for the Penguin Club would look like this:

About | Penguin Club
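In the page's HTML, that combined title is emitted as a standard title tag, e.g.:

```html
<title>About | Penguin Club</title>
```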

All HTML tags relevant to the major search engines and social media sites are included automatically.


Description Tags

Description tags are taken from several different sources:

  1. The content's description field, if available, is the first source.
  2. If the content has a body field, a truncated version of it is used next.
  3. Otherwise, your Space's description (which can be set up in your Setup Dashboard) is used.

Open Graph and Twitter

Additional tags are generated based on the Open Graph and Twitter specifications to convey additional information to external sites about your content.
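For illustration, the generated tags for an About Page might look something like the following (the exact tags and values vary by content type, and the description text here is hypothetical):

```html
<meta property="og:title" content="About | Penguin Club" />
<meta property="og:type" content="website" />
<meta property="og:description" content="A club for penguin enthusiasts." />
<meta name="twitter:card" content="summary" />
<meta name="twitter:title" content="About | Penguin Club" />
```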


Robots.txt

Search engines use a special file called Robots.txt to determine which parts of your site should be indexed. This file is dynamically generated based on the settings of a Space (e.g. if the Events Accessibility is not set to Public, the file will inform search engines not to index any Events).
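As a sketch, a Space whose Events are not publicly accessible might serve a Robots.txt along these lines (the domain and paths here are hypothetical, not the actual generated output):

```txt
User-agent: *
Disallow: /events/
Sitemap: https://example.com/sitemap.xml
```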

Additionally, the appropriate meta tags are added to the <head> of each corresponding page.  For instance, a page that should be indexed will include the following:

<meta name="robots" content="index,follow" />

Whereas a page that should not be indexed will include the following instead:

<meta name="robots" content="noindex,follow" />

Any changes to the accessibility of any page or module will automatically be reflected in your Robots.txt file and meta tags.


Sitemaps

Sitemaps are used to help search engines find content on your site.

A site map is automatically generated for each Space, containing all publicly viewable content. For example, you can see the site map of our demo Space here:

This will only contain content that has been marked as publicly accessible (see Capabilities) and will automatically get updated whenever new content is added or changed on your Space.
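Generated site maps follow the standard Sitemap XML format, along these lines (the URLs and dates here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/events/penguin-meetup</loc>
    <lastmod>2024-01-02</lastmod>
  </url>
</urlset>
```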


Any new content or updates to existing content in a Space will be reflected in your Robots.txt and Sitemap almost immediately. However, search engines may take up to two weeks to reflect the changes.

Rich Snippets

Rich snippet JSON is generated for all relevant pages so that each page on a Space can be shown as a rich snippet by a search engine.
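Rich snippets are typically driven by JSON-LD structured data embedded in the page. A hypothetical example for an Event (the names, dates, and fields here are illustrative, not the actual generated output):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Penguin Club Meetup",
  "startDate": "2024-06-01T18:00:00",
  "location": { "@type": "Place", "name": "Penguin Club HQ" }
}
</script>
```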

Pretty URLs

You can customize the URL of any post to reflect its content.

Canonical Links

All pages automatically include consistent canonical links.
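A canonical link is a single tag in the page's <head> that tells search engines which URL is the authoritative version of that page, e.g. (URL hypothetical):

```html
<link rel="canonical" href="https://example.com/about" />
```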