Handle SEO for launch

Description

Create a consistent URL path so we can block it later in the robots.txt file. For example, the new URL variation would read something like www.edx.org/test/course/data-science, where the path to block would be /test/
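
The path structure could look something like the following sketch, assuming a Django-style URL configuration for the site (the app and names here are hypothetical):

    # urls.py (hypothetical sketch): nest every new variation under a single "test/"
    # prefix so one robots.txt rule can cover all of them later
    from django.urls import include, path

    urlpatterns = [
        # legacy pages keep their existing paths, e.g. /course/subject/data-science
        path("test/", include("test_course_pages.urls")),  # yields /test/course/data-science, etc.
    ]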

On the new URLs we create, we need to add a NOINDEX, FOLLOW meta robots tag in the <head> section of every page. The NOINDEX tag keeps the duplicate content on the new pages out of ranking consideration, so it won't compete with the legacy pages.
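
For reference, the tag on each new page would look like this (exact placement depends on how the page templates are built):

    <head>
      <!-- other head tags -->
      <meta name="robots" content="noindex, follow">
    </head>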

On the new URLs, we need to add a canonical tag that references the legacy page. For example, www.edx.org/test/course/data-science would have a canonical link tag that references https://www.edx.org/course/subject/data-science as the true and preferred variation for Google / Bing to show in their results.
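
Using the example above, the new page's <head> would carry something like:

    <link rel="canonical" href="https://www.edx.org/course/subject/data-science">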

Add a temporary 302 redirect for the rules we want to run between the legacy page and the new variation. The 302 is a temporary redirect, so Google won't pass any SEO value over to the new variation and we won't lose out on any SEO juice. If we can find another way to do the redirect without using a 302, I'm open to that, but we can't use a 301 redirect, I think.
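
A minimal sketch of the 302, assuming the site is Django-based and the redirect sends test traffic from the legacy URL to the new /test/ variation (the view name is hypothetical, and the real rule would only cover whatever traffic the test applies to):

    # views.py (hypothetical sketch)
    from django.shortcuts import redirect

    def legacy_course_about(request, slug):
        # permanent=False issues a 302 (temporary) redirect rather than a 301,
        # which is the behavior described above
        return redirect(f"/test/course/{slug}", permanent=False)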

Once these 4 steps have been completed, we would then add the Disallow rule to the robots.txt file to block Google / Bing from crawling the new URLs. I can help test the disallow rule before we go live to make sure it won't block any critical pages on our site.
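
The rule itself would be a small addition to robots.txt, for example:

    User-agent: *
    Disallow: /test/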

 

Steps to Reproduce

None

Status

Story Points

None

Assignee

Unassigned

Reporter

Diana Huang

Labels

Reach

None

Impact

None

Customer

None

Partner Manager

None

URL

None

Contributor Name

None

Groups with Read-Only Access

None

Actual Points

None

Category of Work

None

Stakeholders

None

Sprint

Priority

Unset