must have seo favorable sitemap before permanent redirect #8
Do you have a preference for a language for the PR? I can write bash, sh, Python 2 and 3, Rust, and C; if any of those suit you, I'll see what I can do.
I am most concerned that there is some aspect of Google that I don't understand and that the change will somehow damage what goodwill I have with their algorithms. I have Python 2, so that would be a good choice. Thank you for asking.
I think the safest option is to actually offer static pages, which isn't that hard with the tooling available; I'm saying that as someone who's willing to significantly help implement this. If anything, it will reduce page load time, which is a joy to me and improves search engine ranking. Google does index SPAs now, but they also recommend having a sitemap in their guidelines. I think it'd be best to implement the whole Webmaster Guidelines, but that would be a separate GitHub issue; if you agree, please make an issue for that, and if you want me to flesh it out, assign me to it.
As an aside, if it's no trouble, I do have a preference for Python 3, as I don't want to further elongate the lifespan of Python 2 by writing new projects in it.
I don't have Python 3. I did check, and I do have Python 2 on the computers I would use. I have bash there too. Let's be careful not to make too much of this issue. I'm guessing this requires some form of one-time template substitution, and most of the work is in researching what that template should look like in our situation. I haven't made Google sitemaps before, and the docs look pretty thick. The names.txt file will change a few times as I find backup copies for pages that have gone missing, so a command-line script that I can rerun would be ideal. I think of this as one-time use, but really it will be two or three times. In issue #3 we describe where we are headed, which is a fully distributed read-write copy of this content. Ultimately we will redirect readers into the federation, an anti-SEO move. This has come up recently in conversation on Matrix.
We see Google indexing the remodeled wiki. This experiment suggests that indexing might not be as fast (see the linked post). Rapid update would not be a concern here, but it is important for federated wiki, our eventual replacement.
In issue #1 we describe the impending permanent redirection of CGI URLs to a new implementation based on client-side rendering. We would like to preserve our search engine recognition through this move and beyond, especially for engines that don't interpret JavaScript. A well-formed sitemap could be the solution.
Our project would be advanced by the contribution of a script that could translate names.txt into such a map. A pull request with a script, including install instructions and SEO references, would be welcome. Find names.txt here:
https://github.com/WardCunningham/remodeling/tree/master/static
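The requested translation could be sketched along these lines: read one page name per line from names.txt and emit a sitemap.xml in the standard sitemaps.org format. This is a minimal sketch, not the contributed script; `BASE_URL` is a hypothetical placeholder for the real site root, and it assumes page names are already URL-safe (no escaping is applied). It is written to run under both Python 2 and Python 3, matching the constraint discussed above.

```python
#!/usr/bin/env python
"""Sketch: turn names.txt (one page name per line) into sitemap.xml.

Compatible with Python 2 and 3. BASE_URL is a placeholder assumption;
replace it with the real site root before use.
"""
import io
import sys

BASE_URL = "https://example.com/"  # hypothetical -- substitute the real URL prefix


def make_sitemap(names):
    """Return sitemap XML text for an iterable of page names.

    Blank lines are skipped; names are assumed to need no URL-escaping.
    """
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for name in names:
        name = name.strip()
        if not name:
            continue
        lines.append('  <url><loc>%s%s</loc></url>' % (BASE_URL, name))
    lines.append('</urlset>')
    return '\n'.join(lines) + '\n'


if __name__ == '__main__' and len(sys.argv) > 1:
    # Rerunnable from the command line: python sitemap.py names.txt > sitemap.xml
    with io.open(sys.argv[1], encoding='utf-8') as f:
        sys.stdout.write(make_sitemap(f))
```

Because the script takes the input file as an argument and writes to stdout, it can be rerun each time names.txt changes, as requested.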