ivanmoony t1_itp7sjl wrote

I'm afraid that won't work. The idea assumes that Google would index the HTML before running any scripts, while the noscript content is populated from within your JIT-compiler script. It only works if you have a separate HTML file for each md file and manually put a noscript tag in each of them. Not a very elegant solution, which is why I posed the question in the first place.

Maybe push your HTML rendering to some server-side scripting tech? Something like calling a PHP script that constructs the HTML output from the md file you pass as a parameter. That way search crawlers would seamlessly index the resulting HTML, with nothing missed. I like this solution better than the noscript one, but it requires rewriting parts of your code. What you'd finally get are files properly structured for search-engine indexing.
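
Roughly something like this, as a minimal sketch. The file layout, the `page` parameter, and the use of the Parsedown library are just my assumptions for illustration, not part of your code:

```php
<?php
// render.php — sketch of the server-side idea: /render.php?page=intro
// serves ready-made HTML built from intro.md, so crawlers see plain HTML.
// Assumes Parsedown is installed (composer require erusev/parsedown).

require __DIR__ . '/vendor/autoload.php';

$page = basename($_GET['page'] ?? 'index');   // strip any path tricks
$file = __DIR__ . '/md/' . $page . '.md';     // md files kept in ./md/ (assumed layout)

if (!is_file($file)) {
    http_response_code(404);
    exit('Page not found.');
}

$markdown = file_get_contents($file);
$body     = (new Parsedown())->text($markdown);   // md -> HTML

header('Content-Type: text/html; charset=utf-8');
echo "<!DOCTYPE html>\n<html>\n<head><title>" . htmlspecialchars($page) .
     "</title></head>\n<body>\n" . $body . "\n</body>\n</html>";
```

The point is only that the crawler fetches a URL and gets finished HTML back; whether you do the md-to-HTML step with Parsedown, your own converter, or something else entirely is up to you.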

1

ivanmoony t1_itm1jvi wrote

Meta tags are mostly ignored by search engines because they are easy to abuse. People kept stuffing in unrelated but popular search keywords just to attract visitors, so Google switched some meta tags off in its search algorithms.

My best shot is to include custom searchable text inside a noscript tag in the main HTML, which search crawlers recognize and index. It should do the trick (it does on my site), but I wonder if anyone has a better idea. Anyway, we should be careful with the noscript tag: if the searchable text doesn't match the real content, that may get us on Google's blacklist, after which the domain won't show up in any search. Very dangerous.
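
Roughly what I mean, as a minimal illustration (the heading and summary below are placeholders; the real fallback text has to mirror what the script actually renders from the md file):

```html
<!-- Fallback for crawlers and for users without JavaScript.
     Keep it an honest mirror of the scripted output. -->
<noscript>
  <h1>Casual Mark Down</h1>
  <p>A plain-text summary of the page, matching the content the
     script renders, so search engines have something real to index.</p>
</noscript>
```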

Off the record, Google Search is already making an effort to process script-generated HTML, which is exactly the case with Casual Mark Down. Not sure how far they have gotten in this quest, but at the end of it we won't need the noscript tag at all: Google would run our scripts on their servers and index the resulting HTML. That will be the day...

2