Updating a website with articles or blog posts is one of the fun parts of working as an SEO Content Writer or SEO Copywriter. I usually write content about digital marketing: SEO, SEM, and social media. After publishing, I request indexing for the page right away so it gets listed on Google.
To request fetching or indexing on Google, you use Google Search Console: open the URL Inspection tool and submit your website page for indexing. In my experience, though, the request kept failing. Search Console reported that the page could not be indexed because my robots.txt configuration was blocking all pages of my website.
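For context, a robots.txt that blocks everything typically looks like the sketch below. This is an illustration of the symptom, not my actual file:

```
# Blocks every crawler from every page - this is what triggers
# the "blocked by robots.txt" error in Search Console
User-agent: *
Disallow: /
```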
What the heck was happening? I checked everything, and there was no misconfiguration; the robots.txt looked perfectly safe. I repeatedly tried fetching the article page I had just updated, and it still would not index. I started to think my website had been penalized, hmmm …
I went around here and there looking for a solution: I checked every configuration, from the Yoast SEO plugin to sitemap.xml, and everything was set up correctly.
I even tried deleting Yoast and reinstalling it, and it still would not work, until finally I found the solution.
It turns out this problem is very easy to solve. Google provides a tool for testing robots.txt; I just needed to update my robots.txt to a standard configuration and verify it at:
https://www.google.com/webmasters/tools/robots-testing-tool?
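For reference, a minimal "standard" robots.txt for a WordPress site often looks something like this. This is a sketch, not my exact file; the Sitemap URL is a placeholder you would replace with your own domain:

```
# Allow all crawlers, but keep them out of WordPress internals
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Placeholder sitemap location - replace with your own domain
Sitemap: https://example.com/sitemap.xml
```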
This Google robots.txt testing tool is really helpful for anyone who is confused when facing indexing problems.
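If you would rather run a quick local check before (or after) using Google's tool, Python's standard library can parse a robots.txt and tell you whether a given crawler may fetch a URL. A minimal sketch, assuming the example.com URLs are placeholders for your own site:

```python
from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt (placeholder domain)
parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # downloads and parses the file

# Check whether Googlebot is allowed to fetch a specific article URL
url = "https://example.com/blog/my-new-article/"
if parser.can_fetch("Googlebot", url):
    print("Googlebot may crawl:", url)
else:
    print("Blocked by robots.txt:", url)
```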