URLs not crawled

2021-08-12 04:38:05: Starting post-deployment actions
2021-08-12 04:38:05: Starting Netlify deployment.
2021-08-12 04:38:05: Starting deployment
2021-08-12 04:38:05: Post-processing completed
2021-08-12 04:38:05: No static site directory to process.
2021-08-12 04:38:05: Processing crawled site.
2021-08-12 04:38:05: Starting post-processing
2021-08-12 04:38:05: Crawling completed
2021-08-12 04:38:05: Crawling complete. 0 crawled, 0 skipped (cached).
2021-08-12 04:38:05: Using CrawlCache.
2021-08-12 04:38:05: Starting to crawl detected URLs.
2021-08-12 04:38:05: Starting crawling
2021-08-12 04:38:05: URL detection completed (4460 URLs detected)
2021-08-12 04:38:05: Detection complete. 4460 URLs added to Crawl Queue.

Hi Mulyadi,

Sorry, I’m a bit slow to do much troubleshooting at the moment.

A quick fix may be to run Lokl (https://lokl.dev), which already has the plugins installed and the environment tested. If you import your site into that, it will eliminate any potential environmental issues.

Is there any way you can just share the Docker image?

There are a few extra steps to using Lokl’s Docker image outside of the wizard; I think they’re documented in the GitHub repo for lokl or lokl-cli.

Alternatively, you could spin up a different kind of environment and do the usual troubleshooting for WP2Static, i.e. switch to a default theme with no plugins and see if the problem persists. That’s just to isolate the exact cause: is it environmental or related to the site content?
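
For example, if WP-CLI is available on the server, something like the following would do that switch. This is only a rough sketch; the theme and plugin slugs are assumptions, so check them against your own install first:

    # Switch to a default theme (installs Twenty Twenty-One if it isn't already present)
    wp theme install twentytwentyone --activate

    # Deactivate every plugin except WP2Static
    # ("wp2static" slug is an assumption; confirm it with `wp plugin list`)
    wp plugin deactivate --all --exclude=wp2static

Then retry the crawl; if it still reports 0 crawled, the theme and other plugins can be ruled out as the cause.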

I’m now on Win 10.
I can’t execute the sh command in Bash.

I’m assuming some errors occurred during crawling:

Crawling complete. 0 crawled, 0 skipped (cached).

That would be more helpful if it listed errors!

What kind of address is it trying to crawl? Is this a Docker setup?

It makes me think that the crawling server is unable to access the WP Site URL (common in some Docker setups, but there are workarounds).
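
One quick check, assuming a Docker setup and that WP-CLI and curl happen to be available inside the WordPress container (the container name and URL below are placeholders):

    # Show the URL WordPress thinks it lives at
    docker exec -it my-wordpress wp option get siteurl --allow-root

    # Try to fetch that same URL from inside the container;
    # a connection error here usually means the crawler can't reach it either
    docker exec -it my-wordpress curl -sSI http://example.com/

If that request fails, the usual workarounds are to point the WP Site URL at an address the container can actually resolve, or to add a hosts entry inside the container.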

It’s on a VPS from UpCloud.