Crawling: Googlebot must be able to crawl your site by navigating through links;
Rendering: Googlebot must be able to read the content of your web page (rendering being the act of interpreting the code on the page to actually display your website);
Crawl budget: Googlebot uses significant resources to browse and index sites. If your site is too difficult to crawl and consumes too many resources, Googlebot won't finish the job.
This happens in three steps:
Download the page.
Index it a first time.
Find other links once the page has rendered, then repeat the operation.
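The three steps above can be sketched as a toy crawl loop in JavaScript. The site map and function names here are hypothetical illustrations, not Googlebot's actual pipeline:

```javascript
// Toy site: each URL maps to the links discoverable after rendering.
// All URLs here are hypothetical, for illustration only.
const site = {
  "/": ["/products", "/about"],
  "/products": ["/products/1"],
  "/products/1": [],
  "/about": [],
};

// Simulate the crawl: download a page, index it, then queue the
// links found after rendering, repeating until the queue is empty.
function crawl(startUrl) {
  const indexed = new Set();
  const queue = [startUrl];
  while (queue.length > 0) {
    const url = queue.shift();
    if (indexed.has(url)) continue;
    indexed.add(url); // steps 1-2: download and index the page
    for (const link of site[url] ?? []) {
      queue.push(link); // step 3: follow links found after rendering
    }
  }
  return [...indexed];
}

console.log(crawl("/")); // visits every page reachable through links
```

Note that any page not reachable through a link never enters the queue, which is why orphan pages go unindexed.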
As you have seen, it's always better to make Google's job easier and not harder!
Pick the Right Technology for Your Site
Here are three scenarios if you don't already have a website online:
1. Your Site Has Few Business Functions and Is Mainly Used for Marketing (Customer Acquisition)
2. Your Site Is a Web App With Important Business Functions. You Also Need a Place to Showcase Products and Acquire New Customers
This is the case for most SaaS platforms, such as OnCrawl.
3. You Want to Build, or Already Have Built, an SPA (Single-Page Application)
Adopt SEO Best Practices if Your Website Is in Angular, React, or Vue
Indexable URLs - Pages always need unique, indexable URLs.
Each page you want to index must exist as a real page, with a 200 OK server response. The SPA must provide a server-side URL for each category, article, or product.
Do not use # in URLs to indicate separate pages.
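The reason: the fragment after `#` is never sent to the server; it exists only in the browser. A quick check with the standard `URL` API (using a placeholder domain) shows what the server actually sees:

```javascript
// The fragment (#...) is client-side only: it is not included in the
// HTTP request, so the server sees the same path for every "page".
const hashUrl = new URL("https://example.com/#/products/42");
console.log(hashUrl.pathname); // "/"
console.log(hashUrl.hash);     // "#/products/42"

// An indexable URL puts the route in the path instead:
const realUrl = new URL("https://example.com/products/42");
console.log(realUrl.pathname); // "/products/42"
```

This is why modern SPA routers offer a "history" or "path" mode instead of hash-based routing.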
Pages always need titles, meta descriptions, meta robots tags, their own URLs (containing the target keyword), textual content, images, alt attributes, etc.
Googlebot follows links to crawl your site, so don't forget to include them (href attributes for links, src attributes for images).
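As an illustration (the markup below is hypothetical), a crawler that reads HTML attributes will find the first link and the image, but not the second "link", which exists only inside a click handler:

```javascript
// Hypothetical page markup: one real link, one JavaScript-only "link".
const html = `
  <a href="/products/42">See product</a>
  <span onclick="goTo('/products/43')">See product</span>
  <img src="/images/product-42.jpg" alt="Product 42">
`;

// Collect the URLs a link-following crawler can discover: only the
// href and src attributes, never the onclick handler's argument.
function discoverUrls(html) {
  const urls = [];
  for (const match of html.matchAll(/\b(?:href|src)="([^"]+)"/g)) {
    urls.push(match[1]);
  }
  return urls;
}

console.log(discoverUrls(html)); // ["/products/42", "/images/product-42.jpg"]
```

The `/products/43` page is invisible to this kind of discovery, so it depends entirely on being linked elsewhere or listed in a sitemap.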
If you need a customer-facing website, you should consider building it in HTML/CSS.