With faster speed & better UI/UX, PWA can boost your Google Core Web Vitals and thus improve your SEO. Many technology blogs & news sites (including SimiCart) have made this claim.
However, this is not the whole picture. In fact, in some cases, PWA can be bad for your website’s SEO.
In over 10 years of developing PWA websites, we have come across several cases in which SEO performance was actually hurt by PWA.
We have already written an article about PWA benefits. Now, to show you the underwater part of the iceberg, here are some big SEO challenges for progressive web apps that store owners may face when implementing PWA.
The main problem with PWA SEO
This goes back to basic website development knowledge: HTML & Javascript. Even if you're not tech-savvy at all, you have probably heard of these two before.
In short, HTML is a simple markup language used to define a website's structure.
Meanwhile, Javascript is a programming language responsible for dynamic behavior & interactive effects.
Therefore, if you see a web page with lots of dynamic content, a fancy scrolling effect, or a cool slider, for example, that web page probably makes heavy use of Javascript.
PWA, famous for its lively UI/UX experience, as you may guess, uses a lot of Javascript.
Here is the problem: Google & many other search engines do not render Javascript websites very well. Javascript-based websites are often more complex & heavier to process than plain HTML websites. In fact, not only search engines, but also social media crawlers like Twitter's or Facebook's struggle with Javascript when generating link previews.
As a result, search engine bots can have some issues with indexing your PWA site, leading to poor SEO performance.
Therefore, it takes a skillful expert who understands PWA inside out to optimize the website so that it's SEO-friendly.
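To see why this matters, compare what a crawler receives before & after Javascript runs. A minimal sketch (the `renderProduct` function & the product data are made-up names for illustration):

```javascript
// What a non-Javascript crawler downloads: an app shell with no real content.
const appShell = '<div id="root"></div>';

// What a Javascript-capable renderer (or the user) sees after the script runs.
// renderProduct and the product object are hypothetical, for illustration only.
function renderProduct(product) {
  return '<div id="root"><h1>' + product.name + '</h1>' +
         '<a href="/checkout">Buy now</a></div>';
}

const rendered = renderProduct({ name: 'Red Sneakers' });

console.log(appShell.includes('Red Sneakers')); // false: the crawler sees nothing
console.log(rendered.includes('Red Sneakers')); // true: content only exists after JS runs
```

The product name & the checkout link simply do not exist until the script executes, which is exactly what trips up crawlers that render Javascript poorly or late.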
6 obstacles when optimizing PWA for SEO
1. Performance issues
“But isn't PWA supposed to be fast?” – you may ask.
Yes, it is, but that does not mean PWA is free of performance issues.
There are still some prominent speed-related problems with PWA. However, PWA can be lightning-fast if it's optimized right.
Just like Usain Bolt: even though he was born super-duper talented, he could not become a champion without a coach who understands his ability & trains him well.
Let’s take a closer look at each issue:
Problems
Parsing, compiling & executing Javascript files consumes significant resources. Moreover, when server performance is poor, the search engine bot cannot render Javascript files quickly.
As a result, this may hurt page speed metrics such as First Contentful Paint & Largest Contentful Paint. As these metrics feed into Google's ranking signals, this is bad for SEO. A lagging site is never good for user experience either.
On top of that, while PWA's service worker requires HTTPS & an SSL certificate to function, the extra TLS handshake can add a small amount of latency compared to a plain HTTP version of the site.
Solutions
Common practices to improve speed include implementing AMP (Accelerated Mobile Pages) alongside PWA, and serving the site over HTTP/2 with HTTPS.
Since PWA is an advanced & niche technology in the eCommerce landscape, it's better to have experienced PWA developers optimize your site speed.
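One reason a well-optimized PWA is fast is the service worker's cache-first strategy: serve a resource from cache when possible, hit the network only once. Real service workers use the asynchronous Cache & Fetch APIs; the sketch below deliberately models the idea synchronously with a `Map` & an injected fetch function, so the logic is easy to follow:

```javascript
// Cache-first strategy, simplified to synchronous code for illustration.
// A real service worker would use caches.match()/cache.put() with await.
function cacheFirst(url, cache, fetchFn) {
  if (cache.has(url)) {
    return cache.get(url); // served from cache: no network round trip
  }
  const response = fetchFn(url); // cache miss: go to the network once
  cache.set(url, response);
  return response;
}

// Usage: the fake fetch counts how many times the network is actually hit.
let networkHits = 0;
const fakeFetch = (url) => { networkHits++; return 'body of ' + url; };
const cache = new Map();

cacheFirst('/styles.css', cache, fakeFetch); // network hit
cacheFirst('/styles.css', cache, fakeFetch); // served from cache
console.log(networkHits); // 1
```

Repeat visits skip the network entirely, which is where much of PWA's perceived speed comes from once the cache is warm.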
2. Javascript errors
Problem
When parsing, browsers like Google Chrome can automatically correct minor HTML syntax errors, such as missing closing tags. Javascript errors get no such forgiveness.
As a consequence, any errors in Javascript may stop search engine spiders from crawling & indexing your web pages.
Solutions
It's important to have developers write clean & well-structured Javascript code for your site, to avoid many time-consuming fixes later on.
If the site runs into any Javascript errors, developers will need to fix them manually & then ask search engines to recrawl the affected pages.
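A defensive pattern worth adopting: isolate optional enhancements so that one failing script cannot take the core content down with it. A minimal sketch (the function names are illustrative):

```javascript
// Render core content first, then apply optional enhancements in isolation.
// A throwing enhancement is logged instead of blanking the whole page.
function renderPage(content, enhancements) {
  let html = '<main>' + content + '</main>';
  for (const enhance of enhancements) {
    try {
      html = enhance(html);
    } catch (err) {
      // Log & continue: the base content stays crawlable.
      console.error('Enhancement failed:', err.message);
    }
  }
  return html;
}

const broken = () => { throw new Error('third-party widget crashed'); };
const addBanner = (html) => html + '<aside>Free shipping!</aside>';

const page = renderPage('Product details', [broken, addBanner]);
console.log(page.includes('Product details')); // true: core content survived
```

With this structure, a broken widget degrades gracefully instead of producing the blank page that stops crawlers cold.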
3. Hyperlinks & images not crawled due to cloaked content
Cloaked content is content that human eyes can see but search engines cannot: for example, text hidden behind a “show more” button, or lazy-loaded menu links. These are often dynamic content & links rendered with Javascript.
Problem
As Google still has some difficulty understanding Javascript, some of a PWA's Javascript-rendered content may never be accessed by Google.
If content (e.g. an important link or a product image) is invisible to search engines, they cannot index it. This will consequently hurt your SEO rankings.
Solution
You can rely on tools like Sitechecker or Small SEO Tools to see if your web page contains any cloaked content. Make sure all the key content (texts, images & links) delivered to your users matches what is served to Googlebot.
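A safer pattern for “show more” sections is to ship the full text in the initial HTML and only toggle its visibility with Javascript, instead of fetching the text when the button is clicked. A sketch (the class & attribute names are made up):

```javascript
// The full text is present in the markup from the start; Javascript later only
// toggles a CSS class, so crawlers see everything without running any script.
function renderShowMore(summary, fullText) {
  return (
    '<p>' + summary + '</p>' +
    '<div class="extra is-collapsed">' + fullText + '</div>' +
    '<button data-toggle=".extra">Show more</button>'
  );
}

const html = renderShowMore(
  'Lightweight running shoe.',
  'Breathable mesh upper, cushioned sole, available in five colors.'
);

// The crawler-visible HTML already contains the "hidden" details:
console.log(html.includes('Breathable mesh upper')); // true
```

Because the details exist in the HTML before any script runs, nothing is cloaked from the crawler's point of view.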
4. Module compatibility
Problem
As a relatively new technology, PWA often takes advantage of the latest Javascript modules & APIs. These might conflict with Googlebot or older browsers, which can lag behind the newest features. Notable examples include ES6+ syntax & the Fetch API.
Solution
Before implementing, you should check each module's compatibility. It's not a big problem if some modules of your PWA do not work with Googlebot, because there are polyfills & transpilers to fix this.
A polyfill is a piece of Javascript code that lets older browsers use a modern feature they do not natively support.
Depending on which modules Googlebot conflicts with, there might be a transpiling tool for them too.
For your information, transpiling (translate + compile) means translating one programming language, or language version, into another. Take ES6 as an example: tools like Babel can transpile modern ES6 Javascript files down to ES5, which older browsers can run smoothly.
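Here is what a polyfill looks like in practice, using `String.prototype.padStart` as an example (a real ES2017 method that older browsers lacked). The guard ensures a native implementation is never overwritten; the helper function is our own simplified version of the spec behavior:

```javascript
// Simplified padStart logic, written as a standalone helper.
function padStartPolyfill(str, targetLength, padString) {
  padString = String(padString === undefined ? ' ' : padString);
  if (padString.length === 0 || str.length >= targetLength) return str;
  let out = str;
  while (out.length < targetLength) {
    // Prepend only as many pad characters as still needed.
    out = padString.slice(0, targetLength - out.length) + out;
  }
  return out;
}

// Install it only when the environment lacks the native method.
if (!String.prototype.padStart) {
  String.prototype.padStart = function (targetLength, padString) {
    return padStartPolyfill(String(this), targetLength, padString);
  };
}

console.log(padStartPolyfill('5', 3, '0')); // "005"
```

In production you would ship battle-tested polyfills (e.g. from core-js) rather than hand-rolled ones, but the guard-then-install shape is the same.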
5. Problem with server-side rendering
To handle Javascript-based websites, Google indexes content in two waves: it first processes the server-rendered HTML, then renders the Javascript later.
In short, the rendering process looks like this:
- Googlebot crawls your website for the first time, discovering links & content
- Links & content already present in the server-rendered HTML are indexed shortly afterward
- Googlebot continues to crawl the website a second, third time, etc.
- The remaining Javascript-generated links & content are queued for rendering & executed later by Googlebot's web rendering service, the same way a browser would render them on a user's device
- Only after this rendering is finished does Google index those parts of your web pages
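The two-wave process above can be sketched as a small simulation: the first wave only sees links present in the raw HTML, while Javascript-inserted links must wait for the rendering wave (the regex-based link extraction is a deliberate simplification of real DOM parsing):

```javascript
// First wave: extract only the links already present in the raw HTML.
// (A real crawler parses the DOM; a regex is enough for this sketch.)
function firstWaveLinks(rawHtml) {
  const links = [];
  const re = /href="([^"]+)"/g;
  let m;
  while ((m = re.exec(rawHtml)) !== null) links.push(m[1]);
  return links;
}

// Second wave: add the links that only exist after Javascript runs.
function secondWaveLinks(rawHtml, jsInsertedLinks) {
  return firstWaveLinks(rawHtml).concat(jsInsertedLinks);
}

const rawHtml = '<a href="/blog">Blog</a><a href="/about">About</a>';
const jsLinks = ['/products/red-sneakers']; // e.g. a Javascript-built menu

console.log(firstWaveLinks(rawHtml));                  // ["/blog", "/about"]
console.log(secondWaveLinks(rawHtml, jsLinks).length); // 3
```

Note which page is stuck in wave two here: the product page, i.e. the one that actually sells, which is exactly the prioritization problem described below.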
Problem
At first, Googlebot's rendering flow seems reasonable enough, but it has some major drawbacks:
- Slower indexing:
Google can index server-rendered HTML quickly, but Javascript content that needs client-side rendering can take days to be indexed.
- Prioritize the wrong pages:
Imagine that you have a web page that links to some other pages. These are called internal links. Google relies on the structure of these internal links to understand the relationship between the links & which links they should prioritize.
However, for Javascript-inserted links, Google needs to wait for the client-side rendering to be finished before it can start indexing. To put it another way, Google can fully evaluate the website’s internal link structure only after the client-side rendering is finished.
As a result, some not-so-important links that need no client-side rendering may be crawled first & more often. Thus, Google can prioritize your minor pages & ignore the pages that actually sell.
- Conflicts between server-side rendering & client-side rendering:
There are often mismatches between server-side & client-side rendering. For example, some important page elements (meta tags, canonical tags) may be missing from the server-rendered HTML & never get indexed. Moreover, the two phases may send mixed signals to Google, so the search engine does not know how to treat your pages.
Solution
A more efficient server is the major fix for these issues. Developers may need to optimize the server so that it responds instantly to deep-linking requests. They also need to make sure the server returns well-rendered HTML, especially for crucial page elements such as navigation, links, content, meta tags & images.
Modern techniques can help resolve the conflict between Javascript & web crawlers too. For instance, isomorphic (universal) rendering lets the same code render pages on both the server & the client, so crawlers receive ready-made HTML.
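The core idea of isomorphic rendering is one template function used in both places: the server calls it to produce ready-made HTML for bots & the first paint, and the client can reuse the very same function for later updates. A minimal sketch (the function names & product data are hypothetical):

```javascript
// One render function, shareable by server & client (isomorphic rendering).
function renderProductPage(product) {
  return (
    '<h1>' + product.name + '</h1>' +
    '<p>Price: $' + product.price + '</p>' +
    '<a href="/cart">Add to cart</a>'
  );
}

// Server side: respond with fully rendered HTML, so crawlers get real content
// without executing any Javascript at all.
function handleRequest(product) {
  return '<!doctype html><html><body>' +
         renderProductPage(product) +
         '</body></html>';
}

const html = handleRequest({ name: 'Red Sneakers', price: 59 });
console.log(html.includes('<h1>Red Sneakers</h1>')); // true: crawlable immediately
```

Frameworks like Next.js or Nuxt industrialize this pattern, but the principle is just this: the crawler-facing response already contains the content.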
6. Common SEO mistakes with progressive web apps
There are over 200 SEO ranking factors, and some of them are heavily technical, such as canonical tags, robots.txt, etc. For this reason, merchants may easily overlook these configurations & end up hurting their website's SEO.
Before getting into fixing your PWA SEO, it’s best to make sure your current sites follow the best SEO practices.
>> See more: Best Magento SEO practices
Now, on to the burning question:
Should I upgrade to PWA?
… If I don’t want to harm my website’s SEO.
All of this can be confusing for merchants thinking of implementing PWA, especially when organic search is a vital traffic channel for their website.
However, these challenges do not mean that Google's bots can't crawl PWA sites, nor does Google rank PWA sites differently because of their programming language.
It’s just that merchants need to be aware of some technical challenges when implementing an SEO-friendly PWA site.
In fact, PWA websites can absolutely rank well in search engines, if not better than their equivalent non-PWA websites. As mentioned earlier, a well-built PWA improves your Core Web Vitals.
So, in terms of SEO, is PWA a hero or a zero? It depends on whether its developers deeply understand PWA technology & website development.
Read More: SPA vs. MPA: Pros, Cons & How To Make Final Choice
Takeaway
PWA can be a double-edged sword for your website's SEO ranking. On the bright side, its enhanced speed & customer experience can help boost your Core Web Vitals. On the dark side, PWA, with its Javascript-heavy content, is more complicated for search engines like Google to work with. Thus, a poor implementation without deep PWA understanding can cause rankings to plummet.
It’s totally possible to create an SEO-friendly PWA website. The key is to find PWA developers that are familiar with these challenges & know how to resolve them quickly.
We have 10 years of experience creating headless PWAs for Magento.