Server-side Rendering – Totally Tooling Tips


ADDY: One topic that's come up a lot recently is server-side rendering.
MATT: Yeah.
ADDY: And it's a hot topic.
MATT: Hot topic.
ADDY: Hot topic. Lots of nuance, lots of opinions around it.
MATT: Yeah, lots of feelings.
ADDY: What is server-side rendering?
MATT: In a nutshell, and this is particularly important for JavaScript-heavy client-side apps, you take your application and you think of it almost like you pre-boot it on the server. You stamp out all the DOM nodes, you get everything in place, you have your CSS in place so it paints really fast, and you send all of that down to the client. You get a really nice, fast first paint, because everything is already there in the document.
ADDY: Right, but it comes with a cost.
MATT: Indeed.
ADDY: If you're server-side
rendering something, it's potentially going to improve a few metrics: first paint, first meaningful paint, first contentful paint, how quickly people get text on the screen. But it also comes at the cost of shipping a much larger payload of HTML down to your users.
MATT: Absolutely.
ADDY: And that can have the effect of pushing out everything else, including your JavaScript. So you get this really fast first paint showing people something on the screen.
MATT: You can see it.
ADDY: You can see it, but then you can't necessarily interact with it. It's like you end up on a shopping page: it renders really quickly, and then you start tapping on buttons, and unless everything is server-side rendered end to end, the user might hit bits of the experience that are not complete or ready yet. So it's a trade-off.
MATT: And I've
experienced this recently. I was driving through the woods and was on a much slower mobile connection than I usually am. And yeah, I landed on a page, and I knew it was server-rendered, because I couldn't click on any of it, and I knew exactly why. It can be frustrating.
ADDY: It can be. I mean, for different verticals, maybe server-side rendering makes sense.
MATT: Totally.
ADDY: If I'm a publisher, I might care really heavily about making sure the article text can be seen pretty early on. But people use server-side rendering for lots of different reasons. Some use it to optimize those metrics; others do it because they're trying to make sure that Googlebot and other search crawlers can see their content.
MATT: Especially because not all crawlers can run JavaScript.
ADDY: Google has said that the crawler
can understand JavaScript to some extent, and that's great. What I recommend, if folks are unsure, is to go check out Webmaster Tools. There's a little tool there that lets you...
MATT: Fetch as Google.
ADDY: Fetch as Google, that lets you render your page the same way that Googlebot would, and we now know that's powered by Chrome 41.
MATT: Chrome 41.
ADDY: We've got some docs on that now.
MATT: We can leave them down in the show notes.
ADDY: We'll link them up in the magic boxes. And that's great, but you should check that out. See if it's actually busted in any way before...
MATT: It can be surprising, because you think, "Oh, it runs JavaScript, so I'm just going to assume it runs the latest JavaScript." And that is not always the case.
ADDY: No. Chrome 41 came out a while back.
MATT: Yeah, exactly. It's probably old enough
to drive at this point.
ADDY: And so there are definitely times when I forget to include polyfills for search crawlers. You can polyfill things like promises, depending on the crawler. So people should just be careful.
MATT: I think, actually, in the article you mention there are even some debugging tips, because you do Fetch as Google and you might just see a blank page. And then it's hard to know: OK, what's missing? Is my application actually broken, or, like you're saying, am I missing a polyfill? So there are some tips in there on how you can debug that and figure out which it is. I've totally been bitten by this before. I was missing a Web Animations polyfill once, and I was just like, I guess this whole thing is broken. And I was able to debug it and realize, oh, I just need to include this one file.
ADDY: So it may not be
quite as complex and broken as folks think. There are some possible ideas here. I know that when some people are thinking about building mobile websites and progressive web apps, we'll tell them that the architecture pattern of choice is the application shell, because you can easily cache pieces of your UI that don't need to be fetched from the network on repeat visits. It improves your performance, and so on. But then they say, "Well, yeah, but I kind of need to server-side render my content too, because I've A/B tested, and server-side rendering is improving my SEO," and it starts to get a little bit complicated. One possible balance you could find there is to use your performance architecture of choice for your normal users, and then conditionally serve the server-side rendered version to Googlebot or other search crawlers.
MATT: Do a little UA sniffing, maybe?
ADDY: I didn't say that.
MATT: I didn't say that. Well, actually, I did say that.
ADDY: That's OK. Matt said that.
MATT: Matt said that.
ADDY: Matt said it.
MATT: There we go.
ADDY: There's a lot of
nuance to this problem, and we're not just talking about it in terms of performance. I know that everyone is going to have different things that they care about. Just take a holistic look at server-side rendering and where the trade-offs are, where maybe using it just for SEO over other things might make sense, and use your tools. Measure.
MATT: Exactly. Tools, not rules.
ADDY: Tools, not rules.
MATT: That's what the smart kids say.
ADDY: Is there a sign-off of some sort you had for this episode?
MATT: It's totally time for the end of the episode now. Thanks for watching "Totally Tooling Tips." [DING] [MUSIC PLAYING]

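On the polyfill point: since Googlebot's renderer was powered by Chrome 41 at the time, the safe pattern is to feature-detect before relying on newer APIs rather than assuming a current engine. A sketch, using `Object.assign` (absent from Chrome 41) as the example; the fallback below is illustrative, not spec-complete:

```javascript
// Guarded polyfill: only install the fallback when the API is missing,
// so modern browsers keep the native implementation.
if (typeof Object.assign !== 'function') {
  Object.assign = function (target) {
    // Copy own enumerable properties from each source onto target.
    for (let i = 1; i < arguments.length; i++) {
      const source = arguments[i];
      for (const key in source) {
        if (Object.prototype.hasOwnProperty.call(source, key)) {
          target[key] = source[key];
        }
      }
    }
    return target;
  };
}
```

The same guard shape applies to the promise and Web Animations polyfills mentioned in the episode: detect, then load the polyfill only for engines that need it.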

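And the "conditionally serve the server-rendered version to crawlers" idea could look like the sketch below. The UA substrings are real crawler tokens, but the routing itself is an illustration of the tongue-in-cheek UA-sniffing suggestion from the episode, not an endorsement of it:

```javascript
// User-agent tokens used by major search crawlers.
const CRAWLER_UA = /Googlebot|Bingbot|DuckDuckBot|Baiduspider|YandexBot/i;

function isSearchCrawler(userAgent) {
  return CRAWLER_UA.test(userAgent || '');
}

// Pick which build of the page to send for a given request.
function chooseVariant(userAgent) {
  return isSearchCrawler(userAgent)
    ? 'server-rendered' // full HTML for crawlers that may not run JS
    : 'app-shell';      // cached shell + client render for everyone else
}
```

As the comments below point out, serving crawlers different content than users edges into cloaking territory, so measure and read the relevant webmaster guidelines before adopting anything like this.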
  • Hani Yahya

    If you SSR just for the crawlers, you've become a slave to your slave!

    I say search engines should define some sort of convention (like the robots.txt convention) that a webmaster can easily follow to signal that the page has been rendered and is ready to be crawled. Webmasters and users shouldn't have a bad experience just because the crawler is not that smart.

  • jozsef daníel

    It's probably good, but there's one big problem: I only speak, or rather understand, English at a level where this is over my head. I may even quit this whole internet thing. They've been all over me: one day they block one thing, the next day another. They poke around on my mobile phone, and I can't do anything on the tablet. So I'm leaving the internet. That's why it's not worth my while to think about development, because I don't like it when people constantly mess with me and my devices. I don't know when, but I'll be leaving soon.

  • Ryan Ricketts

    Wait… are you recommending that we user agent sniff and then cloak? I’m so confused… you work for Google and you’re recommending that we violate Google’s own guidelines?

  • László Balogh

    Here is my server-side rendering strategy: I divide the users into two groups, based on whether or not their browsers natively support the APIs my app is using. The former group gets server-rendered pages with JS, which progressively enhances the UX. The latter gets server-rendered pages only. No polyfills required; everything that can render HTML is supported, no matter what. Eventually the browsers that haven't implemented the newest APIs yet will catch up, and then their users will get the JS-enhanced version automatically.

  • Mehdi Raash

    Just looking at it politically: Google itself, on the other side, is shaping how people see developer tools by making such fantastic videos. Agreed?

  • João Pedro Balieiro da Costa

    Why not use server-side rendering under a Progressive Enhancement approach? I mean, if the point is to display content before the user agent is able to run JavaScript, it makes sense to me that the content should also be interactive at that point. The user should be able to use the site the old way until JavaScript is ready to take control. And no UA sniffing! 🙂
