Tim Kadlec - Site Performance

Running time - 25:45 mins
Episode: #3

Transcript


JM: Welcome! This is the third interview on Web Payload, and today's episode is all about web performance. We have just the man for the job: it's Tim Kadlec. Welcome, Tim.

TK: Hey John, happy to be talking about it.

JM: Can you tell us a little about yourself, your work and what you like to do in your playtime?

TK: Sure. Playtime... I have three daughters, so playtime is at a definite minimum. I work as a developer and consultant in Northern Wisconsin, in a town of about 2,000 people, working with a variety of clients building sites that work across multiple devices, platforms and browsers, with a heavy emphasis on performance. I also do a bit of training, and I wrote a book called Implementing Responsive Design about how to incorporate responsive design into your workflow and your process from start to finish.

JM: Great. So this is all about performance; let's make the case: why is it so important?

TK: There are a lot of statistics showing that performance has a direct effect on a company's key business metrics. Amazon found that when they decreased load time by 100 milliseconds they got a 1% increase in revenue, which equates to about $157 million in additional revenue each quarter, just for a 100 millisecond decrease in page load time. That's one of the more prominent studies I've cited, but there's everyone from Mozilla, who found that every few seconds shaved off their load time increased their conversions, resulting in millions more downloads of their browser, to Facebook and Google, who have done a lot of studies around performance. Just about any business metric has been tied to performance, from bounce rates to page views to time spent on site. It's not really that surprising if we think about it, because the web is a very interactive medium: scrolling up and down pages, clicking links, submitting forms, hitting a button. These are all interactions, and they define the experience a person has with a site or application. When you look at it from that perspective, it shouldn't surprise us that performance has a sizeable impact on how effective a site or application is.

JM: Yep, it's really important, there's no question. Can you give some examples of heavy, badly performing sites you've seen?

TK: Well, you don't have to look very far. The average site is now 1.5MB, and there are a ton of them out there taking it to an absurd degree. I know Oakley gets thrown around a lot: when their site came out it was massively, ridiculously sized, and they have shrunk it down; it's now about 13MB. It took a lot of hard work to get to that size, but even that is really massive. Unfortunately you can fire up just about any site that's just launched and, if you inspect it, you're probably going to run into some performance issues before too long. I try not to single out sites, because there may be constraints we're not aware of from the outside. It's easy to throw stones and point fingers at these heavy sites, but unfortunately it's often a condition of the environment they're working in that forces these kinds of issues to happen.

JM: Can you give us some of the biggest things to watch out for when we're talking about performance, and some of the big, bad errors?

TK: Some of them are very simple. Steve Souders came up with a list of rules a few years back that became the basis for [YSlow](http://developer.yahoo.com/yslow/), which was one of the very first good tools out there for analysing performance and telling you some of the things to fix. So there's a lot of low-hanging fruit, simple things you can do to bring the size down. The one that never fails to amaze me is compressing images. There are two types of compression, lossy and lossless. Lossless means there's no visual impact on the image at all; it's just shaving wasted bytes. Even just implementing that can have a huge impact on some of these sites. I was looking at a site the other day and that alone saved 200-300KB, and there's not that much to do: there are little drag-and-drop applications like ImageOptim, and you can easily automate it from the terminal. Then there's making sure you have gzip enabled, which can seem a little scary to some people because they have to deal with an .htaccess file, but there's a ready-made file on GitHub where someone has already figured all of that out for you. And minifying your JavaScript files and combining them to reduce requests. These are all things that are pretty easy to do but frequently aren't done, unfortunately, and they can have a sizeable impact on page load time.
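To make the gzip step concrete, here's a minimal sketch of what that .htaccess configuration can look like with Apache's mod_deflate. The ready-made file Tim mentions is likely the HTML5 Boilerplate .htaccess, which covers many more content types and edge cases than this:

```apache
# Compress text-based responses before sending them down the wire.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
  AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```

Note that JPEG and PNG files are already compressed formats, which is why the image savings come from tools like ImageOptim rather than from gzip.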

JM: There are a lot of things you can do, and a lot of it is pretty straightforward. So, you gave a fantastic talk over in Germany, which I wasn't lucky enough to see, where you talked about performance and in particular planning for it from the beginning. Can you go into that, please?

TK: Yeah, I guess I alluded to it when I said I don't like to point fingers at sites, because you don't know what's going on behind the scenes and why those decisions had to be made. One of the things I routinely find is that it's cultural, not technical. We have these lists and guidelines for the technical things that can be done, but when we're working on these projects, performance doesn't get emphasised enough from the very beginning. Unfortunately it's often only after the product has come out, when suddenly everything is slow, that you have to address it. Throughout the design and development process we often don't see those issues, because we're looking at the site on fantastic machines over incredibly high-speed connections, and a lot of those problems get masked. So I think if we really want to start to reverse this trend of increasing page weight, we have to put an emphasis on performance from the very beginning, with everyone on board, so that performance issues are caught prior to launch. That way you can hit those issues before users ever have to deal with them, and that's only going to happen if we incorporate it into the process from start to finish.

JM: One thing to watch for, I guess, is that we're not only talking about file sizes, although they are important. A 2KB JavaScript file can cause really big problems; think about infinite loops in the worst-case scenario. What are some of the things to watch out for when we're talking about frame rates and paint times within browsers?

TK: Yeah, you're right, it's not just about file sizes. In fact I just saw some data from Etsy the other day, although I haven't had a chance to dig through it, where they found that rendering performance had an even larger effect on engagement than load time did. So it's not just file size and getting that load time down; it's rendering and scroll performance too. For those kinds of things, the areas to watch are drop shadows and gradients, especially when you pair them up together. They can slow a page to a crawl when you're scrolling through, or if there's any kind of animation. So make sure you're testing all of that, because it can have as large an effect, if not a larger one.
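As a hypothetical illustration of the styling he's describing (the class name and values here are made up), a rule like this was a notorious scroll-performance killer in browsers of this era:

```css
/* Each of these effects is cheap on its own; stacked together they can
   force the browser to re-rasterise the element on every scroll frame. */
.card {
  background: linear-gradient(to bottom, #ffffff, #dddddd);
  box-shadow: 0 4px 12px rgba(0, 0, 0, 0.5);
  border-radius: 8px;
}
```

A common workaround at the time was to swap the gradient for a solid colour or a pre-rendered background image on scroll-heavy pages.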

JM: So how would we go about testing some of this stuff on mobile devices, getting an accurate connection speed, and bits and pieces like that? What are some of the tools we can use?

TK: The key, first off, is to use a real device and a real browser as early in the process as you can. Emulators can miss a lot and just mask things, and if you're just resizing Chrome you'll miss a lot of the issues you're going to run into as well. Once you have those real devices fired up, from the load-time perspective use something like Slowy (well worth it), which is a cheap $5 app that will simulate a slow DSL connection or whatever you want. My preferred tool is Charles Proxy, which works on Windows or Mac. It has throttling capabilities, so you can set presets for, say, a 3G or EDGE network and choose your bandwidth and latency measurements from that, but you can also inspect traffic on a request-by-request basis, so you can see exactly what's being sent down the pipe, how long that's taking and where it's coming from, which really helps pinpoint a lot of those issues when you're using it with a phone paired up to the proxy. You can monitor all that traffic and see what's happening there. So from that perspective I really like to use those tools; I think it's really important to put that into the process and make experiencing that slow connection part of the deal. From the rendering-performance side of things, the tools are unfortunately a little lacking. That's going to be firing the site up on these devices and seeing how it feels. The tools are getting better: Chrome lets you do a little with frames per second, and I think Mozilla has something going on, as does Internet Explorer 11, but for the most part those tools are somewhat primitive. So those things will help, but for mobile and other devices, just play around with it: use the sites and applications and watch for the issues to rear their heads.

JM: So, talking about images and responsive images: originally we were serving big images and setting a max-width. Can you go into some of the other solutions?

TK: So that's how it was first done: width or max-width set to 100%, which makes the browser do the scaling of the image. But what we found pretty quickly, and I think it was only a few weeks after, was that Jason Grigsby wrote a post with a demo site showing that if you served appropriately sized images you would save about 78%, which is about 160KB I think. So it was quickly determined that we need a solution to serve differently sized images, not just the same image that the browser scales. I did an experiment fairly recently with a tool called Sizer-Soze. It turns out there's a cultural reference there (JM: this would be the mythical character Keyser Söze from the film The Usual Suspects; great film, well worth a watch) which went right over my head. Basically what it lets you do is fire up a site and see, at different resolutions, how much you could save if you served appropriately sized images versus massive images that have just been scaled down. What I found was that 72% of the image weight could be ditched just by using a responsive images technique. That number may not be 100% accurate, as it's hard to account for every situation: in some cases you might need to account for in-betweens, or images hidden in one location and displayed in another, so there's a whole bunch of constraints that might adjust it, but it's fairly in line with the 60-70% reductions in page weight I see on the projects I work on. So it's a huge improvement, but it's a big problem, and people have been fighting over what the correct solution is going to be. There are a couple of standards duking it out, picture and srcset, which seem to have the most behind them. srcset has been partially implemented in WebKit from a resolution perspective, for handling high-resolution displays. I think they can actually work pretty well together, in conjunction, and there's enough steam behind both of them that I think we could see that happen, but there's still a lot to figure out. There's also [Client Hints](https://github.com/igrigorik/http-client-hints), a server-side solution, which may offer a bit of potential and power there. At the moment, whatever you choose is going to be a trade-off; there's always going to be something you give up, no matter what solution you pick. I tend to lean towards Picturefill, which is a polyfill, although the thing it polyfills doesn't really exist natively yet: it basically lets you use the picture element today in a browser, and you can actually pair it with srcset; there's a branch on GitHub. That's the one that has worked the best for me. The trade-off is that, because it's a JavaScript solution, the browser doesn't get to prefetch the image, which can slow things down just a little, but usually the file savings are worth it.
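For readers who haven't seen the two proposals, here's a rough sketch of how the proposed picture element and srcset could work together, roughly what Picturefill emulates today; the filenames and breakpoints are made up:

```html
<picture>
  <!-- Wide viewports get the large image; the 2x candidate targets
       high-resolution displays, the part of srcset WebKit has shipped. -->
  <source media="(min-width: 48em)" srcset="hero-large.jpg 1x, hero-large@2x.jpg 2x">
  <source media="(min-width: 30em)" srcset="hero-medium.jpg">
  <!-- Fallback for browsers that understand none of the above. -->
  <img src="hero-small.jpg" alt="Hero image">
</picture>
```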

JM: Yeah, that's my experience: these solutions aren't perfect, and Picturefill seems to be the one that's got the most behind it.

TK: Yeah, whichever one you end up choosing right now, it's probably a good idea, if it's in a CMS, to abstract that component off into its own include somewhere. That way, if you're using picture today and it turns out picture isn't going anywhere and everything ends up being done with srcset, you can just change that little component, that little include, and all of your images throughout the site get updated. Committing to one solution and hard-coding it into the markup on every single page is probably not the way to handle it.
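A hypothetical sketch of that abstraction in a PHP-based CMS (the function name, filenames and breakpoint are all made up): every template calls this one helper, so swapping Picturefill markup for native picture/srcset later means editing a single file.

```php
<?php
// responsive-image.php - the only file that knows which responsive
// images technique the site uses. This version emits Picturefill's
// span-based markup; swap the body for <picture>/srcset later.
function responsive_image($small, $large, $alt) {
  $alt = htmlspecialchars($alt);
  return '<span data-picture data-alt="' . $alt . '">'
       .   '<span data-src="' . $small . '"></span>'
       .   '<span data-src="' . $large . '" data-media="(min-width: 48em)"></span>'
       .   '<noscript><img src="' . $small . '" alt="' . $alt . '"></noscript>'
       . '</span>';
}
```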

JM: We talked a little bit about JavaScript; how does lazy loading fit into performance, and what are your thoughts on that?

TK: It can be really powerful, but there are a couple of things to watch out for. On a mobile device it's really expensive, from both a time perspective and a battery perspective, to make a connection to the mobile network, so when you're lazy loading you need to be careful that you're not continually reopening those connections and quickly draining a person's battery. It makes sense for getting the initial page weight down so the page appears quickly, but for anything you lazy load that you're pretty confident people are going to end up needing, I'd recommend lazy loading it quickly after page load so you don't need to make another connection later. It's powerful for shaving weight off, for sure. And while it's not technically lazy loading, if there's an image that you're not displaying on a small screen but will display on a large screen on a responsive site, don't load that image until you absolutely need it at those breakpoints, because otherwise it's going to get downloaded anyway if it's in the source code. So there are a lot of different ways it can be used to help, and there are some standards coming down the pipe too: there are proposals to defer the loading of resources, and they're experimenting with a "lazy" attribute, although I'm not sure that's the exact attribute name, or something to that extent, to be able to define that a resource gets loaded after the initial load.
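A minimal sketch of the breakpoint-gated image loading he describes, assuming the markup holds the large image's URL in a made-up data-wide-src attribute so the browser never downloads it on small screens:

```js
// Only request the large images once the viewport crosses the wide
// breakpoint; until then the data attribute keeps them out of the
// browser's preloader entirely.
function loadWideImages() {
  var images = document.querySelectorAll('img[data-wide-src]');
  for (var i = 0; i < images.length; i++) {
    if (!images[i].getAttribute('src')) {
      images[i].src = images[i].getAttribute('data-wide-src');
    }
  }
}

var wide = window.matchMedia('(min-width: 48em)');
if (wide.matches) { loadWideImages(); }
wide.addListener(function (mq) {  // fires when the breakpoint is crossed
  if (mq.matches) { loadWideImages(); }
});
```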

JM: I didn't know about that; it sounds really exciting. So, in terms of finances, when we're talking about performance and mobile data plans: I know I was in Germany a few years ago, really just checking emails, and I got a bill for £300 ($500), and it was nothing spectacular I was doing on the internet. Can you talk a little about that?

TK: Yeah, so you're getting into data roaming there, and that stuff can get heavy. With that Oakley site, when it went up, the designer Andy Clarke did the math and figured out that it would cost him about $780 in roaming charges to look at it on his phone; it would have to be a pretty awesome site to be worth that. Ronan Cremin wrote a post on mobiForge where he analysed how performance, as he puts it, impacts the end user's wallet. He looked at a couple of sites and tried to figure out how much they would cost loaded over a roaming network: The Next Web was $44 for a single page, and Vogue was $65 per page. So the comparison I made with Vogue is that you can pay $65 for one page of their site, or you can buy 15 copies of their magazine. It's a little absurd how hard that can hit people.
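For context on how a single page reaches those prices, the arithmetic is just page weight multiplied by the roaming rate. The figures below are made-up round numbers for illustration, not the ones from Cremin's post:

cost = page weight × roaming rate, e.g. 4.3MB × $15/MB ≈ $65 for one page load.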

JM: Going back to JavaScript, what percentage of users do you see without JavaScript, and why is it so important to focus on that?

TK: That's a tough metric to get an accurate representation of. I think there are two big arguments right now for considering what happens when there's no JavaScript. The first, and I can't remember who said it, is that basically every user is without JavaScript while all that script is still downloading, which can take quite a bit of time on a slow network depending on how much weight you have. The other is that there are a lot of sites, even content sites that have absolutely no reason to be dependent on JavaScript at all, that use it to load their content in some kind of flashy way, and all it takes is one little bug in that JavaScript for the site to be rendered entirely useless. So considering what happens when there's no JavaScript is, to me, about creating a more robust experience. When you consider what happens in those less-than-ideal situations, you're creating something that's more likely to stand up, perform well and be usable on any number of devices and browsers. With the unpredictability of this platform and how quickly things move, opting for robustness is always a winning solution in my opinion.
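A tiny sketch of that robustness principle (the names here are made up): the content is plain HTML that works with no JavaScript at all, and the script only layers behaviour on top, so a bug in it can't take the content down:

```html
<article id="story">
  <h1>Headline</h1>
  <p>Readable before, without, or despite any JavaScript.</p>
</article>
<script>
  // Hypothetical enhancement: tag the article so CSS can style the
  // enhanced state. If this script never runs, nothing is lost.
  var story = document.getElementById('story');
  if (story) {
    story.className += ' enhanced';
  }
</script>
```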

JM: Yeah, absolutely. So let's talk a little about your book. Can you give an overview of what it's about?

TK: Yeah, so, Implementing Responsive Design. I read Ethan's book and it's amazing; Ethan's is the book to start with, for sure. What I wanted to do was discuss how responsive design impacts the other parts of the process: not just fluid images, media queries and fluid layouts, but how it affects how you work on a project, what your workflow is like, what content considerations there are going to be, and what other enhancements you can make to those responsive sites. So that's what the book is. It starts off with the three cornerstones, fluid layouts, fluid images and media queries, but from there it looks at how you can communicate this to a client, what kinds of deliverables you may want to use, what you have to do from a content strategy perspective, and how you can pair it with some server-side optimisations to create a RESS-based solution that gives you a little more power; it sort of deep-dives into the related topics responsive design impacts. The idea is that when you're done with the book you have a good idea of, and a starting point for, all these different areas, and you can go out and explore topics like content strategy in deeper detail by following up with Karen McGrane's and Sara Wachter-Boettcher's books. It gives you a good fundamental base to work from for all of this.

JM: Sounds great. I have to be honest and say I haven't read it, but it's on the wish list and I'll get onto it.

TK: I appreciate it; let me know how you like it.

JM: I will do, Tim, I will do. So, you organised a conference, and you've also written this book; how on earth do you find the time for this stuff?

TK: I don't sleep. I don't do the conference anymore; that was Breaking Development, which I worked on with Jeff Frost and a bunch of other guys, and they're all running it now and doing an awesome job. It's a lot of fun between that and the book and writing and coding; that's why I said I don't have much playtime. Thankfully I really enjoy what I do, so this is fun for me. Just last night, for example, I got into one of those awesome grooves where you're just cranking out code and you're like, man, this is fantastic. It helps to be passionate and excited about what you're working on; that's what gets you through it.

JM: Yep, we're in a great industry; it's really, really good fun. I kind of see my job as my hobby, and it's great.

TK: Exactly

JM: So Tim how do people keep up with you and get your book?

TK: Sure, so you can get the book at implementingresponsivedesign.com, which links off to Amazon, Peachpit, Barnes & Noble, the whole thing. To keep up with me, I guess the best way is either my site, timkadlec.com, or @tkadlec on Twitter.

JM: Thanks, Tim, I really appreciate it.

TK: Thanks John.
