At times I think “will anyone read this, does anyone care?”, but I always publish it anyway — and that’s for two reasons. First, it’s a place for me to find stuff I may have forgotten how to do. Secondly, whilst some of this stuff is seemingly super-niche, if one person out there on the web finds it helpful, then that’s good enough for me. After all, I’ve lost count of how many times I’ve read similar posts that have helped me out.
We need to be able to candidly and thoughtfully talk about technology without people assuming we’re calling that technology and the people who create or use it garbage. We have to be mindful of the community we’re fostering, and encourage people to discuss all this stuff freely without getting dog-piled.
The Tarot Cards of Tech are a set of provocations designed to help creators more fully consider the impact of technology.
Some important questions here to answer before building any new piece of technology.
Harry’s been doing a lot of digging into third-party performance as part of his work, and he was kind enough to share his process with the rest of us. Lots of good ideas here!
What a great overview of how NerdWallet made their site significantly faster (6s faster time to visually complete as measured on a 3G network) in the last 6 months or so.
Surma talks about the different ways of forcing an element onto its own layer, and—most interestingly—the side-effects of each.
As per his usual, good advice from Zach about loading fonts: this time, advising not to pair font-display: optional with preload.
Wonderful walk-through of just how much data you’re sharing when you do something as simple as ordering a pizza and watching a movie.
Jessica was recently in South Africa for PixelUp (an event I keep hearing good things about) and wrote about her experience going on a safari and what sounds like an incredible encounter with an upset elephant.
The writing here is so good it makes me want to tear up everything I’ve ever written and try again.
Mr. Calvano with a really good explanation of HTTP Caching. Particularly interesting to me was the bit about the heuristic cache limits of different browsers—something I had never really dug into.
In the Chromium source as well as the WebKit source, we can see that the lifetimes.freshness variable is set to 10% of the time since the object was last modified (provided that a Last-Modified header was returned). The 10%-since-last-modified heuristic is the same as the suggestion from the RFC. When we look at the Firefox source code, we can see that the same calculation is used; however, Firefox uses the std::min() C++ function to take the lesser of the 10% calculation or one week.
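To make that heuristic concrete, here’s a minimal sketch (in TypeScript, with illustrative names rather than actual browser code) of how a cache might compute heuristic freshness from a Last-Modified header, including the optional Firefox-style one-week cap:

```typescript
// A minimal sketch of the heuristic freshness calculation described above.
// The function name and parameters are illustrative, not real browser APIs.
// It only applies when a response has a Last-Modified header but no explicit
// freshness information (no Cache-Control: max-age and no Expires).

const ONE_WEEK_MS = 7 * 24 * 60 * 60 * 1000;

function heuristicFreshnessMs(
  responseTime: Date,    // when the response was received
  lastModified: Date,    // value of the Last-Modified header
  capAtOneWeek: boolean, // Firefox caps the result at one week; Chromium/WebKit don't
): number {
  // 10% of the time since the resource was last modified, as suggested by the RFC.
  const tenPercent = 0.1 * (responseTime.getTime() - lastModified.getTime());
  return capAtOneWeek ? Math.min(tenPercent, ONE_WEEK_MS) : tenPercent;
}

// Example: a resource last modified 30 days before it was fetched would be
// treated as fresh for roughly 3 days before the cache revalidates it.
```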