
Some thoughts about programming for the "Mobile Web"...

I recently implemented a new feature called Cache Profiles into Barebones CMS. I'm a bit stuck on the documentation because I've had to stop and figure out how I'm going to create a mobile-friendly version of the website. I eat my own dog food after all. Om nom nom.

The concept of a "mobile-friendly website" is foreign to some people. A lot of people think, "Our main website displays exactly the same on the desktop, iOS, Android, etc., and therefore it is mobile!" That is NOT a mobile-friendly website. If a user has to pinch-to-zoom or scroll horizontally at all to read the content, you have a desktop-only website. While desktop websites do display on most modern mobile devices, they are NOT mobile-friendly and users will hate you.

Actually, that is a pretty good definition of a mobile-friendly website:

"A mobile-friendly website is one that displays on small screens such as smartphones, scrolls only in the vertical direction, and is clearly readable without requiring pinch-to-zoom or similar features."

Anyway, back to the problem at hand. I have made mobile-friendly websites before, so this is old hat for me. My problem is coming up with a strategy that is universal and makes web developers' lives simpler. With the new "tablet" devices running "mobile" OSes, it is no longer a matter of making just two designs for a single website. We've suddenly got a lot of web developers stressing out about designing for desktop, phone, AND tablet, which has some of them asking important questions: "What's next? How many different designs will I have to make to satisfy all these newfangled devices?"

Which brings me back to Barebones CMS. The work I've done thus far on mobile web development involves, for all intents and purposes, user-agent detection. However, if I stuff user-agent detection into the caching subsystem of Barebones CMS, the performance of the cache will eventually suffer. User-agent detection has always been, and will always be, a very bad idea. There is a project called WURFL that detects any mobile handset and returns its feature set, which is pretty much as extreme as you can get. The download is, unfortunately, several megabytes, and it isn't exactly Speedy Gonzales. In the last month alone, the download size of WURFL increased by 900KB. And that is compressed data. This situation is only going to get worse. There has to be a better way.

My initial thought is to check on the server whether a specific cookie is present. If it isn't, serve up an alternate page that runs a series of tests, sets a couple of cookies, and then redirects the user back to the content. The test page would mostly be Javascript that runs three primary tests: determine the actual browser screen size, the connection speed, and "touch" capability. Maybe also detect Flash support or whatever other anonymous features might be useful to know from the server's perspective so the correct content gets served up. If Javascript is not available, 'noscript' tags could load an image that sets the cookie - maybe fall back to user-agent detection - or just tell the user to enable Javascript instead of being one of those few people who disable it. The 'noscript' response really depends on how much the site depends on Javascript. If cookies are not available, have the page itself meta-refresh to an alternate URL after 'x' seconds pass. The page should also have a link that does the same thing. The important thing here is to not get stuck in an infinite loop: attempt to detect and, on failure, display the default layout and content (usually the desktop version).
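
Something like the following Javascript could drive that test page. It is only a rough sketch of the idea above - the cookie names, the speed-test image, the two-second threshold, and the 'return' query parameter are all made up for illustration and aren't part of Barebones CMS.

    (function() {
        function setCookie(name, value) {
            // Persist for about a year so the tests only run once per device.
            var expires = new Date();
            expires.setFullYear(expires.getFullYear() + 1);
            document.cookie = name + "=" + encodeURIComponent(value) + "; expires=" + expires.toUTCString() + "; path=/";
        }

        // Test 1:  Actual browser screen size.
        setCookie("bb_screen", screen.width + "x" + screen.height);

        // Test 2:  "Touch" capability (crude, but good enough for a first pass).
        setCookie("bb_touch", ("ontouchstart" in window) ? "1" : "0");

        // Test 3:  Rough connection speed - time the download of a small image.
        var start = new Date().getTime();
        var img = new Image();
        img.onload = function() {
            var ms = new Date().getTime() - start;
            setCookie("bb_slow", (ms > 2000) ? "1" : "0");

            // Send the user back to the content they originally asked for.
            var dest = location.search.match(/[?&]return=([^&]+)/);
            location.replace(dest ? decodeURIComponent(dest[1]) : "/");
        };
        img.src = "/speedtest.jpg?" + start;  // Cache-buster.
    })();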

The basic idea is to attempt to determine which layout to display regardless of device. Let's say I'm on a desktop but on a super-slow dialup connection (yes, those still exist). Instead of being a jerk and serving up the painfully slow desktop version, how about serving up the lighter-weight mobile edition? With user-agent detection alone, you can't do that, but with my basic idea, you theoretically could. This is another reason user-agent detection just won't work.

Edit: After thinking about it some more, things like Googlebot and web scrapers would make my idea a nightmare to implement. Googlebot would get SO confused. Ack. Maybe serve up the default layout as-is, add the Javascript "solution" into the header of the default layout, and serve up the mobile version to Javascript-enabled devices by reloading the page. Server-side user-agent detection is soooo hacky, but maybe use something there too to avoid loading content twice.

Edit #2: And after thinking about this a LOT more, I've worked out what I'm going to do. First, all browsers go to the main content. If Javascript and cookies are enabled, set some global that I can use later. Also check to see if a specific cookie is set. If it isn't, determine whether the user's screen width is narrower than some set limit. If it is, set the cookie to mobile and reload the page. Offer all users with Javascript and cookies enabled the option to switch between mobile and regular views. The server checks for the specific cookie and, if it exists, loads the appropriate cache profile. Those with Javascript and/or cookies disabled get served the default cache profile without the option to switch to the mobile site. This approach makes it possible to serve content to mobile devices and stay relatively future-proof (i.e. without browser sniffing). The downside is that there are a couple of extra requests to the server for the initial mobile connection, but the user likely won't notice. I personally like this approach because it will allow me to test it easily with a plain-ol' web browser without having to fake a user-agent string.
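
For what it's worth, here is a minimal sketch of that approach as a snippet in the default layout's header. The 'bb_profile' cookie name, the 768 pixel cutoff, and the helper functions are assumptions for illustration, not actual Barebones CMS code.

    (function() {
        function setCookie(name, value) {
            var expires = new Date();
            expires.setFullYear(expires.getFullYear() + 1);
            document.cookie = name + "=" + value + "; expires=" + expires.toUTCString() + "; path=/";
        }
        function getCookie(name) {
            var match = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
            return (match ? match[1] : null);
        }

        // Make sure cookies actually stick before relying on them.
        setCookie("bb_test", "1");
        if (getCookie("bb_test") === null)  return;

        // Javascript and cookies both work, so the view switcher can be shown later.
        window.bb_canswitch = true;

        // First visit:  no profile cookie yet, so pick one based on screen width and reload if needed.
        if (getCookie("bb_profile") === null) {
            setCookie("bb_profile", (screen.width < 768) ? "mobile" : "default");
            if (getCookie("bb_profile") === "mobile")  location.reload();
        }

        // Hook this up to a "switch views" link somewhere in the layout.
        window.bb_switchprofile = function(profile) {
            setCookie("bb_profile", profile);
            location.reload();
        };
    })();

The server side then just keys the cache profile off that same cookie and falls back to the default profile whenever the cookie isn't there.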

Alright. I'm done with my thoughts.
