I see that many websites offer the "classical" site for the usual PC/laptop/netbook browsers at, say, "www.XYZ.com", and mobile-optimized versions at "m.XYZ.com" or "mob.XYZ.com" or such. Is this due to a lack of reliable browser recognition? Or is it a lack of knowledge on the part of their programmers and web designers?
This is an old strategy: rudimentary device/browser detection plus two completely separate websites, one for PC and one for mobile. It's nightmarish for content management, since you have essentially two parallel sites to maintain. Worse, administrators would strip down the PC site for mobile, so mobile users got a crappy, incomplete version of the site.
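For illustration, that old pattern boils down to something like the following minimal WSGI sketch. The regex, the hostnames, and the app itself are hypothetical, not any real site's code:

```python
import re

# Crude user-agent sniffing: a few common mobile markers (illustrative, not exhaustive).
MOBILE_UA = re.compile(r"Mobile|Android|iPhone|iPad|BlackBerry|Opera Mini", re.I)

def redirect_app(environ, start_response):
    """Old-school approach: bounce mobile user agents to the parallel m. site."""
    ua = environ.get("HTTP_USER_AGENT", "")
    if MOBILE_UA.search(ua):
        location = "http://m.XYZ.com" + environ.get("PATH_INFO", "/")
        start_response("302 Found", [("Location", location)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>desktop site</html>"]
```

Note that every piece of content then has to exist twice, once under each hostname, which is exactly the maintenance problem described above.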
The objective today is to provide as close to the same content as possible on both, with as little duplication as possible. Those were the goals of approaches like Progressive Enhancement and Responsive Design, but they push so much processing to the client side. I'm already seeing articles announcing the death of Progressive Enhancement.
I figure a good compromise, one that balances the solid footing of server-side processing, minimal duplication, and unlimited per-device formatting, is good old templating and dynamic generation. Serve it all fully prepared and don't rely on the device too much.
But then user-agent sniffing is key.
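A minimal sketch of that compromise, with everything hypothetical (the templates, the one-line `classify` heuristic, the two device classes): one content source, and the server picks the presentation based on the sniff, so no duplication and no client-side work.

```python
from string import Template

# One content source, two presentation templates (illustrative markup only).
TEMPLATES = {
    "desktop": Template("<html><body><h1>$title</h1><div class='wide'>$body</div></body></html>"),
    "mobile":  Template("<html><body><h2>$title</h2><p>$body</p></body></html>"),
}

def classify(user_agent):
    """Crude server-side sniffing: map a user agent to a device class."""
    return "mobile" if "Mobile" in user_agent else "desktop"

def render(user_agent, title, body):
    """Same content, device-specific template, all prepared server-side."""
    return TEMPLATES[classify(user_agent)].substitute(title=title, body=body)
```

In practice you'd swap the one-line `classify` for a maintained device-detection library and the `string.Template` calls for a real template engine, but the shape of the approach is the same.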
Or perhaps the Perl world has developed a much better idea, in which case I'd love to read about it.