Web dev: What browser is visiting the page?
User agent string:
[Image: a screenshot of a browser. The URL bar reads firefox://settings, a button on the URL bar is labelled Netscape, a popup from the button reads “You’re viewing a secure Opera page”, and the web page title reads “Chrome settings”.]
Functionally useless. With the web standardized, we shouldn’t need user agents anyway. It would be more beneficial to ask “do you support X, Y, and Z?”
It’s called feature detection and it goes a long way back, even before Modernizr popularized it.
Popularized? That gets less than 100k downloads a week
Most developers just write their own feature checks (a lot of detections are just a single line of code) or use a library that polyfills the feature if it’s missing.
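For instance, something like this (a minimal sketch using standard web APIs, nothing library-specific):

```typescript
// Hand-rolled feature detection, in the spirit of what Modernizr automates.
// Many detections really are one-liners:
const hasLocalStorage = typeof window.localStorage !== "undefined";
const hasIntersectionObserver = "IntersectionObserver" in window;
const hasWebP = document
  .createElement("canvas")
  .toDataURL("image/webp")
  .startsWith("data:image/webp"); // browsers without WebP fall back to PNG here

// Branch on capability, not on the user agent string:
if (hasIntersectionObserver) {
  // lazy-load images with IntersectionObserver
} else {
  // fall back to eager loading
}
```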
The person you’re replying to is right, though. Modernizr popularized this approach. It predates npm, and npm still isn’t their main distribution method, so the npm download numbers don’t mean anything.
Neat, thanks for clarifying! I’ve never heard of it
It used to be huge.
That’s exactly what you’re supposed to do with the modern web, via feature detection and client hints.
The user agent in Chrome (and I think Firefox too) is “frozen” now, meaning most of its details (OS version, minor browser version) are locked to fixed values and no longer updated.
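If you haven’t seen the client-hints side of this, it looks roughly like the sketch below. Note that navigator.userAgentData is currently Chromium-only (Firefox and Safari don’t ship it), so you have to feature-detect it too:

```typescript
// Browser-side client hints via navigator.userAgentData (Chromium-only today).
const uaData = (navigator as any).userAgentData;
if (uaData) {
  console.log(uaData.brands); // e.g. [{ brand: "Chromium", version: "124" }, ...]
  console.log(uaData.mobile); // boolean
  // High-entropy values must be requested explicitly (and may be refused):
  uaData
    .getHighEntropyValues(["platform", "platformVersion"])
    .then((values: Record<string, string>) => console.log(values));
} else {
  // Fall back to the (frozen) User-Agent string or plain feature detection.
}
```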
YouTube currently does not work on Firefox (and hasn’t for weeks) unless you use a Firefox user agent. Google doing sketchy things again.
I’ve not run into this issue, and I use Firefox exclusively with uBlock Origin.
I use Charmeleon, with the effects described above.
So you don’t use Firefox, you mess with Firefox. That’s on you then. Devs can’t be held responsible for you intentionally breaking things. Only do what you know works.
YouTube works fine on Firefox…
It works fine?
What works? YT on Firefox or YT on Firefox when the user agent is changed?
Both. I use YT on Firefox constantly, and I just explicitly tried again with a swapped user agent: no issues at all, works perfectly as expected. I saw from your other reply that you use a fairly involved extension that heavily modifies the browser, not just a user agent switcher.
If you try to “harden” your FF, always keep in mind that a large portion of that means absolutely breaking things left, right, and center. It might work, but always expect that it won’t, because it’s just not something anybody would ever test for when creating web pages. You’re essentially running unknown scenarios. It might be interesting input for the extension author that this breaks, though; it might be something they thought they had working. Of course, it could also be “yeah, that happens, it’s intentional”. But you might as well report it to them.
Uh… I use LibreWolf, which forces a Chrome + Windows user agent, and it’s totally fine?
Then Charmeleon must change more than just the user agent.
? I just tested and it worked fine.
Again? Did they stop anytime recently?
User agents are useful for checking if the request was made by a (legitimate self-identifying) bot, such as Googlebot.
It could also be used in some specific scenarios where you control the client and want to easily identify your client traffic in request logs.
Or maybe you offer downloads on your site and want to reorder the list to highlight the binary that most likely matches the platform in the user agent.
There are plenty of reasonable uses for user agent that have nothing to do with feature detection.
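The download case, for example, can be a deliberately coarse guess, since a miss costs nothing; the full list stays visible either way. A sketch (not a robust UA parser):

```typescript
// Guess the visitor's OS from the User-Agent to preselect a download.
type Platform = "windows" | "macos" | "linux" | "unknown";

function guessPlatform(ua: string): Platform {
  if (/Windows NT/.test(ua)) return "windows";
  if (/Mac OS X/.test(ua)) return "macos";
  if (/Linux/.test(ua)) return "linux";
  return "unknown";
}

// e.g. move the matching binary to the top of the download list:
const preferred = guessPlatform(navigator.userAgent);
```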
Aren’t user agents just a plain text header? Couldn’t a malicious agent just spoof a legitimate one?
That’s correct, it is just plain text and it can easily be spoofed. You should never perform an auth check of any kind with the user agent.
In the above examples, it wouldn’t really matter if someone spoofed the header as there generally isn’t a benefit to the malicious agent.
Where some sites get into trouble though is if they have an implicit auth check using user agents. An example could be a paywalled recipe site. They want the recipe to be indexed by Google. If I spoof my user agent to be Googlebot, I’ll get to view the recipe content they want indexed, bypassing the paywall.
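And the spoof really is just one header. For illustration (hypothetical URL; run under Node 18+, since a browser page can’t override its own User-Agent):

```typescript
// Pretend to be Googlebot with a single header.
const res = await fetch("https://recipe-site.example/some-recipe", {
  headers: {
    "User-Agent":
      "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
  },
});
console.log(res.status, await res.text());
```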
But, an example of a more reasonable use for checking user agent strings for bots might be regional redirects. If a new user comes to my site, maybe I want to redirect to a localized version at a different URL based on their country. However, I probably don’t want to do that if the agent is a bot, since the bot might be indexing a given URL from anywhere. If someone spoofed their user agent and they aren’t redirected, no big deal.
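A rough sketch of that redirect logic, assuming Express and a CDN-provided geo header (Cloudflare’s CF-IPCountry used here as a stand-in):

```typescript
import express from "express";

const app = express();
// Deliberately loose: missing a bot just means one stray redirect,
// and a spoofer who skips the redirect gains nothing.
const BOT_UA = /bot|crawler|spider/i;

app.use((req, res, next) => {
  const ua = req.get("user-agent") ?? "";
  const country = req.get("cf-ipcountry"); // geo header set by the CDN
  if (!BOT_UA.test(ua) && country === "DE" && !req.path.startsWith("/de/")) {
    return res.redirect(302, `/de${req.path}`);
  }
  next();
});
```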
Web UI for touch screens is a lot different from keyboard and mouse. I still switch to desktop most of the time, though, because the mobile site will lack critical info. They “have” to streamline the experience for mobile, but I hate it when they fully remove features.
You have to use user agents to fool scummy websites into thinking that you’re using chrome or edge.
Lazy web developers or clueless managers have entered the chat
User agents are essentially deprecated and are going to become less and less useful over time. The replacement is either client hints or feature detection, depending on what you’re using it for.
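On the server side, the client-hints handshake looks roughly like this (a sketch with Express; the Sec-CH-UA header family is the real one, sent by Chromium-based browsers only for now):

```typescript
import express from "express";

const app = express();

app.get("/", (req, res) => {
  // Low-entropy hints arrive by default from Chromium browsers:
  const brands = req.get("sec-ch-ua");          // e.g. '"Chromium";v="124", ...'
  const isMobile = req.get("sec-ch-ua-mobile"); // "?0" or "?1"

  // High-entropy hints must be requested; the browser sends them
  // (if it agrees) on subsequent requests:
  res.set("Accept-CH", "Sec-CH-UA-Platform-Version, Sec-CH-UA-Arch");

  res.send(`brands: ${brands ?? "n/a"}, mobile: ${isMobile ?? "n/a"}`);
});

app.listen(3000);
```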
“yer a jedi, harry” - Gandalf
Listen here, Gandalf, you fat oaf! I’m not a fucking Jedi!
Is it… (scrolls wheel of browsers) Lynx?
I’m still amazed at how usable Lynx is, given the insane premise of the application.
What’s so insane about it? Web browsers are an evolution of the old Gopher protocol. All this stuff has roots in text consoles.
The history makes plenty of sense and explains why it’s there in the first place. The modern internet was not designed to be used from a console, though.
A URL is not an agent string, just saying.
I want a WordPress plugin that refuses to load my site for anything newer than Netscape 3 and pops up a modal “you need to upgrade your browser” pane.
A new browser touches the beacon
There are some use cases other than web page compatibility. One for me is firewall and proxy policy: if the agent is a browser and comes in on specified explicit ports, force authentication; things of that nature.