It's funny you mention that today, because I had a problem with tinypic and a redirection and I was like WTF.
In the end, mixing some ideas found on Stack Overflow threads (most search results come from there nowadays), it turned out that TinyPic overrides the HTTP Accept header sent by the browser IF it finds that the User-Agent (the browser) is known. And yeah, that was what was happening. I was using the mixed "K-Meleon/74 Firefox/24" user-agent string, and the TinyPic server, instead of serving the requested image, redirected me to the HTML page that holds the image. I was using that string instead of the standalone "K-Meleon/74" one, which I prefer, because, and here comes the opposite problem, with the standalone string you can't use the reverse image search offered by Google (even if I prefer TinEye, sometimes I "need" to use Google's).
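Just to illustrate the kind of server-side logic I'm describing (a rough sketch of my guess at the behavior, not TinyPic's actual code; the function and path names are made up):

```python
# Sketch: if the User-Agent looks like a known browser, ignore the client's
# Accept header and redirect to the HTML viewer page; otherwise honor Accept
# and serve the raw image. Purely illustrative names and paths.

KNOWN_BROWSERS = ("Firefox", "Chrome", "MSIE", "Safari")

def respond(user_agent, accept):
    """Return ('redirect', page) or ('image', path) depending on the UA."""
    if any(b in user_agent for b in KNOWN_BROWSERS):
        # Known browser: override Accept, send the HTML page instead.
        return ("redirect", "/viewer.html")
    if "image/" in accept:
        # Unknown UA that explicitly asks for an image: serve the file.
        return ("image", "/raw.jpg")
    return ("redirect", "/viewer.html")

# The mixed UA string contains "Firefox", so it gets the redirect;
# the standalone K-Meleon string does not, so it gets the image.
print(respond("K-Meleon/74 Firefox/24", "image/png,*/*"))
print(respond("K-Meleon/74", "image/png,*/*"))
```

That matches what I saw: same Accept header, different result, depending only on whether the server recognizes the browser string.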
So, if you visit sites as "unknown", or with an old browser, sometimes you get benefits and sometimes none at all.
But beyond web programming, the problem comes from the dynamic web we have now: make everything automatic and dynamic. After all, that's the goal of computing, but sometimes it's just nonsense.
ABOUT THE MAIN TOPIC HERE
I hadn't commented, but the project looks interesting until you find that, in the end, you need to run some kind of web proxy server. On the other hand, it has the benefit of being a CGI, able to run on any system with a web server.
Edited 2 time(s). Last edit at 03/28/2014 04:37AM by JohnHell.