Using Proxy to block all web sites except

Networking/Security Forums -> General Security Discussion

Author: toolmania1 PostPosted: Tue Nov 02, 2010 7:52 pm    Post subject: Using Proxy to block all web sites except
    ----
Firefox 3.6.10
Ubuntu Linux ( but that should not matter )

In Firefox, I set up a manual proxy. I set the address to 0.0.0.0 and the port to 8080. I then added the sites I wanted to visit, like google.com, msn.com, etc., to the "No Proxy for" exception list.

If I try to hit any site not on this list, it tells me the proxy is not allowing it. This is what I want. However, when I add some sites, they don't load correctly. So far, Facebook and Youtube are two of these sites. I found that if I add facebook.com/sharer.php or other pages like that, I can get some of it to display correctly. How can I get these types of sites to always display correctly using the proxy like this?
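As a rough illustration of what is going on (the hostnames below are only examples, not a list taken from any real page), a host-based whitelist like this is checked per resource, not per page. A page whose images or stylesheets live on other hosts will therefore come through only partially:

```python
# Sketch: how a host-only whitelist behaves for a page whose
# resources live on several hosts. Hostnames are illustrative.
from urllib.parse import urlparse

WHITELIST = {"facebook.com", "www.facebook.com"}

page_resources = [
    "http://www.facebook.com/index.php",      # main page: host is listed
    "http://static.ak.fbcdn.net/style.css",   # stylesheet: host not listed
    "http://sphotos.ak.fbcdn.net/photo.jpg",  # image: host not listed
]

for url in page_resources:
    host = urlparse(url).hostname
    status = "allowed" if host in WHITELIST else "blocked"
    print(f"{status}: {url}")
```

The main document loads, but the blocked stylesheet and images are exactly what produces the "text only" half-rendered pages described below.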

Author: toolmania1 PostPosted: Tue Nov 02, 2010 10:59 pm    Post subject: this security site also will not load properly
    ----
It also happened with this web site:

http://www.security-forums.com

I only see text on the screen. I don't see the full web site.

Why would adding this site to the manual proxy settings in Firefox cause it to load like that? I know that if a site is not in that exception list, it won't load at all; that is why I added this site. But after I did that, the site still won't load correctly.

Author: Dezaxa PostPosted: Wed Nov 03, 2010 3:37 pm    Post subject:
    ----
It could be that the sites you are visiting contain content from multiple domains. This is very common with some of the larger websites. If this is so, then blocking all but the main domain would cause the page to be incomplete. It may also happen that a site will include CSS files or scripts loaded from other domains, which would cause the page not to display correctly.

If you run Firefox with the NoScript plug-in, visiting a site and clicking on the NoScript button will show you all the domains that are referenced from that page. That might help you to identify all the domains that you would need to add to your proxy settings.
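Along the same lines, the domains a page references can be pulled straight out of its HTML. A minimal sketch with Python's standard-library parser (the markup below is a made-up sample loosely modelled on this page, not fetched from anywhere):

```python
# Sketch: list the external domains a page references, similar in
# spirit to what NoScript's button shows. Parses src/href attributes
# from a sample HTML snippet; no network access is involved.
from html.parser import HTMLParser
from urllib.parse import urlparse

class DomainCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.domains = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value:
                host = urlparse(value).hostname
                if host:
                    self.domains.add(host)

sample = '''
<html><head>
<link rel="stylesheet" href="http://www.windowsecurity.com/css/site-sec/screen.css">
</head><body>
<img src="http://www.windowsecurity.com/img/logo-sec.gif">
<script src="http://static.example-cdn.net/app.js"></script>
</body></html>
'''

collector = DomainCollector()
collector.feed(sample)
print(sorted(collector.domains))  # every domain the whitelist would need
```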

Author: capiLocation: Portugal PostPosted: Wed Nov 03, 2010 5:07 pm    Post subject:
    ----
The following had been posted by toolmania1 in a separate thread; presumably he meant to post here but hit "New Topic" instead of "Post Reply":
toolmania1 wrote:
Well, that is exactly why I run NoScript. I don't allow those tracking scripts. Sometimes I temporarily allow a script to see if the page loads correctly. I remove flash cookies with the BetterPrivacy add-on. But these sites kinda stink if this is what they are doing and they won't load without these other scripts running. I think you are right. Oh well, I guess most people don't pay attention, so the sites don't really care. Those of us who dig in and learn about security are stuck playing by their rules. I am trying hard to break their rules, though.

Thanks for the info


Last edited by capi on Wed Nov 03, 2010 5:25 pm; edited 1 time in total

Author: capiLocation: Portugal PostPosted: Wed Nov 03, 2010 5:17 pm    Post subject:
    ----
toolmania1 wrote:
Well, that is exactly why I run noscript. I don't allow those tracking scripts. [...] But, these sites kinda stink if this is what they are doing and will not load without these other scripts running.

The behavior you're seeing doesn't necessarily have anything to do with tracking scripts, and is in fact quite common.

Many sites store their (legitimate) content on separate domains, either for logistical reasons, for ease of administration, for redundancy, for political reasons, or for whatever may be the case. While a site may appear as a single "page" from the perspective of the viewer, the web's distributed nature means that in fact it may come from a multitude of different places.

If you look at the HTML source for this page, for example, you will find that Security Forums loads one of its main CSS files from http://www.windowsecurity.com/css/site-sec/screen.css -- obviously, in our case this is because SFDC is an affiliate site of WindowSecurity and thus for consistency we inherit their branding and appearance. The WindowSecurity logo you see up top is another example; we hotlink it from http://www.windowsecurity.com/img/logo-sec.gif

Heavy-traffic sites like Facebook often have separate subdomains from which they load static content, such as images, to lighten the load on their main servers, which have to generate the dynamic pages on the fly. In Facebook's particular case, for example, if you look at the image location of a photo, you will find that it's loaded from a server with a name such as sphotos.ak.fbcdn.net or some such. In fact, photos may very well be distributed over a large network of servers, located in different geographical places, potentially under different domains (for example, if they're subleased from a third-party company). It's all simply part of their load-balancing scheme.

Author: toolmania1 PostPosted: Wed Nov 03, 2010 5:29 pm    Post subject: Makes sense
    ----
So, I would have to add all of the subdomains and I would be good to go, or else just not block these sites.

I am just trying to make my computer as safe as possible.

Author: capiLocation: Portugal PostPosted: Wed Nov 03, 2010 5:42 pm    Post subject:
    ----
toolmania1 wrote:
So, I would have to add all of the subdomains and I would be good to go, or else just not block these sites.

Please note that just adding subdomains (as in *.example.net) may not be enough; in SFDC's case for example, most of the content we load off-site comes from windowsecurity.com, which is not a direct subdomain of security-forums.com (e.g. you couldn't catch it with *.security-forums.com).

To this end, Dezaxa's suggestion of using NoScript may be of help: you can use that extension to see from which domains the page is loading stuff, so that you can allow those domains.

toolmania1 wrote:
I am just trying to make my computer as safe as possible.

I understand that; however, doing what you describe here may prove to be more trouble than it's worth. Take Facebook, for instance: just browsing there now I noticed sphotos.ak.fbcdn.net loading the photos, static.ak.fbcdn.net loading the background image, and probably other servers too (I didn't bother to look). And of course, even if you did track down all the servers they use, if they make a change you'll have to go and fix your whitelist.

Of course you can reach a compromise: allow for example *.facebook.com, *.fbcdn.net and so on...
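That wildcard compromise can be sketched with shell-style pattern matching; the patterns and hostnames below just restate the examples from this thread, and a real proxy's matching rules may differ in detail:

```python
# Sketch of the "compromise" wildcard whitelist: patterns like
# *.facebook.com match any subdomain. fnmatch interprets '*' as a
# shell-style wildcard (it also matches across dots, so
# sphotos.ak.fbcdn.net matches *.fbcdn.net).
from fnmatch import fnmatch

WHITELIST = ["facebook.com", "*.facebook.com", "*.fbcdn.net"]

def allowed(host: str) -> bool:
    """Return True if the host matches any whitelist pattern."""
    return any(fnmatch(host, pattern) for pattern in WHITELIST)

print(allowed("www.facebook.com"))        # True: matches *.facebook.com
print(allowed("sphotos.ak.fbcdn.net"))    # True: matches *.fbcdn.net
print(allowed("www.windowsecurity.com"))  # False: no pattern matches
```

Note how this also captures the earlier point: no `*.security-forums.com`-style pattern would ever match windowsecurity.com, since it is not a subdomain of the listed domain.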

The problem with a static whitelist is that the web is inherently dynamic. Let's take SFDC for example. Say you allow all the necessary domains and are able to browse here with comfort. One day someone posts an embedded image, such as a screenshot showing a problem with their PC. The image is hosted on one of the many free image hosting services (such as ImageShack or PhotoBucket). Unless you allow those domains (and any respective subdomains they load to actually give you the photo), you won't be able to see the screenshot.

Of course you can still fix that, by adding stuff to the whitelist whenever you need to. Depending on your browsing habits, that may become annoying, though. Today you go to site A, something looks ugly, you have to find out what's missing, then add it to the list. Tomorrow you go to site B, something looks ugly... and so on. But what if you don't notice that there's any content missing?

Author: toolmania1 PostPosted: Wed Nov 03, 2010 6:20 pm    Post subject: So what next
    ----
Ok. I agree. You have convinced me. What do I do then? Any suggestions?





output generated using printer-friendly topic mod, All times are GMT + 2 Hours

Page 1 of 1

Powered by phpBB 2.0.x © 2001 phpBB Group