

Employing ASA 5510 SP




Posted by mrservon, 06-24-2008, 12:54 PM
Hello, our organization is purchasing an ASA 5510 SP. One of the major goals for this device is to stop abusive crawlers, scrapers, and DoS attacks. Apart from reading the user manual (which I'm in the process of doing), I'd like to know exactly how this device detects an attack and how to administer it. Can someone point me in the right direction? My questions run along the lines of: how do we detect an attack, how do we become aware of an attack, and how do we use the ASA to block or unblock IPs or abusive domains? I know this is generic, but a quick up-to-speed is needed, as we are getting this device really soon and need to hit the ground running. TIA, Charles

Posted by LibraHost, 06-24-2008, 04:06 PM
To block an incoming request you could set up a 'deny' security rule for the specific IP address/port the attack comes from, but you have to know the IP address of the attacker - server logs help in this regard. {I check my logs on a regular basis. One day I was surprised to see that the log was very large, only to find that a remote site was probing the site for hidden data. What they did was follow the links set up in the web pages, then use combinations of directory names, some based on links, some based on educated guesses, to see if there were any pages there. The bot spent almost three hours probing the site. That night a security rule went into place to block them!}
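On the ASA that kind of deny rule is an extended access list bound to an interface. A minimal sketch follows; the interface name, ACL name, and addresses are placeholders, not anything from this thread:

```
! Sketch only -- OUTSIDE_IN, the interface name, and all addresses are placeholders.
! Deny all traffic from a known abusive source:
access-list OUTSIDE_IN extended deny ip host 203.0.113.50 any
! Still permit legitimate web traffic to the server:
access-list OUTSIDE_IN extended permit tcp any host 198.51.100.10 eq www
! Apply the ACL to inbound traffic on the outside interface:
access-group OUTSIDE_IN in interface outside
! To unblock the address later, remove the deny line:
! no access-list OUTSIDE_IN extended deny ip host 203.0.113.50 any
```

Order matters: the ASA evaluates ACL entries top-down and stops at the first match, so the deny line must sit above the broader permit.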

Posted by mrservon, 06-25-2008, 11:54 AM
The only problem with that is we're hit with over a million posts a month, and browsing the log is not really an option. What we're hoping is that this device also includes the ability to sift through the traffic and find abuse. Insights?

Posted by LibraHost, 06-25-2008, 12:01 PM
Good question! Well, I'm relatively new to commercial hosting, and as a developer, most solutions are a few lines of code 'away'. One way would be to write a script that reads the logs and totals the statistics for your site. From that, you should be able to determine average or typical usage patterns and also spot the usage patterns that are suspicious. In the scenario I described previously, the one IP address spent hours on the site and requested pages that didn't exist! It shouldn't be too difficult to get the logs into a database (Windows servers can write to an ODBC link instead of a file) and then perform queries on the data. HTH, April
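The same idea works without a database. As a rough sketch (the log format, thresholds, and function name are my own assumptions, not anything from this thread), a short Python script can tally requests per client IP from a combined-format access log and flag IPs that make many requests for pages that don't exist, like the bot described above:

```python
import re
from collections import Counter

# Match the client IP (first field) and HTTP status code from a
# combined/common-format access log line. This pattern and the
# thresholds below are illustrative assumptions.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3})')

def suspicious_ips(lines, min_requests=100, max_404_ratio=0.5):
    """Return IPs with at least min_requests hits where 404s make up
    max_404_ratio or more of their traffic (likely probing behavior)."""
    totals, errors = Counter(), Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed lines
        ip, status = m.group(1), m.group(2)
        totals[ip] += 1
        if status == "404":
            errors[ip] += 1
    return [ip for ip, n in totals.items()
            if n >= min_requests and errors[ip] / n >= max_404_ratio]
```

Feeding it a file is just `suspicious_ips(open("access.log"))`; the resulting IPs are candidates for a deny rule, reviewed by hand first.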

Posted by utropicmedia-karl, 06-25-2008, 12:09 PM
Some pf shell scripts would accomplish your goals without the additional overhead of new hardware. Kind regards,
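For instance, a one-pipeline shell function can surface the heaviest requesters in an access log; whichever IPs dominate the list are the ones worth feeding to a firewall rule. This is a generic sketch (the function name and log path are my own placeholders), assuming the client IP is the first field of each log line:

```shell
# top_talkers: print the N most frequent client IPs in an access log.
# Usage: top_talkers /path/to/access_log [count]
# Assumes the client IP is the first whitespace-separated field.
top_talkers() {
    awk '{print $1}' "$1" | sort | uniq -c | sort -rn | head -n "${2:-10}"
}
```

Running `top_talkers /var/log/httpd/access_log 20` (path is a placeholder) prints a count and IP per line, most active first; cron it nightly and eyeball the top entries.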


