Should organizations implement layered defenses from different vendors? Should we rely upon a single vendor for an organization’s overall security?
According to a Gartner research paper, “Two firewall platforms are not better than one. We believe there is a higher risk associated with configuring and managing firewalls from multiple vendors than from a single vendor. Therefore, Gartner advises enterprises that have more than one firewall to standardize on a single vendor platform when the opportunity presents itself (that is, new installations or replacement during a refresh). In choosing a standard firewall, enterprises should consider the experience of their firewall administrators with each platform, scalability, central management and cost. ” (Young & Pescatore, 2008)
The paper also notes that misconfiguration, not flaws in the firewalls themselves, causes more than 99% of firewall breaches. Debugging an error in any new appliance or tool can be cumbersome and time-consuming, and narrowing down to a single vendor relationship can bring larger discounts with less administration overhead.
However, there are situations where an enterprise can be locked into a solution for a long time with little path to upgrade, unless it pays nearly the cost of a new solution plus the extra cost of migrating to it. Sometimes it is better to diversify, especially when the industry is changing drastically and no single vendor addresses every issue the changes raise. Continue reading “Best of Breed or Best Suite of Products”
Do organizations need hardware firewalls when the network already has host-based software firewalls? Wouldn’t they add cost and complexity to networks? Wouldn’t a system protected by a properly implemented host-based software firewall be just as secure as one behind a hardware firewall?
“Firewalls actually come in two distinct flavors: software applications that run in the background and hardware devices that plug in between your modem and one or more PCs. Both types hide your PC’s presence from other systems, prevent unauthorized access from external sources, and keep tabs on network traffic across the firewall.” (Desmond, 2004)
Host-based software firewalls are good for the host, but not for the network the host is connected to. A hardware-based firewall is required for:
- Network address translation (NAT) to prevent exposure of internal IP addresses,
- Port management to close unsolicited access to your hosts,
- Stateful packet inspection (SPI) to block unsolicited incoming traffic,
- Virtual private network (VPN) termination to support remote connections into the network,
- Activity logging and alerts, and
- Content and URL filtering.
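To illustrate the idea behind stateful packet inspection from the list above, here is a minimal conceptual sketch in Python. It is a hypothetical model, not a real firewall: actual SPI engines track TCP state machines, timeouts, and protocol details, and all names here are illustrative.

```python
# Conceptual sketch of stateful packet inspection (SPI).
# Assumption: we only track (src_ip, src_port, dst_ip, dst_port) tuples
# for connections the inside host initiated; real firewalls do far more.

class StatefulFirewall:
    def __init__(self):
        # Connection table: traffic that an inside host has initiated.
        self.connections = set()

    def outbound(self, src_ip, src_port, dst_ip, dst_port):
        """Record an outgoing connection so replies can be matched later."""
        self.connections.add((src_ip, src_port, dst_ip, dst_port))

    def inbound_allowed(self, src_ip, src_port, dst_ip, dst_port):
        """Allow an incoming packet only if it is a reply to an
        established outbound connection; drop unsolicited traffic."""
        return (dst_ip, dst_port, src_ip, src_port) in self.connections


fw = StatefulFirewall()
fw.outbound("192.168.1.10", 51000, "93.184.216.34", 443)

# A reply to the established connection is allowed:
print(fw.inbound_allowed("93.184.216.34", 443, "192.168.1.10", 51000))  # True
# An unsolicited probe from another host is dropped:
print(fw.inbound_allowed("203.0.113.7", 443, "192.168.1.10", 51000))    # False
```

The key design point is that inbound traffic is judged against state created by outbound traffic, which is why unsolicited scans from the outside never match and are silently discarded.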
A hardware-based firewall is easy to implement and saves computing resources on the host. Malware on the host can disable a host-based firewall, but not the hardware firewall.
While a hardware-based firewall protects against threats from outside the network, a software-based firewall helps protect against attacks from within the system. Software-based firewalls can detect unauthorized outbound traffic from the host, and a user can choose which applications may talk to peer hosts and external systems; this is typically not possible with a hardware-based firewall. Continue reading “Hardware or Host Based Firewalls”
We cannot attribute the beginning of cloud computing to a particular person or time. It evolved with the evolution of the Internet and enterprise computing. We may be able to trace its roots all the way back to 1969, when Dr. Larry Roberts developed the ARPANET. (Whitman & Mattord, 2016)
As ARPANET gave way to Ethernet and then the Internet, enterprises were discovering new ways to compute, moving from mainframes to multi-tier architectures. In the early stages of enterprise computing, enterprises purchased hardware and software to host internally. Though not in the form we see today, enterprises had an early version of the cloud in networked mainframe systems with dumb terminals. They then slowly began to outsource their information systems to Internet Service Providers (ISPs) and Application Service Providers (ASPs).
The concept of using computing as a utility was probably first proposed by Professor Noah Prywes of the University of Pennsylvania in the Fall of 1994 at a talk at Bell Labs. “All they need is just to plug in their terminals so that they receive IT services as a utility. They would pay anything to get rid of the headaches and costs of operating their own machines, upgrading software, and what not.” (Faynberg, Lu, & Skuler, 2016). The idea came to fruition when Amazon launched its limited beta of Elastic Compute Cloud (EC2) in 2006. Meanwhile, Salesforce.com had already mastered delivering an enterprise application through a simple website. Continue reading “Cloud Computing and Data Security”