Cloud Computing and Data Security

We cannot attribute the beginning of cloud computing to a particular person or time. It evolved along with the Internet and enterprise computing. We may be able to trace its roots all the way back to 1969, when Dr. Larry Roberts developed the ARPANET (Whitman & Mattord, 2016).

As the ARPANET gave way to Ethernet and then the Internet, enterprises were discovering new ways to compute, moving from mainframes to multi-tier computing. During the early stages of enterprise computing, enterprises purchased hardware and software to host internally. Though not in the form we see today, enterprises had an early version of the cloud in networked mainframe systems with dumb terminals. They then slowly began to outsource their information systems to Internet Service Providers (ISPs) and Application Service Providers (ASPs).

The concept of using computing as a utility was probably first proposed by Professor Noah Prywes of the University of Pennsylvania in the fall of 1994, at a talk at Bell Labs: “All they need is just to plug in their terminals so that they receive IT services as a utility. They would pay anything to get rid of the headaches and costs of operating their own machines, upgrading software, and what not.” (Faynberg, Lu, & Skuler, 2016). The concept came to fruition when Amazon launched its limited beta test of Elastic Compute Cloud (EC2) in 2006. By then, others had already mastered how to deliver an enterprise application through a simple website.

The Open Group Security Practitioners Conference Day 1

The Open Group Security Practitioners Conference opened with Allen Brown and Jim Hietala of The Open Group welcoming the community, followed by a presentation by Murray Rosenthal of the City of Toronto.

Murray said security is not just about integration with the system development life cycle; it also deals with industry sectors and legal frameworks, and it should be based on established standards. The intent of having security should marry with reality.

Domain architecture should supplement solution architecture. He said that without a security architecture, you end up with trial-and-error methods and reverse engineering of the existing enterprise, potentially letting the enterprise go out of business. The same holds for individual projects.

Manu Namboodiri of BitArmor presented a different perspective on security in virtualized environments. He suggested avoiding legacy thinking and approaches to security – think outside the box.

In a virtualized environment, everything except data is virtualized. Data is tangible and traverses between environments. It can be duplicated, and it may remain as remanence forever unless it is disposed of securely. Data is the lifeblood of the business and is what the business is primarily concerned about; infrastructure and the rest come second. Data has a larger threat surface than any other component in the virtualized world, so it requires stronger security controls there.
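
To make “disposed of securely” concrete, the sketch below shows one common approach: overwriting a file with random bytes before unlinking it. This is a minimal illustration, not BitArmor's method; the file name is made up, and journaling filesystems, snapshots, and SSD wear-leveling can still leave remanent copies behind.

```python
import os

def secure_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file in place with random bytes, then unlink it.

    Best-effort only: journaling filesystems, snapshots, backups, and
    SSD wear-leveling can still retain remanent copies of the data.
    """
    size = os.path.getsize(path)
    with open(path, "r+b", buffering=0) as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # force the overwrite down to storage
    os.remove(path)

# Hypothetical usage: create a throwaway export file, then wipe it.
with open("/tmp/customer-export.csv", "w") as f:
    f.write("sample,data\n")
secure_delete("/tmp/customer-export.csv")
```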

Alex Woda of Avient Solutions Group, Steve Whitlock of Boeing, Predrag Zivic and Bob Steadman of Loblaw presented their thoughts on Security Architecture and how it should be developed.

The second half of the day concentrated on Cloud Computing and how to secure the various types of clouds. Tim Brown of CA presented the concept of Cloud Computing, followed by “Views of Cloud Computing Architecture and Security” by Chris Hoff of Cisco. Chris Hoff introduced the Cloud Security Alliance to the community and encouraged everyone to be part of its efforts. Steve Whitlock provided a short illustration of how the Cloud Security Alliance aligns with the Jericho Forum's cloud architectural views.

Internet Traffic Shaping in Canada

A recent Canadian Press Harris-Decima poll on Internet traffic management in Canada suggests that one in five respondents supports the idea as long as all users are treated fairly.

From the Internet Service Providers' (ISPs) point of view, they are doing the right thing by reducing congestion during peak usage caused by peer-to-peer file-sharing services. However, I believe that type of service comes at a cost to regular subscribers. To carry out such traffic management, an ISP would need to monitor the activities of each and every subscriber, which breaches their privacy. The Privacy Commissioner of Canada should be involved in the discussions the Canadian Radio-television and Telecommunications Commission (CRTC) is currently having, to ensure the privacy of Canadians.
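
For readers unfamiliar with the mechanics, traffic shaping usually means rate-limiting selected flows, classically with a token bucket. The sketch below is only an illustration of the technique; the class name and rates are made up and do not reflect any ISP's actual implementation.

```python
import time

class TokenBucket:
    """Minimal token-bucket shaper: a packet passes only if enough tokens remain."""

    def __init__(self, rate_bps: float, burst_bytes: float):
        self.rate = rate_bps          # refill rate, bytes per second
        self.capacity = burst_bytes   # maximum burst size, bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True   # forward the packet
        return False      # queue or drop it: the flow is being shaped

# Illustrative: throttle a peer-to-peer flow to ~50 KB/s with 16 KB bursts.
p2p_shaper = TokenBucket(rate_bps=50_000, burst_bytes=16_000)
print(p2p_shaper.allow(1500))  # True while the burst allowance lasts
```

Note that deciding which flows count as peer-to-peer in the first place requires inspecting subscriber traffic, and that classification step is exactly where the privacy concern arises.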

With regard to the Canadian Press Harris-Decima survey, I am curious whether the survey educated respondents on the details, especially the ramifications for regular ISP subscribers if ISPs are allowed to shape Internet traffic. According to the Canadian Press report, 54 per cent of respondents did not know whether traffic management affects them personally.

Couple this with two recent bills just introduced in the House of Commons – the Investigative Powers for the 21st Century Act and the Technical Assistance for Law Enforcement in the 21st Century Act – which would allow police to collect information about Canadian Internet users without a warrant and to activate tracking devices in their mobile devices and cars. Wouldn't that be a free pass on the privacy of every Canadian Internet user?