Virtual machines running other virtual machines, and so on, seem to suggest a shrinking need for human interfacing. Technology companies are taking the road to more efficient data centers via virtualization – cloud-computing environments. A single company may run tens of thousands of servers across its primary data centers, with the numbers constantly growing…by a minimum of 20%, according to J. Nicholas Hoover. His report describes companies investing in making data centers run more efficiently. One example in the report suggests 60% of new servers put into operation utilize virtualization. The report also suggests the ever-expanding number of virtual machines can be managed by treating servers as a pooled, interfaced set of resources. I wonder how many carbon-based units (humans) are planned into this massive upgrade? Has the IT/BI job market been usurped by this…its latest adversary? The loss of more employment opportunities within the technology environment becomes reality as opposed to virtuality.
The term cloud is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams, and is an abstraction for the complex infrastructure it conceals.
The concept incorporates infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS) as well as Web 2.0 and other recent (ca. 2007-2009) technology trends that have the common theme of reliance on the Internet for satisfying the computing needs of the users. Examples include Salesforce.com, Google Apps or Windows Azure, which provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers.
Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure “in the cloud” that supports them. Cloud computing is often confused with grid computing (“a form of distributed computing whereby a ‘super and virtual computer’ is composed of a cluster of networked, loosely-coupled computers, acting in concert to perform very large tasks”), utility computing (the “packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility such as electricity”) and autonomic computing (“computer systems capable of self-management”).
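The utility-computing comparison above – computing "packaged as a metered service similar to a traditional public utility such as electricity" – can be sketched as a toy billing calculation. Everything here (resource names, rates, usage figures) is a hypothetical illustration, not any provider's actual pricing.

```python
# Toy sketch of utility-style metered billing: resources are charged
# per unit consumed, like electricity. All rates and usage figures
# below are invented examples.

RATES = {
    "cpu_hours": 0.10,     # dollars per CPU-hour
    "storage_gb": 0.05,    # dollars per GB-month
    "bandwidth_gb": 0.08,  # dollars per GB transferred
}

def metered_bill(usage: dict) -> float:
    """Charge only for what was consumed, per the utility model."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# A tenant that used 200 CPU-hours, 50 GB of storage, 10 GB of bandwidth:
bill = metered_bill({"cpu_hours": 200, "storage_gb": 50, "bandwidth_gb": 10})
print(round(bill, 2))  # 200*0.10 + 50*0.05 + 10*0.08 = 23.3
```

The point of the model is the absence of a fixed fee: an idle tenant pays nothing, which is what distinguishes utility billing from the subscription model mentioned later in this piece.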
Hoover’s report covers planned evaluation pilot projects already underway, including a number of companies that share an interest in cloud platforms. Hoover says CTO Greg Simpson is pursuing a project to implement technology that gives flexibility, automation, and manageability. The evaluation will help decide whether to have one expansive internal cloud or…multiple clouds. The goal is to charge business units for what they consume. “Emerging virtualization and systems management technologies will play a major role,” says Simpson, according to Hoover.
Server administrators will continue to manage some system units individually, though Simpson wants to move to managing them as a group, says Hoover.
Indeed many cloud computing deployments as of 2009 depend on grids, have autonomic characteristics and bill like utilities – but cloud computing can be seen as a natural next step from the grid-utility model. Some successful cloud architectures have little or no centralized infrastructure or billing systems whatsoever, including peer-to-peer networks.
The majority of cloud computing infrastructure as of 2009 consists of reliable services delivered through data centers and built on servers with different levels of virtualization technologies. The services are accessible anywhere that has access to networking infrastructure. The Cloud appears as a single point of access for all the computing needs of consumers. Commercial offerings need to meet the quality of service requirements of customers and typically offer service level agreements. Open standards are critical to the growth of cloud computing and open source software has provided the foundation for many cloud computing implementations.
The customers engaging in cloud computing do not own the physical infrastructure serving as host to the software platform in question. Instead, they avoid capital expenditure by renting usage from a third-party provider. They consume resources as a service, paying instead for only the resources they use. Many cloud-computing offerings have adopted the utility computing model, which is analogous to how traditional utilities like electricity are consumed, while others are billed on a subscription basis. Sharing “perishable and intangible” computing power among multiple tenants can improve utilization rates, as servers are not left idle, which can reduce costs significantly while increasing the speed of application development. A side effect of this approach is that “computer capacity rises dramatically” as customers do not have to engineer for peak loads. Adoption has been enabled by “increased high-speed bandwidth” which makes it possible to receive the same response times from centralized infrastructure at other sites.
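The utilization claim above – that sharing computing power among multiple tenants means servers are not left idle and customers need not engineer for peak loads – can be illustrated with a small sketch. The tenants and their hourly loads below are invented numbers, chosen only to show the effect.

```python
# Why multi-tenant pooling improves utilization: tenants peak at
# different hours, so the pooled capacity needed is the peak of the
# *summed* load, which is less than the sum of each tenant's
# individual peak. All loads are hypothetical.

# Hourly load (in servers) for three hypothetical tenants over six hours:
tenant_loads = [
    [2, 2, 8, 2, 2, 2],   # tenant A peaks in hour 2
    [2, 9, 2, 2, 2, 2],   # tenant B peaks in hour 1
    [2, 2, 2, 2, 7, 2],   # tenant C peaks in hour 4
]

# Without pooling, each tenant must engineer for its own peak:
unpooled = sum(max(load) for load in tenant_loads)

# With pooling, capacity only needs to cover the combined peak:
pooled = max(sum(hour) for hour in zip(*tenant_loads))

print(unpooled, pooled)  # 24 vs 13
```

Because the peaks do not coincide, the shared pool serves the same tenants with roughly half the hardware, which is the utilization gain the paragraph describes.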
Internet networking and web services allow interfacing with the “Cloud” from any location. With “Clouds,” the business owner can become king. A hybrid cloud – part public, part internal – can replace proprietary networks as an extension of corporate networks, making it technically feasible to run data warehouses in public clouds. But the Health Insurance Portability and Accountability Act (HIPAA), Sarbanes-Oxley, and the credit card industry’s PCI (Payment Card Industry) standard put stringent controls on personal data.
Running a data warehouse on an internal cloud gets around those issues. In most cases, private cloud designers will also need to implement a virtualization management layer that goes beyond what they already have in place. Virtualization isn’t a requirement for private clouds – workloads can be moved around without hypervisor software – but in most cases, virtualization and internal clouds will go hand in hand. Rule of thumb: if you can master virtualization in the data center, you’ll master the private cloud.
Some reasons given for building private clouds:
Lower capacity – pooling resources will let companies reduce total computing capacity by giving higher-priority tasks power during peaks.
Reduce overhead – servers and related resources in a virtual data center can be managed as a unit.
Prepare – a private cloud will help IT teams get ready for private-public hybrid clouds in the future data center.
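The first item in the list above – pooled resources with higher-priority tasks getting power during peaks – can be sketched as a minimal allocator. The task names, priorities, and capacity figures are invented for illustration; real cloud schedulers are far more elaborate.

```python
# Minimal sketch of priority-based allocation from a shared pool:
# when demand exceeds pooled capacity during a peak, higher-priority
# tasks are served first. Task names and numbers are hypothetical.

def allocate(capacity: int, requests: list) -> dict:
    """requests: (task, priority, demand) tuples; lower priority number = more important."""
    granted = {}
    for task, _priority, demand in sorted(requests, key=lambda r: r[1]):
        granted[task] = min(demand, capacity)  # give what the pool has left
        capacity -= granted[task]
    return granted

# During a peak, 15 units of demand compete for a pool of 10:
peak = allocate(10, [("batch-report", 3, 6), ("billing", 1, 5), ("web-frontend", 2, 4)])
print(peak)  # billing gets 5, web-frontend gets 4, batch-report gets only 1
```

The low-priority batch job is squeezed during the peak and can catch up off-peak, which is exactly how a pool runs with less total capacity than the sum of its tenants' worst-case demands.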
It has been suggested that enterprises start training users now to take advantage of cloud computing, according to Staten as reported by Babcock.
Where will the users to be trained come from if there is no one left in the IT/BI job pool? Many IT professionals have jumped ship…gone on to greener pastures…found another job or changed their line of work, as corporate downsizing and the use of machine-run machines leave fewer and fewer Information Technology pros.
Will the planet Earth become another “Cybortron?” Will the human race become a race of “Data Units” as opposed to “Carbon-based Units?”
Are “Cyborgs” cheaper to make than robots? Are new-born babies being implanted with ID chips in this world of plastic currency, smart cards, automated transportation, automated devices, automated surveillance, and automated electronic everything?
What does the “Vitruvian Man” imply?
“Leonardo’s famous drawings of the Vitruvian proportions of a man’s body first standing inscribed in a square and then with feet and arms outspread inscribed in a circle provides an excellent early example of the way in which his studies of proportion fuse artistic and scientific objectives. It is Leonardo, not Vitruvius, who points out that ‘If you open the legs so as to reduce the stature by one-fourteenth and open and raise your arms so that your middle fingers touch the line through the top of the head, know that the centre of the extremities of the outspread limbs will be the umbilicus, and the space between the legs will make an equilateral triangle’ (Accademia, Venice). Here he provides one of his simplest illustrations of a shifting ‘centre of magnitude’ without a corresponding change of ‘centre of normal gravity’. This remains passing through the central line from the pit of the throat through the umbilicus and pubis between the legs. Leonardo repeatedly distinguishes these two different ‘centres’ of a body, i.e., the centres of ‘magnitude’ and ‘gravity’ (Keele 252).”
The picture represents a cornerstone of Leonardo’s attempts to relate man to nature. Encyclopedia Britannica online states, “Leonardo envisaged the great picture chart of the human body he had produced through his anatomical drawings and Vitruvian Man as a cosmografia del minor mondo (cosmography of the microcosm).
He believed the workings of the human body to be an analogy for the workings of the universe.”
Who gets your vote, the “Virtual Machine” or the “Carbon-based Unit?” The world as we know it has changed dramatically, and we are on a collision course toward a total and virtual metamorphosis of “Integrated Circuitry!”
A cohesive existence ’twixt man and machine? Is interfacing feasible or endangered? What about the IT and BI job market – will it all end?
Til next time.
InformationWeek, “The Business Value of Technology”
J. Nicholas Hoover, “GE Puts The Cloud Model To The Test,” InformationWeek, April 15, 2009, p. 32.
Charles Babcock, “Time To Believe In Private Clouds,” InformationWeek, April 13, 2009, p. 27.
Leonardo di ser Piero da Vinci, “The Vitruvian Man.”