M2M is shifting to M2M
The final eulogy to ‘Man is the Master of the Machine’ has been written.
The movie Terminator 3 delves into the takeover of earth by machines, until the very end, when the machine itself has a change of heart. However ominous such signs may be, what is undeniable is that the age of machines is upon us.
From mere input mongers to making sense of the mountain of data, cataloguing it, analysing it and delivering a seemingly analogous interpretation of it, machines have become the new indispensable smartphone of today's Enterprise. Within this paradigm, the original input feeder, man, is now relegated to building strategies on top of the results the machine spews out to him. The shift from Man-to-Machine to Machine-to-Machine is here to stay.
The component that has built itself into an indispensable position in this entire equation is the Data Center. Not the legacy colocation version, but the new-age, intelligent data player that computes, stores, and analyzes, and houses the Cloud and Applications within itself. One that is intelligent and elastic enough to accommodate the growing data demands of tech-dictated enterprises.
The Data Center, named rather insipidly after the very reason for its existence, is now a chameleon in the entire IT structure. In some cases, it is the eventual residency point for the data. In others, it is the starting point of information that has been decoded and awaits a decision. And in between, it is the binding agent in an ever-spawning network.
Come to think of it: what if it were removed from the equation? Or perhaps, more benevolently, scaled down to just a rudimentary Data Center? Before we answer that question, here is an analogy.
Imagine if all your data were not retrievable one not-so-fine morning. Would we see a repeat of the dark ages? Perhaps so. It is therefore not far-fetched when Data is referred to as the new economy.
So, what size of data is the world coming to? Here’s a curtain raiser.
Shantanu Gupta, director of Connected Intelligent Solutions at Intel, introduces the next-generation prefixes for going beyond the yottabyte: the brontobyte and the gegobyte.
A brontobyte, which isn’t an official SI prefix, but is apparently recognized by some people in the measurement community, is a 1 followed by 27 zeros. Gupta uses it to describe the type of sensor data we’ll get from the internet of things. From there, a gegobyte (10 to the power of 30) is just a short distance away.
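To put those prefixes in perspective, here is a minimal Python sketch of the scales involved (the exponents are taken from the figures above; the one-TB-drive comparison is purely illustrative):

```python
# Illustrative scale comparison; brontobyte and gegobyte are informal,
# non-SI prefixes, as noted above.
PREFIXES = {
    "yottabyte":  10**24,
    "brontobyte": 10**27,  # a 1 followed by 27 zeros
    "gegobyte":   10**30,  # 10 to the power of 30
}

for name, size in PREFIXES.items():
    # Express each unit as a count of one-terabyte (10**12 byte) drives.
    print(f"1 {name} = 10**{len(str(size)) - 1} bytes "
          f"= {size // 10**12:,} one-TB drives")
```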
Now imagine the computational strength required to make sense of this volume. Companies will hence need to have a future-proof strategy in place for collecting, organizing, cleansing, storing, and securing data – and for applying analytics to derive real-time insights to transform their businesses.
A story in Information Management highlights “Big Data Analytics: The Currency of the 21st Century Enterprise.” Quite an interesting read. The gist of the argument: Personal data has an economic value that can be bought, sold, and traded.
Emerging technologies are driving transformation within organizations. The year 2019 will see Artificial Intelligence (AI) and Machine Learning (ML) driving change in enterprises. We already see numerous use cases of these technologies in industries such as BFSI, healthcare, telecom, manufacturing, and home automation. They can cull data, derive real-time insights about the business, and offer timely solutions or corrective action, often without human intervention. AI-backed automation and predictive analytics will help predict challenges before they arise, streamline operations, save costs, enhance customer experience, and take over repetitive tasks. And while the adoption of ML technologies will lead to exponential growth of enterprise data, the accuracy of the outputs will remain a function of the sanctity of the inputs.
That calls for a trustworthy Data Center partner, not only to store the data but also to analyze and manage it. The ideal Data Center partner should do both: cater to current requirements and adapt to the changing IT landscape.
According to a Frost & Sullivan report, the APAC Data Center services market will grow at a compound annual growth rate (CAGR) of 14.7% from 2015 to 2022, reaching US$31.95 billion by the end of 2022. The India Data Center market specifically is expected to reach approximately $4 billion by 2024, growing at a CAGR of around 9% during 2018-2024. Major cities such as Mumbai, Bangalore, and Hyderabad are witnessing heavy investment by local and international operators. The increasing construction of hyperscale facilities with power capacities of over 50 MW will fuel the need for innovative infrastructure over the next few years.
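As a quick sanity check on such projections, the CAGR arithmetic can be run directly. The sketch below (assuming simple annual compounding over the seven years from 2015 to 2022) backs out the implied 2015 base from the quoted APAC figures:

```python
# Back out the implied 2015 APAC market size from the quoted projection.
end_value = 31.95      # US$ billion, projected for end of 2022
cagr = 0.147           # 14.7% compound annual growth rate
years = 2022 - 2015    # 7 compounding periods

start_value = end_value / (1 + cagr) ** years
print(f"Implied 2015 base: US${start_value:.2f} billion")  # ~US$12.2 billion
```

By this arithmetic, the quoted projection implies a 2015 base of roughly US$12 billion.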
A recent study of 500 international Data Centers threw up key insights into what constitutes a well-thought-out Data Center strategy, one that ticks the right boxes for an Enterprise selecting a DC partner.
It is therefore evident that the Data Center should be built to solve a business problem, both current and future; should have the flexibility to adapt to changing demands; and should be agile enough to accommodate newer dynamics of the business. The paradox is that as the Data Center grows, the density of the data within it will also expand, all of it on hardware that will significantly shrink. Computing power therefore becomes the differentiator and will help negate any pushback that volume brings up.
It is not lost on DC players that Security is the other differentiator. If this data falls into the wrong hands, it could create havoc, resulting in million-dollar losses for corporations and damaging the credibility of trusted institutions entrusted with sensitive consumer data. Here are two recent incidents.
- In January 2019, the HIV-positive status of 14,200 people in Singapore was leaked online. Details including identification numbers, contact details, and addresses were available in the public domain.
- In December 2018, a cyber-attack exposed the records of 500 million guests of the hotel giant Marriott International. The attack occurred over a period of four years and was traced back to a Chinese spy agency.
The emphasis on security and compliance is even stronger now with the European Union’s General Data Protection Regulation (GDPR). In fact, GDPR is hailed as one of the most critical pivots in data privacy rules in the past two decades. It is going to fundamentally change how data is handled, stored, and processed.
Given the geography-agnostic nature of such attacks, Indian IT companies have every reason to be wary of an impending attack. The Government-steered Personal Data Protection Bill mandates stringent rules for security, customer consent, data privacy, and data localization. Indian businesses will need to realign their Data Center strategies to comply with this Bill, which could eventually become law. It will push business leaders to rethink identity and access security, encryption, data systems and application security, cloud security, and DDoS protection, among other things. And that is where machine-to-machine will score higher. Little wonder that CIOs favour automating all, or at least a majority, of the work chain.
Machine-to-machine allows for predictable, systemic patterns, enabling hyperscale computing, deep-dive analytics, trend spotting, vulnerability recognition and elimination, risk mitigation, and even alternate computing, without the vulnerabilities of man-to-machine direction. The choice in front of the CIO, therefore, is to go with a service provider, whether an SI or an IT architect, who has provisioned the entire landscape and can hence implement machine-derived, predictable, automated results.
Does this mean it is the end of human thinking? Quite the contrary; it started because of human thinking.
Sify has taken pride in supporting technology advancements since the launch of its first Enterprise Data Center in 2001, and we invite you to download a copy of Gartner's Market Guide, which tracks the evolution of the Data Center Services Market in India and highlights the widening choice of providers, hosting locations, and services.
A CFO for All Seasons: Interview of M P Vijay Kumar, CFO on CFOThoughtLeader.com
"I have one primary agenda for the next 12 months: to ensure that the organization has enough support available across all of the functions to enable scale. I want to ensure that every part of the organization is in a position to enable scale and monetize market opportunities," explains Vijay.
Listen to the Interview
At times, it must seem to Vijay Kumar that his 12-year tenure as a CFO has been spent not at one company, but three. This must be a sense that most C-suite members at Sify Technologies likely experience in light of the company’s appetite for continuous reinvention.
Back in 2007, when Kumar arrived at the information and communications technology company, Sify was widely known as a consumer business, and one perhaps without the will or resources to attract business customers. As CFO, Kumar was part of a management team tasked with changing that perception both inside and outside of Sify's existing world. More specifically, Kumar and his finance team were responsible for calculating and tracking the capital expenditures needed to provide the business-to-business infrastructure that business customers would demand. Of course, no sooner was the infrastructure in place than Sify decided to super-size its business services menu, making it a bona fide provider of technology services. Looking forward, Kumar says that Sify's latest innovation involves not so much its customer offerings as how customers buy those offerings: outcome-based pricing, an approach that Kumar believes will empower Sify to open a new chapter of growth.
https://www.cfothoughtleader.com/cfopodcasts/493-vijay-kumar-cfo-sify-technologies/
Cloud Service Models Compared: IaaS, PaaS & SaaS
Cloud computing has dominated business discussions across the world, as it is consumed by the whole business ecosystem and serves both small and large enterprises. When adopting the technology, companies face a choice between three predominant cloud deployment models. A company may select from the SaaS, PaaS, and IaaS models based on its needs and the capabilities of each model. Each model has its own characteristics and inherent advantages.
SaaS (Software as a Service)
SaaS service models have captured the largest share of the cloud world. In SaaS, a third-party service provider delivers applications, and the client is granted access to them through a Web interface. The cloud service provider manages everything, including hardware, networking, runtime, data, middleware, operating systems, and the applications themselves. Popular SaaS services in the business world include Salesforce, GoToMeeting, Dropbox, Google Apps, and Cisco WebEx.
Service Delivery: A SaaS application is made available over the Web and, depending on the application, can be installed on-premise or executed right from the browser. As opposed to traditional software, SaaS-based software is delivered predominantly under subscription-based pricing. While popular end-user applications such as Dropbox and MS Office Apps offer a free trial for a limited period, their extended usage, integrations, and customer support could come at a nominal subscription cost.
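In practice, "access through a Web interface" usually also means an HTTPS API that subscribers call with a key tied to their subscription. A minimal Python sketch of that consumption pattern follows; the endpoint, path, and key below are hypothetical placeholders, not any real vendor's API:

```python
import requests  # pip install requests

# Hypothetical endpoint and key, for illustration only; not a real vendor API.
BASE_URL = "https://api.example-saas.com/v1"
API_KEY = "YOUR_SUBSCRIPTION_KEY"

# The subscriber calls the hosted application over HTTPS; the hardware,
# runtime, middleware, and the application itself are the provider's concern.
response = requests.get(
    f"{BASE_URL}/reports/monthly",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
response.raise_for_status()
print(response.json())
```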
How to identify if it is SaaS? If everything is managed from a centralized location on a cloud platform by your service provider, and your application is hosted on a remote server to which you are given access through Web-based connectivity, then it is likely SaaS.
Benefits: The cost of licensing is lower in this model, and it gives the workforce a mobility advantage, as applications can be accessed from anywhere using the Web. Everything at the back end is taken care of by the service provider, while the client uses the features of the specific applications. If any technical issues arise in the infrastructure, the client can depend on the service provider to resolve them.
When to Choose? Choose this model if you do not want the burden of managing your IT infrastructure or the platform, and only want to focus on the application and its services. You can pass the laborious work of installation, upgrades, and management to third-party companies with expertise in public cloud management.
PaaS (Platform as a Service)
In the PaaS service model, the third-party service provider delivers the software components and framework on which to build applications, while clients take care of developing the application itself. Such a framework allows companies to develop custom applications on top of the platform being served. In this model, the service provider manages servers, virtualization, storage, software, and networking, while developers are free to build customized applications. The PaaS model can work with both private and public clouds.
Service Delivery: Middleware is built into the model for developers to use. The developer does not need to hand-code everything from scratch, as the platform provides the libraries. This reduces development time and enhances an application developer's productivity, enabling companies to reduce time-to-market.
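To illustrate that division of labour, here is a minimal Python web application of the kind a PaaS typically hosts. Flask is used only as a familiar example framework; on a real platform the provider, not the developer, supplies the server, runtime, and scaling:

```python
from flask import Flask, jsonify  # pip install flask

app = Flask(__name__)

@app.route("/health")
def health():
    # The developer writes only the application logic; the PaaS supplies
    # the OS, runtime, load balancer, and scaling underneath it.
    return jsonify(status="ok")

if __name__ == "__main__":
    # Locally this starts a development server; on a PaaS the platform's
    # own process manager launches and scales the app instead.
    app.run(port=8080)
```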
How to identify if it is PaaS? If you are using integrated databases, have resources available that can scale quickly, and have access to many different cloud services to help you develop, test, and deploy applications, it is PaaS.
Benefits: Development and testing are both cost-effective and fast. The PaaS model delivers an operating environment and some on-demand services such as CRM, ERP, and Web conferencing. With PaaS, you can also use additional microservices to enhance run-time quality, and avail further services such as directory, workflow, security, and scheduling. Other benefits of this service model include cross-platform development, built-in components, no licensing cost, and efficient application lifecycle management.
When to Choose? PaaS is best suited if you want to create your own application but need others to maintain the platform for you. When your developers need creative freedom to build highly customized applications and require ready tools for development, this is the model to select.
IaaS (Infrastructure as a Service)
In this cloud service model, Data Center infrastructure components are provided, including servers, virtualization, storage, software, and networking. This is a pay-as-you-go model that gives access to all the services, to be utilized as per your needs. IaaS is like renting space and infrastructure components from a cloud service provider on a subscription model.
Service Delivery: The infrastructure can be managed remotely by the client. On this infrastructure, companies can install their own platforms and do their development. Popular examples of IaaS offerings are Microsoft Azure, Amazon Web Services (AWS), and Google Compute Engine (GCE).
How to identify if it is IaaS? If you have all the resources available as a service, your cost of operation is relative to your consumption, and you have complete control over your infrastructure, it is IaaS.
Benefits: A company need not invest heavily in infrastructure deployment but can use virtual Data Centers instead. A major advantage of this service model is that a single API (Application Programming Interface) can be used to access services from multiple cloud providers. A virtualized interface can be used over pre-configured hardware, and the client can install platforms of their choice. IaaS service providers also offer security features for managing your infrastructure through licensing agreements.
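That "single API across multiple providers" point is usually realised through an abstraction library. Below is a minimal sketch using Apache Libcloud, an open-source Python library built for exactly this; the DUMMY driver is used so the snippet runs without credentials, and the real-provider line in the comment is indicative only:

```python
from libcloud.compute.types import Provider        # pip install apache-libcloud
from libcloud.compute.providers import get_driver

# The DUMMY driver needs no credentials, so this sketch runs as-is.
# For a real provider you would instead pass your own keys, for example:
#   get_driver(Provider.EC2)("ACCESS_KEY", "SECRET_KEY")
driver = get_driver(Provider.DUMMY)(0)

# The same calls work whichever provider sits behind the driver.
for node in driver.list_nodes():
    print(node.name, node.state)
```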
When to Choose? The IaaS service model is most useful when you are starting a company and need hardware and software setups. You need not commit to specific hardware or software, and you enjoy the freedom of scaling up anytime with this deployment.
You can choose between the three models depending on your business needs and the resources available to manage things. Irrespective of the model you choose, a cloud Data Center provides a great cost advantage and flexibility, with experts to back you in difficult times. Your choice of cloud service model will affect the level of control you have over your infrastructure and applications. Depending on the needs of your business, select a model after a careful evaluation of the benefits of each.
Sify’s many enterprise-class cloud services deliver massive scale and geographic reach with minimum investment. We help design the right solution to fit your needs and budget, with ready-to-use compute, storage and network resources to host your applications on a public, private or hybrid multi-tenant cloud infrastructure.
From Legacy to the Modern-day Data Center Cooling Systems
Modern-day Data Centers provide massive computational capability within a smaller footprint. This poses a significant challenge for keeping the Data Center cool, since more transistors in computer chips means more heat dissipation, which requires greater cooling. It has come to a point where traditional cooling systems are no longer adequate for modern Data Center cooling.
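To put rough numbers on that, here is a minimal sketch (the rack power and allowed temperature rise are assumed example values) using the standard sensible-heat relation for air, q [BTU/hr] ≈ 1.08 × CFM × ΔT [°F]:

```python
# Rough airflow needed to carry away one rack's heat load.
# Assumed example values; substitute your own site figures.
rack_power_watts = 10_000   # a 10 kW rack
delta_t_f = 20.0            # allowed air temperature rise across the rack, deg F

heat_load_btu_hr = rack_power_watts * 3.412           # 1 W = 3.412 BTU/hr
airflow_cfm = heat_load_btu_hr / (1.08 * delta_t_f)   # sensible-heat relation

print(f"Heat load: {heat_load_btu_hr:,.0f} BTU/hr")   # 34,120 BTU/hr
print(f"Required airflow: {airflow_cfm:,.0f} CFM")    # ~1,580 CFM
```

A single 10 kW rack already needs on the order of 1,600 CFM of cold air, which is why the cooling techniques described below matter.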
Legacy Cooling: Most Data Centers still use legacy cooling systems. They use raised floors to deliver cold air from Computer Room Air Conditioner (CRAC) units to the Data Center servers. Perforated tiles allow the cold air to leave the underfloor plenum and enter the main area near the servers. Once this air passes through the server units, the heated air is returned to the CRAC unit for cooling. CRAC units also include humidifiers that produce steam, ensuring the required humidity conditions are maintained.
However, as room dimensions have increased in modern Data Centers, legacy cooling systems have become inadequate. These Data Centers need additional cooling systems beyond the CRAC unit. Here is a list of techniques and methods used for modern Data Center cooling.
Server Cooling: Heat generated by the servers is absorbed and drawn away using a combination of fans, heat sinks, and pipes within ITE (Information Technology Equipment) units. Sometimes, a server immersion cooling system may also be used for enhanced heat transfer.
Space Cooling: The overall heat generated within a Data Center is also transferred to air and then into a liquid using the CRAC unit.
Heat Rejection: Heat rejection is an integral part of the overall cooling process. The heat taken from the servers is rejected using CRAC units, CRAH (Computer Room Air Handler) units, split systems, airside economization, direct evaporative cooling, and indirect evaporative cooling systems. An economizing cooling system turns off the refrigerant cycle and draws outside air into the Data Center, mixing it with the inside air to create a balance. Evaporative systems supplement this process by using evaporating water to absorb energy, lowering the air temperature toward its wet-bulb temperature.
Containments: Hot-aisle and cold-aisle containment use air handlers to contain the hot or cold air and let the remaining air out. Hot containment encloses the hot exhaust air and lets cooler air out, while cold containment does the reverse. Many new Data Centers use hot-aisle containment, which is considered a more flexible cooling solution as it can meet the demands of increased system density.
Close-Coupled Cooling: Close-Coupled Cooling (CCC) includes above-rack, in-rack, and rear-door heat-exchanger systems. It brings the cooling system closer to the server racks themselves for enhanced heat exchange. This technology is very effective as well as flexible, with long-term provisions, but requires significant investment.
Conclusion
Companies can choose a cooling system based on their cooling needs, infrastructure density, uptime requirements, space constraints, and cost. The choice of the right cooling system becomes critical when the Data Center needs high uptime and must avoid any downtime due to energy issues.
Sify offers state-of-the-art Data Centers to ensure the highest levels of availability, security, and connectivity for your IT infrastructure. Our Data Centers are strategically located in different seismic zones across India, with highly redundant power and cooling systems that meet and even exceed the industry's highest standards.