The New Data Center and Newer Attacks – Are You Ready?
With every passing year, IT security becomes a more significant issue for the industry. While threats to any part of the infrastructure are a big risk, data security is the biggest fear in the enterprise.
The Data Center has grown from a multi-server, energy-guzzling mammoth into a practically invisible, sleek virtual storage arena, mostly on the cloud. It has become smarter, faster, more agile and highly scalable, at very little extra cost.
But this development comes at a price. The Data Center now inherits the threats that the cloud faces, that virtualised networks and data face, and worse, that connected technologies face. Mobile networks and cloud-based, virtualised accessibility have made the Data Center extremely vulnerable to hacking and cyber attacks. These threats, evolving along with the Data Center, present a new security paradigm that enterprises need to be ready to meet head on.
Risks unique to virtualisation are coming to the forefront as Data Centers adopt virtualisation technologies. Here are a few known risks:
In a virtualised environment protected by traditional security applications, there is no visibility into traffic between machines on the same host unless it is routed outside the host, creating critical communication blind spots. One way to address this risk is to dedicate security-scanning virtual machines that inspect the communication channels between the VMs within the host. On a cloud host, however, self-defending virtual machines are needed.
The connected nature of a virtualised environment makes guest devices a big threat: one compromise means risk for all the devices and machines, so a firewall and a malicious-activity detection system are needed at the VM level. An attack on the hypervisor poses an even bigger risk, since all the machines on it are compromised, especially when its API is threatened. Most vendors strive to keep the APIs secure, because they are the first line of attack against the Data Center and everything else on the platform. The risk climbs higher still when critical data is stored on the same host as less critical data, resulting in mixed trust levels; it then becomes difficult to protect only the critical data, and the usual way out is self-defending VM security.
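To make the idea concrete, here is a minimal Python sketch of the kind of default-deny, per-VM flow policy that such a scanning VM could enforce on east-west traffic; all the VM names, ports and rules are hypothetical illustrations, not any vendor's actual product.

```python
# Minimal sketch of a VM-level firewall check for east-west (inter-VM) traffic.
# All VM names and rules are hypothetical illustrations.

ALLOWED_FLOWS = {
    # (source VM, destination VM): set of allowed destination ports
    ("web-vm", "app-vm"): {8080},
    ("app-vm", "db-vm"): {5432},
}

def is_flow_allowed(src_vm: str, dst_vm: str, dst_port: int) -> bool:
    """Return True only if the flow matches an explicit allow rule
    (default-deny, so a compromised guest cannot reach arbitrary VMs)."""
    return dst_port in ALLOWED_FLOWS.get((src_vm, dst_vm), set())

# A scanning VM inspecting flows captured on the virtual switch:
for flow in [("web-vm", "db-vm", 5432), ("app-vm", "db-vm", 5432)]:
    verdict = "ALLOW" if is_flow_allowed(*flow) else "BLOCK + ALERT"
    print(flow, "->", verdict)
```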
Data Centers on the cloud face the same threats, since the cloud sits on a virtual environment. In addition, cloud computing devices, including roaming mobile devices, stretch the perimeter and make the environment more vulnerable. On a cloud-based Data Center, data is often moved around for optimal utilisation, which means owners are often unaware of the location of their data, and any residual data, which may be critical, needs to be eliminated during this shifting. Multi-tenancy clouds are more vulnerable still, unless security is designed so that no user can access another tenant's data. Data encryption is the security measure most often used, but with evolving usage and constant expansion of the environment, new threats will always emerge.
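As an illustration of encryption as that last line of defence, here is a minimal sketch using Python's third-party cryptography package; the record contents and key handling are hypothetical, and in practice the key would live in a key-management service.

```python
# Minimal sketch of encrypting data before it is moved between cloud hosts,
# so residual copies left behind are unreadable without the key.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, held in a key management service
cipher = Fernet(key)

record = b"customer_id=1001, card_token=XYZ"   # hypothetical critical data
encrypted = cipher.encrypt(record)              # what actually gets stored/moved

# Only a holder of the key can recover the plaintext:
assert cipher.decrypt(encrypted) == record
print("stored form:", encrypted[:32], b"...")
```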
So is there any way of staying secure with your next-generation Data Center? Of course: the answer is next-gen security!
Traditional means of Data Center security will definitely not secure modern infrastructure. To keep virtualised data safe, one answer is virtual security appliances deployed at multiple points in the Data Center. Their virtual nature lets them keep track of all parts of the network, and in some cases they can be programmed for a specific security function, creating a new layer of defence against malware, unauthorized devices and network viruses. In addition, advanced deep-scanning engines such as data-loss prevention (DLP) and intrusion detection/prevention services (IDS/IPS) can help create a security lock. Intelligent network-monitoring algorithms can control the movement of data in and out of machines, while access can be denied to certain types of devices unless they are authorised. The smartest move for keeping a modern Data Center safe is layered security, which can place entire applications behind intelligent engines that continuously scan for anomalies.
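One way to picture such an intelligent monitoring algorithm is a simple baseline-and-threshold check on outbound traffic. The sketch below is a hypothetical illustration with made-up thresholds and volumes, not a production DLP engine.

```python
# Minimal sketch of a network-monitoring rule: flag a machine whose outbound
# data volume jumps far above its own recent average. Figures are hypothetical.
from collections import deque

class OutboundMonitor:
    def __init__(self, window: int = 10, factor: float = 3.0):
        self.history = deque(maxlen=window)  # recent per-interval byte counts
        self.factor = factor                 # how far above average is "anomalous"

    def observe(self, sent_kb: int) -> bool:
        baseline = sum(self.history) / len(self.history) if self.history else None
        self.history.append(sent_kb)
        return baseline is not None and sent_kb > self.factor * baseline

monitor = OutboundMonitor()
for interval, volume in enumerate([120, 110, 130, 125, 900]):  # KB per interval
    if monitor.observe(volume):
        print(f"interval {interval}: anomalous outbound volume {volume} KB, hold for review")
```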
Despite all the technology, security threats will always be there. Deploying intelligent applications and barricades for Data Centers could be your safest bet in a battle that will never end.
As Data Center technologies evolve, so will the threats. The best one can do is to have the right teams, armed with the right technologies and applications, to counter the threats that modernisation and evolution bring.
Cloud powered by Content Delivery Network – The Elixir of Internet World
Are we all aware of the staggering statistics about the future of the Internet?
India will add 40% of the world’s incremental Internet users by 2020!
More than 175 million people will shop online, thanks to convenient payment options like e-wallets and easy payments. The e-commerce business will be worth $17 billion by the end of FY 2016.
82% of CMOs (across all industry verticals) report that the brand health of their organization is directly proportional to the number of visitors to their websites.
When computing power was at a premium in the 1990s, web pages were often hosted on local standalone servers and served as unchanging text files. This type of static content is efficient, but it quickly becomes stagnant and viewers lose interest. With the advent of low-cost computing and faster Internet speeds, the concept of dynamic content emerged, capturing user attention through sheer dynamism and creative possibilities. With the transformation to cloud, virtualization ensures a low-cost, pay-as-you-go model. In bygone days, IT would make a CAPEX investment in hardware purchases and then invest in maintenance, even though its best-hired resources should have been focusing on the core business.
For organizations where Internet-facing application/site performance and user experience are key to growth, cloud services powered by the best CDN (Content Delivery Network) are the panacea. Quick, reliable, secure and personalized viewing experiences across all devices, at your fingertips, is what a CDN offers.
Empiricism is the theory that all knowledge is based on experience derived from the senses. That is where the human sense of feeling will always reign over machines or robots. Businesses in all verticals flourish only when that experience translates into a satisfied customer: the customer thinks exactly the thoughts you want them to think, gets hooked, and their brain signals their fingers again and again to key in the URL of your website, and better still, never to open the competitor's website in the adjacent tab.
CDN service providers deliver Internet content such as web objects, applications, live/on-demand streaming media and social network feeds to end users.
Load time, availability, response time and security are a few of the parameters by which a good CDN is judged.
The direct, visible impact of a CDN is on business revenue, business agility and the ability to take the business to global scale.
How?
Take an example: Mr. ABC, the CIO of organization XYZ, is sitting in the US with the best Internet connection speed (25 Mbps); his site is hosted in a Data Center in India with the best compute, storage and network, with redundancy ensured at the first mile. When he accesses his website, www.xyz.com, he starts counting: 1, 2, 3, 4, 5, 6, 7… while his eyes are forced to watch that vexatious loading icon.
Oops, it takes 8 seconds to load the home page of his own brand. A site visitor would move to the next tab and load another site to do justice to those 8 wasted seconds. The result is distraction, confusion, higher chances of site abandonment and hence lost revenue.
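Anyone can repeat Mr. ABC's experiment. Here is a minimal Python sketch that times the first byte and the full HTML download of a page; the URL is a placeholder, since www.xyz.com is fictional.

```python
# Minimal sketch: measure time to first byte and total HTML load time.
import time
import urllib.request

url = "https://www.example.com/"   # stand-in for www.xyz.com
start = time.perf_counter()
with urllib.request.urlopen(url, timeout=30) as response:
    first_byte = time.perf_counter() - start   # headers/first bytes received
    body = response.read()                     # full HTML downloaded
total = time.perf_counter() - start

print(f"time to first byte: {first_byte:.2f}s, full HTML: {total:.2f}s, {len(body)} bytes")
```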
That is where the concept of the CONTENT DELIVERY NETWORK was conceived.
The following series of steps is executed when a customer keys a website address into the browser's address bar (a minimal code sketch of these steps follows the list):
Step 1: The browser contacts a DNS server to resolve the domain name into an IP address.
Step 2: The browser establishes a connection with the web server that hosts the website, using that IP address.
Step 3: The HTML code of the requested web page is retrieved over the Internet.
Step 4: The HTML content is delivered over the Internet to the browser, which then displays the page: the bare-bones HTML structure first, followed by hyperlinked assets such as static images, and lastly the dynamic content.
Step 5: When you leave the browser window idle for long, or close it, the connection with the web server ends.
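Here is that minimal sketch of Steps 1 to 4 in Python; the hostname is a placeholder.

```python
# Minimal sketch of Steps 1-4: resolve the name via DNS, connect to the web
# server, and retrieve the HTML.
import socket
import urllib.request

host = "www.example.com"                    # stand-in for www.xyz.com

# Step 1: DNS resolution of hostname to IP address
ip_address = socket.gethostbyname(host)
print("resolved", host, "->", ip_address)

# Steps 2-4: connect to the server and retrieve the HTML over HTTP(S)
with urllib.request.urlopen(f"https://{host}/") as response:
    html = response.read()
print("received", len(html), "bytes of HTML")
# Step 5 happens implicitly when the connection is closed.
```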
Step 4 above is what creates the need for a CDN, to ensure pages render at lightning speed. In the absence of a CDN, the complications of the middle mile will undo all the performance optimizations and best coding practices you may have followed in building your website.
Static files such as images, JavaScript and CSS can be cached by the browser so that it does not have to fetch them again in future. Many technologies and CDNs exist to ensure such caching. Site owners feel the chill wind of poor performance only when there is a good amount of dynamic content on their website.
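For illustration, here is a minimal sketch of how a server signals that caching contract: a long-lived Cache-Control header for static assets, none for dynamic pages. The paths and max-age value are hypothetical.

```python
# Minimal sketch of the caching contract between server and browser: static
# assets get a long-lived Cache-Control header so repeat visits skip the fetch.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.endswith((".css", ".js", ".png", ".jpg")):
            cache = "public, max-age=86400"      # static: cache for a day
        else:
            cache = "no-store"                   # dynamic: always go to origin
        self.send_response(200)
        self.send_header("Cache-Control", cache)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(f"served {self.path}\n".encode())

# HTTPServer(("", 8000), Handler).serve_forever()  # uncomment to run locally
```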
Dynamic content is web content that changes between subsequent requests based on user access time, user preferences and personal information, in order to ensure an engaging online experience. Some of the world's largest websites are powered primarily by dynamic content, since static content cannot capture user interest. According to recent studies, 74 percent of online consumers get frustrated when a website promotes content that isn't tailored to their interests.
For example, Amazon delivers a personalized content experience in an attempt to make you buy more. Notice the section called "Buy It Again" below the items you want to purchase: Amazon populates this section by pulling your previous orders from its backend servers. This is much more effective for upselling than listing random items.
Site performance is impacted by dynamic content precisely because it cannot be cached, forcing round trips to the origin server.
Only a few CDNs on the planet can give dynamic content the required acceleration. Techniques such as route optimization are needed to optimize communication between the CDN edge servers and the origin infrastructure, delivering dynamic content to the user while avoiding Internet problem spots.
So the strategy of enriching your website with dynamic content, with the lure of increasing user interest and hence hits, should be carefully chalked out along with the appropriate choice of CDN.
Walking with the false myth that a CDN is just a redundancy (or good-to-have) service can jeopardize the business strategy significantly, and will eventually become the bane of business advancement prospects.
Facing performance pressures with your Websites and Web Portals? Feel free to reach out to us at cdn@sifycorp.com.
Transformation Integration Services – The Virtue of Single Platform Operations
An increasingly connected world needs a smarter, faster and much more agile technology driver.
And this new business paradigm demands much more secure networks and storage, more efficient and completely clear communication and collaboration. Anything less than perfect will instantly become an Achilles heel of any enterprise, jeopardising their market status, reputation and in some cases, even existence.
What organisations need are domain-specific, integrated solutions that can transform the IT infrastructure, providing better focus on their plans for increasing asset productivity and thus contributing significantly to business growth. Transformation that leads to the adoption of more agile, innovative tools and practices drives lower TCO and higher RoI for enterprises. A deep-dive analysis of an organisation's operational structure can identify the areas where transformational integration of technologies will achieve this business value.
With communication the stronghold of enterprise operations today, anytime-anywhere access to data and systems information can be the differentiating factor between a good business plan and an excellent one. Deployment of emerging technologies that make single-platform communication a reality, connecting stakeholders across geographies, time zones and business interests, is a process outcome of transformational integration of collaborative and communication technologies. What is needed is a partner that can support the giant leaps in a constantly evolving networking technology landscape, simply because, to grow, every organisation needs a real-time, converged, application-oriented network infrastructure that achieves productivity and efficiency at a reduced TCO.
Ensuring these systems and the infrastructure are secure is critical: the integration of all functionalities needs to be protected by extremely efficient security tools and applications. Enterprises need a partner who can provide state-of-the-art correlation technology, including vulnerability and threat correlation, within a robust and pre-emptive cyber threat intelligence framework. This framework should also be equipped to assure real-time risk monitoring and modelling for critical assets. Without a doubt, this level of support needs expertise and proven value in remote monitoring and management of networks, servers and applications, along with managed security services that keep each aspect of the IT infrastructure free of threats and risks.
Between 2013 and 2020, data will more than double every two years, says an IDC report. By 2020, the human world will have created 44 zettabytes of data. Are we ready for this deluge? And the zillion-dollar question is: where are enterprises going to store all this data? Virtualization can help engage cloud storage, but innovative Data Center tools can help organisations cut storage costs while increasing efficiencies. Virtualised Data Centers ensure an effective and robust IT infrastructure for all critical business needs, spanning consulting and design, product supply, deployment and managed services. These are the pillars of IT transformation in any organisation and help it achieve its growth targets.
This data is invaluable if it speaks to the strategy makers of an enterprise. It needs to be available throughout the enterprise and to inform insightful strategies. These strategies will be the drivers of growth, today and in the future.
With communication and collaboration technologies having evolved many times over in the last few years, a unified approach to communication is imperative for every organisation. The ability to communicate and collaborate efficiently across varying media could be just the edge that drives an enterprise to higher market presence. Convergence across audio, video and other collaboration capabilities, ensuring seamless collaboration between varying devices, creates a resilient network fabric that enables real-time, synchronous communication within the enterprise as well as with supplier and customer networks.
One of the strengths that organisations look for in transformational integration support services is the ability to power communication interfaces with open-standards architecture, to make sure their unified communication is future-ready. What also helps is a single user interface that works interchangeably across devices and networks. And finally, if these strengths extend to business application integration, they can be leveraged to drive a much faster, more agile, scalable and robust business decision protocol. This may encompass clients, geographically diverse teams or backend suppliers, but the net result is driven by clear communication across devices. All that is then needed is a robust security deployment to make the entire platform foolproof and completely secure.
At the end of the day, it is time enterprises woke up to the fact that IT processes working in silos may no longer be the route to success. Only when all systems are integrated with the objective of transformation, on a single, highly secure platform with converged networks, enabling virtualised data storage and seamless communication between business stakeholders, will an enterprise be ready for the future.
3 Sharp Reasons Why Collaborative Communication Adds to Your Bottom Line
The reach of every human and business is expanding almost as fast as the globe is shrinking. Businesses that stay in constant touch, through various innovative and fast-developing communication technologies, will have a definitive edge over the competition: they will be available, connected to the market in real time, and hence the winners!
Keeping up with the speed of business would be impossible if communication technologies had not taken the path of collaboration. Research has proven that improved, innovative communication that allows real-time collaboration delivers measurable business benefits. Most enterprises have these tools in place, but as individual bundles. Standard phone calls and net connectivity are simply not enough to keep in touch with internal teams, let alone with your market. What is needed is the real-life, constant connectedness that unified communication provides. Besides, when communication technologies are used as standalone tools, their RoI is much lower, and so is their efficiency. It is only with use that enterprises realize the value of a collaborative communication channel that handles all connectivity on a single platform. The savings become clear, and so do the added benefits.
Here are the top three advantages your business can get with a single deployment of collaborative communication technology:
# 1. The topmost benefit, which shows up as business savings in your balance sheet, is the saving made on connected networks. While networking is an imperative in today's enterprise environment, a collaborative communication implementation saves the cost of separate standalone networks, making a converged infrastructure an upfront benefit. Added to this is the saving on executive travel and lodging, which can be completely avoided through virtual meetings that provide the clarity and convenience of a face-to-face meeting. Voice, video and IM communication on a converged network costs many times less than a single executive's travel to a meeting location. These tools reduce communication delays and failures, improve decision quality and sharpen the collaboration process, all at a lower cost.
# 2. Shorter customer-relation cycles, which enhance the customer experience and hence productivity, ensure faster, higher growth in the market. Clearly, any business that can deliver a shorter sales cycle, faster customer response and a more efficient, satisfying market response time is a winner. The savings are tremendous and the downside very limited. The calculation is simple: the potential benefit of collaboration in the product development cycle can be forecast by observing the number of new product releases planned in a period and the change in market share and company revenue in that period. Shorter development and launch cycles ensure the benefits start showing up as a shrinking cycle, added market revenue and savings on cycle processes.
# 3. Collaborative communication places the onus of communication management in the hands of an organisation's IT team. This adds the sheen of higher security and better management, miles above standalone, vendor-driven communication tools. It is a great opportunity for enterprises to straighten up their security policies and achieve almost seamless collaboration for all teams, personnel, offices and operation centers on a single platform. Managed by the technology team, this platform becomes a company asset in terms of security, and allows immediate, strategic decisions to be made. One needs to remember that deploying collaborative communication tools is a strategic business decision, and the RoI it achieves will definitely be an add-on to business processes, without compromise on security or quality of communication.
With these clear, game-changing benefits, it is imperative that enterprises use the power of collaboration. Whether the goal is expansion or just speeding up processes, this is the way to go. The benefits in terms of increased revenue and market share will certainly manifest, and what's more, so will higher employee productivity. A very clear relationship between communication and workflow efficiency starts showing up as the outcome of using UC. The result is a solid foundation, one where communication technology helps leaders make sound, collaborative business decisions at a much faster pace and a much lower cost.
Cognitive Computing: Turning IT Operators into IT Innovators
We are on the cusp of a new age of computing, one where smart businesses are starting to think differently about how they design, build, and deliver technology solutions. We believe that IT teams need to rise up and help drive this innovation.
That’s why we’re collaborating to help our clients apply the power of cognitive insights to transform their businesses. This starts with a perception shift.
Technology as the Job Maker, Not the Job Taker
So much of what you hear about technology today puts technology in the position of job taker. Which jobs will computers and robots make obsolete? That question pops up frequently in conversations about the future of tech, business, and the economy.
We have advocated a version of that view ourselves: consolidate your servers and reduce the manpower needed to maintain your technology; outsource aspects of your technology as PaaS, SaaS, IaaS, cloud-based services, and so forth. This kind of efficiency is part of the story. BUT we've worked hard to relay the second half of this story loudly (though maybe not always clearly).
Yes, this takes away tasks from your IT team, BUT it frees them up to focus on more mission-critical IT work.
IT teams have looked at these kinds of statements skeptically. What would their function be if it weren’t maintaining IT operations?
The Cognitive Computing Era
Welcome to the cognitive computing era, where that vision of IT experts focusing on mission-critical tasks is more of a reality than ever. In the cognitive era, technology no longer just supports the business. It BECOMES the business. Your IT infrastructure is the foundation of this, but IT professionals must do more than just keep the machines running. They must embrace the role of business innovator.
According to IDC, line-of-business leaders say that they perceive IT leaders as business innovators more than business operators. Yet, that same study found that IT leaders viewed themselves more as operators. It’s time for that to change.
The I in IT
Business leaders are recognizing that IT needs to be more tightly integrated into the overall business strategy. In order for this to happen, IT needs to be enabled and viewed as more than infrastructure architects.
IT needs to embrace the idea that they can use technology to drive the business forward. They need to throw off the idea that the I in IT just stands for Information. They need to come forward and proclaim that it stands for Ideas, Imagination, Invention—and that the core of all of that is Innovation.
Priming Your Business for the Cognitive Computing Era
In the cognitive computing era, data analytics turnaround has been cut from hours to milliseconds. Organizations have the ability to act on data in real time. Machines are learning in the cognitive era; they're learning how to help run businesses more efficiently.
None of this is possible without the right IT infrastructure. Today’s machines can understand, reason, and learn. Traditional IT environments work, but to take the leap into the cognitive, you need more.
We believe the core of this is through enhanced capabilities. In the coming weeks, we’ll dive deeper into the following capabilities:
- Analytics Acceleration
- Data Centric Design
- Innovate with Open Technologies
- Choice for Optimization
- Lock Down while Opening Up
- Controlled Iteration at Scale
When these capabilities are wrapped in the core principles of Design for Cognitive Business, Build with Collaborative Innovation, and Deliver through Cloud Platform, businesses can start to map out a path where IT is innovative.
Subscribe to our blog and hear us out. We’re ready to make the case for an IT team that is more than just the architects of the machines. They’re the architects of a new kind of business.
Smarter use of collaborative tools and tips for choosing the right ones
Collaborative tools have taken the enterprise by storm, not only because of their innovative nature, but also because of the convenience they offer for critical processes: client communication and data dissemination. Technology plays a huge part, but collaboration, by its nature, goes beyond that. Here are some ways in which your collaboration tools go beyond mere technological innovation, and how collaboration can be used to give the business an edge.
Enables easier discussions and planning
With a collaborative platform that is free of geographical limits, remote working has become a reality for teams. This is a definitive edge: while it may upset work-life balance, it certainly lets teams get together to share ideas, information and the strategies these will create. Getting the right people together, irrespective of location, is what drives an efficient operation. In a medium-sized organisation, this ability could be the game changer. Online collaboration tools also present the opportunity to create a virtual incubator, getting everyone onto a single platform across the boundaries of companies, colleges or even vendors.
Equipped with file sharing and commenting, audio as well as video collaboration allows employees and teams to work together in real time. It also saves the cost of travel, and perhaps of real estate too. A clear task board is one of the things online collaboration tools can give you: it helps to create and share a focused goal, provides roadmaps for everyone to follow and keeps project and strategy tracking on the same board, pun intended. With good collaboration tools in place, jobs get done far more efficiently by participants from across the world than they would with everyone in the same room.
Even the most basic form of idea sharing, brainstorming, can be transformed by collaborative networking, with innovative techniques and tools to share ideas and information, and even to record them.
They help shape best working practices
When teams work without spatial and communication inhibitions, it becomes easier to discuss best practices, and far easier to adopt them. The company's objectives are much clearer, since everything is out THERE, clear and open for discussion. Time delays, trust issues and the inability to move forward are all created by a lack of free, real-time communication. Collaboration helps to eliminate that!
Much more efficient resource management
Resources, especially skilled ones, are a big cost for every enterprise. As the enterprise moves up the value chain, resources shrink and there is an increasing need to do much more with much less. Collaboration supports this well. These tools cut down on redundant and unnecessary communications and on high-cost channels such as couriers or travel. Much more can be achieved with a small, fast, agile collaborative team, and that is the benefit of these tools. Any issues that crop up can be quickly flagged, resolved and eliminated. Clear conversations that can be tagged from any location, with the ability for management to follow them, add untold value to the process of strategising, and can be the building blocks of a very smart company.
Upping employee engagement and morale
Some industry leaders see the engagement advantage as the biggest reason to adopt collaborative tools. This is a modern management outlook, in which the top-down style of running a company often backfires. Any leader who does not take inputs from employees, but prefers to fire off instructions, is more likely to lose people and respect.
Collaboration tools put everyone on the same platform, affording equal visibility to everyone's ideas, making employees feel important and critical to the company. Nothing makes them more efficient and engaged than the realization that they matter. The manager-driven practice of issuing instructions is terribly detrimental to a company's growth, and increasingly, people are realising that. Besides, there is the little matter of most young employees being better clued into the social market than those at manager level. Winning their interest and commitment could be the smartest resource-management decision, and collaborative tools could be the best conduit for that.
Knowledge Management benefits
The same collaboration that helps share ideas and form well-informed strategies will also enable collaboration around the knowledge repository within the organisation. Most enterprises have a KM database that no one really manages or knows about, so very little of it gets used. Data kept in some obscure corner of a server may be valuable, but its real worth emerges only when it is utilised for the company's benefit. With collaborative tools, it can be used by teams globally, or at least enterprise-wide, to create the best strategies and drive the company's growth. This pooling of data, information and insights could be the catalyst your enterprise needs to get onto the fast track!
Tips for Choosing the Right Collaboration Tool for Your organisation
Today, any organisation needs effective and efficient communication among its employees. We call this capability a collaboration tool, and it comes in the form of software and hardware. Such tools offer many capabilities, from simple instant messaging to video and audio conferencing. Some applications focus on a specific element, while others incorporate several capabilities.
This application software is designed to help people involved in a common task achieve their goals. A collaborative working environment supports people in both their individual and collaborative work, giving rise to a new class of professionals, e-professionals, who can work together irrespective of their geographical location, with many tools stitched together. We define online collaboration software as a software application, platform or tool, delivered as Software-as-a-Service (SaaS), in the cloud or on premises, that offers a number of capabilities within one platform, either built directly in or integrated with other applications.
Online collaboration is a valuable tool for any business that wants to be more efficient and effective, and it isn't just for companies with multiple offices in different geographical locations.
Let's discuss some collaboration capabilities that any organisation should expect.
Some groups want their work done through group messaging, so always look for integrated messaging features. Collaboration doesn't always mean two or more people working simultaneously; at times a document needs to be reviewed by other team members, and messaging helps move the process along. Users viewing a file should be able to leave a message in the file or document, with an option to share the message privately or with a group. Instant messaging (IM) should also be integrated into the online collaboration tool, or be part of the collaborative conferencing tool; sending an IM is an efficient way to communicate.
There are many ways of doing online conferencing (WebEx, GoToMeeting, Spontania, Skype, Google Hangouts and so on), and if your collaboration tool lacks conferencing, you will need to supplement it with one of these services. Look at the number of participants allowed in a single conference: many solutions support 25 participants, though this also depends on the available bandwidth. Check whether the same conference can carry both video and audio, how many continuous presences are available, and whether public and private chat is supported. Also check whether the tool can connect from a laptop to hardware H.323/SIP video conferencing codecs, and vice versa; this normally needs gateway software licenses or hardware. All these collaborative tools should also offer features such as application sharing, desktop sharing, whiteboarding and annotation.
Another common demand is the ability to record the conference, whether on the host PC or in the cloud.
Then there is BYOD, bring your own device: apps should be available so that iOS and Android smartphones and other smart devices can join the collaboration services as well.
Finally, these collaborative tools appeal to the young
One thing is for sure: the young generation won't stay unless they're happy with the tools they're provided. Email isn't enough anymore; they expect to communicate much more naturally, just as they do with their friends.
Fail to adapt to this change and you'll quickly be left behind. If your business isn't open to collaborative software with the latest communication techniques that suit your employees, you might find it hard to attract talented people to fill the gaps in your workforce.
5 Reasons why SMBs must seek out cloud solutions
Big is not necessarily better, and small businesses now know just that. Small businesses are now ready to take on any big competitor, thanks to the technology and variety available in the market these days. Not just that: small enterprises can also offer competitive prices and high levels of customer satisfaction.
Growing comfort with the cloud concept encourages small businesses to move beyond hosted general applications and email service to more critical platforms. SMBs are also considering cloud-based solutions as a prime investment area.
A few reasons why SMBs must seek cloud solutions:
- Flexibility and Scalability: Cloud-based solutions are extremely flexible, and adjustments can be made quickly without a major re-architecture of the Data Center, which makes them the right choice for SMBs. Nor does one need to worry when shifting offices, headquarters or even the Data Center itself.
- Geographic benefits: Cloud solutions offer the unique opportunity of keeping geographically redundant online copies of each backup. This is especially useful should a calamity hit an entire region. Additionally, data backed up in one region can easily be restored in another region anywhere on the globe.
- Speed: In addition to performing updates in a much more timely manner, switching to cloud allows an organization to work much more efficiently. For SMBs and startups that can't afford IT staff, performing updates themselves is time-consuming; putting it in the hands of others means that time can be put to better use.
- Low maintenance: While conventional storage solutions require a significant amount of architecture and technical work to set up and operate day to day, cloud technologies offer complete solutions that may not require in-house IT resources to maintain. Additionally, a cloud managed services provider can supply experts to monitor and maintain backups for your entire organization from a central location.
- Cost Effective: One would need to consider the cost of the hardware needed to back up data, software licenses, storage, electricity and cooling, an offsite storage facility, a DR site in another location, and the qualified technical staff to manage the backup, recovery and testing process. After analyzing all these costs, the low monthly fee of cloud-based solutions quickly becomes attractive (a back-of-the-envelope sketch follows this list).
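Here is that back-of-the-envelope comparison as a small Python sketch; every figure in it is a hypothetical placeholder, to be replaced with your own quotes.

```python
# Hypothetical 3-year TCO comparison: on-prem backup vs cloud backup.
on_prem = {
    "backup_hardware": 12_000,      # one-time, amortised over the 3 years
    "software_licences": 4_000,     # one-time
    "power_and_cooling": 2_400,     # per year
    "offsite_dr_site": 9_000,       # per year
    "admin_staff_share": 15_000,    # per year
}
years = 3
on_prem_total = (on_prem["backup_hardware"] + on_prem["software_licences"]
                 + years * (on_prem["power_and_cooling"]
                            + on_prem["offsite_dr_site"]
                            + on_prem["admin_staff_share"]))

cloud_monthly_fee = 450             # hypothetical subscription
cloud_total = cloud_monthly_fee * 12 * years

print(f"3-year on-prem backup TCO : ${on_prem_total:,}")   # $95,200
print(f"3-year cloud backup TCO   : ${cloud_total:,}")     # $16,200
```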
Human error is now the top threat to information security in organizations
A chain is only as strong as its weakest link. In an enterprise environment, it is often human error that creates the biggest risks. This fact is corroborated by many studies and, more than those, by many outages and episodes. Whether it is ignorance or wilful data theft, the risk that enterprises face from employees and disgruntled ex-employees is as big as, if not bigger than, external cyber threats or APTs. While ignorance has only one remedy, training and policy enforcement, deliberate theft is more difficult to handle and needs stronger tools and security infrastructure in place.
In January 2015, P&G USA filed a suit against four Gillette employees for stealing and sharing corporate information with direct competitors. This is a classic case of the people risks around sensitive corporate information. In a recent study by the Ponemon Institute, employee negligence was identified as the top threat to information security in healthcare organisations. How do CISOs identify a high-risk-behaviour employee? And how do IT organisations fight ignorance and negligence in employees to secure data to the maximum possible extent?
Here are some ideas on how to do it…
1. Identify careless employees, increase risk awareness – tighten up policies
Simple carelessness on the part of an employee, like forgetting to close a portal when not using it, using weak passwords, allowing unauthorised access to information or, most common of all, forgetting a mobile in a cab, can cost a company millions in revenue. It can also undo years of hard work on market reputation or strategy.
The only solution is constant training and creation of awareness about the risk careless behaviour can carry. Making compliance stronger and more strictly enforceable could help here. Creating clear policies of what is mandatory, at any cost, also helps. Once clear cut guidelines are in place, screening is easier and even re-screening isn’t such a cumbersome task any more.
All access levels need to be defined, especially for business-critical systems and data. Strong encryption tools need to be in place, and authentication needs to be a non-negotiable exercise. The use of unwanted devices, sites and applications needs to be regulated as well. Constant training needs to be in place for cyber security awareness, so employees understand the impact of even a single wrong click that opens a malware-laden site. Opening unauthenticated sites, sharing passwords, carrying sensitive information in unencrypted form: everything needs to be regulated, and the employees made aware of the risk, over and over again!
2. Mobility-led risks – the right tools in place
While enterprise mobility cannot be avoided in almost any enterprise, it is one of the biggest causes of data theft and loss. Studies indicate that almost 68% of global organisations have faced a security threat from employee-owned mobile devices.
Every enterprise needs tools in place to keep this risk from blowing up into a full breach. Again, a clearly defined BYOD policy is a critical part of the plan. Monitoring personally owned devices and encrypting data before access are processes that should be strictly enforced. Security solutions for isolating and encrypting corporate data are available, and should be used.
3. Disgruntled employees – screening and rescreening would help
While employee background screening is almost mandatory in every organisation, sometimes it is just not enough; crucial facts about an employee can be missed. In addition, a dissatisfied or frustrated employee is also a threat, especially one who knows it will be easy to walk away without anyone identifying him or her as the cause of a breach. They will have the satisfaction of harming the company!
Against this kind of attitude, a single screening at hiring may not be enough; follow-up screening and rescreening are required. Companies that do not insist on rescreening at regular intervals expose themselves to threats of all kinds. A regular follow-up on every employee's background could detect a malignant element in the company's workforce, providing the early warning needed to step up security or deal with the threat right away.
4. Train and update – constantly
Tools and applications for IT security are innovating rapidly, as with other technologies. Every single threat adds to the body of knowledge on risk, and every instance should be documented and shared.
Every enterprise should keep abreast of these innovations and ensure all employees are trained on a constant basis. By maintaining an updated list of risky behaviours and the circumstances that lead to a breach, a training manual can be created. Employees need regular training on what to be cautious about and how to handle a threat. Clearly articulating the ground rules and elaborating on the consequences will create a culture of security awareness in any enterprise.
Summary
While constantly evolving security technologies are creating updated tools to fight IT security threats, a single wrong action by an employee can undo the best of guards and checks. Every enterprise needs updated information on these tools, and needs to ensure every employee knows how NOT to be a risk. Education, training and awareness about risky behaviour are essential.
Also essential is the policy to make this awareness mandatory, these rules completely enforceable and the training a part of the corporate culture. While technology and tools can provide the ammunition to prevent breaches, the human element needs enterprise focus as well!
Enterprise Policy Vs Technology – are your people the biggest security risk?
According to a study by Intel in September 2015, almost 43% of all data breaches were due to insiders (half of them intentional). Threats perpetrated by disgruntled employees form an overwhelming share of these, especially in the Asia-Pacific region, where they are the second largest cause of all security breaches.
But despite such staggering figures, very few organisations or IT teams take the insider threat seriously: as few as 20% in the US market. A recent Ponemon report says that in 2015, while insider attacks weren't the biggest cause of security breaches, they caused the most damage, about USD 144,000 per instance!
Why?
Globally, very few organisations seem to have a clearly written policy that ensures employee education or affirmation about maintaining security of organisation data. If nothing else, it would help in increasing awareness of what might be dangerous, and lay down the processes for the right way of handling sensitive data!
One of the things this policy needs to define is the regulation of the privileges that trusted operators hold, because they most often have the opportunity to cause the most damage. Since they have the privilege to perform any process on critical systems using critical data, they could also, inadvertently or deliberately, be the biggest threat!
Most organisations confuse trust with granting unrestricted data access to an employee, and that has cost many companies dear! A balance between empowering an employee and controlling access needs to be in place. In the vast majority of cases, unauthorised access comes from the inadvertent sharing of passwords or of access to critical data. What is needed is strict control on access. But that is where the challenge lies: overlapping roles and inconsistent entitlements, and even more than that, a poor governance process that leaves backdoors open in security policy enforcement. Very often, organisations themselves are unaware of where their critical information is stored, which makes it difficult to prevent inappropriate transmittal or access in the first place! And in most cases a company's response to a breach is reactive; there is hardly any attempt at predictive response, and almost never a system or policy in place to identify at-risk accesses or individuals so that an attack can be predicted or pre-empted.
Any policy meant to protect data access against insider threats needs to follow some definitive guidelines, clearly regulating certain permissions and capabilities of employees. These could be:
Data Classification
In order to protect critical data, it first needs to be classified as critical. Understanding the consequences of a leak, an organisation needs to classify information at various levels of criticality, then ensure security policies that conform to the level of protection each needs. The data could include customer data, financial or market data, or systems information. Each of these has a cost attached, and access policies need to be in place for all of them. In addition, the security rules need to be clear on who can access what, and at what level: read, delete, copy or any other use.
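As a hypothetical illustration of classification driving access decisions, consider this minimal Python sketch; the levels, role names and clearances are invented for the example.

```python
# Minimal sketch of data classification driving access decisions.
from enum import IntEnum

class Classification(IntEnum):
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3          # e.g. customer financial data

# Each role is cleared up to a maximum classification level (hypothetical).
ROLE_CLEARANCE = {
    "intern": Classification.PUBLIC,
    "analyst": Classification.INTERNAL,
    "finance_lead": Classification.RESTRICTED,
}

def can_read(role: str, data_level: Classification) -> bool:
    """Unknown roles are cleared only for public data (default minimal access)."""
    return ROLE_CLEARANCE.get(role, Classification.PUBLIC) >= data_level

print(can_read("analyst", Classification.CONFIDENTIAL))    # False
print(can_read("finance_lead", Classification.RESTRICTED)) # True
```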
Privileged Identity and Password Management Policy – a Must
In most organisations, the security and IT admin teams have access to almost all data, protected only by passwords; in some organisations, leadership and stakeholders are also given access. Such privileges need to be monitored by technology tools as well as policy enforcement. Who gets to see and do what, or Privileged Identity Management, has to be clear and simple but non-compromisable, enabling regulation of multiple accesses to critical data.
Often, leadership-level stakeholders share passwords and authorisations that could compromise a company's key data or systems. A policy that lays down clear terms of privileged identity management can control the risks associated with this multiple usage of passwords (a minimal sketch follows).
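The sketch below illustrates the idea of credential checkout with an audit trail and rotation, rather than a standing shared password; the vault, account names and API are hypothetical, not any real PIM product.

```python
# Minimal sketch of privileged password checkout with an audit trail.
import datetime
import secrets

AUDIT_LOG = []

class PrivilegedVault:
    def __init__(self):
        self._secrets = {"prod-db-root": "s3cr3t"}   # placeholder credential

    def checkout(self, account: str, user: str, reason: str) -> str:
        """Release a credential only to an identified user with a stated
        reason, and record who took it and when."""
        AUDIT_LOG.append((datetime.datetime.utcnow(), user, account, reason))
        return self._secrets[account]

    def rotate(self, account: str) -> None:
        """Rotate after use, so a checked-out password cannot be reused."""
        self._secrets[account] = secrets.token_urlsafe(16)

vault = PrivilegedVault()
pwd = vault.checkout("prod-db-root", user="admin_anil", reason="ticket INC-1042")
vault.rotate("prod-db-root")
print(AUDIT_LOG[-1])   # who, what, when, why - for the compliance audit
```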
Role-Based Access Control (RBAC)
In most organisations, privileged access is all-or-nothing, often granting more privileges than a person needs. A regulatory policy should change that and reduce the unnecessary risk to key data and systems information. Policies governing user entitlements need to be strictly enforced in every organisation.
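Here is a minimal sketch of what role-based entitlements look like in code, with hypothetical roles, users and permissions; each role carries only narrowly scoped permissions instead of all-or-nothing access.

```python
# Minimal sketch of role-based access control (RBAC). Names are hypothetical.
ROLE_PERMISSIONS = {
    "dba": {"db:read", "db:write", "db:backup"},
    "support": {"db:read"},                 # no write, no backup
    "auditor": {"db:read", "logs:read"},
}

USER_ROLES = {
    "ravi": {"support"},
    "meera": {"dba", "auditor"},
}

def is_permitted(user: str, permission: str) -> bool:
    """A user holds a permission only via an explicitly assigned role."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_permitted("ravi", "db:write"))    # False: support role is read-only
print(is_permitted("meera", "db:backup"))  # True: via the dba role
```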
Fraudulent Access Identification
For cases where an outsider exploits an insider to access data, advanced authentication methods should be put to use. These go beyond passwords, into contextual factors. Fraudulent access can be identified in simple ways: a person logging in from one place within minutes of logging in from another far away, or security questions answered wrongly; anything could trigger alarm bells and identify a fraudulent authentication attempt. But these checks, too, need to be part of the policy process.
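As one concrete example, the "impossible travel" check described above can be sketched in a few lines; the coordinates, timestamps and speed threshold below are illustrative.

```python
# Minimal sketch of an "impossible travel" contextual authentication check.
import math
from datetime import datetime

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

def impossible_travel(prev, curr, max_kmh=900):
    """True if the user would have needed to travel faster than an airliner."""
    hours = (curr["time"] - prev["time"]).total_seconds() / 3600
    km = distance_km(prev["lat"], prev["lon"], curr["lat"], curr["lon"])
    return hours > 0 and km / hours > max_kmh

login_delhi = {"time": datetime(2016, 3, 1, 10, 0), "lat": 28.61, "lon": 77.21}
login_london = {"time": datetime(2016, 3, 1, 10, 30), "lat": 51.51, "lon": -0.13}
print(impossible_travel(login_delhi, login_london))  # True: ~6,700 km in 30 min
```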
Virtualisation Risks – the Need for Security
Innovative technologies like virtualisation have increased the risk of insider leaks by adding another layer of administrators, those of the hypervisor. With the ability to replicate or transmit data at a single click, the risks have gone up manifold. The usual solution is to embed traditional security applications in the hypervisor layer, but the entire virtual infrastructure needs to be secured too. The security policy must account for emerging technologies and the risks they pose.
Summary
So, to control the problem of unauthorised access, there needs to be a strict security paradigm with automated processes that satisfy compliance audits and identity security policies. What is critical here is tighter incident-management timelines, which deliver a timely and stronger role-based security foundation.
Data Centers 2.0 – What to Expect and What to Plan For
With the enterprise rapidly becoming virtual, by way of the cloud and connections with IoT, physical servers for Data Centers are on their way out. Whatever their storage capabilities, they are just not going to work for the enterprise of today, and definitely not for tomorrow. What, then, is the face of Data Centers 2.0?
The cloud has all but taken over the storage scenario, if Gartner's figures are to be believed. In India, revenue from cloud services is projected to grow 33.2% between 2012 and 2017, meaning cloud expenditure will grow at a higher rate than IT budgets in most organisations. Given this reality, coupled with the rapid adoption and growth of virtualisation as an enterprise technology, the future of Data Centers is going to be vastly different from their present avatar.
Businesses are demanding a change. Virtualization is the enabling technology for a whole new wave of IT innovation. Beyond the technological prerogative, virtualisation will drive efficiency and speed in the enterprise in more ways than one, the primary one being business advantage. There will soon be no reason for any company to entertain monolithic, hardware-based Data Centers that grab an ever higher slice of the IT budget pie just to stay efficient and well managed.
The most obvious way the change will happen is the transformation from everything hardware to everything on a software platform, and beyond storage, even the Data Center networks. So SDDC is definitely the way forward. According to a study by MarketsandMarkets, the market for Software Defined Data Centers will grow at 28.8% a year between 2015 and 2020, from USD 21.78 billion in 2015 to USD 78 billion in 2020. With the irresistible benefit of the reduced IT infrastructure cost that automation provides, and the resultant efficiency of market response, SDDC is definitely the hero of the still-evolving Data Center industry.
With software-enabled systems, multi-layered control is the next easy step. A centralised Data Center management console is now a reality, and it gives analytics and big data processes more space, visibility and hence strategic leverage. These layers can control almost everything: users, space and virtual machines, and can even manage policies. With higher visibility into every aspect of the data and more efficient process control, this will certainly be part of the Data Center of the near future.
With software essentially taking over Data Centers, they will be completely infrastructure agnostic very soon. There will be very little consideration of which hypervisor, servers or platform is being used, since the layered management tools will make it so easy for administrators to scale and work on powerful platforms, in a much more fluid and seamless fashion. The Data Center of tomorrow will take ease of management to hitherto unknown levels. In the same flow, Data Center automation will ensure better workflow management and optimal efficiency management as well. Robotics will almost certainly be utilised to a greater extent in efficiently managing Data Centers of tomorrow. Many known brands are already developing smaller robots that will help in identifying the energy patterns and their efficacy in Data Centers, and this will only get better over the next few years as Robotics becomes a more enterprise oriented technology.
However, just acquiring a Software Defined Data Center may not be the end of enterprise planning; there are certain things to watch out for when taking up this innovative plan. Firstly, SDDC is not an enterprise reality yet, and only organisations with a certain maturity level in I&O engineering are well placed to install it.
Secondly, and more importantly, adoption of a software-defined Data Center should come from a business need for higher agility and scalability; the underlying technology and skill requirements will not be viable if this is a purely technology-driven initiative. In any case, this is not an off-the-shelf product, and any enterprise planning to install one has to do its own due diligence. An SDDC initiative needs the organization to identify a suitable vendor to do the job, and do it well. Then there are discussions around the various components of the Data Center, which may not all come from the same vendor, so issues of integration and seamless interoperability will arise.
Until it becomes commonplace technology, stakeholders need to work on the various commercial aspects of the transaction. Software lock-in will be a big issue for this particular decision, and could play a very strategic role in the financial planning.
All said, the SDDC will replace your Data Center eventually, over the next two to three years, but it may not be the smartest choice for every enterprise right away. Enterprises do, however, need to be prepared for the change when it comes, because by then it will be too late to start planning.