“Today, there is a clear focus on converting data into business value”


NetApp, Inc. is a computer storage and data management company headquartered in Sunnyvale, California. NetApp creates innovative products – storage systems and software – that help customers around the world store, manage, protect and retain one of their most precious corporate assets: their data.

Manish Goel, executive vice-president, product applications at NetApp, spoke to Ashwin Gopinath about how data growth is driving newer and smarter storage solutions.


Manish Goel

Q. Please give us a brief overview of NetApp.
A. NetApp is a storage solution provider. It is a $6.5 billion company employing over 12,000 people globally. We provide computer storage solutions to major enterprises and governments across the globe.

Q. What kind of work is your R&D team involved in?
A. We have a global development model, with 10 engineering sites across the globe, all of which work equally on all projects. India is our largest engineering site at this point in time; 40 per cent of our workforce resides in Bengaluru, and they are involved in every single aspect of our engineering work.

Q. What kind of challenges do you face in this domain?
A. Storage is a critical part of data infrastructure. These days, it is as integral a part as any other in how most companies look to meet their business priorities. Data poses two challenges: unprecedented growth and, from an IT standpoint, intense cost and budget pressures. IT organisations are looking for ways to store their data and get value out of it in a way that is cost-effective, flexible and meets their business priorities.


Q. So what are the biggest trends with respect to data centres right now?
A. The biggest trend in data centres right now is that they are transitioning from application-oriented silos to infrastructures where multiple applications and multiple data storage units can reside on the same infrastructure. This trend is causing people to re-architect their IT data centres and move to approaches that are much more flexible and agile, because their data infrastructure needs to be seamlessly scalable, theoretically infinite in its capacity to scale. It has to be immortal in its ability to store data for a really long time, regardless of the underlying hardware life cycles. It also needs to be intelligent, storing data on the right hardware architecture, so that cost-sensitive data resides on cost-optimised hardware and performance-sensitive data resides on performance-oriented hardware. All of this needs to happen completely automatically and be handled by the storage architecture itself.


Q. What challenges has this humongous increase in data brought forth?
A. There is no doubt that the scale of data has become so large that the traditional approaches to managing it are completely outdated. When you are managing petabytes to exabytes of data, many tasks that were once handled by human administrators are no longer humanly possible because of the sheer size involved; that management has to be done by the underlying infrastructure itself. Whether it is storing data at the right price point, in the right tier, or protecting that data from disasters, both corrective and preventive measures have to be taken by the system. Today, there is a clear focus on working out how to convert this data into business value, and hence a lot of emphasis is placed on data analysis.

Q. So, how do you differentiate yourself from competitors in this market?
A. The approach we have taken for our development site here is that it is absolutely a full-fledged product innovation site. That was a conscious decision we made the day we opened our site here. Many companies, when they opened their operations in India, took an approach where they moved lower-quality or lower-value work to India first, and there was a tiering of the kind of work that took place between the different development sites. We made a conscious decision to have India be a full-fledged part of our product innovation agenda. With any product that we develop, our Indian engineers work side by side with their US counterparts. So, if you are a systems engineer, this is a really exciting place to be. This is where hard-core engineering takes place. We hire the best and the brightest, and we have consistently been a great place to work. Our work culture and the kind of work we do are something we are very proud of.

Q. I read that NetApp worked closely with CERN on the LHC. Could you describe the kind of work involved?
A. The LHC produces extraordinary levels of data, in the range of petabytes per hour. It has close to 150,000 sensors which constantly monitor everything happening in the collider. All of those sensors feed inputs back to databases that are stored for future analysis. Hence, the storage architecture really has to be able to handle that kind of data ingest rate and then make the data available for analysis at a later date. They chose us for their data storage solutions because of our scaling capabilities, as well as the fact that we can handle those kinds of ingest and bandwidth rates.


Q. Tell us a bit about Agile Data Infrastructure (ADI).
A. What ADI really represents is that in the traditional approaches to storing data, data was locked into physical boxes and disparate architectures. People had viewpoints where they needed to store their mission-critical data on tier 1 architecture, non-mission-critical data on tier 2 architecture and archive data on yet another architecture. People chopped their storage architectures into many different buckets: tier 1, tier 2, tier 3, SAN (storage area network), NAS (network attached storage) and so on. What that really did was make the data captive to the physical architecture it was stored on.

What ADI was conceived to convey is that data infrastructure should not place any limits on the ability to store all the different data types, regardless of the service levels the applications are looking for. It should scale seamlessly, it should be available for much longer than the physical hardware, and it should have all the data management intelligence integrated into the data architecture itself. So we at NetApp call it infinite, immortal and intelligent: infinite because of scaling, immortal because of its ability to preserve data for long periods of time, and intelligent because of all the data management capabilities.
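The intelligent-placement idea described above – hot, performance-sensitive data directed to a fast tier and cold data to a cheap capacity tier, without administrator involvement – can be sketched as a simple policy. The following Python fragment is a hypothetical illustration only (the names, threshold and tiers are assumptions, not NetApp's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    reads_per_day: int  # a crude proxy for how "hot" the data is

def place(dataset: DataSet, hot_threshold: int = 1000) -> str:
    """Toy placement policy: frequently read data goes to the performance
    tier (e.g. SSD), rarely read data to the capacity tier (e.g. SATA)."""
    return "performance" if dataset.reads_per_day >= hot_threshold else "capacity"

# An active database lands on the performance tier; an old archive does not.
datasets = [DataSet("oltp-db", 50_000), DataSet("archive-2010", 3)]
placement = {d.name: place(d) for d in datasets}
```

A real system would, of course, base such decisions on continuously measured access patterns and migrate data between tiers over time; the point is that the policy lives in the storage layer, not with a human administrator.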

Q. What are your thoughts on Big Data being accepted throughout the Indian industry?
A. In information technology, big data refers to data sets so large and complex that they become difficult to process using on-hand database management tools. As companies in the country, along with government organisations, take advantage of advances in technology, the global trends apply equally to India as they do to any other country.


Even in India, there are enough examples, such as the UID card project, where an enormous amount of data is being created that requires different approaches to storing and managing it. So, I would say that the global trend of Big Data would apply to India sooner rather than later. Ultimately, you need to store data in the most cost-effective manner: it should be cheaper to store, it should give you the level of protection and data integrity you are looking for and, most importantly, it should allow you to take the data and convert it into business value. That is the NetApp promise: we enable our customers to become more flexible in their infrastructure, so they can do things much more easily and with a high degree of cost effectiveness.

Q. What are your thoughts on the security concerns regarding placing information out on the cloud?
A. There are clearly security concerns, and different approaches are being tried to address them. NetApp, in conjunction with Cisco and VMware, has developed a secure multi-tenancy architecture to reassure people putting their data in the cloud, so that each tenant's data sets and infrastructure are isolated from a security standpoint.

Q. What factors are driving the increasing demand for virtualisation techniques in India?
A. Essentially, what virtualisation promises is a much higher degree of capital utilisation, and that is really what started it. People had a whole bunch of under-utilised x86 servers, and virtualisation enabled them to drive the capital efficiency of that infrastructure much more dramatically. As virtualisation reaches its next stage of maturity, people are beginning to focus on the flexibility it provides, because virtualisation creates a separation between the physical hardware and the actual state of the application running in the virtual machine.

Q. How do you maintain visibility in the Indian market?
A. Our marketing strategy is two-fold. One is to work with the multinationals who are already customers of ours and be a part of their global infrastructure. The second is to focus on the Indian enterprise and government customers and enable them to understand the same value proposition that has made us successful globally.

