In a conversation with Prisila, correspondent, Asia Business Outlook Magazine, Alexey shares his views on digital adoption and how a DAP affects digital transformation. During the conversation, he also discusses important design concepts and best practices for implementing various decentralized architectures.
Alexey was born with the “Passion for Data” programmed into his DNA. He started his data career 30 years ago, building a volumetric model of radioactive pollution in the Chernobyl nuclear disaster zone with the Russian Academy of Sciences, and has since earned a reputation as a strategic adviser in all aspects of data management. Prior to joining Denodo as Chief Evangelist, Alexey built and managed enterprise-wide data integration platforms for industry giants such as Nokia and Deutsche Bank. Over the last 15 years, Dr. Alexey has helped many companies across three continents digitally transform their business with cutting-edge data technologies from Teradata and Informatica.
Share your thoughts on big data and cloud technologies.
Big data is the reality that you have to deal with massive amounts of data and process them at very high speed. Cloud, on the other hand, is the concept of delegating data storage and data processing to a centralized infrastructure. The two go hand in hand in the tech world: you can have your big data on the cloud, and big data and analytics on premises as well. Cloud is the same technology, but moved and delegated to global cloud providers such as Amazon, Google and others. You can run big data analytics, AI and ML algorithms over huge volumes of data locally, or move it all to the cloud. So, I feel the future will be all about big data analytics on the cloud.
Why are Master Data Management solutions booming in today's world?
Master Data Management as a concept is a very important part of the data management framework and has been around for many years now; MDM has been at the plateau of maturity for quite some time. The whole idea of MDM is to look for data distributed across various applications, relate it to a single subject (a person, a product or a company, for instance), and then try to understand whether that data really belongs to the given subject. The next step is to merge the information you have and pick up the most relevant and up-to-date values.
There are a lot of tools and platforms that implement the concept of data mastering. There are specific tools such as Customer 360, Product 360 and Citizen 360 that focus on narrow MDM use cases. Similarly, there are multi-domain solutions that consume any kind of distributed data and bring it together.
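To make the matching-and-merging idea concrete, here is a minimal Python sketch of the general technique, not of any particular MDM product; all record and field names are hypothetical. Records from three systems are related to a single subject by a normalized email, and a simple survivorship rule keeps the most recent non-null value for each attribute:

```python
from datetime import date

# Hypothetical customer records from three source systems; field names
# and values are illustrative, not from any specific MDM product.
records = [
    {"source": "crm",     "email": "a.smith@example.com", "phone": None,
     "name": "A. Smith",    "updated": date(2021, 3, 1)},
    {"source": "billing", "email": "A.Smith@Example.com", "phone": "555-0101",
     "name": "Alice Smith", "updated": date(2022, 6, 15)},
    {"source": "support", "email": "a.smith@example.com", "phone": "555-0199",
     "name": None,          "updated": date(2020, 1, 9)},
]

def match_key(record):
    """Relate a record to a single subject via a normalized identifier."""
    return record["email"].strip().lower()

def merge(matched):
    """Survivorship rule: for each attribute, keep the most recent non-null value."""
    golden = {}
    for rec in sorted(matched, key=lambda r: r["updated"]):
        for field in ("email", "phone", "name"):
            if rec[field] is not None:
                golden[field] = rec[field]
    return golden

# Group records by subject, then build one golden record per subject.
subjects = {}
for rec in records:
    subjects.setdefault(match_key(rec), []).append(rec)

for key, matched in subjects.items():
    print(key, "->", merge(matched))
```

Real platforms use much richer matching (fuzzy names, addresses, probabilistic scoring) and configurable survivorship rules, but the match-then-merge shape is the same.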
Enlighten us about digital adoption and how a DAP affects digital transformation.
Every single conference or seminar that I have attended recently across the world has at least a couple of presentations or sessions pertaining to digital transformation. DAP is a framework that helps organizations adopt digital transformation in an efficient way and is also known as the gold standard for digitizing the business and all the processes within the organization. Digital transformation, however, is not only about the framework; there are many components to it. In my opinion, people have to be very proactive in driving digital transformation within their company in order to utilize the DAP framework effectively. Also, depending on your business, you can use different tools and products available in the market to enhance your digital adoption process.
This entire process is no longer optional; companies must embrace digital transformation to stay afloat in this hypercompetitive business landscape, and DAP is one tool that will play a crucial role in organizations' digital transformation journeys.
Give us an overview of the important design concepts and best practices for implementing various decentralized architectures.
When we started doing data management 25-30 years ago, the concept was very simple and primitive. The idea was to get all the data, copy it into a single repository, and then do the important data analytics on top of that. We also had very good frameworks provided by eminent industry leaders such as Ralph Kimball, who came up with the idea of data warehousing, which was brilliant in those days. As a result, everybody started building data warehouses and moving data from all possible applications into a centralized analytical repository.
After some time, we moved to the next phase of data management, which is when the concept of big data came into the picture. We had to take all the data we could access and move it physically onto a single platform. If you look at the modern data landscape, every single organization has several data warehouses. While some data inevitably goes to the cloud, some stays on premises because of security regulations. This is the on-the-ground reality of today’s data architecture. Decentralized architecture is not something that was invented; it is the very nature of the data.
Data mesh also deals with distributed architecture. The cornerstone of data mesh is that the data producer must own the data. For example, if the finance department is producing the data, it has to own that data and be responsible for its quality. We can see a lot of implementations of this architecture in the eastern hemisphere and at government levels, where the data is very sensitive and thus not available to the general public.
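As a rough illustration of producer ownership, here is a hypothetical Python sketch (not a real data mesh framework; all names are invented) in which the producing domain publishes a data product together with the quality rule it is accountable for:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical "data product": the producing domain (here, finance)
# publishes its data along with ownership and a quality check it owns.
@dataclass
class DataProduct:
    name: str
    owner: str                        # the producing domain owns the data
    rows: list
    quality_check: Callable[[dict], bool]

    def publish(self):
        """Expose only rows that pass the owner's quality rule."""
        return [r for r in self.rows if self.quality_check(r)]

finance_invoices = DataProduct(
    name="invoices",
    owner="finance",
    rows=[{"id": 1, "amount": 250.0}, {"id": 2, "amount": None}],
    quality_check=lambda r: r["amount"] is not None,
)

print(finance_invoices.publish())  # [{'id': 1, 'amount': 250.0}]
```

The point is organizational rather than technical: the quality rule lives with the finance domain that produces the data, not with a central team downstream.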
Why is data virtualization a suggested approach for businesses that require flexible data integration?
Data virtualization is not a substitute for data warehousing; it is an evolution of data management. We can still use the historical data in data warehouses or data lakes, but not all of the data sits in a single warehouse or lake; it is scattered across many systems, and it is no longer feasible to move and store all of it on a single platform. At the same time, there is huge demand for real-time data processing and analytics. Data virtualization addresses both: it leaves the data where it is and integrates it at query time.
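As a minimal sketch of the idea (using two in-memory SQLite databases to stand in for independent sources; this is not Denodo's API, and all names and schemas are illustrative), a "virtual view" fetches from each source at query time and joins the results in the integration layer, instead of first replicating everything into one repository:

```python
import sqlite3

# Simulate two independent sources: a historical warehouse and a live
# operational database. Names and schemas are illustrative only.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE sales_history (customer_id INT, total REAL)")
warehouse.executemany("INSERT INTO sales_history VALUES (?, ?)",
                      [(1, 1200.0), (2, 340.5)])

operational = sqlite3.connect(":memory:")
operational.execute("CREATE TABLE customers (id INT, name TEXT)")
operational.executemany("INSERT INTO customers VALUES (?, ?)",
                        [(1, "Acme Ltd"), (2, "Globex")])

def customer_spend():
    """A 'virtual view': query each source at request time and join the
    results in the integration layer, without copying data into one store."""
    totals = dict(warehouse.execute(
        "SELECT customer_id, total FROM sales_history"))
    names = dict(operational.execute(
        "SELECT id, name FROM customers"))
    return [(names[cid], total) for cid, total in totals.items()]

print(customer_spend())  # [('Acme Ltd', 1200.0), ('Globex', 340.5)]
```

A real data virtualization layer would also push filters down to each source and expose the result as a reusable, queryable view, but the leave-the-data-in-place principle is the same.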