big data, Database, DBA Life, Delphix, Oracle, SQLServer

Understanding Data Gravity as a DBA

Data gravity, and the friction it causes within the development cycle, is an incredibly obvious problem in my eyes.

Data gravity is rooted in the von Neumann bottleneck, a basic limitation on how fast computers can be. It's pretty simple: the speed at which data can move between where it resides and where it's processed is the limiting factor in computing speed.
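To make that concrete, here's a minimal Python sketch of my own (not from any benchmark) that times summing bytes already sitting in memory against reading the same bytes back off disk first. The data size and file handling are arbitrary assumptions, and the OS page cache will soften the gap, but the extra cost of moving the data before you can touch it is the point.

```python
# Minimal sketch: the same work costs more once the data has to be moved first.
import os
import tempfile
import time

N = 20_000_000                                   # ~20 MB of raw bytes (arbitrary)
data = bytes(range(256)) * (N // 256)            # data already sitting in memory

# Write the same bytes to disk so we can pay the cost of hauling them back.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name

start = time.perf_counter()
in_memory_total = sum(data)                      # process data where it already lives
t_memory = time.perf_counter() - start

start = time.perf_counter()
with open(path, "rb") as f:
    from_disk_total = sum(f.read())              # move the data first, then process it
t_disk = time.perf_counter() - start             # (page cache makes this a best case)

os.unlink(path)

print(f"in memory : {t_memory:.3f}s")
print(f"from disk : {t_disk:.3f}s  (same work, plus the cost of moving the data)")
```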

OLAP, DSS and VLDB DBAs are constantly in battle with this challenge: how much data is being consumed in a process, how much must be brought in from disk, and whether the processing required to produce the results will end up "spilling" to disk instead of completing in memory.
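To picture what "spilling" means, here's a hedged sketch of the general technique, an external sort, rather than any database engine's actual implementation. The memory budget, temp files and pickling are all illustrative assumptions: when the input exceeds the budget, sorted runs are parked on disk and merged back together.

```python
# Simplified "spill": sort in chunks that fit the budget, park sorted runs on
# disk, then merge the runs. Real engines do this with work areas / tempdb.
import heapq
import os
import pickle
import random
import tempfile

def sort_with_budget(rows, max_rows_in_memory):
    """Sort `rows`, spilling sorted runs to temp files when the budget is exceeded."""
    if len(rows) <= max_rows_in_memory:
        return sorted(rows)                      # everything fits: in-memory sort

    run_files = []
    for i in range(0, len(rows), max_rows_in_memory):
        run = sorted(rows[i:i + max_rows_in_memory])
        f = tempfile.NamedTemporaryFile(delete=False)
        pickle.dump(run, f)                      # "spill" the sorted run to disk
        f.close()
        run_files.append(f.name)

    runs = []
    for path in run_files:
        with open(path, "rb") as f:
            runs.append(pickle.load(f))
        os.unlink(path)
    return list(heapq.merge(*runs))              # merge the sorted runs

rows = [random.randint(0, 1_000_000) for _ in range(100_000)]
assert sort_with_budget(rows, max_rows_in_memory=10_000) == sorted(rows)
```

The in-memory path is cheap; the spilled path does the same sort plus all the extra reads and writes, which is exactly the tax those DBAs are trying to avoid.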

Microsoft researcher Jim Gray spent much of his career looking at the economics of data, which is one of the most accurate names for this area of technical study. He started working at Microsoft in 1995 and, although passionate about many areas of technology, his research on large databases and transaction processing speeds commands great respect in my world.

Now, some may say this has little to do with being a database administrator, but how many of us spend significant time on the cost-based optimizer? Moving or retrieving data has a cost, so economics of data it is.
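As a rough illustration of what that "cost" amounts to, the toy model below uses my own made-up numbers and formulas, not Oracle's or SQL Server's actual costing. It compares a full scan against an index path purely by how much data each would have to move, and the cheaper plan flips as the fraction of rows you need grows.

```python
# Toy cost model: the optimizer's "cost" is largely an estimate of data movement.
# Hypothetical table: 1,000,000 rows packed 100 rows per block = 10,000 blocks.
TABLE_ROWS, ROWS_PER_BLOCK = 1_000_000, 100
TABLE_BLOCKS = TABLE_ROWS // ROWS_PER_BLOCK

def full_scan_cost(multiblock_read=8):
    # Read every block of the table, several blocks per physical I/O.
    return TABLE_BLOCKS / multiblock_read

def index_cost(selectivity, index_height=3):
    # Walk the index, then (worst case) visit one table block per matching row.
    return index_height + selectivity * TABLE_ROWS

for sel in (0.0001, 0.01, 0.5):
    scan, idx = full_scan_cost(), index_cost(sel)
    winner = "index" if idx < scan else "full scan"
    print(f"selectivity {sel:>7}: full scan {scan:>8.0f}  index {idx:>8.0f}  -> {winner}")
```

Fetch a handful of rows and the index wins; ask for enough of the table and hauling it all in one pass is cheaper. Either way, the plan is chosen by the economics of moving data.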

And this is the fundamental principle of data gravity and why DBAs get the big bucks.

If you’re interested in learning more about data gravity, DevOps and the future of DBAs, register for the upcoming webinar.

Kellyn

http://about.me/dbakevlar
