September 15th, 2016 by dbakevlar

Along with all the deep learning I’ve been allowed to do about data virtualization, I’ve learned a great deal about Test Data Management.  Since then, I’ve been informally surveying the DBAs I run into, asking them, “How do you get data to your testers so that they can perform tests?” and “How do you deliver code to different environments?”


As a basic skill, database administrators are taught to use export and import tools, along with cloning options, to deliver data where it’s needed for development and, in turn, testing activities.  If DBAs couldn’t deliver on time, due to resource constraints or other limitations, developers would often find their own ways to manually create the data they needed to test new code.  Integration testing teams would manufacture data to validate complicated end-to-end functional testing scenarios, and performance and scalability testing teams would manufacture data that could stress their solutions at scale.  These efforts were rarely successful, and the deployment, along with the company, often felt the pain.

Failure and Data

As the volume of application projects increased, larger IT organizations recognized the opportunity to gain efficiencies of scale and looked for ways to streamline processes and speed up data refreshes, even synthesizing data!  However, developers and testers still had little ability to self-service their needs, and they often found synthesized data incapable of meeting requirements, leading to floundering deployments once code reached production.  IT organizations achieved some efficiencies and cost savings, but the significant savings tied to development and testing productivity and quality remained out of reach.

DevOps To the Rescue

With the adoption of DevOps, a heightened focus on automation and speed of delivery occurred across IT.  Advanced Test Data Management solutions are starting to become a reality, and companies are starting to realize the importance of data distribution, self-service and data security when delivering data to non-production environments.
I truly believe that no IT organization can accuse its development or testing departments of failing to deliver if those groups aren’t offered the environments and data quality required to deliver a quality product.  One way this can be accomplished is via virtualized database and application environments: simply virtualize test and development, eliminating up to 90% of the storage required for physical databases while still offering all the same data that is available in production.  If data security is a concern, this can all be done with data masking, built right into a proper Test Data Management product.
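To see where a figure like 90% comes from, here’s a minimal back-of-the-envelope sketch.  The database size, copy count, and change rate below are hypothetical illustrations, not measured Delphix numbers; the point is simply that thin clones share the source’s blocks and only store the blocks each copy changes.

    # Back-of-the-envelope: full physical clones vs. thin (virtual) clones.
    # All figures are hypothetical and for illustration only.

    SOURCE_TB = 5.0      # size of the production database, in TB
    COPIES = 8           # dev/test copies needed (dev, QA, integration, perf)
    CHANGE_RATE = 0.05   # fraction of blocks each copy modifies over its lifetime

    # Every physical clone is a full copy of the source.
    physical_tb = SOURCE_TB * COPIES

    # Thin clones share one image of the source and store only per-copy changes.
    virtual_tb = SOURCE_TB + SOURCE_TB * CHANGE_RATE * COPIES

    savings = 1 - virtual_tb / physical_tb
    print(f"physical: {physical_tb:.1f} TB, virtual: {virtual_tb:.1f} TB, "
          f"saved: {savings:.0%}")
    # physical: 40.0 TB, virtual: 7.0 TB, saved: 82%

Push the copy count higher or the change rate lower and the savings climb toward the 90% range quoted above.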

Test Drive TDM

If you’re interested in taking a proper Test Data Management product for a test drive, right on your laptop, try out Delphix Express.



August 24th, 2016 by dbakevlar

While chatting on Slack the other day, one of my peers asked if I’d seen that ESG Global had done a write-up on Veritas Velocity.  Velocity is a product that won’t be available until the end of 2016 and is considered “Alpha”.  I was surprised that anyone participating in an Alpha was allowed to publish a description of the product, but that’s not my call to make.


What I found interesting about the article, written by ESG, is its claim that Veritas Velocity “… is combining its sophisticated approaches to data management with its broader ability to deliver superior data protection and information availability in order to offer something revolutionary.”

Revolutionary?

I found this statement quite odd, as what they’re doing simply uses the same technology that Delphix has employed at customer sites since 2008.  They are hopping on the bandwagon, along with a number of other companies, in an attempt to take advantage of the newest buzzword, “Copy Data Management”.

There’s nothing revolutionary about what we do.  It was revolutionary back in 2008, and it may still seem revolutionary to customers who haven’t yet embraced the power of virtualized environments, but to say what they’ve created is revolutionary isn’t true.

If we inspect (at a high level) what Veritas Velocity does:

  1. A self-contained VM appliance to manage storage and thin cloning.
  2. A configuration of storage with an NFS mount presented as VMs.
  3. Hybrid management extending to the cloud.
  4. VMs presented as targets to be used for thin clones.
  5. Elimination of data copies by virtualizing database environments, with a focus on the cloud.
  6. A user interface to manage it all.

I could replace the lead-in to the above list with Delphix, and it would describe the Delphix Engine as well.  We also offer a mature user interface, advanced scripting capabilities and heterogeneous support.
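As a taste of that scripting capability, here’s a minimal sketch that authenticates to a Delphix Engine’s REST API and lists the databases it manages.  The engine address, credentials, and API version shown are placeholders, and the exact endpoints and payloads should be verified against the Delphix API documentation for your engine version.

    # Minimal sketch: log in to a Delphix Engine's REST API and list databases.
    # The engine URL, credentials, and API version are placeholders -- verify
    # the payloads against the Delphix API docs for your engine version.
    import requests

    ENGINE = "http://delphix-engine.example.com"  # hypothetical engine address
    s = requests.Session()

    # 1. Open an API session, declaring the API version we intend to speak.
    s.post(f"{ENGINE}/resources/json/delphix/session", json={
        "type": "APISession",
        "version": {"type": "APIVersion", "major": 1, "minor": 7, "micro": 0},
    }).raise_for_status()

    # 2. Authenticate as an engine user (placeholder credentials).
    s.post(f"{ENGINE}/resources/json/delphix/login", json={
        "type": "LoginRequest",
        "username": "delphix_admin",
        "password": "changeme",
    }).raise_for_status()

    # 3. List the data sources and virtual databases the engine manages.
    for db in s.get(f"{ENGINE}/resources/json/delphix/database").json()["result"]:
        print(db["name"], db["reference"])

The same session could go on to provision or refresh a virtual database, which is where the self-service story for developers and testers comes in.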


There are a lot of companies out there claiming to have revolutionized capabilities like “data virtualization”, “copy data management” and “test data management”.  Delphix has been in this space since the beginning and, as the Gartner reports show, will continue to be the driving force behind what other companies are striving to achieve in their products.


Want to learn how many solutions Delphix virtualization can provide for your company’s data?  Try out Delphix Express, a simple open source version for VirtualBox or VMware on your workstation, and check out who’s been doing it right all along, before it was cool!

