Category: devops

September 30th, 2016 by dbakevlar

The topics of DevOps and Agile are everywhere, but how often do you hear source control thrown into the mix?  Not so much in public, but behind the closed doors of technical and development meetings where Agile is in place, it’s a common theme.  When source control isn’t part of the combination, havoc ensues, and a lot of DBAs end up working nights on production with broken hearts.


Control Freaks

So what is source control and why is it such an important part of DevOps?  The official definition of source control is:

A component of software configuration management, version control, also known as revision control or source control, is the management of changes to documents, computer programs, large web sites, and other collections of information.

Delphix, with its ability to provide developers with as many virtual copies of a database as they need, including masked sensitive data, is a no-brainer for ensuring development and then test have the environments to do their jobs properly.  The added features of bookmarking and branching are the impressive part that creates full source control.

Branching and Bookmarks

Using the diagram below, note how easy it is to mark each iteration of development with a bookmark, making it easy to lock and deliver to test a consistent image via a virtual database (VDB).

[Figure: development timeline with bookmarks and feature branches]

  • Note the feature branches, but every pull and checkout should be a test of the build, including the data.
  • How do we include the data? We connect the source databases (even when the source was multi-terabyte originally) to Delphix, and now we have production data in version control, synchronized from all sources.
  • This is then a single timeline representing all sources from which to develop, branch and test.
  • After each subsequent development deployment, a branch is created for test in the form of a VDB.  The VDBs are all read/write copies, so full testing can be performed, even destructive testing.  It’s simple to reverse a destructive test with Delphix Timeflow.
  • After each test succeeds, a merge can be performed or if a problem occurs in the testing, a bookmark can be performed to preserve the use case for closer examination upon delivery of the VDB image to development.
  • The Delphix engine can keep the environment synced in near real time with production to avoid any surprises that a static physical refresh might create.
  • Each refresh only takes a matter of minutes vs. days or weeks with a physical duplicate or refresh process.  VDBs save over 70% on storage space allocation, too.
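The bookmark-and-branch workflow above can be sketched in a few lines of code. This is an illustrative model only, with made-up names, not the Delphix API: one shared timeline of snapshots, bookmarks that lock an iteration, and branches that provision cheap read/write copies.

```python
# Minimal sketch of the bookmark/branch workflow described above.
# All names are illustrative assumptions -- this is not the Delphix API.

class Timeline:
    """A single timeline of immutable snapshots synced from the sources."""
    def __init__(self):
        self.snapshots = []      # ordered point-in-time snapshots
        self.bookmarks = {}      # name -> snapshot index

    def sync(self, snapshot):
        """Record a new point-in-time snapshot (e.g. after a dev iteration)."""
        self.snapshots.append(snapshot)
        return len(self.snapshots) - 1

    def bookmark(self, name, index=None):
        """Lock an iteration so it can be delivered to test as a consistent image."""
        self.bookmarks[name] = len(self.snapshots) - 1 if index is None else index

    def branch(self, bookmark_name):
        """Provision a read/write VDB from a bookmark; destructive tests
        touch only the copy, never the shared timeline."""
        snap = self.snapshots[self.bookmarks[bookmark_name]]
        return dict(snap)        # stand-in for a copy-on-write clone

timeline = Timeline()
timeline.sync({"orders": 100})
timeline.bookmark("sprint-1")
vdb = timeline.branch("sprint-1")
vdb["orders"] = 0                                # destructive test on the branch
assert timeline.snapshots[0]["orders"] == 100    # source image is unharmed
```

The key property is that destructive testing on a branch never mutates the bookmarked image, so the same bookmark can be re-delivered to test as many times as needed.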

Delphix is capable of all of this, while implementing Agile data masking to each and every development and test environment to protect all PII and PCI data from production in non-production environments.

Delphix, DevOps and Source Control: a match made in heaven.

Posted in Delphix, devops

September 15th, 2016 by dbakevlar

Along with the deep learning I’ve been allowed to do about data virtualization, I’ve learned a great deal about Test Data Management.  Since doing so, I’ve started to do some informal surveys of the DBAs I run into and ask them, “How do you get data to your testers so that they can perform tests?”  “How do you deliver code to different environments?”


As a basic skill, database administrators are taught how to use export and import tools, along with cloning options, to deliver data where it’s needed for development and, in succession, testing activities.  If DBAs didn’t deliver on time, due to resource constraints or inability, developers would often find their own ways to manually create the data they needed to test new code. Integration testing teams would need to manufacture data to validate complicated end-to-end functional testing scenarios, and performance and scalability testing teams would need to manufacture data that could stress their solutions at scale.  Rarely were these efforts successful, and the deployment, along with the company, often felt the pain.

Failure and Data

As the volume of application projects increased, larger IT organizations recognized the opportunity to gain efficiencies of scale and looked for ways to streamline processes, speed up data refreshes, and even synthesize data!  However, developers and testers still had little ability to self-service their needs, and often found synthesized data incapable of meeting requirements, leading to floundering deployments once in production.  IT organizations achieved some efficiencies and cost savings, but significant savings in development and testing productivity and quality remained out of reach.

DevOps To the Rescue

With the adoption of DevOps, a heightened focus on automation and speed of delivery occurred across IT.  Advanced Test Data Management solutions are starting to become a reality.  Companies are starting to realize the importance of data distribution, self-service and data security when delivering to non-production environments.
I truly believe that no IT organization can accuse development or testing departments of failing to deliver if those groups aren’t offered the environments and data quality required to deliver a quality product.  One way this can be accomplished is via virtualized database and application environments.  Simply virtualize test and development, eliminating up to 90% of the storage required for physical databases while still offering all the same data that is available in production.  If data security is a concern, this can all be done with data masking, built right into a proper Test Data Management product.
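The storage claim is easy to sanity-check with back-of-the-envelope arithmetic: many virtual copies share one compressed base image, and each copy stores only its changed blocks. The numbers below are illustrative assumptions, not measured Delphix figures.

```python
# Back-of-the-envelope storage comparison: physical vs. virtual copies.
# All sizes and ratios are illustrative assumptions.

source_tb = 5.0                      # size of the production database
copies = 10                          # dev/test environments needed
compressed_base = source_tb * 0.4    # one shared, compressed base image
delta_per_copy = source_tb * 0.02    # each VDB stores only its changed blocks

physical = source_tb * copies                       # full copy per environment
virtual = compressed_base + delta_per_copy * copies # shared base + small deltas

savings = 1 - virtual / physical
print(f"physical: {physical:.1f} TB, virtual: {virtual:.1f} TB, "
      f"savings: {savings:.0%}")     # prints savings of 94% for these inputs
```

With these assumptions, ten physical copies need 50 TB while ten virtual copies need about 3 TB, and the savings grow with every additional copy, since each one adds only its delta.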

Test Drive TDM

If you’re interested in taking a proper Test Data Management product for a test drive, right on your laptop, try out Delphix Express.


Posted in Delphix, Delphix Express, devops, Test Data Management, Uncategorized

March 21st, 2014 by Kyle Hailey

If someone fraudulently uses your information for medical services or drugs, you could be held liable for the costs.

The demand for healthcare application development has been exploding over the past couple of years because of:

  • Obama Care – Affordable Care Act
  • Regulatory – HITECH and HIPAA Privacy Acts
  • ICD-10
  • Pro-active Health Care (versus reactive health care)
  • Mobile devices

but to develop applications for health care, the data must be masked. Why does masking data matter, especially for health care? If patient information gets out, it can be quite damaging. One heuristic for the importance of healthcare information is that on the black market, health care information on an individual tends to sell for 100x the credit card information for that individual. Imagine that someone needs health coverage and they swipe the health care information of someone else, giving themselves free treatment. The value of the “free treatment” can well exceed the maximums on a credit card. Also imagine the havoc it can cause for the original individual if someone jumps onto their health care: important information like blood type can be logged incorrectly, or the person may have HIV logged against them when they themselves are clear. It can take years to repair the damage, or it may never be repaired if the damage is fatal.

What do Britney Spears, George Clooney, Octomom (Nadya Suleman) and the late Farrah Fawcett have in common? They are all victims of medical data breaches! … How much would a bookie pay to know the results of a boxer’s medical checkup before a title bout? What would a tabloid be willing to pay to be the first to report a celebrity’s cancer diagnosis? Unfortunately it doesn’t stop there, and the average citizen is equally a target.

When data gets to untrusted parties, it is called leakage. To avoid leakage, companies use masking. Masking is a form of data mediation or transformation that replaces sensitive data with equally valid fabricated data. Masking data can be more work on top of the already significant work of provisioning copies of a source database to development and QA. Development and QA can get these database copies in minutes for almost no storage overhead using Delphix (as has been explained extensively in previous blogs), but by default these copies, or virtual databases (VDBs), are not masked.  Without Delphix, masking database copies in development and QA would require masking every single copy, but with Delphix one can provision a single VDB, mask that VDB, and then clone, in minutes and for almost no storage, as many masked copies of that first masked VDB as needed.
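To make "replaces sensitive data with equally valid fabricated data" concrete, here is a minimal masking sketch. It is an illustration of the technique, not Delphix's masking engine: substitutions are deterministic (hash-based), so the same real value always maps to the same fake value and joins across tables still line up, and the fake SSN preserves the original format.

```python
# Minimal sketch of deterministic, format-preserving masking.
# Illustrative only -- not Delphix's masking implementation.

import hashlib

FAKE_NAMES = ["Alex", "Sam", "Jordan", "Casey", "Morgan"]

def mask_name(real_name: str) -> str:
    """Map a real name to a fabricated one; the same input always yields
    the same output, so referential integrity across tables survives."""
    digest = hashlib.sha256(real_name.encode()).digest()
    return FAKE_NAMES[digest[0] % len(FAKE_NAMES)]

def mask_ssn(real_ssn: str) -> str:
    """Produce a format-preserving fake SSN derived from a hash of the real one."""
    digits = int.from_bytes(hashlib.sha256(real_ssn.encode()).digest()[:4], "big")
    d = f"{digits % 10**9:09d}"                 # nine fabricated digits
    return f"{d[:3]}-{d[3:5]}-{d[5:]}"          # keep the NNN-NN-NNNN shape

row = {"name": "Jane", "ssn": "123-45-6789"}
masked = {"name": mask_name(row["name"]), "ssn": mask_ssn(row["ssn"])}
assert mask_name("Jane") == mask_name("Jane")   # deterministic substitution
assert len(masked["ssn"]) == len(row["ssn"])    # format preserved
```

Real masking products add much more (discovery of sensitive columns, referential rules across schemas, audit trails), but the core transformation is this: valid-looking, fabricated, repeatable.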

[Figure: Delphix links to a source database and provisions masked VDB clones from a shared, compressed image]

In the above graphic, Delphix links to a source database, and keeps a compressed version along with a rolling time window of changes from the source database. With this data Delphix can spin up a clone of the source database, anywhere in that time window. The clone can be spun up in a few minutes and takes almost no storage because it initially shares all the duplicate blocks on Delphix. This first VDB can be masked and then clones of the masked VDB can be made in minutes for almost no extra storage.

With Delphix in the architecture, making masked copies is fast, easy and efficient. The first VDB that is masked will take up some extra storage for all the changed data. This VDB can then become the basis for all other development and QA masked copies, so there is no need to worry about whether or not a development or QA database is masked. Because the source for all development and QA copies is masked, there is no way for any unmasked copies to make it into development and QA. Without the secure architecture of Delphix, it becomes more complicated to verify and enforce that each copy is indeed masked. By consolidating the origins of all the downstream copies into a single set of masked shared data blocks, we can rest assured that all the downstream versions are also masked. The cloning interface in Delphix also logs all cloning activity, and chain of custody reports can be run.
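The block-sharing idea behind "almost no extra storage" can be sketched as a copy-on-write store. This is a simplified illustration of the concept, not Delphix internals: every clone reads from the shared masked base and records only its own changed blocks, so the base can never be un-masked by a downstream copy.

```python
# Sketch of copy-on-write block sharing: clones of the masked base store
# only their changed blocks. Illustrative only -- not Delphix internals.

class BlockStore:
    """Holds the shared, masked base image that all clones read from."""
    def __init__(self, base_blocks):
        self.base = dict(base_blocks)

    def clone(self):
        return Clone(self)

class Clone:
    """A read/write VDB: reads fall through to the base, writes stay local."""
    def __init__(self, store):
        self.store = store
        self.delta = {}                 # only this clone's changed blocks

    def read(self, block_id):
        return self.delta.get(block_id, self.store.base.get(block_id))

    def write(self, block_id, data):
        self.delta[block_id] = data     # the shared base is never modified

store = BlockStore({0: "masked-row-a", 1: "masked-row-b"})
qa = store.clone()
qa.write(1, "qa-test-data")             # destructive change in QA
dev = store.clone()
assert dev.read(1) == "masked-row-b"    # dev is unaffected by QA's write
assert len(qa.delta) == 1               # storage cost = changed blocks only
```

Because every read that is not overridden comes from the masked base, an unmasked value simply has no path into a downstream copy, which is the enforcement property the paragraph above describes.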

How do we actually accomplish the masking? Masking can be done with a number of technologies available in the industry. With Delphix, these technologies can be run on a VDB in the same manner that they are currently used with regular physical clone databases. Alternatively, Delphix provides provisioning hooks so that masking tools can be run before the VDB is fully provisioned out.

Delphix has partnered with Axis Technology to streamline and automate the masking process with virtual databases. Look for upcoming blog posts to go into more detail about Axis and Delphix.


Posted in devops
