Category: devops

May 9th, 2017 by dbakevlar

Different combinations in the game of tech sometimes create a winning roll of the dice, and other times a loss.  Better communication between teams offers a better chance to avoid gaps in your development cycle tooling, especially when DevOps is the solution you’re striving for.

It doesn’t hurt to have a map to help guide you.  This interactive map from XebiaLabs can offer a little clarity on the solutions, but there are definitely some holes in multiple places that could be clarified a bit more.

The power of this periodic table of DevOps tools isn’t just that the tools are broken up by type, but that you’re able to easily filter by open source, freemium, paid or enterprise-level tools.  This assists in reaching the goals of your solution.  As we all get educated, I’ll focus horizontally in future posts, but today we’ll take a vertical look at the Database Management tier, where I specialize, first.

Apples aren’t Apfel

When comparisons are made, it’s common to be unable to compare apples to apples.  Focusing on the Database Management toolsets, such as Delphix, I can tell you that Redgate is the only one I view as a competitor, and that only happened recently with their introduction of SQL Clone.  The rest of the products shown don’t offer any virtualization (our strongest feature) in their product, and we consider Liquibase and Datical partners in many use cases.

Any tool is better than nothing, even one that helps you choose tools.  So let’s first discern what the “Database Management” tools are supposed to accomplish, and then create a comparison of our own.  The overall goal appears to be version control for your database, which is a pretty cool concept.

DBMaestro

The first product on the list is something I like because of the natural “control issues” I have as a DBA.  You want to know that changes to a database occurred in a controlled, documented and organized manner.  DBMaestro allows for this and has some pretty awesome ways of doing it.  Considering that DevOps is embracing agility at an ever-increasing rate, having version control capabilities that work with both Oracle SQL Developer and Microsoft Visual Studio is highly attractive.  The versioning is still at the code-change level and not at the data level, but it’s still worthy of discussion.

That it offers all of this through a simple UI is very attractive to the newer generation of developers, and DBAs will still want to be part of it.

Liquibase

This is the first of two companies on the list that we partner with.  It’s very different from DBMaestro, as it’s the only open source option among the database management products, and it works with XML, JSON, SQL and other formats. You can build just about anything you require, and the product has an extensive support community, so if you need to find an example, it’s pretty simple to do so online.

I really like the fact that Liquibase takes compliance into consideration and has the capability to hold SQL changes from executing until they’re approved by the appropriate responsible party.  It may not be version control of your data, but at least you can closely monitor and time the changes to it.

Where Liquibase partners with Delphix is that we can perform continuous delivery via Liquibase and Delphix can then version control the data tier.  We can be used for versioning, branching and refreshing from a previous snapshot if there was a negative outcome in a development or test scenario, making continuous deployment a reality without requiring “backup scripts” for the data changes.
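As a minimal sketch of how that pairing might look (hypothetical table, column and author names; Liquibase also accepts XML and JSON formats), a formatted SQL changelog lets Liquibase track which change sets have been applied, while Delphix versions the data underneath:

```sql
--liquibase formatted sql

--changeset dbakevlar:1
--comment: create the table the release will modify
CREATE TABLE kinder_tbl (
  c1 NUMBER,
  c2 VARCHAR2(100)
);
--rollback DROP TABLE kinder_tbl;

--changeset dbakevlar:2
--comment: seed reference data; Liquibase records this in DATABASECHANGELOG
INSERT INTO kinder_tbl (c1, c2) VALUES (1, 'seed row');
--rollback DELETE FROM kinder_tbl WHERE c1 = 1;
```

Running `liquibase update` applies only the change sets not yet recorded in the change log table, and each change set carries its own rollback; a Delphix refresh or rewind would handle reverting the data itself.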

Redgate SQL Source Control

Everybody loves a great user interface, and like most, there’s a pretty big price tag that goes along with the ease of use when you add up all the different products that are offered.  There’s just a ton you can do with Redgate, and you can do most of it for Oracle, MSSQL and MySQL, which is way cool.  Monitor, develop, virtualize; but the main recognition you’re getting on the periodic table of DevOps tools is for version control and comparisons.  This comes from Redgate’s SQL Source Control, part of quite a full suite of products for the developer and the DBA.

Datical

This is another product that we’ve partnered with repeatedly.  The idea that we, as DBAs, can review and monitor any and all changes to a database is very attractive to any IT shop. Having it simplified into a tool is incredibly beneficial to any business that wants to deliver continuously, and when implemented with Delphix, the data can be delivered as fast as the rest of the business.

Idera

Idera’s DB Change Manager can give IT a safety net to ensure that the changes intended are the changes that happen in the environment.  Idera, just like many of the others on the list, supports multiple database platforms, which is a keen feature of a database change control tool, but no virtualization or copy data management (CDM) tool exists, or at least not any longer.

Fitting in

So where does Delphix fit in with all of these products?  We touched on it a little bit as I mentioned each of these tools. Delphix is recognized for the ability to deploy, and that it does so as part of continuous delivery is awesome, but as I stated, it’s not a direct apples-to-apples comparison, as we not only offer version control, but we do so at the data level.

Delphix Jet Stream

So let’s create an example.

We can do versioning and track changes in releases by way of our Jet Stream product.  Jet Stream is the much-loved product for our developers and testers.

I’ve often appreciated any toolset that lets others fish instead of me fishing for them.  Offering a developer or tester access to an administration console meant for a DBA can only set them up to fail.

Jet Stream’s interface is really clean and easy to use.  It has a clear left-hand panel with options to access, and the interaction is direct about what the user will be doing.  I can create bookmarks with named versions, which gives me the ability to return to any of those points later.

If a developer is using Jet Stream, they make changes as part of a release and, once complete, create a bookmark (a snapshot in time) of their container.  (A container here is made up of the database, the application tier and anything else we want included that Delphix can virtualize.)

We’ve started our test run of a new development deployment.  We’ve made an initial bookmark signaling the beginning of the test, and then a second bookmark to say the first set of changes was completed.

At this time, there’s a script that removes 20 rows from a table.  The check queries all verified that this is the delete statement that should remove the 20 rows in question.

SQL> delete from kinder_tbl  where c2 like '%40%';

143256 rows deleted.

SQL> commit;

SQL> insert into kinder_tbl values (...

When the tester performs the release and hits this step, the catastrophic change to the data occurs.

Whoops, that’s not 20 rows.
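Had the script guarded its own row count, the damage could have been caught before the commit.  Here’s a defensive sketch (same table as above; the expected count of 20 comes from the scenario):

```sql
-- Sketch: verify the affected row count before committing a destructive change
DECLARE
  v_expected CONSTANT PLS_INTEGER := 20;
BEGIN
  DELETE FROM kinder_tbl
   WHERE c2 LIKE '%40%';

  IF SQL%ROWCOUNT <> v_expected THEN
    ROLLBACK;  -- undo the delete; here 143256 rows matched, not 20
    RAISE_APPLICATION_ERROR(-20001,
      'Expected ' || v_expected || ' rows, deleted ' || SQL%ROWCOUNT);
  END IF;

  COMMIT;
END;
/
```

Of course, a guard like this only helps when someone thought to write it, which is exactly why a data-level safety net matters.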

Now, the developer could grovel to the DBA to use archaic processing like flashback database, or worse, import the data back into a second table and merge in the missing rows.  There are a few ways to skin this cat, but what if the developer could recover the data themselves?

This developer was using Jet Stream and can simply go into the console, where they’ve been taking that extra couple seconds to bookmark each milestone during the release to the database, which INCLUDES marking the changes to the data!

If we inspect the bookmarks, we can see that the second of three occurred before the delete of the data.  This makes it simple to use the “rewind” option (the bottom-right icon, next to the trash can for removing the bookmark) to revert the data changes.  Keep in mind that this will revert the database back to the point in time when the bookmark was taken, so ALL changes will be reverted to that point in time.

Once that is done, we can verify quickly that our data is back, with no need to bother the DBA and no concern that a catastrophic change to data has impacted the development or test environment.

SQL> select count(*) from kinder_tbl
  where c2 like '%40%';

  COUNT(*)
----------
    143256

I plan on going through different combinations of tools in the periodic table of DevOps tools in upcoming posts, showing the strengths and drawbacks of various implementation choices.  Until the next post, have a great week!


Posted in Delphix, devops

September 30th, 2016 by dbakevlar

The topics of DevOps and Agile are everywhere, but how often do you hear source control thrown into the mix?  Not so much in public, but behind the closed doors of technical and development meetings where Agile is in place, it’s a common theme.  When source control isn’t part of the combination, havoc ensues, and a lot of DBAs end up working nights on production with broken hearts.


Control Freaks

So what is source control and why is it such an important part of DevOps?  The official definition of source control is:

A component of software configuration management, version control, also known as revision control or source control, is the management of changes to documents, computer programs, large web sites, and other collections of information.

Delphix, with its ability to provide developers with as many virtual copies of databases as they need, including masked sensitive data, is a no-brainer for ensuring that development, and then test, have the environments to do their jobs properly.  The added features of bookmarking and branching are the impressive part that creates full source control.

Branching and Bookmarks

Using the diagram below, note how easy it is to mark each iteration of development with a bookmark, making it simple to then lock and deliver a consistent image to test via a virtual database (VDB).

  • Note the feature branches; every pull and checkout should be a test of the build, including the data.
  • How do we include the data? We connect the source databases (even when a source was multi-terabyte originally) to Delphix, and now we have production data in version control, synchronized from all sources.
  • This is then a single timeline representing all sources from which to develop, branch and test.
  • After each subsequent development deployment, a branch is created for test in the form of a VDB.  The VDBs are all read/write copies, so full testing can be performed, even destructive testing.  It’s simple to reverse a destructive test with Delphix Timeflow.
  • After each test succeeds, a merge can be performed; or, if a problem occurs in testing, a bookmark can be taken to preserve the use case for closer examination upon delivery of the VDB image to development.
  • The Delphix engine can keep the environment synced near real-time with production, deterring any surprises that a static physical refresh might create.
  • Each refresh takes only a matter of minutes vs. days or weeks with a physical duplicate or refresh process.  VDBs save over 70% on storage space allocation, too.

Delphix is capable of all of this, while implementing Agile data masking to each and every development and test environment to protect all PII and PCI data from production in non-production environments.

Delphix, DevOps and Source Control-  a match made in heaven.

Posted in Delphix, devops

September 15th, 2016 by dbakevlar

Along with the deep learning I’ve been allowed to do about data virtualization, I’ve learned a great deal about Test Data Management.  Since then, I’ve started doing informal surveys of the DBAs I run into, asking them, “How do you get data to your testers so that they can perform tests?” and “How do you deliver code to different environments?”


As a basic skill, database administrators are taught how to use export and import tools, along with cloning options, to deliver data where it’s needed for development and, in succession, testing activities.  If DBAs didn’t deliver on time, due to resource constraints or inability, developers would often find their own ways to manually create the data they needed to test new code. Integration testing teams would need to manufacture data to validate complicated end-to-end functional testing scenarios, and performance and scalability testing teams would need to manufacture data that could stress their solutions at scale.  Rarely were these efforts successful, and the deployment, along with the company, often felt the pain.

Failure and Data

As the volume of application projects increased, larger IT organizations recognized the opportunity to gain efficiencies of scale and sought ways to streamline processes and speed up data refreshes, even synthesizing data!  However, developers and testers still had little ability to self-service their needs, and they often found synthesized data incapable of meeting requirements, leaving deployments to flounder once in production.  IT organizations were able to achieve some efficiencies and cost savings, but significant savings related to development and testing productivity and quality remained elusive.

DevOps To the Rescue

With the adoption of DevOps, a heightened focus on automation and speed of delivery occurred across IT.  Advanced Test Data Management solutions are starting to become a reality.  Companies are starting to realize the importance of data distribution, self-service and data security when delivering to non-production environments.

I truly believe that no IT organization can accuse its development or testing departments of lacking delivery if those groups aren’t offered the environments and the data quality required to deliver a quality product.  One of the ways this can be accomplished is via virtualized database and application environments.  Simply virtualize test and development, eliminating up to 90% of the storage required for physical databases while still offering all the same data that is available in production.  If data security is a concern, this can all be done with data masking, built right into a proper Test Data Management product.

Test Drive TDM

If you’re interested in taking a proper Test Data Management product for a test drive, right on your laptop, try out Delphix Express.

 

Posted in Delphix, Delphix Express, devops, Test Data Management

March 21st, 2014 by Kyle Hailey

If someone fraudulently uses your information for medical services or drugs, you could be held liable for the costs.

The demand for healthcare application development has been exploding over the past couple of years because of:

  • Obama Care – Affordable Care Act
  • Regulatory – HITECH and HIPAA Privacy Acts
  • ICD-10
  • Pro-active Health Care (versus reactive health care)
  • Mobile devices

But developing applications for health care requires the data to be masked. Why does masking data matter, especially for health care? If patient information gets out, it can be quite damaging. One heuristic for the importance of healthcare information is that on the black market, an individual’s health care information tends to sell for 100x their credit card information. Imagine that someone needs health coverage and swipes the health care information of someone else, getting free treatment. The value of the “free treatment” can well exceed the maximums on a credit card. Also imagine the havoc it can cause for the original individual if someone jumps onto their health care: important information like blood type can be logged incorrectly, or the person may have HIV logged against them when they themselves are clear. It can take years to repair the damage, or it may never be repaired if the damage is fatal.

What do Britney Spears, George Clooney, Octomom (Nadya Suleman) and the late Farrah Fawcett have in common? They are all victims of medical data breaches! … How much would a bookie pay to know the results of a boxer’s medical checkup before a title bout? What would a tabloid be willing to pay to be the first to report a celebrity’s cancer diagnosis? Unfortunately, it doesn’t stop there, and the average citizen is equally a target.

When data gets to untrusted parties, it is called leakage. To avoid leakage, companies use masking. Masking is a form of data mediation or transformation that replaces sensitive data with equally valid fabricated data. Masking data can be more work on top of the already significant work of provisioning copies of a source database to development and QA. Development and QA can get these database copies in minutes for almost no storage overhead using Delphix (as has been explained extensively in previous blogs), but by default these copies, or virtual databases (VDBs), are not masked.  Without Delphix, masking database copies in development and QA would require masking every single copy; with Delphix, one can provision a single VDB, mask that VDB, and then clone, in minutes and for almost no storage, as many masked copies of that first masked VDB as needed.
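As a trivial illustration of the transformation involved (hypothetical table and column names; commercial masking tools use far more sophisticated, consistent algorithms), masking replaces sensitive values with fabricated but equally valid ones:

```sql
-- Sketch: fabricate stand-in values derived deterministically from the key,
-- so the same patient always masks to the same fake values across copies
UPDATE patients
   SET last_name  = 'Patient_' || patient_id,
       ssn        = '999-'
                    || LPAD(MOD(patient_id * 7919, 100), 2, '0') || '-'
                    || LPAD(MOD(patient_id * 104729, 10000), 4, '0'),
       birth_date = TRUNC(birth_date, 'YYYY');  -- keep only the year
COMMIT;
```

Deriving the fake values from the key, rather than randomizing, keeps the masked data internally consistent, which is what lets masked copies still exercise realistic joins and lookups in development and QA.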

(Graphic: Delphix links to a source database and provisions masked VDB clones.)

In the above graphic, Delphix links to a source database, and keeps a compressed version along with a rolling time window of changes from the source database. With this data Delphix can spin up a clone of the source database, anywhere in that time window. The clone can be spun up in a few minutes and takes almost no storage because it initially shares all the duplicate blocks on Delphix. This first VDB can be masked and then clones of the masked VDB can be made in minutes for almost no extra storage.

With Delphix in the architecture, making masked copies is fast, easy and efficient. The first VDB that is masked will take up some extra storage for the changed data. This VDB can then become the basis for all other development and QA masked copies, so there is no need to worry about whether or not a development or QA database is masked. Because the source for all development and QA copies is masked, there is no way for any unmasked copies to make it into development and QA. Without the secure architecture of Delphix, it becomes more complicated to verify and enforce that each copy is indeed masked. By consolidating the origins of all the downstream copies into a single set of masked shared data blocks, we can rest assured that all the downstream versions are also masked. The cloning interface in Delphix also logs all cloning activity, and chain-of-custody reports can be run.

How do we actually accomplish the masking? Masking can be accomplished with a number of technologies available in the industry. With Delphix, these technologies can be run on a VDB in the same manner they are currently used with regular physical clone databases. Alternatively, Delphix has provisioning hooks where masking tools can be leveraged before the VDB is fully provisioned out.

Delphix has partnered with Axis Technology to streamline and automate the masking process with virtual databases. Look for upcoming blog posts to go into more detail about Axis and Delphix.


Posted in devops
