The topics of DevOps and Agile are everywhere, but how often do you hear source control thrown into the mix? Not so much in public, but behind the closed doors of technical and development meetings where Agile is in place, it's a common theme. When source control isn't part of the combination, havoc ensues and a lot of DBAs end up working nights on production with broken hearts.
So what is source control and why is it such an important part of DevOps? The official definition of source control is:
A component of software configuration management, version control, also known as revision control or source control, is the management of changes to documents, computer programs, large web sites, and other collections of information.
Delphix, with its ability to provide developers with as many virtual copies of databases as they need, including masked sensitive data, is a no-brainer for ensuring development and then test have the environments to do their jobs properly. The added features of bookmarking and branching are the impressive part that creates full source control.
Using the diagram below, note how easy it is to mark each iteration of development with a bookmark, making it easy to then lock and deliver a consistent image to test via a virtual database (VDB).
Delphix is capable of all of this, while implementing Agile data masking to each and every development and test environment to protect all PII and PCI data from production in non-production environments.
Along with the deep dive I've been able to do into data virtualization, I've learned a great deal about Test Data Management. Since then, I've started doing informal surveys of the DBAs I run into, asking them, "How do you get data to your testers so that they can perform tests?" "How do you deliver code to different environments?"
Oracle Open World 2016 is almost here…where did the summer go??
With this upon us, there is something you attendees need, and that's to know about the awesome sessions the Delphix team has at Oracle Open World! I gave my speaking options up, as is the tragedy of switching companies from Oracle in late spring, but you can catch some great content on how to reach the new level in data management with Tim Gorman, (my phenomenal hubby, duh!) and Brian Bent from Delphix:
After absorbing all this great content on Sunday, you can come over to Oak Table World at the Children’s Creativity Museum on Monday and Tuesday to see the Oak Table members present their latest technical findings to the world. The schedule and directions to the event are all available in the link above.
If you’re looking for where the cool kids will be on Thursday, check out the Delphix Sync event! There’s still time to register if you want to join us and talk about how cool data virtualization is.
If you’re a social butterfly and want to get more involved with the community, check out some of the great activities that, and I do mean THAT Jeff Smith and the SQL Developer team have been planning for Oracle Open World, like the Open World Bridge Run.
I've been busy reading and testing everything I can with Delphix whenever I get a chance. I'm incredibly fascinated by copy data management, and the idea of doing this with Exadata is nothing new, as Oracle has its own version with sparse copy. The main challenge is that Exadata's version is kind of clunky and really doesn't have the management user interface that Delphix offers.
There is a lot of disk that comes with an Exadata, not just CPU, network bandwidth and memory. Now, you can't utilize offloading with a virtualized database, but you may not be interested in doing so. The goal is to create a private cloud in which you can use small storage silos for virtualized environments. We all know that copy data management is a huge issue for IT these days, so why not make the most of your Exadata, too?
With Delphix, you can even take an external source and provision a copy to an Exadata in just a matter of minutes, utilizing very little storage. You can even refresh, roll back, version and branch through the user interface provided.
I simulated two different architecture designs for how Delphix would work with Exadata: the first with standard hardware and the virtual databases (VDBs) on the Exadata, and the second with both the dSource and the VDBs on an Exadata.
Now we need to capture our gold copy to use for the dSource, which will require space, but Delphix uses compression, so it will be considerably smaller than the original database it's using as the data source.
If we then add the storage utilized by ALL the VDBs to that used by the dSource, you'd see that together they only use about the same amount of space as the original database! Each of these VDBs interacts with the user independently, just as a standard database copy would. They can be at different points in time, track different snapshots, have different hooks (pre or post scripts to be run for that copy) and contain different data (which is just different blocks, so that would be the only additional space outside of other changes). Pretty cool if you ask me!
While chatting on Slack the other day, one of my peers asked if I'd seen that ESG Global had done a write-up on Veritas Velocity. Velocity is a product that won't be available until the end of 2016 and is considered "Alpha". I was surprised that anyone allowed to participate in an Alpha was able to publish a description of the product, but that's not my call to make.
What I found interesting about the article written by ESG is its claim that Veritas Velocity "… is combining its sophisticated approaches to data management with its broader ability to deliver superior data protection and information availability in order to offer something revolutionary."
I found this statement quite odd, as what they're doing is simply using the same technology that Delphix has utilized for years, performing what Delphix has implemented at our customer sites since 2008. They are simply hopping on the bandwagon (along with a number of other companies) in an attempt to take advantage of the newest buzzword, "Copy Data Management".
There’s nothing revolutionary about what we do. It was revolutionary back in 2008 and may be seen as revolutionary to the customers who haven’t embraced the power of virtualized environments yet, but to say what they’ve created is revolutionary isn’t true.
If we inspect (at a high level) what Veritas Velocity does:
I can replace the lead-in to the above list with Delphix and it describes the Delphix Engine, as well. We also offer a mature user interface, advanced scripting capabilities and heterogeneous support.
There are a lot of companies out there claiming that they have revolutionized new capabilities like "data virtualization", "copy data management" and "test data management". Delphix has been in this space since the beginning and, as the Gartner reports prove, will continue to be the driving force behind what other companies are striving to achieve in their products.
Want to learn how many solutions Delphix virtualization can provide for your company's data? Try out Delphix Express, a simple VirtualBox or VMware open source version for your workstation, to check out who's been doing it right all along, and before it was cool!
I’ve been involved in two data masking projects in my time as a database administrator. One was to mask and secure credit card numbers and the other was to protect personally identifiable information, (PII) for a demographics company. I remember the pain, but it was better than what could have happened if we hadn’t protected customer data….
Times have changed and now, as part of a company that has a serious market focus on data masking, my role has time allocated to research on data protection, data masking and understanding the technical requirements.
The percentage of companies that contain data that SHOULD be masked is much higher than most would think.
The amount of data that should be masked vs. what is masked can be quite different. There was a great study done by the Ponemon Institute, (that says Ponemon, you Pokemon Go freaks…:)) that showed 23% of data was masked to some level and 45% of data was significantly masked by 2014. That still left over 30% of data at risk.
We also don't think very clearly about how and what to protect. We often silo our security: the network administrators secure the network; the server administrators secure the host but don't concern themselves with the application or the database; and the DBA may be securing the database, but the application accessing it may be open to exposing data that shouldn't be available to those involved. We won't even start on what George in accounting is doing.
We need to change from thinking just of disk encryption and start thinking about data encryption and application encryption, with key data stores that protect all of the data, which is the goal of the entire project. It's not like we're going to see people running out of a building with a server, but seriously, it doesn't just happen in the movies; people have stolen drives, jump drives and even printouts of spreadsheets with incredibly important data residing on them.
As I've been learning what's essential to masking data properly, I've found that what makes our product superior is that it identifies potential data that should be masked, along with running ongoing audits to ensure that data doesn't become vulnerable over time.
This can be the largest consumption of resources in any data masking project, so I was really impressed with this area of Delphix data masking. It's really easy to use, so if you don't understand the ins and outs of DBMS_CRYPTO or are unfamiliar with the java.util.Random syntax, no worries: the Delphix product makes it really easy to mask data and has a centralized key store to manage everything.
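To see why rolling your own masking is painful, here's a deliberately crude sketch of deterministic masking in shell. This is NOT the Delphix method, and the function name and salt are my own illustrative assumptions; it just shows the flavor of what you'd otherwise hand-build:

```shell
# mask_col2: replace the 2nd CSV column (say, a credit card number)
# with a short salted-hash token. The same input and salt always
# produce the same token, so joins across masked files still line up.
mask_col2() {
  awk -F, -v OFS=, -v salt="$1" '{
    cmd = "printf %s \"" salt $2 "\" | sha256sum"
    cmd | getline line            # read the hash of salt+value
    close(cmd)
    split(line, h, " ")
    $2 = substr(h[1], 1, 12)      # keep a short, consistent token
    print
  }'
}
# printf "alice,4111111111111111\n" | mask_col2 s3cr3t
```

Even this toy version exposes the pain points a tool handles for you: salt/key management, quoting hazards, and no automated discovery of which columns needed masking in the first place.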
It doesn't matter if the environment is on-premises or in the cloud. Delphix, like a number of companies these days, understands that hybrid management is a requirement, so masking efficiently and ensuring that at no point is sensitive data at risk are both essential.
How many data breaches do we need to hear about before we all pay more attention to this? Security topics at conferences have diminished compared to when I started attending less than a decade ago, so it wasn't that long ago that security appeared to matter more to us, and yet it only seems to be a more important issue now.
Research was also performed that found only 7-19% of companies actually knew where all their sensitive data was located. That leaves over 80% of sensitive data vulnerable to a breach. I don't know about the rest of you, but upon finishing that little bit of research, I understood why many feel better not knowing, and why it's better just to accept this and address masking needs to ensure we're not one of the vulnerable ones.
Automated solutions to discover vulnerable data can significantly reduce risks and reduce the demands on those who often manage the data but don't know what the data is for. I've always said that the best DBAs know the data, but how much can we really understand it and still do our jobs? It's often the users who understand it, but they may not comprehend the technical requirements to safeguard it. Automated solutions remove that skill requirement from having to exist in human form, allowing us all to do our jobs better. I thought it was really cool that our data masking tool considers this and takes the pressure off of us, letting the tool do the heavy lifting.
Along with a myriad of database platforms, we also know that people are bound and determined to export data to Excel, MS Access and other flat file formats, resulting in more vulnerabilities that seem out of our control. The Delphix data masking tool considers this and supports many of these applications, as well. George, the new smarty-pants in accounting, wrote his own XML pull of customers and credit card numbers? No problem, we've got you covered… 🙂
So now, along with telling you how to automate a script to email George to change his password from "1234" in production, I can make recommendations on how to keep him from printing out a spreadsheet with all the customers' credit card numbers on it and leaving it on the printer…:)
Happy Monday, everyone!
A number of the emails I received about trying out Delphix Express were regarding VMware. Many of my followers had used VirtualBox for a long time, and we all know no one likes change, (OK, maybe me, but we know how abnormal I am anyway… :))
Importing an OVA is pretty simple in VMware. In the VMware Fusion application, click on File, Import and accept the defaults. Depending on the size of the VM, the import will take the time it needs, and if anything happens to the VM you imported, the great thing about a VM is that you just DELETE AND IMPORT AGAIN to erase that which you have destroyed. 🙂
To delete a VM, open the VM Control Console, click on the VM you want to delete, then click on Remove. Remember to click on delete files if you'd like that space back on your hard drive, too! The utility will take just a moment to clean up the VM and you can then proceed with work or re-import the OVA file.
I know that if you're using Delphix Express, the IP address for the machine is displayed when it's first started, but I also know that we DBAs are a curious lot, known for snooping around every chance we get. Because of this, you may not have noted the IP address and now need it to log into a terminal window, or you want a second terminal to run or check items.
Knowing how to retrieve the IP address is handy, so here are the ways, depending on which OS you're on:
Linux- type ifconfig in the terminal and you'll see the IP address listed as the inet address for the primary network interface (commonly set up as eth0).
Windows- type ipconfig at the Command Prompt (or ipconfig /all for full details), or click on the Windows icon, type in "Network" to go to Network and Ethernet settings and then click on Ethernet. Your IP address is listed in the IPv4 Address setting.
Mac- type ifconfig from the terminal, or click on the Apple at the top left corner of the screen, click on System Preferences, click on Network, then, if you're using WiFi, click on it and then TCP/IP to view your actual IP address listed for the IPv4 address.
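If you'd rather not eyeball the ifconfig output on Linux or Mac, a tiny filter can pull the address out for you. This is just a convenience sketch (the function name is mine), and it assumes the usual inet line formats that ifconfig prints:

```shell
# first_ipv4: read `ifconfig` output on stdin and print the first
# non-loopback IPv4 address. Handles both the newer "inet 10.0.0.5"
# and the older Linux "inet addr:10.0.0.5" formats.
first_ipv4() {
  awk '{
    for (i = 1; i <= NF; i++)
      if ($i == "inet") {
        a = $(i + 1)
        sub(/^addr:/, "", a)               # strip the old-style prefix
        if (a !~ /^127\./) { print a; exit }
      }
  }'
}
# usage: ifconfig | first_ipv4
```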
If the VMware screen is blank (no text or image on the screen) or you've lost your cursor, the best way to get control back is to press Ctrl/Command on a Mac to retrieve cursor control or make your screen active again.
Every piece of software has updates and, just like the other software we support, updates to VMware are important. We may not utilize our VMs as often as we think, so it's good to get into the practice of checking for updates when you first log into VMware by clicking on VMware Fusion and Check for Updates. It only takes a moment and hopefully you'll see the following after it's gone out to check:
By that, I mean remember that you're on one PC and you're running virtual PCs on it. Don't allocate so many resources to your VMs that your PC doesn't have enough to do its own job. A VM should be pretty conservative with its resources, and it's important to look at the configuration and see if you can dial down any usage that isn't necessary.
To do this, the VM must be shut down (not just suspended); then click on Virtual Machine, Settings. In your settings, there are a couple of areas that need to be considered for resource usage:
The first, obviously, is to look at Processors & Memory. Ensure you’ve left enough memory for your PC and as PCs come with quad-core or higher processors these days, a single core is often sufficient for the processing on a VM.
The amount of space that is being used by a VM is a consideration, too. If a VM is so large that you need to purchase an external drive to run it on, then that’s a better choice vs. using up all your local disk or it may be time to build out Delphix just to virtualize the environment to start! 🙂
Verify that all disks for the VM are actually in use. I've seen cases where there are drives created for future growth but never used, or extra space that was allocated that just needs to be shrunk down. This can be accomplished by clicking on Virtual Machine, Settings and, from there, clicking on each of the disks in use by the VM, shrinking any that may have been over-allocated. This is another task that can only be performed when the VM is shut down.
Well, there's a start to getting comfortable with VMware Fusion. Do you have any tips or tricks that you can add to this? Comment and let us know, and have a great Monday!
Delphix Express offers a virtual environment to work with all the cool features like data virtualization and data masking on just a workstation or even a laptop. The product has an immense offering, so no matter how hard Kyle, Adam and the other folks worked on this labor of love, there’s bound to be some manual configurations that are required to ensure you get the most from the product. This is where I thought I’d help and offer a virtual hug to go along with the virtual images…:)
If you're already set on installing and working with Delphix Express, you will find the following Vimeo videos, importing the VMs and configuring Delphix Express, quite helpful. Adam Bowen did a great job with these videos to get you started, but below I'll go through some technical details a bit deeper to give folks added arsenal in case they've missed a step or are challenged just starting out with VMware.
Note- Delphix Express requires VMware Fusion, which you can download after purchasing a license ($79.99), but it's well worth the investment.
Issue- Not enough memory to run all three VMs required as part of Delphix Express, or, after an upgrade, Delphix Express uses over 6Gb.
Different laptops/workstations have different amounts of memory, CPU and space available. Memory is the most common constraint with today's PCs. Although the VMs are configured for optimal performance, the target and source environments can have their memory trimmed to 2Gb each and still perform when resources are constrained.
The VM must be shut down for this configuration change to be implemented. After stopping or before starting the VM, click on Virtual Machine, Settings. Click on Processors and Memory and then you can configure the memory usage via a slider option as seen below:
Move the slider to 2Gb or under for the VM in question, then close the configuration window and start the VM. Perform this for each VM (the Delphix Engine VM should already be at 2Gb).
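For the curious, the slider just rewrites the memsize entry (in MB) inside the VM's .vmx file, so you can also make the change from a terminal. The function and example path below are my own illustration, and again, the VM must be fully shut down first:

```shell
# set_vmx_memory: rewrite the memsize line (value in MB) in a .vmx file.
# Only edit a .vmx while the VM is shut down, not merely suspended.
set_vmx_memory() {
  vmx_file=$1
  mb=$2
  sed -i.bak "s/^memsize = .*/memsize = \"$mb\"/" "$vmx_file"
  grep '^memsize' "$vmx_file"   # echo back the new value to confirm
}
# Example (hypothetical path -- adjust to where your VMs live):
# set_vmx_memory "$HOME/VMs/LinuxTarget.vmwarevm/LinuxTarget.vmx" 2048
```

The sed call keeps a .bak copy of the original file, so a bad edit is easy to roll back.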
Issue- Population of sources and targets is empty after successful configuration.
After starting the target and source VMs, a console with a command line opens and you can log in right from VMware. VirtualBox would require a terminal opened to the desktop, but either way, you can get to the command line without using PuTTY or another desktop terminal from your workstation.
On the target VM command line, log in as the delphix user. The target VM has a python script that runs in the background upon startup, checking for a Delphix engine once every minute; if it locates one, it will run the configuration. You can view this in the crontab:
crontab -l
@reboot ..... /home/delphix/landshark_setup.py
It writes to the log file landshark_setup.log in the delphix user's home directory.
You can view this file however you're comfortable (tail it, cat it, whatever works to see the end of the file…). I prefer just to view the last ten lines, so I'll run a command to look at JUST the last ten lines:
tail -10 landshark_setup.log
If the configuration is having issues locating the Delphix engine, it will show in this log file. Once confirmed, then we have a couple steps to check:
A VMware issue where one of the virtual machines isn't visible to another. Each VM needs to be able to communicate and interact with the others. When importing each VM, the step that makes the VM "host aware" with the Mac may not have occurred. If the Delphix engine VM isn't viewable to the target or the source, you can check the log and then verify it in the following way.
Click on Virtual Machine, Settings and then click on Network Adapter. Verify that the top radio option is selected for “Share with my Mac”:
Verify that this is configured for EACH of the three virtual machines involved. If this hasn't corrected the problem and the configuration still doesn't populate the virtual environments in the Delphix interface, then it's time to look at the configuration for the target machine.
Get IP Address
While SSH connected to the target machine, type in the following:
Use the IP address shown (the inet address) and open a browser on your PC, adding the port used for the target configuration file (port 8000 by default):
You should be shown the configuration file for your target server that is used to run the Delphix engine configuration. There are options to update the values for different parameters. The ones you should focus on are:
linux_source_ip= make sure this matches the source VM's IP address when you type in "ifconfig".
engine_address= the IP address shown for the Delphix engine VM when you type in ifconfig on that host.
engine_password= should match the password you set for delphix_admin when you went through the configuration. Update it to match if it doesn't; I've seen some folks not set it to "landshark" as demonstrated in the videos, so of course the setup fails when the file doesn't match the password set by the user.
oracle_xe= if you set oracle_xe to true, then don't set the 11g or 12c options to true. To conserve workstation resources, choose only one database type.
Once you've made all the changes you want to the page, click on Submit Changes.
You need to run the reconfiguration manually now. Remember, this runs in the background each minute, but when it does, you can't see what's going on, so I recommend killing the running process and running it manually.
From the target host, type in the following:
ps -ef | grep landshark_setup
Kill the running processes:
Check for any running processes, just to be safe:
ps -ef | grep landshark_setup
Once you’ve confirmed that none are running, let’s run the script manually from the delphix user home:
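Putting those steps together, the sequence looks something like this. The helper name is my own, and the script path is the Delphix Express default mentioned above:

```shell
# kill_by_name: terminate any process whose command line matches a pattern.
# pkill exits 1 when nothing matched, so swallow that with || true.
kill_by_name() {
  pkill -f "$1" 2>/dev/null || true
}

# On the target VM, as the delphix user:
# kill_by_name landshark_setup.py
# ps -ef | grep '[l]andshark_setup'         # bracket trick: grep won't match itself
# python /home/delphix/landshark_setup.py   # re-run in the foreground to watch each step
```

The bracket in the grep pattern keeps the grep process itself out of its own results, so an empty result really does mean nothing is running.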
Verify that the configuration runs, monitoring as it steps through each step:
Since this is the first time you've performed these steps, expect that a refresh won't be performed, but a creation will. You should now see the left panel of your Delphix Engine UI populated:
Now we've come to the completion of the initial configuration. In my next post on Delphix Express, I'll discuss the dSource and target database configurations for different target types. Working with these files and configurations is great practice for learning about Delphix, even if you're amazed at how easy this all was.
I've been going through some SERIOUS training in just over a week. This training has successfully navigated the "Three I's", as in it's been Interesting, Interactive and Informative. The offerings are very complete and the knowledge gained is limitless.
I'd also like to send a shout out to Steve Karam, Leighton Nelson and everyone else at Delphix who's had a hand in designing the training, both for new employees and for those working with our hands-on labs. I've had a chance to work with both and they're far above anything I've seen anywhere else.
Most DBAs know- If you attempt to take a shortcut in patching or upgrading, either by not testing or hoping that your environments are the same without verifying, shortcuts can go very wrong, very quickly.
Patching is also one of the most TEDIOUS tasks required of DBAs. The demands on the IT infrastructure for downtime to apply quarterly PSU patches, (not including emergency security patches) to all the development, test, QA and production databases is a task I’ve never looked forward to. Even when utilizing Enterprise Manager 12c Patch Plans with the DBLM management pack, you still had to hope that you checked compliance for all environments and prayed that your failure threshold wasn’t tripped, which means a large amount of your time would have to be allocated to address patching outside of just testing and building out patch plans.
I bet most of you already knew you could virtualize your development and test environments from a single Delphix compressed copy (referred to as a dSource) and create as many virtual copies (referred to as VDBs) as your organization needs for development, testing, QA, backup and disaster recovery. (If you weren't aware of this, you can thank me later… :))
What you may not know, (and what I learned this week) is that you can also do the following:
Considering how much time and resources are saved by just eliminating such a large portion of time required for patching and upgrading, this is worth investing in Delphix just for this alone!
Want to learn more? Check out the following links:
Want to Demo Delphix? <– Click here!
Back in 2012, when I started to build a reputation as a mentor, the goal was not just to create my own path and set it afire, but for others to desire to make their own path before my footsteps cooled.
This week I joined Delphix. Many acted as if this was pre-ordained and simply part of my destiny. Due to my technical knowledge, they assumed I would be working in the same group as my husband and virtualization was just a natural fit for my skills. Although I love working just a few feet from my husband, (which we’ve done for years now) I actually joined the Product Management and Marketing team, which is a surprise to many.
All new jobs come with new challenges, but when you also change paths, it can be like walking on hot coals. You can find yourself anxious, having doubts about your skills and whether you're up to the challenge. I accepted this challenge because I wanted to have more impact on the direction of the technology I was working with. I wanted to help the business make intelligent decisions with the powerful knowledge I have about technology, customers and product, and this is something that Delphix is keen on letting me be a part of. I'm pretty fearless, but even I have to remember not to let my fears or frustrations get the best of me. As I always say, there is incredible power in the simple act of doing, so just do, and it's surprising how quickly you'll be successful.
In just the few days I've been here, I've already begun to build the documentation that will help determine what content is directed to which audiences, chosen a few members of the community to do guest blogs, done some really great training, and been introduced to some incredible people.
I’m learning how to manage my time a little differently than I did before, as things moved a lot slower at Oracle, but I love how my skills are more in line with what the company needs from me. Even though I’ve been here less than a week and am in a position that I’ve never held before, I know exactly what my purpose is.
I want to thank my new peers and managers for helping me quickly get up to speed. My beloved Microsoft Surface has been migrated to my secondary desk and my new work computer is set up (I'm on a MacBook Air, don't everyone gasp at once and start taking bets on how long the keyboard will last… :)) As I stated earlier, the work is new and interesting, which is why Delphix was at the top of my list for companies to join. Like my new position, Delphix technology removes many of the tedious tasks and automates much of what once was a manual process, so as to get onto more interesting and rewarding adventures.
I’ll continue to update everyone on how my new world is shaking out at Delphix and hopefully will convince a couple of you to join me. It wouldn’t be the first time that’s happened. The coals are warm, but will you follow in my footsteps before they cool?
So after over two years at Oracle, I’m moving on. Yes, for those of you who haven’t seen the tweets and the posts, you heard right.
OK, everyone- cleansing breath.
I worked with great people and did some awesome things in the community, blogged everything Enterprise Manager and talked over 1/2 the Oracle community into buying and doing projects with Raspberry Pi while I was at it!
Many folks thought I was a product manager or a technical consultant, but my title was Consulting Member of Technical Staff with the Strategic Customer Program in the Enterprise Manager and Oracle Management Cloud group. I know I was part of a select group at Oracle, but I believe the opportunity to work at Oracle was an important step in my career and I'd recommend it to anyone for the experience it provides.
There is a huge difference between working for Oracle and being in the Oracle community, even as an Oracle ACE Director. I was utterly amazed being part of the Oracle machine. One of the most amazing experiences was observing how releases came together; it was a completely different experience as an employee vs. a customer. Being part of a massive undertaking such as a product release, impressively building out software to be released to its customer base, is pretty astounding. Understanding what it takes to move the machine, and once it gets moving, how important it is for anyone in its way to get out of the way, is key to understanding how a successful product is created.
I learned a lot in just over two years and I have to admit, many of the negatives that people said would be present at Oracle, I just didn't experience. I had great mentors and contacts inside of Oracle. It's easy to assimilate into a big company environment when you have people like Pete Sharman, Tyler Muth, Mary Melgaard and others looking out for you. I'll be sad to leave all the great people that I worked with at Oracle, too: Steve, Courtney, Scott, Werner, Andrew, Joe, Pramod and Will. At the same time, I look forward to opportunities to learn new skills with the awesome folks who have so readily embraced my quirky self at my new company, and the knowledge from my two years at Oracle is something I'm able to take with me as I move forward to my new adventure.
With that said, I've been offered an incredible opportunity to stretch my legs a bit and try something new, and I am excited to move on to this new challenge. I'll still be speaking at conferences, but I'll also direct technology in a way that should be very constructive to my technical style.
There have been a lot of rumors about where I'm off to. Some of you have guessed correctly, but I know none of you guessed what I'll be doing. I will be focusing more on my multi-platform skills, so for those of you who thought I would be leaving all those years of experience in database and OS platforms behind, it's going to be just the opposite.
I’m very excited to announce that, as of Monday, June 13th, I’m the new Technical Intelligence Manager at Delphix.
Buckle up, Baby! This is going to be good.