December 16th, 2016 by dbakevlar


On the first day with Delphix, I provisioned with glee, an IT Manager Happy.

On the second day with Delphix, I provisioned with glee, two SAP ASE and an IT Manager Happy.

On the third day with Delphix, I provisioned with glee, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the fourth day with Delphix, I provisioned with glee, four EBS Clones, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the fifth day with Delphix, I provisioned with glee…

Five Cloud Migrations!

Four EBS Clones, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the sixth day with Delphix, I provisioned with glee,

Six SQL Servers running, Five Cloud Migrations, four EBS Clones, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the seventh day with Delphix, I provisioned with glee,

Seven developers coding, Six SQL Servers running, Five Cloud Migrations, four EBS Clones, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the eighth day with Delphix, I provisioned with glee,

Eight testers testing, Seven developers coding, Six SQL Servers running, Five Cloud Migrations, four EBS Clones, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the ninth day with Delphix, I provisioned with glee,

Nine applications applying, Eight testers testing, Seven developers coding, Six SQL Servers running, Five Cloud Migrations, four EBS Clones, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the tenth day with Delphix, I provisioned with glee,

Ten DevOps leading, Nine applications applying, Eight testers testing, Seven developers coding, Six SQL Servers running, Five Cloud Migrations, four EBS Clones, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the eleventh day with Delphix, I provisioned with glee,

Eleven DB2 humming, Ten DevOps leading, Nine applications applying, Eight testers testing, Seven developers coding, Six SQL Servers running, Five Cloud Migrations, four EBS Clones, three Oracle Databases, two SAP ASE and an IT Manager Happy.

On the twelfth day with Delphix, I provisioned with glee,

Twelve databases masking, Eleven DB2 humming, Ten DevOps leading, Nine applications applying, Eight testers testing, Seven developers coding, Six SQL Servers running, Five Cloud Migrations, four EBS Clones, three Oracle Databases and an IT Manager Happy.


Posted in DBA Life, Delphix

November 17th, 2016 by dbakevlar

I thought I’d do something on Oracle this week, but then Microsoft made an announcement that was like an early Christmas present-  a SQL Server release for Linux.


I work for a company that supports both Oracle and SQL Server, so I wanted to know how *real* this release was.  I first wanted to test it out on a new build and, as they recommend, (with a link to an Ubuntu install) I created a new VM and started from there-


Ubuntu Repository Challenge

There were a couple of packages missing until the repository configuration was updated to pull from universe, by adding repository locations into the sources.list file.
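
Here’s a minimal sketch of what those sources.list entries look like, assuming Ubuntu 16.04, (“xenial”)-  adjust the release name to match your install:

deb http://archive.ubuntu.com/ubuntu/ xenial universe
deb http://archive.ubuntu.com/ubuntu/ xenial-updates universe

Run apt-get update after saving so the new repository locations get picked up.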


There is also a carriage return at the end of the MSSQL repository entry when it’s added to the sources.list file.  Remove this before you save.

Once you do this, if you’ve chosen to share your network connection with your Mac, you should be able to install successfully when running the commands found on the install page from Microsoft.
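
For reference, those commands looked roughly like this at the time-  verify the repository URL and version path against the current install page rather than trusting my memory:

# add Microsoft's signing key and repository entry
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-server.list | sudo tee /etc/apt/sources.list.d/mssql-server.list
# install the package and run initial setup
sudo apt-get update
sudo apt-get install -y mssql-server
sudo /opt/mssql/bin/sqlservr-setup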

CentOS And MSSQL

The second install I did was on a VM using CentOS 6.7 that was pre-discovered as a source for one of my Delphix engines.  The installation failed as soon as I ran it.


Even attempting to work around this wasn’t successful-  the challenge was that the older openssl wasn’t going to work with the new SQL Server installation.  I decided to simply upgrade to CentOS 7.

CentOS 7

The actual process of upgrading is pretty easy, but there are some instructions out there that are incorrect, so here are the proper steps:

  1. First, take a backup (snapshot) of your image before you begin.
  2. Prep yum for the upgrade by creating the following file: /etc/yum.repos.d/upgrade.repo
  3. Add the following information to the file:
[upgrade]
name=upgrade
baseurl=http://dev.centos.org/centos/6/upg/x86_64/
enabled=1
gpgcheck=0

Save this file and then run the following:

yum install preupgrade-assistant-contents redhat-upgrade-tool preupgrade-assistant

You may see that one of them states it won’t install because newer packages are already available-  that’s fine.  As long as you have the packages at that version or newer, you’re ready.  Now run the pre-upgrade check:

preupg

The final log output may not get written, either.  If you’re able to verify the checks outside of the log and the tool says it completed successfully, know that the pre-upgrade was successful as a whole.

Once this is done, import the GPG Key:

rpm --import http://mirror.centos.org/centos/RPM-GPG-KEY-CentOS-7

After the key is imported, then you can start the upgrade:

/usr/bin/redhat-upgrade-tool-cli --force --network 7 --instrepo=http://mirror.centos.org/centos/7/os/x86_64

Once done, you’ll need to reboot before you run your installation of SQL Server:

reboot

MSSQL Install

Once the VM has cycled, you can run the installation using the Red Hat instructions as root, (my delphix user doesn’t have the rights and I decided to have MSSQL installed under root for this first test run):

su
curl https://packages.microsoft.com/config/rhel/7/mssql-server.repo > /etc/yum.repos.d/mssql-server.repo

Now run the install:

sudo yum install -y mssql-server

Once it’s completed, it’s time to set up your MSSQL admin and password:

sudo /opt/mssql/bin/sqlservr-setup

One more reboot and you’re done!

reboot

You should then see your SQL Server service running with the following command:

systemctl status mssql-server

You’re ready to log in and create your database, which I’ll do in a second post on this fun topic.
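
If you can’t wait for that post, a quick smoke test looks roughly like this-  it assumes you also install the mssql-tools package for the sqlcmd client, (the repo path below is Microsoft’s prod repo for RHEL 7-  verify it, as these things move) and the database name is just an example:

# install the command line tools from Microsoft's prod repo
sudo curl -o /etc/yum.repos.d/msprod.repo https://packages.microsoft.com/config/rhel/7/prod.repo
sudo yum install -y mssql-tools unixODBC-devel
# connect as the SA account you configured during setup and create a test database
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -Q "CREATE DATABASE delphix_test"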

OK, you Linux fans, go MSSQL! 🙂


Posted in Delphix, SQLServer

October 20th, 2016 by dbakevlar

I’ll be attending my very first PASS Summit next week and I’m really psyched!  Delphix is a major sponsor at the event, so I’ll get to be at the booth and will be rocking some amazing new Delphix attire, (thank you to my boss for understanding that a goth girl has to keep up appearances and for letting me order my own Delphix wear.)

It’s an amazing event, and for those of you who are my Oracle peeps wondering what Summit is, think Oracle Open World for the Microsoft SQL Server expert folks.


I was a strong proponent of immersing myself in different database and technology platforms early on.  You never know when knowledge gained in an area you never thought would be useful ends up saving the day.

Just Goin to Take a Look

Yesterday this philosophy came into play again.  A couple of folks were having some challenges with a testing scenario in a new MSSQL environment and asked other Delphix experts for assistance via Slack.  I’m known for multi-tasking, so I thought that while I was doing some research and building out content, I would just have the shared session going in the background while I continued to work.  As soon as I logged into the web session, the guys welcomed me and said, “Maybe Kellyn knows what’s causing this error…”

Me- “Whoops, guess I gotta pay attention…”

SQL Server, for those in the broader database world, has always been multi-tenant, unlike Oracle.  This translates to a historical architecture with both a server-level login AND a user-database-level username.  The login ID, (login name) at the server level is linked to a user ID, (and its user name) inside each user database, which acts much like a schema.  Oracle is starting to migrate to a similar architecture with Database version 12c, moving away from schemas within a database and toward multi-tenant, where the pluggable database, (PDB) serves that role.
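
To make the two layers concrete, here’s a minimal T-SQL sketch, (all the names are hypothetical)-  the login lives at the server level, and the user in each database maps back to it:

-- server-level principal, (the login)
CREATE LOGIN delphix_login WITH PASSWORD = 'Str0ngP@ssword!'
go
-- database-level principal, (the user) mapped to that login
Use my_user_db
go
CREATE USER delphix_user FOR LOGIN delphix_login
go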

I didn’t recognize the initial error that arose from the clone process, but that’s not uncommon, as error messages can change with versions and with proprietary code.  I’ve also worked very little, if at all, on MSSQL 2014.  When the guys clicked on the target user database in Management Studio and were told they didn’t have access, it wasn’t lost on anyone to look at the login and user mapping, which showed the login didn’t have a mapping to a username for this particular user database.  What was challenging them was that when they tried to add the mapping, (username) for the login to the database, it stated the username already existed and failed.

Old School, New Fixes

This is where “old school” MSSQL knowledge came into play.  Most of my database knowledge for SQL Server is from versions 6.5 through 2008.  Along with a lot of recovery and migrations, I also performed a process very similar to the Oracle option to plug or unplug a PDB, referred to in MSSQL terminology as an “attach and detach” of a database.  You could then easily move the database to another SQL Server, but you would very often end up with what are called “orphaned users”-  login IDs that weren’t connected to the user names in the database and needed to be resynchronized correctly.  To perform this task, you could dynamically create a script to pull the logins if they didn’t already exist, run it against the “target” SQL Server and then run a procedure to synchronize the logins and user names:

Use  <user_dbname>
go
exec sp_change_users_login 'Update_One','<loginname>','<username>'
go
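
If you’re not sure which users are orphaned in the first place, the same procedure will report them, (on newer versions, ALTER USER … WITH LOGIN is the supported replacement for sp_change_users_login):

Use  <user_dbname>
go
exec sp_change_users_login 'Report'
go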

For the problem experienced above, it was simply the delphix user that wasn’t linked post-restoration due to some privileges, and once we ran this command against the target database, all was good again.

This wasn’t the long-term solution, but it pointed to where the break was in the clone design, which can now be addressed.  It also shows that experience, no matter how benign it may seem, can come in handy later in our careers.


I am looking forward to learning a bunch of NEW and AWESOME MSSQL knowledge to take back to Delphix from PASS Summit next week, as well as meeting up with some great folks from the SQL Family.

See you next week in Seattle!


Posted in Delphix, SQLServer

September 30th, 2016 by dbakevlar

The topics of DevOps and Agile are everywhere, but how often do you hear source control thrown into the mix?  Not so much in public, but behind the closed doors of technical and development meetings where Agile is in place, it’s a common theme.  When source control isn’t part of the combination, havoc ensues, leaving a lot of DBAs working nights on production with broken hearts.


Control Freaks

So what is source control and why is it such an important part of DevOps?  The official definition of source control is:

A component of software configuration management, version control, also known as revision control or source control, is the management of changes to documents, computer programs, large web sites, and other collections of information.

Delphix, with its ability to provide developers with as many virtual copies of databases as they need, including masked sensitive data, is a no-brainer for ensuring development and then test have the environments to do their jobs properly.  The added features of bookmarking and branching are the impressive part that creates full source control.

Branching and Bookmarks

Note how easy it is to mark each iteration of development with a bookmark, making it simple to lock and deliver a consistent image to test via a virtual database, (VDB):

  • Note the feature branches-  every pull and checkout should be a test of the build, including the data.
  • How do we include the data?  We connect the source databases, (even when the source was multi-terabyte originally) to Delphix, and now we have production data in version control, synchronized from all sources.
  • This is then a single timeline representing all sources from which to develop, branch and test.
  • After each subsequent development deployment, a branch is created for test in the form of a VDB.  The VDB’s are all read/write copies, so full testing can be performed, even destructive testing.  It’s simple to reverse a destructive test with Delphix Timeflow.
  • After each test succeeds, a merge can be performed or if a problem occurs in the testing, a bookmark can be performed to preserve the use case for closer examination upon delivery of the VDB image to development.
  • The Delphix engine can keep the environment sync’d near real-time with production to deter any surprises that a static physical refresh might create.
  • Each refresh only takes a matter of minutes vs. days or weeks with a physical duplicate or refresh process.  VDBs save over 70% on storage space allocation, too.

Delphix is capable of all of this, while implementing Agile data masking to each and every development and test environment to protect all PII and PCI data from production in non-production environments.

Delphix, DevOps and Source Control-  a match made in heaven.

Posted in Delphix, devops

September 15th, 2016 by dbakevlar

Along with the deep learning I’ve been allowed to do about data virtualization, I’ve learned a great deal about Test Data Management.  Since doing so, I’ve started to do some informal surveys of the DBAs I run into and ask them, “How do you get data to your testers so that they can perform tests?”  “How do you deliver code to different environments?”


As a basic skill for a database administrator, we’re taught how to use export and import tools, along with cloning options, to deliver data where it’s needed for various development and, in succession, testing activities.  If DBAs didn’t deliver on time, due to resource constraints or inability, then developers would often find their own ways to manually create the data they would need to test new code.  Integration testing teams would need to manufacture data to validate complicated end-to-end functional testing scenarios, and performance and scalability testing teams would need to manufacture data that could stress their solutions at scale.  Rarely were these means successful, and the deployment, along with the company, often felt the pain.

Failure and Data

As the volume of application projects increased, larger IT organizations recognized the opportunity to gain efficiencies of scale and searched out ways to streamline processes and speed up data refreshes, even synthesizing data!  However, developers and testers still had little ability to self-service their needs and often found synthesized data incapable of meeting requirements, leading to floundering deployments once in production.  IT organizations were able to achieve some efficiencies and cost savings, but significant savings related to development and testing productivity and quality remained a mystery.

DevOps To the Rescue

With the adoption of DevOps, a heightened focus on automation and speed of delivery occurred across IT.  Advanced Test Data Management solutions are starting to become a reality.  Companies are starting to realize the importance of data distribution, self-service and data security when delivering to non-production environments.
I truly believe that no IT organization can accuse development or testing departments of failing to deliver if those groups aren’t offered the environments and data quality required to deliver a quality product.  One of the ways this can be accomplished is via virtualized database and application environments.  Simply virtualize test and development, eliminating up to 90% of the storage required for physical databases while still offering all the same data that is available in production.  If data security is a concern, this can all be done with data masking, built right into a proper Test Data Management product.

Test Drive TDM

If you’re interested in taking a proper Test Data Management product for a test drive, right on your laptop, try out Delphix Express.


Posted in Delphix, Delphix Express, devops, Test Data Management, Uncategorized

August 31st, 2016 by dbakevlar

Oracle Open World 2016 is almost here…where did the summer go??


With this upon us, there is something you attendees need, and that’s to know about the awesome sessions at Oracle Open World from the Delphix team!  I gave up my own slots, as is the tragedy of switching companies from Oracle in late spring, but you can catch some great content on how to reach the next level in data management with Tim Gorman, (my phenomenal hubby, duh!) and Brian Bent from Delphix:

After absorbing all this great content on Sunday, you can come over to Oak Table World at the Children’s Creativity Museum on Monday and Tuesday to see the Oak Table members present their latest technical findings to the world.  The schedule and directions to the event are all available in the link above.

If you’re looking for where the cool kids will be on Thursday, check out the Delphix Sync event!  There’s still time to register if you want to join us and talk about how cool data virtualization is.

If you’re a social butterfly and want to get more involved with the community, check out some of the great activities that Jeff Smith, (and I do mean THAT Jeff Smith) and the SQL Developer team have been planning for Oracle Open World, like the Open World Bridge Run.


Posted in DBA Life, Delphix

August 26th, 2016 by dbakevlar

I’ve been busy reading and testing everything I can with Delphix whenever I get a chance.  I’m incredibly fascinated by copy data management, and the idea of doing this with Exadata is nothing new, as Oracle has its own version with sparse copy.  The main challenge is that Exadata’s version of this is kind of clunky and really doesn’t have the management user interface that Delphix offers.


There is a lot of disk that comes with an Exadata, not just CPU, network bandwidth and memory.  Now, you can’t utilize offloading with a virtualized database, but you may not be interested in doing so.  The goal is to create a private cloud where you can use small storage silos for virtualized environments.  We all know that copy data management is a huge issue for IT these days, so why not make the most of your Exadata, too?

With Delphix, you can even take an external source and provision a copy to an Exadata in just a matter of minutes, utilizing very little storage.  You can even refresh, roll back, version and branch through the user interface provided.

I simulated two different architecture designs for how Delphix would work with Exadata.  The first had the virtual databases, (VDBs) hosted on Exadata alongside the source; the second kept the source on Exadata with the VDBs on standard hardware.

VDBs On A Second Exadata

  • Production on EXADATA,
  • Standard RMAN sync to Delphix
  • VDBs hosted on EXADATA DB compute nodes
  • 10Gb NFS is standard connectivity on EXADATA


VDBs on Standard Storage, Source on Exadata

  • Production on EXADATA, standard RMAN sync to Delphix
  • VDBs hosted on commodity x86 servers


How Does it All Work

Now we need to capture our gold copy to use for the DSource.  This will require space, but Delphix uses compression, so it will be considerably smaller than the original database it’s using for the data source.

If we then add the storage utilized by ALL the VDBs to that used by the DSource, you’d see that together they only use about the same amount of space as the original database!  Each of these VDBs interacts with its users independently, just as a standard database copy would.  They can be at different points in time, track different snapshots, have different hooks, (pre or post scripts to be run for that copy) and different data, (which is just different blocks, so those are the only additional space consumed.)  Pretty cool if you ask me!
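
To put purely illustrative numbers on that: say the source database is 1TB and compression brings the DSource down to roughly 350GB.  If ten VDBs each accumulate about 20GB of changed blocks, the whole estate is 350GB + (10 x 20GB) = 550GB-  ten full read/write copies in about half the space of a single physical clone, instead of the 10TB that ten physical copies would demand.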

Save a server, save space and your sanity, clone with Delphix.

Posted in Copy Data Management, Delphix

August 24th, 2016 by dbakevlar

While chatting on Slack the other day, one of my peers asked if I’d seen that ESG Global had done a write-up on Veritas Velocity.  Velocity is a product that won’t be available until the end of 2016 and is considered “Alpha”.  I was surprised that anyone allowed to participate in an Alpha was able to publish a description of the product, but that’s not my call to make.


What I found interesting about the article, written by ESG, is its claim that Veritas Velocity “… is combining its sophisticated approaches to data management with its broader ability to deliver superior data protection and information availability in order to offer something revolutionary.”

Revolutionary?

I found this statement to be quite odd, as what they’re doing simply uses the same technology that Delphix has implemented at our customer sites since 2008.  They are hopping on the bandwagon, (along with a number of other companies) in an attempt to take advantage of the newest buzzword, “Copy Data Management”.

There’s nothing revolutionary about what we do.  It was revolutionary back in 2008 and may be seen as revolutionary to the customers who haven’t embraced the power of virtualized environments yet, but to say what they’ve created is revolutionary isn’t true.

If we inspect (at a high level) what Veritas Velocity does:

  1. A self-contained VM appliance to manage storage and thin cloning.
  2. A configuration of storage with an NFS mount presented as VMs.
  3. Hybrid management to the cloud.
  4. The VMs are then presented as targets to be used for thin clones.
  5. Eliminating copies of data by virtualizing database environments, focused on the cloud.
  6. A User interface to manage it all.

I can replace the lead-in to the above list with Delphix and it describes the Delphix Engine, as well.  We also offer a mature user interface, advanced scripting capabilities and heterogeneous support.


There are a lot of companies out there claiming that they have revolutionized capabilities like “data virtualization”, “copy data management” and “test data management”.  Delphix has been in this space since the beginning and, as the Gartner reports show, will continue to be the driving force behind what other companies are striving to achieve in their products.


Want to learn how many solutions Delphix virtualization can provide for your company’s data?  Try out Delphix Express, a simple VirtualBox or VMware version for your workstation, and check out who’s been doing it right all along, before it was cool!

Posted in Delphix, Delphix Express

August 15th, 2016 by dbakevlar

I’ve been involved in two data masking projects in my time as a database administrator.  One was to mask and secure credit card numbers and the other was to protect personally identifiable information, (PII) for a demographics company.  I remember the pain, but it was better than what could have happened if we hadn’t protected customer data….


Times have changed and now, as part of a company that has a serious market focus on data masking, my role has time allocated to research on data protection, data masking and understanding the technical requirements.

Reasons to Mask

The percentage of companies that contain data that SHOULD be masked is much higher than most would think.


The amount of data that should be masked vs. what is masked can be quite different.  There was a great study done by the Ponemon Institute, (that says Ponemon, you Pokemon Go freaks… :)) that showed 23% of data was masked to some level and 45% of data was significantly masked by 2014.  This still left over 30% of data at risk.

The Mindset Around Securing Data

We also don’t think very clearly about how and what to protect.  We often silo our security-  the network administrators secure the network.  The server administrators secure the host, but don’t concern themselves with the application or the database.  The DBA may be securing the database, but the application that’s accessing it may be open to exposing data that shouldn’t be available to those involved.  We won’t even start on what George in accounting is doing.

We need to change from thinking just of disk encryption and start thinking about data encryption and application encryption, with key data stores that protect all of the data-  the goal of the entire project.  It’s not like we’re going to see people running out of a building with a server, but seriously, it doesn’t just happen in the movies-  people have stolen drives, jump drives and even printouts of spreadsheets with incredibly important data residing on them.

As I’ve been learning, what is essential to masking data properly, along with what makes our product superior, is that it identifies potential data that should be masked and performs ongoing audits to ensure that data doesn’t become vulnerable over time.


This discovery work can be the largest consumer of resources in any data masking project, so I was really impressed with this area of Delphix data masking.  It’s really easy to use, so if you don’t understand the ins and outs of DBMS_CRYPTO or are unfamiliar with the java.util.Random syntax, no worries-  the Delphix product makes it really easy to mask data and has a centralized key store to manage everything.
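
For contrast, here’s the sort of hand-rolled, one-off script the tool replaces, (the customers table and ssn column are made up for illustration)-  note there’s no discovery, auditing or key management anywhere around it:

-- purely illustrative: overwrite each SSN with a random, non-reversible 9-digit value
UPDATE customers
   SET ssn = LPAD(TRUNC(DBMS_RANDOM.VALUE(0, 1e9)), 9, '0');
COMMIT;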


It doesn’t matter if the environment is on-premise or in the cloud.  Delphix, like a number of companies these days, understands that hybrid management is a requirement, so efficient masking and ensuring that at no point is sensitive data at risk is essential.

The Shift

How many data breaches do we need to hear about to make us all pay more attention to this?  Security topics at conferences have diminished vs. when I started attending less than a decade ago.  It wasn’t that long ago that security appeared to be more important to us, and yet it’s become an even more important issue.


Research was also performed that found only 7-19% of companies actually knew where all of their sensitive data was located.  That leaves over 80% of companies with sensitive data vulnerable to a breach.  I don’t know about the rest of you, but upon finishing up that little bit of research, I understood why many feel better not knowing-  but it’s better to just accept this and address masking needs to ensure we’re not one of the vulnerable ones.

Automated solutions to discover vulnerable data can significantly reduce risks and reduce the demands on those who often manage the data but don’t know what the data is for.  I’ve always said that the best DBAs know the data, but how much can we really understand it and still do our jobs?  It’s often the users who understand it, but they may not comprehend the technical requirements to safeguard it.  Automated solutions remove that skill requirement from having to exist in human form, allowing us all to do our jobs better.  I thought it was really cool that our data masking tool considers this and takes the pressure off of us, letting the tool do the heavy lifting.

Along with a myriad of database platforms, we also know that people are bound and determined to export data to Excel, MS Access and other flat file formats, resulting in more vulnerabilities that seem out of our control.  The Delphix data masking tool considers this and supports many of these formats, as well.  George, the new smarty-pants in accounting, wrote his own XML pull of customers and credit card numbers?  No problem, we’ve got you covered… 🙂


So now, along with telling you how to automate a script to email George to change his password from “1234” in production, I can make recommendations on how to keep him from having the ability to print out a spreadsheet with all the customers’ credit card numbers on it and leave it on the printer…:)

Happy Monday, everyone!


Posted in Data Masking, Delphix, Oracle

July 14th, 2016 by dbakevlar

So some unknowing fool gave me full access to a fridge of Starbucks caffeinated beverages, which should be against the law in this fine country.  Needless to say, my ability to type approximately 180wpm has offered me the opportunity to catch up and blog about what it’s like almost a month into being employed with Delphix.


In just the last two weeks, a number of conference sessions and events that I plan on being part of have come about.  Some of them are well known, some aren’t, and some require a bit of promotion to make them even greater than they already are, (you don’t want to be that person who wonders why they missed out, now do you?)

RMOUG Quarterly Education Workshop

At the end of this month, on July 29th, RMOUG is having its summer QEW at the Denver Aquarium.  The summer event is our second largest event of the year, (behind February’s Training Days conference) and the planning is starting to come together, (which is good, considering it’s in a couple of weeks!)  The speakers are two great ones in the community, Solarwinds’ Janis Griffin and OnX’s Jim Czuprynski.  We’ll have a Women in Technology lunch and learn-  Jim’s wife, Ruth, (who had a long career in technology) will be part of the round table conversation with the women in our community to discuss some great topics!  If you have family you’d like to invite, check out the opportunity for discount tickets to enjoy the aquarium after the event!  You can read the details of the event and register at RMOUG’s website.

The Bay Bridge Run

The first is a promotion for Oracle Open World.  Every year there is a run across the bridge, this year on September 18th, and it’s a great way to get to know some of the best people in the Oracle community before we get lost amidst the Moscone mayhem.  The view is fantastic, and if you’re like me and age has impacted your ability to keep up with some of those in our community who like to run 100+ miles in a single outing, you can walk the bridge as well.  If you’re interested, check out the following Facebook link and come be a part of this great event!


During Oracle Open World, I’ll be busy Tuesday and Wednesday, (Sept. 20th and 21st) at the Children’s Museum, heading up Oak Table World.  It’s one of those tasks that I took over from Kyle so he can focus on what he does best.  The event will be at the Children’s Museum by the Moscone again, and we’re close to signing on Pythian and one other vendor to sponsor the event with Delphix.  Tim will be designing the great t-shirts that are coveted each year, and I’m considering scheduling a hackathon if there’s enough interest.  I’ll get the schedule posted soon-  thanks for everyone’s patience!


OK, time for another Starbucks, pray for my office mates… 🙂


Posted in DBA Life

June 24th, 2016 by dbakevlar

I’ve been going through some SERIOUS training in just over a week.  This training has successfully navigated the “Three I’s”, as in it’s been Interesting, Interactive and Informative.  The offerings are very complete and the knowledge gained is limitless.

I’d also like to send a shout out to Steve Karam, Leighton Nelson and everyone else at Delphix who’s had a hand in designing the training, both for new employees and for those working with our hands-on labs.  I’ve had a chance to work with both and they’re far above anything I’ve seen anywhere else.

The Challenge of Patching and Upgrades

Most DBAs know-  if you attempt to take a shortcut in patching or upgrading, either by not testing or by hoping that your environments are the same without verifying, shortcuts can go very wrong, very quickly.

Patching is also one of the most TEDIOUS tasks required of DBAs.  The demand on the IT infrastructure for downtime to apply quarterly PSU patches, (not including emergency security patches) to all the development, test, QA and production databases is a task I’ve never looked forward to.  Even when utilizing Enterprise Manager 12c Patch Plans with the DBLM management pack, you still had to hope that you’d checked compliance for all environments and pray that your failure threshold wasn’t tripped, which meant a large amount of your time had to be allocated to addressing patching outside of just testing and building out patch plans.

This is Where Delphix Saves the Day

I bet most of you already knew you could virtualize your development and test environments from a single Delphix compressed copy, (referred to as a DSource) and create as many virtual copies, (referred to as VDBs) as your organization needs for development, testing, QA, backup and disaster recovery, (if you weren’t aware of this, you can thank me later… :))

What you may not know, (and what I learned this week) is that you can also do the following:

  • Test patches and upgrades on a VDB or a DSource to verify there aren’t any issues, instead of doing a full, manual clone, which is very time consuming.
  • Apply patches and upgrades to a DSource, then patch and/or upgrade all the VDBs attached to that DSource by simply performing or scheduling a refresh.


Considering how much time and how many resources are saved just by eliminating such a large portion of the time required for patching and upgrading, this alone makes Delphix worth the investment!

Want to learn more?  Check out the following links:

Testing Oracle Upgrade and Application Patching

Upgrade Oracle DSources After an Oracle Upgrade

Want to Demo Delphix? <– Click here!   

Posted in Database, DBA Life, Delphix

June 15th, 2016 by dbakevlar

Back in 2012, when I started to build a reputation as a mentor, the goal was not just to create my own path and set it afire, but for others to desire to make their own path before my footsteps cooled.


Never Fear Hot Coals

This week I joined Delphix.  Many acted as if this was pre-ordained and simply part of my destiny.  Due to my technical knowledge, they assumed I would be working in the same group as my husband and virtualization was just a natural fit for my skills.  Although I love working just a few feet from my husband, (which we’ve done for years now) I actually joined the Product Management and Marketing team, which is a surprise to many.

All new jobs come with new challenges, but when you also change paths, it can be like walking on hot coals.  You can find yourself anxious, having doubts about your skills and whether you’re up to the challenge.  I accepted this challenge because I wanted to have more impact on the direction of the technology I was working with.  I wanted to help the business make intelligent decisions with the powerful knowledge I have about technology, customers and product, and this is something that Delphix is keen on letting me be a part of.  I’m pretty fearless, but even I have to remember not to let my fears or frustrations get the best of me.  As I always say, there is incredible power in the simple act of doing, so just do, and it’s surprising how quickly you’ll be successful.

In just the few days I’ve been here, I’ve already begun to build the documentation that will help determine what content is directed to which audiences, chosen a few members of the community to do guest blogs, done some really great training and been introduced to some incredible people.

I’m learning how to manage my time a little differently than I did before, as things moved a lot slower at Oracle, but I love how my skills are more in line with what the company needs from me.  Even though I’ve been here less than a week and am in a position that I’ve never held before, I know exactly what my purpose is.

I want to thank my new peers and managers for helping me quickly get up to speed.  My beloved Microsoft Surface has been migrated to my secondary desk and my new work computer is set up, (I’m on a MacBook Air-  don’t everyone gasp at once and start taking bets on how long the keyboard will last… :))  As I stated earlier, the work is new and interesting, which is why Delphix was at the top of my list of companies to join.  Like my new position, Delphix technology removes many of the tedious tasks and automates much of what was once a manual process, freeing you up for more interesting and rewarding adventures.


I’ll continue to update everyone on how my new world is shaking out at Delphix and hopefully will convince a couple of you to join me.  It wouldn’t be the first time that’s happened.  The coals are warm, but will you follow in my footsteps before they cool?


Posted in Delphix

June 7th, 2016 by dbakevlar

So after over two years at Oracle, I’m moving on.  Yes, for those of you who haven’t seen the tweets and the posts, you heard right.


OK, everyone-  cleansing breath.

I worked with great people and did some awesome things in the community, blogged everything Enterprise Manager and talked over 1/2 the Oracle community into buying and doing projects with Raspberry Pi while I was at it!

Many folks thought I was a product manager or a technical consultant, but my title was Consulting Member of Technical Staff in the Strategic Customer Program with the Enterprise Manager and Oracle Management Cloud group.  I know I was part of a select group at Oracle, but I believe the opportunity to work at Oracle was an important step in my career and I’d recommend it to anyone for the experience it provides.

There is a huge difference between working for Oracle and being in the Oracle community, even as an Oracle ACE Director.  I was utterly amazed to be part of the Oracle machine.  One of the most amazing experiences was observing how releases came together-  it was a completely different experience as an employee vs. a customer.  Being part of a massive undertaking such as a product release, impressively building out software to deliver to its customer base, is pretty astounding.  Understanding what it takes to move the machine-  and once it gets moving, how pertinent it is for anyone in its way to get out of the way-  is important to understanding how a successful product is created.

I learned a lot in just over two years and I have to admit-  many of the negatives that people said would be present at Oracle, I just didn’t experience.  I had great mentors and contacts inside of Oracle.  It’s easy to assimilate into a big company environment when you have people like Pete Sharman, Tyler Muth, Mary Melgaard and others looking out for you.  I’ll be sad to leave all the great people that I worked with at Oracle, too-  Steve, Courtney, Scott, Werner, Andrew, Joe, Pramod and Will.  At the same time, I look forward to opportunities to learn new skills with the awesome folks who have so readily embraced my quirky self at my new company.  This is knowledge I’m able to take with me as I move forward to my new adventure.

With that said, I’ve been offered an incredible opportunity to stretch my legs a bit and try something new, and I am excited to move on to this new challenge.  I’ll still be speaking at conferences, but I’ll also direct technology in a way that should be very constructive to my technical style.

There have been a lot of rumors about where I’m off to.  Some of you have guessed correctly, but I know none of you guessed what I’ll be doing.  I will be focusing more on my multi-platform skills, so for those of you who thought I would be leaving all those years of experience in database and OS platforms behind, it’s going to be just the opposite.

I’m very excited to announce that, as of Monday, June 13th, I’m the new Technical Intelligence Manager at Delphix.


Buckle up, Baby!  This is going to be good.

Posted in DBA Life, Delphix Tagged with:
