It was a really busy summer, and it ended with me returning from a week of vacation in Singapore. What should I do after a 17-hour flight and jet lag? Two webinars and a SQL Saturday event! What better way to get over the jet lag than to get my game back on and just jump back in!
I started out with a webinar this morning on "DBA to DevOps to DataOps- the Revolution." I had a feeling that with the jet lag, I'd be done faster than I'd hoped, but with the number of questions from the over 400 attendees, it was an awesome hour with everyone. I focused on the important topic of data gravity and how the role of the DBA can evolve to be more productive for the business.
There were reference links that I knew were important, and the PDF slide deck doesn't provide them, so please refer to the links below to catch up on all the Delphix blog posts I've written on this topic:
FYI- there are two more blog posts that will be published shortly on delphix.com, so stay tuned for those.
On Thursday, I’ll be presenting with Oracle, “The DBA Diaries” focused on the cloud. It should be a great conversation on where DBAs are in the scheme of the cloud and how our role is evolving.
To round up the week, I’ll be presenting at SQL Saturday Denver, my local SQL Saturday event for the SQL Server community! Delphix is sponsoring this awesome event and I’m looking forward to presenting, (as is Tim at this event.)
Sunday– I SLEEP! No, I lie… I’ll be uploading all my code, video and content for ODTUG’s Geekathon. Then I sleep. 🙂
Data gravity and the friction it causes within the development cycle is an incredibly obvious problem in my eyes.
Data gravity suffers from the von Neumann bottleneck, a basic limitation on how fast computers can be. It's pretty simple: the speed at which data can travel between where it resides and where it's processed is the limiting factor in computing speed.
OLAP, DSS and VLDB DBAs are constantly in battle with this challenge: how much data is being consumed in a process, how much must be brought from disk, and whether the processing required to create the results will end up "spilling" to disk vs. completing in memory.
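To make the spill-to-disk cost concrete, here's a toy timing sketch (purely illustrative Python with made-up data sizes, not any database's actual spill mechanism): the same aggregation runs once over data held in memory, and once by streaming it back from a temp file on disk.

```python
import os
import tempfile
import time

# A million rows of made-up data -- small enough to fit in memory.
rows = list(range(1_000_000))

# In-memory aggregation: data and processing are in the same place.
start = time.perf_counter()
in_memory_total = sum(rows)
in_memory_time = time.perf_counter() - start

# Now write the rows to disk and aggregate by streaming them back,
# loosely analogous to a sort or join "spilling" to temp space.
with tempfile.NamedTemporaryFile(mode="w", delete=False, suffix=".txt") as f:
    f.write("\n".join(str(r) for r in rows))
    path = f.name

start = time.perf_counter()
disk_total = 0
with open(path) as f:
    for line in f:
        disk_total += int(line)
disk_time = time.perf_counter() - start
os.unlink(path)

print(f"in memory: {in_memory_time:.4f}s, via disk: {disk_time:.4f}s")
```

The answer is identical either way; only the distance the data travels changes, and that distance is the whole point of data gravity.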
Microsoft researcher Jim Gray spent most of his career looking at the economics of data, which is one of the most accurate terms for this area of technical study. He started working at Microsoft in 1995, and although he was passionate about many areas of technology, his research on large databases and transaction processing speeds commands great respect in my world.
Now some may say this has little to do with being a database administrator, but how many of us spend significant time on the cost-based optimizer? Moving or getting data has a cost, so economics of data it is.
And this is the fundamental principle of data gravity and why DBAs get the big bucks.
If you’re interested in learning more about data gravity, DevOps and the future of DBAs, register for the upcoming webinar.
This is an extensive series of blog posts (four so far), to be followed by an ebook, a podcast and two webinars. One is to be announced soon from Oracle, called "The DBA Diaries", and the other will be from Delphix, titled "The Revolution: From Databases and DevOps to DataOps".
The goal of all of this is to ease the transition for the database community as the brutal shift to the cloud, now underway, changes our day-to-day lives. Development continues to move at an ever-accelerating pace, and yet the DBA is standing still, waiting for the data to catch up with it all. This is a concept many refer to as "data gravity".
The term was coined just a few years ago by Dave McCrory, a Senior VP of Platform Engineering. It opened a discussion aimed at understanding how data impacts the way technology changes when connected with network, software and compute.
He discusses the von Neumann bottleneck, the basic limitation whereby "the speed with which information can get from memory (where data is stored) to computing (where data is acted upon) is the limiting factor in computing speed."
These are essential concepts that I believe all DBAs and developers should understand, as data gravity impacts all of us. It's the reason for many enhancements to database, network and compute power. It's the reason optimization specialists are in such demand. Other roles such as backup, monitoring and error handling can be automated, but no matter how much logic we drive into programs, nothing eliminates data gravity issues like true skill in optimization. Less data, less weight- it's as simple as that.
We all know the cloud discussions are coming, and with them, even bigger challenges from the gravity of data. Until then, let's just take a step back and recognize that we need some new goals and some new skills. If you'd like to learn more about data gravity but don't have time to take it all in at once, consider following the topic on Twitter, where it's curated by Dave McCrory.
I’m off to Jacksonville, Fl. tomorrow to speak at SQL Saturday #649!
After returning from KSCOPE two weeks ago, I was again approached to be a judge this year on the Geekathon 2017.
I was thrilled when the option was left open to me to compete this year instead of being a judge. As much as I love to offer my insight and expertise in a judging capacity, I really do love a great maker opportunity, which I haven't had much time to allocate to with all the work I'm doing at Delphix.
Needless to say, I'm looking forward to this year's competition, so look out, makers, there's a new…err, old gun….old set of Mickey ears in town! (For those of you who aren't getting that, next year's KSCOPE 2018 conference is in Orlando, FL…. :))
I've been part of the maker space for a number of years, finding it one of the most diverse and supportive communities in the nation. I've presented at local events and been featured in the NY Times representing makers for the Barnes and Noble bookstore events.
I've officially built out my proposal and submitted it as a team of one, but if you're new to the maker space, unsure of what to do with a beacon, or would just like to have a team to collaborate with, that's an option! Just find your maker-mates, come up with an awesome idea, and if you need a beacon, just ask- the ODTUGers will be glad to send you one so you can compete! Just go to the ODTUG Geekathon site and submit a team and an idea!
Database Administrators (DBAs), through their own self-promotion, will tell you they're the smartest people in the room and, being such, will avoid buzzwords that create cataclysmic shifts in technology as DevOps has. One of our main roles is to maintain consistent availability, which is always threatened by change, and DevOps challenges this with its focus on methodologies like agile, continuous delivery and lean development.
Residing a step or more behind the bleeding edge has never fazed the DBA. We were the cool kids by being retro, the ones refusing to fall for the latest trend or the coolest new feature, knowing that with the bleeding edge comes risk, and that a DBA who takes risks is a DBA out of work. So we put up the barricades and refused the radical claims and cultural shift of DevOps.
As I travel to multiple events focused on the numerous platforms the database is crucial to, I'm faced with peers frustrated with DevOps and considerable conversation dedicated to how it's the end of the database administrator. It may be my imagination, but I've been hearing this same story over and over, with the blame assigned elsewhere- either it's Agile, DevOps, the cloud, or even the latest release of the actual database platform. The story's the same- the end of the Database Administrator.
The most alarming and obvious point is that in each of these scenarios, the Database Administrator ended up more of a focal point than they were when it began. When it comes to DevOps, the specific challenges of the goal needed the DBA more than any of these storylines admit. As development hurtled at top speed to deliver what the business required, the DBA, and operations as a whole, delivered the security, the stability and the methodologies to build automation at a level the other groups simply never needed previously.
Powerful DBAs with skills not just in scripting, but in efficiency and logic, were able to take complicated, multi-tier environments and break them down into strategies that could be easily adopted. Having overcome the challenges of the database being central, and blamed for everything in the IT environment, they were able to dissect and build out complex management and monitoring of end-to-end DevOps. As essential as system, network and server administration were to the operations group, the Database Administrator possessed the advanced skills required- a hybrid of the developer and the operations personnel that makes them a natural fit for DevOps.
Thanks go to this awesome post from 2012 by Alex Tatiyants, which resonated so well with the DBAs I speak to every day, even in 2017.
I know you've read the title and are thinking, "Great, I'm going to learn how to write better presentations!" The truth is, it's about how conferences handle call-for-papers timelines and abstracts, as well as how we manage our content.
Timing Is Everything
In recent years, there’s been a scramble to get on the schedule for the best speakers. I remember when I first started as the director for RMOUG’s annual Training Days conference. I had to keep the opening date for Call For Papers, (CFP) secret, since the moment I opened the portal up, IOUG, ODTUG and UKOUG were sure to pounce- all focused on getting on the limited travel agendas for top notch conference speakers.
This resulted in these events having the CFP abstract content submitted 6-10 months before the event. Speakers would have no choice but to submit repeat, vague or incomplete abstract submissions, which invariably impacted the quality of what was presented at the event. I know a number of speakers are sitting back and saying, "Wait a minute…" but let me finish. The quality was impacted in various ways, and not in the ways many would first consider:
First Isn’t Always Good
As I know RMOUG is about to open our CFP, part of me is torn on the way we always do this. On one hand, we want adequate time for people to submit abstracts, for our reviewers to review the content, and for our scoring system to drive our acceptances, but….but….. if I were to ensure we have the BEST content, what are some of the enhancements I'd want in place that I know as a speaker would significantly help?
The goal of this change would be to:
Fix the Software
“A good presentation is like a good whisky. Proper aging is important.”
The first time I give a presentation, I'm rarely happy with it. I want my messaging on fleek, and it just rarely happens that the story, the content slides and my brain have it all wrapped up tight. It seems I can go through it a hundred times before I get up on stage, but as I start presenting the material in front of audiences, the story comes together, and this often creates a need to update the abstract.
Maybe it's the control freaks in us on the database side of the house. Once our CFP is done, we lock down access for potential speakers to tweak anything. I don't know about other speakers, but as I submit throughout the year, I keep a running document of my abstracts. I tweak them as I go along, and there are many times I'd like to update the ones I've already submitted as the talk "ages" and takes shape. I fall into the 50% of speakers who build a lot of content each year, and this means I often show up at an event and realize the title I submitted hasn't changed in the last 9 months, but the message has mutated with my experience giving the session, and the summary that should draw folks into the session just doesn't match any longer.
It would be AWESOME to have abstract software that allows seasoned speakers to tweak an abstract and submit the update to the conference director. We want to give the audience a great presentation, but I know at KSCOPE, my abstract was written 10 months before the conference, before I understood the power of what Delphix could do, and if you read the summary in the catalog, there was no way I was reaching my real audience.
I *thought* I was really great at keeping track of all my profiles out on the internet. I get very frustrated when I can't update them when I move to a new role or a new company, and as much as some folks might tease me about moving around, the truth is, if you want to be a mover and a shaker, you're often left with no choice but to move to shake things up.
I was completely confounded when someone told me that I had a profile that still said I was an ACE Director, and that they had to go in and have it updated. I was sure they'd made a mistake and just missed the "alumni" behind it. I'm very good about ensuring I always copy and paste my biography from the document where I keep my abstracts. I logged into the portal that was listed as the culprit and quickly noticed that it didn't have my default "pasted" profile. It also listed my title with the word "new" in front of it. That told me that for some reason I'd attempted to edit instead of copying and pasting over, and it proved to me again that taking shortcuts like this, no matter how benign they seem, just shows how easily human intervention causes human error. I had broken my own rule against editing a profile by hand and had paid for it.
For conference speaker bios, can we PLEASE just link to speakers' LinkedIn profiles? We use bind variables in our code, so why would we have hard-coded values for data that can change, like our own information? It seems like a petty thing, but I can't tell you how many times I've received emails asking me to update profiles that I don't have access to, or how often out-of-date information has confused folks when I can't pin down the owner of the profile.
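For anyone who hasn't seen the bind-variable habit in action, here's a minimal sketch (hypothetical table and profile URL, using Python's built-in sqlite3): the query text stays constant while the changing value is bound at execution time, instead of being hard-coded into the SQL string- exactly the pattern I wish we applied to our bios.

```python
import sqlite3

# Toy schema and data -- the speaker name and URL are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE speakers (name TEXT, bio_url TEXT)")
conn.execute(
    "INSERT INTO speakers VALUES (?, ?)",
    ("Jane Speaker", "https://www.linkedin.com/in/jane-speaker"),
)

# Hard-coding the name would mean a brand-new SQL string for every
# speaker; binding it keeps one reusable statement, and the changeable
# value travels separately as a parameter.
row = conn.execute(
    "SELECT bio_url FROM speakers WHERE name = ?", ("Jane Speaker",)
).fetchone()
print(row[0])
```

Store the profile once, reference it everywhere- just like a bind variable is supplied once per execution rather than baked into the statement.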
Your Part as the Speaker
So what can I offer on becoming a better speaker? A few bits of advice:
OK, time to fly home and face the music. Esme wants to know why I was cuddling another puppy at the KSCOPE puppy therapy circle in the exhibitor area two days ago. Look how happy the puppy was though! 🙂
I know that right sidebar on my blog has an AWFUL lot of Microsoft events on it. There are so many that I've begun to use the hashtag #MySummerOfSQL. For those of you who follow me from Oracle, it doesn't mean that I'm leaving the Oracle community- not even close. I'm as dedicated as ever to Oracle and hope to dig back into my performance roots on both platforms, but know that the summer is the quiet time for Oracle user group events, so I'll be keeping myself busy with SQL Saturdays and the AWESOME preview to the annual Pass Summit conference (for the Oracle peeps, think of an Oracle Open World for Microsoft folks, sans the sales folks… :)), which is a series of worldwide webinars called the 24 HOP (24 Hours of Pass).
I want to thank the Microsoft SQL Pass community for embracing me and letting me regain my footing since my time as a SQL Server DBA ended back with the release of SQL Server 2012, and I'm really loving all the enhancements in SQL Server 2014, 2016 and now, 2017!
For those on the Oracle side of the house, hopefully the Oracle Open World acceptances will come out in the next two weeks and I’m crossing my fingers I’ll get to speak either on my own or even better, with one of the fantastic co-presenters I’m hoping to partner up with- Gurcan Orhan and Mike Donovan of DB Visit.
I’m busy prepping my slides for the last HUGE Oracle conference before the summer break, KSCOPE, in San Antonio this next week, but I’ll try to get one more blog post out this week. Of course, it’s going to be more on the SQL Server/Oracle optimizer comparisons.
So see my Oracle peeps in San Antonio next week for the ever AWESOME KSCOPE 2017 and help celebrate their 20th birthday!
I just returned from a week in Paris and it was fantastic!
No terrorist attack could hinder my enthusiasm for the week of vacation, and although Tim and I were unaware that a man with a hammer at Notre Dame would garner so much attention from the news outlets, I realized quickly enough that I was in a slew of online, video and print coverage as the major American tourist who witnessed…well, a person lying wounded on the pavement with police over him, guns drawn. I also became quickly aware that there was a massive crowd coming towards me, looking for shelter in the cathedral from any further gunshots (and I was first in line!). We were ushered inside immediately, and I have to say, I can't think of a more beautiful place to be held while a terror attack investigation was going on outside.
I’d tweeted just before entering the courtyard and then the line to enter the cathedral that we were going to visit Notre Dame and after being ushered in by security, some of my followers on Twitter alerted me that there was something going on, but they weren’t sure what. I quickly responded that it was alright and that we were being protected inside the cathedral. This ended up alerting the major media outlets and they started to contact me. We had about 2 hours inside, so I did grant BBC and AP the time to tell them that the situation was very well contained and that we were safe. I was hoping to keep it from being blown out of proportion, to be honest.
Needless to say, after the initial investigation was completed and we were released, Tim and I went to get a glass of red wine, (I commonly don’t drink wine) to calm the nerves and as I’ve stated in my interviews- the French police had everything incredibly under control and contained the situation in an efficient and orderly manner. I applaud how effectively they’ve handled the escalated terrorism threats in France and I wish other countries were as prepared as they are.
The rest of the week we enjoyed Paris and even spent a day down on the Southern coast of France. The last evening, just days after the terrorist event, you could see that no one was getting the French down, definitely not some silly terrorist!
So, back to work for me! I have two webinars this week!
Tomorrow is an Oracle one from Pro Huddle, focused on DevOps for the DBA!
Wednesday is for the SQL Server VirtualPass folks and is on Virtualization, Opportunities When Migrating to the Cloud!
Happy National Goth Day! Although I consider myself “Goth Lite”, it’s a national holiday in my little world and I’m pondering what books I’ll write after I run out of tech titles. Needless to say, I’ve chosen the title, “Staying Geeky and Goth After 50” and chosen the following for the cover art:
I’m considering adding a black choker to Ma Goth and maybe some facial piercings to Pa- we’ll see.
As May winds down, I look forward to a vacation in Paris the beginning of June with my favorite person, Tim Gorman. As much as we travel together, we rarely get to see much of each other while at an event and are both looking forward to some downtime to just spend together.
After last week's Data Summit 2017, DBTA did a wonderful write-up on my session, but as I'm my own worst critic, it let me know I needed to do a better job of relaying my message to the community: in the article, there's not one mention of the importance of virtualization in removing that pesky bottleneck of data from cloud migrations. I greatly appreciate the opportunity to speak at this awesome event and shall promise to do better next time… 🙂
For week 21 of the year, I'm off to sunny Phoenix, Arizona, for Data Platforms 2017.
This is an incredible event focused on data operations, and no other event I've seen speaks better to the power of DevOps, automation and visualizing a future of data at the speed of business through virtualization. I'm going to be speaking at one session, focused on some of our solutions for virtualizing everything, not just the database. We all know there are bottlenecks everywhere in getting to the cloud, automating and handling data in general, and a number of solutions focus on getting away from the RDBMS, but in reality, that just moves the resource hit. We also know that databases don't have sole ownership of complexity. I'll try to shed some light on how to lighten the load across the database, applications, flat files and anything else possible, to make life for the next generation of technologists a piece of cake.
Again- Happy National Goth Day!!
After looking at my upcoming schedule for 2017, I realized that I need to start regularly posting on where I am and what conference(s) I'm attending.
With that realization, I present “Where in the World is Goth Girl” posts that will help keep track of me, (before my husband has a homing device surgically implanted in my brain….:))
This week I’m in Manhattan for Data Summit 2017, presenting on Wednesday on Database Virtualization. As soon as I’m done presenting on Wednesday, I’m heading to the airport to fly to Cleveland for the second day of NEOOUG, Great Lakes Oracle Conference, (GLOC) 2017 to present twice on Thursday morning before we fly out that evening for home.
I have two slide decks that I've just gone through and revamped for the events, and I'm rather proud of my entertaining and aesthetically pleasing PowerPoint accomplishment. Seriously, how can you not love slides like this?
I’ll be speaking on Database Virtualization and the Cloud at both events and then my second session at GLOC will be on Test Data Management for DBAs. As many people know- I try to connect with as many people as I can while at events, so if you see a Linkedin connection or Twitter follow from DBAKevlar, it’s just me being social!
I was in a COE, (Center of Excellence) meeting yesterday and someone asked me, “Kellyn, is your blog correct? Are you really speaking at a Blockchain event??” Yeah, I’m all over the technical map these days and you know what?
I love the variety of technology, the diversity of attendance and the differences in how the conferences are managed. Now that last one might seem odd, and you might think they'd all be similar, but it's surprising how different they really are.
Today I'm going to talk about an aspect of conferences that's very near to my heart: networking via events. For women in technology, there are some unique challenges when it comes to networking. Men have concerns about approaching women to network, such as fear of accusations of inappropriate interaction, and women have the challenge that a lot of networking opportunities occur outside of the workplace, in social situations we may not be comfortable in. No matter who you are, no matter what your intentions, there's a lot of wariness, and in the end, women often just lose out when it comes to building their network. I've been able to breach this pretty successfully, but I have seen where it's backfired, and have found myself on more than one occasion defending both genders who've ended up on the losing side of the situation.
With that said, conferences and other professional events can assist with helping us geeks build our networks and it’s not all about networking events. I noticed a while back that the SQL Server community appeared to be more networked among their members. I believe part of this is due to the long history of their event software and some of its features.
Using the SQL Pass website, specifically the local user group event management software, notice that it's all centralized. Unlike the significantly independent Oracle user groups, SQL Server user groups are able to use a centralized repository for their event management, speaker portal, scheduling, etc. That's not to say there aren't any events outside of Pass Summit and SQL Saturdays- there are actually a ton- but this is the portal for the regional user groups, creating the hub that bridges out to the larger community.
Outside of submitting my abstract proposals to as many SQL Saturdays worldwide from one portal, I also can maintain one speaker biography, information about my blog, Twitter, Linkedin and other social media in this one location.
The second benefit of this simplicity is that these biographies and profiles "feed" the conference schedules and event sites. You have a central location for management, but hundreds of event sites where different members can connect. After abstracts have been approved and the schedule built, I can easily go into an event's schedule, click on each speaker biography, and choose to connect with anyone listed who has entered their social media information in their global profile.
Using my profile as an example, you’ll notice the social media icons under my title are available with a simple click of the mouse:
This gives me not only an easy way to network with my fellow speakers, but also an excuse to network with them! I can click on each one of the social media buttons and choose to follow each of the speakers on Twitter and connect with them on LinkedIn. I send a note with the LinkedIn connection telling the speaker that we're both speaking at the event and that, due to this, I'd like to add them to my network.
As you can join as many regional and virtual user groups as you like (and your Pass membership is free), I joined the three in Colorado (Denver, Boulder and Colorado Springs). Each one of those offers the ability to connect with the board members using a similar method (I'll use Todd and David from the Denver SQL Server user group as my examples).
The Oracle user groups have embraced adding Twitter links to most speaker bios and some board pages, but I know that for RMOUG, many still hesitate or aren't using social media to the extent they could. I can't stress enough how impressed I am when I see events incorporate LinkedIn and Twitter into their speaker and management profiles, knowing the value they bring to technical careers, networks and the community.
Although the SQL Server community is a good example, they aren’t the only ones. I’m also speaking at new events on emergent technologies, like Data Platforms 2017. I’ll be polite and expose my own profile page, but I’m told I’m easy to find in the sea of male speakers… 🙂 Along with my picture, bio and session information, there are links to my social media connections, allowing people to connect with me:
Yes, the Bizzabo software, (same software package that RMOUG will be using for our 2018 conference, along with a few other Oracle events this coming year) is aesthetically appealing, but more importantly, it incorporates important networking features that in the past just weren’t as essential as they are in today’s business world.
I first learned the networking tactic of connecting with the people I'm speaking alongside from Jeff Smith, and I think it's a great skill that everyone should take advantage of, whether you're speaking or just attending. For women, I think it's essential to your career to take advantage of opportunities to network outside of the traditional ways we've been taught in the past, and this is just one more way to work around that glass ceiling.
How many times have you had a developer come to you and say, “I just did a bad thing in the database. Can you recover from what I just did?”
With Delphix virtualization, we make this pretty easy to address from the user interface with a simple slider to recover to a point in time (PIT) before the catastrophic mistake, but today, we'll discuss how to do this from the command line.
1. Log into the Delphix engine as an admin user:
ssh delphix_admin@<yourengine>
delphix > timeflow
delphix timeflow > ls
2. Depending on the platform you're using (in our example, we'll use Oracle), you'll see the list of the databases available and can choose the one that you want to roll back to before the catastrophic incident from the clueless developer:
delphix database> select [VDB name]
3. We can do a simple rollback if we just want to go back to the last snapshot, or we can use the ls command to see more options:
delphix database "[VDB Name]"> rollback
delphix database "[VDB Name]" rollback *> ls
Properties
    type: OracleRollbackParameters
    credential: (unset)
    timeflowPointParameters:
        type: TimeflowPointSemantic
        container: (required)
        location: LATEST_POINT
        username: (unset)
4. So we've decided to do a PIT recovery to a point before the mistake; we use the following command and then commit the changes:
delphix database "[VDB Name]" rollback *> set timeflowPointParameters.location=82439
delphix database "[VDB Name]" rollback *> commit
That’s all there is to it.
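If you'd rather not type the four steps interactively each time, the session above can be wrapped in a small script. This is a hedged sketch, not official Delphix tooling- the engine host, VDB name and location are placeholder values, and it defaults to a dry run that just prints the CLI commands for review.

```python
import subprocess

# Hypothetical placeholder values -- substitute your own.
engine = "delphix-engine.example.com"   # hypothetical engine hostname
vdb_name = "VDB_DEV1"                   # hypothetical VDB name
location = "82439"                      # PIT location found in step 3

# Build the CLI command sequence exactly as typed in steps 1-4 above.
cli_script = "\n".join([
    "database",
    f"select {vdb_name}",
    "rollback",
    f"set timeflowPointParameters.location={location}",
    "commit",
])

# Dry run: print the commands for review. To actually run them, pipe the
# script over ssh instead, e.g.:
#   subprocess.run(["ssh", f"delphix_admin@{engine}"],
#                  input=cli_script, text=True)
print(cli_script)
```

Verify the printed commands against a non-production engine before swapping the print for the ssh call, and this drops neatly into Jenkins or any scheduler.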
So what if the developer is incompetent and screws up repeatedly?
Follow steps 1-4 above and then to purge the problem from the environment, run the following command from the delphix engine:
delphix database developer [developer employee ID] remove *> commit
An eject button we’ve installed in the delphix engine will remove the developer from the premises and Delphix will even submit all necessary paperwork to Human Resources to complete his termination processing.
If you’d like to automate the process, you can create a handy script that simply asks for the following parameters by calling it from any shell, Powershell for windows or even Jenkins as part of DevOps!
Happy April 1st!
I ended up speaking at two events this last week. Now if timezones and flights weren’t enough to confuse someone, I was speaking at both an Oracle AND a SQL Server event- yeah, that’s how I roll these days.
I arrived last Sunday in Salt Lake City, which has slightly milder weather and is like a more conservative version of Colorado, to speak at UTOUG's Spring Training Days conference. I love this location, and the weather was remarkable, but even with the warm temps, skiing was still only a half-hour drive from the city. Many of the speakers and attendees took advantage of this opportunity by doing just that while visiting. I chose to hang out with Michelle Kolbe and Lori Lorusso. I had a great time at the event, and although I was only onsite for 48 hours, I really like this event so close to my home state.
I presented on Virtualization 101 for DBAs, and it was a well-attended session. I really loved how many questions I received and how curious the database community has become about how virtualization is the key to moving to the cloud seamlessly.
There are significant takeaways from UTOUG. The user group, although small, is well cared for, and the event uses some of the best tools to ensure they get the best bang for the buck. It's well organized, and I applaud all that Michelle does to keep everyone engaged. It's not an easy endeavor, yet she takes this challenge on with gusto and with much success.
After spending Wednesday at home, I was back at the airport to head to Reykjavik, Iceland for their SQL Saturday. I’ve visited Iceland a couple times now and if you aren’t aware of this, IcelandAir offers up to 7 day layovers to visit Iceland and then you can continue on to your final destination. Tim and I have taken advantage of this perk on one of our trips to OUGN, (Norway) and it was a great way to visit some of this incredible country. When the notification arrived for SQL Saturday Iceland, I promptly submitted my abstracts and crossed my fingers. Lucky for me, Ásgeir Gunnarsson accepted my abstract and I was offered the chance to speak with this great SQL Server user group.
After arriving before 7am on Friday morning at Keflavik airport, I realized that I wouldn’t have a hotel room ready for me, no matter how much I wanted to sleep. Luckily there is a great article on the “I Love Reykjavik” site offering inside info on what to do if you do show up early. I was able to use the FlyBus to get a shuttle directly to and from my hotel, (all you have to do is ask the front desk to call them the night before you’re leaving and they’ll pick you back up in front of your hotel 3 hrs before your flight.) Once I arrived, I was able to check in my bags with their front desk and headed out into town.
I stayed at Hlemmur Square, which was central to the town and the event, and next to almost all of the bus lines through the city. The main street in front of it, Laugavegur, is one of the main east-west streets and is very walkable. Right across this street from the hotel was a very “memorable” museum, the Phallological Museum. I’m not going to link to it or post any pictures, but if you’re curious, I’ll warn you, it’s NSFW, even if it’s very, uhm…educational. It was recommended by a few folks on Twitter, and it did ensure I stayed awake after only 2 hours of sleep in 24!
As I wandered about town, I noted a few things about Iceland: the graffiti murals are really awesome, and Icelandic folks like good-quality products; the stores housed local and international goods, often made from wool, wood, quality metal and such. The city’s parliament building is easily accessible, right across from the main shopping area and new city development.
On Saturday, I was quick to arrive at Iceland’s SQL Saturday, as I had a full list of sessions I wanted to attend. I was starting to feel the effects of Iceland weather on my joints, but I was going to make sure I got the most out of the event. I had connected with a couple of the speakers at the dinner the night before, but with jet lag, you hope you’ll make a better impression on the day of the event.
I had the opportunity to learn about the most common challenges with SQL Server 2016, including why Dynamic Data Masking isn’t an enterprise solution. Due to the lack of discovery tools, the ability to join to non-masked objects, and common values (i.e., if 80% of the data is local, the most common location value is easily identified), the confidential data behind masked objects can still be identified.
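A tiny Python sketch (with made-up tables and values, not the session’s actual demo) shows why joinable, non-masked data defeats masking:

```python
# Hypothetical example: the location column is masked, but the join key
# survives, so an unmasked table re-identifies every "protected" value.

masked_customers = [
    {"customer_id": 1, "location": "XXXX"},
    {"customer_id": 2, "location": "XXXX"},
]
orders = [  # a non-masked object sharing the customer_id key
    {"customer_id": 1, "ship_city": "Reykjavik"},
    {"customer_id": 2, "ship_city": "Denver"},
]

# The "join": a masked column is no secret if a correlated, unmasked
# column exists anywhere else in the schema.
revealed = {o["customer_id"]: o["ship_city"] for o in orders}
unmasked = [
    {**c, "location": revealed[c["customer_id"]]} for c in masked_customers
]
print(unmasked)
```

The same trick works with nothing but value frequencies, which is the 80%-local-data point above.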
I also enjoyed an introduction to containers with SQL Server and security challenges. The opening slide from Andy says it all:
Makes you proud to be an American, doesn’t it? 🙂
My session was in the afternoon, and we not only had excellent discussions on how to empower database environments with virtualization, but I even did a few quick demonstrations of the ease of cloud management with AWS and Oracle…yes, to SQL Server DBAs. It was interesting to see how easy the interface made it to manage Oracle. I performed all validations of data refreshes from the command line, so there was no doubt that I was working in Oracle, yet the refreshes themselves were done in AWS with the Delphix Admin console.
I made it through the last session on the introduction to containers with SQL Server, which included a really interesting demonstration of a SQL Server container sans an OS installation, allowing it to run with very limited resource requirements on a Mac. After this session was over, I was thankful that two of my fellow presenters were willing to drop me off at my hotel and I promptly collapsed in slumber, ready to return home. I was sorry to miss out on the after event dinner and drinks, but learned that although I love Iceland, a few days and some extra recovery time may be required.
There are a lot of people and companies starting to push the same old myth regarding the death of the database administrator role in companies. On the Oracle side, it started with the release of Oracle 7, and now it’s being proposed again with the introduction of the cloud. Hopefully this post will help ease the minds of those out there with concerns. There are a number of OBVIOUS reasons this is simply not true, but I’m going to write a few posts over the next year on some of the less obvious ones that will ensure DBAs stay employed for the long haul.
The first, and to some less obvious, reason that DBAs will continue to be a necessary role in Information Technology in the cloud era is that almost all databases use a cost-based optimizer (CBO).
I’m not going to go into when it was introduced in the different platforms, but over 90% of database platforms used in the market today have a CBO. It grants the database the ability to make performance decisions based on cost rather than strict rules, granting (in theory, and in most instances) better performance.
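As a rough illustration (a toy model, not any real optimizer’s arithmetic), “cost-based” means comparing estimated costs of access paths instead of following a fixed rule:

```python
# Toy cost-based optimizer sketch: pick the access path with the lower
# estimated cost, instead of the rule "an index exists, so use it."

def full_scan_cost(total_blocks: int) -> float:
    """Cost of reading every block of the table once."""
    return float(total_blocks)

def index_scan_cost(rows: int, selectivity: float, blevel: int = 2) -> float:
    """Rough cost: index traversal plus one block visit per matching row."""
    return blevel + rows * selectivity

def choose_access_path(rows: int, total_blocks: int, selectivity: float) -> str:
    if index_scan_cost(rows, selectivity) < full_scan_cost(total_blocks):
        return "INDEX RANGE SCAN"
    return "FULL TABLE SCAN"

# A highly selective predicate favors the index; a low-selectivity one
# favors the full scan, even though an index is available in both cases.
print(choose_access_path(rows=1_000_000, total_blocks=20_000, selectivity=0.0001))
print(choose_access_path(rows=1_000_000, total_blocks=20_000, selectivity=0.5))
```

The whole point, and the whole problem, is that the choice depends on the quality of those estimates, which is where statistics come in below.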
There was an interesting thread on Oracle-l about an IO hit in an EBS environment due to extended statistics. The conversation linked to Jonathan Lewis’ blog, which offers some incredibly interesting investigations into adaptive plans, along with other posts on configuration recommendations and bugs involving extended statistics.
With the introduction of the CBO, the DBA was supposed to have less to worry about in the way of performance. The database would gather statistics automatically, then use them, along with the type of process, kernel settings and parameters, to make intelligent decisions without human intervention. This capability allowed the engine to take advantage of advanced features beyond simple rules (if an index exists on the WHERE clause columns, then use it, etc.)
Some CBOs perform with more consistency than others, but often the question of why a database chose a plan is lost on the DBA due to the complexity of those decisions. The one thing the DBA thought they could count on was the engine using up-to-date statistics on objects, calls and parameters to make its choice. DBAs began to tear apart the algorithms behind every table/index scan, the cost of each process, and the limits of each memory and IO feature. As their knowledge increased, IT shops became more dependent on their skills to take the CBO to the level required to ensure customers received the data they needed, when they needed it. We learned when to ignore the cost on a query or transaction and how to force the database to choose the better plan.
I am a database administrator who HATED Oracle dynamic sampling, and I still find the cost far outweighing the benefit. There were few cases where it served a DBA like me, with strong CBO and statistics knowledge; having Oracle make choices for me (especially with SQL that already contained carefully controlled hints) drove me to find ways to disable it any way I could. I had dreams of the feature maturing into something that would serve my needs, instead of waking me from those dreams to address another challenge where none should have been present.
If you managed as many multi-TB databases as I did, extensive dynamic sampling, especially on large objects, could come back to haunt you. I performed a number of traces on processes where an Exadata was being accused of a configuration problem when, in truth, it was 8 minutes of dynamic sampling out of 9 minutes of db time. In each instance, I proved dynamic sampling was to blame via trace file evidence, and in each instance, the developers and application folks involved asked why dynamic sampling was even considered a feature. I did see the feature’s uses and benefits, but rarely for the very large databases I managed.
The next logical step in Oracle’s mind for enhancing features like dynamic sampling was to add Adaptive Plans, another feature Oracle introduced to benefit query and transactional performance. The idea is to let the plan adapt to the run in question, but if you’ve read the thread and the links included in the first part of this post, you’ll know that it often performs less than optimally.
In the end, on-prem databases required extensive knowledge of internal database workings and metrics, plus strong research skills, to guarantee the most consistent performance from any enterprise database engine.
All DBAs have experienced the quick-fix solutionist, (not even a word, but I’m making it up here!) who would make recommendations like:
“Oh, it’s eating up CPU? Let’s get more/faster CPU!”
“I/O waits? Just get faster disk!”
“We need more compute? Just throw more at it!”
As DBAs, we knew that this was a quick and, honestly, temporary fix. To quote Cary Millsap, “You can’t hardware your way out of a software problem.” It’s one of my favorites, as I often found myself explaining why adding hardware was only a short-term solution. To answer why it’s short-term, we have to ask ourselves, “What is the natural life of a database?”
A database degrades over time, whether in design, processes, users or code (especially poorly written code). If you didn’t correct the poor foundation causing the heavy usage on the system by making it run more efficiently, you would only find yourself in the same place in six months, or two years if you were lucky, explaining why the “database sucks” again. That correction required research, testing and traditional optimization techniques, not enabling the database by granting it more resources to eat up in the future.
Consider that, at a very high level, any cloud is really just running all of these same product features and database engines on somebody else’s computer. How does that bypass the complex features that required expertise to manage?
Unlike initial project startups or quick development spin ups, do we think companies are just going to continue to pay for more and more compute and IO?
I would be willing to bet it’s more cost effective to have people who know how to do more with less. At what point does the graph of price vs. demand hit the point where having people who know what they’re doing with a database makes a difference? I think that threshold is a lot lower than many companies assume when they claim, “You won’t need a database administrator anymore, just standard administrators and developers!”
Tell me what you think!
No, this isn’t a title for a future Star Wars movie, but our own future, foreseen by me, (as well as many others) from experience, research and discussions everyday.
No, it’s not this dark and menacing…no sith lords.
We know who the main players in the current cloud arena are and how much they hold of the cloud market.
Many are betting that they can make a dent in that market and as much as it looks like some companies have the cloud all “wrapped up”, it may not be as clean a win as you might think.
Most companies foresee having one primary cloud vendor, but what’s true today may not be true tomorrow. As a DBA, I was told over and over again, “We’re hiring you just for your Oracle skills. We won’t have any need for your SQL Server or other database platform skills.” Within six weeks, a mission-critical system would be discovered running on another database platform, and my skills were needed, first to recover from whatever cataclysmic situation had occurred, and then to centralize its management under IT.
How many of you want to take bets on this happening with the cloud? IT is often viewed as a roadblock, so when the business needs something, it will find a way to get it. Historically, this meant putting a server under someone’s desk and purchasing or developing the product outside of the IT department. Now, with the ease of the cloud, someone will simply create what they need, hosted in the cloud; it will become critical to the business at some point in the future, and then the IT organization will need to take responsibility for it, secure it and manage it.
This leaves IT folks with some new challenges. Instead of having to consolidate to company standards for servers or migrate databases or data centers, they will have to migrate between clouds.
This type of need, along with a demand for business migrating into the cloud, will create cloud price wars. They will be very similar to what we’re experiencing with our mobile providers, first introduced among the big four providers when T-Mobile did away with contracts and transfer fees. Verizon, Sprint and AT&T were quick to follow with their own versions to entice customers and make it easier to move from one provider to another.
This is another reason why I’m at Delphix. I see how important it’s going to be for us to help customers to:
Although Amazon and Azure are rulers of the roost today, there are other companies that may be trailing in the arena that may rule it tomorrow. If there’s one thing we know is constant, it’s change. There was a time when we all laughed at the geeks and their smartphones, yet now we all own one. I wouldn’t count anybody out of the race yet and it might be pertinent to start betting on those that enable those in the race.
Ah, yes, it’s that RMOUG Training Days time of the year again!
As a techie, I didn’t put a lot of time into my slides when I first started presenting, thinking I’d simply dazzle them with my amazing knowledge.
What I found is that slide deck skills don’t just help your presentation, they help you tell the story better, which should be the goal of every presenter. No matter how technical we are, most of us have never had formal PowerPoint training, and as I’ve been upping my “PPT skills” this last year, I thought I’d share with the rest of the class…:)
Presenter View is something that would have really helped me out when I first started. Not having to remember every little detail, and having my notes displayed at the bottom of my screen, would have been a stellar advancement.
Having notes at the bottom upped my presentation game, as I found I removed much of the text on my slides and instead used reference links to blog posts, articles and white papers to fully engage the attendee in my sessions vs. receiving minimal thoughts displayed on a slide.
I found that I moved to more full-screen graphics with a single phrase and was able to simply talk with the audience, my notes keeping me on topic, without a slide full of discussion points that were more for me than for those in attendance.
A picture is worth a thousand words, but a moving picture? That’s a whole story and the reason people go to movies, watch TV and why even adults like cartoons. We can display an idea with movement more completely than a still image.
PowerPoint makes animations quite simple; it just takes the ability to group, highlight and animate with the animation toolbar:
All the basic, and even a number of advanced, animations are available to make your presentations pop, along with the ability to create a complex animation to get your point across!
Why is it great to have every other or every third slide a full screen graphic? It gives the audience a rest between the data they’ve been taking in and with technical or complex topic presentations, your attendee is likely to appreciate the break.
No matter if you’ve just discussed the algorithm used for a specific data masking feature or like the slide above, discussing networking for geeky introverts, a slide like this makes an impact and the audience will more likely remember intricate details when combined with a humorous phrase and memorable picture.
Always use stock images, and if you end up using an image from someone else’s site, ask permission and give credit.
I used to update all my slides to a new template by copying them over in slide view, but now I know that I can switch the deck to a new template and switch individual slides to the new template’s layouts with the Layout menu in PowerPoint.
Having more control over the layout, everything from text to graphics, saves me from manually updating everything on each slide. If there’s a common format you use, you can make a copy of the template and save it off for future use, too.
Well, it’s time for me to get more tasks in preparation for the conference done! We’re looking forward to seeing everyone at RMOUG Training Days 2017!
Wednesday: Introduction for Connor McDonald from Oracle, 2017 Keynote- 9:45am, Main Ballroom
Lunch with the Experts- 12:30pm, Main Ballroom
Women in Technology Round Table with Komal Goyal and Rene Antunez– 1:15pm, room 1A
Social Media, the Next Generation with Rene Antunez– 4pm, room 1A
Welcome Reception Host- 5:30pm in the Main Ballroom
Thursday: Delphix Hands on Lab- 9:00am in the OTN Area
Lunch with the Experts– 12:30pm, Main Ballroom
Closing Session for the OTN Area– 2:00pm
Virtualization and the Cloud– 4:00pm- room 4F
I can be found at the Registration Desk or in the OTN area, doing interviews and taking pictures and videos throughout the rest of the conference!
I’ve been at Delphix for just over six months now. In that time, I worked with a number of great people on initiatives surrounding competitive intelligence, the company roadmap and some new projects. With the arrival of our new CEO, Chris Cook, new CMO, Michelle Kerr, and other pivotal hires within this growing company, it became apparent that we’d be redirecting our focus toward Delphix’s message and connections within the community.
I was still quite involved in the community, even though my speaking had been trimmed down considerably by other demands at Delphix. Even though I wasn’t submitting abstracts to many of the big events as I had in previous years, I still spoke at 2-3 events each month during the fall and made clear introductions into the Test Data Management and Agile communities, along with a re-introduction into the SQL Server community.
As of yesterday, my role was enhanced so that evangelism, which was previously 10% of my allocation, is now going to be upwards of 80% as the Technical Evangelist for the Office of the CTO at Delphix. I’m thrilled that I’m going to be speaking, engaging and blogging with the community at a level I’ve never done before. I’ll be joined by the AWESOME Adam Bowen, (@CloudSurgeon on Twitter) in his role as Strategic Advisor and as the first members of this new group at Delphix. I would like to thank all those that supported me to gain this position and the vision of the management to see the value of those in the community that make technology successful day in and day out.
I’ve always been impressed with organizations that recognize the power of grassroots evangelism in the industry. What will Adam and I be doing? Our CEO, Chris Cook, said it best in his announcement:
As members of the [Office of CTO], Adam and Kellyn will function as executives with our customers, prospects and at market facing events. They will evangelize the direction and values of Delphix; old, current, and new industry trends; and act as a customer advocate/sponsor, when needed. They will connect identified trends back into Marketing and Engineering to help shape our message and product direction. In this role, Adam and Kellyn will drive thought leadership and market awareness of Delphix by representing the company at high leverage, high impact events and meetings. 
As many of you know, I’m persistent, but rarely patient, so I’ve already started to fulfill my role and be prepared for some awesome new content, events that I’ll be speaking at and new initiatives. The first on our list was releasing the new Delphix Trial via the Amazon Cloud. You’ll have the opportunity to read a number of great posts to help you feel like an Amazon guru, even if you’re brand new to the cloud. In the upcoming months, watch for new features, stories and platforms that we’ll introduce you to. This delivery system, using Terraform, (thanks to Adam) is the coolest and easiest way for anyone to try out Delphix, with their own AWS account and start to learn the power of Delphix with use case studies that are directed to their role in the IT organization.
So Brent Ozar’s group of geeks did something that I highly support: a survey of data professionals’ salaries. Anyone who knows me knows I live by data and I’m all about transparency. The data from the survey is available for download from their site, and they’re encouraging developers to download the Excel spreadsheet of the raw data and work with it.
Now I’m a bit busy with work as the Technical Intelligence Manager at Delphix and a little conference that I’m the director for, called RMOUG Training Days, which is less than a month from now, but I couldn’t resist the temptation to load the data into one of my XE databases on a local VM and play with it a bit...just a bit.
It was easy to save the data as a CSV and use SQL*Loader to load it into Oracle XE. I could have used BCP and loaded it into SQL Server, too (I know, I’m old school), but I had a quick VM with XE on it, so I grabbed that to give me a database to query. I did edit the CSV, removing both the “looking” column and the headers. If you choose to keep them, make sure you add the “looking” column back into the control file and update “options ( skip=0 )” to “options ( skip=1 )” so the column headers aren’t loaded as a row in the table.
The control file to load the data has the following syntax:
--Control file for the salary data
options ( skip=0 )
load data
infile 'salary.csv'
into table salary_base
fields terminated by ',' optionally enclosed by '"'
( TIMEDT DATE "MM-DD-YYYY HH24:MI:SS"
, SALARYUSD
, COUNTRY  -- the survey CSV's third column; needed so the remaining fields line up
, PRIMARYDB
, YEARSWDB
, OTHERDB
, EMPSTATUS
, JOBTITLE
, SUPERVISE
, YEARSONJOB
, TEAMCNT
, DBSERVERS
, EDUCATION
, TECHDEGREE
, CERTIFICATIONS
, HOURSWEEKLY
, DAYSTELECOMMUTE
, EMPLOYMENTSECTOR
)
and the table creation is the following:
create table SALARY_BASE (
  TIMEDT            TIMESTAMP not null,
  SALARYUSD         NUMBER not null,
  COUNTRY           VARCHAR(40),
  PRIMARYDB         VARCHAR(35),
  YEARSWDB          NUMBER,
  OTHERDB           VARCHAR(150),
  EMPSTATUS         VARCHAR(100),
  JOBTITLE          VARCHAR(70),
  SUPERVISE         VARCHAR(80),
  YEARSONJOB        NUMBER,
  TEAMCNT           VARCHAR(15),
  DBSERVERS         VARCHAR(50),
  EDUCATION         VARCHAR(50),
  TECHDEGREE        VARCHAR(75),
  CERTIFICATIONS    VARCHAR(40),
  HOURSWEEKLY       NUMBER,
  DAYSTELECOMMUTE   VARCHAR(40),
  EMPLOYMENTSECTOR  VARCHAR(35)
);
I used Excel to create some simple graphs from my results and queried the data from SQL Developer, (Jeff would be so proud of me for not using the command line… :))
Here’s what I queried and found interesting in the results.
The database flavors we work on may be a bit more diverse than most assume. This one was actually difficult to aggregate, as the field allowed free-form entry and there were misspellings, mixes of capital and small letters, etc. The person who wrote “postgress”, yeah, we’ll talk… 🙂
The data was still heavily skewed toward the MSSQL crowd. Over 2,700 respondents listed SQL Server, and only 169 listed another primary database platform, with Oracle the majority of those:
Now, the important stuff for a lot of people is the actual salary. Many folks think that Oracle DBAs make a lot more than those who specialize in SQL Server, but I haven’t found that to be true, and as this survey demonstrated, the averages were pretty close here, too. No matter if you’re Oracle or SQL Server, we ain’t making as much as that Amazon DBA…:)
Many of those who filled out the survey haven’t been in the field that long (less than five years), but there’s still a considerable number of folks who’ve been in the industry since its inception.
Of the 30% of us who don’t have degrees in our chosen field, most stopped after a bachelor’s to find our path in life:
There are still a few of us (just under 200) out there who accumulated a lot of school loans getting a master’s or a doctorate/PhD before figuring out that tech was the place to be…:)
The last quick look I did was to see, by country, the top and bottom average salaries for DBAs:
Not too bad, Switzerland and Denmark… 🙂
I wish there had been more respondents to the survey, but I’m very happy with the data that was provided. I’m considering doing a survey of my own, just to get more people from the Oracle side, but until then, here’s a little something to think about as we prep for the new year and another awesome year in the database industry!