
That’ll Be Fun

I have a t-shirt that says, “Underestimate me, that’ll be fun” on the front of it.  This statement pretty much sums up my life and I continue to both impress and annoy people, which means I’m living up to it.

The session evaluations came out for PASS Summit and I was surprised by some of the reviews I received.  The opportunity to be underestimated is kind of expected.  I’m pretty comfortable with my skills as a presenter, but it is still new to me to not be as well known in this speaking community as I was on the Oracle side.  There is an additional thrill to it, as there is less expectation of who I am and I’m graded just like everyone else going into a presentation.

Summit Sessions

I presented two technical sessions: one on DevOps and a second on GDPR.  As these were both topics I focused on in my previous job, I had no issue speaking on either, but only DevOps is incorporated into my new role, and for a very specific goal.  I was thrilled to talk about how I’ve been able to bring DevOps into my new Analytics role and to share that DevOps skills come in handy, even in roles where it may not have seemed likely.

I built out the session with a use case story in mind.  I find a story resonates better with the audience and I give a better presentation.  I didn’t set an agenda for the session going in – I just discussed what my challenge was in my new role and how I was addressing it with the DevOps skills I possessed.  I then did a demo of the automation, change control and processing I’d built so far, followed by the future of the project, which more heavily incorporates advanced Microsoft DevOps tools and automation, along with orchestration.  I thought it was a good session, but there were two things that held it back, and this is what I’m going to try to dig into.

As I read the reviews of my DevOps and Decoys session, I was intrigued by the varying scores, as well as the comments.  Of the 150+ attendees, over 70 filled out electronic evaluations, which was great.  There were lessons in these evaluations, as well as interesting data that I found important to break down as a speaker, a DevOps engineer and a woman in tech.  Knowing how to review your own evaluations objectively is an important skill for any speaker.

101 is Important

For those attending the session who expected a very dry run-through of what DevOps is, this wasn’t it, and I think people hoped I would do an introduction before I dove into what I was doing.  I had the time and I should have, plain and simple.

I forget how many misconceptions are still out there when it comes to DevOps, and that for a few attendees, mine was the only DevOps session they came to see out of everything offered at Summit.  Lesson learned, and this was valuable feedback.

Wait for It

The few negative comments and scores I received came consistently from those who didn’t stay for the whole session.  I rarely think twice when anyone leaves in the middle of a session.  There’s a lot to see, a lot of places people have to be, and often overlapping schedules force people to leave, so it’s important not to take it personally.  I did receive feedback stating that some had left because they felt my session wasn’t going to cover DevOps and was just talking about automation.

This was the opposite reaction from those who did stay for the whole session and held out for the section of the talk on the future (after the demonstration), covering where the automation I’d built was going.  This is the natural evolution to DevOps: you automate, you create processes, you introduce version control and orchestration, and you continue until your integration and delivery mature into a DevOps environment.  I was hoping that is what attendees would take away from my session, as I often hear there’s a lack of examples of how you get from A to Z with DevOps.  I obviously missed this in how I presented the session at Summit.  I needed a clear agenda – another lesson to learn for this session.
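To make that evolution a little more concrete, here is a minimal sketch of the first rung: a single repeatable automation step that lives under version control and can later be handed to an orchestrator such as a pipeline or scheduler.  It’s written in Python purely for illustration; the script name, the refresh_analytics_data() task, the server, database and file paths are all hypothetical, not the actual automation demoed in the session.

# nightly_refresh.py - hypothetical example of the "automate first" step.
# The server, database and paths below are assumptions for illustration only.
import logging
import subprocess
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("nightly_refresh")

REPO_DIR = Path("analytics-scripts")    # scripts kept under version control (git)
REFRESH_SQL = REPO_DIR / "refresh.sql"  # the change-controlled artifact to run


def refresh_analytics_data() -> None:
    """Run the version-controlled SQL via sqlcmd (assumed to be installed)."""
    log.info("Starting refresh from %s", REFRESH_SQL)
    subprocess.run(
        ["sqlcmd", "-S", "localhost", "-d", "AnalyticsDB", "-i", str(REFRESH_SQL)],
        check=True,  # fail loudly so an orchestrator (pipeline/scheduler) can react
    )
    log.info("Refresh complete")


if __name__ == "__main__":
    # Step 1: automate.  Step 2: commit the script and SQL to git.
    # Step 3: have an orchestrator (e.g. an Azure DevOps pipeline) run it on a
    # schedule, and keep iterating until CI/CD is the norm.
    refresh_analytics_data()

The point isn’t the script itself; it’s that each small, version-controlled automation like this becomes a building block that the orchestration and delivery pieces can mature around.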

The Unknown Woman

PASS Summit is one of the only Microsoft events where I receive evaluations that contain consistent bias in the comments.  It’s commonplace at most tech events, but rare at Microsoft ones.  If you’re wondering how I’d recognize this – while in the Oracle community, I had access to speaker evaluations for years.  I noted patterns in the evaluation scores and comments and collected them for six years, categorizing them by men, women and “Speakers with Accents”.  You might think that third category would be better titled “speakers for whom English is a second language”, but I discovered that even those with Scottish, Australian or New Zealand accents, with English as their first language, were receiving the same pattern of comments on their evaluations, so I’m sticking with it.  It wasn’t rare for these speakers to receive evaluation comments suggesting they take classes beforehand on how to pronounce words as Americans do, or stating that the attendee walked out due to not understanding the speaker well enough.  We Americans are lazy listeners, we really are.  I’ve never received or seen these types of comments when speaking outside of the US.

Reviews of the men also had some odd patterns when the speaker being evaluated was competent and/or confident.  While all male speakers received comments about their topic knowledge, there were those who also received odd comments about room temperature, the quality of the projector, and even the break-snack options.

The women who present in the technical community are rarely poor speakers to begin with (in my opinion); being raised with fewer risk-taking skills means fewer women present at all, which hinders a greater breadth of speaker quality.  Almost every woman speaker I know receives comments regarding a lack of organization in her slides and presentation, speaking too fast, or her physical appearance.  Although seemingly more constructive, the recommendations on the direction they chose for their topic often aren’t helpful; they’re vague and sound disgruntled.  It takes a considerable amount of fearlessness, beyond what women are raised to have, to get up and speak in front of a technical audience.  Reviewing the comments submitted in evaluations, the discomfort more often appears to be on the audience members’ side – much of it lacking substance and constructive feedback.

In The Know

While speaking in the Oracle community, I was well enough known, and my masculine communication style lent itself to my receiving more of the comments commonly received by the men (room temperature, snack options, etc.).  I was one of only two women I knew who fell into the men’s pile of evaluations; the pattern was that prevalent.

With my DevOps session, this wasn’t so, and upon reflection, I experienced the same with last year’s Summit evaluations.  I found it fascinating that I was scrutinized for my choice of topic direction, my talking speed (I always talk fast, but I rarely, if ever, hear about it in evaluations) and my organization of material.  I started to review the data, curious as to what had placed me back into the category of “evaluations for a woman speaker”.

Persistence and Speaking Up

I’ve only been speaking in the Microsoft community for a year and a half, and this is only my second PASS Summit.  That’s a significant difference from the Oracle community, where I spoke for a decade and was one of the few women.  I was an ACE Director and am an Oak Table member.  I had solidified my standing in the Oracle community, whereas on the Microsoft side I’m one of many who work in my technical arena, and there are significantly more women in the Microsoft community.

This leads me to a theory: women who speak more at a given event, who do more and are a larger part of the community, change the bias we commonly deal with.  As I was well known in the Oracle community, my presence was expected rather than an outlier, and I received evaluations similar to the male speakers’ because of it.  That also leads me to recommend that women monitor their evaluations; as their presence becomes more expected, I expect their evaluations will improve even without added effort.  The more you’re known, the less of an outlier you are, and the less likely bias is to show up in your evaluations.

I’ll continue to monitor the data and see if patterns develop and/or change, but I also want to share this, as I think it’s important for those who have just started speaking to learn what to take of value from their evaluations and what not to take personally.  Know that it’s alright to be fascinated by evaluation data, even the absurd.  Sometimes you’ll just run into someone while they’re having a bad day or maybe they’re feeling under the weather (I won’t touch on the chance they’re hung over at Summit… :) ) and you need to just give everyone a pass (not that PASS).

With all of this, my GDPR session evaluations came back with incredibly high scores, almost perfect 5’s across the board.  I was humbled by this, as I felt it was a very dry topic and had hoped my personality would keep everyone awake on the last day of the event.  Namaste.

Kellyn

http://about.me/dbakevlar