• DBA Life - Oracle

    Blogger Registration Complete for OOW 2011

    Thanks to Chet Justice of OracleNerd, I’m now registered for Oracle Open World 2011 with the Blogger’s registration.  I will be sitting here impatiently waiting for October now… 🙂 Anyone who remembers last year’s event will also remember that I (again) did not get to attend and worked miserably through the week, listening to the tweets on Twitter and blurbs on Facebook, then was reminded on blogs afterwards of all I had missed.  Not this year!  I am going and I am going to enjoy it, so try and stop me… 🙂

  • Oracle

    Oracle Streams and Performance Considerations

    Although Streams is a goner (GoldenGate taketh over…), my notes on performance considerations from working with Streams are still valuable.  These came from a number of blogs, notes from Oracle and simple grief, so I thought I would share… 🙂 1. TYPE OF DATA REPLICATION Due to the way this table is loaded, we need to be conscious of “queue spill”. Queue spill, per Oracle’s support documentation, occurs when LCRs are not processed from the buffered queue in a timely fashion per interval, causing older LCRs to “spill” from memory out to disk…
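
    As a rough way to see whether queue spill is actually occurring, a query like the following against V$BUFFERED_QUEUES can help. This is a sketch, not from the original post; the view and its columns are per the 10g/11g documentation, and availability can vary by version.

```sql
-- Sketch: compare in-memory messages with messages that have spilled
-- to disk for each buffered Streams queue.
SELECT queue_schema,
       queue_name,
       num_msgs,     -- messages currently in the in-memory buffered queue
       spill_msgs    -- messages that have spilled from memory to disk
  FROM v$buffered_queues;
```

    A steadily growing SPILL_MSGS count is the usual sign that the apply side is not keeping up with capture.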

  • Oracle

    Reasons Behind Collecting SQL Baselines Before an 11G Upgrade

    Jeff Smith, via Twitter, asked a very good question about my previous post regarding baseline plans when upgrading from 10g to 11g, (I can also go into the steps for ensuring more consistent performance when upgrading to 11g from earlier versions as well…) “…this assumes that new 11g plans wouldn’t be ‘better’..right? Are people gun-shy b/c 9-10 upgrade issues?” Why would we want to go back to 10g performance when we have just upgraded to a beautiful new version of 11g? Many responded in the Twitter conversation with regard to insurance, and this…

  • Oracle

    Consistent Performance for 10g Upgrades to 11g

    When involved in an 11g upgrade of a 10g database, one of the important goals for a DBA is to maintain the same performance in the newly upgraded environment as in the previous one. One of the best ways to accomplish this is to collect baselines of the execution plans, which can then be imported and loaded directly into the SQL Plan Baseline once the database is upgraded. One disclaimer needs to be added: you must have a license for the Tuning Pack, so always check your DBA_FEATURE_USAGE_STATISTICS view and ensure you are in compliance. The steps to complete this are…
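
    The flow described above can be sketched roughly as follows. This is one documented path (capture into a SQL Tuning Set on 10g, stage it across, load it into SPM on 11g), not necessarily the exact steps from the post; names such as MY_STS and SPM_STG are placeholders.

```sql
-- On the 10g source: create a SQL Tuning Set and capture plans from the
-- cursor cache over a representative workload window.
BEGIN
  DBMS_SQLTUNE.CREATE_SQLSET(sqlset_name => 'MY_STS');
  DBMS_SQLTUNE.CAPTURE_CURSOR_CACHE_SQLSET(
    sqlset_name     => 'MY_STS',
    time_limit      => 3600,   -- capture for one hour
    repeat_interval => 300);   -- poll the cursor cache every 5 minutes
END;
/

-- Pack the STS into a staging table, then move it (exp/expdp) to 11g.
BEGIN
  DBMS_SQLTUNE.CREATE_STGTAB_SQLSET(table_name => 'SPM_STG');
  DBMS_SQLTUNE.PACK_STGTAB_SQLSET(
    sqlset_name        => 'MY_STS',
    staging_table_name => 'SPM_STG');
END;
/

-- On the upgraded 11g database: unpack, then load into the SQL Plan Baseline.
DECLARE
  n PLS_INTEGER;
BEGIN
  DBMS_SQLTUNE.UNPACK_STGTAB_SQLSET(
    sqlset_name        => '%',
    replace            => TRUE,
    staging_table_name => 'SPM_STG');
  n := DBMS_SPM.LOAD_PLANS_FROM_SQLSET(sqlset_name => 'MY_STS');
END;
/
```

    Plans loaded this way are marked accepted, so the 11g optimizer will prefer the proven 10g plans until evolved alternatives are verified.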

  • Oracle

    AWR for RAC

    Another little-known set of AWR reports are the RAC-specific, cluster-wide versions, which give the RAC DBA a unique view of the cluster that no individual instance AWR report can provide. The first report can be run from the command line, resides in the $ORACLE_HOME/rdbms/admin directory and is called awrgrpt.sql.  Like the other versions of AWR, the report is offered in both HTML and text formats, so when your manager or a user would like something formatted and presented attractively, there is an option. The report breaks down the cluster by…
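
    Running it looks much like the single-instance awrrpt.sql. A sketch of the invocation, assuming a SQL*Plus session on any instance in the cluster:

```sql
-- The script prompts for report type (html or text), the number of days
-- of snapshots to list, the begin/end snapshot IDs, and a report name.
@$ORACLE_HOME/rdbms/admin/awrgrpt.sql
```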

  • ASH and AWR - Oracle

    For the Love of AWR and ASH…

    Yes, I love AWR and ASH reports when performing database detective work.  They are an essential tool in my performance tuning arsenal, along with the logs, trace files/profiling, Oracle Enterprise Manager and a suite of personally collected queries from over the years.  As complex as databases are these days, the more data I have at hand to back up what I am tuning or reporting on, the more comfortable I am, (OK, so I’m not one for “theories”… :)) Just as I would not use a screwdriver to hammer in a nail, all DBAs should understand and not confuse…

  • Oracle

    The Tao of a Database Outage

    As many companies judge a DBA’s worth on the uptime of a database, nothing tests a DBA’s mettle like database service outages where the cause is indeterminate at the time of the outage.  The secondary challenge is that once the database is brought back up, the DBA must then take the necessary steps to analyze the data and figure out what caused the outage, hopefully resolving the issue or reporting on it to ensure outages do not recur.  The DBA’s goal should not just be to resolve the issue, but to provide a high-level, clearly documented report of what occurred and what steps…

  • DBA Life - Oracle

    RMOUG 2011- Looking Good!

    My slides and white paper are all submitted for RMOUG 2011 and I’m starting to get a bit anxious about the event.  Being the “hometown girl” representing Pythian comes with some new pressures that I didn’t feel last year, when total failure just meant I was making a fool out of myself… 🙂 The schedule is looking pretty good so far, but as it goes every year, there are a number of presentations that I would like to attend that are all scheduled at the same time as my own, (OK, so I would have only been able to attend ONE of them, but I…

  • Oracle

    Index Reviews and Fruit Baskets

    I received an email from an obviously annoyed DBA this weekend.  I’m sure many of you are not surprised; I can be rather annoying. The annoyance stemmed from a request I was working on to rebuild a number of bitmap indexes.  The problem is, when I take on a problem, I have a tendency to demand to do things the right way, so a simple request to check bitmap indexes for fragmentation always turns into a somewhat larger project with me at the helm. When it comes to indexes, it’s not just a simple rebuild.  I want the index JUSTIFIED. …
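
    One way to justify an index before spending rebuild time on it is Oracle’s built-in index usage monitoring. A sketch, with placeholder owner and index names (not from the original post):

```sql
-- Turn on usage monitoring for a suspect bitmap index.
ALTER INDEX app_owner.bmx_order_status MONITORING USAGE;

-- After a representative workload window, check whether it was ever used.
-- Note: V$OBJECT_USAGE shows indexes owned by the connected schema.
SELECT index_name, monitoring, used, start_monitoring
  FROM v$object_usage
 WHERE index_name = 'BMX_ORDER_STATUS';
```

    An index that shows USED = 'NO' across a full business cycle is a candidate for dropping rather than rebuilding.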

  • Oracle

    Ugliest Query In the World

    This is an Apex query, and it no longer exists in the environment it once resided in.  When I first came across it, the first red flag was the optimizer cost of 109 billion, (yeah, I had to count the digits a couple of times in disbelief…)  I was reminded of it when Chet from OracleNerd tweeted about a poorly written Apex query and I had to share.  All names and identities have been changed to protect the innocent, (and more likely, the guilty…) Enjoy! UgliestQuery!

  • Oracle

    No Statspack Snapshot for You!

    This Statspack bug is an older one, but I ran into it again this week. The workaround to correct the problem is very simple, but it’s surprising how disconcerting it can be for anyone seeing “unique constraint violated”. Seen in the alert log:

    Tue Dec 14 09:00:43 2010
    Errors in file <dir>/xxxx_j000_xxxx.trc:
    ORA-12012: error on auto execute of job xxx
    ORA-00001: unique constraint (PERFSTAT.STATS$MUTEX_SLEEP_PK) violated
    ORA-06512: at “PERFSTAT.STATSPACK”, line 5264
    ORA-06512: at “PERFSTAT.STATSPACK”, line 104
    ORA-06512: at line 1

    Pulled from the trace file listed in the alert log, <dir>/xxxx_j000_xxxx.trc: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0…
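
    For this particular constraint violation, the commonly cited workaround is simply to clear the offending PERFSTAT table so the next snapshot can insert cleanly. This is a sketch based on that commonly referenced fix, not necessarily the exact workaround from the post; verify against the MOS note for your version before running it.

```sql
-- Clear the table behind the violated primary key; the next Statspack
-- snapshot repopulates it. Data in this table is diagnostic only.
TRUNCATE TABLE perfstat.stats$mutex_sleep;
```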

  • Oracle

    The Problem with the Automated Statistics Gathering

    Any DBA worth the time of day knows how important solid statistics are to a database.  They are how the cost-based optimizer makes sound decisions about the best path to take: whether to perform a table scan or use an index, how many resources a task will require, and so on and so forth. In any database built on 10g Release 2 or later, there are maintenance jobs created by default, one of them to collect statistics automatically, with the goal of ensuring that the database has the newest, most updated…
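
    To see what the default job is doing in your own database, queries along these lines can help. A sketch: GATHER_STATS_JOB is the 10gR2 scheduler job name (11g moved this under the auto-task framework), and DBA_TAB_STATISTICS shows which tables the job currently considers stale.

```sql
-- When did the default statistics job last run, and is it enabled?
SELECT job_name, state, last_start_date
  FROM dba_scheduler_jobs
 WHERE job_name = 'GATHER_STATS_JOB';

-- Which tables does the automatic job consider stale right now?
SELECT owner, table_name, last_analyzed, stale_stats
  FROM dba_tab_statistics
 WHERE stale_stats = 'YES';
```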

  • Oracle

    But…. I Collected Stats!–Updated!!

    *****After much ado, I’ve finally returned to this post to rewrite what concerned me and befuddled others… :)***** I’ve worked with some great developers.  Developers who can write code in their sleep to perform the most complex functions and aggregations, (makes me quite envious, as I need to actually think about how to write code… :))  Most of these developers are still tripped up by the Oracle Cost Based Optimizer, (CBO) and honestly, I think they should be.  Just as I have to think about how to code, it should be expected that they need to think about how the CBO works. …

  • Oracle

    Follow up: When an RMan Clone Goes Bad

    A follow-up to this previous post.  I thought I had this one down: I had followed the standard protocol for when you receive a 600 error.  I looked it up on MOS and had my bug number, documentation and proof of what the bug was, right? Gotta love being a DBA: the day you think you have it figured out, something comes up to surprise you and you learn something new.  ORA-00600: internal error code, arguments: [kcvhvdf_1], [], [], [], [], [], [], [] An odd version of bug 9314439: RECOVERY FAILS ON CLONED DATABASE ORA-600 [KCVHVDF_1], which is an 11.1.0.7 bug, but I…

  • Oracle

    When RMAN Cloning Goes Bad…

    One of the options for cloning an environment with limited disk space involves creating a parameters file for the RMAN clone memory script that contains the “skip tablespace” feature.  This allows the DBA to retain a working list of tablespaces containing data that isn’t required in development/test or for other cloning purposes when performing the actual clone.

    startup auxiliary nomount;
    run {
    allocate auxiliary channel c1 device type disk;
    …so on and so forth for all the channels desired…
    duplicate target database to SKIP TABLESPACE USERS,TBL_DATA1,TBL_INDX1,….;
    }

    This can also be extremely helpful in environments where disk…
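
    For context, a filled-in sketch of the shape such a script takes; the target name devdb, the channel count, and the tablespace list are placeholders, not from the post.

```sql
-- RMAN, connected to target, catalog (optional), and auxiliary instances.
startup auxiliary nomount;
run {
  allocate auxiliary channel c1 device type disk;
  allocate auxiliary channel c2 device type disk;
  duplicate target database to devdb
    skip tablespace USERS, TBL_DATA1, TBL_INDX1;
}
```

    Tablespaces named in SKIP TABLESPACE are simply never restored into the auxiliary database, which is where the disk savings come from.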

  • Oracle

    Simple Reporting Without Materialized Views

    A common initial resolution to reporting requirements without a reporting database is to utilize materialized views.  Unfortunately, the cost to refresh these views can be extremely expensive on a database, especially if it’s an OLTP that rarely has a lull in activity. A lesser-known Oracle feature is Change Data Capture, (a.k.a. CDC http://download.oracle.com/docs/cd/B10500_01/server.920/a96520/cdc.htm ) The CDC process utilizes a materialized view log on a source table, with a destination copy of the source table that can be kept in sync with the original at any interval chosen.  The result is often a much lighter footprint than an mview refresh,…
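
    The starting point of the approach described above is a materialized view log on the source table, so only changed rows need to move. A sketch, with placeholder table and column names (the full CDC publish/subscribe setup is beyond a teaser):

```sql
-- Record primary key plus changed column values for each DML on the
-- source table; downstream refreshes then apply only the deltas.
CREATE MATERIALIZED VIEW LOG ON app_owner.orders
  WITH SEQUENCE, PRIMARY KEY (status, amount)
  INCLUDING NEW VALUES;
```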

  • Oracle

    Transactional and Reporting

    In my years as an Oracle and SQL Server DBA, across the many technical environments I’ve worked in, there is almost always one database in each place that is a hybrid of OLTP and OLAP.  A highly transactional database that also has reporting requirements, but no separate reporting database.  This type of database is pretty much a guaranteed challenge for any DBA.  On one hand, the DBA has the application demanding micro-second response times for completion of transactions.  On the other, he or she has users wanting information about the data resulting from all those transactions.  A DBA’s best course for successfully taming these types of hybrid environments…

  • Oracle

    Parallel with 11G

    Sigh… If only I could live in “Uncle Larry’s” perfect database world, where I have more control over the choices made from the conception of the database’s life and know exactly what goals a development group has vs. Oracle. How much easier a DBA’s life would be if we could just say, “Sorry, Oracle’s new feature would work well only if you completely redesign the logic of your process in code A, B and C of our main systems, which have naturally grown in complexity over the years.  Of course, you don’t mind adjusting all priorities for the next six…