Before autonomous data correction software met the mainframe, a day in my life as a DBA looked like this:

2 a.m. – Diagnose a critical maintenance utility failure for a panicked night operator, re-submit the REORG job, and head back to bed.
8 a.m. – Use a database tool to pull pertinent data for an emergency report on an internal customer's sales region.
9 a.m. – Use various database tools and review performance data to improve data access for developers alarmed that their application performance is slowly degrading.
12 p.m. – As lunch approaches, identify where I can save data for a scheduled backup after noticing unforeseen space problems, and successfully capture the backup.

There I was, a highly trained computer professional, troubleshooting random bugs and issues, sometimes in the wee hours of the morning.

The world changed for DBAs like me when database tools came on the market. Rudimentary at first, such software has blossomed over the past 20 years into autonomous tools that can adapt and intervene on a DBA's behalf to address an event before it becomes a problem.
Add to the equation the power and performance of the mainframe to keep such systems running at peak, and autonomous solutions can free up a skilled employee and change entire systems for the better.
How autonomous computing improves a DBA's life
There are five levels of autonomous computing:
Level 1 is no autonomy, and Level 2 has so little that significant human input is still required. At Level 3, things get interesting: the software can not only detect an event, it can also advise users on the solution. Level 4 lets the system both advise and make certain corrections. And at Level 5, we see autonomous fixes for detected events, with the ability to "watch and learn," adapting to changes in real time.
If all of this seems processor-intensive, it is.
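The five levels can be pictured as a single event-handling function whose behavior escalates with each level. This is a minimal illustrative sketch; the `AutonomyLevel` names, the `Event` shape, and the responses are assumptions for the example, not any vendor's actual API:

```python
# Hypothetical sketch of the five autonomy levels as an event handler.
# All names here are illustrative, not taken from a real product.
from dataclasses import dataclass
from enum import IntEnum


class AutonomyLevel(IntEnum):
    NONE = 1      # Level 1: no autonomy; a human does everything
    ASSISTED = 2  # Level 2: minor help; significant human input still required
    ADVISORY = 3  # Level 3: detects an event and advises on the solution
    PARTIAL = 4   # Level 4: advises and makes certain corrections itself
    FULL = 5      # Level 5: fixes autonomously and adapts ("watch and learn")


@dataclass
class Event:
    name: str
    suggested_fix: str


def handle(event: Event, level: AutonomyLevel) -> str:
    """Return what the system does with a detected event at each level."""
    if level <= AutonomyLevel.ASSISTED:
        return f"{event.name}: logged; DBA must diagnose and fix"
    if level == AutonomyLevel.ADVISORY:
        return f"{event.name}: detected; advise DBA to {event.suggested_fix}"
    if level == AutonomyLevel.PARTIAL:
        return f"{event.name}: detected; applied {event.suggested_fix}; DBA notified"
    return f"{event.name}: fixed via {event.suggested_fix}; model updated from outcome"


if __name__ == "__main__":
    ev = Event("space shortage during backup", "allocate extra volumes")
    for lvl in AutonomyLevel:
        print(f"Level {lvl.value}: {handle(ev, lvl)}")
```

The key shift happens between Levels 3 and 4: the system stops merely recommending and starts acting, with the DBA moving from executor to reviewer.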
Autonomous software produces its own share of log data, and it consumes precious processor resources as it listens for events upon which to act, true to its "watch and learn" raison d'être. And with the sheer volume of data passing through any business's systems, not just any processor can handle transactions at that scale.
With the advent of microservices running on the mainframe, each monitoring specific sensors for specific conditions, the future is actually here for DBAs, and the future is the mainframe. Suddenly, a DBA's day can look like this:

6 a.m. – The DBA reviews the previous night's batch window activity via a mobile UI; all problems are already solved.
8 a.m. – An internal customer pulls their own report with an interactive web UI and automated assistance.
9 a.m. – The DBA is notified via an interactive web UI of measures taken to avoid performance problems on business-critical objects.
12 p.m. – The DBA is notified of measures taken to avoid a space outage during a routine data backup.
All day long – The DBA is free from tedious troubleshooting and can focus on higher-value, more business-critical tasks.

The whole reason we build machines is to free up human capital and focus our precious time on solving bigger problems. Does autonomic software mean the days of the DBA are numbered? Absolutely not. Instead of being reactive, DBAs of the future will be proactive, able to bend their training and creativity toward building better, smarter, and more efficient systems. And mainframe database administrators know the secret sauce is the mainframe itself. With the muscle and processing power of the mainframe, autonomous troubleshooting software can handle the drudgery so the DBA can plug into their human ingenuity.