Why am I talking about this short article? First, because I like debuggers and indeed any system tools, such as compilers and decompilers, that can be used to delve deeper into the inner workings of your code. These are also the tools of hackers, so if you want to stay on top of securing your software (Oracle in this case) you need to know the tools that hackers may use and how they work. This will give you an edge.
The other point I want to talk about in this article is Brian's mention, at the start of it, of the use of print statements in code to find issues when debugging. Print statements can be used not just for debugging but also for instrumenting your code. Oracle themselves do this extensively with the wait interface. All the wait interface really is, is a set of print statements added to various function calls in the Oracle kernel. As the wait interface example shows, this method is useful because the workings of the kernel can be examined at run time in a production environment without the use of a debugger. A trace file can be produced that contains a huge amount of data about how the software did its work, and conclusions can be drawn as to what went wrong or how it could do its stuff better next time.
Contrary to popular belief, Oracle are not the only ones to add instrumentation to their products. I know of companies who have instrumented the whole of their application product. This was very useful for developers, consultants and customers, who could set trace at run time to analyse why a particular function behaved as it did or to investigate a particular bug. The instrumentation, like Oracle's wait interface, could be turned on or off at will.
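As a rough sketch of the idea (this is not Oracle's actual kernel code, and the function names are mine), wait-interface-style instrumentation amounts to little more than conditional print statements around the work, behind a flag that can be flipped at run time:

```c
#include <stdio.h>

/* Hypothetical sketch of wait-interface-style instrumentation: a
 * run-time flag plus conditional print statements.  Illustrative
 * only -- not Oracle's real implementation. */

static int trace_enabled = 0;     /* toggled at will at run time */
static int trace_line_count = 0;  /* how many trace lines we wrote */

void set_trace(int on)
{
    trace_enabled = on;
}

int trace_lines(void)
{
    return trace_line_count;
}

/* The "print statement": emit one line per event when trace is on.
 * In a real product this would go to a trace file, not stderr. */
void trace_event(const char *event, const char *detail)
{
    if (!trace_enabled)
        return;                   /* zero output when trace is off */
    fprintf(stderr, "TRACE %s: %s\n", event, detail);
    trace_line_count++;
}
```

The point of the flag is exactly the one made above: the cost and the output are both near zero until someone deliberately enables trace, so the instrumentation can live permanently in production code.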
I can see Brian's point that developers using print statements instead of a debugger is not an efficient way to debug code, but print statements also have their role in production systems for instrumentation - Oracle have made this case, and independently a previous employer of mine was also successful with this technique.
Other companies may use the same techniques to provide application audit facilities. This could be done in C for Pro*C or OCI programs, in PL/SQL for PL/SQL code or Forms, or indeed at any level in the application stack. The instrumentation could write records to the file system or to database tables.
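A minimal sketch of what such an audit record might look like at the C level (the record layout and function name are my own invention; a real facility would write the line to a file or insert it into an audit table):

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical audit-record formatter.  A Pro*C or OCI application
 * could build lines like this and append them to an audit file or
 * insert them into a database table. */
int format_audit(char *buf, size_t len, const char *user,
                 const char *action, const char *object)
{
    /* A simple delimited record: who did what to which object. */
    return snprintf(buf, len, "AUDIT|user=%s|action=%s|object=%s",
                    user, action, object);
}
```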
Now for a possible downside to using print statements for instrumentation, and here we also have a security angle. We know the risks of allowing users to enable Oracle SQL trace, or indeed any of the other events that generate trace files in Oracle, particularly dump files of memory from the Oracle kernel. If users can do this and read the produced trace files, then it's highly possible that structural information about the Oracle configuration, including password hashes or even passwords, can be learned, as well as application-specific details such as program code, SQL or the application's database structure.
What about instrumentation that provides useful logging and trace information about an application? It is highly feasible that developers, consultants, users and many others can enable this trace and generate trace files independently of the software supplier. It's also very feasible that they can gain access to the produced trace or log files and analyse them. Quite often these files will include function flow, SQL statements, the data used in the SQL (i.e. bind variables) and even internal data values. This information would be extremely useful to a hacker.
If your company has instrumented its applications (and I think the number of companies that do this is not trivial) then you need to ensure that this trace mechanism is protected and that users who are not authorised to turn it on cannot actually do so. You also need to ensure that the trace files, log files or even database tables that store the trace data cannot be accessed. Being able to access this type of data, whilst useful for support and maintenance, could breach some of the newer legal requirements such as Sarbanes-Oxley, GLB or HIPAA.
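Protecting the switch itself can be as simple as a guard in front of the enable call. A minimal sketch (the hard-coded list and function name are mine for illustration - in practice the authorised users would come from a protected configuration table or role check, never from the source code):

```c
#include <string.h>

/* Hypothetical guard: only named, authorised users may switch the
 * application trace on.  Illustrative only -- a real system would
 * look the list up from protected configuration, not a C array. */
static const char *trace_admins[] = { "SUPPORT1", "SUPPORT2" };

int may_enable_trace(const char *user)
{
    size_t i;
    for (i = 0; i < sizeof trace_admins / sizeof trace_admins[0]; i++)
        if (strcmp(user, trace_admins[i]) == 0)
            return 1;   /* authorised to turn trace on */
    return 0;           /* everyone else is refused */
}
```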
What if you are using fine grained access control (row level security - see my two part paper) to protect critical data and you also use instrumentation in the application source code? A user could enter a screen and key in critical data; that data would not be visible to anyone else and could not be viewed by others, as RLS protects it. But what if the instrumentation (trace) grabbed the values in the screen code, or in a PL/SQL package, before the database inserted the data, and then wrote those values to a trace file? The data you have protected with RLS could be sitting on a file system in a trace file, viewable by anyone.
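One way to avoid exactly this leak is to redact sensitive values before they ever reach the trace line. A minimal sketch, assuming the application knows which binds are sensitive (the function name and marker are my own):

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical redaction helper: mask a sensitive bind value before
 * it is written to a trace line, so RLS-protected data never lands
 * in a world-readable trace file.  Illustrative only. */
void trace_bind(char *out, size_t len, const char *name,
                const char *value, int sensitive)
{
    if (sensitive)
        snprintf(out, len, "bind %s = <redacted>", name);
    else
        snprintf(out, len, "bind %s = %s", name, value);
}
```

The trace file then still shows the function flow and which binds were used - enough for support work - without turning the trace directory into a copy of the protected data.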
Debuggers are a hacker's paradise, and instrumentation, whilst useful - in fact very useful in some cases - also needs to be protected and controlled. Access to enable trace needs to be protected. And above all, steps need to be taken to ensure critical data values are not written to trace files, otherwise features such as row level security could easily be bypassed.