Wednesday, September 2, 2009

My Barriers to Learning TDD

As a follow-up to my previous post declaring that I finally "get" TDD to some extent, I figured I'd reflect on the barriers that have made it difficult for me to learn how to unit test effectively and to truly understand the benefits of TDD practices.  Now, I am by no means claiming to be an expert in unit testing, mocking, or TDD.  I didn't open the refrigerator, drink some of the TDD-flavored Kool-Aid, and throw on a mortarboard to illustrate that I've somehow graduated into a new, higher level of software development.  I'm still learning from others as well as from my own errors and experiences, just like everyone else.  The purpose of this post is to reveal some of the issues that I ran into and hopefully provide some insight on how to overcome them.

 

The Examples Are Too Simple

This barrier was by far the most difficult for me to overcome.  I would read up on all of the buzz about how to add unit tests to your code and how to apply unit testing to TDD.  I would follow the examples, understand them, and be fine; however, I would always stall out whenever I tried to do it in a real-world scenario.  One day, I got extremely frustrated trying to fit all of this together, even though it looked like a square peg in a round hole to me at the time.  I began to wonder how unit testing in general would work on such complex real-world objects, and that's when it hit me.  Around this same time, I was diving into the S.O.L.I.D. principles of Object Oriented Design and began to count the dependencies I had in my code.  The more I broke the code out into less complex objects that followed the principles, the easier it was for me to map the unit test examples that previously seemed "too simple" onto my code.  The barrier wasn't so much that the examples were too simple as it was that the code I was trying to apply the lessons to was too complex.  On my last project, I didn't apply unit testing; however, I tried to adhere to the S.O.L.I.D. principles as much as possible.  When I later tried to retrofit tests into it, I was able to do so, and I also found places where I hadn't followed the S.O.L.I.D. principles as closely as I thought I had.  Looking back, I probably wouldn't have had any of the refactoring issues I ran into after the fact if I had used TDD to provide a test-first basis for the object designs.
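To illustrate the kind of change I'm describing, here is a minimal sketch (the OrderProcessor and ITaxCalculator names are hypothetical, not from any real project of mine): instead of a large class that creates everything it needs internally, the dependency is pulled out behind an interface and handed in through the constructor, which makes the class look a lot more like those "too simple" book examples.

```csharp
// Hypothetical example: the tax calculation is a separate dependency,
// so OrderProcessor can be tested without knowing the calculator's details.
public interface ITaxCalculator
{
    decimal CalculateTax(decimal subtotal);
}

public class OrderProcessor
{
    private readonly ITaxCalculator _taxCalculator;

    // The dependency is injected rather than created inside the class.
    public OrderProcessor(ITaxCalculator taxCalculator)
    {
        _taxCalculator = taxCalculator;
    }

    public decimal GetTotal(decimal subtotal)
    {
        return subtotal + _taxCalculator.CalculateTax(subtotal);
    }
}
```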

 

But...This Class Uses the Database

This was another barrier for me that partially stems from the previous one.  Looking over the majority of the Unit Testing and TDD books and blog posts out there, very few of them actually touch on dependencies as common as a database.  Looking at my own code at the time, I had repository classes that handled all of the CRUD operations against the database as well as returned the records as strongly typed collections (where necessary).  How do I test a class when there is such a deep dependency on the database?  I was truly perplexed by this issue and decided to ask the question on StackOverflow.  The responses led me to realize that my class was doing two things: calling the database AND transforming the returned data into the shape I needed.  By pushing the direct database calls into a separate class, I was able to mock away the database and make sure those particular classes could be tested.
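Here's roughly what that split looked like, sketched with hypothetical names (ICustomerDataGateway, CustomerRepository): the raw database access lives behind an interface, and the repository that shapes the data into strongly typed objects only talks to that interface, so a fake or mock can stand in for the database during tests.

```csharp
using System.Collections.Generic;
using System.Data;

// Hypothetical: raw data access hidden behind an interface...
public interface ICustomerDataGateway
{
    DataTable GetCustomersByRegion(string region);
}

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// ...while the repository only transforms what the gateway returns,
// so it can be tested with a fake gateway instead of a real database.
public class CustomerRepository
{
    private readonly ICustomerDataGateway _gateway;

    public CustomerRepository(ICustomerDataGateway gateway)
    {
        _gateway = gateway;
    }

    public IList<Customer> GetCustomersByRegion(string region)
    {
        var table = _gateway.GetCustomersByRegion(region);
        var customers = new List<Customer>();

        foreach (DataRow row in table.Rows)
        {
            customers.Add(new Customer
            {
                Id = (int)row["Id"],
                Name = (string)row["Name"]
            });
        }

        return customers;
    }
}
```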

Now, that only solved half of the issue.  I had abstracted the dependency out of those classes, but I still had data provider classes that interacted directly with the database.  This is where I truly learned the difference between Unit Testing and Integration Testing.  With an outside dependency like a database or a web service, I began to see examples that created separate, controlled instances of those dependencies in order to test the queries and CRUD functions.  The technique I use now for database integration tests is NHibernate, even if the primary project doesn't use it.  NHibernate has the ability to recreate the table schema for each test, which helps automate the state of the database from test to test.  An example of this is shown over at NHForge.org's How-To section under Your First NHibernate Based Application.  Once it's set up, it gives me an easy way to get past the database barrier.
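As a rough sketch of that approach (assuming NHibernate is configured through hibernate.cfg.xml with a mapping for a hypothetical Product entity, and using NUnit for the fixture), the setup rebuilds the mapped schema before each test so every test starts from a known, empty database:

```csharp
using NHibernate;
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;
using NUnit.Framework;

// Hypothetical entity; its NHibernate mapping file is assumed to exist.
public class Product
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

[TestFixture]
public class ProductPersistenceTests
{
    private Configuration _configuration;
    private ISessionFactory _sessionFactory;

    [TestFixtureSetUp]
    public void FixtureSetUp()
    {
        // Reads hibernate.cfg.xml plus the mapping files.
        _configuration = new Configuration().Configure();
        _sessionFactory = _configuration.BuildSessionFactory();
    }

    [SetUp]
    public void SetUp()
    {
        // Drop and recreate the mapped tables so every test starts
        // against a clean, known schema.
        new SchemaExport(_configuration).Create(false, true);
    }

    [Test]
    public void CanSaveAndReloadAProduct()
    {
        object id;
        using (var session = _sessionFactory.OpenSession())
        {
            id = session.Save(new Product { Name = "Widget" });
            session.Flush();
        }

        using (var session = _sessionFactory.OpenSession())
        {
            var loaded = session.Get<Product>(id);
            Assert.AreEqual("Widget", loaded.Name);
        }
    }
}
```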

 

Am I Just Testing the Mocking Framework?

I had a class that called into a 3rd party library that I wanted to test.  I abstracted the 3rd party library behind a proxy and called that from my class.  The 3rd party library sent messages over a TCP socket connection and returned either an "OK" or an "ERR" message.  The class I wanted to test created and sent the messages and then reported on what it got back.  For example, I would call a method on the class and it would send the message "FOO" to the proxy and get back "OK".  That's a little simplified (see barrier #1 above), but the class itself did pretty much that.  When I set up the test, I mocked out the proxy dependency and had it return "OK" whenever my class sent it "FOO".  That's not a very meaningful test, since there is no logic in that one method.  Other methods had logic based on their parameters; however, this one method really was that simple.  Was I just testing the mocking framework with this particular test?  Arguably, yes...I was; however, if that method ever has to change in the future, I now have a test I can use to ensure that I still get the information back correctly.
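For what it's worth, that test looked something like the following sketch (using Moq as one possible mocking framework; IMessageProxy and CommandSender are hypothetical stand-ins for the real types):

```csharp
using Moq;
using NUnit.Framework;

// Hypothetical proxy interface and the class under test.
public interface IMessageProxy
{
    string Send(string message);
}

public class CommandSender
{
    private readonly IMessageProxy _proxy;

    public CommandSender(IMessageProxy proxy)
    {
        _proxy = proxy;
    }

    // No real logic here: build the message, send it, report the result.
    public bool SendFoo()
    {
        return _proxy.Send("FOO") == "OK";
    }
}

[TestFixture]
public class CommandSenderTests
{
    [Test]
    public void SendFoo_ReturnsTrue_WhenProxyRepliesOk()
    {
        var proxy = new Mock<IMessageProxy>();
        proxy.Setup(p => p.Send("FOO")).Returns("OK");

        var sender = new CommandSender(proxy.Object);

        Assert.IsTrue(sender.SendFoo());
    }
}
```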

 

But Unit Testing = More Code & Time

This wasn't so much a mental barrier for me as it was a barrier to adopting the practice at work.  From what I've experienced, writing unit tests AFTER the code has already been written DOES lead to more code being written, more refactoring of existing code to make it testable, and more time spent doing all of this.  However, I have found that when you write your unit tests BEFORE you write any code (as advocated in TDD), you tend to code at the same speed, if not a little faster.  I found that I wrote code faster when I wrote the tests first, since the production code had already been designed for me through the tests I had just written.  There was less thought required at the time of coding because the design was worked out while I was writing the tests.  I also found myself near-instantly defaulting to the S.O.L.I.D. principles instead of treating them as a later refactoring step.  Lastly, I wrote less code that existed only for some possible future use (i.e. YAGNI).

All of this came with a paradigm shift in my object design style.  While some of the basic directory structure and such stayed the same, I looked at my classes more in terms of "how do I want to call this class" as opposed to "what methods do I want to expose from this class".  The shift in my thinking was toward being a consumer rather than a provider or retailer.  If you only code what you need, you tend not to code things you don't need.  It's as if you go to the grocery store for one item and that's the only item the store has.  Nothing else to distract you at that point in time.  No other products around that you might need in the future.  You are in that one moment, and in that moment, you have only one need as a consumer.
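A small sketch of what that consumer-first mindset looks like in practice (the Invoice example and its members are hypothetical): the test is written first, as the first consumer of the class, and the class's API falls out of how the test wants to call it, with nothing speculative added.

```csharp
using NUnit.Framework;

[TestFixture]
public class InvoiceTests
{
    // Written before Invoice exists: this test decides how the class
    // should be called, and only the members it uses get implemented.
    [Test]
    public void Total_IsSumOfLineAmounts()
    {
        var invoice = new Invoice();
        invoice.AddLine("Widget", 10.00m);
        invoice.AddLine("Gadget", 15.50m);

        Assert.AreEqual(25.50m, invoice.Total);
    }
}

// The minimal class the test demanded; nothing added "for later".
public class Invoice
{
    private decimal _total;

    public void AddLine(string description, decimal amount)
    {
        _total += amount;
    }

    public decimal Total
    {
        get { return _total; }
    }
}
```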

 

Summary

While there were other barriers I came across, the rest were either very minor or were about basic implementation details, like how to return an IDataReader object.  I'm sure I'll stumble across more barriers and moments of enlightenment the more I practice TDD, or just unit testing in general.  Until then, I hope some of the items described and discussed here can help others who have struggled to get their heads around unit testing and some of the principles of TDD.

