I have received several inquiries about the SimpleTestRunner and how to visualize test failures. It is important to note that the current SimpleTestRunner GUI is intended primarily for automation purposes. A full GUI, similar to what you’d find in FlexUnit or Fluint, is on the roadmap; it’s just not ready yet.
In the meantime, there are several options for recording test results.
The most longstanding option is the DebugTestListener. Just run the following…
var runner:TestRunner = new TestRunner();
runner.test = LibraryTestSuite.getInstance();
// ...then attach a DebugTestListener and start the run
// (see the FUnit API for the exact calls).
Note: This must be run in debug mode.
You will see the following in the debug console. (I’ve intentionally introduced some errors so you can see the failure output.)
FUnit Framework - Flex Unit Testing Environment
Framework Version: 0.72.0452 (Build 1106)
Adobe Windows (StandAlone)
Player Version: WIN 9,0,124,0
################################ UNIT TESTS ################################
Running tests in ''...
############## F A I L U R E S #################
1) funit.framework::ArrayAssertTests.itemsOfTypeFailure :
Expected: all items instance of <String>
But was: [ "x", "y", <Object> ]
2) funit.framework::ArrayAssertTests.itemsUniqueFailure :
Expected: all items unique
But was: [ "x", "y", "x" ]
Executed tests : 160
Ignored tests : 1
Failed tests : 2
Unhandled exceptions : 0
Total time : 1.231 seconds
The XmlResultWriter will take a TestResult object and translate it into an NUnit-style XML schema. If you choose, you can simply trace the result to the debug console or write it to a flat file with Adobe AIR.
If you want to get a little fancier, you can use this in combination with ANT and write the results out to an XML file. Unlike other approaches, this does not require the AIR runtime or XMLSocket connections. Instead, you need only target the standalone debug player distributed with the Flex SDK. This approach is ideal for automated build integration, especially builds with test report generation.
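As a sketch of the AIR flat-file option described above — the XmlResultWriter call is an assumption (check the FUnit API for the exact method name), while the file APIs are standard AIR:

```actionscript
import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;

// Assume 'result' is the TestResult from a completed run and that
// XmlResultWriter exposes a write() method (method name hypothetical).
var writer:XmlResultWriter = new XmlResultWriter();
var xml:XML = writer.write( result );

// Trace it to the debug console...
trace( xml.toXMLString() );

// ...or persist it to disk under Adobe AIR.
var file:File = File.applicationStorageDirectory.resolvePath( "TestResult.xml" );
var stream:FileStream = new FileStream();
stream.open( file, FileMode.WRITE );
stream.writeUTFBytes( xml.toXMLString() );
stream.close();
```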
Filed Under (Updates) by Ryan Christiansen on 02-16-2009
The next major revision to the framework core is now available for download. Release 0.70 provides significant enhancements in three key areas:
Automated Builds and Continuous Integration
GUI Test Runner
Although a forthcoming enhanced UI was detailed in the framework roadmap, community feedback indicated that a GUI was critical even in the earliest development stages. I had admittedly focused more on the NUnit feature set than on front-end usability. Not surprisingly, certain aspects of the .NET approach stifled the strengths of the Flash environment, making GUI development cumbersome at best. This release addresses several of those limitations.
The SimpleTestRunner was designed for use with automated testing where visual feedback is useful but little interactivity is required.
A more advanced testing interface, such as the FlexUnit GUI shown here, is actively under development.
Continuous Integration Support for TeamCity
One of the most exciting aspects of Release 0.70 is that the build was fully automated from start to finish. That means the SVN checkout, ANT builds, FUnit test runs with results, and final zipped distributions were all handled automatically by my TeamCity build server.
It is important to note that TeamCity is not just running the tests; it ingests live data and displays detailed information about test results and failures as they occur. Click here to have a closer look.
Filed Under (General) by Ryan Christiansen on 01-19-2009
I’m sure many of you are familiar with what it means to eat your own dog food. Put simply, and you can quote me…
Any developer that won’t use his own project shouldn’t be surprised when no one else will either.
With that said, I do my best to apply FUnit and its support libraries wherever possible. This has resulted in direct improvements to code quality, user experience, and extensibility. Here are just a few ways this approach has helped already:
Find bugs earlier in the deployment cycle.
Identify confusing or weak areas of usability.
Remove unnecessary or redundant APIs.
Build automation – just to name a few…
Over the past couple of weeks, I have improved support for third-party build management tools such as TeamCity, Cruise, and Maven. Although TeamCity and Cruise (from the makers of the open-source project CruiseControl) are commercial products, both provide very capable free editions as well. Maven, on the other hand, is fully open source and is gaining momentum in the development community.
Commercial vs. Open Source
Being less experienced in automation and server technologies, I’m a perfect test subject for ease of use. I could probably write an entire article on the semantics of free vs. commercial software, so I’ll keep it brief.
So do I prefer commercial or open source? Well, both… it depends. Solutions are not better just because they’re free, any more than commercial products are better because of financial backing. There are plenty of examples on all sides. Subversion and TortoiseSVN are great examples of successful open source projects that I love. InstallShield, on the other hand, is an expensive, bloated, metastasized ball of corrosive goo that has managed to break the simplest (previously working) installers without my changing a single line of code… but I digress… rant.close();
Choosing a Continuous Integration Server
My first instinct was to try Maven. I wanted to choose something I felt the community would already be using. Unfortunately, not having used it before, I found Maven difficult to pick up despite relatively good documentation and community examples. It is important to note, however, that there is great progress being made in this arena for Flex development with projects like flex-mojos. Therefore, I will be revisiting Maven in the future to ensure it is supported properly.
Two specific things made me favor TeamCity and Cruise over Maven.
Once installed, it just works with no additional configuration.
The web interface made management of build scripts simple and intuitive.
But the real deal-maker for TeamCity was something called Remote Run. Remote Run allows developers to trigger personal builds of local changes directly from their development IDE (Eclipse, Visual Studio, or IntelliJ IDEA). Here’s the best part: you don’t have to commit your changes to version control to run your build remotely. How cool is that?
This may not seem like a big deal when you’re talking about a smaller project you can test locally. But it is a big deal for complex builds that require automation to be tested. What if you need to modify the build scripts? What if you change your build dependencies? What if you have multiple developers on the project? If you break the build in SVN, you hurt progress for everybody.
You don’t want to commit something unless you know it works. For this, Remote Run has a second feature called Pre-Tested Commit, which automatically commits your selected changes if and only if they build successfully on the server.
Yeah… I like TeamCity… a lot.
And the winner is… TeamCity
You can log in as a guest to my project’s build automation and deployment server below or from the community links in the sidebar.
You will have access to build artifacts and FUnit test statistics, including compiled binaries, asdocs, distributions, source files, and nightly builds. Just in case you missed it: yes, FUnit is fully integrated and maintains detailed UI records of all executed tests.
Note: The TeamCity server instance is currently hosted on my development machine at home. Until I can buy a second dedicated box and somehow acquire a legal copy of Windows Server 2003 R2 or 2008, I can’t guarantee its reliability. Although TeamCity is available for Linux and Mac OS X as well, I would prefer a Windows server.
This post provides an exhaustive list of supported value-equatable object types, with relevant implementation details and code samples for each.
Primitive types: int, uint, Number, Boolean, and String
Coerced types: Date, XML, XMLList, Namespace, and QName
Complex types: Array, ByteArray, and IList
Assert.areNotEqual( 123, 123.456 );

// Note: The following will pass in FUnit but fail in FlexUnit.
// The condition (NaN == NaN) will actually fail in
// ActionScript, so special handling is required.
Assert.areEqual( NaN, NaN );
Assert.areNotEqual( true, false );
// Although 'isTrue' or 'isFalse' is more appropriate here.
Assert.areEqual( "hello world!", "hello world!" );
Assert.areNotEqual( "hello world!", "goodbye world!" );

// The StringAssert class is better suited for more
// advanced string assertions.
StringAssert.endsWith( "!", "hello world!" );
// and so on ...
Important Note: To keep the expected behavior simple, value coercion is limited to global top-level classes only; a ‘quick reference’ should not be necessary while writing equality tests. The only top-level classes not currently handled are Error and RegExp. This is largely because they are immutable objects (they cannot be changed after they are created). Pending further discussion and/or feedback, value coercion for these types will likely be added as well.
Assert.areNotEqual( new Date(123456), new Date(654321) );
// Note: Unlike the .NET environment, ActionScript does not
// treat dates as a value type. Therefore, FUnit will
// coerce the Date class to evaluate equality.
// For reference equality use 'Assert.areSame()'.
XML / XMLList
var xml1:XML = <value>Hello World</value>;
var xml2:XML = <value>Hello World</value>;

Assert.areEqual( xml1, xml2 );
// Similar to Date class value coercion, FUnit can distinguish
// value equality from reference equality.
Assert.areNotSame( xml1, xml2 );
Filed Under (Updates) by Ryan Christiansen on 09-03-2008
The next major revision to the framework core is now available for download. Although there are few changes to the core runtime engine, this release represents significant research surrounding the semantics of the Assert ‘areEqual’ method. More in-depth coverage of these issues can be found here and here.
A detailed article outlining the effects of this change will be available within the next few days.
Don’t let the title fool you… JUnit, NUnit, FlexUnit, and I have gone round and round for weeks since my last post. Every time I thought I had adequately addressed the issue, a new crop of discoveries would start the battle over again. Despite the rough beginnings, however, I couldn’t be happier with the outcome. Furthermore, the functional distinctions between FUnit and FlexUnit have gone beyond how tests are created and deeper into how they are written. Here is what I’ve discovered.
Discovery 1: All xUnit Frameworks are not created equal.
You may have noticed the addendum in my last post concerning JUnit vs. NUnit array equality. This was my first clue that striking the right balance between authoring language and developer intent was more art than science. The xUnit pattern is a means of providing ubiquity across languages, while it is the responsibility of the framework instance to tailor to the needs of the language. With this in mind, controlled deviations among testing frameworks are not only common but preferable.
Note the JUnit source code for ‘assertEquals’ here:
As you see in the above JUnit example, ‘assertEquals’ relies on a call to the ‘expected’ data type’s Object.equals() implementation. I confess, I was under the false impression that this method served as a member-wise equality comparer. On the contrary, the default behavior of Object.equals() is actually reference equality. I also discovered that this method is rarely overridden outside of primitive types such as String, Boolean, Double, Single, etc. This means that in terms of JUnit, ‘assertEquals’ behaves more like ‘assertSame’ in most cases.
NUnit takes a different approach to object equality. In addition to performing value comparison of primitive types, NUnit adds a few others: Array, ICollection (equatable to IList in AS3), Stream, and Date. In such cases, NUnit’s ‘areEqual’ ensures that the compared values are equal without relying solely on reference equality. Object.equals() is still used by NUnit, but only after all other value comparisons have failed.
The following array equality assertion in NUnit will pass.
There is a similar method available in JUnit called Assert.assertArrayEquals(). I’d like to point out that there are actually two Assert classes in JUnit, ‘junit.framework::Assert’ and ‘org.junit::Assert’. This is a good indicator that JUnit has been rethinking its testing approach as well. Legacy support is a primary factor in “why things are the way they are”. Once an established framework like JUnit has put its stakes in the ground, it’s hard to move them.
Discovery 2: All languages are not created equal… duh.
The JUnit ‘assertEquals’ code sample is a bit misleading. What you don’t see is that there are actually 19 other methods just like it. These method overloads serve as a form of type equality check. In order for two types to be compared by value, they must be of equivalent types. In fact, in some languages comparison of incompatible types won’t even compile.
Compilation of the following condition in C# will fail.
Boolean value = (123456 == "123456") ? true : false;

// The following error will be thrown:
// Operator '==' cannot be applied to operands
// of type 'int' and 'string'
Unfortunately, method overloading is not supported in ActionScript 3 at all. Equally unfortunate, FlexUnit makes no attempt to account for such type mismatches. Type equality is a key component of value equality. The inclusion of FlexUnit’s ‘assertStrictEquals’ is a step in the right direction but still relies heavily on developers to do the legwork.
The following equality assertion in FlexUnit will pass.
In this example, standard equality (==) on XML is a value comparison, while strict equality (===) is a reference comparison. It becomes a requirement, therefore, that a developer be intimately familiar with the nuances of each equality type and its variances across data types. I would argue that maintaining separation between Assert.areEqual() and Assert.areSame() is much clearer for the developer (but much harder on me).
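The XML behavior described here is easy to verify in plain ActionScript, outside any test framework:

```actionscript
var xml1:XML = <value>Hello World</value>;
var xml2:XML = <value>Hello World</value>;

trace( xml1 == xml2 );  // true  — '==' compares XML content
trace( xml1 === xml2 ); // false — '===' compares object references
```

This is exactly the nuance a developer has to keep in mind when an assertion delegates to one operator or the other.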
Discovery 3: FlexUnit is to JUnit as FUnit is to NUnit.
The more I study the mechanics of the JUnit and NUnit testing frameworks, the clearer the distinction is between the two. Given ActionScript’s strong Java influence, it makes sense that Adobe would choose to pattern FlexUnit after JUnit. I will also say that the developers of FlexUnit did an outstanding job mirroring the testing patterns of JUnit. If you can unit-test in Java with JUnit, then you can unit-test in Flash with FlexUnit, and vice versa. That’s a very powerful thing.
There’s just one problem… I don’t like JUnit… at all.
I believe the patterns established by JUnit have made FlexUnit unnecessarily complex for both framing and writing tests. In fact, it was my initial frustrations with FlexUnit (and the JUnit pattern) that led me to establish the FUnit framework. More specifically, I didn’t have the same frustrating experience when using NUnit. Here are just a couple of ways that NUnit spoiled me.
No need to extend the TestCase base class
Tests are easily flagged with the [Test] metadata tag
No “test” prefix is required for TestCase methods to be reflected
Expected errors are marked with an [ExpectedError] tag (no wrapper code)
Supports SetUp, TearDown, FixtureSetUp, and FixtureTearDown
It should come as no surprise that I’ve sided with NUnit on this one. A simple side-by-side comparison of FlexUnit vs. FUnit markup makes it pretty obvious why I like it.
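For a rough sense of what the NUnit-inspired markup looks like, here is a sketch of an FUnit-style fixture. The metadata tags follow the list above, but treat the exact tag spellings and the Calculator class as illustrative assumptions rather than the framework’s definitive API:

```actionscript
package tests
{
    import funit.framework.Assert;

    public class CalculatorTests // no TestCase base class required
    {
        private var calc:Calculator; // hypothetical class under test

        [SetUp]
        public function init():void
        {
            calc = new Calculator();
        }

        // No "test" prefix needed — the [Test] tag drives reflection.
        [Test]
        public function addsTwoNumbers():void
        {
            Assert.areEqual( 5, calc.add( 2, 3 ) );
        }

        // Expected errors are declared, not wrapped in try/catch.
        [Test]
        [ExpectedError("ArgumentError")]
        public function rejectsNegativeInput():void
        {
            calc.add( -1, 3 ); // assume Calculator rejects negatives
        }
    }
}
```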
There’s far more to this topic than I was able to post here, but I hope I’ve provided sufficient background to describe the issues at hand. For those of you pulling regular SVN updates, the core changes have been implemented and are available for use. Since I’d rather not unnecessarily muddle the problem and solution, a detailed explanation of these changes merits an article of its own. Look for that article shortly, along with an official update release.
It is not my intention to discredit FlexUnit’s decision to diverge its implementation from other testing frameworks, but rather to ensure that FUnit is firmly planted in what the testing community considers the de facto standard.
I will cite JUnit and NUnit as references for xUnit best practice. These frameworks are long established and widely adopted tools for unit-testing in the Java and .NET languages. The discrepancy between JUnit/NUnit and FlexUnit can be summarized as follows:
When evaluating equality of value types such as Boolean, String, Number, etc. the behavior will be similar in most cases (we’ll outline these differences later). Test outcome for Object and Array assertions, however, may be drastically different between FlexUnit and other xUnit frameworks.
Important Correction: Since posting this article I’ve discovered that the JUnit method overload Assert.assertEquals(object, object) has been deprecated and replaced by Assert.assertArrayEquals(object, object). Variance between JUnit and NUnit may indicate that minor inconsistencies exist across testing frameworks as a whole and are not limited to FlexUnit alone. Further discussion on this topic and how it pertains to FUnit will continue here.
The following array equality assertion in JUnit will pass.
The following array equality assertion in FlexUnit will fail.
var array1:Array = [ "one", "two", "three" ];
var array2:Array = [ "one", "two", "three" ];

Assert.assertEquals( array1, array2 ); // fails — compares references
The intent of ‘assertEquals’, according to xUnit convention, is to ensure that the values of two objects are identical, not that they reference the same object. A second assertion type, ‘assertSame’ (‘areSame’ in NUnit), exists in JUnit to ensure that two objects share the same memory reference.
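In FUnit terms, the two assertion types play out like this (a sketch):

```actionscript
var array1:Array = [ "one", "two", "three" ];
var array2:Array = [ "one", "two", "three" ];
var array3:Array = array1;

Assert.areEqual( array1, array2 );   // passes — identical values
Assert.areNotSame( array1, array2 ); // passes — two distinct objects
Assert.areSame( array1, array3 );    // passes — one shared reference
```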
There are three possible reasons that may have led FlexUnit’s developers to change the assertEquals implementation.
Inconsistencies between the ActionScript Object class and other languages.
Existence of two Flash equality types; equals (==) and strict equals (===).
Flash developer expectations and implied meaning of ‘assertEquals’.
Relevant Language Differences Between ActionScript, Java, and .NET
The primary difference between the ActionScript Object definition and its Java/.NET counterparts is the absence of an Object.equals() method. All Java/.NET objects inherit from or override this method. Unfortunately, there is no ActionScript equivalent for member-wise equality comparison.
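To see what such a comparison involves, here is a minimal member-wise sketch. It is illustrative only — not FUnit’s actual implementation — and limited to dynamic properties of plain objects:

```actionscript
// Illustrative only — not FUnit's actual implementation.
// Compares the dynamic properties of two plain objects
// using strict equality on each member.
function memberwiseEquals( a:Object, b:Object ):Boolean
{
    var key:String;
    for (key in a)
    {
        if ( !(key in b) || a[key] !== b[key] )
            return false;
    }
    for (key in b) // catch members present only in 'b'
    {
        if ( !(key in a) )
            return false;
    }
    return true;
}

trace( memberwiseEquals( { x: 1 }, { x: 1 } ) ); // true
trace( memberwiseEquals( { x: 1 }, { y: 1 } ) ); // false
```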
Object.equals() plays a crucial role in the ‘assertEquals’ implementation.
Note the JUnit source code for ‘assertEquals’ here:
It is important to note here that ‘standard equality’ (==) is a looser form of equality that does not exist in Java/.NET. These languages use what Flash considers ‘strict equality’ (===). Behavioral differences between these types become apparent when evaluating equality of value types (Boolean, String, Number, etc.).
The following equality assertion in JUnit (or NUnit) will fail.
int value1 = 5;
String value2 = "5";

Assert.assertEquals(value1, value2);
The following equality assertion in FlexUnit will pass.
var value1:int = 5;
var value2:String = "5";

Assert.assertEquals( value1, value2 ); // passes — '==' coerces the String
According to the Flash Language Reference:
The strict equality (===) operator differs from the equality (==) operator in only two ways:
The strict equality operator performs automatic data conversion only for the number types (Number, int, and uint), whereas the equality operator performs automatic data conversion for all primitive data types.
When comparing null and undefined, the strict equality operator returns false.
In other words, ‘standard equality’ does not enforce type equality of value types. This is inconsistent with other strongly typed languages.
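A two-line ActionScript check makes the difference concrete:

```actionscript
trace( 5 == "5" );  // true  — '==' coerces the String before comparing
trace( 5 === "5" ); // false — '===' also requires matching types
```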
Admittedly, my initial impression of ‘assertEquals’ was that it would perform reference comparison. If you say “Assert that value1 and value2 are equal,” it would be easy to visualize “value1 == value2”. Since this is not what will be executed, it could be viewed as misleading.
The FUnit framework was designed to be intuitive. It was also designed to be ubiquitous with other frameworks. I don’t want developers of JUnit, NUnit, or FUnit tests to have to change their logic just because they switch languages. The less you have to think about the tools, the more focused you can be on using them.
I would opt for a stronger, more consistent xUnit form of equality assertion. This, however, might require minor changes when migrating tests from FlexUnit implementations. I would also create a secondary LegacyAssert class that performs identically to its FlexUnit counterpart. This would allow the developer to choose between xUnit convention and the more familiar FlexUnit behavior (or both in combination), ultimately empowering the developer to be more explicit in their unit testing.
FUnit Assert API
function areEqual( expected:Object, actual:Object ):void
function areSame( expected:Object, actual:Object ):void
FUnit LegacyAssert API
function areEqual( expected:Object, actual:Object ):void
function areStrictEqual( expected:Object, actual:Object ):void
Which 'assertEquals' implementation works best for you?
Filed Under (Updates) by Ryan Christiansen on 06-01-2008
I realize there are countless web-based issue tracking systems available. These systems vary widely, from open source to high-dollar enterprise installations. Unfortunately, I had a number of restricting factors that virtually eliminated all of them.
Guru Not Included: My server-side experience is… less than ideal.
Hosting Restrictions: I don’t have the resources to support X or Y technology.
Complex Setup: Nearly impossible in the absence of said “server-side experience”.
Expensive: Open source projects aren’t always funded, including this one.
Option Nightmare: It crams so many “features”, it’s practically unusable.
Wrong Target User: I don’t need much, but give me what I need.
Ugly Ugly Ugly: People DO judge a book by its cover…
I finally found “just what I was looking for”. There is room to grow but Lighthouse sailed past my limitation gauntlet with flying colors. I was able to create my account and begin roughing out community-based milestones and tickets almost immediately. The tools are simple and intuitive, and a good looking interface never hurt anybody. Better still, this hosted solution is free for open-source projects. Although I found this option available on other commercial products, I still preferred this one.
Lighthouse will help provide better visibility into FUnit development planning and changes, as well as a sounding board for desired features and bug fixes. The new Issue Tracking Dashboard is now active and I encourage community involvement. Please do not hesitate to create new tickets for any desirables or bugs you may encounter. You may also comment on existing tickets to raise their priority, and I will do my best to expedite those areas.
Filed Under (Updates) by Ryan Christiansen on 05-25-2008
I discovered that within the past few days, Joseph Berkovitz and Alex Uhlmann have released Flexcover 0.50. I believe this project will be a tremendous asset for FUnit, and I will be closely following its progress.
There is one feature in particular I’m especially impressed with: FlexCover now supports branch coverage. This is the first major step toward providing not only test coverage, but test coverage quality. Just because a test executes your class method doesn’t mean that all the code in that method is called. This is where branch coverage steps in. In a nutshell, FlexCover tracks every condition (if, else if) and whether it passes or fails. Obviously, if a condition fails, its code will never be executed… or tested.
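A tiny, hypothetical example shows why this matters:

```actionscript
// A single test that only calls classify(5) executes the method,
// but exercises just the first branch — branch coverage would flag
// the 'else if' and the final return as never taken.
function classify( n:int ):String
{
    if ( n > 0 )
        return "positive";
    else if ( n < 0 )
        return "negative";
    return "zero";
}
```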
One major caveat to FlexCover, however, is that it requires a “modified” Flex compiler. This is how the coverage “hooks” are injected into your application code. In addition, this step is needed to emit detailed metadata about your code, files, and application structure. I anticipate, given Alex’s ties with Adobe, that better compiler extensibility will be available with the introduction of Flex 4 next year. Perhaps then this will no longer be necessary.
I have to say, the documentation for such an early project is excellent and very well organized. I was able to get good coverage analysis for FUnit itself, and I created a small subset of FlexCover’s AIR-based CoverageViewer for the web. Sadly, the source view that demonstrates the branch count won’t be available online (for now). The sample coverage data is viewable here.