May
19
Filed Under (Debug Console, General, Graphical Interface, Test Runners) by Ryan Christiansen on 05-19-2009

I have received several inquiries regarding the SimpleTestRunner and a means of visualizing test failures. It is important to note that the current SimpleTestRunner GUI is really intended for automation purposes. A full GUI similar to what you’d find in FlexUnit or Fluint is on the roadmap. It’s just not ready yet.

In the meantime, there are several options for recording test results.

Option 1:

The most longstanding option is the DebugTestListener. Just run the following…

var runner:TestRunner = new TestRunner();
runner.test = LibraryTestSuite.getInstance();
runner.run( new DebugTestListener() );

Note: This must be run in debug mode.
You will see the following in the debug console.
(I’ve intentionally introduced some errors so you can see the failure output.)

------------------------------------------------------------------
FUnit Framework - Flex Unit Testing Environment
Framework Version: 0.72.0452 (Build 1106)
 
Adobe Windows (StandAlone)
Player Version: WIN 9,0,124,0
------------------------------------------------------------------
 
################################ UNIT TESTS ################################
Running tests in ''...
***** funit.framework::ArrayAssertTests.isNotSubsetOf
***** funit.framework::ArrayAssertTests.doesNotContain
***** funit.framework::ArrayAssertTests.containsFailureOnEmpty
***** funit.framework::ArrayAssertTests.areNotEqualHandlesNull
***** funit.framework::ArrayAssertTests.isNotEmpty
 
[continued ...]
 
############################################################################
############## F A I L U R E S #################
1) funit.framework::ArrayAssertTests.itemsOfTypeFailure :
  Expected: all items instance of <String>
  But was: [ "x", "y", <Object> ]
     at funit.framework::CollectionAssert$/allItemsAreInstancesOf()[..\Framework\main\source\funit\framework\CollectionAssert.as:82]
     at funit.framework::ArrayAssertTests/itemsOfTypeFailure()[..\Framework\tests\source\funit\framework\ArrayAssertTests.as:59]
     at Function/http://adobe.com/AS3/2006/builtin::apply()
     at InternalMethodInfo/invoke()[..\SwirlyVision\Corelib\main\source\sv\reflection\InternalMethodInfo.as:96]
     at funit.core::TestMethod/invokeTestCase()[..\Framework\main\source\funit\core\TestMethod.as:127]
     at funit.core::TestMethod/runTestCaseImpl()[..\Framework\main\source\funit\core\TestMethod.as:105]
     at funit.core::TestMethod/runTestCase()[..\Framework\main\source\funit\core\TestMethod.as:82]
     at funit.core::TestCase/run()[..\Framework\main\source\funit\core\TestCase.as:56]
     at funit.core::TestSuite/runTest()[..\Framework\main\source\funit\core\TestSuite.as:243]
     at <anonymous>()[..\Framework\main\source\funit\core\TestSuite.as:180]
     at flash.events::EventDispatcher/dispatchEventFunction()
     at flash.events::EventDispatcher/dispatchEvent()
     at funit.core::TestScheduler/scheduleTest()[..\Framework\main\source\funit\core\TestScheduler.as:128]
     at <anonymous>()[..\Framework\main\source\funit\core\TestScheduler.as:105]
     at flash.events::EventDispatcher/dispatchEventFunction()
     at flash.events::EventDispatcher/dispatchEvent()
     at flash.utils::Timer/tick()
2) funit.framework::ArrayAssertTests.itemsUniqueFailure :
  Expected: all items unique
  But was: [ "x", "y", "x" ]
     at funit.framework::CollectionAssert$/allItemsAreUnique()[..\Framework\main\source\funit\framework\CollectionAssert.as:124]
     at funit.framework::ArrayAssertTests/itemsUniqueFailure()[..\Framework\tests\source\funit\framework\ArrayAssertTests.as:115]
     at Function/http://adobe.com/AS3/2006/builtin::apply()
     at InternalMethodInfo/invoke()[..\SwirlyVision\Corelib\main\source\sv\reflection\InternalMethodInfo.as:96]
     at funit.core::TestMethod/invokeTestCase()[..\Framework\main\source\funit\core\TestMethod.as:127]
     at funit.core::TestMethod/runTestCaseImpl()[..\Framework\main\source\funit\core\TestMethod.as:105]
     at funit.core::TestMethod/runTestCase()[..\Framework\main\source\funit\core\TestMethod.as:82]
     at funit.core::TestCase/run()[..\Framework\main\source\funit\core\TestCase.as:56]
     at funit.core::TestSuite/runTest()[..\Framework\main\source\funit\core\TestSuite.as:243]
     at <anonymous>()[..\Framework\main\source\funit\core\TestSuite.as:180]
     at flash.events::EventDispatcher/dispatchEventFunction()
     at flash.events::EventDispatcher/dispatchEvent()
     at funit.core::TestScheduler/scheduleTest()[..\Framework\main\source\funit\core\TestScheduler.as:128]
     at <anonymous>()[..\Framework\main\source\funit\core\TestScheduler.as:105]
     at flash.events::EventDispatcher/dispatchEventFunction()
     at flash.events::EventDispatcher/dispatchEvent()
     at flash.utils::Timer/tick()
############################################################################
Executed tests : 160
Ignored tests : 1
Failed tests : 2
Unhandled exceptions : 0
Total time : 1.231 seconds
############################################################################

Option 2:

The XmlResultWriter will take a TestResult object and translate it into an NUnit-style xml document. If you choose, you can simply trace the result to the debug console or write it to a flat file with Adobe AIR.
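If you go the AIR route, a minimal sketch of the file write might look like the following. It reuses the same XmlResultWriter and RunFinishEvent usage shown in the MXML runner below; the output file name and location are arbitrary choices of mine.

import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;

import funit.listeners.events.RunFinishEvent;
import funit.utils.XmlResultWriter;

private function writeResultFile( event:RunFinishEvent ) : void
{
    var writer:XmlResultWriter = new XmlResultWriter( event.result );

    // Any writable location will do; application storage keeps it simple.
    var file:File =
        File.applicationStorageDirectory.resolvePath( "TestResult.xml" );

    // Overwrite the file with the NUnit-style xml produced by the writer.
    var stream:FileStream = new FileStream();
    stream.open( file, FileMode.WRITE );
    stream.writeUTFBytes(
        "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n" +
        writer.result.toXMLString()
    );
    stream.close();
}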

If you want to get a little fancier, you can use this in combination with ANT and write the result out to an xml file. Unlike other approaches, this does not require the AIR runtime or XMLSocket connections. Instead, you need only target the standalone debug player distributed with the Flex SDK. This approach is ideal for automated build integration, especially builds that generate test reports.

Here’s how you might set up your test runner…

<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml"
                xmlns:funit="http://www.funit.org/2007/mxml">
 
    <mx:Script>
        <![CDATA[
 
            import mx.utils.StringUtil;
 
            import funit.LibraryTestSuite;
            import funit.listeners.events.RunFinishEvent;
            import funit.utils.XmlResultWriter;
 
            private function runTests() : void
            {
                runner.addEventListener(
                    RunFinishEvent.RUN_FINISH,
                    runFinishedHandler
                );
                runner.run();
            }
 
            private function runFinishedHandler( event:RunFinishEvent ) : void
            {
                var writer:XmlResultWriter =
                    new XmlResultWriter(event.result);
 
                var result:String =
                    "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n" +
                    writer.result.toXMLString();
 
                trace( StringUtil.substitute("##funit\[TestResult {0}]##", result) );
            }
        ]]>
    </mx:Script>
 
    <mx:creationComplete>runTests();</mx:creationComplete>
 
    <funit:AutomationTestRunner id="runner">
        <funit:test>{LibraryTestSuite.getInstance()}</funit:test>
    </funit:AutomationTestRunner>
 
</mx:Application>

Believe it or not, the ANT side is just as easy.
(Properties are used for clarity and reuse)

<!--
    Execute Tests
-->
<target name="test" description="Test All">
 
    <exec executable="${flashplayer.debug.exe}" errorproperty="trace.output">
        <arg line="${bin.debug.dir}/FUnitTestRunner.swf" />
        <redirector>
            <errorfilterchain>
                <ignoreblank />
            </errorfilterchain>
        </redirector>
    </exec>
 
    <!-- Extract results and write TestResult.xml -->
    <echo file="${bin.debug.dir}/TestResult.xml">${trace.output}</echo>
    <replaceregexp file="${bin.debug.dir}/TestResult.xml" match="(?s).*##funit\[TestResult (.+)\]##" replace="\1" />
 
    <!-- Display original trace statements -->
    <echo>${trace.output}</echo>
 
</target>

The <ignoreblank /> filter in the redirector is necessary to prevent double line spacing. I’m not sure why this happens, but it helps keep our final output pretty.

The regular expression helps ensure that only the result xml is written to the final file. Without it, any other trace calls would be written as well, causing invalid xml markup.

trace(StringUtil.substitute("##funit\[TestResult {0}]##", result));

To avoid this, I’ve wrapped the result in a unique marker that the regular expression can match. Simply put, only the value of result makes it into TestResult.xml.



Feb
16
Filed Under (Updates) by Ryan Christiansen on 02-16-2009

The next major revision to the framework core is now available for download. Release 0.70 provides significant enhancements in three key areas:

  • GUI Support
  • Test Automation
  • Automated Builds and Continuous Integration

GUI Test Runner

Although a forthcoming enhanced UI was detailed in the framework roadmap, community feedback indicated that a GUI was critical, even in the earliest development stages. I had admittedly focused more on the NUnit feature set than on front-end usability. Not surprisingly, certain aspects of a .NET approach stifled the strengths of the Flash environment, making GUI development cumbersome at best. This release addresses several of these limitations.

The SimpleTestRunner was designed for use with automated testing where visual feedback is useful but little interactivity is required.

A more advanced testing interface, such as the FlexUnit GUI shown here, is actively underway.

Continuous Integration Support for TeamCity

One of the most exciting aspects of Release 0.70 is that the build was fully automated from start to finish. That means the SVN checkout, ANT builds, FUnit testing with results, and final zipped distributions were all done automatically by my TeamCity build server.

FUnit Tests in TeamCity

It is important to note that TeamCity is not just running the tests; it is ingesting live data and displaying detailed information about test results and failures as they occur. Click here to have a closer look.



Jan
19
Filed Under (General) by Ryan Christiansen on 01-19-2009

I’m sure many of you are familiar with what it means to eat your own dog food. Put simply, and you can quote me…

Any developer that won’t use his own project shouldn’t be surprised when no one else will either.

With that said, I do my best to apply FUnit and its support libraries wherever possible. This has resulted in direct improvements to code quality, user experience, and extensibility. Here are just a few ways this approach has helped already:

  • Find bugs earlier in the deployment cycle.
  • Identify confusing or weak areas of usability.
  • Remove unnecessary or redundant APIs.

Build Automation – Just to Name A Few…

Over the past couple of weeks, I have improved support for third-party build management tools such as TeamCity, Cruise, and Maven. Although TeamCity and Cruise (from the makers of the open-source project CruiseControl) are commercial products, both provide very powerful free editions as well. Maven, on the other hand, is fully open source and is gaining momentum in the development community.

Commercial vs. Open Source

Being less experienced in automation and server technologies, I’m a perfect test subject for ease of use. I could probably write an entire article on the semantics of free vs. commercial software so I’ll keep it brief.

So do I prefer commercial or open source? Well, both… it depends. Solutions are not better just because they’re free, any more than commercial products are better because of financial backing. There are plenty of examples on all sides. Subversion and TortoiseSVN are great examples of successful open source projects that I love. InstallShield, on the other hand, is an expensive, bloated, metastasized ball of corrosive goo that has managed to break the simplest (previously working) installers without changing a single line of code … but I digress … rant.close();

Choosing a Continuous Integration Server

My first instinct was to try Maven. I wanted to choose something I felt the community would already be using. Unfortunately, not having used it before, I found Maven difficult to pick up despite relatively good documentation and community examples. It is important to note, however, that there is great progress being made in this arena for Flex development with projects like flex-mojos. Therefore, I will be revisiting Maven in the future to ensure it is supported properly.

There are two specific things that made me favor TeamCity and Cruise over Maven.

  • Once installed it just works with no additional configuration.
  • The web interface made management of build scripts simple and intuitive.

But the real deal-maker for TeamCity was something called Remote Run. Remote Run allows developers to trigger personal builds of local changes directly from their development IDE (Eclipse, Visual Studio, or IntelliJ IDEA). Here’s the best part: you don’t have to commit your changes to version control to run your build remotely. How cool is that?

This may not seem like a big deal when you’re talking about a smaller project you can test locally. But it is a big deal for complex builds that require automation to be tested. What if you need to modify the build scripts? What if you change your build dependencies? What if you have multiple developers on the project? If you break the build in SVN, you hurt progress for everybody.

You don’t want to commit something unless you know it works. For this, there is a second feature of Remote Run called Pre-Tested Commit. It will automatically commit your selected changes if and only if they build successfully on the server.

Yeah… I like TeamCity… a lot.

And the winner is… TeamCity

You can log in as a guest to my project’s build automation and deployment server below or from the community links in the sidebar.

Login here: FUnit Build Automation Dashboard

You will have access to build artifacts and FUnit test statistics. This includes compiled binaries, asdocs, distributions, source files, and nightly builds. Just in case you missed it: yes, FUnit is fully integrated and maintains detailed records of all executed tests in the UI.

Note: The TeamCity server instance is currently hosted by my development machine at home. Until I can buy a second dedicated box and somehow acquire a legal copy of Windows Server 2003 R2 or 2008, I can’t guarantee its reliability. Although TeamCity is available for Linux and Mac OS X as well, I would prefer a Windows server.



Sep
06
Filed Under (General, Updates) by Ryan Christiansen on 09-06-2008

This post represents an exhaustive list of supported value-equatable object types. Relevant implementation details and code samples are available for each.

  • Primitive types: int, uint, Number, Boolean, and String
  • Coerced types: Date, XML, XMLList, Namespace, and QName
  • Complex types: Array, ByteArray, and IList

Primitive types:

Numerics
Assert.areEqual( 123.456, 123.456 );
Assert.areNotEqual( 123, 123.456 );
 
// Note: The following will pass in FUnit but fail in FlexUnit.
//       The condition (NaN == NaN) will actually fail in
//       ActionScript so special handling is required.
Assert.areEqual( NaN, NaN );
Boolean
Assert.areEqual( true, true );
Assert.areNotEqual( true, false );
 
// Although 'isTrue' or 'isFalse' is more appropriate here.
Assert.isTrue( true );
String
Assert.areEqual( "hello world!", "hello world!" );
Assert.areNotEqual( "hello world!", "goodbye world!" );
 
// The StringAssert class is better suited for more
// advanced string assertions.
StringAssert.areEqualIgnoringCase( "hello!", "HELLO!" );
StringAssert.isEmpty( "" );
StringAssert.contains( "world", "hello world!" );
StringAssert.startsWith( "hello", "hello world!" );
StringAssert.endsWith( "!", "hello world!" );
// and so on ...

Coerced types:

Important Note: In order to maintain simplicity in expected behavior, value coercion is limited to global top-level classes only. A ‘quick-reference’ should not be necessary while writing equality tests. The only top-level classes not currently handled are Error and RegExp. This is largely due to the fact that they are immutable objects (cannot be changed after they are created). Pending further discussions and/or feedback, value coercion for these types will likely be added as well.

Date
Assert.areEqual( new Date(123456), new Date(123456) );
Assert.areNotEqual( new Date(123456), new Date(654321) );
 
// Note: Unlike the .NET environment ActionScript does not
//       treat dates as a value type. Therefore, FUnit will
//       coerce the Date class to evaluate equality.
//       For reference equality use 'Assert.areSame()'.
Assert.areNotSame( new Date(123456), new Date(123456) );
XML / XMLList
var xml1:XML = <value>Hello World</value>;
var xml2:XML = <value>Hello World</value>;
Assert.areEqual( xml1, xml2 );
 
// Similar to Date class value coercion, FUnit can distinguish
// value equality from reference equality.
Assert.areNotSame( xml1, xml2 );
Namespace / QName
var ns1:Namespace =
     new Namespace( "funit", "http://www.funit.org/2009/" );
var ns2:Namespace =
     new Namespace( "funit", "http://www.funit.org/2009/" );
 
Assert.areEqual( ns1, ns2 );
Assert.areNotSame( ns1, ns2 );
 
var qname1:QName =
     new QName ( "http://www.funit.org/2009/", "funit" );
var qname2:QName =
     new QName ( "http://www.funit.org/2009/", "funit" );
 
Assert.areEqual( qname1, qname2 );
Assert.areNotSame( qname1, qname2 );

Complex types:

Array
Assert.areEqual( ["x", "y", "z"], ["x", "y", "z"] );
Assert.areNotEqual( ["x", "y", "z"], ["x", "y", "a"] );
 
// Associative Arrays are also handled.
var array1:Array = new Array();
array1["firstName"] = "John";
array1["middleName"] = "Adam";
array1["lastName"] = "Doe";
 
var array2:Array = new Array();
array2["firstName"] = "John";
array2["middleName"] = "Adam";
array2["lastName"] = "Doe";
 
Assert.areEqual( array1, array2 );
 
// Complex recursive arrays are also handled by internally
// checking for re-entrant behaviors and preventing stack
// overflow.
var recursive1:Array = new Array( null, ["x", "y", "z"], [] );
recursive1[0] = recursive1;
 
var recursive2:Array = new Array( null, ["x", "y", "z"], [] );
recursive2[0] = recursive2;
 
// Note: No stack overflow will occur here, but more variations
//       and complex samples are needed to ensure all potential
//       loopholes are handled.
Assert.areEqual( recursive1, recursive2 );
ByteArray
var array1:ByteArray = new ByteArray();
array1.writeDouble( Math.PI );
array1.writeUTF( "Hello World" );
array1.writeObject( ["x", "y", "z"] );
 
var array2:ByteArray = new ByteArray();
array2.writeDouble( Math.PI );
array2.writeUTF( "Hello World" );
array2.writeObject( ["x", "y", "z"] );
 
Assert.areEqual( array1, array2 );
Assert.areNotSame( array1, array2 );
IList
var set1:ArrayCollection =
     new ArrayCollection( ["x", "y", "z"] );
var set2:ArrayCollection =
     new ArrayCollection( ["x", "y", "z"] );
 
Assert.areEqual( set1, set2 );
Assert.areNotSame( set1, set2 );


Sep
03
Filed Under (Updates) by Ryan Christiansen on 09-03-2008

The next major revision to the framework core is now available for download. Although there are few changes to the core runtime engine, this release represents significant research surrounding the semantics of the Assert ‘areEqual’ method. More in-depth coverage of these issues is found here and here.

A detailed article outlining the effects of this change will be available within the next few days.



Sep
02
Filed Under (General, Updates) by Ryan Christiansen on 09-02-2008

Don’t let the title fool you… JUnit, NUnit, FlexUnit, and I have gone round and round for weeks since my last post. Every time I thought I had adequately addressed the issue, a new crop of discoveries would start the battle over again. Despite the rough beginnings, however, I couldn’t be happier with the outcome. Furthermore, the functional distinctions between FUnit and FlexUnit have gone beyond how tests are created and deeper into how they are written. Here is what I’ve discovered.

Discovery 1: All xUnit Frameworks are not created equal.

You may have noticed the addendum in my last post concerning JUnit vs. NUnit array equality. This was my first clue that striking the right balance between authoring language and developer intent was more art than science. The xUnit pattern is a means of providing ubiquity across languages, while it is the responsibility of the framework instance to tailor to the needs of the language. With this in mind, controlled deviation among testing frameworks is not only common but preferable.

Note the JUnit source code for ‘assertEquals’ here:
static public void assertEquals(String message, Object expected, Object actual) {
	if (expected == null && actual == null)
		return;
	if (expected != null && expected.equals(actual))
		return;
	failNotEquals(message, expected, actual);
}

As you see in the above JUnit example, ‘assertEquals’ relies on a call to the ‘expected’ data type’s Object.equals() implementation. I confess, I was under the false impression that this method served as a member-wise equality comparer. On the contrary, the default behavior for Object.equals() is actually reference equality. I also discovered that this method is rarely overridden outside of primitive types such as String, Boolean, Double, Single, etc. This means that in terms of JUnit, ‘assertEquals’ behaves more like ‘assertSame’ in most cases.

NUnit takes a different approach to object equality. In addition to performing value comparison of primitive types, NUnit adds a few others… Array, ICollection (equatable to IList in AS3), Stream, and Date. In such cases, NUnit ‘areEqual’ ensures that the values of the compared types are equal without relying solely on reference equality. Object.equals() is still used by NUnit, but only after all other value comparisons have failed.

The following array equality assertion in NUnit will pass.
String[] array1 = { "one", "two", "three" };
String[] array2 = { "one", "two", "three" };
Assert.AreEqual(array1, array2);

There is a similar method available in JUnit called Assert.assertArrayEquals(). I’d like to point out here that there are actually two Assert classes in JUnit, ‘junit.framework::Assert’ and ‘org.junit::Assert’. This is a good indicator that JUnit has been rethinking its testing approach as well. Legacy support is a primary factor in “why things are the way they are”. Once an established framework like JUnit has put its stakes in the ground, it’s hard to move them.

Discovery 2: All languages are not created equal… duh.

The JUnit ‘assertEquals’ code sample is a bit misleading. What you don’t see is that there are actually 19 other methods just like it. These method overloads serve as a form of type equality check. In order for two types to be compared by value, they must be of equivalent types. In fact, in some languages comparison of incompatible types won’t even compile.

Compilation of the following condition in C# will fail.
Boolean value = (123456 == "123456") ? true : false;
 
// The following error will be thrown
Operator '==' cannot be applied to operands
     of type 'int' and 'string'

Unfortunately, method overloading is not even supported in ActionScript 3. Equally unfortunate, FlexUnit makes no attempt to account for such type mismatches. Type equality is a key component of value equality. The inclusion of FlexUnit ‘assertStrictEquals’ is a step in the right direction but still relies heavily on developers to do the leg work.

The following equality assertion in FlexUnit will pass.
var xml1:XML = <value>Hello World</value>;
var xml2:XML = <value>Hello World</value>;
Assert.assertEquals(xml1, xml2);
The following equality assertions in FlexUnit will fail.
var xml1:XML = <value>Hello World</value>;
var xml2:XML = <value>Hello World</value>;
Assert.assertStrictEquals(xml1, xml2);

In this example, standard equality (==) on XML is a value comparison while strict equality (===) is a reference comparison. It becomes a requirement, therefore, that a developer be intimately familiar with the nuances of each equality type and its variances across data types. I would argue that maintaining separation between Assert.areEqual() and Assert.areSame() is much clearer to the developer (but much harder on me).
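For reference, the same split is easy to see in plain ActionScript, outside of any testing framework:

var xml1:XML = <value>Hello World</value>;
var xml2:XML = <value>Hello World</value>;
 
trace( xml1 == xml2 );   // true  - standard equality compares xml content
trace( xml1 === xml2 );  // false - strict equality compares references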

Discovery 3: FlexUnit is to JUnit as FUnit is to NUnit.

The more I study the mechanics of the JUnit and NUnit testing frameworks, the clearer the distinction is between the two. Given ActionScript’s strong Java influence, it makes sense that Adobe would choose to pattern FlexUnit after JUnit. I will also say that the developers of FlexUnit did an outstanding job mirroring the testing patterns of JUnit. If you can unit-test in Java with JUnit then you can unit-test in Flash with FlexUnit, and vice versa. That’s a very powerful thing.

There’s just one problem… I don’t like JUnit… at all.

I believe the patterns established by JUnit have made FlexUnit unnecessarily complex for both framing and writing tests. In fact, it was my initial frustrations with FlexUnit (and the JUnit pattern) that led me to establish the FUnit framework. More specifically, I didn’t have the same frustrating experience when using NUnit. Here are just a couple of the ways NUnit spoiled me (a rough sketch of this style follows the list).

  • No need to extend the TestCase base class
  • Tests are easily flagged with the [Test] metadata tag
  • No “test” prefix is required for TestCase methods to be reflected
  • Expected errors are marked with an [ExpectedError] tag (no wrapper code)
  • Supports SetUp, TearDown, FixtureSetUp, and FixtureTearDown
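To make that concrete, here is an illustrative sketch of a fixture written in this style. Treat it as a rough approximation only: the [Test] and [ExpectedError] tags come from the list above, while the [SetUp] tag name and the funit.framework package for Assert are assumptions on my part.

package
{
    import funit.framework.Assert;  // assumed package, following the funit.framework classes in the trace output above
 
    // Note: no TestCase base class and no "test" method prefix required.
    public class StackTests
    {
        private var stack:Array;
 
        [SetUp]
        public function setUp() : void
        {
            stack = [];
        }
 
        [Test]
        public function pushAddsItem() : void
        {
            stack.push( "x" );
            Assert.areEqual( 1, stack.length );
        }
 
        // Expected errors are declared with metadata, no try/catch wrapper code.
        [Test]
        [ExpectedError("RangeError")]
        public function negativeArrayLengthThrows() : void
        {
            new Array( -1 );
        }
    }
}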

It should come as no surprise that I’ve sided with NUnit on this one. A simple side-by-side comparison of FlexUnit vs. FUnit markup makes it pretty obvious why I like it.

There’s far more to this topic than I was able to post here, but I hope I’ve provided sufficient background to describe the issues at hand. For those of you pulling regular svn updates, the core changes have been implemented and are available for use. Since I’d rather not unnecessarily muddle the problem and solution, a detailed explanation of these changes merits an article of its own. Look for this article shortly along with an official update release.



Jul
07
Filed Under (General, Updates) by Ryan Christiansen on 07-07-2008

It is not my intention to discredit FlexUnit’s decision to diverge its implementation from other testing frameworks, but rather to ensure that FUnit is firmly planted in what the testing community considers the de facto standard.

I will cite JUnit and NUnit as references for xUnit best practice. These frameworks are long established and widely adopted tools for unit-testing in the Java and .NET languages. The discrepancy between JUnit/NUnit and FlexUnit can be summarized as follows:

JUnit ‘assertEquals‘ / NUnit ‘areEqual‘ assert value equality.
FlexUnit ‘assertEquals‘ asserts reference equality.

When evaluating equality of value types such as Boolean, String, Number, etc., the behavior will be similar in most cases (we’ll outline these differences later). Test outcomes for Object and Array assertions, however, may be drastically different between FlexUnit and other xUnit frameworks.

Important Correction: Since posting this article I’ve discovered that the JUnit method overload Assert.assertEquals(object[], object[]) has been deprecated and replaced by Assert.assertArrayEquals(object[], object[]). Variance between JUnit and NUnit may indicate that minor inconsistencies exist across testing frameworks as a whole and are not limited to FlexUnit alone. Further discussions on this topic and how it pertains to FUnit will continue here.

The following array equality assertion in JUnit will pass.
String[] array1 = { "one", "two", "three" };
String[] array2 = { "one", "two", "three" };
Assert.assertArrayEquals(array1, array2);
The following array equality assertion in NUnit will pass.
String[] array1 = { "one", "two", "three" };
String[] array2 = { "one", "two", "three" };
Assert.AreEqual(array1, array2);
The following array equality assertion in FlexUnit will fail.
var array1:Array = [ "one", "two", "three" ];
var array2:Array = [ "one", "two", "three" ];
Assert.assertEquals(array1, array2);

The intent of ‘assertEquals’ according to xUnit convention is to ensure that the value of two objects are identical, not that they reference the same object. A second assertion type ‘assertSame’ exists in JUnit (‘areSame’ for NUnit) to ensure that two objects share the same memory reference.

There are three possible reasons that may have led FlexUnit developers to change the assertEquals implementation.

  • Inconsistencies between the ActionScript Object class and other languages.
  • Existence of two Flash equality types; equals (==) and strict equals (===).
  • Flash developer expectations and implied meaning of ‘assertEquals’.

Relevant Language Differences Between ActionScript and Java and .NET

The primary difference between the ActionScript Object definition and its Java/.NET counterparts is the absence of an Object.equals() method. All Java/.NET objects inherit from or override this method. Unfortunately, there is no ActionScript equivalent for member-wise equality comparison.
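In plain ActionScript, two objects with identical members are still unequal, because comparison of non-primitive Objects is by reference only:

var obj1:Object = { firstName: "John", lastName: "Doe" };
var obj2:Object = { firstName: "John", lastName: "Doe" };
 
trace( obj1 == obj2 );    // false - no member-wise comparison takes place
trace( obj1 === obj2 );   // false - different references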

Object.equals() plays a crucial role in the ‘assertEquals’ implementation.

Note the JUnit source code for ‘assertEquals’ here:
static public void assertEquals(String message, Object expected, Object actual) {
	if (expected == null && actual == null)
		return;
	if (expected != null && expected.equals(actual))
		return;
	failNotEquals(message, expected, actual);
}

Standard Equality Vs. Strict Equality

It is important to note here that ‘standard equality’ (==) is a looser form of equality that does not exist in Java/.NET. These languages use what Flash considers ‘strict equality’ (===). Behavioral differences between these types become apparent when evaluating equality of value types (Boolean, String, Number, etc.).

The following equality assertion in JUnit (or NUnit) will fail.
int value1 = 5;
String value2 = "5";
Assert.assertEquals(value1, value2);
The following equality assertion in FlexUnit will pass.
var value1:int = 5;
var value2:String = "5";
Assert.assertEquals(value1, value2);

According to the Flash Language Reference:

The strict equality (===) operator differs from the equality (==) operator in only two ways:

  • The strict equality operator performs automatic data conversion only for the number types (Number, int, and uint), whereas the equality operator performs automatic data conversion for all primitive data types.
  • When comparing null and undefined, the strict equality operator returns false.

In other words, ‘standard equality’ does not enforce type equality of value types. This is inconsistent with other strongly-typed languages.
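The difference is easy to verify directly in ActionScript:

trace( 5 == "5" );            // true  - the string is converted before comparison
trace( 5 === "5" );           // false - the types do not match
 
trace( null == undefined );   // true
trace( null === undefined );  // false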

User Expectation

Admittedly, my initial impression of ‘assertEquals’ was that it would perform reference comparison. If you say “Assert that value1 and value2 are equal,” it is easy to visualize “value1 == value2”. Since this is not what will be executed, it could be viewed as misleading.

Proposed Solution

The FUnit framework was designed to be intuitive. It was also designed to be consistent with other xUnit frameworks. I don’t want developers of JUnit, NUnit, or FUnit tests to have to change their logic just because they switch languages. The less you have to think about the tools, the more focused you can be on using them.

I would opt for a stronger, more consistent xUnit form of equality assertion. This, however, might require minor changes when migrating tests from FlexUnit implementations. I would also create a secondary LegacyAssert class that would perform identically to its FlexUnit counterpart. This would allow the developer to choose between xUnit convention and the more familiar FlexUnit behavior (or both in combination). This will ultimately empower the developer to be more explicit in their unit-testing. A usage sketch follows the API listing below.

FUnit Assert API
function areEqual( expected:Object, actual:Object ) : void
function areSame( expected:Object, actual:Object ) : void
FUnit LegacyAssert API
function areEqual( expected:Object, actual:Object ) : void
function areStrictEqual( expected:Object, actual:Object ) : void
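As a hypothetical usage sketch of the proposal (behavior follows the descriptions above, not a finalized implementation):

// xUnit-style assertions
Assert.areEqual( [ "one", "two" ], [ "one", "two" ] );  // passes: equal values
Assert.areSame( [ "one", "two" ], [ "one", "two" ] );   // fails: different references
 
// FlexUnit-compatible assertions
LegacyAssert.areEqual( 5, "5" );        // passes: standard (==) comparison
LegacyAssert.areStrictEqual( 5, "5" );  // fails: strict (===) comparison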
Which 'assertEquals' implementation works best for you?


Jun
01
Filed Under (Updates) by Ryan Christiansen on 06-01-2008

I realize there are countless web-based issue tracking systems available. These systems vary widely, from open source to high-dollar enterprise installations. Unfortunately, I had a number of restricting factors that virtually eliminated all of them.

  • Guru Not Included: My server-side experience is… less than ideal.
  • Hosting Restrictions: I don’t have the resources to support X or Y technology.
  • Complex Setup: Near impossible in the absence of said “server-side experience”.
  • Expensive: Open source projects aren’t always funded, including this one.
  • Option Nightmare: It crams so many “features”, it’s practically unusable.
  • Wrong Target User: I don’t need much, but give me what I need.
  • Ugly Ugly Ugly: People DO judge a book by its cover…

I finally found “just what I was looking for”. There is room to grow, but Lighthouse (“beautifully simple issue tracking and bug reporting”) sailed past my limitation gauntlet with flying colors. I was able to create my account and begin roughing out community-based milestones and tickets almost immediately. The tools are simple and intuitive, and a good-looking interface never hurt anybody. Better still, this hosted solution is free for open-source projects. Although I found this option available on other commercial products, I still preferred this one.

Lighthouse will help provide better visibility into FUnit development planning and changes, as well as a sounding board for desired features and bug fixes. The new Issue Tracking Dashboard is now active and I encourage community involvement. Please do not hesitate to create new tickets for any desired features or bugs you may encounter. You may also comment on existing tickets to raise their priority, and I will do my best to expedite those areas.



May
25
Filed Under (Updates) by Ryan Christiansen on 05-25-2008

I discovered that within the past few days, Joseph Berkovitz and Alex Uhlmann have released Flexcover 0.50. I believe this project will be a tremendous asset for FUnit and I will be closely following its progress.

There is one feature in particular I’m especially impressed with. FlexCover now supports Branch Coverage. This is the first major step in providing not only test coverage, but test coverage quality. Just because a test executes your class method doesn’t mean that all the code in that method is called. This is where Branch Coverage steps in. In a nutshell, FlexCover tracks every condition (if, else if) and whether it passes or fails. Obviously, if the condition fails, its code will never be executed… or tested.

FUnit Branch Coverage Sample
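As a trivial illustration of the idea (hypothetical code, not taken from the sample above), consider a method with a single condition:

// A test that only calls describe(5) executes this method, yet the else
// branch never runs. Branch coverage tracks each outcome of the condition,
// so the untested 'false' path shows up in the report.
public function describe( value:int ) : String
{
    if ( value > 0 )
        return "positive";
    else
        return "zero or negative";
}
 
// Exercises only the 'true' branch:
Assert.areEqual( "positive", describe(5) );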

One major caveat to FlexCover, however, is that it requires a “modified” Flex compiler. This is how the coverage “hooks” are injected into your application code. In addition, this step is needed to emit detailed metadata about your code, files, and application structure. I anticipate that, given Alex’s ties with Adobe, there will be better compiler extensibility available with the introduction of Flex 4 next year. Perhaps then this will no longer be necessary.

I have to say, the documentation for such an early project is excellent and very well outlined. I was able to get good coverage analysis for FUnit itself, and I created a small subset of FlexCover’s AIR-based CoverageViewer for the web. Sadly, the source view that demonstrates the branch count won’t be available online (for now). The sample coverage data is viewable here.



Apr
02
Filed Under (Updates) by Ryan Christiansen on 04-02-2008

FUnit Development Roadmap



