Friday, February 26, 2021

Make sure your work area is well lit

My dad always told me to make sure my work area was well lit. It didn’t matter whether I was doing homework, working under the hood of my car, or practicing my clarinet. Without proper lighting, he convinced me, you can’t see what you’re doing, you squint to keep things in focus, or you can’t find what you’re looking for.

I channel my dad whenever I see someone washing dishes in the kitchen. How do you know it’s clean if you can’t see it?

When my oldest son was five or six years old we visited some friends for dinner and drinks. Before eating we played washer-board and my son loved it. After we ate and the adults were enjoying beverages, my son whispered to me, “Dad, I lost one of the washers. You have to help me find it.”

He led me to a part of the outside deck near the doorwall to their condo, and pointed to the gravel under their deck saying, “It’s somewhere around here.” I imagined he dropped the washer and it slid between the deck boards into the gravel, so we began searching.

And searching.

And searching.

We couldn’t find the washer and I was getting impatient, so I asked my son, “Are you sure you dropped it here?” “No, I dropped it over there,” he replied pointing beyond the deck into the backyard, “but the light is better over here.”

I was reminded of this event later when I was consulting for a company that had great talent in the network and virtual-machine management areas but not so much in database administration. They were panicking because response times had suddenly slowed drastically.

They had great network monitoring tools so the network guys were the first to take a shot at it. They looked everywhere but couldn’t find the washer.

Next up was the VMware team. All the virtual machines looked mostly idle no matter which tool they used to examine them. Low CPU utilization. Low network utilization. All systems normal. They couldn’t find the washer.

The developers, meanwhile, were already suspecting the database, pretty sure that something happened to some table causing some statistic to recommend some suboptimal query plan, causing transactions that would normally take milliseconds to instead scan a table and take over a minute. Only a database can choke an otherwise perfectly performing system with the equivalent of an impediment jinx without killing it.

As a developer, keeping your work area well lit (and your room clean) means instrumenting your applications so that wherever there may be a problem, it’s easy to find out what happened. Sometimes that’s a log file. Sometimes it's a progress bar reporting status, and sometimes it's the ability to pop into an inspector/debugger to look at the running system and see abnormal behavior under the covers.

When you come across a problem that’s difficult to track down, it may be because that area of your system isn’t well lit.

Thanks, dad.

Thursday, April 7, 2016

Clip of the day: logging SObjects without nulls

// sobjects are difficult to read with all the "field=null" values, so
// return a string without them.
public string stringWithoutNullFields(string aString) { return aString.replaceAll('\\w+=null',''); }
public string toString(SObject anSobject) { return stringWithoutNullFields(anSobject + ''); }
public string toString(list<sobject> aList) { return stringWithoutNullFields(aList + ''); }
public string toString(map<string, sobject> aMap) { return stringWithoutNullFields(aMap + ''); }
public string toString(map<id, sobject> aMap) { return stringWithoutNullFields(aMap + ''); }

Whether you're using system.debug(someObject); or logging using a home-rolled log (I do), it's difficult to read the output of some sobjects if too many of the fields look something like "a_field__c=null, another_field__c=null, yet_another_field__c=null..."

Get rid of all that nonsense using aString.replaceAll('\\w+=null','');.
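
A quick illustration of the kind of call this enables (the query is hypothetical, and it assumes the methods above live in your logging class):

Account anAccount = [select Name, Industry, Website from Account limit 1];
system.debug(toString(anAccount));                       // null fields no longer clutter the line
system.debug(toString(new list<Account>{ anAccount }));  // same idea for lists and maps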

Wednesday, September 16, 2015

Professor Strunk's 1919 advice to developers in 2016

Before computers existed, English professor William Strunk gave sage advice on how to develop software.
"Vigorous writing is concise. A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts."
Frankly, it was the "...and a machine no unnecessary parts" that I thought especially relevant.

While preparing a presentation for Dreamforce 15, I was reminded of this advice by a slide recommending Salesforce developers clean up their change sets and deployments before moving them into production.

Honestly, during the development of even a single method or subroutine it's possible for code that had meaning early on to end up orphaned or without effect after multiple iterations.  It is the developer's responsibility to identify and eliminate unnecessary code and unused variables, just as unused methods in a class should be eliminated (no matter our affection for them).  Why would it be a surprise, then, that unused classes, pages, components, reports, report folders, flows, images, etc. should also be eliminated before being immortalized in a production system?

"...and a machine no unnecessary parts."

Some integrated development environments (IDEs) have code browsers that are able to identify unreachable code.  I'm not yet aware of one for Salesforce development, but if you know of one please share it.

Until then, it is the developer's responsibility to eliminate code and other artifacts from their projects and repositories--remembering to pay as much attention to their static resources, permission sets, profiles, and other configuration elements as to Visualforce and Apex.

Salesforce haystacks are large enough without the code and components that do nothing getting in the way of maintenance, documentation, code coverage, and ultimately process improvement.

Thursday, September 3, 2015

Step-by-step easy JSRemoting with Angular

I've been doing a lot of reading the last week or so learning how to mix AngularJS with Visualforce.  I've watched videos, read articles, read documentation, but none of them were simple.  It's as though developers couldn't resist showing off something else, and that something else buried the simplicity of simple JSRemoting calls inside Visualforce.

All I wanted to do was call some already-existing Remote Methods from inside an Angular page, and try to make sure it played nice with the other Angular features, like "promises."

We're going to start with a simple Apex controller with two methods.  The first, Divide(), simply divides its first argument by its second and returns the result.  As simple as it is, it will be valuable later when we test how our Angular Javascript handles exceptions--all we need to do is pass 0 for the second argument.

The second method, Xyzzy(), simply returns a string.  All remote and REST classes should have some simple methods that do very little to simplify testing.

global class TomTestController {
  
    @RemoteAction
    global static Double Divide(double n, double d) {
        return n / d;
    }

    @RemoteAction
    global static String Xyzzy() {
        return 'Nothing happens.';
    }
}

After saving that class in your org create a new page (mine's called TomTest.page) with the simple contents below.

<apex:page showHeader="false" sidebar="false" standardStylesheets="false" controller="TomTestController">
    <apex:includeScript value="//ajax.googleapis.com/ajax/libs/angularjs/1.3.14/angular.min.js" />
    
    <div ng-app="myApp" ng-controller="myCtrl">
        <p>Hello, world!</p>
    </div>

    <script type="text/javascript">

        var app = angular.module('myApp', [ ]);
        app.controller('myCtrl', function($scope, $q) {
            
        });
        
    </script>

</apex:page>

The page above outputs the obligatory "Hello, world!" but functionally does nothing Angular-ish, short of defining an app and giving it a controller.  You should make certain the page does very little by inspecting the page from your browser to see what's written out to the console.  Knowing what "nothing" looks like is the first step to recognizing when "something" happens and knowing whether it was something you intended or not.

The best thing about the page above is it doesn't include anything that distracts from our purpose.  There are no stylesheets to wonder whether they're needed and no other Javascript libraries you may think are required to get a simple example working.

The next thing we're going to do is add our Divide() method. But before we drop it into the Javascript let's look at what it normally looks like inside our non-Angular Javascript.

TomTestController.Divide(1, 1, function(result, event) {
    if (event.status)
        console.log('It worked!');
    else
        console.log('It failed!');
});

This is about as simple as JSRemoting code goes.  The browser is going to call the Divide() method on the TomTestController class and pass the numbers 1 and 1.  When the callout finishes, event.status will tell us whether it worked (true) or failed (false).

In fact, we can put that call into our Javascript right now and run it to see what happens.  Update your page so it contains:

<apex:page showHeader="false" sidebar="false" standardStylesheets="false" controller="TomTestController">
    <apex:includeScript value="//ajax.googleapis.com/ajax/libs/angularjs/1.3.14/angular.min.js" />
    
    <div ng-app="myApp" ng-controller="myCtrl">
        <p>Hello, world!</p>
    </div>

    <script type="text/javascript">

        var app = angular.module('myApp', [ ]);
        app.controller('myCtrl', function($scope, $q) {
            
        });

        TomTestController.Divide(1, 1, function(result, event) {            
            if (event.status)
                console.log('It worked!');
            else
                console.log('It failed!');
        }, {buffer: false});
        
    </script>

</apex:page>

You should have read "It worked!" in your console log.

To make our remote call work with Angular promises, we need to wrap it inside a function that Angular-izes it, so developers can use the .then().then().catch() code we've been reading so much about.

function Divide(n, d) {  
    var deferred = $q.defer();
    try {
        TomTestController.Divide(n, d, function(result, event) {
            if (event.status)
                deferred.resolve(result);
            else
                deferred.reject(event);
        }, {buffer: false});
    } catch (e) {
        deferred.reject(e);
    }
    
    return deferred.promise;
}

Our callout is still recognizable, but it has a few new features.  Principally, it creates a promise and calls either deferred.resolve() or deferred.reject() depending on the call's success or failure respectively.

Once our function is defined inside Angular's controller we can call it with (1, 1) to see how it works, and how it looks when it works inside the inspector.

<apex:page showHeader="false" sidebar="false" standardStylesheets="false" controller="TomTestController">
    <apex:includeScript value="//ajax.googleapis.com/ajax/libs/angularjs/1.3.14/angular.min.js" />
    
    <div ng-app="myApp" ng-controller="myCtrl">
        <p>Hello, world!</p>
    </div>

    <script type="text/javascript">
        var app = angular.module('myApp', [ ]);
        app.controller('myCtrl', function($scope, $q) {
            
            function Divide(n, d) {  
                var deferred = $q.defer();
                try {
                    TomTestController.Divide(n, d, function(result, event) {
                        if (event.status)
                            deferred.resolve(result);
                        else
                            deferred.reject(event);
                    }, {buffer: false});
                } catch (e) {
                    deferred.reject(e);
                }
                return deferred.promise;
            }
            
            Divide(1, 1);
        });
        
    </script>

</apex:page>

I know.  When you inspected it again you couldn't tell if anything happened.  The page functioned exactly as before.

So now let's show what happens if we use one of those .then() calls.  First, change the Divide() call above so it looks like:

Divide(1, 1).then(function() { console.log('Success!'); });

Or you can write it how you may be seeing it in other Angular examples...

Divide(1, 1)
    .then(function() { console.log('Success!'); });

You should have seen the text "Success!" printed on the console.

But what if our .then() function needed the output of our Divide()?  What would that look like?

Divide(1, 1)
    .then(function(data) { console.log(data); });

Notice in the code above our anonymous function now accepts an argument (data) and prints it instead of "Success!"  When you run this version of the code you should see "1" output to the console log.

But Divide() can also fail, and that is why .then() takes two function arguments: the first is for successful returns and the second is for failures.

Let's pass two functions and modify our console.log() calls so we can tell which we're getting.

Divide(1, 1)
    .then(
        function(arg) { console.log('good', arg); },
        function(arg) { console.log(' bad', arg); }
    );

You should have seen "good 1" in the console log.

But what about errors?  What happens when we get an exception?  If you haven't already tried it, change the code to Divide(1, 0).  What did you get?  I got an error warning me, "Visualforce Remoting Exception: Divide by 0" followed by "bad >Object...".  When you look at the object sent to the second anonymous function, notice that it's the "event" object passed when the code called deferred.reject(event);
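
In other words, it's the same call as before with a zero denominator:

Divide(1, 0)
    .then(
        function(arg) { console.log('good', arg); },
        function(arg) { console.log(' bad', arg); }
    );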

Now that you have JSRemoting working inside Angular with promises, it's a good time to play around with it.  Below is my addition of Xyzzy().  But sometime tomorrow I think I'll create a remote for Echo() that simply returns its argument, or maybe a quick [ select ... from something ... limit 10 ]; to see what that looks like.

Let me know how it works for you.

<apex:page showHeader="false" sidebar="false" standardStylesheets="false" controller="TomTestController">
    <apex:includeScript value="//ajax.googleapis.com/ajax/libs/angularjs/1.3.14/angular.min.js" />
    
    <div ng-app="myApp" ng-controller="myCtrl">
        <p>Hello, world!</p>
    </div>

    <script type="text/javascript">
        var app = angular.module('myApp', [ ]);
        app.controller('myCtrl', function($scope, $q) {
            
            function Divide(n, d) {  
                var deferred = $q.defer();
                try {
                    TomTestController.Divide(n, d, function(result, event) {
                        if (event.status)
                            deferred.resolve(result);
                        else
                            deferred.reject(event);
                    }, {buffer: false});
                } catch (e) {
                    deferred.reject(e);
                }
                return deferred.promise;
            }
            
            function Xyzzy() {  
                var deferred = $q.defer();
                try {
                    TomTestController.Xyzzy(function(result, event) {
                        if (event.status)
                            deferred.resolve(result);
                        else
                            deferred.reject(event);
                    }, {buffer: false});
                } catch (e) {
                    deferred.reject(e);
                }
                return deferred.promise;
            }
            
            Divide(1, 0)
                .then(function(success) { Xyzzy(); })
                .catch(function(error) { console.log('ERROR', error); });
        });
        
    </script>

</apex:page>
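
And if you want to try the Echo() idea before I do, the remote method itself is a one-liner added to TomTestController (a minimal, untested sketch; the parameter name is arbitrary, and the Javascript wrapper follows the same pattern as Divide() and Xyzzy()):

    @RemoteAction
    global static String Echo(String anArgument) {
        return anArgument;
    }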

Monday, August 31, 2015

Javascript AS a Visualforce page

There are several reasons a developer may want or need to have their Javascript inside a Visualforce page.  Before explaining what those reasons may be, let's just look at how you go about it.

Step 1 - Create your Javascript Page

The source below comes from a page I named "TomTestJS.page"

<apex:page showHeader="false" sidebar="false" standardStylesheets="false" 
    contentType="text/javascript">
 console.log('We are here!');
 document.write('This is the Javascript');
</apex:page>

Step 2 - Include your Javascript inside another page

Use <apex:includeScript value="{!$Page.TomTestJS}" /> if the Javascript needs to be loaded at the top of the page and <script src="{!$Page.TomTestJS}" /> if it needs to be loaded later, perhaps after some content has been rendered to the DOM.

The page below renders:

This is the page.
This is the Javascript.

<apex:page showHeader="false" sidebar="false" standardStylesheets="false">

<p>This is the page.</p>

<script src="{!$Page.TomTestJS}" />

</apex:page>

The page above included the Javascript below some page content, which is why the document.write() output appeared below the HTML output.

If we instead did it the more traditional way using <apex:includeScript /> at the top of the page, the output renders:

This is the Javascript.
This is the page.

<apex:page showHeader="false" sidebar="false" standardStylesheets="false">
<apex:includeScript value="{!$Page.TomTestJS}" />
<p>This is the page.</p>

</apex:page>

I can think of a few reasons why programmers may want to do this.  Coincidentally, they're the reasons I've wanted to do this.
  1. It's easier to track the source code in a repository if the files exist as independent entities and not part of a zip file.
  2. It's easier to see when a specific Javascript was last modified.
  3. It allows the Visualforce preprocessor to resolve merge fields in the Javascript before it's loaded into the browser (for assets that may exist in a static resource or as another page).
  4. It allows what would normally live inside <script /> tags inside a Visualforce page to exist independently, change independently, etc.
There are other reasons, too.  Today I had to port some Javascript and HTML from a media team into a sandbox and the team had taken liberties with their templates and other references that required "fixing" to work inside Salesforce.  Moving one of these Javascript files into a page and letting the preprocessor take care of a merge field to resolve the location of a static resource worked like a charm.
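
As a concrete (made-up) example of reason 3, a line like the one below inside TomTestJS.page reaches the browser with the static resource URL already resolved by the preprocessor.  The resource and file names here are hypothetical:

var logoUrl = "{!URLFOR($Resource.MediaAssets, 'img/logo.png')}";
console.log('logo lives at', logoUrl);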

Thursday, August 27, 2015

How I got started with Angular and Visualforce

If you're reading this then you may be early on in exploring AngularJS and wondering how you can get the W3Schools Angular Tutorial working inside Salesforce's Visualforce.

The tutorial's first page looks relatively straightforward, and with a simple closing tag for the <input> it will even pass Visualforce's compiler.

<!DOCTYPE html>
<html lang="en-US">
<script src="http://ajax.googleapis.com/ajax/libs/angularjs/1.3.14/angular.min.js"></script>
<body>

    <div ng-app="">
  <p>Name : <input type="text" ng-model="name" /></p>
  <h1>Hello {{name}}</h1>
    </div>

</body>
</html>

If you tried this inside Visualforce you likely got the same output I did.   Instead of behaving like it does in the tutorial, Visualforce stubbornly displays "{{name}}."

Without delay, here's a Visualforce-ized version of the W3Schools tutorial.

<apex:page showHeader="false" sidebar="false" standardStylesheets="false">
<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.3.14/angular.min.js" />

<div ng-app="noop">
    <p>Input something in the input box:</p>
 
    <p>Name : <input type="text" ng-model="name" placeholder="Enter name here" /></p>
 
    <h1>Hello {{name}}</h1>
</div>

<script>
    var myAppModule = angular.module('noop', []);
</script>

</apex:page>

Visualforce requires ng-app to have a value to pass its own syntax checker.  If a value is passed to ng-app to get past Visualforce then that value is interpreted by Angular as a module used to bootstrap your page and must be defined.

In the example above I created a module called "noop" that literally does nothing but take up space to make something else work.

Now my page behaved just like W3Schools said it should.

Having Googled around some more, I found multiple tutorials and videos introducing the neat things people have done with Visualforce and Angular, but all of them are too complicated for the absolute novice.  But the search pages did alert me that Salesforce is so geeked about the combination of Angular and Visualforce that they've created an app for the AppExchange that installs Angular, Underscore, Bootstrap, and several other JS libraries.  The app is called Angilar MP [sic].  The page gives instructions for how to install it into your org and includes some demo pages showing how to put more complicated examples together.

Since the app loads all those Javascript libraries into a static resource we can re-write our application to look just a tad more Visualforce-like.

<apex:page showHeader="false" sidebar="false" standardStylesheets="false">
<apex:includeScript value="{!URLFOR($Resource.AngularMP, 'js/vendor/angular.1.0.6.min.js')}" />
<div ng-app="noop">
    <p>Input something in the input box:</p>
 
    <p>Name : <input type="text" ng-model="name" placeholder="Enter name here" /></p>
 
    <h1>Hello {{name}}</h1>
</div>

<script>
    var myAppModule = angular.module('noop', []);
</script>

</apex:page>

All it really does is replace the <script src="..." /> with <apex:includeScript value="..." /> and use the static resource's Angular JS source.

PS If you're not already familiar with it, another of the cool resources included in the package is UnderscoreJS.  Lots of cool Javascript utility functions in there I wish I'd known were around years ago.  Regardless, they'll make my current pages easier to write.
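
For instance (a made-up illustration, not from the package's docs), pulling the names out of an array of records and de-duping them is a one-liner:

var names = _.uniq(_.pluck(records, 'Name'));  // records is a hypothetical array of objects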


Tuesday, May 5, 2015

Working with Salesforce's destructiveChanges.xml

If you've ever had a need to remove a bunch of custom objects, fields, pages, classes, etc. from an org, or from multiple orgs, you've probably come across documentation about destructiveChanges.xml.  If you're familiar with developing on the Salesforce platform using MavensMate or Eclipse, you're probably already familiar with package.xml.  Both files have nearly identical formats.  The difference between them is package.xml enumerates the stuff you want to synchronize between your org and your development environment and destructiveChanges.xml enumerates the items you want to obliterate (or delete) from whatever org you point it at.


The easiest way to see how they're identical is to look at what each of them looks like empty.

package.xml
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <version>29.0</version>
</Package>

destructiveChanges.xml
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
</Package>

The only difference between them is destructiveChanges doesn't have a <version> tag.

Let's look again after we add a class to each.  In package.xml we're synchronizing a class and in destructiveChanges.xml it's a class we want to remove from our org.

package.xml
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <version>29.0</version>
    <types>
        <members>TomTest</members>
        <name>ApexClass</name>
    </types>
</Package>

destructiveChanges.xml
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>TomTest</members>
        <name>ApexClass</name>
    </types>
</Package>

As a percentage, the two files are more similar now than they were before. The only difference between them is still the <version> tag.

Executing destructive changes

So how do we execute destructive changes?  The short answer is using Salesforce's migration tool.  In a few minutes we'll execute "ant undeployCode," but we've a few items to take care of first.

For me, the first problem was where to put the files destructiveChanges.xml and package.xml. The former is new and the latter is NOT the same file that usually appears in the src/ directory.

At Xede, we create git repositories for our projects.  Each repository is forked from xede-sf-template.

DrozBook:git tgagne$ ls -lR xede-sf-template
total 16
-rw-r--r--  1 tgagne  staff   684 Aug 21  2014 README.md
-rwxr-xr-x  1 tgagne  staff  1430 Sep 17  2014 build.xml
drwxr-xr-x  4 tgagne  staff   136 Aug 21  2014 del

xede-sf-template//del:
total 16
-rwxr-xr-x  1 tgagne  staff  563 Jan 17  2014 destructiveChanges.xml
-rw-r--r--  1 tgagne  staff  136 Aug 21  2014 package.xml

The repo includes a directory named "del" (not very imaginative) and inside it are the files destructiveChanges.xml and package.xml.  It seems odd to me, but the migration tool requires both the destructiveChanges.xml AND a package.xml to reside there.

The package.xml file is the same empty version as before.  The template's destructiveChanges.xml contains placeholders--but still basically does nothing.

DrozBook:xede-sf-template tgagne$ cat del/package.xml
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <version>29.0</version>
</Package>

DrozBook:xede-sf-template tgagne$ cat del/destructiveChanges.xml 
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <name>ApexClass</name>
    </types>
    <types>
        <name>ApexComponent</name>
    </types>
    <types>
        <name>ApexPage</name>
    </types>
    <types>
        <name>ApexTrigger</name>
    </types>
    <types>
        <name>CustomObject</name>
    </types>
    <types>
        <name>Flow</name>
    </types>
    <types>
        <name>StaticResource</name>
    </types>
    <types>
        <name>Workflow</name>
    </types>
</Package>

Now that we have a directory with both files in it, and we have versions of those files that basically do nothing, let's get ready to run the tool.

There's one more file we need to create that's required by the tool, build.xml.  If you're not already using it for deployments you're likely not using it at all.  My version of build.xml is in the parent of del/.  You can see it above in the directory listing of xede-sf-template.

DrozBook:xede-sf-template tgagne$ cat build.xml
<project name="xede-sf-template" default="usage" basedir="." xmlns:sf="antlib:com.salesforce">

    <property environment="env"/>

    <target name="undeployCode">
      <sf:deploy
          username="${env.SFUSER}"
          password="${env.SFPASS}"
          serverurl="${env.SFURL}"
          maxPoll="${env.SFPOLL}"
          ignoreWarnings="true"
          checkOnly="${env.CHECKONLY}"
          runAllTests="${env.RUNALLTESTS}"
          deployRoot="del"/>
    </target>

</project>
Since build.xml is in the parent directory of del/, the "deployRoot" attribute is "del," the subdirectory.

The environment property (<property environment.../>) allows operating system environment variables to be substituted inside your build.xml.  In the example above, the environment variables are about what you'd expect them to be (using the bash shell):

export SFUSER=myusername
export SFPASS=mysecretpassword
export SFURL=https://login.salesforce.com   # or https://test.salesforce.com for sandboxes
export SFPOLL=120
export CHECKONLY=false
export RUNALLTESTS=false
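
With those set, a validation-only run (nothing actually deleted) is just a matter of overriding a couple of them for a single invocation, assuming the build.xml and environment variables above:

CHECKONLY=true RUNALLTESTS=true ant undeployCode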

Right about now you may be thinking, "Who wants to set all those environment variables?" Truthfully, I don't.  That's why I created a little script to do it for me called "build."  But before we get into that let's just edit our build.xml file so it doesn't need environment variables.

The build.xml below is for a production org.

DrozBook:xede-sf-template tgagne$ cat build.xml
<project name="xede-sf-template" default="usage" basedir="." xmlns:sf="antlib:com.salesforce">

    <target name="undeployCode">
      <sf:deploy
          username="tgagne+customer@xede.com"
          password="mysupersecretpassword"
          serverurl="https://login.salesforce.com"
          maxPoll="120"
          ignoreWarnings="true"
          checkOnly="false"
          runAllTests="false"
          deployRoot="del"/>
    </target>

</project>

So now we have our build.xml, our del directory, del/destructiveChanges.xml which lists nothing and an empty del/package.xml file.  Let's run ant.

DrozBook:xede-sf-template tgagne$ ant undeployCode
Buildfile: /Users/tgagne/git/xede-sf-template/build.xml

undeployCode:
[sf:deploy] Request for a deploy submitted successfully.
[sf:deploy] Request ID for the current deploy task: 0AfU00000034k0SKAQ
[sf:deploy] Waiting for server to finish processing the request...
[sf:deploy] Request Status: InProgress
[sf:deploy] Request Status: Succeeded
[sf:deploy] *********** DEPLOYMENT SUCCEEDED ***********
[sf:deploy] Finished request 0AfU00000034k0SKAQ successfully.

BUILD SUCCESSFUL
Total time: 15 seconds

As you can see, it did nothing.  Let's give it something to do, but make it a class that doesn't exist in the target org.

DrozBook:xede-sf-template tgagne$ cat del/destructiveChanges.xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>DoesNotExist</members>
        <name>ApexClass</name>
    </types>
    ... same as before ...
</Package>

I've added a single class, DoesNotExist, to the ApexClass types list and we'll run it again.

DrozBook:xede-sf-template tgagne$ ant undeployCode
Buildfile: /Users/tgagne/git/xede-sf-template/build.xml

undeployCode:
[sf:deploy] Request for a deploy submitted successfully.
[sf:deploy] Request ID for the current deploy task: 0AfU00000034k0mKAA
[sf:deploy] Waiting for server to finish processing the request...
[sf:deploy] Request Status: InProgress
[sf:deploy] Request Status: Succeeded
[sf:deploy] *********** DEPLOYMENT SUCCEEDED ***********
[sf:deploy] All warnings:
[sf:deploy] 1.  destructiveChanges.xml -- Warning: No ApexClass named: DoesNotExist found
[sf:deploy] *********** DEPLOYMENT SUCCEEDED ***********
[sf:deploy] Finished request 0AfU00000034k0mKAA successfully.

BUILD SUCCESSFUL
Total time: 15 seconds

Ant (with the migration tool plugin) is telling us it tried removing the Apex class "DoesNotExist" but it didn't exist.  If the class had existed before but had already been removed, this is the message it would display.

As a reader exercise, go ahead and create a class "DoesNotExist" in your org.  I went into Setup->Classes->New and entered "public class DoesNotExist{}". It's about as useless a class as you can create, though I've seen and perhaps written worse.

If you run ant again you'll see it no longer reports the warning.

DrozBook:xede-sf-template tgagne$ ant undeployCode
Buildfile: /Users/tgagne/git/xede-sf-template/build.xml

undeployCode:
[sf:deploy] Request for a deploy submitted successfully.
[sf:deploy] Request ID for the current deploy task: 0AfU00000034k11KAA
[sf:deploy] Waiting for server to finish processing the request...
[sf:deploy] Request Status: InProgress
[sf:deploy] Request Status: Succeeded
[sf:deploy] *********** DEPLOYMENT SUCCEEDED ***********
[sf:deploy] Finished request 0AfU00000034k11KAA successfully.

BUILD SUCCESSFUL
Total time: 15 seconds

And there you have it!  For a little extra I'll share my "build" script which makes it pretty easy to extract, undeploy (what we just did) and deploy code with or without tests or verification-only.