Andosi Blog - the Art of Great Design

What are the best practices when using the Microsoft Power Platform and Dynamics 365?

  • Establish multiple environments – This empowers your organization to mitigate the risks of making changes directly in your one-and-only Production instance.
  • Define each environment's purpose – The right setup depends on the unique needs of your organization.
  • Have a plan – Consider how and when to migrate system customizations from one environment to the next.
  • Decide on a solution type – For systems built on the Microsoft Power Platform, you'll have to choose between Managed and Unmanaged Solutions.

Connecting to a Microsoft CRM 2013 Internet Facing Deployment with C#


Integrating and retrieving data from a Microsoft CRM 2013 instance is a common request, and I have found myself working on a number of these solutions lately. I encountered a recent example where a client was using an Internet Facing Deployment of CRM 2013.

In this case, I needed to connect to the Microsoft CRM 2013 server to retrieve data to be integrated into Dynamics GP.

Here is a quick sample of code you can use to connect to an Internet Facing Deployment of Microsoft CRM 2013 using claims-based authentication.

// Get the endpoint of the CRM organization service.
Uri organizationServiceUri = new Uri(@"CrmOrganizationServiceEndpointUri"); // In production code, abstract this to configuration.
// Get the service configuration.
IServiceConfiguration<IOrganizationService> serviceConfiguration = ServiceConfigurationFactory.CreateConfiguration<IOrganizationService>(organizationServiceUri);
// Set up the credentials.
ClientCredentials clientCredentials = new ClientCredentials();
clientCredentials.UserName.UserName = "CrmUserName"; // In production code, abstract this to configuration.
clientCredentials.UserName.Password = "CrmPassword"; // In production code, abstract this to configuration.
// The proxy performs the claims-based authentication handshake.
using (OrganizationServiceProxy organizationServiceProxy = new OrganizationServiceProxy(serviceConfiguration, clientCredentials))
{
    // Perform business logic here.
}

This blog has been relocated from with authorization.

Executing SQL Queries via SharePoint Web Services

Can you really execute native SQL Queries from SharePoint Web Services?

Microsoft SharePoint is great for building enterprise systems tying together various data sources.  If the information you are looking for is in a SharePoint List or Document Library, it is straightforward to call the built-in Web Services to query or manipulate that data.  Through custom Web Parts, you can run server-side code and easily retrieve data that lives outside SharePoint.
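To give a concrete flavor of the straightforward case: querying a list through the built-in Lists.asmx web service amounts to posting a CAML query. The sketch below only builds the CAML payload; the field name and value are made-up examples, and the actual call would go through a service reference to Lists.asmx (GetListItems).

```csharp
using System;

class CamlDemo
{
    // Build the CAML query passed to Lists.GetListItems.
    // Field name and value here are illustrative, not from the original post.
    public static string BuildStatusQuery(string fieldName, string value)
    {
        return
            "<Query><Where><Eq>" +
            "<FieldRef Name='" + fieldName + "' />" +
            "<Value Type='Text'>" + value + "</Value>" +
            "</Eq></Where></Query>";
    }

    static void Main()
    {
        // This XML becomes the query parameter of Lists.GetListItems,
        // alongside the target list's name or GUID.
        Console.WriteLine(BuildStatusQuery("Status", "Approved"));
    }
}
```

In a real Web Part you would load this string into an XmlNode and hand it to the GetListItems proxy method generated from the service reference.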


Capturing (and using) raw SOAP messages in WCF

WCF is great for building web services.  It's also great for interacting with existing web services.  Visual Studio makes it so easy . . . add a service reference, point to the WSDL of the service and just like that, you have a set of classes to handle the service and data contracts.
Unfortunately, sometimes web services don't live up to their contracts.  Recently, I was interacting with a web service and found that sometimes the response would be null.  I fired up Fiddler and looked at the SOAP messages.  Sure enough, an error had occurred on the remote server and the response, while valid XML, looked nothing like the promised Data Contract.  It did however provide a useful description of the error.
There was no SOAP fault . . . no exception thrown (unless I tried to use the null response without checking), just a null response object by the time WCF handed it to me.
I searched the web and found several suggestions.

  • Use Fiddler . . . tried that but it doesn't help when you are trying to get your service into production.
  • Build a SOAP Extension . . . that's great if you are building a legacy XML Web Service and just want to log or tweak the messages.  I'm doing WCF and needed to make the messages available to my service in real time.
  • Edit the Visual Studio Generated classes to wrap the response in XML and deserialize it yourself . . . that might work but you throw away so much niceness that's built for you already.
I ended up building a custom Endpoint behavior and applying it to the generated SOAP Client object.
First, you need a class to hold the raw Request and Response:
        public class InspectedSOAPMessages
        {
            public string Request { get; set; }

            public string Response { get; set; }
        }
Create an instance of the class and pass it into the new Endpoint Behavior:
        InspectedSOAPMessages soapMessages = new InspectedSOAPMessages();
        batchClient.Endpoint.Behaviors.Add(new CapturingEndpointBehavior(soapMessages));
        BatchService.batchResponseType response = batchClient.BatchOperation(batchRequest);
        // response will be null if the contract was violated . . .
        if (response == null)
        {
            results.Message = soapMessages.Response;
        }
        else
        {
            // process normal response
        }
Now, all we need is the set of classes to handle the Endpoint Behavior:
    using System.ServiceModel;
    using System.ServiceModel.Channels;
    using System.ServiceModel.Description;
    using System.ServiceModel.Dispatcher;

    namespace MBSGuru
    {
        /// <summary>Allows capturing of raw SOAP messages.</summary>
        public class CapturingEndpointBehavior : IEndpointBehavior
        {
            /// <summary>Holds the messages.</summary>
            public InspectedSOAPMessages SoapMessages { get; set; }

            public CapturingEndpointBehavior(InspectedSOAPMessages soapMessages)
            {
                this.SoapMessages = soapMessages;
            }

            /// <summary>Required by IEndpointBehavior.</summary>
            public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection bindingParameters) { }

            /// <summary>Attaches the message inspector to the client runtime.</summary>
            public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
            {
                clientRuntime.MessageInspectors.Add(new CapturingMessageInspector(this.SoapMessages));
            }

            /// <summary>Required by IEndpointBehavior.</summary>
            public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher) { }

            /// <summary>Required by IEndpointBehavior.</summary>
            public void Validate(ServiceEndpoint endpoint) { }
        }

        /// <summary>Actual inspector that captures the messages.</summary>
        public class CapturingMessageInspector : IClientMessageInspector
        {
            /// <summary>Holds the messages.</summary>
            public InspectedSOAPMessages SoapMessages { get; set; }

            public CapturingMessageInspector(InspectedSOAPMessages soapMessages)
            {
                this.SoapMessages = soapMessages;
            }

            /// <summary>Called after the web service call completes.  Allows capturing of the raw response.</summary>
            public void AfterReceiveReply(ref Message reply, object correlationState)
            {
                this.SoapMessages.Response = reply.ToString();
            }

            /// <summary>Called before the web service is invoked.  Allows capturing of the raw request.</summary>
            public object BeforeSendRequest(ref Message request, IClientChannel channel)
            {
                this.SoapMessages.Request = request.ToString();
                return null;
            }
        }
    }
And just like that, you have a copy of the request and response.  How you handle the violation of the contract is up to you.  At least now you can report something more informative than "Error Occurred".

CSS Consolidator jiu-jitsu

Sometimes you inherit a huge pile of CSS and want to make a few changes. Sometimes there are huge groups of duplicated rules. Sometimes, you just want to see everywhere Comic Sans is used in your stylesheets. (Hopefully, nowhere.)
Paste in your Source CSS below and click Consolidate CSS.  Rules and selectors will be consolidated in the next textbox.  All processing is done client-side in JavaScript.
Below that, you will see a breakdown of your styles, arranged by Selector, Attribute and Values.
Why did I do this?  Why not?
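As a rough illustration of what the consolidation step does (the real tool runs client-side in JavaScript; this C# sketch uses a deliberately naive parser that assumes flat `selector { declarations }` rules with no comments, nesting, or @media blocks):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class CssConsolidator
{
    // Naive sketch: groups selectors whose declaration blocks are
    // equivalent after normalization, then emits one merged rule each.
    public static string Consolidate(string css)
    {
        var groups = new Dictionary<string, List<string>>();
        foreach (string rule in css.Split('}'))
        {
            int brace = rule.IndexOf('{');
            if (brace < 0) continue;
            string selector = rule.Substring(0, brace).Trim();
            // Normalize the declarations so equivalent blocks compare equal.
            string body = string.Join("; ",
                rule.Substring(brace + 1)
                    .Split(';')
                    .Select(d => d.Trim())
                    .Where(d => d.Length > 0)
                    .OrderBy(d => d, StringComparer.Ordinal));
            if (!groups.ContainsKey(body)) groups[body] = new List<string>();
            if (!groups[body].Contains(selector)) groups[body].Add(selector);
        }
        // Selectors sharing a declaration block collapse into one rule.
        return string.Join("\n", groups.Select(g =>
            string.Join(", ", g.Value) + " { " + g.Key + " }"));
    }

    static void Main()
    {
        // h1 and p carry identical declarations, so they merge into one rule.
        string css = "h1 { color: red; margin: 0 } p { margin: 0; color: red }";
        Console.WriteLine(Consolidate(css));
    }
}
```

A production consolidator would need a real CSS parser; this just shows the grouping idea behind the tool.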


Create a Vendor in Dynamics GP 2010 with eConnect using In Memory Serialization

Developing several integrations between Microsoft Dynamics GP 2010 and various third-party systems the last few weeks reminded me to update my previous article on In Memory Serialization for eConnect 10.
Microsoft Dynamics GP 2010 uses eConnect version 11 which includes significant updates. Notably, the COM+ component has been changed to a WCF service.
In this example, I am creating a new vendor record in Dynamics GP using the same in memory serialization technique. Why write a file to disk unnecessarily?
To run the following code on your machine:

  1. Install the latest version of the eConnect 11 SDK.
  2. Create a new Console Application in Microsoft Visual Studio.
  3. Add references to the following assemblies, which are located by default in C:\Program Files (x86)\Microsoft Dynamics\eConnect 11.0\API\ (drop the "(x86)" if you are using a 32-bit system):
     • Microsoft.Dynamics.GP.eConnect.dll
     • Microsoft.Dynamics.GP.eConnect.Serialization.dll
  4. Replace the Program.cs class in the project with the class below.

using System;
using System.IO;
using System.Xml;
using System.Xml.Serialization;
using Microsoft.Dynamics.GP.eConnect;
using Microsoft.Dynamics.GP.eConnect.Serialization;

class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("Beginning integration test.");
        using (eConnectMethods eConnectMethods = new eConnectMethods())
        {
            try
            {
                Console.WriteLine("Creating a new vendor.");
                // Modify the connection string for your environment.
                string connectionString = @"data source=localhost; initial catalog=TWO; integrated security=SSPI";
                // Create the vendor. (Sample values; at minimum eConnect
                // needs a vendor ID and name.)
                taUpdateCreateVendorRcd vendor = new taUpdateCreateVendorRcd();
                vendor.VENDORID = "TESTVENDOR001";
                vendor.VENDNAME = "Test Vendor";
                // Assign the vendor to a new master vendor type.
                PMVendorMasterType vendorType = new PMVendorMasterType();
                vendorType.taUpdateCreateVendorRcd = vendor;
                // Assign the master vendor type to a new
                // collection of master vendor types.
                PMVendorMasterType[] masterVendorTypes = { vendorType };
                // Serialize the master vendor type in memory.
                eConnectType eConnectType = new eConnectType();
                MemoryStream memoryStream = new MemoryStream();
                XmlSerializer xmlSerializer = new XmlSerializer(eConnectType.GetType());
                // Assign the master vendor types to the eConnectType.
                eConnectType.PMVendorMasterType = masterVendorTypes;
                // Serialize the eConnectType.
                xmlSerializer.Serialize(memoryStream, eConnectType);
                // Reset the position of the memory stream to the start.
                memoryStream.Position = 0;
                // Create an XmlDocument from the serialized eConnectType in memory.
                XmlDocument xmlDocument = new XmlDocument();
                xmlDocument.Load(memoryStream);
                // Call eConnect to process the XmlDocument.
                eConnectMethods.CreateEntity(connectionString, xmlDocument.OuterXml);
                Console.WriteLine("Successfully created vendor {0}.", vendor.VENDORID);
            }
            catch (Exception ex)
            {
                Console.WriteLine("Exception occurred: " + ex.Message);
            }
        }
        Console.WriteLine("Integration test complete.");
        Console.WriteLine("Press <Enter> to continue...");
        Console.ReadLine();
    }
}
Execute the project and you should see console output confirming the vendor was created, and the new vendor record should appear in Dynamics GP.

Dynamics GP ActiveX component can't create object run-time error 429

Recently I was asked to look into an error a client was receiving while standing up a new Citrix server installation for their Dynamics GP 9 clients: run-time error 429, "ActiveX component can't create object".

NOTE: If you are receiving this error while attempting to upgrade from Dynamics GP 9 to Dynamics GP 10, see this article.
When we hit the Debug button to look at the underlying VBA code, we saw the offending line.

Based on that line of VBA code, I knew that a connection to a database was needed, and RetrieveGlobals9.dll was not installed on the new Citrix server. You might see slightly different code in your VBA, but the important part was that it was trying to call CreateObject on RetrieveGlobals9.
The first step to solve this problem was to download a copy of RetrieveGlobals9.dll from Partner Source at this URL.
Contact your partner to download this for you if you don't have access to Partner Source. Place the file into the Microsoft Dynamics GP folder in the Program Files directory. Since this was a 64-bit machine, the Microsoft Dynamics GP folder was in Program Files (x86).
Next, open a command prompt in elevated mode.
Finally, to register the .dll, type:
Regsvr32 "C:\Program Files (x86)\Microsoft Dynamics\GP\RetrieveGlobals9.dll"
You will need to include the quotes because the directories have spaces in them.
If everything worked properly, you'll receive a confirmation that the .dll was registered successfully.
After these steps were complete, we were able to open and use Microsoft Dynamics GP without receiving the initial VBA error.
Hope it helps!

Record Lock Trace v 2.0

I've updated the Lock Trace Utility for Dynamics GP with two new features.
The first enhancement replaces the standard "This batch is being edited by another user" prompt, shown when attempting to post or delete a SOP batch, with the name of the user whose activity record in SY00800 holds the batch lock.
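Conceptually, the lookup behind that enhanced prompt is a simple query against the batch activity table. A sketch of the idea, with an illustrative batch number (the exact columns and filtering may differ from what the utility actually does):

```sql
-- Who holds the batch activity record for this batch?
select USERID
from DYNAMICS.dbo.SY00800
where BACHNUMB = 'MYBATCH'
```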

The second enhancement will clear all records that are locked by users who are not actively logged in to GP.  This is executed when attempting to access a transaction locked by a user who is not actively logged in.  Afterward, the record will be available and the following prompt will be presented.
The basic delete statements executed by this feature are listed below:
delete tempdb.dbo.dex_lock where session_id not in (select SQLSESID from dynamics.dbo.activity)
delete tempdb.dbo.dex_session where session_id not in (select SQLSESID from dynamics.dbo.activity)
delete DYNAMICS.dbo.SY00800 where userid not in (select USERID from DYNAMICS.dbo.activity)

If you are not sure what version of the utility you have installed, since there isn't a GUI, I've updated the about message to include the version information.

Download Record Lock Trace for Dynamics GP here or view the product page for more information.  Please leave a comment if there are other features or transaction types that you would like added to this utility.


Resolving "Error occurred in deployment step 'Activate Features': Invalid file name"

The other day I was working on a SharePoint project that required the deployment of a Content Type, a List Template, a couple of List Instances and a couple of Feature Receivers. Things were coming along well until I started to reorganize the project. I dragged the List Instances into the List Template folder and renamed several folders to better represent their purposes. When I went to deploy, I got the error: "Error occurred in deployment step 'Activate Features': Invalid file name".

I looked around the .spdata files, checked the Feature file and double-checked them all again. Everything looked right. The ULS Logger wasn't much help but it did give me the actual exception: "Exception: Microsoft.SharePoint.SPException: Invalid file name. The file name you specified could not be used. It may be the name of an existing file or directory, or you may not have permission to access the file"
Still not much help. I removed all the items from the feature and was able to deploy successfully. Unfortunately, a feature that doesn't do anything isn't much good so I started adding items and deploying one at a time. Feature Receivers: Check. Content Type: Check. List Template: Check. List Instance: Failed.
After looking at the List Instance for a while and not seeing the problem, I deleted the Instance from the project and recreated it from scratch. I added it to the feature, deployed and it failed again.
I went through the List Template again and saw the familiar warning:
<!-- Do not change the value of the Name attribute below. 
If it does not match the folder name of the List Definition project item, 
an error will occur when the project is run. -->
I had renamed the List Template folder from within Visual Studio but it did not update the ListTemplate Name element.
After correcting the Name to match the new folder name everything worked and the world was right again.
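For reference, that relationship lives in the List Template's Elements.xml: the Name attribute must match the project item's folder name exactly. A hypothetical example (all names made up):

```xml
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Name must match the List Template's folder name in the project. -->
  <ListTemplate
      Name="SomeAwesomeListTemplate"
      DisplayName="Some Awesome List"
      Description="Hypothetical list template for illustration."
      Type="10001"
      BaseType="0" />
</Elements>
```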
Add Solution:
Found 1 deployment conflict(s). Resolving conflicts ...
Deleted list instance 'Lists/SomeAwesomeList' from server.
Adding solution 'SomeAwesomeSolution.wsp'...
Deploying solution 'SomeAwesomeSolution.wsp'...
Activate Features:
Activating feature 'SomeAwesomeFeature' ...
Run Post-Deployment Command:
Skipping deployment step because a post-deployment command is not specified.
========== Build: 1 succeeded or up-to-date, 0 failed, 0 skipped ==========
========== Deploy: 1 succeeded, 0 failed, 0 skipped ==========

Dynamics World List of Influential People

I'm not really sure how, and no, I did not nominate myself, but I made it into the top 260 nominees for Dynamics World's Microsoft Dynamics Most Influential People.  Thanks to whoever did nominate me.  You can view the list of nominees here.

Several of my old friends are on the list: Mark Polino, Ross Carlson, Dwight Specht, Troy Ensor, Bob McAdam, and Shane Hall, to name a few.  Plus some of my new friends like Mariano and Dave Musgrave will surely be moving up the list this year.  People like Andy Hafer really deserve the recognition for their contributions to the community, as do all of the Dynamics MVPs that we all hear from so often.  Many of those who run the popular VAR and ISV organizations are on the list as well.

You can vote for those that you think deserve recognition for their contributions at  You will have to scroll through the polls to find individuals you might want to vote for.  I'm in Poll 11 as are Dwight Specht (IBIS), Tony DiBenedetto (TriBridge), Andy Hafer (GPUG), and Bill Marshall (MC2).


GP Password Expired mid-day... while I was posting a Batch

I was recently talking shop with another consultant who has a customer whose Dynamics GP password was valid when they logged in but expired mid-session, interrupting a batch posting.  Has this ever happened to you?

He submitted a support request to Microsoft to explain this issue and seek out a resolution.  The response he received was:

As for ways to overcome the password becoming invalid mid-day, there are two options:

1. Keep the password synchronized (time-wise) with the windows password and use the windows password reminder as a GP password reminder
2. Use the GP Password Expiration Notification utility, freely available from the blog link below

Microsoft recommended the GP Password Expiration Notification utility with qualification, of course, that it is not something Microsoft created or provides support for.  Even so, it's nice to see that this utility is serving the community well by helping to solve and prevent common problems.

If you haven't downloaded it already, you should before you have to recover from a batch posting interruption because your password has expired.


Dynamics GP 2010 R2 Feature List

Straight out of Tech Conference in Fargo this week, thanks to Dave Musgrave, I came across a list of some of the exciting new features coming out with Dynamics GP 2010 R2:

Dave goes on to explain that Dynamics GP is not being laid to rest, as some have speculated ever since Microsoft began investing in Dynamics AX nearly a decade ago; on the contrary, a future release is planned to include a web-based client for Dynamics GP.  That's a great improvement that I think we can all look forward to, but first let's look at what's coming now:
  • Field Service Series customers should be looking forward to some of the FSS enhancements on deck:
    • The Contract Line Hold feature is well overdue.  I've had several clients that need this functionality.
    • Tech Stock Replenishment will be a much welcomed addition.  I'm actively working on a project where we are having to work around this.
    • Contract Transfer Approval Workflow is something else our Contract Admin clients have been asking for.  It's as though Microsoft is reading my mind.
  • The Reporting and BI enhancements for Dynamics GP seem endless:
    • More Excel Report functionality; Bulk Deployment, Group Excel Reports, Multi-Company Support, and Default Column Ordering will be great enhancements.
    • MORE Analytical SSRS Reports and Metrics.  I'm not sure any of us can get enough of those!
    • New Business Analyzer!  Mariano touches on that in his coverage of the conference here
    • MORE Word Templates; SOP Returns and Word Template Generator.  I was skeptical of the Word Templates after first seeing them at Tech Conference 2010 but I have to admit they're growing on me.

There are several other great features and enhancements to Extender, Deployment and Migration Tools, and Email Support among others.  It's always good to see that Microsoft continues to invest in what is already such a fantastic product.  Well done.


Love, Hate and the ViewState

I was recently tasked with creating a SharePoint interface to Microsoft Dynamics GP Item Maintenance. As the client's business had grown, inconsistencies in their Item Master became apparent. When new items were needed, a similar existing item was copied and the details updated to match the new item. If there were no similar items, a new item was created.
The problem was, many of the existing items were not properly categorized. There were people in the organization who knew bits of information about items, but nobody had all the information to correctly set up an item. Setting up an item properly required a combination of phone calls, emails and a little bit of luck. It could sometimes take weeks to get the item set up. This held up BOMs, Routings and pretty much everything else dependent on the new item.
As usual for a new project, we started with a discovery phase. We tried to identify the groups that knew the necessary bits of information about items and proceeded to schedule interviews. During the interviews we gathered a lot of information to help us get started. We also found several points that were unclear: for example, Accounting said Billing provides this; Billing said it was Sales; Sales thought it was Purchasing; Purchasing pointed to Engineering; and Engineering said it was Accounting. After a few round trips, we were able to pin most things down, but in the end there were still a few bits of information that nobody understood.
It was clear the Workflow would have to be very flexible. Adding to the complexity, some Items could bypass entire groups. For example, there's no need to set a price on an item if you don’t sell it. And why bother Purchasing if you are making this item in your own shop?
To solve this business problem, we needed to deliver a product with:

  • Flexible Workflow rules that could be adapted as the business changes (and as the users work with the system and discover steps that had been overlooked.)
  • Field-Level security to ensure each group can only edit their section.
  • The ability to assign some fields to multiple groups and users.
  • The ability to add new fields and groups as new requirements surface.
  • Audit Logging for accountability.
  • An intuitive User Interface.
  • The ability to open, edit and copy existing GP Items.
  • The ability to push the approved changes back to GP.

This was not a simple SharePoint WebPart with an ASP.NET Form you can throw on a Page and start using. The form had to be generated dynamically, with different sections editable by different users. Lookup Fields had to be populated dynamically. Some based on values stored in GP like Class Code, others with a Hard-Coded list of choices, like Item Type. Oh, and new lookup fields could be added at any time.
What's more, the fields and their rules were not known at design time. All the metadata describing the rules had to be parsed during form generation and postback.
While building this application I ran into a few challenges. One in particular had to do with updating posted values based on business rules and then rendering the fields, not as the user had posted, but as the rules dictated. Now that you know the background, how do we make this happen?
The ViewState is great! It makes your life as a web developer so much easier. Some action causes a postback and all the fields are repopulated with their values. You can handle events for a control without having to worry about the rest of the form. Controls get their IDs set automatically. What's not to love? Plenty.

  1. The ViewState is transmitted and parsed with every page post/load cycle using bandwidth, memory and CPU cycles for the server and the client.
  2. The ViewState, while encoded, is not encrypted. I've seen this argument and I don’t think it's really all that relevant. After all, you are sending the same information back and forth through the form fields. And if you need to keep a secret, use HTTPS.
  3. The ViewState is insistently persistent. If I take the users' submitted data and do some processing on the backend, I just might want to change some values on the form when it is sent back. Suppose they tried to order 100 widgets but you just sold some and you only have 75. Send the form back to the user with quantity set to 75 and display a nice alert telling them they are lucky to get that many. Thanks to ViewState, the quantity field gets set back to 100 automatically. I know, JavaScript could validate the page before the user submits. But what if the user is running without JavaScript? I know I do unless I'm on a trusted page. Add-ons like NoScript offer significant protection while surfing and after whitelisting your common sites, they pretty much stay out of the way. But even with JavaScript, if I'm dealing with Dynamic Data (what other kind is there?), the validation rules may have changed since the page was loaded. I suppose I could build some AJAXy validation JavaScript, but again, you can't count on JavaScript being there and you should never trust anything a user submits, even if you think your JavaScript has sanitized it.

So, let's say we want to prevent ViewState from running. Easy enough, just set the EnableViewState property of the control to false:

         TextBox t = new TextBox();
         t.EnableViewState = false;
         t.Text = this.ToString();
         return t;

And what if you want to disable ViewState on the whole page?

      private void Page_Init(object sender, System.EventArgs e)
      {
         this.EnableViewState = false;
         //do some other interesting stuff 
      }


So, you’ve defeated the ViewState on a couple of projects and now it's time to build a SharePoint Web Part. Create the Web Part, add some controls, disable ViewState, deploy the Web Part and life is good! And we still have time to make happy hour (the first one!)
Wait a minute. Didn't you disable ViewState? Why aren't your backend changes sticking? Because SharePoint Web Parts love ViewState so much, they insist on using it. Disable it on the control? Doesn't matter. It happens anyway. Control.ClearChildViewState() doesn't even help. How about firing up SharePoint Designer and disabling it for the entire Page that hosts the Web Part? Congratulations, you've done it, and you've disabled pretty much all the SharePoint functionality too. No, there has to be a way around this.
Actually, there are two:

  1. Create a LiteralControl and build its Text property with the HTML you need to create your form field. Now you can render the LiteralControl and IIS will never even see the field. Remember, you will have to inspect the Post Variables yourself to find out how many widgets you just sold. Unfortunately, I tend to make the occasional mistake dynamically building HTML from within a C# app. Leave off a quote or miss a closing tag and your form starts acting really strange. That brings us to the other option.
  2. Create your controls like normal and add them to parent controls if you like. No need to worry about the HTML, IIS will get it right. But instead of adding the controls to the page (or a page element), render them into the Text property of our friend the LiteralControl. Using a StringBuilder, a StringWriter and an HtmlTextWriter, it all falls into place:
             //...Build your textbox as you like. 
             //Don't forget a unique ID. 
             TextBox tbQuantity = new TextBox();
             LiteralControl LControl = new LiteralControl();
             LControl.Text = this.RenderControlToString(tbQuantity);

          public string RenderControlToString(WebControl Control)
          {
             StringBuilder sb = new StringBuilder();
             using (StringWriter sw = new StringWriter(sb))
             {
                using (HtmlTextWriter textWriter = new HtmlTextWriter(sw))
                {
                   // Render the control into the writer instead of the page.
                   Control.RenderControl(textWriter);
                }
             }
             return sb.ToString();
          }

Using the HtmlTextWriter, we have well-formed HTML without having to worry about ViewState moving our cheese. Even in a SharePoint Web Part. And as a bonus, the ID you assigned to the Control is the ID that will be returned, with no prepending of all the parent control IDs. This, too, helps to reduce the page size and load time.


Microsoft Dynamics ERP Push to the Cloud

The Software Advice Blog has an interview clip up with Guy Weismantel, Microsoft's Director of ERP Marketing, in which he discusses their strategy for moving Dynamics ERP to the cloud.

He makes a good point; many ERP components are already delivered as cloud-based services and we continue to move in that direction.  A complete, cloud-based ERP won't be suitable for every Dynamics ERP customer but will prove to be a very good fit for many.

Check out the clip at


Register Now to Save - Convergence 2011 in Atlanta!

The Early Registration Deadline for Convergence 2011 expires on February 21 and hotels are filling up.  That's just 5 days away!

Register now at to save $300 off the regular price that kicks in on the 21st.

Register 3+ people and receive another $100 off for the 3rd+ attendee.  Straight Arrow took advantage of this one with many of us invading Atlanta this year.  Hope to see you there!


Updated: Lock Trace Utility for Dynamics GP - PO Support

I received a request to extend the Lock Trace functionality to support locked Purchase Orders.  It was a very simple change so I went ahead and implemented that functionality.  You can download this new version here if you're interested in knowing who has a Purchase Order locked when you attempt to access it.

I started working on another feature for this utility but didn't want to hold back this new feature while working on that.  I created a product page here where I'll document release notes as this utility evolves.

Keep the feedback coming.  I have several other ideas for this utility and would like to hear yours.


Record Lock Trace Addon for Dynamics GP

Natively, Dynamics GP doesn't always do a good job of enabling users to resolve common issues on their own.  A classic example of this is the prompt displayed when attempting to access a record that is locked by another user.  By default, as a user you are presented with the very generic message:
This is the message presented when attempting to access a document that is locked by another user in Sales Transaction Entry.  Now, you can download the free Record Lock Tracing addon for Dynamics GP (Tested on v10 and GP2010) that will replace this generic message with a message that includes the specific user that has the record locked:
This has already solved some headaches for some of my clients, eliminating the need to contact their internal IT department in order to determine which user has a record locked. This is particularly useful during periods of heightened urgency, such as when processing shipments and invoices at month end.
This can be extended to include other transactions such as Purchase Orders.  I'm happy to build on this further if there is demand for it.  Please leave me a comment or send me an e-mail and let me know how I might extend this to add value for you.
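Under the hood, Dexterity-style record locks in Dynamics GP are visible in the `tempdb..DEX_LOCK` table, and active user sessions live in the `ACTIVITY` table of the system database. A lookup along these lines is roughly what an addon like this performs; this is a sketch assuming the default system database name (`DYNAMICS`) and may need adjusting for your installation:

```sql
-- Sketch: find which GP user holds each active record lock.
-- Assumes the default DYNAMICS system database name.
SELECT a.USERID,          -- GP user holding the lock
       l.table_path_name, -- table containing the locked record
       l.row_id           -- DEX_ROW_ID of the locked record
FROM   tempdb..DEX_LOCK l
JOIN   DYNAMICS..ACTIVITY a
       ON a.SQLSESID = l.session_id;
```

Joining on the SQL session ID is what ties the anonymous lock entry back to a named GP user.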


Creating Customer Invoices using Sql Server Reporting Services 2008 R2

Recently I was tasked with creating a customer facing invoice using Sql Server Reporting Services 2008 R2. This particular client had employed a technique using Page Headers and Footers that didn't upgrade properly once they moved to hosting their reports in SharePoint 2010 Integrated mode.

Because Page Headers and Page Footers are not allowed to have references to data fields, the main issue was with how to consistently place a totals section in a fixed position at the bottom of the page without using a Page Footer.
Making the task even more difficult was the fact that this report had to generate all customer invoices at once. Page numbering had to be based on the individual invoice, not the total pages in the "report". Although SSRS 2008 R2 has introduced a new method to reset page numbers on a group, explained by Chris Hays here, I chose to generate my page numbers in SQL rather than in SSRS.
Each one of these techniques deserves a deep dive, but in general this report was accomplished by:
  • Using SQL Server window aggregate functions to both limit the number of invoice lines which should be presented on each page and to serve up the page numbers to be grouped upon and used in the display. Read this for more information: Aggregate Functions
  • Using one large cell of a table containing rectangles and a nested table for the order lines.
  • Fixing the report totals section at the proper location at the bottom of the report.
My idea was that I would limit the number of invoice line items to around 10 per page with my SQL-generated page numbers. The report would be grouped by invoice number and page number. I would leave enough whitespace for those 10 lines to grow into, so that the information at the bottom of the report could be fixed in place just like a page footer.
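The SQL-side page numbering can be sketched with a window function. The table and column names here (`InvoiceLine`, `InvoiceNumber`, `LineSequence`) are illustrative, not the client's actual schema:

```sql
-- Sketch: assign each invoice line to a page of at most 10 lines.
-- Integer division on the zero-based row number yields the page.
SELECT
    InvoiceNumber,
    LineSequence,
    (ROW_NUMBER() OVER (PARTITION BY InvoiceNumber
                        ORDER BY LineSequence) - 1) / 10 + 1 AS PageNumber
FROM InvoiceLine;
```

The report then groups first on `InvoiceNumber`, then on `PageNumber`, with a page break between groups, so lines 1-10 of an invoice land on its page 1, lines 11-20 on its page 2, and so on.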
I was having a difficult time initially getting this to work properly because the "footer" was getting pushed to the second page when the number of lines would grow. After some research, I discovered that the behavior for consuming whitespace has been changed in SSRS 2008 R2. See this article for details: Behavior Changes in Sql Server Reporting Services 2008 R2
Here is the important part:

Preserving White Space in a Report Body or Rectangle Container

Extra white space is no longer removed by default. When you render a report that had extra white space on the report body when viewed on the report design surface, the trailing white space after the last report item on the page is preserved. This may result in more pages for an existing report. To remove the white space, set the report property ConsumeContainerWhitespace to true.

Once I changed the ConsumeContainerWhitespace property to true, the report worked as I expected.
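For reference, in the report's RDL the setting appears as a single element under the top-level Report node (a trimmed sketch; the namespace and surrounding elements are abbreviated):

```xml
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition">
  <!-- ... report body, data sources, etc. ... -->
  <ConsumeContainerWhitespace>true</ConsumeContainerWhitespace>
</Report>
```

You can also set it from the Report Properties dialog in the designer rather than editing the RDL by hand.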
