Friday, November 30, 2018

eConnect Performance: Using GetNextDocNumbers vs taGetNext stored procedures

By Steve Endow

This one is definitely an obscure topic that nobody is asking about.

But hey, I was curious.

I was researching whether the eConnect GetNextGLJournalEntryNumber could handle a heavy load, and whether it would throw any errors when issuing lots of JE numbers.

Interestingly, I was unable to break it.  I did get one SQL exception while running my test and simultaneously requesting a new JE number in the GP Journal Entry window, but I was unable to reproduce that error.

This image shows 3 instances of my load test application simultaneously retrieving a total of 3,000 JE numbers over about 45 seconds.

3,000 JE Numbers


public string GetNextJENumbereConn()
{
    // GetNextDocNumbers is the eConnect class (Microsoft.Dynamics.GP.eConnect) that
    // wraps the next-number stored procedures in the GP company database.
    GetNextDocNumbers getNext = new GetNextDocNumbers();

    try
    {
        // Request the next GL journal entry number, incrementing the stored value.
        string nextJE = getNext.GetNextGLJournalEntryNumber(IncrementDecrement.Increment, ConnectionStringWindows());
        return nextJE;
    }
    catch (Exception)
    {
        // Rethrow without resetting the stack trace ("throw ex" would lose it).
        throw;
    }
}


Testing with just one instance of my load tester, I saw that it took about 16 seconds to generate 1,000 JE numbers using the eConnect method.

So, naturally, I wondered what the performance would be if I called the stored procedure directly.
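
If you want to try the direct approach yourself, here is a minimal sketch of calling the next-number procedure with plain ADO.NET.  The procedure name taGetNextJournalEntry and the parameter names and sizes shown are assumptions based on typical eConnect naming conventions, so verify them against the procedure definition in your company database before relying on this.

// Sketch only: requires using System.Data; and using System.Data.SqlClient;
// The proc name and parameter names below are assumptions; confirm them with sp_helptext.
public string GetNextJENumberSql(string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("taGetNextJournalEntry", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // 1 = increment the next number, matching IncrementDecrement.Increment above
        cmd.Parameters.Add("@I_vInc_Dec", SqlDbType.TinyInt).Value = 1;

        SqlParameter nextJE = cmd.Parameters.Add("@O_vJournalEntryNumber", SqlDbType.VarChar, 13);
        nextJE.Direction = ParameterDirection.Output;

        SqlParameter errorState = cmd.Parameters.Add("@O_iErrorState", SqlDbType.Int);
        errorState.Direction = ParameterDirection.Output;

        conn.Open();
        cmd.ExecuteNonQuery();

        return Convert.ToString(nextJE.Value).Trim();
    }
}

Bypassing the eConnect assembly removes a little overhead on each call, which is exactly the difference I wanted to measure.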

Thursday, November 29, 2018

How long does it take to import Dynamics GP GL JEs with Analytical Accounting?

By Steve Endow

I recently did some tests to see how long it takes to import GL Journal Entries with large numbers of line items.

Those test results were fairly consistent, and showed that eConnect does a pretty good job of importing JEs with a large number of lines.  Only 8 seconds to import a JE with 2,000 lines seems pretty good to me, as I suspect most customers don't import JEs that large.

For anyone who is familiar with Dynamics GP integrations, the next obvious question is how eConnect handles imports of GL JEs when there is Analytical Accounting data involved.

From experience, I know that GL JEs with AA do not import terribly quickly or efficiently.

Here are the results of importing a single JE with varying line counts.  Every line in the JE has one AA code assigned.

100 lines:  3-8 seconds
200 lines:  3-9 seconds
500 lines:  9-17 seconds
1,000 lines:  21-47 seconds 
2,000 lines:  174-222 seconds


There are two obvious differences when you add Analytical Accounting codes to your GL JE import.

First, they take much, much longer.  A standard JE with 500 lines takes 2 seconds, but when you add AA, that same JE takes 9-17 seconds.

A standard JE with 2,000 lines takes 8 seconds, but when you add AA, that same JE takes 174-222 seconds.

HUUUUUGE decrease in import performance.

The second issue is the incredible variance in import times for the single JEs with AA data.  When importing the standard JEs, the import times were completely consistent.  Over 10 runs, I might have seen a 1 second variance, if any.

In this test with AA data, I imported the same JE at least 10 times for each line count, and you can see how different the times are.  The durations seem almost random.  The import times did not gradually increase or decrease--they would just increase on one run, and then decrease on the next.

The times varied from 89% to 200% from one run to the next, which is pretty wild.  I don't know how a stored procedure could have so much variance in performance between runs.  If it weren't such a nightmare to trace the activity from the eConnect procedures, I'd look into it.

So there you have it.  If you have to import transactions with Analytical Accounting data, you have been warned.  It seems that the eConnect procs for AA do not perform well.

Steve Endow is a Microsoft MVP in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter and YouTube





How long does it take to import a Dynamics GP GL Journal Entry with LOTS of lines?

By Steve Endow

I received a question about the performance of eConnect when importing large GL Journal Entries, such as a JE with 1,000 lines.

In some prior eConnect / GP load testing, I had only imported JEs with up to 500 lines, so I didn't know the answer.

I just fired up my Batch Load Test import tool and tested some eConnect imports of GL JEs with lots of lines.
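
For anyone who wants to run a similar test, here is roughly how a large test JE gets built with the eConnect serialization classes.  This is a minimal sketch, not my production code: it assumes a reference to Microsoft.Dynamics.GP.eConnect.Serialization, and the field names are from memory, so double check them against the eConnect documentation.

// Rough sketch: requires System.Collections.Generic, System.IO, System.Xml.Serialization,
// and a reference to Microsoft.Dynamics.GP.eConnect.Serialization.
// Field names are from memory; verify against the eConnect docs.
public string BuildTestJEXml(string batchID, int journalEntry, string trxDate, int lineCount)
{
    taGLTransactionHeaderInsert header = new taGLTransactionHeaderInsert();
    header.BACHNUMB = batchID;
    header.JRNENTRY = journalEntry;
    header.REFRENCE = "Load test JE";
    header.TRXDATE = trxDate;

    var lines = new List<taGLTransactionLineInsert_ItemsTaGLTransactionLineInsert>();
    for (int i = 0; i < lineCount; i++)
    {
        var line = new taGLTransactionLineInsert_ItemsTaGLTransactionLineInsert();
        line.BACHNUMB = batchID;
        line.JRNENTRY = journalEntry;
        line.ACTNUMST = (i % 2 == 0) ? "000-1100-00" : "000-2100-00";   // sample accounts
        line.DEBITAMT = (i % 2 == 0) ? 1.00m : 0m;
        line.CRDTAMNT = (i % 2 == 0) ? 0m : 1.00m;
        lines.Add(line);
    }

    GLTransactionType glTrx = new GLTransactionType();
    glTrx.taGLTransactionHeaderInsert = header;
    glTrx.taGLTransactionLineInsert_Items = lines.ToArray();

    eConnectType doc = new eConnectType();
    doc.GLTransactionType = new GLTransactionType[] { glTrx };

    // Serialize the eConnect document to XML; the resulting string then gets
    // submitted through the eConnect methods class like any other transaction.
    XmlSerializer serializer = new XmlSerializer(typeof(eConnectType));
    using (StringWriter writer = new StringWriter())
    {
        serializer.Serialize(writer, doc);
        return writer.ToString();
    }
}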

Here's what I found.  Your import times may vary depending on your environment, but this is some baseline data to consider.

Importing a single JE with these line counts took:

500 lines:  2 seconds
1,000 lines:  4 seconds
2,000 lines:  8 seconds
4,000 lines:  16 seconds
8,000 lines:  31 seconds
10,000 lines:  39 seconds
15,000 lines:  54 seconds
20,000 lines:  77 seconds



So it looks like eConnect can pretty easily import a standard JE with quite a few lines without any issues.

CAN you import a JE with 20,000 lines?  Yes.

SHOULD you import a JE with 20,000 lines?  I would not recommend it as a routine process.  Break up the data into more manageable JE sizes.  If nothing else, it will make reconciliations and research more manageable.


Note that this is a standard GL JE without Analytical Accounting.  In my experience, eConnect performance for a JE with AA is horrible.  Definitely keep your JEs small and your batches small when importing with AA data.


Steve Endow is a Microsoft MVP in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter and YouTube




Friday, November 16, 2018

My source code control is better than yours!

By Steve Endow

Today I received an email from someone who had just inherited a GitHub account from a former employee.  He asked, "What do I do with this?"

His organization is not a "development" shop.  They support business systems, ERP systems, SQL Servers, SharePoint, PowerApps, SSRS, etc.  They don't code C# and JavaScript and Python.

So the GitHub repositories under this account were being used to store various files that are ostensibly "source code", but they were not Visual Studio projects with 6 branches and 5 developers flinging code around.  They were files like SQL scripts.

One question was: How do I access and work with SQL scripts in GitHub?  Can I use SQL Server Management Studio to access the files in GitHub?

Interesting question.  I don't think SSMS can work with GitHub, but...

At GPUG Summit 2018 in Phoenix, Jonathan Cox demonstrated during the SQL Server shootout that VS Code can be used to work with SQL scripts, and it can also work with GitHub.  So in theory, you could use VS Code to work with SQL scripts in GitHub.

But what about the other file types that are not related to VS Code?

Thursday, November 15, 2018

.NET code to create a Dynamics GP batch with eConnect

By Steve Endow

I just posted a story about how I discovered that sometimes I need to explicitly create Dynamics GP batches using eConnect in order to control the batch posting date.

Here's the code that I used.

First, I check to see if the batch exists.


public static bool BatchExists(string database, int series, string batchSource, string batchID)
{
    try
    {
        // Count matching batch header records in SY00500 (the GP batch master table).
        string commandText = "SELECT COUNT(*) AS RecordCount FROM dbo.SY00500 WHERE SERIES = @SERIES AND BCHSOURC = @BCHSOURC AND BACHNUMB = @BACHNUMB"; // AND GLPOSTDT = @GLPOSTDT

        SqlParameter[] sqlParameters = new SqlParameter[3];
        sqlParameters[0] = new SqlParameter("@SERIES", System.Data.SqlDbType.Int);
        sqlParameters[0].Value = series;
        sqlParameters[1] = new SqlParameter("@BCHSOURC", System.Data.SqlDbType.VarChar, 15);
        sqlParameters[1].Value = batchSource.Trim();
        sqlParameters[2] = new SqlParameter("@BACHNUMB", System.Data.SqlDbType.VarChar, 15);
        sqlParameters[2].Value = batchID.Trim();

        string result = DataAccess.ExecuteScalar(database, CommandType.Text, commandText, sqlParameters);
        int recordCount = Convert.ToInt32(result);

        // The batch exists if at least one header record was found.
        return recordCount > 0;
    }
    catch (Exception ex)
    {
        Log.Write("An unexpected error occurred in DataAccess.BatchExists: " + ex.Message, true);
        return false;
    }
}


If the batch does exist, I make sure to update the posting date.
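
Here is a simplified sketch of what that update can look like.  It assumes a DataAccess.ExecuteNonQuery helper that mirrors the ExecuteScalar method used above, and it updates GLPOSTDT on the SY00500 batch header directly.

// Sketch of updating the batch posting date when the batch already exists.
// DataAccess.ExecuteNonQuery is an assumed helper alongside ExecuteScalar above.
public static bool UpdateBatchPostingDate(string database, int series, string batchSource, string batchID, DateTime postingDate)
{
    try
    {
        // Set the GL posting date on the existing batch header record in SY00500.
        string commandText = "UPDATE dbo.SY00500 SET GLPOSTDT = @GLPOSTDT WHERE SERIES = @SERIES AND BCHSOURC = @BCHSOURC AND BACHNUMB = @BACHNUMB";

        SqlParameter[] sqlParameters = new SqlParameter[4];
        sqlParameters[0] = new SqlParameter("@GLPOSTDT", System.Data.SqlDbType.DateTime);
        sqlParameters[0].Value = postingDate.Date;
        sqlParameters[1] = new SqlParameter("@SERIES", System.Data.SqlDbType.Int);
        sqlParameters[1].Value = series;
        sqlParameters[2] = new SqlParameter("@BCHSOURC", System.Data.SqlDbType.VarChar, 15);
        sqlParameters[2].Value = batchSource.Trim();
        sqlParameters[3] = new SqlParameter("@BACHNUMB", System.Data.SqlDbType.VarChar, 15);
        sqlParameters[3].Value = batchID.Trim();

        int rowsAffected = DataAccess.ExecuteNonQuery(database, CommandType.Text, commandText, sqlParameters);
        return rowsAffected > 0;
    }
    catch (Exception ex)
    {
        Log.Write("An unexpected error occurred in DataAccess.UpdateBatchPostingDate: " + ex.Message, true);
        return false;
    }
}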

Sometimes you have to explicitly create Dynamics GP Batches with eConnect

By Steve Endow

You just can't know everything.

Maybe I never took the time to look at this one field when using that particular Dynamics GP batch posting setting, and never knew about this quirk.

Maybe I did know about this quirk at some point over the last 12 or 13 years working with eConnect, but forgot about it.

Either way, I didn't know about it when I needed to know about it.

Last week I had a call with a customer to try and troubleshoot a weird posting date issue.  They import a bunch of AP Invoices every Monday morning into dozens of batches, review the batches that Monday or Tuesday, and then post them.  Simple, right?

But when they were reconciling the GL for their month end close, they were seeing AP Invoice transactions post to prior weeks and prior fiscal periods.  An invoice imported on November 5 posted to October 30.  Another invoice imported on November 5 posted to November 1.

It was weird.

We looked at their GP posting settings for Payables Transaction Entry, and it looked pretty normal.

Typical Batch Posting Date

The customer is using the Posting Date from the Batch.  Okay, so that rules out an issue with the invoices posting based on the Transaction posting date.

But, that doesn't explain why invoices imported at the same time on Monday morning would post to different dates.
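
As the post title (and my companion post with the code) gives away, the fix is to explicitly create the batch with eConnect and set its posting date, rather than letting eConnect create the batch on the fly.  Here is a minimal sketch, assuming the taCreateUpdateBatchHeaderRcd node; the field names and sample values shown (PM_Trxent as the batch source, SERIES 4 for Purchasing) are from memory, so verify them against the eConnect documentation for your GP version.

// Minimal sketch: explicitly create (or update) the AP batch before importing the
// invoices, so the batch posting date is set deliberately.
// Node and field names are assumptions from memory; verify against the eConnect docs.
taCreateUpdateBatchHeaderRcd batch = new taCreateUpdateBatchHeaderRcd();
batch.BACHNUMB = "AP IMPORT 20181105";
batch.BCHSOURC = "PM_Trxent";     // Payables Transaction Entry batch source
batch.SERIES = 4;                 // Purchasing series
batch.GLPOSTDT = "2018-11-05";    // the posting date the batch should use

PMTransactionType pmTrx = new PMTransactionType();
pmTrx.taCreateUpdateBatchHeaderRcd = batch;

eConnectType doc = new eConnectType();
doc.PMTransactionType = new PMTransactionType[] { pmTrx };
// ...then serialize the eConnectType document and submit it with the usual eConnect call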

Thursday, November 8, 2018

I created my first PowerApp! It's a Conference Badge Scanner!

By Steve Endow

I love attending conferences.  They are a rare opportunity to meet the faces behind the emails and voices and spend time with friends that you only get to see once or twice a year.

At the GPUG Summit 2018 conference in Phoenix last month, I had a great time, but there was one thing that was really frustrating me.

After a session, attendees would have questions or need assistance with a Dynamics GP issue, and I would promise to email them some information.  Sometimes I would hand them one of my business cards, sometimes they would hand me theirs.  If we had a pen handy, one of us would scribble something on the card to remind us what we talked about and prompt a follow up email.

Unfortunately, I have plenty of evidence that this process just doesn't work well.  Everyone is running around to different sessions at the conference, there are tons of distractions, we run out of business cards, the cards get stuffed into a backpack, and by the time we get home from the conference, exhausted, the last thing we want to do is sift through a pile of business cards with cryptic notes on them and try to remember what we were supposed to do, who we were supposed to contact, and which conversation went with which business card.

At the Summit conference, there were a lot of sessions on PowerApps and Flow, so I started thinking...

And I tweeted:


I had never used PowerApps and had never created a PowerApp before, so I didn't know if this was possible, and if it was possible, how difficult it would be.

Turns out, it is definitely possible.  And although it was a little more challenging than I expected, now that I've done it, it isn't that difficult relative to other things I've built.  And considering what the app can do, it's amazing how easy it is to build.

And here it is--my very own PowerApps Conference Badge Scanner!


So what do we have here?

In the upper left is the live view of the cell phone camera. When the app is open on my phone, I can take a photo of a conference attendee's badge.



In the upper right is a photo of the badge.  Because conference badges can have lots of other noisy text besides the attendee name, I found that it is best if I take a photo of just the text that I want scanned.