Category: Dynamics GP


GP & SQL Compatibility

I replied to a GPUG question earlier today from someone wanting to use SQL 2016 with GP 2015.  It bears repeating here and applies to similar situations as well.

When Microsoft says not compatible, they mean it. It may work, but it’s not supported. If you do run into a compatibility issue down the road, you have burned all bridges, since you won’t likely be able to restore the database later to a supported SQL version. At that point you can only hope that you’ve kept a copy of the databases from the earlier SQL version and re-enter everything from that point, or start over. The problem is usually a deprecated SQL function referenced in GP’s code, and the only fix is to rewrite the software.

For years I was the “Damn the torpedoes, full speed ahead” guy, the one that could always trick the system into making it work after a client broke it, but after seeing the destruction and mayhem it can cause down the road (read potentially hundreds of consulting hours), it’s wise to trust Microsoft’s compatibility list. It’s there for a reason.

What appears to be a clear question on the surface quickly clouds in the face of reality.  Your data is not only important, it’s vital to the life of your business.  There are several reasons that I will cover below.

Database upgrades complete with no errors.

This is the first reason clients think of when testing upgrades, and this is the one we are concerned least about.  If this were a data migration from one system to another, then yes, the figures must be checked closely.  Barring damaged data, database upgrades simply add columns and procedures to accommodate new processes and features in Dynamics GP.  Your data actually remains relatively untouched during this process.  While you may think initially that all you need to be concerned about is your ‘bottom line’, that’s actually the one thing you don’t need to worry about.  Check, yes.  Concern, no.  What is more likely to have changed is the way to access that data and how it gets reported.  Bottom line on your bottom line, I cannot recall one upgrade in thousands where the figures did not tie out.

Vital business processes are not hampered.

This brings us to the second and most crucial step, yet one passed over by most clients – testing processes.  Data doesn’t do us much good if we can’t act or report on it.  Each new version of Dynamics GP brings new features, and those ‘features’ occasionally change the location of an action button or drop-down from the bottom of a window to the top, or even to another window.  You need to know where changes have been made so you aren’t trying to learn while the UPS driver is waiting for a check.

Speaking of checks, check formatting, along with that of many other reports (yes, a check is a ‘report’ to the software), is also modified during the upgrade process to accommodate the table changes noted above.  Some of the most frequently customer-modified reports include GL Trial Balances, SOP Invoices, check layouts, and Purchase Orders.  When the formats are upgraded, sort orders or other details sometimes change, which may affect the look of your invoices, checks, etc.

It is imperative that you devote time during a test upgrade to run through:

  • Daily Processes in Finance, Payables, Receivables, Payroll, Inventory
  • Weekly Procedures
  • Monthly Procedures
  • Quarterly and Year-end Procedures

That includes printing checks – to blank paper, then hold up the printed check to blank computer check stock to verify alignment.  Don’t forget to shred the printed check forms, particularly if GP is printing the signature.

Third-party and integration tasks should be checked as well.  Stand-alone programs and web interfaces should be checked for proper operation, functionality, and security.

Little or no testing can jeopardize your upgrade and result in not only lost productivity for you, but additional costs for an emergency situation that would have been uncovered and corrected in test.  Will we do an upgrade without a test?  Certainly, if you can afford the down time and loss of productivity.  The no-test scenario is usually reserved for 1-2 user systems running Payables and GL only.

Frequently an upgrade also means a new server.  This gives you a perfect chance to test the capacity and capability of the new server by loading the server as you would on a daily basis to make sure the application is as responsive as you need it to be before you’re in production and it’s too late to fix a problem without lost production time.  So don’t just have one person test the system – plan a time when everyone can test together.  If there are problems, we can diagnose and correct them before it becomes a critical issue.

CAL’s ‘What’s New’ training can be beneficial if done during the test upgrade phase.  It will eliminate many of the navigation questions that arise when first encountering the upgraded interface.  This training can be reinforced with additional assistance when the live upgrade is performed.

There are several reasons for backing up your Dynamics GP SQL Data and shared files, among them:

  • Business Continuity
  • Human or Computer Error Recovery
  • Historical Preservation
  • Data Health and Maintenance


Business Continuity or Disaster Recovery is the most comprehensive and usually requires the most planning.  You must consider how much data loss is acceptable and how long your accounting system can afford to be down.

Let’s first consider the types of SQL recovery models available.  It is important to understand these because the types and granularity of restores depend on them.  For our purposes we will focus on the two most common recovery models, Simple and Full.

Simple Recovery Model is just that – simple; the backup job creates the backup file, clears committed transactions from the log file, then truncates the log.  It cannot be restored to a point in time other than when the backup was created.  These backups can be run as often as needed, but remember, you’re backing up the entire database, so keep a close eye on drive space.

Full Recovery Model (the one we recommend) has many more options, but comes with a bit more overhead.  In full recovery mode, the full database is backed up, including data and log file.  It does not, however, truncate the log file.  There is a secondary, and more frequent, backup called the Transaction Log Backup (never could have guessed that one, right?) that does the actual truncation of the log file.  In a restore situation, you restore the full backup and any log files to get you to the point of error.  This type of backup is your best bet in the event that someone accidentally clears data or other human error situation.
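As a sketch, the Full-model backup pair described above looks like this in T-SQL (the database name, file paths, and schedule here are illustrative placeholders, not your actual configuration):

```sql
-- Nightly full backup: copies data and log, but does NOT truncate the log
BACKUP DATABASE [TWO]
TO DISK = N'D:\SQLBackups\TWO_Full.bak'
WITH INIT, CHECKSUM;

-- Transaction log backup, run much more frequently (e.g. every 15-60 minutes);
-- this is the step that actually truncates the inactive portion of the log
BACKUP LOG [TWO]
TO DISK = N'D:\SQLBackups\TWO_Log.trn'
WITH CHECKSUM;
```

In practice these would be scheduled as SQL Agent jobs or maintenance plan steps rather than run by hand.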

For Business Continuity purposes, you ideally should have a balance of on-disk and external media on- and off-site.  Two or three days should be sufficient for on-disk backups, as you would rarely want to take your accounting system back that many days.  External media retention is up to you, but a week’s worth of backups with at least one day offsite works well for most.

Many companies want or require an archived year-end backup.  Back up the database to external media, and archive as desired.  You may want to burn these to optical disc, as shelf life is longer and the media is not affected by EMP, deterioration, or accidental erasure.  Include a copy of the current DYNAMICS database for ease of data access.  Your company may have moved on, but this data is frozen in time.

Don’t let your IT staff tell you that they are snapshotting the server and you don’t need SQL backups.  With the Full recovery model, you must have both Full and Transaction Log backups to keep the database healthy and keep size and performance in check.

One additional word on Server Snapshots, particularly virtual server snapshots.  If you are running them for Disaster Recovery, DO NOT run them during the work day.  SQL interprets the snapshot scan of the live databases as a disk freeze and any transaction which may be taking place during the few seconds of snapshot scans will likely be corrupted and could require extensive repair of the database if not caught quickly.

One other common error is to have the database set to Full Recovery Model, then let other backup software do the database backups.  This is fine AS LONG AS it is also capable of running Transaction Log backups.  If you only back up the database and not the log, the log never truncates and will both cause serious performance issues and eventually fill up the disk, at which point the database (and GP) shuts down.  Know your recovery model, backup type, and frequency.  We can work with your IT staff to develop a plan that works for you.
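A quick way to spot that mistake is to compare each database’s recovery model against its last log backup; a sketch of such a check (run against the GP SQL instance):

```sql
-- Find Full-model databases whose transaction logs are never backed up:
-- these logs will grow until the disk fills
SELECT d.name,
       d.recovery_model_desc,
       MAX(b.backup_finish_date) AS last_log_backup   -- NULL = never
FROM sys.databases d
LEFT JOIN msdb.dbo.backupset b
       ON b.database_name = d.name
      AND b.type = 'L'                 -- 'L' = transaction log backup
WHERE d.database_id > 4                -- skip the system databases
GROUP BY d.name, d.recovery_model_desc;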

Test your backups!  Even the best of us can fall into that trap.  The backup looks good, but if you can’t restore it, you might as well not have one.  That’s the reason for my comment on on-disk backups above.  If you need to restore to a point in time and your transaction logs are backed up to disk but your full backup is on some tape somewhere, you’ll be down for as long as it takes to recover the backup media and get it mounted.  Keep several recent full backups and the relevant log backups on disk for speedy recovery.
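The point-in-time restore described above follows a fixed pattern; a sketch, with placeholder names, paths, and timestamp (each log backup in the chain is restored in order, with STOPAT on the last one):

```sql
-- Restore the full backup WITHOUT recovering, so logs can be applied
RESTORE DATABASE [TWO]
FROM DISK = N'D:\SQLBackups\TWO_Full.bak'
WITH NORECOVERY, REPLACE;

-- Roll the log forward and stop just before the error occurred
RESTORE LOG [TWO]
FROM DISK = N'D:\SQLBackups\TWO_Log.trn'
WITH STOPAT = '2016-05-10 14:32:00', RECOVERY;
```

Restoring a copy to a differently named test database on a regular schedule is the simplest way to prove the backups are actually usable.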

There are some non-SQL files that should be backed up regularly as well.  That includes modified reports and forms dictionaries, the FRx SysData folder (if you’re still using FRx, and if you are, why?  …but that’s another topic), the Integration Manager database, signature files, the Mekorma Stub Library, etc.  CAL usually has these under a server ‘GPShare’ folder, so include that in your backups.

Several final notes and frequently misunderstood items:

  1. Backing up your DYNAMICS database does not back up your data. The DYNAMICS database is the GP system database, but only contains system-level information – users, registration, security, etc.  The actual transactions are in your company database.
  2. When you add a company to Dynamics GP, make sure the company database is included in the backup. I usually set the backups for ‘all user databases’ – that way, if a new company is added, it’s automatically included in the backup.  If you use ‘all user databases’, you’re also covered for Management Reporter and SQL Server Report Server backups.
  3. Check your SQL System backups – Master, Model, and MSDB. With those you can recover your GP SQL users in the event of a rare, but possible, SQL application crash.  Without them, you will have to create new users in SQL to tie back to the GP users.
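The ‘all user databases’ selection in item 2 corresponds to every database with an ID above the four system databases; a quick sanity check, as a sketch, that DYNAMICS and every company database will be picked up:

```sql
-- Everything returned here is covered by an 'all user databases'
-- backup selection; DYNAMICS and each company database should appear
SELECT name, recovery_model_desc
FROM sys.databases
WHERE database_id > 4;   -- excludes master, tempdb, model, msdb
```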

A client on GP 2010 had just installed Professional Service Tools with the primary intent of running the Copy Company function to transition from test to production.  The utility was failing with the error:

Unhandled script exception:
[Microsoft][SQL Server Native Client 10.0][SQL Server]Could not find stored procedure ‘XYZ.dbo.eeCompanyCopy’.

The cause was that the end user had set up their own ODBC DSN and did not clear the check boxes for ‘Use ANSI Quoted Identifiers’.  Clearing the ANSI flags on the connector allowed the utility to run without error.
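For reference, those two ANSI check boxes on a hand-built System DSN map to registry values under the DSN’s key.  The sketch below shows the corrected state; the DSN name ‘DynamicsGP’ is hypothetical, and the GP ODBC setup normally writes these values for you, so edit by hand only as a last resort:

```reg
Windows Registry Editor Version 5.00

; Hypothetical DSN named "DynamicsGP"; note that 32-bit DSNs on 64-bit
; Windows live under HKLM\SOFTWARE\Wow6432Node\ODBC\ODBC.INI instead
[HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI\DynamicsGP]
"QuotedId"="No"
"AnsiNPW"="No"
```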

 

It should have been an easy upgrade…  Dynamics GP 9.0 to GP 11.0 (2010), but I needed to get 9.0 to the latest Service Pack to run the GP 2010 upgrade.  The Service Pack would not install, so I turned off DEP and UAC; it still failed.  I even pulled out my bag of tricks and copied the server-side folder to a workstation, replacing the existing GP install (after zipping the original folder).  That updated the application, but there was nothing in Utilities to allow me to update the databases.

Since Microsoft no longer supports GP 9.0, KB searches came up empty, but in researching, I came across a Blog entry referring to updating .NET 1.1 Framework to SP1 and also referenced a non-existent KB.  I downloaded and updated .NET to SP1, accepting the warning that it had compatibility issues with Server 2008, and it completed the framework install successfully.  I re-launched the GP 9 MSP, and IT RAN!

Hopefully this helps others in this situation.

I am frequently asked to condense Microsoft’s detailed System Specifications for Dynamics GP.  Following is a ‘short list’ of server hardware requirements and recommendations that will give maximum performance at reasonable cost:

1. 64-bit hardware platform, Hardware RAID. Windows Server 2008R2 Standard 64-bit and SQL Server 2008R2 64-bit Standard Edition. 64-bit allows use of memory over 4GB without disk swap. 1 or more quad-core processors.
2. 12GB RAM minimum.
3. Drive configuration – The recommendation is OS on RAID1 (2 drives), Data on RAID5 (3 or more drives), which is a minimum of 5 drives. One additional often-overlooked factor is drive speed. Use 10,000 RPM drives as a minimum for the data drives, 15,000 RPM preferred. If it’s a matter of size over speed, lower the size (15k RPM is not available in all drive sizes) to maintain speed. For the OS, 80GB of drive space minimum; for data, three (or more) 132GB drives in RAID5, yielding 260+GB of usable space. For maximum performance, add one additional small 15,000 RPM drive and move the SQL tempdb to that drive. Isolating tempdb will boost SQL performance dramatically.
If you must back off the drive count, go for a RAID5 array of 10,000+ RPM drives in a single partition. In this configuration, partitioning data from the OS has no real effect on performance, since all partitions run on the same drives; it is the division of tasks by array that increases performance, not a cosmetic partitioning.
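Moving tempdb to the dedicated drive mentioned above is a one-time change; a sketch (the logical file names tempdev and templog are the SQL Server defaults, the T: drive letter is illustrative, and the files are recreated at the new location on the next SQL service restart):

```sql
-- Point tempdb's data and log files at the dedicated fast drive;
-- SQL Server recreates tempdb there when the service restarts
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = N'T:\SQLData\tempdb.mdf');

ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = N'T:\SQLData\templog.ldf');
```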

If the server is virtual, either Hyper-V or VMware architecture is supported. Allocate at least one processor and 4-8GB RAM. Drive configuration is not as critical, however better performance will be achieved if you use a fixed disk size rather than dynamic. If you do use a fixed disk size, please allocate sufficient space for Microsoft SQL database growth, installation software, and on-disk data backups. Depending on the number of GP companies and anticipated data volume, a 32-64GB System drive and 100GB data drive should be sufficient.

Do you find that each time you log into Management Reporter on Terminal Server or Citrix, you need to re-enter your login and default company?  The reason those are lost between sessions is that their user profiles are set up to delete temporary files and folders on exit.  This unfortunately is where Management Reporter coders chose to put the configuration files that hold that information from session to session. Making the issue more difficult is the fact that Management Reporter coding requires a SQL user login per company, so it’s not a single sign on as it was in FRx.

Constantly re-entering your login information can be quite frustrating.  We and a number of others have asked Microsoft to move the user configuration file to a different location.  The request is pending.

However, we can tell you how to change the settings so that deleting temporary files and folders on exit is turned off.  (Note: It actually needs to be that way for GP in the event a user is posting and the connection to the Terminal Server drops.  If the session is left running on the Terminal Server, the posting will continue normally.  If the setting to delete is enforced, the session is forcibly closed resulting in a hung posting or data corruption.)

The environmental variables that hold the MR settings are %APPDATA% and %LOCALAPPDATA%.  Some GP functionality also depends on these variables.  User accounts should have the ability to read and write to these variables and they should be retained, not deleted.  On a Windows 7 workstation those variables point to c:\users\User_Name\AppData and \AppData\Local.  In a roaming profile frequently used for Terminal Server\Citrix users, these variables may reside on a completely different server, requiring cross-server security changes.

Bottom line, it’s not an easy fix.  But it can be done.  And it is one of those things that will make your users very happy.

Originally posted on http://www.erpsoftwareblog.com/2012/07/management-reporter-tip-how-to-maintain-logins-and-default-company-settings-between-sessions/

This relates immediately to a server install I was performing, but is something to keep in mind.
1. Don’t try to use SSRS (SQL Server Reporting Services) Reports in GP if SQL is an Express version of 2005, 2008, or 2012. According to GP 2010 System Requirements, SQL Express is supported for everything but Analysis Cubes. I spent too much time trying to get it to work until I came across a Blog post flatly stating that it does not work. Even running GP as a local administrator and logged in with the SQL sa account, I continued to get the error that I did not have permission to install to that location. I did set security in the Reports Site Settings (you need to run IE as administrator the first time to add yourself), gave our user, domain admins, and domain users full access to everything, then repeated the permission settings for folder security. I even went so far as to set security permissions on the file folder structure in Windows Explorer – same error. As I stated, the Blog post was sufficient to put a halt to my efforts.
2. The client wanted to use SQL 2012, which GP 2010 SP3 supports. However, Management Reporter would not validate on SQL 2012 to allow creation of the databases. In this case, the Management Reporter System Requirements DO state that SQL 2005 and 2008 are supported, with no mention of 2012.

One additional note – You will have little or no trouble installing Management Reporter Server components if you download and pre-install the Access 2010 Runtime and Access Runtime SP1 from Microsoft Downloads. If you don’t run those first, setup will hang on the MR Server configuration windows. If it’s a 64-bit OS, install 64-bit Access Runtime and SP. If Office 32-bit is installed on the server, you cannot just run the 32-bit Access Runtime install, because it senses a mismatch between the Windows Server and Office platforms. The only option is to uninstall MSOffice 32-bit and then install the 64-bit Access Runtime. You will then need to install Office (Word, Excel) as 64-bit.

This was an interesting project, one I have not had to deal with for a few years.  A client was still running a single-user Great Plains 6.0 Ctree on Windows 98 (remember that?).  Fortunately, I still had the migration chunk file, so it was doable.  First order of business was getting a copy of the code and data off the old PC; I ended up pulling the drive, attaching it to my notebook via a USB adapter, and copying the folder from there to a virtual Windows XP Mode install on my notebook.  The Ctree version fired right up, and I verified the data.  The upgrade to 7.5 was simple enough, except that I continued to get Pathname errors running Shrink on PM and GL history files.  I verified that the files existed and that GP produced historical reports.  The problem was that during installation of GP 7.5, Utilities did not synchronize to the existing account framework.  As soon as I changed the dex.ini flag to TRUE and forced synchronization, all was well.  The data upgrade was flawless, as was Check Links.

On to migration.  I needed to use MSDE (SQL 2000 Express) for the database, which is also why I needed XP Mode.  I created the empty mirror database and ran the push migration from 7.5 Ctree.  The migration ran for 16 hours, and the data set was less than 500,000 records!  The processor was maxed on the virtual machine, which explains the excess time, but I didn’t want to kill the process, change the processor and memory parameters for the VM, restore, and restart.  I need to remember that for next time.

The upgrade to 8 was textbook.  On to 10 – a new problem.  Component installation failed on SQL Server Native Client 9.0 and would not complete.  There was nothing in the error logs.  I then tried installing just the SQL Client from the Bin folder, and finally received a usable error – installation was not permitted on a terminal session.  The fix was to disable integration components in XP mode and restart the Virtual PC.  Success!  Remember to re-enable integration components so you can see your local Windows 7 drives.

The upgrade processes to 10 and finally 2010 were non-problematic, and running SQL backups between upgrades ensured a successful conclusion.